There's plenty of Q&A about GADTs being better than DatatypeContexts, because GADTs automagically make constraints available in the right places. For example here, here, here. But sometimes it seems I still need an explicit constraint. What's going on? Example adapted from this answer:
{-# LANGUAGE GADTs #-}
import Data.Maybe -- fromJust
data GADTBag a where
  MkGADTBag :: Eq a => { unGADTBag :: [a] } -> GADTBag a
baz (MkGADTBag x) (Just y) = x == y
baz2 x y = unGADTBag x == fromJust y
-- unGADTBag :: GADTBag a -> [a] -- inferred, no Eq a
-- baz :: GADTBag a -> Maybe [a] -> Bool -- inferred, no Eq a
-- baz2 :: Eq a => GADTBag a -> Maybe [a] -> Bool -- inferred, with Eq a
Why can't the type for unGADTBag tell us Eq a?
baz and baz2 are morally equivalent, yet have different types. Presumably because unGADTBag's type has no Eq a, the constraint can't propagate into any code that uses unGADTBag.
But with baz2 there's an Eq a constraint hiding inside the GADTBag a. Presumably baz2's Eq a will want a duplicate of the dictionary already there(?)
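Here's how I picture the dictionaries involved -- only a sketch with made-up names (EqDict, Bag', bazE, bazE2), modelling the Eq a that MkGADTBag captures as an explicit record of comparison functions stored inside the value; it reuses the Data.Maybe import above:

data EqDict a = EqDict { eqList :: [a] -> [a] -> Bool }

data Bag' a = MkBag' (EqDict a) [a]

unBag' :: Bag' a -> [a]            -- extracting the list says nothing about the dictionary
unBag' (MkBag' _ xs) = xs

bazE :: Bag' a -> Maybe [a] -> Bool
bazE (MkBag' d x) (Just y) = eqList d x y    -- comparison taken from inside the value, like baz

bazE2 :: Eq a => Bag' a -> Maybe [a] -> Bool
bazE2 b y = unBag' b == fromJust y           -- comparison supplied by the caller, like baz2

On that picture baz2's Eq a really is a second copy of a dictionary that's already sitting inside the value.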
Is it that potentially a GADT might have many data constructors, each with different (or no) constraints? That's not the case here, or with typical examples for constrained data structures like Bags, Sets, Ordered Lists.
The equivalent GADTBag datatype written with DatatypeContexts infers baz's type the same as baz2's (see the sketch below).
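For concreteness, this is the DatatypeContexts version I mean -- a sketch with my own names DTBag/bazDT; the extension is deprecated, so GHC warns about it:

{-# LANGUAGE DatatypeContexts #-}   -- deprecated, but still accepted (with a warning) at 8.6.5

data Eq a => DTBag a = MkDTBag { unDTBag :: [a] }

bazDT (MkDTBag x) (Just y) = x == y
-- bazDT :: Eq a => DTBag a -> Maybe [a] -> Bool   -- inferred, with Eq a, same shape as baz2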
Bonus question: why can't I get an ordinary ... deriving (Eq) for GADTBag? I can get one with StandaloneDeriving, but it's blimmin obvious; why can't GHC just do it for me?
deriving instance (Eq a) => Eq (GADTBag a)
Is the problem again that there might be other data constructors?
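For reference, here's what I'm doing now -- a sketch; I haven't reproduced GHC's exact message for the ordinary deriving attempt, but it rejects it over the constraint in MkGADTBag's type and points at standalone deriving:

{-# LANGUAGE StandaloneDeriving #-}

-- What I'd like to write, attaching the clause to the GADT declaration itself:
--
--   data GADTBag a where
--     MkGADTBag :: Eq a => { unGADTBag :: [a] } -> GADTBag a
--     deriving (Eq)
--
-- What GHC accepts instead:
deriving instance Eq a => Eq (GADTBag a)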
(Code exercised at GHC 8.6.5, if that's relevant.)
Addit: in light of @chi's and @leftroundabout's answers -- neither of which I find convincing -- all of these give *** Exception: Prelude.undefined:
*DTContexts> unGADTBag undefined
*DTContexts> unGADTBag $ MkGADTBag undefined
*DTContexts> unGADTBag $ MkGADTBag (undefined :: String)
*DTContexts> unGADTBag $ MkGADTBag (undefined :: [a])
*DTContexts> baz undefined (Just "hello")
*DTContexts> baz (MkGADTBag undefined) (Just "hello")
*DTContexts> baz (MkGADTBag (undefined :: String)) (Just "hello")
*DTContexts> baz2 undefined (Just "hello")
*DTContexts> baz2 (MkGADTBag undefined) (Just "hello")
*DTContexts> baz2 (MkGADTBag (undefined :: String)) (Just "hello")
Whereas these two give the same type error at compile time, No instance for (Eq (Int -> Int)) arising from a use of `MkGADTBag' / `baz2' respectively [Edit: my initial Addit gave the wrong expression and wrong error message]:
*DTContexts> baz (MkGADTBag (undefined :: [Int -> Int])) (Just [(+ 1)])
*DTContexts> baz2 (MkGADTBag (undefined :: [Int -> Int])) (Just [(+ 1)])
So baz and baz2 are morally equivalent not just in that they return the same result for the same well-defined arguments, but also in that they exhibit the same behaviour for the same ill-defined arguments. Or do they differ only in where the absence of an Eq instance gets reported?
@leftroundabout: Before you've actually deconstructed the x value, there's no way of knowing that the MkGADTBag constructor indeed applies.
Yes there is: the field label unGADTBag is defined if and only if there's a pattern match on MkGADTBag. (It would maybe be different if there were other constructors for the type -- especially if those also had a label unGADTBag.) Again, being undefined/lazy evaluation doesn't postpone type inference.
To be clear, by "[not] convincing" I mean: I can see the behaviour and the inferred types I'm getting. I don't see that laziness or potential undefinedness gets in the way of type inference. How could I expose a difference between baz, baz2 that would explain why they have different types?
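The closest I can get is not a behavioural difference but a use-site one, which just restates the inferred types rather than explaining them -- a sketch, with my own names useBag/useBag2:

useBag :: GADTBag a -> Maybe [a] -> Bool
useBag = baz                       -- accepted: baz demands no Eq a from its caller

-- useBag2 :: GADTBag a -> Maybe [a] -> Bool
-- useBag2 = baz2                  -- rejected: GHC can't deduce (Eq a) here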