
In Scala, algebraic data types are encoded as sealed one-level type hierarchies. Example:

-- Haskell
data Positioning a = Append
                   | AppendIf (a -> Bool)
                   | Explicit ([a] -> [a]) 
// Scala
sealed trait Positioning[+A]
case object Append extends Positioning[Nothing]
case class AppendIf[A](condition: A => Boolean) extends Positioning[A]
case class Explicit[A](f: Seq[A] => Seq[A]) extends Positioning[A]

With case classes and case objects, Scala generates a bunch of things like equals, hashCode, and unapply (used by pattern matching), which together bring us many of the key properties and features of traditional ADTs.
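For instance, the generated unapply methods are what drive pattern matching, and the sealed modifier lets the compiler check matches for exhaustiveness. A minimal sketch using the Positioning ADT above:

// Scala
def describe[A](p: Positioning[A]): String = p match {
  case Append              => "append"
  case AppendIf(condition) => "append if a condition holds" // via the generated unapply
  case Explicit(_)         => "explicit reordering"
  // omitting any of the three cases would draw a non-exhaustive match warning
}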

There is one key difference though: in Scala, "data constructors" have their own types. Compare the following two, for example (copied from the respective REPLs).

// Scala

scala> :t Append
Append.type

scala> :t AppendIf[Int](Function const true)
AppendIf[Int]

-- Haskell

haskell> :t Append
Append :: Positioning a

haskell> :t AppendIf (const True)
AppendIf (const True) :: Positioning a

I have always considered the Scala variation to be the more advantageous one.

After all, there is no loss of type information. AppendIf[Int], for instance, is a subtype of Positioning[Int].

scala> val subtypeProof = implicitly[AppendIf[Int] <:< Positioning[Int]]
subtypeProof: <:<[AppendIf[Int],Positioning[Int]] = <function1>

In fact, you get an additional compile-time invariant about the value. (Could we call this a limited version of dependent typing?)
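For example, a function can demand, at compile time, a value built with a particular constructor. A small sketch using the ADT above:

// Scala
// Accepts only values statically known to be built with AppendIf;
// passing Append or an Explicit(...) here is a compile-time error.
def conditionOf[A](p: AppendIf[A]): A => Boolean = p.condition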

This can be put to good use: once you know which data constructor was used to create a value, the corresponding type can be propagated through the rest of the flow to add more type safety. For example, Play JSON, which uses this Scala encoding, will only allow you to extract fields from a JsObject, not from an arbitrary JsValue.

scala> import play.api.libs.json._
import play.api.libs.json._

scala> val obj = Json.obj("key" -> 3)
obj: play.api.libs.json.JsObject = {"key":3}

scala> obj.fields
res0: Seq[(String, play.api.libs.json.JsValue)] = ArrayBuffer((key,3))

scala> val arr = Json.arr(3, 4)
arr: play.api.libs.json.JsArray = [3,4]

scala> arr.fields
<console>:15: error: value fields is not a member of play.api.libs.json.JsArray
              arr.fields
                  ^

scala> val jsons = Set(obj, arr)
jsons: scala.collection.immutable.Set[Product with Serializable with play.api.libs.json.JsValue] = Set({"key":3}, [3,4])

In Haskell, fields would probably have the type JsValue -> Set (String, JsValue), which means it would fail at runtime when given a JsArray. The same problem manifests in the form of the well-known partial record accessors.

The view that Scala's treatment of data constructors is wrong has been expressed numerous times – on Twitter, mailing lists, IRC, SO, etc. Unfortunately I don't have links to most of those, except for a couple: this answer by Travis Brown, and Argonaut, a purely functional JSON library for Scala.

Argonaut consciously takes the Haskell approach (by making its case classes private and providing data constructors manually). You can see that the problem I mentioned with the Haskell encoding exists with Argonaut as well, except that it uses Option to indicate the partiality.

scala> import argonaut._, Argonaut._
import argonaut._
import Argonaut._

scala> val obj = Json.obj("k" := 3)
obj: argonaut.Json = {"k":3}

scala> obj.obj.map(_.toList)
res6: Option[List[(argonaut.Json.JsonField, argonaut.Json)]] = Some(List((k,3)))

scala> val arr = Json.array(jNumber(3), jNumber(4))
arr: argonaut.Json = [3,4]

scala> arr.obj.map(_.toList)
res7: Option[List[(argonaut.Json.JsonField, argonaut.Json)]] = None
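For illustration, here is a minimal sketch of what this encoding style looks like in Scala. The names are hypothetical and the shape is heavily simplified – this is not Argonaut's actual source:

// Scala
sealed trait Json

object Json {
  // The case classes never escape this object; only the sum type Json does.
  private case class JObject(fields: List[(String, Json)]) extends Json
  private case class JArray(elems: List[Json]) extends Json

  // "Data constructors" are ordinary functions that return the sum type.
  def obj(fields: (String, Json)*): Json = JObject(fields.toList)
  def array(elems: Json*): Json = JArray(elems.toList)

  // Case analysis from the outside is therefore partial, signalled with Option.
  def objectFields(json: Json): Option[List[(String, Json)]] = json match {
    case JObject(fs) => Some(fs)
    case _           => None
  }
}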

I have been pondering this for quite some time, but still do not understand what makes Scala's encoding wrong. Sure, it hampers type inference at times, but that does not seem like a strong enough reason to decree it wrong. What am I missing?

missingfaktor
  • Are you looking for a Haskell example where different data constructors of the same type result in values of differing types? – David Young Aug 15 '14 at 16:44
  • @DavidYoung, not really. Only wish to know the demerits of the Scala approach. – missingfaktor Aug 15 '14 at 16:45
  • @missingfaktor Oh. Well, you can do that in Haskell with GADTs and phantom types, so you know. – David Young Aug 15 '14 at 16:48
  • @DavidYoung, I thought so. :) – missingfaktor Aug 15 '14 at 16:49
  • +1, great question. I'm not sure how I feel about representing the "because Haskell" side, since I often _do_ use constructor types in Scala. For me the preference against is largely a matter of parsimony, and the type inference problems can actually be fairly annoying, but I definitely wouldn't advocate being fundamentalist about the issue. – Travis Brown Aug 15 '14 at 17:16
  • You were speculating on how Haskell would handle the json example. Two popular json libraries are [json](http://hackage.haskell.org/package/json) and [aeson](http://hackage.haskell.org/package/aeson). Both treat objects and arrays as separate types that get wrapped into a sum type. Functions that might handle various json values take the sum type as an argument, and apply pattern matching. – Michael Steele Aug 15 '14 at 18:06
  • One guess is that you need subtyping to achieve that kind of functionality and since subtyping destroys syntax directedness you have a loss of inference—which is true, as far as I know. So, perhaps Haskell can infer more programs in light of this distinction. Does Argonaut have better inference properties than other JSON libraries in Scala? (I don't know enough about Scala to venture more than this guess) – J. Abrahamson Aug 15 '14 at 18:11
  • At a purely pedantic level, I think it's fair to say that ADTs stand on their own and mix well with certain kinds of subtyping... whereas Scala seems to mix them in a way that feels more confusing (to me). – J. Abrahamson Aug 15 '14 at 18:14
  • @MichaelSteele, my remark still stands. With Scala's approach, what you have is not the sum type, but one of those types. – missingfaktor Aug 15 '14 at 18:24
  • @J.Abrahamson, I haven't used Argonaut in anger, but I have used Play JSON, and I cannot recollect any type inference issues that its design has caused. – missingfaktor Aug 15 '14 at 18:26
  • @J.Abrahamson, your guess about this requiring subtyping is correct. What do you mean by syntax-directedness? – missingfaktor Aug 15 '14 at 18:27
  • Syntax-directedness is the property where looking at the syntax of a fragment of code alone is sufficient to know which typing judgement is involved. So, if you see syntax `(a, b)` you know that you're dealing with a pair... until you add subtyping since now you could be dealing with typing judgements of any supertype. Section 23.1 here: https://www.cs.cmu.edu/~rwh/plbook/book.pdf – J. Abrahamson Aug 15 '14 at 18:49
  • Note that Haskell does have subtyping... but it's of a really limited form—it only occurs on quantified variables with respect to the typeclass dictionaries available, the active constraints. Universally quantified types can always add more type constraints and existentially quantified types can always add fewer constraints. So—really restricted! – J. Abrahamson Aug 15 '14 at 19:00

1 Answer


To the best of my knowledge, there are two reasons why Scala's idiomatic encoding of case classes can be bad: type inference and type specificity. The former is a matter of syntactic convenience, while the latter is a matter of increased scope of reasoning.

The subtyping issue is relatively easy to illustrate:

val x = Some(42)

The type of x turns out to be Some[Int], which is probably not what you wanted. You can generate similar issues in other, more problematic areas:

sealed trait ADT
case class Case1(x: Int) extends ADT
case class Case2(x: String) extends ADT

val xs = List(Case1(42), Case1(12))

The type of xs is List[Case1]. This is basically guaranteed to be not what you want. In order to get around this issue, containers like List need to be covariant in their type parameter. Unfortunately, covariance introduces a whole bucket of issues, and in fact degrades the soundness of certain constructs (e.g. Scalaz compromises on its Monad type and several monad transformers by allowing covariant containers, despite the fact that it is unsound to do so).
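To make the inference problem concrete, here is a small sketch with a hypothetical invariant container Box, standing in for the invariant structures mentioned above:

// Scala
final case class Box[A](value: A) // invariant in A

val inferred = Box(Case1(42))          // inferred as Box[Case1]
val wanted: Box[ADT] = Box(Case1(42))  // the annotation steers inference to Box[ADT]

// Because Box is invariant, the two types are incompatible:
// val oops: Box[ADT] = inferred       // does not compile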

So, encoding ADTs in this fashion has a somewhat viral effect on your code. Not only do you need to deal with subtyping in the ADT itself, but every container you ever write needs to take into account the fact that you're landing on subtypes of your ADT at inopportune moments.

The second reason not to encode your ADTs using public case classes is to avoid cluttering up your type space with "non-types". From a certain perspective, ADT cases are not really types: they are data. If you reason about ADTs in this fashion (which is not wrong!), then having first-class types for each of your ADT cases increases the set of things you need to carry in your mind to reason about your code.

For example, consider the ADT algebra from above. If you want to reason about code which uses this ADT, you need to be constantly thinking about "well, what if this type is Case1?" That's just not a question anyone really needs to ask, since Case1 is data. It's a tag for a particular coproduct case. That's all.

Personally, I don't care much about any of the above. I mean, the unsoundness issues with covariance are real, but I generally just prefer to make my containers invariant and instruct my users to "suck it up and annotate your types". It's inconvenient and it's dumb, but I find it preferable to the alternative, which is a lot of boilerplate folds and "lower-case" data constructors.
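For reference, the "lower-case data constructor" alternative looks roughly like this sketch (reusing the ADT from above; this is not any particular library's code):

// Scala
object ADT {
  // Widening "smart constructors": each returns the sum type, not the case type.
  def case1(x: Int): ADT = Case1(x)
  def case2(s: String): ADT = Case2(s)
}

val xs = List(ADT.case1(42), ADT.case1(12)) // inferred: List[ADT]

// The boilerplate fold that stands in for direct pattern matching
// when the case types are hidden from clients:
def fold[B](adt: ADT)(onCase1: Int => B, onCase2: String => B): B = adt match {
  case Case1(x) => onCase1(x)
  case Case2(s) => onCase2(s)
}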

As a wildcard, a third potential disadvantage to this sort of type specificity is it encourages (or rather, allows) a more "object-oriented" style where you put case-specific functions on the individual ADT types. I think there is very little question that mixing your metaphors (case classes vs subtype polymorphism) in this way is a recipe for bad. However, whether or not this outcome is the fault of typed cases is sort of an open question.
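Concretely, the style in question looks something like this sketch, where a case carries case-specific behaviour as a method (a hypothetical variant of the earlier Case1):

// Scala
case class Case1(x: Int) extends ADT {
  // A case-specific operation, available only at the Case1 type:
  def double: Case1 = Case1(x * 2)
}

Callers of double must then know they hold a Case1 rather than an ADT, which is exactly the metaphor mixing described above.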

Daniel Spiewak
  • I agree with the first point, but the second one isn't very compelling. In my experience (similar to @missingfaktor's examples), I found the opposite to be true. Knowing the type of a coproduct case allows me to ignore *the other* cases. Consider also the case of singleton types, like `1.type`, which are desired in libraries like shapeless for the extra guarantees they provide. – Ionuț G. Stan Aug 15 '14 at 22:02
  • @IonuțG.Stan Sometimes yes, the more specific type does help factor things down. It's hard to deny though that it's an extra name that you need to keep in your head, know that it's floating around, etc. – Daniel Spiewak Aug 15 '14 at 22:05
  • I guess that happens anyway, even if it represents a type or not. You still have to deal with that case in the end. – Ionuț G. Stan Aug 15 '14 at 22:20
  • How is the third point not, basically, "OOP is bad"? What's wrong with multi-paradigm programming that mixes the best features of ADTs and OOP? – Rex Kerr Aug 16 '14 at 01:04
  • @RexKerr I think even if you excise the "OOP is bad" you still have the "metaphor mixing is awkward" bit. – J. Abrahamson Aug 16 '14 at 02:02
  • I generally don't ascribe to the view that OOP is bad, but metaphor mixing is definitely very weird most of the time. Not always, but most of the time. – Daniel Spiewak Aug 16 '14 at 21:08
  • Well, let's phrase it this way. When would I ever want my data to not know how to perform the most natural computations on itself? Why would I want my data wrapped twice when it can be wrapped once? – Rex Kerr Aug 17 '14 at 00:09
  • I don't find myself ever wanting that, @RexKerr. I don't mean to say that you shouldn't, but merely want to express that I find there to be a perfectly workable metaphor which does not include that concept at all—thus leading to a potential for mixing. I'm not sure that the wrapping concept is a necessary thing, merely an artifact of Scala's implementation. – J. Abrahamson Aug 18 '14 at 05:04
  • @J.Abrahamson - Given the constraints of implementation, you can still ask: why don't you want it this way in Scala? (Haskell has different constraints of implementation that makes ADTs work well the way they do them.) – Rex Kerr Aug 18 '14 at 05:17
  • It's not constraints of implementation, it's just a different typing model. I could argue for it, but that's not my point: my point is merely that *there exists* another sensible model for this which is consistent and meaningful and therefore Scala's desire to capture both this model and a more OO-inspired model causes "weird mixing". If you want to talk about in what ways this model differs or is better or worse than the more OO/dynamic dispatch model then I suggest you open another question. – J. Abrahamson Aug 18 '14 at 12:49