Monday, September 27, 2010

Domain Models - Thinking differently in Scala & Clojure

A language that doesn't affect the way you think about programming, is not worth knowing.
- Alan J Perlis

When you model a domain you map the artifacts from the problem domain to the solution domain. The problem domain artifacts are the same irrespective of your solution domain. But the mapping process depends on your medium of modeling, the target platform, the programming language and the paradigms that it offers. Accordingly you need to orient your thought process so as to adapt to the language idioms that you have chosen for implementation.

Recently I did a fun exercise in modeling the same problem domain in two target languages - Scala, which offers a mix of OO and functional features, and Clojure, which is much more functional and makes you think more in terms of functions and combinators. The idea is to share the thought process of this domain modeling exercise and demonstrate how even similar architectural patterns of the solution can map to entirely different paradigms in the implementation model.

This is also one of the underlying themes of my upcoming book DSLs In Action. When you think of a DSL you need to think not only of the surface syntax that the user gets to know, but also of the underlying domain model that forms the core of the DSL implementation. And the underlying host language paradigms shape the way you think of your DSL implementation. In the book there are plenty of examples where I take similar examples from one problem domain and design DSLs in multiple languages. That way you get to know how your thought process needs to be re-oriented when you change implementation languages even for the same problem at hand.

Modeling in Scala, a hybrid object functional language

Consider an abstraction for a security trade. For simplicity we will consider only a small set of attributes, just those meaningful for the current discussion. Let's say we are modeling a Trade abstraction using a statically typed language like Scala that offers OO as one of the paradigms of modeling. Here's a sample implementation that models a Trade as a case class in Scala ..

Objects for coarse-grained abstractions ..

type Instrument = String
type Account = String
type Quantity = BigDecimal
type Money = BigDecimal

import java.util.{Calendar, Date}
val today = Calendar.getInstance.getTime

case class Trade(ref: String, ins: Instrument, account: Account, unitPrice: Money,
  quantity: Quantity, tradeDate: Date = today) {
  def principal = unitPrice * quantity
}


Ok .. that was simple. When we have classes as the primary modeling primitive in the language we try to map artifacts to objects. So the trade artifact of the problem domain maps nicely to the above class Trade.

But a trade has a definite lifecycle and a trade abstraction needs to be enriched with additional attributes and behaviors in the course of the various stages of the trading process. How do we add behaviors to a trade dynamically?

Consider enriching a trade with tax and fee attributes when we make an instance of a Trade. Similar to Trade we also model the tax and fee types as separate artifacts that can be mixed in with the Trade abstraction.

trait TradeTax { this: Trade =>
  def tradeTax(logic: Money => Money): Money = logic(principal)
}

trait Commission { this: Trade =>
  def commission(logic: (Money, Quantity) => Money): Money = logic(principal, quantity)
}


Now we can instantiate a trade decorated with the additional set of taxes and fees required as per the market regulations ..

lazy val t = new Trade("1", "IBM", "a-123", 12.45, 200) with TradeTax with Commission


Note how the final abstraction composes the Trade class along with the mixins defined for tax and fee types. The thought process is OO along with mixin inheritance - a successful implementation of the decorator pattern. In a language that offers classes and objects as the modeling primitives, we tend to think of them as abstracting the coarse grained artifacts of the domain.

Functions for fine grained domain logic ..

Also note that we use higher order functions to model the variable part of the tax fee calculation. The user supplies this logic as a function when invoking the tradeTax and commission calculations.

lazy val tradeTax = t.tradeTax { p => p * 0.05 }
lazy val commission = t.commission { (p, q) => if (q > 100) p * 0.05 else p * 0.07 }


When modeling with Scala as the language that offers both OO and functional paradigms, we tend to use the combo pack - not a pure way of thinking, but the hybrid model that takes advantage of both the paradigms that the language offers.
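Putting the two together, here's a minimal sketch (my own illustration - the name cashValue is not part of the domain model above) of how a client might combine the mixin-composed object with the fine grained functional logic ..

// hypothetical client-side computation combining principal, tax and commission
lazy val cashValue: Money =
  t.principal +
  t.tradeTax { p => p * 0.05 } +
  t.commission { (p, q) => if (q > 100) p * 0.05 else p * 0.07 }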

And now Clojure - a more functional language than Scala

As an alternate paradigm let's consider the same modeling activity in Clojure, a language that's more functional than Scala despite being hosted on top of the Java language infrastructure. Clojure forces you to think more functionally than Scala or Java (though not as purely as Haskell). Accordingly our solution domain modeling thoughts also need to take a functional turn. The following example is also from my upcoming book DSLs In Action and illustrates the same point while discussing DSL design in Clojure.

Trade needs to be a concrete abstraction and one way of ensuring that in Clojure is as a Map. We can also use defrecord to create a Clojure record, but that's not important for the point of today's discussion. We model a trade as a Map, but abstract its construction behind a function. Remember we are dealing with a functional language and all manipulations of a trade need to be thought out as pure functions that operate on immutable data structures.

This is how we construct a trade from a request, which can be an arbitrary structure. In the following listing, trade is the constructor function that returns a Map populated from a request structure.

Abstractions to map naturally to functions ..

; create a trade from a request
(defn trade
    "Make a trade from the request"
  [request]
  {:ref-no (:ref-no request)
   :account (:account request)
   :instrument (:instrument request)
   :principal (* (:unit-price request) (:quantity request))
   :tax-fees {}})
    
; a sample request
(def request
  {:ref-no "trd-123"
   :account "nomura-123"
   :instrument "IBM"
   :unit-price 120
   :quantity 300})


In our case the Map just acts as the holder of data; the center of attraction is the function trade, which, as we will see shortly, will be the main subject of composition.

Note that the Map that trade returns contains an empty tax-fees structure, which will be filled up when we decorate the trade with the tax fee values. But how do we do that idiomatically, keeping in mind that our modeling language is functional and offers all the goodness of immutable and persistent data structures? No, we can't mutate the Map!

Combinators for wiring up abstractions ..

But we can generate another function that takes the current trade function and gives us another Map with the tax-fee values filled up. Clojure has higher order functions and we can make a nice little combinator out of them for the job at hand ..

; augment a trade with a tax fee value
(defn with-values [trade tax-fee value]
  (fn [request]
    (let [trdval (trade request)
          principal (:principal trdval)]
       (assoc-in trdval [:tax-fees tax-fee]
         (* principal (/ value 100))))))


Here trade is a function that with-values takes as input; it generates another function that decorates the original trade Map with the passed-in tax-fee name and value. The value we pass is a percentage that's calculated on the principal of the trade. I have kept it simpler than the Scala version, which models some more logic for calculating each individual tax fee value. This is the crux of the decorator pattern in our model. We will soon dress it up a little bit and give it a user friendly syntax.

and Macros for DSL ..

Now we can define another combinator that plays around nicely with with-values to model the little language that our users can relate to - the vocabulary of their trading desk.

Weeee .. it's a macro :)

; macro to decorate a function
(defmacro with-tax-fee
  "Wrap a function in one or more decorators"
  [func & decorators]
  `(redef ~func (-> ~func ~@decorators)))


and this is how we mix it with the other combinator with-values ..

(with-tax-fee trade
  (with-values :tax 12)
  (with-values :commission 23))


Note how we made with-tax-fee a decorator that's replete with functional idioms that Clojure offers. -> is the Thrush combinator (the thread-first macro) in Clojure; the macro redefines the original function trade by threading it through the decorators. redef is just a helper macro that redefines the root binding of the function, preserving its metadata. This is adapted from the decorator implementation that Compojure offers.

We had the same problem domain to model. In the solution domain we adopted the same overall architecture of augmenting behaviors through decorators. But the solution modeling media were different. Scala offers a hybrid of OO and functional paradigms and we used both of them to implement decorator based behavior in the first case. In the second effort, we exploited the functional nature of Clojure. Our decorators and the subject were all functions, and using the power of higher order functions and a combinator based approach we were able to thread the subject through a series of decorators to augment the original trade abstraction.

Friday, September 03, 2010

Towards generic APIs for the open world


In my last post on how Clojure protocols encourage open abstractions, I did some quick rounds between type classes in Haskell and protocols in Clojure. At the end, in the section titled "Not really a type class", I mentioned the read function of Haskell's Read type class. read takes a String and returns a type - hence it doesn't dispatch on the function argument, but rather on the return type. Clojure protocols can't do this; I am not aware of any dynamic language that can. Check out James Iry's insightful comment on this subject on the post.


With type classes all dispatch is static - the dispatch map is passed as a dictionary of types and inferred by the compiler. What benefit does this bring us? Do we really get anything special when the language supports APIs like the read method of Haskell's Read type class?


In this post I try to explore how type classes help design generic APIs that are open and can work seamlessly with abstractions that you implement much later in the timeline than the type class itself. This is in contrast to subtype polymorphism, where all subtypes are bound by the contracts that the super type exposes. In this sense subtype polymorphism is closed.
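To make the "closed" part concrete, here's a small sketch in Scala (my own illustration, not from the original post) of the subtype-based alternative - a hypothetical Parseable trait has to be implemented when a type is defined, so pre-existing types can never join in later ..

// closed world: participation requires extending the trait at definition time
trait Parseable[T] {
  def parse(s: String): T
}

// a type we own can opt in when we define it ...
case class Ticker(symbol: String) extends Parseable[Ticker] {
  def parse(s: String) = Ticker(s)
}

// ... but pre-existing types like Int, Float or java.util.Date cannot be
// retrofitted with Parseable - their definitions are already fixed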


This post is inspired in part by the excellent article Generalizing APIs by Edward Z. Yang. For this post I will use Scala, my current language of choice for most of the things I do today.


My generic API


I want to implement a read API like the one in Haskell encoded in a Scala type class .. Let's make it generic in the type that it returns ..


// type class
// reads a string, returns a T
trait Read[T] {
  def read(s: String): T
}

For the open world


We can define instances of this type class by instantiating the trait as objects. Type classes are implemented in Scala using implicits. In case you're not familiar with the concept, here's what I wrote about them some time back.


// instance for Int
implicit object IntRead extends Read[Int] {
  def read(s: String) = s.toInt
}

// instance for Float
implicit object FloatRead extends Read[Float] {
  def read(s: String) = s.toFloat
}

These are very much like what you would do with type class instances in Haskell. You can even create instances for your own abstractions ..


case class Name(last: String, first: String)

object NameDescription {
  def unapply(s: String): Option[(String, String)] = {
    val a = s.split("/")
    Some((a(1), a(0)))
  }
}

// instance for Name
import NameDescription._
implicit object NameRead extends Read[Name] {
  def read(s: String) = s match {             
    case NameDescription(l, f) => Name(l, f)
    case _ => error("invalid")
  }
}

So the Read type class in Scala is generic enough to be instantiated for all kinds of abstractions. Note that unlike interfaces in Java, the polymorphism is not coupled with inheritance hierarchies. With an interface, your abstraction needs to implement the interface statically, which means that the interface has to exist before you design your abstraction. With type classes, the abstractions for Int and Float existed well before we defined the Read type class.
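To stress that openness, here's one more instance - a sketch of my own, not from the post - for yet another pre-existing type, Boolean, added without touching either Boolean or the Read trait ..

// instance for Boolean - retrofitted onto a type that predates our type class
implicit object BooleanRead extends Read[Boolean] {
  def read(s: String) = s.toBoolean
}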


Now if we have a generic function that takes a String, we can make it return an instance of the type it is generic on.


def foo[T : Read](s: String) = implicitly[Read[T]].read(s)

foo[Int]("123") // 123
foo[Float]("123.0") // 123.0
foo[Name]("debasish/ghosh") // Name("ghosh", "debasish")

Ok .. so that was our generic read API adapting readily to already existing abstractions. In this case it's exactly the Scala variant of how simple type class instances behave in Haskell. The authors of Real World Haskell use the term open world assumption to describe this feature of the type class system.


Context for selecting the API instance


When the function foo is invoked, the compiler needs to find the exact instance of the Read type class - from the method dictionary in case of Haskell and from the list of available implicit conversions in case of Scala. For this we specify the context bound of the generic type T as T : Read. This is the same as the context of the type class that we have in Haskell. It specifies that the method foo can return any type T provided the type is an instance of the type class Read. Apart from using the context bound, in Scala you can also use view bounds to implement the context of a type class. The Haskell equivalent is ..


foo :: Read a => String -> a
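For reference, the context bound T : Read in foo is just sugar for an extra implicit parameter - the following sketch (fooExplicit is my own name, not from the post) is equivalent to the foo above ..

// the context bound desugars into an implicit evidence parameter
def fooExplicit[T](s: String)(implicit r: Read[T]): T = r.read(s)

fooExplicit[Int]("123") // 123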

Irrespective of Haskell or Scala, our API becomes hugely expressive through such constraints that the static type system allows us to write. And all these constraints are checked at compile time.


Context in implementing specific instances


When defining a generic API, you can also set up a context for specific instances of the type class. Consider implementing read for a List datatype. Haskell defines the instance as ..


instance Read a => Read [a] where ..

Note the context Read a following the instance keyword. This is called the context of the type class instance which says that we can read a List of a only if all individual a's also implement the Read type class. 


We do this in Scala using conditional implicits as ..


implicit def ListRead[A](implicit r: Read[A]) = 
  new Read[List[A]] {
    def read(s: String) = {
      val es = s.split(" ").toList
      es.map(r.read(_))
    }
  }

The implicit definition itself takes another implicit argument to validate at compile time that the individual elements of the List are also instances of the type class. This is similar to what the context does in case of Haskell's type class instantiation.


foo[List[Int]]("12 234 45 678") // List(12, 234, 45, 678)
foo[List[Float]]("12.0 234.0 45.0 678.0") // List(12.0, 234.0, 45.0, 678.0)
foo[List[Name]]("debasish/ghosh maulindu/chatterjee nilanjan/das")
  // List(Name("ghosh", "debasish"), Name("chatterjee", "maulindu"), Name("das", "nilanjan"))

As part of common GHC extensions, Haskell also provides support for overlapping instances of type classes ..


instance Read a => Read [a] where ..
instance Read [Int] where ..

In such cases, although there are two possible matches for [Int], the compiler can make an unambiguous decision and select the most specific instance. With Scala there is no such ambiguity to resolve, since Scala anyway allows multiple implementations of the same type class and it's up to the user to import the specific one into the module.
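For example - a sketch of my own, not from the post - imagine two alternative List instances living in separate modules instead of the single implicit ListRead above; the user then picks one simply by importing it ..

// two alternative instances of Read[List[A]], each local to its own module
object SpaceSeparated {
  implicit def listRead[A](implicit r: Read[A]): Read[List[A]] = new Read[List[A]] {
    def read(s: String) = s.split(" ").toList.map(r.read(_))
  }
}

object CommaSeparated {
  implicit def listRead[A](implicit r: Read[A]): Read[List[A]] = new Read[List[A]] {
    def read(s: String) = s.split(",").toList.map(r.read(_))
  }
}

// select the instance by importing the module you want
// (assuming the generic ListRead above is not also in scope)
import CommaSeparated._
foo[List[Int]]("12,234,45") // List(12, 234, 45)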


In this post I discussed the power that you get with type class based generic API design. In functional languages like Haskell, type classes are the most potent way to implement extensible APIs for the open world. Of course in object functional languages like Scala, you also have the power of subtyping, which comes in handy in many circumstances. It will be interesting to come up with a comparative analysis of situations where we prefer one to the other. But that's up for some other day, some other post ..