Friday, February 19, 2010

His tip cleans up the examples in the post Chaining Partial Functions with orElse. The secret is to define the type alias type =>?[-A, +B] = PartialFunction[A, B]. This alias may be added to Predef in the future, but until then you can add it yourself.
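A minimal sketch of the alias in use (the case bodies and value names are my own):

```scala
// The type alias from the tip: =>? as infix shorthand for PartialFunction
type =>?[-A, +B] = PartialFunction[A, B]

// Factored-out case statements, typed with the concise infix syntax
val one: Int =>? String = { case 1 => "one" }
val other: Int =>? String = { case _ => "many" }

// Chained exactly as in the referenced post
val all = one orElse other
```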

Thursday, February 18, 2010

PartialFunctions are extremely valuable Scala constructs that are used in many APIs. Commonly you will encounter the pattern:

obj match {
  case "Val" => // do something
  case _     => // handle all other cases
}

It is intuitive and obvious how to share the code of the right hand side of a case statement by factoring that code out to a method. But would it not be useful to be able to factor out an entire case statement (PartialFunction) and later chain them together as needed?

This is indeed possible and very easy to do:

/*
 We need to declare PartialFunctions often, so for brevity I am adding this alias
*/
scala> import scala.{PartialFunction => PF}
import scala.{PartialFunction=>PF}

/*
 You have to explicitly declare the type because the type inferencer cannot know what type of PartialFunction to create.
 A PartialFunction is strictly typed, so some functions can only be used on Ints, for example.
*/
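A sketch of the technique the post describes: whole case statements factored out as PartialFunctions and chained with orElse. The names and example types here are my own.

```scala
import scala.{PartialFunction => PF}

// Each factored-out case statement needs an explicit type declaration
val handleInt: PF[Any, String]    = { case i: Int    => "int: " + i }
val handleString: PF[Any, String] = { case s: String => "string: " + s }
val fallback: PF[Any, String]     = { case _         => "unknown" }

// Chain them as needed; the first function defined at the argument wins
val handler = handleInt orElse handleString orElse fallback
```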

Monday, February 15, 2010

At first glance the "self annotation" declaration seems similar to extending another class. (For a look at self annotations read the topic: Self Type.) They are completely different, but the comparison is understandable since both of them provide access to the functionality of the referenced class.

For example both of the following compile:

class Base {
  def magic = "bibbity bobbity boo!!"
}

trait Extender extends Base {
  def myMethod = "I can "+magic
}

trait SelfTyper {
  self : Base =>

  def myMethod = "I can "+magic
}

But the two are completely different. Extender can be mixed in with any class and adds both "magic" and "myMethod" to the class it is mixed with. SelfTyper can only be mixed in with a class that extends Base, and SelfTyper only adds the method "myMethod", NOT "magic".

Why is the self annotation useful? Because it provides a way of declaring dependencies. One can think of the self annotation declaration as the phrase "I am useable with" or "I require a".

The following example demonstrates one possible reason to use self annotations instead of extend.

Note: These examples can be pasted into the REPL, but I have not shown the REPL output here because it would make the examples too long.

import java.io._
import java.util.{Properties => JProps}

trait Properties {
  def apply(key: String): String
}

trait XmlProperties extends Properties {
  import scala.xml._
  def xml(key: String) = Elem(null, key, Null, TopScope, Text(apply(key)))
}

trait JSonProperties extends Properties {
  def json(key: String): String = "%s : %s".format(key, apply(key))
}

trait StreamProperties {
  self : Properties =>

  protected def source: InputStream

  private val props = new JProps()
  props.load(source)

  def apply(key: String) = props.get(key).asInstanceOf[String]
}

trait MapProperties {
  self : Properties =>

  protected def source: Map[String, String]

  def apply(key: String) = source.apply(key)
}

val sampleMap = Map("1" -> "one", "2" -> "two", "3" -> "three")

val sampleData = """1=one
2=two
3=three"""

val sXml = new XmlProperties() with StreamProperties {
  def source = new ByteArrayInputStream(sampleData.getBytes)
}

val mXml = new XmlProperties() with MapProperties {
  def source = sampleMap
}

val sJSon = new JSonProperties() with StreamProperties {
  def source = new ByteArrayInputStream(sampleData.getBytes)
}

val mJSon = new JSonProperties() with MapProperties {
  def source = sampleMap
}

sXml.xml("1")

mXml.xml("2")

sJSon.json("1")

mJSon.json("2")

The justification for using self annotations here is flexibility. A couple of other solutions would be:

Use subclassing - this is a poor solution because there would be an explosion of classes. Instead of having 5 traits you would need 7: Properties, XmlProperties, JSonProperties, XmlStreamProperties, XmlMapProperties, JsonStreamProperties and JsonMapProperties. And if you later wanted to add a new type of properties, or a new source like reading from a database, then you would need 2 new subclasses.

Composition - Another strategy is to construct the XmlProperties with a strategy object that reads from the source. This is essentially the same mechanism, except that you need to build and maintain the dependencies. It also makes layering more difficult. For example:

trait IterableXmlProperties {
  self : MapProperties with XmlProperties =>

  def xmlIterable = source.keySet map {xml _}
}

new XmlProperties with MapProperties with IterableXmlProperties {def source = sampleMap}

The next question that comes to mind is: why use extends at all if self annotations are so flexible? My answer (and I welcome discussion on this point) has three parts.

The first is of semantics and modeling. When designing a model it is often more logical to use inheritance because of the semantics that comes with inheriting from another object.

Another argument is pragmatism. Imagine the collections library where there is no inheritance. If you wanted a map with Iterable functionality you would have to always declare Traversable with Iterable with Map (and this is greatly simplified). That declaration would have to be used for virtually all methods that require both the Iterable and Map functionality. To say that is impractical is an understatement. Also the semantics of Map is changed from what it is now. The trait Map currently includes the concept of Iterable.

The last point is sealed traits/classes. When a trait is "sealed" all of its subclasses are declared within the same file and that makes the set of subclasses finite which allows certain compiler checks. This (as far as I know) cannot be done with self annotations.
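A brief sketch of the sealed-trait point above (the shapes are my own illustration): because all subclasses are in one file, the compiler can check a match for exhaustiveness, which a self-annotation-based design cannot provide.

```scala
// sealed: all subclasses must live in this file, so matches can be checked
sealed trait Shape
case class Circle(r: Double) extends Shape
case class Square(side: Double) extends Shape

// Omitting a case here would produce a compiler warning about a missing match
def area(sh: Shape): Double = sh match {
  case Circle(r)    => math.Pi * r * r
  case Square(side) => side * side
}
```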

My personal preference is to use contexts because they are more flexible and can have names associated with them. In addition contexts can be shared between specifications and multiple contexts can be used within a single specification. This example is kept very simple for demonstration purposes

Thursday, February 11, 2010

The idea is that, depending on a given parameter of type T, a particular type of object is required. There are several ways to do this. One would be to use matching to match the type T and create the correct object. Most likely the biggest drawback to matching is caused by type erasure. The following solution gets around that issue.
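A small sketch of the erasure drawback mentioned above: the element type of a List is erased at run time, so a match cannot tell a List[Int] from a List[String].

```scala
// Both lists land in the same case; the element type is gone at run time
def kind(x: Any): String = x match {
  case _: List[_] => "a list (element type erased)"
  case _          => "not a list"
}
```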

Two very interesting points.

Implicit methods do not require parameters. They can be selected based only on type parameters

Structural Types are not limited to defining a single method. In that regard they are very similar to interfaces, without the binary incompatibility issues. However, do not be fooled into thinking they are the same thing. For one, reflection is used, so performance can be an issue in certain cases; also, interfaces/traits have semantics that structural types do not.
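A sketch of the first point: an implicit with no value parameters, selected purely by the type parameter T (the Maker trait and names here are illustrative, not from the original post).

```scala
trait Maker[T] { def make: T }

// Candidates the compiler can choose from, based only on T
implicit val intMaker: Maker[Int]       = new Maker[Int]    { def make = 0 }
implicit val stringMaker: Maker[String] = new Maker[String] { def make = "" }

// No runtime matching on T is needed, so type erasure is not a problem
def create[T](implicit m: Maker[T]): T = m.make
```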

/*
 Defining a type that has both a length and a charAt method.
 Just a warning: if you leave off the () after length this will not work. This is not a bug; Martin kindly left a comment on why it is not.
*/
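A sketch of such a structural type, assuming the () after length is needed to match String's Java signature as the warning notes (the type alias and method names are my own):

```scala
import scala.language.reflectiveCalls  // silences the feature warning in modern Scala

// Any object with both methods conforms, String included
type CharSeqLike = { def length(): Int; def charAt(i: Int): Char }

def first(x: CharSeqLike): Char = x.charAt(0)
def size(x: CharSeqLike): Int = x.length
```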

Monday, February 8, 2010

Using reflection can be a real pain in Java, since the API is a Java API: it consists of many gets and searches through collections, not to mention the many exceptions that need to be handled. In Scala there is a wonderful way to clean a reflective call down to a single line (assuming you don't want to worry about handling exceptions). Here structural typing can really be a pleasure.

// I am assigning a string to an Any reference
scala> val s: Any = "hello :D"
s: Any = hello :D

// Any does not have a length method
scala> s.length
<console>:6: error: value length is not a member of Any
       s.length
         ^

/*
 But I can cast it to a structural type with a length method
*/
scala> s.asInstanceOf[{def length:Int}].length
res2: Int = 8

There are restrictions to this. For example implicits will not work:

/*
 The method r is part of StringLike (or RichString in Scala 2.7)
 and there is an implicit conversion from String to RichString/StringLike.
 The structural type does not try to apply the implicits because implicits are a
 compile time artifact and that information is not kept at run time.
*/
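A sketch of that restriction (my own example, not from the original post): the cast to a structural type compiles, but the reflective call fails at run time because String itself has no r method; r only exists via the implicit conversion.

```scala
import scala.language.reflectiveCalls
import scala.util.matching.Regex

val s: Any = "hello"

// The cast succeeds (it is unchecked), but the reflective lookup of r fails
val failed = try {
  s.asInstanceOf[{ def r: Regex }].r
  false
} catch {
  case _: NoSuchMethodException => true
}
```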

Friday, February 5, 2010

Structural types allow one to declare types based on the methods the type has. For example you could define a method that takes any class containing a close method. This is fairly analogous to duck-typing in dynamic languages, except that it is statically enforced.

The main example used here was from a comment on the Code Monkeyism Blog. The commenter further explains that this example is in fact from Beginning Scala chapter 4 (which I would like to read but have not yet had the time.)

/*
 A can be any object that has a close method.
 This is statically typed, which imposes some restrictions that are explained later
*/

That is extremely powerful and the consequences will be visited more in the future. But because structural typing is statically enforced, it is not quite as flexible as a dynamic language's version of duck typing. For example, you cannot call a method that the declared structural type does not list.
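The code for this example was lost above, so here is a sketch in its spirit (the using helper and Conn class are my own names): any object with a close method can be passed, and this is enforced at compile time.

```scala
import scala.language.reflectiveCalls

// A can be any object that has a close method
def using[A <: { def close(): Unit }, B](resource: A)(f: A => B): B =
  try f(resource) finally resource.close()

class Conn {
  var closed = false
  def close(): Unit = { closed = true }
}

val conn = new Conn
using(conn) { _ => "did some work" }
// Statically enforced: using(conn) { r => r.reconnect() } would not compile,
// because reconnect is not part of the declared structural type
```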

I don't like to commit myself about heaven and hell - you see, I have friends in both places.

Mark Twain

/*

Another example using some of the matcher functionality

*/

scala> expr.replaceAllIn(quote, m => m.start.toString)

res6: String =

I don't lik11 to commit mys26lf about h37av40n and h48ll - you s5960, I hav68 fri73nds in both plac90s.

Mark Twain

/*
 Another crazy useful method is replaceSomeIn. It is similar to the replaceAllIn that takes a function, except that the function in replaceSomeIn returns an Option. If it returns None there is no replacement; otherwise a replacement is performed. Very nice when dealing with complex regular expressions.
 In this example we are replacing all 'e's that start before the 50th character in the string with -
*/
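A sketch of that example. The quote and expr values were defined earlier in the original post; they are reconstructed here from the surrounding output so the snippet stands alone.

```scala
val quote = "I don't like to commit myself about heaven and hell - you see, I have friends in both places.\nMark Twain"
val expr = "e".r

// Replace only the 'e's whose match starts before index 50; None means no replacement
val result = expr.replaceSomeIn(quote, m => if (m.start < 50) Some("-") else None)
```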

A simple but handy use for Option is to select the first valid option from a selection of possible choices. Sound vague? Well, that is because it can be used in many different situations. The one presented here is: the program needs a directory that can be set by the user as either a system property, an environment variable or a default value. The Java code is a nightmare of if (xxx == null) statements. The Scala code is beautiful.
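A sketch of the pattern just described (the property and environment variable names here are made up for illustration):

```scala
// First valid choice wins: system property, then environment variable, then default
val dir = Option(System.getProperty("app.data.dir"))
  .orElse(Option(System.getenv("APP_DATA_DIR")))
  .getOrElse("/tmp/app-data")
```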

Monday, February 1, 2010

In Java a common pattern in class constructors is to assign field values, and often there are several intermediate values used for the calculation. If the code is ported to Scala, the resulting class will have the intermediate values as fields, which take up space in the object. However the issue is easily worked around. Let's look at a couple of examples.

Example 1: Assigning a single field

// Java
import java.io.File;

/**
 * No real logic behind this class, but for some reason it needs the path of a tmp directory in the working directory
 */
class OneAssignment {
    final String field;

    public OneAssignment() {
        File file = new File("tmp");
        if (!file.exists()) {
            file.mkdirs();
        }
        field = file.getAbsolutePath();
    }
}

In Scala the naive way to port this would be:

// Scala
import java.io.File

class OneAssignment {
  val file = new File("tmp")
  if (!file.exists()) {
    file.mkdirs()
  }
  val field = file.getAbsolutePath()
}

The problem is that it now has an extra field, "file". The correct way to port this would be as follows:

// Scala
import java.io.File

class OneAssignment {
  /*
   Notice that the assignment is in a block, so file is only visible within the block
  */
  val field = {
    val file = new File("tmp")
    if (!file.exists()) {
      file.mkdirs()
    }
    file.getAbsolutePath()
  }
}

Example 2: Assigning multiple fields

// Java
import java.io.File;

/**
 * Basically the same as the last example but multiple fields are assigned.
 * Notice that 2 fields depend on the temporary file variable but count does not
 */
class MultipleAssignments {
    final String tmp, mvn_repo;
    final int count;

    public MultipleAssignments() {
        File file = new File("tmp");
        if (!file.exists()) {
            file.mkdirs();
        }
        tmp = file.getAbsolutePath();
        count = file.listFiles().length;

        File home = new File(System.getProperty("user.home"));
        mvn_repo = new File(home, ".m2").getPath();
    }
}

The Scala port:

// Scala
import java.io.File

class MultipleAssignments {
  /*
   When multiple fields depend on the same temporary variables, the fields can be assigned together from one block by returning a tuple and using Scala's matching to expand the tuple during assignment. See previous topics on assignment for details
  */
  val (tmp, count) = {
    val file = new File("tmp")
    if (!file.exists()) {
      file.mkdirs()
    }
    val tmp = file.getAbsolutePath()
    val count = file.listFiles.length
    (tmp, count)
  }

  val mvn_repo = {
    val home = new File(System.getProperty("user.home"))
    new File(home, ".m2").getPath()
  }
}

In some ways the Scala port is cleaner in that it splits the constructor up and decouples the dependencies between fields.


About Me

Jesse Eichar: I am a senior software developer at Camptocamp SA (Swiss office) and specialize in open-source geospatial Java projects. I am a member of the uDig steering committee and a contributor to Geotools and Mapfish. In addition I regularly work with Geoserver and Geonetwork.

In my free time I am a Scala enthusiast. I am working on the Scala IO incubator project and WebSpecs, a Specs2-based testing framework for web applications.