Alexander Reelsen

Backend developer, productivity fan, likes the JVM, full text search, distributed databases & systems

Spark, groovy & JRebel - a productive combination
Feb 2, 2013
10 minute read

JVM web frameworks have too much unused boilerplate, need a full server restart to reflect changes, and are not fun in general - or so goes the public opinion.

Most of the time this is true, but many frameworks can be extended with a few tweaks and become fun again.

Take this code sample: it looks somewhat similar to a route definition in expressjs, but is actually groovy code using spark. I bet you understand it almost immediately.

get "/", { Request request, Response response ->
  jade "index.jade", [title: 'My page']
}


Before you read on, get a first impression of the spark framework at its homepage, especially the documentation. Basically, Spark adds some nice syntactic sugar on top of a servlet application. It uses jetty internally and can run standalone or as a servlet inside a servlet container. The Spark source is available on github, and despite the lack of commits in the last months, I mailed the author Per Wendel and he made it clear to me that the project is not dead.

A simple definition, which includes a preprocessing filter and a normal route with a parameter, looks like this:

import static spark.Spark.*;
import spark.*;

public class HelloWorld {
  public static void main(String[] args) {
    before(new Filter() { // matches all routes
      public void handle(Request request, Response response) {
        boolean authenticated = false;
        // ... check if authenticated
        if (!authenticated) {
          halt(401, "You are not welcome here");
        }
      }
    });
    get(new Route("/hello/:name") {
      public Object handle(Request request, Response response) {
        return "Hello: " + request.params(":name");
      }
    });
  }
}
The above listing shows several neat features of spark:

  • Preprocessing filters
  • Routes with matchers
  • Simple returning of data within your route
  • Nice simplifications for request and session classes

So, what is missing here? Why didn't I just stick with it and move on? Fair enough:

  • Having developed with a mixture of groovy and java for the last two years, this feels somewhat complex for rapid prototyping - even though it is still tons better than most servlet based web development.
  • Boilerplate: see the big anonymous inner classes of the route and filter? Just too much.
  • Returning data is too simplistic; I want to return json or render a jade template.
  • You cannot check URL parameters in filters. The most common example: the request is /foo/:name/checkout and you want to make sure the user in the session is the same user as in the URL.

Whether you are using Spark for rapid prototyping or for real web applications - always make sure you get fast development cycles, reduce setup time and use a technology you prefer. Lacking any of these will cost money.

Setting up a project with groovy

As we will be using groovy, I won't make the mistake of using maven for this project. Bootstrap a small project like this:

mkdir groovy-spark
cd groovy-spark
mkdir -p src/main/groovy
mkdir -p src/main/resources
mkdir -p src/test/groovy
mkdir -p src/test/resources

The next step is to create a build.gradle file:

apply plugin: 'groovy'
apply plugin: 'idea'

repositories {
    maven { url "" }
    maven { url "" }
}

// avoid groovy/jade asm clashes
configurations {
    all*.exclude group: 'asm'
}

dependencies {
    compile 'spark:spark:'
    compile 'de.neuland:jade4j:0.3.8'
    compile 'org.codehaus.groovy:groovy:2.1.0-beta-1'
    compile 'org.slf4j:slf4j-log4j12:1.6.4'
    testCompile 'junit:junit:4.11'
}

In case you are feeling that gradle is incredibly slow to use, make sure that you set GRADLE_OPTS="-Dorg.gradle.daemon=true" in your environment.
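Alternatively, the daemon can be enabled per user or per project via a gradle.properties file, which has the same effect as the environment variable:

```properties
# ~/.gradle/gradle.properties or <project>/gradle.properties
org.gradle.daemon=true
```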

Using spark with groovy

As a first try, let's shorten the above application and remove all the anonymous classes:

import spark.*

class WebServer extends SparkGroovy {
  static void main(String[] args) {
    new WebServer().init();
  }

  void init() {
    before { Request req, Response res ->
      boolean authenticated = false;
      // ... check if authenticated
      if (!authenticated) {
        halt(401, "You are not welcome here");
      }
    }
    get "/hello/:name", { Request request, Response response ->
      return "Hello " + request.params('name') + "\n"
    }
  }
}

The anonymous classes have been replaced with closures, resulting in much less code while remaining fully readable. Each closure is executed in the context of a Filter or Route class inside the SparkGroovy class. SparkGroovy.get() looks like this (but can be treated as a hidden implementation detail):

def get(String path, Closure closure) {
  Spark.get(new Route(path) {
    def handle(Request request, Response response) {
      closure.delegate = this
      return, response)
    }
  })
}

Neat renderers

The next step is to add support for rendering different formats. The resulting spark framework code should look like this for rendering JSON or a jade template (hence the jade4j dependency in the build.gradle file):

get "/foo.json", { Request request, Response response ->
  json([foo: 'bar'])
}

get "/jade", { Request request, Response response ->
  jade "test.jade", [foo: 'bar', pageName: 'My page']
}

In order to support this, it is pretty easy to extend the SparkGroovy class (the JSON class is included due to the Jetty dependency; alternatively you could just use the JsonBuilder in the groovy-all package):

static JadeConfiguration config = new JadeConfiguration()

static {
  config.setTemplateLoader(new FileTemplateLoader("src/main/resources/views/", "UTF-8"))
}

def json(Object obj) {
  return JSON.default.toJSON(obj)
}

def jade(String template, Object obj) {
  JadeTemplate jadeTemplate = config.getTemplate(template)
  return config.renderTemplate(jadeTemplate, obj)
}

The JadeConfiguration class is configured to look for templates in src/main/resources/views as its root directory, which you need to create in order to render jade templates.
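For completeness, a minimal src/main/resources/views/test.jade that would fit the map from the example above might look like this (a sketch; check the jade4j documentation for the exact syntax your version supports):

```jade
h1= pageName
p foo is #{foo}
```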

Making spark really groovy

Chaining closures

If you have ever used node and expressjs (whenever I mention this at java user groups, most of the attendees have one of those "haha, he really said it" laughs. Think about your attitude and the market share your favourite framework is losing to this powerful stack), you might know the chain of responsibility pattern it uses, which lets you push a request through a series of components. Each component may decide whether it actually does something - like writing out data or checking authentication - or just passes the request on. This is a very neat pattern, as it covers one of my requirements above: checking URL parameters and rejecting the request in case of a bad check.

def get(String path, Closure... closures) {
  spark.Spark.get(createClosureBasedRouteForPath(path, closures))
}

private Route createClosureBasedRouteForPath(String path, Closure... closures) {
  new Route(path) {
    def handle(Request request, Response response) {
      closures*.delegate = this
      return closures*.call(request, response).findAll { it }.join()
    }
  }
}
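To illustrate the idea independently of spark, here is a small plain-Java sketch of the same findAll/join chain (class and method names are made up for the example): each handler may return a piece of output or null, and the results of all handlers that produced output are concatenated.

```java
import java.util.*;
import java.util.function.*;
import java.util.stream.*;

public class ChainDemo {
    // Call every handler in order, drop null results, join the rest -
    // the Java analogue of closures*.call(...).findAll { it }.join()
    static String handleChain(List<Function<Map<String, String>, String>> handlers,
                              Map<String, String> request) {
        return
                .map(h -> h.apply(request))
                .filter(Objects::nonNull)
                .collect(Collectors.joining());
    }

    public static void main(String[] args) {
        // A filter-style handler that passes the request on (returns null)
        Function<Map<String, String>, String> auth = req -> null;
        // A handler that actually produces output
        Function<Map<String, String>, String> greeter = req -> "Hello " + req.get("name");

        Map<String, String> request = Map.of("name", "foo");
        System.out.println(handleChain(List.of(auth, greeter), request)); // Hello foo
    }
}
```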

So, what exactly does this minor change to the SparkGroovy.get() method do? Well, actually a lot. It allows the developer to chain closures for routes, as in this example:

def authCheck = { Request request, Response response ->
  if (request.session().attribute('userLogin') != request.params(':name')) {
    halt(401, "No permissions to check for ${request.params(":name")}\n")
  }
}

get "/greet/:name", authCheck, { Request request, Response response ->
  return "Hello " + request.params('name') + "\n"
}

Another very cool feature of this "chain of responsibility" pattern is that it supports an arbitrary number of output formats: simply chain any number of renderers.

def jsonClosure = { Request request, Response response ->
  if (request.contentType() == 'application/json')
    json([name: request.attribute('name')])
}

def xmlClosure = { Request request, Response response ->
  if (request.contentType() == 'application/xml')
    '<name>' + request.attribute('name') + '</name>\n'
}

get "/format", { Request request, Response response ->
    request.attribute("name", "some cool name")
}, jsonClosure, xmlClosure
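The selection logic behind those renderers can also be sketched in plain Java, independent of spark (names are hypothetical): each renderer only produces output when the content type matches, so exactly one of the chained renderers contributes to the response.

```java
import java.util.*;
import java.util.function.*;
import java.util.stream.*;

public class FormatDemo {
    // Wrap a render function so it only fires for a matching content type,
    // otherwise it returns null and the chain moves on.
    static Function<Map<String, String>, String> renderer(String contentType,
            Function<Map<String, String>, String> render) {
        return req -> contentType.equals(req.get("Content-Type")) ? render.apply(req) : null;
    }

    public static void main(String[] args) {
        var json = renderer("application/json", req -> "{\"name\": \"" + req.get("name") + "\"}");
        var xml  = renderer("application/xml",  req -> "<name>" + req.get("name") + "</name>");

        Map<String, String> request = Map.of("Content-Type", "application/xml", "name", "foo");
        // Run all renderers, keep whatever matched, join the results
        String body = Stream.of(json, xml)
                .map(r -> r.apply(request))
                .filter(Objects::nonNull)
                .collect(Collectors.joining());
        System.out.println(body); // <name>foo</name>
    }
}
```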

To try the authentication sample, you can now log in first and then use the session cookie to call the greeting URL:

# curl -v -X POST localhost:4567/login/foo
> POST /login/foo HTTP/1.1
> User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
> Host: localhost:4567
> Accept: */*
< HTTP/1.1 204 No Content
< Set-Cookie: JSESSIONID=1pouphqxh5z3l1pusx23ewksw6;Path=/
< Expires: Thu, 01-Jan-1970 00:00:00 GMT
< Server: Jetty(7.3.0.v20110203)
# Now use the session id cookie for greeting
# curl -v localhost:4567/greet/foo --header "Cookie: JSESSIONID=1pouphqxh5z3l1pusx23ewksw6"
> GET /greet/foo HTTP/1.1
> User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8r zlib/1.2.3
> Host: localhost:4567
> Accept: */*
> Cookie: JSESSIONID=1pouphqxh5z3l1pusx23ewksw6
< HTTP/1.1 200 OK
< Content-Length: 10
< Server: Jetty(7.3.0.v20110203)
Hello foo

I know that you would usually design a web framework to split view and controller. The controller should return a list of elements to be rendered, whereas the view should cater for the format in this particular case. This is not solved well here and could be done much better. Remember, this is prototyping.

Adding JRebel support to your application

Ok, so now I have shortened a lot of code - but still, almost any other web framework (like rails, node or php based ones) will smile at the stop/change/compile/start cycles the typical java application suffers from. There are not too many frameworks that solve this problem at the framework level. Playframework 1 and 2 are among them; the ninja framework tries to solve this via the maven-jetty-plugin (as jetty supports a hot reloading mechanism by default, which I have not yet explored much and which seems to require a servlet based setup), but that's it. Please name other web frameworks with this feature.

Anyway, JRebel is a commercial product (free trial available) that does reliable class reloading. Either use your existing license key or obtain one, and make sure your JRebel installation is working. There is also a gradle plugin available, but somehow I was not able to get it to work - though I only played with it for 10 minutes.

Using your IDE with JRebel

As I converted to intellij in the last two months due to eclipse 4.0 being unusable, my advice here is to simply install the IDE plugin - but there are plugins for all the big IDEs out there; get them at the download page. Also make sure you configure the plugin appropriately for your IDE after the installation.

Installing the spark jrebel plugin

First, check out the git repository via git clone, then run mvn package inside it, and you should get a jar at target/jrebel-plugin-spark-0.1-SNAPSHOT.jar.

Configuring IntelliJ

You need to set the following properties in order to have a correctly running spark application with JRebel support:


Now fire up your application via Shift+Alt+F10 (on Mac OS) and select Run with JRebel. Your application should start with the JRebel banner first (in case of the trial version). Whenever some code changes, the jrebel spark plugin clears out all the routes and then calls the method and class you specified in the properties. This may lead to memory leaks, but I do not care for prototyping.

Heads up! It is not fully automated yet. You still need to compile your code manually via Shift+Cmd+F9 in IntelliJ, as this feature does not work while the application is running (that is mentioned at the configuration setting). This works flawlessly in eclipse by the way, in case you set the Build automatically option.

Extending it further aka TODO

The whole project was created over a weekend, so the quality is sort of low - but sufficient for my way of rapid prototyping.

  • Your prototype has to have a static method for reloading. Not too cool (maybe a small dependency injection via guice might be useful here). This is a big reason I use this approach mainly for prototyping right now (maintaining persistence layers this way is not too cool).
  • No exception handling in the jrebel plugin (sorry, only a quick hack here). Also, loading the whole application over and over might lead to problems if you start up some persistence layer; you might need to hack around that.
  • Usually you do not need SparkGroovy with non-static methods (as Spark works completely with static methods). However, my groovy was not good enough to make the closure delegation work in the scope of a static get or post method. If you can help me, I will sponsor beer :-)


Of course all the sources are on github (spark-groovy and jrebel-spark-plugin). Do whatever you want with them. I would be more than glad if you contributed something back. Write a blogpost, correct my horrible groovy.
