Introduction to Spark Java
last modified July 6, 2020
This is an introductory tutorial for the Spark Java web framework. We introduce the framework and provide several code examples.
Spark Java
Spark is a micro framework for creating web applications in Java 8 with minimal effort. It is a simple and lightweight framework built for rapid development, inspired by Sinatra, a popular Ruby micro framework.
Spark uses Java 8's lambda expressions extensively, which makes Spark applications a lot less verbose. In contrast to many other Java web frameworks, Spark does not rely heavily on XML files or annotations.
Routes
A Spark application contains a set of routes. A route maps URL patterns to Java handlers.
A route has three parts:
- a verb, including get, post, put, delete, head, trace, connect, and options
- a path such as /first or /hello/:name
- a callback (request, response) -> { }
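For instance, a minimal route combining these three parts might look like this (the /status path and the Status class are made up for illustration):

import static spark.Spark.get;

public class Status {

    public static void main(String[] args) {

        // verb: get, path: /status, callback: returns the response body
        get("/status", (request, response) -> "OK");
    }
}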
First application
The first application returns a simple message. Gradle is used to build the application.
$ tree
.
├── build.gradle
└── src
    └── main
        └── java
            └── com
                └── zetcode
                    └── firstspark
                        └── FirstSpark.java
This is the project structure. Gradle's Java plugin expects the Java production code to be located in the src/main/java directory.
apply plugin: 'java'
apply plugin: 'application'

archivesBaseName = "first"
version = '1.0'
mainClassName = "com.zetcode.firstspark.FirstSpark"

repositories {
    mavenCentral()
}

dependencies {
    compile 'com.sparkjava:spark-core:2.5'
    compile 'org.slf4j:slf4j-simple:1.7.6'
}
This is the Gradle build file. It includes dependencies for Spark core components and the slf4j simple logger.
package com.zetcode.firstspark;

import static spark.Spark.get;

public class FirstSpark {

    public static void main(String[] args) {

        get("/first", (req, res) -> "First Spark application");
    }
}
The application returns the "First Spark application" message to a GET request. When we run the application, Spark starts an embedded Jetty web server.
get("/first", (req, res) -> "First Spark application");
The get() method maps a route for HTTP GET requests.
In Spark lingo, a route is a URL pattern that is mapped to a handler. A handler can be a physical file or a callback that computes the response.
$ gradle build
We build the application with the gradle build command.
$ gradle run
We run the application with the gradle run command.
An embedded Jetty server is started.
$ curl localhost:4567/first
First Spark application
We send a GET request to the server with the curl tool.
The built-in embedded Jetty server listens on port 4567 by default.
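If another application already uses port 4567, the port can be changed with the port() method, which must be called before the first route is mapped. A minimal sketch (the CustomPort class is made up for illustration):

import static spark.Spark.get;
import static spark.Spark.port;

public class CustomPort {

    public static void main(String[] args) {

        // must be called before the first route is mapped
        port(8080);

        get("/first", (req, res) -> "First Spark application");
    }
}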
Hello application
The second application responds with a greeting to the user: the client sends a name in the URL, and the application uses it in the reply.
apply plugin: 'java'
apply plugin: 'application'

archivesBaseName = "hello"
version = '1.0'
mainClassName = "com.zetcode.hellospark.HelloSpark"

repositories {
    mavenCentral()
}

dependencies {
    compile 'com.sparkjava:spark-core:2.5'
    compile 'org.slf4j:slf4j-simple:1.7.6'
}
This is the Gradle build file of the application.
$ tree
.
├── build.gradle
└── src
    └── main
        └── java
            └── com
                └── zetcode
                    └── hellospark
                        └── HelloSpark.java

6 directories, 2 files
This is the project structure.
package com.zetcode.hellospark;

import static spark.Spark.get;

public class HelloSpark {

    public static void main(String[] args) {

        get("/hello/:name", (req, res) -> "Hello " + req.params(":name"));
    }
}
The Spark application retrieves the request parameter, builds a message, and returns it to the caller.
get("/hello/:name", (req, res) -> "Hello " + req.params(":name"));
The params() method returns the value of the provided route pattern parameter.
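Named route parameters can be combined with query string parameters, which are read with the queryParams() method. A sketch, assuming a made-up /greet path:

import static spark.Spark.get;

public class GreetQuery {

    public static void main(String[] args) {

        // e.g. curl "localhost:4567/greet/Peter?from=Jane"
        get("/greet/:name", (req, res) -> {

            String from = req.queryParams("from"); // null when the parameter is absent
            String suffix = (from == null) ? "" : " from " + from;

            return "Hello " + req.params(":name") + suffix;
        });
    }
}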
$ gradle build run
We build and run the application.
$ curl localhost:4567/hello/Peter
Hello Peter
We send a request to the server; the URL includes a name. The application sends back a greeting.
Running Spark application in Tomcat
By default, Spark applications run in an embedded Jetty server. In this example, we show how to run a Spark Java application in Tomcat. This time we use the Maven build tool and create the project in NetBeans.

The figure shows what the project looks like in NetBeans.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.zetcode</groupId>
    <artifactId>HelloSpark2</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>war</packaging>

    <name>HelloSpark2</name>

    <properties>
        <endorsed.dir>${project.build.directory}/endorsed</endorsed.dir>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.5</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.7.21</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <compilerArguments>
                        <endorseddirs>${endorsed.dir}</endorseddirs>
                    </compilerArguments>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <version>2.3</version>
                <configuration>
                    <failOnMissingWebXml>false</failOnMissingWebXml>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
This is the Maven build file.
<?xml version="1.0" encoding="UTF-8"?>
<Context path="/HelloSpark2"/>
This is the context.xml file.
<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="http://xmlns.jcp.org/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd"
         version="3.1">
    <welcome-file-list>
        <welcome-file>index.html</welcome-file>
    </welcome-file-list>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.zetcode.hellospark2.HelloSpark</param-value>
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>
In the web.xml deployment descriptor, we specify the spark.servlet.SparkFilter.
package com.zetcode.hellospark2;

import static spark.Spark.get;
import spark.servlet.SparkApplication;

public class HelloSpark implements SparkApplication {

    @Override
    public void init() {

        get("/hello/:name", (request, response) -> "Hello " + request.params(":name"));
    }
}
We implement the SparkApplication interface and specify the route in the init() method.
Finally, we run the Tomcat web server.
$ curl localhost:8084/HelloSpark2/hello/Peter
Hello Peter
NetBeans' built-in Tomcat listens on port 8084.
Template engines
Spark does not have its own templating system; it uses third-party engines. In the following two examples, we use Thymeleaf and FreeMarker.
Using Thymeleaf
In the following example, we are going to integrate the Thymeleaf template engine into our Spark application. Thymeleaf is a modern server-side Java template engine for both web and standalone environments.
$ tree
.
├── build.gradle
└── src
    └── main
        ├── java
        │   └── com
        │       └── zetcode
        │           └── thymeleaf
        │               └── SparkThymeleaf.java
        └── resources
            └── templates
                └── hello.html
This is the directory structure of the project. The template files are located in the src/main/resources/templates directory.
apply plugin: 'java'
apply plugin: 'application'

archivesBaseName = "sparkthymeleaf"
version = '1.0'
mainClassName = "com.zetcode.thymeleaf.SparkThymeleaf"

repositories {
    mavenCentral()
}

dependencies {
    compile 'com.sparkjava:spark-core:2.5'
    compile 'org.slf4j:slf4j-simple:1.7.6'
    compile 'com.sparkjava:spark-template-thymeleaf:2.3'
}
Here we have the Gradle build file, which includes the spark-template-thymeleaf dependency.
package com.zetcode.thymeleaf;

import java.util.HashMap;
import java.util.Map;

import spark.ModelAndView;
import spark.Request;
import spark.Response;
import spark.template.thymeleaf.ThymeleafTemplateEngine;

import static spark.Spark.get;

public class SparkThymeleaf {

    public static void main(String[] args) {

        get("/hello/:name", SparkThymeleaf::message, new ThymeleafTemplateEngine());
    }

    public static ModelAndView message(Request req, Response res) {

        Map<String, Object> params = new HashMap<>();
        params.put("name", req.params(":name"));

        return new ModelAndView(params, "hello");
    }
}
The application reads the request parameter and puts it into the ModelAndView object.
get("/hello/:name", SparkThymeleaf::message, new ThymeleafTemplateEngine());
An instance of the ThymeleafTemplateEngine is passed to the get() method.
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml"
      xmlns:th="http://www.thymeleaf.org">
<head>
    <meta charset="UTF-8"></meta>
    <title>Hello user</title>
</head>
<body>
    <p th:inline="text">Hello, [[${name}]]!</p>
</body>
</html>
This is the hello.html template file. It refers to the name variable, which was passed with the ModelAndView object.
$ curl localhost:4567/hello/Peter
<!DOCTYPE html>
<html lang="en" xmlns="http://www.w3.org/1999/xhtml">
<head>
    <meta charset="UTF-8" />
    <title>Hello user</title>
</head>
<body>
    <p>Hello, Peter!</p>
</body>
</html>
We get this output.
FreeMarker
In the following example, we are going to integrate the FreeMarker template engine into our Spark application. FreeMarker is a well-established Java template engine.
$ tree
.
├── build.gradle
└── src
    └── main
        ├── java
        │   └── com
        │       └── zetcode
        │           └── SparkFreeMarker.java
        └── resources
            └── views
                └── hello.ftl
This is the directory structure of the project. The template file is located in the src/main/resources/views directory.
apply plugin: 'application'

sourceCompatibility = '1.8'
version = '1.0'
mainClassName = "com.zetcode.SparkFreeMarker"

repositories {
    mavenCentral()
}

dependencies {
    compile 'com.sparkjava:spark-core:2.5.5'
    compile 'org.slf4j:slf4j-simple:1.7.24'
    compile 'com.sparkjava:spark-template-freemarker:2.5.5'
}
Here we have the Gradle build file, which includes the spark-template-freemarker dependency.
package com.zetcode;

import freemarker.template.Configuration;
import freemarker.template.Version;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import spark.ModelAndView;
import spark.Request;
import spark.Response;
import spark.template.freemarker.FreeMarkerEngine;

import static spark.Spark.get;

public class SparkFreeMarker {

    public static void main(String[] args) throws IOException {

        Configuration conf = new Configuration(new Version(2, 3, 23));
        conf.setClassForTemplateLoading(SparkFreeMarker.class, "/views");

        get("/hello/:name", SparkFreeMarker::message, new FreeMarkerEngine(conf));
    }

    public static ModelAndView message(Request req, Response res) {

        Map<String, Object> params = new HashMap<>();
        params.put("name", req.params(":name"));

        return new ModelAndView(params, "hello.ftl");
    }
}
We set up the same application for FreeMarker.
Configuration conf = new Configuration(new Version(2, 3, 23));
conf.setClassForTemplateLoading(SparkFreeMarker.class, "/views");
We configure FreeMarker with the Configuration class. The template files are going to be placed in the views directory, which must be located on the classpath.
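FreeMarker can also load templates from the file system rather than the classpath; the following sketch assumes the templates sit in a src/main/resources/views directory relative to the working directory:

import freemarker.template.Configuration;
import freemarker.template.Version;
import java.io.File;
import java.io.IOException;

public class FileSystemTemplates {

    public static void main(String[] args) throws IOException {

        Configuration conf = new Configuration(new Version(2, 3, 23));

        // load templates from a directory instead of the classpath
        conf.setDirectoryForTemplateLoading(new File("src/main/resources/views"));
    }
}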
get("/hello/:name", SparkFreeMarker::message, new FreeMarkerEngine(conf));
The FreeMarkerEngine is passed to the get() method.
<!DOCTYPE html>
<html>
    <head>
        <title>Home page</title>
        <meta charset="UTF-8">
        <meta name="viewport" content="width=device-width, initial-scale=1.0">
    </head>
    <body>
        <p>Hello ${name}</p>
    </body>
</html>
This is the hello.ftl template file; it refers to the name variable, which was passed with the ModelAndView object.
$ curl localhost:4567/hello/Lucy
<!DOCTYPE html>
<html>
    <head>
        <title>Home page</title>
        <meta charset="UTF-8">
        <meta name="viewport" content="width=device-width, initial-scale=1.0">
    </head>
    <body>
        <p>Hello Lucy</p>
    </body>
</html>
This is the output.
In this tutorial, we have introduced the Spark Java framework.