Alexander Reelsen

Backend developer, productivity fan, likes distributed systems & the new serverless era

Using Testcontainers To Test Elasticsearch Plugins
Aug 25, 2021
11 minute read

TLDR; This post shows how to use Testcontainers in your Elasticsearch plugin project to test the plugin's functionality via an integration test that resembles how the plugin runs in production.

This blog post starts with an empty project, creates test and production code, and takes care of packaging before Testcontainers is used for integration tests, ensuring that our plugin is ready for release.

But, first things first…

What is Testcontainers?

Testcontainers is a helper library to spin up Docker containers during your integration tests. While that does not sound too awesome in itself, the neat integration into Java and JUnit makes using Testcontainers a breeze and very enjoyable. Spinning up a whole development environment during testing can now be done easily. There are several modules supporting databases like MySQL, Postgres or Clickhouse (and more), but also other services like Kafka, Nginx, Vault, Webdriver as well as Elasticsearch. You can also start services based on a docker-compose file. There is also support for several test frameworks like JUnit 4/5 or Spock, or for spinning containers up & down manually.

And while the original library was written in Java, there are integration libraries for Node.js, Rust, Python, .NET or Go; see the Testcontainers GitHub page.

Also, the creators of the project recently founded a commercial entity named AtomicJar around Testcontainers, so I expect to see commercial support and products for it very soon.

Equipped with this knowledge, let’s go ahead and do the boring work first, starting with nothing and creating an Elasticsearch plugin step by step.

Note: If you do not write your own plugins, but just want to spin up Elasticsearch in your tests, take a look at the Elasticsearch Testcontainers Module.

Initializing the maven project

I am using mvn version 3.8.2 at the time of this writing. Elasticsearch features a build-tools dependency for Gradle that simplifies plugin development. However, parts of it are only for internal Elasticsearch consumption, it can break during minor or patch releases, and it does not offer any compatibility guarantees, so we'll stick with mvn and Testcontainers and do not need any further dependencies.

Create a new project structure:

# setup directory structure
mkdir elasticsearch-plugin-testcontainers
cd elasticsearch-plugin-testcontainers
mkdir -p src/test/java/de/spinscale/elasticsearch
mkdir -p src/main/java/de/spinscale/elasticsearch

# install the mvn wrapper
mvn -N io.takari:maven:wrapper -Dmaven=3.8.2

# git FTW
echo target > .gitignore
git init

Next up is a bare-bones pom.xml file, including the required dependencies - there are no runtime dependencies, so every dependency has a non-default scope:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>de.spinscale.elasticsearch</groupId>
  <artifactId>elasticsearch-plugin-testcontainers</artifactId>
  <version>1.0-SNAPSHOT</version>
  <name>elasticsearch-plugin-testcontainers</name>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>16</maven.compiler.source>
    <maven.compiler.target>16</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <groupId>org.elasticsearch</groupId>
      <artifactId>elasticsearch</artifactId>
      <version>7.14.0</version>
      <scope>provided</scope>
    </dependency>

    <dependency>
      <groupId>org.testcontainers</groupId>
      <artifactId>elasticsearch</artifactId>
      <version>1.16.0</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-core</artifactId>
      <version>3.11.2</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter</artifactId>
      <version>5.7.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.2</version>
      </plugin>
    </plugins>
  </build>
</project>

With this in place you can now run ./mvnw clean package, even though no code has been written yet. The surefire plugin is needed for running tests.

Writing the plugin

Create a custom REST endpoint that returns a static string in order to keep the code simple. Create src/main/java/de/spinscale/elasticsearch/MyRestHandler.java with the following code:

package de.spinscale.elasticsearch;

import org.elasticsearch.client.node.NodeClient;
import org.elasticsearch.rest.BaseRestHandler;
import org.elasticsearch.rest.BytesRestResponse;
import org.elasticsearch.rest.RestRequest;
import org.elasticsearch.rest.RestStatus;

import java.io.IOException;
import java.util.Collections;
import java.util.List;

public class MyRestHandler extends BaseRestHandler {

    private static final BytesRestResponse RESPONSE = 
        new BytesRestResponse(RestStatus.OK, "my-rest-handler");

    @Override
    public String getName() {
        return "my-rest-handler";
    }

    @Override
    public List<Route> routes() {
        return Collections.singletonList(
          new Route(RestRequest.Method.GET, "/my-rest-handler"));
    }

    @Override
    protected RestChannelConsumer prepareRequest(
        RestRequest request, NodeClient client)
          throws IOException {
        
        return channel -> channel.sendResponse(RESPONSE);
    }
}

In order to keep the example lean, this code does not do anything smart. A new endpoint is registered under /my-rest-handler. When you hit that endpoint, it returns my-rest-handler. Peak functionality!

Register this REST handler by creating a plugin class like this in src/main/java/de/spinscale/elasticsearch/MyPlugin.java:

package de.spinscale.elasticsearch;

import org.elasticsearch.cluster.metadata.IndexNameExpressionResolver;
import org.elasticsearch.cluster.node.DiscoveryNodes;
import org.elasticsearch.common.settings.ClusterSettings;
import org.elasticsearch.common.settings.IndexScopedSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.settings.SettingsFilter;
import org.elasticsearch.plugins.ActionPlugin;
import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.rest.RestController;
import org.elasticsearch.rest.RestHandler;

import java.util.Collections;
import java.util.List;
import java.util.function.Supplier;

public class MyPlugin extends Plugin implements ActionPlugin  {

    @Override
    public List<RestHandler> getRestHandlers(Settings settings, 
        RestController restController,
        ClusterSettings clusterSettings,
        IndexScopedSettings indexScopedSettings,
        SettingsFilter settingsFilter,
        IndexNameExpressionResolver indexNameExpressionResolver,
        Supplier<DiscoveryNodes> nodesInCluster) {

        return Collections.singletonList(new MyRestHandler());
    }
}

Every custom plugin has to extend the Plugin class and can then implement different plugin interfaces, depending on the use-case. To add new REST handlers, one needs to implement ActionPlugin.

Unit testing the plugin

Before running integration tests, a unit test makes sense. Add the following Java test in src/test/java/de/spinscale/elasticsearch/MyRestHandlerTest.java:

package de.spinscale.elasticsearch;

import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.xcontent.NamedXContentRegistry;
import org.elasticsearch.http.HttpChannel;
import org.elasticsearch.http.HttpRequest;
import org.elasticsearch.rest.RestChannel;
import org.elasticsearch.rest.RestRequest;
import org.elasticsearch.rest.RestResponse;
import org.elasticsearch.rest.RestStatus;
import org.junit.jupiter.api.Test;
import org.mockito.ArgumentCaptor;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

public class MyRestHandlerTest {

  private final MyRestHandler handler = new MyRestHandler();
  private final HttpRequest request = mock(HttpRequest.class);
  private final HttpChannel channel = mock(HttpChannel.class);
  private final RestChannel restChannel = mock(RestChannel.class);

  @Test
  public void testRestHandlerResponse() throws Exception {
    when(request.uri()).thenReturn("my-rest-handler");
    when(request.content()).thenReturn(BytesArray.EMPTY);
    RestRequest restRequest = 
      RestRequest.request(NamedXContentRegistry.EMPTY, request, channel);

    handler.handleRequest(restRequest, restChannel, null);

    ArgumentCaptor<RestResponse> captor =
      ArgumentCaptor.forClass(RestResponse.class);
    verify(restChannel).sendResponse(captor.capture());

    final RestResponse response = captor.getValue();
    assertEquals(RestStatus.OK, response.status());
    assertEquals("my-rest-handler", response.content().utf8ToString());
  }
}

The test checks for the correct HTTP response status and body. However, there is a lot of mocking involved - which is okay for a unit test, but might not reflect how the code runs when nothing is mocked. You can run this test via ./mvnw test.

Time to package the plugin properly, so that it is testable within our project.

Creating a plugin descriptor file

Elasticsearch plugins are just zip files in a certain format. Take a quick look at the analysis-kuromoji plugin, which can be downloaded via a link in the documentation.

After downloading the zip file, the structure looks like this:

> unzip -l analysis-kuromoji-7.14.0.zip

Archive:  analysis-kuromoji-7.14.0.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
     1776  07-29-2021 20:52   plugin-descriptor.properties
    25580  07-29-2021 20:52   analysis-kuromoji-7.14.0.jar
  4654579  07-01-2021 16:01   lucene-analyzers-kuromoji-8.9.0.jar
    34741  07-29-2021 20:52   NOTICE.txt
    34447  07-29-2021 20:47   LICENSE.txt
---------                     -------
  4751123                     5 files

From the plugin's perspective the .txt files are optional to make it work, but they are required for legal reasons. The jar files are the compiled code for this plugin plus its required dependencies.

The least known, but most interesting part is the plugin-descriptor.properties file, which must be part of every plugin. If you remove the comments and empty lines, the file is rather short:

type=isolated
description=The Japanese (kuromoji) Analysis plugin ...
version=7.14.0
name=analysis-kuromoji
classname=org.elasticsearch.plugin.analysis.kuromoji.AnalysisKuromojiPlugin
java.version=1.8
elasticsearch.version=7.14.0
extended.plugins=
has.native.controller=false
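
Since the descriptor is a plain Java properties file, the presence of the required keys can also be checked programmatically. A minimal sketch (the key list mirrors the descriptor above; the `DescriptorCheck` helper is hypothetical and not part of Elasticsearch):

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.List;
import java.util.Properties;
import java.util.stream.Collectors;

public class DescriptorCheck {

    // keys that every plugin-descriptor.properties must contain
    static final List<String> REQUIRED_KEYS = List.of(
        "description", "version", "name", "classname",
        "java.version", "elasticsearch.version");

    // return the required keys that are missing from a descriptor
    static List<String> missingKeys(String descriptor) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(descriptor));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return REQUIRED_KEYS.stream()
            .filter(key -> props.getProperty(key) == null)
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        String descriptor = String.join("\n",
            "type=isolated",
            "name=analysis-kuromoji",
            "version=7.14.0",
            "elasticsearch.version=7.14.0",
            "classname=org.elasticsearch.plugin.analysis.kuromoji.AnalysisKuromojiPlugin",
            "description=The Japanese (kuromoji) Analysis plugin",
            "java.version=1.8");
        System.out.println(missingKeys(descriptor)); // prints: []
    }
}
```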

The next step is to create a very similar file for our plugin. The elasticsearch.version property needs to be dynamic, as does the version property. Create a src/main/resources/plugin-descriptor.properties file that we can reuse later:

type=isolated
name=${project.name}
version=${version}
elasticsearch.version=${elasticsearchVersion}
classname=de.spinscale.elasticsearch.MyPlugin
description=Description of My Plugin
java.version=16
extended.plugins=
has.native.controller=false

So, how to replace those placeholders? Luckily, Maven has a nice filtering ability for resources. First, add an elasticsearchVersion property to our list of properties in the pom.xml:

<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <maven.compiler.source>16</maven.compiler.source>
  <maven.compiler.target>16</maven.compiler.target>
  <elasticsearchVersion>7.14.0</elasticsearchVersion>
</properties>

Also, one can now use the elasticsearchVersion property in our dependency:

<dependency>
  <groupId>org.elasticsearch</groupId>
  <artifactId>elasticsearch</artifactId>
  <version>${elasticsearchVersion}</version>
  <scope>provided</scope>
</dependency>

Next up, set up resource filtering:

<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.22.2</version>
    </plugin>
  </plugins>
</build>

Now run ./mvnw resources:resources. Afterwards, check the file at target/classes/plugin-descriptor.properties and you will see that the version and Elasticsearch version properties have been replaced, so this file is ready to be packaged. That is the next step.
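
Conceptually, Maven's resource filtering is simple placeholder substitution. The following sketch only illustrates that idea (it is not Maven's actual implementation):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FilteringSketch {

    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)}");

    // replace ${key} occurrences with values from the properties map,
    // leaving unknown placeholders untouched (as Maven does)
    static String filter(String template, Map<String, String> props) {
        Matcher matcher = PLACEHOLDER.matcher(template);
        StringBuilder sb = new StringBuilder();
        while (matcher.find()) {
            String value = props.get(matcher.group(1));
            matcher.appendReplacement(sb,
                Matcher.quoteReplacement(value != null ? value : matcher.group()));
        }
        matcher.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = Map.of(
            "project.name", "elasticsearch-plugin-testcontainers",
            "version", "1.0-SNAPSHOT",
            "elasticsearchVersion", "7.14.0");
        System.out.println(filter("version=${version}", props));
        // prints: version=1.0-SNAPSHOT
    }
}
```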

Packaging the plugin

As already mentioned above, packaging means creating a zip file that contains all the jars plus the processed plugin descriptor file. Time for some Maven magic: this example uses the assembly plugin to achieve this, even though there are also other possibilities to create custom packaging types within Maven.

First, add the mvn assembly plugin to our list of plugins in our pom.xml file:

...
    <plugins>
      <plugin>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.2</version>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.3.0</version>
        <configuration>
          <appendAssemblyId>false</appendAssemblyId>
          <finalName>${project.name}</finalName>
          <descriptors>
            <descriptor>src/main/assembly/dist.xml</descriptor>
          </descriptors>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

This plugin refers to a dist.xml file that defines the list of files and the formats of the assembly. Ours should look like this:

<assembly>
  <id>dist</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${project.build.directory}</directory>
      <includes>
        <include>elasticsearch-plugin-testcontainers-${project.version}.jar</include>
      </includes>
      <outputDirectory>/</outputDirectory>
    </fileSet>
    <fileSet>
      <directory>${project.build.directory}/classes</directory>
      <includes>
        <include>plugin-descriptor.properties</include>
      </includes>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

This assembly descriptor lists two files to be included: the compiled jar file and the dynamically created properties file. After adding this, you can run ./mvnw clean package to create the zip file. Since the assembly's finalName is set to ${project.name}, this creates a target/elasticsearch-plugin-testcontainers.zip file, which you can take a look at:

> unzip -l target/elasticsearch-plugin-testcontainers.zip

Archive:  target/elasticsearch-plugin-testcontainers.zip
  Length      Date    Time    Name
---------  ---------- -----   ----
     4932  08-24-2021 13:55   elasticsearch-plugin-testcontainers-1.0-SNAPSHOT.jar
      224  08-24-2021 13:55   plugin-descriptor.properties
---------                     -------
     5156                     2 files

So, the format of the file looks correct. We are ready to go ahead and test the plugin by installing it into an Elasticsearch distribution. But this should really be part of the testing and not require manual intervention.
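
If you would rather assert the archive layout from code than eyeball `unzip -l`, the JDK's zip classes are sufficient. A sketch (the in-memory archive just mimics the two entries shown above; `ZipLayoutSketch` is a hypothetical helper):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipLayoutSketch {

    // list the entry names of a zip archive, similar to `unzip -l`
    static List<String> entryNames(byte[] zipBytes) {
        List<String> names = new ArrayList<>();
        try (ZipInputStream in = new ZipInputStream(new ByteArrayInputStream(zipBytes))) {
            for (ZipEntry entry; (entry = in.getNextEntry()) != null; ) {
                names.add(entry.getName());
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return names;
    }

    public static void main(String[] args) throws IOException {
        // build a tiny in-memory zip resembling the plugin archive
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ZipOutputStream out = new ZipOutputStream(bytes)) {
            out.putNextEntry(new ZipEntry("elasticsearch-plugin-testcontainers-1.0-SNAPSHOT.jar"));
            out.putNextEntry(new ZipEntry("plugin-descriptor.properties"));
        }
        System.out.println(entryNames(bytes.toByteArray()));
        // prints: [elasticsearch-plugin-testcontainers-1.0-SNAPSHOT.jar, plugin-descriptor.properties]
    }
}
```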

Note: This plugin does not require dependencies to be included in the build, but this is also possible with some more configuration in the assembly XML file.

Running tests against the packaged plugin

In order to make the correct changes to run integration tests, some understanding of Maven life cycles is required. To run Testcontainers-enabled tests, the final packaged plugin needs to be created first. Therefore those tests cannot run as part of the test life cycle, which runs before packaging. Maven has a special integration-test phase that happens after the package phase, which means the artifacts are available when those tests run.

The first step is to set up integration tests running at that stage. You might have guessed it, this is done with a plugin, namely the maven-failsafe-plugin (don't ask me why it is named like that, I'm sure there is a good reason):

      <!-- run integration tests -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-failsafe-plugin</artifactId>
        <version>3.0.0-M5</version>
        <configuration>
          <includes>
            <include>**/*IT.java</include>
          </includes>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>integration-test</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

This configures all test files ending with *IT.java to run during the integration-test phase. Now, finally, add the Testcontainers test:

package de.spinscale.elasticsearch;

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.images.builder.ImageFromDockerfile;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse.BodyHandlers;
import java.net.http.HttpResponse;
import java.nio.file.Path;
import java.nio.file.Paths;

import static org.junit.jupiter.api.Assertions.assertEquals;

public class MyRestHandlerIT {

  @Test
  public void testPluginEndpoint() throws Exception {
    final Path dockerFilePath =
      Paths.get(System.getenv("PWD"), "Dockerfile");
    final ImageFromDockerfile image =
      new ImageFromDockerfile().withDockerfile(dockerFilePath);

    try (GenericContainer<?> container = new GenericContainer<>(image)) {
      container.addEnv("discovery.type", "single-node");
      container.addExposedPorts(9200);
      container.start();

      String endpoint = String.format(
        "http://localhost:%s/my-rest-handler", 
        container.getMappedPort(9200)
      );

      HttpRequest request =  HttpRequest.newBuilder()
        .GET()
        .uri(URI.create(endpoint))
        .build();

      HttpClient httpClient = HttpClient.newHttpClient();
      HttpResponse<String> response =
        httpClient.send(request, BodyHandlers.ofString());

      assertEquals(200, response.statusCode());
      assertEquals("my-rest-handler", response.body());
    }
  }
}

You cannot run this test yet because of a design decision. While I could have created the Dockerfile programmatically, I think it makes sense to have it in the file system, so you can run everything outside of a Java test as well via docker build. The Dockerfile however is refreshingly short; all it does is install the plugin on top of the Elasticsearch base image:

FROM docker.elastic.co/elasticsearch/elasticsearch:7.14.0

ADD target/elasticsearch-plugin-testcontainers.zip /elasticsearch-plugin-testcontainers.zip

RUN /usr/share/elasticsearch/bin/elasticsearch-plugin install --batch file:///elasticsearch-plugin-testcontainers.zip

Make sure to use the --batch (or -b) option when installing plugins so the installation does not stall waiting for terminal input, depending on what permissions your plugin requires - this one does not need any.
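
One caveat worth hedging against: an open port 9200 does not guarantee that Elasticsearch has fully finished booting, so the HTTP call in the integration test can in theory race the startup. Testcontainers ships wait strategies for exactly this situation, but a small generic retry helper (hypothetical, not part of the example repository) illustrates the idea:

```java
import java.time.Duration;
import java.util.function.BooleanSupplier;

public class RetrySketch {

    // poll a condition until it holds or the attempts run out
    static boolean waitUntil(BooleanSupplier condition, int attempts, Duration delay) {
        for (int i = 0; i < attempts; i++) {
            if (condition.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(delay.toMillis());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // simulate a service that only becomes ready on the third poll
        int[] polls = {0};
        boolean ready = waitUntil(() -> ++polls[0] >= 3, 5, Duration.ofMillis(10));
        System.out.println(ready); // prints: true
    }
}
```

In the integration test above, the condition would be an HTTP GET against the mapped port that returns true on a 200 response.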

Summary

Aaaaand that's it for today, folks! We have written an Elasticsearch plugin in a Maven project from scratch, added a unit test, and then got packaging up and running in order to run integration tests, which install the plugin the same way it is done in production and then run tests against the Elasticsearch node, thanks to Testcontainers.

Testcontainers is a great framework that makes integration tests not only much easier, but also more reproducible across different environments. I highly encourage you to take a closer look if you have not yet.

Of course there are a few things that could be improved. Right now, you need to manually upgrade the Elasticsearch version in the Dockerfile. However, your integration tests will fail if you forget that, as your plugin is only supposed to work with exactly one Elasticsearch version.

Important: You cannot run the same packaged plugin against a different patch version of Elasticsearch; you need to create a new plugin version for every Elasticsearch release. At some point there might be more stable APIs, but that is future talk for now. This also means you need to pick which Elasticsearch versions you would like to support with your plugin, plus a solid branching strategy to work on releases. The example here only works for a single version, as it depends on the Elasticsearch version in the pom.xml file.
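
To make that exact-match requirement concrete: the install step compares the descriptor's elasticsearch.version against the node version and refuses anything else, even a different patch release. A trivial sketch of that rule (not the installer's actual code):

```java
public class VersionCheckSketch {

    // Elasticsearch refuses to install a plugin unless the descriptor's
    // elasticsearch.version matches the node version exactly
    static boolean isInstallable(String pluginEsVersion, String nodeVersion) {
        return pluginEsVersion.equals(nodeVersion);
    }

    public static void main(String[] args) {
        System.out.println(isInstallable("7.14.0", "7.14.0")); // prints: true
        System.out.println(isInstallable("7.14.0", "7.14.1")); // prints: false
    }
}
```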

One more important point, especially from a security perspective: security is not handled in REST endpoints, but in transport actions. This means that anyone can connect to the endpoint implemented in the example plugin. The correct way would be to also implement a transport action that executes the actual logic, with the REST endpoint just handing things over to it. This way security is taken care of automatically, on top of many more things like forwarding the request to the correct nodes (i.e. if it needs to be executed on the master node or on all nodes).

If you want to take a look at the code, check out the GitHub repository containing this example.

And that’s it for today. Thanks for reading!

Final remarks

If you made it down here, wooow! Thanks for sticking with me. You can follow or ping me on Twitter or GitHub, or reach me via email (just to tell me you read this whole thing :-).

If there is anything to correct, drop me a note, and I am happy to do so and append to this post!

Same applies for questions. If you have a question, go ahead and ask!

If you want me to speak about this, drop me an email!

