2016-06-12

Spring Boot + PostgreSQL + Custom Type = exception!

I have a demo application that shows the features of various ORM frameworks. The application performs several basic operations on a database. To have a clean database for each run I wanted to recreate the whole database at application start. This means dropping the whole schema and then re-initialising it by executing the following script.

DROP SCHEMA IF EXISTS public CASCADE;
CREATE SCHEMA public;

CREATE TYPE plane_dimensions AS (
  length_meters   DECIMAL,
  wingspan_meters DECIMAL,
  height_meters   DECIMAL
);

CREATE TABLE planes (
  id           BIGSERIAL PRIMARY KEY,
  name         VARCHAR(250)     NOT NULL,
  dimensions   plane_dimensions NOT NULL
);

...

Unfortunately this doesn't work.

When the application connects to the database, it loads the existing types and stores them in a cache. Then my type is dropped and immediately re-created by the script. But the re-created type has a different OID than the former (cached) one. So if you try to use it from the application you will get an exception.

Caused by: org.postgresql.util.PSQLException: ERROR: cache lookup failed for type 1234567

The solution is to re-initialise the connection or not to modify the type.

A query to find the type's OID:

SELECT oid FROM pg_type WHERE typname = 'plane_dimensions';

2016-06-05

Shared test sources in Gradle multi-module project

Basic assumptions

Let's assume the following project structure.

:root-module
|    +--- main source set
|    \--- test source set
\--- :module1
     |    +--- main source set
     |    \--- test source set
     \--- :core-module
              +--- main source set
              \--- test source set

Shared test-utility classes shouldn't be placed in the test source set

In some legacy projects I've seen utility classes located in the test source set. This is problematic in multi-project applications because Gradle does not pull test sources into a dependent module. So module1 won't "see" classes from core-module's test source set. This makes sense because you don't want module1's test classpath flooded with core-module's test classes.

Note: IDEA 14 does not respect this rule and pulls test sources into a dependent module. This behaviour has been fixed recently.

Test-utility classes shouldn't be placed in the main source set

For obvious reasons you don't want to pollute the main source set with test classes. The main source set should contain production code only. Tests are usually executed during the application's build phase, so there is no need to put them into the final build.

In a Spring application test classes shouldn't be located in a component-scanned package

This is mandatory for Spring developers who use component scanning. In some cases you may want to create an annotated class used exclusively by tests.

IDEA places all source sets marked as sources on the classpath. So if you place your test-only annotated class in a package scanned by Spring, your application context will end up polluted.
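
For illustration, here is a hypothetical test-only configuration class; all names in this sketch are made up. If its package is component-scanned, the bean definition becomes part of the production application context too.

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;

// Hypothetical test-only configuration. A component scan over its package
// registers this class in the production application context as well.
@Configuration
public class TestDatabaseConfiguration {

    @Bean
    public DataSource dataSource() {
        // An embedded database meant to be used by tests only.
        return new EmbeddedDatabaseBuilder().build();
    }
}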

Shared test classes should be kept in the module where they belong

Because of the previous point some people have a tendency to put shared test classes into a stand-alone module. Although core-module-test (main) -> core-module (main), core-module (test) -> core-module-test (main) is a legal construct in Gradle, you don't want to complicate your project's dependency structure by creating "unnecessary" modules.

Module dependency resolution basics in Gradle and IDEA 14 and 16

Gradle

Gradle allows you to define a dependency between two modules' configurations.

// Declared in root-module
dependencies {
    // Current module's testCompile configuration depends on another module's default configuration.
    testCompile project(':module1')
    // Current module's configuration depends on another module's test configuration.
    testCompile project(path: ':core-module', configuration: 'test')
}

A dependency on the default configuration of another module pulls the module's dependencies and sources into the dependent module. A dependency on the test configuration pulls only the dependencies, without sources. So in our case root-module won't see any of core-module's test classes.

IDEA 14

In contrast to Gradle, IDEA 14 pulls all sources into a dependent module. Even test sources.

If you open a Gradle project, IDEA will behave oddly. It will let you run tests that Gradle won't even compile. Unfortunately IDEA 14 doesn't have any feature that can be used to fix this, so you have to keep it in mind when writing your tests.

IDEA 16

The latest version of IDEA does not pull test sources into a dependent module, which is consistent with Gradle's behaviour. It also allows you to define a dependency on a specific source set. You just have to choose the create separate module per source set option when importing the Gradle project. This option has been added recently to harmonize IDEA's dependency resolution with Gradle.

Using shared test sources in a dependent module

Declaring a dependency on a module's output directory

In projects keeping utility test classes in the test source set, the usual hot-fix is to add a testCompile dependency on the other module's output directory. So in our example module1's tests will depend on core-module's compiled test classes.

// Declared in module1
dependencies {
  testCompile project(':core-module').sourceSets.test.output
}

This is a quick solution but it has some cons:

  • In your IDE you have to rebuild the shared classes on every change to refresh dependencies. Remember that IDEA 16 does not pull test sources.
  • It's against the requirements in the first chapter of this article.

Placing utility classes in a custom source set

A better solution is to create a custom source set for shared test classes and declare dependencies on this source set from dependent modules. This approach lets you keep classes in the modules where they belong, still separated from the main and test sources. It is used by the Gradle project itself.

Implementation is quite straightforward. First create a script plugin that adds a new testFixtures source set and the appropriate configurations to a module.
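
A minimal sketch of such a script plugin; the testFixturesUsageCompile configuration name matches the dependency declarations below, the rest is an assumption about what the real plugin may look like.

// testFixtures.gradle
sourceSets {
    // Creates the testFixtures source set; Gradle derives the
    // testFixturesCompile and testFixturesRuntime configurations from it.
    testFixtures
}

configurations {
    // Fixtures may use the module's compile dependencies.
    testFixturesCompile.extendsFrom compile
    // Configuration consumed by dependent modules.
    testFixturesUsageCompile.extendsFrom testFixturesRuntime
}

dependencies {
    // Fixtures may use the module's production classes.
    testFixturesCompile sourceSets.main.output
    // Expose the compiled fixture classes to dependent modules.
    testFixturesUsageCompile sourceSets.testFixtures.output
    // The module's own tests can use the fixtures as well.
    testCompile sourceSets.testFixtures.output
}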

Then apply the script plugin on a module and declare the necessary dependencies.

// Declared in core-module
apply from: 'testFixtures.gradle'

dependencies {
  testFixturesCompile('junit:junit:4.12')
}

Finally add a dependency on the newly created source set.

// Declared in module1
dependencies {
    testCompile project(path: ':core-module', configuration: 'testFixturesUsageCompile')
}

Note for IDEA users: There is a small drawback. If you open the project without the create separate module per source set option, your custom source set will be available in the main source set. If you use a shared test class in the main source set, IDEA won't complain about it but Gradle won't be able to compile it. So you have to be careful about what you are using.

2016-06-01

Zero downtime deployment with Nginx proxy

At Factorify we wanted to make deployment of our application unnoticed by users. The application is an AngularJS client connected to a Spring backend. Users' requests are proxied by Nginx running on FreeBSD.

The basic idea was to hold users' requests during application deployment. Nginx does not provide such a function by default, but it can be extended with scripts written in the Lua language.

I wasn't able to compile Nginx with Lua support. Fortunately there is an Nginx "distribution" called OpenResty which integrates a Lua compiler.

Nginx configuration

  1. Nginx listens on port 8080.
  2. A request is sent to the primary node. If the primary node is alive, the response is served to the user.
  3. If the primary node fails to return a response, the request is sent to the backup node.
  4. The backup node suspends the request for 10 seconds and then forwards it to the backend. 10 seconds should be sufficient time for the application to restart. If no response is returned by the backend within 10 seconds, an error is sent to the user.

http {
    upstream backend {
        # 2) Primary node.
        server localhost:8667;
        # 3) Backup node.
        server localhost:8666 backup;
    }

    server {
        # 1) Endpoint exposed to users.
        listen 8080;

        location / {
            proxy_pass http://backend;
        }
    }

    server {
        listen 8666;

        location / {
            # 4) Suspend the request for 10 seconds.
            access_by_lua '
                ngx.sleep(10)
            ';
            proxy_pass http://localhost:8080/;
        }
    }

    server {
        # This is primary node that emulates backend application.
        listen 8667;

        location / {
            default_type text/html;
            content_by_lua '
                ngx.say("Hello World!")
            ';
        }
    }
}

To fine-tune the configuration please refer to the Nginx documentation.

2016-05-15

Spring Boot internationalisation with database stored messages and IBM ICU

Motivation

In the Java world the common approach is to store localisation messages in property files. I want to have messages stored in a database so users can manage them at runtime. I also want the better plural forms handling provided by the ICU project.

Introduction to Spring's localisation process

The following figure shows localisation processing in a Spring application.

  1. The requested locale (language and country codes) is passed to the Spring application as part of the ServletRequest object. Several properties of the request can hold the locale value: for example the Accept-Language HTTP header, a cookie, or a query string parameter. The locale is resolved from the request object in the DispatcherServlet.render method by an instance of LocaleResolver. There are several resolvers available in Spring. You can check out the org.springframework.web.servlet.i18n package for more details.
  2. The locale value is sent by the user's browser by default (as the Accept-Language header). To allow your application to change the locale you have to configure a LocaleChangeInterceptor. The interceptor reads the locale value from the query string and sets it on the request. Please read my older article Request mapping on demand in Spring MVC if you want to know more about request processing.
  3. The message code and the resolved locale are passed to a MessageSource via a view. The message source is responsible for loading the message from storage, processing it and returning the localised message back to the view. The view incorporates the processed message into the template.
  4. A message can be simple text or a pattern consisting of placeholders that are replaced while the pattern is processed. Placeholders are used to render text in a locale-sensitive way. Patterns are processed by java.text.MessageFormat by default. In this article I will also describe an enhanced version of the message format (com.ibm.icu.text.MessageFormat) provided by ICU.

LocaleChangeInterceptor and LocaleResolver configuration

AcceptHeaderLocaleResolver is Spring Boot's default locale resolver. The resolver reads the locale from the Accept-Language header. Unfortunately it does not support locale changes at runtime because its setLocale method is not implemented.

No locale change interceptor is configured by default.

The locale resolver can be changed by creating a LocaleResolver bean. I've decided to use CookieLocaleResolver, which resolves the locale stored in a cookie.

A locale change interceptor can be added by extending WebMvcConfigurerAdapter. The following code snippet shows the configuration of a locale change interceptor that reads the new language code from the query string. By adding a lang parameter to the query string the user can change the requested locale.
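
A minimal sketch of both beans, assuming Spring Boot 1.3-era APIs; the LocaleConfiguration class name is mine.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.LocaleResolver;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;
import org.springframework.web.servlet.i18n.CookieLocaleResolver;
import org.springframework.web.servlet.i18n.LocaleChangeInterceptor;

@Configuration
public class LocaleConfiguration extends WebMvcConfigurerAdapter {

    @Bean
    public LocaleResolver localeResolver() {
        // Stores the resolved locale in a cookie so it survives between requests.
        return new CookieLocaleResolver();
    }

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        // Switches the locale when a "lang" query string parameter is present,
        // e.g. /products?lang=cs
        LocaleChangeInterceptor interceptor = new LocaleChangeInterceptor();
        interceptor.setParamName("lang");
        registry.addInterceptor(interceptor);
    }
}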

Database-aware MessageSource

To create a message source you just need to implement the MessageSource interface.
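
A minimal sketch of such an implementation; the DatabaseMessageSource name matches the resolveMessage method revisited later in this article, and the actual database lookup is left as a TODO.

import java.util.Locale;
import org.springframework.context.MessageSource;
import org.springframework.context.MessageSourceResolvable;
import org.springframework.context.NoSuchMessageException;
import org.springframework.stereotype.Component;

@Component
public class DatabaseMessageSource implements MessageSource {

    @Override
    public String getMessage(String code, Object[] args, String defaultMessage, Locale locale) {
        return resolveMessage(code, args, locale);
    }

    @Override
    public String getMessage(String code, Object[] args, Locale locale) throws NoSuchMessageException {
        return resolveMessage(code, args, locale);
    }

    @Override
    public String getMessage(MessageSourceResolvable resolvable, Locale locale) throws NoSuchMessageException {
        return resolveMessage(resolvable.getCodes()[0], resolvable.getArguments(), locale);
    }

    private String resolveMessage(String code, Object[] args, Locale locale) {
        String message = ""; // TODO Load message from database...
        return message;
    }
}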

Then configure the template engine so it loads messages from the newly created message source. The Thymeleaf template engine is used in this article.
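
One possible wiring, sketched under the assumption that the DatabaseMessageSource bean from above exists and that a template resolver is configured elsewhere.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.thymeleaf.spring4.SpringTemplateEngine;
import org.thymeleaf.templateresolver.ITemplateResolver;

@Configuration
public class TemplateEngineConfiguration {

    @Bean
    public SpringTemplateEngine templateEngine(ITemplateResolver templateResolver,
                                               DatabaseMessageSource messageSource) {
        SpringTemplateEngine engine = new SpringTemplateEngine();
        engine.setTemplateResolver(templateResolver);
        // Resolve #{...} expressions against the database-backed message source.
        engine.setTemplateEngineMessageSource(messageSource);
        return engine;
    }
}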

Message patterns and plural forms handling

Plural forms handling is a very common requirement in multilingual applications. It has to be robust because some languages have pretty complicated plural rules. Here is an example of three sentences that should be generated from a single message pattern:

  • There is 1 apple in 1 basket.
  • There are 2 apples in 2 baskets.
  • There are 0 apples in 2 baskets.

Standard MessageFormat

To deal with plurals Java offers the MessageFormat formatter. This formatter can evaluate conditions in a message pattern, which can be used for plural forms handling. Let's revisit the DatabaseMessageSource.resolveMessage method to incorporate the formatter into the application.

private String resolveMessage(String code, Object[] args, Locale locale) {
    String message = ""; // TODO Load message from database...
    MessageFormat messageFormat = new MessageFormat(message, locale);
    return messageFormat.format(args);
}

Then you need to create a message pattern with choice conditions. Curly braces are used as placeholders for message parameters and can contain message formatting syntax.

There {0,choice,0#are|1#is|2#are} {0} {0,choice,0#apples|1#apple|2#apples} in {1} {1,choice,0#baskets|1#basket|2#baskets}.

To pass values into the message you just need to add parentheses with the values at the end of the message code.

<p th:text="#{apple.message(1,1)}"></p>
<p th:text="#{apple.message(2,2)}"></p>
<p th:text="#{apple.message(0,2)}"></p>

The example shown is the officially recommended Java approach to plural forms handling. Unfortunately its choice conditions are quite complicated even for English, which has only two plural forms. For languages like Polish it's almost unreadable.

IBM ICU MessageFormat

The ICU project focuses on internationalisation. It offers its own implementation of MessageFormat that is more robust and easier to use. To incorporate this formatter into the application you need to slightly change the DatabaseMessageSource.resolveMessage method.

private String resolveMessage(String code, Object[] args, Locale locale) {
    String message = ""; // TODO Load message from database...
    // Note: MessageFormat here is com.ibm.icu.text.MessageFormat.
    MessageFormat messageFormat = new MessageFormat(message, locale);
    StringBuffer formattedMessage = new StringBuffer();
    messageFormat.format(args, formattedMessage, null);
    return formattedMessage.toString();
}

For plural forms handling the ICU formatter offers a purpose-built plural pattern argument.

There {0,plural,one{is # apple}other{are # apples}} in {1,plural,one{# basket}other{# baskets}}.

Localised URLs

In the Request Mapping on Demand article I've described possibilities of URL localisation on the fly.

Conclusion

There are several other internationalisation approaches available in the Java world. For example it is possible to use GNU gettext in Java, but that's not easy to integrate with Spring. I've done some research and found that the combination of ICU and message codes seems to be the best choice.

2016-05-03

Request mapping on demand in Spring MVC

Motivation

I want to generate request mappings while my application is running. I want several URL paths that point to one controller method. The paths will be generated from e-shop product names. Each path should be bound to a language code sent in a header. Of course the paths can vary as users change product names during the application's execution.

Here is an example list of mappings. A mapping for English will match only if the language header is set to the en code. The same rule applies to the Czech language header.

[header: Accept-Language=en*]
/spinach
/carrot
/broccoli

[header: Accept-Language=cs*]
/spenat
/mrkev
/brokolice

These prerequisites disqualify the @RequestMapping annotation because:

  • Request mapping conditions are evaluated in the application's init phase and cannot be changed at runtime.
  • It's not possible to attach a mapping path to a specific header value. There is a many-to-many relation between the paths and the headers defined as request mapping conditions.
    @RequestMapping(value = {"/spinach", "/carrot", "/spenat", "/mrkev"}, headers = {"Accept-Language=en*", "Accept-Language=cs*"})

Theory of mapping a request to a controller method in Spring MVC

The following figure shows a request's lifecycle in a Spring application.

  1. The first place where the request is processed by an application is the DispatcherServlet. There is only one servlet of this kind and it processes all incoming requests.
  2. The DispatcherServlet in its doDispatch method then tries to find a handler capable of processing the request. The DispatcherServlet holds a list of HandlerMappings built on application start. These mappings tell the servlet which particular mapping is suitable for processing the request. RequestMappingHandlerMapping is the interesting one in our case because it maps a controller method according to the @RequestMapping annotation. If the request matches the conditions specified by the @RequestMapping annotation, a request handler is sent back to the servlet. The handler is an instance of HandlerMethod wrapped in a HandlerExecutionChain object, and it is effectively a pointer to the controller method.
  3. The handler (HandlerMethod) is passed to the HandlerAdapter specialization RequestMappingHandlerAdapter and then executed. RequestMappingHandlerAdapter could easily be renamed HandlerMethodHandlerAdapter because this adapter has almost no relation to the @RequestMapping annotation. This will be useful in one of my solutions.
  4. The handler is executed in the RequestMappingHandlerAdapter.invokeHandlerMethod method. The result is wrapped in a ModelAndView object and then sent back to the servlet.
  5. The servlet resolves a view, processes it, and so on.

Solution #1: Custom request condition

Spring allows you to extend @RequestMapping with a custom condition. By overriding the RequestMappingHandlerMapping.getCustomMethodCondition or RequestMappingHandlerMapping.getCustomTypeCondition methods you can create your own condition that will (or will not) match requests. If a request matches the mapping's conditions and your custom condition, the handler is returned by the handler mapping.

Custom conditions are created at runtime, so they can also be changed at runtime.
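
To give an idea of the shape of such a condition, here is a hypothetical sketch; only the RequestCondition interface comes from Spring, PathRegistry and everything else is made up.

import javax.servlet.http.HttpServletRequest;
import org.springframework.web.servlet.mvc.condition.RequestCondition;

public class LocalizedPathRequestCondition implements RequestCondition<LocalizedPathRequestCondition> {

    // Hypothetical runtime-mutable registry of product paths per language.
    public interface PathRegistry {
        boolean contains(String language, String path);
    }

    private final PathRegistry registry;

    public LocalizedPathRequestCondition(PathRegistry registry) {
        this.registry = registry;
    }

    @Override
    public LocalizedPathRequestCondition combine(LocalizedPathRequestCondition other) {
        // The method-level condition wins over the type-level one.
        return other;
    }

    @Override
    public LocalizedPathRequestCondition getMatchingCondition(HttpServletRequest request) {
        // Match only when the request path is currently registered for the
        // language sent in the Accept-Language header.
        String language = request.getHeader("Accept-Language");
        return registry.contains(language, request.getRequestURI()) ? this : null;
    }

    @Override
    public int compareTo(LocalizedPathRequestCondition other, HttpServletRequest request) {
        return 0; // no relative ordering between matching conditions
    }
}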

To demonstrate this approach I've created a demo application Request Mapping On Demand (Custom Condition).

Pros:

  • Sort of Spring-way approach.
  • Less custom code.

Cons:

  • In Spring Boot it's not possible (in a clean way) to extend the existing RequestMappingHandlerMapping. By adding a new HandlerMapping to the dispatcher you can create a pretty mess. Check the comment in CustomConditionRequestMappingHandlerMapping. Hopefully this will be fixed soon.
  • You have to process the raw HttpServletRequest object in your condition.

This solution was inspired by Rob Hinds's article Spring MVC & custom routing conditions.

Solution #2: Creating a new handler mapping

There is also a demo application which demonstrates this approach: Request Mapping On Demand Demo.

In this solution the @RequestMapping annotation is replaced by a custom @RequestMappingOnDemand annotation. All conditions of this new annotation are created by the application, so no conditions are evaluated in the application's init phase, except for a pointer to a condition manager which creates these conditions.

The handler mapping RequestMappingHandlerMapping is replaced by RequestMappingOnDemandHandlerMapping. This custom handler mapping is capable of processing the conditions attached to a controller method by the @RequestMappingOnDemand annotation.

Because the custom handler mapping returns a HandlerMethod, there is no need to change anything in the adapter.

Pros:

  • More robust handling of requests.
  • Works flawlessly with current version of Spring Boot.

Cons:

  • More custom code.
  • Code duplication in RequestMappingOnDemandHandlerMapping and RequestMappingHandlerMapping.

Conclusion

I've used the second solution in my project because of Spring Boot's inability to extend RequestMappingHandlerMapping. I will probably switch to the first solution when this is fixed.

2016-03-08

Spring Boot + JPA (Hibernate) + Atomikos + PostgreSQL = exception!

If you try to use the combination of Spring Boot (1.3.3), Spring Data JPA, Atomikos and a PostgreSQL database, you will probably experience an exception during application start.

java.sql.SQLFeatureNotSupportedException: Method org.postgresql.jdbc.PgConnection.createClob() is not yet implemented.
 at org.postgresql.Driver.notImplemented(Driver.java:642) ~[postgresql-9.4.1209.jre7-20160307.201142-10.jar:9.4.1209.jre7-SNAPSHOT]
        ...

com.atomikos.datasource.pool.CreateConnectionException: an AtomikosXAPooledConnection with a SessionHandleState with 0 context(s): connection is erroneous
 at com.atomikos.jdbc.AtomikosXAPooledConnection.testUnderlyingConnection(AtomikosXAPooledConnection.java:116) ~[transactions-jdbc-3.9.3.jar:na]
        ...

These exceptions appear because JPA (Hibernate) backed by Atomikos tries to verify PostgreSQL's CLOB feature. This feature is not implemented in the JDBC driver, so the driver throws an unimportant exception. Unfortunately Atomikos has an exception listener which marks the connection as erroneous when any exception occurs.

To suppress this behaviour you have to disable the driver's feature detection and configure its features manually. This is a kind of shady, undocumented way of doing it, but it works.

# Disable feature detection by this undocumented parameter. Check the org.hibernate.engine.jdbc.internal.JdbcServiceImpl.configure method for more details.
spring.jpa.properties.hibernate.temp.use_jdbc_metadata_defaults = false

# Because detection is disabled you have to set correct dialect by hand.
spring.jpa.database-platform=org.hibernate.dialect.PostgreSQL9Dialect

For more details about the configuration of distributed transactions with Atomikos check Fabio Maffioletti's article.

2016-03-04

Remove all NULL checks from table in Microsoft SQL Server 2008

I wanted to create a mockup schema based on an existing database. I was trying to insert some simple data into pretty complex tables flooded with NOT NULL columns. Unfortunately in MS SQL there's no easy way to disable NULL checks temporarily, so I had to do it permanently. To simplify the task I used a script that generates a set of ALTER TABLE commands removing the NULL checks from a table.
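
A sketch of such a generator, assuming the SQL Server sys.* catalog views; the table name is a placeholder and type-length handling is deliberately simplified.

-- Generates one ALTER TABLE ... ALTER COLUMN ... NULL command for every
-- NOT NULL column of the given table. Types with precision/scale
-- (e.g. DECIMAL) need manual attention.
SELECT 'ALTER TABLE ' + t.name + ' ALTER COLUMN ' + c.name + ' ' + ty.name
       + CASE
             WHEN c.max_length = -1 THEN '(MAX)'
             WHEN ty.name IN ('nvarchar', 'nchar') THEN '(' + CAST(c.max_length / 2 AS VARCHAR(10)) + ')'
             WHEN ty.name IN ('varchar', 'char', 'varbinary', 'binary') THEN '(' + CAST(c.max_length AS VARCHAR(10)) + ')'
             ELSE ''
         END
       + ' NULL'
FROM sys.columns c
JOIN sys.tables t ON t.object_id = c.object_id
JOIN sys.types ty ON ty.user_type_id = c.user_type_id
WHERE t.name = 'my_table'
  AND c.is_nullable = 0;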

This is just a raw solution with a few flaws that have to be solved by hand. For example, alter commands are generated even for identity columns, which cannot exist without a NULL check.

2016-01-31

Wi-Fi and Bluetooth USB dongles for Raspberry Pi 1 model B

I wanted to expand my Raspberry Pi with Wi-Fi and Bluetooth capabilities. For this task I bought two cheap USB dongles on the AliExpress website: an 802.11n Wi-Fi dongle with Realtek's RTL8188EUS chipset and a USB Bluetooth adapter compatible with the Cambridge Silicon Radio, Ltd Bluetooth Dongle (HCI mode) adapter.

Both of these devices can be used with the latest version of Raspbian. Check out the list of Raspberry Pi compliant hardware.

First of all I had to upgrade Raspbian to the latest version.

$ rpi-update

# Check out Raspbian's version
$ uname -a
Linux raspberrypi 4.1.16+ #833 Wed Jan 27 14:29:11 GMT 2016 armv6l GNU/Linux

$ sudo apt-get update && sudo apt-get upgrade

After upgrading I checked the connected hardware:

$ lsusb
Bus 001 Device 004 ID 0a12:0001 Cambridge Silicon Radio, Ltd Bluetooth Dongle (HCI mode)
Bus 001 Device 005 ID 0bda:0179 Realtek Semiconductor Corp.

Then I configured the connection to my Wi-Fi router, guided by Oliver's article.

#/etc/wpa.config
network={
    ssid=""
    proto=RSN
    key_mgmt=WPA-PSK
    pairwise=CCMP TKIP
    group=CCMP TKIP
    psk=""
}

#/etc/network/interfaces
allow-hotplug wlan0
auto wlan0

iface wlan0 inet dhcp
wpa-conf /etc/wpa.config

Wi-Fi started working after a reboot of Raspbian.

Installation of the Bluetooth keyboard was quite easy as well. I followed the steps described in Tyler's article. First I installed Bluetooth support into my Raspbian and then connected the wireless keyboard via the bluez tools.

# To scan nearby Bluetooth hardware ready to be paired
$ hcitool scan
Scanning ...
  7C:1E:52:AA:0B:E3 Microsoft Sculpt Wireless Keyboard
# To pair the keyboard, mark it as trusted and connect it
$ sudo bluez-simple-agent hci0 7C:1E:52:AA:0B:E3
$ sudo bluez-test-device trusted 7C:1E:52:AA:0B:E3 yes 
$ sudo bluez-test-input connect 7C:1E:52:AA:0B:E3

2016-01-23

Gradle multi-project using cross dependent modules

I have a framework which consists of several modules. On top of this framework I'd like to build various projects where each project can use one or more modules of the framework. In the real world these projects can represent the framework's implementations for customers. The framework's modules can depend on each other. For this task I decided to use the Gradle build system.

Part of this article is an example project on GitHub.

Setting up the build

I've created three modules that are shared by two root projects, project1 and project2. The second project doesn't use one of the modules, as shown by the following diagram. Both root projects have very similar build scripts, so I will focus on project1 for the rest of this article.

:project1
+--- :module1
|    \--- :core-module
\--- :module2
     \--- :core-module

:project2
\--- :module1
     \--- :core-module

The root project's build script contains a buildscript block which introduces the plugin dependency and other common configuration used by Gradle when compiling/building the project. The build script applies spring-boot-multi-project on the root project. This plugin is a wrapper of the Gradle Spring Boot plugin and enhances its functionality. The plugin also adds a new task that generates a serialized version of the dependency graph. We will get to the dependency graph later.

There are three modules (sub-projects): a core module, which is a Spring Boot project, and two modules (module1 and module2) which depend on the core module and extend its functionality. These modules form the framework. The framework is just a collection of libraries, so it can't work as a standalone application.

The core module depends on the Spring Boot starter project, so to be compiled it needs a compile dependency on spring-boot-starter. To be able to test this module it also needs a declared testCompile dependency on spring-boot-starter-test.
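
In build-script terms that means something like the following snippet (versions are illustrative).

// core-module/build.gradle
dependencies {
    // Needed to compile the module.
    compile 'org.springframework.boot:spring-boot-starter:1.3.3.RELEASE'
    // Needed to compile and run the module's tests.
    testCompile 'org.springframework.boot:spring-boot-starter-test:1.3.3.RELEASE'
}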

Note that the modules' build scripts contain no repository configuration and do not apply any plugins. The repository configuration is included in the spring-boot-multi-project plugin which is applied in the root project's build script. This plugin adds the Maven Central repository and the JitPack repository to every module including the root project.

Module1 depends on the core module. And because the core module contains a Spring Boot project, it's not necessary to declare a Spring Boot compile dependency in module1. Compile dependencies are transitive, so all dependencies from the core module are going to be placed on module1's classpath too.

This rule doesn't apply to testCompile dependencies. When a module (a subproject in Gradle's terminology) is tested, it's tested separately. This means the module is first compiled with all its dependencies (not the root project's dependencies) and its own testCompile dependencies (not the root project's or another subproject's dependencies), and then the tests are executed. This means three things.

  1. You need to add the necessary testCompile dependencies to each subproject's build script.
  2. If a module depends on a resource which is not added in its compile dependencies, you need to add this resource as a testCompile dependency to the module's build script. For example, the resource can be a configuration file located in the root project. To add a directory as a testCompile dependency you can use the files command: testCompile files("path/to/resource").
  3. If you want to use a shared class in your tests, you will need to place it in the main source set or in a new source set and declare a dependency on it. In this example a source set called testFixtures is created in the core module. Support for the testFixtures source set is added by the spring-boot-multi-project plugin.

Of course a Gradle multi-project build has to contain a settings.gradle file which tells Gradle where all the modules (sub-projects in Gradle's terminology) are located.
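
For project1 the settings file might look like this; the relative locations of the framework modules are an assumption about the example's layout.

// project1/settings.gradle
include ':core-module', ':module1', ':module2'

// The framework modules live outside the project directory, so their
// locations have to be set explicitly.
project(':core-module').projectDir = new File(rootDir, '../core-module')
project(':module1').projectDir = new File(rootDir, '../module1')
project(':module2').projectDir = new File(rootDir, '../module2')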

Project dependencies in the application

Sometimes it is useful to know the module dependencies inside the application. For example, if you want to execute data model updating scripts you will need to know which script should run first (the core project's) and which last (the root project's, which depends on everything else).

To do this you can use the project dependencies file generated by the discoverProjectDependencies task. With this graph you can easily sort resources on the classpath in a certain order, as shown in the ProjectDependencyManager service.

Running the project

To start or build the projects you can experiment with the tasks provided by Gradle in the project1 or project2 directories. For example:

gradle discoverProjectDependencies - to serialize project dependencies into a file
gradle bootRun - to run the application
gradle build - to build the jar file
gradle test - to run all tests

If you'd like to open a project in IntelliJ IDEA, use the import Gradle project functionality instead of a simple open project. Importing the project will open all framework modules as project modules, so you'll have all the required source code available via the project panel. You can also use IDEA's Gradle plugin to easily modify the build scripts.

When running the project in IDEA, please make sure that your run configuration has the root project selected in the use classpath of module option.

Be aware that if you don't add any source code to a Gradle project, Gradle won't generate the empty build directories for this project. Unfortunately it adds these non-existent build directories to the classpath when bootRun or a similar task is executed. The invalid paths break Java's class loader and cause problems with loading libraries, which usually ends with a java.io.FileNotFoundException: class path resource [] cannot be resolved to URL because it does not exist error message. This can happen in the early stages of a project when you create an empty module with some static content but without any Java code.