jOOQ 3.14 Released With SQL/XML and SQL/JSON Support

jOOQ 3.14 has been released with support for SQL/XML, SQL/JSON, Kotlin code generation, embeddable types, domain types, synthetic constraints, better MERGE support, and more SQL transformations.

In this release, we’ve sorted our GitHub issues according to user feedback and
finally implemented some of the most wanted features, which include better
Kotlin support, embeddable types, and domain type support.

In addition to this, we believe that our newly added XML and JSON operation
support will be a leading game changer in how ORMs interact with SQL databases
in the future.

XML and JSON

Standard SQL has long supported SQL/XML, and more recently, most RDBMS have
added support for standard SQL/JSON or vendor specific variants thereof. While
ORDBMS extensions have never seen the adoption they deserve, XML and JSON have.
It makes sense to occasionally denormalise data and store documents in the
database directly. However, this is not what we’re most excited about.

We’re excited about our support for all the fancy operators like:

  • JSON_TABLE to turn JSON values into SQL tables
  • JSON_ARRAY, JSON_OBJECT, JSON_VALUE to construct JSON data from values
  • JSON_ARRAYAGG, JSON_OBJECTAGG to aggregate data into nested JSON documents
  • JSON_EXISTS to query documents with JSON path

Similar functions are available for XML, like XMLTABLE, XMLELEMENT, XMLAGG, etc.
All editions of jOOQ 3.14 support standard SQL XML and JSON operators, and
emulate them where non-standard support is available (e.g. PostgreSQL and SQL
Server).

The commercial editions also support SQL Server’s very convenient FOR XML and
FOR JSON APIs, emulating that syntax using standard operators elsewhere. See:

https://blog.jooq.org/2020/05/05/using-sql-server-for-xml-and-for-json-syntax-on-other-rdbms-with-jooq/

But that’s not all of it! If you have Gson, Jackson, or JAXB on the classpath,
we can use that to map org.jooq.XML, org.jooq.JSON, org.jooq.JSONB types from
your query results to your nested data structures, automatically. See:

https://blog.jooq.org/2020/10/09/nesting-collections-with-jooq-3-14s-sql-xml-or-sql-json-support/

These approaches are extremely powerful. In many cases, you might just skip
most of your middleware layer and bind a REST or similar endpoint directly to a
jOOQ / SQL query, producing JSON for your frontend:

https://blog.jooq.org/2019/11/13/stop-mapping-stuff-in-your-middleware-use-sqls-xml-or-json-operators-instead/
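
As a rough sketch of that idea (the AUTHOR table and its columns are
hypothetical, generated names, and the static imports from org.jooq.impl.DSL
are implied as usual), the JSON built by the database can be passed through to
the client as a plain string:

// The database builds the JSON document; the endpoint just returns it as-is
// (assuming at least one author exists, so the aggregate is not SQL NULL)
String json =
ctx.select(jsonArrayAgg(jsonObject(
        jsonEntry("id", AUTHOR.ID),
        jsonEntry("name", AUTHOR.LAST_NAME))))
   .from(AUTHOR)
   .fetchSingle() // Exactly one row with one JSON column
   .value1()      // org.jooq.JSON
   .data();       // The raw JSON string, ready to be written to the response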

Kotlin support

We’ve long supported some Scala extensions and a ScalaGenerator. Kotlin is
another very promising JVM language, for which the new jOOQ-kotlin module as
well as the KotlinGenerator will add a lot of value to your jOOQ/Kotlin
integration.

The KotlinGenerator offers, among other things:

  • Data class support for generated POJOs
  • Property support for generated POJOs, interfaces and records
  • Better nullability support

The jOOQ-kotlin module offers some useful extension functions to further improve
the experience of the jOOQ/Kotlin integration.

In addition to the above, we’ve annotated the entire jOOQ API with nullability
annotations from JetBrains:

  • org.jetbrains.annotations.Nullable
  • org.jetbrains.annotations.NotNull

This will remove many of the annoying T! types in your Kotlin/jOOQ code,
turning them into T or T? types instead, giving you more confidence.

With these improvements, we’ve also critically reviewed our existing
ScalaGenerator, fixing a lot of bugs.

Embeddable types

One of the biggest new features in jOOQ 3.14 is inspired by JPA, which ships
with embeddable types. An embeddable type is an emulation of a database user-
defined type (UDT), which is supported natively only in Oracle and PostgreSQL.
The biggest gain of such types is to create more semantic, composite data types
in your database schema, and profit from the additional type safety.

Our interpretation of the feature is mostly in the source code generator, whose
output now becomes even more valuable. All jOOQ editions support the basic
infrastructure to pattern-match column sets and turn them into synthetic
embeddable types.

In addition to the above, our commercial editions offer some auto configuration
of embeddable types in cases where they really shine:

  • For primary/unique constraints and their matching foreign keys
  • For DOMAIN types (see below)
  • Handling overlapping embeddable types
  • Allowing for embeddable types to replace their underlying columns

We took the concept a step further than JPA. In jOOQ, embeddable types can act
as views on the underlying columns, without replacing them, or as a replacement
like in JPA. jOOQ respects all levels of relational modelling, including
overlapping constraints, thus allowing two embeddable types to overlap.
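
As a hedged sketch of what this gives you in client code (the TRANSACTION
table, its MONETARY_AMOUNT embeddable, and the MonetaryAmountRecord type are
made-up, generated names for illustration), an embeddable column can be
compared as a whole instead of column by column:

// MONETARY_AMOUNT is an embeddable over the (CURRENCY, AMOUNT) columns
ctx.select(TRANSACTION.ID, TRANSACTION.MONETARY_AMOUNT)
   .from(TRANSACTION)
   .where(TRANSACTION.MONETARY_AMOUNT.eq(
        new MonetaryAmountRecord("EUR", new BigDecimal("33.50"))))
   .fetch();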

For more information, please refer to:

https://www.jooq.org/doc/3.14/manual/code-generation/codegen-embeddable-types/

DOMAIN types

Speaking of types, some database dialects support standard SQL DOMAIN types,
which are a simpler form of UDTs. Instead of working with low level technical
types like VARCHAR(10), why not give your single-column types a name, and add
a few re-usable CHECK constraints to them?

That’s what a DOMAIN type is:

  • A named type
  • Aliasing a technical type, like VARCHAR(10)
  • Possibly adding a DEFAULT expression
  • Possibly adding a NOT NULL constraint
  • Possibly adding a set of CHECK constraints

All of the above is reusable across your schema, and if you’re commercially
licensed, you can even have the code generator auto-generate embeddable types
for all of your domains to profit from the additional type safety in Java.
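
As a quick, hedged illustration in plain SQL (dialect support and syntax
details vary, and the domain and table names are made up), defining a DOMAIN
once lets you reuse it in many tables:

// Plain SQL sketch, executed through jOOQ
ctx.execute("create domain email as varchar(100) check (value like '%@%')");
ctx.execute("create table customer (id int primary key, address email)");
ctx.execute("create table supplier (id int primary key, contact email)");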

For more information, please refer to:

Synthetic constraints

Related to the above improved code generator output are synthetic objects,
such as the previously supported synthetic primary keys, and now also synthetic
unique and foreign keys.

If you invest heavily in security and re-usable components within your
database, you will make heavy use of SQL views. Unfortunately, views do not have
any meta data like foreign key constraints – despite that meta data being
"obvious" to you, the database designer. With synthetic constraints, you can
tell jOOQ about your knowledge of the underlying constraints.

The main benefits of making this meta data available to jOOQ are:

  • You can now use implicit joins on views as well
  • You can now use the JOIN .. ON KEY syntax on views as well
  • You can use embeddable key types from before on views just like on tables
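
For example, a rough sketch of the first benefit (the B_VIEW view, its
synthetic foreign key, and the generated author() path are hypothetical names):
once jOOQ knows about the key, implicit joins work on the view just like on a
table:

// B_VIEW is a generated object for a view with a synthetic foreign key to AUTHOR
ctx.select(
        B_VIEW.TITLE,
        B_VIEW.author().LAST_NAME) // Implicit join, enabled by the synthetic key
   .from(B_VIEW)
   .fetch();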

For more information, please refer to:

Better MERGE support

We’ve finally tackled support for more advanced MERGE statement clauses and now
support:

  • Multiple WHEN MATCHED AND condition THEN UPDATE clauses
  • Multiple WHEN MATCHED AND condition THEN DELETE clauses
  • UpdatableRecord.merge() and all the related goodies to simplify record merging
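
The simplest entry point is UpdatableRecord.merge(), sketched here with a
hypothetical generated BOOK table and BookRecord:

// Upsert a record: jOOQ renders a MERGE statement, or an equivalent
// emulation, depending on the dialect
BookRecord book = ctx.newRecord(BOOK);
book.setId(5);
book.setTitle("Animal Farm");
book.merge();           // Merge all columns
book.merge(BOOK.TITLE); // Or merge only selected columns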

Transformations

With the parser and our translator tool (https://www.jooq.org/translate), we’ll
invest more and more in new use-cases for putting jOOQ to use other than as an
embeddable DSL in Java.

Our translation capabilities have already been strong, and with a new set of
SQL transformations, they become even stronger, helping customers migrate from
RDBMS A to RDBMS B (and back, if they made a mistake).

While our website translator is free of charge, the jOOQ library can always be
used programmatically, or as a command line utility:

https://www.jooq.org/doc/3.14/manual/sql-building/sql-parser/sql-parser-cli/
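
For example, a minimal sketch of the programmatic route (the transformation
specific settings live in org.jooq.conf.Settings and are not shown here; the
translated output in the comment is indicative only):

// Parse a query written for one dialect and render it for another
Query query = DSL.using(SQLDialect.SQLSERVER)
                 .parser()
                 .parseQuery("select top 10 * from t order by id");

System.out.println(DSL.using(SQLDialect.POSTGRES).render(query));
// e.g. select * from t order by id limit 10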

To make this use-case even more useful, new transformation features have been
added, including:

  • ROWNUM to LIMIT or to ROW_NUMBER()
  • Table lists to ANSI JOIN (including Oracle (+) support)
  • Removal of unnecessary arithmetic expressions

This is an exciting area that we’ll be exploring for our commercially licensed
customers in the future, while even the jOOQ Open Source Edition profits from
these improvements. For example, the infrastructure created for transformations
finally enabled emulating PostgreSQL’s DISTINCT ON clause in other dialects.

For more information, please refer to:

Better manual

We’ve taken a step back and reviewed a few important parts of our documentation.
We’re now offering:

  • Sticky tabs for code generator techniques (XML, Programmatic, Gradle): If
    you’re using Gradle with jOOQ’s code generator, you don’t want to look at the
    XML configuration again, in the manual. These tabs finally hide unneeded
    information.
  • Subsections for each function: We’ve started documenting each SQL function
    individually, explaining how it works in SQL, and providing some examples and
    example results.
  • Generate example vendor specific rendering of SQL: We’re using jOOQ when
    generating the manual, translating some jOOQ API usage to all of our supported
    dialects, and displaying how the function renders in each dialect.
  • Show imports button and display context sensitive imports: All the examples in
    the manual can be overwhelming. We’re assuming a lot of (static) imports,
    which we’re finally documenting in an expandable “show imports” section of
    each code fragment.
  • We’ve rewritten some sections to be much more complete with examples, such as
    the data import section.
  • A new API diff page displays what has changed between each minor release, as
    a list of Javadoc links: https://www.jooq.org/api-diff

Nesting Collections With jOOQ 3.14’s SQL/XML or SQL/JSON support

One of the main features of ORMs is M as in Mapping. Libraries like jOOQ help auto-mapping flat or nested database records onto Java classes that have the same structure as the SQL result set.

The following has always been possible in jOOQ, assuming PostgreSQL’s INFORMATION_SCHEMA (using the generated code from the jOOQ-meta module):

class Column {
    String tableSchema;
    String tableName;
    String columnName;
}

for (Column c :
    ctx.select(
            COLUMNS.TABLE_SCHEMA, 
            COLUMNS.TABLE_NAME, 
            COLUMNS.COLUMN_NAME)
       .from(COLUMNS)
       .where(COLUMNS.TABLE_NAME.eq("t_author"))
       .orderBy(COLUMNS.ORDINAL_POSITION)
       .fetchInto(Column.class))
    System.out.println(
        c.tableSchema + "." + c.tableName + "." + c.columnName
    );

The above results in something like:

public.t_author.id
public.t_author.first_name
public.t_author.last_name
public.t_author.date_of_birth
public.t_author.year_of_birth
public.t_author.address

The mapping is straightforward, as explained in jOOQ’s DefaultRecordMapper.

Nested mappings

A lesser known feature that we’ve offered for a while is the use of a dot notation to emulate nesting records into nested Java classes. Assume you want to use a re-usable data type description for your columns and elsewhere:

class Type {
    String name;
    int precision;
    int scale;
    int length;
}

class Column {
    String tableSchema;
    String tableName;
    String columnName;
    Type type;
}

You can now write the following query, where you alias some columns using the dot notation, e.g. to type.name (several nesting levels are possible):

for (Column c :
    ctx.select(
            COLUMNS.TABLE_SCHEMA,
            COLUMNS.TABLE_NAME,
            COLUMNS.COLUMN_NAME,
            COLUMNS.DATA_TYPE.as("type.name"),
            COLUMNS.NUMERIC_PRECISION.as("type.precision"),
            COLUMNS.NUMERIC_SCALE.as("type.scale"),
            COLUMNS.CHARACTER_MAXIMUM_LENGTH.as("type.length")
       )
       .from(COLUMNS)
       .where(COLUMNS.TABLE_NAME.eq("t_author"))
       .orderBy(COLUMNS.ORDINAL_POSITION)
       .fetchInto(Column.class))

    System.out.println(String.format(
        "%1$-30s: %2$s",
        c.tableSchema + "." + c.tableName + "." + c.columnName,
        c.type.name + (c.type.precision != 0
               ? "(" + c.type.precision + ", " + c.type.scale + ")"
               :       c.type.length != 0
               ? "(" + c.type.length + ")"
               : "")
    ));

The above will print:

public.t_author.id            : integer(32, 0)
public.t_author.first_name    : character varying(50)
public.t_author.last_name     : character varying(50)
public.t_author.date_of_birth : date
public.t_author.year_of_birth : integer(32, 0)
public.t_author.address       : USER-DEFINED

Using XML or JSON

Using XML or JSON, starting from jOOQ 3.14, you can also nest collections in your result set mapping very easily. First, let’s look at how to write a JSON query using jOOQ, e.g. to find all columns per table:

for (Record1<JSON> record :
    ctx.select(
            jsonObject(
                jsonEntry("tableSchema", COLUMNS.TABLE_SCHEMA),
                jsonEntry("tableName", COLUMNS.TABLE_NAME),
                jsonEntry("columns", jsonArrayAgg(
                    jsonObject(
                        jsonEntry("columnName", COLUMNS.COLUMN_NAME),
                        jsonEntry("type", jsonObject(
                            "name", COLUMNS.DATA_TYPE)
                        )
                    )
                ).orderBy(COLUMNS.ORDINAL_POSITION))
            )
       )
       .from(COLUMNS)
       .where(COLUMNS.TABLE_NAME.in("t_author", "t_book"))
       .groupBy(COLUMNS.TABLE_SCHEMA, COLUMNS.TABLE_NAME)
       .orderBy(COLUMNS.TABLE_SCHEMA, COLUMNS.TABLE_NAME)
       .fetch())
    System.out.println(record.value1());

The following JSON documents are returned:

{
  "tableSchema": "public", 
  "tableName": "t_author", 
  "columns": [{
    "columnName": "id", 
    "type": {"name": "integer"}
  }, {
    "columnName": "first_name", 
    "type": {"name": "character varying"}
  }, {...}]
}

{
  "tableSchema": "public", 
  "tableName": "t_book", 
  "columns": [{...}, ...]
}

That’s already awesome, isn’t it? We’ve blogged about this approach before. Starting with jOOQ 3.14, you can remove all the other middleware and mapping and what not, and produce your XML or JSON documents directly from your database using the standard SQL/XML or SQL/JSON APIs!

But that’s not all!

Maybe, you don’t actually need the JSON document, you just want to use JSON to allow for nesting data structures, mapping them back to Java.

What about these nested Java classes:

public static class Type {
    public String name;
}

public static class Column {
    public String columnName;
    public Type type;
}

public static class Table {
    public String tableSchema;
    public String tableName;

    public List<Column> columns;
}

Assuming you have Gson, Jackson, or JAXB on your classpath (or you configure them directly), you can write the exact same query as before, and use jOOQ’s DefaultRecordMapper via the fetchInto(Table.class) call:

for (Table t :
    ctx.select(
            jsonObject(
                jsonEntry("tableSchema", COLUMNS.TABLE_SCHEMA),
                jsonEntry("tableName", COLUMNS.TABLE_NAME),
                jsonEntry("columns", jsonArrayAgg(
                    jsonObject(
                        jsonEntry("columnName", COLUMNS.COLUMN_NAME),
                        jsonEntry("type", jsonObject(
                            "name", COLUMNS.DATA_TYPE)
                        )
                    )
                ).orderBy(COLUMNS.ORDINAL_POSITION))
            )
       )
       .from(COLUMNS)
       .where(COLUMNS.TABLE_NAME.in("t_author", "t_book"))
       .groupBy(COLUMNS.TABLE_SCHEMA, COLUMNS.TABLE_NAME)
       .orderBy(COLUMNS.TABLE_SCHEMA, COLUMNS.TABLE_NAME)
       .fetchInto(Table.class))
    System.out.println(t.tableName + ":\n" + t.columns
       .stream()
       .map(c -> c.columnName + " (" + c.type.name + ")")
       .collect(joining("\n  ")));

The output being:

t_author:
  id (integer)
  first_name (character varying)
  last_name (character varying)
  date_of_birth (date)
  year_of_birth (integer)
  address (USER-DEFINED)
t_book:
  id (integer)
  author_id (integer)
  co_author_id (integer)
  details_id (integer)
  title (character varying)
  published_in (integer)
  language_id (integer)
  content_text (text)
  content_pdf (bytea)
  status (USER-DEFINED)
  rec_version (integer)
  rec_timestamp (timestamp without time zone)

No join magic. No cartesian products. No data deduplication. Just SQL-native nested collections, using an intuitive, declarative approach to creating the document data structure, combined with the usual awesomeness of SQL.

Using this without the jOOQ DSL

Of course, this also works without the jOOQ API, e.g. using our parser. Check out our translator tool. Plug in this native SQL beauty:

SELECT
  json_object(
    KEY 'tableSchema' VALUE columns.table_schema,
    KEY 'tableName' VALUE columns.table_name,
    KEY 'columns' VALUE json_arrayagg(
      json_object(
        KEY 'columnName' VALUE columns.column_name,
        KEY 'type' VALUE json_object(
          KEY 'name' VALUE columns.data_type
        )
      )
    )
  )
FROM columns
WHERE columns.table_name IN ('t_author', 't_book')
GROUP BY columns.table_schema, columns.table_name
ORDER BY columns.table_schema, columns.table_name

And, because the devil of SQL agnosticity and translation is in the detail, get the vendor-specific version back out, e.g. for PostgreSQL:

SELECT json_build_object(
  'tableSchema', columns.table_schema,
  'tableName', columns.table_name,
  'columns', json_agg(json_build_object(
    'columnName', columns.column_name,
    'type', json_build_object('name', columns.data_type)
  ))
)
FROM columns
WHERE columns.table_name IN (
  't_author', 't_book'
)
GROUP BY
  columns.table_schema,
  columns.table_name
ORDER BY
  columns.table_schema,
  columns.table_name

You might need to run this first:

SET search_path = 'information_schema'

Conclusion

We’ve waited way too long with this game changing feature. I truly think this approach will change how we perceive ORMs in the future. The database first approach, where we can use SQL and only SQL to map SQL data onto any hierarchical data structure, is very compelling.

On the jOOQ side, we’re far from done yet. What if we can auto-generate some of the JSON document declaration from other types of meta data for you? What if you can do that, yourself? E.g. to map a GraphQL specification to jOOQ API based JSON queries? On all the SQL dialects that support these features!

The future of mapping nested data structures from SQL to any client (XML, JSON, objects) is bright. jOOQ 3.14 is around the corner and will be released within the next 2 weeks. You can already build it from GitHub: https://github.com/jOOQ/jOOQ, or if you have a license, download a nightly build from here: https://www.jooq.org/download/versions

Looking forward to your feedback.

What’s a “String” in the jOOQ API?

One of jOOQ’s biggest strengths is the fact that it is a type safe SQL API. “Type safe”, in this context, means that every object that you put in a jOOQ query has a well defined type, such as org.jooq.Field, org.jooq.Table, or org.jooq.Condition.

These can be used in jOOQ in a type safe way as such:

ctx.select(T.A)      // A Field
   .from(T)          // A Table
   .where(T.B.eq(1)) // A Condition
   .fetch();

No Strings here, no string concatenation.

However, there are a few cases where you want to bypass type safety, for whatever reason, including to extend jOOQ e.g. with plain SQL templating. In those cases, you would pass a “String” object to the jOOQ API. But not every such String object is the same. What different types of strings are there? There are 4 main types of String in the jOOQ API:

1. The bind value

The most obvious type of string is the bind value or literal. You can create those explicitly using:

// As always, this static import is implied
import static org.jooq.impl.DSL.*;

Field<String> bind = val("abc");
Field<String> literal = inline("xyz");

By default, the first value will produce a bind parameter marker "?" in the generated SQL, whereas the second one will produce an escaped string literal 'xyz'. You’re probably already doing this implicitly. Whenever you pass a String value where the jOOQ API expects a T type, you’re implicitly wrapping your String value using DSL.val(T):

ctx.select(T.A)
   .from(T)
   .where(T.C.eq("xyz")) // Implicit bind value
   .fetch();

This is still type safe usage of a String value, as it is really wrapped as a Field<String>.

2. The plain SQL template

Whenever jOOQ is missing some vendor specific functionality, the backdoor is to use plain SQL templating. You can create plain SQL templates explicitly for the most popular types as shown above, like this:

Field<Integer> field = field("(1 + 2)", SQLDataType.INTEGER);
Table<?> table       = table("generate_series(1, 10)");
Condition condition  = condition("some_function() = 1");

These expressions can now be embedded in your queries, like any other:

ctx.select(field)
   .from(table)
   .where(condition)
   .fetch();

Alternatively, there also exist convenience overloads on some query methods to make this simpler:

ctx.select(field("(1 + 2)", SQLDataType.INTEGER)) // Not on SELECT
   .from("generate_series(1, 10)")
   .where("some_function() = 1")
   .fetch();

Notice that the select() method doesn’t have such convenience API yet, as of jOOQ 3.13.

Important disclaimer: Using these APIs, you’re going to expose yourself to the usual SQL injection risk that is present also with JDBC or JPQL usage, when composing SQL from strings. NEVER concatenate plain SQL templates, nor use user input in these strings. Use the templating language instead, and turn every user input into bind variables. Examples:

ctx.select(...)
   .from(...)
   .where("some_function() = ?", 1) // Bind variable
   .fetch();

ctx.select(...)
   .from(...)
   .where("some_function() = {0}", val(1)) // Templating
   .fetch();

If you encounter String types on most query API in jOOQ, it’s for plain SQL templating. All of this API is annotated with @org.jooq.PlainSQL for additional documentation purposes, and to be preprocessed with a static checker that can be used to disallow such API usage by default, for added security.

3. Names (identifiers)

But on some query API in jOOQ, Strings are not a convenience for plain SQL templating, but stand for names and identifiers. Namely, all DDL statements use strings this way in their API.

You can create qualified or unqualified identifiers explicitly as follows:

// Unqualified table identifier
Name table = name("t");

// Qualified column identifier
Name field = name("t", "col");

And then, use those identifiers in your DDL statements, e.g. to create the table:

ctx.createTable(table)
   .column(field, SQLDataType.INTEGER)
   .execute();

Depending on the context, qualification may or may not be necessary. In this case, field qualification would not have been necessary.

For convenience, you can also just use String types on the createTable(String) API:

ctx.createTable("t")
   .column("col", SQLDataType.INTEGER)
   .execute();

These strings will simply be wrapped in DSL.name(String) as illustrated before.

Beware: In jOOQ, all identifiers are quoted by default (RenderQuotedNames.EXPLICIT_DEFAULT_QUOTED). This has three benefits:

  • Special characters, and more importantly, keyword conflicts are handled correctly out of the box
  • Quoting prevents SQL injection
  • Case sensitivity is handled correctly in those dialects that support it for quoted identifiers

The price to pay for this convenience is that quoted identifiers might become case sensitive when this is undesired. To work around this, you can turn off quoting using your settings, e.g. by setting RenderQuotedNames.EXPLICIT_DEFAULT_UNQUOTED. But beware, this will put you at SQL injection risk again if you don’t sanitise your identifier names first!
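
A minimal configuration sketch (assuming a PostgreSQL connection; Settings and RenderQuotedNames live in org.jooq.conf):

// Turn off quoting of identifiers globally via Settings
DSLContext ctx = DSL.using(connection, SQLDialect.POSTGRES, new Settings()
    .withRenderQuotedNames(RenderQuotedNames.EXPLICIT_DEFAULT_UNQUOTED));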

4. Keywords

Keywords are strings too, in jOOQ. In rare cases, you may want to wrap your keyword-as-string representation in an org.jooq.Keyword type. The main benefit (as of jOOQ 3.13) is a consistent keyword style. There is no convenience API for this, as it is very rare for client code to use this feature. There’s only DSL.keyword(String):

Keyword current = keyword("current");
Keyword time = keyword("time");

You can now use keywords in plain SQL templates:

Field<Time> currentTime = field(
  "{0} {1}", 
  SQLDataType.TIME, 
  current, time
);

Create Empty Optional SQL Clauses with jOOQ

When using jOOQ to create dynamic SQL statements (one of jOOQ’s core value propositions), it is often necessary to add query elements conditionally, with a default “No-op” behaviour. For first time users, this default “no-op” behaviour is not always obvious as the jOOQ API is vast, and as with any vast API, there are many different options to do similar things.

How not to do it

A common pitfall is to be tempted to work with the many XYZStep types. What types are these? They are usually invisible to the developer as developers use jOOQ’s DSL API in a fluent fashion, just like the JDK Stream API. For example:

DSLContext ctx = ...;

Result<?> result =
ctx.select(T.A, T.B)
   .from(T)
   .where(T.C.eq(1))
   .and(T.D.eq(2))
   .fetch();

Let’s decompose the above query to see what happens in the API. We could assign every method result to a local variable:

SelectFromStep<?> s1 = ctx.select(T.A, T.B);
SelectWhereStep<?> s2 = s1.from(T);
SelectConditionStep<?> s3 = s2.where(T.C.eq(1));
SelectConditionStep<?> s4 = s3.and(T.D.eq(2));

Result<?> result = s4.fetch();

Our previous fluent API design blog post explains this API design technique.

This is not what people usually do with “static SQL” statements, but they might be tempted to do this if they wanted to add the last predicate (T.D = 2) conditionally, e.g.:

DSLContext ctx = ...;

SelectConditionStep<?> c =
ctx.select(T.A, T.B)
   .from(T)
   .where(T.C.eq(1));

if (something)
    c = c.and(T.D.eq(2));

Result<?> result = c.fetch();

This is perfectly valid API usage, but we do not recommend it, because it is very messy and leads to difficult-to-maintain client code. Also, it is absolutely unnecessary, because there is a better way:

Composing queries from its parts

The problem with the above approach is that it adds things to a query imperatively, step by step. This is how many developers tend to structure their code, but with SQL (and by consequence, jOOQ) that can turn out to be difficult to get right. A functional approach tends to work better.

Notice that not only the entire DSL structure could be assigned to local variables, but also the individual SELECT clause arguments. For example:

DSLContext ctx = ...;

List<SelectField<?>> select = Arrays.asList(T.A, T.B);
Table<?> from = T;
Condition where = T.C.eq(1).and(T.D.eq(2));

Result<?> result =
ctx.select(select)
   .from(from)
   .where(where)
   .fetch();

In fact, every jOOQ query is a dynamic SQL query. Many queries just happen to look like static queries, due to jOOQ’s API design.

Again, we wouldn’t be assigning every SELECT clause argument to a local variable, only the truly dynamic ones. For example:

DSLContext ctx = ...;

Condition where = T.C.eq(1);

if (something)
    where = where.and(T.D.eq(2));

Result<?> result =
ctx.select(T.A, T.B)
   .from(T)
   .where(where)
   .fetch();

This already looks quite decent.

Avoid breaking readability

A lot of people aren’t happy with this approach either, because it breaks a query’s readability by making its components non-local. The predicates in the query are declared up front, away from the query itself. This isn’t how many people like to reason about SQL.

And you don’t have to! It is totally possible to embed the condition directly in the WHERE clause like this:

DSLContext ctx = ...;

Result<?> result =
ctx.select(T.A, T.B)
   .from(T)

   // We always need this predicate
   .where(T.C.eq(1))

   // This is only added conditionally
   .and(something
      ? T.D.eq(2)
      : DSL.noCondition())
   .fetch();

The magic is in the above usage of DSL.noCondition, which is a pseudo predicate that does not generate any content. It is a placeholder where an org.jooq.Condition type is required without actually materialising one.

There are also DSL.trueCondition() and DSL.falseCondition(), which act as identities for AND and OR respectively, but using those requires having to think about these identities and the reductions all the time. Also, if you append many of these trueCondition() or falseCondition() predicates to a query, the resulting SQL tends to be quite ugly, for example for people having to analyse performance in production. noCondition() just never generates any content at all.

Note that noCondition() does not act as an identity! If your noCondition() is the only predicate left in a WHERE clause, there will not be any WHERE clause, regardless if you work with AND predicates or OR predicates.
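
This also makes noCondition() a handy seed when folding a dynamic, possibly empty collection of predicates into a single one, as in this small sketch:

// If the list is empty, the reduction yields noCondition(),
// and no WHERE clause is rendered at all
List<Condition> conditions = ...;

Result<?> result =
ctx.select(T.A, T.B)
   .from(T)
   .where(conditions.stream().reduce(noCondition(), Condition::and))
   .fetch();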

No-op expressions in jOOQ

When using dynamic SQL like this, and adding things conditionally to queries, such “no-op expressions” become mandatory. In the previous example, we’ve seen how to add a “no-op predicate” to a WHERE clause (the same approach would obviously work with HAVING and all other clauses that work with boolean expressions).

The three most important jOOQ query part types are:

  • org.jooq.Condition
  • org.jooq.Field
  • org.jooq.Table

Users may want to add all of these conditionally to queries.

org.jooq.Condition

We’ve already seen how to do this with org.jooq.Condition.

org.jooq.Field

What about dynamic column expressions in the projection (the SELECT clause)? Assume you want to project columns only in certain cases. In our example, the T.B column is something we don’t always need. That’s easy! The same approach can be used (assuming T.B is a string column):

DSLContext ctx = ...;

Result<Record2<String, String>> result =
ctx.select(T.A, something ? T.B : DSL.inline("").as(T.B))
   .from(T)
   .where(T.C.eq(1))
   .and(T.D.eq(2))
   .fetch();

Using inlined parameters via DSL.inline(), you can easily produce a no-op value in your projection if you don’t want to modify the projection’s row type. The advantage is that you can now use this subquery in a union that expects two columns:

DSLContext ctx = ...;

Result<Record2<String, String>> result =

// First union subquery has a conditionally projected column
ctx.select(T.A, something ? T.B : DSL.inline("").as(T.B))
   .from(T)
   .where(T.C.eq(1))
   .and(T.D.eq(2))

   .union(

// Second union subquery has no such conditions
    select(U.A, U.B)
   .from(U))
   .fetch();

You can take this one step further, and make an entire union subquery conditional this way!

DSLContext ctx = ...;

Result<Record2<String, String>> result =

// The second union subquery below is chosen conditionally
ctx.select(T.A, T.B)
   .from(T)
   .union(
      something
        ? select(U.A, U.B).from(U)
        : select(inline(""), inline("")).where(falseCondition())
   )
   .fetch();

This is a bit more syntactic work, but it’s nice to see how easy it is to add something conditionally to a jOOQ query without making the query completely unreadable. Everything is local to where it is being used. No local variables are needed, no imperative control flow is invoked.

And because everything is now an expression (and not a statement / no control flow), we can factor out parts of this query into auxiliary methods, that can be made reusable.

org.jooq.Table

Conditional table expressions usually appear when doing conditional joins. This is usually not done in isolation, but together with other conditional elements in a query. E.g. if some columns are projected conditionally, those columns may require an additional join, as they originate from a table other than the tables that are used unconditionally. For example:

DSLContext ctx = ...;

Result<?> result =
ctx.select(
      T.A, 
      T.B, 
      something ? U.X : inline(""))
   .from(
      something
      ? T.join(U).on(T.Y.eq(U.Y))
      : T)
   .where(T.C.eq(1))
   .and(T.D.eq(2))
   .fetch();

There isn’t a simpler way to produce the conditional JOIN expression, because JOIN and ON need to be provided independently. For simple cases as shown above, this is perfectly fine. In more complex cases, some auxiliary methods may be needed, or views.

Conclusion

There are two important messages here in this post:

  1. The XYZStep types are auxiliary types only. They are there to make your dynamically constructed SQL statement look like static SQL. But you should never feel the need to assign them to local variables, or return them from methods. While it is not wrong to do so, there is almost always a better way to write dynamic SQL.
  2. In jOOQ, every query is a dynamic query. This is the benefit of composing SQL queries using an expression tree like the one that is used in jOOQ’s internals. You may not see the expression tree because the jOOQ DSL API mimics static SQL statement syntax. But behind the scenes, you’re effectively building this expression tree. Every part of the expression tree can be produced dynamically, from local variables, methods, or expressions, such as conditional expressions. I’m looking forward to using the new JEP 361 switch expressions in dynamic SQL. Just like a SQL CASE expression, some SQL statement parts can be constructed dynamically in the client, prior to passing them to the server.

Once these two things are internalised, you can write very fancy dynamic SQL, including using FP approaches to constructing data structures, such as a jOOQ query object.

Using Java 13+ Text Blocks for Plain SQL with jOOQ

Most jOOQ users use the jOOQ DSL API, which provides compile time type safety and an easy way to write dynamic SQL.

But occasionally, this DSL gets in the way, because it might be

  • Overkill for some very simple quick and dirty SQL queries
  • Too limited when running very advanced vendor specific SQL, such as Oracle’s MODEL or MATCH_RECOGNIZE clauses

In such cases, you can still benefit from jOOQ’s many secondary features, including for example its nice integration with the Stream API, export functionality, and much more. Think of jOOQ as an improved JDBC!

Starting with Java 13, when enabling preview features, you can now make use of text blocks, or “multi line strings”, which are very useful for embedding static SQL strings in Java code (and XML, JSON, regular expressions, etc). There are two main use cases for using text blocks in jOOQ:

Plain SQL

The main use case is plain SQL, and also plain SQL templating. For example, to run a quick and dirty jOOQ query like this on H2 or PostgreSQL:

System.out.println(ctx.fetch("""
        SELECT table_schema, count(*)
        FROM information_schema.tables
        GROUP BY table_schema
        ORDER BY table_schema
        """));

The output is jOOQ’s nicely formatted text result:

+------------------+--------+
|TABLE_SCHEMA      |COUNT(*)|
+------------------+--------+
|INFORMATION_SCHEMA|      33|
|MCVE              |       2|
|PUBLIC            |       1|
+------------------+--------+

The above is a simple plain SQL query, but we can also make use of plain SQL templating with text blocks! What if we wanted to have a dynamic GROUP BY clause?

Always assuming this static import:

import static org.jooq.impl.DSL.*;

We can now write:

Stream.of(
        field("table_schema"),
        list(field("table_schema"), field("table_type")))
    .forEach(q -> {
        System.out.println(ctx.fetch("""
          SELECT {0}, count(*), row_number() OVER (ORDER BY {0}) AS rn
          FROM information_schema.tables
          GROUP BY {0}
          ORDER BY {0}
          """, q));
    });

We’ve written a dynamic SQL query with jOOQ, but without using much type safety (we could still use some type safe, generated schema object references if we wanted to). jOOQ’s plain SQL templating capabilities are a much underestimated feature set. People have been using MyBatis for many years precisely for these purposes, or maybe built home-grown, Velocity template based frameworks to produce dynamic SQL using native SQL strings.

You can also use jOOQ for this, and profit from all of the jOOQ API, including, again the formatted output from the above:

+------------------+--------+----+
|TABLE_SCHEMA      |COUNT(*)|  RN|
+------------------+--------+----+
|INFORMATION_SCHEMA|      33|   1|
|MCVE              |       2|   2|
|PUBLIC            |       1|   3|
+------------------+--------+----+

+------------------+------------+--------+----+
|TABLE_SCHEMA      |TABLE_TYPE  |COUNT(*)|  RN|
+------------------+------------+--------+----+
|INFORMATION_SCHEMA|SYSTEM TABLE|      33|   1|
|MCVE              |TABLE       |       1|   2|
|MCVE              |VIEW        |       1|   3|
|PUBLIC            |TABLE       |       1|   4|
+------------------+------------+--------+----+

All of this was available before text blocks, but with text blocks, it makes even more sense to use these features. There is not just one way to use jOOQ, but this approach is really very underestimated!

The parser

Another very important jOOQ feature that works with string based SQL is the jOOQ parser. There are a variety of use cases for using jOOQ’s parser, as we’re adding more and more features to it. One use-case is simple formatting of SQL. Why would you want to do this? One example is to standardise your SQL strings for better execution plan cache usage. As you may know, some database vendors (e.g. Oracle) treat these two SQL strings as different:

SELECT 1 FROM dual
SELECT  1 FROM dual

Observe the irrelevant whitespace difference. Oracle will parse a new SQL string, produce a new SQL_ID, and yet come up with the same execution plan. You don’t want this to happen too often, because it puts a lot of contention on the plan cache and its locks and latches, and creates extra work.

This can happen, nonetheless, with string based dynamic SQL. Using jOOQ’s parser, you can normalise all sorts of SQL strings (including removing comments):

System.out.println(
    ctx.parser()
       .parseResultQuery("""
            SELECT table_schema, count(*)
            FROM information_schema.tables
            GROUP BY table_schema
            -- Order by column index!
            ORDER BY 1
            """)
       .fetch()
);

The SQL string that gets sent to the JDBC driver is this:

select table_schema, count(*) from information_schema.tables group by table_schema order by 1

Alternatively, you could specify some formatting in jOOQ:

DSLContext ctx = DSL.using(connection, 
    new Settings().withRenderFormatted(true));

And now, the SQL string that gets sent to the JDBC driver is this:

select 
  table_schema, 
  count(*)
from information_schema.tables
group by table_schema
order by 1

You can tweak the various supported formatting rules, and use this for logging your hand written SQL strings in a human readable way, for example, rather than executing the SQL through jOOQ. The options are endless. You could also translate parser output to some other SQL dialect. Assume you run your queries against SQL Server, but want to translate them to H2. No problem! The following SQL Server specific query can be run easily on H2 as well:

System.out.println(
    ctx.parser()
       .parseResultQuery("""
            SELECT TOP 1 table_schema, count(*)
            FROM information_schema.tables
            GROUP BY table_schema
            ORDER BY count(*) DESC
            """)
       .fetch()
);

The translated SQL that gets executed on H2 is this:

select 
  table_schema, 
  count(*)
from information_schema.tables
group by table_schema
order by count(*) desc
limit 1

Or, because I like showing off:

System.out.println(
    ctx.parser()
       .parseResultQuery("""
            SELECT TOP 1 WITH TIES table_schema, count(*)
            FROM information_schema.tables
            GROUP BY table_schema
            ORDER BY count(*) DESC
            """)
       .fetch()
);

Producing this in H2:

select 
  TABLE_SCHEMA, 
  count(*)
from INFORMATION_SCHEMA.TABLES
group by TABLE_SCHEMA
order by 2 desc
fetch next 1 rows with ties

Or this, in PostgreSQL:

select 
  "v0" as table_schema, 
  "v1" as "count"
from (
  select 
    table_schema as "v0", 
    count(*) as "v1", 
    rank() over (order by 2 desc) as "rn"
  from information_schema.tables
  group by table_schema
) "x"
where "rn" > 0
and "rn" <= (0 + 1)
order by "rn"

I know, right?

Parser-derived features

The parser is used in more and more of jOOQ’s many features. Including for example in jOOQ 3.13’s schema diff and management utility.

Run this with jOOQ:

System.out.println(
    ctx.meta("""
    create table t (
      i int
    )
    """).apply("""
    alter table t
      add j int;
    alter table t
      add constraint t_pk primary key (i)
    """)
);

I.e. just copy-paste your plain SQL schema definitions and increments (as used in Flyway, for example, or hand-written while developing them against your database), and start using the entire tool chain on them. For example, the above prints:

create table T(
  I int null,
  J int null,
  constraint T_PK
    primary key (I)
);

Instead of applying increments to a schema, you can also calculate the diff between two schema versions:

System.out.println(
    ctx.meta("""
    create table t (
      i int
    )
    """).migrateTo(ctx.meta("""
    create table t (
      i int,
      j int,
      constraint t_pk primary key (i)
    )
    """))
);

And now, the output is the increment we’ve applied earlier:

alter table T
  add J int null;
alter table T
  add constraint T_PK
    primary key (I);

Conclusion

jOOQ’s DSL is very powerful because:

  • It offers type safety
  • It is compile time checked
  • It helps with auto completion

But sometimes, it gets in the way and plain SQL strings work better, especially when using text blocks. In that case, jOOQ is still also very powerful. The entirety of the jOOQ tool chain is still available to you behind the scenes. You’re just not using the DSL as an API anymore.

Never Concatenate Strings With jOOQ

jOOQ supports a vast amount of SQL syntax out of the box. As such, most users will not think of resorting to string concatenation like in the old days when writing dynamic SQL with JDBC.

But every now and then, a vendor specific feature is not supported by jOOQ (yes, it happens). In that case, jOOQ supports a variety of “plain SQL” API, which can be used to construct almost all types of jOOQ API elements, such as:

// Static import is implied, as always
import static org.jooq.impl.DSL.*;

// Column expressions
Field<String> f = field("cool_function(1, 2, 3)", String.class);

// Predicates
Condition c = condition("col1 <fancy operator> col2");

// Tables
Table<?> t = table("wicked_table_valued_function(x, y)");

But then, sometimes, you need to pass an argument to such a function dynamically, such as another column expression. And you want to do that in a type safe way, because the jOOQ code generator already produced type safe column expressions. So you might be inclined to concatenate, nonetheless:

field("cool_function(1, " + MY_TABLE.MY_COLUMN + ", 3)");

Never do this!

For these reasons:

  1. Despite jOOQ being very SQL injection safe in general, this is where you can in fact introduce a plain SQL injection vulnerability nonetheless. Not in this case, as the column is generated code, but maybe you will concatenate user input. Note that in order to increase SQL injection protection, plain SQL usage can be prevented globally, and allowed only locally when needed by adding our PlainSQL checker, using the Checker Framework or Google ErrorProne.
  2. As always with string concatenation, you’re prone to SQL syntax errors. In this case, the generated SQL is not specific to any dialect, as MY_TABLE.MY_COLUMN.toString() is being called, without any contextual information, such as the SQLDialect and all the other configuration flags.

Instead, use jOOQ’s plain SQL templating mini language, which allows for template placeholders like {0}, {1}, {2}:

field("cool_function(1, {0}, 3)", MY_TABLE.MY_COLUMN);

And if you’re doing this more often, you can factor out this call in your own mini DSL:

public static Field<String> coolFunction(Field<?> field) {
    return field("cool_function(1, {0}, 3)", String.class, field);
}

And now, call it like this:

coolFunction(MY_TABLE.MY_COLUMN)

As a rule of thumb:

With jOOQ, you should never need to resort to SQL string concatenation

You can always use either:

  • The type safe jOOQ DSL API
  • The plain SQL templating API (and ideally hide such usage behind your own type safe DSL API)

Use the jOOQ-Refaster Module for Automatic Migration off of Deprecated jOOQ API

Deprecation notice

After encountering numerous problems building the jOOQ-refaster module in various JDK versions and after receiving no feedback from the community about this feature, we have decided to remove it again in jOOQ 3.15: https://github.com/jOOQ/jOOQ/issues/10803

Starting with jOOQ 3.13, we’re offering a new module called jOOQ Refaster, which provides refaster templates for automatic API migration.

What’s Refaster?

Refaster is a lesser known sub project of the popular Google Error Prone library, a library for static code analysis, implemented as a Java compiler plugin. We already support a few such static code analysis tools through the jOOQ Checker module, which can be used with both the Checker Framework and with Google’s Error Prone.

Refaster allows for finding “bad” API usage patterns in your code and producing .patch files automatically, which fix such patterns, resulting in better code.

What are we offering in jOOQ?

Over the years, we have deprecated quite a bit of API as new API came along. For example, there is the ResultQuery.fetchLazy(int) method, where the int parameter corresponds to the fetch size that is passed along to JDBC.

Long ago, we introduced better API for passing the fetchSize. You can:

  • Specify a Settings value with your configuration for a global fetchSize to apply
  • Specify ResultQuery.fetchSize(int), which can then be combined with fetchLazy(), or with any type of fetch() call, as this isn’t really specific to “lazy” fetching
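
A small sketch of the first option (the withFetchSize() setter name is assumed here from the usual Settings naming convention; check your version’s Settings for the exact property):

// Apply a default JDBC fetch size to all queries run with this configuration
DSLContext ctx = DSL.using(connection, SQLDialect.POSTGRES,
    new Settings().withFetchSize(42));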

The Javadoc says:

Deprecated. [#2811] – 3.3.0 – Use fetchSize(int) and fetchLazy() instead.

…and your IDE will probably warn you about deprecated API usage, but who has time to fix all these warnings and read all these Javadoc?

Instead, you can now download the refaster file for your jOOQ version from our website https://www.jooq.org/refaster, configure your Java compiler to include ErrorProne’s refaster library (as documented in the jOOQ manual), and get a patch right away.

For example, if your code reads:

public void runQuery() {
    try (Cursor<?> cursor = ctx.select(T.A)
       .from(T)
       .fetchLazy(42)) {
        // Do something
    }
}

You might be getting a patch file like this one:

--- ..\..\src\main\java\org\jooq\refaster\test\RefasterTests.java
+++ ..\..\src\main\java\org\jooq\refaster\test\RefasterTests.java
@@ -62,6 +62,5 @@
 public void runQuery() {
     try (Cursor<?> cursor = ctx.select(T.A)
-       .from(T)
-       .fetchLazy(42)) {
+       .from(T).fetchSize(42).fetchLazy()) {
         // Do something
     }

And you’ll get the improved version of your code automatically:

public void runQuery() {
    try (Cursor<?> cursor = ctx.select(T.A)
       .from(T).fetchSize(42).fetchLazy()) {
        // Do something
    }
}

In our opinion, all libraries should offer such refaster files (or another technology doing something similar) to help users upgrade more easily between versions.

What’s next?

In addition to upgrading deprecated API usage (which is a very obvious use-case for Refaster), we’re also experimenting with some other static code analysis and improvement approaches that deal with our understanding of the SQL language, and best practices. For example, as we’ve blogged before, it is not recommended to use COUNT(*) queries in SQL (or in any language), when a simple EXISTS() check would suffice.

jOOQ 3.13 already offers an introspection that replaces calls like these:

ctx.fetchCount(table) != 0
ctx.fetchCount(table) >= 1 // And many more

By equivalent, but likely much more performant expressions:

ctx.fetchExists(table)

So, stay tuned for more such goodies in the future!

jOOQ 3.13 Released with More API and Tooling for DDL Management

jOOQ 3.13 has been released with CockroachDB support, much more API and tooling for DDL management, and SQL:2011 temporal table support.

Starting with this release, we will further embrace our support for parsing,
translating, executing, and now also interpreting DDL statements. The driving
force is better code generation support, but in the future, also better
integration with database change management systems like Flyway or Liquibase.

We’ve added support for a very promising new dialect: CockroachDB.

We’re starting to support one of SQL:2011’s most interesting features: temporal
tables for system and application time versioning, and we’ve generally improved
the way we build and distribute jOOQ to our paying customers.

SQL Dialects

We’ve worked together with a team of great engineers at CockroachDB to speed up
our newly added, formal CockroachDB support in the jOOQ Professional and
Enterprise Editions. While many users have successfully used the POSTGRES
dialect with CockroachDB for simple integrations, this new dialect is now fully
integration tested and supported for future jOOQ and CockroachDB versions. The
integration fixed a variety of issues and minor differences that we have
encountered in the code generator and the runtime library.

We’ve added support for a variety of features now available in Firebird 3.0+,
MariaDB 10.3+, MySQL 8+, and PostgreSQL 10+.

DDL interpretation

Our biggest investment has been made in the area of DDL interpretation. Since
jOOQ 3.9, we can parse more and more SQL syntax. A logical next step is to allow
for interpreting parsed (or ordinary, DSL constructed) DDL statements in order
to maintain an incremental, in-memory meta model of a database. This has a
variety of benefits, which we will take advantage of in this release and in the
future.

The main out of the box benefits in this version include:

  • Being able to create a diff between two versions of a schema, programmatically through Meta.migrateTo(Meta), on the command line through DiffCLI (https://www.jooq.org/doc/latest/manual/sql-building/schema-diff-cli), or online, through our website: https://www.jooq.org/diff.
  • Such a diff or the meta model itself can be used for ad-hoc introspections, or formally, for automated migrations by tools built on top of jOOQ. Future versions will offer such migrations out of the box.
  • Unfortunately, we have not yet replaced our H2 backed DDLDatabase by our new DDL interpretation capabilities. This is a high priority for jOOQ 3.14, though.
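
A small sketch of the first point above (the DDL is illustrative only):

// Build two in-memory meta models from DDL and compute the diff between them
Meta v1 = ctx.meta("create table t (i int)");
Meta v2 = ctx.meta("create table t (i int, j int)");

// Prints the statements needed to migrate v1 to v2, e.g. an ALTER TABLE .. ADD
System.out.println(v1.migrateTo(v2));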

In the context of these investments, we have improved a variety of
infrastructure components that were needed for these features and that users can
benefit from as well. These improvements include:

  • A new LiquibaseDatabase, which uses Liquibase behind the scenes to simulate a schema migration in memory (similar to the DDLDatabase for Flyway), to reverse engineer that for the code generator. If you’re using Liquibase, chances are that you no longer need to connect to an actual database to use jOOQ’s code generator. More info here: https://blog.jooq.org/2019/10/16/how-to-simulate-a-liquibase-migration-using-h2/
  • We now support the inclusion of SQL text in generated code and other parts of the runtime meta model, including for CHECK constraints and views
  • A lot of improvements to the JDBC DatabaseMetaData backed org.jooq.Meta API have been implemented.
  • A lot of additional DDL statement and syntax support has been added throughout the API, parser, and other tooling.
  • The runtime meta model has been enhanced with new information, such as sequence flags, check constraints, table type information, and more.

Temporal table support

Perhaps the SQL:2011 standard’s most exciting “new” feature is temporal tables,
an enhancement on top of Oracle’s historic flashback query feature, and much
more.

In many systems, single or bi-temporal versioning of data is essential. The two
dimensions of versioning that are now supported in jOOQ are:

  • System versioning, useful for backups and auditing, where every mutating operation results in a backup copy of the data in a managed backup table.
  • Application versioning, useful for data that has a temporally limited validity, such as a price for a product, which could have a temporary discount.

jOOQ 3.13 supports org.jooq.Period and related API in Select, Update, and Delete
to manipulate and query temporal tables in dialects that ship with native
support for these features. To different degrees, these dialects include:

  • DB2
  • MariaDB
  • Oracle
  • SQL Server

Emulations of incompletely supported syntax in the above dialects, as well as in
all other dialects, will be added in future jOOQ versions, making temporal
validity a problem of the past.

More details here:
https://www.jooq.org/doc/3.13/manual/sql-building/table-expressions/temporal-tables/

This is a commercially licensed only feature.

Build improvements

We’ve invested a lot of time in improving our build automation using newer
Jenkins features that allow us to publish builds more frequently, especially
snapshot builds of upcoming minor releases for early adopters, as well as patch
releases for customers needing urgent fixes:

https://www.jooq.org/download/versions

These new releases also include trial versions for Java 6, 8, 11, without the
need to register for extended trials.

In addition to the above, we’ve improved, and will continue to improve, our
documentation of which features are included in which edition.

Procedural API improvement

In jOOQ 3.12, we’ve introduced support for procedural languages, which we are
improving in every new release. jOOQ 3.13 adds support for additional
procedural statements, including:

  • EXECUTE for dynamic SQL execution embedded in procedural blocks
  • RETURN to return from functions
  • Plain SQL statement support

In the future, we’re also planning to support the generation of functions,
procedures, and triggers through the standard jOOQ API.

More details here:
https://www.jooq.org/doc/3.13/manual/sql-building/procedural-statements/

This is a commercially licensed only feature.

jOOQ-Refaster

Some of our users have had a lot of success using our Checker Framework or
ErrorProne integrations, which allow for some static API usage analysis in
client code (e.g. plain SQL / SQLi risk, dialect compatibility, etc.).

Starting from jOOQ 3.13, we’re also adding support for ErrorProne’s sub project
“refaster”, which allows for automatically rewriting bad API usage to something
better, such as for example:

  • Migrate deprecated API usage like field.trim(), to its replacement trim(field)
  • Migrate inefficient SQL usage like ctx.fetchCount(select) != 0 to ctx.fetchExists(select)

The complete list of changes can be found on our website:
https://www.jooq.org/notes

How to Simulate a Liquibase Migration using H2

This post is part of a new blog series about database migrations, which will cover a variety of database change management topics.

In the near future, we’ll look much more into these topics, hoping to add more value to our users’ existing Flyway, Liquibase, and other integrations where the migration tools can profit a lot from jOOQ’s most recent features, including the parser.

Embracing Liquibase in the next jOOQ versions

In jOOQ 3.13, we’re going to offer a better integration for those users who use Liquibase for database change management. Our existing DDLDatabase parses, translates, and simulates a DDL script based migration against an in-memory H2 database instance.

This is using our built-in SQL translation functionality (test it on our website here: https://www.jooq.org/translate) to translate this Oracle SQL:

create table t (v varchar2(100))

to this H2 SQL:

create table t (v varchar(100))

That’s just a trivial example. More sophisticated translations are possible too. The main purpose of doing this has been, historically, to offer an “offline” jOOQ source code generation step that does not require connecting to an actual Oracle database instance to reverse engineer your schema, which you can represent in the form of SQL migration scripts, which you’d typically run with something like Flyway.

In the future, we’ll add more features around this. One thing we’ve been thinking about has been to allow for translating Flyway migrations to other dialects, to make them vendor agnostic.

Simulating Liquibase migrations

Flyway and Liquibase work in quite a similar fashion, with Liquibase offering an additional abstraction layer over the SQL language. While pure SQL migrations are also possible with Liquibase’s SQL change, Liquibase also offers a set of mostly low level DDL command abstractions, such as the ADD COLUMN change.

Using their XML based DSL, you don’t have to remember whether the command is called:

alter table t add i int;
alter table t add column i int;

Or whatever creative syntax your SQL vendor came up with. Notice that, again, you can achieve the same thing with jOOQ’s translator, which you can use as an API or via its command line interface.
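
As a quick illustration, the translation API can be used programmatically along these lines (a sketch: the parser accepts a variety of vendor syntaxes, and the target DSLContext renders the parsed statement in its own dialect):

// Parse vendor specific DDL and re-render it for H2
Query query = DSL.using(SQLDialect.DEFAULT)
    .parser()
    .parseQuery("alter table t add column i int");

System.out.println(DSL.using(SQLDialect.H2).render(query));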

Now, assuming you have the following Liquibase database change log:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
    http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
    <changeSet author="authorName" id="changelog-1.0">
        <createTable tableName="TAB">
            <column name="COL" type="VARCHAR(10)">
                <constraints nullable="true"
                    primaryKey="false" unique="false" />
            </column>
        </createTable>
    </changeSet>
</databaseChangeLog>

Now, you would like to check whether it is correct, i.e. whether you can ship it to production as it is. Just create an in-memory H2 connection and run the file against it:

// Connection properties for a throwaway in-memory H2 database
Properties info = new Properties();
info.put("user", "sa");
info.put("password", "");

try (Connection con = new org.h2.Driver()
        .connect("jdbc:h2:mem:db", info)) {

    // Let Liquibase run the change log against that H2 connection
    Database database = DatabaseFactory.getInstance()
      .findCorrectDatabaseImplementation(new JdbcConnection(con));
    Liquibase liquibase = new Liquibase("/path/to/liquibase.xml",
      new FileSystemResourceAccessor(), database);
    liquibase.update("");
}

To check, just repeat the table definition a second time:

<createTable tableName="TAB">
    <column name="COL" type="VARCHAR(10)">
        <constraints nullable="true"
            primaryKey="false" unique="false" />
    </column>
</createTable>
<createTable tableName="TAB">
    <column name="COL" type="VARCHAR(10)">
        <constraints nullable="true"
            primaryKey="false" unique="false" />
    </column>
</createTable>

and see how H2 complains about the statement:

Table "TAB" already exists; SQL statement:
CREATE TABLE PUBLIC.TAB (COL CLOB) [42101-200] [Failed SQL: (42101) CREATE TABLE PUBLIC.TAB (COL CLOB)]
	at liquibase.changelog.ChangeSet.execute(ChangeSet.java:646)
	at liquibase.changelog.visitor.UpdateVisitor.visit(UpdateVisitor.java:53)
	at liquibase.changelog.ChangeLogIterator.run(ChangeLogIterator.java:83)
	at liquibase.Liquibase.update(Liquibase.java:202)
	at liquibase.Liquibase.update(Liquibase.java:179)
	at liquibase.Liquibase.update(Liquibase.java:175)
	at liquibase.Liquibase.update(Liquibase.java:168)
	at LB.main(LB.java:21)

Using jOOQ’s tool chain to create schema snapshots

Want to export the schema again as DDL? Very easy: just use jOOQ for the task. For example, you can write the following Java code right after the above Liquibase migration simulation:

DSLContext ctx = DSL.using(connection);
ctx.ddl(ctx.meta().getSchemas("PUBLIC").get(0))
    .forEach(System.out::println);

And you get, in H2 syntax (omitting the Liquibase tables):

create table "PUBLIC"."TAB"(
  "COL" varchar(10) null
)

Want to export this to Oracle instead, using a custom schema rather than H2’s PUBLIC? Write this:

DSLContext h2 = DSL.using(connection);

// Render in the Oracle dialect, mapping the PUBLIC schema to MY_SCHEMA
DSLContext oracle = DSL.using(ORACLE, new Settings()
   .withRenderMapping(new RenderMapping()
       .withSchemata(new MappedSchema()
           .withInput("PUBLIC")
           .withOutput("MY_SCHEMA"))));

oracle.ddl(h2.meta().getSchemas("PUBLIC").get(0))
    .forEach(System.out::println);

And now, we’re getting:

create table "MY_SCHEMA"."TAB"(
  "COL" varchar2(10) null
)

(Notice that this works only once issue #9384 is fixed.)

Prefer an XML representation of your schema snapshot? Easy with jOOQ, as well:

DSLContext ctx = DSL.using(connection);
JAXB.marshal(ctx.informationSchema(
   ctx.meta().getSchemas("PUBLIC").get(0)
), System.out);

The output being:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<information_schema xmlns="http://www.jooq.org/xsd/jooq-meta-3.12.0.xsd">
    <schemata>
        <schema>
            <schema_name>PUBLIC</schema_name>
            <comment></comment>
        </schema>
    </schemata>
    <tables>
        <table>
            <table_schema>PUBLIC</table_schema>
            <table_name>TAB</table_name>
            <comment></comment>
        </table>
    </tables>
    <columns>
        <column>
            <table_schema>PUBLIC</table_schema>
            <table_name>TAB</table_name>
            <column_name>COL</column_name>
            <data_type>varchar</data_type>
            <character_maximum_length>10</character_maximum_length>
            <ordinal_position>1</ordinal_position>
            <is_nullable>true</is_nullable>
            <column_default></column_default>
            <comment></comment>
        </column>
    </columns>
</information_schema>

This is a useful file that can be put under version control along with your current set of database change sets, in order to validate your schema, or to export it to other tools that can reverse engineer DDL or XML.
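
Conversely, such a snapshot can be read back into jOOQ’s Meta API, e.g. to produce DDL from it again. A minimal sketch (snapshot.xml is a hypothetical file name, and InformationSchema refers to jOOQ’s JAXB-annotated org.jooq.util.xml.jaxb.InformationSchema type):

// Unmarshal the XML snapshot and generate DDL from it, here in H2 syntax
InformationSchema snapshot = JAXB.unmarshal(
    new File("snapshot.xml"), InformationSchema.class);

DSLContext ctx = DSL.using(SQLDialect.H2);
ctx.ddl(ctx.meta(snapshot).getSchemas("PUBLIC").get(0))
    .forEach(System.out::println);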

Using Liquibase migrations as jOOQ code generation input

Just like with the pre-existing DDLDatabase, starting from jOOQ 3.13, you can have your Liquibase migration simulated in-memory against an H2 database and have the result reverse engineered by jOOQ’s code generator. This is documented in the jOOQ manual.

The relevant code generation configuration looks like this (standalone or with Maven):

<configuration>
  <generator>
    <database>
      <name>org.jooq.meta.extensions.liquibase.LiquibaseDatabase</name>
      <properties>
        <property>
          <key>scripts</key>
          <value>src/main/resources/database.xml</value>
        </property>
      </properties>
    </database>
  </generator>
</configuration>

That’s it! As a bonus, you won’t have to simulate your migration manually anymore, as jOOQ’s LiquibaseDatabase already does it for you behind the scenes, using the same three statements as before:

Database database = DatabaseFactory.getInstance()
  .findCorrectDatabaseImplementation(new JdbcConnection(con));
Liquibase liquibase = new Liquibase("/path/to/liquibase.xml", 
  new FileSystemResourceAccessor(), database);
liquibase.update("");

Stay tuned for more goodies on the topic of SQL migrations and database change management.

How to Map MySQL’s TINYINT(1) to Boolean in jOOQ

MySQL 8 does not yet support the BOOLEAN type as specified in the SQL standard. There is a DDL “type” called BOOL, which is just an alias for TINYINT:

create table t(b bool);

select 
  table_name, 
  column_name, 
  data_type, 
  column_type
from information_schema.columns
where table_name = 't';

The above produces:

TABLE_NAME|COLUMN_NAME|DATA_TYPE|COLUMN_TYPE|
----------|-----------|---------|-----------|
t         |b          |tinyint  |tinyint(1) |

Notice that BOOL translates to a specific “type” of TINYINT, a TINYINT(1), where we might be inclined to believe that the (1) corresponds to some sort of precision, as with NUMERIC types.

However, counter-intuitively, that is not the case. It corresponds to the display width of the type, which only affects how values are displayed in certain (now deprecated) modes, e.g. in combination with ZEROFILL. Consider:

insert into t(b) values (0), (1), (10);
select * from t;

We’re getting:

b |
--|
 0|
 1|
10|

Notice also that MySQL can process non-boolean types as booleans. Running the following statement:

select * from t where b;

We’re getting:

b |
--|
 1|
10|

Using this column as a Boolean column in jOOQ

By default, jOOQ doesn’t recognise such TINYINT(1) columns as boolean columns, because it is entirely possible that a user created such a column without having boolean semantics in mind, as the above example has shown.

In previous versions of jOOQ, the data type rewriting feature could already be used with regular expressions matching column names, e.g. the configuration below would treat all columns named "B" as BOOLEAN:

<forcedTypes>
  <forcedType>
    <name>BOOLEAN</name>
    <includeExpression>B</includeExpression>
  </forcedType>
</forcedTypes>

With jOOQ 3.12.0 (issue #7719), we can now also match this display width for MySQL types. That way, you can write this single data type rewriting configuration to treat all TINYINT(1) columns as booleans:

<forcedTypes>
  <forcedType>
    <name>BOOLEAN</name>
    <includeTypes>(?i:TINYINT\(1\))</includeTypes>
  </forcedType>
</forcedTypes>

Using this configuration in the code generator, the above query:

select * from t where b;

… can now be written as follows in jOOQ:

ctx.selectFrom(T).where(T.B).fetch();
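
With the forced type in place, the generated T.B column is a Field<Boolean>, so results can be consumed as booleans directly. A small sketch, using the generated T table from above:

// T.B is now generated as a boolean field
List<Boolean> flags = ctx.select(T.B).from(T).fetch(T.B);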