How can I write a generic solution for inserting and selecting any type of Java object to and from a JSON or JSONB PostgreSQL column?
Additionally, I have to use (inject) my own Jackson ObjectMapper, because it is a pre-configured Spring bean.
I tried to solve this with a custom ConverterProvider. It works fine on the select side, but my converter provider is not called when I do the insert. Did I miss something?
I checked the jOOQ documentation and some Stack Overflow articles about the Binding solution, but I don't want to specify the exact userType, so I cannot register a Binding.
My goal:
I would like to eliminate the need for manual string serialization of Java objects in the repository classes for every JSON and JSONB field (and the need to inject my ObjectMapper into every repository). I don't want to write this line for each JSONB field; jOOQ should do it under the hood with my JSONConverterProvider:
.set(myJSONBTableField, JSONB.jsonb(objectMapper.writeValueAsString(myJavaObjectToStore)))
Instead, I would like to simply write:
.set(myJSONBTableField, myJavaObjectToStore)
In other words, I would like jOOQ to store any type of Java object with the help of my custom ObjectMapper bean whenever the DB column type is JSON or JSONB. My JSONConverterProvider is called in the read phase, but not in the write phase, even though it provides a converter for the reverse direction too; jOOQ simply doesn't use it.
The relevant part of my jOOQ configuration:
@Bean
public DefaultConfigurationCustomizer ziffitJooqCustomizer(JSONConverterProvider converterProvider) {
    return configuration -> configuration
        .set(converterProvider)
        .settings()...
My custom converter provider:
@ConditionalOnClass(DSLContext.class)
@Component
public class JSONConverterProvider implements ConverterProvider {

    private final ConverterProvider delegate = new DefaultConverterProvider();
    private final ObjectMapper mapper;

    public JSONConverterProvider(ObjectMapper mapper) {
        this.mapper = mapper;
    }

    @Override
    public <T, U> Converter<T, U> provide(Class<T> dbClass, Class<U> entityClass) {
        if (dbClass == JSON.class) {
            return getJSONConverter(dbClass, entityClass);
        } else if (dbClass == JSONB.class) {
            return getJSONBConverter(dbClass, entityClass);
        } else {
            // Delegate all other type pairs to jOOQ's default
            return delegate.provide(dbClass, entityClass);
        }
    }

    @SuppressWarnings("unchecked")
    private <T, U> Converter<T, U> getJSONConverter(Class<T> dbClass, Class<U> entityClass) {
        return Converter.ofNullable(dbClass, entityClass,
            // read direction: JSON column value -> entity
            t -> {
                try {
                    return mapper.readValue(((JSON) t).data(), entityClass);
                } catch (Exception e) {
                    throw new DataTypeException("JSON deserialization error", e);
                }
            },
            // write direction: entity -> JSON column value
            u -> {
                try {
                    return (T) JSON.valueOf(mapper.writeValueAsString(u));
                } catch (Exception e) {
                    throw new DataTypeException("JSON serialization error", e);
                }
            }
        );
    }

    @SuppressWarnings("unchecked")
    private <T, U> Converter<T, U> getJSONBConverter(Class<T> dbClass, Class<U> entityClass) {
        return Converter.ofNullable(dbClass, entityClass,
            // read direction: JSONB column value -> entity
            t -> {
                try {
                    return mapper.readValue(((JSONB) t).data(), entityClass);
                } catch (Exception e) {
                    throw new DataTypeException("JSONB deserialization error", e);
                }
            },
            // write direction: entity -> JSONB column value
            u -> {
                try {
                    return (T) JSONB.valueOf(mapper.writeValueAsString(u));
                } catch (Exception e) {
                    throw new DataTypeException("JSONB serialization error", e);
                }
            }
        );
    }
}
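To rule out the converter pair itself, here is a minimal standalone sketch that builds the same JSONB converter inline (with a made-up Person class) and round-trips it directly, outside of any query; this direction works for me:

```java
import org.jooq.Converter;
import org.jooq.JSONB;
import org.jooq.exception.DataTypeException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ConverterSmoke {

    static final ObjectMapper MAPPER = new ObjectMapper();

    // Hypothetical entity, just for the round-trip check
    public static class Person {
        public String name;
        public int age;
    }

    // The same converter pair my provider builds for JSONB, inlined
    public static final Converter<JSONB, Person> CONVERTER = Converter.ofNullable(
        JSONB.class, Person.class,
        // read direction: JSONB -> Person
        t -> {
            try {
                return MAPPER.readValue(t.data(), Person.class);
            } catch (Exception e) {
                throw new DataTypeException("JSONB deserialization error", e);
            }
        },
        // write direction: Person -> JSONB
        u -> {
            try {
                return JSONB.valueOf(MAPPER.writeValueAsString(u));
            } catch (Exception e) {
                throw new DataTypeException("JSONB serialization error", e);
            }
        });

    public static void main(String[] args) {
        Person p = CONVERTER.from(JSONB.jsonb("{\"name\":\"Ada\",\"age\":36}"));
        System.out.println(p.name + " " + p.age);
        System.out.println(CONVERTER.to(p).data());
    }
}
```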
I run the read query like this:
dslCtx
.select(myJSONBTableField)
.from(myTable)
.where(...)
.fetchOptionalInto(myClazz);
This works fine and calls my JSONConverterProvider.
But my write query doesn't. I would like to write the insert like this:
dslCtx
.insertInto(myTable)
.set(myJSONBTableField, myJavaObjectToStore)
.execute();
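The only per-field workaround I can see is attaching the converter to the field's data type explicitly, which is exactly the per-field boilerplate I want to avoid. A sketch (MyPojo, myTable, myJSONBTableField and myJavaObjectToStore are placeholders from my schema):

```java
// Per-field workaround sketch: wrap the JSONB field in a converted data type
// so that .set(...) accepts the POJO directly.
Field<MyPojo> pojoField = DSL.field(
    myJSONBTableField.getQualifiedName(),
    SQLDataType.JSONB.asConvertedDataType(
        converterProvider.provide(JSONB.class, MyPojo.class)));

dslCtx
    .insertInto(myTable)
    .set(pojoField, myJavaObjectToStore)
    .execute();
```

Is there a way to get jOOQ to apply the ConverterProvider here automatically instead?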