Low stability: it is very likely that a future version of the JavaCompiler will need yet another very specific argument, e.g. another Iterable of something. This makes for incompatible API enhancements.
Low abstractness: even though the above is an interface method, there is very little chance of it ever being implemented more than once, as it is quite hard to fulfil the above contract in a useful way.
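For reference, the method under discussion is javax.tools.JavaCompiler.getTask(). Here is a minimal usage sketch, assuming a source file A.java and an invented class name CompileSketch, just to show how specific each of the six parameters is:

import java.util.Arrays;
import javax.tools.*;

public class CompileSketch {
    public static void main(String[] args) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        StandardJavaFileManager fm =
            compiler.getStandardFileManager(null, null, null);

        // Each argument is highly specific; several of them are
        // typically just null in everyday usage.
        JavaCompiler.CompilationTask task = compiler.getTask(
            null,                              // Writer out (null: System.err)
            fm,                                // JavaFileManager
            null,                              // DiagnosticListener
            Arrays.asList("-verbose"),         // Iterable<String> options
            null,                              // Iterable<String> classes
            fm.getJavaFileObjects("A.java"));  // Iterable<? extends JavaFileObject>

        task.call();
    }
}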
Maybe you think that this isn’t too important for a compiler API. But the biggest value of “high cohesion”, i.e. of an ideal stability / abstractness balance, is the fact that you get highly reusable code. This isn’t just good because your developers spend less time implementing a specific task; it also means that your code is extremely error-resistant. For example, check out the data type conversion logic from the internals of jOOQ:
jOOQ’s data type conversion hierarchy
The above is just an extract of the call hierarchy leading to a single data type conversion API that is used indirectly throughout the whole framework. Everything passes through there, so any data type conversion bug is either
Extremely local to a single method / single leaf of the above tree representation
Extremely global to the whole tree
In other words, any bug related to data type conversion is either merely cosmetic or completely catastrophic, which basically means that there is almost no room for an unnoticed regression in that area: any data type conversion regression will immediately break hundreds of unit and integration tests. This is a major benefit of having high cohesion in your code.
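To make this more concrete, here is a minimal, hypothetical sketch of what such a single conversion entry point might look like. The class Conversions and its convert() method are invented for illustration; this is not jOOQ’s actual code:

import java.math.BigDecimal;

// Hypothetical sketch: one single conversion entry point.
// Every conversion in the code base delegates to convert(), so a bug in
// here either breaks everything at once, or stays confined to one branch.
public class Conversions {

    @SuppressWarnings("unchecked")
    public static <T> T convert(Object from, Class<T> to) {
        if (from == null)
            return null;

        if (to.isInstance(from))
            return to.cast(from);

        if (to == String.class)
            return (T) from.toString();

        if (to == Integer.class && from instanceof Number)
            return (T) Integer.valueOf(((Number) from).intValue());

        if (to == BigDecimal.class && from instanceof Number)
            return (T) new BigDecimal(from.toString());

        throw new IllegalArgumentException(
            "Cannot convert " + from.getClass() + " to " + to);
    }
}

Every call site, record mapping included, delegates to this one method instead of re-implementing ad-hoc conversions, which is exactly why a conversion bug is either local or global.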
Here is a concrete example. Assume a database with the following nested Oracle OBJECT types:

CREATE TYPE street_type AS OBJECT (
  street VARCHAR2(100),
  no VARCHAR2(30)
);

CREATE TYPE address_type AS OBJECT (
  street street_type,
  zip VARCHAR2(50),
  city VARCHAR2(50)
);
And now, we would like to recursively map this data onto custom nested POJOs:
public class Street {
    public String street;
    public String number;
}

public class Address {
    public Street street;
    public String city;
    public String country;
}

public class Person {
    public String firstName;
    public String lastName;
    public Address address;
}
And the mapping should be available through:
// The configuration object contains the
// mapping algorithm implementation
Person person = DSL.using(configuration)
                   .selectFrom(PERSON)
                   .where(PERSON.ID.eq(1))

                   // We want to make the mapping algorithm recursive,
                   // to automatically map Address and Street as well
                   .fetchOneInto(Person.class);
Mapping of a Record onto a POJO is already implemented; only recursion is missing. And when we do implement recursion, we want to respect the existing, aforementioned customisable mapping SPI that was introduced in jOOQ 3.1. It’s very simple: we have a single implementation point at the top, in the ConvertAll type.
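As a rough illustration of why such a single implementation point pays off, here is a purely hypothetical sketch of recursive POJO mapping. The class RecursiveMapper and its Map-based input are invented for this post, and it reuses the hypothetical Conversions helper from above; it is not jOOQ’s actual ConvertAll code:

import java.lang.reflect.Field;
import java.util.Map;

// Hypothetical sketch: one recursive mapping point. If a target field is
// itself a POJO, the same method is applied to its nested values, so
// nesting works for every use-case that goes through this one method.
public class RecursiveMapper {

    @SuppressWarnings("unchecked")
    public static <E> E map(Map<String, Object> values, Class<E> type)
    throws Exception {
        E result = type.getDeclaredConstructor().newInstance();

        for (Field field : type.getFields()) {
            Object value = values.get(field.getName());

            if (value == null)
                continue;

            // Nested "record": recurse with the very same algorithm
            if (value instanceof Map)
                field.set(result, map((Map<String, Object>) value, field.getType()));

            // Scalar value: delegate to the single conversion entry point
            else
                field.set(result, Conversions.convert(value, field.getType()));
        }

        return result;
    }
}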
Implementing this in a highly cohesive code base means that:
We have to implement this new feature only once
Implementing this new feature costs less effort than writing this blog post
Nesting of record mapping and conversion will work for all use-cases in one go
We have only slightly increased complexity (low risk of bugs) while adding an awesome new feature
Do you refactor mercilessly?
The perfect design cannot be foreseen. It grows, slowly. Today, we know so many things about Java and collections, and yet it took a while for the new Streams API to surface. No one would have implemented such a great new API in JDK 1.2 from scratch, even though the collections API was already pretty good at the time.
This mainly means two things for you:
For your essential core code, it is important to get it to a state where you attain high cohesion. If you’re an E-Banking vendor, your payment and brokerage logic should be exactly as above, with a balanced stability / abstractness ratio
For your non-essential code (e.g. UI / DB-access), you should rely on third-party software, because someone else will spend a lot more time at getting their code on a high level of quality (UI: such as Vaadin, ZK or DB-access: such as Hibernate, jOOQ, Spring Data, just to name a few)
… and if you request a new feature from a highly cohesive framework, it might just be that the only thing that needs to be done is these four lines of code.
4 thoughts on “How to Eliminate Bugs Through High Cohesion”
a (very) small nitpick on the github code above: on the test class, why junit.framework.Assert.assertEquals is used instead of org.junit.Assert.assertEquals?
other than that, really nice post!
Thanks for your feedback and your nice words.
Hmm, where did you see that JUnit piece of code? I actually just recently found out that junit.framework.Assert seems to be deprecated. My IDE always offers me both and I tend to pick the first…
on jOOQ-test/src/org/jooq/test/PostgresTest.java, line 45.
junit.framework.Assert is deprecated but still shipped with the latest JUnit 4, alongside org.junit.Assert.
Most IDEs offer classes sorted by jar and then alphabetically (unless configured otherwise), so it’s quite common.
cheers,
perry
OK, thanks for the hint about the location. We’ll replace these references and see if Eclipse can prevent them in the future. I have created issue 3119 for this.