I’m not well known for my love of annotations. While I do recognise that they can serve a very limited purpose in some areas (e.g. hinting stuff to the compiler or extending the language where we don’t want new keywords), I certainly don’t think they were ever meant to be used for API design.
Unfortunately (but this is a matter of taste), Java 8 introduced type annotations: an entirely new extension to the annotation type system, which allows you to do things like:
@Positive int positive = 1;
Thus far, I’ve seen such type restriction features only in Ada and PL/SQL, in a much more rigid form, but other languages may offer something similar.
The nice thing about Java 8’s implementation is the fact that the meaning of the type of the above local variable (@Positive int) is unknown to the Java compiler (and to the runtime) until you write and activate a specific compiler plugin that enforces your custom meaning. The easiest way to do that is by using the Checker Framework (and yes, we’re guilty at jOOQ: we have our own checker for SQL dialect validation). You can implement any semantics, for instance:
// This compiles because @Positive int is a subtype of int
int number = positive;
// Doesn't compile, because number might be negative
@Positive int positive2 = number;
// Doesn't compile, because -1 is negative
@Positive int positive3 = -1;
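For what it’s worth, declaring such a type-use annotation requires nothing from the Checker Framework itself. The @Positive declaration below is a local sketch for illustration (not the Checker Framework’s own annotation); without a checker plugin, javac attaches no meaning to it at all:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Target;

// ElementType.TYPE_USE is what makes "@Positive int" legal syntax.
@Target({ ElementType.TYPE_USE, ElementType.TYPE_PARAMETER })
@interface Positive {}

class PositiveDemo {
    static int one() {
        // Compiles (and runs) without any checker plugin; the annotation
        // only gains semantics once a plugin such as the Checker Framework
        // is activated on the compiler.
        @Positive int positive = 1;
        return positive;
    }

    public static void main(String[] args) {
        System.out.println(one()); // 1
    }
}
```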
As you can see, using type annotations is a very strategic decision. Either you want to create hundreds of types in this parallel universe as in this example:
The only thing crazier than annotations are type annotations. On arrays. Who thinks this is valid Java code? pic.twitter.com/M9fSRRerAD
Or, in my opinion, you’d better leave this set of features alone, because probably: YAGNI.
Unfortunately, and to the disappointment of Mike Ernst, the author of the Checker Framework (with whom I talked about this some years ago), most people abuse this new JSR 308 feature for boring and simple null checking. For instance, just recently, there was a feature request on the popular vavr library to add support for such annotations, which help users and IDEs guarantee that vavr API methods return non-null results.
Please no. Don’t use this atomic bomb for boring null checks
Let me make this very clear:
Type annotations are the wrong tool to enforce nullability
– Lukas Eder, timeless
You may quote me on that. The only exception to the above: if you strategically embrace JSR 308 type annotations in every possible way and start adding annotations for all sorts of type restrictions (including the @Positive example that I’ve given), then yes, adding nullability annotations won’t hurt you much anymore, as your types will take 50 lines of code to declare and reference anyway. But frankly, this is an extremely niche approach to type systems that only a few general-purpose programs, let alone publicly available APIs, can profit from. If in doubt, don’t use type annotations.
One important reason why a library like vavr shouldn’t add such annotations is that in a library like vavr, you can be very sure that you will hardly ever encounter null, because a reference in vavr is mostly one of three things:
A collection, which is never null, just possibly empty
An Option which replaces null (it is in fact a collection of cardinality 0..1)
A non-null reference (because in the presence of Option, all references can be expected to be non-null)
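These points can be sketched with the JDK’s Optional, the closest JDK analogue to vavr’s Option (findUser is a hypothetical method, not vavr API):

```java
import java.util.Optional;

class NoNullsByDesign {
    // Absence is modelled in the type, not with null: callers of this
    // hypothetical method never receive a null reference.
    static Optional<String> findUser(int id) {
        return id == 42 ? Optional.of("Alice") : Optional.empty();
    }

    public static void main(String[] args) {
        // The 0..1 cardinality is explicit in the type, so no @NonNull
        // annotation is needed to document that the reference itself is
        // never null.
        System.out.println(findUser(42).orElse("anonymous")); // Alice
        System.out.println(findUser(7).orElse("anonymous"));  // anonymous
    }
}
```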
Of course, these rules aren’t valid for every API. There are some low quality APIs out there that return “unexpected” null values, or leak “internal” null values (and historically, some of the JDK APIs, unfortunately, are part of these “low quality APIs”). But vavr is not one of them, and the APIs you are designing also shouldn’t be one of them.
So, let go of your null fear. null is not a problem in well-designed software. You can spare yourself the work of adding a @NonNull annotation on 99% of all of your types just to shut up your IDE, in case you turned on those warnings. Focus on writing high-quality software rather than bikeshedding null.
Because: YAGNI.
And, if you haven’t had enough bikeshedding already, consider watching this entertaining talk by Stuart Marks:
17 thoughts on “The Java Ecosystem’s Obsession with NonNull Annotations”
Annotations «saved» Java (or at least simplified some code-generation processes); then annotations became something completely ambiguous and quite wrong (who cried Lombok?). Continuing to extend their features feeds this bad habit, and type annotations are a pretty horrible, unneeded thing (so far, I haven’t been able to find a real use-case scenario).
I see (as far as I understood) that extending annotation features is easier than extending Java language features (new keywords, etc.) because of retrocompatibility, but this annotation thing is able to generate legacy, unpredictable applications (with a lot of side effects) in no time. Most of all, I believe they actually force you to develop using reflection, and that’s great, isn’t it?
It can be useful to add types to the type system “after the fact.” For instance, Mike Ernst told me about a technique where only non-blocking code could be run on a GUI rendering thread, and that non-blockingness was enforced using type annotations. If code was (potentially) blocking, it just didn’t compile in a rendering thread context.
A similar technique is used in jOOQ 3.9 (https://blog.jooq.org/2016/05/09/jsr-308-and-the-checker-framework-add-even-more-typesafety-to-jooq-3-9): you can declare a package to require Oracle + MySQL support. From then on, only jOOQ code that supports both of these databases will compile; Oracle-only API is no longer usable. Now, of course, this is heavily abusing the type system, but in any case, you can opt in to the validation and don’t need to run it all the time, so it can be run during CI, for instance.
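A sketch of what such a package declaration might look like, based on the jOOQ 3.9 checker (@Allow and @Require are jOOQ annotations from that release; the package name is made up, and exact signatures may differ):

```java
// package-info.java: opt-in validation for everything in this package.
// With the checker active, only jOOQ API supporting both dialects compiles.
@Allow({ SQLDialect.ORACLE, SQLDialect.MYSQL })
@Require({ SQLDialect.ORACLE, SQLDialect.MYSQL })
package org.example.sql;

import org.jooq.Allow;
import org.jooq.Require;
import org.jooq.SQLDialect;
```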
Don’t get me wrong. I don’t think this is pretty or a good practice. But maybe, it does add some helpful quality control tools for some people.
Ceylon got annotations quite right. In Ceylon, all the method modifiers are annotations as well. Things like “public” or “static” or “synchronized”… Those shouldn’t be keywords.
But in hindsight (20 years from now), we’ll just ignore annotations as they will be recognised as a big mistake.
Pretty interesting, what you said about “public” and “static”. Do you think an object should be declared as-is, without using keywords to specify whether a method must be public, static, or private, and then, if you want to declare that, you need annotations?
Quality checking, I think, is a nice scenario, but what I meant was related to API design. I believe there should be a distinction between quality / “convention over configuration” frameworks (the Checker Framework, for instance, as you said, or yours) and development frameworks (Spring, for instance). The first kind should check the provided annotations in order to let you develop in the most convenient and safe way (let’s say), the second kind in order to get the job done. Well, this is what I have always thought; for instance, I always feel weird declaring a class with a RestController annotation on it.
But I got your point about type-annotations anyway and let’s see what it’s going to happen in 20 years (@Annotated that :D ).
I’m not 100% sure about the annotations vs. keywords approach. I haven’t thought this through to the end. But I find the thought quite elegant. In the end, some Java keywords behave like instructions, others are types, others are classifiers, others are simply annotations (i.e. modifiers). And then, Java has actual annotations. Ceylon doesn’t make the difference.
I think we agree. The @RestController annotation does feel weird, because in fact, it is orthogonal to a concept that we already have and appreciate: a marker interface. Why doesn’t the class implement RestController (i.e. it is-a RestController)? Why is it annotated with the tag (i.e. a weird declarative form of is-a RestController)?…
I guess we’re all intrigued with this experiment, like we were with checked exceptions. Until we figure out that the concept is good but the implementation is crap. (e.g. checked exceptions are also much better implemented with Ceylon or TypeScript union types. Blog post in the pipeline…)
IMHO, NPEs are nasty enough to deserve some countermeasures. The only thing I dislike about @NonNull (apart from the existence of the spelling @NotNull) is its verbosity. I’d love it if there were something like Kotlin’s postfix “?” and “!!”.
Actually, I don’t get what your problem with @NotNull is. A concise syntax for null-related stuff would be way better than anything Optional can offer (a 0..1 set is an interesting concept, but in a language that already has nulls, it’s just redundant). And yes, bikeshedding is a hobby.
A marker interface.
… feels strange by itself. Moreover, it applies to types only. Wouldn’t mixing marker interfaces and annotations feel strange?
intrigued … with checked exceptions
There’s one big difference:
There are places where you can’t simply replace file.close() by db.close() as you’re not allowed to throw a SQLException. If you circumvent the compiler, the code will work as the JVM doesn’t give a damn.
If you circumvent the @NonNull checker, the code will probably break.
Actually, I don’t get what’s your problem with @NotNull.
Creating ad-hoc intersection types by adding annotations (tags) to existing types is a very weird thing to do, and I think this approach should be followed either thoroughly (e.g. using the Checker Framework) or not at all. This “OK, it works for nullability” attitude is just a wishy-washy, mediocre approach to hacking stuff together. Why not resort to Perl, then, if the language really doesn’t matter anymore? :)
Thus far, I think that Ceylon is the only language that got nulls right: https://blog.jooq.org/2016/03/15/ceylon-might-just-be-the-only-language-that-got-nulls-right/ Ceylon combines the JVM’s unfortunate historic legacy of “null” with its general mechanism of declaring union types. Note that union types are the correct approach here, because something is, for instance, String|Null, i.e. it is of type String or of type Null. Again, annotations create intersection types, and that’s the wrong, warty tool for the job.
So, Ceylon gets this right by shoehorning this unfortunate legacy into an awesome feature where the legacy actually fits. An even better approach (in my opinion) would be to do away with null entirely and let go of the concept of arity entirely. Why not treat everything as a collection/iterable? For instance:
int a = { 1, 2, 3 };
int b = a + 1;
println(b); // { 2, 3, 4 };
What’s the point of distinguishing between 0..1, 1..1, and 0..N arities? I don’t see it. With this approach, magically, null (and the use-case that created it) ceases to exist.
Wouldn’t mixing marker interfaces and annotations feel strange?
Don’t forget: (type) annotations are interfaces / tags / markers that are attached to a type at the use-site, i.e. they are structural marker interfaces, or again: they create intersection types. But in a completely orthogonal type system that is not available to the Java compiler or the runtime directly.
There are places where you can’t simply replace file.close() by db.close() as you’re not allowed to throw a SQLException. If you circumvent the compiler, the code will work as the JVM doesn’t give a damn.
There are two problems here:
System exceptions should never have been checked. That was a big mistake. So, if SQLException and IOException were runtime exceptions, then no one would complain about user-defined, checked business exceptions.
But still, checked exceptions are a warty language extension that could have been covered with union types (again)! Imagine a method like this:
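The code example seems to have been lost here; what follows is a hedged reconstruction of the idea (fetchTitle and the Result wrapper are illustrative, not from the original). A method throwing a checked exception effectively returns a union of its result type and its exception type, which we can only emulate, since Java has no String|SQLException syntax:

```java
import java.sql.SQLException;

class UnionReturnSketch {
    // The checked-exception form: the effective "return type" is
    // String | SQLException, but the union is expressed via throws.
    static String fetchTitle(int id) throws SQLException {
        if (id < 0)
            throw new SQLException("no such id");
        return "title-" + id;
    }

    // The same contract with the union made explicit (emulated with a
    // wrapper, because Java lacks union types):
    record Result(String value, SQLException error) {}

    static Result fetchTitleUnion(int id) {
        if (id < 0)
            return new Result(null, new SQLException("no such id"));
        return new Result("title-" + id, null);
    }

    public static void main(String[] args) throws SQLException {
        System.out.println(fetchTitle(1));                            // title-1
        System.out.println(fetchTitleUnion(-1).error().getMessage()); // no such id
    }
}
```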
In fact, Java’s checked exceptions are a way to create a union type for the return type. But the syntax for switching over the union type (i.e. catch) is very irregular. It doesn’t really make any sense to (ab)use this concept for checked exceptions only. Union types would again have been a better solution. Another blog post that is in the pipeline :)
This is what I’ll surely do, sooner or later, depending on the overhead.
Ceylon is the only language that got nulls right
I fully agree, but I’m not gonna migrate anytime soon. So I need to get this feature to Java and all I can do is to pretend that String is actually String|null and @NonNull String is String.
Why not treat everything as a collection/iterable?
Because the idea is crazy, but not crazy enough ((c) Niels Bohr).
int a = {1, 2};
int b = a + a;
println(b); // {1, 2}; // union
println(b); // {2, 3, 4}; // cross sum
println(b); // {2, 4}; // pointwise sum
println(b); // {1, 2, 1, 2}; // concatenation
println(b); // {{0, 1}, {0, 2}, {1, 1}, {1, 2}}; // direct sum
println(b); // {{1, 2}, {1, 2}} // append
println(b); // {42} // expected result
What’s the point of distinguishing between 0..1, 1..1, and 0..N arities?
What about a > b? Do you get a collection of booleans? What about if (a > b)? Do you evaluate all branches?
What’s worse: you’d “solve” the NPE problem in a way equivalent to
Integer plus(Integer a, Integer b) {
    return a == null || b == null ? null : a + b;
}
and that’s a silent error propagation. You won’t ever get an NPE, but you’ll get (the equivalent of) null at many places and never know where it comes from.
System exceptions should have never been checked.
Wow! I’ve never heard that. This might make checked exceptions slightly usable.
checked exceptions are a warty language extension that could have been covered with union types
Agreed, but would you use them for system exceptions then?
And the problem remains that when A throws and Z catches, all methods B-Y on the whole stack trace get polluted.
IMHO checked exceptions of any kind may be the right thing for top-reliability software, but not for 99% of applications. The problem is their poor benefit-cost ratio. I really tried to embrace them, but they’re just too ugly.
(btw, the HTML tag you’re looking for is <blockquote>)
pretend that String is actually String|null and @NonNull String is String.
Good luck. Let me know how life as a paid-by-the-LOC developer is ;)
Because the idea is crazy, but not crazy enough ((c) Niels Bohr).
I’m sure there will be many useful operations that make total sense in maths, matrix calculus, set calculus, and a reasonable default following the principle of least surprise. Just look at https://en.wikipedia.org/wiki/J_(programming_language) and https://en.wikipedia.org/wiki/APL_(programming_language)
Let go of your belief that you can be sure of such things as arity. There is no such thing (anymore). It doesn’t matter if your a or b elements contain values. They’re sets. If they’re empty, well, you get an empty set as a result. No surprise, no error.
See, the whole problem with all this meaningless null bikeshedding is the fact that people have always pretended that arities are actually meaningful. But they’re too bored to implement the checks. There are only two solutions:
Remove all concept of arity. This will solve the problem thoroughly.
Accept the fact that in Java, ALL types are of arity 0..1 and that only by convention and good API design, you can occasionally omit the arity check
Both solutions will lead to clean software.
Ceylon’s approach isn’t part of this list, because it doesn’t address the arity problem, it works around it in a very elegant way.
This might make checked exceptions slightly usable.
Indeed, I’ve seen them being used this way. The “exceptions” weren’t actual “exceptions” but expected, meaningful outcomes that needed special treatment, and it was always immediately clear how to do so. The best example is: InsufficientFundsException in a banking application.
Agreed, but would you use them for system exceptions then?
Not sure. Unchecked exceptions are interesting relics from times when programmers wanted complete control over control flow (think of FORTRAN’s wacko ENTRY statement). I’m actually not sure if there’s a better way to deal with system exceptions…
all methods B-Y on the whole stack trace get polluted
Yes, but I haven’t seen any better way (yet). I’m very sure that the monad approach using Try is much worse than checked exceptions.
but they’re just too ugly.
Yes indeed. Again, union types are completely unopinionated about how they’re put to use, and they could 100% replace checked exceptions, so they would be a much nicer solution for those few top-reliability systems.
(btw, the HTML tag you’re looking for is <blockquote>)
That’s hopeless, I won’t remember this next week. I just don’t understand how WordPress can do without proper formatting (markdown) and without a preview.
pretend that String is actually String|null and @NonNull String is String.
Good luck. Let me know how life as a paid-by-the-LOC developer is
I’ll surely ask one if I meet him. With Lombok I’m saving the majority of the LOC, so it’d be a bad idea to be paid this way. The savings from Lombok are actually much bigger than what Java 8 offers.
I was wrong with what I wrote. I’m using @ParametersAreNonnullByDefault and @ReturnValuesAreNonnullByDefault, so plain String is just a String and @Nullable String is String|null. This is far better, as in 90% of cases, no annotation is needed. It’s far from perfect, as not everything gets checked, but that’s just a matter of time.
many useful operations that make total sense in maths
Sure, but that looks like an operator-overloading disaster when added to a “normal” language. Otherwise, it looks just like scattered tee. Math has many more symbols and 2D layout, and it’s still badly ambiguous (e.g., https://en.wikipedia.org/wiki/Quadratic_residue#Notations).
If they’re empty, well you get an empty set as a result. No surprise, no error.
No error reported by the runtime, but a big problem, as the poor programmer just didn’t think about it. Now the question of where the empty set comes from arises, and it costs a lot of time. An NPE would show us much more.
Remove all concept of arity. This will solve the problem thoroughly.
You’d have to convince the customers about it. What’s the price of the shopping cart? Empty set? Sure, that’s acceptable, isn’t it? One item is missing its price, and that is OK (no arity, right?) and propagates everywhere.
Accept the fact that in Java, ALL types are of arity 0..1 and that only by convention and good API design, you can occasionally omit the arity check
… and the places where you can and where you can’t should be documented. For example, by something like @NotNull.
Ceylon’s approach isn’t part of this list, because it doesn’t address the arity problem, it works around it in a very elegant way.
IMHO it’s the only solution. But I must confess, I can’t really follow you with either list item.
I’m actually not sure if there’s a better way to deal with system exceptions…
I guess nobody is. Returning some Either in Java would probably be fine. I guess I’d convert it to an exception in many places anyway, as usually 1. knowing there’s some problem is good enough, 2. the stack trace is important in case there’s a bug.
The following example illustrates what I mean: I periodically download a file and import it into a database. I can get an IOException, an SQL-lock-timeout-whatever exception, or a RuntimeException. In any case, I just retry later. The external URL may be down, it may provide broken data, the DB may be busy, or there may be a bug manifesting itself only occasionally. Whatever happens, it has to be analysed later, but for the moment, retrying is better than not retrying.
union types are completely unopinionated about how they’re put to use, and they could 100% replace checked exceptions, so they would be a much nicer solution for those few top-reliability systems.
I would like to take this opportunity to rant about the weird runtime importing of stuff using annotations when plain Java would be fine. While I agree that using annotations like that is pretty awful, I have to say that how the Spring framework uses annotations for extended configuration (aka @Configuration, aka what it ironically calls Java configuration) is super disgusting. I’m not talking about @Inject or @Autowired or even the MVC annotations, but the @InsertSomeFeatureLikeAWeirdMacro in @Configuration classes. Seriously, annotations that load up property files are the wrong way to use annotations.
So if you feel it is nasty, take a nice look at Spring 4 / Boot / @Configuration for some happy feelings.
Well, Adam, I’m happy to say that I agree 100% with you on your rant. Oh well, we’ll see and learn (once more) in hindsight that it was an extremely bad idea to put logic in annotations…
I don’t really agree with the whole article. I find using @Nullable/@NonNull for method arguments much better than doing nothing at all.
Benefits of using @Nullable/@NonNull for method arguments:
* Simplifies method body – you know if you need to handle the nulls or not.
* No runtime checks if the input parameter is null (or, if you have public methods, e.g. a library, you can add runtime checks with annotation processing, or just write explicit null checks).
And you say either go all-in with the Checker Framework or don’t use it at all. I say that using just @Nullable (you don’t have to use the Checker Framework for this) saves you or others the trouble they would otherwise have, not knowing whether the method they are changing is supposed to allow nulls for some argument value, and whether they should handle the null value or throw an error.
Basically, assume non null by default and use @Nullable for other cases.
As “experiments” (e.g. Kotlin / Java interop) have shown, this breaks whenever you have a dependency that disagrees with this. E.g. a random Java library, including the JDK libraries. If every Map.get() call is “unsound”, and every type resulting from such a call needs to be annotated, the whole thing quickly becomes a PITA.
And all of that just for nulls. Might as well “just” switch the language and use something like Scala, Kotlin, Ceylon instead, and try to avoid Java interop.
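The Map.get() point above can be made with nothing but the JDK: looking up a missing key is perfectly ordinary code, yet it yields null under a seemingly non-null static type (MapGetUnsound and missingValue are illustrative names):

```java
import java.util.HashMap;
import java.util.Map;

class MapGetUnsound {
    // Returns the value for "b" from a map that only contains "a":
    // the static type is String, the runtime value is null.
    static String missingValue() {
        Map<String, String> m = new HashMap<>();
        m.put("a", "1");
        return m.get("b");
    }

    public static void main(String[] args) {
        // Under a @NonNull-by-default regime, this call site would need an
        // annotation or an explicit check, yet it is perfectly normal JDK code.
        System.out.println(missingValue() == null); // true
    }
}
```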
I wouldn’t annotate the variable assigned from map.get(x) either.
Also, your example is as if the method return type were annotated instead of the parameters. Let’s look at an example for parameters.
Let’s say there is an alternative to Map called SomeComplexMap. Now you might assume that calling someComplexMap.get(null) returns null (either the value is missing or, if SomeComplexMap allows null values, it’s just that value). But instead, it throws an IllegalArgumentException with the message: null keys not supported.
How could you have known?
a) check the javadoc
b) check the source code
c) find out during runtime
If the x had been marked @Nullable, IDEA would warn you if you called it with a possibly null value. I don’t see any downsides to that.
Basically: use it as documentation that is understandable by, e.g., IDEA, and also more noticeable than Javadoc.
I don’t see the difference between parameter types or return types. Either everything is annotated (it isn’t), or the whole thing breaks apart. What if you call someComplexMap.get(map.get(x))? Would the IDE complain or not? It should.
Annotations «saved» Java (or at least simplified some code-generation processes), then annotations become something completely ambiguous and quite wrong (who cried Lombok?). Keep developing on extending features for them feeds such bad habit and type-annotations is a pretty horrible non-needed thing (so far I wasn’t able to find a real case-scenario).
I see (as far as I understood) that extending annotation features is easier than extending Java-lang features (new keywords etc…) because of retro compatibility, but this annotation thing is able to generate legacy and unpredictable applications (with a lot of side-effects) in zero-time. Most of all I believe they actually force you to develop by using reflection and that’s great, isn’t?
It can be useful to add types to the type system “after the fact.” For instance, Mike Ernst told me about a technique where only non-blocking code could be run on a GUI rendering thread, and that non-blockingness was enforced using type annotations. If code was (potentially) blocking, it just didn’t compile in a rendering thread context.
A similar technique is used in jOOQ 3.9:
https://blog.jooq.org/2016/05/09/jsr-308-and-the-checker-framework-add-even-more-typesafety-to-jooq-3-9
You can declare a package to require Oracle + MySQL support. From then on, only jOOQ code that supports both of these databases will compile. Oracle only API is no longer usable. Now, of course, this is heavily abusing the type system, but in any case, you can opt in to the validation and don’t need to run it all the time, so it can be run during CI, for instance.
Don’t get me wrong. I don’t think this is pretty or a good practice. But maybe, it does add some helpful quality control tools for some people.
Ceylon got annotations quite right. In Ceylon, all the method modifiers are annotations as well. Things like “public” or “static” or “synchronized”… Those shouldn’t be keywords.
But in hindsight (20 years from now), we’ll just ignore annotations as they will be recognised as a big mistake.
pretty interesting what you said about “public” and “static”, do you think an object should be declared as-is without specifying by using keywords if a method must be public, static or private? and then if you want to declare that you need annotations?
Quality checking I think it’s a nice scenario, but what I meant it was related about API design, I believe there should be a distinction between quality/”convention over configuration” frameworks (checker framework for instance as you said or yours) and development frameworks (Spring for instance). The first ones should check the provided Annotations in order to let you develop in the most convinient and safe way (let’s say), the second ones in order to get the job done. Well this is what I always thought about, for instance I always feel weird to declare a Class with a RestController annotation on it.
But I got your point about type-annotations anyway and let’s see what it’s going to happen in 20 years (@Annotated that :D ).
I’m not 100% sure about the annotations vs. keywords approach. I haven’t thought this through to the end. But I find the thought quite elegant. In the end, some Java keywords behave like instructions, others are types, others are classifiers, others are simply annotations (i.e. modifiers). And then, Java has actual annotations. Ceylon doesn’t make the difference.
I think we agree. The
@RestController
annotation does feel weird, because in fact, it is orthogonal to a concept that we already have and appreciate: A marker interface. Why doesn’t the class implementRestController
(i.e. it is-aRestController
)? Why is it annotated with the tag (i.e. a weird declarative form of is-aRestController
)…I guess we’re all intrigued with this experiment, like we were with checked exceptions. Until we figure out that the concept is good but the implementation is crap. (e.g. checked exceptions are also much better implemented with Ceylon or TypeScript union types. Blog post in the pipeline…)
IMHO NPEs are nasty enough to deserve some countermeasures. The only thing I dislike about @NonNull (apart from the existence of the spelling @NotNull) is its verbosity. I’d love it, if there was something like postfix “?” and “!!” in Kotlin.
Actually, I don’t get what’s your problem with @NotNull. A concise syntax for null-related stuff would be way better than anything Optional can offer (a 0-1 set is an interesting concept, but in a language having already nulls it’s just redundant). And yes, bikeshedding is hobby.
… feels strange by itself. Moreover, it applies to types only. Wouldn’t mixing marker interfaces and annotations feel strange?
There’s one big difference:
There are places where you can’t simply replace file.close() by db.close() as you’re not allowed to throw a SQLException. If you circumvent the compiler, the code will work as the JVM doesn’t give a damn.
If you circumvent the @NonNull checker, the code will probably break.
Looking forward to it.
Creating ad-hoc intersection types by adding annotations (tags) to existing types is a very weird thing to do and I think this approach should be followed either thoroughly (e.g. using the Checker Framework) or not at all. This “ok it works for nullability” is just a wishy washy mediocre approach at hacking stuff together. Why not resort to using Perl then, if the language really doesn’t matter anymore?? :)
Thus far, I think that Ceylon is the only language that got nulls right:
https://blog.jooq.org/2016/03/15/ceylon-might-just-be-the-only-language-that-got-nulls-right/
Ceylon combines the JVM’s unfortunate historic legacy of “null” with any arbitrary other way of declaring a union type. Note that union types are the correct approach here, because something is for instance
String|Null
, i.e. it is of typeString
or of typeNull
. Again, annotations create intersection types, and that’s the wrong, warty tool for the job.So, Ceylon gets this right by shoehorning this unfortunate legacy into an awesome feature where the legacy actually fits. An even better approach (in my opinion) would be to do away with null entirely and let go of the concept of arity entirely. Why not treat everything as a collection/iterable? For instance:
What’s the point of distinguishing between 0..1, 1..1, and 0..N arities? I don’t see it. With this approach, magically, null (and the use-case that created it) ceases to exist.
Don’t forget: (type) annotations are interfaces / tags / markers that are attached to a type at the use-site, i.e. they are structural marker interfaces, or again: they create intersection types. But in a completely orthogonal type system that is not available to the Java compiler or the runtime directly.
There are two problems here:
SQLException
andIOException
were runtime exceptions, then no one would complain about user-defined, checked business exceptionsIn fact, Java’s checked exceptions are a way to create a union type for the return type. But the syntax to switch on the union type (i.e.
catch
) is very irregular. It doesn’t really make any sense to (ab)use this concept for checked exceptions only. Union types would’ve been again a better solution.Another blog post that is in the pipeline :)
Sounds like you’re hankering for Lisp. :)
This is what I’ll surely do, sooner or later, depending on the overhead.
I fully agree, but I’m not gonna migrate anytime soon. So I need to get this feature to Java and all I can do is to pretend that String is actually String|null and @NonNull String is String.
Because the idea is crazy, but not crazy enough ((c) Niels Bohr).
What about a > b? Do you get a collection of booleans? What about if (a > b)? Do you evaluate all branches?
What worse: You’d “solve” the NPE problem in a way equivalent to
and that’s a silent error propagation. You won’t ever get an NPE, but you’ll get (the equivalent of) null at many places and never know where it comes from.
Wow! I’ve never heard that. This might make checked exceptions slightly usable.
Agreed, but would you use them for system exceptions then?
And the problem of A throwing and Z catching means that all methods on the whole stacktrace B-Y get polluted remains.
IMHO checked exceptions of any kind may be the right thing for top-reliability software, but not for 99% of applications. The problem is their poor benefit-cost ratio. I really tried to embrace them, but they’re just too ugly.
(btw, the HTML tag you’re looking for is <blockquote>)
Good luck. Let me kow how live as a paid-by-the-loc developer is ;)
I’m sure there will be many useful operations that make total sense in maths, matrix calculus, set calculus, and a reasonable default following the principle of least surprise.
Just look at https://en.wikipedia.org/wiki/J_(programming_language) and https://en.wikipedia.org/wiki/APL_(programming_language)
Let go of your believes that you can be sure of such things as arity. There is no such thing (anymore). It doesn’t matter if your
a
orb
elements contain values. They’re sets. If they’re empty, well you get an empty set as a result. No surprise, no error.See, the whole problem with all this meaningless null bike shedding is the fact that people have always pretended that arities are actually meaningful. But they’re too bored to implement the checks. There are only two solutions:
Both solutions will lead to clean software.
Ceylon’s approach isn’t part of this list, because it doesn’t address the arity problem; it works around it in a very elegant way.
Indeed, I’ve seen them being used this way. The “exceptions” weren’t actual “exceptions” but expected, meaningful outcomes that needed special treatment, and it was always immediately clear how to do so. The best example is InsufficientFundsException in a banking application.

Not sure. Unchecked exceptions are interesting relics from times when programmers wanted complete control over control flow (think of FORTRAN’s wacko ENTRY statement). I’m actually not sure if there’s a better way to deal with system exceptions…
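A minimal sketch of that banking example, with a checked exception modelling an expected, meaningful outcome rather than a system failure (all names hypothetical):

```java
public class Account {
    // Checked exception: an expected business outcome the caller must
    // consciously handle, not an unexpected system failure
    static class InsufficientFundsException extends Exception {
        InsufficientFundsException(String msg) { super(msg); }
    }

    private long balance;

    Account(long balance) { this.balance = balance; }

    // The throws clause forces the caller to decide what
    // "not enough money" means for them
    void withdraw(long amount) throws InsufficientFundsException {
        if (amount > balance)
            throw new InsufficientFundsException(
                "balance " + balance + " < requested " + amount);
        balance -= amount;
    }

    long balance() { return balance; }

    public static void main(String[] args) {
        Account a = new Account(100);
        try {
            a.withdraw(150);
        } catch (InsufficientFundsException e) {
            // It's immediately clear how to treat this outcome
            System.out.println("Declined: " + e.getMessage());
        }
    }
}
```

Here the compiler-enforced handling is a feature, not pollution, because the outcome is part of the method’s contract.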
Yes, but I haven’t seen any better way (yet). I’m very sure that the monad approach using Try is much worse than checked exceptions.

Yes indeed. Again, union types are completely unopinionated about how they’re put to use, and they could 100% replace checked exceptions, so they would be a much nicer solution for those few top-reliability systems.
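For comparison, here is a bare-bones sketch of the Try approach (a homegrown stand-in, not vavr’s actual API):

```java
import java.util.function.Function;

public class TryDemo {
    interface ThrowingSupplier<T> { T get() throws Exception; }

    // Minimal Try: either a Success value or a captured Failure
    static abstract class Try<T> {
        static <T> Try<T> of(ThrowingSupplier<T> s) {
            try { return new Success<>(s.get()); }
            catch (Exception e) { return new Failure<>(e); }
        }
        abstract <R> Try<R> map(Function<T, R> f);
        abstract T getOrElse(T fallback);
    }

    static final class Success<T> extends Try<T> {
        final T value;
        Success(T value) { this.value = value; }
        <R> Try<R> map(Function<T, R> f) { return Try.of(() -> f.apply(value)); }
        T getOrElse(T fallback) { return value; }
    }

    static final class Failure<T> extends Try<T> {
        final Exception e;
        Failure(Exception e) { this.e = e; }
        @SuppressWarnings("unchecked")
        <R> Try<R> map(Function<T, R> f) { return (Try<R>) this; }
        T getOrElse(T fallback) { return fallback; }
    }

    public static void main(String[] args) {
        // The failure travels silently through map() -- unlike a checked
        // exception, nothing forces any caller to deal with it.
        int result = TryDemo.Try.of(() -> Integer.parseInt("oops"))
                                .map(i -> i * 2)
                                .getOrElse(-1);
        System.out.println(result);
    }
}
```

The sketch illustrates the objection: the captured exception quietly rides along through the chain, so forgetting to inspect it produces exactly the silent propagation criticised earlier in the thread.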
That’s hopeless, I won’t remember this next week. I just don’t understand how WordPress can work without proper formatting (markdown) and without a preview.
I’ll surely ask one if I meet him. With Lombok I’m saving a majority of the LOC, so it’d be a bad idea to be paid this way. The savings from Lombok are actually much bigger than what Java 8 offers.
I was wrong with what I wrote. I’m using @ParametersAreNonnullByDefault and @ReturnValuesAreNonnullByDefault, so plain String is just a String and @Nullable String is String|null. This is far better, as in 90% of cases no annotation is needed. It’s far from perfect, as not everything gets checked, but that’s just a matter of time.
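A small self-contained sketch of that nonnull-by-default convention. The annotations are declared locally here as throwaway stand-ins (the real ones are the JSR-305 style javax.annotation.* annotations, which aren’t on the default classpath):

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class NonnullByDefaultDemo {
    // Stand-ins for the JSR-305 style annotations, declared here only
    // so the sketch compiles on its own
    @Retention(RetentionPolicy.CLASS) @interface ParametersAreNonnullByDefault {}
    @Retention(RetentionPolicy.CLASS) @interface Nullable {}

    @ParametersAreNonnullByDefault
    static class UserService {
        // name is implicitly non-null: no annotation needed, which
        // covers the ~90% common case
        static String greet(String name) {
            return "Hello, " + name;
        }

        // The exception is spelled out: this parameter is effectively
        // String|null, and the body must handle it
        static String find(@Nullable String id) {
            return id == null ? "anonymous" : "user:" + id;
        }
    }

    public static void main(String[] args) {
        System.out.println(UserService.greet("Ada"));
        System.out.println(UserService.find(null));
    }
}
```

The tooling (IDE or annotation processor) is what gives these annotations teeth; the sketch only shows the documentation convention itself.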
Sure, but that looks like an operator overloading disaster when added to a “normal” language. Otherwise, it just looks like scattered tee. Math has many more symbols and 2D layout, and it’s still badly ambiguous (e.g., https://en.wikipedia.org/wiki/Quadratic_residue#Notations).
No error reported by the runtime, but a big problem, as the poor programmer just didn’t think about it. Now the question arises where the empty set comes from, and answering it costs a lot of time. An NPE would tell us much more.
You’d have to convince the customers of it. What’s the price of the shopping cart? Empty set? Sure, that’s acceptable, isn’t it? One item is missing its price and this is OK (no arity, right?) and propagates everywhere.
… and the places where you can and where you can’t should be documented. For example, by something like @NotNull.
I would like to take this opportunity to rant about the weird runtime importing of stuff using annotations when plain Java would be fine. While I agree that using annotations like that is pretty awful, I have to say that how the Spring framework uses annotations for extended configuration (aka @Configuration, aka what it ironically calls Java configuration) is super disgusting. I’m not talking about @Inject or @Autowired, or even the MVC annotations, but the @InsertSomeFeatureLikeAWeirdMacro annotations in @Configuration classes. Seriously, annotations that load up property files are the wrong way to use annotations.
So if you feel it is nasty take a nice look at Spring 4/Boot/@Configuration for some happy feelings.
Well, Adam, I’m happy to say that I agree 100% with you on your rant. Oh well, we’ll see and learn (once more) in hindsight that it was an extremely bad idea to put logic in annotations…
I don’t really agree with the whole article. I find using @Nullable/@NonNull for method arguments much better than doing nothing at all.
Benefits of using @Nullable/@NonNull for method arguments:
* Simplifies method body – you know if you need to handle the nulls or not.
* No runtime checks if the input parameter is null (or, if you have public methods, e.g. a library, you can add runtime checks with annotation processing, or just write explicit null checks).
And you say either go all-in with the Checker Framework or don’t use it at all. I say that using just @Nullable (you don’t have to use the Checker Framework for this) saves you or others the trouble they would otherwise have: not knowing whether the method they are changing is supposed to allow nulls for some argument, and whether they should handle the null value or throw an error.
Basically, assume non-null by default and use @Nullable for other cases.
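A small sketch of that convention and how it simplifies method bodies (the @Nullable annotation is declared locally as a stand-in, since the real ones live in various libraries):

```java
public class NullableArgsDemo {
    // Local stand-in for a library @Nullable annotation
    @interface Nullable {}

    // Non-null by default: the body needs no null handling at all
    static int length(String s) {
        return s.length();
    }

    // The @Nullable case is the documented exception, and the body
    // must handle it explicitly
    static int lengthOrZero(@Nullable String s) {
        return s == null ? 0 : s.length();
    }

    public static void main(String[] args) {
        System.out.println(length("abc"));
        System.out.println(lengthOrZero(null));
    }
}
```

This is exactly the benefit listed above: whoever edits length() knows nulls are someone else’s problem, and whoever edits lengthOrZero() knows they are not.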
As “experiments” (e.g. Kotlin / Java interop) have shown, this breaks whenever you have a dependency that disagrees with this. E.g. a random Java library, including the JDK libraries. If every Map.get() call is “unsound”, and every type resulting from such a call needs to be annotated, the whole thing quickly becomes a PITA.
And all of that just for nulls. Might as well “just” switch the language and use something like Scala, Kotlin, Ceylon instead, and try to avoid Java interop.
I wouldn’t annotate the assigned variable of map.get(x) either.
Also, your example is as if the method return type were annotated instead of the parameters. Let’s look at an example for parameters.
Let’s say there is an alternative to Map called SomeComplexMap. Now you might assume that calling someComplexMap.get(null) returns null (either the value is missing or, if SomeComplexMap allows null values, it’s just that value). But instead it throws an IllegalArgumentException with the message “null keys not supported”.
How could you have known?
a) check the javadoc
b) check the source code
c) find out during runtime
If the x had been marked @Nullable, IDEA would warn you if you call it with a possibly null value – I don’t see any downsides to that.
Basically – use it as documentation which is understandable by e.g. IDEA and also more noticeable than javadoc.
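The commenter’s hypothetical SomeComplexMap, sketched out (the class name and message come from the comment; the implementation is invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class SomeComplexMapDemo {
    static class SomeComplexMap<K, V> {
        private final Map<K, V> delegate = new HashMap<>();

        void put(K key, V value) { delegate.put(key, value); }

        // Without the contract expressed in the signature, callers only
        // discover this restriction at runtime (or by reading the
        // javadoc or the source)
        V get(K key) {
            if (key == null)
                throw new IllegalArgumentException("null keys not supported");
            return delegate.get(key);
        }
    }

    public static void main(String[] args) {
        SomeComplexMap<String, Integer> m = new SomeComplexMap<>();
        m.put("a", 1);
        System.out.println(m.get("a"));
        try {
            m.get(null);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

An annotation on the key parameter would move this discovery from runtime (option c above) to the IDE, which is the whole point being argued.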
I don’t see the difference between parameter types or return types. Either everything is annotated (it isn’t), or the whole thing breaks apart. What if you call someComplexMap.get(map.get(x))? Would the IDE complain or not? It should.
My example was plain wrong though… In this case the @NonNull annotation (instead of @Nullable) would make sense on the argument.
But if it had @Nullable, then that would show that our map does allow null keys.