The billion-dollar mistake
Java and C# developers: what is the most common type of exception you have encountered? Think about it. If it is a plain Exception, sorry, your code is wrong; you should throw specific exceptions. This exception is so commonly expected (which is weird in itself) that Java lets you omit it from the throws list of a method signature. Yes, it is the NullPointerException, NullReferenceException, ArgumentNullException, or whatever name it has in your language. The most common exceptions we encounter are the ones that deal with null (or None in Python). And this is a pain in most programming languages we use today, not only Java and C#.
In fact, we developers and testers spend considerable time and brain power guarding against null values. No matter how many you counter, you catch another few during the testing phase. Then it hurts again as the root cause of a hidden bug in some legacy code that brings down production. Security vulnerabilities have been caused by it too.
In 2009, Sir Tony Hoare publicly apologized for introducing the null reference in 1965 in Algol W, one of the most influential programming languages. Algol W influenced C, and C influenced most programming languages that exist to this day, so the concept of null was passed on. Hoare estimated that it may have caused a billion dollars' worth of trouble over the previous forty years.
The trillion-dollar mistake?
If it was billions in 2009, it must be approaching trillions by now, given how large the entire IT industry has become. I think that is a fair assumption. Remember, even if modern programming languages have fixed this, fixing or mitigating the already existing APIs and getting developers to think non-null first are non-trivial tasks with a cost of their own. In fact, it is a mistake that language designers should have corrected much earlier; Sir Tony Hoare alone should not take the blame.
"But I check for null/ We use this annotation/library that gives me warnings..."
However, you have still contributed to that billion. You invested in writing all that additional code, and it counts toward the billion dollars. Some choose not to worry about null at development time and to deal with issues at runtime; you choose to be safe at development time, which is the lesser burden. Still it hurts, because you have to put extra effort into development, never mind the discussions and lengthy arguments needed to get the team to follow a standard.
The solution
Well, it sounds easy. Null was defined to make life easy for computers, but we humans define variables to hold something that is not empty or undefined in most cases. Variables should have been non-nullable by default. Null is not actually the problem; the problem is that everything is nullable by default. If you want a value to be null, you should have to say so with special semantics, and the compiler, not the runtime, should detect violations. Most modern programming languages are designed with this in mind. Kotlin is a good example, and there is also Ballerina. Check it out if you haven't heard of it.
Workarounds exist for older programming languages to catch these errors at compile time. Java developers, for example, use annotations such as @NonNull and @Nullable. Annotations are optional, though, which makes them a suboptimal solution.
The C# way
C# 8 introduced the nullable reference types feature to fix this. Introducing nullable to fix null? Sounds strange, right? Well, the name is actually accurate: it means you can now make your code non-nullable by default and opt in to nullable properties, variables, and so on.
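To make the distinction concrete, here is a minimal sketch (the variable names are made up for illustration) of what the opt-in looks like once the feature is enabled:

```csharp
#nullable enable

string title = "Editor"; // non-nullable by default; assigning null here would be a warning
string? note = null;     // the trailing '?' explicitly opts this variable in to holding null
```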
Show me the code
OK, add this to the project file:
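The original post showed this step as an image; the equivalent fragment, assuming an SDK-style project targeting .NET Core 3.0 or later (where the Nullable property is supported), looks like this:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
    <!-- Enable nullable reference types for the whole project -->
    <Nullable>enable</Nullable>
  </PropertyGroup>
</Project>
```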
The compiler jumps in:
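The code from the original screenshot is not reproduced here, so the following is a hypothetical example (Person and Greet are invented names) of the kind of code that now draws warnings:

```csharp
#nullable enable

public class Person
{
    // Warning CS8618: non-nullable property 'Name' must contain a
    // non-null value when exiting the constructor
    public string Name { get; set; }
}

public static class Greeter
{
    public static string Greet(Person? person)
    {
        // Warning CS8602: dereference of a possibly null reference
        return "Hello, " + person.Name;
    }
}
```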
Make your intentions clear, and the compiler is happy:
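Again a sketch rather than the original code: initialize what must never be null, and null-check what may be:

```csharp
#nullable enable

public class Person
{
    // Initialized up front, so the non-null promise holds from construction onward
    public string Name { get; set; } = string.Empty;
}

public static class Greeter
{
    public static string Greet(Person? person)
    {
        // The null check narrows 'person' to non-null in the second branch,
        // so the dereference no longer triggers a warning
        return person is null ? "Hello, stranger" : "Hello, " + person.Name;
    }
}
```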
I think the hard part is getting developers to think non-null by default. I wrote a few classes in a new library, and it felt awesome to have the compiler jump in and point out possible null references. Peace of mind :)
By default, the compiler gives you a warning, not an error. In my case, I like to treat warnings as errors from the beginning in all new code and then suppress any warnings that are out of place. My suggestion is that all your new code should make use of this feature, and treat the warnings as errors. Remember, your code is used for longer than you think, to solve critical problems that you never imagined, and it may cause trouble like you never imagined. So let's write safe code.
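One way to express that in the project file, again as a sketch: newer SDKs accept the shorthand value nullable for WarningsAsErrors, which promotes only the nullable-analysis warnings, while TreatWarningsAsErrors is the blunter, promote-everything alternative:

```xml
<PropertyGroup>
  <Nullable>enable</Nullable>
  <!-- Promote all nullable-analysis warnings to build errors -->
  <WarningsAsErrors>nullable</WarningsAsErrors>
</PropertyGroup>
```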
Agree? Don’t agree? Any ideas? Please add a comment.