People often approach TypeScript with the mindset that it sits halfway between an untyped language like JavaScript and a traditional statically typed language. That may be true in some ways, but when it comes to nullability, TypeScript (as it is typically used) is actually much more strict and expressive than traditional languages like C, Java, and C#, and you may be led astray if you don't take that into account.
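To make that concrete, here is a minimal sketch of what "more strict" looks like in practice, assuming a project with `strictNullChecks` enabled (the function names are hypothetical, for illustration only):

```typescript
// Under strictNullChecks, 'string' and 'string | null' are distinct types.
function firstName(names: string[]): string | null {
  return names.length > 0 ? names[0] : null;
}

function greet(name: string): string {
  return `Hello, ${name}!`;
}

const userName = firstName(["Ada", "Grace"]);
// greet(userName);       // compile error: 'string | null' is not
//                        // assignable to parameter of type 'string'
if (userName !== null) {
  greet(userName); // OK: the check narrows 'string | null' to 'string'
}
```

The compiler forces the null check before the value can be used, which is exactly the discipline that Java or C# (historically) never enforced.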
In JavaScript, values are completely untyped and you can of course assign any value to any variable or parameter, including `null` and `undefined`. There is no compiler, and if you want your code to be protected against invalid values you have to write your own runtime checks.
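A hand-written runtime check of that kind might look like this (a hypothetical example, written in TypeScript with `unknown` standing in for an untyped JavaScript value):

```typescript
// Without compiler-enforced types, the function must defend itself at runtime.
function capitalize(word: unknown): string {
  if (typeof word !== "string") {
    // Rejects null, undefined, numbers, objects, etc.
    throw new TypeError(`capitalize: expected a string, got ${word}`);
  }
  return word.charAt(0).toUpperCase() + word.slice(1);
}
```

Every function that cares about validity needs a guard like this, and nothing warns you if you forget one.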
In C# (and other traditional statically typed languages), at least before recent null-checking contexts, any reference type accepts `null`. This goes back to C, where a reference type value is represented with a pointer, and `null