The type system is supposed to help us, but like a sword, it is double-edged.
Humans work very well in context. We are good at switching in and out of contexts; within a context, we keep track of only a few items, and that is how we can be so efficient. However, as we slide in and out of contexts, we often forget details, and that is why we are so often encumbered by inconsistency. Computers, on the other hand, work best without context. A computer never forgets, so it is always consistent. This creates a mismatch -- our understanding diverges from what the computer understands. That mismatch is the origin of bugs.
There are two ways to go about this. We can teach the computer to tolerate our inconsistency -- e.g. fuzzy logic, neural networks, etc. Or, having decided that consistency matters more (than efficiency), we can try to be consistent ourselves. The type system is essentially one such system for maintaining consistency.
Then we realize that when we are in context, the background assumptions we carry are often vast. To really help us stay consistent, we would need very smart type systems. We don't have those, so we settle for the current rudimentary ones. Rudimentary types are very narrow, just as a sword is very sharp. So while helping us stay consistent, they constantly bother us with nuisances.
For example, when we declare an integer variable, it is either signed or unsigned, and it comes in one of several sizes. These attributes exist because sometimes they matter. For the vast majority of the time, however, they simply do not matter. Is it worthwhile to have the compiler constantly interrupt us over conversions between signed and unsigned?
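A minimal C sketch of the nuisance, assuming a typical gcc or clang invocation with -Wall -Wextra: the function name and data here are made up for illustration, but the signed/unsigned mix is exactly the kind that draws a warning even though the intent is perfectly clear.

```c
#include <stddef.h>
#include <stdio.h>

/* Sum the elements of an array from index `start` to the end.
 * `count` is a size_t (unsigned) because that is what sizeof yields,
 * while `start` and `i` are plain ints because that is what we reach
 * for by habit -- and the compiler objects to the mixture. */
int sum_tail(const int *values, size_t count, int start)
{
    int total = 0;
    /* gcc/clang with -Wall -Wextra (or -Wsign-compare) warn here:
       comparison of integer expressions of different signedness */
    for (int i = start; i < count; i++)
        total += values[i];
    return total;
}

int main(void)
{
    int data[] = {1, 2, 3, 4, 5};
    /* sums 3 + 4 + 5 and prints 12 */
    printf("%d\n", sum_tail(data, sizeof data / sizeof data[0], 2));
    return 0;
}
```

The logic is obviously correct for any array that fits in memory we would realistically index with an int, yet to silence the warning we must sprinkle casts or change the loop variable's type -- bookkeeping that serves the type checker rather than the reader.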