As layers of abstraction pile up in our software, we lose sight of what is going on at the base. We no longer understand exactly what the hardware is doing when we type an abstracted command into our editor.
This lack of understanding of programming fundamentals and the underlying hardware will cause systemic decay and, potentially, the collapse of important infrastructure.
Modern programs are slow despite yearly hardware improvements. This is plainly visible the moment you open an application like Slack.
We now have programs built on top of web technologies running on desktops. Because of this shift, more and more programmers are learning and using highly abstract languages and frameworks.
The general argument is that computers nowadays are fast enough that the stack does not matter. This leads to almost no consideration of hardware limitations, memory management, or CPU usage.
These skills are not being refined in the general course of a programmer's career, and are therefore becoming less prevalent, less sought-after, and less known about. In 2020, the five most sought-after programming languages are JavaScript (53.6%), Python (49.5%), Java (44.1%), C# (19.7%), and finally C++ (18.3%), the first non-garbage-collected language on the list.
The highest salaries are all for garbage-collected languages, giving less incentive to become truly proficient in non-garbage-collected ones. The top five are Perl, Scala, Go, Ruby, and Objective-C. The version of Objective-C is not specified, so it may or may not be garbage-collected.
In either case, the point remains that programmers are less incentivised to learn about the lower levels of their craft, happy to let the languages sort it out for them.
From a business perspective, the standardisation of programming paradigms through the widespread adoption of particular frameworks is highly beneficial.
The more people who know how to use these frameworks to build software, the easier it is to replace employees. Some companies may even hire an experienced programmer at a high rate to build the initial software, then fire them and replace them with lower-wage maintainers. Mass adoption of frameworks facilitates this very nicely.
As the flavour-of-the-month frameworks change and programmers ignore the fundamentals in favour of learning them, we will see a decline in the availability of programmers knowledgeable enough to maintain legacy systems. The same effect will catch up with the frameworks themselves in a decade or two, resulting in a cascading failure of software applications.
If the current trend continues, we will find ourselves in a time in which nobody alive understands how our infrastructure works, and it will slowly degrade and collapse. This has happened several times in the past; such events are commonly called Dark Ages.
This was demonstrated when Texas Instruments started shipping dead-on-arrival processors. When asked what was going on, they stated that the designers of these processors had left for other jobs, and the new staff lacked the deep knowledge required to produce them correctly.
The fact that anybody can complete a twelve-week boot camp, get a job as a programmer, and do well is alarming. It means the heavy burden of understanding what is going on under the hood falls on only a few people, as well as on the automated software pillars we have come to rely on.
Because knowing anything about the underlying hardware is not required to succeed in the field, programmers simply do not learn about it unless they are keenly curious or are specifically told to do so.
This leads to a lack of consideration and planning around how code will interact with the hardware it is written for.
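To make this concrete, here is a minimal C sketch (the matrix size and function names are mine, purely for illustration) showing how the same computation can run markedly faster or slower depending only on how it walks through memory:

```c
/* Two loops that compute the same sum over the same matrix. In C a 2-D
 * array is stored row by row, so the row-major loop touches memory
 * sequentially and stays friendly to the CPU cache, while the
 * column-major loop strides a whole row at a time and misses far more often. */
#include <stdio.h>

#define N 2048
static double matrix[N][N];

double sum_row_major(void) {
    double sum = 0.0;
    for (int i = 0; i < N; i++)        /* walk each row... */
        for (int j = 0; j < N; j++)    /* ...element by element */
            sum += matrix[i][j];
    return sum;
}

double sum_column_major(void) {
    double sum = 0.0;
    for (int j = 0; j < N; j++)        /* walk each column, */
        for (int i = 0; i < N; i++)    /* striding N doubles per step */
            sum += matrix[i][j];
    return sum;
}

int main(void) {
    printf("%f\n", sum_row_major());      /* typically markedly faster */
    printf("%f\n", sum_column_major());   /* same result, more cache misses */
    return 0;
}
```

Nothing in either function is wrong at the level of the language; the difference only becomes visible once you consider the cache hierarchy the code actually runs on.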
Many pieces of software are created with a driving ideology behind them which dictates how they are structured; many of the problems the programmer then has to solve exist only because of that dogmatic approach.
Designing a whole piece of software around a core idea decided beforehand can lead to critical issues.
Focusing effort on upholding these arbitrary frameworks, pushing square pegs into round holes, leaves programmers without an understanding of how to solve difficult problems efficiently.
The Digital Dark Age, the degradation and loss of our data over time, is approaching.
To us this seems like a strange idea: that we might lose access to the information we have at our fingertips right now. However, history has shown that the collapse of a civilisation and the loss of its knowledge can happen quickly, or slowly without anyone even realising it.
For an example of the slow degradation of a society and the loss of its knowledge, look no further than the collapse of the Western Roman Empire. The civilisation was in decline for some 300 years, with people in the capital convinced that "it will get better soon; there is no way Rome can fall". This is, of course, a naive way of thinking, and it is all too human to fall into this trap.
The Late Bronze Age collapse of the cultures around the Mediterranean Sea during the 12th century BC is a prime example of rapid degradation: four kingdoms collapsed and 33 cities were utterly destroyed, all of which had previously prospered thanks to their advanced trading networks.
The half-century between 1200 BC and 1150 BC saw the collapse of the Mycenaean kingdoms, the Kassite dynasty of Babylonia, and the Hittite Empire in Anatolia and the Levant, along with the Egyptian Empire; the destruction of Ugarit and the Amorite states in the Levant; the fragmentation of the Luwian states of western Asia Minor; and a period of chaos in Canaan.
The collapse of these civilisations is thought to have been caused by a combination of natural disasters, drought, and a scarcity of tin, which was imported from Afghanistan.
The complexity of these cultures' economic and social organisations was such that it became impossible for the survivors to re-establish them, plunging this part of the world into a dark age.
A similar situation could befall us: a significant solar flare like the one that narrowly missed Earth in July 2012, or a nuclear detonation in the atmosphere, could send out an electromagnetic pulse large enough to wipe all data from the face of the Earth. Knowledge of how to rebuild our technology would be vital in that situation.
Pursuing mastery of programming is important for the future, but it comes at a cost.
Becoming a master in any field requires a tremendous amount of time, most of which will not be spent in the workplace while the current trend toward abstracted technology continues.
In the meantime it will be harder to be paid well for your efforts, as they run counter to the ideal lackey mentality that businesses are generally looking for.
There will always be a place for truly dedicated individuals, and master programmers will be needed more than ever in the coming decades to help with the aforementioned dysfunctional critical systems.
It is important to remember that knowledge lives in the minds of humans, and if it is not passed on, it can easily be lost. Digital data is not immune to the effects of entropy, natural disaster, or intentional destruction.
If you are one of those people looking to truly master programming, make sure to understand the hardware and runtime environments you target, as well as lower-level concepts like memory management and CPU caching. Consider learning assembly language and reading the specification for your target architecture so you can speed up critical systems.
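As a starting exercise, a hand-rolled bump ("arena") allocator makes concrete much of what a garbage collector normally hides. The sketch below is illustrative only; the names and sizes are mine, not taken from any particular library:

```c
/* A minimal bump ("arena") allocator: allocation is a pointer increment,
 * and the whole arena is released at once, so there is no per-object
 * free() and no fragmentation. Illustrative sketch, not production code. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <stddef.h>

typedef struct {
    uint8_t *base;   /* start of the backing buffer */
    size_t   size;   /* total capacity in bytes */
    size_t   used;   /* bytes handed out so far */
} Arena;

int arena_init(Arena *a, size_t size) {
    a->base = malloc(size);
    a->size = size;
    a->used = 0;
    return a->base != NULL;
}

void *arena_alloc(Arena *a, size_t bytes) {
    size_t aligned = (bytes + 7) & ~(size_t)7;   /* keep pointers 8-byte aligned */
    if (a->used + aligned > a->size)
        return NULL;                             /* out of space: caller decides */
    void *p = a->base + a->used;
    a->used += aligned;
    return p;
}

void arena_release(Arena *a) {
    free(a->base);   /* one free() pairs with every allocation made here */
    a->base = NULL;
    a->size = a->used = 0;
}

int main(void) {
    Arena frame;
    if (!arena_init(&frame, 1 << 20))            /* 1 MiB of scratch space */
        return 1;
    int *numbers = arena_alloc(&frame, 100 * sizeof(int));
    if (numbers)
        numbers[0] = 42;
    printf("%d\n", numbers ? numbers[0] : -1);
    arena_release(&frame);
    return 0;
}
```

Writing even this much forces the questions a garbage collector answers silently: who owns this memory, how long does it live, and what happens when it runs out. Compiling it with `gcc -S` and reading the emitted assembly is an equally good way to start connecting source code to the target architecture.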
A deep understanding of these topics will be required in the future. It is only a matter of time and complexity until the abstract house of cards comes falling down.
- https://www.ancient.eu/article/835/fall-of-the-western-roman-empire/
- https://www.livescience.com/38848-emp-solar-storm-danger.html
- http://www.righto.com/2015/05/the-texas-instruments-tmx-1795-first.html
- https://martinfowler.com/articles/is-quality-worth-cost.html
- https://wpengine.com.au/developers-feel-programming-languages/