As a programmer, it is your job to put yourself out of business. What you do today can be automated tomorrow.
Vannevar Bush: scientist, participant in the Manhattan Project, member of the AT&T board of directors, inventor of the Differential Analyzer, etc. While he is a conceptual father of the “personal computer”, the Memex he described in As We May Think - The Atlantic, he was famously against digital computers: he did not believe they could be built reliably and favoured analog computation instead.
Renting an IBM 701 cost about $300 per hour, which, adjusted for inflation, is roughly $1 per second today.
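A rough check of that figure, assuming a mid-1950s-to-today CPI multiplier of about 11-12x: $300/h × 11.5 ≈ $3,450/h, and $3,450 / 3,600 s ≈ $0.96/s.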
As assemblers (or “symbolic systems”, as they were known back then) were seen as too wasteful of machine time, people were sceptical and kept programming in absolute binary instead, not worrying about the “human time” it took to do so.
Richard Hamming, in The Art of Doing Science and Engineering, chapter 4, mentions:
At the time [the assembler] first appeared I would guess about 1% of the older programmers were interested in it — using [assembly] was “sissy stuff”, and a real programmer would not stoop to wasting machine capacity to do the assembly. Yes! Programmers wanted no part of it, though when pressed they had to admit their old methods used more machine time in locating and fixing up errors than the [assembler] ever used. One of the main complaints was when using a symbolic system you do not know where anything was in storage — though in the early days we supplied a mapping of symbolic to actual storage, and believe it or not they later lovingly pored over such sheets rather than realize they did not need to know that information if they stuck to operating within the system — no! When correcting errors they preferred to do it in absolute binary.
Fortran (Formula Translating System) was created to translate the “human language” of math/calculus into machine language.
Richard Hamming recalls:
It was opposed by almost all programmers:
- “It can’t be done”,
- “it will be so inefficient that you cannot afford it”,
- “even if it did work, no respectable programmer would use it — it was only for sissies!”
Algol (Algorithmic Language): a large group of participants in the Panel Discussion on “Philosophies for Efficient Processor Construction” at the “International Symposium of Symbolic Languages in Data Processing” in 1962 was sceptical about the usefulness of /recursive procedures/ in the ALGOL 60 language (which E.W. Dijkstra had implemented one year before that) source
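To make concrete what was being contested: here is a minimal sketch of a recursive procedure, written in C for brevity (ALGOL 60 was the language actually under discussion). Each call gets its own activation record on the call stack, which is precisely the runtime machinery the sceptics saw as an extravagance:

```c
#include <stdio.h>

/* A recursive procedure: each invocation gets its own copy of n
   on the call stack, the mechanism ALGOL 60 had to support. */
static unsigned long factorial(unsigned long n)
{
    if (n <= 1)                       /* base case ends the recursion */
        return 1;
    return n * factorial(n - 1);      /* self-call with a smaller argument */
}

int main(void)
{
    printf("10! = %lu\n", factorial(10));  /* prints 3628800 */
    return 0;
}
```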
Lisp (1959), Forth (1970), and Smalltalk (1972) are interpreted languages, and interpretation drew the same scepticism about wasted machine time.
Garbage collection met the same doubts about cost:
- Go: Why do garbage collection? Won’t it be too expensive?
- D Programming Language: Garbage Collection
Edsger Dijkstra joked:
object-oriented programming is an exceptionally bad idea which could only have originated in California.
Linus Torvalds at gmane.comp.version-control.git said:
limiting your project to C means that people don’t screw that up, and also means that you get a lot of programmers that do actually understand low-level issues and don’t screw things up with any idiotic “object model” crap.
Rob Pike in comp.os.plan9 commented:
object-oriented design is the roman numerals of computing
Joe Armstrong (more at Why OO Sucks by Joe Armstrong) remarked elsewhere:
The problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle
- Real Programmers Don’t Use PASCAL lore
- more quotes at Programming Quotes