Mr. Jobs is indeed starting to behave like that other convicted monopolist we know and love. Yet unlike the latter, Jobs did not engage in underhanded business practices to create his monopolies. They were handed to him on a silver platter by the rest of the market, which insists on peddling either outright crap or cheap imitations of Apple’s aesthetic. In order to resist the temptation this worldwide herd of mindless junk-peddlers and imitators has placed before him, it would not be enough for Jobs to merely “not be evil.” He would have to be a saint (and a traitor to his shareholders).
Month: February 2015
What would I recommend learning?
– Erlang (I’m biased)
– Haskell / ML / OCaml
A couple of years should be enough (PER LANGUAGE).
Notice there is no quick fix here – if you want a quick fix, go buy “Learn PHP in Ten Minutes” and spend the next twenty years googling “how do I compute the length of a string.”
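For contrast, an illustrative sketch (mine, not from the original post): in any language with a real standard library, the string-length question the joke mocks is a single call.

```python
# Illustrative only: the "how do I compute the length of a string"
# problem from the joke above, solved without a search engine.
s = "how long is this string?"
n = len(s)  # the length of a string is just len(s)
print(n)
```

The point stands for Erlang and Haskell too (`string:length/1`, `length`); the twenty years of googling are a property of the ecosystem, not the problem.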
Ok, here we go, yet another “online presence” of some sort. If this is still active six months from today, I’ll give it its own domain name.
The view that machines cannot give rise to surprises is due, I believe, to a fallacy to which philosophers and mathematicians are particularly subject. This is the assumption that as soon as a fact is presented to a mind all consequences of that fact spring into the mind simultaneously with it. It is a very useful assumption under many circumstances, but one too easily forgets that it is false.
In an announcement that has stunned the computer industry, Ken Thompson, Dennis Ritchie and Brian Kernighan admitted that the Unix operating system and C programming language created by them is an elaborate April Fools prank kept alive for over 20 years. Speaking at the recent UnixWorld Software Development Forum, Thompson revealed the following:
The anthropomorphic analogy is misleading. A computer program is a formal system, not a collection of living beings. Forget that at your peril.
Imagine for a moment that programmers were constantly baited with snake oil: paradigm shifts this way! Agile scrum productivity boost ahead! An IDE that will astound you! A framework that solves the internets in only three lines of code!
Oh, that’s right. We don’t have to imagine.
The idea of a formal design discipline is often rejected on account of vague cultural/philosophical condemnations such as “stifling creativity”; this is more pronounced in the Anglo-Saxon world where a romantic vision of “the humanities” in fact idealizes technical incompetence. Another aspect of that same trait is the cult of iterative design.
GUI “variety”? Are you joking? The very reason there exists so much “variety” is that they all suck.
Lispers, however, take a radically different approach. We do not take for granted that the fundamental primitive objects are those that the OS and hardware provide. We believe that the OS and hardware should provide a much richer set of primitive objects than simple “raw seething bits”. Frankly, “raw seething bits” is an abstraction on what is really provided: a bunch of circuitry. If we can abstract out the wires, we can abstract out the bits.
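One concrete reading of “richer primitives than raw seething bits” (my sketch, not the quoted author’s): a language can expose integers as true mathematical integers rather than fixed-width machine words, abstracting the word size away exactly as the bits abstract away the wires. Python’s `int` happens to work this way:

```python
# An arbitrary-precision integer is a primitive richer than the
# machine's 64-bit words: overflow simply does not exist here.
big = 2 ** 200              # far beyond any hardware register
just_past = 2 ** 64 + 1     # one past the 64-bit boundary, still exact
print(big)
print(just_past)
```

Lisp bignums, Haskell’s `Integer`, and Erlang’s integers make the same choice: the programmer sees the mathematical object, and the word-sized circuitry underneath is the implementation detail.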