"I've been commissioned to design a roadway for the city, and I've come up with a great design! It assumes that everyone has V12 cars...But come on, V12s have been around forever. Isn't it way past time that all those 4-cylinder owners finally upgraded? I'll be damned if I'm going to compromise my wonderful design and take slightly more development time just to cater to the few people still living in the stone age."
Hypothetical, obviously. But it demonstrates exactly why programmers who trot out the "640k should be enough for everyone" show horse to defend their consumer-whoreism approach to development piss me off. (Well, that, and the fact that Gates never actually said it.)
I'll certainly grant that there are legitimate uses for 64-bit and multi-core. But this whole attitude of "Something that doesn't emit 64-bit is useless" and such has gotten ridiculously out of hand. Most people and programs don't need 64-bit or multi-core. Sure, a few do. And sure, many things can be better with 64-bit or multi-core - but they don't fucking need it. The notion that they do is a load of high-octane V12 bullshit.
This is the point where I inevitably get a bunch of crap about "But that's all the stores sell!" So what? Is that all that's in common use? Of course not. I don't know about you, but I develop for the hardware that people have, not hardware they might get if and when they decide to go buy something new (never mind the second-hand market...you know...like eBay...maybe you've heard of it?). And when I optimize something to run well on the lower end, guess what? It'll still run even better on the V12s. Even more so, since mine isn't one of the inevitable three or more programs on the user's system that all simultaneously believe their optimizations can rely on having at least a couple of cores to themselves.
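For what it's worth, the sane alternative to "assume I own a couple of cores" is trivial to write. Here's a minimal sketch (Python, and the function name is my own invention, not from any particular program): ask the machine how many cores it actually has and leave headroom, rather than hard-coding an assumption.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def pick_worker_count():
    # os.cpu_count() can return None on some platforms; fall back to 1.
    cores = os.cpu_count() or 1
    # Leave a core of headroom instead of assuming we own the machine --
    # the other programs on the user's system are making the same assumption.
    return max(1, cores - 1)

# Works the same on a single-core netbook or a 16-core desktop.
with ThreadPoolExecutor(max_workers=pick_worker_count()) as pool:
    results = list(pool.map(lambda n: n * n, range(8)))
```

On a single-core box this degrades gracefully to one worker; on the V12s it scales up without anyone having to recompile anything.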
And of course there's embedded software. You know, that stuff that the self-centered "waste loads of resources, because my time is more important" programmers always seem to forget exists. Embedded 32-bit and/or single-core devices are going to be around for quite a while longer. Even the ones that don't stay 32-bit or single-core are still typically going to lag behind desktops, laptops and servers. And even aside from that, there's still power draw and battery life. All of which leads me to another reason for software developers to cut the consumer-whore crap:
True story: A certain OS developer kept making each version more bloated than the last. They did it because they were Moore-worshipers, plus the bloat led to more hardware sales, which, 90% of the time, came pre-packaged with their OS. Then they continued that with OS "Big Panoramic View 6", which completely fucked up their ability to compete in the emerging netbook and tablet markets: i.e., devices which were, guess what? Low-powered! Ah ha ha! Stupid fucks. So...are you behaving like Microsoft?
2 comments for "V12s And The 640k Show Horse"
True, too true. I'm an old Amiga user who watched as Symbian OS and all the other smaller-footprint systems routinely ignored the OS with the most intuitive UI and most efficient overall design, even though it did everything necessary and important in just 6MB. Watching Google hack Linux down to shoehorn it into these small forms is absurd while perfectly viable technologies such as this, originally designed for exactly this scale of system resources, continue to be disregarded.
You can't tell me, even granting that the video output stage would have had to be re-implemented, that there would have been more expense in porting AmigaOS to newer silicon, x86 in particular, than the hundreds of millions spent on various OS development endeavors bearing substantially identical design goals.
After all, BeOS, which was greatly influenced by AmigaOS, was almost bought by Apple to become its new OS, succeeding Mac Classic on the Motorola 68K. If Be's CEO had been content with $125,000,000 instead of the $175,000,000 he demanded, computer OS history and Apple's history would be very different now.
It's true; those of us technologists who enjoy, or at least prefer, efficiency and accuracy have been overwhelmed by those who prefer the glamor of raw power potential, unfettered by such practical considerations. I really do not see much improvement in word processing programs since WordStar and the earlier WordPerfect, at a time when these programs ran in just 4MB of RAM. The protocols and standards establishing PostScript, file format conventions, and X Windows, for example, were settled well before the prevalence of OOP, multi-core processing, and 4GB+ memory address spaces.
Thus, it is clearly glamorized immaturity that motivates various 'geeks' to throw more resources at technological challenges and problems before working assiduously to apply the tools and techniques already available.
I never knew that about BeOS and Apple, interesting.
I often feel that I really missed out on the Amiga. I started on the Apple IIc (which I loved: to this day, the Apple II is the only Apple product line I can say I'm overall pleased with) and then went straight from there to a 486 (SX2, IIRC). IIRC, that was right around the end of the Amiga's lifetime (not counting hobbyist/enthusiast use and resurrection attempts). Plus, I was still fairly young at the time and not very knowledgeable about the overall computer scene. So by the time I really started hearing much about the Amiga, it was all past-tense "Wasn't the Amiga great?" stuff that left me thinking "Shoot! I missed it!"