97% of academics were convinced during the 1990s that Intel's x86 microprocessor was doomed. Companies like IBM wasted billions of dollars based on clueless academic opinions.
Almost all Windows and Macintosh PCs are now based on derivatives of Intel's x86 architecture.
Strangely enough, almost every mobile phone is running a descendant of the 1981 "BBC Computer Literacy Project". Not many people predicted that would happen.
http://en.wikipedia.org/wiki/BBC_Micro
The ARM processor has been easily licensed and modified by firms that wanted to add a few extra features to a CPU. My opinion is that the success of ARM is due more to the design owner being willing to allow others to change the design to better suit their needs.
On the other hand, I do remember the virtual snowstorm of academic papers proclaiming that RISC (Reduced Instruction Set Computing, basically no more than one memory access per instruction) processors were going to make all CISC (Complex Instruction Set Computing) processors obsolete very quickly.
There are some technically compelling arguments for RISC to win over CISC, yet the average user doesn’t give a flying flip about technical elegance and instead cares about just a couple of key issues. 1) Will it do what I want? 2) Is it inexpensive to own?
Based on those issues the x86 refused to die on command and instead dominated the market. Inertia in the installed software base did play a part as well. The DEC Alpha RISC CPU was fast and elegant, yet had very little software available and thus did not survive.
And ARM is on the rise because it's easy to integrate new features. (I use Cypress processors with amazing I/O features added to the basic ARM CPU.)
Intel and AMD figured out that they could put a RISC pipeline behind the x86 instruction set, which is much better because of its high code density. That was the part the academics were too dumb to understand. They thought that an orthogonal instruction decode was important, but it isn't.
An orthogonal instruction set is important if you have to write assembly code. From the late 1960s to the mid-1980s I did a huge amount of it. It was the only way to get any kind of performance out of micro CPUs. The 6502 and 68000 were a dream to program in assembler, almost as easy as writing in a higher-level language. The 8086, with its memory segmentation and incoherent instruction set, was a nightmare of nightmares.
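For anyone who never fought with it, here is a minimal sketch of what made 8086 real-mode addressing so painful (the specific values are only illustrative): every address is a 16-bit segment plus a 16-bit offset, so no plain pointer can span more than 64 KB, and many different segment:offset pairs alias the same physical byte.

    #include <stdint.h>
    #include <stdio.h>

    /* 8086 real mode: physical address = segment * 16 + offset.
       Both halves are 16 bits, so each pointer reaches only 64 KB,
       and different segment:offset pairs can hit the same byte. */
    static uint32_t phys(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void) {
        printf("%05X\n", (unsigned)phys(0x1234, 0x0010)); /* 12350 */
        printf("%05X\n", (unsigned)phys(0x1235, 0x0000)); /* 12350 too */
        return 0;
    }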
Considering that most processors had a clock under 10 MHz until the late 1980s, the difference between the two classes of CPUs was important. When the chips broke the 10 MHz clock limit and had floating-point processors built in, assembly language was no longer necessary for anything except computer hacking.
But if you are a CPU manufacturer, you want a CISC instruction set because you get much higher instruction-cache efficiency for the same amount of real estate.
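A rough illustration of that density argument (the instruction sizes below are typical, not exact for every encoding): a read-modify-write that x86 expresses in one short variable-length instruction usually takes three fixed 4-byte instructions on a classic RISC, so the same code occupies more of the instruction cache.

    /* One C statement, two very different instruction footprints
       (typical sizes, not exact for every encoding or addressing mode):
         x86 / CISC:    add [counter], 1    ; single instruction, a few bytes
         classic RISC:  load  r1, counter   ; 4 bytes
                        addi  r1, r1, 1     ; 4 bytes
                        store r1, counter   ; 4 bytes  -> 12 bytes total */
    void bump(volatile int *counter) {
        *counter += 1;
    }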
@Lionell–YES! I remember the 6502/68000 very fondly…and the nightmare that was 8086 assembler. Seems the best programmers around me are the ones who cut their teeth on 6502/68000.
@steven–
My favorite analogy when planning/discussing architecture for software projects is the Ferrari V8 vs. Chevy small-block V8.
Never discount the power of sheer numbers and cheapness! There are 100 million Chevy small blocks, and counting… and perhaps ten thousand Ferrari V8s. The former is demonstrably a POS technically, and yet its weaknesses have been surmounted by tenacity, practicality, and cost.
I used a few Alpha 2100s as Sybase database servers. Worked great! Never had a hint of trouble from them.
DEC built bullet-proof equipment… A transmission test system I helped develop in 1991 was just decommissioned last December. Those VAXes were still running flawlessly, testing nearly a million transmissions a year.
However, Ken Olsen didn't think there was any need or market for personal computers, and DEC declined from there… Kinda like the "640K should be more than enough memory for anybody" decision…
I went to a DEC PC demo in Richmond in the mid-80s. It was essentially a desktop RSX-11M system. I could have used it, but the public couldn’t. The public needed software to run on them. Without games, spreadsheets, etc., it was useless to the public.
ARM is not really a descendant of that project, though there was some interaction in the development of ARM. The architecture is completely different.
That is correct, Steven; I work on these devices every day.
I have some amazing stories about working with ARM and Hermann Hauser, but unfortunately I can't tell them. ROFL
They are tied in with my Robbie Burns Night at the Flying Pig stories.
The smartest companies don’t even pay attention to academics, including climate scientists. Only politicians do.
Ah! Still remember my first real computer … an IBM PS/2 with an 8088 processor … thought I was the bee's knees with 2 x 3.5″ disc drives and 512K of memory. Gave it to my father-in-law and he still has it, although he's not up to using it nowadays.
I began my programming career (prior to college) writing games on a Texas Instruments TI-99/4A, which holds the distinction of being the first 16-bit personal computer, utilizing a 16-bit TMS9900 CPU running at 3.0 MHz. It boasted solid state software (ROM cartridges) and stored user data and programs on a typical audio cassette recorder/player.
Now I write code on machines ranging from 92-core IBM mainframes to ARM-based micro machines for embedded systems and energy devices. The world has indeed changed a lot since the original "Personal Computer" days.
Don't get me going on CS (Computer Science) academics. They are so freaking bright that in 1985, two 2nd-year college students majoring in CS entered the ACM (Association for Computing Machinery) programming contest. The contest was designed for 4-person teams, but these two students couldn't find anyone to join them, so, figuring they would get their clocks cleaned by the others, they decided to enter anyway, just for the fun of it. One of the other teams was stacked with four of the college's CS professors. Turns out, those two 2nd-year college students wiped the floor with everyone, INCLUDING the college profs, completing 9 of the 10 assigned tasks and beating everyone else by a margin of 9 to 2.
By the way, I was one of those two 2nd-year college students. … pfffttt … So much for academics, they don't know shit!
I have a rule of thumb about “teams” developing software:
A team of one can do the work of four.
A team of two can do the work of three.
A team of three can do the work of two.
A team of four can do the work of one.
A team of more than four can’t get much done other than writing progress reports and holding meetings about why the software is late, over budget, and still doesn’t work.
I work alone.
BUT BUT, But, but, Industry wants TEAM PLAYERS!
Yep, communications overhead… People do not understand that complexity requires communication overhead.
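A back-of-the-envelope way to see that overhead (the numbers are only illustrative, along the lines of Brooks' old argument): the pairwise communication channels in a team of n people grow as n(n-1)/2, so they multiply much faster than the hands available.

    #include <stdio.h>

    /* Pairwise communication channels in a team of n people: n*(n-1)/2. */
    static unsigned channels(unsigned n) { return n * (n - 1) / 2; }

    int main(void) {
        printf("%u %u %u\n", channels(2), channels(4), channels(9)); /* 1 6 36 */
        return 0;
    }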
In complex systems design, I have found a team of 2 to be optimum. In coding, one person is superior.
Yet government persists in the belief that you can get a baby in a month if you just assign 9 women to the task. And the results are little things like half-billion dollar websites that cannot and will not ever function properly.
Your rule of thumb is never more true than when deadlines are tight. That's when management starts adding staff, and that's when things start going downhill fast.
OT
Couple of useful maps.
Especially the distribution of population by latitude since we all have to emigrate to Antarctica sometime soon.
http://asheepnomore.net/2013/12/29/40-maps-will-help-make-sense-world/
Love the maps, especially the economic center of gravity one. Certainly brings home the destruction done by Bill Clinton and the World Trade Organization.
I wonder if there is a version of map 17 for highest paid US government employees by state and whether it correlates with Steven’s red vs blue voting counties map?
What about the alcohol consumption map – litres of “pure” alcohol. Turn that into beer percentages and the numbers skyrocket.
I dunno how much of a modern x86 processor is actually an x86 processor. I mean, it’s turtles all the way down, right?
They have pretty much the same instruction set architecture, I believe.
Comments so far show the unlikely success of team software projects. Let me show how teams made up of disparate disciplines are even less likely to succeed.
During my son's last year in college studying mechanical engineering, he had a course titled "Senior Project". A company in the private sector would set up a problem and pay for materials and incidentals. The professor running the project would select the team. My son's team was composed of three mechanical engineering students and two computer engineering students. The mechanical apparatus was the purview of the mechanical engineers, and the software/hardware to control the machine was delegated to the computer engineering students. It was a year-long project. My son's side had the mechanical parts all completed by the end of February. Meanwhile, the two computer engineering students were all talk and no action, just excuses.
The three mechanical engineering students didn't say anything, but they rigged up their own controller, wrote the software, debugged it, installed their hardware/software on the machine, and presented it to the professor and the company. IT WAS A JOB WELL DONE. The computer programming and design were essentially the work of one student who had a lot of similar experience: he had mapped out and programmed a fuel injection system for a Formula SAE race car as an extracurricular project during his junior year. That student is now a rocket scientist with a Ph.D.
However, the two computer engineering students went ballistic, as they felt they had been bypassed in the project. Their own professor complained on their behalf. Since they had no hardware/software solution of their own, nothing came of their complaints.
Dan Kurt
The computer science curriculum, with its emphasis on operating systems and programming structures, does not provide much, if any, instruction in the fine art of interfacing software and hardware, or, as I like to call it, "programming on the bare metal". Plus, the poor math skills and weak science education of many CS students hinder their efforts in any type of control system involving real-world physics. (I can freely criticize the CS curricula as I hold a CS minor and an EE major.)
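For illustration, a minimal sketch of what "programming on the bare metal" usually means in practice; the register address and bit position here are invented for the example, not taken from any real device.

    #include <stdint.h>

    /* Bare-metal I/O: the datasheet gives a fixed address for a peripheral
       register, and the code reads and writes it directly. The address and
       bit number below are made up purely for illustration. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

    void led_on(void)  { GPIO_OUT |=  (1u << 5); }  /* drive pin 5 high */
    void led_off(void) { GPIO_OUT &= ~(1u << 5); }  /* drive pin 5 low  */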
I can't count the times I've assisted in issues where the CS-educated programmer threw up his or her hands, said the problem had to be in the hardware (it wasn't), and then walked away. Your typical (Neanderthal-type) engineer will be reading specifications and/or disassembling the hardware to find the true source of the problem. (There was that incident where I nearly gave a senior engineering manager a heart attack when he discovered me in the process of field-stripping a brand-new 50k VAX computer to remove a jumper… You were supposed to call DEC field service for that, and I didn't want to waste a day waiting.)
To most (not all) CS programmers, the world ends where the software stops. And I suspect those two students basically didn't know where to start, as their education didn't prepare them to be "self-directed" in defining the problem and then solving it.
Thus you will find much of the embedded software development in the hands of people who work comfortably with the hardware and are not afraid to get their hands dirty or grab a soldering iron and make necessary changes to a circuit board, i.e., the engineers.
It was my experience that it was far easier to take someone out of the plant who understood the business and had some computing aptitude and teach them the computers than to take a CS grad and teach them the business.