The High Price Of Listening Too Closely To Academics

Almost everyone’s desktop or laptop computer has a derivative of the Intel x-86 architecture as its central brain.

About 25 years ago, academics declared the x-86 architecture doomed and started development of new architectures based on RISC and VLIW ideas.

I was on the design teams of most of the RISC and VLIW projects intended to replace the x-86 architecture, including MIPS, IBM PowerPC and Intel Itanium. Despite being engineering successes, all of the new architectures were doomed from the start from a marketing point of view, because there was a huge legacy x-86 software base, and because there was never anything wrong with x-86 that couldn’t be fixed.

As a result of listening to clueless academics, companies like IBM, Intel and Motorola threw tens of billions down the drain. The founder of Compaq (Rod Canion) went from being the most successful CEO in history to unemployed in less than two years, because he listened to academics’ BS about microprocessors. Even Intel wasted billions of dollars trying to kill its own architecture, and then fell behind AMD for a few years. All because they listened to blowhards in academia instead of staying focused on their core competence.

Academics occasionally provide some useful information, but their primary job is to obtain funding and recognition. And that normally requires turning the BS up to volume 11.


45 Responses to The High Price Of Listening Too Closely To Academics

  1. MrX says:

    As a programmer who was studying in the early 1990s, I remember this all too well.

    The x86 is like the Borg. It assimilates and adapts to everything thrown at it. It started out as 16-bit. It then added a 32-bit mode, and now a 64-bit one. Today, both AMD and Intel are adding SIMD registers and opcodes. It is evolving, with each incarnation still able to run the very first programs written for the x86. As a side note, the Motorola 68K had a good run, but fell behind in performance, mostly, I think, because of the competition and because it ended up mostly in small devices (ironic in today’s world).
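    A minimal sketch of that layering in C, assuming only a compiler with the standard <immintrin.h> header; the _mm_* names are real SSE intrinsics, though the example itself is illustrative, not from the original comment:

        #include <immintrin.h> /* SSE intrinsics: one of the SIMD extensions bolted onto x86 */
        #include <stdio.h>

        int main(void) {
            float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];

            /* Legacy-style scalar loop: this much would have compiled
               for the very first 16/32-bit x86 machines. */
            for (int i = 0; i < 4; i++)
                out[i] = a[i] + b[i];

            /* The same work with SSE: one 128-bit add across four floats,
               coexisting with the old scalar code on the same chip. */
            __m128 va = _mm_loadu_ps(a);
            __m128 vb = _mm_loadu_ps(b);
            _mm_storeu_ps(out, _mm_add_ps(va, vb));

            printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
            return 0;
        }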

    The climate is similar in that it can adapt. That adaptability is always the biggest underestimation. Things that can adapt are difficult to predict. They will do things later that don’t seem likely today.

    For another case of things that don’t adapt falling behind in technology, look at the history of 3Dfx. In 1996, S3 owned 50% of the 3D market. Nearly overnight, 3Dfx had 85% of the market with its Voodoo video card. But it was 3D only. ATI and Nvidia then made cards that combined 2D and 3D. 3Dfx sat back for too long. By the time they tried to catch up (and they were still ahead in performance), the other companies were already planning the next phase. 3Dfx disappeared in a few short years.

    This same mistake happens over and over and over. It has happened before and is happening now with climate scientists. They underestimate the adaptability of the climate. But most of all, they’re circlejerking around how they think the world should operate instead of how it actually operates.

    • daveandrews723 says:

      Mann, Hansen, et al. think they have reinvented the wheel with “CO2 as climate driver.” They and their disciples will go down fighting, even in the face of reality. Their pride, egos and reputations are at stake now. Science be damned.

    • darrylb says:

      Adaptability may, for the most part, be considered negative feedback. In climate science language, the earth is constantly reacting to forcing agents by producing negative feedbacks that push toward equilibrium. However, it never reaches equilibrium, because there are constant changes in the net value of the forcing agents.
      In the general climate change hypothesis, the transient climate response (TCR) is estimated to be a 3 deg C rise in temperature due to a doubling of the quantity of atmospheric CO2.
      However, in the models there is only a relatively small temperature change due to the increase in CO2 itself. Most of the 3 deg C response is really a shot-in-the-dark hypothetical positive feedback (sketched below), based primarily on guesses of increased water vapor in the lower atmosphere (water vapor being the most significant greenhouse gas). A second predicted positive feedback is loss of albedo (snow and ice).
      Observations indicate these feedbacks are not happening, and so the entire groupthink is in shambles. There was no historical data upon which to base the feedbacks, only wishful thinking to maintain careers and money.
      When some scientists start to realize what is happening and, with integrity, publicly state their change in understanding, they receive wicked vitriol from the AGW gatekeepers for stating the wonderful news that there is no significant worry.
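      For reference, the arithmetic behind that feedback claim can be sketched with the standard zero-dimensional feedback relation (the roughly 1.1 deg C no-feedback value is a common textbook figure, an assumption not taken from this thread):

          \Delta T \;=\; \frac{\Delta T_0}{1 - f}, \qquad \Delta T_0 \approx 1.1\,^{\circ}\mathrm{C}, \quad \Delta T = 3\,^{\circ}\mathrm{C} \;\Rightarrow\; f \approx 0.63

      That is, roughly two thirds of the cited 3 deg C comes from the assumed feedback factor f, not from the CO2 forcing itself.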


  2. Gamecock says:

    In the mid-80s, I went to a demonstration of a DEC [Digital Equipment Corporation] desktop computer. It ran RSX-11M. Delightful. I asked what software was available for it.

    “Duh” was all I heard.

    I can’t remember what the machine was called. Even researching DEC now, I can’t find it. Stillborn, for the same reason Steven cites.

    • nielszoo says:

      I used to use DEC gear for motion control systems; polyForth was the language of choice, and it was amazingly adaptable. I think they’re still around.

  3. tom0mason says:

    Should academics, no matter how well qualified, have a greater say in how everyday public policy is derived and applied? Have academics any proven record of improving outcomes when consulted?
    IMO, this too often happens against the electorate’s wishes and representatives’ stated policies, and the results are often worse than if the academics had been ignored.

  4. Stargazer says:

    I remember when the first IBM PC came on the market (until then such machines were called microcomputers). It was a giant leap backward in technology at that time. Altos, IMSAI (maybe IMSAI had already bitten the dust by then), OSI, etc. all had better machines running Digital Research’s CP/M and MP/M operating systems. But IBM legitimized the market, so it was all over for the others. Eventually, of course, IBM’s partnership with Microsoft (then a minor software utilities/language developer) revolutionized the industry. BTW: IMHO, DEC VAX computers ran circles around their IBM competition in the ’80s.

    • V. Uil says:

      And the Motorola 68K and the Zilog Z8K knocked spots off the Intel chips, even Intel’s bizarre straight-from-academia project, the iAPX 432, which was designed to replace the x86 with an object-oriented processor.

      But marketing and software support destroyed all those dreams. Academics – and I was one at the time – simply did not understand the realities of the world.

      I guess they still don’t.

    • rishrac says:

      I liked the CP/M OS. Then the machine died. It was a sad day. I looked for a replacement but found none. The world had moved on to Windows.

    • Keitho says:

      Those VAX machines were simply awesome. Even the DEC support guys were absolutely brilliant. What happened to it all? Where did it go wrong?

  5. there is no substitute for victory says:

    Isn’t this “the x86 chip is dead” business a little like the Spanish explorers Coronado and de Soto looking for El Dorado or the Fountain of Youth?

  6. KevinK says:

    Yup, I had PowerPC and Intel Itanium computers; the available software was always just one step behind the x86 stuff. In theory those machines (PowerPC, etc.) could execute the software faster (that is, of course, if you actually had some software to execute, ha ha ha).

    Are you hinting that the academics who insist they have a theory proving that a minor trace gas in the atmosphere controls the temperature of the oceans might be mistaken? Perish the thought, they are “climate scientists” after all…

    Cheers, Kevin

  7. R. Shearer says:

    Nullius in verba (take nobody’s word for it)

  8. Andy DC says:

    For someone like myself who knows very little about technology, that is an amazing story.

  9. David Talbot says:

    I worked at IBM supporting mainframes beginning in the mid-1960s. The 4 main operating systems at that time were PCP (Primary Control Program) for the larger System/360s, and DOS (or TOS or BPS) for the smaller 360s. While none of these were capable of secure multi-tasking, the hardware architecture was already built (with problem state and supervisor state) to support that under subsequent operating systems. Then in ~1980 along came the x-86 based personal computers, which “since they would NEVER run more than one program at a time, and would NEVER connect with other computers” (!?!) did not find it necessary to build a secure architecture. Without the active collusion of a mainframe’s system programmer, it is impossible to load a virus or any other deleterious software onto a mainframe.

  10. Scott Allen says:

    “Academics occasionally provide some useful information, but their primary job is to obtain funding and recognition. And that normally requires turning the BS up to volume 11.”
    No truer words were ever spoken.
    For an example, look no further than PhD and Nobel laureate Paul Krugman, advisor to Enron.

    • KTWO says:

      Krugman now has the dial set above 12.

      The article stirred memories. As we look back, we can see that even the experienced and well qualified often made bad choices regarding processors, software, etc. You get no crystal ball along with your job title.

      A more recent example was HDTVs. Not too long ago it wasn’t clear whether LCDs or plasmas would prevail, and knowing how each worked wasn’t much help. Even the DLPs looked promising, because I thought they would become dirt cheap to manufacture.

  11. Curt says:

    You do have to consider other markets besides PCs. The market for smartphones, tablets, and embedded devices such as digital video recorders and servers is booming (while the PC market is shrinking), and x86/Microsoft does not have a dominant position in those markets.

    • And those chips sell for a few pennies over cost.

    • Tel says:

      The winner in the phone market was ARM, a RISC architecture that came along some time before IBM PowerPC, Itanium and DEC Alpha were even thought of.

      It all started with the Acorn Archimedes in 1987 (a very academic project if ever there was one), and somehow the ARM just became the world’s premier RISC design after a couple of decades of waiting. It made a big difference that Intel bought up the StrongARM line from DEC and got behind it.

      By the way, PowerPC is still used for engine management, and MIPS turns up in Cisco routers and similar network devices. Not completely dead, but unlikely to end up on a desktop soon.

      Motorola kind of dropped the ball with their ColdFire design; it should have ended up in phones, but ARM just beat it.

  12. B says:

    Motorola processors were competition to Intel going back to the beginning. Motorola wasted money because, well, it was MOT and they did that a lot, but it wasn’t because they kept competing for business. Apple, Atari, Amiga, and others were Motorola based going way, way back.

    There were others that were just in other market segments and/or lived in other market niches, not trying to replace x86. For instance, back in the day 3D CAD modeling was done on SGI, Sun, and HP workstations. NeXT boxes had their place in other things. x86 machines couldn’t do this stuff until… later. So from a late-1990s point of view it was these other processor-based machines that were doing the heavy lifting; that’s why they looked like the future.

    What ended up happening is that x86 invaded every niche and pushed its way into every market segment until not much else was left. Why? It was cheap and could run the market-dominant software. I didn’t get a Sun and an x86 Windows PC. I didn’t get a Sun with native software for email, spreadsheets, etc… I got a Sun for most of my work and then had to run a Windows emulator for everything else. Once the x86 could run ProE and SolidWorks well enough, it took over. Companies refused to buy expensive workstations and hire expensive unix admins to take care of them. That was that.

    • Tel says:

      Motorola split their attention between the 68k CISC architecture and the 88k RISC architecture.

      People liked programming the 68k, which got into many desktop machines at the time: Amiga, Apple Mac, Atari ST, Sun, Apollo workstations, SGI workstations, HP-UX workstations. The 68k was the first chip with a good-quality MMU that could do virtual memory properly, and it was early off the mark with a strong FPU design (Intel was well behind on both of those at the time).

      I remember at the time everyone thought the 88k would somehow do something cool, but it just never did. Motorola pissed away their advantage and never pressed ahead with 68k upgrades. Slowly their customers lost interest, so Sun went their own way and made SPARC, which is still selling under the Oracle brand for database servers (and web servers, believe it or not). Apple moved to PPC and eventually gave up and switched to Intel. Amiga just went broke, and DEC ended up inside HP (via Compaq), which wasted money on Alpha.

      That was the big turning point, when they all abandoned the 68k architecture.

    • Ivan says:

      “…and hire expensive unix admins to take care of them”
      It wasn’t just that they were expensive; it was that they inhabited their own universe.
      They were almost impossible to communicate with using any known dialect or derivative of English.

      • Tel says:

        So they ended up hiring 4 times as many Microsoft admins to do the same job. The MS admins dressed neatly, said all the key business words, and repeated the only two maintenance operations they knew: reboot a few times, and if that doesn’t fix it, reinstall everything.

        The cost was always irrelevant because IBM and MS managed to position themselves as the default option. They got their key words into the common computer lingo, so people who knew very little about computers would be comforted by knowing a little bit about Microsoft. Key marketing at the right time.

    • Keitho says:

      SPARC… ’nuff said.

  13. KTM says:

    “I believe that a scientist looking at nonscientific problems is just as dumb as the next guy” – Richard Feynman

    The same is true of lawyers looking at anything non-legal, yet we have hordes of lawyers deciding how to conduct war operations, how to contain an Ebola outbreak, how to run economic policy, where to focus science dollars, etc.

  14. philjourdan says:

    The RISC market has a niche. What people do not understand is that whether the complex instructions live on the chip or get spelled out in higher-level code, they are still needed for general-purpose computers (for other purposes, like your car’s windshield wipers, a RISC chip may be all you need, since the instructions required are simpler).

    That is what doomed RISC on the desktop. If you think code on the x86 platform is bloated, try spelling out, at EVERY function call, the work a CISC chip embeds in single instructions, and you get real code bloat. PCs (and that includes servers) all basically do the same thing, and they were “designed” around the Intel CISC chip. So to move to RISC, you had to change the paradigm radically. And the installed base (as Steven correctly pointed out) was not going to move. There was no price/performance incentive to do so.

    RISC is still around. It is in everyday life. But most people do not see it; they just hear their refrigerator telling them the water filter needs to be changed.

  15. methylamine says:

    Steven, I often use an analogy when I’m explaining software architecture to my developers.

    It’s the Chevy small block vs. Ferrari V8 argument: The Ferrari V8 is a miracle of beautifully intricate engine design and manufacture, exquisite in its every detail. Modern, taking advantage of every development, it’s highly stressed to extract the most from every component.

    And it costs roughly $50,000.

    The Chevy small block is 1940s technology; engine purists sneer at its primitive design… “pushrods? What is it for, a farm implement?”

    And you can buy one in a crate for a couple grand. They’ve made over 100 million of them.

    Never discount the power of the market to take something cheap and simple and overcome its engineering compromises by sheer numbers and popularity.

    And don’t write Ferrari code that nobody can work on. That Ferrari V8? Yeah, you have to drop the engine out to change the plugs.

    • But x-86 is superior to RISC in many ways.

      • methylamine says:

        Still, Intel is RISC internally… they’re just translating from x-86 in silicon.
        But that doesn’t negate your argument: x-86 won, despite academia’s bloviations, by sheer market force.

        • x-86 has a huge advantage in icache code density. RISC processors with competitive compute power are more expensive to manufacture as a result. Academics walked off the “memory is cheap” plank, without bothering to think about how expensive static cache RAM is.
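          A minimal sketch of that icache-density gap, using one line of C and illustrative (not cycle-exact) encodings in the comments; the byte counts assume classic 32-bit x86 versus a fixed-width 32-bit RISC such as MIPS:

              /* One read-modify-write in C, and what each camp's
                 instruction stream typically looks like. */
              void accumulate(int *total, int x) {
                  /* x86 folds load-add-store into one memory-destination
                     instruction:   add [eax], edx       ; ~2 bytes
                     A classic load/store RISC needs three fixed 4-byte ops:
                                    lw  t0, 0(a0)        ; 4 bytes
                                    add t0, t0, a1       ; 4 bytes
                                    sw  t0, 0(a0)        ; 4 bytes
                     Roughly 12 bytes of icache versus ~2: the density gap
                     that makes an equivalent RISC part need a bigger,
                     costlier instruction cache. */
                  *total += x;
              }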

        • methylamine says:

          And, to torture an already tenuous analogy…how like the compactness advantage of the pushrod Chevy small block, lending it to tight packaging constraints!

          I’ve lost touch with processor architecture… do Intel chips store x86 in their code cache, or the translated RISC? Pardon my ignorance…

        • The whole point of CISC is high code density and efficiency. The RISC-style internal instructions are generated by microcoded decoders after fetch, so it is the dense x-86 code that sits in the instruction cache.

  16. Keitho says:

    This thread is the song of my people. 🙂

    I started out programming a Toshiba with levers and nixie tubes in 1969. *sigh*
