Intel has constantly tried to bring in visionaries, but failed over and over. With the exception of Jim Keller, Intel was duped into believing in incompetent people. At the critical juncture of the smartphone revolution it was Mike Bell, a full-on Mr. Magoo. He never did anything worth mentioning after his stint with Intel - he was exposed as a pretender. Eric Kim would be another. Murthy Renduchintala is another. It goes on and on.
Also critical was the failure of an in-house exec named Anand Chandrasekher, who completely flubbed the mega-project between Intel and Nokia to bring about Moblin OS and create a third phone ecosystem in the marketplace. Why would Anand be put in charge of such an important effort? In Intel's defense, the project was submarined by Nokia's Stephen Elop, who took over as their CEO and left Intel standing at the altar. (Elop was a former Microsoft exec, and Microsoft was also working on its own foray into smartphones at the time... very suspicious.) XScale was mishandled too: Intel had a working XScale phone prior to the iPhone being released, but Intel was afraid of fostering a development community outside of x86 (Ballmer once chanted "developers, developers, developers").
My guess is that ultimately Intel suffers from the Kodak conundrum, i.e. they have probably rejected true visionaries because their ideas would always threaten the sacred cash cows. They have been afraid to innovate at the expense of profit margins (short-term thinkers).
brcmthrowaway · 2h ago
Is Raja Koduri another phony?
acroyear · 2h ago
I don't know, tbh - I've heard both good and bad things. He was brought in after many of the problems had already become serious. He probably had a very difficult charter.
fidotron · 1h ago
The core problem at Intel is that they promoted the myth that ISA has no impact on performance to such a degree that they started fully believing it, while also somehow believing their process advantage was unassailable. By that time they'd accumulated so many worthless departments that turning it around at any point after 2010 was an impossibility.
You could be the greatest business leader in history, but you cannot save Intel without making most of the company hate you, so it will not happen. Just look at the blame game being played in these threads, where somehow it's always the fault of individuals newly discovered to be inept, and never the blundering morass of the bureaucratic whole.
I'll offer the viewpoint that the article reads like a listing of spec sheets and process improvements for CPUs of that era and not much else. Not really worth reading, imho.
I'd love some discussion on why Intel left XScale and went to Atom, and I think Itanium is worthy of discussion in this era too. I don't really want a raw listing of [In year X Intel launched Y with SPEC_SHEET_LISTING features].
MangoCoffee · 2h ago
>Itanium
IMO, Intel tried to take us from common, affordable CPUs to high-priced, "Intel-only" CPUs. Itanium was originally designed to use Rambus RAM, and it turned out Intel had a stake in that company. Intel got greedy and tried to force the market to go the way it wanted.
Honestly, AMD saved the x86 market for us common folks. Their approach of extending x86 to 64-bit and adopting DDR RAM allowed for the continuation of affordable, mainstream CPUs. This enabled companies to buy tons of servers for cheap.
Intel’s u-turn on x86-64 shows even they knew they couldn’t win.
AMD has saved Intel's x86 platform more than once. The market wants a common, gradual upgrade path for the PC platform, not a sudden, expensive, single-vendor ecosystem.
sbierwagen · 59m ago
Itanium didn't support RDRAM until Itanium 2.
deaddodo · 4h ago
> I'd love some discussion on why Intel left XScale and went to Atom
I thought it was pretty obvious. They didn't control the ARM ISA, and ARM Ltd's designs had caught up to and surpassed XScale's innovations (superscalar, out-of-order pipelining, MIPS/W, etc.). So instead of innovating further there, they decided to launch a competitor based on their own ISA.
KerrAvon · 3h ago
Intel at the time was clear about it: they wanted to concentrate fully on x86. They thought they could do everything with x86; hadn’t they already won against their RISC competitors by pushing billions into x86? Why would ARM be any different? Shortsighted, in hindsight, but you can see how they got there.
ianand · 5h ago
The site’s domain name is the best use of a .fail tld ever.
jagged-chisel · 49m ago
OT from TFA, so hijacking your thread …
I don’t recall if there was ever a difference between “abort” and “fail.” I could choose to abort the operation, or tell it … to fail? That this is a failure?
¯\_(ツ)_/¯
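For what it's worth, here's a minimal sketch of what the choice meant to a program sitting behind that old DOS "Abort, Retry, Fail?" prompt, assuming the DOS 3.3+ semantics (the drive and filename are just placeholders): picking Abort terminated the program on the spot, while picking Fail let the interrupted call return an error so the program could handle it.

    #include <stdio.h>

    int main(void)
    {
        /* Try to open a file on a drive with no disk in it (placeholder path).
         * On DOS this is the kind of operation that triggers the
         * critical-error prompt: Abort, Retry, Fail? */
        FILE *f = fopen("A:\\REPORT.TXT", "r");

        if (f == NULL) {
            /* "Fail" path: the interrupted DOS call returns an error to the
             * program, which keeps running and can recover or inform the user. */
            puts("Could not open the file - continuing anyway.");
            return 1;
        }

        /* "Abort" would never reach this point: DOS terminates the program
         * at the prompt, so none of its error handling or cleanup runs. */
        puts("File opened.");
        fclose(f);
        return 0;
    }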
ashvardanian · 5h ago
The article mostly focuses on the 2008-2014 era.
dash2 · 3h ago
These are the years when Intel lost dominance, right? This article doesn't seem to show much insight as to why that happened or what caused the missteps.
BearOso · 3h ago
Intel really lost dominance when 14nm stagnated. This article only goes up to that point.
mrandish · 1h ago
Yep, in 2014 Intel's Haswell architecture was a banger. It was one of those occasional node+design intersections that yields a CPU with an unusually long useful lifespan, due to a combination of Haswell being stronger than a typical gen and the many generations that followed being decidedly 'meh'. In fact, I still run a Haswell i5 in a well-optimized, slightly overclocked retro gaming system (with a more modern SSD and GFX card).
About a year ago I looked into what practical benefits I'd gain if I upgraded the CPU and mobo to a more recent (but still used) spec from eBay. Using the system mainly for retro game emulation and virtual pinball, I assessed single-core performance, and no CPU/mobo upgrade looked compelling in real-world terms until at least 2020-ish - which is pretty crazy. Even then, one of the primary benefits would be access to NVMe drives. It reminded me how much Intel under-performed and, more broadly, how the end of Moore's Law and Dennard scaling combined around roughly 2010 to end the 30+ year 'Golden Era' of scaling - the era that gave us computers which often roughly doubled performance across a broad range of applications, in a way you could feel in everyday use, AND at >30% lower price, every three years or so.
Nowadays an 8% to 15% performance uplift across mainstream applications at the same price is considered good, and people are delighted if the performance gain is >15% OR if the price for the same performance drops >15%. If a generation delivered both >15% more performance AND a >15% lower price, it would be stop-the-presses newsworthy. Kind of sad how far our expectations have fallen compared to 1995-2005, when >30% more perf at a >30% lower price was considered baseline, >50% more at ~50% lower was good, and roughly double the perf at around half the price was "great deal, time to upgrade again boys!".
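To put rough numbers on that contrast, here's a back-of-the-envelope sketch using only the figures quoted above plus an assumed two-year cadence for modern generations (that cadence is my assumption, not a measurement): doubling performance at half the price every three years works out to about 4x perf-per-dollar per generation, or roughly 59% per year, versus roughly 7% per year for a 'good' 15%-at-the-same-price generation.

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        /* 'Golden Era' cadence quoted above: ~2x performance at ~half the
         * price, roughly every 3 years. */
        double old_perf = 2.0, old_price = 0.5, old_years = 3.0;
        double old_ppd = old_perf / old_price;              /* 4x per generation */
        double old_annual = pow(old_ppd, 1.0 / old_years);  /* ~1.59x per year  */

        /* A 'good' modern generation: ~15% more performance at the same
         * price, on an assumed ~2-year cadence. */
        double new_perf = 1.15, new_price = 1.0, new_years = 2.0;
        double new_ppd = new_perf / new_price;
        double new_annual = pow(new_ppd, 1.0 / new_years);  /* ~1.07x per year */

        printf("Golden Era: %.1fx perf/$ per gen, ~%.0f%% per year\n",
               old_ppd, (old_annual - 1.0) * 100.0);
        printf("Today:      %.2fx perf/$ per gen, ~%.0f%% per year\n",
               new_ppd, (new_annual - 1.0) * 100.0);
        return 0;
    }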
igtztorrero · 5h ago
The Atom model was the breaking point for Intel. No one forgives them for wasting their money on Atom-based laptops, which were slower than a tortoise. Never play with the customer's intelligence.
AlotOfReading · 2h ago
I was working as a contractor in this period and remember meeting a thermometer company. They had made the extremely questionable decision to build their product with Intel Edison, which used an even lower-performance product line called Quark. The Edison chips baffled me: worse performance than many ARM SoCs at the time, far worse efficiency, and they cost so much. That thermometer had a BOM cost of over $40 and barely enough battery life for its intended purpose.
Demiurge · 5h ago
I've always wondered how some smart companies, or smart film directors, or smart musicians, can fail so hard. I understand that sometimes it's a matter of someone abusing a project for personal gain - some CEOs and workers just want to pitch, pocket the money, and move on - but the level of absurdity of some of the decisions made is counter-productive even for the 'get rich quick' scheme. I think there are self-perpetuating, echo-chamber self-delusions at play. Perhaps this is why an outside perspective can see the painfully obvious. This is probably why having some churn with the outside world, and understanding what the unbiased outside opinion actually is, is very important.
foobarian · 2h ago
At some point organizations get taken over by the 9-5 crowd who just want to collect a paycheck and live a nice life. This also leads the hard-driving talent to leave for more aggressive organizations, leaving behind a more average team. What leaders remain will come up with not-so-great ideas, and the rank and file will follow along because there won't be a critical mass of passionate thought leaders to find a better way.
I don't mean to look down on this kind of group; I am probably one of them. There is nothing wrong with people enjoying a good work-life balance at a decent-paying job. However, I think the reality is that if one wants a world-best company creating world-best products, this is simply not good enough. Just as a team of weekend warriors would not be able to win the Super Bowl (or even get anywhere close to an NFL team) - which is perfectly fine! - it's not fair to expect an average organization to perform world-champion feats.
mattkevan · 2h ago
Disagree. 9-5 working is fine, and probably more efficient long term than permanent crunches.
Organisations fail when the 'business' people take over - people who let short-term money-thinking make the decisions instead of good taste, vision, or judgement.
Think of Intel turning down the iPhone chips because they didn't think it'd be profitable enough, or Google's head of advertising (the same guy who killed Yahoo search) degrading search results to improve ad revenue.
Apple have been remarkably immune to it post-Jobs, but it’s clear that’s on the way out with the recent revelations about in-app purchases.
keyringlight · 2h ago
I wonder if there will be a similar situation at Nvidia, which apparently has a challenge with so many of its employees being rich now that the stock has rocketed up in value - which could raise concerns about motivation, or about whether skilled and knowledgeable employees will leave.
iwontberude · 37m ago
I think many Nvidia employees will stick around because their newfound wealth, plus being at the biggest, most important company in the world, gives them insight into the market they invest in. I make an order of magnitude more day trading than as a software engineer at a Mag7 company, and I stay employed for the access to the way modern businesses think. Companies like mine are an amalgamation of management and engineering from other Silicon Valley companies, so the tribal knowledge gets spread around to my neck of the woods.
nikanj · 3h ago
Essentially no organizations actually reward telling your superiors that they're wrong. You pretend to sip the Kool-Aid and work on your resume. If one or two high-ranking leaders are steering the ship toward the rocks, there's basically nothing the rank and file can do.
KerrAvon · 3h ago
It doesn’t even have to be that negative. With the best intentions in the world, it’s rare to have a CEO who is fundamentally capable of understanding both the technology and the viable market applications of that technology. Steve Jobs didn’t manage to do it at NeXT.
acroyear · 2h ago
NeXT was a failed rocket launch (analogous to SpaceX's early rocket failures) - a great step forward and a necessary step in the evolution of the PC. I thought NeXT workstations were pretty bad-ass for their time and place.
Recall that only three years prior to NeXT, we had computers like the Atari ST... what a vast difference!
iwontberude · 5h ago
I could tell they were cooked when they bought McAfee.
acroyear · 2h ago
Yes, this was a direct consequence of the Craig Barrett mentality. Intel wanted a finger in many pies, since it could not predict what the next 'thing' would be. So they went on multiple acquisition sprees hoping to strike gold on something. I can't think of a single post-2000 acquisition that succeeded.
jbverschoor · 2h ago
They what??
acroyear · 2h ago
Atom was shit - a desperation move. I was so embarrassed to have recommended a Poulsbo laptop to a friend; it was the worst machine I have ever seen.
aurizon · 2h ago
Intel is a failed monopolist, unlike Apple! So is IBM with MCA (Micro Channel Architecture).
acroyear · 2h ago
Yes, they tried with the 'Compute Continuum', but it never panned out. They spent loads of bandwidth and money trying to bring that vision into being, and it failed miserably. They assumed every user would have a smart TV, smartphone, tablet, and desktop, all running their hardware and software. Turns out, no - they won't. They didn't "see" that the phone would come to dominate the non-business segment the way it has.
jbverschoor · 5h ago
Their domain name probably accounts for most of their market cap.