"shareholder friendly" and "good for the company" are not at all the same things.
Instead of telling AMD they would be in the rearview mirror, they should have been paranoid: skip the stupid GPUs and crush the competition where it mattered.
EDIT: I personally always liked intel iGPUs because they were always zero bullshit on Linux minus some screen tearing issues and mumbo-jumbo fixes required in X11.
The "stupid" thing with them (maybe) is that they cannot do anything exceptionally good, and that while having compatibility problems. They are cheap yes, but there are many other chips for the same price, and while they are less capable, they are more compatible.
Make the A380 cost 30% less, or invest far more in the drivers, and IMHO it would have been a completely different story.
Driver stability, less heat and fan noise, and better battery life are almost assured with the Intel iGPU.
The only area where AMD's discrete GPUs are lagging behind is AI stuff. You get a lot more performance with Nvidia GPUs for the same price. For gaming, though, AMD is the clear winner in the price/performance space.
Of course, Nvidia is still a bit of a pain, but it's of their own making. You still need to install their proprietary driver (which IMHO isn't that big a deal), but the real issue is that if you upgrade your driver from, say, 550 to 555, you have to rebuild all your CUDA stuff. In theory you shouldn't have to do that, but in reality I had to blow away my venv and reinstall everything in order to get torch working again.
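For what it's worth, a quick sanity check after that kind of reinstall looks something like the sketch below (assuming a torch-based venv like the one described; a False from the availability check is the usual sign of a driver/runtime mismatch):

    # Sanity check after recreating the venv: does the torch CUDA runtime
    # still line up with the newly installed kernel driver?
    import torch

    print(torch.__version__)          # e.g. a +cu12x build
    print(torch.version.cuda)         # CUDA runtime this torch was built against
    print(torch.cuda.is_available())  # False usually means driver/runtime mismatch
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))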
I am trying to learn this but having difficulty finding good explanations. I know the Wikipedia-level overview, but need more details.
Apparently AMD has at least the Sony PS6 contract now.
Hard disagree here x100. Investing in GPUs at the time when Nvidia and AMD started to gouge the market is actually the best decision Intel has made in recent times. GPUs are the piece of the semiconductor business with some of the highest margins, and Intel already owns a lot of the patents and IP building blocks to make it happen.
The only stupid thing they did was not getting into GPUs earlier so they would already be on the market during the pandemic GPU shortage and AI boom.
There are a lot of strategic shifts Intel could do but their timeline to paying off at the scale Intel needs is very long.
What I see is a bunch of endeavours that get killed off too quickly because they were not paying off fast enough. That creates mistrust in the community around Intel's non-core initiatives, which makes them harder to succeed going forward. It is a bit of a death spiral.
Basically big companies find it hard to learn new tricks when their core offering starts to fail. The time to learn new tricks was a while ago, now it is too late.
With this logic, Apple should have also stuck to making Macs when it had financial troubles in 1999, since that was its focus, and not ventured into making stupid stuff like MP3 players and phones; everyone knows those were the core focus of Sony and Nokia, who would rule those markets forever.
Pointing to the fact that Einstein or Muhammad Ali or some other genius in their field could do something, or did do something, is not a counterpoint.
The CEO of company X may think that he's as talented as Steve Jobs, but is he really?
You’re off by a few years. Jobs axed any project not related to the Mac when he came back in 1997, so they actually did what you say they did not. The iPod project started around 1999-2000 when Apple was in a massive growth phase after the iMac and the G3 and then G4 desktops.
Also, in the alternate reality where the iPod did not massively succeed, it would very likely have been killed or remain as a hobby like the Apple TV. Apple might not be as huge as they are now, but the Mac was doing great at the time.
If they had focused more on mobile CPUs, GPUs, or especially GPGPUs a decade ago, they could have more product diversity now to hold them over.
Instead, they dipped their toes into a new market every few years and then ran away screaming when they realized how difficult it would be to gain market share.
If they had any actual vision, they could have a line of ARM CPUs now to compete with the onslaught in all major CPU markets.
They should have listened to their customers and market forces instead of trying to force x86 down everyone’s throats.
Disagree. Selling ARM chips with high profit margins is tough. There's too much competition from the likes of Samsung, MediaTek and, until the US ban, HiSilicon (Huawei). ARM chips are a race to the bottom in terms of price, with a market dominated by companies from Asia. There's no guarantee Intel could have had a competitive ARM design that could beat Apple's or Qualcomm's.
Yes, without proprietary technology, margins are slim.
But Qualcomm has somewhat succeeded in this area anyhow. I guess they took the ARM base but innovated on top of it in order to command higher margins.
It wasn't just "anyhow". Qualcomm succeeded in the mobile SoC space because they also had the best modems in the industry (the name comes from Quality Communications, after all), and the best mobile GPU IP, which they bought from ATI.
Intel and AMD can dominate the x86 market all they want. But x86 has been steadily losing ground every year to cheaper and more power efficient ARM processors. It’s still a big market now, but I don’t think it’s going to be a great business in a decade or two.
ARM was just an example. If Intel spent a decade strengthening their Larrabee GPGPU platform and building AI and crypto ecosystems on it, they may have been well positioned to benefit immensely like Nvidia has over the last 5 years.
Although their latest plan is to turn it into ARMv8 anyway: https://www.intel.com/content/www/us/en/developer/articles/t...
The main problem with x86 is that it has poor security, not poor performance.
GPUs exist because CPUs just aren't fast enough. Whether or not people are going to start making GPU-only computers is debatable (although there has clearly been a lot of CPU+GPU single chips).
A GPU-only computer would be absolutely horrendous to use. It'd be incredibly slow and unresponsive as GPUs just absolutely suck at running single-threaded code or branchy code.
The overall point is that work is being increasingly done _not_ on the CPU. If your business is CPUs-only then you're going to have rough times as the world moves away from a CPU-centric world. You don't need to predict AI; you just need to see that alternatives are being looked at by competitors and you'll lose an edge if you don't also look at them.
It's not going to matter much if you have a crappy CPU if most of the work is done on the GPU. It's like how iPhones don't advertise themselves as surround sound; phones aren't about calling people anymore, so there's no reason to advertise legacy features.
Eh? GPGPU has been a thing for decades and yet barely made a dent in the demand for CPUs. Heck, CUDA is 17 years old!
The world has not budged from being CPU-centric and it isn't showing any signs of doing so. GPUs remain an accelerator for specialized workloads and are going to continue to be just that.
Of course they also add GPU-like things to the CPU too for the same reason: https://developer.arm.com/documentation/109246/0100/SME-Over...
Undercut the big boys with affordable on-prem AI.
Intel should have appointed people from the HN comment section as their CEO, as they clearly know more about running a giant chip design and fabrication company than the guy who worked there for 10+ years.
I have my thoughts on the matter and cautiously welcomed their move to GPUs (though admittedly on the basis that we -- consumers -- need more than the AMD/Nvidia duopoly in that space, so I am not exactly unbiased).
That's just speculation. There's no guarantee that would have happened. Nobody has a crystal ball to guarantee that as the outcome.
It's like saying if someone would have killed Hitler as a baby, that would have prevented WW2.
What do you think has happened so far?
Your mental model of the world may help me understand the point you are trying to make.
That's just speculation from people online. I don't see any wisdom in it like you do; all I see is a guessing game from people who think they know an industry when they don't (armchair experts, to put it politely).
What made Nvidia dominant was not weak GPUs with a lot of RAM. The puzzle of their success had way more pieces that made the whole package appealing over many years, and a great timing of the market also helped. Intel making underperforming GPUs with a lot of RAM would not guarantee the same outcome at a later time in the market with an already entrenched Nvidia and a completely different landscape.
But since they obviously apply just as well to Intel itself, it is a poor reason to dismiss other’s ideas.
—
> What made Nvidia dominant was not weak GPUs with a lot of RAM.
Intel doesn’t have the luxury of repeating NVidia’s path in GPUs. NVidia didn’t have to compete with an already existing NVidia-like incumbent.
That requires no speculation.
—
Competing with an incumbent via an underserved low end, then moving up market, is called disruption.
It is a very effective strategy since (1) underserved markets may be small but are immediately profitable, and (2) subsequent upward growth is very hard for the incumbent to defend against. The incumbent would have to lower their margins and hammer their own market value.
And it would fit with Intel’s need to grow their foundry business from the low end up too.
They should take every low-end underserved market they can find. Those are good cards to play for ambitious startups and comebacks.
And the insane demand for both GPUs and chip making is increasing the number of such markets.
True, it is just speculation. 'Any' seems to be a strong qualifier, though. One of the reasons I trawl the landscape of HN is that some of the thoughts and recommendations expressed here have ended up being useful in my life. One still has to apply reason and common sense, but I would not dream of saying it holds no ('any') wisdom.
<< What made Nvidia dominant was not weak GPUs with a lot of RAM.
I assume you mean: 'not in isolation'. If so, that statement is true. AMD cards at the very least had parity with nvidia, so it clearly wasn't just a question of ram.
<< The puzzle of their success had way more pieces that made the whole package appealing over many years, and a great timing of the market also helped.
I will be honest. I am biased against nvidia so take the next paragraph for the hate speech that it is.
Nvidia got lucky. CUDA was a big bet that paid off first on crypto and now on ai. Now, we can argue how much of that bet was luck meets preparation, because the bet itself was admittedly a well educated guess.
To your point, without those two waves Nvidia would still likely be battling AMD over incremental improvements, so great market timing accounts for the majority of its success. I will go as far as to say that we would likely not see a rush to buy 'A100s' and 'AI accelerators', with the exception of very niche applications.
<< Intel making underperforming GPUs with a lot of RAM would not guarantee the same outcome at a later time in the market with an already entrenched Nvidia and a completely different landscape.
Underperforming may be the key word here, and it is a very broad brush. In what sense are they underperforming, and which segment are they intended for? As for RAM, it would be kinda silly in the current environment to put a new card out with 8GB; I think we can agree on that at least.
<< I'm saying nobody can guarantee the claim of the GP I've replied to,
True, but that is true for just about every aspect of life, so as statements go it is virtually meaningless as an argument. The best one can do is argue possibilities based on what we do know about the world and the patterns it tends to follow.
The market for that is just not that large; it wouldn't move the needle on Intel's revenue. But then again, it could get the enthusiasts onboard and get Intel's CUDA alternative moving.
But then you move up, and the incumbents have to choose between ceding more of their lower end or lowering their margins. And it is very hard and costly for a company succeeding at the high end to do the latter.
That would have been a fantastic sign Intel was getting more nimble and strategic.
And it would have been a good fit with a comeback in the low end of fabs too.
It is something that is easy to miss if you are just looking at typical business strategy and finances. A high memory consumer GPU would undercut their server GPUs, which are presumably higher margin intended golden geese. It's easy to see them chasing server markets and "gamers" being an afterthought.
However, there is huge demand right now for a modern, even a crappy modern, GPU with gobs of memory. Make the card and the open source AI tooling for it will start sprouting within days of its release.
It's an extremely powerful position to have every at-home AI geek's setup bound to Intel cards and Intel-focused tooling. Nvidia and AMD won't do it because they want to protect their server cards.
So, an incredibly small market share while your competitors already have the first-mover advantage and have nailed down the ecosystem? With no data backing it up, I think graphics cards for local LLM needs are not really in demand. Even gaming is probably a more attractive market, but then again, that's not even where the real money is.
Exactly. This x100. It was easy for Nvidia to succeed in the LLM market by winging it, in the days when there was no LLM market, so they had the greenfield and first-mover advantages.
But today, when Nvidia dominates the mature LLM market, Intel winging it the same way Nvidia did, won't provide nearly the same success as Nvidia had.
Ferruccio Lamborghini also built a successful sports car company by building tractors and cars in his garage. Today you won't be able to create a Lamborghini competitor with something you can build in your garage. The market has changed unrecognizably in the meantime.
The people learning how to do local LLMs will be the people directing build out of on-prem transformers for small-midsize companies. The size of the market is irrelevant here, it's who is in that market and the power they will have that is extremely relevant.
AMD has tried this for many of its technologies and I don't think it is working. Granted, they suck at open sourcing, but a shitload of it was open sourced. See TinyGrad's voyage into the Red Box driver (streams on YouTube).
It's either old slow Tesla cards with 48GB or $2000 nvidia cards with 24GB.
I think you're overestimating what people can and will do.
Nvidia didn't succeed because it just launched cards and let people write CUDA for them. Nvidia is where it is because it has an army of researchers and SW engineers developing the full stack, from research papers, to frameworks, to proofs of concept, showing customers the value of paying for their pricey HW + SW, most of it proprietary, not community developed.
"People" alone won't be able to get even 10% there. And that's ignoring the fact that Nvidia HW is not FOSS so they'd be working blind.
The current local model/open source model community is literally an army of SWE's and researchers. They already make tons of tooling too.
We can't see the future, but neither can CEOs, no matter how well paid and respected they are.
After all the current CEO is being ousted, so obviously he didn't do the right things despite being a CEO.
Some are probably multi millionaires smurfing (and I dont mean cryptobros).
Do you even have a Putnam award?
Intel has been "getting into" GPUs for two decades now going back to Larrabee. They are just not good at it.
Engineering chops?
AMD and nVidia already patented too much of the good stuff?
Too much existing code optimized for AMD and nVidia quirks?
A few weeks ago Gelsinger even commented he saw "less need for discrete graphics in the market going forward" - just seemed like a very Intel thing to say
Their purported VRAM offerings for Battlemage are also lower than hoped for, which is going to be a turnoff to many buying for local inference.
I think we have too many decision-makers gunshy from crypto mining that haven't yet realized that compute isn't just a phase.
To me, AMD's approach demonstrates planning and buy-in across multiple teams (CPU die, Ryzen IO die, Epyc IO die, etc), and that suggests a high degree of management involvement in these engineering decisions.
Intel's activities seem comparatively chaotic. Which I agree smells of disincentives and disinterested middle management hyperfixated on KPIs.
No they aren't - much like Boeing, at this point they are considered a national security asset.
"Gelsinger said the market will have less demand for dedicated graphics cards in the future."
(from: https://www.techspot.com/news/105407-intel-not-repeat-lunar-...)
Gelsinger apparently agreed with you. However, the market very clearly has enormous demand for discrete GPUs. Specifically for AI workloads (not PC gaming).
If I was on Intel's board I would've fired him for this statement. The demand for GPUs (parallel matrix processors with super fast local memory) is NOT going to go down. Certainly not in the next five to ten years!
I know Intel makes a lot of things, has a lot of IP, and is involved in many different markets, but the fact that they're NOT a major player in the GPU/AI space is their biggest failure (in recent times). It's one of those things that should've been painfully obvious at some point in 2022, and yet here we have Gelsinger saying just a few months ago that demand for AI stuff is somehow just going to disappear.
It's magic hand waving like this that got him fired.
…Although it looks like Intel tried to go that route without understanding why and got cold feet.
Their dGPU is also pretty promising; I have it on my list to get. Even if not for gaming, it's possibly the best media encoding/decoding card for the money today. The only thing holding it back for entry-level or mid-level gaming is the drivers: for some games it won't matter, it seems, but for others there are growing pains, though they seem to be diligently working on them with every driver release.
Intel has made vast improvements even within their first generation of dedicated desktop cards. They will likely never compete with cards like a 4080/4090, but they may be great options for people on a budget. Helps put pressure on AMD to be better as well.
Is it a "he retired" or a "we retired him"?
Brian’s affair with an underling was also surprisingly conveniently timed back then.
Does it matter?
Pat didn't do that, I guess.
They are in a deep hole, and it is difficult to see them restoring their former glory in the foreseeable future.
ARM isn't doing any such thing. Apple & Qualcomm are, though. ARM itself if anything looks weak. Their mobile cores have stagnated, their laptop attempts complete failures, and while there's hints of life in data center it also seems mediocre overall.
ARM aren't trying to force Qualcomm to use ARM's cores. They're trying to force them to update the licensing royalties to make up for the (as ARM sees it) licensing term violations of Nuvia, who had a design license for specific use cases.
The part you reference (using ARM designs) is the fallback if Qualcomm lose their design license.
The destruction of the chips is standard practice to request in cases of license and IP infringement.
I'm not enough of a lawyer to figure out who is right.
That is the entire crux of the issue. ARM gave Nuvia a specific license, and then Nuvia was acquired, which moved the IP under a license that ARM had not extended to it.
And the fact that Qualcomm got just about everyone to endorse the acquisition ahead of announcing it but didn’t even tell Arm is a bit of a tell.
But even if they meant ARM-the-ISA, that'd still put it in a fragile position when the 2 clear market leaders in the ARM-the-ISA space have no particular allegiance to ARM-the-ISA (Apple having changed ISAs several times already, and QCOM both being active in RISC-V and also being sued by ARM-the-company)
It’s a really bad sign when a customer decides it can out innovate you by internally copying your entire business and production line.
Not necessarily. At scale, especially Apple's scale, vertical integration can make a ton of sense - for control, innovation, price, risk hedging.
The fact that they can build a much better chip by almost any metric, so far ahead of Intel, is the red flag.
And historically, wasn't this just an extension of their fight with Samsung in the mobile space more than a rejection of Intel?
I think you’re correct that this was initially because Apple didn’t want to be at the behest of Samsung. But as Apple iterated on their mobile and tablet chips, they had the choice to bring them to their desktop and laptop lines or not to. They took that gamble and here we are today with non Intel Macs.
It's pretty much two decades at this point.
For example, could "Intel A" continue to own the foundries, split off "Intel B" as the owner of the product lines, and then do some rebranding shenanigans so that the CPUs are still branded as "Intel"?
I don’t know if it’s legally possible, but HP shows the branding bit can kinda work.
Splitting Intel is necessary but probably infeasible at this point in the game. Simple fact is that Intel Foundry Services has nothing to offer against the likes of TSMC and Samsung - perhaps only cheaper prices and even then it's unproven to fab any non-Intel chips. So the only way to keep it afloat is by continuing to fab Intel's own designs, until 18A node becomes viable/ready.
That means either he knew and allowed it to happen, which is bad, or he didn't know and allowed GPU division to squander the resources, which is even worse. Either way, it was an adventure Intel couldn't afford.
There is a similar story in other areas.
Intel deserves a lot of blame but they also got hit by some really shit circumstances outside of their control.
This only reinforces my previous point. He had good ideas, but couldn't execute.
On a side note, getting people in Russia to write your drivers sounds a bit insane. Yeah, lower cost and probably OK quality, but the risks...
a chief executive officer, the highest-ranking person in a company or other institution, ultimately responsible for making managerial decisions.
Maybe you mean COO?
They chose to outsource the development of their core products to a country like Russia to save costs. How was that outside of their control? It's not like it was the most stable or reliable country to do business in even before 2022...
e.g. There are plenty of talented engineers in China as well but it would be severely idiotic for any western company to move their core R&D there. Same applied to Russia.
War in Afghanistan (2001–2021)
US intervention in Yemen (2002–present)
Iraq War (2003–2011)
US intervention in the War in North-West Pakistan (2004–2018)
Second US Intervention in the Somali Civil War (2007–present)
Operation Ocean Shield (2009–2016)
Intervention in Libya (2011)
Operation Observant Compass (2011–2017)
US military intervention in Niger (2013–2024)
US-led intervention in Iraq (2014–2021)
US intervention in the Syrian civil war (2014–present)
US intervention in Libya (2015–2019)
Operation Prosperity Guardian (2023–present)
Wars involving Russia in the 21st century:
Second Chechen War (1999–2009)
Russo-Georgian War (2008)
Russo-Ukrainian War (2014–present)
Russian military intervention in the Syrian Civil War (2015–present)
Central African Republic Civil War (2018–present)
Mali War (2021–present)
Jihadist insurgency in Burkina Faso (2024–present)
Also, I am not American and not an unconditional supporter of their foreign policy. And considering the trajectory of American politics, it is obvious that any foreign multinational developing in the US should have contingency plans.
Putin's concentration of power has been alarming, but only since around 2012, to be honest. It was relatively stable between 2000 and 2012 in general (minus isolated cases of mysterious deaths and imprisonments). Russia was business-friendly back then, open to foreign investors, and most of Putin's authoritarian laws were yet to be issued. Most of the conflicts Russia was involved in were viewed as local conflicts in border areas (Chechen separatism, disputed Georgian territories, frozen East Ukrainian conflict, etc.). Only in 2022 did the Ukraine war escalate to its current scale, and few people really saw it coming (see: thousands of European/American businesses operating in Russia by 2022 without any issue)
So I kind of see why Intel didn't do much about it until 2022. In fact, they even built a second R&D center in 2020... (20 years after the first one).
i.e. if you are an American/European company doing business in Russia, you must account for the potential risk of things suddenly going south. The sanctions after 2014 were a clear signal and Intel had years to take that into account.
> So I kind of see why Intel didn't do much about it until 2022.
I'm pretty sure that the consensus (based on pretty objective evidence) is that Intel was run by incompetent hacks prior to 2021 (and probably still is).
> thousands of European/American businesses operating in Russia by 2022
Selling your products there or having local manufacturing is not quite the same as outsourcing your R&D there due to obvious reasons...
Then again Yandex kind of pulled it off.
I will never understand this line of reasoning. Why would anyone expect an initial offering to match or best similar offerings from the industry leader? Isn't it understood that leadership requires several revisions to get right?
Intel had money and decades of integrated GPU experience. Any new entrant to the market must justify the value to the buyer, and Intel didn't. They could have sold them cheap to try to carve out a position in the market, though I think that would have been a poor strategy (they didn't have the financials to make it work).
Honestly, even with their iGPU experience, Arc was a pretty impressive first dGPU since the i740. The pace of their driver improvement and their linux support have both been impressive. They've offered some niche features like https://en.wikipedia.org/wiki/Intel_Graphics_Technology#Grap... which Nvidia limits to their professional series.
I don't care if they have to do the development at a loss for half a dozen cycles, having a quality GPU is a requirement for any top-tier chip supplier these days. They should bite the bullet, attempt to recoup what they can in sales, but keep iterating toward larger wins.
I'm still upset with them for cancelling the Larrabee uarch, as I think it would be ideal for many ML workloads. Who needs CUDA when it's just a few thousand x86 threads? I'm sure it looked unfavorable on some balance sheet, but it enabled unique workloads.
And here is the problem. You are discussing a dream scenario with unlimited money. This thread is about how the CEO of Intel has retired/been kicked out (far more likely) for business failures.
In the real world, Intel was in bad shape (see margins, stock price, etc.) and couldn't afford to squander resources. Intel couldn't commit, and thus it should have adjusted strategy. It didn't. Money was wasted that Intel couldn't afford to waste.
I brought up Intel's insane chiplet [non-]strategy elsewhere in the thread as an example where it's clear to me that Intel screwed up. AMD made one chiplet and binned it across their entire product spectrum. Intel made dozens of chiplets, sometimes mirror images of otherwise identical chiplets, which provides none of the yield and binning benefits of AMD's strategy. Having a GPU in house is a no-brainer, whatever the cost. Many other decisions going on at Intel were not. I don't know of another chip manufacturer that makes as many unique dies as Intel, or has as many SKUs. A dGPU is only two or three of those and opens up worlds of possibility across the product line.
Weren't they pretty good (price/performance) after Intel fixed the drivers during the first year or so after release? The real failure was taking so long to ship the next gen..
Sure, Intel GPUs are inferior to both Nvidia's and AMD's flagship offerings, but they're competitive on a price-to-performance basis. I'd argue that for a 1st-gen product, it was quite successful at opening up the market and enabling cross-selling opportunities with its CPUs.
That all said, I suspect the original intent was to fabricate the GPUs on IFS instead of TSMC in order to soak up idle capacity. But plans changed along the way (for likely performance reasons) and added to the IFS's poor perception.
So with that, they are outsourcing production of these chips to TSMC, using nearly cutting-edge processes (Battlemage is being announced tomorrow and will use either TSMC 5 or 4), and the dies are pretty large. That means they are paying for dies the size of 3080s and retailing them at the prices of 3060s.
RTX 3070 Ti: 17,400 million transistors
A770: 21,700 million transistors
https://www.techpowerup.com/gpu-specs/geforce-rtx-3070-ti.c3...
https://www.techpowerup.com/gpu-specs/arc-a770.c3914
It has taken Nvidia decades to figure out how to use transistors as efficiently as it does. It was unlikely for Intel to come close with their first discrete GPU in decades.
That said, it is possible that better drivers would increase A770 performance, although I suspect that reaching parity with the RTX 3070 Ti would be a fantasy. The RTX 3070 Ti has both more compute and more memory bandwidth. The only advantage the A770 has on its side is triple the L2 cache.
To make matters worse for Intel, I am told that games tend to use vendor specific extensions to improve shader performance and those extensions are of course not going to be available to Intel GPUs running the same game. I am under the impression that this is one of the reasons why DXVK cannot outperform the Direct3D native stack on Nvidia GPUs. The situation is basically what Intel did to AMD with its compiler and the MKL in reverse.
Specifically, information on these extensions is here:
https://gpuopen.com/amd-gpu-services-ags-library/ https://developer.nvidia.com/rtx/path-tracing/nvapi/get-star...
Also, I vaguely recall that Doom Eternal used some AMD extension that was later incorporated into vulkan 1.1, but unless ID Software updated it, only AMD GPUs will be using that. I remember seeing AMD advertising the extension years ago, but I cannot find a reference when I search for it now. I believe the DXVK developers would know what it is if asked, as they are the ones that told me about it (as per my recollection).
Anyway, Intel entered the market with the cards stacked against it because of these extensions. On the bright side, it is possible for Intel to level the playing field by implementing the Vulkan extensions that its competitors use to get an edge, but that will not help it in Direct3D performance. I am not sure if it is possible for Intel to implement those as they are tied much more closely with their competitors’ drivers. That said, this is far from my area of expertise.
Wouldn't that just pretty much guarantee that the foundry business would fail since Intel wouldn't have any incentives to shift most of their manufacturing to TSMC? The same thing happened with AMD/Global Foundries..
Anyway, whatever woes GloFo is facing you can’t blame them on AMD. They had an exclusivity deal for a decade which only got broken when it was no longer tenable and AMD still buys a ton of their wafers. I suppose AMD may have bought more wafers if their products didn’t suck for most of that time but it had nothing to do with shifting production to TSMC which only happened after GloFo gave up.
They will have to focus; that means getting out of lines of business which will likely die anyway.
That would be better than going bankrupt and your competitors picking up the pieces.
I, personally, found my life to improve when we decided that the cleaning lady could be someone from outside the marriage.
BK will go down in history as the person who destroyed a once great engineering firm.
An alien from Vega looking at our constellation of tech companies and their leadership might point at an obvious answer…
I'm seeing the same thing now with Intel QAT / IAA / DSA. Only niche software support. Only AWS seems to have it and those "bare metal" machines don't even have local NVMe.
About 10 years ago Intel Research was publishing a lot of great research but no software for the users.
Contrast it with Nvidia and their amazing software stack and support for their hardware.
Linus seems to disagree https://m.youtube.com/watch?v=tQIdxbWhHSM
The video is 12 years old. A lot changed in the meantime.
AMD has open source drivers and crashes often. NVidia has (or more precisely had) closed source drivers that nearly always work.
That's the kernel module driver, which is most likely less than 5% of the driver stack.
(I can't watch videos)
Every aspect of that document was just dripping in corporate dinosaur / MBA practices.
For example, they include 4 cores of these accelerators in most of their Xeons, but soft fuse them off unless you buy a license.
Nobody is going to buy that license. Okay, maybe one or two hyperscalers, but nobody else for certain.
It's ultra-important with a feature like this to make it available to everybody, so that software is written to utilise it. This includes the starving university student contributing to Postgres, not just some big-enterprise customer that merely buys their software!
They're doing the same stupid "gating" with AVX-512 as well, where it's physically included in desktop chips, but it is fused off so that server parts can be "differentiated".
Meanwhile AMD just makes one compute tile that has a uniform programming API across both desktop and server chips. This means that geeks tuning their software to run on their own PCs are inadvertently optimising them for AMD's server chips as well!
PS: Microsoft figured this out a while ago and fixed some of their products, like SQL Server, which now enables practically all features in all SKUs. Previously, when only Enterprise Edition had certain programmability features, nobody would use them, because software vendors couldn't write software that their customers couldn't install (most customers only had Standard Edition)!
Intel started getting this kind of right recently with their Xe cores (ideally the same programming model between their integrated GPUs and their datacenter HPC GPUs), but we'll see how different Xe-LPG and Xe-HPC end up being when Falcon Shores really comes out (I'm very worried about the description of it as an XPU, which seems like real confusion about what ML people want).
And this destroys 99% (maybe 99.99%) of the actual economic value for Intel! What Intel needs is for people to integrate these accelerators into software the way that AVX is integrated into software. Then software and libraries can advertise things like “decompression is 7x faster on Intel CPUs that support such-and-such”, and then customers will ask for Intel CPUs. And customers will ask their providers to please make sure they support such-and-such feature, etc.
But instead we have utterly inscrutable feature matrices, weird licensing conditions, and a near-complete lack of these features on development machines, and it’s no wonder that no one uses the features.
Which means that any fool that did utilise this feature has tied their cart to a horse that's going to be put down.
Smarter people can see this coming a mile off and not bother investing any effort into support.
It's so predictable that it's just depressing.
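To make the AVX comparison above concrete: libraries only bother writing a fast path when the feature is trivial to detect at runtime and present nearly everywhere. Below is a rough sketch of that kind of detection on Linux; the avx512f flag and the idxd sysfs path are the commonly used ones, but treat the specifics as assumptions rather than a recipe.

    # Rough sketch: choose a code path based on what the machine actually exposes.
    # AVX-512 shows up as a CPU flag; DSA/IAA accelerators show up as idxd devices.
    import os

    def cpu_flags():
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    flags = cpu_flags()
    has_avx512 = "avx512f" in flags
    dsa_path = "/sys/bus/dsa/devices"
    has_dsa = os.path.isdir(dsa_path) and any(
        d.startswith("dsa") for d in os.listdir(dsa_path))

    print("AVX-512:", has_avx512, "DSA:", has_dsa)
    # A library would branch here. If the feature might be fused off or locked
    # behind a license on the customer's machine, nobody writes the branch at all.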
He should have cut 25% of the workforce to get started (and killed the dividend).
Also - the European expansion and Ohio projects, while good short-term strategies, were too early.
Cut the ship down to size and force existing sites to refocus or be closed. Get alarmist. Make sure you cut out all the bad apples. Prune the tree. Be ruthless and determined.
They should hire Reed Hastings now. He's the OG turnaround king.
Divesting from their discrete GPUs just as they were starting to become a viable value option was one of their most recent asinine decisions. No idea why they didn't try to test the market with a high-RAM 64GB+ card before bowing out, to see how well they'd do. Nvidia's too busy printing money to add more RAM to their consumer GPUs; they'd have had the cheap GPU VRAM market all to themselves.
> "Gelsinger said the market will have less demand for dedicated graphics cards in the future."
From: https://www.techspot.com/news/105407-intel-not-repeat-lunar-...
He may have been talking about something like "all GPU stuff will just be integrated into the CPU", but I highly doubt that's realistic. Nvidia's latest chip die size is enormous! Their new 5090 die is reportedly 744 mm²: https://www.tomshardware.com/pc-components/gpus/the-rtx-5090...
There's no way you're going to be getting an equivalent level of performance from a CPU with integrated GPU when the GPU part of the die needs to be that big.
On the other hand, maybe not.
My point is that although you might think they are going to divest the GPU business in future, we don't know that for sure and it's kind of weird to present it as having already happened.
Intel wasn't doing great to start, but Pat put it on the path, potentially, towards greatness. Now even that is in major jeopardy.
MJ has a pretty good reputation inside the company.
Pat so suddenly getting "retired" like this isn't based on the success or failure of the new foundry nodes. You're correct that they weren't supposed to ship yet anyway. With this news most are expecting they'll be (further) delayed soon, but the real cause of today's action is strategic.
Things at Intel are so far gone that there's now no choice but to look at splitting up the company and/or selling/merging key parts off. Pat didn't want to preside over breaking up Intel, he wanted to save the company by shipping the new nodes. This was always a long shot plan which would require the existing businesses to do well and new things like GPUs to contribute while massively cutting costs in other areas.
Those things didn't work out. The GPUs were late and underperformed, forcing them to be sold near break-even. The market for desktop and laptop CPUs was much softer than expected for macroeconomic reasons and, worse, there were massive, delayed-onset field failures of the last two gens of desktop CPUs. Competitors like AMD generally took more share from Intel, faster than expected, in other markets like data center. The big layoffs announced last summer should have been done in 2021. Those things combined caused time and money to run out sooner than the new nodes could show up to save the day. This is reality finally being acknowledged. Frankly, this should have happened last summer or even earlier. Now the value has eroded further, making it even harder to save what's left.
When he took over, I remember the enthusiasm and optimism, not only in business but in hacker circles too. It's a pity it didn't work out. I wonder if it is even possible to save Intel (not trying to imply that "if Gelsinger can't do it, then no one can", just wondering if Intel is doomed regardless of what its management does).
A lot of hate for Pat Gelsinger on Reddit and YouTube from armchair experts who don't really grasp the situation Intel were in or what was needed to turn the ship around, so if he was pushed it seems like it might be to pacify the (not particularly informed) YouTube/gamer community and bump the share price. That's all speculation, though.
I'd be interested to see who Intel intends to get to run the company in his place, as that would signal which audience they're trying to keep happy here (if any).
I'd expect Intel marketing and Public Relations to be paying YouTube Influencers to have a particular opinion, the one most favorable to the board.
However, IMO: they need somebody like Lisa Su, somebody with more of an R&D-engineering background. Gelsinger was a step in the right direction, but he got a masters degree in the 80’s and did technically hardcore stuff in the late 80’s, early 90’s. That was when stuff just started to explode.
Su finished her PhD in the 90’s, and did technically interesting stuff through the computer revolution. It appears to have made a world of difference.
He had two routes with the capital available following the cash injection from COVID-19 and the rapid digitization of the workplace: compete with AMD/Nvidia, or compete with TSMC/Samsung. The only sensible option that would capture the kind of capital needed to turn the ship around was to become a company critical to the national security of the US during a time of geopolitical instability, onshoring chip manufacturing and receiving support from the government in doing so. He could either compete with rivals at home or those abroad, but not both simultaneously. The thesis makes sense: you've lost the advantage to Nvidia/AMD, so pivot to become a partner rather than a rival.
I don't think it's a coincidence that just a week after Intel finally received the grant from the government, he announced his departure. The CHIPS act was a seminal moment in his career. It makes sense he'd want to see that through till completion. He's 63; now is as good a time as ever to hand over the reins, in this case to a very capable duo of MJ and Zisner (who were always two of the most impressive EVPs of the bunch in my book).
In short, a bad or subpar chip design/architecture can be masked by having the chip fabricated on a leading edge node but not the inverse. Hence everyone is vying for capacity on TSMC's newest nodes - especially Apple in trying to secure all capacity for themselves.
https://www.bloomberg.com/news/articles/2024-12-02/intel-ceo...
I hate the idea that the board might do this just as Intel's prospects start looking up after years of cost-cutting measures to bring the company back to profitability and take credit for a "miraculous turnaround" that was actually instigated 4 years prior by the same person they sacked. It's like an incoming political party coming in and within a couple of months of office taking credit for a good economy that they had no hand in creating.
I hold no opinion on Pat Gelsinger, but changing the CEO in the middle of ensuring that Intel remains relevant in the long term seems like a bad move. Probably his plan for "fixing" Intel is too slow for the market and the board. Let's see who takes over; if it's not an engineer, then things just became more dangerous for Intel. The interim people are an administrator and a sales person, and that does not bode well.
Wonder if they will approach Lisa Su to take the job now :D
How is it starting to turn the wheel?
The promise of state backing for a US champion semiconductor fab was taken seriously by Gelsinger, who went all-in on trying to remake Intel as TSMC 2.0. But the money turned out to be way too late and far more conditional than Gelsinger thought. This is bad news for Intel and bad news for US industrial policy.
> Gil was a nice guy, but he had a saying. He said, 'Apple is like a ship with a hole in the bottom, leaking water, and my job is to get the ship pointed in the right direction.'
Intel's stock is jumping at this announcement, but I look at it as a bad signal for Intel 18a. If 18a was looking to be a smash hit then I don't think Pat gets retired. If 18a is a success then it is an even more short-sighted decision by the board.
What this likely means is two-fold:
1. Intel 18a is being delayed further and/or there are significant issues that will hamstring performance.
2. Pat is/was unwilling to split the foundry and design business / be part of a M&A but the board wants to do one or the other.
If 18a is not ready I think the best case scenario for Intel is a merger with AMD. The US Govt would probably co-sign on it for national security concerns overriding the fact that it creates an absolute monopoly on x86 processors. The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
Sure they won't allow Intel to be bought by a foreign company, but surely everyone would much rather see Intel being bought by literally any other company than AMD and Nvidia.
I heard they’re building a iOS / android replacement. Think of the vertical integration!
I’m picturing a boot-looping cargo plane full of hummers dropping like a stone — doesn’t get much more vertically integrated than that. Think of all the layers they can eliminate.
They'll barely allow Nvidia to acquire anybody at this point, no matter how small. See recent EU response to Run:ai. Intel would be considered 100x worse.
If not, look up e.g. Microsoft's purchase of Activision, both US companies.
Huawei is trying to establish a domestic Ascend/MindSpore ecosystem, but they are limited by the SMIC process (~7nm). The defect rate is allegedly rather high, but they are the only "official" game in town (other than smuggled Nvidia cards or outsourced datacenters in the Middle East).
Well they obviously would..
Also, the EU has promised various significant subsidies to Intel. They obviously have fabs in Ireland and are building one in Germany, and perhaps even Poland.
They already do networking, photonics, GPUs, high speed interconnects, and CPUs. They are planning on selling their FPGAs (the Altera acquisition) to Lattice.
The only things left are their fab ops, thunderbolt/usbc, wifi, and ble.
Their fab ops would take over a decade of heavy investment to catch up to TSMC or Samsung and idk if even Nvidia is ambitious enough to take that on.
Wifi and BLE could be good additions if they wanted to branch out their Mellanox portfolio to wireless. Thunderbolt/USB-C also might be worthwhile.
But that IP is probably going to be cheaper to buy piecemeal so idk if it's worth it to buy the whole company outright.
So if Intel sells, everyone is fucked until whoever buys can renegotiate the terms of the agreement. And if that's Nvidia, they could just sit dead on the IP and starve AMD of the bulk of their CPU revenue (which is what is keeping the company solvent). And requiring they keep the agreement they currently have would mean requiring AMD to give Nvidia a pretty detailed look at the AMD secret sauce which would increasingly push AMD into the red until they become insolvent, again leading to a Nvidia monopoly.
The US government as well as the EU will not allow that to happen so however things slice, ownership of the x86 ISA patents would not be going to Nvidia.
[0]https://www.kitguru.net/components/cpu/anton-shilov/amd-clar...
I can see them wanting certain parts of the business (GPU mainly) but on a whole it doesn't make a lot of sense.
I don't see Intel as a single entity being valuable to any US business, really. You're essentially buying last year's fall line; there's very little use for Intel's fabs without a huge amount being spent on them to get them up to modern standards.
It'll all come down to IP and people; that'll be where the true value is.
The only way it happens is if it is kept separate kind of like the Beats acquisition. Apple lends some chip designs to Intel and Apple starts fabricating their chips on Intel fabs, but otherwise the companies operate independently.
Even then there is zero indication that Apple would ever want to do their own manufacturing.
One of the reasons Intel "let" AMD compete in the x86 space is the US government's requirement to be able to source chips from at least two vendors.
Aside from the x86 monopoly that would create, I don't think Intel has much of value to AMD at this point other than the fabs (which aren't delivering). IMHO if Intel is failing, let them fail and others will buy the pieces in bankruptcy. This would probably benefit several other companies that could use 22nm and up fab capacity and someone could even pick up the x86 and graphics businesses.
BTW I think at this point the graphics business is more valuable. Even though Intel is in 3rd place there are many players in the SoC world that can use a good GPU. You can build a SoC with Intel, ARM, or RISC-V but they all need a GPU.
Intel finally seem to have got their act together a bit with OneAPI but they've languished for years in this area.
At least for Intel, that is just not true. Intel's DPC++ is as open as it gets. It implements a Khronos standard (SYCL), most of the development is happening in public on GitHub, it's permissively licensed, it has a viable backend infrastructure (with implementations for both CUDA and HIP). There's also now a UXL foundation with the goal of creating an "open standard accelerator software ecosystem".
I decided that I have to start looking at Apple's AI docs
Apple will sell you a machine with 48GB of memory for thousands of dollars but plenty of people can't afford that, and even then the GPU is soldered so you can't just put four of them in one machine to get more performance and memory. The top end 40-core M4 GPUs only have performance comparable to a single A770, which is itself not even that fast of a discrete GPU.
For whatever reason, people just delete these tools from their minds, then claim Nvidia still has a monopoly on CUDA.
And yet the popcorn gallery still says "there is no [realistic] alternative to CUDA." Methinks the real issue is that CUDA is the best software solution for Nvidia GPUs, and the alternative hardware vendors aren't seen as viable competitors for hardware reasons, with the failure then attributed to software.
It certainly seems like there's a "nobody ever got fired for buying nvidia" dynamic going on. We've seen this mentality repeatedly in other areas of the industry: that's why the phrase is a snowclone.
Eventually, someone is going to use non-nvidia GPU accelerators and get a big enough cost or performance win that industry attitudes will change.
Is there?
10 years ago, I burned about 6 months of project time slogging through AMD / OpenCL bugs before realizing that I was being an absolute idiot and that the green tax was far cheaper than the time I was wasting. If you asked AMD, they would tell you that OpenCL was ready for new applications and support was right around the corner for old applications. This was incorrect on both counts. Disastrously so, if you trusted them. I learned not to trust them. Over the years, they kept making the same false promises and failing to deliver, year after year, generation after generation of grad students and HPC experts, filling the industry with once-burned-twice-shy received wisdom.
When NVDA pumped and AMD didn't, presumably AMD could no longer deny the inadequacy of their offerings and launched an effort to fix their shit. Eventually I am sure it will bear fruit. But is their shit actually fixed? Keeping in mind that they have proven time and time and time and time again that they cannot be trusted to answer this question themselves?
80% margins won't last forever, but the trust deficit that needs to be crossed first shouldn't be understated.
On paper, yes. But how many of them actually work? Every couple of years AMD puts out a press release saying they're getting serious this time and will fully support their thing, and then a couple of people try it and it doesn't work (or maybe the basic hello world test works, but anything else is too buggy), and they give up.
Do you have experience with SYCL? My experience with OpenCL was that it's really a PITA to work with. The thing CUDA does nicely is the direct and minimal exercise needed to start running GPGPU kernels: write the code, compile with nvcc, CUDA'd.
OpenCL had just a weird dance to perform to get a kernel running. Find the OpenCL device using a magic filesystem token. Ask the device politely if it wants to OpenCL. Send over the kernel string blob to compile. Run the kernel. A ton of ceremony, and then you couldn't even be guaranteed it would work, because the likes of AMD, Intel, and nVidia were all spotty on how well they supported it.
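For anyone who hasn't suffered through it, the ceremony looks roughly like the sketch below, written against pyopencl from memory, so treat it as illustrative rather than exact:

    # Rough sketch of the OpenCL ceremony via pyopencl (illustrative, from memory).
    import numpy as np
    import pyopencl as cl

    src = """
    __kernel void scale(__global float *x, const float a) {
        int i = get_global_id(0);
        x[i] = a * x[i];
    }
    """

    platform = cl.get_platforms()[0]        # hope the right platform is first
    device = platform.get_devices()[0]      # ask it politely if it wants to OpenCL
    ctx = cl.Context([device])
    queue = cl.CommandQueue(ctx)
    prg = cl.Program(ctx, src).build()      # compile the kernel string at runtime

    x = np.arange(16, dtype=np.float32)
    mf = cl.mem_flags
    buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=x)
    prg.scale(queue, x.shape, None, buf, np.float32(2.0))
    cl.enqueue_copy(queue, x, buf)
    print(x)  # whether this runs cleanly depends heavily on the vendor's driver

The boilerplate itself is tolerable; the part that hurt was the driver roulette at the end.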
SYCL seems promising, but the ecosystem is a little intimidating. It does not seem (and I could be wrong here) that there is a de facto SYCL compiler, and the goals of the various SYCL compilers are fairly diverse.
No, I bought a Nvidia card and just use CUDA.
> OpenCL had just a weird dance to perform to get a kernel running...
Yeah but that entire list, if you step back and think big picture, probably isn't the problem. Programmers have a predictable response to that sort of silliness. Build a library over it & abstract it away. The sheer number of frameworks out there is awe-inspiring.
I gave up on OpenCL on AMD cards. It wasn't the long complex process that got me, it was the unavoidable crashes along the way. I suspect that is a more significant issue than I realised at the time (when I assumed it was just me) because it goes a long way to explain AMD's pariah-like status in the machine learning world. The situation is more one-sided than can be explained by just a well-optimised library. I've personally seen more success implementing machine learning frameworks on AMD CPUs than on AMD's GPUs, and that is a remarkable thing. Although I assume in 2024 the state of the game has changed a lot from when I was investigating the situation actively.
I don't think CUDA is the problem here, math libraries are commodity software that give a relatively marginal edge. The lack of CUDA is probably a symptom of deeper hardware problems once people stray off an explicitly graphical workflow. If the hardware worked to spec I expect someone would just build a non-optimised CUDA clone and we'd all move on. But AMD did build a CUDA clone and it didn't work for me at least - and the buzz suggests something is still going wrong for AMD's GPGPU efforts.
Impossible. GPGPU runtimes are too close to hardware, and the hardware is proprietary with many trade secrets. You need support from GPU vendors.
BTW, if you want reliable cross-vendor GPU, just use Direct3D 11 compute shaders. Modern videogames use a lot of compute, to the point that UE5 even renders triangle meshes with compute shaders. AMD hardware is totally fine, it’s the software ecosystem.
An x86 monopoly in the late 80s was a thing, but not now.
Today, there are sufficient competitive chip architectures with cross-compatible operating systems and virtualization that x86 does not represent control of the computing market in a manner that should prevent such a merger: ARM licensees, including the special case of Apple Silicon, Snapdragon, NVIDIA SOCs, RISC-V...
Windows, MacOS and Linux all run competitively on multiple non-x86 architectures.
I think you're off by 20 years on this. In the 80s and early 90s we had reasonable competition from 68k, powerpc, and arm on desktops; and tons of competition in the server space (mips, sparc, power, alpha, pa-risc, edit: and vax!). It wasn't till the early 2000s that both the desktop/server space coalesced around x86.
It seems almost like the forces that are pushing against these long-term trends are focused more on trying to figure out how to saturate existing compute on the high-end, and using that to justify drives away from diversity and vertical integrated cost/price reduction. But there are, long-term, not as many users who need to host this technology as there are users of things like phones and computers who need the benefits the long-term trends provide.
Intel has acted somewhat as a rock in a river, and the rest of the world is finding ways around them after having been dammed up for a bit.
A lot of companies killed off their in-house architectures and hopped on the Itanium bandwagon. The main two exceptions were Sun and IBM.
Intel had just wiped the floor with x86 servers, and all the old-guard Unix vendors with their own chips were hurting. Then Intel made the rounds with a glorious plan for how they were going to own the server landscape for a decade or more. So, in various states of defeat and grief, much of the industry followed them. Planned or not, the resulting rug pull really screwed them over: the organs that operated those lines of business were fully removed. It worked too well; I am going to say it was accidental.
Intel should have broken up its internal x86 hegemony a long time ago, which they have been trying since the day it was invented. Like the 6502, it was just too successful for its own good. Only x86 also built up the Vatican around itself.
https://herecomesthemoon.net/2024/11/two-factions-of-cpp/
There are a lot of pre-compiled binaries floating about that are depended on by lots of enterprise software whose source code is long gone, and these are effectively locked to x86_64 chips until the cost of interoperability becomes greater than reverse engineering their non-trivial functionality.
"two factions" is only discussing source compatibility.
Incorrect, we have an even greater lack of x86 vendors now than we did in the 80s. In the 80s you had Intel, and they licensed to AMD, Harris, NEC, TI, Chips & Technologies, and in the 90s we had IBM, Cyrix, VIA, National Semi, NexGen, and for a hot minute Transmeta. Even more smaller vendors.
Today making mass market x86 chips we have: Intel, AMD, and a handful of small embedded vendors selling designs from the Pentium days.
I believe what you meant was that x86 is not a monopoly thanks to other ISAs, but x86 itself is even more of a monopoly than ever.
Could also be Intel and Micron. Then you end up with full stack devices with Intel CPUs and Micron RAM and storage, and the companies have partnered in the past.
Maybe they should follow AMD's lead and spin off the foundry business.
Samsung still has a fairly competitive process and could make x86 CPUs to put in their own tablets and laptops without having the OEM and Intel get into a fight about margins if they're the same company. And with the largest maker of Android devices putting x86 CPUs into them, you get an ecosystem built around it that you wouldn't when nobody is using them to begin with because Intel refuses to price competitively with ARM.
> intended so they're not making an effective argument
To be fair I'm really struggling to somehow connect the "x86 monopoly in the late 80s" with the remainder of their comment (which certainly makes sense).
By that standard if we exclude mobile x86 has a much stronger monopoly these days than in 1985. Unless we exclude low end PCs like Apple II and Commodore 64.
In 1990 x86 had ~80%, Apple ~7%, Amiga ~4% (with the remainder going to lowend or niche PCs) so again not that different than today.
Sadly with the rise of laptops with soldered-in-everything, and the popularity of android/iphone/tablet devices, I share some of layer8's worries about the future of the relatively open PC architecture and hardware ecosystem.
We are really lucky that such a diverse and interoperable hardware platform like the PC exists. We should not discount it, and instead appreciate how important it is, and how unlikely for such a varied and high-performance platform to emerge again, should the PC platform die.
All the use cases, except gaming PC, have "less serious" solutions in Linux/ARM and Linux/RISCV today, where I would argue there is more interoperability and diversity. Those solutions get better and closer to "serious" x86 solutions every day.
Will they be roughly equivalent in price/performance in 5 years... only time will tell, but I suspect the x86 PC is the old way and it's on its way out.
And then in the 2000s AMD64 pretty much destroyed all competing architectures, and in the 2010s Intel itself was effectively almost a monopoly (outside of mobile), with AMD on the verge of bankruptcy.
Wintel was a duopoly which had some power: Intel x86 has less dominance now partly because Windows has less dominance.
There are some wonderful papers on how game theory and monopoly plays out between Windows and Intel; and there's a great paper with analysis of why AMD struggled against the economic forces and why Microsoft preferred to team up with a dominant CPU manufacturer.
Restoring Intel's foundry lead starting with 18A was central to Pat's vision and he essentially staked his job on it. 18A is supposed to enter production next year, but recent rumors are that it's broken.
I have next to zero knowledge of semiconductor fabrication, but “Continued Momentum” does sound like the kind of corporate PR-speak that means “people haven't heard from us in a while and there's not much to show”.
I also would never have realized the 20A process was canceled were it not for your comment since this press release has one of the most generous euphemisms I've ever heard for canceling a project:
“One of the benefits of our early success on Intel 18A is that it enables us to shift engineering resources from Intel 20A earlier than expected as we near completion of our five-nodes-in-four-years plan.”
And despite this total failure they spent many tens of billions on stock buybacks (https://ycharts.com/companies/INTC/stock_buyback), no less than ten billion in 2014 and over forty billion across 2018-2021. That's an awful, awful lot of money to waste.
I think at this point no one believes Intel can deliver. So news or not..
Part 1, combine branch predictor with the instruction trace cache to be able to detect workloads, have specific licenses for say Renderman, Oracle or CFD software.
Part 2, add a mesh network directly to the CPU, require time based signing keys to operate. Maybe every chip just has starlink included.
Part 3, in a BMW rent-your-seats move, the base CPU is just barely able to boot the OS, and specific features can be unlocked with signed payloads. Using Shamir secrets so that Broadcom AND the cloud provider are both required for signing the feature request. One can rent AVX512, more last level cache, ECC, overclocking, underclocking.
The nice part about including radios in the CPUs directly means that updates can be applied without network connectivity and you can geofence your feature keys.
This last part we can petition the government to require as the grounds of being able to produce EAR regulated CPUs globally.
I think I'll just sell these patents to Myhrvold.
The best they could do with the GFX business is a public execution. We've been hearing about terrible Intel GFX for 15 years and how they are just on the cusp of making one that is bad (not terrible). Most people who've been following hardware think Intel and GFX is just an oxymoron. Wall Street might see some value in it, but the rest of us, no.
What does an OS need a GPU for?
My current laptop only has integrated Intel GPU. I'm not missing Nvidia, with its proprietary drivers, high power consumption, and corresponding extra heat and shorter battery life...
It actually separates the OS from the GPU. Before WDDM your GFX device driver was the only software that could use GFX acceleration. After WDDM the GPU is another "processor" in your computer that can read and write to RAM; the application can use the GPU in user space any way it wants, then the compositor can do the same (in user space), and in the end all the OS is managing is communication with the GPU.
For that approach to work you need enough fill rate that you can redraw the screen several times per frame. Microsoft wanted to have enough that they could afford some visual bling, but Intel didn't give it to them.
“Ubuntu 7.10 is the first version of Ubuntu that ships with Compiz Fusion enabled by default on supported hardware. Compiz Fusion, which combines Compiz with certain components developed by the Beryl community, is a compositing window manager that adds a wide range of visual effects to Ubuntu's desktop environment. The default settings for Compiz enable basic effects—like window shadows and fading menus—that are subtle and unobtrusive. For more elaborate Compiz features, like wobbling windows, users can select the Extra option on the Visual Effects tab of the Appearance Preferences dialog.”
MacOS in 2000 was still old MacOS, with no compositing at all. The NeXT derived version of MacOS was still in beta, and I tried it back then, it was very rough. Even once OSX shipped in 2001, it was still software composited. Quartz Extreme implemented GPU compositing in 10.2, which shipped in 2002.
Windows finally got a composited desktop in Vista, released in 2007. It was GPU accelerated from day one.
An 8K 32bpp framebuffer is ... omg 126MB for a single copy. I was going to argue that a software rasterizer running on vcache would be doable, but not for 8k.
For 4k, with 32MB per display buffer, it could be possible but heavy compositing will require going out to main memory. 1440p would be even better at only 15MB per display buffer.
For 1440p at 144Hz and 2TB/s (vcache max), best case is an overdraw of 984 frames/frame
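A rough sketch of those numbers (assuming 4 bytes per pixel and treating "2 TB/s" as a flat 2e12 bytes/s; the exact overdraw figure shifts a bit depending on the bandwidth and megabyte convention you plug in):

```python
# Back-of-the-envelope framebuffer sizes and overdraw budget.
RESOLUTIONS = {"8K": (7680, 4320), "4K": (3840, 2160), "1440p": (2560, 1440)}

def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {buffer_mib(w, h):.1f} MiB per framebuffer")

bandwidth = 2e12                 # bytes/s, hypothetical V-Cache ceiling
frame_bytes = 2560 * 1440 * 4    # one 1440p, 32bpp buffer
refresh_hz = 144
overdraw = bandwidth / (frame_bytes * refresh_hz)
print(f"1440p @ {refresh_hz} Hz: ~{overdraw:.0f} full-screen redraws per displayed frame")
```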
I was doing a11y work for an application a few months back and got interested in the question of desktop screen sizes. I see all these ads for 4K and bigger monitors, but they don't show up here:
https://gs.statcounter.com/screen-resolution-stats/desktop/w...
And on the steam hardware survey I am seeing a little more than 5% with a big screen.
Myself I am swimming in old monitors and TV to the point where I am going to start putting Pepper's ghost machines in my windows. I think I want to buy a new TV, but I get a free old TV. I pick up monitors that are in the hallway and people are tripping on them and I take them home. Hypothetically I want a state-of-the-art monitor with HDR and wide gamut and all that but the way things are going I might never buy a TV or monitor again.
A complex desktop web form with several pages, lots of combo boxes, repeating fields, etc. I cleaned up the WCAG AA issues and even the S, but the AAA requirement for click targets was out of scope but had me thinking that I wanted to make labels (say on a dropdown menu bar) as big as I reasonably could and that in turn got me thinking about how much space I had to work with on different resolution screens so I looked up those stats and tried to see what would fit in which constraints.
If the above-linked website uses data reported by the browser, I wonder how this scenario might be taken into consideration (or even if such a thing is possible)
The PC I'm typing this on has two 27in 4k screens. I'm sitting so that I look at them from about 75cm away (that's 2.5 feet in weird units).
I archive most of my video files in 720p, because I genuinely don't see that big of a difference between even 720p and 1080p. It is definitely visible, but usually, it does not add much to the experience, considering that most videos today are produced to be watchable on smartphones and tablets just as much as cinema screens or huge TVs. I only make an exception here for "cinematic" content that was intended for the very big screen. That does not necessarily mean movies, but also certain YouTube videos, like "Timelapse of the Future": https://youtube.com/watch?v=uD4izuDMUQA - This one hits differently for me in 4K vs. just 1080p. Having to watch this in just 720p would be tragic, because its cinematography relies on 4K's ability to resolve very fine lines.
So why would I make a point to have both my screens be 4K? Because where else do you look at fine lines a lot? You're looking at it right now: Text. For any occupation that requires reading a lot of text (like programming!), 4K absolutely makes a difference. Even if I don't decrease the font size to get more text on screen at once, just having the outlines of the glyphs be sharper reduces eye strain in my experience.
Which is far more powerful than the ones that caused problems almost two decades ago.
As people noted, most of your GUI is being rendered by it. Every video you watch is accelerated by it, and if it has some compute support, some applications are using it for faster math in the background (mostly image editors, but who knows).
[1] https://www.engadget.com/2008-03-27-nvidia-drivers-responsib...
It wasn't fully WDDM compatible, though only in a fairly minor (overall) respect, but the performance was awful anyway, and not being able to run in full WDDM mode (i.e. Aero) didn't help either, partly because running with Aero was actually faster.
not sure about it. i had friends with discrete GPUs at the time and they told me that vista was essentially a gpu-stress program rather than an OS.
at the same time, compiz/beryl on linux worked beautifully on intel integrated gpus, and were doing way cooler things than vista was doing at the time (cube desktops? windows bursting into flames when closed?).
I'm a bit sad that compiz/beryl is not as popular anymore (with all the crazy things it could do).
The problem with the "use the GPU in a SoC" proposition is everyone that makes the rest of a SoC also already has a GPU for it. Often better than what Intel can offer in terms of perf/die space or perf/watt. These SoC solutions tend to coalesce around tile based designs which keep memory bandwidth and power needs down compared to the traditional desktop IMR designs Intel has.
Why would AMD want a merger? They aren't a charity, and certainly don't need the distraction.
Having a sole supplier for CPUs is a bad strategy.
Even if Intel gets bought out, it'll be in pieces. Nobody wants to enter the x86 market, but there may be smaller segments of the business that can help an ARM-based business, or someone looking to get into GPUs.
Why would Intel "fold"? Their revenue is still 2x higher than AMD's... I mean, obviously they are not doing great, but it's silly to say something like that at this point.
If the ISA patent licenses opened up, that might not be the case. When the topic comes up, it's more about Intel shutting down license transfers, so naturally companies have avoided x86.
- A token legal remnant of Intel, with 0 employees or properties, might suffice to keep that license ticking.
- If the stakes appeared to be "or America will lose its ability to make computers", then the government might find a judge willing to sign off on just about any sort of counterfactual, "because national security".
Doubtful that is the issue with Intel's track record. Curious when we will know if 18A is competitive or not.
> If 18a is not ready I think the best case scenario for Intel is a merger with AMD. The US Govt would probably co-sign on it for national security concerns overriding the fact that it creates an absolute monopoly on x86 processors. The moat of the two companies together would give the new combined company plenty of time to ramp up their fabs.
No way other countries would allow that. If A-tel (Amd-inTEL) can not sell to the EU the merger will not happen.
What the EU gonna do then? Stop buying computers? Perform rapid continental ARM transition for mythical amount of money?
> Perform rapid continental ARM transition for mythical amount of money?
And what is Intel + AMD going to do? Not sell CPUs in Europe?
Apple has made it fairly obvious, even if it was not already with smartphones and chromebooks, that Arm is a viable, realistic, and battle-tested alternative for general purpose computing. Windows 11 even runs on Arm already.
It would not happen "tomorrow" - this would be years in court if nothing else. This would give Dell/HP/Lenovo/whoever plenty of time to start building Arm laptops & servers etc for the European market.
And who knows what RISC-V will look like in a few more years?
The EU has done a bunch of stupid anti-consumer shit in tech already (hello cookie warnings that everyone now ignores), so I would not be surprised if this happened.
Seize or terminate their patents and copyrights. Issue arrest warrants for criminal evasion. Compulsory licensing of x86 to a European design firm immunized by EU law.
> Perform rapid continental ARM transition
Yes.
Windows is on ARM. Apple is on ARM. AWS and Ampere make decent ARM servers. You have decent x86 user-space compatibility on ARM laptops. That is all users want.
I doubt it will cost 'mythical amounts of money'. Most users use a web browser and an office suite. I doubt they will know a difference for a while.
My eyes rolled so far back I hurt myself.
Please provide some examples of where the EU has been able to do a fraction of what you listed to large, US based firms in the past.
Looking at the future, if you want a trade war and an excuse for the new US administration to completely neglect NATO obligations this is a great start.
Provide an example of where a large firm decided to ignore EU law and go ahead with a merger that the EU objected to.
No-one wants the nuclear option, on either side. But if anyone ever tries to call the EU's bluff, they may find out the EU wasn't bluffing at all.
I'm not convinced of this. Fabs are incredibly expensive businesses. Intel has failed to keep up and AMD spun off their fabs to use TSMC.
There is also ARM knocking at the door for general computing. It's already gaining traction in previously x86 dominated markets.
The model for US based fabbing has to include selling large portions of capacity to third party ASIC manufacturers, otherwise I see it as doomed to failure.
IT departments are not going to stop buying x86 processors until they absolutely are forced to. Gamers are not going to switch unless performance is actually better. There just isn't the incentive to switch.
> IT departments are not going to stop buying x86 processors until they absolutely are forced to.
IT departments are buying arm laptops, Apple's.
And there is an incentive to switch, cost. If you are in AWS, you can save a pretty penny by adopting graviton processors.
Further, the only thing stopping handhelds from being arm machines is poor x86 emulation. A solvable problem with a small bit of hardware. (Only non-existent because current ARM vendors can't be bothered to add it and ARM hasn't standardized it).
Really the only reason arm is lagging is because the likes of Qualcomm have tunnel vision on what markets they want to address.
About corporate laptops, do you have evidence to show that companies are switching to Macbooks from HP/Dell/ThinkPads?
They are similar. Particularly because developing on a corporate hardware with an ARM processor is a surefire way to figure out if the software you write will have issues with ARM in AWS.
That's pretty much the entire reason x86 took off in the server market in the first place.
> About corporate laptops, do you have evidence to show that companies are switching to Macbooks from HP/Dell/ThinkPads?
Nope. Mostly just anecdotal. My company offers devs the option of either an x86 machine or a mac.
Also, Qualcomm's GPUs are pretty shit (compared to Intel's, AMD's, or Apple's).
Plenty of them are buying Macbooks. It's definitely a small percentage of the worldwide market, but not entirely insignificant either.
That's the question that remains to be seen.
Of course these days more and more of that is moving to the cloud, and all IT needs is a web browser that works. Thus making their job easier.
This was the point I was going to make. While not completely dead, the days of desktop applications are quickly coming to a close. Almost everything is SAAS now or just electron apps which are highly portable.
Even if it's not SaaS or Electron, the only two languages I'd do a desktop app in nowadays are C# or Java. Both of which are fairly portable.
The world outside of the SV tech bubble runs on Excel.
1. Excel nowadays is mostly just an Electron app. That's effectively what the Office 365 conversion was.
2. MS has supported ARM platforms for quite some time now. [1]
[1] https://www.windowscentral.com/office-windows-11-arm-starts-...
And google sheets in my opinion is not good for complicated stuff - the constant lag..
We mostly email links to spreadsheets running in cloud. So it really doesn't matter what your OS is any more from an excel perspective, as long as your computer can run a modern browser you are good.
It is like a toy version of the standalone app.
Also it sucks with lists, pivot tables...
Maybe in the coming Great Depression of 2025, people will think differently and start looking at cheaper alternatives.
> If you need to use Excel at work, you need x86 since Excel for Mac is a gutted toy
The nominal cost of Excel was not the topic being discussed. It was the cost of using Excel for MacOS rather than Excel for Windows.
Almost no one needs the Windows specific features of Excel, so almost no one needs to give up using macOS just because of Excel.
What's this based on? Surely the proportion of desktops that need to be more powerful than anything Apple is doing on ARM is very small. And surely Apple isn't 5 years ahead?
Apple can force the transition. It is not so straightforward on Windows/Linux.
Massively lower power consumption and way less waste heat to dispose of.
Literally the two biggest concerns of every data centre on earth.
Market forces have traditionally pushed Intel and AMD to design their chips for a less efficient part of the frequency/power curve than ARM vendors. That changed a few years ago, and you can already see the results in x86 chips becoming more efficient.
I say, diversity rules!
[1]: https://research.ibm.com/blog/northpole-llm-inference-result...
Talos is the exception that proves the rule, sadly.
They did realize the tactical error, so I'm hoping Power11 will reverse the damage.
I know anecdotes aren't data, but I was talking with a colleague about chips recently and he noticed that converting all of his cloud JVM deployments to ARM machines both improved performance and lowered costs. The costs might not even be the chips themselves, but less power and thermal requirements that lowers the OpEx spend.
If this happens, couldn’t they force them giving out licenses as a condition? The licensing thing has been such an impediment to competition that it seems like it’s about time anyway.
Oh man, the risk in that is extreme. We are moving away from x86 in general, but wow, that's... a big jump in risk.
Does this mean that Intel's fabs should split for Global Foundries, and the Intel design team should go to AMD?
What could go wrong?
If you take that away, it becomes irrelevant (like many other ARM-based processors that struggle to be a good product because of compatibility).
Apple has a promising path for x86 liberation, but it is not there yet.
https://www.sciencedirect.com/science/article/pii/S138376212...
It's not the only ARM CPU with TSO support though, some server platforms also do it.
Also, can you name these other platforms?
Also, can you back up the information about most users not leveraging Rosetta?
https://share.transistor.fm/s/124f8bc4
> Also, can you name these other platforms?
Fujitsu A64FX, and https://en.wikipedia.org/wiki/Project_Denver
> Also, can you back up the information about most users not leveraging Rosetta?
Well I couldn't tell you that, but most people just use web browsers.
If most people just use web browsers, why has so much effort been put into Rosetta 2? Sounds very wasteful.
Also, A64FX seems to agree with my statement that an ARM with some x86 spice makes a better product than just an ARM.
And, if we're talking about making ARM mainstream, Raspberry Pi did more for the platform than Apple and deserves to get the good reputation.
It's mostly something that was needed in the first few years so you could run Chrome and Photoshop, but those have been ported now. It's mostly useful for running WINE but that's not super common outside games.
That said, a binary recompiler has a lot of uses once you have one: https://valgrind.org
What you're saying is exactly what I'm saying. Having that x86 translation layer helped them sell macs. It doesn't matter if it is not needed now (it is needed now, otherwise it would have been removed).
So, yes. Apple has a popular new band in town. But they still need to play some x86 covers to sell tickets.
As I said, they have a good plan, but they're _not there yet_.
When will they be? When x86 translation is no longer shipped. It's rather simple.
Giving life support to dinosaurs isn't how you create a competitive economy.
IIRC their data center CPU revenue was about even this quarter so this is a bit deceptive (i.e. you can buy 1 large CPU instead of several cheaper ones).
"When it comes to servers, AMD's share totaled 24.2%"
and
"Intel, of course, retained its volume lead with a 75.8% unit market share."
Ignoring ChromeOS, and assuming 100% of windows and linux is x86 (decreasingly true - the only win11 I’ve ever seen is an arm VM on my mac) and 100% of Mac is arm (it will be moving forward), that puts arm at 20% of the PC market.
Interpolation from your numbers puts intel at 64% (with a ceiling of 80% of PC; 25% of consumer computing devices unless windows makes a comeback).
Windows:
Intel: 64.23%
AMD: 35.71%
Linux:
Intel: 30.15%
AMD: 69.85%
In the meantime, AMD/ARM already won phones, tablets and game consoles.
Server purchasing decisions aren’t made by everyday people. Intel’s roadmap in that space slipped year for year for at least 10 of the last 15 years.
That leaves Intel with the fraction of the non-mac laptop market that’s made up of people that haven’t been paying attention for the last ten years, and don’t ask anyone who has.
Don't forget laptops. Intel has been terrible on laptops due to their lack of efficiency. AMD has been wiping the floor with them for years now.
2024 is the first year that Intel has released a laptop chip that can compete in efficiency. I hope Intel continues to invest in this category and remains neck and neck with AMD if we have any hope of having Windows laptops with decent battery life.
Anyway, in my questions for her about what she really cares about in a new laptop, power efficiency was not a concern of hers. She does not care about efficiency at all. All she cared about was a good enough screen (2560x1440 or better), and a fast CPU to run the new Photoshop features, and the ability to move it from one location to another (hence the need for a laptop instead of a desktop). I'd wager that for most people, the fact that it's a portable computer has nothing to do with how long the battery lasts away from an outlet. She can transport the computer to another location and plug it in. There are very few situations that require extended use away from an outlet, and even in an airplane, we often see 120V outlets at the seats. There's really no use case for her that puts her away from an outlet for longer than an hour or two, so efficiency is the least of her concerns in buying a new laptop.
So we went with a new Dell laptop with the Intel i9-13900HX, which beats the Apple M4 Max 16 Core in terms of overall performance in CPU benchmarks. I would have looked at an AMD based laptop, but the price on this Dell and the performance of the i9 were great, it was $999 on sale. It's got a decent enough screen, and we can easily upgrade the RAM and storage on this laptop.
I doubt she'd even care if the new laptop didn't have a battery at all, so long as she can easily stuff it in a bag and carry it to another location and plug it in. I feel the exact same way, and I recently bought a new (AMD based) laptop, and power efficiency was not a thing in my decision making process at all. The battery lasts a few hours, and that's plenty. I don't get a hard-on for battery life, and I'm not really sure who does. Are these people dancing around with their laptops and simply can't sit still and plug it in?
Handed the wife M2 Macbook Air and she's thrilled how little she has to plug it in. She goes weeks between charges sometimes.
Not trying to invalidate or lessen your complaint (which I completely agree with) but want to make sure you are aware of OpenCore Legacy Patcher. It's a little hacky by nature but can give some extra life to that machine: https://dortania.github.io/OpenCore-Legacy-Patcher/MODELS.ht...
Evidently, that leaves Intel the majority of the market.
A friend who worked in film post production was telling me about similar rare but annoying problems with Mx Apple computers. I feel like there are verticals where people will favor x86 chips for a while yet.
I am not as close to this as I was when I actually programmed games (oh so long ago!) so I wonder if this is just the point of view of a person who has lost touch with trends in tech.
I think this is partly because big OEMs doubt (used to doubt?) AMD’s ability to consistently deliver product in the kind of volume they need. Partly it’s because of Intel’s historically anticompetitive business practices.
> Multiple reports, citing sources at laptop OEMs, have covered what is said to be poor support, chip supply, and communication from AMD with its laptop partners, leading to generally poor execution. Chip consultancy firm AC Analysis says AMD's shift of focus to AI and data centers has led to a "'Cold War ice age' in relationships with OEMs," leading to a loss of trust from its partners.
https://www.tomshardware.com/tech-industry/amds-laptop-oems-...
That's because Intel bribed OEMs to use only Intel chips
Let them die. Maybe we'd actually see some new competition?
Nonetheless, his comment about Nvidia being lucky was anything but a smart comment.
not sure what a "grown" semiconductor fab is but follow this link and sort by location https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat... The number of homegrown companies with fabs is greater than 1
It would create so many losses, not just in terms of security; the effect on the economy would also be very high.
The struggling companies with totally rotten management like to bring in such "stars" (pretty shrewd people who built themselves a cute public image of supposedly talented engineers who got promoted into higher management on their merits) - Yahoo/Mayer comes to mind as another example - who de facto complete the destruction of the company while the management rides the gravy train.
He's also 63. Has plenty of money to survive the rest of his life. Has eight grandchildren. There's so much more to life than business. What's to say he doesn't want to simply enjoy life with more connection and community to loved ones around him?
I don’t know much about this guy but it’s reasonable to assume that any C-level exec will hold on to the position for dear life until they are forced out.
I don't know. Frank Slootman's retirement from Snowflake earlier this year was certainly not celebrated by any significant stakeholders. I'd imagine at some point someone like Frank realizes that they are worth more than Tim Cook, they consider that they're in their mid-60s, and they decide the remaining time they have on earth might be better spent in other ways.
Every person in the workforce, no matter how ambitious or how senior, is forced into the calculus of money and power vs. good years remaining. I expect the rational ones will select the balance point for themselves.
The Juggling Act: Bringing Balance to Your Faith, Family, and Work
https://www.amazon.com/Juggling-Act-Bringing-Balance-Family/...
News: https://www.cnbc.com/2024/12/02/intel-ceo-pat-gelsinger-is-o...
Intel CEO Pat Gelsinger ousted by board
Worse (for Intel), what could happen is an HP-isation of Intel - splitting it up and selling off the pieces.
But there is a lot of good news for them in the CPU world: another few billion granted, the military buying Intel, a new no-HT architecture. And with 80-bit memory like in Multics, there could be true virtualisation on x86.
Even if x86 is dead, Intel still has fabs - AMD can soon print in them :)
But those multigeneration refreshes are still a mystery - is it Intel's problem, or maybe something else, e.g. someone simply holding a patent? :>
https://finance.yahoo.com/news/intels-7-86-billion-subsidy-0...
This implies that he was pushed out, rather than chose to retire. I can't see anything in the article to suggest this, do you have another source?
Look at what Pat did to VMware. He's doing the exact same thing at Intel. He came in, muddied the waters by hiring way too many people to do way too many things and none of them got done appropriately. Pat is a huge part of the problem.
I had the unfortunate pleasure of watching him not understand, at all, VMware's core competency. It was a nightmare of misunderstanding and waste in that company under his leadership.
Intel turned into even more of a laughing stock under Gelsinger. I say: good riddance. He burned time, capital and people at both VMware and Intel. He's a cancer as a CEO.
Doubt.
Neither of the companies is particularly competitive on either processor or GPU architecture nor fabrication.
A merger of those entities looks like nothing but a recipe for further x86 stagnation and an even quicker death for the entities involved imho.
In particular I cannot see what's good in it for AMD. The fabs have no use/clear path forward. Their processors/GPUs either match or outmatch the Intel offering.
A Broadcom/Apple takeover of Intel sounds much more reasonable.
Out of curiosity, what would make Intel interesting to Apple? Apple already acquired Intel's modem business and they have their own CPU and GPU.
Yes Congressman, we agree to take over Intel fabs if you agree to drop this antitrust nonsense. And we would like our $20B from Google back too.
But I'm just speculating.
Apple currently really enjoys being on the very latest process node. It's not a given that they could match or improve on that with their own fab (Sure, there is a lot of VLSI design and materials experience, but that does not automatically translate into a state of the art process, and is unlikely to contain the magic ingredient to get Intel back on top).
And in the unlikely case it SHOULD work, that will only invite further regulatory headaches.
Just speculating.
I can, but it's not technical. Intel has a huge advantage in several markets, and has strong relationships with many OEMs like Dell. Intel, even though their market cap is now a fraction of AMD's, still has a huge lead in marketshare in OEM systems and servers. (Several other posts in this thread have real numbers.)
If AMD bought out Intel, it would now get all of that, and be able to push all these OEM and server customers into AMD's solutions instead.
Who is then? Apple is of course still ahead in lower power chips. But Apple is not in the the desktop/workstation/server market and there are hardly any alternatives to AMD or Intel there.
e.g. M2 Ultra Apple's fastest "desktop" CPU is slower than the 14700K you can get for $350. Seems pretty competitive...
I think a merger with Nvidia would be more likely given the antitrust issues that a merger with AMD would bring up.
At least in theory, a fully independent, split/spun out standalone fab removes this concern.
That said - what does Intel have to offer the top players here? Their fabs aren't state of the art. And what's the standalone value of post-spin fabless Intel if their chip designs are as behind as their fabs?
This certainly presents a conundrum for US policy since we need fabs domestically for national security reasons, but the domestically owned ones are behind.
I can’t imagine fabs have that level of automation. It’s not like sending a file to your printer. It’s a multi month or year project in some cases to get your design produced. There’s many humans involved surely.
AMD/NVIDIA are in same business as Intel and have same pool of customers.
Oh no no no no. Oh hell no. For now, we need competition in the x86 market, and that would kill it dead. Imagine Intel re-releasing the Core 2 Quad, forever.
>Gelsinger, who resigned on Dec. 1, left after a board meeting last week during which directors felt Gelsinger's costly and ambitious plan to turn Intel around was not working and the progress of change was not fast enough, according to a person familiar with the matter. The board told Gelsinger he could retire or be removed, and he chose to step down, according to the source.
I "predicted" this three months ago (really, it was inevitable), but gave it 1-6 months.
Now for the what-happens-next popcorn. In a normal world, they would go bankrupt, but this is very much not a normal world.
However, I think the fab and design should be separate companies, with separate accountability and goals/objectives. There is just too much baggage by keeping them coupled. It doesn't let either part of the company spread their wings and reach their full potential when they are attached at the hip. From the outside perspective, that is the thing that Pat has seemingly been focused on, keeping it together, and its why people have lost faith in his leadership.
I also don't think that from a investment / stock standpoint that accelerating the depreciation / losses related to restructuring on the most recent quarter was a wise decision, since what Intel really needed was a huge win right now.
That's the best exit case for the shareholders. It's the worst case for Intel's employees, customers and partners.
> would probably co-sign on it for national security concerns
This is equally laughable and detestable at this point in history. My personal security is not impacted by this at all. Weapons manufacturers honestly should not have a seat at this discussion.
> overriding the fact that it creates an absolute monopoly on x86 processors.
Yet this isn't a problem for "national security?" This is why I find these sentiments completely ridiculous fabianesque nonsense.
Dropping Pat will alleviate their feeling of having to do "something."
As for M&A, it wouldn't just have to be approved at the DoJ. And the Chinese will never ever approve of it (but would have to). If they do a transaction without approval from the CMA, it would be like a nuclear financial war.
I think it's high time to gut Intel into parts, a la GE. Sell the fabless side to QCOM or BCOM. Sell the fabs off one by one to GF, Tower, UMC or even TSMC. Find a PE firm for the leading edge and reconstitute it, with significant R&D credits, as a kind of Bell Labs 2.0.
Or something like that.
He came, he did all these things, he left (likely with golden parachutes): https://kevinkruse.com/the-ceo-and-the-three-envelopes/
My back of envelope calculation says Intel should be about $35 a share (150/4). If they stumble when they report Q4, I think the US Gov will twist some arms to make sure that fresh ideas make it onto the board of directors, and perhaps even make Qualcomm buy them.
I do think that Intel needs to ditch some of their experiments like Mobileye. It's great (essential) to try and "rebalance their portfolio" away from server/PC chips by trying new businesses, but Mobileye hasn't grown very much.
Perhaps this is a fall out from the election.
China is not ahead. Still they are capable of mass-producing somewhat capable chips, with good enough yields for strategic projects.
Also they can mass produce now outdated designs, which are still good enough for "office use" to provide continuity of the government bureaucracy's operation.
China has less advanced nodes where it can produce for industrial applications.
They have the potential to eventually get ahead, but right now a total embargo would only slow them down, not cripple them. That situation is acceptable for any state as a temporary measure while it reorganizes its efforts.
Also, Intel is still producing better stuff than the Chinese can. It is still ahead. And as I detailed above, I think it would need to fall behind way more to lose its strategic nature.
Also capacities in Taiwan and in South-Korea are very fragile from a strategic perspective, even if they are/were more advanced than what Intel has.
Intel invested tens of billions into the 20A and 18A nodes, but it has not paid off yet. News about yields seemed promising. Massive investments have been made. If somebody buys Intel's foundries now, they pay one dollar and take on the debt + subsidies. Intel can write off the debt but doesn't get the potential rewards.
Foundry is the most interesting part of Intel. It's where everything is happening despite all the risk.
You are describing the start of the vulture capitalist playbook for short-term profits and dividends right there: take subsidies and loans and sell everything to pay dividends (or rent it back to yourself via a shell company), then let the remaining stuff die and sell it off for an additional small profit. Don't know how it works here but it sure sounds like it.
> Don't know how it works here but it sure sounds like it.
Thank you for being honest and saying that you are describing how things you don't understand sound for you.
[1]: https://wccftech.com/intel-adds-14a-process-node-to-its-road...
>"Notable absent from that list is he fired Pat Gelsinger. Please just bring him back as CEO." - 2012 on HN, when Paul Otellini Retired.
>"The only one who may have a slim chance to completely transform Intel is Pat Gelsinger, if Andy Grove saved Intel last time, it will be his apprentice to save Intel again. Unfortunately given what Intel has done to Pat during his last tenure, I am not sure if he is willing to pick up the job, especially the board's Chairman is Bryant, not sure how well they go together. But we know Pat still loves Intel, and I know a lot of us miss Pat." [2] - June, 2018
I am sad to read this. As I wrote [2] only a few hours ago, the $8B from the CHIPS Act is barely anything if the US / Intel wants to compete with TSMC.
Unfortunately there were a lot of things that didn't go to plan, or, from my reading of it, were out of his control. Cutting dividends was a no-no from the board until late. A big cut of headcount wasn't allowed until too late. Basically he was tasked with turning the ship around rapidly without being allowed to rock it too much, all while it was taking on water at the bottom. And I will again, as I already wrote in [1], point the finger at the board.
My reading of this is that it is a little too late to save Intel, both as a foundry and as a chip maker. Having Pat "retired" likely means the board is now planning to sell Intel off, since Pat would likely have been the biggest opponent of that idea.
At less than $100B I am sure there are plenty of interested buyers for various parts of Intel. Broadcom could be one. Qualcomm, or maybe even AMD. I am just not sure who will take the foundry, or if the foundry will become a separate entity.
I don't want Pat and Intel to end like this. But the world is unforgiving and cruel. I have been watching TSMC grow and cheerleading for them since 2010, before 99.99% of the internet had even heard its name, and I know their game far too well. So I know competing against TSMC is a task that is borderline impossible in many aspects. But I would still have wanted to see Pat bring Intel back to the leading edge again. The once almighty Intel...
Farewell and Goodbye Pat. Thank You for everything you have done.
Similar to Yahoo a number of years ago, there's some real business still there, they just need to remove the expectation of constant quarterly calls and expectations and make long term, consistent investments again.
At this stage in Intel's life I think having a financier overseeing the company might actually be a good move. Engineering leadership is of course critical across the business, but the company has historically paid large dividends and is now moving into a stage where it's removing those payouts, taking on debt and managing large long-term investments across multiple parts of the business.
Someone needs to be able to manage those investments and communicate that to Wall Street/Investors in a way that makes sense and doesn't cause any surprises.
Pat's error isn't that Intel revenues are slowing or that things are struggling, it's the fact he continued to pay dividends and pretend it wasn't a huge issue... until it was.
Pat had the mandate from both the board and the market to do whatever he deemed necessary to bring Intel back to the forefront of semiconductor manufacturing and a leading edge node. Frankly, I don't think he used that mandate the way he should have. Hindsight is 20/20 and all, but he probably could have used that mandate to eliminate a lot of the middle management layer in the company and refocus on pure engineering. From the outside it seems like there's something rotten in that layer as the ship hasn't been particularly responsive to his steering, even with the knowledge of the roughly 4-5 year lead time that a company like Intel has to deal with. Having been there for so long though, a lot of his friends were probably in that layer and I can empathize with him being a bit over confident and believing he could turn it around while letting everyone keep their jobs.
The market reaction really does highlight how what's good for Intel long term as a business is not necessarily what the market views as good.
Folks in this thread are talking about a merger with AMD or splitting the foundry/design business. I doubt AMD wants this. They're relatively lean and efficient at this point and turning Intel around is a huge project that doesn't seem worth the effort when they're firing on all cylinders. Splitting the business is probably great for M&A bankers, but it's hard to see how that would actually help the US keep a leading semi-conductor manufacturer on shore. That business would likely end up just going the same way as GlobalFoundries and we all know that didn't really work out.
The most bizarre thing to me is that they've appointed co-CEO's that are both basically CFO's. That doesn't smell of success to me.
I think one of the more interesting directions Intel could go is if Nvidia leveraged their currently enormous stock value and moved in for an acquisition of the manufacturing division. (Quick glance shows a market cap of 3.4 trillion. I knew it was high, but still, wow.) Nvidia has the stock price and cash to support the purchase and rather uniquely, has the motive with the GPU shortage to have their own manufacturing arm. Plus, they've already been rumored to be making plays in the general compute space, but in ARM and RISC-V, not x86. Personally, Jensen is one of the few CEO's that I can imagine having the right tempo and tenor for taming that beast.
I'm curious what others think of the Nvidia acquisition idea.
It makes sense once you understand the board has finally accepted the obvious reality that the only option remaining is to sell/spin-off/merge large parts of the business. Of course, the foundry business must remain in one piece and go to an American company or the US govt won't approve it.
Gelsinger 'got resigned' so suddenly because he wouldn't agree to preside over the process of splitting up the company. These co-CEOs are just caretakers to manage the M&A process, so they don't need to be dynamic turnaround leaders or even Wall Street investable.
Intel stock went up on the news not because the street expects a turnaround but because they think the pieces may be worth more separately.
Maybe it isn’t wise that the USA dump billions into private companies that can’t even get their own house in order.
I wouldn’t be surprised if the next CEO installed by the board will be hellbent on prepping INTC for sale. A few of the board of directors of INTC are from private investment firms.
AMD is gaining ground on x86 without the burden of fab, and Apple has demonstrated that desktop ARM is viable. Nvidia is showing the world that GPUs can handle parallel computing better.
1. Looking at the CyberMonday and BlackFriday deals, I see that the 12900K and 14900* series CPUs are what is being offered on the Intel side. Meanwhile AMD has released newer chips. So Intel has issues with either yield, pricing or other barriers to adoption of their latest.
2. The ARC GPUs are way behind; it seems obvious to me that a way to increase usage and sales is to simply add more VRAM to the GPUs - Nvidia limits the 4090 to 24GB; so if Intel shipped a GPU with 48 or 64GB VRAM, more people would buy those just to have the extra VRAM. It would spur more development, more usage, more testing and ultimately result in ARC being a better choice for LLMs, image processing, etc.
Nvidia is so far ahead that a single manufacturer won’t be able to compete for developers. Instead, the only chance the rest of the AI GPU market has is to build some portable open source thing and use it to gang up on Nvidia.
That means bigger GPU players (AMD, hyperscalers) will need to be involved.
Doesn't Intel have pretty decent support in PyTorch? It's not like most people working on/with AI use CUDA directly.
And especially for stuff like LLMs software is hardly the main or even a significant barrier.
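For what it's worth, device selection in PyTorch is about this simple on Intel hardware these days; a sketch assuming a recent PyTorch build with the XPU backend compiled in (older setups needed the separate intel_extension_for_pytorch package to register the same device):

```python
# Pick an Intel GPU ("xpu") if this PyTorch build supports it, else fall back.
import torch

if getattr(torch, "xpu", None) is not None and torch.xpu.is_available():
    device = torch.device("xpu")   # Intel discrete/integrated GPU
elif torch.cuda.is_available():
    device = torch.device("cuda")  # Nvidia (or ROCm-as-CUDA) path
else:
    device = torch.device("cpu")

model = torch.nn.Linear(4096, 4096).to(device)
x = torch.randn(8, 4096, device=device)
with torch.no_grad():
    y = model(x)
print(f"ran a matmul-heavy layer on: {y.device}")
```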
Having half the number of GPUs in a workstation/local server setup to get the same amount of VRAM might make up for whatever slowdown there would be if you had to use less-optimized code. For instance, running or training a model that required 192GB of VRAM would take four 48GB GPUs but eight 24GB GPUs.
They are just not very good? There is basically no point in buying the current gen equivalent 14900K with current pricing (the only real advantage is lower power usage).
Your comment confuses me. BOTH have released newer chips. Intel with the Core Ultra 200 series and AMD with Ryzen 9000. Neither are going to be on Black Friday sales.
> So Intel has issues with either yield, pricing or other barriers to adoption of their latest.
How does not putting their latest chips on sale indicate trouble selling them?
But Tavares has been doing a terrible job for the past year: New Jeep inventories stacking up on dealer lots for over a year, mass layoffs, plant closings, unreliable cars, and no meaningful price reductions or other measures to correct the situation. You couldn't pay me to take a new Jeep or Ram truck right now.
Does Apple use Intel foundry? No.
The two largest fabless companies in the world never used, and have no plans to use Intel foundries.
"Intel foundry" as a service is a fiction, and will remain so. Intel can't get others to use their weird custom software tooling.
Unlike true foundries which manufacture chips designed by customers.
Such plans would be something like 4-5 years ahead, so you wouldn't have heard of it yet unless they decided to talk about it. Chip development takes a long time.
Of course, that means you have to expect the foundry to still be there in 5 years.
It's unfortunate the situation is beyond almost anyone's grasp but I wonder if Pat should have talked less, and done more.
Gelsinger was -- emphatically -- the wrong person for the job: someone who had been at Intel during its glory days and who obviously believed in his heart that he -- and he alone! -- could return the company to its past. That people fed into this messianic complex by viewing him as "the return of the engineer" was further problematic. To be clear: when Gelsinger arrived in 2021, the company was in deep crisis. It needed a leader who could restore it to technical leadership, but could do so by making some tough changes (namely, the immediate elimination of the dividend and a very significant reduction in head count). In contrast, what Gelsinger did was the worst of all paths: he allowed a dividend to be paid out for far (FAR!) too long and never got into really cutting the middle management undergrowth. Worst of all, things that WERE innovative at Intel (e.g., Tofino) were sloppily killed, destroying the trust that Intel desperately needs if it is to survive.
No one should count Intel out (AMD's resurrection shows what's possible here!), but Intel under Gelsinger was an unmitigated disaster -- and a predictable one.
"On May 2, 2013, Executive Vice President and COO Brian Krzanich was elected as Intel's sixth CEO [...]"
The next paragraph is ominous ...
'As of May 2013, Intel's board of directors consists of Andy Bryant, John Donahoe, Frank Yeary, Ambassador Charlene Barshefsky, Susan Decker, Reed Hundt, Paul Otellini, James Plummer, David Pottruck, and David Yoffie and Creative director will.i.am. The board was described by former Financial Times journalist Tom Foremski as "an exemplary example of corporate governance of the highest order" and received a rating of ten from GovernanceMetrics International, a form of recognition that has only been awarded to twenty-one other corporate boards worldwide.'
I'm not sure where Intel needs to go from here - ultimately lots of problems are solved if they can just design a solid CPU (or GPU) and make enough of them to meet demand. Their problems recently are just down to them being incapable of doing so. If their fab node pans out that's another critical factor.
Intel still has tons of potential. They're by no means uncompetitive with AMD, really. The fabs are a millstone right now, and the entire reason they are as cheap as they are, until they can start printing money with them. It does feel like they don't have any magic, though, no big moonshots or cool projects left since they basically killed all of them :(.
[Edit]: Though it might have to be that Intel literally has to come to the brink of bankruptcy in order for that comeback to happen.
https://www.pcgamer.com/intel-ceo-says-amd-is-in-the-rearvie...
I own one, Arc isn't the best, but it's still able to compete with lower end Nvidia and AMD GPUs. They ended up having to mark them down pretty drastically though.
I actually owned an Intel-based Zenfone about 8 years ago. Aside from being absolutely massive, it was decent.
I think Intel got arrogant. Even today, with all the benchmarks showing Intel isn't any faster than AMD, Intel is still more expensive for PC builds.
Check Microcenter, at least in the US, the cheapest AMD bundle is 299 vs 399 for Intel.
They're lagging behind in every metric.
Intel indeed rested too long on their laurels from early 2000s. It's one of the most dangerous things for a company to do.
I find this view limited. If we look at the most successful tech CEOs, they all personally drove the direction of products. Steve Jobs is an obvious example. Elon pushes the products of his companies so much that he even became a fellow of the National Academy of Engineering. Jeff Bezos was widely credited as the uber product manager in Amazon. Andrew Grove pushed Intel to sell their memory business and go all in on CPUs. Walton Jr. was famous for pushing IBM to go all in on electronic computers and later the mainframes. The list can go on and on. In contrary, we can see how mere cheerleading can lead to the demise of companies. Jeff Immelt as described in the book Lights Out can be such an example.
Revenue expectations, margins expectations, and investors are entirely different between the two.
This is why most founder CEOs tend to either get booted out or choose to keep their companies private as long as possible.
> Revenue expectations, margins expectations, and investors are entirely different between the two.
Yeah, it's hard. How the great CEOs achieve their successes are beyond me. I was just thinking that a great CEO needs to drive product changes.
Not significantly. Massive product and strategy pivots are rare because they can fail horribly (eg. For every turnaround like Apple there's a failure in execution like Yahoo)
> How the great CEOs achieve their successes are beyond me
Luck in most cases.
I've worked with plenty of CEOs and in most cases, most people at the C-Suite and SVP level of large organizations are comparable to each other skills and intellect wise.
Administration/mid-level management buy-in is also critical. C-Suite doesn't have much visibility into the lower levels of an organization (it's unrealistic), so management is their eyes and ears.
It can be. You've just noticed the fact that for publicly traded companies where the value of the stock exceeds the total value of the assets you actually get to put that difference on your books into categories like "Good will" or "Consumer confidence."
For companies struggling to create genuine year over year value this is a currently validated mechanism for faking it. The majority of companies do not need to do this. That companies operating in monopolized sectors often have to do it is what really should be scrutinized.
The guy thumped religious stuff on his twitter, didn't acknowledge competition, didn't reform company culture and was left in the dust.
Intel is better off without him.
When the news came out that 20A was canceled the spin was 18A was so advanced that they no longer needed an in-between node.
NOPE, what happened was that 18A was a failure, and they renamed 20A to 18A.
Hi! CEOing has always been a hobby of mine, but with the recent news, I thought this would be a great opportunity for us to synergize.
Sure, I don’t have much experience in the chip making world but I did buy a Pentium when I built my own computer. I also have never run a multinational company but I visited several other countries on a Disney cruise.
Let me lay it out- you would get a dynamic new CEO at a fraction of the going market rate, and I’d get to take my Chief Executiving skills to the next level.
Let’s do this together!
You can reply here and let me know if you’re interested.
A relative of mine with a PhD sounds exactly like this. Worked for Intel on chip-production tech then was hired by Apple about 10 years ago, working on stuff that gets mentioned in Apple keynotes.
Reminds me of the Boeing managers saying that they didn't need senior engineers because its products were mature..
A few blown doors and deadly crashes later, that didn't age well..
And this is a problem! Most of Intel’s recent major architectural changes over the last decade or so have been flops. [1] Quite a few have been reverted, often even after they ship. I presume that Intel does not actually have many people who are really qualified to work on the architecture.
[0] I'm talking about how it interacts with an OS and how you put it together into a working system. Understanding stuff like SIMD is a different thing.
[1] AVX and its successors are notable exceptions, but they still have issues, and Intel did not really rock the early implementations.
Genuine interest is not the only way to get great results; excellent pay can do it as well.
And a lack of talent can be fixed the same way: excellent pay.
I can actually believe this. Most of the other arguments tend to be rather vague, waving at implementation details or some stock-related motivation (like "we need TSMC's business"). By comparison, a lack of genuine interest among employees, in a shift that was never sold to them or to the market especially effectively, seems fairly believable.
Most people are there for the chips, for making great designs in silicon, and being market leaders in CPU architecture. Not running the equivalent of an offshoring division making other people's stuff.
The entire implementation has seemed rather haphazard and not sold with much real motivation. Honestly, the whole situation feels a bit like Afghanistan (if that's not too incendiary):
Nobody really knows why they're going. Nobody really knows what they're trying to accomplish. The objectives on the ground seem vague, ephemeral, and constantly changing. There's not much passion among the ground troops for the idea. The leaders always seem far away, making strange proclamations without actually putting boots in the dirt. The information released often feels multiple-personality-ish, like there are far too many cooks in the kitchen, or far too many puppeteers pulling strings in every direction. And afterward you find out it was mostly some dumpster fire driven by completely different motivations than the ones publicly acknowledged.
Given Intel’s stock returns over the past 15 years, Intel would have to offer insane cash compensation to rival big tech.
Levels.fyi indicates Intel heavily underpays, which is what I would expect.
https://www.levels.fyi/?compare=Intel,Apple,Google&track=Sof...
- WFH means you do not even have to move.
- Competitors have set up satellite offices in Hillsborough/Portland to poach talent.
- Intel does not feel like the stable employer anymore, with the almost yearly layoff waves since the BK era.
Has its downsides though. Having visited NASA Ames over in Mountain View for space-agency work, it has a lot of the same issues as Aspen: lots of sameness, lots of McWealthy pseudo-Pueblo architecture, lots of expensive blast-plain suburbia, lots of people who start with a "you're leaving money on the table" mentality, San Jose and San Francisco traffic nearby can be nightmarish, and the crime of Oakland doesn't have that far to walk.
With a lot of family in the Portland, Hillsboro, and Beaverton area, I can say that area also has its positives: relatively low crime in Hillsboro / Beaverton (Portland's not great on the east side) [1], wealthy neighbors, huge amounts of hiking / outdoors / mountain climbing / biking / botanical gardens / parks, much less blast-plain suburbia, somewhat private-feeling / hidden housing developments, the ocean nearby, the Columbia River nearby, significant art culture, lots of breweries / restaurants / food trucks / foodies, decent public transit, and if you want dry desert wealth like Cali, then Bend is not that far away.
Comparing the shows Portlandia vs Silicon Valley and Weeds is not a bad approximation.
[1] https://www.thetrace.org/2023/02/gun-violence-map-america-sh...
The equity compensation figures on levels.fyi are very rough estimates of what one might see, but it is possible that Meta has to offer more equity due to more perceived volatility in its business than, say, Apple or Microsoft. Or maybe more expected work hours/days.
But also, Meta has long been known to pay more, famously being the company that stayed out of the collusion that resulted in the High-Tech Employee Antitrust Litigation:
https://en.wikipedia.org/wiki/High-Tech_Employee_Antitrust_L...
The meetings with them felt that way too. Watch the same slides for months and wonder how nobody was moving anywhere on anything actually involving choices. "So, we've got these three alternatives for SLS, plus like 10 other nice-to-have engineer pipe dreams." "Right, you'll choose the obvious low-cost choice A. Why are we having this meeting again?" Many months later, after endless dithering, surprise: the obvious choice requiring almost no hardware changes is chosen. All the engineers' nice-to-have pipe dreams are thrown away. There was much rejoicing at the money successfully squandered.
That's a damn shame. Big tech monopolies are screwing up the talent market. Nobody can match their comp, and it's bullshit.
But alas, Intel is a mega bureaucratic company with a few tiny siloed teams responsible for innovating and everyone else being test and process jockeys. I think Gelsinger wasn't given enough time, personally, and agree with the OP that even someone like Elon would struggle to keep this sinking ship afloat at this point.
BK wanted wearables and Bob Swan wanted to cut costs; neither of them was a visionary, nor did they really understand that Intel was a hard-tech company. Intel had achieved such dominance in such an in-demand, growing market that all they had to do was make the technology better (smaller, faster, lower power) and the money would continue to flow. The mandate was straightforward and they failed.
No one stopped Intel from paying for talent. Intel’s shareholders decided to pay themselves instead of investing in the business by hiring great employees. That’s how you go from being a leader to a laggard.
> Didn't TSMC also struggle to hire enough skilled workers in Arizona?
That was mostly marketing so they could ask for more tax credits. In practice it seems to be going okay.
"At least he'll have company as he joins 15K colleagues headed for the door"
Heyoo!
I have to agree with a few others, don't really see why people looked up to him. Seems like a pretty mediocre CEO overall.
I also, personally, do not actually see the problem with a financial CEO. Ultimately I don't believe technical acumen is that important to the decisions a CEO makes - they should be relying on subordinates to correctly inform them of the direction and prospects of the company and act accordingly.
I am concerned about Intel's long term prospects - obviously he wouldn't be leaving if everything was looking up, IMO.
Something I was thinking about the other day: all the mega-successful chip companies at the moment (TSMC, Nvidia, Samsung) are led by founders or family of founders. It got me wondering whether, if you really want to innovate in the most innovative industry in the world, you need more skin in the game than stock options. Gelsinger was the closest thing Intel had to a founder-leader: someone who deeply cared about the company and its legacy beyond what they paid him, and who was willing to make tough decisions rather than maintain the status quo until his options vested.
I wonder if this is the nail in the coffin.
Then 2013 rolled around and I built a desktop with an AMD processor, because it was dirt cheap. I remember people saying that they were excited and looking forward to the upcoming AMD architectures and roadmap. I couldn't understand how AMD could possibly have come from behind to beat Intel.
Realistically, you just need to come up with a good roadmap, get good people, get the resources, and execute.
Roadmap is good, people - not so sure about this one, resources - seem a little low, execution - seems hampered by too many cooks. If Intel broke up into smaller companies and their new leading edge chip design group had 500 good people, I honestly think they would have a better shot. So I think we are seeing what makes the most sense, Intel must die and let the phoenixes rise from the ashes.
Fast forward to 2024/2025, and remember, anything is possible if AMD beat 2011 Intel.
He was sacked. The board fired him. Tell it like it is.
What does this mean for Intel? I think they’re too important to get bought or broken up because they have fabs in the US. They’re like Boeing in that sense. Or a big bank. They’re too big (strategically important) for the US to let them fail. But maybe they’ll get bought by another US company or investor. I dunno.