But will there be enough demand in 2023 or beyond for all this new capacity in such a cyclical industry?
By Wolf Richter for WOLF STREET.
Despite the shortages of certain types of semiconductors, overall sales by chipmakers around the world hit a new record of $49.7 billion in November (three-month moving average), the seventh record in a row, up 24% from a year ago, and up 35% from two years ago, according to data from the World Semiconductor Trade Statistics. With one month left, the industry has already set a new annual record.
The 23% plunge in chip sales from October 2018 through April 2019, marked in the chart below, was in part due to the collapse in demand for the specialized chips for crypto mining rigs, whose sales had collapsed after crypto prices had collapsed, with Bitcoin down by 85% from $20,000 in December 2017 to $3,200 by December 2018. But now crypto mining rigs are in high demand, along with all types of other semiconductors, and some semiconductors are in short supply, despite booming production.
The data and chart, being expressed in dollars, also show the impact of the price changes of semiconductors, which have been rising since October 2020. One of the components in the collapse of sales from October 2018 through April 2019 was a downturn in semiconductor prices.
The US semiconductor industry – companies such as Intel, TI, NVIDIA, etc. – still had 47% of the global market share in 2020, according to the Semiconductor Industry Association’s 2021 industry report, but they’re manufacturing part of their products outside the US, and some of their chips are manufactured by contract manufacturers in other countries.
In terms of semiconductors manufactured in the Americas by all companies, including companies like Samsung with plants in the US: Sales jumped by 28.7% year-over-year in November (three-month moving average) to $11.5 billion. This was the highest year-over-year growth rate of any geographic region.
But all regions showed large growth rates in semiconductor sales:
- Americas: +28.7% ($11.5 billion)
- Europe: +26.3% ($4.3 billion)
- Japan: +19.5% ($3.9 billion)
- China: +21.4% ($16.9 billion)
- Asia Pacific/All Other: +22.2% ($13.1 billion)
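As a quick sanity check on the figures above, the five regional totals can be summed and the blended global growth rate backed out. The regional values and growth rates come straight from the list above; the reconstruction of prior-year sales from each growth rate is my own arithmetic, so treat it as an illustrative cross-check:

```python
# Regional November sales (3-month moving average, $ billions) and YoY growth,
# as listed above.
regions = {
    "Americas":               (11.5, 0.287),
    "Europe":                 (4.3,  0.263),
    "Japan":                  (3.9,  0.195),
    "China":                  (16.9, 0.214),
    "Asia Pacific/All Other": (13.1, 0.222),
}

total_now = sum(sales for sales, _ in regions.values())

# Back out each region's prior-year sales from its growth rate,
# then compute the blended global growth rate.
total_prior = sum(sales / (1 + growth) for sales, growth in regions.values())
global_growth = total_now / total_prior - 1

print(f"Global total: ${total_now:.1f} billion")   # matches the $49.7 billion record
print(f"Blended YoY growth: {global_growth:.1%}")  # roughly the 24% cited above
```

The regional figures do indeed sum to the $49.7 billion global record, and the blended growth rate lands close to the 24% year-over-year figure cited in the article.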
Automakers and heavy equipment makers have been hit hard by shortages of some specific chips, and with just one component missing, the vehicle cannot be sold. Automakers have been discussing the prospects of shortages in 2022. Shortages are continuing, and there is no consensus, but it seems that at least some problems will persist, possibly into 2023. Many other issues had already improved in late 2021 and are expected to improve further in 2022.
The industry is heavily investing in production facilities, including in the US. But it takes years to build and equip these multi-billion-dollar manufacturing plants, and those plants won’t come online soon enough. There is a global drive underway to increase production. For China, semiconductors have become a national priority.
By the end of this year, construction on 30 new plants will have started, according to industry organization SEMI in June. And the amounts are huge. In September it estimated that $100 billion will be invested globally this year just for equipment for front-end fabs, where the silicon wafers are processed, up from a record $90 billion last year.
Governments around the world are pushing for massive investment in semiconductor manufacturing capacity. After the shortages, semiconductors are now considered a globally critical industry.
The four semiconductor giants – TSMC (Taiwan), Intel (US), Samsung (South Korea), and SMIC (China) – have released plans to invest nearly $400 billion in semiconductor plants around the globe over the next few years, according to the Nikkei Asia.
In addition, China, burned by US sanctions on semiconductors, is making a huge push to become 70% self-sufficient in semiconductors.
Given this flood of planned investment in new capacity, research group IDC is now penciling in the potential for overcapacity in 2023, according to the Nikkei. The semiconductor industry is cyclical, as the chart above shows, with demand suddenly plunging, and prices plunging, and huge investment projects being put on the back-burner. And suddenly when demand picks up, and then suddenly skyrockets, supply is constrained, prices spike, and it’s off to the races again.
But the shortages that began appearing in late 2020 are a different ball of wax than the regular supply constraints; and now the investment plans in reaction to those shortages are a different ball of wax as well, far bigger than ever before.
And someday, that boom in demand for crypto mining rigs is going to hit the skids again, or other demand falls off, such as from data centers. And then it’s back to the old cyclicality, with overcapacity and a glut of semiconductors waiting at the other end, once again. But for now, that’s just wishful thinking for automakers that cannot get all the components needed to put their cars together.
– This certainly will help INTEL to recover from its disastrous decision to burn $84 billion in share buybacks.
Intel hasn’t been competitive in the HEDT market for the last several years and now the server/cloud business is migrating to AMD after a few years of dragging their feet. Naturally Intel’s response was to adopt an architecture (“big.little”) that allows them to use marketing gimmicks and advertise high core counts and high clock speeds. Conveniently, they don’t highlight for the average Joe that the architecture has two types of cores – some fast, some slow, i.e. you don’t have the advertised N cores at the advertised X clock rate. F*** Intel.
They’ve “caught up” with the competition by recently announcing a new generation of chips that are faster than the competition and that generate enough heat to warm your entire apartment.
– Once everyone is “Working from home” and has bought a laptop then demand for chips is bound to make a nose dive.
There are still data centers and servers (they’re veritable chip hogs), computers of all kinds, cars (thousands of chips each), appliances, cellphones, crypto mining rigs, industrial products, just about any durable good that gets manufactured has chips in it…. Even the lowly toaster has semiconductors in it.
We had enough chips before COVID. So why the shortage now? My understanding was that in the case of auto chips, the auto companies canceled their chip orders, the chip manufacturers then re-tooled their chip production to other products, these orders then tied them up so they could not convert back to auto chips. This along with COVID issues hampering production. But the capacity should be there overall to meet demand, at least at pre-COVID levels. Overall demand is high for all products but that will wane. As Wolf predicts – a massive chip oversupply is in the making.
“My understanding was that in the case of auto chips, the auto companies canceled their chip orders,”
If it was ever true — auto supply chains are long-term planning events, and you cannot just go cancel a bunch of stuff — it might have been done on a small scale for a few weeks by a couple of companies in June 2020, if ever.
“… the chip manufacturers then re-tooled their chip production to other products,…”
That was never true. That crazy theory refuses to die. In 10 years, people will use it to explain whatever.
There were real problems. The causes are pretty clear by now:
Supply problems as chip plants shut down due to Covid outbreaks in Asia, and this is still happening; due to a fire at a plant in Tokyo that specialized in chips for the auto industry; due to the Big Freeze in Austin TX where several plants had to shut down, etc.
These supply problems combined with:
Huge demand as chips go into nearly all durable goods, and demand for durable goods spiked out the wazoo globally since June 2020 due to all the stimulus and lockdowns, when people couldn’t buy services (travel, etc.). This is durable goods spending in the US:
I’m not sure why Wolf is claiming that the auto industry couldn’t cancel their chip orders. Bloomberg seems to think they can and did:
The auto industry has been notorious for abusing their suppliers by refusing to honor their purchasing commitments and pricing. The problem was that when they tried this game on the semiconductor manufacturers those guys just reallocated their capacity. Add in a couple of damaged/destroyed fabs and you have a perfect storm for the auto industry.
OK, read the article you linked.
The article discusses how automakers were deprioritized by chipmakers because of their relatively small volume of orders (lot more consumer products sold than cars). And consumer product makers were prioritized because demand had exploded (Samsung, Apple, etc. prepping for 5g). The article talks about a whole lot of issues.
The article mentioned that “auto chip companies slashed orders” – not automakers. They slashed new orders, which is different from cancelling existing orders. When they wanted to increase supply later in 2020, they were deprioritized, and consumer products makers got the chips.
The article mentioned that COMPONENT SUPPLIERS — not automakers — “canceled orders originally planned for foundries in the first half of 2020,” which was over 18 months ago… this is long gone.
What the article kept saying over and over again is that automakers were being deprioritized in favor of consumer products makers, to the point where the EU and trade organizations had discussions with Taiwan, etc.
After the boom, always comes the bust.
Right now, the shortages are predicted to last through next year. However, a global recession is always possible (this reduces demand), especially after coming off the sugar high of all the stimulus money.
With semiconductors, most are simply cheap parts that anyone can make. China has the most success here. This could become a glut, even without a recession. This includes simple parts such as capacitors, simple sensors, lights, and the majority of the components of a motherboard.
Actual finished high-end processors, such as the one listed in the specifications of a device (and graphics cards, which are a form of processor), are among the most difficult things to make on earth. The technologies to make processors are becoming tightly guarded; American, European, and Japanese companies have to work together to supply the necessary technologies to make them. Taiwan is the current world leader at bringing them together to make the finished processor; South Korea is one of the runners-up. However, neither of these countries makes the essential machinery needed to make processors. Right now, Taiwan is helping build additional processor plants in America. China has no hope in the near future of making these processors using homegrown technology. Before relations soured, other countries (mainly Taiwan) put some processor plants in China; however, those processors have become obsolete. China cannot make up-to-date processors, and its current factories will have diminishing capacity as their machinery breaks down. A glut in this category is less likely, as processors become obsolete.
There is a middle category which includes various types of simpler processors and other components, such as video display drivers, specialty sensors, and much more. These parts don’t become out of date as fast, but they make little money, and there is less effort to build more of these plants to meet temporary demand. There will likely continue to be a shortage here for years to come, for some parts. Cars might continue to be hit by this. On computers, many of these intermediate components are being replaced or integrated directly into the main processor.
RAM, SSDs, and mechanical hard drives are their own situation as well; I haven’t paid as much attention to these.
I’d expect only a recession in Asia or a global recession is likely to cause a glut, except in the low-end category.
Probably in under 10 years, and in as little as 5 years, the smallest possible transistor size will be in production CPUs. Soon after, the main CPU will become stagnant. Soon after this happens, the real glut in semiconductors will begin.
“Right now, the shortages are predicted to last through next year.”
According to who? Everybody is just pulling predictions out of their ass. It was supposed to be over by the 3rd quarter of 2021 according to “experts,” until it wasn’t.
“just pulling predictions out of their ass”
In IT, we used to call that a Rectal Data Extraction.
Surely you mean a RDE?
We all know IT terms are three letter acronyms!
The question is whether there will soon be a glut, I said no, your response is that some said the shortages would have been over already and it’s still going.
I didn’t say shortages would end next year, just that shortages will continue thru next year.
Looking at the massive backorders for semiconductors, this shortage is lasting until next year at the shortest, but probably longer. Just like demand for most things, though, a recession can diminish demand; that’s the only situation I could see, within a year or two, that could cause a glut.
“the smallest possible transistor size, will be in production cpu’s”
It took me a second to realize you were talking about theoretical limits, I was confused by your comment. Anyway, yeah, they have to find a new paradigm, substrate, or both.
Quantum Photonics, “Where single photons mediate interactions between embedded on-chip memories coupled to complex photonic circuits.”
Dr. Edo Waks is one of the electrical engineers working on this new technology. He’s at the University of Maryland and part of the Joint Quantum Institute doing research on this.
‘Quantum confined emitters’ is one of the new frontiers. Work on this has been ongoing for a decade or so.
“However, incorporating rare-earth ions into a thin film form-factor while preserving their optical properties has proven challenging. We demonstrate an integrated photonic platform for rare-earth ions doped in a single crystalline thin film lithium niobate on insulator.” -19 December 2019, published by American Chemical Society.
New paradigm & new substrate are coming — someday.
This is a comment I made in the past.
The main way electronics are made faster is to shrink the transistors. In as little as 5 years or so, the smallest possible transistor size will be reached, meaning no more yearly speed increases. That also means no more yearly improvements in power consumption, no more shrinking of the size of the device itself, and much more.
Everyone is already switching to ARM processors, which will be a one-time speed boost. The Intel architecture has many legacy design shortcomings dating back to the 70s, and while ARM is nothing special, it’s modern.
The only big thing for consumer processors after that is the stacked processor. Basically, it’s a roughly 3-layer chip that combines most processing components into a single piece; this increases speed and will likely reduce the rare earth metals needed, though I still wonder if heat will be a problem. This is likely a one-time thing, though. It might not be far off and may be within 10 years.
A couple of hypothetical major increases are left that would mostly be one-time things, like using superconductors. But it’s unclear if that’s possible.
There are alternative processor types involving quantum mechanics and light; however, these won’t replace conventional processors and will mainly be in specialized computers and server farms. The light processor will be available in some consumer computers; it’s mainly thought to be good for some things like AI.
The tech community is pretty sure that transistors made smaller than about 1.4 nm would interfere with each other at the atomic level. This means about 1.5 nm is the smallest usable size. They usually just round down and call these limits 1 nm. Right now, 5 nm chips are commercially available and in the newest Apple devices. 4 nm is expected to launch this year, last I heard. 3 nm might be next year. After that, 2 nm and finally about 1.5 nm is all that is left. The last few shrinks might hit difficulties and take several years each.
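To put the remaining shrinks in perspective: if the node names were literal linear dimensions (in practice they are largely marketing labels), transistor area would scale roughly with the square of the feature size, so the path from 5 nm down to ~1.5 nm would buy only about one more order of magnitude in density. A rough back-of-the-envelope:

```python
# Rough density scaling if node names were literal linear dimensions.
# In practice they are marketing labels, so treat this as illustrative only.
nodes_nm = [5, 4, 3, 2, 1.5]   # the remaining shrinks mentioned above

base = nodes_nm[0]
for node in nodes_nm:
    # Area per transistor scales ~ (feature size)^2, so relative
    # density is the inverse of that ratio.
    density_gain = (base / node) ** 2
    print(f"{node:>4} nm: ~{density_gain:.1f}x the 5 nm density")
```

So even under this idealized assumption, the end of the roadmap delivers roughly an 11x density gain over today's 5 nm parts, and then the shrinking stops.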
Thank you Thomas.
I would assume that the 1.4 nanometer size limitation is connected to the Heisenberg Uncertainty Principle???
re “I would assume that the 1.4 nanometer size limitation is connected to the Heisenberg Uncertainty Principle???”
Not really. I believe it has to do with the (probability) wavelength of electrons and the probability that they will appear at a certain location. When the insulating gates get near that 1-2 nm, I expect the probability of tunnelling right past the insulating gate gets very high, making the transistor useless as an amplified gate-control device. And even sub-5 nm likely experiences much higher tunnelling gate leakage currents, draining/costing power/heat.
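The exponential sensitivity of tunnelling to barrier thickness can be sketched with the standard WKB-style transmission estimate, T ~ exp(-2·d·sqrt(2m·ΔE)/ħ). The ~3.1 eV barrier height is a commonly quoted figure for a Si/SiO2 gate oxide; the thicknesses here are illustrative only, not a real device model:

```python
import math

# WKB-style estimate of tunnelling transmission through a rectangular barrier:
#   T ~ exp(-2 * d * kappa),  where kappa = sqrt(2 * m * dE) / hbar
# The 3.1 eV barrier is a typical Si/SiO2 figure; thicknesses are illustrative.
HBAR = 1.054571817e-34      # reduced Planck constant, J*s
M_E = 9.1093837015e-31      # electron mass, kg
EV = 1.602176634e-19        # joules per eV

def transmission(thickness_nm, barrier_ev=3.1):
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

for d in (2.0, 1.0, 0.5):
    print(f"{d} nm barrier: T ~ {transmission(d):.1e}")

# Halving the barrier thickness multiplies the leakage by several orders of
# magnitude, which is why gate leakage explodes as insulators approach ~1 nm.
```

The takeaway matches the comment above: leakage does not grow linearly as the gate thins, it grows exponentially, so each shrink near the limit is far more painful than the last.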
Thank you for the answer to my question. Learn something new everyday at WOLF STREET.
I’m not sure if the uncertainty principle itself applies, but the rest of what you said is, I believe, the problem. It has been a while since I last heard and understood the exact problem. Previously, 3 nm was considered a potential final limit, even before the “1nm” limit, but TSMC has done some experimenting and “1nm” should be possible. Either way, we are pretty close to the end.
In theory, if a room-temperature or above-room-temperature superconductor is found and is viable, we might not be able to produce it immediately at the “1nm” transistor level, and we could end up going through years of shrinking the transistors again. It’s very iffy whether a stable, mass-producible superconductor exists.
Quite a lot can still be done with software to speed things up, and graphics cards could still see major design overhauls. There will still be some internal improvements to processors for a while, involving specialty cores, but consumer electronics will be largely stagnant and will probably drop in price quite a lot after the transistors stop shrinking and the stacked chip comes out.
Specialty cores are mini processors built into other processors, like the main CPU and graphics cards; they are hugely faster than general-purpose CPUs (like the main CPU) at the specific tasks they are designed for. Examples are encryption and video decoding/encoding. They have a special layout dedicated to the math involved in very specific tasks, and they can only be used for the exact math they were designed for. Most encryption cores, for instance, can often only be used for AES encryption and no other encryption. Because video takes huge amounts of processing power to handle, specialty cores are prevalent to take the main load off the main CPU and graphics card; when new video formats are designed, new specialty cores have to be designed to handle them, and existing ones cannot. For video decoding/encoding, a specialty core can handle multiple formats, but increasing the number of formats increases the core’s size, so only a small number are supported by any one core.
Graphics cards are largely a bunch of specialty cores designed to work together. Graphics cards might eventually be entirely redesigned; there have been design overhauls, but not a complete redo of them.
A lot of server tasks might eventually be handled by new specialty cores.
The main CPU, which is a general-purpose processor designed for any computer task, is the main imminently stagnant piece of the computer system.
I DO NOT trust the Chinese one bit on all those counts. Companies now use AI to optimize their chips and circuits. And yes, China now rules the AI scene (more patents). I simply cannot see them backing down on this one. Xi is pumping manufacturing in China in all fields, as if its economy needed more pumping…
I’m not convinced the West is coming out of this one unscathed …
China has the manpower and the brains. And their honor to save face.
My goodness, did you see the jets that they built? I know it’s all copy-pasted from the Americans’, but still, do not underestimate what a people can accomplish when they unite nationally… How about the Apollo missions in the 60s? Wasn’t that in a lot of ways built on US pride? What could China build on the same principle?
Now, I didn’t say that China has the same technology as the West at this moment, but they might (what am I saying), they will steal plans for the new smaller-chip infrastructure, etc. I’m not confident we will win this, tbh.
Where are those American- and European-made chips going to be used? In which factories? Apart from cars and planes, where is your new made-in-Canada toaster going to be made? Do you really think some of our computers will say “made in the Netherlands” or “made in France” on them? By undercutting all our industries for years, China has slashed our building capacity. I would love to buy a good drill from, say, Europe, but I ain’t paying those prices! They have us all by the balls and then some.
Patents don’t actually mean anything until they are upheld in court. Every year, many people file patents for ridiculous things that already exist, such as a key on a keyboard that allows you to scroll down in one-page increments, i.e., the Page Down key. The vast majority of patents are worthless.
I’ve heard quite a lot about how China is taking over AI, but I’ve never actually seen anything new or worthwhile they have made or actually done with AI (except DJI). The DJI drones are the only actual new invention that has come out of China during the rule of the CCP.
As far as planes go, which ones? All non-military planes being built in China are foreign-designed and have to be built in cooperation with foreign companies, which make all the hard parts; the jet engines still have to be imported. The military planes they make are of questionable quality, and many are likely to be unable to take off in rough conditions or when it’s raining, on top of reliability concerns.
Manufacturing has shrunk under Xi. Before Xi, China could have actually become a superpower, but Xi has ended that possibility. Manufacturing is migrating from China to many countries across the world. It’s more expensive, bureaucratic and dangerous to make things in China today; as opposed to Vietnam, Mexico, India, and many others.
I thought a lot of the US rocket technology in the 60’s originally came from Germany, from the Nazis.
Not even close to being accurate.
The issue for the majority of the “auto chip shortage” is 20- or 30-year-old semiconductor fabs – these have been inching up in capacity utilization for years and hit over 100% due to demand increases.
The demand increases aren’t primarily because of cars, but because of inserting chips into everything from toasters to light bulbs.
Anyone can build a “new” 2 or 3 decade old chip fab; the problem is the economics: How do you sell output from a (still) multibillion dollar new fab vs. the existing, fully depreciated stuff?
There are no functional differences – the only delta is availability and cost. So will surging IoT nonsense be mitigated by decadal rising costs of said chips?
These fabs were originally built using the “premium” cost passthroughs when they were new; a “new” fab for the same items doesn’t have this luxury.
A lot of auto chips are specialty chips; they are built to be more rugged and long-lasting. They might have things like specialty sensors as well. Some, yes, are just older chips.
These chips are in the intermediate category I mentioned, and it doesn’t make financial sense to build these plants. Shortages will continue.
The chips used in cars these days are no different in any way from any other chips built years ago.
Motorola used to be a major producer of 4 and 8 bit microcontrollers; the 8 bit microcontrollers are still the primary chip used in the ECU/ECM systems in cars. These ECU-ECM systems do everything from emissions monitoring to engine control to collect sensor data.
The only real difference between an automotive microcontroller and, say, one stuck into a toaster is that the automotive industry used to test their systems for up to 4 years before putting them into production. There can be minor differences in the manufacture – i.e., building more “wasteful” but robust interconnects – but I’d bet any amount of money you’d want to put up that this is no longer true, if it ever was.
Perhaps an understanding of the scale involved will help: annual sales of 8 bit microcontrollers is in the multiple billions range. It was around 2 billion/year back when I was still in the industry; it is probably not significantly higher today (3 billion? Probably not 4).
To compare: Worldwide computer sales in that era were in the 200-250 million range.
Intel could sell a CPU in 2002 for $1000; an 8 bit microcontroller cost well under 10 cents wholesale.
This is why the economics is going to be an issue: building a new fab for 8 bit microcontrollers means actual operating cost + investment depreciation will require wholesale prices of these 8 bit MCUs to increase by factors of 10 or more. $1/MCU isn’t a lot of money, all things considered, until you look at the volume and the risk said fab builder takes on: what happens if the demand wave subsides? Who will buy the $1 MCU if there is still $0.10 MCU capacity?
And we’re not talking about Intel leading edge CPU economics; Intel could build a new fab for each new generation of chips and depreciate it in 4-6 years; even at $1/MCU, it will require 10+ years of consistent sales to depreciate the capital cost.
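The depreciation argument above can be made concrete with a toy payback calculation. The capex, volume, and unit-cost figures below are hypothetical round numbers of my own choosing, not figures from the comment; only the $0.10-vs-$1.00 wholesale price contrast comes from the text:

```python
# Toy fab-payback sketch. Capex, volume, and unit cost are hypothetical round
# numbers; the point is how wholesale price drives years-to-recover-capex.
def years_to_recoup(capex_usd, units_per_year, price_usd, unit_cost_usd):
    margin = price_usd - unit_cost_usd
    if margin <= 0:
        return float("inf")   # the fab never recovers its investment
    return capex_usd / (units_per_year * margin)

CAPEX = 2e9          # hypothetical cost of a new legacy-node fab
VOLUME = 500e6       # hypothetical MCU output per year
UNIT_COST = 0.05     # hypothetical variable cost per MCU

for price in (0.10, 1.00):
    years = years_to_recoup(CAPEX, VOLUME, price, UNIT_COST)
    print(f"${price:.2f}/MCU -> {years:.1f} years to recover capex")
```

Under these made-up numbers, a fab selling at the legacy $0.10 price would take decades to pay back, while $1.00 pricing pays back in a few years, which is exactly the commenter's dilemma: nobody buys the $1 MCU while fully depreciated $0.10 capacity still exists.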
Reducing transistor size has not increased performance for 10 years.
The maximum frequency for CPUs is capped at ~4 GHz.
Smaller transistors reduced power consumption, and that’s a huge thing for the reliability of the whole system.
The performance gains have been obtained by increases in:
– CPU cache
– number of cores and, generally speaking, complexity (= number of transistors)
– mainboard and RAM frequency
Apparently the current economic equation pushes for the development of farms built with low- and mid-range CPUs instead of creating the super CPUs.
This isn’t quite right. CPU frequency is only one factor in the performance of a computer.
For complicated reasons, processors can only be made up to a certain size. Shrinking the transistor size allows you to put more transistors into a given area, increasing their density and increasing the total computing capability. If you look at the total number of transistors in a processor, you will notice that it has risen dramatically over the last 10 years.
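That dramatic rise in transistor counts can be illustrated with the classic doubling rule of thumb (a doubling roughly every two years). This is an idealization, and the starting count below is a round illustrative figure rather than any specific chip:

```python
# Idealized Moore's-law sketch: transistor counts doubling roughly every
# two years. The starting count is a round illustrative figure.
start_count = 1e9              # ~1 billion transistors (illustrative 2010-class CPU)
doubling_period_years = 2

for years in (0, 2, 4, 6, 8, 10):
    count = start_count * 2 ** (years / doubling_period_years)
    print(f"+{years:>2} years: ~{count / 1e9:.0f} billion transistors")
```

Even at this idealized cadence, a decade of shrinks multiplies the transistor budget about 32-fold, which is where the extra cores and ever-larger caches mentioned above come from.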
In gaming consoles, having a standardized set of specifications makes programming for them easier and more efficient. As a currently produced game console stays on the market over time and transistors shrink in size, its manufacturer will switch to the newer, smaller transistors without increasing the number of transistors, allowing them to make that game console smaller, cheaper, and more power-efficient. Some other devices are improved in this way as well.
If you look at the benchmarks for a device such as an iPhone, you will see that performance does in fact increase every year. Because they are faster and have a better processor design every year, they can do the exact same tasks as the previous iPhone using less electricity.
In the pre-2000 era, decreasing minimum geometry sizes (i.e., transistors and interconnect) yielded both improved speed and improved cost (size).
Starting around 2000 – the equation changed: you can reduce the transistor sizes but the power consumption (power in, heat generated, substrate leakage etc) became the dominant issue.
And that seems to hold true today. Pushing 10 or 20 nm transistors *could* be done to greater frequencies but the resulting power issues would destroy the chip.
And anyway, a major part of CPU performance has always been parallel predictive architecture: the use of multiple execution paths with “guessing” of the next command to speed up performance. Not coincidentally, these paths are precisely the cause of Spectre, Meltdown, and other architecture attacks on security.
So if you can’t predict and you can’t speed up – the next method is to add more cores. But again: parallel core processing benefits only a tiny minority of applications.
The reality is that there are minimal differences in hardware in the past decade; only the ever more wasteful OS setups and sloppy programming are why anyone needs a computer newer than 2016 for everyday use.
– “anyone needs a computer newer than 2016 for everyday use.”
I have a corporate-grade laptop built 10 years ago with an i3-3110M CPU. This device just refuses to die: it migrated successfully to Win10. I had to put an SSD inside because Windows doesn’t know how to run properly on an HDD anymore.
My genuine surprise is that I see not much difference in performance with my new 2021 laptop with the latest Intel i5 CPU.
My point is not that you can save $300 by keeping an old device; I have a strong feeling that the quest for computational power is suspended for PC CPUs; it certainly continues in graphics processors. I think that cloud computing killed the incentive to have powerful CPUs.
If you compare the performance on a single-threaded CPU task on a 2020 CPU vs. a similar-tiered (for its year) 2010 CPU, the 2020 CPU will win by a lot.
Yes, pushing frequencies higher was a problem.
Speculative execution is only one aspect of improving performance. Many security issues have been found in its implementation, but hardware and software fixes are being developed. RISC-based architectures like ARM depend less on speculative execution, and their overall architecture is simpler and superior today, so it will be solvable on RISC architectures, which all new processors will be as Intel x86 dies out.
Memory bottlenecks were always an ever-growing problem; to minimize that, most transistors in a CPU are now simply caches (memory). Smaller transistors have allowed the size of the caches to grow.
Smaller transistors lead to more processing units and more memory, thus greater performance.
You can get a newer CPU with the same number of cores and a lower frequency and still blow away a 10-year-old CPU with the same overarching architecture (x86).
For basic PC tasks, especially on Windows, most animations are timed to only go so fast, so that the user knows something specific happened and doesn’t possibly get motion sick. So for basic tasks on PCs it might appear that it’s not faster.
The reality is most people use their computers for email, internet and gaming.
My mother has been using a Pentium 4 computer for email and internet – this is a 2004 era computer. It is painfully slow and finding drivers for new peripherals (it is Win 7) is quite challenging, but it is still functional and acceptable for her.
For a 2016 era computer or level 2012 (as you note), email and internet will work fine.
But even for gaming: only a specific hard core group care about the latest graphics processors – and I’d bet those are the e-sports stars and wannabe stars.
And yes, there was crypto demand; there is still some crypto demand but the professional miners (which do 85%+ of the mining) use ASICs. I would posit that there is more demand on the machine learning/AI side though.
Net net: it isn’t that CPU use is suspended for PCs – it is that there are no non-gaming applications that need more powerful CPUs. Even Microsoft OS and application bloatware hasn’t been enough.
Even cloud: the primary reason cloud uses more powerful and multi-core CPUs is that they run virtual machines which emulate the actual everyday CPUs. I don’t know the precise breakdown, but the costs show this clearly: the majority of cloud CPU (i.e. fixed or variable availability, fixed CPU power) computer engines are clearly in the lower end because the high end is egregiously expensive. We’re talking hundreds to thousands of dollars per month per CPU – at which point there is zero economic point to using them except in very specific cases like huge data lake processing.
The comparisons you are referencing are rarely real-world applications; furthermore, all of the various benchmark references have been demonstrated to be more marketing than reality (i.e. they deliberately exaggerate differences in CPU performance, both by design and because the CPU makers have designed to maximize benchmark performance).
As I note above responding to Engin-ear – outside of gaming, there is not a lot of difference between a high end 2010 CPU vs. an everyday 2020 CPU for everyday use. I’ve booted up 2006 era Mac Servers for example – this probably cost $12K new; it has a 550GB HD, 16GB Ram and was a graphic designer’s main processing computer back when. I slapped in some software just out of curiosity – it works quite well.
The main differences between 2010-era machines and newer ones are RAM and HD: 2010-era drives are spinners vs. SSDs today (a huge difference), and 2010-era machines would have 1 or 2 GB of RAM vs. the 8 GB to 16 GB+ today.
Note that RAM and HD are changeable…
So again: I’ve never said there are no differences whatsoever. What I’ve said is that outside of gaming and a handful of very specific and small-population use cases, a PC from 8 or 10 years ago works fine. I’m writing this from a 2018 era top end laptop which is Windows 10 – and it performs visibly worse than my 2014 era top end Win 7 laptop. Both are from the same maker, the 2018 era has a better SSD and twice the RAM but Win10 and associated bloatware has offset much of the hardware differential.
Nor is this my imagination: I get occasional requests to do CPU intensive stuff like password cracking. I have had to regress our cracking servers to Win 7 because of the visible (negative) impact on performance. I’d use Linux but the professional software I use only runs on windows…
It’s true that for the average user who doesn’t game (on their PC) or do certain work tasks such as CAD and video editing, there is little real-world difference (in performance) with the newest CPUs (on a PC).
It’s true that Windows sucks, quite a lot, and we are often stuck with it, until the Linux community gets its act together and isn’t so incredibly fragmented.
Most people use gaming consoles like the PS5 for non-phone gaming. Both the PS4 and PS5 use 8-core AMD CPUs, and there is a big CPU performance increase between them; you can upgrade a PS4 to an SSD. You can run PS4 games off an external SSD on the PS5 and compare the results with a PS4 running that same SSD internally: even in backwards-compatibility mode (which, for compatibility reasons, shuts off a large number of new features) on a non-updated game, the PS5 shows a large gain in performance; 2013 vs. 2020.
The original question/statement was about the end of transistor shrinks bringing advances in consumer electronics to a far slower pace, and how that affects semiconductor demand and the general demand for electronics. The end of shrinking transistor sizes affects performance, device size, power consumption/heat, and much more. Also, the differences with shrinking transistors are much more apparent in devices like phones and tablets, which people have mostly switched to.
The transistor shrinks benefit mobile more, certainly. Mobile CPUs are largely ARM though – not custom-designed circuits like an Intel/AMD CPU (at least that was true in my time; can’t speak for today).
However, the main differences are again power related. ARM dominates the mobile CPU space because it is much easier to “tune” – a paper design for mobile bears far less resemblance to what comes off the silicon than the custom chips (where development of process and design are simultaneous).
The basic difference: for a mobile chip, the process is developed first and then the CPU is tuned to the process, while for desktop/laptop CPUs, the process and CPU are designed simultaneously.
Again, this is how it was in the runup to 45 nm; it probably has changed given Intel and AMD’s management switches.
Apple is a hybrid – they used to use off-the-shelf ARM designs, but they are large enough and charge enough that they switched over to their own custom CPU cores (on the ARM instruction set for mobile, non-Intel for desktop/laptop). However, they still don’t own the processes, although I suspect they have more input into a process than the typical customer.
Be that as it may, mobile power consumption has always been dominated by the RF portion: the radio signaling, the conversion from analog to digital and back.
The reality is that people use their mobiles for far less than desktop/laptop – I doubt anyone uses a mobile phone for say, graphics design or extreme modeling of stocks, climate whatever.
There are mobile games, but the small screen sizes mean far less graphics capability = far less GPU/CPU burden.
The main changes to mobile battery performance are far more likely due to ever larger batteries combined with RF signaling changes (5G vs. LTE vs. 4G etc); the actual chip performance has probably improved but the digital part was never large to begin with.
Even there, I’d bet a lot more of the improved battery performance isn’t the hardware performance so much as more storage. More storage means more caching of repetitive data = less transmission = less power use. I’m just making an educated guess here but there are only so many ways to skin a CPU cat.
Maybe because the US – with “no more forever wars, man,” down-to-2,500-troops Biden – is unlikely to mount an extremely long campaign on behalf of Taiwan.
As has already been mooted as a defence, the value of Taiwan to China is that it’s the world’s chip manufacturer, so a credible threat of destroying the fabs might deter China.
Personally, I think this explains the investment. Even little UK has a tiny fabrication plant in Wales (a country in the union) that can expand. Global chip production can’t be captive to China, so plants are needed elsewhere. Then the Taiwan issue is mainly relevant to Japan, although I think China might also think about pushing on to Singapore if they grabbed Taiwan, because it controls a trade route.
Singapore is predominantly ethnically Chinese in the same sense that Taiwan is, which was populated by Chinese Nationalists fleeing after the civil war.
The US is in a formal, obligatory, legal, binding defense treaty with Japan. If Japan is involved so is the US. Equally and more to the point, if the US gets involved, so is Japan. And Oz and the UK. The US doesn’t have a formal guarantee to defend Taiwan, it has just made it clear it would assist Taiwan.
China can’t wander around knocking off this place, then that one. I predict that IF China tries to invade Taiwan there will be a China vs. US-and-allies naval clash that China will lose without landing a man by sea on Taiwan. But say I’m wrong and China successfully invades Taiwan – and for fun in this game, also Singapore, because after all they’re all Chinese. Of course there would be a total, instant US embargo on trade with China, no doubt joined by the EU, UK and Commonwealth.
Re: they are all Chinese. At the time of the US revolution most were of English ancestry. The people of Chinese descent in Hong Kong, Taiwan and Singapore etc. do not want to be ruled by the CCP.
Destroying Taiwan’s chip industry would hurt the US and Western Europe far more than China, because those nations/regions are far, far more reliant on automation and consumer gewgaws than China is.
They get paid first, before every other part of the businesses that use them. For now it’s the safer trade, till it’s not. Cramer touted them. I remember AMD at 10! Didn’t hold! Thanks Wolf. Tape looks weird, almost lower lows and lower highs; long and short till it’s not.
Buffett has a good saying that you don’t get extra points as an investor because a product is complicated to make. Sometimes basic human needs and wants are more rewarding investments. Cyclical industries that require tons of capital can be tough for the average retail investor.
Without being impolite, I would have to say, Wolf, that the Fed does not seem, so far, as bothered about inflation as you believe.
Fully in the back seat of the car relaxing at the moment …
Yeah, I got this. Comfortably behind the curve to make sure I take a long time to control inflation, so I minimize the debt and reset yields out of the Japanese deflationary death-spiral level. Sorry, Bear blog!
They’ve said they are bumping 4 times in ’22. The market is the one relaxing in the back seat, acting as though the Fed hasn’t said anything. There are people commenting on this site who say if the market hiccups the Fed will retract. What is the Fed supposed to do, rent billboards or TV ads saying ‘Avoid risk assets’ or ‘Markets can go down’?
I will agree with you in one sense: it would have been better for the Fed to announce a rate hike of 0.5% for Jan 1. With inflation apparently gathering speed, and the WH upset, they might make one of those bumps 0.5% instead of 0.25%.
The Fed’s seemingly bending over to not disturb the sentiments of stock investors is worrisome. “All will be smooth,” Congress is told (I paraphrase), “we’ve got this one.” Unlike for the last several months. Yes, the Fed must calibrate the message and not create a panic. But does this signal more timidity, and appeasement of the wrong interests? It makes me think of the risk-aversion of the old aging Soviet bureaucrats.
phleep-your final sentence re-convinces me that our nation’s Cold War ‘victory’ was Pyrrhic…
may we all find a better day.
I worked in the semiconductor industry during the mid to late 90’s. It’s a very cyclical industry. My father also worked in the industry.
In my experience the industry is always in one of two possible states. Either they can’t hire people fast enough, or they can’t get rid of them soon enough. Not much middle ground.
I remember the international tensions about memory chips being ‘dumped,’ sold below cost of production. The 7400 type? Can’t remember if it led to tariffs or quotas.
The oil and gas industry was like that once upon a time….hey, it still is! I know, as I was in it 35 years.
A different dynamic this time in our consumption-based economy.
Without ENERGY, there is no economy!
B/c of demand for green alternative energies (which won’t replace the need for fossil fuels for at least a decade), the prospect for investors in the OIL & GAS industry is brighter than ever. Look at the oil price over the last 30 days (XLE as a proxy – from 55.40 to 63.37, YTD 9.02%, Div/yield 3.92%) – unless Omicron significantly reduces demand. Besides, and ironically, all the green alternatives need fossil fuels to produce/mine them!
“And someday, that boom in demand for crypto mining rigs is going to hit the skids again”
As of today there are 16,698 cryptocurrencies (which I’d guess is even more than the number of varieties of tulip bulbs and also reveals the incentive to quick riches by starting a new tulip bulb variety, mining a bunch while that’s still incredibly easy, and then hope that suckers pay you for them), 457 crypto exchanges, and a total tulip bulb market cap of $2,073,560,520,955.
When one type of illusory wealth coin mining is no longer profitable using GPUs, the miner may shift to one that still is or sell his abused GPUs to some other sucker and move to ASICs.
This will continue indefinitely as there is a limitless supply of suckers in the world until cryptos, instead of being a way for large investment firms to “earn” commissions and own large amounts of cryptos to participate in the pump and dump game along with their fellow whales, become a threat to the established financial order, fiat currencies, or any government sponsored digital currencies.
At that point governments which everywhere have life and death control over businesses through various means will make cryptos illegal tender for all purchases and conversions to approved currencies and cryptos will become universally useful only for what they are mostly useful and used for now – illegal activities.
And, unfortunately, at that point and only that point will the demand for mining rigs hit the skids. I say that as someone who laments the insane overpricing of PC GPUs all due to these moronic tulip bulb chasers.
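The switch-or-sell logic of miners described above boils down to simple arithmetic: mining stays rational only while expected reward exceeds electricity cost. A minimal sketch, with all numbers hypothetical:

```python
# Why mining-rig demand hits the skids: profit flips negative when the coin
# price (and thus reward per unit of hashrate) falls. All figures hypothetical.

def daily_profit(hashrate_mh: float, reward_usd_per_mh_day: float,
                 watts: float, usd_per_kwh: float) -> float:
    """Expected daily profit for one rig: revenue minus electricity cost."""
    revenue = hashrate_mh * reward_usd_per_mh_day
    power_cost = watts / 1000.0 * 24 * usd_per_kwh  # kWh/day * price
    return revenue - power_cost

# A hypothetical GPU rig: 100 MH/s drawing 300 W at $0.10/kWh.
boom = daily_profit(100, 0.05, 300, 0.10)   # coin price high: profitable
bust = daily_profit(100, 0.005, 300, 0.10)  # reward down ~90%: negative
```

Once `daily_profit` goes negative, the rational move is exactly what the comment describes: switch coins, sell the abused GPUs to the next sucker, or move to ASICs.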
Yep correct. Since government cannot control it, politicians won’t allow it. I’m sure they are working on a fake narrative as an excuse to make it illegal right now as we speak.
Escape from political risk (government control) opens into a huge endless frying pan of other risks and players. Even more iterations of rats are moving into the crypto space with all their faux cleverness and card tricks. Don’t stop believin’, bro, but I won’t die on that hill. I won’t join you in that foxhole (aside from a small side bet just for laughs). I’ll have me some real assets, thank you. I like a thing I can eat, or shelter myself with, that does not depend on raw, shifty belief of the next sucker.
The knee jerk all-gov-is-bad libertarianism is such a fake horizon, such a pie-in-the-sky, the province of the young or naive. That belief set is a contrary indicator I guess for GPUs and ASICs.
Surely, there are lots of suckers who are willing to trade government control for private tech-mogul control.
(Here’s a shocker: decentralized crypto will stay a unicorn. Search for “My first impressions of web3” to understand why.)
I bought a used GPU from a miner last crash and it’ll have to do till the next one. I hope it holds out- it’s doing perfectly fine so far. Built an all new computer except for the GPU. Even without the newest GPU it’s so much faster than the old computer. Well worth learning to build myself. I use AI photo editing software which is heavily GPU based.
1) A stunning 7% y/y inflation. NDX correction might not be over yet.
NDX might breach Jan 2020 low.
2) NDX corrections will slow down the “transitory” inflation.
3) USDCAD BB : Jan 26 2015 hi/ Feb 2 2015 lo @ 1.235. // BC Mar 16 2015 hi/ AR June 11 2015 @1.192.
4) USDCAD breached the cloud, trading inside BB, on the way down, above Sept 2017 low. @1.205
5) USDCAD retraced only 38% of the move from Mar 16 2020 hi to May 2021 low.
6) USDCAD next reaction will reach : 50% – 62%.
7) A higher USDCAD ==> inflation killer.
8) Thereafter, USDCAD might reach parity 1.000 or below in 2023/ 24.
9 ) Parity means 2008 style inflation.
10) Global overcapacity will kill it.
Markets have been very tepid as of late. Volume lower and movement significantly down.
People are very much in a wait and see situation right now. Everyone trying to call the top of the market.
Hopium remains very high, as evidenced in the significant spread in call options on DIA! The BTFD crowd (& hedge funds) keep jumping into every dip. There is a deep and inherent belief that the Fed won’t let markets slide more than 10%-20% (true or not), but that’s the perception in the air. Mkts have to slide beyond 20% before the panic hits!
Can anyone imagine a sudden or even slow, intermittent slide in a midterm election year? I am still waiting to witness it!
8) Thereafter, USDCAD might reach parity 1.000 or below in 2023/ 24.
Can will do anything to prevent this. Gov, finance and BoC want US biz and tourists. Their attitude to it being more expensive for us to visit, holiday, shop in US: ‘that’s too bad, eh?’
What could Can do about it? Don’t assume the Lib/NDP coalition doesn’t know how to spend money.
By the time Trudeau I and NDP were through, the C$ was 62 US cents.
In the race to overspend the US is formidable but Can is a contender. If the worst comes to worst and free day care etc. is not enough to lower the C$ back to 80 cents, we could exhume the Avro Arrow project and go really bust.
Crypto mining – what a colossal waste of resources.
Like clockwork, every time the government intervenes the market economy breaks. Subsidizing silicon manufacturing will produce a glut just as lowering interest rates produced inflation and unaffordable housing. Interventionism vs market economy. Not much different from the old Soviet 5 year plans. Clueless.
I worked for IDC during the summer back in the mid-80s when I was in grad school. We were in the middle of a semiconductor glut and IDC (Industrial Design Corporation, a spin-off of the big engineering consulting firm CH2M Hill) was scrambling for work. I spent the first half of the summer doing HVAC cost calculations for various potential fab sites around the world for TI. Then in the second half of the summer I was given the task of figuring out a new market for IDC in case semiconductors stayed on the skids for a long time. I came up with a paper (on my own) about the future of high-level containment labs for biological research. I laid out the technologies needed, especially for level-4 containment labs. I predicted that lots of these level-4 containment systems would be needed because working with novel viruses and other modified bio agents would be too dangerous without the most meticulous and advanced containment. Oops!
Are they selling more chips, or are they raising prices to increase sales revenue?
Samsung is expecting to triple their chip production in the coming years. It may take two years or more to expand an existing chip fab.
John Deere is doing R&D for autonomous tractors. EV sales resulted in increased chip demand. 5G requires more chips.
I saw dealer web pages filled with ads for large pickup trucks and large SUVs while shopping online. I bought a new 2021 Ford Escape SE yesterday. It has more electronics and lights in the cabin than I’ve ever had before.
A new fully loaded Corvette was advertised online for $110k, a new Mustang for $85k.
A Northern California auto dealer friend said on Friday: “It may be a couple of years before you can make a good deal on a new auto. Once this chip supply problem is over, there will be a glut of buyers who have been waiting in the wings, who have refused to purchase above MSRP.
There have also been many owners who have been selling their cars for bank, thinking that if they wait for this supply problem to blow over, they could win on both ends.”
A glut of used cars could be coming on line soon.
The latest Crypto news – Kim Kardashian, Floyd Mayweather and Paul Pierce are being sued for pump and dump scams for promoting a crypto that crashed 98%. Once the class action lawyers get hold of cryptos, they’ll be history. I wonder if their defense lawyers will be accepting worthless tokens for their fees?!?!?!
Personally I would short anything those three promoted.
Latest crypto news:
Robber tech baron, Jack Dorsey, wants to establish a fund to provide legal defense to those crypto “freedom fighter” techies, if ever they screw up.
(Also, lately, huge amount of money has been poured into lobby firms and used to grease the palms of people close to lawmakers with the goal to blunt any effort to regulate and police this space.)
Kim Kardashian’s Instagram followers and their stimmies are soon parted. And back to crying for more “justice.”
It will be interesting to see if a fraud case can be made here. I saw the Kim K. Instagram pitch, which said in essence (I paraphrase), “the price of this thing has gone way up, and I like it.” I.e., “Do you have a dog’s level brain so you will follow this sort of bait? Just go with the feeling of being my pseudo pal! Hop in!” It is the basic business model of everything touted by tabloid celebs.
Intel still has a theoretical chance to succeed; the new CEO is setting some challenging tasks ahead and has moved in the right direction already (no stock buybacks and the sale of the NAND business). Now the industry is waiting for a restructuring of management resources, so the plans can be materialized. I hope he finds some outstanding and committed people, because the damage done by the Sohail gang almost obliterated Intel’s manufacturing and specifically its development capability.
Chips are in everything now and the demand will just increase. Yes, crypto price crashes could send demand down sharply, but overall this is a growth industry.
Generally, great articles and insights by Wolf. I love it. However, re any semi glut in 2023 or beyond: if any, IMHO, it might be a very short-term problem, and a buying opportunity.
I mean, I’m thinking that the rapid shift to EVs and new energy and automation (also due to high incentives from governments) will surprise, and surge the annual demand growth of the mid- and low-end chips you are most worried about (e.g., MCUs, logic, etc.). Also, I expect that it will take at least 2 years for COVID to allow people to get back to their normal spending on services (R&R, food, fun, vacations, etc.), and until then they’ll have at least an extra $2-4K per year, which will get spent on consumer goods. Also, work from home is here to stay, incl. as a job-retention perk, resulting in a double spend on tech and a shift from office space to home space, which demands more consumer-goods purchasing too. So, while your crystal ball may be more clear on the timing of fast-rising supply, I don’t expect it is so clear about the various unexpected demand boosters over the next 1-5 years.
My 1937 Cord 810 has no chips. My 1979 low power 350 ci Corvette has no chips. My 1997 Expedition has 2 chips. My 1999 GMC has a couple of chips also. My 2002 BMW 525 Touring Wagon has maybe 3 chips.
All problems relating to the three newest vehicles are related to the chips.
I am in the process of buying a 1965 Mustang with no chips.
I used to own a ’68 Mustang back in the 1970s. POS already back then, with stuff literally falling off (arm rest, clutch linkage, etc.) and leaking (fuel, oil) and not sealing (wind and water coming into the car through the door jambs and by the windshield). Tach broken. Three-speed manual = POS personified. Brakes… well, don’t call them that. If you collect cars that you never drive, fine. Have fun looking at it.
Wolf; you aren’t wrong, but I know the car and its history. Thank you for the analysis. I will drive it for the next twenty years or so, then die. Get an Ecklers catalog, fix each item that fails, then move on.
I have owned a few cars in the last fifty years.
The most problematic was the 1989 Lotus Esprit. The second worst was the Rolls, then the Pantera. I used to drive every car until I fixed it, then got rid of it. Oh…, wait, the worst was the 1980 Delorean.
I miss crawling under the ole “66 GTO, taking off the drive shaft, pulling off the transmission and replacing the clutch, etc.
These days, changing the oil takes an hour. Then I have to get back on my feet.
I am just an old pilot with a lot of time. I look forward to the problems.
But really, I know exactly what you are stating.