The sad thing is that we enthusiasts are a small market compared to the overwhelming majority of computer users who don't mind locked-down devices, or at least don't mind until they've been bitten by the restrictions; and if by then the only alternative is retrocomputing, it's too late. For decades we enthusiasts have been able to benefit from other markets with overlapping needs, such as gaming, workstations, and corporate servers. However, many on-premise servers have been replaced by cloud services, the workstation market has been subsumed by the broader PC market, and PC gaming has faced its own challenges, from the push toward locked-down consoles to a GPU market squeezed first by cryptocurrency mining and now by AI.
One of the things I'm increasingly disappointed with is the dominance of large corporations in computing. It seems harder for small players to survive in this ecosystem. Software has to deal with network effects and large companies owning major platforms, and building your own hardware requires tons of capital.
I wonder whether it's even possible to make 1980s-era electronics without massive capital expenditures. How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I'd like to see a market for hobbyist computing by hobbyist computer shops, but I'm not sure it's economically feasible.
Yes, you'll probably have difficulty walking into a STORE to buy PC components, but only because online shopping has been killing local shops for decades now. You'll find it easy to get that stuff online, for better prices.
PCs have, since the very start, been getting more integrated with each generation. Not many people install sound cards, IDE controllers, etc., anymore. CPUs, GPUs, and RAM are about the only holdouts not integrated onto the motherboard these days. That could change if CPUs and GPUs become fast enough for 99% of people, and RAM gets cheap enough that manufacturers can put more on board than 99% of people will ever need. And while you might not be happy about that kind of integration, it comes with big price reductions that help everyone. But we're not there yet, and I can't say how far down the road that might be.
Replacing a mouse on a weekend because the old one broke is the same story (a button was smashed in and it wasn't usable at all any longer). Online wasn't cheaper, no same-day delivery was available at any price, and Amazon without Prime offered no next-day option either. The local chain store had one for immediate pickup, and I was gaming again in 30 minutes.
Of course, you have Newegg and other online stores.
Actually, his experience is the standard PC enthusiast experience for the vast majority of DIYers in many nations, and it is now under threat if businesses catering to consumers shut down.
I have to genuinely question this. I haven't heard of anyone I know buying PC components at a physical store in like 20 years, and I know people from various nations.
You have to take into account that same-day delivery from Amazon and the like is only a real thing in the USA. Most other markets don't have the same service, even with something like Amazon Prime. The only online store I know of that provides same-day delivery in my area in Spain is a physical (and rather expensive) chain, El Corte Inglés.
And it is available in other countries.
Even normal folks upgrade RAM. My aunt did so last year for her old desktop PC. PC components are available in the local computer hardware market of any nation. (Though admittedly, most people buy parts online nowadays and local hardware markets are shutting down)
No. I'm a PC enthusiast myself, as are most of those people I know. I run an online (PC) gaming community.
> (Though admittedly most people nowadays just buy online and local hardware markets are shutting down)
Literally what I was saying.
I'd love to see more market-style parts locations a la Huaqiang or Akihabara.
Which is why they shut down - the addressable market of people having an emergency need for an item from a limited selection of electronics isn't that big, and that's becoming the only market.
It's not your fault that you don't want to pay over the odds for everything when you're not in a rush, and it's not their fault they need to pay commercial rent, utilities, payroll, insurance and all the other overheads.
But the outcome is simply that staffed local physical shops have a lower efficiency ceiling in terms of getting items to customers.
That MicroCenter continues to exist tells me that there's at least enough people shopping for parts in meatspace that there's net revenue to be had.
Yes, Microcenter "exists", but primarily through selective cultivation of their locations. From a pure market footprint perspective, they are outclassed by a candle company, and many other niche businesses.
At no point was I denying that some people go to physical stores to buy components. I was just countering the idea that a majority of people do so, as opposed to ordering online.
30 years ago you would buy whatever was available locally; at best you could get the shop owner to order a part from his distributor's catalog, and that was it. And we didn't give it much second thought.
Now that we know we can get any brand or any model online, we are much more picky about our component choices. For me it's the same in other areas I'm knowledgeable about, like bicycle parts. Regardless of the price, more often than not the local bike shop doesn't have the exact tire model I want, so if I'm not in a hurry I order online. I wasn't unhappy buying whatever was available back in the day, since ordering online just wasn't a possibility and I knew less about what was available, even though I received magazines every month. Ignorance is bliss sometimes.
When I was an undergrad at Cal Poly San Luis Obispo 20 years ago, I relied heavily on Newegg, since there were no large electronics stores in San Luis Obispo back then for computer enthusiasts. Best Buy today has come a long way and is now a great place for PC enthusiasts, but this wasn't the case 20 years ago; it had much more of a consumer electronics focus back then. Four years ago I was visiting Cal Poly friends in Santa Maria; we were building my PC together. I bought the wrong power supply online, and so we ended up going to Best Buy in Santa Maria, where I was able to find the correct power supply for a good price!
Even with Best Buy's improved selection, nothing beats Micro Center in either Silicon Valley or Irvine, but if you're in neither location and Best Buy doesn't stock what you need, then you have to order online.
As much as I love Micro Center, though, nothing beats Yodobashi Camera in Akihabara, Tokyo. That store is electronics heaven, at least for new components. For used components, I peruse Akihabara's alleys, which are filled with small shops specializing in used and retro gear.
The one that my first PC came from (in 1988) was open for something like 20 years. Another that still remains has been there for 33 years.
Plus, I mean: Best Buy stocks some PC parts. So does Wal-Mart. They're not "local," but they're nearby and they have stuff.
I have complete confidence that I could leave the house in the morning with nothing but some cash, and come home with enough parts to build a performant and modern PC from ~scratch in about an hour or two -- including travel.
And that's Ohio -- it's flyover country, full of corn fields and cowpies.
But after I drive to Microcenter and shop there and drive back, I'm fuckin' tired. I won't want to build a PC when I get home. I'll want to think about either getting a pizza or going to bed, and the bed will probably win.
So usually, I don't shop at Microcenter at all. I adore that place (and yeah, I'm impressionable: Keeping Raspberry Pi Zero W's in stock at every checkout register and selling them for $5 made an impression on me), but it's just too far away from where I live.
What usually happens instead, despite still having much more local alternatives, is this: I order the stuff. It shows up on my porch a day or two later. I build it at my leisure.
- Went online, ordered everything for pickup (didn't pay yet)
- Drove there, they had it all bagged and ready
- I showed them online prices for some of the parts
- For the ones they could verify (I think it was all of them) by going to the website and checking, they matched the prices
- Then I paid and took my stuff home
I also got my M1 MBP there (it was 25% off when the M2 models came out).
Please, if you have a Microcenter near you, give them your business. I don't want them to go away. Once all this memory madness dies down, I'm going to go there to build a new gaming rig.
I walked into a Central Computers the other day and was flabbergasted. I had never seen a Threadripper PRO or a 10G switch in a store before!
https://m.youtube.com/watch?v=Xjj6uIIUT_0 gives a good idea of how it is now
But, I guess, if you need a mouse right now and don't insist on the absolute best price, they're still there, yeah.
In practice, I live two streets away from there and yet I do all my shopping online (not that I buy that many parts anymore).
Brick-and-mortar stores are as useless as they've always been. Even now they're selling old hardware (a couple of generations old or older) for more than it was ever worth. For example, one such store not far from me has been trying to offload a 12-year-old LCD monitor for several years now, at twice its original price. I wonder why.
People have been tinkering with electronic/electric modules for decades:
Rather: very commonly the local shops don't stock the parts that I would like to buy, and it is often hard to find out beforehand which kind of very specialized parts the local shop does or doesn't stock.
True story concerning electronic components: I went to some electronics store wanting to buy a very specialized IC, which they didn't stock. But since the sales clerk could see my passion for tinkering with electronics, he covertly wrote down the address of a different, very small electronics store, including instructions on which tram line to take to get there (I was rather new to the city), which stocks a lot more stuff that tinkerers love. I guess the sales clerk was as disappointed with the range of goods his employer had decided to concentrate on as I was. :-)
On the other hand, lots of former PC component stores now have whole rows of shelves full of mobile phone cases instead. I get that these have high sales margins, but no thanks ...
Thus, in my opinion it is not online shopping that killed local shops, but the fact that local shops simply don't offer and stock the products that I want to buy.
Increasingly, what we have are mobile terminals - possibly with a dock for monitor connections - for remote big iron. And the continuous push from governments for more control - seemingly synchronized demands for age gating (i.e. requiring IDs) and chat snooping - makes me think this remote hardware won't really be yours before long.
Windows, caught up in the LLM mania, is to be left by the wayside too.
Number one, you become a recurring subscription instead of a one-and-done sale, making it incredibly profitable for the industry.
And number two, the government can more easily snoop on your data when it's all in the cloud versus on an HDD in a box in your closet.
Granted, I think we're far from that future, but I do feel that's the future the powers that be desire, and they can use various mechanisms to push that behavior: convenience and pricing, for example making PC parts too expensive for consumers while subsidizing cloud and mobile devices to accelerate the move. And once enough consumers only know how to use Apple or Google devices, they'll be less inclined to spend more money to build a PC and learn what a Linux is.
But the first one? I'm less convinced. I think the underlying assumption is that companies look to make the most money off consumers. I can get behind that.
But just the other day I was looking at a new GPU and considered running a local LLM. I was looking at spending no more than 1000 €. That would get me a 5070 Ti with 16 GB. Not enough to reasonably run anything interesting. I'm not looking to "tinker with things"; I want to actually use them, mostly for coding. A JetBrains subscription would run less than 10 € per month [0] and keep me up to date with the evolution of things. My 5070 would be stuck at its mostly useless level forever, since I don't see requirements going down any time soon. If prices didn't change, I'd need more than 100 months, or over 8 years, to break even. And during those 8 years, I'd never have a decent LLM experience by buying it outright.
Sure, this would be a capable GPU for other uses. But in my case, it would just sit around under my desk heating up my room.
---
[0] You'd get a 20 € discount if paying for a whole year upfront (10/month, 100/year). I'm also excluding VAT, since I'd buy this for work and have a VAT registered company.
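Concretely, the break-even arithmetic above works out like this (a quick sketch using only the prices quoted in the comment; electricity and future price changes are ignored):

```python
# Rough break-even between buying a ~1000 EUR GPU for local LLM use and a
# ~10 EUR/month coding-assistant subscription, using the figures quoted above.
gpu_price_eur = 1000          # e.g. the 5070 Ti 16 GB mentioned in the comment
subscription_eur_month = 10   # JetBrains-style subscription, ex-VAT

break_even_months = gpu_price_eur / subscription_eur_month
print(f"Break-even after {break_even_months:.0f} months "
      f"(~{break_even_months / 12:.1f} years)")
# -> ~100 months, ~8.3 years; and the hosted models keep improving while the
#    local card's capability stays fixed.
```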
When computers go obsolete in 2 years, nobody wants them on their balance sheets; when they last 6, sales go down and you get more years of profit from owning them. This hasn't been an instantaneous transition, but a slow shift in how the industry operates.
The CPU is more expensive than the workstation Blackwell card. 8x 96GB DIMMs (96GB was at the sweet spot in price per GB; 128GB was more expensive per GB) are also more expensive now than the GPU. In fact, the prices for that kind of package on eBay seem to be approaching the price of the entire box.
Give citizens computers and they have encryption. This alone gives them a fighting chance against police, judges, three letter agencies, militaries.
Give citizens computers and they can wipe out entire sectors of the economy via the sheer power of unrestricted copying.
The future is bleak. Computer freedom is dying. Everything the word "hacker" ever stood for is dying. Soon we will no longer have our own systems, we will have no control, we will be mere users of corporate and government systems. Hacking will go extinct, like phreaking.
This fact brings me a profound sadness, like something beautiful is about to perish from this earth. We used to be free...
But low end iGPUs don't need a lot of memory bandwidth (again witness Apple's entry level CPUs) and integrating high end GPUs makes you thermally limited. There is a reason that Apple's fastest (integrated) GPUs are slower than Nvidia and AMD's fastest consumer discrete GPUs.
And even if you are going to integrate all the memory, as might be more justifiable if you're using HBM or GDDR, that only makes it easier to not integrate the CPU itself. Because now your socket needs fewer pins since you're not running memory channels through it.
Alternatively, there is some value in doing both. Suppose you have a consumer CPU socket with the usual pair of memory channels through it. Now the entry level CPU uses that for its memory. The midrange CPU has 8GB of HBM on the package and the high end one has 32GB, which it can use as the system's only RAM or as an L4 cache while the memory slots let you add more (less expensive, ordinary) RAM on top of that, all while using the same socket as the entry level CPU.
And let's apply some business logic to this: Who wants soldered RAM? Only the device OEMs, who want to save eleven cents' worth of slots and, more importantly, overcharge for RAM and force you to buy a new device when all you want is a RAM upgrade. The consumer and, more than that, the memory manufacturers prefer slots, because they want you to be able to upgrade (i.e. to give them your money). So the only time you get soldered RAM is when either the device manufacturer has you by the short hairs (i.e. Apple if you want a Mac) or the consumer isn't paying attention and accidentally buys a laptop with soldered RAM when competitors are offering similar ones at similar prices but with upgradable slots.
So as usual, the thing preventing you from getting screwed is competition and that's what you need to preserve if you don't want to get screwed.
Even if you have a surface area equivalent to a high end cpu and high end gpu, combined in a single die?
Separate packages get you more space, separate fans, separate power connectors, etc.
In theory you could do the split in a different way, i.e. do SMP with APUs like the MI300X, and then you have multiple sockets with multiple heatsinks but they're all APUs. But you can see the size of the heatsink on that thing, and it's really a GPU they integrated some CPU cores into rather than the other way around. The power budget is heavily disproportionately the GPU. And it's Enterprise Priced so they get to take the "nobody here cares about copper or decibels" trade offs that aren't available to mortals.
And before you bring up the “efficiency” of the Mac: I’ve done the math, and between the Mac being much slower (thus needing more time to run) and the fact that you can throttle the discrete GPUs to use 200-250W each and only lose a few percent in LLM performance, it’s the same price or cheaper to operate the discrete GPUs for the same workload.
My point was not that “it isn’t really that slow,” my point is that Macs are slower than dedicated GPUs, while being just as expensive (or more expensive, given the specific scenario) to purchase and operate.
And I did my analysis using the Mac Studio, which is faster than the equivalent MBP at load (and is also not portable). So if you’re using a MacBook, my guess is that your performance/watt numbers are worse than what I was looking at.
Ultra is about 2X of the power of a Max, but the Max itself is pretty beefy, and it has more than enough GPU power for the models that you can fit into ~48GB of RAM (what you have available if you are running with 64GB of memory).
In pretty much any other situation, using dedicated GPUs is 1) definitely faster, like 2x the speed or more depending on your use case, and 2) the same cost or possibly cheaper. That’s all I’m saying.
Very feasible but it would have to be redesigned around the cell libraries used in newer nodes since the i386 was manufactured on >1um size nodes.
Prototypes would cost around $1-2k per sq mm at 130nm and $10k per sq mm at 28nm (min order usually around 9 sq mm). Legacy nodes are surprisingly cheap, so non-recurring engineering will generally be the bulk of the cost. The i386 was originally >104 sq mm but at 1um, so you could probably fit the entirety of a i386 clone in 1-2 sq mm of a 130nm chip. Packaging it in the original footprint and the lead times on prototypes would probably be more annoying than anything.
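To put rough numbers on those figures (a back-of-the-envelope sketch; the $1.5k/sq mm rate is my midpoint of the quoted $1-2k range, and the 2 sq mm die size is the estimate from the comment above):

```python
# Ballpark prototype cost for a shrunk i386-class clone, using the quoted rates.
cost_per_mm2 = {"130nm": 1500, "28nm": 10000}  # USD per sq mm (130nm is a midpoint of $1-2k)
min_order_mm2 = 9                              # typical minimum order quoted above
die_mm2 = 2                                    # rough estimate for a 386 clone at 130nm

for node, rate in cost_per_mm2.items():
    # You pay for the minimum order even if the design is smaller than it.
    billed = max(die_mm2, min_order_mm2)
    print(f"{node}: ~${billed * rate:,} for a prototype run ({billed} sq mm billed)")
# -> 130nm: ~$13,500; 28nm: ~$90,000. NRE (design, verification, packaging) comes on top.
```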
I'm really wondering about hardware as well, but today's tech is surprising in its scale and requirements. I wouldn't be surprised if we could do mid-'70s tech as hobbyists today, but further than that...
No one is saying that it's the sole culprit. But when average PCs start costing $3000+ from now on, it seems like the end of an era.
- Hardware
We won't have any hardware without secure boot and we won't have the signing keys. Signed firmware is required for everything x86, everything Apple, everything mobile, probably everything ARM too. Rockchip ARM could still boot without signed firmware last time I checked a few years ago, but I'm not sure about the newer CPUs.
[ Short story: I have an Asus Tinkerboard S. It came with an unprovisioned emmc onboard. Just empty. A few years ago I made the mistake of trying the official OS from Asus. It automatically set up the CPU to boot from emmc first and provisioned the emmc with boot0, boot1 and rpmb. These partitions can't be written and can't be removed. Their creation is a one-way operation according to emmc standards. Now I have to keep the emmc masked because it won't boot otherwise. So beware of any devices with emmc. ]
You can, of course, use MCUs for general computing today. The ESP32 is pretty powerful: it's probably 4 times faster than a 486, certainly more powerful than the i386 or 68000 you suggested. The big problems here are memory, graphics, and software. Having no MMU requires a completely new OS. Linux (uClinux) could boot without an MMU at some point, but it won't fit in 540KB of memory. MCUs can access external memory (PSRAM), but via slow buses, and it's paged. Also there are no high-speed buses for graphics.
There is some hope coming from the Chinese CPUs. AFAIK, they don't support secure boot at all. I'm planning on getting one as soon as their proprietary firmware/UEFI/ACPI can be replaced by u-boot.
- Software
It's useless to make an i386 or 68000 today. There's nothing but old software for them. Not even Linux has i386 support anymore. This is an even bigger problem than hardware. Much, much bigger. To have any hope of a minimally useful computing platform, we need a working browser on that platform. This is an absolute requirement. There's no way around it. I had to abandon Win98, then WinXP, and soon Win7 because of the lack of a working browser.
Linux is generally usable today, but as soon as Linus retires, it's going to fall into the same user lockdown as all the others. The basic infrastructure is all in place: secure boot, root login basically deprecated, access rights and security settings administered by the distro and package manager rather than the user, no per-program firewall rules, automatic updates as standard, surveillance infrastructure via udev (hardware events), dbus (software events), and GTK accessibility (input events), etc. Linus fought hard to keep these outside the kernel, but he won't live forever.
To have any hope of privacy and/or freedom on our personal computers, we need to turn the security paradigm completely on its head: users are not the threat - programs are. The user should log in as root (or the system account in Windows), and every program, including services, should run under its own limited account with no access to the rest of the system, especially the user's files.
Of all the OSs today, Gentoo's Portage is probably the easiest package manager to tweak into creating accounts and groups for programs instead of users, and GoboLinux has the best separation of programs in its filesystem. I'd love to see a merger of the two.
Hobbyist computing? Ha! First get a browser working.
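For what it's worth, here is a minimal sketch of that inverted model in practice. It assumes a dedicated unprivileged account (the name "prog-browser" is hypothetical) was created for the program at install time, e.g. by the package manager, and that the launcher itself runs with root privileges, as described above:

```python
# Minimal sketch: launch a program under its own dedicated, unprivileged
# account so it cannot touch the rest of the system or the user's files.
import os
import pwd
import subprocess

def run_as_program_account(cmd, account):
    """Run `cmd` under the dedicated system account created for that program."""
    entry = pwd.getpwnam(account)  # e.g. a hypothetical "prog-browser" account

    def drop_privileges():
        # Order matters: supplementary groups first, then gid, then uid.
        os.setgroups([])
        os.setgid(entry.pw_gid)
        os.setuid(entry.pw_uid)

    return subprocess.run(cmd, preexec_fn=drop_privileges,
                          cwd=entry.pw_dir, check=True)

# Example (requires root and an existing per-program account):
# run_as_program_account(["/usr/bin/browser"], "prog-browser")
```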
I'm actually a big fan of Apple hardware (when you crunch the numbers for base spec machine and when you're able to get discounts, the price/performance for the half-life you get is incredible), but I'm also planning to get back into home-brew builds a bit more over the next year: I need to build a NAS, a home lab, I might look at a gaming rig... and I'm far from alone.
So yes, it's a niche market, but a profitable one for a lot of players, and one that Micron will be glad is still available to them when the data centre bubble bursts.
I don't really believe in AGI if that's what they're going for, but hey they'll get something close to that.
Micron is exiting this business because it’s a commodity. It’s everywhere. There are numerous companies producing parts in this space. Their investments are better spent on other things.
> I wonder whether it's even possible to make 1980s-era electronics without massive capital expenditures. How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I don't know what your threshold is for massive capital expenditure, but you could get a tapeout for a 68000 clone more easily than at any point in history. There are now even tapeout services that will let you get a little piece of a shared wafer for experimenting with making your own chips at very low prices. There are even accessible tools available now.
The hobby and small scale situation is better now than at any point in history. I don’t know how anyone can be sad about it unless they’re ignoring reality and only speculating based on the most cynical online takes about how the future is going to do a complete 180.
Numerous as in plenty? Or basically three? Samsung, SK Hynix and Micron make up over 90% of the market share of DRAM. Micron saying goodbye to the consumer market basically leaves us with yet another duopoly.
You don't buy SK Hynix or Samsung RAM (unless you are an OEM typically.)
Consumers can still buy RAM from companies like G.Skill, TeamGroup, Corsair, Kingston, etc. Those companies can use chips from any of the big three.
Crucial winding down doesn't mean there are fewer flash fabs in the world, just like there are plenty of DRAM companies selling to the consumer market that don't have their own fabs (Corsair, G.Skill, Geil, etc). Those consumer-oriented brands bought flash from Micron (not Crucial), SK Hynix (which also doesn't have a consumer-facing DRAM brand), and Samsung. That's just as true today as it will be after February 2026, when Crucial closes its doors, because Micron selling components to, say, Corsair is an enterprise transaction from Micron's perspective (B2B), even if the ultimate end product is a consumer-oriented one.
What you linked is OEM RAM. The enterprise market still uses lots of UDIMMs, even though it certainly favors RDIMMs. Yes, they're DDR UDIMMs that are compatible with consumer PCs. But they're not sold the way Samsung or Crucial consumer-oriented memory is, at least in the US market. Check their availability compared to, say, Crucial or Samsung: it's virtually non-existent. Check the comments on that very site as well, and note how many people are talking about getting them from pre-builts (aka OEM/SI). Note that SK Hynix selling UDIMMs to SIs/OEMs for consumer products is not a "consumer transaction" to SK Hynix; it's a B2B enterprise sale to another company from their perspective.
So technically yes, but practically no, not in the US they don't. Elsewhere I'm unsure, but SK has been deprioritizing consumer DRAM sales for a while now, just like Micron.
This massive tide floats all boats, and there will be smaller players filling the gap. With slightly worse chips at first, yes, but historically DRAM has been a cyclical business where new manufacturing capacity was brought online en masse after a boom.
Here is a video of Sam Zeloof, who now runs Atomic Semi with Jim Keller. It likely took thousands of dollars to make his own custom Z2 chip, which only has 1200 transistors and is nowhere near the likes of the 68k or Intel 386. They might have more advanced stuff now at Atomic Semi, but they haven't announced anything.
You should look into what's happening with DIY robotics, because it looks eerily similar to what I experienced in the early to mid '90s with PC hardware and software.
And you can do way more than just host a BBS with robots.
Jump in the DeLorean and head to the 1980s / 1990s
This is a big loss. Crucial offered a supply chain direct from Micron. Most other consumer DRAM sources pass through middlemen, where fake parts and re-labeled rejects can be inserted.
From what I understand, OpenAI just bought out a significant proportion of the capacity of Samsung and Hynix, which is the big reason prices just spiked. They're two of the three DRAM manufacturers, Micron being the third.
That gives us a good idea as to what Micron is doing here: They have contracts with their other customers to supply DRAM at whatever price they previously negotiated, but now prices are higher, and they don't have to honor supply contracts with Crucial because they own it. So they're taking all the supply that would have gone to Crucial and selling it to the high bidder instead.
Spinning off the brand in that context doesn't work, because then "Crucial" would need to come with supply contracts or they'd have no access to supply and have nothing to sell. Moreover, the supply constraint isn't likely to be permanent, and whenever prices come back down then Micron would still own the brand and could start selling under it again, which they couldn't if they sold it.
Why not just announce limited supply, then, instead of exiting?
This seems like an "automaker invests more in its financing arm because it's the most profitable" concentration mistake, toward an industry with widespread concerns about intermediate-term financial sustainability.
Because it would be a lie. The amount of supply they're planning to allocate to Crucial in the near future is zero. Keeping the website up as a place you can go and order nothing would only mislead people into thinking they should come back in a few days when they're restocked, when that isn't going to happen anytime soon.
> This seems like an "automaker invests more in its financing arm because it's the most profitable" concentration mistake, toward an industry with widespread concerns about intermediate-term financial sustainability.
Their business is manufacturing DRAM. Selling it at retail as well makes sense most of the time, because they capture the retailer's margin too, but it doesn't right now, and that's the side business.
There are reputable eBay sellers shipping decomm'd previous-gen servers for bargain basement prices.
If you're fine running last gen (and you should be, for home lab use) then it's worth it to monitor prices over time.
They typically hit a floor when coinciding decommissioning waves swamp demand with supply, but the resellers still want to move things as quickly as possible.
I don't know if I'd buy any of the servers or older computers, but the internal components are pretty dang good. Stuff like hard drives, network cards, and HBAs, that's the real money saver right there.
I upgraded my home rack to use CX3 cards, and I sport a DS4246 that has a single 18TB SAS hard drive in it lol.
Kind of cool that I have a 56Gbps network between them, though I can only get 30Gbps max in iperf3 lol (due to some PCIe bandwidth limit).
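A rough sanity check on that ceiling, assuming the card ended up on a PCIe 3.0 x4 link (that's a guess; the comment doesn't say which slot it's in):

```python
# Back-of-the-envelope check: a PCIe 3.0 x4 link tops out around the ~30Gbps
# observed in iperf3, before protocol overhead is even counted.
GT_PER_LANE = 8.0        # PCIe 3.0: 8 GT/s per lane
ENCODING = 128 / 130     # 128b/130b line encoding
LANES = 4                # hypothetical x4 link (e.g. a chipset-attached slot)

usable_gbps = GT_PER_LANE * ENCODING * LANES
print(f"PCIe 3.0 x{LANES}: ~{usable_gbps:.1f} Gbps line rate, less after TLP overhead")
# -> ~31.5 Gbps, in the right ballpark for a ~30 Gbps iperf3 ceiling
```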
Unfortunately Weird Stuff is a thing of the past, though this had less to do with a reduced supply of surplus and more to do with Silicon Valley's high real estate prices. Thankfully there is still good stuff to be found at e-waste recyclers, but if more companies are relying on the cloud, and if more devices are integrated boards with everything soldered on (such as most modern Apple hardware), the hand-me-downs of the future are going to be harder to work on than today's. I'm just an average software guy whose hardware experience is limited to a graduate-level computer architecture course and building PCs. I can talk about caching and branch prediction, but I've never picked up a soldering iron in my life; I'm no Louis Rossmann.
They are rerouting RAM from consumers to enterprise for server buildouts.
Enterprise? No. Micron is explicitly focusing on AI. So yes, enterprise customers doing AI data centre buildouts. They are going all out for them at the expense of their consumer business due to supply constraints.
I don't see this situation changing for many years to come. It would indirectly affect the cost of any electronics that has storage or memory in it. It would be interesting to see how Samsung plays this one out with their limited inventory: they make RAM and SSDs, use them themselves in their phones, laptops, etc., and also supply consumer and business customers. Interesting times.
This is like developers shifting from building homes targeted at homeowners to building build-to-rent neighborhoods for Blackrock and company xD
> Have you ever confused BlackRock with Blackstone? Despite their similar sounding names, these two financial powerhouses represent distinct approaches to investment management.

https://www.investing.com/academy/trading/blackrock-vs-black...

> Major news organizations and sector researchers describe the claim as unfounded and often rooted in confusion between BlackRock Inc. and the private-equity firm Blackstone Inc.

https://en.wikipedia.org/wiki/BlackRock_house-buying_conspir...

Anyway, the name doesn't matter for the purpose of my argument. And lol at "unfounded conspiracy theory": it's only a "conspiracy" that BlackRock is buying homes (crucially, no one is actually suggesting that, except the people who are confused between BlackRock and Blackstone), because Blackstone definitely is buying homes.
It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
You mean a central planning, command and control economy? There is a lot of history of countries trying these things and they don’t have the outcome you want.
DRAM manufacturing is a global business. If one country starts imposing purchase limits for whatever reason, the DRAM consumers are going to laugh as they move their data centers and operations to another country that doesn’t try to impose their laws on a global market.
> It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
Good news: That’s not how markets work. DRAM manufacturers don’t list a price and then let OpenAI buy it all up. Contracts are negotiated. Market prices fluctuate.
No supplier of anything is going to let all of their inventory disappear to one buyer without letting the market have a chance to bid the price higher.
https://news.ycombinator.com/item?id=46143535
Had Samsung known SK Hynix was about to commit a similar chunk of supply (or vice versa), the pricing and terms would likely have been different. It's entirely conceivable they wouldn't both have agreed to supply such a substantial part of global supply if they had known more. But at the end of the day, OpenAI did succeed in keeping the circles tight, locking down the NDAs, and leveraging the fact that each company assumed the other wasn't giving up this much wafer volume simultaneously, in order to make a surgical strike on the global RAM supply chain.
https://thememoryguy.com/some-clarity-on-2025s-ddr4-price-su...

> The Chinese government directed CXMT to convert production from DDR4 to DDR5 as soon as the company was able. The order was said to have been given in the 4th quarter of 2024, and the price transition changed from a decrease to an increase in the middle of March 2025. A wholesale conversion from DDR4 to DDR5 would probably be very expensive to perform, and would thus be unusual for a company that was focused on profitability. As a government-owned company, CXMT does not need to consistently turn a profit, and this was a factor in the government's decision to suddenly switch from DDR4 to DDR5.

That means that it is a market where there is an asymmetry of power between vendors and buyers, caused by the fact that the vendors know more than the buyers, i.e. only the vendors know the right price for their products.
Therefore in such a market there are winners and losers among the buyers. Those buyers who buy quantities great enough to have negotiating power and who have knowledge about the right prices can buy at those prices, while the other buyers are fooled by the vendors into paying excessive prices.
The fact that the big-volume buyers deserve discounts has nothing to do with price negotiation. In a good market, where there is enough competition, the volume discounts can be public and available for anyone.
Also, a public auction for a product where demand exceeds supply has nothing to do with a secret price negotiation.
Any vendor who promotes price negotiation is a vendor who desires to steal money from its customers, instead of performing a mutually advantageous exchange.
The simpler solution is a tax on scale -- a graduated corporate revenue tax, aggregated across any group of entities which meet the common control [1] criteria. Then it's just a tax, and you simply have to collect it. Very little wiggle room.
If splitting your company in half wouldn't impair any of its lines of business, the CEO has a powerful financial incentive (lower tax rates on the two halves) to do so.
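A toy illustration of that incentive, with entirely made-up brackets and rates (nothing below comes from the comment except the idea of a graduated revenue tax aggregated per controlled group):

```python
# Toy graduated revenue tax: two halves of a split company pay less combined
# tax than the whole, which is the incentive described above.
BRACKETS = [               # (revenue threshold in $B, marginal rate) - hypothetical numbers
    (1, 0.00),             # first $1B of revenue untaxed
    (10, 0.02),            # next $9B at 2%
    (float("inf"), 0.05),  # everything above $10B at 5%
]

def revenue_tax(revenue_b):
    """Graduated tax on revenue (in $B), applied marginally per bracket."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        taxed = max(0.0, min(revenue_b, upper) - lower)
        tax += taxed * rate
        lower = upper
    return tax

whole = revenue_tax(40)                    # one $40B-revenue company
split = revenue_tax(20) + revenue_tax(20)  # the same business split in half
print(whole, split)  # 1.68 vs 1.36 ($B): the halves pay less, hence the incentive to split
```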
You can't exactly break up a chip manufacturer when they have just 1 or 2 plants tooled for the latest memory.
IMO, recognizing chip fabrication as a national security asset and turning it into a public corporation would be the better way to go. Let the likes of intel/amd/or micron continue developing chips. But also, take control of the most expensive and risky part of chip manufacturing to make sure we don't fall behind due to corporate budget cuts. You also keep and continue to build expertise in a vital part of modern society.
In fact, the largest, most advanced, and best known semiconductor manufacturer is primarily government owned: TSMC.
The only thing that gets in the way of their ability to sustain innovation is administrations hostile to publicly funded research.
Outside of innovative industries, there are plenty of examples of important government-run organizations that aren't "disasters". Some of them can only be effectively run by government. For example, healthcare.
What's been a disaster is relying on privatization and capitalism to solve all problems. That's the system of government we had in the dark ages.
Not only that, there was government-led chip research in Taiwan before TSMC (ITRI). And it was going nowhere. If Morris had stayed in ITRI, Taiwan would probably look like a developing country whose primary value is to host the US military bases today.
It would not exist without the government's direct intervention.
There's a reason why basically only Intel does in-house fabrication, and even they have had to rely on outsourcing it.
Antitrust laws can/should be applied to, eg., Google for search and web monopolization.
If someone is willing to pay more than you for a limited supply of some resource, that isn't a market monopoly.
Google is about to lay waste to everyone.
Google is using their nation state wealth to once again dominate a new sector. Wealth gained from unfair monopolization of search and web and mobile.
Google changed the notion of the URL bar to search. They control every ingress. Now, if you want to access a name brand registered trademark, it flows through Google search. Brands have to pay extortion money to Google to keep others from sniping their rightful name brand.
Google gets even more money because there's a bidding war.
Google pays to put themselves as middle men in front of nearly every web access.
Google doesn't just have a monopoly on this, it's downright unethical and should be tried in court or have laws written to make this illegal.
I have no love for OpenAI, but how do they even compete with the hundreds of billions of dollars this nets?
> Monopolies never last in the tech industry
Google is an invasive species in the ecosystem. They're killing viable competition by engorging themselves and taxing non-productively.
Capitalism should be hard. It should be live or die regardless of size or scale. Google is barely breaking a sweat.
This isn’t antitrust because the companies aren’t reselling it to you at a much higher price after cornering the market (cough cough Ticketmaster & scalpers).
Perhaps the limited competition caused by 25+ memory manufacturers consolidating down to a 3-member cartel is a sign of market failure. I use "cartel" deservedly, as the RAM manufacturers have been found guilty of price-fixing multiple times in the past.
-Thoughtless, deluded, Homo economicus religion, idiot
It's these pesky PC things that people do bad things like piracy with /s
And they're effectively saying they've had enough of running call centers, tracing lost parcels, weirdo customers who show up at the factory, running marketing campaigns etc.
A consumer facing business is a lot of overhead, and since more and more hardware now has soldered ram, it is a shrinking business too.
Shrinking businesses are super hard to run - it's far easier to grow a business than shrink it whilst maintaining the same margins.
When this is a company's core complaint, then the usual strategy for getting out of the D2C business (without losing D2C revenue) is finding a channel partner willing to absorb the dealflow. I.e. turning your B2C channel into a single B2B(2C) enterprise customer.
Large DIMM vendors are definitely not buying through middlemen.
Any vendor consuming a lot of RAM chips over a threshold will be negotiating contracts with RAM chip manufacturers. It’s not that hard even at medium scale.
https://www.klevv.com/ken/main
And don't forget about https://www.nanya.com/en/
While I never had a problem with https://semiconductor.samsung.com/dram/module/ , I think they will be rare/more expensive now, or 'soonish'.
For Chinese CXMT and YMTC there is https://www.biwintech.com/
We live in interesting times!
(Cackling madly...)
Just looked at standard desktop parts: still no 64GB 5600MT/s modules, and 32GB CUDIMMs are missing.
> And don't forget about Nanya
BTW, what is the status of Elpida now?
I really wouldn't want to buy from any new NAND vendor until years after they've built a reputation. It's too scary to get a decent bargain SSD that then secretly dies really early, or doesn't actually have anywhere near the endurance it claims.
This is a mistake. The consumer business is a bet, which is something they excel at. Yes, it's not printing money right now, but it is an option. Exiting the consumer business means they may miss insights into the next hot consumer trend.
The game for large companies like this should be to keep small bets going and, literally, just survive. That's what Micron was doing, and that's what NVIDIA did for the better part of a decade. Now both are printing money.
Yet Micron has decided it's retiring from placing more bets. Mistake.
Crucial is primarily a Marketing and Support company, they didn't really make anything although there was a small engineering team that did DIMM/Module binning, but mostly contracted out heatsinks to glue to Micron DIMMs. On the SSD side of things, they used Phison controllers with Micron flash, just like pretty much any other consumer SSD that isn't Samsung or SK/Solidigm.
Corsair, G.Skill, Geil, etc. don't buy components from Crucial; they get them from Micron. Crucial closing its doors has no bearing on that, as far as we can tell.
They probably considered dumping it on some Private Equity firm or something, but likely decided to keep the IP in case they decide to resurrect it in the future should the market landscape change.
It sucks that they're winding down crucial, but it makes sense. Frankly I'm surprised they didn't pull the trigger sooner, and by sooner I mean years sooner.
Now it feels like if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you.
I hope this is just a blip, but I think there is a trend over the past few years.
The democratization of technology was something that had the power to break down class barriers. Anyone could go get cheap, off the shelf hardware, a book, and write useful software & sell it. It became a way to take back the means of production.
Computing being accessible and affordable for everyone = working class power.
That is why it's backsliding. Those in power want the opposite; they want to keep control. So we don't get to have open devices, we get pushed to thin clients and locked bootloaders, and we lose access to hardware as it increasingly only gets sold B2B (or, if they do still sell to consumers, they just raise prices until most are priced out).
When the wealthy want something, that something becomes unavailable to everyone else.
Granted, to be fair, many of today's startups and small businesses are made possible by AWS, Google Cloud, Microsoft Azure, and other cloud services. It is sometimes cheaper to rent a server than to own one, and there are fewer system administration chores. However, there's something to be said for owning your own infrastructure rather than renting it, and I think a major risk of compute power being concentrated in just a few major players is that the terms of computation become increasingly dictated by those players.
While it's undeniable that MAFIAA et al have been heavily lobbying for that crap... the problem is, there are lots of bad actors out there as well.
I 'member the 00s/10s, I made good money cleaning up people's computers after they fell for the wrong porn or warez site. Driver signatures and Secure Boot killed entire classes of malware persistence.
Do we want to accept that as a potential consequence, or have someone else choose for us what consequences we are allowed to accept?
Unfortunately, I think the old guard here is dying out and the majority want someone else choosing for them, which is why all the age verification & chat control-like bills have broad bipartisan support.
I'm in the "with freedom comes responsibility" camp. Obviously we should build secure systems, but our devices shouldn't be impenetrable by their own user. The "security" we are getting now is just security against the user having the freedom to do as they wish with their devices and software.
The cultural zeitgeist surrounding internet and computing freedom has changed to be in favor of more control and censorship. Not sure how we can stop it.
In a naive way, when rich entities are interested in a limited resource it's basically over.
Somehow I can see a parallel with the housing crisis, where prices go higher and higher.
I can't see either of them ending anytime soon unless there is a major paradigm shift in our lives.
What high end technology do you want that you can't get?
In the 90s, I paid nearly $10k for a high-end PC. Today, I can get something like an Nvidia RTX Pro 6000 Blackwell for ~$8k, with 24,064 CUDA cores and 96 GB RAM, that's capable of doing LLM inference at thousands of tokens per second.
I realize the prices from this example are a bit steep for many people, but it's not out of line historically with high-end hardware - in fact the $10k from the 90s would be something like $25k today.
My point is I don't see how "if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you." I'd love an example if I'm missing something.
A little tinfoil-hat conspiracy, I suppose, but the big companies saw nobodies become incredibly wealthy over the last decade, and this is the big companies protecting their position by limiting technology.
That being said, the only SSD I’ve ever had fail on me was from Crucial.
In recent builds I have been using less expensive memory from other companies with varying degrees of brand recognizability, and never had a problem. And the days of being able to easily swap memory modules seem numbered, anyway.
IBM really locked me in on the Ultrastar back in the mid '90s. Sure, it has proven itself to be a great product. But some of the first ones I bought, one of the drives arrived failed. I called the vendor I bought it from and they said they wouldn't replace it, I'd have to get a refurb from IBM. So I called IBM, when I told them I had just bought it they said I needed to call the place I bought it from because otherwise I'd get a refurb. I explained I had already called them. "Oh, who did you buy it from?" I told them. "Can you hold on a minute?" ... "Hi, I've got [NAME] on the line from [VENDOR] and they'll be happy to send you a replacement."
The only real difference between Crucial RAM and Micron's unbuffered RAM was which brand's sticker they put on it, with some binning and QA on the higher-end enthusiast SKU's and a heatsink. This sub-business-unit was almost entirely redundant to Micron.
> And the days of being able to easily swap memory modules seem numbered, anyway.
I keep seeing people say this in threads across platforms discussing this news, and it baffles me. Why?
Absolutely, positively, wildly untrue. Just because there is a boom in memory-on-package designs doesn't mean the market is moving away from expandable/socketable memory. The opposite is true. It's supplementing it because we're trying to cram as much ram as possible into things, not because we're trying to reduce it.
There has never been more demand for RAM. Many of the memory-on-compute/memory-on-package designs are going into systems with socketable ram. Those systems btw have never had more memory channels/slots available. Intel just cancelled their 8 Channel SKU's for their upcoming Xeon parts because their partners pretty much all universally told them they'd be buying the 16 channel variants instead, because demand is so high, and that's not unique to Intel. AMD and Ampere are seeing and responding to similar demands, by continuing to increase their supported memory channels/memory capacities.
> and manufacturing cost reasons.
This generally increases price, even when using things like LPDDR, especially as the capacity of the packaged RAM goes up (the fact that this can't be replaced makes yield issues a big concern whereas in socketable RAM it's effectively a non-issue). There are ways that it can be used for cost effectiveness, but those applications are generally not "high margin" nor are cost-sensitive applications of this deploying a lot of SKU's to cater the wide variety of demand in type/speed/capacity (eg (LP)DDR vs GDDR vs HBM and all the variations therein, not to mention buffered vs unbuffered, Load reduced, computational, storage class etc), because even with the chiplet/modular production of CPU's, that is not a linear scale up of cost-to-manufacture (or engineer) as complexity goes up. This isn't like Cores on a CPU where you can just disable some if there's a manufacturing defect, you need to swap memory controllers and scale qty of those controllers and use different kinds of DMA interlinks depending on the ram type (can't just swap HBM for DDR and expect everything to work)
For most performance-oriented products, the memory-on-package thing is a new layer of RAM that sits between the cache of the compute unit (CPU/DPU/whatever) and traditional socketable DRAM, not a replacement for it. There are very real thermal and packaging limits though. For example, how are you going to install 2TB of DDR directly onto a CPU package? How are you going to cool it when the available surface area is a fraction of what it is with socketable RAM and you're theoretically hitting it even harder by leveraging the lower latency while placing it so close to the compute that's using it, that even if the RAM is idle it's still subject to far more heatsoak than equivalent socketable RAM is?
This is further substantiated by the demand for things like CXL which allows you to further expand RAM by installing it to the PCIe bus (and thus, through things like RDMA/RoCE, through the network fabric) like you would any other PCIe add in card, which is leading to an explosion in something called Storage Class Memory (SCM), so that we can deploy more socketable/expandable/replacable RAM to systems/clusters/fabrics than ever before.
I could go on and on actually, but I'm not trying to attack you, and this post is long enough. If interested, I could continue to expand on this. But the point is, memory-on-package designs aren't replacing socketable memory in high margin markets they're supplementing it, as a direct result of demand for RAM being so astronomical and there being multiple hard limits on how much RAM you can cram into a single package effectively. The last thing people want is less RAM or less choice in RAM. The RAM itself may evolve (eg SOCAMM, Computational Memory, MCR, SCM etc), but socketable/replaceable/expandable memory is not going away.
EDIT:
> It’s hard to see what the motivation for spending some of their limited foundry time on products that are only of interest to lower margin direct consumers if this keeps up
This is a fair concern, but entirely independent of the first part of your comment. Worth noting that just because Samsung is the only game in town left selling consumer DIMMs (at least in the US) doesn't mean the consumer market isn't getting supplied. Micron, Samsung, and SK are all still selling DRAM components to consumer-facing brands (like Corsair, G.Skill, Geil). It's entirely possible they may reconsider that, but consumers aren't the only ones with volume demand for DDR4/DDR5 DRAM UDIMMs. OEMs like Dell, HP, etc. and various SIs all have considerable volume demand for them as well, and combined with consumer demand, that places considerable pressure on those companies not to fully exit supplying the market - even if they choose to only do it indirectly going forward.
It was a huge relief that the machine came up successfully. But then it would lock up when it got warm, until I found the dodgy joint.
Was a very stressful afternoon, but a confidence builder!
I bet there are many people whose sole experience inside a computer is popping in some DIMMs. I’ll be kinda sad if those days are also gone. On the other hand, super integrated packages like Apple’s M-series make for really well-performing computers.
And before that I duct-taped the insanely large 16KB RAM extension (from 1KB), so it doesn't reset with the slightest movement, on my Sinclair ZX81, which I've also assembled and soldered from a kit :)
This is bad for consumers though since DRAM prices are skyrocketing and now we have one less company making consumer DRAM.
Other times professionals will sneer at a consumer product, or a consumer product can diminish your brand. Nobody's wiring a data centre with Monster Cables, and nobody's buying Cisco because they were impressed by Linksys.
Similar story on the SSD side of things regarding reputation/innovation, especially when you consider that Crucial SSD's are no more "micron" in a hardware sense than a Corsair one built using Micron flash (support is a different matter), as the controllers were contracted out to 3rd parties (Phison) and the flash used was entry level/previous gen surplus compared to what's put in enterprise. The demands and usecase for consumers and even prosumers/enthusiasts are very different and in general substantially less than on the enterprise side of things with SSDs, and that gulf is only growing wider. So again, what is meant to carry over? How can Micron leverage Crucial to stand out when the consumer market just doesn't have the demands to support them making strong investment to stand out?
Frankly, taking what you say farther, I think if this is what they want to do (having consumer brand recognition that can carry over in some meaningful way to B2B), then sundowning crucial now (given the current supply issues) and then eventually re-entering the market when things return to some sense of "normal" as Micron so that both consumer and enterprise brands are the same brand, "Micron", makes much more sense.
Especially considering that there's little innovation in the consumer DRAM and SSD spaces vs their enterprise counterparts that Micron can flex their talent in.
Almost certainly this is because of a windfall for Micron, at least in the short term. Datacenter memory demand is going through the roof, and that was where margins were highest already. It makes no sense to continue to try to milk a consumer brand that can be sold at, what, a 20% markup over generics?
Most likely Micron was planning this forever, and the current market conditions are such that it's time to pull the trigger and retool everything for GPU memory.
You can’t think about companies like it’s 2024. We’re in a gilded age with unlimited corruption… Anything can happen. They can sign a trillion dollar deal with OpenAI, get acquired by NVidia, merge with Intel, get nationalized by Trump, etc.
Sounds to me like they are using the tried and true method of selling equipment to the people rushing for gold
Their 'smaller' market, SSDs, has an estimated 13% of global NAND revenue.
https://counterpointresearch.com/en/insights/global-dram-and... https://counterpointresearch.com/en/insights/global-nand-mem...
I don't know their breakdown for consumer vs enterprise, but the Crucial brand is consumer focussed. Obviously enterprise at this point is incredibly lucrative.
We're gonna need a bigger pin.
Consumers are so annoying. And by consumers, I mean "anyone can get an API key for the latest model."
We cut down our trees to build more AI datacenter sculptures to please the AI gods.
Or the nuclear craze of the 50s, when radioactive material was stuffed into everything: toothpaste, cream, etc.
It really is just dotcom all over again.
Already most "AI researchers" outside of the big corps have basically turned in the last 3 years from "people training their models and doing research" to "webdev plugging into other people's APIs to use LLMs they don't know crap about". When, not if, the big AI bubble bursts, the damage done to the sector will be immense
Diversification is resilience.
Putting consumer on hold makes some sense. An exit? This will be written about in business books.
(ProTip: When you see 'Crucial'-labeled DIMMs with chips that don't have the Micron 'M' logo, I wouldn't buy that, or I would send it back.)
1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.
2. It's a market defense to keep other players down and keep them from growing their way into data centers.
3. It's profitable (probably the main reason, but boring).
4. Hedge against data center volatility (10 key customers vs millions)
5. Antitrust defense (which they used when they tried to buy ARM)
If they're unemployed, they'll just rent from the cloud
How many of you still manage your own home server?
No way that is true any more. Five years ago, maybe.
https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...
basically, the gaming segment is the beta-test ground for the datacenter segment. and you have beta testers eager to pay high prices!
We see the same in CPUs, by the way, where the datacenter lineups of both Intel and AMD lag behind the consumer lineup. That gives time to iron out BIOS, microcode, and optimizations.
And they're not selling a handful of GPUs to nobodies like us; they're selling millions of GPUs to millions of nobodies.
Tell that to Micron.
Instead we will be streaming games from our locked down tablets and paying a monthly subscription for the pleasure.
The push for 4K with raytracing hasn't been a good thing, as it's pushed hardware costs way up and led to the attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increased reliance on temporal antialiasing was becoming problematic.
The last decade or so of hardware/tech advances haven't really improved the games.
Biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.
And yeah, our hardware is not capable of proper raytracing at the moment.
Somebody should tell that to the AAA game developers that think hitting 60fps with framegen should be their main framerate target.
Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom The Dark Ages that flat out require RT, but the RT lighting pass only accounts for ~13% of frame times while pushing much better results than any raster GI solution would do with the same budget.
Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.
But, if you asked me what I enjoy playing more 80% of the time? I'd pull out a list of 10+ year old titles that I keep coming back to, and more that I would rather play than what's on the market today if they only had an active playerbase (for multiplayer titles).
Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets to making games fun instead of spending a bunch of budget on visual quality.
Ray tracing has real implications not just for the production pipeline, but the kind of environments designers can make for their games. You really only notice the benefits in games that are built from the ground up for it though. So far, most games with ray tracing have just tacked it on top of a game built for raster lighting, which means they are still built around those limitations.
These massive production budgets for huge, visually detailed games are causing publishers to take fewer creative risks, and when products inevitably fail in the market the studios get shuttered. I'd much rather go back to smaller teams and the more reasonable production values of 10+ years ago than keep getting the drivel we have, and that's without even factoring in how expensive current hardware is.
Ray tracing is a hardware feature that can help cut down on a chunk of that bloat, but only when developers can rely on it as a baseline.
Now, AI is taking all the RAM.
Soon, AI will be taking all the electricity.
https://www.spectrumsourcing.com/spectrum-news-feed/industry...
> Going forward, customers must order the full server system to obtain the motherboard.
Sadly, the MX500 is now difficult to find in Western Europe. Only the lower-grade BX500 is around; still quite reliable, but not as fast as the MX500 with its cache + DRAM.
Had quite a lot of controller issues (they become sluggish for periods of time) with the SanDisk/WD ones like the Green/Blue and SSD Plus.
Just that I would not really compare the two. The BX500 is the only Crucial SSD I've ever had trouble with, and it kinda eroded my trust in the brand. My >10-year-old M4 is still working like a champ, as does my MX200.
DRAM-less SSDs are a plague that is very hard to avoid, as it's never mentioned in the spec sheets.
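One rough way to catch this on NVMe drives (it won't help with SATA parts like the BX500) is that DRAM-less controllers usually request a Host Memory Buffer, which shows up as a non-zero `hmpre` field in the Identify Controller data. A minimal sketch, assuming Linux with nvme-cli installed and /dev/nvme0 as the device to check:

    # Heuristic check for DRAM-less NVMe drives: they typically advertise a
    # Host Memory Buffer (HMB), i.e. a non-zero "hmpre" value in Identify
    # Controller. Requires nvme-cli and root; the device path is an assumption.
    import re
    import subprocess

    def hmb_preferred_kib(dev: str = "/dev/nvme0") -> int:
        """Return the HMB Preferred Size (KiB) reported by `nvme id-ctrl`, or 0."""
        out = subprocess.run(
            ["nvme", "id-ctrl", dev],
            capture_output=True, text=True, check=True,
        ).stdout
        m = re.search(r"^hmpre\s*:\s*(\d+)", out, re.MULTILINE)
        # hmpre is expressed in 4 KiB units per the NVMe spec
        return int(m.group(1)) * 4 if m else 0

    if __name__ == "__main__":
        kib = hmb_preferred_kib()
        if kib > 0:
            print(f"Drive requests ~{kib // 1024} MiB of host memory -> likely DRAM-less")
        else:
            print("No HMB requested -> drive likely has onboard DRAM")

It's only a heuristic (a drive could be DRAM-less and still not support HMB), but it's more than most spec sheets will tell you.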
They announced a month ago that their upstate NY fab was delayed by 2-3 years, so the writing was on the wall.
https://www.syracuse.com/micron/2025/11/micron-chip-factorie...
Anything the fab outputs will feed into Micron selling to datacenters
[1] https://www.morningstar.com/news/marketwatch/20251017180/why...
Such a defeatist take. Human nature is to come up with culture to check human nature. Human nature is to feel community expulsion as badly as physical pain, because human nature is to punish anti-social behavior. We are equally capable of love, empathy, anger and violence. What isn't human nature, is watching everything taken from you, while you tell yourself that's just the way things are. That's pathology.
They're one of the 0.1% left that don't use Cloudflare :)
I don't want to have to start buying obscure keysmash Chinese brands for normal-looking affordable hardware.
God dammit Micron.
I really hope this bubble pops, all these investors lose their shirts, and prices come down to something reasonable.
What's next, a new tech allowing you to turn food into money at a rate nobody can afford to eat anymore?
They've got you coming and going.
At least in my house, most of the computers can't be upgraded with Crucial parts, because they use Apple Silicon.
For starters, in the past, Crucial RAM wasn't really compatible with the Apple devices that did have socketed/upgradeable RAM. Technically some of their laptops/iMacs did, but Apple really didn't want you servicing/upgrading those on your own, Crucial DIMMs weren't QVL-approved by Apple (even though they'd generally work), and those were all SO-DIMMs, not standard desktop DIMMs. The Mac Pros and whatnot that had upgradeable RAM were all (L)RDIMM, not UDIMM, and Crucial didn't make (L)RDIMM; that was Micron's market. Point being, this change in dynamic from Apple had virtually no impact on Crucial, as it was a market they never really served, or at least certainly not in the past decade.
The consumer market isn't shifting to non-upgradeable devices. There are more and more of them, but they're largely either not a market Crucial served to begin with (Apple) or supplements to the consumer devices that do have upgradeable RAM (Steam Deck, very compact HTPCs/thin clients, things that aren't really desktop replacements).
Two or three years ago, I bought an XPS 15 with the minimum memory spec and immediately maxed the RAM to 64 GB and threw in a 2 TB NVMe drive. Last year I purchased an M3 MacBook Pro and made no upgrades. I don't expect to go back to PC, because the Mac performance is so great and the power management actually works.
When I throw my XPS 15 in my bag, it often comes out with no battery left. The Mac not only gets awesome battery life, but it doesn't randomly discharge when closed and sleeps properly.
Crucial didn't leave because no one was buying RAM, or even because fewer people were buying RAM. Or at least that was the situation prior to shit going sideways with pricing/availability in the past 2 months, but that's independent of Apple/Crucial/Micron. They left because there's been little margin in unbuffered DIMMs (aka consumer RAM) for years, there's virtually no demand or opportunity for innovation in the space for Micron to leverage the brand to make their B2B/enterprise sales look better, and what does make consumer RAM stand out (largely cosmetics these days) is not a core competency of Crucial or Micron. There's even less demand for anything interesting/innovative/worth having a whole separate company/BU for in the markets that are growing (Apple, mobile, laptops).
Also, now is a prime opportunity to exit the market, but it's also clear this has been in the works for a while. Look at their limited and frankly half-assed product offerings for DDR5, then compare to what they offered for DDR4: substantially fewer SKUs, with very few offerings for the enthusiast market. This wind-down started circa 2020.
As far as Crucial is concerned, Apple has never been relevant or a danger to their bottom line. And if Micron wanted in on that money, they're better served doing it as Micron, as opposed to Crucial, which was tantamount to operational overhead in that context.
The MX500 1st gen (fw M3CR023) was the second-best SATA SSD range, behind only the kings, the Samsung 860 Evo and Pro. The P3 and P3+ were very good drives with great pricing for a while, though not comparable to the Samsung 970 Evo and Evo+.
Never had a failure across about 500 units of Crucial MX300/500/P1/P3/P3+/P5. Always kept their firmware updated, though.
Comparatively, had a lot of sluggish controllers on SanDisk/WD Green/Blue SATA SSDs, and some BX500s. But still a lot better than any entry-level generic Phison S3111-based SSD.
Also very few failures with DDR3/4 DIMMs and SODIMMs. Fewer than with Kingston and Corsair modules. About the same as the Samsung OEM modules from HP/Dell.
Now let's just hope Samsung will not follow in their tracks. I don't see WD/SanDisk going corporate-only, since they do not make DRAM modules.
"AI"-driven collapse will go down as the stupidest crisis in human history. The idiotic waste of gigantic amounts of civilizatory resources, for something that hasn't remotely proven useful yet, while simultaneously neglecting existentially mandated reforms and investments, in an outrageously obvious critical moment in time ... well that's gonna dwarf even historic missteps of organized religion and island cultures.
I am calling it now:
* Cancelled: Cyberpunk.
* New lore timeline: Hypepunk > Crash-Core > Silicon Gothic
* Historian epoch title: The Dark Ages.
The term is up for grabs again. Also, future historians may not categorize medieval times as we do. By then, libraries may have been converted to data centers and historic artifacts burned to fuel GPUs, or destroyed through hallucinated restoration advice. Remaining digital traces may have been exhaustively overwritten by verbose AI prose, or simply lost when AWS introduced generative DNS. In 1000 years, the only preserved evidence of medieval times may be an LLM's summary of "A Knight's Tale".
I do think it's proven useful, much like the internet had in the nineties.
2010s AT&T leadership was something else.
The AI memory bubble is going to bust eventually. They have such enormous brand recognition and are known for selling a quality-first product.
To just throw that all away is insane.
For the last ten-plus years I have only bought Crucial RAM. I am well beyond disappointed to hear this.
Or even worse : Maxio
What the fuck does "secular" even mean in this context? Is there religious DRAM?
What a short-sighted, boneheaded move. I'm so tired of the MBA-ification of every single part of my life.
Steam is a powerful force, and I applaud them for staying private and not IPO'ing - but their founders won't be around forever (hell Gabe, please transfer Steam into some sort of public-good trust or whatever, or have provisions in your will making it impossible for whomever inherits the company to enshittify), and sadly even they can't beat down on the studios too hard lest they end up like Netflix.
And same. Though blowing significant fractions of a trillion dollars into investments that are (imo) never gonna return anything near enough to make that a good plan (barring a government bailout) will inevitably redirect huge portions of the stuff we care about. The world's GDP is about 90T, iirc; that is basically taking 1/90th of what the world does in a year and putting it into AI.
RAM optimized for TempleOS and HolyC
Well, hopefully we see more proliferation of systems-on-chip / unified architectures like the one offered by Apple being offered by AMD, Intel, etc. for consumer stuff.
All the samsung ones I've tried have died within a year. Fast until they died I suppose.
Any recommendations for nvme in a post-micron world?
Uh, isn't that normally a contradiction? bulk is cheaper(???)
Please explain how this is "short term" thinking.
AI is destroying so much. God knows how bad things will get once the bubble bursts.