tl;dr: he says x86 took over the server market because it was the same architecture developers already had on their own machines, which made it very easy to develop applications locally and then ship them to the servers.
Now this, along with other points he made, is a very good argument for how and why it is hard for ARM to go mainstream in the datacenter. However, I also feel like he kind of lost touch with reality on this one…
He’s comparing two very different situations, or more specifically two different eras. Developers aren’t as tied to the underlying hardware as they used to be. The software development market has evolved from C to very high-level languages such as Javascript/Typescript, and the majority of new software is or will be written in those languages, so the CPU architecture becomes largely irrelevant.
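To make that concrete, here’s a minimal sketch (my own hypothetical example, not something from the original message): the exact same Node/Typescript source runs unchanged whether the host is x86_64 or arm64; only the runtime binary underneath differs.

```typescript
// Minimal Node HTTP server; the same source runs unchanged on x86_64 or arm64.
// (Hypothetical illustration, not taken from the post being discussed.)
import * as http from "http";

const server = http.createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  // process.arch reports "x64" on Intel/AMD hosts and "arm64" on Graviton or Apple Silicon.
  res.end(`Hello from a ${process.arch} machine\n`);
});

server.listen(8080, () => {
  console.log(`Listening on :8080 (${process.arch})`);
});
```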
Obviously, very big companies such as Google, Microsoft and Amazon are more than happy to pay the small “tax” of making sure Javascript runs fine on ARM rather than keep paying the big bucks they pay for x86…
What are your thoughts?
As someone dealing with enterprise software for a living, what he’s saying absolutely makes sense to me, and I deal mostly in web applications (where I never really have to worry about the low-level stuff).
Just because the top layer seems to be the same doesn’t mean the underlying ones are. There’s a reason why perfect bug compatibility is a thing (or maybe was, in the RHEL ecosystem?).
Things that look like slam dunks in theory are never quite that in practice. Weird bugs pop up from time to time; and believe me, they will!
It might be rare; you may only see it once or twice in a project. But when it happens, you’re gonna want to be ready, or people will question your ability to do your job.
The cross-compiling point makes sense, but since this is a 4.5-year-old message, the state of ARM in the cloud has changed. Developers now actually do have ARM-based machines, thanks to Apple. AWS has Graviton2 instances, and they are a lot cheaper than similarly specced x86_64 instances. ARM is now a viable option to consider.
While it’s true that an ARM ecosystem is more feasible now, there aren’t many companies willing to equip their whole team with one very specific model of laptop, with almost no serviceable parts, for no perceivable benefit. No, Pinebooks as well as Raspberry Pi laptops and cyberdecks are not feasible for industry.
Most companies are not looking for gimmicks for work, even when they make gimmicks for a living; so no, looking cool is not a benefit that outweighs all that cost.
Meanwhile, most people in the industry, such as myself, my current bosses and colleagues, my previous bosses and colleagues, and probably all my future bosses and colleagues, are fine running x86 for production servers. It’s got everything we need, including upgradable RAM and decades’ worth of collective experience, which I cannot say for ARM.
At the same time, I have some hope for RISC-V. It won’t take over the industry anytime soon, but it has been showing some promise for the long term.