
The Mac vs PC War is back on (sprawling newsletter article by Tom Warren about Windows & Arm)

LordOfChaos

Member
Hardware-wise Apple has been killing it. Apple Silicon is an impressive leap, delivering the world's highest single-core performance in a core that takes a third of the power of the next-best part to get there.

But I've felt like software-wise, Apple has ceased to be anything special, and is as ridden with bugs and slop and jank as any other major platform these days. There's a half-dozen third-party apps I have to get just to reach where Windows 11 is out of the box on things like window and multi-display management.

So if Windows on ARM can truly steal Apple's thunder this time on performance per watt, even if not quite at M4 single-core performance yet, this could have my attention. These are all big IFs, and Qualcomm has a penchant for overstatement and underdelivery. It's notable that Apple has launched TWO generations of M chips just since Qualcomm announced the X Elite, which still hasn't shipped.
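Back-of-the-envelope, with made-up placeholder numbers just to show the shape of that claim (real scores and core power draws vary by benchmark):

```python
# Illustrative perf-per-watt math. All numbers here are hypothetical
# placeholders, NOT real benchmark results; they only show that matching
# a score at a third of the power triples efficiency.
apple_score, apple_watts = 3800, 7.0    # hypothetical ST score / core power
rival_score, rival_watts = 3600, 21.0   # hypothetical "next-best part"

apple_eff = apple_score / apple_watts   # points per watt
rival_eff = rival_score / rival_watts

print(f"Apple: {apple_eff:.0f} pts/W, rival: {rival_eff:.0f} pts/W")
print(f"Efficiency advantage: {apple_eff / rival_eff:.1f}x")
```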

Anyways, will be watching Microsoft's Monday event with interest, as well as Apple's WWDC in June, where it's expected to catch up on AI features.
 

Guilty_AI

Member
According to Wikipedia, x86 was introduced in '78 and ARM in '85. At the speed progress was made back then (and still is), I'm sure that's a sufficient gap to call this prehistory vs. the beginning of modern architectures.
46 vs. 39 years of age doesn't really make an argument for using a "newer" technology. Besides, the concepts these two are based on are even older than that, dating all the way back to the 60s.

It's also worth mentioning that RISC was introduced to solve problems that aren't really problems anymore. Most modern arguments you see online don't hold much water, and its recent renaissance mostly comes down to the fact that RISC designs are more accessible and easier to understand, and thus easier to do R&D for. It also means you don't have to fork out money to AMD or Intel, since it's highly unlikely any company could compete with them with its own x86 equivalent.
 

Chiggs

Gold Member
Feel free, they do have a few criticisms after all. This isn't some fawning video like most pro-ARM articles are.

Okay, I finished the video and nothing has changed my mind here. In fact, for all the technical gee-whiz Casey likes to employ, here's what this all boils down to:
  • Basically, Casey and his friend critique an article written by someone else about X86, why it sucks, and why it needs to die; Casey and his friend disagree with the author on some key points, but then come to the conclusion that X86 does indeed have some "nasty" legacy elements.
What Casey thinks is especially bad about X86:
  • Because of poor planning and inherited legacy elements, X86/X64 can be a real mess for decoding logic (instruction sets), and because computing and coding are becoming increasingly parallel, you'll hit bottlenecks where all of the elements are dependent on each other, and even widening certain pipelines won't make a difference if one area is lagging behind another (a toy sketch of the decode problem follows below).
  • Casey goes on to state, and I am quoting Casey directly in giant font, because it's the most damning part of the entire video:
    • "There is another article you could write that is right about this, and that article would basically be, look: we could be decoding a lot more instructions per cycle on X86 with a lot less silicon if we made these changes."
So basically, X86 is an aging architecture with some truly idiotic legacy elements that halt progress and speed, waste silicon and therefore increase manufacturing costs, and there's a reason why everyone is looking to dump it... even to the point that Intel has proposed making sweeping changes to keep people from bailing (which isn't working).
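To make the decode point concrete, here's a toy sketch of the boundary problem (an invented mini-ISA, nothing like real x86 decode, which is far messier): with variable-length instructions you can't know where instruction N+1 starts until you've sized instruction N, while fixed-width instructions let you find every boundary up front and decode them in parallel.

```python
# Toy illustration of the decode-bottleneck argument. Invented opcodes;
# real x86 length-decoding involves prefixes, ModRM, SIB, and more.
LENGTH_OF = {0x01: 1, 0x02: 3, 0x03: 2, 0x04: 5}  # made-up opcode -> length

def variable_length_boundaries(code: bytes) -> list[int]:
    """Variable-length ISA: boundaries must be discovered serially."""
    boundaries, pc = [], 0
    while pc < len(code):
        boundaries.append(pc)
        pc += LENGTH_OF[code[pc]]   # can't start instr N+1 until N is sized
    return boundaries

def fixed_width_boundaries(code: bytes, width: int = 4) -> list[int]:
    """Fixed-width ISA: every boundary is known up front, so a wide
    decoder can work on all of them in parallel."""
    return list(range(0, len(code), width))

stream = bytes([0x02, 0, 0, 0x01, 0x04, 0, 0, 0, 0, 0x03, 0])
print(variable_length_boundaries(stream))   # [0, 3, 4, 9], found one by one
print(fixed_width_boundaries(bytes(12)))    # [0, 4, 8], known immediately
```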

How the fuck is that different from any of the complaints I--or others--have made about X86 in this thread, or on this site?

Seriously? Help me understand.

Edit: I cannot believe I wasted time on this bullshit. Good night, and X86 still sucks.
 

poppabk

Cheeks Spread for Digital Only Future
46 vs. 39 years of age doesn't really make an argument for using a "newer" technology. Besides, the concepts these two are based on are even older than that, dating all the way back to the 60s.

It's also worth mentioning that RISC was introduced to solve problems that aren't really problems anymore. Most modern arguments you see online don't hold much water, and its recent renaissance mostly comes down to the fact that RISC designs are more accessible and easier to understand, and thus easier to do R&D for. It also means you don't have to fork out money to AMD or Intel, since it's highly unlikely any company could compete with them with its own x86 equivalent.
It's amusing to me that the BBC Micro computers we had at school, the ones we wheeled from classroom to classroom, were the first step that led to the current Apple chips.
 

Guilty_AI

Member
Okay, I finished the video and nothing has changed my mind here. In fact, for all the technical gee-whiz Casey likes to employ, here's what this all boils down to:
  • Basically, Casey and his friend critique an article written by someone else about X86, why it sucks, and why it needs to die; Casey and his friend disagree with the author on some key points, but then come to the conclusion that X86 does indeed have some "nasty" legacy elements.
What Casey thinks is especially bad about X86:
  • Because of poor planning and inherited legacy elements, X86/X64 can be a real mess for decoding logic (instruction sets), and because computing and coding are becoming increasingly parallel, you'll hit bottlenecks where all of the elements are dependent on each other, and even widening certain pipelines won't make a difference if one area is lagging behind another.
  • Casey goes on to state, and I am quoting Casey directly in giant font because it's the most damning part of the entire video:
    • "There is another article you could write that is right about this, and that article would basically be, look: we could be decoding a lot more instructions per cycle on X86 with a lot less silicon if we made these changes."
So basically, X86 is an aging architecture with some truly idiotic legacy elements that halt progress and speed, waste silicon and therefore increase manufacturing costs, and there's a reason why everyone is looking to dump it... even to the point that Intel has proposed making sweeping changes to keep people from bailing (which isn't working).
Congratulations, you found the single criticism in the hour-long video they made explaining all the advantages of x86 and how most common criticisms of it in favor of ARM don't really hold water. A criticism he addressed with "yeah, we can improve that" and not with "let's scrap everything and start using ARM instead," as you're trying to imply here for some reason.

Edit: I cannot believe I wasted time on this bullshit. Good night, and X86 still sucks.
I truly don't get all this fanboyism over ISAs. I blame Apple.
 

Radical_3d

Member
Microsoft has full confidence that Qualcomm's offerings can beat the M3.
Morgan Freeman Good Luck GIF

Apple is already moving to M4.
 

Chiggs

Gold Member
Congratulations, you found the single criticism in the hour-long video they made explaining all the advantages of x86 and how most common criticisms of it in favor of ARM don't really hold water. A criticism he addressed with "yeah, we can improve that" and not with "let's scrap everything and start using ARM instead," as you're trying to imply here for some reason.

No offense, but I've seen too many drive-by posts where someone dumps a Chips and Cheese article, or a video like this, that is supposed to be a crushing blow to Arm, and it just falls flat upon closer inspection.

Think about it for a moment: even if you believe that I'm some overzealous Arm-lover, why is it that EVERY SINGLE major chipmaker is now gung-ho about Arm? AMD, Nvidia, Qualcomm, Apple... even Intel (allegedly). Is that because X86 is so wonderful and vibrant? Is that because Arm is just a big boondoggle that only you can spot for the fraud it is, meaning that you're smarter and more strategic than people like Dr. Lisa Su and Jensen Huang?

Or maybe X86 is a corpse that has been picked clean, and everyone knows it...except for the "MUH PC!" crowd who actively defend massive power draw. Ah, but power draw is all about the process node...or so the "experts" all tell me. It's never about better or more efficient designs. Nope. All about that process node. I imagine that's what the engineers at Intel tell themselves, too, as their boss runs around Washington D.C. trying to get all the government subsidies he can to transform the company into a fab.
 

Guilty_AI

Member
7 years in technology is a lifetime.
I guarantee neither one just stood still since back then, or that other, even newer architectures haven't come up since.

One curiosity is that CISC became RISCer over time and RISC became CISCer over time, so it's not like we have everything cleanly cut out either.
 

bender

What time is it?
No offense, but I've seen too many drive-by posts where someone dumps a Chips and Cheese article, or a video like this, that is supposed to be a crushing blow to Arm, and it just falls flat upon closer inspection.

I watched the video, and it wasn't really about landing a crushing blow on Arm, nor was it about defending x86; it was just about deconstructing the poorly made arguments (in Casey's estimation) in the article at hand.
 

Guilty_AI

Member
Think about it for a moment: even if you believe that I'm some overzealous Arm-lover, why is it that EVERY SINGLE major chipmaker is now gung-ho about Arm? AMD, Nvidia, Qualcomm, Apple...
He explained it in the video you apparently watched. I also laid it out in two comments already.

No offense, but I've seen too many drive-by posts where someone dumps a Chips and Cheese article, or a video like this, that is supposed to be a crushing blow to Arm, and it just falls flat upon closer inspection.
The fact that you seem to believe I'm somehow trying to "slight the glorious ARM architecture" just cements my impression that you're some type of fanboy. Your "'MUH PC!' crowd" comment doesn't help your case either.
 

Three

Member
I guarantee neither one just stood still since back then, or that other, even newer architectures haven't come up since.

One curiosity is that CISC became RISCer over time and RISC became CISCer over time, so it's not like we have everything cleanly cut out either.
Yes, but I think you've ignored the reason RISC and ARM came to be. RISC and ARM were a later technological improvement designed for efficiency, with the tradeoff of making the programming part more complex, not simpler. It's why it won the mobile/tablet space and is now making a big move into laptops. Even though CISC became RISCer to compete on efficiency, it relied too heavily on one or two companies (due to more stringent IP rights?), and supporting CISC meant it still just wasn't as efficient and was always playing catch-up. I'd say even today, with things like P/E-core architecture, it still is.
 

Guilty_AI

Member
Yes, but I think you've ignored the reason RISC and ARM came to be. RISC and ARM were a later technological improvement designed for efficiency, with the tradeoff of making the programming part more complex, not simpler. It's why it won the mobile/tablet space and is now making a big move into laptops. Even though CISC became RISCer to compete on efficiency, it relied too heavily on one or two companies (due to more stringent IP rights?), and supporting CISC meant it still just wasn't as efficient and was always playing catch-up. I'd say even today, with things like P/E-core architecture, it still is.
I think you're mixing stuff up here. RISC was designed with simpler code in mind, with the tradeoff of needing more instructions to do the same work. And the reason it won the mobile space is that ARM was actively investing in embedded systems, specifically focusing on business models that prioritized the low cost and low energy use of their chips over performance. They just entered that market very early.
 

StereoVsn

Gold Member
Don't get me wrong, bro. I'm not an Apple hater by any means. Macs are extremely well made and I love the device integration. If their base Macs were 16GB memory and 512GB storage and the upgrade prices were halved, then I'd probably still be using a Mac. But as it is, even on their Mac Minis, taking the base 8GB/256GB to 16GB/512GB costs $400. I'm just not paying those prices any more.

I've switched to Samsung for phone, tablet and watch. They do a good job with integration and work well with Windows. Not as good as Mac, but good enough.
That's absolutely understandable. I had an S22 Ultra and a Galaxy Watch less than a year ago. I was debating between getting a tablet and a Galaxy Book or going Mac.
 

Three

Member
I think you're mixing stuff up here. RISC was designed with simpler code in mind, with the tradeoff of needing more instructions to do the same work. And the reason it won the mobile space is that ARM was actively investing in embedded systems, specifically focusing on business models that prioritized the low cost and low energy use of their chips over performance. They just entered that market very early.
RISC wasn't designed for simpler code. It made the assembly-language programmer's job more complex, but the benefit was lower power and smaller chips. The whole point of RISC projects in the 80s was to improve on the efficiency of late-70s designs by removing low-use instructions, not to make programming easier. When you say "make CISC more RISCer," you have to realize that doesn't actually make sense: you cannot make a Complex Instruction Set Computer a Reduced Instruction Set Computer. You're either RISC or CISC. What you actually mean is that x86 over the years tried to become as efficient as RISC designs while still having to support the whole complex instruction set.
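As a heavily simplified illustration of that point, here's a toy register-machine sketch (an invented mini-ISA, not real x86 or ARM): the CISC-style operation is a single rich instruction, while the RISC-style equivalent forces an explicit load/add/store sequence, i.e. more instructions for the programmer, but each one trivial for the hardware to decode and schedule.

```python
# Toy illustration: one CISC-style instruction vs. the equivalent
# RISC-style sequence. Invented mini-ISA, not real x86 or ARM.
memory = {"counter": 41}
regs = {"r0": 0}

# CISC style: a single rich instruction that reads memory, adds, writes back.
def add_mem_imm(addr: str, imm: int) -> None:
    memory[addr] += imm   # one instruction, but the hardware has to fuse
                          # a load, an ALU op, and a store under the hood

# RISC style: only simple load/compute/store primitives exist.
def load(reg: str, addr: str) -> None: regs[reg] = memory[addr]
def add_imm(reg: str, imm: int) -> None: regs[reg] += imm
def store(reg: str, addr: str) -> None: memory[addr] = regs[reg]

add_mem_imm("counter", 1)                                        # CISC: 1 op
load("r0", "counter"); add_imm("r0", 1); store("r0", "counter")  # RISC: 3 ops
print(memory["counter"])   # 43 after both increments
```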
 

Ozriel

M$FT
I'll need to see some gaming benchmarks.....otherwise....

morgan freeman idgaf GIF

It's put up or shut up time next week. I don't think MS can afford two gaming-related belly flops in the span of a month.

None of this is gaming-related, though. This ARM push hasn't ever been touted as a gaming powerhouse.

This should easily outperform the Intel iGPUs we usually see in this type of form factor. And that's the point.


They tried this before. I have my doubts. It will end up being Chromebook levels of cheapness, trying to get the price down below $1k with plastic shells, slow SSDs, and bad screens.

And by the time they get anything out, Apple will be on the M5, lol.

They haven’t tried this combination of powerful CPU + improved app emulation before. And certainly the leaked laptops we’ve seen from the likes of Asus and Dell are pretty well specced.

But the thing here in the article is completely wrong anyway. It's not the MacBooks that they need to beat; it's the ultra-powerful, mass-consumer-level iPads and iPhones that are coming, and those are only a few years away.

None of those is any threat to traditional computers.
The latest iPad Pro has a powerful M4 chip in it and is still crippled with iPadOS.
 

bitbydeath

Member
If MS are going to do a reset with ARM then they should really create a new OS from scratch.

Their ancient system is long overdue for a complete overhaul.
 

nemiroff

Gold Member
I knew this thread would end up full of Apple fanboy sales pitches..

Don't get me wrong: with an over-a-decade-long run as an Apple fanboy (computers, pads, phones...) myself, I know how hard it is to come back to real life.

With that said, one thing that kept me on Macs even after coming to my senses again is the fact that they have superior trackpads. No joke, they're so good.

/Spinning beach ball

Anyway, there's not much for consumers to gain from the main topic of this thread; this is an industry fight, so let's see what happens. I doubt there will be much tangible change to feel from the outside.
 

KungFucius

King Snowflake
This is the retarded take I knew I'd see in here.

What kind of cheap fucking PCs do people buy?
Apple sells workstations that cost $60k, and their laptops are much more expensive than Windows machines. It is stupid to buy them if all you are doing is basic work stuff. People are not paying because they are objectively better; they are paying to go to a coffee shop and shine the fucking Apple logo at people. The cheapest new MacBook Air is $1,100. How is that not expensive compared to a comparable Windows laptop? Comparable means it gets the job done and is not a low-end piece of crap. A Lenovo Flex with an i7 can be had for $750. Similar size, decent specs. No show-off logo though, so yeah, people should be paying over 40% more to get an Apple.
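For what it's worth, the premium math checks out against the figures quoted above:

```python
# Quick check of the price-premium claim, using the figures from the post.
macbook_air = 1100   # cheapest new MacBook Air, per the post
lenovo_flex = 750    # comparable Lenovo Flex with an i7, per the post

premium = (macbook_air - lenovo_flex) / lenovo_flex
print(f"Apple premium: {premium:.0%}")   # ~47%, i.e. "over 40% more"
```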
 

twilo99

Member
If MS are going to do a reset with ARM then they should really create a new OS from scratch.

Their ancient system is long overdue for a complete overhaul.

Goes against the core principles of supporting their enterprise clients with backward compatibility for their 20+ year old software…
 

YuLY

Gold Member
I have a games library which I keep replaying; backwards compatibility is important to me, so no thanks.
 

Guilty_AI

Member
RISC wasn't designed for simpler code. It made the assembly-language programmer's job more complex, but the benefit was lower power and smaller chips. The whole point of RISC projects in the 80s was to improve on the efficiency of late-70s designs by removing low-use instructions, not to make programming easier.
RISC does use simpler code, it's literally in the name, though what I believe you're trying to say is that the code needs to be longer. But yeah, the main purpose was still to be more efficient by using fewer transistors; however, they still needed to compensate for the reduced instruction set, so it's not like it's a clear-cut thing.

When you say make CISC more RISCer you have to realise that this doesn't actually make any sense. You cannot make a Complex Instruction Set Computer a Reduced Instruction Set Computer. You're either RISC/CISC or you're not. What you actually mean by that is that x86 over the years tried to become as efficient as other RISC designs but still had to support the whole complex instruction set.
I think it's you who doesn't realize what Frankensteins modern CPUs are. To give you an idea, since the mid-90s Intel has actually been using RISC-like micro-ops internally, with a CISC translation layer on top, while many modern RISC CPUs have gladly abandoned their namesake in favor of adding more specialized instruction-set extensions.
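Here's a very rough sketch of what that translation-layer idea looks like. The encoding is entirely invented (real decoders and micro-op formats are proprietary and far more complicated), but it shows the shape of it: one complex instruction cracked into simple, fixed-format internal ops that a RISC-like back end executes.

```python
# Very rough sketch of CISC -> micro-op "cracking", in the spirit of the
# translation layer described above. Entirely invented syntax and ops.

def crack(instr: str) -> list[str]:
    """Translate one CISC-style instruction into RISC-like micro-ops."""
    op, args = instr.split(maxsplit=1)
    dst, src = [a.strip() for a in args.split(",")]
    if op == "add" and dst.startswith("["):        # add [mem], reg
        addr = dst.strip("[]")
        return [f"uop.load  t0, {addr}",           # read the memory operand
                f"uop.add   t0, t0, {src}",        # do the ALU work
                f"uop.store {addr}, t0"]           # write the result back
    return [f"uop.{op}   {dst}, {src}"]            # simple ops pass through

for uop in crack("add [counter], rax"):
    print(uop)   # one "complex" instruction becomes three simple micro-ops
```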
 

YOU PC BRO?!

Gold Member
I wonder if this means the Arm version of Windows is actually ready to be considered a full release? Also, this may hint at Arm being the CPU of choice for the rumoured Xbox handheld. Interesting times...
 

bitbydeath

Member
Goes against the core principles of supporting their enterprise clients with backward compatibility for their 20+ year old software…
So does ARM. It'll need to run through emulation, and there are no guarantees all x86 software will still work.
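For anyone wondering what "running through emulation" actually means at the bottom, here's a toy interpreter loop for an invented mini guest ISA. Every guest instruction costs several host operations, which is the overhead that translators like Apple's Rosetta 2 (and Microsoft's x86-on-ARM layer) try to claw back by translating whole blocks ahead of time instead of stepping one instruction at a time.

```python
# Toy CPU emulator: the core of what "running x86 through emulation" means.
# Invented 3-instruction guest ISA; real emulators and JITs are far smarter.

def run(program: list[tuple], regs: dict[str, int]) -> dict[str, int]:
    pc = 0
    while pc < len(program):
        op, *args = program[pc]        # fetch + decode one guest instruction
        if op == "mov":                # mov reg, imm
            regs[args[0]] = args[1]
        elif op == "add":              # add reg, reg
            regs[args[0]] += regs[args[1]]
        elif op == "jnz":              # jump to index if reg != 0
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1                        # many host ops per guest op = overhead
    return regs

# Count down from 3 to 0 in the guest.
prog = [("mov", "a", 3), ("mov", "b", -1), ("add", "a", "b"), ("jnz", "a", 2)]
print(run(prog, {}))   # {'a': 0, 'b': -1}
```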
 

Chiggs

Gold Member
None of this is gaming-related, though. This ARM push hasn't ever been touted as a gaming powerhouse.

This should easily outperform the Intel iGPUs we usually see in this type of form factor. And that's the point.

They did make some comments about how games "should just work," though. They being Qualcomm, to be fair.
 

Chiggs

Gold Member
He explained it in the video you apparently watched. I also laid it out in two comments already.

Okay. :rolleyes:

Thanks again for the video, though. Watching an admittedly smart man spaz out over platform critiques he himself wouldn't make, only to still arrive at the fact that X86 has legacy problems, sure was a step-change moment for both me and the world. A new age of understanding dawns.

I'll remember that in the future: if someone posts a "diss" video responding to an "X86 sucks" article that I didn't reference, but it still lands in the same ballpark on why X86 sucks, then I guess the tie goes to the runner?

It's going to be fun watching the next few years unfurl in the PC space. Mental gymnastics ahoy.
 

magnumpy

Member
I'm cheap and all: give me a $400-$500 console for videogames, and most of my web browsing is done on a cell phone. GAF is basically the only use I have for a PC. Although... can you imagine how quickly I could post new forum messages with these next-gen PCs 😎
 

Ozriel

M$FT
They did make some comments about how games "should just work," though. They being Qualcomm, to be fair.

Apple demoed the MacBook Air M1 running Rise of the Tomb Raider via Rosetta. That doesn't make the M1 Air a gaming-focused notebook.

Gaming is a big part of Windows. A decent portion of PC gamers are on iGPUs. It’s valid to discuss emulation/compatibility for some of the most widely used Windows applications.
 

64bitmodels

Reverse groomer.
An architecture (and OS, for that matter) is only as good as its software, BTW. Remember that any time someone proclaims the death of a computer hardware platform.
 

Three

Member
RISC does use simpler code, it's literally in the name, though what I believe you're trying to say is that the code needs to be longer. But yeah, the main purpose was still to be more efficient by using fewer transistors; however, they still needed to compensate for the reduced instruction set, so it's not like it's a clear-cut thing.
You're getting into semantics here, but what I said ("RISC wasn't designed for simpler code") is true, unless you think reduced instructions and more complex low-level programming mean "simpler code," when in fact they meant more complex code. You can't really have code that "needs to be longer" and call it "simpler code." That's obviously not what it was designed for.

The name RISC was coined by David Patterson and David Ditzel, the latter of whom went on to found Transmeta, a company that specialised in very low-power processors that were actually x86-compatible. The aim of the early RISC processors wasn't to make "simple code"; they were designed to be low-power and smaller. That was the only point being made, not to get into the semantics of how the meaning of RISC has blurred over the years, or into Intel using micro-ops. It was in relation to your idea that:

It's also worth mentioning that RISC was introduced to solve problems that aren't really problems anymore. Most modern arguments you see online don't hold much water, and its recent renaissance mostly comes down to the fact that RISC designs are more accessible and easier to understand, and thus easier to do R&D for.

RISC was introduced for low-power, smaller chips, and in the early 90s it was even discovered to be beneficial to performance with rising clock speeds (hence why Intel started using some "RISC-like" micro-ops), while x86 still had to maintain legacy compatibility.

ARM won that low-power market because of this goal. Intel tried to compete with Atom, but it failed to compete effectively because it was playing catch-up in power efficiency, and it cancelled its projects. RISC/ARM's renaissance has nothing to do with being "easier to understand" in any way.
 

RickMasters

Member
Outside of gaming purposes, I think it's very hard to argue Apple doesn't make the best computers. macOS stability and reliability are unmatched. Windows is a bloated mess. It's also extremely difficult to get infected on a Mac, unlike Windows, which has more viruses than Wuhan. Privacy is also a hell of a lot better on the Apple front: E2E encryption for cloud storage with NO remote scanning (unlike OneDrive and Google Drive), iMessage features post-quantum cryptography, Safari has Private Relay (akin to onion routing), and biometrics never leave the device.

And the design is just unmatched. The aluminium finish, the impeccable packaging, the UI design. The machines are pieces of art, not just computers.


I use both for music production and sound-engineering work. My Mac Pro is far less of a headache to work with. It mostly comes down to Apple's Core Audio being better than Windows' WDM audio. I just use ASIO on PC, but all of my audio interfaces are far more stable, performance-wise, on Mac. I use Universal Audio Apollo Xs and I have an Avid Pro Tools Carbon/MTRX setup. Both work way better with Mac. And the M chips are better for productivity than anything Intel or AMD are offering…


We are starting to see MS develop an M3 rival of their own with ARM, though. And others will follow, so maybe in the PC space it's ARM that leads, and not Intel and AMD.
 

RickMasters

Member
I've been on Apple Silicon with my M1 MacBook Pro for almost 4 years now. A laptop that runs on passive cooling, never uses its fan, and gets 2-3 days of battery life.

It's nice to see Windows finally joining the ARM party, but it still remains to be seen if Windows on ARM is less shit than Windows on x86. I'm pretty spoiled by how nice macOS is these days for laptop use.



Will be interesting if it's true that they're going ARM with the next Xbox too. I have a Mac Pro with an M2 and it's a beast of a machine to work on. Curious to see what they've been working on with ARM, especially with new ARM-equipped laptops about to come out, and when we'll get full-blown desktop CPUs and motherboards to match.
 