The era of 4-digit Mac names was such a mess, trying to figure out which one was the best option available at your price point. One of the best things Steve Jobs did on his return was to trim the number of Mac models to a minimum. When my ex-wife was looking to upgrade her Windows laptop a few years back, she ended up in analysis paralysis because the options just from HP were so complicated that she couldn’t figure out what she should buy. Say what you will about Apple’s extreme overcharging for internal memory and storage, it’s at least easy to pick the right Mac for yourself.
I didn't mind it that much because the higher the number was the more powerful the computer tended to be. The 9500 was super good. The 8500 not bad. The 7500 was the best of the mediocre, and the 7200 was similar but not as great.
During that time we bought our Macs from a local computer store. Our guy, Fred, always helped advise us, but he was pretty frustrated overall with the whole situation. I mean he also didn't get why anybody would want a Mac when Windows devices were cheaper, more standard, and less buggy.
> One of the best things Steve Jobs did on his return was to trim the number of Mac models to a minimum.
Fred always said that if they ever introduced colors to computer cases he was going to quit. Jobs came along with the iMac and less than a year later Fred retired. It cracks me up how he stuck to his word.
Keep in mind the 9500 (and 9600) and 8500 were at or near the top of the line and relatively easy to figure out. If that and the 7000 had been the only things Apple shipped, fine. The problem was the 4000/5000/6000 range, and Performa vs Quadra/Centris/Whatever. It was a complete and total mess.
Your Fred also clearly had a weird sense of "less buggy." At that time, Windows was essentially a GUI atop an extended version of MS-DOS. Look up any contemporary serious review and you'll find complaints about stability. Compare to OS/2.
Also thanks for bringing back the memory of all those "other" Macs. I'd forgotten how weird it was trying to distinguish between all those meaningless names, and the marketing behind them didn't really help much.
The most awful machine I've ever used was a circa-2000 98SE Celeron Compaq Staples special that my family bought when the old Performa gave out one evening and we needed a replacement right away. Aside from being a little slower, that Performa was better in every single way despite being four years older.
I used an 8500 as my personal machine for way too long. With a Sonnet G3 upgrade and maxed out RAM, it stayed viable for way longer than it should have.
> I didn't mind it that much because the higher the number was the more powerful the computer tended to be. The 9500 was super good. The 8500 not bad. The 7500 was the best of the mediocre, and the 7200 was similar but not as great.
The general rule was that the first digit represented the form factor, the second digit represented the base model (logic board), the last two digits represented the value-add configuration (amount of RAM, size of HDD, and included software package), then CPU speed was given after a forward-slash, and there might be a CD somewhere in there for configurations which included an internal AppleCD drive.
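A toy sketch of that decoding rule, just to make the convention concrete (the helper below and the per-digit comments are illustrative assumptions, not an Apple reference):

    import re

    # Illustrative parser for the 4-digit convention described above:
    # first digit = form factor, second = logic board, last two = bundle,
    # optional "CD" suffix and "/MHz" clock speed after a forward slash.
    PATTERN = re.compile(r"^(?P<number>\d{4})(?P<cd>CD)?(?:/(?P<mhz>\d+))?$")

    def decode(model: str) -> dict:
        m = PATTERN.match(model)
        if not m:
            raise ValueError(f"not a 4-digit model name: {model}")
        number = m.group("number")
        return {
            "form_factor_digit": number[0],   # e.g. 7xxx desktop, 8xxx mid-tower
            "logic_board_digit": number[1],   # base model within the family
            "config_digits": number[2:],      # RAM/HDD/software bundle
            "mhz": int(m.group("mhz")) if m.group("mhz") else None,
            "has_cd": m.group("cd") is not None,
        }

    print(decode("7100/66"))   # first PowerPC desktop, 66 MHz
    print(decode("6115CD"))    # Performa-style consumer bundle with AppleCD

Whether that rule actually holds is exactly what the rest of this comment pokes at.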
The PowerPC-era form factor numbering scheme was actually established in the 68k era with the all-in-one LC 500-series, the pizza-box Centris 600-series (descendant of the original LC form factor), the desktop Quadra 700 (descendant of the Macintosh Ⅱᴄx/ᴄɪ [compact] form factor), the mid-tower Quadra 800/840AV, and the full tower Quadra 900/950. Computers were called Macintosh when sold by Apple (like to schools) or sold through Apple's dealer network, called Workgroup Server (WGS) when sold in server configurations (like with AppleShare/IP) and called Performa when sold direct to consumers (like through CompUSA, etc).
It started well with the initial models of NuBus Power Mac: the pizza-box 6100/60, desktop 7100/66 (I had this one!!!), mid-tower 8100/80, and full-tower WGS 9150 — different form factors but obviously denoted as the first PowerPC model (x1xx) of each series.
The 6100 makes a good example of this era because it got an especially large number of consumer-focused SKUs where it was known as the Performa 611{0..8}CD, a server version known as the WGS 6150/60, and an eventual speed-bump when it became the Power Macintosh 6100/66 and WGS 6150/66: https://en.wikipedia.org/wiki/Power_Macintosh_6100#Models
Then the exceptions to the numbering scheme started with the second generation of PowerPC machines, the first to switch from NuBus to PCI. They reintroduced all-in-ones as the Power Mac 5200 series (famously horrible machines) and stuck the same board in a Quadra 630 style case as the 6200, both with a PowerPC 603. They introduced a desktop 7200 and mid-tower 8200 with PCI but still using a PowerPC 601, so the x2xx still seemed to represent release order and not CPU. But then they simultaneously released the 7500 and 8500 with a PowerPC 604. Were these supposed to be fifth-gen models? What happened to 3 and 4? They introduced a six-slot PPC604 machine at the same time, the Power Mac 9500, but there was no PPC601 9200.
Next year, the 6400 appeared in a curvy and quite nice-looking consumer mid-tower case, but 6xxx has now represented three different form factors. The 7600, 8600, and 9600 replace their respective x500 counterparts, so now we're back to release order? It doesn't mean CPU generation, because there are higher-end 9500s with PPC604e instead of just 604, and lower-end 7600s with just 604 and not 604e.
The year after that, the Power Macintosh 7300 (I had this one!!!) replaces the 7200 and the 7600, so now we're going backwards even though it's a better computer? It doesn't mean release year, because the 5300 and 6300 are a year older and are just speed bumps of the 5200 and 6200. Except the 5260 which is newer than the 5300 and a much better machine, which was replaced by the 5400 which is a 6400 board in a 5xxx-style case. Except the 6360 which is a 6400 board in a 6200/6300-style case because they had already used 6400 for the tower form-factor the board came from. The 6500 and 5500 were speed bumps of the 6400 and 5400, but at the time of their release were two years newer than the 7500/8500/9500.
The 4400 falls outside of all of this, so at the time it felt like Apple trying to build a cheap Wintel-style business machine. There had been 68030 machines numbered 4xx, but they were consumer-only Performa variants of the LC Ⅲ. Except the LC 475 which was a Quadra 605 in a LC Ⅲ style case.
I think the iPad is the current outlier of that strategy. There's the iPad, iPad Mini, iPad Air, and iPad Pro, with overlapping sizes. It's too much differentiation I think.
Especially as a developer - Macs are like a godsend for us. And it's a device that you use 8-14 hours a day. Sure, pay a bit extra for RAM (which btw has much better bandwidth than the competition), in the end that extra cost is negligible.
Yeah, neither RAM nor storage is an apples-to-apples (heh) comparison the way people usually make it. If you double the storage on your MacBook, Apple doubles the number of storage chips, with dedicated PCIe lanes to each. Since it internally operates with something like RAID0, you also get double the speed.
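A back-of-the-envelope sketch of that RAID0-style scaling claim (the numbers and the helper below are made up purely for illustration; real controllers are more complicated):

    # Toy model: a large sequential read is striped round-robin over N chips
    # that each sustain `chip_mb_s`, so the chips work in parallel and the
    # aggregate throughput scales roughly with the chip count.
    def sequential_read_seconds(total_mb: float, n_chips: int, chip_mb_s: float) -> float:
        per_chip_mb = total_mb / n_chips   # each chip serves ~1/N of the data
        return per_chip_mb / chip_mb_s     # all chips read at the same time

    for chips in (1, 2, 4):
        t = sequential_read_seconds(total_mb=10_000, n_chips=chips, chip_mb_s=1_500)
        print(f"{chips} chip(s): ~{t:.1f} s, ~{10_000 / t:,.0f} MB/s aggregate")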
> The NVMe controller on Apple Silicon is not PCI-based, so there are no PCIe lanes going to the storage chips at all.

Do we know that for sure? There are I/O lanes being allocated whatever the exact technology being used is. I just don’t see why they’d reinvent the wheel here.
Yes, the Asahi Linux reverse engineers said in their commit message for the Linux apple-nvme driver:
“Add a driver for the NVMe storage controller integrated on Apple SoCs. This NVMe controller isn't PCI based and deviates from the NVMe standard in its implementation of the command submission queue and the integration of an NVMMU that needs to be managed. This commit tweaks the core NVMe code to support the linear command submission queue implemented by this controller. But setting up the submission queue and managing the NVMMU controller is handled by implementing the driver ops that were added in an earlier commit.”
I agree for RAM, but not for flash storage. The competition usually has as-good or even better flash throughput and IOs/second.
The reason people are aggrieved by Apple’s storage upgrade prices is that you can usually buy a high-end, entire NVMe device of a given capacity for less than Apple charges just for the upgrade to that capacity, and the NVMe will be as fast or faster than Apple’s offering.
I’m processing hundreds of GB of information (the whole historical Bitcoin blockchain). Enough to not fit in RAM, but the computation I’m running is light enough to not be CPU bound.
At home I have a desktop rig with multiple TB of RAM and a fast server CPU. I would normally ssh into that to run tests with the chain mounted on a /dev/shm partition, which was a pain and was only accessible when I was at home. With my new MacBook Air, the upgraded internal drive is large enough to hold the full historical chain, and streams from disk fast enough to finish a run in comparable time. So now I’m mobile and can work from anywhere, with an entry-level laptop replacing a dedicated server! That’s a big change for me.
I recognize not everyone’s tasks are bottlenecked the same way though.
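A rough sketch of that kind of disk-streaming workload (not the parent's actual code; the file layout and the per-chunk work are hypothetical placeholders):

    import glob
    import hashlib

    CHUNK = 8 * 1024 * 1024  # 8 MiB reads keep the SSD streaming sequentially

    def process_chain(pattern: str = "blocks/blk*.dat") -> str:
        digest = hashlib.sha256()              # stand-in for the real per-block work
        for path in sorted(glob.glob(pattern)):
            with open(path, "rb") as f:
                while chunk := f.read(CHUNK):  # constant memory, however big the chain is
                    digest.update(chunk)
        return digest.hexdigest()

    print(process_chain())

As long as the per-chunk work is cheaper than the read, the run time is set by how fast the drive can stream, which is why the internal SSD upgrade pays off here.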
Absolutely. That's why in newer computers with M.2 storage I'm delighted to find the existence of SATA SSD storage in M.2 form factor. Now I don't have to pay NVMe prices any more.
Whereas with Apple, I believe the only choice is NVMe storage. What if I want more but slower SSD storage?
>in newer computers with M.2 storage I'm delighted to find the existence of SATA SSD storage in M.2 form factor. Now I don't have to pay NVMe prices any more.
Which year do you live in? New computers haven't been shipping with M.2 SATA storage since like ~2018, and SATA SSDs haven't been cheaper than NVMe since at least ~2020. NVMe has been cheaper than SATA for many years already.
You mean M.2 SATA storage is available cheaper than M.2 NVMe where you live? Over here there are very few options in M.2 SATA form factor and the prices are almost double that of M.2 NVMe, which is really not that surprising given the much higher volumes of NVMe parts.
I think this is backwards? NVMe should be faster than SATA. I don’t think Apple uses either though. They directly connect the CPU to storage on the SoC. Another commenter above is saying they don’t even use PCIe.
NVMe is faster than SATA. But for my purposes SATA is fast enough; I just wish manufacturers would make SATA SSDs cheaper than NVMe SSDs. But alas that's not the case.
Aren't Macs the opposite for the developers with subpar support for say, containers and dev tools, and crappy out of the box window management compared to a laptop running on linux?
Seems to me you have to do a lot of manual tweaking and install before having something half decent as a dev.
> with subpar support for say, containers and dev tools,
It's a Unix machine. All that is right there and readily available.
> and crappy out of the box window management
Not really, no. Add one app and it's a tiling environment. Actually that is built into macOS 15, but I've got it turned off as I have a tiling app I've been using for 15+ years and I'm happy with it.
> compared to a laptop running on linux?
No. It's a better UI in every way, less hassle, more apps and better support.
I've been using Linux for 28 years now and for a while it improved beyond all recognition, but it's getting very clunky again with all the bloat now.
I switched to macOS on my desktop machines once I could afford it, and Linux for laptops. This is a happy compromise.
But I've also been writing about it for well over 25 years and that means reading other people's writing about it and where possible talking to them.
All the professionals evangelising Linux in the 20th century have moved to Macs now. It's the same core experience, but done better.
> Seems to me you have to do a lot of manual tweaking and install before having something half decent as a dev.
I can't speak for being a developer because I'm not one, but I can speak about Linux and macOS as a pro.
This is the wrong way to use a Mac.
The right way to use a Mac is not to fight it. Accept things as they are, learn to work with it, and add the extras you need.
You can't customise macOS very much and it's hard. So, don't. Be like bamboo, not a tree: bend with the wind, adapt to where you are, and then grow where you want to go.
The result is a proper full on UNIX™ environment which needs little to no maintenance and has integration to an extent no other Unix-like OS will ever achieve.
Your questions seem to me to be motivated by bias and conviction of faith, and it is misplaced, as any such fervent belief is.
> I've been using Linux for 28 years now and for a while it improved beyond all recognition, but it's getting very clunky again with all the bloat now.
Oh my this rings so true.
While some don't like systemd (which is fine, everyone's entitled to their own choices) I do like the more cohesive and consistent approach a lot.
But then my two uphill battles are:
- Xorg is still my go-to in spite of the limitations Wayland aims to solve (colour management, heterogeneous multihead), but I can't for the life of me make Wayland stable/reliable/usable.
- PulseAudio was a debacle, so I've used ALSA since forever and could do great things with it. Trouble is, some things modern software expects become hard to impossible with just ALSA. Enter PipeWire, which conceptually sounds like a great thing, but it is so obscure and underdocumented that I just can't wrap my head around it.
Although colleagues have earnestly described why they like GNOME, and demonstrated it, all I see is people who don't know how to use the existing, 35+ year old keyboard UI of Windows, or the simpler and only a few years younger one of NeXTstep/macOS.
I can't stand GNOME myself. It doesn't get out of my way. It wastes a tonne of precious vertical space on its panel. Its app-switcher is poor. Its window management is atrocious, but then, I've met with and interviewed the dev team, and they don't manage windows. They switch between full-screen sessions instead. I'm looking at twin 27" screens right now, and I want to see 5 or 6 apps at once. GNOME obstructs that massively.
But it's trivial to configure macOS to be as minimal as GNOME. Dock to autohide, cmd+space for the app launcher, trackpad gestures to hop between full-screen apps. It's not how I work or want to, but it's easily achieved.
Yesterday I upgraded Fedora Asahi 40 to 41 on my M1 MBA, and KDE is so bad I was reduced to laughter at its pathetic clunkiness. But then I am a documented KDE-hater ever since the days of KDE 2.0.
And GNOME, too, but at least it has the mercy of being pretty. Horribly confining and with an appalling keyboard UI, but it's pretty.
I liked KDE 1.x a lot. I didn't love it, but it was a perfectly serviceable desktop for Linux and it was FOSS. All the other usable Linux desktops I'd seen before then were paid for, such as IXI X.Desktop.
When I say I've been a KDE hater since KDE 2 I meant that I liked KDE 1, but KDE 2 was a bit of a bloated mess. KDE 3 was much much worse and it's continued to turn into a parody of a bad implementation of the Windows 98 desktop -- the bad version, with IE4 embedded in the shell -- ever since.
Yeah, desktop environments are pretty bad on Linux. Window managers are where it’s at, of course. I guess it is sort of unfortunate for people if they get the impression that Linux has bad UI because somebody decided a desktop environment should be the out-of-the-box experience.
There have been good desktops. GNOME 2 was basic and clunky but usable. I actually liked and still use Unity, which is as good as it's got so far IMHO, but it's undergoing bitrot now.
Xfce is perfectly fine and I'm happy with it but it could do with some streamlining and simplification in places. The workspace switcher is a bit silly and so a good example: rows are set in one place, columns separately in a different screen. Junk the separate start menu and app finder, because the whisker menu does that. Dashboard on by default. Docklike-taskbar present by default and either set it up as a better Win10 or Win11 clone, which it can do better than the original now, or lean in to the areas where it can do things others can't and set it up as a Mac/Unity-like setup or something different that MATE, Cinnamon, etc. can't do. And slap some pretty themes on it, with visible, grabbable window margins.
But the big names are all basically in death spirals now. Aside from Elementary OS, which is very very pretty but about as flexible as an iPad (i.e. not very), the only people making real efforts at looking good and working well are in China. Deepin is gorgeous in its way, and UKUI and Kylin are equally so.
I've found that to be true on every OS I use. Customize as little as possible and things tend to work better and you will have better luck finding answers when something does break.
>All the professionals evangelising Linux in the 20th century have moved to Macs now.
>Your questions seem to me to be motivated by bias and conviction of faith, and it is misplaced, as any such fervent belief is.
Weird to accuse someone of conviction of faith while confidently claiming that all linux users switched to Mac and how Mac is the be-all end-all of computers. You're in a bubble if you think so, I can definitely tell you that.
> confidently claiming that all linux users switched to Mac
I did not say that. I did not say anything resembling that. It's an absurd claim.
What I said was:
«All the professionals evangelising Linux in the 20th century have moved to Macs now»
Which followed, and was in the context of, the sentence:
«reading other people's writing about it and where possible talking to them.»
In other words: the professional Linux advocates -- that means, the people who were advocating and recommending Linux to non-Linux users -- that I read and knew and sometimes have talked to -- switched.
Not people in the Linux biz talking to other people in the biz.
People like author Charlie Stross, who is occasionally cstross on here, who for years wrote the Linux column in the UK edition of Computer Shopper and was as such perhaps the most visible UK tech journalist writing about and recommending Linux.
Or Neal Stephenson, author of the seminal "In the Beginning was the Command Line", which if you have not read recently you need to.
Context is important and must be considered. You apparently did not.
> and how Mac is the be-all end-all of computers.
I didn't say that either.
It's got a damned good case to be the most sophisticated general-purpose desktop/laptop OS there's ever been, though, and it's held that place pretty much the entire century so far.
Tastes differ. Not everyone likes it. That's fine. I am not saying everyone should.
But I'm saying that if you read the widest possible range of OS and UI discussion and debate, there is a fairly clear consensus that what was Mac OS X and is now macOS is, while flawed, about the best there is.
>«All the professionals evangelising Linux in the 20th century have moved to Macs now»
> In other words: the professional Linux advocates -- that means, the people who were advocating and recommending Linux to non-Linux users -- that I read and knew and sometimes have talked to -- switched.
I'm sorry, but as a professional journalist surely you must know the contradiction you've introduced here with the difference between "all Linux evangelists moved to Macs now" and "all those I know and read of switched" because those two statements are not the same thing.
One statement deals in absolutes ("all Linux evangelists switched to Mac") and can be supported by sources if so; the other is an opinion based on your bubble ("all that I know switched to Mac"), which is just your opinion that's different than the situation in my bubble and holds just as much weight.
You're arguing with one of the better writers for The Register, there.
I'm not much of a Linux person, but I have been using Macs since 1986 (as a developer), so I can attest to most of Mr. Proven's statements with regard to macOS.
>You're arguing with one of the better writers for The Register, there.
You're saying that like it should mean something. It's still the subjective opinion of a person. It holds no more or less value than the subjective opinion of another person. Being a journalist doesn't automatically make you the supreme authority on something, you're still just a professional opinionator (no offence), but that opinion can be different than other users.
>I have been using Macs since 1986 (as a developer)
That's an issue IMHO. Long term MacOS nerds are the ones who got used to all the quirks and can't see anything at fault, as they molded themselves into the platform with age, developing muscle memory workarounds without realizing it, so to them that status is perfection.
Meanwhile, new users to the platform will see things differently.
> Long term MacOS nerds are the ones who got used to all the quirks and can't see anything at fault as they molded themselves into the platform with age, so to them everything is perfect. Meanwhile, new users to the platform will see things differently.
An easier way to phrase that is "people have confirmation bias." You clearly exhibit this in your post. Whether new users see things differently depends on whether they've used other desktop environments or not. I'm confident that someone who has never used a desktop computer before would be more productive on a Mac. Had they used Windows, they might be confused.
It depends on who "the person" is. In this case, it's a seasoned professional, who uses both operating systems regularly, at a fairly advanced level, and also explains this stuff to others, while being held to journalistic standards.
Also, The Register tends to hire pretty sharp folks.
> Meanwhile, new users to the platform will see things differently
That's always the case. Unless you are an invested user of a platform, it's likely to be uncomfortable. When folks ask me if they should get an Apple device, as opposed to an Android/Windows PC, they are often surprised, when I say they should probably get what they are already used to.
Truth be told, there's plenty of good in all UI (including CLI), and people get very efficient, using their UI of choice. I find that it's usually best, if they stay on it.
Having been an Apple developer for decades, I have been absolutely slathered in bile from Apple-haters. It seems to be pathological. I assume that's because of the "snottiness" of Apple's approach. It's actually deliberate, and part of their branding. It can get annoying, but I know why they do it. Personally, I don't feel that way, despite being invested in the Apple ecosystem, and I don't hate other approaches, either. I managed a multi-platform development team for a couple of decades. It was not conducive to effectiveness, for me (or any of my employees) to be jingoistic about platform choices.
Professional in what? I'm also a professional. Is my opinion not just as valid? Is MKBHD also a professional in this sense?
>who uses both operating systems regularly
I think many people on the planet, including children, can use two or more operating systems regularly and provide opinions on them, it's not a rare skill or something that requires academic degrees. Is their opinion not just as valid?
>while being held to journalistic standards
A lot of events proved that "journalistic standards" mean very little, especially in the modern era of online publications being dependent on click ad-revenue. For example look at the disconnect between critics ratings of movies and audience ratings, or between car reviewers and car owners. Similarly, Microsoft and Apple make OSs for users, not for professional critics or journalists.
It's still just someone's subjective opinion on an OS, not something numerically and logically quantifiable as being the right opinion. It's not like it's a debate with Linus Torvalds on the correct implementation of mutexes.
> I have been absolutely slathered in bile from Apple-haters.
What does this have to do with me? What's with this victimization attitude in people lately? Should I feel guilty or sorry because some other random people said something mean to you in connection with this topic? It's a conversation between you and me, I don't care about what others did.
I am replying to you from my third mac. I got it less than a year ago and it is the first Mac I have used since 2010 or so. Sure I am getting used to it but it does surprise me how different some things are from my typical XFCE/Win10 environments. I know unintuitive is the wrong word but at least for my own intuition, it is unintuitive.
It is very much a thing of modern times to be lectured on, for instance, desktop design, when I am fairly confident I've used more different desktop environments than the person accusing me is even aware exist.
(I would estimate I've used 35-40 different desktops across a dozen or more GUI OSes. The first I owned myself was an Acorn Archimedes with RISC OS 2, an environment far weirder than any hardcore Linux advocate could even imagine… a default editor with two separate independently-navigable cursors (source and destination), three mouse buttons all heavily used, and no permanently on-screen menus of any kind (only context menus).)
>It is very much a thing of modern times to be lectured on for instance, desktop design
Where did I lecture you on that?
> when I am fairly confident I've used more different desktop environments
Does using more desktop environments make one's opinion on a specific desktop design more valuable than everyone else's? It's not like you're designing them, you're just using them, just like me and millions of other people.
>than the person accusing me is even aware exists.
Care to point out what exactly did I accuse you of?
I didn't. Turn your paranoia down. I never mentioned you once and none of this is specific or particular to you.
But, to answer one point: yes, I do think that broad experience of lots of different desktop GUIs does qualify someone for comparing them, and for identifying particular strengths or weaknesses of particular ones.
Docker assumes there is a Linux kernel underneath, not a Mac 'unix' kernel... so you end up having to have, just like on Windows, a VM running a Linux kernel to run a Docker container.
But I am told -- I do not work with this stuff myself -- that if you simply install Docker Desktop, or something equivalent, it just happens, invisibly and out of sight, zero intervention and zero maintenance.
Linux to me still feels like it used to be in the 90s. It's certainly improved, and package management is better, but the UIs are inconsistent and of relatively low quality. The advantage would be that you have more of a one-to-one match with what happens on the server side (which is usually Linux for most people).
Some parts might require some tweaks, but usually it's a once-off and then you're good to go. Containers - haven't had many issues, though you might run into some non-ARM images for Docker; that's fairly easily solved.
As for window management - what do you mean? The window management to me is good, but then I never understood tiling window managers and such, if that is your requirement.
I've recently spent a considerable amount of time on Windows 11 (after using linux/x11/wayland/kde for a long time) -- the UI inconsistencies are widespread there too. Microsoft is only finally finishing the push to make all control panels look consistent, and they are doing so by removing some of the more detailed options.
I was there and the wiki article captures the look, but not the feel. Giant funky mouse pointers that flip which direction they point as you hover over elements, but without apparent logic...Window maximizing behavior that left you unable to get back to the desktop...Minimized windows disappearing, never to be seen again...
Yeah. Linux got left in the dust in about 2005 and no one has worked it out yet.
The principal difference is that the sheer quality of the client desktop experience hasn't improved since then. The Linux desktop apps are pretty terrible, unreliable and clunky, and most of the progress so far has been rewriting them again and again in slightly different desktops to no avail (GNOME over the years, for example). Yet still, things like fractional scaling barely even work.
While everyone was pissing around with that and fanfaring open source, Apple refined a whole suite of apps that ship with their Macs and phones and iPads that just work and sync properly.
And that's what is important to a lot of people, not whether the icons are in the title bar on gnome, any purity etc. Usability is number 1. And Linux is not.
Err, we have virtual desktops, tiling, snapping and things, you know, out of the box these days. I mean the virtual desktops thing is mostly what I use and it's a triple swipe on my magic trackpad to switch.
And MacOS had Spaces 6 years before Windows had anything similar and Exposé for 3 years before they came out with a crappy not-as-good equivalent, and the current task view still sucks by comparison.
But touch input drivers on both platforms still suck, so I don't really care what their Window management is like when I can't interact with them without a hand cramp.
I'm not sure why the need to move the goalposts but I'll bite.
>And MacOS had Spaces 6 years before Windows had anything similar and Exposé for 3 years before they came out with a crappy not-as-good equivalent, and the current task view still sucks by comparison
So what? On Windows and Linux I never needed that feature because they always had proper window management, nor do I use that feature now. You're comparing Apples to Oranges. A Dodge RAM has a tow hitch, a Ferrari doesn't have a tow hitch. Is one better than the other, or are they better at different scenarios?
>But touch input drivers on both platforms still suck, so I don't really care what their Window management is like when I can't interact with them without a hand cramp.
All touchpads give me cramps and carpal tunnel, that's why I use an angled mouse. Again, moot and off topic point. What's the point of a better touchpad if it's never gonna beat an ergonomic mouse?
Half-screen tiling (Window > Tile Window to Left/Right of Screen, or click and hold the green button), snapping (same but hold Opt), and virtual desktops ("Spaces", later "Mission Control") have been available for a long time. The former ones are maybe not used that much because people don't explore the menus.
The window management thing is really overblown in my opinion. On macOS I just keep two monitors with separate virtual desktops enabled on both with apps assigned to specific desktops which reduces management to almost nothing, which is even easier and lower effort than a Win9x-paradigm desktop or tiling setup (which I’ve found requires a surprising amount of micromanagement to keep usable).
I've worked with Unix and Windows in the past decades and each and every time the only scenario Windows wins is when I'm developing applications for Windows.
Since I develop mostly for server-side, a Unix-like OS is a no-brainer. I have all three OSs on my desk and the least satisfying to use is Windows - it's relatively slow and difficult to troubleshoot device driver issues. On Linux you can always look under the hood and on Macs there is no such thing as device issues.
Yeah. I like to run docker without needing a Linux vm, I like the choice of desktop environments and they are superb for me compared to the macos desktop. KDE Plasma 6 is one of the best desktops I've ever used.
Now with the atomic distros, such as Aurora, you have a rock solid base you never touch, updates are atomic so you can always reboot to the previous version if needed and you create lightweight containers for development.
My current setup is Aurora as the base distro, all GUI applications from Flathub and the terminal automatically opens up in distrobox which runs Arch Linux with Nix. Super solid, super fast and everything just works.
> you’re much more likely to just grab “the cheapest”.
And then realize there were better deals at the time and tarnish the brand.
Absolutely, simplifying the lineup to four Macs was the best decision. Right now they have one more than they should - the Mac Pro and Mac Studio seem to clash a lot, especially since you can't use the PCIe slots of the Pro for GPUs. What do people put in those slots? I'd assume storage and fast networking.
I’d also imagine there are firms using SDI video capture cards for on-set/production purposes. Outside of local storage, I also used to see Fibre Channel HBAs and the like somewhat commonly on the older cheese grater (unsure how common that use case is now).
The current Mac Pro, to my recollection, is also readily available in a rack-mount format; in and of itself that’s a solid reason for keeping it alive.
The Mac Pro is expensive enough that it takes it solidly out of the "consumer" arena and puts it into commercial/business customers. Those customers will take the time to investigate and determine what they need for the job.
There have been times where the Mac Pro dipped into the high end consumer market explicitly, but we're not in one of those times now.
(Do note that some consumers WILL buy "commercial" products and Apple's obviously aware of that, but I suspect it's hard to get them to recommend the Mac Pro to home users.)
100% agree. I call it the "toothpaste paralysis." It’s like when you’re shopping for toothpaste, and brands like Colgate have so many overlapping options that it becomes impossible to figure out which one is actually the best. Unfortunately, I think the same thing is starting to happen with MacBooks again. It’s not as bad as before, but it’s definitely not as straightforward as it used to be.
- Air if you want something lightweight and don't need the "Pro" level CPU/GPU power
If you're only doing email/web, you should probably go Air. (There are no bad choices with the Apple silicon Macs for general use, it's mainly a question of how slow you want your video rendering chores and Xcode builds to be.)
Then multiply by screen sizes, which determines the overall size of the machine.
Edit: formatting. HackerNews support markdown challenge.
Air as an idea makes sense: with Pro you pay extra for more power, whereas with Air I thought the idea was you paid more for the same functionality, but in a thinner/lighter form factor.
Is it really just between the Pro and nothing at this point? Because that’s dumb if so.
>When my ex-wife was looking to upgrade her Windows laptop a few years back, she ended up in analysis paralysis because the options just from HP were so complicated that she couldn’t figure out what she should buy.
It's like cars where they bundle one feature you want with a bunch of stuff you don't care about to force you into overspending.
How dare you attempt to sully the honor of my beloved Power Mac 9500, or take an implicit shot at my Performa 638CD — which was not even 4 digits, but 3 digits and 2 letters. You need to check your wiring, friend, or dial up your SoftRAM.
> I am not sure what "should" means in that context.
Really?
The phrase means "what was the best choice", which means "she could not figure out which model offered the best balance of price, performance and features."
I can't offhand think of a more efficient way to phrase it, TBH.
I used to have a 4400/200 with the PC card and honestly loved the thing. It was my first Power Mac and I could press cmd+enter to switch into Windows 95; it felt so cool at the time.
I managed a lab full of these. So painful to work on because of all those sharp edges. We upgraded the RAM by hand though, which did help. And went from OS 7.6 to 8.6 eventually... which made things a bit more stable. Such weird machines.
I had that model. Modernized it later by adding more memory, more video memory and eventually a Sonnet G3 upgrade card that made it very fast. With that card it did run Mac OS X, 10.3 as far as I remember, and was fairly usable.
What it did not have though was true color; the video card simply did not produce it, even with maxed out video memory. As far as I understand the cause was that the memory was too slow for that.
They quite possibly use the same LPX-40 logic board design.
Going by the developer's note Apple created for it, the LPX-40 is somewhat interesting, but the PowerMac 4400 is basically the most "boring" and normal Mac-like configuration. PS/2 and VGA connectors, PC-style MFM-only manual-eject floppy drives, support for "hard power" configurations and an AT-like PSU connection - they were really going for "shove this into a PC case, and you've got a Mac". Also, Apple could've fitted a PPC604...
Fast forward to 2020 and Apple introduces the M1 MacBook Air. (Though people still complained about the 8GB memory configuration.)
Apple seems to have learned their lesson with entry-level machines; the basic iPad and Mac mini are quality designs (though storage/memory upselling is still a thing - the cheapest iPad is probably aimed at classrooms/kiosks/streaming.)
I don't think it was that bad. I think it was different, and it was confusing to compare the Tanzania systems to anything else Apple, but it wasn't a bad system by itself.
I found a Motorola Starmax desktop (not tower) in the trash in Manhattan in the early 2000s. It chimed but didn't show video, so I installed a disk, installed NetBSD and used it as server for many, many years. It was very decently performant and incredibly stable.
These days I think it'd need a recap and the 160 megabyte limit would make it less useful than it was, but I still have only good things to say about it.
TFA says it's been followed by the Macintosh G3 desktop... But the G3 was just a beige PC too. Slightly heavier than a regular tower PC but still very beige.
Not Apple's greatest era. They weren't the old Mac cool anymore and they weren't yet iPod/iPhone/iPad cool.
Some G4s were actually good looking and had a great monitor too. But to me the G3 that followed that 4400 was just as much "bad Apple".
I have fond memories of the OS and still own it though.
It was a noticeable step down from the previous keyboards. Certainly not Apple's best.
Incredible as this might sound, I think the best keyboard Apple made was the Butterfly. It was fragile and unreliable, but it felt great and sounded crisp and precise.
I had no idea Apple ever did this. And the idea of a floppy drive that doesn’t have auto-inject is just sacrilege.
Even after leaving the Mac in the late 90s and building my own PCs getting to mess with a Mac was always a nice experience because they were so nicely built physically.
The first 3.5” drive I ever had was in an LC II. Before that I had only used a 5.25 in a PC XT or something like that. Being able to have it suck a disk in, or eject a disk and have it pop out with that great mechanical noise, was fantastic.
Because of my age I thought all drives were like that. The first time I used a Windows PC (3.0?) I was surprised that you had to push the disk in by hand and that it didn’t just show up on the desktop in Windows. I had to be introduced to the concept of drive letters. Seemed relatively barbaric to young me.
Of course within about two years I was asking for my own PC for all the great games. So that didn’t last all that long.
Hah, yes my childhood experience with these was similar. There was your typical 8-bitter 5.25" floppy with its floppiness and rattly drives and make-it-double-sided-with-a-hole-punch DIY-ness. And then there was the 3.5" hard plastic square, straight out of Star Wars. A robot would eat it and regurgitate it for you on command.
Funnily enough, Microsoft actually planned to add floppy drive auto-mount in Windows 95. But half the drives implemented the signal for "media present" backwards from the other half, and Microsoft couldn't figure out a user-friendly way to make it auto-detect, so they canned it.
That is really interesting, in that it is the opposite of my childhood understanding. I started with CP/M and DOS, and the first time I came to a Linux machine, I just couldn't understand how someone could work with drives without the letters (dedicated namespaces, right). My thought was that it was a less polished design.
I think a Colour Classic still has an auto-inject drive; my LC II had one. You can tell the manual-inject ones because the case has a curved indent around the drive. Although this is the changeover era, as some late LC IIs apparently have the different drive (and lose the Snow White stripe along the front at the same time).
There is a small hole near the floppy drive and there also was a pin to eject a disk when the computer was off, similar to how SIM cards are handled in modern phones. Good design, actually; harder to damage data.
I suspect it’s more just that it doesn’t “fit“ the way all the other machines were, it stands out and not in an impressive way. It just sort of increases the otherness.
I agree the stuff about being harder for right hand is probably just made up after the fact as color commentary.
I think there’s just a bit of snobbery. It’s an off the shelf LPX chassis with a “Logic Board LPX-40” that Apple also supplied to clone makers. The floppy drive being on the “wrong” side is just proof that it somehow lacks that special something.
It's just because it looked like a WIntel PC and thus was a threat to the collective illusion that 1996-Apple offered anything substantially different or better than Windows '95 (source: was a 1996 Macintosh user who used the term “WIntel”)
I love PowerPC but I still don't believe CPU architecture alone is a motivating factor for why any person would choose to use a particular computer over any other. If that were true then Stebe Jovs never would have told me about The Megahertz Myth, now welcome Phil S[c]hiller to the stage to run the PentiumⅡ machine for this specially-scripted Photoshop benchmark, et cetera.
Reads like it was just a bad Mac all-around, but the left-hand floppy drive was a visible symbol of that on the face of the machine: it was different from what was normal for a Macintosh, in a machine that was full of things that were different from a normal Macintosh.
Also knowing Stephen Hackett, I don't think he's capable of hate for older Macs. He seems to love even the oddest of ducks and has a lab full of them.
The era of 4-digit Mac names was such a mess, trying to figure out which one was the best option available at your price point, One of the best things Steve Jobs did on his return was to trim the number of Mac models to a minimum. When my ex-wife was looking to upgrade her Windows laptop a few years back, she ended up in analysis paralysis because the options just from HP were so complicated that she couldn’t figure out what she should buy. Say what you will about Apple’s extreme overcharging for internal memory and storage, it’s at least easy to pick the right Mac for yourself.
I didn't mind it that much because the higher the number was the more powerful the computer tended to be. The 9500 was super good. The 8500 not bad. The 7500 was the best of the mediocre, and the 7200 was similar but not as great.
During that time we bought our Macs from a local computer store. Our guy, Fred, always helped advise us, but he was pretty frustrated overall with the whole situation. I mean he also didn't get why anybody would want a Mac when Windows devices were cheaper, more standard, and less buggy.
> One of the best things Steve Jobs did on his return was to trim the number of Mac models to a minimum.
Fred always said that if they ever introduced colors to computer cases he was going to quit. Jobs came along with the iMac and less than a year later he retired. It cracks me up how he stuck to his word.
Keep in mind the 9500 (and 9600) and 8500 were at or near the top of the line and relatively easy to figure out. If that and the 7000 had been the only things Apple shipped, fine. The problem was the 4000/5000/6000 range, and Performa vs Quadra/Centris/Whatever. It was a complete and total mess.
Your Fred also clearly had a weird sense of "less buggy." At that time, Windows was essentially a GUI atop an extended version of MS-DOS. Look up any contemporary serious review and you'll find complaints about stability. Compare to OS/2.
Heh I disagreed with Fred that's for sure.
Also thanks for bringing back the memory of all those "other" macs. I'd forgotten how weird it was trying to distinguish between all those meaningless names, and the marketing behind them didn't really help much.
The most awful machine I've ever used was a circa-2000 98SE Celeron Compaq Staples special that my family bought when the old Performa gave out one evening and we needed a replacement right away. Aside from being a little slower, that Performa was better in every single way despite being four years older.
I used an 8500 as my personal machine for way too long. With a Sonnet G3 upgrade and maxed out RAM, it stayed viable for way longer than it should have.
MacPro5,1 is the new 8500 then
> I didn't mind it that much because the higher the number was the more powerful the computer tended to be. The 9500 was super good. The 8500 not bad. The 7500 was the best of the mediocre, and the 7200 was similar but not as great.
The general rule was that the first digit represented the form factor, the second digit represented the base model (logic board), the last two digits represented the value-add configuration (amount of RAM, size of HDD, and included software package), then CPU speed was given after a forward-slash, and there might be a CD somewhere in there for configurations which included an internal AppleCD drive.
The PowerPC-era form factor numbering scheme was actually established in the 68k era with the all-in-one LC 500-series, the pizza-box Centris 600-series (descendant of the original LC form factor), the desktop Quadra 700 (descendant of the Macintosh Ⅱᴄx/ᴄɪ [compact] form factor), the mid-tower Quadra 800/840AV, and the full tower Quadra 900/950. Computers were called Macintosh when sold by Apple (like to schools) or sold through Apple's dealer network, called Workgroup Server (WGS) when sold in server configurations (like with AppleShare/IP) and called Performa when sold direct to consumers (like through CompUSA, etc).
It started well with the initial models of NuBus Power Mac: the pizza-box 6100/60, desktop 7100/66 (I had this one!!!), mid-tower 8100/80, and full-tower WGS 9150 — different form factors but obviously denoted as the first PowerPC model (x1xx) of each series.
The 6100 makes a good example of this era because it got an especially large number of consumer-focused SKUs where it was known as the Performa 611{0..8}CD, a server version known as the WGS 6150/60, and an eventual speed-bump when it became the Power Macintosh 6100/66 and WGS 6150/66: https://en.wikipedia.org/wiki/Power_Macintosh_6100#Models
Then the exceptions to the numbering scheme started with the second generation of PowerPC machines, the first to switch from NuBus to PCI. They reintroduced all-in-ones as the Power Mac 5200 series (famously horrible machines) and stuck the same board in a Quadra 630 style case as the 6200, both with a PowerPC 603. They introduced a desktop 7200 and mid-tower 8200 with PCI but still using a PowerPC 601, so the x2xx still seemed to represent release order and not CPU. But then they simultaneously released the 7500 and 8500 with a PowerPC 604. Were these supposed to be fifth-gen models? What happened to 3 and 4? They introduced a six-slot PPC604 machine at the same time, the Power Mac 9500, but there was no PPC601 9200.
Next year, the 6400 appeared in a curvy and quite nice-looking consumer mid-tower case, but 6xxx has now represented three different form factors. The 7600, 8600, and 9600 replace their respective x500 counterparts, so now we're back to release order? It doesn't mean CPU generation, because there are higher-end 9500s with PPC604e instead of just 604, and lower-end 7600s with just 604 and not 604e.
The year after that, the Power Macintosh 7300 (I had this one!!!) replaces the 7200 and the 7600, so now we're going backwards even though it's a better computer? It doesn't mean release year, because the 5300 and 6300 are a year older and are just speed bumps of the 5200 and 6200. Except the 5260 which is newer than the 5300 and a much better machine, which was replaced by the 5400 which is a 6400 board in a 5xxx-style case. Except the 6360 which is a 6400 board in a 6200/6300-style case because they had already used 6400 for the tower form-factor the board came from. The 6500 and 5500 were speed bumps of the 6400 and 5400, but at the time of their release were two years newer than the 7500/8500/9500.
The 4400 falls outside of all of this, so at the time it felt like Apple trying to build a cheap Wintel-style business machine. There had been 68030 machines numbered 4xx, but they were consumer-only Performa variants of the LC Ⅲ. Except the LC 475 which was a Quadra 605 in a LC Ⅲ style case.
Except, except, except. What a fucking mess lol
I think the iPad is the current outlier of that strategy. There's the iPad, iPad Mini, iPad Air, and iPad Pro, with overlapping sizes. It's too much differentiation I think.
I still don't understand why the iPad mini needs to be its own separate product category instead of it just being the 8" iPad Air.
Don't forget iPad keyboards! That's a whole other level of overload.
Especially as a developer - Macs are like a godsend for us. And it's a device that you use 8-14 hours a day. Sure, pay a bit extra for RAM (which btw has much better bandwidth than the competition), in the end that extra cost is negligible.
All of the Macs finally come with 16 GB RAM which is decent.
Yeah both RAM and storage aren’t comparing apples to apples (heh) when compared as people often do. If you double the storage on your MacBook, Apple doubles the number of storage chips, with dedicated pcie lanes to each. Since it internally operates with something like RAID0, you also get double the speed.
The NVME controller on Apple Silicon is not PCI-based, so there are no pcie lanes going to the storage chips at all.
Do we know that for sure? There are I/O lanes being allocated whatever the exact technology being used is. I just don’t see why they’d reinvent the wheel here.
Yes, the Asahi Linux reverse engineers said in their commit message for the Linux apple-nvme driver:
“Add a driver for the NVMe storage controller integrated on Apple SoCs. This NVMe controller isn't PCI based and deviates from the NVMe standard in its implementation of the command submission queue and the integration of an NVMMU that needs to be managed. This commit tweaks the core NVMe code to support the linear command submission queue implemented by this controller. But setting up the submission queue and managing the NVMMU controller is handled by implementing the driver ops that were added in an earlier commit.“
I agree for RAM, but not for flash storage. The competition usually has as-good or even better flash throughput and IOs/second.
The reason people are aggrieved by Apple’s storage upgrade prices is that you can usually buy a high-end, entire NVMe device of a given capacity for less than Apple charges just for the upgrade to that capacity, and the NVMe will be as fast or faster than Apple’s offering.
Still much worse than e.g. 2 M.2 slots..
Not a very relevant point - the difference between 500 MB/s and 5 GB/s mass storage is rarely noticeable.
Depends on what you're doing. It's very noticeable for my work.
In my case, software development with C++. It's basically small files and a lot of disk cache.
I’m processing 100’s of GB of information (the whole historical bitcoin blockchain). Enough to not fit in RAM, but the computation I’m running is fast enough to not be CPU bound.
At home I have a desktop rig with multiple TB RAM and a fast server CPU. I would normally ssh into that to run tests with the chain mounted on a /dev/shm partition, which was a pain and only was accessible when I was at home. With my new MacBook Air, the upgraded internal drive is large enough to hold the full historical chain, and streams from disk fast enough to finish a run in comparable time. So now I’m mobile and can work from anywhere, with an entry point laptop replacing a dedicated server! That’s a big change for me.
I recognize not everyone’s tasks are bottlenecks the same way though.
Same here, but processing the entire history of Ethereum instead.
Absolutely. That's why in newer computers with M.2 storage I'm delighted to find the existence of SATA SSD storage in M.2 form factor. Now I don't have to pay NVMe prices any more.
Whereas with Apple, I believe the only choice is NVMe storage. What if I want more but slower SSD storage?
>in newer computers with M.2 storage I'm delighted to find the existence of SATA SSD storage in M.2 form factor Now I don't have to pay NVMe prices any more.
In which year do you live in? New computers haven't been shipping with M2 SATA storage since like ~2018, and SATA SSDs haven't been cheaper than NVME since at least ~2020. NVME has been cheaper than SATA for many years already.
You mean M.2 SATA storage is available cheaper than M.2 NVMe where you live? Over here there are very few options in M.2 SATA form factor and the prices are almost double that of M.2 NVMe, which is really not that suprising given the obviously very much higher volumes of NVMe parts.
Huh I just checked prices again and you are right. I must have remembered wrong. I stand corrected.
I think this is backwards? NVMe should be faster than SATA. I don’t think Apple uses either though. They directly connect the CPU to storage on the SoC. Another commenter above is saying they don’t even use PCIe.
NVMe is faster than SATA. But for my purposes SATA is fast enough; I just hope manufacturers would make SATA SSDs cheaper than NVMe SSDs. But alas that's not the case.
Aren't Mac the opposite for the developers with subpar support for say, containers and dev tools, and crappy out of the box window management compared to a laptop running on linux?
Seems to me you have to do a lot of manual tweaking and install before having something half decent as a dev.
Another very odd comment, to me.
> Aren't Mac the opposite for the developers
No?
> with subpar support for say, containers and dev tools,
It's a Unix machine. All that is right there and readily available.
> and crappy out of the box window management
Not really, no. Add one app and it's a tiling environment. Actually that is built-in in macOS 15 but I've got it turned off as I have a tiling app I've been using for 15+ years and I'm happy with it.
> compared to a laptop running on linux?
No. It's a better UI in every way, less hassle, more apps and better support.
I've been using Linux for 28 years now and for a while it improved beyond all recognition, but it's getting very clunky again with all the bloat now.
I switched to macOS on my desktop machines once I could afford it, and Linux for laptops. This is a happy compromise.
But I've also been writing about it for well over 25 years and that means reading other people's writing about it and where possible talking to them.
All the professionals evangelising Linux in the 20th century have moved to Macs now. It's the same core experience, but done better.
> Seems to me you have to do a lot of manual tweaking and install before having something half decent as a dev.
I can't speak for being a developer because I'm not one, but I can speak about Linux and macOS as a pro.
This is the wrong way to use a Mac.
The right way to use a Mac is not to fight it. Accept things as they are, learn to work with it, and add the extras you need.
You can't customise macOS very much and it's hard. So, don't. Be like bamboo, not a tree: bend with the wind, adapt to where you are, and then grow where you want to go.
The result is a proper full on UNIX™ environment which needs little to no maintenance and has integration to an extent no other Unix-like OS will ever achieve.
Your questions seem to me to be motivated by bias and conviction of faith, and it is misplaced, as any such fervent belief is.
> I've been using Linux for 28 years now and for a while it improved beyond all recognition, but it's getting very clunky again with all the bloat now.
Oh my this rings so true.
While some don't like systemd (which is fine, everyone's entitled to their own choices) I do like the more cohesive and consistent approach a lot.
But then my two uphill battles are:
- Xorg is still my go-to in spite of the limitations Wayland aims to solve (colour management, heterogeneous multihead), but I can't for the life of me make Wayland stable/reliable/usable.
- PulseAudio was a debacle, so I've used ALSA since forever and could do great things with it. Trouble is, some things modern software expects become hard to impossible with just ALSA. Enter PipeWire, which conceptually sounds like a great thing, but it is so obscure and underdocumented that I just can't wrap my head around it.
> No. It's a better UI in every way
That's very subjective. I prefer KDE Plasma.
Subjective indeed. I like Gnome Desktop's simplicity and straightforwardness.
See my comment above.
Although colleagues have earnestly described why they like GNOME, and demonstrated it, all I see is people who don't know how to use the existing, 35+ year old keyboard UI of Windows, or the simpler and only a few years younger one of NeXTstep/macOS.
I can't stand GNOME myself. It doesn't get out of my way. It wastes a tonne of precious vertical space on its panel. Its app-switcher is poor. Its window management is atrocious, but then, I've met with and interviewed the dev team, and they don't manage windows. They switch between full-screen sessions instead. I'm looking at twin 27" screens right now, and I want to see 5 or 6 apps at once. GNOME obstructs that massively.
But it's trivial to configure macOS to be as minimal as GNOME. Dock to autohide, cmd+space for the app launcher, trackpad gestures to hop between full-screen apps. It's not how I work or want to, but it's easily achieved.
I am honestly boggling here.
Yesterday I upgraded Fedora Asahi 40 to 41 on my M1 MBA, and KDE is so bad I was reduced to laughter at its pathetic clunkiness. But then I am a documented KDE-hater ever since the days of KDE 2.0.
And GNOME, too, but at least it has the mercy of being pretty. Horribly confining and with an appalling keyboard UI, but it's pretty.
Interesting, I loved KDE 1 when it came out... that was a couple of years ago, 1998 I think. I ran it on Slackware.
I see now that my comment was ambiguous.
I liked KDE 1.x a lot. I didn't love it, but it was a perfectly serviceable desktop for Linux and it was FOSS. All the other usable Linux desktops I'd seen before then were paid for, such as IXI X.Desktop.
When I say I've been a KDE hater since KDE 2 I meant that I liked KDE 1, but KDE 2 was a bit of a bloated mess. KDE 3 was much much worse and it's continued to turn into a parody of a bad implementation of the Windows 98 desktop -- the bad version, with IE4 embedded in the shell -- ever since.
Yeah, desktop environments are pretty bad on Linux. Window managers are where it's at, of course. I guess it is sort of unfortunate if people get the impression that Linux has a bad UI because somebody decided a desktop environment should be the out-of-the-box experience.
There have been good desktops. GNOME 2 was basic and clunky but usable. I actually liked and still use Unity, which is as good as it's got so far IMHO, but it's undergoing bitrot now.
Xfce is perfectly fine and I'm happy with it, but it could do with some streamlining and simplification in places. The workspace switcher is a bit silly and a good example: rows are set in one place, columns separately on a different screen. Junk the separate start menu and app finder, because the whisker menu does both. Have the dashboard on by default. Have the Docklike taskbar present by default, and either set it up as a better Win10 or Win11 clone, which it can now do better than the original, or lean in to the areas where it can do things others can't and set it up as a Mac/Unity-like environment or something different that MATE, Cinnamon, etc. can't do. And slap some pretty themes on it, with visible, grabbable window margins.
But the big names are all basically in death spirals now. Aside from Elementary OS, which is very, very pretty but about as flexible as an iPad (i.e. not very), the only people making real efforts at looking good and working well are in China. Deepin is gorgeous in its way; UKUI and Kylin are equally so.
> KDE is so bad I was reduced to laughter at its pathetic clunkiness
We can all have our particular taste. I don't think KDE Plasma is "bad". I personally prefer KDE Plasma.
Is there a typo on there or are you being very meta in some way I can't follow?
> The right way to use a Mac is not to fight it.
I've found that to be true on every OS I use. Customize as little as possible and things tend to work better and you will have better luck finding answers when something does break.
>All the professionals evangelising Linux in the 20th century have moved to Macs now.
>Your questions seem to me to be motivated by bias and conviction of faith, and it is misplaced, as any such fervent belief is.
Weird to accuse someone of conviction of faith while confidently claiming that all linux users switched to Mac and how Mac is the be-all end-all of computers. You're in a bubble if you think so, I can definitely tell you that.
> confidently claiming that all linux users switched to Mac
I did not say that. I did not say anything resembling that. It's an absurd claim.
What I said was:
«All the professionals evangelising Linux in the 20th century have moved to Macs now»
Which followed, and was in the context of, the sentence:
«reading other people's writing about it and where possible talking to them.»
In other words: the professional Linux advocates -- that means the people who were advocating and recommending Linux to non-Linux users -- that I read and knew and sometimes have talked to -- switched.
Not people in the Linux biz talking to other people in the biz.
People like author Charlie Stross, who is occasionally cstross on here, who for years wrote the Linux column in the UK edition of Computer Shopper and was as such perhaps the most visible UK tech journalist writing about and recommending Linux.
Or Neal Stephenson, author of the seminal "In the Beginning was the Command Line", which if you have not read recently you need to.
Here's a free copy.
https://web.stanford.edu/class/cs81n/command.txt
Mac users now.
Context is important and must be considered. You apparently did not.
> and how Mac is the be-all end-all of computers.
I didn't say that either.
It's got a damned good case to be the most sophisticated general-purpose desktop/laptop OS there's ever been, though, and it's held that place pretty much the entire century so far.
Tastes differ. Not everyone likes it. That's fine. I am not saying everyone should.
But I'm saying that if you read the widest possible range of OS and UI discussion and debate, there is a fairly clear consensus that what was Mac OS X and is now macOS is, while flawed, about the best there is.
>«All the professionals evangelising Linux in the 20th century have moved to Macs now»
> In other words: the professional Linux advocates -- that means the people who were advocating and recommending Linux to non-Linux users -- that I read and knew and sometimes have talked to -- switched.
I'm sorry, but as a professional journalist surely you must know the contradiction you've introduced here with the difference between "all Linux evangelists moved to Macs now" and "all those I know and read of switched" because those two statements are not the same thing.
One statement deals in absolutes ("all Linux evangelists switched to Mac") and can be supported by sources if true; the other is an opinion based on your bubble ("all that I know switched to Mac"), which is just your opinion, different from the situation in my bubble, and holds just as much weight.
You're arguing with one of the better writers for The Register, there.
I'm not much of a Linux person, but I have been using Macs since 1986 (as a developer), so I can attest to most of Mr. Proven's statements with regard to macOS.
>You're arguing with one of the better writers for The Register, there.
You're saying that like it should mean something. It's still the subjective opinion of a person. It holds no more or less value than the subjective opinion of another person. Being a journalist doesn't automatically make you the supreme authority on something, you're still just a professional opinionator (no offence), but that opinion can be different than other users.
>I have been using Macs since 1986 (as a developer)
That's an issue IMHO. Long-term macOS nerds are the ones who got used to all the quirks and can't see anything at fault, as they molded themselves to the platform with age, developing muscle-memory workarounds without realizing it, so to them the status quo is perfection.
Meanwhile, new users to the platform will see things differently.
> Long-term macOS nerds are the ones who got used to all the quirks and can't see anything at fault as they molded themselves to the platform with age, so to them everything is perfect. Meanwhile, new users to the platform will see things differently.
An easier way to phrase that is "people have confirmation bias." You clearly exhibit this in your post. For new users, it depends on whether they've used other desktop environments. I'm confident that someone who has never used a desktop computer before would be more productive on a Mac. Had they used Windows, they may be confused.
> It's still the subjective opinion of a person.
It depends on who "the person" is. In this case, it's a seasoned professional, who uses both operating systems regularly, at a fairly advanced level, and also explains this stuff to others, while being held to journalistic standards.
Also, The Register tends to hire pretty sharp folks.
> Meanwhile, new users to the platform will see things differently
That's always the case. Unless you are an invested user of a platform, it's likely to be uncomfortable. When folks ask me if they should get an Apple device, as opposed to an Android/Windows PC, they are often surprised, when I say they should probably get what they are already used to.
Truth be told, there's plenty of good in all UI (including CLI), and people get very efficient, using their UI of choice. I find that it's usually best, if they stay on it.
Having been an Apple developer for decades, I have been absolutely slathered in bile from Apple-haters. It seems to be pathological. I assume that's because of the "snottiness" of Apple's approach. It's actually deliberate, and part of their branding. It can get annoying, but I know why they do it. Personally, I don't feel that way, despite being invested in the Apple ecosystem, and I don't hate other approaches, either. I managed a multi-platform development team for a couple of decades. It was not conducive to effectiveness, for me (or any of my employees) to be jingoistic about platform choices.
>it's a seasoned professional
Professional in what? I'm also a professional. Is my opinion not just as valid? Is MKBHD also a professional in this sense?
>who uses both operating systems regularly
I think many people on the planet, including children, can use two or more operating systems regularly and provide opinions on them, it's not a rare skill or something that requires academic degrees. Is their opinion not just as valid?
>while being held to journalistic standards
A lot of events proved that "journalistic standards" mean very little, especially in the modern era of online publications being dependent on click ad-revenue. For example look at the disconnect between critics ratings of movies and audience ratings, or between car reviewers and car owners. Similarly, Microsoft and Apple make OSs for users, not for professional critics or journalists.
It's still just someone's subjective opinion on an OS, not something numerically and logically quantifiable as being the right opinion. It's not like it's a debate with Linus Torvalds on the correct implementation of mutexes.
> I have been absolutely slathered in bile from Apple-haters.
What does this have to do with me? What's with this victimization attitude in people lately? Should I feel guilty or sorry because some other random people said something mean to you in connection with this topic? It's a conversation between you and me; I don't care about what others did.
I could not agree with you more.
I am replying to you from my third mac. I got it less than a year ago and it is the first Mac I have used since 2010 or so. Sure I am getting used to it but it does surprise me how different some things are from my typical XFCE/Win10 environments. I know unintuitive is the wrong word but at least for my own intuition, it is unintuitive.
Thank you very much! :-)
It is very much a thing of modern times to be lectured on, for instance, desktop design, when I am fairly confident I've used more different desktop environments than the person accusing me is even aware exists.
(I would estimate I've used 35-40 different desktops across a dozen or more GUI OSes. The first I owned myself was an Acorn Archimedes with RISC OS 2, an environment far weirder than any hardcore Linux advocate could even imagine: a default editor with two separate, independently navigable cursors (source and destination), three mouse buttons all heavily used, and no permanently on-screen menus of any kind, only context menus.)
Ah well. So it goes.
>It is very much a thing of modern times to be lectured on, for instance, desktop design
Where did I lecture you on that?
> when I am fairly confident I've used more different desktop environments
Does using more desktop environments make one's opinion on a specific desktop design more valuable than everyone else's? It's not like you're designing them; you're just using them, just like me and millions of other people.
>than the person accusing me is even aware exists.
Care to point out what exactly did I accuse you of?
I didn't. Turn your paranoia down. I never mentioned you once and none of this is specific or particular to you.
But, to answer one point: yes, I do think that broad experience of lots of different desktop GUIs does qualify someone for comparing them, and for identifying particular strengths or weaknesses of particular ones.
>I never mentioned you once and none of this is specific or particular to you.
Who were you referring to in this statement?
>than the person accusing me is even aware exists
> containers and macOS
Docker assumes there is a Linux kernel underneath, not a Mac 'Unix' kernel... so you end up having to run, just like on Windows, a VM with a Linux kernel in order to run a Docker container.
Yes, I am fully aware of that.
But I am told -- I do not work with this stuff myself -- that if you simply install Docker Desktop, or something equivalent, it just happens, invisibly and out of sight, zero intervention and zero maintenance.
Which is the general Mac story, even now.
Yes, and for reasons I was running an x86 SQL Server Docker image on my ARM Mac, and that just works.
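From memory it was just something like this (the image tag and password here are illustrative placeholders, not my exact setup); Docker Desktop spins up its Linux VM and handles the x86 emulation behind the scenes:

    # --platform forces the amd64 image; on Apple silicon, Docker Desktop
    # runs it inside its Linux VM, with Rosetta/QEMU doing the translation
    docker run --platform linux/amd64 \
      -e "ACCEPT_EULA=Y" \
      -e "MSSQL_SA_PASSWORD=Example!Passw0rd" \
      -p 1433:1433 \
      mcr.microsoft.com/mssql/server:2022-latest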
Although if you can find ARM images, make the effort. I stay away from anything x86 via Rosetta as I don't want the slowdown.
It didn't need to be performant. I was in between jobs for 3 weeks and I was reviewing C#/EF Core. I hadn't programmed in C# in over four years.
Luckily quite a lot of .NET teams are using Postgres or MySQL nowadays but yeah.
It’s a different environment compared to what it was 5 years ago.
Wow! :-)
Linux to me still feels like it used to be in the 90s. It's certainly improved, and package management is better, but the UI's are inconsistent and of relatively low quality. The advantage would be that you have more of a one-to-one match with what happens on the server side (which is usually Linux for most people).
Some parts might require some tweaks, but usually it's a one-off and then you're good to go. Containers - I haven't had many issues, though you might run into some non-ARM images for Docker, but that's fairly easily solved.
As for window management - what do you mean? The window management to me is good, but then I never understood tiling window managers and such, if that is your requirement.
I've recently spent a considerable amount of time on Windows 11 (after using Linux/X11/Wayland/KDE for a long time) -- the UI inconsistencies are widespread there too. Microsoft is only now finishing the push to make all the control panels look consistent, and they are doing so by removing some of the more detailed options.
Ironically, Windows is not user-friendly at all these days. It was supposed to be. How could this happen?
You should use CDE for a week to truly appreciate 90s UNIX.
https://en.m.wikipedia.org/wiki/Common_Desktop_Environment
I was there and the wiki article captures the look, but not the feel. Giant funky mouse pointers that flip which direction they point as you hover over elements, but without apparent logic... Window-maximizing behavior that left you unable to get back to the desktop... Minimized windows disappearing, never to be seen again...
Yeah. Linux got left in the dust in about 2005 and no one has worked it out yet.
The principal difference is that the sheer quality of the client desktop experience hasn't improved since then. The Linux desktop apps are pretty terrible, unreliable and clunky, and most of the progress so far has been rewriting them again and again in slightly different desktops to no avail (GNOME over the years, for example). Yet things like fractional scaling still barely work.
While everyone was pissing around with that and fanfaring open source, Apple refined a whole suite of apps that ship with their Macs, phones and iPads and just work and sync properly.
And that's what is important to a lot of people, not whether the icons are in the title bar on GNOME, or any notion of purity, etc. Usability is number one. And Linux is not.
>As for window management - what do you mean? The window management to me is good
What's good about it? The fact it doesn't exist?
Err, we have virtual desktops, tiling, snapping and things like that out of the box these days, you know. I mean, the virtual desktops thing is mostly what I use, and it's a triple swipe on my Magic Trackpad to switch.
Since macOS Sequoia, apparently. So macOS users have had window management out of the box for all of three months.
Better late than never I guess, but they sure took their sweet time to implement features standard on Windows for 15+ years and 20+ years on Linux.
And macOS had Spaces 6 years before Windows had anything similar, and Exposé 3 years before they came out with a crappy not-as-good equivalent, and the current Task View still sucks by comparison.
But touch input drivers on both platforms still suck, so I don't really care what their Window management is like when I can't interact with them without a hand cramp.
I'm not sure why the need to move the goalposts but I'll bite.
>And macOS had Spaces 6 years before Windows had anything similar, and Exposé 3 years before they came out with a crappy not-as-good equivalent, and the current Task View still sucks by comparison
So what? On Windows and Linux I never needed that feature because they always had proper window management, nor do I use that feature now. You're comparing Apples to Oranges. A Dodge Ram has a tow hitch; a Ferrari doesn't. Is one better than the other, or are they better in different scenarios?
>But touch input drivers on both platforms still suck, so I don't really care what their Window management is like when I can't interact with them without a hand cramp.
All touchpads give me cramps and carpal tunnel, which is why I use an angled mouse. Again, a moot and off-topic point. What's the point of a better touchpad if it's never gonna beat an ergonomic mouse?
My point here is that your above idea of "proper window management" (or "window management [full stop]") is your own personal opinion.
There are differing schools of thought in how computers should be interacted with, and your opinion is one of the many opinions that exist.
Half-screen tiling (Window > Tile Window to Left/Right of Screen, or click and hold the green button), snapping (same but hold Opt), and virtual desktops ("Spaces", later "Mission Control") have been available for a long time. The former maybe aren't used that much because people don't explore the menus.
Before, we just used Rectangle. It's no biggy.
Some of us don't build containerised web applications you know.
It's basically a Unix machine. A very fast and very cheap one.
The window management thing is really overblown in my opinion. On macOS I just keep two monitors with separate virtual desktops enabled on both with apps assigned to specific desktops which reduces management to almost nothing, which is even easier and lower effort than a Win9x-paradigm desktop or tiling setup (which I’ve found requires a surprising amount of micromanagement to keep usable).
Being a developer isn't a synonym for UNIX.
I've worked with Unix and Windows in the past decades and each and every time the only scenario Windows wins is when I'm developing applications for Windows.
Since I develop mostly for server-side, a Unix-like OS is a no-brainer. I have all three OSs on my desk and the least satisfying to use is Windows - it's relatively slow and difficult to troubleshoot device driver issues. On Linux you can always look under the hood and on Macs there is no such thing as device issues.
Yet it is quite possible, although perhaps surprising in modern times, to be a developer without dealing with UNIX or Windows.
Developer job !== UNIX.
There are plenty of places in embedded where the toolchains exist only for Windows.
Yeah, then again Developer job !== Windows, in case you haven't yet got the point.
Being a developer has nothing to do with a specific OS in particular.
Yeah. I like to run docker without needing a Linux vm, I like the choice of desktop environments and they are superb for me compared to the macos desktop. KDE Plasma 6 is one of the best desktops I've ever used.
Now with the atomic distros, such as Aurora, you have a rock solid base you never touch, updates are atomic so you can always reboot to the previous version if needed and you create lightweight containers for development.
My current setup is Aurora as the base distro, all GUI applications from Flathub and the terminal automatically opens up in distrobox which runs Arch Linux with Nix. Super solid, super fast and everything just works.
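For anyone curious, the moving parts look roughly like this (the container name and images are illustrative, not my exact commands):

    # GUI apps come from Flathub rather than the immutable base image
    flatpak install flathub org.mozilla.firefox

    # day-to-day dev work lives in a disposable Arch container
    distrobox create --name dev --image archlinux:latest
    distrobox enter dev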
Apple makes consumer electronics.
From a professional perspective they are toys.
That's an elitist attitude that has very little basis in reality. Would you care to justify it?
Plenty of professionals, including developers, use Apple machines for their work, as tools not toys.
People have been saying that for 40 years now. Give it a rest already.
You do realize that many, many movies have been edited in various versions of Final Cut Pro on Macs right? Including several Academy Award winners.
Movies like _Parasite_, _The Social Network_, and _300_, to name just a few.
If that's a toy, I'd love to hear what is industrial strength.
So an actual certified commercial Unix workstation is a toy now?
Papier ist geduldig. (Literally: "paper is patient"; roughly: paper doesn't blush.)
It’s also a better strategy for the company because you’ll easily pick the right “level” and then it’s much easier to upsell you on a part or two.
If instead they give you ten thousand combinations you’re much more likely to just grab “the cheapest”.
> you’re much more likely to just grab “the cheapest”.
And then realize there were better deals at the time and tarnish the brand.
Absolutely, simplifying the lineup to four Macs was the best decision. Right now they have one more than they should: the Mac Pro and Mac Studio seem to clash a lot, especially since you can't use the PCIe slots of the Pro for GPUs. What do people put in those slots? I'd assume storage and fast networking.
I’d also imagine there’s firms using SDI video capture cards for on set/production purposes. Outside of local storage I also used to see Fibre channel HBAs and the like somewhat commonly on the older cheese grater (unsure how common that use case is now).
The current Mac Pro to my recollection is also readily available in a rack mount format, in and of itself that’s a solid reason for keeping it alive
The Mac Pro is expensive enough that it takes it solidly out of the "consumer" arena and puts it into commercial/business customers. Those customers will take the time to investigate and determine what they need for the job.
You can see this by comparing the marketing around the F150 (a consumer pickup that is used by commercial/business customers) and the F650 - https://www.ford.com/commercial-trucks/f650-f750/
There have been times where the Mac Pro dipped into the high end consumer market explicitly, but we're not in one of those times now.
(Do note that some consumers WILL buy "commercial" products and Apple's obviously aware of that, but I suspect it's hard to get them to recommend the Mac Pro to home users.)
100% agree. I call it the "toothpaste paralysis." It’s like when you’re shopping for toothpaste, and brands like Colgate have so many overlapping options that it becomes impossible to figure out which one is actually the best. Unfortunately, I think the same thing is starting to happen with MacBooks again. It’s not as bad as before, but it’s definitely not as straightforward as it used to be.
The MacBook choices today seem relatively clear?
- Pro if you need maximum CPU/GPU power.
- Air if you want something lightweight and don't need the "Pro" level CPU/GPU power
If you're only doing email/web, you should probably go Air. (There are no bad choices with the Apple silicon Macs for general use, it's mainly a question of how slow you want your video rendering chores and Xcode builds to be.)
Then multiply by screen sizes, which determines the overall size of the machine.
Edit: formatting. HackerNews support markdown challenge.
Edit 2: fuck, forgot non-Pro. Maybe you're right.
The MacBook Pro also has a choice of regular, Pro, Max, and Ultra chips.
The MacBook Pro has the choice of Pro and Max.
The Mini has the choice of base or Pro.
The Mac Pro/Studio has the choice of Max and Ultra.
IIRC.
iPad has reached that state. The regular iPad and the Pro make some sense but the Air is in a very awkward middle.
The Air exists so the Pro can have expensive Pro features and the (null) iPad can hit an impulse purchase price point.
While those two are pulling in opposite directions, having nothing in the middle would leave a big market gap.
The Air as an idea makes sense: with the Pro you pay extra for more power, while with the Air I thought the idea was that you paid more for the same functionality in a thinner/lighter form factor.
Is it really just between the Pro and nothing at this point? Because that’s dumb if so.
>When my ex-wife was looking to upgrade her Windows laptop a few years back, she ended up in analysis paralysis because the options just from HP were so complicated that she couldn’t figure out what she should buy.
It's like cars where they bundle one feature you want with a bunch of stuff you don't care about to force you into overspending.
> The era of 4-digit Mac names was such a mess, trying to figure out which one was the best option available at your price point
Yeah. I really liked my 9600 but the Performa lines were way too confusing.
> The era of 4-digit Mac names was such a mess...
Same goes for a lot of other products. For example CPUs, GPUs, TVs and fridges.
Sometimes appliance names are nearly impenetrable.
How dare you attempt to sully the honor of my beloved Power Mac 9500, or take an implicit shot at my Performa 638CD — which was not even 4 digits, but 3 digits and 2 letters. You need to check your wiring, friend, or dial up your SoftRAM.
> she couldn’t figure out what she should buy.
I am not sure what "should" means in that context.
Surely many models would have been suitable. It is more a self induced SKU nightmare/issue for the manufacturer.
> I am not sure what "should" means in that context.
Really?
The phrase means "what was the best choice", which means "she could not figure out which model offered the best balance of price, performance and features."
I can't offhand think of a more efficient way to phrase it, TBH.
> One of the best things Steve Jobs did on his return was to trim the number of Mac models to a minimum.
So much for choice.
I used to have a 4400/200 with the PC card and honestly loved the thing. It was my first Power Mac, and I could press cmd+enter to switch into Windows 95; it felt so cool at the time.
TIL that's also the fastest PC card Apple shipped and is Gestalt-locked to the 4400/7220: http://www.oliver-schubert.com/DOScard/DOScard.html#wishiwer...
I had a 7600 with the PC Card. Favorite computer I ever owned.
I managed a lab full of these. So painful to work on because of all those sharp edges. We upgraded the RAM by hand though, which did help. And went from OS 7.6 to 8.6 eventually... which made things a bit more stable. Such weird machines.
The only weirder machine I remember was the Mac TV; I knew someone whose school was equipped with those.
It's almost cheating given that it was a limited-edition model, but the TAM [1] was even stranger.
[1]: https://en.wikipedia.org/wiki/Twentieth_Anniversary_Macintos...
Oh yes, I forgot all about those. I've never seen one in person but it's a very odd one.
Of course, 65scribe has a great (if you appreciate his passion and shtick) video on the 4400:
https://youtu.be/40VtkZdOAGo
I had that model. Modernized it later by adding more memory, more video memory and eventually a Sonnet G3 upgrade card that made it very fast. With that card it did run Mac OS X, 10.3 as far as I remember, and was fairly usable.
What it did not have though was true color; the video card simply did not produce it, even with maxed out video memory. As far as I understand the cause was that the memory was too slow for that.
The 6400, on the other hand, was up there with the Color Classic, Twentieth Anniversary Macintosh, and PowerBook 500 as objects of 90s pre-Jobs desire.
The 4400 also uses 3.3V EDO DIMMs like some of the clones. Most of the other Apple-branded Power Macs of its era used 5V FPM DIMMs.
They quite possibly use the same LPX-40 logic board design.
Going by the developer note Apple created for it, the LPX-40 is somewhat interesting, but the Power Mac 4400 is basically the most "boring", normal, Mac-like configuration of it. PS/2 and VGA connectors, PC-style MFM-only manual-eject floppy drives, support for "hard power" configurations and an AT-like PSU connection - they were really going for "shove this into a PC case, and you've got a Mac". Also, Apple could've fitted a PPC 604...
Cheap in more ways than one.
Fast forward to 2020 and Apple introduces the M1 MacBook Air. (Though people still complained about the 8GB memory configuration.)
Apple seems to have learned their lesson with entry-level machines; the basic iPad and Mac mini are quality designs (though storage/memory upselling is still a thing - the cheapest iPad is probably aimed at classrooms/kiosks/streaming.)
I don't think it was that bad. I think it was different, and it was confusing to compare the Tanzania systems to anything else from Apple, but it wasn't a bad system by itself.
I found a Motorola StarMax desktop (not tower) in the trash in Manhattan in the early 2000s. It chimed but didn't show video, so I installed a disk, installed NetBSD and used it as a server for many, many years. It was very decently performant and incredibly stable.
These days I think it'd need a recap and the 160 megabyte limit would make it less useful than it was, but I still have only good things to say about it.
TFA says it was followed by the Macintosh G3 desktop... but the G3 was just a beige PC too. Slightly heavier than a regular tower PC, but still very beige.
Not Apple's greatest era. They weren't the old Mac cool anymore and they weren't yet iPod/iPhone/iPad cool.
Some G4s were actually good looking and had a great monitor too. But to me the G3 that followed that 4400 was just as much "bad Apple".
I have fond memories of the OS and still own it though.
The article doesn't mention them, but the keyboard and mouse felt super cheap, too. Light and flimsy and unpleasant.
But they were the same Apple Design Keyboard and ADB Mouse II that shipped with all of the other mid-90s Macs, though, right?
It was a noticeable step down from the previous keyboards. Certainly not Apple's best.
Incredible as this might sound, I think the best keyboard Apple made was the Butterfly. It was fragile and unreliable, but it felt great and sounded crisp and precise.
It could have been worse. Apple used to love selling Macs that were crippled by horrible buses.
My first Mac was an LC II. It had a 32-bit 68030 processor at 16 MHz with a 16-bit bus.
I won't even get started on the 12-inch 512x384 monitor that few games were compatible with.
I had no idea Apple ever did this. And the idea of a floppy drive that doesn’t have auto-inject is just sacrilege.
Even after leaving the Mac in the late 90s and building my own PCs, getting to mess with a Mac was always a nice experience because they were so nicely built physically.
> And the idea of a floppy drive that doesn't have auto-inject is just sacrilege.
Auto inject was gone from Macs well before this model so it wasn't directly connected to the cheapness of this thing.
Oh. That’s too bad.
The first 3.5" drive I ever had was in an LC II. Before that I had only used a 5.25" drive in a PC XT or something like that. Being able to have it suck a disk in, or eject a disk and have it pop out with that great mechanical noise, was fantastic.
Because of my age I thought all drives were like that. The first time I used a Windows PC (3.0?) I was surprised that you had to push the disk in by hand and that it didn't just show up on the desktop in Windows. I had to be introduced to the concept of drive letters. It seemed relatively barbaric to young me.
Of course within about two years I was asking for my own PC for all the great games. So that didn’t last all that long.
Hah, yes, my childhood experience with these was similar. There was your typical 8-bitter 5.25" floppy with its floppiness and rattly drives and make-it-double-sided-with-a-hole-punch DIY-ness. And then there was the 3.5" hard plastic square, straight out of Star Wars. A robot would eat it and regurgitate it for you on command.
Funnily enough, Microsoft actually planned to add floppy-drive auto-mounting to Windows 95. But half the drives implemented the "media present" signal backwards from the other half, and Microsoft couldn't figure out a user-friendly way to make it auto-detect, so they canned it.
That is really interesting, in that it is the opposite of my childhood understanding. I started with CP/M and DOS, and the first time I came to a Linux machine, I just couldn't understand how someone could work with drives without the letters (dedicated namespaces, right). My thought was that it was a less polished design.
It vanished with the 800K drives, in the Motorola era. My Color Classic doesn't inject the disk.
I think a Colour Classic still has an auto-inject drive; my LC II had one. You can tell the manual-inject ones because the case has a curved indent around the drive. Although this is the changeover era, as some late LC IIs apparently have the different drive (and lose the Snow White stripe along the front at the same time).
Maybe it was an option. I'm not really sure.
There is a small hole near the floppy drive and there also was a pin to eject a disk when the computer was off, similar to how SIM cards are handled in modern phones. Good design, actually; harder to damage data.
There seems to be a lot of hate for the left side disk drive. Are right handed people so incapable that they can't handle a bit of ambidexterity? /s
I just went and tried inserting a floppy disk with either hand and it was exceptionally easy.
Wouldn't a left side disk drive and the standard right side mouse placement be a superior workflow?
Was the dislike just because of the change?
IIRC no one in the real world cared about the left-sided mounting of the drive.
I've never heard that complaint mentioned before, so that article is the first I've heard of it.
My anecdata is working at Apple and Apple Dealers in the mid 90s to 2001.
But then not many of them got sold in my sphere, IIRC. We were selling 8600s and then G3s into ad agencies etc. at that point.
I suspect it's more just that it doesn't "fit" the way all the other machines were; it stands out, and not in an impressive way. It just sort of increases the otherness.
I agree the stuff about being harder for right hand is probably just made up after the fact as color commentary.
I think there’s just a bit of snobbery. It’s an off the shelf LPX chassis with a “Logic Board LPX-40” that Apple also supplied to clone makers. The floppy drive being on the “wrong” side is just proof that it somehow lacks that special something.
> Was the dislike just because of the change?
It's just because it looked like a WIntel PC and thus was a threat to the collective illusion that 1996-Apple offered anything substantially different or better than Windows '95 (source: was a 1996 Macintosh user who used the term “WIntel”)
Compare:
- Compaq DeskPro https://serialport.org/pcs/compaq/compaq-deskpro-en-c300a/#p...
- Packard Bell Legend https://old.reddit.com/r/retrobattlestations/comments/hjomez...
- HP Pavilion https://www.hp.com/hpinfo/abouthp/histnfacts/museum/personal...
- Gateway 2000 https://upload.wikimedia.org/wikipedia/commons/7/7a/Gateway_...
> the collective illusion that 1996-Apple offered anything substantially different or better than Windows '95
PowerPC
I love PowerPC but I still don't believe CPU architecture alone is a motivating factor for why any person would choose to use a particular computer over any other. If that were true then Stebe Jovs never would have told me about The Megahertz Myth, now welcome Phil S[c]hiller to the stage to run the PentiumⅡ machine for this specially-scripted Photoshop benchmark, et cetera.
Downvote all you want but I was there and literally heard people complain about it for this reason lol
Reads like it was just a bad Mac all around, and the left-hand floppy drive was a visible symbol of that on the face of the machine: one more departure from the Macintosh norm in a machine that was full of them.
Also knowing Stephen Hackett, I don't think he's capable of hate for older Macs. He seems to love even the oddest of ducks and has a lab full of them.
The whole thing seems like a bunch of whining about things that don't matter and that were good ways to reduce cost with minimal impact.
The auto voltage switching - how often do you take your PC to another country with a different voltage?
The lower-quality case finish - how many Mac users ever dared open the case?