Just in case it is relevant for anyone here, this is what our security team have established thus far:

- Can be mitigated by enabling the root user with a strong password

- Can be detected with `osquery` using `SELECT * FROM plist WHERE path = '/private/var/db/dslocal/nodes/Default/users/root.plist' AND key = 'passwd' AND length(value) > 1;`

- You can see what time the root account was enabled using `SELECT * FROM plist WHERE path = '/private/var/db/dslocal/nodes/Default/users/root.plist' AND key = 'accountPolicyData';`, then base64-decoding the value into a file, running `plutil -convert xml1` on that file, and looking at the `passwordLastSetTime` field.

Note: osquery needs to be run with `sudo`, but if you have it deployed across a fleet of Macs as a daemon then it will be running with root privileges anyway.
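For the `accountPolicyData` step above, the base64-decode and `plutil` conversion can also be done in a few lines of Python, since `plistlib` reads binary plists directly. This is only a sketch: the synthetic blob below stands in for a real value pulled via osquery.

```python
import base64
import plistlib

def password_last_set(account_policy_b64):
    """Decode a base64-encoded accountPolicyData blob and return the
    passwordLastSetTime epoch timestamp stored inside it."""
    raw = base64.b64decode(account_policy_b64)
    policy = plistlib.loads(raw)  # handles both binary and XML plists
    return policy["passwordLastSetTime"]

# Synthetic stand-in for a real blob, just to make the sketch runnable:
fake_blob = base64.b64encode(
    plistlib.dumps({"passwordLastSetTime": 1474441704.265237})
).decode()

print(password_last_set(fake_blob))  # 1474441704.265237
```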

At the risk of sounding a bit pedantic, you can't really assume that: it's possible that somebody used this vulnerability, installed some sort of backdoor, and then disabled the account to hide their tracks.

When you do this, you'll get the `creationTime` and `passwordLastSetTime` as seconds since the epoch (January 1, 1970, 00:00:00 UTC). These are numbers like 1474441704.265237, which aren't very easy for a human to read :-)

To convert this into a human-readable date and time, open a terminal and do this:
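On macOS, `date -r 1474441704` should do the conversion. The same thing in Python, using the timestamp from the example above:

```python
from datetime import datetime, timezone

ts = 1474441704.265237  # passwordLastSetTime, seconds since the Unix epoch

# Truncate to whole seconds and render as a timezone-aware UTC datetime
print(datetime.fromtimestamp(int(ts), tz=timezone.utc))
# 2016-09-21 07:08:24+00:00
```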

One of my Macs is showing a root password change date of Nov 10th 2017. I can't explain that, so I'm reinstalling now. It did have sshd enabled and remotely accessible, though I thought root login was prohibited.

If I understood correctly, this particular bug was only exploitable from the GUI and this machine hasn't been away from home, so it's likely this isn't related, but posting here, in case it's part of a bigger picture.

Oh wow. Is there any other explanation for this other than this having been exploited in the wild for almost three weeks? Or maybe someone just tried to log in over SSH to exploit some other weakness (something like predictable SSH passwords on jailbroken iOS devices), and happened to create the root user on your machine?

Did you also have sshd running, and do you know what kind of network you were using at the time?

System Integrity Protection (SIP)[1] does prevent even the root user from modifying some system files[2]. It seems possible, at least in principle, to protect system logs from modification by user root. In practice, I think most system logs are stored in /var, and that part of the directory tree does not appear to be protected by SIP (but I hope I'm wrong!)

[2] Unless/until you reboot to a diagnostic monitor on a special partition (which requires pressing command-R from a local keyboard during the POST), then run a command to disable SIP, and then reboot again. Continuity Activation Tool requires users to perform this step as part of the install process to allow installation of Bluetooth drivers not originally signed by Apple.

You can't load unsigned kexts anymore, due to that same SIP. It's a pain in the gonads when hacking your own kexts. I had forgotten about this, but it does indeed allow for a system that leaves an audit trail which cannot be hidden, even by root.

However, user labcomputer is right, I doubt that applies to the solutions proposed by OP here. Well, I'm certain: root can switch out the shell or terminal emulator binary itself and have it lie about executing those commands and return something trustworthy. One way or another, to truly check this, you'd need an immutable audit log (probably not currently available), AND a reboot into safe mode or a mount as a HDD onto a safe system.

I see a lot of comments here wondering why Apple seems to not care about software quality anymore. I don’t know if that’s true, but there’s a perfectly obvious answer: They don’t have to.

Software quality in macOS was important back when they were trying to get people to switch from Windows-based PCs to Macs. Nowadays, most people who were going to switch have already switched, so Apple has no incentive to keep up the same level of software quality anymore. They just have to keep people locked into their ecosystem (with iPhone etc.) enough that the barrier to switch out again is high enough.

There is no reason for Apple to improve macOS, since doing so won’t make anyone switch to Macs who hasn’t already switched, and not improving macOS won’t make anyone upset enough to switch back. Ergo, Apple leaves macOS to stagnate, and they will keep macOS at this bad-but-not-horrible-enough-to-switch level for the foreseeable future.

The core applications that I use (Firefox, Docker, VSCode, vim, ...) all work just as well on Linux, MacOS and Windows.

I have a Mac, because it's (at least previously) been pretty secure by default, doesn't require me to invest a lot of time sysadmining my own box, and lets me dip into a healthy ecosystem of commercial software useful to my hobbies (like photography.)

The software has definitely declined in quality, but not enough to massively annoy me.

If there is lock-in, it's on the hardware side. I've got an early 2013 MBP, still going strong, a bit dented but it's been around the world with me a few times, so that's understandable.

My workplace uses Dell XPS hardware, and it's good, but it still doesn't feel as solid to me.

I think the hardware is the laurel Apple has really been resting on.

I could meet my main use cases on Linux quite happily, and dual-boot Windows for the rest. Right now the premium on Mac hardware, which only happily runs an increasingly decrepit operating system, isn't looking worth it. Previously, it was.

Most people don't realize it, but the vast majority of video editing was Windows-based until about 2010, when Final Cut was considered best in class (I can't stand Final Cut myself, but to each their own...). The vast majority of video editing is now done in Premiere, due to Apple's handling of Final Cut Pro and the lack of support for the Mac Pro (they usually sit in back rooms as expensive file servers). Also, most people mentally think that Apple is somehow better for design, but the software runs just as well on Windows.

The iPhone and the money spent on software are what keep people these days. But whenever I talk with my friends, they are certainly no longer the thrilled Mac zealots they once were. The vast majority of my video-editing friends are getting really frustrated with what they call the ceiling. Do you really want to be editing full time on a laptop? The Mac Pro isn't a real solution for full-time editors.

It's also display quality. If you're doing design work you can use a MacBook Pro and be pretty sure that the color is accurate with no calibration. If you switch platforms you have to sort out the enterprise and gaming displays, which have totally different selling points (price and responsiveness, respectively). Getting a good display and accurate color on a Windows machine requires a lot more knowledge and effort. This is definitely less true since Apple abandoned their display line (one more bit of evidence that Apple doesn't care about the professionals that established their brand anymore).

Those Apple mirrors (err, thunderbolt displays) were definitely not close to color accurate. The macbooks are okay I guess?

When I worked at a major printing company, they were not using Macs because people would THINK they were color accurate when they were very much not, and we had a bunch of calibrated Dell monitors around specifically for that purpose.

So definitely more of an urban legend than anything. Apple displays are reasonable, they're decent IPS panels, but they're middle of the road if anything.

I own my own calibrator, and I calibrate every monitor I use for video. I have never seen an accurate monitor in the wild. The funny thing is that I can take a horribly cheap monitor, calibrate it in a dark room, and it will be better than anything not calibrated.

People need to buy calibrators. I use the open-source ColorHug; it runs on Linux, so I actually use a live CD to do the calibration. http://www.hughski.com/

My partner does photography and has a Datacolor Spyder 4, which I of course borrow to calibrate all my monitors. At work I have a 30" IPS and, next to it vertically, an old 24" TN-film. After calibration they are very close color-wise, and both are very enjoyable for reading code. The TN panel has worse viewing angles and covers about 80% of sRGB, but after calibration it is absolutely much nicer, even for development.

I calibrate my monitors with DisplayCAL[0] on Linux.

There should be one calibrator in every office, the difference it makes is enormous.

I'm not sure I agree with this. Or at least it goes too far to say that it's a mental lock in.

Yeah Apple is making some very bad mistakes in their software quality, but there are two things that are very essential to the Mac experience that still make it the most straightforward choice.

One key advantage Macs have over Windows is that they run Unix. You can open a terminal and be involved with most of the Linux/Unix monoculture that exists and have access to much the same tools. No VMs and all the hassle they bring to take into account, mostly at least.

One key advantage Macs have over Linuxes is the availability of good quality graphical software. If you like a GUI for Git, the best are available on Mac. It has OmniGraffle, which many regard as amongst the best diagramming software out there. It runs a very decent version of Microsoft Office. Many would argue that - especially for developers - the software ecosystem for Macs is even superior to Windows. And add on top of that is that this also runs on a still mostly flawless out-of-the-box experience.

Sure, I bet most people could switch to Linux or Windows if they wanted to go through some effort. But it's more than a mental lock-in, you give too little credit to the Mac ecosystem. It might not be the obvious best place to be anymore, but it's still great value. As was pointed out before, this seems to be something that Apple is okay with.

I really hope this security incident makes Apple step up their game - they deserve all the hate they get for this. But the Mac value proposition will barely change for most people, as sad as that may be.

Please note as disclaimer that although I do use Macs sometimes, I spend most of my time on Windows and Linux systems.

> One key advantage Macs have over Windows is that they run Unix. You can open a terminal and be involved with most of the Linux/Unix monoculture that exists and have access to much the same tools. No VMs and all the hassle they bring to take into account, mostly at least.

Thankfully we are getting there with "Windows Subsystem for Linux." I am using the OpenSUSE subsystem which you can install in the Windows 10 store. It isn't perfect but it sure is getting closer.

> Also most people mentally think that somehow Apple is better for design but the software runs just as well on Windows.

Back in the PowerPC days, a large part of every keynote was getting Phil onstage to press the spacebar so we could all see how much slower Photoshop was at making the poster for Inspector Gadget. Can't help but feel like this was where a lot of people cut their teeth on this opinion. While Mac OS 9 and its users (niners) are a tiny minority now, I suspect a lot of those shops moved to Mac OS X.

But that was all a lie about the speed of Macs. It was absolutely smoke and mirrors. Intel CPUs blew the doors off the PowerPC. Case in point: Apple switched from PowerPC to Intel and saw a huge speed increase. The "Cult of Mac" was 100% anti-Intel, and people would tell me that the G5 PowerPC was the fastest personal computer you could buy. All lies and dishonesty. For years Apple stoked huge animosity between "Apple fanboys" and Intel.

While you don't believe you are locked in, I don't believe that you as a programmer "power user" are the majority that Apple cares about.

I believe not only that for the majority of users there is a level of software lock-in, but further there is a high level of psychological lock-in, where users get used to and comfortable with Apple's design strength, which is Apple's main offering.

It is easy to say that as people get more comfortable and older, they get more resistant to change.

Photos, apps purchased, and iMessage are overwhelmingly the reasons I don't see people switch. All their kids' photos, etc., are stored away, and they'd have to figure out how to nicely export them. iMessage is seamless for them across devices, while an alternative like Hangouts doesn't have the market penetration: it isn't ubiquitously used even among just Android users. Apps purchased I added to the list because people often don't think about it, but if you mention "re-buying all your apps" you see the frown appear on their face.

Without directly disagreeing with your post, I think there is a slight OS lock-in, in the fact that the MS alternative is a horrible piece of burning wreckage. Anybody that had to put up with the autoupdate experience in Windows 10 (oh, you were doing something important? Never mind, I'll just hog your network in random intervals for like an hour without you having a way to stop me and then I'll take 2 hours to apply the patches before the reboot), can understand that Apple was playing without serious competition for some years now.

There are many lock-ins. First there is iMessage. Second, there are some apps that still work only on Mac OS X: I don't remember the name of the software, but I was once sent a design file and was only able to open it in a Mac OS X application (there was a Windows alternative, but it didn't allow me to edit the file as needed). Another example is Xcode: you need a Mac to properly create iPhone apps. For programming, there are also issues with symbolic links on Windows.

I personally prefer Windows, but as a software developer I had to buy a Mac; I grew tired of always having to power on a Mac OS X virtual machine. My job is so much easier now than it was on Windows.

I used WhatsApp, Telegram, Messenger and Skype on my phone in the past two hours. Obviously no iMessage because I'm on Android. Maybe my friends with an iPhone use only iMessage between them (I doubt it) but the network effect is all for WhatsApp and Messenger where I live (Italy). Probably nobody switches to Apple because of iMessage here, but it could be a lock in if you really use it.

I'm not sure the correlation is technical proficiency by itself, I think it's based upon a critical mass of your social circles using iMessage or not using iMessage. If you choose to, you can probably make a correlation between "technical savviness" and a user's choice between Android and Apple, but I don't think that is a deciding factor in who uses iMessage.

The reason people throw fits is because the experience between a group messaging together on iMessage is exceptional - this experience breaks down when even one of your friends in the chat doesn't have an Apple product. They aren't able to send or receive the majority of the "chat add ons" iMessage provides. I'm sure making the bubbles green vs. blue only helps to stoke the "us vs. them" fire.

I consider myself to be a reasonably technical user and still prefer to message with iMessage since I know the experience will be the same for everyone I'm chatting with. Yes, we _could_ all start using WhatsApp et al, but if 8/9 of our group message is on iMessage, why would we?

Apple's sales per square foot in their stores is really high. Having some place to take your computer to when you need help is extremely valuable for a lot of people. Why don't Samsung, Dell, Lenovo, and HP all have their own stores in every neighborhood that has an Apple store? Is the Apple store only successful because of the iPhone?

The biggest thing I lost in a similar switch was ANY old conversation with somebody who was an iMessage user. Where they reply to a conversation and think they are just texting you, but actually they are sending iMessages that you just aren’t receiving. Especially on, for example, family group texts, people just find the old conversation and continue it from the previous year or whatever. Lots of times when I was on windows phone or had iMessage switched off, I’d only get the parts of the conversation coming from non-iMessage users. A good example is random family members who use android texting me reactions to an original text or picture I didn’t get. It’s really dumb. Maybe there’s something I could have done about it in my own settings, but instead I ended up switching back to iPhone recently. Not just for that reason but it sure was nice to get messages properly again. But also more sensitive stuff like somebody sharing pictures of items related to planning a funeral.

Thanks, that is really interesting. I hadn’t read that page, but I had disabled iMessage from the phone settings, which is how I first noticed the issue. Maybe I was not thorough enough, (it says to disable FaceTime too, which I didn’t/don’t want to disable) because it never corrected itself.

I second this. The only feature that I can immediately think of that locks me in the mac eco-system is the convenience of airdropping files with my wife. Apart from this, with most of my stuff on the cloud, there won't even be a migration process if I decide to switch to Windows.

While your theory is interesting, if deeply cynical, the thing I find most interesting is that it's the top comment on an 800+ comment discussion when it was less than a minute old. Do new comments start at the top? I've never noticed that before.

Edit: By the way, regarding the vulnerability, ANY password you use when you first attempt to login as root BECOMES root's new password. (Blank is a red herring.)

So if you're going to test this, maybe use something non-obvious. In a terminal, setting a strong password for root with "sudo passwd" is the quickest mitigation.

Ill-advised, but in a pinch, you can apparently 'secure' a machine you don't otherwise have access to by attempting to log in as root with a long random password you fail to remember. An admin on that machine can later change root's password with a "sudo passwd".

The higher the poster's karma, the higher her comment will be upon posting. This user has almost 6k karma, so it can rise high. Once the comment is at the top for a minute or so, it can stay there if enough people keep upvoting it.

Try it and post a top level comment now. I'm pretty sure it won't be at the top initially because you don't have enough karma for that.

That said, between this, the disk encryption bug, and not being able to type "I" on an iPhone, you have to wonder what is going on. I recently upgraded my MacBook Pro to High Sierra and it's been plagued with problems (weird red flash when displaying menus, hangs/crashes with external monitors, etc.)

Then I look at switching away, and I lose all the OSX software I own, all the easy iOS integration, all those Pages documents etc.

Maybe I just need to build a cheap but upgradable Linux box and start trying to switch.

Maybe. But this particular bug happened precisely because Apple changed _something_ in macOS. And that something was probably quite profound, since it impacted a part of the software that, at least from the outside, hasn't changed much in a long time.

A lot of macOS users would actually prefer Apple to do less with it than what they are currently doing.

> this particular bug happened precisely because Apple has changed _something_ in macOS

I don't know much about this bug but I have seen several reports that the bug has actually existed quite some time and is not new, only the publicity surrounding it is now shining a bright light on it.

I kind of love how you frame it as "everyone who has switched has switched", as if the job is done and there is no market left to capture. Which isn't true. It also doesn't consider that there are young computer buyers they need to capture, because existing users won't buy new machines forever (people die).

There is always more market to capture, but the cost of capturing those few additional users might not be worth it (to Apple, currently). And new users don’t look to the quality of the OS to pick their platforms, they look to existing user bases. And Apple now has a sizable existing user base, especially if you also count iPhones.

I was going to comment the same thing. Most people are always open to switching if the evidence is there to support a better workflow or experience. The realm of endless marketing to all the demographics will never stop.

I’m not so sure about this, although it may be due more to the hardware side of their business: after the recent, disappointing iteration of their MacBook Pros, I’ve heard of a lot of people considering switching (and actually switching).

Taken together with software quality issues, I wouldn’t be surprised if at least a subgroup of users are leaving Apple gradually. That subgroup being professional users, of course: Apple is still unassailed as a status symbol, and casual (+ mobile) users seem more than happy.

There's a big difference between a developer grudgingly keeping a cheap headless mini-computer under a stack of papers somewhere that gets used only as needed, and a developer using your system as their "home base" and buying into your entire ecosystem.

I think it's more that desktop machines make them a fraction of the money that their iOS devices do. Development goes towards the profit centre.

Remember that back when Apple made only computers, right before the iPod, they were on the verge of bankruptcy and barely profitable.

Since then their laptops have taken off, of course, and I have no idea how much money they make off them. But compared to the huge torrent of cash Apple makes off iPhones I can't imagine the beancounters see a huge amount of value in investing heavily in the parts of OS X that aren't shared with iOS.

Whether or not most potential apple users have already switched, security is surely vital to keeping their customers with them.

Much like the importance of feeling safe in our own house, if the computer that houses our information suddenly makes us feel unsafe or exposed, we'll naturally seek other options unless the issue is, shall I say, swiftly fixed or easily fixable.

Not to excuse the bug, but I think it has more to do with the annual upgrade cycle for the iPhone. Everything else Apple does has to tie into this, which is a pretty tight cycle for an OS with new "regular" OS features, plus the integrations with iOS.

They can't afford to wait 2 years (or whatever) to update the phones, and Mac OS gets pulled along for the ride.

Their QA department has been in a downward spiral since 2014. I would love to name some people who were doing a fantastic job running the place until then, but I'll spare the embarrassment. This really isn't about some mega company not caring as much as one of their cornerstone departments being unable to function effectively.

I really don't think that to be the case. Quality on OS X was a priority in its own right and fundamental to everything at Apple, not just a by-product of a strategy to get people to move from Windows.

Of course all that changed when its only priority became to shift more iPhones, and everything became secondary to that.

Some years ago, I was hearing about people switching from PCs to Macs all the time. Later, not so much, but macOS was still getting praise. Maybe Apple looked at the conversion numbers at that time and decided that the cost of keeping up the quality of macOS wasn’t worth the few PC converts they were still getting, and they figured that not enough people would switch back to PCs since the iOS system lock-in effects, etc. would present enough of a barrier.

So it’s not that there aren’t still people who could conceivably switch to Macs, it’s that Apple decided they didn’t need more converts quite as badly anymore.

This goes against basically every corporate strategy ever, which is to always increase growth.

At this stage in the company's life there is a disconnect between those who make the software and those who make the business decisions.

I don't think it's likely that Apple's board just decided to give up attracting new customers, and any apparent decline in quality is likely attributed to bad management; ineptitude, rather than purpose.

Old school corps - DEC, HP, even IBM to an extent - weren't about increasing growth irrespective of consequences.

The DEC Employee Handbook made a big deal out of Doing the Right Thing. Obviously that was subjective, frequently debatable, and sometimes just a pain in the ass - but it was a guiding principle for engineers of that generation, and for engineers who became managers.

And it produced some outstanding engineering and innovation.

Because it actually means "Do the best work you can, for your own self-respect, and also because you respect your users."

That's light years away from "Screw as much money out of your customers as you can, as many overtime hours out of your developers as you can, and if the product is broken - who cares if the money keeps coming in?"

Increase growth, yes, but not at any cost. My point is that Apple may have decided that at this time they don’t need the growth as much as they need internal developers to work on other things than macOS.

While I think there's a chance you might be right, I don't think it's logical in the long run. I think changes in perception like this are accumulated over time and will in the end hurt the product.

For some examples, look at the perception of Microsoft and Windows when it comes to quality. It is only now starting to improve, with gigantic efforts from Microsoft's side. Another example is Linux and usability, which has constantly gotten better (maybe still not good enough, but that's better left for another thread), yet many still see Linux as "advanced" and only for power users. These are not perfect examples, of course.

What I mean is that I think it's bad strategy on Apple's part (if they're doing this deliberately), especially considering the resources they have at their hands. I wouldn't be surprised if Apple could increase its desktop market share further by positioning themselves as high quality. However, it's a reputation they are losing fast.

>It is only now starting to improve, with gigantic efforts from Microsofts side.

Is it? They axed their internal QA and definitely aren't catching all the bugs with the "Insiders Program."

After the Fall Creators Update I've had to log in twice (after the first login I just get sent back to the login screen).

The workaround is disabling a setting: "Use my sign-in info to automatically finish setting up my device after an update or restart."

I'm also getting repeated alerts that a restart is required to complete installing an audio driver, but restarting doesn't finish it. I probably need to track down the responsible driver, uninstall it, and reinstall manually or hope Windows does it.

Obviously that's not as serious an issue as unauthenticated root access, but in day-to-day use of my Windows computer I don't have a very positive impression of their software quality.

Maybe not quality wise in all areas (I agree with you there) but at least in giving a professional and modern impression compared to let's say Windows XP/7. Maybe operating systems are declining in quality in general, even though organizations sometimes try to improve them. I guess legacy plays a big role here.

I've heard of a lot of people switching away from Macs to Linux and Windows, especially with Windows building up their own official Linux subsystem now.

PC hardware is cheaper than Apple's, and hardware (even the "good stuff") becomes obsolete after 5 years anyway. Besides, most software is cross platform these days.

The only real retention plan Apple has is that we can't release iOS apps without owning Apple hardware; there are a few Mac-specific software titles that certain professionals rely on; and there's a little bit of "it's overall higher quality than PCs" mindshare that some people still have from the '80s or early 2000s, but that can't last long if Apple keeps this up.

Nah. At this rate people will simply abandon the ship sooner or later. There's definitely some deterioration going on instead of only a cynical strategy shift.

The new MBP isn't attractive anymore. The software stagnates. The only reason I keep using Mac for usual use cases is just its wonderful collection of dictionaries (I like to constantly learn new languages). I wonder why no publisher ever bothered coming up with a decent dictionary software on Windows/Linux yet instead of making do with crappy online versions. If they did I'd happily just use a Windows + Linux dual boot machine.

There are probably a lot of people like me on HN who _need_ a unix box to do their work, and the various Macs are still far and away the best general purpose unix boxes available (best chassis, best peripheral compatibility, best (o|O)ffice software compatibility).

Now that Google Docs and Office 365 are "good enough" for most things, I would probably be happy to go back to Linux if there was a Linux machine that had comparable build quality yet was a bit cheaper than a Mac.

Dell XPS 15 on Linux is pretty glorious, you guys. My 2011 version is still kicking amazingly well with a 1TB SSD, and the newer models are way sleek. I also have a 2013 Sony Vaio dual-booting Linux/Windows. I haven't booted into Windows for anything but updating it in years.

I should have known that updating to a new macOS version before 6 to 9 months have passed is a mistake. High Sierra is, in my experience, the buggiest macOS release so far, and not only security-wise. The system is not very stable, and APFS reduced drive performance … :(

I basically only update when (a beta of) Xcode tells me it won't run on my current version. Usually that's the point when either all bugs have been fixed or they will not be fixed before the new version.

Yea, Xcode is so annoying with this. What magical features does it use under the hood to not allow basic functionality and iOS support that seems completely unrelated to the macOS version? I have zero incentive to update macOS except Xcode telling me it needs a new version that only runs on the new one.

But why? You don't need to be running the latest version of the Linux kernel to compile binaries for it. You don't need to run Windows 10 to compile programs for it. Why does Apple's compiler need to run on the same system its build target is for?

Probably also to run the actual application. But the Linux kernel is a different beast from macOS. macOS versions are pretty stable feature-wise; they don't change abruptly with a new release.

But I think if you keep compiling for older versions you should be able to stay on an older version for a while without newer versions of the OS refusing to run it.

It's just that sometimes new features are introduced that require you to change something in your application because there's a new or deprecated framework. Apple likes to break things to not drag a lot of legacy around.

Yeah, I miss the days back at the start of the decade when I would brim with delight over an email notification that a senior engineer / moderator had chimed-in on my thread on the Apple dev forums.

Checking the dev forums was my favourite thing to do in IT class at school :)

These days, I get that (especially now that they're open) the forums are too saturated with content to have engineers on the ball all the time... But the Captain Hindsight in me thinks they could have done with some keyword notifications to nip instances like this in the bud...

I've been a developer for a long time. I understand bugs happen, even bugs with terrible consequences. A lot of bugs seem understandable, like I can see the chain of ifs/thens required to end up at some hilarious broken state.

But I'm breaking my brain trying to figure out how in the hell a login attempt for "root" will enable it if it's disabled. Why is this a possibility, to just enable root, no questions asked?

Seems to be something related to a backwards-compatibility code path for upgraded systems. According to multiple posts on this thread it only affects systems upgraded to High Sierra, not fresh installs. See https://news.ycombinator.com/item?id=15802622 for example.
Adding extra layers for compatibility complicates testing and debugging. With this many eyes on it hopefully someone will be able to deduce exactly what's going on.

Apparently it is not just enabling root, but setting the password the first time you do it (in other words, the blank value has nothing to do with it). Then the subsequent times it'll use the pw you set the first time.

It would have to be that looking up the root account enabled it; maybe users go dormant or something, and this was a way to re-add them? Then once it was enabled it defaulted to a blank password. But you would think that it needs sudo to enable root in the first place.

So by my logic - if you tried this exploit and it failed the first time, then worked the second time: No one else has tried it before you. Otherwise it would either have worked the first time (if you guessed the same pass) or not worked at all (if the first time it was tried a different pass was used).

Well, I suppose if someone had exploited your system with this, they could probably install some remote access tool, and then disable the root account and unset the password, and remove all evidence they were there.

But, if you don't have Screen Sharing or Remote Management enabled and exposed to the WAN, you're probably safe unless someone untrusted had physical access.

It's hard to know how long this vulnerability was "known." The initial report on Nov 13th looks second hand, so it may have been circulating earlier.

Not only enabled it but actually set an empty-string password. Usually the stored hash for a disabled account is not in the hash space, so it was either overwritten, or root account password was actually empty string out of the factory. That and the enabling of the account both point to debug code accidentally left in (or intentional backdoor by the disgruntled).

To sum up: when you try to log in with a disabled account, MacOS "promotes" the account but uses the _password provided by the user trying to log in_ instead of the password on file (in this case, an asterisk indicating the account is disabled). Once that is done, you can log in with that account.

IMHO these are two separate bugs: promoting disabled accounts and using the password the user typed in instead of the value in the password list.
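Piecing together the descriptions in this thread, the two bugs compose roughly like this. To be clear, this is a hypothetical sketch with invented names, not Apple's actual code:

```python
# Hypothetical model of the reported behaviour -- all names invented.
# A disabled account stores "*" as its credential; a legacy "upgrade"
# path promotes such accounts on a login attempt.

DISABLED = "*"

accounts = {"root": {"credential": DISABLED, "enabled": False}}

def buggy_login(user, typed_password):
    acct = accounts.get(user)
    if acct is None:
        return False
    if acct["credential"] == DISABLED:
        # Bug 1: a mere login attempt "promotes" the disabled account.
        # Bug 2: the new credential is whatever the attacker typed,
        # rather than anything validated against a value on file.
        acct["credential"] = typed_password
        acct["enabled"] = True
        return False  # ...which is why the first attempt appears to fail
    return acct["credential"] == typed_password

print(buggy_login("root", ""))  # False, but root is now enabled
print(buggy_login("root", ""))  # True: the second attempt succeeds
```

This also matches the observation above that whatever password you type the first time becomes the root password from then on.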

I see your point, but it still seems kind of wacky to me. They should validate that the password is correct, then promote the account. Taking the password provided in the authentication dialog just seems like a bad idea.

Perhaps the root issue here is forgetting that the asterisk indicates that the account is disabled and shouldn't be a candidate for promotion.

OSX user management is weird. At least on prev versions, they don't show a root account in the Users & Groups ui.

A guess: there's a code path in the UI that is only tested on "mac" accounts, not the root account that the system requires to exist. Something about the non-macness of the root account interacts badly with the UI that expects to be run on a mac users account.

It's worse than that. You're enabling the root user EVERY time you use this vulnerability. Even if you disable the root user in Directory Utility, logging in with root and no password will re-enable the root user.

Yeah I've noticed this myself - I'm on the fence as to whether this is actually disabling the account or simply creating that impression (it does show as disabled in Directory Utility after you perform this command).

My hope in recommending people disable this way is that with the additional scrutiny on this subsystem, accounts disabled this way will remain genuinely disabled in a future update. Either way this doesn't seem to reintroduce the bug.

As far as I can tell, "dsenableroot -d" seems to have no useful effect. After having "* Successfully disabled root user." with it, I can still log in to the root account with the password I set, both at the command line with "login" and from a remote machine via screen sharing.

To be flippant, I might say HN discussions seem to QA using Apple methods.

I haven't upgraded to High Sierra yet and this doesn't happen on my install atm. Does adding a password to the root user stop this vulnerability? If it does then that seems way better than disabling the account until this is fixed.

Having tested this by both approaches (disabling through GUI & shell), the above (through shell) seems to prevent this from re-occurring when you attempt to perform this bogus login again. Disabling the account via the GUI causes the failure to re-occur.

Can this be used remotely? Edit: Yes, after turning on Remote Management on my second mac I was able to log into it using Remote Desktop, account root and no pw.
It only works after getting physical access once.

Yes, I just had a coworker test it after I enabled remote management and they used screensharing.app. I didn't even get notified that a user remoted in... I've never used screen sharing; that seems awful. Had to look over and ask if he was in.

edit: I should say, I did test this locally first so I don't know if a fresh machine that hasn't done it will do the same thing and let a remote account enable root.. Would like to hear if anyone tested it remotely WITHOUT doing it locally first.

It only works after getting physical access once to enable the root user, by giving any password UI the username root with no password (which enables the local root account, and is also why it fails the first time around).

I wonder what is going on with software quality and testing at Apple. It feels like recently there have been quite a few issues like this (the FileVault password bug, numerous issues with iOS 11, the issue that totally broke iOS Safari a couple of years ago) which should have been fairly easily caught, especially given the limited range of devices their software runs on.

I know testing is hard, but a company with Apple’s resources shouldn’t be making slip ups like this. It suggests some real issues such as lack of unit/automated tests and/or sufficient release testing, which pretty urgently need addressing.

macOS and iOS updates at Apple are now inextricably tied to new iPhone releases. There is a strict yearly deadline that the teams sprint toward, a timeline imposed by marketing rather than readiness. This affects prioritization of which features are pursued, where they lie in the stack, and how polished they get.

Insufficient testing at today's Apple is not limited to software. They bragged about their extensive input testing lab [0] when the new line of Magic accessories was released, but the Magic Keyboard with Numeric Keypad launched last summer had all of its inventory pulled from the channel last month because users discovered that the model was so thin that its midsection bowed over time.

it is also that they pursue features just for the sake of it. things get moved around in the iPad from release to release for no good reason, often going backwards in usability. every release i have to relearn simple things like how to manage the screen brightness. i really wonder what they are thinking internally other than “we need to shake things up to make it appear we’re doing something with stale products”.

It seems phones and tablets have reached the stage where laptops were maybe 15 years ago. All the major features are done and innovation is pretty much over. So they have to make a lot of cosmetic changes that look like activity.

Haven't deadlines at Apple always been driven by marketing? I'm looking for a source but I remember a story where the product director for iPod was told by steve jobs "make it simple, fast, beautiful, and have it done by Christmas."

Take this for the anecdata that it is. I interviewed at Apple, referred by old Microsoft friends that worked there. As I was trying to get a feel for things before the interview, I asked about the software testing. I was told, "don't expect what you're used to at Microsoft". The reference there is from when Microsoft often had more testers on a team than devs (ah, the good ol' days). The summary of what I was told by friends, and the questions I asked during the interview, is that testers at Apple aren't the testers that Microsoft used to have. Microsoft had testers working in MS Research, researching ways to better test software. Apple, from the impressions I got, is doing good to have testers that can write "hello, world". This was from the app side of things, not OS; I don't know if it's any different on the OS side.

But since I don't work there, I have no good inside info. But just from gut feel, I don't think my anecdata is too far off the mark. Based just on the bugs made public, I just don't get the impression that there are testers at Apple whose sole reason for being there is to tear into a piece of software and break it. There was a bug a few weeks ago posted to HN that I commented on. I don't have a link without digging through my comments, but it was something along the lines of "how could a tester not find this in five minutes of exploratory testing?" This bug is similar. It would take more than five minutes, but were this my area to test I'd pick at it once in a while when I had a few minutes. As I pick at it, I wouldn't expect to find anything, but I've got a minute between builds, so instead of randomly clicking Facebook I'll randomly click this dialog. What did the dev forget? What weird state was not accounted for? Some kind of state overflow if I click the button enough times? Shove some Unicode in there, that didn't find anything; meh, maybe I ought to move o...hey, wait a minute. Did that thing just log me in as root?

As a Tester myself, I cannot understand why this is not covered by either unit tests or behavioral tests.
Clicking dialog buttons in rapid succession is what we (should) do once in a while. Especially in core functionalities such as the login screen.
It's one of the first screens you see as a tester. And you have default usernames, be it enabled or not.

For example, I do not own an iPhone, but at work, I made a bet with my colleague (jokingly) that I could break _something_ on his phone in a few minutes.

I did not have his finger print or pin-code, so I was very limited, I even joked "I don't need that, give it here!"

Finding out I only had a handful of options, I focused on the emergency dialer.
As any good tester would be curious about, I wanted to check the max field length, so I entered digits, copy/paste it a few times, copy/paste that string, ("wait, no limit? Not even at 1000? why?") and so on, until I noticed the interface became laggy, so of course, I kept going.

Boom, suddenly back at the login screen, tried to open the emergency dialer, but got a full blank white screen, in the meantime the phone started heating up substantially.
Since it was a new phone (iPhone 7 with iOS 10.x I believe) and the dev was getting nervous, we decided to reboot it. That fixed the issue.
(Curious if this is still an issue in iOS 11.x)

TL;DR: As a tester this simple curiosity should be in your blood, and especially covered in behavioral tests when your software has been around for 5+ years.

I got my friend's Apple Watch stuck last week just by looking at its features. IIRC it got stuck while using the "flashlight". It suddenly froze, and it took me a while to reboot it (it got stuck once more while rebooting).

All in all, it took me about a minute to break it, and around 5 minutes to get it working again. I was getting a bit nervous.

This bug isn't caused by rapid succession or whatever, it's more of a generic end to end test. In this case someone would have had to write an exact scenario that opens a settings page, unlocks it, types in 'root', no password, presses login and it should not work.
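A sketch of what that end-to-end case could look like; `authenticate` here is a made-up stand-in for whatever the settings unlock dialog actually calls, not a real API:

```python
# Hypothetical regression test for this class of bug. "authenticate"
# stands in for the real entry point behind the unlock dialog.

def authenticate(user, password):
    # Placeholder implementation with the *correct* behaviour: a
    # disabled root account is rejected regardless of the password.
    return False

def test_root_with_empty_password_never_authenticates():
    # Repeat the attempt: the real bug only manifested on the second
    # try, so a single-shot test would have missed it.
    for _ in range(3):
        assert authenticate("root", "") is False

test_root_with_empty_password_never_authenticates()
print("ok")
```

The repetition is the important part: a test that tries the login exactly once would have passed on a vulnerable system.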

Actually, I've been wondering why I hear less about people working at Apple than at other big tech companies. It seems everyone and their mother works at Google or Facebook, but not so much at Apple. Do they have fewer software engineers, or are their employees required to be more discreet?

Do they have fewer software engineers, or are their employees required to be more discreet?

I know but a few that work at Apple, and of those few they strike me as less forthcoming than the multitudes I've worked with and know at Microsoft. I've wondered if part of that is because Microsoft previews/pre-announces just about everything, whereas Apple (mostly, and not so much anymore) announces it when the shipping trucks show up at the local Apple store.

So the outcome from the Microsoftie is, "it'll do this that and the other, but that's all I can say right now." From a recent conversation with an Apple employee: "they make me go in a special room to use the hardware, and I can't work from home. That's all I can say."

Probably the former: last I looked, Apple had considerably fewer software employees than the other big companies.

I believe there was an article about how Apple was finding it extremely difficult to hire ML experts because of the secrecy they require. To mitigate that, they made an exception for their ML/AI engineers, allowing them to publish papers in external journals and present at conferences.

Presumably this is why their software is increasingly shoddy. But I wonder whether it's a direct effect of poor internal communications, or an indirect effect where the ludicrous secrecy has driven away any half-decent programmers.

Unrelated to Mac OS but I used to wonder all the time why iTunes connect was so shoddy. I got my answer when I learned Apple had outsourced a ton of backend work including iTunes Connect, App Store backend to Infosys in India.

It's the last release in which they made any significant updates to the BSD userland. I'd mark it as the release where they ceased serious investment in the operating system itself. After this point there's almost nothing of note. If you look at the dates of the various tools etc., this release is when they were last updated. Many are now getting on for a decade out of date. The only major change has been the switch to LLVM, and they made that horribly painful.

It also marks the decline of the desktop UI to introduce increasing amounts of iOS-like behaviour and appearance to the detriment of a usable desktop. Like proper scrollbars etc.

While some nostalgia might account for holdouts, it was the peak of MacOS in the minds of many, including myself. As a developer, I've been quite disappointed by its direction and declining quality. For the amount we pay for this hardware, it's not much to ask for some basic maintenance work and testing to be done.

There have been releases since that improve battery life, or keep it the same while doing more; background apps get throttled, graphics rendering has been optimized using Metal, and in future Mac devices I'm sure they'll put their mobile processors in to handle background tasks - they might even be strong enough to power the next generation of MacBook Air.

I have been in the Apple ecosystem for about 10 years. For a company that has been priding itself on end-user security, the bugs that have been creeping their way into the OS are just... disappointing. What is the point of paying a premium for a well-polished hardware/software bundle if the OS is malfunctioning in a non-trivial manner? Design? Right now when I use the calculator app on my iPhone and do 2+2+2 I get 24. That's a pretty awful design. Actually, it's a lie.

Dominic Giampaolo of BeOS/BeFS fame. He now works on APFS at Apple. Their work is really impressive - APFS was announced in June 2016 and rolled out on iOS devices in March 2017. Given that the APFS roll-out was relatively uneventful, and how they tested it [1], it seems that they can still do low-level engineering and proper testing.

Of course, until recently they had Chris Lattner as well.

[1] For some iOS releases, they converted HFS+ to APFS in place, reported the results back to Apple, but did not write the APFS 'superblock', keeping the filesystem HFS+. It's quite a smart idea, because they got reports from millions of devices without actually switching them to APFS.
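That "do everything except the final commit" pattern generalizes to any risky migration. A hypothetical sketch of the idea (none of this is Apple's actual tooling):

```python
# Hypothetical dry-run migration in the spirit described above: run the
# full conversion, verify it, report the outcome, but withhold the one
# write that would make the new format take effect.

def convert(records):
    # Stand-in for the real conversion work (here: uppercase the values).
    return {key: value.upper() for key, value in records.items()}

def migrate(records, commit=False):
    converted = convert(records)
    verified = set(converted) == set(records)  # every record survived
    report = {"records": len(converted), "verified": verified}
    if commit and verified:
        return converted, report  # real run: adopt the converted data
    return records, report        # dry run: discard it, keep the old data

data = {"a": "x", "b": "y"}
result, report = migrate(data)  # dry run by default
print(report)          # {'records': 2, 'verified': True}
print(result is data)  # True: nothing was actually changed
```

The payoff is the same as in the APFS case: you collect verification reports from real data at scale before any device is irreversibly switched over.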

Subjectively it feels like Apple bugs have become larger and more prevalent over the last few years. That, and IMO clean OSX/iOS installs don't quite feel as polished as they used to. (I stopped using Apple products, except for a MBP, for a few years and recently started using them again, and the MBP still runs 10.10 for precisely this reason.) The last solid OSX release was Snow Leopard.

They've added features over the years without removing or polishing them; there's Launchpad, which was added in a period when OSX seemed to lean towards becoming touch-friendly, but it didn't replace any existing feature (iirc) and just feels off. Might just be me though. Notification Center? Don't use it.