Archive for the ‘Apple’ tag

Designed by Apple in California is a tagline the company uses to add a little prestige to its Chinese-manufactured electronics. In addition to designing electronics, the company also designs its own stores. However, when people in California design stores, they often overlook environmental issues that are rare there but common elsewhere, such as ice and snow:

Apple’s new flagship retail store in Chicago, the one with a MacBook-shaped rooftop, is nothing short of an architectural marvel. At least, that’s how some news reports put it when the store opened back in October. Beyond standing out among the less inspired buildings of the downtown Chicago area, the new Apple Store also happens to be very poorly thought through considering its thin roof now has dangerous icicles hanging perilously over public walkways.

Designed by Apple in a state that doesn’t have to deal with arctic bullshit. As a Minnesotan I can’t help but laugh at this.

Apple isn’t the first company to run into this problem and it won’t be the last. It’s too easy to take architecture for granted. An architect in California can easily overlook the effects harsh winters will have on their building. An architect in Minnesota can easily overlook the effects earthquakes will have on their building. If you’re tasked with designing a building that will be built in another region, it might be a good idea to contact some architects in that area and ask them about environmental issues they have to design around.

I swear Apple fanboys are some of the dumbest people on the planet. Quite a few of them have been saying, “If an attacker has physical access, it’s game over anyways,” as if that statement makes the root user exploit recently discovered in High Sierra a nonissue.

At one time that statement was true. However, today physical access is not necessarily game over. Look at all of the trouble the Federal Bureau of Investigation (FBI) has been having with accessing iOS devices. The security model of iOS actually takes physical access into account as part of its threat modeling and has mechanisms to preserve the integrity of the data contained on the device. iOS requires all code to be signed before it will install or run it, which makes it difficult, although far from impossible, to insert malicious software onto iOS devices. But more importantly, iOS encrypts all of the data stored in flash memory by default. Fully encrypted disks protect against physical access both by preventing an attacker from getting any usable data from a disk and by preventing them from altering the data on the disk (such as writing malware directly to it).
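The integrity half of that protection can be sketched with Python’s standard `hmac` module. This is only a toy illustration of the principle (real disk encryption such as iOS’s uses AES with keys held in hardware, which the standard library doesn’t provide): data sealed with a keyed MAC is rejected if an attacker with physical access modifies it.

```python
import hashlib
import hmac
import os

def seal(key: bytes, data: bytes) -> bytes:
    """Append a keyed MAC so any later tampering is detectable."""
    tag = hmac.new(key, data, hashlib.sha256).digest()
    return data + tag

def unseal(key: bytes, blob: bytes) -> bytes:
    """Verify the MAC before trusting the data; raise on tampering."""
    data, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("data was modified")
    return data

key = os.urandom(32)            # on a real device the secret lives in hardware
blob = seal(key, b"system binary")

# An attacker with physical access overwrites bytes on "disk"...
tampered = b"malware!!" + blob[9:]
try:
    unseal(key, tampered)
except ValueError:
    print("tampering detected")  # the altered data is rejected, not executed
```

An attacker without the key can neither read an encrypted disk nor alter it without the change being caught at verification time.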

macOS has a boot mode called single user mode, which boots the computer to a root command prompt. However, if a firmware password is set, single user mode cannot be started without entering the firmware password. The firmware password can be reset on machines with removable RAM (resetting the password requires changing the amount of RAM connected to the mainboard) but most of Apple’s modern computers, some iMacs being the exception, have RAM modules that are soldered to the mainboard.

Physical access is especially dangerous because it allows an attacker to insert malicious hardware, such as a key logger, that would allow them to record everything you type, including your passwords. However, that kind of attack requires some amount of sophistication and time (at least if you want the malicious hardware to be difficult to detect), which is where the real problem with High Sierra’s root exploit comes in. The root exploit required no sophistication whatsoever. Gaining root access only required physical access (or remote access if certain services were enabled) to an unlocked Mac for a few seconds. So long as an attacker had enough time to open System Preferences, click one of the lock icons, and type in “root” for the user name a few times, they had complete access to the machine (from there they could turn on remote access capabilities to maintain their access).

Attempting to write off this exploit as a nonissue because it requires physical access demands willful ignorance of both modern security features that defend against attackers with physical access and the concept of severity (an attack that requires no sophistication can be far more severe than a time-consuming, sophisticated attack under certain threat models).

macOS High Sierra may go down in the history books as Apple’s worst release of macOS since the initial one. Swapping the graphical user interface over to the Metal API wasn’t a smooth transition to say the least, but the real mess is in regards to security. There was a bug where a user’s actual password could be displayed in the password hint field, so a malicious user only needed to enter the password incorrectly to trigger the hint and reveal it. But yesterday it was revealed that the root account, which is normally disabled entirely, could be activated in High Sierra by simply typing root into the user name field in System Preferences:

The bug, discovered by developer Lemi Ergin, lets anyone log into an admin account using the username “root” with no password. This works when attempting to access an administrator’s account on an unlocked Mac, and it also provides access at the login screen of a locked Mac.

The security mistakes in High Sierra are incredibly amateur. Automated regression testing should have caught both the password hint mistake and this root account mistake. I can only assume that Apple’s quality assurance department took the year off because both High Sierra and iOS 11 are buggy messes that should never have been released in the state they were in.
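The kind of regression test that would have caught the root bug is not exotic. Here is a minimal sketch; the `ACCOUNTS` store and `authenticate` function are hypothetical stand-ins for illustration, not Apple’s actual code. The invariant under test is exactly the one High Sierra violated: a disabled account must never authenticate, no matter how many attempts are made.

```python
# Hypothetical account store; "root" is disabled, as it normally is on macOS.
ACCOUNTS = {
    "root":  {"enabled": False, "password": None},
    "alice": {"enabled": True,  "password": "hunter2"},
}

def authenticate(username: str, password: str) -> bool:
    account = ACCOUNTS.get(username)
    if account is None or not account["enabled"]:
        return False  # a failed attempt must never enable the account
    return account["password"] == password

def test_disabled_root_never_authenticates():
    # The High Sierra bug surfaced on repeated attempts, so try several.
    for _ in range(5):
        assert not authenticate("root", "")
    # Attempting to log in must not have flipped the account on.
    assert not ACCOUNTS["root"]["enabled"]

test_disabled_root_never_authenticates()
print("regression test passed")
```

A test like this, run on every build, turns “typing root a few times grants admin” from a shipped vulnerability into a failed CI run.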

The announcement of the iPhone X was one of the biggest product announcements of the year. Not only is it the latest iPhone, which always captures headlines, but it includes a new facial recognition feature dubbed Face ID. With the popularity of the iPhone it’s inevitable that politicians will try to latch onto it to capture some headlines of their own. Al Franken, one of Minnesota’s congress critters, decided to try to latch onto the iPhone X by expressing concern about the privacy implications of the Face ID feature. This may appear to have been a smart political maneuver but the senator only managed to make himself appear illiterate since Apple had already published all of the technical information about Face ID:

Apple has responded to Senator Al Franken’s concerns over the privacy implications of its Face ID feature, which is set to debut on the iPhone X next month. In his letter to Tim Cook, Franken asked about customer security, third-party access to data (including requests by law enforcement), and whether the tech could recognize a diverse set of faces.

In its response, Apple indicates that it’s already detailed the tech in a white paper and Knowledge Base article — which provides answers to “all of the questions you raise”. But, it also offers a recap of the feature regardless (a TL;DR, if you will). Apple reiterates that the chance of a random person unlocking your phone is one in a million (in comparison to one in 500,000 for Touch ID). And, it claims that after five unsuccessful scans, a passcode is required to access your iPhone.
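Taking the quoted figures at face value, a quick back-of-the-envelope calculation shows why the five-attempt limit matters: even across all five scans allowed before a passcode is demanded, a random person’s chance of unlocking the phone stays around five in a million.

```python
face_id_rate = 1 / 1_000_000   # false-accept odds quoted for Face ID
attempts = 5                   # scans allowed before a passcode is forced

# Probability that at least one of five independent scans falsely matches
p_unlock = 1 - (1 - face_id_rate) ** attempts
print(f"{p_unlock:.8f}")       # roughly 5 in a million
```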

Franken should feel fortunate that Apple even bothered entertaining his concerns. Were I Tim Cook I would have directed a member of my staff to send Franken links to the technical publications with a request to have a member of his staff read them to him and not bothered giving him a TL;DR. After all, Apple’s time is worth far more money than Franken’s since it’s actually producing products and services that people want instead of being a parasite feeding off of stolen money.

Still I admit that it was pretty funny seeing Franken make an ass of himself yet again.

Apple released macOS High Sierra yesterday. Amongst other changes, High Sierra includes the new Apple File System (APFS), which replaces the decades-old Hierarchical File System (HFS). When you install High Sierra, at least if your boot drive is a Solid State Drive (SSD), the file system is supposed to be automatically converted to APFS. Although Apple’s website said that FileVault-encrypted drives would be automatically converted, it didn’t give any details.

I installed High Sierra on two of my systems last night. One was a 2012 MacBook Pro and the other was a 2010 Mac Mini. Both contain Crucial SSDs. Since they’re third-party SSDs I wasn’t sure if High Sierra would automatically convert them. I’m happy to report that both were converted automatically. I’m also happy to report that FileVault didn’t throw a wrench into the conversion. I was worried that converting a FileVault encrypted drive would require copying files from one encrypted container to a new encrypted container but that wasn’t necessary.

If you’re installing High Sierra on a FileVault-encrypted drive, the conversion from HFS to APFS won’t take a noticeably greater amount of time.

One reason I prefer iOS over Android is because Apple has invested more heavily in security than Google has. Part of this comes from the fact Apple controls both the hardware and software so it can implement hardware security features such as its Secure Enclave chip whereas the hardware security features available on an Android device are largely dependent on the manufacturer. However, even the best security models have holes in them.

Some of those holes are due to improperly implemented features while others are due to legalities. For example, here in the United States law enforcers have a lot of leeway in what they can do. One practice that has become more popular, especially at the border, is the use of devices that copy data from smartphones. This has been relatively easy to do on Apple devices if the user unlocks the screen because trusting a new connection has only required the tap of a button. That will change in iOS 11:

For the mobile forensic specialist, one of the most compelling changes in iOS 11 is the new way to establish trust relationship between the iOS device and the computer. In previous versions of the system (which includes iOS 8.x through iOS 10.x), establishing trusted relationship only required confirming the “Trust this computer?” prompt on the device screen. Notably, one still had to unlock the device in order to access the prompt; however, fingerprint unlock would work perfectly for this purpose. iOS 11 modifies this behaviour by requiring an additional second step after the initial “Trust this computer?” prompt has been confirmed. During the second step, the device will ask to enter the passcode in order to complete pairing. This in turn requires forensic experts to know the passcode; Touch ID alone can no longer be used to unlock the device and perform logical acquisition.

Moreover, Apple has also included a way for users to quickly disable the fingerprint sensor:

In iOS 11, Apple has added a new emergency feature designed to give users an intuitive way to call emergency by simply pressing the Power button five times in rapid succession. As it turns out, this SOS mode not only allows quickly calling an emergency number, but also disables Touch ID.

These two features appear to be aimed at keeping law enforcers accountable. Under the legal framework of the United States, a police officer can compel you to provide your fingerprint to unlock your device but compelling you to provide a password is still murky territory. Some courts have ruled that law enforcers can compel you to provide your password while others have not. This murky legal territory offers far better protection than the universal ruling that you can be compelled to provide your fingerprint.

Even if you are unable to disable the fingerprint sensor on your phone, law enforcers will still be unable to copy the data on your phone without your password.

I’ve annoyed a great many electrons writing about the dangers of using other people’s computers (i.e. “the cloud”) to store personal information. Most of the time I’ve focused on the threat of government surveillance. If your data is stored on somebody else’s computer, a subpoena is all that is needed for law enforcers to obtain it. However, law enforcers aren’t the only threat when it comes to “the cloud.” Whoever is storing your data, unless you’ve encrypted it in a way that makes it inaccessible to others before you uploaded it, has access to it, which means that their employees could steal it:

Twenty-two people have been detained on suspicion of infringing individuals’ privacy and illegally obtaining their digital personal information, according to a statement Wednesday from police in southern Zhejiang province.

Of the 22 suspects, 20 were employees of an Apple “domestic direct sales company and outsourcing company”.

This story is a valuable lesson and warning. Apple has spent a great deal of time developing a reputation for guarding the privacy of its users. But data uploaded to its iCloud service is normally stored in a form Apple itself can read, so while a third party may not be able to intercept it en route, at least some of Apple’s employees have access to it.

The only way you can guard your data from becoming public is to either keep it exclusively on your machines or encrypt it in such a way that third parties cannot access it before uploading it to “the cloud.”
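Encrypting before uploading can be sketched in a few lines. This assumes the third-party `cryptography` package (not in the standard library); its `Fernet` class provides authenticated symmetric encryption with a key that only you hold, so whatever the provider stores is opaque to its employees.

```python
# Assumes: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep this key local; never upload it
f = Fernet(key)

plaintext = b"my private notes"
ciphertext = f.encrypt(plaintext)  # this blob is what goes to "the cloud"

# The provider's employees see only ciphertext; you can still recover it.
assert f.decrypt(ciphertext) == plaintext
print("only ciphertext ever leaves the machine")
```

The design point is where the key lives: encryption performed by the provider protects you from outsiders, but only encryption performed on your own machine, with a key the provider never sees, protects you from the provider.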

My first Apple product was a PowerBook G4 that I purchased back in college. At the time I was looking for a laptop that could run a Unix operating system. Back then (as is still the case today, albeit to a lesser extent) running Linux on a laptop usually meant giving up sleep mode, Wi-Fi, the additional function buttons most manufacturers added to their keyboards, and a slew of power management features that made the already pathetic battery life even worse. Since OS X was (and still is) Unix-based and didn’t involve the headaches of trying to get Linux to run on a laptop, the PowerBook fit my needs perfectly.

Fast forward to today. Between then and now I’ve lost confidence in a lot of companies whose products I used to love. Apple on the other hand has continued to impress me. In recent times my preference for Apple products has been influenced in part by the fact that it doesn’t rely on selling my personal information to make money and displays a healthy level of paranoia:

Apple has begun designing its own servers partly because of suspicions that hardware is being intercepted before it gets delivered to Apple, according to a report yesterday from The Information.

“Apple has long suspected that servers it ordered from the traditional supply chain were intercepted during shipping, with additional chips and firmware added to them by unknown third parties in order to make them vulnerable to infiltration, according to a person familiar with the matter,” the report said. “At one point, Apple even assigned people to take photographs of motherboards and annotate the function of each chip, explaining why it was supposed to be there. Building its own servers with motherboards it designed would be the most surefire way for Apple to prevent unauthorized snooping via extra chips.”

Apple has a lot of hardware manufacturing capacity and it appears that the company will be using it to further protect itself against surveillance by manufacturing its own servers.

This is a level of paranoia I can appreciate. Years ago I brought a lot of my infrastructure in-house. My e-mail, calendar and contact syncing, and even this website are all hosted on servers running in my dwelling. Although part of the reason I did this was for the experience, another reason was to guard against certain forms of surveillance. National Security Letters (NSLs), for example, require service providers to surrender customer information to the State and legally prohibit them from informing the targeted customer. Since my servers are sitting in my dwelling, any NSL would necessarily require me to inform myself of receiving it.

It appears that the Federal Bureau of Investigation (FBI) is finally following the advice of every major security expert and pursuing alternate means of acquiring the data on Farook’s iPhone, which means the agency’s crusade against Apple is temporarily postponed:

A magistrate in Riverside, CA has canceled a hearing that was scheduled for Tuesday afternoon in the Apple v FBI case, at the FBI’s request late Monday. The hearing was part of Apple’s challenge to the FBI’s demand that the company create a new version of its iOS, which would include a backdoor to allow easier access to a locked iPhone involved in the FBI’s investigation into the 2015 San Bernardino shootings.

The FBI told the court that an “outside party” demonstrated a potential method for accessing the data on the phone, and asked for time to test this method and report back. This is good news. For now, the government is backing off its demand that Apple build a tool that will compromise the security of millions, contradicts Apple’s own beliefs, and is unsafe and unconstitutional.

This by no means marks the end of Crypto War II. The FBI very well could continue its legacy of incompetence and fail to acquire the data from the iPhone through whatever means it’s pursuing now. But this will buy us some time before a court rules that software developers are slave laborers whenever some judge issues a court order.

I’m going to do a bit of speculation here. My guess is that the FBI didn’t suddenly find somebody with a promising method of extracting data from the iPhone. After reading the briefs submitted by both Apple and the FBI it was obvious that the FBI either had incompetent lawyers or didn’t have a case. That being the case, I’m guessing the FBI decided to abandon its current strategy because it foresaw the court setting a precedent against it. It would be far better to abandon its current efforts and try again later, maybe against a company that is less competent than Apple, than to pursue what would almost certainly be a major defeat.

Regardless of the FBI’s reasoning, we can take a short breath and wait for the State’s next major attack against our rights.