Bragging about Macs being less vulnerable is like me bragging about Linux being less vulnerable...yeah, we're not really less vulnerable, just single-digit market percentage...ain't nobody got time to write a virus for single-digit market percentage...especially since the exploit being targeted will be patched within 24 hours of being disclosed to the public on most major Linux distributions.

Or, you know, run Windows where exploits cross generations and OS versions. I still come across the occasional article about Windows 7/8/10 running into Windows 3.1/95 era bugs...because, you know, an OS still dealing with bugs from the early 90s is what I want to run to do secure, financial transactions and keep me private.

Hacking-wise, Linux and Macs are like high end cars and Windows is like a 1990 Toyota Camry. You can steal a Ferrari or a BMW, but it's 1000x easier to steal a 1990 Toyota Camry since they can be started and driven off with nothing but long fingernails...keys aren't required in that car...seriously, I used to own one.

Rather amusing. When I was working with Apple computers in the '80s, they were the worst for infections. People loaded any crap they came across. Since this was a government job, only the support team was authorized to add software, but cleanups and rebuilds were a daily thing.

/NASA "manager" was the worst offender, her system rarely ran for more than a week before she added something that crashed it.

One of the attacks that specifically makes use of out-of-date firmware is the Thunderstrike attack, which allows a malicious actor to take control of the machine by inserting a malicious Ethernet adapter into a Mac's Thunderbolt port to deliver its payload.

How did the whole "Mac" and "PC" nomenclature come about, anyway? If it's hardware, wouldn't the better comparison have been "Apple" and "IBM" or "Intel"? And why does it persist when it seems to now refer to "OSX" vs "Windows"?

On what planet is Linux a single-digit market percentage? Sure, for desktop computing, they are. But for backend stuff- y'know, for the systems that hold valuable data- they are the world. Also, for IoT devices- if you're building a botnet, it's waaaaay easier to build that botnet out of shiattly secured coffee makers that connect to the Internet for some idiotic reason.

But there's only so much an OS can do to control its threat surface- code/data separation, ASLR, robust privilege models, etc. At the end of the day though, it's on the actually used software to provide sufficient security. For example, the Equifax hack used a Struts vuln that happened entirely in application-space. No OS could defend you against that.
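To make one of those OS-level mitigations concrete: on Linux you can query the kernel's ASLR setting directly. This is just a quick sketch; the /proc path is standard on modern kernels, and the meaning of the values is roughly 0 = off, 1 = stack/mmap randomized, 2 = full randomization (the usual default).

```shell
# Query the kernel's ASLR setting (0 = off, 1 = partial, 2 = full)
cat /proc/sys/kernel/randomize_va_space

# Watch it work: the stack mapping of two otherwise-identical shells
# lands at different addresses when randomization is on.
bash -c 'grep "\[stack\]" /proc/$$/maps'
bash -c 'grep "\[stack\]" /proc/$$/maps'
```

Run it twice and compare the stack addresses; with randomization off they'd be identical every boot, which is exactly what makes exploits easier to aim.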

Now, that said, macOS does something that only Apple could get away with, that honestly does make your device more secure: in its default configuration, it won't run unsigned code. No OS should run a binary without a valid cryptographic signature from a known source. Apple, sadly, doesn't go far enough- you can still run Unix-style executables without a signature, but no application bundles (which means it's good enough to keep users from hurting themselves, but doesn't stop a determined attacker).
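You can poke at that signature policy from a Mac's terminal yourself. `spctl` and `codesign` are Apple's actual tools; Safari is just a convenient example of a signed bundle.

```shell
# Is Gatekeeper (the signed-code policy) enabled at all?
spctl --status

# Would Gatekeeper allow this app bundle to run?
spctl --assess --verbose /Applications/Safari.app

# Inspect the code signature itself
codesign --verify --deep --verbose=2 /Applications/Safari.app
```

A bundle that fails assessment comes back "rejected", which is what end users see as the "can't be opened because it is from an unidentified developer" dialog.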

natazha:Rather amusing. When I was working with Apple computers in the '80s, they were the worst for infections. People loaded any crap they came across. Since this was a government job, only the support team was authorized to add software, but cleanups and rebuilds were a daily thing.

/NASA "manager" was the worst offender, her system rarely ran for more than a week before she added something that crashed it.

Oddly enough, the majority of Apple computers in the 1980s booted from floppies, and very few people had hard drives for them; they were frightfully expensive. There were only a handful of viruses (most were variants of each other), I think maybe 20, and most just introduced a counter into the boot sector that, after a certain number of boots, would display a message like "hey, your computer was hacked by whatever elite cool guy user group", and did not phone home. I think there were like 2 that destroyed the disk.

There were no "rebuilds" because the OS was resident on each floppy you booted from, so literally turning the computer off and putting another disk in would solve your problem. Due to technological limitations the virus was quarantined to the disk, because when you turned off the computer, the memory was wiped.

There were around 3 or so viruses for the Mac, and a few variants of each.

skozlaw:How did the whole "Mac" and "PC" nomenclature come about, anyway? If it's hardware, wouldn't the better comparison have been "Apple" and "IBM" or "Intel"? And why does it persist when it seems to now refer to "OSX" vs "Windows"?

It originated in the 80s when basically anyone could build PCs and their peripherals but Apple was super proprietary about their format. There was a time when if you owned an Apple computer you had to use an Apple HD, printer, etc. Other formats existed (Atari, Commodore, etc.) but PCs composed the majority of the market with Apple a very distant second. Apple's defense of this policy was that it was necessary to maintain quality and ensure the computers were easy to use (DOS based PCs were not exactly user friendly), however it can't be ignored that Apple computers and accessories were far more expensive than comparable PCs (for example, when I upgraded to a PC from my Atari in 1990 the comparable Apple was over 3 times the cost).

Prior to the iPod and iPhone this was a common description of Apple's proprietary strategy:

Arrogance Produces Profit-Losing Entity

It's not just that Macs are less vulnerable, it's that they're just not a worthwhile target. Sure, you can get some buddy's little dinky music program to shut down and steal their "Beats", or lock some artist out of their digital files. But there's no real valuable information on most Macs.

It's not just the smaller market share, it's also the "why bother" infecting some douchebag hipster's Mac. If they can afford a Mac in the first place, they will probably just throw it out and buy a new one, cuz that's what Apple users do. It's not like you can repair or upgrade them.

Tyrosine:their peripherals but Apple was super proprietary about their format

Well, I think more than that, Apple was super unwilling to adopt other people's proprietary formats. That's why they went with Firewire over USB at points. USB was also proprietary, but Intel was more open about licensing it (because they wanted adoption). It's why they adopted SCSI. And while the ADB was definitely proprietary, at the time, it's not like there was much of a mass-market, widely adopted protocol out there for peripherals.

On what planet is Linux a single-digit market percentage? Sure, for desktop computing, they are. But for backend stuff- y'know, for the systems that hold valuable data- they are the world. Also, for IoT devices- if you're building a botnet, it's waaaaay easier to build that botnet out of shiattly secured coffee makers that connect to the Internet for some idiotic reason.

But there's only so much an OS can do to control its threat surface- code/data separation, ASLR, robust privilege models, etc. At the end of the day though, it's on the actually used software to provide sufficient security. For example, the Equifax hack used a Struts vuln that happened entirely in application-space. No OS could defend you against that.

Now, that said, macOS does something that only Apple could get away with, that honestly does make your device more secure: in its default configuration, it won't run unsigned code. No OS should run a binary without a valid cryptographic signature from a known source. Apple, sadly, doesn't go far enough- you can still run Unix-style executables without a signature, but no application bundles (which means it's good enough to keep users from hurting themselves, but doesn't stop a determined attacker).

The problem with Apple computers is not really the computers, but the user base.

About .001% know what they are doing. That includes "creatives" and people who cry out "I have been using the Mac since 1984, and I own every Apple product, hey how do I search for a file?".

They also have a false sense of security and will install anything from any site. "Your Flash player is not up to date?" Since they do not know how to check it under preferences in Safari, they install it, and boom, they get creepy crawlers. The worst being MacKeeper, which pretends to optimize their machines; then strange stuff starts happening, and an official-looking Apple logo and page pops up and says they have been infected and to call Apple. Then they get some guy who has them download a Citrix client, who opens up Terminal and shows them random directories and points out "problems", meanwhile the guy is grabbing their keychain with all of their personal data while extracting 199 bucks off their credit card for "cleaning the computer". Mac users are the worst for social engineering and phishing.

The other thing is Mac users rarely update their OSes; most have figured out how to turn off the nag screen, so they freak out when their bank has deprecated their version of their browser and they cannot log in, and they say "no one ever told me I had to update".

Windows converts to OS X are easier to deal with; they are used to poking around, updating, and taking instruction.

t3knomanser:Tyrosine: their peripherals but Apple was super proprietary about their format

Well, I think more than that, Apple was super unwilling to adopt other people's proprietary formats. That's why they went with Firewire over USB at points. USB was also proprietary, but Intel was more open about licensing it (because they wanted adoption). It's why they adopted SCSI. And while the ADB was definitely proprietary, at the time, it's not like there was much of a mass-market, widely adopted protocol out there for peripherals.

Firewire was great, but Apple wanted to charge manufacturers a buck for every Firewire port made; they killed it with cost.

theflatline:natazha: Rather amusing. When I was working with Apple computers in the '80s, they were the worst for infections. People loaded any crap they came across. Since this was a government job, only the support team was authorized to add software, but cleanups and rebuilds were a daily thing.

/NASA "manager" was the worst offender, her system rarely ran for more than a week before she added something that crashed it.

Oddly enough, the majority of Apple computers in the 1980s booted from floppies, and very few people had hard drives for them; they were frightfully expensive. There were only a handful of viruses (most were variants of each other), I think maybe 20, and most just introduced a counter into the boot sector that, after a certain number of boots, would display a message like "hey, your computer was hacked by whatever elite cool guy user group", and did not phone home. I think there were like 2 that destroyed the disk.

There were no "rebuilds" because the OS was resident on each floppy you booted from, so literally turning the computer off and putting another disk in would solve your problem. Due to technological limitations the virus was quarantined to the disk, because when you turned off the computer, the memory was wiped.

There were around 3 or so viruses for the Mac, and a few variants of each.

I think you just might be full of shiat.

Depends on how you define "Apple computers" and "The 80s."

Are you talking about Apple II computers in the early 80s? Or the Mac computers that followed the debut of the Mac Plus?

Because after the Mac Plus came out, people often had external or internal HyperDrives that held between 5 MB and 20 MB. And that worked out rather well compared to how it used to be.

Tyrosine:skozlaw: How did the whole "Mac" and "PC" nomenclature come about, anyway? If it's hardware, wouldn't the better comparison have been "Apple" and "IBM" or "Intel"? And why does it persist when it seems to now refer to "OSX" vs "Windows"?

It originated in the 80s when basically anyone could build PCs and their peripherals but Apple was super proprietary about their format. There was a time when if you owned an Apple computer you had to use an Apple HD, printer, etc. Other formats existed (Atari, Commodore, etc.) but PCs composed the majority of the market with Apple a very distant second. Apple's defense of this policy was that it was necessary to maintain quality and ensure the computers were easy to use (DOS based PCs were not exactly user friendly), however it can't be ignored that Apple computers and accessories were far more expensive than comparable PCs (for example, when I upgraded to a PC from my Atari in 1990 the comparable Apple was over 3 times the cost).

Prior to the iPod and iPhone this was a common description of Apple's proprietary strategy:

Arrogance Produces Profit-Losing Entity

Yeah. The aforementioned HyperDrive for Mac Plus computers.

Word has it that, when Apple's tech geniuses saw how engineers for HyperDrive had created a bootable internal hard drive for the Mac Plus, their response was: "You're not supposed to be able to do that..."

t3knomanser:Tyrosine: their peripherals but Apple was super proprietary about their format

Well, I think more than that, Apple was super unwilling to adopt other people's proprietary formats. That's why they went with Firewire over USB at points. USB was also proprietary, but Intel was more open about licensing it (because they wanted adoption). It's why they adopted SCSI. And while the ADB was definitely proprietary, at the time, it's not like there was much of a mass-market, widely adopted protocol out there for peripherals.

I see that as two sides of the same problem. They wanted to create a product that they would have 100% control over which precludes using other formats and licensing others to use your format. This problem still exists within the company as seen by their boneheaded resistance to a common standard for charging cables in Europe.

theflatline:Firewire was great, but Apple wanted to charge a buck for every firewire port manufactured from the manufacturer, they killed it with cost.

You mean MPEG LA, not Apple. MPEG LA holds all the patents on Firewire and related technologies. Apple designed the product, yes, but they partnered with many other companies to actually make it something to be sold. Firewire was just plain more complicated than USB, and that was the larger bottleneck- the hardware to handle Firewire was more complex.

Originally, nobody thought of them as competing protocols- Firewire was for high speed data transfer, USB was for peripherals. The low cost of implementing USB though meant that everyone used it for data transfer, despite being much slower than Firewire until the mid-late 2000s.

Tyrosine:skozlaw: How did the whole "Mac" and "PC" nomenclature come about, anyway? If it's hardware, wouldn't the better comparison have been "Apple" and "IBM" or "Intel"? And why does it persist when it seems to now refer to "OSX" vs "Windows"?

It originated in the 80s when basically anyone could build PCs and their peripherals but Apple was super proprietary about their format. There was a time when if you owned an Apple computer you had to use an Apple HD, printer, etc. Other formats existed (Atari, Commodore, etc.) but PCs composed the majority of the market with Apple a very distant second. Apple's defense of this policy was that it was necessary to maintain quality and ensure the computers were easy to use (DOS based PCs were not exactly user friendly), however it can't be ignored that Apple computers and accessories were far more expensive than comparable PCs (for example, when I upgraded to a PC from my Atari in 1990 the comparable Apple was over 3 times the cost).

Prior to the iPod and iPhone this was a common description of Apple's proprietary strategy:

Arrogance Produces Profit-Losing Entity

Hardly anyone was building their own PCs in the '80s; most personal home computers were Apples and Commodores.

It originated with the IBM 5150, which was called the IBM Personal Computer (PC). IBM based an entire product line around the name, as did peripheral makers.

And for the majority of the 1980s, Apples and IBMs were in the same price range; only in the late 1980s, when "clones" came out, did prices drop. People were upgrading on their own (adding a Sound Blaster, a graphics card, or RAM), but hardly anyone was building a box outside of universities, and even then it was rare; that really started in the 1990s. The one company that really sold kits was Heathkit, and even Zenith, but I never saw one in the wild.

PCs became cheaper because Phoenix clean-roomed the PC BIOS. IBM was suing everyone for copyright violation who made a PC without licensing their BIOS (which IBM really did want them to do). And you only really started seeing clones around '87-'88.

The Apple Lisa was overpriced junk, and Apple was too expensive in the early 1990s.

Actually, the Apple II line, which was used in schools and homes across the country, was built to be completely open, and you could get any part from literally thousands of third-party companies.

Candygram4Mongo:theflatline: natazha: Rather amusing. When I was working with Apple computers in the '80s, they were the worst for infections. People loaded any crap they came across. Since this was a government job, only the support team was authorized to add software, but cleanups and rebuilds were a daily thing.

/NASA "manager" was the worst offender, her system rarely ran for more than a week before she added something that crashed it.

Oddly enough, the majority of Apple computers in the 1980s booted from floppies, and very few people had hard drives for them; they were frightfully expensive. There were only a handful of viruses (most were variants of each other), I think maybe 20, and most just introduced a counter into the boot sector that, after a certain number of boots, would display a message like "hey, your computer was hacked by whatever elite cool guy user group", and did not phone home. I think there were like 2 that destroyed the disk.

There were no "rebuilds" because the OS was resident on each floppy you booted from, so literally turning the computer off and putting another disk in would solve your problem. Due to technological limitations the virus was quarantined to the disk, because when you turned off the computer, the memory was wiped.

There were around 3 or so viruses for the Mac, and a few variants of each.

I think you just might be full of shiat.

Depends on how you define "Apple computers" and "The 80s."

Are you talking about Apple II computers in the early 80s? Or the Mac computers that followed the debut of the Mac Plus?

Because after the Mac Plus came out, people often had external or internal HyperDrives that held between 5 MB and 20 MB. And that worked out rather well compared to how it used to be.

Apple sold Apple II variants up until the 1990s; they were the preferred model of choice. However, as I said in my post, the Mac did have viruses as well; there were not "tons of them", there were like three, and most were variants of the HyperCard virus, and one spread over early networks.

I used to run the network for a very large science museum and research institution. We had a large collection of Macs for the graphics department and PCs running Windows just about everywhere else.

Windows has a well deserved reputation for being an infection-laden pile of shiat. Back in the late 90s/early 2000s they insisted on embedding a lot of application functions into the OS itself. They were trying to take over the Internet, and by making Outlook and Explorer parts of the OS they were hoping to skirt anti-trust laws.

Because those two programs were "inside the sanctum", breaking in through, say, the address book could lead to total system pwn. There were also some questionable design decisions that went into Visual Basic, the system registry, and the way drivers interact with the system core (or actually had to replace parts of it) that meant there was no way to secure a Windows machine that was powered on.

I remember one virus that was so bad you couldn't do a clean install and get through system update before the machine was re-infected.

Windows has gotten a lot better. But it still leaves a nasty taste in my mouth.

Evil Twin Skippy:I used to run the network for a very large science museum and research institution. We had a large collection of Macs for the graphics department and PCs running Windows just about everywhere else.

Windows has a well deserved reputation for being an infection-laden pile of shiat. Back in the late 90s/early 2000s they insisted on embedding a lot of application functions into the OS itself. They were trying to take over the Internet, and by making Outlook and Explorer parts of the OS they were hoping to skirt anti-trust laws.

Because those two programs were "inside the sanctum", breaking in through, say, the address book could lead to total system pwn. There were also some questionable design decisions that went into Visual Basic, the system registry, and the way drivers interact with the system core (or actually had to replace parts of it) that meant there was no way to secure a Windows machine that was powered on.

I remember one virus that was so bad you couldn't do a clean install and get through system update before the machine was re-infected.

Windows has gotten a lot better. But it still leaves a nasty taste in my mouth.

/Uses a Mac at home

Format the drives and you are golden on Windows. There was a virus that made itself resident in the BIOS, but MS did not create the BIOS, so you cannot blame it on them.

MBR viruses have been dead since Windows 2000, but you could get them if you multibooted an older OS with 2000 on up.

On what planet is Linux a single-digit market percentage? Sure, for desktop computing, they are. But for backend stuff- y'know, for the systems that hold valuable data- they are the world. Also, for IoT devices- if you're building a botnet, it's waaaaay easier to build that botnet out of shiattly secured coffee makers that connect to the Internet for some idiotic reason.

But there's only so much an OS can do to control its threat surface- code/data separation, ASLR, robust privilege models, etc. At the end of the day though, it's on the actually used software to provide sufficient security. For example, the Equifax hack used a Struts vuln that happened entirely in application-space. No OS could defend you against that.

Now, that said, macOS does something that only Apple could get away with, that honestly does make your device more secure: in its default configuration, it won't run unsigned code. No OS should run a binary without a valid cryptographic signature from a known source. Apple, sadly, doesn't go far enough- you can still run Unix-style executables without a signature, but no application bundles (which means it's good enough to keep users from hurting themselves, but doesn't stop a determined attacker).

I was referring to desktop usage. If a server or backend Linux maintainer isn't using an up-to-date LTS kernel at the minimum, they're doing it wrong and deserve what happens to them. CentOS & Debian are easy to learn, free to use, & run old-ass LTS software with assloads of security patches...Need or want paid technical support? Red Hat...There, I've solved damn near every company's Linux server security issues. Oh, your software is old, out-of-date, and needs old-ass software & OSes to run? Sandbox it all in a VM or, you know, pay programmers to write new software...

I don't think the IoT or phones count....products with old kernels and shiatty to non-existent update services are going to be vulnerable regardless of the underlying software or OS.

Most Linux distros, by default, only allow a root or root-level user to install programs from places with valid cryptographic signatures, and they have the same Unix-style executable limitation that macOS has. In that same style, they are also set up so that only certain users with certain permissions can do certain things, or you need to be a root user or given permission to actually do something that will break your system. Mainstream Linux isn't any different than macOS in that regard (or *BSD or damn near any other open source or Unix-based OS for that matter).

That said, I'm unsure if there is a Linux distro that checks every binary/library/thing run against a cryptographic hash table before being loaded/run to ensure it is what was compiled upstream and is unaltered. Probably one out there, but it's nothing I've ever bothered looking into.
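The generic version of that check is easy to sketch in shell. The binary path and "known-good" hash below are made up for the demo (the hash happens to be the SHA-256 of an empty file); real-world versions of the idea exist as `rpm -V` on Red Hat-style systems, `debsums` on Debian, and the kernel's IMA subsystem.

```shell
# Hypothetical sketch: refuse to trust a binary unless its SHA-256
# matches a known-good value recorded at install time.
BINARY="/tmp/mytool"                # placeholder path for the demo
KNOWN_HASH="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

: > "$BINARY"                       # demo stand-in: an empty file hashes to the value above

ACTUAL=$(sha256sum "$BINARY" | awk '{print $1}')
if [ "$ACTUAL" = "$KNOWN_HASH" ]; then
    echo "hash OK, safe to run"
else
    echo "hash MISMATCH, refusing to run" >&2
fi
```

Since the demo file matches the recorded hash, this prints "hash OK, safe to run"; flip one byte of the file and it takes the mismatch branch instead.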

theflatline:Evil Twin Skippy: I used to run the network for a very large science museum and research institution. We had a large collection of Macs for the graphics department and PCs running Windows just about everywhere else.

Windows has a well deserved reputation for being an infection-laden pile of shiat. Back in the late 90s/early 2000s they insisted on embedding a lot of application functions into the OS itself. They were trying to take over the Internet, and by making Outlook and Explorer parts of the OS they were hoping to skirt anti-trust laws.

Because those two programs were "inside the sanctum", breaking in through, say, the address book could lead to total system pwn. There were also some questionable design decisions that went into Visual Basic, the system registry, and the way drivers interact with the system core (or actually had to replace parts of it) that meant there was no way to secure a Windows machine that was powered on.

I remember one virus that was so bad you couldn't do a clean install and get through system update before the machine was re-infected.

Windows has gotten a lot better. But it still leaves a nasty taste in my mouth.

/Uses a Mac at home

Format the drives and you are golden on Windows. There was a virus that made itself resident in the BIOS, but MS did not create the BIOS, so you cannot blame it on them.

MBR viruses have been dead since Windows 2000, but you could get them if you multibooted an older OS with 2000 on up.

No, this was a network virus that required a service pack to cure, because it broke in through a vulnerability in the network stack. At the time, offline updates were unheard of. And with hundreds of machines in circulation on the local network, you could never be sure you got all of them. We would bring in a machine that was giving us trouble and reformat it. And that is what landed us where we were.

Trust me, child, I know what the fark I'm talking about.

(And after that day we would ghost any machine we were bootstrapping with a copy of the OS that had been patched. Fortunately we were using all Dells and we had a site license for Windows.)

chawco:It's not just that Macs are less vulnerable, it's that they're just not a worthwhile target. Sure, you can get some buddy's little dinky music program to shut down and steal their "Beats", or lock some artist out of their digital files. But there's no real valuable information on most Macs.

It's not just the smaller market share, it's also the "why bother" infecting some douchebag hipster's Mac. If they can afford a Mac in the first place, they will probably just throw it out and buy a new one, cuz that's what Apple users do. It's not like you can repair or upgrade them.

I have a newer 13" Lenovo Yoga, which is a wonderful little computer. It's very quick, and the price was nice. But I have no illusions that I can have this repaired if it breaks, or of having the ability to do any upgrades to this notebook. That is just the way of the portable market now. About the only machines that are upgradable, or repairable, are gaming notebooks.

And no, I don't have any Apple notebooks. I just think it's stupid to fight over your choice of computer equipment.

theflatline:While not as vulnerable as PCs, most Mac users run their systems wide open and install anything.

This simply isn't true. The default, out-of-the-box configuration for the Mac prevents users from running app bundles that aren't signed with signatures authorized by Apple. You can't install anything that Apple doesn't approve of unless you know how to change those settings, and I'm very skeptical of the idea that most users actually do that.

FlashHarry:However, due to its inherently more secure OS, the Mac is currently much less susceptible to viruses - i.e. malicious code that can enter a computer without user intervention - if at all.

This also isn't true. An OS doesn't provide "inherent security", since the largest threat surface is always going to be applications. I mean, honestly, there was just a published exploit where a non-privileged application can exfil all of the data in a user's Keychain (macOS's password management tool).

Evil Twin Skippy:theflatline: Evil Twin Skippy: I used to run the network for a very large science museum and research institution. We had a large collection of Macs for the graphics department and PCs running Windows just about everywhere else.

Windows has a well deserved reputation for being an infection-laden pile of shiat. Back in the late 90s/early 2000s they insisted on embedding a lot of application functions into the OS itself. They were trying to take over the Internet, and by making Outlook and Explorer parts of the OS they were hoping to skirt anti-trust laws.

Because those two programs were "inside the sanctum", breaking in through, say, the address book could lead to total system pwn. There were also some questionable design decisions that went into Visual Basic, the system registry, and the way drivers interact with the system core (or actually had to replace parts of it) that meant there was no way to secure a Windows machine that was powered on.

I remember one virus that was so bad you couldn't do a clean install and get through system update before the machine was re-infected.

Windows has gotten a lot better. But it still leaves a nasty taste in my mouth.

/Uses a Mac at home

Format the drives and you are golden on Windows. There was a virus that made itself resident in the BIOS, but MS did not create the BIOS, so you cannot blame it on them.

MBR viruses have been dead since Windows 2000, but you could get them if you multibooted an older OS with 2000 on up.

No, this was a network virus that required a service pack to cure, because it broke in through a vulnerability in the network stack. At the time, offline updates were unheard of. And with hundreds of machines in circulation on the local network, you could never be sure you got all of them. We would bring in a machine that was giving us trouble and reformat it. And that is what landed us where we were.

Trust me, child, I know what the fark I'm talking about.

(And after that day we would ghost any mac ...

You did not state that in your post. You said that it was one that would infect you before a Windows update could run, so using the parameters given, my assumption of the "BIOS" virus was logical, because that was how it worked.

No, your assumption of BIOS was not logical. I was talking about the operating system. I mentioned 2 network applications in particular. I mentioned the integration of application-layer features into the operating-system side of the house in particular.

You didn't bother to read. And then you called me on some esoteric bit of trivia.

veale728:Macs are just as vulnerable, yes. Windows machines have a higher likelihood of attacks, though, because of the higher market share

This is why there are so many iOS attacks out there, right?

No, the difference between Windows and Mac OS is not just security through obscurity, but a different approach in default settings: Windows - and this unfortunately includes home installs - is pre-configured for enterprise environments. Want to save time sending your techs to your satellite office? We'll include remote administration and installation and have them on by default! Your corporate end user can just open the box, put the machine on your network, and you can install everything! Great idea for IT departments, terrible for users.

By contrast, Mac OS is locked down by default. Want remote administration? Start it up, go into a two-layer deep settings menu, and then input an administrator password. IT hates it, because they can't trust Bob in Accounting to handle that. But it also means that Bob's Mac at home is immune to most hacks unless someone breaks into his house.

Plus I'm one of those middle aged "veteran of the browser wars" who gets his panties in a bunch and insists that kids today don't understand what it was like then. And then swills some whiskey, and starts on yet another tale of the shiatty old days...

But there's only so much an OS can do to control its threat surface- code/data separation, ASLR, robust privilege models, etc. At the end of the day, it's on the software actually in use to provide sufficient security. For example, the Equifax hack used a Struts vuln that happened entirely in application space. No OS could defend you against that.

Now, that said, macOS does something that only Apple could get away with, that honestly does make your device more secure: in its default configuration, it won't run unsigned code. No OS should run a binary without a valid cryptographic signature from a known source. Apple, sadly, doesn't go far enough- you can still run unsigned Unix-style executables, just not unsigned application bundles (which means it's good enough to keep users from hurting themselves, but doesn't stop a determined attacker).
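The launch-time check described here can be sketched conceptually: before running a payload, the loader verifies a signature over the payload's hash against a trusted key and refuses anything that fails. A minimal Python sketch, using an HMAC as a stand-in for Apple's real public-key code signatures (the key and function names here are hypothetical, purely for illustration - actual Gatekeeper verification uses X.509 certificate chains):

```python
import hashlib
import hmac

# Stand-in for the platform vendor's trusted signing key (hypothetical;
# real code signing uses asymmetric keys and certificate chains, not HMAC).
TRUSTED_KEY = b"vendor-root-key-stand-in"

def sign(payload: bytes, key: bytes = TRUSTED_KEY) -> bytes:
    """Produce a MAC over the payload's hash -- a stand-in for a code signature."""
    digest = hashlib.sha256(payload).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def gatekeeper_allows(payload: bytes, signature: bytes) -> bool:
    """Launch-time check: only payloads whose signature verifies may run."""
    return hmac.compare_digest(sign(payload), signature)

app = b"#!/bin/sh\necho hello"
good_sig = sign(app)

print(gatekeeper_allows(app, good_sig))         # intact, signed payload: allowed
print(gatekeeper_allows(app + b"x", good_sig))  # tampered payload: refused
```

The point of the sketch is the shape of the policy, not the crypto: any modification to the payload, or any signature not produced under the trusted key, fails verification before the code ever runs.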

The problem with Apple computers are not really the computers, but the user base.

About .001% know what they are doing. That includes "creatives" and people who cry out "I have been using the Mac since 1984, and I own every Apple product, hey how do I search for a file?".

They also have a false sense of security and will install anything from any site. "Your Flash player is not up to date!" And since they do not know how to check it under preferences in Safari, they install it, and boom, they get creepy crawlers. The worst being MacKeeper, which pretends to optimize their machines; then strange stuff starts happening, and an official-looking Apple logo and page pops up and says they have ...

The problem with believing that computers should "just work" is that it exonerates you of any responsibility. You don't need to know how to avoid viruses or update your OS, because the computer is just going to take care of everything and it'll be fine. The kind of thinking that equates computers with magic is abhorrent and extremely dangerous.

t3knomanser:theflatline: While not as vulnerable as PCs, most Mac users run their systems wide open and install anything.

This simply isn't true. The default, out-of-the-box configuration for the Mac prevents users from running app bundles unless they're signed with signatures authorized by Apple. You can't install anything that Apple doesn't approve of, unless you know how to change those settings- and I'm very skeptical of the idea that most users actually do that.
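On an actual machine you can see this default policy through Gatekeeper's command-line front end. A quick sketch (macOS-only commands; `/Applications/Example.app` is a placeholder path):

```shell
# Is Gatekeeper assessment turned on? (It is, out of the box.)
spctl --status

# Would Gatekeeper allow this app bundle to launch?
spctl --assess --type execute --verbose /Applications/Example.app

# Inspect the bundle's code signature directly: who signed it, and is it intact?
codesign --verify --deep --strict --verbose=2 /Applications/Example.app
```

If the bundle is unsigned, or its signature doesn't chain to an identity Apple accepts, the assess step reports it as rejected and Finder refuses to open the app without an explicit user override.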

FlashHarry: However, due to its inherently more secure OS, the Mac is currently much less susceptible to viruses - i.e. malicious code that can enter a computer without user intervention - if at all.

This also isn't true. An OS doesn't provide "inherent security", since the largest threat surface is always going to be applications. I mean, honestly, there was just a published exploit where a non-privileged application can exfil all of the data in a user's Keychain (macOS's password management tool).

Only in Sierra did that change.

Previously, most users clicked "Anywhere" the first time the system pointed them to that setting, and it was a one-time change.

Now, in Sierra, the "Anywhere" option was removed, so when you try to install something from an unknown source, the system tells you to go to System Preferences, and behold, you just hit the Open button. You think most users can't hit the Open button?

And this third pic is my desk, albeit messy since I work from home. But if you enlarge the badge hanging from the lamp, you might see my employer.

Guntram Shatterhand:Who said Macs weren't vulnerable? Only people I know who believed that were the ones who came to me to ask me to fix them while spouting that sales garbage.

Also, aren't Macs now really just underpowered as fark?

Those are just the ones that are 10 years old and still running. My 2015 MacBook still outruns the top-of-the-line simulation servers we have at the office.

And that is running our software through an emulator, to boot. Our product is a Windows app. I develop on a VM so I can keep a snapshot of the OS around for debugging and/or recovery from "Windows Update bricked my machine" syndrome.

chawco:It's not just that Macs are less vulnerable, it's more that they're just not a worthwhile target. Sure, you can get some buddy's little dinky music program to shut down and steal their "Beats", or lock some artist out of their digital files. But there's no real valuable information on most Macs.

It's not just the smaller market share, it's also the "why bother" of infecting some douchebag hipster's Mac. If they can afford a Mac in the first place, they will probably just throw it out and buy a new one, cuz that's what Apple users do. It's not like you can repair or upgrade them.

I'm using a Mac laptop with DIY upgraded memory and an aftermarket battery. It'll be my last, though. The latest ones seem to have the battery glued in, and the memory cards are soldered to the motherboard.

Abe Vigoda's Ghost:chawco: It's not just that Macs are less vulnerable, it's more that they're just not a worthwhile target. Sure, you can get some buddy's little dinky music program to shut down and steal their "Beats", or lock some artist out of their digital files. But there's no real valuable information on most Macs.

It's not just the smaller market share, it's also the "why bother" of infecting some douchebag hipster's Mac. If they can afford a Mac in the first place, they will probably just throw it out and buy a new one, cuz that's what Apple users do. It's not like you can repair or upgrade them.

I have a newer 13'' Lenovo Yoga, which is a wonderful little computer. It's very quick, and the price was nice. But I have no illusions that I can have this repaired if it breaks, or that I'll be able to do any upgrades to this notebook. That is just the way of the portable market now. About the only machines that are upgradable, or repairable, are gaming notebooks.

And no, I don't have any Apple notebooks. I just think it's stupid to fight over your choice of computer equipment.

Damn, damn it, now I am dogmatically required to hate those NASA guys.

Thanks a lot! And hey, for all you know they are making pretty pictures for press releases! Everyone knows JPL uses Linux!