
51 percent of respondents, including a majority of Millennials, believe stormy weather can interfere with cloud computing.

Technically, weather can cause your internet connection to go down, so yes, it can interfere with your access to cloud computing; if you can't access it, for all intents and purposes it doesn't exist. I'd further argue that any remotely decent data center is not impacted by mere "stormy weather"; it would take something along the lines of an "act of God". A notable difference.

They also focused on:

You’re not alone: While many admit they don’t understand the cloud, 56 percent of respondents say they think other people refer to cloud computing in conversation when they really don’t know what they are talking about.

Again, I'd argue this is no big deal. You shouldn't need to know what a utility is any more than you need to know the molecular makeup of natural gas; you just need to know how to safely operate a stove. Cloud computing is turning computing into a utility. It removes the complexities (gathering firewood, to stick with the stove example).

The part that gets me is what was ignored by seemingly everyone else (emphasis mine):

Softer advantages, like working from home in the buff: People offered additional, unexpected benefits of the cloud, including the ability to access work information from home in their “birthday suit” (40 percent); tanning on the beach and accessing computer files at the same time (33 percent); keeping embarrassing videos off of their personal hard drive (25 percent); and sharing information with people they’d rather not interact with in person (35 percent).

We've failed miserably as technology professionals if 25% of the population thinks putting their embarrassing videos in the cloud is a good way to keep them private. This is akin to 25% of the population saying they trust random Nigerian emails for their banking needs.

If I were Apple, or Microsoft, or anyone else in the market, I'd be asking myself how to fix this misconception and make security on the desktop visibly, as well as technologically, superior. Perhaps make disk encryption standard, at least for user data, which could sit on its own partition (especially since adjusting partitions isn't difficult these days). A lot of privacy is lost in the cloud. It's also potentially not subject to many of the protections US law provides to physical property in terms of search, as I've mentioned before.

There's another attack on Java via a new zero-day flaw. This is why I no longer keep Java enabled in web browsers. If you still do, I'd suggest turning it off. There's a good chance you won't miss it.

I've yet to get there with Flash, but the day is coming. Since writing a previous post a few months ago, I've come to like the idea of a blacklist/whitelist for plugins in general, letting a user enable them only for specific hostnames. That would make it a bit more intuitive to use plugins where still needed, while gaining the security of not having them available for any hostname you happen to stumble upon. The options would be something like: always allow a plugin, allow it only on whitelisted hostnames, or never allow it.
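A minimal sketch of such a per-hostname policy, in Python for illustration (the plugin names, hostnames, and data structure are all hypothetical, not an actual browser API):

```python
# Hypothetical per-plugin whitelist: a plugin loads only on
# hostnames the user has explicitly approved.
PLUGIN_WHITELIST = {
    "flash": {"www.youtube.com", "vimeo.com"},
    "java": {"intranet.example.com"},
}

def plugin_allowed(plugin: str, hostname: str) -> bool:
    """Return True only if this plugin is whitelisted for this hostname."""
    return hostname in PLUGIN_WHITELIST.get(plugin, set())

# A plugin with no whitelist entry never loads: secure by default.
print(plugin_allowed("flash", "vimeo.com"))           # True
print(plugin_allowed("java", "random-blog.example"))  # False
```

The important property is the default: a hostname (or plugin) not explicitly listed gets nothing, which is the inverse of how browser plugins worked at the time.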

According to the US Department of Justice (DOJ) the iPhone is largely uncrackable at this point:

“I can tell you from the Department of Justice perspective, if that drive is encrypted, you’re done,” Ovie Carroll, director of the cyber-crime lab for the CCIPS division of the Department of Justice, said earlier this month during his presentation at DFRWS. “When conducting criminal investigations, if you pull the power on a drive that is whole-disk encrypted you have lost any chance of recovering that data.”

Of course there are a fair number of tools out there for iOS 4 and below, including UFED Ultimate and XRY. There is a lack of iOS 5 tools, at least any that are being publicly advertised.

However, there's arguably little need for such a tool anymore. As users put data in "the cloud", law enforcement doesn't even need the physical phone; they can just send a request to Apple (or Google) for the data they want. I suspect this is at least part of what Steve Wozniak was talking about when he mentioned "horrible problems" in the next five years. It's worth noting Apple has almost zero transparency regarding law enforcement requests and how they are vetted. It's not even clear a warrant is necessary to request data; the law certainly isn't clear in that regard.

If anything, I think it’s becoming easier for law enforcement, not harder.

Rob "CmdrTaco" Malda (of Slashdot fame) did an IAmA on Reddit. It's interesting overall, but this particular answer caught my attention. When asked "How do you see a company like Google using the data it collects, and specifically your interest in Google+?":

Data is just data. When I ran Slashdot, I logged everything I practically could because without it, I couldn’t make informed decisions.

What corporations DO with that data will pretty much define the future of the internet. I don’t think people truly understand the implications.

The biggest databases- things like Facebook and Google could be used in so many awful ways. But I collected all the data I possibly could too, and I really tried to impress upon everyone I worked with the RESPONSIBILITY that this data entails. Like, just because you gave me your email address is not permission for me to sell it or spam you.

I hope that the giants play nicely. I want the cool shiny things they make… and I’ll give them the benefit of the doubt in most cases.

The only thing I disagree with is that he didn't mention anything about disclosing to users how that data is used and will be handled. Of course, current privacy policies suck; they need to be understandable without a law degree.

The UK Government wants ISPs to record the secure transmission of messages on services like Facebook and Gmail, which currently use SSL. I'd be curious to know how the UK government actually plans to pull this off. They'd need to get browsers to include their root certificate so they could MITM Gmail and Facebook. I can't see that happening.

Of course, anyone really intent on doing something criminal will just use a VPN to tunnel past these ISPs, or encrypt messages using GPG. So I don't see what the point is.

We do many things throughout the day. Most of the time we don't give them much thought; often they are repetitive tasks we do every day, our "routine" as we call it. It may be that mid-day bathroom break, or that coffee break. Or it might be those n Google searches throughout the day. You might be able to name some of them and put a count to them, but stop and think for a second: for how many things do you actually know how many times you performed them? How much time was spent? How much energy and expense?

Companies collect this information, but strangely individuals don't. The companies we deal with often know more about us than we do. Google knows how many times you searched in a given day. It may (depending on your privacy settings) be able to recall every search you ever made, a feat I bet you can't perform. Your credit card company knows how many times you purchased coffee at a given store in a given year. You quite possibly have no idea.

Stephen Wolfram has been analyzing his life for years, even just tiny aspects of it, and the data is stunning. It makes you wonder why we don't have more products out there that give us access to and control of our own data. Everyone else has more access to it than we have.

Collusion is a Firefox extension that gives another little bit of insight into who knows where you've been online. Try installing it and running it for a week; it's fascinating to see. But so much in the browser still isn't exposed to the user. Your search history knows what you searched for. Your browser history knows when you browse the web and where you go. There's a mountain of data there; the authorities use it when a crime is committed for a reason. about:me is a great extension for getting a little more of this information out of Firefox. It's a fascinating area where I hope we'll see more people spend time. The great thing about these tools is that they are client-side and private: you don't need to give your data away to someone else in order to learn about yourself.

However, we're still in the infancy of personal analytics. There are very few products out there to tell us what we do all day. FitBit can tell you when you sleep, when you're active, and how active you are, but not terribly much else about you. Your computer has a wealth of info but really doesn't tell you much, and to get even a little out of it you need to be fairly technically adept.

I propose it’s time to encourage people to start learning more about themselves. Data is amazing and can change our behavior for the better. Data is all around us yet somehow it eludes us. Big companies know things about us that even we don’t know. Perhaps it’s time to change that?

“So far no credible high profile attack has been recorded but we are seeing evidence of basic spoofing, likely carried out by rogue individuals or small groups,” Humphreys explains. “Whilst the leap to more advanced, untraceable spoofing is large, so are the rewards. It’s therefore guaranteed that criminals are looking at this. All it takes is one person to put one together and publish it online and we have a major problem.”

Iran claims to have already done this to bring down a drone intact. There's no public confirmation or evidence to prove whether this is actually what happened.

The reality is that messing up people's phone or car navigation amounts to relatively benign mayhem. Disrupting military systems, aircraft, or financial systems is a much larger concern.

Gatekeeper is without question a bold move to prevent malware from impacting Mac OS X, but it will likely turn into a legal and ethical mess. Before I explain why, I’ll give a very high level overview. There are three options:

Mac App Store – Only run applications from the Mac App Store.

Mac App Store and identified developers – Only run applications from the Mac App Store and developers who sign up with Apple to get a key.

Anywhere – This is how every Mac and PC today operates out of the box.

The default in Mountain Lion is Mac App Store and identified developers. As Macworld's Jason Snell explains:

Apple says, if a particular developer is discovered to be distributing malware, Apple has the ability to revoke that developer’s license and add it to a blacklist. Mountain Lion checks once a day to see if there’s been an update to the blacklist. If a developer is on the blacklist, Mountain Lion won’t allow apps signed by that developer to run.

It's worth noting that, at least today, the authentication is only done on first run, from what I've read. However, it wouldn't be hard for Apple to later check an application against the blacklist on every run. That could even happen before the feature ships this summer.

What's concerning is that Apple will now essentially be the gatekeeper (get it?) and thus pressured to control what users can or can't install on their computers. Let's be honest: most developers will never get their users to open System Preferences and change this setting, so getting "identified" is essentially required to develop on Mac OS X if you want more than geeks to use your software.

Apple has been pressured in the past to remove apps from the iOS App Store. It's likely (read: guaranteed) to be pressured to blacklist developers who write controversial apps. Anything that could be used for piracy, from a BitTorrent client to VLC, which uses libdvdcss (the library has never been legally challenged, AFAIK, but pressuring Apple is a way around the court system), could be targeted. Apple has a bit of a history of banning apps for all sorts of reasons, including being negative towards Apple.

How would Apple deal with pressure from patent claims? What about a desktop client for WikiLeaks, like the one that was pulled from the App Store? What about a game distributed by Planned Parenthood or some other organization that tends to draw controversy? There are also international issues here (Nazi imagery and Germany, privacy violations and the EU). What about more indirect things like Firefox, which can run third-party code via plugins and add-ons? Mozilla refused to kill MafiaaFire. Could the Feds have gone to Apple instead?

These are all technically hypothetical situations, since the feature hasn't even launched and Apple hasn't given any clear policies. That, in my opinion, is the big problem. As far as I know, Apple hasn't published any guidelines on what would put a developer on the blacklist. Is there even an appeals process?

I’m pretty sure we’ll learn more over the coming weeks. The cool guys over at Panic are pretty optimistic about the feature, so I guess we’ll see.

I've been doing some reading on best practices for SSL. From what I can gather, and from watching what the big sites are doing, the following seems to be the best practice as of today. This assumes you're in an OpenSSL 0.9.x (via mod_ssl) and Apache 2 world, which is the majority of Linux/Unix-based environments. Use a cert with a 2048-bit key, SHA1-signed, which is now pretty much standard.
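The configuration itself didn't survive in this copy. A reconstruction of typical mod_ssl advice from that era might look like the following; the exact cipher string is my assumption (RC4-first ordering was the common BEAST mitigation at the time), not necessarily what was originally published:

```
# Disable SSLv2 entirely; SSLv3/TLS remain available
SSLProtocol all -SSLv2

# Make the server's cipher preference order win over the client's
SSLHonorCipherOrder On

# Prefer RC4 (a stream cipher, not vulnerable to the CBC-based
# BEAST attack), then strong suites; exclude anonymous and weak ciphers
SSLCipherSuite RC4-SHA:HIGH:!ADH:!aNULL:!MD5:!LOW:!EXP
```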

That will disable potentially insecure ciphers and help mitigate the BEAST attack. Note that this disables SSL 2.0, which shouldn't matter for the vast majority of visitors; few websites still support it anyway.

If you’ve got a site where login is required, please make sure to use SSL. It’s not that costly anymore. Even this blog uses SSL where necessary.

It's not needed for general public consumption (like this webpage), but anywhere a session could be hijacked or confidential data could be transferred, SSL is a good idea. When defaulting to it isn't possible, at least make it an option (I do this for safepasswd.com).
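For the Apache world this post assumes, one common way to make SSL the default where it matters is a mod_rewrite redirect; the `/login` path here is purely illustrative:

```
# Send any plain-HTTP request for the login area to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^/login(.*)$ https://%{HTTP_HOST}/login$1 [R=301,L]
```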