Two things immediately caught my attention in this article: it apparently took at least four months after the breach itself to notify about it, and maybe even to detect it; and the fact that e-mail is hosted by Google with mandatory two-factor authentication doesn’t mean the same is true for the network as such. Also interesting is the fact that they closed the hackers’ access but don’t know how they were breached in the first place.
Besides that, if 2FA is in place, why do employees then need to change their passwords?
The 2FA story on the network becomes even shadier if you combine it with this article by AP (external link), which explicitly states the hack was performed using compromised credentials.
source: ZDNet (external link)

Wells Fargo Accidentally Releases Trove of Data on Wealthy Clients

Okay, so let me break this one down: you hire an external law firm to handle a specific case (or maybe multiple cases), the legal person who needs to send over some data relevant to the case has access to far more than she probably should have, copies everything “by mistake” and doesn’t even realise that the total size of the archive is a bit larger than what you would have expected the requested data to be.
And then to make matters worse, a Wells Fargo spokesperson says that they take the security and privacy of their clients “very seriously”. Yep, we can see that alright.
Root cause analysis questions: why did this external law firm have access to so much data? If that access was legitimate, why was there never a four-eyes principle implemented to make sure only data fit for release was actually sent? If the access was not required, why did the external firm have it at all?
As for the breach resolution and notification, does Wells Fargo actually know what data was sent over by mistake?
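The four-eyes principle mentioned above can be as simple as refusing any data export that a second, independent person has not signed off on. A minimal sketch of that control, with entirely hypothetical function and role names:

```python
def release_data(files, requester, approver):
    """Four-eyes principle: an export only proceeds when a second,
    independent person has reviewed and approved exactly these files."""
    if approver is None or approver == requester:
        raise PermissionError("independent second approval required")
    return {"files": list(files), "approved_by": approver}

# The requester alone cannot release anything:
try:
    release_data(["case_123.pdf"], requester="paralegal", approver="paralegal")
except PermissionError as e:
    print(e)  # independent second approval required

# With an independent approver, the export goes through:
print(release_data(["case_123.pdf"], requester="paralegal", approver="partner"))
```

The point of the check is not the code but the process: had a second pair of eyes compared the approved file list against what was actually being sent, an archive several times larger than expected would have stood out immediately.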
source: The New York Times (external link)

Details of 400,000 loan applicants spilled in UniCredit bank breach

Ten months is indeed on the longer end of data breach discovery, for sure. That an investigation was only started after the second breach, probably because the first one was discovered extremely late, is no more than damage control.
The more interesting bit in this article, however, is that the GDPR is named, and apparently before it comes into force companies don’t have to know where their data is or who accesses it, and there is no liability when things go wrong. This is incorrect.
What does change in May next year is that not knowing these things will, under the accountability principle, be a breach by itself. If that’s your only reason to do a data audit to find out what personal data you have, where, and with whom, then at least that part of the regulation is interpreted correctly. The current data protection regime, however, already requires you to know this; as said, not knowing won’t land you in trouble as long as nothing seems to go wrong, but under the GDPR it most certainly will.
source: The Register (external link)

CJEU limits transfer of sensitive personal data outside EU; what does that mean for Privacy Shield and Brexit?

Whilst not as directly and massively impactful as the ruling on Safe Harbor was in October 2015, the European Court of Justice affirmed once more its view of data protection as a fundamental right. Not only is the passenger name record (PNR) exchange with Canada not strictly defined enough, the ECJ is of the opinion that it is in direct violation of the Charter of Fundamental Rights of the European Union as well.
Also interesting to see that the ECJ seems to create additional categories of sensitive data in the telecoms sector. Something which will be of importance for the upcoming E-Privacy Regulation changes.
All in all this opinion will strengthen the rights of data subjects in the EU, and it also shows which direction ECJ rulings may go in the new cases the court will hear next year on standard contractual clauses, and perhaps, as a side note, on the EU-US Privacy Shield as well. As for Brexit and the UK’s adequacy: yes, the ECJ will remain a force to be reckoned with for the British post-Brexit too.
source: IAPP (external link)

Legal boffins poke holes in EU lawmaker’s ePrivacy proposals

As with any law-making process in the European back rooms, we will have to see what comes of this E-Privacy Regulation. Wherever it is itself in breach of the then stricter GDPR, legal uncertainty will follow and data subjects’ rights will automatically suffer as a consequence.
Some positive outcomes have been scored in recent months, with amendments to the draft proposals in favour of stricter privacy and data subjects’ rights, but it remains to be seen whether all of those make it into the final text.
With ever more pervasive and invasive online tracking and profiling techniques, it is about time we take back control of our personal data and, as a continent, of the fundamental human right to privacy we have steadily been losing to data brokers and online advertising giants.
source: The Register (external link)

The data breach notification to customers (note: as opposed to data subjects in Europe) looks somewhat similar to the notification requirements in the GDPR, although the notification to the data protection authorities in Europe is more stringent.
What is of more interest, though, is the fact that data processing in Singapore seems to rely on consent for the most part, something a lot of GDPR nitwits claiming to have the right knowledge also assert for the European law, which is simply not the case.
The final bit that caught my attention is the possibility to allow data sharing when consent is difficult to obtain. Whilst this seems reasonable, under strict conditions obviously, the example given in the article makes me wonder if they really understand it at all:
The example is: “For example, data collected by a pharmacy via its customer satisfaction survey – which may include name, age, gender and health products bought – can be shared with a research company for a study on health supplements as long as there is no adverse impact on the customer.”
So you are running a survey, yet you are somehow not able to request consent along with it? Really?
source: Channel NewsAsia (external link)

Using a blockchain doesn’t exempt you from securities regulations

Unregulated, worldwide and with no roll-back, so utterly trustworthy. Well, not really, as this story blatantly shows in every aspect.
One other misconception about blockchain technology is that it will replace the current financial system; with a limit of roughly 7 transactions per second as the absolute maximum, that’s almost impossible.
The other misconception, and lie, is that blockchain will enhance privacy. Since all transactions are replicated across distributed systems fully in the clear, privacy doesn’t enter into it at all. There are methods to add it, e.g. using techniques like zero-knowledge proofs, but those are far from trivial and will only add to the computational complexity of the system.
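That roughly 7 transactions per second ceiling follows directly from Bitcoin’s protocol parameters. A back-of-the-envelope sketch, assuming the classic 1 MB block size and an average transaction size of around 250 bytes (both approximations):

```python
# Rough Bitcoin throughput estimate from protocol parameters.
# Assumptions: 1 MB block size limit, ~250-byte average transaction,
# one block every ~10 minutes on average.
BLOCK_SIZE_BYTES = 1_000_000
AVG_TX_SIZE_BYTES = 250
BLOCK_INTERVAL_SECONDS = 10 * 60

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_SIZE_BYTES   # ~4000 transactions
tx_per_second = tx_per_block / BLOCK_INTERVAL_SECONDS  # ~6.7 tx/s

print(f"~{tx_per_block} tx per block, ~{tx_per_second:.1f} tx/s")
```

Compare that with the thousands of transactions per second a card payment network routinely handles, and the “replacement for the current financial system” claim collapses under simple arithmetic.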
source: Ars Technica (external link)

ALIS in Blunderland: Lockheed says F-35 Block 3F software to be done by year’s end

Yep, the saga continues indeed. After I wrote about the F-35 earlier this year, there is a new episode to the tale.
So the software makers say it should be completed by year’s end. Several audits say otherwise, and the list of vulnerabilities that can cause injury or death is quite substantial. Who would you believe?
source: The Register (external link)

Of course there are privacy concerns with these kinds of systems. In fact, any monitoring system that uses identifiers from either connected cars or Bluetooth devices will raise those concerns, which is not only obvious but also correct.
Bluetooth devices, as used in the examples in this article, transmit a unique device ID. Marketing tracking firms for shops already know this and are actively misusing it too.
Can this be done in a privacy-friendly way? Sure: if only the number of cars is of any interest, then don’t store the IDs you receive, only count them and show the counts on a map. The problem arises when you want to do traffic analysis, where you actually want to know how traffic moves through the area; in that case you will need to actively track the Bluetooth IDs themselves and register them with every sensor that detects them.
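The design choice between the two approaches can be sketched in a few lines. The sensor names and device IDs here are hypothetical, purely to illustrate the difference:

```python
def count_only(sightings):
    """Privacy-friendly: count distinct devices per sensor for one
    interval, then discard the IDs. Only aggregates leave this function."""
    per_sensor = {}
    for sensor, device_id in sightings:
        per_sensor.setdefault(sensor, set()).add(device_id)
    # Reduce to counts; the raw IDs are never stored or returned.
    return {sensor: len(ids) for sensor, ids in per_sensor.items()}

def track_movements(sightings):
    """Traffic-flow analysis: keeps the ID at every sensor, so each
    record is a movement trace tied to one device -- personal data."""
    traces = {}
    for sensor, device_id in sightings:
        traces.setdefault(device_id, []).append(sensor)
    return traces

# Hypothetical sightings: (sensor location, Bluetooth device ID)
sightings = [("north", "aa:11"), ("north", "bb:22"),
             ("south", "aa:11"), ("south", "cc:33")]
print(count_only(sightings))       # {'north': 2, 'south': 2}
print(track_movements(sightings))  # e.g. 'aa:11': ['north', 'south']
```

The first function answers “how many cars passed here?” without retaining anything identifiable; the second answers “how did traffic move?” but only by building per-device movement profiles, which is exactly where the data protection problem starts.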
The problem in this respect is that those Bluetooth IDs are personal data, certainly under the upcoming GDPR, and the tracking information plus the sensor locations constitute location data, which will even become a special category.
And then we are not even considering the automatic licence-plate readers in this picture. It will be interesting to see what happens if some citizens of this Danish town protest on the basis of data protection and privacy, specifically if that were to happen after May 25th 2018.
source: Ars Technica (external link)