Technology law blog by a Canadian information technology and intellectual property law lawyer and trade-mark agent dealing with issues including software, copyright, privacy, the Internet, electronic commerce, computers


Friday July 31 is the 16th annual SysAdmin Day – a day to show our appreciation to the IT professionals who keep our computers, networks and apps working. For those of us who push the tech envelope a bit beyond a typical office setup, our thanks to them for not rolling their eyes every time we ask for something new and different. And our thanks for using us as the test platform for new stuff.

In the interest of using your SysAdmin’s time most effectively, take a look at this amusing list.

From time to time various law enforcement and government types whine that encryption is a bad thing because it allows criminals to hide from authorities. That is usually followed by a call for security backdoors that allow government authorities to get around the security measures.

That’s a really bad idea – or as Cory Doctorow puts it in a post entitled Once Again: Crypto backdoors are an insane, dangerous idea: “Among cryptographers, the idea that you can make cryptosystems with deliberate weaknesses intended to allow third parties to bypass them is universally considered Just Plain Stupid.”

Backdoors build in a vulnerability waiting to be exploited – there are enough problems keeping things secure already. And the thought that government authorities can be trusted to use a backdoor only for the "right" purposes, and to keep it out of the hands of others, is wishful thinking.
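To see why cryptographers consider backdoors Just Plain Stupid, consider a toy key-escrow scheme – a minimal sketch, using repeating-key XOR as a stand-in for a real cipher, with function names of my own invention (no real system works exactly this way). The "backdoor" is simply a second copy of the session key, wrapped for an escrow authority, so anyone who obtains the escrow key gets exactly the same access as the intended recipient:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR 'cipher' -- for illustration only, not secure."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_with_escrow(message: bytes, recipient_key: bytes,
                        escrow_key: bytes) -> dict:
    """Encrypt under a fresh session key, wrapping that key twice:
    once for the recipient, and once for the escrow authority (the backdoor)."""
    session_key = secrets.token_bytes(16)
    return {
        "ciphertext": xor_bytes(message, session_key),
        "wrapped_for_recipient": xor_bytes(session_key, recipient_key),
        "wrapped_for_escrow": xor_bytes(session_key, escrow_key),
    }

def decrypt(package: dict, wrapping_key: bytes, slot: str) -> bytes:
    """Unwrap the session key from the given slot and decrypt.
    Works identically for the recipient and for ANYONE holding the escrow key."""
    session_key = xor_bytes(package[slot], wrapping_key)
    return xor_bytes(package["ciphertext"], session_key)
```

The point of the sketch: a stolen or leaked escrow key decrypts every message ever sent under the scheme, which is why a backdoor is a vulnerability by design rather than a feature.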

The Intercept has an article entitled Chatting in Secret While We’re All Being Watched that’s a good read for anyone interested in how to keep communications private. It was written by Micah Lee, who works with Glenn Greenwald to ensure their communications with Edward Snowden are private.

Even if you don’t want to read the detailed technical instructions on how to go about it, at least read the first part of the article that explains at a high level how communications can be intercepted, and the steps needed to stop that risk.

Communicating in secret is not easy. It takes effort to set up, and it's easy to slip up along the way. As is usually the case with any kind of security – physical or electronic – it's about raising the difficulty level for someone to breach it. The greater the effort someone might make to intercept your communications, the more work it takes to keep them secret. For example, you raise the sophistication level of the thief who might burglarize your house as you increase security – from locking your doors, to deadbolts, to break-resistant glass, to alarms, etc. It doesn't take much extra security to make a thief go to another house, but it may take a lot more if a thief wants something specific in yours.

Edward Snowden’s communications, for example, require very diligent efforts, given the resources that various authorities might use to intercept those communications.

It is common to register a corporate name as a trademark. That’s fine if it is actually used as a trademark – but mere use as a corporate name is not enough to amount to trademark use.

Similarly, mere use of the trademark within an email or other text is not enough if it looks like the rest of the text. It must somehow look different than the rest of the text.

For example, if your company name is Abcd Widgets Inc. and your trademark is ABCD, the use of Abcd Widgets Inc. is not use of the trademark. ABCD must be used independently. And in text, using abcd is not use, but using ABCD may be, as it looks different from the surrounding text (unless, of course, the rest is in all caps as well).

A common rebuke to self-driving cars is the thought of cars behaving like computers – freezing or rebooting while driving. Those make amusing sound bites or Twitter comments, but there is a grain of truth to them. Self-driving technology has come a long way, but while computers and software can follow programmed instructions, and can learn over time, humans are still better at many things.

An article in the New York Times entitled Why Robots Will Always Need Us does a good job of putting this in context, in part through the experience of the aircraft industry.

Author Nicholas Carr points out that:

Pilots, physicians and other professionals routinely navigate unexpected dangers with great aplomb but little credit. Even in our daily routines, we perform feats of perception and skill that lie beyond the capacity of the sharpest computers. … Computers are wonderful at following instructions, but they’re terrible at improvisation. Their talents end at the limits of their programming.

and

In 2013, the Federal Aviation Administration noted that overreliance on automation has become a major factor in air disasters and urged airlines to give pilots more opportunities to fly manually.

That’s not to say that we should smugly dismiss automation or technology. Lawyers, for example, who dismiss the ability of software to replace certain things we do are in for a rude awakening.

In general, computer code is never bug free, is never perfect, and is not able to do certain things. (You can say the same for us humans, though.) For example, the aircraft industry spends huge amounts of time and money testing the software that operates aircraft. On the other hand, the range of things computers can do well is increasing, and will continue to increase over time. At some point there may be breakthroughs that make computers more reliable and better at the things we humans are more adept at. But we are not there yet.

Today 88% of face-to-face transactions in Canada are Chip & PIN or contactless. Thanks to the layers of security built into the MasterCard network, Chip & PIN and contactless are safe, and fraud rates for Canadian face-to-face transactions have sharply declined.

While consumers should feel safe using their card all the time, they can further protect themselves by remaining diligent and taking precautions. Here are a few simple tips:

1. Don’t underestimate the strength of strong passwords. Make them complex with upper case letters, numbers and symbols, and change them from time to time. Use different passwords for different purposes, and ensure you have a means to recover passwords where applicable, such as a separate registered email address.

2. Shop with confidence online and visit reliable websites. eCommerce makes shopping more convenient than ever, but consumers should do their homework. Look for the SecureCode symbol from MasterCard at checkout, which adds a layer of security and ensures you are who you say you are online.

3. Be skeptical of unsolicited phone calls, emails, text messages, or social media messages that request credit card data or personal information such as passwords, date of birth, or social insurance number.

4. Do not click hastily on links within emails or on attachments sent by an unknown or unvalidated source, no matter how harmless or familiar the title appears. Instead, delete the message unless you can confirm the sender.

5. If you followed an email link to a website (or a text message to a voice recording system) and provided card data that later seemed suspicious, contact your card issuer immediately so your account can be protected.

6. Always use Chip & PIN, and tap to pay where applicable. You should be the only one who knows your PIN, and shield it from sight at checkout.

7. Keep an eye on your card statement. Sign up for online/e-statements and check regularly to make sure an unauthorized purchase was not processed. If you notice something, call your bank immediately. The number is always on the back of your credit card.

8. Be informed; know the facts about the layers of security built into your card’s payment network.
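The first tip above can even be put into practice programmatically. As a minimal sketch (in Python, using the standard library's `secrets` module; the function name is my own), here is one way to generate a password with the recommended mix of upper case letters, numbers and symbols:

```python
import secrets
import string

SYMBOLS = "!@#$%^&*"

def generate_password(length: int = 16) -> str:
    """Generate a random password containing upper case, lower case,
    digits, and symbols, per the tips above."""
    alphabet = string.ascii_letters + string.digits + SYMBOLS
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until the password contains at least one of each class.
        if (any(c.isupper() for c in pw)
                and any(c.islower() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in SYMBOLS for c in pw)):
            return pw
```

A password manager does the same job with less effort, but the sketch shows the principle: draw from a large alphabet using a cryptographically secure random source, not `random.choice`.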

I’ve written about smartwatches before. So far they have not been selling as fast as some expected. The marketplace still hasn’t sorted out the right combinations of features and price.

The Apple Watch is arriving in April. It will no doubt sell well – if for no other reason than it’s an Apple product.

The first real smartwatch was the Pebble, which broke Kickstarter records in 2012. Yesterday Pebble announced a new version, the Pebble Time, and launched a new Kickstarter campaign – but this time just to take pre-orders at a discount for May delivery, rather than to fund development.

If nothing else, it proved that there is tremendous interest in smartwatches. They achieved their $500,000 sales goal in about an hour, and the last I checked they were over $9,100,000, which translates to around 50,000 watches.

There has been a big kerfuffle in the last few days over the thought that Samsung smart TVs are listening to and recording TV watchers’ conversations via their voice command feature. That arose from a clause in Samsung’s privacy policy that said in part “…if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.”

Samsung has since clarified this language to explain that some voice commands may be transmitted to third parties to convert the command to text and make the command work, and to point out that you can simply turn that feature off. That is similar to how Siri, Google Now, Cortana, and other voice command platforms work. Some voice commands are processed locally, and some may require processing in the cloud. How much is done locally and how much in the cloud varies depending on the platform and the nature of the command.

While one should never reach conclusions based on press reports, the probability is that this issue was way overblown. But it does show how challenging privacy issues can get when it comes to technology and the internet of things (IoT).

Issues to ponder include:

The importance of designing privacy into tech – often called “Privacy by Design” – rather than trying to bolt it on later.

How complex privacy is in the context of modern and future technology, where massive amounts of data are being collected about us from almost everything: fitness trackers, web browsers, smartphones, cars, thermostats, and appliances. Not to mention government surveillance by agencies such as the NSA and the Canadian CSE.

The mothership issue – where does all that information about us go, how much of it is anonymised, what happens to it when it gets there, and who gets to see or use it?

How difficult it is to draft privacy language that protects the business from claims it did something outside its policy – while not suggesting that it does unwanted things with information – and while remaining clear and concise.

How difficult it is for the average person to understand what is really happening with their information, and how much comfort comes – or doesn’t come – from a trust factor rather than a technical explanation.

How easy it is for a business that may not be doing anything technically wrong – or that may be doing the same as everyone else – to become vilified for perceived privacy issues.

Have we lost the privacy war? Are we headed to a big brother world where governments and business amass huge amounts of information about us with creeping (and creepy) uses for it?

Are we in a world of tech haves and have-nots, where those making the most use of tech will be the ones willing to cross the “freaky line” – where the good from the use outweighs the bad from a privacy perspective?

Are we headed to more situations where we don’t have control over our personal freaky line?

The US FTC just released a report entitled internet of things – Privacy & Security in a Connected World. It’s a worthwhile read for anyone interested in the topic, and should be a mandatory read for anyone developing IoT devices or software. A summary of it is on JDSupra.

The conclusion of the FTC report reads in part:

The IoT presents numerous benefits to consumers, and has the potential to change the ways that consumers interact with technology in fundamental ways. In the future, the Internet of Things is likely to meld the virtual and physical worlds together in ways that are currently difficult to comprehend. From a security and privacy perspective, the predicted pervasive introduction of sensors and devices into currently intimate spaces – such as the home, the car, and with wearables and ingestibles, even the body – poses particular challenges.

In essence, the FTC states that security and privacy must be designed into the devices, data collected must be minimized (at least in respect to consumer data), and people need to be given notice and choice about the collection of data.

From the Privacy Commissioner of Canada: “On January 28, Canada, along with many countries around the world, will celebrate Data Privacy Day. Recognized by privacy professionals, corporations, government officials, academics and students around the world, Data Privacy Day highlights the impact that technology is having on our privacy rights and underlines the importance of valuing and protecting personal information.”

Privacy becomes increasingly challenging with new tech such as big data, the internet of things, wearable computers, drones, and government agencies recording massive amounts of data in the name of security. Sober thought needs to go into balancing the advantages of such things against privacy rights, creating them in a privacy-sensitive way, and giving people informed choices.