
My Friend Cayla’s in trouble again: the smart interactive doll is too blabby and eavesdroppy to put under the Christmas tree, the French data privacy watchdog said on Monday.

The Commission Nationale de l'Informatique et des Libertés (CNIL) announced that it has served formal notice on Genesis Industries over the Bluetooth-enabled talking/listening doll Cayla, along with her Bluetooth buddy, the i-Que robot. CNIL demands that the company cease its “serious breach of privacy” caused by the toys’ lack of security.

Both toys listen to children as they ask questions on subjects such as mathematical calculations or weather forecasts. Cayla and i-Que are equipped with microphones and speakers and use Bluetooth to communicate with a mobile app on smartphones or tablets. Off the app goes to the internet when it hears a question; back it comes to hand over the information to a child.

Information or, as the case may be, whatever a hacker programs it to say. We learned in February 2015 that Cayla was suffering from noxious cloud syndrome: the doll had a software vulnerability that allowed it to be programmed to say anything at all – from Hannibal Lecter quotes to lines from Fifty Shades of Grey.

In addition, according to security researcher Ken Munro, any nearby device could connect with the doll via Bluetooth and therefore communicate with a child.

A consumer association gave CNIL a heads-up about lack of security in both toys a year ago. CNIL decided to do its own online investigations into what was happening to data the toys sent into the cloud. It also sent a questionnaire to Genesis, a Hong Kong company, in March 2017.

CNIL found that the toys collect plenty of personal data about children, their families and their friends, including their voices and the content of their conversations with the toys (which CNIL found can reveal identifying data such as addresses and names), as well as information entered into the form in the “My Friend Cayla” companion app.

It turns out that anybody within nine meters of the toys, even outside the building, can pair a mobile phone with them over Bluetooth: no login, no PIN code, and no button press on the toy required. After that, whoever has paired with the toys can listen in on and record conversations between children and their toys, along with any other conversation taking place nearby.

Apparently, if you then place a call to the phone that’s sneakily paired with the toy, what you say into the calling phone will be relayed to the toy by the called phone, effectively giving you a two-way conversation.
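What CNIL is describing amounts to unauthenticated Bluetooth pairing: a microphone-equipped device that demands neither a PIN nor a physical confirmation will happily pair with any phone in radio range. As a minimal illustrative sketch (the `PairingPolicy` model and `is_eavesdropping_risk` helper below are hypothetical, not anything from the toys’ actual firmware), this is the kind of check a security reviewer might apply:

```python
from dataclasses import dataclass

@dataclass
class PairingPolicy:
    """Security properties of a Bluetooth device (hypothetical model)."""
    requires_pin: bool      # must the user enter a PIN to pair?
    requires_button: bool   # must a button be pressed on the device?
    has_microphone: bool    # can the device capture audio?

def is_eavesdropping_risk(policy: PairingPolicy) -> bool:
    """A device with a microphone that pairs with neither a PIN nor a
    physical confirmation lets any phone in radio range listen in."""
    unauthenticated = not (policy.requires_pin or policy.requires_button)
    return policy.has_microphone and unauthenticated

# The doll as described in the CNIL findings: mic, no PIN, no button.
doll = PairingPolicy(requires_pin=False, requires_button=False,
                     has_microphone=True)
print(is_eavesdropping_risk(doll))     # True

# A headset that requires a button press to enter pairing mode is safer.
headset = PairingPolicy(requires_pin=False, requires_button=True,
                        has_microphone=True)
print(is_eavesdropping_risk(headset))  # False
```

The point of the sketch is simply that the risk comes from the combination: a microphone alone is fine, and open pairing alone is merely sloppy, but the two together turn a toy into a remote listening device.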

Again, we’ve known this about Cayla for a while. For its part, CNIL has concluded that the toys’ lack of security breaches Article 1 of the French Data Protection Act. Back in February, Germany’s telecoms watchdog, the Bundesnetzagentur, called Cayla an “illegal espionage apparatus” that parents should destroy, banning the doll on the grounds that it can covertly collect and transmit data in violation of privacy laws.

The CNIL was also concerned about the lack of clarity parents get about how Genesis processes the personal data the toys drink in. Nor are parents informed that the company transfers conversations to a service provider in a non-EU country.

Genesis Industries Ltd. has two months to comply with the Data Protection Act, which stipulates that technology “shall not violate human identity, human rights, privacy, or individual or public liberties”.

Meanwhile, Cayla doesn’t have any privacy advocate friends in the US, either. Several consumer complaints have been lodged with the Federal Trade Commission (FTC), including this one from the Electronic Privacy Information Center (EPIC).

From EPIC’s complaint:

The failure to employ basic security measures to protect children’s private conversations from covert eavesdropping by unauthorized parties and strangers creates a substantial risk of harm because children may be subject to predatory stalking or physical danger.

Cayla also made it into this year’s annual Trouble in Toyland report from the US Public Interest Research Group (PIRG), a federation of consumer nonprofits.

This is at least her second appearance as a PIRG Troubled Toy. She joins the ranks of lead-filled fidget spinners, balloons that kids can easily inhale and choke on, and hoverboards that have been blamed for house fires that have killed two girls and a firefighter.

Make sure Santa’s reading up on those toys he delivers: clearly, some of them are dangerous.
