The TCPA remains a hotbed of class action litigation, with new cases filed across the country. The viability of a case frequently turns on whether the defendant used an automatic telephone dialing system (“ATDS” or “autodialer”). Given the broad definition adopted by the FCC, many courts have held that any device with the mere capacity to autodial falls within the definition of an ATDS. As one well-known company argued in an amicus brief on the issue, such a broad interpretation of ATDS would encompass virtually any device, including a toaster oven.

Last week, a Northern District of California court acknowledged this absurdity when it granted summary judgment for GroupMe, Inc. (“GroupMe”) on a TCPA claim, finding there was no triable issue of fact as to whether GroupMe used an autodialer. Glauser v. GroupMe, Inc., 2015 WL 475111 (N.D. Cal. Feb. 4, 2015). The case involved GroupMe’s group messaging application, through which users can create groups whose members can all send text messages to each other. One user created a “Poker” group and added plaintiff to that group. Plaintiff received two text messages welcoming him to the group and instructing him how he could opt out. Plaintiff did not respond, but received more text messages in which group members discussed their availability for a poker game. Because plaintiff had not responded, GroupMe sent him a text message notifying him that he would be removed from the group unless he replied. Plaintiff indeed responded “In,” which reinstated him as part of the group.

Despite joining the Poker group, plaintiff cried foul and sued under the TCPA. GroupMe moved for summary judgment on the issue of whether it used an autodialer. The court made three findings in ruling in favor of GroupMe. First, the court agreed that whether equipment has the “capacity” to autodial depends on the device’s present capacity, not its potential capacity, noting that to accept a “potential capacity” argument would impermissibly allow the TCPA to capture common devices, such as smartphones. Second, the court held that autodialers include not just dialers that can generate numbers randomly or sequentially, but also predictive dialers. Finally, the court ruled that an autodialer must have the capacity to dial numbers without human intervention. Because all of GroupMe’s text messages were triggered by the original GroupMe user’s creation of the “Poker” group, human intervention was necessary, and GroupMe did not use an autodialer. Absent an autodialer, plaintiff’s case failed on summary judgment.

The GroupMe case demonstrates that TCPA actions may be brought for nearly any conduct involving text messages. Despite affirmatively joining in the “Poker” group and reaping the benefits of the GroupMe service, plaintiff still sued. The ruling also demonstrates that more courts are inclined to scale back what constitutes an ATDS under the TCPA. The FCC may weigh in on the issue, as petitions remain pending. Until then, expect more uncertainty on the scope of an ATDS under the TCPA.

Yesterday afternoon, President Barack Obama gave a quip-filled speech at the Federal Trade Commission in which he praised the FTC’s efforts in protecting American consumers over the past 100 years and unveiled his plans for legislation to protect American consumers from identity theft and to protect schoolchildren’s personal information from being used by marketers. These plans build upon past legislative efforts and the Administration’s focus on cybersecurity, Big Data, and consumer protection. Specifically, on February 23, 2012, the White House released “Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy” (the “Privacy Blueprint”), and in January 2014, President Obama asked his Counselor, John Podesta, to lead a working group to examine Big Data’s impact on government, citizens, businesses, and consumers. The working group produced Big Data: Seizing Opportunities, Preserving Values on May 1, 2014.

In his speech, the President highlighted the need for increased privacy and security protections as more people go online to conduct their personal business—shop, manage bank accounts, pay bills, handle medical records, manage their “smart” homes, etc.—stating that “we shouldn’t have to forfeit our basic privacy when we go online to do our business”. The President referenced his “Buy Secure” initiative, which would combat credit card fraud through a “chip-and-PIN” system for credit cards and credit-card readers issued by the United States government. In that system, a microchip embedded in the credit card would replace the magnetic strip, since microchips are harder than magnetic strips for thieves to clone. The consumer would also need to enter a PIN into the credit card reader, just as with an ATM or debit card. The President praised those credit card issuers, banks, and lenders that allow consumers to view their credit scores for free. He also lauded the FTC’s efforts to help identity theft victims by working with credit bureaus and by providing guidance to consumers on its website, identitytheft.gov.

The first piece of legislation the President discussed briefly was a comprehensive breach notification law that would require companies to notify consumers of a breach within 30 days and would allow identity thieves to be prosecuted even when the criminal activity occurred overseas. Currently, there is no federal breach notification law; instead, many states have laws requiring companies to notify affected consumers and/or regulators, depending on the type of information compromised and the jurisdiction in which the organization operates. The state laws also require that breach notification letters to consumers include certain information, such as information on the risks posed to the individual as a result of the breach, along with steps to mitigate the harm. This “patchwork of laws,” President Obama noted, is confusing to customers and costly for companies to comply with. The plan to introduce a comprehensive breach notification law adopts the policy recommendation from the Big Data Report that Congress pass legislation providing for a single national data breach standard along the lines of the Administration’s May 2011 cybersecurity legislative proposal. Such legislation should impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

The President next discussed the second piece of legislation he would propose, the Consumer Privacy Bill of Rights. This initiative is not new. Electronic Privacy Bills of Rights were introduced in 1998 and 1999. In 2011, Senators John Kerry, John McCain, and Amy Klobuchar introduced S.799 – Commercial Privacy Bill of Rights Act of 2011. The Administration’s Privacy Blueprint of February 23, 2012 set forth the Consumer Privacy Bill of Rights and, along with the Big Data Report, directed the Department of Commerce’s National Telecommunications and Information Administration (NTIA) to seek comments from stakeholders in order to develop legally enforceable codes of conduct that would apply the Consumer Privacy Bill of Rights to specific business contexts.

The Big Data Report of May 1, 2014 recommended that the Department of Commerce seek stakeholder and public comment on big data developments and how they impact the Consumer Privacy Bill of Rights draft, and consider legislative text for the President to submit to Congress. On May 21, 2014, Senator Robert Menendez introduced S.2378 – Commercial Privacy Bill of Rights Act of 2014. The Consumer Privacy Bill of Rights sets forth seven basic principles:

1) Individual control – Consumers have the right to exercise control over what information data companies collect about them and how it is used.

2) Transparency – Consumers have the right to easily understandable and accessible privacy and security practices.

3) Respect for context – Consumers expect that data companies will collect, use, and disclose the information they provided in ways consistent with the context in which it was provided.

4) Security – Consumers have the right to secure and responsible handling of personal data.

5) Access and accuracy – Consumers have the right to access and correct their personal data in usable formats in a manner that is appropriate to the data’s sensitivity and the risk of adverse consequences if the data is not accurate.

6) Focused Collection – Consumers have the right to reasonable limits on the personal data that companies collect and retain.

7) Accountability – Consumers have the right to have their data collected and used by companies that maintain appropriate measures to ensure compliance with the Consumer Privacy Bill of Rights.

The President next discussed the third piece of legislation he would propose, the Student Digital Privacy Act. The President noted how new educational technologies—including tailored websites, apps, tablets, digital tutors, and digital textbooks—transform how children learn and help parents and teachers track students’ progress. With these technologies, however, companies can mine student data for non-educational, commercial purposes such as targeted marketing. The Student Digital Privacy Act adopts the Big Data Report’s policy recommendation of ensuring that students’ data, collected and gathered in an educational context, is used for educational purposes and that students are protected against having their data shared or used inappropriately. The President noted that the Student Digital Privacy Act would not “reinvent the wheel” but would mirror, at the federal level, state legislation—specifically the California law taking effect next year that bars education technology companies from selling student data or using that data to target students with ads. The current federal law that protects students’ privacy is the Family Educational Rights and Privacy Act of 1974, which protects official educational records from being released by schools but does not protect against companies’ data mining that reveals students’ habits and profiles for targeted advertising. The President highlighted current self-regulation, the Student Privacy Pledge, signed by 75 education technology companies voluntarily committing not to sell student information or use education technologies to send students targeted ads. Commentators have questioned whether self-regulation will work and whether the proposed Act would go far enough.
The President remarked that parents want to make sure that children are being smart and safe online, and that while it is their responsibility as parents to do so, structure is needed so that parents can ensure that information is not gathered about students without the parents or the kids knowing about it. This hinted at a notification requirement and opt-out for student data mining that is missing from state legislation but is a requirement of the Children’s Online Privacy Protection Act of 1998 (COPPA). Specifically, COPPA requires companies and commercial website operators that direct online services to children under 13, collect personal information from children under 13, or know they are collecting personal information from children under 13 to provide parents with notice about the site’s information-collection practices, obtain verifiable consent from parents before collecting personal information, give parents a choice as to whether the personal information will be disclosed to third parties, and give parents access to and the opportunity to delete their children’s personal information, among other things.

President Obama noted that his speech marked the first time in 80 years—since FDR—that a President has come to the FTC. His speech at the FTC on Monday was the first stop of a three-part tour leading up to his State of the Union address. Next, the President planned to speak at the Department of Homeland Security on how the government can collaborate with the private sector to ward off cybersecurity attacks. His final speech will take place in Iowa, where he will discuss how to bring faster, cheaper broadband access to more Americans.

Happy New Year! For many, the holidays included exciting new gadgets. Whether it’s a new fitness tracker, a smart thermostat, or a smart glucose meter, these new connected devices have arrived, and new products are on the horizon. These products, termed the “Internet of Things” by privacy professionals, are broadly defined as products that can connect to a network.

On January 6, 2015, FTC Chairwoman Edith Ramirez delivered the opening remarks at the International Consumer Electronics Show, during which she spoke on security issues surrounding the Internet of Things (“IoT”). Chairwoman Ramirez discussed what she viewed as three key risks to consumer privacy, along with suggested industry solutions to mitigate those risks.

IoT Risks to Consumer Privacy
The first privacy and security risk Chairwoman Ramirez identified was that connected devices engage in “ubiquitous data collection.” Because these devices can collect personal information—including our habits, location, and physical condition—the resulting data can yield rich profiles of consumer preferences and behavior.

The second risk Chairwoman Ramirez identified was the possible unexpected use of consumer data acquired through connected devices. As an example, she asked whether data from a smart TV’s tracking of consumer television habits could be combined with other data to enable businesses to engage in targeted advertising or even exacerbate socio-economic disparities.

The third risk she identified was that connected devices can be hijacked, leading to misuse of personal information.

Suggested Industry Solutions
To combat the risks identified above, Chairwoman Ramirez suggested three solutions for the IoT industry. First, IoT companies should engage in “Security by Design,” namely that IoT products should be built initially with a priority on security, and that IoT companies should implement technical and administrative measures to ensure reasonable security. Chairwoman Ramirez identified five aspects of Security by Design:

conduct a privacy or security risk assessment as part of the design process;

test security measures before products launch;

use smart defaults—such as requiring consumers to change default passwords in the set-up process;

consider encryption, particularly for the storage and transmission of sensitive information, such as health data; and

monitor products throughout their life cycle and, to the extent possible, patch known vulnerabilities.
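The “smart defaults” item above can be made concrete with a small sketch. The function below is a hypothetical example of a device setup flow that refuses to complete until the factory default password has been replaced with something non-default and minimally strong; it is illustrative only and not any specific vendor’s implementation:

```python
# Hypothetical "smart default" check: setup cannot finish while the
# device still uses (or re-uses) its shipped factory password.

FACTORY_DEFAULT = "admin"  # assumed factory-set password for this sketch
MIN_LENGTH = 8             # assumed minimum password length

def complete_setup(current_password: str, new_password: str) -> bool:
    """Return True only if the new password is acceptable."""
    if new_password == FACTORY_DEFAULT:
        return False  # still the shipped default -- reject
    if new_password == current_password:
        return False  # password unchanged -- reject
    if len(new_password) < MIN_LENGTH:
        return False  # too short to be a reasonable default -- reject
    return True       # setup may proceed
```

A real device would also hash and store the new credential rather than compare plaintext, but the gating logic—no completion on a default password—is the point Chairwoman Ramirez’s remarks emphasize.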

Second, Chairwoman Ramirez suggested that companies that collect personal information should engage in data minimization, viz. that they should collect only the data needed for a specific purpose and then safely destroy that data afterwards. Chairwoman Ramirez also urged companies to de-identify consumer data where possible.
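One common de-identification technique consistent with this recommendation is replacing direct identifiers with one-way pseudonyms before data is stored or shared. The following is a minimal sketch, assuming a keyed hash (HMAC) so that pseudonyms cannot be reversed or recomputed without the secret key; the field names and key are hypothetical:

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier (e.g. an email address) with a
    keyed one-way pseudonym. Without the secret key, the pseudonym
    cannot be linked back to the original identifier."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: de-identify a fitness-tracker record before sharing it.
key = b"secret-key-kept-offline"  # hypothetical key, stored separately
record = {"user": "alice@example.com", "steps": 9214}
deidentified = {"user": pseudonymize(record["user"], key), "steps": record["steps"]}
```

Because the same identifier maps to the same pseudonym under a given key, analysts can still correlate records for one user without ever seeing who that user is—while rotating or destroying the key severs that link entirely.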

Finally, Chairwoman Ramirez suggested that IoT companies provide notice and choice to consumers for unexpected collection or uses of their data. As an example, Chairwoman Ramirez stated that if IoT companies are sharing data from a smart thermostat or fitness band with data brokers or marketing firms, those companies should provide consumers with a “simple notice of the proposed uses of their data and a way to consent.”

Although not official FTC statements, these remarks by Chairwoman Ramirez provide valuable insight into how the Federal Trade Commission may regulate connected devices in the future. Companies in the IoT space should monitor further developments closely and review their data collection, security, and sharing practices accordingly.

Canada’s Anti-Spam Law (CASL) targets more than just email and text messages

In our previous post, we explained that on July 1, 2014, Canada’s Anti-Spam Law (CASL) had entered into force with respect to email, text and other “commercial electronic messages”.

CASL also targets “malware”. It prohibits installing a “computer program” – including an app, widget, software, or other executable data – on a computer system (e.g. computer, device) unless the program is installed with consent and complies with disclosure requirements. The provisions in CASL related to the installation of computer programs will come into force on January 15, 2015.

Application outside Canada

Like CASL’s email and text message provisions, the Act’s “computer program” installation provisions apply to persons outside Canada. A person contravenes the computer program provisions if the computer system (computer, device) is located in Canada at the relevant time (or if the person is in Canada or is acting under the direction of a person in Canada). We wrote about CASL’s application outside of Canada here.

Penalties

The maximum penalty under CASL is $10 million for a violation of the Act by a corporation. In certain circumstances, a person may enter into an “undertaking” to avoid a Notice of Violation. Moreover, a private right of action will become available to individuals as of July 1, 2017.

CASL’s broad scope leads to fundamental questions – how does it apply?

The broad legal terms “computer program”, “computer system”, and “install or cause to be installed” have raised many fundamental questions among industry stakeholders. The CRTC – the Canadian authority charged with administering this new regime – seems to have gotten the message. The first part of the CRTC’s response to FAQ #1 in its interpretation document CASL Requirements for Installing Computer Programs is “First off, don’t panic”.

New CRTC Guidance

The CRTC has clarified some, but not all of the questions that industry stakeholders have raised. CRTC Guidance does clarify the following.

Self-installed software is not covered under CASL. CASL does not apply to owners or authorized users who are installing software on their own computer systems – for example, personal devices such as computers, mobile devices or tablets.

CASL does not apply to “offline installations” – for example, where a person installs software from a CD or DVD purchased at a store.

Where consent is required, it may be obtained from an employee (in an employment context); from the lessee of a computer (in a lease context); or from an individual (e.g. in a family context) where that individual has the “sole use” of the computer.

An “update or upgrade” – which benefits from blanket consent in certain cases under CASL – is “generally a replacement of software with a newer or better version”, or a version change.

Grandfathering – if a program (software, app, etc.) was installed on a person’s computer system before January 15, 2015, the installer has implied consent until January 15, 2018 – unless the person opts out of future updates or upgrades.

Who is liable?

CRTC staff have clarified that as between the software developer and the software vendor (the “platform”), both may be liable under CASL. To determine liability, the CRTC proposes to examine the following factors, on a case-by-case basis:

was their action a necessary cause leading to the installation?

was their action reasonably proximate to the installation?

was their action sufficiently important toward the end result of causing the installation of the computer program?

CRTC and Industry Canada staff have indicated that they will be publishing additional FAQs, in response to ongoing industry stakeholder questions.

Those who long have been concerned about a lack of consistent principles to guide the implementation of cloud services now have access to a new tool — one that promises to provide a useful guide to categories and controls in this important and expanding area of our practice.

This past summer, the International Organization for Standardization (“ISO”) together with the International Electrotechnical Commission (“IEC”) published ISO/IEC 27018, a new voluntary code of practice for the protection of personally identifiable information (“PII”) that is processed by a cloud-service provider. Used in conjunction with and as an expansion of ISO/IEC 27002, a best-practice guide for implementing information-security management, ISO/IEC 27018 creates a common set of security categories and controls intended specifically for cloud services. As the first-ever security standard for the cloud, ISO 27018 has the following key objectives:

• Help cloud-service providers that process PII to address applicable legal obligations as well as customer expectations.
• Enable transparency so customers can choose well-governed cloud services.
• Facilitate the creation of contracts for cloud services.
• Provide cloud customers with a mechanism to ensure cloud providers’ compliance with legal and other obligations.

While ISO/IEC 27018 does not replace existing laws and regulations, it provides a global common standard, which is particularly helpful for those cloud providers that offer services to customers in different countries. Because the requirements of such laws and regulations governing the protection of PII vary significantly from country to country, and obligations as between cloud-service providers and their customers can differ according to individual contract terms, ISO/IEC 27018 addresses the special challenges faced by cloud services operating internationally.

ISO states that the new standard is “applicable to all types and sizes of organizations, including public and private companies, government entities, and not-for-profit organizations, which provide information processing services as PII processors via cloud computing under contract to other organizations.” Still, ISO notes that the new standard should be adopted only as a “starting point.” In other words, not all aspects of the standard will be appropriate for all cloud services, and particular services might need to develop additional controls not included in ISO/IEC 27018. Likely sometime next year, ISO will release ISO/IEC 27017, which will more broadly address information-security best practices for cloud computing.

In order to achieve ISO/IEC 27018 certification, a cloud service must undergo an audit by an accredited certification body that ensures that the cloud provider:

• Helps customers comply with their obligations to allow end-users to access, correct and/or erase their personal information.
• Processes PII only in accordance with a customer’s instructions.
• Processes PII for marketing or advertising purposes only with the customer’s express consent.
• Discloses information to law-enforcement authorities only when legally bound to do so.
• Discloses the names of any sub-processors and the possible locations where personal information may be processed prior to entering into a cloud-services contract.
• Helps customers comply with their data breach notification obligations.
• Implements a policy for the return, transfer, or disposal of personal data that specifies the retention period following the termination of a contract.
• Agrees to independent information-security reviews at planned intervals or when significant changes occur.
• Enters into confidentiality agreements with staff who have access to personal data and provides them with training.