Tag Archives: FTC

Endorsements from celebrities and social media influencers are an important tool advertisers use to build a brand’s image and persuade consumers. Moreover, it’s understandable that any brand ambassador would like to preserve an “organic” feel when conveying their messages. But the law, governed and enforced by the Federal Trade Commission (“FTC”) under the FTC Act (15 U.S.C. §§ 41-58, as amended), states that “endorsements must be truthful and not misleading.” The FTC relies on its recently updated Guides Concerning Use of Endorsements and Testimonials in Advertising (the “Guides,” set forth in 16 CFR Part 255) to ensure that consumer products and services are described truthfully online and that consumers understand what they are paying for. The Guides represent administrative interpretations of laws enforced by the FTC, and failure to adhere to the voluntary compliance requirements set forth therein may result in law enforcement actions for violations of the FTC Act.

The Guides themselves are not regulations, so there are no civil penalties associated with them. But if advertisers don’t follow the Guides, the FTC may investigate whether the practices are unfair or deceptive under the FTC Act and may take corrective action under Section 5 of the FTC Act (15 U.S.C. § 45). The following advises on the basic legal requirements you must follow when making sponsored endorsements of any product, service, or brand on any of your social media platforms.

In short, to comply with the FTC’s requirements, you must clearly and conspicuously disclose your material relationship with any brand you endorse via your social media. The following provides some guidance as to how and when you should disclose, but as a general rule of thumb: “when in doubt, disclose.”

What Is An Endorsement?

According to the FTC, an endorsement means “any advertising message that consumers are likely to believe reflects the opinions, beliefs, findings, or experiences of a party other than the sponsoring advertiser, even if the views expressed by that party are identical to those of the sponsoring advertiser.” Thus, in order to be FTC compliant as the endorser of any brand’s product or service, you must ensure the following with respect to any of your endorsements:

The endorsement must reflect your honest opinion, belief, finding, or experience with respect to the product or service you are endorsing; and

You must have been a bona fide user of the endorsed product or service at the time you made your endorsement.

[For example, you would not be permitted to tweet about how delicious you believe a particular beverage tastes if (a) you’ve never actually tried that beverage, or (b) you don’t honestly believe the beverage is delicious.]

Material Connections Must Be Disclosed.

In addition to the foregoing requirements, when there exists a connection between the endorser and the seller of the advertised product or service that might materially affect the weight or credibility of the endorsement, that connection must be fully disclosed. In other words, your endorsement of any product or service is subject to enforcement if the brand/advertiser, or someone working for the brand/advertiser, pays you or gives you something of value or provides some other incentive to mention their product on any of your social media.

Required Disclosures Must Be Clear and Conspicuous.

The hard and fast rule is, material relationships between brand and endorser on social media must be clearly and conspicuously disclosed. The FTC has provided specific guidance as to how to address these disclosures on various social media platforms, as follows:

On Twitter, Facebook and Instagram: While there are no specific rules as to how the disclosure needs to be stated, the FTC has taken a firm stance that the limited 140-character space available on Twitter does not change the need to disclose in an endorsement tweet. According to the FTC, “the words ‘Sponsored’ and ‘Promotion’ use only 9 characters; ‘Paid ad’ only uses 7 characters; and starting a tweet with ‘Ad:’ or ‘#ad’ takes only 3 characters” – each of these would likely be effective disclosures, per the Guides. Also note that the disclosure must be made on each and every tweet you make, even if you are tweeting the same message consecutively (i.e. minutes or even seconds apart) or if the tweet is broken up into several parts. The same rules apply to Facebook and Instagram posts, with the caveat that, because you are not limited in space on these alternative platforms, you should err on the side of making your disclosure longer and more obvious, rather than falling back on the same abbreviated text that would be appropriate for Twitter.
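As a rough illustration of the placement guidance above, a pre-publish check could verify that each sponsored post leads with one of the short disclosure tags the FTC cites as likely effective. This is a hypothetical sketch only; the tag list and helper name are illustrative and are not drawn from the Guides, which judge disclosures by overall clarity and conspicuousness, not by any fixed keyword list.

```python
# Hypothetical compliance helper: flag sponsored posts that do not begin
# with a recognized short disclosure tag. The tag list below is illustrative.
DISCLOSURE_TAGS = ("#ad", "ad:", "#sponsored", "sponsored", "promotion", "paid ad")

def has_clear_disclosure(post: str) -> bool:
    """Return True if the post starts with a recognized disclosure tag."""
    text = post.strip().lower()
    # str.startswith accepts a tuple of candidate prefixes.
    return text.startswith(DISCLOSURE_TAGS)

print(has_clear_disclosure("#ad Loving this new soda!"))  # True
print(has_clear_disclosure("Loving this new soda!"))      # False
```

Because the disclosure must appear on each and every post, a check like this would run per post, not once per campaign.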

Contests & Sweepstakes Rules Need Disclosure: If you are promoting/sponsoring a contest on behalf of a brand, a disclosure is also required. Moreover, the responsibility falls on you (i.e., the contest sponsor) to make sure people entering the contest make the disclosure themselves if the contest requires them to review or promote a product/service. Again, the key is whether the gift would affect the “weight or credibility” of an endorsement, but determining where to set the bar is difficult, so it’s always safer to disclose. For example, if you, as a brand ambassador, are calling upon your social media followers to tweet about, or make an online review of, the brand in exchange for some type of gift or reward, then your call to action must also require that your followers disclose the contest/sweepstakes. Displaying a hashtag like “#contest” or “#sweepstakes” should be sufficient as a disclosure; however, using something like “#BrandXYZ_Rocks” or merely the abbreviated “#sweeps” is not sufficient, because the relationship is not deemed obvious enough and people might not understand what the disclosure means.

Video Disclosures Must Be Made Early And Often: For any YouTube or other sponsored online video (e.g., Snapchat or Vine), it’s not enough to have a disclaimer on the details page. The FTC has stressed that proximity and placement are two determinative factors as to the conspicuousness of the disclosure. Therefore, your disclosure must be made at the beginning of the video and preferably repeated multiple times for longer-form pieces. Similarly, streaming video, such as a Periscope broadcast or a live video/mobile game review made as a sponsor/ambassador of a gaming company, also needs disclosure throughout the stream. As an example, stating throughout your videos or live streams language such as “Sponsored by [name of the company]” would be sufficient as a disclosure.

Facebook “Likes” Might Require Disclosure: The FTC has not clearly addressed this specific issue yet; however, you should still stick to the same general rule of clearly disclosing if you are acting as a brand ambassador/sponsor to incentivize your followers to “Like” a brand on Facebook. It should be noted, however, that the FTC is unequivocally against the practice of “fake likes.”

Conclusion.

In summary, whenever you are acting as a Brand Ambassador or Sponsor of any product or service, you must ensure that people get the information they need to evaluate your sponsored statements. If you were given something for free or paid to promote a product or service, clearly state so. You should use clear and unambiguous language and make the disclosure stand out. Consumers (i.e. your social media followers) should be able to notice the disclosure easily and should not have to look for it. And finally, if your disclosures are hard to find, tough to understand, fleeting, or buried in unrelated details, or if other elements in your ad or message obscure or distract from the disclosures, they don’t meet the “clear and conspicuous” standard and you could find yourself the subject of an action from the FTC.

Jason W. Brooks, Esq. is an entertainment attorney and a founding partner of altView Law Group, LLP. Jason specializes in transactional business and legal affairs matters, particularly in the areas of New Media and TV production. Feel free to contact Jason via email: Jason@altviewlawgroup.com or follow him on Twitter: @Jasonbrookslaw.

Disclaimer: The information in this post is intended for general information purposes only and should not be construed as legal advice.

Privacy continues to evolve into one of the most important legal issues of this decade. While we as Americans are wary of the government collecting our private information, we are comparatively complacent regarding private information collected by private businesses. It’s a dangerous conundrum. After all, the government is created for our benefit and is ultimately accountable to us, but private business, on the other hand, has no such inherent accountability and is dedicated to its own self-interest.

The Federal Trade Commission (“FTC”) plays an important role in protecting the privacy of persons using the internet. The FTC has just recently adopted changes to its rule implementing the Children’s Online Privacy Protection Act (“COPPA”) to strengthen privacy protections for children and give parents greater control over the personal information that websites and online services may collect from children under thirteen. The information in this article, largely taken from the FTC itself, explains these changes.

Congress passed COPPA in 1998. It requires that operators of websites or online services that are either directed to children under thirteen or have actual knowledge that they are collecting personal information from children under thirteen give notice to parents and get their verifiable consent before collecting, using, or disclosing such personal information, and keep secure the information they collect from children. It also prohibits these operators from conditioning children’s participation in activities on the collection of more personal information than is reasonably necessary for them to participate. COPPA contains a “safe harbor” provision that allows industry groups or others to seek FTC approval of self-regulatory guidelines.

In 2010, the FTC initiated a review to ensure that COPPA keeps up with evolving technology and changes in the way children use and access the internet, including the increased use of mobile devices and social networking.

The final amendments:

modify the list of “personal information” that cannot be collected without parental notice and consent, clarifying that this category includes geolocation information, photographs, and videos;

offer companies a streamlined, voluntary, and transparent approval process for new ways of getting parental consent;

close a loophole that allowed child-directed apps and websites to permit third parties to collect personal information from children through plug-ins without parental notice and consent;

extend coverage in some of those cases so that the third parties doing the additional collection also have to comply with COPPA;

extend COPPA to cover persistent identifiers that can recognize users over time and across different websites or online services, such as IP addresses and mobile device IDs;

strengthen data security protections by requiring that covered website operators and online service providers take reasonable steps to release children’s personal information only to companies that are capable of keeping it secure and confidential.

The definition of an “operator” has been updated to make clear that COPPA covers a child-directed site or service that integrates outside services, such as plug-ins or advertising networks, that collect personal information from its visitors. This definition does not extend liability to platforms, such as Google Play or the App Store, when such platforms merely offer the public access to child-directed apps.

The definition of a “website or online service directed to children” is expanded to include plug-ins or ad networks that have actual knowledge that they are collecting personal information through a child-directed website or online service. In addition, in contrast to sites and services whose primary target audience is children, and who must presume all users are children, sites and services that target children only as a secondary audience or to a lesser degree may differentiate among users, and will be required to provide notice and obtain parental consent only for those users who identify themselves as being younger than thirteen.

The definition of “personal information” now also includes geolocation information, as well as photos, videos, and audio files that contain a child’s image or voice.

The definition of “personal information requiring parental notice and consent before collection” now includes “persistent identifiers” that can be used to recognize users over time and across different websites or online services. However, no parental notice and consent is required when an operator collects a persistent identifier for the sole purpose of supporting the website or online service’s internal operations, such as contextual advertising, frequency capping, legal compliance, site analysis, and network communications. Without parental consent, such information may never be used or disclosed to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, or for any other purpose.

The definition of “collection of personal information” has been changed so that operators may allow children to participate in interactive communities without parental consent, so long as the operators take reasonable measures to delete all or virtually all of the children’s personal information before it is made public.

Parental Notice

The amended Final Rule revises the parental notice provisions to help ensure that operators’ privacy policies, and the direct notices they must give parents before collecting children’s personal information, are concise and timely.

Parental Consent Mechanisms

The Final Rule changes add several new methods that operators can use to obtain verifiable parental consent: electronic scans of signed parental consent forms; video-conferencing; use of government-issued identification; and alternative payment systems, such as debit cards and electronic payment systems, provided they meet certain criteria.

The amendments retain “email plus” as an acceptable consent method for operators that collect personal information only for internal use. Under this method, operators that collect children’s personal information for internal use only may obtain verifiable parental consent with an email from the parent, as long as the operator confirms consent by sending a delayed email confirmation to the parent, or by calling or sending a letter to the parent.
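The two-step “email plus” mechanism described above can be sketched as a simple state flow. This is a hypothetical illustration only: the class and method names are invented for clarity, and an operator’s actual obligations are defined by the Rule, not by this sketch.

```python
from dataclasses import dataclass

# Hypothetical sketch of the "email plus" consent flow for internal-use-only
# data collection: record the parent's emailed consent, then confirm it with
# a delayed follow-up email, call, or letter. All names are illustrative.
@dataclass
class EmailPlusConsent:
    parent_email: str
    consent_received: bool = False
    confirmation_done: bool = False

    def record_consent_email(self) -> None:
        # Step 1: the parent responds to the direct notice, granting consent.
        self.consent_received = True

    def confirm_consent(self) -> None:
        # Step 2 (the "plus"): the operator confirms via a delayed email,
        # a phone call, or a letter to the parent.
        if not self.consent_received:
            raise RuntimeError("cannot confirm consent that was never given")
        self.confirmation_done = True

    @property
    def verifiable(self) -> bool:
        # Consent counts as verifiable only after both steps are complete.
        return self.consent_received and self.confirmation_done
```

Modeling the flow this way makes the key point explicit: the initial email alone is never enough; the delayed confirmation step is what makes the consent “verifiable.”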

To encourage the development of new consent methods, the FTC establishes a voluntary 120-day notice and comment process so parties can seek approval of a particular consent method. Operators participating in an FTC-approved safe-harbor program may use any consent method approved by the program.

Confidentiality and Security Requirements

COPPA requires operators to take reasonable steps to make sure that children’s personal information is released only to service providers and third parties that are capable of maintaining the confidentiality, security, and integrity of such information, and who assure that they will do so. COPPA also requires operators to retain children’s personal information for only as long as is reasonably necessary, and to protect against unauthorized access or use while the information is being disposed of.

Safe Harbors

The FTC seeks to strengthen its oversight of the approved self-regulatory “safe harbor programs” by requiring them to audit their members and report annually to the FTC the aggregated results of those audits.

These changes will go into effect on July 1, 2013.

Kevin Mills is an owner of the law firm of Kaye & Mills where his practice focuses on advising clients with transactions across a full range of issues in entertainment, media, technology, Internet and general business. His practice encompasses copyright; trademark; trade dress; trade secret; brand protection; content creation, protection and distribution; and general corporate, organizational and business matters.

The Federal Trade Commission (“FTC”) has recently proposed to modify the rules for the Children’s Online Privacy Protection Act (“COPPA”). If these modifications are implemented, it would be the first time COPPA rules were revised since 1999, a time when there was no Facebook or “an app for that” – even MySpace wasn’t founded until 2003.

Today, there are countless ad networks, third party tracking cookies, and information brokers that harvest personal data across the web and on smartphones – none of these existed when the COPPA rules were last issued. Although COPPA was designed to protect children’s online experiences, currently, certain loopholes in COPPA allow companies to gather children’s personal information. A 2010 Wall Street Journal report found that some popular children’s websites installed more data-gathering technology on computers than websites aimed at adults.

The FTC wants to revise COPPA rules so that they apply to third party ad networks and app and plug-in developers, and to expand the definition of “personal information.” Specifically, the revisions aim to cover plug-ins and ad networks that know or have reason to know that they are collecting personal information through child-directed websites or online services. The revisions could affect popular website features such as Facebook’s “Like” button, as well as new social networks for playing games on smartphones.

First, the proposed revised rules would require sites with content designed to appeal to both young children and others (including parents) to be able to “age-screen all visitors in order to provide COPPA’s protections only to users under age 13.” These sites would not be allowed to collect any personal information without first obtaining parental consent. Currently, many websites secure consent by sending an email to an address provided by the child.
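The age-screening mechanism contemplated above can be illustrated with a short sketch: ask each visitor for a birth date, and route anyone under thirteen into a parental-consent flow before any personal information is collected. The threshold, function names, and logic here are hypothetical illustrations, not language taken from the proposed rule.

```python
from datetime import date

# Hypothetical neutral age screen for a mixed-audience site. Names and
# structure are illustrative only.
COPPA_AGE_THRESHOLD = 13

def age_on(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def requires_parental_consent(birth_date: date, today: date) -> bool:
    """True if the visitor must be routed to a parental-consent flow
    before any personal information is collected."""
    return age_on(birth_date, today) < COPPA_AGE_THRESHOLD

print(requires_parental_consent(date(2001, 6, 1), date(2012, 5, 1)))  # True: age 10
print(requires_parental_consent(date(1995, 6, 1), date(2012, 5, 1)))  # False: age 16
```

Note that a real screen must also be “neutral” — it cannot signal the cutoff or invite visitors to adjust their answers — which is a design constraint on the form, not something this logic alone can guarantee.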

Second, the proposed revised rules would create co-responsibility between companies that furnish apps or plug-ins and those that operate the platforms where the apps or plug-ins run. The FTC states that “an operator of a child-directed site or service that chooses to integrate the services of others that collect personal information from its visitors should itself be considered a covered ‘operator’ under the Rule.” The revised rules would not only hold third parties responsible for any unlawful data collection, but would also make the host website responsible for those infractions.

Third, the proposed revised rules would expand the definition of “personal information” to include “‘persistent identifiers’ that recognize a user over a period of time which are used for purposes other than ‘support for internal operations.’” This revision is aimed at “tracking cookies” that are capable of delivering advertising within a single site and also of tracking people across sites to deliver targeted information. In other words, the revised rules would restrict or prohibit advertising to children based on their previous online behavior.

Fourth, the proposed revised rules would expand the definition of “personal information” to include geolocation data, defined as “a home or other physical address including street name and name of a city or town.” Smartphone apps often collect such data along with phone numbers, and under the proposed rules they would be prohibited from doing so without parental consent.

It is also important to note what the new rules would not cover: they would apply to information collected for advertising or marketing purposes, not to information necessary to maintain a network or offer a service.

The revised rules are not aimed at sites that don’t allow children, even though children do in fact use such sites. Facebook, for example, requires users to state their date of birth and does not allow users under thirteen to use the site. Of course, it is possible to lie about one’s age (Consumer Reports estimates that 5.6 million of Facebook’s users are under thirteen). And it’s worth noting that any site that requires a user to sign in via Facebook is relying on Facebook’s terms of service as certification that the user claims to be thirteen or older.

Of course, when considering new rules, one must consider their effectiveness. Privacy advocates are concerned that the FTC lacks the resources to vigorously enforce the law. And given the FTC’s history of lax enforcement of COPPA, that is a valid concern.


In my last post, I discussed the recent Federal Trade Commission (“FTC”) report on internet privacy that was released in March 2012 (the “Report”). The Report had two basic themes. First, industry, working with government and consumer groups, should implement best practices for safeguarding consumer’s privacy and, second, Congress should consider enacting targeted legislation to provide greater transparency for and control over the practices of information brokers. In this post, I’m going to examine the legislative aspects of the Report.

The Report calls on Congress to consider enacting baseline privacy legislation. It echoes the Obama Administration’s call, issued earlier this year, for a new law that would serve as a Consumer Privacy Bill of Rights by establishing a basic set of online privacy principles. The Report also calls for a new law that would allow consumers to access and dispute personal and financial data that is collected and sold by data brokers without consumers’ permission.

Consumer groups and privacy advocates were generally pleased with the Report’s recommendations. Internet companies were less pleased. Senator John F. Kerry (Democrat) and Congressman Edward J. Markey (Democrat), advocates of online privacy protection, praised the Report, noting that it endorsed many of the legislative safeguards they have proposed in the past.

In April 2011, Kerry and Republican Senator John McCain co-sponsored the Commercial Privacy Bill of Rights Act, legislation that would afford consumers the right to opt out of information collection and would require companies to obtain consumers’ consent before gathering sensitive, personally identifiable data. Kerry, who chairs the Senate Subcommittee on Communications, Technology, and the Internet, has expressed frustration that the bipartisan bill has not yet passed, and hopes that the FTC Report will spur his colleagues in Washington to pass the bill.

Markey also issued a similar statement, linking the Report to a House bill that he and Republican Joe Barton of Texas filed last May. The Markey-Barton bill focuses on privacy protections for children under sixteen, aiming to ban targeted advertising directed at children and teens and to create an “eraser button” to enable the deletion of minors’ personal information. The Markey-Barton proposal is meant to strengthen the Children’s Online Privacy Protection Act of 1998 (“COPPA”), which covers only children twelve and younger, and predates many forms of contemporary digital media.

Markey noted that, “As in our legislation, the [Report] appropriately highlights the importance of providing teens with clear information about how their personal data is used, so they can be empowered to exercise control over these uses.” The Report notes that teens are especially vulnerable to targeted advertising due to their use of social media and mobile devices, making it all the more important that legally enforceable privacy protections for this age group are updated.

Personal information is valuable to businesses because it allows them to present relevant advertisements to consumers based on the interests that internet users display online. Companies benefit from an ability to concentrate their marketing resources on promising buyers and, they argue, consumers benefit from a selection of ads that appeal to their tastes. But a study released in March 2012 by the Pew Internet and American Life Project suggested that people find targeted advertising to be more creepy than convenient. More than two-thirds of internet users said they have unfavorable opinions of targeted ads “because they do not like having their online behavior tracked and analyzed.” Only 28% approved of the practice of targeted advertising.

Regulating targeted advertising is at the core of the Report and the legislative efforts of Kerry and Markey. The Kerry-McCain and Markey-Barton bills would give the government a degree of authority that it currently lacks – authority that industry self-regulation cannot achieve alone. The bills would provide the commission rulemaking authority concerning notice, consent, and the transfer of information to third parties.

One interesting aspect to any new legislation will be what the FTC calls its “Global Interoperability.” Reflecting differing legal, policy, and constitutional regimes, privacy frameworks around the world vary considerably. There is a need to promote more consistent and interoperable approaches to protecting consumer privacy internationally. Meaningful protection of data requires an ability of legal regimes to work together and requires enhanced cross-border enforcement. Global Interoperability may be tricky to achieve, however; 107 other countries have their own sets of privacy laws regulating the internet. The EU has comprehensive privacy laws, which have recently been amended and have been made tougher. In Europe, a consumer’s consent has to be real and informed, and the EU Justice Commission has announced new privacy legislation that would impose hefty fines on rule breakers. Clearly, European privacy regulators have a message for America’s tech giants: respect European privacy rules — or else.


The Bill of Rights has defined our basic rights as Americans for over 220 years. However, the Bill of Rights doesn’t afford us the right of privacy. In 1965, the Supreme Court held that the right of privacy was to be found in the “penumbras” and “emanations” of other constitutional protections. At the time, not everyone agreed such a right existed. That fight continues to the present.

In today’s supercharged, computer-powered, information age, privacy, and the right to it, takes on an even more important and pervasive meaning. In December 2010, the Federal Trade Commission (“FTC”) issued a preliminary report on privacy issues. On March 26, 2012, the FTC issued a final, updated version of this report, entitled “Protecting Consumer Privacy in an Era of Rapid Change.”

The FTC’s report makes two basic recommendations: (1) The establishment of a privacy framework that sets forth best practices for companies that collect or use consumer data (discussed herein) and (2) that Congress enact baseline privacy legislation that is flexible and technologically neutral. This second recommendation reflects the FTC’s view that self-regulation has not proven effective enough to protect consumer privacy. The report also sets out the FTC’s privacy priorities for the coming year (with regard to ongoing efforts, see the Obama Administration’s report issued on February 13, 2012 entitled “Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy,” which calls for a “Consumer Privacy Bill of Rights” among other things).

The FTC’s report recommends that companies incorporate the concept of “privacy by design” into their practices. A company adopting the “privacy by design” approach will promote consumer privacy throughout its organization and at every stage of the development and life of its products and services. To accomplish this, companies should incorporate the following protections into their standard, everyday practices:

(a) LIMITATIONS ON COLLECTION. Companies should impose reasonable limits on the consumer data they collect. Reasonable limits are those that are consistent with the context of the particular transaction or the consumer’s relationship with the company, or as required by law or regulation.

(b) DISPOSAL AND RETENTION. Companies should implement reasonable restrictions on the retention of consumer data and should dispose of it once the data has outlived the legitimate purpose for which it was collected. The reasonableness of the practice depends on the type of relationship, nature and use of the data. The FTC invites trade associations and self-regulatory groups to contribute guidance to companies regarding data retention and destruction.

(c) ACCURACY. Companies should maintain the accuracy of the data they collect and hold. The FTC posits that a flexible approach that is scaled to the intended use and to the sensitivity of the data is the best method to achieving accuracy.

(d) SECURITY. Security is a critical factor, and companies need to take their obligations seriously. FTC enforcement is only one consequence of failing to reasonably protect consumer data. In addition to the requirement to protect consumer data, there is the requirement of notification to the consumer in the event of a breach. The cost of such a breach can be significant. For example, the 2011 Sony PlayStation breach could well have a cost of $150 million and might have put the kibosh on Sony’s plan to network across entertainment devices and content.

(e) SIMPLIFICATION. The Report calls on companies to simplify consumer choice regarding privacy issues and to implement measures so that making the choice is meaningful. Where choice is required or desirable, it should be requested at a time and in a context in which the consumer is making a decision about the data. And in particular, special attention needs to be paid where data use and disclosure are inconsistent with the context of the transaction or the company’s relationship with the consumer. Furthermore, where sensitive information is being collected (e.g., information regarding children, health, or finances), clear and conspicuous notice and an opportunity to opt-out should be given.

(f) TRANSPARENCY. Companies need to increase the transparency of their data practices. Privacy notices should be clear, short and more standardized. Companies should provide reasonable access to data they retain and an opportunity in appropriate cases to permit the suppression of categories the consumer would like to restrict the use of in targeting. Furthermore, companies should increase the transparency of their data enhancement practices. For example, this could include an explanation to consumers of how data enhancement works and how the consumer can contact data enhancement sources directly.

Ultimately, the FTC’s report is an invitation for industry and government to work together to address the challenge of data collection, use, and management in the modern, technologically-changing world. But it is far from the final word.
