Why We Need a Federal Criminal Law Response to Revenge Porn

As promised in the comments section of my last post, I offer in this post the outline of my proposal to effectively combat revenge porn. A few preliminary notes: one, this is very much a work in progress as well as being my first foray into drafting legislative language of any kind. Two, a note about terminology: while “revenge porn” is an attention-grabbing term, it is imprecise and potentially misleading. The best I have come up with as a replacement is “non-consensual pornography,” so that is the term I will use throughout this post. I would be interested to hear suggestions for a better term as well as any other constructive thoughts and feedback.

I want to emphasize at the outset that the problem of non-consensual pornography is not limited to the scenarios that receive the most media attention, that is, when A gives B (often an intimate partner) an intimate photo that B distributes without A’s consent. Non-consensual pornography includes the recording and broadcasting of a sexual assault for prurient purposes and distributing sexually graphic images obtained through hacking or other illicit means. Whatever one’s views on pornography more broadly, it should be a non-controversial proposition that pornography must at a minimum be restricted to individuals who are (1) adults and (2) consenting. Federal and state laws take the first very seriously; it is time they took consent requirements seriously as well.

Before I offer my proposal for what a federal criminal prohibition of non-consensual pornography could look like, I want to explain why looking to federal criminal law is the most appropriate and effective response to the problem. In doing so, I do not mean to suggest that other avenues are illegitimate or ill-advised. I support the use of existing laws or other reform proposals to the extent that they are able to deter non-consensual pornography or provide assistance to victims. That being said, here is my case for why federal criminal law is the best way to address non-consensual pornography, in Q&A form.

Why criminal law? Can’t the problem be adequately addressed by tort and/or copyright law? There are two answers to this, one concerning what could be called legal integrity and the other concerning practical obstacles. On the first, we should regard non-consensual pornography as a crime because that is the most accurate and principled characterization of its harm. Non-consensual pornography may indeed also be a violation of privacy or an infringement of copyright, but it is at its base an act of sexual use without consent. When such sexual use is inflicted on an individual’s physical body, we call it rape or sexual assault. We also accept, both as a matter of intuition and as a matter of law, that forcing an individual to strip naked and perform sexual acts can plausibly be considered a crime, even if the perpetrator never touches the victim. The fact that perpetrators and victims are not in physical proximity in non-consensual pornography should not change this analysis. Nor should the fact that such an assault is not physical remove it from the category of criminal sexual use without consent. Our society readily accepts that child pornography, for example, is a crime separate from the physical act of child abuse – that is, we recognize that the production and distribution of the image is a harm in itself. Of course this does not mean, as some have tried to suggest, that we should or could criminalize any sexual thought a person has of another person without their consent. The fact that viewing and distributing child pornography is a crime clearly does not lead to the result that merely thinking sexually about a child must also be a crime; there is no reason to think that criminalizing non-consensual pornography would require such a result either. The Supreme Court recognized in New York v. Ferber that the “distribution of photographs and films depicting sexual activity by juveniles is intrinsically related to the sexual abuse of children in at least two ways. First, the materials produced are a permanent record of the children’s participation and the harm to the child is exacerbated by their circulation. Second, the distribution network for child pornography must be closed if the production of material which requires the sexual exploitation of children is to be effectively controlled … The most expeditious, if not the only practical, method of law enforcement may be to dry up the market for this material by imposing severe criminal penalties on persons selling, advertising, or otherwise promoting the product.” Victims of non-consensual pornography of any age are harmed each time a person views or shares their intimate images, and to allow the traffic in such images to flourish increases the demand and the pervasiveness of such images.

With regard to practical obstacles, tort actions place a tremendous burden on the victim and in many cases will be an implausible or impossible approach. Civil litigation of any kind requires money, time, and access to legal resources. Perhaps most distressingly, it often requires further dissemination of the very material that harms the victim. The irony of privacy actions is that they generally require further breaches of privacy to be effective. Moreover, the priority of most victims is to have the material removed, not to recover damages. Additionally, in many cases the party responsible will not have enough financial resources to make a damages claim worthwhile (i.e., the defendant will be judgment-proof). This leads us to the other difficulty in bringing tort claims for non-consensual pornography: it’s very difficult to find a party to sue. Given the ease with which individual purveyors of non-consensual pornography can access or distribute images anonymously, it is difficult to identify and prove (especially for the purposes of a lawsuit) who they are. So why not go after the websites distributing the images? The answer, as Prof. Citron and others have detailed, is that Section 230 of the Communications Decency Act will probably stand in the way. CDA §230 has been interpreted to grant website owners and operators far-ranging immunity for tortious material submitted by third-party users (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). Though one can hope that more courts rule along the lines of the Ninth Circuit in Fair Housing Council v. Roommates.com, such rulings have so far been rare.

Copyright law is more promising for some victims of non-consensual pornography because CDA §230 does not immunize websites from copyright claims. If a victim took the image or video herself, she is the copyright owner and can in theory take action against unauthorized use. This strategy has proven successful in some cases. However, this option will not be of use to the many victims who do not take the images or videos themselves. Some lawyers and scholars have suggested that an expansive conception of “joint authorship” might cover these victims, but it is not clear how much traction this theory will have in actual cases. Moreover, similar problems of publicity, time, and resources that accompany tort claims hinder copyright claims.

Why aren’t existing criminal laws (many of which are federal) sufficient to address this issue? Some forms of non-consensual pornography fit, or could be argued to fit, definitions of existing crimes. None of them, however, could be used in more than a minority of cases, and the protections they offer are incomplete. For example, 18 U.S.C. § 2257 sets out recordkeeping requirements for producers of pornography, but it is seriously limited in two ways. First, the statute’s definition of “producer” does not seem to include websites that solicit images from third-party users, which are the websites most likely to include non-consensual pornography. Second, the law focuses almost exclusively on age-verifying identification. It sets out no requirements to verify that the individuals portrayed have consented to the use of their images. Thus, adult victims are not protected. The Interstate Anti-Stalking Punishment and Prevention Act, 18 U.S.C. § 2261A, makes it a crime “to use the mail, any interactive computer service, or any interstate or foreign commerce facility to engage in a course of conduct that causes substantial emotional distress to a person or causes the person or a relative to fear for his or her life or physical safety.” This statute could apply to some instances of non-consensual pornography, but has not often been interpreted to do so. In addition, many perpetrators of non-consensual pornography may not fulfill the intent requirement of the statute, namely, the intent “to kill, injure, harass, or intimidate a spouse, intimate partner, or dating partner.” Many admitted purveyors of non-consensual pornography maintain, with some plausibility, that their sole intention is to obtain notoriety, fulfill some sexual desire, or increase traffic for their websites. Additionally, many individuals involved in the production or distribution of non-consensual pornography have no intimate relationship to the victim as required by the statute.
The Video Voyeurism Prevention Act of 2004, 18 U.S.C. 1801, makes it a crime to intentionally “capture an image of a private area of an individual without their consent, and knowingly do[] so under circumstances in which the individual has a reasonable expectation of privacy.” Substantively, this Act could cover some instances of non-consensual pornography, but it is not clear that this statute would reach situations in which the initial image is consensually produced or given, but subsequent dissemination and access is not. The statute is written without acknowledgment of the contextual nature of consent. The statute’s reach is moreover limited to “the special maritime and territorial jurisdiction of the United States.” Finally, the Computer Fraud and Abuse Act, 18 U.S.C. 1030, addresses various forms of computer fraud and hacking. Because non-consensual pornography sometimes involves computer fraud and hacking, some perpetrators would theoretically run afoul of this statute. However, such activity is not the real target of this statute, and there are ways to participate in the creation or distribution of non-consensual pornography that do not involve hacking or fraud as defined by this statute.

Why federal, as opposed to state, criminal law? State laws, while important, have limited jurisdiction. The fact that one or even many states might criminalize non-consensual pornography will not help a person who is victimized in a state that does not. The Internet has greatly facilitated the capacity to commit interstate crimes, and the only way to reach such crimes is through federal law. However, it is important for states to pass their own non-consensual pornography laws, in part because the distribution of such content does not always take place on the Internet. A person standing on a street corner handing out photographs or DVDs containing non-consensual pornography is likely not engaging in interstate commerce. In such an instance, state, not federal, law is the way to prohibit or punish his conduct. A federal criminal law on non-consensual pornography would also have the salutary effect of providing a model for state laws. In the absence of such a federal law, New Jersey provides an example for other states and the federal government in its criminalization of certain invasions of privacy, in particular those involving intimate photographs or videos. New Jersey law prohibits a range of acts of non-consensual observation or disclosure of sexual activity.

Doesn’t criminalizing non-consensual pornography raise First Amendment concerns? The Supreme Court held in Miller v. California that obscenity does not receive First Amendment protection. The Court set out guidelines for determining whether material is obscene: “(a) whether ‘the average person, applying contemporary community standards’ would find that the work, taken as a whole, appeals to the prurient interest, …; (b) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and (c) whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.” Further, the Supreme Court held in New York v. Ferber that child pornography is not protected by the First Amendment: “The test for child pornography is separate from the obscenity standard enunciated in Miller, but may be compared to it for the purpose of clarity. The Miller formulation is adjusted in the following respects: a trier of fact need not find that the material appeals to the prurient interest of the average person; it is not required that sexual conduct portrayed be done so in a patently offensive manner; and the material at issue need not be considered as a whole.” Non-consensual pornography plausibly fits either into the category of “obscenity” or as a variation of child pornography, thus depriving it of First Amendment protection. Of course, statutory language must be carefully drafted to avoid vagueness and overbreadth, and I have tried to be sensitive to such concerns in my draft statute.

What would a federal criminal statute on non-consensual pornography look like? In drafting a proposed statute, I have examined the language and provisions of existing state and federal criminal laws on other forms of sexual abuse. The following is the product of my efforts so far.

Proposed Federal Law: Non-Consensual Pornography

I. Whoever uses the mail, any interactive computer service, or any facility of interstate or foreign commerce to engage in a course of conduct or travels in interstate or foreign commerce or within the special maritime and territorial jurisdiction of the United States to produce or disclose a sexually graphic visual depiction of an individual without that individual’s consent shall be fined under this title or imprisoned not more than one year, or both.

a. “Intimate areas” is defined as in 18 USC § 1801: “the naked or undergarment-clad genitals, pubic area, buttocks, or any portion of the female breast below the top of the areola”;

b. “Sexually explicit conduct” is defined as in 18 USC § 2256: “(i) graphic sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex, or lascivious simulated sexual intercourse where the genitals, breast, or pubic area of any person is exhibited; (ii) graphic or lascivious simulated; (I) bestiality; (II) masturbation; or (III) sadistic or masochistic abuse; or (iii) graphic or simulated lascivious exhibition of the genitals or pubic area of any person”;

c. “Visual depiction” is defined as in 18 USC § 2256: “includes undeveloped film and videotape, data stored on computer disk or by electronic means which is capable of conversion into a visual image, and data which is capable of conversion into a visual image that has been transmitted by any means, whether or not stored in a permanent format”;

II. It is an affirmative defense to a crime under this statute that:

A. The actor can produce verifiable written consent by the individual(s) depicted. The falsification of such forms shall be punishable by law.

B. The actor acted with a lawful purpose, including law enforcement in connection with a criminal prosecution; compliance with subpoenas or court orders for use in legal proceedings; routine security observation by retail establishments when such observation is clearly posted; employers acting within the scope of their employment.

44 Responses

I’m not sure why the choice between civil and criminal responses needs to be exclusive. Would you endorse adding a private right of action that tracks the criminal prohibition? Given the long-standing difficulty of convincing prosecutors and police to intervene in what they perceive as “domestic” disputes, it seems like sometimes tort suits might be more effective for your purposes.

Setting aside other major issues: are you truly intending to prohibit consensual-but-not-memorialized sex tapes? I can think of a situation where a man and a woman purchase a videocamera online, make a tape, have a bad break up, and one of the parties calls the police, who search the other party’s apartment, find the tape, and charge under this title.

More broadly, if you are concerned about the ACT of revenge porn, prohibiting the production seems overbroad.

In addition, this runs up against big First Amendment issues. To give one uncontroversial example: a person sunbathing nude (and illegally) in public. A paparazzo photographs them, and is charged.

I’m going to stick just to the drafting issue in this particular comment, and I’ll hopefully post a later comment about what you wrote leading up to the draft.

2 issues with the way you drafted consent in the proposed statute:

1) Maybe I’m messing up the statutory construction, but “without that individual’s consent” in article I seems to be an element of the offense, which would negate the need for article II, which offers consent as an affirmative defense. This is a little confusing.

2) I’m not sure why in article II, you require that consent be expressed in writing. Why couldn’t the defendant be allowed to prove consent using whatever evidence of consent is available? (For instance, if the depiction is a video and the alleged victim starts off the video by expressing their desire to see the video be online or if the alleged victim was the one who uploaded the item in question on the defendant’s website)

I think making lack of consent an element of the offense is much preferable because that is where the real problem is. The problem is not that sexually explicit depictions exist; the problem is the lack of consent. To reverse the burden of proof on what is the core of the offense goes against the spirit of placing the burden of proof on the prosecution.

I think you need to work some mens rea in here. If I am a customer of an adult website, or an advertising service that works with them, and they sell non-consensual pornography under the guise of consensual pornography, it seems that I would be liable under this statute. That kind of third-party or nth-party liability seems problematic.

IA2b Does it really have to say “sadistic or masochistic abuse”? I have friends who would take offense at their sexual activities being referred to as “abuse”. But well, that’s a bit beyond the scope of this exercise and you’re just importing this definition in there.

I think you’re dismissing the First Amendment concerns too quickly; what you’re proposing does not fit neatly into either existing category. Obscenity is about the content of the speech, but there is nothing about the images that is necessarily (or likely) obscene under Miller. Child pornography is about recording a criminal activity (non-consensual sexual conduct involving children) and the exacerbated harm from the mass-production, sale, and possession of that recording of criminal activity. But in many of these cases, the production was consensual, although the subsequent distribution was not.

So you are talking about a new category of unprotected speech, at least if the focus is on non-consensual distribution of otherwise-consensually created material.

James, I emphatically do not think that civil and criminal responses should be exclusive. I thought I had made this clear in the third paragraph, but perhaps I didn’t state it strongly enough. Of course I understand that pursuing criminal prosecution, as opposed to some form of civil litigation, will not necessarily be what a victim wants. I have no position as to what legal action any individual victim ought to take; I simply believe that non-consensual pornography should be criminalized. Accordingly, yes, I would fully support a private right of action that tracks the criminal prohibition.

AndyK, every criminal law raises the possibility of false accusations and de minimis harms. It is possible to make false claims of assault, and some assault statutes might be used to prosecute arguably trivial injuries. That doesn’t mean assault shouldn’t be addressed through criminal law. Incidentally, I don’t think your example of a paparazzo taking a photograph of a person sunbathing nude in public is necessarily “uncontroversial.” It’s not obvious to me at all that paparazzi who literally lie in wait, often on the ground, to take upskirt shots of female celebrities (not your example, but related) shouldn’t be criminally prosecuted.

Shg, your comment is baffling. Nowhere in my post did I “analogize” women and children. The statute is first of all gender-neutral – any person can be the victim of non-consensual pornography. Secondly, I have made no suggestion that women – or men – are incapable of consent. The statute concerns non-consensual acts. Unless you also think that criminalizing rape implies that women are incapable of consent, I don’t understand where you are coming from. Perhaps you are confused by my reference to child pornography, so let me try to make it clearer: child pornography is a useful point of reference not because the victims of one are just like the victims of another. It is useful because we recognize in child pornography the harm that “mere” images can cause, and we can recognize that a failure to prohibit traffic in such images increases the demand for such images.

PrometheeFeu, why NOT require written consent? Is obtaining a simple form before one consumes or distributes graphic sexual imagery of actual people really such a burden? The problem with other forms of consent is that they are too easy to manipulate. Consent forms will be susceptible to manipulation too, of course, but it’s better than leaving it to “he said, she said.” That leads into your other question about mens rea: I grant that if you are talking about a website that engages in outright fraud about its product, it would be unfair to hold unsuspecting consumers responsible. Otherwise, I don’t find third-party liability particularly problematic. If you are a porn consumer, why shouldn’t you make some effort to determine whether the material you are consuming is consensual? Shouldn’t it bother porn consumers that what they’re masturbating to might be an actual rape? It’s always struck me as odd that even as consumer concern for transparency and fairness has risen in so many areas (fair trade for coffee, no sweatshop labor for T-shirts, GMO food labeling, etc.), no one seems to particularly care whether the pornography they consume is the product of crime or abuse. But perhaps that tells us something about porn consumers. There’s a larger point to be made here about why there is so much hostility towards the suggestion that potential perpetrators, as opposed to potential victims, should assume some of the burden of the risk involved in sexual acts. It is notable that the list of things women are commonly advised to do to avoid being raped, sexually harassed, or becoming the victim of revenge porn is interminable, while the mere suggestion that men should make some slight effort to make sure they are not committing rape, sexual harassment, or revenge porn is treated as an offensive proposition.

Howard, I don’t take First Amendment concerns lightly, and I don’t pretend that in a blog post I can comprehensively address every free speech objection to my proposal. But I don’t think I am talking about a new category of unprotected speech. I think that non-consensual pornography fulfills the Miller standard about as well as any content could. And again, the category of acts I am concerned with includes images that were obtained without any form of consent at any point in time; I’d be interested to know how the law could reach those acts if not with a proposal similar to the one I am making.

As I said, what makes something obscene is the content of the expression itself, not anything surrounding its creation or distribution. Consent (particularly to the distribution of an image) has nothing to do with whether the *image* appeals to the prurient interest, depicts sexual conduct in a patently offensive way (which no longer includes simple nudity), or lacks SLAPS value.

I don’t know whether your proposed statute would survive First Amendment scrutiny. Maybe the balancing of privacy interests, given lack of consent, wins out. My point is that obscenity doesn’t get you there.

“Non-consensual pornography plausibly fits either into the category of ‘obscenity’ or as a variation of child pornography, thus depriving it of First Amendment protection.”

A photo of the bottom of a woman’s breast or buttocks qualifies as “obscenity” or “child pornography”? Really? Even CJ Burger accepted that Miller was designed to address hard-core sexual conduct.

If you want to argue for a fundamental restructuring of First Amendment jurisprudence, one that permits criminalizing the distribution of non-obscene images because of lack of consent, fine. That’s an important discussion to have. But don’t pretend that your proposal fits comfortably within existing doctrine. It doesn’t — your statute is patently unconstitutional. It’s not even close.

(QUOTE) a. “Intimate areas” is defined as in 18 USC § 1801: “the naked or undergarment-clad genitals, pubic area, buttocks, or any portion of the female breast below the top of the areola”;

Besides making it a Federal criminal offense to redistribute the Sears Catalog (undergarment-clad genitals abounded in it), old National Geographic magazines (with pictures of scantily-clad “natives” in them), and so forth, since it would be impossible to obtain “written consent” from the subjects of the photos decades later, you would end up criminalizing the web display of vacation snapshots from the Côte d’Azur – the thought of American tourists asking every scantily-clad European on the beach to sign a “written consent” is risible. Oh, and “Our Bodies, Ourselves” (plus countless medical and science textbooks, not to mention many coffee-table art books) will be contraband under your proposed statute. You really should re-consider restricting mere visual depictions of undergarment-clad genitals or partly-exposed breasts simpliciter.

Your argument that consumers(!) of imagery of scantily-clothed people should demand that the originating photographers and publishers obtain special written consent (apart from ordinary models’ releases) seems rather hysterical. Should every photo-spread in Maxim or Vogue come with facing pages reproducing the “written consents” of the models? Who will pay for the extra pages? Why should visual material produced before the enactment of your statute become contraband?

Also, do you intend your statute to supersede the E-Sign Act (15 USC Chapter 96)? You should clarify whether you want “really, truly, written on paper” written consent, or whether electronically-recorded “written consent” is good enough.

Finally, I find your arguments about the First Amendment unpersuasive. To be sure, the Supreme Court has (mostly) excluded “hard core” pornography from First Amendment protection – though not totally, since not all “pornography” is “obscenity” under the law. However, the Supreme Court has ruled (United States v. Stevens, 559 U.S. 460 (2010)) that Congress may not simply proclaim arbitrary types or styles of images to be obscene in order to exclude them from First Amendment protection. Your attempt to define “unconsented-to” images of underwear-clad buttocks as obscenity seems very probably unconstitutional.

Requiring written consent is very burdensome. Sure, in a professional pornography setting, it seems perfectly sensible. But what about an adult dating website or social networking website where users are allowed to post sexually-explicit pictures of themselves? Would it be reasonable to require the site to obtain written consent from every user who wishes to upload photos of themselves or expose themselves to criminal liability?

Regarding third-party liability, I do think consumers of pornography should care that the models whose depictions they consume consented to the distribution of those depictions, but I also think criminal liability is excessive. If I buy shoes made with child labor, I am not subjected to criminal penalties. At the very least, we should be able to agree that there are gradations in culpability and that the consumer who does not research their sources of pornography and simply assumes that consent was obtained from the models is nowhere near as culpable as the person who was entrusted with a private film or picture and maliciously chose to disseminate said media. Yet your proposed statute treats both the same.

If you have data to back up your assertion that pornography consumers do not care whether the pornography was produced and distributed through abuse and crime, please provide that data. Otherwise, why make such an assertion?

To the First Amendment objections: please bear in mind that I only suggested that non-consensual pornography “plausibly” fits under existing exceptions to First Amendment protection. I have not attempted, nor do I think it is necessary, to prove that the Supreme Court would ultimately uphold my proposed statute. I have no love for the obscenity exception for any number of reasons, but I made reference to it because I think it provides precedential grounds for finding some material undeserving (or at least less deserving) of First Amendment protections. Howard, I obviously don’t agree with you that the determination of whether something is obscene must be completely divorced from the context of its creation, but I’m not that invested in trying to justify the statute on the basis of obscenity. There are plenty of other, better ways to justify why non-consensual pornography should not be protected by the First Amendment. As I mentioned in my original post, the Court found in Ferber that the distribution of child pornography could be criminalized even if some images did not meet all the factors of the Miller Test, and I think the same should be true here. The Supreme Court also seems to think that expression creating a “hostile environment” under Title VII doesn’t receive full First Amendment protection: it appears to be permissible to regulate certain forms of speech and expression when they violate certain underlying social commitments to equality. As I’ve demonstrated before, one of the consequences of non-consensual pornography is that it produces serious and disproportionate discriminatory effects on women and girls.

Horspool, I don’t know why you would assume that the written releases models or other subjects sign as a routine matter wouldn’t fulfill the requirement of written consent. As to sexually explicit photographs of “natives,” whether by National Geographic journalists or American tourists, I’m not sure why they shouldn’t require written consent, especially if one plans to distribute them to the wider public. I also don’t know why you would assume that the statute would apply retroactively, given that I gave no indication that it would.

I welcome and appreciate ways of modifying the language regarding prohibited forms of visual depiction, as well as ways to make the written consent aspect (which is not, by the way, a “requirement” – it simply means that if a person wants to minimize the risk of violating a federal criminal statute, this is one precaution he could take) more efficient. Along those lines, I am curious if those of you objecting to the language taken directly or mostly from other federal statutes find those statutes unconstitutional as well.

prometheefeu, I doubt that you’ll be surprised that my answer to your first two questions is yes. As to third-party liability, I have already repeatedly given arguments as to why acts of non-consensual sexual use are serious enough to be criminalized. If you don’t agree, you don’t agree – there’s not much point in going around that circle again. As to gradations of culpability, sure, they might be justified. I’m open to that suggestion, but I would invite you to compare non-consensual pornography with child pornography on this point to see if the former should be treated similarly. As to your agitation regarding my speculation that porn consumers do not care about whether the product they consume was produced under consensual circumstances, I suggest you have once again misunderstood appropriate burdens of proof. If you want to claim that porn consumers do in fact care about this issue, you should provide some evidence that they do so.

I want to make a final observation about the majority of the comments I have received so far. Many of them take the form of “this part of your proposed statute is problematic/terrible/unconstitutional etc… the end.” I am not troubled by the critique part, but I am troubled by the fact that the critiques seem so often to be an end in themselves. It strikes me as strange that many of you are offended and agitated about aspects of my proposal that theoretically create harm or inconvenience to potential perpetrators, but do not express any concern about the actual and documented harm the statute is trying to address. Now, that may be because you don’t think that the harms of non-consensual pornography are real or serious. But if that’s the case, I think it would be more intellectually honest to state that up front, so that it is clear there is no law, no matter how carefully written, that you are likely to support, because you think the calculus of harm has only one side. If you do recognize the harm of non-consensual pornography, I am genuinely interested in how you would have the law address it.

Actually, I am surprised. I thought you would place a higher value on the consent of the subjects of sexually-explicit depictions. In practice, any site which wished to allow users to post sexually-explicit depictions of themselves would need to maintain written consent forms from their users. The cost of maintaining such records would be prohibitive. Even if it were not, it would most likely require users to identify themselves to the site operator, which would be very burdensome on the user’s privacy (potentially subjecting the user to harassment if their real-life identity leaked). To put it simply, a social networking site would have little choice but to prohibit all sexually-explicit depictions on its site, regardless of whether the subject of the depiction has given consent or not.

You claim to value consent, but push a regime which would significantly burden consensual adults who wish to share sexually-explicit depictions of themselves. Perhaps you see no value in such exchanges, but this is just another control on the legitimate sexual choices of others. Something which you claim to detest. (At least when it comes to women, but I can only assume you feel the same way about attempts to control men’s sexuality)

Regarding child pornography, yes. I would favor imposing stiffer penalties on producers/disseminators than on pure consumers.

Regarding your final point, I can only speak for myself, but my concern has been much less for “potential perpetrators” and much more for innocent bystanders whose activities do not involve knowingly producing or distributing non-consensual pornography. More generally, I have found your approach to be so absolute in considering only the victims of revenge porn that I didn’t see where I could potentially disagree with you in their favor. I have of course expressed agreement on multiple occasions that the harm done to revenge porn victims is very real and not to be dismissed.

QUOTE: ” I also don’t know why you would assume that the statute would apply retroactively, given that I gave no indication that it would.”

The statute you drafted would apply prospectively to the conduct of “…distribution, publication, dissemination, transfer, sale, purchase, delivery, trade, offering, or advertising” of specified visual depictions NO MATTER WHEN THOSE DEPICTIONS WERE CREATED, even decades ago. Surely you must have noticed that library shelves groan under the weight of previously-created and published depictions of people in their underwear or less (see http://www.jir.com/geographic.html)?

And you are “not sure why [sexually explicit photos of European beachgoers] shouldn’t require written consent,” but you are refusing to address the real problem with your proposed statute, which is that it defines by legislative fiat a vast universe of NON-sexually-explicit, non-obscene images as contraband. As written your statute would be just a fishing license for aggressive prosecutors, of which we have too many already. (Plus as I pointed out, US v. Stevens stands for the proposition that the First Amendment doesn’t allow you to just outlaw images you don’t like, even those made under unsavory circumstances.)

You posted your draft statute. Instead of accusing all critics of lack of concern for the victims of so-called revenge porn, you might want to consider addressing some of the flaws in your draft statute, which include astonishing overbreadth, an unworkable regime for distinguishing whether models consented to appear in images, and fairly obvious unconstitutionality.

Your definition of “consent” has problems beyond the strange evidentiary rule you propose. At what stage in the production and dissemination of images do you think consent should be required? If someone consents to a companion taking a photo in January, falls out with that person in March, and objects to publication of the old photo in May, does the original consent authorize the later publication? If not, you would give every professional fashion model a right to extort extra payments from magazines (for example) despite having no such right under the (Constitutionally nearly exclusive) copyright law. If initial consent is consent for all time, then your statute won’t help a large fraction of “revenge porn” complainants.

The older statute you rely upon most heavily (for your absurd definition of sexually explicit images), 18 USC § 2256, is constitutional only because it regulates the non-expressive conduct of making secret or ambush videos in a very restricted set of physical places. It doesn’t reach publication of images not so made at all, and the definition of forbidden subject matter it contains (which you have mistaken for a generally-applicable definition of sexually-explicit imagery) merely further narrows, rather than expanding, the elements of the crime that statute defines (which starts, as I just mentioned, with non-expressive conduct in places of intimate personal privacy).

Since you brought it up again, we probably should reconsider your assertion that “revenge porn” causes so much harm it should be criminalized (assuming we can define it specifically enough to create an enforceable statute). You have not explained how “revenge porn” without extortion causes any legally-cognizable harm apart from possible copyright or contract violations (you could perhaps make the unconsented-to publication of bedroom snapshots a contract violation by imputing a non-disclosure contract to all intimate relationships, though precedent would be against you). Publishing images of scantily-clad ex-lovers obviously involves no violence to their persons, no injury to their chattels, no larceny, no injury at all, really, except to their own feelings of dignity. “Revenge porn” at most “holds the [imagees] up to public ridicule or contempt,” which could be libel if the images were FALSE, but since you assume the images are true, they cannot be libelous. In America no one has the right to stop others from speaking truthfully about her. Certainly many people dislike being exposed to ridicule or criticism, but we Americans believe, for good reasons, that critical speech is too valuable to forbid. Among other things, maintaining an informed electorate in a democracy requires free speech, including critical and even vituperative speech.

You wrote “Victims of non-consensual pornography of any age are harmed each time a person views or shares their intimate images, and to allow the traffic in such images to flourish increases the demand and the pervasiveness of such images.” The first part of that is obviously false, since the supposed victim cannot even know when someone far away “views” an image. The second part may be true– certainly the demand for non-contraband images is higher than for forbidden ones– but your attempt to diminish the production of “non-consensual pornography” by criminalizing photos of underwear models risks burning down the house to kill a flea. The child pornography law is constitutional only because the harm in the actual production of the images is so extreme that extreme measures are justified to avert it– but the Supreme Court has already ruled that lesser harms cannot justify such measures. The “harm” caused by publication, later, of a photo, taken earlier without objection, of an ex-girlfriend in her underwear, is not just “lesser,” it is trivial compared to the child-abuse necessary to the production of child porn.

Obviously, if there is any crime in “non-consensual pornography” it occurs only at the time when the images are made without consent. If the model consents at the time to be photographed (or whatever) then later “viewing” of the resulting images is no crime (it could violate some contract, and copying/publication could possibly violate a copyright, and as you wrote, the images might conceivably be used in some kind of extortion).

Any time someone appears in public voluntarily, they are, and to satisfy the First Amendment MUST BE presumed to consent to others seeing them, and even photographing them. Your proposed statute would criminalize publication of photos even of scantily-clad people on a public beach. That is silly.

Yes, Horspool, I did post my draft statute, and I did so by posting under my own name so that I can take responsibility for my own ideas. And I have taken good-faith, valid points under consideration and responded even to points that didn’t meet that standard. I think I’m doing my part to engage in a reasonable exchange of ideas. I have no problem acknowledging that my proposal may be flawed, or that other proposals might be better. I have no problem acknowledging that some of your points might have some validity, though the hysterical, self-righteous tone you have adopted does make them hard to see. The fact that I do not simply bow down before your self-proclaimed wisdom does not mean I am not interested in improving or modifying my proposals. It might mean that your points are just not that good, or poorly expressed, or reveal a fundamental commitment to inequality that I find both uninteresting and untenable.

prometheefeu, we clearly calculate the burdens imposed by non-consensual sexual use differently. If we could eliminate the most serious forms of sexual harm while not inconveniencing any good-faith actor, that would be wonderful. But this has not proven possible in this or other contexts regarding sexual use. Rape laws burden spontaneous and ambiguous sexual contact, as do regulations of sexual harassment. To the extent that we should mourn this burden, the blame for it must be squarely placed on the forces that provoke the need for such restrictions, not the restrictions themselves.

It’s worth pointing out that the suggested definition of “sexually graphic” would not be constitutional even if it were limited to minors. Not only is the suggested definition not limited to nudity; it does not even require the “sexually graphic” image to be sexually suggestive.

As for the comment “It strikes me as strange that many of you are offended and agitated about aspects of my proposal that theoretically create harm or inconvenience to potential perpetrators, but do not express any concern about the actual and documented harm the statute is trying to address” — you are not only attacking a strawman, you are doing so in a patently offensive and condescending way. Your critics are not agitated about the “theoretical harm or inconvenience to potential perpetrators” your proposal creates; they are agitated by your attempt, however well-intentioned, to gut the First Amendment through a vastly overbroad criminal law. If you don’t believe in the First Amendment, fine; that’s another conversation we can have. But don’t insult your critics by assuring them that you take freedom of expression seriously and then respond to First Amendment criticisms of your statute by claiming that they are really motivated by a lack of concern for women.

Professor Franks has explained very clearly that her statute is a work in progress. Particularly in light of the stage of the project, the level of vitriol emanating from some of the comments (particularly the pseudonymous ones) is unnecessary.

It seems to me that the demand that the proposed statute fit precisely into an existing category of unprotected speech misses an important point. None of the categories of unprotected speech were unprotected until a court said they were unprotected. Put another way, the fact that the statute might require a new category of unprotected speech, or an extension of an existing category, seems not inherently problematic for the statute.

Somewhat relatedly, a question for Professor Franks: to the extent you would like to tether the statute to an existing category of unprotected or less protected speech, I wondered whether you had thought about analogizing the statute to the one in Virginia v. Black? For anyone unfamiliar with that case, the Court there upheld a statute prohibiting cross-burning with intent to intimidate. Perhaps you wish to avoid an intent requirement because you are (understandably) worried about creating too high a burden of proof, but perhaps failure to obtain written consent could create a presumption of bad intent. More to the point, drawing upon that particular line of cases would call attention to the equality harm that it seems to me has informed your thinking on this topic. Personally I wish the court’s hate speech jurisprudence went much further than it did, but Black shows some willingness to acknowledge the harm of inequality-reinforcing speech. My apologies if this takes the conversation further into the First Amendment realm than you wish to go at this point.

Dear Professor Franks,
I’ve seen you on HuffPost Live and I just wanted to say that you were terrific. I agree with everything you say, and yes, you are right: people usually think that the harms of non-consensual pornography are not real or serious. I am also noticing that people are inclined to have the sort of attitude that yeah, harming women or girls is bad, but giving them protection is not a priority. Why pay the costs of prosecuting the offenders and try to fix the system, when we can just ask them to refrain from certain behaviors?

Prof. Franks, it appears I was mistaken. I thought you wanted to discuss the merits of your legislative proposal. It seems that you would prefer a meta-discussion about who is the more genteel debater. I suppose we’re displaying different styles of discourse as discussed here: http://alastairadversaria.wordpress.com/2012/08/07/of-triggering-and-the-triggered-part-4/ (an article I think you might find interesting despite it being recommended by me).

As for addressing objections to your draft statute, I now get the idea that you want commenters to propose ways to make it tougher, not just point out areas in which it is ill-drafted or likely to produce unwanted effects. Well, I would think as the drafter and chief proponent you would want to be made aware of weaknesses so you could address them, but I will offer here some sincere suggestions for improving your draft to get it to do at least some of what you want. I will also make some criticisms without suggestions where I really don’t have anything good to offer…

So far you haven’t seriously addressed the following problems with your draft:

(1) The time at or during which the exculpating “consent” to publish becomes or remains operative is unclear (is consent to be required at time of image creation? Or of any publication? The draft language is genuinely ambiguous), as is whether consent can be withdrawn (and if so, whether doing so could create retrospective culpability). I suggest you require consent only at image creation time (vide infra for an extended discussion).

(2) As drafted, the statute would criminalize the distribution, etc., of visual material created before enactment and previously regarded as entirely lawful, unless perhaps retrospective permission (see (1)) were obtained – even though that might be impossible, making your statute a censorship law of unprecedented breadth. I suggest that you add language exempting pre-enactment images from the requirement of consent, lest the whole statute be ruled unconstitutional.

(3) Your draft seems to make lack of consent an element of the crime (to be proved beyond a reasonable doubt by the prosecutor) in the first clause, but in the second clause tries to presume lack of consent and burden the defendant with proving consent – which do you want to do? If the latter, your approach raises Due Process issues hard to fully address in a blog comment. I suggest you drop the “affirmative defense” stuff.

(4) Your definition of “sexually-explicit” imagery, pulled from another statute though it may be, is so broad it covers stuff like Victoria’s Secret underwear catalog images or tourist photos of European sunbathers – you should narrow your draft definition considerably. I cannot tell you how exactly, because though I, like Justice Stewart, may know it when I see it, I am unable to supply any definition broad enough to do what you want and narrow enough not to offend the Constitution.

(5) Your draft would criminalize both the creation and the subsequent dissemination of specified images – but those actions inflict different harms, and restricting each implicates different Constitutional concerns, so you should probably address them separately.

(6) Your statute would likely violate the First Amendment as it is presently construed – I realize you dismiss this concern, but I think you should give it more attention, especially because that concern is standing proxy for a whole raft of concerns about criminalizing speech and handing Federal prosecutors new dragnets.

(7) You have not said whether you want to override the E-Sign Act to require your proposed “written consent” to be paper-and-ink rather than electronic, even though Prometheefeu raised the issue, and overriding E-Sign would chill Internet publishers greatly – your statute might be better off demanding “some record of consent.”

(8) Your list of permissible reasons for creating or disseminating specified images seems too restrictive – you should at least add newsgathering and publication plus comment on matters of public concern or interest.

Let me address (5) in more detail since several commenters have alluded to it but not clearly. Creating an explicit image without consent you have analogized to rape, and it is clear that there is a specific time and place where a model’s consent to some physical conduct (by the photographer) could be required. The “pornographic” images* would serve as evidence that they were “created” so the question in court would be whether the model consented at the time (or if you allowed it, consented retroactively) to their creation. The harm would be the deprivation of the victim’s liberty (to refuse the “job” of pornographic model), analogous to slavery. (You could grant a presumption that images of people voluntarily appearing in public in a state of deshabille were consented-to, averting a bunch of constitutional issues.) The perpetrator would be the photographer (and perhaps her confederates) and the question of guilt would be reasonably justiciable.

Making mere dissemination of some non-obscene images a crime is a different sort of problem. Mere dissemination (unless the images are obscene under existing statutes, in which case no new statute is needed) causes no harm other than idiosyncratic “hurt feelings,” which American law has long regarded as insufficient grounds for criminal prosecution and very nearly non-compensable in tort (libel requires both falsity and proof of effects on other people, not merely mental distress to the subject). Worse, since these images are (by definition) only unlawful to disseminate if created without consent, and the presence or absence of consent cannot be discerned from the images themselves, no prospective defendant could possibly recognize and avoid forbidden images. Due process concerns militate against criminalizing conduct defendants cannot reasonably avoid. You have suggested that anyone who comes into possession of any “pornographic” but not obscene images ought to independently obtain consent from the model, but that seems insouciant to your critics – since obtaining “written consent” before handling any potentially-contraband image is logistically impossible, your proposal seems likely to impose a gigantic and unconstitutional “chilling effect” on lawful conduct. Prosecuting a disseminator not in privity with the photographer poses the problem of identifying any nexus between defendant and “victim.” If a disseminator can rely on a presumption that the image creator had consent, she will be virtually immune, and if not, she will be impermissibly chilled from exercising her free-speech and -press rights, because no reasonable actor could be expected to get independent permission from some model whose image has entered the stream of commerce.
The question of consent-time also makes criminalizing dissemination problematic: if the statute says a model can withdraw consent to dissemination and thereby make the same action which is lawful on Monday a crime on Tuesday by private rather than legislative action, prosecution would violate due process.

You have said that the impetus for this law is to protect people who might be forced into “non-consensual pornography,” yet you have also said that it is to protect people against being exposed to ridicule by former intimates. Well, if consent to dissemination can be withdrawn some time after image creation, or if fresh consent is required for each act of dissemination (on every web click?) then a new law might protect people who permitted image creation but later objected to image dissemination. However, as mentioned, a law that operated in that way would chill too much lawful expression and violate due process. On the other hand, a law which regulated image creation (as discussed above) might constitutionally deter and punish the creation of “non-consensual pornography” but would not enable people vexed by publication of images to get their publishers censored or jailed.

I suggest that if you are really concerned about the creation of “non-consensual pornography,” you settle for trying to criminalize its creation. Your proposal to go beyond that and criminalize the mere dissemination of non-obscene images is fraught with problems. In the interest of this debate I have tried, but I cannot think of any way to draft a law, consistent with American freedoms of speech and press, which would give people offended by publication of non-obscene images of themselves the power to criminalize the dissemination of those images at a later time (except possibly additional dissemination by a creator duly convicted of creating the images without consent– but such a stricture would not bind any stranger).

Oh, and a point of personal privilege: just why would you suggest that I espouse a “fundamental commitment to inequality?” I want everyone to enjoy equality under the law. I want the law to protect everyone equally. I want crimes to be defined by law; not by the whims of individuals (whether complainants or aggressive prosecutors). I want laws which everyone can reasonably obey. And I want laws that don’t violate fundamental rights such as the right to free speech or the right not to be punished without due process.

*I assume for argument’s sake that you can write a suitable definition of the sort of “pornography” that shouldn’t be created without the model’s consent.

I have blogged for eight years and commented on blogs for longer than that. I have never commented anonymously, and I never will. But if you do not want to deal with commenters who — for a variety of reasons, some illegitimate, others completely understandable — wish to remain anonymous, I suggest that you don’t guest-blog again. I would also suggest that you need to fundamentally rethink your expectations for the kind of discourse that occurs on a blog; to describe Horspool’s comments, which are remarkably well-thought-out and argued for a blog, as exhibiting a “hysterical, self-righteous tone” says far more about your oversensitivity to criticism than it does about his or her writing. I wish all the comments on my blog were as well-reasoned.

Horspool, I think you may be overlooking potential gradations of consent here that could point toward a solution to the drafting problem – certainly I can envision a situation where the model, at the time of creation, would be consenting to the creation but not the dissemination of the image.

Indeed, the dissemination half is the only part that really needs attention – nonconsensual creation of pornography is already criminalized under various voyeurism-type statutes. I certainly agree that it would be plainly unconstitutional under current First Amendment precedents to criminalize mere possession of such an image, but I think a statute could be drawn in such a way as to narrowly punish dissemination of “revenge porn.” Will have to think further before I have any specific language though.

(1) a visual depiction of the victim’s “intimate areas” [I think the 18 USC 1801 definition is fine for these purposes]

(2) the victim has a reasonable expectation of privacy in the image [I would let the courts develop this element on a case-by-case basis, but it would screen out the professional model cases and the public sunbathing cases]

(3) the defendant disseminates the image in a manner to which the victim does not consent [or “publicly disseminates without consent” – that presents fewer problems but also fails to catch some conduct that I think should be covered]

(4) the defendant knows or has reason to know that the victim does not consent to the dissemination [this element does the heavy lifting on the First Amendment/Due Process problems, while also presenting the biggest enforcement difficulty; but I don’t think it would render the statute toothless]

anon, yes, I have been thinking about how Virginia v. Black might be useful in this context, but I haven’t thought it all the way through (I find it reassuring that I’m not the only one who thinks it might be applicable, so I’m glad you raised it). Apart from the harm that non-consensual pornography inflicts on individual victims (which is serious, no matter how casually dismissed by some commenters), it inflicts discriminatory harms on society as a whole. Like rape, domestic violence, and sexual harassment (that is, abuses directed primarily at women and perpetrated primarily by men), revenge porn reinforces the message that women’s bodies belong to men, and that the terms of women’s participation in any sphere of life are to be determined by men’s indulgence. Revenge porn results in women losing jobs, leaving school, changing their names, and fearing for their physical safety – all of which drive women out of public spaces and out of public discourse. In addition to cases like Virginia v. Black, the fact that Title VII’s restrictions of some forms of discriminatory speech and action have not been held to offend First Amendment principles suggests that there may be a kind of discrimination exception to full First Amendment protection. In any event, your comment is a nice reminder that First Amendment doctrine is constantly evolving, despite some commenters’ apparent belief that it is both unambiguous and fixed in stone (and that they are Moses…). It is also a nice reminder that the First Amendment is not the only Amendment, or indeed the only constitutional value, that is relevant to this discussion.

Several other very interesting points have been raised in this thread so far, particularly the suggestion that creation and dissemination might be usefully (and even necessarily) separated for the purposes of a criminal statute. I think that might be right – I’ve been struggling with a way to reach situations in which no consent of any kind was given as well as situations in which the scope of consent was exceeded, but perhaps the Video Voyeurism Act could reach the first if it were amended to include the language “anyone who uses the mail, any interactive computer service, or any facility of interstate or foreign commerce …” instead of only reaching those “in the special maritime and territorial jurisdiction of the United States.” The VVA is quite tightly crafted, requiring that a person “capture an image of a private area of an individual without their consent and knowingly do[] so under circumstances in which the individual has a reasonable expectation of privacy.” It’s even possible that the VVA could be amended to reach the second type of situation (thus negating the need for any new federal statute) – non-consensual dissemination of an image that was originally consensually captured – but that will be a more difficult fix. The bottom line, I think, is for a statute to explicitly recognize the contextual nature of consent and account for it in some meaningful way (it has occurred to me that Fourth Amendment jurisprudence requiring searches to be reasonable both in inception and scope may be helpful here, but I haven’t worked that all the way through yet either). As I’m writing this, I see that Griff has just posted a proposal on this issue, and though I haven’t processed it very deeply yet, my initial reaction is that I find it very promising – thanks, Griff.

With regard to the written consent aspect, I see no problem with e-sign or other ways of increasing the efficiency of documentation. The affirmative defense section is my attempt to provide some clear and verifiable way for people who are engaging in consensual activity to prove that they are doing so, and I really welcome more suggestions about exactly how to make this work. Some lawyers tell me that 18 USC § 2257’s record-keeping requirements work pretty well with regard to age verification, so perhaps piggybacking on those requirements gets us part of the way there. The real problem with § 2257, though, is that it’s very much “CDA 230-compliant,” which means that only “producers” of pornography – and fairly narrowly defined ones at that – are on the hook. Sites that claim they are only providing forums for content submitted by third parties (that would be the majority of revenge porn sites) wouldn’t be affected.

There are obviously many other points that have been raised in this discussion so far that I haven’t addressed. This is not necessarily because I don’t find those points interesting (although this is true for some of them); it may just be because I’m not yet sure what I think about them or haven’t yet had the time to respond to them. Many of you have clearly invested a lot of time and effort in your comments, and I appreciate that. Several of the comments have been genuinely helpful, and many of the others have been, shall we say, at least helpfully revealing.

I was traveling and was not able to respond earlier, but I am glad to see that Horsepool has in her latest comment taken up most of my concerns.

A word on e-signatures. I do think they address at least some of the problems with regard to internet publishers. However, the requirement to keep a “record of consent” still creates privacy issues. If the records of consent were leaked, previously pseudonymous or anonymous photographs could be associated with legal identities, placing the victims of such leaks in a situation quasi-identical to that of revenge porn victims.
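Since the leak risk turns on what a “record of consent” actually stores, here is a minimal sketch (purely illustrative, with all function names my own invention, and not a legal proposal) of one way such a record could avoid holding legal identities in plaintext: keep only a salted hash over the image and the signer’s identity, so a leaked record by itself links neither to the other, while a custodian presented with both can still verify the consent.

```python
import hashlib
import hmac
import os

def make_consent_record(image_bytes: bytes, signer_id: str) -> dict:
    # Store only a salted hash, never the signer's identity in plaintext,
    # so a leaked record cannot by itself tie a legal identity to an image.
    salt = os.urandom(16)
    digest = hashlib.sha256(
        salt + hashlib.sha256(image_bytes).digest() + signer_id.encode()
    ).hexdigest()
    return {"salt": salt.hex(), "digest": digest}

def verify_consent(record: dict, image_bytes: bytes, signer_id: str) -> bool:
    # Verification requires both the image and the claimed identity, which
    # only the parties to the original consent can jointly supply.
    salt = bytes.fromhex(record["salt"])
    digest = hashlib.sha256(
        salt + hashlib.sha256(image_bytes).digest() + signer_id.encode()
    ).hexdigest()
    return hmac.compare_digest(digest, record["digest"])
```

This only illustrates the privacy trade-off; a real scheme would need key management, revocation, and legal vetting well beyond this sketch.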

It also seems to me that even if third-party liability is desirable in some cases, a person who believes in good faith that a depiction was produced and disseminated in compliance with this statute should not be held criminally liable if that good-faith belief happens to be mistaken. To put it concretely, a website should be able to certify either that it has obtained consent or that it has obtained a certification that consent was obtained, and in so doing immunize its customers under this statute.

This actually brings me back to the issue of mens rea. Under the statute as written, if I send you an email containing non-consensual pornography, you are liable under the statute as soon as you open the email and the picture or film loads. That does not seem particularly sensible, since you may not have known what the email contained.

On a more meta note, and addressing your second-to-last comment: I don’t think you can blame the perpetrators of revenge porn for the collateral damage resulting from a particular approach to the problem they have created. There are many ways to balance the many interests at stake here, and we are all responsible for the collateral damage of our particular attempts at striking a balance, as well as for the ways a particular attempt may fail to address the whole of the problem.

Little wonder indeed. It’s all about the driving worldview. Given how easily and recklessly she throws around the charge of misogyny and indulges her instinct to make female suffering central to all concerns (an instinct we all share, which is why many liberal men are no less chivalrous than cowboys), one should have seen her responses coming from a mile away.

I think the idea of criminalizing revenge porn is sound, but I disagree with many of the ideas in this post.

1. First and foremost, the reason such a law would be consistent with the First Amendment is not that it fits under Miller or within a putative new category of unprotected speech, but that revenge porn is an outrageous and exploitative invasion of privacy that is already subject to civil liability under the tort doctrines of intrusion, public disclosure of private facts, right of publicity (which shares characteristics of intellectual property), and/or intentional infliction of emotional distress. Taken together, the *substantive* reach of these torts seems to extend to virtually everything that is objectionable about revenge porn.

2. I agree that there is a good case for augmenting existing tort liability with criminal penalties. As Professor Franks’ own discussion suggests, however, the issue is not that existing tort law is substantively too narrow (though it is in certain states), but that it is procedurally and economically difficult to enforce. In my view, the substantive reach of the criminal law should be limited to acts that are analogous to the existing torts.

3. The present draft, however, is much broader than any of these torts. This breadth not only deprives the proposal of its best constitutional defense, it also criminalizes activities (for example, photographing public displays of affection by public figures) which strike many people (including me) as being protected by the First Amendment.

4. Though most objectionable revenge porn falls within at least one of these torts, it rarely if ever falls within *all* of them. The solution, it seems to me, is to proscribe the creation or dissemination of non-consensual pornography in circumstances that have been recognized as constituting sanctionable invasions of privacy. Something roughly like this:

Whoever:

a. Produces a sexually graphic visual depiction of an individual, without the individual’s consent, where the individual is depicted in a private place [intrusion];

b. Publicly discloses a sexually graphic visual depiction of an individual, without the individual’s consent, where the individual is depicted in a private place [public disclosure of private facts];

c. Discloses for commercial gain a sexually graphic visual depiction of an individual, without the individual’s consent, where such disclosure was not of public concern [right of publicity]; or

d. Produces or discloses a sexually graphic visual depiction of an individual, without the individual’s consent, for the purpose of causing emotional distress to that individual, where such disclosure is not of public concern [intentional infliction of emotional distress];

shall be fined under this title or imprisoned not more than one year, or both.

I want to thank you for writing such a thoughtful article on an important topic. While efforts to shame and humiliate women are not new, the idea of non-consensual pornography seems to be picking up traction in popular culture and academia as of late. Your article is a much-needed piece of advocacy on the topic and I look forward to reading more from you in the future.

I think the underlying idea is fantastic. I do think the draft statute is somewhat overbroad.

A lot of the objections have focused on ways in which the draft language could apply to normal and non-problematic acts (like getting an e-mail from Victoria’s Secret about a new sale).

Would it make sense to structure it more like internet harassment statutes (since that’s really what this is — internet harassment)?

That is:

-Any person who produces or distributes a sexually graphic visual depiction of an individual with the intent to harass, alarm, stalk, disturb, invade the lawful privacy of, or cause emotional distress to such individual; or with reckless disregard for the possibility that this production or disclosure will have the effect of harassing, alarming, . . .

-There shall exist a rebuttable presumption that the public disclosure of a sexually graphic visual depiction not taken in a commercial setting, of a person who is not a public figure, is likely to cause that person alarm and distress.

And a few scholars, like Ann Bartow, have written about how some porn niches focus on this, depicting either real or fictional instances of unconsented ejaculation inside a woman, which is then highlighted for marketing purposes.

AF and Kaimipono, thank you very much for your comments. Along with Griff’s points, they help me think more clearly about how to refine and improve the statute. AF, as is probably evident from what I’ve written on this site and elsewhere about online harassment, I don’t agree that the harm here is only about privacy, though I do agree that it is also that. In my view, the focus and impact of so many forms of online harassment is clearly a matter of gender discrimination. That being said, there’s overlapping consensus between us that this is a privacy harm, and I think that crafting the statute to more closely track privacy torts might be advisable for many reasons.

Your proposed statute is very appealing, though I have two concerns about it. One is how to define “depicted in a private place.” This, like the Video Voyeurism Act’s “reasonable expectation of privacy,” might be interpreted too narrowly to do victims much good (or, on the other hand, too broadly to comport with fairness and notice, though I think this less likely). Two, the intent required by the language “for the purpose of causing emotional distress to that individual”: as I mentioned in my original post, it’s not always clear what a perpetrator “intends” when he either captures or discloses a non-consensual image. He might be seeking “only” individual sexual gratification, or the admiration of his peers, or something else entirely.

Here’s a scenario: a man positions a camera up the skirt of a woman not wearing underwear as she stands on a subway platform. He does not disclose the picture to anyone else. It does not seem that she is being depicted in a private place, and our photographer may say that he is only taking the picture for purposes of sexual gratification. He may provide further evidence of his lack of intent to cause emotional distress by explaining that he did his best to take the picture surreptitiously, so that the woman was not even aware that he was taking it. It doesn’t seem that your statute would reach his behavior. I’m not arguing that it has to (perhaps existing voyeurism laws, or new voyeurism laws, are better suited to address it), but I wonder if you would consider this a problem.

That leads rather nicely into Kaimipono’s “rebuttable presumption” suggestion, which I like a lot. The addition of such a presumption would reach more cases than AF’s statute standing alone, although I think the presumption would have to cover production as well as disclosure – or perhaps, Kaimipono, you deliberately omitted production from the second clause of your proposal even though it is included in the first?

There’s still the thorny issue of what counts as “sexually graphic” for the purposes of a new law. I can see why the language I borrowed from 18 USC 1801 raises problems here: it’s plausible that the reason 1801’s inclusion of “underwear-clad” genitals and part of women’s breasts is acceptable is because it is limited by the “reasonable expectation of privacy” language, and that something similar might be required here to avoid overbreadth. Alternatively, the statute could restrict the definition of “sexually graphic” to genitalia and/or sexually explicit conduct.

While I was aware of the problem of reproductive coercion, I did not know that a niche form of porn had developed around it. My first reaction is that this, like porn made of actual sexual assaults or without the individual’s knowledge that she is being filmed, raises additional issues of encouraging actual crimes. One of the points the Supreme Court made in Ferber regarding the dissemination of child pornography was that it encouraged the actual crime of child sexual abuse by increasing the demand for such images. Similar arguments, it seems, could be raised here.

That was an inadvertent omission, yes. I think production is equally problematic, for reasons you’ve discussed here.

On the coercion porn — I am not an expert, and I really don’t want to become one. But, I have seen a few news accounts. There is apparently a wildly popular genre of interview-coercion porn. A woman comes to a “job interview” with a male employer. It starts as an interview. He slowly starts asking for sexual favors. She makes clear that she would rather not. He says she has to if she wants the job. She then has sex with him. (There was a write-up in the Phoenix New Times about it, a few years ago.)

Oddly enough, the New Times article noted that at least some of these women are actually _paid porn actresses_ (at least for the one company discussed). So in at least some cases there is not actual coercion so much as a script of coercion that is marketed as such.

I would strongly suspect that there is at least some actual coercion taking place. But it could be tricky to try to sort out actual coercion from scripted pseudo-coercion.

Professor Franks: Thank you for your comments. I didn’t intend to suggest that the only harm from non-consensual porn concerns privacy. My point was that the existing privacy torts seem to cover most if not all non-consensual porn, and that since the constitutionality of these torts is largely settled, they offer a practical way of avoiding the constitutional difficulties identified by other commenters.

As for the subway-platform hypothetical that you describe, I agree that it is objectionable. If the camera is hidden on the platform floor, the picture may well be covered by a version of the first prong of my draft statute, insofar as there is a reasonable expectation that body parts covered by clothes will not be seen by others; in any event, such a picture seems worthy of punishment, whether under this statute or voyeurism laws. On the other hand, if the picture is taken in a public place without the use of subterfuge, and is not distributed for commercial gain or for the purpose of causing emotional distress, I think drafting a statute that criminalizes it presents serious difficulties from a First Amendment perspective. There would certainly be some behavior within this category that is objectionable (e.g., capturing a stranger’s “wardrobe malfunction”), but also a lot that is not. For example, I don’t think it should be a federal crime to take a photograph of a group of men playing volleyball at a nude beach, or of a large group of college kids streaking the quad at midnight before the first day of finals. It also probably shouldn’t be illegal to take a snapshot of two people making love in a public park.

Incidentally, if the subway picture is not taken with a hidden camera, but instead by surreptitiously angling a camera’s line of vision up a woman’s skirt, I would think that too would be subject to punishment. The key idea is that a woman wearing a skirt has a reasonable expectation that her genitals are not going to be seen by others.

“I would strongly suspect that there is at least some actual coercion taking place. But it could be tricky to try to sort out actual coercion from scripted pseudo-coercion.”

I would imagine that it would be a rather easy question of fact in most cases. After all, you just need to show that the person who claims coercion was on their way to a job interview rather than a scripted pornography shoot. Of course, there are some cases where the facts will not be clear, but in general this does not seem very difficult to me.

On the broadcasting of such acts of sexual coercion, I think the best way to handle it is to make it an aggravating circumstance. The acts you describe here are already serious crimes, and we could, I suppose, criminalize the recording and publication of the commission of sex crimes in which one participates.

On the other hand, rapists have already shown themselves to be undeterred by rape laws, and any reasonable sentence for the recording and publication would fall well within a standard deviation of the sentence the rapist should expect for the original rape. Furthermore, in a sense, the criminal is broadcasting evidence of their crime, which could be helpful in convicting them. So, on balance, I am not sure there is much point in such criminalization.

Has anyone ever been convicted under the NJ statute (or any other state statute)? I read that a man (Brandon Carangelo) had been charged under it for posting naked pictures his ex-girlfriend had emailed to him. I haven’t heard any more about it, though. It would be interesting to know if there have been successful prosecutions. If there have been, without any major First Amendment commotion, then the statute could serve as a guide.

My gut says that, if it is well constructed, a statute criminalizing non-consensual pornography would not violate the First Amendment, but I do agree with some of the other commenters that this post does not contain a thorough First Amendment analysis (of course, it is a blog post and not a journal article so I won’t fault Professor Franks for lack of thoroughness).

I realize the desire to dodge the hypothetical, but my sunbathing example, and AF’s genital photograph, are both core examples of First Amendment-protected speech. There are no two ways around that.

There are a number of faddish areas in the law where scholars tend to get myopia and forget the basics. Cyberbullying, revenge porn, and hate speech are three areas where folks tend to ignore First Amendment prohibitions on sweeping legislative proposals designed to catch all allegedly morally bad conduct. But the way the First Amendment works is to treat overbreadth as having actual legal effect, and underbreadth as regrettable but without legal relevance.

Like it or not, there is a thumb on the scale AGAINST additional legislation in these areas.