Woodrow Hartzog

Woodrow Hartzog is an Assistant Professor at the Cumberland School of Law at Samford University. His research focuses on privacy, human-computer interaction, online communication, and electronic agreements. He holds a Ph.D. in mass communication from the University of North Carolina at Chapel Hill, an LL.M. in intellectual property from the George Washington University Law School, and a J.D. from Samford University. He previously worked as an attorney in private practice and as a trademark attorney for the United States Patent and Trademark Office. He also served as a clerk for the Electronic Privacy Information Center.

According to NPR, 300-plus teenagers broke into former NFL player Brian Holloway’s vacation home, causing massive damage and showcasing their exploits on social media. In response, Holloway created a website, helpmesave300.com, that collects the alleged culprits’ social media posts. He claims this repository has enabled teens to be identified, and that the growing list of names is “being turned over to the sheriffs (sic) department to assist them to verify and identify the facts.”

Online stalking, harassment, and invasions of privacy can be incredibly destructive. Yet very little empirical data exists regarding these incidents. This paucity of data hinders educational, support, research, and policy efforts. Without My Consent, a non-profit organization seeking to combat online invasions of privacy, is conducting research to better understand the experiences of online harassment. If you are 18 or older and have experienced harassment on the Internet, please consider taking their survey.

Few things represent the age of social media better than posting a selfie. We share these ubiquitous self-portraits with such urgency you’d think we’d cease to exist if we stopped producing them at a rapid and ongoing rate. Think about taking a trip to a gorgeous location. If you exercise “selfie-control” and don’t post a picture of yourself at a place like the beach, did the exquisite voyage really happen?

For some crimes the entire law enforcement process can now be automated. No humans are needed to detect the crime, identify the perpetrator, or impose punishment. While automated systems are cheap and efficient, governments and citizens must look beyond these obvious savings as manual labor is replaced by robots and computers.


“The way privacy law largely works for consumers in the United States is through what regulators call ‘notice and choice,’” said Samford University law professor Woodrow Hartzog by email. “That means that so long as users were put on notice of an app’s data practices and made the choice to continue using the app in light of that notice, then the app’s data practices are presumptively permissible.”

“Using location data this way is dangerous,” said Woodrow Hartzog, a law professor at Samford University, via email. “People need to keep their visits to places like doctor’s offices, rehab, and support centers discreet. Once Facebook users realize that the ‘People You May Know’ are the ‘People That Go To the Same Places You Do,’ this feature will inevitably start outing people’s intimate information without their knowledge.”

When you have a conversation with a chatbot, it’s clear that you’re talking to software, not a human. The conversation feels stiff. But some bots are adept at shooting the breeze, a skill that can make it hard to know you’re conversing with code. “Disclosure is going to be really important here,” says Woodrow Hartzog, a law professor at Samford University. “Problems can come up when people think they’re dealing with humans, but really they’re dealing with bots.”

Just because someone might be able to use their ear at checkout doesn’t mean it’s necessarily going to happen anytime soon, though. “Biometrics are tricky,” Woodrow Hartzog, an Associate Professor of Law at Samford University, told WIRED. “They can be great because they are really secure. It’s hard to fake someone’s ear, eye, gait, or other things that make an individual uniquely identifiable. But if a biometric is compromised, you’re done. You can’t get another ear.”


Robots are starting to look suspiciously familiar. Increasingly sophisticated robots designed to resemble us are striking up more and more symbiotic relationships with humans, at home as our companions and at our workplaces as colleagues.

Human-robot interactions will continue to evolve as robotic technology transforms the way we see our creations and the way they react to us. But as machines cease acting like machines and become more integrated into our lives, how will we feel about them? And, dare we ask, how will they feel about us?

CIS Affiliate Scholars Peter Asaro, Ryan Calo, and Woodrow Hartzog are listed as participants for We Robot 2014. Robotics is becoming a transformative technology. We Robot 2014 builds on existing scholarship exploring the role of robotics, examining how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupt existing legal regimes or require rethinking of various policy issues. If you are on the front lines of robot theory, design, or development, we hope to see you there.

What harms are privacy laws designed to prevent? How are people injured when corporations, governments, or other individuals collect, disclose, or use information about them in ways that defy expectations, prior agreements, formal rules, or settled norms? How has technology changed the nature of privacy harm?

On the other hand, even algorithms can make mistakes. They are, after all, written by humans. And legal texts can be difficult to translate into a formalized language. They are, says Woodrow Hartzog, simply not made to be automated. Nor are they made to be enforced one hundred percent of the time.