Cornell communication professors Jeff Hancock and Jeremy Birnholtz call them "butler lies," in honor of the personal assistants of yore who would provide a buffer when unwelcome guests turned up at the door.

Nowadays, we tend to rely on technology to serve that buffering purpose, making excuses about needing to get offline when we don't actually have to.

In fact, Hancock has found that up to 10 percent of text messages contain lies, and one-fifth of those are butler lies.

But such lies are hard to sustain when GPS-enabled cell phone apps publish our every movement and friends post about our activities on Facebook for an entire extended network to see. So is technology doing enough to help us maintain privacy?

With the help of a $460,000 grant from the National Science Foundation and a small army of student researchers, the Cornell communication duo is studying how people manage their availability using modern technology, and whether new tools can be designed to help them do so. In the researchers' view, lying isn't necessarily a bad thing, and some relationships can benefit from a bit of privacy.

And although most of us are aware that we are occasionally being lied to, and accept it, we are not very good at identifying which messages are lies, Hancock said.

"A lot of the lying that we end up doing is about managing how people interact with us," Hancock said. "We are telling narratives of our lives, and we are telling different narratives to different people. Right now, that narrative is leaky and can get messed up very easily."

So how do we manage it?

Many take advantage of the ambiguity that technology provides to cloak their activities, such as setting their instant messenger status to "away" or "busy" to avoid conversations, even though that is a lie: in reality, they are sitting inches from the screen, busy playing solitaire.

Birnholtz believes the widespread use of lies suggests that people are resorting to social solutions because there are insufficient technical solutions. That, in turn, indicates a need for more controls or features to help people manage their personal relationships.

Birnholtz said that by doing a linguistic analysis of deceptive messages, he might be able to design a program that predicts when someone wants information to be shared or protected.
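In spirit, such a program would resemble a text classifier. As a purely illustrative sketch, not the researchers' actual method, the toy scorer below flags messages whose wording resembles a common availability excuse; the cue words, scoring rule, and threshold are all invented for this example, and a real system would be trained on labeled message data:

```python
# Hypothetical sketch of a "butler lie" detector. The cue list and the
# threshold are invented for illustration; a genuine classifier would be
# trained on real, labeled messages rather than hand-picked words.

CUE_WORDS = {"gotta", "go", "busy", "later", "sorry", "run", "leaving", "meeting"}

def butler_lie_score(message: str) -> float:
    """Return the fraction of words in the message that match excuse cues."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in CUE_WORDS)
    return hits / len(words)

def looks_like_butler_lie(message: str, threshold: float = 0.3) -> bool:
    """Flag a message when enough of its words look like availability excuses."""
    return butler_lie_score(message) >= threshold

print(looks_like_butler_lie("Sorry, gotta run to a meeting!"))  # True
print(looks_like_butler_lie("The weather is lovely today"))     # False
```

Even this crude word-counting approach hints at why linguistic analysis is promising: excuse messages tend to reuse a small, predictable vocabulary.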

Some of his students are doing related research into how the butler lies vary from country to country, based on the plausibility of particular excuses in different cultures.

And the pair also has funding from Google to study technology-enabled deceptions in the workplace, such as manipulating the "track changes" function in word-processing software to selectively hide or highlight edits in group documents.

"Our social conventions have evolved over 60,000 years. Facebook has been around for six," Hancock said. "I think this is where a lot of the confusion comes from."