
Getting at the many tangled webs of digital deception we seem hardwired to weave

Jason Koski/University Photography
Jeff Hancock discusses identity-based deception and the "lonelygirl15" YouTube video series in class; the series' subject turned out to be an actress.

It's a simple truth yet a complex social behavior. Everybody lies. From harmless fibs to whoppers that can destroy a corporation (think: Enron), humans are adept at the art of deception. Getting at the truth about the language of lies and how and under what circumstances we weave our tangled webs is much of the stuff of Jeff Hancock's research.

Hancock, Cornell assistant professor of communication and member of the faculty of computing and information science, recently received a $680,000 National Science Foundation (NSF) grant to pursue "The Dynamics of Digital Deception in Computer Mediated Environments."

Hancock's work on online fibbery made news a couple of years ago, and his initial studies in that area, funded in part through the Institute for the Social Sciences (ISS) small-grant program, led to the NSF grant and continue to attract attention. Those initial studies examined how computer-mediated environments affect the production and practice of deception, and how they affect people's ability to detect it.

"Jeff's work is exactly the type of innovative, rigorous social science that the ISS was created to support," said Beta Mannix, ISS director. "It is especially exciting for us to see a young scholar like Jeff be able to build on his initial work that was funded by the ISS to develop a broader, interdisciplinary research program."

Indeed, the NSF project will be a powerful collaboration that includes two leading experts in the fields of language processing and linguistics, respectively: Claire Cardie, associate professor of computer science and the Charles and Barbara Weiss Director of the Information Science Program, who has worked with the Department of Homeland Security on distinguishing fact from fiction; and Mats Rooth, professor of linguistics, who is also a member of Cornell's Information Science Program.

The group brings a triple threat to the challenge of understanding the language of deception.

"Most of the work on deception has focused on nonverbal forms of deception," Hancock said. "The thinking has been that you can control your speech but you can't control your nonverbal behavior, and this kind of thinking led to a focus on examining nonverbal cues associated with lying, which is what the polygraph tests." What Hancock and his colleagues would like to do is create a language-based approach to detecting lies.

"By using their [Cardie and Rooth's] expertise in natural language processing and computational linguistics, we will see if we can determine if the very language of deceptive messages is different from that in messages which are not deceptive," said Hancock. "We should have ample opportunity to look at lies because usually people tell one to two lies a day, and these lies range from the trivial to the very serious, including deception between friends and family, in the workplace, and in security and intelligence contexts."

Hancock says that by "examining deception in mediated environments and building computer-based tools for the detection of deceptive messages," this research will develop new approaches that will improve our ability to detect digital forms of deception.

"Where this differs is we're looking at the actual tool of the lie, the words, the actual medium of the lie," he said. "We're looking at the lie rather than looking at some correlate like physiological activity."

The research could have implications and applications at many levels, from business hiring practices to online dating to national security tracking. In the case of Enron, some 500,000 e-mail messages were made public. A stadium of lawyers would never be able to analyze all those messages for signs of deception. But it is possible that a computerized language program could be developed to detect deceptive language patterns and flag the suspect messages for closer scrutiny.
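In very rough outline, a triage tool of the kind described above would score each message on word-level cues and surface only the high-scoring ones for human review. The cue lists, scoring rule, and threshold below are invented purely for illustration and are not Hancock's or Cardie's actual method:

```python
# Illustrative sketch only: a toy cue-based filter for triaging a large
# message archive. The cue words and threshold here are placeholders,
# not findings from the research described in the article.

def deception_score(message: str) -> float:
    """Score a message using simple word-level cues.

    Deception research has examined cues such as reduced first-person
    pronoun use and increased negative-emotion words; the specific
    lists below are made up for this example.
    """
    words = [w.strip(".,!?").lower() for w in message.split()]
    if not words:
        return 0.0
    first_person = {"i", "me", "my", "mine", "myself"}
    negative = {"hate", "worthless", "angry", "sad", "guilty"}
    fp = sum(w in first_person for w in words)
    neg = sum(w in negative for w in words)
    # Fewer self-references and more negative terms raise the score.
    return neg / len(words) - fp / len(words)

def flag_for_review(messages, threshold=0.0):
    """Return only the messages scoring above the threshold."""
    return [m for m in messages if deception_score(m) > threshold]

emails = [
    "I reviewed my numbers and I stand by my report.",
    "The figures were handled; nothing worthless was hidden, no guilty parties.",
]
flagged = flag_for_review(emails)
```

A real system would replace the hand-picked word lists with statistical models trained on messages whose truthfulness is known, but the overall shape, score every message and route only the suspicious ones to human reviewers, is the same.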

Hancock also is engaged in a separate study with Ph.D. student Catalina Toma and Michigan State University professor Nicole Ellison that examines the prevalence of lies in online dating services, which are notorious for false reports. Yet they are among the most lucrative for-fee online services, with more than 17 million American customers, Hancock says. Unlike in his prior studies, in which subjects self-reported their fibs, the claims made by online daters were more difficult to check.

"What we could do then was establish ground truth: We actually put them [the daters] on a scale and weighed them, checked their height and we looked at their driver's license to get their age to see how it compared with their online profile," Hancock said. That study is still being prepared for peer review. But one thing that struck Hancock about his subjects was that while almost all of them lied, most of the lies were minor.

"There were very few real doozies," he said. "But it's the doozies that get all the attention and give the business a bad rap."

The typical lie for men? Their height. The typical lie for women? Their weight.