
Researchers in computer scientist Ehsan Hoque’s lab have created a game that has allowed them to analyze more than 1 million frames of facial expressions, the largest publicly available video dataset so far for understanding how to tell if someone is lying. (University of Rochester photos / J. Adam Fenster)

Using data science to tell which of these people is lying

Someone is fidgeting in a long line at an airport security gate. Is that person simply nervous about the wait?

Or is this a passenger who has something sinister to hide?

Even highly trained Transportation Security Administration (TSA) officers still have a hard time telling whether someone is lying or telling the truth – despite the billions of dollars and years of study that have been devoted to the subject.

Now, University researchers are using data science and an online crowdsourcing framework called ADDR (Automated Dyadic Data Recorder) to further our understanding of deception based on facial and verbal cues.

They also hope to minimize instances of racial and ethnic profiling that TSA critics contend occurs when passengers are pulled aside under the agency’s Screening of Passengers by Observation Techniques (SPOT) program.

“Basically, our system is like Skype on steroids,” says Tay Sen, a PhD student in the lab of Ehsan Hoque, an assistant professor of computer science. Sen collaborated closely with Kamrul Hasan, another PhD student in the group, on two papers in IEEE Automatic Face and Gesture Recognition and the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. The papers describe the framework the lab has used to create the largest publicly available deception dataset so far – and why some smiles are more deceitful than others.

Here’s how ADDR works: Two people sign up on Amazon Mechanical Turk, the crowdsourcing internet marketplace that matches people to tasks that computers are currently unable to do. A video assigns one person to be the describer and the other to be the interrogator.

The describer is then shown an image and is instructed to memorize as many of its details as possible. The computer instructs the describer to either lie or tell the truth about what they’ve just seen. The interrogator, who has not been privy to the instructions given to the describer, then asks the describer a set of baseline questions not relevant to the image. This is done to capture individual behavioral differences that could be used to develop a “personalized model.” The routine questions include “What did you wear yesterday?” (to provoke a mental state relevant to retrieving a memory) and “What is 14 times 4?” (to provoke a mental state relevant to analytical memory).

“A lot of times people tend to look a certain way or show some kind of facial expression when they’re remembering things,” Sen said. “And when they are given a computational question, they have another kind of facial expression.”

These are also questions the describer would have no incentive to lie about, and they provide a baseline of that individual’s “normal” responses when answering honestly.

And, of course, there are questions about the image itself, to which the describer gives either a truthful or dishonest response.

The entire exchange is recorded on a separate video for later analysis.
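
For readers curious about the mechanics, here is a minimal sketch in Python of how an ADDR-style session might be orchestrated. The worker names, questions, and data structures are illustrative assumptions, not the lab’s actual implementation, which pairs crowd workers and records video over a live connection.

# Hypothetical sketch of an ADDR-style session; names and structure are
# illustrative assumptions, not the lab's actual code.
import random
from dataclasses import dataclass, field

@dataclass
class Session:
    describer: str
    interrogator: str
    truthful: bool                      # instruction shown only to the describer
    transcript: list = field(default_factory=list)

    def ask(self, question, phase):
        # The real framework records the exchange on video; here we simply
        # log which question was asked and in which phase.
        self.transcript.append({"phase": phase, "question": question})

def run_session(worker_a, worker_b, seed=None):
    rng = random.Random(seed)
    describer, interrogator = rng.sample([worker_a, worker_b], 2)
    session = Session(describer, interrogator, truthful=rng.random() < 0.5)

    # Baseline questions capture the describer's "normal" honest behavior.
    for q in ["What did you wear yesterday?",   # memory retrieval
              "What is 14 times 4?"]:           # analytical memory
        session.ask(q, phase="baseline")

    # Image questions are where the describer lies or tells the truth,
    # depending on the instruction above.
    for q in ["Describe the scene in the image.",
              "How many people were in it?"]:
        session.ask(q, phase="image")

    return session

if __name__ == "__main__":
    s = run_session("worker_001", "worker_002", seed=42)
    print(s.describer, "was told to", "tell the truth" if s.truthful else "lie")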

Using the ADDR framework, the researchers gathered 1.3 million frames of facial expressions from 151 pairs of individuals playing the game in just a few weeks. More data collection is underway in the lab.

Using automated facial feature analysis software and an unsupervised clustering technique, the researchers discovered that people made essentially five kinds of smile-related “faces” when responding to questions, Sen said.

The one most frequently associated with lying was a high-intensity version of the so-called Duchenne smile, which involves cheek, eye, and mouth muscles. This is consistent with the “Duping Delight” theory that “when you’re fooling someone, you tend to take delight in it,” Sen explained.
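
The papers spell out the full analysis; the following is only a minimal sketch of the general idea, assuming per-frame facial action unit (AU) intensities as features and scikit-learn’s k-means as the clustering method. The feature set, the synthetic data standing in for real frames, and the use of exactly five clusters are illustrative assumptions.

# A minimal sketch, not the lab's actual pipeline: cluster per-frame facial
# action unit (AU) intensities into five smile-related groups with k-means.
# AU6 (cheek raiser) and AU12 (lip corner puller) together characterize a
# Duchenne smile; the random data below stands in for real AU features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Rows are video frames, columns are AU intensities (e.g., AU6, AU12, ...).
frames = rng.random((10_000, 5))

# Standardize so no single action unit dominates the distance metric.
scaler = StandardScaler().fit(frames)
features = scaler.transform(frames)

# Group frames into five clusters, mirroring the five smile-related "faces".
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)

# Inspect cluster centers (converted back to the original units): a cluster
# with high values on both the AU6 and AU12 dimensions would correspond to
# an intense, Duchenne-like smile.
centers = scaler.inverse_transform(kmeans.cluster_centers_)
print(np.round(centers, 2))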

The researchers say they’ve only scratched the surface of potential findings from the data they’ve collected.

Hoque, for example, is intrigued by the fact that interrogators unknowingly leak unique information when they are being lied to: they show more polite smiles, and they are more likely to return a lying describer’s smile than a truth-teller’s. While more research needs to be done, it is clear that the interrogators’ data reveals useful information and could have implications for how TSA officers are trained.

Read more and see a video here.


Deaf health survey now available across New York State

Surveys are commonly used to understand the health needs of communities, but are often conducted by telephone, rendering them inaccessible or inappropriate for deaf people. The Rochester Prevention Research Center (RPRC) and the National Center for Deaf Health Research (NCDHR) have just launched the third round of their Deaf Health Survey, a culturally and linguistically accessible health survey that was designed by deaf people for deaf people.

RPRC/NCDHR, which are part of the University’s Clinical and Translational Science Institute, have developed and refined new techniques to translate English into American Sign Language (ASL) in order to conduct video-based surveys with deaf communities. Participants can choose between video in ASL or English-based sign language, can turn captioning on or off, and can choose from a diverse array of sign language models.

The survey, which was previously released in 2008 and 2013 to the Rochester deaf community, will be available across New York State for the first time this year. The goal is to collect public health data from deaf communities throughout the state and develop relationships between these communities and public health officials. The data will help identify the communities’ health priorities, show the need for services, and obtain funding for associated services and research projects.

“The NCDHR was able to successfully obtain funding to further research on three community-chosen health priorities: obesity, suicide risk, and intimate partner violence,” said Kelly Matthews, research coordinator for NCDHR. “The New York State Deaf Health Survey is the first opportunity for deaf communities outside of Rochester to participate in culturally (Deaf) and linguistically (ASL) accessible public health surveillance. We are excited to strengthen our deaf community connections throughout the state.”

The data will also be presented to the NYS Department of Health for the first time, documenting the health and well-being of deaf communities throughout the state and their needs. Results and next steps will be shared with the communities that participated in the survey.

Other members of the team heading up the survey include Earl Allen, project assistant; Jenna Stewardson, research coordinator; and Christina Whetsel, project assistant.

Take the survey using Google Chrome. For questions or comments, call 585-286-2776 (video phone) or email NCDHR.


CIRC summer school offers programming languages, data analysis skills

Summer is the perfect time to learn a new programming language and to catch up on your data analysis skills.

The Center for Integrated Research Computing (CIRC) will once again host workshop training sessions over a six-week period starting July 17. Known as the CIRC Summer School, these workshops will cover eight different languages and tools. Faculty, students, and staff may register for one or more topics before the training sessions begin.

Topics will include basic training in Linux, programming languages, data analytics tools, and visualization. The courses are designed for beginners, with extra emphasis on using these languages, libraries, and applications on BlueHive.

The classes will take place on Tuesday, Wednesday, and Thursday mornings and afternoons in the University’s large-scale, interactive visualization facility, the VISTA Collaboratory, located on the first floor of the Carlson Library. See the table on the registration page for the topics, dates, and times and to register for the sessions.

Register early since these classes tend to reach capacity rather quickly.


i2b2 upgrade completed

The clinical research data tool i2b2 has been updated to work with the new ePARC clinical EMR system. As of June 1, 2018, clinical data in i2b2 are being updated daily. The University’s Clinical and Translational Science Institute’s Informatics and ISD teams worked together to complete the upgrade and testing.

As before, you can get help with clinical data pulls from i2b2 by using the CTSI Research Dashboard to make a request. Select “i2b2” under “CTSI Biomedical Informatics” when specifying expertise.


PhD dissertation defenses

Jennifer Suor, Clinical Psychology, “The Interplay between Parenting and Child Temperament in Associations with Children’s Executive Function in Early Childhood.” 1 p.m., June 15, 2018. 366 Meliora. Advisors: Melissa Sturge-Apple and Sheree Toth.

Daniel Barnak, Physics, “Applications of Magnetic Fields in High Energy Density Plasmas.” 2 p.m., June 15, 2018. LLE (Laboratory for Laser Energetics) Seminar Room 2101. Advisor: Riccardo Betti.


Mark your calendar

June 11-12: “The Nitrate Touch,” graduate student workshop in advance of conference on provenance in early cinema (see June 13-16 below). Click here for more details.

June 13-16: “Provenance and Early Cinema: Preservation, Circulation, and Repurposing” conference. 55 presentations in 18 panels over four days plus two evenings of screenings of rare early films from the collection of the Moving Image Department of George Eastman Museum. Click here for more details.

June 13: “When Computer Vision Meets Audition: From Cross-Modal Generation to Audio-Visual Scene Understanding.” Chenliang Xu, assistant professor of computer science. Data Science Summer Colloquium Series, Goergen Institute for Data Science. Noon to 1 p.m., Wegmans Hall 1400. Open to all faculty, staff, students, and community members. Lunch included.

June 20:  “When Can a Computer Improve Your Social Skills?” M. Ehsan Hoque, interim director of the Goergen Institute, assistant professor of computer science, and Asaro Biggar (’92) Family Fellow in Data Science. Data Science Summer Colloquium Series, Goergen Institute for Data Science. Noon to 1 p.m., Wegmans Hall 1400. Open to all faculty, staff, students, and community members. Lunch included.

June 22: Deadline to apply for pilot and feasibility awards of up to $50,000 for innovative applications of technology (e.g., novel use of electronic health record data, wearable sensors, digital tools, and human-machine interfaces) in research with human participants to yield new insights into clinical neuroscience. Offered by the Center for Health + Technology (CHeT) in conjunction with the Ernest J. Del Monte Institute for Neuroscience. For more information and to download the RFA, click here.

June 27: “Identifying Differences in GPUs Using Performance Data.” Sreepathi Pai, assistant professor of computer science. Data Science Summer Colloquium Series, Goergen Institute for Data Science. Noon to 1 p.m., Wegmans Hall 1400. Open to all faculty, staff, students, and community members. Lunch included.

July 11:  “Information Flow in Music.” David Temperley, professor of music theory at the Eastman School. Data Science Summer Colloquium Series, Goergen Institute for Data Science. Noon to 1 p.m., Wegmans Hall 1400. Open to all faculty, staff, students, and community members. Lunch included.

July 17: CIRC Summer School begins. Classes in programming languages and data analysis skills. VISTA Collaboratory. Click here to learn more and to register.

July 18:  “Physics of Complex Systems.” Gourab Ghoshal, assistant professor of physics. Data Science Summer Colloquium Series, Goergen Institute for Data Science. Noon to 1 p.m., Wegmans Hall 1400. Open to all faculty, staff, students, and community members. Lunch included.



Please send suggestions and comments here. You can also explore back issues of Research Connections.



Copyright ©, All rights reserved.
Research Connections is a weekly e-newsletter for all faculty, scientists, postdocs, and graduate students engaged in research at the University of Rochester. You are receiving this e-newsletter because you are a member of the Rochester community with an interest in research topics.