
Georgia Tech Researchers Develop Innovative Technologies to Study Autism



A recent project from the Center for Behavior Imaging (CBI) at the Georgia Institute of Technology promises to advance the understanding of autism and other behavioral disorders through new technologies that measure children's behavior.

The first technology is an instrument that combines facial-analysis software with a pair of gaze-tracking glasses. It allows researchers to study the eye contact a child makes with the person wearing the glasses. The second instrument consists of multiple accelerometers that monitor and categorize problem behaviors in these children. Both technologies are currently being used in ongoing work at the CBI, where researchers are trying to combine clinical understanding of behavioral disorders with computer-guided screening and measurement.

According to previous studies, children predisposed to autism often show specific behavioral markers. One of these is a reluctance to make sustained or repeated eye contact with the people around them. Software capable of detecting such markers could significantly improve existing screening methods. The project, led by Professor Jim Rehg, is funded by the NSF (National Science Foundation).


The studies of the new technologies were conducted in the Child Study Lab (CSL) at the Georgia Institute of Technology. Researchers captured video of a child interacting with an adult who wore a front-facing camera on the gaze-tracking device. The recorded video was then processed by the facial-analysis software, which detects the direction of the child's gaze and measures the duration of direct eye contact between the two participants. Its current accuracy is reported at 80%.
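The core measurement described above, turning per-frame gaze estimates into episodes of sustained eye contact, can be sketched in a few lines. This is a hypothetical illustration, not the Georgia Tech software: the function name, the boolean per-frame labels, and the 30 fps frame rate are all assumptions.

```python
FPS = 30  # assumed camera frame rate (illustrative)

def eye_contact_episodes(frames, fps=FPS):
    """Given per-frame labels (True = child looking at the camera wearer),
    return (start_seconds, duration_seconds) for each continuous episode."""
    episodes = []
    start = None
    for i, looking in enumerate(frames):
        if looking and start is None:
            start = i                      # episode begins
        elif not looking and start is not None:
            episodes.append((start / fps, (i - start) / fps))
            start = None                   # episode ends
    if start is not None:                  # episode runs to end of video
        episodes.append((start / fps, (len(frames) - start) / fps))
    return episodes

# 1 s of eye contact, 0.5 s looking away, 0.5 s of eye contact (at 30 fps)
labels = [True] * 30 + [False] * 15 + [True] * 15
print(eye_contact_episodes(labels))  # [(0.0, 1.0), (1.5, 0.5)]
```

Aggregating episode durations like this is what replaces the hours of manual video coding the researchers describe.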

According to Professor Rehg, previous methods for studying eye gaze were laborious, requiring many hours of video analysis. The new method produces results automatically, greatly reducing the analysis time and manual labor. Rehg notes that these are only preliminary results, as the method has so far been tested on a single child. Nevertheless, he and his research team are confident the results will replicate, given the similarities between human eyes.

The second technology was developed through a collaboration between researchers at the Georgia Institute of Technology (US), Newcastle University (UK) and the Marcus Autism Center (US). The device is a pack of sensors worn on the wrists and ankles, which use accelerometers to detect every movement the wearer makes. The researchers developed a series of algorithms that automatically detect changes in behavior, which are then analyzed and classified as disruptive, self-injurious or aggressive.

The analysis algorithm was developed using data from four staff members at the Marcus Autism Center, with more than 1,000 movement and behavior instances categorized and analyzed. The researchers report that the technology can detect problem behaviors with over 95% accuracy and classify different behaviors with over 80% accuracy. When the algorithm was applied to a child previously diagnosed with autism, it detected problem behaviors with 81% accuracy and classified them correctly with almost 70% accuracy.
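The two-stage pattern the article describes, first detecting that a problem behavior is occurring and then classifying it, can be illustrated with a minimal sketch. To be clear, this is not the published algorithm: the energy feature, the thresholds, and the labels are all invented for illustration.

```python
import math
from statistics import mean

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample, in g."""
    return math.sqrt(sum(axis * axis for axis in sample))

def detect_and_classify(window, detect_thresh=1.5, aggressive_thresh=3.0):
    """Stage 1: detect problem behavior from a window's mean acceleration.
    Stage 2: assign a coarse label. Thresholds are illustrative assumptions."""
    energy = mean(magnitude(s) for s in window)
    if energy < detect_thresh:
        return None                       # typical movement, nothing detected
    return "aggressive" if energy > aggressive_thresh else "disruptive"

calm = [(0.0, 0.0, 1.0)] * 10             # at rest: ~1 g from gravity alone
flail = [(2.5, 2.5, 1.0)] * 10            # sustained high-energy movement
print(detect_and_classify(calm))          # None
print(detect_and_classify(flail))         # aggressive
```

Splitting detection from classification explains why the reported accuracies differ: deciding *that* something happened (95%+) is an easier problem than deciding *what* happened (80%+).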

“These results are very promising in leading the way toward more accurate and reliable measurement of problem behavior, which is important in determining whether treatments targeting these behaviors are working”, noted Agata Rozga, one of the researchers. According to Professor Gregory Abowd, these new technologies will have a major impact on the lives of many children and their families thanks to more effective screening.

Both technologies were first presented at the 14th International Conference on Ubiquitous Computing in early September.