Harnessing AI, NTU creates apps to help people with special needs learn social, emotional skills

Project Officer Mr Ivan Yew (left) and Professor Ong Yew Soon (right) of NTU’s Data Science and Artificial Intelligence Research Centre, where mobile apps were developed for students with special needs.
SINGAPORE — In an effort led by Nanyang Technological University (NTU), three mobile applications have been developed to help people with special needs learn about social and emotional intelligence. For example, users of the EmojiCapcha app match their facial expressions with the emoji shown on screen to score points.
Students at the Association for Persons with Special Needs (APSN) have been using the app to learn about emotional intelligence and recognise facial expressions in a pilot programme that started in mid-September last year.

EmojiCapcha is a child-friendly quiz app that rewards users for making facial expressions that match the emojis on screen. Photo: Nanyang Technological University
The other two apps are Happy Bird, a twist on the classic game Flappy Bird, and Betterfly, in which users change their facial expressions to create different visual effects for in-game butterflies.
The three apps were developed by NTU’s Data Science and Artificial Intelligence Research Centre in collaboration with global game development company Yoozoo Games and APSN.
The games were built using NTU’s IntelliK, a platform that allows users to create apps powered by artificial intelligence (AI).
The collaboration started when the NTU centre’s director, Professor Ong Yew Soon, got the idea to do something for students with special needs, especially in light of the importance of virtual learning during the Covid-19 pandemic.
“I hope we can democratise AI for social good,” Prof Ong said, referring to the way that the apps use AI for facial recognition and emotion detection.
The collaboration has helped NTU refine the apps. For instance, the apps started out recognising just four emotions, but students who gave feedback asked for 16, prompting the developers to find solutions that “challenge AI”, Prof Ong said.
Previously, APSN students would use a pen-and-paper approach to learn about social and emotional intelligence.
Ms Michele Yap, APSN’s senior manager of infocomm, said that using the apps to complement traditional classroom learning keeps students highly engaged and gives them the opportunity to practise the skills that they learn in class.
“Over at the association, we are constantly looking for ways to enhance teaching and learning experiences,” she said.
The first phase of the pilot programme started in September last year with 140 students.
The second phase involves 220 students from APSN Katong School.
Unlike in the first phase, users may customise the apps — for example, by changing emoji prompts to photos — to suit different needs.
Students from APSN’s Chaoyang and Tanglin schools will join the pilot in January, bringing the total number of participants to more than 300.
Nurrifahirah Ahmad Sulimi, a Secondary 3 student from APSN Tanglin School, said: “Through the apps, I learned more about different emotions and most importantly, how to better express myself.”
The apps are now available only to APSN students due to data and privacy concerns, since the games involve taking photos of the users’ faces.
Prof Ong said that NTU is looking at how the apps can be extended to other special needs schools in the future.
“Possibly, we are also in discussion with Yoozoo to extend it for educational purposes and so on,” he said, adding that they might reach out to other target groups such as elders, professionals, and educational cohorts.