New and Emerging Tech Part 5: Affective Computing

So here’s the fifth of my blogs on how new and emerging technologies are having, and will have, an impact on higher education. You can see the first, second, third, and fourth by clicking on those links.

Today, things are going to get a little more… sci-fi… I’ll be discussing Affective Computing. The NMC Horizon Report 2016 predicts that Affective Computing is 4-5 years away from mainstream adoption in higher education.

Robot Reading Emotions

Affective Computing

Affective computing refers to the research and development of devices and systems with a view to making them able to recognise, interpret, process, and simulate human emotions (‘affect’ is simply the term psychologists use for emotion). So, and this is why I said this blog was getting a lot more sci-fi than the previous ones, we’re talking about machines that can recognise when the humans interacting with them are angry, sad, or bored, and possibly react to that in some way. The technology of affective computing can therefore be split into two areas.

The first is a machine detecting and understanding an emotion displayed by someone. This might be derived from the person’s facial expressions or their body language. The machine might listen to a person’s speech and decipher whether any emotion is relayed in how something has been said. It might sense changes in skin temperature or whether someone is perspiring, or it could draw on any combination of these signals. The machine processes this data using machine learning, and from the result of those algorithms it can estimate the person’s emotional state.
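The detection side described above can be sketched very roughly in code. This is a hypothetical illustration only: real affective computing systems use trained machine-learning models per channel, whereas here the per-channel emotion scores, the channel weights, and the function name are all invented for the sake of the example. The idea it shows is the combination step: several signal channels (face, voice, skin response) each contribute evidence, and the machine fuses them into a single estimate.

```python
# Hypothetical sketch: fusing multiple signal channels into one
# emotion estimate. Scores and weights are invented for illustration.

EMOTIONS = ["neutral", "bored", "confused", "frustrated"]

def estimate_emotion(channel_scores, weights):
    """Combine per-channel emotion scores into one weighted estimate.

    channel_scores: {channel: {emotion: score between 0 and 1}}
    weights:        {channel: relative trust placed in that channel}
    Returns the emotion with the highest combined score.
    """
    combined = {emotion: 0.0 for emotion in EMOTIONS}
    total_weight = sum(weights.values())
    for channel, scores in channel_scores.items():
        w = weights.get(channel, 0.0) / total_weight
        for emotion in EMOTIONS:
            combined[emotion] += w * scores.get(emotion, 0.0)
    return max(combined, key=combined.get)

# Example readings: the face and voice channels both lean towards boredom.
readings = {
    "face":  {"bored": 0.7, "neutral": 0.2},
    "voice": {"bored": 0.5, "confused": 0.3},
    "skin":  {"neutral": 0.6},
}
weights = {"face": 0.5, "voice": 0.3, "skin": 0.2}
print(estimate_emotion(readings, weights))  # -> bored
```

The weighting reflects the point made above that no single signal is decisive; it is the combination of face, voice, and physiological cues that lets the machine hazard a guess at an emotional state.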

The second area is the replication of convincing human emotions. This might aid in facilitating interactions between humans and machines. When we hear emotion in a voice, it conveys so much more than the words alone. I suppose I just can’t wait for the Sarcastabot’s withering wit.

Driving Factors and Benefits of Affective Computing

So how does this possibly relate to education? Well, affective computing can be used in online learning to increase a learner’s interest or improve their confidence. Consider how adaptive learning adjusts learning scenarios according to a learner’s knowledge, context, and performance; if a machine can also change the course of a scenario based on whether the learner sounds bored or looks confused, it enriches that experience even further. The mental states of learners can be gauged, and interventions and modifications to the learning initiated. If the learner is bored, maybe a gamified aspect of the materials can be introduced.

Intuitive reactions to learners’ recognised emotions could replicate traditional face-to-face interactions. Imagine you are learning with your computer and you appear stressed out; the computer might identify this and advise you to take a break or finish up for the day. It might even make a joke to alleviate your stress somewhat. All of this could be delivered via speech from the machine in whatever way it determines to be most appropriate to that particular situation. See, told you it was going to get sci-fi.
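The kinds of interventions just described can be sketched as a simple rule table, assuming an upstream detector has already labelled the learner’s state. This is a minimal illustration, not a real system: the states, interventions, and function name are all invented, and a genuine adaptive platform would choose interventions with far more context than a single label.

```python
# Hypothetical sketch: mapping a detected emotional state to a change
# in the learning scenario. States and interventions are invented,
# mirroring the examples in the text (gamify when bored, suggest a
# break when stressed).

INTERVENTIONS = {
    "bored":    "switch to a gamified version of the exercise",
    "confused": "offer a worked example and an explainer video",
    "stressed": "suggest taking a break (and maybe tell a joke)",
}

def choose_intervention(detected_state, default="continue the current activity"):
    """Return the adaptation for a detected state, or carry on if none applies."""
    return INTERVENTIONS.get(detected_state, default)

print(choose_intervention("bored"))    # -> switch to a gamified version of the exercise
print(choose_intervention("focused"))  # unknown state -> continue the current activity
```

Even a toy like this makes the design point clear: the affective layer does not replace the adaptive learning logic, it just adds one more input for that logic to act on.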

It’s machines catering to humans’ needs reactively and possibly proactively. Look at this scene from Buck Rogers — yes, full sci-fi mode now. Buck Rogers has just found out that he has returned to Earth over 400 years after he left. Buck is naturally shocked and needs a drink. Twiki, the robot, obliges, gives him a green sci-fi drink, and says in a rather empathetic tone, “L’chaim”, which is a toast used in drinking to a person’s health or well-being. That’s great service. That’s affective computing. Well, it’s probably full AI, considering that a few scenes later Twiki falls in love with a golden female robot. That’s probably for another blog….

Affective Computing in Practice

Apple, Intel, IBM, and Microsoft all have affective computing technologies in development. From MIT, Affectiva was born; Affectiva claims to have ‘the world’s largest emotion database’.

Bayer recently employed the Affective Computing Company for its Human2Human platform and Thrive software to aid managers in making more informed decisions based on employees’ engagement and well-being. These are monitored by the increasingly self-aware system, which prompts staff on their portable smart devices.

Specifically in regard to education, Stanford University researchers have developed YouED, a tool that detects confusion in the forum posts of Massive Open Online Course (MOOC) students and, in response, sends them videos in the hope of alleviating that confusion.


As you can see, there are numerous applications for affective computing, from improving productivity in the workplace to advertising to education. Affective computing can further the autonomy of independent learners interacting with machines and computers, enriching their learning activities by identifying their emotions and reacting appropriately. It’s certainly another potential string to the bow of technology-enhanced learning and, ultimately, of the creation of machines that serve humanity’s needs. In my next blog on new and emerging technologies and their impact on higher education, I’ll be looking at the equally exciting and, for some, frightening subject of Robotics.