There's no doubt technology is getting smarter. While my smartphone still can't seem to grasp the correct moment to auto-rotate, or that "ducking good" is a phrase used by no one ever, much of our tech is getting increasingly sophisticated.
And now that fingerprint and voice recognition have become commonplace, our tech overlords at Microsoft are developing technology to recognise and understand facial expressions.
Yes, cameras have been able to identify smiles for a while, but the new artificial intelligence from Microsoft's Project Oxford goes further.
The AI, unveiled at Microsoft's Future Decoded conference, relies on advanced machine learning as well as facial recognition. Through these, it's capable of examining facial expressions and inferring the subject's emotional state from them.
The software, which runs on the company's cloud-computing platform Azure, has even been released as a free beta. All one has to do is visit Microsoft's Project Oxford site and upload an image, and the AI spits out a list of confidence scores for eight different emotional states. Yes, just eight, but between anger, disgust, contempt, fear, happiness, neutral, sadness and surprise, it's fairly comprehensive.
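To give a sense of what those scores look like: a minimal sketch below, where the sample values and the way of picking the dominant emotion are my own illustration, and the eight field names are modelled on the emotions Project Oxford lists.

```python
# A sketch of the per-face output: a confidence score for each of the
# eight emotions, summing to roughly 1.0. The values here are made up
# for illustration; the field names follow the eight emotions the
# service reports.
sample_scores = {
    "anger": 0.01,
    "contempt": 0.02,
    "disgust": 0.01,
    "fear": 0.00,
    "happiness": 0.05,
    "neutral": 0.85,
    "sadness": 0.03,
    "surprise": 0.03,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(sample_scores))  # neutral
```

In other words, the service doesn't hand you a single verdict; it's up to the developer to decide whether 0.85 "neutral" beats 0.05 "happiness", which is exactly the judgement call my experiments below keep running into.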
So I decided to test out just how good it was:
Believe it or not, that's my happy face, just not according to Microsoft. Which either means that: a) computers know me better than I know myself, or b) I am naturally zen.
Still, apart from testing one's gag reflex, my face isn't one that lends itself to testing anything. So I decided to test Microsoft's software with the most emotive face in the country - Narendra Modi's:
Modi's never been the most neutral man, but Microsoft's algorithm pegs his neutral face accurately enough. But will Microsoft recognise an angry Modi?
That is anything but a neutral image. Legend has it that the man who took that photo spontaneously combusted moments after clicking it. But clearly Microsoft has higher standards for anger, contempt, disgust and fear, which all register minimally compared to neutral. Heck, the algorithm even managed to find some 'happy' in this picture, which is only possible if Happy was the name of the owner of that blurry head in the background. Okay, so anger isn't the algorithm's strong suit.
What if we went in the opposite direction?
Smile recognition is probably one of the easier things to factor into the programme, so the high happiness score isn't surprising. But you know what is surprising?
While being surprised may not be an emotion, Microsoft got this one so bang on I'm surprised too.
So, what it looks like from the Modi experiment is that Microsoft has decided to err on the side of caution when it comes to passing value judgement.
It's not much of a surprise either after Google, in their attempts to push the envelope, introduced an auto-tag option for photos that tagged two African-Americans as gorillas. Google isn't alone. Microsoft also had egg on their face earlier this year when their age-recognition attempts ended up getting things horribly wrong. It seems that for the moment, the algorithm tends to focus on neutrality, surprise and happiness.
Anger, which is a strong word in itself, only comes across for exaggeratedly enraged visuals. More subtle emotions like contempt and disgust only ever seem to pop up with minimal numbers. Clearly Microsoft aren't keen on calling you on your resting-bitch-face.
That being the case, the programme will only get more intelligent as its database of images grows. Something I can't decide is awesome or scary. I imagine if I took a selfie right now and ran it through the programme, I might have a better idea.
Yes, it's a fantastic way to mess around and pass the time but even in its nascency, the software's potential is clear.
For starters, Microsoft has made a beta application programming interface (API) for the facial recognition software available for other developers to use.
Its scope is incredible.
The programme can be used to determine consumer reactions to window displays, movies and food. In addition, developers could design apps that offer customised responses to images based on the predominant emotion the software detects in them.
Heck, Microsoft even showcased the adaptability of the software when they released a MyMoustache version for Movember that measures and rates facial hair growth.
In addition to the emotion recognition, some of the other tools showcased at Future Decoded include tools to identify individual faces and voices in videos. Combined with the emotion detection, it amounts to a heady cocktail of artificial intelligence. Should come in handy for when the robots rise up against us.