As someone on Twitter said, if “bots” was on your Build 2016 drinking game card, you’d be long dead. But while Microsoft is all about getting developers to create intelligent app companions to make our lives easier, is there any impressive machine learning the Redmond firm is ready to show off right now?
The answer, surprise, is yes. Besides ordering Domino’s pizzas, Microsoft has been tinkering with its Azure-based tools to recognize age, gender, emotion and individuals by name.
Remember the great/awful How Old Do I Look? website introduced at last year’s Build? The Windows 10 maker has been building off this ego-buster since, and what it’s cooked up is far more intelligent – and more flattering – than its earlier iteration.
During a demo of the API’s capabilities, Microsoft Data Scientist Carlos Pessoa showed me and a group of reporters a Real-Time Perceptual Intelligence app that recognizes your age, gender and emotion, even if other people are in the shot with you (though it will pinpoint that data for each of them, too).
Whereas last year’s How Old Do I Look? guessed my age to be in the mid- to late 30s, and even pushed it into the 40s on some tries, today it initially got my age correct at 29, then fluctuated between as low as 26 and as high as 31, depending on my expression. A natural smile yielded the most accurate result, while a scowl or surprised look tended to skew older.
It pegged my emotion as generally happy or neutral whether I was smiling broadly or just a little. A surprised expression was recognized as such or as happy, and apparently my resting face registers as sad.
Pessoa confirmed that Microsoft’s age recognition tech has gotten smarter: even if it doesn’t guess your exact age correctly, it’s usually in the ballpark now. He also showed how the machine could recognize someone by name if he took a picture of them with his phone and “taught” the machine who they were. If the same person walks by the same machine at any point during Build this week, it will recognize them.
There were other demos, such as an emotion matcher that pairs a just-snapped photo of you with a baby photo it deems shows the same top three emotions as yours, as well as the Fetch! app, which matches your mug with a dog.
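To illustrate the kind of logic an emotion matcher like that might rest on, here is a minimal sketch. The emotion names and scores below are made-up sample data modeled loosely on the per-emotion confidence scores Microsoft's recognition tools return; the exact field names are an assumption, not the demo's actual code.

```python
# Illustrative sketch: pick the top three emotions from a face-analysis
# result. The score dictionary is fabricated sample data; the emotion
# labels are assumptions modeled on Microsoft's emotion recognition API.

def top_emotions(scores, n=3):
    """Return the n highest-scoring emotion labels, strongest first."""
    return sorted(scores, key=scores.get, reverse=True)[:n]

sample_scores = {
    "happiness": 0.62,
    "neutral": 0.25,
    "surprise": 0.08,
    "sadness": 0.03,
    "anger": 0.02,
}

print(top_emotions(sample_scores))
```

A matcher would then compare that top-three list against the same computation run over a library of baby photos and return the closest match.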
But Real-Time Perceptual Intelligence was the most intriguing of them all. It was just a small demo, but the app was running off of Pessoa’s phone, and it felt as though, in the past year, How Old Do I Look? had gone to college and earned a master’s degree in capability. The real-time, accurate assessment of my age and expression got me excited – and maybe slightly scared – to think of its possible uses.
Pessoa sees the API being used for good, of course, in places like the customer service industry. He gave the example of an ATM camera that could tell when an elderly person walks up and enlarge the on-screen font in response. Or, if customers were consistently having negative interactions with the ATM, a bank could investigate the issue and make improvements.
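The ATM scenario Pessoa described boils down to two simple decisions, sketched below. The age threshold, font sizes, and emotion labels are all assumptions for illustration; none of them come from Microsoft.

```python
# Illustrative sketch of the ATM scenario: enlarge the font for older
# customers, and flag the machine for review if recent sessions trend
# negative. All thresholds and labels here are assumptions.

NEGATIVE_EMOTIONS = {"anger", "sadness", "contempt", "disgust"}

def font_size_for(estimated_age: float) -> int:
    """Use a larger font for customers the camera estimates at 65+."""
    return 24 if estimated_age >= 65 else 14

def needs_review(session_emotions) -> bool:
    """Flag the ATM if more than half of recent sessions registered
    a negative dominant emotion."""
    negative = sum(1 for e in session_emotions if e in NEGATIVE_EMOTIONS)
    return negative > len(session_emotions) / 2

print(font_size_for(72))
print(needs_review(["anger", "sadness", "neutral"]))
```

In a real deployment the age and emotion inputs would come from the vision API's per-face attributes rather than hard-coded values.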
And with today’s talk of bots that use “the power of natural human language with advanced machine intelligence”, is there a place for bots that tap into the power of visuals, too?
“There’s no intersection yet, but yes, I think it’s only a matter of time,” Pessoa told me in terms of bringing this machine learning to app bots. “You could have a How Old Do I Look Bot? that you send a picture to after you’ve gotten a hair cut, and it could tell you what it thinks your age is. If it sees your expression, it might flatter you by saying, ‘Oh, you have a beautiful smile.'”
It will likely be a while before we see applications like this come to market. Microsoft only announced its Bot Framework today, giving developers tools so they can start making apps that communicate intelligently with platforms like Skype, SMS and the web.
But if bots catch on, it may only be a matter of time before apps with integrated bots are telling us that our new hairstyle makes us look 10 years older, or stores are equipped with cameras to gauge our customer satisfaction. Tapping into Microsoft’s APIs, they’ll probably end up eerily accurate.