When I was a young 24-year-old Associate Product Manager on Microsoft’s Consumer Strategy Team, one of my main jobs was to demo a variety of devices at Microsoft’s Consumer Experience Center and Home of the Future. The year was 2003, and the closest thing I had to a smartphone was an SPV Windows Mobile device that let me read my email on a tiny screen. However, searching for content or trying to respond to emails was a tough exercise in patience, as the physical keypad and slow connectivity made it nearly impossible to input and request information.
Because I was the youngest member of the team, I ended up being the go-to demoer for tours of these centers. This gave me the opportunity to play around with new products like the SPOT (Smart Personal Objects Technology) watches, refrigerators that recognized RFID tags, and the Windows Media Center PC, which was intended to be the “hub of the home.” The world seemed to be changing incredibly fast, and I couldn’t wait to see products like these in the market. One of the core lessons I learned was that without a seamless and easy user interface, the consumer would feel the same way I did with my first smartphone: this is cool, but it needs to be a more intuitive experience.
My next major career move took me all the way to Seoul, South Korea, to work for one of the top mobile phone manufacturers, Samsung Mobile. My experience at Microsoft taught me that business is a global enterprise, and I thought that working in Asia would be a great professional as well as personal life experience. It was now 2008, and I soon realized that I had jumped into an incredibly dynamic and fast-moving industry. It wasn’t clear whether Symbian, Android, Windows Mobile, or even Samsung’s own Bada mobile OS would lead the market, and Samsung had just started to produce its first capacitive touchscreen devices like the Omnia and Galaxy mobile phones.
Once again the world felt like it was moving incredibly fast, and I was part of addressing the usability issues I had experienced with the first smartphones. Input and user experience were vastly improved with large touch screens and a cool new keyboard called Swype. But I still wondered why the interesting and fun home and wearable concepts I had played with at the Home of the Future were nowhere to be seen in the market. What was taking so long, and what were the barriers?
Fast-forward to 2016, and everything that I dreamed of and saw as a young 24-year-old is becoming a reality. Wearable technology monitors your personal fitness metrics, sprinklers can tell when the soil is getting too dry, and even your car can now be connected to content and service providers. However, interactivity with these devices remains a challenge.
As with my first smartphone, it is incredibly difficult to communicate with devices that have small or even no screens. We have to rethink how we relay information to the Internet of Things (IoT), and perhaps the most basic and simple answer is the best. Voice is at the core of how human beings communicate, and as these devices become smarter and more connected, it will be natural for people to want to speak to them, especially inside the home.
But each of these IoT devices requires a unique interaction experience. How you talk to a thermostat is very different from what you would say to your coffee maker. That’s why it is crucial that device makers and application developers for intelligent devices focus on consumer usability and device-specific use cases. This is at the core of Nuance’s thinking around how best to integrate voice recognition capabilities into the Internet of Things. Our development philosophy holds that one size doesn’t fit all for voice recognition integrations, just as all families and homes are not the same.
One great real-life example of this is the soon-to-be-released Samsung Family Hub Refrigerator. It features a Wi-Fi-connected touch screen that can help you manage your food and recommend menu options based on what you have handy. And the Samsung Family Hub will include speech capabilities utilizing Nuance’s voice recognition technologies. A user can now simply tell the fridge to look for recipes with spinach and tomatoes, which is a godsend when you have messy hands while working in the kitchen. My cookbooks bear the physical war stories of Thanksgiving dinners gone awry and frantic attempts to see what is next in the recipe with gloopy hands.
My journey in consumer technology is far from over, but, like a fine wine, it seems to get better with time. I am excited to be part of the Internet of Things revolution and to be working on our new development platform, Nuance Mix, which is specifically designed to help integrate voice into the latest and greatest devices and services. Fifteen years in, I am finally seeing the reality of what I glimpsed in the early 2000s. I’m eager to see what the next 15 years will bring and how I can be part of bringing the next wave of consumer technology to life.