October 26, 2010 | by Andrew Kameka
Enjoying that new Vlingo In Car app? Me too, and we have Sensory to thank for the pleasure. Sensory is the company that powers the new “Hey, Vlingo” feature, and it has just released a new SDK that Android and iPhone developers can use to build hands-free applications.
The Truly Handsfree Trigger SDK, announced today, offers a new set of tools for developers looking to tap into the power of voice search and control. While Android already has several apps that perform functions through voice prompts, they almost always require some level of physical interaction. Truly Handsfree creates a ready-to-respond environment that requires no touch at all.
“When your hands are busy and your eyes are busy, you don’t want to have to distract yourself touching buttons,” Sensory CEO Todd Mozer told Androinica in a recent conference call. “But people still want to be productive in their cars, and Truly Handsfree Trigger will help enable that.”
Other possible uses of the SDK include controlling a music app’s playback functions, games that take advantage of voice commands, or apps that enable controls remotely. Imagine that your hands are full during an incoming call. A developer could create a program that reads the caller ID and responds to “Command: Answer in speakerphone mode” or “Command: Send to voicemail” if it’s not worth diverting attention.
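To make the incoming-call scenario concrete, here is a minimal sketch of how an app might route recognized trigger phrases to call actions. This is purely illustrative: the `VoiceCallDispatcher` class, the `CallAction` names, and the exact phrase strings are assumptions for the example, not part of Sensory's actual SDK, which would supply the always-listening recognition layer.

```java
import java.util.Locale;

// Hypothetical sketch: mapping phrases recognized by an always-on voice
// trigger to actions on an incoming call. The class and action names are
// illustrative only; a real app would receive the phrase from the SDK's
// recognition callback.
public class VoiceCallDispatcher {

    public enum CallAction { ANSWER_SPEAKERPHONE, SEND_TO_VOICEMAIL, IGNORE }

    // Normalize the recognized phrase and pick the matching call action.
    public static CallAction dispatch(String phrase) {
        String p = phrase.toLowerCase(Locale.US).trim();
        if (p.equals("command: answer in speakerphone mode")) {
            return CallAction.ANSWER_SPEAKERPHONE;
        }
        if (p.equals("command: send to voicemail")) {
            return CallAction.SEND_TO_VOICEMAIL;
        }
        return CallAction.IGNORE; // unrecognized phrases do nothing
    }
}
```

The key design point is that the phone, not the user's hands, does the work: the trigger layer listens continuously, and the app only has to translate a short fixed vocabulary into telephony actions.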
Costs and Concerns
The Truly Handsfree Trigger SDK is available now at a cost of $2,500 for 5 hours of development support, plus a per-unit licensing fee negotiated with Sensory. According to Mozer, Sensory will negotiate deals that make sense for developers, including independent or smaller creators.
“There’s this whole new world where people like to give apps away and make money through other means. We don’t have a preconfigured business model for that,” Mozer explained. “We’ve done some revenue share agreements and we’re open to a variety of ways of [monetizing apps]. What we usually try to do is partner with developers and better understand in what ways they plan to make money and figure out ways that we can share in their success.”
There were two other major themes we touched on in our conversation. Here are Mozer’s responses to those questions.
Androinica: Can you give some potential uses of your SDK?
Todd Mozer: We think it will be very powerful in the home. We’ll see that more in the tablets that are coming out, and we know that customers are talking about Android-based TVs and control systems. The idea of having a voice trigger in the home is really not that different from what remotes did. Everyone used to walk up to the TV and crank the dial to change the channel.
Now everybody has a remote control for that, but everything else – clocks, systems, microwave ovens – you still get up and operate by hand. It doesn’t have to be that way. We’ve introduced clocks where you can call over and say, “Hey, Clock, what time is it?” or “Hey, Clock, set my alarm.” In Android, this type of trigger is a really powerful concept because more homes have Wi-Fi and the ability to go into the cloud to do the heavy lifting.
Androinica: What’s better to use: Google’s or Sensory’s tools?
Mozer: I think the best of both worlds is to use both, actually. The Google API won’t allow you this hands-free capability, and it requires you to be connected when you use it. What we’ve found is that in some situations, [developers] don’t want to have to be connected or use bandwidth. Now developers can put speech control on the client when it’s not connected and in the cloud through Google, so you can have a hands-free trigger and then call out either to the client or to Google’s voice services.
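The hybrid approach Mozer describes can be sketched as a simple routing decision: an always-on local trigger fires, and recognition then runs on the device when offline or in the cloud when connected. Everything below is an assumption for illustration; the `Engine` interface and `HybridRecognizer` class stand in for whatever interfaces the embedded engine (e.g., Sensory's) and the cloud service (e.g., Google's) actually expose.

```java
// Hypothetical sketch of hybrid on-device/cloud speech recognition routing.
// Both engines implement one illustrative interface; neither name comes from
// a real SDK.
public class HybridRecognizer {

    // Stand-in for a speech engine: takes captured audio, returns text.
    public interface Engine {
        String recognize(String audioClip);
    }

    private final Engine onDevice; // embedded engine: works offline, no bandwidth
    private final Engine cloud;    // network service: heavier models, needs a connection

    public HybridRecognizer(Engine onDevice, Engine cloud) {
        this.onDevice = onDevice;
        this.cloud = cloud;
    }

    // After the hands-free trigger fires, route the utterance to the cloud
    // when connected and keep it on the client otherwise.
    public String recognize(String audioClip, boolean connected) {
        return (connected ? cloud : onDevice).recognize(audioClip);
    }
}
```

The design choice is the one Mozer outlines: the trigger itself must live on the client so it works without a connection, while the heavier recognition can opportunistically use the cloud when bandwidth is available.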