Mashable Tech - Until recently, the idea of holding a conversation with a computer seemed pure science fiction. If you asked a computer to “open the pod bay doors”—well, that was only in movies. [...]
“We’re at a transition point where voice and natural-language understanding are suddenly at the forefront,” says Vlad Sejnoha, chief technology officer of Nuance Communications, a company based in Burlington, Massachusetts, that dominates the market for speech recognition with its Dragon software and other products. “I think speech recognition is really going to upend the current [computer] interface.”
Progress has come about thanks in part to steady advances in the technologies needed to help machines understand human speech, including machine learning and statistical data-mining techniques. Sophisticated voice technology is already commonplace in call centers, where it lets users navigate through menus and helps identify irate customers who should be handed off to a real customer service rep. [...]
Jim Glass, a senior research scientist at MIT who has been working on speech interfaces since the 1980s, says today’s smart phones pack as much processing power as the laboratory machines he worked with in the ’90s. Smart phones also have high-bandwidth data connections to the cloud, where servers can do the heavy lifting involved with both voice recognition and understanding spoken queries. “The combination of more data and more computing power means you can do things today that you just couldn’t do before,” says Glass. “You can use more sophisticated statistical models.” [...]
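Those "more sophisticated statistical models" include, at their simplest, n-gram language models that score how plausible a candidate transcription is. The toy sketch below is purely illustrative (the tiny corpus, function names, and smoothing constants are invented; this is not Nuance's or MIT's actual code). It trains a bigram model and shows it preferring an in-corpus phrase over an acoustically similar alternative:

```python
from collections import defaultdict

def train_bigram(sentences):
    """Count word-pair frequencies in a (toy) training corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        words = ["<s>"] + sentence.split() + ["</s>"]
        for prev, word in zip(words, words[1:]):
            counts[prev][word] += 1
    return counts

def score(counts, sentence, alpha=1.0, vocab_size=1000):
    """Additively smoothed bigram probability of a candidate transcription."""
    words = ["<s>"] + sentence.split() + ["</s>"]
    prob = 1.0
    for prev, word in zip(words, words[1:]):
        total = sum(counts[prev].values())
        prob *= (counts[prev][word] + alpha) / (total + alpha * vocab_size)
    return prob

corpus = ["open the pod bay doors", "open the door", "close the pod bay doors"]
model = train_bigram(corpus)
# The in-corpus phrase outscores an acoustically plausible alternative.
print(score(model, "open the pod bay doors") >
      score(model, "open a pot bay door"))  # True
```

At production scale the same idea runs over billions of word sequences, which is why Glass's point about more data and more computing power matters: the model's probabilities get sharper as the corpus grows.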
Perhaps people will even speak to computers they wear, like the photo-snapping eyeglasses in development at Google. Sources at Nuance say they are actively planning how speech technology would have to be architected to run on wearable computers.
CNET - A group of researchers says shoes may be the next thing in the busy field of wearable computers and gesture interfaces.
Computer scientists from the Telekom Innovation Laboratories, the University of Munich, and the University of Toronto this week published a paper on ShoeSense, a wearable computing system for a smartphone.
[...] Developing alternative inputs for smartphones makes sense when a person is moving or engaged in other tasks, such as driving, or when it’s inappropriate to pull out a smartphone, such as during a family dinner, the ShoeSense developers said in the paper.
Its developers envision a shoe-mounted sensor that can recognize customizable hand and arm gestures. In a video, a user moves his finger along his forearm to turn up the volume on a music player in his pocket, pinches to select the next track, and then pinches with three fingers to send an “I will be late” e-mail to his wife.
Having a sensor device in a shoe has advantages over glasses in that it allows for eyes-free interaction, and it doesn’t constrain body motions. ShoeSense’s designers also think that it can be more socially acceptable to operate a smartphone through arm and hand gestures than via glasses. Potentially, the sensor could be powered by a walking motion.
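At the application layer, a gesture vocabulary like the one in the video reduces to a dispatch table mapping recognized gesture labels to phone actions. A minimal sketch, assuming hypothetical gesture and action names (these are not ShoeSense's actual API):

```python
# Illustrative actions; a real system would call into the phone's APIs.
def volume_up():
    return "volume +1"

def next_track():
    return "skipped to next track"

def send_late_email():
    return "sent 'I will be late' e-mail"

# Customizable mapping from recognized gestures to actions (names invented).
GESTURE_ACTIONS = {
    "slide_finger_along_forearm": volume_up,
    "two_finger_pinch": next_track,
    "three_finger_pinch": send_late_email,
}

def handle_gesture(label):
    """Dispatch a recognized gesture label to its configured action."""
    action = GESTURE_ACTIONS.get(label)
    return action() if action else "unrecognized gesture"

print(handle_gesture("three_finger_pinch"))  # sent 'I will be late' e-mail
```

Keeping the mapping in a plain table is what makes the gestures "customizable" in the sense the paper describes: users rebind gestures to actions without touching the recognition code.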
CNET - Sebastian Thrun, who works on Project Glass at Google X Labs, shows how to operate the wearable computer glasses by taking a photo of Charlie Rose and sharing it on Google Plus. Thrun says the Project Glass glasses are best at doing what a smartphone does, but in a hands-free way. He wore a prototype of the now-famous glasses during an interview with Charlie Rose that went online today.
[...] Thrun said that the idea of augmented reality, or superimposing digital images over the physical world, isn’t the best use for Google’s glasses. Instead, it’s a good hands-free way of interacting with technology services, he said.
“The thing we like is picture taking,” he said and then took a photo of Rose. “I nod and the picture is now visible to (my friends).” Google has also done experiments with making phone calls with the device, notifying the user of events on a calendar, or having e-mails spoken to the user.
“I can have e-mails read to me, so overall it’s very liberating to me,” said Thrun, an artificial-intelligence expert who also worked on Google’s driverless car. “The hope is to get things out of the way. This is a display that’s with you all of the time.”
Forrester Blogs – Senior Analyst Sarah Rotman Epps writes: Wearable devices, or “wearables” for short, have enormous potential for uses in health and fitness, navigation, social networking, commerce, and media. Imagine video games that happen in real space. Or glasses that remind you of the name of a colleague you really should know. Or paying for a coffee at Starbucks with your watch instead of your phone. Wearables will transform our lives in numerous ways, trivial and substantial, that we are just starting to imagine.
In a new Forrester report out [on 04/17/2012], we argue that wearables will move mainstream once they get serious investment from the “big five” platforms — Apple, Google, Microsoft, Amazon, and Facebook — and their developer communities, and we give advice to product strategists who want to stay ahead of the wearables curve. Key takeaways:
Wearables are here, and more innovation is coming. We’ve all seen the movies: Gadget-laden heroes from James Bond to the Terminator to Iron Man have long relied on voice-controlled watches and heads-up display glasses to extend their powers. Now, those gadgets are a reality, albeit a niche one. [...]
Wearables need backing from the big five platforms to succeed. Wearables without software are just geeky hardware. The big five software platforms — Apple, Google, Microsoft, Amazon, and Facebook — each have strengths to bring to wearables. [...]
Wearables will heighten the platform wars — and Google may actually win. [...] Google’s open Android platform will inspire broader experimentation for entire wearable solutions. [...]
Product strategists who want to stay ahead of the curve should take a cue from companies like Intuit and experiment with wearables now, especially those in industries that will be disrupted by wearables, including apparel, software, media, gaming, and commerce.
CNET - Google finally acknowledged that it’s testing a prototype set of eyeglasses that can stream data to the wearer’s eyes in real time.
A video of this augmented-reality experiment was posted by Google on YouTube showing someone wearing the glasses as he made his way around a variety of Manhattan venues, receiving up-to-the-minute updates as information streamed into his glasses.
Now Google’s touting it as Project Glass. Babak Parviz and his collaborators, Steve Lee and Sebastian Thrun, wrote up a brief post to accompany the video and solicited feedback, asking people what they’d like to see in the glasses.
“A group of us from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment. We’re sharing this information now because we want to start a conversation and learn from your valuable input. So we took a few design photos to show what this technology could look like and created a video to demonstrate what it might enable you to do.”
Let’s not be too cynical about an idea that, at first blush, seems delightful but not very relevant. Also, given that the authorities take a dim view of driving while texting, you can imagine how they’ll react to someone behind the wheel of a car with yet another distraction.
DVICE - Fans of the iPod Nano Watch may want to think about making an upgrade in the near future. Of course, the only trade-off is that you’ll have to switch to an Android device.
The Sony SmartWatch is essentially a tiny touchscreen that allows you to control and interact with your smartphone via Bluetooth. In addition to functioning rather well as a wristwatch, the device allows you to read SMS and e-mail messages, control your phone’s camera, read calendar appointments, and use services like Twitter and Facebook. In order to use the SmartWatch you’ll also need Sony’s LiveWare app along with the SmartWatch plug-in. Priced at $149, it’s due to arrive in stores later this month.
CNET - It appears LG is making good on its promise to bring flexible displays to e-book readers, as the Korean consumer electronics company revealed that it has started mass production of the “world’s first” plastic electronic paper display (EPD).
The screen measures 6 inches diagonally and has a resolution of 1,024×768 pixels. It can bend at a range of 40 degrees from the center of the screen, and it’s also about one-third thinner and half the weight of current glass EPDs, LG says, meaning it’ll be even easier to carry than today’s e-book readers.
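For context, those specifications imply a pixel density of roughly 213 pixels per inch. The arithmetic is simple enough to sketch:

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch, from screen resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# LG's plastic EPD: 1,024 x 768 pixels on a 6-inch diagonal.
ppi = pixel_density(1024, 768, 6)
print(round(ppi, 1))  # 213.3
```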
LG also cites durability as a benefit of its plastic EPD. The company ran numerous stress tests on the display, including dropping it from a height of about 5 feet and hitting the screen with a small urethane hammer [...] and saw no damage. Presumably, this means you could toss your reader into your bag sans case without fear of scratching up the display.
The company says we could see products with its plastic EPDs as early as next month in the European market, and though LG is focusing on e-book readers for now, it’s looking to incorporate the technology into other products in the future.
medGadget - Avery Dennison Medical Solutions (Chicago, IL) has created a disposable wearable sensor to improve medical monitoring. To establish a strong identity in the quickly growing body monitoring field, Avery Dennison worked with Karten Design (Los Angeles, CA) to optimize the Metria sensor for end users. With a design that draws more from athletic apparel than from medical products, the firm sought a fitness-inspired product that could be worn comfortably around the clock for approximately seven days.
“Many sensors available today look like bandages,” explains Jonathan Abarbanel, the lead designer on this project, in an interview with medGadget. “Through design, we wanted to visually message the wearable sensor’s capabilities: it’s not just a bandage; it’s a body-worn sensor with complex electronics that can provide real-time, continuous information about your vital signs.”
The sensor is expected to be available later this year and will be distributed under the Body Media (Pittsburgh, PA) brand.
new electronics - We are surrounded by electronic machines, many of which have advanced at an astonishing rate. But, arguably, the way we interact with these machines has lagged far behind. For example, decades after speech recognition was invented, how many people do you hear talking to their PCs? The humble keyboard and mouse remain the dominant interface.
Smartphones and tablet computers already use the touchscreen interface to great effect and if some of the many research projects underway succeed, touch technology – or haptics – will transform the way we use electronic devices.
One promising example of haptics is OmniTouch, a wearable projection system developed by Microsoft Research and Carnegie Mellon University (CMU) in the US. It enables users to turn pads of paper, walls or even their own hands, arms and legs into graphical, interactive surfaces.
US company Novint Technologies is a leader in haptic interfaces for gaming, in the form of its Falcon and XIO products. Users hold onto the Falcon’s grip and, as it moves, the computer tracks a 3D cursor. When the cursor touches a virtual object, the computer registers contact with that object and updates the currents to motors in the device, creating an appropriate force at the device’s handle, which the user feels.
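The contact-and-force loop described here is, in essence, penalty-based haptic rendering: when the tracked cursor penetrates a virtual surface, the motors push back with a force proportional to the penetration depth, like a spring. A simplified one-dimensional sketch of that general technique (this is not Novint's actual control code, and the stiffness value is arbitrary):

```python
def contact_force(cursor_z, surface_z=0.0, stiffness=500.0):
    """Spring-model force (newtons) pushing the cursor out of a virtual surface.

    cursor_z: cursor position along the surface normal (meters); the
    virtual surface occupies z < surface_z. Positive return values push
    the handle back out of the object.
    """
    penetration = surface_z - cursor_z
    if penetration <= 0:
        return 0.0                   # no contact: motors apply no force
    return stiffness * penetration   # F = k * penetration depth

# Each servo tick (~1 kHz on real haptic hardware), read the cursor
# position and command the corresponding force to the motors.
for z in (0.01, 0.0, -0.002):
    print(contact_force(z))  # 0.0, 0.0, 1.0
```

Real devices run this loop at around a kilohertz because slower updates make virtual surfaces feel soft or unstable; the sketch only shows the per-tick force computation.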
Chris Harrison, of CMU’s Human-Computer Interaction Institute, says: “The real world is full of rich haptic feedback: we push a door, grab a toothbrush, grasp a bottle. So far computing has lacked much touch input, so we’re mostly clicking buttons and poking touchscreens. But there is a huge opportunity for providing haptic feedback to the user, just as we get from real world actions.”
CNET - Scientists and engineers have built a monitor that tracks heart rate, respiration, and movement–without requiring direct contact with skin.
The “life and activity” monitor, developed at Oregon State University, is wearable and non-invasive. It manages this contact-free tracking via a 5-axis inertial measurement unit and a non-contact heart rate sensor, which together allow ongoing, simultaneous monitoring of movement, heart rate, and respiration. Imagine adhering such a device to your pants instead of wearing yet another arm or wrist band that’s trying to resemble a watch.
The researchers, who reported on their emerging tech this week, say the next step is to continue to miniaturize a device that is already just two inches wide–ultimately taking the form of, say, a disposable bandage prescribed by a doctor for a few weeks of continuous monitoring.
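Under the hood, continuous heart-rate estimation from such a sensor amounts to picking periodic peaks out of a sampled signal. A toy sketch of that idea (illustrative only; this is not the OSU group's published algorithm, and the synthetic sine wave stands in for a real cardiac signal):

```python
import math

def estimate_bpm(samples, sample_rate_hz):
    """Estimate beats per minute by counting local maxima above the mean."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        if cur > mean and cur >= prev and cur > nxt:
            peaks += 1
    duration_min = len(samples) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 10-second "cardiac" signal at 1.2 Hz (72 bpm), sampled at 50 Hz.
fs = 50
signal = [math.sin(2 * math.pi * 1.2 * t / fs) for t in range(10 * fs)]
print(round(estimate_bpm(signal, fs)))  # 72
```

A production algorithm would of course have to reject motion artifacts and noise (which is partly why the device also carries an inertial measurement unit), but the core task is this kind of peak counting over a sliding window.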