Adam Gartenberg's Blog

Business Analytics and Optimization, IBM and Social Marketing

IOD EMEA 2010 - Smarter Futures

Along with others from the bloggers program, I had the chance to get a tour of the "Smarter Futures" Emerging Technology lab set up on the expo floor at IOD EMEA yesterday.  This team, based in IBM's Hursley lab, is doing some really interesting work applying leading-edge technology to the way we live and work.

The setup that seemed to be getting the most attention (from us and others) was a demo that uses a thought-control headset (yes, thought control) coupled with IBM's MQTT (MQ Telemetry Transport) middleware technology.  The particular output they demonstrated was driving a miniature remote-control car by raising your eyebrows, but applications might include letting you control household appliances just by thinking about them, or sounding an alarm for long-haul truckers if the system senses they are starting to get drowsy.
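The glue in that demo is MQTT's publish/subscribe pattern: the headset publishes events to a topic, and anything interested (here, the car's controller) subscribes and reacts. A minimal in-process sketch of that pattern, with a toy stand-in for the broker (the topic name and event payload are invented for illustration, not taken from the actual demo):

```python
from collections import defaultdict

class TinyBroker:
    """A toy stand-in for an MQTT broker: exact-match topic names only."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every callback subscribed to this topic.
        for callback in self.subscribers[topic]:
            callback(topic, payload)

commands = []

def car_controller(topic, payload):
    # Translate a headset event into a drive command (names are hypothetical).
    if payload == "eyebrow_raise":
        commands.append("accelerate")

broker = TinyBroker()
broker.subscribe("headset/events", car_controller)
broker.publish("headset/events", "eyebrow_raise")  # the car accelerates
```

The publisher never needs to know the car exists, which is exactly why the same middleware can just as easily drive an appliance or a drowsiness alarm instead.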

This same MQTT technology was in use elsewhere in the demo.  They had set up a demonstration of Near Field Communications, again using MQTT as the message broker, to connect various household devices.  For example, capturing a QR code with a mobile phone's augmented-reality view identified the code as belonging to a light and presented the phone's user with a button to press to turn the light on.  When the light went on, it pushed the home's smart meter over a preset limit, which alerted the homeowner that they were using more electricity than they intended.  The setup and software are easily extensible, and could just as easily include RFID tags or other sensors.
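The smart-meter side of that demo boils down to a subscriber accumulating consumption readings and raising an alert once a preset limit is crossed. A rough sketch of that logic (the kWh figures and limit are made up for illustration):

```python
class SmartMeterMonitor:
    """Tracks cumulative household usage and alerts once it crosses a preset limit."""
    def __init__(self, limit_kwh):
        self.limit_kwh = limit_kwh
        self.total = 0.0
        self.alerts = []

    def on_reading(self, kwh):
        # Each reading could arrive as an MQTT message from the meter.
        already_over = self.total > self.limit_kwh
        self.total += kwh
        if self.total > self.limit_kwh and not already_over:
            self.alerts.append(
                f"Usage {self.total:.1f} kWh exceeds limit {self.limit_kwh:.1f} kWh"
            )

monitor = SmartMeterMonitor(limit_kwh=5.0)
for reading in [2.0, 2.5, 1.0]:   # the light switching on pushes usage over the limit
    monitor.on_reading(reading)
```

The alert fires once, on the reading that crosses the threshold; in the demo that alert is what reached the homeowner's phone.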

Another really interesting project was "SiSi" (Say it, Sign it), a project that began when some of the Hursley lab staff wanted to brainstorm alternate ways to communicate with a deaf coworker.  They designed a system which converts verbal speech to text, which it then in turn translates to sign language acted out by an avatar.  This project was done in partnership with the UK Royal National Institute for Deaf People and the University of East Anglia, who developed the signing avatar.

We were also shown a demo of a retail store scenario designed to build closer interaction between the customer and the store.  If you think about an online shopping experience, the online retailer knows a lot about us, whether from our browsing habits during that particular shopping session or, especially, because we signed in to the site when we first visited it.  This stands in stark contrast to the physical retail experience, where for the most part the store has no idea who we are, what our preferences are, or what our prior history with them is.  The project the lab team demonstrated used your mobile phone to alert the store that you were there (voluntarily, of course), at which point the store could better personalize its interaction with you, for example by sending your phone a customized discount code to use at checkout.
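The check-in flow they described can be sketched as a simple lookup: the phone announces a customer identifier, the store consults its history for that customer, and responds with a personalized code. Everything below (the identifier, the history data, the code format) is hypothetical, not from the demo:

```python
import hashlib

# Invented purchase-history store, keyed by an opt-in customer identifier.
PURCHASE_HISTORY = {
    "customer-42": ["coffee", "espresso beans"],
}

def check_in(customer_id):
    """Handle a voluntary check-in: return a personalized discount code, or None."""
    history = PURCHASE_HISTORY.get(customer_id, [])
    if not history:
        return None  # unknown or first-time customer: nothing to personalize
    # Derive a stable per-customer code (illustrative only, not a secure scheme).
    digest = hashlib.sha256(customer_id.encode()).hexdigest()[:6].upper()
    return f"SAVE10-{digest}"

code = check_in("customer-42")
```

The point of the demo was the opt-in: the store only gets the identifier, and with it the chance to personalize, when the customer chooses to announce themselves.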

Like I said at the top, they were showcasing some very cool technology.  My thanks to Peter Waggett, Kevin Brown and Ed Jellard for taking us through the demonstrations.