Daily Archives: January 22, 2013

The Contextual, Sensual CES2013

It’s been a week now since the Consumer Electronics Show closed. I wanted to take that time to read all the reports and shake off any overhype I picked up from all those big screens I saw.

Really, the story was sensors. Whether it was video sensors on glasses, heart-rate sensors on watches, or 3D sensors you can interact with, this CES was more about sensors than anything else.

For a taste of just how big a deal this was this year, check out this video of PrimeSense’s private suite.

Don’t know PrimeSense? It licensed its technology to Microsoft for the Kinect sensor. You know, the one that can see you dancing, or gesturing, or moving around. It even does pretty good face detection. It knows when it’s me playing rather than my sons.

But this year the technology took a Moore’s-law-style turn. It got a LOT smaller: it’s now the size of a stick of chewing gum instead of something longer than most of my books. It’s lower cost, too; it will run less than $100. And it’s much higher resolution. It’s now so accurate it can see how hard you are pressing against a desk.

Listen to PrimeSense founder Aviad Maizels talk about his vision for 3D sensing.

Speaking of 3D sensors, I did see the Leap Motion. I like what they are doing too, and we’ll do a video with them in the future. But their sensor is optimized for above-the-keyboard use, not whole-room use, so I find the PrimeSense has me dreaming about a contextual future a lot more.

At CES I had dinner with execs from GM and Ford, and they are thinking about how to use these sensors in cars: both to personalize the car (with a sensor like this it could tell you are sitting in the driver’s seat) and to do things like wake-up alarms if you are falling asleep while driving. Also, hand gestures will be more efficient in many ways than voice systems, particularly for moving around user interfaces. Listen in to John Ellis, head of the Ford Developer Program, talk about the contextual future of cars:
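To make the wake-up-alarm idea concrete, here is a minimal sketch, assuming a hypothetical cabin sensor that reports, once per second, the fraction of that second the driver’s eyes were closed. None of this is GM’s or Ford’s actual code; the threshold and window are invented for illustration.

```python
def is_drowsy(eye_closure_samples, threshold=0.4, window=10):
    """Flag drowsiness when the average eye closure over the last
    `window` one-second samples exceeds `threshold` (a rough
    PERCLOS-style measure)."""
    recent = eye_closure_samples[-window:]
    if len(recent) < window:
        return False  # not enough data yet to judge
    return sum(recent) / len(recent) > threshold

# Eyes drifting shut over the last ten seconds: trigger the alarm.
print(is_drowsy([0.1] * 5 + [0.6] * 10))  # True
# Alert driver: brief blinks only, no alarm.
print(is_drowsy([0.1] * 10))              # False
```

A real system would of course fuse this with steering behavior and lane-position data, but the core is the same: a rolling window over a continuous sensor stream.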

The other thing I saw was wearable computers. Listen in to these two visionaries who are building really interesting wearables. Recon Instruments builds the heads-up displays that Oakley is including in its Airwave ski goggles, and Pairasight has built glasses with two 1080p cameras. The Texas Instruments chipset Pairasight was using lets you stream about 1.5 hours of 1080p video on a single battery charge (and the battery is tiny, so this is a breakthrough). Pairasight’s glasses are at the prototype stage. Recon’s are shipping now.

That all led me to talk with Don Norman, who I ran into at CES. Don’t know who he is? At Apple he was a fellow with the title User Experience Architect (the first time “User Experience” was used in a title there) and later became Vice President of Apple’s Advanced Technology Group. But that hardly explains Don; go read his Wikipedia entry.

So, what does it mean?

Well, consumer electronics are about to become anticipatory and personal.

Think about Google Now, which shows you all sorts of ways to live your life better (like the fact that I had better leave for my meeting now because traffic is bad on the way into San Francisco). Our world will know you at a deep level. Don’t believe me? Look again at the PrimeSense video. In there is a demo by Shopperception, which lets retail stores see what you are buying in real time. Freaky, huh? But you know we’ll let stores do this. Why? We’ll get paid to. I see everyone in Safeway using their Safeway card, which already allows pretty deep tracking of buying behavior. Imagine a display near the cereals saying “hi Robert Scoble, nice choice of Cheerios, if you want a second box it’s half off.”

The sensual, contextual age of consumer electronics is here, ready or not.