Mobile 3.0 arrives: How Qualcomm just showed us the future of the cell phone (and why iPhone sucks for this new contextual age)

Google Now screen shot

The world just changed yesterday. You probably didn’t notice. But I guarantee strategists at Apple, Facebook, Amazon, Microsoft, and Google did.

What happened? Qualcomm shipped a new contextual awareness platform for cell phones.

Yesterday the Mobile 3.0 world arrived. The first mobile era was the standard old cell phone. You talked into it. The second era was brought to us by the iPhone. You poked at a screen. The third era will bring us a mobile that saves us from tapping on the screen at all.

We’ve seen lots of precursors. Heck, Google itself, a couple of weeks ago, shipped something called “Google Now” that tells you stuff based on your context. “Hey, Scoble, you better leave for your next appointment because it takes 53 minutes to get there,” my new Nexus 7 tablet tells me. You can see the actual screen shot above.

But in the future your mobile device, whether it be something you hold in your hand, like a smart phone, or wear on your face, like Google Glasses, will know a hell of a lot about you.

How?

Well, Qualcomm just shipped the developer SDK, called Gimbal.

This SDK talks to every sensor in your phone. The compass. The GPS. The accelerometer. The temperature sensor. The altimeter sensor. Heck, we’ve known about sensors in cell phones for a while now. Here’s a New York Times report from May of last year.

But now, thanks to this SDK, your smart phone will start to make sense of that data. Developers will have a single data pool on your cell phone to talk to (Qualcomm was very smart about privacy: none of this data leaves your own cell phone unless you give it permission to).

Today I was talking with Roland Ligtenberg, a product developer at Qualcomm Labs. While talking with him, I realized just what Qualcomm was up to.

See, if you do all this collection and analysis in software, there is a battery cost. Remember Highlight? My favorite app of SXSW (and, really, of the year). Did you ignore it? Well, investors aren’t. Ron Conway told me that, aside from Pinterest, Highlight is his favorite new company. Mine too, because it showed me something no one else had shown me before (a new kind of context: the people who are near me). It actually is a lame app compared to what is coming, thanks to this Qualcomm SDK.

Qualcomm wouldn’t comment officially, but Roland told me that if you did all this in hardware, the battery cost would be far lower. So look for this SDK to come to your mobile phone (or other wearable computing devices, like Google Glasses) soon.

Want to see what other use cases are coming? Check out this answer on Quora (actually 28 separate answers from techies) about what the Google Glasses world will bring (really they are talking about contextual and wearable computing, mashed together).

To add to those answers: these new systems are going to know whether you are walking, running, or skiing. Whether you are shopping, working, or entertaining yourself (it knows whether you are in church, in a strip club, at school, at work, or driving). Thanks to the wifi and Bluetooth radios, it can even know you are riding in your wife’s car, not driving it. (Only available on Android, because Apple doesn’t let developers talk to those radios.)
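Gimbal’s actual activity classifiers are proprietary and I haven’t seen the API itself, but the basic trick behind “are you walking, running, or still?” is simple: measure how jittery the accelerometer signal is. Here’s a toy Python sketch of that idea; the thresholds are invented for illustration, not anything from Qualcomm:

```python
import math

def classify_activity(samples):
    """Guess an activity from a window of (x, y, z) accelerometer samples.

    Computes the variance of the acceleration magnitude: a phone at rest
    shows almost none, walking shows moderate variance, running shows a lot.
    Thresholds below are made-up illustrative values.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    if variance < 0.5:
        return "still"
    elif variance < 8.0:
        return "walking"
    return "running"

# A phone lying flat reads roughly (0, 0, 9.8) with no jitter at all.
print(classify_activity([(0.0, 0.0, 9.8)] * 50))  # -> still
```

A real system would add more features (step frequency, GPS speed, which radios are in range) and, as Roland’s point about hardware suggests, run this math on a low-power chip instead of the main CPU.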

Which brings me to why Apple sucks.

Apple does NOT give developers access to the Bluetooth and wifi radios. This is going to really hinder developers in this new contextual world.

Think about why your phone or Google Glasses might want to know you are in the kitchen vs. sitting on your couch in the living room. The information that should automatically show up on your phone will be radically different. In the kitchen I’m in a food context. I want recipes, or healthy living guides, or I want my device to track just how many Oreo cookies I’m eating: “Hey, Scoble, you fat dude, this isn’t helping!” Already we’re doing this kind of quantified-self stuff with Fitbit, the Nike Fuel Band, and other devices. My wife already tracks everything she eats and does on her cell phone.

Now, in the future our cell phones will know us at a very deep level. Already I’ve told Facebook more than 5,000 things I like. Check out my list. It’s public. On it you’ll see which startups I like. But also that I like Round Table Pizza. Think about that one for a moment.

In the future my cell phone will know I ordered a pizza. Will know when I get in my car. Will know who is in the car with me. And will give me contextual data that will make my life better. For instance, on my todo list I might have put “pick up a hammer at the hardware store.” It will know that Round Table Pizza is near the hardware store. It will know I have an extra 15 minutes. It can use Waze to route me to the hardware store first, tell me to pick up my hammer, and then head to Round Table to pick up that pizza. All while measuring how many steps I took (Nike Fuel points!) and telling me who has crossed my path. Oh, Joseph Smarr, who works at Google, is also at the Round Table? Cool! (He lives in Half Moon Bay too so this could happen at any time).
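None of that hammer-and-pizza scenario requires magic; at its core it’s scheduling arithmetic. Here’s a hypothetical Python sketch of how a contextual assistant might decide which errands fit into a spare 15 minutes. All the errand names and minute figures are invented; a real assistant would get travel times from something like Waze:

```python
def plan_detour(slack_minutes, errands):
    """Pick errands that fit in the spare time before the main stop.

    `errands` is a list of (name, extra_minutes) pairs. Greedily takes
    the cheapest detours first until the slack is used up. All values
    here are made up for illustration.
    """
    plan, used = [], 0
    for name, extra in sorted(errands, key=lambda e: e[1]):
        if used + extra <= slack_minutes:
            plan.append(name)
            used += extra
    return plan

# 15 spare minutes before the pizza is ready: the hammer run fits,
# the dry cleaner does not.
print(plan_detour(15, [("hardware store: pick up hammer", 12),
                       ("dry cleaner", 20)]))
```

The hard part isn’t this logic; it’s knowing the context in the first place (where you are, what’s on your todo list, how much slack you actually have), which is exactly what a platform like Gimbal is for.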

But when I get back, can my phone understand that I’m now in the dining room, eating? Or in the living room, ready to watch a sports show? (It already knows what sports I like; think about the next Olympics, where it tells me it has queued up the track and field finals for me to watch automatically.) Not if you have an iPhone, because Apple hasn’t given developers access to the wifi and Bluetooth radios, so developers can’t let you map out your house accurately.
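How would a phone that can see the radios tell the kitchen from the living room, where GPS is useless? One common technique is wifi fingerprinting: record the signal strengths of nearby access points in each room once, then match later scans against those fingerprints. A toy Python sketch of the matching step; the access point IDs and dBm numbers are invented:

```python
def nearest_room(observed, fingerprints):
    """Match an observed wifi scan against per-room RSSI fingerprints.

    `observed` and each fingerprint map access-point IDs to signal
    strength in dBm. The room whose fingerprint has the smallest
    squared distance to the scan wins. All IDs/values are fake.
    """
    def distance(fp):
        aps = set(observed) | set(fp)
        # Treat an access point missing from a scan as a very weak -100 dBm.
        return sum((observed.get(ap, -100) - fp.get(ap, -100)) ** 2
                   for ap in aps)
    return min(fingerprints, key=lambda room: distance(fingerprints[room]))

# One-time survey: each room sees the two home access points differently.
fingerprints = {
    "kitchen":     {"ap-1": -40, "ap-2": -70},
    "living room": {"ap-1": -65, "ap-2": -45},
}
print(nearest_room({"ap-1": -42, "ap-2": -72}, fingerprints))  # -> kitchen
```

This is exactly the kind of thing that’s trivial on Android and impossible on iOS today, because the matching is easy but you can’t do it without reading raw signal strengths from the radio.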

Which gets me to what Facebook and Amazon could do to totally disrupt the smart phone market (both are rumored to be working on hardware). See, you shouldn’t work on hardware if you can only match what Apple has already done. You should work on it if you can totally blow away what Apple has done.

I bet that Amazon and Facebook are building a new kind of contextual device. One that already knows you. Facebook already knows what I read, watch, listen to, and much more thanks to its Open Graph API system. Amazon already knows what I read, watch, and buy, thanks to its commerce system.

Add these two companies to Qualcomm’s new contextual platform and you have a new world.

By the way, Qualcomm is a $95 billion market-cap company, it spends $3 billion a year on R&D, and its chipsets are probably inside the phone you are holding right now. So I take what they are doing very seriously.

So seriously that Forbes author Shel Israel and I will announce a new project all around contextual computing next week. See ya on Tuesday.

A new age just arrived. Mark yesterday in your calendar and see you on Tuesday.

By the way, for those at Rackspace: this will eventually change everything about our business too. We’re well positioned, thanks to our move to supporting a totally open cloud, which will pay big dividends next year as developers build new infrastructure for this contextual age. The cloud is about to turn contextual in a very big way. That’s why we need to keep up with what Amazon, Google, and the other players are doing here, and why we should start building support systems for this Qualcomm SDK now. It is that big a deal.

Watch this video to see a taste of what’s coming in the new contextual age.