“Any sufficiently advanced technology is indistinguishable from magic,” Arthur C. Clarke once said. With recent developments in mobile computing, his observation is increasingly relevant. Sensor-enabled apps like Color have broad implications.
According to McKinsey, “Closing this loop from data to automated applications can raise productivity, as systems that adjust automatically to complex situations make many human interventions unnecessary.” Mobile apps offer us “micro-efficiencies,” shortcuts that can make our lives easier.
Behind the scenes, Color makes use of every phone sensor it can access. It hijacks the iPhone’s camera to check lighting conditions and the microphone to “listen” to a user’s surroundings. From these signals, Color triangulates a user’s location relative to others and generates “elastic networks” for photo sharing. That secret sauce has sparked fascination among the tech elite: few really understand how it works, but Color demonstrates how mobile could eventually change the way we interact with the world.
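Color has never published its algorithm, so the following is purely a hypothetical sketch of the idea: phones that sense similar surroundings (here, toy ambient light and sound levels on a 0–1 scale) are assumed to be near each other, and users are clustered into ad hoc groups accordingly. The function names, thresholds, and fingerprint format are all invented for illustration.

```python
from itertools import combinations

def similarity(a, b):
    # Inverse-distance similarity between two (light, sound) fingerprints;
    # identical readings score 1.0, very different readings approach 0.
    return 1.0 / (1.0 + abs(a[0] - b[0]) + abs(a[1] - b[1]))

def elastic_networks(fingerprints, threshold=0.5):
    # Cluster users whose sensor fingerprints are similar enough,
    # using a simple union-find to merge overlapping pairs.
    parent = {u: u for u in fingerprints}

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path compression
            u = parent[u]
        return u

    for u, v in combinations(fingerprints, 2):
        if similarity(fingerprints[u], fingerprints[v]) >= threshold:
            parent[find(u)] = find(v)

    groups = {}
    for u in fingerprints:
        groups.setdefault(find(u), set()).add(u)
    return list(groups.values())

# alice and bob share a dim, quiet room; carol is somewhere bright and loud.
readings = {"alice": (0.20, 0.30), "bob": (0.25, 0.35), "carol": (0.90, 0.95)}
print(elastic_networks(readings))
```

Running this groups alice and bob together and leaves carol alone, which is the intuition behind an “elastic network”: membership stretches and shrinks as sensor readings converge or drift apart.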
As impressive as Color is, even more impressive mobile applications are on their way. Computer vision, for example, is an entire field of study devoted to how computers see and extract information from images, information that can in turn be used to perform a task. Microsoft Kinect demonstrates this well; hackers have manipulated it to do everything from simulating invisibility to recognizing sign language.
Mobile services that take in the world and enable decision-making are already available in rudimentary form. In retail, ShopWell can scan product barcodes and personalize recommendations based on a user’s diet and nutritional needs. Walgreens’ new app uses image recognition to eliminate manual entry of prescription codes: users can simply take a picture of the pill bottle instead. Google Shopper searches for online prices, and Meal Snap identifies calorie counts at the tap of a button.
These applications get more serious still. Miami University’s Augmented Reality Research Group developed an Android app that “reads” a bookshelf to flag books that are misplaced. Marmota, a prototype developed by FBK researchers, analyzes mountain views and overlays them with names, distances, and altitudes. Similarly, Augmented Driving watches the road, detecting lanes to issue lane-change warnings.
All of this leads to what Google’s Eric Schmidt calls an “Age of Augmented Humanity.” Mobile technologies are dissolving into behavior. For organizations, this has at least two consequences. First, brands will be able to provide consumers with utility that not only makes things easier but changes behavior. For example, Chase’s mobile app now lets users deposit checks with two camera clicks, vastly reducing a customer’s need to stop at the bank. Second, brands can harness these technologies to sell with the magic that Clarke described. A service like Color evokes delight and emotion, which in today’s experience economy could shape a brand’s positioning just as powerfully as utilitarian applications.