It may seem like it’s been around forever, but the first iPhone came out nearly eight years ago, in 2007. Don’t believe me? You can easily dig up the initial reviews with a simple search – no need to call up the microfiche for turn-of-the-century print articles.
At the time, the idea of a computer in your pocket had already been established by Palm and BlackBerry, but it took Apple’s sleek product to really acclimate people to the idea that the mobile revolution had finally arrived.
That’s why the conversation around the Apple Watch has sounded so familiar: Apple launching a new piece of hardware two to three years behind its many competitors. And while Apple had an instant hit on its hands with the iPhone (and the subsequent iPad), something feels different about the discussion this time around.
Whereas the iPad was widely mocked during the pre-sale hype period as just “a big iPhone” you couldn’t make calls from, reviewers changed their minds as soon as they watched their first movie on the device.
We humans assumed we needed a computer on our wrist and, early adopters aside, waited for Apple to get around to supplying us with one. The clamor in the run-up was substantial. Yet reviewers were either underwhelmed by the device or felt it played to our worst vices, encouraging us to ignore the real world for a digital one. It gave birth to many “why I’m returning my Apple Watch” pieces of a kind I don’t recall seeing for the iPad or iPhone.
[Embedded tweet from Jay Yarow (@jyarow), June 4, 2015]
Suddenly the Apple Watch ecosystem that ad and marketing tech companies had waited for so patiently seems in jeopardy. But I’d argue the watch itself is not the thing. It’s the idea that we may finally be taking stock of exactly what ubiquitous connectedness is doing to us.
We, as a people, are terrible at multitasking.
We try to convince ourselves that we’re not, but science cares not for our argument. Every time someone chases a lull in the conversation with a dip into Twitter, there is a “cognitive cost” in doing so, Earl Miller, a neuroscientist at MIT, told the Guardian.
Yet we expect the next wave of technology to link us to information even more powerfully, more consistently and more instantaneously. Our wearable devices will tell us when we’re dehydrated. The Internet of Things will alert us when our toaster needs fixing, and our smart cars will remind us we’re driving past the place we order takeout from most often at this time of day. And on and on.
This cavalcade of notifications and distractions has mostly been framed as progress, but the negative initial reaction to the Apple Watch calls that into question. Perhaps “always on” is not preordained. We are already starting to recognize the limits of our attention spans just through the use of tablets and phones. Vendors that enable companies to smartly advertise services and products may find that consumers are even less receptive to that idea than previously thought.
That’s not to say we won’t want to know when our house is on fire. But it’s clear that consumers are not going to sign up for the next generation of interconnected devices without taking a hard look at what they give up in exchange for what they gain.