Over sixteen years ago, my friend Dr. Thomas P.M. Barnett (@thomaspmbarnett) wrote, “The defining achievement of the New Economy in the globalization era will be the Evernet, a downstream expression of today’s Internet, which most of us still access almost exclusively through bulky desktop personal computers anywhere from a few minutes to several hours each day. Over the next ten or so years, this notion of being ‘online’ versus ‘offline’ will completely disappear.”[1] Although the term “Evernet” didn’t catch on, Barnett was on point with his prediction. Technologists at Cisco have also looked into their crystal balls and they see a world in which the lines between the World Wide Web (i.e., the human-to-human Internet) and the Internet of Things (i.e., the machine-to-machine Internet) blur so completely that the Internet of Everything (IoE) emerges. “The Internet of Everything,” they explain, “is the networked connection of people, process, data, and things.”[2]
Many (if not most) analysts believe this new world of connectivity is spurring the Fourth Industrial Revolution. In fact, the Fourth Industrial Revolution was the theme for this year’s World Economic Forum (WEF) meeting in Davos, Switzerland. The WEF defines the Fourth Industrial Revolution as a “fusion of technologies that is blurring the lines between the physical, digital, and biological spheres.”[3] This fusion of spheres is likely to create a very different world from the one we now live in. Barnett saw the new world this way:
“We go from today’s limited-access Internet to an Evernet with which we will remain in a state of constant connectivity. We will progress from a day-to-day reality in which we must choose to go online to one in which we must choose to go offline. This is not some distant fantasy world. Almost all the technology we need for the Evernet exists today. It mostly is just a matter of achieving connectivity. The rise of the Evernet will be humanity’s greatest achievement to date and will be universally recognized as our most valued planetary asset or collective good. Downtime, or loss of connectivity, becomes the standard, time-sensitive definition of a national security crisis, and protection of the Evernet becomes the preeminent security task of governments around the world. Ruling elites will rise and fall based on their security policies toward, and the political record on, the care and feeding of the Evernet, whose health will be treated by mass media as having the same broad human interest and import as the weather (inevitably eclipsing even that).”
Lev Grossman and Matt Vella (@mattvella) point to the introduction of the Apple Watch as the tipping point that slid us into this brave new world. “Technological progress tends to feel incremental,” they write, “but this is a watershed, a frog-boiling moment.”[4] They continue:
“There was a time when the Internet was something you dialed up; then it was replaced in the late 1990s by broadband, the always-on Internet, a formula that already sounds quaint. Apple Watch signals the advent of an always-there Internet, an Internet that can’t be put away. We’re used to dabbling just our fingertips in the Internet, but the Apple Watch doesn’t stop there. It tracks your movements. It listens to your heartbeat. It puts your whole body online. Exactly how personal do we want to get?”
The Apple Watch, however, is only one of the many “wearables” introduced a couple of years ago that promoted the “quantified self” movement. The 2014 International Consumer Electronics Show (CES) was noted for the number of wearable technologies on display. Having roamed the Show’s display areas, Troy Wolverton wrote, “If Big Brother — or at least Little Cousin, aka your smartphone — isn’t watching yet, he soon will be. That seemed to be the underlying message … of the opening events of the Consumer Electronics Show. … On display at Unveiled were numerous products that will monitor things ranging from users’ heart rates to the sump pump in their house. Taken together, the vision of the technology industry was clear: everything you do and everything that happens in and around your home is going to be tracked.”[5] Each new wearable device adds data to the so-called “quantified self.”
Although the always-on world raises some privacy issues, the combined benefits of connectivity, data collection, advanced analysis, and artificial intelligence (AI) are undeniable. I agree with Eric Schmidt, Chairman of Alphabet (Google’s holding company), that AI can help address some of the world’s most difficult challenges as well as profoundly assist the less fortunate throughout the world. Jack Clark (@mappingbabel) reports, “Google’s chairman thinks artificial intelligence will let scientists solve some of the world’s ‘hard problems,’ like population growth, climate change, human development, and education. Rapid development in the field of AI means the technology can help scientists understand the links between cause and effect by sifting through vast quantities of information.”[6]
One of the reasons that Schmidt and I are so hopeful is that new technologies, like cognitive computing — which can leverage artificial intelligence, semantic reasoning, and advanced analytics — are capable of dealing with a number of variables too large for traditional analytic methods to handle. That allows researchers to move beyond statistical correlation to actual causation. Schmidt told Clark, “[There is] a small set of people that understand collectively that when we put all this stuff together we can build platforms that can change the world.” Klaus Schwab, executive chairman of the World Economic Forum, worries about the “unexpected consequences” of decisions in a complex world. “In the modern era, it’s harder for policy makers to know the impact of their actions, which has led to ‘erosion of trust in decision makers’.”[7] Once the types of systems envisioned by Schmidt are mature, decision makers will be able to use them to explore the possible consequences of their decisions.
The Internet of Everything will likely affect every area of human activity from healthcare to shopping. Humans will eventually come to depend on their always-on virtual co-worker that anticipates work that needs to be accomplished, gathers and analyzes appropriate data, and then provides actionable insights that can be implemented. Thomas H. Davenport (@tdav), a Distinguished Professor at Babson College, and Julia Kirby (@JuliaKirby) argue that human/machine collaboration should allow humans “to take on tasks that are superior — more sophisticated, more fulfilling, better suited to our strengths — to anything we have given up.”[8] That is an “always on” world to embrace rather than fear.
Footnotes
[1] Thomas P.M. Barnett, “Life After DoDth or: How the Evernet Changes Everything,” USNI Proceedings, May 2000.
[2] “Value of the Internet of Everything for Cities, States & Countries,” Cisco.
[3] Joe Weisenthal, “Davos Boss Warns Refugee Crisis Could Be Precursor to Something Much Bigger.”
[4] Lev Grossman and Matt Vella, “Never Offline,” Time, 10 September 2014.
[5] Troy Wolverton, “CES’s Unsettling Message: Everything Will Be Tracked,” siliconbeat, 6 January 2014.
[6] Jack Clark, “Google Chairman Thinks AI Can Help Solve World’s ‘Hard Problems’,” Bloomberg, 11 January 2016.
[7] Weisenthal, op. cit.
[8] Thomas H. Davenport and Julia Kirby, “Beyond Automation,” Harvard Business Review, June 2015.