A few weeks ago, I was asked to participate in a conference sponsored by Bristol-Myers Squibb called “Data Driven Decisions Day,” or 4D for short. The theme of the day was “Human-Machine Symbiosis,” and I was asked to speak on the topic “The Disruptive Potential of Cognitive Computing.” I also participated in a panel discussion on the topic “Cognitive Intelligence — Hype or Reality?” Symbiosis is traditionally defined as an interaction between two different organisms living in close physical association, typically to the advantage of both, or, more loosely, a mutually beneficial relationship between different people or groups. For the purposes of the conference, of course, symbiosis had to be defined slightly differently, with references to living organisms broadened to include machines. In preparation for the conference, I jotted down a few thoughts to help me put the topic of human-machine symbiosis in perspective.
History
Human advancement has always been the product of technological development. A recent study concluded that language was probably developed by early humans in order to teach each other how to make tools (see my article entitled “In Education, Language and Learning Go Together”). Tools eventually evolved into ever more complicated machines, but the purpose of these increasingly complicated gadgets remained constant — to make difficult or unpleasant tasks easier to perform. As these tasks were taken over by machines, there was a fear that machines would put large numbers of people out of work. Those fears were never realized. In fact, just the opposite occurred — new technologies created more jobs than they destroyed. There are currently a number of pundits who insist that the new industrial revolution — one involving artificial intelligence and robots — is going to be different. This time around, they insist, more jobs are going to be lost than created.
There are two different discussions about human-machine symbiosis which could be held. The first discussion continues the conceptual thread noted above — namely, machines have always helped humans complete difficult or unpleasant tasks. The second conceptual thread is one that so-called trans-humanists like to follow. Their argument is that humans have a long history of augmenting their bodies using technology (e.g., wearing glasses and/or hearing aids). The natural progression of this trend, they insist, is augmenting human abilities in every possible way (e.g., exoskeletons, artificial eyes, and, eventually, implants that connect people to the Internet of Everything). Items like the Apple Watch and Google Glass demonstrate that proponents of the trans-humanist movement may be correct in their prediction. Where both threads converge is within the Internet of Everything.
Fifteen years ago there were no discussions about cloud computing. Today, most businesses are moving much of their digital activity into the cloud. Fifteen years ago there were no smartphones (they were still more than half a decade away). Today, smartphones have become the fastest-selling gadgets in history, outstripping the growth of the basic mobile phones that preceded them. Fifteen years ago there were no wearable devices. Today wearable devices are the stars of consumer electronics shows. When you consider all of these changes, there is little wonder that most experts believe we are entering a Second Industrial Revolution and human-machine cooperation is at the heart of that revolution.
Human-Machine Symbiosis and the Second Industrial Revolution
It would be foolish to argue that some jobs now being performed by humans aren’t going to be lost to technology (either hardware [e.g., robots] or software [e.g., cognitive analysis]). One of the motivations behind the advance of technology is to get machines to perform work that is drudgery for humans. That motivation will continue to drive technological progress. As noted above, new technologies have historically also been job creators, sometimes creating entirely new industries. But some analysts are asserting that the next wave of technological advancement will be different — more jobs will be lost than will be created, and this will create a societal crisis. Other analysts dismiss such claims and assert that the historical pattern will continue. Frankly, no one knows which camp is correct. We do know that the next wave of technologies will be different from previous waves in one significant way — the jobs new technologies put at risk won’t just involve unskilled or low-skilled labor. For the first time in history the jobs of skilled professionals are at risk. For the first time in history we are looking at advanced technologies that will do both the work and the thinking.
Machines that Work
When Pew Research surveyed nearly two thousand experts in the fields of AI, robotics, and economics to find out how they thought automation would impact the future, 52% thought automation would result in a brighter future and 48% thought automation would create a darker future.[1] The survey found that experts who believe there won’t be a job crisis generally argue five specific points.[2] They are:
- Argument #1: Throughout history, technology has been a job creator — not a job destroyer.
- Argument #2: Advances in technology create new jobs and industries even as they displace some of the older ones.
- Argument #3: There are certain jobs that only humans have the capacity to do.
- Argument #4: The technology will not advance enough in the next decade to substantially impact the job market.
- Argument #5: Our social, legal, and regulatory structures will minimize the impact on employment.
While experts remain divided about whether emerging technologies will destroy more jobs than they create, there is unanimous agreement that some jobs, especially those involving repetitive actions, are going to be automated. A recent study concluded that by 2030, 50% of today’s jobs will no longer exist thanks to artificial intelligence.[3] Many of those jobs will be lost to robots. The factory floor of the future will be filled with more machines than people. Thanks, however, to artificial intelligence and machine learning, some robots are being built to function alongside humans in the workplace. I believe this kind of human/machine collaboration will be one of the characteristics of the Second Industrial Revolution. There are still a lot of jobs — some of them very simple — that are easy for humans but difficult for robots. There are also a lot of jobs that still require the human touch. David Hummels, a professor of economics at Purdue University, believes humans still have a unique advantage that machines may never be able to emulate: our ability to respond to other humans. “We have evolved over 100,000 years to be exquisitely perceptive to visual and aural cues from other people around us,” Hummels states, “which is an important skill that machines may never be able to match.”[4] That’s why human/machine symbiosis is a much more likely future than one completely dominated by machines.
Machines that Think
Kevin Kelly (@kevin2kelly), founding Executive Editor of Wired magazine, recently tweeted, “In the very near future you will cognify everything in your life that is already electrified.” Kelly was looking for a way to describe the process now underway to make everything from watches to washing machines smarter. The Enterra Enterprise Cognitive System™ is a perfect example of the kind of technology involved. When we talk to clients, we note that when an organization has an analytic problem, it typically has to assemble a team of three experts:
- A business domain expert – the customer of the analysis, who can help explain the drivers behind data anomalies and outliers.
- A statistical expert – the business expert knows what they want to study; the statistical expert helps formulate the correct statistical studies and frame the data in a way that will detect the desired phenomena.
- A data expert – someone who understands where and how to pull the data from across multiple databases or data feeds.
Having three experts involved dramatically lengthens the time required to analyze, tune, re-analyze, and interpret the results. Enterra’s approach empowers the business expert by automating the statistical expert’s and data expert’s knowledge and functions, so the ideation cycle can be dramatically shortened and more insights can be auto-generated. Even some of the business expert’s logic is automated to help tune and re-analyze the data.
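To make the idea of automating the statistical expert’s role concrete, here is a minimal, purely illustrative sketch. Nothing here reflects Enterra’s actual implementation; the function name, the z-score rule, and the threshold are all assumptions of my own. It shows how anomalies in a business metric might be surfaced automatically so the business expert can go straight to interpretation:

```python
import statistics

def auto_analyze(series, threshold=2.0):
    """Flag anomalies in a metric without a statistician in the loop.

    A simple z-score rule stands in for the automated "statistical
    expert": values more than `threshold` standard deviations from
    the mean are surfaced for the business expert to interpret.
    """
    mean = statistics.mean(series)
    stdev = statistics.stdev(series)
    if stdev == 0:
        return []  # no variation, nothing to flag
    return [
        (i, x) for i, x in enumerate(series)
        if abs(x - mean) / stdev > threshold
    ]

# A hypothetical weekly sales series with one obvious spike.
weekly_sales = [102, 98, 101, 99, 103, 100, 250, 97]
print(auto_analyze(weekly_sales))  # → [(6, 250)]
```

In a real system the anomaly rule itself would be chosen automatically to fit the data; the point of the sketch is only that the statistical step can run without a human in the loop.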
We are on the cusp of an era when machines are going to be making many of the routine business decisions now being made by humans and will assist us in making many of the tough decisions as well. There is virtually no part of a business that will be immune. Ginni Rometty (@GinniRometty), the Chairman and CEO of IBM, told participants at an IBM-sponsored conference earlier this year, “In the future, every decision that mankind makes is going to be informed by a cognitive system like Watson, and our lives will be better for it.”[5] Bain analysts Michael C. Mankins and Lori Sherer (@lorisherer) write, “The best way to understand any company’s operations is to view them as a series of decisions.”[6] They explain:
“People in organizations make thousands of decisions every day. The decisions range from big, one-off strategic choices (such as where to locate the next multibillion-dollar plant) to everyday frontline decisions that add up to a lot of value over time (such as whether to suggest another purchase to a customer). In between those extremes are all the decisions that marketers, finance people, operations specialists and so on must make as they carry out their jobs week in and week out. We know from extensive research that decisions matter — a lot. Companies that make better decisions, make them faster and execute them more effectively than rivals nearly always turn in better financial performance. Not surprisingly, companies that employ advanced analytics to improve decision making and execution have the results to show for it.”
Many analysts believe that the only companies that will survive in the Industry 4.0 world will be digital enterprises. When considering transforming into a digital enterprise, companies should ask themselves which decisions can be automated and which still require human intervention when anomalies arise. At Enterra Solutions® we are so convinced that digital enterprises will dominate the business landscape that we are developing the Enterra Enterprise Cognitive System to help meet the challenge. The system uses cognitive computing technologies to help companies make sense of the oceans of data in which they swim. We predict that most routine business decisions (including supply chain decisions) will be made by cognitive computing systems. This will free time for humans to apply their expertise to solving a company’s most vexing challenges.
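The automate-versus-escalate question can be made concrete with a small, entirely hypothetical sketch (the policy, names, and thresholds are my own illustrations, not any vendor’s actual logic): a routine supply chain decision is handled by the machine, while an anomalous one is routed to a person.

```python
def decide_reorder(on_hand, forecast, reorder_point, history_max):
    """Route a routine supply chain decision automatically, escalating
    anomalies to a human expert.

    Hypothetical policy: reorder when projected stock falls below the
    reorder point, but hand off to a person when the demand forecast
    looks anomalous (here, more than double anything seen before).
    """
    if forecast > 2 * history_max:
        return "escalate_to_human"  # anomaly: outside known demand range
    if on_hand - forecast < reorder_point:
        return "auto_reorder"       # routine decision, machine handles it
    return "no_action"

print(decide_reorder(on_hand=120, forecast=80,
                     reorder_point=50, history_max=100))  # → auto_reorder
```

The design point is the escalation branch: automation handles the well-understood cases, and the anomalies are exactly where human judgment stays in the loop.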
Potential Ramifications
Companies that are considering automating jobs currently performed by humans must surely realize three things: First, the ultimate success of their business relies on global economic growth. Second, global economic growth relies on the continued emergence of a middle class that has disposable income and is willing to spend it. Third, the global middle class won’t emerge if good-paying jobs are not available because they have all been filled by robots. In other words, many companies could be cutting off their noses to spite their faces by replacing humans with robots. The question remains: Can anything be done to break this vicious circle or to mitigate its effects? There are good reasons to be concerned about how automation will affect the job market; but it is often easier to see the downside of a situation than the upside. Because we don’t know what jobs will be created, and most of them will have no equivalent in today’s workforce, we can’t intelligently speak about them. If the global economy is going to continue to grow, companies and governments are going to have to figure out how to ensure that the future of human-machine symbiosis is, in fact, a mutually beneficial arrangement.
Footnotes
[1] Alex Hern, “Will robots take our jobs? Experts can’t decide,” The Guardian, 6 August 2014.
[2] Aaron Smith and Janna Anderson, “AI, Robotics, and the Future of Jobs,” Pew Research Center, 6 August 2014.
[3] Chen Chia-Lun, “50% of jobs to vanish in 15 years: research,” Want China Times, 15 November 2014.
[4] Steve Talley, “You Won’t be Replaced by a Robot,” Scientific Computing, 8 September 2014.
[5] Lauren F. Friedman, “The CEO of IBM just made a jaw-dropping prediction about the future of artificial intelligence,” Business Insider, 15 May 2015.
[6] Michael C. Mankins and Lori Sherer, “Creating value through advanced analytics,” Bain Brief, 11 February 2015.