AI’s economic impact

The effect AI will have on the economy is massive. Such was the conclusion of a 2017 PwC report, Sizing the prize: What’s the real value of AI for your business and how can you capitalise?[i] The report predicts that AI’s contribution to the global economy will reach $15.7 trillion in 2030. Consider this: that amount exceeds the current output of China and India combined. Such gargantuan growth will not come merely from increased productivity, either. PwC sees less than half of it – $6.6 trillion – coming in the form of productivity gains.[ii] This increased productivity, the report says, will come in the short term from automating routine tasks and from augmented intelligence systems that help employees work more efficiently. Freed from routine tasks, employees can shift their focus to higher value-adding work, which will fuel the more significant long-term growth. We’ll examine this skill set shift and what it will mean for both workers and managers in coming chapters. Before we do, though, it’s important to understand the context for that shift. The other $9.1 trillion in growth, PwC’s report says, will come from the higher-value work made possible by the initial gains in process efficiency. Once employees’ responsibilities shift from routine tasks to more profit-oriented ones, that shift should spur increased consumption.[iii] This prediction takes far more into account than most assessments of AI’s impact on the economy, which go only as far as measuring the economic effects of cost-cutting through automation. The PwC assessment also calculates AI’s potential for enabling companies to redirect the resources freed up by automation toward developing new offerings and penetrating new markets. That is why the predicted growth from the workforce skill set shifts that AI will enable is nearly 40% greater than the predicted growth from increased efficiencies.
This reallocation of resources should enable companies to produce higher-quality offerings and more personalization options for consumers. That should bring down costs and increase demand by making what are now expensive goods available at prices more people can afford.
How technology drives commoditization

We can get an idea of what this should look like by considering the technology disruption that shifted cloth production from craftsmen to machines in the First Industrial Revolution. As tasks in the weaving process were increasingly automated, workers did what the machines could not, such as moving raw materials into place and tending the machines. As machines improved to handle more tasks, workers moved from tending one machine to tending several. This greatly increased productivity. The output of finished cloth grew fiftyfold, and the labor required to produce a yard of material dropped by 98% – two figures that tell the same story, since a 98% reduction in labor per yard means each worker produces roughly fifty times as much. As a result, the cost to consumers plummeted. As costs decreased, more people could afford not only more cloth but also more finished clothing, and demand for both skyrocketed. What had previously been scarce, expensive goods became commodities.[iv] Nor is this scenario limited to the early 1800s. It has repeated itself more recently with personal computers. As the technology behind them matured, the capabilities of personal computing products skyrocketed while prices dropped dramatically, to the point where devices with capabilities beyond the imagination of most people 40 years ago are affordable to most people today – and are so deeply incorporated into daily life that most people would not dream of being without them. We can expect the same type of commoditization to come to many goods that are out of reach of many people today, as AI is incorporated into more processes. Greater efficiencies should lead to lower prices that allow more products and services to become commoditized, stoking demand and expanding industries. And that’s just on the supply/demand end of the equation. AI is also likely to free consumers’ time from routine activities, such as driving themselves to work.
This will give them more time and motivation to engage in activities that produce more data touchpoints that AI-driven systems can analyze to develop products that suit customers’ desires even more precisely. Even the ability to customize products to consumers’ exact specifications could become commoditized, as AI enables manufacturing to move increasingly from a mass-production business model to an on-demand model. Here again, consumption would increase, raising demand and spurring expansion of industries along with it.[v]
Developing competitive advantages through AI

With improved ability to tap into consumer preferences, AI front-runners will be better able to tailor their products to consumer demands and capture more of the market.[vi] Early adopters will be able to build a competitive advantage that late adopters cannot overcome, much like what happened between Netflix and Blockbuster. In 2000, Netflix, still struggling to find profitability in its online DVD-rental business, offered to sell a 49% stake to Blockbuster for $50 million. The deal would have rebranded Netflix under the Blockbuster name and positioned Blockbuster as the major force in the emerging technology of online video distribution. Blockbuster’s leadership, however, saw no future in the model and turned down Netflix’s offer. Four short years later, Blockbuster’s leaders realized that online distribution was the future of video and sought to launch their own service against the now well-established Netflix. But Blockbuster had entered the space too late to gain market share. Its service failed; consumers had moved on to Netflix’s new business model, and Blockbuster, too slow to adapt, soon went bankrupt. Netflix and Blockbuster are not the only example of this scenario. It has played out in new business models for producing and distributing books and music, and in many other sectors. It will undoubtedly play out repeatedly between early and late adopters of AI in the coming decade. Early adopters, using AI’s ability to analyze customer preferences and predict trends, will build unassailable competitive advantages over late adopters, who will ultimately fall by the wayside.
Commoditized AI

We can also expect AI itself to become commoditized as the technology matures, much as personal computing did. It may be hard to picture right now, but all indicators point in that direction. Machine learning is already at the core of the predictive technologies behind Netflix, Amazon and many other consumer-facing services. As the technology expands into more and more mainstream uses, it will see the same climb in capability coupled with the same descent in cost. Supply will increase as more businesses find ways to package AI into products or services that meet a growing demand for newer, better and cheaper ways to apply it to business and consumer needs. And when a high level of supply meets a high level of demand, you have a commodity. The PwC report suggests: “While investment in AI may seem expensive now, PwC subject matter specialists anticipate that the costs will decline over the next ten years as the software becomes more commoditised. Eventually, we’ll move towards a free (or ‘freemium’) model for simple activities, and a premium model for business-differentiating services. While the enabling technology is likely to be increasingly commoditised, the supply of data and how it’s used are set to become the primary asset.”[vii] That means the focus will shift away from the technology itself and toward finding new and innovative ways to use it. Just as those who peered into the future during the formative years of personal computing could never have imagined all the doors that technology would open, we stand in a similar position with AI today. The PwC report gives a glimpse[viii] into that future: “An automotive company developed a dynamic agent-based model to simulate thousands of strategic scenarios for entering the ridesharing market.
The model allowed key decision makers to test a variety of policy configurations in a virtual, risk-free simulated environment to help them understand the ultimate impact on market share and revenue over time, before actually making any decisions. Flight simulators allow pilots to test the impact of their decisions in a virtual environment to better prepare them for making decisions in flight, so why shouldn’t business executives do the same?” Such simulators are likely to become commonplace in businesses over the next decade, as are AI solutions that optimize supply chains, self-correct manufacturing processes or aid physicians in diagnosis. The PwC report and its more detailed sector-specific companion reports contain an AI Impact Index that rates AI solutions’ potential to affect selected sectors, both by increasing efficiencies and by driving increased consumption through enhanced quality and personalization.[ix] Using this analysis, they identified nearly 300 potential new use cases for AI in those sectors, along with how close the technology is to delivering each solution and what barriers remain to be overcome. In most cases, the solutions described are already fairly close to feasibility, which underscores how difficult it is to see very far into AI’s future. It also underscores perhaps the most crucial element driving that future – the one thing AI cannot advance without: human creativity.
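To make the idea of such a business simulator concrete, here is a minimal agent-based sketch of a market-entry model. Everything in it is hypothetical and invented for illustration – the agent behaviors, parameters and prices bear no relation to PwC’s actual model – but it shows the core pattern: replay the same simulated market under different policies and compare the outcomes before committing to any of them.

```python
import random

def simulate(entrant_price, incumbent_price=10.0, n_agents=1000,
             periods=50, seed=42):
    """Return the entrant's final market share under one pricing policy."""
    rng = random.Random(seed)
    # Each consumer agent has a price sensitivity and a loyalty to the incumbent.
    agents = [[rng.uniform(0.5, 1.5), rng.uniform(0.0, 3.0)]
              for _ in range(n_agents)]
    users = 0
    for _ in range(periods):
        users = 0
        for agent in agents:
            sensitivity, loyalty = agent
            # An agent switches when the price advantage outweighs its loyalty.
            if sensitivity * (incumbent_price - entrant_price) > loyalty:
                users += 1
                agent[1] = max(0.0, loyalty - 0.1)  # familiarity erodes loyalty
    return users / n_agents

# Test pricing policies in a risk-free virtual market before deciding:
for price in (9.0, 7.0, 5.0):
    print(f"entrant price {price}: final market share {simulate(price):.0%}")
```

Because the simulation is cheap to rerun, a decision maker can sweep whole ranges of policies – here, only price – and see how market share responds before any real money is at stake.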
What AI lacks – creativity

For all the benefits it offers in expanding certain aspects of humankind’s ability to store and process data, AI falls short when it comes to creativity. It needs humans who can develop innovative ways to use it more effectively. Granted, much has been made of AI “creativity.” Researchers have developed systems of multiple neural nets programmed to recognize what is aesthetically pleasing to humans. They then set those nets to work together as artist and critic, one net iteratively creating artworks and the other judging them, until the system produced original works of art, and even new styles of art,[x] that humans could not distinguish from works by other humans. Still, those neural nets did not spontaneously develop an artistic flair. They were programmed for the single purpose of assessing human art, learning the aesthetic principles behind it and mindlessly following those principles to create works that obey them without recreating any specific human work or style. The true creativity in these experiments lay in the humans who developed the programming that gave those nets the appearance of creative ability. The American Psychological Association[xi] has long studied creativity, yet the specifics of how it works remain a mystery. Studies going back nearly a century have debunked the idea that it is merely a function of applying intelligence to a situation: “In the 1920s, psychologist Lewis Terman, PhD, began looking at the relationship between intelligence and creativity. In a longitudinal sample of intelligent children, not all ended up developing their creative abilities, he found. That’s when psychologists started to realize more than intelligence was required – also critical is having an ability to see things from a different perspective, Simonton says.” Being able to see things from a different perspective is what AI lacks.
Even with machine learning, it sees things only from the perspective programmed into it. Here the old computing adage “garbage in, garbage out” works in reverse: creativity in the programming produces the appearance of creativity in the AI’s output. So how does this affect AI’s future? Very simply, AI is reliant on human creativity. It cannot make the creative leaps that humans do. It can reach only the outcomes that humans have programmed or taught it to reach.
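The artist-and-critic arrangement can be sketched as a toy loop. This is a deliberately simplified stand-in, not the researchers’ actual system: the “aesthetic” here is just a target vector the programmer chose, playing the role of the principles the real critic nets learned from human art. The artist produces outputs no human drew, yet it can only ever land where the programmed critic points.

```python
import random

TARGET = [0.2, 0.8, 0.5]  # the human-defined notion of "pleasing" (hypothetical)

def critic(work):
    """Score a candidate: higher means closer to the programmed aesthetic."""
    return -sum((w - t) ** 2 for w, t in zip(work, TARGET))

def artist(rounds=2000, seed=0):
    """Propose random variations, keeping whichever the critic scores higher."""
    rng = random.Random(seed)
    work = [rng.random() for _ in TARGET]
    for _ in range(rounds):
        candidate = [w + rng.gauss(0, 0.05) for w in work]
        if critic(candidate) > critic(work):
            work = candidate  # the "creative" step is pure hill-climbing
    return work

final = artist()
# Original output, but it converges toward the programmed target, nowhere else.
print([round(v, 2) for v in final])
```

The loop never steps outside the objective it was given; change `TARGET` and the “art” changes accordingly, which is precisely the point about where the creativity actually resides.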
The connection between human creativity and new technologies

Human creativity has driven all past technology disruptions, and it is necessary for the current one. When people respond to the gaps that new technologies create, growth occurs, both in developing new industries and jobs and in helping disrupted organizations emerge from the pack and thrive under new conditions. Take, for example, the gaps that human creativity has identified and filled over the years in the dissemination of information. In medieval times, information was disseminated through books, produced by having dozens of individuals painstakingly copy a manuscript, each person producing one copy. Human creativity, though, found a more efficient way to copy books: the movable-type printing press. And that was only the beginning. Originally, the new technology was limited to its inventor, but others recognized its potential and copied the process. This not only created demand for more affordable books but also created the need for some people to build presses and others to operate them. New professions emerged, book production multiplied, prices fell, and still more books could be sold. Other people saw the bottleneck that formed when printers were the only source of books for those who wanted to buy them. Not everyone lived close to a printer, and printers were too busy keeping up with demand to get printed materials to everyone who wanted them. Those who noticed the gap jumped into it, and the occupation of bookseller was born. At each step, it was human creativity that recognized the emerging gaps and found ways to fill them. One printing press led to new market conditions and multiple new occupations. This application of creativity has continued through the centuries, spawning new types of reading materials, such as newspapers and periodicals, and new gaps to fill.
In subsequent centuries, others recognized the potential for promoting businesses in the print materials people commonly read, and the print advertising industry was born. Move forward to the introduction of computers and the internet, and you see yet more people using creativity to identify never-before-imagined gaps to fill. They spurred the development of many digital and cyber-related professions devoted to information dissemination, and an explosion of devices whose manufacture employs hundreds of thousands of people. The successive waves of new needs that these technologies created have completely transformed an information-distribution task that once employed probably no more than a few hundred copyists worldwide, producing a minuscule number of books per year. Information dissemination today comprises myriad industries and professions that employ tens of millions of people in producing a body of information that grows exponentially each year. And all of it came from human creativity recognizing gaps that could be filled and filling them. Could AI have anticipated the gaps that led to new industries and occupations the way humans did? Certainly – but only if human creativity had first recognized the kinds of gaps that could develop and had programmed the AI system to look for them. What about the example mentioned earlier of the automotive company whose AI system simulated thousands of possible market conditions so executives could test various policies and decisions and find the best strategy? Isn’t that AI showing creativity and making decisions? Again, that system is extremely valuable: it processes far more data than the best human analyst could in the same amount of time. Yet it is only as good as the data programmed into it.
Recall the previous chapter’s example of the AI system that erroneously concluded that asthma patients diagnosed with pneumonia should not be prioritized for hospitalization. That conclusion was accurate according to the data the system had been fed, but the humans who programmed it had erred in omitting one piece of data so second-nature to them that no one thought to include it. The ultimate benefit of AI, then, will be realized only through the application of human qualities that AI cannot duplicate. Human creativity and insight will be needed for AI to achieve its full potential.[xii]
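A toy calculation shows how that omission misleads a model. The numbers below are invented for illustration: asthma patients with pneumonia were routinely given intensive care, so the recorded outcomes show them dying less often – but the intensive care that produced those outcomes was never recorded as a feature the model could see.

```python
# Hypothetical patient records: (has_asthma, got_intensive_care, died)
records = (
    [(True, True, False)] * 95 + [(True, True, True)] * 5        # asthma -> ICU
    + [(False, False, False)] * 80 + [(False, False, True)] * 20  # no asthma -> ward
)

def mortality(group):
    """Fraction of a patient group that died."""
    return sum(1 for _, _, died in group if died) / len(group)

asthma = [r for r in records if r[0]]
no_asthma = [r for r in records if not r[0]]

# A model shown only (has_asthma, died) concludes asthma means LOWER risk --
# accurate for this data, dangerously wrong as a triage rule, because the
# hidden treatment column is what actually drove the outcomes.
print(f"asthma mortality:    {mortality(asthma):.0%}")
print(f"no-asthma mortality: {mortality(no_asthma):.0%}")
```

The model’s arithmetic is flawless; the failure is entirely in the human decision about what data to include, which is the point of the example.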
Takeaways

AI is likely to spread across more and more industries. The efficiencies it brings will likely result in greater commoditization, making many goods, services and analyses far more accessible to far more businesses and consumers. As the technology matures, AI itself will likely become commoditized, every bit as ingrained in the business environment as the internet now is. Yet it will remain dependent on the humans who create it. For all its processing power, it shows no sign of developing the ability to evolve beyond what is programmed into it. The PwC report[xiii] points out: “It’s important to prepare for a hybrid workforce in which AI and human beings work side-by-side. The challenge for your business isn’t just ensuring you have the right systems in place, but judging what role your people will play in this new model. People will need to be responsible for determining the strategic application of AI and providing challenge and oversight to decisions.” It is in these human decisions about what role people will play, and in effectively preparing them to play that role, that the real challenge lies: the challenge that will determine how smooth the transition into an AI-enabled future will be. We’ll be looking at that challenge over the rest of this book.
[i] Sizing the prize: What’s the real value of AI for your business and how can you capitalise?, PwC, 2017. Available: https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html
[ii] Sizing the prize, p. 3.
[iii] Sizing the prize, p. 3.
[iv] Michael Morganstern, Automation and anxiety, The Economist, 2016. Available: http://www.economist.com/news/special-report/21700758-will-smarter-machines-cause-mass-unemployment-automation-and-anxiety
[v] Sizing the prize, p. 6.
[vi] Sizing the prize, p. 6.
[vii] Sizing the prize, p. 21.
[viii] Sizing the prize, p. 19.
[ix] Sizing the prize, p. 10.
[x] Chris Baraniuk, Artificially intelligent painters invent new styles of art, New Scientist, 2017. Available: https://www.newscientist.com/article/2139184-artificially-intelligent-painters-invent-new-styles-of-art/
[xi] Karen Kersting, What exactly is creativity? Psychologists continue their quest to better understand creativity, American Psychological Association, 2003. Available: http://www.apa.org/monitor/nov03/creativity.aspx
[xii] Sizing the prize, p. 21.
[xiii] Sizing the prize, p. 21.