(This article originally ran in The Publishers Weekly/BookBrunch London Book Fair Show Daily.)
When we hear the term “artificial intelligence,” or “AI,” popular cultural images often come to mind: HAL from “2001: A Space Odyssey,” the Cylons in “Battlestar Galactica,” and Ultron in the most recent “Avengers” film. The man-versus-machine theme is very much in vogue, and popular movies and TV series like “Ex Machina,” “Westworld,” and “Humans” have further cemented this trend. Some of that fictional fear also bleeds into our reality, with many media articles claiming that automation or machine-learning platforms will put humans out of work.
Last year at the London Book Fair’s Quantum Conference, Nick Bostrom, philosopher and author of Superintelligence, even stated, “Artificial Intelligence could in the end replace most, if not all, of the high-level functions of an editor, but if you get to that point then the whole game changes not just for an industry but for humanity as a whole, because it will have become super-intelligent.” Though that makes for a good soundbite, in reality AI can help humans be more productive, explore new markets and customers, and grow revenue. It can take over many of the more boring, time-consuming aspects of work while freeing up space to explore new creative ideas.
In the last few years, publishers have been given access to an extraordinary amount of granular information about customers and products in the marketplace. This information tells publishers how, when, and why customers buy books, whether they complete them, and what other titles they might want to read on the same subject. This “big data” can help publishers make informed decisions on the types of products they publish, how they should sell them and what marketing they should invest in.
Unfortunately, there is just too much data for humans to process quickly and effectively. Turning big data into a big opportunity is one of the main challenges publishers are currently confronted with – a challenge where AI can help. Using machine learning, computers can ingest vast amounts of data, learn what the data means and recommend next steps, something humans find very difficult.
From chatbots on retail sites to programmatic advertising, machine learning is already a big part of daily life, yet most people have never heard of it beyond, perhaps, IBM’s Watson. IBM Watson Health is “cognitive computing” created to help humans who are dealing with cognitive overload, whether from the complexity of the data they handle or the decisions they are trying to make. IBM wants to create systems that augment human capabilities, not eliminate humans from the process entirely.
This approach of augmenting human capabilities can be used in publishing. Service providers and some in-house departments at publishers are creating new systems to process big data in the areas of bookselling, editing, rights, advertising, and learning.
For bookselling, machine learning could recommend better types of books to readers. As publishers mine their backlist for increased revenue, machine learning systems can help them better understand and access their full rights catalog in order to take advantage of current trends in the marketplace. And, for academic publishers, machine learning could offer a more accurate approach to learning by measuring a student’s understanding of concepts and tailoring a specific framework for that student’s learning. But, these are just a few areas in which machine learning can and is being explored.
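To make the bookselling example concrete, here is a minimal, purely illustrative sketch (not any publisher’s actual system) of one common recommendation technique: scoring titles by the cosine similarity of their buyer sets, so that a reader of one book is shown the backlist titles most often bought alongside it. All names and data below are hypothetical.

```python
from math import sqrt

# Hypothetical purchase data: which customers bought which titles.
purchases = {
    "alice": {"Sea Atlas", "Tide Tables", "Coastal Birds"},
    "bob":   {"Sea Atlas", "Tide Tables"},
    "carol": {"Coastal Birds", "Garden Design"},
}

def buyers(title):
    """Set of customers who bought a given title."""
    return {c for c, titles in purchases.items() if title in titles}

def similarity(a, b):
    """Cosine similarity between two titles' buyer sets."""
    ba, bb = buyers(a), buyers(b)
    if not ba or not bb:
        return 0.0
    return len(ba & bb) / sqrt(len(ba) * len(bb))

def recommend(title, k=2):
    """Top-k other titles ranked by co-purchase similarity."""
    others = {t for titles in purchases.values() for t in titles} - {title}
    return sorted(others, key=lambda t: similarity(title, t), reverse=True)[:k]

print(recommend("Sea Atlas"))  # → ['Tide Tables', 'Coastal Birds']
```

Production systems add much more signal (completion rates, subject metadata, trends), but the core idea, learning patterns from purchase data rather than hand-writing rules, is the same.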
Publishers can also take a page from similar industries that are already implementing machine learning. In the media space, for example, machine learning is already being employed on both the editorial and advertising sides of operations. It was revealed recently that the Associated Press is using machine-learning technology and natural language generation to write news stories. Meanwhile, advertisers are exploring multiple ways of tailoring messaging based on reader behaviour and collated first-party data.
While some see the rise of machines as cause for concern, another option is to see what new territories and ideas we can explore while machines process all of the information we can’t, or don’t want to, delve into. Human analysis will still be required, so our jobs are safe. For now.
By David Montgomery, CEO, Ingenta
In September 2015, David assumed the role of Ingenta’s CEO. He was previously Chief Technology Officer, responsible for driving all aspects of the company’s IT strategy, including its vision, innovation, and roadmap. In addition to defining the technical architecture and development of the company’s core products, David continues to manage their testing, rollout, and ongoing support, working in close collaboration with the company’s customers to ensure that product strategy and development are aligned with client requirements. Prior to Ingenta, David was Managing Director of Software Operations at Inspired Thinking Group (ITG), a Tech Track 100 company, where he was responsible for overseeing software hosting, application management, software development, and customer services. Before that, he held various senior positions, including Chief Innovation Officer at software company Atex, 10 years as Director of Technology at 5 Fifteen, and 9 years as Director of Technology at Anite plc (previously Autofile).