Technological progress has long affected the creative process and many advancements have made the industry a much more diverse and interesting place – as well as giving birth to completely new markets and categories. But could technology replace certain creative functions altogether?
Each new era’s major technological development has brought fundamental change from the era before, and a raft of predictions too. But there was one constant: while the media might be changing, most key elements of the creative process remained the same.
And they always involved humans.
Writers, designers, musicians, broadcasters, models, actors, videographers, animators, developers – you name it, there was a human element to every part of the journey. Now, with artificial intelligence (AI), could the creative process survive without human input?
As soon as anyone talks about AI and machine learning, it’s not long before the Terminator franchise gets mentioned – after all, it’s been giving sentient machines a bad name for over 35 years. The conclusion is always the same: the moment computers or machines can act on their own, they will take over and destroy all humanity.
The reality is the Roomba.
It’s similar to the Terminator in that it’s unstoppable and it can move around anything you place in its path – except it does the vacuuming for you. AI is easily accepted when it’s introduced seamlessly or has an instant benefit to the user. In the Roomba’s case, it completes a menial task for you and gives people more time to do more worthwhile things.
When this philosophy is applied to the creative process, it’s no wonder AI systems have already been used to augment or speed up creative work for decades. For example, image searching even 10 years ago was a laborious task. 10 years before that, every creative studio had stacks of stock photography books that interns had to look through to find images for digital projects or ad campaigns. Now algorithms with image recognition technology allow you to search via keyword, ‘similar to’, ‘containing colour’, ‘related images’ and so on, offering a range of options in a fraction of a second. AI not only helps you find images, it suggests alternatives too, which brings significant time and cost savings.
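To make the ‘similar to’ idea concrete, here is a minimal sketch of similarity search over image embeddings. The filenames and three-dimensional vectors below are invented for illustration – real systems use embeddings with hundreds of dimensions produced by an image recognition model – but the ranking step is the same:

```python
import math

def cosine_similarity(a, b):
    """How alike two embedding vectors are, from -1 (opposite) to 1 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings a vision model might assign to a stock library.
library = {
    "beach_sunset.jpg":   [0.90, 0.10, 0.30],
    "office_meeting.jpg": [0.10, 0.80, 0.20],
    "tropical_coast.jpg": [0.85, 0.15, 0.35],
}

def similar_to(query_name, top_n=2):
    """Return library images ranked by similarity to the query image."""
    query = library[query_name]
    scores = {name: cosine_similarity(query, vec)
              for name, vec in library.items() if name != query_name}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Here `similar_to("beach_sunset.jpg")` ranks the tropical coast shot first, because its vector points in nearly the same direction as the sunset’s.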
AI’s biggest influence in the creative realm has probably been within predictive and data-driven marketing. By taking the guesswork – and human element – out of where, when and to whom online ads are served, it has increased open rates, click-throughs and sales. Of course, these customer behaviour patterns are tracked and predicted by complex data engines and are continuously refined as a result. It’s this trackable data that’s added significant value and insight to every stage of the predictive marketing process. After all, if you can say with confidence what an ROI will be, it’s much easier to get buy-in and sign-off from senior management to start a project or campaign.
AI can already suggest basic ad templates. IBM’s Watson has shown how sentiment can be measured from headlines or email subject lines and plotted against open rates. Google has experimented with AI by letting its supercomputers caption images with impressive accuracy. In theory, AI can already create the bedrock of any advertising or creative campaign.
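As a rough illustration of scoring sentiment from subject lines – a toy word-list approach, nothing like the NLP models Watson actually uses, with lexicons and open rates invented for the sketch – one could pair each line’s score with its open rate like this:

```python
import string

# Tiny illustrative sentiment lexicons (invented for this sketch).
POSITIVE = {"free", "exclusive", "love", "win", "new"}
NEGATIVE = {"urgent", "last", "warning", "miss"}

def sentiment(subject_line):
    """Score a subject line: +1 per positive word, -1 per negative word."""
    words = [w.strip(string.punctuation) for w in subject_line.lower().split()]
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

# Hypothetical campaign data pairing each subject line with its open rate.
campaigns = [
    ("Win an exclusive new prize", 0.31),
    ("Last warning: don't miss out", 0.12),
]
scored = [(sentiment(line), rate) for line, rate in campaigns]
```

With enough real campaigns, plotting `scored` would reveal whether sentiment and open rate actually correlate for a given audience.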
So what can AI actually create? Well, the answer is pretty much anything we can.
‘Sunspring’ is a sci-fi movie written entirely by an artificial intelligence bot using neural networks. It was trained to ‘write’ the script by processing dozens of other sci-fi screenplays from the 1980s and 90s, and although it was only 9 minutes long, it showed how with relatively little new input, a screenplay of sorts could emerge that hadn’t been created by human hand.
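Sunspring’s script was generated by a neural network, but the learn-from-a-corpus idea can be sketched far more simply with a Markov chain, which predicts each next word from the one before it. The two training lines below are invented, not from real screenplays:

```python
import random

def build_chain(corpus):
    """Map each word to the list of words that followed it in the corpus."""
    chain = {}
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain, start, length=8):
    """Walk the chain from a start word, picking a random follower each step."""
    out = [start]
    for _ in range(length):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = ("he looks at the stars and the ship hums "
          "she looks at the console and the ship waits")
chain = build_chain(corpus)
```

Calling `generate(chain, "he")` produces screenplay-ish fragments stitched entirely from patterns in the training text – the same principle, in miniature, as training on dozens of sci-fi scripts.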
Drum machines long ago replaced bulkier, costlier and moodier alternatives, and there have been many one-off songs with either vocals or music created by AI. A record called I AM AI was the first full-length album entirely composed and produced by artificial intelligence. Released last year, it was made in collaboration with a human artist, who provided inputs that AMPER – the artificially intelligent music composer, producer and performer – used as composing parameters to create whatever it saw fit.
The key element here is collaboration. Humans are guiding the machines. It’s less replacement and more helping to augment what’s being created. So what if it happens the other way around?
Phrasee is a UK start-up that uses artificial intelligence, natural language generation and deep learning technology to produce creative marketing copy for brands including Domino’s, Virgin and Superdry. At the moment, it’s mainly used to generate copy for email subject lines, Instagram ads and push notifications on smartphones, but it’s easy to see how this could be expanded and adapted in the future to include almost any piece of collateral that requires copywriting. As this writer quietly weeps into his keyboard, the automation of copywriting, especially for sales-focused and automated channels, is here to stay, and the results seem very positive so far. As usual, there’s always the next stage.
And that’s when humans take a step back from the whole process.
Generative Adversarial Networks (GANs) are a class of AI algorithms used in unsupervised machine learning. One network generates options, while the other evaluates them. Imagine you need to create a fake token to get into a theme park. Your only aim is to get in; the security team’s only aim is to keep people without tokens out. So you need to create a fake token convincing enough to pass. The only issue is, you’ve never seen what the real token looks like. Luckily, you have a friend who’s willing to try to get past security with your first attempt at creating this fake.
On attempt one, needless to say, your friend gets sent back.
But now they have more information about what the token could look like. The person at the gate laughed and said, “That’s the worst fake token I’ve ever seen. It’s not even gold.” Aha! So now you create a token that’s gold. The next time: “It’s not even hexagon-shaped.” Bingo! So now you create a token that’s gold and hexagon-shaped. This back and forth is slowly refined over many attempts until you not only have the right colour and shape, but enough of the correct detail on the token to convince the gatekeeper that it’s real. This is essentially how GANs work: one network refines its output again and again until the other network stops rejecting it. Repeat this many times over and you end up with complete outputs of whatever you choose to create.
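The gatekeeper analogy can be turned into a toy loop. To be clear, this is not a real GAN – actual GANs train two neural networks against each other using gradients – but it mirrors the refine-until-accepted dynamic, with made-up token attributes:

```python
import random

random.seed(42)  # make the run reproducible

# The "real" token the gatekeeper knows (invented attributes for the sketch).
REAL_TOKEN = {"colour": "gold", "shape": "hexagon", "stamp": "crown"}
OPTIONS = {"colour": ["red", "gold", "blue"],
           "shape": ["circle", "hexagon"],
           "stamp": ["star", "crown"]}

def discriminator(fake):
    """Return the first attribute that gives the fake away, or None if it passes."""
    for attr, value in REAL_TOKEN.items():
        if fake.get(attr) != value:
            return attr
    return None

def generator_refine(fake, leaked_attr):
    """Retry only the attribute the gatekeeper complained about."""
    fake[leaked_attr] = random.choice(OPTIONS[leaked_attr])
    return fake

fake = {}
flaw = discriminator(fake)
while flaw is not None:          # keep forging until nothing is rejected
    fake = generator_refine(fake, flaw)
    flaw = discriminator(fake)
```

When the loop exits, `fake` matches the real token exactly – the generator has learned what ‘real’ looks like purely from the discriminator’s rejections.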
The Cannes Lions winners in 2018 are a solid representation of how creatives are using technology to create new ways to communicate with audiences. They included using AI and Speech Synthesis Markup Language (SSML) to enable JFK to deliver the speech he was set to give the day he was killed in Dallas in 1963. This involved analysing over 800 speeches previously given by Kennedy, creating a database and then by using a mixture of AI and SSML, the words he was due to deliver could be heard for the first time in JFK’s own voice.
There was even a winner that leveraged old technology to access new. With most people in the world still unable to access smartphones or online services due to financial or logistical reasons, the Colombian government set up a landline number 9000973 (Gooogle) that worked in conjunction with Google’s Voice Assistant to get access to online information. People could use their landline to call this local number, ask what they needed to know, and the Google Assistant spoke it back to them. A novel way to mix old and new.
Vue.ai is an end-to-end ‘intelligent retail automation’ platform, and among its range of services it will soon offer AI-generated models you can tailor for your market, including the ability to change skin tones and body shapes. The software will even let you map whole clothing inventories onto an AI model’s body. Naturally, this changes how you would think about any potential casting or photoshoots. Could you concentrate on creating product and lifestyle videos, now that the heavy lifting of a full product-range shoot can be virtually shot and market-ready in less time and at potentially lower cost?
In theory, if the more mundane aspects of creative work can be ‘outsourced’ to AI, then it leaves creative teams to concentrate on more complex matters. And it’s these elements of automation for the most basic or simplistic outputs that can free people up to be more creative and deliver more compelling work.
When any new medium comes along, it’s a great opportunity to think about the creative process in new ways. Although the development process for a voice skill is different from creating a graphical UI, it has parallels to more traditional scriptwriting. Our Channel 4 Human Test is a great example of this. Leveraging smart speakers and voice assistant technology, we created a Turing-style test to promote season 3 of the critically acclaimed drama Humans by quizzing the show’s core fans to find out if they were human or ‘synth’. So you still start with a script but it gives people an exciting new experience as a result of the new technology.
Anticipatory predictions and output will form one of the next steps in the evolution of creative process. By merging historical and real-time data inputs, and then trusting technology to analyse, predict, and decide outputs, will we allow machines to make key creative decisions? At the moment, human and machine collaboration is still essential to creating coherent content and solutions, but in 10 years’ time who knows where developments in AI, voice or robotics will take us? Or indeed ‘them’.
So now what?
If you’re thinking about getting our human team to work on your next project, get in touch; we have a range of creative processes to ensure we help you meet – and exceed – your goals.