November 17, 2021

Will GPT-3 move language AI into the business mainstream?


AI has spent years at the top of the list of technologies that executive leaders say will have a significant impact on how they do business. Yet AI — which is more a category than a technology, per se — has yet to live up to its promise in most areas of business.

Marketing may be the exception. About half of marketing respondents to the Persado 2021 AI in Creative Survey reported using AI to improve marketing performance, and 67% of those respondents saw a revenue uplift. About 40% are using AI to generate content and creative.

One reason AI isn’t used more broadly to generate business content is that building language models takes a long time, huge volumes of data, and many talented data scientists and software engineers. That helps explain why the world of AI and marketing got so excited about the launch of GPT-3, the latest iteration of OpenAI’s AI model trained to read and write text.

But what does GPT-3 mean for the use of AI in creative broadly, and for Persado in particular? To answer that question, Persado offers an overview of what GPT-3 is and how it fits into our world of AI content generation and decisioning.

What is GPT-3?

OpenAI’s “Generative Pre-trained Transformer” (or GPT) models are AI models trained on Internet text to produce language. Transformers are a type of neural network, which is itself a subset of machine learning; neural networks are designed to mimic the thought patterns of the human brain. What distinguishes the transformer architecture is that it evaluates words as sequences in a sentence, weighing how each word relates to the words around it. As a result, transformer-based language models are very good at taking a small amount of text and then predicting the logical words or sentences that follow.
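
To make that concrete, here is a minimal sketch of next-word prediction with a transformer language model. It uses the open GPT-2 model, a smaller predecessor of GPT-3, via the Hugging Face transformers library, since GPT-3 itself is only available through OpenAI’s hosted API; the prompt and settings are purely illustrative.

```python
# A minimal sketch of transformer-style next-word prediction. GPT-2 (an open,
# smaller predecessor of GPT-3) stands in here purely for illustration, since
# GPT-3 itself is only reachable through OpenAI's hosted API.
from transformers import pipeline

# Load a small, publicly available transformer language model.
generator = pipeline("text-generation", model="gpt2")

# Give the model a short piece of text; it predicts a plausible continuation.
prompt = "Our spring sale starts Friday, and every order over $50"
completions = generator(
    prompt,
    max_length=40,           # cap the total length of prompt + continuation
    num_return_sequences=3,  # ask for several candidate continuations
    do_sample=True,          # sample rather than always taking the top word
)

for candidate in completions:
    print(candidate["generated_text"])
```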

GPT-3 specifically has 175 billion machine learning parameters, roughly ten times more than Microsoft’s Turing NLG, which for a short time held the record as the largest language model. Couple that parameter count with the size of its training data set (a huge swath of the Internet) and GPT-3 has powerful predictive capacity that enables it to generate content at scale.

“Even out of the box, the results are impressive,” says Panagiotis Angelopoulos, Persado’s Chief Data Scientist. “Usually these models require some fine-tuning, but you can get some usable results even without that. It’s definitely going to be a revolution in how companies generate content, but it’s not going to completely replace the human element.”

Why not? Because GPT-3’s strength is generating language that it predicts is the logical continuation of a given sample. Its strength is volume, not quality: the platform is not context- or audience-aware, and thus will not (and cannot) adapt the text to the reader or the medium on its own.

How can companies use GPT-3?

OpenAI released the model in beta, which led to a wave of “AI wrote this” articles, poems, short stories, and other demonstrations in major publications and blogs of what GPT-3 could do. General musing on the benefits of GPT-3 for content marketing and SEO optimization quickly followed. Early adopters also began applying GPT-3 to write software code, another form of language, and a use case expected to grow as AI takes on the rote, mundane code so that human developers can focus on the high-value creative work.

Beginning October 1, 2020, OpenAI started licensing access to the GPT-3 API to third parties. Around the same time, Microsoft, which had invested $1 billion in OpenAI in 2019, secured exclusive rights to the underlying GPT-3 code. Five months later, by March 2021, 300 apps were using GPT-3, according to OpenAI, generating a collective 4.5 billion words a day across a range of use cases. The tool has undoubtedly made it much easier for both technology providers and end-user companies to build basic language AI apps in-house.
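
For illustration, here is a minimal sketch of what a third-party call to the GPT-3 API looked like with the openai Python client of that era; the prompt, engine choice, and parameter values are assumptions for demonstration, not a recommended configuration.

```python
# Illustrative sketch of a GPT-3 API call using the openai Python client as it
# existed around 2020-2021. Prompt, engine, and parameters are assumptions for
# demonstration only.
import openai

openai.api_key = "YOUR_API_KEY"  # granted once OpenAI approves API access

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine available at the time
    prompt="Write a short subject line for a fall jacket promotion:",
    max_tokens=20,      # keep completions short
    temperature=0.7,    # higher values produce more varied text
    n=3,                # request several candidates to choose from
)

# Each candidate completion comes back as a "choice"; a human still has to
# decide which one, if any, to use.
for choice in response["choices"]:
    print(choice["text"].strip())
```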

The pros and cons of an open, language AI model

From applications that can sort through customer feedback to identify sentiment trends, to responsive experiences for gaming platforms, there are scores of creative applications coming online using the GPT-3 language AI at their core. Both startups and enterprises are going live today with much faster development times than a new company could achieve if it started by building a proprietary model. That’s one of the genuine benefits of a model like GPT-3 — it encourages experimentation.

But there are also negatives to using GPT-3. One major concern is that it doesn’t “learn”. OpenAI trained the model using Internet data. It can mimic some aspects of the sample text a user inputs as a base. But GPT-3 does not take into account what a reader wants to engage with, nor does it undergo continuous retraining. It merely predicts what comes next in a line of text based on what came before.

In practice, that means that an app using GPT-3 doesn’t improve without manual intervention. Observers have also called out GPT-3 for how quickly its predictive model defaults to text rife with bias and extremist speech. Humans, after all, use that kind of language with each other on the Internet, which is where GPT-3 was trained.

GPT-3’s relationship with Microsoft also raises questions. OpenAI says that developers will continue to have access to GPT-3 through an API even as Microsoft enjoys exclusive access to the underlying code. But how long that will last, and how that distinction affects the value of the model for developers, is unclear.

The bottom line on GPT-3 and how it compares to Persado for generating business language

For language AI to become integrated into the core of an organization, businesses need models that make it easy to experiment and learn. That’s how technologies “cross the chasm” from niche innovations to widely used assets that produce business results. To the extent that GPT-3 helps brands learn about the potential of language AI, it’s a great development.

What it is not, however, is an out-of-the-box solution to the challenge of generating personalized content for customers at scale. A neural network that cannot learn from continuous interactions with a brand’s customers creates risk in the form of insensitive content that is ignorant of context.

Persado’s content generation and decisioning AI does not have those disadvantages. Persado relies on a proprietary language model that delivers on-brand, in-context content to customers every time. Our language AI adjusts to reflect a brand’s specific tone and voice, is mindful of the context, and continuously learns as customers respond (or don’t) to Persado-generated messages.

More than that, the Persado platform remembers the results of every experiment. It knows what works to engage a particular company’s core audience and also why. Persado can share those insights with creative teams at the beginning of campaign creation to improve results.

GPT-3, in contrast, can help enterprises craft a large volume of content, but humans ultimately have to decide which pieces of content to use in campaigns.

“GPT-3 is a generic model,” confirms Angelopoulos, Persado’s Chief Data Scientist. “It does a lot of things well, but it doesn’t do anything great. Our platform is specifically designed for communication for the enterprise. The content that we generate is not only on-context and on-brand, but has another dimension which GPT-3 is completely unaware of — performance. GPT-3 has no idea how the content it generates performs.”

In sum, GPT-3 is a great development for the broader language AI market. It will allow more companies to leverage language models for a range of business needs. Persado, meanwhile, will continue to improve and grow our proprietary AI language generation and decisioning platform to engage enterprise stakeholders with the highest-performing, personalized, continuously improving, and context-aware communications available.
