
AI text generation in marketing must be managed carefully

October 21, 2022

It’s been two years since OpenAI announced the arrival of GPT-3, its seminal natural language processing (NLP) application. Users were blown away by the AI language tool’s uncanny ability to generate insightful, considered and colorful prose — including poems, essays, song lyrics and even detailed manifestos — using the briefest of prompts.

GPT-3 is what’s known as a “foundation model”: it was trained by feeding it practically the entire internet, from Wikipedia to Reddit to the New York Times and everything in between, and it uses this vast dataset to predict which words are most plausible given any prompt. Because building such a model is an enormous research undertaking, only a handful of these foundation models exist. Others include Meta’s RoBERTa and Google’s BERT, along with models from startups such as AI21.
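The core idea of predicting the most plausible next word can be illustrated with a toy sketch. The bigram counter below is a drastically simplified stand-in for GPT-3’s transformer, and the tiny corpus is a hypothetical placeholder for the web-scale training data described above:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for web-scale training data (illustrative only).
corpus = "the cat sat on the mat the cat saw the dog".split()

# Count which word follows each word: a bigram model, a vastly
# simplified stand-in for a large language model.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_plausible_next(word):
    """Return the continuation seen most often after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(most_plausible_next("the"))  # prints "cat" (seen twice after "the")
```

A real foundation model replaces the frequency table with billions of learned parameters and conditions on the entire prompt rather than a single preceding word, but the objective — choosing a plausible continuation — is the same.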
