GPT-3: Deciphering substance from hype

gpt-3 Sep 10, 2020

Understanding tools rather than succumbing to mythology

A recurring theme of this newsletter is that technology is a tool we can understand and use rather than a magical force beyond our comprehension. While this applies to all technology, the theme was developed with AI in mind.

Over the past few weeks, there’s been a steady build-up of hype around a new AI tool called GPT-3, or Generative Pre-trained Transformer 3, the third iteration of a language prediction model that can be used to produce text.

What’s fueling this hype is that the generated text is hard to distinguish from text composed by a human.

However, that’s also where the hype and mythology take root: in a false and distracting comparison between human and machine.

Another theme of this newsletter is that human and machine are complementary, not contradictory. Instead of an either/or scenario, we must recognize that it is about both, about hybrids in which humans employ machines or automation with increasing sophistication and depth. Once we accept our status as cyborgs, it’s easier to move forward in ways that preserve and celebrate our humanity.

Which is partly why GPT-3 is genuinely interesting. Developed by OpenAI, an AI lab in Silicon Valley that seeks to create “general AI” that benefits humanity, GPT-3 uses deep learning to enable natural language processing at an unprecedented scale. By scale I’m referring to the neural network the software uses to recognize and generate language. At 175 billion “machine learning parameters”, the system is far larger than any previously created.
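To make “language prediction” concrete, here is a deliberately toy sketch (not GPT-3, and not anything OpenAI ships): a language model repeatedly estimates which word is likely to come next given the words so far, appends a sampled word, and continues. In GPT-3’s case those probabilities come from a neural network whose 175 billion parameters are learned weights, not a hand-written lookup table like the one below.

```python
import random

# Toy illustration only: a tiny "language model" that maps the last two words
# to a probability distribution over the next word.
toy_model = {
    ("the", "robot"): {"wrote": 0.6, "slept": 0.4},
    ("robot", "wrote"): {"an": 0.7, "quickly": 0.3},
    ("wrote", "an"): {"op-ed": 0.9, "email": 0.1},
}

def generate(prompt, steps=3):
    words = prompt.split()
    for _ in range(steps):
        context = tuple(words[-2:])      # condition on the last two words
        options = toy_model.get(context)
        if not options:                  # unseen context: stop generating
            break
        next_word = random.choices(list(options), weights=list(options.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(generate("the robot"))  # e.g. "the robot wrote an op-ed"
```

GPT-3 does the same basic thing, predict the next piece of text, but with a vastly larger context and a learned model instead of a lookup table.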

That scale translates into an advanced language capability, which is what is driving the hype machine currently surrounding this software. Unfortunately, due to concerns around misuse, it is not currently open to the public, and the waiting list has closed due to overwhelming demand.

However, the Guardian obtained access and used it to produce an op-ed that has received considerable attention (and hype):

The Guardian op-ed was an exercise in AI theatre, in which they asked the machine whether humans should fear it. This combines the mythology of AI proficiency with the accompanying existential angst. The desired effect was to make people think that AI is out to take our jobs, in this case the jobs of op-ed writers.

Although some observers noted that automated writing was perhaps emerging as a result of a poor labour market (rather than enabling one).

However, another and deeper criticism was that the Guardian was not transparent about how the op-ed was produced, and that its framing distracted from what the tool actually does. This shows how the stories we tell about how our technology works directly shape how we perceive and understand a tool (and how it can be used).

In particular, when we approach this software as a tool, we can better understand the role that humans can and should play in its (responsible) use.

This is also relevant when robots inevitably make mistakes, and we ask who is responsible for the error:

In an opinion piece written for The Guardian, an instance of the machine learning language model GPT-3 claims that the word “robots” is Greek for “slave”. (GPT-3 is a “deep learning” model created by the US artificial intelligence research laboratory OpenAI, which generates human-like text based on a short prompt.)

GPT-3’s claim is inaccurate. While the word robot is indeed derived from a term for forced labour, it does not come from Greek. The word is actually Czech in origin.

The word “robot” for an artificial being was originated by the Czech playwright Karel Čapek in his 1920 play “R.U.R.” (which is short for “Rossum’s Universal Robots”). It was based on the Czech word “robota”, which means the kind of forced labour carried out by serfs under the feudal system. The word was used to describe the human labourers created by synthetic biology in the play.

Another person noted that there were problems not just with the depiction of robots, but with how the software described humans:

Therefore it’s not just a question of who is responsible for errors, but also of who is responsible for meaning. This is why we cannot regard the machine as distinct from humans; it is rather a tool employed by them.

The Guardian op-ed was commissioned, created, and edited by humans. As with any op-ed, we can assign authorship and editorial responsibility.

As with a mirror, we can find whatever we want in our machine reflection, but to ascribe such meaning and intent to the machine is foolish and mythological.

This is also why it is important to note that GPT-3 is not software in the application sense, but in the API, or application programming interface, sense. That makes it more akin to infrastructure that powers other applications than to a standalone product. It’s not an all-encompassing machine, but rather a library that other machines can be built with.
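In practice, a developer with beta access uses GPT-3 by sending a prompt to OpenAI’s API and getting a completion back. The sketch below is a minimal illustration using the OpenAI Python client as it was documented during the beta; the API key, prompt, and settings are placeholders rather than a recommended configuration.

```python
import openai

# Placeholder: beta participants receive their own secret key from OpenAI.
openai.api_key = "YOUR_API_KEY"

# Send a prompt and ask for a completion. "davinci" was the largest GPT-3
# engine offered in the beta; max_tokens caps the length of the output.
response = openai.Completion.create(
    engine="davinci",
    prompt="Write a polite two-sentence reply declining a meeting invitation.",
    max_tokens=60,
    temperature=0.7,  # higher values make the output more varied
)

print(response.choices[0].text.strip())
```

An email assistant or a job-description generator is, at bottom, an application that wraps calls like this in a purpose-built interface.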

As a result, while posts like the Guardian op-ed gather attention, the real revolution is happening via a range of GPT-3 applications now proliferating.

Another example contributes to the ongoing automation of email:

And writing job descriptions:

You can find more examples on their beta page.

Although here’s a long list of examples that was generated in July, before the current round of hype:

In recognizing GPT-3 (and OpenAI more generally) as just a tool rather than a force of magic, we can find opportunities to use it and benefit from it. It can be difficult to distinguish between hype and substance, and today’s post is really just a first attempt at doing so.

There is quite a bit of hype associated with GPT-3 above and beyond whether it can compose text as well as a human can. In a follow-up post, we’ll look at the issue of “general AI”, whether GPT-3 draws us closer to that other myth, and whether this software is as revolutionary or profound as some claim it to be.
