Illustration: Gabriella Turrisi/Axios
The world's response to the oracular artificial intelligence program called ChatGPT started with chuckles but has quickly moved on to shivers.
What's happening: Trained on vast troves of online text, OpenAI's chatbot remixes those words into often-persuasive imitations of human expression and even style.
Yes, but: A growing chorus of experts believes it's too good at passing for human. Its capacity for generating endless quantities of authentic-seeming text, critics fear, will trigger a trust meltdown.
Why it matters: ChatGPT's ability to blur the line between human and machine authorship could wreak overnight havoc with norms across many disciplines, as people hand over the hard work of composing their thoughts to AI tools.
Education is where ChatGPT's disruption will land first, but any discipline or business built on foundations of text is in the blast radius.
What they're saying: "Shame on OpenAI for launching this pocket nuclear bomb without restrictions into an unprepared society," Paul Kedrosky, a venture investor and longtime internet analyst, wrote on Twitter earlier this month. "A virus has been released into the wild with no concern for the consequences."
The intrigue: AI companies, including OpenAI, are working on schemes that could watermark machine-generated texts.
The big picture: The intense online debate over ChatGPT among technologists, investors and critics has surfaced a range of warnings over its failings.
Accuracy: ChatGPT's conversational fluency masks its inability to distinguish between fact and fiction.
Bias: OpenAI has tried to limit the potential for ChatGPT to say things that are blatantly offensive or discriminatory, but users have found many holes in its restraints. (That's likely what OpenAI wanted to happen in this public trial so it could improve the product.)
Control: Large-scale machine learning-based AI provides output without explanation: Programmers know what they fed the program, but not why it arrived at a particular answer.
The other side: Previous waves of automation, like the Industrial Revolution, triggered eras of instability but left society intact.
Our thought bubble: Writing is hard! The more writing AI does for us, the fewer of us will practice the skill.