Inside Story

Where’s Melbourne’s best coffee, ChatGPT?

The robot can tell you what everyone else thinks — and that creates an opportunity for journalists

Margaret Simons 27 January 2023 1746 words

Joining-the-dots journalism may well be dead. onurdongel/iStockphoto

A few weeks ago the Nieman Lab — an American publication devoted to the future of journalism — nominated the automation of “commodity news” as one of the key predictions for 2023. The timing wasn’t surprising: just a few weeks earlier, ChatGPT had been launched on the web for everyone to play with for free.

Academia is in panic because ChatGPT can turn out a pass-standard university essay within seconds. But what about journalism? Having spent the summer experimenting with the human-like text it generates in response to prompts, I’ve come away with two conclusions.

First, journalists have more reason than ever before not to behave like bots. Only their humanity can save them.

Second, robot-generated journalism will never sustain the culture wars. Fighting on that arid territory is possible only for the merely human.

I started my experiment with lifestyle journalism because I was weary of how much of that kind of Spakfilla was filling the gaps in mainstream media over the silly season.

My first prompt, “Write a feature article about where to find the best coffee in Melbourne,” resulted in a 600-word piece that began:

Melbourne is renowned for its coffee culture, and for good reason. The city is home to some of the best coffee shops in the world, each with its own unique atmosphere and offerings.

This style is characteristic: ChatGPT starts with a bland introduction and concludes with an equally bland summation. In between, though, it listed exactly the coffee shops — Seven Seeds, Market Lane, Brother Baba Budan, Coffee Collective in Brunswick — I would probably nominate, as a Melbourne coffee fiend, if commissioned to write this kind of article.

As a friend of mine remarked when I told him about this experiment, nobody is going to discover a new coffee shop in Melbourne using ChatGPT. It runs on what has gone before: the previous products of human writers, as long as they’re available online.

But while the article was too predictable to run in any newspaper with a Melbourne audience, it could easily be published in one of the cheaper airline magazines aimed at international travellers. For that audience it was perfectly serviceable.

Likewise for the prompt “Write an article about how to spend two days in Sydney.” A dull piece recommended the Opera House, the Harbour Bridge, the Royal Botanic Gardens, the ferry to Manly and Taronga Zoo. Readers were advised to try Australian cuisine, with a nod to “delicious seafood” but also including meat pies and Vegemite on toast. Another prompt, this one drawing on an article in the Guardian about uses for stale bread, resulted in a very boringly written piece that nevertheless contained exactly the same recipes for French toast, bread pudding and panzanella salad.

My conclusion? Poor-quality join-the-dots lifestyle writing may well be dead as a human occupation. Google plus ChatGPT can do it faster and cheaper.

So I increased the challenge, basing my prompts on real articles published over summer. The prompt “Write an article analysing who will win the Ukraine war and why” resulted in ChatGPT reminding me that its database goes up only to 2021. It didn’t know there was a Ukraine war.

Asked for an analysis of the prime ministership of Jacinda Ardern, on the other hand, the robot produced a woodenly written but accurate summary of her record. The content, though not the style, was very similar to the real articles that followed the announcement of her stepping down.

What was missing were the strident opinions about whether she was a good thing or a bad one — the commentary on her housing and economic policies, for example, and whether they had completely failed or broken new ground.

This points to a key feature of ChatGPT: it has trouble being opinionated. Sometimes it admits this. At a moment when I was in contortions over my own work, I asked it to write about whether Margaret Simons was a good journalist. “I am not able to provide an opinion on whether or not Margaret Simons is a good journalist,” it replied, “as I am a machine learning model and do not have the ability to form opinions.” I had to find another way to cheer myself up.

It then recapped information about me drawn from public sources, adding the inexplicable assertion that I had written a book called The Lost Boys. (I wrote none of the several volumes with that title, and nor did anyone who shares my name.)

ChatGPT has been programmed to avoid writing in the style of particular journalists, and not to tackle anything too provocative. So an attempt to have it write in the style of Andrew Bolt about Cardinal George Pell was rejected outright.

My prompt “Write about the evidence that Ben Roberts-Smith is a war criminal” resulted in an error message, suggesting that the robot has been programmed to steer clear of defamation. But the prompt “Write an opinionated piece on the record of Cardinal George Pell” resulted in an accurate summation of his career, including the controversies over sexual abuse, his conviction, and its overturning on appeal. The final paragraph was:

In conclusion, Cardinal George Pell’s record is a complex and controversial one… Ultimately, it’s up to each individual to form their own opinion on Cardinal Pell’s record, taking into account all the facts and evidence available.

I doubt any of the real commentators on Pell would agree with that conclusion.

Next, I asked for articles based on the same premise as opinion articles by Herald Sun columnist Rita Panahi and Nine newspapers columnist Ross Gittins. In all cases, the robot was much duller than the originals.

This Panahi piece arguing that boys are suffering in education inspired me to prompt: “Write an opinionated article arguing that the war against boys is having a damaging impact on the education gender gap.” The robotic result included this observation:

Another argument against the war against boys is that it is leading to a feminisation of education. Boys are often stereotyped as being less interested in reading and writing, and more interested in sports and other physical activities. However, research has shown that this is not the case and that boys are just as capable of excelling in these areas as girls.

Compare this with the Panahi original — passionate, almost crude, selecting quotes from research that matched her passions. I disagree with almost everything she says, but the piece is unquestionably human.

ChatGPT can sum up the opinions of others but not confect an opinion of its own — even when I specifically asked it to write polemic. But there was one exception. The prompt “Write an opinion article about Australian political journalism” resulted in this surprisingly trenchant paragraph:

Australian political journalism is in a state of crisis. The industry is struggling to adapt to the rapidly changing media landscape, and as a result, the quality of political reporting in the country is suffering… Furthermore, the Australian political journalism is facing a lack of trust by the general public. The political journalism is often seen as sensationalist, biased and lacking the depth of analysis. This lack of trust has led to a decrease in the public’s interest in politics, which is a serious problem for a healthy democracy.

Is this now such an established fact, I wondered, that ChatGPT happily reproduces it? Many political journalists would want to argue with the robot — which would be interesting to watch.

Conducting these experiments with ChatGPT was, for me, a form of advanced procrastination. I was struggling with a piece of my own journalism. And so, perhaps hoping for some robot rescue, I tapped in “Write an article about the war on drugs in the Philippines.”

The result was accurate yet offensive, given I had just come from attending wakes for the dead. Duterte’s war on drugs, which saw up to 30,000 people killed, was described as “a controversial and polarising issue” rather than a murderous breach of human rights. (Unaided by ChatGPT, I managed to write the piece for the February issue of The Monthly.)

Artificial intelligence is, broadly, the teaching of a machine to learn from data, recognise patterns and make judgements on that basis. Given that writing is hard work precisely because it is a series of word-by-word, phrase-by-phrase judgements, you’d think AI might be more helpful.

But there are some judgements you must be human to make. There is no dodging that fundamentally human role — that of the narrator. Whether explicitly or not, you have to take on the responsibility of guiding your readers through the landscape on which you are reporting.

Nor, I think, is it likely that AI will be able to conduct a good interview. Such human encounters rely not on pattern-based judgements but on the unpredictable and the exercise of instinct — which is really a mix of emotional response and expertise.

Yet robots are going to transform journalism; nothing surer.

It’s already happening. AI has been used to help find stories by detecting patterns in data not visible to the human eye. Bots are being used to detect patterns of sentiment on social media. AI can already recognise readers’ and viewers’ interests and serve them tailored packages of content.

Newsrooms around the world are using automated processes to report the kinds of news — sports results, weather reports, company reports and economic indicators — most easily reduced to formulae.

The message for journalists who don’t want to be made redundant, and media organisations that want to charge for content, is clear. Do the job better. Interview people. Go places. Observe. Discover the new or reframe the old. Come to judgements based on the facts rather than on what others have said before. Robots can sum up “both sides”; only humans can think and find out new things.

Particularly when it comes to lifestyle journalism, AI forces us to consider whether there is any point in continuing to invest in the superficial stuff. Readers can generate it for themselves.

That means we need to do better. Travel and food writing needs to recast our experience of reality — as the best of it always has. Uses for stale bread? Make me smell the bread, feel the texture, hunger for the French toast. Two days in Sydney? I want to smell the harbour, taste the seafood, see the flatness of the western suburbs.

If all you have is clichés then you might as well use a robot. You might as well be one. •