Hot off the press

Hallucinate – 2023 word of the year

You didn’t need a crystal ball to predict that Cambridge Dictionary’s word of the year would be in some way related to artificial intelligence, but the choice of “hallucinate” is an interesting one. It neatly flags up the need to approach AI with a degree of caution.

The original definition of “hallucinate” is “to see, hear, feel or smell something that does not exist”. The word now also refers to when generative artificial intelligence systems, such as ChatGPT, mimic human writing and produce false information.

According to a post on the dictionary’s site, the word was chosen because it “gets to the heart of why people are talking about AI”. The post goes on to say that “Generative AI is a powerful tool and one we’re all still learning to interact with safely and effectively – this means being aware of both its potential strengths and its current weaknesses.”

AI – A force for good?

AI has been around in one form or another for many years, but in 2023 it has really made its way into our collective psyche. The explosion of generative AI has proved a real talking point. GenAI has some impressive capabilities, but it is also only as reliable as the data it learns from. The speed at which AI can generate an article, a report or even a blog post (not that we’d EVER do that…) is undeniably impressive. On the flip side, that text can be sprinkled with inaccuracies. The problem is not the false information as such; it’s the way incorrect statements are presented as fact. You still need decent subject knowledge to sort the fact from the fiction. As users, we have a responsibility to apply critical thinking to the information we are presented with.

As an agency, we’re tentatively excited about GenAI and all its possibilities. However, we are glad there is acknowledgement that these tools need to be treated with a degree of caution.

Get in touch if you’d like to find out more about how you can add the human touch to your advertising.