John O’Connor

ChatGPT is technology’s latest gift to the masses. And it may well prove to be one of the most dangerous.

Now, if you are still trying to get your head around what ChatGPT is, don’t feel bad. It was pretty much unknown until a few months ago.

But in almost no time at all, it has become an internet sensation. How fast is it growing? Faster than TikTok.

And like TikTok, there’s something about it that is both alluring and unsettling. Basically, ChatGPT allows users to have human-like conversations by using a computer program known as a chatbot.

Need a 1,000-word essay on, say, why cricket fans are more rabid in Mumbai than in London? Simply type in the question. And bam, out comes a detailed analysis. That’s just one example. The possibilities here are almost endless.

Advocates are praising ChatGPT as a way to improve the use of telemedicine, medical record keeping, clinical trials, mental health support, medication management and more. And it can do all those things.

But at the same time, it has a tremendous potential for misuse and harm. In extreme cases, it can be used to create fake news, to impersonate people and even to carry out cyber-attacks.

Moreover, ChatGPT is unable to determine whether the sources it scrapes for content are accurate or not. It’s not a human, remember?

So if you ask for 20 ways to deliver better senior living services, you’ll likely get some great ideas from leading authorities. However, it’s possible crazed directives from a serial killer might also be included, along with incoherent ramblings from a teenager. As Forrest Gump might say, ChatGPT is like a box of chocolates.

Earlier this week, two researchers posted an article that basically sounded the alarm.

Writing in the journal Health Affairs, authors Wura Jacobs and Omolola E. Adepoju noted that many of the texts, books, articles and websites that chatbots use when replying to inquiries contain biases in areas such as age, race/ethnicity, sex, education levels and more.

The full story can be found here.

Would you fire up a chain saw with no idea of how it works? Of course not. Well, ChatGPT can be pretty dangerous as well. So proceed with caution here.

The old adage about not believing everything you read has never been more relevant. Especially if the content is provided via ChatGPT.

John O’Connor is editorial director for McKnight’s Senior Living and its sister media brands, McKnight’s Long-Term Care News, which focuses on skilled nursing, and McKnight’s Home Care. Read more of his columns here.