ChatGPT: the good, the bad and the ugly

Screenshot of ChatGPT's homescreen.

Sarah Coles shares the inside track on the new AI chat tool everybody's talking about

Over the past few weeks, you’ve probably heard about the new chatbot technology, ChatGPT. Powered by OpenAI’s GPT-3.5 large language model (LLM), ChatGPT is a computer program that can understand and respond to users in a way that feels remarkably close to talking with a human.

And it’s surging in popularity. In the five days after OpenAI launched ChatGPT late last year, the artificial intelligence chatbot reached one million users, a milestone that took Netflix over three years.

And this popularity has prompted Google, Meta and others to build their own LLMs. Some say the technology could even replace Google search or Wikipedia in the future.

This all sounds exciting, but it’s not without risks. So with that in mind, let’s break down the positives, the risks and how you can protect yourself and your business.

Let’s start with the positives

It can be an incredible productivity tool. ChatGPT is able to quickly answer common questions, write (and even debug) code, assist with research, translate communications with clients in different languages and more.

In the future it’s likely we’ll use AI to drive improvements in cyber security too. AI can process data far faster than humans can, making it more efficient at identifying threats.

What about the risks?

Like anything online, ChatGPT comes with risks. Here are a few examples:

  • Staff uploading confidential data – on average, a company leaks sensitive data to ChatGPT hundreds of times each week. And while ChatGPT isn’t designed to store confidential data, it’s still vulnerable to the same privacy and security risks associated with online communication, like email [1].
  • It can be used to craft convincing, natural-sounding phishing emails – which are unlikely to contain the grammatical errors that have been our tell-tale sign of phishing for many years. It can also translate phishing emails into any language, and it’s unlikely the end user will detect any errors.
  • It could potentially be used to generate malware – or help less experienced cyber criminals improve their technical knowledge and develop skills [2]. It’s worth noting ChatGPT is trained to be on ‘the good side’, so it’s aware of risky requests. But some have found ways to work around its safeguards to help with more nefarious requests.
  • Beware of fake ChatGPT sites – there have been lots of examples of lookalike sites directing people to fake ChatGPT apps, or to sites that ultimately steal personal information or download malware [3, 4].

4 tips to protect yourself and your business

With all that in mind, here are 4 tips on how you can enjoy the benefits of ChatGPT and stay safe:

  1. Beware of fake ChatGPT apps and websites – check reviews and do your research before installing an app or visiting a site.
  2. Continue to raise awareness of good security practices – such as applying multi-factor authentication and looking out for phishing emails. There aren’t always grammatical errors, so point out other clues to look out for, like a sense of urgency in the request.
  3. Don’t share personal or confidential information – it’s always sensible to think about privacy and confidentiality when sharing information online, chatbot or otherwise.
  4. Continually adapt – keep on top of new threats and see how cyber criminals are evolving their tactics. Then adapt your defences to make yourself and your business more resilient to future threats.

References:

1. Security Affairs - The risk of pasting confidential company data into ChatGPT (March 13, 2023)

2. Axios - Hackers are already abusing ChatGPT to write malware (January 10, 2023)

3. Bleeping Computer - Hackers use fake ChatGPT apps to push Windows, Android malware (February 22, 2023)

4. Medium - “FakeGPT”: New Variant of Fake-ChatGPT Chrome Extension Stealing Facebook Ad Accounts with Thousands of Daily Installs (March 8, 2023)

This article is for financial professionals only. Any information contained within is of a general nature and should not be construed as a form of personal recommendation or financial advice. Nor is the information to be considered an offer or solicitation to deal in any financial instrument or to engage in any investment service or activity.

Parmenion accepts no duty of care or liability for loss arising from any person acting, or refraining from acting, as a result of any information contained within this article. All investment carries risk. The value of investments, and the income from them, can go down as well as up and investors may get back less than they put in. Past performance is not a reliable indicator of future returns.  

Speak to us and find out how we can help your business thrive.