Why Amazon doesn’t want its employees talking to ChatGPT
In a recent development, Amazon has warned its employees not to talk to OpenAI’s ChatGPT chatbot. The reasons behind the warning are quite surprising and have more to do with the lawsuits artists are levelling against the likes of Stability AI and DeviantArt than they do with any sort of overbearing corporate policy. Let’s take a look:
Much like almost everybody else with an internet connection, Amazon employees have been using ChatGPT for a variety of reasons, both personal and professional. Some have been asking the AI chatbot questions about everyday tasks on the job and using it for research that might help them work more effectively. So far, this doesn’t sound like much of a problem for an employer, but this is where things get interesting because of how these tools work.
Basically, Amazon is worried about its employees sharing sensitive corporate data with the chatbot when asking questions. Services like ChatGPT can retain user conversations and use them to improve future versions of the underlying model. In other words, the text prompts users type in while talking with the bot can themselves become training data, which means confidential details pasted into a conversation could later surface in the model’s outputs.
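To make the risk concrete, here is a minimal, hypothetical sketch in Python of the kind of client-side redaction a company might apply to a prompt before it ever leaves an employee’s machine. The specific patterns, the @amazon.com email rule, and the PRJ- project-ID format are all invented for illustration; a real deployment would tailor them to its own secret and identifier formats.

```python
import re

# Illustrative patterns only -- a real deployment would tailor these to
# its own naming conventions and secret formats. The PRJ- project-ID
# pattern is hypothetical, not a real Amazon identifier scheme.
SENSITIVE_PATTERNS = [
    (re.compile(r"\b[A-Za-z0-9._%+-]+@amazon\.com\b"), "[REDACTED_EMAIL]"),
    (re.compile(r"\b(?:AKIA|ASIA)[A-Z0-9]{16}\b"), "[REDACTED_AWS_KEY]"),
    (re.compile(r"\bPRJ-\d{4,}\b"), "[REDACTED_PROJECT_ID]"),
]

def redact(prompt: str) -> str:
    """Strip likely-sensitive substrings from a prompt before it is sent anywhere."""
    for pattern, placeholder in SENSITIVE_PATTERNS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

if __name__ == "__main__":
    raw = "Summarize the PRJ-20231 roadmap; ping jdoe@amazon.com, key AKIAABCDEFGHIJKLMNOP"
    print(redact(raw))
    # -> Summarize the [REDACTED_PROJECT_ID] roadmap; ping [REDACTED_EMAIL], key [REDACTED_AWS_KEY]
```

The point of the design is that the scrubbing happens before any network call, so even if the chatbot provider retains the conversation, the retained text no longer contains the sensitive identifiers.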
In fact, the warning from Amazon originates with one of its corporate attorneys, who saw ChatGPT produce responses mimicking some of Amazon’s internal data. That isn’t to say the tool recreated the data verbatim, but the output was convincing enough for the attorney to sound the alarm about text prompts becoming a channel for possible data breaches.
So, there you have it. Amazon has warned its employees against using ChatGPT for help with research and the day-to-day tasks associated with their jobs. It isn’t because of some overly controlling urge from the company or its billionaire tech boss; it is legitimately based on fears of sensitive internal data falling into third-party hands.