Introduction
ChatGPT has taken the world by storm, and for good reason. OpenAI released ChatGPT in November 2022, giving us a glimpse of the colossal amount of data used to train this language powerhouse. It is the fastest-growing consumer application ever launched. But its release has also raised serious privacy concerns among users: ChatGPT can draw on the data you have shared about yourself online to generate the text you want. So the question arises, does ChatGPT have access to all of its users’ private information?
In March 2023, Garante, the Italian data protection authority, blocked users in Italy from accessing ChatGPT. Under the same order, OpenAI was temporarily barred from processing Italian users’ personal information while the investigation was carried out. The action was taken because ChatGPT is trained on vast amounts of data, data that belongs to you.
How Has Italy’s Ban on ChatGPT Raised Privacy Concerns Across Europe?
Italy’s ban on OpenAI’s ChatGPT has caused quite a stir among European data regulators. Following Italy’s announcement, data regulators in France, Germany, and Ireland contacted the Garante for more information so they could investigate the issue thoroughly and decide how to respond. Tobias Judin, head of international at Norway’s data protection authority, has expressed concern over the legality of using an AI model built on unlawfully collected data.
According to Judin, if OpenAI has simply scraped the internet for information without regard for privacy laws, the consequences could be significant. That scenario raises the question of whether anyone can use such tools lawfully at all, which could have far-reaching implications for the future of AI development. The outcome of Italy’s investigation will undoubtedly set a precedent for how European nations deal with AI models built on personal data.
Main Privacy Concerns Put Forward by Italy’s Garante
The General Data Protection Regulation (GDPR) in Europe safeguards the personal data of more than 400 million individuals. It governs how organizations collect, store, and use personal data, and its protections apply even if someone’s information is publicly available online.
Italy’s Garante has accused OpenAI of breaching GDPR, citing four main problems with ChatGPT: it has no age controls, making it easily accessible to all age groups; it can provide inaccurate information; it does not inform people that their data has been collected; and it has no legal basis for collecting personal information in the first place.
Under GDPR, a company must rely on one of six legal justifications to collect and use people’s data. In ChatGPT’s case, OpenAI neither obtained people’s consent nor demonstrated that it had “legitimate interests” in processing their data. Lilian Edwards, a professor of law, says such a defense is “very hard” to make, and the Garante considers it “inadequate.”
ChatGPT has undoubtedly revolutionized how we access information, but it may have done so at the expense of our private data. If that is the case, it poses a serious threat to the personal information available online, and further investigation is needed to rule out that possibility.
What Is The Solution?
Now that ChatGPT stands accused of mishandling personal information, the need of the hour is to switch to a platform that keeps personal data safe and secure. One such platform is Zyng, a social networking platform that prioritizes user privacy and security. Unlike ChatGPT, Zyng takes several concrete steps to protect private user data.
First, Zyng uses end-to-end encryption to protect user conversations and stores none of them on its servers. Messages are encrypted on the sender’s device and can only be decrypted by the intended recipient, so Zyng never sees their contents. Even if a third party or malicious actor intercepted a message, they would not be able to read it.
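To make the idea concrete, here is a minimal sketch of how end-to-end encryption works in principle, using the open-source PyNaCl library. This is not Zyng’s actual implementation; the key names, message flow, and library choice are illustrative assumptions only.

```python
# Illustrative end-to-end encryption sketch using PyNaCl (libsodium bindings).
# NOTE: This is NOT Zyng's real code; it only demonstrates the general concept.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public keys are shared (for example, through the service's key directory).
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts a message for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"Meet at noon")

# The server only ever relays `ciphertext`; without Bob's private key,
# it cannot recover the plaintext.
receiving_box = Box(bob_private, alice_public)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon"
```

In a design like this, the service can deliver messages without ever being able to read them, which is what allows a provider to claim it cannot hand over conversation contents even if asked.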
Second, Zyng has a strict data protection policy that complies with major data protection regulations. The platform collects only the data necessary to provide its services, and it does not share user data with third parties, because it simply can’t, even if it wanted to.