The UK data protection regulator is assessing the privacy implications of Snapchat’s AI chatbot and is demanding explanations from the company, a process that could ultimately result in the feature being banned.
Snapchat’s My AI chatbot garnered significant attention upon launch. Many users began treating it as a friend and shared personal details with it, raising serious privacy concerns, particularly regarding children.
Snapchat’s My AI is powered by OpenAI’s ChatGPT, which enables the chatbot to respond to user queries.
Early on, users were warned not to disclose personal information to the chatbot, as those conversations were being analyzed to improve the product.
Now, the Information Commissioner’s Office (ICO) has issued a preliminary notice to the company, giving it the opportunity to demonstrate that the regulator’s provisional findings are wrong.
If Snapchat fails to address these risks, the My AI feature could be banned in the UK. The platform would also need to reevaluate its policies for users aged 13 to 17.
The My AI chatbot was introduced in April and was praised for the accuracy of its responses. Nevertheless, users voiced concerns about the risks of sharing personal information with it.
The company has also reconsidered its initial position that the app was suitable for users aged 13 and above, suggesting the minimum age may need to be raised.