Snapchat has announced new tools to make the experience with its recently launched AI chatbot “My AI” safer, including an age-appropriate filter and insights for parents.
The company said it discovered the chatbot’s potential for misuse after learning that people were trying to “trick the chatbot into providing responses that do not conform to our guidelines”.
In response, the company has shared an update on its safety enhancements and said it will launch a few tools to keep the AI’s responses in check.
For an age-appropriate experience, the company said that since launching My AI it has worked vigorously to improve the chatbot’s responses to inappropriate Snapchatter requests, regardless of a Snapchatter’s age.
Snapchat also uses proactive detection tools to scan My AI conversations for potentially non-conforming text and take action.
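Snapchat has not detailed how this scanning works. Purely as a rough illustration, the sketch below assumes a hypothetical guideline-scoring function and a flagging threshold; none of the names or logic come from Snapchat.

```python
# Illustrative sketch only -- the scoring function, threshold, and flagging
# step are hypothetical stand-ins, not Snapchat's actual detection system.
from dataclasses import dataclass

@dataclass
class Message:
    user_id: str
    text: str

def guideline_violation_score(text: str) -> float:
    """Hypothetical scorer: returns the likelihood that a message breaches
    content guidelines (a placeholder for a real moderation model)."""
    banned_terms = {"example-banned-term"}  # placeholder term list
    return 1.0 if any(term in text.lower() for term in banned_terms) else 0.0

def scan_conversation(messages: list[Message], threshold: float = 0.8) -> list[Message]:
    """Flag messages whose violation score crosses the threshold so that
    some follow-up action can be taken on them."""
    return [m for m in messages if guideline_violation_score(m.text) >= threshold]
```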
The company has also “implemented a new age signal for My AI utilising a Snapchatter’s birthdate, so that even if a Snapchatter never tells My AI their age in a conversation, the chatbot will consistently take their age into consideration when engaging in conversation”.
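Snapchat has not published how this age signal is wired into the chatbot. As an illustrative sketch only, the snippet below assumes a hypothetical setup in which an age derived from the stored birthdate is injected into the model’s instructions on every request.

```python
# Illustrative sketch only -- the "age signal" mechanism is not publicly
# specified; the function names and prompt wiring here are assumptions.
from datetime import date

def age_from_birthdate(birthdate: date, today: date | None = None) -> int:
    """Compute the user's current age in whole years from their birthdate."""
    today = today or date.today()
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def build_system_prompt(birthdate: date) -> str:
    """Attach the age signal to every request so the model can tailor its
    responses, even if the user never states their age in the chat."""
    age = age_from_birthdate(birthdate)
    return f"The user is {age} years old. Keep all responses age-appropriate."

# Example: a user born in 2008 would be treated as a teen account.
print(build_system_prompt(date(2008, 5, 17)))
```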
Snapchat also said that, in the coming weeks, it will give parents more insight into their teens’ interactions with My AI through the in-app Family Center.
With this, parents will be able to use Family Center to see if their teens are communicating with My AI, and how often.