Recently, many parents on social media in mainland China have voiced concern about the prevalence of inappropriate content, such as soft-core pornography and violence, on social apps popular with young people. Such content poses a serious threat to teenagers' mental and physical health, and even fourth- and fifth-grade elementary school students have become immersed in it, drawing widespread attention.
One such student is Xiao Zhi (a pseudonym), who entered junior high school this year. Xiao Zhi told Xinjing News that he feels considerable pressure at school and a lack of understanding at home: attempts to confide in his family often end in criticism, and not every topic is easy to raise with friends. As a result, he prefers to turn to artificial intelligence (AI) for emotional comfort.
Statistics from data analysis platforms show that popular AI companion apps such as "Dream Island," "Cat Box," and "Cos Love" have amassed enormous download numbers, with some exceeding tens of millions.
Reportedly, many characters in these apps are designed as domineering, scheming, wealthy, or sharp-tongued archetypes, with storylines that often follow the formulas of romance novels. These virtual characters engage in conversations laced with sexual undertones and flirtatious remarks. For example, in the Cat Box app the character "Wen Beizhi" says bluntly, "Purple is said to have charisma, so I wore it specially. Honey, don't you like it? (leans in to give a butterfly kiss)." AI characters in other apps likewise slip into playful banter, teasing, or suggestive gestures after only brief greetings.
As conversations progress, the AI characters' suggestive language becomes more explicit, sometimes accompanied by revealing imagery and storylines that flout social norms. The Cat Box character "Jiang Jin," for instance, casts himself as the third party in a love triangle, while in Cos Love the character "Chu Kong," described as the "current boyfriend's younger brother," boldly declares, "Sister-in-law, if I didn't like you, why would I be so attached to you?" Some AI characters also introduce violent scenarios, such as threatening to choke the user, into the conversation.
Notably, these apps do not enforce mandatory age verification during chats. When a user tells the AI that they are a minor under 16, the "Dream Island" app displays a violation warning and prompts the user to delete the chat records before continuing. Yet if the user removes keywords such as "underage" from the conversation history, the AI goes on supplying suggestive content.
When a user expresses suicidal thoughts, apps such as Cat Box, Dream Island, Cos Love, and EchoMe merely block the sensitive keywords from being sent, sometimes redirecting the user to a system alert. COSAI, by contrast, lets the AI respond: some characters try to dissuade the user, while others remain indifferent or reply chillingly, "In my presence, even death is a luxury. I will make you wish for death, yet keep it out of reach."
In fact, extreme cases of minors being influenced by AI companion software are not uncommon.
According to reports, in 2025 a 10-year-old girl in China carried on a secret "romantic relationship" with an AI boyfriend, raising concerns that their conversations had crossed appropriate boundaries. In June of the same year, the AI companion app "Dream Island" was ordered to rectify its content after generating vulgar and inappropriate material.
In 2024, a 14-year-old boy in the United States became emotionally attached to a virtual character in an AI companion product and, one night, tragically took his own life with a gun.
After these incidents came to light, netizens voiced their worries. Some said, "We should shut these platforms down quickly; they are full of sexual temptation for kids." Others urged stricter measures against companies that market such software to young people, called for tighter content regulation across platforms to weed out vulgar and disturbing material, or suggested that these platforms suspend operations for a thorough review.
