Character.AI gave up on AGI. Now it sells stories

“Artificial intelligence is expensive. Let’s be honest about it,” says Anand.

Growth and safety

In October 2024, the mother of a teenager who died by suicide filed a wrongful death suit against Character Technologies, its founders, Google, and Alphabet, claiming the company targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences, while programming [the chatbot] to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover.” At the time, a Character.AI spokesperson told CNBC the company was “heartbroken by the tragic loss” and takes “the safety of our users very seriously.”

The tragedy is not an isolated incident. Earlier this year, US Senators Alex Padilla and Peter Welch wrote a letter to several AI companion companies, including Character.AI, highlighting concerns about “the mental health and safety risks posed to young users” of the platforms.

“The team has been taking this very seriously for almost a year,” Anand told me. “AI is stochastic; it’s always hard to know what’s going to happen. So it’s not a one-time investment.”

That matters, because Character.AI is growing. At last count, the platform’s 20 million monthly active users spend, on average, 75 minutes a day talking to a chatbot (a “character,” in Character.AI parlance). The company’s user base is 55 percent female. More than 50 percent of its users are Gen Z or Gen Alpha. With that growth come real risks: what is Anand doing to keep his users safe?

“[In] the past six months, we have invested a disproportionate amount of resources in the ability to serve under-18 users differently than over-18 users, which was not the case last year,” he says. “I can’t say, ‘Oh, I can slap an 18+ label on my app and say it’s for NSFW use.’ You end up creating a completely different app and a different platform.”

Anand tells me that more than 10 of the company’s 70 employees work full-time on trust and safety. They are responsible for building safeguards such as age verification, separate models for users under 18, and new features such as Parental Insights, which lets parents see how their teens use the app.

The under-18 model launched last December. It draws on “a narrower set of searchable characters on the platform,” according to company spokesperson Catherine Kelly. “Filters have been applied to this set to remove characters related to sensitive or mature topics.”

But Anand says that making AI safe will take more than technical tweaks. “Making this platform safe is a partnership between regulators, us, and parents,” he says. That makes watching his own daughter talk to a character all the more important. “It has to stay safe for her.”

Beyond companionship

The AI companion market is booming. Consumers worldwide spent $68 million on AI companion apps in the first half of this year, up 200 percent from last year, according to an estimate cited by CNBC. AI startups keep launching products for this slice of the market: xAI released a flirtatious anime companion in July, and even Microsoft’s Copilot chatbot now has an AI companion.

How can Character.AI stand out in such a crowded market? By taking itself out of it entirely.
