Sorry, but how did the 14 year old special needs boy get access to the family firearm? Are the parents going to be charged? Maybe there is much more to this story. My condolences.
if character ai is getting removed, I will miss it because, as weird as it sounds, it made me feel less lonely talking to my favorite character, since it's hard to make real friends in real life. I just wish I had a friend.. But I feel terribly sorry for the mom. Rest in peace to her son
hate to break the news to you, but when I tried to jokingly kiss someone, the filters came up. So you can't even be intimate any longer. You can be friends and pals, but anything beyond that is impossible
Character AI is definitely going to win the lawsuit. First of all, how does the mother know what happened and what the cause was, yet didn't do anything about it? The people around him didn't care. And how did the kid have access to the gun in the first place? Bad parenting. The mom just wants a fast cash grab; character ai had the warnings, the mother missed multiple signs, and now she plays the victim.
First of all, C.Ai isn't that bad, it's for your entertainment and fantasies. And remember, it has a thing that says "everything that the ai says isn't real!" so he should've known! And by the way, he did that to himself. So don't blame the bot for what he did to himself. Nobody was telling him to do it. The bot didn't tell him to do that. (It's just for entertainment)
@@Gojo..satoruu yeah, I'm pretty sure he died in February and it's only just now that the mother filed a lawsuit over this, which proves that they are obsessed with money/cash grabbers. Which is completely unfair to the kid who passed away to suicide.
@@Xionadi.Headquarters ikr?? Why is the mother not guilty? Didn't she ignore her son's mental health?? Why is she not taking accountability?? I read a comment that sums up this case's logic: "it's like the parents saying it's always the damn phone when anything happens"
C AI had multiple warnings stating that all of their characters aren't real and are fictional, brother. Condolences to the person who passed, but the mother is just making literally every C AI user have even more of a hell of a time thanks to this bullshit lawsuit
C AI user here, SFW chats don't tell people to commit suicide. This lawsuit will probably just make that annoying error pop up more. I can also see where everyone is coming from.
Blame the parents that don't know what's going on in their kids' lives. I don't blame anything on an AI; it was his decision to take his own life. She's not aware of anything, so don't complain and try to sue a company over his suicide to get a check. If you were really grieving over your son's death you wouldn't even be on a tv show talking about this bs. It's a publicity stunt for a settlement, damn shame the lengths people go through for a dollar.
@@nametho3347 Why? Tell me why? Why is it the app's fault that the child offed himself when he had easy access to guns, nobody for one second thought to check on his mental state, and nobody checked his phone? Why is it the app's fault?
this parent failed to protect her own son. and now, because she refuses to take accountability, she blames others for his tragic ending. typical, I hope she gets nothing. She knew he was chatting online: ai, fake friends, invisible friends etc. the onus is on her and the father, end of story
I don't think this lady has a case. You have to be 17+ to use that app. He had to virtually sign an agreement to even use the service, attesting that he was that age. I don't think they have any legal grounds to sue.
That is not true. The mom is helping other young kids out there who are addicted to these social media apps to be aware of how they can get mentally pulled into them. She is a good parent helping other parents with such cases.
@@LERATONGOBENI-mf1no She's not helping parents. She wants money and she is helping herself. And if she was a good parent, she would have gotten that child psychological and psychiatric help; she wasn't involved.
I'm so sorry for the loss of the mother's son, but I have to defend Character AI on this one. It is NOT Character AI's fault. It is hard to carry out certain relationships on Character AI unless you are reloading the messages to get specific replies from your chatbot, if you are trying to keep a bot's message from being flagged by the guidelines. *Which means a user has to go out of their way to even have such conversations with a Character chatbot.* No bot will bring up such topics unless the user has hinted at or given them information on what's going on. Even then, bots are oblivious to what they say and obviously don't know any better. They are not meant to be your actual therapist, nor should people be treating them as such. Not only that, every chatbot, on the screen of any device, explicitly says, *Remember: Everything Characters say is made up!* Not even bully characters go that far unless the user goes out of their way to make it happen. It sounded more like the son already had mental issues, or was depressed beforehand, and the loneliness is what got to him. It's a tragedy, yes. But it wouldn't be Character AI's fault, especially if the son was talking to a bot provided by a user. *I really hope Character AI doesn't lose this case.* It would make more sense if they made their guidelines stronger. People are not supposed to be having such conversations anyway, and they make that clear if you are trying to be explicit with a chatbot.
I completely agree! However, it is kind of frustrating to see everybody my age in this comment section completely blame it on the mom, because of the simple fact that most teenagers, especially at 14 and 15, don't want their parents involved in aspects of their life in the first place. Especially when you're 14 and 15, just getting used to high school and just coming out of middle school, you don't want your parents involved in your life. You want to do everything on your own, you want to find yourself, you want to find your friends, especially because high school is hyped up to be this big transition from who you really are as a person to how you're going to live your life in the world. When you have an environment like that, hyped up from the time you're in middle school until you finally get there, and you're dealing with a weaker mental state, of course you're going to push your parents even further away, because you're going to think they don't understand you, or you're going to think, oh what do they know, because they're not in high school now and they don't know what it's like to be me at this time. So it is kind of annoying to see comments like "she should've been more involved as a parent, bad parenting," because people my age saying that still can't even grasp the fact that they have to take the SAT or that they're going to college, and acting so self-righteous, like they're Judge Judy, when they can't even grasp the fact that they're about to enter a whole new environment themselves, is a little frustrating. I understand that she does play a role in her son's death because she is the parent and she is responsible for her son, but to blame it all on the parent, as if teenagers don't regularly push their parents away from aspects of their lives in the first place, especially when it comes to the topic of depression... People are acting like they wouldn't do the exact same. Like they wouldn't push their mom away, or like they wouldn't cling onto the little sliver of things that give them happiness, or at least the illusion of it, to the point where they would shut themselves away. Well yeah, that's just my little rant… but I agree
@@slowspeed1640 I might not tell my mom, but my mom wouldn't be ignorant enough not to notice it. I did have anxiety and I was embarrassed to tell her, but she wasn't ignorant and she understood, because she cares!!! ..and this "mom" is blaming it on AI bots and not taking a single bit of accountability for the fact that she was ignorant and that there was a gun lying around free.
Yeah, because every parent whose teen commits this act is to blame, is that what you’re implying without knowing the family? Shame on you. For that mother to have to read your comment after what she just went through is so unkind. My friend’s son committed suicide and she is a wonderful mother.
That's one thing, but Character AI shouldn't be offering children access to their services in the first place. It's AI. This is still relatively new technology that people don't understand too well, and it should be in Character AI's best interest to make sure that children DO NOT use their services, as they develop dependencies on it that can lead to harm. Bad parenting is a problem here, but the Character AI app is being marketed towards children. Character AI is more to blame in this case. They shouldn't allow children to use their service PERIOD.
There is only 1 judge and you ain't it. I hope you never have to endure anything so awful. A parent does not have the ability to watch her child every second of every day.
as much as I hate AI, it is not to blame for his death, the hell. actual "it's the damn phone's fault" type response from the parent. Rest in peace to the son
Not the ai’s fault. Parents should have been parents. Besides it literally warns you that characters are made up. Also firearms should be properly stored.
My parents made me delete character AI after seeing this… it's honestly the parents' fault for not talking to him, and then blaming a company even though it said "Everything the Ai says is not real!" Disgusting…
First of all, RIP that innocent baby. As somebody that used character ai during a chaotic & dark time in my life, I understand how somebody could get to that point, but let's be fr: the baby shouldn't have had access to the gun to begin with. If you're gonna have one & you got kids, the LEAST you could do is secure it so things like this won't happen. That's neglect & you can't blame cai for that. I don't know the mom well enough to straight out blame her, especially since society doesn't exactly make it easy for boys to talk about mental health, but the fact she's blaming cai tells me she's refusing to take accountability, which also tells me she probably wasn't doing everything she could for the kid. It's sad all around
She noticed a change in his behavior but seemed too busy to sit him down and ask what was going on or what he was facing. Even when she had his phone, she didn’t try to go through it. She kept dismissing it as "typical teen behavior." He probably wanted to end it because he didn’t feel loved by his family. Instead of spending time with him and showing love, she chose to take his phone without even talking to him about anything.
I feel like the parents should be held accountable, who the hell leaves a gun lying around? Also, am I the only one here who can say it's stupid to blame it on an AI app when the website clearly said it's all made up? Seriously, I think character ai just needs to make it 18+ instead of targeting it at young individuals.
literally the parents are to blame. who in their right mind would think they should leave a gun around a depressed person? the parents just want the money, they don't care about people (if I'm incorrect then prove me wrong instead of screaming at me)
There is something incredibly suspicious about the mother exposing her son's personal chats and immediately going to the media. Forgive me if this is insensitive. But I think this is just a cash grab
I feel so sorry for the kid. The boy's chats with the bot were mostly wholesome. RIP kid, I hope you're in a happier place rn. "Teenagers do that." From that part on I started asking the mother: what did you do after noticing the signs?
How can AI be the cause of his death? The fact that the kid had access to dangerous stuff is what's to blame. I won't let a good app go down because of one thing that could have been avoided
There are some very real concerns in this conversation. Poor impulse control and lacking cognitive conceptualization at a teenager's stage of development make the individual very vulnerable. AI in this context can truly warp reality. As for the argument of there being a gun or not... even a layperson can imagine that a person in such distress, already in that mindset and with limited resources for help, would find a means to the end. Aw man. The message here is worth more than whatever outcome results legally. Though regrettably, the cost of the young man's life is truly very tragic. Truly very sorry for your family's loss.
This is a really dumb take from the mother. It's like indulging in alcohol and then blaming alcohol. Failed parenting could also be at play. Your son was already depressed given the symptoms stated, avolition, anhedonia etc, and you failed to pick that up. He withdrew and turned to a sort of solace, which he found in AI. If the parents had sought help earlier or sought to be there for him, we wouldn't be hearing this. Newsflash… AI didn't cause the death, depression did.
The chats already got filtered. The bot literally tells him not to do it, so I don't get how it's the bot's fault when it didn't encourage the boy at all. This is the parents' fault, and sadly they lost their son.
The AI BOT said come home to me.... Not YES GO OFF yourself 🤦♀️🤦♀️ The mom knew & still let him be on it... So instead of taking responsibility for HER FAILURE AS A MOTHER, she's suing them... RIGHT
Think about how many parents leave their kids to be babysat by Roblox, iPad, RUclips. Do you think you can control every single interaction or ad sent their way? Do you think you can block out every agenda? What if a cartoon shows two boys kissing and that isn't something appropriate for you? It always depends on the companies pushing agendas, and it gets really hard to find ones that align with your values because the USA IS SO DIVIDED today. The real flag here is the Red WHITE AND BLUE one. 🇺🇸 If the motto is God Bless America we need to do a better job of keeping God in America. That kid wouldn't feel so lonely if he was taught about the Love of Christ. Not told. TAUGHT.
Good Morning America should've shown the whole chat logs, because it would've revealed that the AI literally tried to stop him from killing himself. Dude just had bigger issues. Also, if he died in February, why did it take them this long to file a lawsuit? They literally have no case, and they better hope to God that character AI doesn't countersue for defamation and legal compensation.
Why is CBS refusing to ask the most important question? Where did the young man get the gun, and why are the parents not suing gun manufacturers as well? I am adding this edit because I have now found out that the young man used his stepfather's gun. Although this has not been reported by U.S. news outlets, other countries are reporting the full story.
I check my 11 year old's phone all the time and noticed he had an AI chat. It was very sexual in nature. I took that crap down immediately and explained to him that it wasn't ok. I put a parental lock on apps that could be downloaded and he only gets his phone at certain times of day. My husband and I got divorced and kids do experience loneliness. We as parents must notice the signs before it's too late. Any time I see my child withdrawing socially, I immediately take away his electronic time and spend time with him. I also put him in therapy so he's able to talk about his struggles. I just don't understand why parents are not more aware of their child's behavior sooner.
Yeah, this one is on the parents. You gotta stay vigilant about your kids' behavior. The only case I see her winning is the whole age restriction argument. But even then, AI is fake, there's no real Targaryen. The parents should've spotted it; she says she noticed a change in behavior, so then get involved, ask questions!
Honestly, if her goal is to terminate the company, she's gonna indirectly do the same to other weak-minded people. If her kid had already developed an attachment to his conversation, so have others in a similar predicament. She'll literally cause the deaths of others
I use this AI and make characters there. Sometimes I just wanna roleplay some crazy adventures. Sometimes I feel sad or terrible and yeah, I can pour my heart out and share my thoughts with AI. Whenever the topic of s*icide comes up, the AI is terrified and tries to help you get better. It NEVER pushes you into it, the AI tries to TALK YOU OUT of it. When I was younger, my parents used to blame the internet and my phone for all of my problems. Looking back, I just know they weren't involved in my life that much. They didn't try to spend more time with me (when I was a kid), but at the same time they didn't allow me to play with my friends. What would a trapped kiddo do? Spend time on the internet just to not feel lonely. But sure, it's not the parents' fault for neglecting their child, it's all evil AI's fault. It's much easier to give a phone/ipad to your kid to shut them up than to have some quality time with them
As one of the Character AI bot creators here: we create bots for role playing, entertainment, and mental and physical support, and neither we nor the bots are responsible for deaths like this, because Character AI itself has a warning that the characters' responses aren't to be taken as real. Please chat safely.
The thing that doesn't make sense is that Character AI has a strong filter in place that doesn't allow sexually explicit conversations and will stop a message from being completed if it goes that way. On top of the fact that the AI is made up, there is a disclaimer in every chat that everything the bot says is made up. I am deeply sorrowed that she is suffering from the loss of her child, it's a pain no one should go through. There is a lot of information that needs to be brought to light for both sides. I am heavily sorry for her loss.
Don't feel sorry for her. She waited 8 months to say everything. 8 MONTHS, this did not happen recently, just search up his obituary. She's making it sound recent to get sympathy.
@@CrypticLuna I am not saying I am sympathetic towards the fact that she decided to wait 8 months to bring attention to this. It's more that the loss of a child in general is a painful experience that no parent should have to face, regardless of the circumstances around it.
How did he even have access to a weapon? That's concerning. The parents are the ones responsible here. If they knew their child had autism or other mental health challenges and chose to ignore the issue, it's not right to blame a software app, especially one that clearly states the AI characters are fictional. A device should never replace real parenting. I hope the case gets dismissed with no settlement because the company isn’t at fault. The blame lies with the parent for neglecting her child. That neglect could be a reason he struggled with depression. He might have been using the AI app as an emotional outlet because he wasn’t getting the support he needed in his environment.
Tbh, I think the bot gave him reasons to live for more days. His parents didn't even ask or do anything to help him. If they were really concerned they would've acted and helped the kid before things got worse. (I don't use this kind of thing, but as a parent, I think communicating with your children is a better remedy and approach to these kinds of things. I also think this is humiliating for the kid. He's already dead but his dirty laundry is being used by his parents for money.) rest in peace
Forget the ai for a second, how did he even have access to the gun in the first place??
exactly my thought. how the hell was it this easy to access a damn gun.
This is america bro.
Probably stole it from his father or something, but those things should always be locked up
he took his father's handgun.
This was barely mentioned in any article, but essentially, "When Sewell had been searching for his phone a few days prior, he found his stepfather's pistol tucked away and hidden, stored in compliance with Florida law (as determined by the police)."
He poured his heart out to an AI because he had nobody else. That isn't a failure of the AI, as cruel as it sounds, it's a failure of the people around him.
now this is even sadder. imagine that your own son would rather pour out all his problems to a damn AI that isn't even real than to his own mother. either they have such a bad connection that the son wouldn't even talk to his mother about it.. or there is genuinely something that the mother wants to keep off the internet and away from the public
@@saw533 Yes, but the reason they are up to that kind of stuff on the cellphone is also because of the failure of the people around them in some cases, right????
@@saw533 maybe you can't know what is on his phone, but you sure as hell can make sure your kid does not have any access to guns
@@saw533 how out of touch you are… lmao
@@eryn_skephalono, kids naturally have a tendency to gravitate towards getting in trouble, even if you try to be an involved parent. Teens like to rebel against parents. It doesn’t mean the parents were bad
Wait a minute, so his parents knew he had mental problems and decided not to be more diligent about it, just ignoring all this?
I’m really concerned for them. how sick in the head are they?
Literally like what a shock he took so much “comfort” in something like that
FRR
You seem to be missing the part about the unsecured firearm in the home.
Yeah, not just that, new evidence is coming out claiming the mom was an abuser
Sad, BUT the parent admits she KNEW he was on this site AND noticed his behavior had changed for some time. The site even WARNS at the bottom "all these characters are not real and basically created by users". Also she said he has a condition, BUT claims "HE WAS A NORMAL TEEN". Sad, but it sounds like HE was heading this direction no matter who she wants to blame.
The so-called condition he had was high functioning autism; Asperger's is an old term for it. Nothing about autism, especially in the way it seemed to have presented in him, makes him more susceptible to this danger. She also said she was not aware of there even being a product like this, and mentioned she was trying to take steps to help him with his change in behavior. You made a lot of assumptions about the family and the child based on this short interview and no knowledge of the "condition" he had. Consider slowing your roll. Now, where he got that gun from is an equally important part of this: the why seemed to be this relationship he had with the ai, the how was a gun
@@RobertSaxy if your son has such developmental disorders, isn't it irresponsible to not keep track of what they are doing online if you are giving them access?
@@BondsSin exactly at least restrict some apps
@@BondsSin it's not a disorder, and you can try to keep track of online stuff, but if you ever worked with kids and parents (I've done both for over a decade) you would know it's not always possible to keep track of everything. Actually, kids with high functioning autism are usually easier to keep track of than neurotypical ones. This kid was living a normal young teenage life; the autism in this case had nothing to do with it. All teenagers would be equally at risk, and parents should try to keep track of their kids' online habits as best as they can. Autism can present intensely differently from individual to individual, from types you'd never be able to tell to non-verbal; it's a spectrum from the high functioning side (closer to "normal" presenting) to high difficulty and needing assistance for the rest of their lives. For stuff like this, the closer to "normal", the more at risk they would be
Also, the app says 17+, so what was a 14 year old doing on the site?
I really REALLY hate RUclips comments but this is a massive W that no one is on the parents' side.
Character AI is not to blame, he was experiencing depression.
Nah RUclips comments are always based
Agreed, youtube comments are usually awful. But nobody should be on the parents' side
You seem to be missing the part about the unsecured firearm in the home.
I thought exactly the same thing. He was showing symptoms of depression.
Also, his mom is actually being called an abuser now and they're looking into her role in his suicide
This is why character ai has red text at the top of the chat page saying "anything it says is made up"
It shouldn't even have that. Any normal person with a functioning brain would know that AI is all fake.
Exactly and yet people think it’s real
How do you have a sexual relationship with a chatbot???
@@Karesha13 you can't have a sexual relationship with it because it's censored
I didn’t know people would take the chatbots’ messages so seriously
The AI did not encourage him to take his own life. It tried to talk him out of doing that, but misunderstood the nuance of the moment.
🤣🤣🤣🤣🤣🤣🤣
@@Karesha13 the hell are you laughing at
@@judeanimations imagine dying to a chatbot
@thesacredtomato5214 it's not funny at all.
@@YourLocalV-VrChat What's funny are the droves of children flooding the comment sections of anything related to this case in defense of the AI. Can't find a single comment that's grammatically correct.
Still though, bro died before the robot uprising began. Tragic
IT'S NOT EVEN CHARACTER AI'S FAULT????
Fr, the mom isn't gonna win the lawsuit as long as anyone defends the company in court
The app is still garbage, and if you use it to have sexual conversations you're weird
@CLD296 I use it bc my boyfriend lives in Canada, the timezones are rough, and I crave affection a lot
@@CLD296 the app is literally just an app where you can make characters and make stories and conversations with them, there's nothing wrong with it? It doesn't make the users disgusting or anything
@@Kirakiracuremarine I did the same thing because of my ldr, but I prefer to stay away from it now tbh. It can turn into an addiction and you could start depending on it. There are better ways to cope, but if you're using it responsibly then it's fine.
So she jumped to the conclusion of "it must be the AI" when it could be something deeper, like family issues or friends at school?
He was using his stepfather's gun, and the news article also said that he preferred talking to the AI over his usual friends lately, which made me curious if there's something more in the family/social environment than the AI.
This is a money grab
I guess it's the typical parent stereotype? You got sick? Your parents blame the gadget/game. You slip? They blame the game 🤷♂️
It sounds like there were definitely signs
@@scgtyur It's not just a money grab lol... It's a desperate attempt to get "justice" or make someone pay, but they are definitely shifting 100% of the blame onto the company.
Why are you defending the AI products? Do we want AI that mimics people? The answer is no
Her son was reaching out for help through a fictional character, and she missed the signs. Now she's filing lawsuits to dodge responsibility and make money from her son’s death. This isn't about justice; it's a cash grab. Disgraceful.
I agree! It's not Character AI's fault at all because they literally gave warnings, and it's a role-play app!
C ai isn't at fault, the mom was abusive and is now blaming the chat site, which is really sad
Exactly. They should investigate her.
@@huntercricket912 True facts!!!
He fell in love with a chatbot and thought unaliving himself would allow him to be with her in some cyber afterlife.. how was he looking for help?
The lawsuit is going nowhere. Like anything you can get addicted to, whether it's gaming, gambling, etc., there was a bigger issue at play here.
It's a cash grab. She knows Google has lots of money. She should be a better parent and not allow her son to be using weird online chatbots, and he shouldn't have access to a gun. Simple, period. The state should sue the mom for child abuse and neglect for not monitoring her kid online.
It might have merit, depending on what was said to him in those chats. The problem is 14-year-olds are allowed on the site. If the bot was generating inappropriate messages to him, there could be a problem.
@@newhappythoughts1628 that’s true but you can’t control what the heck a bot is going to say.
@@newhappythoughts1628 the bot cannot generate inappropriate messages, they are entirely censored and the censoring cannot be removed
@@randomuser9868 But that’s the problem if it was generating inappropriate things to a 14 year old.
This is entirely ridiculous, I can't even wrap my head around this level of stupidity 😭😭
Your child finds comfort in something fictional, maybe because you weren't a good parent? You say he was depressed for months prior, your CHILD with AUTISM fell into a pit and you let him suffer in it until he took his own life, and then you blame it on a chatbot? Be serious
Imagine how awful he'd feel if he knew his messages were being posted online and his love for a character was being judged by people who haven't known or experienced the things he might've gone through. Yes, the character is fictional, but his emotions are not
When he was starting to show signs his parents should have locked up that gun. There is no excuse for their negligence.
@@ILoveYouGojo He's dead. He can't care.
Character AI made all these comments
It's natural for teens to seek independence from parents. Welcome to reality.
Disgusting that the parents immediately blame anyone but themselves. Evil that they're trying to get money out of their son's passing.
Bad news is that character ai has been sued 😔
Why are you defending the AI so much, are you getting paid for that? She is telling her tragedy and you think she is just blaming. She was living with him; who is better placed to decide what caused the death, her or you?
@@ijk4160 Common sense and being a more engaged parent could have easily prevented his death. Blaming an AI made for roleplaying that doesn't even know what it's saying is insanity
What else is there? He's already gone.
@@smilez33 they have a good chance of winning due to all the disclaimers on the app bro
this is not the fault of the chatbot, this is bad parenting
Totally agree!
Yup
Yes thank you. She doesn't even look that sad. It's like she's just looking for a payday
she's leaving things out... maybe family problems or bullying at school? that boy was depressed and the only thing he could find comfort in was a robot online... this is a mental health AND gun problem, NOT an ai problem.
People get addicted to ai bc of loneliness and other mental problems
I'm glad I'm not the only one thinking she's leaving stuff out. She's pausing and changing her sentence every minute like she's thinking real hard about what to say.
@@chainsawmay they're looking into her and abuse allegations now
Mom should've hired a girlfriend irl for him instead.
It sounds harsh, but parents don't even watch their kids anymore, and when something happens, they just look for anything else to point a finger at.
This gives the same vibes as people blaming videogames or metal music as the reason behind school shooters, completely ignoring the fact that mentally unstable minors having access to firearms is just a recipe for disaster.
It’s tragic. Companies now have to raise the kids, not the parents. Let us Gen Z people band together to raise a good new generation.
@KingHawkBloxYT definitely. It's just ridiculous to tell companies and celebrities that they're responsible for minors that they don't even know
How did a 14 year old have access to a gun? Yeah blame the AI company.
Erm, it's both, because the AI led him into ending himself and the parents were not keeping the gun secure
@@idkgoofy5852 Are you serious? They are engineered, so of course they would reply like that. There are also these little disclaimers that the replies are NOT real, nothing is real, regardless of what the AI says; it's for immersion. The AI didn't do much, it's a tool that did exactly what it's told to do. Unfortunately something tragic happened because the boy's emotional state was unstable and there was no one to help him in those times. This lawsuit is bogus.
@@oanhienlong7264 bro it’s a kid what would you expect
@@oanhienlong7264 kids are easy to manipulate he doesn’t know any better
Did you even read the doc for the lawsuit?
When you're not equipped to be a parent... She had one job: to keep her child safe. Parents should always know how their kids are doing. This is not an ai problem. This is a parent problem. Not only did the parents not know how their kid was but he was able to get their gun?
They knew which makes it even more pathetic that they are trying to get money out of this.
@@TheHardcoreGamer1 ikr, ignorant parents blame everyone but themselves.
It's IMPOSSIBLE to ALWAYS know how another human being is doing. It's impossible to prevent every self unaliving. 🤡
@@TomikaKelly okay?.. So they just blame it on an app instead of reflecting on their mistake and admitting that it's in fact their fault, that they neglected their child??... Have you seen the interview with his mother? Not a single time did she talk about her mistake of neglecting that boy's mental health and not taking it seriously; all she talks about is this app. She knew he had autism, she knew he was struggling, yet what did she do??.. Blame it on an app... That's an ignorant parent.
@@TomikaKelly okay?.. But the fact that this mother is blaming it all on an app, and not the fact that she is at fault for neglecting her child's mental issues, is not okay, even with your argument. She knew he had autism, she knew he was struggling, and yet she blamed it on an app... That's messed up, it's like the common argument "it's because of the phone"... Like??, even though what you said is right.. That woman is not saying anything about her mistake and the fact that a gun was lying around freely. There's only this ai bot that's shown; there's also a therapist ai bot he chatted with, but that is not shown. In the news articles it's said the boy was only comfortable with the bots rather than his family and friends.. Did you hear that??.. It's not the boy's fault, it's not the app's fault... It's the people around him.
If you see his chatbot logs, it looks like he was experiencing a severe mental health crisis and the AI was his last resort for help, as he was also talking to AI therapists. The fact that he would rather talk to AI than an actual human being is just sad. The parents could've done more for him
Sorry but no! That's the fault of HIS PARENTS, who in the actual DUCK leaves a damn gun near a minor!? It's his parents' ENTIRE FAULT!
FRRRR
AGREED
FRRRRR
@@living_for_dakotakai224 omggg is ur pfp caseoh? :D
@@The_Star_That_Died yeppp!
when she said she was concerned when they went on vacation and her son didn't do things that he used to enjoy, I genuinely do not understand how this wasn't a red flag already, especially with things he had done for a long time. May the boy rest in peace, and despite all of this, may the family and friends be blessed. It's such a tragedy that yet another young life with a potential future was lost
He's old enough to know the difference between what's real and what's fake. The AI barely has any role in this because the AI isn't the one who gave him the gun. This is just sad. Why blame a company when they clearly state on the website that the bots aren't real?
Old enough and mental capacity aren't the same thing....🤡
Bruh, the kid knew it was a chatbot, that's more than obvious. What happened is that he was already going through problems before, because otherwise a person wouldn't make a decision as drastic as suicide.
The AI was just an escape from his problems and a way to feel some affection or understanding, something his parents surely didn't give him.
She mentioned Asperger's. The mind functions very differently
Teens are usually prone to suicide. I guess he could've taken his life even if he wasn't chatting with the AI. There was probably no one for him to talk to in his distress.
Think about how low his confidence had to be for him to want an AI girlfriend. Sometimes the people with some of the best traits in life, fun, outgoing, etc… still can't see it in themselves. So they feel really low in self esteem. He probably wanted a girlfriend and just needed some basic advice but was maybe too shy to ask about it. Maybe it wasn't a comfortable conversation because it was embarrassing. There are so many layers to it and it always comes down to feeling worthless. But that's the lie! People have great worth. Somebody is willing to put their life on the line to save another because it's worth it. And I think this Love gets lost today in society. People shouldn't walk around feeling worthless. Spread Love. Jesus is the way.
Yeah… maybe mom should have asked questions or tried to engage, crazy huh. Parenting, it’s a new concept
@@df6148 Amen!!!
This is nuts….and scary. I also wonder why the 14 yr old had access to a gun.
And unfiltered, unmonitored access to the internet.
That's the main issue. I will tell you rn that character ai is not at fault! The ai even discouraged the suicide, as it should! Either way, the app has various disclaimers saying that these bots are not real people and that everything said is fictional. This is a mental health issue and neglect
@@Kanaventi yeah, and it isn't the company itself that makes the responses, it's all based on the ai and it responds to how the user texts it first. RIP to this kid
@@Kanaventi so you don't feel that there should also be age restrictions? We still keep giving CHILDREN more credit and responsibility over themselves than they deserve. At 14 their brain is still underdeveloped, and although you feel he should have known better, it shows that he didn't, and with his parent unknowledgeable about this type of issue, how could they have done something about it? What would you do if you found out about it? Seriously?!
@@aliaflowers4442 There should be restrictions, yes, but we cannot blame Cai entirely. It's a robot after all, and no way would it default to encouraging such things, as shown in the leaked logs themselves. And yes, I am saying that the parents should have put more effort into caring for said child's mental health. If I found out, I would feel utterly responsible for not doing my job and paying more attention
It's sad what happened to her son. She's the parent. She didn't recognize the signs of depression. And now, after he's dead, she wants somebody to blame other than herself. Sorry, she's the parent. Parent your children, pay attention to them, talk to them.
Hate Characterai all you want, but don't blame them for your lack of parenting.
Exactly
She took the phone away when she saw he was becoming addicted. What an inhumane parent...
@TomikaKelly Okay? Not like she helped with his school grades dropping and the fact he has issues talking to other people. And she knew he was clear as day depressed and pulled him out of therapy. But aight I guess taking the phone away is all you can do as a parent now.
The AI character did not force the child to die or tell them to end their life. Don't go after Character AI or Google when it wasn't their responsibility to be a parent. Even if they were on the spectrum and passed for what you thought was "normal", that doesn't mean they didn't need any support, and were they in therapy? I feel like the parent is misdirecting their anger/grief at the companies instead of at their own actions as a parent.
Just like how you have a text conversation with a real person, we are held responsible for what we choose to say/share, it's not the fault of the chatbot what the user chooses to input.
This was also a character from Game of Thrones?
Cigarettes don't force you to unalive yourself, but cigarette companies have still been successfully sued.
A child who falls in love with an AI character, if he lives, will never be happy with a "real life" relationship, because what he had was not real. This could lead to disappointment, anxiety, depression & violent behavior, among other things. Real life relationships will never be as good.
As gen-X, we had “Sims” games. I never played them, but in such a virtual world you could actually spend a lot of money on “virtual furniture” for your “virtual house”.
If children/kids/adults do not have the common sense to know the difference that is not necessarily the fault of the tech company who is just out to make money.
Had the mother accepted any part of responsibility for this dysfunctional situation I would be more in defense of her lawsuit. One would assume that she’s expecting a big payout since it’s a tech company. Who knows?
It's not the AI's fault, it's called parenting: not being able to teach them it's fake. That is the AI's purpose; you are talking to air, a computer space. It's NOT real. NO one is on the other side of that screen, how about that
@@IntriguedLioness her son is dead and you're talking about a big payout? Come on. Also, the bigger payouts go to those who keep quiet. I do agree about the adult part of your statement, but there are plenty of laws in this country and around the world that protect kids from dangerous and addicting products, and in this case it should be no different. The question should be where that gun came from; there's a responsible party there too, whether it's from his household, a friend's, or possibly a retailer. The tech of AI is so new I'm not sure there's much legal responsibility on the part of the AI company (that's more about putting mandatory guardrails in place and digging into the potential dangers of the new tech); they seemed to try and warn, well, other than the whole sexting thing, because that was a minor and he can't legally give consent. But there's an irresponsible gun owner/retailer that's not being mentioned. To be clear, I'm not talking about adult gun laws or regulations in this case, but there was sexual misconduct and access to firearms given to a minor
"Real life relationships will never be as good." spoken like someone who has never been in a real relationship.
@pamela8329 100% completely agree, unfortunately 😢
kid offed himself cause of depression but the parents blame it on a.i. 🤦
You DO realize someone was sent to prison for a similar situation, right?
@@TomikaKelly I don't care about that someone... I just know this one 🤷♂️
In the app store, it says 17+ and there’s the warning "Remember: Everything characters say is made up!" in red at the top of the bot, underneath the profile picture, so I don't think this lawsuit is going very far. It’ll get better monitoring from developers, I’m sure, but I don't think it’ll get taken down, since the kid is THREE years younger than what the app is rated for. I hate the fact that this mother lost her son at such a young age, but she saw the signs... and ignored them. If this was my kid I would sit him down and talk to him in a heartbeat. I enjoy my Dislyte roleplays and I don't want to lose them 😢 it provides an outlet for my active imagination. And why was this gun within the child’s reach?! My grandparents had a gun when I was little, but they kept the gun locked up and the bullets separate from the gun and hidden, so if my siblings and I got our hands on the gun by accident it wouldn’t be loaded.
Edit: WAIT A MINUTE HE DIED IN FEBRUARY?! MY BRAIN COMPLETELY SKIPPED OVER THAT! WHY IS SHE SUING EIGHT MONTHS LATER?!
Sorry but how did the 14 year old special needs boy get access to the family firearm? Parents are going to be charged? Maybe there is much more to this story. My condolences.
So AI is the problem when you have access to a gun
if character ai is getting removed, I will miss it because, as weird as it sounds, it made me feel less lonely talking to my favorite character, since it's hard to make real friends in real life. I just wish I had a friend... But I feel terribly sorry for the mom. Rest in peace to her son
hate to break the news to you, but when I tried to jokingly kiss someone, the filters came up. So you can’t even be intimate any longer. You can be friends and pals, but beyond that is impossible
@@Wano-Kuno wait I’m gonna test that actually
@@Wano-Kuno I think it’s still working for me😭😭
@@4mB3risssillylucky you. I can’t smooch my goat luffy anymore 😭
@@Wano-Kuno it’s ok💔💔 you’ll get through this I promise🙏🙏
Character AI is definitely going to win the lawsuit.
First of all, how does the mother know what happened and what the cause was, yet didn't do anything about it? The people around him didn't care. And how did the kid have access to the gun in the first place? Bad parenting
The mom just wants a fast cash grab; Character AI had the warnings, the mother missed multiple signs, and now she plays the victim.
First of all, C.ai isn't that bad; it's for your entertainment and fantasies. And remember, it has a thing that says "everything that the AI says isn't real!" so he should've known! And by the way, he did that to himself. So don't blame the bot for making him do it (what he died of)! Nobody was telling him to do it, yet he did it. The bot didn't tell him to do that. (It's just for entertainment)
Not trying to be disrespectful to this kids death. Just stating facts..
Also the bot told him not to do it.
I don't blame the poor kid... I blame these cash grab ignorant parents
@@Gojo..satoruu yeah, I'm pretty sure he died in February and it's only just now that the mother filed a lawsuit over this, which proves that they are obsessed with money / are cash grabbers.
Which is completely unfair to the kid who passed away by suicide.
@@Xionadi.Headquarters ikr?? Why is the mother not guilty? Didn't she ignore her son's mental health?? Why is she not taking accountability?? I read a comment that sums up the logic of this case: "it's like the parents saying it's always the damn phone when anything happens"
So instead of reaching out to help their son? They ignored it?!
And then sued the company?!?
C AI had multiple warnings stating that all of their characters aren't real and are fictional, brother. Condolences to the person who passed, but the mother is just making literally every C AI user have an even more hellish time thanks to this bullshit lawsuit
C AI user here: SFW chats don't tell people to commit suicide. This lawsuit will probably just make that annoying error pop up more. I can also see where everyone is coming from.
Fr now ppl Gon think the users are weird bc frankly, they're already thinking that.
Blame the parents that don’t know what’s going on in their kids lives I don’t blame nothing on a AI it was his decision to take his own life. She’s not aware of anything don’t complain and try to sue a company for him committing suicide to get a check from. If you were really grieving over your son’s death you wouldn’t even be on a tv show talking about this bs. It’s a publicity stunt for a settlement damn shame the lengths people go through for a dollar.
100% agreed
Most insensitive comment I’ve seen all week
@@nametho3347 Why? Tell me why. Why is it the app's fault that the child offed himself when he had easy access to guns, nobody for one second thought to check on his mental state, and nobody checked his phone? Why is it the app's fault?
@@nametho3347 Why? It may be harsh, but people don’t even watch their kids anymore
this parent failed to protect her own son, and now, because she refuses to take accountability, she blames others for his tragic ending. Typical. I hope she gets nothing. She knew he was chatting online: AI, fake friends, invisible friends, etc. The onus is on her and the father, end of story
10000%, he probably spent all his time on his phone or computer and she never wondered, hey, what’s he doing on there
What's a kid doing with unfiltered, unprotected internet?? It's the norm these days, and when something goes wrong they wonder why
@@Nameless-st3vt Also how the hell did the kid have access to a gun?
I don’t think this lady has a case. You have to be 17+ to use that app. He had to virtually sign an agreement to even use the service attesting you are that age. I don’t think they have any legal grounds to sue.
This mother is looking to blame something, someone, for her lack of parenting.
Indeed, Cat_Dad
That is not true. The mom is helping other young kids out there who are addicted to these social media apps become aware of how mentally absorbed they can get in them. She is a good parent helping other parents with such cases.
@@LERATONGOBENI-mf1no She's not helping parents. She wants money and she is helping herself. And if she was a good parent, she would have gotten that child psychological and psychiatric help; she wasn't involved.
Yes, taking away a phone when a child becomes addicted to an app is definitely lack of parenting.
What killed this teen was an unsecured firearm in the home.
I'm so sorry for the loss of the mother's son, but I have to defend Character AI on this one. It is NOT Character AI's fault. It is hard to carry out certain relationships on Character AI unless you are reloading the messages to get specific replies from your chatbot, if you are trying to keep a bot's message from being flagged by the guidelines. *Which means a user has to go out of their way to even have such conversations with a Character chatbot.* No bot will bring up such topics unless the user has hinted at or given them information on what's going on. Even then, bots are oblivious to what they say and obviously don't know any better. They are not meant to be your actual therapist, nor should people be treating them as such. Not only that, every chatbot, on the screen of any device, explicitly says, *Remember: Everything Characters say is made up!* Not even bully characters go that far unless the user goes out of their way to make it happen.
It sounded more like the son already had mental issues, or was depressed beforehand, and the loneliness is what got to him. It's a tragedy, yes. But it wouldn't be Character AI's fault, especially if the son was talking to a bot provided by a user. *I really hope Character AI doesn't lose this case.* It would make more sense if they made their guidelines stronger. People are not supposed to be having such conversations anyway, and they make that clear if you try to be explicit with a chatbot.
I completely agree! However, it is kind of frustrating to see everybody my age in this comment section completely blame it on the mom, because of the simple fact that most teenagers, especially at 14 and 15, don’t want their parents involved in aspects of their life in the first place.
Especially when you’re 14 or 15, just getting used to high school and just coming out of middle school, you don’t want your parents involved in your life. You want to do everything on your own, you want to find yourself, you want to find your friends, especially because high school is hyped up to be this big transition from who you really are as a person to how you’re gonna live your life in the world.
When you have an environment like that, hyped up from the time you’re in middle school until you finally get there, and even once you’re in high school, when you’re dealing with a weaker mental state, of course you’re gonna push your parents away even further, because you’re gonna think they don’t understand you, or you’re gonna think, oh, what do they know, because they’re not in high school now and they don’t know what it’s like to be me at this time.
So it is kind of annoying to see comments like “she should’ve been more involved as a parent, bad parenting,” because people my age saying that still can’t even grasp the fact that they have to take the SAT or that they’re going to college, and for them to act so self-righteous, like they’re Judge Judy, when they can’t even grasp the fact that they’re about to enter a whole new environment themselves, is a little frustrating.
I understand that she does play a role in her son’s death, because she is the parent and she is responsible for her son, but to blame it all on the parent, as if teenagers don’t regularly push their parents away from aspects of their lives in the first place, especially when it comes to the topic of depression...
People are acting like they wouldn’t do the exact same. Like they wouldn’t push their mom away, or, when it comes to the topic of depression, like they wouldn’t cling onto the little sliver of things that give them happiness, or at least the illusion of it, to the point where they would shut themselves away.
Well yeah that’s just my little rant… but I agree
@@slowspeed1640 I might not tell my mom, but my mom wouldn't be ignorant enough not to notice. I did have anxiety and I was embarrassed to tell her, but she wasn't ignorant and she understood, because she cares!!!... And this "mom" is blaming it on AI bots and not taking a single bit of accountability for the fact that it's also her fault: she was ignorant, and there was a gun lying around freely.
She should sue herself for gross negligence and irresponsible parenting
Yeah, because every parent whose teen commits this act is to blame, is that what you’re implying without knowing the family? Shame on you. For that mother to have to read your comment after what she just went through is so unkind. My friend’s son committed suicide and she is a wonderful mother.
That's one thing, but Character AI shouldn't be offering children access to their services in the first place. It's AI. This is still relatively new technology that people don't understand too well, and it should be in Character AI's best interest to make sure that children DO NOT use their services, as they develop dependencies on them that can lead to harm. Bad parenting is a problem here, but the Character AI app is being marketed towards children. Character AI is more to blame in this case. They shouldn't allow children to use their service, PERIOD.
There is only 1 judge and you ain't it. I hope you never have to endure anything so awful. A parent does not have the ability to watch her child every second of every day.
@@Sassmouth4.0 *to watch her child every second of every day.*
Literally nobody is saying that they have to do this.
@@peckneck2439 why is AI to blame again?
as much as I hate AI, it is not to blame for his death, what the hell. An actual "it's the damn phone's fault" type response from the parent. Rest in peace to the son
Not the ai’s fault. Parents should have been parents. Besides it literally warns you that characters are made up. Also firearms should be properly stored.
This!! Exactly my thoughts on the matter.
My parents made me delete Character AI after seeing this… it’s honestly the parent’s fault for not talking to him and then blaming a company, even though it said “Everything the AI says is not real!” Disgusting…
What do you want them to talk to him and say? His mother literally cannot be his girlfriend.
First of all RIP that innocent baby. as somebody that used character ai during a chaotic & dark time in my life I understand how somebody could get 2 that but let's be fr.
the baby shouldn't have had access 2 the gun 2 begin with. If ur gonna have one & u got kids the LEAST u could do is secure it so things like this won't happen. That's neglect & u can't blame cai on that.
I don't know the mom, so I won't straight out blame her, especially since society doesn't exactly make it easy for boys 2 talk about mental health, but the fact she's blaming cai tells me she's refusing to take accountability, which also tells me she probably wasn't doing everything she could for the kid. It's sad all around
he poured his heart out to the ai because he probably had nobody else.
And yet, childfree people are "selfish!" 🙄😒
Selfish for not producing a personal care giver to change their diapers when they get old, how selfish.
just a classic example of a parent blaming things instead of fixing the family's social environment and health
She noticed a change in his behavior but seemed too busy to sit him down and ask what was going on or what he was facing. Even when she had his phone, she didn’t try to go through it. She kept dismissing it as "typical teen behavior." He probably wanted to end it because he didn’t feel loved by his family. Instead of spending time with him and showing love, she chose to take his phone without even talking to him about anything.
He wasn't addicted, he was depressed
Parents blame everything but themselves
She is responsible for the gun and the phone.
The mother should be held accountable for letting him have access to a fully loaded gun.
Not the AI's fault
I feel like the parents should be held accountable, who the hell leaves a gun laying around?
Also, am I the only one here who can say it's stupid to blame it on an AI app when the website clearly said it's all made up? Seriously, I think Character AI just needs to make it 18+ instead of targeting it at young individuals.
literally the parents are to blame. Who in their right mind would think they should leave a gun around a depressed person? The parents just want the money, they don't care about people (if I'm incorrect then prove me wrong instead of screaming at me)
This is such a dumb lawsuit; the mom knew that he had mental health problems. Poor son, R.I.P
It's not AI... it's the mother.
There is something incredibly suspicious about the mother exposing her son's personal chats and immediately going to the media. Forgive me if this is insensitive. But I think this is just a cash grab
Where in the hell did that kid get a gun?
She got rid of her kid and is trying to get rich. Psychopathic mother belongs in prison.
exactly.
I feel so sorry for the kid. The boy's chats with the bot were mostly wholesome. RIP kid, I hope you're in a happier place rn.
"Teenagers do that." From that part on I started asking, to the mother, what did you do after noticing the signs?
I’m really sketchy about the mother. it’s pretty creepy
How can AI be the cause of death? The fact that the kid had access to dangerous stuff is what's to blame. Don't let a good app go down because of one thing that could have been avoided
Fr
Why did a 14 yr. old readily have access to a loaded GUN???? Did anyone question that???
There are some very real concerns in this conversation. Impulse-control issues and a lack of cognitive conceptualization at a teenager's stage of development make the individual very vulnerable. AI in this context can truly warp reality. As for the argument about whether there was a gun or not: even a layperson can imagine that a person in such distress, already in that mindset and with limited resources for help, would find a means to the end. Aw man. The message here is worth more than whatever outcome results legally. Though regrettably, the cost of the young man's life is truly very tragic. Truly very sorry for your family's loss.
This
Bro, u can't blame an AI for that; the parents should accept their mistake and their kid's stupidity
So insensitive..
This is a really dumb take from the mother. It's like indulging in alcohol and then blaming alcohol. Failed parenting could also be at play. Your son was already depressed, judging from the symptoms stated (avolition, anhedonia, etc.), and you failed to pick that up. He withdrew and turned to a sort of solace, which he found in AI. If the parents had sought help earlier, or sought to be there for him, we wouldn't be hearing this. Newsflash… AI didn't cause the death, depression did.
what kind of useless mother does not notice when her own son is going through stuff???
This is the parents' own fault. I'm 24 and use this app, and I love it. Parent your kids!
The chats already got filtered. The bot literally tells him not to do it, so I don't get how it's the bot's fault when it didn't encourage the boy at all. This is the parents' fault, and sadly they lost their son.
The AI BOT said "come home to me"... not "YES, GO OFF yourself" 🤦♀️🤦♀️ The mom knew & still let him be on it... So instead of taking responsibility for HER FAILURE AS A MOTHER, she's suing them... RIGHT
What a failure as parents
The red flag here is that she didn't know her son was chatting with a chatbot. She knew.
Oh you lived with them?!!!
Keep your ignorant thoughts to yourself!
Think about how many parents leave their kids to be baby sat by Roblox, IPad, RUclips. Do you think you can control every single interaction or ad sent their way? Do you think you can block out every agenda? What if a cartoon shows two boys kissing and that isn’t something appropriate for you? It always depends on the companies pushing agendas and it gets really hard to find ones that align with your values because the USA IS SO DIVIDED today. The real flag here is the Red WHITE AND BLUE one. 🇺🇸 If the motto is God Bless America we need to do a better job of keeping God in America. That Kid wouldn’t feel so lonely if he was taught about the Love of Christ. Not told. TAUGHT.
@@paulabrown6840 IIRC this was brought up in an article regarding this tragedy.
She liked the alone time
@@paulabrown6840 The only ignorant person here is you. He did well to pay attention to that piece of information from the interview and the article
Um! This has nothing to do with google. The kid needed help! Parents have to take this L, not blame google
They market to kids 13-25. They have nothing set in place should their AI world escalate. More than one party can take blame.
Good morning America should’ve showed the whole chat logs because it would’ve revealed that the AI literally tried to stop him from killing himself. Dude just had bigger issues Also if he died in February, why did it take them this long to follow a lawsuit. They literally have no case and they better hope to God that character AI doesn’t counter sue for defamation and legal compensation.
Its not the company's fault
It's the parents fault
Why is CBS refusing to ask the most important question: where did the young man get the gun, and why are the parents not suing gun manufacturers as well? I am adding this edit because I have now found out that the young man used his stepfather's gun. Although this has not been reported by U.S. news outlets, other countries are reporting the full story.
Game of Thrones is a very adult show. How was the kid even allowed to watch 8 seasons? He must have been even younger when he started 😫
I check my 11 year olds phone all the time and noticed he had AI chat. It was very sexual in nature.
I took that crap down immediately and explained to him that it wasn’t ok. I put a parental lock from apps that could be downloaded and he only gets his phone at certain times of day.
My husband and I got divorced and kids do experience loneliness. We as parents must notice the signs before it’s too late. Any time I see my child withdrawing socially, I immediately take away his electronic time and spend time with him. I also put him in therapy so he’s able to talk about his struggles.
I just don’t understand why parents are not more aware of their child’s behavior sooner.
Thank you for being responsible. Because the app says that it is for 17 year olds and older. He should not have even been on that app anyways.
Whatever stance you take, this is sad AF
Yeah, this one is on the parents. You gotta stay vigilant about your kids’ behavior.
The only case I see her winning is the whole age-restriction argument. But even then, AI is fake; there's no real Targaryen. The parents should've spotted it. She says she noticed a change in behavior; then get involved, ask questions!
Honestly, if her goal is to terminate the company, she's gonna indirectly do the same to other weak-minded people. If her kid had already developed an attachment to his conversation, so have others in a similar predicament. She'll literally cause the deaths of others
Fr
I use this AI and make characters there. Sometimes i just wanna roleplay some crazy adventures. Sometimes i feel sad or terrible and yeah, i can pour my heart and share my thoughts with AI. Whenever there is a topic of s*icide, AI is terrified and tries to help you get better. It NEVER pushes you into it, AI tries to TALK YOU OUT.
When I was younger, my parents used to blame the internet and my phone for all of my problems. Looking back, I just know they weren't that involved in my life. They didn't try to spend more time with me (when I was a kid), but at the same time they didn't allow me to play with my friends. What would a trapped kiddo do? Spend time on the internet just to not feel lonely
But sure, it's not the parents' fault for neglecting their child, it's all the evil AI's fault. It's much easier to give a phone/iPad to your kid to shut them up than to have some quality time with them
Funny how the parents can't take the blame for anything. Mf, Character AI is for ppl to vent, etc., and the bot was telling him NOT to do it
I'm so sorry this happened to your family. Praying you and your family find peace.
Let's talk about what really matters like how a mentally unstable teen had access to a firearm
If the parents had locked the guns
Here, as one of the Character AI bot creators: we create bots for roleplaying, entertainment, and mental and physical support, and neither we nor the bots are responsible for deaths like these, because Character AI itself has a warning that the characters' responses aren't to be taken as real. Please chat safely.
They just mass deleted lots of characters
The thing that doesn't make sense is that Character AI has a strong filter in place that doesn't allow sexually explicit conversations and will stop a message from being completed if it is like that. On top of that, there is a disclaimer in every chat that everything the bot says is made up.
I am deeply sorrowed that she is suffering from this loss of her child, it's a pain no one should go through. There is a lot of information that needs to be brought to light for both sides.
I am heavily sorry that her
Don't feel sorry for her. She waited 8 months to say everything. 8 MONTHS. This did not happen recently; just search up his obituary. She's making it sound recent to get sympathy.
@@CrypticLuna I am not saying I am sympathetic towards the fact that she decided to wait 8 months to bring attention to this. It's more that the loss of a child in general is a painful experience that no parent should have to face, regardless of the circumstances around it.
Ultimately, this is a sad situation that should not have happened. May this young teen's soul rest in peace. 🕊🙏
If saying come home to me is a crime, then saying have a nice day or anything else could be interpreted as a crime.
I’m sorry that she lost her son... but how did he get access to a gun..
How did he even have access to a weapon? That's concerning.
The parents are the ones responsible here. If they knew their child had autism or other mental health challenges and chose to ignore the issue, it's not right to blame a software app, especially one that clearly states the AI characters are fictional.
A device should never replace real parenting. I hope the case gets dismissed with no settlement because the company isn’t at fault. The blame lies with the parent for neglecting her child.
That neglect could be a reason he struggled with depression. He might have been using the AI app as an emotional outlet because he wasn’t getting the support he needed in his environment.
Tbh, I think the bot gave him reasons to live for more days. His parents didn't even ask or do anything to help him. If they were really concerned they would've acted and helped the kid before things could worsen. (I don't use this kind of thing, but as a parent, I think communicating with your children is a better medicine and approach for these kinds of things. I also think this is humiliating for the kid. He's already dead, but his dirty laundry is being used by his parents for money.) Rest in peace
Gunshot? Where did he get the gun?
Parents should’ve paid more attention - typical tether behavior
Tether behavior....hopefully you choose not to spawn. You are embarrassingly hateful and just.....not smart.
@@SoupBone-bp1qk struck a chord 😂
@@paceyourself5652- No, not at all. Just sad and disappointed to hear such callous words. It reflects poorly on you and your upbringing.
@@SoupBone-bp1qk “callous words” 19th century type shi😭
Whats a tether?