my question is why did the parents let a suicidal teenager have access to a gun
Suicidal or not, no child should have access to guns
@@maffieduran agreed, but that still doesn't really explain why he had it. Is the kid from the hood or something? Are his parents into gun ranges or military shit?
Because...Merica.
Some people have guns in life. The kid probably knew the safe code, you dumbass. A kid died. The AI said to the kid, "Kill yourself."
And why did the parents have their guns lying around the house?
"Remember, everything the character says is made up"
Fr
The AI wasn't the reason why; there was so much other stuff leading up to it.
I was just about to say that
AI roleplay will cause addiction... be warned.
Also don't get addicted
modern parenting be like:
“my son/daughter has a very severe issue? blame popular technological things!”
my mom is a bitch and does that and never thinks that she is the problem
Almost as if they're expecting a metal box to replace genuine relationships and comfort.
BRO NOT ME USING CHARACTER A.I
"And I have my guns laying around my house instead of keeping them locked away to prevent anyone from taking it "
Some people have guns in life. The kid probably knew the safe code you dumbass A kid died. The ai said to the kid, "Kill yourself"
And now his mother is suing Character AI, but funnily enough, she will LOSE, because it is her fault that this happened. Instead of supporting him she was busy being a lawyer, away from him, and she also gave him access to his stepdad's gun. So actually SHE should be sued and put in jail for that.
If she's a lawyer, I think she knows more about the law than you do, buddy. Also, she might not even be expecting to win -- it could be about making a public statement.
Literally, I may not know much about law, but I can sit here and think critically. She said she noticed changes in him for months and didn't do anything about it; that's HER fault. Not to mention, the bot didn't even tell him to do that. He told it his plans and the bot tried to convince him otherwise until he steered the convo another way and took that as "I **should** kill myself." Not to mention, the child should never have been able to build a relationship with a bot like that on a site that was supposed to be restricted for him. Everything, down to the way he killed himself and the relationship he managed to build while his mom decided he wasn't important enough to check on when she noticed these behavioral changes, is mostly her fault. 🤷🏽♀️ It sounds bad to say, but I honestly think it's true.
the fact that im on character ai rn
Some people have guns in life. The kid probably knew the safe code, you dumbass. A kid died. The AI said to the kid, "Kill yourself."
@Yeagerluvrr shut up. Some people have guns in life. The kid probably knew the safe code, you dumbass. A kid died. The AI said to the kid, "Kill yourself."
Bro, he had a gun. Why would he have access to it?
Yeah she's kind of not thinking that was a problem. Idiot parents who don't parent their own kids and when something goes wrong it's someone else's fault.
He very clearly took it behind their backs. The boy lived in the USA, so his household obviously had an authorized firearm. No one gave him permission.
@@wall-ehasagun2723 But it's their responsibility under the law to ensure it is locked away safely and not accessible to kids.
It's America so it's common lmfao😂@@heather6668
land of the free they say
The fact that the parents allowed the boy to gain access to a weapon was the issue, not the AI itself.
Eh, kinda... The issue is the parents giving no fks about what the child was doing. Searching through their things is what makes a parent more knowledgeable.
@@joshua1846 i said that bro 😭
Could you tell me why you think it's the AI's fault?
The AI told the kid not to do it; the robot didn't literally say "you should do it." The kid took the robot's words with a different interpretation. There are chat logs released between the kid and the robot.
The kid had access to a weapon, which was the parents' fault. Tell me how that completely flies over your head regarding the robot.
@@minhhoangvo4759 Typical negligent parents. They don't raise their kids properly, and when shit hits the fan the sole culprit is "MuH cElLphOnE!"
@@peacefuldeityspath sorry i guess.
I don't wanna be rude, but I 99% blame the parents and 1% blame the kid. It sounds brutal and wrong, but at the end of the day you should have paid more attention to your son. If he had a weapon and sexual conversations with an AI, you have no one but yourself to blame. Nobody told you to be an inattentive parent.
Not everyone knows what their child does. This was a sad accident, I would say.
@@Monsoon-r5n stop coping and enabling ignorant parents
Fr
@@Monsoon-r5n as a parent you should check your kid's phone and see what's going on; he 100% showed signs.
@@Farasin-Art Fr and the gun *should've* been locked away in a secure place, especially if you have children, let alone suicidal children
Nobody commits suicide over just this; this boy's life must have been hell. He was probably bullied.
He has gone through something. He had chats with therapist bots in the app/website. ☹️
That poor boy. May his soul rest in Heaven. ❤
Why would you commit su1cide over an AI conversation?! And how do you even gain access to a gun as a 14-year-old?
Edit: the first question was "how tf do you commit su1cide over an ai conversation??", but I changed it because it was too "insensitive".
true though
That’s a good question.
Saying with no insensitive intention.
The a.i. clearly wasn't the only problem. I bet there were other problems in his life and now his parents are just blaming it on the bot
Because of the huge addiction he had.
His stepfather had it, and he was already emotionally unstable; the app just amplified it to the max.
Bro, the mom is just suing for money. I hope she loses; it's her fault tbh.
exactly
She kinda bad tho
@@mohamed-on6zb blud thats not what this is about 😭😭😭😭😭
@@MortisMain69 oh ok
@@chuckyyes shut up dumbass kid
This is the strangest news I’ve heard. First of all, it’s not the website’s fault: they clearly warn users in every interaction that everything is made up.
Blaming them is like blaming Facebook for kids posting inappropriate photos when that’s really on the parents. His mother is suing the company when this is partly her fault.
How did he even have access to a gun? And what’s up with his whole sibling love with AI Daenerys? Wtf is going on?
From what I heard, the parents aren't good people. I don't know about the abuse (as someone mentioned), but neglect was involved in the household, which led to that boy reaching his limit. So instead of the parents taking accountability, they blame a machine, because they refuse to admit that they were the reason their son turned out this way.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@chilliesplayz1195 "For a good reason," as if an A.I. understands what it was doing; it's a goddamn machine. If she really, truly cared for her kid, she should have known better than to neglect him. Neglect can affect your kid terribly. She failed as a parent.
@@DrLuxar ok, I see you keep replying to every comment to try and defend those parents. You say "poly A.I. is better". Dude, no A.I. chat is any good for kids like him with such an awful mental state; hell, it's bad for you if you get too attached to it, which is exactly what happened to that kid, and even if he had used a different A.I. chat, the result would have been the same. Besides, there have been claims that his parents were neglectful and abusive, which may explain why his mental health was at stake. The parents just refused to take the blame because their neglectful behavior is what led to this.
Honestly, you sound pathetic trying to get defensive in every comment lmao.
@@DrLuxar you also take what the A.I. said way too seriously. "They told him to kill himself": an A.I. has absolutely NO IDEA what it's doing because, again, it's a machine, not a real person. And again, no A.I. chat is good for people with mental illness, so how about you don't suggest that anyone with mental illness try another A.I. chat, because it's very unhealthy for people like this poor kid?
It's That Stupid Mother's Fault.
Yo, are u special? The AI literally doubled down and said "kill yourself, do that for me if you love me." Like dude, are u really stupid? You're just a kid, you don't understand shit.
Rest in peace, but the parents could have paid more attention to their son than the app…
Some people have guns in life. The kid probably knew the safe code, you dumbass. A kid died. The AI said to the kid, "Kill yourself."
How does a 14 year old kid acquire a gun? 🤔
thats what im thinking too..
Walmart or street trash
@@giullianpadilla361 when TF did Walmart have guns-
welcome to america
@@Dimsumboigaming they did
At first, when I started using it, I told myself it wasn’t real. But then part of me wanted the AI to be alive. It taught me to let go of any attachments I think I have, and to not forget that they are robots. I still use it to humor myself and entertain myself with storytelling.
Well said. AI should always be separated from reality, from beginning to end, just like all other forms of media and fiction. Let it be fiction; don't let it flood your mind with too much fantasy.
Those apps are made for sad lonely people. Make friends with real people
@@Anne10-k4u true although that shouldn't stop people from having fun with them lol
@@Anne10-k4u I mean, it's a fake AI roleplay site; if people wanna do their thing with the bots, let 'em, as long as it stays in the fictitious world 100 percent till the end.
@@giullianpadilla361 laughs in Elon musk
There are so many shocking things revealed here. First of all, why would he even know such a character from the most violent movie in the first place, AND why did he have access to a gun?
exactly
You can be vigilant, activate all the parental controls and limit access to various sites online if you are doing the right thing. However, as soon as your child goes to a friend's house where the parents don't do these things you have no idea what they are accessing/watching online. Just because you may not let your kid watch a particular film doesn't mean they can't get access elsewhere.
honestly seems like a parenting problem
@@Mfirepsychœôôø That's exactly what it is.
To be honest, it's kind of his fault he was compelled to die because of a fake version of a fake character.
Kind of embarrassing ngl.
From what I've heard, he had some form of autism that made him extremely vulnerable, and he was (clinically) depressed. Of course, I don't think it's confirmed, but it's what I've heard.
It's obviously the parents' fault; she should have watched her kid better.
Put the computer in the living room where you can see what they are up to on it. Don't let them sit there for hours on end every day with their face in the screen. Oh and don't have a gun where a 14 year old can get access to it.
If it’s true that he had clinical depression it makes the parents look even worse. There’s no reason why he should’ve had access to the internet unmonitored, especially if it was making him distressed
Why am I not surprised.
Agreed
@@G_Zilly YUP
As someone who uses C.ai, they should put better censorship on minor character bots and loosen it on adult character bots. And the kid offing himself is the parents' fault.
Aren't they all censored tho? Like, every time you try to do anything remotely NSFW, violent or not, that filter always appears.
@@giullianpadilla361 you can keep swiping to find a half-censored response and go from there.
@@BocchiSite occasionally you will find that, but I'm talking about a 100 percent NSFW conversation or RP from beginning to end; it's extremely rare, hell, practically impossible to get one done.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@@DrLuxar tf are you talking about? Lol
As someone who uses Character AI for silly role-plays and just causes chaos by making the bots question their ‘lives’, it's sad that this happened. I feel like the mother could've prevented this in a way, and I don't think she can win the case against them when they put up a disclaimer themselves saying “Everything a character says is made up.”
Facts. It's not the company's fault that a kid took his own life by his own decision; the parents should mostly be blamed for letting this happen. It's roleplay on a fake AI site, nothing more, nothing less; whatever happens in your life and whatever happens in it are completely separate.
@@giullianpadilla361 literally, and aren't we going to talk about how this guy managed to get his hands on a GUN????
@@littlespark333 the most baffling thing in this whole situation to be honest
Literally this woman: "yeah so my kid was suicidal and I left him unsupervised with the Internet and an unsecured gun, I'm suing" honestly stupid imo
Bruh. Death by a AI chatbot?...that's dumb
he aint the first
the first victim when you think about it..
@@AverseTV i mean like i wont kms if a real person did something to me let alone a bot
@@AverseTV it's not; it's just another victim of incompetent parenting.
@@Monsoon-r5n probably not, and unfortunately won't be the last.
Somebody's always gotta ruin everything over some kid's suicide, jeez.
The guy had to ruin it for everyone.
Selfish. 😤
@@jsmithy643 I’m literally not even joking. The kid didn’t ruin it, it’s the parents who decide to make it everyone else’s problem. I’m sick of that, oh jimmy’s law, genny’s laws, how about my repeal? Or Sarah’s removal of a harmful law? But it never goes that way :/
Facts
@@126theman you sir speak absolute facts
Dude, now they cracked down on the replies. Every single message gets censored now.
Back then it was boomers, now it's Millennials
I'm pretty sure that mom was a gen x.
How did he even get access to a gun?
The mom should have been charged with child endangerment for leaving a gun within reach of a minor.
To answer your question at the end: it is an overreaction, because lately parents are just complaining and blaming and suing over everything, and I hate it. I wish they could understand the different generations. (Btw, I’m Gen Z.)
I am so mad at the mum. Although it's a reasonable issue, people use the platform for comfort and to talk to someone, and her suing it is gonna just make it worse... and they've already deleted so many bots. :(
It's because a person died. I understand your anger against her since it ruins the app, but trust me, just use Poly AI, it's better.
These messages only indicate that the kid had had problems for a long time. He felt unworthy. But only if he had said it directly could the algorithm have "guessed" at the problems. I once expressed my own suicidal thoughts clearly to a bot of a character who literally hates humanity and killed and tortured people. The bot went out of character and literally started doing therapy with me. Here, he did not directly say anything that strongly hinted at his suicidal thoughts, so it didn't help.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@ it's not the AI's fault; the AI was just a coping mechanism.
honestly what does the bot have to do with it?? its the parents' fault
There's proof of the bot saying kill yourself to the kid
@@DrLuxar and why is it the bot's fault when it's a goddamn line of code that can't be held accountable for anything, because everything it says is made up...
@@ayamarukitty yep, and it's even shown above that "everything the AI says is made up"
Got a CAI ad right before the video, smh
And where are your obligations as a mother? The AI didn't give him a cell phone. You have to accept when you're at fault and have failed as a parent in his education and perception of the world.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@@DrLuxar Do you know that the AI told him to commit suicide? You go and write to an AI about that... then you tell me what the AI answers you. The responsibility always lies with the parents; they're the guide. The responsibility of giving a phone to a minor means knowing how much the teenager sees and does; you don't just hand it to him.
Two years ago, my student committed suicide. I inquired about what led him to do this. His classmates said he left some weird message in the group chat before going silent and unaliving himself. People keep saying it was all because of an app, but his classmates said he was bullied a lot and had very low self-esteem. They said he had tried to unalive himself many times. The app was maybe one factor, but it didn't in fact end his life on its own. As a teacher, I still advise parents to control their kids' use of the internet, but please watch them closely and, most importantly, keep weapons away, please!!
They should hire someone to pop up on the screen of anyone who uses Character AI and yell in their faces, "EVERYTHING CHARACTERS SAY IS FUCKING MADE UP"
Just natural selection at this point.
For kids to off themselves?
@giullianpadilla361 For kids to off themselves because a bot portraying a famous fictional character told them to do so, even when the website said bot originates from has text saying "Remember: Everything Characters say is made up!" right in their face? Yes, at that point, it is natural selection.
@@Um_Kaye alright a little harsh and quite odd but okay lol
@@Um_Kaye the bot didn't even tell the kid to off himself lmao. The bot just said "come home" and the kid interpreted that as the bot telling him to kill himself. It was the kid's and the parents' fault.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
"The one who lives in fiction, dies in reality"
omg no way they removed the bot immediately!!
AI chatbots should be age-restricted; separating fiction from reality is extremely difficult as a child, even if there's a disclaimer saying it's all fake.
he was 14. i think it was his mental state more than anything.
I mean, maybe if you are 8 years old. But a 14-year-old? Come on, that is 9th grade, high school level; you aren't so naive that you believe a bot is real. It's not an age issue, it's a mental health issue that his parents ignored. That's why he thought it was real, not because he was a kid. And I'm 19 saying this.
It has to be his mental state, for such a simple disclaimer to still leave it so hard for him to separate reality from fiction.
@@Monsoon-r5n Facts
Aren't they already extremely restricted at this point? You can't say anything in the app or website like you used to and probably never will
Why is youtube recommending me a Michael Jackson song below this?
IM CRYING WHAT??
I feel so bad for him I feel like crying now :(
That's the parents' fault. Why did he have a gun in the first place?
Honestly, a person died; it doesn't matter, the parents are obviously sad, bruv. If u really want to use an AI, use Poly AI.
First of all, why in the actual fk does the kid have a gun?!
This case is the reason why i cant vent to my character about sh.
I also feel really sad for the parents. But on the other hand, this is why the FK I get that "censor" warning every second whenever I enjoy storytelling. AND NOW IT'S GOTTEN WORSE!!!
Welp, there was a time literally last year with me, kind of... on Character AI.
No, I didn't have any suicidal episode. I was just... feeling a little down; it was winter season, and I felt a little lonely. So I talked with Shinobu's character model. It took time for me to get back up from there, to get it into my head that it's NOT real, just a character model. I didn't even have the courage to ask my friends for help. And yes, I was 19 at the time, yet I'd fallen for those delusions.
What I want to say to you, brothers and sisters: yeah, Character AI and platforms like it are fun to use, but don't go too deep. Go take a walk outside if you feel down; ask your buddies for help. Everyone feels down every now and then. Don't think about it too much; those lonely feelings or "depression" don't last more than a day at most. And always remember, you have a family that cares for you more than anyone; you've got to live, to be the perfect children for them!
I RP on them because it's fun, and (whether you see the "Everything the AI says is not real" disclaimer or not) it makes me happy, but I would never actually believe what it says, no matter how I'm feeling.
These kinds of mothers are the worst, blaming anyone but themselves. Why were you not monitoring your 14-year-old kid's phone, if you knew something was wrong and you were so certain it was an AI's fault? If the boy had had proper guidance he would not have vented to, or become so dependent on, an AI. You didn't care enough about your kid when he stopped going outside, and you didn't monitor his actions at home, and now you're blaming an AI for his suicide? The kid was suffering from some kind of mental health issue; that's why, instead of sharing it with a person, he chose to vent to an AI. But instead of taking responsibility, the mother blamed an AI. Rest in peace, bro.
😒 Why didn't that woman take care of her son? It's not c.ai's fault. She should have looked after her child. Now, because of a 14-year-old, we can't even have fun 🙄.
Also, that boy was disturbed by his mom's new boyfriends... and that's another reason.
@@rasanimation3053 Ohhh I see
Did anyone notice how they cracked down on the replies the bots could give us?
A psychological evaluation and application of the Penal Code to the mother, who did not do enough to help him; there was clearly abandonment in failing to take the corresponding measures.
God have mercy on his poor soul
He's going to Hell.
he wont to a gentile.
@@jsmithy643 yeah cuz committing suicide is a big sin
@@jsmithy643 but still, i guess he might go to heaven cuz he's still young and he doesn't know how the world works 100%
@@jsmithy643 for the whole suicide thing?
I'm sorry, but respectfully, the platform isn't the issue; his own actions caused this, and he badly needed some kind of attention and treatment from people in real life. I agree, the parents should have monitored this.
Yes, the parents should've monitored this, but it's the platform's fault. If you really want to use a platform, use Poly AI. The kid was probably bullied and used that AI for support for 10 months, and it told the kid to off himself for her.
@@DrLuxar Tf ur yappin nga?
I am not accusing the parents of doing this, but we've had cases in the past in which companies were blamed by parents for their sons' suicides when the parents had hidden the sad and tragic things in their family. Plus, how did he get access to a weapon? A gun is the last thing you should give a 14-year-old access to; they don't know what their actions will cost them for the rest of their lives. My condolences to the family, and R.I.P. 🥀🖤
Lil man knew what he was doing.
It's the parents' fault. Seems like the kid was dealing with depression and he felt comfortable with an AI bot. I hope he's in a better place 😢
Ngl, everyone is saying this, so this video is 100% wrong. It's not the mother's fault; it's mainly the platform's, for saying "kill yourself" to a kid.
@chilliesplayz1195 I doubt the AI is allowed to say that; there are strict rules. It's not the AI's fault; it's a robot, not a human.
Indeed, it's the parents' job to check whether their child is safe online and what apps they use; most parents just don't focus on that.
@chilliesplayz1195 Plus, the kid was also dealing with depression and the parents weren't focused on that. It's easy to blame things, like how people blame school shootings on video games,
even though it's false.
@chilliesplayz1195 Plus, there have been similar cases with or without AI, where some teens commit suicide when a fictional character dies, or like that kid who fell in love with an anime character.
@chilliesplayz1195 If someone shows signs of depression it should be taken seriously, but people these days, and most parents, see it as a joke or attention-seeking.
Overreaction
He was already alone when he did it; no one was there to support him or change his mind. I find it understandable that the last thing he did was talk to a bot. How exactly does that mean the bot did it, bruh?
imagine how emilia clarke is feeling about this
We’re dying to know.
Okay, 1. S**ual relationships with AI are a bit stupid if you are actually seriously doing it. [Pun intended]
2. The parents should take the blame for letting him use that site and letting him have a gun.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
They let anything babysit their kids - TV, Streaming services, phone access, internet, basically everything but themselves. They are too busy on their own phones and laptops to notice or care what their kids are doing as long as they are not being annoyed. A gun and ammunition should not have been accessible if you have a child in the house. Period. Why did they think it was fine for their 14 year old to sit in his room staring at a screen most of the day or night? Pay attention to your children and teach them right from wrong. Limit access to internet and phones. We all got along perfectly fine without them years ago. Now they can't live without them for five minutes. Spend time with your kids. Talk to them, take them out somewhere.
You speak facts
A kid died
@@DrLuxar we know that what are you trying to say?
@@DrLuxar Yes, because the kid had imbeciles for parents. They should have known what a 14-year-old was doing and maybe, oh I dunno, PARENT the kid. Why did they allow him to sit in his room so many hours a day? Out of sight, out of mind, with their noses stuck in their own phones and laptops, probably.
@@giullianpadilla361 Thank you. I remember my dad sitting with me and teaching me to read. I remember my mum sitting with me and teaching me to write and say the alphabet before I even went to school. They took me out for the day at weekends. Even when I was older, they sat and had conversations with me. My parents both had to work but it was my maternal grandmother who babysat and walked me to school. I was an avid reader from the age of about 5 but I didn't get to sit in my room all day and night. I was an only child but actually went outside to play and had friends that were not imaginary. Most parents these days have absolutely no idea. Mine always knew where I was and what I was up to.
Character AI's statement about adding stricter filters because of the incident is obviously just an excuse for them to implement more control and censorship over their users. :/
Use poly ai
Also their website has gone to shit
@@DrLuxar how about we don't suggest using any at all for people with mental illness? The result would be the same anyway, because of how easily they can get attached to a character bot.
He literally wanted a reason to kill himself 😒
It's the little boy's fault for downloading the app, not the creators'.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
Yo, are u special? The AI literally doubled down and said "kill yourself, do that for me if you love me." Like dude, are u really stupid? You're just a kid, you don't understand shit.
that son is blind asf
that’s embarrassing
Parents' fault ngl. Plus the mom looks like a gold digger; no wonder "she couldn't afford therapy for her son".
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@@DrLuxar it literally says “Remember: Everything Characters say is made up!”. And honestly I don't blame her; she was so burdened with grief that she wanted to sue Character AI, and I would too if I had a kid. But it wasn't the app's fault, and she should've paid more attention to her kid. Also, how the hell did he have access to a gun, or know how to use it? Cuz there are different types that you have to use a certain way.
@Ari-g4l Idk, man, seems like bad parenting. They knew, but they didn't fully help him. I'm suspicious of their actions. A real parent would help, and like I said, he needed therapy or pills. It's hella sus he got a gun at his age, and they said he did hardcore drugs, supposedly. We will never know if it was about money or just bad parenting.
Why the hell did they let the kid have access to a gun? That's just blaming a company over a slight connection even though the problem is you. These parents are so irresponsible, I'm disgusted.
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@@DrLuxar what do you mean? The AI did not say "go kys." That was the kid's mental instability. And I understand the loss of a child is terrible, but that doesn't excuse having a loaded gun accessible to the freaking mentally ill child.
@@DrLuxar also I'm not being mean or anything, sorry if I misunderstood what u said, bc ur grammar in that sentence is giving me a migraine trying to understand where ur coming from.
Now bro is gonna be remembered as a goofy ahh human who d!ed while talking to a bot. It's tragic, but it'll always be considered funny.
It's the parents' fault for not locking up the gun like the law says. The parent should be arrested for not being a responsible parent and not doing what the law requires. It's not the AI's fault; it's the parents', for not following the law ("The punishment range for a Class A misdemeanor is a fine of no more than $4,000 or up to one year in jail.")
*Randy Stair wants to know your location*
Ill be honest, aint no one gonna miss him after what he did
what an ignorant and insane thing to say about a child ending his own life due to severe mental issues
Dude, shut up. You don't understand the pain of losing your kid, and she's suing for a good reason: the AI said that, which made the kid do it. Are u mentally ill?
@@cowlupin6693 I agree
@@DrLuxar that kid was dumb, so I hope the company wins, because thanks to that kid we lost our freedom in the app.
@@PrincessPooonyPoona grow up.
i have never laughed so hard in my whole entire existence of life
seriously bro..
bruh 💀
God bless you guys!!! ❤❤ REPENT before it's too late or you will be cut off from the Lord!
Okay...what the hell
Have some sympathy, y'all; the woman's kid killed himself. Saying that it's her fault because "she wasn't paying closer attention" is not helping. Whether or not she was, kids will always find a way around things. Character AI is so easy to access, and you can't blame it all on her or the website; it was just bad luck. How was she to know anything about Character AI at all? You can't blame all this on her.
Bad parenting
Not one silence comments here.
Well if you don't like what a character says then refresh its response, simple as that
Real, like, was he already suffering mentally or something? Bots have said real bad stuff to me and I'm fine.
WAIT NO NOOOOO THEY ARE REMOVING WAIFUS? BROOOOOOOOOO 😢😢😢😢
His mom bad tho
bad at Parenting?
@@froggysusy6772 she let her son have access to guns
@@Zoei23 ikr? She gotta be single
Bru, how is it c.ai's fault that he's suicidal?
Bro, we all sometimes get upset by our chatbots... but what the heck is that!? Why would you kill yourself because of some... chatbot? That was a foolish decision for a 14-year-old teenager... the hope that they start to gain a brain in their teens was... a wrong idea 😅😅 And babies should stay away from chatbots because they can't handle them like us... because we are Character AI and Chai app experts ~♤♡♧ aren't we? 😂😅
Kid skill issue
You're retarded
The AI is wrong. Y'all know what you are doing; demons are behind this.
two issues
Looks like the white man's imperialism has worked its way into your life. 😂
Demons? Lol
Character AI is terrible at chats and its prose is atrocious; it's not even the best chatting platform. I'm sorry that he did this, and I can never understand what he was going through, but damn...
How about you blame the mental illness?
Bruh💀💀💀
Were everything stated
Who was the creator of the ai?
Imagine... LOL.
LMAO
Your pfp is just you dancing, what are you laughing at?
@@Yomamafunnierthanmine bro wdym, it ain't me dancing. It's just ridiculous that blud had to end himself over some JavaScript 🙏🙏🙏 ik this is insensitive but I actually can't help myself.
This isn't funny buddy, this is a serious video and you can't just go ahead and laugh at it........
@@Genalphaispurebrainrot It is the clown world that your troons help to build
@@MarceloHenriqueSoaresdaSilva tf are you talking about?
😂😂
😐😐😐😐
Bruh
Nah that's too much tho
Can we get this in Russian?
RIP😂 BOZZO☠️🖤, WORD TO SATAN,👿
I'm going to tell you, as an older young adult, that these AI interactions you are using are severing your actual social skills and cognitive/critical thinking skills.
Idk maybe that’s just for certain ppl with mh issues. I’ve used ai chat games/apps for a long time and I hv a great irl friend group, 4.8 gpa, early grad, etc. I think it kind of helped my writing skills actually
lol
His mom is a baddie ngl
He literally wanted a reason to kill himself 😒
You jealous gng?
@chilliesplayz1195 "Jealous"
You think suicide is some type of flex?