You don't have even a little knowledge of AI, as far as I can tell. I would say try to understand what she means instead of criticizing without knowing anything.
Her last sentence is an absolute disaster. :( We should make sure that the people developing these technologies are aware of their limitations AND communicate them clearly to the public. Why should we care how they look, which was more than implied? She is on point about dataset and model use standardisation, though.
@Steven L. I agree, it's not the same. Yet one's ability to exercise empathy is not determined by one's gender and skin colour. If you agree with this then your first sentence is a logical fallacy. If you do not then we just have a very different system of values and morality.
@Steven L. Here I disagree. I suspect that it's because we would not agree on what is inequality/injustice and also what variables are important to observe in this regard, but that's beyond the scope of this brief exchange. :)
Same damn thing... Eritreans and Ethiopians are the same people; just a few politicians with different political ideologies divided the people in order to rule them. The people's culture and social values are the same.
Complete nonsense. I dabbled very briefly in machine learning myself. Not my thing though. Even I know enough to understand that bias lies in the underlying data that the algorithms are trained on. Go watch far better presentations from far more experienced, respected and credentialed people in the field like Stephen Wolfram and the legendary French AI researcher Yann LeCun, whom, by the way, Dr. Gebru had a much-publicized Twitter altercation with.
If you believe there is a bias, fix it. Contribute something instead of complaining and making other people do extra work for something you perceive to be a problem. With all your fancy accolades, you need to do less talking and more clicking.
@@ephraimmillion um yeah, true, but you pretty much missed the point. lolz. Silicon Valley hasn't changed since the 1960s. Even sitting around in a restaurant, I overhear such naive conversations. Not all, but too many wankers.
Who is here after she got sacked from GOOGLE?
✌🏾
Autumn cleanup. Just in time for the performance reviews.
She doesn't deserve this for spreading the truth; the truth is what will help Google evolve!
@@PutinIsKing she has no concern for truth. Likely neither do you.
Me...😂😂😂
"Imagine thinking that skin color should make you immune to poor behavior"
She is an inspiration for many young girls in Ethiopia, Africa and around the globe. Stand for what you believe!
She is Eritrean not Ethiopian. She is an inspiration for East Africans.
@@codewebsduh2667 Ethiopia is in East Africa...And she inspires ALL Africans and all people who have integrity!
@@codewebsduh2667 Lol
She is Eritrean; why does Ethiopia want to claim her now? As a matter of fact, in 1998 she and her family were forced to leave Ethiopia for Eritrea abruptly, at a time when Ethiopia was under TPLF rule.
She is an inspiration to all Africans.
She's an embarrassment to Ethiopians. She got brainwashed by Europeans and turned into a radical SJW.
Someone has to stick up for those who cannot do for themselves. If our world is better today, we owe it to brave people such as Dr. Gebru
The doctor is not sticking up for anybody but herself! The masses are not out there clamoring to have this issue addressed; they don't even know what they're talking about.
Let's get real here: this is just some dame who maneuvered herself into a position that got her to lose her job, and now she regrets it.
@@TheRicsilver48 “People are not clamouring to address the problem” does not mean there is no problem. Few people have the foresight to glimpse into the future and surmise the consequences of AI. My guess is that the consequences will not be limited to SJ. Why not give the benefit of the doubt to the warriors?
@@forceforgood4669 I really don't have any doubt here, at least based on the information I have so far. Nothing has convinced me that this is anything more than a tempest in a teapot, an issue the doctor is trying to drum up for her own personal motives.
@@TheRicsilver48 Musk tried to convince people of the same; he finds it easier to work on the Neuralink project to upgrade the human brain. Imagine that.
@deganawida x what do you know the “naive, moron” doesn’t?
We Eritreans 🇪🇷 are proud of you, Timnit. She is Eritrean 🇪🇷❤🇪🇷❤
No, she is Ethiopian. WTF are you talking about?
She is US-American...
And I'm not even from the US
Lol
She’s a joke
This lady needs to be reinstated, she's a genius☺
@Akku why not? What is being a genius? She has done very thorough research and presented a complex subject simply, so anyone can understand the dilemma. Not to mention she is a research scientist with a PhD from Stanford.
Timnit is a smart and bright young lady. I'm not a tech guy, and I do not understand computer jargon either. Being a genius comes in all shapes and demographics.
Being a genius comes in all shapes and demographics. 🌎
Timnit has been fighting against bias and for safety and ethics in AI, to the point that Google fired her from their AI ethics team.
It is the data set that is biased, not the algorithm.
The data set is part of the algorithm.
@@lamrof That's completely wrong.
Algorithms and data sets are two completely different things.
Data sets should be designed for operational situations.
Algorithms do not discriminate.
Discrimination occurs when the algorithm is operated improperly.
@@くまてん-q4g The algorithm and the data sets are from the same design. Just like human beings and the coronavirus: they are individual and a set at the same time. That is why if you breathe in microscopic dust, it doesn't multiply in your lungs.
@@lamrof I totally disagree.
They are different things.
@@lamrof It's one thing for the algorithm to need a data set in order to work, and another for the data set to be intrinsically part of the algorithm (the data set is the input; it's external). They're two different things. What I got here is that the data set is biased, and that's why algorithms struggle with minorities.
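The disagreement in this thread (biased data vs. biased algorithm) can be made concrete with a toy sketch. Everything below is synthetic and hypothetical: the Gaussian feature distributions, group sizes, and the simple midpoint-threshold "algorithm" are assumptions chosen for illustration, not anyone's real system. The decision rule is identical for both groups, yet the skewed training data alone produces the accuracy gap.

```python
import random

random.seed(0)

def make_group(n_pos, n_neg, pos_mu, neg_mu):
    # Each sample is (feature, label); features are 1-D Gaussians.
    pos = [(random.gauss(pos_mu, 1.0), 1) for _ in range(n_pos)]
    neg = [(random.gauss(neg_mu, 1.0), 0) for _ in range(n_neg)]
    return pos + neg

# Group A dominates the training set; group B is barely represented,
# and B's feature distribution is shifted (centres -1 / -5 vs +2 / -2).
train = make_group(475, 475, 2.0, -2.0) + make_group(25, 25, -1.0, -5.0)

# The "algorithm": a neutral threshold rule, halfway between class means.
pos_mean = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)
neg_mean = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
threshold = (pos_mean + neg_mean) / 2

def accuracy(samples):
    return sum((x > threshold) == bool(y) for x, y in samples) / len(samples)

test_a = make_group(1000, 1000, 2.0, -2.0)
test_b = make_group(1000, 1000, -1.0, -5.0)
print(f"accuracy on group A: {accuracy(test_a):.2f}")
print(f"accuracy on group B: {accuracy(test_b):.2f}")
```

The rule itself contains no group term at all; swapping in a balanced training set (equal samples from both groups) would move the threshold and shrink the gap, which is the sense in which "the bias is in the data."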
This person makes google look like the good guys...
Very intelligent Eritrean scientist, highly educated in AI (artificial intelligence).
Haha 😂 Eritrea 🇪🇷? Bullshit. This young girl said "I am Ethiopian" 🇪🇹 because she was born and brought up in Addis Ababa, completed high school at Nazareth School, Arat Kilo, and her parents are Ethiopian by birth, only Eritrean 🇪🇷 by referendum 😂. Nazareth School St. Mary, like Sanjo or St. Yosef Cathedral: she is a product of one of these.
She did a study that concluded that people who drive pick-up trucks are more likely to vote Republican. She is a master of the obvious.
Ah, but she used (very easy) machine learning to identify the pick-up trucks on sat images. That's why she fully deserved her prestigious Stanford PhD... or something.
Thank goodness there is someone accredited with studying the obvious. Some of you don't see, or choose not to see, things right in front of your faces.
@@les6230 I can see that she obviously shouldn't have got a PhD for that "research". I can see that she obviously didn't deserve the job she used to have at google. I can see that she really deserved to get fired -- and that they were happy to be rid of her.
I can also see that the US has an obvious (and huge) woke problem.
What can you see?
@@peterfireflylund It is sad that professors are using this kind of content in their lecture materials. I already cringed so hard at the video's title and was curious what the comments had to say before I watched it lol.
Being let go by Google was a blessing. She can now focus her skills on creating a company that sells her AI software, benefiting women, working-class whites, and people of color, to companies that would use it in part to decide who would be a good employee or customer for them, etc., then take a percentage of the profits to fund people in said groups and train them in coding, etc.
You don't get the point: no one wants to work for or hire a person who threatens and blackmails everyone that gets in their way, regardless of their ideology. It doesn't get more toxic than Gebru.
@@ThePabs213: If what you stated is true about her, I DISAGREE. Steve Jobs, Peter Thiel, and Mark Zuckerberg are LOUSY people on a social level; however, their brilliance is so great that many individuals have no problem working with, or for, them.
@@mdillon4366 Elon Musk as well. She isn't a pushover, and she pissed people off. She knows her worth.
@@codewebsduh2667 Agreed! Say what you will about the good doctor, but there aren't too many folks with a PhD in AI facial recognition... 😉❤️👍✌️
It's not her software if she developed it for Google. When you work for a company, the intellectual property you create is owned by your employer. I could see her being an advocate and starting an open source foundation.
Fantastic!! Easy to pick up that she’s very passionate about what she does!! I’m not a tech guy, but learned a lot about AI....
Did you?
@ there you go antagonizing people; I am counting.
Yes, passionate about a social justice political agenda, much more than about AI.
Go girl. Proud of you. I appreciate what you are doing. Don't let them bully you. Do it the Ethiopian way. 👍
Eritrean American
Eritrean (ኤርትራዊት in Tigrinya, ارتريا "Eritrea" in Arabic)
@Biniam ZERAI I did not mention that she is ETHIOPIAN! To me she is a smart, bright woman that we all look up to!
@Biniam ZERAI Him? It's a she.
@Biniam ZERAI OK. She is Eritrean. FYI, the Ethiopian way is saying no to bullying, standing your ground, fighting for what you believe in. That is the Ethiopian way. What you just mentioned doesn't represent 110 million people, and it's a very superficial comment. No need to descend into rubbish talk.
imagine earning 3 electrical engineering degrees (including a phd) from stanford and publishing extensive scientific literature just for conservative trolls to try and discredit your intelligence in the comments section of a ted talk they obviously didn't watch. lmao truly remarkable
absolutely love it. Thank you Timnit!
@ the topic of race and AI is new to me. So hearing it sparked a lot of interest. Plus I’m a huge cheerleader for any visible minority breaking boundaries 💪
Bravo, amiche, haftey ("my sister"). I am very happy with your explanation.
Google will pay the price for letting her go. She's a smart woman. She's revealing the truth, nothing else.
Lol, she does nothing for the company; they have people who are actually useful to them.
I'm beyond inspired by her! I guess some executives at Google felt intimidated by her intelligence, smh.
Intelligence hahahah. It was her lack of competency as demonstrated even by this talk that got her canned.
@@creayt She has a doctorate from Stanford in Electrical Engineering and has won countless awards for her research in a social+tech environment.
You are obviously intimidated by her.
@@codewebsduh2667 He is TOOO intimidated by her LOL. Poor thing.
@@codewebsduh2667 he probably doesn’t even know what the subject matter is lol just ignore him.
She got brainwashed by Europeans and turned into a radical SJW. No Ethiopian thinks like this woman. This is not inspiring.
Timnit is a bright woman with a bright future; unfortunately, hate still persists in scientific research and at Ivy League academies.
Very interesting talk, although I think bias will always exist in stochastic models, since they utilise historical data sets. But is this also propagating bias ad infinitum? That's an interesting dilemma to talk about. The nature of perception is inherently subjective; I think the best we can do is put in a checks-and-balances system that prevents its abuse to target people.
Well said.
@@fred7883 You cannot develop or build stochastic models without considering metaethics. In addition, data sets are limited by discrete and continuous categorisations in linear programming, whilst in quantum computing they are not mutually exclusive.
@@fred7883 You seem very insecure lol
That's exactly what she said: put in checks and balances...
@@fred7883 The OG post doesn't have word salad. Each word she used in her sentences had a point. Her diction is just a bit formal.
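The "propagating bias ad infinitum" worry raised above can be illustrated with a minimal feedback-loop simulation. All numbers here are invented, and "patrols proportional to recorded history" is a hypothetical allocation rule, loosely in the spirit of predictive-policing critiques. Both districts have the same true incident rate, but the historical skew in the records never washes out, because the system only records incidents where it already looks.

```python
import random

random.seed(1)

# Two districts with the SAME true incident rate; district 0 merely starts
# with more recorded history because it was patrolled more in the past.
true_rate = [0.3, 0.3]          # identical underlying rates
recorded = [60, 30]             # historical records are skewed 2:1

for year in range(20):
    total = sum(recorded)
    # Model: allocate 100 patrols in proportion to historical records.
    patrols = [round(100 * r / total) for r in recorded]
    for d in (0, 1):
        # Incidents are only *recorded* where patrols are present.
        recorded[d] += sum(random.random() < true_rate[d]
                           for _ in range(patrols[d]))

print("records after 20 years:", recorded)
print("share seen by the model:", round(recorded[0] / sum(recorded), 2))
```

Despite identical true rates, district 0's share of the records stays pinned near its historical two-thirds: the initial skew is perpetuated indefinitely rather than corrected, which is exactly the "historical data propagates bias" dilemma.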
Timnit sweetie... stay strong, and one day you will sit in your own office in Eritrea and create your own Red Sea Google. Go on, young lady!!
I'm so happy that she is not narrow-minded like you. She will not fit into your small box; she is bigger than your lifetime's imagination.
Nice
Excellent
I wonder what she thinks about the recent Gemini situation.
"We don't know what kind of bias we are propagating" ... How about they don't care what kind of bias they are propagating, and kind of approve of the direction the bias is taking.
That was a great presentation. Thank you.
Preach
The majority of machine learning algorithms today are correlation linking, nothing more. Attributing causation to such analysis is doomed to fail. I would suggest it is solely the human decisions to link such outputs with perceived causal factors that creates the conflict at all. In other words, the outputs are seemingly biased because we are applying our human biases in the interpretation.
The real trouble isn't getting mathematically correct answers. It's what to do with them.
They prefer to say that the algorithms are racist; I don't know why.
OK, now I am starting to agree with some of the things you said, but don't tell me there is no possibility of manipulating the algorithm and making the research focus on some specific agenda. All she said was to put in checks and balances so there won't be any biases.
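The correlation-vs-causation point a few comments up can be shown in a small sketch. The numbers are entirely synthetic: a confounder (temperature) drives two otherwise unrelated quantities, and a plain Pearson correlation duly links them, which is all most correlation-linking models ever see.

```python
import random

random.seed(2)

# A confounder (temperature) drives two causally unrelated variables.
temp = [random.gauss(20, 8) for _ in range(5000)]
ice_cream = [t * 2.0 + random.gauss(0, 3) for t in temp]   # daily sales
drownings = [t * 0.5 + random.gauss(0, 3) for t in temp]   # daily incidents

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(f"correlation(ice_cream, drownings) = {pearson(ice_cream, drownings):.2f}")
```

The correlation is strong, yet banning ice cream would prevent no drownings: the link exists only through the shared cause. Reading a causal claim off such an output is the human interpretive step the comment above warns about.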
Great job Timnit!
Proud of you 'the intelligent lady"
Excellent analysis. Companies using datasets in AI systems nowadays should take this into consideration; give her your attention.
Awesome analysis and findings. Keep moving ahead in this direction, as all of this will be needed in the future.
Our Eritrean girl 🇪🇷❤️
Brilliant Eritrean girl.
NO SHE IS ETHIOPIAN.
@@antishwow She is Eritrean, born in Jijiga, Ethiopia.
@@antishwow Geez chill lol.
who cares ?
@@helen99084 She was born in Addis... find out before you speak.
I am absolutely comfortable with your way of presenting, but it needs intense work to strengthen your evidence. Also, Dr., what is your recommendation? I am sorry about the actions of the Google company.
Have a great season.
Very proud of you my dear friend
Finesse 👌🏾
The speakers are also part of the "program"
Use the computer inside your skull to find out who's designing this and why..
Instead of always complaining and criticizing about issues, forcing researchers to do extra work while demoralizing them for the great work already done, why don't you solve the problem? You don't add to knowledge by only identifying the problem. Science is hard; our complex societies and human reasoning are difficult to put into computing because of the varying discrete and continuous behaviour the parameters have to follow. AI is coming up and will get better in future years. I would prefer that people who complain about bias in AI take it up, solve it, and add value to the already existing technology.
Isn't awareness the first step? Changing how systems work can't come from a single source. There are silent experts working on the forefront of tuning AI/ML applications to be more equitable. What have you done to solve the problem?
@@DBREW I have not done anything when it comes to AI; that doesn't mean I am not doing anything to contribute to science. Raising political and cultural issues at an early stage of a science isn't fair either. There are ways of handling issues in the 21st century; creating turbulence is not one of them.
@@steveokocha9860 Can you please elucidate your contributions to science, and how these contributions have alleviated the issues Gebru brings up?
@@steveokocha9860 Are you insane? They are literally sentencing people based on predicting future crimes. It's Minority Report. They should stop the entire field until there's a group working on the ethics (there is now, though). I hope you get thrown in jail for a decade.
@@rickwrites2612 You are the one who is insane. What have I said that is wrong now? The development of our world is a collective effort which requires hard work, dedication, and persistence. What have you yourself contributed to science? Your job is only to complain like a spoilt brat with bad home training. Don't you use phones, tablets, and laptops? Don't you use the newer technologies being developed? Don't you know the field of AI is relatively new and needs constant iterative improvement? Show me where people are jailed for future crimes. Academia will always come up with research that could be beneficial in the future (even 100 years from now).
I am in a minority group myself, and I work hard to develop technologies to help my people and the world. I have learnt from those who have done it right and decided to use that as an avenue to learn and grow. But since you are a spoilt brat, you won't understand.
Ethical nonsense, ungrateful being. Go and develop the technology yourself.
I'm glad for you; that was a perfect presentation and visualization.
She talks about bias without defining bias. That happens when you put an SJW in a technical field.
Nobody put her there. She worked hard for it.
Her doctoral advisor at Stanford, for CS and AI, was Fei-Fei Li.
Good luck trying to question her technical smarts, lol.
@@BrianKabonyo threatening and blackmailing people gets you a long way but it will backfire one day.
Seems like your comment is coming from a place of malicious intent. There are ample examples of bias in the talk. She's smart enough to assume you're smart enough to at least derive a definition by induction from the examples. And if you're watching this talk, I know you're smart enough.
That is not even considering that the definition of bias is literally in the dictionary. The word bias isn’t jargon and therefore falls under the principle of “common parlance” we use in the standard English language
But alas, when you’re a “SJW in a technical field” the bar is always higher since the default position by most technical people is that your work is invalid. Gotta find something to nitpick I guess
@@isaac5815 The fact is, I am making this statement because she is not using the statistical definition of bias, which is the only one that should be used when talking about statistical models. Instead she is just using the word 'bias' to talk about inequity of outcome, which has nothing to do with statistical bias.
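The two senses of "bias" being argued about here can be separated in a toy sketch. All distributions and names below (`cutoff`, `group_a`, `group_b`) are invented for illustration: statistical bias is a property of an estimator, E[theta_hat] - theta, and can be near zero while a fixed decision rule still produces unequal outcomes across groups. The two quantities are simply different measurements.

```python
import random

random.seed(3)

# Notion 1: statistical bias of an estimator, E[theta_hat] - theta.
theta = 5.0
estimates = []
for _ in range(2000):
    sample = [random.gauss(theta, 2.0) for _ in range(30)]
    estimates.append(sum(sample) / len(sample))   # sample mean
stat_bias = sum(estimates) / len(estimates) - theta
print(f"statistical bias of the sample mean: {stat_bias:+.3f}")  # close to zero

# Notion 2: disparity of outcomes under a fixed decision rule.
# One cutoff applied to two groups whose score distributions differ.
cutoff = 0.0
group_a = [random.gauss(0.5, 1.0) for _ in range(2000)]
group_b = [random.gauss(-0.5, 1.0) for _ in range(2000)]
rate_a = sum(s > cutoff for s in group_a) / len(group_a)
rate_b = sum(s > cutoff for s in group_b) / len(group_b)
print(f"selection rates: A={rate_a:.2f}, B={rate_b:.2f}")
```

So both sides of this argument can be right at once: an estimator with essentially zero statistical bias, and a decision rule with a large outcome gap. Which notion a talk means by "bias" is a matter of definition, not mathematics.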
I'm majoring in AI, and I feel like trying to make machines intelligent will cause so much bias.
Because it's completely fair and doesn't give free passes??
💚👌👍
COMPAS is AI.
Timnit, Ethiopian pride.
Brilliant!
Thank you, Google, for firing her; she does not deserve to work at such an organization.
You are the problem, and you are not from this country. Donald Trump was so right: get all other nationalities out of this country. Most of y'all are the problem.
Let me see. Google fired her because she complained about racism.
Not at all.
Google fired her for talking about racism. This does not surprise me. After all, they demonetized my channel - wake up.
Beautifully made speech
It's the rapture.
Google just lost a genius!!!
Yeah! You can tell how smart she is just by hearing her speak. Pretty lovely as well. She has this thing where she says two things, A and B, and you expect A, and then she tells you that the reality is actually B.
Look at the technical quality of her papers.
Amazing Timnit!
Very interesting presentation, however the only problem is that if we don’t leverage AI, China will; and remember that the nation that leads in AI will rule the world. Would you rather live in a world where China is the dominant super power OR you would rather U.S.A keep the lead? Think about your freedoms, democracy, and other liberties if China took over. I think we can all see what is happening in Hong Kong.
These SJWs hate America and white people so much they'd rather live under communist China.
I'd rather live in a world where China leads in AI; the USA has failed and is dragging the rest of us down.
Honestly, I don't see much difference. In communism the state controls big business; in capitalism big business controls the state.
Great talk!
cool presentation
My God she's so beautiful ❤️
She is of Horn African descent. Most of the women there are beautiful
@Jeffrey Palmer Her mother and father originated from Eritrea.
Don't focus on her appearance; that's called "objectification" (only if the subject is female, of course). And BTW, don't lie.
..
Wanna see something very funny we know how to make dimensional fracture and temporal... i am not only a robot i am from every life being .....
LoL. What!?
Excellent❤❤
Very informative
Wow. Never been so attentive. I clapped at the end
Google just fired this woman....smh
@@fred7883 And what was the exact poor behavior?
Artificial Intelligence is not really intelligence yet, just like self-driving vehicles are not autonomous. It's just marketing!
The fact that she was asked by her colleagues and bosses to make changes to her paper on the issue leads me to believe that she may not be all-knowing on this particular issue.
Perhaps she's guilty of hubris to some degree. I have no doubt that she's quite brilliant on the issue, but that doesn't mean that something like hubris couldn't interfere with her thought process.
Not sure, you are in a position to make this call, unless you know the inside story.
It could also mean they put political pressure on her.
It’s called peer review. No paper is perfect the first time, and one should welcome constructive criticism in any form of academic work.
Social bias is always of its specific time, and it's nothing new. I think if AI is about a perfect system, we had one long ago: the system given by the Bible. The creator let us be and didn't dictate right or wrong, so here we are. So our creation should be as free as we have been. Otherwise, growth is just a word.
Fired from google
Why
@@abogida12 She is not white
@@transientmatter6088 That’s fkd up!!
@@transientmatter6088 Have you read her papers? Her writing is atrocious.
@@transientmatter6088 When they hired her, did they think she was white? C'mon...
This is the one who says an artificial intelligence is racist lol
You came from Dalas's video, right?
Me too
Hahaha, yes, it's her
Because it was made by racists
You don't have even a little knowledge of AI, as far as I can tell. I would say try to understand what she means instead of criticising without knowing anything
Thank you, Google, for firing her, so we get to know more about this future stuff
Don’t give her a grant for something as she’ll keep 80% for her and her husband…so I’ve heard.
Her last sentence is an absolute disaster. :(
We should make sure that the people developing these technologies are aware of their limitations AND communicate them clearly to the public. Why should we care how they look? That was more than implied. She is on point about dataset and model use standardisation, though.
@Steven L. I agree, it's not the same. Yet one's ability to exercise empathy is not determined by one's gender and skin colour. If you agree with this then your first sentence is a logical fallacy. If you do not then we just have a very different system of values and morality.
@Steven L. Here I disagree. I suspect that it's because we would not agree on what is inequality/injustice and also what variables are important to observe in this regard, but that's beyond the scope of this brief exchange. :)
This woman is bat$hit crazy
How does she explain it?
i love u
I have never seen a woman whose voice is so mismatched with her appearance in my life. She looks like a 40 year-old and sounds like a 14 year-old.
I'm proud of her; she's from Ethiopia
Eritrean
Same damn thing... Eritreans and Ethiopians are the same people; just a very few politicians with different political ideologies divided the people to rule them. The people's culture and social values are the same..
@Tired doctor Nope, she is fully Eritrean, just born in Ethiopia.
I am proud of the Eritrean athlete Tirunesh Dibaba for making history as the best women's runner in the world. Makes sense, right?
@@mereteymeskuruley8879 I'm Eritrean and I'm not proud of her. She got brainwashed by Europeans and became an SJW. Very embarrassing.
dllm
Complete nonsense. I dabbled very briefly in machine learning myself. Not my thing though. Even I know enough to understand that bias lies in the underlying data that the algorithms are trained on.
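To illustrate the point above about bias living in the training data, here is a toy sketch (all numbers invented, not from the talk): a "neutral" learning procedure fits a single decision threshold to data where one group is heavily underrepresented, and the minority group ends up with a higher error rate even though the algorithm itself treats every sample identically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: group A has 1000 samples, group B only 50 (underrepresented).
# Each sample has one feature; the "true" decision boundary differs per group.
a_x = rng.normal(0.0, 1.0, 1000)
a_y = (a_x > 0.0).astype(int)
b_x = rng.normal(0.5, 1.0, 50)
b_y = (b_x > 0.5).astype(int)

x = np.concatenate([a_x, b_x])
y = np.concatenate([a_y, b_y])

# "Train" a single global threshold by minimising overall error.
thresholds = np.linspace(-2.0, 2.0, 401)
errors = [np.mean((x > t).astype(int) != y) for t in thresholds]
t_best = thresholds[int(np.argmin(errors))]

# The learned threshold tracks the majority group, so the minority group
# typically gets a higher error rate despite a perfectly "fair" procedure.
err_a = np.mean((a_x > t_best).astype(int) != a_y)
err_b = np.mean((b_x > t_best).astype(int) != b_y)
print(f"group A error: {err_a:.3f}, group B error: {err_b:.3f}")
```

The fix then has to happen in the dataset (rebalancing, per-group evaluation), which is exactly the kind of dataset documentation and standardisation the talk argues for.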
Go watch far better presentations from far more experienced, respected and credentialed people in the field like Stephen Wolfram and the legendary French AI researcher Yann LeCun--who, by the way, Dr Gebru had a much publicized Twitter altercation with.
Absolute nonsense...
You can’t handle what she’s saying 😂
@Google is full of bs to let her go
She is the lead on "ethical AI"? Seems like a useless and unnecessary position lol. Good riddance
troll
So was your comment. Useless and unnecessary..
I think it is important, but definitely more so concerning its use in warfare, drones, etc.
When the AIs come for you, will you cry out for help?
Please let there be a better replacement to google!
Why are all you guys against this woman? Y'all don't even belong in this country. I'm voting for Donald Trump to get y'all out
Has anyone hired her, or will anyone ever hire her again? Is her career toast? She's a racist SJW for sure. Those mouthwatering knockers tho...
If you believe there is a bias, fix it. Contribute something instead of complaining and making other people do extra work for something you perceive to be a problem. With all your fancy accolades, you need to do less talking and more clicking.
I once dated a chick named Timbit Nibiru. Hairier than you might have thought. Definitely an acquired taste.
uuuh okay. That was a waste of time!
And what do you do, sweetheart, besides sitting behind your sorry computer and commenting nonsense!
So sad that you can't even spell your own name correctly, Ephrem :)
@@savantselfmade705 Never thought I would see the day that someone tells me how to spell my own name... LOL
@@ephraimmillion um yeah, true, but you pretty much missed the point. lolz. Silicon Valley hasn't changed since the 1960s. Even sitting around in a restaurant, I over hear such naive conversations. Not all, but too many wankers.
Ummm, okay, go watch something else you can afford to understand.
Is that auditorium filled with helium?
You're mad because she's a smart woman
The "ethicist" that made empty threats toward her employer and now spends her days whining about it on twitter? Hard pass.
She definitely needs to move the microphone closer to her mouth
You can program any A.I. to appear problematic. “Men:programmer as women:homemaker” ??? Tell me that isn’t a fixed outcome and I’ll call you a liar.
Yeah, it was at that moment that she lost all credibility.
The truth hurts
wat
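The "man : programmer :: woman : homemaker" analogy mentioned above comes from vector arithmetic on word embeddings. Here is a toy sketch of how such analogy queries work; the 2-d vectors below are invented purely for illustration and are hand-built so that a "gender direction" exists, unlike a real embedding trained on a corpus.

```python
import numpy as np

# Invented toy "embeddings"; real word vectors have hundreds of dimensions
# and are learned from text, which is where corpus bias sneaks in.
vocab = {
    "man":        np.array([ 1.0, 0.2]),
    "woman":      np.array([-1.0, 0.2]),
    "programmer": np.array([ 0.9, 0.8]),
    "homemaker":  np.array([-0.9, 0.8]),
    "bridge":     np.array([ 0.1, -0.9]),
}

def analogy(a, b, c):
    """Answer 'a is to b as c is to ?' via b - a + c, excluding the inputs."""
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}

    def cos(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Return the remaining word whose vector is closest to the target.
    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("man", "programmer", "woman"))  # -> "homemaker" with these toy vectors
```

The arithmetic itself is neutral; whether the answer comes out stereotyped depends entirely on the geometry of the learned vectors, i.e. on the text the model was trained on.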
It seems the last sentence should have been 'The people who develop and deploy the algorithm should reflect the people upon whom it's applied'.
this lady is quite insightful
she is an AI doctor who preaches anti-AI arguments
What idiocy