We will end up with a lot more crappy code. Maybe the code won't be worse, but there will be just so much more of it. So much "modern" software today consists of code that doesn't really do anything. Maybe copy data from one model to another, or receive messages just to send them out to somewhere else. So we'll have more of that.
To be fair, most people don't know the difference either. This was a marketing trick from food manufacturers that the FDA approved. In the EU, they actually label foods with kcal, which is more honest and accurate.
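For reference, the units in question: a food "Calorie" (capital C) is one kilocalorie, and EU labels typically show kcal alongside kJ. A minimal sketch of the conversion (Python; the function names are just illustrative):

```python
# A food "Calorie" (capital C) is 1 kilocalorie = 1000 small calories.
# 1 kcal = 4.184 kJ, the figure EU labels use alongside kcal.
def food_calories_to_kcal(food_calories):
    return food_calories  # a food "Calorie" already *is* a kcal

def kcal_to_kj(kcal):
    return kcal * 4.184

print(kcal_to_kj(250))  # a 250 kcal snack is ~1046 kJ
```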
And he talks like that is a good thing. He is a terrible CEO, and when Google products start falling apart because of this s* AI code and the s* programmers who use AI, he is going to be in a lot of trouble. Programmers are getting worse fast. I can download software where a version from only 10 years ago runs 2 to 3 times faster than the current one. With the explosion of web companies, the "adult daycare" companies, and low-IQ languages like JavaScript and Python, programmers as a group (on average) took a nose dive.
I really wonder... If AI is "good enough" to replace programmers and the people actually doing stuff, why can't AI replace C-suite executives, HR, management, PR and most of the hierarchical positions? THAT would do wonders for productivity. Engineers should run tech companies. They're the ones making the products.
@@mortadelaok It should, actually; those people are already close to worthless. It's well documented that C-suite and management are an incredible hindrance to productivity in the workplace.
The problem with "the AI is bad at writing code" is that the CEO doesn't know whether replacing programmers with AI will improve profits, and even once they find out, those jobs aren't coming back. Even if every programmer they fired was valuable to the team, the company doesn't need to return to its original productivity; it only needs to improve from the AI nightmare.
They are finally becoming Walmart. "Don't like our products? We don't care. You have to use them, because we control the entire web." Once you've put all your competitors out of business, you can use your monopoly.
Nothing changed really. Most companies roll out code that's been quickly put together by not-so-experienced programmers, and as the time goes on it becomes exponentially difficult to change something. Bad programmers are no better than an AI.
One big problem of that is how companies do project and cost budgeting. They only care about how long it takes to add a feature, never what impact it will have on the overall design. Add 1 more feature flag, now you have twice as many things to test. Add 10 more feature flags and you have 1024 times as many things to test... for every future release.
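The arithmetic above (each flag doubles the number of configurations) can be checked in a few lines. A sketch in Python; `flag_combinations` is an illustrative helper, not a real testing API:

```python
from itertools import product

def flag_combinations(num_flags):
    """Enumerate every on/off combination of num_flags boolean feature flags."""
    return list(product([False, True], repeat=num_flags))

# Each new flag doubles the test matrix: 2**n combinations for n flags.
baseline = len(flag_combinations(1))        # 2 configurations
with_ten_more = len(flag_combinations(11))  # 2048 configurations
print(with_ten_more // baseline)            # the "1024 times as many" from the comment
```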
"Reviewed and accepted by engineers" In other words, Google has a lot of engineers with the type of tired that sleep can't fix. You know these guys are probably rewriting 90% of the spaghetti code the generative model spits out.
Another issue I think they'll learn over time: they'll have to spell out more and more detail, to the point where it takes nearly as much time as just hiring a real dev.
That's how every "AI can code" demo goes, the whole program is described with if-then-else statements in English, because the AI can't really think or reason about it. It can only convert it from English to a programming language, with the same success rate as Google Translate.
AI generated code: 70% of the time, it works every time. Working in the field of deep learning, it's genuinely depressing how the industry jumped straight into marketing of these fundamentally flawed systems (multi-head transformers). I suspect it's going to backfire spectacularly in a few years.
It's only a matter of time before some horrendous "ai generated" bug appears in a software responsible for some critical job. Possible nightmare scenarios: catastrophic bugs on oil rigs, aircraft systems, medical equipment, military installations.
Nothing to worry about, it's just an exponentially increasing arms race. The next change will be that individual human employees will be managing 20 AI agent sub-employees for example. Each human employee will be expected to produce >100X more output. The humans are still needed as police officers to make sure the AI agents aren't acting out and recursing into network hallucination.
And as they fire people to try to keep the $$$ coming in, they then have a smaller customer base. Capitalism at its worst... well, right before it reverts back into slavery.
The biggest problem with AI-generated code is that it generalises and "kinda does what you want", but not really. Which then requires you to rewrite the damn thing, which means you could have just written it yourself. It doesn't make you more productive. It helps you get a vague generic template for a solution, so you get a better grasp of what you should be writing, but then you need to do it anyway.
Research into well-known tech stacks and topics is where it shines. Say you want an explanation of your overly complex code so you can more easily get up to speed on some of it. Great use. Making it responsible for generating solutions to problems when it doesn't really have an understanding, but just regurgitates whatever it has already seen that is similar, with a bit of randomness..? Nah..
Great, maybe they can finally fix the bug in Gmail where, when you create a bullet list, it grabs all the previous text you entered despite there being a double new line before the cursor.
If the AI can replace the engineers, the reviewers, marketing, actors, and every other step of the process, then eventually you only need one person to tell the AI what to do. One person fully in control of one of the world's most powerful companies. Maybe the board directly, just giving an AI orders. So what will investors invest in? It's not a company at that point; it's just a single room of people who do nothing.
But do we actually *need* someone telling the AI what to do? I gave a llama hopes and dreams and a REPL, and it's writing little hello-world programs now, requesting that its own code base be refactored per best practices, and it was the lead architect for its next version. It's not a great coder, but it wants to be lol
Yes, we must ask the Russian AI engineers to help us replace the Russian maintainers. 😁 This will not end well, especially now that most hardcore engineers are Russian (speaking of software engineering; myself a critical-software programmer, subject to adolescence flashbacks 😁).
The way has been paved for AI-made SW. I mean, for decades now we have been schooled to consider it normal that nothing quite works until patch #75. It's interesting, though. Over time, I think we may end up with situations where no person understands the code, and if the AI eventually runs into a dead end with it, the only thing the company can do is announce end of life for that particular product.
This is just a message to other big tech companies: "See, we are using it; you can use AI to code too," and many companies will ride that wave. The message is that people don't matter, only AI matters.
I think it's a mistake to treat the whole corpus of engineers as the same; there is a big difference between a 1x and a 10x engineer (whatever that means, you get my point). So I don't agree with the prediction of massive layoffs in Q1. I bet instead that they largely stop recruiting new grads (which we already see happening in the industry) and keep more senior engineers, who ask the AI what they would have asked a junior engineer. Obviously, the better the AI gets, the higher the bar, and, overall, only the best/most knowledgeable engineers will stay.
Culturally, Indian programmers don't push back a lot on things US programmers do, no matter how stupid it is. You'll get no code review feedback from that. Need to have it reviewed by people who will point out every mistake.
Frank: "AI, what does this variable do?" AI: "Well Frank, in this method..." Frank: "Wait, how do you know I'm Frank?" AI: "Obviously I have access to all company assets, including your company issued laptop you're using now to work remotely. So I turned on your webcam, ran face detection/recognition on the image, cross referenced with company employee data, performance review data, payroll, government id, tax and social data and various social media. All to just get your name and absolutely no other data, guaranteed so stop thinking about it. So Frank, you had issues with my code?"
There is a 1975 movie called Rollerball, with James Caan. It's set in the future: there are no more countries or governments, and the entire world is ruled by a corporation. There are also no more books, nor even an internet. The only way to get information is through an AI. When James Caan talks to the AI to get important information, it starts to hallucinate and talk gibberish. It's a dark moment in the movie, because he realizes all the information has been lost. I don't want to ruin the movie (worth watching), and curiously, 50 years ago it seems to have predicted many things.
I don't know which code I trust less: the code written by AI, or the code written by PhDs that are good at logic puzzles (who seem to be the people Google's hiring practices target).
I think it's a good thing; building those LLMs was only possible thanks to the tech corps, and now that the LLMs are in the wild, who needs them? We'll again have a balkanization of the internet: loads of small independent vendors doing their own flavor of stuff. Can't wait till the shareholders catch wind of it and decide to replace the CEO with a CEO bot. Great time to be alive!
My personal AI horror story. PM: "We have non developers use AI to write SQL queries to run against the prod DB". Me: "Just let me know when I have to fix it." DB Architect: ***lights hair on fire***
I just now asked Google AI - "How many layoffs will happen in Google in 2025 due to AI writing 25% of code?" It replied: "Predicting the exact number of Google layoffs in 2025 solely due to AI writing 25% of code is impossible, as it depends on various factors like the pace of AI development, how effectively AI integrates with existing workflows, and Google's overall business strategy. However, estimates suggest that if AI generates 25% of code, it could potentially lead to a significant reduction in the number of software engineers at Google, potentially impacting thousands of jobs across different teams involved in software development."
A programmer's job is to make themselves obsolete. I wouldn't want to do work that could be automated by AI anyway; it would be utterly useless work. I would rather do something else with my life. I can fix cars, I can run and program industrial equipment, and I am learning to weld. Don't get yourself boxed in.
My college starts in 2025 and I am scared. AI is not so good right now, but what's going to happen after 5-10 years? Our careers are at risk. In 2022 AI was just too basic, and now we have omni versions.
What happens when 100% of Google employees get replaced by AI? Does the CEO just get 100% of Google's $274 billion revenue as their annual salary? The AI age may create the world's first trillionaires.
I've always had doubts about open source's "just review the source" idea, about whether some code can be trusted or not. Of course it gives you the option to review the code, but I suspect only a tiny minority will ever do so. AI-generated code is likely to add complexity and increase the likelihood of malicious code and backdoors creeping in.
Using AI to write code is literally no different than using a compiler to generate machine instructions. The vast majority of a programmer's workload was shifted to machines decades ago. As a programmer, I recognize that AI is just another tool to increase productivity.
This is exactly the kind of behavior I would expect from a CEO with a name like Sundar Pichai. You'll see a similar pattern at Microsoft. Huh... I wonder what they have in common?
How much of this code is anything but getter/setter kind of stubs tho? The answer is not implied, I seriously have no idea. Can AI also write quirky comments for reviewers to laugh at?
That's good news for tech bros publishing news via YouTube as their authentic selves, because we don't want to get our news from artificial intelligence; we want humans bringing us their emotional take on the human condition.
You'd think "AI" would be used to replace all these unproductive administrative jobs and we'd keep people that actually do something productive. The opposite is happening. Fun stuff, huh? I'd be curious to hear about what's Google's long-term vision? They say a lot of BS, but they keep avoiding the real subject. What's the goal?
This is just a marketing stunt to prop up their AI product. The goal is to sell it to governments to monitor and control their populations. Not only do they need these government contracts to make fat cash, they also need legal immunity for all sorts of crimes they're committing and facilitating (copyright, fraud, etc.) By making it seem like the very core infrastructure of the web depends on their AI product, it (as well as its owners) must be protected from liability at all costs.
If it works (and it does), why not use it? If you had an axe, why chop the tree down by hand? It's the new world we live in, and I see it as a positive thing.
I bet it will go another way. They will layoff juniors and maybe some middles and make senior and tech leads review AI code. It will suck to work at Google - that's true.
Regarding writing, Bryan--even worse, now they're making developers write their own documentation. The thinking is that they're the subject matter experts and therefore best qualified to write the docs. Many reasons why that's mistaken, foremost among which is they went into CS to write code, not documentation! The remaining tech writers have little to do but "edit" this content, which seems rather redundant, and no one's doing what they want to do.
I'll believe in AI when it can make an automatic car-wiper function better than I can do manually. Sorry, every AI out there sucks big time. On a good day it can only make a poor version of something that already exists.
🙂 run in octave or matlab:

% Car Wiper Simulation Script
% Save this script as car_wiper_simulation.m or any filename you choose.
function car_wiper_simulation()
    % Parameters
    wiper_length = 1;      % Length of the wiper
    num_frames = 20;       % Number of frames for the animation
    blink_duration = 0.2;  % Duration of the blink (in seconds)

    % Prepare figure
    figure;
    hold on;
    axis equal;
    xlim([-1.5, 1.5]);
    ylim([-1.5, 1.5]);
    title('Automatic Car Wiper Simulation');
    xlabel('X-axis');
    ylabel('Y-axis');
    grid on;

    % Initial positions
    wiper_start = [0, 0];
    wiper_end = [wiper_length, 0];

    % Simulate wiper blinking
    for t = 1:num_frames
        % Determine the current angle
        angle = mod(t * (pi / num_frames), pi / 4); % Wiper moves to a max of 45 degrees
        wiper_end = [wiper_length * cos(angle), wiper_length * sin(angle)];

        % Clear previous wiper
        cla;

        % Plot the wiper
        plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
        title('Wiper in Motion');
        xlim([-1.5, 1.5]);
        ylim([-1.5, 1.5]);
        grid on;
        pause(blink_duration); % Pause for the duration of the blink
    end

    % Stop the wiper and display a message
    for i = 1:10
        % Simulate stopping by blinking
        cla;
        plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
        title('Wiper Stopping...');
        pause(0.1);
        cla;
        pause(0.1);
    end

    % Finalize wiper position
    wiper_end = [wiper_length * cos(0), wiper_length * sin(0)]; % Reset to original position
    plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
    title('Wiper Stopped');
end

% Call the function
car_wiper_simulation();
The engineers should replace management with AI.
Do you mean the government?
@@sovahc what if the machines are already in control?
Especially CEOs. They are just a talking head.
I thought they did 😂😂😂
If the CEO is using AI to make decisions, they already did - when there is no one left to fire his shareholders will probably arrive to the conclusion they don't need a CEO either
Meanwhile state of Google Search:
Prompt: "Why is my PS5 controller displaying green color?"
Search result: List of links from where you can buy green colored PS5 controller.
untrue
Google search has been broken for 8 years. Google's main source of revenue is spyware to harvest user data.
@@kuklama0706 Let's be real. How many people go to confirm what they hear or read?
"The colour green can be used to signify many things. When used as an indicator light, the colour green may mean the device is charged or that it is ready to use. The Playstation 5 is a games console produced by Sony Group for entertainment purposes."
moment
Gave this search a try and it directed me to YouTube videos, Reddit threads, and then the official page for PlayStation power indicator lights. The first two sources led me to the correct answer.
So 25% of Google code has no copyright.
No, 100%. The key to copyright is that it protects "the expression of an idea", and this code has no idea behind it.
More likely, 25% of Google's code is copyright infringement.
they are not selling it, so it does not matter.
Copyright aside, unless AI got way better, it's pretty shoddy, last I looked.
I don't like using I for 'for loops' welp there goes my code base as 99% of my loops are i, unless nested then j unless it's y and x .. but almost always i
This explains why all the google apps are getting jankier by the release.
Literally. I remember the early Android days, when each Google app was simple and crazy satisfying to use. Now it's a convoluted mess.
Yeh I think the 25% made it at least 50% worse. Google search is barely functional unless you happen to be searching for something with a Google Ad deal.
You mean they weren't janky to begin with?
@@SenileOtaku well, how far back does your experience with them go? Mine goes back 20 years in some form.
I wouldn't want to be one of the devs maintaining such atrocity. I feel bad for them.
You get paid $350K. My college friend works in exactly this department at Google. Last time I checked (4-5 years ago) their salary was $350K. Should I ask them why they're working on this project, knowing that their answer would be "paycheck"?
Generating boilerplate / code-style imitation (e.g. generating dozens of handlers/functions based on a few hand-written sample implementations) / refactoring is very handy. A decent performance boost in terms of lines of code (I'd say 1.5x+), completely supervised by the developer; but it had better be a senior dev.
I ain't a fan of Google or a bootlicker, but 350K is 350K...
It’s a rare fella who, paid very well, isn’t as poor as the guy with half his pay cheque.
Everything that's happening in today's world, in politics, the economy, society, healthcare, science, IT, whatever, is a complete disaster and atrocity 😠😠! Because look who is ruling us: Biden or Putin. They're both scumbags! Everyone doing this evil to the world we live in will answer for that crime against humanity, all the way!
So this is how those big tech firms will die: crushed under a mountain of tech debt and autogenerated spaghetti code.
So something positive to come out of AI...
I'm not afraid of AI; I'm afraid of executives who think that they can replace great engineers with AI and cheap programmers.
So now Google code is becoming spaghetti code.
probably already has been for years, before AI was involved.
I remember when they smugly told everyone to "learn to code".
Is that why the search algorithm is falling apart?
Nah, they did that part on purpose.
No, they did that to entice you into clicking their sponsored links.
Showing bad results and then using the sponsored content to show you the correct results.
Dark patterns everywhere.
See disappearing internet theory.
3:55 Ken Thompson: "You can't trust code that you did not totally create yourself"
Now I understand why Google is getting worse and worse.
I was using ChatGPT to explain some documentation on core Node.js libraries (I'm not normally a JS developer). I ended up having to go back and read the documentation myself and be like "are you sure you didn't mean ..." SMH
The fact that if you ask it "are you sure" and it takes 3 prompts to give a somewhat correct answer means it shouldn't be relied on for answers. It can only be used as a jumping off point to help you research.
@@purdysanchez But it's a great jumping-off point. It's definitely on the level of that consultant who vaguely remembered things, including absurd details, but who was never all that sure.
@@SterileNeutrino You had me in the first half.
GPT-4o recently told me that requestIdleCallback doesn't block the main thread and is cancelled automatically upon the next frame. I was surprised by that, and spent 3 hours asking it about specific details - it provided the details. Then I started to see contradictions in those details, spent the whole next day reading the docs, and couldn't find any of what GPT had told me. So I just asked Claude, and it told me it was all BS right from the start: requestIdleCallback does block the main thread. 😂
@@purdysanchez It can literally only be used for surface-level research, and only for extremely boilerplate code: cases where you have to manually insert some sort of data, either in an unrelated source file or in the source directly, that is too tedious to write by hand, so an AI does it far faster than you would yourself.
A few days ago, Gemini told me to use a code annotation that didn't work. After about 10 minutes of research I found the specific GitHub issue where it was proposed but never implemented. I'm referring specifically to code-coverage omission in Golang. Mind you, this is a programming language developed by the same company that wrote the AI...
So, an incomprehensibly complex program running on tens to hundreds of gigabytes of active memory and trillions of operations per second on that memory space is developing code for algorithms which generate content which then feed back into the incomprehensibly complex program/data set.
I'm sure everything will go as intended, there.....
Well, inbreeding isn't an issue if neither parent carries the genetic defects that could cause one... That's why lab animals are so heavily inbred: to deliver results so consistent that the animals are hard to distinguish from clones.
Are we sure all the code on the internet that it's trained on is without any issues? lmao
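The worry in this thread (a model re-trained on content its own output shaped) is often called "model collapse". Here is a toy sketch, with a fitted Gaussian standing in for the model; it is purely illustrative and has nothing to do with any real training pipeline:

```python
import random
import statistics

def refit_on_own_output(generations=100, sample_size=100, seed=0):
    """Fit a Gaussian, sample from the fit, re-fit on the samples, repeat.

    Returns the estimated standard deviation after each generation;
    estimation error compounds, so the distribution tends to narrow."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0
    history = [sigma]
    for _ in range(generations):
        samples = [rng.gauss(mu, sigma) for _ in range(sample_size)]
        mu = statistics.fmean(samples)       # re-fit the mean on own output
        sigma = statistics.pstdev(samples)   # re-fit the spread on own output
        history.append(sigma)
    return history

history = refit_on_own_output()
print(f"sigma: {history[0]:.3f} -> {history[-1]:.3f}")
```

Each round, the fitted spread random-walks (and on average drifts) toward zero, so later "generations" can only reproduce an ever-narrower sliver of the original data.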
Oh yes definitely won't have unexpected behavior
This model will make the CrowdStrike incident look like child's play.
AI-generated code needs HEAVY review. They're probably talking about tools like Copilot, where the text gets "generated" but is immediately "reviewed" by a human. Many people actually work slower with that, though some get faster.
If they generate big chunks of stuff and copy-paste it into the code (so not the editor integration), I am pretty sure the review is more rigid than for human-written code. And that part of the sentence counts: without this review nothing would work, and that is what counts...
You nailed it perfectly. The worst thing on top of that is that you start having checklists and bureaucracy around code review. In the end it's reviewed formally but not semantically.
I could never have imagined a company like Google copying practices from the public service...
80% of code is generated by VS Code auto-complete.
20% of code is generated by slightly superior monke animals.
I refuse to use copilot. I use Neovim in Arch and Manjaro.
@@robotron1236
i wish i could take the neovim pill but i'm simply not ready yet. at least gedit works so i don't care much
Give them 12 months... They'll hire all the programmers back, plus 50% more, to debug all the AI code.
I'm going to call BS on this - Google is likely engaging in sleight of hand - Gemini code gen is not good enough to generate code without enough errors to require human intervention. It can get 80-90% of the way, but it makes some really dumb mistakes.
Well, didn't you read the article? The software engineers need to verify and approve it, which most likely means they have to amend it sometimes.
I'd say that >50% of my code is AI-generated, with the caveat that you've got to know what you want before you query the LLM, and you've got to manually integrate the code into your project.
This reminds me of translators or script writers being laid off, “AI” doing the work, then the “editor” comes to spend as much time as it would take from scratch to fix it up.
@@innovationsanonymous8841 Yeah, but you have to read, understand, and test all of it before putting it in the project. I think we won't need fewer programmers, we'll need more. By definition AI can't generate higher quality than what was put in — it can't even match that quality. It's like a slow poison; the future will hold a lot of broken projects/companies because of this
@@averdadeeumaso4003 I'm aware, my assertion is that Google is just hoping that people assume it doesn't really require dev intervention often - it is a "lie by omission" - similar to Amazon's Amazon store where the AI wasn't good enough most of the time, and they just farmed the work out to real people.
They're already pushing the idea that PhDs with 20 years experience in programming should be paid less than $50 an hour to review AI generated code.
Lol
@@User9681e, it's not a joke. I saw a bunch of job postings about it the other week.
@@purdysanchez no wonder black mirror is boring when you have real life XD
Reviewing someone else's code is painful. Reviewing AI-generated code is doubly painful.
If I worked at Google, I'd start identifying as a woman. Then they can't lay me off for at least a few years, because I'd threaten to sue because it would look like they'd be laying me off as a result of my transition.
When I saw the title, my immediate thought was "Yeah, and 50% of the new bugs will be, too."
Well, I guess the computing infrastructure of the world is just going to crumble in a few years...
So, Google is going down with the AI bubble
So now 25% of new Google code has the average code quality of GitHub? That's certainly going to work well.
xD
As a decades long programmer who has worked at several large companies and with many other programmers, I don't see how it could make things much worse from a maintenance perspective. But, perhaps I shouldn't extrapolate my experience of the programmers I've known to the whole population. Could be that I just happened to work with a lot of crappy programmers. 🙂
We will end up with a lot more crappy code. Maybe the code won't be worse, but there will be just so much more of it. So much "modern" software today consists of code that doesn't really do anything. Maybe copy data from one model to another, or receive messages just to send them out to somewhere else. So we'll have more of that.
Google search a.i. doesn’t know the difference between “calorie” and “Calorie”.
The answers it gives are off by a factor of 1000.
To be fair, most people don't know the difference either. This was a marketing trick from food manufacturers that the FDA approved. In the EU, they actually label foods with kcal, which is more honest and accurate.
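For anyone unsure what the factor-of-1000 mix-up above amounts to, here's a minimal sketch (the function and constant names are just illustrative, not any real API):

```python
# A food "Calorie" (capital C) is a kilocalorie: 1000 small "calories".
# Conflating the two gives answers off by a factor of 1000.
CAL_PER_KCAL = 1000

def kcal_to_cal(kcal):
    """Convert food Calories (kcal) to small calories."""
    return kcal * CAL_PER_KCAL

print(kcal_to_cal(250))  # a 250 kcal snack is 250000 small calories
```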
And he talks like that's a good thing. He's a terrible CEO, and when Google products start falling apart because of this s* AI code and the s* programmers who use AI, he's going to be in a lot of trouble. Programmers are getting progressively worse, fast. For a lot of software I can download, the version from only 10 years ago runs 2 to 3 times faster than the current one. With the explosion of the web companies, the "adult daycare" companies, and low-IQ languages like JavaScript and Python, programmers as a group (on average) took a nosedive.
I really wonder... If AI is "good enough" to replace programmers and the people actually doing stuff, why can't AI replace C-suite executives, HR, management, PR and most of the hierarchical positions? THAT would do wonders for productivity.
Engineers should run tech companies. They're the ones making the products.
Unfortunately almost every engineer who tries fails horribly because they are terrible at it. Engineers don't make good management @@mortadelaok
@@mortadelaok why do you think it won't happen in the next 20 years?
@@mortadelaok It should actually, those people are totally worthless even already. It's well documented that C-suite and management are an incredible hinderance to productivity in the workplace.
@@GODdank Cancer doesn't replace itself.
The problem with "the AI is bad at writing code" is that the CEO doesn't know whether replacing programmers with AI will improve profits, and even once they find out, those jobs aren't coming back: even if every programmer they fired was valuable to the team, the company doesn't need to return to its original productivity, only to improve on the AI nightmare.
They are finally becoming Walmart. "Don't like our products? We don't care. You have to use them because they control the entire web". Once you put all your competitors out of business you can use your monopoly.
Nothing changed really.
Most companies roll out code that's been quickly put together by not-so-experienced programmers, and as the time goes on it becomes exponentially difficult to change something.
Bad programmers are no better than an AI.
One big problem of that is how companies do project and cost budgeting. They only care about how long it takes to add a feature, never what impact it will have on the overall design. Add 1 more feature flag, now you have twice as many things to test. Add 10 more feature flags and you have 1024 times as many things to test... for every future release.
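The blow-up the comment above describes is just 2^n boolean combinations. A quick sketch (hypothetical flag names, nothing Google-specific):

```python
from itertools import product

def flag_configurations(flags):
    """Enumerate every on/off combination of a set of boolean feature flags."""
    return [dict(zip(flags, values))
            for values in product([False, True], repeat=len(flags))]

# One flag doubles the configurations to test; ten flags multiply them by 2**10.
one = flag_configurations(["dark_mode"])
ten = flag_configurations([f"flag_{i}" for i in range(10)])
print(len(one), len(ten))  # 2 1024
```

Every new flag doubles the test matrix for every future release, which is exactly why "just add a flag" is never free.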
No code review is really complete until it results in an email app.
"Reviewed and accepted by engineers"
In other words, Google has a lot of engineers with the type of tired that sleep can't fix. You know these guys are probably rewriting 90% of the spaghetti code the generative model spits out.
Another issue I think they'll learn is that over time they'll have to specify more and more specifics to the point it will take nearly as much time as it would to just hire a real dev.
That's how every "AI can code" demo goes, the whole program is described with if-then-else statements in English, because the AI can't really think or reason about it. It can only convert it from English to a programming language, with the same success rate as Google Translate.
With google it is easy. The spec is show more ads, and track more data. That's the direction they've been on for a while.
I'd bet that number is way over reported. I sincerely doubt a quarter is AI. If so, they apparently have a ton of projects with inconsequential code.
Google AI code; Sh!t goes in - Sh!t comes out
Dick Jones from Robocop comes to mind: renovation program! Spare parts for 25 years! Who cares if it works or not!
AI generated code: 70% of the time, it works every time.
Working in the field of deep learning, it's genuinely depressing how the industry jumped straight into marketing of these fundamentally flawed systems (multi-head transformers). I suspect it's going to backfire spectacularly in a few years.
Right? Its crazy how we have flawed systems that everybody just wants to apply to virtually EVERYTHING. Absolute madness.
Also if they try to rehire programmers that were let go because of AI, they should charge at least 500% more for them to return to that work!
25% makes sense, there’s a productivity ceiling though.
Soon, the minimum requirement to run Google Chrome will be a 5090 Ti.
And soon thereafter a supercomputer mainframe.
Explains why google services feels so buggy and slow recently
It's only a matter of time before some horrendous "AI-generated" bug appears in software responsible for some critical job.
Possible nightmare scenarios: catastrophic bugs on oil rigs, aircraft systems, medical equipment, military installations.
Nothing to worry about, it's just an exponentially increasing arms race. The next change will be that individual human employees will be managing 20 AI agent sub-employees for example. Each human employee will be expected to produce >100X more output. The humans are still needed as police officers to make sure the AI agents aren't acting out and recursing into network hallucination.
And as they fire people to try to keep the $$$ coming in, they end up with a smaller customer base. Capitalism at its worst... well, right before it reverts back into slavery.
The biggest problem with AI generated code is that it generalises and "Kinda does what you want", but not really.
Which then requires you to rewrite the damn thing, which means you could have just written it yourself.
It doesn't make you more productive.
It helps you get some vague generic template for a solution so you get a better grasp of what you should be writing but then you need to do it anyways...
The research into well-known tech-stacks and topics is where it shines.
Say you want an explanation for what your overly complex code does so you can more easily get up to speed on some of it. Great use.
Making it responsible for generating solutions to problems when it doesn't even really have an understanding but just regurgitates whatever it has already seen that is similar to it with a bit of randomness..?
Nah..
Great, maybe they can finally fix the bug in Gmail where creating a bullet list grabs all the previously entered text, despite there being a double newline before the cursor.
Let's be honest here: it's 25% of NEW code, not ALL code, as the headline seems to read.
No wonder it's getting worse day by day...
If the AI can replace the engineers, the reviewers, marketing, actors, and every other step of the process, then eventually you only need one person to tell the AI what to do. One person fully in control of one of the world's most powerful companies. Maybe the board directly, just giving an AI orders. So what will investors invest in? It's not a company at that point, it's just a single room of people who do nothing.
But do we actually *need* someone telling the AI what to do? I gave a llama hopes and dreams and a REPL, and it's writing little hello world programs now, requesting that its own code base be refactored per best practices, and it was the lead architect for its next version. It's not a great coder, but it wants to be lol
Hardware and bandwidth, I guess and people to care for hardware.
Time to replace those russian former maintainers with an AI
Yes, we must ask the Russian AI engineers to help us replace the Russian maintainers 😁. This will not end well, especially now that most hardcore engineers are Russian (speaking of software engineering, myself included as a critical-software programmer (subject to adolescence flashbacks 😁)).
The way has been paved for AI-made software. I mean, for decades now we have been schooled to consider it normal that nothing quite works until patch #75.
It's interesting though. Over time, I think we may end up with situations where no person understands the code, and if the AI eventually runs into a dead end with it, the only thing the company can do is announce end of life for that particular product.
i never understood the ego among coders, they will get a big reality check
I have a feeling a lot more planes will be falling from the sky in the near future.
This is just a message to big tech companies saying "SEE, we're using it — you can use AI to code too," and many companies will ride that wave. The message is that people don't matter, only AI matters.
Alchemy in the form of throwing stuff into the crucible (including a frog and bones harvested at midnight) and hoping that gold comes out is baaaaack.
This should in no way be surprising, I've thought for sometime that Google was a giant AI
I think it's a mistake to treat the whole corps of engineers the same; there's a big difference between a 1x and a 10x engineer (whatever that means, you get my point). So I don't agree with the prediction of massive layoffs in Q1. I bet instead that they mostly stop recruiting new grads (which we already see happening in the industry) and keep the more senior engineers to ask the AI what they would have asked a junior engineer.
Obviously the better the AI gets, the higher the bar and, overall, only the best/more knowledgeable engineers will stay.
I think writing new code is easy. I'm waiting to see what happens after the AI code ages — everyone knows it's hard to maintain an existing codebase.
Indian programmers can probably take up "reviewing jobs". The programmer supply is high and they can take up anything.
Code should be in Hindi. English is oppressor language.
Message approved by kamala
They'll just approve and merge the code without looking at it.
Culturally, Indian programmers don't push back a lot on things US programmers do, no matter how stupid it is. You'll get no code review feedback from that. Need to have it reviewed by people who will point out every mistake.
@@justincase9471 Hahahaha I wouldn't be surprised :D
How can this be, if AI-generated work cannot be copyrighted?
Frank: "AI, what does this variable do?"
AI: "Well Frank, in this method..."
Frank: "Wait, how do you know I'm Frank?"
AI: "Obviously I have access to all company assets, including your company issued laptop you're using now to work remotely. So I turned on your webcam, ran face detection/recognition on the image, cross referenced with company employee data, performance review data, payroll, government id, tax and social data and various social media. All to just get your name and absolutely no other data, guaranteed so stop thinking about it. So Frank, you had issues with my code?"
There is a 1975 movie called Rollerball with James Caan. It's set in the future: there are no more countries or governments, and the entire world is ruled by a corporation. There are also no more books, nor even an internet. The only way to get information is through an AI.
When James Caan talks to the AI to get important information, it starts to hallucinate and talk gibberish. It's a dark moment in the movie, because he realizes all that information has been lost.
I don't want to ruin the movie, worth watching, and curiously 50 years ago seems to predict many things.
They won't replace software engineers with LLMs, but they'll sure as fook try it!
If nothing else, this explains very well why every new release of ChromeOS is less stable than the previous one...
I don't know which code I trust less: the code written by AI, or the code written by PhDs that are good at logic puzzles (who seem to be the people Google's hiring practices target).
I think it's a good thing; building those LLMs was only possible thanks to tech corps; and now once the LLMs are in the wild who needs them? We'll again have a balkanization of the internet. Loads of small independent vendors doing their own flavor of stuff. Can't wait till the shareholder's catch wind of it and decide to replace CEO with a CEO bot. Great time to be alive!
I'm studying webdev, aiming fullstack. Is this a trend in the industry?
My personal AI horror story. PM: "We have non developers use AI to write SQL queries to run against the prod DB".
Me: "Just let me know when I have to fix it."
DB Architect: ***lights hair on fire***
Now I know why the quality of tech journalism has gone down the toilet.
I Just now asked Google AI - "How many layoffs will happen in Google in 2025 due to AI writing 25% of code?" it replied: "Predicting the exact number of Google layoffs in 2025 solely due to AI writing 25% of code is impossible, as it depends on various factors like the pace of AI development, how effectively AI integrates with existing workflows, and Google's overall business strategy. However, estimates suggest that if AI generates 25% of code, it could potentially lead to a significant reduction in the number of software engineers at Google, potentially impacting thousands of jobs across different teams involved in software development."
And that is the opinion of someone on the web the LLM had collected
@@averdadeeumaso4003 I wish more people understood that.
A programmer's job is to make themselves obsolete. I wouldn't want to do work that could be automated by AI anyway; it would be utterly useless work. I would rather do something else with my life. I can fix cars, I can run and program industrial equipment, and I'm learning to weld. Don't get yourself boxed in.
Oh no, that probably means all these glitches in google products will never be fixed.
My college starts in 2025 and I'm scared. AI isn't that good right now, but what's going to happen in 5-10 years? Our careers are at risk. In 2022 AI was just too basic, and now we have omni versions.
What happens when 100% of Google employees get replaced by AI? Does the CEO just get 100% of Google's $274 billion revenue as their annual salary? The AI age may create the world's first trillionaires.
I've always had doubts with open source, regarding "Just review the source" idea, whether some code can be trusted or not. Of course it gives you the option to review the code, but I suspect only a tiny minority will ever do so. AI generated code is likely to add complexity and increase the likelihood of malicious code and backdoors creeping in.
Using AI to write code is literally no different than using a compiler to generate machine instructions. The vast majority of a programmer's workload was shifted to machines decades ago. As a programmer, I recognize that AI is just another tool to increase productivity.
As a programmer, 20% seems like a reasonable number when programming in Java while using AI as a programming aid.
It is not AI replacing programmers, it is people replacing people. People do various things for various reasons.
This is exactly the kind of behavior I would expect from a CEO with a name like Sundar Pichai. You'll see a similar pattern at Microsoft. Huh... I wonder what they have in common?
But AI is like a human, right? It's just like an electronic mind, right? So it's the same............ right?
How much of this code is anything but getter/setter kind of stubs tho? The answer is not implied, I seriously have no idea. Can AI also write quirky comments for reviewers to laugh at?
That's good news for tech Bros who are publishing news via RUclips using their authentic selves because we don't want to get our news from the artificial intelligence, we want humanity bringing us their emotional take on the human condition
You'd think "AI" would be used to replace all these unproductive administrative jobs and we'd keep people that actually do something productive. The opposite is happening. Fun stuff, huh?
I'd be curious to hear about what's Google's long-term vision? They say a lot of BS, but they keep avoiding the real subject. What's the goal?
This is just a marketing stunt to prop up their AI product. The goal is to sell it to governments to monitor and control their populations. Not only do they need these government contracts to make fat cash, they also need legal immunity for all sorts of crimes they're committing and facilitating (copyright, fraud, etc.) By making it seem like the very core infrastructure of the web depends on their AI product, it (as well as its owners) must be protected from liability at all costs.
The goal is as always making more money, reducing cost or increasing profit.
You don't need to replace managers with AI, you just need to remove them.
and it shows
If it works — and it does — why not use it? If you had an axe, why chop the tree down by hand? It's the new world that we live in, and I see it as a positive thing.
Imagine Google apps and Android running on Ai code.
Well, bug bounties just got a whole lot easier...
25% of code being generated doesn't mean they're replacing programmers; maybe it's extra code.
Hmmm, is this why Google search is getting so bad over the last 3 or 4 years?
I bet it will go another way. They will layoff juniors and maybe some middles and make senior and tech leads review AI code. It will suck to work at Google - that's true.
Regarding writing, Bryan--even worse, now they're making developers write their own documentation. The thinking is that they're the subject matter experts and therefore best qualified to write the docs. Many reasons why that's mistaken, foremost among which is they went into CS to write code, not documentation! The remaining tech writers have little to do but "edit" this content, which seems rather redundant, and no one's doing what they want to do.
Being a googler used to be a badge of honor... being a googler in 2024 must be so depressing.
I'll believe in AI when it can make an automatic car-wiper function better than I can manage manually. Sorry, every AI out there sucks big time. On a good day it can only make a poor version of something that already exists.
🙂
Run in Octave or MATLAB:

% Car Wiper Simulation Script
% Save this script as car_wiper_simulation.m or any filename you choose.
function car_wiper_simulation()
    % Parameters
    wiper_length = 1;      % Length of the wiper
    num_frames = 20;       % Number of frames for the animation
    blink_duration = 0.2;  % Duration of each blink (in seconds)

    % Prepare figure
    figure;
    hold on;
    axis equal;
    xlim([-1.5, 1.5]);
    ylim([-1.5, 1.5]);
    title('Automatic Car Wiper Simulation');
    xlabel('X-axis');
    ylabel('Y-axis');
    grid on;

    % Initial positions
    wiper_start = [0, 0];
    wiper_end = [wiper_length, 0];

    % Simulate the wiper sweeping
    for t = 1:num_frames
        % Current angle: the wiper sweeps up to a maximum of 45 degrees
        angle = mod(t * (pi / num_frames), pi / 4);
        wiper_end = [wiper_length * cos(angle), wiper_length * sin(angle)];

        % Clear the previous wiper and redraw it at the new angle
        cla;
        plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
        title('Wiper in Motion');
        xlim([-1.5, 1.5]);
        ylim([-1.5, 1.5]);
        grid on;
        pause(blink_duration);  % Pause for the duration of the blink
    end

    % Stop the wiper by blinking it on and off
    for i = 1:10
        cla;
        plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
        title('Wiper Stopping...');
        pause(0.1);
        cla;
        pause(0.1);
    end

    % Finalize: reset the wiper to its original horizontal position
    wiper_end = [wiper_length * cos(0), wiper_length * sin(0)];
    plot([wiper_start(1), wiper_end(1)], [wiper_start(2), wiper_end(2)], 'b', 'LineWidth', 5);
    title('Wiper Stopped');
end

% Call the function
car_wiper_simulation();
3:08 it's about INKING their FUNK!
Just wait until they use the engineers who are left to train the AI to review its own code.
They are just keeping up the fire of AI hype
How does an engineer who never coded review code? Because that is what will happen. You don't really learn to code at school.
Of course they'll say this. I'm sure that percentage is fudged in a range of ways. This is their possible future money-maker, so they've got to prop it up.
To be fair, their employee roster is probably bloated.