As a software engineer, all I have to say is look at this guy’s incentives for saying the things he’s saying. Nvidia’s stock has pretty much tripled in value off the back of the current AI bubble. He has every reason to say that AI will replace coding, and you have every reason to distrust him.
@tomlxyz I bring it up because I’m speaking from a place of experience working with code every day. As it currently stands, AI is best used as an assistance tool for generating boilerplate code and writing unit tests. When it comes to translating nebulous business objectives into a technical solution, that’s not something you can ChatGPT your way out of, no matter how much AI bros want you to believe it. I bet most engineers working at NVIDIA don’t even believe this; I’d be skeptical that the CEO himself believes it. He put out a statement clearly aimed at the general masses, who don’t know what working in software engineering is truly like.
@@blitzwing1 Nvidia is able to sell H100s at $40,000. The H100 has a massive 825mm2 die, which is close to the reticle limit on TSMC's N4 node, and such a large die means lower yields. It is not easy to make your own silicon for AI when even fabricating it is a problem. TSMC heavily caters to its already existing large customers like Apple, Nvidia and AMD, and new semiconductor designers are less likely to get favourable wafer pricing.
This idea is something that's going to be done one way or another, because it will accelerate development in basically every field. The idea isn't the issue, the issue is a single company dominating the fair use of that idea
yeah, my company recently approved github copilot for use, and honestly it's not even good at helping write code. don't get me wrong, it's an impressive feat of development. But it's still a long way away from being able to write good code.
In other words, he wants to turn a portion of potential programmers into a new category of dependent consumers (of his proprietary AI product). The reason you learn to code is not just the code itself but to actually take control of the technology you paid for. What he says is the tech equivalent of "don't get a driver's license, you can just get an Uber".
He also obviously doesn't do any real coding, or he would understand that you can't have the AI debug your code for you. At some point a human being has to decide whether the damn thing is actually working or not, and if it isn't, a human being has to go in and figure out what went wrong. AI simply does not possess the independent executive consciousness to decide to do things like that.
@inamarie9656 Real-life coding is a lot like playing Factorio. ChatGPT "knows" (or at least parrots correctly in simple cases) all the design patterns and basic knowledge you expect, but when you give it your megabase and ask it kindly to solve a complicated logistical issue, the best it will produce is delightfully incorrect nonsense, and at worst something that has the shape of a solution without being one. It's the greatest simulator of productivity; no wonder CEOs who just profit off of others' work love it.
@@angelainamarie9656 Not to mention the code that ChatGPT generates is almost always loaded with bugs or just garbage. You're better off writing the code from scratch usually anyway. We're not even close to AI that writes flawless code, and probably never will be.
Imagining a 40k where the Mars cult is run by grifters who don’t have nearly the tech they claim they have but lie for power and prominence in the imperium of man
1950s - Machines will do the drudgery so humans can have time to be creative. 2020s - The Machine will do the creative work so you can focus on the drudgery.
It must be the best job in the world, just bullshit on stages and when things go bad refuse to leave until they give you a severance package on which you can retire. Must be nice to not care about other people/your own dignity
Not really, because someone would have to know how to operate the machines and have baseline knowledge if there’s ever a malfunction. For example, let’s say we have a fully automated McDonald’s full of kiosks, machine cooks and servers; all they’d need is a tech to handle the maintenance.
@@WhoBlah21 things break down. you don't need JUST the mcdonalds building and real estate and machines. you also need someone who knows how to build a functioning mcdonalds. not to mention the supply chain that makes mcdonalds products possible to begin with.
I'm sure he knows. he's just also willing to spin bullshit and lie to the people, because it's a public company and lying to push a bubble is highly incentivized
It's already being investigated as a vulnerability. I wrote on it being a problem 3 months after ChatGPT came out. Don't get me wrong, the AI is basically another super IDE to help a developer, but the fact that all these structures are going to be predictable is rough.
See the trick is, that the AI gets upset when you describe to it malicious blackhat actors because it will think you're trying to make it racist instead of secure...
nah that's just Google's piss take on AI. this dude is talking about going Robinhood on the gaping holes in the crudely generated AI spaghetti code @@Eyclonus
It is, don't let anyone tell you otherwise. People don't realise how the human touch affects coding. How you approach a problem might be completely different from how I approach a problem. Most people never think about this, because they think code is magic, and it very rarely is. They expect it to do the impossible, and if you can't get it to do what you want, then it's the problem of the one who wrote the code. It's annoying. I have had to deal with clients who have no idea how their software works, because why would they... Honestly, AI isn't going to replace coding; at best it can give you something to work with. But any functionality you want with it? Good fucking luck trying to get AI to do magic for you😂
@@blitzwing1 LLMs as they are, imo, will probably never be great at coding. You don’t just need rigidity, you need logical consistency, something which LLMs (which, again, are glorified word calculators) are fundamentally incapable of. They’ll probably be able to make simple scripts that do something well documented (like set up a Python HTTP server) just fine, but stray even a little bit off that beaten path and they borderline become a detriment as they confidently spit out inaccurate or even nonfunctional code
@@ashutoshsethi6150 LLMs cannot do logic. They, as I understand it, just use very complicated statistics and a metric truckload of data to figure out what word comes after the current one. Could they invent an AI model that can actually do logic? Sure, but it won’t be an LLM, which is what the industry is currently pouring all its money into. That’s my point.
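The "statistics to find the next word" description above can be made concrete with a toy sketch. This is not how production LLMs actually work (they use neural networks over tokens, not raw word counts), but the underlying objective — predict the most likely next word from observed frequencies — looks roughly like this:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in the training text."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = follows.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

# Tiny made-up corpus; a real model trains on terabytes of text.
model = train_bigrams("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # "cat" followed "the" twice, "mat" once -> "cat"
```

Notice there is no reasoning anywhere in this loop — only counting — which is the point the comment is making, scaled down a trillionfold.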
@@ashutoshsethi6150 Not without contextual awareness, which you can't really have when you're working on just probability like these current models are.
Well, it does need some human oversight; the thing is, though, you wouldn’t need nearly as many humans as you would today. Like, you would only need 1 in 1,000 humans in the loop compared to today, and that’s only gonna be in about 5 years. So you wouldn’t lay off everyone, but it might as well be that you’re laying off everyone
The main problem of putting wants into code has always been that people with no experience have no idea what they want and AI is just gonna give them crap instead of what they actually need
yeah, you're still gonna need people who are able to discern the requirements to be able to give the ai tool proper specs to write good code. And that's just a developer. You may need fewer developers, but the field won't go away
I kid you not, the whole job of a lead developer is just translating whatever barely articulated bs the client comes up with into actual thoughts an engineer can understand.
There was a Star Trek TNG episode about a civilization that was reliant on their technology but they didn't actually have anyone left who understood how any of it worked. This reminds me of that.
we've had real life examples too. it is indeed a bad idea. we're already very far into that reality with the software we run. few to no people really grasp the whole tech stack (including all the software at every level, and all the hardware at every level, all the firmware. lots in specific to mention for an individual computer, so I won't. also not to mention how a billion-user site is architected). it's too complicated now for most people to grasp in its entirety, and even those that do so only do at a "jack" level of understanding for most of the pieces, not a "master" level. heck, most people in industry these days don't even learn the whole thing at an early 1990s level of understanding, let alone fully modern designs.
@@blarghblargh I remember a story a while back about a city/town desperately searching for someone with old COBOL skills because some crucial infrastructure ran on it. XD
From my experience being a tech guy, most of the hard "work" of programming is sitting around thinking about the architecture of your project rather than the actual implementation of the code. This is why more experienced programmers usually write FEWER lines of code. (Thank you Elon). The thing that chatbots are good at is the easy part, and my fear is that this technology will create two classes of programmers: the actual architects, and the army of interchangeable code monkeys who type prompts into chatbots and string shit together. This would allow you to pay the low-level "prompt engineers" very, very little. This effect is already happening in certain areas of the engineering world. At truss companies, for instance, there is usually one competent engineer who sits at the top and then a whole army of low-skill technicians who type numbers into the truss program without any understanding of what they are doing. It's terrifying, and this is the future of most technical jobs if things continue progressing as they are.
Sounds like what's happening with art. Real artists/photographers/designers are being replaced with prompt-jockeys, and they're not learning anything beyond basic photoshop skills. AI art bro: _"Help, i submitted an AI image for a corporate commission, and now the client wants _*_revisions!"_* 😰
GPUs are basically chips that do a lot of floating-point multiplication fast. They're not specific to graphics. Training and using a neural net both involve a lot of multiplication. This whole video reminds me of an essay called "The Law of Leaky Abstractions" by Joel Spolsky. That basically asks why programming isn't easy, even though tools have improved. His answer is basically that all abstractions are "leaky", more abstractions mean you can get more done more easily, but there's a deeper set of stuff to contend with as soon as anything doesn't line up or something goes wrong. I wouldn't be surprised if language models are eventually used in some really cool programming tools that let programs be specified in a very abstract way. And also wouldn't surprise me if the resulting programs have defects that are an absolute nightmare to debug.
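The "GPUs just do a lot of floating-point multiplication" point can be shown directly: transforming 3D vertices for graphics and evaluating a neural-network layer are the same operation, a matrix multiply. A small sketch (NumPy on the CPU here, standing in for what a GPU parallelizes across thousands of cores; the weights are random placeholders, not a trained model):

```python
import numpy as np

# Graphics use: rotate 3D vertices 90 degrees around the z-axis.
rot_z = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
vertices = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
rotated = vertices @ rot_z.T  # (1, 0, 0) -> (0, 1, 0)

# Neural-net use: one dense layer is the very same matrix multiply,
# just with learned weights instead of a rotation matrix.
weights = np.random.randn(3, 4)          # placeholder "learned" weights
activations = np.maximum(vertices @ weights, 0)  # ReLU(x @ W)
```

Same silicon, same instruction mix — which is why hardware built for one workload turned out to dominate the other.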
Any code that you haven't written yourself is a nightmare to debug. Given that finding / fixing bugs already takes 90% of the programming time in any complex system, AI surely isn't helping by taking away any code ownership here.
Cute of you to assume they use a calculator. I'd expect a significant number of people already use voice assistants even for basic things like that. "Alexa, what is 2+2?" We do live in the most ret4rded reality.
god i hate him. it's clear that nvidia's push for fake pixels has nothing to do with performance, it's just this guy being a techbro who wants to squeeze AI into anything he can, and the entire gaming industry has just jumped after him getting right on board with making games look like they were dipped in vaseline
I do believe it will finally bite them back soon. This bubble has to burst. It always does, everything corporate sooner or later drops completely in the void.
The image reconstruction and the frame interpolation techs are absolutely essential. We need those. It's the only feasible way to reach natural/realistic motion portrayal (ideally five-digit frame rates on displays with three-digit refresh rates) in our lifetimes. Those are the currently unobtainable frame rates needed to simultaneously fix both of the two major motion artifacts of finite-refresh-rate displays: "image persistence based eye tracking motion blur" and "stroboscopic stepping on relative motions" (also called the phantom array effect). There is no way to do that and continue to improve graphical fidelity (better light transport, texture resolution, model complexity, ...) given the limitations of silicon.
@@hastesoldat i don't know what to tell you i really just want a game that looks good enough and runs well, i don't need to see every pore in a character's asscheek on a retina display at imperceptibly high framerates made up of 90% pixels that an AI model made up. this push for upsampling and frame generation hasn't advanced graphics tech it has only made developers get insanely lazy with optimization and some games even take away the choice to play at native resolution and/or without any TAA bs
@@notmyrealname7634 Humans do always produce buggy code. There are bugs in every level of the software you are using to read this comment. Some are known about, some will be found in the future.
@@notmyrealname7634 Yeah, but a human can think and work around bugs in ways that AI just can't. An AI can run code and see it doesn't work, and it can try to isolate the problem, but if it generated the shit-ass code then it doesn't know how to make it work. AI knows what good code looks like, but doesn't really know how to write it.
People think AI will code for them and software developers will cease to exist because, at the moment, AI helps to write routine pieces of code by studying the work of real programmers who at least sometimes know what they are doing. Imagine what will happen when copilot software starts to cannibalize the code written by other copilots. It will go bad very quickly.
Even for people who will never actually code, learning the logic of how code works is just a really good building block of knowledge that everyone should get at least a basic understanding of.
Or ever. I work for a pretty large tech company. You haven't heard of it probably, but if you live in Europe or Asia you probably used a service that relies on our products. The conflict between upper management who has never touched code, and doesn't understand how any of it works, and the actual developers is the biggest money and productivity loss for sure
Even if you don't end up programming for a job or anything, you should still learn how to do it because it's a great teacher of problem solving skills. Also it never hurts to better understand the workings of the machines that have become so crucial to our lives
The scariest thing about this "ai" stuff... is what it's saying is "we shouldn't teach humans anymore".... what? I've actually seen people say that they've stopped trying to learn things because of ai. What are we doing? If humans don't learn, then they won't come up with new ideas, and the AI won't do it, by definition. So just the end of new human technology and art? What are we doing.
Sometimes I dream of an Ayn Randian fantasy; Atlas Shrugged, except it's not the CEOs leaving but rather all the tired professionals like engineers, doctors, game devs, or any source of productive labor in general, finally getting fed up with being told how to do their job by the people above, going off to form their new society or whatever... But yes, it is disturbing how this AI bubble's philosophy basically pivots around embracing ignorance, meaninglessness, greed, allowing yourself to devolve into something less than human, a mere economic actor whose entire existence is centered around consumption and accumulation of status... AI bros will tell people not to learn to code, not to learn how to draw or 3D model, not to learn math or physics or any other skill. They're manufacturing a crisis of meaning, the only solution to which is to cover yourself head to toe in symbols of wealth. If you are no longer defined by your craftsmanship and mind, all you have left is your bank account. I really hope it will all just implode on itself as soon as possible. I'm genuinely becoming more open to joining some commune just to escape this constant stream of propaganda. I just want to feel human again, free of this failed attempt to strip me of it.
Yeah, that's people failing to understand how technology works. GPS is literally causing parts of our brain to be underdeveloped compared to a traditional taxi driver. Technology can let us offload work to devices. Our education has been moving away from memorization for a while, which was super important before universal internet access and even more so before cheap books. Arithmetic is slowly being de-emphasized. The archivists still have a role to play. People who still build "out of date skills" are important. From orienteering to using abacuses to ham radio to making true sourdough.
I love it when tech CEOs like Elon or Jensen get to the position they are at and then say stupid stuff periodically exposing themselves, showing the world how stupid they really are.
I use AI to help me code; it gives you a good skeleton to work from and makes my workflow faster. But god damn, without knowing how to clean it up or correct extremely basic errors, it's just useless
The best take I've heard is that AI is great at bridging small knowledge gaps to speed up the process, because if it gives you a solution for a small problem, you then understand that and can apply that knowledge the next time you encounter that issue. If the knowledge gap is too large, eg if you don't know how to code at all, you will not invest time in understanding the solution the AI provides and it becomes useless again.
Here's why I'm not worrying about my job (at least in the context of AI) for the foreseeable future. Software engineering, as a job, isn't just "coding". In fact, it's surprisingly little coding. It's architecture design, it's debugging, it's many things that AI hasn't even so much as touched on as a solution. Can an AI write code better than me? Probably, one day. But I'll start worrying when that AI can start solving bug reports and optimization problems in a codebase with 60,000 lines of code. Once the AI can understand novel problems, and solve them, then I'll start to worry.
It seems like a great idea to transfer all of our computer science knowledge out of our hands and into the black box of AI. I'm sure it won't incest itself to death, again.
I'm a chemistry graduate and biology master's student. I am currently working on my bioinformatics master's project in Python. I have used ChatGPT as a helping tool, but let me tell you, it cannot even write code to properly extract information from well-documented filetypes. Each and every response from GPT had to be modified and adapted, despite being given a very detailed explanation of what I wanted it to do. And when speaking to researchers, they say that "AI" (the real term being machine learning) just creates the same amount of workload that it is supposed to displace.
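For context on the "well-documented filetypes" point: a parser for a classic bioinformatics format like FASTA is short enough to write and verify by hand, which is exactly why hand-checking and rewriting AI output there can cost more time than it saves. A minimal sketch (a hypothetical illustration, not the commenter's project code):

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into {header: sequence} pairs.

    Each record starts with a '>' header line, followed by one or
    more sequence lines, which we strip and concatenate.
    """
    records, header, chunks = {}, None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if header is not None:
                records[header] = "".join(chunks)
            header, chunks = line[1:], []
        else:
            chunks.append(line)
    if header is not None:  # flush the final record
        records[header] = "".join(chunks)
    return records

fasta = """>seq1 example
ATGCGT
ACGT
>seq2
TTTT"""
print(parse_fasta(fasta))  # {'seq1 example': 'ATGCGTACGT', 'seq2': 'TTTT'}
```

Roughly twenty lines, fully inspectable — the bar an AI-generated version has to beat is low, and the cost of a subtle mistake in extracted sequence data is high.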
Yeah, as a software engineer, everything that CEO says is wrong. This guy thinks we are already at "Jarvis, make me a suit that flies and is bullet proof". It's pretty close to true that "a GPU is a computer within a computer". The difference is that the GPU has hardware optimized to solve certain kinds of problems very fast. If you can turn a given problem (e.g. mining bitcoin, which I know nothing about) into the type of problem that a GPU is optimized for, then it's possible to build an extremely powerful mining system by just throwing multiple GPUs at it. NVIDIA stocks surged when it was found that GPUs could mine bitcoin efficiently. It's obvious from this talk that he's just looking for that next bump. Coding is absolutely an art -- even more than it is a science. In college, they will teach you the science, and on a basic level your code won't work without the science. But designing a system to be future-proof (i.e. easy to fix bugs, add new features, and teach to new engineers) is WAY more difficult, and WAY more impressive.
I love how the example of a simple AI generated code was to move a circle in Blender, when in reality, even this is beyond AI. LLMs are fundamentally unable to effectively work with niche libraries/APIs such as Blender/bpy.
I imagine that we'd be stuck at the "jagged hellish mess of edges forming an object that is impossible in a simple three dimensional space" stage for a while with Blender.
as someone who's currently in the process of learning to code (SQL specifically) the notion that everyone is now a programmer because the "programming language is now human" is fuckin' stupid. It's ALREADY human language, the syntax is just incredibly specific and commands run in certain orders. He's actually saying nothing here, but since no one knows how to code people THINK he's smart and saying something profound. AI will not save you, my dude.
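The "it's ALREADY human language" point is easy to see in practice: a SQL query reads almost like an English sentence, but every word, ordering, and comma is rigidly specified, and that strictness is the actual skill. A quick sketch using Python's built-in sqlite3 with a made-up orders table:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 30.0), ("bob", 10.0), ("alice", 20.0)])

# Reads like English -- "select the customer and sum of totals from
# orders, grouped by customer" -- yet swap two clauses and it errors.
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 50.0), ('bob', 10.0)]
```

So "the programming language is now human" was never the barrier; the precise order of operations and the exact semantics of each clause were.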
In a timeline where everyone listens to him... "Oopsie, I tripped and hit a lever that made the subscription fee to use the AI engine go up by 500%. Can't go back, either, muh bad. UwU"
It’s like telling people you don’t need to know how to do math anymore because we have computers and calculators now. But it’s still important to know how you can solve a math problem; it’s important to know what you are inputting, and to understand objective truths of our universe and world. Otherwise, yeah, we may have calculators and computers, but do we know how to use them?
Even if it's true, even if AI replaces everything, we should still continue to learn and understand it, else we leave future generations with technology indistinguishable from magic
As an engineer: I'd personally really like some AI that does first drafts of certain engineering drawings. Mind you, I do think this would get misused at some point and it would be a whole drama, but used responsibly it could cut out hundreds or thousands of hours per project of meaningless grunt work in the engineering world and let engineers focus on important details instead of sifting through endless mounds of busywork. Chatgpt is also not terrible for first-pass sanity checks on engineering calculations. It has about the reliability of a freshman engineering student with a textbook in front of them, but because it shows you its work, it's not hard to use as a first draft, then correct its math manually if it made a silly mistake. I do feel like there is real potential here as a work aid in some applications. But it's not a replacement for real human work the way CEOs really fucking want it to be. Someone still needs to sign off on the bridge being built, and that means going over everything with a fine tooth comb and human eyes. Those parts cannot ever be replaced by AI, and it's deeply concerning to see CEOs salivating over the prospect.
The more you learn to code, the more you realize how full of shit those AI tech bros are. Especially if you write and train an LLM yourself, even if you are not good at it (you still gain a decent amount of understanding of how it operates).
i'm relieved. i'd never actually heard that AI could fuck up code. it makes sense with a lossy reiteration, but i'd taken people's word for it that the computer can speak computer. can't wait for AI to genuinely fucking destroy anyone who invests too much trust and finances into its accuracy
The issue is that GPT AI is not great at working with scientific facts, rules, data, and structure. ChatGPT is actually pretty useful for writing small snippets of code, as long as there is someone there to review the output, verify it and rewrite it for correctness. There's no way in hell I would trust it to generate an entire program in its current state. GPT is great at vomiting something that approximates human-written text, but when it comes to actual facts, rules, coding language, structure, consistency, and protocols, it can easily forget what it wrote 100 lines ago and start contradicting itself, or simply write code using functions that don't exist at all. I could imagine a future AI that incorporates a combination of GPT-style AI as a sort of front end ("hmm, what is the user asking, and how should I organize the output?") with another system in the backend that actually reads data, coding rulebooks, facts, history, databases, etc. Basically, in order to guarantee correctness, we need a human-written architecture to do the heavy lifting with data.
I'm a software engineer and you're 100% correct, but only for now. An application will always require humans to create it, but maybe fewer developers will be needed in the future. It's not only going to affect coding jobs. Some jobs will be safe from AI, like dentistry. Imagine a ChatGPT with access to an entire repository and its documentation; it would only need a method written for it to generate tests or suggest tests for written code.
6:15 - Vaush says that when he thinks of Nvidia, he doesn't think of AI. This shows how little he fucking knows about the AI field and about Nvidia. Nvidia is the most important company related to AI in the world at the moment. Their stock price has absolutely ballooned as a result of their AI chips that are far faster than any other processing unit on the market, which make training models far faster.
My brother, who has a college degree in programming, and before that got a job at a game studio straight out of high school, has spent the last year and a half teaching himself Godot... from the horror stories I have heard, I don't think AI is going to be very successful. The kind of mistakes AI makes with image generation? Would *destroy* the programming for something like a video game. Meanwhile, given that I myself went to *art* college, I can 100% vouch for the "artists learn to iterate." If AI is the future for creating video game graphics, I'm assuming that, eventually, everything will look like a Mandelbulb fractal. Which looks cool, but when everything in the game has the same "look," it'll get old pretty quick.
it's less sinister and more cynical than that. he cares about quarterly earnings/growth, and it boosts their share price to hype themselves as THE market leader in what investors perceive as the big upcoming tech for the next decade+. he's not a visionary. he's a hype man. he's selling shovels for a gold rush.
As a programmer of sorts (Computational Neuroscience) I don't use ChatGPT because the output has no practical value to me. The code it outputs is actually pretty much gibberish. At most you can pull the individual commands line by line from the output and adapt them... which is basically the same as having looked up the relevant commands like "how to do X operation" or "how to visualize Y." It's just not useful for coding. I've also never used it to cheat on assignments, not out of some sense of honor but because it can't provide reliable or even basic information on any topic at a college level, at least not in BME or Neuro. The topics are too niche and complex to fake knowledge about, so you have to actually, you know, do the assignments and answer the questions. "What happens to your field of view when there's a lesion to this part of the brain..." or "how far apart do these nails have to be placed to bear a load if XYZ"
The luddism is strong here. "He has to know, he's the CEO of NVIDIA!" ... but for some reason Vaush feels more qualified to comment on this. Saying that humans have an esoteric quality about them which can never possibly be synthesized is religious quackery. I am curious to see your redemption tour in 2-5 years.
Stuff like Copilot is impressive and already useful, but it is still very *very* far from replacing programmers. Completely replacing a job like that would require something more akin to ACT-R, which is largely a theoretical thing for now.
Imagine if my accounting classes never taught me how to record journal entries or what debits and credits mean and how they function within the accounting system, because “the accounting software does all that for you”. That’s great; now I’m a shit accountant because I have no idea how to fix a problem with the balance sheet not balancing, because I don’t even know what it means to be out of balance. You need to understand the fundamentals of a system even if there are tools to make said system easier to use and interact with. Everyone should learn coding; the CEO of Nvidia is exactly the reason why
he's not ignorant. He knows what he's advocating for will result in the demographic that has any computer literacy being reduced to autodidactic nerds, which he's fine with because that's the bubble he's in. People have a tendency to be okay with a consolidation of power if they're in the group the power is being consolidated to.
Why does Google say that Nvidia invented the GPU in 1999? SGI made graphics processors before them, the Nintendo 64 had an SGI GPU and was released in 1996.
I'm a mechanical engineer. I also work in programming CAD add-ins and macros. I also program robots (industrial robots for welding and other purposes) along with actually making entire welding cell projects. I only tried ChatGPT a bit. It is not impressive for fields in which there is not enough data. I mean it is not even like for web development where it throws some relatively okay stuff with some errors. It just throws some illogical stuff that is completely wrong due to lack of data. Just try it for PLCs or for KUKA KRL language. It is just bad due to the lack of data.
Watching ChatGPT write firmware would probably be hilarious. Just don't use the code. The lack of security would be astonishing, not even counting all the bugs.
The reason he’s spouting lies is because he knows how important actual programmers are. It’s about paying programmers to do the same job for less because they utilize AI while also taking a check from them for the right to utilize AI. It’s about shrinking the power of the worker, making them more replaceable. It’s the same across all industries.
The reason graphics processors are good for these models is that the models use multi-dimensional arrays to hold all the different characteristics of the object in question. They do vector math on 3-dimensional objects in graphics rendering, but they're not limited to 3-dimensional math. In fact, luminance, reflectivity, and so on can be assigned dimensions, and math can be applied to alter those values. Likewise there can be dimensions for heaviness, colors, whether a thing makes noise or not - just add more dimensions after x, y and z and you can do the same manipulations, even though humans don't consciously acknowledge all these properties as dimensions. Anyway, being made for 3D graphics literally is the link between graphics cards and machine learning. That aspect is not a fad or fraud in any way.
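The "just add more dimensions after x, y and z" idea can be sketched directly: the same vector arithmetic that moves a point in space works unchanged when you append columns for other properties (NumPy here, with made-up property names purely for illustration):

```python
import numpy as np

# Columns: x, y, z, luminance, reflectivity. The last two are
# illustrative non-spatial properties treated as extra dimensions.
objects = np.array([
    [0.0, 0.0, 0.0, 0.8, 0.1],
    [1.0, 2.0, 3.0, 0.2, 0.9],
])

# One linear transform doubles the spatial coordinates and halves
# luminance in a single matrix multiply; the math doesn't care
# that only the first three axes are "real" space.
transform = np.diag([2.0, 2.0, 2.0, 0.5, 1.0])
out = objects @ transform
print(out[1])  # [2.  4.  6.  0.1 0.9]
```

Machine-learning models do the same thing with hundreds or thousands of such columns per item, which is why hardware built to multiply 3D coordinate arrays extends to them so naturally.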
5:11 The ability to convey what you want is solution ideation, something that can be taught, as programming is. The ability to generate solutions in natural language has benefits that are universal, whereas syntax knowledge is contained. Can you not use your vast knowledge to infer how AI could evolve to increase/enhance output?
In this hypothetical future: While AI coding is the primary source of software, people with coding knowledge will be able to hack into the sloppy patchwork systems that are vomited out to the masses. Then, once the AI coding algorithm collapses under the weight of incest, people with coding knowledge will be highly sought after to rebuild the digital world from scratch. [All that assuming AI coding actually takes over.] Really, this is a hype-job for the short-term boosting of NVIDIA shares on the back of the AI tech bubble. If people actually fall for this bad advice, the long-term results for NVIDIA will be labor shortages and even _higher_ pay for good coders. *Never* take advice from the public statements of a CEO.
Okay, that chatter did not explain it, but "bouncing idea off AI" is bouncing idea off yourself, using AI as you would yourself when you talk to yourself during a creative process. In that way, it does have some minimal use.
As long as there are humans left on this Earth who still hear the voice of the sublime over the loud scream of the material, some jobs will not be replaced, for they were never about the product but the process.
@@rkvkydqf -- The point is not enough jobs will be available for the number of people that need to work to make money. Not that "all jobs will be gone."
This is hilariously wrongheaded. Basically this CEO has reinvented something the rest of the world already tried and rejected decades ago: programming in natural language is just COBOL (a business-oriented "English-like" programming language from the 1950s) all over again! That was a travesty because it turns out coding is the least of a software developer's job (it's mostly about design and modelling, stuff LLMs can't do btw). Business people suck at even just describing what they want, let alone making a reliable specification for a machine. Anyone who has ever developed for a client project can vouch for that.
It's almost like this man has a fucking team of genius devs that advise him and have already supported this idea's feasibility. You're acting as though you have this knowledge of COBOL and the only reason they're pursuing this is because they're dumb and missed COBOL's attempt at an English-like programming language. The truth is probably that they know of COBOL and are going in a more successful direction.
@@brutuslugo3969 I have seen enough high-profile flops with even more trivial oversights to not share your blind faith in corporate meritocracy. A very recent example is Hyperloop One (formerly Virgin Hyperloop). It swallowed years of investment and grants only to shut down with nothing to show for it. One of the engineers said it will never be practical because, in short, it turns out the Earth is too bumpy for vacuum trains. Yeah, no shit! A team of geniuses behind a maniac boss will just go "yes sir, we'll look into it", waste as much time as they can get away with, and jump ship just before the gravy train runs out.
As a software engineer I don't really want to take his side, but I can see where he is coming from. He is correct that the environment is changing, and with it the tools we can and should use to create software. Like, no one needs to learn assembly to become a programmer, because there are higher-level programming languages that are more accessible, "more powerful" and easier to learn. So that much is certainly true. BUT. We can't have everyone in the field just dismiss this stuff and never learn it in the first place. Sure, AI is a great asset for supporting development, but it can't do the heavy lifting of deciding what to do for a very specific use case; we cannot assume it will work properly at all, and we need to check what has been generated, which in turn means we at least need to understand what it generated. So yeah. His basic idea is correct, but his understanding of the field is lacking, to say the least.
Generative AI is impressive but far from being able to replace an engineer. In some ways, the tech industry does not change as much as people make it seem. C++ is still one of the most popular languages after 40 years. We’re still using network protocols developed in the 70’s. The concept of relational databases which are still widely used today was defined in the 70’s.
2:25 the commenter's correct. At that point in the video the CEO has only said that you shouldn't say *every* kid needs to learn to code, not that nobody should learn it. That said, as a programmer, I can confidently say that AI is currently not good enough to code and while it can certainly make coding easier, I doubt it'll make programmers obsolete any time soon.
I think Jensen is saying this in anticipation of advanced A.I. that will be developed within this decade. Yes, current A.I. (ChatGPT, Gemini) makes a lot of mistakes and isn't really useful due to the number of hallucinations. However, this is only temporary. A.I. is currently the worst it will ever be. It will only get better, and I think we'll have to accept that as a society. As leftists I think it's a good time to start strongly advocating for some kind of UBI.
Vaush, the thing is, GPU stuff DOES now involve AI with the latest developments... As of like a year ago you can use AI to basically upscale Skyrim and literally DOUBLE your FPS via a thing called DLSS (or some equivalent if you're using a different brand of GPU)... Essentially, you run the game at something like 360p, which is obviously easy for your GPU since things look pixelated and shitty, and then the AI comes in and upscales things to 1080p or 4K... With your settings right, the only downside is some very minor artifacting that pretty much no one is likely to even notice, to the point where it seems basically graphically indistinguishable from not using AI to the average person... But again, for a major boost in performance, like 40 or 60 or more extra FPS...
@@ArDeeMee sar·casm /ˈsärˌkazəm/ noun: the use of irony to mock or convey contempt. "his voice, hardened by sarcasm, could not hide his resentment" (Dictionary Definitions from Oxford Languages)
ChatGPT can code as much as a 5yo can paint: yeah, it can smash its fingers full of paint and draw a fucking smiley face... it cannot fucking paint a wall white... neither can it make a masterpiece...
My understanding is half of all bio-engineering research at this point is just a guy sipping coffee while an AI is running through a billion data points, then when it's done an hour later he tries doing what AI says and sometimes it works.
Vaush is definitely underestimating just how impressive AI is with software. It doesn't build your apps for you, but if you're a senior and know exactly what you want, you can develop lightning fast with it. It will give you complex code or starting points if you tell it the right thing.
while a little extreme, as a game dev i've seen how good chatgpt etc. is, and soon most code will absolutely be from AIs in some way. even my coders are like "why are you even paying me to copypasta chatgpt".
Also as a game dev I'm dreading when games are mostly coded by AI. There will be a bug some major game studio has, and it'll take them 6 months to even find the error since no one actually wrote it. This will happen for every single bug the game has. AI can't take an entire Unity project and spit out something workable.
@@OneEyeShadow Considering how over capacity basically every healthcare service in the world perpetually is, any automation in healthcare should be welcome.
@@big-gloom Diagnostics mainly. Auto-segmentation and general handling of data as well. AI is already kinda superior when it comes to recommending medicine.
@@OneEyeShadow Pretty fine, I presume. WebMD has existed for a decade or two, actually uses a statistical model more fit for its purpose, and has only helped expose incompetent medical practitioners for who they were; at no point did anyone fucking dissolve hospitals and show people DIY tutorials for surgical operations. And as for electrical engineering, you'll be fine, even more so than software, which isn't going anywhere either! Your domain is applied physics; its levels of complexity are so astounding this stochastic parrot won't get past the local optimum of memorizing test answers. All types of engineering share one defining trait: complexity. Until these things can understand paragraphs and paragraphs of documentation with nuance, something even the latest GPT-4.5 struggles with over just 300 LOC of normal code, they will not be practically applicable.
It's interesting to see the levels of anger Vaush reaches, and equally interesting how he calms down once chat says something that isn't stupid and actually contributes knowledge.
Another implication of telling people not to learn to code is that you end up with fewer people who know how to do it, and then they can pay the few people they have very little to keep the AI running.
I'm new to programming (studying gamedev for homebrew NES games) and I wholeheartedly believe AI could NEVER replace humans in programming. Optimizing code, let alone building large and complex programs, requires a deep understanding of how every section of code interacts with other sections. To my understanding, AI works by looking at something and comparing it to another, not assessing the functions of something and building to the goal from scratch by understanding the minimum instructions required.
My favourite AI story is that they trained a model on medical scans of people's lungs to detect tuberculosis. It got better than actual doctors, so they tried to dig through it to figure out how it knew. It was looking for visual artefacts present in old medical scans - when tuberculosis was much more common - and assigning those to have much higher likelihood of tuberculosis. But yes, AI is going to replace us, dude.
I think some other comments are already explaining more or less the same, but the reason GPUs (graphics cards) are more popular (and better) for things like AI (or crypto/Bitcoin mining for that matter) than just straight "computers" (CPUs) is that GPUs are basically giant clusters of mini-CPUs that can do lots of simple computations in parallel.
* Rendering graphics involves mass amounts of simple calculations for vectors (think how lighting at various different angles takes lots of simple math to figure out exactly the shapes you see on the screen), so they're built for exactly that
* Bitcoin is "mined" by solving tons of calculations (to the extent you even can still mine it; not sure if there's much you can still get out of that today)
* AI models process mass amounts of data in matrices to perform operations on, which to the computer is just more extremely repetitive calculation
CPUs, by contrast, have more complex designs and capabilities, but can only do so much at once and won't process all that stuff nearly as efficiently.
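A minimal sketch of the "extremely repetitive calculations" that comment is talking about (plain Python, purely illustrative): in a matrix multiply, every output cell depends only on its own row and column, so all the cells can be computed independently. That independence is exactly what a GPU exploits by running thousands of them at once, while this sequential version grinds through them one at a time.

```python
# Illustrative only: a naive matrix multiply. Each output cell is an
# independent dot product of one row of `a` with one column of `b`,
# which is why a GPU can compute all cells in parallel.

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    cols_b = list(zip(*b))  # transpose: the columns of b
    return [[sum(x * y for x, y in zip(row, col)) for col in cols_b]
            for row in a]

a = [[1, 2],
     [3, 4]]
b = [[5, 6],
     [7, 8]]

print(matmul(a, b))  # [[19, 22], [43, 50]]
```

Rendering, mining, and neural-net training all reduce to enormous batches of operations with this same embarrassingly parallel shape, which is the whole reason the same chip serves all three markets.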
I am a biochemist and a computer scientist. You are correct in your hatred for generative AI, but using data science both in vitro and in silico is the most powerful tool for learning things.
I too hate the AI hype, and Vaush wasn't too far off technically. However, he doesn't get Nvidia and normal machine learning: as chat helped him see, models for biochem are quite capable, e.g. protein folding as mentioned. And Nvidia is predominantly a hardware company with actual products and R&D, selling computing power based around GPU technology. (Many of) their products are genuinely really impressive, and besides the gaming market known to most people, they power all sorts of datacenters and enterprise workloads, e.g. VFX rendering, CAE, BigData and all those buzzwords, including AI, the training of which famously requires insane amounts of computing power. So Nvidia is mostly riding the hype by selling the compute. In the gold rush, they get rich by selling shovels.
Do these tech bros understand that we don't actually have thinking AI? It's all large language models... they really seem to think that we have, right now, sentient thinking AI...
The sad thing is that AI has incredible liberatory potential if done correctly. It's very different to previous tech bubbles like Crypto. AI has the potential to give us fully automated gay space luxury communism if we actually did it right
coding is also about a way of thinking and analyzing problems. As a business analyst / programmer analyst, a small part of problem solving is the actual coding. Most of it is framing the problem
that reminds me of a video i've seen. a very experienced programmer talked about how manufacturers don't want us to understand and directly use the hardware. the abstractions we can use with the hardware are somewhat vetted by the manufacturer, remotely most of the time. it's supposedly not because it's that hard to learn to use a specific GPU architecture, they just don't want you to use something else than the graphics APIs that abstract most of that GPU-specific stuff.
As a software engineer, all I have to say is look at this guy’s incentives for saying the things he’s saying. Nvidia’s stock pretty much tripled in value off of the currently growing AI bubble. He has every reason to say that AI will replace coding, and you have every reason to distrust him.
He isn't completely wrong, and he is going to make money doing it
I'm a software engineer too, but in this case I don't see why you need to mention it. There's a clear conflict of interest against being honest either way
@tomlxyz I bring it up because I’m speaking from a place of experience working with code every day. As it currently stands, AI is best used as an assistance tool in generating boilerplate code and writing unit tests. When it comes to translating nebulous business objectives into a technical solution, that’s not something you can ChatGPT your way out of, no matter how much AI bros want you to believe it. I bet most engineers working at NVIDIA don’t even believe this; I’d even be skeptical if the CEO himself believes this either. He put out a statement clearly aimed at the general masses who don’t know what working in software engineering is truly like.
Beyond that, how tf do we know we can trust ai to code itself??
@@blitzwing1 Nvidia is able to sell H100s at $40,000. The H100 has a massive 825 mm² die, which is close to the reticle limit on TSMC's N4 node, and such a large die means lower yields. It is not easy to make your own silicon for AI when fabricating it can be a problem. TSMC heavily caters to its already existing large customers like Apple, Nvidia and AMD, and new semiconductor designers are less likely to get favourable wafer pricing.
AI is the new gold rush for companies, and Nvidia is selling the shovels & picks; they have an innate bias and a perverse incentive to push this idea.
Nvidia also profited off the crypto hype. I wonder if there will be a next thing after AI
This idea is something that's going to be done one way or another, because it will accelerate development in basically every field. The idea isn't the issue, the issue is a single company dominating the fair use of that idea
Jensen always overpromises for hype. He needs every leather jacket in existence
@@veilmontTV that may be true but the stock often overperforms as well.
yeah, my company recently approved github copilot for use, and honestly it's not even good at helping write code. don't get me wrong, it's an impressive feat of development. But it's still a long way away from being able to write good code.
In other words, he wants to turn a part of potential programmers into a new category of dependent consumers (of his proprietary AI product).
The reason you learn to code is not just the code itself but to actually take control of the technology you paid for. What he says is the tech equivalent to "don't get a driver's license, you can just get a Uber".
Very well said!
He also obviously doesn't do any real coding, or he would understand that you can't have the AI debug your code for you. At some point a human being has to decide whether the damn thing is actually working or not, and if it isn't, a human being has to go in and figure out what went wrong. AI simply does not possess the independent executive consciousness to decide to do things like that.
@inamarie9656 Real-life coding is a lot like playing Factorio. ChatGPT "knows" (or at least parrots correctly in simple cases) all the design patterns and basic knowledge you'd expect, but when you give it your megabase and kindly ask it to solve a complicated logistical issue, the best it will produce is delightfully incorrect nonsense, and at worst something that has the shape of a solution without being one. It's the greatest simulator of productivity; no wonder CEOs who just profit off of others' work love it.
@@angelainamarie9656 Not to mention the code that ChatGPT generates is almost always loaded with bugs or just garbage. You're better off writing the code from scratch usually anyway. We're not even close to AI that writes flawless code, and probably never will be.
Great analogy imo
This is how you get tech priests. Gather the incense and sacred oils to appease the machine spirit
No need to know how it works
PRAISE THE OMNISSIAH!!
insert futurama scene for reviving calculon
Imagining a 40k where the Mars cult is run by grifters who don’t have nearly the tech they claim they have but lie for power and prominence in the imperium of man
Starsector moment.
1950s - Machines will do the drudgery so humans can have time to be creative.
2020s - The Machine will do the creative work so you can focus on the drudgery.
CEOs love being wrong
best part for him is he's mostly right AND will make bank on this. I'm sorry guys, but Huang is not even nearly as braindead as Musk is, just wait
It must be the best job in the world: just bullshit on stages, and when things go bad, refuse to leave until they give you a severance package you can retire on. Must be nice not to care about other people or your own dignity
hes just lying
Ah yes, lets completely lose all knowledge of the machines that make society run, that seems like a good way to organise a civilization
This is a real issue and it's called the black box problem
Not really, because someone would have to know how to operate the machines and have baseline knowledge if there's ever a malfunction.
For example, let’s say we have a fully automated McDonald’s full of kiosks, machine cooks and servers all they need is a tech to handle the maintenance.
@@WhoBlah21 That's the POINT, people need to learn to code in order to maintain the shit that the CEO on the screen wants to replace all coders with.
@curvingfyre6810 yea, what will they do when chatgpt shits itself... ask chatgpt?
@@WhoBlah21 things break down. you don't need JUST the mcdonalds building and real estate and machines. you also need someone who knows how to build a functioning mcdonalds. not to mention the supply chain that makes mcdonalds products possible to begin with.
Another perfect example of how being a CEO doesn't require you to know anything about your industry, your product, or anything at all, really
Dude. You are seriously taking the opinion of someone who knows nothing rather than someone who is balls deep into the tech stuff.
@vipcypr8368 dude, the CEO's main goal is to increase share price; he's gonna lie to do it if he has to
I'm sure he knows. he's just also willing to spin bullshit and lie to the people, because it's a public company and lying to push a bubble is highly incentivized
Management isn't a merit job, it's an office politics job.
Being a CEO is no different from being a politician.
@@vipcypr8368 Pfft... he doesn't even know why computers can never replace programmers (it's the halting problem)
When firms are all using ai to write their software, you will see exciting new opportunities in the field of black hat hacking
It's already being investigated as a vulnerability. I wrote on it being a problem 3 months after ChatGPT came out.
Don't get me wrong, the AI is basically another super IDE to help a developer, but the fact that all these structures are going to be predictable is rough.
See the trick is, that the AI gets upset when you describe to it malicious blackhat actors because it will think you're trying to make it racist instead of secure...
nah that's just Google's piss take on AI. this dude is talking about going Robinhood on the gaping holes in the crudely generated AI spaghetti code @@Eyclonus
They’ll have the perfect defence in that their shit won’t fucking work
I can't wait for whistleblowers to say their company is using sentient AI as slaves.
I can't believe Vaush called my profession art. My own father never said a nicer thing than that 😢
it's poetry. As the great Bob Martin has said many times, good code reads like well-written prose
programming is art, the way watchmaking is art, the way woodworking is art.
if you're making games, programming is double-art :D
It is, don't let anyone tell you otherwise. People don't realise how the human touch affects coding. How you approach a problem might be completely different from how I approach it. Most people aren't used to thinking about this, because they think code is magic, and it very rarely is. They expect it to do the impossible, and if you can't get it to do what you want, then it's the problem of the one who wrote the code. It's annoying. I have had to deal with clients who have no idea how their software works, because why would they... Honestly AI isn't going to replace coding; at best it can give you something to work with. But any functionality you want with it? Good fucking luck trying to get AI to do magic for you😂
Haircutting seems like an art to me, but many people just go through robotic motions for money.
So you ARE into Arts & Crafts, after all.
He's wrong. The AI obviously needs heavy oversight. People need to know these skills.
@@blitzwing1 LLMs as they are will imo probably never be great at coding. You don't just need rigidity, you need logical consistency, something which LLMs (which, again, are glorified word calculators) are fundamentally incapable of. They'll probably be able to make simple scripts that do something well documented (like set up a Python HTTP server) just fine, but go even a little bit off that beaten path and they borderline become a detriment as they confidently spit out inaccurate or even nonfunctional code
@@Colddirector why? Cannot logic be coded?
@@ashutoshsethi6150 LLMs cannot do logic. They, as I understand it, just use very complicated statistics and a metric truckload of data to figure out what word comes after the current one.
Could they invent an AI model that can actually do logic? Sure, but it won't be an LLM, which is what the industries are currently pouring all their money into. That's my point.
@@ashutoshsethi6150 Not without contextual awareness, which you can't really have when you're working on just probability like these current models are.
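A heavily simplified sketch of the "very complicated statistics" these replies describe (plain Python, illustrative training text): a bigram model that just picks the most frequent next word seen in training. Real LLMs are vastly more sophisticated, but the core operation, predicting the next token from observed patterns with no notion of logical truth, is the same.

```python
# Illustrative sketch: a bigram "next word" predictor. It has no logic
# or understanding, only counts of which word followed which.

from collections import Counter, defaultdict

def train(text):
    """Count which word follows which in the training text."""
    words = text.split()
    following = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        following[cur][nxt] += 1
    return following

def predict(model, word):
    """Return the statistically most likely next word."""
    return model[word].most_common(1)[0][0]

model = train("the code works the code fails the tests pass")
print(predict(model, "the"))  # 'code' (seen twice, vs 'tests' once)
```

Scaled up by many orders of magnitude (tokens instead of words, learned weights instead of raw counts, attention over long contexts instead of one previous word), this is the family of technique the thread is arguing about: pattern continuation, not deduction.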
Well, it does need some human oversight; the thing is, though, you wouldn't need nearly as many humans as you do today. Like, you would only need 1 in 1000 humans in the loop compared to today, and that's only gonna be in about 5 years. So you wouldn't lay off everyone, but it might as well be that you're laying off everyone
The main problem of putting wants into code has always been that people with no experience have no idea what they want and AI is just gonna give them crap instead of what they actually need
yeah, you're still gonna need people who are able to discern the requirements to be able to give the ai tool proper specs to write good code. And that's just a developer. You may need fewer developers, but the field won't go away
I kid you not, the whole job of a lead developer is just translating whatever barely articulated bs the client comes up with into actual thoughts an engineer can understand.
He's trying to create the Adeptus Mechanicus, we pray to the machines for them to work
So kind of how I deal with my printer.
Buy my book Do Not Create the Torment Nexus.
There was a Star Trek TNG episode about a civilization that was reliant on their technology but they didn't actually have anyone left who understood how any of it worked. This reminds me of that.
we've had real life examples too. it is indeed a bad idea.
we're already very far into that reality with the software we run. few to no people really grasp the whole tech stack (including all the software at every level, all the hardware at every level, and all the firmware. lots to mention specific to an individual computer, so I won't. also not to mention how a billion-user site is architected).
it's too complicated now for most people to grasp in its entirety, and even those that do so only do at a "jack" level of understanding for most of the pieces, not a "master" level.
heck, most people in industry these days don't even learn the whole thing at an early 1990s level of understanding, let alone fully modern designs.
@@blarghblargh I remember a story a while back about a city/town desperately searching for someone with old COBOL skills because some crucial infrastructure ran on it. XD
From my experience being a tech guy, most of the hard "work" of programming is sitting around thinking about the architecture of your project rather than the actual implementation of the code. This is why more experienced programmers usually write FEWER lines of code. (Thank you Elon). The thing that chatbots are good at is the easy part, and my fear is that this technology will create two classes of programmers: the actual architects, and an army of interchangeable code monkeys who type prompts into chatbots and string shit together. This would allow you to pay the low-level "prompt engineers" very, very little. This effect is already happening in certain areas of the engineering world. At truss companies, for instance, there is usually one competent engineer who sits at the top and then a whole army of low-skill technicians who type numbers into the truss program without any understanding of what they are doing. It's terrifying, and this is the future of most technical jobs if things continue progressing as they are.
Sounds like what's happening with art. Real artists/photographers/designers are being replaced with prompt-jockeys, and they're not learning anything beyond basic photoshop skills.
AI art bro: _"Help, i submitted an AI image for a corporate commission, and now the client wants _*_revisions!"_* 😰
GPUs are basically chips that do a lot of floating-point multiplication fast. They're not specific to graphics. Training and using a neural net both involve a lot of multiplication.
This whole video reminds me of an essay called "The Law of Leaky Abstractions" by Joel Spolsky, which basically asks why programming isn't easy even though tools have improved. His answer is that all abstractions are "leaky": more abstraction means you can get more done more easily, but there's a deeper layer of stuff to contend with as soon as anything doesn't line up or something goes wrong. I wouldn't be surprised if language models are eventually used in some really cool programming tools that let programs be specified in a very abstract way. And it also wouldn't surprise me if the resulting programs have defects that are an absolute nightmare to debug.
Lmao ChatGPT generated data race
Any code that you haven't written yourself is a nightmare to debug. Given that finding and fixing bugs already takes 90% of the programming time in any complex system, AI surely isn't helping by taking away code ownership here.
Nvidia has made mega money off AI, Jensen's just lining his pockets
Cannot agree more. If capitalism weren't a thing, we wouldn't even have this AI bs to deal with.
Instead, we could have probably made AI for good things that helps humanity. Sad timeline we live in.
Vaporware
This is why people put 2+2 into a calculator
Cute of you to assume they use a calculator. I'd expect a significant number of people already use voice assistants even for basic things like that. "Alexa, what is 2+2?" We do live in the most ret4rded reality.
god i hate him. it's clear that nvidia's push for fake pixels has nothing to do with performance, it's just this guy being a techbro who wants to squeeze AI into anything he can, and the entire gaming industry has just jumped after him getting right on board with making games look like they were dipped in vaseline
I do believe it will finally bite them back soon. This bubble has to burst. It always does, everything corporate sooner or later drops completely in the void.
The image reconstruction and frame interpolation techs are absolutely essential. We need those. It's the only feasible way to reach natural/realistic motion portrayal (ideally 5-digit frame rates on 3-digit-refresh-rate displays) in our lifetimes. Those currently unobtainable frame rates are needed to simultaneously fix both of the two major motion artifacts of finite-refresh-rate displays: image-persistence-based eye-tracking motion blur, and stroboscopic stepping on relative motions (also called the phantom array effect).
There is no way to do that and keep improving graphical fidelity (better light transport, texture resolution, model complexity, ...) within the limitations of silicon.
Thats just RE engine.
@@TowerWatchTV Crypto and meta lasted longer than I would've believed. This might have more staying power (unfortunately).
@@hastesoldat i don't know what to tell you i really just want a game that looks good enough and runs well, i don't need to see every pore in a character's asscheek on a retina display at imperceptibly high framerates made up of 90% pixels that an AI model made up.
this push for upsampling and frame generation hasn't advanced graphics tech; it has only made developers insanely lazy with optimization, and some games even take away the choice to play at native resolution and/or without any TAA bs
Even if you get an AI to write the code, you still need a human to debug it. Because it WILL be buggy.
Yes, but they'll need fewer employees altogether. That's the point.
@@notmyrealname7634 It will be trained on human code. Humans write buggy code.
@@notmyrealname7634 But there's no way a computer will know it doesn't
@@notmyrealname7634 Humans do always produce buggy code. There are bugs in every level of the software you are using to read this comment. Some are known about, some will be found in the future.
@@notmyrealname7634 Yeah, but a human can think and work around bugs in ways that AI just can't. An AI can run code and see it doesn't work, and it can try to isolate the problem, but if it generated the shit-ass code then it doesn't know how to make it work. AI knows what good code looks like, but doesn't really know how to write it.
This is like saying midi has made music theory obsolete.
People think AI will code for them and software developers will cease to exist because, at the moment, AI helps write routine pieces of code by studying the work of real programmers who at least sometimes know what they are doing. Imagine what will happen when copilot software starts to cannibalize code written by other copilots. It will go bad very quickly.
it already has 😂 there are programming articles about it, similar to the AI pictures
@@Terminator-ht3sx Does it write programs as well as those six-fingered twisted-arms pictures?
Even for people who will never actually code, learning the logic of how code works is just really good building block of knowledge that everyone should get at least a basic understanding of.
i think the most telling thing here is that the CEO of a tech company probably hasn't written a line of code in 15 years
Or ever. I work for a pretty large tech company. You probably haven't heard of it, but if you live in Europe or Asia you've probably used a service that relies on our products.
The conflict between upper management, who have never touched code and don't understand how any of it works, and the actual developers is the biggest money and productivity loss for sure
Even if you don't end up programming for a job or anything, you should still learn how to do it because it's a great teacher of problem solving skills. Also it never hurts to better understand the workings of the machines that have become so crucial to our lives
The scariest thing about this "AI" stuff... is that what it's saying is "we shouldn't teach humans anymore"... what? I've actually seen people say that they've stopped trying to learn things because of AI.
What are we doing? If humans don't learn, then they won't come up with new ideas, and the AI won't do it, by definition.
So, just the end of new human technology and art?
What are we doing.
"the AI won't do it, by definition"
Yeah no. I agree humans should continue to learn but this is nonsense. By what definition?
Sometimes I dream of an Ayn Randian fantasy: Atlas Shrugged, except it's not the CEOs leaving but rather all the tired professionals like engineers, doctors, game devs, or any source of productive labor in general, finally getting fed up with being told how to do their jobs by the people above, going off to form their new society or whatever...
But yes, it is disturbing how this AI bubble's philosophy basically pivots around embracing ignorance, meaninglessness, greed, allowing yourself to devolve into something less than human, a mere economic actor whose entire existence is centered around consumption and accumulation of status... AI bros will tell people not to learn to code, not to learn how to draw or 3D model, not to learn math or physics or any other skill. They're manufacturing a crisis of meaning the only solution to which is to cover yourself head to toe in symbols of wealth. If you are no longer defined by your craftsmanship and mind, all you have left is your bank account.
I really hope it will all just implode on itself as soon as possible. I'm genuinely becoming more open to joining some commune just to escape this constant stream of propaganda. I just want to feel human again, free of this failed attempt to strip that from me.
> I've actually seen people say that they've stopped trying to learn things because of AI.
Why?
Ive learned more because of ai
Yeah, that's people failing to understand how technology works. GPS is literally causing parts of our brains to be less developed than a traditional taxi driver's.
Technology can let us offload work to devices. Our education has been moving away from memorization for a while; memorization was super important before universal internet access, and even more so before cheap books. Arithmetic is slowly being de-emphasized. The archivists still have a role to play. People who still build "out of date" skills are important, from orienteering to using abacuses to ham radio to making true sourdough.
I love it when tech CEOs like Elon or Jensen get to the position they are at and then say stupid stuff periodically exposing themselves, showing the world how stupid they really are.
I use AI to help me code, it gives you a good skeleton to work from and makes my workflow faster
But god damn, without knowing how to clean it up or correct extremely basic errors, it's just useless
This is what I've heard from some coding friends, too. It gives a fine base, but it needs soooo much revision to be functional.
The best take I've heard is that AI is great at bridging small knowledge gaps to speed up the process, because if it gives you a solution for a small problem, you then understand that and can apply that knowledge the next time you encounter that issue. If the knowledge gap is too large, eg if you don't know how to code at all, you will not invest time in understanding the solution the AI provides and it becomes useless again.
To borrow a quote... "Just imagine where we'll be 2 papers down the line".
For now. It'll replace programming jobs faster than artists can jerk off to your tears.
Here's why I'm not worried about my job (at least in the context of AI) for the foreseeable future.
Software engineering, as a job, isn't just "coding". In fact, it's surprisingly little coding.
It's architecture design, it's debugging, it's many things that AI hasn't even so much as touched on as a solution.
Can an AI write code better than me? Probably one day.
But I'll start worrying when that AI can start solving bug reports and optimization problems in a codebase with 60,000 lines of code. Once the AI can understand novel problems, and solve them, then I'll start to worry.
Every time VC weirdos, marketers and tech priests say "we don't need coders anymore," a bunch of actual coders/programmers laugh out heartily.
It seems like a great idea to transfer all of our computer science knowledge out of our hands and into the black box of AI. I'm sure it won't incest itself to death, again.
I'm a chemist graduate and biology master's student. I am currently working on my bioinformatics master's project in python. I have used ChatGPT as a helping tool, but let me tell you, it cannot even write code to properly extract information from well-documented filetypes. Each and every response from GPT had to be modified and adapted, despite being given a very detailed explanation of what I want it to do.
And while speaking to researchers, they say that "AI" (real term being machine learning) just creates the same amount of workload that it is supposed to displace.
Yeah, as a software engineer, everything that CEO says is wrong. This guy thinks we are already at "Jarvis, make me a suit that flies and is bulletproof".
It's pretty close to true that "a GPU is a computer within a computer". The difference is that the GPU has hardware optimized to solve certain kinds of problems very fast. If you can turn a given problem (e.g. mining bitcoin, which I know nothing about) into the type of problem that a GPU is optimized for, then it's possible to build an extremely powerful mining system by just throwing multiple GPUs at it. NVIDIA stocks surged when it was found that GPUs could mine bitcoin efficiently. It's obvious from this talk that he's just looking for that next bump.
Coding is absolutely an art -- even more than it is a science. In college, they will teach you the science, and on a basic level your code won't work without the science. But designing a system to be future-proof (i.e. easy to fix bugs, add new features, and teach to new engineers) is WAY more difficult, and WAY more impressive.
I love how the example of a simple AI generated code was to move a circle in Blender, when in reality, even this is beyond AI. LLMs are fundamentally unable to effectively work with niche libraries/APIs such as Blender/bpy.
I imagine that we'd be stuck at the "jagged hellish mess of edges forming an object that is impossible in a simple three dimensional space" stage for a while with Blender.
I've been trying to make it update a library to blender 4.0.... guess how that's going.
as someone who's currently in the process of learning to code (SQL specifically) the notion that everyone is now a programmer because the "programming language is now human" is fuckin' stupid. It's ALREADY human language, the syntax is just incredibly specific and commands run in certain orders. He's actually saying nothing here, but since no one knows how to code people THINK he's smart and saying something profound.
AI will not save you, my dude.
In a timeline where everyone listens to him...
"Oopsie, I tripped and hit a lever that made the subscription fee to use the AI engine go up by 500%. Can't go back, either, muh bad. UwU"
I feel like no one knowing how to code is unironically the first step towards the machine uprising
He's right that not *everyone* needs to be able to code. He's wrong that coding can be fixed by asking an AI to do it.
It's like telling people you don't need to know how to do math anymore because we have computers and calculators now. But it's still important to know how to solve a math problem; it's important to know what you are inputting, and to understand objective truths of our universe and world. Otherwise, yeah, we may have calculators and computers, but do we know how to use them?
Even if that's true, even if AI replaces everything, we should still continue to learn and understand it, else we leave future generations with technology indistinguishable from magic.
As an engineer: I'd personally really like some AI that does first drafts of certain engineering drawings. Mind you, I do think this would get misused at some point and it would be a whole drama, but used responsibly it could cut out hundreds or thousands of hours per project of meaningless grunt work in the engineering world and let engineers focus on important details instead of sifting through endless mounds of busywork.
Chatgpt is also not terrible for first-pass sanity checks on engineering calculations. It has about the reliability of a freshman engineering student with a textbook in front of them, but because it shows you its work, it's not hard to use as a first draft, then correct its math manually if it made a silly mistake.
I do feel like there is real potential here as a work aid in some applications. But it's not a replacement for real human work the way CEOs really fucking want it to be. Someone still needs to sign off on the bridge being built, and that means going over everything with a fine tooth comb and human eyes. Those parts cannot ever be replaced by AI, and it's deeply concerning to see CEOs salivating over the prospect.
The more you learn to code, the more you realize how full of shit those AI-Techbros are. Especially if you write and train a LLM yourself, even if you are not good at it (you still gain a decent amount of understanding on how it operates).
i'm relieved. i'd never actually heard that AI could fuck up code. it makes sense with a lossy reiteration, but i'd taken people's word for it that the computer can speak computer. can't wait for AI to genuinely fucking destroy anyone who invests too much trust and finances into its accuracy
it's saying don't learn math because fucking calculators exist
The issue is that GPT AI is not great at working with scientific facts, rules, data, and structure. ChatGPT is actually pretty useful for writing small snippets of code, as long as there is someone there to review the output, verify it, and rewrite it for correctness. There's no way in hell I would trust it to generate an entire program in its current state. GPT is great at vomiting something that approximates human-written text, but when it comes to actual facts, rules, coding language, structure, consistency, and protocols, it can easily forget what it wrote 100 lines ago and start contradicting itself, or simply write code using functions that don't exist at all. I could imagine a future AI that incorporates a combination of GPT-style AI as a sort of front end ("hmm, what is the user asking, and how should I organize the output?") with another system in the back end that actually reads data, coding rulebooks, facts, history, databases, etc. Basically, in order to guarantee correctness, we need a human-written architecture to do the heavy lifting with data.
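The "GPT front end plus verifying back end" idea from this comment can be sketched in a few lines. Everything here is hypothetical: `generate_candidate` is a stand-in for an LLM (its canned outputs simulate a buggy first attempt and a corrected retry), and the verifier is just a deterministic test harness.

```python
# Toy sketch of "generator proposes, verifier disposes". All names are
# made up for illustration; no real LLM API is involved.
def generate_candidate(attempt: int) -> str:
    # Pretend the model's first try is buggy and a retry fixes it.
    if attempt == 0:
        return "def add(a, b): return a - b"
    return "def add(a, b): return a + b"

def verify(source: str) -> bool:
    # Back-end check: actually execute the code against known facts,
    # rather than trusting the generator's fluent-sounding output.
    namespace = {}
    exec(source, namespace)
    return namespace["add"](2, 3) == 5

def generate_verified(max_attempts: int = 5) -> str:
    # Keep asking the "front end" until the "back end" accepts.
    for attempt in range(max_attempts):
        candidate = generate_candidate(attempt)
        if verify(candidate):
            return candidate
    raise RuntimeError("no candidate passed verification")

print(generate_verified())  # the corrected second candidate
```

The design point is that correctness comes from the human-written verification loop, not from the generator itself.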
I'm a software engineer and you're 100% correct, but for now. An application will always require humans to create it, but maybe fewer developers will be needed in the future. It's not only going to affect coding jobs. Some jobs will be safe from AI, like dentistry. Imagine a ChatGPT with access to an entire repository and documentation; it would only need a method written for it to generate tests or suggest tests for written code.
6:15 - Vaush says that when he thinks of Nvidia, he doesn't think of AI.
This shows how little he fucking knows about the AI field and about Nvidia. Nvidia is the most important company related to AI in the world at the moment. Their stock price has absolutely ballooned as a result of their AI chips that are far faster than any other processing unit on the market, which make training models far faster.
You can't debug AI code without knowing how to code.
how about AI debug the code itself 💀💀 ???
@@OnePiece_Fandom Works up to a point. Don't know what future will bring.
@@OnePiece_Fandom who gets to decide that the AI can do it itself?
As an engineer... AI is a fantastic way to get some code that you then have to troubleshoot much longer than if you had just written the code yourself.
"code is code."
great, now i have that meme stuck in my head: "MATH IS MATH"
but as a math enthusiast, you do have a point.
LIFE IS LIFE
My brother, who has a college degree in programming, and before that got a job at a game studio straight out of high school, has spent the last year and a half teaching himself Godot... from the horror stories I have heard, I don't think AI is going to be all that successful.
The kind of mistakes AI makes with image generation? would *destroy* the programming for something like a video game.
Meanwhile, given that I myself went to *art* college, I can 100% vouch for the "artists learn to iterate."
If AI is the future for creating video game graphics, I'm assuming that, eventually, everything will look like a Mandelbulb fractal. Which looks cool, but when everything in the game has the same "look," it'll get old pretty quick.
We are 100% on a DUNE-WH40K timeline.
he literally wants to gatekeep tech because he knows that tech eventually will have the potential to upend the system that made him rich.
it's less sinister and more cynical than that. he cares about quarterly earnings/growth, and it boosts their share price to hype themselves as THE market leader in what investors perceive as the big upcoming tech for the next decade+. he's not a visionary. he's a hype man. he's selling shovels for a gold rush.
As a programmer of sorts (Computational Neuroscience) I don't use ChatGPT because the output has no practical value to me. The code it outputs is actually pretty much gibberish. At most you can pull the individual commands line by line from the output and adapt them... which is basically the same as having looked up the relevant commands like "how to do X operation" or "how to visualize Y." It's just not useful for coding.
I've also never used it to cheat on assignments not out of some sense of honor but because it can't provide reliable or even basic information on any topic at a college level, at least not in BME or Neuro. The topics are too niche and complex to fake knowledge about, so you have to actually, you know, do the assignments and answer the questions. "What happens to your field of view when there's a lesion to this part of the brain... " or "how far apart do these nails have to be placed to bear a load if XYZ"
I will accept his advice on one condition. He personally pays UBI for all Americans.
The luddism is strong here.
"He has to know, he's the CEO of NVIDIA!" ... but for some reason vaush feels more qualified to comment on this.
Saying that humans have an esoteric quality about them which can never possibly be synthesized is religious quackery.
I am curious to see your redemption tour in 2-5 years.
Stuff like Copilot is impressive and already useful, but it is still very *very* far from replacing programmers. Completely replacing a job like that would require something more akin to ACT-R, which is largely a theoretical thing for now.
Python is not a backbone for anything, it's a glue that sticks lots of things together.
Imagine if my accounting classes never taught me how to record journal entries or what debits and credits mean and how they function within the accounting system because "the accounting software does all that for you." That's great; now I'm a shit accountant because I have no idea how to fix a problem with the balance sheet not balancing, since I don't even know what it means to be out of balance. You need to understand the fundamentals of a system even if there are tools that make said system easier to use and interact with. Everyone should learn coding; the CEO of Nvidia is exactly the reason why.
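The balance check this comment describes can be sketched in a few lines of Python. The journal entries and account names are hypothetical, purely for illustration:

```python
# Hypothetical journal: each entry is (account, debit, credit).
# In double-entry accounting, every transaction's total debits must
# equal its total credits; if they drift apart, the books won't balance.
journal = [
    ("Cash",      1000.0,    0.0),  # debit: cash received from a sale
    ("Revenue",      0.0, 1000.0),  # credit: the matching revenue
    ("Equipment",  500.0,    0.0),  # debit: asset purchased
    ("Cash",         0.0,  500.0),  # credit: cash paid out for it
]

total_debits = sum(d for _, d, _ in journal)
total_credits = sum(c for _, _, c in journal)

# Software can flag an imbalance automatically, but figuring out *which*
# entry is wrong still takes someone who understands debits and credits.
assert total_debits == total_credits, "books are out of balance"
print(total_debits, total_credits)
```

The check itself is trivial; the point of the comment stands because the diagnosis, when the assertion fires, is not.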
he's not ignorant. He knows what he's advocating for will result in the demographic that has any computer literacy being reduced to autodidactic nerds, which he's fine with because that's the bubble he's in. People have a tendency to be okay with a consolidation of power if they're in the group the power is being consolidated to.
Why does Google say that Nvidia invented the GPU in 1999? SGI made graphics processors before them, the Nintendo 64 had an SGI GPU and was released in 1996.
I'm a mechanical engineer. I also work in programming CAD add-ins and macros. I also program robots (industrial robots for welding and other purposes) along with actually making entire welding cell projects. I only tried ChatGPT a bit. It is not impressive for fields in which there is not enough data. I mean it is not even like for web development where it throws some relatively okay stuff with some errors. It just throws some illogical stuff that is completely wrong due to lack of data. Just try it for PLCs or for KUKA KRL language. It is just bad due to the lack of data.
Watching ChatGPT write firmware would probably be hilarious. Just don't use the code. The lack of security would be astonishing, not even counting all the bugs.
He's right. I'm pretty sure 90% of all jobs will be obsolete by 2030 due to AI.
No it won't
The reason he’s spouting lies is because he knows how important actual programmers are. It’s about paying programmers to do the same job for less because they utilize AI while also taking a check from them for the right to utilize AI. It’s about shrinking the power of the worker, making them more replaceable. It’s the same across all industries.
ChatGPT code is legitimately terrible and sometimes doesn't even compile
Because it's not meant for that. OpenAI has a model for coding called Codex but everyone seems focused on ChatGPT's capabilities
The reason graphics processors are good for these models is that the models use multi-dimensional arrays to hold all the different characteristics of the object in question. They do vector math on 3-dimensional objects in graphics rendering, but they're not limited to 3-dimensional math. In fact, luminance, reflectivity, and so on can be assigned dimensions and math can be applied to alter those values. Likewise there can be dimensions for heaviness, colors, does a thing make noise or not - just add more dimensions after x, y and z and you can do the same manipulations even though humans don't consciously acknowledge all these properties as dimensions. Anyway, being made for 3d graphics literally is the link between graphics cards and machine learning. That aspect is not a fad or fraud in any way.
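A minimal sketch of the point above, using NumPy as a stand-in for GPU hardware: properties beyond x, y, and z are just extra columns in an array, and one batched matrix operation transforms all of them for every object at once. The attribute names are made up for illustration.

```python
import numpy as np

# Each row is an "object" with more than three properties: x, y, z plus
# luminance and reflectivity. The same linear algebra GPUs use for 3D
# graphics applies no matter how many dimensions you add.
objects = np.array([
    [1.0, 2.0, 3.0, 0.5, 0.9],   # x, y, z, luminance, reflectivity
    [4.0, 5.0, 6.0, 0.2, 0.1],
])

# One matrix multiply scales every property of every object at once,
# exactly the kind of batched operation GPU hardware parallelizes.
scale = np.diag([2.0, 2.0, 2.0, 1.0, 1.0])  # double positions, keep the rest
transformed = objects @ scale

print(transformed[0])  # first object with doubled x, y, z
```

Swapping the five columns for thousands (as ML models do) changes nothing about the math, only the size of the arrays.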
He keeps makin' these, and I keep watchin' 'em. I am being RADICALIZED BABY!
This is the Patrick Star meme "We have technology."
5:11 The ability to convey what you want is solution ideation, something that can be taught, as programming is. The ability to generate solutions in natural language has benefits that are universal, whereas syntax knowledge is contained. Can you not use your vast knowledge to infer how AI could evolve to increase/enhance output?
In this hypothetical future: While AI coding is the primary source of software, people with coding knowledge will be able to hack into the sloppy patchwork systems that are vomited out to the masses. Then, once the AI coding algorithm collapses under the weight of incest, people with coding knowledge will be highly sought after to rebuild the digital world from scratch. [All that assuming AI coding actually takes over.]
Really, this is a hype-job for the short-term boosting of NVIDIA shares on the back of the AI tech bubble. If people actually fall for this bad advice, the long-term results for NVIDIA will be labor shortages and even _higher_ pay for good coders.
*Never* take advice from the public statements of a CEO.
Okay, that chatter did not explain it, but "bouncing idea off AI" is bouncing idea off yourself, using AI as you would yourself when you talk to yourself during a creative process. In that way, it does have some minimal use.
I'm a music teacher and I have accepted that it will come for my job one day as well. It's coming for all of our jobs. I understand your anger.
As long as there are humans left on this Earth who still hear the voice of the sublime over the loud scream of the material, some jobs will not be replaced, for they were never about the product but the process.
@@rkvkydqf -- The point is not enough jobs will be available for the number of people that need to work to make money. Not that "all jobs will be gone."
Funny how humans are fighting tooth and nail for their soulless jobs that they actually hate 😂
@@wantanamera It's almost as if they need to make money to survive and therefore can't afford to lose their soulless jobs...
@@andersonisowo9603 no shit
This is hilariously wrongheaded. Basically this CEO has reinvented something the rest of the world already tried and rejected decades ago: programming in natural language is just COBOL (a business-oriented "English-like" programming language from the 1950s) all over again! That was a travesty because it turns out coding is the least of a software developer's job (it's mostly about design and modelling, stuff LLMs can't do, btw). Business people suck at even just describing what they want, let alone making a reliable specification for a machine. Anyone who has ever developed for a client project can vouch for that.
It's almost like this man has a fucking team of genius devs who advise him and have already supported this idea's feasibility. You're acting as though you have this knowledge of COBOL and the only reason they're pursuing this is because they're dumb and missed COBOL's attempt at an English-like programming language. The truth is probably that they know of COBOL and are going in a more successful direction.
@@brutuslugo3969 I have seen enough high-profile flops with even more trivial oversights to not share your blind faith in corporate meritocracy. A very recent example is Hyperloop One (formerly Virgin Hyperloop). It swallowed years of investment and grants only to shut down with nothing to show for it. One of the engineers said it will never be practical because, in short, it turns out the Earth is too bumpy for vacuum trains. Yeah, no shit! A team of geniuses behind a maniac boss will just go "yes sir, we'll look into it" and waste as much time as they can get away with, then jump ship just before the gravy train runs out.
As a software engineer I don't really want to take his side, but I can see where he is coming from. He is correct that the environment is changing, and with it the tools we can and should use to create software. Like, no one needs to learn assembler to become a programmer, because there are higher-level programming languages that are more accessible, "more powerful" and easier to learn. So that much is certainly true.
BUT. We can't dismiss everyone in the field and skip learning this stuff in the first place. Sure, AI is a great asset for supporting development, but it can't do the heavy lifting of deciding what to do for a very specific use case; we cannot assume it works properly at all, and we need to check what has been generated, which in turn means that we "at least" need to understand what it generated.
So yeah. His basic idea is correct, but his understanding of the field is lacking, to say the least.
"Stop programming." OK so then what do we do? Go back to factory jobs?
You'd be lucky to find a factory job in America, more likely the service industry
AI that produces code blocks is basically a machine that mass produces technical debt.
ChatGPT can barely code the game Snake on Unity without very specific instructions. But sure, it can somehow code everything now.
Remember that AI will improve over time.
So many straw men in this comment section ... who is saying ChatGPT can somehow code everything now? Who?
This is why I take the Torvalds perspective when it comes to NVidia
F*ck Nvidia?
Generative AI is impressive but far from being able to replace an engineer. In some ways, the tech industry does not change as much as people make it seem. C++ is still one of the most popular languages after 40 years. We’re still using network protocols developed in the 70’s. The concept of relational databases which are still widely used today was defined in the 70’s.
2:25 the commenter's correct. At that point in the video the CEO has only said that you shouldn't say *every* kid needs to learn to code, not that nobody should learn it.
That said, as a programmer, I can confidently say that AI is currently not good enough to code and while it can certainly make coding easier, I doubt it'll make programmers obsolete any time soon.
I think Jensen is saying this in anticipation of advanced A.I. that will be developed within this decade. Yes, current A.I. (ChatGPT, Gemini) makes a lot of mistakes and isn't really useful due to the number of hallucinations. However, this is only temporary. A.I. currently is the worst it will ever be. It will only get better, and I think we'll have to accept that as a society. As leftists, I think it's a good time to start strongly advocating for some kind of UBI.
Yeah Vaush is so dense on this issue it's insane.
I read all the comments; looks like you are the only person who got Jensen's point. He's talking about AI in the future.
Vaush, the thing is, GPU stuff DOES now involve AI with the latest developments... As of like a year ago you can use AI to basically upscale Skyrim and literally DOUBLE your FPS via a thing called DLSS, or I think some other thing if you're using a different type of GPU... Essentially, you run the game at like 360p which is obviously going to be easy for your GPU since things look pixelated and shitty, but then the AI comes in and upscales things to like 1080p or 4K... With your settings right, the only downside is some very minor artifacting that pretty much no one is very likely to even be able to notice to where it seems basically graphically indistinguishable from not using AI to the average person... But again, for a major boost in performance, like 40 or 60 or more extra FPS...
everyday i stray closer to the unabomber fr fr
^ Not the own you think it is…
@@ArDeeMee Dictionary
Definitions from Oxford Languages · Learn more
sar·casm
/ˈsärˌkazəm/
noun
the use of irony to mock or convey contempt.
"his voice, hardened by sarcasm, could not hide his resentment"
17:04 “The modern world is a sight to see
It’s a stimulant
It’s pornography
It takes all my will…
Not to turn it off”
-Barbary coast.
By Conor Oberst
ChatGPT can code about as much as a 5-year-old can paint: yeah, it can smash its fingers into paint and draw a fucking smiley face... but it cannot paint a wall white, and it sure can't make a masterpiece...
My understanding is half of all bio-engineering research at this point is just a guy sipping coffee while an AI is running through a billion data points, then when it's done an hour later he tries doing what AI says and sometimes it works.
Vaush is definitely underestimating just how impressive AI is with software. It doesn't build your apps for you, but if you're a senior and know exactly what you want, you can develop lightning fast with it. It will give you complex code or starting points if you tell it the right thing.
I come here for all you smart lefties. I live in rural Florida and sometimes it feels like I'm the last one.
while a little extreme, as a game dev ive seen how good chatgpt etc is and soon most code will absolutely be from AIs in some way. even my coders are like "why are you even paying me to copypasta chatgpt".
Also as a game dev I'm dreading when games are mostly coded by ai. There will be a bug some major game studio will have and it'll take them 6 months to even find the error since no one actually wrote it. This will happen for every single bug the game has. Ai can't take an entire unity project and spit out something workable.
As an electrical engineering student, my peers' futures and mine look bleak af with the explosion of AI.
Just imagine how doctors must feel.
@@OneEyeShadow in what way are doctors threatened by ai?
@@OneEyeShadowConsidering how over capacity basically every healthcare service in the world is perpetually any automation in healthcare should be welcome.
@@big-gloomDiagnostics mainly. Auto segmentation and generally handling of data as well. AI is already kinda superior when it comes to recommending medicine.
@@OneEyeShadow Pretty fine, I presume. WebMD has existed for a decade or two, actually uses a statistical model more fit for its purpose, and has only helped expose incompetent medical practitioners for who they were; at no point did anyone fucking dissolve hospitals and show people DIY tutorials for surgical operations.
And as for electrical engineering, you'll be fine, even more so than software, which also isn't going anywhere! Your domain is applied physics; its levels of complexity are so astounding this stochastic parrot won't get past the local optimum of memorizing test answers. All types of engineering share one defining trait: complexity. Until these things can understand, with nuance, paragraphs and paragraphs of documentation (something even the latest GPT-4.5 struggles with over just 300 LOC of normal code), they will not be practically applicable.
It's interesting to see the levels of anger Vaush reaches, and equally interesting how he calms down once chat says something that isn't stupid and actually contributes knowledge.
Another implication of telling people not to learn to code: you end up with fewer people who know how to do it, and then companies can pay the few people they keep very little to maintain the AI.
Every dark age humanity has experienced was preceded by an era where we stopped caring about how the stuff we had worked
I'm new to programming (studying gamedev for homebrew NES games) and I wholeheartedly believe AI could NEVER replace humans in programming. Optimizing code, let alone building large and complex programs, requires a deep understanding of how every section of code interacts with the others. To my understanding, AI works by looking at something and comparing it to something else, not by assessing the function of something and building toward the goal from scratch by working out the minimum instructions required.
My favourite AI story is that they trained a model on medical scans of people's lungs to detect tuberculosis. It got better than actual doctors, so they tried to dig through it to figure out how it knew. It was looking for visual artefacts present in old medical scans - when tuberculosis was much more common - and assigning those to have much higher likelihood of tuberculosis.
But yes, AI is going to replace us, dude.
I think some other comments are already explaining more or less the same, but the reason GPUs (graphics cards) are more popular (and better) for things like AI (or crypto/Bitcoin mining for that matter) than just straight "computers" (CPUs) is that GPUs are basically giant clusters of mini-CPUs that can do lots of simple computations in parallel.
* Rendering graphics involves mass amounts of simple calculations for vectors (think how lighting at various different angles takes lots of simple math to figure out exactly the shapes you see on the screen), so they're built for exactly that
* Bitcoin is "mined" by solving tons of calculations (to the extent you even can still mine it, not sure if there's much you can still get out of that today)
* AI models process mass amounts of data in matrices to perform operations on, which to the computer is just more extremely repetitive calculations
CPUs, by contrast, have more complex designs and capabilities, but can only do so much at once and won't process all that stuff nearly as efficiently.
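A rough illustration of that contrast, using NumPy's vectorized operations as a stand-in for what a GPU does in real parallel hardware (the lighting-style calculation here is just a placeholder for "lots of simple, independent math"):

```python
import numpy as np

# A million simple, independent inputs: angles for a lighting-ish
# calculation, the kind of workload the comment above describes.
angles = np.linspace(0.0, np.pi, 1_000_000)

# "CPU style": one calculation per loop iteration (only the first few
# here, since looping a million times in Python is slow).
loop_result = [abs(np.cos(a)) for a in angles[:4]]

# "GPU style": the identical math expressed as one whole-array
# operation, which maps naturally onto thousands of simple parallel
# cores doing the same instruction on different data.
vector_result = np.abs(np.cos(angles))

print(vector_result[:4])
```

The results are identical; the difference is that the array form describes all the work at once, so hardware built for parallel simple calculations can chew through it together instead of one element at a time.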
I am a biochemist and a computer scientist. You are correct in your hatred for generative AI, but using data science, both in vitro and in silico, is the most powerful tool there is for learning things.
Just because a claim is exaggerated doesn’t mean it’s wrong. There’s been a coding bubble getting bigger and bigger for the last decade
I too hate the AI hype, and Vaush wasn't too far off technically. However, he doesn't get Nvidia or normal machine learning; as chat helped him see, models for biochem are quite capable, e.g. protein folding as mentioned. And Nvidia is predominantly a hardware company with actual products and R&D, selling computing power based around GPU technology. (Many of) their products are genuinely really impressive, and besides the gaming market known to most people, they power all sorts of datacenter and enterprise workloads, e.g. VFX rendering, CAE, BigData and all those buzzwords, including AI, the training of which famously requires insane amounts of computing power. So Nvidia is mostly riding the hype by selling the compute; in the gold rush they get rich by selling shovels.
Do these tech bros understand that we don't actually have thinking AI? It's all large language models... they really seem to think that we have, right now, sentient thinking AI...
No one who is somewhat knowledgeable about AI thinks that
I guess as long as he profits off of selling more or less an illusion, he simply won't care.
AI is the new FSD that never delivers 😎
@@tropix4429Well if he were smart enough to know that, he’d know that it would take a thinking AI to replace a software engineer.
The sad thing is that AI has incredible liberatory potential if done correctly. It's very different to previous tech bubbles like Crypto. AI has the potential to give us fully automated gay space luxury communism if we actually did it right
but we wont though....
coding is also about a way of thinking and analyzing problems. As a business analyst / programmer analyst, a small part of problem solving is the actual coding. Most of it is framing the problem
that reminds me of a video i've seen. a very experienced programmer talked about how manufacturers don't want us to understand and directly use the hardware. the abstractions we can use with the hardware are somewhat vetted by the manufacturer, remotely most of the time. it's supposedly not because it's that hard to learn to use a specific GPU architecture, they just don't want you to use something else than the graphics APIs that abstract most of that GPU-specific stuff.