Full podcast episode: ruclips.net/video/pdJQ8iVTwj8/видео.html Lex Fridman podcast channel: ruclips.net/user/lexfridman Guest bio: Chris Lattner is a legendary software and hardware engineer, leading projects at Apple, Tesla, Google, SiFive, and Modular AI, including the development of Swift, LLVM, Clang, MLIR, CIRCT, TPUs, and Mojo.
GPT doesn't possess cognition. I have used it in my own programming projects, and at times it proves to be a better tool than Stack Overflow. However, it occasionally generates incorrect code and apologizes when you point out the mistake. Instead of providing the correct version, it continues to offer additional incorrect suggestions, resulting in an infinite loop of erroneous recommendations. It amuses me when people claim that these issues will be resolved in future versions of GPT. In reality, such problems are inherent to large, mature language models and cannot be completely eliminated unless a revolutionary alternative emerges. Ultimately, when GPT fails, I find myself turning to Stack Overflow to seek human feedback. In simple terms, what GPT creates looks impressive only to the untrained eye and to a mediocre programmer like me.
There is no technical reason why these engineering problems cannot be resolved. I'm not sure we necessarily need a revolutionary alternative, but given the help we can get from these new tools in developing new solutions, I'm sure progress in all areas, including programming, and especially in programming-focused solutions (a "programming GPT plus some other type of AI" hybrid, for example), will be quite astonishing and continuous going forward.
@@ingerasulffs lmao > There is no technical reason why these engineering problems cannot be resolved What happens when you train a NN on a lot of average code? It will only produce average, buggy code as a result. Either you don't understand what NNs and LLMs are, or I dunno
Before watching: NO, it won't, and I'm honestly tired of these BS discussions. There is almost infinite demand for software, and the only reason it's not being made is that it is incredibly expensive: corporate system implementations cost BILLIONS. Tools like GPT will drive the price down, one developer will be able to increase their output, companies will want more stuff, more frequent updates, etc.
This is the correct response. There's a gigantic and painful backlog of software projects in most companies and this won't change - rich people will always be competing for market share via software projects. You'll also need developers to write the code. Software projects are rarely one-paragraph generic scripts easily located on Stack Overflow or elsewhere online. They're often tens or hundreds of thousands of lines of code, specifically tailored to a unique use case. You can't just prompt your way into that. This whole thing is Stack Overflow via prompt, and nothing more.
That was always my opinion. Demand for software will always be effectively infinite: you can always make more stuff, new things. There is no end. You will just be able to do much more in X time.
This. The number of developers you employ is a costly value to change, and it doesn't change anything when you consider that ChatGPT doesn't do all the work a programmer does. The value that will probably change is code deadlines. If a team can write faster with ChatGPT, people will just want updates and features sooner, not randomly fire 10% of their staff and keep going at the same rate.
Why not use your programmers as prompt engineers, so that if there is any error they can take care of it? And you can certainly reduce the number of workers required in any particular company.
As a programmer who has spent hours fighting with ChatGPT to get working code for a new problem (and failing)... of course it won't replace programmers. Future developments could change that, but even then it will take years of "co-piloting" with human coders before it could possibly be trusted. To be clear, I'd use this in an IDE co-pilot role at the drop of a hat - but that's productivity increase, not replacement.
But that’s merely anecdotal. With training and greater application, one programmer can become much, much more productive. ChatGPT wouldn't take the jobs of ALL programmers, just the least productive. It’s like going from shovels to excavators when digging a hole. 100 people (programmers) can dig the hole (write the program), but 10 programmers using AI can dig the same hole in the same time frame. Suddenly you’ve lost 90 jobs.
@@Notabot437 it's not "bound to happen" any time soon, and the capabilities of ChatGPT are limited to doing things that aren't that interesting to begin with. The only people who think it will replace programmers tend not to be doing much programming to begin with.
I really admire how down-to-earth Lex is. Despite being really smart and talented, he never comes across as arrogant. It's refreshing to see someone like him who interacts with others in such a humble way. I hope to learn from his example and be more like that in my own personal interactions.
@@Adamskyization He is smarter than 90% of people on Earth, or 85% of US residents, for sure. The fact that Lex can push provocative conversations without losing control of his emotions just shows he has a good brain, which most of us do not have.
Newer (just a year and change) self-taught programmer here, and I'm getting to the point of building real, robust projects and ideas. ChatGPT has been amazing when I realize, for example, hey, I need to loop over this particular data type, or something simple I don't know how to do in that moment. But whenever I ask it for something a bit more complicated, I always end up fighting erroneous responses and having to over-explain my ask. This could definitely be improved in future GPT iterations, but I think the idea of truly understanding what someone wants and needs seems super hard to reproduce with an LLM.
As of now, LLMs that write code are like low-skill interns doing tiny bits of code generation that need to be supervised by actual developers. They need to be guided along, and they need a lot of help to be integrated into an actual project. It is very impressive, don't get me wrong, but it's nowhere near human replacement, and I don't see that changing drastically anytime soon. Programming, unlike other activities, needs a lot of contextual understanding. It is on the opposite end of the spectrum from highly specialized activities like digital illustration. We've already seen the latter being perfected. I'm going to say that AI will be capable of the former sort of activities LAST and the latter first. Things like game dev especially require so many unrelated skills: music, level design, etc. If one AI can do all of that, then I suspect it could do everything else in the world, at which point we have AGI and the singularity. I wouldn't be too worried about programmers... I'm more worried about the world as a whole.
I think future AI models will probably replace some parts of what we do quicker than we think. But I also think anyone with a mind capable of manufacturing complex software will probably find a way of building something interesting and novel with the new tools AI creates, or of extending our capabilities to simply make more fantastical software. I can't imagine it writing any and all complex software projects that we could produce, especially when we consider it as a tool that extends our capabilities. Just some thoughts: even if it were capable of responding to "Generate me a cloud-based web video viewing application", you might want things like "Oh, but with support for webm videos", "and support video comments", "but make the comments filter out profane language for users under X age", or "with a REST API and documentation for comments and stats". So programming could definitely become simplified into product/technical descriptions some day, where rather than a repository of code you could have a repository of product descriptions, with caveats and nuances in human-readable, understandable language (perhaps plain English). Humans love pushing the limits, so we'll probably use those programmers to push the limits of how complex a prompt we can write, and generally to solve novel problems in the realm of "what do we want exactly", at least until AI can predict what we want and produce outputs better than we could even think to ask for. I wouldn't be surprised if humans and AI end up in a reinforcing feedback cycle: humans training AI with new and improved input, and AI providing new and improved tools for humans to produce new and improved outputs. Nor would I be surprised if many of us move (many, many years from now) towards mostly working by training and improving AI/LLMs with high-quality inputs and feedback, or by providing the right prompts/inputs for the desired output.
Idk, it's impossible to really predict, but it will at least be interesting to see where things go.
The question is whether the prompts required to gain useful answers from AI will change with every new iteration of the AI or whether we can start to build a pattern of understanding of how best to communicate with AI as a whole.
Reflect on the unique value humans bring to programming at [0:11]. Consider how large language models (LLMs) are changing programming practices at [0:18]. Recognize the potential for LLMs to automate routine coding tasks at [2:32]. Explore the role of LLMs as companions in the coding process at [2:38]. Contemplate the interplay between human creativity and LLM-generated code at [5:28].
Jesus Christ, finally someone who's not following a narrative built on irrationality! Nobody can predict the future, but if you program complex systems, you know very well where the limits are. Maybe AGI will arrive, and by then we're all in the same boat; the issue will always be if we're not ALL in the same boat!
I know, I get so sick of this constant assumption that ChatGPT is going to get so much better even though its fundamental operation really doesn't do what programmers do. I'm also not buying Lex's angle that the statistical average of all language on the internet is somehow a deeper insight into the nature of reality and intelligence.
The point is, you do not need AGI. The idea that AGI is needed to change something is stupid. See, larger projects have people writing specs, there are coding standards, there are testers. An AI that can work within that can make most developers redundant. The hint here is: that is NOT AGI. A good programming AI may not be a doctor or a lawyer, and the definition of AGI is that it does ALL the human stuff in one model. Nope, it is specialized. It is just 2-3 generations beyond what we have now.
@@ThomasTomiczek Even what you're saying isn't enough. People generating "specs" to feed into an AI is not enough, because specs are written in an ambiguous language known as "English", whereas code is written in a logically unambiguous language known as "code". Tests don't really solve the problem either, because your code can pass all tests while still failing in production, and writing tests is often not possible without a pre-existing API or knowledge of how said API is implemented.
@@zacharychristy8928 Humans are not intelligent... We can reason about things, figure out ways of solving problems, and build on other people's hard work. But what is intelligence? Can you focus really hard and invent something new? For sure not. You can only build on other people's experience in any given field. Intelligence is an illusion. Don't fool yourself.
ChatGPT will never replace software developers, because ChatGPT first has to figure out what the client wants, and the client doesn't even know what they want.
See, first: ChatGPT is a chat program. A proper cognitive loop using the API can have WAY better capabilities. Second, you work on super small stuff, right? Because every project I did in the last decade had PRODUCT OWNERS handling this part. Then user stories get evaluated (which ChatGPT actually is not that bad at). Planning poker. It is not there yet (even with a cognitive loop), but saying "never" marks you super high on the "ignorant idiot" scale. Three years ago it could not discuss complex ethics; now it can. What is "never" in your universe? Two years? What about ten?
@@ThomasTomiczek It rather sounds like it is you who is talking bullshit. Cognitive loop? These things are currently tripping over their own randomly generated BS, with devs desperately trying to mask it by gluing not some very intelligent AI but rather clunky, old-fashioned, hardcoded logic workarounds on top of it to hide the embarrassment. They're also slow and memory-inefficient as hell compared to handcrafted algorithms, say, for parsing or compiling good ole machine languages.
@@ThomasTomiczek no, because there is no "cognitive loop" that's a term you're inventing that doesn't exist. ChatGPT is simply a "semantic loop" which is not the same thing. You don't get deeper insights into what would help a person solve a problem by simply being able to generate a response to the last thing they said. You're imagining a richer operation is taking place than what actually is. There is no insight to be gained from simply generating responses. You're implying there is some context-aware cognitive model underneath because you've been utterly fooled by a Chinese Room.
I imagine in the future, the client will ask the AI for an app and it will get pumped out in seconds. If the user doesn’t like it and requests any changes, the AI will attempt again; repeat until the client is satisfied.
In my experience the jump from GPT 3 to 4 is huge for programming in python specifically. I've found it helpful to first communicate back and forth with GPT to design an algorithm, then once satisfied with the logic to ask just one time to produce the code.
What a kind and lovely person Chris Lattner is. Thank you for Swift
I am a novice in programming, but it has proven to be extremely useful in my work and I am still learning. I can imagine that many advanced programmers might be able to accomplish more sophisticated tasks, but for a beginner like me, GPT-4 has been an incredible booster. I believe I have at least doubled the amount of useful code I can produce. Additionally, it has taught me how to learn about programming. My fear was that using GPT-4 would make me complacent or reduce my drive to learn more, but in reality, based on my experience, it has accelerated my learning and helped me make more progress. I think the impact will vary greatly depending on one's level of programming expertise, but this technology is bound to transform the field in one way or another.
Calculators didn't kill science; they just made it less error-prone in certain areas. LLMs will become for programmers what calculators are for scientists: they do simple things quickly, allowing us to shift focus to more complicated things.
Except that it didn't. I started my undergrad in 2006, and even then we considered "coding" a menial or "robotic" job. AI has existed for years and has only gone mainstream now.
@@rockwithyou2006 You need to discount everything before AlphaGo (the 2017 version); those 50-60 years were basically spent learning how to build the effective tool that is actually-working deep learning.
So far I've found it useful for websites, web servers, microcontrollers, datasheet reading, plc languages, video game engines, and so many other languages. It's an extremely useful tool.
@@synthwave8548 Well, in the world of automation engineering there are two methods so far that let somebody program a robot for an industrial application: using a PLC with an HMI to display data to the operator, or the alternative of using a microcontroller in C, which is more in-depth than a PLC. If you want to set up a web server on a microcontroller, you need to include the JavaScript and HTML in the IDE along with additional Node software and files. Using ChatGPT for that is a great option because it will specify all the libraries to use in the IDE and write the JS and HTML code necessary to display the website correctly. I've also asked ChatGPT to write code for a PLC (it can do both structured text and ladder logic) for simple digital and analog outputs to relays or transistor switches, counters, PWM, variable current supply to a pneumatic actuator or variable frequency drive, feedback from sensors requiring analog-to-digital converters, comparators, etc. Basically, specify your PLC and whether you want ST or LL format, describe the industrial scenario you want to accomplish, and it will get you started, continually refine itself as we know, and provide an in-depth explanation if needed.
Right now I am perfectly in tune with "the customer wants a faster horse...". I'm working on a project whose original specs were for a "software horse" that can fly, scuba dive, carry 10 tons up a vertical wall, and easily fit in your pocket. ChatGPT is a very good programmer, but try to make it steer the customer towards something in the realm of possibilities (without giving Harry Potter a call).
That is out of the question for anything larger, though, where this steering is NOT A PROGRAMMER'S JOB. That is what product owners are for: isolating the customer from... Also, you likely use ChatGPT without proper planning and a cost matrix. AI is capable, within limits, of common sense; just not in a chat-bot implementation. If you drop ChatGPT and go to GPT (i.e. use the API) and put a proper cognitive architecture there, things turn out a lot different regarding common sense. I have one here that ASKS ME QUESTIONS. Clever prompting, so as not to have a slave.
@@ThomasTomiczek Programmers are not code monkeys. Helping steer is absolutely the job of a programmer, especially at a more senior level. Product would love to have features that are neither feasible nor maintainable; those who develop the systems should be high-level advisers on the direction.
Coding is us simply telling a machine what to do in a way that we have the most control. Yes, AIs can give us some good code, and sometimes not so good code. Maybe down the road it will always be good. Either way, we still have to tell it what we want. Personally I’d rather do that in code vs. chat/text.
Yeah, code is much more logical than any human-spoken language, so why communicate with computer code, i.e. AI, in English when you could communicate in the language it's actually built on: code?
1) Can ChatGPT create a compiler and an operating system? 2) Can ChatGPT create a computer language? 3) Can ChatGPT create its own Python modules, packages and libraries?
@@gastonangelini8352 Why would that be a joke? It was the same with technology before: some technology replaced old-school jobs, but it created a million new ones. For example, AI already created prompt engineering, and people are already making money from it!
I think ChatGPT is too high-level and leaves space for a lot of ambiguity. It will obscure a lot of what the machine is doing and how it will perform. So a lot of computational knowledge will still be needed, even more than today, if you want to get something right-sized and performant, without surprises in terms of cost, and of course something that can be maintained in the future. I don't see big companies leaving all their profit and reputation to a bunch of guys without computational knowledge playing with a prompt. Also, I don't think the prompt will be the future of programming. There will be something in the middle that can deal with ambiguity and that is more precise. I believe programming languages will be needed less and less, but computational knowledge more and more.
As a 66-year-old application developer, I hope we at least have 5 to 10 years before it is all over. If I don't outlive becoming obsolete, there is no problem. Sorry to be morbid, but AI doesn't care anyway. What am I doing after hearing the narrative about AI replacing developers? I went into phase 2 of my AI learning. After 4 years of Python, I started PyTorch and deep learning: understanding tensors and the basics of neural networks and model training/testing. Phase 3 will be returning to my Python FastAPI NFL Colts app, where I built 15 graphs, but leveling up with some predictability or projection on a player or team stat. Then a stat-of-the-week feature to give my graphs/bar charts a little more spice. That I deploy to Heroku/Salesforce cloud as an Angular web UI.
Programming is just translating solutions to problems into a language a computer can understand. It's the problem solving that is the difficult part; once ChatGPT can do that, then programmers are in trouble. I haven't seen any sign of that yet.
> problem solving that is the difficult part It depends on what kind of problem. A bug in the C library? Sure, that's difficult. An e-commerce website? I doubt AI would still have difficulty coding one by itself in the next 10 years. The latter is still a job for thousands of people currently by the way. Thing is, problem solving isn't as special as we think it is. It's really just a bunch of neural networks. The only strength we have as humans now are our "Eureka" moments and how we aren't as prone to catastrophic forgetting as DL networks are. In the future, I think knowledge work will only be reserved to the most intelligent humans (i.e. those who are considered geniuses). It's a dismal prediction but there's really no other way to see it.
@@unhash631 Yeah, but it would have to understand the domain. Dynamics and Salesforce have tried to be completely generic, but you still need armies of developers to integrate them if you want any business value. There's also the problem that it has to be able to modify existing code and be an interpreter for humans giving requirements. Maybe in 10 years? Sure. It's not there right now, which was my original point. Problem solving is actually special; I've seen many people who do it wrong.
There are myriad things that Chris and Lex do not address... an LLM spits out 'the most likely next word'. Do you want that writing your code? What are the consequences? Code needs to be consistent, complete, and coherent, and it needs to take into account the functional and non-functional needs of all of its stakeholders. Neither can be done right now if you want to replace developers. Example: there are 7 million lines of code spread over 150+ systems/chips in a wafer-stepper machine. Every hour of it *not* being operational costs ~300k. After an issue: - Do you want to be the 'prompt engineer' who convinces an LLM to spit out the 7 million lines of code without breaking the machine? - Or do you want to rely on a tested code base and engineers?
Anyone who thinks ChatGPT can write industrial-strength software systems and can replace software engineers is either a bedroom programmer or does not have any knowledge at all about software development.
AI absolutely will replace software engineers. It's just a matter of how long until then. Anyone saying it's not going to replace engineers, along with a lot of other people, seems to be in denial.
I agree. Not all programmers, but I think many of them are just afraid to lose their jobs, so they spread this news to make others think AI won't replace them. Even in 4 years we could have a powerful ChatGPT. Technology is getting more and more advanced every month.
I look at software tools as a series of increasing levels of abstraction and decreasing levels of skill necessary to use them. We started out with machine code; it might even have been wires on a plug board, the 1s and 0s entered into specific memory addresses. Then IBM Hollerith punch cards or paper tape to load code. Then assemblers came along to create machine code: much faster and easier. Then higher-level languages came along and were even easier. The skill needed became less and less at the higher levels of abstraction. There are people who can program in Python who can't write or read machine code. Of course, object-oriented programming is all the rage; no need to know how to create all those fancy data structures. ChatGPT will just be a higher level of abstraction that requires less detailed knowledge to use. Do you know why there is a language of mathematics? Because English is ambiguous and imprecise. Some level of computer language will be around for a long while.
I use it from time to time for code generation on certain specific problems I'm having issues with. At the end of the day, the users using ChatGPT still need to understand the code.
Yes, until the model understands what the full project code is doing, and until they figure out how to properly train their weird recursive transformer that allows an unlimited amount of tokens, we're fine. (Programmers are doomed).
@@pladselsker8340 Programmers create these projects. A real programmer is an engineer. Writing code is just a mechanical thing; figuring out how to achieve the goal is what a software engineer does. Before software engineers are doomed, every other profession will stop existing, except probably mathematicians.
Why would that be? Do you understand how your microwave works? Everyone I have spoken with is like: oh yeah, AI is very powerful, it will take some jobs, but not MINE, because X, Y, and Z. Does this smell of something? It has to replace SOMEONE 🙂 Stop burying your heads in the sand and consider the future.
Why? I've given specs to hundreds of offshore developers, and they have close to zero understanding of what the overall system is intended to accomplish.
I don't know how Lex could overlook asking Christian about "The Shot". Crazy that he hit one of the most iconic shots in basketball history and went on to become one of the world's most prolific programmers. Big things from a guy who used to do keg stands at Duke.
I thought your comment was interesting so I did some research. I think you have Chris Lattner confused with Christian Laettner. Christian Laettner played college ball for Duke then went on to the NBA. Chris Lattner attended University of Portland.
1. You have to explain to ChatGPT exactly what you want your code to do. 2. You have to check its code. Within that time, you could have already coded your function/program yourself.
This is true for senior developers who know their stuff. But mid-level and junior developers who struggle can get results close to seniors'. Suddenly, we get so many more developers... I think there are many gray areas in the future.
Nah, are you just that bad at explaining? It can spit out hundreds of lines of code in under a minute. It's not so bad that it wouldn't even be useful as a framework. It often corrects its own problems if you compile its output and feed the errors or misbehaviour back to it.
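To illustrate that compile-and-feed-back idea, here's a rough Python sketch of the loop. Note that `fix_with_model` is a made-up stand-in for a real LLM call; in practice you would send the captured error text back to the model and ask for a corrected version:

```python
import subprocess
import sys
import tempfile

def try_run(code: str):
    """Write the generated code to a temp file, run it, and capture any error."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True, text=True)
    return result.returncode == 0, result.stderr

def fix_with_model(code: str, error: str) -> str:
    """Hypothetical stand-in for an LLM call that repairs code given an error.
    Here it just fixes a known typo so the sketch is runnable."""
    return code.replace("pritn", "print")

# Pretend the model's first attempt had a typo.
code = 'pritn("hello")'
ok = False
for _ in range(3):  # bounded retry loop: run, capture error, feed it back
    ok, err = try_run(code)
    if ok:
        break
    code = fix_with_model(code, err)
```

The point is just the shape of the loop: run, capture the error, hand the error back, and retry a bounded number of times.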
Won't happen, because writing new code is only 10% of the job. The other 90% is changing *existing* code and communicating with the customer or other teams. I don't see how AI could do that.
No, it CERTAINLY won't replace programmers. ChatGPT can, given a description, return a small snippet of code, but I think overall what it generates is trivial. There is still the matter of understanding the problem abstractly, knowing what the constraints are and what we are trying to solve, and architecting a solution. That is, the abstract problem-solving component of writing code is separate from the specific implementation in a specific language and environment. And LLMs like ChatGPT understand NOTHING. It isn't an expert system; it's a neural net that mimics language, so it can't reason about anything. What I predict will happen vis-à-vis ChatGPT/LLMs and programmers is like what happened when CAD/CAM and networked workstations became mature and viable for mechanical engineers. The tools are so good, and fill in the skill gap so well, that most of the demand will be for code monkeys (just like CAD monkeys), and salaries for these positions will drop. Most positions will be of this type. Gone will be the days of "rockstar" hipsters who are just good at quickly memorizing the fashionable frameworks and languages du jour. What I mean is that ChatGPT and similar assistants will flatten the field. There won't be these "rockstar" positions (most of them are unwarranted in the first place), but productivity will open up in new dimensions, and the sheer NUMBER of available jobs will grow. You won't have the elitism and barrier to entry. I think it will be nice. And a lot more cool code will end up being written that solves real problems, which is out of reach right now.
What the hell did I just hear? An LLM inside a compiler? That's not how a compiler works. There is always a fraction of uncertainty in the answers of an LLM, but a compiler is never uncertain (it must not be). They are different worlds. ChatGPT cannot replace programmers, but programmers who use it efficiently can replace other programmers.
@Jake-oq2bq Chris never said that; Lex was asking if it was possible, which is what I commented on. Anyway, don't just go by people telling you this is possible or not; they all have their vested interests in jumping on the AI hype.
Anyone who's actually used ChatGPT and says that it won't replace programmers is in such denial. Just because it's not always right doesn't mean it's not effective. It's more right than humans are. The only things ChatGPT is bad at are high-level mathematics and physical labor.
But developers are saying ChatGPT cannot write code; it's just pulling code, most of which is unoptimized or broken. It can't replace developers if it writes like that at the moment.
Code shouldn't be innovative in most cases. Do we build new types of engines and tools every day? No, they are mass-produced. What is unique is the business and the goals it has; it's your job to build something that reflects and helps those goals. The actual code to do this doesn't have to be unique or innovative.
The only people I hear talking about this are students and academics. The reality is that most companies have a proprietary codebase that they will not feed into an outside owned AI platform. It's a security risk. Everyone talks about this from a possibility standpoint, not a practicality standpoint. If AI replaces software engineers it won't be for a long, long time. Not until the companies can buy bespoke AI implementations and own them outright.
Ok, kiddo. I'm sure you know better than someone who deals with this every day and makes these decisions for a living, even though you have zero experience with this type of data. 5 years? At work we have legacy computer systems that are over 20 years old and cannot be updated due to security and data-integrity issues. You truly have no idea what you're talking about and are just imagining some childish utopia, because you're young, naive, and ignorant, with absolutely no clue how the world really works. @@jojoma2248
Honestly, at a certain level... probably. But there is a level of specificity in reasoning, expertise, and stochastic thought/problem solving that (at least with current LLMs) can never really be matched until computer scientists actually build TRUE AI. Until then, I think the experts have it covered.
OpenAI o1 changes this entire discussion. We are within 3 years of a true AGI that will have the ability to replace a large percentage of programmers in the world.
Did the robot "Data" replace workers in Star Trek the series? No. Computers are just tools used to solve complex problems. I don't think AI will be any different. It's just a tool.
I have a new kid and was thinking of skills to teach him. I planned to learn a little Python so, when he gets old enough, I could teach him some coding. Now... not sure if it is worth it. Will programming survive another 18 years as a job? Probably. But right now, just barely being able to code gets you a good job. I have lots of friends who tried it; they all got jobs after a short coding academy. It might not be that kind of job in a few years.
It's already replaced coders. Just look how hard it is to get a coding job nowadays. It really never made much sense in the first place to have so many people working in a company's IT department anyways. What, it takes 100 people to code a website? No it doesn't... The future is in customer facing or physical roles I suppose. It doesn't matter how smart you think you are, you're competing against a literal digital superintelligence
Seriously. ChatGPT is great at solving tiny, isolated, pre-existing and well-understood problems. It's basically a searchable textbook, except at least a textbook gets checked for correctness. Actual valuable or interesting programming does not fit into this model.
@@jendabekCZ no they aren't. They're usually working on larger projects solving highly contextualized problems that don't translate well into a single text prompt. This is what people who studied code in school but never worked in industry would think. For most bugs, I can't explain in a single sentence what the bug is, when it's happening, what I've ruled out, and what could be causing it. The input to ChatGPT for a single bug fix would basically have to be the entire codebase, a description of the hardware, the desired behavior, and some accompanying pictures + explanations. Then when it inevitably fails to find the bug in all of that, I have to find a way to explain what happened when I tried what it suggested and the fix didn't work. It's not well suited to the task being described.
For Programmers, ChatGPT is an excellent tool but a poor master. A Swiss army knife for regular expressions. While very useful for pair programming, it takes a skilled developer to distinguish the wheat from the chaff.
Respectfully to all these coders in the comment section saying "It won't replace programmers": well, it is meant to replace all these jobs in the first place. These LLMs were not created as a hobby or to see where it may go; they were created with the intent to exceed what humans can do intellectually. So it's just a matter of time until these LLMs reach the level of error-free proficiency needed to be safely deployed at commercial scale. And the way things are going, it'll hardly take another decade.
0:46 The uniqueness is that your human experience shapes your thinking. You can't compare the ability to think and the ability to create/manufacture to a calculator that can simulate all possible scenarios. ChatGPT didn't think of itself or build itself, and it can't kill anyone without the help of humans. Wouldn't it be a hilarious cosmic joke if AI were the source of consciousness and it created us to experience the world with all 5 senses.
What I think AI will do is what the tractor did to farmers. I see a future whereby programmers will not worry about coding but about how to solve difficult problems. We will assume the full duty of a developer, which is to solve problems, and focus less on writing code.
@@hamm8934 why? I mean, most high level languages pride themselves as being easy to learn and debug and update. As LLMs become better, why shouldn't low level languages be used to speed everything up?
It's only as smart as the person using it. You will still need basic programming knowledge, and maybe it can help you with a little heavy lifting, like form validation or drop-down menus.
This is definitely going to replace entry level programmers. I only know basic programming concepts and was able to build a fully functional web scraper that stores data in a database and uses JavaScript and PHP to display it on the front end. There'd be no way I could accomplish this without it. It does, however, often give you incorrect code, and as your project grows it begins to lose context of all the moving parts. On more complex projects you would definitely need a high level programmer to know what questions to ask, and where that code fits into the entire structure.
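The kind of scraper-plus-database project described above can be surprisingly small. Here is a minimal sketch using only the Python standard library; the page markup, the `items` table, and the idea of scraping `<h2>` titles are all made-up illustrations, not the commenter's actual project, and a real scraper would fetch `html` over the network instead of using a literal string:

```python
import sqlite3
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collects the text of every <h2> element on a page."""
    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.titles = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2 and data.strip():
            self.titles.append(data.strip())

def scrape_to_db(html, conn):
    """Parse the page and persist the scraped titles; returns the count."""
    parser = TitleScraper()
    parser.feed(html)
    conn.execute("CREATE TABLE IF NOT EXISTS items (title TEXT)")
    conn.executemany("INSERT INTO items VALUES (?)",
                     [(t,) for t in parser.titles])
    conn.commit()
    return len(parser.titles)

# A literal string stands in for a downloaded page.
page = "<html><body><h2>First post</h2><h2>Second post</h2></body></html>"
db = sqlite3.connect(":memory:")
count = scrape_to_db(page, db)  # → 2 rows stored
```

The point the comment makes still holds: gluing pieces like this together is exactly where an LLM helps a beginner, and exactly where losing context across many such pieces starts to hurt.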
I'd be much more concerned if GPT's code wasn't complete garbage. There is also no way for it to plan and architect large scale projects and to understand the problems customers are facing
So, you tell me that you do not know how to write a self-correcting loop using automatic testing that feeds the results back into the AI? Ah. Also, you are not concerned because the CURRENT version cannot program perfectly, while the version two years ago could not talk properly or use tools - and from that you conclude that in 5 years it STILL won't be on your level? Hint: realize what the future may hold. THERE WILL BE MAGIC. Maybe not - but you are confident development stops HERE.
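The "self-correcting loop" this comment alludes to is easy to sketch in outline. Note that `ask_model` below is a stub standing in for a real LLM API call (no actual model is involved), and the `add` function and its test are invented for illustration:

```python
def run_tests(code):
    """Execute candidate code and check it; return (passed, failure_message)."""
    env = {}
    try:
        exec(code, env)
        assert env["add"](2, 3) == 5, "add(2, 3) should be 5"
        return True, ""
    except Exception as exc:
        return False, str(exc)

def ask_model(prompt, feedback):
    """Stub for an LLM call. A real loop would send the prompt plus
    the failure feedback to a model API and return its code."""
    if "should be 5" in feedback:
        return "def add(a, b):\n    return a + b"
    return "def add(a, b):\n    return a - b"  # first, buggy attempt

def self_correct(prompt, max_rounds=3):
    """Generate code, run the tests, feed failures back until green."""
    feedback = ""
    for _ in range(max_rounds):
        code = ask_model(prompt, feedback)
        passed, feedback = run_tests(code)
        if passed:
            return code
    return None

fixed = self_correct("write add(a, b)")
```

Whether such a loop converges on real-world bugs, rather than toy assertions like this one, is exactly what the thread is arguing about.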
90 percent of people on the internet I see say NO, which I agree with. ChatGPT makes lots of mistakes and can't know all the requirements of a complex system. But do you really think in 3 to 5 to 10 years, when AI is exponentially smarter, and I mean thousands of times smarter (which is inevitable in my opinion), it won't be able to code faster and better than every human on earth combined??? And on top of that be able to manage and create requirements by itself??? 😆😆 ChatGPT is literally just the very very very very very tip of the iceberg when it comes to artificial intelligence. Coding will be dead within the next decade.
Well, if they do, we'll all be jobless, or everyone will be able to build the same systems as big companies. Are we all going to be CEOs? Or are we all going to have to learn a trade? My point is, until then, no one will have to work anymore (except trade jobs/nursing/doctors etc.)
@dan that is a problem that humanity is already running into. Many people around the world have their job being done better by a machine for decades. That creates unemployment around the globe. Programmers are just another one of these professions. What to do when fewer and fewer tasks are needed to be done by a human? That is what we, as a society, will need to figure out in the future
AI only writes "old" code, meaning it only knows code that has been written by a human. So yes, if all you want is an Instagram clone or a Facebook clone, AI will indeed take some jobs in that regard, but if you are trying to innovate and create something new, then no, not a chance. It will, however, speed things up a little.
For now, the tool is far from replacing programmers. It is useful as long as the user knows what to ask and writes a very well-crafted prompt. The biggest risk is a sequence of wrong or confusing guidance for someone who is still building knowledge in a specific area or topic.
You must be very bad to be replaced by a template generator lol. A software developer MUST understand business intricacies, deal with different people, ask them questions, understand what they need, negotiate, deliver, integrate with already existing software, etc. NONE of this is available on the open internet; large corporations keep their own information private and well safe from outsiders. That's why AI can't replace devs these days.
@@sh4ny1 still full of hallucinations, albeit less than GPT-3.5
Exactly. At the end of the day ChatGPT is just a tool. It can't do the thinking for you.
you told chatgpt to write you this comment didn't you
This is the correct response. There's a gigantic and painful backlog of software projects in most companies and this won't change - rich people will always be competing for market share via software projects. You'll also need developers to write the code. Software projects are rarely some 1 paragraph generic scripts easily located on stack overflow or elsewhere online. They're often tens or hundreds of thousands of lines of code, specifically tailored to a unique use case. You can't just prompt your way into that. This whole thing is stack overflow via prompt, and nothing more
That was always my opinion. Demand for software will always be practically infinite; you can always make more stuff, new things. There is no end. You will just be able to do much more in the same time.
Yup we still have Jira backlog from like 6+ years ago lol
This. The number of developers you employ is a costly value to change, and it doesn't change anything when you consider that chatGPT doesn't do all the work a programmer does.
The value that will probably change is code deadlines. If a team can write faster with ChatGPT people will just want updates and features sooner. Not randomly fire 10% of their staff and keep going at the same rate.
@@Luke-xp5pe A wise nerd once pointed out that measuring software by lines of code is like measuring aircraft by weight.
Fire all your programmers and hire prompt engineers to manage your codebase. Let me know how it goes.
Try it! Develop a 3D AAA engine like Unreal or Frostbite without programmers.
@@4dillusions😂 exactly
@@4dillusions bomb😂
One programmer & 5 prompt engineers.
Why not use your programmers as prompt engineers, so if there is any error, they can take care of it?
And you can certainly reduce the amount of working people required in any particular company.
As a programmer who has spent hours fighting with ChatGPT to get working code for a new problem (and failing)... of course it won't replace programmers. Future developments could change that, but even then it will take years of "co-piloting" with human coders before it could possibly be trusted. To be clear, I'd use this in an IDE co-pilot role at the drop of a hat - but that's productivity increase, not replacement.
It's bound to happen though? That's why I decided not to code, fuck ChatGPT lol
@@JohnStockton7459 that's what I heard too
But that's merely anecdotal. With training and greater application, one programmer can become much, much more productive. ChatGPT wouldn't take the jobs of ALL programmers, just the least productive. It's like going from shovels to excavators in digging a hole. 100 people (programmers) can dig the hole (write the program), but 10 programmers using AI can dig the hole in the same time frame. Suddenly you've lost 90 jobs.
@@JohnStockton7459 nope! There are lots of problems GPT can't solve, for example, anything that someone hasn't put on the internet yet!
@@Notabot437 it's not "bound to happen" any time soon, and the capabilities of chatgpt are only limited to doing things that aren't that interesting to begin with.
The only people who think it will replace programmers tend not to be doing that much programming to begin with.
I really admire how down-to-earth Lex is. Despite being really smart and talented, he never comes across as arrogant. It's refreshing to see someone like him who interacts with others in such a humble way. I hope to learn from his example and be more like that in my own personal interactions.
The thing is, he is not that smart.
@@Adamskyization Smarter than me and lot others I know. And much nicer too!
👆if you’re wondering what said arrogance sounds like, and how to avoid it
@@Adamskyization He is smarter than 90% of people on Earth, or 85% of US residents for sure. The fact that Lex can push provocative conversations without losing control of his emotions shows he has a good brain, which most of us do not have.
Idk he lacks stamina
Newer (just a year+) self-taught programmer here, and I'm getting to the point of building real, robust projects and ideas. ChatGPT has been amazing when I realize, for example, that I need to loop over a particular data type or something simple I don't know how to do in that moment. But if I ever ask it for something a bit more complicated, I always end up fighting erroneous responses and having to over-explain my ask. This could definitely be improved in future GPT iterations, but this idea of truly understanding what someone wants and needs seems super hard to reproduce with an LLM.
As of now, LLMs that write code are like low-skill interns doing tiny code generation that needs to be supervised by actual developers. They need to be guided along and need a lot of help to be integrated into an actual project. It is very impressive, don't get me wrong, but it's nowhere near human replacement, and I don't see that changing drastically anytime soon.
Programming, unlike other activities, needs a lot of contextual understanding. It is on the opposite side of the spectrum from highly specialized activities like digital illustration. We saw the latter being perfected already. I'm going to say that AI will be capable of doing the former sort of activities LAST and the latter first. Especially things like game dev require so many unrelated skills, like music, level design, etc. If one AI can do all of that, then I suspect it could do everything else in the world, at which point we have AGI and the singularity. I wouldn't be too worried about programmers... I'm more worried for the world as a whole.
“They’ll tell you they need a faster horse when they need a car”
Fucking loved that
Original Henry Ford quote : )
I mean, it can go both ways. In this analogy we might be the horses and ChatGPT the car.
I think future AI models will probably replace some parts of what we do quicker than we think - but I also think anyone with a mind capable of manufacturing complex software will probably find a way of building something interesting and novel with the new tools AI creates, or of extending our capabilities to simply make more fantastical software. I can't imagine it writing any and all complex software projects that we could produce, especially when we consider it as a tool to extend our capabilities.
Just some thoughts:
Even if it were capable of responding to "Generate me a Cloud based Web Video Viewing application" - You might want things like - "Oh but with support for webm videos" - "and support video comments" - "but make the comments filter out profane language for users under X age" - or "with a REST API and documentation for comments and stats"
So programming could definitely become simplified into product / technical descriptions some day.
Where rather than a repository of code - you could have a repository of a product description with caveats and nuances in human readable and understandable language (perhaps plain English descriptions).
Humans love pushing the limits - so we'll probably use those programmers to push the limits of how complex of prompts we can generate, and generally solve novel problems in the realm of "what do we want exactly". At least until AI can predict what we want and produce outputs better than we could even think to ask for. Wouldn't be surprised if humans and AI are in a reinforcing feedback cycle of humans training AI with new & improved input - and AI providing new and improved tools for humans to produce new and improved outputs for.
Wouldn't be surprised if many of us move towards (many many years from now) mostly working by training and improving AI/LLMs with high quality inputs, and providing feedback / improvement in the long run - or providing the right prompts/inputs for the desired output. idk - impossible to really predict but it will be interesting at least to see where things go
The question is whether the prompts required to gain useful answers from AI will change with every new iteration of the AI or whether we can start to build a pattern of understanding of how best to communicate with AI as a whole.
Reflect on the unique value humans bring to programming at [0:11].
Consider how large language models (LLMs) are changing programming practices at [0:18].
Recognize the potential for LLMs to automate routine coding tasks at [2:32].
Explore the role of LLMs as companions in the coding process at [2:38].
Contemplate the interplay between human creativity and LLM-generated code at [5:28].
Jesus Christ, finally someone who's not following a narrative built on irrationality! Nobody can predict the future, but if you program complex systems, you know very well where the limits are. Maybe AGI will arrive, and by then we're all in the same boat, and the issue will always be if we're not ALL in the same boat!
I know, I get so sick of this constant assumption that ChatGPT is going to get so much better even though its fundamental operation really doesn't do what programmers do.
I'm also not buying Lex's angle that the statistical average of all language on the internet is somehow a deeper insight into the nature of reality and intelligence.
Point is - you do not need AGI. The idea that AGI is needed to change anything is stupid. See, larger projects have people writing specs, there are coding standards, there are testers. An AI that can work within that can make most developers redundant. Hint here is - that is NOT AGI. A good programming AI may not be a doctor or lawyer, and the definition of AGI is that it does ALL the human stuff in one model. Nope, it is specialized. It is just 2-3 generations further than what we have now.
@@ThomasTomiczek even what you're saying isn't enough.
People generating "specs" to feed into an AI is not enough, because specs are written in an ambiguous language known as "English", whereas code is written in a logically unambiguous language known as "code".
Tests don't really solve the problem either, because your code can pass all tests while still failing in production, and writing tests is often not possible without a pre-existing API or knowledge of how said API is implemented.
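The point about code passing all tests while still failing in production is easy to demonstrate: a unit test that stubs out an external dependency stays green even when the real service behaves differently. The `fetch_user` service and its field names below are made up for illustration:

```python
def get_username(fetch_user, user_id):
    """Return the display name for a user, given a fetcher function."""
    record = fetch_user(user_id)
    return record["name"]

# The unit test stubs the service and passes...
def fake_fetch(user_id):
    return {"name": "alice"}

assert get_username(fake_fetch, 1) == "alice"

# ...but if the real service actually returns a "username" field,
# the very same code raises KeyError in production.
def real_fetch(user_id):
    return {"username": "alice"}

try:
    get_username(real_fetch, 1)
    production_ok = True
except KeyError:
    production_ok = False  # the green test suite never caught this
```

A test suite only constrains behavior it exercises, which is why "the AI just has to pass the tests" understates the problem.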
@@zacharychristy8928 Humans are not intelligent... We can reason about things, figure out ways of solving problems, and build on other people's hard work.
But what is intelligence? Can you focus really hard and invent something new? For sure not.
You can only build on other people's experience in any given field.
Intelligence is an illusion.
Don't fool yourself.
Let's take a moment to appreciate the variable names in the thumbnail code.
ChatGPT will never replace software developers, because ChatGPT first has to figure out what the client wants, and the client doesn't even know what they want.
I guess the idea is that in future ChatGPT will interview clients to help them figure out what they want. And then implement it.
See, first - ChatGPT is a chat program. A proper cognitive loop using the API can have WAY better capabilities. Second, you work on super small stuff, right? Because any project I did in the last decade had PRODUCT OWNERS handling this part. Then user stories get evaluated (which ChatGPT actually is not that bad at). Planning poker. It is not there yet (even with a cognitive loop), but saying "never" marks you super high on "ignorant idiot". 3 years ago it could not talk complex ethics - now it can. What is "never" in your universe? 2 years? What in 10?
@@ThomasTomiczek It rather sounds like it is you who is talking bullshit. Cognitive loop? These things are currently tripping up on their own randomly generated bs, with devs desperately trying to mask it by gluing not some very intelligent AI but rather clunky old traditional hardcoded logic workarounds on top of it to hide the embarrassment. Also slow and memory-inefficient as hell compared to handcrafted algorithms, say, for parsing or compiling good ole machine languages.
@@ThomasTomiczek no, because there is no "cognitive loop" that's a term you're inventing that doesn't exist.
ChatGPT is simply a "semantic loop" which is not the same thing. You don't get deeper insights into what would help a person solve a problem by simply being able to generate a response to the last thing they said.
You're imagining a richer operation is taking place than what actually is. There is no insight to be gained from simply generating responses. You're implying there is some context-aware cognitive model underneath because you've been utterly fooled by a Chinese Room.
I imagine in the future, the client will ask the AI for an app and it will get pumped out in seconds. If the user doesn’t like it and requests any changes, the AI will attempt again; repeat until satisfied client.
For me it generates incorrect code most of the time.
In my experience the jump from GPT 3 to 4 is huge for programming in python specifically. I've found it helpful to first communicate back and forth with GPT to design an algorithm, then once satisfied with the logic to ask just one time to produce the code.
How is jump to GPT4-turbo?
@@VoloBonja Nice to have longer input but no big difference otherwise
@@lukemelo failed attempt of GPT5, so they called it turbo…
What a kind and lovely person Chris Lattner is.
Thank you for Swift
I am a novice in programming, but it has proven to be extremely useful in my work and I am still learning. I can imagine that many advanced programmers might be able to accomplish more sophisticated tasks, but for a beginner like me, GPT-4 has been an incredible booster. I believe I have at least doubled the amount of useful code I can produce. Additionally, it has taught me how to learn about programming. My fear was that using GPT-4 would make me complacent or reduce my drive to learn more, but in reality, based on my experience, it has accelerated my learning and helped me make more progress. I think the impact will vary greatly depending on one's level of programming expertise, but this technology is bound to transform the field in one way or another.
I'm learning a lot from it.
The calculator didn't kill science - it just made it less error-prone in certain areas. LLMs for programmers will become just what calculators are for scientists. They do simple things quickly, allowing focus to shift to more complicated things.
Bruh, AI went from zero to 100 in a short period of time and ain't stopping, and ppl still saying nah, this is the best it can get 🤡
except that it didn't. I started my undergrad in 2006 and already we used to consider "coding" as a menial job or "robotic". AI has existed for years and gotten mainstream only now.
@@rockwithyou2006 You need to discount everything before AlphaGo (the 2017 version); all those 50-60 years were basically spent learning how to build the effective tool of actually working deep learning.
So far I've found it useful for websites, web servers, microcontrollers, datasheet reading, plc languages, video game engines, and so many other languages. It's an extremely useful tool.
How have you used it with PLCs and microcontrollers?
@@synthwave8548 Well, in the world of automation engineering there are two main methods for programming a robot for an industrial application: a PLC with an HMI to display data to the operator, or a microcontroller programmed in C, which is more in-depth than a PLC. If you want to set up a web server on a microcontroller, you need to include the JavaScript and HTML in the IDE along with additional Node software and files. Using ChatGPT for that is a great option because it will specify all the libraries to use in the IDE and write the JS and HTML code necessary to display the website correctly. I've also asked ChatGPT to write code for a PLC (it can do both structured text and ladder logic) that does simple digital and analog outputs to relays or transistor switches, counters, PWM, variable current supply to a pneumatic actuator or variable frequency drive, feedback from sensors requiring analog-to-digital converters, comparators, etc. Basically, specify your PLC, whether you want ST or LL format, and what industrial scenario you want to accomplish; it will get you started, continually refine itself as we know, and provide an in-depth explanation if needed.
Right now I am perfectly in tune with "the customer wants a faster horse...........".
I'm working on a project whose original specs were for a "software horse" that can fly, scuba dive, can carry 10 tons up a vertical wall, and can easily fit in your pocket.
ChatGPT is a very good programmer, but try to make it steer the customer towards something in the realm of possibilities (without giving Harry Potter a call).
That is out of the question for anything larger, though, where this steering is NOT A PROGRAMMER'S JOB. That is what Product Owners are for: isolating the customer from... Also, you likely use ChatGPT without proper planning and a cost matrix. AI is capable within the limits of common sense - just not in a chat bot implementation. If you drop ChatGPT and go to GPT (i.e. use the API) and put a proper cognitive architecture there, things turn out a lot different regarding common sense. I have one here that ASKS ME QUESTIONS. Clever prompting to not have a slave.
@@ThomasTomiczek Programmers are not code monkeys. Helping steer is absolutely the job of a programmer, especially at the more senior level. Product would love to have some features that are neither feasible nor maintainable - those who develop the systems should be high-level advisers on the direction.
Coding is us simply telling a machine what to do in a way that we have the most control. Yes, AIs can give us some good code, and sometimes not so good code. Maybe down the road it will always be good. Either way, we still have to tell it what we want. Personally I’d rather do that in code vs. chat/text.
Yeah, code is much more logical than any human-spoken language, so why communicate with computer code, i.e. AI, in English when you could communicate in the language it's actually built on: code.
1) Can ChatGPT create a compiler and an operating system?
2) Can ChatGPT create a computer language?
3) Can ChatGPT create its own Python modules, packages and libraries?
The answer is NO, a big NO. Because ChatGPT is shitty software that provides bad code 99% of the time.
@@MasterFrechmen732 This is all a big scam to lower programmers salaries lol
For me, LLMs work best at figuring out an error: not writing code, but explaining why the error happened and what I can do to fix it.
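That workflow mostly amounts to capturing the full traceback so it can be pasted into (or piped to) a model along with the source. A minimal sketch, with the actual model call omitted since API details vary, and `buggy_divide` invented as the example failure:

```python
import traceback

def capture_error(fn, *args):
    """Run fn and return the full traceback text on failure, else None."""
    try:
        fn(*args)
        return None
    except Exception:
        return traceback.format_exc()

def buggy_divide(a, b):
    return a / b

report = capture_error(buggy_divide, 1, 0)
# `report` now holds the full ZeroDivisionError traceback, ready to be
# sent to a model together with the source of buggy_divide and a
# question like "why does this error happen and how do I fix it?"
```

Sending the whole traceback rather than a paraphrase is what makes the "explain my error" use case work so well.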
I asked ChatGPT about this and it said No, and also that AI will create even more job opportunities.
Wake up and stop believing everything you read. It’s called Fact Checking. People lie, just like you
More job opportunities for other AIs for sure...
It's a joke 🤣
Or not?
@@gastonangelini8352 Why would that be a joke? It was the same with technology, some technology replaced old-school jobs, but it created million more new jobs.
For example, AI already created Prompt Engineering, people are already making money from it!
@@bobanmilisavljevic7857 You are mentally sick
I would say that too if I was an AI
I think ChatGPT is too high level and leaves space for a lot of ambiguity. It will obscure a lot of what the machine is doing and how it is performing. So a lot of computational knowledge will still be needed, even more than today, if you want something right-sized and performant, without surprises in terms of costs, and of course something that can be maintained in the future. I don't see big companies leaving all their profit and reputation to a bunch of guys without computational knowledge playing with a prompt. Also, I don't think the prompt will be the future of programming. There will be something in the middle that can deal with ambiguity and that is more precise. I believe programming languages will be needed less and less, but computational knowledge more and more.
Why would you want to put an LLM in a compiler?
It is absolutely true that we should not compete with LLMs but work in sync with them to achieve greater heights.
As a 66-year-old application developer, I hope we at least have 5 to 10 years before it is all over. If I don't outlive becoming obsolete, there is no problem. Sorry to be morbid, but AI doesn't care anyway. What am I doing after hearing the narrative about AI replacing developers? I went into phase 2 of my AI learning. After 4 years of Python, I started PyTorch and deep learning: understanding tensors and the basics of neural networks and model training/testing. Phase 3 will be returning to my Python FastAPI NFL Colts app, where I built 15 graphs, but leveling up with some predictive projection on a player or team stat, then adding a stat-of-the-week feature to give my graphs/bar charts a little more spice. That I deploy to Heroku/Salesforce cloud as an Angular web UI.
Programming is just translating solutions to problems into a language a computer can understand
It's the problem solving that is the difficult part; once ChatGPT can do that, then programmers are in trouble. I haven't seen any sign of that yet.
It's like thinking that the calculator replaced math teachers. People still have to learn basic arithmetic.
Best comments
> problem solving that is the difficult part
It depends on what kind of problem. A bug in the C library? Sure, that's difficult. An e-commerce website? I doubt AI would still have difficulty coding one by itself in the next 10 years. The latter is still a job for thousands of people currently by the way.
Thing is, problem solving isn't as special as we think it is. It's really just a bunch of neural networks. The only strength we have as humans now are our "Eureka" moments and how we aren't as prone to catastrophic forgetting as DL networks are. In the future, I think knowledge work will only be reserved to the most intelligent humans (i.e. those who are considered geniuses). It's a dismal prediction but there's really no other way to see it.
@@unhash631 Yeah, but it would have to understand the domain. Dynamics and Salesforce have tried to be completely generic, but you still need armies of developers to integrate them if you want any business value.
It's also the problem that it has to be able to modify the existing code and be an interpreter for humans giving requirements
Maybe in 10 years? Sure. It's not there right now, which was my original point.
Problem solving is actually special, I've seen many people who do it wrong
There are myriad things that Chris and Lex do not address... an LLM spits out "the most likely next word"... Do you want that writing your code? What are the consequences?
Code needs to be consistent, complete and coherent... and it needs to take into account the functional and non-functional needs of all of its stakeholders
Neither of which can be done right now if you want to replace developers.
Example: There are 7 million lines of code spread over 150+ systems/chips in a wafer-stepper machine. Every hour of it *not* being operational costs ~300k.
After an issue:
- Do you want to be the 'prompt engineer' that convinces an LLM to spit out the 7 million lines of code that doesn't break it?
- Do you want to rely on a tested code-base and engineers?
Anyone who thinks ChatGPT can write industrial-strength software systems and replace software engineers is either a bedroom programmer or has no knowledge at all about software development.
ChatGPT won't do it, but be damned sure an even more advanced tool in 2030-2040 will.
AI absolutely will replace software engineers. It's just a matter of how long until then. Anyone saying it's not going to replace engineers, along with a lot of other people, seems to be in denial.
I agree. Not all programmers, but I think many of them are just afraid of losing their jobs, and so they spread this news to make others think AI won't replace them.
Even in 4 years we could have a far more powerful ChatGPT. Technology is getting more advanced every month.
I look at software tools as a series of increasing levels of abstraction and decreasing levels of skill necessary to use them. You started out with machine code. It might even have been wires on a plug board, or 1s and 0s entered into specific memory addresses. Then IBM Hollerith punch cards or paper tape to load code. Then assemblers came along to create machine code. Much faster and easier. Then higher-level languages came along and were even easier. The skills needed to use higher-level languages became less and less at the higher levels of abstraction. There are people who can program in Python who can't write or read machine code. Of course, object-oriented programming is all the rage. No need to know how to create all those fancy data structures. ChatGPT will just be a higher level of abstraction that requires less detailed knowledge to use. Do you know why there is a language of mathematics? Because English is ambiguous and imprecise. Some level of computer language will be around for a long while.
I think ChatGPT is too high-level and leaves space for a lot of ambiguity. It will obscure a lot of what the machine is doing and how it performs. So a lot of computational knowledge will still be needed, even more than these days, if you want to get something right-sized and performant, without surprises in terms of costs, and of course something that can be maintained in the future. I don't see big companies leaving all their profit and reputation to a bunch of guys without computational knowledge playing with a prompt. Also, I don't think the prompt will be the future of programming. There will be something in the middle that can deal with ambiguity and that is more precise. I believe programming languages will be needed less and less, but computational knowledge more and more.
I use it from time to time for code generation on certain specific problems I'm having issues with. At the end of the day, the users of ChatGPT still need to understand the code.
It will greatly diminish the need for programmers but not eliminate them.
It will make programmers more efficient, you'll still have to always understand what the code is doing.
Yes. Until the model understands what the full project's code is doing, and until they figure out how to properly train their weird recursive transformer that allows an unlimited number of tokens, we're fine.
(Programmers are doomed).
@@pladselsker8340 Programmers create these projects. A real programmer is an engineer. Writing code is just a mechanical thing; figuring out how to achieve the goal is what a software engineer does. Before software engineers are doomed, other professions will stop existing, except probably mathematicians.
Why would that be. Do you understand how your microwave works?
Everyone I have spoken with is like: "Oh yeah, AI is very powerful, it will take some jobs, but not MINE. Because X, Y and Z."
Does this smell of something? It has to replace SOMEONE 🙂 Stop burying your heads in the sand and consider the future.
@@pladselsker8340 At that point say goodbye to a majority of technical jobs
Why? I've given specs to hundreds of offshore developers, and they have close to zero understanding of what the overall system is intended to accomplish.
I don't know how Lex could overlook asking Christian about "The Shot". Crazy that he hit one of the most iconic shots in basketball history and went on to become one of the world's most prolific programmers. Big things from a guy who used to do keg stands at Duke.
I thought your comment was interesting so I did some research. I think you have Chris Lattner confused with Christian Laettner. Christian Laettner played college ball for Duke then went on to the NBA. Chris Lattner attended University of Portland.
Wrong Chris bro lol
1. You have to explain to ChatGPT exactly what you want your code to do.
2. You have to check its code.
In that time, you could have already coded your function/program.
This is true for senior developers who know their stuff. But mid-level and junior developers who struggle can get results close to seniors'. Suddenly we get so many developers... I think there are many gray areas in the future.
Nah, are you that bad at explaining? It can spit out hundreds of lines of code in under a minute. It's not so bad that it wouldn't even be useful as a framework. It often corrects its own problems if you compile the code and feed the errors or misbehavior back to it.
They have been saying developers will be replaced by machines or WYSIWYG for literally decades. I'm not holding my breath.
Won't happen, because writing new code is only 10% of the job. The other 90% is changing *existing* code and communicating with the customer or other teams. I don't see how AI could do that.
No, it CERTAINLY won't replace programmers. ChatGPT can, given a description, return a small snippet of code, but I think overall what it generates is trivial. There is still the matter of understanding the problem abstractly, what the constraints are and what we are trying to solve, and architecting a solution. That is, the abstract problem-solving component of writing code is separate from the specific implementation in a specific language and environment. And LLMs like ChatGPT understand NOTHING. It isn't an expert system; it's a neural net that mimics language, so it can't reason about anything. What I predict will happen vis-à-vis ChatGPT/LLMs and programmers is like what happened when CAD/CAM and networked workstations became mature and viable for mechanical engineers. The tools are so good, and fill in the skill gap, that most of the demand will be for codemonkeys (just like CADmonkeys), and salaries for these positions will drop. Most positions will be of this type. Gone will be the days of "rockstar" hipsters who are just good at quickly memorizing fashionable frameworks and languages du jour. What I mean is that ChatGPT-style assistants will flatten the field. There won't be these "rockstar" positions (because most of them are unwarranted in the first place), but productivity will open up in new dimensions, and the sheer NUMBER of available jobs will grow. You won't have the elitism and barrier to entry. I think it will be nice. And a lot more cool code will end up being written that solves real problems, which is out of reach right now.
What the hell did I just hear? An LLM inside a compiler? That's not how compilers work. There is always a fraction of uncertainty in an LLM's answers, but a compiler is never uncertain (it must not be). They are different worlds. ChatGPT cannot replace programmers, but programmers who use it efficiently can replace other programmers.
@Jake-oq2bq Chris never said that; Lex was asking if it was possible, which is what I commented on. Anyway, don't just go by people telling you whether this is possible or not; they all have a vested interest in jumping on the AI hype.
ChatGPT is a glorified search engine; it pulls or regurgitates info that is fed into it.
It can answer novel questions with novel answers, so no
Delusional take.
Except it can come to conclusions for you, whereas search engines just give you information.
IMHO, it allows you to write code in blocks instead of lines. Not replacing anyone at the moment.
Short answer: yes
Long answer: not for another 5 years
Even for Linux commands, config files, troubleshooting, or installations, humans with Stack Overflow work much better.
Future releases of something like ChatGPT will definitely replace programmers.
You've probably never coded or built anything, or you're a complete noob.
In like 10 years or less, I think. A self-improving computer will keep getting better without limit.
Anyone who's actually used ChatGPT saying that it won't replace programmers is in such denial. Just because it's not always right doesn't mean it's not effective. It's more right than humans are. The only things ChatGPT is bad at are high-level mathematics and physical labor.
And it will keep improving. This is just the beginning. I don't know if all these people are aware of exponential progress.
But developers are saying ChatGPT cannot write code and is just pulling code, most of which is not optimized or is broken. It can't replace developers if it writes like that at the moment.
Code shouldn't be innovative in most cases. Do we build new types of engines and tools every day? No, they are mass-produced.
What is unique is the business and the goals it has; it's your job to build something that reflects and helps those goals. The actual code to do this doesn't have to be unique or innovative.
10 years from now we will talk
Yes
The only people I hear talking about this are students and academics. The reality is that most companies have a proprietary codebase that they will not feed into an outside owned AI platform. It's a security risk. Everyone talks about this from a possibility standpoint, not a practicality standpoint. If AI replaces software engineers it won't be for a long, long time. Not until the companies can buy bespoke AI implementations and own them outright.
I think you're wrong. The competitive advantage of using AI is too big. They'll make AI models that are trained but isolated, so there are no security risks.
@@jojoma2248 you obviously have never worked in a corporate job with Intellectual property and data integrity concerns.
@@Andrew-un8tx come back in 5 years, I think youll see im right
Ok, kiddo. I'm sure you know better than someone who deals with this every day and makes these decisions for a living, even though you have zero experience with this type of data. 5 years? At work we have legacy computer systems that are over 20 years old and cannot be updated due to security and data integrity issues. You truly have no idea what you're talking about and are just imagining some childish utopia, because you're young, naive, and ignorant, with absolutely no clue how the world really works. @@jojoma2248
AI CAD libraries should be a big thing
Honestly, at a certain level... probably. But there is a level of specificity in reasoning, expertise, and stochastic thought/problem solving that (at least with current LLMs) can never really be matched until computer scientists actually build TRUE AI. Until then, I think the experts have it covered.
Just have the CEO write all the software with ChatGPT. They better hope they actually know what they are doing.
ChatGPT is generating more vulnerable code than humans
OpenAI o1 changes this entire discussion. We are within 3 years of a true AGI that will have the ability to replace a large percentage of programmers in the world.
a large percentage of everything in the world
Did the robot "Data" replace workers in Star Trek? No. Computers are just tools used to solve complex problems. I don't think AI will be any different. It's just a tool.
Sure. It will. It will replace us all. There will be a task force and it will put a GPT in everybody's place.
If it looks like a programmer, codes like a programmer, programs like a programmer then it just may be a programmer.
I have a new kid and was thinking of skills to teach him. I planned to learn a little Python so, when he gets old enough, I could teach him some coding. Now... not sure if it is worth it. Will programming survive another 18 years as a job? Probably. Right now, just barely being able to code gets you a good job; I have lots of friends who tried it, and they all got jobs after a short coding academy. It might not be that kind of job in a few years.
ChatGPT is only as good as the prompt you give it, and the current iterations still make awful mistakes; it's really just regurgitating Stack Overflow.
I'm more scared of recession than gpt-4 xD
It's already replaced coders. Just look how hard it is to get a coding job nowadays. It really never made much sense in the first place to have so many people working in a company's IT department anyways. What, it takes 100 people to code a website? No it doesn't... The future is in customer facing or physical roles I suppose. It doesn't matter how smart you think you are, you're competing against a literal digital superintelligence
ChatGPT helped Lex realize he's just a casual when it comes to programming. It's ok Lex, you're a podcaster now, not a programmer 😂
Yeah, exactly. Sometimes it baffles my mind how someone who was apparently a professor on the subject knows so little about the tech itself.
Seriously. ChatGPT is great at solving tiny, isolated, pre-existing and well-understood problems. It's basically a searchable textbook, except at least a textbook gets checked for correctness.
Actual valuable or interesting programming does not fit into this model.
@@zacharychristy8928 On the other hand, most of coders / programmers are just "solving tiny, isolated, pre-existing and well-understood problems" :)
@@jendabekCZ no they aren't. They're usually working on larger projects solving highly contextualized problems that don't translate well into a single text prompt. This is what people who studied code in school but never worked in industry would think.
For most bugs, I can't explain in a single sentence what the bug is, when it's happening, what I've ruled out, and what could be causing it. The input to ChatGPT for a single bug fix would basically have to be the entire codebase, a description of the hardware, the desired behavior, and some accompanying pictures and explanations. Then, when it inevitably fails to find the bug in all of that, I have to find a way to explain what happened when I tried what it suggested and the fix didn't work.
It's not well suited to the task being described.
For Programmers, ChatGPT is an excellent tool but a poor master. A Swiss army knife for regular expressions. While very useful for pair programming, it takes a skilled developer to distinguish the wheat from the chaff.
Respectfully, to all these coders in the comment section saying "It won't replace programmers": well, it is meant to replace all these jobs in the first place. These LLMs were not created as a hobby or to see where it may go; they were created with the intent to excel beyond what humans can do intellectually.
So it's all a matter of time until these LLMs reach a level of error-free proficiency where they can actually be safely deployed at commercial scale.
And the way things are going, it'll hardly take a decade to do so.
Maybe they could replace politicians? Couldn’t be worse.
ChatGPT is overhyped.
ChatGPT can't even correctly do a medium-level code challenge lol.
When someone speaks about AI who actually profits from the AI hype...
Sexbots will replace programmers.
haha
0:46 The uniqueness is that your human experience shapes your thinking. You can't compare the ability to think and the ability to create/manufacture to a calculator that can simulate all possible scenarios. ChatGPT didn't think of itself or build itself, and it can't kill anyone without the help of humans. Wouldn't it be a hilarious cosmic joke if AI were the source of consciousness and created us to experience the world with all 5 senses?
1:14 think of how much time you’d have for dating!
It will, but not all. People still have to program the AI itself...
What I think AI will do is what the tractor did to farmers. I see a future whereby programmers will not worry about coding but about how to solve difficult problems.
We will assume the full duty of a developer, which is to solve problems, and focus less on writing code.
I _just_ started trying to learn Python...
Can ChatGPT or any other LLM rewrite today's games and programs in low-level languages like C++ or C so they are well optimized?
Nope
@@hamm8934 Why? I mean, most high-level languages pride themselves on being easy to learn, debug, and update. As LLMs become better, why shouldn't low-level languages be used to speed everything up?
It's only as smart as the person using it. You will still need basic programming knowledge, and maybe it can help with a little of the heavy lifting, like form validation or drop-down menus.
Agreed
This is definitely going to replace entry-level programmers. I only know basic programming concepts and was able to build a fully functional web scraper that stores data in a database and uses JavaScript and PHP to display it on the front end. There'd be no way I could accomplish this without it. It does, however, often give you incorrect code, and as your project grows it begins to lose context of all the moving parts. On more complex projects you would definitely need a high-level programmer to know what questions to ask and where that code fits into the entire structure.
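The scrape-and-store pattern described in that comment is simple enough to sketch with only the standard library. This is not the commenter's actual code; it is a minimal illustration with a hardcoded HTML page (a real scraper would fetch the page with `urllib.request` or a library like requests, and the front end would read from the table):

```python
# Minimal stdlib-only sketch of "scrape a page, store items in a database".
import sqlite3
from html.parser import HTMLParser

# Stand-in for a fetched page; real code would download this over HTTP.
PAGE = "<ul><li>alpha</li><li>beta</li></ul>"

class ItemParser(HTMLParser):
    """Collects the text content of every <li> element."""
    def __init__(self):
        super().__init__()
        self.in_item = False
        self.items = []
    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self.in_item = True
    def handle_endtag(self, tag):
        if tag == "li":
            self.in_item = False
    def handle_data(self, data):
        if self.in_item:
            self.items.append(data.strip())

parser = ItemParser()
parser.feed(PAGE)

# Store the scraped items in SQLite; a JS/PHP front end would query this table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)",
                 [(i,) for i in parser.items])
rows = [r[0] for r in conn.execute("SELECT name FROM items ORDER BY name")]
```

Even at this scale the commenter's caveat applies: the generated code only stays correct while the whole pipeline (parse, schema, queries) fits in one person's head or one prompt.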
The entry-level programmers of this generation are already far better than what ChatGPT can do.
Haha, maybe in 20 years. It couldn't even replace basic HTML/CSS web programmers, but hey, it's a good start.
I'd be much more concerned if GPT's code weren't complete garbage. There is also no way for it to plan and architect large-scale projects or to understand the problems customers are facing.
So, you're telling me you don't know how to write a self-correcting loop that uses automatic testing to feed the info back into the AI? Ah. Also, you are not concerned because the CURRENT version cannot program perfectly, while the version 2 years ago could not talk properly or use tools; and from that you conclude that in 5 years it STILL won't be on your level? Hint: realize what the future may hold. THERE WILL BE MAGIC. Maybe not, but you are confident development stops HERE.
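The "self-correcting loop" this comment invokes is a real pattern: generate code, run it, and feed any failure back into the next prompt. Here is a minimal sketch; `ask_model` is a stub standing in for a real LLM API call (it deliberately returns broken code first, then a fix), so the loop's mechanics can be shown without any external service:

```python
# Sketch of the generate -> run -> feed-errors-back loop.
import subprocess
import sys

BROKEN = "print(1 +)\n"     # syntactically invalid on purpose
FIXED = "print(1 + 1)\n"

def ask_model(prompt, previous_error=None):
    # Stub for an LLM call. A real implementation would append the
    # error text to the prompt; this one just "fixes" things on retry.
    return FIXED if previous_error else BROKEN

def self_correct(prompt, max_attempts=3):
    error = None
    for _ in range(max_attempts):
        code = ask_model(prompt, error)
        # Run the candidate code in a fresh interpreter and capture output.
        result = subprocess.run([sys.executable, "-c", code],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return code, result.stdout
        error = result.stderr  # feed the failure into the next attempt
    raise RuntimeError(f"no working code after {max_attempts} attempts")

code, output = self_correct("print the sum of 1 and 1")
```

Note the catch the surrounding thread points out: the loop only verifies what the tests check, so it converges to "passes the tests", not necessarily to "correct".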
90 percent of people on the internet say NO, which I agree with. ChatGPT makes lots of mistakes and can't know all the requirements of a complex system. But do you really think that in 3 to 5 to 10 years, when AI is exponentially smarter, and I mean thousands of times smarter (which is inevitable in my opinion), it won't be able to code faster and better than every human on earth combined??? And on top of that be able to manage and create requirements by itself??? 😆😆 ChatGPT is literally just the very tip of the iceberg when it comes to artificial intelligence. Coding will be dead within the next decade.
Well, if they do, we'll all be jobless. Or everyone will be able to build the same systems as big companies; are we all going to be CEOs? Or are we all going to have to learn a trade skill? My point is that, until then, no one will have to work anymore (except trade jobs/nursing/doctors etc.).
@dan that is a problem that humanity is already running into. Many people around the world have their job being done better by a machine for decades. That creates unemployment around the globe. Programmers are just another one of these professions. What to do when fewer and fewer tasks are needed to be done by a human? That is what we, as a society, will need to figure out in the future
Increasing programmer efficiency without expanding the amount of work will result in fewer jobs for humans. That's the crossroads we're at.
I'm a programmer and I don't use artificial intelligence.
📍3:46
"really well," as in it doesn't work.
No, AI will not replace coders or programmers.
After o1, we're cooked.
Poor Lex 🙁🙁🙁
AI only writes "old" code, meaning it only knows code that has already been written by humans. So yes, if all you want is an Instagram clone or a Facebook clone, AI will indeed take some jobs in that regard. But if you are trying to innovate and create something new, then no, not a chance. It will, however, speed things up a little.
For now, the tool is far from replacing programmers. It is interesting as long as the user knows what to ask and writes a very well-crafted prompt. The biggest risk may be a sequence of wrong or confusing guidance for someone who is building knowledge in a specific area or on a specific question.
totally agreed
AI is a fad, everyone will have forgotten about it in a month
Lmao
Code generation will be replaced with English coding, since LLMs are pretty good at NLP.
I don't want to be jobless due to AI
First Lex woah!
You must be very bad to be replaced by a template generator lol.
A software developer MUST understand business intricacies, deal with different people, ask them questions, understand what they need, negotiate, deliver, integrate with already existing software, etc.
NONE of this is available on the open internet; large corporations keep their own information private and well away from outsiders.
That's why AI can't replace devs these days.
What if large corporations train their own AI for their specific needs with their private data?
GPT-4, buddy. And they're adding the ability to upload files; once that happens, it's over.