Will ChatGPT replace programmers? | Chris Lattner and Lex Fridman

  • Published: 21 Nov 2024

Comments • 416

  • @LexClips
    @LexClips 1 year ago +11

    Full podcast episode: ruclips.net/video/pdJQ8iVTwj8/видео.html
    Lex Fridman podcast channel: ruclips.net/user/lexfridman
    Guest bio: Chris Lattner is a legendary software and hardware engineer, leading projects at Apple, Tesla, Google, SiFive, and Modular AI, including the development of Swift, LLVM, Clang, MLIR, CIRCT, TPUs, and Mojo.

  • @Mikegeb4545
    @Mikegeb4545 1 year ago +501

    GPT doesn't possess cognition. I have utilized it in my own programming projects, and at times, it proves to be a better tool than Stack Overflow. However, it occasionally generates incorrect code and apologizes when you point out the mistake. Instead of providing the correct version, it continues to offer additional incorrect suggestions, resulting in an infinite loop of erroneous recommendations. It amuses me when people claim that these issues will be resolved in future versions of GPT. In reality, such problems are inherent to large, mature language models and cannot be completely eliminated unless a revolutionary alternative emerges. Ultimately, when GPT fails, I find myself turning to Stack Overflow to seek human feedback. In simple terms, what GPT creates looks impressive only to the untrained eye and to mediocre programmers like me.

    • @AuxiliaryBeats
      @AuxiliaryBeats 1 year ago +12

      @@sh4ny1 still full of hallucinations, albeit less than GPT-3.5

    • @ingerasulffs
      @ingerasulffs 1 year ago +22

      There is no technical reason why these engineering problems cannot be resolved. Not sure we necessarily need a revolutionary alternative, but given the help we can get from these new tools in developing new solutions, I'm sure the progress in all areas, including programming, and specifically for a programming-focused solution (like a "programmingGPT + some_other_type_of_AI hybrid" thing, for example), will be quite astonishing, and continuous, going forward.

    • @no_bs_science
      @no_bs_science 1 year ago +23

      @@ingerasulffs lmao
      > There is no technical reason why these engineering problems cannot be resolved
      what happens when you use a lot of average code to train a NN? It will only produce average, buggy code as a result.
      You either don't understand what NNs and LLMs are, or I dunno

    • @arafay
      @arafay 1 year ago

      Exactly. At the end of the day ChatGPT is just a tool. It can't do the thinking for you.

    • @abujessica
      @abujessica 1 year ago +3

      you told chatgpt to write you this comment didn't you

  • @piotrjasielski
    @piotrjasielski 1 year ago +104

    Before watching: NO, it won't, and I'm honestly tired of these BS discussions.
    There is almost infinite demand for software and the only reason it's not being made is because it is incredibly expensive - corporate system implementations cost BILLIONS. Tools like GPT will drive the price down, one developer will be able to increase the output, companies will want more stuff, more frequent updates etc.

    • @Luke-xp5pe
      @Luke-xp5pe 1 year ago +31

      This is the correct response. There's a gigantic and painful backlog of software projects in most companies and this won't change - rich people will always be competing for market share via software projects. You'll also need developers to write the code. Software projects are rarely some 1 paragraph generic scripts easily located on stack overflow or elsewhere online. They're often tens or hundreds of thousands of lines of code, specifically tailored to a unique use case. You can't just prompt your way into that. This whole thing is stack overflow via prompt, and nothing more

    • @VeyroneR
      @VeyroneR 1 year ago +6

      That has always been my opinion. Demand for software will always be effectively infinite; you can always make more stuff, new things. There is no end. You will just be able to do much more in the same amount of time.

    • @jaredpetri7783
      @jaredpetri7783 1 year ago +5

      Yup we still have Jira backlog from like 6+ years ago lol

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +5

      This. The number of developers you employ is a costly value to change, and it doesn't change anything when you consider that ChatGPT doesn't do all the work a programmer does.
      The value that will probably change is code deadlines. If a team can write code faster with ChatGPT, people will just want updates and features sooner, not randomly fire 10% of their staff and keep going at the same rate.

    • @davidagnew8465
      @davidagnew8465 1 year ago

      @@Luke-xp5pe A wise nerd once pointed out that measuring software by lines of code is like measuring aircraft by weight.

  • @PetrovForever
    @PetrovForever 8 months ago +58

    Fire all your programmers and hire prompt engineers to manage your codebase. Let me know how it goes.

    • @4dillusions
      @4dillusions 5 months ago +8

      Try it! Develop a 3D AAA engine like Unreal or Frostbite without programmers.

    • @tanura5830
      @tanura5830 4 months ago +1

      ​@@4dillusions😂 exactly

    • @RangarHz
      @RangarHz 2 months ago +2

      ​@@4dillusions bomb😂

    • @prasanthsshorts4262
      @prasanthsshorts4262 1 month ago

      One programmer & 5 prompt engineers.

    • @GaneshPalraj1991
      @GaneshPalraj1991 1 month ago

      Why not use your programmers as prompt engineers, so if there is any error they can take care of it?
      And you can certainly reduce the number of working people required in any particular company.

  • @KenOtwell
    @KenOtwell 1 year ago +181

    As a programmer who has spent hours fighting with ChatGPT to get working code for a new problem (and failing)... of course it won't replace programmers. Future developments could change that, but even then it will take years of "co-piloting" with human coders before it could possibly be trusted. To be clear, I'd use this in an IDE co-pilot role at the drop of a hat - but that's productivity increase, not replacement.

    • @Notabot437
      @Notabot437 1 year ago +4

      It's bound to happen though? That's why I decided not to code, fuck ChatGPT lol

    • @Notabot437
      @Notabot437 1 year ago +1

      @@JohnStockton7459 that's what I heard too

    • @thomasweir2834
      @thomasweir2834 1 year ago +43

      But that’s merely anecdotal. With training and greater application one programmer can become much, much more productive. ChatGPT wouldn't take the jobs of ALL programmers, just the least productive. It’s like going from shovels to excavators for digging a hole. 100 people (programmers) can dig the hole (write the programme), but 10 programmers using AI can dig the hole in the same time frame. Suddenly you’ve lost 90 jobs.

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +8

      ​@@JohnStockton7459 nope! There are lots of problems GPT can't solve, for example, anything that someone hasn't put on the internet yet!

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +19

      ​@@Notabot437 it's not "bound to happen" any time soon, and the capabilities of chatgpt are only limited to doing things that aren't that interesting to begin with.
      The only people who think it will replace programmers tend not to be doing that much programming to begin with.

  • @jonnyschaff7068
    @jonnyschaff7068 1 year ago +136

    I really admire how down-to-earth Lex is. Despite being really smart and talented, he never comes across as arrogant. It's refreshing to see someone like him who interacts with others in such a humble way. I hope to learn from his example and be more like that in my own personal interactions.

    • @Adamskyization
      @Adamskyization 1 year ago +29

      The thing is, he is not that smart.

    • @hnaku8748
      @hnaku8748 1 year ago +8

      @@Adamskyization Smarter than me and a lot of others I know. And much nicer too!

    • @btn237
      @btn237 1 year ago +8

      ⁠👆if you’re wondering what said arrogance sounds like, and how to avoid it

    • @shaggyfeng9110
      @shaggyfeng9110 1 year ago +7

      ​@@Adamskyization He is smarter than 90% of people on Earth or 85% of US residents for sure. The fact that Lex can push provocative conversations without losing control of his emotion, just showed he has a good brain which most of us do not have.

    • @CristianIntriago_
      @CristianIntriago_ 1 year ago +2

      Idk he lacks stamina

  • @ttc1867
    @ttc1867 1 year ago +26

    Newer (just a year+) self-taught programmer here, and I’m getting to the point of building real, robust projects and ideas, so ChatGPT has been amazing when I realize, for example, hey, I need to loop over this particular data type or something simple and I don’t know how in that moment. But if I ever ask it for something a bit more complicated, I always end up fighting erroneous responses and having to over-explain my ask. This could definitely be worked on in future GPT iterations, but I think this idea of truly understanding what someone wants and needs seems super hard to reproduce with an LLM.

  • @oredaze
    @oredaze 1 year ago +36

    As of now, LLMs that write code are like low-skill interns that do tiny bits of code generation that need to be supervised by actual developers. It needs to be guided along and it needs a lot of help to be integrated into an actual project. It is very impressive, don't get me wrong, but it's nowhere near human replacement and I don't see that changing drastically anytime soon.
    Programming, unlike other activities, needs a lot of contextual understanding. It is on the opposite side of the spectrum from highly specialized activities like digital illustration. We saw the latter being perfected already. I'm going to say that AI will be capable of doing the former sort of activities LAST and the latter first. Especially things like game dev require so many unrelated skills like music, level design etc. If one AI can do all of that then I suspect it could do everything else in the world, at which point we have AGI and the singularity. I wouldn't be too worried about programmers... I'm more worried for the world as a whole.

  • @ttc1867
    @ttc1867 1 year ago +21

    “They’ll tell you they need a faster horse when they need a car”
    Fucking loved that

    • @rafidhoda
      @rafidhoda 8 months ago

      Original Henry Ford quote : )

    • @jurassicthunder
      @jurassicthunder 4 months ago

      I mean can go both ways. In this analogy we might be the horses and ChatGPT the car

  • @SuperCoolBoy0mg
    @SuperCoolBoy0mg 1 year ago +15

    I think future AI models will probably replace some parts of what we do quicker than we think - but I also think anyone who has the mind capable of manufacturing complex software will probably find a way of building something interesting and novel with the new tools AI creates - or extending our capabilities to simply make more fantastical software. I can't imagine it writing any and all complex software projects that we could produce, especially when we consider it as a tool to extend our capabilities.
    Just some thoughts:
    Even if it were capable of responding to "Generate me a Cloud based Web Video Viewing application" - You might want things like - "Oh but with support for webm videos" - "and support video comments" - "but make the comments filter out profane language for users under X age" - or "with a REST API and documentation for comments and stats"
    So programming could definitely become simplified into product / technical descriptions some day.
    Where rather than a repository of code - you could have a repository of a product description with caveats and nuances in human readable and understandable language (perhaps plain English descriptions).
    Humans love pushing the limits - so we'll probably use those programmers to push the limits of how complex of prompts we can generate, and generally solve novel problems in the realm of "what do we want exactly". At least until AI can predict what we want and produce outputs better than we could even think to ask for. Wouldn't be surprised if humans and AI are in a reinforcing feedback cycle of humans training AI with new & improved input - and AI providing new and improved tools for humans to produce new and improved outputs for.
    Wouldn't be surprised if many of us move towards (many many years from now) mostly working by training and improving AI/LLMs with high quality inputs, and providing feedback / improvement in the long run - or providing the right prompts/inputs for the desired output. idk - impossible to really predict but it will be interesting at least to see where things go

    • @zootsoot2006
      @zootsoot2006 11 months ago

      The question is whether the prompts required to gain useful answers from AI will change with every new iteration of the AI or whether we can start to build a pattern of understanding of how best to communicate with AI as a whole.

  • @ReflectionOcean
    @ReflectionOcean 11 months ago +1

    Reflect on the unique value humans bring to programming at [0:11].
    Consider how large language models (LLMs) are changing programming practices at [0:18].
    Recognize the potential for LLMs to automate routine coding tasks at [2:32].
    Explore the role of LLMs as companions in the coding process at [2:38].
    Contemplate the interplay between human creativity and LLM-generated code at [5:28].

  • @yacce4463
    @yacce4463 1 year ago +22

    Jesus Christ , finally someone who's not following a narrative built on irrationality! Nobody can predict the future, but if you program complex systems, you know very well where the limits are. Maybe AGI will arrive, and by then we're all on the same boat, and the issue will always be if we're not ALL on the same boat!

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +5

      I know, I get so sick of this constant assumption that ChatGPT is going to get so much better even though its fundamental operation really doesn't do what programmers do.
      I'm also not buying Lex's angle that the statistical average of all language on the internet is somehow a deeper insight into the nature of reality and intelligence.

    • @ThomasTomiczek
      @ThomasTomiczek 1 year ago +3

      Point is - you do not need AGI. The idea that AGI is needed to change something is stupid. See, larger projects have people writing specs, there are coding standards, there are testers. An AI that can work with that can make most developers redundant. The hint here is - that is NOT AN AGI. A good programming AI may not be a doctor or a lawyer, and the definition of AGI is that it does ALL the human stuff in one model. Nope, it is specialized. It is just 2-3 generations further than what we have now.

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +5

      @@ThomasTomiczek even what you're saying isn't enough.
      People generating "specs" to feed into an AI is not enough, because specs are written in an ambiguous language known as "English", whereas code is written in a logically unambiguous language known as "code".
      Tests don't really solve the problem either, because your code can pass all tests while still failing in production, and writing tests is often not possible without a preexisting API or knowledge of how said API is implemented.

    • @gastonangelini8352
      @gastonangelini8352 1 year ago

      @@zacharychristy8928 humans are not intelligent... We can reason about things, figure out ways of solving problems and build on other people's hard work.
      But what is intelligence? Can you focus really hard and invent something new? For sure not.
      You can only build on another person's experience in any given field.
      Intelligence is an illusion.
      Don't fool yourself

  • @takeuchi5760
    @takeuchi5760 1 year ago +10

    Let's take a moment to appreciate the variable names in the thumbnail code.

  • @user-xx7tv7cc1y
    @user-xx7tv7cc1y 1 year ago +9

    ChatGPT will never replace software developers, because ChatGPT first has to figure out what the client wants, and the client doesn't even know what they want.

    • @clray123
      @clray123 1 year ago +1

      I guess the idea is that in future ChatGPT will interview clients to help them figure out what they want. And then implement it.

    • @ThomasTomiczek
      @ThomasTomiczek 1 year ago +1

      See, first - ChatGPT is a chat program. A proper cognitive loop using the API can have WAY better capabilities. Second, you work on super small stuff, right? Because any project I did in the last decade has PRODUCT OWNERS handling this part. Then user stories get evaluated (which ChatGPT actually is not that bad at). Planning poker. It is not there yet (even with a cognitive loop), but saying "never" marks you super high on "ignorant idiot". 3 years ago it could not talk complex ethics - now it can. What is "never" in your universe? 2 years? What in 10?

    • @clray123
      @clray123 1 year ago

      @@ThomasTomiczek It rather sounds like it is you who is talking bullshit. Cognitive loop? These things are currently tripping up on their own randomly generated bs, with devs desperately trying to mask it by gluing not some very intelligent AI but rather clunky old traditional hardcoded logic workarounds on top of it to hide the embarrassment. Also slow and memory-inefficient as hell, compared to handcrafted algorithms, say, for parsing or compiling good ole machine languages.

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +6

      @@ThomasTomiczek no, because there is no "cognitive loop" that's a term you're inventing that doesn't exist.
      ChatGPT is simply a "semantic loop" which is not the same thing. You don't get deeper insights into what would help a person solve a problem by simply being able to generate a response to the last thing they said.
      You're imagining a richer operation is taking place than what actually is. There is no insight to be gained from simply generating responses. You're implying there is some context-aware cognitive model underneath because you've been utterly fooled by a Chinese Room.

    • @chrisstucker1813
      @chrisstucker1813 1 month ago +1

      I imagine in the future, the client will ask the AI for an app and it will get pumped out in seconds. If the user doesn’t like it and requests any changes, the AI will attempt again; repeat until satisfied client.

  • @MonisKhanIM
    @MonisKhanIM 1 year ago +5

    For me it generates incorrect code most of the time

  • @lukemelo
    @lukemelo 1 year ago +10

    In my experience the jump from GPT 3 to 4 is huge for programming in python specifically. I've found it helpful to first communicate back and forth with GPT to design an algorithm, then once satisfied with the logic to ask just one time to produce the code.
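
    (Editor's note: a minimal sketch of that two-phase design-then-code workflow, assuming the OpenAI Python client (openai>=1.0); the model name, prompts, and task are placeholders, not anything from the video.)

      from openai import OpenAI  # assumes the official OpenAI Python client is installed

      client = OpenAI()  # reads OPENAI_API_KEY from the environment
      history = [{"role": "user",
                  "content": "Help me design an algorithm to deduplicate customer records. "
                             "Discuss the approach only, no code yet."}]

      # Phase 1: iterate on the design in plain language.
      design = client.chat.completions.create(model="gpt-4o", messages=history)
      history.append({"role": "assistant", "content": design.choices[0].message.content})
      # (...more back-and-forth turns would be appended here...)

      # Phase 2: once satisfied with the logic, ask once for the implementation.
      history.append({"role": "user", "content": "The design looks good. Now write the Python code."})
      code = client.chat.completions.create(model="gpt-4o", messages=history)
      print(code.choices[0].message.content)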

    • @VoloBonja
      @VoloBonja 7 months ago

      How is jump to GPT4-turbo?

    • @lukemelo
      @lukemelo 7 months ago

      @@VoloBonja Nice to have longer input but no big difference otherwise

    • @VoloBonja
      @VoloBonja 7 months ago

      @@lukemelo a failed attempt at GPT-5, so they called it turbo…

  • @petrulutenco6600
    @petrulutenco6600 1 year ago +4

    What a kind and lovely person Chris Lattner is.
    Thank you for Swift

  •  1 year ago +3

    I am a novice in programming, but it has proven to be extremely useful in my work and I am still learning. I can imagine that many advanced programmers might be able to accomplish more sophisticated tasks, but for a beginner like me, GPT-4 has been an incredible booster. I believe I have at least doubled the amount of useful code I can produce. Additionally, it has taught me how to learn about programming. My fear was that using GPT-4 would make me complacent or reduce my drive to learn more, but in reality, based on my experience, it has accelerated my learning and helped me make more progress. I think the impact will vary greatly depending on one's level of programming expertise, but this technology is bound to transform the field in one way or another.

  • @misterbuu666
    @misterbuu666 7 months ago +1

    The calculator didn't kill science - it just made it less error-prone in certain areas. LLMs for programmers will become just what calculators are for scientists: they do simple things quickly, allowing focus to shift to more complicated things.

  • @soggybiscuit6098
    @soggybiscuit6098 1 year ago +15

    Bruh AI went from zero to 100 in a short period of time and ain't stopping, and ppl still saying nah this is the best it can get 🤡

    • @rockwithyou2006
      @rockwithyou2006 1 year ago +5

      except that it didn't. I started my undergrad in 2006 and already we used to consider "coding" as a menial job or "robotic". AI has existed for years and gotten mainstream only now.

    • @rejectionistmanifesto8836
      @rejectionistmanifesto8836 8 months ago

      @@rockwithyou2006 You all need to discount everything before AlphaGo (the 2017 version); all those 50-60 years were basically like learning how to build the effective tool of actually working deep learning.

  • @jamesbra4410
    @jamesbra4410 1 year ago +2

    So far I've found it useful for websites, web servers, microcontrollers, datasheet reading, plc languages, video game engines, and so many other languages. It's an extremely useful tool.

    • @synthwave8548
      @synthwave8548 1 year ago

      How have you used it with PLCs and microcontrollers?

    • @jamesbra4410
      @jamesbra4410 1 year ago

      @@synthwave8548 Well, in the world of automation engineering there are two methods so far that could let somebody program a robot for an industrial application: the usual one is a PLC with an HMI to display data to the operator, and the alternative is a microcontroller. Using a microcontroller in C is more in-depth than a PLC. If you want to set up a webserver on a microcontroller then you'd need to include the JavaScript and HTML in the IDE along with additional Node software and files. So using ChatGPT for that is a great option because it will specify all the libraries to use in the IDE and write the JS and HTML code necessary to display the website correctly. As well, I've asked ChatGPT to write code for a PLC (it can do both structured text and ladder logic) that would do simple digital and analog outputs to relays or transistor switches, counters, PWM, variable current supply to a pneumatic actuator or variable frequency drive, feedback from sensors requiring analog-to-digital converters, comparators, etc. Basically specify your PLC, whether you want ST or LL format, and what industrial scenario you want to accomplish, and it will get you started, continually refine itself as we know, and provide an in-depth explanation if needed.

  • @carlettoburacco9235
    @carlettoburacco9235 1 year ago +4

    Right now I am perfectly in tune with "the customer wants a faster horse...........".
    I'm working on a project whose original specs were for a "software horse" that can fly, scuba dive, can carry 10 tons up a vertical wall, and can easily fit in your pocket.
    ChatGPT is a very good programmer, but try to make him steer the customer towards something in the realm of possibilities. (without giving a call to Harry Potter)

    • @ThomasTomiczek
      @ThomasTomiczek 1 year ago

      That is out of the question for anything larger, though, where this steering is NOT A PROGRAMMER'S JOB. That is what Product Owners are for. Isolating the customer from... Also, you likely use ChatGPT without a proper planning and cost matrix. AI is capable, within limits, of common sense - just not a chat bot implementation. If you drop ChatGPT and go to GPT (i.e. use the API) and put a proper cognitive architecture there, things turn out a lot different regarding common sense. I have one here that ASKS ME QUESTIONS. Clever prompting to not have a slave.

    • @Ivcota
      @Ivcota 1 year ago +3

      @@ThomasTomiczek Programmers are not code monkeys. Helping steer is absolutely the job of a programmer, especially at more senior levels. Product would love to have some features that are neither feasible nor maintainable - those who develop the systems should be high-level advisers on the direction.

  • @blingblockchain
    @blingblockchain 11 months ago +3

    Coding is us simply telling a machine what to do in a way that we have the most control. Yes, AIs can give us some good code, and sometimes not so good code. Maybe down the road it will always be good. Either way, we still have to tell it what we want. Personally I’d rather do that in code vs. chat/text.

    • @zootsoot2006
      @zootsoot2006 11 months ago +1

      Yeah, code is much more logical than any human-spoken language, therefore why communicate with computer code, i.e. AI, in English when you could communicate with the language it's actually built on, code.

  • @amparoconsuelo9451
    @amparoconsuelo9451 11 months ago +1

    1) Can ChatGPT create a compiler and an operating system?
    2) Can ChatGPT create a computer language?
    3) Can ChatGPT create its own Python modules, packages and libraries?

    • @MasterFrechmen732
      @MasterFrechmen732 10 months ago +2

      Answer is NO, big NO. Because ChatGPT is shitty software that provides bad code 99% of the time.

    • @rajeevdsamuel
      @rajeevdsamuel 9 months ago

      @@MasterFrechmen732 This is all a big scam to lower programmers salaries lol

  • @SynthByte_
    @SynthByte_ 8 months ago

    For me, LLMs work best not at writing code but at figuring out an error: explaining why the error happened and what I can do to fix it

  • @dominikcicea
    @dominikcicea 1 year ago +23

    I asked ChatGPT about this and it said No, and also that AI will create even more job opportunities.

    • @DionaldysSalcedo
      @DionaldysSalcedo 1 year ago

      Wake up and stop believing everything you read. It’s called Fact Checking. People lie, just like you

    • @gastonangelini8352
      @gastonangelini8352 1 year ago +8

      More job opportunities for other AIs for sure...
      It's a joke 🤣
      Or not?

    • @dominikcicea
      @dominikcicea 1 year ago

      @@gastonangelini8352 Why would that be a joke? It was the same with technology: some technology replaced old-school jobs, but it created millions more new jobs.
      For example, AI already created Prompt Engineering, people are already making money from it!

    • @dominikcicea
      @dominikcicea 1 year ago

      @@bobanmilisavljevic7857 You are mentally sick

    • @Mica_No
      @Mica_No 1 year ago +9

      I would say that too if I was an AI

  • @rubendariofrancodiaz6944
    @rubendariofrancodiaz6944 1 year ago +4

    I think ChatGPT is too high-level and leaves space for a lot of ambiguity. It will obscure a lot of what the machine will be doing and how it will be performing. So a lot of computational knowledge is still going to be needed, even more than these days, if you want to get something sized and performant, without surprises in terms of costs, and of course, something that can be maintained in the future. I don't see big companies leaving all their profit and reputation to a bunch of guys without computational knowledge playing with a prompt. Also, I don't think the prompt will be the future of programming. There will be something in the middle that can deal with ambiguity and that is more precise. I believe programming languages will be needed less and less, but computational knowledge more and more.

  • @surfbeach100
    @surfbeach100 15 days ago

    why would you want to put an LLM in a compiler?

  • @AfriTechChronicles
    @AfriTechChronicles 2 months ago

    It is absolutely true that we should not compete with LLMs but work in sync with them to achieve greater heights

  • @NomadicBrian
    @NomadicBrian 6 months ago +2

    As a 66 year old Application Developer I hope we at least have 5 to 10 years before it is all over. If I don't outlive becoming obsolete there is no problem. Sorry to be morbid but AI doesn't care anyway. What am I doing after hearing the narrative about AI replacing developers? I went into phase 2 of my AI learning. After 4 years of Python I started PyTorch and Deep Learning. Understand tensors and the basics of Neural Networks and model training/testing. Phase 3 will be returning to my Python fastAPI NFL Colts app where I build 15 graphs but leveling up with some predictability or projection on a player or team stat. Then have a stat of the week feature to give my graphs/bar charts a little more spice. That I deploy to heroku/salesforce cloud as an Angular web UI.

  • @mikestock1848
    @mikestock1848 1 year ago +5

    Programming is just translating solutions to problems into a language a computer can understand
    It's the problem solving that is the difficult part, once chat gpt can do that, then programmers are in trouble, haven't seen any sign of that yet

    • @Neo-Reloaded
      @Neo-Reloaded 1 year ago +4

      It's like thinking that the calculator replaced math teachers. People still have to learn basic arithmetic.

    • @amadeusbojiuc2613
      @amadeusbojiuc2613 1 year ago

      Best comments

    • @unhash631
      @unhash631 1 year ago +1

      > problem solving that is the difficult part
      It depends on what kind of problem. A bug in the C library? Sure, that's difficult. An e-commerce website? I doubt AI would still have difficulty coding one by itself in the next 10 years. The latter is still a job for thousands of people currently by the way.
      Thing is, problem solving isn't as special as we think it is. It's really just a bunch of neural networks. The only strength we have as humans now are our "Eureka" moments and how we aren't as prone to catastrophic forgetting as DL networks are. In the future, I think knowledge work will only be reserved to the most intelligent humans (i.e. those who are considered geniuses). It's a dismal prediction but there's really no other way to see it.

    • @mikestock1848
      @mikestock1848 1 year ago

      @@unhash631 Yeah but it would have to understand the domain, Dynamics and Salesforce have tried to be completely generic, but you still need armies of developers to integrate if you want any business value
      It's also the problem that it has to be able to modify the existing code and be an interpreter for humans giving requirements
      Maybe in 10 years? Sure, it's not there right now, which was my original point
      Problem solving is actually special, I've seen many people who do it wrong

  • @esteban80
    @esteban80 11 months ago

    There's myriad things that Chris and Lex do not address... an LLM spits out 'the highest likely next word'... Do you want that to be writing your code? What are the consequences?
    Code needs to be consistent, complete and coherent... and it needs to take into account the functional and non-functional needs of all of its stakeholders
    Neither of which can be done right now if you want to replace developers.
    Example: There are 7 million lines of code spread over 150+ systems/chips in a wafer-stepper machine. Every hour of it *not* being operational costs ~300k.
    After an issue:
    - Do you want to be the 'prompt engineer' that convinces an LLM to spit out the 7 million lines of code that doesn't break it?
    - Do you want to rely on a tested code-base and engineers?

  • @AlexDrastico380
    @AlexDrastico380 1 year ago +4

    Anyone who thinks ChatGPT can write industrial-strength software systems and can replace software engineers is either a bedroom programmer or does not have any knowledge at all about software development.

    • @michaelpieters1844
      @michaelpieters1844 1 year ago

      Chatgpt won't do it but be damned sure an even more advanced tool in 2030-2040 will.

  • @michaelhernandez2075
    @michaelhernandez2075 1 year ago +8

    AI absolutely will replace software engineers. It's just a matter of how long until then. Anyone saying it's not going to replace engineers, along with a lot of other people, seems to be in denial.

    • @giuseppegatti5079
      @giuseppegatti5079 1 year ago +2

      I agree. Not all programmers, but I think many of them are just afraid to lose their jobs, and so they try to spread this news to make others think AI won't replace them.
      Even in 4 years we could have a powerful chat GPT. Technology is getting more and more advanced every month.

  • @camgere
    @camgere 1 year ago +10

    I look at software tools as a series of increasing levels of abstraction and decreasing levels of skill necessary to use them. You started out with machine code. It might even have been wires on a plug board. The 1's and 0's entered into specific memory addresses. Then IBM Hollerith Punch cards or paper tape to load code. The assemblers came along to create machine code. Much faster and easier. Then higher-level languages came along and were even easier. The skills needed to use higher level languages became less and less at the higher levels of abstraction. There are people who can program in Python that can't write or read machine code. Of course, object-oriented programming is all the rage. No need to know how to create all those fancy data structures. ChatGPT will just be a higher level of abstraction that requires less detailed knowledge to use. Do you know why there is a language of mathematics? Because English is ambiguous and imprecise. Some level of computer language will be around for a long while.
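
    (Editor's note: one way to see those abstraction layers from inside Python itself; a small illustrative sketch, not from the video. The dis module shows the stack-machine bytecode hiding under a single readable line.)

      import dis

      def total(prices, tax=0.07):
          # High-level Python: the intent fits in one line.
          return sum(prices) * (1 + tax)

      print(total([10.0, 25.0]))   # 37.45
      dis.dis(total)               # the same logic, one level down: CPython bytecode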

    • @rubendariofrancodiaz6944
      @rubendariofrancodiaz6944 1 year ago +2

      I think ChatGPT is too high-level and leaves space for a lot of ambiguity. It will obscure a lot of what the machine will be doing and how it will be performing. So a lot of computational knowledge is still going to be needed, even more than these days, if you want to get something sized and performant, without surprises in terms of costs, and of course, something that can be maintained in the future. I don't see big companies leaving all their profit and reputation to a bunch of guys without computational knowledge playing with a prompt. Also, I don't think the prompt will be the future of programming. There will be something in the middle that can deal with ambiguity and that is more precise. I believe programming languages will be needed less and less, but computational knowledge more and more.

  • @thydevdom
    @thydevdom 4 months ago

    I use it from time to time for code generation on certain specific problems I'm having issues with. At the end of the day, the users using ChatGPT still need to understand the code.

  • @TomT-ds9vn
    @TomT-ds9vn 8 months ago

    It will greatly diminish the need for programmers but not eliminate them.

  • @yashguma
    @yashguma 1 year ago +22

    It will make programmers more efficient, you'll still have to always understand what the code is doing.

    • @pladselsker8340
      @pladselsker8340 1 year ago +9

      Yes, until the model understands what the full project code is doing, and until they figure out how to properly train their weird recursive transformer that allows an unlimited amount of tokens, we're fine.
      (Programmers are doomed).

    • @ddd7386
      @ddd7386 1 year ago

      @@pladselsker8340 Programmers create these projects. A real programmer is an engineer. Writing code is just a mechanical thing. Figuring out how to achieve the goal is what a software engineer does. Before software engineers are doomed, other professions will stop existing, except probably mathematicians.

    • @theawebster1505
      @theawebster1505 1 year ago

      Why would that be. Do you understand how your microwave works?
      Everyone I have spoken with was like Oh, yeah, AI is very powerful, it will take some jobs, but not MINE. Because X, Y and Z.
      Does this smell of something? It has to replace SOMEONE 🙂 Stop burying your heads in the sand and consider the future.

    • @rileyfletch
      @rileyfletch 1 year ago +1

      @@pladselsker8340 At that point say goodbye to a majority of technical jobs

    • @davidh00
      @davidh00 1 year ago

      Why? I've given specs to hundreds of offshore developers, and they have close to zero understanding of what the overall system is intended to accomplish.

  • @wbb5954
    @wbb5954 1 year ago +5

    I don't know how Lex could overlook asking Christian about "The Shot". Crazy that he hit one of the most iconic shots in basketball history and went on to become one of the world's most prolific programmers. Big things from a guy who used to do keg stands at Duke.

    • @jerryyu7270
      @jerryyu7270 1 year ago +1

      I thought your comment was interesting so I did some research. I think you have Chris Lattner confused with Christian Laettner. Christian Laettner played college ball for Duke then went on to the NBA. Chris Lattner attended University of Portland.

    • @baldsportsfan9368
      @baldsportsfan9368 1 year ago

      Wrong Chris bro lol

  • @amadeusbojiuc2613
    @amadeusbojiuc2613 1 year ago +9

    1. You have to explain to chat gpt exactly what you want your code to do.
    2. You have to check its code.
    Within that time you could have already coded your function/program.

    • @jora5483
      @jora5483 1 year ago +1

      This is true for senior developers who know their stuff. But mid-level and junior developers who struggle can get results close to seniors'. Suddenly, we get so many developers... I think there are many gray areas in the future.

    • @NotHumant8727
      @NotHumant8727 1 year ago

      Nah, are you that bad at explaining? It can spit out hundreds of lines of code in under a minute. It's not so bad that it wouldn't even be useful as a framework. It often corrects its own problems if you compile the code and feed the errors or misbehaviour back to it.

  • @tradermunky1998
    @tradermunky1998 1 year ago +4

    They have been saying developers will be replaced by machines or WYSIWYG for literally decades. I'm not holding my breath.

  • @maltimoto
    @maltimoto 7 months ago +1

    won't happen, because writing new code is only 10% of the job. 90% is changing *existing* code and communicating with the customer or other teams. I don't see how AI could do that.

  • @The_Conspiracy_Analyst
    @The_Conspiracy_Analyst 10 months ago +1

    No, it CERTAINLY won't replace programmers. ChatGPT can, given a description, return a small snippet of code. But I think overall what it generates is trivial. There is still the matter of understanding the problem abstractly, what the constraints are and what we are trying to solve, and architecting a solution. That is, the abstract problem-solving component of writing code is separate from the specific implementation in a specific language and environment. And LLMs like ChatGPT understand NOTHING. It isn't an expert system, it's a neural net that mimics language. So it can't reason about anything. What I predict will happen vis-a-vis ChatGPT/LLMs and programmers is like what happened when CAD/CAM and networked workstations became mature and viable for Mechanical Engineers. The tools are so good, and fill in the skill gap, that most of the demand will be for codemonkeys (just like CADmonkeys), and salaries for these positions will drop. Most positions will be of this type. Gone will be the days of "rockstar" hipsters that are just good at quickly memorizing hipster frameworks/fashionable languages du jour. What I mean is that ChatGPT or similar assistants will flatten the field. There won't be these "rockstar" positions (because most of them are unwarranted in the first place), but productivity will open up in new dimensions, and so will the sheer NUMBER of available jobs. You won't have the elitism and barrier to entry. I think it will be nice. And a lot more cool code will end up being written that will solve real problems, which is out of reach right now.

  • @testtest-co9hk
    @testtest-co9hk 1 year ago +1

    What the hell did I just hear? An LLM inside a compiler? That's not how a compiler works. There is always a fraction of uncertainty in the answers of an LLM, but a compiler is never uncertain (it must not be). They are different worlds. ChatGPT cannot replace programmers, but programmers who use it efficiently can replace other programmers.

    • @testtest-co9hk
      @testtest-co9hk 1 year ago

      @Jake-oq2bq Chris never said that; Lex was asking if it was possible, which is what I commented on. Anyway, don't just go by people telling you this is possible or not; they all have their vested interest in jumping on the AI hype.

  • @RANDO4743
    @RANDO4743 1 year ago +21

    ChatGPT is a glorified search engine; it pulls or regurgitates info that is fed into it.

    • @yashagarwal8249
      @yashagarwal8249 1 year ago +9

      It can answer novel questions with novel answers, so no

    • @qweds3127
      @qweds3127 1 year ago +1

      Delusional take.

    • @trentforshee4556
      @trentforshee4556 1 year ago

      Except it can come to conclusions for you as search engines just give you information

  • @ihbrzmkqushzavojtr72mw5pqf6
    @ihbrzmkqushzavojtr72mw5pqf6 9 months ago

    IMHO, it allows you to write code in blocks instead of lines. Not replacing anyone at the moment

  • @aR-dm3ns
    @aR-dm3ns 5 months ago

    Short answer: yes
    Long answer: not for another 5 years

  • @almaguerrero3041
    @almaguerrero3041 1 year ago

    Even with Linux commands or some files or problems or maybe installations, humans with Stack Overflow work much better

  • @grafdp
    @grafdp 10 months ago

    Future releases of something like ChatGPT will definitely replace programmers.

    • @aseeralfaisalsaad
      @aseeralfaisalsaad 8 months ago

      You probably never coded or built anything, or you're a complete noob

  • @sarscov9854
    @sarscov9854 1 year ago

    In like 10 years or less I think. Self improving computer that will become infinitely improved.

  • @sv-xi6oq
    @sv-xi6oq 10 months ago +1

    Anyone who's actually used ChatGPT saying that it won't replace programmers is in such denial. Just because it's not always right doesn't mean it's not effective. It's more right than humans are. The only things ChatGPT is bad at are high-level mathematics and physical labor.

    • @gerardoricor
      @gerardoricor 8 months ago

      And also it will keep improving. This is just the beginning. I don't know if all these people are aware of exponential progress.

    • @drwho9319
      @drwho9319 1 month ago

      But developers are saying ChatGPT cannot write code and it's just pulling code, most of which is not optimized, or broken. It can't replace developers if it writes like that at the moment

  • @TheCameltotem
    @TheCameltotem 6 months ago +1

    Code shouldn't be innovative in most cases. Do we build new types of engines and tools every day? No, they are mass-produced.
    What is unique is the business and the goals it has; it's your job to build something that reflects and helps those goals. The actual code to do this doesn't have to be unique or innovative.

  • @justcodeitbro1312
    @justcodeitbro1312 1 year ago +1

    10 years from now we will talk

  • @numanunal6699
    @numanunal6699 8 months ago +1

    Yes

  • @Andrew-un8tx
    @Andrew-un8tx 1 year ago +2

    The only people I hear talking about this are students and academics. The reality is that most companies have a proprietary codebase that they will not feed into an outside owned AI platform. It's a security risk. Everyone talks about this from a possibility standpoint, not a practicality standpoint. If AI replaces software engineers it won't be for a long, long time. Not until the companies can buy bespoke AI implementations and own them outright.

    • @jojoma2248
      @jojoma2248 1 year ago

      I think you're wrong. The competitive advantage of using AI is too big. They'll make AI models trained but isolated so there are no security risks.

    • @Andrew-un8tx
      @Andrew-un8tx 1 year ago

      @@jojoma2248 you obviously have never worked in a corporate job with Intellectual property and data integrity concerns.

    • @jojoma2248
      @jojoma2248 1 year ago

      @@Andrew-un8tx come back in 5 years, I think you'll see I'm right

    • @Andrew-un8tx
      @Andrew-un8tx 1 year ago

      ok, kiddo. I'm sure you know better than someone who deals with this every day and makes these decisions for a living. Even though you have zero experience with this type of data. 5 years? At work we have legacy computer systems that are over 20 years old and cannot be updated due to security and data integrity issues. You truly have no idea what you're talking about and are just imagining some childish utopia. Because you're young, naive, and ignorant. With absolutely no clue how the world really works. @@jojoma2248

  • @Ayo22210
    @Ayo22210 9 months ago

    Ai CAD libraries should be a big thing

  • @SavvySavant
    @SavvySavant 1 year ago

    Honestly, at a certain level... Probably. But there is a level of specificity in reasoning, expertise and stochastic thought/problem solving that (at least with current LLM's) can never really be matched until computer scientists actually build TRUE A.I.. Until then, I think the experts have it covered..

  • @ilovetech8341
    @ilovetech8341 10 months ago

    Just have the CEO write all the software with ChatGPT. They better hope they actually know what they are doing.

  • @hassansyed5661
    @hassansyed5661 1 year ago +1

    ChatGPT is generating more vulnerable code than humans

  • @TheBigBlueMarble
    @TheBigBlueMarble 2 months ago

    OpenAI o1 changes this entire discussion. We are within 3 years of a true AGI that will have the ability to replace a large percentage of programmers in the world.

  • @tjmns
    @tjmns 11 months ago

    Did the robot "Data" replace workers in Star Trek the series? No. Computers are just tools used to solve complex problems. I don't think AI will be any different. It's just a tool

  • @paweszczepanski6738
    @paweszczepanski6738 1 month ago

    Sure. It will. It will replace us all. There will be a task force and it will put a GPT in everybody's place.

  • @CPB4444
    @CPB4444 1 year ago +5

    If it looks like a programmer, codes like a programmer, programs like a programmer then it just may be a programmer.

  • @mikeunger4165
    @mikeunger4165 1 year ago

    I have a new kid and was thinking of skill to teach him. Planned to learn a little Python so, when he gets old enough, I could teach him some coding. Now... not sure if it is worth it. Will programming survive another 18 years as a job? Probably. But just barely being able to code is a good job now. I have lots of friends that tried it; they all got jobs after a short coding academy. It might not be that kind of job in a few years.

  • @omid_tau
    @omid_tau 1 year ago +1

    ChatGPT is only as good as the prompt you give it, and the current iterations still make awful mistakes and are really just regurgitating Stack Overflow.

  • @scoff7032
    @scoff7032 1 year ago +1

    I'm more scared of recession than gpt-4 xD

  • @geneherald8169
    @geneherald8169 5 months ago

    It's already replaced coders. Just look how hard it is to get a coding job nowadays. It really never made much sense in the first place to have so many people working in a company's IT department anyways. What, it takes 100 people to code a website? No it doesn't... The future is in customer facing or physical roles I suppose. It doesn't matter how smart you think you are, you're competing against a literal digital superintelligence

  • @kevinfortier556
    @kevinfortier556 1 year ago +21

    Chatgpt helped lex realize he's just a casual when it comes to programming. It's ok lex, you're a podcaster now, not a programmer 😂

    • @Kersich86
      @Kersich86 1 year ago +7

      yeah exactly, sometimes it baffles my mind how someone who apparently was a professor on the subject knows so little about the tech itself

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +4

      Seriously. ChatGPT is great at solving tiny, isolated, pre-existing and well-understood problems. It's basically a searchable textbook, except at least a textbook gets checked for correctness.
      Actual valuable or interesting programming does not fit into this model.

    • @jendabekCZ
      @jendabekCZ 1 year ago +2

      @@zacharychristy8928 On the other hand, most of coders / programmers are just "solving tiny, isolated, pre-existing and well-understood problems" :)

    • @zacharychristy8928
      @zacharychristy8928 1 year ago +2

      @@jendabekCZ no they aren't. They're usually working on larger projects solving highly contextualized problems that don't translate well into a single text prompt. This is what people who studied code in school but never worked in industry would think.
      For most bugs, I can't explain in a single sentence what the bug is, when it's happening, what Ive ruled out and what could be causing it. The input to chatGPT for a single bug fix would basically have to be the entire codebase, a description of the hardware, the desired behavior and some accompanying pictures + explanations. Then when it inevitably fails to find the bug in all of that, I then have to find a way to explain what happened when I tried what it suggested, and the fix didn't work.
      It's not well suited to the task being described.

  • @eljefeog
    @eljefeog 11 months ago

    For Programmers, ChatGPT is an excellent tool but a poor master. A Swiss army knife for regular expressions. While very useful for pair programming, it takes a skilled developer to distinguish the wheat from the chaff.

  • @xsparik
    @xsparik 1 year ago +1

    Respectfully to all these coders in the comment section saying "It won't replace programmers": well, it is meant to replace all these jobs in the first place. The whole point of creating these LLMs is not a hobby or to see where it may go; they are created with the intent to exceed what humans can do intellectually.
    So it's all a "matter of time" until these LLMs reach the level of error-free proficiency to actually be safely implemented at a commercial scale.
    And the way things are going, it'll hardly take more than the next decade to do so.

  • @annunacky4463
    @annunacky4463 1 year ago

    Maybe they could replace politicians? Couldn’t be worse.

  • @bagzhansadvakassov1093
    @bagzhansadvakassov1093 1 year ago +3

    ChatGPT is overhyped.

  • @mk17173n
    @mk17173n 1 year ago +1

    ChatGPT can't even correctly do a medium-level code challenge lol.

  • @simpleman1218
    @simpleman1218 1 year ago +1

    when someone speaks about AI who actually profits from the AI hype.

  • @MK-tk8tb
    @MK-tk8tb 1 year ago +1

    Sexbots will replace programmers.

  • @ReginaJune
    @ReginaJune 1 year ago

    0:46 the uniqueness is that your human experience shapes your thinking. You can't compare the ability to think and the ability to create/manufacture to a calculator that can simulate all possible scenarios. ChatGPT didn't think of itself or build itself, and it can't kill anyone without the help of humans. Wouldn't it be a hilarious cosmic joke if AI were the source of consciousness and it created us to experience the world with all 5 senses.

    • @ReginaJune
      @ReginaJune 1 year ago

      1:14 think of how much time you’d have for dating!

  • @MAMW93
    @MAMW93 1 year ago

    It will, but not all. People still have to program the AI itself...

  • @isrealfemi7576
    @isrealfemi7576 3 months ago

    What I think AI will do is what the tractor did to farmers: I see a future whereby programmers will not worry about coding but about how to solve difficult problems.
    We will assume the full duty of a developer, which is to solve problems, and focus less on writing code

  • @godforreal7355
    @godforreal7355 8 months ago

    I _just_ started trying to learn Python...

  • @Coldeye816
    @Coldeye816 1 year ago

    can ChatGPT or any other LLM rewrite today's games and programs in low-level languages like C++ or C so they are well optimised?

    • @hamm8934
      @hamm8934 1 year ago +6

      Nope

    • @manm5302
      @manm5302 1 year ago

      @@hamm8934 why? I mean, most high level languages pride themselves as being easy to learn and debug and update. As LLMs become better, why shouldn't low level languages be used to speed everything up?

  • @KingGisInDaHouse
    @KingGisInDaHouse 11 months ago

    It’s only as smart as the person using it. You will still need basic programming knowledge, and maybe have it do a little heavy lifting for you, like form validation or drop-down menus.
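
    (Editor's note: the kind of "heavy lifting" meant here is boilerplate like the sketch below, a hypothetical sign-up validator of the sort one might let the model draft and then review; the field names and rules are made up for illustration.)

      import re

      EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple, not RFC-complete

      def validate_signup(form: dict) -> list[str]:
          """Return a list of human-readable errors; an empty list means the form is valid."""
          errors = []
          if not form.get("name", "").strip():
              errors.append("Name is required.")
          if not EMAIL_RE.match(form.get("email", "")):
              errors.append("Email address looks invalid.")
          if not str(form.get("age", "")).isdigit() or int(form["age"]) < 13:
              errors.append("Age must be a number of at least 13.")
          return errors

      print(validate_signup({"name": "Ada", "email": "ada@example.com", "age": "21"}))  # []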

  • @philforrence
    @philforrence 1 year ago

    Agreed

  • @gonootropics2.065
    @gonootropics2.065 1 year ago +1

    This is definitely going to replace entry-level programmers. I only know basic programming concepts and was able to build a fully functional web scraper that stores data in a database and uses JavaScript and PHP to display it on the front end. There'd be no way I could accomplish this without it. It does however often give you incorrect code, and as your project grows it begins to lose context of all the moving parts. On more complex projects you would definitely need a high-level programmer to know what questions to ask, and where that code fits into the entire structure
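
    (Editor's note: for context, the core of the kind of project described here can be quite small; a minimal sketch with a hypothetical target page and CSS selector, assuming the requests and beautifulsoup4 packages, storing results in SQLite.)

      import sqlite3
      import requests
      from bs4 import BeautifulSoup

      URL = "https://example.com/articles"   # hypothetical page; swap in the real target

      def scrape_titles(url: str) -> list[str]:
          html = requests.get(url, timeout=10).text
          soup = BeautifulSoup(html, "html.parser")
          return [h.get_text(strip=True) for h in soup.select("h2.title")]  # hypothetical selector

      def store(titles: list[str], db_path: str = "scrape.db") -> None:
          with sqlite3.connect(db_path) as conn:
              conn.execute("CREATE TABLE IF NOT EXISTS articles (title TEXT)")
              conn.executemany("INSERT INTO articles (title) VALUES (?)", [(t,) for t in titles])

      if __name__ == "__main__":
          store(scrape_titles(URL))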

    • @rockwithyou2006
      @rockwithyou2006 1 year ago

      the entry level programmers of this generation are already far better than what chatGPT can do

  • @valuetraveler2026
    @valuetraveler2026 1 year ago

    haha maybe in 20 years.. It couldn't even replace basic HTML/CSS web programmers, but hey, it's a good start

  • @amcmillion3
    @amcmillion3 1 year ago +2

    I'd be much more concerned if GPT's code wasn't complete garbage. There is also no way for it to plan and architect large scale projects and to understand the problems customers are facing

    • @ThomasTomiczek
      @ThomasTomiczek 1 year ago +4

      So, you tell me that you do not know how to write a self-correcting loop using automatic testing, feeding the info back into the AI? Ah. Also, you are not concerned because the CURRENT version cannot program perfectly, while the version 2 years ago could not talk properly and use tools - so from that you come to the conclusion that in 5 years it STILL is not on your level? Hint: Realize what the future may hold. THERE WILL BE MAGIC. Maybe not - but you are confident development stops HERE.
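
      (Editor's note: for what it's worth, the loop being described is easy to sketch; the snippet below assumes the OpenAI Python client and a pytest test suite in tests/, with the model name, task, and file name as placeholders rather than anything from the video.)

        import subprocess
        from openai import OpenAI

        client = OpenAI()

        def ask(prompt: str) -> str:
            resp = client.chat.completions.create(model="gpt-4o",
                                                  messages=[{"role": "user", "content": prompt}])
            return resp.choices[0].message.content

        task = "Write module stats.py with a function median(values) that handles empty input."
        code = ask(task)

        for attempt in range(5):                       # give up after a few rounds
            open("stats.py", "w").write(code)
            result = subprocess.run(["pytest", "tests/", "-q"], capture_output=True, text=True)
            if result.returncode == 0:                 # tests pass: accept the code
                break
            # Feed the failures back and ask for a corrected version.
            code = ask(f"{task}\n\nYour previous attempt:\n{code}\n\nTest output:\n{result.stdout}\n"
                       "Return only the corrected contents of stats.py.")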

  • @piggyceiling7652
    @piggyceiling7652 1 year ago +12

    90 percent of people on the internet I see say NO, which I agree with. ChatGPT makes lots of mistakes and can't know all the requirements to a complex system. But do you really think in 3 to 5 to 10 years when AI is exponentially smarter and I mean thousands of times smarter (which is inevitable in my opinion), it won't be able to code faster and better than every human on earth combined??? And on top of that be able to manage and create requirements by itself??? 😆😆chatGPT is literally just the very very very very very tip of the ice berg when it comes to artificial intelligence. Coding will be dead within the next decade.

    • @dan-cj1rr
      @dan-cj1rr 1 year ago

      well if they do, we'll all be jobless, or everyone will be able to build the same systems as big companies. We're all gonna be CEOs? Or we're all gonna have to learn a trade skill. So my point is, until then, no one will have to work anymore (except trade jobs/nursing/docs etc.)

    • @thiagobadin5331
      @thiagobadin5331 1 year ago

      @dan that is a problem that humanity is already running into. Many people around the world have their job being done better by a machine for decades. That creates unemployment around the globe. Programmers are just another one of these professions. What to do when fewer and fewer tasks are needed to be done by a human? That is what we, as a society, will need to figure out in the future

  • @alexshaykevich509
    @alexshaykevich509 1 year ago +2

    Increasing programmer efficiency without expanding the amount of work will result in fewer jobs for humans. That's the crossroads we're at.

  • @fernandohiar9985
    @fernandohiar9985 6 months ago

    I'm a programmer and I don't use artificial intelligence.

  • @janklaas6885
    @janklaas6885 1 year ago

    📍3:46

  • @aaron___6014
    @aaron___6014 2 months ago

    "really well," as in it doesn't work.

  • @matthewberg5835
    @matthewberg5835 5 months ago

    no, AI will not replace coders or programmers

  • @Abdullah-zd5rz
    @Abdullah-zd5rz 2 months ago

    After o1, we're cooked

  • @Wubbay828
    @Wubbay828 1 year ago

    Poor Lex 🙁🙁🙁

  • @jakewolf079
    @jakewolf079 11 months ago

    AI only writes ''old'' code, meaning it only knows code that has been written by a human, so yes, if all you want is an Instagram clone or a Facebook clone, AI will indeed take some jobs in that regard, but if you are trying to innovate and create something new then no, not a chance. It will however speed things up a little.

  • @Virtual_ismo
    @Virtual_ismo 1 year ago

    For now the tool is very far from replacing programmers. It is interesting as long as the user knows what to ask and writes a very well-crafted prompt. The biggest risk of this can be a sequence of wrong or confusing guidance for someone who is building up knowledge about a specific area or question.

    • @ldandco
      @ldandco 1 year ago +1

      totally agreed

  • @karrde666666
    @karrde666666 1 year ago +1

    AI is a fad, everyone will have forgotten about it in a month

  • @bastabey2652
    @bastabey2652 1 year ago +1

    Code generation will be replaced with English coding, since LLMs are pretty good at NLP

  • @furqantarique3484
    @furqantarique3484 5 months ago

    I don't want to be jobless due to AI

  • @sunandasanyal7878
    @sunandasanyal7878 1 year ago +2

    First Lex woah!

  • @PepeCoinMania
    @PepeCoinMania 1 year ago

    you must be very bad to be replaced by a template generator lol
    a software developer MUST: understand business intricacies, deal with different people, ask them questions, understand what they need, negotiate, deliver, integrate with already existent software etc.
    NONE of these are available on the open internet, large corporations keep their own information private and well safe from outsiders
    that's why AI can't replace devs these days.

    • @CombatCombat-0
      @CombatCombat-0 1 year ago

      what if large corporations train their own AI for their specific needs with their private data.

    • @jojoma2248
      @jojoma2248 1 year ago

      Gpt 4 buddy. And they’re adding the ability to upload files, once that happens it’s over