3 Key Version Control Mistakes (HUGE STEP BACKWARDS)

  • Published: 25 Jun 2024
  • Version Control is pervasive these days and is fundamental to professional software development, but where does it go next? Git, often via platforms like GitHub, GitLab or BitBucket, is by far the most popular VCS, but its value is being watered down, and the next steps in software development look to be ignoring this vital tool of software engineering. Without the ability to step back safely from mistakes and re-establish "known-good" starting points from Version Control, we will lose the ability to make incremental progress, and with that loss we also lose the ability to create truly complex systems.
    In this episode Dave Farley, author of the best sellers "Continuous Delivery" and "Modern Software Engineering", describes the essential role of version control that we often ignore, and explores 3 ways in which we often compromise the value and utility of version control in our software projects.
    -
    🖇 LINKS:
    🔗 "How Version Control is Evolving" ➡️ www.infoworld.com/article/371...
    🔗 "History of Version Control" ➡️ ericsink.com/vcbe/html/histor...
    🔗 "Austerity Policy based on a Spreadsheet error" ➡️ www.theguardian.com/politics/...
    🔗 "Impact of Austerity on Death Rate" - British Medical Journal ➡️ bmjopen.bmj.com/content/7/11/...
    -
    ⭐ PATREON:
    Join the Continuous Delivery community and access extra perks & content! ➡️ bit.ly/ContinuousDeliveryPatreon
    -
    👕 T-SHIRTS:
    A fan of the T-shirts I wear in my videos? Grab your own, at reduced prices EXCLUSIVE TO CONTINUOUS DELIVERY FOLLOWERS! Get money off the already reasonably priced t-shirts!
    🔗 Check out their collection HERE: ➡️ bit.ly/3Uby9iA
    🚨 DON'T FORGET TO USE THIS DISCOUNT CODE: ContinuousDelivery
    -
    BOOKS:
    📖 Dave’s NEW BOOK "Modern Software Engineering" is available as paperback, or kindle here ➡️ amzn.to/3DwdwT3
    and NOW as an AUDIOBOOK available on iTunes, Amazon and Audible.
    📖 The original, award-winning "Continuous Delivery" book by Dave Farley and Jez Humble ➡️ amzn.to/2WxRYmx
    📖 "Continuous Delivery Pipelines" by Dave Farley
    Paperback ➡️ amzn.to/3gIULlA
    ebook version ➡️ leanpub.com/cd-pipelines
    NOTE: If you click on one of the Amazon Affiliate links and buy the book, Continuous Delivery Ltd. will get a small fee for the recommendation with NO increase in cost to you.
    -
    CHANNEL SPONSORS:
    Equal Experts is a product software development consultancy with a network of over 1,000 experienced technology consultants globally. They increase the pace of innovation by using modern software engineering practices that embrace Continuous Delivery, Security, and Operability from the outset ➡️ bit.ly/3ASy8n0
    TransFICC provides low-latency connectivity, automated trading workflows and e-trading systems for Fixed Income and Derivatives. TransFICC resolves the issue of market fragmentation by providing banks and asset managers with a unified low-latency, robust and scalable API, which provides connectivity to multiple trading venues while supporting numerous complex workflows across asset classes such as Rates and Credit Bonds, Repos, Mortgage-Backed Securities and Interest Rate Swaps ➡️ transficc.com
    Semaphore is a CI/CD platform that allows you to confidently and quickly ship quality code. Trusted by leading global engineering teams at Confluent, BetterUp, and Indeed, Semaphore sets new benchmarks in technological productivity and excellence. Find out more ➡️ bit.ly/CDSemaphore
    #softwareengineer #developer #git #github #versioncontrol
  • Science

Comments • 127

  • @username7763
    @username7763 2 days ago +58

    One thing I've noticed throughout my career is that a lot of teams are way too quick to throw away version control history. It seems at nearly every company I work for, I end up having to argue against a plan that includes throwing away version history. This happens when switching version control systems (including ones that have an importer), or when restructuring repos. I've done quite a bit of version control sleuthing on bug fixes where the version history I am looking for is years or even decades old. Often times version control history is the only documentation that exists for knowing why some complex logic exists and if it is still needed or what it should do.

    • @dalehagglund
      @dalehagglund 2 days ago +3

      I absolutely agree regarding the value of preserving history, and I've also fought off plans to do exactly that several times.

    • @tobibobibo
      @tobibobibo 2 days ago +4

      I am somewhat in between, because... yes, I do know the importance of version history, but many times I see version history being used wrong and generally misunderstood, and I can even name myself as a culprit of this. Where I see version history work at its very best is when each step in the history reflects a clean-cut modification to the code base that presents a meaningful, complete change. These are valuable changes that have been verified from start to end and add a measurable outcome, be that a feature addition, a refactor or a bug fix. However, where commits don't add value is when they represent an intermediate step on the way towards one of these goals, because they cannot be perceived as one concrete commit that can reasonably be reviewed as a complete change. The shallower they are, the worse, as they again reflect an intermediate step towards the goal. Examples of these are end-of-day commits, or commits that are essentially just snapshots of state while achieving a bigger goal.
      So... am I saying that you should not save your work? No, definitely not. Commit all you want. It's important! But do allocate time at the end of a complete feature, bugfix or refactor to consolidate your work, so that all the subtle, irrelevant small changes are melted together into a meaningful chunk that reflects the combined work of the specific task instead of reflecting the workflow that you had that day.
      This is an ideology I keep trying to get people to remember: version history shall reflect the increments in the code, not the personal workflow that developers have (unless it can be said to have a major impact on the outcome, of course).

    • @disgruntledtoons
      @disgruntledtoons 2 days ago +4

      "Often times version control history is the only documentation that exists for knowing why some complex logic exists and if it is still needed or what it should do."
      I blame the "good code is self-documenting" crowd for situations like this. Some devs and dev managers have a woody for deleting comments. My viewpoint is that I'd much rather see a comment I don't need than need a comment I don't see, and the time spent reading an unnecessary comment is far less than the amount of time spent trying to learn a code base that wasn't documented properly.

    • @dalehagglund
      @dalehagglund 2 days ago +3

      @@tobibobibo I agree fully with your view that history should reflect a "final version" of what happened during development, not the rough draft based on how the dev actually made changes. (I've heard this called the "clean history" approach, and I first ran across it on the Linux kernel mailing lists.) I've worked with this method, and it's great, but it requires the focus to do it, and intermediate git skills as well. I would love to always have a clean history, but a messy one is much better than an abandoned one. (In my view, at least.)

    • @JanVerny
      @JanVerny 2 days ago

      @@disgruntledtoons The problem I personally see with comments, is that they are by nature merely comments. They will be incorrect, they will be as hard to understand as your code and in the end they have no impact on the program itself.
      I think it's much better to put in the effort to create clear structures that show your intentions through architecture, naming and (sometimes even) creating a strong coupling in your code to make sure that nothing is lost to someone who didn't read some random comment at the top of the file. If you can't do that, I don't think you're going to put in the effort to write good comments either.

  • @PhilmannDark
    @PhilmannDark 2 days ago +14

    Version control gives you "undo" in software projects. No one would use a word processor without undo.

  • @MrLeeFergusson
    @MrLeeFergusson 2 days ago +29

    I use version control for basically everything these days: writing letters, my study notes, config files. To never worry about a mistake setting you back more than a few commits is a powerful thing indeed.

    • @user-mh2ye9nf3y
      @user-mh2ye9nf3y 2 days ago

      Agreed. I use git for software development projects and rcs for writing projects. I don't need git's extra features / complexity for my writing projects.

    • @jasonfreeman8022
      @jasonfreeman8022 2 days ago +7

      I've often wondered, watching my C-level wife wrangle with loads of documents, why they don't manage their spreadsheets and Word docs using uncompressed XML and Git. I've seen my wife at apoplectic levels, crying about how their financial model du jour got the way it did. Meanwhile I'm thinking "git log …"

    • @keepgoing335
      @keepgoing335 2 days ago

      @@jasonfreeman8022 At my shop we use SharePoint's version control feature for Word, PowerPoint and Excel. Our BA and QA colleagues don't want to learn Git, so SharePoint version control is something they prefer; it lets us include comments along with each version update. I can see the benefit of using Git for Word, but reading diffs of XML to trace back changes seems quite difficult.
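      (A rough sketch of one way to make those diffs readable, assuming pandoc is installed and the file name is just an example; spreadsheets are admittedly harder than Word docs:)

        # tell git to use a custom diff driver for Word documents
        echo '*.docx diff=pandoc' >> .gitattributes
        # convert the binary .docx to plain text before diffing
        git config diff.pandoc.textconv 'pandoc --to=plain'
        # 'git diff' and 'git log -p' now show readable text changes instead of XML/binary noise
        git diff HEAD~1 -- quarterly-report.docx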

    • @Yorgarazgreece
      @Yorgarazgreece 2 days ago

      Yeah, I don't see this solely as a coding tool. I use it for random stuff too.

    • @amandaosvaldo953
      @amandaosvaldo953 1 day ago

      Me too :D

  • @Hofer2304
    @Hofer2304 2 days ago +10

    A VCS should be taught as early as possible. Git has the advantage that you can use it offline. You can even use Git on your phone. You won't use your phone for a serious project, but for toy projects it's a good tool. If you use Git for toy projects, where you can experiment, a Git disaster in a serious project is less likely.
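    (For a toy project the whole loop is only a handful of commands, so there is very little to learn up front; a minimal sketch:)

      git init                          # turn the current folder into a purely local repository
      git add .                         # stage everything
      git commit -m "known-good state"  # record a point you can safely return to
      # experiment freely, then discard anything that didn't work out:
      git restore .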

  • @esra_erimez
    @esra_erimez 2 days ago

    This is a succinct, clear treatment of such a critical topic, one that we all too often take for granted.

  • @nviorres
    @nviorres 1 day ago

    Simply put but pure gold. Thank you Dave

  • @kalmarnagyandras
    @kalmarnagyandras 2 days ago +2

    I would love to see an Abstract Syntax Tree based VCS, where you work on language constructs (functions, classes) instead of filesystem constructs (files). Most languages have an 'official' formatter that can output the AST as nicely formatted files for portability.
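    (Until something like that exists, a crude approximation is to only ever commit formatter-canonical code, e.g. via a pre-commit hook; this sketch assumes a Go codebase and gofmt:)

      #!/bin/sh
      # save as .git/hooks/pre-commit and make it executable
      files=$(git diff --cached --name-only --diff-filter=ACM -- '*.go')
      [ -z "$files" ] && exit 0
      gofmt -w $files     # rewrite the staged files into the canonical formatting
      git add $files      # re-stage them, so every commit stores one canonical rendering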

    • @ElProfesorBaltazar
      @ElProfesorBaltazar 1 day ago +1

      Unison language does something like this, but even more sophisticated for dependencies etc.

  • @markrosenthal9108
    @markrosenthal9108 1 day ago

    To make things work together, I think it helps to think of what you are versioning as an ENVIRONMENT: everything you need to make the "system" work, not just code, but also your Jenkins and Ansible scripts. Use a branch to build an environment from dark, and use tags to provide version control. The trunk then becomes the complete serialized image of your production environment. "No junk on the trunk".
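    (As a sketch of the tagging side, with an example tag name:)

      # tag the exact state of the environment repo that went to production
      git tag -a prod-2024-06-25 -m "Production environment, 25 Jun 2024"
      git push origin prod-2024-06-25
      # later, rebuild that environment "from dark" by checking out the tag
      git checkout prod-2024-06-25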

  • @edgeeffect
    @edgeeffect 2 days ago +3

    I've recently heard of, but not tried, a sort-of new version control system, Jujutsu. I say "sort-of new" because it sits on git's very good back end whilst putting a slightly less tortuous user interface on the front end. That's the part of git that really, really does need to be replaced. Git's user interface provides absolutely no abstraction whatsoever, and after nearly 20 years of banging my head against it I still find it next to impossible to give it any love at all.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago +1

      Yeah! Git's UI sucks!

    • @black-snow
      @black-snow 14 hours ago

      Are we talking about the CLI? I don't get your point then. What level of abstraction should exist beyond switching branches, committing changes, or even rewriting history? I've only ever seen people struggle who haven't taken a minute to understand the one and a half underlying concepts you really need. The daily top-level commands do abstract away a bunch of things already.

  • @username7763
    @username7763 2 days ago +9

    I'm not sure about a git successor, but I've used a lot of different version control systems and git absolutely could be improved on. It does a lot of things right. Probably the big one that git and other DVCSs get right is easy branching and merging; a lot of prior systems didn't handle this well at all, and even SVN didn't have merge tracking for a long time. But there is a ton we lose with git too. The command-line commands are pretty random and arbitrary; it feels like a random pile of shell scripts (which is partially correct). There is no way to manage which branch should integrate into which. No permissions. No file locking (yeah, LFS is a hack). No bug tracking integration built in. You have to git stash way too much to do basic things. Because a branch is just a pointer, commits don't track what branch they were made on. Deleting a branch can delete history. So yes, git is an improvement on many other systems, but it is also a step back. First we should get back what we lost, and then we can talk about the future.
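    (On the "deleting a branch can delete history" point, the commits usually survive until garbage collection and can often be recovered via the reflog; a sketch, with a made-up branch name and hash:)

      git reflog                              # find the commit the deleted branch last pointed at
      git branch recovered-feature 3f2a9c1    # give that commit a branch name again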

    • @TonyWhitley
      @TonyWhitley 2 days ago +4

      I used Perforce for many years and it (mostly) did what I wanted. Git does what it wants and, if you're lucky, tells you what it didn't like about what you did, but mostly you're left copy/pasting obtuse error messages into a search engine in the hope that someone else has already managed to disentangle the word salad into a bunch of impenetrable git commands that will recover the situation.
      I have no idea what Git's local repo is supposed to offer in a world of 100% Internet connectivity, and the unnecessary complexity involved causes so many problems.
      I don't know what comes after Git, but it has to focus on what version control is trying to *achieve*, not just throw out a large number of often contradictory commands and tell you to sort it out for yourself. It's in the pre-smartphone era; I'm waiting for the iPhone.

    • @username7763
      @username7763 2 days ago +1

      @@TonyWhitley Yeah git met Linus's need to integrate patches from contributors all around the world. It wasn't supposed to be the one and only version control system people use. I also have good things to say about Perforce. Sometimes it makes sense to pay for software that you use every day to have something a little nicer and easier.

    • @mycroftholmesiv
      @mycroftholmesiv 2 days ago +1

      Git isn't perfect, but it is better than a number of the other SCMs that were out there. (File locking... ick.)
      Git's submodules and subtrees are still not at a level I would prefer (hard to keep code DRY and share code).
      But I would love to see Git handle secrets better.
      I am surprised that he's glossed over the traceability that Git offers (blame, etc.). As anyone who's had to respond to an audit can tell you, being able to determine who authorized a code change, and when it was built, tested, and placed into production, becomes critical, far more so than simply the ability to roll back a change.
      In fact, I'd argue that the branching/tagging which allows you to maintain multiple versions of a code base simultaneously is a critical feature.
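      (On the traceability point, the day-to-day commands look something like this; the file name and hash are made up:)

        git blame -L 120,140 src/billing.c     # who last changed these lines, and in which commit
        git log --follow -p -- src/billing.c   # full history of one file, across renames
        git tag --contains 3f2a9c1             # which release tags include a given commit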

    • @quademasters249
      @quademasters249 2 days ago +4

      My biggest issue with Git too. I fundamentally like how it works with a local version and a remote version, but anything at a higher level is just guesswork and looking at obscure internet docs.
      Often I'll run into an issue like a detached HEAD and have to cache off a copy, because I never know when fixing the detached state will trash my current changes. Sometimes it does and sometimes it doesn't.
      It's very "Linux": the basics are never updated while features advance forward.

    • @Hofer2304
      @Hofer2304 2 days ago +2

      Don't branch! If you branch on your local computer, because you want to try something, it may be okay, but not for a long time.

  • @simonabunker
    @simonabunker 20 minutes ago

    The best feature of git is that you can work on a feature in a branch while the main branch always remains releasable. Previous version control systems were not good at merging, which put a lot of people off using branches.

  • @HunterMayer
    @HunterMayer 1 day ago

    You hit on a big point: save your AI interactions. Last year I started identifying the log files of the AI systems that I use and capturing those logs, and although some of them are in proprietary formats, I've used AI to reverse-engineer what was said, and unless they start encrypting them I will continue to do so. The discussion that you have with the system is important. If you save URLs in your code documentation, or even just in the check-in notes, for reference later, it's essentially the same thing. And I find it extremely valuable to go back over discussions that I've had with systems, read what was said before, and compare it to the output of newer systems, because the systems have gotten better. I try not to rely too heavily on the output for critical systems without seriously vetting everything that comes out of it, but I'm probably preaching to the choir there.
    Some of the agentic solutions out there are doing this, and they even allow you to roll back. So this is a problem that is in the process of being solved. But this feature is not shiny and isn't getting a lot of attention. Yet.

  • @nathanturner6091
    @nathanturner6091 13 hours ago

    Also useful for configuration; you just need to be careful to handle secrets securely.
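    (A common minimal pattern for that, assuming a .env-style setup: keep the real secrets untracked and version a placeholder template instead.)

      echo '.env' >> .gitignore      # the file with real secrets never gets committed
      cp .env .env.example           # copy it, then replace the real values with placeholders
      git add .gitignore .env.example
      git commit -m "Version config template, keep real secrets out of the repo"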

  • @pilotboba
    @pilotboba 1 day ago

    I've read some of your books and also the CD book with you and Jez.
    They seem to conflict or I am not understanding. Do you have two repos, one for the application and one for the configuration? What about IAC? Or should they all be in the same repo?

  • @MichaelSchuerig
    @MichaelSchuerig 2 days ago +2

    What's next for version control? I hope for a way to better record the semantics of changes. In an ideal world you never have to rename things, move them around or change signatures when you're in the middle of a task that's only tangentially related. In an ideal world, refactorings like these go into their own commit with a nice and helpful message. My world isn't always ideal, and so I keep hoping that one day VCSs will be able to recognize these kinds of mechanical changes.
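    (Today the closest approximation is doing it by hand, staging the mechanical rename separately from the behavioural change; the names in this sketch are invented:)

      git add -p          # stage only the hunks that belong to the rename
      git commit -m "Rename OrderSvc to OrderService (mechanical, no behaviour change)"
      git add -p          # then stage the actual feature work
      git commit -m "Apply discount rules to bulk orders"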

    • @cristianpallares7565
      @cristianpallares7565 2 days ago

      I guess there's a potential communication channel between a language server and a version control system. Not sure how big the benefits would be, though.

    • @qj0n
      @qj0n 1 day ago +1

      That would actually be not only a change to the VCS but to its whole relation to source code, and it would possibly require changes in the programming language itself. Still, an interesting area.
      I believe we will see a comeback of centralisation. Most systems are currently centralised, everyone has an Internet connection, and you often can't compile or run your software without one. Even open source, which was driving the decentralisation of VCS, is now centralised on GitHub.
      That being said, centralisation and a constant network connection would enable a new experience, more like Google Docs, where you can quickly peek at your colleagues' changes live and possibly co-develop if you like pair programming or simply have changes in the same file.

  • @vk3fbab
    @vk3fbab 2 days ago +1

    Version control is a subject guaranteed to get the developer arguments flowing. The mistakes I have seen include backup files committed to the VCS, which shows someone didn't really understand what the VCS does. The other is using branches to avoid merging into a release branch: a dev creates a feature branch and then expects that feature branch to pass QA before it gets merged into the development branch, whereas customers only care that the feature actually works in their release; the fact it worked on a development branch means nothing. I work with a lot of people who avoid TDD and TBD, hence we have lots of integration challenges, and also groundhog days with bugs that would have been caught by TDD.

    • @mecanuktutorials6476
      @mecanuktutorials6476 2 days ago

      TDD and feature branching are independent concerns that can and often do coexist. You’re probably thinking about feature flags which are hiding or disabling features, but letting the code for them onto the main branch. That’s fine but can also cause the code to become bloated with unused features. Feature branching ensures the main branch is clean.

  • @bart2019
    @bart2019 2 days ago

    So... How would you version control 2 separate software projects that are meant to work together but work on separate platforms, for example a server and a client?

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago +3

      Version control works on text, and so is not platform specific. If you have different components of a system that work on different platforms, there is nothing to prevent you from keeping them in the same repo. Ideally do that, along with their deployment scripts for each of the different platforms. All completely doable, all fairly common practice.

    • @zedrikcayne
      @zedrikcayne 2 days ago +1

      Currently? Subprojects.
      The 'release' is a repo with the server and client in a subproject. As part of the release process we pin the version of the client with the version of the server.
      The staging environment is a branch off that repo. We merge our new version on. Let ci/cd build it. Iron out all the problems, merging fixes back down into the subprojects. And then merge that version to the 'production' branch which builds and deploys to production directly, releasing client versions etc.
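      (With plain git submodules, that pinning might look roughly like this; the URLs and versions are made up:)

        git submodule add https://example.com/acme/server.git server
        git submodule add https://example.com/acme/client.git client
        git -C client checkout v2.3.1      # choose the client version to pair with this server
        git add client
        git commit -m "Release: pin client v2.3.1 against server v5.0.0"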

  • @dlabor1965
    @dlabor1965 1 day ago

    The one and only coding God is you, David!

  • @Kitsune_Dev
    @Kitsune_Dev 1 day ago

    Can you please add time stamps so I can understand what you are talking about in a short summary?

  • @xybersurfer
    @xybersurfer 1 day ago

    AI generating different things for the same question is due to a setting usually called "temperature", which makes the output less predictable the higher it is. It was probably created as a way to mask the limitations (I also hate this, because I too value reproducibility). If I'm not mistaken, generative AI can be fed its previous output to improve upon it. I think the focus should be more on the human making the incremental changes to the specifications; the AI working incrementally is more of a detail that may or may not be necessary to implement the specifications.

    • @TomVahlman-bz9nj
      @TomVahlman-bz9nj 14 minutes ago

      Yes, you cannot blame the machine for the human not setting clear guidelines for the machine to work in. The human must improve the guidelines in small steps :)

  • @johnvonachen1672
    @johnvonachen1672 1 day ago +1

    I think the next version of version control is making a file system which has version control built in. In other words there would be no difference between the VCS and the file system, they are one and the same. Probably something where it uses a database to manage all files and your access to it is only through an interface to that database. Alright, now go and make that.

    • @edgeeffect
      @edgeeffect 1 day ago +1

      ... that filesystem already exists(???) ZFS (?????)
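      (ZFS gives you filesystem-level snapshots and rollback, though no commit messages or merging; roughly, with a made-up dataset name:)

        zfs snapshot tank/projects@before-refactor    # cheap point-in-time copy
        zfs list -t snapshot                          # list the snapshot "history"
        zfs rollback tank/projects@before-refactor    # step back to the known-good state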

    • @johnvonachen1672
      @johnvonachen1672 1 day ago

      @@edgeeffect Cool

  • @aaronbono4688
    @aaronbono4688 23 hours ago

    I have no idea where you get this concept that the AI tools just create the whole solution in one fell swoop. Yes, it appears on your screen in one big lump, but it's usually not complete and you have to prompt it for more. We don't really know how it works, but if it's anything like how it builds sentences for us as we type, it does it one word at a time, so under the covers it could very well be building up the chunk of code a little bit at a time before it drops it in our lap.

  • @NachtmahrNebenan
    @NachtmahrNebenan 2 days ago +1

    Thank you, Dave, for pointing out the bug in the Excel spreadsheet! Our minister of finance here in Germany knowingly acted on the wrong suggestion and did big harm to countries like Greece, Italy and Spain. Many people died. He never apologized.

  • @georgebeierberkeley
    @georgebeierberkeley 2 days ago

    I think I'm the only one left using MS Team Foundation Server. I haven't left it for (free, Copilot-enabled) Git because porting over all the history seems like a scary proposition.

  • @oleksandrsova4803
    @oleksandrsova4803 2 days ago

    The only thing with this approach that is still an issue for me is that I don't see a clear way to define a *range* of versions of component A that component B depends on. Only the single currently latest version in the repo. But it is not a big issue as modern orchestrators demand a single version of each component anyway.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago +2

      But “only the single current latest version in the repo” is presumably the one you have tested with all the others, so the one that you know works. The idea is to either test what combination will end up in production, or design things so you don’t really care.

    • @jasondbaker
      @jasondbaker 2 days ago +1

      The few ways I can think of to define these kinds of version dependencies in vcs are either: a) put everything into a mono repo, or b) create a special “orchestration” repo which stores and tracks which software versions should be deployed in which environments. Both options suck because they create a constraint, a choke point through which all teams must pass. Hard wiring these sorts of changes also leads to unpleasant things like release trains and cat herding release managers. My preference is to encourage autonomy and leverage versioned apis and contract testing to validate whether or not an updated service is feasible to deploy to production.

    • @dalehagglund
      @dalehagglund 1 day ago

      @@ContinuousDelivery Hi Dave. First, thanks for creating this great channel. I would be fairly surprised if you haven't seen this, but for an on-prem, "enterprise", cross-site block storage system (so all quite old-school stuff) I worked on for quite a few years, we definitely ran automated system tests to check compatibility of the versions of the separate builds that had to work together. We were lucky to have built a very competent team of testers who were capable of creating and managing these complex automated systems.

  • @rethardotv5874
    @rethardotv5874 2 days ago +3

    AI can be prompted with the last iteration of the code to incrementally build on it, and GPT-4o even tries to run the code it proposes, inspects the results and iterates on them to improve the output. At least ChatGPT's level of managing this is at junior dev level, which is sufficient for an experienced developer to use as a tool.
    Whoever argued for a git successor does not understand software. There is only one thing that git does horribly, and that's handling binary artifacts, but we don't need that; we can just add a file that points to them and a script that downloads them after checking out or updating the repository.

    • @username7763
      @username7763 2 days ago

      AI is not remotely at the junior dev level... unless my standards are too high. I expect a junior dev to be able to fix bugs in an existing codebase: look at the code, look at the bug tracker, look at version control history, run the software to reproduce, make the change, test it. Sometimes this involves contacting the bug reporter for more information. AI is crazy impressive at times, for what it is. But I've had interns do more than it is capable of.

    • @rethardotv5874
      @rethardotv5874 2 days ago

      @@username7763 I’d consider doing the things you describe reliably without handholding closer to mid level.

    • @username7763
      @username7763 2 days ago

      @@rethardotv5874 Well, I didn't mean no handholding. But it is one thing to need to come back to me with questions or when stuck on something, and quite another to not be able to accomplish the task at all. AI systems are very far from being able to do basic software maintenance, even with handholding.

    • @rethardotv5874
      @rethardotv5874 2 days ago

      @@username7763 I totally agree on that

    • @martyndissington
      @martyndissington 2 days ago +1

      "Whoever argued for a git successor does not understand Software"
      Is this a five minute argument or the full half hour?

  • @jasondbaker
    @jasondbaker 2 days ago

    I agree that putting an Excel spreadsheet in version control is a good idea. I just wonder if using something like Git would be practical in some scenarios, due to the way that git stores changes, i.e., by making a complete copy of the file. I've worked in organizations that made daily changes to massive Excel files with millions of rows of data. Storing these changes in git probably isn't practical.
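    (The usual workaround for files like that is Git LFS, which keeps small pointers in the repo and the big blobs in separate storage; a sketch, assuming the LFS extension is installed and an example file name:)

      git lfs install
      git lfs track '*.xlsx'          # writes a tracking rule into .gitattributes
      git add .gitattributes financial-model.xlsx
      git commit -m "Track large spreadsheets with LFS"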

    • @paultapping9510
      @paultapping9510 2 days ago

      Is that how git works? I thought it just kept a log of the changes, which is what the diff is, and then the actual files were pushed to wherever you set it (so it could be GitHub, a private server repo, or wherever)? I'm very new though, so I might very well have it entirely wrong.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago

      That is my understanding of how it works too!

    • @albucc
      @albucc 1 day ago +1

      Nowadays spreadsheets and documents can be stored as text (XML: xlsx, docx)... but the big problem with these is that they have a lot of non-semantic garbage embedded in them: "place this text in bold... increase the column width... change the font from Comic Sans to Arial... place the borders nicely... move the yellow duck from the middle to the top right"...

    • @jasondbaker
      @jasondbaker 1 day ago

      @@paultapping9510 Traditional VCS solutions used diffs, but the problem was that the performance of diff-based solutions suffered as the number of file changes grew over time. The VCS would need to process hundreds or thousands of diffs just to present a particular version of a file. People sometimes avoided branching in these solutions because it could take a long time for the VCS to render all the files in the branch.
      Back in 2005, Linus recognized this performance issue and knew that disk storage was cheap. Git doesn't store code changes as diffs; rather, it makes a complete copy of a changed file, or stores references to files which hash the same (kind of like de-dup). It can also compress files for storage savings. This methodology of storing complete file versions is one reason that Git is so fast when switching branches: there's no overhead involved in rendering file versions.
      The downside is that very large files (e.g., spreadsheets) with lots of changes may quickly consume large amounts of storage. This storage is consumed in your local Git repository on your workstation and in a remote repository hosted somewhere like GitHub when you push the commits upstream.

  • @Fitzrovialitter
    @Fitzrovialitter 1 day ago

    2:39 Do you know what "it's" means?

  • @simonabunker
    @simonabunker 11 minutes ago

    Dependencies shouldn't be defined by a VCS. There are much better tools for creating build and runtime environments. Quite a few companies in my industry use an open source system called rez, and it's pretty good. You define the other packages you depend on, using semantic versioning, in a package file. You then request a set of packages at runtime, and the system goes and brings in all the relevant dependencies and makes sure that they are compatible. This environment can also be frozen in time and rolled back. I think there are quite a few similar environment control systems.

  • @AndrewBlucher
    @AndrewBlucher 1 day ago

    One issue I have always seen with Git is the complexity of the operations it offers. The complexity appeals to bro programmers who think they are heroes. Version control systems are vital, but like code itself, if you cannot explain it to your mum then you don't understand it. And that's a problem waiting to bite you.

  • @chrisnuk
    @chrisnuk 2 days ago +1

    I've been using AI for a few weeks now. It's something that works pretty well if you are disciplined; by that I mean give it a set of interfaces and ask it to create some tests. Tests are an odd sort of code where you want code duplication, and let's be honest, we all copy and paste a previous test as a template, which means there are invariably bugs. So letting the AI write them makes sense.
    I don't understand people who ask what comes next after Git. I want to shout back: we work out how to use it best!

    • @PavelHenkin
      @PavelHenkin 2 days ago +1

      I disagree that you want duplication in any codebase, even test code. It doesn't become as painful as quickly, but your test suite will become unmaintainable just as surely as your prod code if you don't refactor the duplication as you go.

  • @RoamingAdhocrat
    @RoamingAdhocrat 2 days ago

    Betteridge's Law applies to thumbnails, right?

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago +1

      Well, kinda, but I am using “Git” as a shorthand for Version Control, because the value of version control has been significantly compromised in the ways that I describe in the video.

  • @12q8
    @12q8 2 days ago

    Hah!
    I've seen entire Oracle EBS systems with NO version control ever backing them. When asked how they get the latest version, they just tell the DBAs to dump a copy of prod into dev/test LMAO

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago

      I have seen things like that too, I usually say “catch up to the 1980s and use version control” 🥴

  • @allenbythesea
    @allenbythesea 1 day ago +3

    I hope it's dying. I've said for the longest time that the git workflow is insanely bad for everything except open source long-tail development. It's horrendous for in-house devops-style development, as it discourages frequent commits and updates. We recently moved several projects back to SVN.

  • @qj0n
    @qj0n 1 day ago

    VCS is essential, but git is hugely overrated. It was a huge improvement over its predecessors, but it was built for open source, not the enterprise. Older systems like P4 or TFS were centralised, which allowed for better support of binary files; git LFS is trying to mimic that, but it's quite problematic itself. Additionally, in continuous delivery processes, trunk-based development is more natural in a linear VCS.
    That being said, while there are areas where a VCS for the enterprise could be better than git, the popularity of git makes it more expensive to look for an alternative than to just choose git.
    I think git will remain the main VCS until some next generation of VCS makes a breakthrough. I believe it will be something bringing a live pair-coding experience, like Google Docs.

  • @12q8
    @12q8 2 days ago

    I really think AI will excel in TDD *_if_* guided by a professional and experienced prompt engineer. lol

    • @NicodemusT
      @NicodemusT 2 days ago +2

      The idea that AI would stop to write TDD to appeal to human dogma is hilarious.

    • @12q8
      @12q8 2 days ago

      @@NicodemusT What if instead we write the test and tell the AI to write code that makes it pass? Rinse and repeat until you get what you want.

    • @NicodemusT
      @NicodemusT 2 days ago

      @@12q8 TDD is driven by human error. If a paradigm, like a payment flow, is perfected, making tests for it would be incredibly redundant. A tailored LLM would only contain perfected code in such a case, making tests hilarious overengineering made solely to appease gatekeepers who can't let go.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago +1

      Nothing to do with human dogma, but rather, as a way of checking your, or its, working. Without that you, and it, are only guessing at the solution. If you believe that you can catch mistakes by thinking hard and by understanding your code, I think you are missing the point of software development. It is about solving problems, not writing code, and you don't know that the problem is solved, however smart you are, until you try out the solution. Anything else depends on PERFECT PREDICTION of results and certainly AI can't do that, and Physics says it never can.

    • @NicodemusT
      @NicodemusT 2 days ago

      @@ContinuousDelivery Tests written by humans pose the same issues as the code they write: they're not infallible. That's anthropology, physics, and history.
      AI won't write tests for itself. That would be really stupid, given where its results come from. Letting go is hard, but necessary for the future.
      People writing tests for recipe blogs isn't the future. An open, accessible web is: accessible to everyone, not just cranky gatekeepers hellbent on their own exceptionalism. That's the past.

  • @vincentvogelaar6015
    @vincentvogelaar6015 2 days ago +1

    Oh, and…. I’m a huge fan…. Lol

  • @stanislauyan3204
    @stanislauyan3204 2 days ago

    I think there are serious issues in your arguments.
    First: what actually is "incremental" development? It is basically having a snapshot in memory of the current state and the current task to implement. It doesn't matter if you know all the previous snapshots, as long as they do not influence the current state or future development, and even in that case they are just another input to the AI. That level we already have with AI.
    The second mistake is that you assume we develop the system as a whole, while in reality we separate the system into smaller blocks and work on them. That is one of the reasons we invented functions and scope. And again, AI can operate on that level.
    The real reason why we don't have AI everywhere is just the limitations of current AI and its early stage. We will see more.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago

      Sure, the AI (to some limit of its capacity - currently its 'context window') can remember previous versions, but it can't discard one and step back to a known good state when it makes a mistake, it remembers the mistake on the same basis as the success. That prevents it from working incrementally, as someone else put it so clearly, "there is no 'Undo'".
      On your second point, that kind of *is* my point, yes we break things into pieces, but AI doesn't really, certainly not in a comparative way. Ask an AI to build a system for you, it will create it all, not build on what went before.

    • @stanislauyan3204
      @stanislauyan3204 2 days ago

      @@ContinuousDelivery And why do we need to remember the mistake? That concept is a purely human one.
      If you replace the human who made a mistake with a new human, he will be free of the memory of that mistake and will do as the AI does: take a snapshot of the current state and implement what is required. If you make a mistake, put that mistake into your input memory and repeat the implementation. I do not see why we cannot apply that to AI. They are dumb now, and you are correct that AI can produce wrong results; they are still probabilistic models, not really problem solvers as we are. But the results of their work already exceed the results of humans in some areas. It gives better answers than the average human.
      And the decomposition of the system that looks "problematic" could just be the wrong way of asking the question of the AI.
      I really think you restrain yourself with such arguments. And the assumption that we cannot give the AI our previous mistakes as input is not correct. If you tell even a modern AI "do not repeat this mistake", it will most likely not repeat it.
      There is a question: do we ask the AI correctly about what we need? And does the AI have the capability to give us the results we need?
      I believe the whole idea of incremental development is correct, but with AI we can take shortcuts and become controllers rather than executors. To achieve that we must modify our approach to development with AI, and you have a big challenge ahead in how to do it.
      We will not abolish basic rules such as incremental development, but that doesn't mean we cannot improve processes with AI.
      By the way, the initial topic also raises another question: do we need the "same" version to make the system stable and working? Because that brings us to the question of what the system is.

    • @davestorm6718
      @davestorm6718 1 day ago

      @@ContinuousDelivery I've been working with AI for some time now and, even with "good" prompting, have spent way too much time correcting code mistakes made by AI (it's nowhere near what some people claim it is). Anything bigger than a few functions and it often gets into a loop of bad coding, or inadvertently mixes design patterns or systems (for example, try to do a MAUI project and it injects WPF code and Xamarin code, all three of which are incompatible on so many levels, no matter how you specify your prompting). AI forgets its context a lot! If it doesn't forget it, it corrupts it! LLMs are pretty bad for coding altogether because, let's face it, they're glorified inference engines. An LLM doesn't have, nor is it truly capable of, a temporal reference or a spatial reference (it's effectively flat). Because of this, they aren't capable of doing something as simple as version management; anything you see that makes you believe it is doing so is illusory.
      Where current AI shines is where it can help you "shake out" new ideas and prevent re-inventing the wheel for SMALL stuff. I needed a function for a cryptographic process but was reluctant to incorporate a giant library into an app, so I asked it to create the particular method from scratch (and gave it some context, of course). It generated it in seconds and I was able to save a ton of memory and space (for a WASM app). Naturally, you want to verify any code generated by any LLM and test it thoroughly; you will find mistakes, or find that it made inefficient code.

  • @vincentvogelaar6015
    @vincentvogelaar6015 2 days ago

    Foundation models will fail for the reasons you say they will, but you aren't imagining what's directly around the corner in terms of AI application. 😊

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago

      My point is not that AI can never do this, but that this seems like a limitation of LLMs to me, and this is probably one of the bigger barriers on the path to AGI because if they can work incrementally, then they are learning dynamically, and I am guessing that that will take more than a bigger “context window”. I am pretty sure that AIs will be doing all the programming one day, but not just yet, and not until they can work incrementally, allowing them to make mistakes, recognise them, step back to a known good point, and try again.

    • @vincentvogelaar6015
      @vincentvogelaar6015 1 day ago

      I think it’s safe to say no LLM has reasoning abilities. Still, you can arrange a tree of experts.
      You can simulate TDD

    • @vitalyl1327
      @vitalyl1327 1 day ago

      @@vincentvogelaar6015 LLMs do have reasoning abilities. Just give them a Prolog, and let them write, test and debug proofs.

  • @eppiox
    @eppiox 1 day ago

    Looking up the definition of 'fundamental', I can see scenarios where it's not needed, but it relates directly to the scope/size/type of software development, risk included. Having worked with so many different types of developers, I'm super wary of absolute terms. You've probably all worked with the type of developer who would call an IRL tree a 'cellulose factory'.

  • @aaronbono4688
    @aaronbono4688 23 hours ago +1

    Stop saying AI will "understand"; it doesn't "understand" anything. It is not smart, it does not think; all it does is regurgitate patterns that it consumes from the internet.

  • @foppel
    @foppel 1 day ago +1

    AI is Japanese.. it will give you an answer, even if it is wrong..

  • @znahardev
    @znahardev 7 hours ago

    Very inconsistent video. There were only 32 bugs.

  • @vincentvogelaar6015
    @vincentvogelaar6015 2 days ago +2

    I don't understand how you can be so smart and not understand that you are attacking a straw-man AI thesis.
    Single-shot AI isn't what you should be arguing against!
    Think about a tree-of-experts application that is primed with multi-shot prompts, each expert operating RAG-style within its scope of expertise.
    ...
    Your argument doesn't hold water against this!!!!

    • @stalinthomas9850
      @stalinthomas9850 2 days ago

      @@vincentvogelaar6015 The current scope of ML cannot create good programs, let alone software. You have to understand that the fundamental principle of any sequence-to-sequence model is that it just predicts the next most likely token in the sequence. This is not programming! That fundamental principle goes against the very logic of programming. Programming is about converting logic into computer instructions. Even though we call it a programming language, it's not the same kind of language that these ML models are really good at. The idea you have shared doesn't work the way you think it does. There's just a lot of hype around this, and the fact of the matter is that we still have a long way to go.

    • @ContinuousDelivery
      @ContinuousDelivery 2 days ago

      Still not reproducible, and so still not amenable to building solutions incrementally.

  • @feralaca123
    @feralaca123 2 days ago

    Git is a tragedy

  • @jean-michelgilbert8136
    @jean-michelgilbert8136 2 days ago

    So Git sucks. It is obscure, breaks with large repos and binaries unless you use workarounds like LFS, sub-repos or some ad hoc thing like Epic's GitDependencies, and requires totally batshit insane workarounds when you need to go back to an older revision of a single file. The cure: Perforce. It's just way nicer to work with and doesn't require satanic incantations to do the most basic tasks. And yes, it can do DVCS. And yes, you can install a Git connector to allow your Perforce server to integrate with Git (might not be available in the free 5-seat version).

    • @edgeeffect
      @edgeeffect 1 day ago

      I know nothing of this "perforce" of which you speak... but feel compelled to agree on git's "satanic incantations".

    • @dloorkour1256
      @dloorkour1256 22 hours ago

      I came to git from Perforce about 9 years ago and used to feel that way sometimes. I still think the documentation sucks and the command naming/options feel like whatever popped into Linus' head the moment he first needed them. Now that I understand reset and rebase, use the reflog when needed, and make a habit of stashing or committing to a temporary branch before doing anything that might detach HEAD, I experience very little pain. Also, nowadays I use a GUI for most ordinary git tasks (committing, staging, branching, reset, rebase, stash push/pop/delete), so no pain there. There are also streamlined git command-line alias sets that improve the experience. For me, the easy branching and the lack of file locking/server connection requirements are big wins in git, plus the flexibility (which is a pain point initially).
      As for restoring a single file to a particular commit: git restore --source
      Although I'll confess to just copy/pasting from the GUI diff tool to get what I need.
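      (For reference, the full form of that restore command takes a source commit and a path; the values here are just examples:)

        git restore --source=HEAD~3 -- src/config.yaml   # the file as it was three commits ago
        git restore --source=v1.4.2 -- src/config.yaml   # or as it was at a given tag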