3 Key Version Control Mistakes (HUGE STEP BACKWARDS)

  • Published: 23 Nov 2024

Comments • 262

  • @username7763
    @username7763 5 months ago +126

    One thing I've noticed throughout my career is that a lot of teams are way too quick to throw away version control history. At nearly every company I work for, I end up having to argue against a plan that includes throwing away version history. This happens when switching version control systems (including ones that have an importer) or when restructuring repos. I've done quite a bit of version-control sleuthing on bug fixes where the history I'm looking for is years or even decades old. Often times version control history is the only documentation that exists for knowing why some complex logic exists and if it is still needed or what it should do.
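This kind of history sleuthing is well supported by git's "pickaxe" search, which finds commits whose diffs touch a given string, however old they are. A minimal sketch in a throwaway repo (file names and commit messages are invented for illustration):

```shell
# Build a tiny throwaway repo so the search has something to find.
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name Demo

echo 'retry_limit = 3' > config.txt
git add config.txt
git commit -qm "Add retry limit: upstream API drops ~1% of calls"
echo 'retry_limit = 5' > config.txt
git commit -qam "Raise retry limit after the vendor outage"

# Why does this setting exist, and who changed it? -G matches any
# commit whose diff contains the regex, anywhere in history.
git log -G 'retry_limit' --oneline
```

`git log -S 'retry_limit'` is the stricter variant: it matches only commits that change the number of occurrences of the string, i.e. the ones that introduced or removed it.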

    • @dalehagglund
      @dalehagglund 5 months ago +7

      I absolutely agree regarding the value of preserving history, and I've also fought off plans to do exactly that several times.

    • @tobibobibo
      @tobibobibo 5 months ago +5

      I'm somewhat in between, because yes, I do know the importance of version history, but I often see version history used wrong and generally misunderstood, and I can even name myself as a culprit here. Version history works at its very best when each step in the history reflects a clean-cut modification to the code base: a meaningful, complete change that has been verified from start to end and adds a measurable outcome, be that a feature addition, a refactor, or a bug fix. Where commits don't add value is when they represent an intermediate step on the way towards one of those goals, because they can't be perceived as one concrete commit that can reasonably be reviewed as a complete change. The shallower they are the worse, as they just reflect an intermediate step towards the goal. Examples are end-of-day commits, or commits that are essentially just snapshots of state on the way to a bigger goal.
      So, am I saying that you should not save your work? No, definitely not. Commit all you want. It's important! But do allocate work at the end of a complete feature, bugfix, or refactor to consolidate it, so that all the subtle, irrelevant small changes are melted together into a meaningful chunk that reflects the combined work of the specific task instead of reflecting the workflow you had that day.
      This is an ideology I keep trying to get people to remember: version history shall reflect the increments in the code, not the personal workflow that developers have (unless that can be said to have a major impact on the outcome, of course).
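One way to do that consolidation is a soft reset, an editor-free alternative to `git rebase -i`. A sketch in a throwaway repo (file names and messages are invented):

```shell
# Throwaway repo with three "workflow" commits that we then collapse
# into one reviewable change before sharing.
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name Demo
git commit -qm "Initial commit" --allow-empty

echo step1 > feature.txt && git add feature.txt && git commit -qm "wip"
echo step2 > feature.txt && git commit -qam "end-of-day snapshot"
echo done > feature.txt && git commit -qam "more wip"

# Rewind the branch pointer three commits; the final state stays staged...
git reset --soft HEAD~3
# ...then record it as one complete, meaningful commit.
git commit -qm "Add feature X as one verified, reviewable change"

git log --oneline    # just the initial commit and the consolidated one
```

Nothing in the working tree changes during the reset; only the branch pointer and the resulting history do.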

    • @disgruntledtoons
      @disgruntledtoons 5 months ago +12

      "Often times version control history is the only documentation that exists for knowing why some complex logic exists and if it is still needed or what it should do."
      I blame the "good code is self-documenting" crowd for situations like this. Some devs and dev managers have a woody for deleting comments. My viewpoint is that I'd much rather see a comment I don't need than need a comment I don't see, and the time spent reading an unnecessary comment is far less than the amount of time spent trying to learn a code base that wasn't documented properly.

    • @dalehagglund
      @dalehagglund 5 months ago +4

      @@tobibobibo I agree fully with your view that history should reflect a "final version" of what happened during development, not a rough draft based on how the dev actually made the changes. (I've heard this called the "clean history" approach, and I first ran across it on the Linux kernel mailing lists.) I've worked with this method and it's great, but it requires the focus to do it, and intermediate git skills as well. I would love to always have a clean history, but a messy one is much better than an abandoned one. (In my view, at least.)

    • @JanVerny
      @JanVerny 5 months ago

      @@disgruntledtoons The problem I personally see with comments is that they are, by nature, merely comments. They will be incorrect, they will be as hard to understand as your code, and in the end they have no impact on the program itself.
      I think it's much better to put in the effort to create clear structures that show your intentions through architecture, naming and (sometimes even) creating a strong coupling in your code to make sure that nothing is lost to someone who didn't read some random comment at the top of the file. If you can't do that, I don't think you're going to put in the effort to write good comments either.

  • @PhilmannDark
    @PhilmannDark 5 months ago +69

    Version control gives you "undo" in software projects. No one would use a word processor without undo.

    • @kayakMike1000
      @kayakMike1000 4 months ago

      You're right: it doesn't matter which version control you use, what matters is that you use version control.
      You could do version control by hand if you wanted to, but that won't really scale.

    • @ytubeanon
      @ytubeanon 4 months ago

      there's an undo button in Gitkraken

    • @vexorian
      @vexorian 4 months ago

      Sure, if everyone was rational and had the foresight to understand that undo is important.
      And yet, have you seen web3? Tons of people who think that rollback is a completely tertiary requirement and not like, vital.

    • @codinghusky5196
      @codinghusky5196 4 months ago +2

      @@ytubeanon .......gitKraken is just a GUI for git, which is version control... I don't quite understand your point?

    • @ytubeanon
      @ytubeanon 4 months ago

      @@codinghusky5196 there's a literal Undo button in the GitKraken toolbar, I don't know what the equivalent is in git command line terms, but it's there
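For reference, there is no single `git undo`; what a GUI's undo button runs depends on what you just did. A sketch of the usual CLI equivalents, in a throwaway repo with invented file names:

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name Demo

echo v1 > file.txt && git add file.txt && git commit -qm "v1"
echo v2 > file.txt && git commit -qam "v2"

# "Undo" the last commit but keep its changes staged:
git reset --soft HEAD~1
git commit -qm "v2 again"    # easy to redo, nothing was lost

# "Undo" a commit that is already shared, without rewriting history,
# by adding an inverse commit (here restoring the v1 content):
git revert --no-edit HEAD

# The safety net behind any undo: the reflog records where HEAD has
# been, so even "undone" commits stay reachable for a while.
git reflog | head -3
```

For a single file in the working tree, `git restore path/to/file` (or the older `git checkout -- path/to/file`) is the usual "undo my edits" move.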

  • @MrLeeFergusson
    @MrLeeFergusson 5 months ago +45

    I use version control for basically everything these days: writing letters, my study notes, config files. To never worry about a mistake setting you back more than a few commits is a powerful thing indeed.

    • @Gregory-o6v
      @Gregory-o6v 5 months ago

      Agreed. I use git for software development projects and rcs for writing projects. I don't need git's extra features / complexity for my writing projects.

    • @jasonfreeman8022
      @jasonfreeman8022 5 months ago +8

      I’ve often wondered, watching my C-level wife wrangle with loads of documents, why they don’t manage their spreadsheets and Word docs using uncompressed XML and Git. I’ve seen my wife at apoplectic levels, crying about how their financial model du jour got the way it did. Meanwhile I’m thinking “git log …”

    • @keepgoing335
      @keepgoing335 5 months ago

      @@jasonfreeman8022 At my shop we use SharePoint's version control feature for Word, PPT, and Excel. Our BA and QA colleagues don't want to learn Git, so SharePoint version control is something they prefer; it lets us include comments along with each version update. I can see the benefit of using Git for Word, but reading diffs of XML to trace back changes seems quite difficult.

    • @Yorgarazgreece
      @Yorgarazgreece 5 months ago

      Yeah, I don't see this as an exclusively coding tool. I use it for random stuff too.

    • @amandaosvaldo953
      @amandaosvaldo953 4 months ago

      Me too :D

  • @Hofer2304
    @Hofer2304 5 months ago +15

    A VCS should be taught as early as possible. Git has the advantage that you can use it offline. You can even use Git on your phone. You won't use your phone for a serious project, but for toy projects it's a good tool. And if you use Git for toy projects, where you can experiment, a Git disaster in a serious project is less likely.

  • @Skiamakhos
    @Skiamakhos 4 months ago +3

    Where I work we tend not to do rollbacks to previous versions, at least not in production. If a bug is introduced it can swiftly be tracked down to a specific unit of work and a specific commit. That commit can be reverted, and a hotfix produced with the reversion, which increments the minor version of the main branch and the production code. Given that each release could represent 100 separate JIRA tickets across maybe 5 teams, this more surgical approach works better for us than removing all the work that worked well in a given release.
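That surgical revert looks roughly like this in git (a sketch; the ticket numbers and files are invented):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name Demo

# A release made of several independent units of work:
echo a > a.txt && git add a.txt && git commit -qm "TICKET-1: feature A"
echo b > b.txt && git add b.txt && git commit -qm "TICKET-2: feature B (buggy)"
echo c > c.txt && git add c.txt && git commit -qm "TICKET-3: feature C"

# Surgically remove only TICKET-2, keeping A and C in the release:
bad=$(git log --format=%H --grep='TICKET-2')
git revert --no-edit "$bad"

ls    # a.txt and c.txt survive; b.txt is gone
```

Because `git revert` adds an inverse commit rather than rewriting history, the hotfix can ship as a normal forward release.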

  • @kozas0
    @kozas0 2 months ago

    Thanks, great video. At some point I thought you were advocating for monorepos, which to some extent you do (and that's great), but then I realized you want to address a few more general ideas.
    The reality is that in S.E. most people don't understand Git, or version control in general; they only learn some commands for common tasks, which is counter-productive because now V.C. appears to them as some sort of "enemy", an obstacle they have to overcome, instead of a tool that helps them maintain their code. An S.E. who doesn't understand V.C. trembles in fear of having conflicts with their coworkers, while one who understands it embraces those conflicts and, if anything, would like to see more. For example, you refactor a function on your branch along with all its uses, but on my branch I introduce a new usage of the pre-refactored function, in a new file. We both merge into the main branch, and Git will tell us both that everything is fine... of course the code will probably fail to even compile after that. If anything, I would want a future Git to at least understand code semantics and give us conflicts in situations like this, instead of treating code as plain text.
    In a similar way, the monorepo idea is largely dismissed in favor of "per project" repositories. At my last job we were about 15 engineers and there were 300+ repositories, because the lead engineer there didn't understand V.C. at all and was over-obsessed with creating new repositories, most of the time copying existing repositories into new ones to make some changes there... madness, the complete opposite of DRY. Many companies are hiring DevOps engineers and then asking them to do what system engineers used to do, sorting out the runtime stuff, without having a say in how the SDLC should be improved.
    Subscribed!

  • @username7763
    @username7763 5 months ago +14

    I'm not sure about a git successor, but I've used a lot of different version control systems and git absolutely could be improved on. It does a lot of things right. Probably the big one that git and other DVCSs do is allow easy branching and merging; a lot of prior systems didn't handle this well at all, and even SVN didn't have merge tracking for a long time. But there is a ton we lose with git too. The command-line commands are pretty random and arbitrary. It feels like a random pile of shell scripts (which is partially correct). There is no way to manage which branch should integrate into which. No permissions. No file locking (yeah, LFS is a hack). No bug-tracking integration built in. You have to git stash way too much to do basic things. Because a branch is a pointer, commits don't track which branch they were made on. Deleting a branch can delete history. So yes, git is an improvement on many other systems, but it is also a step back. First we should get back what we lost, and then we can talk about the future.

    • @TonyWhitley
      @TonyWhitley 5 months ago +5

      I used Perforce for many years and it (mostly) did what I wanted. Git does what it wants and, if you're lucky, tells you what it didn't like about what you did, but mostly you're left copy/pasting obtuse error messages into a search engine in the hope that someone else has already managed to disentangle the word salad into a bunch of impenetrable git commands that will recover the situation.
      I have no idea what Git's local repo is supposed to offer in a world of 100% Internet connectivity, and the unnecessary complexity involved causes so many problems.
      I don't know what comes after Git, but it has to focus on what version control is trying to *achieve*, not just throw out a large number of often contradictory commands and tell you to sort it out for yourself. It's at the pre-smartphone era; I'm waiting for the iPhone.

    • @username7763
      @username7763 5 months ago +2

      @@TonyWhitley Yeah git met Linus's need to integrate patches from contributors all around the world. It wasn't supposed to be the one and only version control system people use. I also have good things to say about Perforce. Sometimes it makes sense to pay for software that you use every day to have something a little nicer and easier.

    • @mycroftholmesiv
      @mycroftholmesiv 5 months ago +1

      Git isn't perfect, but it is better than a number of the other SCMs that were out there. (File locking... ick.)
      Git's submodules and subtrees are still not at a level I would prefer (hard to keep code DRY and share code).
      But I would love to see Git handle secrets better.
      I am surprised that he's glossed over the traceability that Git offers (blame, etc.). As anyone who's had to respond to an audit can tell you, being able to determine who authorized a code change, and when it was built, tested, and placed into production, becomes critical... far more so than simply the ability to roll back a change.
      In fact, I'd argue that the branching/tagging which allows you to maintain multiple versions of a code base simultaneously is a critical feature.
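For audit-style questions, git's built-in traceability looks like this (a minimal sketch; the file, author, and message are invented):

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email alice@example.com && git config user.name Alice

echo 'rate_limit = 100' > service.cfg
git add service.cfg
git commit -qm "Lower rate limit per security review SEC-42"

# Who last touched each line, and in which commit:
git blame service.cfg

# The who/when/what trail for a path, audit-report style:
git log --format='%h %an %ad %s' -- service.cfg
```

Combined with signed tags and commits (`git tag -s`, `git commit -S`), this gives a verifiable record of who changed what and when.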

    • @quademasters249
      @quademasters249 5 months ago +4

      My biggest issue with Git too. I fundamentally like how it works with a local version and a remote version, but anything at a higher level is just guesswork and looking at obscure internet docs.
      Oftentimes I'll run into an issue like a detached HEAD and have to cache off a copy, because I never know when fixing the detached state will trash my current changes. Sometimes it does and sometimes it doesn't.
      It's very "Linux": the basics are never updated while features advance forward.
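For what it's worth, the non-destructive fix for a detached HEAD is to give the detached commits a branch name before switching anywhere else. A sketch in a throwaway repo:

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email demo@example.com && git config user.name Demo

echo v1 > f.txt && git add f.txt && git commit -qm "v1"
echo v2 > f.txt && git commit -qam "v2"

# Enter a detached HEAD state by checking out an old commit directly:
git checkout -q HEAD~1
echo experiment > f.txt && git commit -qam "experiment while detached"

# The safe exit: name the detached commits before going anywhere else.
git switch -c rescue-branch

git log --oneline -1    # the detached work is now safe on a branch
```

Even if you switch away first, the commits are not gone immediately; `git reflog` will still show their hashes until garbage collection runs.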

    • @Hofer2304
      @Hofer2304 5 months ago +2

      Don't branch! If you branch on your local computer because you want to try something, it may be okay, but don't keep it around for long.

  • @nardove
    @nardove 4 months ago +5

    It is a shame that all the knowledge that could be shared always has a negative connotation; I would love to see more positive videos on this channel.

  • @ersia87
    @ersia87 4 months ago +1

    I only work in traditional coding environments, but I have used no-code systems a few times when helping out a friend with their website, and I'm also responsible for my housing community's website. The lack of version control in those systems has made me terrified and ready to pull out my hair every time I've used them. Each time something breaks I have to mentally step back through each step to figure out what went wrong, and if I make a big change and realize it wasn't good, tough luck. At least one of these frameworks has "work-in-progress" changes that don't take effect until you press publish. But the other framework doesn't even have that: every little change is instantly published.

  • @villekauppila9807
    @villekauppila9807 4 months ago +5

    Very good points. My personal gripe with low-code systems is that they often don't integrate well with version control. If the app's state is, for example, in a binary or similar format, VC doesn't help much.

    • @alias914
      @alias914 4 months ago

      Or a CAD project.

    • @charlesbyrneShowComments4all
      @charlesbyrneShowComments4all 4 months ago

      Yeah, for Crystal Reports I literally wrote an app to export the object definitions in an XML structure. I wasn't able to get everything, but enough that mattered, so the binary report also had an XML. It wasn't perfect but it worked. Git didn't handle binaries well, but those were earlier days and we used Subversion, which handled binaries better.

    • @FlapMeister
      @FlapMeister 4 months ago

      Mendix uses git

    • @CallousCoder
      @CallousCoder 4 months ago

      @@FlapMeister But it's hell in CI/CD and external configuration per environment.

    • @FlapMeister
      @FlapMeister 4 months ago

      @CallousCoder That's not my experience, but I have no frame of reference. Deploying seems real easy to me. Development also. The only thing I don't like about Mendix is that I sound like I work on men's reproductive organs.

  • @esra_erimez
    @esra_erimez 5 months ago +1

    This brings succinct clarity to a critical topic that we all too often take for granted.

  • @TeunSegers
    @TeunSegers 4 months ago +1

    There are certainly scenarios where this approach can work well, but it appears more niche than presented here. Each micro-service should ideally have its own independent lifecycle. Grouping them all in one repository, presumably using submodules, for validation and deployment presents an interesting concept. However, I doubt this would scale well or offer much flexibility, especially with distributed teams or heterogeneous tech stacks. The fundamental challenge is governance over your application landscape, which should include system integration level version validation. This is only one of many valid approaches, and others might be better depending on the circumstances. For instance, using a dedicated testing solution that pulls packages from systems like NuGet and npm, validates them against a specific maturity environment, and provides clear reporting on the health of that constellation might address your concerns in a more flexible and robust manner. While you could apply the Git approach to such a system, what would it add? It would likely complicate the incorporation of software not under your direct source control, such as external packages, off-the-shelf products, or SaaS services. Overall, this solution seems too idealized for practical application in diverse environments.

  • @polovne
    @polovne 4 months ago

    1. Using a Known Good Set (KGS): it's often done outside the VCS by pinning the versions of all the parts.
    2. Sometimes we set up an autocommit interval for non-coders. It's not a matter of coding but of education.
    3. The main phrase here is "gathering documentation". It's not magic. Also a part of education.
    So it's not really about version control, but more about how to be efficient outside version control.

  • @ArneBab
    @ArneBab 4 months ago +1

    Building incrementally does not allow infinite scaling: the small steps we build require coordination and long term organization which incur a fixed cost on every later step. To sidestep that, we try to build newer languages that avoid the small missteps of old but make new mistakes and require relearning how to best solve other problems. So we cannot actually avoid the cost of larger systems. All we can do is to turn it from a linearly rising cost to something that more likely is a logarithmically rising cost. And by getting used to how we do things, we can turn them from a cognitive load into a habit that has much lower cost, but is much harder to change.

  • @GenoppteFliese
    @GenoppteFliese 4 months ago +1

    1) I've seen too many projects trusting external modules in the "LATEST" version, not a clearly defined specific version (that didn't include a ransomware attack in "LATEST").
    2) I've seen too many projects trusting external stuff that was gone the next day and no proper local backup existed.
    3) I've seen too many projects not recording the tool versions required to rebuild the software exactly as it was 10 years ago.
    In the past one of our engineers had to rebuild our software from vaulted tapes on a brand new machine and this was watched by a lawyer, because of contractual obligations.

  • @ArneBab
    @ArneBab 4 months ago

    The key to defining the combination of versions is a shell repo: a repository that *only* contains subrepositories, and maybe some pipeline definitions. As a concept this was proposed by Mercurial, and in my experience it is the only way to do the exact linking subrepos provide and retain your sanity in the long run.
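In git terms, the closest analogue is a superproject whose only contents are pinned submodule references. A sketch using local throwaway repos (the `protocol.file.allow` override is only needed because the example clones from the local filesystem):

```shell
set -e
base=$(mktemp -d) && cd "$base"
mkrepo() { git init -q "$1" && git -C "$1" config user.email d@e.co \
           && git -C "$1" config user.name Demo; }

# Two independent component repos:
mkrepo server
echo app > server/app.py && git -C server add . && git -C server commit -qm "server v1"
mkrepo client
echo ui > client/ui.js && git -C client add . && git -C client commit -qm "client v1"

# The shell repo: nothing but pinned references to the subrepos.
mkrepo shell && cd shell
git -c protocol.file.allow=always submodule --quiet add ../server
git -c protocol.file.allow=always submodule --quiet add ../client
git commit -qm "Pin server v1 + client v1 as a known-good combination"

# .gitmodules plus one gitlink entry per subrepo define the exact combination:
git ls-files
```

Each commit in the shell repo records one exact, reproducible combination of subrepo versions; updating a pin is itself a reviewable commit.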

  • @charlesgaskell5899
    @charlesgaskell5899 4 months ago

    As a seasoned Endevor administrator (Endevor is the main software configuration management tool used on the IBM mainframe platform), a fascinating video which I mostly agree with.
    I'm from a modularized, mostly "waterfall" school of the software development life-cycle, so incremental change, being able to reliably fall back to a previous version, and testing from a known position (which may include known bugs and feature lacunæ) is a key requirement as I see it.

  • @yrtepgold
    @yrtepgold 4 months ago +1

    I agree with every point you made except the last part, when you talked about AI prompts not being able to generate iterative improvements. The issue in that scenario is not the tool, it's the user. An engineer needs to have prompt-engineering skills. Once you begin a conversation, you need to be able to work with the AI model to make small improvements. Using the tool in the same way that humans naturally program by themselves yields the best outcomes, whereas using the tool to create everything from scratch in one go is asking too much of it.

    • @JasonKaler
      @JasonKaler 3 months ago

      Exactly. There's a video "Can AI code Flappy Bird? Watch ChatGPT try" where the user gets AI to iteratively build a game.

  • @simonabunker
    @simonabunker 4 months ago

    The best feature of git is that you can develop a feature on a branch while the main branch always remains releasable. Previous version control systems were not good at merging, which put a lot of people off using branches.

  • @nviorres
    @nviorres 4 months ago +1

    Simply put but pure gold. Thank you Dave

  • @16randomcharacters
    @16randomcharacters 4 months ago

    I completely agree that VC is critical, but I'd say that containerization (and distributables of other formats) somewhat eat into the argument that VC ensures predictability of compatibility. Distributables both embed dependencies outside of purely the code in VC, and produce a roll-back-able artifact.

  • @gammalgris2497
    @gammalgris2497 3 months ago

    Dunno, Git is fine as is. I've had the pleasure of working with some other source code versioning tools. If you keep it simple it's a big time-saver, especially if you have to track changes within directory trees. Other tools were more cumbersome (i.e. locking files and stuff). Git with a decent diff and merge tool is a huge help.

  • @briancolfer415
    @briancolfer415 4 months ago

    An important consideration not captured in this video is that almost always the application that will be used is defined by external dependencies: jars, gems, wheels, crates, etc., and transitive dependencies. Version control is crucial, but it is not enough to define the state of the system. The definition of how the software artifact was constructed is ultimately the crucial component for reproducing the artifact.

  • @nathanturner6091
    @nathanturner6091 4 months ago +1

    Also useful for configuration; you just need to be careful to handle secrets securely.

  • @PovlKvols
    @PovlKvols 3 months ago

    I absolutely agree. Thank you for sharing!

  • @vk3fbab
    @vk3fbab 5 months ago +1

    Version control is a subject guaranteed to get the developer arguments flowing. The mistakes I have seen are backup files committed to the VCS, showing someone didn't really understand what the VCS did. The other is using branches to avoid merging into a release branch: a dev creates a feature branch and then expects that feature branch to pass QA before it gets merged into the development branch, whereas customers only care that the feature actually works in their release; the fact that it worked on a development branch means nothing. I work with a lot of people who avoid TDD and TBD, hence we have lots of integration challenges and also groundhog days with bugs that would have been caught by TDD.

    • @mecanuktutorials6476
      @mecanuktutorials6476 5 months ago

      TDD and feature branching are independent concerns that can and often do coexist. You're probably thinking of feature flags, which hide or disable features but let their code onto the main branch. That's fine, but it can also cause the code to become bloated with unused features. Feature branching keeps the main branch clean.

  • @markrosenthal9108
    @markrosenthal9108 5 months ago +2

    To make things work together, I think that it helps to think of what you are versioning as an ENVIRONMENT, everything you need to make the "system" work, not just code, but also your Jenkins, and Ansible scripts. Use a branch to build an environment from dark, and use tags to provide version control. The trunk then becomes the complete serialized image of your production environment. "No junk on the trunk".

  • @xybersurfer
    @xybersurfer 4 months ago

    AI generating different answers to the same question is due to a setting usually called "temperature", which makes the output less predictable the higher it is. It was probably created as a way to mask the limitations (I also hate this, because I too value reproducibility). If I'm not mistaken, generative AI can be fed its previous output to improve upon. I think the focus should be more on the human making incremental changes to the specifications; the AI working incrementally is more of a detail that may or may not be necessary to implement the specifications.

    • @TomVahlman-bz9nj
      @TomVahlman-bz9nj 4 months ago +1

      Yes, you cannot blame the machine for the human not setting clear guidelines for it to work within. The human must improve the guidelines in small steps :)

  • @edgeeffect
    @edgeeffect 5 months ago +6

    I've recently heard of, but not tried, a sort-of-new version control system, Jujutsu. I say "sort-of new" because it sits on git's very good back end whilst putting a slightly less tortuous user interface on the front end. That's the part of git that really, really does need to be replaced: git's user interface provides absolutely no abstraction whatsoever, and after over 20 years of banging my head against it, I still find it next to impossible to give it any love at all.

    • @ContinuousDelivery
      @ContinuousDelivery 5 months ago +1

      Yeah! Git's UI sucks!

    • @black-snow
      @black-snow 4 months ago +2

      Are we talking about the CLI? I don't get your point then. What level of abstraction should exist beyond switching branches, committing changes, or even rewriting history? I've only ever seen people struggle who haven't taken a minute to understand the one and a half underlying concepts you really need. The daily top-level commands do abstract away a bunch of things already.

  • @maxlutz3674
    @maxlutz3674 4 months ago

    Bugs seem to defy normal math rules: have 12, correct 8, be left with 5. It's especially so if the correction is marked "urgent, no time for tests first".

  • @piotr780
    @piotr780 4 months ago

    Docker images are also not stable: you pull latest, then overwrite it with the next version; if you don't store the image version, you don't know which one worked properly, etc.

  • @artemsapegin
    @artemsapegin 4 months ago +3

    Background animations make the video impossible to watch :-(

  • @AerialWaviator
    @AerialWaviator 4 months ago

    The issue of AI not being able to reproduce consistent solutions is serious. It reminds me of the ACID test (atomicity, consistency, isolation, durability) for transaction systems, which is what versioning, or iterative software releases, are.
    Regarding versioning software, it's important to version both the components that make up a system and the interfaces between components. Often it's just the code/text files that are versioned.

  • @georgelionon9050
    @georgelionon9050 4 months ago

    My two grievances with git are: end-user simplicity (some things are just too complicated, and the plumbing commands are not well enough hidden away; svn and mercurial were easier to get), and, as you said, referencing other git repos. svn had "externals"; git's equivalent, on the other hand, is a cluster of nonsense that gets weird as hell, and most sane people forget about it. Many wrote their own "module dependency repository" system on top of it, like e.g. npm etc. A better git would have this built in, like svn had.

  • @thought-provoker
    @thought-provoker 4 months ago

    Monorepo with infrastructure as code ftw.
    Nothing like being able to move back and forward in time up to "today" in real time.
    And whatever comes after git, must have at least that same capability.

  • @vladimir0rus
    @vladimir0rus 1 month ago

    I didn't get what is wrong with Git. Maybe the video is wrongly named? "Why you need to use a VCS" — am I getting it right now?

  • @kalmarnagyandras
    @kalmarnagyandras 5 months ago +3

    I would love to see an Abstract Syntax Tree based VCS, where you work on language constructs (functions, classes) instead of filesystem constructs (files). Most languages have an 'official' formatter that can output the AST as nicely formatted files for portability.

    • @ElProfesorBaltazar
      @ElProfesorBaltazar 4 months ago +1

      Unison language does something like this, but even more sophisticated for dependencies etc.

  • @simonabunker
    @simonabunker 4 months ago

    Dependencies shouldn't be defined by a VCS. There are much better tools to create build and runtime environments. Quite a few companies in my industry use an open-source system called rez, and it's pretty good. You define the other packages you depend on using semantic versioning in a package file. You then request a set of packages at runtime and the system goes and brings in all the relevant dependencies and makes sure they are compatible. This environment can also be frozen in time and rolled back. I think there are quite a few similar environment-control systems.

  • @johnvonachen1672
    @johnvonachen1672 5 months ago +3

    I think the next version of version control is a file system with version control built in. In other words, there would be no difference between the VCS and the file system; they'd be one and the same. Probably something where a database manages all files and your access is only through an interface to that database. Alright, now go and make that.

    • @edgeeffect
      @edgeeffect 4 months ago +2

      ... that filesystem already exists(???) ZFS (?????)

    • @johnvonachen1672
      @johnvonachen1672 4 months ago

      @@edgeeffect Cool

  • @bart2019
    @bart2019 5 months ago

    So... How would you version control 2 separate software projects that are meant to work together but work on separate platforms, for example a server and a client?

    • @ContinuousDelivery
      @ContinuousDelivery 5 months ago +7

      Version control works on text, and so is not platform specific. If you have different components of a system that work on different platforms, there is nothing to prevent you from keeping them in the same repo. Ideally do that, along with their deployment scripts for each of the different platforms. All completely doable, all fairly common practice.

    • @zedrikcayne
      @zedrikcayne 5 months ago +3

      Currently? Subprojects.
      The 'release' is a repo with the server and client in a subproject. As part of the release process we pin the version of the client with the version of the server.
      The staging environment is a branch off that repo. We merge our new version on, let CI/CD build it, and iron out all the problems, merging fixes back down into the subprojects. Then we merge that version to the 'production' branch, which builds and deploys to production directly, releasing client versions etc.

    • @CallousCoder
      @CallousCoder 4 месяца назад

      Use git submodules.

  •  4 месяца назад

    In the era of ancient computers, true version control was a job for the file system, and that is where version control truly belongs. Version control toys for source code are a poorly reinvented substitute. The spreadsheet argument is proof of that.

  • @MichaelSchuerig
    @MichaelSchuerig 5 месяцев назад +2

    What's next for version control? I hope for a way to better record the semantics of changes. In an ideal world you never have to rename things, move them around, or change signatures when you're in the middle of a task that's only tangentially related. In an ideal world, refactorings like these go into their own commit with its nice and helpful message. My world isn't always ideal, and so I keep hoping that one day VCS will be able to recognize these kinds of mechanical changes.

    • @cristianpallares7565
      @cristianpallares7565 5 месяцев назад

      I guess there's a potential communication context between a language server and a version control system. Not sure how big the benefits would be, though.

    • @qj0n
      @qj0n 4 месяца назад +1

      That would actually be not only a change to the VCS but to its whole relation to source code, and possibly it would require changes in the programming language itself. Still, an interesting area.
      I believe we will see a comeback of centralisation. Most systems are currently centralised, everyone has an Internet connection, and you often can't compile or run your software without one. Even open source, which drove the decentralisation of VCS, is now centralised on GitHub.
      That being said, centralisation and a constant network connection would enable a new experience, more like Google Docs, where you can quickly peek at your colleagues' changes live and possibly co-develop if you like pair programming or simply have changes in the same file.

  • @oleksandrsova4803
    @oleksandrsova4803 5 месяцев назад

    The only thing with this approach that is still an issue for me is that I don't see a clear way to define a *range* of versions of component A that component B depends on, only the single latest version in the repo. But it is not a big issue, as modern orchestrators demand a single version of each component anyway.

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад +2

      But “only the single current latest version in the repo” is presumably the one you have tested with all the others, so the one that you know works. The idea is to either test what combination will end up in production, or design things so you don’t really care.

    • @jasondbaker
      @jasondbaker 5 месяцев назад +1

      The few ways I can think of to define these kinds of version dependencies in vcs are either: a) put everything into a mono repo, or b) create a special “orchestration” repo which stores and tracks which software versions should be deployed in which environments. Both options suck because they create a constraint, a choke point through which all teams must pass. Hard wiring these sorts of changes also leads to unpleasant things like release trains and cat herding release managers. My preference is to encourage autonomy and leverage versioned apis and contract testing to validate whether or not an updated service is feasible to deploy to production.

    • @dalehagglund
      @dalehagglund 4 месяца назад

      @@ContinuousDelivery Hi Dave. First, thanks for creating this great channel. I would be fairly surprised if you haven't seen this, but for an on-prem, "enterprise", cross-site block storage system (so all quite old-school stuff) I worked on for quite a few years, we definitely ran automated system tests to check compatibility of the versions of the separate builds that had to work together. We were lucky to have built a very competent team of testers who were capable of creating and managing these complex automated systems.

  • @dlabor1965
    @dlabor1965 4 месяца назад

    The one and only coding God is you, David!

  • @georgebeierberkeley
    @georgebeierberkeley 5 месяцев назад

    I think I'm the only one left using MS Team Foundation Server. I haven't left it for (free, Copilot-enabled) Git because porting over all the history seems like a scary proposition.

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад +4

      Yes, I think you are the only one 🤣🤣

    • @natescode
      @natescode 4 месяца назад

      TFS is absolute garbage 🤮

  • @HunterMayer
    @HunterMayer 4 месяца назад

    You hit on a big point: save your AI interactions. Last year I started identifying the log files of the AI systems I use, and I capture those logs. Although some of them are proprietary, I've used AI to reverse-engineer what was said, and unless they start encrypting them I will continue to do so. The discussion you have with the system is important. If you save URLs in your code documentation for later reference, or even just in the check-in notes, it's essentially the same thing. And I find it extremely valuable to go back over discussions I've had with these systems and compare what was said then with the output of newer systems, because the systems have gotten better. I try not to rely too heavily on the output for critical systems without seriously vetting everything that comes out of it, but I'm probably preaching to the choir there.
    Some of the agentic solutions that are out there are doing this. And they even allow you to roll back. So this is a problem that is in the process of being solved. But this feature is not shiny and isn't getting a lot of attention. Yet.

  • @notthere83
    @notthere83 3 месяца назад

    This video seems strange to me.
    First of all, AI can already work incrementally: you tell it that something in its code is wrong and it will try to build on what it wrote previously. It doesn't always work well, but often enough for my taste.
    Secondly (actually earlier in the video), using version control to define which versions work together doesn't seem feasible. With large monorepos you might have dozens of people or more working on them; the chances that every project integrates well with the others at all times are about as high as with separate repos, because whether your integration tests run within a single repo or across multiple repos doesn't matter if you set them up accordingly. What should be emphasized instead is accurate semantic versioning and API contracts. THOSE are the things one can rely on, ideally. And if something goes wrong there, THEN version control can be used to go back to a state where, e.g., the contract wasn't broken yet.
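    A minimal sketch of the semantic-versioning idea mentioned above: a caret-style range check that treats any release with the same major version as compatible. This ignores pre-release tags and full semver's special 0.x rules; it's only meant to illustrate the "contracts break on major bumps" convention.

```python
def satisfies_caret(version: str, requirement: str) -> bool:
    """Minimal caret-range check: "^1.2.0" allows any version
    >= 1.2.0 with the same major version (i.e. < 2.0.0).
    Pre-release tags and semver's 0.x special cases are ignored."""
    req = tuple(int(p) for p in requirement.lstrip("^").split("."))
    ver = tuple(int(p) for p in version.split("."))
    # Same major (no breaking API change) and at least the required version.
    return ver[0] == req[0] and ver >= req
```

For example, `satisfies_caret("1.4.2", "^1.2.0")` holds, while `"2.0.0"` (breaking major bump) and `"1.1.9"` (too old) do not.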

  • @radui7468
    @radui7468 4 месяца назад +1

    Can you give us some examples? What you are saying is very abstract.

  • @rethardotv5874
    @rethardotv5874 5 месяцев назад +3

    AI can be prompted with the last iteration of the code to incrementally build on it, and GPT-4o even tries to run the code it proposes, inspects the results, and iterates on them to improve the output. At least ChatGPT's level of managing this is at junior-dev level, which is sufficient for an experienced developer to use it as a tool.
    Whoever argued for a git successor does not understand software. There is only one thing that git does horribly, and that's handling binary artifacts, but we don't need that: we can just add a file that points to them and a script that downloads them after checking out or updating the repository.
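    The pointer-file approach described above can be sketched in a few lines: keep a small text file with a URL and a checksum under version control, and have the checkout script verify whatever it downloads. The file format and field names here are invented for illustration.

```python
import hashlib

def parse_pointer(text: str) -> dict:
    """Parse a tiny hypothetical pointer-file format:
    one 'key value' pair per line, e.g. 'url ...' and 'sha256 ...'."""
    return dict(line.split(None, 1) for line in text.strip().splitlines())

def artifact_matches(data: bytes, pointer: dict) -> bool:
    """Verify a downloaded binary artifact against the checksum
    recorded in the version-controlled pointer file."""
    return hashlib.sha256(data).hexdigest() == pointer["sha256"]
```

Committing the pointer rather than the binary keeps the repo small while still pinning the exact artifact bytes, which is roughly what Git LFS does under the hood.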

    • @username7763
      @username7763 5 месяцев назад

      AI is not remotely at the junior-dev level... unless my standards are too high. I expect a junior dev to be able to fix bugs in an existing codebase: look at the code, look at the bug tracker, look at the version control history, run the software to reproduce the bug, make the change, test it. Sometimes this involves contacting the bug reporter for more information. AI is crazy impressive at times... for what it is. But I've had interns do more than it is capable of.

    • @rethardotv5874
      @rethardotv5874 5 месяцев назад

      @@username7763 I’d consider doing the things you describe reliably without handholding closer to mid level.

    • @username7763
      @username7763 5 месяцев назад

      @@rethardotv5874 Well I didn't mean no handholding. But it is one thing to need to come back to me with questions or when stuck on something and quite another to not be able to accomplish the task at all. AI systems are very far from being able to do basic software maintenance even with handholding.

    • @rethardotv5874
      @rethardotv5874 5 месяцев назад

      @@username7763 I totally agree on that

    • @martyndissington
      @martyndissington 5 месяцев назад +1

      "Whoever argued for a git successor does not understand Software"
      Is this a five minute argument or the full half hour?

  • @jasondbaker
    @jasondbaker 5 месяцев назад

    I agree that putting an Excel spreadsheet in version control is a good idea. I just wonder whether using something like Git would be practical in some scenarios, due to the way that Git stores changes, i.e., by making a complete copy of the file. I've worked in organizations that made daily changes to massive Excel files with millions of rows of data. Storing those changes in Git probably isn't practical.

    • @paultapping9510
      @paultapping9510 5 месяцев назад

      Is that how git works? I thought it just kept a log of the changes, which is what the diff is, and then the actual files were pushed to wherever you set it (so could be GitHub, could be a private server repo, or wherever)? I'm very new, though, so I might very well have it entirely wrong.

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад

      That is my understanding of how it works too!

    • @albucc
      @albucc 4 месяца назад +1

      Nowadays spreadsheets can be stored as text (XML: xlsx/docx)... but the big problem with these is that spreadsheets have a lot of non-semantic garbage embedded in them: "place this text in bold... increase the column width... change the font from Comic Sans to Arial... fit the borders nicely... move the yellow duck from the middle to the top right"...

    • @jasondbaker
      @jasondbaker 4 месяца назад

      @@paultapping9510 Traditional VCS solutions used diffs, but the problem was that the performance of diff-based solutions suffered as the number of file changes grew over time. The VCS would need to process hundreds or thousands of diffs just to present a particular version of a file. People sometimes avoided branching in these solutions because it could take a long time for the VCS to render all the files in the branch.
      Back in 2005, Linus recognized this performance issue and knew that disk storage was cheap. Git doesn't store code changes as diffs; rather, it makes a complete copy of a changed file, or stores references to files which hash the same (kind of like de-duplication). It can also compress files for storage savings. This methodology of storing complete file versions is one reason Git is so fast when switching branches: there's no overhead involved in rendering file versions.
      The downside is that very large files (e.g., spreadsheets) with lots of changes may quickly consume large amounts of storage. This storage is consumed in your local Git repository on your workstation, and in a remote repository hosted somewhere like GitHub when you push the commits upstream.
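      Git's content-addressed storage can be demonstrated without running git at all: a blob's object id is just the SHA-1 of a short header plus the file's full contents, which is why identical files are stored once and every changed version is a complete new snapshot.

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """Compute a Git blob object id: SHA-1 over "blob <size>\\0" + content.
    Identical content always maps to the same id, so duplicates are
    stored only once; a changed file gets a whole new blob object."""
    header = f"blob {len(content)}\0".encode()
    return hashlib.sha1(header + content).hexdigest()

# Matches `echo "hello" | git hash-object --stdin`:
print(git_blob_hash(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```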

  • @aaronbono4688
    @aaronbono4688 4 месяца назад +65

    Stop saying AI will "understand"; it doesn't "understand" anything. It is not smart and it does not think. All it does is regurgitate patterns that it consumed from the internet.

    • @Ratstail91
      @Ratstail91 4 месяца назад +15

      So they're as smart as the average user? XD

    • @aaronbono4688
      @aaronbono4688 4 месяца назад

      @@Ratstail91 yea, just about

    • @mAcCoLo666
      @mAcCoLo666 4 месяца назад +9

      Just like humans you mean?

    • @aaronbono4688
      @aaronbono4688 4 месяца назад +1

      @@mAcCoLo666 not all humans but perhaps most unfortunately.

    • @tupG
      @tupG 4 месяца назад +5

      AI (the algorithms that make up AI) is not smart. It's basically a very sophisticated pattern-searching calculator. You can train it to answer 1+1=4.
      Just as 47^4.5 is easy for a desk calculator, searching through vast amounts of data to fish out unexpected patterns that no human was able to spot is easy for AI.
      Trouble starts when influential people and corporations start to exploit this for their own benefit. It's a powerful tool for manipulation.

  • @Roney-m2k
    @Roney-m2k 4 месяца назад

    The content on this channel is always solid!

  • @pilotboba
    @pilotboba 4 месяца назад

    I've read some of your books, including the CD book by you and Jez.
    They seem to conflict, or I am not understanding: do you have two repos, one for the application and one for the configuration? What about IaC? Or should they all be in the same repo?

  • @davegraham9100
    @davegraham9100 4 месяца назад

    Commenting before listening.
    Before Git, a checked-in package would still work a year later, for whatever reason. Now, time all but guarantees it's broken.

  • @qj0n
    @qj0n 4 месяца назад

    VCS is essential, but git is hugely overrated. It was a huge improvement over its predecessors, but it was built for open source, not the enterprise. Older systems like P4 or TFS were centralised, which allowed better support for binary files; Git LFS tries to mimic that, but is quite problematic itself. Additionally, in continuous delivery processes, trunk-based development is more natural in a linear VCS.
    That being said, while there are areas where a VCS for the enterprise could be better than git, the popularity of git makes it more expensive to look for an alternative than to just choose git.
    I think git will remain the main VCS until some next generation of VCS makes a breakthrough. I believe it will be something that brings a live pair-coding experience, like Google Docs.

  • @Fitzrovialitter
    @Fitzrovialitter 4 месяца назад

    2:39 Do you know what "it's" means?

  • @chrisnuk
    @chrisnuk 5 месяцев назад +1

    I've been using AI for a few weeks now. It works pretty well if you are disciplined; by that I mean give it a set of interfaces and ask it to create some tests. Tests are an odd sort of code where you want code duplication, and let's be honest, we all copy and paste a previous test as a template, which means there are invariably bugs. So letting the AI write them makes sense.
    I don't understand people who ask what comes next after Git. I want to shout back: we work out how to use it best!

    • @PavelHenkin
      @PavelHenkin 5 месяцев назад +1

      I disagree that you want duplication in any codebase, even test. It doesn't become as painful as quickly, but your test suite will become unmaintainable just as surely as your prod code if you don't refactor the duplication as you go.

    • @maxlutz3674
      @maxlutz3674 4 месяца назад +1

      @@PavelHenkin The moment I need code in more than one place is the moment to create a method or function. Also, I am not afraid to create a base class when sensible.
      Duplicate code is a pain. It seems that duplicate code has a tendency to contain the bugs.

  • @aaronbono4688
    @aaronbono4688 4 месяца назад +1

    I have no idea where you get the notion that AI tools just create the whole solution in one fell swoop. Yes, it appears on your screen in one big lump, but it's usually not complete and you have to prompt it for more. And while we don't really know exactly how it works, if it's anything like how it builds sentences as we type, it generates one word at a time, so under the covers it could very well be building up the chunk of code a little bit at a time before it drops it in our lap.

  • @RoamingAdhocrat
    @RoamingAdhocrat 5 месяцев назад

    Betteridge's Law applies to thumbnails, right?

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад +1

      Well, kinda, but I am using “Git” as a shorthand for Version Control, because the value of version control has been significantly compromised in the ways that I describe in the video.

  • @tectopic
    @tectopic 4 месяца назад

    At 11:05 and onwards!❤

  • @ateijelo
    @ateijelo 4 месяца назад

    Betteridge's Law of Headlines

  • @vincentvogelaar6015
    @vincentvogelaar6015 5 месяцев назад

    Foundation models will fail for the reasons you say they will, but you aren't imagining what's directly around the corner in terms of AI applications.😊

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад

      My point is not that AI can never do this, but that this seems like a limitation of LLMs to me, and this is probably one of the bigger barriers on the path to AGI because if they can work incrementally, then they are learning dynamically, and I am guessing that that will take more than a bigger “context window”. I am pretty sure that AIs will be doing all the programming one day, but not just yet, and not until they can work incrementally, allowing them to make mistakes, recognise them, step back to a known good point, and try again.

    • @vincentvogelaar6015
      @vincentvogelaar6015 4 месяца назад

      I think it’s safe to say no LLM has reasoning abilities. Still, you can arrange a tree of experts.
      You can simulate TDD

    • @vitalyl1327
      @vitalyl1327 4 месяца назад

      @@vincentvogelaar6015 LLMs do have reasoning abilities. Just give them a Prolog, and let them write, test and debug proofs.

    • @vincentvogelaar6015
      @vincentvogelaar6015 4 месяца назад

      @@vitalyl1327 respectfully, they do NOT have "reasoning" abilities.

    • @vincentvogelaar6015
      @vincentvogelaar6015 4 месяца назад

      @@vitalyl1327 Ever played an RPG where some treasure chests are mimics? They have all the characteristics of a treasure chest, but they are not, ontologically, a treasure chest!!! LLMs are intelligence mimics: they have the characteristics of reasoning, but when you probe them, they can't do elementary reasoning when tested on examples not found in the data they were trained on.
      Thus, they're mimics. And this isn't a philosophical point; it's integral to the dialogue above.

  • @NachtmahrNebenan
    @NachtmahrNebenan 5 месяцев назад +2

    Thank you, Dave, for pointing out the bug in the Excel spreadsheet! Our minister of finance here in Germany knowingly acted on the wrong recommendation and did great harm to countries like Greece, Italy and Spain. Many people died. He never apologized.

  • @eppiox
    @eppiox 4 месяца назад

    Looking up the definition of 'fundamental', I can see scenarios where it's not needed, but it relates directly to the scope/size/type of software development, risk included. Having worked with so many different types of developers, I'm super wary of absolute terms. You've probably all worked with the type of developer who would call a real-life tree a 'cellulose factory'.

  • @DemiImp
    @DemiImp 4 месяца назад +1

    There are a lot of good insights in this video; however, your AI argument of "the only way AI could work is if AI did things EXACTLY how I would do it" is a flawed way of thinking. At some point you're going to have to think more flexibly and consider that AI might develop a better process that works with how it thinks, not with how you think.
    Also, the loss of reproducibility in AI, from what I understand, is just the result of a few parameters. The output of an LLM is a list of expected tokens with probabilities attached; the AI then randomly picks which one to use based on those probabilities. You could design an AI to always use random seed 0 and always produce the same output. You could also design it to pick the token with the largest probability and ignore the other tokens. Either of these would guarantee reproducible output. It sounds like you are talking about things you don't understand, and you should be more careful with your thoughts and words.
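    The reproducibility point can be sketched in a few lines: given a token distribution, greedy decoding is fully deterministic, and seeded sampling is random but repeatable. The toy distribution below is invented for illustration.

```python
import random

def pick_token(probs, rng=None):
    """probs: list of (token, probability) pairs.
    rng=None -> greedy decoding: always the most likely token.
    A seeded rng -> stochastic sampling, but fully reproducible."""
    if rng is None:
        return max(probs, key=lambda tp: tp[1])[0]
    tokens, weights = zip(*probs)
    return rng.choices(tokens, weights=weights, k=1)[0]

dist = [("cat", 0.5), ("dog", 0.3), ("fish", 0.2)]
assert pick_token(dist) == "cat"            # greedy: deterministic
a = pick_token(dist, random.Random(0))
b = pick_token(dist, random.Random(0))
assert a == b                               # same seed, same token
```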

    • @kriffos
      @kriffos 4 месяца назад

      In its current state, AI is not able to develop anything; despite the name, it is not intelligent at all. It's trained on human input and could possibly match it, but not surpass it. It uses statistical probability to create its answers, and wrong ones are still given. I think AI is overrated.

    • @DemiImp
      @DemiImp 4 месяца назад

      @@kriffos I don't think people use the word "intelligence" as synonymous with "the ability to learn". AI not being able to learn is currently a feature. You're able to ship a product and it will perform reliably. If all AI models learned, then they would become vectors of attack.
      Imagine a company using an AI chat bot to interact with customers and help support their product. If the AI was forced to learn while communicating with the customers, there would be malicious actors teaching it bad/incorrect information and corrupting it. LLMs not being modified with every interaction is a feature.

    • @kriffos
      @kriffos 4 месяца назад

      @@DemiImp Well, I think that's part of intelligence: being able to learn and apply knowledge to new problems. I do not think that has to be an ongoing process for AI, but that ability is also completely missing during the training phase. LLMs only preserve the status quo and do not bring anything of their own to the table. If there is no new human input, these LLMs are at a dead end and will not be able to improve.

    • @DemiImp
      @DemiImp 3 месяца назад

      @@kriffos Why do you think that ability to change and respond to new input is not a part of training? Being able to tell the AI to forget previous information and focus on new information is absolutely a part of their training. Have you actually spent time with things like chatgpt? I don't really understand how you could say that they don't learn from things you tell it.

    • @kriffos
      @kriffos 3 месяца назад

      @@DemiImp Exactly that: how ChatGPT and similar chatbots respond to your input is not training. Training takes place before the LLM is used in production; there is no real-time training as far as I know. If you tell them to forget about some things, that's just an additional rule you apply via your prompt. The same is true for the opposite: if you give additional information in your prompt, it is lost as soon as you leave the context. I think you do not get how these chatbots work and you think they are intelligent, but they are not. The output you get is basically mathematical probability based on trained data.

  • @Kitsune_Dev
    @Kitsune_Dev 4 месяца назад

    Can you please add time stamps so I can understand what you are talking about in a short summary?

  • @mariusg8824
    @mariusg8824 3 месяца назад

    With reproducible AI you could store a bunch of parameters like seeds and prompts, and gradually build towards some system. That would actually be really useful. But I guess current AI systems are not mature enough to freeze them in containers.
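    A minimal sketch of what "storing the parameters" could mean in practice: a small record of the inputs to one generation step, serialized so it can be committed alongside the code. All field names here are hypothetical.

```python
from dataclasses import dataclass, asdict
import json

@dataclass(frozen=True)
class GenerationRecord:
    """The inputs needed to replay one (hypothetically
    deterministic) AI generation step."""
    model: str
    prompt: str
    seed: int
    temperature: float

rec = GenerationRecord("example-model-v1", "Refactor the parser",
                       seed=0, temperature=0.0)
# One JSON line per step, suitable for committing to version control:
line = json.dumps(asdict(rec), sort_keys=True)
```

A log of such records would let you replay (or at least audit) how a system was built up, in the same way commit history does for hand-written code.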

  • @spacemanspiff85
    @spacemanspiff85 4 месяца назад

    Let's all go back to CVS

  • @12q8
    @12q8 5 месяцев назад

    I really think AI will excel in TDD *_if_* guided by a professional and experienced prompt engineer. lol

    • @NicodemusT
      @NicodemusT 5 месяцев назад +2

      The idea that AI would stop to write TDD to appeal to human dogma is hilarious.

    • @12q8
      @12q8 5 месяцев назад

      @@NicodemusT What if instead we write the test and tell the AI to write code that makes it pass? Rinse and repeat until you get what you want.

    • @NicodemusT
      @NicodemusT 5 месяцев назад

      @@12q8 TDD is driven by human error. If a paradigm, like a payment flow, is perfected, making tests for it would be incredibly redundant. A tailored LLM would only contain perfected code in such a case, making tests hilarious over-engineering done solely to appease gatekeepers who can't let go.

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад +1

      Nothing to do with human dogma, but rather, as a way of checking your, or its, working. Without that you, and it, are only guessing at the solution. If you believe that you can catch mistakes by thinking hard and by understanding your code, I think you are missing the point of software development. It is about solving problems, not writing code, and you don't know that the problem is solved, however smart you are, until you try out the solution. Anything else depends on PERFECT PREDICTION of results and certainly AI can't do that, and Physics says it never can.

    • @NicodemusT
      @NicodemusT 5 месяцев назад

      @@ContinuousDelivery Tests written by humans pose the same issues as the code they write: they're not infallible. That's anthropology, physics, and history.
      AI won't write tests for itself. That would be really stupid, given where its results come from. Letting go is hard, but necessary for the future.
      People writing tests for recipe blogs isn't the future. An open web, accessible to everyone and not just cranky gatekeepers hellbent on their own exceptionalism, is. That's the past.

  • @12q8
    @12q8 5 месяцев назад

    Hah!
    I've seen entire Oracle EBS systems with NO version control ever backing them. When asked how they get the latest version, they just tell the DBAs to dump a copy of prod into dev/test LMAO

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад

      I have seen things like that too, I usually say “catch up to the 1980s and use version control” 🥴

    • @CallousCoder
      @CallousCoder 4 месяца назад

      Did you work at the Dutch justice department? When I had to assist on Oracle EBS there, I nearly fainted when I saw they just manually copied and pasted configurations! I always said state employees are inept idiots; at that moment it was proven beyond a shadow of a doubt.

    • @12q8
      @12q8 4 месяца назад +1

      @@CallousCoder Nope, but it confirms that every state has the same issue: inept employees who never developed themselves or kept up with what the industry is doing. Lol.
      I thought it was just my government, but I keep hearing of and seeing more governments with the same issue.
      Ignorant, inept, and outdated, filled with bodged software customizations.
      I saw a similar issue with very big and old corporations, like banks, but not to the same degree as government.
      My best work experience has been in the private sector: telecoms and tech companies in general. They also seem to have the best and brightest minds.
      Even coworkers there shared the same or worse horror stories from working for some government department.

  • @AndrewBlucher
    @AndrewBlucher 4 месяца назад

    One issue I have always seen with Git is the complexity of the operations it offers. The complexity appeals to bro programmers who think they are heroes. Version control systems are vital, but, like code itself, if you cannot explain it to your Mum then you don't understand it. And that's a problem waiting to bite you.

  • @FatherGapon-gw6yo
    @FatherGapon-gw6yo 4 месяца назад

    Perforce makes git look like child scratch

  • @piotr780
    @piotr780 4 месяца назад

    merge hell is hell

  • @vincentvogelaar6015
    @vincentvogelaar6015 5 месяцев назад +2

    Oh, and…. I’m a huge fan…. Lol

  • @nezbrun872
    @nezbrun872 3 месяца назад

    There is nothing special in CI/CD that helps with reproducibility. In many ways it makes it harder.
    You're not testing the same "complex system" if you're not including ALL the moving parts, including ALL the components of the system with the same data. CI/CD explicitly makes regular full-system regression testing impossible in practice, not least by encouraging more and more microservices to be wired up at deploy time rather than build time.

  • @znahardev
    @znahardev 4 месяца назад

    Very inconsistent video. There were only 32 bugs.

  • @JanKowalski-vj9py
    @JanKowalski-vj9py 4 месяца назад

    Our company switched from Polarion to Git/Gerrit. There are no words in the English language offensive enough to describe this shit.

  • @DamianEstevez-n3y
    @DamianEstevez-n3y 2 месяца назад

    I love that the Mac at 1:07 shows the dirty screen we all know very well, lmao

  • @NickDanger3
    @NickDanger3 4 месяца назад

    AI does not have any “mental capacity”

  • @allenbythesea
    @allenbythesea 4 месяца назад +4

    I hope it's dying. I've said for the longest time that the Git workflow is insanely bad for everything except open-source, long-tail development. It's horrendous for in-house DevOps-style development, as it discourages frequent commits and updates. We recently moved several projects back to SVN.

    • @jaroslaww7764
      @jaroslaww7764 4 месяца назад

      We use Git but no Gitflow. Everybody who creates a PR to our repo in GitHub gets informed that their PR won't be merged until they rebase their branch on the newest master.

  • @stanislauyan3204
    @stanislauyan3204 5 месяцев назад

    I think you have serious issues in your arguments.
    First: what actually is "incremental" development? It is basically having a snapshot in memory of the current state plus the current task to implement. And it doesn't matter whether you know all the previous snapshots, as long as they do not influence the current state or future development; even when they do, they are just another input to the AI. We already have that level with AI.
    The second mistake is that you assume we develop the system as a whole, while in reality we separate the system into smaller blocks and work on them. That is one of the reasons we invented functions and scope. And again, AI can operate at that level.
    The real reason we don't have AI everywhere is just the limitations of current AI and its early stage. We will see more.

    • @ContinuousDelivery
      @ContinuousDelivery  5 месяцев назад

      Sure, the AI (to some limit of its capacity - currently its 'context window') can remember previous versions, but it can't discard one and step back to a known good state when it makes a mistake, it remembers the mistake on the same basis as the success. That prevents it from working incrementally, as someone else put it so clearly, "there is no 'Undo'".
      On your second point, that kind of *is* my point, yes we break things into pieces, but AI doesn't really, certainly not in a comparative way. Ask an AI to build a system for you, it will create it all, not build on what went before.

    • @stanislauyan3204
      @stanislauyan3204 5 месяцев назад

      @@ContinuousDelivery And why do we need to remember the mistake? That concept is a purely human one.
      If you swap the human who made a mistake for a new human, the new one will be free of that mistake's memory and will do as the AI does: take a snapshot of the current state and implement what is required. If you make a mistake, put that mistake into your input memory and repeat the implementation. I do not see why we cannot apply that to AI. They are dumb now, and you are correct that AI can produce wrong results; they are still probabilistic models, not really problem solvers as we are. But the results of their work already exceed the results of humans in some areas. They give better answers than the average human.
      And the decomposition of the system that looks "problematic" could just be the wrong way of putting the question to the AI.
      I really think you restrain yourself with such arguments. And the assumption that we cannot feed our previous mistakes to the AI as input is not correct. If you say "do not repeat this mistake" even to a modern AI, it will most likely not repeat it.
      There are two questions: do we ask the AI correctly for what we need, and does the AI have the capability to give us the results we need?
      I believe the whole idea of incremental development is correct, but with AI we can take shortcuts and become controllers rather than executors. To achieve that we must modify our approach to development with AI, and you have a big challenge ahead in working out how to do it.
      We will not abolish basic rules like incremental development, but that doesn't mean we cannot improve processes with AI.
      By the way, the initial topic also raises another question: do we need the "same" version to make the system stable and working? Because that brings us to the question of what the system is.

    • @davestorm6718
      @davestorm6718 5 months ago

      @@ContinuousDelivery I've been working with AI for some time now, and even with "good" prompting I have spent way too much time correcting code mistakes made by AI (it's nowhere near what some people claim it is). Anything bigger than a few functions, and it often gets into a loop of bad coding, inadvertently mixing design patterns or frameworks (for example, try to do a MAUI project and it injects WPF code and Xamarin code, all three of which are incompatible on so many levels, no matter how you specify your prompting). AI forgets its context - a lot! If it doesn't forget it, it corrupts it! LLMs are pretty bad for coding altogether, because, let's face it, they're glorified inference engines. An LLM doesn't have, nor is truly capable of, a temporal or spatial frame of reference (it's effectively flat). Because of this, they aren't capable of doing something as simple as version management - anything you see that makes you believe one is doing so is illusory.
      Where current AI shines is helping you "shake out" new ideas and preventing re-inventing the wheel for SMALL stuff. I needed a function for a cryptographic process, but was reluctant to incorporate a giant library into an app, so I asked it to create the particular method from scratch (and gave it some context, of course). It generated it in seconds, and I was able to save a ton of memory and space (for a WASM app). Naturally, you want to verify any code generated by any LLM and test it thoroughly - you will find mistakes, or find that it made inefficient code.

  • @Andrew-rc3vh
    @Andrew-rc3vh 4 months ago +1

    The video wastes time. Get to the point. I don't want to hear about your sponsor or your like button either. Three minutes in, two detours.

  • @eavdmeer
    @eavdmeer 4 months ago +1

    Version control allows you to define what you mean when you say that all the parts of your software work together? What? 😂 Lost me in the first minute of your video. That's *absolutely* not what version control does. It has nothing at all to do with the function or correctness of your software. It's solely there to track the entire history and development of 'something' that doesn't even need to be software and to be able to reproduce the state at any desired point in time and development. Sorry, but with a premise like that, I can't even watch the rest of what you have to say
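    (The "reproduce the state at any desired point in time" part of this comment is easy to demonstrate — a minimal throwaway-repo sketch, with hypothetical file and commit names, assuming git is on PATH:)

```shell
# Reproducing an earlier recorded state of a repository.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com   # local identity just for the demo
git config user.name dev

echo "state A" > notes.txt
git add notes.txt
git commit -qm "A"
first=$(git rev-parse HEAD)             # remember this point in history

echo "state B" > notes.txt
git commit -qam "B"

git checkout -q "$first"                # reproduce the exact earlier state (detached HEAD)
cat notes.txt                           # prints: state A
```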

  • @kamertonaudiophileplayer847
    @kamertonaudiophileplayer847 3 months ago

    It's dead from birth.

  • @MrNathanstenzel
    @MrNathanstenzel 4 months ago

    I was never any good with Git. I was better with SVN.

  • @foppel
    @foppel 4 months ago +2

    AI is Japanese... it will give you an answer, even if it is wrong...

  • @feralaca123
    @feralaca123 5 months ago +1

    Git is a tragedy

  • @vincentvogelaar6015
    @vincentvogelaar6015 5 months ago +2

    I don’t understand how you can be so smart and yet not see that you are attacking a straw-man AI thesis.
    Single-shot AI isn’t what you should be arguing against!
    Think about a tree-of-experts application primed with multi-shot prompting, each expert operating RAG-style within its scope of expertise.
    ….
    Your argument doesn’t hold water against this!!!!

    • @stalinthomas9850
      @stalinthomas9850 5 months ago

      @@vincentvogelaar6015 The current scope of ML cannot create good programs, let alone software. You have to understand that the fundamental principle of any sequence-to-sequence model is that it just predicts the next most likely token in the sequence. This is not programming! That fundamental principle goes against the very logic of programming. Programming is about converting logic into computer instructions. Even though we call it a programming language, it's not the same as the natural language these ML models are really good at. The idea you have shared doesn't work the way you think it does. There's just a lot of hype around this, and the fact of the matter is that we still have a long way to go.

    • @ContinuousDelivery
      @ContinuousDelivery  5 months ago

      Still not reproducible, and so still not amenable to building solutions incrementally.

  • @jean-michelgilbert8136
    @jean-michelgilbert8136 5 months ago

    So Git sucks. It is obscure, breaks with large repos and binaries unless you use workarounds like LFS, sub-repos, or an ad-hoc tool like Epic's GitDependencies, and requires totally batshit-insane workarounds when you need to go back to an older revision of a single file. The cure: Perforce. It's just far nicer to work with and doesn't require satanic incantations for the most basic tasks. And yes, it can do DVCS. And yes, you can install a Git connector to let your Perforce server integrate with Git (this might not be available in the free 5-seat version).

    • @edgeeffect
      @edgeeffect 4 months ago

      I know nothing of this "perforce" of which you speak... but feel compelled to agree on git's "satanic incantations".

    • @dloorkour1256
      @dloorkour1256 4 months ago +2

      I came to git from Perforce about 9 years ago, and used to feel that way sometimes. I still think the documentation sucks and the command naming/options feel like whatever popped into Linus' head when he first needed them. Now that I understand reset and rebase, use the reflog when needed, and make a habit of stashing or committing to a temporary branch before doing anything that might detach HEAD, I experience very little pain. Also, nowadays I use a GUI for most ordinary git tasks (committing, staging, branching, reset, rebase, stash push/pop/delete), so no pain there. There are also streamlined git command-line alias sets that improve the experience. For me, the easy branching and lack of file locking/server-connection requirements are big wins in git, plus the flexibility (which is a pain point initially).
      As to restoring a single file to a particular commit: git restore --source=<commit> -- <path>
      Although I'll confess to just copy/pasting from the GUI diff tool to get what I need.
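      (For anyone following along, a minimal sketch of that single-file restore in a throwaway repo — file and commit names are hypothetical; git restore needs git 2.23 or later:)

```shell
# Restoring one file to its version at an earlier commit.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com   # local identity just for the demo
git config user.name dev

echo "v1" > config.ini
git add config.ini
git commit -qm "first version"

echo "v2" > config.ini
git commit -qam "second version"

# Restore the working-tree copy of just this one file from the previous commit:
git restore --source=HEAD~1 -- config.ini
cat config.ini    # prints: v1
```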

  • @NoelPickering
    @NoelPickering 4 months ago

    Click bait much?