You guys are so knowledgeable! I love your AWS content, so helpful to me studying for my DevOps Engineer renewal exam.
Thanks a lot for your kind words. All the best for your DevOps certification.
Please make more real-world videos like this. I've been searching for a channel that actually dives into "real-world" project piece by piece instead of providing a simplified example.
Thank you very much for your motivating feedback.
Just discovered your channel and I'm loving it. Thanks, and good job!
Glad you enjoy it!
Fantastic content as always, keep it up!
Thanks!
Great video guys
Thanks so much!
amazing
The wave deployments at 15:19 sound like canary deployments.
Yes, kind of.
I am not able to understand why the pipeline would need to update itself, as you mentioned from 19:48 onwards.
In GitHub Actions, if you change the workflow, it just works. In CodePipeline, you have to update the pipeline yourself.
One remark: you don't do linting only in the pipeline unless you are paying the cloud costs yourself :) Linting should happen at a very early stage, so you should have scripts to lint your code locally; the next step is to lint the code during push, and the last is to lint before merge. Linting in the build is fine as well, but before that you have a bunch of tests that run before the build is triggered.
Yes, lint before tests. I also agree that you should be able to run all the steps locally as well.
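The local/push gate described above can be sketched as a git pre-push hook. This is a hedged sketch, not a prescribed setup: the `LINT_CMD` variable and the `npm run lint` default are assumptions; point them at whatever lint script your project actually has.

```python
#!/usr/bin/env python3
# Hypothetical pre-push hook: save as .git/hooks/pre-push and make it executable.
# LINT_CMD is an assumption; set it to your project's real lint script.
import os
import shlex
import subprocess
import sys


def lint_gate(cmd=None):
    """Run the linter; return True only if the push may proceed."""
    cmd = cmd or os.environ.get("LINT_CMD", "npm run lint")
    result = subprocess.run(shlex.split(cmd))
    return result.returncode == 0


if __name__ == "__main__":
    if not lint_gate():
        print("Lint failed; push aborted.", file=sys.stderr)
        sys.exit(1)
```

The same gate can run before merge via a required CI check, so the pipeline's lint step only ever confirms what already passed locally.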
Thanks for the amazing content regarding codepipeline. Where can I find the sample pipeline files for matbot and other samples you showed in the slides?
Check out cloudonaut.io/configure-your-cloudformation-managed-infrastructure-with-parameter-store-and-codepipeline/ for an example.
Sorry, I'm not understanding why the 3rd stage is called Commit. Why not call it 'Build Image'? Commit often refers to a change in the source code which would be your trigger for your build, assuming you've configured your CI/CD system to automatically push and merge commits into your branch.
We use the terms introduced in the "Continuous Delivery" book by Jez Humble and David Farley. Check out www.informit.com/articles/article.aspx?p=1621865&seqNum=4 to learn more.
Is it possible to dynamically change the source before running the pipeline? I'm interested in running the same pipeline for different repositories.
I don't think that's possible. But it is possible to define multiple sources.
@cloudonaut I found out that you can use CodeBuild. Basically, you can provide the project and repo info via environment variables, then trigger the build using the AWS SDK, overwriting the project and repo info before each execution. This is limited to CodeBuild; I'm still not sure how to integrate it with a pipeline and multiple stages.
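One way to sketch that trigger with boto3, assuming a CodeBuild project whose buildspec reads the repo info from environment variables (the `REPO_URL` and `BRANCH` names are hypothetical):

```python
def start_parameterized_build(project_name, repo_url, branch, dry_run=False):
    """Start a CodeBuild run, overriding the repo info for this execution only."""
    kwargs = {
        "projectName": project_name,
        # These values override the project's configured env vars per run.
        "environmentVariablesOverride": [
            {"name": "REPO_URL", "value": repo_url, "type": "PLAINTEXT"},
            {"name": "BRANCH", "value": branch, "type": "PLAINTEXT"},
        ],
    }
    if dry_run:
        return kwargs  # inspect the request without touching AWS
    import boto3  # deferred so the sketch can be read without boto3 installed
    return boto3.client("codebuild").start_build(**kwargs)
```

`environmentVariablesOverride` is a real parameter of the CodeBuild `StartBuild` API; as noted in the comment, carrying the same trick through a multi-stage CodePipeline remains an open question.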
How do you trigger CodePipeline in the case of a pull request?
I don't think that this will easily work. CodePipeline is designed to work with a single branch.
How do you build that self-updating pipeline stage?
You can find an example in CloudFormation here: github.com/widdix/aws-velocity/blob/master/deploy/pipeline.yml#L164-L181
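In outline, that stage is a CloudFormation Deploy action whose target is the pipeline's own stack. A hedged sketch of the stage definition, assuming conventions similar to the linked template: the `Source` artifact name, the `deploy/pipeline.yml` path, and the `PipelineDeployRole` resource are assumptions you would replace with your own.

```yaml
# Sketch of a self-update stage inside AWS::CodePipeline::Pipeline Stages:
- Name: PipelineSelfUpdate
  Actions:
    - Name: UpdatePipeline
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CloudFormation
        Version: '1'
      Configuration:
        ActionMode: CREATE_UPDATE
        StackName: !Ref 'AWS::StackName'          # the stack that contains this pipeline
        TemplatePath: 'Source::deploy/pipeline.yml'  # pipeline template from the repo
        Capabilities: CAPABILITY_IAM
        RoleArn: !GetAtt PipelineDeployRole.Arn
      InputArtifacts:
        - Name: Source
      RunOrder: 1
```

Placing this stage right after the source stage means changes to the pipeline definition in the repo are applied before the rest of the run, which is the behavior GitHub Actions gives you for free.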
What tools or script did you use to run integration tests?
It depends on the project. For example, we have been using JUnit for a Java project to write integration tests. For Node.js we are typically using mocha.
@cloudonaut If I understand correctly, integration tests hit real services? I typically use Jest for unit testing but have never tried it for integration tests.