Man, I found your channel two months ago, and since then, I almost never listen to any other dev content creator except you. You're so genuine, and you talk in a very reasonable way: it's not too much and not too little. Keep it up, bro!
Thanks man glad you enjoy it
Testing the frontend is way different than testing the backend. Having "unit" tests (either at the API level or business logic level, not mocking the DB) on the backend allows you to move way faster, because you don't need to verify the changes manually and you don't need to worry about regressions. I do TDD like this for 90-95% of tests. If you have a more complex business logic, then you test just that logic, but you make the function as pure as possible.
As for the front end, it mostly just sucks, because the UI is fragile, especially in the beginning. Automating e2e tests for stable features makes sense, depending on how much work it is.
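The "test the logic, keep the function pure" approach from the comment above can be sketched like this (a minimal Python illustration; the discount rule and function name are made up for the example, not from the video):

```python
def apply_discount(subtotal: float, loyalty_years: int) -> float:
    """Hypothetical business rule: 5% off per loyalty year, capped at 20%."""
    rate = min(0.05 * loyalty_years, 0.20)
    return round(subtotal * (1 - rate), 2)

# Because the function is pure, the tests hit it directly -- no DB, no mocks.
assert apply_discount(100.0, 0) == 100.0
assert apply_discount(100.0, 2) == 90.0
assert apply_discount(100.0, 10) == 80.0  # cap kicks in at 20%
```

The point is that the I/O (fetching the order, loading the user) stays outside the function, so the interesting logic is trivially testable.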
It seems that no matter which framework you are using, or whether you are writing unit tests or e2e tests, the experience of testing the frontend will always be unpleasant.
For side projects that I'm planning to maintain over months or years, I tend to add tests, because future me will have forgotten most if not all of it. Professionally, it depends on budget and time: I always try to add tests, and if the budget or time is limited, I at least try to cover the happy path at the highest level possible. You covered it pretty well!
Brilliant and valuable perspective. It's great that this is being discussed openly
she was faster, again.
@@lucaschitolina7156 that’s what he said. 👀
Man just quit. We can't win
Who is she?
hey really excited to watch this because this is nuanced and challenging when you really get into it
Tests are this weird area that can bring so much value and benefit, but you can also end up in a situation where you require 80-90% code coverage and still have consistently crappy software and a poor user or developer experience.
You can hit that coverage number consistently and still have software that breaks all the time, and tests that break all the time, just because people write tests for the sake of hitting coverage instead of writing good, reliable tests that make sense and cover behavior worth testing. Testing is a tricky and hard area.
The "value added" to "effort/time" graph would look something like a logarithm. The more interesting graph is "value added / time-and-effort" to "time-and-effort", which will roughly slope downward indicating that the value derived per unit of effort diminishes as the amount of effort increases for a feature. (Maybe something like y=x * (0.9^x) would be more accurate. Idk.)
Honestly it’s more of a step graph. Until your test is done, you have no added value, and sometimes it takes a lot of time to finish the test.
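The shape the commenter guesses at, y = x · 0.9^x, can be checked in a few lines of Python: value climbs with early effort, peaks, and then each extra unit of effort adds less and less.

```python
def value_per_feature(effort: float) -> float:
    # The comment's proposed diminishing-returns curve: value = effort * 0.9^effort
    return effort * (0.9 ** effort)

# Early effort pays off...
assert value_per_feature(5) > value_per_feature(1)
# ...but the curve rolls over (continuous peak near effort ≈ -1/ln 0.9 ≈ 9.5),
# so piling on effort past that point actually yields less total value:
assert value_per_feature(20) < value_per_feature(9)
```

Whether 0.9 is the right decay factor is anyone's guess, but any base below 1 produces the same qualitative "rise then fall" story.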
Nice discussion of testing. I generally agree with your sentiments on this, though I do sometimes specifically create tests to document how vague requirements were interpreted. Very useful documentation of the implemented rules.
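A test that documents how a vague requirement was interpreted might look something like this (the "recent orders" rule and the 30-day cutoff are invented for illustration):

```python
from datetime import date, timedelta

# Hypothetical: the spec said "show recent orders" without defining "recent".
# This test pins down the interpretation the team shipped: last 30 days, inclusive.
def is_recent(order_date: date, today: date) -> bool:
    return (today - order_date) <= timedelta(days=30)

today = date(2024, 6, 30)
assert is_recent(date(2024, 6, 5), today)      # 25 days ago: recent
assert is_recent(date(2024, 5, 31), today)     # exactly 30 days ago: still recent
assert not is_recent(date(2024, 5, 1), today)  # 60 days ago: not recent
```

Anyone reading the test later learns the rule without digging through old tickets, which is exactly the documentation value the comment describes.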
was getting ready for a hot & spicy take but I 100% agree
one thing that also sucks with the whole "we gotta test everything dogma" is when managers and product owners tell you to test shit that doesn't make sense to test. but you can't explain that to them cause they don't understand code so now you gotta do unnecessary work 😅
Great intro, love it.
I'm not a big fan of skipping tests, even on small projects; I hate manually testing my app before each deployment. That's why I always include e2e tests, even for smaller apps. If a test doesn't work smoothly, I skip that use case rather than spend too much time on it.
For example, I had a download button with JPEG, PNG, and PDF options. The PDF snapshot tests kept failing, even though they were correct, while the JPEG and PNG tests passed just fine. Instead of wasting time troubleshooting, I removed the PDF test and moved on.
Combined with TypeScript, this approach makes me feel confident when deploying. Not sure how well it scales, though.
you will always find some people claiming tests saved their life, and then you find out they never really write tests and are just bluffing
My philosophy is to write as few tests as I can get away with while ensuring it functions properly, but you wouldn't know it by looking at my projects. For example, my tapescript project (byte code virtual machine with domain-specific assembly language) has 239 tests and 104 test vectors for the compiler and decompiler. If I made a flowchart for that one, it would look like an integrated circuit.
personally i'm loving testing as a development tool to get a debugger breakpoint on the function i'm working on, one test case at a time, no backtracking that way. etc.
Sees video title: "I guess I know why cody screamed at his screen at work this morning"
one more randomly failing cypress test and I'm going to switch careers
Love you babe
@@twitchizle aht aht aht…. Me no share!
Cody out here with the nuanced takes
Nice video, btw what keyboard are u using
tests are also a form of documentation
16:00 Man I know that feeling. This happened to me after WEEKS of building a complex set of features that were « crucial », only to be told that users were not using them at all… makes you want to throw hands at somebody… 😂
I remember our « proxy product owner » ( cool dude) had to deliver the news, he was acting like Steve Harvey when he made that blunder on miss universe.
Testing is great when you want to maintain certain sets of behaviours and properties of a system. If something happens to break in a different part of the system and you don’t know about it, then you are in a situation where you have a behaviour or property that you want, but is not tested. So whack a test on it.
I have been trying to learn programming. I have watched countless videos and learned a lot of programming languages, in addition to trying to create projects, but when I try to do them on my own I just don't know HOW and WHEN to use the things I learned in the tutorials. Plus, when I try to read other people's code, it looks much more complicated than what I learn in those tutorials. I don't know what to do. Anyone got advice?
Test your domain/business logic, no matter if it's 1 if or 100 ifs. This will document your system. Second, ask yourself what tests should be in CI for you to push straight to production, and implement them.
what's a good advice for an API automation tester?
test coverage is a scam, great video btw.
me: I like working with projects that have well-written tests but I don't like writing tests:)
btw what's that chrome extension you're using?
Would be really good if you could make a video about good dev content creators on YouTube. I just discovered your channel recently; if I had known about it earlier, I would have started watching you sooner.
Can you make a video about load balancing and how it’s implemented?
Idk i needed this Masterclass on testing
I just finished doing my first 2hours of test in go :D
all to verify add(1, 2) gives you 3 amiright?
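For what it's worth, the joke version really is about this short (sketched in Python rather than Go, purely for brevity):

```python
def add(a: int, b: int) -> int:
    return a + b

assert add(1, 2) == 3    # the canonical first test
assert add(-1, 1) == 0   # edge cases earn their keep a little more
```

The real payoff of those first two hours isn't the arithmetic, it's having the test harness wired up for the cases that aren't obvious.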
Could be nice if you make videos about how to test
Good video.
lol. that intro
testing in js sucks but when it does not (Laravel) it's hard not to test
I want to commend you very much for this video. Everything you said for when testing is great is spot on (except for the diminishing returns part which is debatable).
However, your first point for when testing sucks is a skill issue, which can be fixed by the developer. Giving up on testing because tests are brittle defeats your earlier point about testing key business-crucial features.
So, build on and fix your skill issues.
Cheers
Testing is wasting.
What's a youtube view quota?
Just need my wife to comment and I’m good
@@WebDevCody ngawwwwww 😩😘
Testing is also great when you want to increase estimates to earn more money as a developer.
First!!!!
Aw man, nearly got there
No fair, she knows beforehand 😣🤣🤣🤣
thanks babe!
Babe ! You are doing great babe !
@@klapaucius515 hahaha insider trading at its finest. #nepotism
Lovely jubbly, I will never test anything