Master testing and become an expert in TDD, BDD & ATDD. This comprehensive course bundle gives you sought-after skills, ready to tackle the modern world of software engineering 🚀
Find out more about this collection HERE ➡ courses.cd.training/bundles/totally-testing
PAYMENT PLANS AVAILABLE TOO.
Do you offer any discounts for programmers who are not companies? I live in South Africa, and given our exchange rate those courses are prohibitively expensive for me as a single developer who wants to upskill in BDD.
I've watched lots of videos (many from this channel), read lots of articles and even one book on BDD and TDD. This video did more to clarify for me what BDD and TDD are and how to best approach them than all the rest of my previous sources combined. It should be among the very first things people are presented with when being introduced to the topic!
14:54 This is so true. In my company, there is a saying: "The user never knows what he really wants, but he knows exactly what he doesn't want when he sees the result."
This channel is pure gold!
gold, jerry! gold!
It's fantastic. It's still amazing to me we can just sit at home, turn on RUclips, and magically get teleported sage advice from software developers with decades of experience. Thanks Dave.
I got a 15k pay rise and I hold this channel responsible for my improvement in how I work.
Totally agree, this is training from a true rockstar. While I can't get my co-workers to watch him, they for some reason think I'm a genius after I cite him continuously.
Wow! That's great! Glad we could help.
Dave is The Goat. Not only for the content but for his clarity when presenting the topic.
Fantastic video as always... If I can summarize what I got from this video into two thought chunks: 1. This approach is a paradigm shift from developer-centric to user-centric, and 2. code the tests using the SOLID principles, particularly dependency inversion, so that the low-level modules (the plumbing, as you called it) depend on the high-level modules. The beauty of this is that when a vendor changes their implementation details, only the LL modules need to change, not the HL.
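(For illustration, a minimal sketch of that dependency direction, with hypothetical names rather than code from the video: the high-level test layer owns the abstraction, and the low-level plumbing implements it.)

// Owned by the high-level layer: describes WHAT the tests need, not HOW it is done.
public interface StoreDriver {
    void search(String itemName);
    void addFirstResultToBasket();
    String basketContents();
}

// Low-level "plumbing": depends on the high-level abstraction, not the other way round.
// If the vendor, UI, or protocol changes, only this class changes.
class SeleniumStoreDriver implements StoreDriver {
    @Override public void search(String itemName) { /* drive the web UI here */ }
    @Override public void addFirstResultToBasket() { /* ... */ }
    @Override public String basketContents() { return ""; /* read back from the UI */ }
}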
This cuts straight to the essential nature of TDD in the most cogent and well-thought-out explanation I have seen to date. Well done!
Also...love the shirt! I think we need a link to the maker. :)
I've also come to the realization that workflows or use cases are simple steps that can describe what the high-level code can look like, like unit tests. Naming them sensibly makes them act as titles of chapters in the index of a book. A support library contains a glossary of functionality or behaviours that are used in those steps.
Discussing with a user what they want to accomplish is very helpful. It often comes with their own examples, but there is a danger of a dev taking those as the implementation goal. If you brainstorm a little and come up with something that rethinks the whole problem, that can also be a great tool to level up the work a user can accomplish.
I was using top-down design and writing specifications 40 years ago. We seem to have come full circle, with only the agile crowd thinking this is something new.
Not really, there is nothing really new, but this approach is NOT widely practiced. It is not normal for most SW development, and it should be. TDD began in the 1950s, and iterative, incremental design has always been a thing, but one thing that we are good at in the software world is NOT LEARNING FROM THE PAST, so we have to keep re-learning things.
This is not about "who thought of it first"; this is about "this is what works".
Thank you for this video Dave! I have been trying to sell my team on BDD over implementation tests and struggled to articulate myself well enough. You have done a great job of doing exactly that with demonstrations in this video!
I am pleased that you have found it helpful, thanks.
Respect++ as an engineer who worked in UX as well, this makes so much sense to me....
This channel is a gold mine
Thank you so much Dave Farley! Please consider releasing the Eastern Economy Versions of your courses. It would be really helpful. ❤
Love the BDD approach - easy to read for anyone, and the framework is so clear and organized.
I love that software devs create an entire dictionary every time they improve/create something. Same thing for BDD.
I've had excellent success with removing "magic text/numbers". Oftentimes we would specify a port number or IP address in a test and nobody would know where it came from, but if you suddenly replace it with IP_SUCCESS or PORT_XSERVICE, magically I can show the code to a Product Owner and even they can get on board.
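(As a hypothetical illustration of that idea, not code from the video; ServiceConnection is an assumed stand-in for whatever client the test uses, and the literal values are made up.)

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertTrue;

public class XServiceConnectionTest {
    // Named constants make the intent readable, even for a Product Owner.
    // Before, the test just said connect("10.23.4.17", 8743) and nobody knew why.
    private static final String IP_XSERVICE = "10.23.4.17";
    private static final int PORT_XSERVICE = 8743;

    @Test
    public void shouldConnectToXService() {
        ServiceConnection connection = ServiceConnection.connect(IP_XSERVICE, PORT_XSERVICE);
        assertTrue(connection.isOpen());
    }
}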
Excellent video, Dave, especially the "reset" on the origins of TDD and BDD!!
The most appealing points: focus on behavior, create an executable specification.
I absolutely agree with this approach. Especially, a big idea in terms of innovation is: Nobody knows. Not the user, not the developer, not the stakeholder, not even the individual developer who wants to write some tool for himself. How often have I written a little tool for myself which turned out to be not as useful as I thought.
But is there any scientific paper or study on that issue?
I wonder if a useful word to apply for what we are doing here is “interface”.
Coding is building towards the interface from within.
Testing is essentially “taking the interface for a test drive”.
Plugging it into the system real quick.
“Interface validation”.
“Test driving”.
You shut the hood and take the baby out for a drive and see how she handles it.
The mechanic perspective and the driver perspective.
Maybe, though I confess I think it is more than that. I think that maybe that's a "mechanic's viewpoint" 😉
The kind of 'testing' I mean when I am talking about Acceptance tests and BDD is MUCH more than "interface validation". It is about exploring and specifying the behaviour of the system. This is design, functional design, that captures "WHAT we want the system to do" without getting lost in the niceties of "HOW we want the system to do it". Interfaces are still part of the "HOW".
If I am building Amazon, I can say "I want to search for a book, and add it to my shopping cart". That is a perfectly valid, very precise specification of what the user wants from the system, without technical detail of HOW to achieve that. Once I have that captured as a BDD scenario, now I am free to implement that behaviour however I want, including with nice interfaces or horrid ones. That is a separate part of the problem that we always need to solve in software development.
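(As a sketch of what that might look like as an executable specification, with hypothetical names rather than the video's actual code: the test speaks only in terms of WHAT the user wants, and everything about HOW lives in lower layers. ShoppingDsl is an assumed testing DSL class.)

import org.junit.jupiter.api.Test;

public class BookShoppingSpecification {
    // 'shopping' is part of a testing DSL, not the system under test.
    private final ShoppingDsl shopping = new ShoppingDsl();

    @Test
    public void shouldAddSearchedBookToShoppingCart() {
        shopping.searchForBook("title: Continuous Delivery");
        shopping.addSelectedBookToCart();
        shopping.assertShoppingCartContains("title: Continuous Delivery");
    }
}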
Great video as always!
I'm currently messing around with your 4 layer approach, and it looks like the DSL acts sort of like a god class in terms of its size.
Although I maximally reuse driver functionality, I would love to know your opinion:
Would you consider this a smell? Or is this just the nature of the DSL layer to be this big (15+ functions)?
If it is a code smell, what way do you recommend for solving it?
Yes, I think that the "god class" problem is a common one. I generally divide the code into functional areas to try and alleviate the omnipotence. I generally make these scoped areas of the DSL available from a super class so that I can say things like this in a test case:
admin.addUser("name: user1")
admin.addUser("name: user2")
trading.placeTrade("name: user1", "sell: 100@50")
trading.placeTrade("name: user2", "buy: 10@50")
account.confirmPosition("name: user1", "90")
account.confirmPosition("name: user2", "10")
(Don't worry too much about the numbers; they don't make much sense.)
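(A minimal sketch of how that wiring might look, assuming hypothetical AdminDsl, TradingDsl, AccountDsl, and SystemDriver classes; the real layering in the video may differ.)

import org.junit.jupiter.api.Test;

// Base class exposing the functional areas of the DSL to every test case.
abstract class AcceptanceTestBase {
    protected final SystemDriver driver = new SystemDriver(); // talks to the running system
    protected final AdminDsl admin = new AdminDsl(driver);
    protected final TradingDsl trading = new TradingDsl(driver);
    protected final AccountDsl account = new AccountDsl(driver);
}

public class TradingAcceptanceTest extends AcceptanceTestBase {
    @Test
    public void shouldUpdatePositionsAfterTrade() {
        admin.addUser("name: user1");
        admin.addUser("name: user2");
        trading.placeTrade("name: user1", "sell: 100@50");
        trading.placeTrade("name: user2", "buy: 10@50");
        account.confirmPosition("name: user1", "90");
        account.confirmPosition("name: user2", "10");
    }
}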
@@ContinuousDelivery That's the good stuff!!! I managed to implement it just like you have said and it has solved this smell.
For the other people reading this, this is the simple process translated from your example:
I ask myself: which type of user would make that action?
If the action is not specific to a user type -> then which system is responsible for doing this action?
This then clarifies which DSL class should be responsible for what, and makes it easier to read. I initialize them by injecting the driver instances.
Here's an example:
def test_When_player_plays_his_turn_Then_it_is_the_other_players_turn(self):
self.player.played_his_turn()
self.opponent.assert_my_turn()
Bridge pattern: the abstraction level is test2, the implementation part is what you call plumbing.
Excellent video! You have an amazing way of explaining things that makes them easy to understand. A small suggestion would be to include a more diverse set of examples. But really amazing job with this channel. Thank you!
I love this channel! So much condensed wisdom.
Trouble is, nowadays all the developers I meet seem to think BDD (in the form of a set of SpecFlow tests) replaces a well-thought-out formal specification (in a plain old document). It doesn't.
Waterfall?
Hey, I've been following for some time now and I really enjoy your videos. What I really liked about this one is that you showed the implementation of those concepts in code, and I would really like it if you could do that more in the future. Great work nevertheless and keep going :)
The thing with BDD frameworks is that readable specs are much more maintainable and less complex as outputs rather than inputs.
Write high-level tests using your ubiquitous domain language in a proper programming language. Have your test steps output the actions they take as structured logs. Throw together something that turns the logs into some HTML and you've got a nice readable spec. If you have the logs output as baggage to OTel, with spans and traces injected into API calls by your tests, you can also get some really nice drill-down into test failures (and successes).
See also: Nat Pryce on domain driven testing.
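(A rough sketch of that idea, assuming hypothetical ShopApiClient and StepLogger types: each step acts on the system and records what it did, so a readable spec can be generated from the log afterwards.)

import java.util.Map;
import static org.junit.jupiter.api.Assertions.assertTrue;

public class ShoppingSteps {
    private final ShopApiClient api; // assumed client for the system under test
    private final StepLogger log;    // assumed structured logger; could also attach OTel baggage

    public ShoppingSteps(ShopApiClient api, StepLogger log) {
        this.api = api;
        this.log = log;
    }

    public void addBookToBasket(String title) {
        log.step("Given a book is added to the basket", Map.of("title", title));
        api.addToBasket(title);
    }

    public void assertBasketContains(String title) {
        log.step("Then the basket contains the book", Map.of("title", title));
        assertTrue(api.basketContents().contains(title));
    }
}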
What do you think about testing in the frontend?
Do all the concepts like TDD, BDD, unit tests, integration tests, and test strategies apply the same way in the frontend?
Love your content and your channel!
This is first class content
The Qwertee discount link is not in the description.
This channel should be mandatory for SE
Did you just say that if I'm a backend developer my users are other developers? Could you explain that?
I feel that BDD is guidance rather than a new paradigm while doing TDD or Acceptance Testing? Also, I guess Acceptance Testing varies a lot from TDD. Instead of red-green-refactor lasting 10-15 mins, as you say in other videos, acceptance tests would stay red for a long time, and then when we finally get a green, we wouldn't go through the refactoring phase the way we did during the smaller TDD steps?
I prefer to use the acceptance criteria as part of my test specification, so in that case it's still part of my red-green-refactor. You can still make something that fits the acceptance criteria but is built poorly, so I would argue that you should refactor it.
Is this entire video basically explaining the DRY principle?
To prevent misunderstandings, finally call it SDD! Specification-Driven Development!
BDD will never be a specification. If it were, I wouldn't be currently looking at a system with 5000 test cases whilst having absolutely no idea what it actually does!
I recall that you told a story about how huge IBM lost to a small company that used TDD. I lost that video; could you provide the source?
One of my favorite channels 🎉
So how does that translate to ML and algorithm design, where the thing you work on is the "how it works" and there aren't really any users?
I like the "translation" analogy
Do you have any videos on the pros/cons of specific tools used in cd?
Amazing Clarity, thank you.
I wonder about the better test example. Is that shopping variable an instance of a class specifically designed for testing?
It seems strange to put an assertItemListenInShoppingBasket method on a domain object.
Yes, it is code written specifically for testing, it is part of the implementation of a testing DSL.
@@ContinuousDelivery Wow quick response! Thanks that clears up my question.
Holy moly. It's been a really long time since I've seen such high-quality content.
THANK YOU! Glad you enjoyed it.
Great topic as always! Would you touch on how monitoring/observability ties in to continuous delivery?
I am curious, is the "shopping" object supposed to be designed specifically for the test but not directly translatable to the solution?
I like to divide the domain specific language that I build to create test cases to keep the pieces modular. Otherwise you tend to end up with one big "DSL" class. Subdividing the problem into functional areas like "shopping", "admin" etc keeps the code a bit cleaner, and I think, helps with the readability.
All of the code in the test case itself should be about the problem, not the solution. Lower layers of code will translate these "problem level ideas" into real interactions with the system under test.
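(A minimal sketch of that layering with hypothetical names, not the video's actual code: the test case stays at the problem level, the DSL class translates those ideas, and a driver does the real interaction with the system under test. StoreDriver and Drivers.forRunningSystem() are assumed stand-ins for the driver layer.)

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertTrue;

// The test case: problem-level language only.
public class ShoppingAcceptanceTest {
    private final ShoppingDsl shopping = new ShoppingDsl(Drivers.forRunningSystem());

    @Test
    public void shouldListPurchasedItemInBasket() {
        shopping.searchForItem("Continuous Delivery");
        shopping.addSelectedItemToBasket();
        shopping.assertItemInBasket("Continuous Delivery");
    }
}

// The "shopping" area of the DSL: translates problem-level ideas into calls on a
// driver that knows HOW to interact with the system under test.
class ShoppingDsl {
    private final StoreDriver driver;

    ShoppingDsl(StoreDriver driver) { this.driver = driver; }

    void searchForItem(String name) { driver.search(name); }
    void addSelectedItemToBasket() { driver.addFirstResultToBasket(); }
    void assertItemInBasket(String name) { assertTrue(driver.basketContents().contains(name)); }
}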
On clicking the CDLaunchDarkly link:
"Warning: This URL has been blocked by Bitly's systems as potentially harmful."
That doesn't really instill confidence.
Thanks for the heads-up. we'll sort this out with LaunchDarkly.
Top level content!
The code font size is too small to read from the video depending on the device or resolution. Please make it larger or put a copy somewhere else like in the description or in a comment.
You can pinch-zoom now. Watching this on my phone, and did just that to skim the code fragments
@@AkosLukacs42 That's an ugly workaround; no one should have to use that. Also you need a Premium subscription to use it, just like you need a fast connection to watch in high resolution. Not everyone can have that, if you care about inclusion.
@@rafaelfv3362 get youtube vanced
@@rafaelfv3362 ugly, but I don't have premium and can zoom in on my phone
Thank you Dave for this great video and channel.
I have a small question/opinion about user stories as they're typically written. I see User Stories as solutions to a problem, when looking from the user's perspective. Meaning, given a problem/goal (I want a book), here's how it could be solved (defining a user and an action to solve the problem, which is the "so that..." part).
What's missing to me in the diagram, is the problem statement that should precede the user story. If we start with the story itself, we might miss the opportunity to realize that some users are completely unnecessary and could be automated. Or, we can realize that there are other ways to solve the problem at hand.
What do you think? Any comment would be appreciated.
Hi Dave, very interesting! Can you tell us more about how to create a DSL as shown in test2()? That is: how to encapsulate business examples and tests in the 'shopping' object? Maybe you can point us to some articles or examples? Big thanks ;]
Can I safely say that if I am doing TDD right, then I am actually doing BDD? Because in a test-first approach, we treat our tests as consumers? The examples become different test cases when I use an xUnit framework.
It depends what "TDD done right" really means. I think that certainly can be true, that was one of the phrases that we used to describe BDD in the early days "TDD done right".
The problem is that lots of people don't know what that means, or maybe don't agree with what that means. If you write your test first, but are thinking of the internal design of your code while you write it, and test the implementation detail, then TDD is NOT the same as BDD.
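(A hypothetical contrast to make that concrete; Basket and its methods are made-up names, not code from the video.)

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertTrue;

public class BasketTest {
    // Test-first, but coupled to the internal design: this is NOT BDD.
    @Test
    public void shouldStoreItemInInternalMap() {
        Basket basket = new Basket();
        basket.add("book-123");
        assertTrue(basket.getInternalMap().containsKey("book-123")); // breaks if the data structure changes
    }

    // Focused on observable behaviour: this is the BDD style of TDD.
    @Test
    public void shouldContainAddedItem() {
        Basket basket = new Basket();
        basket.add("book-123");
        assertTrue(basket.contains("book-123"));
    }
}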
So, raw Selenium is like the ASM programming language.
BDD is like any modern programming language.
Got it.
Thank you for a great summary of what BDD is.
Thank you for this channel
I love this channel. I am a mid/senior level QA. I have seen a lot of videos where you recommend Continuous Integration rather than feature branching... I always thought Continuous Integration was part of feature branching, so I am confused when you say this. We usually create a PR, and once all the tests run against it plus code review, we merge; I thought this was Continuous Integration.
What do you mean by this... do you mean just pushing to develop? How do you enforce integrity without Pull Requests?
CI is defined as integrating "everyone's changes at least once per day", so if your FBs last longer than a day, you can't practice CI, because my code on my branch is hidden from yours, so we don't get to integrate them together, until we both think that we are finished. The reason that this matters, is because any tests that you run before this point of integration, are testing a version of the code that is not the truth of the system. So whether the tests pass or fail, they are not a definitive statement. CI works on the idea that there is only one interesting version of your software, the current version, and the only way that we can test the "current version" is to integrate everyone's changes after each small change.
This is a very different way to work, but the data (from State of DevOps report) says that you produce better software faster with CI.
@@ContinuousDelivery I understand a lot more now, thank you. Appreciate the response.
Thank you very much
Thank you!
Thank you, so clear and easy to understand.
You are welcome!
Great tutorial
Thank you
In the test2() example, how are those statements linking together? Are the methods mutating state stored within shipping? Why is the final assert method part of the shipping object?
"shipping" is not the system under test. That is already up and running. "shipping" is just an abstraction to make the testing DSL clearer. The state is in the system under test, not in the "shipping" class.
@@ContinuousDelivery thanks for the reply. Yes I get that shipping is a testing abstraction. I’ve never seen this pattern before. I’m asking how it works and how the methods tie together.
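(For illustration only, one common way such a pattern can be wired, with hypothetical names: each DSL method delegates to a driver that calls the running system, so the only state that ties the statements together lives in the system under test, and the final assert just reads that state back.)

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// The driver is the only layer that really talks to the running system.
public class HttpStoreDriver {
    private final HttpClient http = HttpClient.newHttpClient();
    private final URI baseUri;

    public HttpStoreDriver(URI baseUri) { this.baseUri = baseUri; }

    public void addToBasket(String item) {
        send(HttpRequest.newBuilder(baseUri.resolve("/basket"))
                .POST(HttpRequest.BodyPublishers.ofString(item)).build());
    }

    public String basketContents() {
        // The assert in the DSL calls this to read the system's current state.
        return send(HttpRequest.newBuilder(baseUri.resolve("/basket")).GET().build());
    }

    private String send(HttpRequest request) {
        try {
            return http.send(request, HttpResponse.BodyHandlers.ofString()).body();
        } catch (IOException | InterruptedException e) {
            throw new RuntimeException(e);
        }
    }
}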
Another demonstration that sometimes we forget to learn the fundamentals first. It's harder to unlearn something than it is to learn it correctly from the start.
Isn't BDD similar to Ivar Jacobson's Use Case methodology? I use the Use Case techniques for my designs even after 30+ years.
I would say Use Cases are more similar to User Stories in Extreme Programming. BDD scenarios are executable specifications, that's their real power in my experience - a communication method that is also a verification of the communicated ideas all in one.
I suppose a Use Case can be captured as one or more BDD scenarios, at least to my mind. Mind you I've been out of the loop with Use Cases, my last serious foray into them was Rational Rose, and the less said about that particular piece of software the better 😀
Certainly not, not the same thing. You can see similarities in terms of taking a user perspective, but I think that the comparison stops there. Use cases are often created as a more technical, formal description, but aren't executable. Certainly in the teams where I have done any real Use Case modelling it rarely happened as we were about to write the software, and it was usually done by different groups of people. None of this is definitional for Use Cases, so you can certainly argue that the goals are similar. Practically, in my experience of seeing Use Case modelling and BDD in use, BDD is lighter-weight and creates more valuable assets in the form of tests as executable specifications.
I primarily use the Use Case technique to capture requirements. I confess I don't strictly follow the specifications laid out by Jacobson. In my case, I was/am always the architect and lead programmer who also captures requirements, so I have regularly tweaked the process to suit my needs and the changes in the technology landscape.
I was unfortunate enough to get caught up in the methodology wars of the early 90s, when every new project I entered had a different methodology laid down by the management. If one project followed Grady Booch, the next one followed Rumbaugh, or Jacobson, or the Fusion method, or Shlaer-Mellor... By the time I managed to escape the process wars (late 90s), I found only Use Cases, at least some parts of them, worth keeping. I avoided UML entirely, despite the FOMO. 😀
I build middleware APIs, so there is no UI. After capturing requirements (via use cases), I write the user manual for the APIs. This is a living document. At first it is like the vague wish in BDD, and it continuously changes as features are built. This allows the development and testing teams to work independently, following the same script, working towards a common goal. I suppose that is why I was feeling BDD is similar to the process I follow, which always starts with Use Cases.
The link to LaunchDarkly is broken
I guess he didn't test that link.
Thanks for letting us know. We will sort this out with LaunchDarkly.
Yes, the link was working, but sometimes this happens with bitly links. We'll get it sorted out with LaunchDarkly
You are saying that going from the vague idea straight to the solution is a huge mistake. But isn't doing a prototype the simplest way to better understand the problem and confirm with the client whether it is fit for purpose? How long do you think the process of going from Vague Wish to Executable Specification should take in a project, compared to the time it takes to code it?
Yes, I am saying that is a mistake. To prototype, you still need to understand the problem. This is a simple way to organise your thinking. Without that thinking, what are you prototyping?
This isn't a slow workshopping process, it usually takes a few minutes once you start working on the story, maybe as long as an hour if the problem is complicated, but as I have already said, this is thinking that you need to do anyway, so it doesn't really add delays, it is just a slightly more organised way to do that thinking.
@@ContinuousDelivery I feel we overspend on the prep stage, but your way sounds way more productive! Thank you for your videos!
A good review of BDD, how it looks and feels and its top-down perspective.
Please don't make the text squiggly, it's hard to read
Yeah
15:03 made my day 🤣😂😂🤣
Omg, I thought BDD meant body dysmorphic disorder 😂 I thought he was giving an example about it; I was like, WTF is going on, did I miss a point? 😂 It's okay, it's 4 am
Love your shirts!
BDD in a nutshell: when you're building something, think about the user
Is that so? I understand from Dave's explanation that it is rather about specifying application behaviour. It is about the behaviour that is exposed to a user (human or not). I find this a very useful perspective in my daily work as a Test Engineer.
BDD is primarily about functional application behaviour, described in a way that is understandable for stakeholders other than Developers and supported by libraries that make it executable.
A BDD specification for a web page can specify that the page _displays_ a Close button; it cannot specify that the user _sees_ the Close button. User behaviour, like a mouse-click, is merely a trigger for (expected) application behaviour.
Call it DOD-driven development - DODDD - they will still spoil it
I really like the idea that I could read a test suite and know exactly how the software is supposed to work, and I strive to make my own tests that legible.
However in practice, reading someone's test code is like reading a EULA.
I do believe that someday soon something will make it better though.
BDD already does.
@@ContinuousDelivery I'll give it a proper go then
Tests written as AAA (arrange-act-assert), yes, they tend to look like a EULA. Tests written as GWT (given-when-then), when people take the time to write sentences that actually mean something, those are invaluable.
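(A hypothetical side-by-side of the two styles; Order and its statuses are made-up stand-ins, not examples from the video.)

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

public class RefundSpecification {
    // AAA style: correct, but reads like boilerplate to anyone outside the team.
    @Test
    public void refundTest() {
        Order order = Order.paid("book-123");                    // arrange
        order.cancel();                                          // act
        assertEquals(Order.Status.REFUND_DUE, order.status());   // assert
    }

    // GWT style: the same check, written so the names form a sentence about behaviour.
    @Test
    public void givenAPaidOrder_whenItIsCancelled_thenARefundIsDue() {
        Order order = givenAPaidOrder();
        whenTheOrderIsCancelled(order);
        thenARefundIsDue(order);
    }

    private Order givenAPaidOrder() { return Order.paid("book-123"); }
    private void whenTheOrderIsCancelled(Order order) { order.cancel(); }
    private void thenARefundIsDue(Order order) { assertEquals(Order.Status.REFUND_DUE, order.status()); }
}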