In my experience, most, if not all, software development teams follow some form of agile software development methodology, with Scrum being the most popular. Different organisations adapt Scrum to suit their needs and figure out how the methodology can guide their development efforts. The differences from organisation to organisation are small, so if you have experience with one, you're ready for them all.
The aim here is to share my experience of how QA engineers fit into this methodology and how to be a valuable member of a Scrum team. The Scrum Guide won't mention QA engineers specifically, but there are references to quality throughout, and it is at these points that you inject yourself into the development effort.
Just a couple of definitions in case this is not familiar to you:
- Ticket: a work item, usually a business need such as providing a piece of functionality to an end user. Tickets can also be technical tasks with no user-visible changes, for example dependency updates.
- Backlog: simply a list of tickets.
- Sprint: a period of time in which a selected set of tickets from the backlog are delivered, usually all the way through to production.
- User Story: a type of ticket that describes functionality from the end user's perspective.
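If it helps to make these terms concrete, here is a minimal sketch of how a ticket, a backlog and a sprint might relate to each other. The field names and the identifier scheme are illustrative assumptions, not taken from any particular tool.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Ticket:
    """A single work item, e.g. a user story or a technical task."""
    key: str                      # e.g. "TEAM-123" (hypothetical identifier scheme)
    title: str
    is_user_story: bool = True    # False for purely technical tasks

@dataclass
class Backlog:
    """Simply an ordered list of tickets waiting to be picked up."""
    tickets: List[Ticket] = field(default_factory=list)

@dataclass
class Sprint:
    """A time-boxed selection of tickets pulled from the backlog."""
    name: str
    committed: List[Ticket] = field(default_factory=list)

# Example: planning pulls tickets from the backlog into a sprint.
backlog = Backlog([
    Ticket("TEAM-1", "Allow users to reset their password"),
    Ticket("TEAM-2", "Bump logging library version", is_user_story=False),
])
sprint = Sprint("Sprint 42", committed=backlog.tickets[:2])
```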
Scrum Ceremonies
So let's focus on the Scrum ceremonies:
- Refinement
- Sprint Planning
- Daily Scrum
- Sprint Review
- Sprint Retrospective
Refinement
My favourite of all the Scrum ceremonies. This is where the entire Scrum team is introduced to a business need, typically by the team's Product Owner, who is the team's link between the business and the Scrum team. These items usually come from the team's backlog, so if you're really savvy you'll have had a snoop pre-refinement to give yourself an extra edge. Get involved, be objective and ask the questions that are bothering you. Use your business knowledge to challenge and inform. By questioning as much as we need to, we get a better understanding of what needs to be delivered. Through in-depth collaboration, better approaches are agreed on. Everyone's opinion matters.
Sprint Planning
This is where the team selects the tickets for the upcoming sprint and agrees how they will be delivered. As a QA engineer, it's the point to start asking:
- What are we going to test?
- How are we going to test this?
- At what point are we going to test this?
- Are we going to write automated tests?
- Will we write automated tests in parallel with developers coding, close to in parallel, or after it's been released to some test environment?
- What environment do we want to test this on?
- What level of unit test coverage is required?
- Do we need manual exploratory testing?
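One practical way to capture some of these decisions is to tag automated tests by level, so they can be run at different points in the pipeline. Below is a minimal sketch using pytest markers; the marker names, the test names and the function under test are illustrative assumptions, not part of any particular team's setup.

```python
# test_password_reset.py - illustrative only; names and markers are assumptions.
import pytest

def reset_password(email: str) -> bool:
    """Stand-in for the real unit under test (hypothetical)."""
    return "@" in email

@pytest.mark.unit
def test_reset_password_accepts_valid_email():
    # Fast check, run on every branch build before merge.
    assert reset_password("user@example.com") is True

@pytest.mark.e2e
def test_reset_password_flow_on_test_environment():
    # Slower journey-level check, run after deployment to a test environment.
    # In a real suite this would drive the UI or API of the deployed service.
    assert reset_password("user@example.com") is True
```

With the markers registered in pytest.ini, the pre-merge job can run `pytest -m unit` while the post-deployment job runs `pytest -m e2e`, which answers the "at what point are we going to test this?" question explicitly.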
Daily Scrum
Sprint Review
Sprint Retrospective
The team reflects on the sprint just finished, typically covering:
- What went well
- What could be improved
- Actions to perform to improve
The Sprint Itself
In the sprint we start with a list of tickets that need to transition from written text to working software. Here's my take on the most common way for this to happen.
The blue is the path for the ticket to get from written text to working software, and the yellow shows the quality gates that allow work to transition through the phases. Some look at this flow and think waterfall; I assure you it's not. It is a continuous feedback loop that gives developers very quick feedback should there be any issues with what was produced. That's not to say the feedback loop cannot be shortened. Indeed it can: not by adding to the above, but by rearranging it. We'll look at the shift-left concept in further blog posts. But I can assure you that if you and your team are following a similar model, you are doing software development well, and you are ready for the next steps. Remember, if you see potential points of improvement, bring them to your retrospective and get actioning.
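To make the feedback-loop idea concrete, here is a small sketch that treats each quality gate as a function and stops at the first failure, handing feedback straight back to the developer. The gate names and the dummy results are placeholders for whatever checks your team actually runs.

```python
from typing import Callable, List, Tuple

# Each gate returns (passed, feedback). Names below are illustrative placeholders.
QualityGate = Callable[[], Tuple[bool, str]]

def run_quality_gates(gates: List[Tuple[str, QualityGate]]) -> bool:
    """Run gates in order; on the first failure, return feedback to the developer."""
    for name, gate in gates:
        passed, feedback = gate()
        if not passed:
            print(f"Gate '{name}' failed: {feedback} -> back to the developer")
            return False
        print(f"Gate '{name}' passed")
    print("All gates passed -> ticket moves on towards production")
    return True

# Example wiring with dummy gates standing in for real checks.
run_quality_gates([
    ("unit tests", lambda: (True, "")),
    ("code review", lambda: (True, "")),
    ("static analysis", lambda: (False, "code smell in PasswordResetService")),
])
```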
So the developer gets their ticket and builds some software. They'll commit and push it, usually via Git, to a centralised source control management system like Stash or GitHub. It's at this point that the first quality checks are performed: unit tests are run on the branch, other developers perform code reviews, and automated static code analysis is performed by a tool such as SonarQube. Can you as a tester get involved here? Yes is the answer. Viewing, confirming and/or adding to unit tests is an option. Code reviews? Give it a go and you'll surprise yourself. And begin to add to your automated suites: you now have a working (hopefully) branch, so get in there and start automating. Or maybe you started before this, in parallel with the developer, and now you can modify your tests or even fully run them if you have branch testing capabilities. If the checks fail, feedback is given to the developer, any issues are addressed, and in it comes again.
Once the checks pass, the code is merged and the build is kicked off. Hopefully this is automated for you with some DevOps magic: the package is created and deployed to a test environment. Then we're into our next round of quality gates, which is our automated end-to-end tests and maybe some manual exploratory testing if necessary. Again, if anything fails at this point, feedback is given to the developer and back in we come. Once everything passes, out it goes to the further environments, where we might have another round of lesser tests. Simple and effective. And it can be improved further, of course.
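As a flavour of what that post-deployment gate might look like, here is a minimal end-to-end smoke test sketch run against a deployed test environment. The base URL, endpoints and expected responses are entirely hypothetical; substitute whatever your application actually exposes.

```python
# e2e_smoke_test.py - hypothetical URL and endpoints; adjust for your own application.
import requests

BASE_URL = "https://test-env.example.com"  # assumed test-environment address

def test_health_endpoint_responds():
    """Cheap confidence check that the deployment landed and the service is up."""
    response = requests.get(f"{BASE_URL}/health", timeout=10)
    assert response.status_code == 200

def test_password_reset_page_loads():
    """A slightly deeper check that a key user-facing route is reachable."""
    response = requests.get(f"{BASE_URL}/password-reset", timeout=10)
    assert response.status_code == 200
```

Run straight after the deployment step, a suite like this gives the quick pass/fail signal that decides whether the ticket carries on towards production or comes back to the developer.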
Rinse and repeat for all other tickets in the sprint.