As they say, "You Will Get What You Measure (or Reward)!"
If I was pushed to the wall, the only metric I would keep is "Satisfying the customer through early and continuous delivery of valuable software".
We believe in continually monitoring the performance of our deliveries with both the team and the customer, to ensure they are aligned to expectations and plans. Progress against the plans made for the service, and any blockers hindering the required velocity, are reviewed through daily stand-ups, sprint show-and-tells, retrospective improvement actions and release reviews. Any issues that arise are managed to successful resolution in an open manner, so that visibility and confidence are maintained.
We measure and manage the quality and speed of our delivery teams by checking:
- Are we releasing early and often, keeping the application production-ready at all times?
- Does the product match the user needs and can we evidence that?
- Does the team have a regular delivery cadence?
- Are risks, dependencies and blockers managed and addressed at all stages of the pipeline?
- Are our stakeholders satisfied that the delivery provides adequate ROI?
- Are we building quality in by automating code-quality analysis and code coverage? (See the quality-gate sketch after this list.)
- Are we making all new source code open and reusable, publishing it under appropriate licences?
- Are we creating processes aimed primarily at producing fast feedback, which leads to a quality product?
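As a concrete illustration of building quality in, here is a minimal sketch of a CI quality gate, assuming a Cobertura-style coverage.xml report; the file name and the 80% threshold are illustrative assumptions, not prescriptions:

```python
# quality_gate.py - a minimal quality-gate sketch, assuming a Cobertura-style
# coverage.xml produced by your coverage tool; the threshold is illustrative.
import sys
import xml.etree.ElementTree as ET

COVERAGE_THRESHOLD = 0.80  # illustrative, not a recommendation

def check_coverage(report_path: str) -> bool:
    """Return True if line coverage meets the threshold."""
    root = ET.parse(report_path).getroot()
    line_rate = float(root.get("line-rate", "0"))
    print(f"Line coverage: {line_rate:.1%} (threshold {COVERAGE_THRESHOLD:.0%})")
    return line_rate >= COVERAGE_THRESHOLD

if __name__ == "__main__":
    if not check_coverage("coverage.xml"):
        sys.exit(1)  # non-zero exit fails the CI step, so coverage drops block the build
```

Wiring a check like this into the pipeline means a drop in coverage fails the build immediately, rather than surfacing weeks later.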
Below is an example of the business value and agile metrics used at DWP and GDS, displayed across the wall to monitor progress every sprint.
Delivering value to users
How we manage it: We follow user-centred design and research processes, based on an understanding of user personas, behaviours and scenarios, on an ongoing basis. We use low-fidelity paper-based or clickable prototypes to test early-stage improvements and gather user feedback early and often. Our developers attend user testing regularly.
How we measure it: We consider the target business outcomes and collect the data needed to help achieve them:
- Financial savings
- Contribution to overall strategic goals
- End user satisfaction
- Lead time
- Test coverage/success measured by percentage coverage and number of failing tests
- Velocity of team and individual, measured by story points delivered (a sketch of the lead-time and velocity arithmetic follows this list)
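To make the last two measures concrete, here is a minimal sketch; the Story shape and the dates are hypothetical, not our actual data model:

```python
# metrics.py - a minimal sketch of lead time and velocity, using a
# hypothetical Story record; real data would come from your tracker.
from dataclasses import dataclass
from datetime import date
from statistics import mean

@dataclass
class Story:
    points: int
    created: date    # when the user need was raised
    released: date   # when the change reached users

def lead_time_days(stories: list[Story]) -> float:
    """Average calendar days from request to release."""
    return mean((s.released - s.created).days for s in stories)

def velocity(stories: list[Story]) -> int:
    """Story points delivered in the period."""
    return sum(s.points for s in stories)

sprint = [
    Story(3, date(2018, 2, 19), date(2018, 3, 2)),
    Story(5, date(2018, 2, 21), date(2018, 3, 5)),
]
print(f"Lead time: {lead_time_days(sprint):.1f} days, velocity: {velocity(sprint)} points")
```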
Team / individual speed and quality
- Satisfying the customer through early and continuous delivery of valuable software.
- Simple to use
- Can the service be completed by the user in one go, without intervention?
- Is it digitally inclusive, accessible etc.
Technical excellence and good design
- Fast feedback: Being able to find out whether a change has been successful in moments, not days.
- Simple: We build:
  - software that contains no more complexity than needed to do a good job;
  - for what we need now, not what might come.
- We make choices that allow our software to rapidly change to meet upcoming requirements.
- Clean Code: Software that’s easy to understand and maintain and is intention-revealing.
- Repeatability: Confidence and predictability that comes from removing manual tasks that introduce inconsistencies.
- Release early and often, keeping the application production-ready at all times.
- Automated code deployment success.
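Deployment success is easy to track once deployments are automated. Here is a minimal sketch; the (date, outcome) log shape is a hypothetical assumption, as real records would come from your CI/CD tooling:

```python
# deploy_stats.py - a minimal sketch of deployment success rate;
# the log shape and entries below are hypothetical.
from collections import Counter

deployments = [
    ("2018-03-01", "success"),
    ("2018-03-02", "success"),
    ("2018-03-05", "failure"),
    ("2018-03-06", "success"),
]

outcomes = Counter(result for _, result in deployments)
total = sum(outcomes.values())
rate = outcomes["success"] / total
print(f"{total} deployments, {rate:.0%} succeeded")  # a falling rate flags pipeline problems early
```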
We focus on results centred on the value delivered to users. We measure quality via metrics to evaluate, modify, and improve processes.
We use Scrum, which includes a built-in way of measuring software development efficiency and productivity at the team level. We use burn-ups, CFDs (cumulative flow diagrams), cycle time and burn-downs to track whether we are on course to meet targets.
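The arithmetic behind a burn-down is simple: plot remaining story points against an ideal straight line to zero. A minimal sketch, with made-up sprint numbers:

```python
# burndown.py - a minimal burn-down sketch; the sprint size and daily
# completion figures are illustrative, not real project data.
SPRINT_POINTS = 30
SPRINT_DAYS = 10
completed_per_day = [0, 3, 3, 5, 0, 4, 6, 2, 4, 3]  # points closed each day

remaining = SPRINT_POINTS
for day, done in enumerate(completed_per_day, start=1):
    remaining -= done
    ideal = SPRINT_POINTS * (1 - day / SPRINT_DAYS)  # straight line to zero
    status = "on track" if remaining <= ideal else "behind"
    print(f"Day {day:2d}: {remaining:2d} points remaining (ideal {ideal:4.1f}) - {status}")
```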
Customer satisfaction: The most important thing for us is that customers are happy with our work. In our experience, the best measure of our development efficiency is how quickly your software improves business results. Pair programming also acts as a built-in quality check, because the pair only moves on once both agree that no further improvement is required.
Peer code reviews: All code goes through a peer code review, and we also spot-check projects to ensure that quality is being maintained. We compare the amount and quality of code written against the time spent and logged on an issue.
The code's performance, security, integration and maintenance costs are monitored through tooling.
Continuous improvement following retrospectives: We implement prioritised continuous-improvement stories, allowing teams to self-organise and take accountability for improving the quality of their process.
Other areas we look to measure to allow continual improvement are:
- Quality across the life cycle, measured by lead time
- Value committed vs value delivered
- MTTR (mean time to repair; a sketch of the calculation follows this list)
- Reliability, throughput, and time-and-motion measures
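MTTR is just the average elapsed time from failure detected to service restored. A minimal sketch, with hypothetical incident records:

```python
# mttr.py - a minimal MTTR sketch; the incident timestamps are hypothetical,
# as real records would come from your incident-management tooling.
from datetime import datetime
from statistics import mean

incidents = [  # (failure detected, service restored)
    ("2018-03-01 09:15", "2018-03-01 10:05"),
    ("2018-03-03 14:30", "2018-03-03 15:10"),
    ("2018-03-06 08:00", "2018-03-06 09:20"),
]

def hours_between(start: str, end: str) -> float:
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600

mttr = mean(hours_between(s, e) for s, e in incidents)
print(f"MTTR: {mttr:.2f} hours across {len(incidents)} incidents")
```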