4 Lenz: Quality Metrics Widgets
Features and Functionality
This article is part of our series on the Features and Functionality of the 4 Lenz Dashboard. We've broken each area down into shorter articles:
- Understanding the 4 Lenz Dashboard
- Flow Metrics
- Quality Metrics
- Happiness Metrics
The Metrics tab article includes videos that walk through how manually entered data and integration data are used to populate the 4 Lenz Dashboard.
Defect Ratio
Defect Ratio looks at the ratio of bugs to work items in the backlog that were created in the “Last 90 days”. Work items created more than 90 days ago are excluded from the calculation.
When data has been entered manually for a team, the Work Items and Total Defects values are used to calculate the ratio of Total Defects to Work Items for each sprint that ended in the last 90 days. These per-sprint ratios are then averaged to obtain the displayed value.
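As a rough sketch of the calculation described above, the per-sprint ratios could be computed and averaged like this (the function names and sample numbers are illustrative, not part of the product):

```python
def sprint_defect_ratio(total_defects: int, work_items: int) -> float:
    """Ratio of Total Defects to Work Items for a single sprint."""
    return total_defects / work_items

def defect_ratio(sprints: list[tuple[int, int]]) -> float:
    """Average the per-sprint ratios for sprints that ended in the last 90 days.

    Each tuple is (total_defects, work_items), as entered manually for the team.
    """
    ratios = [sprint_defect_ratio(defects, items) for defects, items in sprints]
    return sum(ratios) / len(ratios)

# Hypothetical manual entries for three recent sprints:
print(f"{defect_ratio([(2, 10), (3, 10), (1, 10)]):.0%}")  # 20%
```

Note that this averages the per-sprint ratios rather than pooling all defects and work items into one ratio, matching the description above.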
Variance is shown next to the percentage in the widget: green when there is a decrease and red when there is an increase. At the individual team level, variance is determined by the following formula: (new value - old value) / old value. At the multi-team level and above, the Variance % is determined by adding up the variance percentages of the individual teams and dividing by the number of teams to get an average.
Click the three-dot icon and select the Defect Ratio Formula link to see the formula behind the widget when data is populated via an integration.
Avg Lead Time
This metric displays the average time it takes to resolve high-severity incidents over the last 90 days. This widget does not yet work when data is entered manually for teams.
Variance is shown next to the percentage in the widget: green when there is an increase and red when there is a decrease. At the individual team level, variance is determined by the following formula: (new value - old value) / old value. At the multi-team level and above, the Variance % is determined by adding up the variance percentages of the individual teams and dividing by the number of teams to get an average.
Before the Avg. Lead Time for High Severity Incidents widget can display data, mappings must be configured in the mapping section of the Jira or Rally Integrations page. The mapping section allows a user to define how High Severity Incidents are identified in their Jira instance or Rally workspace.
Click the three-dot icon and select the Avg Lead Time Formula link to see the formula behind the widget when data is populated via an integration.
Stakeholder Satisfaction
This metric displays the Stakeholder Satisfaction Score from the team's last assessment. The variance compares the score between the team's two most recent assessments. Not all assessments have this question.