LinearB is making available a free dashboard through which DevOps teams can visualize DevOps Research and Assessment (DORA) metrics from within the company’s software delivery management platform.
DORA metrics enable DevOps teams to track deployment frequency, mean lead time for changes, mean time to recovery and change failure rate. The dashboard is currently available via an early access program and is scheduled to become generally available in late fall.
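For readers unfamiliar with how those four metrics are derived, the sketch below computes them from a handful of hypothetical deployment records. The field names and sample data are illustrative assumptions, not LinearB's actual data model.

```python
from datetime import datetime

# Hypothetical deployment records; structure is illustrative only.
deployments = [
    {"commit": datetime(2023, 9, 1, 9), "deploy": datetime(2023, 9, 1, 17),
     "failed": False, "restored": None},
    {"commit": datetime(2023, 9, 2, 10), "deploy": datetime(2023, 9, 3, 10),
     "failed": True, "restored": datetime(2023, 9, 3, 12)},
    {"commit": datetime(2023, 9, 4, 8), "deploy": datetime(2023, 9, 4, 20),
     "failed": False, "restored": None},
]

def dora_metrics(deploys, window_days=7):
    n = len(deploys)
    # Deployment frequency: deployments per day over the reporting window.
    frequency = n / window_days
    # Mean lead time for changes: average commit-to-deploy duration, in hours.
    lead_h = sum((d["deploy"] - d["commit"]).total_seconds() for d in deploys) / n / 3600
    failures = [d for d in deploys if d["failed"]]
    # Change failure rate: share of deployments that caused a failure.
    cfr = len(failures) / n
    # Mean time to recovery: average deploy-to-restore duration for failures, in hours.
    mttr_h = (sum((d["restored"] - d["deploy"]).total_seconds() for d in failures)
              / len(failures) / 3600) if failures else 0.0
    return {"deploys_per_day": frequency, "lead_time_h": lead_h,
            "change_failure_rate": cfr, "mttr_h": mttr_h}

print(dora_metrics(deployments))
```

With the sample data above, the three lead times (8, 24 and 12 hours) average to roughly 14.7 hours, one failed deployment out of three gives a change failure rate of about 33%, and the single failure took 2 hours to restore.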
LinearB CEO Ori Keren said the challenge is that not enough DevOps teams track these metrics because, previously, there was a fee associated with running them. LinearB is now absorbing those costs to help DevOps teams that have adopted the company’s namesake software delivery platform become more efficient.
In addition to the trailing DORA indicators of DevOps performance, LinearB’s dashboard makes it possible to track leading indicators such as merge frequency and pull request size. In effect, the ability to track DORA metrics should be table stakes for any DevOps team, noted Keren.
LinearB’s 2023 Software Engineering Benchmarks Report, based on analysis of more than 3.7 million pull requests (PRs) from more than 2,000 DevOps teams, found code reviews are the biggest bottleneck in a DevOps workflow. Best practices for improving code reviews include automating reviewer assignments, setting up automated alerts when changes are made to deprecated application programming interfaces (APIs) and using automated labels to prioritize workflows. Not surprisingly, the report also noted that startups and smaller companies are much more efficient at managing DevOps workflows than the average enterprise.
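The three practices named above can be sketched as a single PR-triage step. The ownership map, deprecated-API list and size threshold below are hypothetical assumptions for illustration, not LinearB's actual tooling or rules.

```python
# Illustrative PR triage implementing the practices above; all rules
# and thresholds here are assumptions, not a real product's behavior.

OWNERS = {"payments/": "alice", "auth/": "bob"}   # path prefix -> reviewer
DEPRECATED_APIS = {"legacy_charge", "old_login"}  # identifiers to flag

def triage_pr(files_changed, lines_changed, diff_text):
    # Automated reviewer assignment: match changed paths to owners.
    reviewers = sorted({owner for prefix, owner in OWNERS.items()
                        if any(f.startswith(prefix) for f in files_changed)})
    # Automated alert: flag any deprecated API touched by the diff.
    alerts = sorted(api for api in DEPRECATED_APIS if api in diff_text)
    # Automated label: prioritize small PRs that can be reviewed quickly.
    label = "quick-review" if lines_changed <= 100 else "large-change"
    return {"reviewers": reviewers, "alerts": alerts, "label": label}

print(triage_pr(["payments/api.py"], 40, "calls legacy_charge()"))
# -> {'reviewers': ['alice'], 'alerts': ['legacy_charge'], 'label': 'quick-review'}
```

In practice this logic typically lives in a CI step or a bot that comments on the PR, but the triage decision itself reduces to simple rules like these.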
In the meantime, one issue DevOps teams are already encountering with the rise of generative AI tools is that the amount of code that needs to be reviewed is increasing rapidly. Unfortunately, the pace at which that code is generated is outstripping the pace at which AI is being applied to accelerate DevOps workflows. As a result, DevOps bottlenecks are likely to worsen.
However, generative AI will soon be applied to surface recommendations for areas where DevOps teams can improve, such as cycle time, noted Keren. Organizations that can effectively apply AI across the entire application life cycle will have a substantial competitive advantage as the pace at which software can be developed and deployed continues to accelerate, he added.
The most important DevOps metric, of course, remains developer productivity. While it’s important for DevOps teams to measure their own efficiency, all that effort is for naught if developers are inefficient. Unfortunately, most of them spend a lot more time managing development environments than they do writing code. The goal for many organizations is to enable DevOps workflows that result in developers being able to deploy higher-quality applications faster.
Each organization will naturally need to determine what level of DevOps maturity makes the most sense to strive for based on how critical software development is to the business. Regardless, the one thing that is certain when it comes to DevOps is there is always room for improvement.