Software Quality Governance: How Quality is an Integral Part of Digital Transformation



Chris Rowett, VP of Customer Success | October 29, 2020

In today’s digital world, organizations need to be data-driven in order to keep up and maintain a competitive edge. 

Joe Colantonio interviewed me on Test Talks to discuss the importance of quality in the Digital Transformation process. Digital Transformation is placing increasing demands on IT to deliver software faster, and processes and technologies have evolved to keep up. However, software quality is still lagging behind. Quality Engineering requires a new approach, Software Quality Governance, to manage built-in quality as an integral part of software development and delivery.

You can listen to the podcast here:

Over the last decade, I have focused on the challenges of Digital Transformation and worked with many enterprises around the world on what Digital Transformation means as a process of change, including organizational design, automation, Agile, DevOps, ways of working, and the tools that accelerate it. These findings gave me insights that help organizations understand how to prioritize these changes. Data is critical for organizations to gain insights, make decisions, and drive quality change.

COVID-19 and Digital Transformation

Since COVID-19, Digital Transformation has pervaded every aspect of life. For example, in my spare time, I play in a worship band at a local church. Since the start of COVID, my band has moved to making music digitally and remotely from home. Rather than standing together on a stage to rehearse and play, we now do that digitally, using new technologies to make it work.

The Digital Transformation journey is not new. It started several years ago as a reaction to new digital disruptors coming into different markets and industries. It really impacted the larger enterprises, because they didn’t necessarily have agility or modern processes. They had a lot of complexity and varied infrastructure, which limited their ability to change and react fast to new digital business models.

For decades, the IT functions in these organizations were fine-tuning their approach to specifying, planning, building, testing, and releasing software, and suddenly, that was not good enough. However, all of their employees, processes, and tools were focused on that well-trodden path. They realized that in order to survive in their markets, or to become the disruptors themselves, they needed to completely rethink their approach. From this, the growth of Digital Transformation was born.

COVID-19 has added another dimension to that. Now that teams have become more agile and distributed since the start of Digital Transformation, they are being forced to use more digital communications to collaborate. This has driven a rise in the need for better metrics and understanding across our software processes: where we are, what we’re doing, and what our quality is. COVID-19 has definitely increased the pressure on organizations to be more deliberate about what they’re doing and how they measure it.

The term “Digital Transformation” is usually used only by upper management, not by the average tester. So how is Digital Transformation explained at a high level to a practitioner?

Digital Transformation Explained

The number one focus of Digital Transformation is the ability to deliver faster and at better quality. You live and die by the quality of your end-user experience. You could argue quality is the number one goal, with speed right behind it, and unfortunately, right behind that is keeping cost flat.

Historically, if you think about the quality-speed-cost triangle, if you wanted to go faster, you could throw more money at it, or you could take shortcuts and reduce quality. Digital Transformation means you have to go super fast and still deliver an outstanding end-user experience at high quality. That puts the focus on QA specifically; it has become a first-class citizen in this whole process. You need to be thinking about quality in everything you do in your end-to-end process, and you have to think in new and innovative ways about how to manage end-to-end quality and what your organization needs to look like to support it.

Digital Transformation, Software Quality, and User Experience

Digital Transformation helps enhance the end-user experience because it improves your ability to put the customer at the center of the process. Today, with the digital channels you deliver through, you get very quick feedback on how you’re doing, the quality of your app, and whether you’re providing a quality of service that meets your customers’ goals. When you’re not, you can measure the bleed of customers and the bleed of revenue. The user experience is number one; it’s your lifeblood, and how you stay focused on it becomes key.

When we’re talking about quality, we are talking about things like production incidents. If I’m using online banking with a new bank, but their application doesn’t work or fails when I’m trying to do simple things because of quality issues, not only might I quickly switch to another provider, so they’ve lost my business, but there will also be damage to the bank’s reputation. If it’s a broad enough problem, maybe it will be on the news, or maybe it will impact their stock price.

Quality in a digital world becomes very important because it’s much easier for us to switch providers and change our minds. If the experience isn’t outstanding, we’ll go somewhere else. Therefore, we have to drive a quality-first principle through everything we do and reimagine our QA functions to support that.

Quality Governance: Better Data for Better Decisions

Digital Transformation means more automated pipelines: accelerating code through the pipelines and then testing it in order to get it ready for release. In the last few years, there’s been a big push on automation. Automation is great, but on its own it doesn’t make any difference to quality, because it’s just about speed. Businesses relying on software for their customer experience need to release complex software faster while keeping a high level of quality. This need for speed, along with increasing complexity, led to a lot of new approaches and practices (Agile, DevOps, Shift-Left) as well as a wide range of new tools and systems. While these new practices improved speed, they also highlighted that existing quality processes and practices are no longer relevant.

With Software Quality Governance, you can augment your automation engines with the right data to make the best decisions using AI. Automation alone doesn’t raise the quality bar at all; getting the right data in the right place, so you can make those decisions and feed them into your pipelines, is mission-critical to achieving quality and speed.

SeaLights collects the available metrics from the end-to-end process in order to make smarter decisions about whether code is ready for release. You cannot afford production incidents and need to minimize them. The only way to improve is by getting the right data at the right time to empower decisions at quality gates.

For example, you can prevent untested code from going into the code base in the first place by providing the people authorizing pull requests with smarter data about how well the code has been tested. It’s about driving data to the right places in the pipeline in order to make data-driven, fact-based decisions.
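As a concrete illustration, here is a minimal sketch in Python of what such a pull-request gate could look like. The data shapes (changed lines and test-covered lines keyed by file) and the 80% threshold are assumptions for illustration, not SeaLights’ actual API.

```python
# Hypothetical pull-request quality gate: block merges when too few of
# the changed lines are covered by tests. Data shapes and the threshold
# are illustrative assumptions, not a real SeaLights interface.

def changed_line_coverage(changed_lines, covered_lines):
    """Percent of changed lines executed by at least one test.

    changed_lines: {file_path: set of line numbers touched by the PR}
    covered_lines: {file_path: set of line numbers hit during test runs}
    """
    total = sum(len(lines) for lines in changed_lines.values())
    if total == 0:
        return 100.0  # nothing executable changed
    tested = sum(
        len(lines & covered_lines.get(path, set()))
        for path, lines in changed_lines.items()
    )
    return 100.0 * tested / total

def pr_gate(changed_lines, covered_lines, threshold=80.0):
    coverage = changed_line_coverage(changed_lines, covered_lines)
    if coverage < threshold:
        raise SystemExit(
            f"Gate failed: only {coverage:.1f}% of changed lines are "
            f"tested (threshold {threshold:.0f}%)."
        )
    print(f"Gate passed: {coverage:.1f}% of changed lines are tested.")
```

A check like this, run in CI and reported on the pull request, gives the approver exactly the fact-based signal described above.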

Quality as a Holistic Value

The strongest message that’s come across from Digital Transformation is that quality is no longer a single department’s job. Quality has become everybody’s responsibility, end-to-end. From defining requirements to testing and test coverage to release processes, quality is now holistic; it’s not a dedicated function. You still need the expertise of quality professionals to lead the entire end-to-end process in a holistic way, but quality is now an end-to-end issue.

This initiative is being driven top-down in the organizations we’re talking to. There are requirements from senior management to understand and measure the quality bar in these organizations. If you want the organization to change, it needs to be a top-down initiative so it has the power, the drive, and the investment behind it to make the change.

This is done by providing a consistent and aggregated view across all of those teams. For example, a consolidated quality dashboard can show, across your application landscape, how quality is improving and which areas need continued focus. A consolidated view allows those executives to keep focusing on the areas that need further development: where teams need more training, where to keep investing, and where to close quality gaps.

Digital Transformation in the Real World

All the organizations I have worked with wanted to become data-driven in their quality, but first they needed to figure out how to get the insights to do so. In other words, they needed to discover how to change their software quality governance in order to go through this transformation. Governance is the way they can actually use the quality insights to figure out how good their testing is and where there are gaps in coverage. The insights can also feed metrics into quality gates and show how teams are maturing along that curve.

When we talk to organizations, they’re always starting at that point: a desire for better insights so they can enforce better behavior, measure how they’re improving, and let the teams know that this is important. The fact that metrics and KPIs appear on a monthly dashboard shows how important this journey is to the organization.

Some of our customers are trying to build modern architectures and reimagine some of their core applications in terms of microservices, cloud-native, and so on. They’re using pod-based teams and modern Agile development methods to achieve that. Other customers have legacy systems that are 20, 30, or 40 years old and still vital to the business. In both cases, you’ve got a complementary but different set of challenges.

For example, if you’re doing a mainframe migration, how are you testing that? Maybe the original developers are no longer around and the original requirements are no longer available, but it’s still the core system. You can build APIs and other layers on top to expose those capabilities to your customers and partners; however, the quality of those legacy systems is just as mission-critical as anything you’re building that is cloud- and microservices-based. So in legacy applications, how do you figure out what to test, or when to monitor production customer interactions? What code is actually executed, and where are your gaps? How much production code today has no test coverage at all?

These kinds of insights allow you to work with partners and internal teams to understand where to focus your investments in order to close the quality gaps first. A lot of this comes back to cost, speed, and quality. Cost is super important in this calculation, so it’s important to understand the best bang for the buck in terms of what to focus on first. What are the high-value customer transactions running today that you know have zero coverage? Those are the first ones you’re going to spend the time and money on to build better testing.
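To illustrate that “best bang for the buck” ordering, here is a small hypothetical sketch that ranks transactions so high-volume flows with zero coverage come first. The transaction names and figures are invented example data, not customer numbers.

```python
# Hypothetical prioritization of test investment: untested, high-volume
# customer transactions first. All names and figures are example data.

transactions = [
    # (name, monthly_volume, test_coverage_percent)
    ("transfer_funds",  120_000,  0),
    ("open_account",      8_000,  0),
    ("view_statement",  300_000, 65),
    ("update_address",   15_000, 10),
]

# Sort zero-coverage transactions ahead of everything else, and within
# each group put the highest-volume (highest-value) flows first.
priorities = sorted(transactions, key=lambda t: (t[2] > 0, -t[1]))

for name, volume, coverage in priorities:
    print(f"{name}: volume={volume:,}, coverage={coverage}%")
```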

Providing organizations with that roadmap, by showing where the big-ticket items are missing coverage, helps reduce the current risk surface. The organization understands that those are the things they are going to build out over the next few sprints as they improve their regression pack. At the same time, the organization also needs to make sure that every new change that is released is tested.

The organization needs to get as fast as possible to the stage where every new change is guaranteed to be tested before the pull request is approved. That way, alongside improving overall coverage, they can slowly reduce the risk surface of the existing code while stopping it from getting worse, because change by change, code is only committed and approved if there are tests that cover it.

We do have customers who are definitely taking these approaches, and their next challenge is adoption. It doesn’t matter if you have tools giving you actionable insights and data if you can’t ensure that all the teams are actually using them. That is where quality governance comes in: adding the processes, metrics, and KPIs that need to show continuous improvement before any release. It’s that next generation of insight that drives the governance that allows adoption to proceed.

AI-Based Technology, Quality Governance, and Team Collaboration

AI and Machine Learning are among the core tenets of the value we bring with the SeaLights platform. Historically, a lot of the work we did in quality was reactive: we reacted to test results, what passed, what failed, and those sorts of things. SeaLights uses AI and Machine Learning to inject quality governance into the process. The intention is that rather than being static and based on a single quality phase, we take a holistic view across the pipelines, end-to-end. We learn all of the testing, the test phases, and the coverage, build by build, and then the AI and Machine Learning engine uses that to make recommendations about where the gaps are, what you should fix first, and which tests are and aren’t worth running.

It’s not just about improving the quality bar; it’s about pushing quality end-to-end across the process. SeaLights gives you the insight to prioritize both where you invest in closing quality gaps and where you reduce quality risk. Additionally, you can use that same recommendation engine, working build by build and test stage by test stage, to tell you which tests from your massive regression set you should run based on the code changes that just happened. You can run only that subset and still get the same high level of quality at much faster speed. Coming back to speed, quality, and cost: that kind of AI-based recommendation engine dramatically reduces how much time you spend in the testing phase while keeping the quality bar high.
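A minimal sketch of that selection idea follows, assuming a per-test coverage map built up from instrumented test runs; the map, test names, and file paths are illustrative, not SeaLights’ real data model.

```python
# Change-based test selection: run only the regression tests whose
# covered files intersect the files changed in this build. The coverage
# map is an assumed input derived from instrumenting each test run.

def select_tests(changed_files, test_coverage_map):
    """Return the subset of the regression suite worth running.

    changed_files: set of file paths modified in this build
    test_coverage_map: {test_name: set of file paths that test executes}
    """
    return {
        test
        for test, files in test_coverage_map.items()
        if files & changed_files  # test touches at least one changed file
    }

coverage_map = {
    "test_login":    {"auth/session.py", "auth/tokens.py"},
    "test_transfer": {"payments/transfer.py", "auth/session.py"},
    "test_reports":  {"reports/monthly.py"},
}

# Only tests exercising the changed code are selected.
print(select_tests({"auth/tokens.py"}, coverage_map))  # {'test_login'}
```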

AI and Machine Learning can also assist in collaboration, something that has become a hotter topic because of COVID and distributed teams working from home. The SeaLights platform gives everyone insights across your whole application landscape and all your test stages. Those consolidated views can be shared with your team at end-of-sprint reviews and during backlog planning, and management can see them monthly as trend reports. How well are all the teams closing quality gaps? How are you improving your coverage? How are you reducing the risk? Having that end-to-end, cross-build, cross-application consolidated data that can be shared at all of those different levels gives you insight and collaboration that you’ve never had before.

For example, a QA manager might see on a particular application’s trend report that the modified code coverage (the share of the code we check in that is tested) is going down. That’s alarming, but he can use the report to drill down and see that the number of unit tests per change is decreasing. Now he has something specific and actionable to report to his development team, together with how to correct it. It’s the ability to pinpoint where the problems are and, in a cost-effective manner, have people quickly change and measure the improvements month by month.
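Here is a toy sketch of the drill-down behind such a trend report, computing modified code coverage and unit tests per change month by month; the field names and numbers are invented for illustration.

```python
# Hypothetical drill-down behind a quality trend report: a falling
# modified-code-coverage number is traced to fewer unit tests per change.
# All figures are invented example data.

monthly_changes = {
    # month: list of (changed_lines, tested_changed_lines, unit_tests_added)
    "2020-08": [(120, 102, 6), (80, 70, 4), (200, 170, 9)],
    "2020-09": [(150, 105, 3), (90, 50, 1), (60, 30, 0)],
}

for month, changes in sorted(monthly_changes.items()):
    changed = sum(c for c, _, _ in changes)
    tested = sum(t for _, t, _ in changes)
    tests_per_change = sum(u for _, _, u in changes) / len(changes)
    print(
        f"{month}: modified code coverage {100 * tested / changed:.0f}%, "
        f"unit tests per change {tests_per_change:.1f}"
    )

# 2020-08: 86% coverage at 6.3 tests/change; 2020-09: 62% at 1.3. The
# drop in coverage is explained by the drop in unit tests per change.
```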

These insights become mission-critical to meeting Digital Transformation goals, by making sure that the acceleration you’re trying to accomplish with automation happens at quality. What makes them powerful is the ability to aggregate and share them at each level of the organization and make them actionable.

Automated Quality Decisions with AI and Machine Learning

We started by talking about automation being a great thing for speed, but automation on its own, if you don’t think about quality, just accelerates the problem by pushing bad code into production faster. 

Adding to automation, AI and Machine Learning take many additional factors into account. Today, a lot of quality gates are based on passed or failed tests. With SeaLights, we can also look at things like test gaps: code that isn’t tested. You can focus on modified code. Based on the recent check-ins that have gone into the build you are working on and executing, you can see how much of that new or changed code has been tested.

You can then make automated decisions based on that information by adding those insights into your quality gates. You can use those gates either to fail a build or to flag violations of your quality governance policies. In the future, you can use that data about your builds, modules, and teams to make smarter decisions that lower your risk profile, in terms of how fast you can push code into production and how smart those quality gates can be.
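One way to picture such a gate is as a governance policy evaluated against each build’s metrics, sketched below. The metric names, thresholds, and policy shape are assumptions for illustration only.

```python
# Hypothetical multi-signal quality gate: a governance policy evaluated
# per build, going beyond pass/fail tests to test gaps and modified-code
# coverage. Metric names and thresholds are illustrative assumptions.

POLICY = {
    "failed_tests":           {"max": 0},
    "modified_code_coverage": {"min": 80.0},  # % of changed code tested
    "test_gap_methods":       {"max": 25},    # untested methods in build
}

def evaluate_gate(build_metrics, policy=POLICY):
    """Return the list of policy violations for a build (empty = pass)."""
    violations = []
    for metric, rule in policy.items():
        value = build_metrics[metric]
        if "max" in rule and value > rule["max"]:
            violations.append(f"{metric}={value} exceeds max {rule['max']}")
        if "min" in rule and value < rule["min"]:
            violations.append(f"{metric}={value} below min {rule['min']}")
    return violations

build = {"failed_tests": 0, "modified_code_coverage": 72.5, "test_gap_methods": 12}
for violation in evaluate_gate(build):
    print("Gate violation:", violation)  # fail the build, or notify per policy
```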

SeaLights is just the beginning; the engine we have and how it empowers teams today is very significant. However, the future is also bright, because as you start adding all those extra dimensions to AI and Machine Learning, it becomes more and more intelligent and informed in the decisions it can make to help you achieve quality and speed.

Real End-to-End Visibility, Even in Production

SeaLights provides end-to-end visibility, including in production. We have specific capabilities to monitor and listen to the deployed application in production. We can give you the same coverage and find the same quality gaps based on code that’s running in production, even after it’s been deployed and checked through the earlier stages. This is groundbreaking for companies with legacy applications.

In some cases, legacy applications have no documentation and the developers who built them are no longer around, leaving you with little to go on. In those cases, if you really want to reduce the risk surface, your only option is to track them in production. The nice thing about that is that production is where you have real customer interaction. You can learn how customers are really using the deployed applications, and see which methods they’re invoking and which code paths are actually executed. From those builds, the AI and Machine Learning will track usage and then recommend the areas where you need to build tests.
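At its core, the idea can be pictured as a set difference between what production executes and what tests cover, as in this minimal sketch; both inputs are assumed to come from runtime instrumentation, and the method names are invented.

```python
# Minimal sketch of production visibility: methods observed running in
# production that no test has ever covered are the highest-risk gaps.
# Both sets are assumed to come from runtime instrumentation.

def untested_production_paths(production_methods, tested_methods):
    """Methods real customers execute that no test covers."""
    return production_methods - tested_methods

prod_methods = {"Account.get_balance", "Transfer.execute", "Statement.render"}
tested_methods = {"Account.get_balance", "Statement.render"}

# Transfer.execute is hit by customers but by no test: a priority gap.
print(untested_production_paths(prod_methods, tested_methods))
```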

Building tests is expensive, so it helps to prioritize and focus on the areas where you know there are critical customer interactions with code that has no test coverage. These are the priority areas where you need to start building regression tests.

Get Started on Your Digital Transformation Journey Today

Organizations use SeaLights to answer big questions about quality and to support their Digital Transformation and Quality Governance processes. The platform provides them with insights through Machine Learning and AI so they can make decisions and act on those insights in cost-effective ways.

Contact me via LinkedIn if you want to discuss how Software Quality Governance can be a part of your Digital Transformation journey.