It's easy to prophesy about innovations in software development and testing. But for these predictions to mean anything at all, they must include two key elements: context and advice. The what is unimportant, or even misleading, without the why and the how.
Go beyond the hype. Innovations in software testing and development matter in terms of how that technology affects day-to-day work and responsibilities. Artificial intelligence and machine learning, for example, give QA professionals new ways to approach test coverage.
Sometimes innovation is forced upon an IT organization. The COVID-19 pandemic made development teams rethink how they measure and visualize work. Developers also need innovations to deal with stark differences between legacy and modern applications, and adapt to modern software architectures, such as microservices for mobile apps.
Let's explore these innovations, and the practical effects these processes and technologies will have on developers and testers.
1. AI and test coverage
AI and ML offer promise for automated testing, but in reality many organizations can't yet take advantage of this technology. AI-based systems require a great deal of training to make correct choices.
Some commercially available testing tools use AI to guess identifiers in the user interface as they change. This means the tool will recognize, for example, a shopping cart button on a retail app, even if the image changes from one version of the UI to the next. Additional problems AI can solve for test tools -- again, with a lot of training -- include character and handwriting recognition.
AI has exciting uses for test design as well. Look for AI software that can take a set of input decisions and generate test ideas through pairwise, or all-pairs, testing. Some tools can even apply a weight to a particular choice.
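To make the pairwise idea concrete, here is a minimal sketch of the technique such a tool automates: given several input parameters, pick a small set of full combinations so that every pair of values appears together at least once. This is a naive greedy approximation written for illustration; the parameter names and values are hypothetical, and real tools use far more sophisticated algorithms.

```python
from itertools import combinations, product

def pairwise_tests(params):
    """Greedy approximation of all-pairs: repeatedly pick the full
    combination that covers the most not-yet-covered value pairs."""
    names = list(params)
    # Every pair of values that must appear together in at least one test.
    uncovered = set()
    for a, b in combinations(names, 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add(((a, va), (b, vb)))

    candidates = [dict(zip(names, combo)) for combo in product(*params.values())]
    tests = []
    while uncovered:
        def gain(t):
            return sum(1 for pair in uncovered
                       if all(t[k] == v for k, v in pair))
        best = max(candidates, key=gain)
        tests.append(best)
        uncovered = {p for p in uncovered
                     if not all(best[k] == v for k, v in p)}
    return tests

# Hypothetical test inputs for a retail app.
suite = pairwise_tests({
    "browser": ["Chrome", "Firefox"],
    "os": ["Windows", "macOS", "Linux"],
    "account": ["guest", "member"],
})
print(len(suite))  # noticeably fewer than the 12 exhaustive combinations
```

Exhaustive testing here needs 2 x 3 x 2 = 12 combinations; the pair-covering suite is smaller, and the savings grow dramatically as parameters are added.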
Imagine using AI to build automated and exploratory tests based on actual core use cases. An AI-based IT monitoring tool analyzes logs or data about what customers actually do with a piece of software. A team can use the AI's interpretation of this information to adjust manual and automated test cases. Some testers already approach load testing this way, to realistically simulate software use.
An AI tool for analytics can break down data to show the amount of time software users spend on each feature, and compare that figure to defect sources, the amount of churn and the value to the company; this creates a weighted and mature test strategy for the software.
2. Ways to measure and visualize workflow
The COVID-19 pandemic prompted development teams to figure out, among other things, how to create software remotely. This upheaval has led to a rethinking of how IT organizations measure and visualize the flow of work.
This shift to distributed teams has managers in search of new tools and features, including features and plugins in Atlassian's Jira that track cycle time and lead time -- roughly, how long a feature takes to go from the start of work to done, and from initial concept to delivery, respectively. While it might be easy to find these tools, it's another challenge to use them effectively.
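Whatever the tool, the underlying arithmetic is simple. The sketch below computes average lead time and cycle time from a hypothetical ticket export with three timestamps per work item -- the kind of data a Jira plugin reads; the dates are invented for illustration.

```python
from datetime import datetime

# Hypothetical ticket export: when each item was requested,
# when work started, and when it shipped.
tickets = [
    {"created": "2024-03-01", "started": "2024-03-05", "done": "2024-03-12"},
    {"created": "2024-03-02", "started": "2024-03-03", "done": "2024-03-09"},
]

def days_between(a, b):
    fmt = "%Y-%m-%d"
    return (datetime.strptime(b, fmt) - datetime.strptime(a, fmt)).days

lead_times = [days_between(t["created"], t["done"]) for t in tickets]   # concept -> delivered
cycle_times = [days_between(t["started"], t["done"]) for t in tickets]  # work start -> delivered

print(sum(lead_times) / len(lead_times))    # average lead time: 9.0 days
print(sum(cycle_times) / len(cycle_times))  # average cycle time: 6.5 days
```

The gap between the two averages is itself informative: it's time work sat in a queue before anyone touched it, which is often the first thing flow-based management attacks.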
Pre-COVID-19, managers could stick to more traditional -- and sometimes subjective -- measurements. For example, they could manage by butt-in-seat time, promises or blame. Lean software testing, a tool set to manage and predict flow, can clarify some of this vagueness.
Managers struggle to measure and visualize workflow when different roles and teams work in silos -- also called ensiled -- each with distinct tools and terms. As tool sets like Visual Studio or Eclipse expand in scope through plugin architectures, they can either improve or worsen the situation. If everyone uses the same major tool, comparison is easier. Conversely, if team members each have their own favorite plugins and processes, workflow becomes unique to each team, or even each person.
Not sure whether current tools are working for the whole organization? Track messaging. The more ensiled each team is -- and the more tool sets vary -- the more problem solving will take place on the organization's instant messaging platform(s).
Change isn't easy
Even if a trend or innovation seems like a no-brainer for a development team, adoption likely won't happen without buy-in from the business. IT organization executives who lack technical expertise still make the decisions -- no matter how much developers grumble.
Successful change initiatives often require a coalition between these roles:
- executive sponsor, to give the initiative authority;
- champion, to increase its visibility;
- coach, to roll out change across teams and roles;
- specifier, who defines what the initiative is; and
- subject matter expert, who explains how it works.
It's a tall order to get all those people aligned. For this reason, "business as usual" is often a more appealing option.
3. Modern techs and legacy apps
New applications can be cloud-native, created in a CI/CD pipeline. But many organizations, especially those that have been around for a decade or more, have monolithic legacy applications.
The gulf between what works best for legacy applications and for newer software is widening. IT organizations must decide whether they should design software according to the status quo or with modern approaches. That architecture choice will help to define the right best practices and even technologies for the project.
An organization with time and expertise could rewrite a monolithic app to take advantage of newer software development, test and deployment possibilities. One option is to retire a legacy app incrementally via a strangler pattern approach, but that doesn't always make business sense, or the organization may lack the capabilities or capacity to do so.
Innovation in application builds and deployment through Docker containerization has changed that calculus slightly for applications that run on Linux. While Windows support for containers remains anemic, a CI/CD pipeline that creates test environments on demand in a Kubernetes cluster can clear many environment bottlenecks. For example, the ability to create a test environment on demand eliminates needless waiting. The same applies to test data; a development team doesn't need to worry about a corrupt or occupied test environment when it can create a test database or service as needed.
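The payoff of disposable test data can be shown in miniature. In a real pipeline the fresh environment would be a containerized database or a Kubernetes namespace spun up per job; here an in-memory SQLite database stands in for that on-demand resource, and the table and seed row are invented for illustration.

```python
import sqlite3

def fresh_test_db():
    """Create a disposable database per test run -- no shared, corruptible state.
    The in-memory connection stands in for an on-demand container or namespace."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE carts (id INTEGER PRIMARY KEY, item TEXT)")
    conn.execute("INSERT INTO carts (item) VALUES ('sample widget')")  # known seed data
    conn.commit()
    return conn

# Two "tests" each get their own pristine environment.
db1, db2 = fresh_test_db(), fresh_test_db()
db1.execute("DELETE FROM carts")  # test 1 destroys its copy...
rows = db2.execute("SELECT COUNT(*) FROM carts").fetchone()[0]
print(rows)  # ...and test 2's data is untouched: 1
```

Because no test can corrupt another's data, tests can run in parallel and nobody waits for a shared environment to be reset -- the same property the Kubernetes-based pipeline provides at full scale.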
If an IT organization is one of the cool kids, relying on automation from coding to delivery in a CI/CD pipeline, it likely already applies other modern precepts, like DevSecOps. For legacy applications, transformation will come one victory at a time. Don't try to change everything at once.
4. Microservices architectures for mobile apps
Microservices are the emerging standard to integrate a light application front end with legacy architecture on the back end. This meeting of modern and legacy systems is necessary for mobile development if one isn't starting out with everything in a cloud-native environment.
Mobile development tools are relatively easy to adopt. However, getting microservices right can take a fair bit of work. To write microservices, organizations must create a service catalog, extract and isolate business logic and design tests carefully.
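"Extract and isolate business logic" has a concrete shape: pull the rules out of the transport layer into pure functions, so they can sit behind any microservice endpoint and be tested without spinning up HTTP servers or databases. The shipping rules and request format below are hypothetical, chosen only to illustrate the separation.

```python
# Business logic extracted into a pure function -- no HTTP, no database --
# so it is trivially testable and reusable behind any service endpoint.
def shipping_cost(subtotal_cents, express=False):
    base = 0 if subtotal_cents >= 5000 else 599  # hypothetical free-shipping threshold
    return base + (999 if express else 0)        # hypothetical express surcharge

# A thin service handler just parses the request payload and delegates.
def handle_request(payload):
    return {"shipping_cents": shipping_cost(payload["subtotal_cents"],
                                            payload.get("express", False))}

print(handle_request({"subtotal_cents": 6200}))                   # {'shipping_cents': 0}
print(handle_request({"subtotal_cents": 1200, "express": True}))  # {'shipping_cents': 1598}
```

The handler stays disposable and framework-specific, while the logic it wraps becomes a catalog-able service capability with a test suite of its own -- which is where most of the careful test design the section mentions actually happens.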