It’s time to take a fresh look at Software Quality Assurance, a discipline that deserves much more respect than it usually gets.
Managing the software development process for the ultimate benefit of the users of the software is the work of Software Quality Assurance (QA). QA work in software has always been widely misunderstood and often done poorly, but in recent times especially it has languished.
Because of some historical accidents, QA in software development is wrongly conflated with software testing, so much so that there is a strong movement among software testers to divest themselves of the title of QA altogether. But that leaves real Quality Assurance an orphan without champions.
Software testers are often the sole members of the development team working on the system as a whole, and are often the only members of the team working as proxies for actual users. The critical skills of good software testers, the ability to identify issues and to advocate change, are a great start to providing real Quality Assurance on agile software development projects. There is a strong case to be made that software testers are in a position to provide good QA to software development teams. But QA work demands skills and experience beyond software testing. Good QA work is methodology work, not testing work.
Besides testing, I would identify three additional areas that provide a good background for QA work, although you may find my selections surprising. But before I get to them, let me motivate the discussion with an example of why QA work might be necessary on agile software development projects.
Software Application Semantics
There is a thing that happens sometimes on agile software development projects that looks something like this:
A fairly complex hunk of data is to be exposed to users. The development team builds a “view” page to display that data. But privileged users need to edit the data, so then in subsequent iterations the team builds an “edit” page for privileged users that is hidden from unprivileged users. At this point the application has very clean and clear semantics: the “view” page displays some complex data, the “edit” page allows privileged users to alter that data.
But over a few more iterations, it becomes necessary to edit the data in several different modes, so the team builds multiple search and view options into the “edit” page. Over a few more iterations, it becomes desirable to allow unprivileged users to edit certain aspects of the data in question, so all users, regardless of privilege, are given a limited edit ability on what had been the “view” page.
This is the point where quality on an agile project can begin to suffer. Each enhancement to the “view” page and to the “edit” page makes sense within the context of each iteration. But after a number of iterations, the original semantics governing the concepts of “view” and “edit” have become muddled, overlapping, and confused. And when the semantics of a software application are muddled and confused, the workflow presented to the users of the application is also muddled and confused.
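The drift in this scenario can be sketched in code. This is a toy illustration only, with invented names (User, the actionsFor functions) that do not come from any real project; it simply contrasts the clean early semantics with the muddled later ones:

```typescript
// Hypothetical sketch of semantic drift across iterations.
// All names here are invented for illustration.

type User = { privileged: boolean };

// Early iterations: clean, disjoint semantics.
// The "view" page only displays; the "edit" page alters data,
// and only for privileged users.
function actionsForViewPage_v1(_user: User): string[] {
  return ["display"];
}

function actionsForEditPage_v1(user: User): string[] {
  return user.privileged ? ["display", "edit"] : [];
}

// Many iterations later: the pages' meanings have blurred.
// The "view" page now edits a little, for everyone; the "edit"
// page now also searches and views, even for unprivileged users.
function actionsForViewPage_vN(_user: User): string[] {
  return ["display", "edit-some-fields"];
}

function actionsForEditPage_vN(user: User): string[] {
  return user.privileged
    ? ["search", "display", "edit"]
    : ["search", "display"];
}
```

In the v1 functions, the words “view” and “edit” tell a user exactly what each page does; by vN the same capabilities appear on both pages, and the page names no longer describe their behavior.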
The same sort of confused semantics is often found in very old legacy applications with dedicated function keys and combination key presses and such, where special training materials and documentation are required to be able to manipulate the application correctly. Agile projects are capable of generating this same sort of complexity and confusion—and they can do so much more quickly.
Testing or Design or UX?
Not every agile team ends up with confused application semantics, and there are people with certain roles on software development teams who may help prevent such mistakes from happening.
Dedicated software testers certainly may be in a position to spot confusion in the UI as it comes up, and to help guide the semantics of the application in a more appropriate direction.
Front-end designers and User Experience (UX) specialists may also be in a position to prevent confusing, contradictory, and inefficient workflow from evolving over the course of time. Although the state of UX practice today is an interesting grab-bag of technique and opinion, there are some emerging and coherent schools of thought that embody a high degree of sympathy for the users of software, something that has not always been true in the history of software development.
But regardless of one’s role on the team, it is often still difficult to see the forest for the trees when working in an iterative manner. Some real, dedicated QA work is often required—and is frequently missing.
Linguistics, Architecture, Methodology
I use the word “semantics” liberally in this article, and I do so quite deliberately. Semantics is the study of meaning, and someone who understands a bit about semantics will more readily see when the concepts of “view” and “edit” begin to lose their meaning. Taking this further, an understanding of semiotics is even more useful.
Semiotics deals with “signs,” the relationships between the “signified,” the swarm of concepts and meanings that comprise the culture within which the work is performed, and the “signifiers,” the concrete representations of the signified. The culture of software development has its own signs: “search,” “display,” “expand/collapse,” “edit,” “enter,” “submit,” “save,” “OK/Cancel,” “CRUD,” “push,” “pop,” etc., etc., etc. To create software is a linguistic act: it is to string together signifiers in the same way that an author writes a book or that performers put on a show. Someone working in QA (or UX, for that matter) would do well to have a good understanding of semantics, semiotics, and linguistics generally.
Besides linguistics, someone working in Quality Assurance should have a solid grasp of software architecture. Without a visceral understanding of the consequences of technical debt and of the legacy of bad design, Quality Assurance will be mostly illusion. This is one of the reasons that software testers reject the label of QA so vehemently: merely having a Quality Assurance process in place will not guarantee that the product will be of high quality.
One useful way to view Quality Assurance work is that it exposes a series of examples and counterexamples that ultimately embed a useful methodology within the software development process. Brian Marick has a remarkable paper on this subject from 2004 called “Methodology Work is Ontology Work.”
The paper is remarkable in a number of ways (Kent Beck and Ron Jeffries as Emersonian philosophers?), but of particular interest to those working in QA is the well-considered justification for finding some few core postulates, or some sets of them, and for then building a local methodology or a process based in those few postulates. But “[m]ethodologies do not succeed because they are aligned with some platonic Right Way to build software. Methodologies succeed because people make them succeed.” This conclusion of course resonates strongly with the first value of the Agile Manifesto, “Individuals and interactions over processes and tools.”
Software testing or UX work is a good first step to real software QA work. Add linguistics, add a deep understanding of software architecture and a sophisticated view of methodology, and you put real Software Quality Assurance back into the development tool box.
Chris McMahon is a software tester and former professional bass player. His background in software testing is both deep and wide, having tested systems from mainframes to web apps, from the deepest telecom layers and life-critical software to the frothiest eye candy. Chris has been part of the greater public software testing community since about 2004, both writing about the industry and contributing to open source projects like Watir, Selenium, and FreeBSD. His recent work has been to start the process of prying software development from the cold, dead hands of manufacturing and engineering into the warm light of artistic performance. A dedicated agile telecommuter on distributed teams, Chris lives deep in the remote Four Corners area of the U.S. Luckily, he has email: email@example.com.