Although I am fortunate never to have faced such a situation myself, dealing with increasing amounts of confidential material can raise questions for testers about the ethics of some projects or software.
At the recent Nordic Testing Days, an annual international testing conference, I attended the keynote by Fiona Charles who teaches organizations to manage their software testing risks and IT practitioners to take their project skills “beyond process”. She offered some practical suggestions for coping as an ethical tester, calling them her “10 commandments”.
I would like to take the liberty here of sharing her commandments, as I believe they could be helpful for software testers anywhere.
1. Always try to minimize the negative consequences of computing systems and ensure that the products of your efforts will be used in socially responsible ways, meaning no harm to health, safety, welfare, or the local or global environment.

2. Obey existing local, state, national and international laws unless there is a compelling ethical basis not to do so. You should also obey the policies and procedures of the organizations in which you participate.

3. Violating a law or regulation may be ethical when that law or rule has an inadequate moral basis or when it conflicts with another law judged to be more important. If you decide to violate a law or rule because you view it as unethical, you must accept full responsibility for the consequences. You may also choose to draw the line at working for certain companies whose practices violate your ethics.

4. One way to avoid unintentional harm is to carefully consider the potential impact on everyone affected by decisions made during design and implementation. You are paid by certain stakeholders, yet you also serve other stakeholders and end users as the person responsible for collecting and reporting accurate information about the actual functioning and quality of the product.

5. Technology enables the collection and exchange of personal information on a very large scale, and with it comes increased potential for violating the privacy of individuals and groups. Take precautions to ensure the accuracy of data and to protect it from unauthorized access or accidental disclosure to inappropriate individuals. Also beware of dark patterns that trick unwary users into doing things they don't really want to do.

6. Avoid making deliberately false or deceptive claims about a system or system design; instead, provide full disclosure of all pertinent system limitations and problems. Sometimes this may mean choosing between your integrity and your job.

7. You have the responsibility to request a change to any assignment that you feel cannot be completed as defined. The underlying principle here is the obligation to accept personal accountability for professional work. At the end of the day, your professional reputation is all you have to stand up for you. Even those who would like to influence you to skew information will eventually recognize your moral grounds.

8. Be able to give comprehensive and thorough evaluations of software systems and their impact, including analysis of potential risks. Sometimes powerful people don't want their illusions shattered, and they may fight you. Being aware of this possibility gives you the opportunity to prepare.

9. You have an obligation to report any signs of risk that might result in serious personal or social damage.

10. Always know how far you are willing to go for the truth. Ingrained fear makes many people exaggerate in their own minds the risks involved in challenging authority. Think through your tolerance for possible losses.
Interested in testing? Have a look at Helena Jeret-Mäe's starter kit for testers.