This is a guest post by Peter G. Walen.
A constant theme in Western literature is that a bright future can be promised if we trust technology. A common counter-theme is that technology will destroy society, maybe the world. Perhaps these themes are more related than they might seem. Perhaps they tie into the function and role of software testing.
You see the idea in many works of fiction and science fiction: some form of technology will make things better. People will be freed from the drudgery of their lives, have more leisure, and be protected from dangerous forms of work.
It is a common theme dating to the Industrial Revolution. Machines that could do the mundane work would make life better for the people currently doing it. Since the late 1780s, people have been speaking out, pro and con, on how society will be changed, and how people’s lives will be changed.
As people working in technology sectors, we often find ourselves looking at how amazing it would be to implement a given “solution.” We see the outcomes as clear positive solutions to specific problems. We see technology as an obviously good thing. After all, the work we are doing can be grown and developed to help people, not just customers, do something better. It is a huge part of why many of us went into this field.
I am often reminded that sometimes, even in fiction, the “solution” provides the basis for the next problem.
Things to Come
For example, the 1936 British film “Things to Come” delivered a message about scientists working for the greater good and the benefit of mankind. Its theme: ignore science at your peril. It told a story of scientists saving humanity from a new dark age.
Set in the “near future” of December 1940, the film opens with people getting ready for the Christmas holidays even as threats of war loom in the newspapers and on the radio. A scientist-engineer warns of horrific things to come: the stagnation of knowledge, medicine, and science, and potentially the destruction of all of society. War DOES come, as people are celebrating Christmas. Society collapses and a plague sweeps across the world; in short, everything the scientist predicted comes to pass. Dramatically.
The war continues non-stop for decades. Long after the war began, the original scientist returns, flying a new airplane unlike anything seen by any of the people in the film. He decries the “little warlords” he and his group have encountered around the world. He talks about a society being built on knowledge, wisdom, and science, and how they were eliminating war by eliminating its causes: want, greed, and inequality.
The scientists built an egalitarian society ruled by logic and scientific principles which made war and violence not merely unlikely, but impossible.
Time goes on; the grandchildren of the founders are now running society, and everyone is living in peace and harmony. Except trouble is brewing. A growing faction is unsettled by the pace of advancement. They see the advances as going too far and needing to be pulled back. The leaders wax poetic on how ignorance and fear will always try to drag society backward, and how learning and science can save them, if allowed.
Jurassic Park
The original Jurassic Park movie featured Jeff Goldblum in perhaps the best casting of a scientifically grounded skeptic of scientists pushing the envelope of achievement. The film, about a massively rich fellow wanting to open a theme park featuring dinosaurs of various eras, complete with now-extinct vegetation for the herbivores, was a pop-culture smash. Loads of people watched a movie about unintended consequences. Loads of people watched the “gee-whiz” science portrayed and were awestruck. Here were scientists tracking down the elusive DNA strands of long-dead animals and figuring out how to bring them back to life.
Everything was there from cute little chicken-sized dinosaurs to massive brachiosaurs, velociraptors, and tyrannosaurs. Everything was great, until it wasn’t.
Instead of looking at how science and technology are great things for human society, this movie has a more sober take.
The message from the 1930s was that we could always count on science and scientists to look out for our best interests. In Jurassic Park, one line rang out: “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.”
Real Life, Today
We see amazing technology being created and deployed all around us. It can help keep us safe when driving our cars. It can monitor our homes when we’re away. It can adjust our thermostats for optimum comfort and energy savings at the same time. There are software-driven vacuums that scurry around a room knowing where they are going and which areas they have and have not cleaned, keeping things tidy for us.
We see facial recognition tagging our friends, and maybe us, in social media. We see law enforcement experimenting with similar technology to scan crowds and identify people wanted for crimes. We see autonomous vehicles learning to drive from one location to another without human intervention. We see similar technology deployed in “fighting vehicles” and in autonomous combat and surveillance aircraft.
We also see how these systems fail, sometimes fatally, because the people developing the technology “missed something.” Of course, broadening the basis for what people test for is part of the solution. Still, if people are not aware of how their perceptions might impact the software driving the amazing technology, how can we help illuminate the gaps?
Can we look at how the technology is meant to be used, then consider how it might be misused? Should we, as knowledge workers in technology, consider the impact of what we are making beyond its stated purpose?
Are we like the scientists in “Things to Come” who are so certain that we can make things better, that anyone who opposes our work must simply be refusing to understand how what we are building will make things amazingly better for everyone? Or are we like the scientists in Jurassic Park solving incredible challenges because the challenges would be so cool to solve?
At what point do we need to ask ourselves and our organizations, “Just because we can, should we?”
Peter G. Walen has over 25 years of experience in software development, testing, and agile practices. He works hard to help teams understand how their software works and interacts with other software and the people using it. He is a member of the Agile Alliance, the Scrum Alliance, and the American Society for Quality (ASQ), an active participant in software meetups, and a frequent conference speaker.