Mobile/IoT: When is it a bug and when is it an improvement?

Last week the Chevy Volt car underwent a massive recall to update its software (you can Google it). It seems they will issue a software update to shut the car off after about 1.5 hours, because the car can be left “running” (on batteries at first, then on the small gas motor) and the driver can miss it because the car is “too quiet.”

Now to their credit, they do have warnings that sound when the car is left running and not moving for some period of time. This is good, but there have still been cases where the warnings were ignored, the car kept running, and carbon monoxide built up in a garage (this is bad). So they added a new feature to “fail safe” by shutting the engine down completely after the warnings and the time period expire. Easy fix.
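The underlying logic is simple enough to sketch. Here is a minimal illustration of a warn-then-shutdown fail-safe timer in Python; the thresholds and names are my assumptions for illustration, not GM’s actual implementation:

    # Minimal sketch of a warn-then-shutdown fail-safe timer.
    # Thresholds and names are hypothetical, not GM's actual values.
    WARN_AFTER_S = 30 * 60       # assumed: warn after 30 minutes of idling
    SHUTDOWN_AFTER_S = 90 * 60   # ~1.5 hours, per the recall reports

    def failsafe_tick(idle_seconds: int, warned: bool) -> tuple[str, bool]:
        """Decide what the car should do after idling for idle_seconds."""
        if idle_seconds >= SHUTDOWN_AFTER_S:
            return ("SHUTDOWN", warned)      # fail safe: stop the engine outright
        if idle_seconds >= WARN_AFTER_S and not warned:
            return ("SOUND_WARNING", True)   # existing behavior: chime at the driver
        return ("NO_ACTION", warned)

The point of the fail safe is that the shutdown branch does not depend on the driver noticing anything: even if every warning is missed, the system still converges to a safe state.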

Those of us working with hardware systems have long had the joke, “we will fix that hardware-system problem in the software.” This is the great thing about software. We can do this.

But I was left wondering: did testers report the “missing feature” years ago but miss the carbon monoxide buildup as an effect? Did the system have a comprehensive risk/failure modes and effects analysis (FMEA) done? Many embedded/IoT systems do have these very detailed analyses (my book talks about them).
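For readers who have not used FMEA: each failure mode is rated for severity (S), occurrence (O), and detectability (D), usually on 1–10 scales, and the risk priority number RPN = S × O × D ranks which modes to mitigate first. A hypothetical fragment (these rows and ratings are mine for illustration, not from any real GM analysis):

    # Hypothetical FMEA rows for illustration only (not from a real analysis).
    # RPN = severity x occurrence x detectability; higher RPN = fix first.
    failure_modes = [
        # (failure mode,                              S,  O, D)
        ("car left running unattended in a garage",   10, 4, 7),
        ("idle warning chime ignored or unheard",      9, 5, 8),
        ("battery drains and gas engine auto-starts",  8, 3, 6),
    ]

    for mode, s, o, d in sorted(failure_modes, key=lambda r: -(r[1] * r[2] * r[3])):
        print(f"RPN {s * o * d:4d}  {mode}")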

Now many software developers will argue that they met the requirements, so the new feature was not a bug fix but an improvement to the system. I argue that the cost of the recall might have paid for a more comprehensive system-software-test FMEA. What else did they miss?

Security, Insecurity, and the Single Tester

Many of us have been writing and talking about the need for better security throughout the IT industry. I focus on Mobile, Embedded, and IoT, but security spans all types of software and systems. Now, with all of the news reports, industry incidents, and government involvement (see President Obama’s speech this last week), there is recognition and expanding action, but what should be next for security testing?
I don’t think there will be consensus. For example, some say the answer is better software development, including good testing. Others want to build walls and cyber safeguards and mount a cyber offense, just as with traditional military and police actions. Still others want rules, regulations, and penalties for both sides of security (the companies and the bad-guy hackers).
As with most complex things, and cyber IT is certainly complex, the answer will be yes to all of the above and many more. A key step has happened: almost all of the users and interested parties now seem to be aware of the problems. When people find out that I am involved in security testing, they ask, “should we be scared of IT security?” and I answer, “yes, and you should be more scared.”
There are actions being taken and many things to be considered. We have cyber security warriors in some places, and I have written about the need to grow the number of cyber-tester security warriors. While I realize that the testing community will not agree on my list of near-term actions, I think there are many possible paths toward becoming better cyber security test warriors (experts). The general actions I think should be considered include:
1. Learn more about general software testing from books and classes, and for some, consider certifications (ISTQB is not supported as a good idea by everyone, but it can be an early step to gaining test KNOWLEDGE)
2. Practice tester skills (see the AST skill list – TBD web site) and become an experienced and ever-improving skilled tester (I have been practicing testing for 35 years and still have more to learn)
3. Learn more about the hacker’s world and their skills (this means we need to become “good” hackers/crackers to be able to “fight fire with fire”)
4. Understand and work with government and industry regulations and standards (yes, I know many of you don’t believe in them, but standards will be put in place and will get abused, so we should work to make standards and policies as acceptable as possible and then know how to use them correctly)
5. Know more about how to develop better software, including security and other qualities (this means we must be more than testers, e.g., be software and system people)
6. Understand risk-based testing driven by the integrity level of the software (IEEE 1012 and ISO 29119; again, I know that some people dislike these standards, but they represent a baseline starting point from which to tailor processes, techniques, and documents; see the sketch after this list)
7. Be better practiced in testing the non-functional elements of the software-system, including quality testing, model-driven testing, math-based test techniques, and attack-based exploratory testing (these approaches are often misunderstood or poorly used “tools” of our industry, and testers should have a great many test techniques they can use beyond just checking requirements).
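To make item 6 a bit more concrete, here is a minimal sketch of test selection driven by integrity level. The four levels echo the spirit of IEEE 1012, but the mapping from level to test activities is my own illustration, not the standard’s text:

    # Illustrative mapping from software integrity level (1 = lowest risk,
    # 4 = highest) to test rigor. The activities per level are my assumption,
    # loosely in the spirit of IEEE 1012, not quoted from the standard.
    RIGOR_BY_LEVEL = {
        1: ["requirements-based checks"],
        2: ["requirements-based checks", "attack-based exploratory testing"],
        3: ["requirements-based checks", "attack-based exploratory testing",
            "model-driven tests"],
        4: ["requirements-based checks", "attack-based exploratory testing",
            "model-driven tests", "math-based techniques", "independent V&V"],
    }

    def plan_tests(component: str, integrity_level: int) -> list[str]:
        """List the test activities a component should undergo for its level."""
        return [f"{activity}: {component}"
                for activity in RIGOR_BY_LEVEL[integrity_level]]

    print(plan_tests("engine shutdown timer", 4))

The idea is simply that higher-integrity components earn more, and more varied, testing; the tailoring is where the real engineering judgment lives.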
I know quite a few software people and testers feel that many of these ideas are “wrong” and even toxic. I hear that software and testing are arts, and we need more creativity. True, but software testing is much more. I hear that we need more rigor using math or models in engineering development and test. True in part, but software is more than just science and engineering. I hear we don’t need regulations for our test industry because it is “too young,” or because regulation restricts free thinking and lets managers hide from the “hard work of testing” by claiming “dumb” conformance to meaningless documents. There is truth in these statements too, but every discipline started some place (read the history of the early books on medical anatomy), and having some regulations can force better development behaviors than the current “open season” in the wild, wild west of software security. For example, clean air regulations have helped to keep the air clean in many USA cities in my lifetime. We should not too quickly dismiss standards.
We will never solve all aspects of cyber security. Just as with security in everyday life, we will need the police, military, artists, and engineers. This has not changed for thousands of years. Cyber has just given the bad players a new environment in which to commit crimes, make war, and do evil things. Most of us would not trade away the benefits that IT gives us, so we must deal with the costs that cyber brings. Security is one of those costs.