Privacy, Security, IoT, and the Car

This story is rapidly being overcome by events at a particular (and unnamed) automobile manufacturer, but it illustrates one basic risk of embedded IoT (Internet of Things) devices. To make a long story shorter, the manufacturer has decided to replace the software system featured in this story because of the many problems reported with it, most of which concerned functionality and usability. Those reports are not the subject of this post. However, replacing a vehicle's system and/or issuing recalls can be very costly. As testers, we should help avoid these kinds of issues by providing information about functionality and, these days, about the privacy and security of systems. (In testing, as with the TSA: "See something? Say something.")

What I noticed with this bug is an integration-privacy issue, and I suspect this type of bug will recur all too often in other IoT systems. Testers, as well as company executives, should take a lesson from this story. The car feature outlined here shows what happens when you interface two devices without fully informing users that the systems are being "integrated" with information sharing, what the impacts of such integration are, and what the associated risks look like. I credit my wife with getting stung by this, or rather with making the finding.

In this scenario, the user plugs into a USB power outlet in the car, which one would normally use simply to get power. In this case, my wife plugged her cell phone into the USB power outlet. The car's system recognized the device and allowed "hands-free" control of the phone while providing power.

Great, right? Well, as in most stories, there is more to it.

The car's system was able to read information from her cell phone, including phone numbers, user name, and email information (and we are not sure what else). The data persisted in the car's system; my wife could see five other users in it. You guessed it: this was a rental car, and each renter who had plugged a phone into the car's system had had their information "downloaded." So ask yourself this question when renting a car: "Do you expect to have the information on your cell phone downloaded to the car's systems when you only wish to plug into a power outlet to recharge?" And to take it one step further: "Do you expect anyone else to be able to access that information once you unplug your phone and exit the car?"
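The failure mode here can be sketched in a few lines. This is a hypothetical model, not the actual head-unit software (the class and method names are my own invention); it only illustrates the two design flaws the story describes: data is copied without consent on connection, and nothing is deleted on disconnection.

```python
# Hypothetical sketch of a head unit that exhibits the rental-car bug.
class HeadUnit:
    def __init__(self):
        # Profiles persist across drivers; there is no expiry policy.
        self.profiles = {}  # device_id -> copied contact data

    def on_usb_connect(self, device_id, contacts):
        # Flaw 1: contacts are copied with no consent prompt or disclosure.
        self.profiles[device_id] = list(contacts)

    def on_usb_disconnect(self, device_id):
        # Flaw 2: nothing is deleted; the next renter can browse this profile.
        pass

unit = HeadUnit()
unit.on_usb_connect("renter-1-phone", ["Alice: 555-0100"])
unit.on_usb_disconnect("renter-1-phone")
unit.on_usb_connect("renter-2-phone", ["Bob: 555-0199"])
# The second renter now sees the first renter's data:
print(sorted(unit.profiles))  # ['renter-1-phone', 'renter-2-phone']
```

The fix is equally simple to state: prompt for consent before copying, and purge the profile on disconnect (or at least offer a visible "delete my data" path).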

This is at least a privacy-concerning feature (I'd call it a bug), and I would be willing to bet none of the other users knew this information was being "shared," since it was not disclosed in the on-screen help or when they rented the car. Further, having reviewed the user guide, I found no disclosure of such "features." If I were a "bad guy" hacker, I could probably use the car's stored information as a starting point for an attack on any renter of a car with such a system. So a feature that compromises privacy can become a security risk, too.

Additionally, a dealer of this type of car (not the rental company) had access to the "leaked" information: not long after my wife's rental period ended, she received "welcome" advertisements from a dealer offering discounted service. Obviously the dealer did not really understand how the car's system worked, since they assumed my wife had bought the car. We are not certain how much the rental car company knows about this sort of "issue." The blind leading the uninformed.

So here is the moral of this story for security-minded IoT testers: you must test beyond the required functionality of the system and assess other quality characteristics. Hackers are breaking into retailers and major companies and attacking everyday people. IoT testing must consider the privacy and security of these devices. That means embedded-IoT testers must think at the system level (this was an integration between systems) and go beyond basic functional testing. For this system, given that it is being phased out, I am guessing that even functional and user-interface testing was done poorly, so probably nothing was done about security and privacy. Sad.
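One concrete, testable check that falls out of this story: after a device disconnects, none of its personal data should remain in the system. Here is a minimal, self-contained sketch of such a residual-data audit; the function and the sample device IDs are hypothetical, not drawn from any real head-unit API.

```python
def audit_residual_data(stored_profiles, disconnected_devices):
    """Return device IDs whose data still remains after they disconnected.

    A privacy-aware system should return an empty list here.
    """
    return sorted(set(stored_profiles) & set(disconnected_devices))

# Example: the head unit still holds profiles for two prior renters.
leaks = audit_residual_data(
    stored_profiles={"renter-1", "renter-2", "current-driver"},
    disconnected_devices={"renter-1", "renter-2"},
)
print(leaks)  # ['renter-1', 'renter-2']
```

A check like this belongs in the test plan alongside the functional tests; it takes minutes to write and would have flagged this bug before the car ever reached a rental lot.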

Many companies tend not to worry about these issues until they have a major and costly breach. Then they scramble to fix the problem and spend large sums on litigation. As testers, we should shift the finding of issues (testing) to the left and explain to the decision makers (management) the risks of such problems (bad publicity, lost money, rebuilding or replacing systems, lawyers, lawsuits, court costs, etc.) before they escape into the field and onto the nightly news. (No, I did not call any reporters. Tempting, though.)

This has been a food-for-thought story. Use it to think about, and explain to management, the security and privacy risks of embedded IoT. And think twice before plugging your cell phone into a power outlet in one of the newer cars, anyone's.