I've been thinking about the Volkswagen scandal a lot lately. In September, the US Environmental Protection Agency (EPA) determined that VW had clandestinely inserted software into many of its diesel vehicles to detect when the vehicle was connected to a smog-check machine, and reduce its emissions output accordingly. Once the vehicle detected the machine had been disconnected, it resumed spewing nitrogen oxide pollutants at elevated levels-- up to forty times federally allowable limits. Adding insult to injury, many of these vehicles were purchased by consumers wishing to minimize their impact on the environment. Volkswagen, whose stock plummeted by nearly fifty percent in the weeks immediately following the revelations, has since admitted that approximately 11 million VW cars worldwide are equipped with this code. "[We have] broken the trust of our customers and the public," said VW Chairman Martin Winterkorn. His board soon ousted him over the disclosures.
New revelations continue to pour forth about how this breach of public trust happened, and element14's Elecia White has written a great piece on the moral choices engineers must make when designing products that run counter to their values. While the magnitude of the VW scandal is pretty obvious, it's not always clear when a technological product has crossed the line. Since the birth of the atomic age, scientists have often tried to ensure that their discoveries would be used for the benefit of humankind, rather than its destruction.
As I weighed these concerns, I found myself coming back to another facet of the VW story which has received scant attention: how our flawed relationship with the cars we drive made this scandal virtually inevitable.
Driving the Black Box
When something goes wrong with my cell phone, I go on Google and try to diagnose and fix the problem. Same for my computers. And my home media center. I even figured out how to set the action on my electric guitar-- a delicate job --by watching a series of YouTube videos which took me through the process one step at a time. But my car? That's entirely different.
When something goes wrong with my car, I either call AAA, or I drive it to a garage and pay someone to tell me what's going on.
You know, like this guy.
I pay a stranger to tell me about the product I already bought.
Why is that?
I think it has to do with the importance I attach to my vehicle: if I try to repair it and get it wrong, I'm screwed. Given this zero margin for error, and the fact that cars have become incredibly complex machines over the past decade, I dare not mess with it myself. Better to just take it to someone who knows what they're doing.
This complexity has always been there for those who-- like me --do not consider themselves car guys. Even before the embedded revolution, I found the inner workings of a car hopelessly unfathomable. Today, the black box runs code: millions of lines of code. High-end automobiles now run more code than a Boeing 787.
The complexity of managing such vast coding projects can easily overwhelm automotive manufacturers. To see a recent example of this, look no further than Toyota's infamous "unintended acceleration" (UA) case. Claims started surfacing in 2002 that many Toyota models would unexpectedly accelerate even as their owners applied the brake pedal. Toyota fought these claims as baseless, but a long investigation ensued, ending with the conclusion that UA was real, and was caused by coding errors in the engine firmware. Forensic programmer Michael Barr showed how the incidents reported by UA victims could be explained by the way the firmware code actually interacted with the vehicle's throttle control. In March of last year, the US Department of Justice imposed an unprecedented $1.2 billion criminal penalty on Toyota for issuing misleading and deceptive statements to consumers and federal regulators.
Nearly 100 people died as a result of Toyota's ineptitude and subsequent attempts to conceal the problem from regulators.
As serious as the Toyota case is, one can reasonably concede that the auto maker did not intend for the deadly effects of UA to happen: they were the result of code conflicts introduced by accident. For instance, when two software developers declared the same global variable unbeknownst to each other, the stage was set for tragic events (a sketch of this hazard follows below). Toyota's mistake was the code error its engineers unwittingly introduced into its vehicles; Toyota's crime was its subsequent attempt to cover it up and hope that nobody would find out, even as more accidents occurred.

The VW case is inherently different because the auto maker purposefully inserted its deceptive code in its vehicles from the start, in a conscious attempt to subvert federal emissions guidelines. In both cases, however, the auto makers attempted to use code complexity to their advantage-- for Toyota, to obfuscate the matter in the hopes that regulators would be unable to ascertain the cause of UA; for VW, to hope the truth about its smog-spoofing code would never become known.
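To make that accidental-conflict scenario concrete, here is a minimal C sketch. The file names, variable, and functions are hypothetical illustrations-- not taken from Toyota's actual firmware --but they show how two developers can each tentatively define the "same" global in their own file and end up sharing one piece of storage without ever realizing it.

```c
/* throttle.c -- written by developer A, who believes this global is "theirs" */
#include <stdio.h>

int pedal_position;                    /* tentative definition, no initializer */

void update_throttle(void)
{
    pedal_position = 42;               /* raw pedal reading, as A intends it   */
    printf("throttle sees pedal = %d\n", pedal_position);
}

/* cruise.c -- written by developer B, unaware that A already uses this name */
int pedal_position;                    /* same name, tentatively defined again */

void update_cruise(void)
{
    pedal_position = 0;                /* B "resets" what B thinks is B's own  */
}                                      /* variable, silently clobbering A's    */

/* main.c -- with traditional linker defaults (-fcommon) the two tentative
 * definitions merge into ONE shared variable and this links without warning;
 * newer compilers (e.g. GCC 10+) default to -fno-common and reject it.      */
void update_throttle(void);
void update_cruise(void);

int main(void)
{
    update_throttle();                 /* pedal_position is 42                 */
    update_cruise();                   /* pedal_position is now 0              */
    update_throttle();                 /* A's state is gone without a trace    */
    return 0;
}
```

Neither developer did anything that looks wrong in isolation; the defect only exists in the combination, which is exactly why this class of error thrives in sprawling codebases.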
It's 2015. There should be a better way.
In both cases, the auto makers used complexity to cover their tracks. As the economist Luigi Zingales once said in defense of transparency: "In the shade, a lot of things take place."
I recently explained to an audience of PR professionals that one of the amazing things about element14 is that while the rest of the world talks about the problems we face as a civilization, our members are uniquely equipped to actually solve them. It's because the combination of creativity and technical know-how allows you to come up with beautiful, strange ways to address the problems that vex us. Like building a smart sports uniform to protect young athletes against concussion-related injuries. Or solving the #1 source of pollution in the world by showing everyone a better way to farm.
So what about cars?
Cars today are really just rolling computers. They have millions of lines of code. And that means-- with the right technology --we can open a window and see what's going on inside them. Picture the tired, uninspired dashboard of today. It shows speed, RPM, and fuel. Even with modern lighting technology and LCD displays, it basically shows the same limited information it has always shown.
But imagine a dashboard of tomorrow: one that opens dozens of little windows into just what's happening under the hood anytime you want to know.
Want to see how much brake pad you have left? No problem.
Fluid levels? Check.
How about real-time emissions analysis? That way, you'll never have any surprises when you take it in for a smog check.
Or instant tire pressure?
Wheel alignment?
Just about anything you might want to know about your vehicle can be depicted with the right sensors. Embedded developers are experts in the art of seamless sensor deployment, and are unquestionably qualified to modernize the dashboard to show drivers anything they might want to know about the cars they drive. Auto manufacturers should welcome this transparency-- especially given the litany of scandals that have come to light.
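Here is a minimal C sketch of what one refresh of such an "extended dashboard" might look like. Everything in it is hypothetical: the sensor functions, units, and hard-coded readings merely stand in for whatever CAN signals, OBD-II parameters, or dedicated sensors a real vehicle would expose.

```c
/* dashboard.c -- a minimal sketch of an "extended dashboard" refresh.
 * Every read_* function is a placeholder for a real sensor or bus query. */
#include <stdio.h>

struct dashboard {
    float brake_pad_mm;      /* remaining pad thickness, millimetres      */
    float coolant_level_pct; /* coolant reservoir level, percent of full  */
    float nox_ppm;           /* tailpipe NOx concentration, parts/million */
    float tire_psi[4];       /* tire pressures: FL, FR, RL, RR            */
    float toe_angle_deg;     /* front wheel alignment (toe), degrees      */
};

/* Hypothetical stand-ins; a real system would sample hardware here. */
static float read_brake_pad_mm(void)      { return 6.2f; }
static float read_coolant_level_pct(void) { return 87.0f; }
static float read_nox_ppm(void)           { return 38.5f; }
static float read_tire_psi(int wheel)     { return 32.0f + wheel; }
static float read_toe_angle_deg(void)     { return 0.1f; }

static void sample(struct dashboard *d)
{
    d->brake_pad_mm      = read_brake_pad_mm();
    d->coolant_level_pct = read_coolant_level_pct();
    d->nox_ppm           = read_nox_ppm();
    for (int w = 0; w < 4; w++)
        d->tire_psi[w] = read_tire_psi(w);
    d->toe_angle_deg = read_toe_angle_deg();
}

static void render(const struct dashboard *d)
{
    printf("Brake pads : %.1f mm remaining\n", d->brake_pad_mm);
    printf("Coolant    : %.0f%% full\n",       d->coolant_level_pct);
    printf("NOx        : %.1f ppm\n",          d->nox_ppm);
    printf("Tires (psi): FL %.1f  FR %.1f  RL %.1f  RR %.1f\n",
           d->tire_psi[0], d->tire_psi[1], d->tire_psi[2], d->tire_psi[3]);
    printf("Toe angle  : %+.2f deg\n",         d->toe_angle_deg);
}

int main(void)
{
    struct dashboard d;
    sample(&d);   /* in a vehicle this would run on a timer, not once */
    render(&d);
    return 0;
}
```

In a real car the sample/render loop would run continuously and draw to the instrument cluster rather than print to a console, but the shape of the problem stays the same: gather readings, translate them into units a driver understands, and put them where the driver can see them.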
Maybe 2015's VW scandal will be seen as a turning point in our relationship with cars-- the moment when we said Enough, and decided that the only response to the growing complexity of our automobiles was to give drivers radical visibility into what's happening inside the computers we happen to drive. I can already imagine a car maker unburdened by the fear of change-- Elon Musk, or Tim Cook, for instance --revolutionizing the way we drive by empowering consumers with the information they deserve to have about the products they use. We should know as much detailed information about our cars as Tony Stark knows about his Iron Man suit.
What do you think? Are cars ripe for radical change?