Autonomous Cars + IoT, and Life or Death Decisions – Part 2

The Danger of Mixed Messages

Right now, autonomous driving isn’t that popular. Concerns about the technology suggest adoption will lag well behind its potential over the next decade or so because people don’t trust the tech. We saw during the pandemic how mixed messages from the CDC on masks, along with reports of blood clots (which are incredibly infrequent), caused people to distrust the vaccines.

The number of drivers who have died in Teslas while allegedly using Autopilot has grown substantially, and each crash seems to get front-page coverage. These crashes appear to be caused mainly by Tesla drivers who think they have a true autopilot. That capability won’t arrive until we have Level 4 or 5 autonomous systems. Tesla cars currently range from Level 2 to 2+, and their system’s performance lags Cadillac’s, yet Tesla calls the feature Autopilot.

Since I was a kid, people have told the story of the first significant cruise control accident: a man who didn’t know what cruise control was rented an RV and was told the feature was like an autopilot. So he set the cruise control, went back to make some coffee, and ended up in a massive crash. Elon Musk either never heard that story or has some strange desire to see how many people he can get to crash their cars the same way.

In much the same way that the CDC’s mixed messages and the blood clot reports adversely impacted the vaccine rollout, these crashes are scaring people away from autonomous cars before they are even ready.

Just as we need a critical mass of people vaccinated to reach herd immunity, we need a critical mass of people using autonomous cars. Only then can we bring the 38,000 deaths, and the 4.4 million injuries serious enough to require medical attention, that occur in car accidents on U.S. roadways each year down to numbers far closer to zero.

If you buy an autonomous car, it will make you safer. But only if a critical mass of people buy the technology, and assuming it works, do cars become genuinely safe. I often wonder whether Elon Musk has some self-destructive streak, or simply doesn’t like people, because there is no reason to call a technology Autopilot before Level 4 autonomous driving is reached.

Consumer Reports has been trying to get him to change the feature’s name until it can do what the name implies, but he hasn’t.

A couple of weeks ago, two more people allegedly died in an Autopilot-related crash; Tesla disputes the cause. Nonetheless, the Tesla death numbers are very troubling given that the car is built like a tank.

Tesla drivers are often upper-class, a group that includes politicians. Both the EU and the U.S. have hit tech companies with fines in the billions before. I wish Musk would stop using the name Autopilot in order to save lives. But even if he stopped only to avoid a huge fine, I expect the result would benefit Tesla drivers, the Tesla company, and the future of autonomous driving.

Wrapping Up

Autonomous cars, and the related robots and drones coming out of applied AI, promise to transform the world into something pretty amazing. Companies like Intel, Nvidia, and Qualcomm are working furiously to create an extraordinary level of technology that will keep us safer and massively reduce the number of people killed or crippled in car accidents. Each builds elements of security into its efforts.

Still, it may be BlackBerry’s focus on security — the environment that these autonomous, connected things operate in — that will provide the most significant level of protection.

But these companies and their efforts will see reduced success if Elon Musk doesn’t stop describing his offerings as having autonomous driving capabilities greater than the cars can yet provide. Overpromising and underdelivering on a technology that can save or take lives is incredibly foolish and damaging to the overall effort.

If the government begins hitting Tesla and Musk with huge fines, or if people, over time, connect the Tesla brand with death, it won’t bode well for the company. There is no upside to implying you have autonomous driving before you have it. It just gets people killed.

When autonomous driving truly arrives, it could be excellent. But only if people trust it enough to implement it in numbers significant enough to reach critical mass — and only if the entire ecosystem that the vehicles operate in is secure and capable of doing the job. We aren’t there yet — and Musk, one of the leading pioneers of technology, is making things worse, not better.

If we want the better future that autonomous cars promise, everyone on the critical path to success needs to be on the same page. That isn’t the case yet, which will push adoption out farther than it needs to be.

Source: https://www.technewsworld.com/story/87115.html
