The Tesla and its Full Self-Driving mode failed to stop for the school bus and its stop sign eight times in a row during a Santa Barbara test on June 22. | Credit: Courtesy

Elon Musk and Tesla are firmly in Dan O’Dowd’s sights, the first targets of the Montecito billionaire’s quest to make “computers safe for humanity.” O’Dowd owns five Teslas himself, so it’s nothing personal, but his Santa Barbara company produces un-hackable software for critical machines like the F-35 and ICBMs, and he believes the self-driving software that Tesla uses will kill people. To get the word out, O’Dowd ran for the U.S. Senate in 2022 on that single issue, bought a full-page ad in the New York Times last November, and aired a 30-second Super Bowl commercial in February. He’s not wrong.

Since 2019, Teslas with Autopilot engaged have been involved in 17 fatal accidents and more than 700 crashes, the National Highway Traffic Safety Administration reported recently. As digested thoroughly in the Washington Post and Fortune, the reports revealed that 10 of the fatalities occurred over a span of four months and that an increasing number involved motorcycles. Attention has focused on Tesla in part because of its real-time crash reporting, which other automakers’ systems lack, and because of multiple incidents of Teslas running into fire trucks parked on roadways with their lights flashing. Roughly 400,000 Teslas have Autopilot; Full Self-Driving is an add-on that costs $15,000.

“I’ve been trying a variety of things to make Tesla fix its harmful software,” O’Dowd said on Thursday, as he stood behind Peabody Charter School in Santa Barbara’s San Roque neighborhood. He was re-creating the bus scene from the Super Bowl ad, in which a Tesla ignores a stop sign protruding from a school bus.

O’Dowd’s crew affixes GoPros and cell phones to the test car.

“We think the car views a school bus like a truck with its flashers on,” O’Dowd said, adding that the software simply does not recognize humans shorter than four feet tall. O’Dowd, dressed in a gray suit and tie, has a mild but determined and factual manner. He was accompanied by an entourage of about 15 young people: his two sons, who work with him at Green Hills Software, the company he founded; a few staffers from Maltin PR of London; and members of the YouTube channel AI Addict. Also on hand were a school bus, its driver, and about six people wearing orange safety T-shirts.

Across the street, neighbors were coming out of their homes to see what was going on.

As the AI Addict group drove O’Dowd around and around Peabody, the Tesla every single time stopped at the stop sign on the corner, cruised past the “do not enter” and “road closed” signs on Calle Rosales, and sailed past the bus with its stop sign extended and blinking. On none of the eight passes did the car stop for the bus or for potential students. O’Dowd explained that a month after his Super Bowl ad ran, a teen stepping off a bus in North Carolina was struck by a Tesla that failed to stop for the bus and its stop sign.

“The kid was hospitalized and is still recovering,” O’Dowd said. “And Tesla likes to claim its Full Self-Driving software is five times safer than human drivers.”

Johnetta Lane | Credit: Courtesy

Johnetta Lane, who was driving the school bus O’Dowd’s team had rented from Riverside, said that in her experience, most drivers stop for the red stop sign when she extends it. “A few people don’t stop,” Lane noted, “but I’d say that most of them do.” In fact, as the crew conducted the tests, random drivers coming down Calle Rosales stopped for Lane’s sign and bus.

During Tesla’s AI Day in 2022, Musk reportedly argued that self-driving mode saved lives, even if people didn’t necessarily know it. “Adding autonomy reduces injury and death,” Musk asserted.

O’Dowd’s Dawn Project, named after Green Hills’ Dawn Methodology that brings software reliability “from the Dark of Night to the Light of Day,” acknowledges on its website that Tesla warns drivers its Full Self-Driving (FSD) mode “might do the wrong thing at the worst time” and tells them to keep their hands near the steering wheel and stay attentive. But O’Dowd regards that warning as “unacceptable,” given the clear fact that some drivers pay it no mind.

Some of the crashes are legendary, such as when a Tesla crossing the Bay Bridge suffered “phantom braking” while changing lanes, causing an eight-car pileup in which nine people were injured but, fortunately, no one was killed. Anecdotes collected by Mother Jones range from a driver playing video games before one fatal accident to another driver leaning down to pick up a cell phone as his car crashed into a parked car, killing the occupant.

What Tesla should do, O’Dowd said, is use professional testers, not the driving public, to test the system, which is what he does at Green Hills. His company makes software for “the world’s most reliable secure systems,” he said, with the U.S. military, its jets and missiles, and government agencies among his clients. He said Green Hills hires professional hackers to ensure the software is foolproof.

The problem, said O’Dowd, is that Musk is cheap and impatient, and that has become part of the company culture. One member of the AI Addict team, John Bernal, affirmed that Musk would yell and scream to press his people to make something work. Bernal created AI Addict, a channel that reviews electric vehicles, while he worked for Tesla, to demonstrate how he was teaching the car to drive. But he was fired after he posted videos showing his Tesla nearly hitting another car in an intersection and then plowing into pylons marking a new bike lane while making a right turn.

Earlier on Thursday, O’Dowd was videotaped sparring with Ross Gerber, who had made a bid in February for a seat on Tesla’s board. Gerber was backed by other investors but withdrew after Tesla agreed to ease the brand away from Musk by putting other executives in the spotlight and adding Tesla content on Twitter. Gerber and O’Dowd were livestreamed by AI Addict as they drove through Santa Barbara, analyzing the car’s successes and failures at staying within the lines, managing curves, and stopping for pedestrians and bicycles.

They were supposed to end up at the top of Glen Annie Road, on a private stretch of the road, where the one-sided billionaire-versus-billionaire campaign hit a snag. The schedule called for the press to accompany Gerber and O’Dowd on ride-alongs down Glen Annie as a stroller and a child mannequin were placed in the road.

Viewers would recognize the spot as where the Super Bowl ad was filmed, but one resident had apparently had enough. She told O’Dowd’s crew to leave, and shortly after, about five Sheriff’s deputies arrived to escort everyone away.

Dan O’Dowd owns five Teslas, including this Roadster, but believes the car company’s autonomous driving software will kill children. | Credit: Courtesy
