Waymo recently received some good news with the swift approval of its application to serve large service areas in San Francisco and Los Angeles. It has also received a small series of more negative headlines worth noting, most notably a day in Phoenix where two of its vehicles had minor collisions with the same towed pickup truck, minutes apart.
There was also an arson attack on a Waymo and a minor but significant collision with a bicyclist. While that is nothing compared to the chaotic month that led to GM Cruise's downfall last year, some wondered whether more trouble would come for the company. The answer is, of course, yes, but that turns out to be a good thing.
The development of autonomous vehicles, one of the most ambitious computing, software and artificial intelligence projects ever attempted, is fraught with difficulties, dangers and errors, and it cannot be otherwise: all of this must be understood if the technology is to deliver the benefits it promises.
None of these incidents (with the exception of the arson) resulted in significant injuries or property damage, yet they stand out against Waymo's otherwise exemplary safety record, a record that contrasts sharply with Cruise's, whose failings led to its removal from the streets of the United States.
Most interesting was the pair of collisions between Waymos and the corner of the same pickup truck being towed down the center lane of a Phoenix street. As reported to the NHTSA, the pickup was hitched to a tow truck that had lifted one end of it, leaving its front wheels on the ground. Unusually (and, Waymo says, improperly), the pickup's steering was reportedly locked to the right. The tow truck was traveling in the shared center left-turn lane, which is normally reserved for cars turning left in either direction. Because the front wheels were turned, the towed pickup angled to the right, persistently poking its nose into the lane to the right of the tow truck, in which the Waymos were traveling. The Waymo hit the corner of the pickup, but there was no significant damage and the tow truck continued on its way. "Several minutes" later, another Waymo came along and made the same mistake.
In Waymo's defense, a tow truck should not tow or drive for any length of time in that center lane. Against it, however, is the fact that a human driver would be highly unlikely to make the mistake the Waymo did. The robotic nature of the system is also evident in the fact that a second Waymo made the very same mistake.
The most important component of any autonomous vehicle is the prediction engine, which attempts to calculate where everything on the road is likely to go in the near future. One can speculate that Waymo's prediction engine saw the truck's orientation, with its wheels turned, and incorrectly predicted it would move back into the lane in which it was being towed. Waymo declined to comment on the details, but there are two possible sources of a bad prediction. They may not have realized the truck was being towed, and simply saw it as a truck making an improper foray into the Waymo's lane that appeared to be heading back out into the left-turn lane. The other, probably more likely, error would be that they knew the truck was being towed and predicted it would follow the tow truck as towed vehicles usually do. Towed vehicles may swing out of line, but will generally soon fall back in behind the tow vehicle.
As such, I presume the Waymo expected the pickup to soon pull back in behind the tow truck, so that the Waymo could simply continue in its adjacent lane, since the truck would be gone by the time it got there. It kept expecting this until it got too close and clipped the pickup. From the description of the incident, the tow truck driver most likely did not even know this had happened and so continued on, until the second Waymo came along and did the same thing.
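To make those two hypotheses concrete, here is a rough sketch, mine and not Waymo's, of how the competing motion models diverge. Everything in it (the classes, the speeds, the decay rate) is invented for illustration: the "falls back in line" model predicts the intruding nose will be gone by the time the robotaxi arrives, while a model that respects the locked steering predicts it will still be there.

```python
import math
from dataclasses import dataclass

# Illustrative sketch only (not Waymo's code): two competing motion models for
# the towed pickup. Model A is the "towed vehicles fall back in line" assumption
# that apparently failed here; Model B respects the locked steering.

@dataclass
class Pose:
    x: float        # metres travelled along the road
    y: float        # how far the pickup's nose intrudes into the adjacent lane, metres
    heading: float  # angle relative to the lane direction, radians

def predict_follows_tow(p: Pose, tow_speed: float, dt: float) -> Pose:
    """Model A: a towed vehicle may swing out, but soon settles back in line
    behind the tow truck, so the intrusion decays toward zero."""
    decay = 0.5 ** dt  # assumed one-second half-life for the swing-out
    return Pose(p.x + tow_speed * dt, p.y * decay, p.heading * decay)

def predict_locked_steering(p: Pose, tow_speed: float, dt: float) -> Pose:
    """Model B: with its front wheels on the ground and locked to the right,
    the pickup holds its angle, so its nose stays in the adjacent lane."""
    return Pose(p.x + tow_speed * dt, p.y, p.heading)

if __name__ == "__main__":
    nose = Pose(x=0.0, y=0.8, heading=math.radians(15))  # nose 0.8 m into the Waymo's lane
    for t in (1.0, 2.0, 3.0):
        a = predict_follows_tow(nose, tow_speed=10.0, dt=t)
        b = predict_locked_steering(nose, tow_speed=10.0, dt=t)
        print(f"t+{t:.0f}s  intrusion: model A {a.y:.2f} m, model B {b.y:.2f} m")
```

Under model A the planner concludes it can stay in its lane; under model B it would change lanes or slow down. The Phoenix collisions are consistent with a planner acting on something like model A.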
What's troubling for Waymo is that there is an echo here of one of Cruise's best-known accidents, in which a Cruise crashed into the back of an articulated city bus in San Francisco. An articulated bus is one vehicle towed by another, and Cruise had mistakenly programmed its car to ignore the sensor data about the rear of the bus and instead predict where the rear would be based on what the front was doing. Foolishly, Cruise did this even when the front of the bus was hidden, so the car had to guess where it was. It guessed wrong, because the bus was braking and the Cruise could not see that. It could see the rear, but it wasn't using that, and it could not see the front. A silly mistake, but more worrying was that no system noticed that the rear of the bus was rapidly getting closer and was about to be hit, which should have triggered an emergency override of a plan based on bad predictions.
Cruise's misadventure should have been a prompt wake-up call for Waymo to take a closer look at its code for handling towed vehicles. If it did, it missed a full exploration of the ways in which towed vehicles can behave strangely. This is a rare scenario, and if so, that can be forgiven. It may also have missed the more general solution, which is to notice when sensor data indicate that a prediction is becoming increasingly, obviously wrong. That should have been easy for Cruise, since its error was off by the length of a bus. For Waymo, getting smarter about disagreements between sensor data and predictions is a harder problem.
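That general solution can be sketched in a few lines: keep comparing where an object is actually observed against where the active plan assumed it would be, and escalate when the error is large and still growing. A hypothetical illustration, with invented thresholds and no claim about how Waymo's stack is actually structured:

```python
from collections import deque

class PredictionMonitor:
    """Hypothetical watchdog: flags a tracked object whose observed position
    keeps diverging from the position the planner's prediction assumed.
    Window size and threshold are invented for illustration."""

    def __init__(self, window: int = 5, max_error_m: float = 1.0):
        self.errors = deque(maxlen=window)
        self.max_error_m = max_error_m

    def update(self, predicted_xy, observed_xy) -> bool:
        err = ((predicted_xy[0] - observed_xy[0]) ** 2 +
               (predicted_xy[1] - observed_xy[1]) ** 2) ** 0.5
        self.errors.append(err)
        history = list(self.errors)
        growing = (len(history) == self.errors.maxlen and
                   all(b >= a for a, b in zip(history, history[1:])))
        # Escalate only when the error is both large and steadily increasing.
        return err > self.max_error_m and growing

# Usage: if update() returns True, discard the current plan and re-plan
# conservatively (brake / yield) rather than keep trusting the prediction.
monitor = PredictionMonitor()
samples = [((0, 0), (0, 0.2)), ((1, 0), (1, 0.5)), ((2, 0), (2, 0.9)),
           ((3, 0), (3, 1.3)), ((4, 0), (4, 1.8))]
for pred, obs in samples:
    if monitor.update(pred, obs):
        print("prediction increasingly wrong -> trigger conservative re-plan")
```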
The double collision is another thing that can happen to robots but not to humans. Normally this is an advantage. When one robot makes a mistake, the team quickly works to fix it, and within days no robot will ever make that mistake again. But it can't be fixed within minutes. That usually isn't a problem, since rare situations don't recur quickly, but they can when literally the same vehicle or location is involved. It might make sense, after any incident, to flag not just a location as risky, but also a particular vehicle or a specific class of vehicle.
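One way to picture that: a fleet-wide registry that, after an incident, flags a small zone around the location and also a descriptor of the particular vehicle (or a whole class, such as vehicles under tow), so other cars give it a wide berth until engineers clear the flag. A hypothetical sketch, with invented names and numbers:

```python
import time
from dataclasses import dataclass, field

@dataclass
class HazardFlag:
    """One fleet-wide 'give this a wide berth' entry created after an incident."""
    reason: str
    lat: float | None = None          # flag a location...
    lon: float | None = None
    radius_m: float = 150.0
    plate: str | None = None          # ...or a specific vehicle...
    vehicle_class: str | None = None  # ...or a whole class, e.g. "vehicle under tow"
    created: float = field(default_factory=time.time)
    ttl_s: float = 6 * 3600           # expires unless an engineer extends it

def _near(lat1, lon1, lat2, lon2, radius_m) -> bool:
    # Crude flat-earth distance check; fine at city scale for a sketch.
    dx = (lon1 - lon2) * 111_320 * 0.84   # rough cos(latitude) correction
    dy = (lat1 - lat2) * 111_320
    return (dx * dx + dy * dy) ** 0.5 <= radius_m

class FleetHazardRegistry:
    def __init__(self) -> None:
        self.flags: list[HazardFlag] = []

    def add(self, flag: HazardFlag) -> None:
        self.flags.append(flag)

    def should_avoid(self, lat: float, lon: float,
                     plate: str | None = None,
                     vehicle_class: str | None = None) -> bool:
        now = time.time()
        for f in self.flags:
            if now - f.created > f.ttl_s:
                continue  # stale flag
            if f.plate is not None and plate == f.plate:
                return True
            if f.vehicle_class is not None and vehicle_class == f.vehicle_class:
                return True
            if f.lat is not None and f.lon is not None and _near(lat, lon, f.lat, f.lon, f.radius_m):
                return True
        return False

# After an incident like the Phoenix one, the incident pipeline might add:
registry = FleetHazardRegistry()
registry.add(HazardFlag(reason="collision with improperly towed pickup",
                        plate="HYPOTHETICAL-PLATE", vehicle_class="vehicle under tow",
                        lat=33.45, lon=-112.07))
```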
The lesson for the public and regulators, however, is that even the best autonomous vehicles will make mistakes. After almost any incident, it is not uncommon for critics to come forward and say that "this mistake proves these cars are not ready" to be on the roads. Unfortunately, they are rarely able to describe the bar for deciding what is ready and what is not, and the implication is that any mistake is too many.
But if that's the bar, the cars will never be ready, because nobody expects them to be perfect. At best, the hope is for much better risk levels than human drivers, and teams began putting cars on the road once they calculated risk levels at least comparable to those of human drivers. At the same time, we let young student drivers onto the road when their risk level is a little worse than that of the typical human driver, in the hope that they will improve. In many ways, those newly licensed teens and student drivers are "not ready," but we welcome them.
For this technology to develop it needs to learn, and many vital lessons can only be learned by driving on real roads. Regulators took Cruise off the road both for not being good enough and for withholding information, but they have not yet defined what "good enough" means, or what would make the cars ready. While there is an intuition to be alarmed by any incident (especially injuries), policy decisions should be based on statistical analysis of large data sets, not on isolated incidents.
For each incident, I have advised examining the fault, the severity, and the probability of recurrence. This situation was unusual in that it repeated within minutes, but only because it involved the same vehicle. At Waymo's level of maturity, its data show that it is rare (though not impossible) to encounter an entirely new situation it cannot handle. After tens of millions of miles, anything never seen before is by definition a rare event and unlikely to recur soon, which is good. So even when an incident occurs, it is not necessary to ground the fleet for fear it will happen again immediately, unless, as in this case, the same truck is still out there.
In this crash, the fault lies mostly with the improperly towed truck, but some of it lies with the Waymo driver. The severity was minor. The probability of a repeat with another towed vehicle is low, but the probability with the same vehicle is high. As such, it would be wise to have any unusual location or vehicle involved in an incident immediately avoided by other cars in the fleet until the investigation can better determine the cause. That's not trivial, but it's better than grounding a fleet.
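That triage can be written down almost directly. Here is a toy version of the decision rule I am suggesting, with made-up categories and thresholds rather than anything Waymo actually uses:

```python
from dataclasses import dataclass

@dataclass
class IncidentAssessment:
    fault_share_av: float    # 0.0-1.0, portion of fault assigned to the robotaxi
    severity: str            # "minor", "moderate", "severe"
    p_repeat_general: float  # chance another vehicle/location triggers it soon
    p_repeat_same: float     # chance the *same* vehicle/location triggers it soon

def fleet_response(a: IncidentAssessment) -> str:
    """Toy decision rule: ground the fleet only for severe, likely-to-repeat
    failures; otherwise quarantine the specific trigger and keep driving."""
    if a.severity == "severe" and a.p_repeat_general > 0.1:
        return "pause fleet pending software fix"
    if a.p_repeat_same > 0.5:
        return "flag the specific vehicle/location for avoidance; continue service"
    return "log, investigate, and continue service"

# The towed-pickup case as described above: mostly the tow operator's fault,
# minor severity, low chance of repeat in general but high with the same truck.
towed_pickup = IncidentAssessment(fault_share_av=0.3, severity="minor",
                                  p_repeat_general=0.05, p_repeat_same=0.9)
print(fleet_response(towed_pickup))
# -> flag the specific vehicle/location for avoidance; continue service
```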
Waymo has been fairly quiet about the incident in which one of its vehicles hit a bicyclist at a four-way stop sign. According to Waymo, the cyclist ran the four-way stop, as cyclists often do, following a truck that had stopped at the sign. The Waymo judged that it was its turn, which it was; the cyclist was going out of turn and was hidden behind the truck, according to Waymo. By the time the Waymo could see the cyclist, it was too late to avoid a collision. Fortunately the injuries were minor: the cyclist reportedly declined medical attention and wanted to get back on their way.
Most cyclists roll through stop signs. It's not legal (a rolling stop by cyclists is legal in some states, but not when another road user has the right of way), but it's probably more common than seeing a cyclist come to a complete stop. In fact, I do it myself. A cyclist who does this takes on the duty of making sure other road users aren't about to go and actually see them, and must be extra vigilant, because the cyclist will pay a much higher price if it goes wrong. This cyclist appears to have failed at that. At the same time, it's interesting to ask whether the Waymo driver could have done anything to prevent the incident. Waymo had no comment, but the vehicle presumably saw the cyclist come up behind the truck and may have noticed them "disappear." While it is reasonable to predict that a cyclist may well roll through the stop, the unwritten rules of the road don't allow that to be handled smoothly. If you assume there might be a cyclist hidden behind every large vehicle at a four-way stop, you will be too timid and impede traffic. Presuming the Waymo had no chance to see the cyclist before they approached the stop sign, it's not clear a human driver would have avoided this collision. We don't have data on the Waymo's reaction time once the cyclist became visible to judge whether it could have done better.
Here the fault lies with the cyclist, but the probability of this situation arising is moderately high, and the severity, although low in this case, could have been severe. That seems to be something worth further analysis. One interesting option might be to pay more attention to radar returns. Radar can sometimes get returns from occluded vehicles, even bicycles, though if the bicycle is very close to the truck that may not happen. While you can't assume every truck has a cyclist hidden behind it, you could act on that possibility when radar signals suggest it is more likely.
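In code, the idea would look something like the sketch below: the planner stays assertive by default, and only when radar reports something moving inside the blind zone behind the stopped truck does it slow down and widen its margins. The types and numbers are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    x: float          # metres ahead, in the ego vehicle's frame
    y: float          # metres left (+) / right (-)
    speed_mps: float  # radial speed of the return

@dataclass
class OccludedRegion:
    """Blind zone behind an occluding vehicle, in the ego frame."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, r: RadarReturn) -> bool:
        return self.x_min <= r.x <= self.x_max and self.y_min <= r.y <= self.y_max

def occluded_mover_likely(returns: list[RadarReturn],
                          region: OccludedRegion,
                          min_speed_mps: float = 1.0) -> bool:
    """True if any radar return inside the blind zone behind the stopped truck
    is moving fast enough to be a bicycle or pedestrian about to emerge."""
    return any(region.contains(r) and abs(r.speed_mps) >= min_speed_mps
               for r in returns)

# Usage at a four-way stop: if a stopped truck creates a blind zone and radar
# hints at a mover inside it, proceed at reduced speed with extra clearance.
blind_zone = OccludedRegion(x_min=8.0, x_max=18.0, y_min=-1.0, y_max=3.0)
returns = [RadarReturn(x=12.0, y=1.5, speed_mps=3.5)]
if occluded_mover_likely(returns, blind_zone):
    print("possible cyclist behind the truck -> creep through the intersection")
```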
Earlier reports on the arson attack on a Waymo on the night of the Super Bowl and Chinese New Year suggested that serious efforts would be made to arrest the perpetrators, who are clearly visible in publicly released videos. In addition, the Waymo's own cameras should have recorded them, unless the recordings on its drives were destroyed in the fire. The mayor of San Francisco, concerned about the city's reputation for lawlessness and lax law enforcement, appeared to want the crime solved. However, no arrests have been made so far.
With the video, if someone were to identify one of the perpetrators, there would be strong evidence against them. Waymo will also have video from other cars in the area that the perpetrators may have walked past. Reportedly, police and Waymo are wary of the fraught nature of using facial recognition in this case, but other clues may be present. Since the perpetrators most likely cannot pay damages, Waymo may have little reason to pursue their arrest in this case, unless it happens again, in which case it will want a deterrent against it.