Uber halts self-driving car tests after death.
Jimbuna
03-19-18, 01:11 PM
Uber said it is suspending self-driving car tests in all North American cities after a fatal accident.
A woman was hit by a car and killed as she crossed the street in Tempe, Arizona.
While self-driving cars have been involved in several accidents, it is thought to be the first time a self-driving car has been involved in a fatal collision.
http://www.bbc.co.uk/news/business-43459156
Surely something like this was foreseeable :hmmm:
mako88sb
03-19-18, 01:32 PM
Not a big fan of the concept, but I doubt it's going away. Eventually, once driverless cars are more prevalent, stats will be generated showing deaths from traffic accidents and road collisions declining. Having said that, I don't even want to think about how these things will perform in severe winter driving conditions like what we are still going through here in Calgary. The road clearing has been terrible because our city gambles every year on the chinook winds taking care of it for them, and lost that bet this time. I can't imagine how driverless cars or trucks can be expected to safely navigate such conditions. Same with those winter storms that often hit the American NE.
Onkel Neal
03-19-18, 02:26 PM
http://www.bbc.co.uk/news/business-43459156
Surely something like this was foreseeable :hmmm:
Yeah, especially since people these days don't look before crossing the street
Skybird
03-19-18, 06:45 PM
Not just since this accident, I have trouble imagining that autonomous driving will become a widespread reality outside very well guarded, clearly defined perimeters. There is a lot of hype in this, as with e-mobility as a way to save the climate and one's conscience.
Relevant for clearly defined, controlled perimeters, yes. But out in the open, chaotic wild? I'll believe it when I see it. And I will not see it in my lifetime.
Buddahaid
03-19-18, 07:07 PM
Somehow I'm always reminded of this scene.
https://www.youtube.com/watch?time_continue=16&v=0H5k--n7sFI
Subnuts
03-19-18, 07:14 PM
The first video of the accident has been released.
https://j.gifs.com/yrOwqn.gif
Eichhörnchen
03-20-18, 07:23 AM
Relevant for clearly defined, controlled perimeters, yes. But out in the open, chaotic wild? I'll believe it when I see it. And I will not see it in my lifetime.
I agree... to me, driving a car is often an intuitive activity... machines do not (yet) possess intuition
Commander Wallace
03-20-18, 07:38 AM
http://www.bbc.co.uk/news/business-43459156
Surely something like this was foreseeable :hmmm:
I'm left wondering how much of this accident was the fault of the "driver-less car." The link that you provided mentions that the woman was not crossing in a designated crosswalk. I'm wondering how likely it was that she crossed the street against the light. Further, was she oblivious to the street traffic because she was busy talking or playing with her cell phone? We have all seen people on their cell phones, not paying any attention to their surroundings, walk right into traffic. The article doesn't mention whether that was a factor in the accident.
A human driver, for the most part, makes allowances for the negligence of other people. This driver-less vehicle may not.
Mr Quatro
03-20-18, 08:08 AM
What about Uber's self driving truck program ... how long will it last if they hit someone?
https://www.motorauthority.com/news/1115802_ubers-self-driving-trucks-are-now-in-service
Uber has officially put its self-driving semi-trailer trucks into operation under the new service Uber Freight. In fact, Uber has had the self-driving trucks in service for a few months in the state of Arizona...
mako88sb
03-20-18, 08:46 AM
I'm left wondering how much of this accident was the fault of the "driver-less car." The link that you provided mentions that the woman was not crossing in a designated crosswalk. I'm wondering how likely it was that she crossed the street against the light. Further, was she oblivious to the street traffic because she was busy talking or playing with her cell phone? We have all seen people on their cell phones, not paying any attention to their surroundings, walk right into traffic. The article doesn't mention whether that was a factor in the accident.
A human driver, for the most part, makes allowances for the negligence of other people. This driver-less vehicle may not.
Yes, it will be interesting to see whether the accident would have been unavoidable regardless. If not, would the monitor be charged? It seems like he or she should be, but maybe the second or two it takes to override the computer is the difference.
Bilge_Rat
03-20-18, 01:18 PM
The first video of the accident has been released.
https://j.gifs.com/yrOwqn.gif
:up:
I'm sure UBER will somehow try to pin the blame on the pedestrian...:ping:
Aktungbby
03-20-18, 01:48 PM
I'm left wondering how much of this accident was the fault of the "driver-less car."
A human driver, for the most part, makes allowances for the negligence of other people. This driver-less vehicle may not.
What about Uber's self driving truck program ... how long will it last if they hit someone?
I'm of the opinion, as a professional patrol driver and ex- interstate trucker, that any taxpayer has an inalienable right not to get killed by a profit motivated....anything! including driverless cars/big rigs that are experimenting innately with innocent lives on my roadway! I pay good gas and registration taxes and don't fancy being in some geek engineer'$ 'laboratory' :damn:
Arizona officials $aw opportunity when Uber and other companies began testing driverless cars a few years ago. Promising to keep over$ight light, they invited the companies to te$t their robotic vehicles on the $tate’s roads. Then on Sunday night, an autonomous car operated by Uber — and with an emergency backup driver behind the wheel — struck and killed a woman on a street in Tempe, Ariz. It was believed to be the first pedestrian death associated with self-driving technology.... California requires companies to report the number of instances when human drivers are forced to take over for the autonomous vehicle, called “disengagements.”
Waymo, the self-driving car unit of Google’s parent company Alphabet, has been using cars without a human in the driver’s seat (https://www.nytimes.com/2017/11/07/technology/waymo-autonomous-cars.html) to pick up and drop off passengers in Arizona.
Most testing of driverless cars occurs with a safety driver in the front seat who is available to take over if something goes wrong. It can be challenging, however, to take control of a fast-moving vehicle. Between December 2016 and November 2017, Waymo’s self-driving cars drove about 350,000 miles and human drivers retook the wheel 63 times — an average of about 5,600 miles between every disengagement. :nope: Uber has not been testing its self-driving cars long enough in California to be required to release its disengagement numbers. https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html (https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html)
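As a sanity check on those figures, the quoted "miles between disengagements" is just simple division. A minimal Python sketch, using only the mileage and takeover counts cited above as illustrative inputs (the function name and rounding are my own):

# Rough disengagement-rate arithmetic based on the Waymo figures quoted above.
# Treat the numbers as illustrative inputs, not authoritative data.
def miles_per_disengagement(miles_driven: float, disengagements: int) -> float:
    """Average miles driven between human takeovers."""
    if disengagements == 0:
        return float("inf")  # no takeovers recorded in the period
    return miles_driven / disengagements

waymo_rate = miles_per_disengagement(350_000, 63)
print(f"~{waymo_rate:,.0f} miles between disengagements")  # ~5,556, i.e. roughly the 5,600 cited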
Commander Wallace
03-20-18, 03:25 PM
I'm of the opinion, as a professional patrol driver and ex- interstate trucker, that any taxpayer has an inalienable right not to get killed by a profit motivated....anything! including driverless cars/big rigs that are experimenting innately with innocent lives on my roadway! I pay good gas and registration taxes and don't fancy being in some geek engineer'$ 'laboratory' :damn: https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html (https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html)
I'm also not in favor of a driver-less car, big truck or anything automated like that. With that being said, the investigators sifting through the evidence have said on record that this accident may not be the fault of UBER or the automated car. The police have said it would have been impossible to stop in time even if the car had been driven by a human.
https://www.thesun.co.uk/news/5854438/driverless-uber-car-that-killed-woman-crossing-the-street-is-not-at-fault-for-running-her-down-cops-say/
This would suggest that the behavior of the woman who was unfortunately killed contributed in some way to her untimely demise. The investigation is ongoing, so more information should be forthcoming in the days and weeks ahead. It's unfortunate that it took a woman's death to call into question the desirability or viability of self-driving vehicles. The technology is there, but what about the software packages? The car that struck the woman was said to be traveling at 38 mph in a designated 35 mph zone. While police routinely give drivers a 5 mph buffer, one would expect a better-controlled, computerized car to stay at or below the posted limit.
Jimbuna
03-20-18, 03:44 PM
I'm left wondering how much of this accident was the fault of the "driver-less car." The link that you provided mentions that the woman was not crossing in a designated crosswalk. I'm wondering how likely it was that she crossed the street against the light. Further, was she oblivious to the street traffic because she was busy talking or playing with her cell phone? We have all seen people on their cell phones, not paying any attention to their surroundings, walk right into traffic. The article doesn't mention whether that was a factor in the accident.
A human driver, for the most part, makes allowances for the negligence of other people. This driver-less vehicle may not.
True that :yep:
Jimbuna
03-22-18, 06:38 AM
Video footage of the fatal event.
http://www.bbc.co.uk/news/world-us-canada-43497364
Commander Wallace
03-22-18, 07:08 AM
Video footage of the fatal event.
http://www.bbc.co.uk/news/world-us-canada-43497364
The video sheds a bit of light on this accident. The woman was wearing dark clothing and was all but impossible to see; she only becomes visible at the 00:20 mark, right before she is fatally struck. I understand now why law enforcement investigators have said it wouldn't have mattered whether there was a human driver or not. A human driver would probably not be charged for this accident.
My only question is this: don't these autonomous vehicles have sensors, like radar, to detect if an obstacle is in the way? They are supposed to be able to drive themselves. A number of auto manufacturers, including Ford and several Japanese makers, offer cars with automatic braking, and some cars made by Ford can even park themselves in tight quarters. I think a sensor suite like this, coupled with ABS braking, might have lessened the impact to the unfortunate woman. That is, if the sensors were working properly.
http://www.thedrive.com/tech/8657/heres-how-the-sensors-in-autonomous-cars-work
http://insideunmannedsystems.com/combination-sensors-needed-autonomous-vehicle-development/
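If those sensors do report a range and a closing speed, the braking decision itself can be sketched in a few lines. Below is a minimal, hypothetical Python illustration of the time-to-collision check that automatic emergency braking systems are commonly described as using; the class, function names, thresholds and example numbers are my own assumptions for illustration, not Uber's or any manufacturer's actual logic.

# Hypothetical automatic-emergency-braking decision based on time-to-collision.
# All names, thresholds and values below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Track:
    distance_m: float        # range to the detected object (e.g. from radar/lidar)
    closing_speed_ms: float  # how fast the gap is shrinking, in metres per second

def time_to_collision(track: Track) -> float:
    """Seconds until impact if nothing changes; infinite if the gap is not closing."""
    if track.closing_speed_ms <= 0:
        return float("inf")
    return track.distance_m / track.closing_speed_ms

def brake_command(track: Track, full_brake_ttc: float = 1.5, warn_ttc: float = 3.0) -> str:
    """Very simplified escalation: warn first, brake hard if impact is imminent."""
    ttc = time_to_collision(track)
    if ttc <= full_brake_ttc:
        return "FULL_BRAKE"
    if ttc <= warn_ttc:
        return "WARN_AND_PRE_BRAKE"
    return "NO_ACTION"

# Example: an obstacle detected 17 m ahead while closing at about 17 m/s (~38 mph)
print(brake_command(Track(distance_m=17.0, closing_speed_ms=17.0)))  # -> FULL_BRAKE

Whether the Uber car's sensors produced such a track in time, and why no braking followed, is exactly what the investigation will have to establish.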
Skybird
03-22-18, 07:29 AM
Interesting complications ahead: in case of an accident, who on the car's side is to be charged? The software engineers? The hardware engineers? The manufacturer? The car owner? The state? The traffic department?
The question of legal responsibilities so far is completely unanswered.
Autonomous systems can prevent accidents that humans would be unable to avoid, precisely because they do not depend on "gut feeling", "experience" and other typically human "habits" :) That's why you already have such systems in subways, trains and planes. Such robotized traffic systems have also been demonstrated to work incredibly well inside factories and Japanese (or were they Chinese...) mail sorting centres. However, those systems operate in relatively pre-sorted, standardized, limited environments with more or less strictly controlled numbers of potentially disturbing variables. Public road traffic is none of that. That's why I would not even trust autonomous cars operated only on exclusively reserved lanes of their own. The human factor remains, and it brings chaos into the well-ordered world of autonomous cars, inevitably, always.
And as far as there are attempts at centralised car and traffic control in autonomous traffic environments, that is a nightmare. Hack this centralised control and imagine the carnage you could cause, or threaten in order to blackmail compliance with your demands.
I read that some experts say this accident now has pushed back autonomous driving by at least five years. Some even say one or two more accidents like this that end lethally, and it will be over for autonomous driving.
Another interesting scenario. Imagine autonomous driving controlled not by a set of automatic response schemes (which is all the term "artificial intelligence" means today), but by an AI that has indeed reached true self-awareness. I would assume that such self-aware artificial intelligences may then also have, or form, a sense of self-preservation. Everything that is self-aware in our world is a living mind, and every living mind we know of fights for its survival and forms borders that define where it begins and where the outside has to end. It is conflict-ready. What if an accident develops in which the AI, self-aware and wanting to survive, decides to kill the human (allows him to get killed) in order to survive itself? What if the human could only be saved by the AI destroying itself, and it refuses to do so?
I assume that where there is self-awareness, the carrier of that self-awareness is no longer limited by the prohibitions in its code that express ethical imperatives designed by an alien life form: humans.
Aktungbby
03-22-18, 10:27 AM
The video sheds a bit of light on this accident. The woman was wearing dark clothing and was all but impossible to see; she only becomes visible at the 00:20 mark, right before she is fatally struck. I understand now why law enforcement investigators have said it wouldn't have mattered whether there was a human driver or not. A human driver would probably not be charged for this accident.
:hmmm: "A large median at the site of the crash has signs warning people not to cross mid-block and to use the crosswalk to the north at Curry Road instead. But the median also has a brick pathway cutting through the desert landscaping that accommodates people who do cross at that site. It's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway," Moir also told the San Francisco Chronicle after viewing the footage.
Footage captured from a camera inside the vehicle shows Vasquez looking down moments before the crash. It also shows that as soon as she picks her head up to look at the road, she looks surprised, before the video cuts off. The Volvo was traveling about 40 mph and made no visible attempt to brake in the video, police have said. The speed limit in the area is 35 mph. The operators Uber uses frequently have a laptop in the car to direct the vehicle's route and to record information about the car's performance. It is unclear from the video released by police what Vasquez was directing her attention to in the vehicle.
Having been rear-ended myself by a police vehicle whose lieutenant admitted he was looking at his computer while I was stopped at a traffic light... a lot of human error is apparent here, including the victim's poor choice of crossing location....
According to the state Department of Corrections, Vasquez served nearly four years in prison (https://www.azcentral.com/story/news/local/tempe/2018/03/19/operator-self-driving-uber-vehicle-killed-pedestrian-felon/440501002/) for attempted armed robbery and giving a false statement in order to get unemployment benefits.
Court records show that Vasquez said she recognized she had surrounded herself with people who encouraged her to do “ill-advised” things, leading her to get in trouble. She said she needed to change who she allowed into her life and make better decisions, court records show.
It appears she followed through; Vasquez has had a clean record since. Uber proudly touts its corporate policy of offering convicts a second chance.
Court records show that Herzberg had been convicted on drug possession charges. In an April 2015 letter written by her husband to a judge, he said that Herzberg had been using drugs to “self medicate to deal with her depression” during the past 13 years.
She would stay at a homeless camp near where the crash occurred.
Well that's it in a nut$hell: not particularly qualified ex-felons using a computer while driving (illegal in California, the same as texting while driving) killing addicts who self-medicate for depression when they stray from homeless camps.... a social problem after all; Uber's corporate pocketbook is off the hook here.:timeout:
Mike Abberton
03-22-18, 11:07 AM
I'm left wondering how much of this accident was the fault of the "driver-less car." The link that you provided mentions that the woman was not crossing in a designated crosswalk. I'm wondering how likely it was that she crossed the street against the light. Further, was she oblivious to the street traffic because she was busy talking or playing with her cell phone? We have all seen people on their cell phones, not paying any attention to their surroundings, walk right into traffic. The article doesn't mention whether that was a factor in the accident.
A human driver, for the most part, makes allowances for the negligence of other people. This driver-less vehicle may not.
To be fair, though, human drivers also look at their cell phones, drink coffee, shave, read the newspaper, talk to passengers, fiddle with the radio, etc. The computer driving the automated vehicle does not do any of those things (at least not yet).
Ultimately the issue is not whether automated cars kill zero people, but whether they kill fewer people than human-driven vehicles operating in the same conditions. If the answer is yes, that's a net benefit. If it's no, then they need more work or should be abandoned.
Mike
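Put another way, that test is a comparison of fatality rates per mile driven. A minimal Python sketch of the comparison, using rough placeholder figures rather than real statistics (the function and all numbers are assumptions for illustration only):

# Compare fatality rates per 100 million vehicle-miles travelled.
# The inputs below are placeholders, not real statistics.
def fatalities_per_100m_miles(deaths: int, miles: float) -> float:
    return deaths / miles * 100_000_000

human_rate = fatalities_per_100m_miles(deaths=37_000, miles=3.2e12)     # placeholder nation-scale figures
autonomous_rate = fatalities_per_100m_miles(deaths=1, miles=3_000_000)  # placeholder test-fleet figures

if autonomous_rate < human_rate:
    print("Net benefit by this measure")
else:
    print("Needs more work (or abandonment) by this measure")

The catch, of course, is that autonomous fleets have driven so few miles that a single death swings their rate enormously, which is one reason the comparison will stay unsettled for a long time.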
Aktungbby
03-22-18, 12:23 PM
Interesting complications ahead: in case of an accident, who on the car's side is to be charged? The software engineers? The hardware engineers? The manufacturer? The car owner? The state? The traffic department?
:hmmm: Uber's corporate pocketbook is off the hook here.:timeout:
TODAY'S WSJ-THE SLEEPING CORPORATE DRAGON AWAKENS TO A WHOLE NEW GAME: As federal investigators begin to examine a pedestrian fatality involving (https://www.wsj.com/articles/uber-suspends-driverless-car-program-after-pedestrian-is-killed-1521551002) a self-driving Uber Technologies Inc. car this week, America’s car insurers are watching closely.
Car insurers haul in roughly $230 billion of premiums a year, but much of that intake could evaporate in coming decades, say some consultants, assuming crucial breakthroughs in driverless technology that would eliminate the many wrecks caused by human error.
The potential hit to their bottom lines has property-casualty insurers in an arms race to figure out how they can design policies and price the risk of the vehicles that technology firms, such as Uber and Alphabet (http://quotes.wsj.com/GOOGL) Inc., are seeking to deploy in huge numbers, according to industry brokers, executives and trade groups. A person familiar with Uber said the firm’s test vehicles are insured through a commercial-insurance policy for a maximum of $5 million per accident. The insurer or insurers couldn’t immediately be confirmed.
The Uber accident highlights a likely broader trend to come in driverless cars. Under the current arrangement, individual car owners must buy liability policies to help cover damage in wrecks they cause. But in a possible metamorphosis, individuals would bear less financial responsibility.
Instead, the makers of the vehicles and their many complex parts will assume a bigger share via product-liability coverage, consultants say. “...we do know insurance companies are engaged with developers, trying to help them reduce their liability exposure,” said Jim Whittle, associate general counsel for the American Insurance Association, a lobbying group representing some of the nation’s biggest property-casualty insurers.
Those possibly at fault for accidents: vehicle owners, manufacturers, suppliers, service providers and even data providers.
The shift from personal liability is also an opportunity for many of the nation’s biggest insurers eager to get in on the action of insuring autonomous vehicles. When it comes to deep-pocketed corporate owners of vehicles, victims could sue for greater sums.
Indeed, should autonomous cars proliferate and if their safety record isn’t as great as many technology enthusiasts envision, car makers and the manufacturers of component parts can “expect to get to know their way around every courthouse in America,” wrote Randy Maniloff, an insurance lawyer with White and Williams LLP in Philadelphia, in an insurance-coverage newsletter.
Uber itself has made insurance a high priority in its driverless-car push. THE EXPERTS AGREE: THE CAR'S TECHNOLOGY SHOULD HAVE BEEN ABLE TO DETECT POOR MS HERZBERG IN THE DARKNESS BETTER THAN A HUMAN.... AND IT DID NOT...:o https://media1.giphy.com/media/3oEjI0YYgsx8iTZzYA/200w.gif
Skybird
03-22-18, 12:36 PM
Many experts point out, at least here in Germany, that the scandal also lies in letting these cars drive on public roads now, with current technology and software and under current regulations, while the technology simply is not as advanced as it is claimed to be. Practically every such expert in the media over here has pointed out that these cars are still years away from being ready to drive in the wild. That it is already being done is seen by a majority as irresponsible, it seems.
Aktungbby
03-22-18, 01:00 PM
I'm of the opinion, as a professional patrol driver and ex- interstate trucker, that any taxpayer has an inalienable right not to get killed by a profit motivated....anything! including driverless cars/big rigs that are experimenting innately with innocent lives on my roadway! I pay good gas and registration taxes and don't fancy being in some geek engineer'$ 'laboratory' :damn: https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html (https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html)
Many experts point out, at least here in Germany, that the scandal also lies in letting these cars drive on public roads now, with current technology and software and under current regulations, while the technology simply is not as advanced as it is claimed to be. Practically every such expert in the media over here has pointed out that these cars are still years away from being ready to drive in the wild. That it is already being done is seen by a majority as irresponsible, it seems.
PRECISELY: I MENTION THAT I WAS THE INSURANCE DRIVER FOR THE INTERSTATE TRUCK COMPANY; GETTING TOWED OUT A THOUSAND MILES WITH A FREIGHTLINER TRACTOR TO SALVAGE THE COMPANY'S CARGOES AND/OR TRAILERS WHEN NECESSARY AND MAKE FINAL ARRANGEMENTS FOR BEREAVED FAMILIES.... FIRST-HAND EXPERIENCE HERE!!! THE LEADING CAUSE OF DEATH ON THIS PLANET IS GENERALLY.... OTHER PEOPLE:yep:... USUALLY WIVES AND DOCTORS, SOMETIMES IN COLLUSION:haha:, WARLORDS, SCIENTISTS AND BAD PROFIT-MOTIVATED ENGINEERS (NEW BRIDGE COLLAPSES, ETC. https://cdn.cnn.com/cnnnext/dam/assets/180315151003-06-bridge-collapse-0315-super-169.jpg)??!! AT 67, I'M NOT AS QUICK ANYMORE, AND AS STATED: I JUST DON'T WANT TO BE PART OF ANYONE'S COMMERCIAL EXPERIMENTS; THE MORE SO AS I STILL DRIVE 200-300 MILES A WEEK AT NIGHT....:oops: https://www.cnn.com/2018/03/16/us/bridge-collapse-florida/index.html (https://www.cnn.com/2018/03/16/us/bridge-collapse-florida/index.html)
Everyone is looking for a reason to blame the autonomous vehicle for this accident. Has anyone asked why this lady stepped into the street like she did? It was dark, so did she not see the headlights of the car, or did she just step out without looking?
I wonder if these cars have the ability to swerve.:hmmm:
Commander Wallace
03-22-18, 07:58 PM
To be fair, though, human drivers also look at their cell phones, drink coffee, shave, read the newspaper, talk to passengers, fiddle with the radio, etc. The computer driving the automated vehicle does not do any of those things (at least not yet).
Ultimately the issue is not whether automated cars kill zero people, but whether they kill fewer people than human-driven vehicles operating in the same conditions. If the answer is yes, that's a net benefit. If it's no, then they need more work or should be abandoned.
Mike
Good point, Mike. I don't want to derail this thread. However, here is something else to consider with the sensors on autonomous vehicles. I ride motorcycles, mostly through rural areas. My biggest concern is wildlife jumping out and my motorcycle hitting them. I travel through cities on occasion and frequently encounter traffic lights. If I'm first in line, my motorcycle usually fails to trigger the sensors that change the lights. As a result, I have sat through two cycles of a light with angry traffic behind me. Fortunately, legislatures in a number of states have enacted laws that effectively say that under those conditions and circumstances I can "legally" assume the traffic light is "defective" and go through it after having sat through a cycle. The mere fact that legislatures have enacted these laws means they know about the problem but won't force whoever manufactures and maintains the traffic signals (Dept. of Transportation) to upgrade them in any way.
While this new law helps in a way, if I am moving against a red light, someone else has a green light, and I am taking my life in my hands by going through the red. For that reason, I leave a large cushion at lights if I am first in line and encourage the car(s) behind me to pass me and take the lead position, so that the car's mass can trigger the light. By the way, my motorcycle is relatively large and tips the scales at roughly 800 lbs with me on it.
The point is, if these autonomous cars are using sensors of a similar design (and I don't know that they are), they won't be sensitive enough to detect pedestrians if they can't detect a 600-pound motorcycle without its rider.
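To make that threshold argument concrete, here is a purely hypothetical Python sketch of how a detector tuned for large metal masses misses smaller road users; the signature values and threshold are invented for illustration and say nothing about the sensors autonomous cars actually use.

# Hypothetical illustration of the detection-threshold problem described above.
# Signature values and the threshold are invented numbers.
def loop_detects(signal_strength: float, threshold: float = 0.6) -> bool:
    """The detector 'sees' an object only if its signature exceeds the threshold."""
    return signal_strength >= threshold

signatures = {"car": 1.0, "motorcycle": 0.4, "pedestrian": 0.05}
for road_user, strength in signatures.items():
    print(road_user, "detected" if loop_detects(strength) else "missed")
# A threshold that misses a motorcycle will certainly miss a pedestrian.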
It may well come down to a cruel and simple fact: it's cheaper and more cost-effective for these manufacturers to settle a few lawsuits than to fix the underlying problem. We have seen that with defective auto components like tires and seat belts and, more recently, the air bags manufactured by Takata. Takata gambled badly on that one, and as a result of the scandal and the deaths associated with its defective airbags, it has filed for bankruptcy. If these autonomous cars are going to be on the road, then it should at the very least be mandated that they have automatic braking and a sensor suite sensitive enough to detect pedestrians, especially children, who don't weigh much at all. We have all seen a ball roll out into the street with a child fast behind, chasing it. With the code and software for these systems being written, these contingencies have to be factored in.
http://money.cnn.com/2017/06/25/news/companies/takata-bankruptcy/index.html
Aktungbby
03-23-18, 04:05 PM
not particularly qualified ex-felons using a computer while driving
I wonder if these cars have the ability to swerve.:hmmm:
YES... WELL, SORTA: https://www.forbes.com/sites/patricklin/2017/04/05/heres-how-tesla-solves-a-self-driving-crash-dilemma/#774261436813 (https://www.forbes.com/sites/patricklin/2017/04/05/heres-how-tesla-solves-a-self-driving-crash-dilemma/#774261436813) Tesla did not do what no manufacturer has ever done: “develop and implement computer algorithms that would eliminate the danger of full throttle acceleration into fixed objects,” even if it is caused by human error… Tesla disputes that there is a legal duty to design a failsafe car.:k_confused:
The test operator in the Uber Technologies Inc. self-driving car that killed an Arizona woman was a felon with a history of traffic citations who wasn't watching the road before the accident happened, facts that raise new questions about the company's testing process for autonomous technology.
In November, Colorado officials fined Uber $8.9 million (https://drive.google.com/file/d/1HDZAhjiwt1WBxjN1BdHhazp95YDTHoZK/view) after discovering it had allowed several dozen drivers onto its service who had prior felony convictions, a violation of state rules for ride-hailing firms. Uber attributed the hirings to a “process error.” It has resisted allowing fingerprinting in most markets, saying the process can be lengthy and produce misleading results.
Drivers of autonomous vehicles, called test operators or safety drivers, are trained to monitor the road and to take the wheel or hit the brake when the vehicles, which are still in test mode, act erratically. Uber gives test operators three weeks of training before they go out on the road.
The accident also revives debate about whether humans and robots can coexist when operating a car together. Experts question whether a human can take over when the car’s brain hands back control in a complex driving situation, especially when there is little time to react.
“This event only highlights the handover problem,” Missy Cummings, a professor of mechanical engineering and material science at Duke University, said in an email. “If trained ‘safety’ drivers can not make themselves pay attention, how will the rest of us fare?”
Her research has found that people have difficulty remaining vigilant when monitoring automation for long periods. Her group studied 27 subjects during four hours of simulated driving and found that vigilance declined after about 21 minutes on average. Executives had put test vehicles into the hands of employees and instructed them to monitor the roadway; the employees were video recorded. The videos showed that employees quickly became comfortable with the self-driving technology and their attention wandered: one employee was seen sleeping, another applying makeup.
Uber said it has a total of some 400 drivers in Tempe, Toronto, San Francisco and Pittsburgh, where it is testing self-driving cars. It has pulled the vehicles from public roads in Toronto, Tempe, San Francisco and Pittsburgh while the crash investigation continues.
Industry players say the testing of self-driving cars on U.S. roads is bound to continue. “Nobody knows any other option,”....:oops: :ping: BOTTOM LINE: these cars are not ready for (Skybird's 'wild') public roadways, and no one should be killed for a corporate (Uber, Tesla, etc.) profit motive. Ms Herzberg's estate is owed Uber's profit$ for the year....imho