- cross-posted to:
- selfdrivingcars@lemmy.ml
- futurism@lemmy.ca
- technology@midwest.social
They got an army of thousands of Indians to watch the road for you?
No, you’re thinking of Amazon.
don’t come with a requirement that drivers watch the road
Seems it’s like every other Mercedes then
Hey I’m watching it on my mirrors.
And they managed to do it without us obsessing about their CEO several times a day? I refuse to believe that!
…
As of April 11, there were 65 Mercedes autonomous vehicles available for sale in California, Fortune has learned through an open records request submitted to the state’s DMV. One of those has since been sold, which marks the first sale of an autonomous Mercedes in California, according to the DMV. Mercedes would not confirm sales numbers. Select Mercedes dealerships in Nevada are also offering the cars with the new technology, known as “level 3” autonomous driving.
…
Drivers can activate Mercedes’s technology, called Drive Pilot, when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control. The technology does not work on roads that haven’t been pre-approved by Mercedes, including on freeways in other states.
…
U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500.
…
Mercedes is also working on developing level 4 capabilities. The automaker’s chief technology officer Markus Schäfer expects that level 4 autonomous technology will be available to consumers by 2030, Automotive News reported.
…
Hmm, so only on a very small number of predetermined routes, and at very slow speeds for those roads.
Still impressive, but not as impressive as the headline makes out.
And definitely not worth the $2500 a year they’re asking for the feature.
Chances are, if you can afford the car, then that amount is nothing to you.
Having known one, I can say some of their customers love their feature-loaded cars, to brag about and feel extra special. Some will definitely pay the $2.5k gladly.
If they assume full liability for any collisions while the feature is active (and it looks like they do), then I can see that being fair.
Come on, you were able to pay the price of that Mercedes in the first place.
That $2,500 is not going to hurt.
Yes, but it’s actually level 3.
Not the Tesla “full self driving - no wait we actually lied to you, you need to be alert at all times” bullshit.
We’ll wait and see, I guess.
Fucking subscription. No.
U.S. customers can buy a yearly subscription of Drive Pilot in 2024 EQS sedans and S-Class car models for $2,500
yeah, fuck that.
Have you seen Tesla’s price for full self driving? And they don’t take liability
I think you can afford that if you own an EQS
Love how companies can decide who has to supervise their car’s automated driving and not an actual safety authority. Absolutely nuts.
Who said there was no safety authority involved? I thought it was part of the 4 level system the government decided on for assisted driving.
Paywalled.
Paywalled.
On a different subject, why would someone downvote a one-word comment that accurately describes what the content is behind?
There are people who are pathologically contrarian. I’ve had to end a friendship over it—the endless need to say something negative about literally everything that ever happens and an unwillingness to be charitable to others.
Reddit 1.3 is just like that.
Doesn’t answer my question though.
deleted by creator
Nope. Someone absolutely downvoted him. Because, just like Reddit, the downvote button here is the ‘wow fuck that guy for saying a thing i don’t like’ button.
Musk: Fuuuuuuu
Wonder how this works with car insurance. Is there a future where the driver doesn’t need to be insured? Can the vehicle software still be “at fault”, and how will the actuaries deal with assessing this new risk?
I believe Mercedes takes responsibility if there is an accident while driving autonomously.
Will it pull a Tesla and switch off the autopilot seconds before an accident?
Wow I hope we see some regulation about that kind of thing.
If memory serves, that’s not an intentional feature, but more a coincidence, since if the driver thinks the cruise control is about to crash the car, they’ll pop the brakes. Touching the brakes disengages the cruise control by design, so you end up with it shutting down before a crash happens.
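The disengage-on-brake behavior described above can be sketched in a few lines. This is a purely illustrative toy (class and method names are made up, not any automaker’s actual code), but it shows how a panic stop moments before impact leaves the system reading as “off” at the time of the collision:

```python
# Hypothetical sketch of the disengage-on-brake behavior; all names
# are illustrative, not any real vehicle's control software.

class CruiseControl:
    def __init__(self):
        self.engaged = False

    def engage(self):
        self.engaged = True

    def on_brake_pedal(self):
        # By design, any brake input hands control back to the driver,
        # so slamming the brakes right before a crash also disengages
        # the system.
        self.engaged = False

cc = CruiseControl()
cc.engage()
cc.on_brake_pedal()          # driver slams the brakes just before impact
assert cc.engaged is False   # logs show the system "off" at collision time
```

So the “switched off seconds before the crash” pattern can fall out of this design choice without anyone intending it as a liability dodge.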
That makes perfect sense. If the driver looks up, notices he’s in a dangerous, unfixable situation, and slams the brakes, disconnecting the autopilot (which had been responsible for letting the situation develop), then hopefully the automaker can’t entirely say “not our fault, the system wasn’t even engaged at the time of the collision.”
And this is how they will push everyone into driverless: through insurance costs. Who would insure one human driver vs. 100 bots (once the systems have a few billion miles on them)?
You’re probably right. Another decade or two and human-driven cars might be prohibitively expensive to insure for some, or even not allowed in certain areas.
I can imagine an awesome world where that’s a great thing but also imagine a dystopian world like wall-e as well. I guess we’ll know then which one we chose.
Berkshire Hathaway owns Geico the car insurance company. In one of his annual letters Buffett said that autonomous cars are going to be great for humanity and bad for insurance companies.
“If [self-driving cars] prove successful and reduce accidents dramatically, it will be very good for society and very bad for auto insurers.”
Actuaries are by definition bad at assessing new risk. But as data get collected they quickly adjust to it. There are a lot of cars so if driverless cars become even a few percent of cars on the road they will quickly be able to build good actuarial tables.
According to who? Did the NTSB clear this? Are they even allowed to clear this? If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?
According to who? Did the NTSB clear this?
Yes.
If this thing fucks up and kills somebody, will the judge let the driver off the hook 'cuz the manufacturer told them everything’s cool?
Yes, the judge will let the driver off the hook, because Mercedes told them it will assume the liability instead.
You do realize humans kill hundreds of other humans a day in cars, right? Is it possible that autonomous vehicles may actually be safer than a human driver?
Sure. But no system is 100% effective, and all of their questions are legit and important to answer. If I got hit by one of these tomorrow, I’d want to know that the processes for fault, compensation, and improvement are already established, not something my accident is going to be the landmark case for.
But that being said, I was a licensing examiner for 2 years and quit because they kept making it easier to pass and I was forced to pass so many people who should not be on the road.
I think this idea is sound, but that doesn’t mean there aren’t things to address around it.
Honestly I’m sure there will be a lot of unfortunate mistakes until computers and self driving systems can be relied upon. However there needs to be an entry point for manufacturers and this is it. Technology will get better over time, it always has. Eventually self driving autos will be the norm.
That still doesn’t address all the issues surrounding it. I am unsure if you are just young and not aware how these things work or terribly naive. But companies will always cut corners to keep profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because “it’ll eventually get better” is a gateway to absurd amounts of damage. Also, not all technology always gets better. Plenty just get abandoned.
But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant are unanswered, does that mean I might not get legal justice or compensation? If there isn’t clearly codified law, I might not, and you might be callous enough to say you don’t care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you’d have to go through a long, expensive court battle to determine fault because no one had done it yet? So you’re in and out of a hospital recovering and draining all of your money on bills, both legal and medical, to eventually, hopefully, get compensated for something that wasn’t your fault.
That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.
To be clear I never said that I didn’t care about an individual’s safety, you inferred that somehow from my post and quite frankly are quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve more with time.
The legal implications of self-driving cars are still being determined, as this is literally one of the first approved technologies available. Tesla doesn’t count, as it’s not an SAE level 3 autonomous driving vehicle. There are some references in the liability section of the wiki.
https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars
But then it’s good that the manufacturer states the driver isn’t obliged to watch the road, because it shifts responsibility toward the manufacturer, and thus it’s a great incentive to make the technology as safe as possible.
You’re deciding to prioritize economic development over human safety.
*at 40mph on a clear straight road on a sunny day in a constant stream of traffic with no unexpected happenings, Ts&Cs apply.
Only on closed courses. The best AI lacks the basic heuristics of a child and you simply can’t account for all possible outcomes.
According to that teal light.
If it can drive a car, why wouldn’t it be able to drive a truck?
I’m surprised companies don’t just build their own special highway for automated trucking and use people for last mile stuff.
We could make it work on a guide line and attach a bunch of trailers to one truck. You’re a genius.
This idea seems to be getting some steam. I’m all aboard it!
A monorail of course.
Yeah, that would be great. Say, you could save a little on that if you put wheel guides on the road, since they’re all headed in the same direction, and maybe you could replace the tires with something that fits into that guide pretty well so you don’t have to replace them as much. Matter of fact, all of these trucks could become electric if you ran electricity through the track or above it. This is a revolutionary idea!!
They are testing them already. I only have a German article that came out this week https://www.tagesschau.de/wirtschaft/technologie/fahrerlose-lkw-man-test-autobahn-100.html
The truck division of Mercedes (Daimler) is already testing the trucks in the US. They plan commercial usage in 2027. MAN is testing in Europe and wants to start commercial usage in 2030.
deleted by creator
This is also the company that promises to prioritise the vehicle occupants over pedestrians.
I mean that’s exactly what the driver would do, I’m not sure why this is controversial
The human does it out of self-preservation, but the car doesn’t feel the need to preserve itself.
By getting in the car, the passengers should be aware of the risks and that, if there is an accident, the car will protect pedestrians over the occupants. The pedestrians had no choice, but the passengers have a choice not to get in the vehicle.
I feel like car manufacturers are going to favour protecting the passengers as a safety feature, and then governments will eventually legislate it to go the other way after a series of high profile deaths of child pedestrians.
You’re probably over-estimating the likelihood of a scenario where a self-driving car needs to make such a decision. Also take into account that if a self-driving car is a significantly better driver than a human, then it’s by definition going to be much safer for pedestrians as well, even if it’s programmed to prioritize the passengers.
Who would buy a car that will sacrifice the passengers in the event of an unavoidable accident? If it’s a significantly better driver than a human would be, then it’s safer for pedestrians as well.
Yes. As it should be. I’ll buy the car that chooses to mow down a sidewalk full of pregnant babies instead of mildly inconveniencing myself or my passengers. Why the hell would you even consider any other alternative?
pregnant babies
🤔
I’d consider a 5yr old a baby https://en.m.wikipedia.org/wiki/Lina_Medina
It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.
I mean, what’s the situation where a car can’t brake but has enough control that it HAS to kill a pedestrian in order to save the passengers?
Tesla on their autopilot at night. All the time, basically. There were a number of motorcycle deaths where a Tesla just mowed them down. The reason? The motorcycles had two tail lights side by side instead of one big light. Tesla thought this was a car far away and just ran through people.
That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time, the issue was that the car didn’t know what was happening. This situation wouldn’t have engaged the autopilot in the way we are discussing.
As an aside, if what you said is true, people at Tesla should be in jail. WTF
Tesla washes their hands of any wrongdoing with terms of use where owner agrees he’s responsible bla bla bla.
Here’s a related video.
How is this different from the capabilities of Tesla’s FSD, which is considered level 2? It seems like Mercedes just decided they’ll take on liability to classify an equivalent level 2 system as level 3.
According to the Mercedes website, the cars have radar and lidar sensors. FSD had radar only (no lidar), but Tesla apparently decided to move away from radar towards optical only; I’m not sure if it currently has any role in FSD.
That’s important because FSD relies on optical sensors only to tell not only where an object is, but that it exists. Based on videos I’ve seen of FSD, I suspect that if it hasn’t ingested the data to recognize, say, a plastic bucket, it won’t know that it’s not just part of the road (or at best can recognize that the road looks a little weird). If there’s a radar or lidar sensor though, those directly measure distance and can have 3-D data about the world without the ability to recognize objects. Which means they can say “hey, there’s something there I don’t recognize, time to hit the brakes and alert the driver about what to do next”.
Of course this still leaves a number of problems, like understanding at a higher level what happened after an accident for example. My guess is there will still be problems.
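The contrast above can be sketched as a toy decision rule. Everything here is illustrative (the hazard labels, the function names, and the 30 m threshold are all made-up assumptions, not any manufacturer’s real logic), but it shows why a direct distance measurement changes the failure mode:

```python
# Hypothetical sketch of camera-only vs. ranging-sensor braking logic.
# Labels, names, and the 30 m threshold are illustrative assumptions.

KNOWN_HAZARDS = {"car", "pedestrian", "cyclist"}

def camera_only_brakes(label):
    # A vision-only stack can react only to objects it recognizes;
    # an unrecognized object (label None) produces no response.
    return label in KNOWN_HAZARDS

def with_ranging_brakes(range_m, label, safe_m=30.0):
    # Radar/lidar measure distance directly, so an unexplained return
    # inside the safety envelope is reason enough to brake and alert
    # the driver, even with no recognized label.
    return label in KNOWN_HAZARDS or range_m < safe_m

# An unrecognized plastic bucket 12 m ahead:
print(camera_only_brakes(None))         # False -- never learned "bucket"
print(with_ranging_brakes(12.0, None))  # True  -- something is there, brake
```

The point isn’t that ranging sensors solve everything, just that “I don’t know what this is, but it’s there” is a fallback that pure classification doesn’t give you.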
It’s also limited to slow traffic on some roads
“DRIVE PILOT can be activated in heavy traffic jams at a speed of 40 MPH or less on a pre-defined freeway network approved by Mercedes-Benz.” https://www.mbusa.com/en/owners/manuals/drive-pilot#:~:text=DRIVE PILOT can be activated in heavy traffic jams at a speed of 40 MPH or less on a pre-defined freeway network approved by Mercedes-Benz.
You’ve inadvertently pointed out how Tesla deliberately skirts the law. Teslas are way more capable than what level 2 describes, but they choose to stay at level 2 so they don’t have to take responsibility for their public testing.
It’s not about the sensors, it’s about the software. That’s the solution.
Please tell me how software will be able to detect objects in low/no-light conditions if they say, have cameras with poor dynamic range and no low-light sensitivity?
How is that legal?
Because it’s an extremely narrowly defined set of requirements in order to use it. It’s “approved freeways with clear markings and moderate to heavy traffic under 40MPH during daytime hours and clear conditions” meaning it will inch forward for you in bumper to bumper traffic provided you’re in an approved area and that’s it.
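Those activation conditions amount to a simple gate. A minimal sketch, with the conditions taken from the article but the function and parameter names entirely hypothetical:

```python
# Hypothetical sketch of the Drive Pilot activation gate described in the
# article; function and parameter names are made up for illustration.

def drive_pilot_available(speed_mph, is_daytime, on_approved_freeway,
                          heavy_traffic, clear_conditions):
    # Every condition must hold simultaneously; failing any one of them
    # keeps the driver responsible.
    return (speed_mph <= 40
            and is_daytime
            and on_approved_freeway
            and heavy_traffic
            and clear_conditions)

# Bumper-to-bumper daytime traffic on an approved freeway: available.
print(drive_pilot_available(15, True, True, True, True))   # True
# Same jam at night: not available.
print(drive_pilot_available(15, False, True, True, True))  # False
```

Which is why, in practice, it mostly means inching forward in a daytime traffic jam on a pre-approved stretch of road.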
How is that different than LKAS + ACC?
Those still require your full attention and hands on the wheel.
In theory. In practice, it just beeps at you if your sandwich hand is steering.
Well, not always hands on wheel. I have spent over an hour straight on an interstate with hands off. Ford’s system watches your eyes and lets your hands stay off if it’s decent conditions and on a LIDAR-mapped freeway. Note I wouldn’t trust it at night (there have been two crashes, both at night with stopped vehicles on freeway), but then I wouldn’t really trust myself at night either too much (there are many many more human caused crashes at night, I’m not sure a human at freeway speed could avoid a crash with a surprise stationary vehicle in middle of the road).
Still seems not legal to not pay attention to the road. Wouldn’t fly over here at least.
They got certification from the authorities, and in the event of an accident, the manufacturer takes on responsibility.
lol, ‘manufacturer takes on responsibility’ so… I’m just fucked if one of these hits me?
see a mercedes, shoot a mercedes. destroy it in whatever way you can.
No, you’re guaranteed that the Mercedes that hit you is better insured to pay out your damages than pretty much anyone else on the road who could hit you.
lol corporations don’t have responsibility though. that’s the whole point of them. they’re machines for avoiding responsibility.
The sad part of this is somehow thinking that payment solves any problem. Like, idk what they would pay me, just bring back my dead wife/child/father whatever. You can’t fix everything with money.
It only works on a small handful of freeways (read: no pedestrians) in California/Nevada, and only under 40 MPH. The odds of a crash within those parameters resulting in a fatality are quite low.
Human drivers are far more dangerous on the road, and you should be applauding assisted driving development.
This presumes the options are only:
- Human and no autonomous system watching
- Autonomous system, with no meaningful human attention
Key word is ‘assisted’ driving. ADAS should roughly be a nice add, so long as human attention is policed. Ultimately, the ADAS systems are better able to react to some situations, but may utterly make some stupid calls in exceptional scenarios.
Here, the bar of ‘no human paying attention at all’ is one I’m not entirely excited about celebrating. Of course the conditions are “daytime traffic jam only”, where risk is pretty small, you might have a fender bender, pedestrians are almost certainly not a possibility, and the conditions are supremely monotonous, which is a great area for ADAS but not a great area for bored humans.
that paid for it to be, like everything else that’s legal?
deleted by creator
It will be litigated almost immediately. There is no current combination of model and hardware platform that a car could reasonably run that could be called “fully self driving” at any useful speed. This thing sounds like parking assist on steroids maybe, or “stalled traffic assist”. They will be sued.
Did you read the article? There are already plenty of conditions for activating the self driving mode.
There’s tons of conditions
when certain conditions are met, including in heavy traffic jams, during the daytime, on specific California and Nevada freeways, and when the car is traveling less than 40 mph. Drivers can focus on other activities until the vehicle alerts them to resume control.
I doubt this is a mistake, they must have really high confidence in the tech as well as with the restrictions, not even Tesla had the balls to announce that you could drive distracted.
not even Tesla had the balls to announce that you could drive distracted.
That’s the difference between Level 2 and Level 3 full self driving. Teslas are Level 2.
That’s what I’m saying: they could have called this an “ultra-advanced level 2” and avoided opening themselves up to a TON of liability. Once you start saying this is a level 3 system and you don’t need to pay attention to the road with it, well, that shuts the door on many defenses they could have used if it was “just” level 2 if something happens. So that means they must be really confident in their system.
Sued for what?
There is no current combination of model and hardware platform that a car could reasonably run that could be called “fully self driving” at any useful speed.
It’s still not flawless and requires an attentive driver, but Tesla FSD Beta V12 is pretty damn impressive. They made a huge leap forward by going from human code to 100% neural nets. I don’t think we’re too far away from a true robo-taxi, and there’s going to be some humble pie served for the LiDAR/radar advocates. I highly recommend everyone watch some reviews on YouTube if you aren’t up to speed with the recent changes they’ve made.