How do you feel about Texans being guinea pigs on the Texas highways testing ground for driverless semis?
As long as they stay the hell away from where I'm driving for a few years, I guess I'm ok with it. On the one hand, I don't trust self-driving technology at all yet, on the other hand there are so many problems with human truck drivers that almost anything might be an improvement. Older article, but here's some info: https://www.kxan.com/investigations...s-highway-testing-tech-for-driverless-trucks/
Of two minds about it. On one hand, we do have miles and miles of straight flat road that should be pretty easy for an autonomous vehicle. And, by being the state that leans into it, we could perhaps capture more of the economic benefit by being home to the companies that participate in it. On the other hand, I don't feel like I can trust our leadership to put in the regulatory framework needed for minimum safety levels nor manage the economic displacement it will eventually cause. But then, I don't really trust our leadership to do anything.
Did you finally get the help that you needed? I’m happy for you. I don’t see Trump anywhere near this thread. This is a positive sign in the right direction, although I’m not so sure about a permanent cure because there is no cure for TDS.
On one hand, it can be pretty terrifying to know that an AI can mischaracterize your car and kill you without hesitation. But the same is true for a human driver. I still think the real value in shipping is in freight by train rather than a bunch of semi- or fully-autonomous AI-driven semi trucks.
I see lots of trucks being driven by derelicts. They kill and maim a lot of people. I'm all for SAFE automated trucking.
Ready or not, self-driving semi-trucks are coming to America’s highways

Autonomous truck companies plan a major expansion this year to deliver your packages and food, speeding well ahead of federal safety regulations

https://www.washingtonpost.com/technology/2024/03/31/autonomous-semi-truck-jobs-regulation/

PALMER, Tex. — Perched in the cab of a 35,000-pound semi-truck lumbering south on Interstate 45, AJ Jenkins watched the road while the big rig’s steering wheel slid through his hands. Jenkins was in the driver’s seat, but he wasn’t driving. The gigantic 18-wheeler was guiding itself.

Over several miles on the popular trucking route between Dallas and Houston, the truck navigated tire debris, maneuvered around a raggedy-looking flatbed and slowed for an emergency vehicle. Exiting the highway, it came to an abrupt stop as a pickup jumped its turn at a four-way intersection.

“You need to be ready for anything,” said Jenkins, 64, a former FedEx driver whose job is to take control if anything goes wrong. “People do some crazy stuff around trucks.”
Yes, it is true AI can be trained to attack. However, it's not going to gain consciousness and attack people. That's not how AI works. It's no different than a truck losing its brakes on the way down the mountain and slamming into a village of people.
Don't worry. When Tesla semis start putting truckers out of business, they will blame Trump since he will be president.
Not that scenario, but one where the AI decides my car is not a car and it's safe to switch lanes. I am not worried about Skynet.
How is it any different than a mechanical failure? Software failures are already becoming a problem. Like anything else, improve and reduce faults. If we expected perfection, we would still be on horse and buggy. And if you're concerned about AI going wacko for no reason, let me tell you about those horses.
I was just at the Maryland Hunt Cup, so I am well aware of how wild horses can get. But I am also aware of how dumb bugs in AI code can be. For the record, I am not opposed to it but I would not want to be stuck next to a semi, human driver or AI.
The vast majority of vehicle accidents today are due to human error. Looking toward a future with fully automated driving, it's reasonable to expect that driverless systems will significantly improve safety compared to human drivers. I don't see why we wouldn't get there, but when...who knows. And if and when we do get there, will an individualistic society such as ours ever accept it? It's the age-old problem of individual freedom vs. collective safety (or risk). Of course, we aren't ANYWHERE close to fully autonomous driving. Software doesn't fail the way a part wears out; it does unexpected things due to poor design or, often, complexity that is not well understood. Mechanical failure remains about the same for both driver-operated and autonomous vehicles, and software alone doesn't mitigate that risk.