I have very mixed feelings about the recent advancement of AI technology. On some platforms, I find it fun to mess around with and see what it might generate. I’m very critical of AI art, though, because it’s a computer siphoning away creativity that a human could have put to use. AI-generated videos are becoming scarily realistic, but that’s an argument for another day. It will be pretty hard to stop the horrors of AI because, at this rate, it’s so widespread that it’s here to stay.
However, one aspect that heavily worries me is self-driving cars. Stepping into one of those is like signing a death warrant. How am I supposed to trust a car to just take me home?
Believe it or not, I am speaking from experience. This past summer, I took Uber rides to and from my internship. One day on the way home, my driver arrived in a Tesla. About halfway through my 30-minute commute, he called for my attention. I was listening to music like usual, just spacing out. When I turned to him, he had the audacity to tell me that he wasn’t even driving the car. Sure enough, he was lounging back as the car just did its thing.
I admit that the technology is amazing, but not when my life is on the line. There are plenty of sensors meant to keep the car from colliding, but when your driver goes out of his way to say “Look, no hands!” I absolutely do feel in danger.
Uber recently partnered with Waymo, Alphabet’s self-driving car company, to foray into the future of car rides. As of June 2025, you could request an UberX, Comfort, or Comfort Electric vehicle and be matched with a Waymo all-electric Jaguar I-PACE, according to Uber Investor Relations.
The site does present rider safety as its priority. Waymo’s fully autonomous cars have driven “tens of millions” of miles on public roads, so by the company’s standards, the concept is perfectly safe. Waymo also claims that this track record has reduced traffic injuries.
I’m sure the company tested countless prototypes before releasing these cars, but morally, this should not become commonplace. Much like the AI art argument, relying on the car to get you through traffic erodes your own spatial awareness; you hand the sudden decisions over to a machine. I went through 30 hours of driving school and a few very stressful tests before I finally got my license. Driving is a necessary skill to have.
My last gripe is legality. Just because a car is trained to detect dangers does not mean it will never get into a collision. If the car itself malfunctions, is it technically the driver’s fault? Responsibility on the road comes first, so who’s to say that someone won’t just find a loophole? Technology today is impressive. I have no issue with using it, but I would be twice as alert if my car were on autopilot.