March 17, 2025


Are AI and Talking Cars the Future of Driving?


Source: PublicDomainPictures/Pixabay

We are living in transformative times—due, in part, to the renaissance of artificial intelligence (AI), machine learning, and the capabilities of large language models (LLMs). Transportation is one industry that AI may soon change dramatically.

New research presented at the recent 27th IEEE International Conference on Intelligent Transportation Systems shows how a conversational AI based on an LLM framework called Talk2Drive can interpret human commands to guide autonomous vehicles (AVs). “This is the first-of-its-kind multi-scenario field experiment that deploys LLMs on a real-world autonomous vehicle,” wrote corresponding author Can Cui along with co-authors Ziran Wang, Zichong Yang, Yupeng Zhou, Yunsheng Ma, Juanwu Lu, Lingxi Li, Yaobin Chen, and Jitesh Panchal, all of Purdue University.

The concept of an AI-powered talking car was a television fantasy a little over four decades ago. K.I.T.T. (Knight Industries Two Thousand), a chatty 1982 Pontiac Trans Am two-door coupe, was the fictional main character of “Knight Rider,” a series that ran for four seasons on NBC and helped launch its human co-star, David Hasselhoff, to global fame and stardom.

But KITT’s talking abilities were television magic rather than true artificial intelligence. The last Pontiac Trans Am model was released in 2002, and eight years later General Motors shut down the Pontiac brand entirely. Fast forward to September 2024, when Cruise, majority-owned by General Motors since 2016, announced plans to resume testing up to five AVs in California’s Bay Area. A year earlier, in October 2023, California’s Department of Motor Vehicles had suspended Cruise’s driverless vehicles following incidents involving its AV robotaxis, according to Ars Technica.

Self-driving vehicles, also known as autonomous vehicles, use AI and cameras to navigate. SAE International categorizes driving automation on a scale from zero to five, where SAE Level 0 means no automation and SAE Level 5 means full driving automation.
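The SAE scale is a simple ordinal classification. As a minimal sketch, it can be modeled as a lookup table—note that the short labels below are the commonly used level names, not SAE's full formal definitions:

```python
# Illustrative sketch: the six SAE J3016 driving-automation levels as a lookup
# table. The short labels are common paraphrases, not SAE's official wording.
SAE_LEVELS = {
    0: "No Driving Automation",
    1: "Driver Assistance",
    2: "Partial Driving Automation",
    3: "Conditional Driving Automation",
    4: "High Driving Automation",
    5: "Full Driving Automation",
}

def describe_level(level: int) -> str:
    """Return a human-readable label for an SAE automation level."""
    if level not in SAE_LEVELS:
        raise ValueError(f"SAE levels run from 0 to 5, got {level}")
    return f"SAE Level {level}: {SAE_LEVELS[level]}"

print(describe_level(0))  # SAE Level 0: No Driving Automation
print(describe_level(5))  # SAE Level 5: Full Driving Automation
```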

Today there are AI-powered autonomous vehicles on the road in San Francisco, Beijing, Phoenix, and Wuhan, and by 2027, an estimated 30% of all new vehicle sales will be partially autonomous cars that perform acceleration and steering under driver supervision, according to Goldman Sachs Research. By 2040, Goldman Sachs analysts estimate that advanced autonomous vehicles will account for 65% of total car sales in the U.S. and 80% in Europe.

The worldwide autonomous car market was valued at 41 billion USD in 2024 and is forecast to grow to 115 billion USD by 2029, according to a recent Statista report published in October. In the United Kingdom alone, an estimated 73% of all cars will have some degree of autonomous capability by 2025, per Statista.
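Those Statista figures imply a compound annual growth rate of roughly 23% per year. A quick back-of-the-envelope check (the standard CAGR formula applied to the cited figures):

```python
# Back-of-the-envelope CAGR implied by the Statista figures cited above:
# a $41B market in 2024 growing to a forecast $115B in 2029 (five years).
start_value = 41.0   # market size in billion USD, 2024
end_value = 115.0    # forecast market size in billion USD, 2029
years = 2029 - 2024

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # about 22.9% per year
```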

Major auto manufacturers are incorporating AVs into their product lines. Tesla, Ford, Audi, Mercedes, Toyota, Nissan, and Volvo are testing self-driving cars, according to the Insurance Information Institute. On October 10, 2024, at a “We, Robot” event in Los Angeles, Tesla CEO Elon Musk showcased the company’s autonomous vehicles, including the Tesla Cybercab prototype and the 20-seat Tesla Robovan.

Given the industry projections, it seems only natural to ask if AI conversational chatbot capabilities can be integrated into autonomous vehicles.

For the Purdue University study, the researchers integrated their Talk2Drive framework with a 2019 Lexus RX450h and tested the solution using OpenAI’s GPT-4, a large-scale transformer-based LLM that can produce text outputs from image and text inputs and can contextually process more than 25,000 words of text. According to OpenAI, Stripe, Morgan Stanley wealth management, Duolingo, Be My Eyes, and others are using GPT-4 technology.

The researchers evaluated their AI framework with a variety of male and female drivers in several driving scenarios, including a stadium parking lot, a three-lane highway, and a two-way junction intersection.

According to the researchers, “Real-world experiments proved that the proposed system can comprehend human intentions at different intuition levels, ranging from direct commands like ‘can you drive faster’ to indirect commands like ‘I am really in a hurry now.’”
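The paper's implementation details are not reproduced here, but the core idea—prompting an LLM to map a free-form driver utterance to a discrete driving adjustment—can be sketched. Everything below is a hypothetical illustration, not the authors' code: the action set, prompt template, and `interpret_command` function are assumptions, and the stubbed model stands in for a real LLM API call such as GPT-4.

```python
# Hypothetical sketch of LLM-based driver-command interpretation, inspired by
# the Talk2Drive idea described above. None of this is the authors' code: the
# action set, prompt, and stubbed model are illustrative assumptions.

ACTIONS = {"SPEED_UP", "SLOW_DOWN", "KEEP_CURRENT"}

PROMPT_TEMPLATE = (
    "You are a driving assistant. The driver said: '{utterance}'.\n"
    "Reply with exactly one of: SPEED_UP, SLOW_DOWN, KEEP_CURRENT."
)

def stub_llm(prompt: str) -> str:
    """Stand-in for a real LLM call (e.g. GPT-4 via an API).
    Extracts the quoted utterance and pattern-matches on it so the
    sketch runs offline."""
    utterance = prompt.split("'")[1].lower()
    if "faster" in utterance or "hurry" in utterance:
        return "SPEED_UP"
    if "slow" in utterance or "careful" in utterance:
        return "SLOW_DOWN"
    return "KEEP_CURRENT"

def interpret_command(utterance: str, llm=stub_llm) -> str:
    """Map a free-form driver utterance to a discrete driving action."""
    reply = llm(PROMPT_TEMPLATE.format(utterance=utterance)).strip()
    return reply if reply in ACTIONS else "KEEP_CURRENT"  # fail safe

# Both the direct and indirect phrasings quoted above map to the same action:
print(interpret_command("can you drive faster"))        # SPEED_UP
print(interpret_command("I am really in a hurry now"))  # SPEED_UP
```

In a production system, the stub would be replaced by an actual LLM call, and the fail-safe default matters: an unparseable reply should leave the vehicle's behavior unchanged rather than guess.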

Using their AI framework, the researchers found a significant overall reduction in the driver takeover rate: up to 78.8% in the highway scenario, up to 66.7% at the intersection, and up to 100% in the parking scenario. The more direct the command, the greater the reduction in takeover rate. The researchers also reported that adding a memory module reduced the driver takeover rate by up to 65.2% compared with configurations without one.


The rising ubiquity of AI and LLMs is bringing concepts from fictional TV series to the forefront of innovative research. Will we be talking to self-driving cars in the future? That possibility is rapidly unfolding in the hands of human drivers.

Copyright © 2024 Cami Rosso All rights reserved.

