Self-Driving Vehicles Archives - The Robot Report
https://www.therobotreport.com/category/robots-platforms/self-driving-vehicles/
Robotics news, research and analysis | Tue, 25 Jun 2024

Waymo ends waitlist, opens robotaxi service to all in San Francisco
https://www.therobotreport.com/waymo-ends-waitlist-opens-robotaxi-service-all-san-francisco/
Tue, 25 Jun 2024 19:49:46 +0000

Before today, Waymo welcomed new riders incrementally in the city; now it is opening its service to everyone.


Waymo is ditching its waitlist and allowing anyone to hail a Waymo robotaxi in San Francisco. | Source: Waymo

Starting today, anyone in San Francisco can hail a robotaxi using Waymo LLC’s app. The company has been operating in the city for years now, slowly scaling its operations. In total, nearly 300,000 people who live in, work in, or visit San Francisco have signed up to ride since the company first opened its waitlist.

Before today, Waymo had welcomed new riders incrementally in the city; now it’s opening its services to everyone. The Mountain View, Calif.-based Alphabet Inc. subsidiary said in a blog post that it is already completing tens of thousands of weekly trips in San Francisco. The company claimed that its Waymo One service provides safe, sustainable, and reliable transportation to locals and visitors to the city.

“I’m thankful to be living in a city that embraces technology when it can improve our lives with convenient and safe modes of transit,” stated Michelle Cusano, Executive Director at The Richmond Neighborhood Center.

Waymo has been hard at work expanding its robotaxi operations in several cities this year. Earlier this month, the company added 90 square miles (233 sq. km) to metropolitan Phoenix, which was already its largest service area.

Waymo said its riders can now hail Waymo One service across 315 square miles (815.8 sq. km) of the Valley. The expanded service area covers more of Scottsdale’s resorts and expands to downtown Mesa, Ariz. This gives riders access to desert attractions, golf courses, and downtown destinations such as the Mesa Arts Center and Pioneer Park.

How are riders using Waymo One in SF?

Waymo recently conducted a rider survey to learn about where its users are going in its robotaxis. The company reported that about 30% of its rides in San Francisco are to local businesses.

In addition, over half of the riders said they’ve used Waymo in the past couple of months to travel to or from medical appointments. The company asserted that this highlights the value of personal space during these trips. 

Interestingly, 36% of riders in San Francisco said they used Waymo to connect to other forms of transit, like BART or Muni. 

“I enjoy riding in Waymo cars and appreciate the ease of transportation,” said Charles Renfroe, development manager at Openhouse SF. “Members of our community, especially transgender and gender non-conforming folks, don’t have to worry about being verbally assaulted or discriminated against when riding with Waymo.”

Waymo’s fleet is all-electric and sources 100% renewable energy from the city’s CleanPowerSF program. Since the beginning of its commercial operations in August 2023, the company said its rides have helped curb carbon emissions by an estimated 570,000 kg (about 628 U.S. tons).
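For readers who want to double-check the metric and U.S. figures quoted in this article, the arithmetic comes down to two standard conversion factors (0.453592 kg per pound, 2.58999 sq. km per sq. mile); a minimal check in Python:

```python
# Standard conversion factors for the figures quoted in this article.
KG_PER_LB = 0.453592
LBS_PER_SHORT_TON = 2000
SQ_KM_PER_SQ_MILE = 2.58999

def kg_to_short_tons(kg: float) -> float:
    """Convert kilograms to U.S. (short) tons."""
    return kg / KG_PER_LB / LBS_PER_SHORT_TON

def sq_miles_to_sq_km(mi2: float) -> float:
    """Convert square miles to square kilometers."""
    return mi2 * SQ_KM_PER_SQ_MILE

print(round(kg_to_short_tons(570_000), 1))  # 570,000 kg is roughly 628.3 U.S. tons
print(round(sq_miles_to_sq_km(315), 1))     # 315 sq. miles is roughly 815.8 sq. km
```

Note that 570,000 kg works out to about 628 U.S. short tons, and 315 square miles matches the 815.8 sq. km figure in the Phoenix expansion above.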

California Sen. Dave Cortese last week withdrew Senate Bill 915. It would have allowed local governments to restrict and tax autonomous vehicle companies, similar to how conventional taxicab companies are regulated in California.

Robotaxi hits rough roads in Phoenix

Earlier this month, Waymo issued a voluntary software recall for all of its 672 robotaxis after one autonomously drove into a telephone pole in Phoenix last month. This was Waymo’s second-ever recall.

During the incident, which took place on May 21, an empty Waymo vehicle was driving to pick up a passenger. To get there, it drove through an alley lined on both sides by wooden telephone poles that were level with the road, not up on a curb. The road had longitudinal yellow striping on both sides to indicate the path for vehicles.

As the vehicle pulled over, it struck one of the poles at a speed of 8 mph (12.8 kph), sustaining some damage. No passengers or bystanders were hurt, said Waymo.

After completing the software update, the company filed the recall with the National Highway Traffic Safety Administration (NHTSA). Waymo said this update corrects an error in the software that “assigns a low damage score” to the telephone pole. In addition, it updates the company’s map so its vehicles can better account for the hard road edge in the alleyway that was previously not included.
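Waymo hasn’t published the internals of this scoring system, so the following is only a schematic illustration of the kind of bug described: if a perception stack assigns an object class an unrealistically low damage score, a planner that weighs those scores along a candidate trajectory can treat contact with that object as acceptable. All class names and values below are invented:

```python
# Hypothetical illustration of the recall described above: a per-class
# "damage score" feeds a trajectory cost, and an object scored too low
# can be contacted without the plan being rejected. Values are invented.

DAMAGE_SCORE = {
    "pedestrian": 1.0,
    "vehicle": 0.9,
    "telephone_pole": 0.05,  # buggy: pole scored as nearly harmless to contact
}
DAMAGE_SCORE_FIXED = {**DAMAGE_SCORE, "telephone_pole": 0.9}

def trajectory_cost(objects_contacted: list[str], scores: dict[str, float]) -> float:
    """Sum the damage scores of objects a candidate trajectory would touch."""
    return sum(scores[obj] for obj in objects_contacted)

pullover = ["telephone_pole"]
print(trajectory_cost(pullover, DAMAGE_SCORE))        # 0.05 -> contact looks acceptable
print(trajectory_cost(pullover, DAMAGE_SCORE_FIXED))  # 0.9  -> contact is penalized
```

In this toy version, the "recall" is just correcting one entry in the score table; the article notes Waymo’s actual fix also updated the map with the alley’s hard road edge.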

Waymo’s engineers deployed the fix at the central depot to which its robotaxis regularly return for maintenance and testing; it was not an over-the-air software update.

Autonomous vehicle legislation withdrawn from California senate
https://www.therobotreport.com/autonomous-vehicle-legislation-withdrawn-from-california-senate/
Tue, 25 Jun 2024 14:37:30 +0000

SB 915 would have allowed California municipalities to restrict and tax AV companies, similar to how taxi companies are regulated in the state.


Several autonomous vehicle companies are based in California, including Cruise and Waymo. | Source: Waymo

California Sen. Dave Cortese last week withdrew Senate Bill (SB) 915 from consideration. SB 915 would have allowed local municipalities to restrict and tax autonomous vehicle (AV) companies, similar to how taxicab companies are regulated in the state. 

“It’s good to see California lawmakers going back to the drawing board on autonomous vehicle policy,” said Chamber of Progress director of civic innovation policy Ruth Whittaker. “Autonomous vehicles have the power to save thousands of lives in California by eliminating drunk, distracted, and unsafe human driving. Over the past month, we’ve heard leaders from across the state raise concerns that this bill could derail progress on California’s roads.”

Currently, AVs are regulated by two statewide entities in California: the California Department of Motor Vehicles (DMV) and the California Public Utilities Commission (CPUC). Once an AV company has received all the proper permits from these two entities, it can run its robotaxi service in the state.

SB 915 would have required AV companies to obtain permits from every city and/or county they run their services in.

Had the bill passed, cities and counties could create a permitting program for the vehicles, establish vehicle caps and hours of service restrictions, and establish interoperability or override systems that first responders could access in emergencies. Each city and/or county would have also been able to levy service charges, fees, or assessments to the companies. 

Additionally, SB 915 would have made it unlawful to operate an autonomous vehicle service without a valid permit issued by the local jurisdiction in which the service is substantially located. 

Opponents of the bill said it would tie AV companies up in regulatory red tape, throttling their ability to grow and deploy their services. Proponents, on the other hand, said it would give power back to cities and counties, where legislation typically moves more quickly than in statewide agencies.

Autonomous vehicle companies face additional scrutiny in California

While SB 915 has been dropped, the AV industry in California isn’t completely in the clear. Earlier this year, the city of San Francisco filed a lawsuit against the CPUC to drastically reduce the number of robotaxis on the city’s roads.

The lawsuit centers around the CPUC’s decision in August 2023 to grant both GM’s Cruise and Alphabet’s Waymo their final permits in the state (Cruise’s permits have since been revoked). These permits allowed the companies to charge for rides, expand their hours of operation and service areas, and add as many robotaxis to their fleets as they wanted. The lawsuit asks the court to review the CPUC’s decision and whether it complied with the law.

San Francisco city attorney David Chiu filed an administrative motion after the August decision in an attempt to delay Cruise and Waymo from ramping up operations and get another hearing with the CPUC. In December, the City Attorney’s office filed a lawsuit with the California Appellate Court to request the CPUC review its August decision and revoke Waymo’s permit.

The lawsuit also asks the CPUC to develop reporting requirements, safety benchmarks, and other public safety regulations to address incidents that have involved first responders, created traffic, and disrupted public transportation.

There are several autonomous vehicle companies based in California, including Cruise and Waymo.

Neya Systems, AUVSI to develop cybersecurity certification program for UGVs
https://www.therobotreport.com/neya-systems-auvsi-to-develop-cybersecurity-certification-program-for-ugvs/
Fri, 21 Jun 2024 13:50:50 +0000

Neya Systems and AUVSI say there is a growing need for standardized evaluation and certification of uncrewed ground vehicles.


Neya offers autonomy, mission planning, and open architecture for uncrewed ground vehicles. | Source: Neya Systems

Neya Systems yesterday announced that it is partnering with the Association for Uncrewed Vehicle Systems International, or AUVSI. The partners said they plan to develop a cybersecurity and supply chain framework and certification program for uncrewed ground vehicles (UGVs). 

AUVSI and Neya Systems said they have observed a growing need for standardized evaluation and certification of UGVs. The goal of the collaboration is to establish comprehensive standards and testing protocols to enhance the security, safety, performance, and reliability of uncrewed and autonomous ground vehicles and robots.

The framework and voluntary certification program will focus on enhancing the protection, mitigation, recovery, and adaptability of UGVs, said the organizations.

“We are excited to announce the development of this cybersecurity certification program for UGVs,” stated Kurt Bruck, vice president at Neya Systems. “This initiative represents a significant step forward in our efforts to establish an industry standard for protecting UGVs from unauthorized access. Our partnership with AUVSI will enable us to foster innovation and trust within the industry as a whole, ultimately enhancing the safety and reliability of these autonomous systems.”

Neya Systems has cybersecurity, simulation expertise

Warrendale, Pa.-based Neya Systems develops and integrates advanced, vehicle-agnostic, off-road, and airborne autonomy. The subsidiary of Applied Research Associates is a 2024 RBR50 Robotics Innovation Award winner for its cyber autonomy initiative.

In March, Neya said it is working with the Embodied AI Foundation to update the CARLA open-source simulator for autonomous driving research to Unreal Engine 5.

Neya Systems said it will bring to the partnership its expertise in applying the U.S. Department of Defense’s (DoD) Zero Trust cybersecurity principles to its autonomy software.


Neya has worked with the U.S. Army to turn the Palletized Load System into an optionally crewed, autonomous vehicle. Source: Neya Systems

AUVSI brings complementary experience

Arlington, Va.-based AUVSI plans to share the industry expertise of members in its Cyber Working Group and Ground Advocacy Committee.

The nonprofit organization is dedicated to the advancement of uncrewed systems and robotics. It represents corporate, government, and academic professionals from more than 60 countries. AUVSI said its members work in defense, civil, and commercial markets. 

AUVSI’s Cyber Working Group previously advised on the development of AUVSI’s Green UAS Frameworks and certification. It said this is the only verification method besides Blue UAS that the DoD’s Defense Innovation Unit has approved as confirming compliance with National Defense Authorization Act (NDAA) requirements for drones.

“The need for standards and certifications for uncrewed systems continues to grow alongside the development and integration of uncrewed and autonomous vehicles and robotics,” noted Casie Ocaña, director of trusted programs at AUVSI. “In the ground domain, AUVSI is looking to leverage our Trusted Cyber framework so that we can offer a solution to verify and support compliance among ground vehicle and robotics companies – which will further advance the safe and reliable future of these technologies.”

Wayve launches PRISM-1 4D reconstruction model for autonomous driving
https://www.therobotreport.com/wayve-launches-prism-1-4d-reconstruction-model-for-autonomous-driving/
Tue, 18 Jun 2024 17:15:35 +0000

Wayve says PRISM-1 enables scalable, realistic re-simulations of complex scenes with minimal engineering or labeling input.


A scene reconstructed by Wayve’s PRISM-1 technology. | Source: Wayve

Wayve, a developer of embodied artificial intelligence, launched PRISM-1, a 4D reconstruction model that it said can enhance the testing and training of its autonomous driving technology. 

The London-based company first showed the technology in December 2023 through its Ghost Gym neural simulator. Wayve used novel view synthesis to create precise 4D scene reconstructions (three dimensions in space plus time) using only camera inputs.

It achieved this using unique methods that it claimed can accurately and efficiently simulate the dynamics of complex and unstructured environments for advanced driver-assist systems (ADAS) and self-driving vehicles. PRISM-1 is the model that powers the next generation of Ghost Gym simulations.

“PRISM-1 bridges the gap between the real world and our simulator,” stated Jamie Shotton, chief scientist at Wayve. “By enhancing our simulation platform with accurate dynamic representations, Wayve can extensively test, validate, and fine-tune our AI models at scale.”

“We are building embodied AI technology that generalizes and scales,” he added. “To achieve this, we continue to advance our end-to-end AI capabilities, not only in our driving models, but also through enabling technologies like PRISM-1. We are also excited to publicly release our WayveScenes101 dataset, developed in conjunction with PRISM-1, to foster more innovation and research in novel view synthesis for driving.”

PRISM-1 excels at realism in simulation, Wayve says

Wayve said PRISM-1 enables scalable, realistic re-simulations of complex driving scenes with minimal engineering or labeling input. 

Unlike traditional methods, which rely on lidar and 3D bounding boxes, PRISM-1 uses novel view synthesis techniques to accurately depict moving elements like pedestrians, cyclists, vehicles, and traffic lights. The system includes precise details, like clothing patterns, brake lights, and windshield wipers.

Achieving realism is critical for building an effective training simulator and evaluating driving technologies, according to Wayve. Traditional simulation technologies treat vehicles as rigid entities and fail to capture safety-critical dynamic behaviors like indicator lights or sudden braking. 

PRISM-1, on the other hand, uses a flexible framework that can identify and track changes in the appearance of scene elements over time, said the company. This enables it to precisely re-simulate complex dynamic scenarios with elements that change in shape and move throughout the scene. 

It can distinguish between static and dynamic elements in a self-supervised manner, avoiding the need for explicit labels, scene graphs, and bounding boxes to define the configuration of a busy street.

Wayve said this approach maintains efficiency, even as scene complexity increases, ensuring that more complex scenarios do not require additional engineering effort. This makes PRISM-1 a scalable and efficient system for simulating complex urban environments, it asserted.
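Wayve has not disclosed how PRISM-1 performs its self-supervised static/dynamic separation, but the idea can be illustrated in miniature: measure how much each scene element’s observations vary over time and treat high-variance elements as dynamic, with no labels, scene graphs, or bounding boxes required. A toy sketch (the data layout and threshold are invented):

```python
import statistics

# Toy illustration of label-free static/dynamic separation: each scene
# element is a series of per-frame positions; elements whose position
# varies over time are treated as dynamic. The threshold is arbitrary.

def split_static_dynamic(tracks: dict[str, list[float]], thresh: float = 0.1):
    static, dynamic = [], []
    for name, positions in tracks.items():
        if statistics.pstdev(positions) > thresh:
            dynamic.append(name)
        else:
            static.append(name)
    return static, dynamic

tracks = {
    "building": [5.0, 5.0, 5.0, 5.0],     # never moves
    "cyclist": [0.0, 1.2, 2.5, 3.9],      # moves every frame
    "parked_car": [2.0, 2.0, 2.01, 2.0],  # effectively static despite noise
}
static, dynamic = split_static_dynamic(tracks)
print(static, dynamic)  # ['building', 'parked_car'] ['cyclist']
```

A real system works from camera pixels rather than clean per-object tracks, but the payoff is the same: no hand labeling is needed as scenes grow more complex.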

WayveScenes101 benchmark released

Wayve also released its WayveScenes101 benchmark. The dataset comprises 101 diverse driving scenarios from the U.K. and the U.S., including urban, suburban, and highway scenes across various weather and lighting conditions.

The company said it aims for the dataset to support the AI research community in advancing novel view synthesis models and the development of more robust and accurate scene representation models for driving.

Last month, Wayve closed a $1.05 billion Series C funding round. SoftBank Group led the round, which also included new investor NVIDIA and existing investor Microsoft.

Since its founding, Wayve has developed and tested its autonomous driving system on public roads. It has also developed foundation models for autonomy, similar to “GPT for driving,” that it says can empower any vehicle to perceive its surroundings and safely drive through diverse environments. 

Waabi raises $200M from Uber, NVIDIA, and others on the road to self-driving trucks
https://www.therobotreport.com/waabi-raises-200m-uber-nvidia-on-the-road-self-driving-trucks/
Tue, 18 Jun 2024 12:40:06 +0000

Waabi, which has been developing self-driving trucks using generative AI, plans to put its systems on Texas roads in 2025.


The Waabi Driver includes a generative AI stack as well as sensors and compute hardware. Source: Waabi

Autonomous passenger vehicles have hit potholes over the past few years, with accidents leading to regulatory scrutiny, but investment in self-driving trucks has continued. Waabi today announced that it has raised $200 million in an oversubscribed Series B round. The funding brings total investment in the Toronto-based startup to more than $280 million.

Waabi said that it “is on the verge of Level 4 autonomy” and that it expects to deploy fully autonomous trucks in Texas next year. The company claimed that it has been able to advance quickly toward that goal because of its use of generative artificial intelligence in the physical world.

“I have spent most of my professional life dedicated to inventing new AI technologies that can deliver on the enormous potential of AI in the physical world in a provably safe and scalable way,” stated Raquel Urtasun, a professor at the University of Toronto and founder and CEO of Waabi.

“Over the past three years, alongside the incredible team at Waabi, I have had the chance to turn these breakthroughs into a revolutionary product that has far surpassed my expectations,” she added. “We have everything we need — breakthrough technology, an incredible team, and pioneering partners and investors — to launch fully driverless autonomous trucks in 2025. This is monumental for the industry and truly marks the beginning of the next frontier for AI.”

Waabi uses generative AI to reduce on-road testing

Waabi said it is pioneering generative AI for the physical world, starting with applying the technology to self-driving trucks. The company said it has developed “a single end-to-end AI system that is capable of human-like reasoning, enabling it to generalize to any situation that might happen on the road, including those it has never seen before.”

Because of that ability to generalize, the system requires significantly less training data and compute than traditional approaches to autonomy, asserted Waabi. In addition, the company claimed that its system is fully interpretable and that its safety can be validated and verified.

The company said Copilot4D, its “end-to-end AI system, paired with Waabi World, the world’s most advanced simulator, reduces the need for extensive on-road testing and enables a safer, more efficient solution that is highly performant and scalable from Day 1.”

Several industry observers have pointed out that self-driving trucks will likely arrive on public roads before widespread deployments of robotaxis in the U.S. While Waymo has pumped the brakes on its trucking development, other companies have made progress, including Inceptio, FERNRIDE, Kodiak Robotics, and Aurora.

At the same time, work on self-driving cars continues, with Wayve raising $1.05 billion last month and TIER IV obtaining $54 million. General Motors invested another $850 million in Cruise yesterday.

“Self-driving technology is a prime example of how AI can dramatically improve our lives,” said AI luminary Geoff Hinton. “Raquel and Waabi are at the forefront of innovation, developing a revolutionary approach that radically changes the way autonomous systems work and leads to safer and more efficient solutions.”

Waabi plans to expand its commercial operations and grow its team in Canada and the U.S. The company cited recent accomplishments, including the opening of its new Texas AV trucking terminal, a collaboration with NVIDIA to integrate NVIDIA DRIVE Thor into the Waabi Driver, and its ongoing partnership with Uber Freight. It has run autonomous shipments for Fortune 500 companies and top-tier shippers in Texas.


Copilot4D predicts future lidar point clouds from a history of past observations, similar to how large language models (LLMs) predict the next word given the preceding text. Source: Waabi
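The LLM analogy can be made concrete: discretize each lidar sweep into a set of tokens, then predict the next sweep’s tokens from the history. The sketch below shows only the data flow; the tokenizer and the one-line "predictor" are trivial placeholders, not Waabi’s learned models:

```python
# Schematic of next-sweep prediction in the LLM style described above.
# Real systems learn both the tokenizer and the predictor; here both are
# trivial stand-ins that only illustrate the data flow.

def tokenize_sweep(points: list[tuple[float, float, float]], cell: float = 1.0):
    """Quantize lidar points into coarse voxel tokens (stand-in tokenizer)."""
    return sorted({(round(x / cell), round(y / cell), round(z / cell))
                   for x, y, z in points})

def predict_next_tokens(history: list[list[tuple]]) -> list[tuple]:
    """Placeholder 'model': forecast by repeating the latest sweep's tokens."""
    return history[-1]

sweeps = [
    [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0)],  # sweep at t=0
    [(0.2, 0.0, 0.0), (1.3, 0.0, 0.0)],  # sweep at t=1
]
history = [tokenize_sweep(s) for s in sweeps]
predicted = predict_next_tokens(history)
print(predicted)  # tokens of the most recent sweep, used as the naive forecast
```

Replacing the placeholder predictor with an autoregressive transformer over these token sequences is what lets such a model borrow the training machinery of LLMs.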

Technology leaders invest in self-driving trucks

Waabi noted that top AI, automotive, and logistics enterprises were among its investors. Uber and Khosla Ventures led Waabi’s Series B round. Other participants included NVIDIA, Volvo Group Venture Capital, Porsche Automobil Holding, Scania Invest, and Ingka Investments.

“Waabi is developing autonomous trucking by applying cutting-edge generative AI to the physical world,” said Jensen Huang, founder and CEO of NVIDIA. “I’m excited to support Raquel’s vision through our investment in Waabi, which is powered by NVIDIA technology. I have championed Raquel’s pioneering work in AI for more than a decade. Her tenacity to solve the impossible is an inspiration.”

Additional support came from HarbourVest Partners, G2 Venture Partners, BDC Capital’s Thrive Venture Fund, Export Development Canada, Radical Ventures, Incharge Capital, and others.

“We are big believers in the potential for autonomous technology to revolutionize transportation, making a safer and more sustainable future possible,” added Dara Khosrowshahi, CEO of Uber. “Raquel is a visionary in the field, and under her leadership, Waabi’s AI-first approach provides a solution that is extremely exciting in both its scalability and capital efficiency.”

Vinod Khosla, founder of Khosla Ventures, said: “Change never comes from incumbents but from the innovation of entrepreneurs that challenge the status quo. Raquel and her team at Waabi have done exactly that with their products and business execution. We backed Waabi very early on with the bet that generative AI would transform transportation and are thrilled to continue on this journey with them as they move towards commercialization.”

GM invests another $850M into Cruise as it expands manual operations
https://www.therobotreport.com/gm-invests-another-850m-into-cruise-as-it-expands-manual-operations/
Mon, 17 Jun 2024 20:51:10 +0000

To date, General Motors has invested more than $8 billion into Cruise, and in 2023 alone, the company lost $3.48 billion.


Cruise has resumed some operations in three cities. | Source: Cruise

Despite a bumpy 2023, Cruise LLC isn’t nearing the end of the road yet. General Motors, its parent company, said last week that it would invest an additional $850 million into the autonomous vehicle developer.

To date, GM has invested more than $8 billion into Cruise and hasn’t yet seen much of a return. In 2023 alone, the San Francisco-based company lost $3.48 billion.

However, it doesn’t seem like GM is ready to take the route that Ford and VW did when those automakers shut down Argo AI. Its additional investment will help cover Cruise’s operational costs. The company also said it’s looking for new external investors to help bolster its financial situation. 

GM to review self-driving division

Paul Jacobson, GM’s chief financial officer, announced the investment during the Deutsche Bank Global Auto Industry Conference held in New York City. Jacobson said that the investment will buy GM time to conduct a “strategic review” of the division’s future.

Cruise faced a number of struggles in 2023, culminating in an Oct. 2 incident when one of Cruise’s vehicles dragged a pedestrian 20 feet after she was hit by another driver. After the accident, the California Department of Motor Vehicles (DMV) suspended Cruise’s permits in the state, alleging that the company withheld footage of it.

Cruise disputed the allegation but paused nationwide operations to reestablish trust with the public. Since then, the city of San Francisco filed a lawsuit against the California Public Utilities Commission (CPUC), the organization responsible for regulating autonomous vehicles (AVs) in the state, to drastically reduce the number of robotaxis on the city’s roads. 

In May 2024, Cruise reached a settlement with the pedestrian for between $8 million and $12 million, according to Bloomberg News.

Cruise gets back on the road in select areas

Last week, Cruise resumed manual driving in Houston and Dallas and announced that supervised driving is under way in Phoenix and Dallas. It began manual operations in Phoenix in April.

The company has yet to restart driving in its home state of California. Cruise has long had its sights set on deploying its driverless robotaxis in San Francisco, where it was founded in 2013. It has said the city’s difficult driving conditions will prepare its autonomous driving system for any other city. 

Cruise is taking an incremental approach to rolling its robotaxis back out on public roads. Starting with manual operations, where a human driver controls the vehicles without autonomous systems engaged, allows the company to gather road information and create maps. 

The company said that supervised autonomous driving, where Cruise’s vehicles drive themselves with a safety driver behind the wheel ready to take over if needed, builds on that data. Cruise added that safety drivers play an important role in testing AV performance on real-world roads and driving scenarios.

During this phase of operations, Cruise said it will validate its AV’s end-to-end behaviors against its rigorous safety and performance requirements. 

At CVPR, NVIDIA offers Omniverse microservices, shows advances in visual generative AI
https://www.therobotreport.com/nvidia-offers-omniverse-microservices-advances-visual-generative-ai-cvpr/
Mon, 17 Jun 2024 13:00:07 +0000

Omniverse Cloud Sensor RTX can generate synthetic data for robotics, says NVIDIA, which is presenting over 50 research papers at CVPR.


As shown at CVPR, Omniverse Cloud Sensor RTX microservices generate high-fidelity sensor simulation from
an autonomous vehicle (left) and an autonomous mobile robot (right). Sources: NVIDIA, Fraunhofer IML (right)

NVIDIA Corp. today announced NVIDIA Omniverse Cloud Sensor RTX, a set of microservices that enable physically accurate sensor simulation to accelerate the development of all kinds of autonomous machines.

NVIDIA researchers are also presenting 50 research projects around visual generative AI at the Computer Vision and Pattern Recognition, or CVPR, conference this week in Seattle. They include new techniques to create and interpret images, videos, and 3D environments. In addition, the company said it has created its largest indoor synthetic dataset with Omniverse for CVPR’s AI City Challenge.

Sensors provide industrial manipulators, mobile robots, autonomous vehicles, humanoids, and smart spaces with the data they need to comprehend the physical world and make informed decisions.

NVIDIA said developers can use Omniverse Cloud Sensor RTX to test sensor perception and associated AI software in physically accurate, realistic virtual environments before real-world deployment. This can enhance safety while saving time and costs, it said.

“Developing safe and reliable autonomous machines powered by generative physical AI requires training and testing in physically based virtual worlds,” stated Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA. “Omniverse Cloud Sensor RTX microservices will enable developers to easily build large-scale digital twins of factories, cities and even Earth — helping accelerate the next wave of AI.”

Omniverse Cloud Sensor RTX supports simulation at scale

Built on the OpenUSD framework and powered by NVIDIA RTX ray-tracing and neural-rendering technologies, Omniverse Cloud Sensor RTX combines real-world data from videos, cameras, radar, and lidar with synthetic data.

Omniverse Cloud Sensor RTX includes software application programming interfaces (APIs) to accelerate the development of autonomous machines for any industry, NVIDIA said.

Even for scenarios with limited real-world data, the microservices can simulate a broad range of activities, claimed the company. It cited examples such as whether a robotic arm is operating correctly, an airport luggage carousel is functional, a tree branch is blocking a roadway, a factory conveyor belt is in motion, or a robot or person is nearby.
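NVIDIA's announcement does not detail the Sensor RTX APIs, but the basic question such simulations answer — what would a sensor return in a given scene? — can be illustrated with a toy 2D lidar ray-marcher. This is purely illustrative; every name below is hypothetical and unrelated to NVIDIA's actual interfaces:

```python
import math

def simulate_lidar_2d(pose, obstacles, n_beams=8, max_range=10.0, step=0.05):
    """Return the range measured by each beam of a toy 2D lidar.

    Each beam marches outward from the sensor pose (x, y, heading) until
    it enters an axis-aligned obstacle box (xmin, ymin, xmax, ymax) or
    reaches max_range. A hypothetical stand-in for physically based
    sensor simulation, not an NVIDIA API.
    """
    x0, y0, heading = pose
    ranges = []
    for i in range(n_beams):
        angle = heading + 2.0 * math.pi * i / n_beams
        dx, dy = math.cos(angle), math.sin(angle)
        r, d = max_range, 0.0
        while d < max_range:
            px, py = x0 + dx * d, y0 + dy * d
            if any(xmin <= px <= xmax and ymin <= py <= ymax
                   for xmin, ymin, xmax, ymax in obstacles):
                r = d  # beam hit an obstacle at distance d
                break
            d += step
        ranges.append(r)
    return ranges

# A wall 2 m east of the sensor: the +x beam reports ~2.0, the -x beam max range.
scan = simulate_lidar_2d((0.0, 0.0, 0.0), [(2.0, -1.0, 2.5, 1.0)])
```

Production systems trace rays against full 3D geometry with material-aware returns (and, per NVIDIA, RTX-accelerated ray tracing), but the march-until-hit structure is the same.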

Microservice to be available for AV development 

CARLA, Foretellix, and MathWorks are among the first software developers with access to Omniverse Cloud Sensor RTX for autonomous vehicles (AVs). The microservices will also enable sensor makers to validate and integrate digital twins of their systems in virtual environments, reducing the time needed for physical prototyping, said NVIDIA.

Omniverse Cloud Sensor RTX will be generally available later this year. NVIDIA noted that its announcement coincided with its first-place win at the Autonomous Grand Challenge for End-to-End Driving at Scale at CVPR.

The NVIDIA researchers’ winning workflow can be replicated in high-fidelity simulated environments with Omniverse Cloud Sensor RTX. Developers can use it to test self-driving scenarios in physically accurate environments before deploying AVs in the real world, said the company.

Two of NVIDIA’s papers — one on the training dynamics of diffusion models and another on high-definition maps for autonomous vehicles — are finalists for the Best Paper Awards at CVPR.

The company also said its win for the End-to-End Driving at Scale track demonstrates its use of generative AI for comprehensive self-driving models. The winning submission outperformed more than 450 entries worldwide and received CVPR’s Innovation Award.

Collectively, the work introduces artificial intelligence models that could accelerate the training of robots for manufacturing, enable artists to more quickly realize their visions, and help healthcare workers process radiology reports.

“Artificial intelligence — and generative AI in particular — represents a pivotal technological advancement,” said Jan Kautz, vice president of learning and perception research at NVIDIA. “At CVPR, NVIDIA Research is sharing how we’re pushing the boundaries of what’s possible — from powerful image-generation models that could supercharge professional creators to autonomous driving software that could help enable next-generation self-driving cars.”

Foundation model eases object pose estimation

NVIDIA researchers at CVPR are also presenting FoundationPose, a foundation model for object pose estimation and tracking that can be instantly applied to new objects during inference, without the need for fine tuning. The model uses either a small set of reference images or a 3D representation of an object to understand its shape. It set a new record on a benchmark for object pose estimation.

FoundationPose can then identify and track how that object moves and rotates in 3D across a video, even in poor lighting conditions or complex scenes with visual obstructions, explained NVIDIA.

Industrial robots could use FoundationPose to identify and track the objects they interact with. Augmented reality (AR) applications could also use it with AI to overlay visuals on a live scene.

NeRFDeformer transforms data from a single image

NVIDIA’s research includes a text-to-image model that can be customized to depict a specific object or character, a new model for object-pose estimation, a technique to edit neural radiance fields (NeRFs), and a visual language model that can understand memes. Additional papers introduce domain-specific innovations for industries including automotive, healthcare, and robotics.

A NeRF is an AI model that can render a 3D scene based on a series of 2D images taken from different positions in the environment. In robotics, NeRFs can generate immersive 3D renders of complex real-world scenes, such as a cluttered room or a construction site.
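For readers unfamiliar with the mechanics, the per-ray computation inside a NeRF follows the standard volume-rendering quadrature. A minimal sketch of that textbook rule (generic, not NVIDIA's code):

```python
import math

def render_ray(sigmas, colors, deltas):
    """Composite per-sample (density, RGB color) pairs along one camera ray.

    Standard NeRF volume rendering:
        alpha_i  = 1 - exp(-sigma_i * delta_i)
        weight_i = T_i * alpha_i, where T_i is the accumulated transmittance
        C        = sum_i weight_i * c_i
    """
    transmittance = 1.0
    out = [0.0, 0.0, 0.0]
    for sigma, color, delta in zip(sigmas, colors, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)
        weight = transmittance * alpha
        for k in range(3):
            out[k] += weight * color[k]
        transmittance *= 1.0 - alpha  # later samples are occluded by earlier ones
    return out

# A single fully opaque red sample returns (nearly) pure red.
pixel = render_ray([1000.0], [(1.0, 0.0, 0.0)], [1.0])
```

The trained network supplies the densities and colors at sampled 3D points; rendering a full image repeats this compositing for every pixel's ray.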

However, to make any changes, developers would need to manually define how the scene has transformed — or remake the NeRF entirely.

Researchers from the University of Illinois Urbana-Champaign and NVIDIA have simplified the process with NeRFDeformer. The method can transform an existing NeRF using a single RGB-D image, which is a combination of a normal photo and a depth map that captures how far each object in a scene is from the camera.


Researchers have simplified the process of generating a 3D scene from 2D images using NeRFs. Source: NVIDIA

JeDi model shows how to simplify image creation at CVPR

Creators typically use diffusion models to generate specific images based on text prompts. Prior research focused on the user training a model on a custom dataset, but the fine-tuning process can be time-consuming and inaccessible to general users, said NVIDIA.

JeDi, a paper by researchers from Johns Hopkins University, Toyota Technological Institute at Chicago, and NVIDIA, proposes a new technique that allows users to personalize the output of a diffusion model within a couple of seconds using reference images. The team found that the model outperforms existing methods.

NVIDIA added that JeDi can be combined with retrieval-augmented generation, or RAG, to generate visuals specific to a database, such as a brand’s product catalog.


JeDi is a new technique that allows users to easily personalize the output of a diffusion model within a couple of seconds using reference images, like an astronaut cat that can be placed in different environments. Source: NVIDIA

Visual language model helps AI get the picture

NVIDIA said it has collaborated with the Massachusetts Institute of Technology (MIT) to advance the state of the art for vision language models, which are generative AI models that can process videos, images, and text. The partners developed VILA, a family of open-source visual language models that they said outperforms prior neural networks on benchmarks that test how well AI models answer questions about images.

VILA’s pretraining process provided enhanced world knowledge, stronger in-context learning, and the ability to reason across multiple images, claimed the MIT and NVIDIA team.

The VILA model family can be optimized for inference using the NVIDIA TensorRT-LLM open-source library and can be deployed on NVIDIA GPUs in data centers, workstations, and edge devices.


VILA can understand memes and reason based on multiple images or video frames. Source: NVIDIA

Generative AI drives AV, smart city research at CVPR

NVIDIA Research has hundreds of scientists and engineers worldwide, with teams focused on topics including AI, computer graphics, computer vision, self-driving cars, and robotics. A dozen of the NVIDIA-authored CVPR papers focus on autonomous vehicle research.

“Producing and Leveraging Online Map Uncertainty in Trajectory Prediction,” a paper authored by researchers from the University of Toronto and NVIDIA, has been selected as one of 24 finalists for CVPR’s best paper award.

In addition, Sanja Fidler, vice president of AI research at NVIDIA, will present on vision language models at the Workshop on Autonomous Driving today.

NVIDIA has contributed to the CVPR AI City Challenge for the eighth consecutive year to help advance research and development for smart cities and industrial automation. The challenge’s datasets were generated using NVIDIA Omniverse, a platform of APIs, software development kits (SDKs), and services for building applications and workflows based on Universal Scene Description (OpenUSD).


AI City Challenge synthetic datasets span multiple environments generated by NVIDIA Omniverse, allowing hundreds of teams to test AI models in physical settings such as retail and warehouse environments to enhance operational efficiency. Source: NVIDIA

About the author

Isha Salian writes about deep learning, science and healthcare, among other topics, as part of NVIDIA’s corporate communications team. She first joined the company as an intern in summer 2015. Isha has a journalism M.A., as well as undergraduate degrees in communication and English, from Stanford.

The post At CVPR, NVIDIA offers Omniverse microservices, shows advances in visual generative AI appeared first on The Robot Report.

Beep deploys autonomous shuttles at Honolulu airport with partners https://www.therobotreport.com/beep-deploys-autonomous-shuttles-honolulu-airport-with-partners/ https://www.therobotreport.com/beep-deploys-autonomous-shuttles-honolulu-airport-with-partners/#respond Sun, 16 Jun 2024 12:00:27 +0000 https://www.therobotreport.com/?p=579428 Beep discusses its pilot of Miki shuttles with Sustainability Partners and the Honolulu DoT at Daniel K. Inouye International Airport.



Wiki Wiki shuttles await passenger riders at the Daniel K. Inouye International Airport (HNL) in Honolulu. Source: Beep

As millions of people in the Northern Hemisphere begin summer vacations, some of them will board autonomous vehicles for part of their journeys. Last month, Beep Inc. announced that it is working with the Hawai’i Department of Transportation, or HDOT, and Sustainability Partners to launch an 18-month self-driving shuttle pilot at Daniel K. Inouye International Airport (HNL).

“Through our partnership with Sustainability Partners, we’re honored that HDOT and HNL have placed their trust in our experience, leadership and differentiated approach of safe and integrated autonomous mobility with the launch of the Miki shuttle pilot service,” stated Joe Moye, CEO of Beep, at the time. “Our fleet of turnkey shared and electric autonomous shuttles prioritizes safety and sustainability while enhancing the airport travel experience for passengers.”

Founded in 2018, Beep said it delivers software and services for next-generation autonomous, shared-mobility systems. The Orlando, Fla.-based company plans, deploys, and manages autonomous shuttles for private and public communities. It also claimed that it continually improves safety and operating capabilities with data from its deployments.

Eduardo Rosa, senior vice president of operations at Beep, answered the following questions about the Honolulu deployment, which the company claimed is the first of its kind:

Autonomous shuttles face crowded airport environs

What are some of the current transport challenges at airports, and specifically at the Daniel K. Inouye International Airport?

Rosa: Between a skyrocketing number of air travelers, traditional transportation networks and staffing issues, many airports around the country are facing significant challenges addressing their transportation needs.

While the Daniel K. Inouye International Airport shares many common transport challenges with other major airports, its unique location and the high volume of tourist traffic presents specific issues that require ongoing attention and improvement.

The Hawaii Department of Transportation, which operates HNL and 14 other airports statewide, is addressing these challenges through an airports modernization program that includes infrastructure upgrades, improved traffic management systems, and enhancements to existing transport links. This includes initiatives such as the pilot program with Beep to see if autonomous shuttles can be incorporated into the airport’s passenger-shuttle operations.

How does the Miki shuttle deal with obstacles such as luggage carts, manual shuttles, and pedestrians?

Rosa: Like all of Beep’s autonomous shared-mobility solutions, the Miki — the Hawaiian word for “agile” — shuttles are equipped with autonomous software and hardware that allow the vehicles to safely navigate any obstacles and operate alongside human-driven vehicles. These technologies, along with an onboard attendant, allow the shuttles to be operated safely in airport and other congested environments like campuses, communities, and public transit.

The Miki shuttles operate as secondary support vehicles to the airport’s existing Wiki Wiki — the Hawaiian word for “fast” — shuttle buses. The vehicles operate along dedicated routes in the airport’s restricted area, where pedestrians, luggage carts and other obstacles do not impede mobility.

Beep leverages its leadership in having run the nation’s largest and most tenured autonomous shuttle service deployments to date—from Hawaii, California, and North Carolina to Florida and more from coast to coast. [It] is entrusted by local governments and transit authorities nationwide.

Our deployments have shuttled tens of thousands of people safely and are allowing us to actively lead the charge in evolving the technology and operations to navigate through complicated, real-world environments. It’s a constant evolution.

How much integration is necessary between the Miki and Wiki Wiki shuttles?

Rosa: With the ability to carry 11 passengers including an attendant, the Miki shuttles are operating on the same routes as the Wiki Wiki shuttles and are easily capable of working alongside and augmenting human-driven routes. In many of Beep’s deployments, shuttles operate safely alongside other drivers and municipal vehicles.

The Miki and Wiki Wiki shuttles transport passengers between Terminals 1 and 2 between 7:00 a.m. and 10:00 p.m. daily.

Beep engaged the Wiki Wiki service early in the deployment process to ensure seamless integration and coordinated operations. Beep maintains continuous radio communication with the buses through our Command Center, allowing for real-time updates and immediate response to any issues, enhancing safety and efficiency.

Most of Beep’s shuttle deployments are designed to integrate into and enhance existing transport systems – reflected strongly by this project being led by the Hawai’i Department of Transportation and facilitated by Sustainability Partners.


Beep is working with the Jacksonville Transportation Authority in Florida on the Autonomous Innovation Center. Source: Beep

Beep dedicates staffers to Miki shuttle pilot

Are the shuttles fully autonomous — is there a teleoperation or manual option?

Rosa: The Miki shuttles operate autonomously along a pre-programmed route with an onboard attendant who, at any time, can take control of the shuttle if needed. These staff members are there to educate passengers about autonomous vehicles, as well as serve as an extra set of eyes for safety – in line with our work with NHTSA [National Highway Traffic Safety Administration].

Additionally, technicians in the Beep Command Center at our headquarters in Orlando’s Lake Nona are able to monitor shuttles remotely and can alert the attendant if there is a need for them to take manual control of the vehicle.

Are there dedicated staffers from Beep or HNL onsite during this pilot?

Rosa: Yes, in addition to the attendants who are on board the shuttles while they are in operation. All Beep service deployments are integrated into the communities where we operate with a mobility-as-a-service approach that’s clearly differentiated from other forms of AV implementation. This means everything from education and first-responder training to other forms of onsite support.


Beep has launched the AutonomOS service management platform for rapid deployment and fleet visibility. Source: Beep

Sustainability a goal for autonomous shuttle developers

Is Sustainability Partners providing temporary charging infrastructure? How would that work at scale? Are the Wiki Wiki shuttles electric?

Rosa: The Wiki Wiki buses, HNL’s traditional transportation system, run on gasoline, but the Miki shuttles are all electric. The Hawai’i Department of Transportation, as part of its ongoing sustainability goals, is working to transition its fleet to electric vehicles, and the Miki shuttles are helping them work toward those goals.

Sustainability Partners is helping to advance the state’s electrification mission by facilitating the development of its electric vehicle infrastructure. The Beep shuttles only require a 220v plug to use their chargers, so temporary charging infrastructure is not required. The HNL Airport has ordered 18 electric transit buses as part of its efforts to transition its vehicle fleet to electric vehicles.

What are HDOT and the airport looking for in this pilot? What are their metrics or goals?

Rosa: HDOT and its partners are using the pilot project to evaluate new ways to increase the overall efficiency and augmentation of intra-airport transportation services. This pilot project is also helping HDOT continue testing the viability of electrified mobility as a clean, affordable option to connect passengers and staff to terminals and services.

Enhancing roadway safety is always a common goal between Beep and our partners, and is at the center of everything we do as a company—it is the focal point of planning, deployment and management of our mobility services and a critical component in our partnerships and education.

Beep builds on nationwide experience

How is this project different from Beep’s other deployments across the U.S., and have any of those led to full deployments?

Rosa: This project is a very exciting first use of Beep shuttles in an airport environment, the natural and ideal setting for shared, autonomous mobility systems.

It’s also very similar to many of our other projects spanning across the U.S. Beep has been testing and deploying autonomous shuttles in diverse environments for more than five years. This has brought us unmatched experience in the industry and provides us with the data, insights, and learnings needed to continue to safely advance the use of our shuttle systems in autonomous mobility networks everywhere.

Our leadership in testing and operating autonomous shuttle networks is demonstrated by the operation of the largest and longest tenured autonomous shuttle deployment in the U.S., with five routes serving Lake Nona, Fla.’s medical campus, residential community, business park and entertainment district in the master-planned, 17-sq.-mi. community.

We have also been awarded the nation’s largest public-sector contract for the deployment of autonomous shuttles by the Jacksonville Transportation Authority in Jacksonville, Fla. Beep also operated the first and only federally procured autonomous shuttle deployment serving public passengers at Yellowstone National Park, alongside additional deployments in Arizona, Florida, North Carolina, and Georgia.

We currently have several full deployments in planned developments, college campuses, retail hubs, municipalities and more. These first-mile, last-mile mobility solutions are providing valuable transportation options for passengers, while helping to reduce traffic and congestion where they are operating.

The post Beep deploys autonomous shuttles at Honolulu airport with partners appeared first on The Robot Report.

Detroit launching robotaxi pilot for people with mobility issues https://www.therobotreport.com/robotaxi-pilot-launching-in-detroit-for-elderly-disabled-people/ https://www.therobotreport.com/robotaxi-pilot-launching-in-detroit-for-elderly-disabled-people/#respond Thu, 13 Jun 2024 19:06:35 +0000 https://www.therobotreport.com/?p=579385 May Mobility's pilot will run six days per week, and operate across 68 stops in 11 square miles of downtown Detroit. There will be human safety operators inside each vehicle.



May Mobility is launching a robotaxi pilot program in Detroit for people with disabilities and adults age 62 and older. The service will be available to select Detroit residents from June 20, 2024, through 2026.

May Mobility will deploy three robotaxis, including two wheelchair-accessible vehicles, to help participants achieve greater access to healthcare facilities, shopping centers, jobs, and social and recreational activities. The free service will operate across 68 stops in 11 square miles of downtown Detroit Monday and Wednesday through Friday from 8 a.m. to 6 p.m. and on the weekend from 8 a.m. to 1 p.m. Persons interested in riding the service must submit an interest form and will be contacted to enroll.

While all of May Mobility’s robotaxis are autonomous, they have human safety drivers behind the wheel to provide customer service and monitor vehicle operations. May Mobility told The Robot Report that these operators also provide educational support for riders who may have questions before they feel comfortable getting into an autonomous vehicle, and help riders with various accessibility needs.

“Many Detroiters have trouble getting around due to the costs of owning a car or mobility challenges arising from age or disabilities,” said Edwin Olson, CEO and co-founder of May Mobility. “We’re excited to show how autonomous technology can help in Detroit, where we will be launching our largest service area to date.”

Last July, the Detroit City Council unanimously approved a $2.4 million contract with May Mobility to provide this service. May Mobility performed extensive vehicle testing with the University of Michigan’s Mcity and the American Center for Mobility (ACM) in preparation for launch. Testing protocols included the Mcity Safety Assessment Program, made up of a Driver’s License Test and Driving Intelligence Test, and a testing and evaluation process developed by ACM that simulated genuine scenarios encountered in urban settings like Detroit.


May Mobility’s Accessibili-D autonomous vehicle

May Mobility’s autonomous vehicles use its Multi-Policy Decision Making (MPDM) technology to navigate city streets. Each vehicle is equipped with multiple lidar, radar, and camera sensors that give MPDM a 360-degree view of its surroundings. Using the data collected from the AV’s sensor suite, MPDM can simulate thousands of possible scenarios every second. As the AV detects vehicles, pedestrians, cyclists, and pets, MPDM selects the maneuver that will efficiently and safely reach its destination, even in situations it has not encountered before.
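May Mobility has not published MPDM's internals, but the general idea — enumerate candidate high-level policies, forward-simulate each one many times, and commit to the best scorer — can be sketched in a few lines. Everything here, including the policy names and scores, is invented for illustration:

```python
import random

def mpdm_select(policies, simulate, n_rollouts=100, seed=0):
    """Return the candidate policy with the best mean simulated outcome.

    Toy sketch of multi-policy decision making: score each high-level
    policy by forward-simulating the scene many times, then pick the
    best scorer. Not May Mobility's actual implementation.
    """
    rng = random.Random(seed)
    best_policy, best_score = None, float("-inf")
    for policy in policies:
        mean = sum(simulate(policy, rng) for _ in range(n_rollouts)) / n_rollouts
        if mean > best_score:
            best_policy, best_score = policy, mean
    return best_policy

def toy_simulate(policy, rng):
    # Invented outcome scores: in this hypothetical scene, yielding is safest.
    base = {"keep_lane": 0.2, "yield": 0.9, "pull_over": 0.5}[policy]
    return base + rng.uniform(-0.1, 0.1)

chosen = mpdm_select(["keep_lane", "yield", "pull_over"], toy_simulate)
```

In a real system, the simulator rolls the whole scene forward using predicted behaviors of nearby vehicles and pedestrians, and the selection loop reruns several times per second as new sensor data arrives.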

“We’re thrilled to launch the ‘Accessibili-D’ autonomous shuttle service, a vital step toward enhancing mobility for our older residents and those with disabilities. This free, innovative service will provide safe and efficient transportation, greatly improving access to essential services for residents who have faced difficulty navigating their needs in the city,” said Tim Slusser, chief of the Office of Mobility Innovation at the City of Detroit. “We are thankful for the expert collaboration of the institutions and individuals at the Michigan Mobility Collaborative and May Mobility for their invaluable partnership. Together, we’re making Detroit a more inclusive city for all.”

In December 2023, May Mobility deployed robotaxis at a retirement community in the Phoenix metro area. Two autonomous minivans were made available to a select group of early riders in Sun City, a roughly 14.4-sq.-mi. retirement community that is home to 39,931 people, according to the 2020 census. To start, May Mobility said, its minivans would cover about 4.5 miles of Sun City, bringing early riders to a variety of spots such as resident buildings, medical centers, and “other key locations.”

The pilot in Detroit is May Mobility’s 14th deployment to date. The company also currently operates in Ann Arbor, Mich.; Grand Rapids, Minn.; Miami, Fla.; Arlington, Texas and Sun City, Ariz.

The post Detroit launching robotaxi pilot for people with mobility issues appeared first on The Robot Report.

Waymo updates software after robotaxi drives into telephone pole https://www.therobotreport.com/waymo-updates-software-after-robotaxi-drives-into-telephone-pole/ https://www.therobotreport.com/waymo-updates-software-after-robotaxi-drives-into-telephone-pole/#respond Wed, 12 Jun 2024 16:49:47 +0000 https://www.therobotreport.com/?p=579393 Software update corrects an error in the software that "assigns a low damage score" to telephone poles and updates the company's map.



Waymo is issuing a voluntary software recall for all 672 of its robotaxis after one autonomously drove into a telephone pole in Phoenix, Ariz., last month. This is Waymo’s second-ever recall.

The Verge first reported the new software recall. During the incident, which took place on May 21, an empty Waymo vehicle was driving to pick up a passenger. To get there, it drove through an alley lined on both sides by wooden telephone poles that were level with the road, not up on a curb. The road had longitudinal yellow striping on both sides to indicate the path for vehicles.

As the vehicle pulled over, it struck one of the poles at a speed of 8 MPH, sustaining some damage. Waymo says no passengers or bystanders were hurt.

After completing the software update, the Mountain View, Calif.-based company filed the recall with the National Highway Traffic Safety Administration (NHTSA). Waymo says this update corrects an error in the software that “assigns a low damage score” to the telephone pole. Additionally, it updates the company’s map so its vehicles can better account for the hard road edge in the alleyway that was previously not included.

“Following an event on May 21 in Phoenix, we have chosen to file a voluntary software recall with the National Highway Traffic Safety Administration (NHTSA) to address a mapping and software issue. We have already deployed mapping and software updates across our entire fleet, and this does not impact our current operations. As we serve more riders in more cities, we will continue our safety first approach, working to earn trust with our riders, community members, regulators, and policymakers,” a Waymo spokesperson told The Robot Report.

Waymo’s engineers applied the fix at the company’s central depot, where its robotaxis regularly return for maintenance and testing. It was not an over-the-air software update.

Waymo expands operations despite scrutiny

Last February, Waymo recalled 444 vehicles after two minor collisions. The two collisions happened in December 2023, when two of Waymo’s vehicles, in two different instances, collided with a backward-facing pickup truck being improperly towed ahead of the vehicle. The pickup truck was angled across the center turn lane and traffic lane.

The company is also currently under investigation by the NHTSA for over two dozen incidents involving its driverless vehicles. These incidents include 17 crashes and five reports of possible traffic law violations, according to the NHTSA.

Despite these issues, Waymo has continued to expand its robotaxi services. Last week, the company expanded its service in Phoenix, its largest service area. Waymo added 90 square miles (233 sq. km) to what was already its largest service area in metropolitan Phoenix.

Waymo said its riders can now enjoy Waymo One service across 315 square miles (815.8 sq. km) of the Valley. The expanded service area reaches further into North Phoenix and as far as Desert Ridge. Its new service area covers more of Scottsdale’s resorts and expands to downtown Mesa. This gives riders access to desert attractions, golf courses, and downtown destinations such as the Mesa Arts Center and Pioneer Park.

The post Waymo updates software after robotaxi drives into telephone pole appeared first on The Robot Report.

Waymo expands Phoenix service area; Zoox starts testing in Austin and Miami https://www.therobotreport.com/waymo-expands-phoenix-service-area-zoox-starts-testing-austin-miami/ https://www.therobotreport.com/waymo-expands-phoenix-service-area-zoox-starts-testing-austin-miami/#comments Fri, 07 Jun 2024 13:22:22 +0000 https://www.therobotreport.com/?p=579330 Waymo plans to widen its robotaxi service area, while Zoox said it will begin testing autonomous vehicles in two more cities.

The post Waymo expands Phoenix service area; Zoox starts testing in Austin and Miami appeared first on The Robot Report.


Left, Waymo’s autonomous robotaxi, and right, Zoox’s purpose-built autonomous robotaxi. | Sources: Waymo, Zoox

It has been an exciting week for the autonomous vehicle industry. Waymo LLC announced that it will be expanding its service area in Phoenix, and Zoox Inc. said it will begin testing in Austin and Miami. Both companies have spent the year so far gradually expanding operations.

In March, Waymo launched its autonomous service in Los Angeles, and Zoox expanded its testing in Las Vegas. 

Let’s start with Waymo. The Mountain View, Calif.-based unit of Alphabet Inc. is continuing to invest in the city in which it first launched its robotaxi services. The company has added 90 square miles (233 sq. km) to what was already its largest service area in metropolitan Phoenix. 

Waymo said that its riders can now enjoy Waymo One service across 315 square miles (815.8 sq. km) of the Valley. The expanded service area reaches further into North Phoenix and as far as Desert Ridge. 

“Metro Phoenix holds a special place in Waymo’s history and our hearts,” stated Saswat Panigrahi, chief product officer at Waymo. “It’s a privilege to continue serving Phoenicians and visitors alike, and our team is excited to offer access to even more popular destinations across the Valley.”




Inside Waymo’s expanded Phoenix service area

Waymo’s new service area covers more of Scottsdale’s resorts and expands to downtown Mesa. This gives riders access to desert attractions, golf courses, and downtown destinations such as the Mesa Arts Center and Pioneer Park. 

In a first for the company, Waymo has also established a partnership with the Salt River Pima-Maricopa Indian Community (SRPMIC). Its vehicles will be able to operate on tribal land as part of the agreement. Riders can now access the Talking Stick Entertainment District, which features the Talking Stick Resort, Salt River Fields at Talking Stick, TopGolf, and more. 

“The Salt River Pima-Maricopa Indian Community is excited to announce its partnership with Waymo,” said SRPMIC President Martin Harvier. “This collaboration marks a significant step forward in our commitment to innovation and sustainable solutions.”

“Waymo’s proven track record in developing and deploying autonomous technologies gives us confidence that this initiative will revolutionize the way people travel within our city, enhancing accessibility, safety, and efficiency for residents and visitors alike,” he added.

The company has more expansions on the horizon. It recently gave Waymo employees 24/7 access to curbside terminal pickup and drop-offs at Phoenix Sky Harbor International Airport. Waymo said it intends to offer this service to public riders soon. 

Currently, Waymo riders can access terminals directly from 10:00 p.m. to 6:00 a.m. MT, or they can use the 24th St and 44th St PHX Sky Train Stations at all hours of the day. The company is also offering services in San Francisco and plans to expand in Austin, Texas, while scaling back its Waymo Via shipping across the Southwest.


A map comparing Waymo’s previous Phoenix service area with its expanded service area. | Source: Waymo

Zoox expands testing to two cities

While Zoox isn’t as far along in its deployment journey as Waymo is, the Foster City, Calif.-based company has been making steady progress. Austin and Miami will mark the fourth and fifth public testing locations for the Amazon subsidiary.

Zoox began operating in San Francisco in 2018, and then it expanded to Las Vegas in 2019 and to Seattle in 2021. 

“We’re laying the foundations for our autonomous ride-hailing service in new cities across the U.S.,” said Rone Thaniel, senior director of policy and regulatory affairs at Zoox. “Austin and Miami offer key learning opportunities that will support the continued growth and refinement of our testing and service.” 

The company said it plans to enter these two new cities with care. Zoox will first conduct a brief mapping mission. Once this is finished, it will begin deploying its retrofitted Toyota Highlander test fleet, equipped with safety drivers, in small areas near the business and entertainment districts of the two cities. This initial deployment will allow it to gather valuable insights and feedback. 

While this process may seem similar to those of other robotaxi companies, Zoox has unique plans for its eventual commercial deployments, which it said will still happen in Las Vegas and San Francisco first.

The company is developing a purpose-built robotaxi with no steering wheel, pedals, or even a driver’s seat. Zoox asserted that it is committed to only providing commercial services in these vehicles. 

Because these vehicles can never be operated by safety drivers, Zoox acknowledged that it’s even more important to do extensive testing to refine its autonomous driving system. To do this, the company has identified specific, pre-planned routes that it said offer the most challenging driving features and scenarios.

At the same time, it will be testing randomly selected point-to-point routes within a defined, geofenced area. 
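That two-track approach — fixed challenge routes plus random point-to-point runs inside a geofence — can be sketched roughly as follows. This is purely illustrative: Zoox has not published its route-selection logic, and the geofence bounds and function names here are invented for the sketch.

```python
import random

# Illustrative only: these geofence bounds are an invented box,
# not a real Zoox service area.
GEOFENCE = {"lat": (30.25, 30.29), "lon": (-97.76, -97.72)}

def random_point(fence):
    """Sample a (lat, lon) pair uniformly inside the geofence."""
    return (random.uniform(*fence["lat"]), random.uniform(*fence["lon"]))

def random_route(fence):
    """Pick random pickup and drop-off points for one point-to-point run."""
    return {"start": random_point(fence), "end": random_point(fence)}

# A test session might queue several random runs alongside the
# pre-planned challenge routes.
routes = [random_route(GEOFENCE) for _ in range(5)]
for r in routes:
    assert GEOFENCE["lat"][0] <= r["start"][0] <= GEOFENCE["lat"][1]
```

The random sampling exercises ordinary conditions across the whole area, while the hand-picked routes concentrate testing on the hard cases the article lists.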

Zoox noted that Austin and Miami both offer unique opportunities and valuable challenges that will help it to refine its driving. Austin has horizontal traffic lights, traffic lights hanging on wires, railway crossings, and strong thunderstorms.

Miami, on the other hand, has traffic lights that are suspended diagonally across intersections, said Zoox. 

As before, the company will start with focused testing areas in each city and then methodically expand as its AI becomes more familiar with local driving conditions. Zoox added that it is working closely with local officials, regulators, and residents to ensure the safe and seamless integration of its autonomous vehicles into these cities.

Foresight to collaborate with KONEC on autonomous vehicle concept https://www.therobotreport.com/foresight-collaborates-with-konec-autonomous-vehicle-concept/ https://www.therobotreport.com/foresight-collaborates-with-konec-autonomous-vehicle-concept/#respond Mon, 03 Jun 2024 12:30:02 +0000 https://www.therobotreport.com/?p=579242 Foresight will integrate its ScaleCam 3D perception technology with KONEC into a conceptual autonomous driving vehicle. 

The post Foresight to collaborate with KONEC on autonomous vehicle concept appeared first on The Robot Report.

Two Foresight branded cameras on top of a white car.

Foresight says its ScaleCam system can generate high-quality depth maps. | Source: Foresight

Foresight Autonomous Holdings Ltd. last week announced that it has signed a co-development agreement with KONEC Co., a Korean Tier 1 automotive supplier. Under the agreement, the companies will integrate Foresight’s ScaleCam 3D perception technology into a concept autonomous vehicle. 

The collaboration is sponsored by the Foundation of Korea Automotive Parts Industry Promotion (KAP), founded by Hyundai Motor Group. The partners said they will combine KONEC’s expertise in developing advanced automotive systems with KAP’s mission to foster innovation within the automobile parts industry. 

“We believe that the collaboration with KONEC represents a significant step forward in the development of next-generation autonomous driving solutions,” stated Haim Siboni, CEO of Foresight. “By combining our resources, image-processing expertise, and innovative technologies, we aim to accelerate the development and deployment of autonomous vehicles, ultimately contributing to safer transportation solutions in the Republic of Korea.” 

Foresight is an innovator in automotive vision systems. The Ness Ziona, Israel-based company is developing smart multi-spectral vision software systems and cellular-based applications. Through its subsidiaries, Foresight Automotive Ltd., Foresight Changzhou Automotive Ltd., and Eye-Net Mobile Ltd., it develops both in-line-of-sight vision systems and beyond-line-of-sight accident-prevention systems. 

KONEC has established a batch production system for lightweight metal raw materials, models, castings, processing, and assembly through cooperation among its group affiliates. The Seosan-si, South Korea-based company’s major customers include Tesla, Hyundai Motor, and Kia.

KONEC has also entered the field of camera-based information processing, developing a license-plate recognition system in partnership with companies that have commercialized systems on chips (SoCs) and Internet of Things (IoT) communication modules. 




Foresight ScaleCam to enhance autonomous capabilities 

The collaboration will incorporate Foresight’s ScaleCam 360º 3D perception technology, which the company said will enable the self-driving vehicle to accurately perceive its surroundings. Foresight and KONEC said the successful integration of ScaleCam could significantly enhance the capabilities and safety of autonomous vehicles. 

ScaleCam is based on stereoscopic technology. The system uses advanced and proven image-processing algorithms, according to Foresight. The company claimed that it provides seamless vision by using two visible-light cameras for highly accurate and reliable obstacle-detection capabilities. 

Typical stereoscopic vision systems require constant calibration to ensure accurate distance measurements, Foresight noted. To solve this, some developers mount stereo cameras on a fixed beam, but this can limit camera placement positions and lead to technical issues, it said.

Foresight asserted that its technology allows for the independent placement of both visible-light and thermal infrared camera modules. The system can therefore support large baselines without mechanical constraints, providing greater distance accuracy at long ranges, it said. 
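The baseline claim follows from standard stereo geometry: for a rectified camera pair, depth is Z = f·B/d (focal length in pixels, baseline, disparity), so a one-pixel disparity error corrupts the depth estimate far less when the baseline B is large. A minimal sketch with illustrative numbers (these are not Foresight’s specifications):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# How far a 1-pixel disparity error shifts the depth estimate for a
# target at 100 m, comparing a short and a long baseline.
f = 1000.0  # focal length in pixels (illustrative)
for baseline_m in (0.3, 1.5):
    true_disparity = f * baseline_m / 100.0          # disparity of a 100 m target
    err = stereo_depth(f, baseline_m, true_disparity - 1.0) - 100.0
    print(f"baseline {baseline_m} m: ~{err:.1f} m error per pixel")
```

With a 0.3 m baseline the same one-pixel error produces a far larger depth error than with a 1.5 m baseline, which is why decoupling the cameras from a fixed beam to allow wider spacing helps at long range.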

Volvo launches VNL Autonomous truck equipped with Aurora software https://www.therobotreport.com/volvo-launches-vnl-autonomous-truck-equipped-with-aurora-software/ https://www.therobotreport.com/volvo-launches-vnl-autonomous-truck-equipped-with-aurora-software/#comments Thu, 23 May 2024 16:53:36 +0000 https://www.therobotreport.com/?p=579161 Volvo says its autonomous truck is the result of years of research into self-driving vehicle technology and intentional design.

The post Volvo launches VNL Autonomous truck equipped with Aurora software appeared first on The Robot Report.

The Volvo VNL Autonomous truck brings together Volvo's commercial vehicle expertise with autonomous driving technology from Aurora. | Source: Volvo Autonomous Solutions.

The Volvo VNL truck includes features for safety, integration, and commercial scale. | Source: Volvo Autonomous Solutions

Volvo Autonomous Solutions this week unveiled its first-ever production-ready autonomous truck. The company debuted the vehicle at the ACT Expo in Las Vegas. The VNL Autonomous truck brings together Volvo’s commercial vehicle expertise with autonomous driving technology from Aurora Innovation Inc., it said.

“We are at the forefront of a new way to transport goods, complementing and enhancing transportation capacity, and thereby enabling trade and societal growth,” stated Nils Jaeger, president of Volvo Autonomous Solutions (VAS). “This truck is the first of our standardized global autonomous technology platform, which will enable us to introduce additional models in the future, bringing autonomy to all Volvo Group truck brands, and to other geographies and use cases.”

Volvo added that the truck is the result of years of research and development into autonomous vehicle technology. The Gothenburg, Sweden-based automaker said the platform-based design approach will enable it to do two things.

The first is to use its virtual driver, developed in-house, for trucks and machines working within confined applications. The second is to partner with providers of virtual driver technology for on-highway trucking applications. 

Chris Urmson, Sterling Anderson, and Drew Bagnell founded Aurora in 2017. Its Aurora Driver is a self-driving system with a common core of hardware and software. The Pittsburgh-based company designed it to adapt to a broad set of vehicle types, from a four-door sedan to a Class 8 semi truck. 




Volvo prioritizes safety and integration

Founded in 2020, Volvo Autonomous Solutions said it is addressing the transportation industry’s capacity constraints with safe, sustainable, and efficient autonomous technology. The unit of the Volvo Group offers customers tailored transport as a service (TaaS) with autonomous vehicles, required infrastructure, operational and uptime support, and cloud-based management of logistics flows.

With the Volvo VNL Autonomous truck, VAS said it made every design and engineering decision with safety in mind. This is why the system has redundant steering, braking, communication, computation, power management, energy storage, and vehicle motion management systems. 

In addition to safety, Volvo prioritized integrating the Aurora Driver, an SAE Level 4 system, into the Volvo VNL Autonomous truck. Aurora said its driver is made up of AI software, dual computers, proprietary lidar that can detect objects more than 400 m (1,312.3 ft.) away, high-resolution cameras, imaging radar, and additional sensors. These technologies enable the Volvo VNL Autonomous to safely navigate the world around it. 

“Powered by the Aurora Driver, the new Volvo VNL Autonomous is the realization of our shared vision,” said Sterling Anderson, co-founder and chief product officer at Aurora. “This truck combines Aurora’s industry-leading self-driving technology with Volvo’s best-in-class truck, designed specifically for autonomy, making it a must-have for any transport provider that wants to strengthen and grow their business.”

The Aurora Driver has been trained and tested in Aurora’s virtual suite, where it has driven billions of miles, claimed the company. It also has 1.5 million commercial miles on public roads, where it navigates end-to-end trucking routes. It has driven on highways, rural roadways, and surface streets day and night and in good and bad weather, Aurora said. 

Looking to the road ahead

The Volvo VNL Autonomous truck will be assembled at Volvo’s flagship New River Valley (NRV) plant in Dublin, Va. The NRV plant is the largest Volvo Trucks plant in the world. Volvo said its high-volume production experience, combined with stringent autonomous quality processes, will enable it to produce autonomous trucks to meet industry demand. 

In 2023, the U.S. was short 80,000 truck drivers, according to the American Journal of Transportation. This problem will only increase, as the study showed that by 2030, the U.S. will be short 160,000 drivers. Volvo said its autonomous trucks can address this challenge. 

“The Volvo VNL Autonomous, powered by the Aurora Driver, offers a fully integrated autonomous solution in the hub-to-hub segment,” said Sasko Cuklev, head of on-road solutions at Volvo Autonomous Solutions. “Our approach reduces complexity for our customers while allowing them to experience the benefits of an autonomous solution with peace of mind by ensuring efficiency, safety, and reliability.”

Stratom develops airworthy material handling system for USSOCOM https://www.therobotreport.com/stratom-develops-airworthy-material-handling-system-for-ussocom/ https://www.therobotreport.com/stratom-develops-airworthy-material-handling-system-for-ussocom/#respond Wed, 22 May 2024 00:28:28 +0000 https://www.therobotreport.com/?p=579135 Stratom says its system will be able to lift, load, unload, and transport palletized 463L tactile cargo across operating environments.

The post Stratom develops airworthy material handling system for USSOCOM appeared first on The Robot Report.

A military-style vehicle fully loaded going into an aircraft.

Stratom creates unmanned ground systems that move material in both controlled settings and expeditionary environments. | Source: Stratom

Stratom Inc. today announced that it is developing a “unique, small-form-factor material handling system.” It said its latest robot will be able to lift, load, unload, and transport palletized 463L tactical cargo across various operating environments and terrains.

The Louisville, Colo.-based company is participating in a Small Business Innovation Research (SBIR) Phase I project with the U.S. Special Operations Command (USSOCOM) and SOFWERX Inc.

The SBIR program funds a diverse portfolio of startups and small businesses across technology areas and markets. Its aim is to stimulate technological innovation, meet federal research and development needs, and increase commercialization. 

“Building upon our prior expertise and knowledge in expeditionary robotic material handling, we’re helping define the appropriate tradeoffs of size, weight, capability and price to ensure our design is a strong fit for this application,” stated Jesse Weifenbach, lead vehicle systems engineer at Stratom.

“Integrating multiple proven technologies into a simplified, refined solution that streamlines USSOCOM and Air Force cargo operations and enhances warfighter safety, our innovative material-handling system will be lighter weight than traditional equipment, much quicker and efficient to deploy, and safer when unloading in undeveloped locations,” he added. “These unique capabilities enhance cargo operations while reducing fuel waste and minimizing cycle times for military personnel.”

“We have real-life integration experience, expanding from military deployments of rapid refueling and offroad autonomy,” Zachary Savit, senior manager for business development at Stratom, told The Robot Report.




New vehicle to be compact but powerful

Stratom is building a compact, remotely operated vehicle that will weigh less than 10,000 lb. (4,535.9 kg) and be airworthy. The company said the system will occupy just one pallet position while performing all cargo loading and unloading tasks for full-size, fully loaded 463L pallets in a variety of challenging conditions and locations. 

“Without proper material handling equipment, alternate methods of combat cargo offloading consume significant time and are very labor-intensive while posing serious risks to personnel, materials, and the aircraft,” explained Mark Gordon, president and CEO of Stratom. “Plus, the current equipment used can weigh up to three times its payload capacity and occupies precious space within an aircraft.”

“The military is transitioning toward more agile combat employment in expeditionary environments and across large geographical footprints,” he said. “Consequently, the need for inexpensive, lightweight and flight-ready material-handling equipment that is immediately operational upon landing is required as assets are distributed across a greater range of isolated locations without sufficient infrastructure.”

Stratom builds on prior experience

Stratom has developed several autonomous ground vehicles and robotic systems for logistics and operational applications. Most recently, the company updated its Autonomous Pallet Loader (APL) in support of the U.S. Marine Corps. 

The APL is a flexible, autonomous forklift that Stratom designed to transport heavy and bulky cargo across various terrains for both military and commercial use. The company said it is focused on maturing the integration of its Summit off-road autonomy architecture onto the APL.

Summit is a highly modular platform that enables customization for various container-loading applications. Stratom said this includes full automation of cargo operations for military aircraft.

Stratom has also designed the eXpeditionary Robotic Platform (XRP). This is an autonomous vehicle capable of driving in and out of an MV-22 Osprey aircraft while carrying more than 2,400 lb. (1,088.6 kg) of supplies. 

On the other hand, Stratom’s eXpeditionary Robotic-Field Artillery Autonomous Resupply (XR-FAAR) can deliver ammunition and supplies to weapons systems. 

Finally, SALT is Stratom’s Small Agile Lift Truck. The company said it aims to streamline Air Force aerial stores and munition-loading operations with the system, increasing safety and enhancing adaptability.

“We’ve gotten a lot of feedback from recent trade shows, and we don’t yet know what other problems our products can solve,” said Savit. “We’re also looking at how our vehicles might fit in closed fleets, such as in mines.” 

Hyundai picks up some Aptiv shares of Motional for $448M https://www.therobotreport.com/hyundai-picks-up-some-aptiv-shares-of-motional-shares-for-448m/ https://www.therobotreport.com/hyundai-picks-up-some-aptiv-shares-of-motional-shares-for-448m/#respond Fri, 17 May 2024 18:22:29 +0000 https://www.therobotreport.com/?p=579082 As Aptiv pulls back its support in Motional, Hyundai has invested nearly $1 billion in the driverless vehicle developer this year.

The post Hyundai picks up some Aptiv shares of Motional for $448M appeared first on The Robot Report.

A Motional robotaxi in front of a building with columns and palm trees in Las Vegas.

Motional is shifting from deploying robotaxis to developing and generalizing its driverless technology. | Source: Motional

Aptiv PLC and Hyundai Motor Group have completed their ownership restructuring transactions for Motional AD LLC. Earlier this year, Aptiv said it would stop funding Motional after incurring millions of dollars in losses.

Aptiv had forecast a non-cash equity loss of about $340 million in 2024. Now, Aptiv has sold an 11% common equity interest in the autonomous vehicle developer to Hyundai for about $448 million of cash consideration.

The mobility software company also exchanged 21% of its common equity in Motional for a like amount of Motional preferred shares.

Hyundai doubles down on driverless development

Despite Aptiv’s pullback, Hyundai has doubled down on its investment. This week’s news came just weeks after Hyundai announced a $475 million funding round for Motional. Combined with that funding, these transactions have reduced Aptiv’s common equity interest in Motional from 50% to just 15%. 
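The reported figures can be sanity-checked with simple arithmetic: selling an 11% common stake for about $448 million implies a valuation of roughly $4 billion on Motional’s common equity, and 50% minus the 11% sold and the 21% exchanged leaves 18% of common. The gap to the reported 15% is not explained in the article; attributing it to dilution from Hyundai’s separate investment is an assumption.

```python
# Back-of-the-envelope check on the reported figures. All inputs come from
# the article; the dilution explanation at the end is an assumption.
stake_sold = 0.11                        # common equity sold to Hyundai
cash = 448_000_000                       # cash consideration
implied_valuation = cash / stake_sold    # valuation implied by the sale
print(f"implied common-equity valuation: ${implied_valuation / 1e9:.2f}B")

remaining_common = 0.50 - 0.11 - 0.21    # after the sale and the share exchange
print(f"Aptiv common stake before any dilution: {remaining_common:.0%}")
# The article reports 15%; the remaining gap plausibly reflects dilution
# from Hyundai's separate $475M funding round.
```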

Boston-based Motional said its latest investment demonstrates Hyundai’s belief in the strategic importance of autonomy technology and its confidence in the company’s ability to capitalize on the market opportunity. 

Aptiv is a leading automotive parts supplier. It launched Motional with Hyundai in 2020 as a joint venture initially worth $4 billion.

Since its inception, Motional has introduced its all-electric IONIQ 5 robotaxi with a fully integrated Level 4 autonomous vehicle (AV) system. The company has established partnerships with Uber and Lyft to deploy its AVs in their networks. 

To date, Motional said it has completed more than 100,000 autonomous rides in Las Vegas and thousands of autonomous food deliveries in Los Angeles. 




Motional steers away from mass robotaxi deployment

Motional said its mission is to make driverless vehicles a safe, reliable, and accessible reality. While Hyundai’s latest investment will help move this mission forward, the company acknowledged that it is pumping the brakes on new driverless deployments. 

Driverless vehicles will enter the market when the technology behind them has evolved, and when the business case for autonomous development is clear, it said.

“While we’re excited by our pace of technical progress, and our initial commercial deployments have yielded valuable insights, large-scale deployment of AVs remains a goal for the future, not the present,” Karl Iagnemma, president and CEO of Motional, wrote in a blog post.

The company updated its strategic plan in collaboration with its shareholders. It plans to focus its resources on the continued development and generalization of its core driverless technology. At the same time, it’s de-emphasizing near-term commercial deployments and ancillary activities. 

The company said this updated strategy required streamlining its teams, resulting in a reduction in staff across the business. 

“Having spent my entire career in developing robotic and autonomous technology, I remain convinced that AVs will have a transformative, positive impact on the way we move, with the capability to dramatically improve roadway safety, reduce emissions, and enhance the overall experience of personal transportation,” said Iagnemma. “I’m confident in the road ahead and Motional’s ability to achieve its mission of making driverless vehicles a reality.” 
