Uber Launches AV Labs to Fuel Autonomous Vehicle Partnerships with Real-World Driving Data

Uber Technologies Inc. has unveiled a new division called Uber AV Labs, aimed at accelerating the development of autonomous vehicles (AVs) by collecting and sharing vast amounts of real-world driving data with industry partners.

Announced on January 27, 2026, the initiative marks a strategic pivot for the ride-hailing giant, which is not re-entering the robotaxi manufacturing space but instead leveraging its operational scale to address a critical bottleneck in AV advancement: access to diverse, high-volume training data.

The move comes amid a broader industry shift from rules-based AV systems to those reliant on reinforcement learning and machine learning models, where exposure to rare “edge cases”—unpredictable real-world scenarios—proves essential for safety and reliability.


Uber’s chief technology officer, Praveen Neppalli Naga, emphasized in an interview with TechCrunch that the value of advancing partners’ AV technology outweighs immediate monetization, stating, “Our goal, primarily, is to democratize this data… the value of this data and having partners’ AV tech advancing is far bigger than the money we can make from this.”

For now, Uber plans to provide the data free of charge, focusing first on building a robust foundation before exploring commercial models.

Uber AV Labs begins modestly with a single Hyundai Ioniq 5 vehicle equipped with sensors, including lidars, radars, and cameras, though the company is not committed to a specific model.

VP of Engineering Danny Guo described the early-stage setup as “scrappy,” noting that the team is still physically installing hardware and testing durability.

The division, which Uber expects to grow to a few hundred employees within a year, will deploy these sensor-laden cars in select cities to capture driving data, starting with targeted collections based on partner needs.

With operations in over 600 cities globally, Uber can flexibly prioritize locations of interest, such as those with unique traffic patterns or environmental challenges.

Partners like Alphabet’s Waymo, Waabi, Lucid Motors, and more than 20 others stand to benefit, though no formal contracts have been signed yet.

These companies, many already amassing their own datasets, recognize that scaling beyond fleet size limitations requires broader access to real-road scenarios—something simulations alone cannot fully replicate.

For instance, Waymo’s decade-long operations have not prevented incidents like robotaxis illegally passing stopped school buses, highlighting the need for more comprehensive data to preempt edge cases.

Data will not be shared raw; Uber plans to process it with a “semantic understanding” layer tailored to partners’ needs, aiding real-time path planning and decision-making.

An intermediate “shadow mode” step will integrate partners’ software into Uber’s vehicles, flagging discrepancies between human drivers and AV systems to refine models and promote more human-like driving behaviors.
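The shadow-mode idea described above can be sketched in a few lines: a partner's planning model runs passively alongside the human driver, and any frame where the planned action diverges from what the human actually did is flagged for later model refinement. This is a minimal illustration only; the function names, signals, and tolerances below are assumptions for the sketch, not Uber's or any partner's actual API.

```python
# Hypothetical sketch of "shadow mode" discrepancy flagging: a partner's
# planner runs passively alongside a human driver, and frames where the
# planned action diverges from the human's are logged for training.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float
    human_steering: float   # degrees, e.g. read from the vehicle's CAN bus
    human_speed: float      # m/s

def shadow_planner(frame: Frame) -> tuple[float, float]:
    """Stand-in for a partner's planning model: returns (steering, speed)."""
    return frame.human_steering * 0.9, frame.human_speed  # dummy output

def flag_discrepancies(frames, steer_tol=5.0, speed_tol=2.0):
    """Return frames where the shadow planner diverges from the human driver."""
    flagged = []
    for f in frames:
        steer, speed = shadow_planner(f)
        if (abs(steer - f.human_steering) > steer_tol
                or abs(speed - f.human_speed) > speed_tol):
            flagged.append(f)
    return flagged

frames = [
    Frame(0.0, 2.0, 10.0),    # gentle lane keeping: planner agrees
    Frame(0.1, 80.0, 10.0),   # sharp evasive turn: planner diverges
]
print(len(flag_discrepancies(frames)))  # → 1
```

In practice, flagged frames like the sharp evasive turn above are exactly the rare edge cases the article says AV developers lack, which is why discrepancy mining against human drivers is valuable training signal.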

This mirrors Tesla’s data collection strategy, which harnesses millions of customer vehicles, but Uber’s approach emphasizes precision over sheer volume, drawing from its ride-hailing expertise.

Uber’s history with AVs informs this cautious relaunch. After a fatal 2018 pedestrian incident involving one of its test vehicles in Tempe, Arizona, the company halted operations and sold its Advanced Technologies Group (ATG) to Aurora in a 2020 deal valued at around $4 billion, including Uber’s $400 million investment in Aurora.

Now, AV Labs focuses solely on data facilitation, aligning with Uber’s broader ecosystem role as a mobility platform rather than a hardware developer. Privacy considerations are addressed through a dedicated Road Data Collection Privacy Hub, where Uber commits to blurring faces and license plates in footage and sharing data only with vetted AV partners for safety advancements.

The division is actively hiring experts in data, machine learning, computer vision, and infrastructure to build capabilities in data mining, simulation, validation, and system improvements across perception, prediction, and planning.

This initiative complements Uber’s ongoing AV partnerships, including collaborations with NVIDIA for a data factory to support global fleet scaling starting in 2027, targeting up to 100,000 vehicles; Lucid and Nuro for a next-generation robotaxi program with 20,000 Lucid Gravity SUVs over six years; and others like May Mobility, Volkswagen, and Avride for robotaxi, delivery, and truck applications.

Guo underscored Uber’s unique position, saying: “If we don’t do this, we really don’t believe anybody else can… we believe we have to take on this responsibility right now.”

As AV Labs ramps up, partners' plea to "give us anything that will be helpful" reflects the data hunger driving the sector, where Uber's vast network could prove transformative in closing the gap between simulation and reality.
