OpenAI Quietly Revives Robotics Ambitions with Secret Lab Focused on Humanoid Development

In the shadow of its blockbuster language models, OpenAI is quietly charting a course toward physical intelligence, reviving its robotics program with a secretive lab that could bridge the gap between digital AI and tangible, human-like machines, according to people with knowledge of the matter who spoke to Business Insider.

This resurgence, emerging after a five-year hiatus, positions the company at the forefront of embodied AI—a field where software brains meet hardware bodies to navigate and manipulate the real world. Insiders describe the effort as a foundational step toward artificial general intelligence (AGI) that operates beyond screens, potentially transforming everything from household chores to industrial labor.

The initiative represents a strategic pivot for OpenAI, which disbanded its original robotics team in 2020 to concentrate on generative AI breakthroughs like ChatGPT. That early project, unveiled in 2019, featured a robotic hand trained via reinforcement learning to solve a Rubik’s Cube one-handed—a feat that demonstrated dexterity but highlighted data and compute limitations.

“We chose to refocus the team on other projects,” a spokesperson said at the time, citing challenges in scaling physical AI.

Fast-forward to 2025, and the company has reassembled a dedicated robotics unit, hiring over a dozen engineers specializing in humanoid systems and filing trademarks for “user-programmable humanoid robots” with communication and learning capabilities.

Launched in February 2025, the San Francisco lab—co-located with the finance team—has expanded rapidly, now employing around 100 data collectors working in three shifts across dozens of workstations. The core work involves teleoperating Franka robotic arms using affordable, 3D-printed GELLO controllers, a technology inspired by a 2023 UC Berkeley study on scalable teleoperation.

These controllers mimic the robot’s kinematics, allowing human operators to demonstrate tasks with precision while cameras capture both sides for training data. Progress has accelerated: Initial exercises involved simple actions like placing a rubber duck in a cup, evolving to household duties such as toasting bread or folding laundry.
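Because a GELLO-style leader device shares the follower robot’s kinematics, the demonstration pipeline can be surprisingly simple: joint angles read from the operator’s controller are copied straight onto the robot arm, while cameras log synchronized observations for imitation learning. The sketch below illustrates that loop in broad strokes; the hardware functions (`read_leader_joints`, `capture_frame`) are hypothetical stubs standing in for real drivers, not OpenAI’s or GELLO’s actual code.

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    """One teleoperated demonstration: paired observations and action labels."""
    joint_targets: list = field(default_factory=list)
    camera_frames: list = field(default_factory=list)

def read_leader_joints(step):
    """Hypothetical stub for the leader arm's encoders.

    A real driver would read joint positions over serial/USB; here we
    return a synthetic 7-joint pose so the sketch runs standalone.
    """
    return [0.1 * step] * 7

def capture_frame(step):
    """Hypothetical stub camera; a real setup would grab RGB frames."""
    return f"frame_{step}"

def collect_demo(num_steps=50):
    """Mirror leader-arm joints onto the follower and log training pairs.

    Matched kinematics mean joint angles map one-to-one -- no inverse
    kinematics is needed, which is the key appeal of this approach.
    """
    ep = Episode()
    for step in range(num_steps):
        q = read_leader_joints(step)                  # operator's demonstrated pose
        ep.joint_targets.append(q)                    # action label for imitation learning
        ep.camera_frames.append(capture_frame(step))  # visual observation
    return ep

demo = collect_demo(10)
print(len(demo.joint_targets), len(demo.camera_frames))
```

The resulting episodes—camera frames paired with joint targets—are exactly the kind of demonstration data a behavior-cloning policy would train on.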

Performance metrics emphasize “good hours” of functional data, with collection rates doubling in recent months amid calls for greater efficiency. A humanoid robot prototype, described as “iRobot-like,” is on display but largely inactive, underscoring the lab’s arm-focused strategy over full-body integration, at least for now.

Plans include a second facility in Richmond, California, with job postings for robotics operators already circulating. This data-centric approach addresses a perennial robotics bottleneck: acquiring vast, high-quality datasets for training.

“Everyone is fighting for a way to develop large data sets,” said Jonathan Aitken, a robotics expert at the University of Sheffield.

GELLO’s low-cost design offers advantages over motion-capture suits or VR systems used by rivals, enabling more direct human-to-robot motion translation. One Berkeley researcher from the GELLO study joined OpenAI in August 2024 to contribute to “Building the Robot Brain.”

OpenAI’s hardware ambitions extend beyond the lab. Last week, the company issued a Request for Proposals (RFP) seeking U.S.-based manufacturers for consumer devices, robotics components like motors and actuators, and cloud infrastructure—aiming to foster domestic supply chains amid geopolitical tensions.

The RFP, open through June 2026, aligns with broader efforts to scale production, though timelines and budgets remain undisclosed.

The revival draws on OpenAI’s investments in external ventures. Partnerships include 1X Technologies (backed since 2023), which develops home-focused humanoids like EVE and NEO, with preorders open for 2026 shipments. A 2024 collaboration with Figure AI—to integrate AI models into humanoids—ended in February 2025 as Figure advanced in-house capabilities, including pilots at BMW plants.

OpenAI also supports Physical Intelligence, focusing on versatile manipulation software. CEO Sam Altman’s vision frames this as inevitable: Last year, he predicted the “humanoid robots moment” was approaching, emphasizing AI’s need for physical embodiment to achieve AGI.

Internal discussions, per reports, explore humanoid development as a path to “AGI-level intelligence in dynamic, real-world settings.” Job listings seek experts in sensor suites, actuators, and large-scale manufacturing, hinting at ambitions beyond research.

Yet OpenAI faces stiff competition in a booming sector. Tesla’s Optimus, with 50-actuator hands and 2026 production targets, leads in dexterity demos. Figure’s 02 model, backed by $700 million from Microsoft and Nvidia, plans 5,000 units in 2025, scaling to tens of thousands by 2026. Chinese firms like Unitree (R1 humanoid) and EngineAI showcase acrobatic prototypes, while Agility’s Digit operates in warehouses.

However, market projections are staggering: Morgan Stanley forecasts 1 billion humanoids by 2050, generating a $5 trillion market, with 302 million in China alone, while Bank of America anticipates 10 million annual shipments by 2035.

But there are challenges: Oregon State’s Alan Fern notes that scaling arm data to full humanoids is “something that hasn’t been proven out yet.” Safety, ethics, and job displacement loom large, with experts warning of workforce disruptions.

OpenAI’s integration of its language models with physical hardware—potentially enabling robots to interpret commands and learn from interactions—is expected to blur the line between virtual and real. With pilots in homes and factories accelerating, 2026 could mark the dawn of widespread embodied AI, driven by OpenAI’s methodical resurgence.
