Space-compute startup Starcloud has vaulted into unicorn territory, securing a $1.1 billion valuation in one of the fastest climbs from Y Combinator demo day to billion-dollar status.
The company’s Series A round, led by Benchmark and EQT Ventures, raised $170 million and brought total funding to $200 million, according to the company and investors. The fundraise comes just 17 months after its YC debut, underscoring the surging investor appetite for infrastructure plays tied to the artificial intelligence boom.
The enthusiasm rests on a bold thesis: move power-hungry AI data centers off Earth and into orbit.
As hyperscalers race to build capacity for generative AI workloads, terrestrial data center expansion is increasingly constrained by power shortages, land scarcity, water use concerns, and regulatory bottlenecks. Starcloud is pitching orbit as a solution, where near-continuous solar power and the vacuum of space could, in theory, transform the economics of compute.
The company has already moved beyond concept.
In November 2025, Starcloud launched its first satellite carrying an Nvidia H100 GPU, becoming one of the first companies to deploy a state-of-the-art terrestrial AI chip in orbit. The satellite was used to perform inference and early AI model training tasks in space, a milestone that helped validate the technical premise behind orbital computing.
“An H100 is probably not the best chip for space, to be honest, but the reason we did it is we wanted to prove that we could run state of the art terrestrial chips in space,” Philip Johnston, Starcloud’s chief executive, told TechCrunch.
Later this year, Starcloud plans to launch a second, more advanced spacecraft equipped with multiple GPUs, including an Nvidia Blackwell chip, an Amazon Web Services server blade, and even a bitcoin-mining computer.
That second mission is designed less as a demonstration and more as an engineering testbed, particularly for thermal management and power systems. Cooling remains one of the most difficult problems in orbital computing because high-performance chips generate significant heat and cannot rely on conventional air-based cooling systems.
Johnston says the next-generation spacecraft will carry what is expected to be the largest deployable radiator yet flown on a privately owned satellite, a critical step in making space-based computing viable at scale.
The longer-term ambition is far larger.
Starcloud is developing Starcloud 3, a three-ton, 200-kilowatt orbital data center spacecraft intended for deployment via SpaceX’s Starship system. The design is meant to fit the launch company’s “pez dispenser” deployment architecture originally built for Starlink satellites.
If launch costs fall to roughly $500 per kilogram, Johnston believes the platform could deliver electricity costs near five cents per kilowatt-hour, placing it in direct competition with land-based data centers.
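The arithmetic behind that claim can be sanity-checked with a rough amortization of launch cost over lifetime energy output. The sketch below is illustrative only: the specific power (watts of solar generation per kilogram launched), the ten-year operating life, and the sunlight fraction are our assumptions, not Starcloud figures, and the model ignores hardware, ground-link, and operations costs entirely.

```python
# Back-of-envelope check of the orbital electricity cost claim.
# Only the $500/kg launch target comes from the article; every
# other input is an illustrative assumption.

LAUNCH_COST_PER_KG = 500.0       # USD/kg, the Starship target cited above
SPECIFIC_POWER_W_PER_KG = 100.0  # W of solar power per kg launched (assumed)
LIFETIME_YEARS = 10.0            # assumed spacecraft operating life
SUNLIGHT_FRACTION = 0.99         # near-continuous sun in orbit (assumed)

# Cost to loft one watt of generating capacity
launch_cost_per_watt = LAUNCH_COST_PER_KG / SPECIFIC_POWER_W_PER_KG

# Energy one watt of capacity produces over the spacecraft's life
lifetime_hours = LIFETIME_YEARS * 365.25 * 24 * SUNLIGHT_FRACTION
kwh_per_watt = lifetime_hours / 1000.0

cost_per_kwh = launch_cost_per_watt / kwh_per_watt
print(f"amortized launch cost: {cost_per_kwh * 100:.1f} cents/kWh")
```

Under these assumptions the result lands near six cents per kilowatt-hour, in the same range as the figure Johnston cites, which shows how sensitive the pitch is to both the $500/kg launch price and the assumed watts-per-kilogram of the spacecraft.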
That assumption, however, rests heavily on Starship becoming commercially operational by 2028 or 2029. This is where the investment case becomes more speculative.
Starship has yet to begin routine commercial flights, and many analysts believe the high-frequency launch cadence required to make orbital data centers economically viable may not emerge until the 2030s. Until then, the cost of lofting powerful compute hardware into orbit remains a significant barrier.
Johnston acknowledges as much, saying the company will continue deploying smaller systems on SpaceX’s Falcon 9 if Starship timelines slip.
“If it ends up being delayed, we’ll just carry on launching the smaller versions on Falcon 9,” Johnston said. “We’re not going to be competitive on energy costs until Starship is flying frequently.”
The economics also highlight how early this market remains. While Starcloud’s ambitions include an 88,000-satellite compute constellation, the entire global installed base of advanced GPUs in orbit is still measured in the dozens. By contrast, Nvidia is estimated to have shipped nearly four million advanced GPUs to terrestrial hyperscalers in 2025 alone.
The gap is even starker in power terms.
SpaceX’s Starlink constellation, currently the world’s largest satellite network with roughly 10,000 spacecraft, is estimated to generate around 200 megawatts of power. On Earth, more than 25 gigawatts of data-center capacity are under construction in the United States alone.
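Putting those two figures side by side makes the gap concrete. The numbers below are the article's own estimates:

```python
# The power-scale gap between orbit and the ground, using the
# article's estimates.
starlink_orbit_mw = 200.0          # est. total Starlink generation, megawatts
us_dc_under_construction_gw = 25.0 # US data-center capacity being built, gigawatts

gap = (us_dc_under_construction_gw * 1000) / starlink_orbit_mw
print(f"US terrestrial build-out is ~{gap:.0f}x the largest orbital power base")
```

In other words, the data-center capacity under construction in the US alone is roughly 125 times the estimated generating capacity of the entire Starlink fleet.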
This makes Starcloud less a direct competitor to terrestrial hyperscalers today and more a strategic infrastructure bet on where AI computing may go next.
Its near-term business model reflects that reality.
Rather than immediately replacing ground-based cloud services, the company is focused first on selling processing power to other spacecraft operators. One example already in use is the processing of Earth observation data from Capella Space’s radar satellites.
In the longer term, Starcloud hopes to position itself as an energy and compute infrastructure provider to hyperscalers seeking overflow or distributed AI workloads.
Competition is intensifying.
Alongside Starcloud, companies such as Aethero, Aetherflux, and Google’s Project Suncatcher are exploring adjacent orbital infrastructure models. Meanwhile, SpaceX itself has reportedly sought regulatory approval for a million-satellite distributed compute network, potentially making it the most formidable rival in the field.
That looming presence is the elephant in the room.
Still, Johnston argues the two companies are addressing different markets, with SpaceX likely prioritizing internal workloads tied to xAI’s Grok and Tesla systems, while Starcloud positions itself as an independent infrastructure player.
“They are building for a slightly different use case than us,” he told TechCrunch. “They’re mainly planning on serving Grok and Tesla workloads. It may be at some point that they offer a third party cloud service, but what I think they are unlikely to do is what we’re doing [as] an energy and infrastructure player.”
Investors are betting that if launch costs collapse and orbital compute becomes technically scalable, Starcloud could sit at the intersection of two of the decade’s biggest themes: space infrastructure and AI.