
Cencora Moves to Tighten Grip on U.S. Cancer Care With $5bn OneOncology Deal


U.S. drug distributor Cencora said on Monday it will take majority control of cancer care network OneOncology in a transaction valued at about $5 billion, marking a major step in its strategy to deepen its role in oncology services and specialty drug distribution.

Under the terms of the deal, Cencora will buy most of the remaining shares in OneOncology from private equity firm TPG and other investors for roughly $3.6 billion in cash, while also assuming and paying down about $1.3 billion of the oncology group’s debt. The transaction values OneOncology at approximately $7.4 billion and will give Cencora majority ownership of a business it already partly controls.
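The reported figures fit together with simple arithmetic; the snippet below is only an illustrative back-of-envelope check of the numbers in this article, not a reconstruction of the actual deal accounting:

```python
# All figures in $ billions, as reported.
cash_for_shares = 3.6   # cash paid to TPG and other investors for remaining shares
debt_assumed = 1.3      # OneOncology debt assumed and paid down

transaction_value = cash_for_shares + debt_assumed
print(round(transaction_value, 1))  # 4.9, i.e. "about $5 billion"
```

The separate $7.4 billion figure is the implied valuation of OneOncology as a whole, including the stake Cencora already holds, which is why it exceeds the roughly $5 billion transaction value.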

The acquisition accelerates Cencora’s push beyond traditional drug distribution into higher-margin, service-oriented segments of healthcare, particularly oncology, where demand continues to rise as cancer treatments become more complex, personalized, and costly. By consolidating control of OneOncology, Cencora is positioning itself closer to the point of care, strengthening its ability to integrate drug distribution, data, practice management, and clinical services for cancer clinics across the United States.

OneOncology operates a network of independent oncology practices, providing them with operational support, clinical pathways, data analytics, and access to specialty medicines. For Cencora, full control of the network is expected to create synergies in the distribution of specialty drugs used to treat complex conditions such as cancer, an area that offers structurally higher margins than traditional pharmaceuticals and has been growing faster than the broader drug market.

Investors initially welcomed the move. Shares of Cencora rose about 1% in morning trading, reflecting confidence that the deal strengthens the company’s long-term growth profile even as it increases near-term leverage.

Analysts broadly described the acquisition as a logical next step. Michael Cherny of Leerink Partners said the accelerated purchase completes an expected move to increase exposure to faster-growing oncology assets, reinforcing Cencora’s strategic focus on cancer care as a core growth driver. J.P. Morgan analyst Lisa Gill also viewed the transaction positively, pointing to operational and commercial synergies that come with full ownership of OneOncology rather than a minority stake.

The deal comes against the backdrop of Cencora’s broader investment push in the U.S. healthcare supply chain. In November, the company pledged to invest $1 billion through 2030 to expand its U.S. distribution network, aligning with calls from the White House for pharmaceutical manufacturers and distributors to strengthen domestic production and supply resilience. Greater control of oncology services fits into that strategy, giving Cencora a more integrated presence in a critical segment of the healthcare system.

Cencora said the transaction is expected to close by the end of its second quarter of fiscal 2026 and will be financed with new debt. While the company maintained its fiscal 2026 earnings forecast, it acknowledged that the decision to halt share buybacks ahead of the acquisition makes results more likely to land at the lower end of its previously guided range of $17.45 to $17.75 per share.

At the same time, management raised its long-term adjusted profit outlook, citing expectations that OneOncology will increasingly contribute to earnings as it is more fully integrated into Cencora’s U.S. healthcare operations. The company believes the deal will enhance its ability to serve oncology providers over time, even if it temporarily pressures capital returns to shareholders.

For OneOncology, the transaction represents a shift away from the private equity ownership model at a time when exit routes such as initial public offerings remain uncertain. IPO markets have stayed uneven, prompting many healthcare startups to opt for strategic sales to larger industry players rather than risk subdued public debuts.

Taken together, the deal underlines how large drug distributors are moving closer to patient care, betting that scale, data, and tighter integration with clinics will be key to capturing growth in oncology — one of the most lucrative and rapidly evolving areas of modern medicine.

Nvidia Acquires SchedMD to Expand Open-Source Workload Management for AI


Nvidia has announced it has acquired SchedMD — the leading developer of Slurm, an open-source workload management system for high-performance computing (HPC) and AI — in a move the company says will strengthen the open-source software ecosystem and drive AI innovation for researchers, developers and enterprises.

NVIDIA will continue to develop and distribute Slurm as open-source, vendor-neutral software, making it widely available to and supported by the broader HPC and AI community across diverse hardware and software environments.

HPC and AI workloads involve complex computations running parallel tasks on clusters that require queuing, scheduling and allocating computational resources. As HPC and AI clusters get larger and more powerful, efficient resource utilization is critical.

This move is a significant affirmation of the chip designer’s commitment to strengthening its grip on the artificial intelligence (AI) ecosystem, not just through hardware, but through control and enhancement of the essential software layers that enable the efficient use of its high-performance chips. The acquisition, for which financial details were not disclosed, comes as Nvidia faces intensifying competition and strategically doubles down on the importance of the open-source community.

SchedMD’s flagship product, Slurm (Simple Linux Utility for Resource Management), is sophisticated, vendor-neutral software that serves as the de facto cluster management and job scheduling system for large-scale computing environments globally.

Slurm’s importance in the current AI landscape is profound. The product is designed to schedule and manage enormous computing jobs, such as those required for training large foundation models and performing generative AI inference. It ensures the efficient allocation of vast shares of a data center’s server capacity and, crucially, excels at managing and optimizing the use of high-value resources like GPUs alongside CPUs across massive computing clusters.
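To make the resource-allocation role concrete, a minimal Slurm batch script for a distributed training job might look like the following sketch — the job name, sizes, script name, and partition are all hypothetical and vary by site:

```bash
#!/bin/bash
#SBATCH --job-name=train-llm       # hypothetical distributed training job
#SBATCH --nodes=16                 # number of cluster nodes to allocate
#SBATCH --ntasks-per-node=8        # one task per GPU
#SBATCH --gres=gpu:8               # request 8 GPUs on each node
#SBATCH --cpus-per-task=12         # CPU cores paired with each GPU task
#SBATCH --time=48:00:00            # wall-clock limit for the allocation
#SBATCH --partition=gpu            # queue/partition name is site-specific

srun python train.py               # srun launches the tasks across the allocation
```

Submitted with `sbatch`, the job waits in a queue until Slurm can satisfy the full request, which is exactly the scheduling and allocation problem described above.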

This directly maximizes the return on investment for customers purchasing Nvidia hardware.

Slurm is a cornerstone of the global supercomputing community, currently acting as the scheduler for more than half of the top 10 and top 100 supercomputer systems on the prestigious TOP500 list, demonstrating its unmatched scalability and efficiency in distributed environments. Its customer base spans major institutions and private firms, including the cloud infrastructure firm CoreWeave and the Barcelona Supercomputing Center.

The integration of SchedMD and its Slurm technology directly supports Nvidia’s strategy, which views its proprietary CUDA software ecosystem as the critical moat protecting its market dominance. While Nvidia builds its reputation on fast chips, the software stack is what locks developers into the platform.

Nvidia gains the ability to accelerate the development of the scheduler itself by acquiring the team behind Slurm. This allows the company to ensure that Slurm is optimized at the most fundamental level to meet the rapidly evolving demands of next-generation AI and supercomputing, maximizing the throughput and efficiency of its newest hardware generations, such as the NVIDIA GB200 systems. This deep integration is vital for foundation model developers who need highly efficient resource management for distributed training jobs.

Additionally, this acquisition helps neutralize the efforts of rival chip makers, such as AMD, who are actively building competing GPU platforms and often rely on Slurm integration to appeal to customers. By owning the leading open-source scheduler, Nvidia can directly influence its future, making it strategically more challenging for competitors to offer a truly seamless and optimized experience on alternative hardware stacks, thereby maintaining its overall control of the AI infrastructure.

Despite taking ownership, Nvidia has made an explicit and crucial commitment to maintain and distribute Slurm as open-source, vendor-neutral software. This dedication reassures the existing ecosystem of supercomputing centers and AI labs that rely on Slurm for flexibility and multi-vendor compatibility.

The move strengthens Nvidia’s overall open-source push, which also includes the recent unveiling of its new Nemotron 3 family of open-source AI models, all aimed at cementing its influence across the entire AI stack, from silicon to scheduling.

The acquisition of the 40-person company, which was founded in 2010 by Slurm developers Morris “Moe” Jette and Danny Auble, reinforces Nvidia’s strategy of making targeted, high-impact investments in the software layers that control how its accelerated computing platform is utilized, ensuring its hardware remains the indispensable component in the global AI infrastructure.

Nvidia Weighs H200 Production Ramp as China Orders Surge After U.S. Export Green Light


Nvidia is considering ramping up production of its H200 graphics processing units as demand from Chinese customers accelerates following fresh approval from the Trump administration to resume sales of the chips to China, according to a Reuters report citing people familiar with the matter.

The H200 is the most powerful chip from Nvidia’s previous Hopper generation and is widely used to train large language models and other advanced artificial intelligence systems. Until recently, the chip was effectively off-limits to China after the Biden administration proposed tighter export controls aimed at restricting the flow of cutting-edge AI hardware to Chinese firms.

That position shifted last week when the U.S. Department of Commerce cleared Nvidia to sell the H200 in China, under an arrangement that requires the company to remit 25% of sales from those chips. The approval reopened a critical market for Nvidia at a time when Chinese technology companies are racing to secure compute capacity to remain competitive in AI development.

According to Reuters, interest from China has been strong enough that Nvidia is now examining whether to add manufacturing capacity for the H200, which is currently produced in limited volumes. Chinese firms have been moving quickly to place orders, reflecting pent-up demand created by years of export restrictions that forced many developers to rely on less capable hardware.

Chinese regulators, however, have not yet given final clearance for the chips to be imported, and discussions are ongoing within Beijing over whether to allow large-scale purchases. If approved, the H200 would represent a significant step up from the H20 chips that Nvidia previously tailored for the Chinese market to comply with U.S. restrictions. The H20, while compliant, is widely seen as a compromised alternative with reduced performance.

Several major Chinese technology companies are already in talks with Nvidia about potential orders. Firms such as Alibaba and ByteDance, both of which are building proprietary AI models, are said to be assessing how many H200 chips they can secure if imports are approved. For these companies, access to more capable GPUs could shorten training cycles and narrow the performance gap with U.S.-based rivals that have had uninterrupted access to Nvidia’s latest hardware.

“We are managing our supply chain to ensure that licensed sales of the H200 to authorized customers in China will have no impact on our ability to supply customers in the United States,” an Nvidia spokesperson said in an emailed statement.

The possible production increase highlights Nvidia’s delicate balancing act. On one hand, China remains one of the world’s largest markets for data center and AI hardware, and renewed access offers Nvidia a chance to unlock substantial incremental revenue. On the other hand, the company must navigate U.S. national security concerns and reassure policymakers that expanded China sales will not undermine domestic supply or strategic objectives.

For China’s AI sector, the development points to how export controls have reshaped priorities. With access to top-tier hardware constrained, many Chinese firms have focused on improving model efficiency and software optimization rather than brute-force scaling. The return of a chip as capable as the H200 could shift that balance, at least temporarily, even as Beijing continues to push for homegrown alternatives to reduce long-term reliance on U.S. suppliers.

More broadly, the episode illustrates how geopolitics is now directly shaping supply chains in the AI industry. Decisions about chip production, allocation, and pricing are no longer driven solely by market demand but by negotiations between governments, regulators, and corporate giants. Nvidia’s consideration of a production ramp suggests that, for now, demand from China remains too large to ignore — even as the rules governing that demand continue to evolve.

Nigeria’s Inflation Eases Sharply to 14.45% in November as Base Effects and Price Pressures Recede


Nigeria’s headline inflation rate slowed markedly to 14.45% in November 2025, down from 16.05% in October, signaling a notable easing in price pressures after a prolonged period of elevated inflation.

The latest figures, released on Monday by the National Bureau of Statistics (NBS), show a drop of 1.6 percentage points from October’s reading, one of the sharpest disinflationary moves recorded this year. The data also indicate a slowdown compared with November last year, though the NBS cautioned that the year-on-year comparison is distorted by the recent rebasing of the inflation data, since the November 2024 figure was computed under the old 2009 base year.

On a month-on-month basis, however, inflation dynamics remain mixed. The NBS said headline inflation rose by 1.22% in November, higher than the 0.93% recorded in October. This suggests that while annual inflation is cooling, prices are still rising at a faster pace within the month.

“On a month-on-month basis, the Headline inflation rate in November 2025 was 1.22%, which was 0.29% higher than the rate recorded in October 2025,” the statistics agency said, noting that the average price level increased faster in November than in the preceding month.
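The apparent tension between a falling annual rate and rising monthly prices is just arithmetic on the price index. A minimal sketch with hypothetical index levels — chosen only to reproduce the reported rates, not actual NBS index values:

```python
# Hypothetical CPI index levels, picked to illustrate the reported rates.
cpi_nov_2024 = 100.00
cpi_oct_2025 = 113.07
cpi_nov_2025 = 114.45

yoy = (cpi_nov_2025 / cpi_nov_2024 - 1) * 100  # year-on-year inflation rate
mom = (cpi_nov_2025 / cpi_oct_2025 - 1) * 100  # month-on-month inflation rate

print(f"YoY: {yoy:.2f}%  MoM: {mom:.2f}%")  # YoY: 14.45%  MoM: 1.22%
```

The annual rate can fall even while the index rises within the month, because the comparison base (November 2024) was itself elevated relative to the months before it.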

Urban and rural price trends diverge

The moderation was evident across both urban and rural areas, though the pace of decline differed.

Urban inflation stood at 13.61% year-on-year in November 2025, a steep drop of 23.49 percentage points from the 37.10% recorded in November 2024. Month-on-month, urban inflation eased to 0.95%, down from 1.14% in October, pointing to some relief in city price pressures. The 12-month average urban inflation rate fell to 20.80%, compared with 35.07% a year earlier.

In rural areas, year-on-year inflation came in at 15.15%, down from 32.27% in November 2024. While still higher than the urban rate, the decline of 17.12 percentage points reflects easing pressures in food-producing and semi-urban communities, where inflation had been particularly severe over the past two years.

Food inflation cools sharply, but prices are still rising

Food inflation, a key driver of household hardship, slowed significantly to 11.08% year-on-year in November 2025, from 39.93% in the same month last year. The NBS attributed much of this dramatic drop to the change in the base year rather than a broad-based collapse in food prices.

Indeed, the agency noted that several staple items continued to record price increases during the month. These included dried tomatoes, cassava tubers, shelled periwinkle, ground pepper, eggs, crayfish, unshelled melon (egusi), oxtail, and fresh onions, underscoring the reality that many Nigerians are yet to feel tangible relief at the markets.

Core inflation, which strips out volatile food and energy prices, stood at 18.04% year-on-year in November. This suggests that underlying price pressures linked to transport, housing, healthcare, and services remain elevated, even as headline numbers ease.

Policy backdrop and lingering doubts

The latest data land against the backdrop of an ambitious inflation target set by President Bola Tinubu. In December 2024, while presenting the 2025 Appropriation Bill to a joint session of the National Assembly, Tinubu pledged to bring inflation down from 34.6% to 15% by the end of 2025.

“The 2025 budget projects that inflation will decline significantly from the current 34.6% to 15% by the end of next year,” the president said at the time.

While November’s reading of 14.45% appears, on the surface, to put that target within reach, economists have been cautious in interpreting the figures. Several analysts argue that base effects from the rebasing exercise are doing much of the heavy lifting, warning that structural drivers of inflation — such as exchange rate volatility, high energy costs, insecurity affecting food supply, and elevated transport expenses — have not been meaningfully addressed.

In recent months, some economists have also called for a reassessment of monetary policy, noting that consecutive declines in headline inflation could strengthen the case for easing the Monetary Policy Rate. Others counter that the higher month-on-month inflation rate and stubborn core inflation suggest it may be too early for the Central Bank of Nigeria to declare victory.

For households and businesses, the key question remains whether the statistical slowdown will translate into sustained affordability. Currently, the data point to easing pressure, but not yet to a return to comfort, as prices continue to rise even if at a slower annual pace.

Zoom Brings AI Assistant to Web with Unveiling of Companion 3.0


Zoom has unveiled AI Companion 3.0, a sweeping update that redefines the platform as a comprehensive, AI-first productivity ecosystem extending far beyond its video conferencing origins.

The release introduces a dedicated web surface for the AI assistant and, critically, democratizes access to core AI features for users on the free Basic tier. The company positions AI Companion 3.0 as taking work from “conversation to completion,” aiming to eliminate the friction between discussion and actionable outcomes.

The AI Companion is now accessible via a new, permanent conversational work surface at ai.zoom.us on a desktop web browser, establishing the assistant as a central hub for daily work outside of live meetings. This expansion is supported by a strategic freemium model designed to drive adoption.

Basic plan holders gain access to the AI Companion in up to three meetings per month for free. During these sessions, they can utilize high-value features such as meeting summary, in-meeting question answering, and AI note-taking. Additionally, free users can ask up to 20 questions each month via the side panel or the new web surface to retrieve highlights or action items from past meetings.

Full, unrestricted access to the AI Companion is available as a $10 per user per month add-on plan, which can be purchased without needing a separate paid Zoom Workplace license. An advanced Custom Companion tier is also offered for enterprise users, providing deeper customization, personalized knowledge collections, and integrations with their proprietary data sources.

Agentic Capabilities and Cross-Platform Orchestration

The most significant advancement in AI Companion 3.0 is its shift toward “agentic” capabilities, allowing it to perform multi-step actions and retrieve information across silos, turning scattered work conversations into continuous intelligence.

This intelligent assistance covers the entire workflow. The AI Companion features agentic retrieval capabilities, enabling it to pull information not only from all data stored within the Zoom ecosystem (meetings, chats, notes) but also from connected third-party platforms.

It currently supports Google Drive and Microsoft OneDrive, with support for Gmail and Microsoft Outlook planned soon, allowing the assistant to pull in email and document context for more informed responses. The system can even take notes for meetings held on Microsoft Teams and Google Meet, addressing the reality of mixed-platform work environments.

The assistant proactively manages the workday by generating a Daily Reflection Report, which summarizes meetings, tasks, and updates. It also automates post-meeting work with the Post-Meeting Follow-Up prompt template, which generates next steps, tasks, and drafts follow-up email messages. Custom AI agents, currently in beta for power users, allow a low-code design of personal workflows that automate routine tasks like summarizing chat threads every morning.

A new Agentic Writing Mode empowers users to draft, edit, and refine business documents using context derived directly from meeting discussions and documents. Users can start collaborative projects within the companion interface and seamlessly shift them to Zoom Docs, supporting exports to Markdown, PDF, Microsoft Word, and Zoom Docs formats.

Zoom, founded by CEO Eric Yuan, is actively competing with productivity behemoths like Microsoft (Copilot) and Google (Gemini) by leveraging its massive base of meeting data and an independent, platform-agnostic approach.

Lijuan Qin, head of AI product at Zoom, emphasized that the company’s independence and access to deep contextual meeting data give it a crucial advantage. Zoom’s technical core is its federated AI approach, which strategically combines the power of Zoom’s own custom Large Language Models (LLMs) and Small Language Models (SLMs) with the best models from third-party partners like OpenAI, Anthropic, and open-source models like NVIDIA Nemotron.

This hybrid approach dynamically routes tasks to the most suitable model for a given function, optimizing for performance, cost efficiency, and accuracy. This system has reportedly shown superior performance on certain benchmarks compared to reliance on a single frontier model.
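The routing idea described above can be sketched in a few lines. The task names, model labels, and routing table below are illustrative assumptions for exposition, not Zoom’s actual implementation:

```python
# Toy sketch of task-to-model routing in a federated, multi-model setup.
# Each task type maps to whichever class of model best balances
# cost, latency, and quality for that kind of job.
ROUTING_TABLE = {
    "meeting_summary": "small-language-model",  # high volume, latency-sensitive
    "action_items": "small-language-model",
    "document_draft": "frontier-llm",           # quality-sensitive long-form writing
}

def route(task_type: str, default: str = "frontier-llm") -> str:
    """Pick a model for a task, falling back to a general-purpose default."""
    return ROUTING_TABLE.get(task_type, default)

print(route("meeting_summary"))  # small-language-model
print(route("legal_review"))     # frontier-llm (fallback for unmapped tasks)
```

In a real system the routing decision would also weigh dynamic signals such as load and accuracy benchmarks, but the core dispatch pattern is this simple lookup-with-fallback.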

By offering its AI assistant with limited free access and focusing on cross-platform functionality, Zoom is banking on a freemium strategy that captures new users and positions the AI Companion as an indispensable, neutral productivity layer across the entire enterprise stack, regardless of a company’s core software provider.