Memory Takes the Lead as AI Fuels a Fresh Semiconductor Rally

Semiconductor stocks have started the year on a strong footing, with gains concentrated not in flashy logic chips but in a quieter, more fundamental corner of the industry: memory.

Shares of the world’s biggest memory makers have surged, reflecting how artificial intelligence is reshaping demand patterns across the chip sector and tightening supply in critical components.

South Korea’s SK Hynix and Samsung Electronics, the two largest memory chipmakers globally, are up 11.5% and 15.9% respectively so far this year. In the United States, Micron has climbed 16.3%. The rally comes as investors bet that AI-related demand, which drove chip markets through 2025, is not fading but intensifying.

At the heart of the move is memory’s central role in AI computing. Training and running large AI models designed by companies such as Nvidia and AMD requires vast amounts of fast, high-capacity memory to move data efficiently between processors. As cloud providers and tech giants pour billions of dollars into AI data centers, memory has emerged as a bottleneck.

One segment has been especially important: dynamic random-access memory, or DRAM, used extensively in AI servers. Prices for DRAM surged sharply in 2025 as demand outpaced supply, and that pressure has not eased. Counterpoint Research expects memory prices to rise another 40% through the second quarter of 2026, extending what analysts increasingly describe as a full-blown cycle rather than a brief spike.

“The recent rally across the semiconductor space has been driven largely by the memory side of the market rather than logic chips,” Ben Barringer, head of technology research at Quilter Cheviot, said in an email to CNBC. “We’re seeing a combination of very strong demand from AI workloads and relatively constrained supply, particularly in high-bandwidth memory, which is essential for training and running large AI models.”

High-bandwidth memory, or HBM, has become one of the most sought-after components in the AI supply chain. It sits close to processors in advanced packaging configurations, enabling faster data transfer and lower power consumption. SK Hynix is widely seen as a leader in this area, supplying HBM used in some of Nvidia’s most powerful AI accelerators, a position that has strengthened its earnings outlook.

That backdrop explains the optimism heading into earnings season. Samsung is expected to report a 140% jump in fourth-quarter operating profit, according to LSEG estimates, marking a sharp turnaround after a prolonged downturn in its memory business. Micron’s earnings per share are forecast to rise more than 400% year-on-year in the December quarter, reflecting both higher prices and improving utilization rates.

The rally has spilled beyond memory producers themselves. Investors are increasingly positioning for a broader AI-driven expansion across the semiconductor value chain. Intel shares are up 7.6% year-to-date, while Taiwan Semiconductor Manufacturing Co., the world’s largest contract chipmaker, has gained 10%. Both companies manufacture a wide range of chips and are expected to benefit as customers ramp up spending on advanced semiconductors tied to AI workloads.

Equipment suppliers are also riding the wave. ASML, the Dutch firm whose lithography machines are essential for producing the most advanced chips, has seen its shares rise 15.2% this year. Bernstein on Sunday raised its price target on ASML from 800 euros to 1,300 euros, implying about 24% upside from Tuesday’s trading level.

“ASML stands to benefit enormously from the wave of capacity expansion planned for 2026 and 2027,” Bernstein analysts wrote, pointing specifically to memory. They said the company would gain “from the upcoming DRAM super cycle,” as manufacturers invest heavily in new fabs and more advanced production lines.

That link is crucial. As memory makers respond to tight supply and strong pricing by expanding capacity, demand for ASML’s tools rises in tandem. Advanced DRAM and HBM production requires cutting-edge manufacturing equipment, locking ASML deeper into the AI investment cycle.

Recent signals from industry executives have reinforced the bullish narrative. SK Hynix has pointed to the possibility of an extended HBM supercycle, suggesting demand could remain elevated well beyond a single year.

“Recent comments from SK Hynix pointing to a potential HBM supercycle have reinforced the idea that this is not just a short-term bounce, but a more structural shift linked to the ongoing build-out of AI infrastructure,” Barringer said. “That has helped improve sentiment across the sector, especially for companies with direct exposure to AI-driven memory demand.”

The emerging picture is one where memory, long treated as the most cyclical and volatile part of the semiconductor industry, has become central to the AI story. As long as companies continue to scale data centers and push larger, more data-hungry models, memory demand is likely to stay tight.

For investors, this has reframed how the semiconductor rally is being judged. This is not simply about who designs the smartest AI chips, but about who controls the components that make those chips usable at scale. So far in 2026, memory makers are winning that argument.

Sony Honda Defies EV Slowdown With Afeela Push at CES, Betting on Premium Tech as U.S. Market Cools

Sony Honda Mobility stepped onto the CES stage in Las Vegas with a message that cut against the prevailing mood in the U.S. auto industry: it is still pressing ahead with electric vehicles, even as many rivals retreat.

The electric vehicle joint venture between Sony Group and Honda Motor unveiled its latest prototype at the Consumer Electronics Show on Monday, reaffirming plans to bring its first production model, the Afeela 1, to U.S. customers. Chief executive Yasuhide Mizuno said deliveries in California are expected to begin late this year, with a broader U.S. rollout of a model based on the Afeela prototype targeted as early as 2028.

The appearance was striking in a year when CES featured fewer splashy automotive debuts. Several U.S. and global automakers have scaled back EV ambitions, delayed new launches, or paused production altogether, citing weakening demand, rising costs, and policy uncertainty. Against that backdrop, Sony Honda’s presence underscored a longer-term bet that the EV market, while cooling in the near term, will eventually reward companies that can differentiate through software and user experience.

The Afeela 1, priced from $89,900, positions the venture firmly in the premium segment. That pricing reflects both its technology-heavy pitch and the reality that mass-market EV adoption in the United States has proven more difficult than many automakers expected. Consumers have grown wary of high sticker prices, charging infrastructure gaps, and concerns about resale values, challenges that have become more visible as incentives have been reduced.

Policy changes under the Trump administration have added to the pressure. The rollback of EV-friendly measures, including the removal of a $7,500 federal tax credit, has made electric vehicles less attractive to price-sensitive buyers. Automakers say the shift has slowed showroom traffic and forced a reassessment of production volumes, especially for models aimed at the middle of the market.

Tariffs on imported vehicles and auto parts have further complicated the picture, raising costs at a time when companies are already struggling to protect margins. As a result, CES 2026 unfolded with a noticeably more cautious tone from carmakers, many of whom opted to focus on incremental technology updates rather than full vehicle launches.

Sony Honda Mobility’s strategy appears deliberately insulated from some of those pressures. Formed in 2022, the joint venture was built on a clear division of strengths: Honda contributes decades of experience in vehicle engineering, manufacturing, and safety, while Sony brings software, sensor technology, entertainment, and gaming ecosystems. The companies have repeatedly framed Afeela not simply as an electric car, but as a software-defined platform designed to evolve over time.

At CES, that positioning was again front and center. The Afeela concept emphasizes advanced driver-assistance systems, immersive in-car entertainment, and deep integration with digital services. Sony has previously highlighted the use of imaging and sensing technologies derived from its consumer electronics and gaming businesses, as well as the potential for over-the-air updates to continuously add features.

That focus reflects a broader shift in the auto industry, where software is increasingly seen as a key battleground. Traditional automakers are racing to build in-house software capabilities or partner with technology firms, while newer entrants argue that the vehicle is becoming another connected device. Sony Honda is attempting to bridge those worlds, betting that consumers are willing to pay a premium for a seamless digital experience as much as for horsepower or range.

Still, the road ahead is far from smooth. Entering the U.S. market at the luxury end puts Afeela in direct competition with established EV brands and legacy automakers that already have scale, charging partnerships, and brand loyalty. Delivering on promises around software reliability, autonomous features, and user experience will be critical, particularly as consumers grow more skeptical of grand claims following years of delays and missed targets across the industry.

Timing also remains a risk. While Sony Honda expects initial deliveries in California later this year, its broader ambition to roll out a production model by 2028 means navigating several years of uncertain demand, evolving regulations, and rapid technological change. Battery costs, charging standards, and consumer expectations could all shift significantly before then.

Even so, the decision to press ahead sends a signal. At a moment when many automakers are pulling back to reassess, Sony Honda Mobility is choosing visibility and momentum. Its CES unveiling suggests confidence that the current slowdown is cyclical rather than structural, and that a carefully positioned, technology-led EV can still find an audience in the United States.

Some analysts believe that whether the bet pays off will depend not just on policy or market conditions, but on execution. For now, though, the Afeela stands as one of the few new EV programs still moving forward at a time when the industry is hitting the brakes.

The Most Undervalued Investment Right Now? Analysts Point to Ozak AI’s Fast Presale Growth and Near-$5.5M Raise

The crypto market, widely known for its volatility, shrinking liquidity and fragile confidence, is showing signs of a shift. Ozak AI, a new AI token, is approaching the $5.5 million presale mark, and analysts are openly saying that it is one of the most undervalued early-stage investments of this year.

While top-cap assets continue to flatten, Ozak AI is attracting investors at a pace that hasn’t been seen in months — something analysts say is a direct reflection of how deeply the market is shifting toward AI-driven utility projects.

A Rare Signal in a Weak Market: Ozak AI’s Presale Grows Faster as Everything Else Slows

The broader market has been red for weeks. Bitcoin dominance is up, altcoins are down, and sentiment remains soft. Yet Ozak AI has done the opposite — its presale surged past $5.41M and continues accelerating every day. This reversal is precisely what captured analyst attention. Why?

Because presales typically slow down during downturns — but Ozak AI’s has only strengthened, marking it as a project with independent momentum and real investor conviction.

Many analysts now classify Ozak AI as “severely undervalued relative to its narrative and tech stack”, given its starting price of just $0.014.


Why Analysts Believe Ozak AI Is Still Massively Undervalued

According to market strategists, several critical factors make Ozak AI stand out as a high-growth candidate:

  1. AI Narrative Strength + Real Utility

AI remains the strongest macro driver heading into 2026–2028. Ozak AI isn’t a hype token — it offers:

  • Prediction Agents (PAs)
  • Ozak Stream Network (OSN) 
  • EigenLayer AVS readiness
  • Arbitrum Orbit integration
  • Ozak Data Vaults for secure AI training datasets

This gives it a deeper technical foundation than most AI projects currently on the market.

  2. A Micro Valuation With Macro-Level Growth Potential

With under $5.5M raised, Ozak AI sits in the prime price-discovery zone. Tokens that grow from microcaps to midcaps often generate the highest multipliers, with many of the strongest performers having started small and multiplied 50×–150× once demand peaked. Ozak AI’s entry point is still early, even after gaining significant traction.

  3. Strong Investor Rotation From Larger Tokens

Analysts are already tracking a rotation pattern:

  • Small BTC holders
  • Short-term ETH traders
  • Early SOL flippers

are reallocating small amounts of capital into Ozak AI to capture the early-stage upside that blue-chips can no longer match. This shift reinforces the belief that Ozak AI is not fully priced in despite its fast-approaching $5.5M raise.

Why “Undervalued” Is Becoming the Consensus View

Most tokens approaching a $5.5M presale raise are priced much higher, often at $0.05–$0.10 or more. Ozak AI, in contrast, sits at just $0.014.

This mismatch between:

  • presale traction,
  • market demand,
  • AI narrative strength, and
  • extremely low starting price

is exactly why analysts label it undervalued compared to its long-term potential. In their models, Ozak AI could very realistically reach:

  • $1 at listing,
  • $3–$5 in broader market expansion, and
  • $7–$10 in peak AI bull-cycle demand by 2027–2028.

Even the most conservative scenarios show it outperforming top-caps by a large margin.

Approaching $5.5M: The Moment That May Trigger a Major Repricing Wave

Crossing the $5.5M presale threshold often marks the psychological shift from “emerging token” to “serious contender.” With Ozak AI only steps away, analysts expect bigger buyers entering, more aggressive community growth, stronger exchange-listing interest and a rapid acceleration in demand. It is the exact point where undervaluation starts to vanish and multipliers begin forming. Support from SINT, HIVE Intel, Weblume, Pyth Network and others has also helped along the way.

Final Take: Undervalued Today, Potentially a Top Performer Tomorrow

Ozak AI has become the rare project that checks every box: fast funding, real utility, strong AI positioning, low entry price, high-confidence investor rotation and a rapidly increasing presale cap. Analysts aren’t calling it undervalued for hype — they’re pointing to the data.

With the presale nearing $5.5M, this could be the last phase before the market revalues Ozak AI toward its true potential.

For more information about Ozak AI, visit the links below:

Website: https://ozak.ai/

Twitter/X: https://x.com/OzakAGI

Telegram: https://t.me/OzakAGI

AB InBev Moves to Reassert Control of U.S. Can Plants With $3bn Buyback of 49.9% Stake

Anheuser-Busch InBev has decided the time is right to reverse a deal it struck at the height of its debt-cutting drive, announcing plans to buy back a 49.9% stake in its U.S. metal container plants for about $3 billion as aluminum prices surge under the weight of tariffs and tight supply.

The world’s largest brewer said on Tuesday it would exercise a repurchase option agreed in 2020, when it sold the minority stake in its U.S. packaging business to a group of investors led by Apollo Global Management. The sale was part of a broader effort to shore up its balance sheet after years of acquisition-led expansion left it with a heavy debt burden.

Under the original deal, AB InBev retained operational control of the packaging business, which spans seven plants across six U.S. states, and secured a long-term supply agreement to meet its canning needs. The agreement also included an option to buy back the stake after five years at a predetermined price, a clause the brewer is now set to activate.

The company said the transaction would be funded with cash and is expected to close in the first quarter. It added that the deal would boost profits from the first year, though it declined to provide detailed financial projections. AB InBev shares were down 0.7% in midday trading.

While the brewer kept its explanation brief, the timing offers clues. Aluminum costs have climbed sharply, driven by tariffs and supply constraints, which have pushed U.S. market premiums to record highs. Benchmark three-month aluminum on the London Metal Exchange reached $3,130 per metric ton on Tuesday, the highest level since April 2022.

President Donald Trump doubled tariffs on aluminum imports to 50% on June 4, a move aimed at encouraging domestic production of the metal, which is widely used in construction, power infrastructure, and packaging. For beverage companies that rely heavily on cans, the policy has added another layer of cost pressure.

AB InBev has said hedging has helped cushion the immediate impact, but chief executive Michel Doukeris warned last year that the effects could become more pronounced in 2026. Regaining full ownership of its U.S. container plants strengthens the brewer’s grip on a critical part of its supply chain at a time when external costs are becoming harder to predict.

The repurchase also marks a shift in posture for a company that has spent much of the past decade focused on raising cash rather than deploying it. After years of asset sales, dividend restraint, and disciplined spending, AB InBev had cut its debt below the threshold investors widely view as acceptable by the end of 2024. The buyback of the packaging stake is its first major transaction since crossing that line.

Analysts see a clear financial logic. Bernstein analyst Trevor Stirling said the combined cost of buying packaging externally and servicing the minority interest was higher than the effective financing cost of the $3 billion repurchase. On that basis, he said, the deal should lift earnings and only modestly reduce the scale of future share buybacks, an important consideration for investors who have pushed for higher returns as leverage has come down.

Still, the move comes as the brewer faces headwinds in its largest profit pool. Beer sales in the United States have been sliding as consumers rein in discretionary spending, while spirits continue to gain share. Some investors also point to changing attitudes toward alcohol among younger consumers, adding to longer-term uncertainty around volumes.

Against that backdrop, controlling costs has become increasingly important. By bringing its U.S. canning operations fully back in-house, AB InBev is betting that tighter control over packaging will help offset margin pressure elsewhere in the business, even as metal prices remain elevated.

For a company long defined by financial discipline and deleveraging, the decision to spend $3 billion is notable. It signals that AB InBev believes the balance sheet is strong enough to support selective reinvestment, and that, in an era of trade barriers and volatile input costs, owning more of the supply chain can be as valuable as paying down debt.

AMD’s Lisa Su Warns AI Will Need “10 Yottaflops” of Compute, Far Beyond Anything the World Has Ever Built

When AMD chief executive Lisa Su stepped onto the CES 2026 stage in Las Vegas, she was not just unveiling a new generation of chips. She was trying to recalibrate how the industry thinks about scale.

Artificial intelligence, she said, is growing so fast that familiar yardsticks for computing power no longer apply. The future, in her telling, belongs to a unit so large it still sounds theoretical: the yottaflop.

Su told the audience that keeping pace with AI over the next five years will require more than 10 yottaflops of compute. She paused mid-speech to underline how unfamiliar that number is.

“How many of you know what a yottaflop is?” she asked, inviting a show of hands. When none appeared, she explained it herself.

“A yottaflop is a one followed by 24 zeros. So 10 yottaflops is 10,000 times more compute than we had in 2022,” she said.

At its core, a flop is a single mathematical operation. A computer capable of performing one billion operations per second is said to deliver a gigaflop. A yottaflop represents one septillion calculations every second. At that scale, scientists say, computers could theoretically run atom-level simulations for entire planets, workloads that today sit firmly in the realm of speculation.

What makes Su’s projection striking is not just the size of the number, but the speed at which the industry is approaching it. In 2022, global AI compute was estimated at roughly one zettaflop, or 10²¹ operations per second. By 2025, Su said, that figure had already surged beyond 100 zettaflops. The jump from zettaflops to yottaflops is not a smooth curve. It is a steep climb that compresses decades of historical progress into a few years.
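For readers who want to check the arithmetic, the scales quoted above line up as follows; this is only a back-of-the-envelope restatement, using the standard short-scale prefixes and taking 2022’s baseline as the roughly one zettaflop estimate cited here:

$$1\ \text{gigaflop} = 10^{9},\qquad 1\ \text{zettaflop} = 10^{21},\qquad 1\ \text{yottaflop} = 10^{24}\ \text{operations per second}$$

$$\frac{10\ \text{yottaflops}}{1\ \text{zettaflop}} = \frac{10 \times 10^{24}}{10^{21}} = 10^{4} = 10{,}000\times$$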

“There’s just never, ever been anything like this in the history of computing,” Su told the conference.

To grasp the magnitude, Su compared her forecast with the most powerful machine currently in operation. The US Department of Energy’s El Capitan supercomputer, which tops global rankings today, would need to be scaled up by a factor of about 5.6 million to reach 10 yottaflops. Even the vast data centers being built by cloud giants fall dramatically short of that benchmark.
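That multiple can be sanity-checked the same way. Assuming El Capitan delivers on the order of 1.7–1.8 exaflops, a figure the article does not state and which should be treated as an approximation, the ratio works out to roughly the 5.6 million cited:

$$\frac{10\ \text{yottaflops}}{\sim\!1.8\ \text{exaflops}} = \frac{10^{25}}{1.8 \times 10^{18}} \approx 5.6 \times 10^{6}$$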

The implication is that AI’s next phase is no longer limited by software ingenuity alone. It is colliding with hard physical constraints. Power consumption has already become a central issue. Training large AI models and running them at scale requires enormous amounts of electricity, and that demand is placing visible strain on the US power grid. Data center operators are competing for capacity, while utilities warn that generation and transmission upgrades are struggling to keep up.

Scaling compute by several more orders of magnitude would require a parallel transformation of energy infrastructure. More power plants, stronger grids, advanced cooling systems, and new approaches to efficiency would all be necessary. In that sense, the yottaflop challenge extends far beyond chipmakers. It touches energy policy, industrial planning, and national infrastructure strategy.

There is also an economic dimension. The cost of building and operating yottaflop-scale systems will be immense. As computing becomes more concentrated in a handful of hyperscale players, questions around access, pricing, and competition are likely to intensify. Smaller firms and research institutions may find themselves locked out of the most advanced AI capabilities unless new models for shared infrastructure emerge.

Against this backdrop, Su used the CES keynote to position AMD as a key supplier for what comes next. She unveiled the company’s next generation of AI accelerators, including the MI455 GPU, underscoring AMD’s push deeper into the data-center market. The company is increasingly targeting customers building massive AI systems, including OpenAI, as it seeks to close the gap with Nvidia in high-performance AI hardware.

The timing comes as AI is moving from experimentation into industrial-scale deployment. Governments are embedding it into national strategies, companies are baking it into core products, and scientific research is leaning on it for breakthroughs. That shift is driving demand for compute at a scale that was barely discussed a few years ago.

Su’s message at CES was less a distant forecast than a warning shot. If AI continues on its current trajectory, the world will be forced to rethink how computing power is built, powered, and governed. The yottaflop, once a mathematical curiosity, is rapidly becoming the next benchmark.