
Sterling Bank Taps Thunes Network to Boost Global Payments For Nigerians Abroad

Thunes, a financial technology company that provides global cross-border payment infrastructure, has announced the onboarding of Sterling Bank into its Direct Global Network, a move set to enhance cross-border payment experiences for Nigerians living and working abroad.

Through this partnership, both new and existing Sterling Bank account holders can now access seamless, instant international payment services, making it easier to send money home in real time.

Speaking on the collaboration, Daniel Parreira, SVP, Sales Africa at Thunes, said,

“Welcoming Sterling Bank to our Direct Global Network marks another significant milestone in our expansion across Africa, and the trust in our infrastructure across the continent. Together, we’re enabling a new level of convenience, speed, and confidence for customers managing finances across borders. This alliance demonstrates our ongoing dedication to making global money movement instant, transparent and accessible for all.”

Also commenting, Ayodeji Saba, Head, Switch & Remittances at Sterling Bank, said,

“This partnership reflects Sterling Bank’s deep commitment to making it easier for Nigerians abroad to send money home. With Thunes’ trusted technology, we’re giving our customers a faster, more reliable, and more affordable way to fund their Sterling Bank accounts from their foreign bank accounts. It’s a major step forward in improving the experience for our diaspora community.”

The collaboration underscores the shared commitment of Thunes and Sterling Bank to advancing financial inclusion and strengthening trusted, real-time payment infrastructure across Africa and beyond. Thunes' Direct Global Network connects banks, mobile wallets, and digital assets in 130 countries, reaching billions of endpoints in both fiat and stablecoins.

For Sterling Bank, the partnership is pivotal to its growth and innovation agenda. Integrating directly with a global payment network like Thunes positions the bank to compete more effectively with standalone remittance providers, retain diaspora customers, and unlock new revenue streams tied to cross-border services. For customers, it means fewer intermediaries, reduced transaction failures, improved FX transparency, and faster access to funds, often within seconds rather than days.

Notably, in September last year, Sterling Bank's SEABaaS (Sterling Ecosystem and API Banking-as-a-Service) platform recorded a major milestone, processing over two billion transactions within a single year and underscoring the bank's growing role as a technology-driven financial services provider.

This achievement highlights the scale, resilience, and reliability of Sterling’s digital infrastructure, which powers payments, collections, transfers, and embedded finance solutions for fintechs, merchants, corporates, and developers.

With an estimated 17 million Nigerians in the diaspora, demand for fast, transparent, and reliable remittance services continues to grow, and it is shifting toward modern, digital solutions that offer instant or near-instant transfers, clear pricing, real-time tracking, and seamless integration with local bank accounts and wallets.

The Thunes–Sterling Bank partnership reflects a growing consensus across the financial services industry: the future of remittances is instant, transparent, and customer-centric.

As diaspora populations grow and digital adoption accelerates, collaborations that combine global infrastructure with strong local banking presence will be key to reshaping cross-border payments, strengthening economic ties, and driving sustainable financial inclusion across Nigeria and the wider African continent.

Looking ahead

Partnerships like the Thunes–Sterling Bank collaboration signal a broader shift in how cross-border payments into Africa will be delivered.

As diaspora remittances remain one of Nigeria's most stable sources of foreign exchange, financial institutions are increasingly prioritizing real-time, low-cost, and fully digital payment rails over legacy correspondent banking models.

Lenovo Taps Nvidia to Fast-Track AI Data Centers as It Pushes Deeper Into the AI Stack

China’s Lenovo, the world’s largest personal computer maker, is sharpening its push into artificial intelligence by partnering with U.S. chip giant Nvidia to help AI cloud providers deploy data centers at unprecedented speed.

The company announced the tie-up on Tuesday at the Consumer Electronics Show (CES) in Las Vegas, framing it as a direct response to surging global demand for AI infrastructure. Lenovo said the partnership is designed to cut the time it takes to bring AI data centers online from months to just weeks, a critical advantage as competition intensifies among cloud providers racing to support large-scale AI workloads.

Under the programme, Lenovo will combine its liquid-cooled hybrid AI infrastructure with Nvidia’s computing platforms, offering what it described as an end-to-end solution for building so-called AI factories. Liquid cooling has become increasingly important as AI chips consume more power and generate more heat, turning thermal management into a key constraint on scaling.

“Lenovo AI Cloud Gigafactory with NVIDIA sets a new benchmark for scalable AI factory design, enabling the world’s most advanced AI environments to be deployed in record-setting time,” Lenovo chief executive Yang Yuanqing said, speaking alongside Nvidia CEO Jensen Huang.

The language reflects a broader shift in the AI industry. As demand moves from experimentation to industrial deployment, speed of execution is becoming as important as raw computing power. Cloud providers are under pressure from customers building large language models and AI services to add capacity quickly, even as supply chains for chips, power, and cooling remain tight.

For Lenovo, the partnership also signals ambition beyond its traditional PC stronghold. While the company remains best known for laptops and desktops, it also has a sizeable server business and has been positioning itself as a full-stack AI infrastructure provider, spanning devices, data centers, and software.

That strategy was on full display at CES. Alongside the Nvidia announcement, Lenovo showcased an AI platform, a range of concept devices, and its first foldable smartphone under the Motorola brand, highlighting how it plans to weave AI across its entire product portfolio.

Yang also unveiled Qira, a personal AI system designed to operate across Lenovo and Motorola PCs, smartphones, tablets, and wearables. Unlike standalone assistants, Qira is intended to work continuously in the background and integrate third-party services. Lenovo said the system would be able to offer services from companies such as travel firm Expedia, suggesting a move toward an ecosystem model rather than a single-device assistant.

The approach mirrors a broader industry trend, where hardware makers are trying to differentiate themselves by embedding AI deeper into everyday use, rather than treating it as a bolt-on feature. By controlling both the devices and the AI layer that runs across them, Lenovo is aiming to lock users into its ecosystem while gathering data and usage patterns that can inform future products.

Lenovo also used the event to showcase concept AI glasses, placing them alongside companies such as Alibaba and Samsung Electronics, which are also exploring AI-powered wearables. In addition, it previewed an AI assistant wearable under “Project Maxwell,” designed to offer users real-time assistance, another signal of how AI is spilling beyond screens into ambient, always-on devices.

The Nvidia partnership sits at the center of this broader push. Nvidia's chips have become the backbone of AI computing globally, and aligning closely with the company gives Lenovo credibility with cloud providers looking for proven, scalable solutions. The tie-up also extends Lenovo's reach deeper into enterprise and cloud infrastructure by pairing Nvidia's platforms with Lenovo's hardware, integration, and global supply chain.

The announcement also comes at a time when geopolitical and supply-chain considerations loom large. A Chinese company working closely with a U.S. AI chip leader highlights how interdependent the global AI ecosystem remains, even as governments talk more openly about technological decoupling.

For Lenovo, however, execution remains the challenge. Competing in AI infrastructure means going head-to-head with established server and systems players, while also keeping pace with rapid advances in chips, cooling, and software. By promising faster deployment and tighter integration with Nvidia's platforms, Lenovo is aiming to become a practical partner for cloud providers under pressure to scale.

Lenovo’s message at CES is that it no longer wants to be seen only as a PC maker adapting to the AI age. With Nvidia at its side and a growing lineup of AI-driven devices and platforms, it is trying to claim a seat at the table of companies shaping how AI is built, deployed, and used.

Memory Takes the Lead as AI Fuels a Fresh Semiconductor Rally

Semiconductor stocks have started the year on a strong footing, with gains concentrated not in flashy logic chips but in a quieter, more fundamental corner of the industry: memory.

Shares of the world’s biggest memory makers have surged, reflecting how artificial intelligence is reshaping demand patterns across the chip sector and tightening supply in critical components.

South Korea’s SK Hynix and Samsung Electronics, the two largest memory chipmakers globally, are up 11.5% and 15.9% respectively so far this year. In the United States, Micron has climbed 16.3%. The rally comes as investors bet that AI-related demand, which drove chip markets through 2025, is not fading but intensifying.

At the heart of the move is memory’s central role in AI computing. Training and running large AI models on accelerators designed by companies such as Nvidia and AMD requires vast amounts of fast, high-capacity memory to move data efficiently to and from processors. As cloud providers and tech giants pour billions of dollars into AI data centers, memory has emerged as a bottleneck.

One segment has been especially important: dynamic random-access memory, or DRAM, used extensively in AI servers. Prices for DRAM surged sharply in 2025 as demand outpaced supply, and that pressure has not eased. Counterpoint Research expects memory prices to rise another 40% through the second quarter of 2026, extending what analysts increasingly describe as a full-blown cycle rather than a brief spike.

“The recent rally across the semiconductor space has been driven largely by the memory side of the market rather than logic chips,” Ben Barringer, head of technology research at Quilter Cheviot, said in an email to CNBC. “We’re seeing a combination of very strong demand from AI workloads and relatively constrained supply, particularly in high-bandwidth memory, which is essential for training and running large AI models.”

High-bandwidth memory, or HBM, has become one of the most sought-after components in the AI supply chain. It sits close to processors in advanced packaging configurations, enabling faster data transfer and lower power consumption. SK Hynix is widely seen as a leader in this area, supplying HBM used in some of Nvidia’s most powerful AI accelerators, a position that has strengthened its earnings outlook.

That backdrop explains the optimism heading into earnings season. Samsung is expected to report a 140% jump in fourth-quarter operating profit, according to LSEG estimates, marking a sharp turnaround after a prolonged downturn in its memory business. Micron’s earnings per share are forecast to rise more than 400% year-on-year in the December quarter, reflecting both higher prices and improving utilization rates.

The rally has spilled beyond memory producers themselves. Investors are increasingly positioning for a broader AI-driven expansion across the semiconductor value chain. Intel shares are up 7.6% year-to-date, while Taiwan Semiconductor Manufacturing Co., the world’s largest contract chipmaker, has gained 10%. Both companies manufacture a wide range of chips and are expected to benefit as customers ramp up spending on advanced semiconductors tied to AI workloads.

Equipment suppliers are also riding the wave. ASML, the Dutch firm whose lithography machines are essential for producing the most advanced chips, has seen its shares rise 15.2% this year. Bernstein on Sunday raised its price target on ASML from 800 euros to 1,300 euros, implying about 24% upside from Tuesday’s trading level.

“ASML stands to benefit enormously from the wave of capacity expansion planned for 2026 and 2027,” Bernstein analysts wrote, pointing specifically to memory. They said the company would gain “from the upcoming DRAM super cycle,” as manufacturers invest heavily in new fabs and more advanced production lines.

That link is crucial. As memory makers respond to tight supply and strong pricing by expanding capacity, demand for ASML’s tools rises in tandem. Advanced DRAM and HBM production requires cutting-edge manufacturing equipment, locking ASML deeper into the AI investment cycle.

Recent signals from industry executives have reinforced the bullish narrative. SK Hynix has pointed to the possibility of an extended HBM supercycle, suggesting demand could remain elevated well beyond a single year.

“Recent comments from SK Hynix pointing to a potential HBM supercycle have reinforced the idea that this is not just a short-term bounce, but a more structural shift linked to the ongoing build-out of AI infrastructure,” Barringer said. “That has helped improve sentiment across the sector, especially for companies with direct exposure to AI-driven memory demand.”

The emerging picture is one where memory, long treated as the most cyclical and volatile part of the semiconductor industry, has become central to the AI story. As long as companies continue to scale data centers and push larger, more data-hungry models, memory demand is likely to stay tight.

For investors, this has reframed how the semiconductor rally is being judged. This is not simply about who designs the smartest AI chips, but about who controls the components that make those chips usable at scale. So far in 2026, memory makers are winning that argument.

Sony Honda Defies EV Slowdown With Afeela Push at CES, Betting on Premium Tech as U.S. Market Cools

Sony Honda Mobility stepped onto the CES stage in Las Vegas with a message that cut against the prevailing mood in the U.S. auto industry: it is still pressing ahead with electric vehicles, even as many rivals retreat.

The electric vehicle joint venture between Sony Group and Honda Motor unveiled its latest prototype at the Consumer Electronics Show on Monday, reaffirming plans to bring its first production model, the Afeela 1, to U.S. customers. Chief executive Yasuhide Mizuno said deliveries in California are expected to begin late this year, with a broader U.S. rollout of a model based on the Afeela prototype targeted as early as 2028.

The appearance was striking in a year when CES featured fewer splashy automotive debuts. Several U.S. and global automakers have scaled back EV ambitions, delayed new launches, or paused production altogether, citing weakening demand, rising costs, and policy uncertainty. Against that backdrop, Sony Honda’s presence underscored a longer-term bet that the EV market, while cooling in the near term, will eventually reward companies that can differentiate through software and user experience.

The Afeela 1, priced from $89,900, positions the venture firmly in the premium segment. That pricing reflects both its technology-heavy pitch and the reality that mass-market EV adoption in the United States has proven more difficult than many automakers expected. Consumers have grown wary of high sticker prices, charging infrastructure gaps, and concerns about resale values, challenges that have become more visible as incentives have been reduced.

Policy changes under the Trump administration have added to the pressure. The rollback of EV-friendly measures, including the removal of a $7,500 federal tax credit, has made electric vehicles less attractive to price-sensitive buyers. Automakers say the shift has slowed showroom traffic and forced a reassessment of production volumes, especially for models aimed at the middle of the market.

Tariffs on imported vehicles and auto parts have further complicated the picture, raising costs at a time when companies are already struggling to protect margins. As a result, CES 2026 unfolded with a noticeably more cautious tone from carmakers, many of whom opted to focus on incremental technology updates rather than full vehicle launches.

Sony Honda Mobility’s strategy appears deliberately insulated from some of those pressures. Formed in 2022, the joint venture was built on a clear division of strengths: Honda contributes decades of experience in vehicle engineering, manufacturing, and safety, while Sony brings software, sensor technology, entertainment, and gaming ecosystems. The companies have repeatedly framed Afeela not simply as an electric car, but as a software-defined platform designed to evolve over time.

At CES, that positioning was again front and center. The Afeela concept emphasizes advanced driver-assistance systems, immersive in-car entertainment, and deep integration with digital services. Sony has previously highlighted the use of imaging and sensing technologies derived from its consumer electronics and gaming businesses, as well as the potential for over-the-air updates to continuously add features.

That focus reflects a broader shift in the auto industry, where software is increasingly seen as a key battleground. Traditional automakers are racing to build in-house software capabilities or partner with technology firms, while newer entrants argue that the vehicle is becoming another connected device. Sony Honda is attempting to bridge those worlds, betting that consumers are willing to pay a premium for a seamless digital experience as much as for horsepower or range.

Still, the road ahead is far from smooth. Entering the U.S. market at the luxury end puts Afeela in direct competition with established EV brands and legacy automakers that already have scale, charging partnerships, and brand loyalty. Delivering on promises around software reliability, autonomous features, and user experience will be critical, particularly as consumers grow more skeptical of grand claims following years of delays and missed targets across the industry.

Timing also remains a risk. While Sony Honda expects initial deliveries in California later this year, its broader ambition to roll out a production model by 2028 means navigating several years of uncertain demand, evolving regulations, and rapid technological change. Battery costs, charging standards, and consumer expectations could all shift significantly before then.

Even so, the decision to press ahead sends a signal. At a moment when many automakers are pulling back to reassess, Sony Honda Mobility is choosing visibility and momentum. Its CES unveiling suggests confidence that the current slowdown is cyclical rather than structural, and that a carefully positioned, technology-led EV can still find an audience in the United States.

Some analysts believe that whether the bet pays off will depend not just on policy or market conditions, but on execution. For now, Afeela stands as one of the few new EV programs still moving forward at a time when much of the industry is hitting the brakes.

The Most Undervalued Investment Right Now? Analysts Point to Ozak AI’s Fast Presale Growth and Near-$5.5M Raise

The crypto market, widely known for its volatility, shrinking liquidity, and declining confidence, is showing signs of a shift in trend. Ozak AI, a new AI token, is approaching the $5.5 million presale mark, and analysts are openly calling it one of the most undervalued early-stage investments of this year.

While top-cap assets continue to flatten, Ozak AI is attracting investors at a pace that hasn’t been seen in months — something analysts say is a direct reflection of how deeply the market is shifting toward AI-driven utility projects.

A Rare Signal in a Weak Market: Ozak AI’s Presale Grows Faster as Everything Else Slows

The broader market has been red for weeks. Bitcoin dominance is up, altcoins are down, and sentiment remains soft. Yet Ozak AI has done the opposite: its presale has surged past $5.41M and continues to accelerate. This reversal is precisely what captured analyst attention.

Presales typically slow down during downturns, but Ozak AI's has only strengthened, marking it as a project with independent momentum and real investor conviction.

Many analysts now classify Ozak AI as “severely undervalued relative to its narrative and tech stack”, given its starting price of just $0.014.

 

Why Analysts Believe Ozak AI Is Still Massively Undervalued

According to market strategists, several critical factors make Ozak AI stand out as a high-growth candidate:

  1. AI Narrative Strength + Real Utility

AI remains the strongest macro driver heading into 2026–2028. Ozak AI isn’t a hype token — it offers:

  • Prediction Agents (PAs)
  • Ozak Stream Network (OSN) 
  • EigenLayer AVS readiness
  • Arbitrum Orbit integration
  • Ozak Data Vaults for secure AI training datasets

This gives it a deeper technical foundation than most AI projects currently on the market.

  2. A Micro Valuation With Macro-Level Growth Potential

With under $5.5M raised, Ozak AI sits in the prime price-discovery zone. Tokens that grow from microcaps to midcaps often generate the highest multipliers; many of the biggest winners began small and multiplied 50×–150× once demand peaked. Ozak AI's entry point is still early, even after gaining significant traction.

  3. Strong Investor Rotation From Larger Tokens

Analysts are already tracking a rotation pattern:

  • Small BTC holders
  • Short-term ETH traders
  • Early SOL flippers

These groups are reallocating small amounts of capital into Ozak AI to capture the early-stage upside that blue chips can no longer match. This shift reinforces the belief that Ozak AI is not fully priced in, despite its fast-approaching $5.5M raise.

Why “Undervalued” Is Becoming the Consensus View

Most tokens approaching a $5.5M presale valuation are priced much higher — often at $0.05–$0.10 or more. Ozak AI, in contrast, sits at just $0.014.

This mismatch between:

  • presale traction,
  • market demand,
  • AI narrative strength, and
  • extremely low starting price

is exactly why analysts label it undervalued compared to its long-term potential. In their models, Ozak AI could very realistically reach:

  • $1 at listing,
  • $3–$5 in broader market expansion, and
  • $7–$10 in peak AI bull-cycle demand by 2027–2028.

Even the most conservative scenarios show it outperforming top-caps by a large margin.
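For readers who want to sanity-check these figures, the multiples implied by the analysts' targets can be worked out directly from the $0.014 presale price quoted above. The short Python sketch below is purely illustrative and uses only numbers already cited in this article; it is not a projection of its own.

```python
# Back-of-envelope calculation of the multiples implied by the price
# targets quoted above, relative to the $0.014 presale price.
# Illustrative only; all inputs come from figures cited in the article.

presale_price = 0.014  # USD, Ozak AI presale price

# Analyst target scenarios quoted above as (low, high) in USD
targets = {
    "at listing": (1.0, 1.0),
    "broader market expansion": (3.0, 5.0),
    "peak AI bull-cycle demand (2027-2028)": (7.0, 10.0),
}

for scenario, (low, high) in targets.items():
    low_x = low / presale_price
    high_x = high / presale_price
    if low == high:
        print(f"{scenario}: ${low:.2f} -> ~{low_x:.0f}x the presale price")
    else:
        print(f"{scenario}: ${low:.2f}-${high:.2f} -> "
              f"~{low_x:.0f}x-{high_x:.0f}x the presale price")
```

At the quoted presale price, the $1 listing target corresponds to roughly a 71× multiple, while the higher scenarios imply several hundred times the entry price.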

Approaching $5.5M: The Moment That May Trigger a Major Repricing Wave

Crossing the $5.5M presale threshold often marks the psychological shift from “emerging token” to “serious contender.” With Ozak AI only steps away, analysts expect bigger buyers entering, more aggressive community growth, stronger exchange listing interest, and a rapid acceleration in demand. This is the point where undervaluation starts to vanish and multipliers begin to form. Support from SINT, HIVE Intel, Weblume, Pyth Network, and others has also helped along the way.

Final Take: Undervalued Today, Potentially a Top Performer Tomorrow

Ozak AI has become the rare project that checks every box: fast funding, real utility, strong AI positioning, low entry price, high-confidence investor rotation and a rapidly increasing presale cap. Analysts aren’t calling it undervalued for hype — they’re pointing to the data.

With the presale nearing $5.5M, this could be the last phase before the market revalues Ozak AI toward its true potential.

For more information about Ozak AI, visit the links below:

Website: https://ozak.ai/

Twitter/X: https://x.com/OzakAGI

Telegram: https://t.me/OzakAGI