
Fintech Evolution Notes a “Centralization through Democratization” Pattern


In many fintech platforms, especially super-apps, embedded finance providers, neobanks, or centralized digital banking ecosystems, customer journeys and revenue streams become highly integrated.

A single app or platform might handle payments, lending, investments, insurance, subscriptions, rewards, and more. This creates centralization of user data, identity, transactions, and interactions under one roof (or one data layer). While this centralization drives better user experience, faster innovation, and higher retention, it often distorts or skews revenue attribution in several ways.

Multi-product / bundled revenue makes clean source attribution difficult.

A user might sign up via a marketing campaign for free P2P transfers, but later activates high-margin products like buy-now-pay-later (BNPL), crypto trading, or premium subscriptions. The initial acquisition channel gets credit for the sign-up, but the real revenue often comes from downstream cross-sells or usage-based fees that happen months later. Last-click or simple models massively under- or over-credit channels.
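To make the under- and over-crediting concrete, here is a minimal sketch (the journey, channel names, and revenue figures are hypothetical) showing how single-touch models assign all downstream revenue to one interaction:

```python
# Hypothetical journey: a user acquired via a free-transfer promo who later
# generates most revenue from BNPL fees and a premium subscription.
journey = ["promo_campaign", "app_notification", "email_upsell"]
revenue_events = {"p2p_transfer": 0.0, "bnpl_fees": 120.0, "premium_sub": 60.0}

total_revenue = sum(revenue_events.values())

# Single-touch models give 100% of the credit to exactly one touchpoint.
first_click = {journey[0]: total_revenue}
last_click = {journey[-1]: total_revenue}

print("first-click credit:", first_click)  # all $180 goes to the acquisition promo
print("last-click credit:", last_click)    # all $180 goes to the final email
```

Neither view reflects that the promo, the notification, and the email each played a role in the eventual high-margin revenue.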

Platform-level economics obscure channel / partner contribution. In centralized fintechs (think Revolut, Nubank, Chime, or super-apps like WeChat Pay / Alipay), revenue frequently comes from interchange, float, lending spreads, premium tiers, or data-driven upsell — not always directly tied to a specific paid ad click, affiliate referral, or organic search.

When everything flows through one centralized ledger / identity system, it’s hard to trace incremental revenue back to fragmented marketing or partnership efforts.

Data silos vs. over-centralization paradox. Ironically, extreme centralization without strong governance can create new attribution problems. When all data lives in one place but definitions of “customer,” “conversion,” “lifetime value,” or “attributable revenue” aren’t consistently governed across teams (marketing, product, finance), models still disagree.

Recent discussions highlight how fragmented governance — even inside a centralized system — leads to conflicting attribution results, as teams interpret the same unified data differently.

Winner-takes-all dynamics amplify the skew. Research on fintech evolution notes a “centralization through democratization” pattern: digital tools lower barriers, but scale advantages and network effects lead to concentrated market power among a few large platforms.

These giants capture disproportionate revenue, but attributing that revenue to specific digital channels, features, or partners becomes opaque because so much value accrues at the platform level rather than at individual touchpoints. In practice, this means marketing / growth teams struggle to prove ROI as budgets get cut or misallocated.

Product teams over-invest in features that look high-engagement but drive low incremental revenue. Finance / investor reporting shows strong top-line growth but unclear unit economics or channel profitability.

Many fintechs are countering this by moving toward more sophisticated multi-touch attribution, incrementality testing, unified customer data platforms with strong governance, and ML-driven behavioral attribution that credits downstream revenue events more intelligently — rather than relying on simplistic digital-first models.
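Incrementality testing, one of the countermeasures mentioned above, can be sketched as a simple holdout comparison: withhold a campaign from a random control group and measure the revenue difference per user (all figures below are hypothetical):

```python
# Incrementality test: compare revenue per user between an exposed (treatment)
# group and a randomly held-out control group that never saw the campaign.
def incremental_lift(treatment_revenue, treatment_users,
                     control_revenue, control_users):
    """Return (incremental revenue per user, relative lift vs. control)."""
    t_rpu = treatment_revenue / treatment_users
    c_rpu = control_revenue / control_users
    return t_rpu - c_rpu, (t_rpu - c_rpu) / c_rpu

# Hypothetical campaign: exposed users generate $5.00 each, control users $4.00,
# so only $1.00 per user is actually incremental to the campaign.
inc_rpu, lift = incremental_lift(50_000, 10_000, 36_000, 9_000)
print(f"incremental revenue per user: ${inc_rpu:.2f}")  # $1.00
print(f"lift over control: {lift:.0%}")                 # 25%
```

The point of the design is that revenue the users would have generated anyway is subtracted out, which is exactly what click-based attribution cannot do.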

The centralization that makes digital fintech so powerful is exactly what makes precise revenue attribution harder than in traditional, siloed banking or pure e-commerce. It’s a feature, not a bug — but one that requires mature analytics and governance to manage.

Multi-touch attribution (MTA) is a marketing analytics approach that assigns credit to multiple touchpoints (interactions) a customer has with your brand across their entire journey toward a conversion — such as a sign-up, deposit, purchase, subscription activation, or revenue-generating event.

Unlike single-touch models (e.g., first-click or last-click attribution), which give 100% of the credit to just one interaction, MTA recognizes that modern customer journeys — especially in digital fintech — involve many steps across channels like paid ads, organic search, email, social media, referrals, app notifications, content, and in-app features.

MTA distributes credit fractionally across these touchpoints to show which ones truly drive value. This is particularly relevant in centralized digital fintech platforms, where users often enter via low-intent channels (e.g., a free transfer promo) but generate most revenue later through high-margin products (lending, premium tiers, investments).

MTA helps avoid over-crediting the “last click” while revealing the full role of earlier nurturing touchpoints.
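The fractional credit assignment MTA performs can be illustrated with two common rule-based models, linear and position-based (U-shaped). The touchpoint names and the 40% edge weight below are illustrative assumptions, not a production model:

```python
def linear_credit(touchpoints, revenue):
    """Linear model: split revenue equally across all touchpoints."""
    share = revenue / len(touchpoints)
    return {t: share for t in touchpoints}

def position_based_credit(touchpoints, revenue, edge_weight=0.4):
    """U-shaped model: 40% each to the first and last touch,
    with the remaining 20% split among the middle touchpoints."""
    credit = {t: 0.0 for t in touchpoints}
    if len(touchpoints) == 1:
        credit[touchpoints[0]] = revenue
        return credit
    credit[touchpoints[0]] += revenue * edge_weight
    credit[touchpoints[-1]] += revenue * edge_weight
    rest = revenue * (1 - 2 * edge_weight)
    middle = touchpoints[1:-1]
    if middle:
        for t in middle:
            credit[t] += rest / len(middle)
    else:  # only two touchpoints: split the remainder between them
        credit[touchpoints[0]] += rest / 2
        credit[touchpoints[-1]] += rest / 2
    return credit

journey = ["paid_ad", "organic_search", "email", "in_app_offer"]
print(linear_credit(journey, 200.0))
print(position_based_credit(journey, 200.0))
```

Real deployments typically go further, using time-decay weights or ML-driven (data-driven) models that learn the weights from conversion data, but the core idea is the same: revenue is divided across the journey instead of assigned to a single click.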

Upcoming Tekedia Programs: AI, Business, and Investment Learning Opportunities


Greetings! We are pleased to announce the upcoming start dates for several Tekedia Institute programs:

Tekedia AI Technical Lab – Begins Saturday, March 14, 2026. Register here.

Tekedia Mini-MBA – Registration has opened for the next edition starting in June 2026. Take advantage of the early bird discounts by registering here.

Python Coding with AI for Agentic AI Development – If you have completed Tekedia AI Lab, this program will help you conceptualize solutions and build Python-based AI agents to solve real-world problems. The course begins on April 11, 2026. Register here.

In addition, we offer several other programs including Tekedia AI in Business, Tekedia Startup Masterclass, Tekedia Investment and Portfolio Management, and more. You can explore the full list of Tekedia programs through this link.

Memory Chip Crisis: Executives Say A Decades-Old Boom-And-Bust Cycle May Finally Be Breaking Down


The global race to build artificial intelligence infrastructure is reshaping the economics of the memory chip industry, driving record share gains for manufacturers and prompting executives to say a decades-old boom-and-bust cycle may finally be breaking down.

Shares of Micron Technology have surged more than 370% over the past year as demand for AI-related memory accelerates. Meanwhile, SanDisk — which returned to the public market in February last year after being spun out of Western Digital — has soared more than 1,100%, highlighting investor enthusiasm for companies tied to the AI supply chain.

For much of the past three decades, memory manufacturers operated under one of the semiconductor industry’s most volatile cycles. Prices for DRAM and NAND storage would spike during supply shortages, prompting producers to rapidly expand manufacturing capacity. The resulting oversupply would then drive prices down, triggering sharp downturns before the next recovery.

Executives across the technology sector now say artificial intelligence is fundamentally altering that dynamic.

“We will continue to raise prices because the industry will continue to raise prices,” said Antonio Neri, chief executive of Hewlett Packard Enterprise. “There is not enough supply for demand.”

The shift is being driven by the unprecedented computing requirements of modern AI systems. Training large language models and generative AI platforms requires vast clusters of processors working simultaneously, supported by massive pools of ultra-fast memory. That architecture is dramatically more memory-intensive than traditional computing environments used for enterprise software, personal computers, or smartphones.

High-bandwidth memory, commonly known as HBM, has emerged as one of the most critical components in AI hardware. The technology allows chips to access data far faster than conventional DRAM, making it essential for training large AI models and running advanced inference workloads.

Demand for HBM has surged so rapidly that technology companies are rushing to secure long-term supply contracts.

SK Hynix, one of the world’s largest memory manufacturers and a major supplier of HBM, said the industry is undergoing structural changes as customers increasingly prefer multi-year supply agreements.

“The company’s customers, including hyperscalers, have increasingly preferred long-term contracts over the one-year agreements that were more common in the past,” an SK Hynix spokesperson said.

Micron Technology has reported a similar shift, telling CNBC that customers are now willing to sign long-term agreements to lock in supply as competition intensifies for AI hardware components.

Those customers include some of the world’s largest technology companies, often referred to as hyperscalers because of the massive scale of their cloud computing infrastructure.

Executives say these companies are reserving memory capacity years in advance.

On the latest earnings call for Broadcom, Chief Executive Hock Tan said the company has already secured supply commitments for key components through 2028 as demand for AI chips and systems accelerates.

Technology giants building their own AI hardware are also confronting supply constraints. Meta Platforms on Wednesday unveiled a new internally designed AI chip as part of its push to expand computing capacity for artificial intelligence workloads.

But even as the company ramps up hardware development, it remains concerned about securing sufficient memory.

“We’re absolutely worried about HBM supply,” said Yee Jiun Song, vice president of engineering at Meta. “But we think that we have secured our supply for what we’re planning to build out.”

The pressure on memory supply is being amplified by a massive wave of capital spending across the technology industry. Major cloud providers such as Amazon, Microsoft, Alphabet, and Meta are investing hundreds of billions of dollars in AI data centers to support the growing demand for generative AI services.

Each of those facilities requires enormous volumes of memory chips to feed data to powerful processors and graphics chips that train and run AI models.

As hyperscalers absorb increasing amounts of available supply, analysts say the balance of the memory market is shifting away from consumer electronics. Manufacturers of smartphones, PCs, and other consumer devices are finding themselves competing with data center operators for the same components, often at higher prices.

An executive at Seagate Technology told the South China Morning Post that memory price increases could become “the new normal” for the next several years. The long lead times required to expand semiconductor manufacturing capacity are reinforcing those expectations.

Building advanced memory fabrication plants costs tens of billions of dollars and can take several years to complete, meaning supply cannot quickly adjust to sudden surges in demand. As a result, industry executives believe meaningful relief from supply constraints may not arrive until at least 2027, when new facilities currently under construction begin operating at full scale.

The emergence of long-term contracts is also changing how the memory industry manages supply and pricing. Historically, most memory was sold on short-term contracts or even spot markets, leaving prices highly sensitive to shifts in demand.

Multi-year agreements with hyperscalers, by contrast, provide greater revenue visibility for manufacturers while ensuring customers receive priority access to scarce components.

That change could smooth the dramatic price swings that once defined the sector. For investors, the sharp rally in memory stocks is an indication that markets are increasingly convinced the industry is entering a new phase — one powered by sustained demand from artificial intelligence rather than the cyclical consumer electronics markets that dominated the past.

Nvidia invests $2bn in AI cloud firm Nebius as chip giant deepens control of the AI infrastructure boom


Nvidia said on Wednesday it will invest $2 billion in artificial intelligence cloud provider Nebius, extending the chipmaker’s aggressive push to shape the infrastructure powering the global AI boom.

A filing with the U.S. Securities and Exchange Commission showed Nvidia agreed to purchase shares representing roughly an 8.3% stake in the Amsterdam-based company at $94.94 per share. Nebius, which is listed on the Nasdaq, surged nearly 14% following the disclosure, trading around $109.72 in afternoon dealings.

The deal underscores how Nvidia, now widely viewed as the central supplier of hardware behind artificial intelligence, is increasingly investing directly in companies that build and operate AI computing infrastructure.

The investment is believed to reflect a broader strategy in which Nvidia has been embedding itself across the rapidly expanding AI ecosystem—from semiconductor manufacturing to data centers and cloud platforms that deploy its chips.

Demand for AI computing power has surged since the explosion of generative AI tools, forcing cloud providers and startups to build massive data-center networks capable of running advanced machine-learning models.

Nvidia’s graphics processing units (GPUs) remain the dominant chips used to train and run those systems, giving the company extraordinary leverage over the industry’s supply chain.

By investing in infrastructure providers such as Nebius, Nvidia can both accelerate the deployment of its hardware and ensure the continued expansion of AI computing capacity globally.

Nebius’ Massive Data-Center Expansion Plans

Nebius said it plans to deploy more than 5 gigawatts of data-center capacity by 2030, a scale that illustrates the enormous electricity demands associated with modern AI workloads. That level of capacity is roughly equivalent to the electricity consumption of more than four million U.S. households, highlighting the rapidly growing energy footprint of artificial intelligence infrastructure.

To support the build-out, Nebius has significantly increased its spending on infrastructure. The company reported capital expenditure of $2.1 billion in the December quarter, up sharply from $416 million a year earlier, as it accelerates expansion of its computing capacity.

Rise Of The “Neocloud” Sector

Nebius belongs to a new category of AI infrastructure providers sometimes described as “neocloud” companies. Unlike traditional cloud giants that serve a broad mix of industries and computing workloads, these firms focus almost entirely on high-performance infrastructure optimized for artificial intelligence.

Other players in the segment include CoreWeave, which has rapidly gained prominence through multibillion-dollar deals supplying AI computing power to large technology companies.

Nebius and its peers are positioning themselves as specialized providers capable of delivering massive GPU clusters to train advanced models.

That model has already attracted major clients. Nebius has secured large contracts with U.S. technology companies, including a $17 billion infrastructure agreement with Microsoft and a $3 billion deal with Meta Platforms, underscoring the scale of investment flowing into AI computing.

Nvidia’s Expanding Investment Portfolio

The Nebius investment adds to a growing list of high-profile deals through which Nvidia is helping finance the infrastructure underpinning artificial intelligence.

Last year, the company agreed to deploy at least 10 gigawatts of AI systems for OpenAI and later announced a $30 billion investment in the startup, deepening its role in the development of advanced AI models.

Such moves illustrate how Nvidia is evolving beyond a chip supplier into a key financial and strategic backer of companies building the next generation of AI platforms.

However, Nvidia’s growing web of investments has also raised concerns among some analysts and investors.

Many of the companies receiving funding from Nvidia are also major buyers of its chips, creating what critics describe as a form of circular financing in which Nvidia effectively helps fund the infrastructure that purchases its hardware.

Supporters argue the strategy accelerates the rollout of global AI capacity and ensures that computing supply keeps pace with surging demand.

But skeptics warn it could concentrate too much influence over the AI ecosystem in the hands of a single supplier.

Nvidia’s chief executive Jensen Huang framed the Nebius partnership as part of the next phase of artificial intelligence development.

“Nebius is building an AI cloud designed for the agentic era,” Huang said in a statement, referring to a new generation of AI systems capable of performing autonomous tasks rather than simply responding to prompts.

The partnership, he added, will help scale the company’s infrastructure to meet “surging global demand for intelligence.”

The deal highlights the massive financial and energy investments now required to support the AI revolution. Global spending on AI data centers is expected to reach hundreds of billions of dollars in the coming years as technology companies race to build the computing power required for increasingly sophisticated models.

For Nvidia, whose chips sit at the center of that expansion, strategic investments like the Nebius deal help reinforce its position not only as the industry’s dominant hardware supplier but also as a key architect of the infrastructure shaping the future of artificial intelligence.

Iran Says Oil Price Will Hit $200 as Nigeria Bets on Dangote Refinery to Shield Economy


Escalating tensions in the Middle East have reignited fears of a global oil shock after an Iranian military official warned that crude prices could surge to $200 per barrel as the U.S. continues its military action in the region.

A spokesperson for Iran’s Khatam al-Anbiya Central Headquarters said Tehran may abandon its strategy of retaliatory strikes in favor of sustained attacks on adversaries, including actions that could target oil shipments.

“We won’t allow even one liter of oil to reach the U.S., Zionists (Israel) and their partners. Any vessel or tanker bound to them will be a legitimate target,” said Ebrahim Zolfaqari.

The warning comes as Nigeria insists it will maintain market-based fuel pricing even if global oil markets become more volatile.

Zolfaqari warned that any escalation threatening regional stability would have immediate consequences for energy markets.

“Get ready for the oil barrel to be at $200 because the oil price depends on the regional security which you have destabilized,” he said.

Global energy supply was caught in the crossfire as tensions spread to the Strait of Hormuz, a narrow shipping corridor through which roughly a fifth of global seaborne oil passes. The disruption to tanker traffic through the waterway triggered a supply shock that has sent energy prices sharply higher.

Energy markets have historically reacted swiftly to instability in the Persian Gulf. Analysts say even the perception of risk to the shipping route tends to push up crude prices as traders price in potential shortages and higher shipping insurance costs.

A surge in oil prices sends ripples across the global economy by raising transportation costs, fueling inflation, and tightening financial conditions.

Nigeria Sticks With Market Pricing

Other nations have been taking measures to curtail the impact of rising energy costs, which pose risks of inflation and broader economic downturn. Despite the risks, Nigeria’s government said it would not reintroduce fuel price controls or subsidies.

Nigeria’s Minister of Finance and Coordinating Minister of the Economy, Wale Edun, said the administration of President Bola Tinubu remains committed to allowing market forces to determine petrol prices.

“Rather than now reverting back and taking a backward step, we will look at every other measure that can help the cost of living of Nigerians without resorting to non-market pricing,” Edun said during an interview on Politics Today aired on Channels Television.

The government removed Nigeria’s long-standing petrol subsidy in 2023, arguing that the policy had drained public finances and distorted the energy market.

“It is the market price. That is what has been instilled by Mr. President that was missing for so long, market pricing of petroleum products,” Edun added.

Officials say one of Nigeria’s main buffers against global shocks is the country’s expanding domestic refining capacity.

Edun said the government has moved to increase crude supply to the Dangote Refinery, the 650,000 BPD privately owned refinery built by billionaire industrialist Aliko Dangote. Authorities believe the refinery’s operations could help stabilize fuel availability in Nigeria by reducing dependence on imported refined products.

“At this time, the resilience that the Nigerian economy has is coming largely from the fact that we do have that investment in refining,” Edun said.

He added that the expansion of local refining capacity has improved Nigeria’s ability to withstand external shocks that have forced some countries to ration fuel supplies during energy crises.

Nigeria historically exported crude oil while importing most of its refined fuel due to inadequate domestic refining capacity, a structural weakness that often amplified the impact of global price swings.

However, energy analysts say local refining capacity may not fully insulate Nigeria from the effects of a major global oil shock.

Even if fuel is refined domestically, the cost of crude oil—the key input—still tracks global market prices. If crude prices surge, the cost of producing petrol locally will also rise, meaning consumers could still face sharply higher pump prices.

The impact could be particularly severe in Nigeria, where unreliable electricity supply forces households and businesses to depend heavily on petrol and diesel generators.

In cities and rural communities alike, millions of Nigerians rely on generators to power homes, shops, and small businesses amid chronic outages from the national grid. If crude prices were to approach $200 per barrel, analysts warn that fuel costs could rise to levels many households and businesses would struggle to afford.

In such a scenario, the economic consequences could extend beyond transport costs, potentially affecting food prices, manufacturing output, and the survival of small enterprises that depend on generator power to operate.

For Nigeria’s government, the situation presents a difficult balancing act: maintaining economic reforms designed to strengthen public finances while cushioning households from the potential fallout of global energy turmoil.

But Edun said direct intervention in fuel pricing would only be considered under extreme circumstances.

“Normally, given the policies and philosophy of this government, it would always have to be a last resort,” he said.