
Perp DEXes Hit $70B Single Day Volume 


Perpetual decentralized exchanges (Perp DEXes) recorded over $70 billion in daily trading volume amid a significant market sell-off that saw notable deleveraging in BTC, ETH, SOL, and other major assets.

This marks the second-highest single-day volume ever for the sector. The all-time high remains October 10, 2025, often called the “10/10” or “10/11” flash crash, when volumes hit around $101 billion. Some earlier reports cited ~$78 billion for on-chain perps specifically, but aggregated figures covering the broader event pushed the total higher.

That day’s extreme volatility resulted in over $19 billion in total liquidations across crypto markets, the largest single-day wipeout in history, driving massive forced unwinds and trading activity. For context, on February 5 Hyperliquid led with ~$24.7 billion, roughly 31% market share.

Aster followed at ~$10–11.5 billion, edgeX at ~$8.7 billion, and Lighter at ~$7.5 billion. These top platforms accounted for a majority of the total.
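
Treating the sector total as exactly $70 billion (the actual aggregate was somewhat higher), a quick back-of-envelope check supports the concentration claim; the figures below are the approximate volumes cited above:

```python
# Back-of-envelope check using the approximate volumes cited above (USD billions).
platform_volumes = {
    "Hyperliquid": 24.7,
    "Aster": 11.0,      # midpoint of the ~$10-11.5B range
    "edgeX": 8.7,
    "Lighter": 7.5,
}
total_volume = 70.0     # reported sector-wide single-day volume ("over $70B")

top_four = sum(platform_volumes.values())
print(f"top four combined: ${top_four:.1f}B "
      f"(~{top_four / total_volume:.0%} of the reported total)")
```

Even with that conservative denominator, the four named venues cover roughly three quarters of the reported flow.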

This surge highlights how Perp DEXes, especially on-chain venues like Hyperliquid, have matured and captured significant leveraged trading flow, even during downturns, as traders seek decentralized alternatives amid volatility.

Flash crashes in Decentralized Finance (DeFi) refer to sudden, extreme price drops in cryptocurrency assets, often triggered by liquidity shortages, leveraged positions, or market manipulations, followed by rapid recoveries.

These events expose the interplay between high leverage, automated systems, and fragmented liquidity across centralized exchanges (CeFi) and DeFi protocols. Unlike traditional finance, where circuit breakers might halt trading, crypto markets often amplify crashes through cascading liquidations and oracle dependencies.

October 10-11, 2025 (The “10/10” Collapse): This remains the largest crypto flash crash, with over $19 billion in liquidations across CeFi and DeFi platforms. It began with a localized liquidity failure on Binance, where the stablecoin USDe traded as low as $0.65 due to a large sell order overwhelming the order book.

This triggered automated deleveraging, margin calls, and a feedback loop that devalued collateral, leading to widespread forced selling. Bitcoin dropped 14%, while altcoins fell more steeply, some by 60-80%.

The event highlighted how single-venue pricing can propagate risks to DeFi, where protocols like lending markets rely on oracles that may not account for such dislocations. A recent sell-off saw Perp DEXes hit $70 billion in trading volume, the second-highest ever, amid deleveraging in major assets like BTC, ETH, and SOL.

This caused brief but severe price wicks, liquidating “safe” borrow positions in DeFi lending platforms despite quick recoveries. The 2017 Ethereum flash crash on GDAX saw ETH drop from $300 to $0.10 in minutes due to stop-loss and margin liquidations.

More recently, Venus (XVS) plunged 30% in 10 minutes in January 2026, underscoring ongoing volatility in DeFi tokens. Flash crashes exacerbate DeFi’s inherent vulnerabilities, leading to widespread economic and structural damage.

Cascading Liquidations and User Losses

High leverage in protocols such as Aave’s lending markets and in perpetuals amplifies drops. In October 2025, $16.8 billion in long positions were wiped out, with Hyperliquid alone handling $10.3 billion. This creates a domino effect: falling prices devalue collateral, triggering more liquidations, which further depress prices.
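
To make the domino effect concrete, here is a minimal, self-contained sketch; every number in it is hypothetical and the price-impact model is deliberately crude:

```python
# Minimal, illustrative liquidation-cascade loop (all numbers hypothetical).
# Falling prices push leveraged positions past their liquidation threshold;
# the forced selling from those liquidations pushes the price down further,
# tipping the next positions over.

positions = [          # (collateral_units, debt_usd) for hypothetical accounts
    (5.0, 3_500.0),
    (10.0, 6_100.0),
    (20.0, 9_000.0),
]
price = 800.0          # collateral price after an initial shock (was 1,000)
liq_threshold = 0.80   # liquidate when debt exceeds 80% of collateral value
price_impact = 0.01    # assumed price drop per unit of collateral force-sold

active = list(positions)
for step in range(10):
    liquidated = [p for p in active if p[1] > p[0] * price * liq_threshold]
    if not liquidated:
        break
    sold_units = sum(c for c, _ in liquidated)
    price *= 1 - price_impact * sold_units   # forced selling depresses price
    active = [p for p in active if p not in liquidated]
    print(f"step {step}: liquidated {len(liquidated)} position(s), price now {price:.2f}")
```

Run as written, the first liquidation’s selling is what tips the second position over, which is exactly the cascade described above.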

Users face permanent capital loss, with sophisticated participants such as basis traders hit hardest when their hedges fail through auto-deleveraging. In DeFi, this can lead to roughly $180 million in liquidation penalties on platforms like Aave if price gaps persist.

Sharp swings increase risks of runs on stablecoins or protocols, as seen in Terra’s 2022 collapse attributed to Anchor protocol runs. Flash loans enable attackers to borrow massive sums instantly, manipulate oracles, and exploit compositions of protocols, leading to “flash crash for cash” scenarios where millions are drained.

Out of 10 multi-protocol attacks in 2020, nine were flash-loan funded oracle manipulations. DeFi’s reliance on oracles assumes fair value from secondary markets, but crashes reveal flaws in tokens like USDe or LRTs, where value depends on reserves and redemptions.
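
As a rough illustration of why naive spot-price oracles are fragile, the sketch below (hypothetical numbers, not any particular protocol’s oracle design) compares a raw spot read against a simple time-weighted average during a one-block price distortion of the kind a flash loan can create:

```python
# Illustrative comparison of a naive spot-price oracle and a simple TWAP
# during a single-block manipulation (all numbers hypothetical).

prices = [100.0] * 29 + [40.0]        # one manipulated observation at the end

spot_read = prices[-1]                # a naive spot oracle reports the last trade
twap_read = sum(prices) / len(prices) # 30-observation time-weighted average

print(f"spot oracle: {spot_read:.2f}")  # 40.00 -> collateral suddenly looks 60% cheaper
print(f"TWAP oracle: {twap_read:.2f}")  # 98.00 -> the single bad print is mostly averaged out
```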

Blockchain congestion delays transactions, incentivizing pre-emptive runs. Liquidity mismatches and leverage create fire-sale risks, potentially spilling to traditional finance via stablecoin reserves.

Centralized dependencies, such as one exchange dominating price discovery, undermine decentralization, turning DeFi into “dependency finance.” Crashes wipe out market makers, by up to 33% in October 2025, delay recoveries, and erode trust, leading to capital outflows and reduced participation.

They also expose smart contract vulnerabilities, increasing exploit frequency. While destructive, flash crashes serve as stress tests, fostering resilience and innovation: DeFi platforms like Uniswap handled $9 billion in volume with zero downtime in recent crashes, while Aave automated $180 million in liquidations efficiently.

Crashes like October 2025 deleveraged overleveraged markets, paving the way for healthier rallies, much as post-2021 drawdowns preceded booms. Over 90% of positions in some protocols survived, validating designs like crvUSD’s peg-stability mechanism.

These events highlight the need for better oracles, integration of proof-of-reserves and risk layers, circuit breakers, and adaptive designs that mitigate cascades. They push for real-time attestations, application-specific pricing, and shared standards among issuers and integrators.
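
A circuit breaker of the kind referenced above could be as simple as the following sketch, an assumed design rather than any protocol’s actual implementation:

```python
# Minimal sketch of an application-level circuit breaker (an assumed design,
# not taken from any specific protocol): pause liquidations when the oracle
# price moves more than a set percentage within a short window.

MAX_MOVE = 0.15   # a 15% move inside the window triggers a pause
WINDOW = 5        # number of recent price observations to compare

def should_pause(recent_prices):
    window = recent_prices[-WINDOW:]
    move = abs(window[-1] - window[0]) / window[0]
    return move > MAX_MOVE

history = [100.0, 101.0, 99.0, 97.0, 78.0]   # a sudden ~22% drop within the window
print(should_pause(history))                  # True -> halt liquidations until prices stabilize
```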

Post-crash, DeFi has matured, capturing more leverage trading flow via on-chain alternatives. Quick rebounds allow savvy traders to capitalize, and crashes weed out weak protocols, strengthening the ecosystem overall.

Flash crashes inflict immediate pain through liquidations and volatility but ultimately refine DeFi by exposing flaws and incentivizing robust, decentralized solutions. As the sector scales, addressing leverage and oracle risks will be crucial to prevent systemic failures.

Tesla Maintains Foothold in China with Modest January Delivery Growth Amid Industry Slowdown and New Regulatory Headwinds


Tesla’s China-produced electric vehicle deliveries showed modest resilience in January 2026, rising 9% year-on-year to 69,129 units from 63,238 in January 2025, according to data published by the China Passenger Car Association (CPCA) on Wednesday.

The figures, which reflect shipments from Tesla’s Shanghai Gigafactory for both domestic sales and exports, placed Tesla third among major Chinese EV makers, behind BYD’s 205,518 units and Geely’s 124,252 vehicles.

While the increase marks a positive contrast to the broader market’s slowdown, analysts caution that the numbers primarily reflect production and export dynamics rather than a clear resurgence in domestic demand. Tesla’s Shanghai plant produces the Model 3 and Model Y for the Chinese market as well as for export to Europe, Asia-Pacific, and other regions.

New registrations—a closer proxy for actual sales—showed only slight growth in Europe in January, per Reuters tracking, with no clear domestic uptick reported. The performance comes against a backdrop of intensifying challenges in China’s EV market. New energy vehicle (NEV) sales, including battery-electric and plug-in hybrid models, grew just 1% year-on-year in January—the fourth consecutive month of decelerating growth, according to CPCA data.

The slowdown reflects a combination of policy shifts, economic pressures, and fierce domestic competition. A key policy change took effect on January 1, 2026, when China reinstated a 5% purchase tax on NEVs after more than a decade of full exemption from the 10% vehicle purchase tax. This reversal has prompted some consumers to delay purchases, with analysts expecting further moderation in early 2026 demand.

“We see increasing pressure on China’s auto market in 2026, driven by a combination of policy and competitive factors,” Helen Liu, partner at Bain & Company, noted.

Tu Le, founder of Sino Auto Insights, said: “We know [EV sales will] slow, we just don’t know by how much. We’ll know much better after the first quarter is over.”

Tesla has faced particularly stiff competition from local rivals offering more affordable models. The base Model 3 sedan starts at around 235,500 yuan ($33,943), nearly three times the price of BYD’s Seal base model at approximately 79,800 yuan. To counter this, Tesla has introduced aggressive incentives on its Chinese website, including five-year 0% interest loans and seven-year ultra-low interest rate loans for orders placed before February 28, 2026.

The price war has squeezed margins across the industry, with Abby Tu, principal research analyst at S&P Global Mobility, noting: “We have [had] really intense price wars that have gone on, although the government and industry have called on automakers to not engage with aggressive pricing strategies.”

Despite these efforts, Tesla’s full-year 2025 China-produced EV sales fell 4.8%, making it one of only two major manufacturers to report an annual decline.

Regulatory changes add further complexity. On Monday, February 2, 2026, China’s Ministry of Industry and Information Technology (MIIT) announced that, effective January 1, 2027, all vehicles sold in China must feature interior and exterior mechanical door releases. The rule follows high-profile incidents in the U.S. and China where EV occupants could not escape burning vehicles due to power failures in electronic door-locking systems.

Flush, concealed door handles—popularized by Tesla as a signature design element—will need modification or replacement. Tu Le described the regulation as a “decent sized headache” for Tesla, given the brand’s reliance on minimalist, flush-handle designs. However, he noted that most Chinese automakers are unlikely to be caught off guard.

“When regulators were drafting the new regulations, they consulted OEMs and industry experts intensively,” he said.

Tesla will have a runway to adapt, but the change could require design adjustments and additional costs. Despite these headwinds, Tesla’s Shanghai Gigafactory continues to serve as a major export hub, supporting deliveries to Europe and other markets. The modest January growth may partly reflect export demand and production scheduling rather than pure domestic recovery.

Geely, which climbed to second place in China’s NEV market, reported strong performance across its Galaxy and Zeekr brands, while Aito (Huawei-backed), Leapmotor, and Nio posted year-on-year delivery gains. Broader economic pressures, including a prolonged real estate slump and weak consumer confidence, continue to weigh on discretionary spending.

The auto sector, supporting over 30 million jobs, remains a critical economic pillar. Fitch Ratings economist Alex Muscatelli noted that while autos represent only 3.7% of fixed asset investment (versus real estate’s 23%), further deterioration could prompt Beijing to reinstate subsidies if Q1 data confirms a deeper slowdown.

China’s top leaders will release 2026 policy targets at the annual parliamentary meeting in March. With the Lunar New Year holiday contributing to volatile early-year figures, the full impact of the tax reinstatement and competitive pressures will become clearer later in the year.

How to Build a Profitable Marketplace in Africa: Trust Infrastructure, Payments, and Lifecycle Email Marketing


The African ecommerce landscape presents one of the most compelling growth opportunities of our generation. With over 500 million internet users and a rapidly expanding middle class, entrepreneurs who crack the marketplace model here will build transformative businesses. But success demands more than ambition – it requires understanding the unique infrastructure challenges that separate thriving platforms from failed experiments.

The Trust Deficit: Africa’s Hidden Marketplace Challenge

Before discussing technology stacks or marketing automation, let’s address the elephant in the room: trust remains the primary barrier to ecommerce adoption across African markets.

Unlike mature markets where buyer protection, reliable logistics, and established payment rails have existed for decades, African marketplaces must build this infrastructure from scratch. Your platform isn’t just facilitating transactions – it’s creating the very conditions that make commerce possible.

Consider what trust infrastructure actually means in practice, especially given that persistent logistical inefficiencies and limited consumer trust in online transactions continue to shape African e-commerce:

Verification systems that confirm seller identities and product authenticity. In markets where counterfeit goods proliferate, buyers need assurance that what they order matches what arrives.

Escrow mechanisms that protect both parties until transaction completion. When a buyer in Lagos orders from a seller in Nairobi, neither party has recourse if things go wrong—unless your platform provides it.

Rating and review systems that surface reliable sellers while marginalizing bad actors. These systems must account for cultural nuances around public criticism and feedback.

Building trust isn’t a feature – it’s your core product.
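
To make the escrow mechanism described above concrete, here is a minimal sketch of the states an escrowed marketplace order might pass through; the flow and state names are hypothetical, not a specific provider’s API:

```python
# Minimal escrow state machine for a marketplace order (hypothetical design).
# Funds are held by the platform until delivery is confirmed or a dispute is resolved.

from enum import Enum, auto

class EscrowState(Enum):
    AWAITING_PAYMENT = auto()
    FUNDS_HELD = auto()        # buyer has paid; platform holds the money
    SHIPPED = auto()
    RELEASED = auto()          # buyer confirmed delivery; seller is paid
    DISPUTED = auto()
    REFUNDED = auto()

# Allowed transitions: funds are released only after confirmation, and either
# side can open a dispute while the money is still held.
TRANSITIONS = {
    EscrowState.AWAITING_PAYMENT: {EscrowState.FUNDS_HELD},
    EscrowState.FUNDS_HELD: {EscrowState.SHIPPED, EscrowState.DISPUTED, EscrowState.REFUNDED},
    EscrowState.SHIPPED: {EscrowState.RELEASED, EscrowState.DISPUTED},
    EscrowState.DISPUTED: {EscrowState.RELEASED, EscrowState.REFUNDED},
}

def advance(current, nxt):
    if nxt not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current.name} -> {nxt.name}")
    return nxt

state = EscrowState.AWAITING_PAYMENT
for step in (EscrowState.FUNDS_HELD, EscrowState.SHIPPED, EscrowState.RELEASED):
    state = advance(state, step)
print(state.name)   # RELEASED
```

The point of the design is that no path reaches RELEASED without either delivery confirmation or a resolved dispute, which is what gives both sides recourse.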

Solving the Payment Puzzle

Payment infrastructure across Africa remains fragmented. Mobile money dominates in East Africa, bank transfers prevail in Nigeria, and card penetration varies wildly by country and demographic.

Successful marketplaces don’t force customers into unfamiliar payment methods. Instead, they meet buyers wherever they are financially.

Mobile money integration is non-negotiable for reaching the mass market. Partnerships with M-Pesa, MTN Mobile Money, and similar providers unlock access to hundreds of millions of potential customers, given that almost half of African adults do not hold a formal bank account.

Multi-currency support becomes essential as you scale across borders. A marketplace serving both Nigeria and Kenya must handle Naira and Shilling transactions seamlessly, with transparent conversion rates.

Cash-on-delivery options still matter, particularly for first-time online buyers testing ecommerce waters, with COD representing 74% of transactions in Morocco and 60% in Egypt. While this adds operational complexity, it removes friction for customers who remain skeptical of digital payments.

The payment experience directly impacts conversion rates. Every additional step, every moment of confusion, every failed transaction attempt costs you customers who may never return.
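
As a sketch of what “meeting buyers where they are” can look like in code, the snippet below routes checkout options by market; the market-to-method mapping is a simplified assumption, not a definitive map of provider coverage:

```python
# Simplified payment-method routing by market (illustrative mapping only).
# A real integration would query provider availability and customer preference.

PREFERRED_METHODS = {
    "KE": ["mobile_money", "card", "cash_on_delivery"],   # Kenya: mobile money first
    "NG": ["bank_transfer", "card", "cash_on_delivery"],  # Nigeria: transfers dominate
    "EG": ["cash_on_delivery", "card"],                   # Egypt: COD still common
}

def payment_options(country_code, supported):
    """Return the checkout options to show, in local order of preference."""
    ranked = PREFERRED_METHODS.get(country_code, ["card"])
    return [m for m in ranked if m in supported]

print(payment_options("KE", {"card", "mobile_money"}))
# ['mobile_money', 'card']
```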

Lifecycle Marketing: Your Competitive Moat

Here’s where many African marketplace founders miss the opportunity: they invest heavily in customer acquisition while neglecting the systems that turn one-time buyers into loyal repeat customers.

Lifecycle email marketing isn’t optional – it’s the engine that drives sustainable unit economics.

Think about the customer journey on your marketplace:

A new user signs up but doesn’t complete their first purchase. Without automated follow-up, they’re gone forever. With a well-crafted welcome series, you guide them toward that crucial first transaction.

A buyer abandons their cart – perhaps distracted, perhaps uncertain, perhaps experiencing a payment issue. Abandoned cart emails recover a significant percentage of these lost sales, often with conversion rates exceeding 10%.

A customer completes a purchase. Post-purchase emails confirm the order, provide tracking information, request reviews, and eventually suggest complementary products. Each touchpoint strengthens the relationship.

A previously active customer goes quiet. Win-back campaigns re-engage dormant users before they forget your platform entirely.

The beauty of marketing automation lies in its scalability. Whether you have 1,000 customers or 100,000, these workflows run continuously, delivering personalized messages at precisely the right moments.
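
A minimal sketch of how such triggers might be expressed in code follows; the event names and delay windows are illustrative assumptions, not any particular email platform’s API:

```python
# Minimal lifecycle-trigger sketch: decide which automated message (if any)
# a customer is due for, based on their last recorded event.

from datetime import datetime, timedelta

def next_email(last_event, event_time, now):
    age = now - event_time
    if last_event == "signed_up" and age >= timedelta(hours=1):
        return "welcome_series_1"
    if last_event == "cart_abandoned" and age >= timedelta(hours=4):
        return "abandoned_cart_reminder"
    if last_event == "order_completed":
        if age >= timedelta(days=30):
            return "win_back_offer"      # dormant customer: try to re-engage
        if age >= timedelta(days=3):
            return "review_request"      # recent buyer: ask for a review
    return None                          # nothing due yet

now = datetime(2026, 2, 6, 12, 0)
print(next_email("cart_abandoned", now - timedelta(hours=6), now))
# -> abandoned_cart_reminder
```

In practice this logic runs on a schedule or off an event stream, which is what makes it work identically at 1,000 customers or 100,000.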

Beyond Email: The Power of Conversational SMS

While email forms the backbone of lifecycle marketing, African markets present unique opportunities for SMS engagement. Mobile phone penetration exceeds smartphone adoption across much of the continent, and SMS reaches customers regardless of internet connectivity.

Implementing conversational SMS alongside email creates a multichannel approach that meets customers on their preferred platforms. Order confirmations, delivery updates, and time-sensitive promotions often perform better via SMS, while longer-form content and detailed product recommendations suit email.

The key is channel coordination – ensuring your SMS and email programs work together rather than bombarding customers with redundant messages.

Practical Implementation Steps

Start with your welcome series. New user onboarding sets the tone for the entire customer relationship. Introduce your marketplace’s value proposition, highlight trust features, and guide users toward their first purchase.

Build abandoned cart recovery next. This single automation often generates the highest ROI of any marketing investment. Test different timing, messaging, and incentives to optimize performance.

Implement post-purchase flows. Confirm orders, provide tracking, request reviews, and suggest related products. These touchpoints transform one-time buyers into repeat customers.

Develop segmentation strategies. Not all customers are equal. Segment by purchase history, browsing behavior, geographic location, and engagement level. Personalized messaging dramatically outperforms generic broadcasts.

Test and iterate continuously. A/B test subject lines, send times, content formats, and offers. Small improvements compound over time into significant performance gains.
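
For the testing step, a quick two-proportion comparison is enough to sanity-check whether a variant’s lift is real; the sample figures below are made up for illustration:

```python
# Two-proportion z-test sketch for comparing conversion rates of two email
# variants (sample sizes and conversion counts are illustrative only).

from math import sqrt

def z_score(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: 120 conversions out of 5,000 sends; Variant B: 165 out of 5,000.
z = z_score(120, 5_000, 165, 5_000)
print(f"z = {z:.2f}")   # |z| > 1.96 roughly corresponds to 95% confidence
```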

The Path Forward

Building a profitable marketplace in Africa requires simultaneous excellence across multiple dimensions: trust infrastructure that enables commerce, payment systems that remove friction, and lifecycle marketing that maximizes customer lifetime value.

The entrepreneurs who succeed will be those who recognize that technology alone isn’t sufficient. Understanding local context, building genuine trust, and delivering consistent value at every customer touchpoint – these fundamentals separate lasting businesses from fleeting experiments.

The opportunity is real. The challenges are surmountable. And the tools to build sophisticated marketing automation are more accessible than ever.

Your marketplace’s success ultimately depends on execution: solving real problems for real customers, one transaction at a time.

Goldman Sachs Bets on AI Agents With Anthropic Partnership to Automate Core Banking Functions

The logo for Goldman Sachs is seen on the trading floor at the New York Stock Exchange (NYSE) in New York City, New York, U.S., November 17, 2021. REUTERS/Andrew Kelly/Files

Goldman Sachs is accelerating its bet on artificial intelligence, partnering with AI startup Anthropic to build autonomous agents that could fundamentally reshape how core banking functions are performed, from trade accounting to client onboarding.

For the past six months, the Wall Street giant has been working closely with embedded Anthropic engineers to co-develop AI agents powered by Anthropic’s Claude model. The initial focus is on two operationally intensive areas: accounting for trades and transactions, and client vetting and onboarding, according to Goldman’s chief information officer, Marco Argenti.

The agents are still in development, but Argenti said the bank expects to deploy them soon. While he declined to give a specific launch date, the direction is clear: Goldman is no longer treating generative AI as an experimental add-on but as a core pillar of its operating model.

Argenti described the technology as a “digital co-worker” designed to operate alongside humans in roles that are complex, repetitive, and highly scaled. These are functions that sit at the heart of a modern investment bank and traditionally require large teams to manage regulatory requirements, documentation, reconciliations, and approvals.

The initiative fits into a broader transformation outlined by Goldman Sachs CEO David Solomon last year. In October, Solomon said the bank had embarked on a multi-year plan to reorganize itself around generative AI. Even as Goldman benefits from strong revenues in trading and advisory businesses, Solomon said the firm would seek to constrain headcount growth as AI-driven productivity gains take hold.

At Goldman, the move into autonomous agents builds on earlier experiments with AI-assisted coding. Last year, the bank began testing an autonomous coding tool known as Devin, which is now widely available to its engineers. That project served as a proving ground, demonstrating that advanced models could reliably handle complex tasks within a highly regulated environment.

What surprised Goldman’s technology leadership was how quickly those capabilities translated beyond software development. Argenti said Claude’s strength is not limited to writing code, but lies in its ability to reason through complex problems step by step, applying logic across large volumes of data and documents.

That capability is particularly valuable in areas like accounting and compliance, where staff must interpret rules, reconcile discrepancies, and make judgments based on incomplete information. In Argenti’s words, technology teams realized that “there are these other areas of the firm where we could expect the same level of automation and the same level of results that we’re seeing on the coding side.”

The potential operational impact is significant: client onboarding, often slowed by manual checks, document reviews, and regulatory approvals, could be completed much faster. Trade reconciliation issues, which can take days to resolve, could be identified and fixed more quickly, reducing operational risk and improving client experience.

Goldman is also exploring the use of AI agents in other parts of the business. Argenti pointed to possibilities such as employee surveillance and the creation of investment banking pitchbooks, both of which require processing large amounts of information under tight timelines. While these ideas are still exploratory, they underscore how broadly the bank is thinking about automation.

The announcement comes at a sensitive moment for the AI sector. Recent model updates from Anthropic have triggered sharp reactions in financial markets, with investors selling off shares of software companies and reassessing which firms are best positioned to benefit from the AI boom. The volatility reflects growing recognition that rapid improvements in foundation models could disrupt existing business models, including those built around legacy enterprise software.

Goldman’s willingness to work closely with Anthropic also highlights a shift in how large financial institutions engage with AI vendors. Rather than simply buying off-the-shelf tools, Goldman is embedding engineers and co-developing systems tailored to its specific needs, regulatory obligations, and risk controls.

Despite the scale of automation being discussed, Goldman has been cautious in addressing concerns about job losses. The bank employs thousands of people in compliance, accounting, and operations, and Argenti said it would be premature to assume the technology will directly eliminate those roles. Instead, Goldman’s stated aim is to “inject capacity,” allowing teams to do more work faster and improve service quality.

Still, the longer-term implications are difficult to ignore. As AI agents mature, Goldman could reduce its reliance on third-party service providers that currently handle parts of its operational workload. That could shift costs away from external vendors and further concentrate expertise and control inside the bank.

More broadly, Goldman’s strategy signals a bigger change in how Wall Street views technology. Generative AI is no longer framed as a productivity tool for individual workers, but as an organizing principle for the firm itself. By embedding autonomous agents into core processes, Goldman is testing whether a global investment bank can be redesigned around machines that reason, decide, and act with limited human intervention.

If successful, the effort could set a precedent for the industry, forcing rivals to accelerate their own AI adoption or risk falling behind.

AI Boom Triggers Server CPU Crunch as Intel and AMD Reportedly Warn Chinese Customers of Lengthy Delays

0

The global artificial intelligence buildout is no longer straining only cutting-edge GPUs. It is now tightening the supply of the more traditional computing backbone that underpins data centers, cloud services, and enterprise IT.

Fresh warnings from Intel and AMD to Chinese customers about server CPU shortages underscore how the AI infrastructure race is cascading through the entire semiconductor supply chain, driving up prices, extending delivery times, and complicating expansion plans for some of the world’s largest technology firms.

According to people familiar with the matter, who spoke to Reuters, Intel and AMD have recently notified customers in China that supplies of server central processing units are constrained, with Intel cautioning that delivery lead times for some products could stretch as long as six months. The shortages have already pushed prices for Intel’s server CPUs in China up by more than 10% on average, although the impact varies depending on contract terms and customer scale.

China accounts for more than 20% of Intel’s global revenue and hosts some of the largest cloud computing and data center operators in the world. Any sustained disruption to CPU availability risks slowing deployments across sectors ranging from AI model training and inference to e-commerce, fintech, and government digital infrastructure.

The most severe constraints are affecting Intel’s fourth- and fifth-generation Xeon processors, which remain widely used across Chinese data centers. Sources say Intel has begun rationing deliveries as it grapples with a growing backlog of unfulfilled orders, with some customers facing waits of up to half a year.

AMD, which has steadily expanded its footprint in the server market, has also informed Chinese clients of supply constraints. While its situation appears less acute than Intel’s, delivery lead times for some AMD server CPUs have reportedly been pushed out to eight to ten weeks, signaling that capacity pressures are spreading across the industry.

These developments are being reported for the first time by Reuters and point to a broader structural issue rather than a short-term hiccup. The AI investment wave has triggered a surge not only in demand for specialized accelerators but also for the CPUs that coordinate workloads, manage data flows, and support complex, multi-tenant data center environments.

AI infrastructure strains the full stack

While Nvidia’s GPUs have dominated headlines as the most visible bottleneck in AI hardware, industry participants say CPUs have quietly become another pressure point. Modern AI systems still rely heavily on server CPUs for preprocessing data, orchestrating GPU workloads, handling inference pipelines, and running non-AI applications alongside training clusters.

The rise of agentic AI systems is intensifying this trend. Unlike earlier chatbot-style applications, agentic systems perform multi-step tasks, interact continuously with software tools, and operate around the clock. These workloads are significantly more CPU-intensive, increasing the number of processors required per deployment and amplifying demand just as supply is tightening.

Memory constraints are compounding the problem. Prices for memory chips have continued to climb, particularly in China, as suppliers prioritize AI-optimized products. Distributors say that when memory prices began rising sharply late last year, customers rushed to secure CPUs earlier than planned to avoid mismatched system builds or higher overall costs. That front-loading of orders further depleted available CPU inventories.

Manufacturing limits on both sides

The root causes of the shortages differ between Intel and AMD, but converge in outcome. Intel has struggled to ramp up production of its latest server chips amid persistent manufacturing yield challenges, limiting how quickly it can meet surging demand. AMD, meanwhile, relies on Taiwan Semiconductor Manufacturing Co., which has prioritized capacity for AI accelerators and advanced-node chips, leaving less room for high-volume server CPU production.

Intel acknowledged the tight conditions in a statement, saying the rapid adoption of AI has driven strong demand for what it described as “traditional compute.” The company said inventory levels are expected to be at their lowest point in the first quarter but added that it is addressing the situation aggressively and expects supply to improve from the second quarter and through 2026, suggesting constraints could linger for months.

AMD reiterated comments made during its earnings call that it has boosted supply capabilities and remains confident in its ability to meet global demand, citing its supplier agreements and relationship with TSMC. Even so, the reported delays indicate that the fabless model offers limited insulation when the entire advanced semiconductor ecosystem is under strain.

Market dynamics amplify the impact

The shortages come against the backdrop of a shifting competitive landscape in server CPUs. Intel’s global market share has fallen from over 90% in 2019 to about 60% in 2025, while AMD’s share has risen from roughly 5% to more than 20%, according to a UBS report. In a tighter, more balanced market, disruptions at either supplier can have outsized effects, as customers have fewer surplus alternatives.

In China, major buyers include server manufacturers and cloud providers such as Alibaba and Tencent, which are racing to expand AI services while navigating U.S. export controls that restrict access to the most advanced accelerators. As GPUs become harder to source, CPUs have grown even more strategically important, making shortages particularly disruptive for long-term planning.

Taken together, the warnings from Intel and AMD highlight a critical shift in the AI boom. What began as a scramble for GPUs is evolving into a system-wide supply challenge spanning CPUs, memory, and manufacturing capacity. This means higher costs, longer deployment timelines, and tougher prioritization decisions for AI developers and cloud operators.