Implications of US Mortgage Rates Falling to Multi-year Lows

Photo: Detached three-bedroom homes at Haggai Estate, Redemption Camp, on the Lagos-Ibadan highway in Ogun State, southwest Nigeria, August 2012. (Pius Utomi Ekpei/AFP/Getty Images)

US mortgage rates have recently fallen to multi-year lows. According to the latest data from Freddie Mac’s Primary Mortgage Market Survey, the average 30-year fixed-rate mortgage dropped to 6.01%, down from 6.09% the previous week.

This marks the lowest level since September 2022, more than three years ago, when the rate last dipped below 6%. The 15-year fixed-rate mortgage fell to 5.35%, down from 5.44% the week before. A year ago, around February 2025, the 30-year rate averaged roughly 6.85%, so this represents a meaningful decline of about 0.84 percentage points year-over-year.

This pullback improves affordability for homebuyers and has boosted refinance activity, with many recent homeowners able to lower their payments significantly. Freddie Mac’s chief economist noted that the lower rate environment is enhancing buyer affordability and strengthening homeowners’ financial positions.
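To put the affordability gain in concrete terms, here is a minimal sketch using the standard fixed-rate amortization formula and the two 30-year rates above; the $400,000 loan amount is a hypothetical example, not a figure from the survey.

```python
# A minimal payment comparison using the standard fixed-rate amortization
# formula M = P * r * (1 + r)^n / ((1 + r)^n - 1), where r is the monthly
# rate and n the number of payments. The $400,000 loan is hypothetical.

def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    r = annual_rate / 12            # monthly interest rate
    n = years * 12                  # total number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

loan = 400_000
last_year = monthly_payment(loan, 0.0685)   # ~6.85% average a year ago
now = monthly_payment(loan, 0.0601)         # 6.01% in the latest survey
print(f"6.85%: ${last_year:,.0f}/mo  6.01%: ${now:,.0f}/mo  "
      f"savings: ${last_year - now:,.0f}/mo")
```

On that hypothetical loan, the drop from 6.85% to 6.01% trims the payment by roughly $220 a month, the kind of saving driving the refinance activity noted above.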

Note that while some headlines describe this as a “nearly 4-year low” or similar, the precise benchmark from Freddie Mac is September 2022, roughly 3.5 years ago. Daily rates from other sources like Zillow, Bankrate, or NerdWallet can vary slightly due to different methodologies and timing (some show averages in the mid-5% to low-6% range as of February 20-21), but the weekly Freddie Mac figure is the most widely referenced standard.

Rates remain in a relatively narrow band around 6% so far in 2026, influenced by factors like inflation trends, jobs data, and broader economic signals. Forecasts from groups like Fannie Mae and the Mortgage Bankers Association suggest rates could hover near 6% or slightly above through much of the year.

Mortgage rate forecasts for the US, particularly the benchmark 30-year fixed-rate mortgage, are primarily provided by major institutions like Fannie Mae, the Mortgage Bankers Association (MBA), and others such as the National Association of Realtors (NAR) or National Association of Home Builders (NAHB).

These forecasts are updated periodically, often monthly or quarterly, based on economic indicators like inflation, Federal Reserve policy, Treasury yields (especially the 10-year note), employment data, and broader economic growth.

As of February 2026, with the most recent major updates coming in January 2026 from Fannie Mae and in late 2025 and early 2026 from the MBA, the consensus points to rates stabilizing in the low- to mid-6% range for much of the year, with only modest further declines expected from current levels of around 6.0-6.2%.

Fannie Mae: Expects rates to average around 6.1% in Q1 2026, then settle at 6.0% for Q2 through Q4. In its outlook, rates hover near 6% through most of 2026, with a slight dip to around 5.9-6.0% by year-end or into 2027. This reflects expectations of gradual economic cooling and limited additional Fed rate cuts.

Mortgage Bankers Association (MBA): Projects rates holding steady at approximately 6.1% throughout 2026, with some earlier views citing 6.4% averages for the year, potentially reflecting more conservative assumptions. The MBA views rates as having largely bottomed out, remaining in the low- to mid-6% range into 2027 and even 2028, influenced by persistent inflation risks and steady growth.

Other notable mentions: Some aggregated expert views place 2026 averages between 6.0% and 6.4%, with groups like NAR or NAHB aligning closer to 6% or slightly above. Forecasts often see rates flat or ticking slightly lower to 5.9-6.2%, though some see minor upticks if economic strength persists.

Mortgage rates don’t move in lockstep with the Fed’s federal funds rate; they’re more closely tied to the 10-year Treasury yield plus a spread (typically 1.5-2 percentage points) that accounts for credit risk, lender margins, and demand. Current forecasts assume moderate inflation control and limited further Fed easing (perhaps 0-1 cuts in 2026), a softening but not recessionary economy (unemployment rising mildly to ~4.4-4.6%), and potential volatility from policy changes.

No dramatic drops are expected, as much of the relief from the 2023-2025 peaks (7%+) has already occurred. These are educated projections, not guarantees; rates can shift quickly with new data. A back-of-the-envelope version of the yield-plus-spread rule is sketched below.
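As a minimal sketch of that rule, assuming a hypothetical 10-year yield of 4.20% (an illustrative input, not a quoted market figure):

```python
# Rule of thumb from the text: 30-year mortgage rate is roughly the 10-year
# Treasury yield plus a 1.5-2 percentage-point spread. The yield is assumed.

def estimated_mortgage_rate(treasury_10y_pct: float, spread_pct: float) -> float:
    return treasury_10y_pct + spread_pct

for spread in (1.5, 1.75, 2.0):
    print(f"10y at 4.20% + {spread:.2f}pt -> ~{estimated_mortgage_rate(4.20, spread):.2f}%")
```

With those inputs the estimate lands between 5.7% and 6.2%, bracketing the low- to mid-6% consensus above.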

Recent weeks have seen actual rates dip to multi-year lows, which aligns with or slightly beats some forecasts. If you’re planning to buy or refinance, compare personalized quotes from lenders, as your rate depends on credit, down payment, and other factors; shopping around can still yield meaningful differences even in a stable range.

It Takes 20 Years of Food & Water to Develop a Human: Altman Pushes Back on AI Water, Energy Consumption Claims


Altman drew a sharp line between what he called exaggerated per-query water claims and the very real macro-scale energy buildout AI will require, arguing the infrastructure challenge is about power generation — not gallons per prompt.


At a moment when artificial intelligence is reshaping industries and straining infrastructure planning, Sam Altman is confronting one of the most persistent criticisms head-on: the environmental cost of AI.

Speaking on the sidelines of the India AI Impact Summit in an interview with The Indian Express, the OpenAI chief executive dismissed viral claims that ChatGPT consumes gallons of water per query as “completely untrue” and “totally insane,” arguing that such figures bear “no connection to reality.”

The remarks land amid intensifying scrutiny of data center expansion, resource use, and AI’s long-term sustainability.

The Water Narrative — and What It Misses

Concerns about AI’s water footprint stem largely from how data centers are cooled. Many traditional facilities rely on evaporative cooling systems that draw significant volumes of water to regulate temperatures for densely packed servers.

Yet the link between a single AI query and water consumption is not direct. Water use occurs at the infrastructure level — in cooling systems and, in some regions, in power generation itself — rather than at the level of an individual prompt.

Cooling technology is also evolving. Hyperscale operators are deploying closed-loop liquid systems, advanced air cooling, and even water-free designs in some new builds. Efficiency gains per compute unit have improved steadily, though rising overall demand may offset those gains.

A recent projection by water technology firm Xylem and Global Water Intelligence estimated that water drawn for cooling could more than triple over the next quarter-century as global computing expands. That forecast reflects aggregate growth, not per-interaction intensity.

Altman’s pushback suggests he views the viral framing — “gallons per query” — as a distortion that conflates systemic resource use with marginal consumption.

Energy: The Real Constraint

Where Altman acknowledged a legitimate concern was electricity demand.

“Not per query, but in total — because the world is using so much AI … and we need to move towards nuclear or wind and solar very quickly,” he said.

The distinction he is drawing is fundamental to understanding AI’s environmental calculus.

AI systems consume energy at two primary stages:

  1. Training: the compute-intensive process of building large models, often requiring massive parallel processing over weeks or months.
  2. Inference: the ongoing use of trained models to generate outputs in response to user inputs.

Training can require substantial bursts of energy, but inference — especially once hardware and software are optimized — is far less energy-intensive per transaction. The challenge lies in scale. Billions of inferences across millions of users translate into persistent demand on grids.
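A toy calculation makes the per-query versus in-total distinction concrete. Both inputs below are hypothetical placeholders (no per-query figure appears in the remarks), but they show how a small per-inference cost compounds into grid-scale demand:

```python
# Illustrative only: a small per-query energy cost times billions of queries
# adds up to grid-scale load. Both constants are assumed, not measured.

WH_PER_QUERY = 0.3       # hypothetical energy per inference, watt-hours
QUERIES_PER_DAY = 1e9    # hypothetical global daily query volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6    # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000                # MWh -> GWh
print(f"~{daily_mwh:,.0f} MWh/day, ~{annual_gwh:,.0f} GWh/year")
```

Under those assumptions, a fraction of a watt-hour per prompt still sums to hundreds of megawatt-hours per day, which is exactly the “not per query, but in total” framing Altman uses.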

According to a May report from the International Monetary Fund, global data center electricity consumption in 2023 had already reached levels comparable to those of Germany or France, shortly after the debut of ChatGPT.

That comparison underscores how quickly AI has shifted data centers from background infrastructure to frontline energy consumers.

The Human Brain Analogy

Altman also addressed comparisons drawn by Bill Gates, who has suggested that the human brain’s efficiency implies AI systems could become dramatically more energy-efficient over time.

Altman argued that many comparisons overlook the energy embedded in human development.

“It takes like 20 years of life, and all the food you eat before that time, before you get smart,” he said.

He suggested the more appropriate benchmark is energy consumed per response once a model is trained — and by that metric, he believes AI may already be competitive.

The analogy has sparked debate. Critics argue that equating human cognition with computational systems risks flattening ethical distinctions. Sridhar Vembu of Zoho Corporation publicly criticized the equivalence, saying he does not want to see technology equated with human beings.

Beyond philosophy, the exchange highlights a deeper issue: AI efficiency is often discussed without standardized metrics. Measuring energy per inference, per token generated, or per model lifecycle produces very different narratives.

Infrastructure, Investment, and Political Friction

The debate is unfolding as governments and corporations commit billions to new data center capacity. AI has become a strategic priority, intertwined with economic competitiveness and national security.

To accommodate growth, some governments are accelerating approval processes for new power generation — including nuclear, solar, and wind. Environmental advocates caution that rapid buildouts could complicate climate commitments if fossil fuels fill short-term supply gaps.

Local resistance is also mounting. In San Marcos, Texas, the city council recently rejected a proposed $1.5 billion data center after sustained public opposition over concerns about grid strain and rising electricity costs.

These disputes reveal a widening tension between national AI ambitions and local resource constraints. Data centers are capital-intensive, geographically concentrated, and highly visible infrastructure projects.

One of the central questions is whether technological efficiency can outpace demand growth.

Historically, improvements in chip design and software optimization have reduced energy use per computation. However, AI workloads are expanding so rapidly that total consumption continues to climb — a classic case of the rebound effect, where efficiency gains stimulate additional usage.

Altman’s call for accelerated nuclear and renewable deployment implicitly acknowledges that efficiency alone will not solve the energy equation. Expanded generation capacity appears inevitable if AI adoption continues at current rates.

Recent FXHASH Funding Round Will Support Development of New Creative Tools


Fxhash, the generative art platform and NFT marketplace originally built on Tezos and now expanding to Ethereum and Base, has announced a new funding round. The funding will support continued development of new creative tools and futures for digital art on Ethereum and Base.

This appears to be a recent strategic round, likely seed or follow-on, though the exact amount and type weren’t specified in the public announcement. It aligns with fxhash’s evolution, including their $FXH token and protocol launch on Base in 2025, which introduced art coins, bonding curves, and new monetization for artists.

Prior to this, fxhash’s last known major round was a $5M seed in August 2023, led by 1kx with participants like Fabric Ventures and Union Square Ventures. The involvement of Coinbase Ventures is notable, as it ties into Base’s ecosystem growth and fxhash’s shift toward Ethereum-compatible infrastructure for broader accessibility and lower fees.

This move signals strong institutional confidence in generative art and onchain creativity platforms amid the broader crypto and Web3 recovery. It provides runway to accelerate development of creative tools, protocol features, and expansions on Ethereum and Base.

fxhash can invest more aggressively in features like improved generative tools, open-form collections, AI integration potential (as hinted in their roadmap), and ecosystem growth. This supports their shift toward a full “art economy” via the $FXH protocol (art coins on bonding curves, liquidity pools, and tokenized artist stakes).
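For readers new to the mechanic, a bonding curve prices a token as a deterministic function of circulating supply, so early mints cost less than later ones. The sketch below uses a textbook linear curve; fxhash’s actual curve shape and parameters are not described in the announcement.

```python
# Generic linear bonding curve: spot price grows with supply, and the cost
# of a mint is the area under the curve between the old and new supply.
# The slope is an arbitrary illustrative constant.

SLOPE = 0.0001  # price increase per token minted (hypothetical)

def spot_price(supply: float) -> float:
    return SLOPE * supply

def mint_cost(supply: float, amount: float) -> float:
    # Integral of SLOPE * s from `supply` to `supply + amount`.
    return SLOPE * (amount * supply + amount ** 2 / 2)

print(spot_price(10_000))          # 1.0 at 10k circulating supply
print(mint_cost(10_000, 1_000))    # 1050.0 to mint the next 1k tokens
```

That supply-linked pricing is what gives early collectors a structural discount and creates the volatility dynamics discussed below.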

Backing from reputable crypto VCs, especially Coinbase Ventures, signals confidence in fxhash’s post-2025 evolution from Tezos roots to a multi-chain presence on Base and Ethereum. It could attract more artists, collectors, and developers, increasing mint volume, community engagement, and $FXH token utility and governance potential.

With $FXH already listed on Coinbase, this funding could drive positive sentiment, liquidity, and adoption. It reinforces the protocol’s role in blending art + DeFi. fxhash remains a leader in code-based generative art. This round highlights renewed institutional interest in niche but innovative Web3 verticals like tokenized art economies.

Amid crypto recovery, funding for cultural and creative protocols vs. pure DeFi/infra shows diversification. It positions generative art as a viable long-term category, potentially inspiring similar platforms to explore hybrid art-finance models. The $FXH protocol’s art coins and bonding curves offer new revenue streams.

This funding could help refine these mechanics, reducing risks like volatility or accessibility barriers for non-crypto-native artists. Base has grown as a low-fee hub for NFTs, social, and experimental projects. fxhash’s migration/expansion here with $FXH on Base adds cultural depth alongside DeFi and trading use cases.

This round ties fxhash tighter to Base’s growth, potentially increasing onchain activity, user onboarding, and integrations. While Coinbase’s 2026 priorities emphasize RWA perpetuals, AI agents, and financial infrastructure, it continues backing ecosystem plays. This fits Coinbase’s mission of expanding crypto’s economic freedom, here through democratized digital art creation and ownership.

It also leverages Coinbase’s distribution. This round reflects confidence in sustainable, community-driven projects over hype cycles. It could catalyze more cross-pollination between art, DeFi, and L2 ecosystems, especially as Base pushes for mainstream accessibility.

This is a bullish development for fxhash’s long-term vision of “new creative futures” in digital art — less about short-term pumps, more about building enduring infrastructure for onchain expression. If execution continues via ongoing series like “Deliverance,” it positions fxhash as a standout in the evolving Web3 culture space.

Gemini Restructuring and Laying Off 25% of its Global Workforce 


Gemini, the cryptocurrency exchange founded by Tyler and Cameron Winklevoss (often referred to as Gemini Space Station in some contexts, ticker: GEMI), has undergone a significant restructuring in February 2026.

This includes laying off approximately 25% of its global workforce (up to around 200 employees) and exiting operations in the UK, the EU and other European jurisdictions, and Australia. The moves come amid a sharp downturn in the crypto market, with Bitcoin experiencing notable declines, leading to lower trading volumes, tighter liquidity, and rising regulatory pressures.

Gemini, which went public via IPO in September 2025, has seen its stock plummet more than 80% from post-IPO highs, with its market value dropping sharply. The restructuring aims to reduce operating expenses, streamline toward a leaner model partly enabled by increased use of AI in engineering and other roles, and refocus primarily on the US and Singapore.

The company is shifting emphasis toward custody services and its newly launched prediction markets platform, as revenue growth has lagged behind expenses. Expected pre-tax restructuring costs are around $11 million, with most changes to be completed in the first half of 2026.

In mid-February, Gemini also parted ways with three C-suite executives: COO Marshall Beard, CFO Dan Chen, and CLO Tyler Meade (effective immediately), with interim replacements appointed internally. Cameron Winklevoss is absorbing some COO responsibilities. Additional quiet US staff cuts have occurred beyond the initial announcement.

This isn’t related to Google’s Gemini AI model. The news pertains specifically to the crypto platform. The crypto industry continues facing challenges post-2025 market cycle, with firms adjusting through cost-cutting and strategic pivots. Gemini’s changes reflect broader pressures in the sector.

The 25% staff layoffs (affecting up to ~200 employees globally) and related restructuring at Gemini (the crypto exchange founded by the Winklevoss twins, NASDAQ: GEMI) in early February 2026 carry several significant implications, both for the company and the broader crypto sector. This comes amid a sharp crypto market downturn, with Bitcoin down over 40% from its late-2025 highs, reduced trading volumes, and persistent profitability challenges.

The moves are explicitly designed to slash operating expenses, align costs with lower revenue, and accelerate breakeven. The company cited revenue growth lagging behind expenses, with prior quarters showing substantial losses (~$159.5M in one reported period) and estimates of up to ~$600M in net loss for 2025.

Restructuring costs are estimated at ~$11M pre-tax, mostly in Q1 2026, but the long-term goal is meaningful cost reduction through workforce cuts, AI integration in operations, and exiting high-complexity/low-return international markets (UK, EU, Australia). This refocuses Gemini on core strengths in the US and Singapore.

Emphasis shifts toward custody services (a more stable, fee-based revenue stream less tied to volatile trading) and the newly launched prediction markets platform. This bets on emerging niches amid declining traditional exchange activity, but success is uncertain in a bearish environment.

The abrupt departures of three C-suite executives (COO Marshall Beard, CFO Dan Chen, CLO Tyler Meade) shortly after the initial announcement signal deeper internal turmoil. Cameron Winklevoss is absorbing some COO duties without a replacement, suggesting recentralization of power but raising concerns about governance and execution risk.

GEMI shares have plummeted more than 80% from post-IPO highs (peaking near $45-46 in late 2025), with market cap dropping from ~$4B to under $700M and the stock recently trading around $5.8-$6.6. Additional quiet US staff cuts and executive exits triggered further declines. Analysts from Truist Securities highlight solvency worries and question the original IPO prospectus’s optimism about international growth.

International users face account wind-downs, potentially driving them to competitors. This shrinks Gemini’s global footprint but simplifies operations. The crypto “winter” is hitting infrastructure players hard. Gemini’s aggressive post-IPO expansion, a bet on continued bull conditions through 2027, backfired with the rapid price crash, highlighting overexpansion risks in bull markets.

It echoes patterns seen in prior cycles where exchanges over-hire and over-extend during highs. As a US-focused, compliance-heavy exchange, Gemini faces higher costs from regulations, while offshore/DEX competitors capture volume with lower overhead. The retreat from regulated but complex markets (EU/UK) underscores difficulties in global scaling under tightening rules.

This reinforces that even well-known, publicly traded players aren’t immune. It could pressure peers to accelerate cost controls, pivot to non-trading revenue (custody, derivatives, prediction markets), or face similar scrutiny. Investor confidence in crypto IPOs/post-IPO stability may wane further.

If Gemini stabilizes via its US pivot and new products, it could emerge leaner. However, ongoing market weakness risks further cuts or distress. This reflects a classic crypto cycle correction: hype-driven growth unraveling under reality. Gemini is in survival mode, prioritizing US dominance and profitability over global ambition.

The crypto sector continues its Darwinian phase, where adaptability and cash runway determine who endures. The situation remains fluid, with no major positive reversals reported.

Anthropic Alleges Massive Distillation Campaign by Chinese AI Labs, Escalating Fight Over Chips and Safeguards


Anthropic’s claim that 16 million Claude interactions were siphoned through 24,000 fake accounts reframes the AI race as a battle over inference access, safeguards, and chip supply — not just model training.


U.S. artificial intelligence firm Anthropic has accused three Chinese AI developers, DeepSeek, Moonshot AI, and MiniMax, of orchestrating what it describes as a coordinated, large-scale “distillation” campaign to extract capabilities from its Claude AI system.

Anthropic said the three labs created more than 24,000 fraudulent accounts that generated over 16 million interactions with Claude. The queries, it alleged, were designed to systematically replicate some of Claude’s most advanced features, including agentic reasoning, tool use, and coding — areas considered differentiators among frontier AI systems.

The accusations land amid heightened geopolitical tension over artificial intelligence, particularly as Washington reassesses export controls on advanced semiconductors to China and as Chinese AI labs close the performance gap with U.S. counterparts.

Distillation as a competitive shortcut

Distillation is a common technique in machine learning in which a large, high-performing model acts as a “teacher” to train a smaller “student” model. Within a single company, it is used to compress models for lower-cost deployment while retaining much of their capability.

Across companies, however, the method becomes controversial. By querying a rival’s model at scale and using the responses as training data, a competitor can approximate performance without replicating the comprehensive research, compute expenditure, or alignment work that went into the original system.
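In its benign single-company form, distillation can be sketched in a few lines: the student is trained to match the teacher’s softened output distribution. The snippet below is the classic temperature-scaled formulation, not Anthropic’s or any lab’s actual pipeline; across companies via an API, where logits are not exposed, a “student” would instead be fine-tuned on sampled text responses. The model sizes and temperature are illustrative.

```python
# Classic knowledge-distillation loss: KL divergence between the teacher's
# and student's temperature-softened output distributions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # T^2 scaling keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_student, soft_targets,
                    reduction="batchmean") * temperature ** 2

# Toy usage with random logits over a 1,000-token vocabulary.
teacher_out = torch.randn(8, 1000)                      # frozen teacher outputs
student_out = torch.randn(8, 1000, requires_grad=True)  # trainable student
loss = distillation_loss(student_out, teacher_out)
loss.backward()
```

The API-mediated variant alleged here is weaker per example (text only, no logits) but compensates with volume, which is why the 16 million figure is the crux of Anthropic’s claim.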

Anthropic said the alleged campaigns varied in focus and scale. It tracked more than 150,000 exchanges linked to DeepSeek that appeared aimed at strengthening foundational reasoning and alignment, including workarounds for policy-sensitive prompts. Moonshot AI allegedly generated more than 3.4 million exchanges focused on agentic reasoning, tool integration, coding, data analysis, and computer vision. MiniMax accounted for roughly 13 million exchanges targeting agentic coding and orchestration, with Anthropic claiming it observed traffic being redirected toward the latest Claude release shortly after launch.

The scale matters: sixteen million interactions represent not casual usage but what Anthropic characterizes as industrialized extraction.

DeepSeek, in particular, has drawn scrutiny since releasing its open-source R1 reasoning model last year, which analysts said approached the performance of leading U.S. frontier labs at a fraction of the cost. The company is reportedly preparing DeepSeek V4, a new model said to outperform both Claude and OpenAI’s ChatGPT in certain coding benchmarks. Earlier this month, OpenAI also accused DeepSeek in a memo to U.S. lawmakers of using distillation techniques to mimic its systems.

Export controls and compute leverage

The dispute is unfolding alongside debate over access to high-end chips. Last month, the Trump administration allowed U.S. firms, including Nvidia, to export advanced AI processors such as the H200 to China, loosening earlier restrictions.

Anthropic linked the alleged distillation campaigns to computing power. “The scale of extraction… requires access to advanced chips,” the company said in a blog post.

It argued that export controls serve a dual function: limiting direct model training and constraining the compute needed for high-volume distillation.

This framing shifts the chip debate. Historically, export controls focused on preventing Chinese firms from training large frontier models from scratch. Anthropic’s claim suggests that even if direct training is restricted, access to sufficient inference compute could enable large-scale replication via API querying.

Policy analysts say this complicates enforcement. Distillation occurs through legitimate product interfaces — paid or public APIs — rather than through overt hacking. That creates a grey zone between normal usage and systematic extraction.

National security dimension

Anthropic also framed the issue as one of security. The company said U.S. developers build safeguards into frontier systems to prevent misuse in areas such as bioweapons design or malicious cyber operations.

“Models built through illicit distillation are unlikely to retain those safeguards,” Anthropic wrote, warning that dangerous capabilities could proliferate if protections are stripped out during replication.

It pointed to the possibility of authoritarian governments deploying advanced AI for offensive cyber operations, disinformation campaigns, and mass surveillance — risks that increase if such systems are open-sourced without embedded safety layers.

Dmitri Alperovitch, chairman of the Silverado Policy Accelerator and co-founder of CrowdStrike, told TechCrunch the allegations were unsurprising.

“It’s been clear for a while now that part of the reason for the rapid progress of Chinese AI models has been theft via distillation of U.S. frontier models. Now we know this for a fact,” he said. “This should give us even more compelling reasons to refuse to sell any AI chips to any of these [companies], which would only advantage them further.”

Anthropic said it will continue investing in defensive measures to make distillation harder to execute and easier to detect, while calling for “a coordinated response across the AI industry, cloud providers, and policymakers.”

Such coordination could include tighter API rate limits, improved anomaly detection, contractual enforcement mechanisms, and shared threat intelligence among AI labs. Cloud providers — which host the infrastructure underpinning both U.S. and Chinese AI workloads — may also face pressure to monitor and flag high-volume extraction patterns.
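At its crudest, the anomaly detection mentioned above might flag accounts whose query volume sits far outside the population norm. The following is a toy heuristic with made-up numbers, not any provider’s actual defense; production systems would weigh many more signals (content patterns, account age, coordination across accounts) before acting.

```python
# Toy volume-based flagger: mark accounts whose daily query count exceeds a
# multiple of the median. Median-based thresholds resist distortion by the
# very outliers being hunted. All account names and counts are made up.
from statistics import median

def flag_heavy_accounts(daily_counts: dict[str, int],
                        multiple: float = 10.0) -> list[str]:
    med = median(daily_counts.values())
    return [acct for acct, count in daily_counts.items()
            if count > multiple * med]

usage = {"acct_a": 120, "acct_b": 95, "acct_c": 150_000, "acct_d": 80}
print(flag_heavy_accounts(usage))  # -> ['acct_c']
```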

The broader stakes extend beyond one company. If frontier AI capabilities can be replicated rapidly through sustained querying, the competitive moat built on research expenditure and chip access narrows. In that scenario, advantage may hinge less on breakthrough architecture and more on distribution control, access management, and compute governance.

At the same time, aggressive restrictions carry trade-offs. Limiting chip exports could affect U.S. semiconductor revenues and accelerate domestic Chinese chip development. Restricting API access could constrain legitimate global customers and developers.

Anthropic’s allegations therefore crystallize a central tension in the global AI race: openness versus control. The tools that make AI widely usable (APIs, cloud access, and scalable inference) also create vectors for replication. As Chinese labs close the performance gap with U.S. peers, the contest increasingly revolves not just around building the most advanced model, but around protecting it.