The enemy of knowledge is not ignorance, it’s the illusion of knowledge (Stephen Hawking)

It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so (Mark Twain)

Invest with smart knowledge and objective odds

YOUR DAILY EDGE: 18 November 2025

Airplane Note: I am currently travelling. Hence the more limited postings.

Notices of Impending Layoffs by US Companies Surged in October

Some 39,006 Americans were given advance notice as required under the Worker Adjustment and Retraining Notification Act last month, the preliminary Cleveland Fed measure showed. In monthly data going back to 2006, that number has only been higher in 2008, 2009, 2020 and May 2025. (…)


China’s Rare-Earth Product Exports Falter as Talks Go On With US

China’s exports of rare-earth products edged lower in October from a month earlier, as Beijing and Washington continue to hash out details of supply arrangements under their trade truce.

Outbound shipments of the materials used in electric vehicles, weapons and high-tech manufacturing dropped to 6,173 tons, the lowest level since June, according to customs data released on Tuesday. This category is typically dominated by rare-earth magnets, the industrial components that played a pivotal role for China in facing down America’s trade offensive.

The US and China are still fleshing out details of a trade truce clinched by Presidents Xi Jinping and Donald Trump in Seoul in late October. The two sides have given their negotiators until end-November to agree on supply terms for US-bound rare earths, according to people familiar with the matter. (…)

  • China rare earths deal will ‘hopefully’ be done by Thanksgiving, Bessent says – Reuters

China’s Grip on American Medicine Cabinets Grows More Entrenched

Roughly one in four generic drugs taken by Americans relies on key ingredients from China, according to a report released Tuesday by the US-China Economic and Security Review Commission. These often low-cost staples account for 90% of the medicines used by Americans. Some of the ingredients — found in blood thinners, antibiotics and cancer treatments — are produced only in China.

With China’s recent restrictions on rare earth minerals top of mind, the commission said that similar moves involving drug ingredients “could have drastic consequences for the US healthcare system, causing supply shocks that would result in loss of lives and force hospitals to make tough choices in allocating insufficient supply.” (…)

Much of the government’s understanding of China’s reach is an estimate because the Food and Drug Administration doesn’t collect data on where the basic building blocks of medicine are made. The commission is recommending that Congress prepare legislation requiring companies to disclose that information to the FDA.

“We’re really, really far away from figuring this out,” Miller said.

Even as its control over generic drugs draws criticism, China is working to replicate that success in the production of more innovative treatments, according to the report. Economic incentives and a more lax regulatory landscape have made China an important development partner for brand-name pharmaceutical companies around the world, particularly for conducting “cheap, fast early-stage exploration,” the report said.

A survey last year by the Biotechnology Innovation Organization, an industry trade group, found 79% of 124 biopharmaceutical companies had China-based development and manufacturing partners. Most biotech companies don’t have the money needed to make drugs in the US, a key priority for President Donald Trump.

FDA Commissioner Martin Makary floated the idea last month of lowering the multimillion-dollar fees companies are charged for reviewing new medications if early-stage studies are done in the US rather than in China. Makary told a gathering of pharmaceutical supply chain experts in Washington that the agency sees the upcoming user-fee negotiations with the industry, held every five years, as the venue for setting those potentially lower fees. (…)

China isn’t done yet. According to the report, it is leading in so-called “synthetic biology,” or the artificial creation of biological organisms.

Dominance in that scientific field puts China in an indispensable position on a number of medical fronts, from making amino acids crucial to insulin and antibiotics to developing mRNA technologies and genetically engineered cells. Importantly, it also entrenches the country in every aspect of pharmaceutical production.

“The Chinese synthetic biology industry, for the foreseeable future, will have access to the innovations and know-how of global competitors,” the report said. (…)

China isn’t the only country the US relies on for its drug supply. India also plays a large role, producing the bulk of America’s generic drugs in finished form. While India makes many of the key pharmaceutical ingredients itself, a large share of the necessary materials comes from China, according to the report. Also affected are brand-name drugs from Europe, where companies source more than half of their key ingredients from China, the report said.

In the end, fixing supply chain vulnerabilities will require a wholesale approach, take years and be difficult to pull off, the authors of the report said. It will require “significant modifications to US and global economic statecraft, tools, and approaches,” including efforts to bolster domestic manufacturing, they said.

While the Trump administration has secured commitments from some large drugmakers to open manufacturing plants in the US, they don’t include generic companies that can’t afford it. Recently imposed restrictions and reductions in research funding at US universities and other institutions could also limit America’s chances of extricating itself from China’s grasp.

AI CORNER

Alibaba’s Qwen (Tongyi Qianwen) App Enters Beta, Competing with ChatGPT Across the Board

On November 17, Alibaba officially announced the “Qwen” project, fully entering the consumer AI market; the public beta of the Qwen app launched the same day. Built on Qwen3, the world’s top-performing open-source model, it competes head-on with ChatGPT by being free and by integrating with a wide range of everyday-life scenarios. Alibaba’s senior management regards the “Qwen” project as the “battle for the future of the AI era.” (…)

An international version of the Qwen app for the global market will launch soon, competing directly with ChatGPT for overseas users by building on the Qwen model’s existing influence abroad.

In February this year, Alibaba announced a 380 billion yuan investment in AI infrastructure and set a long-term goal of expanding the energy consumption of its cloud data centers tenfold by 2032. Since being fully open-sourced in 2023, Alibaba’s Qwen has surpassed models such as Llama and DeepSeek to become the most powerful and widely used open-source large model globally.

To date, the Qwen model series has been downloaded more than 600 million times globally, earning considerable industry recognition. Alibaba recently released its flagship model Qwen3-Max, whose performance surpasses international competitors such as GPT-5 and Claude Opus 4 and places it among the top three globally.

Qwen is rapidly gaining ground in Silicon Valley. Airbnb CEO Brian Chesky has publicly said the company is “relying heavily on Qwen” because it is faster and better than OpenAI’s models. NVIDIA CEO Jensen Huang said Qwen has captured a large share of the global open-source model market and continues to expand it. Alibaba’s open-source Qwen is becoming a foundation for AI development in Silicon Valley.

Alibaba’s management believes AI development will pass through three stages: “learning from people,” “assisting people,” and “exceeding people.” Large models have now entered the agentic-AI stage of “assisting people,” and in Alibaba’s view the timing is now ripe for a major push into the consumer market.

Alibaba said the Qwen app released this time is a preliminary version, which will use its most advanced models to build a “smart personal AI assistant that can chat and handle tasks.” Beyond smart conversation, the ability to handle tasks will be a key focus for the Qwen app, whose strategic goal is to become the AI gateway to everyday life.

Qwen already shows some ability to handle tasks. A single instruction, for example, can have the Qwen app complete a research report in a few seconds and turn it into a polished PowerPoint running to dozens of pages. Not long ago, Qwen also won a real-time investment competition against top global models such as ChatGPT, Gemini, and Grok.

Alibaba is reportedly planning to integrate life scenarios such as maps, food delivery, ticket booking, office work, learning, shopping, and health into the Qwen app, giving Qwen stronger task-handling capabilities.

Samsung hikes memory chip prices by up to 60% as shortage worsens, sources say

Samsung Electronics this month raised prices of certain memory chips – now in short supply due to the global race to build AI data centres – by as much as 60% compared to September, two people with knowledge of the hikes said.

Shares of Samsung, SK Hynix and U.S. chipmakers rallied sharply on the news, which underlines how the boom in artificial intelligence has stoked intense demand both for chips designed specifically for AI tasks and for the memory chips used alongside them.

Soaring prices for these memory chips, which are mainly used in servers, are likely to add to stress for big companies building out data infrastructure. They also threaten to increase the costs of other products like smartphones and computers in which they are also used.

Many of the largest server makers and data center builders are “now accepting that they won’t get nearly enough product. The price premiums being paid are extreme,” Tobey Gonnerman, president of semiconductor distributor Fusion Worldwide, told Reuters.

The South Korean firm’s contract prices for 32-gigabyte (GB) DDR5 memory modules jumped to $239 in November, up from $149 in September, he said.

DDR memory chips are used in servers, computers and other devices, assisting with computing performance by temporarily storing data and managing rapid data transfer and retrievals.

Samsung also lifted prices of 16GB DDR5 and 128GB DDR5 chips by about 50% to $135 and $1,194 respectively. Prices of 64GB DDR5 and 96GB DDR5 have gone up by more than 30%, Gonnerman said.
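
As a quick sanity check on those figures, the sketch below (Python, purely illustrative) recomputes the percentage moves; since the report does not give September prices for the 16GB and 128GB modules, those are back-solved from the roughly 50% hikes and are only approximations.

```python
# Back-of-the-envelope check of the quoted Samsung contract-price moves.
# The September prices for the 16GB and 128GB modules are back-solved from the
# reported ~50% hikes, so they are approximations rather than reported figures.

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# 32GB DDR5 module: $149 (September) -> $239 (November)
print(f"32GB DDR5: {pct_increase(149, 239):.1f}% increase")  # ~60.4%, the 'up to 60%' in the headline

# Implied September prices if the 16GB and 128GB modules rose ~50%
for label, november_price in [("16GB DDR5", 135), ("128GB DDR5", 1194)]:
    implied_september = november_price / 1.5
    print(f"{label}: ${november_price} now, implying roughly ${implied_september:.0f} in September")
```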

Samsung declined to comment. It separately announced on Sunday that it will build a new chip production line at its plant in South Korea, as it expects AI to drive demand over the mid to long term.

The chip crunch has been so severe that it has spurred panic buying by some customers, according to industry executives and analysts.

China’s top contract chipmaker SMIC said on Friday that the memory chip shortage has meant that customers are holding back orders for other types of chips that are also used in their products.

Xiaomi, a Chinese smartphone, electronics and auto manufacturer, also warned last month that the surging prices have raised the cost of making phones.

The Debate About the Quality of AI Earnings

Ed Yardeni

Michael Burry, the man behind the “Big Short” during the Great Financial Crisis, is shorting the AI trade because, he notes, hyperscalers have been depreciating their GPU investments over more than three years. He thinks they should be depreciating them over less than three years. (…)

Hyperscalers are stretching GPU depreciation schedules, a move that lowers expenses and boosts reported earnings. Critics argue that this is aggressive accounting since GPUs often become obsolete faster.

Many major hyperscalers publicly use an estimated useful life for their AI server equipment, including GPUs, of five to six years. This is an extension from their historical depreciation schedules for general-purpose servers, which were often around three years. Companies like Microsoft and Oracle have been cited as using or factoring in a useful life of up to six years for their new AI chips/servers. Cloud GPU rental company CoreWeave also extended its GPU depreciation period to six years, from four years, in 2023.

Amazon (AWS) uses shorter schedules closer to four years, while Meta has pushed to extreme lengths of 11–12 years. Microsoft, Google, and Oracle generally fall in the four- to five-year range.
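
To make the accounting stakes concrete, here is a minimal straight-line depreciation sketch; the $30 billion capex figure and the zero residual value are assumptions chosen for illustration, not any hyperscaler’s reported numbers.

```python
# Illustrative only: straight-line depreciation of a hypothetical $30B GPU purchase
# under different assumed useful lives. The capex figure and the zero residual value
# are assumptions for this sketch, not any hyperscaler's reported numbers.

capex_billion = 30.0  # hypothetical GPU/server capital expenditure

for useful_life_years in (3, 4, 5, 6):
    annual_expense = capex_billion / useful_life_years  # straight-line, zero salvage value
    print(f"{useful_life_years}-year life: ${annual_expense:.1f}B depreciation per year")

# Stretching the schedule from 3 years to 6 years halves the annual expense
# (from $10.0B to $5.0B here), which is why critics argue longer assumed lives
# flatter near-term reported earnings.
```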

Critics, including some prominent investors, argue that the true economic lifespan is much shorter, perhaps one to three years. Nvidia is now releasing new, significantly more powerful and energy-efficient AI chips (like the Blackwell and Rubin generations) on a one-year product cycle. This rapid innovation can make older chips economically obsolete for high-end AI training workloads much faster than a five- to six-year schedule suggests. High utilization rates (60%-70%) in demanding AI workloads also contribute to faster physical degradation.

Hyperscalers justify the longer depreciation schedule by arguing for a value cascade model. They contend that older generation GPUs, once replaced in top-tier training jobs, are simply cascaded down to power less computationally intense but high-volume inference (running the model) or other tasks, where they can still generate significant economic value for years. They also cite continuous software and data center operational improvements that extend the hardware’s life and efficiency.

If depreciation schedules don’t align with real-world replacement cycles, companies may be overstating their profitability and underestimating the capital-intensive nature of AI infrastructure. That would increase the chances that the AI boom is turning into an AI bubble that may be about to burst.

We side with the hyperscalers rather than Michael Burry in the depreciation debate. Data centers existed before AI caught on in late 2022, when ChatGPT was first introduced. During 2021, there were as many as 4,000 of them in the US as a result of rapidly increasing demand for cloud computing. Many are still operating with their original chips. The revenues and earnings of the hyperscalers continue to rise rapidly.

I agree with Ed. Performance needs will increasingly vary as AI moves from frontier training to less demanding applications such as inference, fine-tuning, edge deployments with moderate performance requirements, and batch processing. Cloud providers are developing sophisticated scheduling and lifecycle management systems to dynamically allocate GPUs according to workload intensity.
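
As a rough illustration of that idea, and of the “value cascade” described above, here is a toy Python sketch that allocates GPU generations by workload tier; the generation names, fleet sizes, and tier mapping are hypothetical, not any cloud provider’s actual scheduler.

```python
# A toy sketch of the 'value cascade' allocation idea: the newest GPU generation
# serves frontier training, while older generations are reassigned to inference,
# fine-tuning, or batch work. Generation names, fleet sizes, and the tier mapping
# are hypothetical illustrations, not any cloud provider's actual scheduler.

GPU_FLEET = {
    "gen_2025": 1_000,  # newest accelerators
    "gen_2023": 3_000,
    "gen_2021": 5_000,  # oldest still in service
}

WORKLOAD_TIERS = {
    "frontier_training": ["gen_2025"],
    "inference_and_finetuning": ["gen_2023", "gen_2025"],
    "batch_and_edge": ["gen_2021", "gen_2023"],
}

def allocate(workload: str, requested: int) -> dict[str, int]:
    """Greedily fill a GPU request from the generations allowed for that workload tier."""
    granted: dict[str, int] = {}
    for gen in WORKLOAD_TIERS[workload]:
        take = min(requested, GPU_FLEET[gen])
        if take:
            granted[gen] = take
            GPU_FLEET[gen] -= take
            requested -= take
        if requested == 0:
            break
    return granted

print(allocate("frontier_training", 800))  # filled entirely from the newest generation
print(allocate("batch_and_edge", 4_000))   # filled from the oldest generation still in service
```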

And it’s not like we are about to have excess GPU production. Demand still far exceeds production capacity.
