Price, Narrative, and the Breakdown of Verification in the Intangible Economy
On November 30, 2022, OpenAI released ChatGPT. Within weeks, NVIDIA's valuation surged by hundreds of billions of dollars. No new factories had been built. No new chips had been shipped.
On January 27, 2025, DeepSeek released its R1 AI model, challenging industry perceptions of model performance. Within hours, the market capitalizations of public AI infrastructure companies plunged.
On February 22, 2026, Citrini7 released an essay lamenting the future of software companies. Within days, the S&P 500 fell nearly 1%, with software companies taking the brunt of the loss.
Considered in isolation, these events seem to be standard market corrections. Markets reprice expectations all the time, especially during periods of euphoria. Many have ascribed the market's reaction to events such as the DeepSeek release to the innate biases that have long gripped investor psychology.
With nearly $1 trillion invested in chips, infrastructure, and energy, the ongoing AI buildout has drawn parallels to the Railway Mania of the 1840s. Looking to the likes of Soros and Popper, many have described the preconditions for the bubble phenomenon as endemic to human nature. In Soros's view, our inherent biases and perceptions shape market fundamentals, driving asset buying and selling in a feedback loop that produces bubbles and their deflation.
Taken together, however, these events reveal something far more unusual. The distance between narrative and valuation has all but disappeared. The explanation for this shift lies not in unsubstantiated notions of market implosion or in behavioral biases, but in a structural shift in asset composition.
The price system historically functioned through externally imposed verification intervals — discrete moments where forecasts met physical constraints. The dominance of intangible assets has weakened those intervals by reducing the role of exogenous physical constraints. Prices now increasingly reflect expectations about future revisions of expectations rather than expectations about constrained cashflows.
The Growing Share of Intangibles in the Economy
The post-war period through the late 1990s and 2000s marked a pronounced transition, as the spread of ICT gave rise to the "knowledge economy." Accompanying this shift was growing investment in "intangible" assets: in 1975, tangible assets accounted for 83% of S&P 500 market value; by 2025, intangible assets accounted for 92%.
Intangible assets are those that "typically involve the development of specific products or processes, or are investments in organizational capabilities, creating or strengthening product platforms that position a firm to compete in certain markets." Per economist Charles Hulten, intangible assets fall into three main categories: computerized information, scientific and creative property, and economic competencies (firm-specific knowledge). One of the more emblematic categories of intangible assets is software. Platforms such as Facebook and Slack are now embedded in our social fabric, demonstrating this broader economic shift.
Contrary to those who analyze this secular trend through an accounting lens, the critical differentiator between intangible and tangible assets is ontological: tangible assets carry physical constraints against which their prices can be verified; intangible assets do not.
Apply this framework to software. From social networks to marketplaces, none of these platforms has an engineering limit analogous to factory throughput. In theory, the platform can scale indefinitely; its valuation stems from user coordination, not from physical capacity. There is no moment when observed output reveals that the plan was wrong about capacity, because software platforms lack genuine physical constraints.
Physical constraints serve as "temporal anchors," verifying asset prices on a scheduled basis against measures such as delivery cycles, capacity utilization, and energy costs. Intangible assets structurally lack temporal anchors of comparable strength.
Rather than attribute the market's reactions to events such as the DeepSeek launch to behavioral biases, we should look to this structural shift in asset composition. An unexamined presupposition underlying the Austrian School of Economics may hold the answer.
Implicitly considered a social technology by most economists, prices enable information transmission. Friedrich Hayek described price as a function of two primary sources of information: scientific and localized. Scientific knowledge pertains to systematic, articulable knowledge of general rules, laws, and regularities. Localized knowledge refers to information that remains contextual and tacit; it is acquired at specific moments, revealed through temporal processes, and becomes obsolete as circumstances change.
With scientific knowledge remaining constant, prices adjust only when localized information changes. When novel data enters the localized set, it is assessed against prior expectations, functioning as a "verification" event. If the new information point deviates from the forecast, the price adjusts accordingly. If the information point confirms the forecast, the price holds. For prices to discipline forecasts, discrete "after" moments must exist in which those forecasts can be verified or falsified by new data points arriving from outside this pricing mechanism.
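The contrast between a price path disciplined by scheduled verification events and one driven only by revised expectations can be made concrete with a toy simulation. Everything below is purely illustrative: the correction rule, shock size, verification interval, and function names are arbitrary assumptions for exposition, not a model of any actual market.

```python
import random

random.seed(0)  # deterministic run for illustration

def tangible_price_path(fundamental, horizon, interval, start=100.0):
    """Toy model: the price moves only at discrete verification events,
    when a realized observation confirms or falsifies the standing forecast."""
    price, path = start, []
    for t in range(horizon):
        if t % interval == 0:                  # exogenous verification moment
            observed = fundamental(t)          # data arriving from outside the price system
            price += 0.5 * (observed - price)  # partial correction toward the observation
        path.append(price)
    return path

def intangible_price_path(horizon, start=100.0):
    """Toy model: with no external verification interval, each period's price
    is a revision of expectations about expectations -- a compounding drift
    with no anchoring observation."""
    price, path = start, []
    for _ in range(horizon):
        sentiment = random.gauss(0.0, 0.02)    # mean-zero narrative shock
        price *= (1.0 + sentiment)             # expectations compound on expectations
        path.append(price)
    return path

# A constant fundamental keeps the anchored path pinned at its starting value,
# while the unanchored path wanders with each narrative shock.
anchored = tangible_price_path(lambda t: 100.0, horizon=40, interval=10)
unanchored = intangible_price_path(horizon=40)
```

In this sketch, the anchored path cannot drift away from the fundamental for long because every tenth period delivers an "after" moment; the unanchored path has no such moment, so each period's price is built only on the prior period's expectations.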
The dominance of intangible assets has collapsed this temporal structure, and with it, the epistemic foundation on which Hayekian coordination depends.
Unlike tangible assets, intangible assets are not disciplined by regular, external verification intervals. Software platforms, model weights, and brand ecosystems do face costs, but these constraints are less frequent, less binding, and more interpretation-sensitive than physical throughput or resource depletion. As intangible assets come to encompass a larger share of the economy, prices cease to represent constrained states of the world and instead encode expectations about the future revision of expectations themselves.
With these intangible assets lacking a discrete temporal structure, the verification window is indefinitely deferred. The informational environment shifts continuously, and the forecast being verified is itself an evolving expectational target. The market's reaction to narrative shifts such as Citrini7's article now makes sense: an intangible asset's valuation becomes a function of beliefs about the persistence of the narrative under which its growth trajectory remains meaningful.
Even seemingly discrete firm-level events, such as strong earnings, a major partnership, or a successful launch, now rarely serve as terminal verification of the forecasts embedded in intangible assets. Instead, these events are immediately incorporated into the next round of expectations. For tangible assets, verification terminated the chain. For intangible assets, verification feeds the chain. Without the temporal structure that this pricing process presupposes, novel data points do not produce closure; rather, each data point becomes the basis for reconstructing the forecast itself.
The growing share of intangible assets represents not another episode of the behavioral biases that drove the bubbles of yesteryear, but a regime shift.
A pricing regime is characterized by the extent to which it is governed by temporal anchors. A regime dominated by tangible assets is one that sees continuous falsification on a regular schedule. Emblematic entities of this regime include energy companies and commodity producers, whose valuations are persistently disciplined by physical constraints. In contrast, a regime dominated by intangible assets is one that sees expectations revised primarily through interpretation. Examples of entities within this regime include software platforms and frontier AI research labs. In a tangible-based regime, time delivers externally generated "afters" that impose discipline. In an intangible-based regime, time delivers successive rounds of internally generated reinterpretation.
From tulips to dot-coms, markets have always priced narratives. The current regime is structurally different: it lacks the temporal anchors that once disciplined these expectational dynamics.
Schelling Point Entities
It is within this regime shift that a novel category of entities has emerged. These entities fall cleanly into neither category; they straddle the boundary between the two regimes. While part of their valuation is disciplined by external verification events, the remainder is governed by expectational dynamics.
The tangible aspect of their valuation prevents full detachment from the temporal process, leaving some expectations exposed to falsification. The intangible aspect enables them to emit signals that other actors treat as narrative shifts, resulting in the prices of these entities effectively serving as coordination events.
Analogous to the Misesian regression theorem, initial physical constraints such as engineering limits and unit economics permit capital formation at scale. This capital, once deployed, finances the accumulation of intangible assets such as ecosystems, brand, and network effects. In practice, these "Schelling Point" entities are both disciplined and disciplining, subject to verification while serving as a coordination point for other entities. Their external communications and price movements serve as a barometer for market sentiment, both aggregating and emitting changes in information.
In the context of the AI infrastructure build-out, NVIDIA has emerged as a reference point for public and private companies alike, with its quarterly earnings dictating market sentiment. Hyperscaler capital expenditure forecasts at the likes of Google and Amazon are revised in response. Valuations of power infrastructure companies, semiconductor equipment companies, and entities several degrees removed from NVIDIA adjust in coordination.
At the same time, NVIDIA faces semiconductor supply constraints. Constraints such as fab capacity allocation at TSMC and delivery schedules for Blackwell chips yield verifiable outcomes within a bounded time horizon. The April 2025 export-license restrictions triggered a $5.5 billion inventory write-down on H20 chips, forcing a repricing that operated independently of any narrative.
As a Schelling Point entity, NVIDIA's communications and financial metrics generate cross-asset revisions and directionality that non-Schelling-Point entities do not. The economic significance of these boundary entities lies in this dual character: partially disciplined by physical constraints while simultaneously serving as a reference point for other actors.
With its May 2024 guidance, NVIDIA's investor materials introduced the framing of "AI factories" and emphasized inference-driven demand as a distinct growth vector. In subsequent quarters, hyperscaler earnings calls adopted NVIDIA's framing. Rather than prior metrics, capital expenditure was increasingly justified in terms of GPU throughput and inference capacity. By late 2025, when Meta guided for $115–135 billion in 2026 capital expenditure and Alphabet guided for $175–185 billion, these figures were presented and interpreted through a framework that NVIDIA's signal emissions had made salient a year earlier. Through its external communications, NVIDIA structured the interpretive categories that in turn constituted the information environment for subsequent pricing.
Metastability
The resulting system structure is non-classical. The system now coordinates around an endogenously produced reference point whose relationship to external fundamentals is contingent. Information is continuously generated and filtered through the same channels; perturbations in the information environment are incorporated without producing system-wide resolution.
Targets are revised, horizons are shifted, and evaluation criteria migrate. In simpler terms, the system preserves coherence by continuously amending the standards of coherence. This is metastability in the thermodynamic sense: a locally stable configuration that is not the global minimum, maintained by barriers to transition rather than by convergence to a true equilibrium.
Coordination increasingly proceeds through interpretive alignment rather than constraint-based falsification. The drastic market reaction to the release of the DeepSeek model or the Citrini7 article was driven not by behavioral biases but by a shift in the market's perception of the information environment. Coordination continues through mutual orientation rather than sequential falsification, with the system preserving coherence without ever producing a terminal verification event. As resynchronization repeats around the Schelling points, the mismatch between verification-era institutions and focal-era dynamics widens.
A Loss of Sovereignty
Beyond pricing mechanisms and information transmission, the capital markets occupy a vital position in our society. Corporations use mergers and acquisitions to bolster their strategies and diversify their balance sheets. In 2025, global M&A deal value reached $4.8 trillion (Bain), with nearly half of strategic technology deal value coming from deals explicitly citing AI benefits, in some cases borrowing the terminology and guidance numbers set by the focal entities themselves.
Retail investors use the markets as a source of wealth preservation. Nearly 62% of Americans own stock directly or through retirement accounts (Gallup, 2025). US higher education endowments hold around half their portfolios in equity strategies (NACUBO, 2024) in order to meet their fiduciary responsibilities, with 48% of endowment spending going directly to student financial aid.
Pension funds hold $29.6 trillion in total financial assets (Federal Reserve, Q3 2025), with public pensions alone serving nearly 25 million active members and annuitants while allocating roughly 44% of their capital to public equities (NASRA, 2024). Each of these institutions is obligated by its mandate to treat prices as verification events. The dissolution of the temporal structure now removes the conditions under which these obligations can be met.
Despite the gravity of this new paradigm, our institutions remain structurally unequipped to re-anchor the markets: their capabilities presuppose the very temporal structure that has dissolved under the new regime.
In response to the lax risk parameters and opaque reporting that led to the Financial Crisis of 2008, much of the regulation in the years that followed sought to expand disclosure requirements. The SEC's disclosure framework was constructed on the assumption that material information exists in a form that can be identified, verified, and communicated to investors on a periodic basis. These efforts retain salience for assets facing physical constraints, such as a factory's utilization rate or a mine's reserves. For intangible assets, however, the notion of "disclosure requirements" loses its epistemic grounding.
While seldom used, circuit breakers and position limits are designed to arrest erratic price movements that have diverged from a recoverable fundamental value. The assumption underpinning these tools is that volatility represents temporary dislocation: if trading is halted, rational assessment can resume, and prices will converge back to some reference level. In the focal regime, however, there is no stable reference level independent of the coordination process itself.
Frameworks used to govern fiduciary standards lose salience as well. Modern portfolio theory, the framework that governs risk parameters for pensions, endowments, and institutional investors, decomposes risk into factors derived from historical price behavior. Portfolio allocation then optimizes across these risk factors to meet fiduciary standards. With the dissolution of the temporal structure, however, the very risk parameters used to meet these responsibilities become endogenous to the same expectational dynamics as the portfolio itself. Correlations measured during periods of stable verification reveal nothing about correlations during a narrative phase transition.
Our current institutions treat market dysfunction as a product of informational asymmetry or behavioral distortion — both of which assume that a correctable relationship to underlying fundamentals still exists. With a change in the structural composition of assets, the dissolution of the temporal structure now reveals a structural deficiency that neither diagnosis can reach.
It is because the deficiency is structural rather than behavioral or informational that current dynamics cannot simply be resolved by reform or action within the existing regulatory order. In Carl Schmitt's terminology, the system has reached a point where it has exceeded the normative parameters of its own jurisprudence.
These deficiencies compound. In this new regime where the risk parameters are endogenous to the same expectational dynamics as the portfolio itself, our existing regulatory tools systematically reinforce these patterns. The normative order now produces the conditions of its own crisis.
The FSOC of Tomorrow
After the financial crisis of 2008, the Dodd-Frank Act of 2010 established the Financial Stability Oversight Council (FSOC). Charged with monitoring "systemically important" institutions, the FSOC was a collaborative body composed of the heads of the major financial regulators. In light of the poorly constructed risk parameters and capital requirements that led to the implosion of the American real estate market, the council was established to subject select financial institutions to enhanced regulation in anticipation of a second financial crisis.
The FSOC was established with the intent of serving as such a sovereign entity, capable of identifying the widening cracks in the system and enforcing pricing discipline. However, the tools it uses to monitor change remain calibrated to a regime dominated by tangible assets: leverage ratios, credit ratings, counterparty exposure.
The FSOC of tomorrow is one attuned to the interpretive dependency that cascades throughout the system, capable of mapping the frameworks communicated by the boundary entities and understanding the system's locally stable configuration and the constraints that nominally anchor it. In practice, such an entity would possess operational license, not mere oversight — the ability to directly intervene in the markets to reintroduce exogenous verification and prevent coordination from collapsing into a single reference point.