Inside This Week’s Issue:
The return of geopolitical gravity
R.U.R. to R.O.I.: The oldest Valley trick in new packaging
The most dangerous ideas arrive in business casual, bearing convenience and small rewards.
Kodiak’s SPAC: storytelling first, substance later?
Wisdom has not increased, even if information has.
We lack a periodic table of cognition.
Interpretability Is Wayfinding, Not Autopsy
We’re unclear about the essential lines between inspiration, author and publisher. In the eyes of some, there is now too much distance between the laws of yesterday and the technology of tomorrow.
Directing the evolution of other species looks a lot like playing God.
Long reading is the only detribalizing technology known in a society slouching toward digital orality.
Order Out of Chaos: The Unwritten Laws of Nature’s Creativity
A.I. Uncloaks the Corporate Personality
“Oh, people do care about the brand. They were very passionate.”
All the world is a stage.
Authenticity is a holistic phenomenon, and when we don’t know who we are authentically we become very confused about what we want.
If book smarts can’t be trusted, or agreed on, then why not revert to more intuitive and traditional forms of learning?
Tipping is just redistribution of capital from people pleasers to non people pleasers
When combined with an underfunded, poorly designed welfare state, it’s not surprising the US has seen decades of underwhelming productivity growth, an alarming rise of economic inequality, and death rates not seen in other rich countries.
Things that if America contemplated doing even 10% of what China does, various people would say this will instantly cause us to ‘Lose to China’
Measuring the performance of our models on real-world tasks
Chronic stress means chronic hypoxia
Light Is a Map: Seasonal Blue and Your Clock
Meanwhile, China is still under-promising and over-delivering.
A revolution is brewing.
The Miraculous Complexity of Freedom
Can Space Win Against the Screen?
Underground Signals, Overground Loneliness
CVS’s subsidiary Omnicare has collapsed under the weight of its own alleged misconduct.
Are brands useful methods for coordinating user behavior in the age of AI?
Where is symbolic AI?
Back pain tied to higher rates of common diseases
We’re entering the age of design because our AIs are designed. … This is a key step in the world and in the universe. … Now we’re giving rise to designed entities.
The ultimate goal for a platform is to innovate continuously: create evolutionary tension with its ecosystem, anticipate future needs, and build for them.
No Middle Ground: Platforms, Wellness, and the Missing Nuance
Some weeks, it feels like the map itself has returned to collect its debt. For a decade, we let stories float above the world—unmoored, frictionless, eager for consensus, conveniently blind to muscle, place, and risk. But as any seasoned navigator on a dark sea will tell you, there are limits: beyond a certain point, no map survives first contact with the rocks. Geography, history, and the struggle of lived life retake the helm.
There’s a new price on abstraction. Narratives untethered from place and practice simply don’t pay out like they used to; the market corrects, reality reasserts, and technique-driven storytelling finds its ceiling. The signal, stubborn as ever, cuts through. My weekly round-up is a hunt for these signals and the persistent undertow of the real economy—a mosaic built on insights from people who are seeking purpose, direction, and meaning in work, life, and the world at large.
We’re entering a cycle where stories must submit to their coordinates. Our ancestors had to smuggle know-how through the needle’s eye of memory—what survived wasn’t what was easiest to tell, but what was hardest to ignore. The myths that stuck did so because they fit the room, the context, the ears hungry to listen. There was no cloud storage for trial and error. Narrative was paid for in pain, discipline, and the privilege of still having a voice next year.
The craft, the risk, the hard-won callouses—these are once again what matter most. Abstract technique cracks and flakes at the edge, where the only thing that matters is what you can actually lift, weld, feel. And the correction towards roots and archetypal wisdom rises out of need. You see, the most enduring tales travel light and hard and survive because they compress what was too important to lose, not what was easiest to archive.
We’re haunted, still, by the question: which maps serve us now, and which maps are relics left for cryptographers? Every generation inherits resonance and divergence—Buddha and Jesus ring different notes, but at bottom, human experience overlaps where the ground is hardest and the signal clearest. But the transmission is never static. The resonance of ancient themes survives ruthless technological advance, even as the written word, and now digital code, strip the old constraints from how we store and circulate meaning. We can send every story, but which ones still move feet and hands in the territory?
There’s a limit to importing maps built for a slow, local world into an exponential, untethered present. It’s time to move beyond the endless recycling of Nietzsche, Steve Jobs, and Charlie Munger quotes—intellectual shortcuts that have become mere decorations rather than authentic guidance. Some adaptive canyons are so deep, maybe the archetypes last; others are just shadows made flat by time and progress. Which are which? The operating system of humanity ran on narrative, not spreadsheet or code. If we seek models with deeper roots—those born of millennia of lived experience rather than a century of industrialized thinking—we might find wisdom better coupled to our fundamental nature. If the current batch of archetypes calls forth the elements we need for this transition, all well. If they become kindling for hubris, we’re left staring at coordinates with no landscape beneath.
What’s required is the care of a groundskeeper—what is worth watering, what must be pruned. The answer isn’t more accumulation, more streaming, higher numbers. As inputs scale, outputs do not. Consider the stark realities: 23% of American adults didn’t read a single book last year, while one in five women of childbearing age now cannot conceive. Nearly 40% of adult men report experiencing “extreme loneliness,” and cancer rates among those under 50 have risen by 79% since 1990. More degrees, more dollars, more data—yet the world feels stuck, running a Red Queen’s race, speed without gain. The signal shrinks as the statistics grow, numbing out empathy in oceans of numbers. Real action connects at the level of the individual, the particular, the stubborn detail.
The antidote is slower attention anchored in the direct, the local: names, faces, places. But in these times, bear in mind how easy it is to mistake groupthink for culture. The former compresses variance and rewards conformity to a story, but cultural evolution compounds variation under constraint—selecting for place, history, and the willingness to lose and try again. Culture actually learns. It’s adaptation, it’s sweat, it’s procedural lived reality—the groove in the vinyl, the landmark in the graffiti, the meaning found in limits. This is the awkward gift of our species: to break from script, invent what neither nature nor tradition had planned, and risk getting it wrong so that, just maybe, someone with skin in the game might get it right.
Our species exists within a fundamental tension—the brain evolves slowly while the world accelerates rapidly, creating a mismatch that shapes our reality. Humans ingeniously invent, adapt, and build culture at remarkable speed, yet we frequently find ourselves constrained by inherited instincts and outdated scripts. Our unique privilege—and responsibility—is to redraw our maps, determining what wisdom to preserve and what limitations to discard, at a scale unprecedented in human history. This work demands locality, granularity, and iteration. Risk-taking and trust form the fertile ground where future narratives will either flourish or perish. The era of frictionless progress and consensus storytelling has ended; what remains is the necessity to employ our tools anew on terrain that demands acknowledgment of its reality. The guideposts emerge from ground truths, meaning arises from what challenges convenience, roots reconnect at the cost of abstraction, and narratives reanchor to their proper context. This is how coherence regains its power—an endeavor that requires sustained attention, calculated risk, collective memory, and deliberate care.
The return of geopolitical gravity
Gravity always wins. While tech utopians sell digital sandcastles detached from physical reality, geopolitics reminds us that geography trumps narrative. Four events in 72 hours—Saudi-Pakistan alliance, Trump’s Bagram negotiations, Xi-Trump rapprochement, and US sanctions against India’s Iran port—reveal what happens when imperial electromagnetic fields weaken: particles realign to natural forces.
The map isn’t the territory—but the territory is reclaiming the map. Localization isn’t just coming; it’s here, and those ignoring it are building on sand while the tide rises.
R.U.R. to R.O.I.: The oldest Valley trick in new packaging
A century ago, Čapek’s R.U.R. coined “robot” and warned that automation’s first optimization target is the human. Today the spell is the same, just better marketed.
Problems-as-a-service: Anthropic’s vibe is infinite issues, infinite prompts. “Keep thinking with Claude.”
Fear-as-management: “We’re up against the Chinese.” Eric Schmidt used it to swat down remote work on the All‑In podcast.
The missing layer is cultural integration. As David Foster Wallace kept pointing out, the crisis we didn’t see coming from the spread of access is loneliness. Turns out, loneliness grows when attention is un-stewarded. The Valley plays perception games because the gods of marketing always claim their due: if words rule the market, words will rule you. Ask Altman—play as if words outweigh substance and you’ll be judged by the script, not the build.
The work is to make people less lonely—not more optimized. Remember the ancient wisdom of Lao Tzu: “Only someone who is unconcerned about managing the world can be entrusted with the world” (Tao Te Ching, verse 13).
The most dangerous ideas arrive in business casual, bearing convenience and small rewards.
Here’s the most dystopian cultural shift you never saw coming: humans actively transforming themselves into raw material for AI training. Neon, now the #2 social app, pays users trinkets for the privilege of recording their private phone conversations. Let that sink in.
In fairness, though, it’s the logical endpoint of our techno-industrial surrender. We nod approvingly at meeting transcription software because “everyone consented” but remain blind to the larger pattern. Neon’s wild success exposes some collective delusion: when millions eagerly trade intimate moments for pocket change, we’ve crossed a threshold from which there’s no easy return. The technologists harvesting this bounty have discovered your dignity has a market price—and it’s shockingly, devastatingly low.
Kodiak’s SPAC: storytelling first, substance later?
Remember Trevor Milton from Nikola? Thanks to a 2025 pardon, he’s not in prison—a reminder that narrative often outruns accountability. Nikola sold a map without the territory. There’s no allegation of fraud at Kodiak—but listing pre‑scale in a capital‑intensive field invites a familiar moral hazard: “speculative offloading,” where SPAC structures and media oxygen let insiders capture early liquidity while the public prices the promise.
In theory, reality taxes abstractions. So, if you can’t point to delivered miles, unit economics, and safety envelopes, you’re pricing a story, not a system. Privatize early upside, socialize late risk—that’s not innovation. It’s geography‑blind finance dressed as progress.
Wisdom has not increased, even if information has.
Ed West exposes the scholastic fallacy of our cultural institutions. ‘Museum fatigue’ is revealing the intellectual bankruptcy that results when curators substitute wisdom with verbose signaling. At the Hogarth exhibition at the Tate, West encountered performative narcissism: gallery notes that read like Teen Vogue editorials, stuffed with fashionable ideological posturing—irrelevant to the art itself. This institutional cancer has metastasized widely.
We lack a periodic table of cognition.
It is very probable we will discover that intelligence is likewise not a foundational singular element, but a derivative compound composed of multiple cognitive elements, combined in a complex system unique to each species of mind. The result that we call intelligence emerges from many different cognitive primitives such as long-term memory, spatial awareness, logical deduction, advance planning, pattern perception, and so on. There may be dozens of them, or hundreds. We currently don’t have any idea of what these elements are.
Interpretability Is Wayfinding, Not Autopsy
Interpretability that actually unlocks the A.I. application layer demands context and process—how the system came to be. Naomi Saphra’s wager is simple and correct: treat training like evolution, not a black box to be dissected after the fact. Alfred North Whitehead once warned that civilization advances by extending important operations we perform without thinking; in AI, that advance stalls when we forget to ask where those operations came from and what they were trained to ignore.
Real interpretability is operational history plus situated evaluation. It tells you which knobs map to behavior in the wild, so teams can ship with fewer unknown unknowns and design business models that price real causal control. Wayfinding beats map worship. If you want applications that survive contact with the territory, study the route, not just the destination.
We’re unclear about the essential lines between inspiration, author and publisher. In the eyes of some, there is now too much distance between the laws of yesterday and the technology of tomorrow.
As with fair use, defining “transformative” is a matter of degree. Then there’s the question of who is culpable for the creative act in the first place. So far lawsuits have only been filed against tech companies, but could we also see charges brought against individual creators who used generative AI and unwittingly created derivative outputs? Angela Oduor Lungati recognises this murkiness as reflective of deeper shortcomings in how we understand generative AI. “There’s still no clear answer to who owns what in this era,” she says, “and that’s worrying.”
Directing the evolution of other species looks a lot like playing God.
Four billion years of evolution prove that transformation is nature’s genius. But the scale and speed of humanity’s impacts suggest that even genius might need some assistance. If so, then we shouldn’t approach this simply as a change in technology, but as an opportunity for a change of mind.
Long reading is the only detribalizing technology known in a society slouching toward digital orality.
Donna Tartt once reflected: “Ever since I was a little girl I... always loved to read books and I thought what a wonderful thing if I could just read books all day.”
We might have been naive to believe that digital media, especially when it precedes print literacy, wouldn’t pull cognition back toward orality and performance—short cycles, tribal signaling, and attention capture over slow synthesis. Perhaps those pathologies we moralize as personal failure—polarization, superficiality, hair-trigger rage—are effects of how we consume information more than moral collapse. I’ve always seen long-form reading as cartography for the mind. In a way, it restores essential coordinates: sequence, context, inference, and the muscle of sustained attention. So where do we go from here? Surely, the goal isn’t nostalgia or a Luddite retreat, but something like re‑anchoring thinking in practices that make meaning travel across time and place. If we build and design for thinking, thought will return.
True innovation is not fragile: it is convex, antifragile, and deeply embedded in the constraints of geography, energy, demography, and institutions. The Polynesian navigators didn’t invent maps to get lost; they learned place by risk, tradition, and calibrating every decision to the facts beneath their feet. Today’s innovators must relearn this: if you can’t point to it on a map, don’t price it. The old regime of tech narratives is in its endgame. Gravity wins.
It felt like learning to reject this kind of anti-relational concept of objectivity. It was like learning to accept subjective experience again and sharing that subjectivity with someone else. … Most of my life, my perception has operated within a very limited bounded box. And when I learned to expand that box, the whole world opens up, full of new relationships.
To the untrained eye, a Pacific sailing vessel might appear archaic and unsophisticated, but if you look closer, these boats embody a really complex set of advanced techniques, from aerodynamics and hydrodynamics to sustainable manufacturing, accompanied by a philosophy of robust repairability.
Treating public figures like saints is not new, but the speed and scale of the process is.
In a polarized environment, the elevation of a figure into sainthood works as if through a form of cultural alchemy that transmutes ordinary political struggle into a sacred ritual. When you canonize someone as a martyr, opposition becomes sacrilege. Hagiography sustains more than mere remembrance—it changes the quality of the memory to weaponize it. This is how movements mobilize the living—through the sanctified dead.
The downside of this behavior reveals itself quickly. Sanctification creates untouchable zones in our discourse. The more we sacralize a figure, the more we render their flaws invisible, their errors unmentionable, their controversies unspeakable. It’s a flattening of complexity that generates a fragility in our historical understanding, crushing nuance. Remember the way that Martin Luther King Jr. became a victim of selective canonization. His economic critiques, anti-war positions, and structural analyses of racism disappear from mainstream memory. We get instead the dreamer, safely defanged. The annual holiday, the street names, the public murals seemingly honor him while actually domesticating his important and radical legacy into palatable unity narratives. Now look at Kennedy or Lincoln, whose assassinations triggered immediate mythologization. Death by violence serves as an express elevator to cultural sainthood. The black swan event of political murder creates a narrative distortion field that warps our collective memory.
Order Out of Chaos: The Unwritten Laws of Nature’s Creativity
The narrative always privileges entropy, but look closer: nature is the ultimate trickster, compounding order in defiance of the odds. In a universe allegedly ruled by the blunt tyranny of entropy, stars ignite from clouds of hydrogen, living cells organize, and entire ecosystems knit themselves from the raw ingredients of chance. Dembski’s provocation rightly interrogates the fraud of pretending this is a minor footnote to physics. What some call formational order feels like a rebellion against knowledge—the black swan of creative emergence that does not yet fit what we think we know about nature. When theory worships randomness, but your senses show you pattern after pattern—who are you going to trust, the spreadsheet or your skin? At this weird interval in human history, maybe we’re finally learning how to observe this improvisational dynamism.
A.I. Uncloaks the Corporate Personality
When Harvard Business Review reports on “workslop”—low-effort, AI-generated work that creates more problems than it solves—it misses the fundamental lesson. The AI hype machine created this predicament by detaching its promises from reality—allowing exaggeration and market fervor to spread before establishing a solid foundation.
In 1978, Wally Olins conceptualized organizations as having personalities, encouraging the public to relate to companies as they would to individuals. Today, this framework manifests with striking literalness. A company’s statements must ultimately align with its behavior and deliverables.
“Oh, people do care about the brand. They were very passionate.”
Ignore the street and sooner or later the street returns the favor. The Cracker Barrel fiasco reveals a deeper indifference on the part of the company toward the product itself. If Cracker Barrel truly cared, it would have sweated the details, tested its moves, and paid attention to the lifeblood of any lasting brand: people who actually use it, walk past it, and talk about it. The synthetic boycott also illuminates a feature of our information landscape that’s only going to get worse: orchestrated narrative manipulation. So, it turns out you can’t shortcut sincerity. As soon as a product loses touch with lived experience, digital noise fills the vacuum, but reality votes with capital and attention.
And while so much of modern commerce is geared toward scale, automation, and abstraction, there are brands that choose to zag—doubling down on signals of craft, care, and originality. Audi’s Blends The Road With The Runway campaign is a masterclass in this approach. By reimagining car components as a haute-couture dressmaking pattern, the campaign fuses engineering with elegance, turning complexity into something legible, tactile, and beautiful. I continue to appreciate how Shopify CEO Tobi Lutke summarizes the craft-begets-trust principle with characteristic directness: “It’s impossible to make great products if you don’t give a shit.”
All the world is a stage.
All the world is indeed a stage—but lately, the scene feels more like an exit interview than an inspiring act of collective imagination. Innovation and customer loyalty are clearly casualties under the current leadership. As Disney and Apple roll out sharp price hikes, squeezing subscribers with little promise of improvement, we see the same dead-end tactics that private equity uses to milk failing ventures: raise prices, cut costs, ignore quality, and ride the brand until the gates close for good.
There’s a growing gulf between instrumental affordance—the real creative capacity to solve problems, invent new futures, build bridges to possibility—and symbolic manipulation, where narratives and dreams are sold wholesale, but the underlying product rarely changes. This is why a figure like Elon Musk can sell Mars, trading on mythic possibility and narrative boldness, while much of corporate culture grinds along trying to sell incremental updates, squeezing ever-tighter margins from exhausted consumers. It’s a landscape regressing to the mean, where true innovation is sidelined for short-term safety, and the only risk is failing to change at all.
The only way to resist these endgame maneuvers is with action—cancel, and reclaim the creative, healing genius that mainstream gatekeepers have neglected in pursuit of oligarchic profit. The longing for genuine freedom and community is ineradicable, and as the mask slips, it is up to us—the players alive at this turning point—to vote for a different world with every choice we make.
Authenticity is a holistic phenomenon, and when we don’t know who we are authentically we become very confused about what we want.
This dynamic isn’t limited to the abstract or the spiritual—it’s playing out all around us, shaping headlines and personal choices alike. As nutrition guidelines trend on TikTok and new diet fads rise and fall in the wellness media churn, millions of people report feeling more confused than ever about what’s actually good for their bodies. Dating apps, built for optimization but not discernment, are awash in think-pieces about rising rates of “situationships” and decision paralysis. And in the labor market, the “great resignation” has given way to quiet quitting—not just a work trend, but a symptom of deeper existential exhaustion, as highly skilled people walk away from lucrative but soul-draining careers.
To live in the shadow of the root chakra is, increasingly, to swim in the currents of collective anxiety and uncertainty radiated by the culture itself. In our newsfeeds, we see the symptoms: headlines chronicling burnout, epidemic loneliness, surging self-help industries, and the constant search for belonging and security. We may not always recognize the origin of our unease, but the signals are everywhere in our media, our markets, and our daily lives: a longing for fulfillment and security that runs deeper than the confusion at the surface.
If book smarts can’t be trusted, or agreed on, then why not revert to more intuitive and traditional forms of learning?
Chris Arnade is one of my favorite writers on the internet today. He effortlessly finds meaning in the unnoticed stories that haven’t been devoured by algorithms. This piece reminds me that where modern book smarts fracture under lived complexity, magic realism becomes a strategy for establishing fidelity to conditions too splintered for tidy prose. Better to frame this not as a meager attempt at optimizing functions but as a powerful use of faith to encode reality’s overload. It’s a reminder, too, that myth and folklore have, for centuries, been lossless compressors when rational consensus fails.
While we do help animals primarily to make us feel good, we do that because it is a reminder that there is a point to life, which isn’t grounded in the rational, but in the spiritual. Those daily acts of small “irrationality” are attempts to maintain our soul in an overly rational dehumanizing world. It is a very human moment in an otherwise cold world, that’s fulfilling because we are not computers, not a mechanical being run by algorithms, but a spiritual being aligned towards the good.
…
Words, language, and fiction are odd if you think about them too much, because there truly are realities and truths that can’t be described, where no amount of verbosity or vocabulary can do them justice. So you throw up your hands and embrace the ambiguity through Magic Realism, surrealism, dark humor, absurdism, and even science fiction and fantasy.
So, in times when the map can’t carry the territory, people reach for older symbol systems—ritual, story, animism—to stitch coherence back together. If contemporary life feels “dehumanizing,” it may be because we keep optimizing the sentence while losing the song.
Tipping is just redistribution of capital from people pleasers to non people pleasers
This tipping dynamic is a cautionary tale for technologists obsessed with engineering user “nudges”—those subtle prompts, reward loops, and behavioral hacks meant to guide us along supposedly optimal paths. When you spend more time tuning the mechanics of the nudge than enriching the underlying experience, you risk the same fate as tipping culture: optimizing for surface signals, missing the deeper point. You’re not building real value or meaningful engagement—you’re just teaching people to play the social game better. If the map becomes all nudges and no meaningful territory, the plot is lost and technology becomes another layer of ritualized noise.
When combined with an underfunded, poorly designed welfare state, it’s not surprising the US has seen decades of underwhelming productivity growth, an alarming rise of economic inequality, and death rates not seen in other rich countries.
For all its world-leading firms and universities, the US has often failed to convert its innovation into meaningful improvements in Americans’ lives. Compared to the Nordics, as well as several other countries in Europe and Asia, US economic growth per hour worked has been mediocre over the last several decades. Most of that growth has flowed to the rich, limiting the country’s ability to improve rates for life expectancy, child and maternal mortality, suicide, depression, and poverty—all metrics where the US fares poorly against its peers. Meanwhile the US has continued to struggle to use its technological might to prevent climate change, relying on a dirty grid and making inferior progress in green energy over the past 50 years.
How the Nordics and the US approach innovation raises important questions about the relationship between technology and the social good. What’s the best way for the state to boost innovation? How can it guide innovation toward socially useful purposes and away from harmful ones? How can we prevent innovation from creating a set of winners and losers that widen inequality and cause long-lasting damage to households and entire communities? And is there an inherent tension in achieving all of these goals?
Things that if America contemplated doing even 10% of what China does, various people would say this will instantly cause us to ‘Lose to China’
The clash between coordinated and uncoordinated markets—between models of state-backed scale and private, open innovation—is being thrown into sharp relief by the global AI arms race. Chinese markets are dynamic and ruthlessly efficient, but not in the hands-off, invisible-hand sense many Western commentators expect. The Chinese path is built on strategic coordination, top-down control, and the ability to align talent and capital behind national priorities. Were America to mimic even a sliver of these tactics, we’d be warned about “losing to China”—yet it’s clear by now this isn’t a menu of options from which to casually borrow. These strategies are package deals, each with their own strengths and fatal flaws. Mixing them, as history warns, risks losing the distinctive advantages that made each system powerful in the first place.
Nowhere are these tradeoffs clearer than in the contest to lead in open-source AI. As Ben Horowitz argues, the US has ceded its leadership as a consequence of self-defeating policies that have pushed US companies to close their platforms. The unintended result: most dominant open-source AI models now originate in China. Attempts to put AI behind walled gardens haven’t contained the technology—information flows, nationals move, and the open society’s greatest advantage, its porous, collaborative innovation process, now lies dormant. And the stakes are immense. Whoever controls the world’s leading open-source AI models shapes the global cultural “control layer.” The encoded weights of these models carry embedded assumptions, values, and blindspots of those who make them—down to which histories, politics, and human rights are remembered or forgotten. The old tension between command and openness, between orchestrated scale and the evolutionary wilds of the open market, is being recast for the digital era.
Measuring the performance of our models on real-world tasks
Of course AI will impact the labor market, but the real story isn’t as simple as “automation replaces jobs.” The specifics hinge on path-dependence and attribution: companies often use new shocks—like AI advances—as convenient cover for layoffs they already intended. Call it the Red Herring Layoff: blame “AI efficiency,” collect a narrative discount, and write off past mistakes under a fresh excuse.
The kind of work most vulnerable to this dynamic is already fragile: roles with thin margins, high coordination overhead, or easily templated outputs—jobs that concentrate risk without anyone really having skin in the game. These are the first to collapse at the scent of automation.
By contrast, robust work is rooted in real-world constraints—regulated sectors, face-to-face trust, operations with clear feedback loops and liability. These jobs can absorb shocks, degrade gracefully, and keep people employed through cycles of change.
The most antifragile work actually benefits from the disorder: small, outcome-owning teams that iterate with customers and build compounding knowledge. They’ll use AI as leverage rather than a substitute, profiting as slower-moving competitors overreact.
Ultimately, mass layoffs are rarely a function of model perplexity; they’re driven by capital cycles, interest rates, and executive storytelling. Watch where risk and variance get priced: through insurance policies, service guarantees, and whoever puts their name on the line for outcomes. When risk migrates, jobs follow. The market will re‑price low‑signal roles first, then re‑bundle tasks around those who own the causal chain from decision to consequence. In that reshuffle, the scarce asset is coordination under uncertainty, not keystrokes. AI just accelerates the audit.
Chronic stress means chronic hypoxia.
The more stressed and depleted you are, the more lactic acid you’re producing at rest. Your body is anticipating danger; it stays primed to bolt at any moment, and that readiness drives lactate production.
Let’s stop thinking of stress as a mental state and see it for what it is: a metabolic program coded by our environment. The way our culture dials up vigilance, rewards image and optics over raw human connection, pushes bodies into a state of perpetual alert—resting lactate spikes, breathing gets shallow, mitochondria abandon repair to prime us for sprint and survival. We moralize the individual’s stress response, but miss the upstream design flaw: our social architecture wires us for spectacle and transaction, so our nervous systems adapt for flight over fellowship.
If recovery is the goal, you don’t fix cultural hypoxia by slapping more trackers or metrics on life. You reset the physics: thicken the air with practices that privilege friction and embodied presence over curated distance. Slow rituals, local contact, repeatable rhythms—these are the biohacks that shift resource allocation from dopamine chasing to deep tissue repair. My favorite questions these days ask how we might redesign space so that trust is metabolically cheap.
Light Is a Map: Seasonal Blue and Your Clock
During the summer months we tend to switch on the lights in our homes later in the evening, but as the days shrink in the fall we switch them on earlier and (unless you have installed circadian light bulbs) are exposed to progressively more blue‑rich light. The average bedtime in the USA is 11:37 pm according to Apple Watch data, and 11:21 pm according to 6 billion nights of Fitbit data. So, when sunset is at 6:39 pm in Boston on the September 22 equinox, the average electrically lit evening is about 5 hours long, but it lengthens to about 7 hours when sunset falls at 4:15 pm at the winter solstice on December 21. The exact exposures depend on where you live on the globe: in London, England, sunset is at 6:58 pm on the September 22 equinox and the average electrically lit evening is about 4.5 hours long, but this lengthens to well over 7 hours when sunset falls at 3:53 pm at the winter solstice.
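The arithmetic here is simple clock math; a minimal Python sketch (using the sunset and bedtime figures quoted above) makes it easy to recompute the length of the electrically lit evening for any city:

```python
from datetime import datetime, timedelta

def lit_evening_hours(sunset: str, bedtime: str) -> float:
    """Hours of electric light between local sunset and bedtime."""
    fmt = "%I:%M %p"
    start = datetime.strptime(sunset, fmt)
    end = datetime.strptime(bedtime, fmt)
    if end < start:  # bedtime after midnight rolls into the next day
        end += timedelta(days=1)
    return round((end - start).total_seconds() / 3600, 1)

# Boston, using the Apple Watch average bedtime of 11:37 pm
print(lit_evening_hours("6:39 PM", "11:37 PM"))  # equinox: 5.0
print(lit_evening_hours("4:15 PM", "11:37 PM"))  # winter solstice: 7.4
```

Swap in the Fitbit bedtime of 11:21 pm, or your own local sunset, and the seasonal stretch of the “second day” falls out directly.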
As nights stretch longer, technology delivers us into an artificial “second day”—bathed in LED glare, spiking cortisol, and pushing melatonin further down the timeline. We’ve engineered permanent dusk, flattening circadian amplitude, erasing the ancient cues that signaled winter’s drift toward rest and repair. Cities glow like Vegas at midnight, and headlines confirm the damage: recent reports warn that global light pollution is rising at record speed, with urban centers suffering from chronic circadian jet lag, metabolic syndrome, and immune suppression at the population level.
If we keep selling light as an upgrade, we ignore the consequences of relentless illumination—blue-rich streetlights that hijack the biological night, offices lit for productivity but tuned against the body’s healing clock. Instead of designing for biological timing, we optimize for 24/7 engagement, producing overstimulation, shallow sleep, and a citizenry that’s perpetually under-repaired. The data links this to rising rates of mental health disorders, degraded learning, and even increased cancer risk.
Biology is not optional; it’s fundamental.
Meanwhile, China is still under-promising and over-delivering.
This is China’s third pledge under the Paris Agreement, but it is the first to put firm constraints on the country’s emissions by setting an “absolute” target to reduce them.
In contrast, the United States’ Paris commitment targets a 50–52% cut below 2005 levels by 2030. Policy momentum has accelerated, but the delivery curve remains uneven: federal incentives are strong, permitting and transmission are slow, state policies are fragmented, and the credibility of long‑horizon pledges is discounted by administration turnover. The map-vs‑territory lesson holds: China’s new absolute cap clarifies coordinates, while the U.S. maintains an ambitious waypoint with execution risk. What matters next is steel-in-the-ground pace, not podium promises.
A revolution is brewing.
What we’re witnessing isn’t yet a full-blown insurrection, but all the signals are there. Across headlines, social feeds, dinner tables, and even classrooms, a low-frequency backlash is gathering strength. The uneasy resistance focuses on a simple, visceral realization: our lives are dominated by screens, and we’re no longer okay with it. For years, we crowned each new gadget as evidence of progress—believing, or at least hoping, that the next innovation would make our lives unmistakably better.
But now, the mood has shifted. Instead of celebration, there’s a collective pause—a reckoning with the costs of relentless connectivity. So we ask, sometimes quietly: Why is anxiety everywhere? Why do so many gatherings feel hollow or vanish altogether? Where did the children playing outside go? The answers, though uncomfortable, are close at hand. And as the water in the pot grows hotter, an appetite for change is finally rising—one that may reject the very technologies we once queued up to welcome.
The Miraculous Complexity of Freedom
More than sixty years ago, Leonard Read’s “I, Pencil” delivered a timeless argument for the humility required by complex collaboration. It remains true that no single mind or institution fully grasps the web of know-how required to make even the simplest object. So, in a world infatuated with monolithic idealism and powerful individuals, it’s tempting to elevate planners—be they technocrats or visionaries—to steer the whole enterprise. And yes, organizing forces can be invaluable in creating coordination or accelerating problem-solving. But caution is needed at the tipping point: when subjective experience and creative diversity give way to groupthink and the stifling pressures of top-down control, the system loses its antifragility. If we are still interested in progress, we’d do well to remember that humility enables our grasp of complexity. Faith in free people and distributed intelligence is the ground from which unpredictable, resilient flourishing arises, whether in markets, music, or meaning itself. How might organizing models serve, not suffocate, this emergent genius? It’s the question of the decade.
Can Space Win Against the Screen?
We need to create homes that are human‑centered.
Jonathan Haidt’s contributions to the conversation about kids in digital spaces offer useful delay tactics, but they start from the wrong default: that screens are inevitable and omnipresent in family life. The better question is architectural. Designers—of homes, streets, schools, and cities—set the friction that either protects or punctures attention. If we build for scrolling, we will scroll. If we build for play, we will play. Parents are told to manage the “how much” and the “how” of screen time, while the built world quietly erases the places and cues that once made unstructured play the path of least resistance.
Innovation that serves people starts with space. The role of the designer—especially spatial designers—is now pivotal. Give kids rooms, courtyards, blocks, and tiny wild zones that invite invention and make screens inconvenient at the margins. Most families are increasing screen hours because the alternative is out of reach; the damage from the loss of unstructured, outdoor, kid‑led time will be irreparable if space does not change to meet the need.
Underground Signals, Overground Loneliness
The subway blitz works precisely because the territory is lonely. Friend’s $1M out‑of‑home spend and 1,000+ posters are clearly a wager that visibility can substitute for embeddedness, that narrative can outrun the design constraints of real community. But if the map is reasserting itself, then companionship is a spatial problem before it is a software problem. A pendant that “listens” tries to wallpaper over the absence of third places and frictioned, proximate ties. The campaign’s backlash—scribbles of “surveillance capitalism” on the posters—signals that audiences feel the gap between vibe and venue. If tech wants to serve the human nervous system, our politicians and real estate developers and banks must invest in the commons—benches, stoops, courts, kid‑safe blocks—then layer tools atop coordinates that already convene people. Otherwise, tech bros are going to keep advertising friends to commuters who don’t know their neighbors.
CVS’s subsidiary Omnicare has collapsed under the weight of its own alleged misconduct.
Last week, CVS’s subsidiary Omnicare collapsed under the accumulated weight of its own alleged misconduct—forced into Chapter 11 bankruptcy after a judgment approaching $1 billion in penalties for dispensing drugs on expired and invalid prescriptions. For nearly a decade, Omnicare supplied powerful medications to the most vulnerable—elderly patients in nursing homes—on the basis of paperwork that no longer passed the threshold for care, then billed Medicare and Medicaid as if nothing was amiss.
The invisible machinery of healthcare, entrusted to safeguard those most at risk, instead became a mechanism for exploitation—abstracted accountability with concrete human cost. See the pattern?
Are brands useful methods for coordinating user behavior in the age of AI?
The traditional view of a brand—as something that emerges from the sum of impressions, interactions, and gut beliefs—has always defied easy manipulation. In classic brand strategy, it’s tough to pin down the true essence that people intuitively sense, but at least we can usually identify the product or company at its core. With AI, though, the terrain shifts beneath our feet: knowing “what the thing is” becomes strangely slippery, while the emerging patterns of user attachment and expectation are more chaotic than ever.
AI products like ChatGPT don’t anchor to a singular referent. The technology itself mutates with every user, spawning not just one public brand but a thousand private ones—personal trainers, poets, therapists, and confidantes, all coexisting as ephemeral avatars. The collective impressions around “ChatGPT” or “Claude” might feel distinct, but do they actually coordinate consumer behavior in the way brands historically did? Or are we seeing brand as a dissolving category—a residue, rather than a center of gravity?
The frameworks of persona and impression break down when the product itself is a shifting interface, constantly re-skinned to fit the user. In the age of foundational AI models, brand as a tool of manipulation or strategy may be less relevant than ever—overtaken by the raw gravitational pull of computation and scale, where wrappers and surface-level brand identities are swallowed by the next model update.
Where is symbolic AI?
Before alphabets, there were symbols.
People did not ‘invent’ two-dimensional images; nor did they discover them in natural marks and ‘macaronis’. On the contrary, their world was already invested with two-dimensional images; such images were a product of the functioning of the human nervous system in altered states of consciousness and in the context of the higher-order consciousness.
-David Lewis-Williams
That primordial inheritance still organizes our intelligence, whether we recognize it or not. Humans depend on cultural transmission—the transfer of information between individuals and across generations—for successful adaptation to diverse environments. Mastery—of craft, of culture, of cognition—follows some basic patterns of cultural learning: drawing on the interplay of lived experience and symbolic scaffolding, individuals bind instinct to form and build meaning from memory. In a world where digital learning machines are constrained by the cost and carbon weight of brute-force iteration, the mastery pattern—which efficiently anchors complexity in symbol, ritual, and story—demands a new audience.
Emerging AI research is discovering that symbolic AI is needed to drive accurate, reliable results and, critically, to bolster natural language conversational abilities. At heart, a thesaurus structured using an ontology like SKOS is technically symbolic AI—a fundamental, machine-readable representation of knowledge. As Gary Marcus has argued, “To build a robust, knowledge-driven approach to AI we must have the machinery of symbol-manipulation in our toolkit. Too much of useful knowledge is abstract to make do without tools that represent and manipulate abstraction, and to date, the only machinery that we know of that can manipulate such abstract knowledge reliably is the apparatus of symbol-manipulation.”
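To make the SKOS point concrete, here is a minimal sketch of what “a thesaurus as symbolic AI” means in practice: a hypothetical concept scheme encoded as explicit skos:broader relations, plus one trivial symbolic inference (the transitive closure of “broader”). The concept labels are invented purely for illustration:

```python
# A toy SKOS-style thesaurus: each concept lists its skos:broader concepts.
# (Concept labels are hypothetical, chosen only to illustrate the structure.)
broader = {
    "sauna": ["heat therapy"],
    "heat therapy": ["wellness practice"],
    "cold plunge": ["cold exposure"],
    "cold exposure": ["wellness practice"],
    "wellness practice": [],
}

def ancestors(concept: str) -> set[str]:
    """Transitive closure of skos:broader — a minimal symbolic inference."""
    seen: set[str] = set()
    stack = list(broader.get(concept, []))
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(broader.get(c, []))
    return seen

# Symbol manipulation, not statistics: the machine answers
# "is a sauna a kind of wellness practice?" by explicit traversal.
print("wellness practice" in ancestors("sauna"))  # True
```

Nothing here is learned from data; the knowledge is represented, inspected, and manipulated as symbols—which is exactly the machinery Marcus argues the current generation of models lacks.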
As we hunt for better forms of machine intelligence, perhaps the edge is where it always was: in the ancestral genius for compressing chaos into shared, generative coordinates. The future of intelligence may depend on remembering what symbols are for.
Back pain tied to higher rates of common diseases
So, it turns out that chronic back pain is a low-frequency signal that something larger may be amiss. Like so much of what we ignore, there is a forecast in the subtle: those struggling with back pain are statistically more likely to also face heart disease, arthritis, diabetes, and depression. The recent work led by Rafael Zambelli Pinto reframes back pain as a flag in the middle of the healthcare landscape: a lived warning, not just for the spine but for the whole system. For millions, daily discomfort blends into the background until the full burden—activity limitations, compounded diseases, isolation—becomes undeniable. As the research suggests, routine clinical responses aimed only at “pain management” risk papering over these broader systemic dangers. What we tune out today becomes tomorrow’s reckoning.
We’re entering the age of design because our AIs are designed. … This is a key step in the world and in the universe. … Now we’re giving rise to designed entities.
What we want, to quote Alan Turing, is a machine that can learn from experience, where experience is the things that actually happen in your life. You do things, you see what happens, and that’s what you learn from
The ultimate goal for a platform is to innovate continuously, create evolutionary tension with its ecosystem, and anticipate future needs before building them.
Despite all the talk of disruption, the platform model—monolithic, modular, scale-heavy—remains the gravity well of the digital economy. Marketplaces, clouds, and SaaS behemoths still rule by scale and orchestration. I’ve been a fan of Simon Wardley’s thinking for some time now. His ILC (Innovate, Leverage, Commoditize) model describes the platform’s potential: it should enable a business to innovate boldly, leverage mercilessly, then componentize and repackage every new feature until it becomes invisible infrastructure. Amazon is the best example we have of that flywheel.
Now AI enters this orbit, twitching with the promise to scramble the code. So a new dream is appearing: ecosystem business models—living, breathing commercial networks where orchestration becomes adaptive and value moves at the speed of need instead of the speed of legacy platforms. And yet, for all the theory, the evidence is thin and the experiments tentative. The connective tissue for ecosystem businesses has yet to form; the mechanisms that let knowledge, incentive, and feedback spiral into new forms remain embryonic or altogether absent.
The core question is not whether AI can crack the platform paradigm, but whether our habits—and the incentive structures that prop them up—will really allow it. Are we willing to embrace creative destruction, relinquish the old playbooks, abandon the chase for scale at all costs, and embrace a world where friction, context, and co-creation actually matter? Or is the barbell economy inevitable, its gravitational pull too familiar to escape?
No Middle Ground: Platforms, Wellness, and the Missing Nuance
The platform playbook isn’t just eating tech—it’s colonizing culture, too: heavy gravity at the extremes, a famine in the middle. We see it in AI, and we see it in the $6.8T wellness complex. On one end, it’s all monastic optimization stacks and cold-plunge bootcamp maximalism; on the other, it’s lifestyle cosplay and mall-grade myth, complete with sauna raves and pop-ups for people desperate to sweat out their anxiety. The middle path—the space for durable habits, communal ritual, seasonal rhythm—gets arbitraged out because it doesn’t spike engagement or fit an “always on” narrative.
As the category scales, the choice is stark: monk-mode asceticism or soft-focus fantasy set. Train like an algorithm or slip into the soothing bath of branded wellness—there’s little space for those who want coherence without the cult, performance with a little pleasure, real balance.
The curve gets crueler when you map it to reality. Obesity climbs. The wealth gap widens. Health “consumption” rises with little to show in population vitality. So the industry mirrors the money: concierge longevity at the summit, aspirational merch at the base, and, in the middle, a desert where communal practice and authentic connection should thrive. If AI platforms are teaching us that application layers need interpretability and social context, wellness needs it even more: craft, coherence, and actual community, not just a stack of metrics. Otherwise it’s metrics without meaning—all calculation, no celebration.
The data reflects this tension. Derek Thompson’s “Death of Partying in the U.S.A.” notes that young Americans spend 70% less time socializing than in 2003. Work grind, cost of living, and digital distraction all play a role, but the rise of optimization culture looms large. As Dazed recently argued, over-exercise culture risks flattening life, leaving people “less interested in developing well-rounded interests and a healthy social life.” It’s a world calibrated for dopamine and discipline, but starved for the messy, periodic magic that once made ordinary life feel worth living.
Fractionalizing Memory: When Culture Becomes a Traded Good
World’s largest private Rembrandt collection may be fractionalised, owner reveals
Thomas S. Kaplan’s plan to fractionalize the Leiden Collection and take it public is a stress test of our thesis: when the map reasserts itself, do we finance territory or just securitize the story? Turning Rembrandts into IPO slices could democratize exposure to cultural assets, but it also risks abstracting place, provenance, and curatorial craft into a ticker symbol. If haute culture becomes a financial product, the question is whether stewardship scales or whether liquidity outpaces care.
Craft is how signal acquires a story.
We’re living in an era where the constraints that once shaped real creativity have dissolved, and with them, the anchors that kept story tethered to wisdom forged in experience. The landscape is a frictionless plane—a map without terrain, an endless scroll tuned for amplification over meaning. And yet, just beneath the surface, there’s a stirring—a return to the kinds of truth that only arise from being fully present in a particular place, at a particular time, with nerves attuned and stakes on the line.
Call it the phenomenological comeback. Before narrative was a commodity, story had to be pried from experience, smuggled through memory’s narrow keyhole. Ancient navigators crossed the ocean guided by memory and trust, carrying with them stories formed by endurance and risk. Their wisdom came from contact with elements—the salt, the sun, the tension between the horizon and the ground beneath their feet. The signals that endure were passed hand to hand, voice to ear, like living code.
Progress rests on our ability to remember which signals are real and which are just noise. The archetypes we model our childhood dreams on persist because, as we grow and experience life, those models become felt truths—etched by generations into the feedback loop between encounter and understanding. Their wisdom is alive in every groove worn by use, every recipe perfected in a kitchen, every callus earned on a fretboard. If you want a revolution now, you have to know your neighborhood. Context is the engine of coherence. Craft is the transmission. The relevant network is always local—embodied, relational, particular. And as the information tide rises, what stands out are those things that still require all of you: the slow book, the patient walk, the act of attention that roots the story back to land.
And remember: architects have been collaborating with machines for decades. From Sketchpad to today’s digital twins, intelligent tools refine judgment—they don’t substitute for it. The best makers use technology as a lens to see more deeply, not as an escape from responsibility. The model matters, but only as an instrument for the making. Trust doesn’t live in code; it grows in process.
So as you close this issue, resist the impulse to skim the future for its headlines. Seek the irregularities of handwork, the stitches of communal memory, the resonance of embedded context. Craft makes promises hype cannot keep because it wagers the maker’s whole being.
Read deeply. Look closely. Walk locally. Become a person. Let your senses teach your models what matters.