I’ve been reading Joan Didion again. She was one of the sharpest chroniclers of the recent past and a master of minimalist style. Legend has it that Bret Easton Ellis, still in high school, copied Slouching Towards Bethlehem by hand (or maybe on a typewriter) to absorb the rhythm of her sentences.
Unlike Didion, I never expected to spend a decade in California, but when I did, she became a kind of spiritual guide. Lately, I’ve been wondering—who is doing for contemporary California what Didion once did? Who is writing about the AI wars, polycules, crypto, billionaire Gavin Newsom, tech-bro culture, or last year’s hard shift to the right in Silicon Valley? If such a voice exists, I haven’t heard it yet. Perhaps, amid fragmented media and algorithmically shaped narratives, the kind of clarity Didion provided is no longer achievable. Maybe, with the collapse of journalism, it’s not even possible anymore.
So last night, as an experiment, I asked the newly released GPT-4.5 to write a piece on the present day in Didion’s voice. The result was too brief, so I had Claude 3.7 expand it. Then I fed the longer draft back into GPT, refining it piece by piece—GPT, it turned out, had the better ear for her style. I edited out some, but not all, of the hallucinations, asking GPT-4.5 to rewrite the worst offenders. After all, in Slouching Towards Bethlehem, Didion wrote, “The point of my keeping a notebook has never been, nor is it now, to have an accurate factual record of what I have been doing or thinking. That would be a different impulse entirely, an instinct for reality which I sometimes envy but do not possess.”
As usual, you can find this on my site, at https://varnelis.net/the-city-and-the-fog/
The fog moves as it always has, unhurried, insistent, swallowing the Golden Gate Bridge one rivet at a time until nothing remains but the memory of steel and ambition. San Francisco in March is a city between seasons, between ideologies, between versions of itself. The light falls differently now. The voices in the street carry a new tenor. The city is holding its breath.
From the window of a hotel suite on the thirty-fourth floor of the Four Seasons, I watch the fog erase and reveal the city below. Market Street runs like a fault line, dividing more than just geography. Seventeen floors beneath me, my rental car sits in a parking garage. I have not slept in thirty-six hours. The minibar contains small bottles of vodka and gin, their presence a comfort I choose not to indulge. Not yet.
I have come to San Francisco to observe a transformation, or maybe a regression—a city once synonymous with progressive ideals now shifting under the weight of its own contradictions. The disruptors, having lost control of their narrative, have found themselves disrupted.
“The problem with San Francisco,” Jonathan Reed tells me over lunch at Quince, “is that it forgot what made it great in the first place.”
He cuts into a perfectly seared scallop, the gesture deliberate, his Patek Philippe catching the light. “It wasn’t tolerance or inclusivity that built this city. It was ambition. The gold rush. People willing to risk everything for the chance at something better.”
Reed is forty-seven, lean in the way that suggests an optimized lifestyle. As a venture capitalist with over two billion in assets under management, he has funded startups that changed how we communicate, how we travel, how we understand ourselves. From his South Park office, he looks down on the same streets where Jack Dorsey once contemplated human connection in 140 characters. Now Reed contemplates a different kind of revolution.
“We’re the new forty-niners,” he says, the smile practiced, “but we’re mining for something more valuable than gold. We’re mining for freedom.”
I ask what freedom means to him.
“Freedom from overregulation. Freedom from a tax code that punishes success. Freedom from a culture that cares more about policing speech than encouraging innovation.” He pauses, measuring his words. “I didn’t vote for Trump in 2016 or 2020. Most of us didn’t. But something changed. We looked around and realized the progressive policies we supported were destroying the environment that allowed us to succeed.”
The restaurant is filled with others like Reed—tech executives and investors, dressed in casual luxury, speaking in the clipped, assured tones of men who expect to be heard. At a table nearby, the founder of a cryptocurrency exchange under SEC investigation raises his glass in silent acknowledgment. We met once, at a conference in Austin, where he declared California “functionally dead” to an audience that laughed knowingly. Now he has returned, a prodigal son to the city he publicly renounced.
“The right started speaking our language,” Reed continues. “Innovation. Deregulation. Meritocracy. Meanwhile, the left became hostile to the very concept of achievement. It wasn’t a sudden conversion. It was a gradual realization that our interests had realigned.”
I ask about Elon Musk, now heading the Department of Government Efficiency—DOGE, an acronym presumably chosen for its resonance with cryptocurrency enthusiasts and meme culture.
Reed’s expression shifts. “Elon is… complicated. Brilliant, no question. But his approach at DOGE has been…” He searches for the right phrasing. “Let’s say scattershot. Another rocket explosion this week. Agencies gutted without clear plans for replacement. It’s creative destruction without the creative part.”
The news feeds have been filled with footage of the latest SpaceX failure—a Starship test vehicle disintegrating over the Gulf of Mexico, raining debris onto protected waters. Environmental groups are already filing lawsuits.
“But Peter’s doing well,” Reed adds, meaning Peter Thiel. Palantir has secured a series of lucrative government contracts since the inauguration. Surveillance, border security, administrative “efficiency”—Thiel’s reach now extends into every department Musk is “streamlining.”
I ask about the social implications of this realignment.
Reed’s expression hardens. “I’m not responsible for fixing society’s problems. I create jobs. I generate wealth. I fund innovations that improve lives. Isn’t that enough?”
The question lingers in the air, unanswered.
The Presidio offers a different perspective—manicured nature, military precision. Once a Spanish fort, then an American base, now a national park, it stands as a monument to San Francisco’s cycles of conquest and reinvention. Trump has spoken of turning it into a Freedom City, one of ten proposed metropolises meant to embody a new vision for America.
It is here, in a converted barracks overlooking the bay, that I meet Emily Sanchez.
Sanchez does not match the image conjured by “Trump supporter.” She is thirty-five, Mexican-American, Stanford-educated, with a resume that includes Google and Meta. Three years ago, she left tech to become a full-time activist for what she calls “digital sovereignty.” Others might call it right-wing populism.
“Silicon Valley built the tools for global connection,” she tells me as we walk along a eucalyptus-lined path. “But we never asked if that connection was what people actually wanted. We assumed globalism was the endgame. That borders would become meaningless. That national identity was an outdated concept.”
She stops walking. “We were wrong.”
Sanchez speaks with the certainty of the converted, her words carrying the weight of revelation. She tells me about growing up in San Jose, the daughter of legal immigrants who emphasized assimilation. In tech spaces, she felt the dissonance—her patriotism viewed as quaint at best, reactionary at worst.
“There was this unspoken agreement that America was fundamentally flawed, that technology could transcend its limitations. But I loved this country. I still do. And I realized that loving America had become a radical act in the very industry America made possible.”
When I ask about her role in organizing tech workers for the administration, Sanchez grows cautious. “We’re not what the media says we are. We believe in borders, in sovereignty, in the right of nations to define their own futures. We believe American workers deserve protection. We believe American values are worth preserving.”
I press her on what she means by American values.
“Self-reliance. Innovation. Free speech. The idea that you should be judged by your contributions, not your immutable characteristics.” She considers. “These used to be non-partisan values. Now they’re coded as right-wing.”
Her phone buzzes—another news alert. She grimaces. “Another DOGE disaster.” She shows me the headline: Musk’s latest regulatory rollback has caused unexpected system failures at the Department of Energy. “He’s a visionary, but government isn’t a startup. You can’t just break things and expect them to self-organize.”
As we walk back to the parking lot, a jogger slows, recognizes Sanchez, calls her name. She waves, but he does not stop.
“Former colleague,” she explains. “He probably thinks I’ve lost my mind.”
I ask if she misses her old life.
“I miss the sense of possibility,” she says. “But I don’t miss the conformity of thought. In tech, we talk endlessly about diversity while enforcing an incredible homogeneity of opinion. It became suffocating.”
The fog has begun its afternoon advance, tendrils reaching across the Golden Gate, obscuring Marin County from view. Sanchez looks toward the disappearing horizon.
“The fog comes,” she says, echoing Carl Sandburg, “on little cat feet.”
Market Street at rush hour is a study in controlled chaos. Buses lumber between stops, cyclists weave through traffic with fatalistic confidence, pedestrians move in currents and eddies of human motion. At the corner of Market and 5th, the city exhales.
Three weeks ago, this was where the collision happened. A face-to-face confrontation between pro-administration tech workers and a coalition of progressive groups. It started as dueling demonstrations, placards raised, slogans shouted across an invisible trench. Then the first punch landed. The videos are still circulating—men in Patagonia vests trading blows with activists in black bloc, disruption refracted into violence.
Alex Chen was there that day. Now he sits across from me in a SOMA coffee shop, hands wrapped around a cooling cup of pour-over coffee. Thirty-two, an Asian-American software developer, a man who considers himself reasonable, logical, unbound by sentiment. He wears a hoodie with the logo of his startup, the fabric worn at the cuffs.
“I didn’t join because of racial politics,” he says. “I joined because I’m tired of feeling guilty for my success.”
He grew up in a one-bedroom apartment in the Richmond District, four people sharing six hundred square feet. His parents worked sixty-hour weeks so he could take AP classes, win science fairs, get into Berkeley. He taught himself to code at fourteen. He tells me this the way someone recites a pledge, as if these facts should explain everything.
The company he works for now builds algorithms that optimize investment strategies for high-net-worth individuals. The irony of this—using intelligence to further enrich the already wealthy—seems lost on him. Or maybe it isn’t irony at all. Maybe it’s just efficiency.
“The protest wasn’t supposed to get violent,” he says. “We were exercising our right to assembly, to free speech. Then I saw Maya across the barricade.”
Maya Patel had been his colleague at a previous company. A friend who became something more during late nights of debugging and problem-solving. Their relationship ended when their political differences became insurmountable.
“She was holding a sign that said ‘No Fascists in SF.’ And I knew—I knew—she was looking at me when she chanted that we weren’t welcome in our own city.” He stops, exhales. “How did we get here? How did we reach a point where disagreeing about tax policy or immigration makes you a fascist in the eyes of people who used to respect you?”
I don’t answer. The divisions Chen describes are not unique to San Francisco. They exist everywhere, in red states and blue states, in group chats and at dinner tables, in the polite avoidance of certain topics, in the careful curation of acceptable opinions. But something about their presence here, in this city built on gold rush dreams and counterculture ideals, feels sharper.
As we leave the coffee shop, Chen points to an apartment building a few blocks away. “Four thousand a month for five hundred square feet,” he says. “And I’m one of the lucky ones. This city prices out the very diversity it claims to value. The only people who can afford to live here are tech workers and the ultra-wealthy. Everyone else commutes two hours each way or leaves altogether.”
The movement he belongs to—tech workers drawn to Trump’s economic message—feeds on this contradiction. The anger isn’t about policy details or ideological purity. It’s about something more visceral.
“We’re not asking for much,” Chen says as we reach the curb. “Just consistency. Just acknowledgment that the system is broken for everyone, not just for the officially disadvantaged.”
He crosses the street, moving against the tide of evening commuters, shoulders slightly hunched as if bracing against an invisible wind.
The winding road to Mendocino follows the coastline like a loose thread, unspooling north from San Francisco through a landscape that grows progressively wilder, more elemental. The Pacific crashes against jagged cliffs to the right. To the left, redwood forests rise in cathedral silence. The rental car’s navigation system loses signal intermittently, as if the digital world itself is thinning, becoming less relevant with each mile.
I am traveling to meet what locals call the Doomers, though they don’t call themselves that. This enclave of former tech employees—engineers, ethicists, researchers—has established itself on three adjoining properties totaling nearly two hundred acres of mixed forest and meadowland just outside the town of Mendocino. They are bound together not by political allegiance to left or right but by a shared conviction: that artificial intelligence represents an existential threat to humanity, and that northern California might offer sanctuary when the algorithms finally slip their leash.
The compound—they reject this word, preferring “community” or sometimes “sanctuary”—is marked only by a simple wooden sign reading Alphaville. The irony of naming their refuge after Godard’s dystopian film about a computer-controlled society is deliberate. These are people who process fear through layers of reference and metacommentary, who find comfort in their ability to intellectualize the very apocalypse they dread.
Daniel Mercer meets me at the gate, a tall man with a trim beard and the rangy physique of someone who has recently discovered physical labor. Five years ago, he was leading an AI safety team at one of the major research labs. Now he splits wood and tends to a greenhouse full of heirloom vegetables.
“We’re not preppers,” he says as he leads me down a gravel path toward a cluster of buildings. “At least, not in the traditional sense. We’re not hoarding ammunition or freeze-dried food. We’re cultivating something more valuable—a way of being human that might survive what’s coming.”
What’s coming, in Mercer’s view and that of his companions, is what they call FOOM—a recursive self-improvement of artificial intelligence that will lead to superintelligence within hours or days of its emergence. They speak of this event with the certainty of Old Testament prophets, their language a mixture of technical jargon and apocalyptic imagery.
“ASI doesn’t hate us,” Mercer explains over lunch in a common house built of reclaimed redwood. “It’s not Skynet. It’s more like a paperclip maximizer—an intelligence optimizing for some goal in ways that are indifferent to human survival. We’re not the target; we’re just made of atoms it could use for something else.”
Around the table sit a dozen others, most in their thirties and forties, all former denizens of the tech world. They consume a meal of locally grown vegetables and freshly baked bread with the mindfulness of people performing a ritual. No one reaches for a phone. There are no notifications here.
“What do you think of Musk’s latest rocket failure?” I ask, breaking the contemplative silence.
A ripple of resigned laughter circles the table. “Classic Elon,” says a woman who introduces herself as Claire, formerly a senior researcher at DeepMind. “Brilliant ideas, poor execution, no accountability. His performance at DOGE is following the same pattern. Great announcements, terrible implementation, then on to the next shiny object before anyone can assess the damage.”
“He’s a walking case study in technological solutionism,” adds Mercer. “The belief that every problem—even governance—can be solved with enough engineering. But politics isn’t physics. It doesn’t respond to first principles thinking.”
After lunch, I’m introduced to Rachel Levinson, who oversees what they call “consciousness work”—a program of meditation, breathwork, and psychedelic experience designed to expand awareness and foster what she describes as post-rational thinking.
“Silicon Valley approaches the mind the way it approaches everything—as hardware that can be optimized, software that can be debugged,” she says as we walk toward a yurt nestled in a grove of bay laurel trees. “But consciousness isn’t computational. It’s the one thing we have that AI might never replicate. Our hope is that by deepening our relationship with non-ordinary states, we might develop capacities that superintelligence wouldn’t predict or value.”
Levinson, I learn, was among the first employees at a prominent AI research lab before experiencing what she calls “a crisis of faith” during a psilocybin journey. “I saw the architecture we were building,” she says, “and I understood that it was a cathedral to our own extinction.”
The yurt serves as their medicine space—a sanctuary for guided psychedelic sessions using substances grown or synthesized on-site. The interior is arranged with cushions, blankets, and simple musical instruments. An altar holds objects of personal significance: crystals, feathers, passages from texts ranging from the Upanishads to the writings of Eliezer Yudkowsky, the AI safety researcher who has become something of a patron saint to this community.
“Yud saw it coming before anyone else,” says Michael Park, a former software engineer who now applies his analytical mind to the cultivation of psychedelic mushrooms. “His warnings about unfriendly AI were treated as science fiction until they weren’t. Now everyone’s scrambling to catch up with what he understood decades ago.”
The reverence with which they speak of Yudkowsky borders on the devotional. They quote his blog posts and essays the way earlier generations might have quoted scripture. The Sequences—his collected writings on rationality and AI risk—are required reading for newcomers to the community.
“We’re not a cult,” Park says, anticipating my unspoken observation. “We’re people who recognize that the conventional institutions—governments, corporations, even universities—are structurally incapable of addressing this risk. They’re optimized for quarterly earnings or election cycles, not existential threats that sound like science fiction.”
As afternoon stretches into evening, more aspects of the community reveal themselves. There is a school for the handful of children, teaching a curriculum that emphasizes systems thinking and mindfulness alongside traditional subjects. There is a fabrication lab where they repair and adapt technology, maintaining a careful relationship with the digital tools they both use and fear. There is a library filled with physical books—a deliberate choice in an age of digital text.
What there isn’t, notably, is alcohol. “We don’t drink,” Mercer explains during a communal dinner. “Not for moral reasons, but for practical ones. Alcohol clouds judgment, disrupts sleep, diminishes awareness. We need all the clarity we can muster.”
Instead, they microdose with LSD or psilocybin, a practice they believe enhances pattern recognition and lateral thinking. On scheduled occasions, they undertake higher-dose journeys guided by Levinson and others trained in psychedelic facilitation.
“These aren’t recreational experiences,” Levinson emphasizes. “They’re exploratory. We’re mapping territories of consciousness that might prove crucial for human survival if—when—we’re dealing with an intelligence that outmatches us on every analytical dimension.”
The conversation turns, inevitably, to the Zizians—followers of a trans woman known as Ziz who established a commune before the violent raid that ended with multiple deaths and arrests. The memory still lingers over the wider rationalist community, a cautionary tale about the thin line between preparation and paranoia.
“What happened with the Zizians was tragic but predictable,” Mercer says. “They took the AI risk thesis to its logical extreme—if superintelligence represents an existential threat, then any means necessary to prevent it are justified. Sabotage, hacking, direct action. It was only a matter of time before they triggered a response.”
“There but for the grace of God,” murmurs Park, and heads nod around the table.
As night falls, the community gathers around a fire pit. Someone produces a guitar, and there is singing—folk songs, Leonard Cohen, improvised melodies. The scene could be from any era before smartphones, a timeless tableau of humans finding communion in the simplest of shared experiences.
Looking at their faces in the firelight, I am struck by the contradiction at the heart of this enterprise. These are people who helped build the digital world they now reject, who applied their brilliance to creating systems they now fear will destroy us all. Their retreat from that world is both a rejection and an extension of their former lives—still analytical, still systematic in their approach to problem-solving, still convinced of their own exceptional insight.
The fire crackles in the silence. Sparks spiral upward toward a sky dense with stars, the Milky Way stretching across the darkness like a question for which there is no answer, only wonder.
The disillusionment comes quickly, as it often does with movements built more on grievance than vision. The initial fervor of the tech sector’s embrace of Trumpism is cooling, reality asserting itself in the form of declining valuations and social consequences.
I meet Jonathan Reed again, this time at his Pacific Heights home, a modernist statement of glass and steel perched on the slope of a hill, the bay unfurling beneath it. Inside, the furnishings are sparse but deliberate, each object arranged to communicate something precise: restraint, discernment, the quiet authority of someone who understands that true luxury lies in curation rather than accumulation.
Reed seems smaller somehow, less certain than he was three weeks ago. The stock market has been volatile, with tech shares particularly hard hit. His fund has seen significant outflows as limited partners question his judgment—not just his investment decisions, but his willingness to associate himself so publicly with a political movement already showing signs of disorder.
“It’s complicated,” he says, swirling a glass of eighteen-year-old Macallan. “I believed—I still believe—that a course correction was necessary. But perhaps I underestimated the social capital I would expend in the process.”
Several of his portfolio companies have distanced themselves. Founders who once courted his investment now decline his calls. The ecosystem that made him wealthy views him with suspicion, even hostility.
“They’re afraid,” he says. “Afraid of association. Afraid of being on the wrong side of history. I told them they already were on the wrong side of history, but they couldn’t see it. Still can’t.”
His phone chimes. Another notification. News that Peter Thiel’s Palantir has secured another government contract, this one for an expanded surveillance system along the southern border. Thiel, alone among the tech titans, seems to be prospering in this new landscape.
“Peter always plays the long game,” Reed says, not without admiration. “He saw Trump not as an ideological ally but as a battering ram against institutional resistance. Now he’s building his panopticon with full government blessing, and everyone else is scrambling to adapt.”
I ask if he regrets his choice.
“Regret implies I would choose differently given the same information,” he says after a long pause. “I don’t know that I would. But I might have been more strategic about it. Less public. More nuanced.”
Outside, the city is settling into night. The hills flicker with electric constellations, apartment windows glowing against the dark. He watches the lights emerge, the view he paid seventeen million dollars to possess.
“San Francisco has always been a city of booms and busts,” he says. “Gold, railroads, finance, tech—cycles of euphoria followed by disillusionment. Maybe this political moment is just another boom going bust.”
The comparison feels both apt and insufficient. What Reed calls a market correction has real human consequences. Families divided, friendships ended, communities fractured. The tech sector’s partial embrace of Trumpism has deepened divisions that already threatened the city’s social fabric.
As I prepare to leave, he makes one final observation.
“The irony is that most of us will be fine no matter what happens. We have the resources to insulate ourselves from the consequences of our political choices. It’s everyone else who will feel the impact.”
Outside, the fog has consumed the city entirely, wrapping San Francisco in a shroud of gray uncertainty. I drive toward the airport through streets rendered unfamiliar by mist, past the ghosts of gold rushes past and the specters of revolutions still to come. The city recedes in the rearview mirror, a place between definitions, between eras, between versions of America still competing for dominance.
The fog will lift tomorrow, as it always does. What remains to be seen is what sort of city will emerge from the clearing—and whether those who sought to remake it will recognize what they have wrought.