The Detachment Model
Mark Zuckerberg as a case study in avoidance, disembodiment, and digital influence
To my subscribers: this one went out a little late this week. I was really torn about what to write. I’d included this topic in my monthly long-form post polls, but recent events made it impossible to hold back. Mark Zuckerberg’s presence in the AI debate and the fallout from Sarah Wynn-Williams’ memoir Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism offer a timely opportunity to revisit what his leadership reveals about the systems shaping our digital lives.

I began drafting this piece months ago, after watching Bari Weiss’s interview with Sarah Wynn-Williams, former Director of Global Public Policy at Facebook (now Meta). Her account of Meta’s internal dysfunction—its moral evasions, top-down culture, and attempts to accommodate authoritarian regimes—reads less like an exposé than a structural analysis, not just of Zuckerberg but of a broader leadership model defined by avoidance, control, and emotional distance.
Zuckerberg isn’t just a prominent figure in tech. He represents a leadership logic that has quietly shaped how digital communication works. What follows isn’t a takedown, but an inquiry: What happens when a person uncomfortable with interpersonal friction is given the resources to standardize interaction for billions?
“By the end, I watched hopelessly as they sucked up to authoritarian regimes like China’s and casually misled the public. I was on a private jet with Mark the day he finally understood that Facebook probably did put Donald Trump in the White House, and came to his own dark conclusions from that.” ― Sarah Wynn-Williams, Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism
In what follows, I’ll outline why Mark Zuckerberg serves as a compelling contemporary example of detachment theory in action, from both a psychological and a Buddhist perspective. His behavioral patterns, leadership style, and public affect reveal a striking alignment with what psychology might frame as affective distancing, and what Buddhism would recognize as a deep entrenchment in aversion, control, and disembodiment. This isn’t an indictment of Zuckerberg as an individual, but a critique of the structural, psychological, and spiritual implications of the systems he’s built, systems that now shape how billions of people relate to each other. As someone trained in media psychology, embodiment coaching, and Buddhist practice, I’ve long been drawn to examine the deeper internal mechanisms driving such external influence.
Inside Facebook’s Culture of Compliance: Sarah Wynn-Williams, Zuckerberg, and the Cost of Silence

In her book, Wynn-Williams offers a critical account of Facebook’s internal culture and decision-making during her time there. She alleges that Facebook’s leadership, including CEO Mark Zuckerberg, worked to develop censorship tools to appease the Chinese government in hopes of entering the Chinese market, and she details instances of workplace misconduct, including allegations of sexual harassment by senior executives (see note 2). Following the memoir’s publication, Meta pursued legal action to prevent Wynn-Williams from promoting the book, citing a non-disparagement agreement she had signed, and an arbitrator issued a gag order barring her from making critical comments about the company. Despite this, the book became a bestseller, reaching the top of the New York Times nonfiction list.1
In April 2025, Wynn-Williams testified before the U.S. Senate Judiciary Committee, accusing Meta of compromising American values and national security in its bid to enter the Chinese market. She claimed that Meta developed a censorship system in 2015 that would have allowed the Chinese Communist Party to control social media content, and that Zuckerberg personally led efforts to court Chinese leaders despite publicly positioning himself as a free speech advocate.2
Meta has disputed Wynn-Williams’ claims, asserting that her allegations are false and that she was terminated for poor performance. The company maintains that it does not operate services in China today.
A Leadership Style That Reflects Internal Avoidance
Based on Wynn-Williams’ account, Zuckerberg appears to operate from a place of psychological detachment, not only in his leadership style but also in his relationship to the consequences of his decisions. This isn’t simply a matter of avoiding criticism; it reflects the construction of an entire system designed to filter, control, and mediate human interaction by building affective disembodiment into digital design.
When someone struggles with direct interpersonal engagement, it’s common for them to develop structures that buffer emotional exposure. In psychology, this is often interpreted as a coping strategy. For instance, socially anxious individuals tend to prefer computer-mediated communication (CMC) over face-to-face interaction, as it reduces the discomfort of real-time social evaluation.3 When this pattern appears in an individual, it’s often classified as avoidant behavior or affective distancing. But when such a pattern is encoded into the design of a global platform, it becomes a socio-technical infrastructure, one that not only externalizes the creator’s psychological detachment but also normalizes and reproduces it across user behavior, relational norms, and digital culture.
Meta’s internal culture appears to mirror this same dynamic. Wynn-Williams’ account portrays a workplace where dissent was discouraged and executive authority went unchecked, a pattern consistent with research linking controlling, abusive leadership and psychological detachment to organizational dysfunction. One recent quantitative study (N = 313) linked abusive supervision to workplace deviance, with psychological detachment playing a mediating role.4 Zuckerberg’s leadership style, as described by Wynn-Williams, aligns with this picture: centralized control, muted dissent, and a company culture that prizes deference over dialogue.
This detachment doesn’t remain confined to internal operations; rather, it’s inscribed into the platform’s core functions. Meta operationalizes emotional distancing not only in its leadership structure but in the interactional patterns it programs at scale: low-friction, high-frequency contact that minimizes vulnerability and discourages depth.
The same avoidance of dissent seen inside the company appears externally in interface design that favors engagement metrics over meaningful exchange. Empirical research in human-computer interaction supports this concern: lightweight interactions such as likes, shares, and brief comments reduce social risk, but they deliver little of the benefit that comes from composed, personal exchanges with people we are close to.5 Communication researchers have likewise argued that habitual, low-effort patterns of use crowd out slower, more reflective engagement.6
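To make that structural claim concrete, here is a deliberately simplified sketch, in Python, of what ranking a feed by predicted engagement can look like. This is not Meta’s actual ranking system; the post fields, weights, and scoring function are invented purely for illustration. The point is architectural: when a feed is sorted by predicted quick reactions, low-friction interaction is what the design rewards, whatever the intentions behind it.

from dataclasses import dataclass

@dataclass
class Post:
    label: str
    predicted_likes: float         # cheap, one-click reactions
    predicted_shares: float        # fast redistribution
    predicted_long_replies: float  # slower, higher-effort conversation

def engagement_score(post: Post) -> float:
    # Hypothetical weights: quick reactions dominate because they are
    # frequent and easy to elicit; deliberate conversation barely registers.
    return (1.0 * post.predicted_likes
            + 1.5 * post.predicted_shares
            + 0.3 * post.predicted_long_replies)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Whatever maximizes the score rises to the top, regardless of depth.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("thoughtful essay", predicted_likes=5, predicted_shares=1, predicted_long_replies=20),
        Post("outrage meme", predicted_likes=400, predicted_shares=120, predicted_long_replies=2),
    ]
    for post in rank_feed(feed):
        print(post.label, round(engagement_score(post), 1))

Nothing in this toy example needs to be malicious for the outcome to be lopsided; the asymmetry lies in what gets measured and weighted, which is the sense in which avoidance can be encoded into infrastructure rather than merely felt by an individual.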
Meta has long marketed itself as a platform for connection. But connection without presence is merely stimulation, not relationship. Buddhist psychology distinguishes between phassa (bare sensory contact) and yoniso manasikāra (wise, embodied attention)—the former is reactive and automatic, the latter is deliberate, conscious, and ethically grounded.7 Meta is optimized for the former while systematically disincentivizing the latter.
Zuckerberg, AI Companions, and the Automation of Intimacy
Zuckerberg recently elaborated on his vision for AI companions, positioning them as a potential solution to what he called the “loneliness epidemic.” In an interview with podcaster Dwarkesh Patel, Zuckerberg cited data indicating that the average American has fewer than three friends, despite a desire for closer to fifteen. “Humans are having difficulty establishing connectivity with each other,” he said, adding that generative AI could help close that gap by offering scalable, social-level interaction. He acknowledged that stigma might persist, but expressed hope that society would “find the vocabulary... to articulate why it is valuable and why the people that are doing these things are rational... and how it’s adding value to their lives.”8
But this framing glosses over what’s being automated. These so-called “companions” are not neutral tools—they’re engineered simulations of human interaction, designed to mimic intimacy without offering vulnerability or mutuality. As MIT sociologist Sherry Turkle (2011) has argued, such technologies risk reshaping our expectations of connection itself, conditioning users to seek emotional fulfillment from systems that cannot reciprocate. The result is a hollowing out of relational depth—what feels like presence, but is structurally incapable of true relationality.9
From a psychological perspective, the enthusiastic embrace of AI-mediated connection reflects a broader cultural trend toward minimizing interpersonal risk. But in Zuckerberg’s case, it also seems consistent with a longer-standing discomfort around human emotion and unpredictability. Wynn-Williams’ portrait of Meta’s leadership style—detached, image-managed, intolerant of dissent—parallels the logic behind AI companions: reduce friction, suppress conflict, and replace the messiness of real contact with programmable responsiveness.
Far from solving loneliness, AI companions may normalize a form of relational bypassing—connection stripped of mutuality. In Buddhist terms, they offer phassa without yoniso manasikāra—mere contact, absent wise attention. And they risk reinforcing the very forms of disembodiment and aversion that spiritual traditions have long warned against.
A Buddhist Lens on Ethical Leadership
From a Buddhist standpoint, the systems individuals build tend to reflect the qualities of mind they bring to them. The Buddha taught that intention (cetanā) is central to action and its consequences. If actions arise from unexamined attachment, aversion, or delusion, the results are likely to carry those same qualities (see note 7).
Mark Zuckerberg’s early focus was on technical optimization and rapid growth; he launched Facebook as a software project in 2004 with little emphasis on long-term ethical reflection.10 Research in cognitive neuroscience suggests that emotionally avoidant behavior—common in leadership styles that emphasize system control over interpersonal engagement—can dampen empathy-related neural responses.11
The question, then, is not one of blame, but of responsibility: whether those with influence are willing to engage in ongoing self-awareness and ethical inquiry to prevent harm from becoming embedded in the structures they create.
Final Thoughts: What This Means for the Rest of Us
I don’t expect tech leaders to develop self-awareness overnight, and Meta isn’t unique in embedding these dynamics into product design. But Wynn-Williams’ account highlights a familiar pattern: leadership rooted in emotional detachment, control, and suppression of dissent—traits that research shows can erode psychological safety and ethical sensitivity.
The impact extends beyond internal culture. When affective distancing is embedded into algorithms and interface design, it shapes how people relate and communicate at scale. As Turkle (2011) argues, these systems train us to accept simulation over connection, reinforcing patterns of avoidance and flattening social complexity (see note 9).
Buddhist psychology names this as unexamined cetanā (intention). When avoidance becomes the default, it scales suffering—not from malice, but from inattention.
🙏🏼 Thank You
Thanks for reading. None of this is abstract—it shows up in how we relate, how we communicate, and what we come to expect from each other.
Next week: I’m writing about crying—what it does, why I fight it, and what both science and the Pāli Canon actually say. The Buddha didn’t criticize tears. He criticized clinging. Important distinction.
If you value this kind of writing—interdisciplinary, critical, Buddhist—consider becoming a paid subscriber. It helps me keep doing this work without compromise, and helps me pay for that AI subscription 😉. I’d love to hear your thoughts in the comments. Until then. Be well!
References
1. Conger, K. (2025, April 8). She promised not to speak ill of Meta. Then wrote a tell-all. Now she can’t talk about it. The Wall Street Journal. https://www.wsj.com/business/media/she-promised-not-to-speak-ill-of-meta-then-wrote-a-tell-all-now-she-cant-talk-about-it-b753ebeb
2. Nix, N. (2025, April 9). Meta silenced a whistleblower. Now she’s talking to Congress. The Washington Post. https://www.washingtonpost.com/technology/2025/04/09/meta-wynn-williams-facebook-china-congress/
3. Weidman, A. C., Fernandez, K. C., Levinson, C. A., Augustine, A. A., Larsen, R. J., & Rodebaugh, T. L. (2012). Compensatory internet use among individuals higher in social anxiety and its implications for well-being. Personality and Individual Differences, 53(3), 191–195. https://doi.org/10.1016/j.paid.2012.03.003
4. Zhao, H., Xia, Q., He, P., Sheard, G., & Wan, P. (2022). Abusive supervision and workplace deviance: The mediating role of psychological detachment and the moderating role of trait mindfulness. Journal of Business Ethics, 180, 857–872. https://doi.org/10.1007/s10551-021-04845-3
5. Burke, M., & Kraut, R. E. (2016). The relationship between Facebook use and well-being depends on communication type and tie strength. Journal of Computer-Mediated Communication, 21(4), 265–281. https://doi.org/10.1111/jcc4.12162
6. Kross, E., Verduyn, P., Sheppes, G., Costello, C. K., Jonides, J., & Ybarra, O. (2021). Social media and well-being: Pitfalls, progress, and next steps. Trends in Cognitive Sciences, 25(1), 55–66. https://doi.org/10.1016/j.tics.2020.10.005
7. Ñāṇamoli, B., & Bodhi, B. (Trans.). (2001). The middle length discourses of the Buddha: A translation of the Majjhima Nikāya. Wisdom Publications. https://www.wisdompubs.org/book/middle-length-discourses-buddha
8. Otieno, D. (2025, May 27). Mark Zuckerberg says Meta is developing AI friends to beat the loneliness epidemic. Windows Central. https://www.windowscentral.com/software-apps/mark-zuckerberg-says-meta-is-developing-ai-friends-to-beat-the-loneliness-epidemic-after-bill-gates-claimed-ai-will-replace-humans-for-most-things
9. Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. Basic Books. https://www.basicbooks.com/titles/sherry-turkle/alone-together/9780465031467
10. Carlson, N. (2010, March 5). At last – the full story of how Facebook was founded. Business Insider. https://www.businessinsider.com/how-facebook-was-founded-2010-3
11. Decety, J., & Lamm, C. (2006). Human empathy through the lens of social neuroscience. The Scientific World Journal, 6, 1146–1163. https://doi.org/10.1100/tsw.2006.221