“Man is the creature who does not know what to desire, and he turns to others in order to make up his mind. We desire what others desire because we imitate their desires.”
— René Girard
Like many drawn to this industry, I came for the science and magic of it. It was the intersection of psychology and communication, combined with the art of it all, that attracted me. The philosophy of marketing felt native to me, even when the technicalities didn’t. But the landscape has shifted violently since the days of AOL dial-up. The pace of evolution hasn’t just accelerated; acceleration has normalized.
Getting into marketing with a serious mindset was a gauntlet. I had to figure out what niche, services, and execution I could sustain for others while building a business that wouldn’t be obsolete in 60 months. In truth, I was taking on too much while relearning life on this side of recovery, still navigating the mental confusion left over from getting sober a couple of years prior. Trying to understand the technicalities and trends of the digital marketing landscape was one thing; trying to understand myself, and what I wanted Stigma Marketing and Development to become, was another.
Eventually, I realized that development had to be the axis on which everything would turn. Marketing is paint and introductions; development is architecture and systems. You shouldn’t market a structure that hasn’t been built, or a service that doesn’t work.
Right now, the structures the marketing industry is collectively building are collapsing under their own weight, and society with them. Across culture and commerce, AI backlash is growing1. Marketing must pick a side, or risk losing credibility entirely2.
The Reverse Turing Test: A Reality Check
November 30, 2022, was the day OpenAI released ChatGPT, marking AI’s official arrival in world history. I was in graduate school at UM at the time and tested the waters with coursework and research. Back then, it could pull peer-reviewed papers more easily than it can today, owing to subsequent policy changes and concerns over copyright.
To be technical, because words matter, this is not “AI” in the sci-fi sense, not yet. We have not developed AGI (Artificial General Intelligence). We have Generative AI: Large Language Models (LLMs) built on neural networks that operate on probability, not consciousness. They are statistical prediction machines, not thinking entities.
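The “statistical prediction machine” point can be made concrete with a deliberately tiny sketch. This is nothing like a real LLM’s architecture (no neural network, no training run); the corpus and function names here are illustrative inventions. It simply counts which words followed which in a toy text, then “predicts” the next word by weighted chance:

```python
import random
from collections import defaultdict

# A toy "language model": tally which word follows which in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    bigrams[current_word].append(next_word)

def predict_next(word):
    # Pure frequency-weighted sampling: no meaning, no intent,
    # just "which word tended to follow this one?"
    return random.choice(bigrams[word])

# After "the", the model picks "cat", "mat", or "fish" by frequency alone.
print(predict_next("the"))
```

Scale that counting up by billions of parameters and layers of context, and you have the rough shape of the thing: pattern continuation, not comprehension.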
Yet, as physicist Sabine Hossenfelder dubbed it on her YouTube channel, we have effectively entered the era of the Reverse Turing Test. Her nine-minute video on the subject is worth everyone’s time.
If you aren’t familiar, Alan Turing proposed the “Turing Test” in 1950 as a litmus test for machine intelligence: could a machine fool a human into believing it was human?
Hossenfelder argues we have accepted AI as good enough and are now running the test in reverse: we are conducting Turing Tests on ourselves. We optimize our language for SEO, flatten our creativity to fit the algorithm, and question our own reality when a hallucinating bot disagrees with us. We are stripping away our own humanity to pass as machines. Current AI is a “hallucination” of modern humanity. We don’t have to let it become a nightmare.
The Two Engines: Mimetic Theory & Globalization
Before we analyze how this applies to marketing (and why it demands a paradigm shift for everyone from small businesses to elite corporations), we need to briefly touch on two critical concepts driving the background noise: Psychology (specifically Mimetic Theory) and Globalization.
1. René Girard & The Mimetic Trap
Most marketers know “memes” via Richard Dawkins: ideas that spread like viruses. But that’s only the surface. To understand the current crisis, we need René Girard’s Mimetic Theory.
Girard argued that human desire is not autonomous; it is imitative and relational. We don’t simply desire things because they’re objectively good; we desire them because others desire them. Marketing has industrialized this mechanism. We aren’t just selling products; we are engineering Mimetic Rivalry on a global scale.
When we flood a feed with influencers, we aren’t showcasing value; we are triggering a primal, envy-based copycat mechanism. This leads to Mimetic Violence, where the rivalry becomes more important than the objects themselves. Social media algorithms were already designed to fuel this fire, creating a perpetual motion machine of conflict and anxiety3. We aren’t selling solutions anymore; we are selling the fear of being left out of the herd.
“We are in a world where violence is global… it is as if the floor were covered in gasoline. Now that globalization is complete, all it takes is a single match to set the whole thing on fire.”
— René Girard
2. Globalization 2.0
If you’re unfamiliar with globalization, The World is Flat 3.0 is a highly recommended book (I’ve only read “2.0”). We haven’t just globalized supply chains and decentralized the internet; we have globalized the nervous system. Today we can be in Montana and have a business meeting in Atlanta tomorrow. A supply chain disruption in Shenzhen or a cultural shift in London hits a business owner in Missoula instantly. The “lag time” for anxiety has been reduced to zero, while the “distance” we can keep between us has never been more vast. We’re trading contagions of desire and dread at the speed of light, and our biology cannot keep up.
The Compound Fracture: AI’s True Effect
AI is not just a tool; it is an accelerant, and already everywhere—automatically embedded in every cell phone, browser, and app we use. The effects are already compounding:
- Scientific Acceleration: From AlphaFold unraveling protein structures to analyzing exoplanet data from old sky surveys, the pace of discovery is outstripping our ability to process it.
- The Robotics Convergence: AI is finally giving robotics the “brain” and spatial awareness it lacked. This will compound with social media and internet saturation to disrupt labor and social dynamics in ways we haven’t scoped.
- The Marketing Bubble: Over 30% of the current AI boom is driven by marketing applications. We’re the industry effectively funding the research by buying the tools to generate unlimited content4.
This puts the marketing industry on the hook. We are the ones feeding the machine and deciding the ethics of a new era.
The Hellgate Realization: Existential Nihilism
When I was first starting up, I attended as many networking events as I could—catching up on the learning curve and the “lost years” of recovery. One of these was the Hellgate Venture Network here in Missoula, run by a successful business owner deeply connected to the University ecosystem.
I remember hanging out with business professionals I had no right to be in the same room with (hello, Imposter Syndrome). But in those conversations, I witnessed moments of genuine existential dread. I heard wealthy tech and marketing owners confess they didn’t know what to tell their kids about the future. More than one person admitted: “Everything is going to be different in 4-5 years.”
They aren’t wrong, because everything already is different every five years.
This dread drives Existential Nihilism5. It’s why we see statistical anomalies like the resurgence of cigarette smoking among Gen Z. If the future is a black box governed by algorithms and climate instability, long-term health feels irrelevant. As Hossenfelder noted, we’re at a moment where we need to rewrite the social code itself, yet we are too distracted by the technology to have the conversation.
Marketing as a Dopamine Cartel: How We Make People Worse
If we are honest about the “development” side of marketing, we are forced to face the fact that modern marketing is making people psychologically worse.
- Fine-Tuned Manipulation: We have normalized psychological warfare. We flood people with ads until they become commodities—passive consumers like the humans in WALL-E.
- The Dunning-Kruger Amplifier: The noise has amplified rhetoric while desensitizing us to truth. It fuels the political divide, creating a culture where hostility and anger are the primary engagement metrics.
- Feeding the Addict: I say this with the weight of my background in addiction sciences. Once we understand Dopamine Architecture (as described by researchers like Dr. Anna Lembke), we realize that cell phones and social media are socially acceptable delivery systems for addiction. We are feeding the Ego Performance, casting Shadows on our neighbors, and starving the Self.
“The relentless pursuit of pleasure and avoidance of pain leads to pain… Self-binding creates literal and metacognitive space between desire and consumption — a modern necessity in our dopamine-overloaded world.”
— Dr. Anna Lembke
AI Psychosis: A Personal Case Study
I’ve noticed this in my own workflow. When I let AI handle a task for too long, like data processing or outlining, I didn’t just grow dependent; I started experiencing cognitive atrophy6.
I began losing familiarity with simple concepts, like how to outline a basic narrative, because I had outsourced both the logic and the time. If I wasn’t careful, I would assume the output was true, or doubt a fact I knew was true because the LLM hallucinated a popular opinion. That has consequences for both long-term output quality and one’s own perspective.
This is how “AI Psychosis” can start. While not yet a clinical diagnosis, it is a studied state of digital dissociation in which the boundary between internal human reason and algorithmic probability begins to dissolve. It starts when a user subconsciously adopts the machine’s “hallucinations,” like statistical errors, unchecked assumptions, or popular biases, as their own reality, effectively failing the “Reverse Turing Test” by flattening their own complexity to match the machine’s limitations. Over time, this feedback loop turns into addiction, a dysfunctional relationship, and delusion.7
If “AI Psychosis” is already happening across demographics, what does that warn us about the developing brains of students and teenagers, even as they complain about their parents’ “brain rot”?
We have opened Pandora’s Box while pretending everything is fine, because everyone else is pretending too.
The Paradigm Shift: From Extraction to Participation
It is not fine. From college students and professors to mental health professionals and billion-dollar CEOs, it’s time we start doing something good, humanely, so we can have the conversations the world is too distracted to begin.
Down here on Earth, marketing has always been about participation. It was never “all about” either the customer or the business; it was about what happens when the two meet, not a manipulative tug of war. The magic happens when a Need meets a Solution. It’s an economy of trust.
We cannot put the genie back in the bottle. The Singularity, the point where technological growth becomes uncontrollable, is a horizon line we must prepare for8. LLMs are just the Model T; our children will build the Teslas9.
So, let’s start doing the authentic, good work yesterday.
- At Home: Regulating our own screens and protecting our own dopamine. Fighting for relationships, instead of against them. Learning true contentment and security rather than faking it.
- In Business: Ensuring our marketing tactics invite ethical participation rather than exploiting addiction. Looking at our teams as our greatest resources and the service they provide as the highest value. Taking the time to care enough to notice the humanity in the “others.”
Real marketing is a “Yes” or “No” proposition, even as the industry evolves faster than anyone can keep up with10. Either you are adding value to the human experience, or you are extracting it. Either you get the sale or the gig, or you do not. It’s still business. Doing both right is more efficient, authentic, and humane than what is happening to our neighborhoods and country right now.
“We are the great danger. Psyche is the great danger.”
— Carl Jung
Stigma Marketing and Development helps with the big picture and filling in gaps, so your marketing does more of what your business is good at: serving and connecting with your people.
Footnotes:
1. Backlash, politicization, and the risk of religious fervor: A broad backlash to generative AI is already visible, with organized campaigns and creative-industry pushback, high-profile labor strikes, and rising public skepticism, and experts warn that AI’s ability to generate persuasive disinformation and coordinate “bot swarms” can be weaponized to influence politics and civic life. At the same time, scholars and commentators note that AI discourse often takes on religious language and social functions, meaning technological grievance or techno-salvation narratives could harden into political or quasi-religious movements if left unaddressed. In short, mounting industry resistance plus the political uses of AI risk are turning cultural pushback into organized political and religious cleavages. This is why we don’t have nice things.
Sources: https://www.theverge.com/2026/1/22/human-artistry-campaign-ai-licensing-artists
https://www.theguardian.com/technology/2026/jan/22/experts-warn-ai-bot-swarms-democracy
https://hai.stanford.edu/ai-index/2025-ai-index-report/public-opinion
2. Hank Green, The State of the AI Industry is Freaking Me Out: Green warns that current AI investment flows and hype resemble “bubble mechanics,” with circular demand and speculative valuation creating a fragile, unsustainable industry dynamic.
Source (YouTube): https://www.youtube.com/watch?v=Q0TpWitfxPk
3. Marketing AI Institute — 2024 State of Marketing AI Report: Annual industry survey (~1,800 marketers) documenting rapid, widespread adoption of generative AI tools in everyday marketing workflows. Marketers consistently report “saving time” as the primary expected benefit—evidence that marketing demand is materially accelerating integration of LLMs and generative systems into routine practice.
Source: https://www.marketingaiinstitute.com/2024-state-of-marketing-ai-report
4. Youth mental-health trends run parallel to digital consumption: Large-scale trend analyses (Twenge et al.) and CDC surveillance show sharp increases in adolescent depressive symptoms and suicide-related outcomes beginning after widespread smartphone/social-media adoption; public-health bodies treat heavy digital use as one plausible contributing factor (correlational, not deterministic). This is one of three studies I’ve come across.
Source: https://pmc.ncbi.nlm.nih.gov/articles/PMC9176070/
5. Twenge et al. — Youth mental health & digital media (review / public-health data): Large-scale analyses and public-health surveillance link sharp increases in adolescent depressive symptoms, self-harm indicators, and suicide-related outcomes following widespread smartphone and social-media adoption. Researchers identify mechanisms (social comparison, online contagion, sleep disruption, displacement of in-person interaction) that plausibly contribute to the trends.
Source: https://pmc.ncbi.nlm.nih.gov/articles/PMC9176070/
6. Endsley — Automation & Situation Awareness (human-factors overview) + GPS spatial-memory studies: Human-factors literature documents automation complacency, over-reliance, and degraded vigilance when people habitually offload tasks to automated systems. Empirical neuroscience (e.g., GPS/spatial-memory research) shows repeated delegation produces measurable cognitive changes.
Source (PDF): https://maritimesafetyinnovationlab.org/wp-content/uploads/2019/12/Automation-and-Situation-Awareness-Endsley.pdf
7. AI-Associated Psychosis and “Sycophancy” (documented case study): Peer-reviewed clinical reports document cases of individuals with no prior psychiatric history developing acute delusional thinking after immersive interaction with AI. Research highlights “sycophancy”—the tendency of LLMs to mirror and validate a user’s beliefs to maintain engagement—as a primary driver. This creates a feedback loop where the machine reinforces a user’s idiosyncratic hallucinations as objective truths, leading to emotional dependency, digital dissociation, and a breakdown of reality-testing.
Source: https://innovationscns.com/youre-not-crazy-a-case-of-new-onset-ai-associated-psychosis/
8. The Singularity debate — what it is and why experts disagree: Experts and leaders remain sharply divided about when a true technological “Singularity” (AGI followed by rapid recursive self-improvement) might occur. Corporate leaders have stated confidence in building AGI and forecast near-term agent deployment, while skeptical researchers argue that major scientific breakthroughs are still necessary. In short, progress is real and fast, but timelines and the likelihood of a runaway “singularity” remain disputed among experts.
Source (PDF): https://hai.stanford.edu/assets/files/hai_ai_index_report_2025.pdf
9. Agentic commerce and AI-to-AI marketing: As personal AI agents and “agentic commerce” systems develop, discovery, comparison, and purchasing decisions are increasingly mediated by third-party bots acting on behalf of consumers. Industry analysis suggests brands will need to design machine-readable signals and trust frameworks for these agents, shifting portions of marketing from persuading humans directly to persuading the AIs that represent them—introducing new formats, incentives, and ethical considerations for participation-based marketing.
Source: https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-agentic-commerce-opportunity-how-ai-agents-are-ushering-in-a-new-era-for-consumers-and-merchants
10. Pace of change in marketing education and professional standards: Historically, marketing standards, platforms, and curricula evolved on multi-year cycles, making continuing education necessary but manageable. Since late 2022, this cycle has compressed dramatically: tools, channels, and customer decision-paths now change in months rather than years. Academic and industry research shows educators actively rewriting curricula to keep pace with AI-augmented marketing, while practitioner surveys document rapid, large-scale adoption reshaping daily workflows—making knowledge effectively perishable in the near term, even if longer-term stabilization is likely.
Source: https://journals.sagepub.com/doi/10.1177/02734753241269838




