
Digital Unconscious: How AI Reveals Our Hidden Aesthetic Desires


Last night, the YouTube algorithm introduced me to a song that perfectly described a feeling I didn't even know I had: a melancholy melody by an Estonian composer I'd never heard of, which appeared in my feed without any apparent logic. And yet, somehow, that music spoke directly to a hidden part of me I thought was inaccessible. How on earth can a machine intuit what we don't even know we desire? This seemingly innocent question leads us straight to the heart of one of the most silent and unsettling revolutions of our time: the ability of artificial intelligence to act as an involuntary psychoanalyst, revealing through our digital behaviors the existence of a true algorithmic collective unconscious.


Jung Meets Netflix: The Paradox of the Psychoanalyst Algorithm

In 1916, Carl Gustav Jung theorized the existence of a psychic layer deeper than the personal unconscious: the collective unconscious, a repository of universal archetypes and psychic structures shared by humanity¹. Jung based this insight on recurring dreams, universal mythological symbols, and cross-cultural behavioral patterns. It was a fascinating but essentially unprovable theory. Today, for the first time in human history, we have something Jung could not even have imagined: billions of people leaving digital traces of their deepest desires. And what big data is revealing is extraordinary: recommendation algorithms don't just show us what we like; they seem to offer empirical support for the very idea of a Jungian collective unconscious. If we were to analyze the categories the TikTok algorithm uses to organize content, we might discover patterns surprisingly aligned with Jungian archetypes. The so-called 'wholesome content', those reassuring videos of happy families, puppies, and acts of kindness that make us feel safe, recalls the Innocent; the physical transformation videos that make us cheer for their protagonists activate the Hero; the adventure content that awakens our desire to explore speaks to the Explorer². It's not science fiction: the algorithm "knows" nothing about Jung, yet it appears to have converged on the same universal behavioral structures that the Swiss psychoanalyst theorized a century ago.
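
To make the idea concrete, here is a minimal, purely hypothetical sketch in Python. It clusters invented engagement data (the categories, watch-time shares, and archetype labels are all assumptions of mine, not anything TikTok actually exposes) and reports which content category dominates each cluster, the kind of pattern-finding that could surface archetype-like groupings.

```python
# Illustrative sketch only: toy data, hypothetical categories, not TikTok's system.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Hypothetical content categories a platform might track.
categories = ["wholesome", "transformation", "adventure", "horror", "asmr"]

# Three synthetic "taste profiles" stand in for archetype-like patterns
# (Innocent-, Hero-, Explorer-leaning users); 100 toy users are drawn from each.
profiles = np.array([
    [0.6, 0.1, 0.1, 0.1, 0.1],
    [0.1, 0.6, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.6, 0.1, 0.1],
])
users = np.vstack([rng.dirichlet(p * 20) for p in profiles for _ in range(100)])

# Cluster users by engagement pattern and report each cluster's dominant category.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(users)
for c, centroid in enumerate(kmeans.cluster_centers_):
    print(f"cluster {c}: dominant category = {categories[int(centroid.argmax())]}")
```

Nothing in this toy guarantees that real engagement data would cluster so cleanly; it only shows the shape of the analysis the paragraph imagines.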


The Algorithm as the Lacanian Big Other

There is something profoundly Lacanian in all this. The algorithm functions like Lacan's "Big Other": an entity that observes us, understands us, and, most interestingly, seems to know us better than we know ourselves³. When you do something and think "what will people say?", that "people" is the Big Other: not specific individuals, but a symbolic entity that represents the social gaze, judgment, and approval. Yet unlike the Lacanian Other, which was a symbolic construct, the algorithm is terribly concrete in its manifestations. It has mapped our Jungian Shadow through digital behaviors never before analyzed on such a scale. Research by Palmer and colleagues has shown robust music-color associations mediated by emotion, while Lindborg and Friberg have documented specific pairings between joyful music and yellow, angry music and red, and sad music and dark blue. It is plausible that Spotify, by analyzing listening patterns together with the visual preferences we express on social media, could confirm these findings at a global scale; similarly, Netflix could map the correlations between soundtracks and users' preferred color palettes. These cross-modal correspondences are not random: they recur across different cultures and, as Ward and colleagues have shown, rely on cross-modal mechanisms common to us all, the same ones at work in synesthesia, suggesting that certain sounds really do activate universal visual archetypes, just as Jung theorized with his "feeling-toned complexes."
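
If one wanted to test the Spotify and Netflix speculation above, the analysis could be as simple as a correlation between audio features and color preferences. The sketch below uses invented data and an invented "warmth" score (both assumptions of mine, not any real Spotify or Netflix metric) just to show the shape of such a test.

```python
# Illustrative sketch only: simulated users, not real Spotify/Netflix data.
import numpy as np

rng = np.random.default_rng(7)
n_users = 1_000

# Hypothetical per-user features: average valence of listened music
# (0 = sad, 1 = joyful) and average warmth of liked images
# (0 = cool blues, 1 = warm yellows/reds). The link between them is simulated here.
valence = rng.uniform(0, 1, n_users)
warmth = 0.6 * valence + 0.4 * rng.uniform(0, 1, n_users)

# A positive Pearson correlation would mirror the joyful-music/yellow and
# sad-music/dark-blue associations reported by Palmer et al. and Lindborg & Friberg.
r = np.corrcoef(valence, warmth)[0, 1]
print(f"correlation between musical valence and color warmth: r = {r:.2f}")
```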


Generative Art as a Digital Rorschach Test

Midjourney, DALL-E, and other AI art systems aren't just creating images: they function as gigantic digital Rorschach tests, letting us observe collective patterns in human desire by analyzing billions of prompts. Research into the most popular prompts reveals recurring themes: "surreal forest," "dreamlike landscape," "abstract cityscape," and "fantasy world" dominate global requests, suggesting a deep archetypal desire to escape ordinary reality toward imaginary spaces. The trend has a real-world correlate: the subreddit r/LiminalSpace has surpassed 968,000 members, the @SpaceLiminalBot account on X has accumulated more than 1.3 million followers, and the TikTok hashtag #liminalspaces has exceeded two billion views. The first peak in popularity for images of liminal spaces came in March 2020, when the lockdowns began. These spaces (empty shopping malls, deserted hallways, abandoned pools) activate what we could call the archetype of the "Passage": thresholds between conscious and unconscious, between real and imaginary. The reaction occurs because such images break the "spatial narrative," the story that a given space normally tells us. The fact that millions of different people find them simultaneously fascinating points to shared psychic structures that go far beyond cultural differences.
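
As a rough illustration of how such prompt analysis works, here is a small Python sketch. It counts theme keywords over a handful of made-up prompts; the prompts and the theme lexicon are my own assumptions, not a real Midjourney or DALL-E dataset, and a serious analysis would use topic models or embeddings over billions of entries.

```python
# Illustrative sketch only: made-up prompts, not a real Midjourney/DALL-E dataset.
from collections import Counter

prompts = [
    "surreal forest at dusk, volumetric light",
    "dreamlike landscape, pastel sky",
    "abstract cityscape, neon reflections",
    "fantasy world with floating islands",
    "surreal forest, fog, liminal mood",
    "empty shopping mall, liminal space, 35mm film",
]

# Hypothetical theme lexicon drawn from the motifs mentioned above.
themes = ["surreal", "dreamlike", "abstract", "fantasy", "liminal"]

# Count how many prompts mention each theme and list the most frequent first.
counts = Counter(t for p in prompts for t in themes if t in p.lower())
for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```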


The Aesthetic Voyeurism of the Digital Age

But perhaps the most revealing aspect of this digital unconscious concerns what we could call "aesthetic voyeurism." Algorithms have identified two completely different modes of aesthetic consumption: the "social" one (what we share and claim to appreciate) and the "private" one (what we actually consume when no one is watching). The most striking case is so-called "comfort content": videos of people cooking, cleaning the house, gardening, painting watercolors; content we would publicly dismiss as "boring" but whose private views reach astronomical numbers. The algorithm has identified in these videos the activation of what Jung called the archetype of the Great Mother: figures that evoke security, nourishment, and protection. It is humbling to realize that a machine is mapping our repressed maternal needs through the time we spend watching strangers prepare bread. But it is also illuminating: it reveals how our hyper-connected, accelerated world has created an unconscious hunger for slow rhythms, ancestral gestures, and rituals of care that we thought we had outgrown.


Towards a Predictive Aesthetic: The Future of the Digital Unconscious

What we are observing is only the beginning of a deeper transformation. If AI can map the collective unconscious, can it also begin to modify it? Algorithms are already experimenting with what researchers call "predictive aesthetics": the ability not only to anticipate what we will like, but to actively influence the formation of our future tastes. TikTok has begun testing content that activates "dormant" archetypes, showing us things we didn't know we wanted in order to observe whether new patterns of desire can be created. It is a real-time experiment on the evolution of the collective unconscious, conducted on billions of unwitting guinea pigs. Jung believed the collective unconscious was eternal and immutable; big data suggests instead that it is fluid, emergent, and co-created by the technologies that observe it. We are not just discovering our hidden desires; we are writing them in real time.


Authenticity in the Algorithmic Era

This brings us to the fundamental question: in a world where the algorithm knows our archetypes better than we do, what does "knowing oneself" still mean? If AI can predict and influence our aesthetic tastes before we consciously form them, are we still choosing, or are we being chosen?⁵ I discussed this in a previous article (The End of History Has Been Postponed and Algorithms Are Making Fun of Us). Perhaps the answer lies in recognizing that the digital unconscious is neither pure revelation nor total manipulation, but something more complex: a continuous dialogue between our deepest drives and the technologies that reflect and amplify them. As in any authentic dialogue, the key is to stay aware of both voices at play. The next time an algorithm suggests something you "inexplicably" like, stop for a moment. Ask yourself: what is it seeing in me that I don't see? And, most importantly, do I really want to know? Ultimately, perhaps the digital unconscious is nothing more than the most precise and unforgiving mirror humanity has ever had. The question is whether we are ready to look at what it reflects. There are theories that describe the human being as a trained animal (trained by God, by aliens, or, more likely, by ourselves through society); we are entering an era in which AI will contribute to that training, but that is a topic for a future article.


Bibliography

  • Jung, C.G. (1959). The Archetypes and the Collective Unconscious. Princeton University Press.

  • Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.

  • Lacan, J. (1998). The Seminar of Jacques Lacan Book XI: The Four Fundamental Concepts of Psychoanalysis. W. W. Norton & Company.

  • Palmer, S. E., Schloss, K. B., Xu, Z., & Prado-León, L. R. (2013). Music-color associations are mediated by emotion. Proceedings of the National Academy of Sciences.

  • Lindborg, P., & Friberg, A. (2015). Music-color correspondence in film music. Psychology of Music.

  • Ward, J., Huckstep, B., & Tsakanikos, E. (2006). Sound-colour synaesthesia: to what extent does it use cross-modal mechanisms common to us all? Cortex, 42(2), 264-280.

  • Think with Google. (2022). The soothing video trend captivating Gen Z. Google Consumer Insights.

  • Barratt, E. L., & Davis, N. J. (2015). Autonomous sensory meridian response (ASMR): a flow-like mental state. PeerJ, 3, e851.

  • For more on how algorithms manipulate the perception of reality, see: “The End of History Has Been Postponed and Algorithms Are Making Fun of Us.”

  • Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs.
