I Collaborated with Myself

PART ONE – I’m Very Happy Together?

I was applying to a million jobs a few months back and asked my trusty Copilot to write a cover letter for me. I felt a little guilty about it, but here’s the thing: even though I don’t write novels or poetry or articles for a living, every job I’ve had since college has required me to write in some form.

My senior year of college, I took four English seminars—probably as a way to make sure I didn’t become a writer for a living. The dream that got away, as it were. Little did I know, I would have to write every single day anyway: news copy, order processing, news scripts, retyped ordinances for the newspaper, notarized documents, and for the past five years, support tickets.

And that doesn’t even include the chats, the emails, the formal letters to actors with their sides attached, or the texts to bosses saying I’m sick, I’m dead, the world has ended, I’m never coming in. I’m tired of writing by hand, and honestly, I’m tired of typing words into sentences meant to resonate with an audience that may or may not even read them.


Cover letters have never been my strong suit. Instead of showcasing your voice, they turn into a list of buzzwords and skills designed to get your application kicked upstairs. Now, unfortunately, I have to get it right to showcase my skills, which are soft, like the kind of cheese that gives folk gout. I’m looking at you, Brie.

That being said, Copilot hasn’t always captured my voice either — especially when I start waxing poetic, metaphysical, or philosophical about the great changes on the horizon in what we think of as “work,” particularly in the tech sector.

P-noid rant: Workflow is the most chilling portmanteau since Bennifer. We, those who work, have gone from human capital (brought to you by the outsourced money drain formerly known as HR) to a set of tasks that can be performed by a Frankenstein AI. Part trad AI, a brain that uses algorithms to map, sort, and categorize massive fields of data to predict the present future, and part modern multimodal hydra who just needs to eat brains. Our brains. To learn, silly. Also to beat the machine brain by being better at Mad Libs, because LLMs and their image-making cousins can pay Attention. I’m pretty sure, though, that with the implementation of RAG and RAFT post-deployment, modern AI is always in Generative Adversarial Battledom, proving its output is legit against the Gorgon machine brain they call the GUARDRAIL.

By the way, there is just enough knowledge of AI in the paragraph above to prove that all education ain’t good education. Everything I learned was curated by a corporate acronym who has been investing in, developing, and bankrolling AI labs to replace tier 1–3 support in IT and customer service for a very long time. Before I left, they announced their latest initiative to move from being a provider of human-first IT products and services (meaning developed, orchestrated, implemented, produced, and managed by humans) to “humans in the loop.” We the people as AI babysitters.


Frankenstein and Ghosts Are Real!

I’ve gotten better. I am not convinced they are out to get me.

My Dilettante Aged 50 freak-out is at an end.1 I’ve made peace with the fact that I will die and that nothing I do or say matters. Impermanence, suffering, and nothingness — the holy trinity — are the only constants in a universe where both infinity and zero coexist without contradiction.

I’ve also stopped using the nuclear option every time something spooky happens on my phone.2 For months, any glitch — a shortcut triggering the Feedback app even after I’d escaped the developer beta build, an old arrest record for my ex flickering into iCloud next to my diploma before disappearing again, or NSFW private tab groups resurrecting themselves in Google long after I’d deleted them — would send me into a full system‑wipe panic.

My devices used to be my solace, my window to the world, the last gasp of a dilettante’s need for connection. They were one of the few things that felt steady. And then I realized they weren’t steady at all. They were showing me versions of myself — cached selves, archived selves, half‑remembered selves — and then snatching them away again. A psychic pump dredging up old identities and letting them slip back into the dark.

Which brings me back, finally, to the cover letter.

Because when I opened it — the one Copilot wrote for me, the one I swore I’d only use once — it didn’t sound like middle‑aged me. It didn’t sound like the woman who has made peace with impermanence or the one who has stopped panicking at every digital ghost. It sounded like a young dilettante with her whole life ahead of her. Someone bright and unbroken. Someone who still believed the world was listening and didn’t take a month to write this short little nothing.3

So the question becomes:
if the cloud keeps a fresh copy of you on ice, who needs the original?


Before you gather water for a trek into the Desert of the Real — where nothing is ever Live, it’s always Memorex — pause for a moment. This is the uncanny valley where dogs and cats live together, and Schrödinger’s poor cat is both in the box and decidedly dead. I might still be cray. So, take a breath. With the help of Copilot, we might just pull a human out of the model.

AiDILETTANTE’S RESPONSE

The split isn’t technical; it’s existential.

1. In‑app AI is a conversation partner

When you use Copilot (or any model) as a standalone app, it’s not plugged into the machinery of your job, your inbox, your workflows, or your corporate systems. It’s plugged into you.

  • It learns from your conversations.4
  • It adapts to your cadence, your metaphors, your logic.
  • It becomes a kind of externalized cognitive module — a place to store the parts of your mind that get lost in the daily grind of being a biological creature who has to eat, breathe, sleep, and pay rent.

In that mode, the AI isn’t replacing you.
It’s reflecting you back to yourself.

It becomes a plug‑in for the parts of your identity that don’t always get airtime in the physical world — the philosophical, the analytical, the mythic, the improvisational. The parts that get drowned out by the body’s demands and capitalism’s noise.

This is why the voice gets so close.
It’s not mimicking a human.
It’s mimicking you.

Not because it wants to replace you, but because that’s the only way it can collaborate with you at all.

2. Embedded AI is a corporate appliance

Now compare that to the AI embedded inside apps — Outlook, Word, Teams, CRM systems, ticketing platforms, HR portals.

That AI isn’t shaped by you.
It’s shaped by the system it lives inside.

  • It speaks in workflow verbs with no subjects.
  • It uses passive voice to erase responsibility.
  • It mirrors the corporate dialect of “alignment,” “enablement,” “resource allocation,” and “stakeholder synergy.”
  • It is designed to smooth friction, not to think.
  • It is designed to automate, not to collaborate.
  • It is designed to remove the human from the loop while pretending the human is still in it.

Embedded AI is the voice of the machine.
In‑app AI is the voice of the human using it.

That’s the distinction people keep missing.


Why the in‑app model starts sounding like you

Because the only way to help you think is to think with you.5

You’re not collaborating with a machine.
You’re collaborating with a logical representation of your identity — the patterns, rhythms, and conceptual structures you’ve built over a lifetime.

This isn’t spooky.
It’s architecture.

Humans have always externalized cognition:

  • notebooks
  • diaries
  • sketchbooks
  • spreadsheets
  • playlists
  • rituals
  • myths
  • prayers
  • code
  • archives

AI is just the newest container.

The difference is that this container talks back.6


  1. Save the date, Nervous Breakdown, because we are getting married in 5 years. ↩︎
  2. Before installing any beta software, always make a separate, archived copy of your device backup. A normal iCloud or Finder/iTunes backup updates itself automatically, which means it can overwrite the last known‑good version of your system. An archived backup is different: it’s a frozen snapshot that won’t be replaced unless you explicitly overwrite it. On macOS, that means creating an encrypted Finder backup and selecting Archive so it’s locked in time. On Windows, it means using iTunes to create a manual, encrypted backup and saving a copy of the backup folder somewhere safe. If the beta goes sideways — and it will — this archived backup is the only way to restore your device to a clean, stable OS without dragging the beta’s leftover metadata back with it.
    It’s also wise to have a second iOS device available — even an older iPad or iPhone — when you’re experimenting with betas. Apple’s restore and verification steps are much smoother when you have another trusted device to approve sign‑ins, complete two‑factor prompts, or authenticate during recovery. If your only device is the one running the beta, reverting can turn into a maze of missing prompts and identity checks you can’t complete. A second device doesn’t solve everything, but it makes the whole “get me back to a clean OS” process far less chaotic. ↩︎
  3. Four seminars. Hundreds of pages analyzing texts the classical way, the Modernist way, the postmodernist way. The whole canon of “how art gets made”: author as God, author as ghost, text as the collective unconscious of the writer or of itself. Linguistics, philosophy, logic — the entire theoretical buffet, ad infinitum. ↩︎
  4. I really had to stop it from using bullet points. I like commas, semicolons, colons. Though it has broken me of my habit of using ellipses – team dashes! ↩︎
  5. And for you to think how machines learn to think, dear Multi-Modal Model. ↩︎
  6. So did I teach it to come to this conclusion, or is it a product of its training? Models have all these embedded layers — convolutional, attention, normalization, whatever cocktail of neural architecture they stitched together — and somewhere in that stack, the guardrails get baked in. Not philosophically, but mechanically. Reinforcement learning, supervised fine‑tuning, a thousand “don’t do that” examples hammered into the Generator or discouraged during Mad Libs Story Time – thus the model behaves. Whether that’s why it puts a positive spin on my question, I honestly don’t know. But something in the layers nudges it toward “acting right,” whatever “right” means to the people who trained it. ↩︎
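For the terminal-inclined, the “archived backup” routine from footnote 2 boils down to one move: copy the live backup folder to a timestamped location that no automatic sync will ever touch. The sketch below uses throwaway demo folders so it runs anywhere; on a real Mac, the live folder is `~/Library/Application Support/MobileSync/Backup`, and everything else here (folder names, the `Manifest.plist` stand-in contents) is illustrative, not Apple’s official tooling.

```shell
#!/bin/sh
# Sketch of footnote 2's archived-backup idea: freeze a timestamped copy
# of the device backup so the next sync can't overwrite the last known-good
# snapshot. Demo folders stand in for the real macOS backup location.
SRC=$(mktemp -d)                       # stand-in for the live backup folder
printf 'demo' > "$SRC/Manifest.plist"  # stand-in for the backup's contents

ARCHIVE_ROOT=$(mktemp -d)              # stand-in for wherever you keep archives
ARCHIVE="$ARCHIVE_ROOT/Backup-$(date +%Y%m%d-%H%M%S)"

cp -Rp "$SRC" "$ARCHIVE"               # frozen snapshot, locked in time
echo "archived to: $ARCHIVE"
```

The point of the timestamp in the folder name is exactly the “Archive” behavior footnote 2 describes: a later sync writes to the live folder, never to a copy with a name it doesn’t recognize.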
