“No more reading emails, OK?” says tech founder and content creator Jason Yeager’s satirical boss character MyTechCeo in a recent TikTok skit. “I want your AI reading my AI-generated email—and answering my email.” It’s a parody, but only just. AI emails are proliferating across industries.
In October, LinkedIn CEO Ryan Roslansky said he uses AI for almost every “super high-stakes” email he sends. And a recent survey from the email verification software company ZeroBounce found that one in four respondents admits to using AI daily to draft or edit their own emails.
On Reddit, employees swap stories about a boss who uses AI “to answer every email at work and thinks no one notices,” or one who will “only communicate through AI-generated emails and it’s giving me anxiety.” Faced with messages like these, the most realistic response is often to use AI too: plug the message into a chatbot, tweak what comes out, and send it back.
How to Spot an AI-Generated Email
But receive a message that was likely written by AI, especially in the midst of a disagreement, and you can usually tell: something’s off. It sounds a little too well drafted. The tone is reasonable and balanced. And while the problems are addressed, something is missing: the voice of the person you’re communicating with. (A dead giveaway, of course, is when the prompt itself is left in.)
Emails may sound smoother this way, but experts worry that outsourcing difficult conversations also bypasses the relationship-building that makes workplaces function. When you ask a chatbot to rewrite your message to be more “concise” or “professional,” it can also strip away the emotional substance of the exchange—an act that may be shaping the future of work for the worse, incubating a generation of professionals who can’t talk to one another.
The Double-Edged Sword of AI Rehearsal
There is some reported benefit to “dry-chatting” with AI—practicing tricky topics with a bot first so you can tackle the issue directly and clearly with someone afterward. Used as rehearsal, AI can be an effective tool in building confidence. But when used as a substitute, it does the opposite.
Filling the gap entirely, with one person’s ChatGPT effectively talking to another person’s Claude, can create distance. This runs counter to what companies say they want when bringing colleagues back into the office: creativity, collaboration, and stronger working relationships.
Social Offloading: The Hidden Cost of AI in Leadership
“When it handles the hard conversation, the human never builds the muscle of doing that,” Leena Rinne, vice president of leadership, business, and coaching at the workplace skills management platform Skillsoft, tells Fast Company. “It’s not just that the interaction risks feeling like AI—because it does—but you’re actually compromising trust with the person.”
Rinne calls this outsourcing of difficult conversations “social offloading.” It’s particularly problematic when leaders resort to it, Rinne says, because it “almost regresses their ability to have the hard conversations.”
“Now you’re less in the moment and less able to do this thing that leaders need to be able to do,” she says. It’s a problem for everyone involved: The boss isn’t developing the skill of communicating more clearly, and