A personal take on keeping it real in the world of fast-moving AI
Microsoft Copilot is rapidly changing how we get things done across Microsoft 365. You see it everywhere now—in Word, Excel, Outlook, Teams, and SharePoint—popping up with suggestions, drafting summaries and even generating slides and emails. Let’s be honest, it’s powerful. It’s incredibly useful, saves heaps of time and it’s definitely here to stay. But as I’ve been working with it, a big question keeps coming to mind: where do we let AI assist, and where do we absolutely need to keep it human?
Where Do We Draw the Line with AI?
This is part personal reflection, part gentle nudge against the creeping perfectionism in the corporate world, and part reality check. It’s all brought to life by a story from my sister and a very telling reaction from my mum. So, let’s unpack what we’re gaining with Copilot and what we might be giving up if we’re not careful.
The Pressure to Be Perfect
When I first started creating training content for Simply SharePoint, I noticed a lot of content out there that looked utterly flawless: perfect audio, immaculate slide decks, impeccably written copy, all delivered by perfectly calm voices, often not human ones. It was slick and shiny, and it made me pause. It felt like that was the benchmark, and I thought I had to keep up.
So, I started playing around with AI tools. I tried voiceover generators, video narration, even crafting a synthetic version of myself. I uploaded my scripts, used an AI version of my own voice and tested what it would sound like to have a digital avatar deliver my training videos.
Technically, it worked a treat. But when I watched it back, my heart just sank. It didn’t feel like me. It looked good, it sounded clean, but it was empty. It had no life, no personality, no rhythm. Just a faceless voice reciting lines.
Here’s the thing: we’ve seen this pressure before. On Instagram, with curated ‘perfect’ lives. In magazines, with ‘perfect’ bodies. Now, it’s quietly creeping into our professional world. We’re seeing perfect presentations, perfect scripts and AI-generated voices and avatars pretending to be us. And that’s dangerous.
Because it creates this illusion that if you don’t sound like a robot or look like a broadcast presenter, you’re not professional. And that’s simply not true.
It made me realise we’ve allowed this Instagram-style perfectionism to quietly seep into professional spaces. Everything is starting to look too perfect, too scripted, too polished. This might seem harmless, but it’s not. In trying to make things look and sound professional, we’re creating content that doesn’t feel real anymore. And when content loses that human touch, people stop trusting it.
The Feedback That Changed Everything
When I shared my AI-generated content with a few trusted folks, the reaction was immediate: “It sounds robotic.” “This doesn’t look or feel like you.” “I don’t trust it. I want to hear a real voice.” And I completely agreed. What I do, and what we all do when we teach, explain and share knowledge, is about more than just words on a screen. It’s about trust, and trust doesn’t come from a perfect voiceover. It comes from presence.
An Academic's Perspective
Around that time, I was chatting with my mum, Judy. My mum isn’t your average end-user. She’s a retired scientist, an academic and one of the smartest people I know. She’s a power user who’s been writing, researching and thinking deeply her whole life. One day, she opened Word, sat down to write something and Microsoft Copilot popped up. Her reaction? “It’s just sitting there, watching me. I don’t want help. I just want to write.”
She didn’t want assistance. She didn’t want prompts. She wanted to think, to get into a flow, to work on something important without being interrupted by a tool that assumed it knew what she needed. And I thought, wow, that’s the quiet tension a lot of people are feeling right now. We’ve gone from tools being invisible helpers to very visible presences, suggesting, pushing, even demanding input before we’ve had time to process our own ideas. This isn’t a fear of technology; it’s resistance to being over-automated.
It’s worth noting she’s not anti-AI at all. I remember a conversation we had months ago when she finally listened to me and tried out ChatGPT. She was on the phone to me for about an hour, going on and on about how wonderful it was and how it had changed her life. She was using it as a coach to help her research, among other things.
A Flight Attendant’s Perspective
Then came a conversation with my sister. She’s a flight attendant, travels constantly, and sees a lot. She’s also incredibly sharp, noticing patterns and inconsistencies most people overlook. She stumbled across something online that honestly shocked her. She found two completely different courses—one teaching Spanish for beginners, and another offering financial investment advice. Different topics, different audiences, but both courses were presented by the exact same AI-generated avatar and voice. Same tone, same gestures, same facial movements. One day, ‘she’ was teaching verbs in Spanish; the next, ‘she’ was explaining portfolio diversification and compound interest.
That’s when my sister said, “This is not okay. You can tell it’s not real. It’s been thrown together. There’s no credibility. Anyone can fake anything now.” And that’s scary. She’s absolutely right. If we’re flooding the internet and the workplace with synthetic ‘experts’, we’re not just lowering the bar; we’re erasing the value of real experience.
And this isn’t just a ‘boomers versus AI’ thing either. I’ve seen it with my daughter’s generation too. She walked in the other day and asked suspiciously, “Did you do that with AI?” There’s a real caution now, a pushback. People want to know there’s a human behind the content they’re consuming, and I’m here for it.
Why This Matters in Microsoft 365
Let’s bring this back to Microsoft Copilot. We’ve reached a point where anyone can generate a presentation, a summary, a document, a course, an onboarding guide, or a newsletter, all without contributing any genuine insights. That’s fantastic for productivity, but terrible for trust, especially when the person consuming that content is expecting real, grounded help.
If you’re using Microsoft Copilot to speed up formatting or summarise long meeting notes, that’s brilliant. If it helps you unblock a task, save time, or reduce admin, awesome. But if you’re training others, teaching complex tools like SharePoint, or trying to communicate something that involves empathy, strategy, or nuance, that needs you. AI can help, but it can’t replace your judgment, your voice, or your experience.
How I Use AI (And My Disclaimer!)
So, here’s my disclaimer: I use AI. I use it every single day. I use ChatGPT when I get stuck on a sentence. I brainstorm ideas with it. I even use it to help create visuals, but they’re always based on my ideas. Every piece of content I produce comes from me. AI is a tool I use to speed things up, not a replacement for my voice or experience.
In practice, I use ChatGPT to help rephrase things or structure rough outlines. I use Canva for visuals, sometimes with AI prompts as a starting point. I use Adobe Enhance to clean up my podcast audio, and Clipchamp to record my real voice quickly.
But I don’t use AI-generated voiceovers, AI presenters, or scripted bots delivering content in place of me. Because if you’re learning from me, you’re trusting that I know this stuff because I’ve lived it. I’ve done the migrations. I’ve restructured the document libraries. I’ve worked with stakeholders, and I’ve cleaned up SharePoint messes. That’s what I bring. And that’s what AI can’t replicate.
Building Better Workplaces with AI and Authenticity
So, here’s where I’ll leave you, or rather, where I’ll slow down, pause, and ask you to think with me. While this discussion started with Copilot and the wave of AI running through Microsoft 365, it’s really about something deeper:
- What kind of workplaces are we building?
- What kind of content are we putting into the world?
- And what kind of professionals do we want to be?
We’re living through what I believe will be one of the most transformational periods in digital work history. Microsoft isn’t just suggesting Copilot; it’s baking it into everything. Open Word, Copilot’s there. Start a meeting in Teams, Copilot’s ready. Navigate SharePoint, Copilot is watching, suggesting, summarising. Even PowerPoint is offering to build your slides before you’ve had a chance to think.
Sure, it’s exciting, it’s efficient, it’s revolutionary. But it’s also a little unsettling. What happens when every meeting summary, every project report, every client proposal sounds the same? What happens when nuance, tone, and personality—things that used to help us stand out—get flattened by templated AI suggestions?
I’ve already seen it happening at client sites. You know the kind of content I’m talking about: the Copilot-generated email with just enough polish but none of the grit. The SharePoint page that looks clean but lacks real insight. The training material that hits the mark technically but feels like it could have come from anyone. It’s all fine, but it’s not memorable. It’s not sticky. And that’s the risk we run: AI-generated content that feels like a checkbox, not a connection.
Your Voice Matters More Than Ever
AI is smart, but you’re wiser. Copilot can draw on nearly everything you’ve ever typed into your tenant: every policy document, every template, every training manual. But it doesn’t know what really happens in your organisation. It doesn’t know the personalities on your team. It doesn’t know how that one manager likes to frame things or how your executive prefers bullet points to paragraphs. It doesn’t know what went wrong last quarter and how you finally fixed it.
That’s what you bring to the table: your lived experience, your insight, your voice. And that’s what makes SharePoint training, Microsoft 365 governance, and workplace change management successful. It’s never just about showing someone how to click a button. It’s about guiding them through how that button fits into a real process, with real people and real challenges.
Don’t get me wrong, though. This isn’t a call to abandon Copilot. I use it daily for things like summarising meetings, rewriting dry documentation into something snappier, and drafting repetitive process guides so I can tweak the tone later. But I don’t let it lead. And I don’t hand over anything with my name on it until I’ve had my say. In fact, this whole blog post was inspired by AI, but shaped by real stories, real moments, and real people. That’s what gives it weight.
Some Questions to Ask Yourself
Here are a few questions I keep in my back pocket before I use AI to publish or share something:
- Am I trying to speed something up or avoid doing the thinking? There’s a difference between efficiency and avoidance.
- Will this content be trusted more because it’s faster or because it’s real? If it’s a sensitive topic or strategy, people want clarity and authenticity.
- Would I feel comfortable presenting this face-to-face? If not, it probably needs more of me in it.
These questions have saved me from publishing some very slick but very soulless work.
Intentionally Using AI in Microsoft 365
Copilot is here. It’s powerful, it’s helpful, and it’s the direction Microsoft is heading. But I believe the future of work, especially in Microsoft 365, belongs not to those who adopt AI blindly, but to those who use it intentionally. Those who know when to automate and when to speak, when to summarise and when to explain, when to publish and when to pause and rewrite.
We need both automation and authenticity. We need both speed and storytelling. We need both AI and your voice.
Want to draw the line for yourself? I’ve put together a simple, quick decision matrix to help you figure this out. It outlines the most common Microsoft 365 tasks—everything from creating meeting notes to training materials—and categorises them by:
- Green: Great for AI
- Yellow: Use AI with human input
- Red: Keep it human
It’s not a perfect science, but it’s a start. More importantly, it’s a conversation tool. You may not agree with some of my suggestions, but share it with your team. Ask them where they draw the line. Use it to spark better decisions, not just faster ones. You’ll find the download link for this below.
If you take anything away from this, I hope it’s this: your voice matters, even in an age of automation—especially in an age of automation. Because the tools may be smart, but you’re still the expert.
Final Thoughts
Copilot is changing how we work—but it shouldn’t change who we are. In a world full of automation, authenticity is your superpower. Keep using your voice. Keep showing your experience. Keep it real.
Thanks for reading—and if you want more thoughts like this, check out the following podcast episode that inspired this post: