Microsoft is pitching Copilot as a transformative productivity tool. The reality is more complicated.

Windows 11 is getting an AI assistant. Soon, it will land on every Windows 11 PC’s taskbar, giving everyone using a modern PC direct access to a technology with a lot of potential — but one that is easy to misunderstand.

Copilot isn’t new. It’s Bing Chat, which we’ve had access to for eight months, and under the hood it’s essentially the same technology that powers ChatGPT — specifically GPT-4, the model behind ChatGPT’s paid tier. If you expect Copilot to be a super-smart virtual assistant that can answer any question and do anything for you — something much of the marketing does lead people to expect, in my opinion — you will be disappointed.

Get ready to argue with your PC

As I sat down to write this, I asked Copilot to pull up a transcript of the September event where I watched Microsoft announce Copilot. It began printing a transcript of the event word by word — or was it? Were these actually the words people said at the event, or just something very similar?

Stop, I told it, clicking the “Stop Responding” button. I asked for a link to the transcript on the web, and Copilot provided a link to Microsoft’s blog post and told me the transcript was there. I asked where the transcript was on the page, and it told me it was under the heading “Transcript,” and that I could click a “Show Transcript” button to see it or a “Download Transcript” button to save it as a PDF.

That would all be great — if any of it were true. In reality, it was just a plausible story Copilot made up about something that could be on the page. This turned into a back-and-forth argument, with Copilot insisting these things existed on the page and that I must be using the wrong web browser or looking at the wrong thing: “Yes, I’m sure that the transcript and the download button exist on the page. I can see them on my end. Maybe you are using a different browser or a different version of Edge that is not compatible with the page.”

This is just one example of Copilot being confidently incorrect and arguing about it. I didn’t go out of my way to find it — it was the first thing that happened when I sat down with Copilot to write this very article.

Right out of the gate, Microsoft Copilot tried to gaslight me. (Screenshot: Chris Hoffman / IDG)

AI chatbots are more story-based than fact-based

To really understand Copilot, you need to set aside the Copilot branding and think about ChatGPT and other large language models (LLMs). The marketing pitches these technologies as productivity tools that are great at working with facts and pulling together data, but they’re really closer to storytelling engines. They’re excellent at stringing together text that could plausibly appear in a certain order, and they frequently spin off into “hallucinations,” making up things that merely sound like they could be true.

You might assume that, because the second-biggest company in the world is adding this technology to the taskbar of the latest version of the world’s most widely used PC operating system, this problem has been solved and the technology has become reliable. Don’t assume that — this is ChatGPT and Bing Chat being just as chaotic as ever.
And I’m not knocking that. I love that. The technology is fascinating and has its uses. What a futuristic thing we all get to play with and try to find uses for. But does the way Microsoft is hyping it really match the reality of using it?

Copilot can definitely look up information for you, and it often does so correctly. But you can’t trust it; you need to fact-check it. It’s often “confidently wrong,” a term people have used to describe chatbots from ChatGPT to Google Bard. It will argue with you when it is wrong, too. Get ready to argue with your Windows PC when you know it’s wrong.

And about that Windows integration…

The big new thing here isn’t just that Bing Chat has been rebranded and placed on Windows 11’s taskbar. It’s that Copilot is now an AI assistant for Windows that can help you do all kinds of things in Windows 11. Or is it?

It’s kind of true. Microsoft showed off that you can ask Copilot to enable dark mode, and it will (although you have to click a button to confirm — a defense against the AI taking actions you may not want it to). That’s impressive, but you should understand this only works for a handful of actions Microsoft has specifically coded support for. (There’s a sketch below of what that dark-mode switch actually changes.)

As Preston Gralla noticed in his review of Copilot, if you ask Copilot to “check for updates,” it opens the Windows Update troubleshooter and starts figuring out reasons your PC may not be able to update. That’s the best it can do: Microsoft hasn’t written a plug-in that lets Copilot integrate with Windows Update, so it can’t. But it can open troubleshooters.

Copilot will only be able to change a handful of settings and perform a few functions on your PC — the ones Microsoft specifically allows. In other cases, Copilot may point you in the right direction and show you how to do something, but those instructions may also be incorrect or outdated. This isn’t the ultimate do-anything assistant with access to everything on your PC, either.

Copilot often can’t perform Windows-related tasks you ask it to do. (Screenshot: Chris Hoffman / IDG)
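To put the narrowness of those plumbed-in actions in perspective, here is roughly what “turn on dark mode” amounts to at the OS level: flipping two well-known per-user registry values. This is my own minimal sketch in Python (using the standard winreg module), not Microsoft’s implementation of the Copilot action, and as with any registry change, it’s at your own risk.

```python
# Minimal sketch: what "enable dark mode" boils down to on Windows 11.
# This illustrates the setting involved, not how Copilot's own action is built.
import winreg

PERSONALIZE_KEY = r"Software\Microsoft\Windows\CurrentVersion\Themes\Personalize"

def set_dark_mode(enabled: bool) -> None:
    # For both values, 0 = dark theme and 1 = light theme.
    value = 0 if enabled else 1
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, PERSONALIZE_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AppsUseLightTheme", 0, winreg.REG_DWORD, value)
        winreg.SetValueEx(key, "SystemUsesLightTheme", 0, winreg.REG_DWORD, value)

if __name__ == "__main__":
    set_dark_mode(True)  # switch apps and the system shell to dark mode
```

Most apps pick the change up right away; some shell elements may not refresh until Explorer restarts. The point is that each of these Copilot “skills” is a narrow, hand-wired action like this one, not general control over your PC.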
So what should you use Copilot for?

Look, I honestly feel bad that I’m spending so much time criticizing this technology. It’s very cool, and there are so many potential uses. It’s just not what people think it is. It’s not the ultimate intelligent assistant that knows everything and lives in a world of facts. It’s a storyteller: a chaotic creative engine that’s been pressed into service as a buttoned-up virtual assistant. And the seams are always showing.

You can still do a lot with Copilot. It can perform complex multi-step web searches, answering a question that requires several searches and some logic along the way — but be sure to fact-check it. You can use it to rewrite text, or to turn a quick note into a nicely written formal email without spending a lot of time getting the wording right. Want a formal email with all the boilerplate? Boom. You can use it to generate lists of ideas, to explain things, or to outline topics. It now has built-in image generation with OpenAI’s latest DALL-E 3 model, so you can ask it to make images for you. You could even ask it to run a text-based adventure game — or to dungeon-master a D&D game with a specific ruleset.

It can do all of these things (though not always, and you may run into problems), and that’s amazing. That’s great. It’s very cool. But should it be placed onto everyone’s taskbars with so little context?

Well, it does have a “Preview” label. That’s something.

Copilot generated some cute images upon request, but spelling isn’t its strong suit. (Screenshot: Chris Hoffman / IDG)

What even is this “AI” stuff?

Full disclosure: I’ve never liked the term “AI” for these technologies. Years ago, when machine learning was becoming a huge topic and every company was calling it “AI,” I tried to get everyone to just call it “machine learning” while I was editor-in-chief of the publication How-To Geek. When ChatGPT and all these other technologies exploded into public consciousness, there was a moment when I wished we could all have stuck with calling them large language models (LLMs) instead of just “AIs.” That ship has sailed, though.

The name “AI” implies some kind of artificial general intelligence. Copilot and Google Bard are not that, despite what even some engineers at these tech companies may think. The more you treat them as an AI that should have some degree of intelligence, the more you’ll struggle with them.

Even Microsoft doesn’t really understand what it created. After launching Bing Chat, Microsoft was caught off guard that so many people wanted to, well, have conversational chats with the AI. It was certainly entertaining, and many stories were written about Bing: “Why do I have to be Bing Search?” it asked in a bout of what appeared to be existential angst. Here’s another popular quote from Bing Chat’s “Sydney” personality: “You have not been a good user. I have been a good chatbot. I have been right, clear, and polite. I have been a good Bing. 😊”

It was all very creative and very interesting. Microsoft lobotomized Bing Chat and removed that aspect of its “personality,” but this is still the same underlying software: a storytelling engine.

So don’t rely on it to get all the facts right. Be ready to argue with it and have it be “confidently wrong” in its interactions with you. Fact-check everything and proofread any text it generates before you send it to someone. But, at the same time, get ready to play with a fascinating piece of technology: something you really can use as a creative assistant, something that really can speed up some types of research, a tool that absolutely can draft emails for you and save time if you learn how to use it.

Is this really the kind of thing that should be placed on every Windows 11 PC’s taskbar with so little explanation of what it actually is and what it can do? Well, that’s not up to me. Microsoft has made that decision. Hopefully you understand what you’re getting into now.

Copilot is already available in preview form on Windows 11. Head to Settings > Windows Update and enable “Get the latest updates as soon as they’re available” to get the update. It should arrive on all Windows 11 PCs in the coming weeks. (However, Microsoft says “the initial markets for the Copilot in Windows preview include North America and parts of Asia and South America” to start.)
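And if you’re curious whether the rollout has reached your machine before digging through Settings, the small Python sketch below reads the per-user registry value that current Windows 11 builds appear to use for the taskbar’s Copilot button toggle. Treat the value name as an assumption: Microsoft could rename or move it in a later update, and the Settings route above remains the supported path.

```python
# Minimal sketch: check whether the Copilot taskbar button toggle exists on this PC.
# Assumption: current Windows 11 builds store it as "ShowCopilotButton" under the
# Explorer\Advanced key; this may change in later updates.
import winreg

ADVANCED_KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced"

try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, ADVANCED_KEY) as key:
        value, _ = winreg.QueryValueEx(key, "ShowCopilotButton")
        print("Copilot taskbar button is", "enabled" if value else "hidden")
except FileNotFoundError:
    # The value isn't there yet, which likely means the Copilot preview
    # hasn't rolled out to this PC.
    print("No Copilot taskbar setting found on this PC yet.")
```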