Microsoft’s generative AI assistant is integrated into a host of the company's workplace apps. Here’s what enterprises need to know about the renamed ‘Copilot for Microsoft 365.’

Initially called Microsoft 365 Copilot when it launched in November 2023, the renamed “Copilot for Microsoft 365” brings a range of generative AI (genAI) features to office productivity apps such as Word, Outlook, Teams, and Excel. In a blog post announcing the tool, Microsoft CEO Satya Nadella described it as “the next major step in the evolution of how we interact with computing…. With our new copilot for work, we’re giving people more agency and making technology more accessible through the most universal interface — natural language.”

At launch, Microsoft explained that the Copilot “system” consists of three elements: Microsoft 365 apps such as Word, Excel, and Teams, where users interact with the AI assistant; Microsoft Graph, which includes files, documents, and data across the Microsoft 365 environment; and the OpenAI models that process user prompts, such as the GPT-4 large language model and the DALL-E 3 model for image generation.

With the tool, Microsoft aims to create a “more usable, functional assistant” for work, J.P. Gownder, vice president and principal analyst on Forrester’s Future of Work team, told Computerworld in fall 2023. “The concept is that you’re the ‘pilot,’ but the Copilot is there to take on tasks that can make life a lot easier.” Copilot for M365 is “part of a larger movement of generative AI that will clearly change the way that we do computing,” he said, noting how the technology has already been applied to a variety of job functions — from writing content to creating code — since ChatGPT, based on GPT-3.5, launched in late 2022.

A Forrester report last year predicted that 6.9 million US knowledge workers — around 8% of the total — would be using Copilot for M365 by the end of 2024.
Nadella talked up the effectiveness of the M365 Copilot during a 2023 earnings call, claiming customers had seen productivity gains in line with those of GitHub Copilot, the AI assistant aimed at developers that launched two years earlier. (For reference, GitHub has previously claimed developers were able to complete a single task 55% quicker thanks to GitHub Copilot, while acknowledging the challenges in measuring productivity.)

Even priced at $30 per user per month, there’s potential to deliver considerable value to businesses, assuming the Copilot delivers on its promise over time. Said Gownder: “The key issue is, ‘Does it actually save that time?’ because it’s hard to measure and we don’t really know for sure. But even conservative time savings estimates are pretty generous.”

Copilot for M365 is billed as providing employees with access to genAI without the security concerns of consumer genAI tools; Microsoft says its models aren’t trained on customer data, for instance. But deploying the tool presents significant challenges, said Avivah Litan, distinguished vice president analyst at Gartner. There are two primary business risks, she said: the potential for the Copilot to “hallucinate” and provide inaccurate information to users, and the ability of the Copilot’s language models to access huge swathes of corporate data that isn’t locked down properly.

“Information oversharing is one of the biggest issues people are going to face in the next few months, or six months to a year,” said Litan. “That’s where the rubber is going to hit the road on the risk — it’s not so much giving the data to Microsoft or OpenAI or Google, it’s all the exposure internally.”

Copilot for Microsoft 365 features: How do you use it?

Copilot interactions within apps can take a variety of forms, depending on the application.
In many cases, users will interact with it via the chat interface available in a sidebar; Copilot functionality is also built more directly into some apps, as a pop-up in a Word document or Outlook email, for instance. Here’s how the Copilot works in some M365 apps.

In a Word doc, it can suggest improvements to existing text or let users create a first draft from scratch. To generate a draft, a user can ask Copilot in natural language to create text based on a prompt, and can upload additional files and sources of information to guide the AI assistant. Once a draft is created, the user can edit the document, adjust the style, or ask the Copilot to redo the whole thing. A Copilot sidebar provides space for more interactions with the bot, which also suggests prompts to improve the draft, such as adding images or an FAQ section, or summarizing the text.

During a Teams video call, the Copilot provides a recap of what’s been discussed so far, with a brief overview of conversation points in real time. It’s also possible to ask the AI assistant for feedback on people’s views during a call, or what questions remain unresolved. Those unable to attend a particular meeting can send the AI assistant in their place to provide a summary of what they missed and action items they need to follow up on.

Copilot can help a Word user draft a proposal from meeting notes.

In PowerPoint, Copilot can automatically turn a Word document into draft slides that can then be adapted via natural language in the Copilot sidebar. It can also generate suggested speaker notes to go with the slides and add more images.

These are just some examples. Other apps that feature Copilot integration include Excel, Outlook, OneNote, Loop, and Whiteboard.

The other way to interact with Copilot is via a separate chat interface that’s accessible via Teams. Here, the Copilot works as a search tool that surfaces information from a range of sources, including documents, calendars, emails, and chats.
For instance, an employee could ask for an update on a project, and get a summary of relevant team communications and documents already created, with links to sources.

Microsoft will extend Copilot’s reach into other apps workers use via “plugins” — essentially third-party app integrations. These will allow the assistant to tap into data held in apps from other software vendors, including Atlassian, ServiceNow, and Mural. Fifty such plugins are available, with “thousands” more expected eventually, Microsoft said.

How much does Copilot cost — is it worth $30 per user, per month?

The main Microsoft 365 Copilot is available for enterprise customers on E3, E5, F1, and F3 plans, as well as Office E1, E3, and E5, and Apps for Enterprise. It’s also available for smaller business customers on the following plans: Business Basic, Business Standard, Business Premium, and Apps for Business. In each case, Copilot for Microsoft 365 costs an additional $30 per user each month.

It’s a significant extra expense given that M365 subscriptions start at $6 per user each month for Business Basic and go up to $55 per user each month for E5. Part of this is due to the high computing costs Microsoft incurs in running the Copilot, said Raúl Castañón, senior research analyst at 451 Research, a part of S&P Global Market Intelligence. “Microsoft is likely looking to avoid the challenges faced with GitHub Copilot, which was made generally available in mid-2022 for $10/month and, despite surpassing more than 1.5 million users, reportedly remains unprofitable,” said Castañón.

In addition to the core Copilot for M365, job role-specific Copilots are available as paid add-ons. Sales and service Copilots each cost an additional $20 per user each month, while a finance Copilot is currently in preview. The pricing strategy reflects Microsoft’s confidence in the impact that genAI will have on workforce productivity.
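The $30-per-seat price invites a quick break-even check. Here is a minimal sketch of the arithmetic, using an illustrative $120,000 salary and a deliberately modest four hours saved per month; all inputs are assumptions for illustration, not Microsoft figures:

```python
# Break-even sketch for a Copilot seat: is the time saved worth $30/month?
# All inputs are illustrative assumptions.
ANNUAL_SALARY = 120_000        # example knowledge-worker salary, USD
WORK_HOURS_PER_YEAR = 2_080    # 40 hours/week * 52 weeks
HOURS_SAVED_PER_MONTH = 4      # a deliberately conservative estimate
SEAT_COST_PER_MONTH = 30       # Copilot for M365 add-on price, USD

hourly_rate = ANNUAL_SALARY / WORK_HOURS_PER_YEAR
monthly_value = hourly_rate * HOURS_SAVED_PER_MONTH

print(f"hourly rate:        ${hourly_rate:,.2f}")
print(f"time saved / month: ${monthly_value:,.2f}")
print(f"net value / month:  ${monthly_value - SEAT_COST_PER_MONTH:,.2f}")
```

With these inputs the saved time is worth roughly $230 a month, comfortably above the license fee; even halving the salary still clears the $30 bar, which is the heart of the business-case math analysts apply to Copilot.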
Per Forrester’s calculations in the “Build Your Business Case For Microsoft 365 Copilot” report, an employee earning $120,000 annually — roughly $57 per hour — might save four hours a month on various productivity tasks; those four hours would be worth around $230 a month. In that scenario, it would make sense to invest in Copilot even for an employee earning half that amount, and that’s leaving aside less tangible employee-experience benefits of automating mundane tasks.

There are, as Forrester points out, other costs to consider beyond licensing — employee training, for instance, as employees learn the new technology. Gartner also predicts that enterprise security spending will increase in the region of 10% to 15% over the next couple of years as a result of efforts to secure genAI tools (not just M365 Copilot).

Businesses are likely to take a cautious approach to deploying the Microsoft tool, at least at first. Microsoft expects revenue related to M365 Copilot to “grow gradually over time,” CFO Amy Hood said during the company’s Q1 2024 earnings call. On the same call, Nadella noted that Copilot will be subject to the usual “enterprise cycle times in terms of adoption and ramp.” Even if the pace of adoption is gradual, there appears to be plenty of interest in deploying it. Forrester expects around a third of M365 customers in the US to invest in Copilot in the first year; companies that do so will provide licenses to around 40% of employees during this period, the firm estimated.

(Note: while not actually branded as Copilot, Microsoft also makes some genAI features available in Teams Premium, including AI-generated notes, AI-generated tasks, and live translations in video calls, all powered by OpenAI models. For businesses mostly interested in AI assistant features for meetings, this offers a cheaper option than a full Copilot for M365 subscription.)

What are Microsoft’s other Copilots?
Microsoft’s Copilot is embedded in a wide array of products. Beyond the M365 suite, there are Copilots for Dynamics, Power BI, GitHub, and Microsoft’s security suite. And then there are Copilots aimed primarily at consumer, rather than business, users.

Microsoft launched Copilot Pro in January 2024, a $20-a-month subscription that provides individuals with similar functionality to Copilot for M365. Copilot Pro customers gain access to the Copilot chatbot and genAI image creation, as well as AI assistant features in the free web versions of apps such as Word, Excel, PowerPoint, and Outlook (though not Teams). Those with Microsoft 365 Personal and Family subscriptions can also access the Copilot in desktop apps. There’s also a free version of the Copilot with access to chatbot functionality only.

The Copilot chat interface is accessible in several ways by both paid and free users: there’s a dedicated web page, a mobile app, and a chatbot built into the Windows operating system, Edge browser, and Bing search engine.

How are early customers using Copilot?
Prior to launch, many businesses accessed Copilot for M365 as part of a paid early access program (EAP); it began with a small number of participants before growing to several hundred customers, including Chevron, Goodyear, and General Motors.

One of those involved in the EAP was marketing firm Dentsu, which began by deploying 300 licenses to tech staff and then to employees across its business lines globally. The most popular use case so far is summarization of information generated in M365 apps — a Teams call being one example. “Summarization is definitely the most common use case we see right out of the box, because it’s an easy prompt: you don’t really have to do any prompt engineering…, it’s suggested by Copilot,” said Kate Slade, director of emerging technology enablement at Dentsu.

Staffers would also use M365 Chat functions to prepare for meetings, for instance, with the ability to quickly pull information from different sources. This could mean finding information from a project several years ago “without having to hunt through a folder maze,” said Slade.

The feedback from workers at Dentsu has been overwhelmingly positive, said Slade, with a waiting list now in place for those who want to use the AI tool. “It’s reducing the time that they spend on [tasks] and giving them back time to be more creative, more strategic, or just be a human and connect peer to peer in Teams meetings,” she said. “That’s been one of the biggest impacts that we’ve seen…, just helping make time for the higher-level cognitive tasks that people have to do.”

Use cases have varied between roles. Dentsu’s graphic designers would get less value from using Copilot in PowerPoint, for example: “They’re going to create really visually stunning pieces themselves and not really be satisfied with that out-of-the-box capability,” said Slade.
“But those same creatives might get a lot of benefits from Copilot in Excel and being able to use natural language to say, ‘Hey, I need to do some analysis on this table,’ or ‘What are key trends from this data?’ or ‘I want to add a column that does this or that.’”

How does Copilot compare with other productivity and collaboration genAI tools?

Most vendors in the productivity and collaboration software market have added genAI to their offerings at this point. Google, Microsoft’s main competitor in the productivity software arena, launched Duet AI for Workspace in 2023 and later rebranded it, offering Gemini Enterprise ($30 per user each month) and Gemini Business ($20 per user each month) plans. Google’s AI assistant can summarize Gmail conversations, draft texts, and generate images in Workspace apps such as Docs, Sheets, and Slides. Slack, the collaboration software firm owned by Salesforce and a rival to Microsoft Teams, launched its Slack AI feature in February. Other firms that compete with elements of the Microsoft 365 portfolio, such as Zoom, Box, Coda, and Cisco, have also touted genAI plans. Meanwhile, Apple has announced that it will build generative AI features into its range of productivity tools. Then there are the AI-specific tools, such as OpenAI’s ChatGPT, as well as Claude, Perplexity AI, Jasper AI, and others, that also provide text generation and document summarization features.

Copilot has some advantages over rivals. One is Microsoft’s dominant position in the productivity and collaboration software market, said Castañón. “The key advantage the Microsoft 365 Copilot will have is that — like other previous initiatives such as Teams — it has a ‘ready-made’ opportunity with Microsoft’s collaboration and productivity portfolio and its extensive global footprint,” he said.
Microsoft’s close partnership with OpenAI (Microsoft has invested billions of dollars in the company on several occasions since 2019 and holds a large non-controlling stake in the business) likely helped it build generative AI into its applications at a faster rate than rivals. “Its investment in OpenAI has already had an impact, allowing it to accelerate the use of generative AI/LLMs in its products, jumping ahead of Google Cloud and other competitors,” said Castañón.

What are the genAI risks for businesses? ‘Hallucinations’ and data protection

Along with the potential benefits of genAI tools like Copilot for M365, businesses should consider the risks. These include the hallucinations large language models (LLMs) are prone to, where incorrect information is provided to employees.

“Copilot is generative AI — it definitely can hallucinate,” said Slade, citing the example of one employee who asked the Copilot to provide a summary of pro bono work completed that month to add to their timecard and send to their manager. A detailed two-page summary document was created without issue; however, the address of all meetings was given as “123 Main Street, City, USA” — an error that’s easily noticed, but an indication of the care required by users when relying on Copilot.

The occurrence of hallucinations can be reduced by improving prompts, but Dentsu staff have been advised to treat outputs from the genAI assistant with caution. “The more context you can give it generally, the closer you’re going to get to a final output,” said Slade. “But it’s never going to replace the need for human review and fact check.

“As much as you can, level-set expectations and communicate to your first users that this is still an evolving technology.
It’s a first draft, it’s not a final draft — it’s going to hallucinate and mess up sometimes.”

Tools that filter Copilot outputs are emerging that could help here, said Litan, but this is likely to remain a key challenge for businesses for the foreseeable future.

Another risk relates to one of the major strengths of the Copilot: its ability to sift through files and data across a company’s M365 environment using natural language inputs. While Copilot is only able to access files according to the permissions granted to individual employees, the reality is that businesses often fail to adequately label sensitive documents. This means individual employees might suddenly realize they can ask Copilot for details on payroll or customer information that hasn’t been locked down with the right permissions.

A 2022 report by data security firm Varonis claimed that one in 10 files hosted in SaaS environments is accessible by all staff; an earlier 2019 report put that figure — including cloud and on-prem files and folders — at 22%. In many cases, this means organization-wide permissions are granted to thousands of sensitive files, Varonis said.

Often the most important data, such as payroll, will have strict permissions in place. A greater challenge lies in securing unstructured data, with sensitive information finding its way into a wide range of documents created by individual employees — a store manager planning payroll in an Excel spreadsheet before updating a central system, for example. This is similar to a situation that the CTO of an unnamed US restaurant chain encountered during the EAP, said Litan. “There’s a lot of personal data that’s kept on spreadsheets belonging to individual managers,” said Litan.
“There’s also a lot of intellectual property that’s kept on Word documents in SharePoint or Teams or OneDrive.”

“You don’t realize how much you have access to in the average company,” said Matt Radolec, vice president for incident response and cloud operations at Varonis. “An assumption you could have is that people generally lock this stuff down: they do not. Things are generally open.”

Another consideration is that employees often end up storing files relating to their personal lives on work laptops. “Employees use their desktops for personal work, too — most of them don’t have separate laptops,” said Litan. “So you’re going to have to give employees time to get rid of all their personal data. And sometimes they can’t just take it off the system that easily, because the devices are locked down — you can’t put USB drives in [to corporate devices, in some cases].

“So it’s just a lot of processes companies have to go through. I’m on calls with clients every day on the risk. This one really hits them.”

Getting data governance in order is a process that could take businesses more than a year, said Litan. “There are no shortcuts. You’ve got to go through the entire organization and set up the permissions properly,” she said.

In Radolec’s view, very few M365 customers have yet adequately addressed the risks around data access within their organization. “I think a lot of them are just planning to do the blocking and tackling after they get started,” he said. “We’ll see to what degree of effectiveness that is [after launch]. We’re right around the corner from seeing how well people will fare with it.”

The Copilot for M365 pros and cons

Pros:

Boost to productivity. GenAI features can save time for users by automating certain tasks.

Breadth of features. Copilot for M365 is built into the productivity apps that many workers use on a daily basis, including Word, Excel, Outlook, and Teams.
Grounded in work data. Responses generated by Copilot for M365 are anchored in the emails, files, calendars, meetings, contacts, and other information contained in Microsoft 365. This means Copilot for M365 can arguably offer greater insights into work data than any other generative AI tool.

Enterprise-grade privacy and security controls. Unlike consumer AI assistants, Microsoft promises that customer data won’t be used to train Copilot models. It also offers tools to help manage access to data in M365 apps.

Cons:

Price. GenAI isn’t cheap, and M365 customers are required to pay a significant additional fee each month for access to Copilot features. An individual employee might not need access to Copilot in more than a couple of M365 apps.

Need for employee training. Getting the most out of genAI tools will require guidance around effective prompts, particularly for employees who are unfamiliar with the technology — an additional cost businesses must factor in.

Accuracy and hallucinations. LLMs are notoriously unreliable, confidently offering answers that are incorrect. This is a particular concern when it comes to business data, and users must be on the lookout for errors in Copilot outputs.

Data protection risks. The ability of Copilot for M365 to access a wide range of corporate data means businesses must be careful to ensure that sensitive documents are not exposed.

Limited functionality in some apps. The Copilot functionality in Excel, for example, is limited at this stage.

More on Copilot for Microsoft 365:

Copilot for Microsoft 365 deep dive: Productivity at a steep price
Microsoft Copilot Pro review: Office joins the genAI revolution
Microsoft extends Copilot access to individuals and SMBs