
Copilot for Microsoft 365: how to get the most out of your AI assistant


Copilot for Microsoft 365 is a powerful AI assistant that helps you to analyse and locate information, create content, be more productive and meet more effectively. But how do you get started with Copilot? How do you make sure you are using it effectively and responsibly? And how do you unleash its full potential to create amazing content and achieve meaningful outcomes?

In this blog, Advania’s Director of Client Technology Value Dan Coleby will share some tips and best practices based on our expertise at Advania UK and experience gained through our client engagements. At Advania, we are seeing great value from Copilot internally and have licensed it for the whole of our organisation to put Copilot’s power into the hands of every user. Here’s how you can do the same.

Getting ready for Copilot for Microsoft 365

Before you can start using Copilot, it’s crucial that you comply with the prerequisites of technology adoption and plan how your organisation is going to successfully embrace the use of this AI assistant. We’ve helped many customers start from this position and have crafted this process into a successful and value-driven journey.

Getting your Copilot licence

A Microsoft licence is required for each user who wishes to make use of Copilot for Microsoft 365. The good news is that Copilot is available as an add-on to all existing Microsoft 365 licences.

Copilot for Microsoft 365 integrates with your familiar and most relied-on apps, including Word, PowerPoint, Excel, Outlook, Teams, OneNote, Whiteboard, Loop and Forms. For Copilot to flourish across your environment, and for you to leverage the greatest value in your daily routine, users’ identities, data and documents must all be in the cloud.

Security considerations for Copilot

Many organisations are concerned about the security risks of using AI, including Copilot for Microsoft 365. A lot of our customers have chosen to work with us to better understand the security risks involved.

The main consideration when adopting Copilot is whether users have access to documents and information that they should not. When a user receives a Copilot licence, Copilot can see all the Microsoft 365 information that is available to the user. This may seem obvious, and not a point of concern.

However, documents are sometimes overshared despite containing sensitive and confidential data. Users also sometimes have access to documents that they should not, and may previously have been unaware of this. When surfacing data and information, Copilot is more likely to discover this content than users ever were through manual browsing or searching.

The risk, therefore, that Copilot poses is not new – but it makes any existing risks in your tenant more visible.

Understanding your sharing risks – before and after Copilot adoption

In our experience, it is good practice to understand the sharing risk that exists within your documents and data on SharePoint and across Microsoft 365. We do not, however, recommend delaying the value of Copilot in order to complete this process: many of our customers have started to deliver AI pilot schemes whilst assessing their sharing risk at the same time. We took this approach internally, asking each of our initial Copilot users to sign an NDA requiring them to report any sensitive data they discovered, and not to share it.

The good news is that Copilot has been shown to have strong internal guardrails. For example, if I ask Copilot to tell me the salary of our CEO, its response will be “I won’t tell you that” rather than “I don’t have that information to hand”. After many months of active Copilot use within Advania UK, it has not been responsible for any sensitive data making its way into the wrong hands. Of course, we followed our own advice: as we began rolling out Copilot to our people, we simultaneously assessed the sharing risk across our organisation and remediated the main issues that we found.

Copilot: assessing the business case

Another area that customers frequently want to assess prior to investment is the business case for the technology. You may have experience with earlier iterations of AI, but knowing the differences between old and new is crucial.

Copilot and other generative AI assistants are distinct from the robotic process automation (RPA) AI that you may have used in recent years. With RPA, the business case and transformation are obvious: you take a process which is currently delivered by humans and fully automate it using the technology and AI. Once you have changed the process and implemented the new automated one, you cannot accidentally fall back to the old process. The business case for change is very clear and the return on investment (ROI) is usually significant and fast.

Copilot is a generative AI assistant that you provide to all your staff. Each staff member will use Copilot in their own way and get different benefits from its features. The value to the organisation is the sum of the value to all users. To get value from Copilot, each user needs more than just a licence: they need to adapt their tried and tested work habits, evolving the way they work and building a new model of user and AI working in partnership. Copilot licensing is a monthly fee per user, so if your people do not use the features, there is no return on your investment.

Assessing Microsoft Copilot ROI through Value Experimentation

Because the benefits of generative AI assistants like Copilot are not yet widely understood, the most effective way to assess the value to individuals and to your organisation is to experiment with it. We call this Value Experimentation.

By focusing on this method of Value Experimentation and by providing support through a structured and focused adoption process, your users will gain maximum value from Copilot and learn how to use it to its fullest potential. Additionally, you will capture the value that they are observing and build that data into a business case that can be extrapolated across your whole organisation.
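As a simple illustration of how the value captured during experimentation might feed a business case, the sketch below estimates a per-user monthly return and break-even point. All figures are illustrative assumptions for the sake of the example, not real Microsoft pricing or measured pilot data:

```python
# Illustrative Value Experimentation sums. Every figure here is an
# assumption for demonstration, not real licence pricing or pilot data.

def monthly_roi(licence_cost: float, hours_saved: float, hourly_rate: float) -> float:
    """Net monthly value per user: time saved minus the licence cost."""
    return hours_saved * hourly_rate - licence_cost

def breakeven_hours(licence_cost: float, hourly_rate: float) -> float:
    """Hours a user must save each month for the licence to pay for itself."""
    return licence_cost / hourly_rate

# Example: a £25/month licence, a £40/hour loaded staff cost, and
# 3 hours saved per month (the kind of figure a pilot would capture).
print(monthly_roi(25.0, 3.0, 40.0))   # net value per user per month
print(breakeven_hours(25.0, 40.0))    # hours/month needed to break even
```

Multiplying the per-user figure by your licensed headcount gives the extrapolated organisational value described above; the break-even figure gives each user a concrete usage target.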

Successful adoption and use

One of the most critical elements to consider when deploying Copilot for Microsoft 365 is user behaviour and adoption. Copilot, like AI assistants more generally, is not a tool that we are used to using. For many of us, leveraging these technologies requires us to substantially change the way that we work, often overturning years or decades of muscle memory and professional reflexes. Even for those who embrace this change, it can take a while to remember when to ask for Copilot’s help and when you are better off completing a task on your own.

Two elements of the user adoption journey that organisations need to plan for and accommodate to see successful Copilot adoption are setting expectations and encouraging confident engagement with the technology.

Microsoft Copilot: setting expectations

In Gartner’s hype cycle, we can observe a stage called the trough of disillusionment: the point at which users begin using a new technology for which expectations have been set very high, and the reality does not meet those expectations. With Copilot for Microsoft 365 there is a good chance that users will be disillusioned by some of the features and functionality that they encounter. There has been significant hype around AI in general and Copilot in particular, and the reality is that whilst it has amazing capability, it does not yet offer everything the marketing suggests it ultimately will.

It is important that you set reasonable expectations around which capabilities are mature and which are still developing. Part of this is understanding that Copilot presents itself differently in each of the Office applications it is integrated into. This is intentional: just as each application has a different purpose and focus, so does the Copilot that comes with it. Users need to understand the nuances between these applications and learn how to get the best out of Copilot across all of them.

Microsoft Copilot: encouraging adoption and dispelling concern

Many users are cautious about leveraging AI and new tools like Copilot. It is tempting to fear that the AI will replace them in their roles and make their position in the company redundant. This is not true. Copilot is exactly what it says: a co-pilot, not an autopilot. The user is still the pilot. The user is still in control. The user still runs the process and does the work, but now does so more quickly, efficiently, easily and accurately with Copilot’s help.

It’s essential that this message is well communicated to your users so that they embrace the value that Copilot can provide with open eyes rather than being wary of this new technology.

Unleash the power: the art of prompting your Copilot

Because Copilot is there to assist you, you need to tell it what to do. This is done through the instructions you provide, called prompts. Prompts are generally text-based, but can also be spoken aloud and transcribed into text that instructs Copilot.

One of the key elements of a successful adoption and change programme is making sure that users understand what the elements of a good prompt involve, how to prompt well and how to change their prompts if they don’t get the responses that they want.

Simple prompts result in generic, high-level responses. Although this can sometimes be enough, often it is not what the user is looking for. Poor prompting can be one of the biggest causes of disillusionment for users who are just getting to grips with Copilot, and it’s essential that they learn how to engage and prompt the tool properly.

How to prompt Copilot for Microsoft 365 effectively

So how do you write effective prompts that elicit the best suggestions from Copilot? Here are some tips and best practices to help you master the art.

Be clear and specific

Copilot works best when you give it a clear and specific goal or direction for what you want it to do. You should avoid vague or ambiguous prompts that could lead to multiple interpretations or outcomes.

You should also avoid prompts that are too broad or too complex, which would require too much information or context to be answered clearly. For example, instead of prompting “Write an article about Copilot”, you could prompt “Write an introduction paragraph for an article about Copilot for Microsoft 365”.

Be creative and curious

Copilot works best when you give it a creative or curious challenge. You should avoid boring or repetitive prompts that lead to dull or generic suggestions, as well as prompts that are so undemanding, or so demanding, that the effort of answering them yields little value.

For example, instead of prompting “Write a thank you email to a customer”, you could say “Write a thank you email to a customer who just gave us a positive review on social media and reference their review”.

Be respectful and responsible

Copilot works best when you give it a respectful and responsible purpose or intent. You should avoid offensive or inappropriate prompts that could lead to harmful or unethical suggestions, as well as prompts that could violate any laws, regulations, rights or values.

For example, instead of writing “Write a falsely positive testimonial for this product”, you could prompt “Write a testimonial for this product based on the feedback we received from our customers”.

Prompting for Copilot: a four-pronged approach

Based on the suggestions above, it should be clear that Copilot benefits from clarity and logic just as much as human beings do. To add a final layer of structure to your prompts, there are four considerations you should keep in mind when engaging your AI assistant. If you can remember these four things, you’ll be in good shape.

Successful and valuable prompts for Copilot should include:

  • A goal: what do you want?
  • Context: why do you need it?
  • Source: where should Copilot look for the answer?
  • Expectations: how do you want the task to be done?


If you were asking a colleague to perform a task for you, you would probably give them some elements of each of these components, rather than a vague instruction as to what you want. Copilot is your new colleague and is there to help, so long as you give it the information that it needs to do so. Offer it the same professional approach as your teammates and you will see the power that it can offer you.
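The four-pronged structure above can be sketched as a small helper that assembles a prompt from its parts. This is purely illustrative: the class, field names and example text are our own, not part of any Copilot API:

```python
# Illustrative only: sketches the goal/context/source/expectations
# structure as a prompt-assembly helper. Nothing here is a Copilot API.
from dataclasses import dataclass

@dataclass
class CopilotPrompt:
    goal: str          # what do you want?
    context: str       # why do you need it?
    source: str        # where should Copilot look?
    expectations: str  # how should the task be done?

    def render(self) -> str:
        """Join the four elements into a single prompt string."""
        return (f"{self.goal} {self.context} "
                f"Base your answer on {self.source}. {self.expectations}")

prompt = CopilotPrompt(
    goal="Draft a one-page project update for the Contoso rollout.",
    context="Our steering group meets on Friday and needs a concise status.",
    source="the latest status notes in the project's OneNote notebook",
    expectations="Use a formal tone and finish with three clear next steps.",
)
print(prompt.render())
```

Even if you never write a line of code, composing prompts mentally in this goal, context, source, expectations order produces the same effect.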

Copilot for Microsoft 365: making the extraordinary ordinary

Encouraging your people to engage with AI will, in time, become part and parcel of every IT strategy across the world. Organisations that take the bull by the horns and experiment, adapt and adopt early stand to see benefits that others will have to catch up on.

By instilling in your organisation a passion for AI and an understanding of how to gain the greatest value from your investment, you can leverage Copilot to get ahead of the competition, empower your people to their fullest potential and experience a world of work unimaginable just a few years ago.
