Managing risks and opportunities with AI

In a GC100 poll of 106 companies in September 2024, 8% of respondents reported that they already regularly used Copilot and Teams Premium to transcribe initial draft minutes; since then, there has been an influx of providers in the market offering tools that can prepare agendas, summarise discussions, and draft lists of action points. Before employing such AI tools in your company, it is essential to consider whether the use of AI is appropriate and, if so, whether all the necessary risk-mitigation steps have been taken.

What are the key risks?

  • AI tools cannot differentiate between contexts and tones or exercise discretion; they may fail to capture commercial nuance, and their presence in meetings may stifle candid discussion.
  • Confidential and sensitive discussions may become disclosable in future litigation or regulatory contexts, as part of subject access requests, or during due diligence processes. AI tools may not be able to identify legally privileged information, which could lead to an inadvertent loss of privilege.
  • Without careful review, there is a risk that AI will not deliver an accurate and impartial formal record of the decisions made.
  • When using AI tools that rely on third-party providers, organisations face the risk of data breaches and confidential information leaks.

How can you mitigate these risks?

  • The most effective risk-mitigation measure is to ensure human review by an employee of an appropriate level. This ensures the output is clear, accurate, and concise, with careful consideration given to the inclusion of any commercially sensitive or legally privileged information.
  • Companies should conduct thorough due diligence on any AI providers, including understanding the extent to which the provider uses data inputted by users for AI development or training activities and any processes that can lead to a leak of confidential company information.
  • Companies should also ensure suppliers comply with all relevant data protection regulations and have adequate security systems in place, so that their AI tools do not increase the company’s exposure to cyber-attacks, data breaches, or leaks.
  • Organisations should establish robust consent and communication processes regarding the use of AI tools internally and externally. Companies should think carefully before enabling AI tools as a default setting, and should ensure participants are aware of, and able to exercise, the option to opt out of such use.

Key takeaway

Consider carefully whether the benefits of using AI tools outweigh the significant risks. If they are used, human oversight, comprehensive due diligence on AI providers, and clear guidance are essential to protect organisations.

AUTHORS

Emily Miles Managing Associate

Emily is a managing associate with extensive experience advising corporate groups, investors and individuals on a wide range of corporate and commercial matters.

With expertise in private mergers and acquisitions, equity fundraisings, joint ventures, corporate restructurings and incentivisation schemes, Emily is a versatile transactional lawyer. She regularly advises both management teams and investors on complex, multijurisdictional transactions.

Emily works with clients in a broad range of sectors, with a particular focus on the retail & fashion and hospitality & leisure industries, advising global brands on strategic acquisitions, growth equity investments and general corporate advisory matters. Additionally, Emily advises ultra-high net worth (UHNW) clients and their family offices on their investment and acquisition activities in the UK.

Emily’s experience includes an in-house secondment at the head office of a global entertainment, hospitality and leisure brand. During this time, she worked closely with stakeholders and commercial teams to support the company’s diverse investment portfolio, giving her invaluable insight into the practical, operational needs of businesses.

Emily’s expertise has been recognised by The Legal 500, where she is listed as a “Key Lawyer” for M&A; clients commend her “exemplary prowess in corporate and M&A realms”. Emily graduated from the University of Oxford with an MA in history before converting to law. Before joining Harbottle & Lewis in 2017, Emily trained at DLA Piper where she spent time on secondment in the Sydney office.