
The Chartered Institute of Arbitrators (CIArb) has recently published its much-anticipated guidelines on the use of Artificial Intelligence (AI) in arbitration. The Guidelines aim to support the effective and ethical use of AI tools in arbitration proceedings: they are not intended to be a comprehensive manual, but instead establish a framework to support informed decision-making. We summarise the key aspects of the CIArb Guidelines below.


Scope and Purpose of the CIArb Guidelines

The CIArb Guidelines provide practical guidance on the use of AI in arbitration, covering:

  1. The benefits and risks of the use of AI in arbitration (Part I);
  2. General recommendations on the use of AI in an arbitration (Part II);
  3. Arbitrators’ powers to give directions and make rulings on the use of AI by parties in arbitration (Part III); and
  4. The use of AI in arbitration by arbitrators (Part IV).

The Guidelines also provide a template agreement and template procedural orders (in both short and long form) on the use of AI in arbitration.


Benefits and Risks of AI in Arbitration

The Guidelines set the context by summarising the benefits and risks of AI in arbitration. Key benefits identified include enhanced efficiency and quality, and the potential to remedy inequality of arms. Use cases listed include legal research, data analysis, text generation, collection of evidence, translation and interpretation, transcription and case analysis.

Despite these benefits, the Guidelines highlight several risks, including concerns about the enforceability of arbitral awards, potential biases in AI algorithms, risks to confidentiality and data security, and the impact on due process rights.


Recommendations on the Use of AI in Arbitration

The Guidelines are clear that the use of an AI tool by any participant in an arbitration shall not in any way diminish the responsibility and accountability that would ordinarily apply to them. Parties and arbitrators are encouraged to make enquiries about any prospective AI tool, to weigh up its benefits and risks and to understand any applicable AI-related laws or regulations.

The CIArb Guidelines also provide several specific recommendations for the use of AI in arbitration:

  1. Arbitrators’ powers to give directions and make procedural rulings on the use of AI: The Guidelines confirm that the use of AI falls within the arbitrators' power to conduct the proceedings, subject to any applicable mandatory laws or party agreement, and provided that arbitrators do not seek to regulate the parties' private use of AI. For example, arbitrators can issue procedural orders that outline how AI technologies should be integrated into the process, they can appoint AI experts and they may require disclosure of the parties' use of AI tools.
  2. Party Autonomy: While arbitrators have the authority to regulate the use of AI, the Guidelines also emphasise the importance of party autonomy. Parties have the right to choose whether and how to use AI tools in their arbitration proceedings. The parties may also propose specific AI tools that may or may not be utilised, and can agree on their use within specific parameters.
  3. Ruling on use of AI and admissibility of AI-generated material in the arbitration record: In cases where disputes arise regarding the use of AI, arbitrators have the power to make rulings to resolve these issues, including determining the admissibility of AI-generated evidence and addressing challenges related to AI-assisted analysis.
  4. Disclosure: The Guidelines emphasise the importance of transparency, so that all participants know what role AI plays in the arbitration process, and to preserve the integrity of the arbitration and/or the validity and enforceability of the award. Parties may be required to disclose the use of AI tools in arbitration proceedings to the extent that their use may have an impact on the evidence or the outcome of the arbitration, or otherwise involve a delegation of an express duty owed to the arbitrators or any other party.


The Use of AI by Arbitrators

The Guidelines also address the use of AI by arbitrators themselves. Key points include:

  1. Discretion over use of AI: Arbitrators may leverage AI tools to enhance the arbitral process. For example, arbitrators may use AI to support more accurate and efficient processing of information. However, they should not relinquish their decision-making powers to AI and must always maintain independent judgment. They should independently verify the accuracy and correctness of information obtained through AI, ensuring their judgment is free from confirmation bias and other distortions.
  2. Transparency: Arbitrators are encouraged to consult with the parties on the use of any AI Tool and to provide the parties with an opportunity to comment on, and oppose, such tools being used by the arbitrators. If the parties disagree on the use of AI by the arbitrators, the arbitrators should refrain from using the specified AI Tool. An arbitrator sitting on a tribunal should also consult with their co-arbitrators on the use of any AI Tool in the context of their mandate.


Templates

The Guidelines include the following templates as a starting point for parties and arbitrators to customise according to their specific needs:

  • a template agreement which establishes a framework for parties and the tribunal to regulate the use of AI tools in arbitration proceedings, including agreement on permitted tools and their uses; and
  • two template procedural orders to facilitate the use of AI in arbitration, one of which is a short-form order that incorporates the CIArb Guidelines by way of guidance.


Comment

The CIArb Guidelines represent a significant step forward in integrating AI technology into arbitration whilst regulating and promoting best practice. As AI continues to evolve, the Guidelines will serve as a valuable resource to help navigate the complexities that can arise from its use.  

The launch of these Guidelines is also very timely given the enactment of the EU AI Act, which applies to AI systems used in the "administration of justice and democratic processes" (which includes arbitration). From August 2026, the Act will regulate the use of "high-risk" AI applications by arbitrators, specifically those used in "researching and interpreting facts and the law and in applying the law to a concrete set of facts, or to be used in a similar way". Undoubtedly, the CIArb drafting committee had the Act in mind when putting together guideline 8.2, which states that the tribunal should avoid delegating any tasks to AI Tools, such as legal analysis, research and interpretation of facts and law, or application of the law to the facts.

As the use of AI tools becomes more widespread, arbitrators and all participants in the arbitral process will need to work together to ensure the adoption of best practice, compliance with applicable laws and the preservation of the integrity of the arbitral process.

Key contacts

Charlie Morgan, Partner, London
Didon Misri, Associate (India), London
Elizabeth Kantor, Knowledge Lawyer, London