
On 6 March 2020, the HKMA issued a circular setting out its supervisory expectations on risk management for algorithmic (algo) trading, as well as sound practices observed during thematic on-site examinations of seven authorised institutions (AIs) in 2019.

The on-site examinations covered seven AIs, mainly international banks using algorithms for making investment decisions. They followed an HKMA survey which found (among other things) that around 40% of the AIs surveyed were engaging in algo trading and that a majority of those AIs intended to expand the scale of their algo trading.

AIs which engage in algo trading activities or are looking to introduce an algo trading system should give due consideration to the supervisory expectations and sound practices when developing or enhancing their risk management framework.

The supervisory expectations reflect the HKMA’s minimum standards, while the sound practices are examples of the good measures adopted by the more advanced AIs as observed by the HKMA during the on-site examinations. AIs should implement risk management measures which meet the HKMA’s minimum standards and are commensurate with the nature, scale and complexity of their algo trading activities.

Below are the focus areas discussed by the HKMA.

Governance and oversight

The HKMA expects AIs to:
  • have in place proper governance and risk management frameworks to oversee and manage the risks associated with algo trading and to ensure that these risks remain within their risk appetite;
  • have in place a control function, as the second line of defence, which is independent of the front office;
  • ensure that their first and second lines of defence conduct regular reviews (at least once a year) to evaluate the performance of the algorithms implemented and whether the relevant governance, systems and controls, and business continuity planning remain adequate and effective;
  • ensure that their third line of defence (the internal audit function) performs regular reviews of algo trading activities to confirm that these activities are subject to proper governance and risk management.

The HKMA noted various sound practices adopted by the more advanced AIs, such as:

  • having dedicated governance bodies comprising representatives from relevant functions (such as front office, control function, finance, information technology and operations) for overseeing the AIs’ algo trading activities;
  • clearly setting out the respective roles and responsibilities of the first, second and third line of defence relating to algo trading;
  • where the underlying algorithms or systems are adopted from the AIs’ headquarters, being actively involved in managing the risks associated with local algo trading activities without unduly relying on oversight by their headquarters, as well as directly participating in group-level discussions to provide input from the local perspective;
  • having a control function which plays a proactive role in the key processes throughout the life cycle of the algorithms, and is staffed with algo trading experts who are given sufficient authority to challenge the front office and the means to take necessary action to mitigate risk;
  • including all key processes in the life cycle of the algorithms in regular reviews by the first and second line of defence, and ensuring that the results of the reviews are extensively discussed by the governance bodies and are taken into account when strengthening risk management for algo trading;
  • treating algo trading as a separate business area from general treasury activities in the internal audit programme and having a tailored programme for algo trading.

Development, testing and approval

The HKMA expects AIs to:
  • have in place an effective framework governing the development and testing of algorithms, to ensure that they behave as intended and comply with the relevant regulatory requirements and the institutions’ internal policies;
  • have in place robust approval policies and procedures to ensure that new algorithms, and changes to existing algorithms, are subject to proper testing, review and challenge before implementation;
  • ensure that the staff responsible for developing and testing algorithms possess the requisite expertise and experience.

The HKMA noted various sound practices adopted by the more advanced AIs, such as:

  • testing the robustness and resilience of algorithms (and their monitoring and controls) under stressed market conditions (a simplified illustration follows this list);
  • performing comprehensive tests on updated algorithms as if they were new algorithms;
  • using standardised approval templates for the evaluation of new algorithms or changes to algorithms to ensure that all necessary information is provided to the staff assigned with the approval authority;
  • conducting additional expert reviews when evaluating more complex algorithms.
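
Purely by way of illustration, stressed-market testing of the kind described above might take the form of replaying an algorithm against a synthetic stress scenario and asserting that its orders stay within the applicable controls. The sketch below does this for a hypothetical execution algorithm and a synthetic "flash crash" price path; the algorithm, its interface and all thresholds are assumptions made for the sketch and are not drawn from the circular.

    # Illustrative sketch only: replaying a hypothetical execution algorithm
    # against a stressed (synthetic "flash crash") price path and checking that
    # it never quotes outside its configured price collar. All names and
    # thresholds are assumptions, not requirements from the circular.

    def flash_crash_path(start=100.0, drop=0.15, steps=50):
        """Synthetic stressed price path: sharp drop followed by partial recovery."""
        low = start * (1 - drop)
        down = [start - (start - low) * i / (steps // 2) for i in range(steps // 2)]
        up = [low + (start * 0.95 - low) * i / (steps // 2) for i in range(steps // 2)]
        return down + up

    class NaiveExecutionAlgo:
        """Hypothetical stand-in for an execution algorithm under test."""
        def __init__(self, collar_pct=0.05):
            self.collar_pct = collar_pct

        def quote(self, reference_price):
            # The algorithm proposes an order price; the test checks the collar.
            return reference_price * 0.99

    def test_orders_stay_within_collar_under_stress():
        algo = NaiveExecutionAlgo(collar_pct=0.05)
        for px in flash_crash_path(start=100.0):
            order_price = algo.quote(px)
            lower, upper = px * (1 - algo.collar_pct), px * (1 + algo.collar_pct)
            assert lower <= order_price <= upper, (
                f"order price {order_price:.2f} outside collar [{lower:.2f}, {upper:.2f}]"
            )

    if __name__ == "__main__":
        test_orders_stay_within_collar_under_stress()
        print("stressed-market collar test passed")

The same harness could be re-run in full whenever an algorithm is updated, consistent with the sound practice of testing updated algorithms as if they were new.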

Risk monitoring and controls

The HKMA expects AIs to:
  • have in place a comprehensive set of pre-trade controls for algo trading activities to ensure that risks are managed prudently, such as risk limits based on the AI’s capital, trading strategy and risk tolerance, and price collars which block orders that do not satisfy pre-defined price parameters (a simplified illustration follows this list);
  • conduct real-time monitoring of algo trading activities and have in place real-time alerts to assist staff in identifying limit excesses and other abnormal trading activities, as well as automated surveillance tools to detect suspicious activities and possible conduct issues;
  • have in place a proper kill functionality as an emergency measure to suspend the use of an algorithm and cancel part or all of the unexecuted orders if necessary, together with a robust framework governing the activation of such functionality and the subsequent re-enablement of algo trading;
  • have in place a robust business continuity plan (subject to regular testing) setting out the contingency measures for dealing with possible adverse scenarios in which algo trading systems can no longer function normally, including fall-back solutions such as alternative arrangements to execute orders;
  • have in place proper security controls over physical and electronic access to algo trading systems, with reliable identity authentication and differentiated access controls, reviewed regularly, that reflect each staff member’s responsibilities and authority;
  • have in place robust policies and procedures for handling incidents relating to algo trading, including the timely implementation of remedial actions with proper audit trails, the escalation of incidents and remedial actions to the governance bodies and other responsible staff, and reviews of the adequacy and effectiveness of the remedial actions.
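
As a minimal sketch only (the order fields, limits and reference-price check below are illustrative assumptions, not prescriptions from the circular), pre-trade controls of the kind described in the first bullet might combine a hard notional limit, an early-warning threshold set inside that limit (one of the sound practices noted below) and a price collar:

    # Illustrative sketch only. Order fields, limits and the warning threshold
    # are hypothetical; real pre-trade controls would reflect the AI's capital,
    # trading strategy and risk tolerance, as described in the circular.
    from dataclasses import dataclass

    @dataclass
    class Order:
        algo_id: str
        quantity: int
        price: float

    @dataclass
    class PreTradeControls:
        max_order_notional: float    # hard per-order limit
        warn_order_notional: float   # early-warning threshold below the hard limit
        price_collar_pct: float      # maximum deviation from the reference price

        def check(self, order: Order, reference_price: float):
            """Return (accepted, messages); reject on a breach, warn near the limit."""
            messages = []
            notional = order.quantity * order.price

            if notional > self.max_order_notional:
                return False, [f"REJECT: notional {notional:,.0f} exceeds hard limit"]
            if notional > self.warn_order_notional:
                messages.append("WARN: notional approaching hard limit")

            deviation = abs(order.price - reference_price) / reference_price
            if deviation > self.price_collar_pct:
                return False, [f"REJECT: price deviates {deviation:.1%} from reference"]

            return True, messages

    controls = PreTradeControls(max_order_notional=5_000_000,
                                warn_order_notional=4_000_000,
                                price_collar_pct=0.03)
    accepted, notes = controls.check(Order("algo-eq-01", 42_000, 101.0),
                                     reference_price=100.0)
    print(accepted, notes)  # True ['WARN: notional approaching hard limit']

In practice such checks would be set at a granular level (per algorithm, trader and client) and the limits themselves reviewed regularly against the AI’s risk appetite and prevailing market conditions, as reflected in the sound practices below.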

The HKMA noted various sound practices adopted by the more advanced AIs, such as:

  • setting pre-trade controls at a granular level which are reviewed and analysed regularly to ensure that they are in line with the AI’s risk appetite and take into account the latest market conditions;
  • having in place alerts that are more stringent than the control limits to provide early warning signals;
  • setting up a kill functionality which can be activated at various levels (such as at the system, algorithm, trader and client levels), to minimise disruption to other algo trading activities unrelated to the reasons for activating it (see the sketch after this list);
  • developing a tailored business continuity plan which covers a wide range of scenarios for each major type of algorithm, having regard to the purpose of the algorithms and the markets and products to which the algorithms are applied;
  • initiating a holistic review of all relevant algorithms and associated controls following an incident to avoid recurrence of similar incidents in other algorithms, where appropriate in light of the nature and root causes of the incident.
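
The multi-level kill functionality mentioned above could, purely as an illustration, be modelled as a set of scoped switches consulted before any order is released; the scope names and identifiers below are assumptions made for the sketch, not terms taken from the circular.

    # Illustrative sketch only: a kill switch that can be activated at the
    # system, algorithm, trader or client level, so that suspending one scope
    # does not disrupt unrelated algo trading activity.
    class KillSwitch:
        LEVELS = ("system", "algo", "trader", "client")

        def __init__(self):
            self._active = {level: set() for level in self.LEVELS}
            self._system_halt = False

        def activate(self, level, identifier=None):
            """Suspend trading for the given scope (e.g. one algorithm or one client)."""
            if level == "system":
                self._system_halt = True
            else:
                self._active[level].add(identifier)

        def deactivate(self, level, identifier=None):
            """Re-enable a scope; in practice subject to a governed re-enablement process."""
            if level == "system":
                self._system_halt = False
            else:
                self._active[level].discard(identifier)

        def is_blocked(self, algo_id, trader_id, client_id):
            """True if any applicable scope has been killed for this order."""
            if self._system_halt:
                return True
            return (algo_id in self._active["algo"]
                    or trader_id in self._active["trader"]
                    or client_id in self._active["client"])

    ks = KillSwitch()
    ks.activate("algo", "algo-fx-07")
    print(ks.is_blocked("algo-fx-07", "trader-1", "client-A"))  # True
    print(ks.is_blocked("algo-eq-01", "trader-1", "client-A"))  # False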

Documentation

The HKMA expects AIs to:
  • maintain proper documentation so as to provide sufficient audit trails on the key processes throughout the life cycle of algorithms;
  • maintain a comprehensive inventory of documentation relating to all the algorithms implemented, including information on the trading strategies involved, the owner, the approver and approval date, the implementation date, the systems involved, the scope of application, the reviews undertaken and the applicable risk controls.

The HKMA noted various sound practices adopted by the more advanced AIs, such as maintaining two inventories, one for the algorithms implemented and the other for risk controls, to facilitate the identification of any inconsistencies in the risk controls across the implemented algorithms.
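
Purely as an illustration of the inventory items listed above, and of the two-inventory sound practice, such records might be captured in a structured form along the following lines (all field and identifier names are assumptions made for the sketch):

    # Illustrative sketch only: structured records for an algorithm inventory and
    # a separate risk-control inventory, cross-referenced by algorithm identifier
    # so that inconsistencies in controls across algorithms can be spotted.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class AlgorithmRecord:
        algo_id: str
        trading_strategy: str
        owner: str
        approver: str
        approval_date: date
        implementation_date: date
        systems: List[str]
        scope_of_application: str
        reviews_undertaken: List[str] = field(default_factory=list)

    @dataclass
    class RiskControlRecord:
        control_id: str
        description: str
        applies_to: List[str]  # algo_ids covered by this control

    def algos_missing_control(algos, control):
        """Flag algorithms in the inventory not covered by a given risk control."""
        covered = set(control.applies_to)
        return [a.algo_id for a in algos if a.algo_id not in covered]

    algos = [AlgorithmRecord("algo-eq-01", "VWAP execution", "Desk A", "Head of eTrading",
                             date(2019, 6, 1), date(2019, 7, 1), ["OMS-1"], "HK equities"),
             AlgorithmRecord("algo-fx-07", "FX market making", "Desk B", "Head of eTrading",
                             date(2019, 9, 1), date(2019, 10, 1), ["FX-GW"], "G10 FX")]
    collar = RiskControlRecord("ctl-price-collar", "Price collar on all orders",
                               applies_to=["algo-eq-01"])
    print(algos_missing_control(algos, collar))  # ['algo-fx-07']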

 

Hannah Cassidy
Partner, Head of Financial Services Regulatory, Asia, Hong Kong

Valerie Tao
Professional Support Lawyer, Hong Kong