In response to the increasing prevalence of algorithmic trading (algo-trading) in the banking industry (e.g. for executing trade orders and market making in foreign exchange-related transactions), the Hong Kong Monetary Authority (the HKMA) has issued a circular setting out its supervisory expectations regarding the risk management practices that authorized institutions (AIs) should observe when developing their risk management frameworks for algo-trading activities. The circular follows a survey conducted by the HKMA in 2018 (which indicated that around 40% of the surveyed AIs currently use algo-trading) and a round of thematic on-site examinations conducted in 2019.

The HKMA’s expectations for AIs engaging in algo-trading are set out under four main areas (note that the HKMA has also identified risk management practices adopted by more advanced institutions – for further details, please refer to the Annex to the circular here):

Governance and oversight

  • AIs are expected to put in place proper governance and risk management frameworks to oversee and manage the risks associated with algo-trading services, ensuring that they fall within applicable risk appetites.
  • Effective and independent control functions should be established as a second line of defence, with the authority to challenge the front office.
  • Both the first and second lines of defence should conduct regular reviews (at least annually) of algorithms and the relevant governance and controls, including all key processes throughout the life cycle of algorithms, to ensure that they remain adequate and effective and that appropriate remedial actions are taken where required.
  • AIs’ internal audit functions, being the third line of defence, should perform regular reviews of algo-trading activities to ensure that they are subject to proper governance and any risks arising from these activities are adequately and effectively managed.

Development, testing and approval

  • An effective framework governing development and testing of algorithms should be established to ensure that algorithms behave as intended, and comply with relevant regulatory requirements and the AI’s internal policies.
  • A robust algorithm approval policy should be implemented to ensure that any new algorithms or changes to existing ones are properly tested, reviewed and challenged before implementation.

Risk monitoring and controls

  • A comprehensive set of pre-trade controls should be in place to ensure that risks are managed prudently.
  • AIs’ front offices and independent control functions should conduct real-time monitoring of algo-trading activities (e.g. real-time alerts to assist in identifying limit excesses and other abnormal trading activities).
  • Proper kill functionality should be put in place that can immediately suspend the use of an algorithm and cancel part or all of an unexecuted order.
  • A robust and effective business continuity plan should be established, setting out contingency measures for dealing with possible adverse scenarios in which algo-trading systems malfunction. Fall-back solutions should be subject to regular testing.
  • AIs should put in place proper security controls on access rights (physical and electronic) to algo-trading systems. These should include the use of reliable techniques to authenticate the identity of staff and the application of differentiated levels of access. Access records and activity logs should be subject to regular reviews to identify any unauthorised use of the systems.
  • AIs should establish robust policies and procedures for handling and escalating incidents related to algo-trading. Sufficient information should be provided to governance bodies and other responsible staff to facilitate review of the incidents and of the adequacy of remedial actions.
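To make the pre-trade control and kill-functionality expectations above more concrete, the following is a minimal illustrative sketch. All class names, limit types and thresholds are hypothetical examples of our own; the circular does not prescribe any particular implementation, and real controls would be calibrated to each AI's risk appetite:

```python
from dataclasses import dataclass


@dataclass
class Order:
    symbol: str
    quantity: int
    price: float


class AlgoRiskControls:
    """Hypothetical pre-trade limit checks plus a kill switch.

    Illustrative only: the specific limits (order size, notional)
    are example controls, not requirements from the circular.
    """

    def __init__(self, max_order_qty: int, max_notional: float):
        self.max_order_qty = max_order_qty
        self.max_notional = max_notional
        self.killed = False              # kill-switch state
        self.pending: list[Order] = []   # unexecuted orders

    def pre_trade_check(self, order: Order) -> bool:
        """Return True only if the order passes all pre-trade limits."""
        if self.killed:
            return False
        if order.quantity > self.max_order_qty:
            return False
        if order.quantity * order.price > self.max_notional:
            return False
        return True

    def submit(self, order: Order) -> bool:
        """Accept the order only if it clears the pre-trade controls."""
        if not self.pre_trade_check(order):
            return False
        self.pending.append(order)
        return True

    def kill(self) -> int:
        """Engage the kill switch: block new orders and cancel all
        unexecuted ones. Returns the number of cancelled orders."""
        self.killed = True
        cancelled = len(self.pending)
        self.pending.clear()
        return cancelled
```

Once the kill switch is engaged, all subsequent submissions are rejected until the algorithm is re-approved and re-enabled, reflecting the expectation that suspension and cancellation take effect immediately.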

Documentation

  • AIs should maintain proper documentation to provide sufficient audit trails on key processes throughout the life cycle of algorithms.
  • AIs should establish and maintain a comprehensive inventory to document all algorithms implemented and the relevant key information (e.g. brief description of algorithms and trading strategies, owner, approver and approval date, implementation date, systems where algorithms are implemented including scope of application, review records and applicable risk controls).
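As an illustration of the kind of key information the inventory bullet above lists, a minimal inventory record might look like the following. The field names and structure are our own assumptions for illustration; the HKMA does not prescribe a format:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class AlgorithmRecord:
    """One entry in a hypothetical algorithm inventory, covering the
    categories of key information mentioned in the circular."""
    algo_id: str
    description: str            # brief description of algorithm and strategy
    owner: str
    approver: str
    approval_date: date
    implementation_date: date
    systems: list[str]          # systems where the algorithm is implemented
    risk_controls: list[str]    # applicable risk controls
    review_records: list[str] = field(default_factory=list)


# A hypothetical inventory keyed by algorithm ID.
inventory: dict[str, AlgorithmRecord] = {}


def register(record: AlgorithmRecord) -> None:
    """Add or update an algorithm's entry in the inventory."""
    inventory[record.algo_id] = record
```

Keeping the inventory keyed by a stable algorithm identifier makes it straightforward to link review records and audit trails back to a specific implemented algorithm over its life cycle.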

The HKMA highlighted that AIs should take into account the nature, scale and complexity of their algo-trading activities when developing their risk management framework.

A copy of the circular can be found here.