Trading strategies, or algorithms, constitute the brain of the electronic trading framework. They provide clients, be they internal trading desks or external financial institutions, an intelligent, fully automated facility to achieve their trading objectives with a sufficient degree of control and transparency. The Algorithmic Container provides the required infrastructure to incrementally design, implement, test and deploy the strategies to meet the needs of the most sophisticated and demanding customers.
Separation of Concerns
The Algorithmic Container is a powerful framework based on simple “atomic” structures. Each layer, and each component within it, carries out a specific role in the system. Every component is well defined and loosely coupled, and therefore readily testable; it can be replaced by a different implementation should the need arise.
Event Driven Approach
Any meaningful event in the system is captured, persisted and delivered for processing. The event-response model employed by individual components reflects our Simplicity-Agility-Reliability paradigm.
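The capture-then-deliver model can be illustrated with a minimal sketch. The `EventBus` below is a hypothetical stand-in for the real messaging framework: every published event is first journalled (the persistence stand-in) and only then delivered to subscribers, so the event history can always be replayed.

```python
from collections import defaultdict

class EventBus:
    """Minimal illustration: events are captured (journalled) before
    being delivered to subscribers in publication order."""
    def __init__(self):
        self._subscribers = defaultdict(list)
        self.journal = []  # stand-in for durable persistence

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        self.journal.append((event_type, payload))  # capture first
        for handler in self._subscribers[event_type]:
            handler(payload)                         # then deliver

bus = EventBus()
fills = []
bus.subscribe("fill", fills.append)
bus.publish("fill", {"order_id": 1, "qty": 100})
```

Because the journal is written before delivery, a crashed consumer can be brought back up to date by replaying the journal from its last processed position.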
Multiagent System Approach
The algorithmic framework is modelled as a specialised multiagent system where “agents” (be they analytics, trading strategies, or fragmentation and order placement strategies) operate in a distributed environment and can exhibit cooperative and/or competitive behaviours. They can be deployed in a single process or seamlessly in a cluster without any changes to the way they interact.
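The deployment transparency described above can be sketched as follows. The `Router` and `Agent` names are illustrative, not the framework's actual API: agents only ever talk to the router, so an in-process router could be swapped for a networked one without changing any agent code.

```python
class Router:
    """In-process message router. A clustered deployment would replace
    this with a network transport; the agents would not change."""
    def __init__(self):
        self.agents = {}

    def register(self, agent):
        self.agents[agent.name] = agent

    def send(self, target, message):
        self.agents[target].receive(message)

class Agent:
    """An agent (analytic, strategy, ...) that interacts only via the router."""
    def __init__(self, name, router):
        self.name, self.router, self.inbox = name, router, []
        router.register(self)

    def receive(self, message):
        self.inbox.append(message)

    def send(self, target, message):
        self.router.send(target, message)

router = Router()
analytics = Agent("analytics", router)
strategy = Agent("strategy", router)
analytics.send("strategy", {"signal": "buy"})
```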
Microkernel Architecture & Message Passing IPC
At its core, our low-level components follow a microkernel design that uses a bespoke messaging framework for inter-process communication (IPC). The framework provides reliable delivery of events with real-time persistence and recovery capabilities (e.g. support for late joiners). The strategies, analytics, services and functional layers in a single instance of the Algorithmic Container operate in a lightweight cooperative multitasking environment and exhibit deterministic behaviour. Preemptive multitasking is achieved by deploying multiple instances of the container microkernel that act as parallel pipelines. This approach allows us to engineer and deliver very resilient and highly available components.
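Cooperative multitasking with deterministic behaviour can be sketched with generator-based tasks, a simplified analogue of the container's scheduling model (the names here are illustrative). Each task yields control voluntarily, so for a given set of tasks the interleaving is always the same:

```python
from collections import deque

def scheduler(tasks):
    """Round-robin cooperative scheduler: each task is a generator that
    yields control voluntarily, making the interleaving deterministic."""
    queue = deque(tasks)
    trace = []
    while queue:
        task = queue.popleft()
        try:
            trace.append(next(task))  # run the task until it yields
            queue.append(task)        # re-queue it until exhausted
        except StopIteration:
            pass
    return trace

def strategy(name, steps):
    """A toy task that does `steps` units of work, yielding after each."""
    for i in range(steps):
        yield f"{name}:{i}"

trace = scheduler([strategy("A", 2), strategy("B", 2)])
# deterministic interleaving: ['A:0', 'B:0', 'A:1', 'B:1']
```

Because no task can be interrupted mid-step, shared state needs no locking within one scheduler instance; parallelism comes from running several such instances side by side, as the section above describes.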
Embedded Transaction Support
Transactions are an intrinsic part of our event-driven frameworks. They are persistent, recoverable and can be easily audited.
The Configuration, Reference Data, Market Data, Analytical and other services provided by the Algorithmic Container are completely generalised and have no dependencies on vendor solutions. Such decoupling makes it possible to change the data and trading service providers without programmatic changes to the main infrastructure and the trading algorithms.
Multiple Redundant Peers
Multiple redundant peers (instances of the Algorithmic Container) can be deployed to replace the active components lost during a hardware failure.
High availability (HA) is an inherent feature of all our systems. The properties required to achieve HA are implemented from very early stages of the development cycle.
The system is designed to be linearly scalable up and down, from a large-scale deployment in a data centre to a notebook computer running all the application layers and a cluster of strategies in a realistic simulator.
Buy and Sell Side
The Algorithmic Container is designed for both agency and proprietary trading. Even though proprietary and agency deployments are expected to differ in topology and remain independent, the same execution and alpha-seeking algorithms can be leveraged, subject to the applicable regulatory constraints, to reduce market impact and slippage and to improve performance.
Our solutions are geared towards multi-asset trading, which is becoming crucial in the investment banking and brokerage industries to achieve uniformity, improve efficiency and risk transparency, and promote cross-selling at various levels.
Even if the initial client requirements are skewed towards a particular business flow and asset type, design provisions have been put in place to make the components extensible to multi-asset trading.
We avoid a monolithic approach in strategy development and encourage the use of the Composite pattern by nesting strategies. This can be done both statically in the code and in the configuration.
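A minimal sketch of the Composite pattern applied to strategies follows; the class names are illustrative. A composite strategy fans each event out to its children, which may themselves be composites, so strategies nest to any depth while presenting the same interface as a leaf:

```python
class Strategy:
    """Common interface shared by leaf and composite strategies."""
    def on_event(self, event):
        raise NotImplementedError

class RecordingStrategy(Strategy):
    """A toy leaf strategy that simply records the events it receives."""
    def __init__(self):
        self.events = []

    def on_event(self, event):
        self.events.append(event)

class CompositeStrategy(Strategy):
    """Composite pattern: forwards each event to nested child strategies,
    which may themselves be composites."""
    def __init__(self, children):
        self.children = list(children)

    def on_event(self, event):
        for child in self.children:
            child.on_event(event)

leaf_a, leaf_b = RecordingStrategy(), RecordingStrategy()
root = CompositeStrategy([leaf_a, CompositeStrategy([leaf_b])])
root.on_event("tick")  # both leaves see the event, regardless of depth
```

The same tree could equally be assembled from a configuration file rather than in code, which is the static/dynamic distinction made above.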
As a test harness for prototyping quantitative ideas and a “debugger” for troubleshooting, refining and testing strategies, we incorporate the Trading Simulator.
The simulator is an intrinsic part of the platform and can be an invaluable tool for:
- Developing and testing trading strategies
- Simulating market macro and flash crashes and testing the robustness and resilience of strategies
- Replaying and analysing real life data
- Reproducing production problems and troubleshooting the strategies
- Refining the strategy analytics (e.g. running batch non-linear optimisation processes in a CPU cluster)
- Carrying out what-if analysis and estimating “pre”, “at” and “post” trade analytics
- Running fully automated integration and performance tests (a part of the continuous integration process)
- Serving as the primary sales demo tool for existing and prospective clients
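The replay use cases above rest on one core mechanism: merging several time-ordered recorded streams (market data, order flow) and feeding them to a strategy in timestamp order. A minimal sketch, with hypothetical event tuples of the form `(timestamp, kind, payload)`:

```python
import heapq

def replay(streams, handler):
    """Merge several individually time-sorted event streams by timestamp
    and dispatch each event to the handler in replay order."""
    for event in heapq.merge(*streams):  # tuples compare by timestamp first
        handler(event)

trades = [(1, "trade", 100.0), (3, "trade", 100.5)]
quotes = [(2, "quote", 99.9)]
seen = []
replay([trades, quotes], seen.append)
# events arrive interleaved by timestamp: t=1, t=2, t=3
```

Driving a strategy from recorded streams in this way is what makes production problems reproducible: the same inputs in the same order yield the same behaviour.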
The Functional Layers are the specialised processing layers that constitute the Assembly Line, a complex event processing (CEP) pipeline structured as a chain of responsibility. The Assembly Line ensures the orderly service notification, event delivery, direct order instruction handling and indication gathering, indication enrichment and processing, validation, adjustments and conversion to placements for the subsequent time order priority optimisation and delivery to the final validation and command dispatching layer, the Order Publishing Pipeline.
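The chain-of-responsibility structure of the Assembly Line can be sketched as follows. The stage names and transformations here (enrichment, validation, conversion to a placement) are illustrative stand-ins for the real functional layers:

```python
class Layer:
    """One functional layer in the pipeline: apply a transformation and
    hand the result to the next layer (chain of responsibility)."""
    def __init__(self, name, transform, nxt=None):
        self.name, self.transform, self.next = name, transform, nxt

    def handle(self, indication):
        indication = self.transform(indication)
        return self.next.handle(indication) if self.next else indication

# Hypothetical three-stage pipeline: enrich -> validate -> convert
def enrich(ind):
    return {**ind, "venue": "XLON"}          # illustrative enrichment

def validate(ind):
    assert ind["qty"] > 0, "indication rejected"
    return ind

def to_placement(ind):
    return {**ind, "type": "placement"}      # conversion to a placement

pipeline = Layer("enrich", enrich,
           Layer("validate", validate,
           Layer("convert", to_placement)))

result = pipeline.handle({"symbol": "VOD.L", "qty": 500})
```

Each layer knows only its own responsibility and its successor, so stages can be inserted, removed or replaced without touching the rest of the chain.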
The Services are required by a Strategy to carry out its essential functions, be it trading, surveillance and monitoring, or management of other strategies. They capture the common patterns used by the strategies in interacting with the external environment. Without them, the strategy developers would have to have an innate understanding of the low-level communication protocol and build their own state machines and objects. This would result in unnecessary complexity and duplication of effort.
The Managers complement the Services by providing additional functionality (e.g. management of other strategies, indication handling) and standardising the interactions with the Container (e.g. system lifecycle monitoring and streamlined direct trading capabilities).
The event-driven and transactional nature of the system allows us to capture all system and business events, which improves transparency. The real-time reporting capability can be tailored to meet specific reporting and compliance requirements.
Historical data outside of the trading session (which can incorporate single- or multiple-day trading sessions) is retained in decoupled long-term storage (such as a classical RDBMS or its NoSQL substitute). Custom drop-feed and query components are developed to support real-time long-term data persistence as well as retrieval.
Flexible Strategy Order Hierarchy
We avoid rigid order hierarchies and support N levels of strategy order nesting (parent-child relationships). Parent, work/care, market-side and other order types share a uniform order representation. Constraints on the depth of nesting, the slicing of child orders, and the propagation of trade events and states can be imposed on individual branches of the order hierarchy.
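A sketch of such a uniform order tree follows; the field and method names are illustrative. Every order, whether parent, work/care or market-side, is the same node type; a per-branch depth constraint governs slicing, and fills aggregate up the hierarchy:

```python
class Order:
    """Uniform order representation: every order is a node in one tree,
    nested to arbitrary depth, with a per-branch depth constraint."""
    def __init__(self, qty, parent=None, max_depth=None):
        self.qty, self.executed, self.children = qty, 0, []
        self.depth = 0 if parent is None else parent.depth + 1
        self.max_depth = max_depth if parent is None else parent.max_depth

    def slice(self, qty):
        # nesting-depth constraint imposed on this branch
        if self.max_depth is not None and self.depth + 1 > self.max_depth:
            raise ValueError("nesting depth constraint violated")
        child = Order(qty, parent=self)
        self.children.append(child)
        return child

    def fill(self, qty):
        self.executed += qty

    def filled(self):
        # trade events propagate up the hierarchy by aggregation
        return self.executed + sum(c.filled() for c in self.children)

parent = Order(1000, max_depth=2)  # parent order, two levels of nesting allowed
child = parent.slice(400)          # depth 1: a work order
grandchild = child.slice(100)      # depth 2: a market-side placement
grandchild.fill(100)               # the fill is visible at every ancestor
```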
Powerful Order Routing
We employ a DSL-driven routing engine to manage business flow segmentation, order routing, ad-hoc business rules, enrichment, validation, and custom behaviours.
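The essence of such rule-driven routing can be shown with a first-match-wins rule table. This is a deliberately simplified stand-in for the DSL: the predicates, destinations and thresholds below are all hypothetical:

```python
# Hypothetical rule table: each rule pairs a predicate over the order
# with a destination; the first matching rule wins.
rules = [
    (lambda o: o["qty"] >= 10_000, "block-desk"),
    (lambda o: o["symbol"].endswith(".L"), "lse-smart-router"),
]

def route(order, rules, default="default-router"):
    """Return the destination of the first rule whose predicate matches."""
    for predicate, destination in rules:
        if predicate(order):
            return destination
    return default
```

In the real engine the rules would be expressed in the DSL and loaded from configuration rather than hard-coded, so business flow segmentation can change without redeployment.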
The quality of testing is one of our key differentiating factors. Special attention is paid to unit tests, automated component-centric and cross-system integration tests, and performance and stress tests. In addition, we employ a model-driven methodology complemented by a set of proprietary techniques to generate hundreds of thousands of individual state transition tests, collectively producing millions of events to which the system is subjected in a single test run. The aim of this layer of tests is to ensure adherence to the product specification and the robustness of the state engines. It serves as a solid regression pack protecting us against unwanted changes and bugs that could be introduced in the course of incremental development.
We can cater for highly distributed, localised and “co-located” deployment configurations.
The Algorithmic Container solution is intuitively modularised, well-documented and easily maintainable. The design supports ongoing updates of individual components and would not typically require re-deployment of the entire system.
We provide continuous product training for first- and second-level support staff. Additional advanced training can be arranged upon request. Our development team can provide third-level support and help handle production emergencies.