HftBacktest
High-Frequency Trading Backtesting Tool
This framework is designed for developing high-frequency trading and market-making strategies. It focuses on accounting for both feed and order latencies, as well as the order queue position, for order fill simulation. The framework aims to provide more accurate market-replay-based backtesting built on full order book and trade tick feed data.
Rust implementation with experimental features
The experimental features are currently in the early stages of development. The framework has been completely rewritten in Rust to support the following:
Backtesting of multi-asset and multi-exchange models
Deployment of a live trading bot using the same algo code
Please see the rust directory.
Key Features
Works within Numba JIT functions.
Complete tick-by-tick simulation with a variable time interval.
Full order book reconstruction based on L2 feeds (Market-By-Price).
Backtest accounting for both feed and order latency, using provided models or your own custom model.
Order fill simulation that takes into account the order queue position, using provided models or your own custom model.
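Conceptually, a latency model maps a local timestamp to the time at which the exchange processes the request, and vice versa for responses. As a minimal illustration of the idea (the class name and method signatures below are hypothetical and are not hftbacktest's actual model interface), a constant-latency model simply shifts timestamps by fixed amounts:

```python
# Hypothetical sketch of what a latency model does; hftbacktest's actual
# interface differs, so see the documentation for the provided models.
class ConstantLatencySketch:
    def __init__(self, entry_latency: int, response_latency: int):
        self.entry_latency = entry_latency        # local -> exchange delay
        self.response_latency = response_latency  # exchange -> local delay

    def entry(self, local_timestamp: int) -> int:
        # Time at which the exchange sees an order sent at local_timestamp.
        return local_timestamp + self.entry_latency

    def response(self, exch_timestamp: int) -> int:
        # Time at which the strategy sees a response sent at exch_timestamp.
        return exch_timestamp + self.response_latency
```

A custom model could instead derive these delays from the feed itself, e.g. from the gap between exchange and local timestamps observed in the data.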
Getting started
Installation
hftbacktest supports Python 3.10+. You can install hftbacktest using pip:
pip install hftbacktest
Or you can clone the latest development version from the Git repository with:
git clone https://github.com/nkaz001/hftbacktest
Data Source & Format
Please see Data or Data Preparation.
A Quick Example
Get a glimpse of what backtesting with hftbacktest looks like with these code snippets:
```python
from numba import njit

from hftbacktest import GTX


@njit
def simple_two_sided_quote(hbt, stat):
    max_position = 5
    half_spread = hbt.tick_size * 20
    skew = 1
    order_qty = 0.1
    last_order_id = -1
    order_id = 0

    # Checks every 0.1s
    while hbt.elapse(100_000):
        # Clears cancelled, filled or expired orders.
        hbt.clear_inactive_orders()

        # Obtains the current mid-price and computes the reservation price.
        mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
        reservation_price = mid_price - skew * hbt.position * hbt.tick_size

        buy_order_price = reservation_price - half_spread
        sell_order_price = reservation_price + half_spread

        last_order_id = -1
        # Cancels all outstanding orders.
        for order in hbt.orders.values():
            if order.cancellable:
                hbt.cancel(order.order_id)
                last_order_id = order.order_id

        # All order requests are considered to be requested at the same time.
        # Waits until one of the order cancellation responses is received.
        if last_order_id >= 0:
            hbt.wait_order_response(last_order_id)

        # Clears cancelled, filled or expired orders.
        hbt.clear_inactive_orders()

        last_order_id = -1
        if hbt.position < max_position:
            # Submits a new post-only limit bid order.
            order_id += 1
            hbt.submit_buy_order(
                order_id,
                buy_order_price,
                order_qty,
                GTX
            )
            last_order_id = order_id

        if hbt.position > -max_position:
            # Submits a new post-only limit ask order.
            order_id += 1
            hbt.submit_sell_order(
                order_id,
                sell_order_price,
                order_qty,
                GTX
            )
            last_order_id = order_id

        # All order requests are considered to be requested at the same time.
        # Waits until one of the order responses is received.
        if last_order_id >= 0:
            hbt.wait_order_response(last_order_id)

        # Records the current state for stat calculation.
        stat.record(hbt)
```
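The pricing step of the strategy above can be checked in isolation, outside the backtester. The helper below is a sketch extracted from simple_two_sided_quote; the function name and default parameters are illustrative, not part of hftbacktest:

```python
def quote_prices(best_bid, best_ask, position, tick_size,
                 half_spread_ticks=20, skew=1):
    # Mirrors the pricing step of simple_two_sided_quote: skew the
    # reservation price against the current inventory, then quote
    # half a spread on either side of it.
    half_spread = tick_size * half_spread_ticks
    mid_price = (best_bid + best_ask) / 2.0
    reservation_price = mid_price - skew * position * tick_size
    return reservation_price - half_spread, reservation_price + half_spread
```

With a flat position the quotes straddle the mid-price symmetrically; a long position shifts both quotes down, favoring fills that reduce inventory, and a short position shifts them up.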
Examples
You can find more examples in the examples directory.
Contributing
Thank you for considering contributing to hftbacktest! Any and all help to improve the project is welcome. If you have an idea for an enhancement or a bug fix, please open an issue or discussion on GitHub to discuss it.
The following items are examples of contributions you can make to this project:
Improve performance statistics reporting
Implement test code
Add additional queue or exchange models
Update documentation and examples
Data Preparation
To fully utilize the power of HftBacktest, tick-by-tick full order book and trade feed data is required as input. Unfortunately, unlike the daily bar data provided by platforms such as Yahoo Finance, free tick-by-tick full order book and trade feed data suitable for HFT is not readily available. However, in the case of cryptocurrency, you can collect the full raw feed yourself.
Getting started from Binance Futures’ raw feed data
You can collect the Binance Futures feed yourself using https://github.com/nkaz001/collect-binancefutures.
[1]:
```python
import gzip

with gzip.open('usdm/btcusdt_20230404.dat.gz', 'r') as f:
    for i in range(20):
        line = f.readline()
        print(line)
```
b'1680652700423575 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246762461,"s":"BTCUSDT","b":"28145.10","B":"3.868","a":"28145.20","A":"6.887","T":1680652700430,"E":1680652700435}}\n'
b'1680652700441533 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700455,"T":1680652700452,"s":"BTCUSDT","t":3535186032,"p":"28145.10","q":"0.002","X":"MARKET","m":true}}\n'
b'1680652700441685 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700455,"T":1680652700452,"s":"BTCUSDT","t":3535186033,"p":"28145.10","q":"0.020","X":"MARKET","m":true}}\n'
b'1680652700441725 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700455,"T":1680652700452,"s":"BTCUSDT","t":3535186034,"p":"28145.10","q":"0.020","X":"MARKET","m":true}}\n'
b'1680652700442528 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700455,"T":1680652700452,"s":"BTCUSDT","t":3535186035,"p":"28145.10","q":"0.008","X":"MARKET","m":true}}\n'
b'1680652700442569 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246762974,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.887","T":1680652700452,"E":1680652700455}}\n'
b'1680652700454910 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700468,"T":1680652700462,"s":"BTCUSDT","t":3535186036,"p":"28145.20","q":"0.002","X":"MARKET","m":false}}\n'
b'1680652700455070 {"stream":"btcusdt@trade","data":{"e":"trade","E":1680652700468,"T":1680652700462,"s":"BTCUSDT","t":3535186037,"p":"28145.20","q":"0.008","X":"MARKET","m":false}}\n'
b'1680652700455110 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246763198,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.907","T":1680652700462,"E":1680652700468}}\n'
b'1680652700458611 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246763205,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.927","T":1680652700462,"E":1680652700469}}\n'
b'1680652700461970 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246763256,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.917","T":1680652700462,"E":1680652700470}}\n'
b'1680652700462351 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246763281,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.925","T":1680652700463,"E":1680652700471}}\n'
b'1680652700487340 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246763977,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.954","T":1680652700498,"E":1680652700501}}\n'
b'1680652700566269 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246765398,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.833","T":1680652700575,"E":1680652700579}}\n'
b'1680652700573952 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246765530,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.953","T":1680652700583,"E":1680652700588}}\n'
b'1680652700574554 {"stream":"btcusdt@bookTicker","data":{"e":"bookTicker","u":2710246765551,"s":"BTCUSDT","b":"28145.10","B":"3.818","a":"28145.20","A":"6.961","T":1680652700585,"E":1680652700588}}\n'
b'1680652700599351 {"lastUpdateId": 2710246765483, "E": 1680652700590, "T": 1680652700580, "bids": [["28145.10", "3.818"], ["28145.00", "0.413"], ["28144.70", "0.002"], ...], "asks": [["28145.20", "6.833"], ["28145.30", "0.009"], ["28145.40", "0.008"], ...]}\n'
"0.216"], ["28179.80", "0.450"], ["28179.90", "0.410"], ["28180.00", "4.999"], ["28180.10", "0.074"], ["28180.20", "0.452"], ["28180.30", "3.017"], ["28180.50", "0.037"], ["28180.70", "0.500"], ["28180.90", "3.001"], ["28181.00", "0.180"], ["28181.20", "0.028"], ["28181.30", "0.213"], ["28181.40", "0.054"], ["28181.60", "0.096"], ["28181.70", "0.321"], ["28181.80", "0.082"], ["28181.90", "0.004"], ["28182.00", "5.527"], ["28182.10", "1.942"], ["28182.20", "0.138"], ["28182.30", "3.115"], ["28182.40", "8.624"], ["28182.50", "3.128"], ["28182.60", "0.324"], ["28182.70", "0.554"], ["28182.90", "1.942"], ["28183.00", "3.001"], ["28183.10", "0.219"], ["28183.20", "0.006"], ["28183.30", "0.005"], ["28183.50", "0.052"], ["28183.70", "1.187"], ["28184.00", "0.051"], ["28184.10", "1.392"], ["28184.20", "2.828"], ["28184.30", "2.053"], ["28184.40", "0.093"], ["28184.50", "0.062"], ["28184.60", "0.010"], ["28184.70", "1.937"], ["28184.80", "0.178"], ["28184.90", "0.482"], ["28185.00", "1.207"], ["28185.10", "0.207"], ["28185.20", "0.058"], ["28185.40", "0.056"], ["28185.50", "0.067"], ["28185.70", "0.124"], ["28185.80", "0.002"], ["28185.90", "0.112"], ["28186.00", "0.677"], ["28186.20", "0.233"], ["28186.30", "0.064"], ["28186.40", "0.019"], ["28186.50", "0.050"], ["28186.60", "2.438"], ["28186.70", "0.696"], ["28186.80", "2.121"], ["28186.90", "0.096"], ["28187.00", "0.655"], ["28187.10", "0.075"], ["28187.20", "3.055"], ["28187.30", "0.138"], ["28187.40", "0.490"], ["28187.50", "0.261"], ["28187.60", "0.057"], ["28187.70", "0.249"], ["28187.80", "0.093"], ["28187.90", "0.139"], ["28188.00", "0.511"], ["28188.10", "6.819"], ["28188.20", "4.594"], ["28188.30", "2.910"], ["28188.50", "30.115"], ["28188.60", "0.094"], ["28188.70", "0.301"], ["28188.80", "0.132"], ["28189.00", "0.211"], ["28189.10", "0.035"], ["28189.20", "4.035"], ["28189.30", "0.046"], ["28189.40", "0.077"], ["28189.50", "0.131"], ["28189.60", "5.334"], ["28189.70", "0.017"], ["28189.80", "1.417"], 
["28189.90", "0.102"], ["28190.00", "0.331"], ["28190.10", "6.807"], ["28190.20", "1.500"], ["28190.30", "6.361"], ["28190.40", "0.024"], ["28190.50", "0.656"], ["28190.60", "0.395"], ["28190.70", "0.002"], ["28190.80", "0.053"], ["28190.90", "0.512"], ["28191.00", "0.326"], ["28191.10", "0.208"], ["28191.20", "0.014"], ["28191.30", "0.001"], ["28191.40", "0.365"], ["28191.50", "0.316"], ["28191.70", "0.070"], ["28191.80", "3.992"], ["28191.90", "0.053"], ["28192.00", "0.223"], ["28192.10", "0.078"], ["28192.20", "10.683"], ["28192.30", "0.001"], ["28192.40", "0.054"], ["28192.50", "0.010"], ["28192.60", "0.011"], ["28192.70", "0.121"], ["28192.80", "0.367"], ["28192.90", "0.264"], ["28193.00", "0.025"], ["28193.10", "0.059"], ["28193.20", "0.108"], ["28193.30", "0.101"], ["28193.40", "0.948"], ["28193.50", "4.147"], ["28193.60", "0.265"], ["28193.70", "0.014"], ["28193.80", "0.011"], ["28194.00", "2.695"], ["28194.10", "3.476"], ["28194.20", "0.559"], ["28194.30", "0.014"], ["28194.40", "0.387"], ["28194.50", "0.100"], ["28194.60", "0.108"], ["28194.70", "0.111"], ["28194.80", "0.912"], ["28194.90", "0.115"], ["28195.00", "1.881"], ["28195.20", "0.117"], ["28195.30", "0.166"], ["28195.40", "4.603"], ["28195.50", "0.056"], ["28195.60", "0.257"], ["28195.70", "0.005"], ["28195.80", "0.099"], ["28195.90", "0.035"], ["28196.00", "2.935"], ["28196.10", "0.008"], ["28196.20", "0.012"], ["28196.30", "0.113"], ["28196.40", "0.168"], ["28196.50", "3.003"], ["28196.60", "0.060"], ["28196.70", "0.990"], ["28196.80", "4.424"], ["28197.00", "0.170"], ["28197.10", "0.106"], ["28197.20", "2.753"], ["28197.30", "3.056"], ["28197.40", "0.085"], ["28197.50", "0.001"], ["28197.60", "0.433"], ["28197.70", "1.963"], ["28197.80", "1.724"], ["28197.90", "0.090"], ["28198.00", "0.414"], ["28198.10", "0.083"], ["28198.20", "0.436"], ["28198.30", "0.473"], ["28198.40", "1.423"], ["28198.50", "0.123"], ["28198.70", "0.396"], ["28198.80", "0.186"], ["28198.90", "0.011"], ["28199.00", 
"7.475"], ["28199.10", "2.453"], ["28199.20", "0.380"], ["28199.30", "3.340"], ["28199.40", "4.688"], ["28199.50", "0.153"], ["28199.60", "0.023"], ["28199.70", "0.315"], ["28199.90", "0.014"], ["28200.00", "18.031"], ["28200.10", "1.041"], ["28200.20", "0.505"], ["28200.40", "0.054"], ["28200.50", "0.084"], ["28200.60", "1.081"], ["28200.70", "0.182"], ["28200.80", "4.406"], ["28200.90", "0.409"], ["28201.00", "0.016"], ["28201.10", "0.414"], ["28201.20", "1.624"], ["28201.30", "0.111"], ["28201.40", "2.080"], ["28201.50", "0.268"], ["28201.60", "0.083"], ["28201.70", "0.081"], ["28201.80", "0.186"], ["28201.90", "0.209"], ["28202.00", "3.496"], ["28202.10", "0.055"], ["28202.20", "3.021"], ["28202.30", "0.433"], ["28202.40", "0.219"], ["28202.50", "5.000"], ["28202.60", "0.001"], ["28202.70", "0.255"], ["28202.80", "22.064"], ["28202.90", "0.084"], ["28203.00", "11.714"], ["28203.10", "0.001"], ["28203.20", "0.092"], ["28203.30", "0.233"], ["28203.40", "4.383"], ["28203.50", "19.015"], ["28203.60", "3.235"], ["28203.70", "0.269"], ["28203.80", "0.109"], ["28203.90", "0.296"], ["28204.00", "0.337"], ["28204.10", "0.142"], ["28204.20", "0.087"], ["28204.30", "0.004"], ["28204.40", "6.686"], ["28204.50", "2.388"], ["28204.60", "0.426"], ["28204.70", "0.444"], ["28204.80", "1.497"], ["28204.90", "8.441"], ["28205.00", "0.627"], ["28205.10", "4.597"], ["28205.20", "0.001"], ["28205.30", "0.011"], ["28205.40", "0.064"], ["28205.50", "0.411"], ["28205.60", "0.295"], ["28205.70", "0.213"], ["28205.80", "0.084"], ["28205.90", "1.043"], ["28206.00", "0.227"], ["28206.10", "0.004"], ["28206.20", "0.406"], ["28206.30", "1.735"], ["28206.40", "0.018"], ["28206.50", "2.383"], ["28206.70", "0.973"], ["28206.80", "3.162"], ["28206.90", "3.793"], ["28207.00", "10.365"], ["28207.10", "0.295"], ["28207.20", "0.085"], ["28207.30", "0.196"], ["28207.40", "2.647"], ["28207.50", "4.286"], ["28207.70", "0.506"], ["28207.80", "0.085"], ["28207.90", "1.995"], ["28208.00", "3.232"], 
["28208.10", "0.050"], ["28208.20", "0.105"], ["28208.30", "0.071"], ["28208.50", "1.316"], ["28208.60", "0.085"], ["28208.70", "0.849"], ["28208.80", "1.434"], ["28208.90", "2.712"], ["28209.00", "9.793"], ["28209.10", "12.185"], ["28209.20", "0.720"], ["28209.40", "9.574"], ["28209.50", "1.182"], ["28209.60", "0.145"], ["28209.70", "0.292"], ["28209.80", "0.256"], ["28209.90", "0.120"], ["28210.00", "1.127"], ["28210.10", "6.834"], ["28210.20", "30.180"], ["28210.30", "1.563"], ["28210.40", "6.369"], ["28210.50", "0.081"], ["28210.60", "0.167"], ["28210.70", "0.002"], ["28210.90", "0.091"], ["28211.00", "1.032"], ["28211.10", "0.359"], ["28211.20", "4.299"], ["28211.30", "2.474"], ["28211.40", "6.268"], ["28211.50", "0.200"], ["28211.60", "0.289"], ["28211.70", "0.107"], ["28211.80", "3.086"], ["28211.90", "1.842"], ["28212.00", "4.012"], ["28212.10", "3.379"], ["28212.20", "2.258"], ["28212.30", "0.081"], ["28212.40", "2.738"], ["28212.50", "0.015"], ["28212.60", "4.417"], ["28212.70", "0.080"], ["28212.80", "4.372"], ["28212.90", "0.328"], ["28213.00", "0.075"], ["28213.10", "0.910"], ["28213.20", "0.117"], ["28213.30", "3.360"], ["28213.40", "0.379"], ["28213.50", "0.943"], ["28213.60", "0.564"], ["28213.70", "0.009"], ["28213.80", "3.940"], ["28213.90", "0.024"], ["28214.00", "4.199"], ["28214.10", "0.085"], ["28214.20", "0.012"], ["28214.30", "0.175"], ["28214.40", "0.306"], ["28214.50", "0.742"], ["28214.60", "0.170"], ["28214.70", "0.325"], ["28214.80", "0.080"], ["28214.90", "0.994"], ["28215.00", "0.617"], ["28215.10", "0.001"], ["28215.20", "0.561"], ["28215.30", "0.215"], ["28215.40", "0.007"], ["28215.50", "3.623"], ["28215.60", "0.138"], ["28215.70", "0.020"], ["28215.80", "0.317"], ["28215.90", "1.659"], ["28216.00", "0.628"], ["28216.10", "1.021"], ["28216.30", "0.220"], ["28216.40", "1.549"], ["28216.50", "0.211"], ["28216.60", "0.122"], ["28216.70", "0.238"], ["28216.80", "5.845"], ["28216.90", "0.279"], ["28217.00", "1.474"], ["28217.20", 
"0.129"], ["28217.30", "3.198"], ["28217.40", "0.237"], ["28217.50", "6.120"], ["28217.60", "0.747"], ["28217.70", "0.385"], ["28217.80", "0.001"], ["28217.90", "2.335"], ["28218.00", "0.679"], ["28218.10", "0.331"], ["28218.20", "2.191"], ["28218.30", "0.764"], ["28218.40", "0.341"], ["28218.50", "0.162"], ["28218.60", "2.987"], ["28218.70", "0.152"], ["28218.80", "0.932"], ["28218.90", "1.451"], ["28219.00", "0.386"], ["28219.10", "3.103"], ["28219.20", "0.115"], ["28219.30", "0.247"], ["28219.40", "6.307"], ["28219.50", "0.122"], ["28219.60", "0.041"], ["28219.70", "0.963"], ["28219.80", "0.129"], ["28219.90", "4.433"], ["28220.00", "6.985"], ["28220.10", "0.460"], ["28220.20", "0.372"], ["28220.30", "0.424"], ["28220.40", "1.460"], ["28220.50", "1.451"], ["28220.70", "0.238"], ["28220.80", "2.689"], ["28220.90", "0.289"], ["28221.00", "0.098"], ["28221.10", "0.011"], ["28221.20", "0.001"], ["28221.30", "1.032"], ["28221.40", "0.845"], ["28221.50", "0.148"], ["28221.60", "0.113"], ["28221.70", "0.359"], ["28221.80", "0.055"], ["28222.00", "2.391"], ["28222.10", "0.159"], ["28222.20", "0.012"], ["28222.30", "0.035"], ["28222.40", "0.252"], ["28222.50", "1.670"], ["28222.60", "0.051"], ["28222.70", "0.090"], ["28222.80", "0.005"], ["28222.90", "3.115"], ["28223.00", "0.941"], ["28223.10", "0.211"], ["28223.20", "0.011"], ["28223.30", "0.086"], ["28223.40", "3.336"], ["28223.60", "1.432"], ["28223.70", "11.014"], ["28223.80", "0.052"], ["28223.90", "0.064"], ["28224.00", "15.905"], ["28224.10", "0.186"], ["28224.20", "0.244"], ["28224.30", "0.492"], ["28224.40", "0.125"], ["28224.50", "0.002"], ["28224.60", "0.135"], ["28224.70", "0.026"], ["28224.80", "0.862"], ["28224.90", "0.138"], ["28225.00", "2.189"], ["28225.10", "1.957"], ["28225.20", "0.005"], ["28225.30", "0.035"], ["28225.40", "0.088"], ["28225.50", "0.305"], ["28225.70", "0.191"], ["28225.80", "0.001"], ["28225.90", "2.999"], ["28226.00", "0.361"], ["28226.10", "0.084"], ["28226.20", "0.182"], 
["28226.30", "11.900"], ["28226.40", "0.570"], ["28226.50", "1.386"], ["28226.60", "0.002"], ["28226.70", "0.059"], ["28226.80", "0.118"], ["28226.90", "0.115"], ["28227.00", "0.181"], ["28227.10", "0.004"], ["28227.20", "0.221"], ["28227.30", "0.320"], ["28227.40", "0.163"], ["28227.50", "0.080"], ["28227.60", "0.630"], ["28227.80", "0.065"], ["28227.90", "0.279"], ["28228.00", "2.303"], ["28228.10", "0.380"], ["28228.20", "2.645"], ["28228.40", "6.630"], ["28228.50", "0.386"], ["28228.60", "0.370"], ["28228.70", "0.336"], ["28228.80", "0.005"], ["28228.90", "0.159"], ["28229.00", "28.807"], ["28229.10", "0.001"], ["28229.20", "0.004"], ["28229.30", "0.187"], ["28229.40", "0.001"], ["28229.50", "0.901"], ["28229.60", "1.410"], ["28229.70", "0.083"], ["28229.80", "1.477"], ["28229.90", "1.876"], ["28230.00", "6.240"], ["28230.10", "0.311"], ["28230.20", "2.843"], ["28230.30", "0.207"], ["28230.40", "5.485"], ["28230.50", "0.208"], ["28230.60", "3.067"], ["28230.70", "0.235"], ["28230.80", "0.092"], ["28230.90", "0.117"], ["28231.00", "0.041"], ["28231.10", "0.021"], ["28231.20", "0.002"], ["28231.30", "0.006"], ["28231.40", "1.011"], ["28231.50", "0.014"], ["28231.70", "0.252"], ["28231.80", "2.973"], ["28231.90", "0.007"], ["28232.00", "1.155"], ["28232.10", "0.110"], ["28232.20", "0.135"], ["28232.30", "0.050"], ["28232.40", "0.149"], ["28232.50", "0.270"], ["28232.60", "0.203"], ["28232.70", "0.015"], ["28232.80", "0.197"], ["28232.90", "0.145"], ["28233.00", "0.074"], ["28233.20", "0.122"], ["28233.30", "2.320"], ["28233.50", "0.137"], ["28233.60", "0.152"], ["28233.80", "0.196"], ["28233.90", "0.905"], ["28234.00", "3.679"], ["28234.10", "0.169"], ["28234.20", "0.084"], ["28234.30", "0.488"], ["28234.40", "0.126"], ["28234.50", "0.209"], ["28234.60", "0.596"], ["28234.70", "0.357"], ["28234.80", "2.721"], ["28234.90", "0.150"], ["28235.00", "3.057"], ["28235.10", "2.129"], ["28235.20", "0.118"], ["28235.40", "0.165"], ["28235.50", "0.139"], ["28235.60", 
"0.422"], ["28235.70", "0.329"], ["28235.80", "0.024"], ["28235.90", "0.085"], ["28236.00", "2.251"], ["28236.10", "0.354"], ["28236.20", "0.076"], ["28236.30", "0.099"], ["28236.40", "0.221"], ["28236.50", "8.147"], ["28236.60", "0.166"], ["28236.70", "0.313"], ["28236.90", "1.496"], ["28237.00", "0.084"], ["28237.10", "0.001"], ["28237.20", "0.074"], ["28237.30", "0.096"], ["28237.40", "0.102"], ["28237.50", "5.147"], ["28237.60", "0.066"], ["28237.70", "0.253"], ["28237.90", "0.099"], ["28238.00", "0.046"], ["28238.10", "0.108"], ["28238.20", "1.551"], ["28238.30", "0.286"], ["28238.40", "5.336"], ["28238.50", "0.663"], ["28238.60", "0.009"], ["28238.70", "1.416"], ["28238.80", "2.560"], ["28238.90", "0.233"], ["28239.00", "2.213"], ["28239.10", "0.408"], ["28239.20", "0.281"], ["28239.30", "0.195"], ["28239.40", "0.162"], ["28239.60", "0.188"], ["28239.70", "0.002"], ["28239.80", "0.367"], ["28239.90", "6.932"], ["28240.00", "13.593"], ["28240.10", "0.036"], ["28240.20", "0.118"], ["28240.30", "0.009"], ["28240.40", "10.040"], ["28240.50", "12.444"], ["28240.60", "3.424"], ["28240.70", "2.113"], ["28240.90", "0.362"], ["28241.00", "0.404"], ["28241.20", "0.086"], ["28241.30", "0.093"], ["28241.40", "3.048"], ["28241.60", "0.174"], ["28241.70", "0.296"], ["28241.80", "0.155"], ["28241.90", "0.741"], ["28242.00", "0.040"], ["28242.10", "0.094"], ["28242.20", "0.010"], ["28242.30", "0.188"], ["28242.40", "0.171"], ["28242.50", "0.067"], ["28242.60", "0.144"], ["28242.70", "0.231"], ["28242.80", "0.001"], ["28242.90", "3.156"], ["28243.00", "0.586"], ["28243.10", "8.162"], ["28243.20", "0.030"], ["28243.30", "0.224"], ["28243.40", "3.938"], ["28243.50", "12.026"], ["28243.60", "0.002"], ["28243.70", "0.214"], ["28243.80", "21.460"], ["28243.90", "0.196"], ["28244.00", "0.652"], ["28244.10", "94.981"], ["28244.20", "0.427"], ["28244.30", "0.285"], ["28244.40", "0.180"], ["28244.50", "2.342"], ["28244.60", "0.410"], ["28244.70", "0.003"], ["28244.80", "0.088"], 
["28244.90", "6.304"], ["28245.00", "0.848"], ["28245.10", "0.315"], ["28245.20", "1.991"], ["28245.30", "0.366"], ["28245.40", "0.106"], ["28245.50", "0.223"], ["28245.60", "0.223"], ["28245.70", "0.002"], ["28245.90", "0.227"], ["28246.00", "0.316"], ["28246.10", "0.343"], ["28246.20", "2.305"], ["28246.40", "0.021"], ["28246.50", "0.224"], ["28246.60", "0.003"], ["28246.70", "0.025"], ["28246.80", "0.038"], ["28246.90", "0.001"], ["28247.00", "0.486"], ["28247.10", "0.434"], ["28247.20", "0.215"], ["28247.30", "0.119"], ["28247.40", "0.226"], ["28247.50", "0.060"], ["28247.60", "0.225"], ["28247.70", "0.536"], ["28247.80", "0.147"], ["28247.90", "0.080"], ["28248.00", "1.657"], ["28248.10", "0.093"], ["28248.20", "0.317"], ["28248.30", "0.223"], ["28248.40", "0.161"], ["28248.50", "0.954"], ["28248.60", "0.403"], ["28248.70", "0.080"], ["28248.80", "0.005"], ["28248.90", "0.256"], ["28249.00", "3.201"], ["28249.10", "0.309"], ["28249.20", "0.005"], ["28249.30", "0.078"], ["28249.40", "0.268"], ["28249.50", "4.679"], ["28249.60", "0.130"], ["28249.70", "0.210"], ["28249.80", "15.197"], ["28249.90", "4.501"], ["28250.00", "20.080"], ["28250.10", "0.133"], ["28250.20", "0.148"], ["28250.30", "0.106"], ["28250.40", "0.150"], ["28250.50", "0.036"], ["28250.60", "0.044"], ["28250.70", "0.481"], ["28250.80", "0.002"], ["28250.90", "1.351"], ["28251.00", "5.915"], ["28251.10", "0.004"], ["28251.20", "0.176"], ["28251.30", "0.215"], ["28251.40", "0.145"], ["28251.50", "0.202"], ["28251.60", "0.611"], ["28251.80", "0.835"], ["28251.90", "0.288"], ["28252.00", "0.261"], ["28252.10", "0.084"], ["28252.20", "0.005"], ["28252.30", "0.119"], ["28252.60", "0.055"], ["28252.70", "0.005"], ["28252.80", "0.002"], ["28252.90", "0.463"], ["28253.00", "0.070"], ["28253.10", "0.231"], ["28253.20", "0.149"], ["28253.30", "2.858"], ["28253.40", "1.579"], ["28253.50", "0.070"], ["28253.60", "0.103"], ["28253.70", "0.007"], ["28253.80", "0.295"], ["28253.90", "0.001"], ["28254.00", 
"0.252"], ["28254.10", "0.073"], ["28254.20", "0.020"], ["28254.30", "0.002"], ["28254.40", "0.636"], ["28254.50", "0.171"], ["28254.60", "0.097"], ["28254.70", "0.001"], ["28254.80", "0.007"], ["28255.00", "3.013"], ["28255.10", "0.100"], ["28255.20", "5.642"], ["28255.30", "0.172"], ["28255.40", "0.085"], ["28255.50", "0.131"]]}\n'
b'1680652700600379 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1680652700594,"T":1680652700586,"s":"BTCUSDT","U":2710246765373,"u":2710246765597,"pu":2710246765326,"b":[["5000.00","14.368"],["28018.90","0.000"],["28049.60","0.288"],["28080.20","3.000"]],"a":[["28145.20","6.961"],["28173.30","0.015"],["28220.30","0.424"]]}}\n'
b'1680652700600379 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1680652700610,"T":1680652700603,"s":"BTCUSDT","U":2710246765629,"u":2710246765882,"pu":2710246765597,"b":[["28018.90","0.080"],["28080.20","3.080"],["28117.00","0.223"]],"a":[["28220.20","0.452"],["28250.90","1.431"],["28281.50","0.081"]]}}\n'
b'1680652700618593 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1680652700628,"T":1680652700614,"s":"BTCUSDT","U":2710246765966,"u":2710246766123,"pu":2710246765882,"b":[],"a":[["28173.30","0.014"],["28989.60","0.650"]]}}\n'
The first token of each line is the timestamp at which the data was received locally.
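As a quick check, a raw line can be parsed by hand: split off the first token as the local receipt timestamp and decode the remainder as JSON. A minimal sketch using an abridged version of one of the depth-update lines above (the `b`/`a` arrays are shortened for brevity):

```python
import json

# Abridged depth-update line in the collected format shown above.
line = (b'1680652700600379 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate",'
        b'"E":1680652700594,"T":1680652700586,"s":"BTCUSDT","U":2710246765373,'
        b'"u":2710246765597,"pu":2710246765326,"b":[["5000.00","14.368"]],'
        b'"a":[["28145.20","6.961"]]}}\n')
local_ts, payload = line.split(b' ', 1)
msg = json.loads(payload)
print(int(local_ts))     # local receipt timestamp in microseconds
print(msg['data']['E'])  # exchange event time in milliseconds
```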
The convert method also attempts to correct timestamps by reordering the rows.
[2]:
import numpy as np
from hftbacktest.data.utils import binancefutures
data = binancefutures.convert('usdm/btcusdt_20230404.dat.gz')
np.savez('btcusdt_20230404', data=data)
local_timestamp is ahead of exch_timestamp by 18836.0
found 542 rows that exch_timestamp is ahead of the previous exch_timestamp
Correction is done.
You can save the data directly to a file by providing output_filename.
[3]:
binancefutures.convert('usdm/btcusdt_20230405.dat.gz', output_filename='btcusdt_20230405')
local_timestamp is ahead of exch_timestamp by 26932.0
found 6555 rows that exch_timestamp is ahead of the previous exch_timestamp
Correction is done.
Saving to btcusdt_20230405
[3]:
array([[ 1.00000000e+00, 1.68065280e+15, 1.68065280e+15,
1.00000000e+00, 2.23000000e+04, 2.78800000e+00],
[ 1.00000000e+00, 1.68065280e+15, 1.68065280e+15,
1.00000000e+00, 2.75774000e+04, 0.00000000e+00],
[ 1.00000000e+00, 1.68065280e+15, 1.68065280e+15,
1.00000000e+00, 2.80238000e+04, 1.63800000e+00],
...,
[ 1.00000000e+00, 1.68065321e+15, 1.68065321e+15,
-1.00000000e+00, 2.81499000e+04, 1.53200000e+00],
[ 1.00000000e+00, 1.68065321e+15, 1.68065321e+15,
-1.00000000e+00, 2.85725000e+04, 1.83000000e-01],
[ 1.00000000e+00, 1.68065321e+15, 1.68065321e+15,
-1.00000000e+00, 2.89844000e+04, 1.00000000e-03]])
Normalized data as follows. You can find more details on Data.
[4]:
import pandas as pd
df = pd.DataFrame(data, columns=['event', 'exch_timestamp', 'local_timestamp', 'side', 'price', 'qty'])
df['event'] = df['event'].astype(int)
df['exch_timestamp'] = df['exch_timestamp'].astype(int)
df['local_timestamp'] = df['local_timestamp'].astype(int)
df['side'] = df['side'].astype(int)
df
[4]:
| | event | exch_timestamp | local_timestamp | side | price | qty |
|---|---|---|---|---|---|---|
0 | 2 | 1680652700452000 | 1680652700460369 | -1 | 28145.1 | 0.002 |
1 | 2 | 1680652700452000 | 1680652700460521 | -1 | 28145.1 | 0.020 |
2 | 2 | 1680652700452000 | 1680652700460561 | -1 | 28145.1 | 0.020 |
3 | 2 | 1680652700452000 | 1680652700461364 | -1 | 28145.1 | 0.008 |
4 | 2 | 1680652700462000 | 1680652700473746 | 1 | 28145.2 | 0.002 |
... | ... | ... | ... | ... | ... | ... |
71014 | 1 | 1680652799975000 | 1680652799977784 | -1 | 28182.7 | 0.441 |
71015 | 1 | 1680652799975000 | 1680652799977784 | -1 | 28186.9 | 0.054 |
71016 | 1 | 1680652799975000 | 1680652799977784 | -1 | 28225.5 | 3.213 |
71017 | 1 | 1680652799975000 | 1680652799977784 | -1 | 28231.7 | 0.356 |
71018 | 1 | 1680652799975000 | 1680652799977784 | -1 | 28251.4 | 0.262 |
71019 rows × 6 columns
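Since the normalized data carries both the exchange and local timestamps, a quick sanity check is to look at the feed latency (local minus exchange time). A minimal sketch on a tiny synthetic frame in the same format; the two rows below simply reuse the first timestamps shown above:

```python
import pandas as pd

# Two rows in the normalized format above (timestamps in microseconds).
df = pd.DataFrame({
    'exch_timestamp':  [1680652700452000, 1680652700462000],
    'local_timestamp': [1680652700460369, 1680652700473746],
})
feed_latency = df['local_timestamp'] - df['exch_timestamp']
print(feed_latency.mean())  # average feed latency in microseconds -> 10057.5
```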
Creating a market depth snapshot
collect-binancefutures fetches the snapshot only when it establishes the connection, so you need to build the initial snapshot from the start of the collected feed data.
[5]:
from hftbacktest.data.utils import create_last_snapshot
# Build 20230404 End of Day snapshot. It will be used for the initial snapshot for 20230405.
data = create_last_snapshot('btcusdt_20230404.npz', tick_size=0.01, lot_size=0.001)
np.savez('btcusdt_20230404_eod.npz', data=data)
# Build 20230405 End of Day snapshot.
# Due to the file size limitation, btcusdt_20230405.npz does not contain data for the entire day.
create_last_snapshot(
    'btcusdt_20230405.npz',
    tick_size=0.01,
    lot_size=0.001,
    initial_snapshot='btcusdt_20230404_eod.npz',
    output_snapshot_filename='btcusdt_20230405_eod'
)
Load btcusdt_20230404.npz
Load btcusdt_20230405.npz
[5]:
array([[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
1.00000000e+00, 2.81401000e+04, 8.25100000e+00],
[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
1.00000000e+00, 2.81400000e+04, 1.62000000e-01],
[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
1.00000000e+00, 2.81399000e+04, 4.00000000e-03],
...,
[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
-1.00000000e+00, 3.09404800e+05, 2.00000000e-03],
[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
-1.00000000e+00, 3.09425600e+05, 7.00000000e-03],
[ 4.00000000e+00, 1.68065321e+15, -1.00000000e+00,
-1.00000000e+00, 3.09443200e+05, 5.00000000e-03]])
[6]:
import pandas as pd
df = pd.DataFrame(data, columns=['event', 'exch_timestamp', 'local_timestamp', 'side', 'price', 'qty'])
df['event'] = df['event'].astype(int)
df['exch_timestamp'] = df['exch_timestamp'].astype(int)
df['local_timestamp'] = df['local_timestamp'].astype(int)
df['side'] = df['side'].astype(int)
df
[6]:
| | event | exch_timestamp | local_timestamp | side | price | qty |
|---|---|---|---|---|---|---|
0 | 4 | 1680652799977784 | -1 | 1 | 28155.1 | 0.060 |
1 | 4 | 1680652799977784 | -1 | 1 | 28155.0 | 0.004 |
2 | 4 | 1680652799977784 | -1 | 1 | 28154.9 | 0.001 |
3 | 4 | 1680652799977784 | -1 | 1 | 28154.8 | 0.001 |
4 | 4 | 1680652799977784 | -1 | 1 | 28154.7 | 0.002 |
... | ... | ... | ... | ... | ... | ... |
4092 | 4 | 1680652799977784 | -1 | -1 | 30827.5 | 1.620 |
4093 | 4 | 1680652799977784 | -1 | -1 | 31500.0 | 33.077 |
4094 | 4 | 1680652799977784 | -1 | -1 | 33500.0 | 11.648 |
4095 | 4 | 1680652799977784 | -1 | -1 | 33752.3 | 0.001 |
4096 | 4 | 1680652799977784 | -1 | -1 | 33783.5 | 12.417 |
4097 rows × 6 columns
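The snapshot rows encode the side in the fourth column (1 for bids, -1 for asks), so the best bid and ask can be reconstructed directly from the array. A minimal sketch on a made-up four-row snapshot in the same layout (prices and quantities below are illustrative, not from the data):

```python
import numpy as np

# Columns: event, exch_timestamp, local_timestamp, side, price, qty
# (same layout as the snapshot above; values here are made up).
snapshot = np.array([
    [4.0, 1680652799977784.0, -1.0,  1.0, 28155.1, 0.060],
    [4.0, 1680652799977784.0, -1.0,  1.0, 28155.0, 0.004],
    [4.0, 1680652799977784.0, -1.0, -1.0, 28155.2, 0.010],
    [4.0, 1680652799977784.0, -1.0, -1.0, 28155.3, 0.020],
])
bids = snapshot[snapshot[:, 3] == 1]   # side == 1 -> bid levels
asks = snapshot[snapshot[:, 3] == -1]  # side == -1 -> ask levels
print(bids[:, 4].max(), asks[:, 4].min())  # best bid, best ask
```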
Getting started from Tardis.dev data
Only a few vendors offer tick-by-tick full market depth data along with snapshot and trade data; Tardis.dev is among them.
[ ]:
# https://docs.tardis.dev/historical-data-details/binance-futures
# Download sample Binance futures BTCUSDT trades
!wget https://datasets.tardis.dev/v1/binance-futures/trades/2020/02/01/BTCUSDT.csv.gz -O BTCUSDT_trades.csv.gz
# Download sample Binance futures BTCUSDT book
!wget https://datasets.tardis.dev/v1/binance-futures/incremental_book_L2/2020/02/01/BTCUSDT.csv.gz -O BTCUSDT_book.csv.gz
[8]:
from hftbacktest.data.utils import tardis
data = tardis.convert(['BTCUSDT_trades.csv.gz', 'BTCUSDT_book.csv.gz'])
np.savez('btcusdt_20200201.npz', data=data)
Reading BTCUSDT_trades.csv.gz
Reading BTCUSDT_book.csv.gz
Merging
found 20948 rows that exch_timestamp is ahead of the previous exch_timestamp
Correction is done.
You can save the data directly to a file by providing output_filename. If there are too many rows, you need to increase buffer_size.
[9]:
tardis.convert(
    ['BTCUSDT_trades.csv.gz', 'BTCUSDT_book.csv.gz'],
    output_filename='btcusdt_20200201.npz',
    buffer_size=200_000_000
)
Reading BTCUSDT_trades.csv.gz
Reading BTCUSDT_book.csv.gz
Merging
found 20948 rows that exch_timestamp is ahead of the previous exch_timestamp
Correction is done.
Saving to btcusdt_20200201.npz
[9]:
array([[ 2.0000000e+00, 1.5805152e+15, 1.5805152e+15, 1.0000000e+00,
9.3645100e+03, 1.1970000e+00],
[ 2.0000000e+00, 1.5805152e+15, 1.5805152e+15, 1.0000000e+00,
9.3656700e+03, 2.0000000e-02],
[ 2.0000000e+00, 1.5805152e+15, 1.5805152e+15, 1.0000000e+00,
9.3658600e+03, 1.0000000e-02],
...,
[ 1.0000000e+00, 1.5806016e+15, 1.5806016e+15, 1.0000000e+00,
9.3514700e+03, 3.9140000e+00],
[ 1.0000000e+00, 1.5806016e+15, 1.5806016e+15, -1.0000000e+00,
9.3977800e+03, 1.0000000e-01],
[ 1.0000000e+00, 1.5806016e+15, 1.5806016e+15, 1.0000000e+00,
9.3481400e+03, 3.9800000e+00]])
You can also build the snapshot in the same way as described above.
Getting Started
Printing the best bid and the best ask
[1]:
from numba import njit
# numba.njit is strongly recommended for fast backtesting.
@njit
def print_bbo(hbt):
    # Iterate until hftbacktest reaches the end of data.
    while hbt.run:
        # Elapse 60 seconds every iteration.
        # The time unit is the same as the data's timestamp unit;
        # the sample data's timestamps are in microseconds.
        if not hbt.elapse(60 * 1e6):
            # hftbacktest encountered the end of data while elapsing.
            return False
        # Print the best bid and the best ask.
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', best_bid:', round(hbt.best_bid, 3),
            ', best_ask:', round(hbt.best_ask, 3)
        )
    return True
[2]:
from hftbacktest import HftBacktest, FeedLatency, Linear
hbt = HftBacktest(
    'btcusdt_20230405.npz',
    tick_size=0.1,                # Tick size of the target trading asset.
    lot_size=0.001,               # Lot size of the target trading asset, the minimum trading unit.
    maker_fee=0.0002,             # 0.02% maker fee; a negative value is a rebate.
    taker_fee=0.0007,             # 0.07% taker fee.
    order_latency=FeedLatency(),  # Latency model: ConstantLatency, FeedLatency.
    asset_type=Linear,            # Asset type: Linear, Inverse.
    snapshot='btcusdt_20230404_eod.npz'
)
Load btcusdt_20230405.npz
You can see the best bid and the best ask every 60 seconds.
[3]:
print_bbo(hbt)
current_timestamp: 1680652860032116 , best_bid: 28150.7 , best_ask: 28150.8
current_timestamp: 1680652920032116 , best_bid: 28144.1 , best_ask: 28144.2
current_timestamp: 1680652980032116 , best_bid: 28149.9 , best_ask: 28150.0
current_timestamp: 1680653040032116 , best_bid: 28145.7 , best_ask: 28145.8
current_timestamp: 1680653100032116 , best_bid: 28140.5 , best_ask: 28140.6
current_timestamp: 1680653160032116 , best_bid: 28143.8 , best_ask: 28143.9
[3]:
False
Feeding the data
When you have adequate memory, preloading the data and providing it as input is more efficient than lazy loading during repeated backtesting. HftBacktest is compatible with either numpy arrays or pandas DataFrames.
[4]:
import numpy as np
btcusdt_20230405 = np.load('btcusdt_20230405.npz')['data']
btcusdt_20230404_eod = np.load('btcusdt_20230404_eod.npz')['data']
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
You can also provide a list of data.
[5]:
hbt = HftBacktest(
    [
        btcusdt_20230405
    ],
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
Due to the vast size of tick-by-tick market depth and trade data, loading the entire dataset into memory can be challenging, particularly when backtesting across multiple days. HftBacktest offers lazy loading support and is compatible with npy, npz (data should be stored under the data key), and pickled pandas DataFrames.
[6]:
hbt = HftBacktest(
    [
        'btcusdt_20230405.npz'
    ],
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot='btcusdt_20230404_eod.npz'
)
Load btcusdt_20230405.npz
Getting the market depth
[7]:
@njit
def print_3depth(hbt):
    while hbt.run:
        if not hbt.elapse(60 * 1e6):
            return False
        # A key of bid_depth or ask_depth is the price in tick format:
        # (integer) price_tick = price / tick_size
        print('current_timestamp:', hbt.current_timestamp)
        i = 0
        for tick_price in range(hbt.best_ask_tick, hbt.high_ask_tick + 1):
            if tick_price in hbt.ask_depth:
                print(
                    'ask: ',
                    hbt.ask_depth[tick_price],
                    '@',
                    round(tick_price * hbt.tick_size, 3)
                )
                i += 1
                if i == 3:
                    break
        i = 0
        for tick_price in range(hbt.best_bid_tick, hbt.low_bid_tick - 1, -1):
            if tick_price in hbt.bid_depth:
                print(
                    'bid: ',
                    hbt.bid_depth[tick_price],
                    '@',
                    round(tick_price * hbt.tick_size, 3)
                )
                i += 1
                if i == 3:
                    break
    return True
[8]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
print_3depth(hbt)
current_timestamp: 1680652860032116
ask: 9.228 @ 28150.8
ask: 0.387 @ 28150.9
ask: 3.996 @ 28151.0
bid: 3.135 @ 28150.7
bid: 0.002 @ 28150.6
bid: 0.813 @ 28150.5
current_timestamp: 1680652920032116
ask: 1.224 @ 28144.2
ask: 0.223 @ 28144.3
ask: 0.001 @ 28144.5
bid: 10.529 @ 28144.1
bid: 0.168 @ 28144.0
bid: 0.29 @ 28143.9
current_timestamp: 1680652980032116
ask: 3.397 @ 28150.0
ask: 1.282 @ 28150.1
ask: 0.003 @ 28150.4
bid: 7.951 @ 28149.9
bid: 0.02 @ 28149.8
bid: 0.02 @ 28149.7
current_timestamp: 1680653040032116
ask: 3.905 @ 28145.8
ask: 1.695 @ 28145.9
ask: 0.003 @ 28146.0
bid: 5.793 @ 28145.7
bid: 0.059 @ 28145.6
bid: 0.044 @ 28145.5
current_timestamp: 1680653100032116
ask: 6.8 @ 28140.6
ask: 0.001 @ 28140.7
ask: 0.004 @ 28141.1
bid: 2.416 @ 28140.5
bid: 0.004 @ 28140.4
bid: 0.012 @ 28140.3
current_timestamp: 1680653160032116
ask: 3.666 @ 28143.9
ask: 1.422 @ 28144.0
ask: 1.455 @ 28144.1
bid: 3.189 @ 28143.8
bid: 5.136 @ 28143.7
bid: 0.012 @ 28143.5
[8]:
False
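The tick-price keys used above convert back and forth like this. A minimal sketch, with the price taken from the sample output:

```python
tick_size = 0.1
price = 28150.8

# Depth dictionaries are keyed by integer tick prices.
tick_price = round(price / tick_size)

# Converting back to a price, rounding to the tick precision.
recovered = round(tick_price * tick_size, 3)
print(tick_price, recovered)
```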
Submitting an order
[9]:
from hftbacktest import GTC, NONE, NEW, FILLED, CANCELED, EXPIRED

@njit
def print_orders(hbt):
    # You can access open orders and also closed orders via hbt.orders.
    # hbt.orders is a Numba dictionary and its key is order_id (int).
    for order_id, order in hbt.orders.items():
        order_status = ''
        if order.status == NONE:
            order_status = 'NONE'  # Exchange hasn't received an order yet.
        elif order.status == NEW:
            order_status = 'NEW'
        elif order.status == FILLED:
            order_status = 'FILLED'
        elif order.status == CANCELED:
            order_status = 'CANCELED'
        elif order.status == EXPIRED:
            order_status = 'EXPIRED'
        order_req = ''
        if order.req == NONE:
            order_req = 'NONE'
        elif order.req == NEW:
            order_req = 'NEW'
        elif order.req == CANCELED:
            order_req = 'CANCEL'
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', order_id:', order_id,
            ', order_price:', order.price,
            ', order_qty:', order.qty,
            ', order_status:', order_status,
            ', order_req:', order_req
        )

@njit
def submit_order(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(30 * 1e6):
            return False
        # Prints open orders.
        print_orders(hbt)
        if not is_order_submitted:
            # Submits a buy order 100 ticks below the best bid.
            order_id = 1
            order_price = hbt.best_bid - 100 * hbt.tick_size
            order_qty = 1
            time_in_force = GTC  # Good 'til canceled
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            is_order_submitted = True
    return True
[10]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
submit_order(hbt)
current_timestamp: 1680652860032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: NEW , order_req: NONE
current_timestamp: 1680652890032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680652920032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680652950032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680652980032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653010032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653040032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653070032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653100032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653130032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653160032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680653190032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
[10]:
False
Clearing inactive orders (FILLED, CANCELED, EXPIRED)
[11]:
from hftbacktest import GTC

@njit
def clear_inactive_orders(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(30 * 1e6):
            return False
        print_orders(hbt)
        # Removes inactive (FILLED, CANCELED, EXPIRED) orders from hbt.orders.
        hbt.clear_inactive_orders()
        if not is_order_submitted:
            order_id = 1
            order_price = hbt.best_bid - 100 * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            is_order_submitted = True
    return True
[12]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
clear_inactive_orders(hbt)
current_timestamp: 1680652860032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: NEW , order_req: NONE
current_timestamp: 1680652890032116 , order_id: 1 , order_price: 28146.300000000003 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
[12]:
False
Watching an order status: pending due to order latency
[13]:
from hftbacktest import GTC

@njit
def watch_pending(hbt):
    is_order_submitted = False
    while hbt.run:
        # Elapses 0.01 sec every iteration.
        if not hbt.elapse(0.01 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        if not is_order_submitted:
            order_id = 1
            order_price = hbt.best_bid - 100 * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            is_order_submitted = True
        # Prevents too many prints.
        if hbt.orders[order_id].status == NEW:
            return False
    return True
[14]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
watch_pending(hbt)
current_timestamp: 1680652800052116 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: NONE , order_req: NEW
current_timestamp: 1680652800062116 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: NONE , order_req: NEW
current_timestamp: 1680652800072116 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: NEW , order_req: NONE
[14]:
False
Waiting for an order response
[15]:
from hftbacktest import GTC

@njit
def wait_for_order_response(hbt):
    order_id = 0
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(0.01 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        # Prevents too many prints.
        if order_id in hbt.orders:
            if hbt.orders[order_id].status == NEW:
                return False
        if not is_order_submitted:
            order_id = 1
            order_price = hbt.best_bid
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            # Waits for the order response for the given order id.
            print('an order is submitted at', hbt.current_timestamp)
            hbt.wait_order_response(order_id)
            print('an order response is received at', hbt.current_timestamp)
            is_order_submitted = True
    return True
[16]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
wait_for_order_response(hbt)
an order is submitted at 1680652800042116
an order response is received at 1680652800070493
current_timestamp: 1680652800080493 , order_id: 1 , order_price: 28155.100000000002 , order_qty: 1.0 , order_status: NEW , order_req: NONE
[16]:
False
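From the output above, the simulated round-trip latency of the submission can be read off directly (timestamps are in microseconds):

```python
submitted_at = 1680652800042116  # printed when submit_buy_order was called
response_at = 1680652800070493   # printed after wait_order_response returned

round_trip_us = response_at - submitted_at
print(round_trip_us)  # 28377 microseconds, i.e. about 28.4 ms
```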
Printing position, balance, fee, and equity
[17]:
@njit
def position(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(60 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        # Prints position, balance, fee, and equity.
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', position:', hbt.position,
            ', balance:', hbt.balance,
            ', fee:', hbt.fee,
            ', equity:', hbt.equity
        )
        if not is_order_submitted:
            order_id = 1
            order_price = hbt.best_bid
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            hbt.wait_order_response(order_id)
            is_order_submitted = True
    return True
[18]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
position(hbt)
current_timestamp: 1680652860032116 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680652920095398 , order_id: 1 , order_price: 28150.7 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680652920095398 , position: 1.0 , balance: -28150.7 , fee: 5.630140000000001 , equity: -12.180139999999273
current_timestamp: 1680652980095398 , position: 1.0 , balance: -28150.7 , fee: 5.630140000000001 , equity: -6.380140000000001
current_timestamp: 1680653040095398 , position: 1.0 , balance: -28150.7 , fee: 5.630140000000001 , equity: -10.580140000000728
current_timestamp: 1680653100095398 , position: 1.0 , balance: -28150.7 , fee: 5.630140000000001 , equity: -15.780139999997818
current_timestamp: 1680653160095398 , position: 1.0 , balance: -28150.7 , fee: 5.630140000000001 , equity: -12.480139999998546
[18]:
False
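The printed equity appears consistent with the usual linear-contract accounting, equity = balance + position × mid-price − fee. A quick check against the third output row above; the mid-price here is taken from the depth output shown earlier around the same timestamp, so this is a sanity check rather than the library's exact internals:

```python
balance = -28150.7
position = 1.0
fee = 5.63014
mid_price = (28149.9 + 28150.0) / 2.0  # best bid/ask from the earlier depth output

# Linear-contract equity: cash balance plus marked-to-mid position, net of fees.
equity = balance + position * mid_price - fee
print(equity)  # about -6.38, matching the printed equity
```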
Canceling an open order
[19]:
@njit
def submit_and_cancel_order(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(0.1 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        # Cancels the order if there is an open order.
        for order_id, order in hbt.orders.items():
            # An order is cancellable only if its status is NEW.
            # A cancel request is negated if the order is already filled, or is
            # filled before the cancel request is processed.
            if order.cancellable:
                hbt.cancel(order_id)
                # You can see that the status is still NEW while req is CANCEL.
                print_orders(hbt)
                # A cancel request also has order entry/response latencies,
                # the same as order submission.
                hbt.wait_order_response(order_id)
        if not is_order_submitted:
            order_id = 1
            order_price = hbt.best_bid - 100 * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(order_id, order_price, order_qty, time_in_force)
            hbt.wait_order_response(order_id)
            is_order_submitted = True
        else:
            if len(hbt.orders) == 0:
                return False
    return True
[20]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
submit_and_cancel_order(hbt)
current_timestamp: 1680652800254816 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: NEW , order_req: NONE
current_timestamp: 1680652800254816 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: NEW , order_req: CANCEL
current_timestamp: 1680652800973764 , order_id: 1 , order_price: 28145.100000000002 , order_qty: 1.0 , order_status: CANCELED , order_req: NONE
[20]:
False
Market order
[21]:
@njit
def print_orders_exec_price(hbt):
    for order_id, order in hbt.orders.items():
        order_status = ''
        if order.status == NONE:
            order_status = 'NONE'
        elif order.status == NEW:
            order_status = 'NEW'
        elif order.status == FILLED:
            order_status = 'FILLED'
        elif order.status == CANCELED:
            order_status = 'CANCELED'
        elif order.status == EXPIRED:
            order_status = 'EXPIRED'
        order_req = ''
        if order.req == NONE:
            order_req = 'NONE'
        elif order.req == NEW:
            order_req = 'NEW'
        elif order.req == CANCELED:
            order_req = 'CANCEL'
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', order_id:', order_id,
            ', order_price:', order.price,
            ', order_qty:', order.qty,
            ', order_status:', order_status,
            ', exec_price:', order.exec_price
        )

@njit
def market_order(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(60 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', position:', hbt.position,
            ', balance:', hbt.balance,
            ', fee:', hbt.fee,
            ', equity:', hbt.equity
        )
        if not is_order_submitted:
            order_id = 1
            # Sets a price deep on the opposite side to take liquidity.
            order_price = hbt.best_bid - 50 * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_sell_order(order_id, order_price, order_qty, time_in_force)
            hbt.wait_order_response(order_id)
            # You can see that the order is filled immediately.
            # Note that it is executed at the best bid, which differs from the
            # price it was submitted at.
            print('best_bid:', hbt.best_bid)
            print_orders_exec_price(hbt)
            is_order_submitted = True
    return True
[22]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
market_order(hbt)
current_timestamp: 1680652860032116 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
best_bid: 28150.7
current_timestamp: 1680652860095398 , order_id: 1 , order_price: 28145.7 , order_qty: 1.0 , order_status: FILLED , exec_price: 28150.7
current_timestamp: 1680652920095398 , order_id: 1 , order_price: 28145.7 , order_qty: 1.0 , order_status: FILLED , order_req: NONE
current_timestamp: 1680652920095398 , position: -1.0 , balance: 28150.7 , fee: 19.70549 , equity: -13.155490000000729
current_timestamp: 1680652980095398 , position: -1.0 , balance: 28150.7 , fee: 19.70549 , equity: -18.95549
current_timestamp: 1680653040095398 , position: -1.0 , balance: 28150.7 , fee: 19.70549 , equity: -14.755489999999273
current_timestamp: 1680653100095398 , position: -1.0 , balance: 28150.7 , fee: 19.70549 , equity: -9.555490000002184
current_timestamp: 1680653160095398 , position: -1.0 , balance: 28150.7 , fee: 19.70549 , equity: -12.855490000001456
[22]:
False
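The fee in the output above can be reproduced from the execution price and the taker fee rate passed to the backtest:

```python
exec_price = 28150.7
qty = 1.0
taker_fee_rate = 0.0007  # the taker_fee used when constructing the backtest

# Taker fee on a linear contract: notional times the fee rate.
fee = exec_price * qty * taker_fee_rate
print(fee)  # 19.70549, as printed in the output
```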
GTX (Post-Only) order
[23]:
from hftbacktest import GTX

@njit
def submit_gtx(hbt):
    is_order_submitted = False
    while hbt.run:
        if not hbt.elapse(60 * 1e6):
            return False
        print_orders(hbt)
        hbt.clear_inactive_orders()
        print(
            'current_timestamp:', hbt.current_timestamp,
            ', position:', hbt.position,
            ', balance:', hbt.balance,
            ', fee:', hbt.fee,
            ', equity:', hbt.equity
        )
        if not is_order_submitted:
            order_id = 1
            # Sets a price deep on the opposite side; the order will be
            # rejected by GTX because it would cross the book.
            order_price = hbt.best_bid - 100 * hbt.tick_size
            order_qty = 1
            time_in_force = GTX
            hbt.submit_sell_order(order_id, order_price, order_qty, time_in_force)
            hbt.wait_order_response(order_id)
            is_order_submitted = True
    return True
[24]:
hbt = HftBacktest(
btcusdt_20230405,
tick_size=0.1,
lot_size=0.001,
maker_fee=0.0002,
taker_fee=0.0007,
order_latency=FeedLatency(),
asset_type=Linear,
snapshot=btcusdt_20230404_eod
)
submit_gtx(hbt)
current_timestamp: 1680652860032116 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680652920095398 , order_id: 1 , order_price: 28140.7 , order_qty: 1.0 , order_status: EXPIRED , order_req: NONE
current_timestamp: 1680652920095398 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680652980095398 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680653040095398 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680653100095398 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
current_timestamp: 1680653160095398 , position: 0.0 , balance: 0.0 , fee: 0.0 , equity: 0.0
[24]:
False
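The EXPIRED status above arises because the sell price was set below the best bid, so the post-only order would cross the book. A minimal sketch of that condition, with the prices taken from the sample output:

```python
best_bid = 28150.7
tick_size = 0.1

sell_price = best_bid - 100 * tick_size  # deep into the bid side

# A post-only (GTX) sell order that would cross the best bid is
# expired instead of taking liquidity.
crosses = sell_price <= best_bid
print(crosses)  # True
```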
Plotting BBO
[25]:
@njit
def plot_bbo(hbt, local_timestamp, best_bid, best_ask):
    while hbt.run:
        if not hbt.elapse(1 * 1e6):
            return False
        # Records data points.
        local_timestamp.append(hbt.current_timestamp)
        best_bid.append(hbt.best_bid)
        best_ask.append(hbt.best_ask)
    return True
[26]:
# Uses Numba typed lists for njit.
from numba.typed import List
from numba import float64
import pandas as pd

hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)

local_timestamp = List.empty_list(float64, allocated=10000)
best_bid = List.empty_list(float64, allocated=10000)
best_ask = List.empty_list(float64, allocated=10000)

plot_bbo(hbt, local_timestamp, best_bid, best_ask)

local_timestamp = pd.to_datetime(local_timestamp, unit='us', utc=True)
best_bid = pd.Series(best_bid, index=local_timestamp)
best_ask = pd.Series(best_ask, index=local_timestamp)
best_bid.plot()
best_ask.plot()
[26]:
<Axes: >

Printing stats
[27]:
@njit
def submit_order_stats(hbt, recorder):
    buy_order_id = 1
    sell_order_id = 2
    half_spread = 1 * hbt.tick_size
    while hbt.run:
        if not hbt.elapse(1 * 1e6):
            return False
        hbt.clear_inactive_orders()
        mid = (hbt.best_bid + hbt.best_ask) / 2.0
        if buy_order_id not in hbt.orders:
            order_price = round((mid - half_spread) / hbt.tick_size) * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_buy_order(buy_order_id, order_price, order_qty, time_in_force)
        if sell_order_id not in hbt.orders:
            order_price = round((mid + half_spread) / hbt.tick_size) * hbt.tick_size
            order_qty = 1
            time_in_force = GTC
            hbt.submit_sell_order(sell_order_id, order_price, order_qty, time_in_force)
        recorder.record(hbt)
    return True
[28]:
from hftbacktest import Stat

hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)

stat = Stat(hbt)
submit_order_stats(hbt, stat.recorder)

# The default resample interval is 5min.
stat.summary(capital=200000, resample='1s')
=========== Summary ===========
Sharpe ratio: -355.7
Sortino ratio: -210.0
Risk return ratio: -52717.6
Annualised return: -3141.57 %
Max. draw down: 0.06 %
The number of trades per day: 9
Avg. daily trading volume: 9
Avg. daily trading amount: 255269
Max leverage: 1.27
Median leverage: 0.14

Working with Market Depth and Trades
Display 3-depth
[1]:
from numba import njit

@njit
def print_3depth(hbt):
    while hbt.elapse(60 * 1e6):
        # a key of bid_depth or ask_depth is price in tick format.
        # (integer) price_tick = price / tick_size
        print('current_timestamp:', hbt.current_timestamp)
        i = 0
        for tick_price in range(hbt.best_ask_tick, hbt.high_ask_tick + 1):
            if tick_price in hbt.ask_depth:
                print(
                    'ask: ',
                    hbt.ask_depth[tick_price],
                    '@',
                    round(tick_price * hbt.tick_size, 3)
                )
                i += 1
                if i == 3:
                    break
        i = 0
        for tick_price in range(hbt.best_bid_tick, hbt.low_bid_tick - 1, -1):
            if tick_price in hbt.bid_depth:
                print(
                    'bid: ',
                    hbt.bid_depth[tick_price],
                    '@',
                    round(tick_price * hbt.tick_size, 3)
                )
                i += 1
                if i == 3:
                    break
    return True
[2]:
import numpy as np
btcusdt_20230405 = np.load('btcusdt_20230405.npz')['data']
btcusdt_20230404_eod = np.load('btcusdt_20230404_eod.npz')['data']
[3]:
from hftbacktest import HftBacktest, FeedLatency, Linear
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)
print_3depth(hbt)
current_timestamp: 1680652860032116
ask: 9.228 @ 28150.8
ask: 0.387 @ 28150.9
ask: 3.996 @ 28151.0
bid: 3.135 @ 28150.7
bid: 0.002 @ 28150.6
bid: 0.813 @ 28150.5
current_timestamp: 1680652920032116
ask: 1.224 @ 28144.2
ask: 0.223 @ 28144.3
ask: 0.001 @ 28144.5
bid: 10.529 @ 28144.1
bid: 0.168 @ 28144.0
bid: 0.29 @ 28143.9
current_timestamp: 1680652980032116
ask: 3.397 @ 28150.0
ask: 1.282 @ 28150.1
ask: 0.003 @ 28150.4
bid: 7.951 @ 28149.9
bid: 0.02 @ 28149.8
bid: 0.02 @ 28149.7
current_timestamp: 1680653040032116
ask: 3.905 @ 28145.8
ask: 1.695 @ 28145.9
ask: 0.003 @ 28146.0
bid: 5.793 @ 28145.7
bid: 0.059 @ 28145.6
bid: 0.044 @ 28145.5
current_timestamp: 1680653100032116
ask: 6.8 @ 28140.6
ask: 0.001 @ 28140.7
ask: 0.004 @ 28141.1
bid: 2.416 @ 28140.5
bid: 0.004 @ 28140.4
bid: 0.012 @ 28140.3
current_timestamp: 1680653160032116
ask: 3.666 @ 28143.9
ask: 1.422 @ 28144.0
ask: 1.455 @ 28144.1
bid: 3.189 @ 28143.8
bid: 5.136 @ 28143.7
bid: 0.012 @ 28143.5
[3]:
True
Order Book Imbalance
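The imbalance measures computed in the next cell reduce to bid quantity minus ask quantity within a band around the mid-price. A toy sketch of the top-of-book case, with plain dictionaries standing in for hbt's depth (values are illustrative):

```python
tick_size = 0.1
bid_depth = {281507: 3.1, 281506: 0.5}  # tick price -> quantity
ask_depth = {281508: 9.2, 281509: 0.4}

best_bid_tick = max(bid_depth)
best_ask_tick = min(ask_depth)

# Top-of-book imbalance: bid quantity minus ask quantity at the best prices.
imbalance_tob = bid_depth[best_bid_tick] - ask_depth[best_ask_tick]
print(imbalance_tob)  # about -6.1: ask-heavy at the touch
```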
[4]:
@njit
def orderbookimbalance(hbt, out):
    while hbt.elapse(10 * 1e6):
        mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
        sum_ask_qty_50bp = 0.0
        sum_ask_qty = 0.0
        for tick_price in range(hbt.best_ask_tick, hbt.high_ask_tick + 1):
            if tick_price in hbt.ask_depth:
                ask_price = tick_price * hbt.tick_size
                depth_from_mid = (ask_price - mid_price) / mid_price
                if depth_from_mid > 0.01:
                    break
                sum_ask_qty += hbt.ask_depth[tick_price]
                if depth_from_mid <= 0.005:
                    sum_ask_qty_50bp = sum_ask_qty
        sum_bid_qty_50bp = 0.0
        sum_bid_qty = 0.0
        for tick_price in range(hbt.best_bid_tick, hbt.low_bid_tick - 1, -1):
            if tick_price in hbt.bid_depth:
                bid_price = tick_price * hbt.tick_size
                depth_from_mid = (mid_price - bid_price) / mid_price
                if depth_from_mid > 0.01:
                    break
                sum_bid_qty += hbt.bid_depth[tick_price]
                if depth_from_mid <= 0.005:
                    sum_bid_qty_50bp = sum_bid_qty
        imbalance_50bp = sum_bid_qty_50bp - sum_ask_qty_50bp
        imbalance_1pct = sum_bid_qty - sum_ask_qty
        imbalance_tob = hbt.bid_depth[hbt.best_bid_tick] - hbt.ask_depth[hbt.best_ask_tick]
        out.append((hbt.current_timestamp, imbalance_tob, imbalance_50bp, imbalance_1pct))
    return True
[5]:
from numba.typed import List
from numba.types import Tuple, float64

hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod
)

tup_ty = Tuple((float64, float64, float64, float64))
out = List.empty_list(tup_ty, allocated=100_000)

orderbookimbalance(hbt, out)
[5]:
True
[6]:
import polars as pl
df = pl.DataFrame(out).transpose()
df.columns = ['Local Timestamp', 'TOB Imbalance', '0.5% Imbalance', '1% Imbalance']
df = df.with_columns(
    pl.from_epoch('Local Timestamp', time_unit='us')
)
df
[6]:
Local Timestamp | TOB Imbalance | 0.5% Imbalance | 1% Imbalance |
---|---|---|---|
datetime[μs] | f64 | f64 | f64 |
2023-04-05 00:00:10.032116 | 10.62 | 79.522 | -521.026 |
2023-04-05 00:00:20.032116 | -4.868 | -152.592 | -684.247 |
2023-04-05 00:00:30.032116 | 0.59 | -161.95 | -701.843 |
2023-04-05 00:00:40.032116 | 3.962 | -142.51 | -669.033 |
2023-04-05 00:00:50.032116 | 4.912 | -114.827 | -653.331 |
2023-04-05 00:01:00.032116 | -6.093 | 273.067 | -755.551 |
2023-04-05 00:01:10.032116 | 10.201 | -36.007 | -701.998 |
2023-04-05 00:01:20.032116 | 3.009 | 121.205 | -735.241 |
2023-04-05 00:01:30.032116 | 8.383 | 381.521 | -704.022 |
2023-04-05 00:01:40.032116 | 8.672 | 166.934 | -691.313 |
2023-04-05 00:01:50.032116 | -2.575 | 393.212 | -695.99 |
2023-04-05 00:02:00.032116 | 9.305 | 165.543 | -701.034 |
… | … | … | … |
2023-04-05 00:05:00.032116 | -4.384 | 504.205 | -1005.883 |
2023-04-05 00:05:10.032116 | -10.432 | 506.615 | -952.28 |
2023-04-05 00:05:20.032116 | 20.341 | 542.513 | -917.164 |
2023-04-05 00:05:30.032116 | 3.113 | 587.247 | -858.536 |
2023-04-05 00:05:40.032116 | 1.212 | 542.287 | -901.735 |
2023-04-05 00:05:50.032116 | 3.997 | 184.424 | -833.991 |
2023-04-05 00:06:00.032116 | -0.477 | 180.863 | -825.373 |
2023-04-05 00:06:10.032116 | -3.77 | 525.716 | -887.492 |
2023-04-05 00:06:20.032116 | -5.273 | 434.96 | -1004.985 |
2023-04-05 00:06:30.032116 | -4.487 | 570.354 | -837.517 |
2023-04-05 00:06:40.032116 | -6.186 | 565.936 | -838.518 |
2023-04-05 00:06:50.032116 | 3.351 | 534.445 | -870.112 |
[7]:
import matplotlib.pyplot as plt
plt.plot(df['Local Timestamp'], df['TOB Imbalance'])
plt.plot(df['Local Timestamp'], df['0.5% Imbalance'])
plt.plot(df['Local Timestamp'], df['1% Imbalance'])
plt.legend(['TOB Imbalance', '0.5% Imbalance', '1% Imbalance'])
[7]:
<matplotlib.legend.Legend at 0x7f271704caf0>

Display the last trades between steps
[31]:
from hftbacktest import COL_EXCH_TIMESTAMP, COL_SIDE, COL_PRICE, COL_QTY
@njit
def print_trades(hbt):
    while hbt.elapse(60 * 1e6):
        print('-------------------------------------------------------------------------------')
        print('current_timestamp:', hbt.current_timestamp)
        num = 0
        for trade in hbt.last_trades:
            if num > 10:
                print('...')
                break
            print(
                'exch_timestamp:',
                trade[COL_EXCH_TIMESTAMP],
                'buy' if trade[COL_SIDE] == 1 else 'sell',
                trade[COL_QTY],
                '@',
                trade[COL_PRICE]
            )
            num += 1
        hbt.clear_last_trades()
    return True
[32]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod,
    trade_list_size=10_000
)
print_trades(hbt)
-------------------------------------------------------------------------------
current_timestamp: 1680652860032116
exch_timestamp: 1680652804962000.0 sell 0.001 @ 28155.1
exch_timestamp: 1680652804964000.0 buy 0.012 @ 28155.2
exch_timestamp: 1680652804966000.0 buy 0.002 @ 28155.2
exch_timestamp: 1680652804968000.0 buy 0.003 @ 28155.2
exch_timestamp: 1680652804981000.0 sell 0.019 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.004 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.001 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.02 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.013 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.002 @ 28155.1
exch_timestamp: 1680652804981000.0 sell 0.001 @ 28155.0
...
-------------------------------------------------------------------------------
current_timestamp: 1680652920032116
exch_timestamp: 1680652860008000.0 buy 1.887 @ 28150.8
exch_timestamp: 1680652860008000.0 buy 0.139 @ 28150.8
exch_timestamp: 1680652860009000.0 buy 0.053 @ 28150.8
exch_timestamp: 1680652860580000.0 buy 0.007 @ 28150.8
exch_timestamp: 1680652860605000.0 buy 0.063 @ 28150.8
exch_timestamp: 1680652860659000.0 sell 0.006 @ 28150.7
exch_timestamp: 1680652860659000.0 sell 0.011 @ 28150.7
exch_timestamp: 1680652860674000.0 buy 0.018 @ 28150.8
exch_timestamp: 1680652860696000.0 sell 0.009 @ 28150.7
exch_timestamp: 1680652860696000.0 sell 0.061 @ 28150.7
exch_timestamp: 1680652860821000.0 sell 0.05 @ 28150.7
...
-------------------------------------------------------------------------------
current_timestamp: 1680652980032116
exch_timestamp: 1680652920308000.0 buy 0.013 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.001 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.02 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.036 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.002 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.011 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.004 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.028 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.026 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.035 @ 28144.2
exch_timestamp: 1680652920308000.0 buy 0.026 @ 28144.2
...
-------------------------------------------------------------------------------
current_timestamp: 1680653040032116
exch_timestamp: 1680652980140000.0 buy 0.086 @ 28150.0
exch_timestamp: 1680652980140000.0 buy 0.002 @ 28150.0
exch_timestamp: 1680652980140000.0 buy 0.012 @ 28150.0
exch_timestamp: 1680652980140000.0 sell 0.02 @ 28149.9
exch_timestamp: 1680652980140000.0 sell 0.488 @ 28149.9
exch_timestamp: 1680652980140000.0 sell 0.195 @ 28149.9
exch_timestamp: 1680652980140000.0 sell 0.791 @ 28149.9
exch_timestamp: 1680652980213000.0 buy 0.007 @ 28150.0
exch_timestamp: 1680652980239000.0 sell 0.062 @ 28149.9
exch_timestamp: 1680652980266000.0 buy 0.001 @ 28150.0
exch_timestamp: 1680652980271000.0 buy 0.006 @ 28150.0
...
-------------------------------------------------------------------------------
current_timestamp: 1680653100032116
exch_timestamp: 1680653040100000.0 buy 0.007 @ 28145.8
exch_timestamp: 1680653040117000.0 buy 0.001 @ 28145.8
exch_timestamp: 1680653040117000.0 buy 0.006 @ 28145.8
exch_timestamp: 1680653040119000.0 buy 0.014 @ 28145.8
exch_timestamp: 1680653040120000.0 buy 0.008 @ 28145.8
exch_timestamp: 1680653040535000.0 sell 0.2 @ 28145.7
exch_timestamp: 1680653040652000.0 buy 0.004 @ 28145.8
exch_timestamp: 1680653040652000.0 buy 0.02 @ 28145.8
exch_timestamp: 1680653040652000.0 buy 0.216 @ 28145.8
exch_timestamp: 1680653040652000.0 buy 0.904 @ 28145.8
exch_timestamp: 1680653040838000.0 buy 0.008 @ 28145.8
...
-------------------------------------------------------------------------------
current_timestamp: 1680653160032116
exch_timestamp: 1680653100101000.0 sell 0.02 @ 28140.5
exch_timestamp: 1680653100182000.0 buy 0.007 @ 28140.6
exch_timestamp: 1680653100197000.0 buy 0.005 @ 28140.6
exch_timestamp: 1680653100230000.0 sell 0.02 @ 28140.5
exch_timestamp: 1680653100303000.0 buy 0.007 @ 28140.6
exch_timestamp: 1680653100341000.0 buy 0.017 @ 28140.6
exch_timestamp: 1680653100358000.0 sell 0.009 @ 28140.5
exch_timestamp: 1680653100358000.0 sell 0.041 @ 28140.5
exch_timestamp: 1680653100628000.0 buy 0.008 @ 28140.6
exch_timestamp: 1680653100706000.0 sell 0.004 @ 28140.5
exch_timestamp: 1680653100707000.0 sell 0.001 @ 28140.5
...
[32]:
True
Rolling Volume-Weighted Average Price
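The rolling VWAP computed below is simply the sum of price × quantity over the sum of quantity within the window. A minimal sketch with made-up trades:

```python
# (price, qty) pairs; values are illustrative.
trades = [(28150.0, 2.0), (28151.0, 1.0)]

amount = sum(p * q for p, q in trades)  # total traded notional
qty = sum(q for _, q in trades)         # total traded quantity
vwap = amount / qty
print(vwap)  # (28150*2 + 28151*1) / 3, about 28150.33
```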
[10]:
@njit
def rolling_vwap(hbt, out):
    buy_amount_bin = np.zeros(100_000, np.float64)
    buy_qty_bin = np.zeros(100_000, np.float64)
    sell_amount_bin = np.zeros(100_000, np.float64)
    sell_qty_bin = np.zeros(100_000, np.float64)
    idx = 0
    while hbt.elapse(10 * 1e6):
        for trade in hbt.last_trades:
            if trade[COL_SIDE] == 1:
                buy_amount_bin[idx] += trade[COL_PRICE] * trade[COL_QTY]
                buy_qty_bin[idx] += trade[COL_QTY]
            else:
                sell_amount_bin[idx] += trade[COL_PRICE] * trade[COL_QTY]
                sell_qty_bin[idx] += trade[COL_QTY]
        hbt.clear_last_trades()
        idx += 1
        if idx >= 1:
            vwap10sec = np.divide(
                buy_amount_bin[idx - 1] + sell_amount_bin[idx - 1],
                buy_qty_bin[idx - 1] + sell_qty_bin[idx - 1]
            )
        else:
            vwap10sec = np.nan
        if idx >= 6:
            vwap1m = np.divide(
                np.sum(buy_amount_bin[idx - 6:idx]) + np.sum(sell_amount_bin[idx - 6:idx]),
                np.sum(buy_qty_bin[idx - 6:idx]) + np.sum(sell_qty_bin[idx - 6:idx])
            )
            buy_vwap1m = np.divide(np.sum(buy_amount_bin[idx - 6:idx]), np.sum(buy_qty_bin[idx - 6:idx]))
            sell_vwap1m = np.divide(np.sum(sell_amount_bin[idx - 6:idx]), np.sum(sell_qty_bin[idx - 6:idx]))
        else:
            vwap1m = np.nan
            buy_vwap1m = np.nan
            sell_vwap1m = np.nan
        out.append((hbt.current_timestamp, vwap10sec, vwap1m, buy_vwap1m, sell_vwap1m))
    return True
[11]:
hbt = HftBacktest(
    btcusdt_20230405,
    tick_size=0.1,
    lot_size=0.001,
    maker_fee=0.0002,
    taker_fee=0.0007,
    order_latency=FeedLatency(),
    asset_type=Linear,
    snapshot=btcusdt_20230404_eod,
    trade_list_size=1_000_000
)
tup_ty = Tuple((float64, float64, float64, float64, float64))
out = List.empty_list(tup_ty, allocated=100_000)
rolling_vwap(hbt, out)
[11]:
True
[12]:
df = pl.DataFrame(out).transpose()
df.columns = ['Local Timestamp', '10-sec VWAP', '1-min VWAP', '1-min Buy VWAP', '1-min Sell VWAP']
df = df.with_columns(
    pl.from_epoch('Local Timestamp', time_unit='us')
)
df
[12]:
Local Timestamp | 10-sec VWAP | 1-min VWAP | 1-min Buy VWAP | 1-min Sell VWAP |
---|---|---|---|---|
datetime[μs] | f64 | f64 | f64 | f64 |
2023-04-05 00:00:10.032116 | 28152.252939 | NaN | NaN | NaN |
2023-04-05 00:00:20.032116 | 28155.780263 | NaN | NaN | NaN |
2023-04-05 00:00:30.032116 | 28158.015906 | NaN | NaN | NaN |
2023-04-05 00:00:40.032116 | 28155.813019 | NaN | NaN | NaN |
2023-04-05 00:00:50.032116 | 28156.783983 | NaN | NaN | NaN |
2023-04-05 00:01:00.032116 | 28154.537949 | 28155.548141 | 28155.915523 | 28155.223338 |
2023-04-05 00:01:10.032116 | 28148.396356 | 28153.368132 | 28154.890947 | 28152.319296 |
2023-04-05 00:01:20.032116 | 28146.825218 | 28152.039289 | 28153.901905 | 28150.83132 |
2023-04-05 00:01:30.032116 | 28144.455386 | 28150.627706 | 28151.934098 | 28149.81655 |
2023-04-05 00:01:40.032116 | 28144.432187 | 28149.560181 | 28150.540685 | 28148.888516 |
2023-04-05 00:01:50.032116 | 28144.340373 | 28147.623107 | 28148.057738 | 28147.411383 |
2023-04-05 00:02:00.032116 | 28142.735181 | 28146.012256 | 28146.175163 | 28145.918324 |
… | … | … | … | … |
2023-04-05 00:05:00.032116 | 28140.810824 | 28142.411037 | 28142.7071 | 28142.244012 |
2023-04-05 00:05:10.032116 | 28139.182176 | 28141.264196 | 28142.570677 | 28140.811206 |
2023-04-05 00:05:20.032116 | 28138.95427 | 28140.263581 | 28140.273185 | 28140.260421 |
2023-04-05 00:05:30.032116 | 28139.49472 | 28139.946069 | 28139.43771 | 28140.107281 |
2023-04-05 00:05:40.032116 | 28139.720917 | 28139.82033 | 28139.223683 | 28140.005444 |
2023-04-05 00:05:50.032116 | 28140.151155 | 28139.602697 | 28139.723243 | 28139.493739 |
2023-04-05 00:06:00.032116 | 28143.477257 | 28139.927132 | 28140.206712 | 28139.635456 |
2023-04-05 00:06:10.032116 | 28142.323272 | 28140.338009 | 28140.35086 | 28140.321614 |
2023-04-05 00:06:20.032116 | 28138.54843 | 28140.391805 | 28140.716124 | 28140.025369 |
2023-04-05 00:06:30.032116 | 28137.611515 | 28139.958313 | 28139.962781 | 28139.951255 |
2023-04-05 00:06:40.032116 | 28140.107487 | 28139.965883 | 28139.964373 | 28139.9681 |
2023-04-05 00:06:50.032116 | 28139.449719 | 28139.746092 | 28139.841886 | 28139.673659 |
[13]:
plt.plot(df['Local Timestamp'], df['10-sec VWAP'])
plt.plot(df['Local Timestamp'], df['1-min VWAP'])
plt.plot(df['Local Timestamp'], df['1-min Buy VWAP'])
plt.plot(df['Local Timestamp'], df['1-min Sell VWAP'])
plt.legend(['10-sec VWAP', '1-min VWAP', '1-min Buy VWAP', '1-min Sell VWAP'])
[13]:
<matplotlib.legend.Legend at 0x7f271ad2bac0>

Integrating Custom Data
By combining your custom data with the feed data (order book and trades), you can enhance your strategy while harnessing the full potential of hftbacktest.
Accessing Spot Price
In this example, we’ll combine the spot BTCUSDT mid-price with the USDM-Futures BTCUSDT feed data. This will enable you to estimate the fair value price, taking the underlying price into consideration.
The spot data is used only in the local-side, and thus, should come with a local timestamp. Following this, in your backtesting logic, your task is to identify the most recent data that predates the current timestamp.
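The lookup itself can be sketched in isolation. The example below uses plain NumPy (outside Numba) with a small hypothetical `spot` array sorted by local timestamp; in the backtest loop further down, the same effect is achieved with a running row index instead.

```python
import numpy as np

# Hypothetical spot data: column 0 is the local timestamp, column 1 is the mid-price.
spot = np.array([
    [1_000, 28150.0],
    [2_000, 28151.0],
    [3_000, 28152.0],
])

def latest_spot_mid(spot, current_timestamp):
    # Index of the last row whose timestamp is at or before current_timestamp.
    row = np.searchsorted(spot[:, 0], current_timestamp, side='right') - 1
    return spot[row, 1] if row >= 0 else np.nan

# The most recent spot mid-price at or before timestamp 2_500 is the row at 2_000.
print(latest_spot_mid(spot, 2_500))  # 28151.0
```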
The raw spot feed is processed to create spot data, which includes both a local timestamp and the spot mid price.
[1]:
import numpy as np
import gzip
import json
spot = np.full((100_000, 2), np.nan, np.float64)
i = 0
with gzip.open('spot/btcusdt_20230405.dat.gz', 'r') as f:
while True:
line = f.readline()
if not line:
break
line = line.decode().strip()
local_timestamp = int(line[:16])
obj = json.loads(line[17:])
if obj['stream'] == 'btcusdt@bookTicker':
data = obj['data']
mid = (float(data['b']) + float(data['a'])) / 2.0
# Stores the local timestamp and the spot mid-price.
spot[i] = [local_timestamp, mid]
i += 1
spot = spot[:i]
The strategy below displays the basis and the spot mid-price, identifying the latest point-in-time spot data that falls before the current timestamp.
[2]:
from numba import njit
from hftbacktest import HftBacktest, FeedLatency, Linear
@njit
def print_basis(hbt, spot):
spot_row = 0
# Checks every 60-sec (in microseconds)
while hbt.elapse(60_000_000):
# Finds the latest spot mid value.
while spot_row < len(spot) and spot[spot_row, 0] <= hbt.current_timestamp:
spot_row += 1
spot_mid_price = spot[spot_row - 1, 1] if spot_row > 0 else np.nan
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
basis = mid_price - spot_mid_price
print(
'current_timestamp:',
hbt.current_timestamp,
'futures_mid:',
round(mid_price, 2),
', spot_mid:',
round(spot_mid_price, 2),
', basis:',
round(basis, 2)
)
hbt = HftBacktest(
[
'btcusdt_20230405_m.npz'
],
tick_size=0.1,
lot_size=0.001,
maker_fee=0.0002,
taker_fee=0.0007,
order_latency=FeedLatency(),
asset_type=Linear,
snapshot='btcusdt_20230404_eod.npz'
)
print_basis(hbt, spot)
Load btcusdt_20230405_m.npz
current_timestamp: 1680652860032116 futures_mid: 28150.75 , spot_mid: 28164.42 , basis: -13.67
current_timestamp: 1680652920032116 futures_mid: 28144.15 , spot_mid: 28155.82 , basis: -11.67
current_timestamp: 1680652980032116 futures_mid: 28149.95 , spot_mid: 28163.48 , basis: -13.53
current_timestamp: 1680653040032116 futures_mid: 28145.75 , spot_mid: 28158.88 , basis: -13.12
current_timestamp: 1680653100032116 futures_mid: 28140.55 , spot_mid: 28156.06 , basis: -15.51
current_timestamp: 1680653160032116 futures_mid: 28143.85 , spot_mid: 28155.82 , basis: -11.97
Combining Spot Price
While integrating custom data with the feed data is more involved than simply accessing it, as in the first example, it becomes necessary if you intend to develop your own custom exchange model. Viewing the custom data from the exchange side allows a more comprehensive backtest, for example when accounting for funding.
[3]:
tmp = np.full((100_000, 6), np.nan, np.float64)
i = 0
with gzip.open('spot/btcusdt_20230405.dat.gz', 'r') as f:
while True:
line = f.readline()
if not line:
break
line = line.decode().strip()
local_timestamp = int(line[:16])
obj = json.loads(line[17:])
if obj['stream'] == 'btcusdt@bookTicker':
data = obj['data']
mid = (float(data['b']) + float(data['a'])) / 2.0
# Sets the event ID to 110, assigns an invalid exchange timestamp
# (as it's not used in the exchange simulation),
# and stores the mid-price in the price column.
tmp[i] = [110, -1, local_timestamp, 0, mid, 0]
i += 1
tmp = tmp[:i]
You can merge the two data sets using merge_on_local_timestamp
and then proceed to validate the data.
[4]:
from hftbacktest import merge_on_local_timestamp, validate_data
usdm_feed_data = np.load('btcusdt_20230405_m.npz')['data']
merged = merge_on_local_timestamp(usdm_feed_data, tmp)
validate_data(merged)
[4]:
0
You can obtain the spot mid-price by using get_user_data
function along with event id 110.
[5]:
from hftbacktest import reset, COL_PRICE
@njit
def print_basis(hbt):
# Checks every 60-sec (in microseconds)
while hbt.elapse(60_000_000):
funding_rate = hbt.get_user_data(102)
spot_mid_price = hbt.get_user_data(110)
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
basis = mid_price - spot_mid_price[COL_PRICE]
print(
'current_timestamp:',
hbt.current_timestamp,
'futures_mid:',
round(mid_price, 2),
'funding_rate:',
funding_rate[COL_PRICE],
', spot_mid:',
round(spot_mid_price[COL_PRICE], 2),
', basis:',
round(basis, 2)
)
reset(
hbt,
[
merged
],
snapshot='btcusdt_20230404_eod.npz'
)
print_basis(hbt)
current_timestamp: 1680652860004231 futures_mid: 28150.75 funding_rate: 2.76e-05 , spot_mid: 28164.42 , basis: -13.67
current_timestamp: 1680652920004231 futures_mid: 28144.15 funding_rate: 2.813e-05 , spot_mid: 28155.82 , basis: -11.67
current_timestamp: 1680652980004231 futures_mid: 28149.95 funding_rate: 2.826e-05 , spot_mid: 28163.48 , basis: -13.53
current_timestamp: 1680653040004231 futures_mid: 28145.75 funding_rate: 2.826e-05 , spot_mid: 28158.88 , basis: -13.12
current_timestamp: 1680653100004231 futures_mid: 28140.55 funding_rate: 2.841e-05 , spot_mid: 28156.06 , basis: -15.51
current_timestamp: 1680653160004231 futures_mid: 28143.85 funding_rate: 2.85e-05 , spot_mid: 28155.82 , basis: -11.97
Combining Funding Rate by Using Built-in Data Utility
If you’re using data that has been converted from raw feed by the built-in utility, you can effortlessly incorporate markPrice
stream data. Find out more details here.
[6]:
from hftbacktest.data.utils import binancefutures
data = binancefutures.convert('usdm/btcusdt_20230405.dat.gz', opt='m')
np.savez('btcusdt_20230405_m', data=data)
local_timestamp is ahead of exch_timestamp by 26932.0
found 6555 rows that exch_timestamp is ahead of the previous exch_timestamp
Correction is done.
You can obtain the funding rate by using get_user_data
function along with event id 102.
[7]:
@njit
def print_funding_rate(hbt):
# Checks every 60-sec (in microseconds)
while hbt.elapse(60_000_000):
# funding_rate data is stored with event id 102.
funding_rate = hbt.get_user_data(102)
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
print(
'current_timestamp:',
hbt.current_timestamp,
'futures_mid:',
round(mid_price, 2),
'funding_rate:',
funding_rate[COL_PRICE]
)
reset(
hbt,
[
'btcusdt_20230405_m.npz'
],
snapshot='btcusdt_20230404_eod.npz'
)
print_funding_rate(hbt)
Load btcusdt_20230405_m.npz
current_timestamp: 1680652860032116 futures_mid: 28150.75 funding_rate: 2.76e-05
current_timestamp: 1680652920032116 futures_mid: 28144.15 funding_rate: 2.813e-05
current_timestamp: 1680652980032116 futures_mid: 28149.95 funding_rate: 2.826e-05
current_timestamp: 1680653040032116 futures_mid: 28145.75 funding_rate: 2.826e-05
current_timestamp: 1680653100032116 futures_mid: 28140.55 funding_rate: 2.841e-05
current_timestamp: 1680653160032116 futures_mid: 28143.85 funding_rate: 2.85e-05
High-Frequency Grid Trading
Note: This example is for educational purposes only and demonstrates effective strategies for high-frequency market-making schemes. All backtests are based on a 0.005% rebate, the highest market maker rebate available on Binance Futures. See Binance Upgrades USDⓢ-Margined Futures Liquidity Provider Program for more details.
Plain High-Frequency Grid Trading
This is a high-frequency version of Grid Trading that keeps posting orders on grids centered around the mid-price, maintaining a fixed interval and a set number of grids.
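The grid placement can be seen in isolation: the first bid level is floored and the first ask level is ceiled to the nearest multiple of the grid interval, so quotes always sit on grid boundaries around the mid-price. A minimal sketch with hypothetical values:

```python
import numpy as np

tick_size = 0.01
grid_interval = tick_size * 10   # 0.1
half_spread = tick_size * 20     # 0.2
mid_price = 1250.57

# Snap the first bid down and the first ask up to the grid.
bid_order_begin = np.floor((mid_price - half_spread) / grid_interval) * grid_interval
ask_order_begin = np.ceil((mid_price + half_spread) / grid_interval) * grid_interval
print(round(bid_order_begin, 2), round(ask_order_begin, 2))  # 1250.3 1250.8
```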
[3]:
from numba import njit
import pandas as pd
import numpy as np
from numba.typed import Dict
from hftbacktest import NONE, NEW, HftBacktest, GTX, FeedLatency, SquareProbQueueModel, BUY, SELL, Linear, Stat, reset
@njit
def gridtrading(hbt, stat):
max_position = 5
grid_interval = hbt.tick_size * 10
grid_num = 20
half_spread = hbt.tick_size * 20
# Running interval in microseconds
while hbt.elapse(100_000):
# Clears cancelled, filled or expired orders.
hbt.clear_inactive_orders()
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
bid_order_begin = np.floor((mid_price - half_spread) / grid_interval) * grid_interval
ask_order_begin = np.ceil((mid_price + half_spread) / grid_interval) * grid_interval
order_qty = 0.1
last_order_id = -1
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position:
for i in range(grid_num):
bid_order_begin -= i * grid_interval
bid_order_tick = round(bid_order_begin / hbt.tick_size)
# Do not post buy orders above the best bid.
if bid_order_tick > hbt.best_bid_tick:
continue
# order price in tick is used as order id.
new_bid_orders[bid_order_tick] = bid_order_begin
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
last_order_id = order.order_id
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
last_order_id = order_id
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position:
for i in range(grid_num):
ask_order_begin += i * grid_interval
ask_order_tick = round(ask_order_begin / hbt.tick_size)
# Do not post sell orders below the best ask.
if ask_order_tick < hbt.best_ask_tick:
continue
# order price in tick is used as order id.
new_ask_orders[ask_order_tick] = ask_order_begin
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
last_order_id = order.order_id
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
last_order_id = order_id
# All order requests are considered to be requested at the same time.
# Waits until one of the order responses is received.
if last_order_id >= 0:
if not hbt.wait_order_response(last_order_id):
return False
# Records the current state for stat calculation.
stat.record(hbt)
return True
[2]:
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
'data/ethusdt_20221004.npz',
'data/ethusdt_20221005.npz',
'data/ethusdt_20221006.npz',
'data/ethusdt_20221007.npz'
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
Load data/ethusdt_20221003.npz
[3]:
%%time
gridtrading(hbt, stat.recorder)
Load data/ethusdt_20221004.npz
Load data/ethusdt_20221005.npz
Load data/ethusdt_20221006.npz
Load data/ethusdt_20221007.npz
CPU times: user 3min 58s, sys: 6.03 s, total: 4min 4s
Wall time: 4min 5s
[3]:
True
[4]:
stat.summary(capital=15_000)
=========== Summary ===========
Sharpe ratio: 20.9
Sortino ratio: 22.4
Risk return ratio: 211.5
Annualised return: 330.53 %
Max. draw down: 1.56 %
The number of trades per day: 5954
Avg. daily trading volume: 595
Avg. daily trading amount: 798115
Max leverage: 0.52
Median leverage: 0.21

High-Frequency Grid Trading with Skewing
By incorporating position-based skewing, the strategy’s risk-adjusted returns can be improved.
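The skew shifts the quote center against the inventory: the reservation price moves down when the position is long, making sells more likely to fill, and up when short. A small numeric sketch with hypothetical values:

```python
tick_size = 0.01
skew = 1
mid_price = 1250.0

# The reservation price drops by skew * position ticks.
for position in (-3, 0, 3):
    reservation_price = mid_price - skew * position * tick_size
    print(position, reservation_price)
```

A long position of 3 lowers the reservation price to 1249.97, while a short position of 3 raises it to 1250.03, so the grid is rebuilt around a center that leans against the inventory.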
[5]:
@njit
def gridtrading(hbt, stat, skew):
max_position = 5
grid_interval = hbt.tick_size * 10
grid_num = 20
half_spread = hbt.tick_size * 20
# Running interval in microseconds
while hbt.elapse(100_000):
# Clears cancelled, filled or expired orders.
hbt.clear_inactive_orders()
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
reservation_price = mid_price - skew * hbt.position * hbt.tick_size
bid_order_begin = np.floor((reservation_price - half_spread) / grid_interval) * grid_interval
ask_order_begin = np.ceil((reservation_price + half_spread) / grid_interval) * grid_interval
order_qty = 0.1 # np.round(notional_order_qty / mid_price / hbt.lot_size) * hbt.lot_size
last_order_id = -1
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position: # hbt.position * mid_price < max_notional_position
for i in range(grid_num):
bid_order_begin -= i * grid_interval
bid_order_tick = round(bid_order_begin / hbt.tick_size)
# Do not post buy orders above the best bid.
if bid_order_tick > hbt.best_bid_tick:
continue
# order price in tick is used as order id.
new_bid_orders[bid_order_tick] = bid_order_begin
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
last_order_id = order.order_id
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
last_order_id = order_id
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position: # hbt.position * mid_price > -max_notional_position
for i in range(grid_num):
ask_order_begin += i * grid_interval
ask_order_tick = round(ask_order_begin / hbt.tick_size)
# Do not post sell orders below the best ask.
if ask_order_tick < hbt.best_ask_tick:
continue
# order price in tick is used as order id.
new_ask_orders[ask_order_tick] = ask_order_begin
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
last_order_id = order.order_id
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
last_order_id = order_id
# All order requests are considered to be requested at the same time.
# Waits until one of the order responses is received.
if last_order_id >= 0:
if not hbt.wait_order_response(last_order_id):
return False
# Records the current state for stat calculation.
stat.record(hbt)
return True
Weak skew
[6]:
reset(
hbt,
[
'data/ethusdt_20221003.npz',
'data/ethusdt_20221004.npz',
'data/ethusdt_20221005.npz',
'data/ethusdt_20221006.npz',
'data/ethusdt_20221007.npz'
],
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
skew = 1
gridtrading(hbt, stat.recorder, skew)
stat.summary(capital=15_000)
Load data/ethusdt_20221003.npz
Load data/ethusdt_20221004.npz
Load data/ethusdt_20221005.npz
Load data/ethusdt_20221006.npz
Load data/ethusdt_20221007.npz
=========== Summary ===========
Sharpe ratio: 18.0
Sortino ratio: 17.5
Risk return ratio: 169.2
Annualised return: 166.77 %
Max. draw down: 0.99 %
The number of trades per day: 6488
Avg. daily trading volume: 648
Avg. daily trading amount: 870207
Max leverage: 0.50
Median leverage: 0.10

Strong skew
[7]:
reset(
hbt,
[
'data/ethusdt_20221003.npz',
'data/ethusdt_20221004.npz',
'data/ethusdt_20221005.npz',
'data/ethusdt_20221006.npz',
'data/ethusdt_20221007.npz'
],
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
skew = 10
gridtrading(hbt, stat.recorder, skew)
stat.summary(capital=15_000)
Load data/ethusdt_20221003.npz
Load data/ethusdt_20221004.npz
Load data/ethusdt_20221005.npz
Load data/ethusdt_20221006.npz
Load data/ethusdt_20221007.npz
=========== Summary ===========
Sharpe ratio: 29.3
Sortino ratio: 33.4
Risk return ratio: 735.4
Annualised return: 100.30 %
Max. draw down: 0.14 %
The number of trades per day: 6636
Avg. daily trading volume: 663
Avg. daily trading amount: 889749
Max leverage: 0.51
Median leverage: 0.02

Multiple Assets
You might need to find the proper parameters for each asset to achieve better performance. As an example, a single parameter set is used here to demonstrate how a combination of multiple assets performs.
[8]:
@njit
def gridtrading(hbt, stat, half_spread, grid_interval, skew, order_qty):
grid_num = 20
max_position = grid_num * order_qty
# Running interval in microseconds
while hbt.elapse(100_000):
mid_price = (hbt.best_bid + hbt.best_ask) / 2.0
normalized_position = hbt.position / order_qty
bid_depth = half_spread + skew * normalized_position
ask_depth = half_spread - skew * normalized_position
bid_price = min(mid_price - bid_depth, hbt.best_bid)
ask_price = max(mid_price + ask_depth, hbt.best_ask)
grid_interval = max(np.round(half_spread / hbt.tick_size) * hbt.tick_size, hbt.tick_size)
bid_price = np.floor(bid_price / grid_interval) * grid_interval
ask_price = np.ceil(ask_price / grid_interval) * grid_interval
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position and np.isfinite(bid_price):
for i in range(grid_num):
bid_price -= i * grid_interval
bid_price_tick = round(bid_price / hbt.tick_size)
# order price in tick is used as order id.
new_bid_orders[bid_price_tick] = bid_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position and np.isfinite(ask_price):
for i in range(grid_num):
ask_price += i * grid_interval
ask_price_tick = round(ask_price / hbt.tick_size)
# order price in tick is used as order id.
new_ask_orders[ask_price_tick] = ask_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
# Records the current state for stat calculation.
stat.record(hbt)
return True
[9]:
from hftbacktest import IntpOrderLatency, LogProbQueueModel2, COL_PRICE, COL_SIDE
latency_data = np.concatenate(
[np.load('../latency/order_latency_{}.npz'.format(date))['data'] for date in range(20230701, 20230732)]
)
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=LogProbQueueModel2(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name)
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
half_spread = mid * 0.0008
grid_interval = mid * 0.0008
skew = mid * 0.000025
gridtrading(hbt, stat.recorder, half_spread, grid_interval, skew, order_qty)
np.savez(
'stats/{}_stat_grid_multi'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[10]:
%%capture
import json
from multiprocessing import Pool
with open('assets.json', 'r') as f:
assets = json.load(f)
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[11]:
from matplotlib import pyplot as plt
equity_values = {}
for asset_name in assets.keys():
stat = np.load('stats/{}_stat_grid_multi.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 10 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[11]:
Text(0, 0.5, 'Cumulative Returns (%)')

Impact of Order Latency
This example illustrates the impact of order latency on the performance of the strategy.
Note: This example is for educational purposes only and demonstrates effective strategies for high-frequency market-making schemes. All backtests are based on a 0.005% rebate, the highest market maker rebate available on Binance Futures. See Binance Upgrades USDⓢ-Margined Futures Liquidity Provider Program for more details.
[1]:
from numba import njit
import numpy as np
from numba.typed import Dict
from hftbacktest import (
HftBacktest,
NONE,
NEW,
GTX,
BUY,
SELL,
ConstantLatency,
FeedLatency,
IntpOrderLatency,
SquareProbQueueModel,
Linear,
Stat
)
@njit
def measure_trading_intensity(order_arrival_depth, out):
max_tick = 0
for depth in order_arrival_depth:
if not np.isfinite(depth):
continue
# Sets the tick index to 0 for the nearest possible best price
# as the order arrival depth in ticks is measured from the mid-price
tick = round(depth / .5) - 1
# In a fast-moving market, buy trades can occur below the mid-price (and vice versa for sell trades)
# since the mid-price is measured in a previous time-step;
# however, to simplify the problem, we will exclude those cases.
if tick < 0 or tick >= len(out):
continue
# All of our possible quotes within the order arrival depth,
# excluding those at the same price, are considered executed.
out[:tick] += 1
max_tick = max(max_tick, tick)
return out[:max_tick]
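To see what the function above accumulates, here it is restated without the `@njit` decorator so it runs standalone, fed with a toy input: each finite arrival depth increments every bucket nearer than its own, so buckets close to the mid-price end up with higher counts.

```python
import numpy as np

def measure_trading_intensity(order_arrival_depth, out):
    # Counts, per half-tick bucket, how many market orders arrived
    # at or beyond that depth from the mid-price.
    max_tick = 0
    for depth in order_arrival_depth:
        if not np.isfinite(depth):
            continue
        tick = round(depth / .5) - 1
        if tick < 0 or tick >= len(out):
            continue
        out[:tick] += 1
        max_tick = max(max_tick, tick)
    return out[:max_tick]

# Depths of 1.0 and 2.0 ticks (the NaN is skipped) yield cumulative counts.
counts = measure_trading_intensity(np.array([1.0, 2.0, np.nan]), np.zeros(10))
print(counts)  # [2. 1. 1.]
```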
@njit
def linear_regression(x, y):
sx = np.sum(x)
sy = np.sum(y)
sx2 = np.sum(x ** 2)
sxy = np.sum(x * y)
w = len(x)
slope = (w * sxy - sx * sy) / (w * sx2 - sx**2)
intercept = (sy - slope * sx) / w
return slope, intercept
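The calibration step fits log λ(δ) = log A − kδ, so the regression on (δ, log λ) recovers k from the negated slope and A from the exponentiated intercept. A synthetic round-trip check, re-stating `linear_regression` without the Numba decorator and assuming noiseless intensities:

```python
import numpy as np

def linear_regression(x, y):
    # Ordinary least squares for y = slope * x + intercept.
    sx = np.sum(x)
    sy = np.sum(y)
    sx2 = np.sum(x ** 2)
    sxy = np.sum(x * y)
    w = len(x)
    slope = (w * sxy - sx * sy) / (w * sx2 - sx ** 2)
    intercept = (sy - slope * sx) / w
    return slope, intercept

A_true, k_true = 2.0, 0.8
ticks = np.arange(70) + 0.5
lambda_ = A_true * np.exp(-k_true * ticks)
k_, logA = linear_regression(ticks, np.log(lambda_))
print(np.exp(logA), -k_)  # recovers A = 2.0 and k = 0.8
```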
@njit
def compute_coeff(xi, gamma, delta, A, k):
inv_k = np.divide(1, k)
c1 = 1 / (xi * delta) * np.log(1 + xi * delta * inv_k)
c2 = np.sqrt(np.divide(gamma, 2 * A * delta * k) * ((1 + xi * delta * inv_k) ** (k / (xi * delta) + 1)))
return c1, c2
@njit
def gridtrading_glft_mm(hbt, stat):
arrival_depth = np.full(10_000_000, np.nan, np.float64)
mid_price_chg = np.full(10_000_000, np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + .5
A = np.nan
k = np.nan
volatility = np.nan
gamma = 0.05
delta = 1
adj1 = 1
adj2 = 0.05
order_qty = 1
max_position = 20
grid_num = 20
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = (c1 + 1 / 2 * c2 * volatility) * adj1
skew = c2 * volatility * adj2
bid_depth = half_spread + skew * hbt.position
ask_depth = half_spread - skew * hbt.position
# If the depth is invalid, set a large spread to prevent execution.
if not np.isfinite(bid_depth):
bid_depth = 1_000
if not np.isfinite(ask_depth):
ask_depth = 1_000
bid_price = min(round(mid_price_tick - bid_depth), hbt.best_bid_tick) * hbt.tick_size
ask_price = max(round(mid_price_tick + ask_depth), hbt.best_ask_tick) * hbt.tick_size
grid_interval = round(max(half_spread, 1)) * hbt.tick_size
bid_price = np.floor(bid_price / grid_interval) * grid_interval
ask_price = np.ceil(ask_price / grid_interval) * grid_interval
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position:
for i in range(grid_num):
bid_price -= i * grid_interval
bid_price_tick = round(bid_price / hbt.tick_size)
# order price in tick is used as order id.
new_bid_orders[bid_price_tick] = bid_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position:
for i in range(grid_num):
ask_price += i * grid_interval
ask_price_tick = round(ask_price / hbt.tick_size)
# order price in tick is used as order id.
new_ask_orders[ask_price_tick] = ask_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg):
raise Exception('t exceeded the preallocated buffer size.')
# Records the current state for stat calculation.
stat.record(hbt)
Order Latency from Feed Latency
[2]:
hbt = HftBacktest(
[
'data/ethusdt_20230401.npz',
'data/ethusdt_20230402.npz',
'data/ethusdt_20230403.npz',
'data/ethusdt_20230404.npz',
'data/ethusdt_20230405.npz',
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/ethusdt_20230331_eod.npz',
trade_list_size=10_000
)
stat = Stat(hbt)
gridtrading_glft_mm(hbt, stat.recorder)
stat.summary(capital=25_000)
Load data/ethusdt_20230401.npz
Load data/ethusdt_20230402.npz
Load data/ethusdt_20230403.npz
Load data/ethusdt_20230404.npz
Load data/ethusdt_20230405.npz
=========== Summary ===========
Sharpe ratio: 4.6
Sortino ratio: 3.3
Risk return ratio: 43.1
Annualised return: 119.45 %
Max. draw down: 2.77 %
The number of trades per day: 3212
Avg. daily trading volume: 3212
Avg. daily trading amount: 5886441
Max leverage: 3.40
Median leverage: 0.22

Historical Order Latency
[3]:
latency_data = np.concatenate(
[np.load('../latency/ethusdt_{}_latency.npz'.format(date))['data'] for date in range(20230401, 20230406)]
)
hbt = HftBacktest(
[
'data/ethusdt_20230401.npz',
'data/ethusdt_20230402.npz',
'data/ethusdt_20230403.npz',
'data/ethusdt_20230404.npz',
'data/ethusdt_20230405.npz',
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/ethusdt_20230331_eod.npz',
trade_list_size=10_000
)
stat = Stat(hbt)
gridtrading_glft_mm(hbt, stat.recorder)
stat.summary(capital=25_000)
Load data/ethusdt_20230401.npz
Load data/ethusdt_20230402.npz
Load data/ethusdt_20230403.npz
Load data/ethusdt_20230404.npz
Load data/ethusdt_20230405.npz
=========== Summary ===========
Sharpe ratio: 0.4
Sortino ratio: 0.3
Risk return ratio: 2.8
Annualised return: 11.03 %
Max. draw down: 4.00 %
The number of trades per day: 3493
Avg. daily trading volume: 3493
Avg. daily trading amount: 6401297
Max leverage: 2.47
Median leverage: 0.22

Order Latency from Amplified Feed Latency
[4]:
hbt = HftBacktest(
[
'data/ethusdt_20230401.npz',
'data/ethusdt_20230402.npz',
'data/ethusdt_20230403.npz',
'data/ethusdt_20230404.npz',
'data/ethusdt_20230405.npz',
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(entry_latency_mul=4, resp_latency_mul=3),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/ethusdt_20230331_eod.npz',
trade_list_size=10_000
)
stat = Stat(hbt)
gridtrading_glft_mm(hbt, stat.recorder)
stat.summary(capital=25_000)
Load data/ethusdt_20230401.npz
Load data/ethusdt_20230402.npz
Load data/ethusdt_20230403.npz
Load data/ethusdt_20230404.npz
Load data/ethusdt_20230405.npz
=========== Summary ===========
Sharpe ratio: 0.2
Sortino ratio: 0.2
Risk return ratio: 1.8
Annualised return: 6.66 %
Max. draw down: 3.61 %
The number of trades per day: 3193
Avg. daily trading volume: 3193
Avg. daily trading amount: 5849525
Max leverage: 4.57
Median leverage: 0.22

Guéant–Lehalle–Fernandez-Tapia Market Making Model and Grid Trading
Overview
Grid trading is straightforward and easy to comprehend, and it excels in high-frequency environments. However, given the intricacies of high-frequency trading, which necessitate comprehensive tick-by-tick simulation with latencies and order fill simulation, optimizing the ideal spread, order interval, and skew can be a challenging task. Furthermore, these values fluctuate over time, especially in response to market conditions, making a fixed setup less than optimal.
To improve grid trading’s adaptability, one solution is to combine it with a well-developed market-making model. Let’s delve into how this can be achieved.
Guéant–Lehalle–Fernandez-Tapia Market Making Model
This model is an advanced evolution of the well-known Avellaneda-Stoikov model and provides a closed-form approximation of the asymptotic behavior as the terminal time T goes to infinity. Put simply, the model does not require specifying a terminal time, which makes it suitable for typical stocks, spot assets, or crypto perpetual contracts. By employing this model, the half spread and skew are expected to adjust automatically to market conditions.
In this analysis, we will focus on equations (4.6) and (4.7) in Optimal market making and explore how they can be applied to real-world scenarios.
The optimal bid quote depth, \(\delta^{b*}_{approx}\), and ask quote depth, \(\delta^{a*}_{approx}\), are derived from the fair price as follows:
\begin{align} \delta^{b*}_{approx}(q) = {1 \over {\xi \Delta}}log(1 + {\xi \Delta \over k}) + {{2q + \Delta} \over 2}\sqrt{{{\gamma \sigma^2} \over {2A\Delta k}}(1 + {\xi \Delta \over k})^{{k \over {\xi \Delta}} + 1}} \label{eq4.6}\tag{4.6} \\ \delta^{a*}_{approx}(q) = {1 \over {\xi \Delta}}log(1 + {\xi \Delta \over k}) - {{2q - \Delta} \over 2}\sqrt{{{\gamma \sigma^2} \over {2A\Delta k}}(1 + {\xi \Delta \over k})^{{k \over {\xi \Delta}} + 1}} \label{eq4.7}\tag{4.7} \end{align}
Let’s introduce \(c_1\) and \(c_2\), defined by extracting the volatility \(\sigma\) from the square root:
\begin{align} c_1 = {1 \over {\xi \Delta}}log(1 + {\xi \Delta \over k}) \\ c_2 = \sqrt{{\gamma \over {2A\Delta k}}(1 + {\xi \Delta \over k})^{{k \over {\xi \Delta}} + 1}} \end{align}
Now we can rewrite equations (4.6) and (4.7) as follows:
\begin{align} \delta^{b*}_{approx}(q) = c_1 + {\Delta \over 2} \sigma c_2 + q \sigma c_2 \\ \delta^{a*}_{approx}(q) = c_1 + {\Delta \over 2} \sigma c_2 - q \sigma c_2 \end{align}
As you can see, this consists of the half spread and skew, where \(q\) represents the market maker’s inventory (position).
\begin{align} \text{half spread} = c_1 + {\Delta \over 2} \sigma c_2 \\ \text{skew} = \sigma c_2 \\ \delta^{b*}_{approx}(q) = \text{half spread} + \text{skew} \times q \\ \delta^{a*}_{approx}(q) = \text{half spread} - \text{skew} \times q \end{align}
Thus,
\begin{align} \text{bid price} = \text{fair price} - (\text{half spread} + \text{skew} \times q) \\ \text{ask price} = \text{fair price} + (\text{half spread} - \text{skew} \times q) \end{align}
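The quoting rule above can be sketched directly in code. This is a minimal illustration with made-up numbers, separate from the backtest code below:

```python
def compute_quotes(fair_price, half_spread, skew, position):
    # A long position lowers both quotes to encourage selling;
    # a short position raises both quotes to encourage buying.
    bid_price = fair_price - (half_spread + skew * position)
    ask_price = fair_price + (half_spread - skew * position)
    return bid_price, ask_price

# Flat inventory: quotes are symmetric around the fair price.
bid, ask = compute_quotes(100.0, 0.5, 0.1, 0)  # 99.5, 100.5
# Long 3 units: both quotes shift down by skew * q = 0.3.
bid, ask = compute_quotes(100.0, 0.5, 0.1, 3)
```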
Calculating Trading Intensity
To determine the optimal quotes, we need to compute \(c_1\) and \(c_2\). In order to do that, we need to calibrate \(A\) and \(k\) of trading intensity, as well as calculate the market volatility \(\sigma\).
Trading intensity is defined as:
\begin{align} \lambda(\delta) = A e^{-k \delta} \end{align}
We will calibrate \(A\) and \(k\) using market data, following this article. In order to do that, we need to record market orders’ arrivals.
Our market maker will react every 100ms, which means they will post or cancel orders at this interval. So, our quotes’ trading intensity will be measured in the same time-step. Ideally, we should also account for our orders’ queue position; however, to simplify the problem, we will not consider the order queue position in this analysis.
[1]:
from numba import njit
from hftbacktest import BUY, SELL
import numpy as np
@njit
def measure_trading_intensity_and_volatility(hbt):
arrival_depth = np.full(10_000_000, np.nan, np.float64)
mid_price_chg = np.full(10_000_000, np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg):
raise Exception
return arrival_depth[:t], mid_price_chg[:t]
Since we’re not considering the order’s queue position when measuring trading intensity, only market trades that cross our quote will be counted as executed.
[2]:
@njit
def measure_trading_intensity(order_arrival_depth, out):
max_tick = 0
for depth in order_arrival_depth:
if not np.isfinite(depth):
continue
# Sets the tick index to 0 for the nearest possible best price
# as the order arrival depth in ticks is measured from the mid-price
tick = round(depth / .5) - 1
# In a fast-moving market, buy trades can occur below the mid-price (and vice versa for sell trades)
# since the mid-price is measured in a previous time-step;
# however, to simplify the problem, we will exclude those cases.
if tick < 0 or tick >= len(out):
continue
# All of our possible quotes within the order arrival depth,
# excluding those at the same price, are considered executed.
out[:tick] += 1
max_tick = max(max_tick, tick)
return out[:max_tick]
Run HftBacktest to replay the market and record order arrival depth and price changes.
[3]:
from hftbacktest import HftBacktest, FeedLatency, Linear
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
],
snapshot='data/ethusdt_20221002_eod.npz',
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
asset_type=Linear,
order_latency=FeedLatency(),
trade_list_size=10_000,
)
arrival_depth, mid_price_chg = measure_trading_intensity_and_volatility(hbt)
Load data/ethusdt_20221003.npz
Measure trading intensity from the recorded order arrival depth and plot it.
[4]:
tmp = np.zeros(500, np.float64)
# Measures trading intensity (lambda) for the first 10-minute window.
lambda_ = measure_trading_intensity(arrival_depth[:6_000], tmp)
# Since it is measured for a 10-minute window, divide by 600 to convert it to per second.
lambda_ /= 600
# Creates ticks from the mid-price.
ticks = np.arange(len(lambda_)) + .5
[5]:
from matplotlib import pyplot as plt
plt.plot(ticks, lambda_)
plt.xlabel(r'$\delta$ (ticks from the mid-price)')
plt.ylabel('Count (per second)')
[5]:
Text(0, 0.5, 'Count (per second)')

Calibrate \(A\) and \(k\) using linear regression: taking the logarithm of both sides of the trading intensity function gives \(\log \lambda = -k \delta + \log A\).
[6]:
@njit
def linear_regression(x, y):
sx = np.sum(x)
sy = np.sum(y)
sx2 = np.sum(x ** 2)
sxy = np.sum(x * y)
w = len(x)
slope = (w * sxy - sx * sy) / (w * sx2 - sx**2)
intercept = (sy - slope * sx) / w
return slope, intercept
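As a quick sanity check outside Numba, the closed-form OLS above can be verified on synthetic data generated from known \(A\) and \(k\); the fit should recover both almost exactly. The numbers are illustrative only:

```python
import numpy as np

def linear_regression(x, y):
    # Ordinary least squares for y = slope * x + intercept,
    # identical to the Numba version above.
    sx = np.sum(x)
    sy = np.sum(y)
    sx2 = np.sum(x ** 2)
    sxy = np.sum(x * y)
    w = len(x)
    slope = (w * sxy - sx * sy) / (w * sx2 - sx ** 2)
    intercept = (sy - slope * sx) / w
    return slope, intercept

# Synthetic intensity from known parameters: lambda = A * exp(-k * delta).
A_true, k_true = 3.0, 0.04
x = np.arange(100) + 0.5
y = np.log(A_true * np.exp(-k_true * x))
slope, intercept = linear_regression(x, y)
# slope recovers -k_true and exp(intercept) recovers A_true.
```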
[7]:
y = np.log(lambda_)
k_, logA = linear_regression(ticks, y)
A = np.exp(logA)
k = -k_
print('A={}, k={}'.format(A, k))
A=0.8793116000410844, k=0.01761086117922129
[8]:
plt.plot(lambda_)
plt.plot(A * np.exp(-k * ticks))
plt.xlabel(r'$\delta$ (ticks from the mid-price)')
plt.ylabel('Count (per second)')
plt.legend(['Actual', 'Fitted curve'])
[8]:
<matplotlib.legend.Legend at 0x7f7d19c09810>

As you can see, the fitted lambda function is not accurate across the entire range. More specifically, it overestimates the trading intensity for the shallow range near the mid-price and underestimates it for the deep range away from the mid-price.
Since our quotes are likely to be placed in the range close to the mid-price, at least under typical market conditions (excluding high volatility conditions), we will refit the function specifically for the nearest range.
[9]:
# Refits for the range up to 70 ticks.
x_shallow = ticks[:70]
lambda_shallow = lambda_[:70]
y = np.log(lambda_shallow)
k_, logA = linear_regression(x_shallow, y)
A = np.exp(logA)
k = -k_
print('A={}, k={}'.format(A, k))
A=2.9932203436865956, k=0.04249732177397641
[10]:
plt.plot(lambda_shallow)
plt.plot(A * np.exp(-k * x_shallow))
plt.xlabel(r'$\delta$ (ticks from the mid-price)')
plt.ylabel('Count (per second)')
plt.legend(['Actual', 'Fitted curve'])
[10]:
<matplotlib.legend.Legend at 0x7f7d19a28820>

Now we have a more accurate trading intensity function. Let’s see where our quote will be placed; but before we do that, let’s calculate the volatility.
[11]:
# Since we need volatility in ticks per square root of a second and our measurement is every 100ms,
# multiply by the square root of 10.
volatility = np.nanstd(mid_price_chg) * np.sqrt(10)
print(volatility)
10.690046868333601
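The \(\sqrt{10}\) factor follows from volatility scaling with the square root of time when increments are (approximately) independent. A quick simulated check, with arbitrary numbers chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated per-100ms mid-price changes (i.i.d. increments, illustrative only).
chg_100ms = rng.normal(0.0, 3.0, 100_000)
# Aggregate every 10 steps into 1-second changes.
chg_1s = chg_100ms.reshape(-1, 10).sum(axis=1)
# The 100ms std scaled by sqrt(10) matches the per-second std.
vol_scaled = np.std(chg_100ms) * np.sqrt(10)
vol_direct = np.std(chg_1s)
```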
Compute \(c_1\) and \(c_2\) according to the equations.
[12]:
@njit
def compute_coeff(xi, gamma, delta, A, k):
inv_k = np.divide(1, k)
c1 = 1 / (xi * delta) * np.log(1 + xi * delta * inv_k)
c2 = np.sqrt(np.divide(gamma, 2 * A * delta * k) * ((1 + xi * delta * inv_k) ** (k / (xi * delta) + 1)))
return c1, c2
In the Guéant–Lehalle–Fernandez-Tapia formula, \(\Delta = 1\) and \(\xi = \gamma\). The value of \(\gamma\) is chosen arbitrarily.
[13]:
gamma = 0.05
delta = 1
volatility = 10.69
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = 1 * c1 + 1 / 2 * c2 * volatility
skew = c2 * volatility
print('half_spread={}, skew={}'.format(half_spread, skew))
half_spread=20.419892817641397, skew=9.7302402975805
What does it mean when your quote is positioned 20 ticks away from the mid-price? By analyzing the recorded order arrival depth, you can identify the number of market trades you’ll participate in as a market maker, measured in terms of count instead of volume. Additionally, the skew appears to be quite strong, as accumulating just two positions offsets the entire half spread.
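Using the values just computed (half spread ≈ 20.4 ticks, skew ≈ 9.7 ticks), a short loop makes the strength of the skew concrete: holding just two units already pushes the ask quote essentially to the mid-price, and a third unit pushes it through it.

```python
# Values computed above.
half_spread = 20.419892817641397
skew = 9.7302402975805

for q in range(4):
    bid_depth = half_spread + skew * q
    ask_depth = half_spread - skew * q
    # At q = 2 the ask depth is nearly zero; at q = 3 it is negative.
    print(f'q={q}: bid_depth={bid_depth:.1f}, ask_depth={ask_depth:.1f}')
```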
[14]:
from scipy import stats
# inverse of percentile
pct = stats.percentileofscore(arrival_depth[np.isfinite(arrival_depth)], half_spread)
your_pct = 100 - pct
print('{:.2f}%'.format(your_pct))
1.86%
Approximately 1.86% of market trades per given time-step could execute your quote. Be aware that it’s not the percentage of the traded quantity.
Implement a Market Maker using the Model
Note: This example is for educational purposes only and demonstrates a high-frequency market-making scheme. All backtests assume a 0.005% rebate, the highest market maker rebate available on Binance Futures. See Binance Upgrades USDⓢ-Margined Futures Liquidity Provider Program for more details.
In this example, we will disregard the forecast term and assume that the fair price is equal to the mid price, as we can expect the intrinsic value to remain stable in the short term.
[15]:
from numba.typed import Dict
@njit
def glft_market_maker(hbt, stat):
arrival_depth = np.full(10_000_000, np.nan, np.float64)
mid_price_chg = np.full(10_000_000, np.nan, np.float64)
out = np.full((10_000_000, 5), np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + .5
A = np.nan
k = np.nan
volatility = np.nan
gamma = 0.05
delta = 1
order_qty = 1
max_position = 20
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = c1 + delta / 2 * c2 * volatility
skew = c2 * volatility
bid_depth = half_spread + skew * hbt.position
ask_depth = half_spread - skew * hbt.position
bid_price = min(np.round(mid_price_tick - bid_depth), hbt.best_bid_tick) * hbt.tick_size
ask_price = max(np.round(mid_price_tick + ask_depth), hbt.best_ask_tick) * hbt.tick_size
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Cancel orders if they differ from the updated bid and ask prices.
for order in hbt.orders.values():
if order.side == BUY and order.cancellable and order.price != bid_price:
hbt.cancel(order.order_id)
if order.side == SELL and order.cancellable and order.price != ask_price:
hbt.cancel(order.order_id)
# If the current position is within the maximum position,
# submit the new order only if no order exists at the same price.
if hbt.position < max_position and np.isfinite(bid_price):
bid_price_as_order_id = round(bid_price / hbt.tick_size)
if bid_price_as_order_id not in hbt.orders:
hbt.submit_buy_order(bid_price_as_order_id, bid_price, order_qty, GTX)
if hbt.position > -max_position and np.isfinite(ask_price):
ask_price_as_order_id = round(ask_price / hbt.tick_size)
if ask_price_as_order_id not in hbt.orders:
hbt.submit_sell_order(ask_price_as_order_id, ask_price, order_qty, GTX)
#--------------------------------------------------------
# Records variables and stats for analysis.
out[t, 0] = half_spread
out[t, 1] = skew
out[t, 2] = volatility
out[t, 3] = A
out[t, 4] = k
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg) or t >= len(out):
raise Exception
# Records the current state for stat calculation.
stat.record(hbt)
return out[:t]
[16]:
from hftbacktest import SquareProbQueueModel, Stat, GTX
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
trade_list_size=10_000,
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
out = glft_market_maker(hbt, stat.recorder)
stat.summary(capital=10_000)
Load data/ethusdt_20221003.npz
=========== Summary ===========
Sharpe ratio: -271.1
Sortino ratio: -306.4
Risk return ratio: -365.0
Annualised return: -2213.36 %
Max. draw down: 6.06 %
The number of trades per day: 6828
Avg. daily trading volume: 6828
Avg. daily trading amount: 8859717
Max leverage: 1.45
Median leverage: 0.00

Adjustment factors
It looks like the skew is too strong, which is why the market maker is hesitant to take on a position. To alleviate the skew, you can introduce adjustment factors, \(adj_1\) and \(adj_2\), applied to the calculated half spread and skew, as follows.
[17]:
from numba.typed import Dict
@njit
def glft_market_maker(hbt, stat):
arrival_depth = np.full(10_000_000, np.nan, np.float64)
mid_price_chg = np.full(10_000_000, np.nan, np.float64)
out = np.full((10_000_000, 5), np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + .5
A = np.nan
k = np.nan
volatility = np.nan
gamma = 0.05
delta = 1
adj1 = 1
adj2 = 0.05 # Uses the same value as gamma.
order_qty = 1
max_position = 20
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = (c1 + delta / 2 * c2 * volatility) * adj1
skew = c2 * volatility * adj2
bid_depth = half_spread + skew * hbt.position
ask_depth = half_spread - skew * hbt.position
bid_price = min(np.round(mid_price_tick - bid_depth), hbt.best_bid_tick) * hbt.tick_size
ask_price = max(np.round(mid_price_tick + ask_depth), hbt.best_ask_tick) * hbt.tick_size
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Cancel orders if they differ from the updated bid and ask prices.
for order in hbt.orders.values():
if order.side == BUY and order.cancellable and order.price != bid_price:
hbt.cancel(order.order_id)
if order.side == SELL and order.cancellable and order.price != ask_price:
hbt.cancel(order.order_id)
# If the current position is within the maximum position,
# submit the new order only if no order exists at the same price.
if hbt.position < max_position and np.isfinite(bid_price):
bid_price_as_order_id = round(bid_price / hbt.tick_size)
if bid_price_as_order_id not in hbt.orders:
hbt.submit_buy_order(bid_price_as_order_id, bid_price, order_qty, GTX)
if hbt.position > -max_position and np.isfinite(ask_price):
ask_price_as_order_id = round(ask_price / hbt.tick_size)
if ask_price_as_order_id not in hbt.orders:
hbt.submit_sell_order(ask_price_as_order_id, ask_price, order_qty, GTX)
#--------------------------------------------------------
# Records variables and stats for analysis.
out[t, 0] = half_spread
out[t, 1] = skew
out[t, 2] = volatility
out[t, 3] = A
out[t, 4] = k
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg) or t >= len(out):
raise Exception
# Records the current state for stat calculation.
stat.record(hbt)
return out[:t]
[18]:
from hftbacktest import SquareProbQueueModel, Stat, GTX
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
trade_list_size=10_000,
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
out = glft_market_maker(hbt, stat.recorder)
stat.summary(capital=10_000)
Load data/ethusdt_20221003.npz
=========== Summary ===========
Sharpe ratio: 5.9
Sortino ratio: 5.6
Risk return ratio: 144.6
Annualised return: 172.58 %
Max. draw down: 1.19 %
The number of trades per day: 5497
Avg. daily trading volume: 5497
Avg. daily trading amount: 7131834
Max leverage: 3.42
Median leverage: 0.26

Improved, but even when accounting for rebates, it can only achieve breakeven at best. As shown below, the half spread and skew move together, primarily driven by \(c_2\) and the market volatility.
[19]:
import pandas as pd
dt = stat.datetime()
mid = pd.Series(stat.mid, index=dt)
half_spread = pd.Series(out[:, 0], index=dt)
skew = pd.Series(out[:, 1], index=dt)
volatility = pd.Series(out[:, 2], index=dt)
A = pd.Series(out[:, 3], index=dt)
k = pd.Series(out[:, 4], index=dt)
fig, axs = plt.subplots(2, 1, sharex=True)
fig.subplots_adjust(hspace=0)
fig.set_size_inches(10, 6)
half_spread.resample('5min').last().plot(ax=axs[0])
mid.resample('5min').last().plot(ax=axs[0].twinx(), style='r')
axs[0].set_ylabel('Half spread (tick)')
skew.resample('5min').last().plot(ax=axs[1])
mid.resample('5min').last().plot(ax=axs[1].twinx(), style='r')
axs[1].set_ylabel('Skew (tick)')
[19]:
Text(0, 0.5, 'Skew (tick)')

[20]:
fig, axs = plt.subplots(3, 1, sharex=True)
fig.subplots_adjust(hspace=0)
fig.set_size_inches(10, 9)
volatility.resample('5min').last().plot(ax=axs[0])
mid.resample('5min').last().plot(ax=axs[0].twinx(), style='r')
axs[0].set_ylabel('Volatility ($ tick/s^{1/2} $)')
A.resample('5min').last().plot(ax=axs[1])
mid.resample('5min').last().plot(ax=axs[1].twinx(), style='r')
axs[1].set_ylabel('A ($ s^{-1} $)')
k.resample('5min').last().plot(ax=axs[2])
mid.resample('5min').last().plot(ax=axs[2].twinx(), style='r')
axs[2].set_ylabel('k ($ tick^{-1} $)')
[20]:
Text(0, 0.5, 'k ($ tick^{-1} $)')

In the 5-day backtest, it’s evident that profits are generated through rebates, as a result of maintaining high trading volume by consistently posting quotes.
[21]:
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
'data/ethusdt_20221004.npz',
'data/ethusdt_20221005.npz',
'data/ethusdt_20221006.npz',
'data/ethusdt_20221007.npz'
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
trade_list_size=10_000,
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
out = glft_market_maker(hbt, stat.recorder)
stat.summary(capital=10_000)
Load data/ethusdt_20221003.npz
Load data/ethusdt_20221004.npz
Load data/ethusdt_20221005.npz
Load data/ethusdt_20221006.npz
Load data/ethusdt_20221007.npz
=========== Summary ===========
Sharpe ratio: 17.3
Sortino ratio: 14.0
Risk return ratio: 282.9
Annualised return: 512.09 %
Max. draw down: 1.81 %
The number of trades per day: 8385
Avg. daily trading volume: 8385
Avg. daily trading amount: 11231937
Max leverage: 5.24
Median leverage: 0.27

Integrating Grid Trading
Create a grid from the bid and ask prices derived from the Guéant–Lehalle–Fernandez-Tapia market-making model.
[22]:
from numba.typed import Dict
@njit
def gridtrading_glft_mm(hbt, stat):
arrival_depth = np.full(10_000_000, np.nan, np.float64)
mid_price_chg = np.full(10_000_000, np.nan, np.float64)
out = np.full((10_000_000, 5), np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + .5
A = np.nan
k = np.nan
volatility = np.nan
gamma = 0.05
delta = 1
adj1 = 1
adj2 = 0.05
order_qty = 1
max_position = 20
grid_num = 20
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = (c1 + delta / 2 * c2 * volatility) * adj1
skew = c2 * volatility * adj2
bid_depth = half_spread + skew * hbt.position
ask_depth = half_spread - skew * hbt.position
bid_price = min(mid_price_tick - bid_depth, hbt.best_bid_tick) * hbt.tick_size
ask_price = max(mid_price_tick + ask_depth, hbt.best_ask_tick) * hbt.tick_size
grid_interval = max(np.round(half_spread) * hbt.tick_size, hbt.tick_size)
bid_price = np.floor(bid_price / grid_interval) * grid_interval
ask_price = np.ceil(ask_price / grid_interval) * grid_interval
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position and np.isfinite(bid_price):
for i in range(grid_num):
bid_price_tick = round(bid_price / hbt.tick_size)
# order price in tick is used as order id.
new_bid_orders[bid_price_tick] = bid_price
bid_price -= grid_interval
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position and np.isfinite(ask_price):
for i in range(grid_num):
ask_price_tick = round(ask_price / hbt.tick_size)
# order price in tick is used as order id.
new_ask_orders[ask_price_tick] = ask_price
ask_price += grid_interval
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
#--------------------------------------------------------
# Records variables and stats for analysis.
out[t, 0] = half_spread
out[t, 1] = skew
out[t, 2] = volatility
out[t, 3] = A
out[t, 4] = k
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg) or t >= len(out):
raise Exception
# Records the current state for stat calculation.
stat.record(hbt)
return out[:t]
[23]:
hbt = HftBacktest(
[
'data/ethusdt_20221003.npz',
'data/ethusdt_20221004.npz',
'data/ethusdt_20221005.npz',
'data/ethusdt_20221006.npz',
'data/ethusdt_20221007.npz'
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
trade_list_size=10_000,
snapshot='data/ethusdt_20221002_eod.npz'
)
stat = Stat(hbt)
out = gridtrading_glft_mm(hbt, stat.recorder)
stat.summary(capital=25_000)
Load data/ethusdt_20221003.npz
Load data/ethusdt_20221004.npz
Load data/ethusdt_20221005.npz
Load data/ethusdt_20221006.npz
Load data/ethusdt_20221007.npz
=========== Summary ===========
Sharpe ratio: 21.1
Sortino ratio: 20.3
Risk return ratio: 381.1
Annualised return: 395.88 %
Max. draw down: 1.04 %
The number of trades per day: 4547
Avg. daily trading volume: 4547
Avg. daily trading amount: 6092144
Max leverage: 2.09
Median leverage: 0.16

You can see it works even better with other coins as well. In the next example, we will show how to create multiple markets to achieve better risk-adjusted returns.
[23]:
hbt = HftBacktest(
[
'data/ltcusdt_20230701.npz',
'data/ltcusdt_20230702.npz',
'data/ltcusdt_20230703.npz',
'data/ltcusdt_20230704.npz',
'data/ltcusdt_20230705.npz'
],
tick_size=0.01,
lot_size=0.001,
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
trade_list_size=10_000,
snapshot='data/ltcusdt_20230630_eod.npz'
)
stat = Stat(hbt)
out = gridtrading_glft_mm(hbt, stat.recorder)
stat.summary(capital=2_500)
=========== Summary ===========
Sharpe ratio: 17.8
Sortino ratio: 16.4
Risk return ratio: 207.0
Annualised return: 768.72 %
Max. draw down: 3.71 %
The number of trades per day: 1912
Avg. daily trading volume: 1912
Avg. daily trading amount: 206843
Max leverage: 1.36
Median leverage: 0.22

Wrapping up
Thus far, we have illustrated how to apply the model to a real-world example.
For a more effective market-making algorithm, consider dividing this model into the following categories:
Half-spread: As shown, the half-spread is a function of trading intensity and market volatility. An exponential function used for trading intensity might not be suitable for the entire range. You could develop a more refined approach to convert trading intensity to half-spread. Additionally, while historical trading intensity and market volatility are utilized here, you could forecast short-term trading intensity and volatility to respond more agilely to changes in market conditions. This might involve strategies that use news, events, liquidity vacuums, and other factors to predict volatility explosions.
Skew: The skew is also a function of trading intensity and market volatility. In this model, only inventory risk is considered, but you can also account for other risks, particularly when making multiple markets. BARRA is a good example of other risks that can be managed similarly.
Fair Value Pricing: In this model, the fair price is equal to the mid-price; however, you could incorporate forecasts such as the micro-price, or fair-value pricing through correlated assets, to enhance the strategy.
Hedging: Hedging is especially crucial when making multiple markets, as it serves as a valuable tool for managing risks.
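As an illustration of the fair-value point above, one common starting place is the imbalance-weighted mid-price, often called the micro-price. This is a generic sketch, not part of hftbacktest's API:

```python
def micro_price(best_bid, best_ask, bid_qty, ask_qty):
    # Weight each side by the opposite queue size: a heavy bid queue
    # implies upward pressure, pulling the fair value toward the ask.
    return (best_bid * ask_qty + best_ask * bid_qty) / (bid_qty + ask_qty)

# Balanced book: the micro-price equals the mid-price.
print(micro_price(99.0, 101.0, 5.0, 5.0))  # 100.0
# Heavy bid queue: the fair value sits closer to the ask.
print(micro_price(99.0, 101.0, 9.0, 1.0))  # 100.8
```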
We will address a few more topics in upcoming examples.
References
Making Multiple Markets
Overview
By diversifying your assets and constructing a market-making book, you can achieve improved risk-adjusted returns through the effects of diversification. In this example, we will demonstrate how the statistics of your market-making portfolio change as you increase the number of assets for which you create markets.
To implement Grid Trading using the GLFT market-making model across multiple assets universally without needing to adjust parameters, a few modifications are required:
Order quantities vary between assets due to differences in price, trading volume, and order book liquidity. To backtest them all at once, you need to normalize your order quantities and adjust the parameters accordingly.
In certain assets, market trades take place primarily at the best bid and offer. Since we measure trading intensity only when market trades cross our quotes, you may not observe enough executions to fit your trading intensity function in such cases. You will then need either an alternative way of deriving the half spread and skew from order arrival depths, or a longer reaction interval that captures deeper order arrivals, though the latter delays your reaction, especially in a fast-moving market.
See how \(adj_2\) is determined to normalize different order quantities.
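A minimal sketch of this normalization, mirroring how \(adj_2\) is set in the strategy code (the function name is illustrative): scaling the skew by the inverse of the maximum position expresses inventory as a fraction of capacity, so the same \(\gamma\) works across assets with very different order quantities.

```python
def skew_adjustment(order_qty, grid_num=20):
    # The maximum position is a full one-sided grid; its inverse maps
    # the raw position into [-1, 1] before it multiplies the skew term.
    max_position = grid_num * order_qty
    return 1.0 / max_position

adj2 = skew_adjustment(order_qty=0.1)
# The skew term becomes c2 * sigma * adj2 * position, comparable across assets.
```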
Note: This example is for educational purposes only and demonstrates effective strategies for high-frequency market-making schemes. All backtests are based on a 0.005% rebate, the highest market maker rebate available on Binance Futures. See Binance Upgrades USDⓢ-Margined Futures Liquidity Provider Program for more details.
[1]:
from numba import njit
from numba.typed import Dict
from hftbacktest import BUY, SELL
from hftbacktest import HftBacktest, Linear, FeedLatency, SquareProbQueueModel, Stat, GTX, GTC
import numpy as np
@njit(cache=True)
def measure_trading_intensity(order_arrival_depth, out):
max_tick = 0
for depth in order_arrival_depth:
if not np.isfinite(depth):
continue
# Sets the tick index to 0 for the nearest possible best price
# as the order arrival depth in ticks is measured from the mid-price
tick = round(depth / .5) - 1
# In a fast-moving market, buy trades can occur below the mid-price (and vice versa for sell trades)
# since the mid-price is measured in a previous time-step;
# however, to simplify the problem, we will exclude those cases.
if tick < 0 or tick >= len(out):
continue
# All of our possible quotes within the order arrival depth,
# excluding those at the same price, are considered executed.
out[:tick] += 1
max_tick = max(max_tick, tick)
return out[:max_tick]
@njit(cache=True)
def compute_coeff(xi, gamma, delta, A, k):
inv_k = np.divide(1, k)
c1 = 1 / (xi * delta) * np.log(1 + xi * delta * inv_k)
c2 = np.sqrt(np.divide(gamma, 2 * A * delta * k) * ((1 + xi * delta * inv_k) ** (k / (xi * delta) + 1)))
return c1, c2
@njit(cache=True)
def linear_regression(x, y):
sx = np.sum(x)
sy = np.sum(y)
sx2 = np.sum(x ** 2)
sxy = np.sum(x * y)
w = len(x)
slope = (w * sxy - sx * sy) / (w * sx2 - sx**2)
intercept = (sy - slope * sx) / w
return slope, intercept
@njit(cache=True)
def gridtrading_glft_mm(hbt, stat, order_qty):
arrival_depth = np.full(30_000_000, np.nan, np.float64)
mid_price_chg = np.full(30_000_000, np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + .5
A = np.nan
k = np.nan
volatility = np.nan
gamma = 0.05
delta = 1
adj1 = 1
# adj2 is determined according to the order quantity.
grid_num = 20
max_position = grid_num * order_qty
adj2 = 1 / max_position
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
# To properly calibrate A and k, a sufficient number of data points is required; here, a minimum of three.
# If market trades only take place at the best bid and offer, an alternative method may be necessary
# to compute half spread and skew, since fitting a function might not be feasible due to insufficient
# data points.
# Alternatively, you can increase the time-step for measuring order arrivals,
# but this could result in a delayed response.
if len(lambda_) > 2:
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff(gamma, gamma, delta, A, k)
half_spread = max((c1 + delta / 2 * c2 * volatility) * adj1, 0.5)
skew = c2 * volatility * adj2
bid_depth = half_spread + skew * hbt.position
ask_depth = half_spread - skew * hbt.position
bid_price = min(mid_price_tick - bid_depth, hbt.best_bid_tick) * hbt.tick_size
ask_price = max(mid_price_tick + ask_depth, hbt.best_ask_tick) * hbt.tick_size
grid_interval = max(np.round(half_spread) * hbt.tick_size, hbt.tick_size)
bid_price = np.floor(bid_price / grid_interval) * grid_interval
ask_price = np.ceil(ask_price / grid_interval) * grid_interval
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position and np.isfinite(bid_price):
for i in range(grid_num):
bid_price -= i * grid_interval
bid_price_tick = round(bid_price / hbt.tick_size)
# order price in tick is used as order id.
new_bid_orders[bid_price_tick] = bid_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position and np.isfinite(ask_price):
for i in range(grid_num):
ask_price += i * grid_interval
ask_price_tick = round(ask_price / hbt.tick_size)
# order price in tick is used as order id.
new_ask_orders[ask_price_tick] = ask_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg):
raise Exception
# Records the current state for stat calculation.
stat.record(hbt)
The order quantity is determined to be equivalent to a notional value of $100.
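This conversion can be sketched on its own (the function name is illustrative): the target notional is divided by the mid-price, rounded to the nearest lot with a one-lot floor, and converted back to asset units.

```python
def notional_order_qty(mid, lot_size, notional=100.0):
    # Number of lots closest to $notional worth, floored at one lot,
    # then converted back to a quantity in asset units.
    lots = max(round((notional / mid) / lot_size), 1)
    return lots * lot_size
```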
[2]:
from hftbacktest import COL_SIDE, COL_PRICE
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(1),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gridtrading_glft_mm(hbt, stat.recorder, order_qty)
np.savez(
'{}_stat'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
By utilizing multiprocessing, backtesting of multiple assets can be conducted simultaneously.
[3]:
%%capture
import json
from multiprocessing import Pool
with open('assets.json', 'r') as f:
assets = json.load(f)
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[4]:
import pandas as pd
equity_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
You can see the equity curve of individual assets and notice how combining multiple assets can lead to a smoother equity curve, thereby enhancing risk-adjusted returns.
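The smoothing effect described above can be illustrated with synthetic, uncorrelated PnL streams (purely illustrative numbers, not backtest output):

```python
import numpy as np

def sharpe(pnl):
    # Per-period Sharpe ratio of a PnL series.
    return pnl.mean() / pnl.std()

rng = np.random.default_rng(42)
n_periods, n_assets = 1000, 25
# Independent PnL streams sharing the same per-period Sharpe.
pnl = rng.normal(0.1, 1.0, size=(n_periods, n_assets))

single_sr = sharpe(pnl[:, 0])
combined_sr = sharpe(pnl.mean(axis=1))
# With uncorrelated assets, the combined Sharpe grows roughly as sqrt(n_assets).
```

In practice crypto assets are positively correlated, so the realized improvement is smaller than the idealized square-root scaling, but the direction of the effect is the same.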
[5]:
from matplotlib import pyplot as plt
for i, asset_name in enumerate(assets.keys()):
plt.figure(i, figsize=(10, 3))
plt.plot(list(equity_values.values())[i])
plt.grid()
plt.title(asset_name)
plt.ylabel('Equity ($)')
This presents the combined equity curve as the number of assets increases; the assets are altcoins, excluding BTC and ETH.
[6]:
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 5 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[6]:
Text(0, 0.5, 'Cumulative Returns (%)')

Impact of Order Latency
When applying amplified feed latency, you can observe a decrease in performance due to the effects of latency.
[7]:
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=FeedLatency(entry_latency_mul=4, resp_latency_mul=3),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gridtrading_glft_mm(hbt, stat.recorder, order_qty)
np.savez(
'{}_stat_latency1'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[8]:
%%capture
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[9]:
equity_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat_latency1.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 5 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[9]:
Text(0, 0.5, 'Cumulative Returns (%)')

When actual historical order latency is applied, the performance may deteriorate further compared to when amplified feed latency is used.
[10]:
from hftbacktest import IntpOrderLatency
latency_data = np.concatenate(
[np.load('../latency/order_latency_{}.npz'.format(date))['data'] for date in range(20230701, 20230732)]
)
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gridtrading_glft_mm(hbt, stat.recorder, order_qty)
np.savez(
'{}_stat_latency2'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[11]:
%%capture
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[12]:
equity_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat_latency2.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 5 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[12]:
Text(0, 0.5, 'Cumulative Returns (%)')

Therefore, understanding your order latency is crucial to achieving more precise backtest results. This understanding underscores the importance of latency reduction for market makers or high-frequency traders. This is why crypto exchanges not only offer maker rebates but also provide low-latency infrastructure to eligible market makers.
Simpler model
So far, we have only covered the \(\xi>0\) case, but the \(\xi=0\) case is simpler and often more appropriate in practice, especially in cryptocurrencies.
Revisit equations (4.6) and (4.7) in Optimal market making to see how they can be applied to real-world scenarios.
The optimal bid quote depth, \(\delta^{b*}_{approx}\), and ask quote depth, \(\delta^{a*}_{approx}\), are derived from the fair price as follows in the case of \(\xi=0\):
\begin{align} \delta^{b*}_{approx}(q) = {1 \over k} + {{2q + \Delta} \over 2}\sqrt{{{\gamma \sigma^2 e} \over {2A\Delta k}}} \label{eq4.6}\tag{4.6} \\ \delta^{a*}_{approx}(q) = {1 \over k} - {{2q - \Delta} \over 2}\sqrt{{{\gamma \sigma^2 e} \over {2A\Delta k}}} \label{eq4.7}\tag{4.7} \end{align}
Let’s introduce \(c_1\) and \(c_2\), defined by extracting the volatility \(\sigma\) from the square root, as before:
\begin{align} c_1 = {1 \over k} \\ c_2 = \sqrt{{{\gamma e} \over {2A\Delta k}}} \end{align}
Now we can rewrite equations (4.6) and (4.7) as follows:
\begin{align} \delta^{b*}_{approx}(q) = c_1 + {\Delta \over 2} \sigma c_2 + q \sigma c_2 \\ \delta^{a*}_{approx}(q) = c_1 + {\Delta \over 2} \sigma c_2 - q \sigma c_2 \end{align}
This form is more concise: only \(\gamma\) needs to be adjusted, and its effect is more straightforward.
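Plugging illustrative values into these closed-form coefficients (the numbers are chosen purely for demonstration) shows how the half-spread and skew come out:

```python
import numpy as np

def coeff_xi_zero(gamma, delta, A, k):
    # c1 = 1/k, c2 = sqrt(gamma * e / (2 * A * delta * k)), per the xi = 0 case.
    c1 = 1.0 / k
    c2 = np.sqrt(gamma * np.e / (2.0 * A * delta * k))
    return c1, c2

c1, c2 = coeff_xi_zero(gamma=0.05, delta=1.0, A=1.0, k=1.5)
sigma, q = 2.0, 3.0
half_spread = c1 + (1.0 / 2.0) * c2 * sigma  # delta/2 * sigma * c2 with delta = 1
skew = c2 * sigma
bid_depth = half_spread + skew * q  # long inventory pushes the bid deeper
ask_depth = half_spread - skew * q  # and pulls the ask closer
```

A larger \(\gamma\) increases \(c_2\), widening the half-spread and strengthening the inventory skew at the same time.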
[13]:
@njit(cache=True)
def compute_coeff_simplified(gamma, delta, A, k):
inv_k = np.divide(1, k)
c1 = inv_k
c2 = np.sqrt(np.divide(gamma * np.exp(1), 2 * A * delta * k))
return c1, c2
@njit
def gridtrading_glft_mm(hbt, stat, gamma, order_qty):
arrival_depth = np.full(30_000_000, np.nan, np.float64)
mid_price_chg = np.full(30_000_000, np.nan, np.float64)
t = 0
prev_mid_price_tick = np.nan
mid_price_tick = np.nan
tmp = np.zeros(500, np.float64)
ticks = np.arange(len(tmp)) + 0.5
A = np.nan
k = np.nan
volatility = np.nan
delta = 1
grid_num = 20
max_position = 50 * order_qty
# Checks every 100 milliseconds.
while hbt.elapse(100_000):
#--------------------------------------------------------
# Records market order's arrival depth from the mid-price.
if not np.isnan(mid_price_tick):
depth = -np.inf
for trade in hbt.last_trades:
side = trade[3]
trade_price_tick = trade[4] / hbt.tick_size
if side == BUY:
depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
else:
depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
arrival_depth[t] = depth
hbt.clear_last_trades()
prev_mid_price_tick = mid_price_tick
mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
# Records the mid-price change for volatility calculation.
mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
#--------------------------------------------------------
# Calibrates A, k and calculates the market volatility.
# Updates A, k, and the volatility every 5-sec.
if t % 50 == 0:
# Window size is 10-minute.
if t >= 6_000 - 1:
# Calibrates A, k
tmp[:] = 0
lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
# To properly calibrate A and k, a sufficient number of data points is required; here, a minimum of three.
# If market trades only take place at the best bid and offer, an alternative method may be necessary
# to compute half spread and skew, since fitting a function might not be feasible due to insufficient
# data points.
# Alternatively, you can increase the time-step for measuring order arrivals,
# but this could result in a delayed response.
if len(lambda_) > 2:
lambda_ = lambda_[:70] / 600
x = ticks[:len(lambda_)]
y = np.log(lambda_)
k_, logA = linear_regression(x, y)
A = np.exp(logA)
k = -k_
# Updates the volatility.
volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
#--------------------------------------------------------
# Computes bid price and ask price.
c1, c2 = compute_coeff_simplified(gamma, delta, A, k)
half_spread = c1 + delta / 2 * c2 * volatility
skew = c2 * volatility
normalized_position = hbt.position / order_qty
bid_depth = half_spread + skew * normalized_position
ask_depth = half_spread - skew * normalized_position
bid_price = min(mid_price_tick - bid_depth, hbt.best_bid_tick) * hbt.tick_size
ask_price = max(mid_price_tick + ask_depth, hbt.best_ask_tick) * hbt.tick_size
grid_interval = max(np.round(half_spread) * hbt.tick_size, hbt.tick_size)
bid_price = np.floor(bid_price / grid_interval) * grid_interval
ask_price = np.ceil(ask_price / grid_interval) * grid_interval
#--------------------------------------------------------
# Updates quotes.
hbt.clear_inactive_orders()
# Creates a new grid for buy orders.
new_bid_orders = Dict.empty(np.int64, np.float64)
if hbt.position < max_position and np.isfinite(bid_price):
for i in range(grid_num):
bid_price -= i * grid_interval
bid_price_tick = round(bid_price / hbt.tick_size)
# order price in tick is used as order id.
new_bid_orders[bid_price_tick] = bid_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_bid_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
# Creates a new grid for sell orders.
new_ask_orders = Dict.empty(np.int64, np.float64)
if hbt.position > -max_position and np.isfinite(ask_price):
for i in range(grid_num):
ask_price += i * grid_interval
ask_price_tick = round(ask_price / hbt.tick_size)
# order price in tick is used as order id.
new_ask_orders[ask_price_tick] = ask_price
for order in hbt.orders.values():
# Cancels if an order is not in the new grid.
if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
hbt.cancel(order.order_id)
for order_id, order_price in new_ask_orders.items():
# Posts an order if it doesn't exist.
if order_id not in hbt.orders:
hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
t += 1
if t >= len(arrival_depth) or t >= len(mid_price_chg):
raise Exception
# Records the current state for stat calculation.
stat.record(hbt)
[14]:
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gamma = 0.00005
gridtrading_glft_mm(hbt, stat.recorder, gamma, order_qty)
np.savez(
'{}_stat_sim'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[15]:
%%capture
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[16]:
equity_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat_sim.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 5 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[16]:
Text(0, 0.5, 'Cumulative Returns (%)')

[17]:
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230701, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230630_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230630_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gamma = 0.01
gridtrading_glft_mm(hbt, stat.recorder, gamma, order_qty)
np.savez(
'{}_stat_sim2'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[18]:
%%capture
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
You can observe a straighter equity curve with a higher \(\gamma\), which induces greater skew. However, the strategy also experiences more severe drawdowns in fast-moving markets, and because of the stronger skew, profits are diminished by the greater tendency to close out positions.
[19]:
equity_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat_sim2.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_values[asset_name] = equity.resample('5min').last()
fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
net_equity = None
for i, equity in enumerate(list(equity_values.values())):
asset_number = i + 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 5 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
net_equity_rs = net_equity_.resample('1d').last()
pnl = net_equity_rs.diff()
sr = pnl.mean() / pnl.std()
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(legend)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[19]:
Text(0, 0.5, 'Cumulative Returns (%)')

A Case for More Assets
The more assets you make markets in, the better the risk-adjusted return you can achieve through diversification. This effect becomes dramatically evident as the asset count grows.
[20]:
with open('assets2.json', 'r') as f:
assets = json.load(f)
latency_data = np.concatenate(
[np.load('../latency/order_latency_{}.npz'.format(date))['data'] for date in range(20230731, 20230732)]
)
def backtest(args):
asset_name, asset_info = args
hbt = HftBacktest(
['data/{}_{}.npz'.format(asset_name, date) for date in range(20230731, 20230732)],
tick_size=asset_info['tick_size'],
lot_size=asset_info['lot_size'],
maker_fee=-0.00005,
taker_fee=0.0007,
order_latency=IntpOrderLatency(data=latency_data),
queue_model=SquareProbQueueModel(),
asset_type=Linear,
snapshot='data/{}_20230730_eod.npz'.format(asset_name),
trade_list_size=10000,
)
stat = Stat(hbt)
# Obtains the mid-price of the asset to determine the order quantity.
data = np.load('data/{}_20230730_eod.npz'.format(asset_name))['data']
best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
mid = (best_bid + best_ask) / 2.0
# Sets the order quantity to be equivalent to a notional value of $100.
order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
gamma = 0.00005
gridtrading_glft_mm(hbt, stat.recorder, gamma, order_qty)
np.savez(
'{}_stat_000005'.format(asset_name),
timestamp=np.asarray(stat.timestamp),
mid=np.asarray(stat.mid),
balance=np.asarray(stat.balance),
position=np.asarray(stat.position),
fee=np.asarray(stat.fee),
)
[21]:
%%capture
with Pool(16) as p:
print(p.map(backtest, list(assets.items())))
[22]:
equity_values = {}
sr_values = {}
for asset_name in assets.keys():
stat = np.load('{}_stat_000005.npz'.format(asset_name))
timestamp = stat['timestamp']
mid = stat['mid']
balance = stat['balance']
position = stat['position']
fee = stat['fee']
equity = mid * position + balance - fee
equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
equity_ = equity.resample('5min').last()
pnl = equity_.diff()
sr = np.divide(pnl.mean(), pnl.std())
equity_values[asset_name] = equity_
sr_values[asset_name] = sr
sr_m = np.nanmean(list(sr_values.values()))
sr_s = np.nanstd(list(sr_values.values()))
fig = plt.figure()
fig.set_size_inches(10, 3)
asset_number = 0
legend = []
net_equity = None
for i, (equity, sr) in enumerate(zip(equity_values.values(), sr_values.values())):
# There are some assets that aren't working within this scheme.
# This might be because the order arrivals don't follow a Poisson distribution that this model assumes.
# As a result, it filters out assets whose SR falls outside -0.5 sigma.
if (sr - sr_m) / sr_s > -0.5:
asset_number += 1
if net_equity is None:
net_equity = equity.copy()
else:
net_equity += equity.copy()
if asset_number % 10 == 0:
# 2_000 is capital for each trading asset.
net_equity_ = (net_equity / asset_number) / 2_000
# net_equity_rs = net_equity_.resample('1d').last()
# pnl = net_equity_rs.diff()
pnl = net_equity_.diff()
sr = pnl.mean() / pnl.std() * np.sqrt(288)
ann_sr = sr * np.sqrt(365)
legend.append('{} assets, SR={:.2f} (Daily SR={:.2f})'.format(asset_number, ann_sr, sr))
(net_equity_ * 100).plot()
plt.legend(
legend,
loc='upper center', bbox_to_anchor=(0.5, -0.15),
fancybox=True, shadow=True, ncol=3
)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
/tmp/ipykernel_16711/1802221944.py:16: RuntimeWarning: invalid value encountered in divide
sr = np.divide(pnl.mean(), pnl.std())
[22]:
Text(0, 0.5, 'Cumulative Returns (%)')

Probability Queue Position Models
Overview
Here, we will demonstrate how queue position models affect order fill simulation and, ultimately, the strategy’s performance. Accurate backtesting requires finding a proper queue position model by comparing backtest results against real trading results. In this example, we illustrate such comparisons by swapping queue position models, so you can see how to determine the model that best aligns your backtests with real trading.
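The provided models share a common shape: the estimated fill chance of your order depends on the quantities ahead of and behind it through a weighting function \(f\). Below is a minimal sketch of this family; the exact functions used by hftbacktest's models may differ, and the two weightings shown are assumptions for illustration:

```python
import numpy as np

def queue_prob(front, back, f):
    # Estimated queue-position weight: the more volume sits behind the
    # order relative to in front, the closer it effectively is to fill.
    return f(back) / (f(back) + f(front))

square = lambda x: x ** 2          # cf. SquareProbQueueModel
log_f = lambda x: np.log(1.0 + x)  # cf. the log-based models
```

A more convex \(f\) (like the square) discounts a deep queue position more aggressively than a concave one (like the log), which is why fill timing, and hence strategy performance, differs between models.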
Note: This example is for educational purposes only and demonstrates effective strategies for high-frequency market-making schemes. All backtests are based on a 0.005% rebate, the highest market maker rebate available on Binance Futures. See Binance Upgrades USDⓢ-Margined Futures Liquidity Provider Program for more details.
[1]:
from numba import njit
from numba.typed import Dict
from hftbacktest import BUY, SELL
from hftbacktest import (
HftBacktest,
Linear,
Stat,
GTX,
COL_SIDE,
COL_PRICE,
IntpOrderLatency,
SquareProbQueueModel,
LogProbQueueModel2,
PowerProbQueueModel3
)
import numpy as np
@njit(cache=True)
def measure_trading_intensity(order_arrival_depth, out):
    max_tick = 0
    for depth in order_arrival_depth:
        if not np.isfinite(depth):
            continue
        # Sets the tick index to 0 for the nearest possible best price,
        # as the order arrival depth in ticks is measured from the mid-price.
        tick = round(depth / .5) - 1
        # In a fast-moving market, buy trades can occur below the mid-price
        # (and vice versa for sell trades) since the mid-price is measured in
        # a previous time-step; however, to simplify the problem, we exclude those cases.
        if tick < 0 or tick >= len(out):
            continue
        # All of our possible quotes within the order arrival depth,
        # excluding those at the same price, are considered executed.
        out[:tick] += 1
        max_tick = max(max_tick, tick)
    return out[:max_tick]
@njit(cache=True)
def linear_regression(x, y):
    sx = np.sum(x)
    sy = np.sum(y)
    sx2 = np.sum(x ** 2)
    sxy = np.sum(x * y)
    w = len(x)
    slope = (w * sxy - sx * sy) / (w * sx2 - sx**2)
    intercept = (sy - slope * sx) / w
    return slope, intercept
@njit(cache=True)
def compute_coeff_simplified(gamma, delta, A, k):
    inv_k = np.divide(1, k)
    c1 = inv_k
    c2 = np.sqrt(np.divide(gamma * np.exp(1), 2 * A * delta * k))
    return c1, c2
@njit
def gridtrading_glft_mm(hbt, stat, gamma, order_qty):
    arrival_depth = np.full(30_000_000, np.nan, np.float64)
    mid_price_chg = np.full(30_000_000, np.nan, np.float64)
    t = 0
    prev_mid_price_tick = np.nan
    mid_price_tick = np.nan
    tmp = np.zeros(500, np.float64)
    ticks = np.arange(len(tmp)) + 0.5
    A = np.nan
    k = np.nan
    volatility = np.nan
    delta = 1
    grid_num = 20
    max_position = 50 * order_qty
    # Checks every 100 milliseconds.
    while hbt.elapse(100_000):
        #--------------------------------------------------------
        # Records market order's arrival depth from the mid-price.
        if not np.isnan(mid_price_tick):
            depth = -np.inf
            for trade in hbt.last_trades:
                side = trade[3]
                trade_price_tick = trade[4] / hbt.tick_size
                if side == BUY:
                    depth = np.nanmax([trade_price_tick - mid_price_tick, depth])
                else:
                    depth = np.nanmax([mid_price_tick - trade_price_tick, depth])
            arrival_depth[t] = depth
        hbt.clear_last_trades()
        prev_mid_price_tick = mid_price_tick
        mid_price_tick = (hbt.best_bid_tick + hbt.best_ask_tick) / 2.0
        # Records the mid-price change for volatility calculation.
        mid_price_chg[t] = mid_price_tick - prev_mid_price_tick
        #--------------------------------------------------------
        # Calibrates A, k and calculates the market volatility.
        # Updates A, k, and the volatility every 5 seconds.
        if t % 50 == 0:
            # The window size is 10 minutes.
            if t >= 6_000 - 1:
                # Calibrates A, k.
                tmp[:] = 0
                lambda_ = measure_trading_intensity(arrival_depth[t + 1 - 6_000:t + 1], tmp)
                # To properly calibrate A and k, a sufficient number of data points
                # is required; here, a minimum of three.
                # If market trades only take place at the best bid and offer, an
                # alternative method may be necessary to compute the half spread and
                # skew, since fitting a function might not be feasible due to
                # insufficient data points.
                # Alternatively, you can increase the time-step for measuring order
                # arrivals, but this could result in a delayed response.
                half_spread_one = False
                if len(lambda_) > 2:
                    lambda_ = lambda_[:70] / 600
                    x = ticks[:len(lambda_)]
                    y = np.log(lambda_)
                    k_, logA = linear_regression(x, y)
                    A = np.exp(logA)
                    k = -k_
                # Updates the volatility.
                volatility = np.nanstd(mid_price_chg[t + 1 - 6_000:t + 1]) * np.sqrt(10)
        #--------------------------------------------------------
        # Computes the bid price and ask price.
        c1, c2 = compute_coeff_simplified(gamma, delta, A, k)
        half_spread = c1 + delta / 2 * c2 * volatility
        skew = c2 * volatility
        normalized_position = hbt.position / order_qty
        bid_depth = half_spread + skew * normalized_position
        ask_depth = half_spread - skew * normalized_position
        bid_price = min(mid_price_tick - bid_depth, hbt.best_bid_tick) * hbt.tick_size
        ask_price = max(mid_price_tick + ask_depth, hbt.best_ask_tick) * hbt.tick_size
        grid_interval = max(np.round(half_spread) * hbt.tick_size, hbt.tick_size)
        bid_price = np.floor(bid_price / grid_interval) * grid_interval
        ask_price = np.ceil(ask_price / grid_interval) * grid_interval
        #--------------------------------------------------------
        # Updates quotes.
        hbt.clear_inactive_orders()
        # Creates a new grid for buy orders.
        new_bid_orders = Dict.empty(np.int64, np.float64)
        if hbt.position < max_position and np.isfinite(bid_price):
            for i in range(grid_num):
                bid_price_tick = round(bid_price / hbt.tick_size)
                # The order price in ticks is used as the order id.
                new_bid_orders[bid_price_tick] = bid_price
                bid_price -= grid_interval
        for order in hbt.orders.values():
            # Cancels an order if it is not in the new grid.
            if order.side == BUY and order.cancellable and order.order_id not in new_bid_orders:
                hbt.cancel(order.order_id)
        for order_id, order_price in new_bid_orders.items():
            # Posts an order if it doesn't exist.
            if order_id not in hbt.orders:
                hbt.submit_buy_order(order_id, order_price, order_qty, GTX)
        # Creates a new grid for sell orders.
        new_ask_orders = Dict.empty(np.int64, np.float64)
        if hbt.position > -max_position and np.isfinite(ask_price):
            for i in range(grid_num):
                ask_price_tick = round(ask_price / hbt.tick_size)
                # The order price in ticks is used as the order id.
                new_ask_orders[ask_price_tick] = ask_price
                ask_price += grid_interval
        for order in hbt.orders.values():
            # Cancels an order if it is not in the new grid.
            if order.side == SELL and order.cancellable and order.order_id not in new_ask_orders:
                hbt.cancel(order.order_id)
        for order_id, order_price in new_ask_orders.items():
            # Posts an order if it doesn't exist.
            if order_id not in hbt.orders:
                hbt.submit_sell_order(order_id, order_price, order_qty, GTX)
        t += 1
        if t >= len(arrival_depth) or t >= len(mid_price_chg):
            raise Exception
        # Records the current state for stat calculation.
        stat.record(hbt)
[2]:
def backtest(args):
    asset_name, asset_info, model = args
    if model == 'SquareProbQueueModel':
        queue_model = SquareProbQueueModel()
    elif model == 'LogProbQueueModel2':
        queue_model = LogProbQueueModel2()
    elif model == 'PowerProbQueueModel3':
        queue_model = PowerProbQueueModel3(3)
    else:
        raise ValueError
    latency_data = np.concatenate(
        [np.load('../latency/order_latency_{}.npz'.format(date))['data'] for date in range(20230731, 20230732)]
    )
    hbt = HftBacktest(
        ['data/{}_{}.npz'.format(asset_name, date) for date in range(20230731, 20230732)],
        tick_size=asset_info['tick_size'],
        lot_size=asset_info['lot_size'],
        maker_fee=-0.00005,
        taker_fee=0.0007,
        order_latency=IntpOrderLatency(data=latency_data),
        queue_model=queue_model,
        asset_type=Linear,
        snapshot='data/{}_20230730_eod.npz'.format(asset_name),
        trade_list_size=10000,
    )
    stat = Stat(hbt)
    # Obtains the mid-price of the asset to determine the order quantity.
    data = np.load('data/{}_20230730_eod.npz'.format(asset_name))['data']
    best_bid = max(data[data[:, COL_SIDE] == 1][:, COL_PRICE])
    best_ask = min(data[data[:, COL_SIDE] == -1][:, COL_PRICE])
    mid = (best_bid + best_ask) / 2.0
    # Sets the order quantity to be equivalent to a notional value of $100.
    order_qty = max(round((100 / mid) / asset_info['lot_size']), 1) * asset_info['lot_size']
    gamma = 0.00005
    gridtrading_glft_mm(hbt, stat.recorder, gamma, order_qty)
    np.savez(
        '{}_stat_000005_{}'.format(asset_name, model),
        timestamp=np.asarray(stat.timestamp),
        mid=np.asarray(stat.mid),
        balance=np.asarray(stat.balance),
        position=np.asarray(stat.position),
        fee=np.asarray(stat.fee),
    )
[3]:
%%capture
from multiprocessing import Pool
import json

with open('assets2.json', 'r') as f:
    assets = json.load(f)

with Pool(16) as p:
    print(p.map(backtest, [(k, v, 'SquareProbQueueModel') for k, v in assets.items()]))
with Pool(16) as p:
    print(p.map(backtest, [(k, v, 'LogProbQueueModel2') for k, v in assets.items()]))
with Pool(16) as p:
    print(p.map(backtest, [(k, v, 'PowerProbQueueModel3') for k, v in assets.items()]))
[4]:
import pandas as pd
from matplotlib import pyplot as plt

def load_equity(model):
    equity_values = {}
    sr_values = {}
    for asset_name in assets.keys():
        stat = np.load('{}_stat_000005_{}.npz'.format(asset_name, model))
        timestamp = stat['timestamp']
        mid = stat['mid']
        balance = stat['balance']
        position = stat['position']
        fee = stat['fee']
        equity = mid * position + balance - fee
        equity = pd.Series(equity, index=pd.to_datetime(timestamp, unit='us', utc=True))
        equity_ = equity.resample('5min').last()
        pnl = equity_.diff()
        sr = np.divide(pnl.mean(), pnl.std())
        equity_values[asset_name] = equity_
        sr_values[asset_name] = sr
    sr_m = np.nanmean(list(sr_values.values()))
    sr_s = np.nanstd(list(sr_values.values()))
    asset_number = 0
    net_equity = None
    for i, (equity, sr) in enumerate(zip(equity_values.values(), sr_values.values())):
        # Some assets don't work within this scheme, possibly because their order
        # arrivals don't follow the Poisson distribution that this model assumes.
        # As a result, assets whose SR falls outside -0.5 sigma are filtered out.
        if (sr - sr_m) / sr_s > -0.5:
            asset_number += 1
            if net_equity is None:
                net_equity = equity.copy()
            else:
                net_equity += equity.copy()
            if asset_number == 100:
                # 2_000 is the capital for each trading asset.
                return (net_equity / asset_number) / 2_000

np.seterr(divide='ignore', invalid='ignore')

fig = plt.figure()
fig.set_size_inches(10, 3)
legend = []
for model in ['SquareProbQueueModel', 'LogProbQueueModel2', 'PowerProbQueueModel3']:
    net_equity_ = load_equity(model)
    pnl = net_equity_.diff()
    sr = pnl.mean() / pnl.std() * np.sqrt(288)
    legend.append('100 assets, Daily SR={:.2f}, {}'.format(sr, model))
    (net_equity_ * 100).plot()
plt.legend(
    legend,
    loc='upper center', bbox_to_anchor=(0.5, -0.15),
    fancybox=True, shadow=True, ncol=3
)
plt.grid()
plt.ylabel('Cumulative Returns (%)')
[4]:
Text(0, 0.5, 'Cumulative Returns (%)')

Risk Mitigation through Price Protection in Extreme Market Conditions
For high-frequency traders and market makers, latency plays a crucial role in maintaining profitability. However, in the cryptocurrency market especially, significant price movements and delayed market updates are common occurrences. To safeguard your quotes and positions against these unfavorable conditions, it is essential to employ price protection mechanisms akin to those offered by Binance.
Price Protection is another function offered by Binance Futures to protect traders from extreme market movements. This function protects traders from bad actors who exploit market efficiencies and cause price manipulation.
The Price Protection feature is helpful against unusual market conditions, such as a large difference between the Last Price and Mark Price. Usually, the Mark Price is just a few cents away from the Last Price. However, in extreme market conditions, the Last Price may significantly deviate from the Mark Price.
As highlighted by Binance, a substantial disparity between a futures price and its underlying spot price may signal extreme market conditions. This can be mitigated by employing conservative pricing strategies: for example, quoting bids off the lower of the futures price and its underlying spot price, and asks off the higher of the two. Additionally, detecting abnormal price discrepancies between futures and their underlying spot can prompt you to exit positions and wait for market conditions to return to normal.
Furthermore, carefully monitor latency, including both feed latency and order latency, since high latency prevents tracking market prices and hinders timely adjustments to orders. Latency spikes often occur in extreme market conditions and may defeat price protection, so it is advisable to withdraw from the market in such situations.
Example to be added…
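Until the official example is added, the conservative pricing rule described above can be sketched as follows. This is only an illustration: the function name, its inputs, and the 0.5% divergence threshold are hypothetical choices, not part of the hftbacktest API.

```python
import math

def protected_quotes(fut_mid, spot_mid, half_spread, tick_size, max_divergence=0.005):
    """Conservative quoting with a simple price-protection rule.

    fut_mid / spot_mid are the futures and underlying spot mid-prices.
    Returns (bid, ask), or None to signal pulling all quotes when the
    futures/spot divergence looks abnormal.
    """
    divergence = abs(fut_mid - spot_mid) / spot_mid
    if divergence > max_divergence:
        # Abnormal conditions: withdraw and wait for normalization.
        return None
    # Conservative references: quote the bid off the lower of the two
    # mid-prices and the ask off the higher one.
    bid_ref = min(fut_mid, spot_mid) - half_spread
    ask_ref = max(fut_mid, spot_mid) + half_spread
    # Snap to the tick grid conservatively (bid down, ask up); the small
    # epsilon guards against floating-point division artifacts.
    bid = math.floor(bid_ref / tick_size + 1e-9) * tick_size
    ask = math.ceil(ask_ref / tick_size - 1e-9) * tick_size
    return bid, ask
```

In normal conditions this widens quotes toward the more conservative of the two prices; when the divergence exceeds the threshold, it signals the strategy to cancel its orders entirely.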
Examples
You can find more examples here.
Data
Please see https://github.com/nkaz001/collect-binancefutures or Data Preparation regarding collecting and converting the feed data.
Format
hftbacktest can digest a numpy file such as npz or npy, as well as a pickled pandas.DataFrame. The data has six columns, in the following order.
event: the type of event
DEPTH_EVENT = 1
TRADE_EVENT = 2
DEPTH_CLEAR_EVENT = 3
DEPTH_SNAPSHOT_EVENT = 4
Event codes above 100 are reserved for user-defined events.
exch_timestamp: the exchange timestamp
local_timestamp: the local timestamp at which your system receives the event
side: 1 for Buy (Bid), -1 for Sell (Ask)
price: the price
qty: the quantity
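A normalized dataset is therefore just a two-dimensional array with these six columns in order. For instance, a depth update and a trade can be assembled as:

```python
import numpy as np

DEPTH_EVENT, TRADE_EVENT = 1, 2

# Two normalized events in the six-column order described above:
# [event, exch_timestamp, local_timestamp, side, price, qty]
data = np.array([
    [DEPTH_EVENT, 1676419205108000, 1676419207212527,  1,  2218.80, 0.603],
    [TRADE_EVENT, 1676419205116000, 1676419207212584, -1, 22177.90, 0.001],
], dtype=np.float64)
```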
Example
Raw data
1676419207212527 {'stream': 'btcusdt@depth@0ms', 'data': {'e': 'depthUpdate', 'E': 1676419206974, 'T': 1676419205108, 's': 'BTCUSDT', 'U': 2505118837831, 'u': 2505118838224, 'pu': 2505118837821, 'b': [['2218.80', '0.603'], ['5000.00', '2.641'], ['22160.60', '0.008'], ['22172.30', '0.551'], ['22173.40', '0.073'], ['22174.50', '0.006'], ['22176.80', '0.157'], ['22177.90', '0.425'], ['22181.20', '0.260'], ['22182.30', '3.918'], ['22182.90', '0.000'], ['22183.40', '0.014'], ['22203.00', '0.000']], 'a': [['22171.70', '0.000'], ['22187.30', '0.000'], ['22194.30', '0.270'], ['22194.70', '0.423'], ['22195.20', '2.075'], ['22209.60', '4.506']]}} 1676419207212584 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206976, 'T': 1676419205116, 's': 'BTCUSDT', 't': 3288803053, 'p': '22177.90', 'q': '0.001', 'X': 'MARKET', 'm': True}}
Normalized data
event | exch_timestamp | local_timestamp | side | price | qty
---|---|---|---|---|---
1 | 1676419205108000 | 1676419207212527 | 1 | 2218.8 | 0.603
1 | 1676419205108000 | 1676419207212527 | 1 | 5000.00 | 2.641
1 | 1676419205108000 | 1676419207212527 | 1 | 22160.60 | 0.008
1 | 1676419205108000 | 1676419207212527 | 1 | 22172.30 | 0.551
1 | 1676419205108000 | 1676419207212527 | 1 | 22173.40 | 0.073
1 | 1676419205108000 | 1676419207212527 | 1 | 22174.50 | 0.006
1 | 1676419205108000 | 1676419207212527 | 1 | 22176.80 | 0.157
1 | 1676419205108000 | 1676419207212527 | 1 | 22177.90 | 0.425
1 | 1676419205108000 | 1676419207212527 | 1 | 22181.20 | 0.260
1 | 1676419205108000 | 1676419207212527 | 1 | 22182.30 | 3.918
1 | 1676419205108000 | 1676419207212527 | 1 | 22182.90 | 0.000
1 | 1676419205108000 | 1676419207212527 | 1 | 22183.40 | 0.014
1 | 1676419205108000 | 1676419207212527 | 1 | 22203.00 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22171.70 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22187.30 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22194.30 | 0.270
1 | 1676419205108000 | 1676419207212527 | -1 | 22194.70 | 0.423
1 | 1676419205108000 | 1676419207212527 | -1 | 22195.20 | 2.075
1 | 1676419205108000 | 1676419207212527 | -1 | 22209.60 | 4.506
2 | 1676419205116000 | 1676419207212584 | -1 | 22177.90 | 0.001
Validation
Before you start backtesting, you should check that the data is valid. Data received from crypto exchanges typically requires cleaning and validation.
All timestamps should be in the correct order.
You might find that local_timestamp is earlier than exch_timestamp due to clock synchronization issues. Since local_timestamp - exch_timestamp is used as the latency, the value must be positive.
Even though local_timestamp is in the correct order, exch_timestamp can be out of order.
See the following example: the depth event's exch_timestamp precedes that of the earlier trade events, even though the depth event is received after them.
1676419207212385 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': 1676419205111, 's': 'BTCUSDT', 't': 3288803051, 'p': '22177.90', 'q': '0.300', 'X': 'MARKET', 'm': True}} 1676419207212480 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': 1676419205111, 's': 'BTCUSDT', 't': 3288803052, 'p': '22177.90', 'q': '0.119', 'X': 'MARKET', 'm': True}} 1676419207212527 {'stream': 'btcusdt@depth@0ms', 'data': {'e': 'depthUpdate', 'E': 1676419206974, 'T': 1676419205108, 's': 'BTCUSDT', 'U': 2505118837831, 'u': 2505118838224, 'pu': 2505118837821, 'b': [['2218.80', '0.603'], ['5000.00', '2.641'], ['22160.60', '0.008'], ['22172.30', '0.551'], ['22173.40', '0.073'], ['22174.50', '0.006'], ['22176.80', '0.157'], ['22177.90', '0.425'], ['22181.20', '0.260'], ['22182.30', '3.918'], ['22182.90', '0.000'], ['22183.40', '0.014'], ['22203.00', '0.000']], 'a': [['22171.70', '0.000'], ['22187.30', '0.000'], ['22194.30', '0.270'], ['22194.70', '0.423'], ['22195.20', '2.075'], ['22209.60', '4.506']]}} 1676419207212584 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206976, 'T': 1676419205116, 's': 'BTCUSDT', 't': 3288803053, 'p': '22177.90', 'q': '0.001', 'X': 'MARKET', 'm': True}} 1676419207212621 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206976, 'T': 1676419205116, 's': 'BTCUSDT', 't': 3288803054, 'p': '22177.90', 'q': '0.005', 'X': 'MARKET', 'm': True}}
This should be converted into the following form. hftbacktest provides the correct method to automatically fix this type of ordering issue.
... 1676419207212385 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': -1, 's': 'BTCUSDT', 't': 3288803051, 'p': '22177.90', 'q': '0.300', 'X': 'MARKET', 'm': True}} 1676419207212480 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': -1, 's': 'BTCUSDT', 't': 3288803052, 'p': '22177.90', 'q': '0.119', 'X': 'MARKET', 'm': True}} 1676419207212527 {'stream': 'btcusdt@depth@0ms', 'data': {'e': 'depthUpdate', 'E': 1676419206974, 'T': 1676419205108, 's': 'BTCUSDT', 'U': 2505118837831, 'u': 2505118838224, 'pu': 2505118837821, 'b': [['2218.80', '0.603'], ['5000.00', '2.641'], ['22160.60', '0.008'], ['22172.30', '0.551'], ['22173.40', '0.073'], ['22174.50', '0.006'], ['22176.80', '0.157'], ['22177.90', '0.425'], ['22181.20', '0.260'], ['22182.30', '3.918'], ['22182.90', '0.000'], ['22183.40', '0.014'], ['22203.00', '0.000']], 'a': [['22171.70', '0.000'], ['22187.30', '0.000'], ['22194.30', '0.270'], ['22194.70', '0.423'], ['22195.20', '2.075'], ['22209.60', '4.506']]}} ... -1 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': 1676419205111, 's': 'BTCUSDT', 't': 3288803051, 'p': '22177.90', 'q': '0.300', 'X': 'MARKET', 'm': True}} -1 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206968, 'T': 1676419205111, 's': 'BTCUSDT', 't': 3288803052, 'p': '22177.90', 'q': '0.119', 'X': 'MARKET', 'm': True}} 1676419207212584 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206976, 'T': 1676419205116, 's': 'BTCUSDT', 't': 3288803053, 'p': '22177.90', 'q': '0.001', 'X': 'MARKET', 'm': True}} 1676419207212621 {'stream': 'btcusdt@trade', 'data': {'e': 'trade', 'E': 1676419206976, 'T': 1676419205116, 's': 'BTCUSDT', 't': 3288803054, 'p': '22177.90', 'q': '0.005', 'X': 'MARKET', 'm': True}}
Normalized data
event | exch_timestamp | local_timestamp | side | price | qty
---|---|---|---|---|---
… | | | | |
2 | -1 | 1676419207212385 | -1 | 22177.90 | 0.300
2 | -1 | 1676419207212480 | -1 | 22177.90 | 0.119
1 | 1676419205108000 | 1676419207212527 | 1 | 2218.8 | 0.603
1 | 1676419205108000 | 1676419207212527 | 1 | 5000.00 | 2.641
1 | 1676419205108000 | 1676419207212527 | 1 | 22160.60 | 0.008
1 | 1676419205108000 | 1676419207212527 | 1 | 22172.30 | 0.551
1 | 1676419205108000 | 1676419207212527 | 1 | 22173.40 | 0.073
1 | 1676419205108000 | 1676419207212527 | 1 | 22174.50 | 0.006
1 | 1676419205108000 | 1676419207212527 | 1 | 22176.80 | 0.157
1 | 1676419205108000 | 1676419207212527 | 1 | 22177.90 | 0.425
1 | 1676419205108000 | 1676419207212527 | 1 | 22181.20 | 0.260
1 | 1676419205108000 | 1676419207212527 | 1 | 22182.30 | 3.918
1 | 1676419205108000 | 1676419207212527 | 1 | 22182.90 | 0.000
1 | 1676419205108000 | 1676419207212527 | 1 | 22183.40 | 0.014
1 | 1676419205108000 | 1676419207212527 | 1 | 22203.00 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22171.70 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22187.30 | 0.000
1 | 1676419205108000 | 1676419207212527 | -1 | 22194.30 | 0.270
1 | 1676419205108000 | 1676419207212527 | -1 | 22194.70 | 0.423
1 | 1676419205108000 | 1676419207212527 | -1 | 22195.20 | 2.075
1 | 1676419205108000 | 1676419207212527 | -1 | 22209.60 | 4.506
… | | | | |
2 | 1676419205111000 | -1 | -1 | 22177.90 | 0.300
2 | 1676419205111000 | -1 | -1 | 22177.90 | 0.119
2 | 1676419206976000 | 1676419207212584 | -1 | 22177.90 | 0.001
2 | 1676419206976000 | 1676419207212621 | -1 | 22177.90 | 0.005
A value of -1 in exch_timestamp means the event is not processed by the exchange-side logic, such as order fills. A value of -1 in local_timestamp means the event is not seen by the local side.
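The checks above can be expressed directly on the six-column array. A minimal sketch follows; the column indices mirror the format described earlier, and the helper name is ours, not part of hftbacktest:

```python
import numpy as np

# Column indices of the normalized six-column feed data.
COL_EVENT, COL_EXCH_TS, COL_LOCAL_TS, COL_SIDE, COL_PRICE, COL_QTY = range(6)

def check_feed(data):
    """Counts rows with negative feed latency and out-of-order exch_timestamp,
    considering only rows where both timestamps are present (not -1)."""
    valid = (data[:, COL_EXCH_TS] > 0) & (data[:, COL_LOCAL_TS] > 0)
    rows = data[valid]
    # local_timestamp - exch_timestamp is the feed latency; it must be positive.
    latency = rows[:, COL_LOCAL_TS] - rows[:, COL_EXCH_TS]
    n_negative_latency = int((latency < 0).sum())
    # exch_timestamp should be non-decreasing across valid rows.
    n_unordered = int((np.diff(rows[:, COL_EXCH_TS]) < 0).sum())
    return n_negative_latency, n_unordered
```

If either count is nonzero, the data needs the kind of reordering correction shown above before backtesting.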
Latency Models
Overview
Latency is an important factor that you need to take into account when you backtest your HFT strategy. HftBacktest models three types of latency.

Feed latency
This is the latency between the time the exchange sends a feed event, such as an order book change or a trade, and the time it is received by the local side. This latency is dealt with through two different timestamps: local_timestamp and exch_timestamp (the exchange timestamp).
Order entry latency
This is the latency between the time you send an order request and the time it is received by the exchange.
Order response latency
This is the latency between the time the exchange processes an order request and the time the order response is received by the local side. The response to your order fill is also affected by this type of latency.

Order Latency Models
HftBacktest provides the following order latency models, and you can also implement your own latency model.
ConstantLatency
It’s the most basic model that uses constant latencies. You just set the latencies.
from hftbacktest import ConstantLatency
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=ConstantLatency(entry_latency=50, response_latency=50),
    asset_type=Linear
)
BackwardFeedLatency
This model uses the latest feed latency as order latencies. The latencies are calculated according to the given arguments as follows.
feed_latency = local_timestamp - exch_timestamp
entry_latency = entry_latency_mul * feed_latency + entry_latency
resp_latency = resp_latency_mul * feed_latency + resp_latency
from hftbacktest import BackwardFeedLatency
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=BackwardFeedLatency(
        entry_latency_mul=1,
        resp_latency_mul=1,
        entry_latency=0,
        response_latency=0
    ),
    asset_type=Linear
)
ForwardFeedLatency
This model uses the next feed latency as the order latencies, relying on forward-looking information.
from hftbacktest import ForwardFeedLatency
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=ForwardFeedLatency(
        entry_latency_mul=1,
        resp_latency_mul=1,
        entry_latency=0,
        response_latency=0
    ),
    asset_type=Linear
)
FeedLatency
This model uses the average of the latest and the next feed latency as order latencies.
from hftbacktest import FeedLatency
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=FeedLatency(
        entry_latency_mul=1,
        resp_latency_mul=1,
        entry_latency=0,
        response_latency=0
    ),
    asset_type=Linear
)
IntpOrderLatency
This model interpolates order latency based on the actual order latency data. This is the most accurate among the provided models if you have the data with a fine time interval. You can collect the latency data by submitting unexecutable orders regularly.
from hftbacktest import IntpOrderLatency

latency_data = np.load('order_latency.npz')['data']

hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    asset_type=Linear
)
Data example
request_timestamp_at_local, exch_timestamp, receive_response_timestamp_at_local
1670026844751525, 1670026844759000, 1670026844762122
1670026845754020, 1670026845762000, 1670026845770003
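Given samples like the two rows above, the per-sample latencies and a linearly interpolated lookup can be sketched as follows. This is a simplified illustration of the idea; the actual model's behavior may differ:

```python
import numpy as np

# Three columns per the data example above, in microseconds:
# request_timestamp_at_local, exch_timestamp, receive_response_timestamp_at_local
lat = np.array([
    [1670026844751525, 1670026844759000, 1670026844762122],
    [1670026845754020, 1670026845762000, 1670026845770003],
], dtype=np.int64)

req_ts = lat[:, 0].astype(np.float64)
entry_lat = (lat[:, 1] - lat[:, 0]).astype(np.float64)  # order entry latency
resp_lat = (lat[:, 2] - lat[:, 1]).astype(np.float64)   # order response latency

def entry_latency_at(ts):
    # Linear interpolation between recorded samples, clamped at the ends.
    return np.interp(ts, req_ts, entry_lat)
```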
Implement your own order latency model
You need to implement numba
jitclass
that has two methods: entry
and response
.
See Latency model implementation
@jitclass
class CustomLatencyModel:
    def __init__(self):
        pass

    def entry(self, timestamp, order, proc):
        # todo: return the order entry latency.
        return 0

    def response(self, timestamp, order, proc):
        # todo: return the order response latency.
        return 0

    def reset(self):
        pass
Order Fill
Exchange Models
HftBacktest is a market-data replay-based backtesting tool, which means your orders cannot alter the simulated market; no market impact is modeled. Therefore, one of the most important assumptions is that your orders are small enough not to cause market impact. In the end, you must test your strategy in a live market with real participants and adjust your backtest based on the discrepancies between the backtesting results and the live outcomes.
HftBacktest offers two types of exchange simulation. NoPartialFillExchange is the default exchange simulation, in which no partial fills occur. PartialFillExchange is an extended simulation that accounts for partial fills in specific cases. Since market-replay-based backtesting cannot alter the market, some partial-fill cases remain unrealistic, such as taking market liquidity: even if your order consumes liquidity, the market depth and trades in the replayed data cannot change. It is essential to understand the underlying assumptions of each simulation.
NoPartialFillExchange
Conditions for Full Execution
Buy order in the order book
Your order price >= the best ask price
Your order price > sell trade price
Your order is at the front of the queue && your order price == sell trade price
Sell order in the order book
Your order price <= the best bid price
Your order price < buy trade price
Your order is at the front of the queue && your order price == buy trade price
Liquidity-Taking Order
Regardless of the quantity at the best, liquidity-taking orders will be fully executed at the best. Be aware that this may cause unrealistic fill simulations if you attempt to execute a large quantity.
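The full-execution conditions for a resting buy order can be written out as a predicate. This is a simplified sketch of the rules listed above, not the library's internal implementation; the sell-side case is symmetric:

```python
def limit_buy_filled(order_price, at_queue_head, best_ask, sell_trade_price):
    """Full-fill check for a resting buy order under NoPartialFillExchange rules."""
    if order_price >= best_ask:
        # The order price crosses the best ask.
        return True
    if sell_trade_price is not None:
        if order_price > sell_trade_price:
            # A sell trade went through the order's price.
            return True
        if at_queue_head and order_price == sell_trade_price:
            # Traded exactly at the order's price while it was first in the queue.
            return True
    return False
```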
Usage
from hftbacktest import NoPartialFillExchange
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    exchange_model=NoPartialFillExchange,  # Default
    asset_type=Linear
)
PartialFillExchange
Conditions for Full Execution
Buy order in the order book
Your order price >= the best ask price
Your order price > sell trade price
Sell order in the order book
Your order price <= the best bid price
Your order price < buy trade price
Conditions for Partial Execution
Buy order in the order book
Filled by (remaining) sell trade quantity: your order is at the front of the queue && your order price == sell trade price
Sell order in the order book
Filled by (remaining) buy trade quantity: your order is at the front of the queue && your order price == buy trade price
Liquidity-Taking Order
Liquidity-taking orders will be executed based on the quantity of the order book, even though the best price and quantity do not change due to your execution. Be aware that this may cause unrealistic fill simulations if you attempt to execute a large quantity.
Usage
from hftbacktest import PartialFillExchange
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    exchange_model=PartialFillExchange,
    asset_type=Linear
)
Queue Models
Knowing your order’s queue position is important for accurate order fill simulation in backtesting, depending on the liquidity of the order book and trading activity. If an exchange doesn’t provide Market-By-Order data, you have to estimate the queue position by modeling. HftBacktest currently supports only Market-By-Price data, which most crypto exchanges provide, and offers the following queue position models for order fill simulation.
Please refer to the details at Queue Models.

RiskAverseQueueModel
This model is the most conservative model in terms of the chance of fill in the queue. The decrease in quantity by cancellation or modification in the order book happens only at the tail of the queue so your order queue position doesn’t change. The order queue position will be advanced only if a trade happens at the price.
from hftbacktest import RiskAverseQueueModel
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=RiskAverseQueueModel(),  # Default
    asset_type=Linear
)
ProbQueueModel
Based on a probability model of your current queue position, the decrease in quantity happens both ahead of and behind your order, so your queue position also advances according to the probability. This model is implemented as described in the referenced article.
By default, three variations are provided, each with a different probability profile.

The function f = log(1 + x) exhibits a different probability profile depending on the total quantity at the price level, unlike power functions.
When you define the function f, it should satisfy the following:
The probability at 0 should be 0, because if the order is at the head of the queue, all decreases should happen behind the order.
The probability at 1 should be 1, because if the order is at the tail of the queue, all decreases should happen ahead of the order.
You can see the comparison of the models here.
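A minimal sketch of how such a probability profile drives queue advancement follows, using the base f(back) / (f(front) + f(back)) rule. The profile shapes here are suggested by the model names and are illustrative only; the library's exact formulas may differ:

```python
import numpy as np

# Illustrative probability profiles suggested by the model names.
profiles = {
    'log':      lambda x: np.log(1.0 + x),
    'identity': lambda x: x,
    'square':   lambda x: x ** 2,
    'power3':   lambda x: x ** 3,
}

def prob_behind(f, front, back):
    # Probability that a quantity decrease occurs behind your order,
    # i.e. the f(back) / (f(front) + f(back)) rule: at the head of the
    # queue (front = 0) all decreases fall behind the order, and at the
    # tail (back = 0) all decreases fall ahead of it.
    return f(back) / (f(front) + f(back))
```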
LogProbQueueModel
from hftbacktest import LogProbQueueModel
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=LogProbQueueModel(),
    asset_type=Linear
)
IdentityProbQueueModel
from hftbacktest import IdentityProbQueueModel
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=IdentityProbQueueModel(),
    asset_type=Linear
)
SquareProbQueueModel
from hftbacktest import SquareProbQueueModel
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=SquareProbQueueModel(),
    asset_type=Linear
)
PowerProbQueueModel
from hftbacktest import PowerProbQueueModel
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=PowerProbQueueModel(3),
    asset_type=Linear
)
ProbQueueModel2
This model is a variation of the ProbQueueModel that changes the probability calculation to f(back) / f(front + back) from f(back) / (f(front) + f(back)).
LogProbQueueModel2
from hftbacktest import LogProbQueueModel2
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=LogProbQueueModel2(),
    asset_type=Linear
)
PowerProbQueueModel2
from hftbacktest import PowerProbQueueModel2
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=PowerProbQueueModel2(3),
    asset_type=Linear
)
ProbQueueModel3
This model is a variation of the ProbQueueModel that changes the probability calculation to 1 - f(front / (front + back)) from f(back) / (f(front) + f(back)).
PowerProbQueueModel3
from hftbacktest import PowerProbQueueModel3
hbt = HftBacktest(
    data,
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(latency_data),
    queue_model=PowerProbQueueModel3(3),
    asset_type=Linear
)
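The three probability rules (the base f(back)/(f(front)+f(back)), variation 2's f(back)/f(front+back), and variation 3's 1 - f(front/(front+back))) can be compared numerically. Here is a sketch with a cubic f, as in the PowerProbQueueModel family:

```python
def prob_v1(f, front, back):
    # Base ProbQueueModel rule.
    return f(back) / (f(front) + f(back))

def prob_v2(f, front, back):
    # ProbQueueModel2 rule.
    return f(back) / f(front + back)

def prob_v3(f, front, back):
    # ProbQueueModel3 rule.
    return 1 - f(front / (front + back))

cube = lambda x: x ** 3

# With equal quantity ahead of and behind the order, the variants differ:
# prob_v1 -> 0.5, prob_v2 -> 0.125, prob_v3 -> 0.875
```

For the same f, variation 2 is the most conservative about advancing the queue position and variation 3 the most optimistic, which is why comparing backtests against live fills matters when picking a model.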
Implement a custom probability queue position model
@jitclass
class CustomProbQueueModel(ProbQueueModel):
    def f(self, x):
        # todo: custom formula
        return x ** 3
Implement a custom queue model
You need to implement a Numba jitclass that has four methods: new, trade, depth, and is_filled. See Queue position model implementation in detail.
@jitclass
class CustomQueuePositionModel:
    def __init__(self):
        pass

    def new(self, order, proc):
        # todo: when a new order is submitted.
        pass

    def trade(self, order, qty, proc):
        # todo: when a trade happens.
        pass

    def depth(self, order, prev_qty, new_qty, proc):
        # todo: when the order book quantity at the price is changed.
        pass

    def is_filled(self, order, proc):
        # todo: check if a given order is filled.
        return False

    def reset(self):
        pass
References
This is initially implemented as described in the following articles.
JIT Compilation Overhead
HftBacktest takes advantage of Numba’s capabilities, with a significant portion of its implementation relying on Numba JIT’ed classes. As a result, the first run of HftBacktest requires JIT compilation, which can take several tens of seconds. Although this may not be significant when backtesting for multiple days, it can still be bothersome.
To minimize this overhead, you can consider using Numba’s cache feature along with the reset method to reset HftBacktest. See the example below.
from numba import njit
from hftbacktest import HftBacktest, IntpOrderLatency, SquareProbQueueModel, Linear

# enables caching feature
@njit(cache=True)
def algo(arguments, hbt):
    # your algo implementation.
    pass

hbt = HftBacktest(
    [
        'data/ethusdt_20221003.npz',
        'data/ethusdt_20221004.npz',
        'data/ethusdt_20221005.npz',
        'data/ethusdt_20221006.npz',
        'data/ethusdt_20221007.npz'
    ],
    tick_size=0.01,
    lot_size=0.001,
    maker_fee=-0.00005,
    taker_fee=0.0007,
    order_latency=IntpOrderLatency(),
    queue_model=SquareProbQueueModel(),
    asset_type=Linear,
    snapshot='data/ethusdt_20221002_eod.npz'
)

algo(arguments, hbt)
When you need to execute the same code using varying arguments or different datasets, you can proceed as follows.
from hftbacktest import reset

reset(
    hbt,
    [
        'data/ethusdt_20221003.npz',
        'data/ethusdt_20221004.npz',
        'data/ethusdt_20221005.npz',
        'data/ethusdt_20221006.npz',
        'data/ethusdt_20221007.npz'
    ],
    snapshot='data/ethusdt_20221002_eod.npz'
)

algo(arguments, hbt)
Debugging Backtesting and Live Discrepancies
Plotting both live and backtesting values on a single chart is a good first step. It’s strongly recommended to include the equity curve and position plots for comparison. Additionally, visualizing your alpha, order prices, etc., can help identify discrepancies.
If the backtested strategy is correctly implemented in live trading, two significant factors may contribute to any observed discrepancies.
1. Latency: Latency, encompassing both feed and order latency, plays a crucial role in ensuring accurate backtesting results. It’s highly recommended to collect data yourself to accurately measure feed latency on your end. Alternatively, if obtaining data from external sources, it’s essential to verify that the feed latency aligns with your latency.
Order latency, measured from your end, can be collected by logging order actions or regularly submitting orders away from the mid-price and subsequently canceling them to measure and record order latency.
It’s still possible to artificially decrease latencies to assess improvements in strategy performance due to enhanced latency. This allows you to evaluate the effectiveness of higher-tier programs or liquidity provider programs, as well as quantify the impact of investments made in infrastructure improvement. Understanding whether a superior infrastructure provides a competitive advantage is beneficial.
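As a hedged sketch (hypothetical timestamps and variable names), order-latency logs collected this way can be arranged into the (n, 3) array that IntpOrderLatency expects: the local timestamp when the request was made, the exchange timestamp, and the local timestamp when the response was received.

```python
import numpy as np

# Hypothetical latency log in microseconds: each row is
# (local request timestamp, exchange timestamp, local response timestamp).
latency_log = [
    (1_660_228_023_000_000, 1_660_228_023_020_000, 1_660_228_023_045_000),
    (1_660_228_024_000_000, 1_660_228_024_018_000, 1_660_228_024_041_000),
]
latency_data = np.asarray(latency_log, dtype=np.float64)

# Entry latency: request -> exchange; response latency: exchange -> response.
entry_latency = latency_data[:, 1] - latency_data[:, 0]
response_latency = latency_data[:, 2] - latency_data[:, 1]
```

The resulting latency_data could then be passed as IntpOrderLatency(latency_data), as in the examples elsewhere in this documentation.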
2. Queue Model: Selecting an appropriate queue model that accurately reflects live trading results is essential. You can either develop your own queue model or use an existing one. HftBacktest offers three primary ProbQueueModel variants, such as the PowerProbQueueModel series, which allow adjustments to align with your results. For further information, refer to ProbQueueModel.
One crucial point to bear in mind is that the backtest is conducted under the assumption of no market impact. A market order, or a limit order that takes liquidity, can introduce discrepancies, as it may cause market impact and consequently make execution simulation difficult. Moreover, if your limit order size is too large, partial fills and their market impact can also lead to discrepancies. It’s advisable to begin trading with a small size and align the results first, then gradually increase your trading size while observing both live and backtesting results.
Initialization
- HftBacktest(data, tick_size, lot_size, maker_fee, taker_fee, order_latency, asset_type, queue_model=None, snapshot=None, start_position=0, start_balance=0, start_fee=0, trade_list_size=0, exchange_model=None)[source]
Create a HftBacktest instance.
- Parameters:
data (str | ndarray[Any, dtype[ScalarType]] | DataFrame | List[str | ndarray[Any, dtype[ScalarType]] | DataFrame]) – Data to be fed.
tick_size (float) – Minimum price increment for the given asset.
lot_size (float) – Minimum order quantity for the given asset.
maker_fee (float) – Maker fee rate; a negative value indicates rebates.
taker_fee (float) – Taker fee rate; a negative value indicates rebates.
order_latency (OrderLatencyModel) – Order latency model. See Order Latency Models.
asset_type (AssetType) – Either Linear or Inverse. See Asset types.
queue_model (QueueModel | None) – Queue model; the default is models.queue.RiskAverseQueueModel. See Queue Models.
snapshot (str | ndarray[Any, dtype[ScalarType]] | DataFrame | None) – The initial market depth snapshot.
start_position (float) – Starting position.
start_balance (float) – Starting balance.
start_fee (float) – Starting cumulative fees.
trade_list_size (int) – Buffer size for storing market trades; the default value of 0 indicates that market trades will not be stored in the buffer.
exchange_model (Callable[[Reader, OrderBus, OrderBus, MarketDepth, State, OrderLatencyModel, QueueModel], JitClassType] | None) – Exchange model; the default is NoPartialFillExchange.
- Returns:
JIT’ed SingleAssetHftBacktest
- reset(hbt, data, tick_size=None, lot_size=None, maker_fee=None, taker_fee=None, snapshot=None, start_position=0, start_balance=0, start_fee=0, trade_list_size=None)[source]
Reset the HftBacktest for reuse. This can help reduce Ahead-of-Time (AOT) compilation time by using the cache=True option in the @njit decorator.
- Parameters:
hbt (HftBacktest) – HftBacktest instance to be reset.
data (List[str | ndarray[Any, dtype[ScalarType]] | DataFrame] | str | ndarray[Any, dtype[ScalarType]] | DataFrame) – Data to be fed.
tick_size (float | None) – Minimum price increment for the given asset.
lot_size (float | None) – Minimum order quantity for the given asset.
maker_fee (float | None) – Maker fee rate; a negative value indicates rebates.
taker_fee (float | None) – Taker fee rate; a negative value indicates rebates.
snapshot (str | ndarray[Any, dtype[ScalarType]] | DataFrame | None) – The initial market depth snapshot.
start_position (float | None) – Starting position.
start_balance (float | None) – Starting balance.
start_fee (float | None) – Starting cumulative fees.
trade_list_size (int | None) – Buffer size for storing market trades; the default value of 0 indicates that market trades will not be stored in the buffer.
Backtester
- class SingleAssetHftBacktest(local, exch)[source]
Single Asset HftBacktest.
Warning
This has to be constructed by HftBacktest().
- Parameters:
local – Local processor.
exch – Exchange processor.
- run
Whether a backtest has finished.
- current_timestamp
Current timestamp.
- property position
Current position.
- property balance
Current balance.
- property orders
Orders dictionary.
- property tick_size
Tick size.
- property lot_size
Lot size.
- property high_ask_tick
The highest ask price in the market depth, in ticks.
- property low_bid_tick
The lowest bid price in the market depth, in ticks.
- property best_bid_tick
The best bid price, in ticks.
- property best_ask_tick
The best ask price, in ticks.
- property best_bid
The best bid price.
- property best_ask
The best ask price.
- property bid_depth
Bid market depth.
- property ask_depth
Ask market depth.
- property mid
Mid-price of BBO.
- property equity
Current equity value.
- property last_trade
Last market trade. If None, there is no last market trade.
- property last_trades
An array of last market trades.
- submit_buy_order(order_id, price, qty, time_in_force, order_type=0, wait=False)[source]
Places a buy order.
- Parameters:
order_id (int64) – The unique order ID; there should not be any existing order with the same ID on both local and exchange sides.
price (float64) – Order price.
qty (float64) – Quantity to buy.
time_in_force (int64) – Available Time-In-Force options vary depending on the exchange model; see the exchange model for details.
GTX: Post-only
GTC: Good ‘till Cancel
FOK: Fill or Kill
IOC: Immediate or Cancel
order_type (int64) – Currently, only LIMIT is supported. To simulate a MARKET order, set the price very high.
wait (bool) – If True, wait until the order placement response is received.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- submit_sell_order(order_id, price, qty, time_in_force, order_type=0, wait=False)[source]
Places a sell order.
- Parameters:
order_id (int64) – The unique order ID; there should not be any existing order with the same ID on both local and exchange sides.
price (float64) – Order price.
qty (float64) – Quantity to sell.
time_in_force (int64) – Available Time-In-Force options vary depending on the exchange model; see the exchange model for details.
GTX: Post-only
GTC: Good ‘till Cancel
FOK: Fill or Kill
IOC: Immediate or Cancel
order_type (int64) – Currently, only LIMIT is supported. To simulate a MARKET order, set the price very low.
wait (bool) – If True, wait until the order placement response is received.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- modify(order_id, price, qty, wait=False)[source]
Modify the specified order.
If the adjusted total quantity (leaves_qty + executed_qty) is less than or equal to the quantity already executed, the order will be considered expired. Be aware that this adjustment doesn’t affect the remaining quantity in the market; it only changes the total quantity.
Modified orders will be reordered in the match queue.
- Parameters:
order_id (int64) – Order ID to modify.
price (float64) – Order price.
qty (float64) – New order quantity.
wait (bool) – If True, wait until the order response is received.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- cancel(order_id, wait=False)[source]
Cancel the specified order.
- Parameters:
order_id (int64) – Order ID to cancel.
wait (bool) – If True, wait until the order cancellation response is received.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- wait_order_response(order_id, timeout=-1)[source]
Wait for the specified order response by order ID.
- Parameters:
order_id (int64) – The order ID to wait for.
timeout (int64) – Maximum waiting time; the default value of -1 indicates no timeout.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- wait_next_feed(include_order_resp, timeout=-1)[source]
Waits until the next feed is received.
- Parameters:
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- clear_inactive_orders()[source]
Clear inactive (CANCELED, FILLED, EXPIRED, or REJECTED) orders from the local orders dictionary.
- get_user_data(event)[source]
Retrieve custom user event data.
- Parameters:
event (int64) – Event identifier. Refer to the data documentation for details on incorporating custom user data with the market feed data.
- Returns:
The latest event data for the specified event.
- elapse(duration)[source]
Elapses the specified duration.
- Parameters:
duration (float64) – Duration to elapse. Unit should be the same as the feed data’s timestamp unit.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
- goto(timestamp, wait_order_response=-1)[source]
Goes to a specified timestamp.
This method moves to the specified timestamp, updating the backtesting state to match the corresponding time. If wait_order_response is provided, the method will stop and return when it receives the response for the specified order.
- Parameters:
timestamp (float64) – The target timestamp to go to. The timestamp unit should be the same as the feed data’s timestamp unit.
wait_order_response (int64) – Order ID to wait for; the default value is WAIT_ORDER_RESPONSE_NONE, which means not waiting for any order response.
- Returns:
True if the method reaches the specified timestamp within the data. If the end of the data is reached before the specified timestamp, it returns False.
Asset Types
Order Latency Models
- class ConstantLatency(entry_latency, response_latency)
Provides constant order latency. The units of the arguments should match the timestamp units of your data.
- Parameters:
entry_latency (float64) – Order entry latency.
response_latency (float64) – Order response latency.
- class FeedLatency(entry_latency_mul=1, resp_latency_mul=1, entry_latency=0, response_latency=0)
Provides order latency based on feed latency. The units of the arguments should match the timestamp units of your data.
Order latency is computed as follows:
feed_latency is calculated as the average of the latest feed’s latency and the subsequent feed’s latency (obtained by forward-looking).
If either of these values is unavailable, the available value is used as the sole feed latency.
entry_latency = feed_latency * entry_latency_mul + entry_latency
response_latency = feed_latency * resp_latency_mul + response_latency
- Parameters:
entry_latency_mul (float64) – Multiplier for feed latency to compute order entry latency.
resp_latency_mul (float64) – Multiplier for feed latency to compute order response latency.
entry_latency (float64) – Offset for order entry latency.
response_latency (float64) – Offset for order response latency.
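As an illustrative numeric sketch of the formulas above (all values are made up; note that the class’s entry_latency and response_latency parameters act as additive offsets, renamed here to avoid shadowing the computed results):

```python
# Made-up feed latencies, in the data's timestamp units.
latest_feed_latency = 40.0   # latency of the latest feed
next_feed_latency = 60.0     # latency of the subsequent feed (forward-looking)

# FeedLatency averages the two when both are available.
feed_latency = (latest_feed_latency + next_feed_latency) / 2.0  # 50.0

entry_latency_mul, resp_latency_mul = 1.0, 1.0
entry_latency_offset, response_latency_offset = 5.0, 5.0

entry_latency = feed_latency * entry_latency_mul + entry_latency_offset        # 55.0
response_latency = feed_latency * resp_latency_mul + response_latency_offset   # 55.0
```

If either feed latency is unavailable, the model falls back to the available value alone, per the description above.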
- class ForwardFeedLatency(entry_latency_mul=1, resp_latency_mul=1, entry_latency=0, response_latency=0)
Provides order latency based on feed latency. The units of the arguments should match the timestamp units of your data.
Order latency is computed as follows:
the subsequent feed’s latency (obtained by forward-looking) is used as the feed latency.
entry_latency = feed_latency * entry_latency_mul + entry_latency
response_latency = feed_latency * resp_latency_mul + response_latency
- Parameters:
entry_latency_mul (float64) – Multiplier for feed latency to compute order entry latency.
resp_latency_mul (float64) – Multiplier for feed latency to compute order response latency.
entry_latency (float64) – Offset for order entry latency.
response_latency (float64) – Offset for order response latency.
- class BackwardFeedLatency(entry_latency_mul=1, resp_latency_mul=1, entry_latency=0, response_latency=0)
Provides order latency based on feed latency. The units of the arguments should match the timestamp units of your data.
Order latency is computed as follows:
the latest feed’s latency is used as the feed latency.
entry_latency = feed_latency * entry_latency_mul + entry_latency
response_latency = feed_latency * resp_latency_mul + response_latency
- Parameters:
entry_latency_mul (float64) – Multiplier for feed latency to compute order entry latency.
resp_latency_mul (float64) – Multiplier for feed latency to compute order response latency.
entry_latency (float64) – Offset for order entry latency.
response_latency (float64) – Offset for order response latency.
- class IntpOrderLatency(data)
Provides order latency by interpolating the actual historical order latency. This model provides the most accurate results. The units of the historical latency data should match the timestamp units of your feed data.
- Parameters:
data (array) – An (n, 3) array consisting of three columns: local timestamp when the request was made, exchange timestamp, and local timestamp when the response was received.
Queue Models
- class RiskAverseQueueModel[source]
Provides a conservative queue position model, where your order’s queue position advances only when trades occur at the same price level.
- class ProbQueueModel[source]
Provides a probability-based queue position model as described in https://quant.stackexchange.com/questions/3782/how-do-we-estimate-position-of-our-order-in-order-book.
Your order’s queue position advances when a trade occurs at the same price level or the quantity at the level decreases. The advancement in queue position depends on the probability based on the relative queue position. To avoid double counting the quantity decrease caused by trades, all trade quantities occurring at the level before the book quantity changes will be subtracted from the book quantity changes.
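A small numeric sketch (hypothetical values, not library code) of the double-counting adjustment described above:

```python
# Suppose the book quantity at the order's price level drops from 10.0 to 6.0,
# and trades totalling 3.0 occurred at that level before the depth update.
prev_qty, new_qty = 10.0, 6.0
traded_qty = 3.0

book_decrease = prev_qty - new_qty  # 4.0 decrease in total
# Trades already advance the queue position on their own, so only the
# remainder is treated as a cancellation-driven decrease.
cancel_decrease = max(book_decrease - traded_qty, 0.0)  # 1.0
```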
- class IdentityProbQueueModel[source]
Bases: ProbQueueModel
This model uses an identity function x to adjust the probability.
- class SquareProbQueueModel[source]
Bases: ProbQueueModel
This model uses a square function x ** 2 to adjust the probability.
- class PowerProbQueueModel(n)[source]
Bases: ProbQueueModel
This model uses a power function x ** n to adjust the probability.
- class LogProbQueueModel[source]
Bases: ProbQueueModel
This model uses a logarithmic function log(1 + x) to adjust the probability.
- class ProbQueueModel2[source]
Bases: ProbQueueModel
This model is a variation of the ProbQueueModel that changes the probability calculation to f(back) / f(front + back) from f(back) / (f(front) + f(back)).
- class LogProbQueueModel2[source]
Bases: ProbQueueModel2
This model uses a logarithmic function log(1 + x) to adjust the probability.
- class PowerProbQueueModel2(n)[source]
Bases: ProbQueueModel2
This model uses a power function x ** n to adjust the probability.
- class ProbQueueModel3[source]
Bases: ProbQueueModel
This model is a variation of the ProbQueueModel that changes the probability calculation to 1 - f(front / (front + back)) from f(back) / (f(front) + f(back)).
- class PowerProbQueueModel3(n)[source]
Bases: ProbQueueModel3
This model uses a power function x ** n to adjust the probability.
Stat
- class Stat(hbt, utc=True, unit='us', allocated=100000)[source]
Calculates performance statistics and generates a summary of performance metrics.
- Parameters:
- annualised_return(denom=None, include_fee=True, trading_days=365)[source]
Calculates annualised return.
- Parameters:
denom (float | None) – If provided, annualised return will be calculated in percentage terms by dividing by the specified denominator.
include_fee (bool) – If set to True, fees will be included in the calculation; otherwise, fees will be excluded.
trading_days (int) – The number of trading days per year used for annualisation.
- Returns:
Annualised return.
- daily_trade_amount()[source]
Retrieves the average value of daily trades.
- Returns:
Average value of daily trades.
- daily_trade_num()[source]
Retrieves the average number of daily trades.
- Returns:
Average number of daily trades.
- daily_trade_volume()[source]
Retrieves the average quantity of daily trades.
- Returns:
Average quantity of daily trades.
- datetime()[source]
Converts and returns a DateTime series from the timestamp.
- Returns:
DateTime series by converting from the timestamp.
- equity(resample=None, include_fee=True, datetime=True)[source]
Calculates equity values.
- Parameters:
resample (str | None) – If provided, equity values will be resampled based on the specified period.
include_fee (bool) – If set to True, fees will be included in the calculation; otherwise, fees will be excluded.
datetime (bool) – If set to True, the timestamp is converted to a DateTime, which takes a long time. If you want fast computation, set it to False.
- Returns:
The calculated equity values.
- property recorder
Returns a Recorder instance to record performance statistics.
- riskreturnratio(include_fee=True)[source]
Calculates the Risk-Return Ratio, which is the Annualised Return divided by the Maximum Drawdown over the entire period.
- Parameters:
include_fee (bool) – If set to True, fees will be included in the calculation; otherwise, fees will be excluded.
- Returns:
Risk-Return Ratio
- sharpe(resample, include_fee=True, trading_days=365)[source]
Calculates the Sharpe Ratio without considering benchmark rates.
- Parameters:
- Returns:
The calculated Sharpe Ratio.
- summary(capital=None, resample='5min', trading_days=365)[source]
Generates a summary of performance metrics.
- Parameters:
capital (float | None) – The initial capital investment for the strategy. If provided, it is used as the denominator to calculate annualized return and MDD in percentage terms. Otherwise, absolute values are displayed.
resample (str) – The resampling period, such as ‘1s’, ‘5min’.
trading_days (int) – The number of trading days per year used for annualisation.
- class Recorder(timestamp, mid, balance, position, fee, trade_num, trade_qty, trade_amount)[source]
- Parameters:
timestamp (ListType(int64)) –
mid (ListType(float64)) –
balance (ListType(float64)) –
position (ListType(float64)) –
fee (ListType(float64)) –
trade_num (ListType(int64)) –
trade_qty (ListType(float64)) –
trade_amount (ListType(float64)) –
Data Validation
- convert_to_struct_arr(data, add_exch_local_ev=True)[source]
Converts the 2D ndarray currently used in Python hftbacktest into the structured array that can be used in Rust hftbacktest.
- Parameters:
- Returns:
Converted structured array.
- Return type:
- correct(data, base_latency, tick_size=None, lot_size=None, err_bound=1e-08, method='separate')[source]
Validates the specified data and automatically corrects negative latency and unordered rows. See validate_data(), correct_local_timestamp(), correct_exch_timestamp(), and correct_exch_timestamp_adjust().
- Parameters:
data (ndarray[Any, dtype[ScalarType]] | DataFrame) – Data to be checked and corrected.
base_latency (float) – The value to be added to the feed latency. See
correct_local_timestamp()
.tick_size (float | None) – Minimum price increment for the specified data.
lot_size (float | None) – Minimum order quantity for the specified data.
err_bound (float) – Error bound used to verify if the specified tick_size or lot_size aligns with the price and quantity.
method (Literal['separate', 'adjust']) – The method to correct reversed exchange timestamp events.
separate: Use correct_exch_timestamp().
adjust: Use correct_exch_timestamp_adjust().
- Returns:
Corrected data
- Return type:
- correct_event_order(sorted_exch, sorted_local, add_exch_local_ev)[source]
Corrects exchange timestamps that are reversed by splitting each row into separate events, ordered by both exchange and local timestamps, through duplication. See data for details.
- Parameters:
- Returns:
Adjusted data with corrected exchange timestamps.
- Return type:
- correct_exch_timestamp(data, num_corr)[source]
Corrects exchange timestamps that are reversed by splitting each row into separate events, ordered by both exchange and local timestamps, through duplication. See data for details.
- correct_exch_timestamp_adjust(data)[source]
Corrects reversed exchange timestamps by adjusting the local timestamp value for proper ordering. It sorts the data by exchange timestamp and fixes out-of-order local timestamps by setting their value to the previous value, ensuring correct ordering.
- correct_local_timestamp(data, base_latency)[source]
Adjusts the local timestamp if the feed latency is negative by offsetting the maximum negative latency value as follows:
feed_latency = local_timestamp - exch_timestamp
adjusted_local_timestamp = local_timestamp - min(feed_latency, 0) + base_latency
- Parameters:
data (ndarray[Any, dtype[ScalarType]] | DataFrame) – Data to be corrected.
base_latency (float) – Due to discrepancies in system time between the exchange and the local machine, latency may be measured inaccurately, resulting in negative latency values. The conversion process automatically adjusts for positive latency but may still produce zero latency cases. By adding base_latency, more realistic values can be obtained. The unit should be the same as the feed data’s timestamp unit.
- Returns:
Adjusted data with corrected timestamps
- Return type:
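As a hedged sketch of one plausible reading of this adjustment (offsetting by the maximum negative latency value), not the library’s exact implementation:

```python
import numpy as np

# Hypothetical timestamps with one negative-latency row.
local_ts = np.array([100.0, 205.0, 310.0])
exch_ts = np.array([102.0, 200.0, 305.0])
base_latency = 1.0

feed_latency = local_ts - exch_ts  # [-2., 5., 5.]; the first row is negative
# Shift all local timestamps by the largest negative latency plus base_latency,
# so every corrected feed latency becomes positive.
offset = -min(feed_latency.min(), 0.0) + base_latency  # 3.0
adjusted_local_ts = local_ts + offset
```

After the shift, the smallest corrected feed latency equals base_latency, which is why a realistic, positive base_latency is recommended above.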
- validate_data(data, tick_size=None, lot_size=None, err_bound=1e-08)[source]
Validates the specified data for the following aspects, excluding user events. Validation results will be printed out:
Ensures data’s price aligns with tick_size.
Ensures data’s quantity aligns with lot_size.
Ensures data’s local timestamp is ordered.
Ensures data’s exchange timestamp is ordered.
- Parameters:
data (ndarray[Any, dtype[ScalarType]] | DataFrame) – Data to be validated.
tick_size (float | None) – Minimum price increment for the given asset.
lot_size (float | None) – Minimum order quantity for the given asset.
err_bound (float) – Error bound used to verify if the specified tick_size or lot_size aligns with the price and quantity.
- Returns:
The number of rows with reversed exchange timestamps.
- Return type:
Data Utilities
hftbacktest.data.utils.binancefutures module
- convert(input_filename, output_filename=None, opt='', base_latency=0, compress=False, structured_array=False, timestamp_unit='us', combined_stream=True)[source]
Converts raw Binance Futures feed stream file into a format compatible with HftBacktest.
File Format:
local_timestamp raw_stream 1660228023037049 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1660228023941,"T":1660228023931,"s":"BTCUSDT","U":1801732831593,"u":1801732832589,"pu":1801732831561,"b":[["2467.10","0.000"],["12006.00","0.001"],["24427.70","4.350"],["24620.30","0.172"],["24644.00","44.832"],["24645.40","0.203"],["24652.80","4.900"],["24664.10","4.279"],["24666.50","0.554"],["24666.80","6.764"],["24668.70","7.428"],["24670.90","2.000"],["24671.00","0.000"],["24672.70","0.000"],["24688.30","0.000"]],"a":[["24653.60","0.000"],["24669.80","0.000"],["24670.20","0.000"],["24670.70","0.000"],["24670.90","0.000"],["24671.00","20.812"],["24672.10","0.000"],["24672.30","0.001"],["24674.60","1.520"],["24674.80","0.000"],["24684.20","4.519"],["24684.30","0.202"],["24685.00","0.937"],["24690.90","4.827"],["24693.60","1.500"],["24729.10","0.171"]]}} 1660228023038319 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1660228023977,"T":1660228023966,"s":"BTCUSDT","U":1801732832805,"u":1801732834115,"pu":1801732832589,"b":[["2467.10","0.008"],["24643.00","4.457"],["24656.30","0.010"],["24657.70","0.005"],["24658.80","1.000"],["24658.90","1.500"],["24659.50","3.781"],["24659.70","1.806"],["24659.90","0.105"],["24660.60","0.787"],["24666.30","5.033"],["24666.40","0.012"],["24666.50","0.556"],["24668.70","7.426"],["24668.90","0.000"],["24670.90","2.535"],["24680.00","0.000"],["24688.30","0.000"]],"a":[["24653.60","0.000"],["24670.10","0.000"],["24670.60","0.000"],["24670.70","0.000"],["24670.90","0.000"],["24671.00","20.642"],["24672.00","0.000"],["24672.10","0.000"],["24673.50","0.145"],["24673.60","1.567"],["24674.50","3.746"],["24674.60","1.520"],["24678.30","1.304"],["24678.40","0.001"],["24678.80","0.546"],["24678.90","0.002"],["24681.60","0.020"],["24681.70","0.613"],["24681.90","0.077"],["24682.10","3.000"],["24682.20","0.000"],["24683.70","0.163"],["24683.80","4.162"],["24684.00","1.227"],["24684.20","4.519"],["24684.30","0.202"],["24684.90","1.3
31"],["24685.70","0.156"],["24685.80","0.325"],["24686.70","0.648"],["24692.60","0.040"],["24700.00","47.420"],["24729.10","0.006"]]}} 1660228023043260 {"stream":"btcusdt@trade","data":{"e":"trade","E":1660228023980,"T":1660228023973,"s":"BTCUSDT","t":2691833663,"p":"24670.90","q":"0.022","X":"MARKET","m":true}} 1660228023052991 {"stream":"btcusdt@trade","data":{"e":"trade","E":1660228023991,"T":1660228023983,"s":"BTCUSDT","t":2691833664,"p":"24671.00","q":"0.001","X":"MARKET","m":false}} 1660228023071108 {"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1660228024010,"T":1660228024002,"s":"BTCUSDT","U":1801732834136,"u":1801732835323,"pu":1801732834115,"b":[["2467.10","0.000"],["12006.00","0.000"],["24599.40","0.641"],["24603.20","0.104"],["24625.50","0.152"],["24645.20","0.476"],["24646.80","0.081"],["24652.60","0.254"],["24664.10","4.279"],["24666.50","0.878"],["24668.80","0.004"],["24670.90","2.513"],["24688.30","0.000"],["24787.00","0.000"]],"a":[["24653.60","0.000"],["24668.10","0.000"],["24668.70","0.000"],["24669.50","0.000"],["24669.80","0.000"],["24670.00","0.000"],["24670.60","0.000"],["24670.70","0.000"],["24670.90","0.000"],["24671.00","20.641"],["24672.20","0.000"],["24672.30","0.001"],["24673.50","0.040"],["24673.90","0.105"],["24674.70","2.139"],["24674.80","0.000"],["24683.70","0.963"],["24683.90","0.009"],["24685.70","0.556"],["24709.30","0.254"],["24723.80","0.000"],["24728.30","0.193"],["24729.50","4.477"],["24739.40","0.807"],["24743.20","0.235"],["24795.00","0.130"]]}} 1660228023117894 
{"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1660228024044,"T":1660228024034,"s":"BTCUSDT","U":1801732835406,"u":1801732836571,"pu":1801732835323,"b":[["2467.10","0.000"],["24337.10","2.462"],["24616.50","1.050"],["24619.00","0.235"],["24640.00","5.148"],["24649.80","2.805"],["24650.00","14.374"],["24651.90","3.000"],["24653.30","1.400"],["24658.70","1.142"],["24658.80","0.000"],["24659.60","3.263"],["24659.70","0.006"],["24660.50","0.840"],["24660.60","0.387"],["24662.20","0.202"],["24663.10","7.147"],["24664.00","0.922"],["24664.20","0.131"],["24664.50","0.027"],["24666.20","7.066"],["24666.40","0.012"],["24668.80","0.002"],["24669.30","0.002"],["24670.20","0.811"],["24670.90","5.817"],["24688.30","0.000"]],"a":[["24653.60","0.000"],["24669.80","0.000"],["24669.90","0.000"],["24670.90","0.000"],["24671.00","20.121"],["24672.10","0.000"],["24672.80","0.000"],["24674.60","1.520"],["24675.30","0.421"],["24681.20","0.239"],["24681.50","1.343"],["24681.60","0.020"],["24681.70","0.213"],["24683.60","2.929"],["24683.70","0.163"],["24683.80","2.162"],["24684.70","0.646"],["24684.90","0.731"],["24692.90","0.321"],["24693.10","0.040"],["24700.70","0.537"],["24703.60","0.210"],["24721.50","7.245"]]}} 1660228023125009 {"stream":"btcusdt@trade","data":{"e":"trade","E":1660228024062,"T":1660228024055,"s":"BTCUSDT","t":2691833665,"p":"24670.90","q":"0.002","X":"MARKET","m":true}} 1660228023128966 {"stream":"btcusdt@trade","data":{"e":"trade","E":1660228024067,"T":1660228024061,"s":"BTCUSDT","t":2691833666,"p":"24670.90","q":"0.020","X":"MARKET","m":true}} 1660228023138740 
{"stream":"btcusdt@depth@0ms","data":{"e":"depthUpdate","E":1660228024077,"T":1660228024066,"s":"BTCUSDT","U":1801732836639,"u":1801732837803,"pu":1801732836571,"b":[["2467.10","0.000"],["24659.00","0.000"],["24659.30","2.500"],["24663.00","1.038"],["24664.20","0.118"],["24666.20","7.065"],["24666.50","0.554"],["24666.70","3.987"],["24666.80","7.088"],["24666.90","0.014"],["24667.40","1.506"],["24668.90","0.006"],["24670.10","0.272"],["24670.90","6.726"],["24688.30","0.000"]],"a":[["24653.60","0.000"],["24668.70","0.000"],["24670.30","0.000"],["24670.50","0.000"],["24670.90","0.000"],["24679.00","0.001"],["24703.10","1.500"],["24710.50","0.057"],["24728.30","0.028"],["24768.50","0.318"],["24980.10","5.446"],["25050.00","119.300"]]}} 1660228023149748 {"stream":"btcusdt@trade","data":{"e":"trade","E":1660228024088,"T":1660228024081,"s":"BTCUSDT","t":2691833667,"p":"24671.00","q":"0.063","X":"MARKET","m":false}}
- Parameters:
input_filename (str) – Input filename with path.
output_filename (str | None) – If provided, the converted data will be saved to the specified filename in npz format.
opt (Literal['', 'm', 't', 'mt']) – Additional processing options:
m: Processes the markPriceUpdate stream with the following custom event IDs.
index: 100
mark price: 101
funding rate: 102
t: Processes the bookTicker stream with the following custom event IDs.
best bid: 103
best ask: 104
base_latency (float) – The value to be added to the feed latency. See
correct_local_timestamp()
.compress (bool) – If this is set to True, the output file will be compressed.
structured_array (bool) – If this is set to True, the output is converted into the new format(currently only Rust impl).
timestamp_unit (Literal['us', 'ns']) – The timestamp unit for exchange timestamp to be converted in. Binance provides timestamps in milliseconds. Both local timestamp and exchange timestamp should be in the same unit.
combined_stream (bool) – Raw stream type. combined stream: {“stream”:”solusdt@bookTicker”,”data”:{“e”:”bookTicker”,”u”:4456408609867,”s”:”SOLUSDT”,”b”:”142.4440”,”B”:”50”,”a”:”142.4450”,”A”:”3”,”T”:1713571200009,”E”:1713571200010}} regular stream: {“e”:”bookTicker”,”u”:4456408609867,”s”:”SOLUSDT”,”b”:”142.4440”,”B”:”50”,”a”:”142.4450”,”A”:”3”,”T”:1713571200009,”E”:1713571200010}
- Returns:
Converted data compatible with HftBacktest.
- Return type:
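The combined_stream distinction above only changes where the event payload lives: a combined-stream line wraps the event under a "data" key alongside a "stream" key, while a regular-stream line is the event itself. A minimal sketch of unwrapping either form (the function name is hypothetical, not part of hftbacktest's API):

```python
import json

def extract_payload(line: str) -> dict:
    """Return the inner event dict, whether the line is a combined-stream
    message (payload wrapped under "data") or a regular-stream message."""
    msg = json.loads(line)
    # Combined streams carry a "stream" key and nest the event under "data".
    return msg["data"] if "stream" in msg else msg

combined = '{"stream":"solusdt@bookTicker","data":{"e":"bookTicker","b":"142.4440","a":"142.4450"}}'
regular = '{"e":"bookTicker","b":"142.4440","a":"142.4450"}'

# Both forms yield the same event payload.
assert extract_payload(combined) == extract_payload(regular)
```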
hftbacktest.data.utils.binancehistmktdata module
- convert(depth_filename, trades_filename, output_filename=None, buffer_size=100000000, feed_latency=0, base_latency=0, method='separate', depth_has_header=None, trades_has_header=None)[source]
Converts Binance Historical Market Data files into a format compatible with HftBacktest. Since this data does not include a local timestamp, it lacks feed latency information, which can result in a significant discrepancy between live and backtest results. Collecting the feed data yourself, or obtaining high-quality data from a data vendor, is strongly recommended.
https://www.binance.com/en/landing/data
- Parameters:
depth_filename (str) – Depth data filename.
trades_filename (str) – Trades data filename.
output_filename (str | None) – If provided, the converted data will be saved to the specified filename in npz format.
buffer_size (int) – Sets a preallocated row size for the buffer.
feed_latency (float) – Artificial feed latency value to be added to the exchange timestamp to create the local timestamp.
base_latency (float) – The value to be added to the feed latency. See correct_local_timestamp().
method (Literal['separate', 'adjust']) – The method used to correct reversed exchange timestamp events. See validation.correct().
depth_has_header (bool | None) – True if the given file has a header; it will be detected automatically if set to None.
trades_has_header (bool | None) – True if the given file has a header; it will be detected automatically if set to None.
- Returns:
Converted data compatible with HftBacktest.
- Return type:
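Because the historical files carry only exchange timestamps, a local timestamp has to be synthesized, and base_latency then guards against clock skew. A minimal sketch of the idea behind correct_local_timestamp() (this is an illustration, not hftbacktest's actual implementation): if skew makes some local timestamps earlier than their exchange timestamps, shift every local timestamp forward so the smallest observed feed latency equals base_latency.

```python
import numpy as np

def correct_local_timestamp_sketch(local_ts: np.ndarray,
                                   exch_ts: np.ndarray,
                                   base_latency: float) -> np.ndarray:
    # Feed latency per event; negative values indicate clock skew,
    # since a message cannot arrive before the exchange sent it.
    min_latency = np.min(local_ts - exch_ts)
    if min_latency < base_latency:
        # Shift all local timestamps forward by a constant so the
        # minimum latency becomes base_latency.
        local_ts = local_ts + (base_latency - min_latency)
    return local_ts

# Example: the second event appears to arrive before the exchange sent it.
local = correct_local_timestamp_sketch(
    np.array([5, 8, 12]), np.array([10, 9, 11]), base_latency=1
)
```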
- convert_snapshot(snapshot_filename, output_filename=None, feed_latency=0, has_header=None)[source]
Converts Binance Historical Market Data files into a format compatible with HftBacktest. Since this data does not include a local timestamp, it lacks feed latency information, which can result in a significant discrepancy between live and backtest results. Collecting the feed data yourself, or obtaining high-quality data from a data vendor, is strongly recommended.
https://www.binance.com/en/landing/data
- Parameters:
snapshot_filename (str) – Snapshot filename.
output_filename (str | None) – If provided, the converted data will be saved to the specified filename in npz format.
feed_latency (float) – Artificial feed latency value to be added to the exchange timestamp to create the local timestamp.
has_header (bool | None) – True if the given file has a header; it will be detected automatically if set to None.
- Returns:
Converted data compatible with HftBacktest.
- Return type:
hftbacktest.data.utils.snapshot module
- create_last_snapshot(data, tick_size, lot_size, initial_snapshot=None, output_snapshot_filename=None, compress=False, structured_array=False)[source]
Creates a snapshot of the last market depth for the specified data, which can be used as the initial snapshot data for subsequent data.
- Parameters:
data (str | ndarray[Any, dtype[ScalarType]] | DataFrame | List[str | ndarray[Any, dtype[ScalarType]] | DataFrame]) – Data to be processed to obtain the last market depth snapshot.
tick_size (float) – Minimum price increment for the given asset.
lot_size (float) – Minimum order quantity for the given asset.
initial_snapshot (str | ndarray[Any, dtype[ScalarType]] | DataFrame | None) – The initial market depth snapshot.
output_snapshot_filename (str | None) – If provided, the snapshot data will be saved to the specified filename in npz format.
compress (bool) – If this is set to True, the output file will be compressed.
structured_array (bool) – If this is set to True, the output is converted into the new format (currently the Rust implementation only).
- Returns:
Snapshot of the last market depth compatible with HftBacktest.
- Return type:
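The core idea behind create_last_snapshot can be sketched simply: replay every depth update in time order, treating a zero quantity as a level deletion, and keep whatever levels remain at the end. This is a conceptual illustration, not the library's implementation:

```python
def last_depth_snapshot(events):
    """events: iterable of (side, price, qty) depth updates in time order,
    where side is 'bid' or 'ask' and qty 0.0 deletes the level."""
    book = {"bid": {}, "ask": {}}
    for side, price, qty in events:
        if qty == 0.0:
            # A zero quantity removes the price level from the book.
            book[side].pop(price, None)
        else:
            book[side][price] = qty
    # Final state: bids from best (highest) down, asks from best (lowest) up.
    bids = sorted(book["bid"].items(), reverse=True)
    asks = sorted(book["ask"].items())
    return bids, asks
```

In the real function, the replayed updates may be seeded from initial_snapshot so that consecutive daily files chain together.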
hftbacktest.data.utils.tardis module
- convert(input_files, output_filename=None, buffer_size=100000000, ss_buffer_size=1000000, base_latency=0, snapshot_mode='process', compress=False, structured_array=False, timestamp_unit='us')[source]
Converts Tardis.dev data files into a format compatible with HftBacktest.
For Tardis's Binance Futures feed data, Tardis uses the 'E' event timestamp (the sending time) rather than the 'T' transaction time (when the matching occurs). As a result, the measured latency is slightly lower than the actual latency.
- Parameters:
input_files (List[str]) – Input filenames for both incremental book and trades files, e.g. ['incremental_book.csv', 'trades.csv'].
output_filename (str | None) – If provided, the converted data will be saved to the specified filename in npz format.
buffer_size (int) – Sets a preallocated row size for the buffer.
ss_buffer_size (int) – Sets a preallocated row size for the snapshot.
base_latency (float) – The value to be added to the feed latency. See correct_local_timestamp().
snapshot_mode (Literal['process', 'ignore_sod', 'ignore']) – If set to 'ignore', all snapshots are ignored; the order book will converge to a complete order book over time. If set to 'ignore_sod', only the SOD (Start of Day) snapshot is ignored. Since Tardis adds the SOD snapshot intentionally, not due to a message ID gap or disconnection, processing it may not be necessary to build a complete order book. See https://docs.tardis.dev/historical-data-details#collected-order-book-data-details for more details. Otherwise, all snapshot events are processed.
compress (bool) – If this is set to True, the output file will be compressed.
structured_array (bool) – If this is set to True, the output is converted into the new format (currently the Rust implementation only).
timestamp_unit (Literal['us', 'ns']) – The timestamp unit to convert the timestamps into. Tardis provides timestamps in microseconds.
- Returns:
Converted data compatible with HftBacktest.
- Return type:
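The three snapshot_mode options reduce to a simple event filter. A hedged sketch of that decision logic (the function and the (is_snapshot, is_sod, payload) tuple shape are illustrative assumptions, not the converter's internals):

```python
def filter_snapshot_events(events, snapshot_mode):
    """events: iterable of (is_snapshot, is_sod, payload) tuples.

    'process'    - keep every event, including all snapshots.
    'ignore_sod' - drop only the Start-of-Day snapshot events.
    'ignore'     - drop all snapshot events.
    """
    out = []
    for is_snapshot, is_sod, payload in events:
        if snapshot_mode == "ignore" and is_snapshot:
            continue
        if snapshot_mode == "ignore_sod" and is_snapshot and is_sod:
            continue
        out.append(payload)
    return out
```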
hftbacktest.data.utils.difforderbooksnapshot module
- class DiffOrderBookSnapshot(*args, **kwargs)[source]
Bases:
DiffOrderBookSnapshot
- class_type = jitclass.DiffOrderBookSnapshot<num_levels:int64, curr_bids:array(float64, 2d, A), curr_asks:array(float64, 2d, A), prev_bids:array(float64, 2d, A), prev_asks:array(float64, 2d, A), bid_delete_lvs:array(float64, 2d, A), ask_delete_lvs:array(float64, 2d, A), curr_bid_lv:int64, curr_ask_lv:int64, prev_bid_lv:int64, prev_ask_lv:int64, init:bool, tick_size:float64, lot_size:float64>
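As the curr_*/prev_* and *_delete_lvs fields suggest, this class turns consecutive full order book snapshots into incremental updates: changed or new levels become normal depth events, and levels that disappeared become explicit deletions. A dictionary-based sketch of that diff for one side of the book (illustrative only; the jitclass works on fixed-size float arrays for Numba):

```python
def snapshot_diff(prev_levels: dict, curr_levels: dict):
    """prev_levels / curr_levels: price -> qty mappings for one book side.

    Returns (updates, deletions): levels whose quantity changed or appeared,
    and vanished levels expressed as explicit qty-0.0 deletions.
    """
    updates = {p: q for p, q in curr_levels.items() if prev_levels.get(p) != q}
    deletions = {p: 0.0 for p in prev_levels if p not in curr_levels}
    return updates, deletions
```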