Welcome to Tardis.dev documentation pages
If you prefer accessing historical datasets in CSV format, please see downloadable CSV files docs.
Follow our API getting started guide to learn how to access historical data via our HTTP API and client libs that offer tick-level market data replay, both in exchange-native and normalized formats.
Consolidated real-time market data streaming API is available via our open source libraries that connect directly to exchanges' WebSocket APIs. We do not provide a hosted real-time API - see why.
See historical data details to learn exactly what, how and since when we collect data for each supported exchange and what data is available via our market data API and downloadable CSV files.
Refer to our FAQ or contact us via email. We're always here to help you with any questions you have.
BitMEX historical market data details - available data, coverage and data collection specifics
BitMEX historical data for all its instruments is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | XBTUSD | 2019-07-01
incremental_book_L2 | XBTUSD | 2019-07-01
book_snapshot_25 | XBTUSD | 2019-07-01
quotes | XBTUSD | 2019-07-01
derivative_ticker | XBTUSD | 2019-07-01
trades | FUTURES | 2020-03-01
incremental_book_L2 | FUTURES | 2020-03-01
liquidations | PERPETUALS | 2021-09-01
book_ticker | XBTUSD | 2022-01-01
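For illustration, here's a minimal Python sketch of grabbing one of those free first-day-of-month files; the URL pattern (exchange/data type/year/month/day/symbol) follows the downloadable CSV files docs and should be verified there:

```python
# Minimal sketch: download a free first-day-of-month BitMEX trades file.
# URL pattern assumed per the downloadable CSV files docs.
import urllib.request

url = "https://datasets.tardis.dev/v1/bitmex/trades/2019/07/01/XBTUSD.csv.gz"
urllib.request.urlretrieve(url, "bitmex-trades-2019-07-01-XBTUSD.csv.gz")
```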
See full downloadable CSV files documentation with datasets format spec, data samples and more.
See docs that show all available download options (download path customization, filename conventions and more).
See datasets API reference which allows downloading a single file at a time.
Historical data format is the same as provided by the real-time BitMEX WebSocket API, with the addition of local timestamps. If you'd like to work with a normalized data format instead (the same format for each exchange), see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
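As a quick orientation, here's a minimal sketch using the official tardis-client Python package (pip install tardis-client) to replay raw BitMEX trades; consult the Python client docs linked above for the authoritative API:

```python
import asyncio
from tardis_client import TardisClient, Channel

async def replay_bitmex_trades():
    tardis_client = TardisClient(api_key="YOUR_API_KEY")
    # replay() yields (local_timestamp, message) tuples, with messages
    # in exchange-native format, filtered by channel name and symbols
    messages = tardis_client.replay(
        exchange="bitmex",
        from_date="2019-07-01",
        to_date="2019-07-02",
        filters=[Channel(name="trade", symbols=["XBTUSD"])],
    )
    async for local_timestamp, message in messages:
        print(local_timestamp, message)

asyncio.run(replay_bitmex_trades())
```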
Tardis-machine is a locally runnable server that exposes an API allowing efficient requests for historical market data covering whole time periods, in contrast to the HTTP API which provides data in minute-by-minute slices.
See tardis-machine docs.
instrument
partial
messages returned via the HTTP API may contain data for all BitMEX instruments and require client-side filtering if only selected symbols were requested
orderBook10 - available since 2021-02-21
Market data collection infrastructure for BitMEX is located in GCP europe-west2 region (London, UK). Real-time market data is captured via single WebSocket connection.
Simply via email.
Historical market data details for each supported exchange — available symbols, channels, date ranges...
You'll find here per-exchange details about:
historical data availability date ranges — since when the historical data has been collected and is available
captured real-time market data channels (also described as streams, subscription topics, tables etc. in exchanges' docs) - available historical raw market data is sourced from the WebSocket real-time APIs provided by the exchanges and can be filtered by channels, e.g., to get historical trades for BitMEX, the trade channel needs to be provided alongside the requested instruments' symbols (via HTTP API or client libs function args).
symbols of recorded instruments/currency pairs
incidents - describing periods where, due to internal errors, data is missing for a given exchange
Some exchanges encode the requested symbol in the channel name, e.g., the Deribit trades.BTC-PERPETUAL.100ms channel. This is not the case with our API: we always treat channel name and symbol as separate inputs. In the Deribit example, the channel name would be trades and the symbol BTC-PERPETUAL. If a channel offers a choice of update frequency (e.g., 100ms vs raw tick-by-tick), the higher-frequency option is always chosen and recorded.
All market data collection is performed on highly available Google Cloud Platform Kubernetes clusters - London, UK (europe-west2 region) or Tokyo, Japan (asia-northeast1 region). Which data center location is used for a particular exchange is described on that exchange's historical data details page.
When an exchange provides a choice of real-time data frequency for specific data types (e.g., order book data), the most granular, non-aggregated data feed is always collected.
The choice of whether single or multiple WebSocket connections are used to record the full real-time data feed is made on a case by case basis - we take into account exchange API limits and latency, which may be higher or lower when a single connection is used. Detailed information on which strategy is used for a particular exchange is described on that exchange's historical data details page.
WebSocket connection is dynamically re-subscribed (or restarted for some exchanges) at 00:00 UTC every day in order to receive initial order book snapshots.
Each received message is timestamped with 100ns precision using a synchronized clock at arrival time and stored in ISO 8601 format.
Messages provided by exchanges' WebSocket feeds are stored without any modifications.
Checks for new instruments available on each exchange are performed every minute.
Market data collection services are constantly monitored both manually and via automated tools (monitoring, alert notifications) and have built-in self-healing capabilities. We also constantly monitor for upcoming exchanges' API changes and adapt to those beforehand.
There are multiple built-in checks detecting whether the connection to an exchange is healthy during the data collection process, such as:
validating subscription responses - if an exchange does not confirm subscriptions within 20 seconds, the connection is restarted
order book sequence number validation for exchanges that provide sequence numbers
validating JSON format, as in some unusual circumstances exchanges return data that is invalid JSON
stale connection detection - if no responses are received within a certain period (adjusted per exchange), the connection is most likely stale and gets automatically restarted
detection of an unusually small message count received from an exchange in a given time period, which likely means the connection is not healthy, e.g., receiving only 'pings' without data messages
and many more
Any incident caused by us (bugs, network errors etc.) is logged and available via API.
New market data is available with a 6 minute delay relative to real-time (T - 6min).
Historical market data available via the HTTP API provides order book snapshots at the beginning of each day (00:00 UTC) and every time the WebSocket connection has been closed while recording the real-time data feed (the connection is restarted and a new snapshot is provided via the fresh connection). It means that in order to be sure to receive initial order book snapshots, one must replay historical data from 00:00 UTC of a given day. It also means that there is a tiny gap in historical data (around 300-3000ms depending on the exchange) during re-subscription to the real-time WebSocket feed (every 24 hours) in order to receive order book snapshots.
Some exchanges do not provide initial order book snapshots when subscribing to WebSocket real-time feeds (like Binance, Bitstamp or Coinbase Pro full order book), hence for those a 'generated' snapshot is available instead (based on a REST API call) - details are specific to each exchange and are described on the per-exchange historical details pages.
Binance COIN Margined Futures historical market data details - instruments, data coverage and data collection specifics
Binance COIN Futures historical data for all its instruments is available since 2020-06-16.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD_200925 | 2020-07-01
incremental_book_L2 | BTCUSD_200925 | 2020-07-01
book_snapshot_25 | BTCUSD_PERP | 2020-11-01
quotes | BTCUSD_200925 | 2020-07-01
derivative_ticker | BTCUSD_200925 | 2020-07-01
trades | FUTURES | 2020-07-01
liquidations | PERPETUALS | 2021-09-01
Historical data format is the same as provided by the real-time Binance COIN Futures WebSocket API, with the addition of local timestamps. If you'd like to work with a normalized data format instead (the same format for each exchange), see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficient requests for historical market data covering whole time periods, in contrast to the HTTP API which provides data in minute-by-minute slices.
See tardis-machine docs.
markPrice @1s
indexPrice @1s
depth @0ms
depthSnapshot - generated channel with full order book snapshots
Binance COIN Futures real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots from the REST API (top 1000 levels) and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
During data collection, integrity of order book incremental updates is validated using sequence numbers provided by the real-time feed (pu and u fields) - in case a missed message is detected, the WebSocket connection is restarted. We also validate that the initial book snapshot fetched from the REST API overlaps with received depth messages.
openInterest - generated channel
Since Binance COIN Futures does not currently offer a real-time WebSocket open interest channel, we simulate it by fetching that info from the REST API (https://binance-docs.github.io/apidocs/delivery/en/#open-interest) every 30 seconds for each instrument. Such messages are marked with "stream":"<symbol>@openInterest" and "generated":true fields, and the data field has the same format as the REST API response.
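To make the 'generated' marker concrete, here's a hedged sketch of picking those REST-sourced messages out of a raw replay; the binance-delivery exchange id and the channel names are assumptions based on the description above:

```python
import asyncio
from tardis_client import TardisClient, Channel

async def print_generated_messages():
    client = TardisClient(api_key="YOUR_API_KEY")
    messages = client.replay(
        exchange="binance-delivery",  # assumed id for Binance COIN Futures
        from_date="2020-11-01",
        to_date="2020-11-02",
        filters=[
            Channel(name="depthSnapshot", symbols=["BTCUSD_PERP"]),
            Channel(name="openInterest", symbols=["BTCUSD_PERP"]),
        ],
    )
    async for local_timestamp, message in messages:
        # REST-sourced messages carry "generated": true and a synthetic stream name
        if message.get("generated"):
            print(local_timestamp, message["stream"])

asyncio.run(print_generated_messages())
```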
Market data collection infrastructure for Binance COIN Futures is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Deribit historical market data details - available data, coverage and data collection specifics
Deribit historical data for all its instruments (including all options) is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC-PERPETUAL | 2019-07-01
incremental_book_L2 | BTC-PERPETUAL | 2019-07-01
quotes | BTC-PERPETUAL | 2019-07-01
book_snapshot_25 | BTC-PERPETUAL | 2019-07-01
derivative_ticker | BTC-PERPETUAL | 2019-07-01
trades | OPTIONS | 2020-03-01
quotes | OPTIONS | 2020-03-01
options_chain | OPTIONS | 2020-03-01
book_snapshot_25 | OPTIONS | 2020-03-01
liquidations | PERPETUALS | 2021-09-01
See full downloadable CSV files documentation with datasets format spec, data samples and more.
See docs that show all available download options (download path customization, filename conventions and more).
See datasets API reference which allows downloading a single file at a time.
Historical data format is the same as provided by the real-time Deribit WebSocket v2 API, with the addition of local timestamps. If you'd like to work with a normalized data format instead (the same format for each exchange), see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficient requests for historical market data covering whole time periods, in contrast to the HTTP API which provides data in minute-by-minute slices.
See tardis-machine docs.
book
During data collection, integrity of order book incremental updates is validated using sequence numbers provided by Deribit's real-time feed (prev_change_id) - in case a missed message is detected, the WebSocket connection is restarted.
platform_state - available since 2019-12-31
deribit_volatility_index - available since 2021-04-01
book, perpetual, ticker and trades channels data was collected with the raw interval - no aggregation was applied.
Market data collection infrastructure for Deribit is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
OKX Futures historical market data details - instruments, data coverage and data collection specifics
OKX historical data for all its futures instruments is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC-USD-200103 | 2020-01-01
incremental_book_L2 | BTC-USD-200103 | 2020-01-01
book_snapshot_25 | BTC-USD-200103 | 2020-01-01
quotes | BTC-USD-200103 | 2020-01-01
derivative_ticker | BTC-USD-200103 | 2020-01-01
trades | FUTURES | 2020-03-01
liquidations | FUTURES | 2021-09-01
Historical data format is the same as provided by the real-time OKX WebSocket v5 API, with the addition of local timestamps (before 2021-12-23 it was the v3 API version). If you'd like to work with a normalized data format instead (the same format for each exchange), see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficient requests for historical market data covering whole time periods, in contrast to the HTTP API which provides data in minute-by-minute slices.
See tardis-machine docs.
V3 API channels (recorded until 2021-12-23):
futures/depth - available until 2020-01-29
futures/depth_l2_tbt - available since 2019-12-05
index/ticker - available since 2019-09-21
system/status - available since 2020-07-05
V5 API channels (recorded since 2021-12-23):
trades
trades-all
books-l2-tbt
bbo-tbt
tickers
open-interest
mark-price
price-limit
status
instruments
index-tickers
long-short-account-ratio
taker-volume
estimated-price
public-block-trades
public-struc-block-trades
liquidation-orders
taker-volume-contract
long-short-account-ratio-contract-top-trader
long-short-position-ratio-contract-top-trader
long-short-account-ratio-contract
Market data collection infrastructure for OKX Futures has been located in the AWS HK region (Hong Kong, China, VPC colo setup) since 2022-05-04T16:45; before that, starting from 2020-05-15, it was located in GCP asia-northeast1 (Tokyo, Japan), and initially it was located in the GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Binance historical market data details - currency pairs, data coverage and data collection specifics
Binance historical data for high cap currency pairs is available since 2019-03-30; data for all currency pairs is available since 2021-03-05.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSDT | 2019-12-01
incremental_book_L2 | BTCUSDT | 2019-12-01
book_snapshot_25 | BTCUSDT | 2019-12-01
quotes | BTCUSDT | 2019-12-01
trades | ETHUSDT | 2020-03-01
incremental_book_L2 | ETHUSDT | 2020-03-01
Historical data format is the same as provided by the real-time Binance WebSocket API, with the addition of local timestamps. If you'd like to work with a normalized data format instead (the same format for each exchange), see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficient requests for historical market data covering whole time periods, in contrast to the HTTP API which provides data in minute-by-minute slices.
See tardis-machine docs.
aggTrade - available since 2019-11-19
bookTicker - available since 2019-09-21
depth
Binance depth channel has been recorded with the fastest update speed the API allowed at the time. It means that until 2019-08-30 it was depth (without the @time suffix), with book updates pushed every 1000ms, and after that date it was depth@100ms, with book updates pushed every 100ms (a new API feature).
depthSnapshot - generated channel with full order book snapshots
Binance real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots from the REST API (top 1000 levels) and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
During data collection, integrity of order book incremental updates is validated using sequence numbers provided by the real-time feed (U and u fields) - in case a missed message is detected, the WebSocket connection is restarted. We also validate that the initial book snapshot fetched from the REST API overlaps with received depth messages.
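For reference, here's a minimal sketch of the same kind of continuity check applied client-side, assuming Binance's rule that each depth event's first update id (U) continues from the previous event's last update id (u):

```python
# Sketch: gap detection for Binance depth updates using U/u sequence fields.
def check_continuity(prev_u, event):
    """Return the event's last update id; raise when a gap is detected."""
    if prev_u is not None and event["U"] != prev_u + 1:
        # at collection time a detected gap triggers a WebSocket restart
        raise RuntimeError(f"gap: expected U={prev_u + 1}, got {event['U']}")
    return event["u"]

prev_u = None
for event in [{"U": 1, "u": 5}, {"U": 6, "u": 9}]:  # toy events
    prev_u = check_continuity(prev_u, event)
```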
Market data collection infrastructure for Binance has been located in GCP asia-northeast1 (Tokyo, Japan) since 2020-05-18; before that it was located in the GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Tardis.dev provides the most comprehensive and granular cryptocurrency market data products in the industry and offers:
access to high frequency historical data including the most granular tick level order book updates and trades for both derivatives and top spot cryptocurrency exchanges
non-standard data types: historical funding, open interest, indexes, liquidations and more
fast and convenient data access both via API and downloadable CSV files
data sourced from real-time WS market data feeds with complete control and transparency over how the data is recorded
exchange-native and normalized data format support
time-machine market replay allowing reconstruction of state of the limit order book at any given moment in time across all supported cryptocurrency markets
fair, transparent pricing with discounts for solo traders, small prop shops and academic researchers
consolidated real-time market data streaming support via Tardis.dev open source libraries that connect directly to exchanges' public WebSocket APIs (no API key required)
Data use cases
market microstructure and order book dynamics research
trading execution optimization
tick-level granularity market simulation
liquidity and lead-lag analysis
backtesting and optimization of trading strategies
full historical order book reconstruction at any given point in time
training machine learning models
alpha generation
designing quantitative models
academic research
data visualizations
In total, over 40 000 distinct instruments & currency pairs across leading derivatives and spot cryptocurrency exchanges are supported. We collect and provide data for all instruments & currency pairs available on a given exchange, with some exceptions for spot exchanges where we collect high cap currency pairs only (due to exchanges' API limitations).
Binance
Bitfinex
Kraken Futures (Crypto Facilities)
Kucoin
We do provide discounts in a transparent form via different subscription types.
Yes, if you'd like to test the service (data quality, coverage, API performance etc.), we offer generous free trials. Simply reach out to us and we'll set up a trial account for you.
Our support team has in-depth knowledge of market data and exchanges' API peculiarities, plus programming and data analysis expertise. You get answers straight from people whose day to day job is overseeing and maintaining the market data collection process and infrastructure.
For business subscriptions we provide a dedicated communication channel (via Telegram Messenger, email or Zoom calls) for your company, where our team is on standby every business day (7AM - 3PM UTC) to answer any questions you may have. For pro subscriptions and one-off purchases we provide email based support with 24-48 business hours initial response time.
For academic and solo subscriptions there is no dedicated support provided, only self-service.
Since cryptocurrency exchanges' market data APIs are public, anyone can use them to collect the data, but it's a time consuming and resource intensive undertaking (the exchanges we support publish ~1000GB of new data every day) that requires investment in proper infrastructure, constant monitoring and oversight (exchange API changes, rate-limit monitoring, new exchange integrations, unexpected connection issues, latency monitoring etc.), not to mention the implementation costs of data collection, storage and distribution services. All in all, we think our offering is comprehensive, transparent and fair, provides good value, and saves you time and money in comparison to an in-house solution, allowing you to focus on your core objective, not on data management intricacies.
You can access historical market data via API which provides raw data in exchange native format or download CSV datasets with trades, incremental order book L2 updates, order book snapshots, options chains, quotes, derivative tickers (open interest, funding, mark price, index price) and liquidations. Our client libs provide data in normalized format as well, which can be more flexible than CSV datasets for some use cases but also slower to download due to on-demand, client-side data normalization overhead in comparison to ready to download CSV files.
Data is available since 2019-03-30 for the majority of the supported exchanges (that existed at that time).
exchange
available since
2019-03-30
2019-03-30
2019-11-17
2020-06-16
2019-03-30
2019-08-01
2019-03-30
2019-03-30
2020-02-01
2019-03-30
2019-11-19
2020-03-28
2020-10-30
2019-11-19
2019-09-14
2019-05-23
2019-03-30
2019-03-30
2019-06-04
2019-03-30
2019-08-30
2020-07-01
2019-11-07
2021-12-04
2021-04-06
2023-01-20
2022-08-16
2023-02-23
2021-03-03
2020-03-17
2020-03-30
2021-03-28
2020-05-22
2019-09-25
2020-07-01
2020-07-01
2023-01-13
2022-06-01
2019-11-19
2019-08-30
2019-11-19
2020-07-14
2019-06-04
Yes, see downloadable CSV files documentation for more details.
Any programming language that can communicate using HTTPS can communicate with our HTTP API.
We do provide official Python and Node.js clients that offer fast and convenient access to tick-level historical market data.
Finally, our open source, locally runnable tardis-machine server, with built-in local data caching, provides market data normalization, custom order book snapshot capabilities and real-time market data streaming support that connects directly to exchanges' WebSocket APIs. It provides both streaming HTTP and WebSocket endpoints returning market data for whole time periods (in contrast to the Tardis.dev HTTP API, where a single call returns data for a single minute time period) and is available via npm and as a Docker image.
Historical market data provided by the HTTP API can be accessed via HTTPS.
The locally runnable tardis-machine server provides both HTTP and WebSocket based APIs for accessing both historical and real-time market data.
Exchanges' market data WebSocket APIs are designed to publish real-time feeds, not historical ones. The locally runnable tardis-machine server's WebSocket API bridges that gap and allows 'replaying' historical market data from any given past point in time, with the same data format and 'subscribe' logic as the real-time exchanges' APIs. In many cases existing exchanges' WebSocket clients can be used to connect to this endpoint just by changing the URL, and receive market data in exchange-native format for date ranges specified in the URL query string params.
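For example, here's a hedged sketch using Python's third-party websockets package against a locally running tardis-machine; the port and /ws-replay path follow tardis-machine's documented defaults and should be double-checked against the tardis-machine docs:

```python
# Sketch: replay historical BitMEX data through a local tardis-machine
# instance - the subscribe message is the exchange-native one, only the
# URL differs from live trading. pip install websockets
import asyncio
import json
import websockets

async def replay():
    url = "ws://localhost:8001/ws-replay?exchange=bitmex&from=2019-07-01&to=2019-07-02"
    async with websockets.connect(url) as ws:
        # same subscribe payload as BitMEX's real-time API
        await ws.send(json.dumps({"op": "subscribe", "args": ["trade:XBTUSD"]}))
        async for message in ws:
            print(message)

asyncio.run(replay())
```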
We do not provide a hosted real-time market data API: given that everyone can access exchanges' APIs directly, for free and without restrictions, relying on a 3rd party for such a crucial piece of infrastructure does not make sense (additional latency and another SPOF). Instead we developed a locally runnable (self hosted) server and open source libraries that offer consolidated real-time normalized market data streaming capabilities, connect directly to exchanges' WebSocket APIs and are completely free to use.
There are no API rate limits for the downloadable CSV files API.
The raw data replay API for professional level subscriptions is limited to 30 million requests per day. For business level subscriptions there are no rate limits for the raw data replay API as long as your behavior does not negatively impact other customers' API usage experience. If that's the case, we'll contact you via email and do our best to help you sort it out - in most cases it's a download client bug that repeatedly downloads the same data in a loop.
An API key can be obtained on the Tardis.dev website via the order form. You'll receive it via email after a successful order.
Contact us immediately and we will generate a new API key for you.
Highly available Google Cloud Platform Kubernetes clusters located in London, UK (europe-west2 region) and Tokyo, Japan (asia-northeast1 region)
Two independent, geo-redundant, highly durable storage services
High performance API servers deployed across network of data centers around the globe
We do not have a formal SLA in place yet, but all infrastructure is set up to provide the highest availability possible on both the data collection and distribution side, with a geo-redundant setup. Both data collection services and public APIs are constantly monitored from multiple locations, and our team is immediately notified in case of any issue. We don't practice maintenance that would affect API availability, but in the very rare circumstance that it would happen, we'll communicate that in advance. If a formal SLA is something that your business requires, contact us.
We provide the most comprehensive and granular market data on the market, sourced from real-time WebSocket APIs with complete control and transparency over how the data is recorded.
Via downloadable CSV data files, the following normalized tick-level data types are available:
order book snapshots (top 25 and top 5 levels)
derivative tick info (open interest, funding rate, mark price, index price)
The raw data API, available for pro and business subscriptions, provides data in exchange-native format. See historical data details to learn about real-time channels captured for each exchange. Each captured channel can be considered a different exchange specific data type (for example the Binance bookTicker channel, or the BitMEX liquidation channel).
We also provide the following normalized data types via our client libs (normalization is done client-side, using the raw data API as a data source):
trades
order book L2 updates
order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)
quotes
derivative tick info (open interest, funding rate, mark price, index price)
liquidations
options summary
OHLCV
volume/tick based trade bars
We always collect and provide data with the most granularity an exchange can offer via its real-time WS feeds. High frequency can mean different things for different exchanges due to exchange API limitations. For example, for Coinbase Pro it can mean L3 order book data (market-by-order), for Binance Futures all order book L2 real-time updates, and for Binance Spot order book updates aggregated in 100ms intervals.
Raw market data is sourced from exchanges' real-time WebSocket APIs. For cases where an exchange lacks a WebSocket API for a particular data type, we fall back to polling the REST API periodically, e.g., Binance Futures open interest data.
Recording exchanges' real-time WebSocket feeds allows us to preserve and provide the most granular data that exchange APIs can offer, including data that is simply not available via their REST APIs, like tick level order book updates. Historical data sourced from WebSocket real-time feeds adheres to what you'll see when trading live and can be used to exactly replicate live conditions, even if that means occasional connection drops causing small data gaps, real-time data publishing delays especially during larger market moves, duplicated trades or crossed books in some edge cases. We find that trade-off acceptable: even if the data isn't as clean and corrected as data sourced from REST APIs, it allows for more insight into market microstructure and various unusual exchange behaviors that simply can't be captured otherwise. A simple example would be latency spikes on many exchanges during increased volatility periods, where the exchange publishes trade/order book/quote WebSocket messages with larger than usual latency or simply skips some of the updates and then returns them in one batch. Querying the REST API would result in a nice, clean trade history, but such data wouldn't fully reflect real actionable market behavior and would result in unrealistic backtesting results, breaking in real-time scenarios.
L2 data (market-by-price) includes bid and ask orders aggregated by price level and can be used to analyze, among other things:
order book imbalance
average execution cost
average liquidity away from midpoint
average spread
hidden interest (i.e., iceberg orders)
We provide L2 data both in CSV format (as incremental order book L2 updates and tick level order book snapshots, top 25 and top 5 levels) as well as in exchange-native format via API and client libraries that can perform full order book reconstruction client-side.
L3 data (market-by-order) includes every order book order addition, update, cancellation and match, and can be used to analyze, among other things:
order resting time
order fill probability
order queue dynamics
Historical L3 data is currently available via API for Bitfinex, Coinbase Pro and Bitstamp - the remaining supported exchanges provide L2 data only.
We always collect full depth order book data as long as the exchange's WebSocket API supports it. The table below shows the current state of affairs for each supported exchange.
exchange | order book depth | order book updates frequency
| full order book depth snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| top 1000 levels initial order book snapshot, full depth incremental order book updates | real-time, dynamically adjusted
| top 1000 levels initial order book snapshot, full depth incremental order book updates | real-time, dynamically adjusted
| top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms
| top 100 levels initial order book snapshot and updates | real-time
| top 400 levels initial order book snapshot and updates | real-time
| top 400 levels initial order book snapshot and updates | real-time
| top 400 levels initial order book snapshot and updates | real-time
| top 400 levels initial order book snapshot and updates | real-time
| top 150 levels initial order book snapshot and updates | 30ms
| top 150 levels initial order book snapshot and updates | 30ms
| top 150 levels initial order book snapshot and updates | 30ms
| top 150 levels initial order book snapshot and updates | 100ms
| top 100 levels initial order book snapshot and updates | real-time
| top 100 levels initial order book snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| top 1000 levels initial order book snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| top 25 levels initial order book snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| top 15 levels snapshots | real-time
| top 30 levels initial order book snapshot and updates | 20ms
| top 100 levels initial order book snapshot and updates | real-time
| top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms
| top 20 levels order book snapshots | unknown
| top 30 levels order book snapshots | unknown
| top 400 levels initial order book snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| full order book depth snapshot and updates | real-time
| top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms
Liquidations data is sourced from exchanges' WebSocket APIs when supported, with a fallback to polling REST APIs when the WebSocket APIs do not support that data type, and can be accessed via the raw data APIs (replaying the relevant channel) or as a normalized data type via CSV downloads.
exchange | available since | notes
| 2019-03-30 |
| 2019-03-30 |
| 2019-11-17 |
| 2020-06-16 |
| 2019-08-01 |
| 2020-12-18 | collected by polling OKEx REST APIs since liquidations aren't available via WS feeds
| 2020-12-18 | collected by polling OKEx REST APIs since liquidations aren't available via WS feeds
| 2020-06-24 |
| 2020-06-24 |
| 2019-09-14 |
| 2019-03-30 |
| 2020-12-18 |
Yes, we do provide historical options data for Deribit and OKEx Options - see options chain CSV data type and Deribit and OKEx Options exchange details pages.
We cover all leading derivatives exchanges such as BitMEX, Deribit, Binance USDT Futures, Binance COIN Futures, FTX, OKEx, Huobi Futures, Huobi Swap, Bitfinex Derivatives, Bybit and many more.
A futures contract is a contract that has an expiry date (for example a quarter ahead for quarterly futures). The futures contract price converges to the spot price as the contract approaches its expiration/settlement date. After a futures contract expires, the exchange settles it and replaces it with a new contract for the next period (the next quarter in our previous example).
A perpetual swap contract, also commonly called "perp", "swap", "perpetual" or "perpetual future" in crypto exchange nomenclature, is very similar to a futures contract but does not have an expiry date (hence perpetual). In order to ensure that the perpetual swap contract price stays near the spot price, exchanges employ a mechanism called the funding rate. When the funding rate is positive, longs pay shorts. When the funding rate is negative, shorts pay longs. This mechanism can be quite nuanced and varies between exchanges, so it's best to study each contract specification to learn all the details (funding periods, mark price mechanisms etc.). A toy worked example is sketched below.
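The sketch below illustrates the payment direction only; the exact formula and funding interval vary per exchange and contract, so always check the contract specification:

```python
# Toy example: a common convention is payment = position notional * funding
# rate, exchanged every funding interval (e.g. 8 hours). Numbers made up.
position_notional_usd = 10_000   # long position value
funding_rate = 0.0001            # +0.01% for this funding interval

payment = position_notional_usd * funding_rate
# positive rate: longs pay shorts; negative rate: shorts pay longs
print(f"long pays short ${payment:.2f} this interval")  # -> $1.00
```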
We are focused on providing the best possible tick-level historical data for cryptocurrency exchanges, and as of now our APIs (both HTTP and CSV datasets) offer access to tick-level data only and do not support time based aggregated data.
If you're interested in time based aggregated data (OHLC, interval based order book snapshots), see our client libs that provide such capabilities, with the caveat that data aggregation is performed client-side from tick-level data sourced from the API, meaning it can be a relatively slow process in contrast to ready to download aggregated data.
Yes, we're always open to supporting new promising exchanges. Contact us and we'll get back to you to discuss the details.
Normalized market data (unified data format for every exchange) is available via our official libraries and downloadable CSV files. Our HTTP API provides data only in exchange-native format.
The data we provide has contract amounts exactly as provided by exchange APIs, meaning in some cases it can be tricky to compare across exchanges due to different contract multipliers (for example OKEx, where each contract has $100 value) or different contract types (linear or inverse). We'll keep it this way, but we also provide an instrument metadata API that returns contract multipliers, tick sizes and more for each instrument in a uniform way, allowing you to easily normalize contract amounts client-side without having to go through all kinds of documentation on various exchanges to find this information.
Cryptocurrency markets are very fragmented and every exchange provides data in its own bespoke data format, which we call exchange-native data format. Our HTTP API and client libs can provide market data in this format, meaning the data you receive is exactly the same as the live data you would have received from exchanges ("as-is").
For example, a BitMEX trade message looks like this:
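(An abbreviated, illustrative message of that shape; field values are made up, see BitMEX's WebSocket API docs for the full schema:)

```json
{
  "table": "trade",
  "action": "insert",
  "data": [
    {
      "timestamp": "2019-07-01T00:00:01.323Z",
      "symbol": "XBTUSD",
      "side": "Buy",
      "size": 100,
      "price": 10800.5,
      "trdMatchID": "00000000-0000-0000-0000-000000000000"
    }
  ]
}
```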
and this is a Deribit trade message:
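(Again abbreviated and illustrative; values made up, see Deribit's v2 API docs for the full schema:)

```json
{
  "jsonrpc": "2.0",
  "method": "subscription",
  "params": {
    "channel": "trades.BTC-PERPETUAL.raw",
    "data": [
      {
        "trade_id": "123456",
        "timestamp": 1561989500778,
        "price": 10800.5,
        "amount": 100.0,
        "direction": "buy"
      }
    ]
  }
}
```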
In contrast, normalized data format means the same, unified format across multiple exchanges. We provide normalized data via our client libs (data normalization is performed client-side) as well as via downloadable CSV files.
Sample normalized trade message:
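(Illustrative shape matching the normalized trade type produced by the client libs; values made up:)

```json
{
  "type": "trade",
  "symbol": "XBTUSD",
  "exchange": "bitmex",
  "id": "00000000-0000-0000-0000-000000000000",
  "price": 10800.5,
  "amount": 100,
  "side": "buy",
  "timestamp": "2019-07-01T00:00:01.323Z",
  "localTimestamp": "2019-07-01T00:00:01.343Z"
}
```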
We support the following normalized data types via our client libs:
tick-by-tick trades
order book L2 updates
order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)
quotes
derivative tick info (open interest, funding rate, mark price, index price)
liquidations
OHLCV
volume/tick based trade bars
and downloadable CSV data files:
tick level order book snapshots (top 25 and top 5 levels)
derivative tick info (open interest, funding rate, mark price, index price)
What is the channel field used in the HTTP API and client libs replay functions?
Exchanges, when publishing real-time data messages, always publish them for the subscription topics clients have subscribed to. Those subscription topics are very often called "channels" or "streams" in exchanges' documentation pages and describe the data type a given message belongs to - for example, BitMEX publishes its trades data via the trade channel and order book L2 updates via orderBookL2.
Since we collect the data for all the channels described on each exchange's details page (Captured real-time market data channels section), our HTTP API and client libs offer filtering by those channel names - for example, to get historical trades for BitMEX, the trade channel needs to be provided alongside the requested instruments' symbols (via HTTP API or client lib replay function args).
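To make that concrete, here's a hedged sketch of one raw replay request; the /v1/data-feeds/{exchange} endpoint shape, the from/offset/filters params and Bearer auth are assumptions based on the HTTP API docs, which remain the authoritative reference:

```python
# Sketch: request one minute slice of raw BitMEX data, filtered by the
# 'trade' channel and XBTUSD symbol. Endpoint shape assumed per HTTP API docs.
import json
import urllib.parse
import urllib.request

filters = json.dumps([{"channel": "trade", "symbols": ["XBTUSD"]}])
url = (
    "https://api.tardis.dev/v1/data-feeds/bitmex"
    f"?from=2019-07-01&offset=0&filters={urllib.parse.quote(filters)}"
)
req = urllib.request.Request(url, headers={"Authorization": "Bearer YOUR_API_KEY"})
with urllib.request.urlopen(req) as resp:
    # NDJSON: each line carries a local timestamp plus an exchange-native message
    print(resp.read(500))
```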
UTC, always.
We're doing our best to provide the most complete and reliable historical raw data API on the market. To do so, among many other things, we utilize highly available Kubernetes clusters on Google Cloud Platform that offer best in class availability, networking and monitoring. However, due to exchanges' API downtimes (maintenance, deployments, connection drops etc.) we can experience data gaps and cannot guarantee 100% data completeness, but 99.9% (99.99% on most days), which should be more than enough for most of the use cases that tick level data is useful for.
In rare circumstances, when an exchange's API changes without any notice or we hit new unexpected rate limits, we may also fail to record data during such a period; it happens very rarely and is very specific to each exchange. Use the /exchanges/:exchange API endpoint and check the incidentReports field in order to get the most detailed and up to date information on that subject.
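For example, a minimal sketch of querying that endpoint with plain Python (the https://api.tardis.dev/v1 base URL is assumed; the endpoint is public):

```python
# Sketch: fetch exchange metadata and print any incident reports.
import json
import urllib.request

with urllib.request.urlopen("https://api.tardis.dev/v1/exchanges/bitmex") as resp:
    exchange_info = json.load(resp)

for incident in exchange_info.get("incidentReports", []):
    print(incident)
```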
As long as an exchange's WebSocket API is not 'hidden' behind a Cloudflare proxy (causing relatively frequent "CloudFlare WebSocket proxy restarting, Connection reset by peer" errors), connections are stable for the majority of supported exchanges and there are almost no connection drops during the day. When there is more volatility in the market, some exchanges tend to drop connections more frequently or have larger latency spikes. Overall it's a nuanced matter that changes over time; if you have any questions regarding a particular exchange, please do not hesitate to contact us.
Although it should never happen in theory, in practice, due to various crypto exchange bugs and peculiarities, it can happen (very occasionally) - see some posts from users reporting those issues:
We do track sequence numbers of WebSocket L2 order book messages when collecting the data and restart the connection when a sequence gap is detected, for exchanges that provide those numbers. We observe that even when sequence numbers are in check, bid/ask overlap can occur. In such scenarios, exchanges tend to 'forget' to publish delete messages for the opposite side of the book when publishing a new level for a given side - we validated that hypothesis by taking reconstructed order book snapshots that had a crossed book (bid/ask overlap), manually removing the order book levels on the opposite side (as the exchange didn't publish that 'delete'), and checking whether the best bid/ask then matched quote/ticker feeds (for exchanges that provide those) - see sample code that implements that manual level removal logic.
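A hedged sketch of that manual level removal idea (the referenced sample code is the authoritative version; this only conveys the gist):

```python
# Sketch: treat the book side that just received an update as authoritative
# and drop the opposite-side levels it crosses (the deletes the exchange
# 'forgot' to publish). bids/asks: dicts mapping price -> amount.
def remove_crossed_levels(updated_side, bids, asks):
    if not bids or not asks:
        return
    if updated_side == "bid":
        best_bid = max(bids)
        for price in [p for p in asks if p <= best_bid]:
            del asks[price]
    else:
        best_ask = min(asks)
        for price in [p for p in bids if p >= best_ask]:
            del bids[price]
```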
That shouldn't happen in theory, but we've detected that for some exchanges, when a new connection is established, sometimes the first message for a given channel & symbol has a newer timestamp than a subsequent message, e.g., an order book snapshot has a newer timestamp than the first order book update. This is why we provide data via API and CSV downloads for given date ranges based on local timestamps (timestamp of message arrival), which are always monotonically increasing.
Some exchanges occasionally publish duplicated trades (trades with the same ids). Since we collect real-time data, we also collect and provide duplicate trades via API if those were published by exchanges' real-time WebSocket feeds. Our client libraries can deduplicate such trades when working with normalized data; similarly, for downloadable CSV files we deduplicate tick-by-tick trades data.
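If you work with the raw feed yourself, deduplication can be as simple as tracking trade ids; a sketch (the id field name varies per exchange):

```python
# Sketch: drop duplicated trades by exchange-provided trade id. For long
# replays, bound memory with a rolling window of recent ids instead of a set.
def dedupe_trades(trades, id_key="id"):
    seen = set()
    for trade in trades:
        if trade[id_key] in seen:
            continue  # duplicate published by the exchange feed
        seen.add(trade[id_key])
        yield trade
```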
Historical market data available via the HTTP API provides order book snapshots at the beginning of each day (00:00 UTC) - see details.
We also provide custom order book snapshots with customizable time intervals (from tick-by-tick, through milliseconds, to minutes or hours) via client libs, in which case custom snapshots are computed client-side from raw data provided via the HTTP API, as well as via downloadable CSV files (book_snapshot_25 and book_snapshot_5).
Order books are collected in streaming mode - snapshot at the beginning of each day and then incremental updates. See details.
How is the incremental_book_L2 CSV dataset built from real-time data?
Cryptocurrency exchanges' real-time APIs vary a lot, but for L2 order book data they all tend to follow a similar flow: first, when the WS connection is established and the subscription is confirmed, the exchange sends an initial order book snapshot (all existing price levels or the top 'x' levels, depending on the exchange) and then starts streaming 'book update' messages (frequently also called deltas). Those updates, when applied to the initial snapshot, result in an up-to-date order book state at any given time.
Let's take FTX as an example and start with its snapshot orderbook message (frequently called 'partial' in exchange API docs as well). Remaining bid and ask levels were removed from this sample message for the sake of clarity.
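(An illustrative message of that shape, abbreviated and kept consistent with the CSV rows below:)

```json
{
  "channel": "orderbook",
  "market": "ETH/USD",
  "type": "partial",
  "data": {
    "action": "partial",
    "time": 1601510401.216632,
    "bids": [[359.72, 121.259]],
    "asks": [[359.8, 8.101]]
  }
}
```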
Such snapshot message maps to the following rows in CSV file:
exchange | symbol | timestamp | local_timestamp | is_snapshot | side | price | amount
ftx | ETH/USD | 1601510401216632 | 1601510401316432 | true | ask | 359.8 | 8.101
ftx | ETH/USD | 1601510401216632 | 1601510401316432 | true | bid | 359.72 | 121.259
... and here's a sample FTX orderbook update message.
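(Again illustrative and abbreviated, consistent with the CSV rows below:)

```json
{
  "channel": "orderbook",
  "market": "ETH/USD",
  "type": "update",
  "data": {
    "action": "update",
    "time": 1601510427.184054,
    "bids": [],
    "asks": [[360.24, 4.962], [361.02, 0.0]]
  }
}
```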
Let's see how it maps to CSV format.
exchange | symbol | timestamp | local_timestamp | is_snapshot | side | price | amount
ftx | ETH/USD | 1601510427184054 | 1601510427204046 | false | ask | 360.24 | 4.962
ftx | ETH/USD | 1601510427184054 | 1601510427204036 | false | ask | 361.02 | 0
See this answer if you have doubts about how to reconstruct order book state based on data provided in the incremental_book_L2 dataset.
How to reconstruct full order book state from the incremental_book_L2 CSV dataset?
In order to reconstruct full order book state correctly from incremental_book_L2 data:
For each row in the CSV file (iterate in the same order as provided in the file):
only if the local timestamp of the current row is larger than the previous row's local timestamp (local_timestamp column value) can you read your local order book state as consistent. Why? The CSV format is flat, where each row represents a single price level update, but most exchanges' real-time feeds publish multiple order book level updates via a single WebSocket message, and those need to be processed together before reading the locally maintained order book state. We use the local timestamp value here to detect all price level updates belonging to a single 'update' message.
if the current row is part of a snapshot (is_snapshot column value set to true) and the previous one was not, reset your local order book state object that tracks price levels for each order book side - it means there was a connection restart and the exchange provided a full order book snapshot, or it was the start of a new day (each incremental_book_L2 file starts with a snapshot)
if the current row's amount is set to zero (amount column value set to 0), remove that price level (row's price column) from your local order book state, as such a price level does not exist anymore
if the current row's amount is not set to zero, update your local order book state price level with the new value, or add a new price level if it does not exist yet in your local order book state - maintain bids and asks order book sides separately (side column value); a minimal sketch implementing these steps is shown just below this list
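A minimal Python sketch implementing the steps above, assuming a locally downloaded, gzipped incremental_book_L2 file with the columns shown earlier:

```python
import csv
import gzip

def reconstruct_book(path):
    """Replay an incremental_book_L2 CSV file, yielding a consistent book
    state whenever all rows sharing a local_timestamp have been applied."""
    bids, asks = {}, {}  # price -> amount, maintained per side
    prev_local_ts = None
    prev_is_snapshot = False

    with gzip.open(path, "rt", newline="") as f:
        for row in csv.DictReader(f):
            local_ts = int(row["local_timestamp"])
            is_snapshot = row["is_snapshot"] == "true"

            # book is consistent only once local_timestamp increases
            if prev_local_ts is not None and local_ts > prev_local_ts:
                yield prev_local_ts, dict(bids), dict(asks)

            # snapshot row after non-snapshot rows -> reset local state
            if is_snapshot and not prev_is_snapshot:
                bids.clear()
                asks.clear()

            book_side = bids if row["side"] == "bid" else asks
            price, amount = float(row["price"]), float(row["amount"])
            if amount == 0:
                book_side.pop(price, None)  # level no longer exists
            else:
                book_side[price] = amount   # upsert price level

            prev_local_ts, prev_is_snapshot = local_ts, is_snapshot

    if prev_local_ts is not None:
        yield prev_local_ts, dict(bids), dict(asks)

# usage: best bid is max(bids), best ask is min(asks)
# for ts, bids, asks in reconstruct_book("ftx_incremental_book_L2.csv.gz"):
#     print(ts, max(bids), min(asks))
```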
Alternatively, we also provide top 25 and top 5 levels order book snapshot CSV datasets ready to download.
CSV datasets are available in daily intervals, split by exchange, data type and symbol. In addition to standard currency pair/instrument symbols, each exchange also has special 'grouped' symbols available, depending on whether it supports a given market type: SPOT, FUTURES, OPTIONS and PERPETUALS. That feature is useful if someone is interested in, for example, all Deribit options instruments' trades or quotes data without the need to request data for each symbol separately, one by one.
Each message received via WebSocket connection is timestamped with 100ns precision using a synchronized clock at arrival time (before any message processing) and stored in ISO 8601 format.
For API access it's 15 minutes (T - 15min); downloadable CSV files for a given day are available on the next day around 06:00 UTC.
Select the data plan and access type that you're interested in via the order form on the Tardis.dev website.
available data plans
available access types
Proceed to checkout, where you provide your email address and payment details.
accepted payment methods
Credit Cards (Mastercard, Visa, Maestro, American Express, Discover, Diners Club, JCB, UnionPay)
PayPal
Apple Pay (one-off purchases only)
Wire Transfers (one-off purchases only)
Successfully complete your payment and receive via email the API key that allows you to download CSV datasets and access historical data via API. The API key is valid as long as the subscription is active, or 6 months for one-off purchases.
We do provide discounts in a transparent form via different subscription types.
A one-off purchase provides access to specific time periods of historical market data. The API key is valid for a year since purchase and allows access to all available data types (trades, order books etc.) for the ordered date ranges, both via API and downloadable CSV files.
Subscription based access relies on recurring payments at regular intervals (monthly, quarterly, yearly) and offers access to newly collected market data as it becomes available, as well as existing historical market data whose range depends on the chosen billing period.
There are three 'dimensions' you can customize your subscription by:
Subscription type - Academic, Solo, Pro or Business
Data plan (which exchanges data you get access to) - Individual, Perpetuals, Derivatives or All Exchanges
Billing interval (how much of historical data you get access to) - monthly, quarterly or yearly
For example "All Exchanges" Business Subscription with yearly billing period allows accessing all available existing historical data via API and CSV files and one year of new data as it becomes available (for initial payment). API key is valid as long as subscription is active and allows access to all available data types (trades, orders book data, quotes, funding, liquidations etc.) via downloadable CSV files and via raw market data API (for Pro and Business subscriptions types only).
| Academic | Solo | Professional | Business
| ✓ | ✓ | ✓ | ✓
| — | — | ✓ | ✓
| — | — | ✓ | ✓
| ✓ | ✓ | ✓ | ✓
| — | — | ✓ | ✓
Additional API keys | — | — | — | ✓
Support | none | none | email (priority) | dedicated
Integration assistance | — | — | — | ✓
Vendor onboarding | — | — | — | ✓
API keys count | 1 | 1 | 1 | 10
| — | — | ✓
Yes, depending on the chosen billing period, subscriptions include access to existing historical market data as well:
all available historical data if the subscription is billed yearly - historical market data is available since 2019-03-30 for the majority of the supported exchanges (see the exchange details page for the exact date for a particular exchange)
12 months of historical data if the subscription is billed quarterly, e.g., a subscription started on 2020-04-01 includes access to historical data since 2019-04-01 - it's not a rolling time window, but a fixed starting date since when historical data is available for your subscription
4 months of historical data if the subscription is billed monthly, e.g., a subscription started on 2020-04-01 includes access to historical data since 2019-12-01 - it's not a rolling time window, but a fixed starting date since when historical data is available for your subscription
All subscriptions provide access to all available data types (trades, order book data, quotes, funding etc.) via downloadable CSV files and the raw data replay API (for pro and business subscription types).
"Individual" data plan provides per-exchange access to market data that includes full feed (all instruments) and data types of selected exchange(s), for example full Coinbase exchange data feed.
"Individual" data plan allows access to all available data types (trades, orders book data, quotes, funding etc.) via downloadable CSV files and raw data replay API (for pro and business subscriptions types). Range of historical data access for "Individual" data plan depends on chosen billing period (for example: access to all existing historical data we collected if subscription is billed yearly).
"Perpetuals" data plan provides access to the following perpetual swaps instruments' market data (over 500 perpetual swaps instruments across 13 exchanges):
BitMEX: all perpetual swaps instruments
Deribit: all perpetual swaps instruments
Binance USDT Futures: all perpetual swaps instruments
Binance COIN Futures: all perpetual swaps instruments
FTX: all perpetual swaps instruments
OKX Swap: all perpetual swaps instruments
Huobi COIN Swaps: all perpetual swaps instruments
Huobi USDT Swaps: all perpetual swaps instruments
bitFlyer: FX_BTC_JPY
Bitfinex Derivatives: all perpetual swaps instruments
Bybit: all perpetual swaps instruments
dYdX: all perpetual swaps instruments
Phemex: all perpetual swaps instruments
Delta: all perpetual swaps instruments
Gate.io Futures: all perpetual swaps instruments
CoinFLEX: all perpetual swaps instruments
WOO X: all perpetual swaps instruments
Ascendex: all perpetual swaps instruments
Crypto.com: all perpetual swaps instruments
"Perpetuals" data plan allows access to all available data types (trades, orders book data, funding etc.) via downloadable CSV files and raw data replay API (for pro and business subscriptions types). Range of historical data access for "Perpetuals" data plan depends on chosen billing period (for example: access to all existing historical data we collected if subscription is billed yearly).
"Derivatives" data plan provides access to the following derivatives exchanges' market data:
BitMEX: all exchange's instruments
Deribit: all exchange's instruments
Binance USDT Futures: all exchange's instruments
Binance COIN Futures: all exchange's instruments
FTX: all exchange's instruments
OKX Futures: all exchange's instruments
OKX Swap: all exchange's instruments
OKX Options: all exchange's instruments
Huobi Futures: all exchange's instruments
Huobi COIN Swap: all exchange's instruments
Huobi USDT Swaps: all exchange's instruments
Bitfinex Derivatives: all exchange's instruments
Bybit: all exchange's instruments
dYdX: all exchange's instruments
Phemex: all exchange's instruments
CoinFLEX: all exchange's instruments
Delta: all exchange's instruments
bitFlyer: all exchange's instruments
Gate.io Futures: all exchange's instruments
WOO X: all exchange's instruments
Crypto.com: all exchange's instruments
Ascendex: all exchange's instruments
"Derivatives" data plan allows access to all available data types (trades, orders book data, quotes, funding etc.) via downloadable CSV files and raw data replay API (for pro and business subscriptions types).
Range of historical data access for "Derivatives" data plan depends on chosen billing period (for example: access to all existing historical data we collected if subscription is billed yearly).
"All Exchanges" data plan provides access to market data of all supported exchanges (30+ leading spot and derivatives exchanges, see full list).
"All Exchanges" data plan allows access to all available data types (trades, orders book data, quotes, funding, liquidations etc.) for all supported exchanges and theirs instruments/currency pairs via downloadable CSV files and raw data replay API (for pro and business subscriptions types).
Range of historical data access for "All Exchanges" data plan depends on chosen billing period (for example: access to all existing historical data we collected if subscription is billed yearly).
Contact us describing which plan you'd like to change to and we'll handle the rest.
We offer invoicing for customers paying over $6000 for data access. Simply use our order form and the "PAY THROUGH INVOICING" button.
Alternatively, contact us with the order details you're interested in (data plan, billing period) and we'll send you back an invoice that, once paid, will give you access to the data.
Yes, please use our order form and the "REQUEST QUOTATION" button. Alternatively, contact us with the order details you're interested in (data plan, billing period) and we'll send you back a quotation document in no time.
After a successful order you'll receive a receipt email from Paddle, which is our online reseller & payment processor. Click on the button titled "View Receipt" there.
You will be redirected to the receipt page, where you will be able to enter your address details by clicking on the "Add address & VAT Number" link.
If you would like to enter a VAT number, select the "This is a business purchase" checkbox to enter the VAT ID if you forgot to enter it during checkout. The tax amount will be refunded within a maximum of 12 hours after it is confirmed by Paddle.
Right click on the screen and click 'Print...' in context menu
Change destination to 'Save as PDF'
Click 'Save' button to save invoice as PDF file
Click on the link titled "Click here to get a full invoice with address & custom information" provided with the order confirmation email sent by Paddle to get the address and VAT ID of Paddle, who processes our payments. Paddle acts as a reseller and Merchant of Record, so they handle VAT on our behalf.
You need to contact us or help@paddle.com to request a new invoice. Please provide the email address you bought the subscription with and any extra details that might help.
We do not offer refunds for initial subscription payments and one-off purchases.
If you are on yearly billing and forget to cancel your subscription before the renewal date, reach out to us within seven days after the renewal date to discuss a refund.
If you're on monthly or quarterly billing, please be sure to cancel your subscription before the end date of your current plan, as there are no refunds for recurring payments on monthly and quarterly billing plans.
If you'd like to test the service, we offer generous free trials. Simply reach out to us and we'll set up a test account for you in no time.
In order to cancel your active subscription, use the 'Cancel subscription' link we've sent you in the email together with your API key, or contact us and we'll provide the cancellation link for you. Alternatively, you can email Paddle (help@paddle.com), which acts as our reseller and Merchant of Record, including a note of the email address you used to purchase your subscription and your order number.
We accept BTC, ETH and USDT for one-off purchases. Contact us and we'll get back to you with details.
In order to update your credit card information use the 'Update payment method' link we've sent you in email together with your API key or contact us and we'll provide that link for you.
FTX historical market data details - available instruments, data coverage and data collection specifics
Huobi COIN Swaps historical market data details - instruments, data coverage and data collection specifics
Huobi USDT Swaps historical market data details - instruments, data coverage and data collection specifics
Huobi Futures historical market data details - instruments, data coverage and data collection specifics
Huobi Futures historical data for all its instruments is available since 2019-11-19.
Up until 2020-01-31 the depth channel was collected with step0 aggregation level (no aggregation), which produces full order book snapshots for each book change and is very inefficient to store. To circumvent this issue we stored only initial book snapshots and then incremental updates instead - incremental updates were calculated by diffing two subsequent book snapshots and were provided in the same format as other depth messages, except having an additional update: true flag set. An update with amount (the second value in the array) set to 0 means such a level should be deleted; otherwise the price level should be updated with the new amount value.
On 2020-01-31 we switched to collecting the depth.size_150.high_freq channel instead, which natively provides incremental order book updates without the workaround described above.
Unfortunately this means that requesting data for the depth channel may return a slightly different format depending on which time period the request was made for. The difference is small and boils down to the way order book update messages are marked vs order book snapshots. In depth.size_150.high_freq, an order book message always has the event field present with value update or snapshot.
For messages before 2020-01-31 we used the depth.step0 channel for collecting order book data, which means an order book update message has the flag update set to true; if it's a snapshot it doesn't have that flag at all.
All other fields are the same (tick.bids and tick.asks etc.).
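To make the two formats concrete, here is a minimal sketch (not the official client code) of applying both variants of depth messages to an in-memory order book; it assumes the event/update flags live on the tick payload and that tick.bids/tick.asks are [price, amount] arrays, as described above.

```python
# Minimal sketch of handling both historical Huobi depth message formats.
# Assumptions: flags sit on the "tick" payload and book levels are
# [price, amount] arrays, per the format description above.
def apply_depth_message(book, message):
    tick = message["tick"]
    if "event" in tick:
        # depth.size_150.high_freq format (data since 2020-01-31)
        is_snapshot = tick["event"] == "snapshot"
    else:
        # depth.step0 workaround format (data before 2020-01-31):
        # "update": true marks an increment, no flag means a snapshot
        is_snapshot = not tick.get("update", False)
    if is_snapshot:
        book["bids"].clear()
        book["asks"].clear()
    for side in ("bids", "asks"):
        for price, amount in tick.get(side, []):
            if amount == 0:
                book[side].pop(price, None)  # amount 0 deletes the level
            else:
                book[side][price] = amount   # otherwise upsert the level


# usage: book = {"bids": {}, "asks": {}}; apply_depth_message(book, msg)
```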
OKX Swap historical market data details - instruments, data coverage and data collection specifics
OKX Options historical market data details - instruments, data coverage and data collection specifics
OKX Spot historical market data details - currency pairs, data coverage and data collection specifics
OKX historical data for spot high-cap currency pairs is available since 2019-03-30; data for all spot currency pairs is available since 2021-03-11.
Binance USDT Margined Futures historical market data details - instruments, data coverage and data collection specifics
Binance USDT Futures historical data for all its instruments is available since 2019-11-17.
derivative_ticker open interest data is available since 2020-05-13 - the date since we've started collecting that info via Binance USDT Futures REST API (open interest channel).
The Binance USDT Futures depth channel has been recorded with the fastest update speed the API allowed at the time. It means that until 2020-01-07 it was depth@100ms - book updates pushed every 100ms - and after that date it was depth@0ms - book updates pushed in real-time (new API feature).
Bitfinex Derivatives historical market data details - available instruments, data coverage and data collection specifics
collected from channel
collected from channel (trades with liquidation flag)
collected from stream, since 2021-04-27 liquidation orders streams do not push real-time order data anymore; instead, they push snapshot order data at a maximum frequency of 1 order push per second
collected from stream, since 2021-04-27 liquidation orders streams do not push real-time order data anymore; instead, they push snapshot order data at a maximum frequency of 1 order push per second
collected from channel (trades with liquidation flag)
collected from channel
collected from channel
collected from channel
collected from channel (trades with liquidation type)
up until 2021-09-20 collected by polling Bybit REST APIs since liquidations weren't available via WS feeds; starting from 2021-09-20 collected from channel (HTTP API /data-feeds)
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Derivative ticker datasets are available since 2020-05-13 - the date since we've started collecting that data via FTX REST API.
See full downloadable CSV files documentation with datasets format spec, data samples and more.
See full example that shows all available download options (download path customization, filenames conventions and more).
See datasets API reference which allows downloading a single file at once.
Historical data format is the same as provided by real-time FTX WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available since 2020-05-22
- available since 2020-07-21
As the orderbook channel provides data only for the order book's best 100 orders on either side, the grouped orderbooks channel supplies order book data with grouped (collapsed) prices, allowing retrieving lower-granularity, higher-depth information about the order book. We set the grouping param to the instrument's priceIncrement value multiplied by 10.
- generated channel, available since 2020-05-13
Since FTX does not currently offer a real-time WebSocket instrument info channel with next funding rate, open interest or mark price data, we simulate it by fetching that info from FTX REST API every 3-5 seconds for each derivative instrument. Such messages are marked with "channel":"instrument" and "generated":true fields, and the data field has the same format as REST API responses.
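When processing recorded FTX data you may want to treat those simulated messages differently from exchange-native ones. A minimal sketch of such a filter, assuming each recorded message has been parsed into a dict:

```python
# Sketch: recognizing the simulated instrument messages described above
# by their "channel":"instrument" and "generated":true markers.
def is_generated_instrument_message(message: dict) -> bool:
    return message.get("channel") == "instrument" and message.get("generated") is True
```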
Market data collection infrastructure for FTX since 2020-05-14 is located in GCP asia-northeast1 (Tokyo, Japan); before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Huobi COIN Swaps WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
(depth.size_150.high_freq)
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the Huobi Swap real-time feed (version field) - in case of a detected missed message, the WebSocket connection is restarted.
- available since 2020-09-18
- generated channel, available since 2020-06-24
Since Huobi does not currently offer a real-time WebSocket open interest channel, we simulate it by fetching that info from REST API every 4-6 seconds for each instrument. Such messages are marked with "ch":"market.<symbol>.open_interest" and "generated":true fields, and the data field has the same format as REST API response data.
(index price updates) - available since 2020-06-24
- available since 2020-06-24
- available since 2020-06-24
- available since 2020-06-24
Market data collection infrastructure for Huobi COIN Swaps since 2020-06-19 is located in GCP asia-northeast1 (Tokyo, Japan); before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Huobi USDT Swaps WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
(depth.size_150.high_freq)
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the Huobi Swap real-time feed (version field) - in case of a detected missed message, the WebSocket connection is restarted.
- generated channel
Since Huobi does not currently offer a real-time WebSocket open interest channel, we simulate it by fetching that info from REST API every 4-6 seconds for each instrument. Such messages are marked with "ch":"market.<symbol>.open_interest" and "generated":true fields, and the data field has the same format as REST API response data.
Market data collection infrastructure for Huobi USDT Swaps is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Huobi Futures WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the Huobi Futures real-time feed (version field) - in case of a detected missed message, the WebSocket connection is restarted.
See also the details below regarding depth channel data collection.
- available since 2020-06-24
- generated channel, available since 2020-06-24
Since Huobi Futures does not currently offer a real-time WebSocket open interest channel, we simulate it by fetching that info from REST API every 4-6 seconds for each instrument. Such messages are marked with "ch":"market.<symbol>.open_interest" and "generated":true fields, and the data field has the same format as REST API response data.
(index price updates) - available since 2020-06-24
- available since 2020-06-24
- available since 2020-06-24
Please feel free to contact us if it's confusing in any way.
We also provide a normalization layer that handles those differences transparently via our official client libs.
Market data collection infrastructure for Huobi Futures since 2020-06-19 is located in GCP asia-northeast1 (Tokyo, Japan); before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time OKX WebSocket v3 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available until 2020-02-12
- available since 2020-02-08
- available since 2019-09-2
- available since 2020-07-05
Market data collection infrastructure for OKX Swap since 2022-05-04T16:45 is located in AWS HK region (Hong Kong, China, VPC colo setup); before that, starting from 2020-05-15, it was located in GCP asia-northeast1 (Tokyo, Japan), and initially it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time OKX WebSocket v3 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available until 2020-02-12
- available since 2020-02-08
- available since 2020-06-09
- available since 2020-07-05
- available since 2020-07-31
Market data collection infrastructure for OKX Options since 2022-05-04T16:45 is located in AWS HK region (Hong Kong, China, VPC colo setup); before that, starting from 2020-05-15, it was located in GCP asia-northeast1 (Tokyo, Japan), and initially it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
See full downloadable CSV files documentation with datasets format spec, data samples and more.
See full example that shows all available download options (download path customization, filenames conventions and more).
See datasets API reference which allows downloading a single file at once.
Historical data format is the same as provided by real-time OKX WebSocket v3 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available until 2020-04-10
- available since 2020-04-10
- available since 2020-07-05
Market data collection infrastructure for OKX Spot since 2022-05-04T16:45 is located in AWS HK region (Hong Kong, China, VPC colo setup); before that, starting from 2020-05-15, it was located in GCP asia-northeast1 (Tokyo, Japan), and initially it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Data collection before 2020-05-14 suffered some issues (missing data, latency spikes) during periods of market volatility. This was circumvented by switching the data collection setup and using multiple WS connections for real-time market data collection.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Binance USDT Futures WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available since 2019-12-05
Recorded with @1s speed since 2020-02-13 (new API feature).
- generated channel with full order book snapshots
Binance USDT Futures real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots from REST API and store them together with the rest of the WebSocket messages - top 1000 levels. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the real-time feed (pu and u fields) - in case of a detected missed message, the WebSocket connection is restarted. We also validate that the initial book snapshot fetched from the REST API overlaps with received depth messages.
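For reference, a minimal sketch of that continuity check, assuming depth events expose the u (last update id of the event) and pu (final update id of the previous event) fields mentioned above:

```python
# Sketch: order book update continuity check for Binance USDT Futures
# style depth events as described above. A gap means the local book can
# no longer be trusted, so the connection (and book state) is reset.
def is_contiguous(previous_event: dict, next_event: dict) -> bool:
    return next_event["pu"] == previous_event["u"]
```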
- generated channel, available since 2020-05-14
Since Binance USDT Futures does not currently offer a real-time WebSocket open interest channel, we simulate it by fetching that info from REST API every 30 seconds for each instrument. Such messages are marked with "stream":"<symbol>@openInterest" and "generated":true fields, and the data field has the same format as the REST API response.
- available since 2020-10-13
- generated channel, available since 2020-12-18
Top trader long/short ratio (accounts), sourced by querying the REST API every minute
- generated channel, available since 2020-12-18
Top trader long/short ratio (positions), sourced by querying the REST API every minute
- generated channel, available since 2020-12-18
Global long/short ratio, sourced by querying the REST API every minute
- generated channel, available since 2021-12-01
Taker buy/sell volume and ratio, sourced by querying the REST API every minute
Market data collection infrastructure for Binance USDT Futures since 2020-05-14 is located in GCP asia-northeast1 (Tokyo, Japan); before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Bitfinex WebSocket v2 API with addition of local timestamps, and since 2020-05-27 also with addition of channel and symbol at the end of each message, which allows us to provide server-side filtering of the data. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
book (prec=P0, freq=F0, len=100) - during data collection, the integrity of order book incremental updates is validated using sequence numbers (SEQ_ALL option) provided by the real-time feed; in case of a detected missed message, the WebSocket connection is restarted.
raw_book (prec=R0, freq=F0, len=100)
Market data collection infrastructure for Bitfinex is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-PERP | 2020-01-01 |
| incremental_book_L2 | BTC-PERP | 2020-01-01 |
| book_snapshot_25 | BTC-PERP | 2020-01-01 |
| quotes | BTC-PERP | 2020-01-01 |
| derivative_ticker | BTC-PERP | 2020-06-01 |
| trades | FUTURES | 2020-03-01 |
| incremental_book_L2 | FUTURES | 2020-03-01 |
| liquidations | PERPETUALS | 2021-09-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-USD | 2020-04-01 |
| incremental_book_L2 | BTC-USD | 2020-04-01 |
| book_snapshot_25 | BTC-USD | 2020-04-01 |
| quotes | BTC-USD | 2020-04-01 |
| derivative_ticker | BTC-USD | 2020-07-01 |
| trades | PERPETUALS | 2020-05-01 |
| liquidations | PERPETUALS | 2021-09-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-USDT | 2023-03-01 |
| incremental_book_L2 | BTC-USDT | 2023-03-01 |
| book_snapshot_25 | BTC-USDT | 2023-03-01 |
| quotes | BTC-USDT | 2023-03-01 |
| book_ticker | BTC-USDT | 2023-03-01 |
| derivative_ticker | BTC-USDT | 2023-03-01 |
| trades | FUTURES | 2023-03-01 |
| liquidations | PERPETUALS | 2023-03-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC_CQ | 2020-02-01 |
| incremental_book_L2 | BTC_CQ | 2020-02-01 |
| book_snapshot_25 | BTC_CQ | 2020-02-01 |
| quotes | BTC_CQ | 2020-02-01 |
| derivative_ticker | BTC_CQ | 2020-07-01 |
| trades | FUTURES | 2020-03-01 |
| liquidations | FUTURES | 2021-09-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-USD-SWAP | 2020-01-01 |
| incremental_book_L2 | BTC-USD-SWAP | 2020-01-01 |
| book_snapshot_25 | BTC-USD-SWAP | 2020-01-01 |
| quotes | BTC-USD-SWAP | 2020-01-01 |
| derivative_ticker | BTC-USD-SWAP | 2020-01-01 |
| trades | PERPETUALS | 2020-03-01 |
| liquidations | PERPETUALS | 2021-09-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-USD-200327-8500-P | 2020-03-01 |
| incremental_book_L2 | BTC-USD-200327-8500-P | 2020-03-01 |
| quotes | BTC-USD-200327-8500-P | 2020-03-01 |
| trades | OPTIONS | 2020-03-01 |
| options_chain | OPTIONS | 2020-03-01 |
| quotes | OPTIONS | 2020-03-01 |
| book_snapshot_25 | OPTIONS | 2020-03-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTC-USDT | 2020-01-01 |
| incremental_book_L2 | BTC-USDT | 2020-01-01 |
| book_snapshot_25 | BTC-USDT | 2020-01-01 |
| quotes | BTC-USDT | 2020-01-01 |
| trades | ETH-USDT | 2020-03-01 |
| incremental_book_L2 | ETH-USDT | 2020-03-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTCUSDT | 2020-02-01 |
| incremental_book_L2 | BTCUSDT | 2020-02-01 |
| quotes | BTCUSDT | 2020-02-01 |
| book_snapshot_25 | BTCUSDT | 2020-09-01 |
| derivative_ticker | BTCUSDT | 2020-02-01 |
| trades | PERPETUALS | 2020-03-01 |
| liquidations | PERPETUALS | 2021-09-01 |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTCF0-USTF0 | 2019-12-01 |
| incremental_book_L2 | BTCF0-USTF0 | 2019-12-01 |
| quotes | BTCF0-USTF0 | 2019-12-01 |
| book_snapshot_25 | BTCF0-USTF0 | 2019-12-01 |
| derivative_ticker | BTCF0-USTF0 | 2019-12-01 |
| trades | ETHF0-USTF0 | 2020-03-01 |
| incremental_book_L2 | ETHF0-USTF0 | 2020-03-01 |
| liquidations | PERPETUALS | 2021-09-01 |
CSV datasets are available via dedicated datasets API that allows downloading tick level incremental order book L2 updates, order book snapshots, trades, options chains, quotes, derivative tickers and liquidations data. For ongoing data, CSV datasets for a given day are available on the next day around 06:00 UTC.
Historical datasets for the first day of each month are available to download without API key. Our Node.js and Python clients have built-in functions to efficiently download whole date range of data.
See full example that shows all available download options (download path customization, filenames conventions and more).
columns delimiter: , (comma)
new line marker: \n (LF)
decimal mark: . (dot)
date time format: microseconds since epoch (https://www.epochconverter.com/)
date time timezone: UTC
Incremental order book L2 updates collected from exchanges' real-time WebSocket order book L2 data feeds - data as deep and granular as the underlying real-time data source; please see FAQ: What is the maximum order book depth available for each supported exchange? for more details.
As exchanges' real-time feeds usually publish multiple order book level updates via a single message, you can recognize that by grouping rows by the local_timestamp field if needed, as in the sketch below.
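A minimal sketch of such grouping, assuming rows have already been parsed into dicts keyed by the CSV column names:

```python
from itertools import groupby


# Sketch: rows sharing the same local_timestamp arrived in one WebSocket
# message, so they can be processed as one atomic book update.
def iter_atomic_updates(rows):
    for _, message_rows in groupby(rows, key=lambda r: r["local_timestamp"]):
        yield list(message_rows)
```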
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| is_snapshot | possible values: true - update was part of the initial order book snapshot; false - update was not part of the initial order book snapshot. If the last update was not a snapshot and the current one is, the existing order book state must be discarded (all existing levels removed) |
| side | side of the order book the update belongs to: bid - bid side of the book, buy orders; ask - ask side of the book, sell orders |
| price | price identifying the book level being updated |
| amount | updated price level amount as provided by exchange, not a delta - an amount of 0 indicates that the price level can be removed |
```csv
exchange,symbol,timestamp,local_timestamp,is_snapshot,side,price,amount
deribit,BTC-PERPETUAL,1585699209920000,1585699209934201,false,ask,6443.5,38640
deribit,BTC-PERPETUAL,1585699209947000,1585699209957629,false,bid,6311.5,0
deribit,BTC-PERPETUAL,1585699209950000,1585699209963464,false,ask,6428,13210
deribit,BTC-PERPETUAL,1585699209967000,1585699209979152,false,bid,6311.5,750
deribit,BTC-PERPETUAL,1585699209970000,1585699209983585,false,bid,6327,16010
deribit,BTC-PERPETUAL,1585699209970000,1585699209983585,false,bid,6325,210530
deribit,BTC-PERPETUAL,1585699209972000,1585699209983691,false,bid,6351,810
deribit,BTC-PERPETUAL,1585699209972000,1585699209983691,false,bid,6352.5,18830
deribit,BTC-PERPETUAL,1585699209974000,1585699209983703,false,ask,6492,100
```
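As a worked example, here is a minimal sketch (following the field semantics above, not official tooling) that replays such a file into an in-memory order book:

```python
import csv
import gzip


# Sketch: replaying an incremental_book_L2 CSV file into an in-memory
# order book, following the is_snapshot / amount semantics described above.
def replay_book(path):
    book = {"bid": {}, "ask": {}}
    previous_was_snapshot = False
    with gzip.open(path, "rt", newline="") as f:
        for row in csv.DictReader(f):
            is_snapshot = row["is_snapshot"] == "true"
            if is_snapshot and not previous_was_snapshot:
                # a new snapshot starts: discard existing order book state
                book = {"bid": {}, "ask": {}}
            previous_was_snapshot = is_snapshot
            price, amount = float(row["price"]), float(row["amount"])
            if amount == 0:
                book[row["side"]].pop(price, None)  # amount 0 removes level
            else:
                book[row["side"]][price] = amount
            yield int(row["local_timestamp"]), book
```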
Tick-level order book snapshots reconstructed from exchanges' real-time WebSocket order book L2 data feeds. Each row represents the top 25 levels from each side of the limit order book and was recorded every time any of the tracked top 25 bid/ask levels changed.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| asks[0..24].price | top 25 asks prices in ascending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| asks[0..24].amount | top 25 asks amounts in ascending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| bids[0..24].price | top 25 bids prices in descending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| bids[0..24].amount | top 25 bids amounts in descending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
```csv
exchange,symbol,timestamp,local_timestamp,asks[0].price,asks[0].amount,bids[0].price,bids[0].amount,asks[1].price,asks[1].amount,bids[1].price,bids[1].amount,asks[2].price,asks[2].amount,bids[2].price,bids[2].amount,asks[3].price,asks[3].amount,bids[3].price,bids[3].amount,asks[4].price,asks[4].amount,bids[4].price,bids[4].amount,asks[5].price,asks[5].amount,bids[5].price,bids[5].amount,asks[6].price,asks[6].amount,bids[6].price,bids[6].amount,asks[7].price,asks[7].amount,bids[7].price,bids[7].amount,asks[8].price,asks[8].amount,bids[8].price,bids[8].amount,asks[9].price,asks[9].amount,bids[9].price,bids[9].amount,asks[10].price,asks[10].amount,bids[10].price,bids[10].amount,asks[11].price,asks[11].amount,bids[11].price,bids[11].amount,asks[12].price,asks[12].amount,bids[12].price,bids[12].amount,asks[13].price,asks[13].amount,bids[13].price,bids[13].amount,asks[14].price,asks[14].amount,bids[14].price,bids[14].amount,asks[15].price,asks[15].amount,bids[15].price,bids[15].amount,asks[16].price,asks[16].amount,bids[16].price,bids[16].amount,asks[17].price,asks[17].amount,bids[17].price,bids[17].amount,asks[18].price,asks[18].amount,bids[18].price,bids[18].amount,asks[19].price,asks[19].amount,bids[19].price,bids[19].amount,asks[20].price,asks[20].amount,bids[20].price,bids[20].amount,asks[21].price,asks[21].amount,bids[21].price,bids[21].amount,asks[22].price,asks[22].amount,bids[22].price,bids[22].amount,asks[23].price,asks[23].amount,bids[23].price,bids[23].amount,asks[24].price,asks[24].amount,bids[24].price,bids[24].amount
deribit,BTC-PERPETUAL,1599868800206000,1599868800253274,10396,48050,10395.5,18220,10396.5,22220,10395,16570,10397,100,10394.5,22630,10397.5,8360,10394,16670,10398,1500,10393.5,16570,10398.5,13210,10393,5600,10399.5,60070,10392.5,20500,10400,5100,10392,30,10400.5,5140,10391.5,75780,10401,13040,10391,12110,10401.5,2250,10390.5,280,10402,9150,10390,52680,10402.5,119390,10389.5,18240,10403,23070,10389,73010,10403.5,53930,10388.5,67500,10404,43590,10388,313140,10404.5,271050,10387.5,280,10405,73710,10387,9840,10405.5,32480,10386.5,104570,10406,41220,10386,269050,10406.5,20400,10385.5,21840,10407,45460,10385,79000,10407.5,69630,10384.5,220,10408,22230,10384,71440,10408.5,30840,10383.5,44740
deribit,BTC-PERPETUAL,1599868800280000,1599868800310441,10396,48050,10395.5,18220,10396.5,22220,10395,16570,10397,100,10394.5,22630,10397.5,8360,10394,16670,10398,1500,10393.5,16570,10398.5,13210,10393,5600,10399.5,60070,10392.5,20500,10400,5100,10392,30,10400.5,5140,10391.5,75780,10401,13040,10391,12110,10401.5,2250,10390.5,280,10402,9150,10390,52680,10402.5,119390,10389.5,18240,10403,23070,10389,73010,10403.5,53930,10388.5,67500,10404,43590,10388,313140,10404.5,271050,10387.5,280,10405,73710,10387,9850,10405.5,32480,10386.5,104570,10406,41220,10386,269050,10406.5,20400,10385.5,21840,10407,45460,10385,79000,10407.5,69630,10384.5,220,10408,22230,10384,71440,10408.5,30840,10383.5,44740
deribit,BTC-PERPETUAL,1599868814801000,1599868814817631,10398.5,20,10398,7400,10399,4890,10397.5,17680,10399.5,520,10396.5,17680,10400,1700,10396,30280,10400.5,3010,10395.5,44110,10401,40,10395,20080,10401.5,2570,10394.5,91410,10402,400,10394,97570,10402.5,50530,10393.5,27510,10403,9960,10393,3330,10403.5,54250,10392.5,200,10404,40,10392,20400,10404.5,10,10391.5,75650,10405,93470,10391,9580,10405.5,32540,10390.5,260040,10406,26130,10390,310,10406.5,9670,10389.5,21210,10407,1180,10389,87320,10407.5,89030,10388.5,61140,10408,54860,10388,283120,10408.5,42430,10387.5,10680,10409,260680,10387,11400,10409.5,19220,10386.5,92470,10410,94970,10386,49640,10410.5,50,10385.5,6420
deribit,BTC-PERPETUAL,1599868814809000,1599868814817632,10398.5,20,10398,7400,10399,4890,10397.5,17680,10399.5,520,10396.5,17680,10400,1700,10396,30280,10400.5,3010,10395.5,44110,10401,40,10395,20080,10401.5,2570,10394.5,91410,10402,400,10394,97570,10402.5,50530,10393.5,27510,10403,9960,10393,3330,10403.5,54900,10392.5,200,10404,40,10392,20400,10404.5,10,10391.5,75650,10405,93470,10391,9580,10405.5,32540,10390.5,260040,10406,26130,10390,310,10406.5,9670,10389.5,21210,10407,1180,10389,87320,10407.5,89030,10388.5,61140,10408,54860,10388,283120,10408.5,42430,10387.5,10680,10409,260680,10387,11400,10409.5,19220,10386.5,92470,10410,94970,10386,49640,10410.5,50,10385.5,6420
deribit,BTC-PERPETUAL,1599868815411000,1599868815414125,10399,4910,10398,25080,10399.5,20,10397.5,17680,10400,2200,10396.5,17680,10400.5,2910,10396,31780,10401,40,10395.5,44110,10401.5,570,10395,20050,10402,500,10394.5,91440,10402.5,52990,10394,98510,10403,3500,10393.5,26570,10403.5,45100,10393,3330,10404,9190,10392.5,470,10404.5,10,10392,18300,10405,70030,10391.5,85130,10405.5,60800,10391,8640,10406,26130,10390.5,260040,10406.5,9270,10390,22530,10407,240,10389.5,14030,10407.5,89970,10389,65120,10408,23640,10388.5,72380,10408.5,62090,10388,283120,10409,260680,10387.5,10280,10409.5,18150,10387,11400,10410,94970,10386.5,123630,10410.5,50,10386,8470,10411,28210,10385.5,6420
deribit,BTC-PERPETUAL,1599868815411000,1599868815419035,10399,4910,10398,25080,10399.5,20,10397.5,17680,10400,2200,10396.5,17680,10400.5,2910,10396,31780,10401,40,10395.5,44110,10401.5,570,10395,20050,10402,500,10394.5,91440,10402.5,52990,10394,98510,10403,3500,10393.5,26570,10403.5,45100,10393,3330,10404,9190,10392.5,470,10404.5,10,10392,18300,10405,70030,10391.5,85130,10405.5,60800,10391,8640,10406,26130,10390.5,260040,10406.5,17270,10390,22530,10407,240,10389.5,14030,10407.5,89970,10389,65120,10408,23640,10388.5,72380,10408.5,62090,10388,283120,10409,260680,10387.5,10280,10409.5,18150,10387,11400,10410,94970,10386.5,123630,10410.5,50,10386,8470,10411,28210,10385.5,6420
deribit,BTC-PERPETUAL,1599868907943000,1599868907946933,10398,4600,10397.5,73090,10399.5,10,10397,24630,10400,3300,10396.5,22770,10400.5,10270,10396,3130,10401,25390,10395.5,5000,10401.5,119790,10395,9060,10402,8510,10394.5,17910,10402.5,8180,10394,138990,10403,10000,10393.5,33080,10404,7960,10393,3020,10404.5,15130,10392.5,8130,10405,128930,10392,100920,10405.5,109560,10391.5,83330,10406,8610,10391,32220,10406.5,34890,10390.5,278270,10407,44440,10390,47980,10407.5,102620,10389.5,292240,10408,20660,10389,65100,10408.5,175160,10388.5,790,10409,7660,10388,55720,10409.5,308550,10387.5,31440,10410,138130,10387,3830,10410.5,15940,10386.5,109470,10411,2610,10386,31560,10411.5,3780,10385.5,3450
deribit,BTC-PERPETUAL,1599868907944000,1599868907953129,10398,4600,10397.5,73090,10399.5,10,10397,24630,10400,3300,10396.5,22770,10400.5,10270,10396,3130,10401,25390,10395.5,5000,10401.5,119790,10395,1060,10402,8510,10394.5,17910,10402.5,8180,10394,146990,10403,10000,10393.5,33080,10404,7960,10393,3020,10404.5,15130,10392.5,8130,10405,128930,10392,100920,10405.5,109560,10391.5,83330,10406,8610,10391,32220,10406.5,34890,10390.5,278270,10407,44440,10390,47980,10407.5,102620,10389.5,292240,10408,20660,10389,65100,10408.5,175160,10388.5,790,10409,7660,10388,55720,10409.5,308550,10387.5,31440,10410,138130,10387,3830,10410.5,15940,10386.5,109470,10411,2610,10386,31560,10411.5,3780,10385.5,3450
deribit,BTC-PERPETUAL,1599868907993000,1599868907997022,10398,4600,10397.5,73090,10399.5,2010,10397,24630,10400,3300,10396.5,22770,10400.5,8270,10396,3130,10401,25390,10395.5,5000,10401.5,119790,10395,1060,10402,8510,10394.5,17910,10402.5,8180,10394,146990,10403,10000,10393.5,33080,10404,7960,10393,3020,10404.5,15130,10392.5,8130,10405,128930,10392,100920,10405.5,109560,10391.5,83330,10406,8610,10391,32220,10406.5,34890,10390.5,278270,10407,44440,10390,47980,10407.5,102620,10389.5,292240,10408,20660,10389,65100,10408.5,175160,10388.5,790,10409,7660,10388,55720,10409.5,308550,10387.5,31440,10410,138130,10387,3830,10410.5,15940,10386.5,109470,10411,2610,10386,31560,10411.5,3780,10385.5,3450
```
Tick-level order book snapshots reconstructed from exchanges' real-time WebSocket order book L2 data feeds. Each row represents the top 5 levels from each side of the limit order book and was recorded every time any of the tracked top 5 bid/ask levels changed.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| asks[0..4].price | top 5 asks prices in ascending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| asks[0..4].amount | top 5 asks amounts in ascending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| bids[0..4].price | top 5 bids prices in descending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
| bids[0..4].amount | top 5 bids amounts in descending order, empty if there aren't enough price levels available in the order book or provided by the exchange |
```csv
exchange,symbol,timestamp,local_timestamp,asks[0].price,asks[0].amount,bids[0].price,bids[0].amount,asks[1].price,asks[1].amount,bids[1].price,bids[1].amount,asks[2].price,asks[2].amount,bids[2].price,bids[2].amount,asks[3].price,asks[3].amount,bids[3].price,bids[3].amount,asks[4].price,asks[4].amount
bitmex,XBTUSD,1598918402683390,1598918402683390,11658,1399982,11657.5,2293327,11658.5,82328,11657,37555,11659,3001,11656.5,110647,11659.5,10843,11656,10063,11660,2522
bitmex,XBTUSD,1598918403229829,1598918403229829,11658,1399982,11657.5,2293327,11658.5,82328,11657,37555,11659,3001,11656.5,110647,11659.5,10835,11656,10063,11660,2522
bitmex,XBTUSD,1598918403232925,1598918403232925,11658,1399982,11657.5,2295327,11658.5,82328,11657,37555,11659,3001,11656.5,110647,11659.5,10835,11656,10063,11660,2522
bitmex,XBTUSD,1598918403253585,1598918403253585,11658,1399982,11657.5,2295256,11658.5,82328,11657,37555,11659,3001,11656.5,110647,11659.5,10835,11656,10063,11660,2522
bitmex,XBTUSD,1598918403256460,1598918403256460,11658,1399982,11657.5,2294159,11658.5,82328,11657,37555,11659,3001,11656.5,110647,11659.5,10835,11656,10063,11660,2522
bitmex,XBTUSD,1598947264502293,1598947264502293,11950,730542,11949.5,1100309,11950.5,30454,11949,454098,11951,72967,11948.5,2519605,11951.5,48967,11948,97449,11952,55623
bitmex,XBTUSD,1598947264505452,1598947264505452,11950,730542,11949.5,1100309,11950.5,30454,11949,454098,11951,72967,11948.5,2509605,11951.5,48967,11948,97449,11952,55623
bitmex,XBTUSD,1598947264510015,1598947264510015,11950,730542,11949.5,1100975,11950.5,30454,11949,454098,11951,72967,11948.5,2509605,11951.5,48967,11948,97449,11952,55623
bitmex,XBTUSD,1598947264510024,1598947264510024,11950,730542,11949.5,1250975,11950.5,30454,11949,454098,11951,72967,11948.5,2509605,11951.5,48967,11948,97449,11952,55623
```
Individual trades data collected from exchanges' real-time WebSocket trades data feeds.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| id | trade id as provided by exchange, empty if exchange does not provide one - different exchanges provide ids as numeric values, GUIDs or other strings, and some do not provide that information at all |
| side | liquidity taker side (aggressor), possible values: buy - liquidity taker was buying; sell - liquidity taker was selling; unknown - exchange did not provide that information |
| price | trade price as provided by exchange |
| amount | trade amount as provided by exchange |
```csv
exchange,symbol,timestamp,local_timestamp,id,side,price,amount
bitmex,XBTUSD,1585699202957000,1585699203089980,d20...,buy,6425.5,12
bitmex,XBTUSD,1585699202980000,1585699203095276,619...,sell,6425,150
bitmex,XBTUSD,1585699203002000,1585699203099299,751...,sell,6425,25
bitmex,XBTUSD,1585699203092000,1585699203122233,3c1...,buy,6425.5,1
bitmex,XBTUSD,1585699203092000,1585699203122233,b9b...,buy,6425.5,1
bitmex,XBTUSD,1585699203092000,1585699203122233,433...,buy,6425.5,1
bitmex,XBTUSD,1585699203092000,1585699203122233,d16...,buy,6425.5,1
bitmex,XBTUSD,1585699203092000,1585699203122233,402...,buy,6425.5,1
bitmex,XBTUSD,1585699203092000,1585699203122233,2f8...,buy,6425.5,1
```
Tick-level options summary info (strike prices, expiration dates, open interest, implied volatility, greeks etc.) for all active options instruments collected from exchanges' real-time WebSocket options tickers data feeds. Options chain data is available for Deribit (sourced from ticker channel) and OKEx Options (sourced from option/summary and index/ticker channels).
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | ticker timestamp provided by exchange in microseconds since epoch |
| local_timestamp | ticker message arrival timestamp in microseconds since epoch |
| type | option type, possible values: put, call |
| strike_price | option strike price |
| expiration | option expiration date in microseconds since epoch |
| open_interest | current open interest, empty if exchange does not provide one |
| last_price | price of the last trade, empty if there weren't any trades yet |
| bid_price | current best bid price, empty if there aren't any bids |
| bid_amount | current best bid amount, empty if there aren't any bids |
| bid_iv | implied volatility for best bid, empty if there aren't any bids |
| ask_price | current best ask price, empty if there aren't any asks |
| ask_amount | current best ask amount, empty if there aren't any asks |
| ask_iv | implied volatility for best ask, empty if there aren't any asks |
| mark_price | mark price, empty if exchange does not provide one |
| mark_iv | implied volatility for mark price, empty if exchange does not provide one |
| underlying_index | underlying index name that the option contract is based upon |
| underlying_price | underlying price, empty if exchange does not provide one |
| delta | delta value for the option, empty if exchange does not provide one |
| gamma | gamma value for the option, empty if exchange does not provide one |
| vega | vega value for the option, empty if exchange does not provide one |
| theta | theta value for the option, empty if exchange does not provide one |
| rho | rho value for the option, empty if exchange does not provide one |
```csv
exchange,symbol,timestamp,local_timestamp,type,strike_price,expiration,open_interest,last_price,bid_price,bid_amount,bid_iv,ask_price,ask_amount,ask_iv,mark_price,mark_iv,underlying_index,underlying_price,delta,gamma,vega,theta,rho
deribit,BTC-9JUN20-9875-P,1591574399413000,1591574400196008,put,9875,1591689600000000,0.1,0.0295,0.0205,15.0,55.91,0.0235,15.0,68.94,0.02210436,62.89,SYN.BTC-9JUN20,9756.36,-0.61752,0.00103,2.24964,-53.05655,-0.22796
deribit,BTC-9JUN20-9875-P,1591574404454000,1591574404473112,put,9875,1591689600000000,0.1,0.0295,0.0205,15.0,55.91,0.0235,15.0,68.94,0.02209480,62.86,SYN.BTC-9JUN20,9756.37,-0.61757,0.00103,2.24954,-53.02754,-0.22798
deribit,BTC-9JUN20-9875-C,1591574397505000,1591574400196010,call,9875,1591689600000000,44.3,0.0080,0.0095,0.5,61.00,0.0105,20.0,65.33,0.00992836,62.87,SYN.BTC-9JUN20,9756.25,0.38232,0.00103,2.24933,-53.03038,0.13272
deribit,BTC-9JUN20-9750-C,1591574399414000,1591574400196011,call,9750,1591689600000000,30.5,0.0145,0.0145,0.3,58.80,0.0160,20.0,65.02,0.01527998,62.05,SYN.BTC-9JUN20,9756.36,0.51442,0.00109,2.35092,-54.69903,0.17789
deribit,BTC-9JUN20-9750-P,1591574397562000,1591574400196012,put,9750,1591689600000000,0.8,0.0185,0.0140,0.3,59.40,0.0155,0.3,65.63,0.01464260,62.06,SYN.BTC-9JUN20,9756.25,-0.48570,0.00109,2.35092,-54.70775,-0.17832
deribit,BTC-9JUN20-9625-P,1591574397824000,1591574400197202,put,9625,1591689600000000,9.5,0.0130,0.0090,0.4,61.64,0.0105,0.4,68.29,0.00975848,65.01,SYN.BTC-9JUN20,9756.25,-0.35780,0.00097,2.20136,-53.66975,-0.13100
deribit,BTC-9JUN20-9625-C,1591574397359000,1591574400197208,call,9625,1591689600000000,18.0,0.0220,0.0215,15.0,57.39,0.0235,20.0,66.31,0.02320750,65.02,SYN.BTC-9JUN20,9756.18,0.64212,0.00097,2.20150,-53.67532,0.22058
deribit,BTC-9JUN20-9500-P,1591574397940000,1591574400197209,put,9500,1591689600000000,51.5,0.0065,0.0060,0.5,66.31,0.0065,17.1,68.91,0.00625625,67.64,SYN.BTC-9JUN20,9756.30,-0.25091,0.00080,1.87744,-47.62196,-0.09165
deribit,BTC-9JUN20-9500-C,1591574399413000,1591574400197211,call,9500,1591689600000000,43.8,0.0165,0.0315,20.0,62.19,0.0350,0.4,80.06,0.03252520,67.64,SYN.BTC-9JUN20,9756.36,0.74914,0.00080,1.87725,-47.61618,0.25540
```
Top of the book (best bid/ask) data reconstructed from exchanges' real-time WebSocket order book L2 data feeds - best bid/ask recorded every time the top of the book changed. We chose this solution on purpose over exchanges' native real-time quotes feeds, as those vary a lot between exchanges, can be throttled, are sometimes absent entirely, and are often delayed and published in batches in comparison to the more granular L2 updates, which are the basis for our quotes dataset.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| ask_amount | best ask amount as provided by exchange, empty if there aren't any asks |
| ask_price | best ask price as provided by exchange, empty if there aren't any asks |
| bid_price | best bid price as provided by exchange, empty if there aren't any bids |
| bid_amount | best bid amount as provided by exchange, empty if there aren't any bids |
```csv
exchange,symbol,timestamp,local_timestamp,ask_amount,ask_price,bid_price,bid_amount
huobi-dm-swap,BTC-USD,1585699201147000,1585699201270777,86,6423,6422.9,112
huobi-dm-swap,BTC-USD,1585699201175000,1585699201292111,86,6423,6422.9,114
huobi-dm-swap,BTC-USD,1585699201257000,1585699201373479,84,6423,6422.9,219
huobi-dm-swap,BTC-USD,1585699201279000,1585699201495667,64,6423,6422.9,219
huobi-dm-swap,BTC-USD,1585699201295000,1585699201495715,64,6423,6422.9,229
huobi-dm-swap,BTC-USD,1585699201447000,1585699201564788,2,6423,6422.9,229
huobi-dm-swap,BTC-USD,1585699201556000,1585699201677770,64,6423,6422.9,229
huobi-dm-swap,BTC-USD,1585699201668000,1585699201784213,64,6423,6422.9,235
huobi-dm-swap,BTC-USD,1585699201747000,1585699201865051,2,6423,6422.9,235
```
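A small sketch of consuming those fields, e.g., deriving mid price and spread from a row parsed into a dict keyed by the CSV column names (empty cells mean one side of the book was missing):

```python
# Sketch: mid price and spread from a quotes dataset row, following the
# column semantics described above.
def mid_and_spread(row):
    if not row["bid_price"] or not row["ask_price"]:
        return None, None  # one side of the book is empty
    bid, ask = float(row["bid_price"]), float(row["ask_price"])
    return (bid + ask) / 2, ask - bid
```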
book_ticker
Derivative instrument ticker info (open interest, funding, mark price, index price) collected from exchanges' real-time WebSocket instruments & tickers data feeds. Anytime any of the tracked values changed, a row was added to the final dataset.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| funding_timestamp | timestamp of the next funding event in microseconds since epoch, empty if exchange does not provide one |
| funding_rate | funding rate that will take effect on the next funding event at funding timestamp; for some exchanges it's fixed, for others it fluctuates; empty if exchange does not provide one |
| predicted_funding_rate | estimated predicted funding rate for the funding event after the closest one, empty if exchange does not provide one |
| open_interest | current open interest, empty if exchange does not provide one |
| last_price | last instrument price, empty if exchange does not provide one |
| index_price | index price of the instrument, empty if exchange does not provide one |
| mark_price | mark price of the instrument, empty if exchange does not provide one |
```csv
exchange,symbol,timestamp,local_timestamp,funding_timestamp,funding_rate,predicted_funding_rate,open_interest,last_price,index_price,mark_price
bitmex,ETHUSD,1585699199651000,1585699202577291,1585713600000000,0.0001,0.001654,45921455,133.25,133.14,133.15
bitmex,ETHUSD,1585699200000000,1585699204834359,1585713600000000,0.0001,0.001654,45921455,133.25,133.12,133.13
bitmex,ETHUSD,1585699202925000,1585699205076090,1585713600000000,0.0001,0.001654,45921455,133.3,133.12,133.13
bitmex,ETHUSD,1585699202925000,1585699205090339,1585713600000000,0.0001,0.001654,45883853,133.3,133.12,133.13
bitmex,ETHUSD,1585699203465000,1585699205274555,1585713600000000,0.0001,0.001654,45883853,133.25,133.12,133.13
bitmex,ETHUSD,1585699204439000,1585699205951209,1585713600000000,0.0001,0.001654,45883853,133.15,133.12,133.13
bitmex,ETHUSD,1585699205000000,1585699206389317,1585713600000000,0.0001,0.001654,45883853,133.15,133.09,133.1
bitmex,ETHUSD,1585699207279000,1585699207490211,1585713600000000,0.0001,0.001654,45883853,133.2,133.09,133.1
bitmex,ETHUSD,1585699207279000,1585699208084951,1585713600000000,0.0001,0.001654,45867677,133.2,133.09,133.1
```
Liquidations data collected from exchanges' real-time WebSocket data feeds where available.
See the details on which exchanges support it and since when.
| column name | description |
| --- | --- |
| exchange | |
| symbol | instrument symbol as provided by exchange (always uppercase) |
| timestamp | timestamp provided by exchange in microseconds since epoch - if exchange does not provide one, local_timestamp value is used as a fallback |
| local_timestamp | message arrival timestamp in microseconds since epoch |
| id | liquidation id as provided by exchange, empty if exchange does not provide one - different exchanges provide ids as numeric values, GUIDs or other strings, and some do not provide that information at all |
| side | liquidation side: buy - short position was liquidated; sell - long position was liquidated |
| price | liquidation price as provided by exchange |
| amount | liquidation amount as provided by exchange |
```csv
exchange,symbol,timestamp,local_timestamp,id,side,price,amount
binance-futures,BTCUSDT,1632009737493000,1632009737505152,,sell,48283.81,0.01
binance-futures,BTCUSDT,1632009802385000,1632009802398690,,buy,48339.11,0.132
binance-futures,BTCUSDT,1632009870475000,1632009870485139,,buy,48337.02,0.004
binance-futures,BTCUSDT,1632009889760000,1632009889784144,,buy,48346.54,0.002
binance-futures,BTCUSDT,1632009891282000,1632009891296156,,buy,48350.34,0.032
binance-futures,BTCUSDT,1632009892636000,1632009892646433,,buy,48355.68,0.001
binance-futures,BTCUSDT,1632009970533000,1632009970544039,,buy,48290.57,0.042
binance-futures,BTCUSDT,1632010836285000,1632010836297995,,sell,48186.15,0.036
binance-futures,BTCUSDT,1632010899415000,1632010899428203,,sell,48150.64,0.265
```
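Since the side field encodes the liquidated position's direction indirectly, a one-line helper (a sketch, not official tooling) can make downstream code more readable:

```python
# Sketch: buy liquidations close short positions, sell liquidations close
# long positions, per the side field semantics described above.
def liquidated_position_side(row):
    return "short" if row["side"] == "buy" else "long"
```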
In addition to the standard currency pairs & instrument symbols that can be requested via the CSV datasets API, each exchange has additional special grouped symbols available, depending on whether it supports a given market type: SPOT, FUTURES, OPTIONS and PERPETUALS. When such a symbol is requested, the downloaded file contains the data for all instruments belonging to the given market type. This is especially useful for options instruments, as specifying each option symbol one by one can be a mundane process - using 'OPTIONS' as a symbol gives data for all options available at a given time.
incremental_book_L2 - available for FUTURES
trades - available for SPOT, FUTURES, OPTIONS, PERPETUALS
derivative_ticker - available for FUTURES, PERPETUALS
those special symbols are also listed in response to /exchanges/:exchange API call
all downloadable datasets are gzip compressed
historical market data is available in daily intervals (separate file for each day) based on local timestamp (timestamp of message arrival) split by exchange, data type and symbol
data for a given day is available on the next day around 6h after 00:00 UTC - the exact date until which data is available can be requested via /exchanges/:exchange API call (datasets.exportedUntil), e.g., https://api.tardis.dev/v1/exchanges/ftx
datasets are ordered and split into separate daily files by local_timestamp (timestamp of message arrival time)
empty gzip compressed file is being returned in case of no data available for a given day, symbol and data type, e.g., exchange downtime, very low volume currency pairs etc.
if timestamp equals local_timestamp, it means that the exchange didn't provide a timestamp for the message, e.g., BitMEX order book updates
a cell in a CSV file is empty if there's no value for it, e.g., no trade id if a given exchange doesn't provide one
datasets are sourced from Tardis.dev HTTP API, which in turn provides the data sourced from exchanges' real-time WebSocket market data feeds (in contrast to REST API endpoints)
See "Data FAQ" regarding potential order book overlap issues, non-monotonically increasing exchange timestamps, duplicated trade data and more
GET
https://datasets.tardis.dev/v1/:exchange/:dataType/:year/:month/:day/:symbol.csv.gz
Returns gzip compressed CSV dataset for given exchange, data type, date (year, month, day) and symbol.
| parameter | type | description |
| --- | --- | --- |
| exchange | string | one of https://api.tardis.dev/v1/exchanges (field id, only exchanges with "supportsDatasets":true) |
| dataType | string | one of datasets.symbols[].dataTypes values from https://api.tardis.dev/v1/exchanges/:exchange API response |
| year | string | year in format YYYY (four-digit year) |
| month | string | month in format MM (two-digit month of the year) |
| day | string | day in format DD (two-digit day of the month) |
| symbol | string | one of datasets.symbols[].id values from https://api.tardis.dev/v1/exchanges/:exchange API response, see details below |
| Authorization (header) | string | for authenticated requests provide Authorization header with value: 'Bearer YOUR_API_KEY'. Without API key, historical datasets for the first day of each month are available to download. |
The symbol param provided to the datasets API, in comparison to the HTTP API, needs to be always uppercase and have '/' and ':' characters replaced with '-' so the symbol is URL safe.
The list of allowed symbols for each exchange can be requested via /exchanges/:exchange API call, e.g., https://api.tardis.dev/v1/exchanges/deribit - datasets.symbols[].id field.
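Putting the pieces above together, here is a minimal download sketch using the documented URL layout, Bearer header and symbol normalization rules; the requests library and output file name are illustrative choices, and YOUR_API_KEY is a placeholder:

```python
import requests

# Sketch: download one daily gzip-compressed CSV file from the datasets API.
exchange, data_type, symbol = "bitmex", "trades", "XBTUSD"
year, month, day = "2019", "07", "01"  # first day of month: no API key needed

# symbols must be uppercase and URL safe ('/' and ':' replaced with '-')
symbol = symbol.upper().replace("/", "-").replace(":", "-")

url = (f"https://datasets.tardis.dev/v1/{exchange}/{data_type}/"
       f"{year}/{month}/{day}/{symbol}.csv.gz")
response = requests.get(url, headers={"Authorization": "Bearer YOUR_API_KEY"})
response.raise_for_status()

with open(f"{exchange}_{data_type}_{year}-{month}-{day}_{symbol}.csv.gz", "wb") as f:
    f.write(response.content)
```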
Coinbase Pro historical market data details - currency pairs, data coverage and data collection specifics
Coinbase Pro historical data for all its currency pairs is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
| data type | symbol | date |
| --- | --- | --- |
| trades | BTCUSD | 2019-07-01 |
| incremental_book_L2 | BTCUSD | 2019-07-01 |
| quotes | BTCUSD | 2019-07-01 |
| trades | ETHUSD | 2020-03-01 |
| incremental_book_L2 | ETHUSD | 2020-03-01 |
Historical data format is the same as provided by real-time Coinbase Pro WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
change
last_match
full_snapshot - generated channel with full order book L3 snapshots
Coinbase Pro (formerly GDAX) real-time WebSocket API provides initial full order book snapshots for the level2 channel, but not for the full channel. To overcome this issue we fetch initial order book snapshots from REST API and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "type":"full_snapshot" and "generated":true fields. This effectively allows reconstructing the historical full order book for the full (L3) channel. Validation based on sequence numbers, checking whether full_snapshot overlaps with WS L3 updates, was added 2020-06-11.
Market data collection infrastructure for Coinbase is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Kraken Futures (Crypto Facilities) historical market data details - available instruments, data coverage and data collection specifics
Crypto Facilities (aka Kraken Futures) historical data for all its instruments is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
| data type | symbol | date |
| --- | --- | --- |
| trades | PI_XBTUSD | 2019-07-01 |
| incremental_book_L2 | PI_XBTUSD | 2019-07-01 |
| quotes | PI_XBTUSD | 2019-07-01 |
| derivative_ticker | PI_XBTUSD | 2019-07-01 |
| trades | FUTURES | 2020-03-01 |
| liquidations | PERPETUALS | 2021-09-01 |
Historical data format is the same as provided by real-time Crypto Facilities WebSocket v1 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
book - during recording, the integrity of order book incremental update messages is validated using sequence numbers provided by the Kraken Futures real-time feed; in case of a detected missed message, the WebSocket connection is restarted.
Market data collection infrastructure for Kraken Futures is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections. Data recording is done via direct API access - without Cloudflare in between - https://www.cryptofacilities.com/resources/hc/en-us/articles/360022531713-IP-whitelisting-for-direct-access.
Kraken historical market data details - currency pairs, data coverage and data collection specifics
Kraken historical data for all its currency pairs is available since 2019-06-04.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
| data type | symbol | date |
| --- | --- | --- |
| trades | XBT-USD | 2019-07-01 |
| incremental_book_L2 | XBT-USD | 2019-07-01 |
| quotes | XBT-USD | 2019-07-01 |
| trades | ETH-USD | 2020-03-01 |
| incremental_book_L2 | ETH-USD | 2020-03-01 |
Historical data format is the same as provided by real-time Kraken WebSocket v1 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
book - recorded with depth=1000
Market data collection infrastructure for Kraken is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Bitstamp historical market data details - currency pairs, data coverage and data collection specifics
Bitstamp historical data for all its currency pairs is available since 2019-03-30.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD | 2019-07-01
incremental_book_L2 | BTCUSD | 2019-07-01
quotes | BTCUSD | 2019-07-01
trades | ETHUSD | 2020-03-01
incremental_book_L2 | ETHUSD | 2020-03-01
Historical data format is the same as provided by real-time Bitstamp WebSocket v2 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Bitstamp real-time WebSocket API does not provide initial full order book snapshots for live_orders and diff_order_book channel subscriptions. To overcome this issue we fetch initial order book snapshots from REST API and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "event": "snapshot" and "generated": true fields for the respective channels.
Market data collection infrastructure for Bitstamp is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Bitfinex historical market data details - currency pairs, data coverage and data collection specifics
Bitfinex exchange historical data for: BTCUSD, BTCUST, ETHUSD, ETHUST, LTCUSD, TRXUSD, EOSUSD, XRPUSD, LEOUSD, BABUSD currency pairs is available since 2019-05-23, data for other high caps is available since 2020-05-28, data for all currency pairs is available since 2021-10-29.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD | 2019-08-01
incremental_book_L2 | BTCUSD | 2019-08-01
quotes | BTCUSD | 2019-08-01
trades | ETHUSD | 2020-03-01
incremental_book_L2 | ETHUSD | 2020-03-01
Historical data format is the same as provided by real-time Bitfinex WebSocket v2 API with addition of local timestamps, and since 2020-05-27 also with addition of channel and symbol at the end of each message, which allows us to provide server-side data filtering. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
book (prec=P0, freq=F0, len=100) - during data collection, the integrity of order book incremental updates is validated using sequence numbers (SEQ_ALL option) provided by the real-time feed; if a missed message is detected, the WebSocket connection is restarted.
raw_book (prec=R0, freq=F0, len=100)
All data collection is performed with the TIMESTAMP and SEQ_ALL config flags set.
Market data collection infrastructure for Bitfinex is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Huobi Global historical market data details - currency pairs, data coverage and data collection specifics
Huobi Global historical data for high caps currency pairs is available since 2019-11-19, data for all currency pairs is available since 2022-06-09.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSDT | 2019-12-01
incremental_book_L2 | BTCUSDT | 2019-12-01
book_snapshot_25 | BTCUSDT | 2020-11-01
quotes | BTCUSDT | 2019-12-01
trades | ETHUSDT | 2020-03-01
incremental_book_L2 | ETHUSDT | 2020-03-01
Historical data format is the same as provided by real-time Huobi Global WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
mbp (mbp.150, 100ms interval) - market by price incremental updates, available since 2020-07-03
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the Huobi real-time feed (seqNum and prevSeqNum fields). If a missed message (sequence gap) is detected, we fetch a new snapshot via a "req" request - meaning that for the brief moment between requesting a new snapshot and receiving it, an order book reconstructed from the updates containing the gap may not be 100% valid.
The initial order book snapshot for this channel is obtained by requesting it via a "req" request.
etp - available since 2020-08-18
depth - available until 2020-12-09; see the mbp channel that provides more granular order book data. Collected with step0 aggregation level, providing data in 1-second intervals.
Huobi Global's depth real-time WebSocket channel always publishes full order book snapshots, which are inefficient to store. To circumvent this issue we store only the initial book snapshot and then incremental updates instead - incremental updates are calculated by diffing two subsequent book snapshots and are provided in the same format as other depth messages, except with an additional update: true flag set, as in the snippet below. An update with amount (the second value in the array) set to 0 means the price level should be deleted; otherwise the price level should be updated with the new amount value.
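A minimal sketch of applying these diffed depth messages to a locally maintained book side; the message shape is inferred from the description above rather than from a captured sample:

```ts
type PriceLevel = [number, number] // [price, amount]

const bids = new Map<number, number>() // price -> amount

function applyLevels(levels: PriceLevel[] = []) {
  for (const [price, amount] of levels) {
    if (amount === 0) {
      bids.delete(price) // amount 0 -> price level should be deleted
    } else {
      bids.set(price, amount) // otherwise update the level with the new amount value
    }
  }
}

function onDepthMessage(message: { update?: boolean; tick: { bids?: PriceLevel[] } }) {
  if (message.update !== true) {
    bids.clear() // no update flag -> full snapshot, rebuild local state from scratch
  }
  applyLevels(message.tick.bids)
}
```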
Market data collection infrastructure for Huobi Global since 2020-06-19 is located in GCP asia-northeast1 (Tokyo, Japan), before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Bybit historical market data details - available instruments, data coverage and data collection specifics
Bybit historical data for all its inverse contracts is available since 2019-11-07 (for linear contracts since 2020-05-28).
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD | 2020-01-01
incremental_book_L2 | BTCUSD | 2020-01-01
quotes | BTCUSD | 2020-01-01
derivative_ticker | BTCUSD | 2020-01-01
trades | ETHUSD | 2020-03-01
incremental_book_L2 | ETHUSD | 2020-03-01
liquidations | PERPETUALS | 2021-09-01
Historical data format is the same as provided by real-time Bybit WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
orderBook_200 - available since 2019-12-24
Market data collection infrastructure for Bybit since 2020-05-28 is located in GCP asia-northeast1 (Tokyo, Japan), before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Kucoin Spot historical market data details - instruments, data coverage and data collection specifics
Kucoin Spot exchange historical data for all its currency pairs is available since 2022-08-16.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC-USDT | 2023-02-01
incremental_book_L2 | BTC-USDT | 2023-02-01
quotes | BTC-USDT | 2023-02-01
book_snapshot_25 | BTC-USDT | 2023-02-01
trades | SPOT | 2023-02-01
book_ticker | BTC-USDT | 2023-02-01
Historical data format is the same as provided by real-time Kucoin Spot WebSocket Market Data API v1 (https://docs.kucoin.com/#websocket-feed) with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
market/level2Snapshot - channel with initial order book snapshot, uses request order book API to get it as described at https://docs.kucoin.com/#level-2-market-data
Market data collection infrastructure for Kucoin Spot is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
FTX US historical market data details - available currency pairs, data coverage and data collection specifics
Delta historical market data details - available instruments, data coverage and data collection specifics
Binance US historical market data details - currency pairs, data coverage and data collection specifics
AscendEX historical market data details - available instruments, data coverage and data collection specifics
Bybit Spot historical market data details - available instruments, data coverage and data collection specifics
Gemini historical market data details - currency pairs, data coverage and data collection specifics
Blockchain.com Exchange historical market data details - instruments, data coverage and data collection specifics
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time WOO X WebSocket Market Data API v2 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- channel with initial order book snapshot, uses request order book API to get it
Market data collection infrastructure for WOO X is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
See full downloadable CSV files documentation with datasets format spec, data samples and more.
See docs that show all available download options (download path customization, filenames conventions and more).
See datasets API reference which allows downloading single file at once.
Historical data format is the same as provided by real-time FTX US WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available since 2020-07-21
As the orderbook channel provides data only for the order book's best 100 orders on either side, the grouped orderbooks channel supplies order book data with grouped (collapsed) prices, allowing retrieval of lower-granularity, higher-depth information about the order book. We set the grouping param to the currency pair's priceIncrement value multiplied by 10.
Market data collection infrastructure for FTX US is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Delta Exchange WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
recent_trade - available until 2020-10-14; after that trades are available via the all_trades channel, due to a change in the Delta exchange API which discontinued recent_trade channel support.
- available until 2020-10-14
- available from 2020-10-14
- available from 2020-10-14
Market data collection infrastructure for Delta Exchange is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Upbit WebSocket Market Data API v1 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for Upbit is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Binance US WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
- available since 2019-11-19
depthSnapshot - generated channel with full order book snapshots
Binance US real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots from REST API and store them together with the rest of the WebSocket messages - top 1000 levels. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the real-time feed (U and u fields) - if a missed message is detected, the WebSocket connection is restarted. We also validate that the initial book snapshot fetched from REST API overlaps with received depth messages.
Market data collection infrastructure for Binance US is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time AscendEX Exchange WebSocket API v2 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for AscendEX is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Bybit Spot WebSocket v2 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for Bybit Spot is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Gemini WebSocket Market Data Version 2 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for Gemini is located in GCP europe-west2 region (London, UK). Real-time market data is captured via single WebSocket connection.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time Blockchain.com Exchange WebSocket Market Data API v1 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for Blockchain.com exchange is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Currently there's a problem with occasional crossed order book state (best bid >= best ask) when reconstructing the book from dYdX WebSocket updates; it's something the dYdX team is aware of and is working on a fix.
nextFundingRate published via the v3_markets channel does not exactly match the funding rates that dYdX ends up paying for open positions; it's also something the dYdX team is aware of and is working on a fix.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by real-time dYdX WebSocket Market Data API v3 with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See HTTP API docs.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Click any channel below to see response with historical data recorded for it.
Market data collection infrastructure for dYdX is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
data type | symbol | date
trades | SPOT_BTC_USDT | 2023-02-01
incremental_book_L2 | SPOT_BTC_USDT | 2023-02-01
quotes | PERP_BTC_USDT | 2023-02-01
book_snapshot_25 | PERP_BTC_USDT | 2023-02-01
trades | PERPETUALS | 2023-02-01
book_ticker | PERP_BTC_USDT | 2023-02-01
derivative_ticker | PERP_BTC_USDT | 2023-02-01
data type | symbol | date
trades | BTC-USD | 2020-06-01
incremental_book_L2 | BTC-USD | 2020-06-01
quotes | BTC-USD | 2020-06-01
trades | SPOT | 2020-06-01
data type | symbol | date
trades | BTCUSD | 2020-06-01
incremental_book_L2 | BTCUSD | 2020-06-01
quotes | BTCUSD | 2020-06-01
book_snapshot_25 | BTCUSD | 2020-06-01
derivative_ticker | BTCUSD | 2020-06-01
trades | PERPETUALS | 2020-06-01
data type | symbol | date
trades | KRW-BTC | 2021-09-01
incremental_book_L2 | KRW-BTC | 2021-09-01
quotes | KRW-BTC | 2021-09-01
book_snapshot_25 | KRW-BTC | 2021-09-01
trades | SPOT | 2021-08-01
data type | symbol | date
trades | BTCUSD | 2019-12-01
incremental_book_L2 | BTCUSD | 2019-12-01
quotes | BTCUSD | 2019-12-01
trades | ETHUSD | 2020-03-01
incremental_book_L2 | ETHUSD | 2020-03-01
data type | symbol | date
trades | BTC-PERP | 2023-03-01
incremental_book_L2 | BTC-PERP | 2023-03-01
quotes | BTC-USDT | 2023-03-01
book_snapshot_25 | BTC-USDT | 2023-03-01
derivative_ticker | BTC-PERP | 2023-03-01
trades | PERPETUALS | 2023-03-01
book_ticker | BTC-PERP | 2023-03-01
data type | symbol | date
trades | BTCUSDT | 2023-03-01
incremental_book_L2 | BTCUSDT | 2023-03-01
quotes | BTCUSDT | 2023-03-01
trades | SPOT | 2023-03-01
book_snapshot_25 | BTCUSDT | 2023-03-01
book_ticker | BTCUSDT | 2023-03-01
data type | symbol | date
trades | BTCUSD | 2020-01-01
incremental_book_L2 | BTCUSD | 2020-01-01
quotes | BTCUSD | 2020-01-01
trades | ETHUSD | 2020-03-01
incremental_book_L2 | ETHUSD | 2020-03-01
data type | symbol | date
trades | BTC-USD | 2023-03-01
incremental_book_L2 | BTC-USD | 2023-03-01
quotes | BTC-USD | 2023-03-01
book_snapshot_25 | BTC-USD | 2023-03-01
trades | SPOT | 2023-03-01
data type | symbol | date
trades | BTC-USD | 2021-09-01
incremental_book_L2 | BTC-USD | 2021-09-01
quotes | SOL-USD | 2021-09-01
book_snapshot_25 | SOL-USD | 2021-09-01
trades | PERPETUALS | 2021-09-01
derivative_ticker | BTC-USD | 2021-09-01
Crypto.com exchange historical market data details - instruments, data coverage and data collection specifics
Crypto.com exchange historical data for all its instruments is available since 2022-06-01.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD-PERP | 2023-03-01
incremental_book_L2 | BTCUSD-PERP | 2023-03-01
quotes | BTCUSD-PERP | 2023-03-01
book_ticker | BTCUSD-PERP | 2023-03-01
book_snapshot_25 | BTCUSD-PERP | 2023-03-01
derivative_ticker | BTCUSD-PERP | 2023-03-01
trades | PERPETUALS | 2023-03-01
Historical data format is the same as provided by real-time Crypto.com Exchange WebSocket Market Data API v2 (https://exchange-docs.crypto.com/spot/index.html#websocket-subscriptions) with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Market data collection infrastructure for Crypto.com exchange is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
OKCoin historical market data details - available instruments, data coverage and data collection specifics
OKCoin historical data for all its currency pairs is available since 2019-11-19.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC-USD | 2020-01-01
incremental_book_L2 | BTC-USD | 2020-01-01
quotes | BTC-USD | 2020-01-01
trades | ETH-USD | 2020-03-01
incremental_book_L2 | ETH-USD | 2020-03-01
Historical data format is the same as provided by real-time OKCoin WebSocket v3 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
spot/depth - available until 2020-02-17
spot/depth_l2_tbt - available since 2020-02-13
Market data collection infrastructure for OKCoin since 2020-05-15 is located in GCP asia-northeast1 (Tokyo, Japan), before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
bitFlyer historical market data details - available instruments, data coverage and data collection specifics
bitFlyer historical data for all its instruments is available since 2019-08-30.
Real-time market data provided by bitFlyer exchange isn't always the most reliable and clean, especially during market volatility periods (crossed order books, delayed trade data).
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | FX_BTC_JPY | 2020-01-01
incremental_book_L2 | FX_BTC_JPY | 2020-01-01
quotes | FX_BTC_JPY | 2020-01-01
trades | FUTURES | 2020-03-01
Historical data format is the same as provided by real-time bitFlyer lightning JSON-RPC 2.0 over WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
lightning_board_snapshot - this channel is subscribed only in order to get the initial order book snapshot; after that it's automatically unsubscribed, as all order book updates are provided via the lightning_board channel.
Market data collection infrastructure for bitFlyer since 2020-05-28 is located in GCP asia-northeast1 (Tokyo, Japan), before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via single WebSocket connection.
Gate.io Futures historical market data details - available instruments, data coverage and data collection specifics
Gate.io Futures historical data for all its instruments is available since 2020-07-01.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC_USDT | 2020-07-01
incremental_book_L2 | BTC_USDT | 2020-07-01
quotes | BTC_USDT | 2020-07-01
derivative_ticker | BTC_USDT | 2020-07-01
trades | PERPETUALS | 2020-07-01
Historical data format is the same as provided by real-time Gate.io Futures WebSocket v4 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
order_book (limit=20, interval="0")
Market data collection infrastructure for Gate.io Futures is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Bitnomial historical market data details - instruments, data coverage and data collection specifics
Bitnomial exchange historical data for all its instruments is available since 2023-01-13.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BUIH23 | 2023-03-01
incremental_book_L2 | BUIH23 | 2023-03-01
quotes | BUIH23 | 2023-03-01
book_snapshot_25 | BUIH23 | 2023-03-01
trades | FUTURES | 2023-03-01
Historical data format is the same as provided by real-time Bitnomial WebSocket Market Data API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
trade
block
status
Market data collection infrastructure for Bitnomial exchange is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Phemex historical market data details - available instruments, data coverage and data collection specifics
Phemex historical data for all its derivative instruments is available since 2020-03-17 (for spot markets since 2020-06-04).
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTCUSD | 2020-04-01
incremental_book_L2 | BTCUSD | 2020-04-01
quotes | BTCUSD | 2020-04-01
derivative_ticker | BTCUSD | 2020-04-01
trades | ETHUSD | 2020-04-01
incremental_book_L2 | ETHUSD | 2020-04-01
Historical data format is the same as provided by real-time Phemex WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
spot_market24h - available since 2020-06-05
Market data collection infrastructure for Phemex since 2020-06-04 is located in GCP asia-northeast1 (Tokyo, Japan), before that it was located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Gate.io historical market data details - currency pairs, data coverage and data collection specifics
Gate.io historical data for high caps currency pairs is available since 2020-07-01, data for all currency pairs is available since 2022-06-09.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | BTC_USDT | 2020-07-01
incremental_book_L2 | BTC_USDT | 2020-07-01
quotes | BTC_USDT | 2020-07-01
book_snapshot_25 | BTC_USDT | 2020-07-01
trades | SPOT | 2020-07-01
Historical data format is the same as provided by real-time Gate.io WebSocket v3 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
depth (limit=30, interval="0")
Market data collection infrastructure for Gate.io is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via multiple WebSocket connections.
Poloniex historical market data details - currency pairs, data coverage and data collection specifics
Poloniex exchange historical data for all its currency pairs is available since 2020-07-01.
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
data type | symbol | date
trades | USDT_BTC | 2020-07-01
incremental_book_L2 | USDT_BTC | 2020-07-01
quotes | USDT_BTC | 2020-07-01
trades | SPOT | 2020-07-01
Historical data format is the same as provided by real-time Poloniex WebSocket v2 API with addition of local timestamps, and also with addition of symbol at the end of each message, which allows us to provide server-side data filtering. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
price_aggregated_book - initial book snapshot, book modifications, and trades. During data collection, the integrity of order book incremental updates is validated using sequence numbers provided by the real-time feed; if a missed message is detected, the WebSocket connection is restarted.
Market data collection infrastructure for Poloniex is located in GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
Overview of the main ways Tardis.dev historical market data can be accessed programmatically
Python client providing convenient access to tick-level historical market data
efficient data replay API returning historical market data for whole time periods (in contrast to Tardis.dev HTTP API where single call returns data for single minute time period)
built-in data caching
market data in exchange-native format
Node.js client providing convenient access to tick-level historical and real-time market data
efficient data replay API returning historical market data for whole time periods (in contrast to Tardis.dev HTTP API where single call returns data for single minute time period)
consolidated real-time data streaming API connecting directly to exchanges' WebSocket APIs
full limit order book reconstruction logic and customizable order book snapshots
built-in data caching
market data both in exchange-native and normalized formats
Locally runnable tardis-machine server providing both HTTP and WebSocket endpoints
efficient data replay API endpoints returning historical market data for whole time periods (in contrast to Tardis.dev HTTP API where single call returns data for single minute time period)
WebSocket API providing historical market data replay from any given past point in time with the same data format and 'subscribe' logic as real-time exchanges' APIs - in many cases existing exchanges' WebSocket clients can be used to connect to this endpoint
consolidated real-time WebSocket data streaming API endpoint
built-in data caching
market data both in exchange-native and normalized formats
customizable order book snapshots and trade bars data types
HTTP API providing historical market data feed in minute by minute slices
Each API response returned as NDJSON (new line delimited JSON) with addition of local timestamp at the beginning of each line
market data in exchange-native format
C++, Java, Rust, C#, Go and R clients - we have plans to add dedicated clients for those languages; meanwhile the following alternatives are available:
use HTTP API directly
use HTTP or WebSocket API of locally installed tardis-machine server
We're always happy to help if you'd have any problems with the integration, contact us.
CoinFLEX 2.0 historical market data details - instruments, data coverage and data collection specifics
CoinFLEX historical data for all its instruments is available since 2020-07-14.
data type | symbol | date
trades | BTC-USD-SWAP-LIN | 2020-08-01
incremental_book_L2 | BTC-USD-SWAP-LIN | 2020-08-01
quotes | BTC-USD-SWAP-LIN | 2020-08-01
derivative_ticker | BTC-USD-SWAP-LIN | 2020-08-01
Historical data format is the same as provided by real-time CoinFLEX WebSocket v2 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Market data collection infrastructure for CoinFlex 2.0 is located in GCP asia-northeast1 (Tokyo, Japan). Real-time market data is captured via single WebSocket connection.
HitBTC historical market data details - currency pairs, data coverage and data collection specifics
HitBTC historical data for high caps currency pairs is available since 2019-11-19.
Not available.
Historical data format is the same as provided by real-time HitBTC WebSocket v2 API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files or official client libs that can perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes API allowing efficiently requesting historical market data for whole time periods in contrast to HTTP API that provides data only in minute by minute slices.
See tardis-machine docs.
Market data collection infrastructure for HitBTC is located in GCP europe-west2 region (London, UK). Real-time market data is captured via single WebSocket connection.
API providing useful instrument metadata (tick sizes, contract multipliers, normalised base and quote currencies, expiration dates etc.)
Please provide 'Authorization' header with value: 'Bearer YOUR_API_KEY'.
Returns instrument info for provided exchange and symbol.
Returns instruments array for given exchange matching the provided optional filter JSON object; when provided via query string it needs to be URL encoded.
Array of instruments objects as described for single instrument endpoint
Convenient access to tick-level historical and real-time cryptocurrency market data via Node.js
transparent historical local data caching (cached data is stored on disk in compressed GZIP format and decompressed on demand when reading the data)
automatic reconnection logic for closed and stale connections in real-time streams
fast and lightweight architecture — low memory footprint and no heavy in-memory buffering
built-in TypeScript support
Requires Node.js v12+ installed.
Simply change from require
to ES Modules import
to enjoy first class TypeScript typings.
replay(options)
replayNormalized(options, ...normalizers)
stream(options)
streamNormalized(options, ...normalizers)
init(options)
getExchangeDetails(exchange)
getApiKeyAccessInfo(apiKey?)
clearCache()
Clears local data cache dir.
Data normalization allows consuming market data feeds from various exchanges in consistent format.
normalizeTrades
normalizeBookChanges
normalizeDerivativeTickers
disconnect message
exchange is the exchange id for which the mapper object needs to be returned, localTimestamp is the date for which the mapper is created (it is created anew after each disconnection). In most cases localTimestamp is not needed, but in certain cases, for example an exchange API change, it can be used to switch to different mapping logic, like using a new data channel that wasn't available until a certain date.
The returned Mapper object has the following signature:
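A sketch of that shape, inferred from the surrounding description; the stand-in types below are illustrative, and the exact generic parameters in the tardis-dev typings may differ:

```ts
// minimal stand-ins for tardis-dev's own types, for illustration only
type NormalizedData = {
  type: string
  symbol: string
  exchange: string
  timestamp: Date
  localTimestamp: Date
}
type Filter = { channel: string; symbols?: string[] }

type Mapper<U extends NormalizedData> = {
  // tells whether this mapper can handle a given exchange-native message
  canHandle: (message: any) => boolean
  // maps an exchange-native message to zero or more normalized messages
  map(message: any, localTimestamp: Date): IterableIterator<U> | undefined
  // filters (channels/symbols) that should be requested or subscribed for this mapper
  getFilters: (symbols?: string[]) => Filter[]
}
```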
On every disconnection event, normalize factory functions are called again to provide new Mapper objects with clean state if required (stateful mapping, like BitMEX order book data, needs to persist the mapping between price level id and price level, and needs to 'reset' for each new connection). If a mapper object is stateful, it's required to always return a new clean-state object from the normalize factory function, or to reset its state in one way or another.
normalizeLiquidations normalizer
Example implementation of a custom normalizeLiquidations function that normalizes liquidations data for the deribit exchange; implementations for other exchanges are left as an exercise for the reader. The example covers the type of messages provided by normalizeLiquidations, the implementation of the deribitLiquidations mapper, the normalizeLiquidations function itself, and a normalizeLiquidations usage example.
Custom normalizeTrades for the Binance exchange - normalizeTradesWithBinancePatch usage example.
orderBook.update(bookChange)
orderBook.bestBid()
orderBook.bestAsk()
orderBook.asks()
orderBook.bids()
combine(...iterators)
For historical data replay it combines input async iterables' messages by sorting them by localTimestamp in ascending order, which allows synchronized/ordered market data replay for multiple exchanges, as in the sketch below.
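A minimal usage sketch (assumes an API key has already been provided via init() for full historical access; the symbols and date range are arbitrary):

```ts
import { replayNormalized, normalizeTrades, combine } from 'tardis-dev'

const bitmexTrades = replayNormalized(
  { exchange: 'bitmex', symbols: ['XBTUSD'], from: '2019-07-01', to: '2019-07-02' },
  normalizeTrades
)
const deribitTrades = replayNormalized(
  { exchange: 'deribit', symbols: ['BTC-PERPETUAL'], from: '2019-07-01', to: '2019-07-02' },
  normalizeTrades
)

// messages from both exchanges, ordered by localTimestamp
for await (const message of combine(bitmexTrades, deribitTrades)) {
  console.log(message.exchange, message.localTimestamp)
}
```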
compute(iterator, ...computables)
computeTradeBars(options) - computeTradeBars produces trade_bar messages.
computeBookSnapshots(options) - computeBookSnapshots produces book_snapshot messages.
Any custom computable is a factory function; the returned Computable object's signature is sketched in the compute section below.
computeOrderBookImbalanceRatio()
Example covering the type of messages produced by computeOrderBookImbalanceRatio, the BookImbalanceRatioComputable computable, and the computeOrderBookImbalanceRatio factory function.
computeOrderBookImbalanceRatio usage example: given the implementation above, we can compute the book imbalance ratio for the BitMEX real-time XBTUSD message stream. For this example we compute top 5 levels, 2 second book snapshots as the source for our custom computable. We need an async iterable that produces book snapshots as the source for our book imbalance computable, hence the two invocations of compute, as in the sketch below.
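A sketch of those two compute invocations; computeOrderBookImbalanceRatio is the custom factory from this example (not part of tardis-dev itself) and is assumed to be defined as described above:

```ts
import { streamNormalized, normalizeBookChanges, compute, computeBookSnapshots } from 'tardis-dev'

// assumed to exist: the custom computable factory implemented in this example
declare function computeOrderBookImbalanceRatio(): any

const realTimeMessages = streamNormalized(
  { exchange: 'bitmex', symbols: ['XBTUSD'] },
  normalizeBookChanges
)

// first compute(): top 5 levels, 2 second book snapshots as the source...
const withBookSnapshots = compute(
  realTimeMessages,
  computeBookSnapshots({ depth: 5, interval: 2 * 1000 })
)

// ...second compute(): the custom book imbalance ratio based on those snapshots
const withImbalanceRatio = compute(withBookSnapshots, computeOrderBookImbalanceRatio())

for await (const message of withImbalanceRatio) {
  console.log(message)
}
```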
Example showing replaying large historical trades across multiple exchanges as they happened.
Example showing a simple pattern of providing an async iterable of market data messages to a function that can process them, no matter if it's real-time or historical market data. That effectively enables having the same 'data pipeline' for backtesting and live trading.
Example showing how to quickly display real-time funding rate and open interest info across multiple exchanges at once.
Example showing how to write Deribit exchange historical funding, index and open interest data into CSV.
Example showing implementation of SimpleMovingAverageComputable that calculates the average of trade bar close prices for a specified rolling window in an incremental way. It uses a CircularBuffer under the hood.
HTTP based API providing historical market data feeds, supported exchanges details and more
GET
https://api.tardis.dev/v1/data-feeds/:exchange
Provides historical cryptocurrency market data feed for requested exchange in minute by minute slices in new line delimited JSON format (NDJSON) with addition of local timestamp at the beginning of each line - in ISO 8601 format. JSON message in each line is a data message in exchange-native format.
Empty lines in response are being used as markers for disconnect events that occurred when collecting the data.
Responses are gzip compressed (content-encoding: gzip) and each one contains one minute of historical market data, starting from the requested date - the from date plus the minute offset param.
Parallel requests to this endpoint are supported and can speed up the overall data fetching process substantially, but using more than ~60 parallel requests doesn't bring speed benefits.
In order to achieve best performance the HTTP 1.1 protocol is recommended; in our testing HTTP 2 was noticeably slower.
As this is a relatively low level API, you may also want to try the official client libraries that are built on top of it and provide a more convenient way of consuming historical market data, like requesting whole date ranges at once instead of minute by minute pagination, or providing normalized data format.
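A minimal sketch of calling the endpoint directly from Node.js 18+ (global fetch; gzip response bodies are decompressed transparently). The from and offset parameters follow this reference; the filters query parameter shape mirrors the client libraries' replay options and should be verified against the full API reference, and YOUR_API_KEY is a placeholder:

```ts
const params = new URLSearchParams({
  from: '2019-07-01',
  offset: '0', // minute offset from the `from` date
  filters: JSON.stringify([{ channel: 'trade', symbols: ['XBTUSD'] }])
})

const response = await fetch(`https://api.tardis.dev/v1/data-feeds/bitmex?${params}`, {
  headers: { Authorization: 'Bearer YOUR_API_KEY' }
})

// one minute slice of NDJSON: "<ISO 8601 local timestamp> <exchange-native JSON>"
for (const line of (await response.text()).split('\n')) {
  if (line === '') continue // empty lines mark disconnect events
  const separatorIndex = line.indexOf(' ')
  const localTimestamp = line.slice(0, separatorIndex)
  const message = JSON.parse(line.slice(separatorIndex + 1))
  console.log(localTimestamp, message)
}
```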
GET
https://api.tardis.dev/v1/exchanges
Gets the list of all supported exchanges that historical market data is available for.
GET
https://api.tardis.dev/v1/exchanges/:exchange
Gets the exchanges details: available symbols, availability dates, available channels, CSV datasets info, incidents etc.
GET
https://api.tardis.dev/v1/api-key-info
Given the API_KEY provided in the request header, returns information about what historical data (exchanges, date ranges, symbols) is available for that API_KEY.
Binance DEX historical market data details - currency pairs, data coverage and data collection specifics
Binance Jersey historical market data details - currency pairs, data coverage and data collection specifics
Instruments metadata API is available only for active subscriptions.
Node.js tardis-dev library provides convenient access to tick-level historical and real-time cryptocurrency market data, both in exchange-native and normalized formats. Instead of callbacks it relies on async iterables, enabling composability features.
tick-level historical market data replay backed by Tardis.dev — includes full order book depth snapshots plus incremental updates, trades, historical open interest, funding, index, mark prices, liquidations and more
consolidated real-time data streaming connecting directly to exchanges' public WebSocket APIs
support for both exchange-native and normalized market data formats (unified format for accessing market data across all supported exchanges — normalized trades, order book and ticker data)
seamless switching between real-time streaming and historical market data replay thanks to async iterables providing a unified way of consuming data messages
support for top cryptocurrency exchanges
synchronized historical market data replay and consolidated real-time data streaming from multiple exchanges via the combine helper function
computing derived data locally, like order book imbalance, customizable trade bars, book snapshots and more, via the compute helper function and computables, e.g., volume based bars, top 20 levels order book snapshots taken every 10 ms etc
full limit order book reconstruction, both for real-time and historical data, via the OrderBook object
custom data normalization support that allows adjusting normalized formats for specific needs
tardis-dev lib uses the debug package for verbose logging and debugging purposes; it can be enabled via the DEBUG environment variable set to tardis-dev*.
See the historical data details page to get detailed information about historical market data available for each exchange.
Replays historical market data messages for given replay options in exchange-native format. Historical market data is fetched efficiently (in parallel) from the Tardis.dev HTTP API and cached locally. Returns an async iterable.
stream is the real-time counterpart of the replay function, returning real-time market data in the same format.
Replays historical market data messages for given replay options and normalizes them using the normalizers provided as rest arguments. Historical market data is fetched efficiently (in parallel) from the Tardis.dev HTTP API and cached locally. Returns an async iterable.
streamNormalized is the real-time counterpart of the replayNormalized function, returning real-time market data in the same format.
The replayNormalized function accepts any number of normalizers that map from exchange-native format to normalized data format. tardis-dev ships with built-in normalizers but also allows custom ones.
Message types and formats depend on the specific normalizers provided to the replayNormalized function and are documented in detail in the data normalization section.
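A minimal usage sketch of replayNormalized (the symbols and date range are arbitrary):

```ts
import { replayNormalized, normalizeTrades, normalizeBookChanges } from 'tardis-dev'

// replay one day of normalized BitMEX trades and order book changes
const messages = replayNormalized(
  { exchange: 'bitmex', symbols: ['XBTUSD'], from: '2019-07-01', to: '2019-07-02' },
  normalizeTrades,
  normalizeBookChanges
)

for await (const message of messages) {
  console.log(message) // e.g. { type: 'trade', symbol: 'XBTUSD', ... }
}
```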
Streams real-time market data messages for given stream options in exchange-native format. It connects directly to exchanges' WebSocket APIs and transparently restarts closed, broken or stale connections (open connections without data being sent for a specified amount of time). Returns an async iterable.
replay is the historical market data counterpart of the stream function, returning historical market data in the same format.
Streams real-time market data messages for given stream options and normalizes them using the normalizers provided as rest arguments. It connects directly to exchanges' WebSocket APIs and transparently restarts closed, broken or stale connections (open connections without data being sent for a specified amount of time). Returns an async iterable.
replayNormalized is the historical counterpart of the streamNormalized function, returning historical market data in the same format.
The streamNormalized function can accept any number of custom normalizers as rest arguments that map from exchange-native format to normalized data format. tardis-dev ships with built-in normalizers but also allows custom ones.
Message types and formats depend on the specific normalizers provided to the streamNormalized function and are documented in detail in the data normalization section.
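A minimal usage sketch of streamNormalized, producing the same message shape as the replay example above:

```ts
import { streamNormalized, normalizeTrades } from 'tardis-dev'

// real-time normalized BitMEX trades, consumed via the same async iterable pattern
const messages = streamNormalized(
  { exchange: 'bitmex', symbols: ['XBTUSD'] },
  normalizeTrades
)

for await (const message of messages) {
  console.log(message)
}
```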
When working with market data via the replay and replayNormalized functions, by default only the first day of each month of historical data is available for replay, and locally cached historical data is stored in a default location on disk (OS temp dir).
The init function allows providing an apiKey (received via email after ordering historical market data access) as well as a custom cacheDir. The apiKey can also be provided directly via the options of the replay and replayNormalized functions - that overrides anything that was provided via init.
Given an exchange id, provides exchange details (available symbols, availability dates, available channels, pricing info etc) as returned by the exchange details API endpoint.
Given an apiKey provided as an optional parameter or via the init function, provides information about what historical data is available for it - exchanges, date ranges, symbols.
tardis-dev has the following built-in normalizers that can be provided to the replayNormalized or streamNormalized functions:
normalizeTrades - provides normalized trade data
normalizeBookChanges - provides normalized book_change data
normalizeDerivativeTickers - provides normalized funding, index and mark price data
If you're interested in how exactly data is mapped from exchange-native format to the normalized one, please follow the mapper code for each exchange, and if you determine that the mapping should be done differently, please read the custom data normalization section.
When passed as an arg to the replayNormalized or streamNormalized function, normalizeTrades provides normalized trade data for all supported exchanges.
When passed as an arg to the replayNormalized or streamNormalized function, normalizeBookChanges provides normalized book_change data for all supported exchanges.
Provides initial order book snapshots (isSnapshot=true) plus incremental updates for each order book change. Please note that amount is the updated amount at that price level, not a delta. An amount of 0 indicates the price level can be removed.
When passed as an arg to the replayNormalized or streamNormalized function, normalizeDerivativeTickers provides normalized derivative_ticker data for supported exchanges that trade derivative instruments.
When the replayNormalized or streamNormalized function's options have the withDisconnectMessages flag set to true and a disconnect event occurs (e.g. WebSocket connection close), a disconnect message is returned.
In tardis-dev, data normalization is implemented via normalize factory functions provided to the replayNormalized and streamNormalized functions. This design gives lots of flexibility, allowing replacing, extending and modifying the built-in normalizers, or adding new ones for new normalized data types, without the need to fork the whole library.
Any normalize function provided to the replayNormalized and streamNormalized functions needs to have the following signature:
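A sketch of that signature, inferred from the description of the factory arguments above; exact type names in the tardis-dev typings may differ, and Mapper refers to the shape sketched earlier:

```ts
// (exchange, localTimestamp) -> Mapper; the factory is invoked again after each disconnection
type NormalizeFunction<U> = (
  exchange: string, // exchange id the mapper needs to be returned for
  localTimestamp: Date // date for which the mapper is created
) => Mapper<U>
```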
Normalized data returned by the Mapper.map method is expected to have a shape with at least the fields described below, to play well with other tardis-dev functions like combine or compute.
We could as well provide the same normalizeLiquidations function to the streamNormalized function, or use it together with other normalizers (normalizeTrades etc.).
Let's assume that the default normalization of Binance exchange trades data doesn't fit our use case and we need to use a different stream as the source of trade data instead of the stream used by default.
tardis-dev exports an OrderBook class that, when instantiated, can process normalized messages with order book snapshots and incremental updates, and allows maintaining full local order book (level 2 - aggregated market-by-price) state, both for real-time data and for reconstructing historical order book state at any past point in time. It waits for the first book_change message that is a snapshot (isSnapshot = true) and then applies subsequent updates to it. A single orderBook object can maintain order book state only for a single symbol/instrument. It uses a sorted data structure under the hood to efficiently maintain its local state in sorted order.
Processes normalized book_change messages to update its internal local state that maintains ordered bids and asks sets. It ignores any non-snapshot book_change messages before the initial snapshot is received. It should be called for every book_change message received for the symbol we'd like to reconstruct the order book for.
Returns the book level for the highest bid order (best bid) in the order book, or undefined if the book doesn't have any bids (not yet initialized with the initial snapshot).
Returns the book level for the lowest ask order (best ask) in the order book, or undefined if the book doesn't have any asks (not yet initialized with the initial snapshot).
Returns an iterable of book level objects for all asks, ordered from the lowest to the highest ask.
Returns an iterable of book level objects for all bids, ordered from the highest to the lowest bid.
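A minimal usage sketch of the OrderBook class combined with normalized book_change messages (symbol and date range are arbitrary):

```ts
import { replayNormalized, normalizeBookChanges, OrderBook } from 'tardis-dev'

const orderBook = new OrderBook() // maintains state for a single symbol

const messages = replayNormalized(
  { exchange: 'bitmex', symbols: ['XBTUSD'], from: '2019-07-01', to: '2019-07-02' },
  normalizeBookChanges
)

for await (const message of messages) {
  if (message.type === 'book_change') {
    orderBook.update(message) // ignores non-snapshot updates until the first snapshot
    console.log(orderBook.bestBid(), orderBook.bestAsk())
  }
}
```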
The combine function, given multiple async iterators, combines them into a single one. That allows synchronized historical market data replay and consolidated real-time data streaming for multiple exchanges via a single loop.
It accepts async iterables of normalized messages as rest arguments and combines them, returning a single async iterable.
For real-time market data streaming it combines the input async iterables' messages in FIFO order.
The compute function allows computing various derived data locally via so-called computables, like:
computeTradeBars - computes various trade bars (OHLC, volume based bars, tick based bars) based on trade data
computeBookSnapshots - computes various order book snapshots based on normalized book_change data
If you're interested in adding custom computables, for example order book imbalance, volume imbalance, open interest or funding rate based bars, please read the custom computables section.
The compute function accepts an async iterable producing normalized messages, together with computable factory functions as rest arguments, and returns an async iterable with the normalized messages produced by the provided iterable plus all computed messages based on the provided computables. It computes and produces separate computed normalized messages for each symbol and exchange combination. When a disconnect message is returned by the provided async iterable, it discards existing pending computables and starts computing them from scratch.
When provided to the compute function, computeTradeBars computes normalized trade_bar messages based on trade data.
When provided to the compute function, computeBookSnapshots computes normalized book_snapshot messages based on normalized book_change data. It produces new snapshots only if there is an actual change in order book state for the requested depth.
Any computables provided to compute need to be factory functions with the following signature:
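A sketch of that shape, inferred from the description above; the stand-in NormalizedData type is illustrative and the exact tardis-dev typings may differ:

```ts
// minimal stand-in for tardis-dev's NormalizedData type, for illustration only
type NormalizedData = {
  type: string
  symbol: string
  exchange: string
  timestamp: Date
  localTimestamp: Date
}

type Computable<T extends NormalizedData> = {
  readonly sourceDataTypes: string[] // normalized message types it consumes, e.g. ['book_snapshot']
  compute(message: NormalizedData): IterableIterator<T> // yields computed messages
}

// the factory function passed to compute(); called per symbol/exchange combination
type ComputableFactory<T extends NormalizedData> = () => Computable<T>
```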
The iterator returned by Computable.compute is expected to provide objects that have at least the fields described above, to play well with other tardis-dev functions like combine.
Example implementation of a custom computeOrderBookImbalanceRatio function that uses book snapshots as its source data type and, based on them, computes the ratio of ask amounts (sell orders) to bid amounts (buy orders) for a given depth. It may be used to determine relative buy or sell pressure.
Example showing how to easily display real-time spread and best bid/ask info across multiple exchanges at once. It can be easily adapted to do the same for historical data (replayNormalized instead of streamNormalized).
Not sure which data access option to choose?
See downloadable CSV files docs and the related datasets API if you'd like to access historical tick-level trades, order book snapshots, incremental order book L2 updates, options chains, quotes, derivative tickers and liquidations datasets in daily intervals split by exchange, data type and symbol. It may be faster and more native to your toolkit to access the historical data this way.
See also the Python client library with built-in data caching that provides more convenient access to tick-level historical market data — it returns data for whole time periods, in contrast to the HTTP API where a single call returns data for a single minute time period.
See also the Node.js client library with built-in data caching that provides more convenient access to tick-level historical market data — it returns data for whole time periods, in contrast to the HTTP API where a single call returns data for a single minute time period.
Historical data format is the same as provided by the real-time Binance DEX WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files docs or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see a response with historical data recorded for it.
depthSnapshot - generated channel with full order book snapshots
Binance DEX real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots (top 1000 levels) from the REST API and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
Market data collection infrastructure for Binance DEX is located in the GCP europe-west2 region (London, UK). Real-time market data is captured via multiple WebSocket connections.
The client library provides a simple and intuitive way of accessing the Tardis.dev historical market data API. Detailed and most up to date documentation & installation instructions can be found on the project page, but the gist of it is that you provide the exchange name, historical date ranges and optional filters (channel names are the same as the exchange's channels in real-time WebSocket feeds, same for symbols), and you receive an async iterator that provides a market data message for each iteration. Local disk-based caching is done transparently in the background. Cached data is stored on disk in compressed form (GZIP) and decompressed on demand when reading the data. See the example snippet below that shows how to replay some of the historical BitMEX data.
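A sketch of such a replay, assuming the replay function of the tardis-dev Node.js package (options follow the replay parameter table in this section; dates are illustrative):

```js
const { replay } = require('tardis-dev')

async function run() {
  const messages = replay({
    exchange: 'bitmex',
    filters: [{ channel: 'trade', symbols: ['XBTUSD'] }],
    from: '2019-05-01',
    to: '2019-05-02'
  })

  // each iteration returns an exchange-native message together with its local timestamp
  for await (const { localTimestamp, message } of messages) {
    console.log(localTimestamp, message)
  }
}

run()
```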
Historical CSV datasets for the first day of each month are available to download without API key. See downloadable CSV files documentation.
Historical data format is the same as provided by the real-time Binance Jersey WebSocket API with addition of local timestamps. If you'd like to work with normalized data format instead (same format for each exchange) see downloadable CSV files docs or official client libs that perform data normalization client-side.
See Python client docs.
See Node.js client docs.
Tardis-machine is a locally runnable server that exposes an API allowing efficiently requesting historical market data for whole time periods, in contrast to the HTTP API that provides data only in minute-by-minute slices.
See tardis-machine docs.
Click any channel below to see a response with historical data recorded for it.
- available since 2019-11-19
depthSnapshot - generated channel with full order book snapshots
Binance Jersey real-time WebSocket API does not provide initial order book snapshots. To overcome this issue we fetch initial order book snapshots (top 1000 levels) from the REST API and store them together with the rest of the WebSocket messages. Such snapshot messages are marked with "stream":"<symbol>@depthSnapshot" and "generated":true fields.
During data collection, the integrity of order book incremental updates is validated using update IDs provided by the real-time feed (U and u fields) - in case a missed message is detected, the WebSocket connection is restarted. We also validate that the initial book snapshot fetched from the REST API overlaps with the received depth messages.
recentTrades - generated channel, available since 2020-05-19
After establishing a successful real-time WebSocket connection we fetch the last 1000 trades from the REST API and store that info together with other captured WebSocket messages. Such messages are marked with "stream":"<symbol>@recentTrades" and "generated":true fields, and the data field has the same format as the REST API response.
Market data collection infrastructure for Binance Jersey is located in the GCP europe-west2 region (London, UK). Real-time market data is captured via a single WebSocket connection.
| name | type | default | description |
| --- | --- | --- | --- |
| exchange | string | - | requested exchange id - one of the allowed values |
| filters | {channel:string, symbols?: string[]}[] | [] | optional filters of requested historical data feed - use the getExchangeDetails function to get allowed channels and symbols ids for requested exchange |
| from | string | - | replay period start date (UTC) in a format recognized by Date.parse(), e.g., 2019-04-01 |
| to | string | - | replay period end date (UTC) in a format recognized by Date.parse(), e.g., 2019-04-02 |
| skipDecoding | boolean \| undefined | undefined | when set to true, returns messages as buffers instead of decoding them to objects |
| withDisconnects | boolean \| undefined | undefined | when set to true, returns a message with value undefined for events when the connection that was recording the historical data got disconnected |
| apiKey | string \| undefined | undefined | API key for Tardis.dev HTTP API |
| name | type | default | description |
| --- | --- | --- | --- |
| exchange | string | - | requested exchange id - one of the allowed values |
| symbols | string[] \| undefined | undefined | optional symbols for requested data feed - use the getExchangeDetails function to get allowed symbols ids for requested exchange |
| from | string | - | replay period start date (UTC) in a format recognized by Date.parse(), e.g., 2019-04-01 |
| to | string | - | replay period end date (UTC) in a format recognized by Date.parse(), e.g., 2019-04-02 |
| withDisconnectMessages | boolean \| undefined | undefined | when set to true, returns disconnect messages for events when the connection that was recording the historical data got disconnected |
| apiKey | string \| undefined | undefined | API key for Tardis.dev HTTP API |
| name | type | default | description |
| --- | --- | --- | --- |
| exchange | string | - | requested exchange id - one of the allowed values |
| filters | {channel:string, symbols?: string[]}[] | [] | optional filters of requested real-time data feed - use getExchangeDetails to get allowed channels and symbols ids for requested exchange |
| skipDecoding | boolean \| undefined | undefined | when set to true, returns messages as buffers instead of decoding them to objects |
| withDisconnects | boolean \| undefined | undefined | when set to true, returns a message with value undefined for real-time stream disconnect events |
| timeoutIntervalMS | number | 10000 | specifies time in milliseconds after which the connection is restarted if no message has been received from the exchange |
| onError | ((err) => void) \| undefined | undefined | optional callback invoked when a real-time WebSocket connection error occurs, useful for custom error logging etc. |
| name | type | default | description |
| --- | --- | --- | --- |
| exchange | string | - | requested exchange id - one of the allowed values |
| symbols | string[] \| undefined | undefined | instruments' symbols for requested data feed |
| withDisconnectMessages | boolean \| undefined | undefined | when set to true, returns disconnect messages for real-time stream disconnect events |
| timeoutIntervalMS | number | 10000 | specifies time in milliseconds after which the connection is restarted if no message has been received from the exchange |
| onError | ((err) => void) \| undefined | undefined | optional callback invoked when a real-time WebSocket connection or mapping error occurs, useful for custom error logging etc. |
| name | type | default | description |
| --- | --- | --- | --- |
| apiKey | string \| undefined | undefined | API key for Tardis.dev HTTP API - if not provided, only the first day of each month of historical data is accessible |
| cacheDir | string | <os.tmpdir>/.tardis-cache | path to local dir that will be used as cache location - if not provided, the default temp dir for given OS will be used |
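A sketch, assuming these options are set globally via the tardis-dev init function (the apiKey value and cache path are illustrative):

```js
const { init } = require('tardis-dev')

// set API key and custom cache dir once, before any replay/stream calls
init({
  apiKey: 'YOUR_API_KEY',
  cacheDir: './tardis-cache'
})
```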
| name | type | default | description |
| --- | --- | --- | --- |
| kind | 'time' \| 'volume' \| 'tick' | - | determines the way trades within a bar will be aggregated: time - classic OHLC candles aggregated by time, volume - volume based trade bars aggregated by sum of trades' amounts, tick - tick based trade bars aggregated by trades count |
| interval | number | - | determines the interval to aggregate by - for time based bars it's the number of milliseconds, for volume based bars it's the accumulated volume, for tick based bars it's the count of trades |
| name | string \| undefined | undefined | optional custom name of trade_bar; if not specified, a computed name will be provided based on kind and interval options |
| name | type | default | description |
| --- | --- | --- | --- |
| depth | number | - | number of closest bid and ask levels to provide the snapshot for |
| interval | number | - | snapshot interval in milliseconds; if 0 is provided, it computes snapshots in real-time, any time there is a change in order book state for the requested depth |
| name | string \| undefined | undefined | optional custom name of book_snapshot; if not specified, a computed name will be provided based on depth and interval options |
| name | type | description |
| --- | --- | --- |
| exchange* | string | one of https://api.tardis.dev/v1/exchanges (field id) |
| from* | string | requested UTC start date of historical market data feed (e.g.: 2019-04-05 or 2019-04-05T01:02:00.000Z) |
| offset | number | minute offset that together with the from date specifies the exact minute slice of historical data that will be returned (e.g.: from date 2019-04-05 with offset 2 will provide historical data between 2019-04-05T00:02:00.000Z and 2019-04-05T00:03:00.000Z) |
| filters | string | URL encoded JSON string in {channel:string, symbols?: string[]}[] format with optional historical market data filters, e.g.: [{"channel":"trade", "symbols":["XBTUSD"]}] - to get the list of allowed channels and symbols for each exchange use the https://api.tardis.dev/v1/exchanges/:exchange API (documented below) |
| Authorization | string | for authenticated requests provide Authorization header with value 'Bearer YOUR_API_KEY'; without API key, historical data feeds for the first day of each month are available |
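A sketch of such a request with curl; note the data feed endpoint path used here (/v1/data-feeds/:exchange) is an assumption, so check the API reference for the exact URL:

```sh
curl -G 'https://api.tardis.dev/v1/data-feeds/bitmex' \
  --data-urlencode 'from=2019-04-05' \
  --data-urlencode 'offset=2' \
  --data-urlencode 'filters=[{"channel":"trade","symbols":["XBTUSD"]}]' \
  -H 'Authorization: Bearer YOUR_API_KEY'
```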
| name | type | description |
| --- | --- | --- |
| exchange* | string | one of https://api.tardis.dev/v1/exchanges (field id) |
| Authorization* | string | Authorization header with value 'Bearer YOUR_API_KEY' |
| data type | symbol | date |
| --- | --- | --- |
| trades | BTCEUR | 2019-12-01 |
| incremental_book_L2 | BTCEUR | 2019-12-01 |
| quotes | BTCEUR | 2019-12-01 |
| trades | ETHEUR | 2020-03-01 |
| incremental_book_L2 | ETHEUR | 2020-03-01 |
Terms you are agreeing to when you use Tardis.dev website and its services
Last modification date: 2023-09-12
Throughout the page, the terms “we”, “us” and “our” refer to Tardis.dev.
Please read these Terms of Service carefully before using our Service. By using our websites (including any customer portal or interactive customer website) (https://tardis.dev and https://docs.tardis.dev), services, solutions, tools, and related applications, services, and programs, including research and marketing activities, offered by us (the "Services"), you agree to be bound by these Terms of Service.
By visiting our site, purchasing something from us, accessing material we make available or using any software we provide such as API service, you engage in our “Service” and agree to be bound by the following terms and conditions (“Terms of Service”, “Terms”), including those additional terms and conditions and policies referenced herein and/or available by hyperlink. These Terms of Service apply to all users of the site, including without limitation users who are browsers, vendors, customers, merchants, and/or contributors of content.
Any new features or tools we offer are subject to the Terms of Service. You can review the most current version of the Terms of Service at any time on this page. We reserve the right to update, change or replace any part of these Terms of Service by posting updates and/or changes to our website. It is your responsibility to check this page periodically for changes. Your continued use of our Service following the posting of any changes constitutes acceptance of those changes.
By agreeing to these Terms of Service, you represent that you are at least the age of majority in your state or province of residence, or that you are the age of majority in your state or province of residence and you have given us your consent to allow any of your minor dependents to use this site.
A breach or violation of any of the Terms will result in an immediate termination of your Services.
We reserve the right to refuse service to anyone for any reason at any time.
We reserve the right at any time to modify or discontinue the Service (or any part or content thereof) without notice.
We shall not be liable to you or to any third-party for any modification, price change, suspension or discontinuance of the Service.
You agree to indemnify, defend and hold Tardis.dev and our affiliates, officers, directors, contractors and employees harmless from any claim or demand, including reasonable attorneys’ fees, made by any third-party due to or arising out of your breach of these Terms of Service or the documents they incorporate by reference, or your violation of any law or the rights of a third-party.
In the event that any provision of these Terms of Service is determined to be unlawful, void or unenforceable, such provision shall nonetheless be enforceable to the fullest extent permitted by applicable law, and the unenforceable portion shall be deemed to be severed from these Terms of Service, and such determination shall not affect the validity and enforceability of any other remaining provisions.
In addition to other prohibitions as set forth in the Terms of Service, you are prohibited from using our service: (a) for any unlawful purpose; (b) to solicit others to perform or participate in any unlawful acts; (c) to violate any international, federal, provincial or state regulations, rules, laws, or local ordinances; (d) to infringe upon or violate our intellectual property rights or the intellectual property rights of others; (e) to harass, abuse, insult, harm, defame, slander, disparage, intimidate, or discriminate based on gender, sexual orientation, religion, ethnicity, race, age, national origin, or disability; (f) to submit false or misleading information; (g) to upload or transmit viruses or any other type of malicious code that will or may be used in any way that will affect the functionality or operation of the Service or of any related website, other websites, or the Internet; (h) to collect or track the personal information of others; (i) to spam, phish, pharm or pretext; (j) for any obscene or immoral purpose; or (k) to interfere with or circumvent the security features of the Service or any related website, other websites, or the Internet. We reserve the right to terminate your use of the Service or any related website for violating any of the prohibited uses.
The failure of us to exercise or enforce any right or provision of these Terms of Service shall not constitute a waiver of such right or provision.
These Terms of Service and any policies or operating rules posted by us on this site or in respect to the Service constitute the entire agreement and understanding between you and us and govern your use of the Service, superseding any prior or contemporaneous agreements, communications and proposals, whether oral or written, between you and us (including, but not limited to, any prior versions of the Terms of Service).
Any ambiguities in the interpretation of these Terms of Service shall not be construed against the drafting party.
Our order process is conducted by our online reseller Paddle.com. Paddle.com is the Merchant of Record and authorized reseller for all the Services provided by us which means that you acquire our Services from Paddle.com. By agreeing to these Terms of Service you also agree to Paddle's Buyer Terms and Conditions (https://paddle.com/legal-buyers/). Paddle.com handles returns and provides all customer service inquiries.
By purchasing the Services provided by Tardis.dev (“the Supplier”), you (the “Customer”) are agreeing to this Licence Agreement (“Agreement”). This Agreement (as well as the documents referred to in it, including the Supplier’s Privacy Policy and any additional terms or policies that the Supplier tells the Customer about) sets out the agreement between the Customer and the Supplier - please read them carefully.
1.1 The definitions and rules of interpretation in this clause apply in this Agreement and in any other agreement between the parties.
Authorised Person: means in relation to either party: (i) any director, officer, employee or professional advisor of that party to whom the disclosure of Confidential Information is necessary in order to enable that party to perform obligations or exercise rights pursuant to this Agreement; (ii) any body which regulates that party in any jurisdiction, if disclosure to that body is mandated by applicable law or relevant regulation; (iii) the insurers, brokers and auditors of that party; and (iv) any service providers providing administrative and similar support services to that party in the ordinary course of business in connection with the performance of obligations under this Agreement and to whom disclosure of Confidential Information is necessary to enable that party to perform obligations or exercise rights pursuant to this Agreement.
Confidential Information: all financial, business and technical and all other information (regardless of its form or the medium in which it is stored) concerning the business and affairs of a party or of a confidential nature that the other party obtains, receives or has access to, before or after the date of this Agreement, in connection with, or in the performance of, the Agreement.
Customer System: any information technology system or systems owned or operated by the Customer to which Data is delivered or within which Data is Distributed in accordance with this Agreement.
Customer User: any employee of the Customer authorized by the Customer to access and use the Services (wholly or in part), including employees of Customer’s Affiliates.
Customer Affiliate: an entity that owns or controls, is owned or controlled by or is or under common control or ownership with Customer, where control is defined as the possession, directly or indirectly, of the power to direct or cause the direction of the management and policies of an entity, whether through ownership of voting securities, by contract or otherwise;
Customer User Restrictions: the obligations set out in Schedule 1.
Data: the data or information, in whatever form including images, still and moving, and including financial and market research information, the provision of which comprises the Services (wholly or in part).
Derived Data: any Data (wholly or in part):
Manipulated to such a degree that it: (i) cannot be identified as originating or deriving directly from the Data or the Services and cannot be reverse-engineered such that it can be so identified; (ii) is not capable of use substantially as a substitute for the Data or the Services;
that is not separately marketed by the Customer; and
that has no independent commercial value.
Distribute: to make Data accessible (including the provision of access through a database or other application populated with the Data, transferring or disclosing the Data) by any means, including any electronic means, to any Customer User.
Effective Date: date of acceptance of the Agreement by the Customer.
Fees: fees specified on the Website and in the purchase invoice.
Force Majeure Event: means an event beyond a Party's reasonable control (but in each case only to the extent actually beyond the control of the Party seeking to rely on that event as a Force Majeure Event), including: (i) extreme abnormal weather conditions; (ii) nuclear, chemical or biological contamination; (iii) war, civil commotion or terrorist attack; (iv) interruption or failure of a utility service including electric power, gas or water; (v) acts of God, floods or earthquakes; (vi) pandemic (excluding COVID-19); or (vii) the imposition of a sanction, embargo or breaking off of diplomatic relations, but excluding in each case strikes or other forms of industrial action by the employees, agents or subcontractors of that Party, or any change in applicable law or relevant regulation.
Initial Period: the period commencing on the Effective Date that is specified in the purchase invoice.
Insolvency Event: (i) any procedure commenced with a view to the winding-up or re-organisation of such party; (ii) any step taken or any procedure is commenced with a view to the appointment of an administrator, receiver, administrative receiver or trustee in bankruptcy in relation to such party or all or substantially all of its assets; (iii) the holder of any security over all or substantially all of the assets of such party takes any step to enforce that security; (iv) all or substantially all of the assets of such party is subject to attachment, sequestration, execution or any similar process; (v) such party is unable to pay its debts as they fall due; (vi) such party enters into, or any step is taken, whether by the board of directors of such party or otherwise, towards entering into a composition or arrangement with its creditors or any class of them, including a company voluntary arrangement or a deed of arrangement; or (vii) such party enters into, or any step is taken, whether by the board of directors of such party or otherwise, towards any analogous procedure under the laws of any jurisdiction to the procedures set out in (i) to (vi) above.
Intellectual Property Rights: means: (i) rights in, and in relation to, any patents, registered designs, design rights, trade marks, trade and business names (including goodwill associated with any trade marks or trade and business names), copyright and related rights, moral rights, databases, domain names, semi-conductor and other topography rights and utility models, and including registrations and applications for, and renewals or extensions of, such rights, and similar or equivalent rights or forms of protection in any part of the world; (ii) rights in the nature of unfair competition rights and to sue for passing off and for past infringement; and (iii) trade secrets, confidentiality and other proprietary rights, including rights to know how and other technical information.
Licence: the licence granted in Clause 9.
Manipulate: to combine or aggregate the Data (wholly or in part) with other data or information or to adapt the Data (wholly or in part).
Manipulated Data: any Data which has been Manipulated. Manipulated Data includes any Derived Data.
Mark: means the trade marks, trade names, product or service names, logos, slogans, typefaces, brand or other proprietary words or symbols used by the Supplier from time to time.
Materials: any documents or software supplied by the Supplier under this Agreement.
Permitted Use: internal business use (which shall not include the use of the Data or the Materials by, or for the benefit of, any person other than an employee of the Customer).
Release: generally available upgrades and enhancements to the Data.
Services: the services to be supplied by the Supplier under this Agreement, including the supply of any Data, Materials, or Support.
Software: any software provided by the Supplier to enable the Services to be used including any Releases.
Support: the support to be supplied by the Supplier including reasonable efforts to assist the Customer to access the Data.
Term: the Initial Period and any Renewal Periods.
Website: means any webpage of the Supplier, including but not limited to Tardis.dev.
1.2 The headings in this Agreement are inserted for convenience only and shall not affect its construction.
1.3 A person includes a natural person, corporate or unincorporated body (whether or not having separate legal personality).
1.4 The schedules form part of this Agreement and shall have effect as if set out in full in the body of this Agreement. Any reference to this Agreement includes the schedules.
1.5 A reference to a company shall include any company, corporation or other body corporate, wherever and however incorporated or established.
1.6 Unless the context otherwise requires, words in the singular shall include the plural and in the plural shall include the singular.
1.7 A reference to a particular law is a reference to it as it is in force for the time being taking account of any amendment, extension, or re-enactment and includes any subordinate legislation for the time being in force made under it.
1.8 References to clauses and schedules are to the clauses and schedules of this Agreement and references to paragraphs are to paragraphs of the relevant schedule.
1.9 Any words following the terms including, include, in particular or for example or any similar phrase shall be construed as illustrative and shall not limit the generality of the related general words.
1.10 If there is any uncertainty between any provision contained in the body of this Agreement and any provision contained in the Schedules or appendices, the provision in the body of this Agreement shall prevail.
During the Term the Supplier shall supply the Services to the Customer and the Customer shall pay the Fees and use the Services.
3.1 The Supplier shall use reasonable efforts to make connection to the Services available on the Effective Date.
3.2 The Customer shall ensure that it promptly complies with any minimum hardware configuration requirements specified by the Supplier to establish connectivity between the Customer System and the Services.
3.3 Each party shall bear its own costs of establishing that connectivity.
4.1 During the Term the Supplier shall supply the Services to the Customer.
4.2 The Supplier may change at any time, with as much prior notice to the Customer as is reasonably practicable:
the content, format or nature of the Data or the Services; and
the means of access to the Data or the Services.
5.1 Customer will pay to Supplier, without offset or deduction, all fees due under this Agreement. Unless otherwise specified, all fees shall be due 14 days from the date of invoice and all fees are non-cancelable and non-refundable.
5.2 Supplier order process is conducted by Supplier online reseller Paddle.com. Paddle.com is the Merchant of Record for all Supplier orders.
5.3 All Fees are exclusive of VAT or any other applicable sales tax, which shall be paid by the Customer at the rate and in the manner for the time being prescribed by law.
6.1 The parties shall each, as a receiving party: (i) keep confidential all Confidential Information disclosed by the disclosing party; (ii) shall not use the Confidential Information disclosed by the disclosing party in any way contrary to this Agreement, including Schedule I and not otherwise for the benefit of any third party; and (iii) not disclose the Confidential Information disclosed by the disclosing party to any person save to an Authorised Person.
6.2 The parties shall each, as a receiving party, ensure that each Authorised Person complies with confidentiality provisions no less onerous than those contained in this Clause 6 and will remain liable for any disclosure of Confidential Information by each Authorised Person as if it had made such disclosure.
6.3 The parties shall each, on the other party’s request destroy, erase or deliver to the other party all the requesting party’s Confidential Information, save where the retention of such Confidential Information is necessary to comply with applicable law or relevant regulation or otherwise for the other party to exercise its rights or receive benefits due under the Agreement.
6.4 The parties agree that the provisions of Clauses 6.1, 6.2, and 6.3 shall not apply to any information which the receiving party can prove: (i) is or becomes public knowledge other than by breach of this Clause; (ii) was in the possession of the receiving party without restriction in relation to disclosure before the date of receipt from the disclosing party; (iii) is received from a third party who lawfully acquired it and who was under no obligation restricting its disclosure; or (iv) was independently developed without access to any Confidential Information disclosed by the disclosing party.
6.5 The parties agree that these provisions in this Clause 6 shall not apply so as to prevent disclosure of Confidential Information by the receiving party to the extent that such disclosure is required to be made by any authority of competent jurisdiction or by any applicable law or relevant regulation or for the purposes of defending itself in relation to actual or threatened proceedings, regardless of whether brought or threatened by the other party or any other person, provided in each case that where permissible the receiving party: (i) gives the disclosing party reasonable formal written notice (provided that this is not in contravention of applicable law or relevant regulation) prior to such disclosure to allow the disclosing party a reasonable opportunity to seek a protective order; and (ii) uses reasonable endeavours to obtain prior to the disclosures written assurance from the applicable entity that it will keep the Confidential Information confidential.
6.6 Each party reserves all rights in its Confidential Information. No rights or obligations in respect of a party’s Confidential Information, other than those expressly stated in this Agreement, are granted to the other party, or are to be implied from this Agreement.
6.7 The provisions of this Clause 6 shall survive any expiry or termination of the Agreement.
No party shall make, or permit any person to make, any public announcement concerning this Agreement without the prior written consent of the other party (such consent not to be unreasonably withheld or delayed), except as required by law, any governmental or regulatory authority (including any relevant securities exchange), any court or other authority of competent jurisdiction.
The Customer shall ensure that the Data and Materials are kept secure, and shall use security practices and systems consistent with standard industry practices and which will be applicable to the use of the Data and Materials to prevent, and take prompt and proper remedial action against, unauthorised access, copying, modification, storage, reproduction, display or distribution of the Data and the Materials.
9.1 The Supplier grants to the Customer a non-exclusive, non-transferable, licence for the Permitted Use only, subject to the Customer User Restrictions, to:
access, view and Manipulate Data and create Derived Data;
store the Data and Manipulated Data on the Customer System;
Distribute Derived Data to Customer Users on the Customer System; and
use (but not modify) the Materials in support of the activities referred to in this Clause 9.1.
9.2 Except as expressly provided in this Agreement, the Customer shall not:
use the Services (wholly or in part) in its products or services; or
redistribute or resell the Data or the Services (wholly or in part), with the exception of reselling or redistributing aggregated and calculated data (lowest resolution being 10 minutes).
9.3 The Customer shall comply with the Customer User Restrictions.
10.1 The Customer acknowledges that:
all Intellectual Property Rights in the Data and the Materials are the property of the Supplier or its licensors, as the case may be;
it shall have no rights in or to the Data or the Materials other than the right to use them in accordance with the express terms of this Agreement;
the Supplier or its licensors has or have made and will continue to make substantial investment in the obtaining, verification, selection, coordination, development, presentation and supply of the Data;
it shall use the Supplier’s Mark strictly in accordance with the Supplier’s written instructions; and
any goodwill generated through the Customer’s use of the Supplier’s Mark shall belong only to the Supplier.
10.2 The Customer acknowledges that reference in any element of the Materials to trade names or proprietary products where no specific acknowledgement of such names or products is made does not imply that such names or products may be regarded by the Customer as free for general use, outside the scope of the use of the Materials authorised by this agreement.
10.3 If any third-party claim is made, or in the Supplier’s reasonable opinion is likely to be made, in relation to the use of the Data, the Supplier may at its sole option and expense:
procure for the Customer the right to continue using, developing, modifying or retaining the Data or the Materials (wholly or in part) in accordance with this Agreement;
modify the Data or the Materials (wholly or in part) so that they cease to be infringing;
replace the Data or the Materials (wholly or in part) with non-infringing items; or
terminate this Agreement immediately by notice in writing to the Customer. In respect of ongoing Subscriptions purchased by the Customer, the Supplier shall refund any Fees for the Initial Period or Renewal Period (as relevant) paid in advance by the Customer as at the date of termination (less a reasonable sum in respect of the Customer’s use of the Data or Materials to the date of termination) on return of the Data or the Materials and all copies of each of them.
10.4 Supplier shall indemnify and hold Customer harmless for any claims arising from third-party claims of any infringement of Intellectual Property Rights.
11.1 Except as expressly stated in this Agreement, all warranties, conditions and terms, whether express or implied by statute, common law or otherwise (including any implied warranties of satisfactory quality or fitness for a particular purpose or non-infringement) are hereby excluded to the extent permitted by law.
11.2 Without limiting the effect of Clause 11.1, the Supplier does not warrant or make any representations:
that the supply of the Data will be error-free, free from interruption, or operate without loss or corruption of data or technical malfunction;
that the Data is accurate, complete, reliable, secure, useful, fit for purpose or timely; or
that the Data has been tested for use by the Customer or any third party or that the Data will be suitable for or be capable of being used by the Customer or any third party; or
regarding the benefit the Customer or any third party will obtain from the Data.
12.1 The Customer acknowledges that:
the use and interpretation of the Data requires specialist skill and knowledge of financial markets;
the Customer has that skill and knowledge and undertakes that it will exercise that skill and knowledge and appropriate judgment when using the Data;
the Customer shall be solely responsible, as against the Supplier, for any opinions, recommendations, forecasts or other conclusions made or actions taken by the Customer, any client of the Customer or any other third party based (wholly or in part) on the Data unless otherwise set out in this Clause 12; and
it is in the best position to ascertain any likely loss it may suffer in connection with this Agreement, that it is therefore responsible for making appropriate insurance arrangements to address the risk of any such loss and that the provisions of this Clause 12 are reasonable in these circumstances.
12.2 Neither party excludes or limits liability to the other party for:
fraud or fraudulent misrepresentation;
any matter which cannot be excluded by law.
12.3 Subject to Clause 12.2, each party shall not in any circumstances be liable whether in contract, tort (including for negligence and breach of statutory duty howsoever arising), misrepresentation (whether innocent or negligent), restitution or otherwise, for:
any loss (whether direct or indirect) of profits, business, business opportunities, revenue, turnover, reputation or goodwill;
any loss or corruption (whether direct or indirect) of data or information;
loss (whether direct or indirect) of anticipated savings or wasted expenditure (including management time); or
any loss or liability (whether direct or indirect) under or in relation to any other contract.
12.4 Subject to Clause 12.2 and excluding Clause 10.4, each party's total aggregate liability in contract, tort (including negligence and breach of statutory duty howsoever arising), misrepresentation (whether innocent or negligent), restitution or otherwise, arising in connection with the performance or contemplated performance of this Agreement or any collateral contract shall in all circumstances be limited to the total Fees paid by the Customer to the Supplier during the 12-month period immediately before the date on which the cause of action first arose or, if the cause of action arose during the Initial Period, in respect of the Initial Period.
12.5 The Supplier shall not be liable for any delay in delivery of the Services that is caused by an event within the scope of Clause 14 or the Customer’s failure to provide the Supplier with adequate delivery instructions or any other instructions that are relevant to the supply of the Services or the Customer’s failure to comply with Clause 3.2.
13.1 This Agreement shall commence on the Effective Date. Unless terminated earlier in accordance with this Clause 13 or Clause 10.3, this Agreement shall continue for the Initial Period.
13.2 The Supplier may terminate this Agreement in respect of the Services (wholly or in part):
with immediate effect by giving written notice to the Customer if the Customer fails to pay any amount due under this Agreement on the due date for payment and remains in default not less than 30 days after being notified in writing to make that payment;
on written notice to the Customer at any time if the Supplier discontinues or withdraws, in whole or in part, its provision of the Services in question to all subscribers of such Services. The Supplier will use reasonable endeavours to give the Customer as much notice of the same as reasonably practicable, but any such termination will be without liability to the Supplier.
13.3 Without prejudice to any rights that have accrued under this Agreement or any of its rights or remedies, either party may terminate this Agreement (or any part thereof) with immediate effect by giving written notice to the other party if:
the other party: (i) commits a material breach of this Agreement and (if that breach is remediable) fails to remedy that breach within a period of 30 days after being notified in writing to do so; or (ii) commits a series of breaches of this Agreement which when taken together have the impact or effect of or otherwise amount to a material breach;
a Force Majeure Event continues for a period exceeding two (2) months;
the other party becomes subject to an Insolvency Event; or
the party reasonably determines that it has become unlawful to perform its obligations under the Agreement.
13.4 Any provision of this Agreement that expressly or by implication is intended to come into or continue in force on or after termination of this Agreement shall remain in full force and effect.
13.5 Termination or expiry of this agreement shall not affect any rights, remedies, obligations or liabilities of the parties that have accrued up to the date of termination or expiry, including the right to claim damages in respect of any breach of the agreement which existed at or before the date of termination or expiry.
13.6 On any termination of this Agreement for any reason or expiry of the Term, the Customer shall:
immediately pay any outstanding amounts owed to the Supplier under this Agreement; and
within a reasonable period of termination or expiry ensure that there is no further use of the Services in any of the Customer's products, applications or services, provided that the Customer shall not be obliged to remove from its products, applications and services any Data or Derived Data incorporated into them in accordance with this Agreement before termination or expiry.
13.7 On termination of this Agreement for any reason (save for termination for material breach by the Customer under Clause 13.3(a) or for failure to pay amounts due under Clause 13.2(a)), the Supplier shall refund any Fees for the Initial Period or Renewal Period (as relevant) paid in advance by the Customer as at the date of termination or expiry (less a reasonable sum taking into account the remaining length of the Initial Period or Renewal Period and the Customer's use of the Data or the Materials to the date of termination). If the Supplier terminates this Agreement under Clause 13.3(a) due to the Customer’s material breach, or under Clause 13.2(a) for the Customer’s failure to pay amounts due, the Customer shall not be entitled to any refund.
Neither party shall be responsible for any failure to fulfill any obligation for so long as, and to the extent to which, the fulfillment of such obligation is impeded by a Force Majeure Event, and the affected party:
has promptly notified the other party of any circumstances which may result in failure to perform its obligations;
uses its best endeavours to minimize the adverse consequences that any failure in performance of its obligations might have, and to return the performance of such obligations to normal as soon as possible.
15.1 This Agreement is personal to the Customer and it shall not assign, transfer, mortgage, charge, sub-contract, or otherwise transfer any of its rights and obligations under this Agreement without the prior written consent of the Supplier (which is not to be unreasonably withheld or delayed). Notwithstanding the preceding, Customer may assign any right or obligations to any of its Affiliates. For the purposes of this Agreement, the term “Affiliate” shall mean in respect of a party, any other entity that directly or indirectly controls, is controlled by or is under common control with, that party.
15.2 The Supplier may at any time assign, transfer, mortgage, charge, sub-contract, or otherwise transfer any of its rights and obligations under this Agreement without the consent of the Customer.
No failure or delay by a party to exercise any right or remedy provided under the Agreement or by law, or a single or partial exercise of such right or remedy, shall constitute a waiver of that or any other right or remedy, nor shall it preclude or restrict the further exercise of that or any other right or remedy.
Except as expressly provided in this Agreement, the rights and remedies provided under this Agreement are in addition to, and not exclusive of, any rights or remedies provided by law.
All notices, demands and other communications provided for or permitted under this Agreement will be made in writing to the parties at the addresses on the Cover Page and will be sent by email and will be deemed received upon receipt of a delivery receipt.
19.1 This Agreement represents the entire agreement between the parties and supersedes all previous discussions, correspondence, negotiations, arrangements, understandings and agreements between them, whether written or oral, relating to its subject matter.
19.2 Each party acknowledges that in entering into this Agreement it does not rely on, and shall have no remedies in respect of, any representation or warranty (whether made innocently or negligently) that is not set out in this Agreement.
19.3 Each party agrees that it shall have no claim for innocent or negligent misrepresentation based on any statement in this agreement.
The Supplier reserves the right to change the Agreement at any time and the Customer should revisit the terms and conditions at Tardis.dev before making a purchase to ensure that it is fully aware of the current terms and conditions.
Nothing in this agreement is intended to, or shall be deemed to, establish any partnership or joint venture between any of the parties, constitute any party the agent of another party, or authorise any party to make or enter into any commitments for or on behalf of any other party.
Except as expressly provided in this Agreement, a person who is not a party to this Agreement shall not have any rights under the Contracts (Rights of Third Parties) Act 1999 or otherwise to enforce any term of this Agreement.
In relation to licensing Coinbase market data additional restrictions apply.
23.1 Customer is prohibited from further disseminating Coinbase Data and/or Derivative Data, and using or permitting the use of Coinbase Data, Derivative Data and/or any part thereof for any Prohibited Use that means any of:
use of Coinbase Data or Derivative Data that violates any applicable laws or the terms and conditions of this Agreement,
use of Coinbase Data to create Financial Products,
display or redistribution of any Coinbase Data or Derivative Data
to any third party who is not a Customer,
or by any Customer to any third party,
authorization of any Person to do any of the foregoing,
any other use not expressly permitted under this Agreement, and
any other use of the Data not expressly permitted under the Coinbase Market Data Policy.
23.2 Customer agrees that Coinbase may inspect Customer’s use of Coinbase Data and/or Derivative Data by Licensee and its agents upon ten (10) days advance notice to Customer; and Supplier or Coinbase has the ability to modify the agreement (or any other agreements related to the use of Coinbase Data or Derivative Data) with any Customer as Coinbase may from time to time specify, except that Supplier may continue to provide Coinbase Data and/or Derivative Data to a Customer without affecting the modification for ninety (90) days from that receipt; Supplier and Coinbase shall discontinue its provision of Coinbase Data and/or Derivative Data to a Customer thereafter if the Customer has not agreed to the modifications after this ninety (90) day period.
23.3 Customer agrees that for reporting purposes some of the customer specific information (such as customer name or address, for example) may be required by and reported to Coinbase.
The Customer shall:
(a) limit access to the Services to the Customer Users, which shall include Customer’s Affiliates;
(b) only make copies of the Data and the Materials to the extent reasonably necessary for the following purposes: back-up, mirroring (and similar availability enhancement techniques), security, disaster recovery and testing;
(c) comply with all applicable law and relevant regulations, and not use the Services for any purpose contrary to any applicable law or relevant regulation, or any regulatory code, guidance or request;
(d) not extract, reutilise, use, exploit, redistribute, resell, redisseminate, copy or store the Data or the Materials for any purpose not expressly permitted by this Agreement;
(e) not copy, modify, decompile, reverse engineer or create derivative works from the Software, except to the extent permitted by any applicable law; and
(f) not do anything which may damage the reputation of the Supplier, the Data or the Services, including by way of using the Data (wholly or in part) in any manner which is pornographic, racist or that incites religious hatred or violence.
Tardis.dev privacy policy — what data we collect, how we collect it and what we do with this data
This Privacy Policy explains how Personal Information about our (potential) customers and other individuals using our services is collected, used and disclosed by Tardis.dev and its respective affiliates ("us", "we", "our" or "Tardis.dev"). This Privacy Policy describes our privacy practices in relation to the use of our websites (including any customer portal or interactive customer website) (https://tardis.dev and https://docs.tardis.dev), services, solutions, tools, and related applications, services, and programs, including research and marketing activities, offered by us (the "Services"), as well as your choices regarding use, access, storage and correction of Personal Information. It also describes how we collect, use, disclose and otherwise process Personal Information collected in relation to our Services and otherwise in the course of our business activities. This Privacy Policy does not apply to Personal Information collected about our employees, applicants or other personnel.
By using our Services or by agreeing to our Terms of Service required to use our Services, you agree to the collection, usage, storage and disclosure of information described in this Privacy Policy.
Our Services may contain links to other websites or services; and information practices and/or the content of such other websites or services shall be governed by the privacy statements of such other websites or services.
Here we describe what information we collect, what we use it for, which third parties we use to help us and give links to the privacy policies of those third parties for you to read:
To manage payments, we use Paddle Ltd (https://paddle.com) who will store your payment details such as your payment card number, expiration date or security code. Please refer to their Privacy Statements https://paddle.com/gdpr and https://paddle.com/privacy/
We track and store usage behavior, such as which links are clicked and API endpoint usage statistics, so we can optimize the services we provide to you. We use Google Analytics with the anonymized IP feature enabled for this purpose. Please refer to their Privacy Statement https://www.google.com/policies/privacy/
We use Crisp to provide livechat support functionality. Please refer to their Privacy Statement https://crisp.chat/en/privacy/
When you contact us for support or other customer service requests, we can maintain records related to such requests, including any information provided by you related to such support or service requests and contact you back about our services with relevant information. We may also obtain Personal Information about you from third parties, such as LinkedIn, Facebook, Twitter and other publicly accessible sources.
We may use your Personal Information to contact you with marketing or promotional materials and other information communications related to Tardis.dev. If you no longer wish to receive marketing or promotional communications related to us, you can opt out at any moment by using the unsubscribe button in the email or by emailing privacy@tardis.dev to request us to stop sending you such communications. Such a request will be processed immediately by us, but in any event within two (2) business days.
To keep track of which users should have access to paid versions of our services and to handle API authentication and authorization, we store user details such as email address, name, subscription details and IP address in Cloudflare data store (encrypted at rest). Please refer to their Privacy Statement https://www.cloudflare.com/privacypolicy
Unless specified otherwise, all data requested by Tardis.dev is mandatory and failure to provide this data may make it impossible for us to provide our services. In cases where Tardis.dev website specifically states that some data is not mandatory, users are free not to communicate this data without consequences to the availability or the functioning of the service. Users who are uncertain about which personal data is mandatory are welcome to contact us at privacy@tardis.dev.
We may publicly display aggregated anonymous data to help communicate what we know about how our services are typically used.
When you provide us with personal information to make a purchase or return a purchase, we imply that you consent to us collecting this information and using it for that specific reason only.
If we ask for your personal information so that we can send you communications in the future, we will either ask you directly for your expressed consent or provide you with an opportunity to say no afterwards.
If you wish to withdraw the consent you have given to us to collect, store or use your information, please contact us at privacy@tardis.dev.
We may disclose your personal information if we are required by law to do so or if you violate our Terms of Service.
Third-party providers will collect, use and disclose your information to the extent necessary to allow them to perform the services they provide to us. Third-party service providers have their own privacy policies in respect to the information we are required to provide to them. For these providers, we recommend that you read their privacy policies so you can understand the manner in which your personal information will be handled by these providers.
In particular, remember that certain providers may be located in or have facilities that are located in a different jurisdiction than either you or us. Your information may become subject to the laws of the jurisdiction(s) in which a third party service provider or its facilities are located.
When you click on links that appear in any of the content we provide to you, those links may direct you to third party sites. We are not responsible for the privacy practices of other sites and encourage you to read their privacy statements.
To protect your personal information, we take reasonable precautions to make sure it is not inappropriately lost, misused, accessed, disclosed, altered or destroyed.
Our website uses cookies to track anonymized usage behavior and to personalize content.
We retain personal information that you provide us as long as we consider it potentially useful in contacting you about our services, or as needed to comply with our legal obligations, resolve disputes and enforce our agreements.
By using Tardis.dev website and services, you represent that you are at least the age of majority in your state or province of residence, or that you are the age of majority in your state or province of residence and you have given us your consent to allow any of your minor dependents to use this site.
We reserve the right to modify this privacy policy at any time, so please review it frequently. Changes and clarifications will take effect immediately upon their posting on the website. If we make material changes to this policy, we will notify you here that it has been updated, so that you are aware of what information we collect, how we use it, and under what circumstances, if any, we use and/or disclose it.
If we are acquired or merged with another company, your information may be transferred to the new owners.
If you would like to access, correct, amend or delete any personal information we have about you, register a complaint, or simply want more information you should email us at privacy@tardis.dev.
Locally runnable server with built-in data caching, providing both tick-level historical and consolidated real-time cryptocurrency market data via HTTP and WebSocket APIs
Tardis-machine is a locally runnable server with built-in data caching that uses the Tardis.dev HTTP API under the hood. It provides both tick-level historical and consolidated real-time cryptocurrency market data via its HTTP and WebSocket APIs and is available via npm and Docker.
efficient data replay API endpoints returning historical market data for whole time periods (in contrast to the Tardis.dev HTTP API where a single call returns data for a single minute time period)
exchange-native market data APIs
tick-by-tick historical market data replay in exchange-native format
WebSocket API providing historical market data replay from any given past point in time with the same data format and 'subscribe' logic as real-time exchanges' APIs - in many cases existing exchanges' WebSocket clients can be used to connect to this endpoint
consistent format for accessing market data across multiple exchanges
consolidated real-time data streaming connecting directly to exchanges' WebSocket APIs
customizable order book snapshots and trade bars data types
transparent historical local data caching (cached data is stored on disk in compressed GZIP format and decompressed on demand when reading the data)
support for top cryptocurrency exchanges: BitMEX, Deribit, Binance, Binance Futures, FTX, OKEx, Huobi Global, Huobi DM, bitFlyer, Bitstamp, Coinbase Pro, Kraken Futures, Gemini, Kraken, Bitfinex, Bybit, OKCoin, CoinFLEX and more
Pull and run the latest version of the tardisdev/tardis-machine image:
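```sh
# sketch - pulls and runs the latest tardisdev/tardis-machine image,
# with ports and the TM_API_KEY env variable as described below
docker run -p 8000:8000 -p 8001:8001 -e "TM_API_KEY=YOUR_API_KEY" -d tardisdev/tardis-machine
```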
Tardis-machine server's HTTP endpoints will be available on port 8000 and WebSocket API endpoints on port 8001. Your API key will be passed via ENV variable (TM_API_KEY) — simply replace YOUR_API_KEY with the API key you've received via email.
The command above does not use persistent volumes for local caching (each Docker restart will result in losing the local data cache). In order to use, for example, ./host-cache-dir as a persistent volume (bind mount) cache directory, run:
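```sh
# sketch - same as above, but with ./host-cache-dir bind-mounted
# as the cache location (TM_CACHE_DIR defaults to /.cache)
docker run -v "$(pwd)/host-cache-dir:/.cache" -p 8000:8000 -p 8001:8001 \
  -e "TM_API_KEY=YOUR_API_KEY" -d tardisdev/tardis-machine
```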
Since using volumes can cause issues, especially on Windows, it's fine to run the Docker image without them, with the caveat of a potentially poor local cache hit ratio after each container restart.
You can set the following environment config variables to configure the tardis-machine server:

| name | default | description |
| --- | --- | --- |
| TM_API_KEY | - | API key for Tardis.dev HTTP API |
| TM_PORT | 8000 | HTTP port on which the server will be running; the WebSocket port is always this value + 1 (8001 with port set to 8000) |
| TM_CACHE_DIR | /.cache | path to local dir that will be used as cache location |
| TM_CLUSTER_MODE | false | - |
| TM_DEBUG | false | server will print verbose debug logs to stdout if set to true |
| TM_CLEAR_CACHE | false | server will clear local cache dir on startup if set to true |
Install and run the tardis-machine server via the npx command:
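```sh
# runs the server without a global install
npx tardis-machine --api-key=YOUR_API_KEY
```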
or install globally via npm:
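```sh
npm install -g tardis-machine
```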
and then run:
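```sh
tardis-machine --api-key=YOUR_API_KEY
```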
Tardis-machine server's HTTP endpoints will be available on port 8000 and WebSocket API endpoints on port 8001. Your API key will be passed via the --api-key config flag — simply replace YOUR_API_KEY with the API key you've received via email.
You can set the following CLI config flags when starting the tardis-machine server installed via npm:

| name | default | description |
| --- | --- | --- |
| --api-key | - | API key for Tardis.dev HTTP API |
| --port | 8000 | HTTP port on which the server will be running; the WebSocket port is always this value + 1 (8001 with port set to 8000) |
| --cache-dir | <os.tmpdir>/.tardis-cache | path to local dir that will be used as cache location - if not provided, the default temp dir for given OS will be used |
| --cluster-mode | false | - |
| --debug | false | server will print verbose debug logs to stdout if set to true |
| --clear-cache | false | server will clear local cache dir on startup if set to true |
| --help | - | shows CLI help |
| --version | - | shows tardis-machine version number |
Exchange-native market data API endpoints provide historical data in exchange-native format. The main difference between the HTTP and WebSocket endpoints is the logic of requesting data:
HTTP API accepts the request options payload via a query string param
WebSocket API accepts exchanges' specific 'subscribe' messages that define what data will then be "replayed" and sent to the WebSocket client
HTTP GET /replay?options={options}
Returns historical market data messages in exchange-native format for the given replay options query string param. A single streaming HTTP response returns data for the whole requested time period as NDJSON.
We're working on providing more samples and dedicated client libraries in different languages, but in the meantime, to consume HTTP /replay API responses in your language of choice, you should (see the Python sketch after this list):
Provide a url encoded JSON options object via the options query string param when sending the HTTP request
Parse the HTTP response stream line by line as it's returned; buffering the whole response in memory may result in slow performance and memory overflows
Parse each response line as JSON containing messages in exchange-native format
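For example, a minimal Python sketch following these steps (assumes the server runs locally on the default port and uses the requests library; the BitMEX trade channel, symbol and dates are just illustrative):

import json
import requests  # third-party HTTP client with streaming support

options = {
    "exchange": "bitmex",
    "filters": [{"channel": "trade", "symbols": ["XBTUSD"]}],
    "from": "2019-07-01",
    "to": "2019-07-02",
}

# requests url encodes the JSON options object into the query string
with requests.get(
    "http://localhost:8000/replay",
    params={"options": json.dumps(options)},
    stream=True,
) as response:
    response.raise_for_status()
    # parse the NDJSON response line by line instead of buffering it whole
    for line in response.iter_lines():
        if not line:
            continue  # empty lines mark disconnects when withDisconnects is set
        entry = json.loads(line)
        print(entry["localTimestamp"], entry["message"])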
The HTTP /replay endpoint accepts a required options query string param in url encoded JSON format.
name | type | default | description
exchange | string | - | requested exchange id; see the list of supported exchanges for valid ids
filters | {channel:string, symbols?: string[]}[] | [] | optional filters of the requested historical data feed; see the historical data details for each exchange for allowed channels and symbols
from | string | - | replay period start date (UTC) in ISO 8601 format, e.g., 2019-04-01
to | string | - | replay period end date (UTC) in ISO 8601 format, e.g., 2019-04-02
withDisconnects | boolean | undefined | when set to true, the response includes empty lines (\n) that mark events when the real-time WebSocket connection that was used to collect the historical data got disconnected
The streamed HTTP response provides data in NDJSON format (newline delimited JSON); each response line is a JSON object with a market data message in exchange-native format plus a local timestamp:
localTimestamp - date when the message has been received, in ISO 8601 format
message - JSON with exactly the same format as provided by the requested exchange's real-time feeds
Sample response
WebSocket /ws-replay?exchange={exchange}&from={fromDate}&to={toDate}
Exchanges' WebSocket APIs are designed to publish real-time market data feeds, not historical ones. The tardis-machine WebSocket /ws-replay API fills that gap and allows "replaying" historical market data from any given past point in time with the same data format and 'subscribe' logic as real-time exchanges' APIs. In many cases existing exchanges' WebSocket clients can be used to connect to this endpoint just by changing the URL, and receive historical market data in exchange-native format for the date ranges specified in the URL query string params.
After the connection is established, the client has 2 seconds to send subscription payloads, and then the market data replay starts.
If two clients connect at the same time requesting data for different exchanges and provide the same session key via the query string param, the data being sent to those clients will be synchronized (by local timestamp).
If you already use an existing WebSocket client that connects to and consumes a real-time exchange market data feed, in most cases you can use it to connect to the /ws-replay API as well, just by changing the URL endpoint.
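For example, a minimal Python sketch using the websockets library (the subscribe payload follows BitMEX's real-time API; server address, symbol and dates are illustrative):

import asyncio
import json
import websockets  # third-party WebSocket client

async def replay():
    url = (
        "ws://localhost:8001/ws-replay"
        "?exchange=bitmex&from=2019-07-01&to=2019-07-02"
    )
    async with websockets.connect(url) as ws:
        # exactly the same subscribe message as for the real-time BitMEX API
        await ws.send(json.dumps({"op": "subscribe", "args": ["trade:XBTUSD"]}))
        # historical messages then arrive in exchange-native format
        async for message in ws:
            print(message)

asyncio.run(replay())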
name | type | default | description
exchange | string | - | requested exchange id; see the list of supported exchanges for valid ids
from | string | - | replay period start date (UTC) in ISO 8601 format, e.g., 2019-04-01
to | string | - | replay period end date (UTC) in ISO 8601 format, e.g., 2019-04-02
session | string | undefined | optional replay session key; when specified and multiple clients use it when connecting at the same time, the data being sent to those clients is synchronized (by local timestamp)
Normalized market data API endpoints provide data in a unified format across all supported exchanges. Both the HTTP /replay-normalized and WebSocket /ws-replay-normalized APIs accept the same replay options payload via a query string param. It's mostly a matter of preference which protocol to use, but the WebSocket /ws-replay-normalized API also has its real-time counterpart, /ws-stream-normalized, which connects directly to exchanges' real-time WebSocket APIs. This opens the possibility of seamless switching between real-time streaming and historical normalized market data replay.
HTTP GET /replay-normalized?options={options}
Returns historical market data for the data types specified via the query string. A single streaming HTTP response returns data for the whole requested time period as NDJSON. See supported data types, which include normalized trades, order book changes, customizable order book snapshots etc.
We're working on providing more samples and dedicated client libraries in different languages, but in the meantime, to consume HTTP /replay-normalized API responses in your language of choice, you should (see the sketch after the options table below):
Provide url encoded JSON options via the options query string param when sending the HTTP request
Parse the HTTP response stream line by line as it's returned; buffering the whole response in memory may result in slow performance and memory overflows
Parse each response line as JSON containing normalized data messages.
The HTTP /replay-normalized endpoint accepts a required options query string param in url encoded JSON format.
The options JSON needs to be an object or an array of objects with the fields specified below. If an array is provided, data requested for multiple exchanges is returned synchronized (by local timestamp).
name | type | default | description
exchange | string | - | requested exchange id; see the list of supported exchanges for valid ids
symbols | string[] | undefined | optional symbols of the requested historical data feed; see the historical data details to get allowed symbols for the requested exchange
from | string | - | replay period start date (UTC) in ISO 8601 format, e.g., 2019-04-01
to | string | - | replay period end date (UTC) in ISO 8601 format, e.g., 2019-04-02
dataTypes | string[] | - | array of normalized data types for which historical data will be returned
withDisconnectMessages | boolean | undefined | when set to true, the response includes disconnect messages that mark events when the real-time WebSocket connection that was used to collect the historical data got disconnected
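For example, a minimal Python sketch (same streaming pattern as for /replay; exchange, symbols, dates and data types are illustrative):

import json
import requests  # third-party HTTP client with streaming support

options = {
    "exchange": "bitmex",
    "symbols": ["XBTUSD"],
    "from": "2019-07-01",
    "to": "2019-07-02",
    "dataTypes": ["trade", "book_snapshot_10_100ms"],
}

with requests.get(
    "http://localhost:8000/replay-normalized",
    params={"options": json.dumps(options)},
    stream=True,
) as response:
    response.raise_for_status()
    for line in response.iter_lines():
        if not line:
            continue
        # each line is a normalized message, e.g. a trade or a book snapshot
        print(json.loads(line))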
WebSocket /ws-replay-normalized?options={options}
Sends normalized historical market data for the data types specified via the query string. See supported data types, which include normalized trades, order book changes, customizable order book snapshots etc.
We're working on providing more samples and dedicated client libraries in different languages, but in the meantime, to consume WebSocket /ws-replay-normalized API responses in your language of choice, you should (see the sketch after the options table below):
Provide url encoded JSON options via the options query string param when connecting to the WebSocket /ws-replay-normalized endpoint
Parse each received WebSocket message as JSON containing normalized data.
The WebSocket /ws-replay-normalized endpoint accepts a required options query string param in url encoded JSON format.
The options JSON needs to be an object or an array of objects with the fields specified below. If an array is provided, data requested for multiple exchanges is sent synchronized (by local timestamp).
name | type | default | description
exchange | string | - | requested exchange id; see the list of supported exchanges for valid ids
symbols | string[] | undefined | optional symbols of the requested historical data feed; see the historical data details to get allowed symbols for the requested exchange
from | string | - | replay period start date (UTC) in ISO 8601 format, e.g., 2019-04-01
to | string | - | replay period end date (UTC) in ISO 8601 format, e.g., 2019-04-02
dataTypes | string[] | - | array of normalized data types for which historical data will be provided
withDisconnectMessages | boolean | undefined | when set to true, also sends messages that mark events when the real-time WebSocket connection that was used to collect the historical data got disconnected
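For example, a minimal Python sketch requesting a synchronized replay for two exchanges at once (the options array is url encoded into the query string; exchanges, symbols and dates are illustrative):

import asyncio
import json
import urllib.parse
import websockets  # third-party WebSocket client

options = [
    {
        "exchange": "bitmex",
        "symbols": ["XBTUSD"],
        "from": "2019-07-01",
        "to": "2019-07-02",
        "dataTypes": ["trade"],
    },
    {
        "exchange": "deribit",
        "symbols": ["BTC-PERPETUAL"],
        "from": "2019-07-01",
        "to": "2019-07-02",
        "dataTypes": ["trade"],
    },
]

async def replay_normalized():
    url = "ws://localhost:8001/ws-replay-normalized?options=" + urllib.parse.quote(
        json.dumps(options)
    )
    async with websockets.connect(url) as ws:
        # messages from both exchanges arrive synchronized by local timestamp
        async for message in ws:
            print(json.loads(message))

asyncio.run(replay_normalized())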
WebSocket /ws-stream-normalized?options={options}
Sends normalized real-time market data for the data types specified via the query string. See supported data types, which include normalized trades, order book changes, customizable order book snapshots etc.
Doesn't require an API key, as it connects directly to exchanges' real-time WebSocket APIs and transparently restarts closed, broken or stale connections (open connections with no data sent for a specified amount of time).
When options is an array, it provides consolidated real-time market data streaming: a single consolidated real-time data stream for all exchanges specified in the options array.
We're working on providing more samples and dedicated client libraries in different languages, but in the meantime, to consume WebSocket /ws-stream-normalized API responses in your language of choice, you should (see the sketch after the options table below):
Provide url encoded JSON options via the options query string param when connecting to the WebSocket /ws-stream-normalized endpoint
Parse each received WebSocket message as JSON containing normalized data.
The WebSocket /ws-stream-normalized endpoint accepts a required options query string param in url encoded JSON format.
The options JSON needs to be an object or an array of objects with the fields specified below. If an array is specified, the API provides a single consolidated real-time data stream for all specified exchanges (as in the examples above).
name | type | default | description
exchange | string | - | requested exchange id; see the list of supported exchanges for valid ids
symbols | string[] | undefined | optional symbols of the requested real-time data feed
dataTypes | string[] | - | array of normalized data types for which real-time data will be provided
withDisconnectMessages | boolean | undefined | when set to true, sends messages anytime an underlying exchange real-time WebSocket connection gets disconnected
timeoutIntervalMS | number | 10000 | time in milliseconds after which the connection to the exchange's real-time WebSocket API is restarted if no message has been received
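For example, a minimal Python sketch consuming a consolidated real-time stream (no API key needed; the exchange ids, symbols and data types are illustrative):

import asyncio
import json
import urllib.parse
import websockets  # third-party WebSocket client

options = [
    {"exchange": "bitmex", "symbols": ["XBTUSD"], "dataTypes": ["trade"]},
    {"exchange": "deribit", "symbols": ["BTC-PERPETUAL"], "dataTypes": ["trade"]},
]

async def stream_normalized():
    url = "ws://localhost:8001/ws-stream-normalized?options=" + urllib.parse.quote(
        json.dumps(options)
    )
    async with websockets.connect(url) as ws:
        # a single consolidated stream covering all exchanges in the options array
        async for message in ws:
            print(json.loads(message))

asyncio.run(stream_normalized())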
trade - individual trade data
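As an illustration, a normalized trade message has roughly the following shape (a sketch of the normalized format; field values are made-up examples):

{
    "type": "trade",
    "symbol": "XBTUSD",
    "exchange": "bitmex",
    "id": "exchange-native-trade-id",
    "price": 7996.5,
    "amount": 50,
    "side": "sell",
    "timestamp": "2019-07-01T00:00:01.123Z",
    "localTimestamp": "2019-07-01T00:00:01.133Z"
}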
book_change - initial L2 (market by price) order book snapshot (isSnapshot=true) plus incremental updates for each order book change. Please note that amount is the updated amount at that price level, not a delta. An amount of 0 indicates the price level can be removed.
derivative_ticker - derivative instrument ticker info sourced from real-time ticker & instrument channels.
book_snapshot_{number_of_levels}_{snapshot_interval}{time_unit} - order book snapshot for the selected number_of_levels (top bids and asks), snapshot_interval and time_unit.
When snapshot_interval is set to 0, snapshots are taken anytime the order book state within the specified levels has changed; otherwise, snapshots are taken anytime snapshot_interval time has passed and there was an order book state change within the specified levels. Order book snapshots are computed from exchanges' real-time streaming L2 (market by price) order book data.
book_snapshot_10_0ms - provides top 10 levels tick-by-tick order book snapshots
book_snapshot_50_100ms - provides top 50 levels order book snapshots taken at 100 millisecond intervals
book_snapshot_30_10s - provides top 30 levels order book snapshots taken at 10 second intervals
quote is an alias of book_snapshot_1_0ms - provides top of the book (best bid/ask) tick-by-tick order book snapshots
quote_10s is an alias of book_snapshot_1_10s - provides top of the book (best bid/ask) order book snapshots taken at 10 second intervals
Allowed time_unit suffixes:
ms - milliseconds
s - seconds
m - minutes
trade_bar_{aggregation_interval} - trades data in aggregated form, known as OHLC, candlesticks, klines etc. Not only the most common time based aggregation is supported, but volume and tick count based as well. Bars are computed from tick-by-tick raw trade data; if no trades happened in a given interval, no bar is produced.
trade_bar_10ms - provides time based trade bars with 10 millisecond intervals
trade_bar_5m - provides time based trade bars with 5 minute intervals
trade_bar_100ticks - provides tick based trade bars with 100 tick (individual trades) intervals
trade_bar_100000vol - provides volume based trade bars with 100 000 volume intervals
Allowed suffixes:
ms - milliseconds
s - seconds
m - minutes
ticks - number of ticks
vol - volume size
disconnect - message that marks events when the real-time WebSocket connection that was used to collect the historical data got disconnected.