
General

What is Tardis.dev and what is your unique value proposition?

Tardis.dev provides the most comprehensive and granular cryptocurrency market data products in the industry and offers:

  • access to high frequency historical data, including the most granular tick-level order book updates and trades for both derivatives and top spot cryptocurrency exchanges

  • non-standard data types: historical funding, open interest, indexes, liquidations and more

  • fast and convenient data access via both the API and downloadable CSV files

  • data sourced from real-time WebSocket market data feeds, with complete control and transparency over how the data is being recorded

  • exchange-native and normalized data format support

  • time-machine market replay allowing reconstruction of the state of the limit order book at any given moment in time across all supported cryptocurrency markets

  • always available market data backup

  • fair, transparent pricing with discounts for solo traders, small prop shops and academic researchers

  • consolidated real-time market data streaming support via Tardis.dev open source libraries that connect directly to exchanges' public WebSocket APIs (no API key required)

Data use cases

  • market microstructure and order book dynamics research

  • backtesting and optimization of trading strategies

  • trading execution optimization

  • tick-level granularity market simulation

  • full historical order book reconstruction at any given point in time

  • liquidity and lead-lag analysis

  • training machine learning models

  • alpha generation

  • designing quantitative models

  • academic research

  • data visualizations

Which exchanges, instruments and currency pairs are supported?

In total, over 40,000 distinct instruments & currency pairs are supported across leading derivatives and spot cryptocurrency exchanges. We collect and provide data for all instruments & currency pairs available on a given exchange, with some exceptions for spot exchanges, where we collect high cap currency pairs only (due to exchanges' API limitations).

See the historical data details for each supported exchange.

  • Binance

    • USDT Futures

    • COIN Futures

    • Spot

  • FTX

  • OKX

    • Futures

    • Swap

    • Options

    • Spot

  • Huobi

    • Futures

    • COIN Swaps

    • USDT Swaps

    • Global (Spot)

  • Bitfinex

    • Derivatives

    • Spot

  • Coinbase Pro

  • Kraken Futures (Crypto Facilities)

  • Kraken

  • Bitstamp

  • Gemini

  • Poloniex

  • Bybit

  • Bybit Spot

  • dYdX

  • WOO X

  • Kucoin

    • Spot

  • Blockchain.com

  • Upbit

  • Phemex

  • Delta

  • Ascendex (BitMax)

  • FTX US

  • Binance US

  • Gate.io Futures

  • Gate.io (high caps)

  • Bitnomial

  • Crypto.com

  • OKCoin

  • bitFlyer

  • HitBTC

  • CoinFLEX

  • Binance Jersey

  • Binance DEX

Do you provide discounts?

We do provide discounts in a transparent form via different subscription types.

Do you offer free trials?

Yes, if you'd like to test the service (data quality, coverage, API performance etc.) we offer generous free trials. Simply reach out to us and we'll set up a trial account for you.

What does professional support mean?

Our support team has in-depth knowledge of market data and exchanges' API peculiarities, as well as programming and data analysis expertise. You get answers straight from people whose day-to-day job is overseeing and maintaining the market data collection process and infrastructure.

For business subscriptions we provide a dedicated communication channel (via Telegram Messenger, email or Zoom calls) for your company, where our team is on standby every business day (7AM - 3PM UTC) to answer any questions you may have. For pro subscriptions and one-off purchases we provide email based support with a 24-48 business hours initial response time.

For academic and solo subscriptions there is no dedicated support provided, only self-service.

Why would I use your services if I can collect data by myself?

Since cryptocurrency exchanges' market data APIs are public, anyone can use them to collect the data, but it's a time consuming and resource intensive undertaking (the exchanges we support publish ~1000GB of new data every day) that requires investment in proper infrastructure and constant monitoring and oversight (exchange API changes, rate-limit monitoring, new exchange integrations, unexpected connection issues, latency monitoring etc.), not to mention the implementation costs of data collection, storage and distribution services. All in all, we think our offering is comprehensive, transparent and fair, provides good value and saves you time and money in comparison to an in-house solution, allowing you to focus on your core objective, not on data management intricacies.

How can I download the data?

You can access historical market data via the HTTP API, which provides raw data in exchange-native format, or via downloadable CSV datasets with trades, incremental order book L2 updates, order book snapshots, options chains, quotes, derivative tickers (open interest, funding, mark price, index price) and liquidations. Our client libs provide data in normalized format as well, which can be more flexible than CSV datasets for some use cases, but is also slower to download due to on-demand, client-side data normalization overhead in comparison to ready to download CSV files.

How far back is the historical data available?

Data is available since 2019-03-30 for the majority of the supported exchanges (that existed at that time).

Do you provide historical market data in CSV flat files?

Yes, see the downloadable CSV files documentation for more details.

What programming languages are supported?

Any programming language that can communicate over HTTPS can communicate with our HTTP API.

We also provide official Python and Node.js clients that offer fast and convenient access to tick-level historical market data.

Finally, our open source, locally runnable tardis-machine server, with built-in local data caching, provides market data normalization, custom order book snapshot capabilities and real-time market data streaming support that connects directly to exchanges' WebSocket APIs. It provides both streaming HTTP and WebSocket endpoints returning market data for whole time periods (in contrast to the Tardis.dev HTTP API, where a single call returns data for a single minute time period) and is available via npm and as a Docker image.

What API protocols can be used to access market data?

Historical market data provided by the HTTP API can be accessed via HTTPS.

The locally runnable tardis-machine server provides both HTTP and WebSocket based APIs for accessing both historical and real-time market data.

How does time-machine market replay work?

Exchanges' market data WebSocket APIs are designed to publish real-time feeds, not historical ones. The locally runnable tardis-machine server bridges that gap and allows "replaying" historical market data from any given past point in time with the same data format and 'subscribe' logic as real-time exchanges' APIs. In many cases existing exchanges' WebSocket clients can be used to connect to this endpoint just by changing the URL, and receive market data in exchange-native format for date ranges specified in URL query string params.
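As a sketch, a replay URL for such an endpoint can be built like this (the `/ws-replay` path, default port 8001 and the `exchange`/`from`/`to` query parameters follow tardis-machine's documented defaults - verify against the version you run):

```python
from urllib.parse import urlencode

def replay_url(exchange: str, from_date: str, to_date: str,
               base: str = "ws://localhost:8001/ws-replay") -> str:
    """Build a tardis-machine WebSocket replay URL for a date range."""
    query = urlencode({"exchange": exchange, "from": from_date, "to": to_date})
    return f"{base}?{query}"

# An existing exchange WebSocket client can connect to this URL instead of
# the live exchange endpoint and receive exchange-native historical messages.
url = replay_url("bitmex", "2019-06-01", "2019-06-02")
```

The client's own subscribe messages then work unchanged, since the replay endpoint mimics the exchange's real-time protocol.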

Do you support consolidated real-time market data streaming?

We do not provide a hosted real-time market data API, as we think that given everyone can access exchanges' APIs directly for free without restrictions, relying on a 3rd party for such a crucial piece of infrastructure does not make sense (additional latency and another SPOF). Instead we developed a locally runnable (self hosted) server and open source libraries that offer consolidated real-time normalized market data streaming capabilities, connect directly to exchanges' WebSocket APIs and are completely free to use.

Are there any rate-limits for the API?

There are no rate limits for the downloadable CSV files API.

The raw data replay API for professional level subscriptions is limited to 30 million requests per day and up to 60 concurrent requests. An API key can be used from only a single IP address at the same time. For business level subscriptions there are no rate limits for the raw data replay API, as long as your usage does not negatively impact other customers' API experience. If that's the case, we'll contact you via email and do our best to help sort it out - in most cases it's a download client bug that downloads the same data over and over in a loop.

How do I obtain my API key?

Your API key can be obtained on the Tardis.dev website via the order form. You'll receive it via email after a successful order.

What if my API key was compromised?

Contact us immediately and we will generate a new API key for you.

What is your infrastructure setup?

Market data collection

Highly available Google Cloud Platform Kubernetes Clusters located in London, UK (europe-west2 region) and Tokyo, Japan (asia-northeast1 region)

Market data storage

Two independent, geo-redundant, highly durable storage services

Market data distribution

High performance API servers deployed across a network of data centers around the globe

Do you provide an SLA?

We do not have a formal SLA in place yet, but all infrastructure is set up to provide the highest availability possible on both the data collection and distribution side, with a geo-redundant setup. Both data collection services and public APIs are constantly monitored from multiple locations, and our team is immediately notified in case of any issue. We don't practice maintenance that would affect API availability, but in the very rare circumstance that it's needed, we'll communicate it in advance. If a formal SLA is something that your business requires, contact us.



The earliest "available since" dates per exchange include:

    exchange               available since
    BitMEX                 2019-03-30
    Deribit                2019-03-30
    Binance USDT Futures   2019-11-17
    Binance COIN Futures   2020-06-16
    Binance Spot           2019-03-30

See each remaining exchange's historical data details page for its exact "available since" date.



    Frequently Asked Questions

    Got questions? We're happy to help!


    Got other Questions? We're happy to help!

    Simply contact us via email.


    Data

    What data types do you support?

    We provide the most comprehensive and granular market data on the market, sourced from real-time WebSocket APIs with complete control and transparency over how the data is being recorded.

    Via downloadable CSV data files the following normalized tick-level data types are available:

    • trades

    • incremental order book L2 updates

    • order book snapshots (top 25 and top 5 levels)

    • quotes

    • derivative tickers (open interest, funding rate, mark price, index price)

    • options chains

    • liquidations

    The raw data replay API, available with subscriptions, provides data in exchange-native format. See the historical data details pages to learn about the channels captured for each exchange. Each captured channel can be considered a different exchange specific data type (for example BitMEX's trade or orderBookL2 channels).

    We also provide the following normalized data types via our client libs (normalization is done client-side, using exchange-native data as the data source):

    • trades

    • order book L2 updates

    • order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)

    • quotes

    • derivative tick info (open interest, funding rate, mark price, index price)

    • liquidations

    • options summary

    • OHLCV

    • volume/tick based trade bars

    • hidden interest (i.e., iceberg orders)

    What does high frequency historical data mean?

    We always collect and provide data with the most granularity that an exchange can offer via its WebSocket API. High frequency can mean different things for different exchanges due to exchange API limitations. For example, for some exchanges it can mean L3 (market-by-order) data, for others all real-time order book updates, and for Binance Spot it means order book updates aggregated into 100ms intervals.

    How is historical raw market data sourced?

    Raw market data is sourced from exchanges' real-time WebSocket APIs. In cases where an exchange lacks a WebSocket API for a particular data type, we fall back to polling its REST API periodically, e.g., Binance Futures open interest data.

    See the next answer for more details on why the data source matters.

    Why does the data source matter and why do we use real-time WebSocket feeds as the data source vs periodically calling REST endpoints?

    Recording exchanges' real-time WebSocket feeds allows us to preserve and provide the most granular data that exchanges' APIs can offer, including data that is simply not available via their REST APIs, like tick level order book updates. Historical data sourced from WebSocket real-time feeds adheres to what you'll see when trading live and can be used to exactly replicate live conditions, even if it means occasional connection drops causing small data gaps, real-time data publishing delays especially during larger market moves, or crossed order books in some edge cases. We find that trade-off acceptable: even if the data isn't as clean and corrected as data sourced from REST APIs, it allows for more insight into market microstructure and various unusual exchange behaviors that simply can't be captured otherwise. A simple example would be latency spikes for many exchanges during increased volatility periods, where an exchange publishes trade/order book/quote WebSocket messages with larger than usual latency, or simply skips some of the updates and then returns them in one batch. Querying the REST API would result in a nice, clean trade history, but such data wouldn't fully reflect real actionable market behavior and would result in unrealistic backtesting results, breaking down in real-time scenarios.


    What L2 order book data can be used for?

    L2 data (market-by-price) includes bid and ask orders aggregated by price level and can be used to analyze, among other things:

    • order book imbalance

    • average execution cost

    • average liquidity away from midpoint

    • average spread

    We provide L2 data both in exchange-native format and as CSV datasets (top 25 and top 5 levels), as well as in normalized format via client-side normalization in our client libs.
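The first two analyses above can be sketched in a few lines (a simplified illustration, not Tardis.dev library code; levels are assumed as (price, size) pairs sorted best-first):

```python
def book_imbalance(bids, asks, depth=5):
    """Top-of-book volume imbalance in [-1, 1] from L2 levels."""
    bid_vol = sum(size for _, size in bids[:depth])
    ask_vol = sum(size for _, size in asks[:depth])
    return (bid_vol - ask_vol) / (bid_vol + ask_vol)

def avg_execution_price(levels, qty):
    """Volume-weighted fill price of a market order walking L2 levels."""
    remaining, cost = qty, 0.0
    for price, size in levels:
        take = min(remaining, size)
        cost += take * price
        remaining -= take
        if remaining == 0:
            break
    if remaining > 0:
        raise ValueError("not enough liquidity in provided levels")
    return cost / qty

# Buying 2 units against asks [(101, 1), (102, 2)] fills 1 @ 101 and
# 1 @ 102, i.e. an average execution price of 101.5.
```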

    What L3 order book data can be used for?

    L3 data (market-by-order) includes every order book order addition, update, cancellation and match, and can be used to analyze, among other things:

    • order resting time

    • order fill probability

    • order queue dynamics

    Historical L3 data is currently available via the API for a small number of exchanges (see the historical data details pages) - the remaining supported exchanges provide L2 data only.

    What is the maximum order book depth available for each supported exchange?

    We always collect full depth order book data as long as the exchange's WebSocket API supports it. The table below shows the current state of affairs for each supported exchange.

    Which exchanges support liquidations data type?

    Liquidations data is sourced from exchanges' WebSocket APIs when supported, with a fallback to polling REST APIs when WebSocket APIs do not support that data type, and can be accessed via the API or as downloadable CSV files.

    Do you provide historical options data?

    Yes, we do provide historical options data for options exchanges such as Deribit and OKX - see the options chains CSV data type and the exchange details pages.

    Do you provide historical futures data?

    We cover all leading derivatives exchanges, such as BitMEX, Deribit, Binance Futures, OKX, Huobi Futures, Bitfinex Derivatives, Kraken Futures, Bybit, dYdX and Phemex, among others.

    What is the difference between futures and perpetual swaps contracts?

    A futures contract is a contract that has an expiry date (for example a quarter ahead for quarterly futures). The futures contract price converges to the spot price as the contract approaches its expiration/settlement date. After a futures contract expires, the exchange settles it and replaces it with a new contract for the next period (the next quarter in our previous example).

    A perpetual swap contract, also commonly called a "perp", "swap", "perpetual" or "perpetual future" in crypto exchange nomenclature, is very similar to a futures contract, but does not have an expiry date (hence perpetual). In order to ensure that the perpetual swap contract price stays near the spot price, exchanges employ a mechanism called the funding rate. When the funding rate is positive, longs pay shorts. When the funding rate is negative, shorts pay longs. This mechanism can be quite nuanced and varies between exchanges, so it's best to study each contract specification to learn all the details (funding periods, mark price mechanisms etc.).
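The longs-pay-shorts mechanics reduce to simple arithmetic per funding interval (a simplified model for illustration; real exchanges differ in funding periods, mark price and fee details):

```python
def funding_payment(position_size, mark_price, funding_rate):
    """Funding paid (positive) or received (negative) over one interval.

    position_size is a signed base amount: positive = long, negative = short.
    With a positive rate longs pay shorts; with a negative rate shorts pay longs.
    """
    notional = position_size * mark_price
    return notional * funding_rate

# A 2 BTC long at a 30,000 mark price and a 0.01% funding rate pays
# roughly 6.0 in quote currency to shorts over the interval.
payment = funding_payment(2, 30_000, 0.0001)
```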

    See the grouped FUTURES and PERPETUALS CSV dataset symbols if you'd like to download data for all futures or all perpetual swaps of a given exchange as a single file, instead of one by one for each individual instrument.

    Do you provide time based aggregated data as well?

    We are focused on providing the best possible tick-level historical data for cryptocurrency exchanges, and as of now our APIs (both the HTTP API and CSV datasets) offer access to tick-level data only and do not support time based aggregated data.

    If you're interested in time based aggregated data (OHLC, interval based order book snapshots), see our client libs, which provide such capabilities, with the caveat that data aggregation is performed client-side from tick-level data sourced from the API, meaning it can be a relatively slow process in contrast to ready to download aggregated data.

    Can you record market data for exchange that's not currently supported?

    Yes, we're always open to supporting new promising exchanges. Contact us and we'll get back to you to discuss the details.

    Do you provide market data in normalized format?

    Normalized data format (a unified data format for every exchange) is available via our client libs and CSV datasets. Our HTTP API provides data only in exchange-native format.

    Do you provide normalized contract amounts for derivatives exchanges in your historical data feeds?

    The data we provide has contract amounts exactly as provided by exchanges' APIs, meaning in some cases it can be tricky to compare across exchanges due to different contract multipliers (like for example OKEx, where each contract has $100 value) or different contract types (linear or inverse). We'll keep it this way, but we also provide an instruments metadata API that returns contract multipliers, tick sizes and more for each instrument in a uniform way, allowing you to easily normalize the contract amounts client-side without having to go through all kinds of documentation on various exchanges to find this information.
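Normalizing contract amounts client-side then reduces to a small helper (a sketch: the multiplier and linear/inverse flag for each instrument would come from instrument metadata rather than be hard-coded, and the $100-per-contract inverse instrument below is hypothetical):

```python
def contracts_to_base(contracts, multiplier, price, inverse):
    """Convert an exchange-reported contract count to a base-asset amount.

    For inverse contracts the multiplier is the fixed quote-currency value
    of one contract; for linear contracts it is the base-currency amount.
    """
    if inverse:
        return contracts * multiplier / price
    return contracts * multiplier

# 50 hypothetical $100 inverse contracts at a 20,000 price represent
# 50 * 100 / 20000 = 0.25 of the base asset.
```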

    What is the difference between exchange-native and normalized data format?

    Cryptocurrency markets are very fragmented and every exchange provides data in its own bespoke data format, which we call the exchange-native data format. Our HTTP API and CSV datasets can provide market data in this format, meaning the data you receive is exactly the same as the live data you would have received from exchanges ("as-is").

    For example, BitMEX publishes trade messages in its own bespoke format, and Deribit in another, different one.

    In contrast, normalized data format means the same, unified format across multiple exchanges. We provide normalized data via our client libs (data normalization is performed client-side) as well as via CSV datasets.

    In the process of data normalization we map the data we collect (exchange-native format) to a normalized/unified format across exchanges that is easier to deal with (one data format across multiple exchanges). We document the mapping from exchange-native to normalized format to make the whole process as transparent as possible.
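An illustrative normalized trade message, written as a Python literal (the field names follow Tardis.dev's documented normalized trade type; the values are made up):

```python
# Illustrative sample only - values are invented, field names assumed
# from the normalized "trade" data type documentation.
normalized_trade = {
    "type": "trade",
    "symbol": "XBTUSD",
    "exchange": "bitmex",
    "id": "made-up-trade-id",
    "price": 8754.5,   # trade price
    "amount": 100,     # trade size
    "side": "buy",     # liquidity taker side
    "timestamp": "2019-08-01T00:00:01.132Z",       # exchange-provided timestamp
    "localTimestamp": "2019-08-01T00:00:01.135Z",  # message arrival timestamp
}
```

The same fields are present for every exchange, which is what makes multi-exchange processing straightforward compared to handling each exchange-native format separately.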

    We support the following normalized data types via our client libs:

    • tick-by-tick trades

    • order book L2 updates

    • order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)

    • quotes

    and via CSV datasets:

    • trades

    • incremental order book L2 updates

    • order book snapshots (top 25 and top 5 levels)

    • quotes

    • derivative tickers

    • options chains

    • liquidations

    What is the channel field used in the HTTP API and client libs replay functions?

    Exchanges, when publishing real-time data messages, always publish them for the subscription topics clients have subscribed to. Those subscription topics are very often called "channels" or "streams" in exchanges' documentation pages and describe the data type a given message belongs to - for example BitMEX publishes its trades data via the trade channel and its order book L2 updates via the orderBookL2 channel.

    Since we collect the data for all the channels described on each exchange's historical data details page, our HTTP API and client libs offer filtering capability by those channel names, so for example to get historical trades for BitMEX, the trade channel needs to be provided alongside the requested instrument symbols (via HTTP API filters or client lib replay function args).

    What time zone is used in the data?

    UTC, always.

    Is the provided raw market data complete?

    We're doing our best to provide the most complete and reliable historical raw data API on the market. To do so, among other things, we utilize highly available Kubernetes clusters on Google Cloud Platform that offer best in class availability, networking and monitoring. However, due to exchanges' API downtimes (maintenance, deployments, etc.) we can experience data gaps and cannot guarantee 100% data completeness, but 99.9% (99.99% on most days), which should be more than enough for most of the use cases that tick level data is useful for. In rare circumstances, when an exchange's API changes without any notice or we hit new unexpected rate limits, we may also fail to record data during such a period; it happens very rarely and is very specific to each exchange. Use the exchange details API endpoint and check the incidentReports field in order to get the most detailed and up to date information on that subject.

    How frequently do exchanges drop WebSocket connections?

    As long as an exchange's WebSocket API is not 'hidden' behind a Cloudflare proxy (causing relatively frequent "CloudFlare WebSocket proxy restarting, Connection reset by peer" errors), connections are stable for the majority of supported exchanges and there are almost no connection drops during the day. When there is more volatility in the market, some exchanges tend to drop connections more frequently or have larger latency spikes. Overall it's a nuanced matter that changes over time; if you have any questions regarding a particular exchange, please do not hesitate to contact us.

    Can historical order books reconstructed from L2 updates be crossed (bid/ask overlap) occasionally?

    Although it should never happen in theory, in practice, due to various crypto exchange bugs and peculiarities, it can happen very occasionally, as users have reported.

    We track sequence numbers of WebSocket L2 order book messages when collecting the data and restart the connection when a sequence gap is detected, for exchanges that provide those numbers. We observe that even when sequence numbers are in check, bid/ask overlap can occur. When it does, exchanges tend to 'forget' to publish delete messages for the opposite side of the book when publishing a new level for a given side - we validated that hypothesis by comparing reconstructed order book snapshots that had a crossed book (bid/ask overlap), from which we manually removed the overlapped opposite-side levels (as the exchange didn't publish that 'delete'), against quote/ticker feeds to check that the best bid/ask matches (for exchanges that provide those).
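The manual un-crossing described above can be sketched as a simple heuristic (an illustration of the idea only, not Tardis.dev's actual implementation):

```python
def uncross(bids, asks):
    """Drop overlapped opposite-side levels from a crossed L2 book.

    bids/asks map price -> size. Mirrors the observed exchange behavior:
    levels the exchange 'forgot' to delete are removed so that the best
    bid ends up strictly below the best ask.
    """
    if bids and asks:
        best_bid = max(bids)
        # remove stale asks at or below the best bid
        asks = {p: s for p, s in asks.items() if p > best_bid}
    if bids and asks:
        best_ask = min(asks)
        # remove stale bids at or above the best ask
        bids = {p: s for p, s in bids.items() if p < best_ask}
    return bids, asks
```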

    Can an exchange publish data with non monotonically increasing timestamps for a single data channel?

    That shouldn't happen in theory, but we've detected that for some exchanges, when a new connection is established, the first message for a given channel & symbol sometimes has a newer timestamp than a subsequent message, e.g., an order book snapshot has a newer timestamp than the first order book update. This is why we provide data via the API and CSV datasets for given date ranges based on local timestamps (the timestamp of message arrival), which are always monotonically increasing.

    Do exchanges publish duplicated trades data messages?

    Some exchanges occasionally publish duplicated trades (trades with the same ids). Since we collect real-time data, we also collect and provide such duplicated trades via the raw data API if they were published by the exchanges' real-time WebSocket feeds. Our client libs have functionality that can deduplicate those trades when working with normalized data, and similarly we deduplicate the trades data in CSV datasets.
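Client-side trade deduplication can be sketched as follows (a simplified illustration, not the client libs' implementation; ids are tracked per symbol since some venues only guarantee id uniqueness per instrument):

```python
def dedupe_trades(trades):
    """Drop trades whose (symbol, id) pair was already seen, keeping order."""
    seen, unique = set(), []
    for trade in trades:
        key = (trade["symbol"], trade["id"])
        if key not in seen:
            seen.add(key)
            unique.append(trade)
    return unique
```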

    How are order book data snapshots provided?

    Historical market data available via the API provides order book snapshots at the beginning of each day (00:00 UTC).

    We also provide custom order book snapshots with customizable time intervals - from tick-by-tick and milliseconds to minutes or hours - via our client libs, in which case the custom snapshots are computed client side from raw data provided via the HTTP API, as well as via ready to download CSV datasets (top 25 and top 5 levels).

    Do you collect order books as snapshots or in streaming mode?

    Order books are collected in streaming mode - a snapshot at the beginning of each day and then incremental updates.

    How is the incremental_book_L2 CSV dataset built from real-time data?

    Cryptocurrency exchanges' real-time APIs vary a lot, but for order book data they all tend to follow a similar flow: first, when the WS connection is established and the subscription is confirmed, the exchange sends an initial order book snapshot (all existing price levels or the top 'x' levels, depending on the exchange) and then starts streaming 'book update' messages (frequently also called deltas). Those updates, when applied to the initial snapshot, result in an up to date order book state at any given time.

    We provide initial L2 snapshots in the incremental_book_L2 dataset at the beginning of each day (00:00 UTC), but also anytime the exchange closes its real-time WebSocket connection.

    As an example, an exchange's snapshot orderbook message (frequently called a 'partial' in exchange API docs) maps to CSV rows marked as snapshot rows - one row per price level - while each subsequent orderbook update message maps to regular (non-snapshot) rows in the same flat format.

    See the answer below if you have doubts about how to reconstruct order book state based on the data provided in the incremental_book_L2 dataset.

    How can I reconstruct full order book state from the incremental_book_L2 CSV dataset?

    In order to reconstruct full order book state correctly from incremental_book_L2 data:

    • For each row in the CSV file (iterate in the same order as provided in the file):

      • only when the local timestamp of the current row is larger than the previous row's local timestamp (the local_timestamp column value) can you read your local order book state as consistent. Why? The CSV format is flat, where each row represents a single price level update, but most exchanges' real-time feeds publish multiple order book level updates via a single WebSocket message, and those need to be processed together before reading the locally maintained order book state. We use the local timestamp value here to detect all price level updates belonging to a single 'update' message.

    Alternatively, we also provide ready to download order book snapshots CSV datasets.
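The iteration rule above can be sketched as a small replayer (column and value names such as `is_snapshot`, `side` and `local_timestamp` are assumed from the dataset documentation; verify against the actual files):

```python
def replay_book(rows):
    """Rebuild L2 book states from flat incremental_book_L2-style rows.

    Yields (local_timestamp, bids, asks) once per complete update message,
    detected by a change in local_timestamp, so the book is only read
    when it is in a consistent state.
    """
    book = {"bid": {}, "ask": {}}
    prev_ts, prev_snapshot = None, False
    for row in rows:
        ts = row["local_timestamp"]
        if prev_ts is not None and ts > prev_ts:
            yield prev_ts, dict(book["bid"]), dict(book["ask"])
        if row["is_snapshot"] and not prev_snapshot:
            book = {"bid": {}, "ask": {}}  # new snapshot starts a fresh book
        if row["amount"] == 0:
            book[row["side"]].pop(row["price"], None)  # level deleted
        else:
            book[row["side"]][row["price"]] = row["amount"]
        prev_ts, prev_snapshot = ts, row["is_snapshot"]
    if prev_ts is not None:
        yield prev_ts, dict(book["bid"]), dict(book["ask"])
```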

    How are CSV datasets split into files?

    CSV datasets are available in daily intervals, split by exchange, data type and symbol. In addition to standard currency pair/instrument symbols, each exchange also has special grouped symbols, depending on whether it supports a given market type: SPOT, FUTURES, OPTIONS and PERPETUALS. That feature is useful if someone is interested in, for example, all Deribit options instruments' trades or quotes data without the need to request data for each symbol separately, one by one.

    How are market data messages timestamped?

    Each message received via WebSocket connection is timestamped with 100ns precision using a synchronized clock at arrival time (before any message processing) and stored in ISO 8601 format.
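    The CSV timestamps shown throughout this FAQ are integer microseconds since the Unix epoch, so converting them back to an ISO 8601 string is straightforward (a minimal sketch):

```python
from datetime import datetime, timedelta, timezone

def us_to_iso8601(timestamp_us: int) -> str:
    """Convert integer microseconds since the Unix epoch to an ISO 8601 string."""
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    return (epoch + timedelta(microseconds=timestamp_us)).isoformat()

# local_timestamp from the FTX snapshot row used elsewhere in this FAQ
print(us_to_iso8601(1601510401216632))  # 2020-10-01T00:00:01.216632+00:00
```

Note that the CSV columns carry microsecond resolution; the 100ns capture precision mentioned above applies to the raw ISO 8601 timestamps.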

    What is the delay of newly collected historical market data in relation to real-time?

    For API access it's 15 minutes (T - 15min); downloadable CSV files for a given day are available on the next day around 06:00 UTC.

  • derivative tick info (open interest, funding rate, mark price, index price)

  • liquidations

  • options summary

  • OHLCV

  • volume/tick based trade bars

  • hidden interest (i.e., iceberg orders)

    | exchange | order book depth | order book updates frequency |
    |---|---|---|
    | Binance Spot | top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms |
    | FTX | top 100 levels initial order book snapshot and updates | real-time |
    | OKX Futures | top 400 levels initial order book snapshot and updates | real-time |
    | OKX Swap | top 400 levels initial order book snapshot and updates | real-time |
    | OKX Options | top 400 levels initial order book snapshot and updates | real-time |
    | OKX Spot | top 400 levels initial order book snapshot and updates | real-time |
    | Huobi Futures | top 150 levels initial order book snapshot and updates | 30ms |
    | Huobi COIN Swaps | top 150 levels initial order book snapshot and updates | 30ms |
    | Huobi USDT Swaps | top 150 levels initial order book snapshot and updates | 30ms |
    | Huobi Global | top 150 levels initial order book snapshot and updates | 100ms |
    | Bitfinex Derivatives | top 100 levels initial order book snapshot and updates | real-time |
    | Bitfinex | top 100 levels initial order book snapshot and updates | real-time |
    | Coinbase Pro | full order book depth snapshot and updates | real-time |
    | Kraken Futures | full order book depth snapshot and updates | real-time |
    | Kraken | top 1000 levels initial order book snapshot and updates | real-time |
    | Bitstamp | full order book depth snapshot and updates | real-time |
    | Gemini | full order book depth snapshot and updates | real-time |
    | Poloniex | full order book depth snapshot and updates | real-time |
    | Bybit | top 25 levels initial order book snapshot and updates | real-time |
    | dYdX | full order book depth snapshot and updates | real-time |
    | Upbit | top 15 levels snapshots | real-time |
    | Phemex | top 30 levels initial order book snapshot and updates | 20ms |
    | FTX US | top 100 levels initial order book snapshot and updates | real-time |
    | Binance US | top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms |
    | Gate.io Futures | top 20 levels order book snapshots | unknown |
    | Gate.io | top 30 levels order book snapshots | unknown |
    | OKCoin | top 400 levels initial order book snapshot and updates | real-time |
    | bitFlyer | full order book depth snapshot and updates | real-time |
    | HitBTC | full order book depth snapshot and updates | real-time |
    | Binance DEX | top 1000 levels initial order book snapshot, full depth incremental order book updates | 100ms |

    | exchange | available since | notes |
    |---|---|---|
    | Binance COIN Futures | 2020-06-16 | collected from WS forceOrder stream; since 2021-04-27 liquidation orders streams do not push real-time order data anymore, instead they push snapshot order data at a maximum frequency of 1 order push per second |
    | FTX | 2019-08-01 | collected from WS trades channel (trades with liquidation flag) |
    | OKX Futures | 2020-12-18 | collected by polling OKEx REST APIs since liquidations aren't available via WS feeds |
    | OKX Swap | 2020-12-18 | collected by polling OKEx REST APIs since liquidations aren't available via WS feeds |
    | Huobi Futures | 2020-06-24 | collected from WS liquidation_orders channel |
    | Huobi Swap | 2020-06-24 | collected from WS liquidation_orders channel |
    | Bitfinex Derivatives | 2019-09-14 | collected from WS liquidations channel |
    | Kraken Futures | 2019-03-30 | collected from WS trade channel (trades with liquidation type) |
    | Bybit | 2020-12-18 | up until 2021-09-20 collected by polling Bybit REST APIs since liquidations weren't available via WS feeds; starting from 2021-09-20 collected from WS liquidation channel |

  • derivative tick info (open interest, funding rate, mark price, index price)

  • liquidations

  • OHLCV

  • volume/tick based trade bars

  • quotes

  • derivative tick info (open interest, funding rate, mark price, index price)

  • liquidations

  • 8.101

    ftx

    ETH/USD

    1601510401216632

    1601510401316432

    true

    bid

    359.72

    121.259

    4.962

    ftx

    ETH/USD

    1601510427184054

    1601510427204036

    false

    ask

    361.02

    0

      • if the current row is part of a snapshot (is_snapshot column value set to true) and the previous one was not, reset your local order book state object that tracks price levels for each order book side - it means there was a connection restart and the exchange provided a full order book snapshot, or it was the start of a new day (each incremental_book_L2 file starts with a snapshot)

      • if the current row's amount is zero (amount column value set to 0), remove that price level (the row's price column) from your local order book state, as that price level no longer exists

      • if the current row's amount is not zero, update your local order book state price level with the new value, or add a new price level if it does not yet exist - maintain bids and asks order book sides separately (side column value)
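    Putting the rules above together, a minimal reconstruction loop over incremental_book_L2 rows might look like this (an illustrative sketch using the FTX sample rows from this FAQ, not an official client library):

```python
import csv
from io import StringIO

def replay_book(csv_text):
    """Yield (local_timestamp, bids, asks) after each complete update message.

    Rows sharing the same local_timestamp belong to one WebSocket message
    and must be applied together before the book state is read.
    """
    bids, asks = {}, {}
    prev_local_ts = None
    prev_snapshot = False
    for row in csv.DictReader(StringIO(csv_text)):
        local_ts = int(row["local_timestamp"])
        is_snapshot = row["is_snapshot"] == "true"
        # the book is consistent only once local_timestamp advances
        if prev_local_ts is not None and local_ts > prev_local_ts:
            yield prev_local_ts, dict(bids), dict(asks)
        # first snapshot row after non-snapshot rows: reset local state
        if is_snapshot and not prev_snapshot:
            bids.clear()
            asks.clear()
        book_side = bids if row["side"] == "bid" else asks
        price, amount = float(row["price"]), float(row["amount"])
        if amount == 0:
            book_side.pop(price, None)  # price level removed
        else:
            book_side[price] = amount   # price level added or updated
        prev_local_ts, prev_snapshot = local_ts, is_snapshot
    if prev_local_ts is not None:
        yield prev_local_ts, dict(bids), dict(asks)

# the FTX sample rows used throughout this FAQ
sample = """exchange,symbol,timestamp,local_timestamp,is_snapshot,side,price,amount
ftx,ETH/USD,1601510401216632,1601510401316432,true,bid,359.72,121.259
ftx,ETH/USD,1601510401216632,1601510401316432,true,ask,359.8,8.101
ftx,ETH/USD,1601510427184054,1601510427204046,false,ask,360.24,4.962
ftx,ETH/USD,1601510427184054,1601510427204046,false,ask,361.02,0
"""
states = list(replay_book(sample))
```

The generator yields one book state per complete message: first the two-row snapshot, then the state after the two-row update has been fully applied.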

    | exchange | order book depth | order book updates frequency |
    |---|---|---|
    | BitMEX | full order book depth snapshot and updates | real-time |
    | Deribit | full order book depth snapshot and updates | real-time |
    | Binance USDT Futures | top 1000 levels initial order book snapshot, full depth incremental order book updates | real-time, dynamically adjusted |
    | Binance COIN Futures | top 1000 levels initial order book snapshot, full depth incremental order book updates | real-time, dynamically adjusted |

    | exchange | available since | notes |
    |---|---|---|
    | BitMEX | 2019-03-30 | collected from WS liquidation channel |
    | Deribit | 2019-03-30 | collected from WS trade channel (trades with liquidation flag) |
    | Binance USDT Futures | 2019-11-17 | collected from WS forceOrder stream; since 2021-04-27 liquidation orders streams do not push real-time order data anymore, instead they push snapshot order data at a maximum frequency of 1 order push per second |

    | exchange | symbol | timestamp | local_timestamp | is_snapshot | side | price | amount |
    |---|---|---|---|---|---|---|---|
    | ftx | ETH/USD | 1601510401216632 | 1601510401316432 | true | bid | 359.72 | 121.259 |
    | ftx | ETH/USD | 1601510401216632 | 1601510401316432 | true | ask | 359.8 | 8.101 |

    | exchange | symbol | timestamp | local_timestamp | is_snapshot | side | price | amount |
    |---|---|---|---|---|---|---|---|
    | ftx | ETH/USD | 1601510427184054 | 1601510427204046 | false | ask | 360.24 | 4.962 |
    | ftx | ETH/USD | 1601510427184054 | 1601510427204046 | false | ask | 361.02 | 0 |



    {
      "table": "trade",
      "action": "insert",
      "data": [
        {
          "timestamp": "2019-06-01T00:03:11.589Z",
          "symbol": "ETHUSD",
          "side": "Sell",
          "size": 10,
          "price": 268.7,
          "tickDirection": "ZeroMinusTick",
          "trdMatchID": "ebc230d9-0b6e-2d5d-f99a-f90109a2b113",
          "grossValue": 268700,
          "homeNotional": 0.08555051758063137,
          "foreignNotional": 22.987424073915648
        }
      ]
    }
    {
      "jsonrpc": "2.0",
      "method": "subscription",
      "params": {
        "channel": "trades.ETH-26JUN20.raw",
        "data": [
          {
            "trade_seq": 18052,
            "trade_id": "ETH-10813935",
            "timestamp": 1577836825724,
            "tick_direction": 0,
            "price": 132.65,
            "instrument_name": "ETH-26JUN20",
            "index_price": 128.6,
            "direction": "buy",
            "amount": 1.0
          }
        ]
      }
    }
    
    {
      "type": "trade",
      "symbol": "XBTUSD",
      "exchange": "bitmex",
      "id": "282a0445-0e3a-abeb-f403-11003204ea1b",
      "price": 7996,
      "amount": 50,
      "side": "sell",
      "timestamp": "2019-10-23T10:32:49.669Z",
      "localTimestamp": "2019-10-23T10:32:49.740Z"
    }
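    To illustrate how an exchange-native BitMEX trade message relates to the normalized trade shape above, here is a sketch of the field mapping (inferred from the two samples in this FAQ; the localTimestamp value passed in below is hypothetical, since it comes from the capture process rather than the message):

```python
def normalize_bitmex_trade(entry, local_timestamp):
    """Map one element of a BitMEX 'trade' message's data array to the
    normalized trade shape shown above (field mapping inferred from samples)."""
    return {
        "type": "trade",
        "symbol": entry["symbol"],
        "exchange": "bitmex",
        "id": entry["trdMatchID"],
        "price": entry["price"],
        "amount": entry["size"],
        "side": entry["side"].lower(),   # "Sell" -> "sell"
        "timestamp": entry["timestamp"],
        "localTimestamp": local_timestamp,
    }

# the BitMEX sample trade entry from this FAQ
entry = {
    "timestamp": "2019-06-01T00:03:11.589Z", "symbol": "ETHUSD",
    "side": "Sell", "size": 10, "price": 268.7,
    "trdMatchID": "ebc230d9-0b6e-2d5d-f99a-f90109a2b113",
}
# hypothetical arrival time supplied by the capture process
normalized = normalize_bitmex_trade(entry, "2019-06-01T00:03:11.612Z")
```

The normalized object keeps both the exchange-provided timestamp and the capture-side localTimestamp, mirroring the normalized sample above.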
    {
      "channel": "orderbook",
      "market": "ETH/USD",
      "type": "partial",
      "data": {
        "time": 1601510401.2166328,
        "checksum": 204980439,
        "bids": [
          [
            359.72,
            121.259
          ]
        ],
        "asks": [
          [
            359.8,
            8.101
          ]
        ],
        "action": "partial"
      }
    }
    {
      "channel": "orderbook",
      "market": "ETH/USD",
      "type": "update",
      "data": {
        "time": 1601510427.1840546,
        "checksum": 1377242400,
        "bids": [],
        "asks": [
          [
            360.24,
            4.962
          ],
          [
            361.02,
            0
          ]
        ],
        "action": "update"
      }
    }

    Billing and Subscriptions

    What is the order process?

    1. Select the data plan and access type that you're interested in via the order form on the Tardis.dev website.

      • available data plans: Individual, Perpetuals, Derivatives or All Exchanges

      • available access types:

        • Subscription

        • One-off Purchase

    2. Proceed to checkout, where you provide your email address and payment details.

      • accepted payment methods:

        • Credit Cards (Mastercard, Visa, Maestro, American Express, Discover, Diners Club, JCB, UnionPay)

    3. Successfully complete your payment and receive via email the API key that allows you to download CSV datasets and access historical data via API. The API key is valid as long as the subscription is active, or 6 months for one-off purchases.

    For larger orders we also accept payment through invoicing.

    Do you provide discounts?

    We provide discounts in a transparent form via different subscription types.

    How does one-off purchase based access work?

    One-off purchase provides access to specific time periods of historical market data. The API key is valid for a year since purchase and allows access to all available data types (trades, order books etc.) for the ordered date ranges, both via API and downloadable CSV files.

    How does subscription based access work?

    Subscription based access relies on recurring payments at regular intervals (monthly, quarterly, yearly) and offers access to newly collected market data as it becomes available, as well as existing historical market data, the range of which depends on the chosen billing period.

    There are three 'dimensions' you can customize your subscription by:

    • Subscription type - Academic, Solo, Pro or Business

    • Data plan (which exchanges' data you get access to) - Individual, Perpetuals, Derivatives or All Exchanges

    • Billing interval (how much historical data you get access to) - monthly, quarterly or yearly

    For example, an "All Exchanges" Business Subscription with a yearly billing period allows accessing all available existing historical data via API and CSV files, and one year of new data as it becomes available (for the initial payment). The API key is valid as long as the subscription is active and allows access to all available data types (trades, order book data, quotes, funding, liquidations etc.) via downloadable CSV files and via raw market data API (for Pro and Business subscription types only).

    What are the differences between subscription types?

    Business subscriptions include up to 10 additional API keys that can be used by many team members to access historical data at the same time, including across different geo-locations, and provide a dedicated communication channel (via Telegram Messenger, email or Zoom calls) for your company, where our team is on standby to answer any questions you may have.

    Business subscriptions also provide vendor onboarding support on our side, including filling in W9, ACH and other company-specific forms, as well as integration assistance: we provide dedicated code snippets to help with seamless integration.

    Academic subscriptions require confirmation of eligibility, for example using a university (@edu) email address.

    Pro, Academic and Solo subscription plans include a single API key that can be used to access historical data. Such an API key can't be used by different team members (IP addresses) at the same time. The real-time market data streaming feature does not have this limitation (it does not require an API key at all).

    Do subscriptions include access to historical data as well?

    Yes, depending on chosen billing period, subscriptions include access to existing historical market data as well:

    • all available historical data if the subscription is billed yearly - historical market data is available since 2019-03-30 for the majority of supported exchanges (see the exchange details page for the exact date for a particular exchange)

    • 12 months of historical data if the subscription is billed quarterly, e.g., a subscription started on 2020-04-01 includes access to historical data since 2019-04-01 - it's not a rolling time window, but a fixed starting date since when historical data is available for your subscription

    • 4 months of historical data if the subscription is billed monthly, e.g., a subscription started on 2020-04-01 includes access to historical data since 2019-12-01 - again a fixed starting date, not a rolling time window

    All subscriptions provide access to all available data types (trades, order book data, quotes, funding etc.) via downloadable CSV files and raw data replay API (for Pro and Business subscription types).

    What is included in "Individual" data plan?

    "Individual" data plan provides per-exchange access to market data that includes the full feed (all instruments) and data types of the selected exchange(s), for example the full Coinbase exchange data feed.

    See Historical Data Details to learn more about what data is available for each supported exchange.

    "Individual" data plan allows access to all available data types (trades, order book data, quotes, funding etc.) via downloadable CSV files and raw data replay API (for Pro and Business subscription types). The range of historical data access for the "Individual" data plan depends on the chosen billing period (for example: access to all existing historical data we collected if the subscription is billed yearly).

    "Individual" data plan is available both for subscriptions and one-off purchases.

    What is included in "Perpetuals" data plan?

    "Perpetuals" data plan provides access to the following perpetual swaps instruments' market data (over 500 perpetual swaps instruments across 13 exchanges):

    • BitMEX: all perpetual swaps instruments

    • Deribit: all perpetual swaps instruments

    • Binance USDT Futures: all perpetual swaps instruments

    "Perpetuals" data plan allows access to all available data types (trades, order book data, funding etc.) via downloadable CSV files and raw data replay API (for Pro and Business subscription types). The range of historical data access for the "Perpetuals" data plan depends on the chosen billing period (for example: access to all existing historical data we collected if the subscription is billed yearly).

    "Perpetuals" data plan is available both for subscriptions and one-off purchases.

    What is included in "Derivatives" data plan?

    "Derivatives" data plan provides access to the following derivatives exchanges' market data:

    • BitMEX: all exchange's instruments

    • Deribit: all exchange's instruments

    • Binance USDT Futures: all exchange's instruments

    "Derivatives" data plan allows access to all available data types (trades, order book data, quotes, funding etc.) via downloadable CSV files and raw data replay API (for Pro and Business subscription types).

    The range of historical data access for the "Derivatives" data plan depends on the chosen billing period (for example: access to all existing historical data we collected if the subscription is billed yearly).

    "Derivatives" data plan is available both for subscriptions and one-off purchases.

    What is included in "All Exchanges" data plan?

    "All Exchanges" data plan provides access to market data of all supported exchanges (30+ leading spot and derivatives exchanges, see the full list).

    "All Exchanges" data plan allows access to all available data types (trades, order book data, quotes, funding, liquidations etc.) for all supported exchanges and their instruments/currency pairs via downloadable CSV files and raw data replay API (for Pro and Business subscription types).

    The range of historical data access for the "All Exchanges" data plan depends on the chosen billing period (for example: access to all existing historical data we collected if the subscription is billed yearly).

    "All Exchanges" data plan is available both for subscriptions and one-off purchases.

    How can I change my subscription plan?

    Contact us describing which plan you'd like to change to and we'll handle the rest.

    Can I pay through invoicing?

    We offer invoicing for customers paying over $6000 for data access. Simply use our order form and the "PAY THROUGH INVOICING" button.

    Alternatively, contact us with the order details you're interested in (data plan, billing period) and we'll send you back an invoice that, when paid, will give you access to the data.

    Can I get quotation document before making an order?

    Yes, please use our order form and the "REQUEST QUOTATION" button. Alternatively, contact us with the order details you're interested in (data plan, billing period) and we'll send you back a quotation document in no time.

    How can I get an invoice and VAT refund?

    After a successful order you'll receive a Receipt email from Paddle, which is our online reseller & payment processor. Click on the button titled "View Receipt" there.

    You will be redirected to the receipt page, where you can enter your address details by clicking on the "Add address & VAT Number" link.

    If you would like to enter a VAT number, select the "This is a business purchase" checkbox to enter the VAT ID if you forgot to enter it during checkout. The tax amount will be refunded within a maximum of 12 hours after it is confirmed by Paddle.

    To save the invoice as a PDF file:

    • Right click on the page and choose 'Print...' from the context menu

    • Change the destination to 'Save as PDF'

    • Click the 'Save' button to save the invoice as a PDF file

    What are your VAT details?

    Click on the link titled "Click here to get a full invoice with address & custom information" provided with the order confirmation email sent by Paddle to get the address and VAT ID of Paddle, who processes our payments. Paddle acts as a reseller and Merchant of Record, so they handle VAT on our behalf.

    I’ve lost my invoice. How do I get a new one?

    You need to contact us or email [email protected] to request a new invoice. Please provide the email address you bought the subscription with and any extra details that might help.

    What is the refund policy?

    We do not offer refunds for initial subscription payments and one-off purchases.

    If you are on yearly billing and forget to cancel your subscription before the renewal date, reach out to us within seven days after the renewal date to discuss a refund.

    If you’re on monthly or quarterly billing, please be sure to cancel your subscription before the end date of your current plan, as there are no refunds for recurring payments on monthly and quarterly billing plans.

    If you'd like to test the service, we offer generous free trials. Simply reach out to us and we'll set up a test account for you in no time.

    How can I cancel my subscription?

    In order to cancel your active subscription, use the 'Cancel subscription' link we've sent you in the email together with your API key, or contact us and we'll provide a cancellation link for you. Alternatively, you can email Paddle ([email protected]), which acts as our reseller and Merchant of Record, including a note of the email address you used to purchase your subscription and your order number.

    Do you accept payments in cryptocurrency?

    We accept BTC, ETH and USDT for one-off purchases. Contact us and we'll get back to you with details.

    How can I update my credit card information?

    In order to update your credit card information, use the 'Update payment method' link we've sent you in the email together with your API key, or contact us and we'll provide that link for you.


  • PayPal
  • Apple Pay (one-off purchases only)

  • Wire Transfers (for one-off purchases only)

  • —

    —

    ✓

    ✓

    ✓

    ✓

    ✓

    ✓

    —

    —

    ✓

    ✓

    | | Academic | Solo | Professional | Business |
    |---|---|---|---|---|
    | Additional API keys | — | — | — | ✓ |
    | Professional support level | none | none | email (priority) | dedicated |
    | Integration assistance | — | — | — | ✓ |
    | Vendor onboarding | — | — | — | ✓ |
    | API keys count | 1 | 1 | 1 | 10 |

    —

    —

    ✓ ()

    ✓

  • Binance COIN Futures: all perpetual swaps instruments

  • FTX: all perpetual swaps instruments

  • OKX Swap: all perpetual swaps instruments

  • Huobi COIN Swaps: all perpetual swaps instruments

  • Huobi USDT Swaps: all perpetual swaps instruments

  • bitFlyer: FX_BTC_JPY

  • Bitfinex Derivatives: all perpetual swaps instruments

  • Bybit: all perpetual swaps instruments

  • dYdX: all perpetual swaps instruments

  • Phemex: all perpetual swaps instruments

  • Delta: all perpetual swaps instruments

  • Gate.io Futures: all perpetual swaps instruments

  • CoinFLEX: all perpetual swaps instruments

  • WOO X: all perpetual swaps instruments

  • Ascendex: all perpetual swaps instruments

  • Crypto.com: all perpetual swaps instruments

  • Binance COIN Futures: all exchange's instruments

  • FTX: all exchange's instruments

  • OKX Futures: all exchange's instruments

  • OKX Swap: all exchange's instruments

  • OKX Options: all exchange's instruments

  • Huobi Futures: all exchange's instruments

  • Huobi COIN Swaps: all exchange's instruments

  • Huobi USDT Swaps: all exchange's instruments

  • Bitfinex Derivatives: all exchange's instruments

  • Bybit: all exchange's instruments

  • dYdX: all exchange's instruments

  • Phemex: all exchange's instruments

  • CoinFLEX: all exchange's instruments

  • Delta: all exchange's instruments

  • bitFlyer: all exchange's instruments

  • Gate.io Futures: all exchange's instruments

  • WOO X: all exchange's instruments

  • Crypto.com: all exchange's instruments

  • Ascendex: all exchange's instruments

    | | Academic | Solo | Professional | Business |
    |---|---|---|---|---|
    | Downloadable CSV files | ✓ | ✓ | ✓ | ✓ |
    | Raw data replay API (HTTP API /data-feeds) | — | — | ✓ | ✓ |


    Tardis Machine replay APIs
    Tardis Machine real-time APIs
    Instrument metadata API
    Professional support level
    Unlimited data replay API
    30 million requests per day, up to 60 concurrent requests