Tardis.dev provides the most comprehensive and granular cryptocurrency market data products in the industry and offers:
access to high frequency historical data including the most granular tick level order book updates and trades for both derivatives and top spot cryptocurrency exchanges
: historical funding, open interest, indexes, liquidations and more
fast and convenient data access both via and
data with complete transparency into how the data is being recorded
and support
allowing reconstruction of the state of the limit order book at any given moment in time across
with discounts for solo traders, small prop shops and academic researchers
support via Tardis.dev open source libraries that connect directly to exchanges' public WebSocket APIs (no API key required)
Data use cases
market microstructure and order book dynamics research
trading execution optimization
tick-level granularity market simulation
liquidity and lead-lag analysis
In total, over 40,000 distinct instruments & currency pairs across leading derivatives and spot cryptocurrency exchanges are supported. We collect and provide data for all instruments & currency pairs available on a given exchange, with some exceptions for spot exchanges where we collect only high-cap currency pairs (due to exchanges' API limitations).
Binance
We do provide discounts in a transparent form via .
Yes, if you'd like to test the service (data quality, coverage, API performance etc.), we offer generous free trials. Simply and we'll set up a trial account for you.
Our support team has in-depth knowledge of market data and exchanges' API peculiarities, as well as programming and data analysis expertise. You get answers straight from people whose day-to-day job is overseeing and maintaining the market data collection process and infrastructure.
For we provide a dedicated communication channel (via Telegram Messenger, email or Zoom calls) for your company, where our team is on standby every business day (7AM - 3PM UTC) to answer any questions you may have. For and we provide email-based support with a 24-48 business hours initial response time.
For there is no dedicated support provided, only self-service.
Since cryptocurrency exchanges' market data APIs are public, anyone can use them to collect the data, but it's a time-consuming and resource-intensive undertaking (the exchanges we support publish ~1000GB of new data every day) that requires investment in , constant monitoring and oversight (exchange API changes, rate-limit monitoring, new exchange integrations, unexpected connection issues, latency monitoring etc.), not to mention the implementation costs of data collection, storage and distribution services. All in all, we think our offering is , and , provides good value and saves you time and money in comparison to an in-house solution, allowing you to focus on your core objective rather than on data management intricacies.
You can access historical market data via which provides raw data in , or download with , , , , (open interest, funding, mark price, index price) and . provide data in as well, which can be more flexible than CSV datasets for some use cases, but also slower to download due to on-demand, client-side data normalization overhead in comparison to ready-to-download CSV files.
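For illustration, here is a minimal Python sketch of downloading one daily CSV file over HTTPS. The URL pattern, exchange, data type and symbol below are assumptions made for the example - check the downloadable CSV files documentation for the exact dataset URL format and authentication details.

import gzip
import io

import requests  # pip install requests

# Assumed daily dataset URL pattern and Bearer auth - verify against the docs.
url = "https://datasets.tardis.dev/v1/deribit/trades/2020/03/01/BTC-PERPETUAL.csv.gz"
headers = {"Authorization": "Bearer YOUR_API_KEY"}

response = requests.get(url, headers=headers, timeout=60)
response.raise_for_status()

with gzip.open(io.BytesIO(response.content), mode="rt") as f:
    for _ in range(3):
        print(f.readline().rstrip())  # header row plus the first data rows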
Data is available since 2019-03-30 for the majority of the supported exchanges (those that existed at that time).
Yes, see for more details.
Any programming language that can communicate using HTTPS can communicate with our .
We do provide official and clients that offer fast and convenient access to tick-level historical market data.
Finally, our open source, locally runnable server with built-in local data caching provides market data normalization, custom order book snapshot capabilities and real-time market data streaming support, connecting directly to exchanges' WebSocket APIs. It provides both streaming HTTP and WebSocket endpoints returning market data for whole time periods (in contrast to the Tardis.dev , where a single call returns data for a single-minute time period) and is available via npm and as a Docker image.
Historical market data provided by can be accessed via HTTPS.
Locally runnable server provides both HTTP and WebSocket based APIs for accessing both historical and real-time market data.
Exchanges' market data WebSocket APIs are designed to publish real-time feeds, not historical ones. Our locally runnable bridges that gap, allowing you to "replay" historical market data from any given past point in time with the same data format and 'subscribe' logic as exchanges' real-time APIs. In many cases existing exchanges' WebSocket clients can be used to connect to this endpoint just by changing the URL, and receive market data in format for date ranges specified in URL query string params.
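As a rough illustration, the sketch below connects to a locally running replay endpoint with Python and the websockets package, sending the same subscribe message a BitMEX real-time client would send. The port, path and query params are assumptions for the example - adjust them to match your local server configuration.

import asyncio
import json

import websockets  # pip install websockets

# Assumed local replay endpoint and query params - adjust to your setup.
REPLAY_URL = "ws://localhost:8001/ws-replay?exchange=bitmex&from=2019-10-01&to=2019-10-02"

async def replay():
    async with websockets.connect(REPLAY_URL) as ws:
        # Same subscribe payload you would send to BitMEX's real-time API.
        await ws.send(json.dumps({"op": "subscribe", "args": ["trade:XBTUSD"]}))
        async for message in ws:
            print(message)  # exchange-native messages replayed in historical order

asyncio.run(replay())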
We do not provide a hosted real-time market data API as we think that, given everyone can access exchanges' APIs directly for free without restrictions, relying on a 3rd party for such a crucial piece of infrastructure does not make sense (additional latency and another ). Instead we developed that offer consolidated real-time normalized market data streaming capabilities, connect directly to exchanges' WebSocket APIs and are completely free to use.
There are no API rate limits for API.
for professional level subscriptions is limited to 30 million requests per day and up to 60 concurrent requests. The API key can be used only from a single IP address at a time. For business level subscriptions there are no rate limits for the raw data replay API as long as your usage does not negatively impact other customers' API experience. If that's the case, we'll contact you via email and do our best to help you sort it out - in most cases it's a download client bug that downloads the same data in a loop over and over.
The API key can be obtained on the website via . You'll receive it via email after a successful order.
immediately and we will generate a new API key for you.
Highly available located in London, UK (europe-west2 region) and Tokyo, Japan (asia-northeast1 region)
Two independent, geo-redundant, highly durable storage services
High performance API servers deployed across network of data centers around the globe
We do not have a formal SLA in place yet, but all infrastructure is set up to provide the highest availability possible on both the data collection and distribution side, with a geo-redundant setup. Both data collection services and public APIs are constantly monitored from multiple locations and our team is immediately notified in case of any issue. We don't practice maintenance that would affect API availability, but in the very rare circumstance that it does happen, we'll communicate it in advance. If a formal SLA is something that your business requires, .
backtesting and optimization of trading strategies
full historical order book reconstruction at any given point in time
training machine learning models
alpha generation
designing quantitative models
academic research
data visualizations
Bitfinex
Kraken Futures (Crypto Facilities)
Kucoin
2019-08-01
2019-03-30
2019-03-30
2020-02-01
2019-03-30
2019-11-19
2020-03-28
2020-10-30
2019-11-19
2019-09-14
2019-05-23
2019-03-30
2019-03-30
2019-06-04
2019-03-30
2019-08-30
2020-07-01
2019-11-07
2021-12-04
2021-04-06
2023-01-20
2022-08-16
2023-02-23
2021-03-03
2020-03-17
2020-03-30
2021-03-28
(delisted)
2020-05-22
2019-09-25
2020-07-01
2020-07-01
2023-01-13
2022-06-01
2019-11-19
2019-08-30
2019-11-19
2020-07-14
(delisted)
2019-06-04
exchange
available since
2019-03-30
2019-03-30
2019-11-17
2020-06-16
2019-03-30
(delisted)
Got questions? We're happy to help!
Simply via email.
We provide the most comprehensive and granular market data on the market, sourced from real-time WebSocket APIs, with complete control and transparency into how the data is being recorded.
Via downloadable CSV data files, the following normalized tick-level data types are available:
(top 25 and top 5 levels)
(open interest, funding rate, mark price, index price)
that is available for subscriptions provides data in . See to learn about the captured for each exchange. Each captured channel can be considered a different exchange-specific data type (for example , or ).
We also provide the following via our (normalization is done client-side, using as a data source):
trades
order book L2 updates
order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)
quotes
We always collect and provide data with the highest granularity that an exchange can offer via its . High frequency can mean different things for different exchanges due to exchange API limitations. For example, for it can mean (market-by-order), for all real-time order book updates, and for Spot it means order book updates aggregated in 100ms intervals.
Raw market data is sourced from exchanges' real-time WebSocket APIs. In cases where an exchange lacks a WebSocket API for a particular data type, we fall back to polling its REST API periodically, e.g., Binance Futures open interest data.
Recording exchanges' real-time WebSocket feeds allows us to preserve and provide that exchanges' APIs can offer, including data that is simply not available via their REST APIs, like tick-level order book updates. Historical data sourced from WebSocket real-time feeds adheres to what you'll see when trading live and can be used to exactly replicate live conditions, even if it means some occasional causing , real-time data publishing delays especially during larger market moves, or in some edge cases. We find that trade-off acceptable: even if the data isn't as clean and corrected as data sourced from REST APIs, it allows for more insight into market microstructure and various unusual exchange behaviors that simply can't be captured otherwise. A simple example would be latency spikes on many exchanges during periods of increased volatility, where an exchange publishes trade/order book/quote WebSocket messages with larger than usual latency or simply skips some of the updates and then returns them in one batch. Querying the REST API would result in a nice, clean trade history, but such data wouldn't fully reflect real actionable market behavior and would result in unrealistic backtesting results that break in real-time scenarios.
L2 data (market-by-price) includes bid and ask orders aggregated by price level and can be used to analyze, among other things (see the short example after this list):
order book imbalance
average execution cost
average liquidity away from midpoint
average spread
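As a simple illustration, one common (among many) definition of top-of-book imbalance computed from L2 data - the amounts below are taken from the FTX snapshot example elsewhere on this page:

# Top-of-book imbalance: (bid_amount - ask_amount) / (bid_amount + ask_amount), in [-1, 1].
best_bid_amount = 121.259
best_ask_amount = 8.101

imbalance = (best_bid_amount - best_ask_amount) / (best_bid_amount + best_ask_amount)
print(round(imbalance, 3))  # ~0.875 -> order book heavily skewed to the bid side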
We provide L2 data both in and (top 25 and top 5 levels), as well as in format via , normalized client-side.
L3 data (market-by-order) includes every order book order addition, update, cancellation and match and can be used to analyze among other things:
order resting time
order fill probability
order queue dynamics
Historical L3 data is currently available via API for , and - remaining supported exchanges provide only.
We always collect full depth order book data as long as the exchange's WebSocket API supports it. The table below shows the current state of affairs for each supported exchange.
data is sourced from exchanges' WebSocket APIs when supported, with a fallback to polling REST APIs when WebSocket APIs do not support that data type, and can be accessed via or as .
Yes, we do provide historical options data for and - see CSV data type and and exchange details pages.
We cover all leading derivatives exchanges such as , , , , , , , , , and
A futures contract is a contract that has an expiry date (for example, a quarter ahead for quarterly futures). The futures contract price converges to the spot price as the contract approaches its expiration/settlement date. After a futures contract expires, the exchange settles it and replaces it with a new contract for the next period (the next quarter in our previous example).
A perpetual swap contract, also commonly called "perp", "swap", "perpetual" or "perpetual future" in crypto exchange nomenclature, is very similar to a futures contract, but does not have an expiry date (hence perpetual). In order to ensure that the perpetual swap contract price stays near the spot price, exchanges employ a mechanism called the funding rate. When the funding rate is positive, longs pay shorts. When the funding rate is negative, shorts pay longs. This mechanism can be quite nuanced and vary between exchanges, so it's best to study each contract specification to learn all the details (funding periods, mark price mechanisms etc.).
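As a simplified illustration only (real funding formulas, intervals and mark price definitions differ per exchange and are described in each contract specification):

# Simplified funding payment illustration - not any specific exchange's formula.
position_notional_usd = 10_000   # value of a long position
funding_rate = 0.0001            # +0.01% for this funding interval

payment = position_notional_usd * funding_rate
# Positive rate: longs pay shorts, so this long position pays $1.00 this interval.
print(f"{payment:.2f}")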
We are focusing on providing the best possible tick-level historical data for cryptocurrency exchanges, and as of now our APIs (both and ) offer access to tick-level data only and do not support time-based aggregated data.
If you're interested in time-based aggregated data (OHLC, interval-based order book snapshots), see our that provide such capabilities, with the caveat that data aggregation is performed client-side from tick-level data sourced from the API, meaning it can be a relatively slow process in contrast to ready-to-download aggregated data.
Yes, we're always open to supporting new promising exchanges. and we'll get back to you to discuss the details.
(unified data format for every exchange) is available via our and . Our provides data only in .
Data we provide has contract amounts exactly as provided by exchanges' APIs, meaning in some cases it can be tricky to compare across exchanges due to different contract multipliers (like for example OKEx, where each contract has $100 value) or different contract types (linear or inverse). We'll keep it this way, but we also provide that returns contract multipliers, tick sizes and more for each instrument in a uniform way, allowing you to easily normalize the contract amounts client-side without having to go through all kinds of documentation on various exchanges to find this information.
Cryptocurrency markets are very fragmented and every exchange provides data in its own bespoke data format, which we call exchange-native data format. Our and can provide market data in this format, meaning the data you receive is exactly the same as the live data you would have received from the exchanges ("as-is").
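A minimal sketch of such client-side normalization - the multiplier, contract type and prices below are made-up values for illustration; real ones should come from the instrument metadata described above:

# Hypothetical inverse contract: each contract is worth $100 of notional.
trade_amount_in_contracts = 25
contract_multiplier_usd = 100
price_usd = 20_000.0

notional_usd = trade_amount_in_contracts * contract_multiplier_usd  # 2500 USD
amount_in_base_currency = notional_usd / price_usd                  # 0.125 BTC for an inverse contract
print(notional_usd, amount_in_base_currency)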
For example, a BitMEX trade message looks like this:
and this is a Deribit trade message:
In contrast, normalized data format means the same, unified format across multiple exchanges. We provide normalized data via our (data normalization is performed client-side) as well as via .
Sample normalized trade message:
We support the following normalized data types via our :
tick-by-tick trades
order book L2 updates
order book snapshots (tick-by-tick, 10ms, 100ms, 1s, 10s etc)
quotes
and :
(top 25 and top 5 levels)
What is the channel field used in the HTTP API and client libs replay functions?
Exchanges, when publishing real-time data messages, always publish them for the subscription topics clients have subscribed to. Those subscription topics are also very often called "channels" or "streams" in exchanges' documentation pages and describe the data type a given message belongs to - for example, publishes its trades data via and its order book L2 updates data via .
Since we collect the data for all the channels described in the exchange details pages (), our and offer filtering by those channel names, so for example to get historical trades for , the channel needs to be provided alongside the requested instrument symbols (via HTTP API or client lib replay function args).
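For example, a sketch using the tardis-client Python package to replay BitMEX trades filtered by the exchange-native 'trade' channel - check the package README for the exact replay() signature before relying on this:

import asyncio

from tardis_client import TardisClient, Channel  # pip install tardis-client

async def run():
    client = TardisClient(api_key="YOUR_API_KEY")
    # 'trade' is BitMEX's exchange-native channel name for trades.
    messages = client.replay(
        exchange="bitmex",
        from_date="2019-06-01",
        to_date="2019-06-02",
        filters=[Channel(name="trade", symbols=["XBTUSD"])],
    )
    async for local_timestamp, message in messages:
        print(local_timestamp, message)

asyncio.run(run())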
UTC, always.
We're doing our best to provide the most complete and reliable historical raw data API on the market. To do so, amongst , we utilize on Google Cloud Platform that offer best-in-class availability, networking and monitoring. However, due to exchanges' API downtimes (maintenance, deployments, etc.) we can experience data gaps and cannot guarantee 100% data completeness, but rather 99.9% (99.99% on most days), which should be more than enough for most use cases that tick-level data is useful for.
In rare circumstances, when an exchange's API changes without any notice or we hit new, unexpected rate limits, we may also fail to record data during such a period; it happens very rarely and is very specific to each exchange. Use the API endpoint and check the incidentReports field to get the most detailed and up-to-date information on that subject.
As long as an exchange's WebSocket API is not 'hidden' behind a Cloudflare proxy (causing relatively frequent "CloudFlare WebSocket proxy restarting, Connection reset by peer" errors), connections are stable for the majority of supported exchanges and there are almost no connection drops during the day. When there is more volatility in the market, some exchanges tend to drop connections more frequently or have larger latency spikes. Overall it's a nuanced matter that changes over time; if you have any questions regarding a particular exchange, please do not hesitate to .
Although it should never happen in theory, in practice, due to various crypto exchange bugs and peculiarities, it can happen (very occasionally) - see some posts from users reporting those issues:
We track sequence numbers of WebSocket L2 order book messages when collecting the data and restart the connection when a sequence gap is detected, for exchanges that provide those numbers. We observe that even when sequence numbers are in check, bid/ask overlap can occur. When such a scenario occurs, exchanges tend to 'forget' to publish delete messages for the opposite side of the book when publishing a new level for a given side - we validated that hypothesis by comparing reconstructed order book snapshots that had a crossed book (bid/ask overlap), for which we removed the opposite side's order book levels manually (as the exchange didn't publish that 'delete'), with quote/ticker feeds to check if the best bid/ask matches (for exchanges that provide those) - see .
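One possible client-side workaround, sketched below under the assumption that overlapping levels on the opposite side are stale and should be dropped (mimicking the 'delete' messages the exchange never sent):

def remove_crossed_levels(bids: dict, asks: dict) -> None:
    # Drop ask levels priced at or below the best bid (bids/asks map price -> amount).
    if not bids or not asks:
        return
    best_bid = max(bids)
    for price in [p for p in asks if p <= best_bid]:
        del asks[price]
    # A symmetric pass can drop bid levels at or above the best ask if preferred.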
That shouldn't happen in theory, but we've detected that for some exchanges, when a new connection is established, sometimes the first message for a given channel & symbol has a newer timestamp than a subsequent message, e.g., an order book snapshot has a newer timestamp than the first order book update. This is why we provide data via and for given date ranges ordered by (timestamp of message arrival), which is always monotonically increasing.
Some exchanges occasionally publish duplicated trades (trades with the same ids). Since we collect real-time data, we also collect and provide duplicate trades via if those were published by exchanges' real-time WebSocket feeds. Our have functionality that can deduplicate such trades when working with , and similarly for we deduplicate data.
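A minimal client-side deduplication sketch, assuming trade ids are unique per exchange and symbol (which holds for most, but not all, venues):

seen_trade_keys = set()

def is_duplicate(trade: dict) -> bool:
    # Skip trades whose (exchange, symbol, id) key was already seen.
    key = (trade["exchange"], trade["symbol"], trade["id"])
    if key in seen_trade_keys:
        return True
    seen_trade_keys.add(key)
    return False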
Historical market data available via provides order book snapshots at the beginning of each day (00:00 UTC) - .
We also provide custom order book snapshots with customizable time intervals, from tick-by-tick and milliseconds to minutes or hours, via , in which case custom snapshots are computed client-side from raw data provided via the HTTP API, as well as via - and .
Order books are collected in streaming mode - a snapshot at the beginning of each day and then incremental updates. .
We also provide custom order book snapshots with customizable time intervals, from tick-by-tick and milliseconds to minutes or hours, via , in which case custom snapshots are computed client-side from raw data provided via the HTTP API, as well as via - and .
Cryptocurrency exchanges' real-time APIs vary a lot, but for they all tend to follow a similar flow: first, when the WS connection is established and the subscription is confirmed, exchanges send an initial order book snapshot (all existing price levels or the top 'x' levels, depending on the exchange) and then start streaming 'book update' messages (frequently called deltas as well). Those updates, when applied to the initial snapshot, result in an up-to-date order book state at any given time.
Let's take FTX as an example and start with its orderbook snapshot message (frequently called 'partial' in exchange API docs as well). The remaining bid and ask levels were removed from this sample message for the sake of clarity.
Such a snapshot message maps to the following rows in the CSV file:
... and here's a sample FTX orderbook update message.
Let's see how it maps to CSV format.
See if you have doubts about how to reconstruct order book state based on the data provided in the incremental_book_L2 dataset.
In order to reconstruct full order book state correctly from data (a code sketch of these rules follows below):
For each row in the CSV file (iterate in the same order as provided in the file):
only if the local timestamp of the current row is larger than the previous row's local timestamp (local_timestamp column value) can you read your local order book state as consistent. Why? The CSV format is flat, where each row represents a single price level update, but most exchanges' real-time feeds publish multiple order book level updates via a single WebSocket message, and those need to be processed together before reading the locally maintained order book state. We use the local timestamp value here to detect all price level updates belonging to a single 'update' message.
Alternatively, we also provide ready-to-download order book snapshot CSV datasets.
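A minimal Python sketch of the reconstruction rules described in this section, assuming a gzipped incremental_book_L2 CSV file with the columns shown in the examples on this page (exchange, symbol, timestamp, local_timestamp, is_snapshot, side, price, amount); it is an illustration, not an official implementation:

import csv
import gzip

def replay_book(path: str):
    # Yields (local_timestamp, bids, asks) after each fully applied update message.
    bids, asks = {}, {}
    prev_local_timestamp = None
    prev_is_snapshot = False

    with gzip.open(path, mode="rt") as f:
        for row in csv.DictReader(f):
            local_timestamp = int(row["local_timestamp"])
            is_snapshot = row["is_snapshot"] == "true"

            # Book state is consistent only once all rows sharing the same
            # local_timestamp (i.e., one WebSocket message) have been applied.
            if prev_local_timestamp is not None and local_timestamp > prev_local_timestamp:
                yield prev_local_timestamp, dict(bids), dict(asks)

            # A snapshot row following a non-snapshot row means a connection
            # restart or the start of a new day - reset the local book state.
            if is_snapshot and not prev_is_snapshot:
                bids.clear()
                asks.clear()

            book_side = bids if row["side"] == "bid" else asks
            price, amount = float(row["price"]), float(row["amount"])
            if amount == 0:
                book_side.pop(price, None)   # price level removed
            else:
                book_side[price] = amount    # price level added or updated

            prev_local_timestamp = local_timestamp
            prev_is_snapshot = is_snapshot

    if prev_local_timestamp is not None:
        yield prev_local_timestamp, dict(bids), dict(asks)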
CSV datasets are available in daily intervals, split by exchange, data type and symbol. In addition to standard currency pair/instrument symbols, each exchange also has special depending on whether it supports a given market type: SPOT, FUTURES, OPTIONS and PERPETUALS. That feature is useful if someone is interested in, for example, all Deribit options instruments' trades or quotes data without the need to request data for each symbol separately, one by one.
Each message received via WebSocket connection is timestamped with 100ns precision using at arrival time (before any message processing) and stored in ISO 8601 format.
For it's 15 minutes (T - 15min); for a given day are available on the next day at around 06:00 UTC.
derivative tick info (open interest, funding rate, mark price, index price)
liquidations
options summary
OHLCV
volume/tick based trade bars
hidden interest (i.e., iceberg orders)
real-time, dynamically adjusted
top 1000 levels initial order book snapshot, full depth incremental order book updates
100ms
top 100 levels initial order book snapshot and updates
real-time
top 400 levels initial order book snapshot and updates
real-time
top 400 levels initial order book snapshot and updates
real-time
top 400 levels initial order book snapshot and updates
real-time
top 400 levels initial order book snapshot and updates
real-time
top 150 levels initial order book snapshot and updates
30ms
top 150 levels initial order book snapshot and updates
30ms
top 150 levels initial order book snapshot and updates
30ms
top 150 levels initial order book snapshot and updates
100ms
top 100 levels initial order book snapshot and updates
real-time
top 100 levels initial order book snapshot and updates
real-time
full order book depth snapshot and updates
real-time
full order book depth snapshot and updates
real-time
top 1000 levels initial order book snapshot and updates
real-time
full order book depth snapshot and updates
real-time
full order book depth snapshot and updates
real-time
full order book depth snapshot and updates
real-time
top 25 levels initial order book snapshot and updates
real-time
full order book depth snapshot and updates
real-time
top 15 levels snapshots
real-time
top 30 levels initial order book snapshot and updates
20ms
top 100 levels initial order book snapshot and updates
real-time
top 1000 levels initial order book snapshot, full depth incremental order book updates
100ms
top 20 levels order book snapshots
unknown
top 30 levels order book snapshots
unknown
top 400 levels initial order book snapshot and updates
real-time
full order book depth snapshot and updates
real-time
full order book depth snapshot and updates
real-time
top 1000 levels initial order book snapshot, full depth incremental order book updates
100ms
collected from stream; since 2021-04-27 liquidation order streams do not push real-time order data anymore - instead, they push snapshot order data at a maximum frequency of 1 order push per second
2019-08-01
collected from channel (trades with liquidation flag)
2020-12-18
collected by polling OKEx REST APIs since liquidations aren't available via WS feeds
2020-12-18
collected by polling OKEx REST APIs since liquidations aren't available via WS feeds
2020-06-24
collected from channel
2020-06-24
collected from channel
2019-09-14
collected from channel
2019-03-30
collected from channel (trades with liquidation type)
2020-12-18
up until 2021-09-20 collected by polling Bybit REST APIs since liquidations weren't available via WS feeds; starting from 2021-09-20 collected from channel
derivative tick info (open interest, funding rate, mark price, index price)
liquidations
OHLCV
volume/tick based trade bars
derivative tick info (open interest, funding rate, mark price, index price)
if current row is a part of the snapshot (is_snapshot column value set to true) and previous one was not, reset your local order book state object that tracks price levels for each order book side as it means that there was a connection restart and exchange provided full order book snapshot or it was a start of a new day (each incremental_book_L2 file starts with the snapshot)
if current row amount is set to zero (amount column value set to 0) remove such price level (row's price column) from your local order book state as such price level does not exist anymore
if current row amount is not set to zero update your local order book state price level with new value or add new price level if not exist yet in your local order book state - maintain separately bids and asks order book sides (side column value)
exchange
order book depth
order book updates frequency
full order book depth snapshot and updates
real-time
full order book depth snapshot and updates
real-time
top 1000 levels initial order book snapshot, full depth incremental order book updates
real-time, dynamically adjusted
exchange
available since
notes
2019-03-30
collected from WS liquidation channel
2019-03-30
collected from WS trade channel (trades with liquidation flag)
2019-11-17
collected from WS forceOrder stream; since 2021-04-27 liquidation order streams do not push real-time order data anymore - instead, they push snapshot order data at a maximum frequency of 1 order push per second
exchange,symbol,timestamp,local_timestamp,is_snapshot,side,price,amount
ftx,ETH/USD,1601510401216632,1601510401316432,true,ask,359.8,8.101
ftx,ETH/USD,1601510401216632,1601510401316432,true,bid,359.72,121.259
exchange,symbol,timestamp,local_timestamp,is_snapshot,side,price,amount
ftx,ETH/USD,1601510427184054,1601510427204046,false,ask,360.24,4.962
ftx,ETH/USD,1601510427184054,1601510427204036,false,ask,361.02,0
top 1000 levels initial order book snapshot, full depth incremental order book updates
2020-06-16
{
"table": "trade",
"action": "insert",
"data": [
{
"timestamp": "2019-06-01T00:03:11.589Z",
"symbol": "ETHUSD",
"side": "Sell",
"size": 10,
"price": 268.7,
"tickDirection": "ZeroMinusTick",
"trdMatchID": "ebc230d9-0b6e-2d5d-f99a-f90109a2b113",
"grossValue": 268700,
"homeNotional": 0.08555051758063137,
"foreignNotional": 22.987424073915648
}
]
}
{
"jsonrpc": "2.0",
"method": "subscription",
"params": {
"channel": "trades.ETH-26JUN20.raw",
"data": [
{
"trade_seq": 18052,
"trade_id": "ETH-10813935",
"timestamp": 1577836825724,
"tick_direction": 0,
"price": 132.65,
"instrument_name": "ETH-26JUN20",
"index_price": 128.6,
"direction": "buy",
"amount": 1.0
}
]
}
}
{
"type": "trade",
"symbol": "XBTUSD",
"exchange": "bitmex",
"id": "282a0445-0e3a-abeb-f403-11003204ea1b",
"price": 7996,
"amount": 50,
"side": "sell",
"timestamp": "2019-10-23T10:32:49.669Z",
"localTimestamp": "2019-10-23T10:32:49.740Z"
}
{
"channel": "orderbook",
"market": "ETH/USD",
"type": "partial",
"data": {
"time": 1601510401.2166328,
"checksum": 204980439,
"bids": [
[
359.72,
121.259
]
],
"asks": [
[
359.8,
8.101
]
],
"action": "partial"
}
}
{
"channel": "orderbook",
"market": "ETH/USD",
"type": "update",
"data": {
"time": 1601510427.1840546,
"checksum": 1377242400,
"bids": [],
"asks": [
[
360.24,
4.962
],
[
361.02,
0
]
],
"action": "update"
}
}
Select the data plan and access type that you're interested in via the order form on the Tardis.dev website.
available data plans
available access types
Subscription
Proceed to checkout where you provide email address and payment details
accepted payment methods
Credit Cards (Mastercard, Visa, Maestro, American Express, Discover, Diners Club, JCB, UnionPay)
Successfully complete your payment and receive the API key via email, which allows you to and . The API key is valid as long as the subscription is active, or 6 months for one-off purchases.
We do provide discounts in a transparent form via .
A one-off purchase provides access to specific time periods of historical market data. The API key is valid for a year from purchase and allows access to (trades, order books etc.) for the ordered date ranges both via and .
The subscription-based access model relies on recurring payments at regular intervals (monthly, quarterly, yearly) and offers access to newly collected market data as it becomes available, as well as , whose range depends on the chosen billing period.
There are three 'dimensions' you can customize your subscription by:
Subscription type - or
Data plan (which exchanges data you get access to) - , , or
Billing interval (how much of historical data you get access to) -
For example "" Business Subscription with allows accessing all available existing historical data via API and CSV files and one year of new data as it becomes available (for initial payment). API key is valid as long as subscription is active and allows access to (trades, orders book data, quotes, funding, liquidations etc.) via and (for subscriptions types only).
Yes, depending on chosen billing period, subscriptions include access to existing historical market data as well:
all available historical data if the subscription is billed yearly - historical market data is available since 2019-03-30 for the majority of the supported exchanges ( for the exact date for a particular exchange)
12 months of historical data if the subscription is billed quarterly, e.g., a subscription that started on 2020-04-01 includes access to historical data since 2019-04-01 - it's not a rolling time window, but a fixed starting date from which historical data is available for your subscription
4 months of historical data if the subscription is billed monthly
All subscriptions provide access to (trades, order book data, quotes, funding etc.) via and (for pro and business subscription types).
"Individual" data plan provides per-exchange access to market data that includes full feed (all instruments) and data types of selected exchange(s), for example full exchange data feed.
"Individual" data plan allows access to (trades, orders book data, quotes, funding etc.) via and (for pro and business subscriptions types). Range of historical data access for "Individual" data plan depends on (for example: access to all existing historical data we collected if subscription is billed yearly).
"Perpetuals" data plan provides access to the following perpetual swaps instruments' market data (over 500 perpetual swaps instruments across 13 exchanges):
: all perpetual swaps instruments
: all perpetual swaps instruments
: all perpetual swaps instruments
"Perpetuals" data plan allows access to (trades, orders book data, funding etc.) via and (for pro and business subscriptions types). Range of historical data access for "Perpetuals" data plan depends on (for example: access to all existing historical data we collected if subscription is billed yearly).
"Derivatives" data plan provides access to the following derivatives exchanges' market data:
: all exchange's instruments
: all exchange's instruments
: all exchange's instruments
"Derivatives" data plan allows access to (trades, orders book data, quotes, funding etc.) via and (for pro and business subscriptions types).
Range of historical data access for "Derivatives" data plan depends on (for example: access to all existing historical data we collected if subscription is billed yearly).
"All Exchanges" data plan provides access to market data of (30+ leading spot and derivatives exchanges, ).
"All Exchanges" data plan allows access to (trades, orders book data, quotes, funding, liquidations etc.) for and theirs instruments/currency pairs via and (for pro and business subscriptions types).
Range of historical data access for "All Exchanges" data plan depends on (for example: access to all existing historical data we collected if subscription is billed yearly).
describing which plan you'd like to change to and we'll handle the rest.
We offer invoicing for customers paying over $6000 for data access. Simply use our and the "PAY THROUGH INVOICING" button.
Alternatively, with the order details you're interested in (data plan, billing period) and we'll send you back an invoice that, once paid, will give you access to the data.
Yes, please use our and the "REQUEST QUOTATION" button. Alternatively, with the order details you're interested in (data plan, billing period) and we'll send you back a quotation document in no time.
After a successful order you'll receive a receipt email from , which is our online reseller & payment processor. Click the button titled "View Receipt" there.
You will be redirected to the receipt page, where you will be able to enter your address details by clicking the "Add address & VAT Number" link.
If you would like to enter a VAT number, select the "This is a business purchase" checkbox to enter the VAT ID if you forgot to enter it during checkout. The tax amount will be refunded within max. 12 hours after it is confirmed by Paddle.
Right click on the screen and click 'Print...' in the context menu
Change the destination to 'Save as PDF'
Click the 'Save' button to save the invoice as a PDF file
Click the link titled "Click here to get a full invoice with address & custom information" provided with the order confirmation email sent by Paddle to get the address and VAT ID of Paddle, who processes our payments. Paddle acts as a reseller and Merchant of Record, so they handle VAT on our behalf.
You need to or [email protected] to request a new invoice. Please provide the email address you bought the subscription with and any extra details that might help.
We do not offer refunds for initial subscription payments and one-off purchases.
If you are on yearly billing and forget to your subscription before the renewal date, within seven days after the renewal date to discuss a refund.
If you’re on a monthly or quarterly billing, please be sure to your subscription before the end date of your current plan as there are no refunds for recurring payments on monthly and quarterly billing plans.
If you'd like to test the service, we offer generous free trials. Simply and we'll set up a test account for you in no time.
In order to cancel your active subscription, use the 'Cancel subscription' link we've sent you in the email together with your API key, or and we'll provide the cancellation link for you. Alternatively you can email ([email protected]), which acts as our reseller and Merchant of Record, including a note of the email address you used to purchase your subscription and your order number.
We accept BTC, ETH and USDT for . and we'll get back to you with details.
In order to update your credit card information, use the 'Update payment method' link we've sent you in the email together with your API key, or and we'll provide that link for you.
Apple Pay (one-off purchases only)
Wire Transfers (for one-off purchases only)
—
—
✓
✓
✓
✓
✓
✓
—
—
✓
✓
Additional API keys
—
—
—
✓
none
none
email (priority)
dedicated
Integration assistance
—
—
—
✓
Vendor onboarding
—
—
—
✓
API keys count
1
1
1
10
—
—
✓ ()
✓
FTX: all perpetual swaps instruments
OKX Swap: all perpetual swaps instruments
Huobi COIN Swaps: all perpetual swaps instruments
Huobi USDT Swaps: all perpetual swaps instruments
bitFlyer: FX_BTC_JPY
Bitfinex Derivatives: all perpetual swaps instruments
Bybit: all perpetual swaps instruments
dYdX: all perpetual swaps instruments
Phemex: all perpetual swaps instruments
Delta: all perpetual swaps instruments
Gate.io Futures: all perpetual swaps instruments
CoinFLEX: all perpetual swaps instruments
WOO X: all perpetual swaps instruments
Ascendex: all perpetual swaps instruments
Crypto.com: all perpetual swaps instruments
FTX: all exchange's instruments
OKX Futures: all exchange's instruments
OKX Swap: all exchange's instruments
OKX Options: all exchange's instruments
Huobi Futures: all exchange's instruments
Huobi COIN Swap: all exchange's instruments
Huobi USDT Swaps: all exchange's instruments
Bitfinex Derivatives: all exchange's instruments
Bybit: all exchange's instruments
dYdX: all exchange's instruments
Phemex: all exchange's instruments
CoinFLEX: all exchange's instruments
Delta: all exchange's instruments
bitFlyer: all exchange's instruments
Gate.io Futures: all exchange's instruments
dYdX: all perpetual swaps instruments
WOO X: all exchange's instruments
Crypto.com: all exchange's instruments
Ascendex: all exchange's instruments
Academic
Solo
Professional
Business
✓
✓
✓
✓
Raw data replay API
(HTTP API /data-feeds)
—
—
✓
✓