
Quickstart

Set up Databento

You need an API key to request data from Databento. Sign up and you will automatically receive an API key. Each API key is a 32-character string starting with db-, which can be found on the API keys page of your portal.

Info

Every new account receives $125 in free data credits. These credits allow you to start receiving actual historical data immediately and verify your integration at no cost.
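
If you prefer not to hard-code the key, you can read it from an environment variable instead. The sketch below assumes the key is exported as DATABENTO_API_KEY and uses the Rust client's key_from_env builder method; check the client documentation for your version if the name differs.

use databento::HistoricalClient;

fn main() -> anyhow::Result<()> {
    // Assumes the key was exported beforehand, e.g.
    //   export DATABENTO_API_KEY="db-..."
    // key_from_env reads that variable instead of a hard-coded string.
    let _client = HistoricalClient::builder()
        .key_from_env()?
        .build()?;
    Ok(())
}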

Choose a service

Databento provides two data services: historical and live. These two services are nearly identical, but we separate them due to licensing fees and the differences between request-response and real-time streaming APIs. You can choose to integrate just one or both services.

Historical
  • Coverage: Data from at least 24 hours ago.
  • Pricing: Usage-based or flat-rate. No monthly license fees.
  • Access: Client libraries (C++, Python, and Rust) and API (HTTP).

Live
  • Coverage: Intraday historical data from within the last 24 hours and real-time data.
  • Pricing: Usage-based or flat-rate. Monthly license fees apply.
  • Access: Client libraries (C++, Python, and Rust) and API (Raw).

Reference
  • Coverage: Historical with intraday updates.
  • Pricing: Dataset license fees apply.
  • Access: Client libraries (Python) and API (HTTP).
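
The walkthrough below uses the historical service. For comparison, here is a minimal sketch of streaming trades from the live service with the Rust client; treat the LiveClient, Subscription, and next_record names as assumptions and check the live API reference for exact signatures before relying on them.

use databento::{
    dbn::{SType, Schema, TradeMsg},
    live::Subscription,
    LiveClient,
};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Connect to the live gateway for CME Globex (monthly license fees apply).
    let mut client = LiveClient::builder()
        .key("YOUR_API_KEY")?
        .dataset("GLBX.MDP3")
        .build()
        .await?;
    // Subscribe to trades for all E-mini S&P 500 futures via the parent symbol.
    client
        .subscribe(
            Subscription::builder()
                .symbols("ES.FUT")
                .schema(Schema::Trades)
                .stype_in(SType::Parent)
                .build(),
        )
        .await?;
    // Begin the session and print records as they arrive.
    client.start().await?;
    while let Some(record) = client.next_record().await? {
        if let Some(trade) = record.get::<TradeMsg>() {
            println!("{trade:?}");
        }
    }
    Ok(())
}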

Build your first application

Historical
1. Select how you will be integrating Databento to see installation instructions. Client libraries are available for Python, C++, and Rust; this walkthrough uses the Rust client.

If you don't see an official client library for your preferred language, you can still integrate our historical service through its HTTP API.
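
For example, the historical HTTP API accepts requests from any HTTP client using basic authentication, with your API key as the username and an empty password. The call below is one illustrative endpoint; treat the exact URL and parameters as assumptions and see API reference - Historical for the definitive list.

$ curl -u db-YOUR_API_KEY: 'https://hist.databento.com/v0/metadata.list_datasets'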

2. Add our Rust client to your project's dependencies with cargo.

$ cargo add databento
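
The examples below also rely on tokio (async runtime), anyhow (error handling), and time (the datetime macro). If your project does not already include them, they can be added the same way; the feature flags shown are the ones these examples need.

$ cargo add tokio --features macros,rt-multi-thread
$ cargo add anyhow
$ cargo add time --features macros
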
3. A simple Databento application looks like this.

This replays 10 minutes of trades of the entire CME Globex market event-by-event.

Copy this into your project's src/main.rs, then compile and run the application with cargo run.

use databento::{
    dbn::{Schema, TradeMsg},
    historical::timeseries::GetRangeParams,
    HistoricalClient, Symbols,
};
use time::macros::datetime;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut client = HistoricalClient::builder()
        .key("YOUR_API_KEY")?
        .build()?;
    let mut decoder = client
        .timeseries()
        .get_range(
            &GetRangeParams::builder()
                .dataset("GLBX.MDP3")
                .date_time_range((
                    datetime!(2022-06-10 14:30 UTC),
                    datetime!(2022-06-10 14:40 UTC),
                ))
                .symbols(Symbols::All)
                .schema(Schema::Trades)
                .build(),
        )
        .await?;
    while let Some(trade) =
        decoder.decode_record::<TradeMsg>().await?
    {
        println!("{trade:?}");
    }
    Ok(())
}
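
Each TradeMsg carries its price as a fixed-precision integer with nine implied decimal places and its event time as nanoseconds since the UNIX epoch, so the debug print above shows raw integers. Here is a small sketch of formatting those fields yourself; the 1e-9 price scale follows the DBN encoding, and the field names come from the trades schema.

use databento::dbn::TradeMsg;

// Convert a trade's fixed-precision fields into human-readable values.
// You could call this in place of the println! in the loop above.
fn print_trade(trade: &TradeMsg) {
    // DBN prices carry nine decimal places, i.e. a scale of 1e-9.
    let price = trade.price as f64 / 1e9;
    // hd.ts_event is the event timestamp in nanoseconds since the UNIX epoch.
    println!(
        "ts_event={} price={:.2} size={}",
        trade.hd.ts_event, price, trade.size
    );
}
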
4. You can modify this application to specify particular instruments and schemas.

Let's get ESM2 and NQZ2 data in 1-second OHLCV bars.

use databento::{
    dbn::{OhlcvMsg, Schema},
    historical::timeseries::GetRangeParams,
    HistoricalClient,
};
use time::macros::datetime;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut client = HistoricalClient::builder()
        .key("YOUR_API_KEY")?
        .build()?;
    let mut decoder = client
        .timeseries()
        .get_range(
            &GetRangeParams::builder()
                .dataset("GLBX.MDP3")
                .date_time_range((
                    datetime!(2022-06-10 14:30 UTC),
                    datetime!(2022-06-10 14:40 UTC),
                ))
                .symbols(vec!["ESM2", "NQZ2"])
                .schema(Schema::Ohlcv1S)
                .build(),
        )
        .await?;
    while let Some(bar) =
        decoder.decode_record::<OhlcvMsg>().await?
    {
        println!("{bar:?}");
    }
    Ok(())
}

You have successfully written your first historical data application with Databento! Here are shortcuts to some of the next steps you can take:

  • To download a large amount of data to disk, see how to do a batch download of data files.
  • To get another dataset, just swap the dataset. You can get a list of datasets and their names from our metadata.
  • You can use our symbology API to find other instrument IDs, as sketched after this list.
  • You can also find dataset names and instrument IDs interactively from our search.
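
As a rough illustration of the symbology API from Rust, the sketch below resolves the raw symbols used earlier to their instrument IDs. The ResolveParams builder fields and the symbology() accessor are assumptions modeled on the historical client shown above; consult the symbology reference for the exact signatures.

use databento::{
    dbn::SType,
    historical::symbology::ResolveParams,
    HistoricalClient,
};
use time::macros::date;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut client = HistoricalClient::builder()
        .key("YOUR_API_KEY")?
        .build()?;
    // Map ESM2 and NQZ2 to their CME Globex instrument IDs over a one-day window.
    // Note: the parameter names here are assumptions; see the symbology API reference.
    let resolution = client
        .symbology()
        .resolve(
            &ResolveParams::builder()
                .dataset("GLBX.MDP3")
                .symbols(vec!["ESM2", "NQZ2"])
                .stype_in(SType::RawSymbol)
                .stype_out(SType::InstrumentId)
                .date_range((date!(2022-06-10), date!(2022-06-11)))
                .build(),
        )
        .await?;
    println!("{resolution:?}");
    Ok(())
}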

To learn more, read the full documentation for our historical service under API reference - Historical.