
Convert DBN to other encoding formats

In this example, we’ll walk through how to convert DBN to CSV, JSON, or Parquet encoding formats.

DBN

DBN is an extremely fast message encoding and storage format for normalized market data. DBN is the default encoding used across our API and client libraries.

Historical data can be requested with timeseries.get_range(). This method returns a DBNStore.

See also

Historical data can also be requested in various encoding formats with batch downloads. See the programmatic batch downloads example for more information.

import databento as db

client = db.Historical("YOUR_API_KEY")

dbn_store = client.timeseries.get_range(
    dataset="EQUS.SUMMARY",
    symbols=["AAPL", "NVDA", "NFLX", "META", "GOOGL"],
    schema="ohlcv-1d",
    start="2025-09-30",
)

You can save this data to a DBN file with DBNStore.to_file(), then read it back into a DBNStore with DBNStore.from_file().

dbn_store.to_file("demo_data.dbn")

dbn_store = db.DBNStore.from_file("demo_data.dbn")

You can inspect the symbology mappings for a DBNStore with the symbology attribute.

dbn_store = db.DBNStore.from_file("demo_data.dbn")

print(dbn_store.symbology)
{"symbols": ["AAPL", "NVDA", "NFLX", "META", "GOOGL"],
 "stype_in": "raw_symbol",
 "stype_out": "instrument_id",
 "start_date": "2025-09-30",
 "end_date": "2025-10-01",
 "partial": [],
 "not_found": [],
 "mappings": {"GOOGL": [{"start_date": datetime.date(2025, 9, 30),
                         "end_date": datetime.date(2025, 10, 1),
                         "symbol": "7152"}],
              "NVDA": [{"start_date": datetime.date(2025, 9, 30),
                        "end_date": datetime.date(2025, 10, 1),
                        "symbol": "11667"}],
              "AAPL": [{"start_date": datetime.date(2025, 9, 30),
                        "end_date": datetime.date(2025, 10, 1),
                        "symbol": "38"}],
              "NFLX": [{"start_date": datetime.date(2025, 9, 30),
                        "end_date": datetime.date(2025, 10, 1),
                        "symbol": "11275"}],
              "META": [{"start_date": datetime.date(2025, 9, 30),
                        "end_date": datetime.date(2025, 10, 1),
                        "symbol": "10451"}]}}
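Because each raw symbol maps to one or more date intervals, the mappings can be inverted to resolve instrument IDs back to raw symbols. A minimal sketch, using a subset of the mappings shown above (dates and IDs copied from the output):

```python
import datetime

# Subset of the "mappings" structure printed above
mappings = {
    "AAPL": [{"start_date": datetime.date(2025, 9, 30),
              "end_date": datetime.date(2025, 10, 1),
              "symbol": "38"}],
    "NVDA": [{"start_date": datetime.date(2025, 9, 30),
              "end_date": datetime.date(2025, 10, 1),
              "symbol": "11667"}],
}

# Build an instrument_id -> raw symbol lookup
id_to_symbol = {
    int(interval["symbol"]): raw_symbol
    for raw_symbol, intervals in mappings.items()
    for interval in intervals
}
print(id_to_symbol)  # {38: 'AAPL', 11667: 'NVDA'}
```

Note that for symbols that remap over time (unlike this single-day example), each interval's `start_date` and `end_date` must also be checked when resolving an ID.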

You can iterate over a DBNStore, which will yield DBN records.

dbn_store = db.DBNStore.from_file("demo_data.dbn")

for record in dbn_store:
    print(record)
OhlcvMsg { hd: RecordHeader { length: 14, rtype: Ohlcv1D, publisher_id: EqusSummaryEqus, instrument_id: 11275, ts_event: 1759190400000000000 }, open: 1206.410000000, high: 1208.500000000, low: 1178.000000000, close: 1198.920000000, volume: 3830304 }
OhlcvMsg { hd: RecordHeader { length: 14, rtype: Ohlcv1D, publisher_id: EqusSummaryEqus, instrument_id: 11667, ts_event: 1759190400000000000 }, open: 182.080000000, high: 187.350000000, low: 181.480000000, close: 186.580000000, volume: 236981032 }
OhlcvMsg { hd: RecordHeader { length: 14, rtype: Ohlcv1D, publisher_id: EqusSummaryEqus, instrument_id: 38, ts_event: 1759190400000000000 }, open: 254.855000000, high: 255.919000000, low: 253.110000000, close: 254.630000000, volume: 37704259 }
OhlcvMsg { hd: RecordHeader { length: 14, rtype: Ohlcv1D, publisher_id: EqusSummaryEqus, instrument_id: 7152, ts_event: 1759190400000000000 }, open: 242.810000000, high: 243.290000000, low: 239.245000000, close: 243.100000000, volume: 34724346 }
OhlcvMsg { hd: RecordHeader { length: 14, rtype: Ohlcv1D, publisher_id: EqusSummaryEqus, instrument_id: 10451, ts_event: 1759190400000000000 }, open: 742.250000000, high: 742.970000000, low: 726.300000000, close: 734.380000000, volume: 16226750 }
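On the wire, DBN encodes prices as fixed-point integers with a 1e-9 scale; the record display above and DBNStore.to_df() apply this scaling for you. A minimal sketch of the conversion, using the AAPL open price from the output above:

```python
# DBN prices are signed integers scaled by 1e-9
raw_open = 254_855_000_000  # AAPL open as stored on the wire

open_price = raw_open / 1e9
print(open_price)  # 254.855
```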

You can convert a DBNStore into a pandas DataFrame with DBNStore.to_df().

dbn_store = db.DBNStore.from_file("demo_data.dbn")

df = dbn_store.to_df()
print(df)
                           rtype  publisher_id  instrument_id      open      high       low    close     volume symbol
ts_event
2025-09-30 00:00:00+00:00     35            90             38   254.855   255.919   253.110   254.63   37704259   AAPL
2025-09-30 00:00:00+00:00     35            90           7152   242.810   243.290   239.245   243.10   34724346  GOOGL
2025-09-30 00:00:00+00:00     35            90          10451   742.250   742.970   726.300   734.38   16226750   META
2025-09-30 00:00:00+00:00     35            90          11275  1206.410  1208.500  1178.000  1198.92    3830304   NFLX
2025-09-30 00:00:00+00:00     35            90          11667   182.080   187.350   181.480   186.58  236981032   NVDA

CSV

See also

Data can also be directly requested in CSV format by setting encoding="csv" in batch.submit_job().

You can save a DBNStore to a CSV file with DBNStore.to_csv().

dbn_store = db.DBNStore.from_file("demo_data.dbn")

dbn_store.to_csv("demo_data.csv")

with open("demo_data.csv", "r") as f:
    print(f.read())
ts_event,rtype,publisher_id,instrument_id,open,high,low,close,volume,symbol
2025-09-30T00:00:00.000000000Z,35,90,38,254.855000000,255.919000000,253.110000000,254.630000000,37704259,AAPL
2025-09-30T00:00:00.000000000Z,35,90,7152,242.810000000,243.290000000,239.245000000,243.100000000,34724346,GOOGL
2025-09-30T00:00:00.000000000Z,35,90,10451,742.250000000,742.970000000,726.300000000,734.380000000,16226750,META
2025-09-30T00:00:00.000000000Z,35,90,11275,1206.410000000,1208.500000000,1178.000000000,1198.920000000,3830304,NFLX
2025-09-30T00:00:00.000000000Z,35,90,11667,182.080000000,187.350000000,181.480000000,186.580000000,236981032,NVDA
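The resulting CSV can then be loaded with standard tools. A minimal sketch with pandas (assumed installed), using the first two rows of the output above as inline sample data:

```python
import io

import pandas as pd

# First two rows of the CSV written by DBNStore.to_csv()
csv_text = """ts_event,rtype,publisher_id,instrument_id,open,high,low,close,volume,symbol
2025-09-30T00:00:00.000000000Z,35,90,38,254.855000000,255.919000000,253.110000000,254.630000000,37704259,AAPL
2025-09-30T00:00:00.000000000Z,35,90,7152,242.810000000,243.290000000,239.245000000,243.100000000,34724346,GOOGL
"""

# Parse ts_event as a timezone-aware datetime index, matching to_df() output
df = pd.read_csv(io.StringIO(csv_text), parse_dates=["ts_event"], index_col="ts_event")
print(df[["symbol", "close"]])
```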

JSON

See also

Data can also be directly requested in JSON format by setting encoding="json" in batch.submit_job().

You can save a DBNStore to a JSON lines (JSONL) file with DBNStore.to_json().

dbn_store = db.DBNStore.from_file("demo_data.dbn")

dbn_store.to_json("demo_data.json")

with open("demo_data.json", "r") as f:
    print(f.read())
{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":38},"open":"254.855000000","high":"255.919000000","low":"253.110000000","close":"254.630000000","volume":"37704259","symbol":"AAPL"}
{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":7152},"open":"242.810000000","high":"243.290000000","low":"239.245000000","close":"243.100000000","volume":"34724346","symbol":"GOOGL"}
{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":10451},"open":"742.250000000","high":"742.970000000","low":"726.300000000","close":"734.380000000","volume":"16226750","symbol":"META"}
{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":11275},"open":"1206.410000000","high":"1208.500000000","low":"1178.000000000","close":"1198.920000000","volume":"3830304","symbol":"NFLX"}
{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":11667},"open":"182.080000000","high":"187.350000000","low":"181.480000000","close":"186.580000000","volume":"236981032","symbol":"NVDA"}
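Since each line is a standalone JSON object, the file can be parsed with the standard library alone. A minimal sketch, using the first two records from the output above as inline sample data:

```python
import json

# First two lines of the JSONL written by DBNStore.to_json()
jsonl_text = (
    '{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":38},"open":"254.855000000","high":"255.919000000","low":"253.110000000","close":"254.630000000","volume":"37704259","symbol":"AAPL"}\n'
    '{"hd":{"ts_event":"2025-09-30T00:00:00.000000000Z","rtype":35,"publisher_id":90,"instrument_id":7152},"open":"242.810000000","high":"243.290000000","low":"239.245000000","close":"243.100000000","volume":"34724346","symbol":"GOOGL"}\n'
)

# Parse one JSON object per line; note prices are serialized as strings
records = [json.loads(line) for line in jsonl_text.splitlines()]
for record in records:
    print(record["symbol"], record["hd"]["instrument_id"], float(record["close"]))
```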

Parquet

You can save a DBNStore to a Parquet file with DBNStore.to_parquet().

dbn_store = db.DBNStore.from_file("demo_data.dbn")

dbn_store.to_parquet("demo_data.parquet")