Using TimeDB API
This notebook demonstrates how to use the TimeDB REST API to read and write time series data.
Note: This example assumes no user authentication (users_table is not created). In production, you would typically use authentication with API keys; the requests below include an X-API-Key header for the hosted instance they were run against.
What we’ll cover:
Setting up the database schema (using SDK - admin task)
Starting the API server
Inserting time series data using the REST API
Reading time series data using the REST API
Updating records using the REST API
[6]:
import timedb as td
import pandas as pd
import requests
import json
from datetime import datetime, timezone, timedelta
from typing import Dict, Any
# API base URL (adjust if your API is running on a different host/port)
API_BASE_URL = "http://127.0.0.1:8000"
# Hosted deployments (uncomment one of these to use it instead of the local server):
# API_BASE_URL = "https://rebase-energy--timedb-api-fastapi-app-dev.modal.run"
# API_BASE_URL = "https://sebaheg--timedb-api-fastapi-app.modal.run"
print("✓ Imports successful")
✓ Imports successful
Part 1: Setup Database Schema
First, we’ll use the SDK to create the database schema. This is typically done once by an administrator. The API cannot create or delete the database schema - this must be done through the SDK or CLI.
[3]:
# Delete the existing schema (optional - only needed if you want to start fresh)
# Comment out the line below to keep an existing database
td.delete()
# Create database schema
td.create()
Creating database schema...
✓ Schema created successfully
Part 2: Start the API Server
Before we can use the API, we need to start the API server.
Note: The API server runs in a blocking manner. In a notebook, we’ll start it in a background thread so we can continue using the notebook.
[3]:
# Start the API server in the background
# This will start the server in a daemon thread so we can continue using the notebook
td.start_api_background()
INFO: Started server process [73573]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Starting API server in background thread on http://127.0.0.1:8000...
Starting TimeDB API server on http://127.0.0.1:8000
API docs available at http://127.0.0.1:8000/docs
Press Ctrl+C to stop the server
INFO: 127.0.0.1:60092 - "GET / HTTP/1.1" 200 OK
✓ API is running
Name: TimeDB API
Version: 0.1.1
Available endpoints:
- read_values: GET /values - Read time series values
- upload_timeseries: POST /upload - Upload time series data (create a new run with values)
- create_series: POST /series - Create a new time series
- list_timeseries: GET /list_timeseries - List all time series (series_id -> series_key mapping)
- update_records: PUT /values - Update existing time series records
✓ API server started successfully
Server running at http://127.0.0.1:8000
API docs available at http://127.0.0.1:8000/docs
[3]:
True
[4]:
# Verify API is running and get API information
td.check_api()
❌ API is not running!
Please start it by running: td.start_api_background()
Or in a terminal: timedb api
[4]:
False
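If td.check_api() reports the server as down right after td.start_api_background(), the background thread may still be binding the port, or API_BASE_URL may point at a remote host rather than the local server. A small polling sketch against the root endpoint (which the logs above show answering GET / with 200 OK) can confirm readiness; the retry count and delay below are arbitrary choices, not part of the SDK:
import time
import requests

def wait_for_api(base_url: str, retries: int = 10, delay: float = 0.5) -> bool:
    """Poll GET / until the API answers with 200 or the retries run out."""
    for _ in range(retries):
        try:
            if requests.get(base_url, timeout=2).status_code == 200:
                return True
        except requests.exceptions.ConnectionError:
            pass  # server thread not accepting connections yet
        time.sleep(delay)
    return False

print("API reachable:", wait_for_api(API_BASE_URL))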
Part 3: Insert Data Using the API
Now let’s create some sample time series data and insert it using the REST API.
[14]:
headers={"Content-Type": "application/json",
"X-API-Key": "OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs"}
[8]:
# First, create the time series using the /create_series endpoint
series_to_create = [
{
"name": "temperature",
"description": "Temperature measurements in Celsius",
"unit": "celsius"
},
{
"name": "humidity",
"description": "Relative humidity percentage",
"unit": "percent"
}
]
created_series = {}
for series_info in series_to_create:
response = requests.post(
f"{API_BASE_URL}/series",
json=series_info,
headers={"Content-Type": "application/json",
"X-API-Key": "OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs"
}
)
response.raise_for_status()
result = response.json()
series_key = series_info["name"]
created_series[series_key] = result["series_id"]
print(f"✓ Created series '{series_key}': {result['series_id']}")
print(f" Message: {result['message']}")
print(f"\n✓ Created {len(created_series)} time series")
# Create sample time series data
base_time = datetime(2025, 1, 1, 0, 0, tzinfo=timezone.utc)
dates = [base_time + timedelta(hours=i) for i in range(24)]
# Prepare request payload for API
# Now we use the series_key from the created series
value_rows = []
for i, date in enumerate(dates):
# Add temperature value using the created series_key
value_rows.append({
"valid_time": date.isoformat(),
"value_key": "temperature", # Use the series_key from created series
"value": 20.0 + i * 0.3 # Temperature rising
})
# Add humidity value using the created series_key
value_rows.append({
"valid_time": date.isoformat(),
"value_key": "humidity", # Use the series_key from created series
"value": 60.0 - i * 0.5 # Humidity decreasing
})
# Note: workflow_id defaults to "api-workflow" if not provided
create_run_request = {
"run_start_time": datetime.now(timezone.utc).isoformat(),
"value_rows": value_rows
}
print(f"\nPrepared {len(value_rows)} value rows to insert")
print(f"Time range: {dates[0]} to {dates[-1]}")
print(f"Series: {', '.join(created_series.keys())}")
✓ Created series 'temperature': 49c45b07-7910-4f85-add4-1d77bf80d487
Message: Series created successfully
✓ Created series 'humidity': 79e63af8-b0e0-4372-b0db-cee9b975c600
Message: Series created successfully
✓ Created 2 time series
Prepared 48 value rows to insert
Time range: 2025-01-01 00:00:00+00:00 to 2025-01-01 23:00:00+00:00
Series: temperature, humidity
3.1: Upload the Data
Now let’s upload the time series data using the series_keys from the series we just created.
[10]:
# Upload data via API
response = requests.post(
f"{API_BASE_URL}/upload",
json=create_run_request,
headers={"Content-Type": "application/json",
"X-API-Key": "OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs"
}
)
response.raise_for_status()
result = response.json()
print(f"✓ Created run with ID: {result['run_id']}")
print(f" Message: {result['message']}")
print(f"\nSeries IDs returned:")
for series_key, series_id in result['series_ids'].items():
print(f" {series_key}: {series_id}")
# Store run_id and series_ids for later use
run_id = result['run_id']
series_ids = result['series_ids'] # Maps series_key -> series_id
✓ Created run with ID: c5c7b465-1a9f-4605-a801-c51b8de0e73e
Message: Run created successfully
Series IDs returned:
temperature: 4b7f4c69-6898-406e-9d6d-78d7a9096214
humidity: 715583c0-1d9e-4012-b938-013d0f5dc563
3.2: List All Time Series
After uploading data, you can list all available time series to get the series_id -> series_key mapping. This is useful for subsequent API calls.
[13]:
# List all time series
response = requests.get(f"{API_BASE_URL}/list_timeseries",
headers={"Content-Type": "application/json",
"X-API-Key": "OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs"
})
response.raise_for_status()
timeseries_list = response.json()
print(f"✓ Found {len(timeseries_list)} time series")
print("\nSeries information:")
for series_id, series_info in timeseries_list.items():
print(f" {series_id}:")
print(f" Series Key: {series_info['series_key']}")
print(f" Description: {series_info.get('description', 'N/A')}")
print(f" Unit: {series_info['unit']}")
# Store for later use
all_series_ids = timeseries_list
✓ Found 2 time series
Series information:
715583c0-1d9e-4012-b938-013d0f5dc563:
Series Key: humidity
Description: None
Unit: dimensionless
4b7f4c69-6898-406e-9d6d-78d7a9096214:
Series Key: temperature
Description: None
Unit: dimensionless
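Because the response maps each series_id to its metadata, a small inversion gives the series_key -> series_id lookup that later calls (for example the update payload in Part 6) need. A sketch based only on the structure printed above:
# Invert the /list_timeseries response: series_key -> series_id
key_to_id = {
    info["series_key"]: series_id
    for series_id, info in timeseries_list.items()
}
print(key_to_id)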
Part 4: Read Data Using the API
Let’s read the time series data we just inserted using the API.
[15]:
# Read data via API
# Note: Since we're not using authentication, we can read all data
params = {
"start_valid": base_time.isoformat(),
"end_valid": (base_time + timedelta(hours=24)).isoformat(),
"mode": "flat", # "flat" returns latest known_time per valid_time, "overlapping" returns all revisions
"all_versions": False # Set to True to include historical versions
}
response = requests.get(f"{API_BASE_URL}/values", params=params, headers=headers)
response.raise_for_status()
data = response.json()
print(f"✓ Retrieved {data['count']} records via API")
# Convert to DataFrame for easier viewing
if data['count'] > 0:
df_api = pd.DataFrame(data['data'])
# Convert ISO strings back to datetime
df_api['valid_time'] = pd.to_datetime(df_api['valid_time'])
print("\nFirst few rows:")
print(df_api.head(10))
print(f"\nDataFrame shape: {df_api.shape}")
print(f"Columns: {list(df_api.columns)}")
else:
print("No data found")
✓ Retrieved 48 records via API
First few rows:
valid_time series_id value \
0 2025-01-01 00:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.0
1 2025-01-01 00:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 60.0
2 2025-01-01 01:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.3
3 2025-01-01 01:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.5
4 2025-01-01 02:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.6
5 2025-01-01 02:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.0
6 2025-01-01 03:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.9
7 2025-01-01 03:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.5
8 2025-01-01 04:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 21.2
9 2025-01-01 04:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.0
series_key series_unit
0 temperature dimensionless
1 humidity dimensionless
2 temperature dimensionless
3 humidity dimensionless
4 temperature dimensionless
5 humidity dimensionless
6 temperature dimensionless
7 humidity dimensionless
8 temperature dimensionless
9 humidity dimensionless
DataFrame shape: (48, 5)
Columns: ['valid_time', 'series_id', 'value', 'series_key', 'series_unit']
[16]:
df_api
[16]:
| | valid_time | series_id | value | series_key | series_unit |
|---|---|---|---|---|---|
| 0 | 2025-01-01 00:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 20.0 | temperature | dimensionless |
| 1 | 2025-01-01 00:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 60.0 | humidity | dimensionless |
| 2 | 2025-01-01 01:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 20.3 | temperature | dimensionless |
| 3 | 2025-01-01 01:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 59.5 | humidity | dimensionless |
| 4 | 2025-01-01 02:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 20.6 | temperature | dimensionless |
| 5 | 2025-01-01 02:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 59.0 | humidity | dimensionless |
| 6 | 2025-01-01 03:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 20.9 | temperature | dimensionless |
| 7 | 2025-01-01 03:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 58.5 | humidity | dimensionless |
| 8 | 2025-01-01 04:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 21.2 | temperature | dimensionless |
| 9 | 2025-01-01 04:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 58.0 | humidity | dimensionless |
| 10 | 2025-01-01 05:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 21.5 | temperature | dimensionless |
| 11 | 2025-01-01 05:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 57.5 | humidity | dimensionless |
| 12 | 2025-01-01 06:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 21.8 | temperature | dimensionless |
| 13 | 2025-01-01 06:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 57.0 | humidity | dimensionless |
| 14 | 2025-01-01 07:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 22.1 | temperature | dimensionless |
| 15 | 2025-01-01 07:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 56.5 | humidity | dimensionless |
| 16 | 2025-01-01 08:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 22.4 | temperature | dimensionless |
| 17 | 2025-01-01 08:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 56.0 | humidity | dimensionless |
| 18 | 2025-01-01 09:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 22.7 | temperature | dimensionless |
| 19 | 2025-01-01 09:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 55.5 | humidity | dimensionless |
| 20 | 2025-01-01 10:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 23.0 | temperature | dimensionless |
| 21 | 2025-01-01 10:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 55.0 | humidity | dimensionless |
| 22 | 2025-01-01 11:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 23.3 | temperature | dimensionless |
| 23 | 2025-01-01 11:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 54.5 | humidity | dimensionless |
| 24 | 2025-01-01 12:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 23.6 | temperature | dimensionless |
| 25 | 2025-01-01 12:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 54.0 | humidity | dimensionless |
| 26 | 2025-01-01 13:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 23.9 | temperature | dimensionless |
| 27 | 2025-01-01 13:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 53.5 | humidity | dimensionless |
| 28 | 2025-01-01 14:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 24.2 | temperature | dimensionless |
| 29 | 2025-01-01 14:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 53.0 | humidity | dimensionless |
| 30 | 2025-01-01 15:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 24.5 | temperature | dimensionless |
| 31 | 2025-01-01 15:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 52.5 | humidity | dimensionless |
| 32 | 2025-01-01 16:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 24.8 | temperature | dimensionless |
| 33 | 2025-01-01 16:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 52.0 | humidity | dimensionless |
| 34 | 2025-01-01 17:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 25.1 | temperature | dimensionless |
| 35 | 2025-01-01 17:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 51.5 | humidity | dimensionless |
| 36 | 2025-01-01 18:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 25.4 | temperature | dimensionless |
| 37 | 2025-01-01 18:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 51.0 | humidity | dimensionless |
| 38 | 2025-01-01 19:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 25.7 | temperature | dimensionless |
| 39 | 2025-01-01 19:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 50.5 | humidity | dimensionless |
| 40 | 2025-01-01 20:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 26.0 | temperature | dimensionless |
| 41 | 2025-01-01 20:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 50.0 | humidity | dimensionless |
| 42 | 2025-01-01 21:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 26.3 | temperature | dimensionless |
| 43 | 2025-01-01 21:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 49.5 | humidity | dimensionless |
| 44 | 2025-01-01 22:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 26.6 | temperature | dimensionless |
| 45 | 2025-01-01 22:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 49.0 | humidity | dimensionless |
| 46 | 2025-01-01 23:00:00+00:00 | 4b7f4c69-6898-406e-9d6d-78d7a9096214 | 26.9 | temperature | dimensionless |
| 47 | 2025-01-01 23:00:00+00:00 | 715583c0-1d9e-4012-b938-013d0f5dc563 | 48.5 | humidity | dimensionless |
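For plotting or quick side-by-side comparison, this long-format frame can be pivoted to one column per series; a small sketch using only the columns returned by GET /values in flat mode (where each (valid_time, series_key) pair is unique):
# Pivot to wide format: one column per series_key, indexed by valid_time
df_wide = df_api.pivot(index="valid_time", columns="series_key", values="value")
print(df_wide.head())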
4.1: Read with Different Modes
The API supports two query modes:
“flat”: Returns the latest version of each (valid_time, series_id) combination
“overlapping”: Returns all forecast revisions, showing how predictions evolve over time
Let’s try the overlapping mode:
[8]:
# Read in overlapping mode to see all forecast revisions
params_overlapping = {
"start_valid": base_time.isoformat(),
"end_valid": (base_time + timedelta(hours=6)).isoformat(), # Smaller range for clarity
"mode": "overlapping", # This mode shows all known_time revisions
"all_versions": False
}
response = requests.get(f"{API_BASE_URL}/values", params=params_overlapping)
response.raise_for_status()
data_overlapping = response.json()
print(f"✓ Retrieved {data_overlapping['count']} records in overlapping mode")
if data_overlapping['count'] > 0:
df_overlapping = pd.DataFrame(data_overlapping['data'])
df_overlapping['valid_time'] = pd.to_datetime(df_overlapping['valid_time'])
if 'known_time' in df_overlapping.columns:
df_overlapping['known_time'] = pd.to_datetime(df_overlapping['known_time'])
print("\nFirst few rows (showing forecast revisions):")
print(df_overlapping.head(10))
✓ Retrieved 12 records in overlapping mode
First few rows (showing forecast revisions):
known_time valid_time \
0 2026-01-01 00:31:51.638418+00:00 2025-01-01 00:00:00+00:00
1 2026-01-01 00:31:51.638418+00:00 2025-01-01 00:00:00+00:00
2 2026-01-01 00:31:51.638418+00:00 2025-01-01 01:00:00+00:00
3 2026-01-01 00:31:51.638418+00:00 2025-01-01 01:00:00+00:00
4 2026-01-01 00:31:51.638418+00:00 2025-01-01 02:00:00+00:00
5 2026-01-01 00:31:51.638418+00:00 2025-01-01 02:00:00+00:00
6 2026-01-01 00:31:51.638418+00:00 2025-01-01 03:00:00+00:00
7 2026-01-01 00:31:51.638418+00:00 2025-01-01 03:00:00+00:00
8 2026-01-01 00:31:51.638418+00:00 2025-01-01 04:00:00+00:00
9 2026-01-01 00:31:51.638418+00:00 2025-01-01 04:00:00+00:00
series_id value series_key series_unit
0 27813f08-4d13-41fc-aa59-aa88c2d293a9 60.0 humidity dimensionless
1 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.0 temperature dimensionless
2 27813f08-4d13-41fc-aa59-aa88c2d293a9 59.5 humidity dimensionless
3 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.3 temperature dimensionless
4 27813f08-4d13-41fc-aa59-aa88c2d293a9 59.0 humidity dimensionless
5 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.6 temperature dimensionless
6 27813f08-4d13-41fc-aa59-aa88c2d293a9 58.5 humidity dimensionless
7 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.9 temperature dimensionless
8 27813f08-4d13-41fc-aa59-aa88c2d293a9 58.0 humidity dimensionless
9 8cf187d5-808a-4734-9b8f-c940fe83d6de 21.2 temperature dimensionless
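Each revision in overlapping mode carries its own known_time, so pivoting on it shows how the values for a given valid_time evolve across runs. A sketch assuming the columns shown above (with only one run inserted so far, there is a single known_time column per series):
# One row per valid_time, one column per (series, revision)
revisions = df_overlapping.pivot_table(
    index="valid_time",
    columns=["series_key", "known_time"],
    values="value",
)
print(revisions.head())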
Part 5: Insert More Data
Let’s insert another run with updated values to demonstrate how the API handles multiple runs.
[9]:
# Create new time series data for a second run
new_base_time = datetime(2025, 1, 2, 0, 0, tzinfo=timezone.utc)
new_dates = [new_base_time + timedelta(hours=i) for i in range(12)]
# Prepare request payload for a new run
value_rows_new = []
for i, date in enumerate(new_dates):
# Add temperature value (updated forecast)
value_rows_new.append({
"valid_time": date.isoformat(),
"value_key": "temperature",
"value": 25.0 + i * 0.2 # Different values than first run
})
# Add humidity value (updated forecast)
value_rows_new.append({
"valid_time": date.isoformat(),
"value_key": "humidity",
"value": 50.0 - i * 0.3 # Different values than first run
})
# Note: workflow_id defaults to "api-workflow" if not provided
create_run_request_new = {
"run_start_time": datetime.now(timezone.utc).isoformat(),
"value_rows": value_rows_new
}
print(f"Prepared {len(value_rows_new)} value rows for second run")
print(f"Time range: {new_dates[0]} to {new_dates[-1]}")
# Insert the new run via the upload endpoint
response = requests.post(
    f"{API_BASE_URL}/upload",
    json=create_run_request_new,
    headers=headers
)
response.raise_for_status()
result_new = response.json()
print(f"\n✓ Created second run with ID: {result_new['run_id']}")
print(f" Message: {result_new['message']}")
Prepared 24 value rows for second run
Time range: 2025-01-02 00:00:00+00:00 to 2025-01-02 11:00:00+00:00
[ ]:
# Read the newly inserted data
params_new = {
"start_valid": new_base_time.isoformat(),
"end_valid": (new_base_time + timedelta(hours=12)).isoformat(),
"mode": "flat"
}
response = requests.get(f"{API_BASE_URL}/values", params=params_new)
response.raise_for_status()
data_new = response.json()
print(f"✓ Retrieved {data_new['count']} records for the new time range")
if data_new['count'] > 0:
df_new = pd.DataFrame(data_new['data'])
df_new['valid_time'] = pd.to_datetime(df_new['valid_time'])
print("\nData from second run:")
print(df_new.head(10))
Part 6: Update Records Using the API
The API supports updating existing records. To update a record, you need:
- run_id: The run that created the record
- tenant_id: The tenant ID (defaults to the all-zeros UUID if not authenticated)
- valid_time: The time the value is valid for
- series_id: The series identifier
Let’s demonstrate updating a record. First, we need to get the series_id for our series.
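For reference, the PUT /values body is a list of updates keyed by exactly those four fields; the identifiers below are placeholders for illustration, not real values:
# Shape of an update request (placeholder identifiers for illustration)
example_update_request = {
    "updates": [
        {
            "run_id": "<run_id from the insert>",
            "tenant_id": "00000000-0000-0000-0000-000000000000",  # default when unauthenticated
            "valid_time": "2025-01-01T00:00:00+00:00",
            "series_id": "<series_id of the target series>",
            "value": 22.5,
            "annotation": "Updated via API",  # optional
        }
    ]
}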
[ ]:
# Get series_id from the read response (GET /values includes it in each record)
# Let's read a record to see what fields are available
params_for_update = {
"start_valid": base_time.isoformat(),
"end_valid": (base_time + timedelta(hours=1)).isoformat(),
"mode": "flat"
}
response = requests.get(f"{API_BASE_URL}/values", params=params_for_update)
response.raise_for_status()
data_for_update = response.json()
if data_for_update['count'] > 0:
# Get the first record
first_record = data_for_update['data'][0]
print("Sample record structure:")
print(f" valid_time: {first_record.get('valid_time', 'N/A')}")
print(f" series_key: {first_record.get('series_key', 'N/A')}")
print(f" series_id: {first_record.get('series_id', 'N/A')}")
print(f" value: {first_record.get('value', 'N/A')}")
# Build a series_key -> series_id mapping from the records we just read
import uuid as uuid_lib
series_mapping = {}
for record in data_for_update['data']:
series_key = record.get('series_key')
series_id_str = record.get('series_id')
if series_key and series_id_str:
series_mapping[series_key] = uuid_lib.UUID(series_id_str)
print(f"\nSeries mapping: {series_mapping}")
# Now we can create an update request
# Default tenant_id for non-authenticated requests
default_tenant_id = "00000000-0000-0000-0000-000000000000"
update_request = {
"updates": [
{
"run_id": run_id, # From our first insert
"tenant_id": default_tenant_id,
"valid_time": base_time.isoformat(),
"series_id": str(series_mapping.get("temperature", "")),
"value": 22.5, # Update the temperature value
"annotation": "Updated via API" # Add an annotation
}
]
}
print(f"\nUpdating record:")
print(f" run_id: {run_id}")
print(f" valid_time: {base_time.isoformat()}")
print(f" series: temperature")
print(f" new value: 22.5")
# Send update request
response = requests.put(
    f"{API_BASE_URL}/values",
    json=update_request,
    headers=headers
)
response.raise_for_status()
update_result = response.json()
print(f"\n✓ Update result:")
print(f" Updated: {len(update_result['updated'])} records")
print(f" Skipped (no-op): {len(update_result['skipped_no_ops'])} records")
if update_result['updated']:
print(f"\nUpdated record:")
for updated in update_result['updated']:
print(f" value_id: {updated.get('value_id', 'N/A')}")
else:
print("No records found to update")
Part 7: Verify the Update
Let’s read the data again to verify the update was applied.
[ ]:
# Read the updated record
params_verify = {
"start_valid": base_time.isoformat(),
"end_valid": (base_time + timedelta(hours=1)).isoformat(),
"mode": "flat",
"all_versions": True # Include all versions to see the update
}
response = requests.get(f"{API_BASE_URL}/values", params=params_verify)
response.raise_for_status()
data_verify = response.json()
if data_verify['count'] > 0:
df_verify = pd.DataFrame(data_verify['data'])
df_verify['valid_time'] = pd.to_datetime(df_verify['valid_time'])
# Filter for temperature at the updated time
temp_records = df_verify[
(df_verify['series_key'] == 'temperature') &
(df_verify['valid_time'] == base_time)
]
print(f"✓ Found {len(temp_records)} version(s) of the temperature record")
print("\nAll versions (showing update history):")
print(temp_records[['valid_time', 'series_key', 'value', 'changed_by', 'change_time']].head())
# Show the current value
if len(temp_records) > 0:
current = temp_records.iloc[-1] # Latest version
print(f"\nCurrent value: {current['value']}")
if 'annotation' in current and pd.notna(current['annotation']):
print(f"Annotation: {current['annotation']}")
else:
print("No records found")
Summary
This notebook demonstrated how to use the TimeDB REST API to:
Start the API server - Required before making API calls
Insert time series data - Using the ``POST /upload`` endpoint
Read time series data - Using the ``GET /values`` endpoint with different modes
Update records - Using the ``PUT /values`` endpoint
Key API Endpoints:
``GET /`` - API information and available endpoints
``POST /series`` - Create a new time series
``POST /upload`` - Create a new run with time series values
``GET /values`` - Read time series values (supports ``flat`` and ``overlapping`` modes)
``PUT /values`` - Update existing time series records
``GET /list_timeseries`` - List all time series
Query Modes:
``flat``: Returns the latest version of each (valid_time, series_id) combination
``overlapping``: Returns all forecast revisions, showing how predictions evolve over time
Authentication:
This example assumes no authentication (users_table not created)
In production, you would:
Create users_table using the SDK: td.create_with_users()
Create users via CLI or SDK (with a tenant_id)
Use API keys in requests: headers={"X-API-Key": "your-api-key"}
Users can only access data for their own tenant_id (see the sketch below)
Starting the API Server:
The API server can be started in several ways:
Using the SDK in a notebook (as shown in this notebook):
import timedb as td

# Start in background thread (non-blocking)
td.start_api_background()

# Check if server is running
if td.check_api():
    print("API is running")
Using the SDK directly (blocking - use in a separate terminal/process):
import timedb as td

td.start_api()  # Blocks until Ctrl+C
Using the CLI:
timedb api --host 127.0.0.1 --port 8000
Using uvicorn directly:
uvicorn timedb.api:app --host 127.0.0.1 --port 8000
Note: In this notebook, we use td.start_api_background() which runs the server in a daemon thread. To stop it, simply restart the kernel.