{ "cells": [ { "cell_type": "markdown", "id": "c5066026", "metadata": {}, "source": [ "# Using TimeDB API\n", "\n", "This notebook demonstrates how to use the TimeDB REST API to read and write time series data.\n", "\n", "**Note**: This example assumes no user authentication (users_table is not created). In production, you would typically use authentication with API keys.\n", "\n", "## What we'll cover:\n", "1. Setting up the database schema (using SDK - admin task)\n", "2. Starting the API server\n", "3. Inserting time series data using the REST API\n", "4. Reading time series data using the REST API\n", "5. Updating records using the REST API\n" ] }, { "cell_type": "code", "execution_count": 6, "id": "c951cc83", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Imports successful\n" ] } ], "source": [ "import timedb as td\n", "import pandas as pd\n", "import requests\n", "import json\n", "from datetime import datetime, timezone, timedelta\n", "from typing import Dict, Any\n", "\n", "# API base URL (adjust if your API is running on a different host/port)\n", "API_BASE_URL = \"http://127.0.0.1:8000\"\n", "API_BASE_URL = \"https://rebase-energy--timedb-api-fastapi-app-dev.modal.run\"\n", "API_BASE_URL = \"https://sebaheg--timedb-api-fastapi-app.modal.run\"\n", "print(\"✓ Imports successful\")\n" ] }, { "cell_type": "markdown", "id": "cf1860a3", "metadata": {}, "source": [ "## Part 1: Setup Database Schema\n", "\n", "First, we'll use the SDK to create the database schema. This is typically done once by an administrator. The API cannot create or delete the database schema - this must be done through the SDK or CLI.\n" ] }, { "cell_type": "code", "execution_count": 3, "id": "e7d798a4", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Creating database schema...\n", "✓ Schema created successfully\n" ] } ], "source": [ "# Delete existing schema (optional - only if you want to start fresh)\n", "# Uncomment the line below if you want to start with a clean database\n", "td.delete()\n", "\n", "# Create database schema\n", "td.create()\n" ] }, { "cell_type": "markdown", "id": "cd50c230", "metadata": {}, "source": [ "## Part 2: Start the API Server\n", "\n", "Before we can use the API, we need to start the API server. \n", "\n", "**Note**: The API server runs in a blocking manner. 
In a notebook, we'll start it in a background thread so we can continue using the notebook.\n" ] }, { "cell_type": "code", "execution_count": 3, "id": "5a264e58", "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "INFO: Started server process [73573]\n", "INFO: Waiting for application startup.\n", "INFO: Application startup complete.\n", "INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Starting API server in background thread on http://127.0.0.1:8000...\n", "Starting TimeDB API server on http://127.0.0.1:8000\n", "API docs available at http://127.0.0.1:8000/docs\n", "Press Ctrl+C to stop the server\n", "INFO: 127.0.0.1:60092 - \"GET / HTTP/1.1\" 200 OK\n", "✓ API is running\n", " Name: TimeDB API\n", " Version: 0.1.1\n", "\n", "Available endpoints:\n", " - read_values: GET /values - Read time series values\n", " - upload_timeseries: POST /upload - Upload time series data (create a new run with values)\n", " - create_series: POST /series - Create a new time series\n", " - list_timeseries: GET /list_timeseries - List all time series (series_id -> series_key mapping)\n", " - update_records: PUT /values - Update existing time series records\n", "✓ API server started successfully\n", " Server running at http://127.0.0.1:8000\n", " API docs available at http://127.0.0.1:8000/docs\n" ] }, { "data": { "text/plain": [ "True" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Start the API server in the background\n", "# This will start the server in a daemon thread so we can continue using the notebook\n", "td.start_api_background()\n" ] }, { "cell_type": "code", "execution_count": 4, "id": "93bf1ba9", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "❌ API is not running!\n", " Please start it by running: td.start_api_background()\n", " Or in a terminal: timedb api\n" ] }, { "data": { "text/plain": [ "False" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Verify API is running and get API information\n", "td.check_api()\n" ] }, { "cell_type": "markdown", "id": "4ae14808", "metadata": {}, "source": [ "## Part 3: Insert Data Using the API\n", "\n", "Now let's create some sample time series data and insert it using the REST API.\n" ] }, { "cell_type": "code", "execution_count": 14, "id": "0dbd79a3", "metadata": {}, "outputs": [], "source": [ "headers={\"Content-Type\": \"application/json\",\n", " \"X-API-Key\": \"OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs\"}" ] }, { "cell_type": "code", "execution_count": 8, "id": "efc22941", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Created series 'temperature': 49c45b07-7910-4f85-add4-1d77bf80d487\n", " Message: Series created successfully\n", "✓ Created series 'humidity': 79e63af8-b0e0-4372-b0db-cee9b975c600\n", " Message: Series created successfully\n", "\n", "✓ Created 2 time series\n", "\n", "Prepared 48 value rows to insert\n", "Time range: 2025-01-01 00:00:00+00:00 to 2025-01-01 23:00:00+00:00\n", "Series: temperature, humidity\n" ] } ], "source": [ "# First, create the time series using the /create_series endpoint\n", "series_to_create = [\n", " {\n", " \"name\": \"temperature\",\n", " \"description\": \"Temperature measurements in Celsius\",\n", " \"unit\": \"celsius\"\n", " },\n", " {\n", " \"name\": \"humidity\",\n", " \"description\": \"Relative humidity 
percentage\",\n", " \"unit\": \"percent\"\n", " }\n", "]\n", "\n", "created_series = {}\n", "for series_info in series_to_create:\n", " response = requests.post(\n", " f\"{API_BASE_URL}/series\",\n", " json=series_info,\n", " headers={\"Content-Type\": \"application/json\",\n", " \"X-API-Key\": \"OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs\"\n", " }\n", " )\n", " response.raise_for_status()\n", " result = response.json()\n", " series_key = series_info[\"name\"]\n", " created_series[series_key] = result[\"series_id\"]\n", " print(f\"✓ Created series '{series_key}': {result['series_id']}\")\n", " print(f\" Message: {result['message']}\")\n", "\n", "print(f\"\\n✓ Created {len(created_series)} time series\")\n", "\n", "# Create sample time series data\n", "base_time = datetime(2025, 1, 1, 0, 0, tzinfo=timezone.utc)\n", "dates = [base_time + timedelta(hours=i) for i in range(24)]\n", "\n", "# Prepare request payload for API\n", "# Now we use the series_key from the created series\n", "value_rows = []\n", "for i, date in enumerate(dates):\n", " # Add temperature value using the created series_key\n", " value_rows.append({\n", " \"valid_time\": date.isoformat(),\n", " \"value_key\": \"temperature\", # Use the series_key from created series\n", " \"value\": 20.0 + i * 0.3 # Temperature rising\n", " })\n", " # Add humidity value using the created series_key\n", " value_rows.append({\n", " \"valid_time\": date.isoformat(),\n", " \"value_key\": \"humidity\", # Use the series_key from created series\n", " \"value\": 60.0 - i * 0.5 # Humidity decreasing\n", " })\n", "\n", "# Note: workflow_id defaults to \"api-workflow\" if not provided\n", "create_run_request = {\n", " \"run_start_time\": datetime.now(timezone.utc).isoformat(),\n", " \"value_rows\": value_rows\n", "}\n", "\n", "print(f\"\\nPrepared {len(value_rows)} value rows to insert\")\n", "print(f\"Time range: {dates[0]} to {dates[-1]}\")\n", "print(f\"Series: {', '.join(created_series.keys())}\")\n" ] }, { "cell_type": "markdown", "id": "d68fad34", "metadata": {}, "source": [ "### 3.1: Upload the Data\n", "\n", "Now let's upload the time series data using the series_keys from the series we just created.\n" ] }, { "cell_type": "code", "execution_count": 10, "id": "fab8523c", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Created run with ID: c5c7b465-1a9f-4605-a801-c51b8de0e73e\n", " Message: Run created successfully\n", "\n", "Series IDs returned:\n", " temperature: 4b7f4c69-6898-406e-9d6d-78d7a9096214\n", " humidity: 715583c0-1d9e-4012-b938-013d0f5dc563\n" ] } ], "source": [ "# Upload data via API\n", "response = requests.post(\n", " f\"{API_BASE_URL}/upload\",\n", " json=create_run_request,\n", " headers={\"Content-Type\": \"application/json\",\n", " \"X-API-Key\": \"OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs\"\n", " }\n", ")\n", "response.raise_for_status()\n", "\n", "result = response.json()\n", "print(f\"✓ Created run with ID: {result['run_id']}\")\n", "print(f\" Message: {result['message']}\")\n", "print(f\"\\nSeries IDs returned:\")\n", "for series_key, series_id in result['series_ids'].items():\n", " print(f\" {series_key}: {series_id}\")\n", "\n", "# Store run_id and series_ids for later use\n", "run_id = result['run_id']\n", "series_ids = result['series_ids'] # Maps series_key -> series_id\n" ] }, { "cell_type": "markdown", "id": "5fc21dd0", "metadata": {}, "source": [ "### 3.2: List All Time Series\n", "\n", "After uploading data, you can list all available time series to get the series_id -> 
series_key mapping. This is useful for subsequent API calls.\n" ] }, { "cell_type": "code", "execution_count": 13, "id": "09f69fd7", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Found 2 time series\n", "\n", "Series information:\n", " 715583c0-1d9e-4012-b938-013d0f5dc563:\n", " Series Key: humidity\n", " Description: None\n", " Unit: dimensionless\n", " 4b7f4c69-6898-406e-9d6d-78d7a9096214:\n", " Series Key: temperature\n", " Description: None\n", " Unit: dimensionless\n" ] } ], "source": [ "# List all time series\n", "response = requests.get(f\"{API_BASE_URL}/list_timeseries\",\n", " headers={\"Content-Type\": \"application/json\",\n", " \"X-API-Key\": \"OekXZaRjQDxdQ-3NfX_Y0p7SzgRWSkd2q32VxUkJbTs\"\n", " })\n", "response.raise_for_status()\n", "\n", "timeseries_list = response.json()\n", "print(f\"✓ Found {len(timeseries_list)} time series\")\n", "print(\"\\nSeries information:\")\n", "for series_id, series_info in timeseries_list.items():\n", " print(f\" {series_id}:\")\n", " print(f\" Series Key: {series_info['series_key']}\")\n", " print(f\" Description: {series_info.get('description', 'N/A')}\")\n", " print(f\" Unit: {series_info['unit']}\")\n", "\n", "# Store for later use\n", "all_series_ids = timeseries_list\n" ] }, { "cell_type": "markdown", "id": "ee28ad3c", "metadata": {}, "source": [ "## Part 4: Read Data Using the API\n", "\n", "Let's read the time series data we just inserted using the API.\n" ] }, { "cell_type": "code", "execution_count": 15, "id": "06ea2260", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Retrieved 48 records via API\n", "\n", "First few rows:\n", " valid_time series_id value \\\n", "0 2025-01-01 00:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.0 \n", "1 2025-01-01 00:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 60.0 \n", "2 2025-01-01 01:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.3 \n", "3 2025-01-01 01:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.5 \n", "4 2025-01-01 02:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.6 \n", "5 2025-01-01 02:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.0 \n", "6 2025-01-01 03:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.9 \n", "7 2025-01-01 03:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.5 \n", "8 2025-01-01 04:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 21.2 \n", "9 2025-01-01 04:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.0 \n", "\n", " series_key series_unit \n", "0 temperature dimensionless \n", "1 humidity dimensionless \n", "2 temperature dimensionless \n", "3 humidity dimensionless \n", "4 temperature dimensionless \n", "5 humidity dimensionless \n", "6 temperature dimensionless \n", "7 humidity dimensionless \n", "8 temperature dimensionless \n", "9 humidity dimensionless \n", "\n", "DataFrame shape: (48, 5)\n", "Columns: ['valid_time', 'series_id', 'value', 'series_key', 'series_unit']\n" ] } ], "source": [ "# Read data via API\n", "# Note: Since we're not using authentication, we can read all data\n", "params = {\n", " \"start_valid\": base_time.isoformat(),\n", " \"end_valid\": (base_time + timedelta(hours=24)).isoformat(),\n", " \"mode\": \"flat\", # \"flat\" returns latest known_time per valid_time, \"overlapping\" returns all revisions\n", " \"all_versions\": False # Set to True to include historical versions\n", "}\n", "\n", "response = requests.get(f\"{API_BASE_URL}/values\", params=params, headers=headers)\n", "response.raise_for_status()\n", "\n", "data = 
response.json()\n", "print(f\"✓ Retrieved {data['count']} records via API\")\n", "\n", "# Convert to DataFrame for easier viewing\n", "if data['count'] > 0:\n", " df_api = pd.DataFrame(data['data'])\n", " # Convert ISO strings back to datetime\n", " df_api['valid_time'] = pd.to_datetime(df_api['valid_time'])\n", " print(\"\\nFirst few rows:\")\n", " print(df_api.head(10))\n", " print(f\"\\nDataFrame shape: {df_api.shape}\")\n", " print(f\"Columns: {list(df_api.columns)}\")\n", "else:\n", " print(\"No data found\")\n" ] }, { "cell_type": "code", "execution_count": 16, "id": "0b6c7120", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
valid_timeseries_idvalueseries_keyseries_unit
02025-01-01 00:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621420.0temperaturedimensionless
12025-01-01 00:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56360.0humiditydimensionless
22025-01-01 01:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621420.3temperaturedimensionless
32025-01-01 01:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56359.5humiditydimensionless
42025-01-01 02:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621420.6temperaturedimensionless
52025-01-01 02:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56359.0humiditydimensionless
62025-01-01 03:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621420.9temperaturedimensionless
72025-01-01 03:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56358.5humiditydimensionless
82025-01-01 04:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621421.2temperaturedimensionless
92025-01-01 04:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56358.0humiditydimensionless
102025-01-01 05:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621421.5temperaturedimensionless
112025-01-01 05:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56357.5humiditydimensionless
122025-01-01 06:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621421.8temperaturedimensionless
132025-01-01 06:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56357.0humiditydimensionless
142025-01-01 07:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621422.1temperaturedimensionless
152025-01-01 07:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56356.5humiditydimensionless
162025-01-01 08:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621422.4temperaturedimensionless
172025-01-01 08:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56356.0humiditydimensionless
182025-01-01 09:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621422.7temperaturedimensionless
192025-01-01 09:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56355.5humiditydimensionless
202025-01-01 10:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621423.0temperaturedimensionless
212025-01-01 10:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56355.0humiditydimensionless
222025-01-01 11:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621423.3temperaturedimensionless
232025-01-01 11:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56354.5humiditydimensionless
242025-01-01 12:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621423.6temperaturedimensionless
252025-01-01 12:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56354.0humiditydimensionless
262025-01-01 13:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621423.9temperaturedimensionless
272025-01-01 13:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56353.5humiditydimensionless
282025-01-01 14:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621424.2temperaturedimensionless
292025-01-01 14:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56353.0humiditydimensionless
302025-01-01 15:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621424.5temperaturedimensionless
312025-01-01 15:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56352.5humiditydimensionless
322025-01-01 16:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621424.8temperaturedimensionless
332025-01-01 16:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56352.0humiditydimensionless
342025-01-01 17:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621425.1temperaturedimensionless
352025-01-01 17:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56351.5humiditydimensionless
362025-01-01 18:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621425.4temperaturedimensionless
372025-01-01 18:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56351.0humiditydimensionless
382025-01-01 19:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621425.7temperaturedimensionless
392025-01-01 19:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56350.5humiditydimensionless
402025-01-01 20:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621426.0temperaturedimensionless
412025-01-01 20:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56350.0humiditydimensionless
422025-01-01 21:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621426.3temperaturedimensionless
432025-01-01 21:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56349.5humiditydimensionless
442025-01-01 22:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621426.6temperaturedimensionless
452025-01-01 22:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56349.0humiditydimensionless
462025-01-01 23:00:00+00:004b7f4c69-6898-406e-9d6d-78d7a909621426.9temperaturedimensionless
472025-01-01 23:00:00+00:00715583c0-1d9e-4012-b938-013d0f5dc56348.5humiditydimensionless
\n", "
" ], "text/plain": [ " valid_time series_id value \\\n", "0 2025-01-01 00:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.0 \n", "1 2025-01-01 00:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 60.0 \n", "2 2025-01-01 01:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.3 \n", "3 2025-01-01 01:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.5 \n", "4 2025-01-01 02:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.6 \n", "5 2025-01-01 02:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 59.0 \n", "6 2025-01-01 03:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 20.9 \n", "7 2025-01-01 03:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.5 \n", "8 2025-01-01 04:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 21.2 \n", "9 2025-01-01 04:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 58.0 \n", "10 2025-01-01 05:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 21.5 \n", "11 2025-01-01 05:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 57.5 \n", "12 2025-01-01 06:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 21.8 \n", "13 2025-01-01 06:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 57.0 \n", "14 2025-01-01 07:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 22.1 \n", "15 2025-01-01 07:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 56.5 \n", "16 2025-01-01 08:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 22.4 \n", "17 2025-01-01 08:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 56.0 \n", "18 2025-01-01 09:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 22.7 \n", "19 2025-01-01 09:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 55.5 \n", "20 2025-01-01 10:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 23.0 \n", "21 2025-01-01 10:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 55.0 \n", "22 2025-01-01 11:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 23.3 \n", "23 2025-01-01 11:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 54.5 \n", "24 2025-01-01 12:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 23.6 \n", "25 2025-01-01 12:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 54.0 \n", "26 2025-01-01 13:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 23.9 \n", "27 2025-01-01 13:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 53.5 \n", "28 2025-01-01 14:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 24.2 \n", "29 2025-01-01 14:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 53.0 \n", "30 2025-01-01 15:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 24.5 \n", "31 2025-01-01 15:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 52.5 \n", "32 2025-01-01 16:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 24.8 \n", "33 2025-01-01 16:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 52.0 \n", "34 2025-01-01 17:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 25.1 \n", "35 2025-01-01 17:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 51.5 \n", "36 2025-01-01 18:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 25.4 \n", "37 2025-01-01 18:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 51.0 \n", "38 2025-01-01 19:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 25.7 \n", "39 2025-01-01 19:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 50.5 \n", "40 2025-01-01 20:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 26.0 \n", "41 2025-01-01 20:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 50.0 \n", "42 2025-01-01 21:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 26.3 \n", "43 2025-01-01 21:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 49.5 \n", "44 2025-01-01 22:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 26.6 \n", "45 2025-01-01 22:00:00+00:00 
715583c0-1d9e-4012-b938-013d0f5dc563 49.0 \n", "46 2025-01-01 23:00:00+00:00 4b7f4c69-6898-406e-9d6d-78d7a9096214 26.9 \n", "47 2025-01-01 23:00:00+00:00 715583c0-1d9e-4012-b938-013d0f5dc563 48.5 \n", "\n", " series_key series_unit \n", "0 temperature dimensionless \n", "1 humidity dimensionless \n", "2 temperature dimensionless \n", "3 humidity dimensionless \n", "4 temperature dimensionless \n", "5 humidity dimensionless \n", "6 temperature dimensionless \n", "7 humidity dimensionless \n", "8 temperature dimensionless \n", "9 humidity dimensionless \n", "10 temperature dimensionless \n", "11 humidity dimensionless \n", "12 temperature dimensionless \n", "13 humidity dimensionless \n", "14 temperature dimensionless \n", "15 humidity dimensionless \n", "16 temperature dimensionless \n", "17 humidity dimensionless \n", "18 temperature dimensionless \n", "19 humidity dimensionless \n", "20 temperature dimensionless \n", "21 humidity dimensionless \n", "22 temperature dimensionless \n", "23 humidity dimensionless \n", "24 temperature dimensionless \n", "25 humidity dimensionless \n", "26 temperature dimensionless \n", "27 humidity dimensionless \n", "28 temperature dimensionless \n", "29 humidity dimensionless \n", "30 temperature dimensionless \n", "31 humidity dimensionless \n", "32 temperature dimensionless \n", "33 humidity dimensionless \n", "34 temperature dimensionless \n", "35 humidity dimensionless \n", "36 temperature dimensionless \n", "37 humidity dimensionless \n", "38 temperature dimensionless \n", "39 humidity dimensionless \n", "40 temperature dimensionless \n", "41 humidity dimensionless \n", "42 temperature dimensionless \n", "43 humidity dimensionless \n", "44 temperature dimensionless \n", "45 humidity dimensionless \n", "46 temperature dimensionless \n", "47 humidity dimensionless " ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "df_api" ] }, { "cell_type": "markdown", "id": "a0bee3c9", "metadata": {}, "source": [ "### 4.1: Read with Different Modes\n", "\n", "The API supports two query modes:\n", "- **\"flat\"**: Returns the latest version of each (valid_time, series_id) combination\n", "- **\"overlapping\"**: Returns all forecast revisions, showing how predictions evolve over time\n", "\n", "Let's try the overlapping mode:\n" ] }, { "cell_type": "code", "execution_count": 8, "id": "61fd74b6", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "✓ Retrieved 12 records in overlapping mode\n", "\n", "First few rows (showing forecast revisions):\n", " known_time valid_time \\\n", "0 2026-01-01 00:31:51.638418+00:00 2025-01-01 00:00:00+00:00 \n", "1 2026-01-01 00:31:51.638418+00:00 2025-01-01 00:00:00+00:00 \n", "2 2026-01-01 00:31:51.638418+00:00 2025-01-01 01:00:00+00:00 \n", "3 2026-01-01 00:31:51.638418+00:00 2025-01-01 01:00:00+00:00 \n", "4 2026-01-01 00:31:51.638418+00:00 2025-01-01 02:00:00+00:00 \n", "5 2026-01-01 00:31:51.638418+00:00 2025-01-01 02:00:00+00:00 \n", "6 2026-01-01 00:31:51.638418+00:00 2025-01-01 03:00:00+00:00 \n", "7 2026-01-01 00:31:51.638418+00:00 2025-01-01 03:00:00+00:00 \n", "8 2026-01-01 00:31:51.638418+00:00 2025-01-01 04:00:00+00:00 \n", "9 2026-01-01 00:31:51.638418+00:00 2025-01-01 04:00:00+00:00 \n", "\n", " series_id value series_key series_unit \n", "0 27813f08-4d13-41fc-aa59-aa88c2d293a9 60.0 humidity dimensionless \n", "1 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.0 temperature dimensionless \n", "2 27813f08-4d13-41fc-aa59-aa88c2d293a9 59.5 humidity 
dimensionless \n", "3 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.3 temperature dimensionless \n", "4 27813f08-4d13-41fc-aa59-aa88c2d293a9 59.0 humidity dimensionless \n", "5 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.6 temperature dimensionless \n", "6 27813f08-4d13-41fc-aa59-aa88c2d293a9 58.5 humidity dimensionless \n", "7 8cf187d5-808a-4734-9b8f-c940fe83d6de 20.9 temperature dimensionless \n", "8 27813f08-4d13-41fc-aa59-aa88c2d293a9 58.0 humidity dimensionless \n", "9 8cf187d5-808a-4734-9b8f-c940fe83d6de 21.2 temperature dimensionless \n" ] } ], "source": [ "# Read in overlapping mode to see all forecast revisions\n", "params_overlapping = {\n", " \"start_valid\": base_time.isoformat(),\n", " \"end_valid\": (base_time + timedelta(hours=6)).isoformat(), # Smaller range for clarity\n", " \"mode\": \"overlapping\", # This mode shows all known_time revisions\n", " \"all_versions\": False\n", "}\n", "\n", "response = requests.get(f\"{API_BASE_URL}/values\", params=params_overlapping)\n", "response.raise_for_status()\n", "\n", "data_overlapping = response.json()\n", "print(f\"✓ Retrieved {data_overlapping['count']} records in overlapping mode\")\n", "\n", "if data_overlapping['count'] > 0:\n", " df_overlapping = pd.DataFrame(data_overlapping['data'])\n", " df_overlapping['valid_time'] = pd.to_datetime(df_overlapping['valid_time'])\n", " if 'known_time' in df_overlapping.columns:\n", " df_overlapping['known_time'] = pd.to_datetime(df_overlapping['known_time'])\n", " print(\"\\nFirst few rows (showing forecast revisions):\")\n", " print(df_overlapping.head(10))\n" ] }, { "cell_type": "markdown", "id": "531f8b9e", "metadata": {}, "source": [ "## Part 5: Insert More Data\n", "\n", "Let's insert another run with updated values to demonstrate how the API handles multiple runs.\n" ] }, { "cell_type": "code", "execution_count": 9, "id": "d2786e09", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Prepared 24 value rows for second run\n", "Time range: 2025-01-02 00:00:00+00:00 to 2025-01-02 11:00:00+00:00\n" ] }, { "ename": "HTTPError", "evalue": "404 Client Error: Not Found for url: https://rebase-energy--timedb-api-fastapi-app-dev.modal.run/runs", "output_type": "error", "traceback": [ "\u001b[31m---------------------------------------------------------------------------\u001b[39m", "\u001b[31mHTTPError\u001b[39m Traceback (most recent call last)", "\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[9]\u001b[39m\u001b[32m, line 36\u001b[39m\n\u001b[32m 30\u001b[39m \u001b[38;5;66;03m# Insert the new run\u001b[39;00m\n\u001b[32m 31\u001b[39m response = requests.post(\n\u001b[32m 32\u001b[39m \u001b[33mf\u001b[39m\u001b[33m\"\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mAPI_BASE_URL\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m/runs\u001b[39m\u001b[33m\"\u001b[39m,\n\u001b[32m 33\u001b[39m json=create_run_request_new,\n\u001b[32m 34\u001b[39m headers={\u001b[33m\"\u001b[39m\u001b[33mContent-Type\u001b[39m\u001b[33m\"\u001b[39m: \u001b[33m\"\u001b[39m\u001b[33mapplication/json\u001b[39m\u001b[33m\"\u001b[39m}\n\u001b[32m 35\u001b[39m )\n\u001b[32m---> \u001b[39m\u001b[32m36\u001b[39m \u001b[43mresponse\u001b[49m\u001b[43m.\u001b[49m\u001b[43mraise_for_status\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 38\u001b[39m result_new = response.json()\n\u001b[32m 39\u001b[39m \u001b[38;5;28mprint\u001b[39m(\u001b[33mf\u001b[39m\u001b[33m\"\u001b[39m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[33m✓ Created second run with ID: 
\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mresult_new[\u001b[33m'\u001b[39m\u001b[33mrun_id\u001b[39m\u001b[33m'\u001b[39m]\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m\"\u001b[39m)\n", "\u001b[36mFile \u001b[39m\u001b[32m~/Documents/Github/timedb/.venv/lib/python3.14/site-packages/requests/models.py:1026\u001b[39m, in \u001b[36mResponse.raise_for_status\u001b[39m\u001b[34m(self)\u001b[39m\n\u001b[32m 1021\u001b[39m http_error_msg = (\n\u001b[32m 1022\u001b[39m \u001b[33mf\u001b[39m\u001b[33m\"\u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mself\u001b[39m.status_code\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m Server Error: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mreason\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m for url: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mself\u001b[39m.url\u001b[38;5;132;01m}\u001b[39;00m\u001b[33m\"\u001b[39m\n\u001b[32m 1023\u001b[39m )\n\u001b[32m 1025\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m http_error_msg:\n\u001b[32m-> \u001b[39m\u001b[32m1026\u001b[39m \u001b[38;5;28;01mraise\u001b[39;00m HTTPError(http_error_msg, response=\u001b[38;5;28mself\u001b[39m)\n", "\u001b[31mHTTPError\u001b[39m: 404 Client Error: Not Found for url: https://rebase-energy--timedb-api-fastapi-app-dev.modal.run/runs" ] } ], "source": [ "# Create new time series data for a second run\n", "new_base_time = datetime(2025, 1, 2, 0, 0, tzinfo=timezone.utc)\n", "new_dates = [new_base_time + timedelta(hours=i) for i in range(12)]\n", "\n", "# Prepare request payload for a new run\n", "value_rows_new = []\n", "for i, date in enumerate(new_dates):\n", " # Add temperature value (updated forecast)\n", " value_rows_new.append({\n", " \"valid_time\": date.isoformat(),\n", " \"value_key\": \"temperature\",\n", " \"value\": 25.0 + i * 0.2 # Different values than first run\n", " })\n", " # Add humidity value (updated forecast)\n", " value_rows_new.append({\n", " \"valid_time\": date.isoformat(),\n", " \"value_key\": \"humidity\",\n", " \"value\": 50.0 - i * 0.3 # Different values than first run\n", " })\n", "\n", "# Note: workflow_id defaults to \"api-workflow\" if not provided\n", "create_run_request_new = {\n", " \"run_start_time\": datetime.now(timezone.utc).isoformat(),\n", " \"value_rows\": value_rows_new\n", "}\n", "\n", "print(f\"Prepared {len(value_rows_new)} value rows for second run\")\n", "print(f\"Time range: {new_dates[0]} to {new_dates[-1]}\")\n", "\n", "# Insert the new run\n", "response = requests.post(\n", " f\"{API_BASE_URL}/runs\",\n", " json=create_run_request_new,\n", " headers={\"Content-Type\": \"application/json\"}\n", ")\n", "response.raise_for_status()\n", "\n", "result_new = response.json()\n", "print(f\"\\n✓ Created second run with ID: {result_new['run_id']}\")\n", "print(f\" Message: {result_new['message']}\")\n" ] }, { "cell_type": "code", "execution_count": null, "id": "56edefbc", "metadata": {}, "outputs": [], "source": [ "# Read the newly inserted data\n", "params_new = {\n", " \"start_valid\": new_base_time.isoformat(),\n", " \"end_valid\": (new_base_time + timedelta(hours=12)).isoformat(),\n", " \"mode\": \"flat\"\n", "}\n", "\n", "response = requests.get(f\"{API_BASE_URL}/values\", params=params_new)\n", "response.raise_for_status()\n", "\n", "data_new = response.json()\n", "print(f\"✓ Retrieved {data_new['count']} records for the new time range\")\n", "\n", "if data_new['count'] > 0:\n", " df_new = pd.DataFrame(data_new['data'])\n", " df_new['valid_time'] = pd.to_datetime(df_new['valid_time'])\n", " print(\"\\nData from second 
run:\")\n", " print(df_new.head(10))\n" ] }, { "cell_type": "markdown", "id": "0bdc9501", "metadata": {}, "source": [ "## Part 6: Update Records Using the API\n", "\n", "The API supports updating existing records. To update a record, you need:\n", "- `run_id`: The run that created the record\n", "- `tenant_id`: The tenant ID (defaults to zeros UUID if not authenticated)\n", "- `valid_time`: The time the value is valid for\n", "- `series_id`: The series identifier\n", "\n", "Let's demonstrate updating a record. First, we need to get the series_id for our series.\n" ] }, { "cell_type": "code", "execution_count": null, "id": "140136e2", "metadata": {}, "outputs": [], "source": [ "# Get series_id from the read response (series_id should be in the response)\n", "# For this example, we'll use the SDK to get series_id, or we can extract it from the API response\n", "# Let's read a record to see what fields are available\n", "params_for_update = {\n", " \"start_valid\": base_time.isoformat(),\n", " \"end_valid\": (base_time + timedelta(hours=1)).isoformat(),\n", " \"mode\": \"flat\"\n", "}\n", "\n", "response = requests.get(f\"{API_BASE_URL}/values\", params=params_for_update)\n", "response.raise_for_status()\n", "data_for_update = response.json()\n", "\n", "if data_for_update['count'] > 0:\n", " # Get the first record\n", " first_record = data_for_update['data'][0]\n", " print(\"Sample record structure:\")\n", " print(f\" valid_time: {first_record.get('valid_time', 'N/A')}\")\n", " print(f\" series_key: {first_record.get('series_key', 'N/A')}\")\n", " print(f\" series_id: {first_record.get('series_id', 'N/A')}\")\n", " print(f\" value: {first_record.get('value', 'N/A')}\")\n", " \n", " # For updating, we need to use the SDK to get series_id, or store it when creating runs\n", " # Let's use the SDK to get series_id for demonstration\n", " import uuid as uuid_lib\n", " series_mapping = {}\n", " for record in data_for_update['data']:\n", " series_key = record.get('series_key')\n", " series_id_str = record.get('series_id')\n", " if series_key and series_id_str:\n", " series_mapping[series_key] = uuid_lib.UUID(series_id_str)\n", " \n", " print(f\"\\nSeries mapping: {series_mapping}\")\n", " \n", " # Now we can create an update request\n", " # Default tenant_id for non-authenticated requests\n", " default_tenant_id = \"00000000-0000-0000-0000-000000000000\"\n", " \n", " update_request = {\n", " \"updates\": [\n", " {\n", " \"run_id\": run_id, # From our first insert\n", " \"tenant_id\": default_tenant_id,\n", " \"valid_time\": base_time.isoformat(),\n", " \"series_id\": str(series_mapping.get(\"temperature\", \"\")),\n", " \"value\": 22.5, # Update the temperature value\n", " \"annotation\": \"Updated via API\" # Add an annotation\n", " }\n", " ]\n", " }\n", " \n", " print(f\"\\nUpdating record:\")\n", " print(f\" run_id: {run_id}\")\n", " print(f\" valid_time: {base_time.isoformat()}\")\n", " print(f\" series: temperature\")\n", " print(f\" new value: 22.5\")\n", " \n", " # Send update request\n", " response = requests.put(\n", " f\"{API_BASE_URL}/values\",\n", " json=update_request,\n", " headers={\"Content-Type\": \"application/json\"}\n", " )\n", " response.raise_for_status()\n", " \n", " update_result = response.json()\n", " print(f\"\\n✓ Update result:\")\n", " print(f\" Updated: {len(update_result['updated'])} records\")\n", " print(f\" Skipped (no-op): {len(update_result['skipped_no_ops'])} records\")\n", " \n", " if update_result['updated']:\n", " print(f\"\\nUpdated record:\")\n", " for 
updated in update_result['updated']:\n", " print(f\" value_id: {updated.get('value_id', 'N/A')}\")\n", "else:\n", " print(\"No records found to update\")\n" ] }, { "cell_type": "markdown", "id": "548c4468", "metadata": {}, "source": [ "## Part 7: Verify the Update\n", "\n", "Let's read the data again to verify the update was applied.\n" ] }, { "cell_type": "code", "execution_count": null, "id": "f3b7563c", "metadata": {}, "outputs": [], "source": [ "# Read the updated record\n", "params_verify = {\n", " \"start_valid\": base_time.isoformat(),\n", " \"end_valid\": (base_time + timedelta(hours=1)).isoformat(),\n", " \"mode\": \"flat\",\n", " \"all_versions\": True # Include all versions to see the update\n", "}\n", "\n", "response = requests.get(f\"{API_BASE_URL}/values\", params=params_verify)\n", "response.raise_for_status()\n", "data_verify = response.json()\n", "\n", "if data_verify['count'] > 0:\n", " df_verify = pd.DataFrame(data_verify['data'])\n", " df_verify['valid_time'] = pd.to_datetime(df_verify['valid_time'])\n", " \n", " # Filter for temperature at the updated time\n", " temp_records = df_verify[\n", " (df_verify['series_key'] == 'temperature') & \n", " (df_verify['valid_time'] == base_time)\n", " ]\n", " \n", " print(f\"✓ Found {len(temp_records)} version(s) of the temperature record\")\n", " print(\"\\nAll versions (showing update history):\")\n", " print(temp_records[['valid_time', 'series_key', 'value', 'changed_by', 'change_time']].head())\n", " \n", " # Show the current value\n", " if len(temp_records) > 0:\n", " current = temp_records.iloc[-1] # Latest version\n", " print(f\"\\nCurrent value: {current['value']}\")\n", " if 'annotation' in current and pd.notna(current['annotation']):\n", " print(f\"Annotation: {current['annotation']}\")\n", "else:\n", " print(\"No records found\")\n" ] }, { "cell_type": "markdown", "id": "7d1686e2", "metadata": {}, "source": [ "## Summary\n", "\n", "This notebook demonstrated how to use the TimeDB REST API to:\n", "1. **Start the API server** - Required before making API calls\n", "2. **Insert time series data** - Using the `POST /series` and `POST /upload` endpoints\n", "3. **Read time series data** - Using the `GET /values` endpoint with different modes\n", "4. **Update records** - Using the `PUT /values` endpoint\n", "\n", "### Key API Endpoints:\n", "\n", "- **`GET /`** - API information and available endpoints\n", "- **`POST /series`** - Create a new time series\n", "- **`POST /upload`** - Upload time series data (creates a new run with values)\n", "- **`GET /list_timeseries`** - List all time series (series_id -> series_key mapping)\n", "- **`GET /values`** - Read time series values (supports `flat` and `overlapping` modes)\n", "- **`PUT /values`** - Update existing time series records\n", "\n", "### Query Modes:\n", "\n", "- **`flat`**: Returns the latest version of each (valid_time, series_id) combination\n", "- **`overlapping`**: Returns all forecast revisions, showing how predictions evolve over time\n", "\n", "### Authentication:\n", "\n", "- This example assumes **no authentication** (users_table not created)\n", "- In production, you would:\n", " 1. Create users_table using SDK: `td.create_with_users()`\n", " 2. Create users via CLI or SDK (with tenant_id)\n", " 3. Use API keys in requests: `headers={\"X-API-Key\": \"your-api-key\"}`\n", " 4. Users can only access data for their own tenant_id\n", "\n", "### Starting the API Server:\n", "\n", "The API server can be started in several ways:\n", "\n", "1. 
**Using the SDK in a notebook** (as shown in this notebook):\n", " ```python\n", " import timedb as td\n", " \n", " # Start in background thread (non-blocking)\n", " td.start_api_background()\n", " \n", " # Check if server is running\n", " if td.check_api():\n", " print(\"API is running\")\n", " ```\n", "\n", "2. **Using the SDK directly** (blocking - use in a separate terminal/process):\n", " ```python\n", " import timedb as td\n", " td.start_api() # Blocks until Ctrl+C\n", " ```\n", "\n", "3. **Using the CLI**:\n", " ```bash\n", " timedb api --host 127.0.0.1 --port 8000\n", " ```\n", "\n", "4. **Using uvicorn directly**:\n", " ```bash\n", " uvicorn timedb.api:app --host 127.0.0.1 --port 8000\n", " ```\n", "\n", "**Note**: In this notebook, we use `td.start_api_background()` which runs the server in a daemon thread. To stop it, simply restart the kernel.\n" ] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.14.2" } }, "nbformat": 4, "nbformat_minor": 5 }