2 changes: 1 addition & 1 deletion README.md
@@ -141,6 +141,7 @@ from timecopilot import TimeCopilot
# - unique_id: Unique identifier for each time series (string)
# - ds: Date column (datetime format)
# - y: Target variable for forecasting (float format)
# Spark, Ray, and Dask dataframes are accepted and converted to pandas at entry.
# The pandas frequency will be inferred from the ds column, if not provided.
# If the seasonality is not provided, it will be inferred based on the frequency.
# If the horizon is not set, it will default to 2 times the inferred seasonality.
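The schema described in the comments above can be illustrated with a minimal pandas frame (the values below are illustrative, not taken from the project's datasets):

```python
import pandas as pd

# Minimal frame matching the documented input schema:
# unique_id (string), ds (datetime), y (float).
df = pd.DataFrame(
    {
        "unique_id": ["series_1"] * 3,
        "ds": pd.date_range("2024-01-01", periods=3, freq="MS"),
        "y": [112.0, 118.0, 132.0],
    }
)
print(df.dtypes)
```

With monthly-start timestamps like these, the `MS` frequency (and from it the seasonality and default horizon) can be inferred if not provided.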
@@ -323,4 +324,3 @@ Our pre-print paper is [available in arxiv](https://arxiv.org/abs/2509.00616).
}
```


71 changes: 71 additions & 0 deletions docs/examples/dask-dataframe.ipynb
@@ -0,0 +1,71 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Using Dask DataFrames\n",
"\n",
"TimeCopilot converts Dask DataFrames to pandas at entry so you can use them directly.\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"import pandas as pd\n",
"import dask.dataframe as dd\n",
"\n",
"from timecopilot import TimeCopilotForecaster\n",
"from timecopilot.models.stats import SeasonalNaive\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"df = pd.read_csv(\n",
" \"https://timecopilot.s3.amazonaws.com/public/data/air_passengers.csv\",\n",
" parse_dates=[\"ds\"],\n",
")\n",
"dask_df = dd.from_pandas(df, npartitions=1)\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"tcf = TimeCopilotForecaster(models=[SeasonalNaive()])\n",
"fcst_df = tcf.forecast(df=dask_df, h=12, freq=\"MS\")\n",
"fcst_df.head()\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
90 changes: 90 additions & 0 deletions docs/examples/ray-dataframe.ipynb
@@ -0,0 +1,90 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Using Ray Datasets\n",
"\n",
"TimeCopilot converts Ray datasets to pandas at entry so you can use them directly.\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"import pandas as pd\n",
"import ray\n",
"import ray.data as ray_data\n",
"\n",
"from timecopilot import TimeCopilotForecaster\n",
"from timecopilot.models.stats import SeasonalNaive\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"ray.init(ignore_reinit_error=True)\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"df = pd.read_csv(\n",
" \"https://timecopilot.s3.amazonaws.com/public/data/air_passengers.csv\",\n",
" parse_dates=[\"ds\"],\n",
")\n",
"ray_df = ray_data.from_pandas(df)\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"tcf = TimeCopilotForecaster(models=[SeasonalNaive()])\n",
"fcst_df = tcf.forecast(df=ray_df, h=12, freq=\"MS\")\n",
"fcst_df.head()\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"ray.shutdown()\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
89 changes: 89 additions & 0 deletions docs/examples/spark-dataframe.ipynb
@@ -0,0 +1,89 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Using Spark DataFrames\n",
"\n",
"TimeCopilot converts Spark DataFrames to pandas at entry so you can use them directly.\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"import pandas as pd\n",
"from pyspark.sql import SparkSession\n",
"\n",
"from timecopilot import TimeCopilotForecaster\n",
"from timecopilot.models.stats import SeasonalNaive\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"spark = SparkSession.builder.master(\"local[1]\").appName(\"timecopilot\").getOrCreate()\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"df = pd.read_csv(\n",
" \"https://timecopilot.s3.amazonaws.com/public/data/air_passengers.csv\",\n",
" parse_dates=[\"ds\"],\n",
")\n",
"spark_df = spark.createDataFrame(df)\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"tcf = TimeCopilotForecaster(models=[SeasonalNaive()])\n",
"fcst_df = tcf.forecast(df=spark_df, h=12, freq=\"MS\")\n",
"fcst_df.head()\n"
]
},
{
"cell_type": "code",
"metadata": {},
"execution_count": null,
"outputs": [],
"source": [
"spark.stop()\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
15 changes: 14 additions & 1 deletion docs/getting-started/installation.md
@@ -14,10 +14,23 @@ TimeCopilot is available on PyPI as [timecopilot](https://pypi.org/project/timec
uv add timecopilot
```

!!! tip

Optional dataframe dependencies (Spark, Ray, Dask) can be installed with:

```bash
pip install "timecopilot[dataframes]"
```

or

```bash
uv sync --group dataframes
```


Requires Python 3.10 or later.

!!! tip

    If you don't have prior experience with `uv`, see the [uv getting started](https://docs.astral.sh/uv/getting-started/) section.

25 changes: 14 additions & 11 deletions mkdocs.yml
@@ -19,17 +19,20 @@ nav:
- Quickstart: getting-started/quickstart.md
- Installation: getting-started/installation.md
- Examples:
- examples/agent-quickstart.ipynb
- examples/llm-providers.ipynb
- examples/aws-bedrock.ipynb
- examples/google-llms.ipynb
- examples/forecaster-quickstart.ipynb
- examples/anomaly-detection-forecaster-quickstart.ipynb
- examples/ts-foundation-models-comparison-quickstart.ipynb
- examples/gift-eval.ipynb
- examples/chronos-family.ipynb
- examples/cryptocurrency-quickstart.ipynb
- examples/sktime.ipynb
- examples/agent-quickstart.ipynb
- examples/llm-providers.ipynb
- examples/aws-bedrock.ipynb
- examples/google-llms.ipynb
- examples/forecaster-quickstart.ipynb
- examples/anomaly-detection-forecaster-quickstart.ipynb
- examples/ts-foundation-models-comparison-quickstart.ipynb
- examples/gift-eval.ipynb
- examples/chronos-family.ipynb
- examples/cryptocurrency-quickstart.ipynb
- examples/sktime.ipynb
- examples/spark-dataframe.ipynb
- examples/ray-dataframe.ipynb
- examples/dask-dataframe.ipynb
- Experiments:
- experiments/gift-eval.md
- experiments/fev.md
10 changes: 10 additions & 0 deletions pyproject.toml
@@ -25,6 +25,11 @@ docs = [
"modal>=1.0.4",
"ruff>=0.12.1",
]
dataframes = [
"dask[dataframe]>=2024.9.1",
"pyspark>=3.5.1",
"ray[data]>=2.52.1",
]

[project]
authors = [
@@ -74,6 +79,11 @@ dependencies = [
"tsfeatures",
"utilsforecast[plotting]>=0.2.15",
]
optional-dependencies = { dataframes = [
"dask[dataframe]>=2024.9.1",
"pyspark>=3.5.1",
"ray[data]>=2.52.1",
] }
description = "The GenAI Forecasting Agent · LLMs × Time Series Foundation Models"
license = "MIT"
name = "timecopilot"