Commit c8f5688

Revert "Merge branch 'main' into webui_overhaul"

This reverts commit 631e67c, reversing changes made to 06f4d95.

1 parent: 631e67c
266 files changed: 1,505 additions, 44,406 deletions


.github/workflows/build-dataset.yml

Lines changed: 34 additions & 123 deletions

@@ -69,8 +69,6 @@ jobs:
 
       - name: Build dataset
         run: |
-          # Build the dataset
-          # Output will be deployed to https://api.openfilamentdatabase.org/
           python -m builder.build \
             --version "${{ steps.version.outputs.version }}"
@@ -107,136 +105,53 @@ jobs:
         id: deployment
         uses: actions/deploy-pages@v4
 
-      # Get the last release tag for generating release notes
-      - name: Get last release tag
-        id: last_release
-        run: |
-          LAST_TAG=$(git tag -l 'dataset-v*' --sort=-version:refname | head -n 1)
-          if [ -z "$LAST_TAG" ]; then
-            echo "previous_tag=" >> $GITHUB_OUTPUT
-            echo "No previous dataset release found"
-          else
-            echo "previous_tag=$LAST_TAG" >> $GITHUB_OUTPUT
-            echo "Last release: $LAST_TAG"
-          fi
-
-      # Generate release notes with only changes since last release
-      - name: Generate release notes
-        id: release_notes
-        run: |
-          PREVIOUS_TAG="${{ steps.last_release.outputs.previous_tag }}"
-
-          # Start building release notes
-          cat > release_notes.md << 'EOF'
-          ## Open Filament Database v${{ steps.version.outputs.version }}
-
-          All paths below are relative to the base API URL: **https://api.openfilamentdatabase.org/**
-
-          ### Downloads
-
-          | Format | Description | File |
-          |--------|-------------|------|
-          | **SQLite (Filaments)** | Relational database with full schema | `sqlite/filaments.db` |
-          | **SQLite (Stores)** | Store information database | `sqlite/stores.db` |
-          | **SQLite (compressed)** | XZ compressed SQLite files | `sqlite/*.db.xz` |
-          | **JSON** | Complete dataset in one file | `json/all.json` |
-          | **JSON (compressed)** | Gzipped JSON | `json/all.json.gz` |
-          | **NDJSON** | Newline-delimited JSON for streaming | `json/all.ndjson` |
-          | **CSV** | Multiple CSV files | `csv/` directory |
-
-          ### API Endpoints
-
-          All endpoints are relative to: **https://api.openfilamentdatabase.org/**
-
-          #### Brands & Filaments
-          - `api/v1/index.json` - API root with version and stats
-          - `api/v1/brands/index.json` - List of all brands
-          - `api/v1/brands/{slug}/index.json` - Individual brand with materials
-          - `api/v1/brands/{slug}/materials/{material_slug}/index.json` - Material with filaments
-          - `api/v1/brands/{slug}/materials/{material_slug}/filaments/{filament_slug}/index.json` - Filament with variants
-          - `api/v1/brands/{slug}/materials/{material_slug}/filaments/{filament_slug}/variants/{variant_slug}.json` - Variant details with sizes
-
-          #### Stores
-          - `api/v1/stores/index.json` - List of stores
-          - `api/v1/stores/{slug}.json` - Individual store details
-
-          #### Logos
-          - `api/v1/brands/logo/index.json` - List of all brand logos
-          - `api/v1/brands/logo/{logo_id}.json` - Brand logo metadata
-          - `api/v1/brands/logo/{logo_id}.{ext}` - Brand logo image file
-          - `api/v1/stores/logo/index.json` - List of all store logos
-          - `api/v1/stores/logo/{logo_id}.json` - Store logo metadata
-          - `api/v1/stores/logo/{logo_id}.{ext}` - Store logo image file
-
-          #### Schemas
-          - `api/v1/schemas/index.json` - List of available JSON schemas
-          - `api/v1/schemas/{name}.json` - Individual JSON schema
-
-          ### Direct Downloads
-
-          All paths relative to: **https://api.openfilamentdatabase.org/**
-
-          - `json/all.json` - Complete dataset
-          - `sqlite/filaments.db` - SQLite database (filaments)
-          - `sqlite/stores.db` - SQLite database (stores)
-          - `csv/` - CSV files
-
-          ### Checksums
-
-          See `manifest.json` for SHA256 checksums of all files.
-
-          EOF
-
-          # Add changes section only if there was a previous release
-          if [ -n "$PREVIOUS_TAG" ]; then
-            echo "" >> release_notes.md
-            echo "### Changes Since $PREVIOUS_TAG" >> release_notes.md
-            echo "" >> release_notes.md
-
-            # Verify tag exists before running git log
-            if git rev-parse "$PREVIOUS_TAG" >/dev/null 2>&1; then
-              # Get commit log since last release, using -- to separate paths from revisions
-              if git log "$PREVIOUS_TAG..HEAD" --oneline --no-merges -- data/ stores/ builder/ schemas/ > commits.txt 2>/dev/null; then
-                if [ -s commits.txt ]; then
-                  # Group changes by type
-                  echo "#### Recent Updates" >> release_notes.md
-                  echo "" >> release_notes.md
-                  while read line; do
-                    echo "- $line" >> release_notes.md
-                  done < commits.txt
-                else
-                  echo "- Infrastructure and maintenance updates" >> release_notes.md
-                fi
-              else
-                echo "- Unable to retrieve commit history" >> release_notes.md
-              fi
-            else
-              echo "- First release or previous tag not found" >> release_notes.md
-            fi
-          fi
-
-          # Set output
-          echo "notes_file=release_notes.md" >> $GITHUB_OUTPUT
-
       # Create Release
       - name: Create Release
         uses: softprops/action-gh-release@v2
         with:
           tag_name: dataset-v${{ steps.version.outputs.version }}
           name: Dataset v${{ steps.version.outputs.version }}
-          body_path: release_notes.md
+          body: |
+            ## Open Filament Database v${{ steps.version.outputs.version }}
+
+            ### Downloads
+
+            | Format | Description | File |
+            |--------|-------------|------|
+            | **SQLite** | Relational database with full schema | `filaments.db` |
+            | **SQLite (compressed)** | XZ compressed SQLite | `filaments.db.xz` |
+            | **JSON** | Complete dataset in one file | `all.json` |
+            | **JSON (compressed)** | Gzipped JSON | `all.json.gz` |
+            | **NDJSON** | Newline-delimited JSON for streaming | `all.ndjson` |
+            | **CSV** | Multiple CSV files | `csv/` directory |
+
+            ### Endpoints
+
+            - `/api/v1/brands/{slug}/index.json` - Individual brand with filaments
+            - `/api/v1/stores/index.json` - List of stores
+            - `/api/v1/stores/{slug}.json` - Individual store
+            - `/api/v1/schemas/index.json` - List of available JSON schemas
+            - `/api/v1/schemas/{name}.json` - Individual JSON schema
+
+            ### Direct Downloads
+
+            - `/json/all.json` - Complete dataset
+            - `/sqlite/filaments.db` - SQLite database
+            - `/csv/` - CSV files
+
+            ### Checksums
+
+            See `manifest.json` for SHA256 checksums of all files.
           files: |
             dist/json/all.json
             dist/json/all.json.gz
             dist/json/all.ndjson
             dist/sqlite/filaments.db
             dist/sqlite/filaments.db.xz
-            dist/sqlite/stores.db
-            dist/sqlite/stores.db.xz
            dist/manifest.json
          draft: false
          prerelease: false
-          generate_release_notes: false
+          generate_release_notes: true
 
       # Validate the build
       validate:
@@ -258,16 +173,12 @@ jobs:
 
       - name: Validate SQLite
         run: |
-          echo "Validating SQLite databases..."
-          echo "Filaments database:"
+          echo "Validating SQLite database..."
           sqlite3 dist/sqlite/filaments.db "SELECT COUNT(*) FROM brand;"
           sqlite3 dist/sqlite/filaments.db "SELECT COUNT(*) FROM filament;"
           sqlite3 dist/sqlite/filaments.db "SELECT COUNT(*) FROM variant;"
           sqlite3 dist/sqlite/filaments.db "SELECT COUNT(*) FROM size;"
-          echo ""
-          echo "Stores database:"
-          sqlite3 dist/sqlite/stores.db "SELECT COUNT(*) FROM store;"
-          echo "✓ SQLite databases are valid"
+          echo "✓ SQLite database is valid"
 
       - name: Test SQLite queries
         run: |
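The workflow's "Validate SQLite" step boils down to running `COUNT(*)` against each core table and failing if any query errors out. The same smoke check can be reproduced locally; a minimal sketch in Python's stdlib `sqlite3`, using the table names (`brand`, `filament`, `variant`, `size`) from the diff above — the fixture schema here is purely illustrative, not the exporter's real schema:

```python
import os
import sqlite3
import tempfile

def validate_filaments_db(path):
    """Mirror the workflow's 'Validate SQLite' step: COUNT(*) every core table."""
    counts = {}
    con = sqlite3.connect(path)
    try:
        for table in ("brand", "filament", "variant", "size"):
            # Raises sqlite3.OperationalError if a table is missing, just as
            # the workflow's sqlite3 CLI call would fail the job.
            counts[table] = con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    finally:
        con.close()
    return counts

# Illustrative throwaway fixture -- the real schema is produced by the builder.
fd, db_path = tempfile.mkstemp(suffix=".db")
os.close(fd)
con = sqlite3.connect(db_path)
con.executescript("""
    CREATE TABLE brand   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE filament(id INTEGER PRIMARY KEY, brand_id INTEGER);
    CREATE TABLE variant (id INTEGER PRIMARY KEY, filament_id INTEGER);
    CREATE TABLE size    (id INTEGER PRIMARY KEY, variant_id INTEGER);
    INSERT INTO brand(name) VALUES ('Example Brand');
""")
con.commit()
con.close()

counts = validate_filaments_db(db_path)
os.remove(db_path)
```

Returning the counts (rather than just passing/failing) also lets a caller assert minimum row counts, which the CLI version cannot do.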

.github/workflows/update_profiles.yaml

Lines changed: 0 additions & 5 deletions

@@ -4,14 +4,9 @@ on:
   schedule:
     - cron: "0 0 * * *"
   workflow_dispatch:
-  push:
-    branches:
-      - main
 
 jobs:
   update_slicer_profiles:
-    # Only run on main branch, not on PRs
-    if: github.ref == 'refs/heads/main' && github.event_name != 'pull_request'
     runs-on: ubuntu-latest
     permissions:
       contents: write

README.md

Lines changed: 12 additions & 31 deletions

@@ -35,52 +35,33 @@ git clone https://github.com/YOUR_USERNAME/open-filament-database.git
 cd open-filament-database
 ```
 ### 5. Make your changes!
-Use the web editor (recommended) or edit files manually:
-
-**Using the WebUI (Recommended):**
+Either use the web editor by simply running these commands or [following the guide](docs/webui.md), if you want to do it manually you can [use this one](docs/manual.md)
 ```bash
 cd webui
 npm ci
 npm run dev
 ```
-Then access it in your browser at http://localhost:5173
-
-The WebUI includes built-in validation and data sorting features to help ensure your changes are correct. [Full WebUI guide](docs/webui.md)
-
-**Manual editing:** If you prefer to edit files directly, [follow this guide](docs/manual.md)
-
-### 6. Validate and sort your changes
-The WebUI can validate and sort your data automatically:
-
-1. Click the "Validate" button in the top-right corner to check for errors
-2. Click the "Sort Data" button to organize your JSON files consistently
-3. Fix any validation errors that appear (they'll be highlighted in red)
+and access it in your browser at http://localhost:5173
 
-Alternatively, you can use the command-line validation scripts ([see guide](docs/validation.md)):
+### 6. Validate your changes
+Once you've finished modifying the database you can use these commands or [this guide](docs/validation.md) to make sure your data is correct, fix any errors that pop up
 ```bash
-python data_validator.py --folder-names  # Validates folder names
-python data_validator.py --logo-files    # Validates logo files
-python data_validator.py --json-files    # Validates JSON files
-python data_validator.py --store-ids     # Validates store IDs
+python data_validator.py --folder-names  # Validates folder names.
+python data_validator.py --logo-files    # Validates logo files.
+python data_validator.py --json-files    # Validates json files.
+python data_validator.py --store-ids     # Validates store ids.
 ```
 ### 7. Submit your changes
-Before submitting, make sure your data is sorted consistently:
-- **In the WebUI:** Click the "Sort Data" button in the top-right corner
-- **Or via command line:** Run `python scripts/sort_data.py`
-
-Then add your changes:
+Start by running this command to add all your changes up
 ```bash
 git add .
 ```
-
-Create a commit with a descriptive message (e.g., "Added Elegoo Red PLA variant"):
+Then run this command but replace `COMMIT_MESSAGE` with a title of what you did, e.g. "Added filament A to brand B"
 ```bash
 git commit -m "COMMIT_MESSAGE"
 ```
-
-Upload your changes to GitHub:
+When that's done you can run this command to upload your stuff
 ```bash
 git push -u origin YOUR_BRANCHNAME
 ```
-
-Finally, make a pull request [using this guide](docs/pull-requesting.md)
+Afterwards you can make a pull request [using this guide](docs/pull-requesting.md)
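The "Sort Data" step this revert drops from the README exists so that JSON files are serialized consistently and diffs stay small. A minimal sketch of that idea, assuming plain stdlib `json` round-tripping with sorted keys and two-space indentation — the project's actual `scripts/sort_data.py` may normalize differently:

```python
import json
import tempfile
from pathlib import Path

def sort_json_file(path):
    """Rewrite a JSON file with sorted keys and stable two-space indentation.

    Returns True only when the file's contents actually changed, so a CI
    check can fail if any file was not already normalized."""
    original = path.read_text(encoding="utf-8")
    normalized = json.dumps(json.loads(original), indent=2, sort_keys=True,
                            ensure_ascii=False) + "\n"
    if normalized == original:
        return False
    path.write_text(normalized, encoding="utf-8")
    return True

# Demo on a throwaway file with unsorted keys.
demo = Path(tempfile.mkdtemp()) / "filament.json"
demo.write_text('{"name": "Example PLA", "brand": "Example Brand"}')
changed = sort_json_file(demo)
sorted_text = demo.read_text(encoding="utf-8")
```

Because the function is idempotent, running it a second time reports no change — which is exactly the property a pre-submit sort step needs.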

builder/build.py

Lines changed: 19 additions & 27 deletions

@@ -5,7 +5,7 @@
 This script crawls the data directory, normalizes all entities,
 and exports them to multiple formats:
 - JSON (all.json, all.ndjson, per-brand)
-- SQLite databases (filaments.db, stores.db)
+- SQLite database (filaments.db)
 - CSV files
 - Static API (for GitHub Pages)
 - HTML landing page (index.html)
@@ -19,7 +19,7 @@
     --stores-dir DIR     Stores directory (default: stores)
     --version VERSION    Dataset version (default: auto-generated)
     --skip-json          Skip JSON export
-    --skip-sqlite        Skip SQLite export (both filaments and stores)
+    --skip-sqlite        Skip SQLite export
     --skip-csv           Skip CSV export
     --skip-api           Skip static API export
     --skip-html          Skip HTML landing page export
@@ -37,7 +37,7 @@
 
 from builder.crawler import crawl_data
 from builder.errors import BuildResult
-from builder.exporters import export_json, export_sqlite, export_sqlite_stores, export_csv, export_api, export_html
+from builder.exporters import export_json, export_sqlite, export_csv, export_api, export_html
 from builder.utils import get_current_timestamp
 
 
@@ -147,7 +147,6 @@ def main():
     data_dir = project_root / args.data_dir
     stores_dir = project_root / args.stores_dir
     schemas_dir = project_root / "schemas"
-    builder_schemas_dir = project_root / "builder" / "schemas"
     output_dir = project_root / args.output_dir
 
     # Generate version if not provided
@@ -171,51 +170,44 @@ def main():
     build_result = BuildResult()
 
     # Step 1: Crawl data
-    print("\n[1/7] Crawling data...")
+    print("\n[1/6] Crawling data...")
     db, crawl_result = crawl_data(str(data_dir), str(stores_dir))
     build_result.merge(crawl_result)
 
     # Step 2: Export JSON
     if not args.skip_json:
-        print("\n[2/7] Exporting JSON...")
+        print("\n[2/6] Exporting JSON...")
         export_json(db, str(output_dir), version, generated_at)
     else:
-        print("\n[2/7] Skipping JSON export")
+        print("\n[2/6] Skipping JSON export")
 
-    # Step 3: Export SQLite (filaments)
+    # Step 3: Export SQLite
     if not args.skip_sqlite:
-        print("\n[3/7] Exporting SQLite (filaments)...")
+        print("\n[3/6] Exporting SQLite...")
         export_sqlite(db, str(output_dir), version, generated_at)
     else:
-        print("\n[3/7] Skipping SQLite export")
+        print("\n[3/6] Skipping SQLite export")
 
-    # Step 4: Export SQLite (stores)
-    if not args.skip_sqlite:
-        print("\n[4/7] Exporting SQLite (stores)...")
-        export_sqlite_stores(db, str(output_dir), version, generated_at)
-    else:
-        print("\n[4/7] Skipping SQLite stores export")
-
-    # Step 5: Export CSV
+    # Step 4: Export CSV
     if not args.skip_csv:
-        print("\n[5/7] Exporting CSV...")
+        print("\n[4/6] Exporting CSV...")
         export_csv(db, str(output_dir), version, generated_at)
     else:
-        print("\n[5/7] Skipping CSV export")
+        print("\n[4/6] Skipping CSV export")
 
-    # Step 6: Export Static API
+    # Step 5: Export Static API
     if not args.skip_api:
-        print("\n[6/7] Exporting Static API...")
-        export_api(db, str(output_dir), version, generated_at, schemas_dir=str(schemas_dir), builder_schemas_dir=str(builder_schemas_dir), data_dir=str(data_dir), stores_dir=str(stores_dir))
+        print("\n[5/6] Exporting Static API...")
+        export_api(db, str(output_dir), version, generated_at, schemas_dir=str(schemas_dir))
     else:
-        print("\n[6/7] Skipping Static API export")
+        print("\n[5/6] Skipping Static API export")
 
-    # Step 7: Export HTML landing page
+    # Step 6: Export HTML landing page
     if not args.skip_html:
-        print("\n[7/7] Exporting HTML landing page...")
+        print("\n[6/6] Exporting HTML landing page...")
         export_html(db, str(output_dir), version, generated_at, Path(__file__).parent.resolve().joinpath("templates"))
     else:
-        print("\n[7/7] Skipping HTML export")
+        print("\n[6/6] Skipping HTML export")
 
     # Calculate checksums and write manifest
     print("\nGenerating checksums and manifest...")
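After the export steps, `build.py` generates checksums and a manifest, and the release notes point consumers at `manifest.json` for SHA256 sums of all files. The diff does not show that code, so here is only a hedged sketch of what such a step could look like, assuming a flat `{relative_path: sha256}` layout under a `"files"` key (the project's real manifest format may differ):

```python
import hashlib
import json
import tempfile
from pathlib import Path

def write_manifest(dist_dir):
    """Hash every file under dist_dir and write a manifest.json beside them."""
    checksums = {}
    for file in sorted(dist_dir.rglob("*")):
        # Skip directories and the manifest itself (it cannot contain its own hash).
        if file.is_file() and file.name != "manifest.json":
            checksums[file.relative_to(dist_dir).as_posix()] = \
                hashlib.sha256(file.read_bytes()).hexdigest()
    manifest = {"files": checksums}
    (dist_dir / "manifest.json").write_text(
        json.dumps(manifest, indent=2, sort_keys=True) + "\n", encoding="utf-8")
    return manifest

# Demo against a throwaway dist/ layout.
dist = Path(tempfile.mkdtemp())
(dist / "json").mkdir()
(dist / "json" / "all.json").write_text("[]")
manifest = write_manifest(dist)
```

A consumer can then verify a download by recomputing `hashlib.sha256` over the fetched bytes and comparing against the manifest entry.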

builder/exporters/__init__.py

Lines changed: 0 additions & 2 deletions

@@ -4,7 +4,6 @@
 
 from .json_exporter import export_json, export_all_json, export_ndjson, export_per_brand_json
 from .sqlite_exporter import export_sqlite
-from .sqlite_stores_exporter import export_sqlite_stores
 from .csv_exporter import export_csv
 from .api_exporter import export_api
 from .html_exporter import export_html
@@ -15,7 +14,6 @@
     'export_ndjson',
     'export_per_brand_json',
     'export_sqlite',
-    'export_sqlite_stores',
     'export_csv',
     'export_api',
     'export_html',
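Among the names the exporter package keeps re-exporting is `export_ndjson`; NDJSON is simply one compact JSON object per line, which lets consumers stream the dataset without parsing one huge document. A hypothetical minimal exporter in that spirit — the function name, signature, and record shape here are illustrative, not the package's actual API:

```python
import json
import tempfile
from pathlib import Path

def export_ndjson_sketch(records, out_path):
    """Write one compact JSON object per line (newline-delimited JSON)."""
    count = 0
    with out_path.open("w", encoding="utf-8") as fh:
        for record in records:
            # Compact separators and sorted keys keep output deterministic.
            fh.write(json.dumps(record, separators=(",", ":"), sort_keys=True) + "\n")
            count += 1
    return count

# Demo with two illustrative records.
out = Path(tempfile.mkdtemp()) / "all.ndjson"
written = export_ndjson_sketch(
    [{"brand": "Example Brand", "material": "PLA"},
     {"brand": "Example Brand", "material": "PETG"}],
    out,
)
lines = out.read_text(encoding="utf-8").splitlines()
```

Readers can then process the file line by line (`for line in fh: json.loads(line)`) with constant memory, which is the whole point of the format.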
