A modular, future-proof backend for serving Dutch BINAS (vmbo-k, vwo, etc.) data as a modern API.
```mermaid
graph TB
    Client[Client Application] -->|HTTP| API[BINAS API<br/>Express :3001]
    API --> Routes[API Routes]
    Routes --> Data[Data Layer]
    Data --> Raw[Raw Data<br/>kaders.json]
    Data --> Optimized[Optimized Data<br/>kaders_optimized.json]
    Scripts[Optimization Scripts] -->|Process| Raw
    Scripts -->|Generate| Optimized
    API --> Endpoints[Endpoints]
    Endpoints --> E1[/api/kaders<br/>All Kaders]
    Endpoints --> E2[/api/kaders/:id<br/>Single Kader]
    style API fill:#e1f5ff
    style Data fill:#fff4e1
    style Scripts fill:#e8f5e9
```
- API: Node.js + Express backend for serving BINAS tables and formulas.
- Data: JSON files, optimized for machine-readability and extensibility.
- Scripts: Tools for optimizing and converting BINAS data.
```
.
├── binas-api/                    # Express backend
│   ├── data/
│   │   ├── kaders.json           # Original vmbo-k BINAS data
│   │   └── kaders_optimized.json # Optimized, camelCase, future-proofed
│   ├── index.js                  # Main API server
│   └── package.json
├── data/                         # Source data (can add vwo, havo, etc.)
│   ├── kaders.json
│   ├── havo_vwo_dummy.json
│   └── ...
├── scripts/                      # Data processing scripts
│   └── optimize_kaders.js
├── aliases.sh                    # Shell aliases for multi-repo npm commands
├── push-all.sh                   # Script to git add/commit/push all sub-repos
└── ...                           # Other project folders (frontend, dashboard, etc.)
```
```mermaid
sequenceDiagram
    participant User
    participant API as BINAS API
    participant Data as Data Files
    User->>API: GET /api/kaders
    API->>Data: Read kaders_optimized.json
    Data-->>API: Return all kaders
    API-->>User: JSON Response
    User->>API: GET /api/kaders/:id
    API->>Data: Read kaders_optimized.json
    Data-->>API: Find kader by ID
    API-->>User: Single Kader JSON
```
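The request flow above comes down to a simple in-memory lookup. Here is a minimal sketch of that logic (hypothetical — the real `binas-api/index.js` may differ, and any kader fields beyond `id` and `binasType` are illustrative assumptions):

```javascript
// Sketch of the lookup logic behind GET /api/kaders and
// GET /api/kaders/:id. Sample data is illustrative only.
const sampleKaders = [
  { id: 1, binasType: 'vmbo-k', title: 'Grootheden en eenheden' },
  { id: 2, binasType: 'vmbo-k', title: 'Formules' },
];

// GET /api/kaders — return every kader.
function getAllKaders(kaders) {
  return kaders;
}

// GET /api/kaders/:id — route params arrive as strings, so compare
// via String(); return null when no kader matches.
function getKaderById(kaders, id) {
  return kaders.find((k) => String(k.id) === String(id)) ?? null;
}
```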
- Install dependencies:

  ```sh
  npm install --prefix binas-api
  ```

- Start the API server:

  ```sh
  npm start --prefix binas-api
  ```

  The API runs on http://localhost:3001.

- Endpoints:
  - `GET /api/kaders` — all kaders (with `binasType` distinction)
  - `GET /api/kaders/:id` — single kader by ID
```mermaid
flowchart LR
    Raw[Raw Data<br/>kaders.json] --> Script[optimize_kaders.js]
    Script --> Process[Processing]
    Process --> Transform[Transform to<br/>camelCase]
    Transform --> Validate[Validate Structure]
    Validate --> Optimized[Optimized Data<br/>kaders_optimized.json]
    style Raw fill:#fff4e1
    style Script fill:#e1f5ff
    style Optimized fill:#e8f5e9
```
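The camelCase transform step could be sketched roughly as follows (a hypothetical implementation — the actual `scripts/optimize_kaders.js` may also validate structure and do more):

```javascript
// Convert a single snake_case key to camelCase, e.g. 'binas_type' → 'binasType'.
function toCamelCase(key) {
  return key.replace(/_([a-z])/g, (_, c) => c.toUpperCase());
}

// Recursively rename keys on objects and arrays, leaving values intact.
function camelizeKeys(value) {
  if (Array.isArray(value)) return value.map(camelizeKeys);
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [toCamelCase(k), camelizeKeys(v)])
    );
  }
  return value;
}
```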
```mermaid
graph TB
    Alias[aliases.sh] --> Commands[Shell Aliases]
    Commands --> NRD[nrd<br/>npm run dev]
    Commands --> NRB[nrb<br/>npm run build]
    Commands --> NS[ns<br/>npm start]
    Commands --> NL[nl<br/>npm run lint]
    NRD --> Repos1[All Repos<br/>website, dashboard, id, manager]
    NRB --> Repos1
    NS --> Repos1
    NL --> Repos1
    Push[push-all.sh] --> Git[Git Operations]
    Git --> Add[git add .]
    Git --> Commit[git commit]
    Git --> GitPush[git push]
    Add --> Repos2[All Sub-Repos]
    Commit --> Repos2
    GitPush --> Repos2
    style Alias fill:#e1f5ff
    style Push fill:#fff4e1
    style Repos1 fill:#e8f5e9
    style Repos2 fill:#f3e5f5
```
Defines shell aliases to run npm commands across all sub-repos (website, dashboard, id, manager).

- `nrd`: run `npm run dev` in every repo with a `package.json`
- `nrb`: run `npm run build` in all repos
- `ns`: run `npm start` in all repos
- `nl`: run `npm run lint` in all repos

Usage:

```sh
source aliases.sh
```

Then use `nrd`, `nrb`, `ns`, or `nl` as commands.
Loops through all sub-repos and, if the directory is a git repo, runs:

```sh
git add .
git commit -m "Update project"   # ignores errors if there is nothing to commit
git push
```

Usage:

```sh
bash push-all.sh
```
```mermaid
gantt
    title Digibinas Development Roadmap
    dateFormat YYYY-MM-DD
    section Frontend
    Next.js Webapp       :a1, 2025-02-01, 30d
    Tailwind Styling     :a2, after a1, 10d
    section Data
    Add VWO Data         :b1, 2025-02-15, 15d
    Add HAVO Data        :b2, after b1, 15d
    section API
    Search Endpoints     :c1, 2025-03-01, 10d
    Filtering Endpoints  :c2, after c1, 10d
    section Testing
    API Tests            :d1, 2025-03-15, 10d
    Data Integrity Tests :d2, after d1, 10d
```
- Frontend: Build a Next.js/Tailwind webapp to consume this API.
- Add more data: Add vwo/havo BINAS data and optimize with the script.
- Extend API: Add endpoints for search, filtering, or other BINAS modules.
- Documentation: Document API endpoints and data structure for frontend devs.
- Testing: Add tests for API and data integrity.
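As a starting point for the planned search/filtering endpoints, a filter helper might look like this (purely a sketch: the query parameters `binasType` and `q` are assumptions, not part of the current API):

```javascript
// Hypothetical filter behind a future GET /api/kaders?binasType=vwo&q=energie.
// `binasType` matches exactly; `q` does a case-insensitive full-text match
// over the serialized kader. Both filters are optional.
function filterKaders(kaders, { binasType, q } = {}) {
  return kaders.filter((k) => {
    if (binasType && k.binasType !== binasType) return false;
    if (q && !JSON.stringify(k).toLowerCase().includes(q.toLowerCase())) {
      return false;
    }
    return true;
  });
}
```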
Contributions and feedback welcome!