## Features
- **Beautiful Web UI**: Modern dark theme with drag-and-drop file upload
- **Cloudflare Access SSO**: Secure your app with enterprise authentication
- **Auto-interpreted headers**: Column names parsed from the CSV's first row
- **Type inference**: Optionally detect INTEGER/REAL/TEXT columns
- **Chunked uploads**: Handle files with 100k+ rows without timeouts
- **Table preview**: View and export data directly in the UI
- **Table management**: List, query, and delete tables
- **Progress tracking**: Real-time upload progress for large files
## What's New in v1.1
### Chunked Upload Support
- Files over 10 MB or 5,000 rows automatically use chunked upload (see the threshold sketch after this list)
- Handles files with 500k+ rows without timeout issues
- Real-time progress tracking during upload
- Cancel button for long-running uploads
- Better error recovery: individual chunk failures don't stop the entire upload
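
As a rough illustration of the first bullet, a client-side check like the following could decide which upload path to use. The constant and function names are illustrative, not the app's actual code:

```typescript
// Illustrative thresholds matching the cutoffs above; the real app's
// constants and function names may differ.
const MAX_SIMPLE_BYTES = 10 * 1024 * 1024; // 10 MB
const MAX_SIMPLE_ROWS = 5_000;

function shouldUseChunkedUpload(file: File, rowCount: number): boolean {
  // Either condition alone is enough to switch to the chunked path.
  return file.size > MAX_SIMPLE_BYTES || rowCount > MAX_SIMPLE_ROWS;
}
```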
### Large File Handling
- Client-side file reading in chunks to avoid browser memory issues (see the sketch after this list)
- Pre-validation of file size and row count
- Warnings for very large files (>500k rows)
- Graceful handling of the `NotReadableError` issue
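
A minimal sketch of the slicing technique referenced above, using only standard Web APIs (`File.slice`, `Blob.arrayBuffer`, `TextDecoder`); the slice size and function name are illustrative, and the app's actual reader may differ:

```typescript
// Read a File in fixed-size slices instead of one FileReader.readAsText() call,
// which is what typically triggers NotReadableError / memory pressure on huge files.
async function* readFileInSlices(file: File, sliceBytes = 4 * 1024 * 1024): AsyncGenerator<string> {
  const decoder = new TextDecoder();
  for (let offset = 0; offset < file.size; offset += sliceBytes) {
    const slice = file.slice(offset, offset + sliceBytes);
    const buffer = await slice.arrayBuffer();
    // stream: true keeps multi-byte characters split across slice boundaries intact.
    yield decoder.decode(buffer, { stream: offset + sliceBytes < file.size });
  }
}
```

Each decoded piece can then be split on line breaks and buffered into row batches for the chunked upload endpoints.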
## Quick Start
### 1. Clone and Install

```bash
git clone <your-repo>
cd csv-to-d1-loader
npm install
```
### 2. Create a D1 Database

```bash
# Create a new database
npx wrangler d1 create csv-import-db

# Note the database ID from the output
```
### 3. Configure wrangler.toml

Update `wrangler.toml` with your database ID:

```toml
[[d1_databases]]
binding = "DB"
database_name = "csv-import-db"
database_id = "YOUR_DATABASE_ID_HERE"
```
### 4. Deploy

```bash
npx wrangler deploy
```
### 5. Visit Your App

Open `https://csv-to-d1-loader.<your-subdomain>.workers.dev` in your browser!
## File Size Limits & Recommendations
| File Size | Row Count | Upload Method | Notes |
|---|---|---|---|
| < 10 MB | < 5,000 | Simple upload | Fast, single request |
| 10-100 MB | 5,000-50,000 | Chunked | Automatic chunking |
| 100+ MB | 50,000-500,000 | Chunked | May take several minutes |
| > 500 MB | > 500,000 | Split file | Consider splitting into multiple files |
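
If a file is too large to upload even in chunks, it can be split locally first. Below is a hedged sketch in Node-flavored TypeScript that writes fixed-size parts while repeating the header in each one; the 100,000-row batch size and the output naming are arbitrary choices, not part of this project:

```typescript
import { createReadStream, createWriteStream } from "node:fs";
import { createInterface } from "node:readline";

// Split input.csv into input_part1.csv, input_part2.csv, ... of rowsPerPart rows each,
// repeating the header line at the top of every part. Assumes the path ends in ".csv".
async function splitCsv(path: string, rowsPerPart = 100_000): Promise<void> {
  const lines = createInterface({ input: createReadStream(path), crlfDelay: Infinity });
  let header: string | undefined;
  let part = 0;
  let rowsInPart = 0;
  let out: ReturnType<typeof createWriteStream> | undefined;

  for await (const line of lines) {
    if (header === undefined) {
      header = line; // first line is the header row
      continue;
    }
    if (out === undefined || rowsInPart >= rowsPerPart) {
      out?.end();
      part += 1;
      rowsInPart = 0;
      out = createWriteStream(path.replace(/\.csv$/, `_part${part}.csv`));
      out.write(header + "\n");
    }
    out.write(line + "\n");
    rowsInPart += 1;
  }
  out?.end();
}
```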
**Cloudflare limits to be aware of:**

- Workers CPU time: 10 ms per request on the free plan, 30 s by default on paid plans (check current limits)
- Request body size: 100 MB maximum
- D1 database size: 10 GB per database (check current limits)
## API Reference
All API endpoints are available under `/api/`:
### Upload CSV (Simple, for Small Files)

```
POST /api/upload?table=<table_name>
```
**Query Parameters:**

| Parameter | Required | Description |
|---|---|---|
| `table` | Yes | Name of the table to create |
| `infer_types` | No | Set to `true` to infer INTEGER/REAL types |
| `replace` | No | Set to `true` to drop the existing table first |
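
A usage sketch for the simple endpoint is shown below; note that sending the raw CSV text as the request body is an assumption on my part, since this section doesn't spell out the expected body format:

```typescript
// Hypothetical usage: the raw-CSV body and Content-Type are assumptions, not documented above.
const csvText = "id,name\n1,Ada\n2,Grace\n";

const response = await fetch(
  "https://csv-to-d1-loader.<your-subdomain>.workers.dev/api/upload?table=people&infer_types=true",
  {
    method: "POST",
    headers: { "Content-Type": "text/csv" },
    body: csvText,
  }
);
console.log(response.status, await response.text());
```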
### Chunked Upload (for Large Files)
The UI handles this automatically, but if using the API directly:
**1. Initialize Upload**

```
POST /api/upload/init
```

```json
{
  "tableName": "my_table",
  "headers": ["col1", "col2", "col3"],
  "inferTypes": false,
  "replace": true
}
```
**2. Upload Chunks**

```
POST /api/upload/chunk
```

Headers:

```
Content-Type: application/json
X-Table-Name: my_table
X-Chunk-Index: 0
```

Body:

```json
{
  "rows": [["val1", "val2", "val3"], ...],
  "columns": ["col1", "col2", "col3"]
}
```
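
Putting the two calls together, a script-driven chunked upload might look like the sketch below. The base URL, the 1,000-row batch size, and the sample data are placeholders, and any finalize step the API may expose is omitted because it isn't covered here:

```typescript
// Sketch of the chunked flow described above; error handling is reduced to a thrown Error.
const BASE_URL = "https://csv-to-d1-loader.<your-subdomain>.workers.dev";
const columns = ["col1", "col2", "col3"];
const rows: string[][] = [/* parsed CSV rows, header excluded */];

// 1. Initialize the upload.
await fetch(`${BASE_URL}/api/upload/init`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ tableName: "my_table", headers: columns, inferTypes: false, replace: true }),
});

// 2. Upload the rows in fixed-size chunks.
const CHUNK_ROWS = 1_000;
for (let i = 0; i < rows.length; i += CHUNK_ROWS) {
  const chunkIndex = i / CHUNK_ROWS;
  const res = await fetch(`${BASE_URL}/api/upload/chunk`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Table-Name": "my_table",
      "X-Chunk-Index": String(chunkIndex),
    },
    body: JSON.stringify({ rows: rows.slice(i, i + CHUNK_ROWS), columns }),
  });
  if (!res.ok) throw new Error(`Chunk ${chunkIndex} failed: ${res.status}`);
}
```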