There are three supported paths for getting bulk data into a Table: paste, CSV upload, and the bulk-insert API. Pick by source size and how often the import runs.
Paste from a spreadsheet
For one-off loads, copy rows from any spreadsheet (Excel, Google Sheets, Numbers) and paste directly into the table editor. The editor parses tab-separated values, validates each cell against the column type, and appends the rows below the last existing row.
- Number columns reject non-numeric tokens — the cell is coerced to `0` if the parser cannot recover a number.
- Boolean columns accept `true`/`false` (case-insensitive), `1`/`0`, and `yes`/`no`.
- Date columns accept ISO-8601 (`2026-03-16`) and the editor's display format.
- JSON columns expect raw JSON; non-JSON tokens are stored as strings and surface a validation warning.
Paste is bounded by the table's `maxRows` limit (default 10000).
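The coercion rules above can be sketched as a single per-cell function. This is a hypothetical helper for illustration, not the actual editor code; the type names and fallback behavior are assumptions based on the rules listed.

```typescript
type ColumnType = "number" | "boolean" | "date" | "json" | "text";

// Coerce one pasted cell (always a raw string) according to its column type.
function coerceCell(raw: string, type: ColumnType): unknown {
  const trimmed = raw.trim();
  switch (type) {
    case "number": {
      const n = Number(trimmed);
      return Number.isFinite(n) && trimmed !== "" ? n : 0; // unrecoverable → 0
    }
    case "boolean": {
      const v = trimmed.toLowerCase();
      if (["true", "1", "yes"].includes(v)) return true;
      if (["false", "0", "no"].includes(v)) return false;
      return false; // assumption: fallback value; the editor flags the cell
    }
    case "date": {
      const d = new Date(trimmed);
      // Normalize recognized dates to ISO-8601; keep the raw token otherwise.
      return Number.isNaN(d.getTime()) ? raw : d.toISOString().slice(0, 10);
    }
    case "json": {
      try {
        return JSON.parse(raw);
      } catch {
        return raw; // stored as a string, surfaced as a validation warning
      }
    }
    default:
      return raw;
  }
}
```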
CSV upload
The Tables UI exposes a CSV import dialog. The dialog:
- Reads the CSV header row and offers a column-mapping step (CSV column → Table column).
- Validates every row against the destination schema.
- Inserts in batches.
Rows that fail validation are surfaced in a per-row error report; valid rows are inserted regardless. Use this for moderate-sized imports (thousands of rows). Above `maxRows`, the dialog refuses to start.
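The validate-then-insert flow can be sketched as two small helpers: one that partitions rows into valid rows and a per-row error report, and one that splits the valid rows into batches. The function and field names here are hypothetical; the real logic lives in the Tables UI.

```typescript
interface RowError {
  rowIndex: number; // 0-based index into the parsed CSV rows
  message: string;
}

// Run the validator over every row; invalid rows go to the error report,
// valid rows are kept for insertion (matching the "inserted regardless" rule).
function partitionRows<T>(
  rows: T[],
  validate: (row: T) => string | null, // null = valid, string = error message
): { valid: T[]; errors: RowError[] } {
  const valid: T[] = [];
  const errors: RowError[] = [];
  rows.forEach((row, i) => {
    const message = validate(row);
    if (message === null) valid.push(row);
    else errors.push({ rowIndex: i, message });
  });
  return { valid, errors };
}

// Split valid rows into insert batches of at most batchSize each.
function toBatches<T>(rows: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```

Keeping validation separate from batching means the error report can reference original CSV row indices even after the valid rows are regrouped for insertion.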
Bulk insert via the API
For large imports, scheduled imports, or imports from another system, use the bulk-insert endpoint. Each request inserts up to `TABLE_LIMITS.MAX_BULK_INSERT_ROWS` rows; for larger sets, batch the calls.
```bash
curl -X POST \
  "https://your-actana.example.com/api/v1/tables/${TABLE_ID}/rows/bulk" \
  -H "Authorization: Bearer ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
    "rows": [
      { "data": { "name": "Acme", "score": 92, "vip": true } },
      { "data": { "name": "Globex", "score": 78, "vip": false } }
    ]
  }'
```

The same shape is exposed by the Table block under the bulk-insert operation — point a workflow at any upstream source (HTTP fetch, S3 read, scheduled trigger) and pipe parsed rows into the block.
The `actana_tables` skill in the Assistant can bulk-insert rows in natural language ("Load this CSV into the customers table"). The skill calls the same API; large pastes still need to be batched into multiple requests.
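Client-side batching against the endpoint can be sketched as below. This is a minimal sketch under assumptions: `bulkInsertAll` and `insertBatch` are hypothetical names, and the transport is injected so the batching logic stays testable apart from HTTP concerns.

```typescript
type Row = { data: Record<string, unknown> };

// Split rows into batches of at most maxPerRequest and hand each batch to
// the injected transport (one POST to /rows/bulk per batch). Returns the
// total number of rows sent.
async function bulkInsertAll(
  rows: Row[],
  maxPerRequest: number,
  insertBatch: (batch: Row[]) => Promise<void>,
): Promise<number> {
  let sent = 0;
  for (let i = 0; i < rows.length; i += maxPerRequest) {
    const batch = rows.slice(i, i + maxPerRequest);
    await insertBatch(batch);
    sent += batch.length;
  }
  return sent;
}
```

In practice `insertBatch` would wrap a `fetch` POST to the `/rows/bulk` URL shown in the curl example, with `maxPerRequest` set to `TABLE_LIMITS.MAX_BULK_INSERT_ROWS`.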
After the import
The `rowCount` cache on `user_table_definitions` is updated atomically with the insert. Filter and sort queries pick up new rows immediately — there is no indexing delay because the GIN index on the JSONB `data` column is maintained by the row insert itself.
Source
- `apps/actana/blocks/blocks/table.ts` — `bulk-insert` operation
- `apps/actana/app/api/v1/tables/[tableId]/` — REST bulk endpoints
- `apps/actana/lib/table/constants.ts` — `TABLE_LIMITS`