feat(console): add JSON export option to database documents #2897
Divyansh2992 wants to merge 2 commits into appwrite:main
Conversation
Walkthrough
This PR adds JSON export functionality to the database table interface.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
Pre-merge checks: ✅ 3 passed
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (1)
Line 15: ⚠️ Potential issue | 🟡 Minor
Emit `Click.DatabaseExportJson` when JSON export is initiated. The JSON path currently tracks submit/error but not click. Since this PR introduces `Click.DatabaseExportJson`, add it when the JSON export action starts.
📈 Proposed analytics wiring:

```diff
-import { Submit, trackEvent, trackError } from '$lib/actions/analytics';
+import { Click, Submit, trackEvent, trackError } from '$lib/actions/analytics';
 @@
 } else {
     // JSON export logic
+    trackEvent(Click.DatabaseExportJson);
     $isSubmitting = true;
     try {
```

Also applies to lines 132-193.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte at line 15, When the JSON export flow starts, emit the Click.DatabaseExportJson analytics event: in the handler that initiates the export (the Submit call path in this file where trackEvent and trackError are imported), call trackEvent('Click.DatabaseExportJson') immediately before starting the JSON export submission; ensure the same addition is applied to the other JSON export branch referenced around lines 132-193 so both JSON export entry points call trackEvent('Click.DatabaseExportJson') prior to invoking Submit and before any trackError handling.
🧹 Nitpick comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte (1)
Line 12: Use route alias import instead of relative import. Please replace `../store` with the configured alias style for route imports. As per coding guidelines:
**/*.{js,ts,svelte}: Use $lib, $routes, and $themes path aliases for imports.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte at line 12, Replace the relative import "import { table } from '../store'" with the configured route-alias import (use the $routes alias per guidelines) so the symbol table is imported via the route alias instead of a relative path; update the import statement in +page.svelte to use $routes (and fix any other relative route imports in this file) to comply with the /*.{js,ts,svelte} alias rule.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@src/routes/`(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte:
- Around line 94-125: The loop currently uses offset-based pagination (pageSize,
offset and Query.offset) which should be changed to cursor-based to match the
other export path: replace offset and Query.offset with a cursor variable and
pass Query.cursorAfter(cursor) to sdk.forProject(...).tablesDB.listRows; in the
loop update cursor from the listRows response (response.cursor /
response.nextCursor / response.cursorAfter depending on the SDK field) and break
when the response indicates no further cursor or rows, while keeping the
selectedCols filtering and pushing into allRows unchanged.
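The switch from offset- to cursor-based paging suggested above can be sketched as follows. This is a minimal illustration, not the console's actual code: `listRows` is a stub standing in for `sdk.forProject(...).tablesDB.listRows`, and the `$id`-based cursor follows Appwrite's row shape, but the exact SDK response fields may differ.

```typescript
// Row shape assumed for illustration: Appwrite rows carry a unique `$id`.
type Row = { $id: string; [key: string]: unknown };

// Stub data source paged like Query.limit / Query.cursorAfter would page it.
async function listRows(limit: number, cursorAfter?: string): Promise<{ rows: Row[] }> {
    const all: Row[] = Array.from({ length: 250 }, (_, i) => ({ $id: `row_${i}`, n: i }));
    const start = cursorAfter ? all.findIndex((r) => r.$id === cursorAfter) + 1 : 0;
    return { rows: all.slice(start, start + limit) };
}

async function fetchAllRows(selectedCols: string[]): Promise<Record<string, unknown>[]> {
    const allRows: Record<string, unknown>[] = [];
    let cursor: string | undefined;
    while (true) {
        // In the real code this would pass Query.cursorAfter(cursor) when set.
        const { rows } = await listRows(100, cursor);
        if (rows.length === 0) break; // no further rows: stop paging
        for (const row of rows) {
            // Keep only the selected columns, unchanged from the export path.
            allRows.push(Object.fromEntries(selectedCols.map((c) => [c, row[c]])));
        }
        cursor = rows[rows.length - 1].$id; // advance cursor past the last row seen
        if (rows.length < 100) break; // short page means we reached the end
    }
    return allRows;
}
```

Unlike offset paging, the cursor never skips or double-counts rows when documents are inserted or deleted mid-export, which is why the review asks for it to match the other export path.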
ℹ️ Review info
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
src/lib/actions/analytics.ts
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
Greptile Summary
This PR adds JSON export functionality to the database documents interface. The implementation extends the existing Export wizard to support both CSV and JSON formats through a format selector.
Key changes:
Implementation notes:
Confidence Score: 4/5
Important Files Changed
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[User clicks Export] --> B{Select Format}
    B -->|CSV| C[Call backend API]
    C --> D[Show success notification]
    D --> E[Navigate back to table]
    B -->|JSON| F[Initialize pagination state]
    F --> G{fetched less than total?}
    G -->|No| N[Create JSON blob]
    G -->|Yes| H[Build query with limit 100 and filters]
    H --> I{Has lastId?}
    I -->|Yes| J[Add Query.cursorAfter to pagination]
    I -->|No| K[First page - no cursor]
    J --> L[Fetch page via tablesDB.listRows]
    K --> L
    L --> M{Empty result?}
    M -->|Yes| N
    M -->|No| O[Filter rows to selected columns]
    O --> P[Append to allRows array]
    P --> Q[Update fetched count and lastId]
    Q --> G
    N --> R[Download file via Blob API]
    R --> S[Show completion notification]
    S --> E
```

Last reviewed commit: 3a392e9

What does this PR do?
This PR adds support for exporting collection documents as a .json file from the Database Console.
Currently, users can export documents as CSV. This enhancement extends the existing Export wizard to support both CSV and JSON formats without introducing any backend changes.
Key Changes:
Renamed “Export CSV” wizard to a generic “Export” wizard
Added a format selector (CSV / JSON) in the Export options
Implemented client-side JSON export using:
listDocuments
Cursor-based pagination using Query.cursorAfter() and Query.limit(100)
Active filters
Search queries
Selected columns
Generated downloadable .json file using browser Blob API
File name dynamically set to:
${tableName}.json
Added analytics tracking:
Click.DatabaseExportJson
Submit.DatabaseExportJson
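The download step in the list above can be sketched as below. This is an illustrative sketch of the described flow, not the console's exact code: `buildJsonExport` and its row shape are hypothetical names, while the commented Blob/anchor sequence is the standard browser download pattern the description refers to.

```typescript
// Build the serialized export and its dynamic file name (`${tableName}.json`).
function buildJsonExport(rows: Record<string, unknown>[], tableName: string) {
    return {
        filename: `${tableName}.json`, // file name derived from the table name
        json: JSON.stringify(rows, null, 2), // pretty-printed document array
    };
}

// In the browser, the result would then be downloaded via the Blob API:
// const { filename, json } = buildJsonExport(rows, tableName);
// const url = URL.createObjectURL(new Blob([json], { type: 'application/json' }));
// const a = document.createElement('a');
// a.href = url;
// a.download = filename;
// a.click();
// URL.revokeObjectURL(url);
```

Keeping serialization separate from the DOM interaction makes the file-name and content logic easy to unit-test without a browser.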
This is a Console-only UI enhancement and does not require backend modifications. Existing CSV export behavior remains unchanged.
Test Plan
I verified the implementation locally with the following steps:
Started the Console locally.
Navigated to:
Database → Collection → Documents
Clicked the Export button.
Selected:
Format: JSON
Specific columns via checkboxes
Search queries
Filters
Verified:
JSON file downloads automatically.
File name matches collection name (.json).
Export respects:
Active filters
Search queries
Selected columns
Pagination correctly fetches all documents (tested with datasets >100 documents).
CSV export still works exactly as before.
No console errors.
Analytics events trigger correctly.
Related PRs and Issues
Closes Issue #2891
Have you read the Contributing Guidelines on issues?
Yes, I have read and followed the contributing guidelines.