
feat(console): add JSON export option to database documents#2897

Open
Divyansh2992 wants to merge 2 commits into appwrite:main from Divyansh2992:feature/database-json-export

Conversation


@Divyansh2992 Divyansh2992 commented Feb 28, 2026

What does this PR do?

This PR adds support for exporting collection documents as a .json file from the Database Console.

Currently, users can export documents as CSV. This enhancement extends the existing Export wizard to support both CSV and JSON formats without introducing any backend changes.

[Screenshot: Export wizard with the new format options]

Key Changes:

  1. Renamed the “Export CSV” wizard to a generic “Export” wizard

  2. Added a format selector (CSV / JSON) in the Export options

  3. Implemented client-side JSON export using:

  • listDocuments

  • Cursor-based pagination using Query.cursorAfter() and Query.limit(100)

  4. Ensured the export respects:

  • Active filters

  • Search queries

  • Selected columns

  5. Generated the downloadable .json file using the browser Blob API

  6. Set the file name dynamically to ${tableName}.json

  7. Added analytics tracking:

  • Click.DatabaseExportJson

  • Submit.DatabaseExportJson

This is a Console-only UI enhancement and does not require backend modifications. Existing CSV export behavior remains unchanged.
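
The cursor-based pagination described above can be sketched roughly as follows. This is a simplified illustration, not the PR's actual code: `listDocuments` here is a local stand-in for the Appwrite SDK call, which in the console would be built with Query.limit(100) and Query.cursorAfter(); the dataset, sizes, and function names are hypothetical.

```typescript
// Sketch of the cursor-based export loop. `listDocuments` mocks the SDK
// call against a hypothetical in-memory collection of 250 documents.
type Doc = { $id: string; [key: string]: unknown };

const collection: Doc[] = Array.from({ length: 250 }, (_, i) => ({
    $id: `doc_${i}`,
    name: `Document ${i}`
}));

// Stand-in for listDocuments with Query.limit(limit) and
// Query.cursorAfter(cursorAfter) applied.
async function listDocuments(limit: number, cursorAfter?: string): Promise<Doc[]> {
    const start = cursorAfter
        ? collection.findIndex((d) => d.$id === cursorAfter) + 1
        : 0;
    return collection.slice(start, start + limit);
}

// Fetch every document in 100-row batches, advancing the cursor each page.
async function fetchAllDocuments(): Promise<Doc[]> {
    const all: Doc[] = [];
    let cursor: string | undefined;
    for (;;) {
        const page = await listDocuments(100, cursor);
        if (page.length === 0) break;        // no more pages
        all.push(...page);
        cursor = page[page.length - 1].$id;  // cursor = last row of this page
    }
    return all;
}

fetchAllDocuments().then((docs) => console.log(`fetched ${docs.length} rows`)); // fetched 250 rows
```

The loop terminates on an empty page rather than on a fetched-versus-total count, which also covers collections whose size is an exact multiple of the page size.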

Test Plan

I verified the implementation locally with the following steps:

  1. Started the Console locally.

  2. Navigated to:
    Database → Collection → Documents

  3. Clicked the Export button.

  4. Selected:

  • Format: JSON

  • Specific columns via checkboxes

  5. Applied:

  • Search queries

  • Filters

  6. Triggered the export.

Verified:

  1. JSON file downloads automatically.

  2. File name matches collection name (.json).

  3. Export respects:

  • Active filters

  • Search queries

  • Selected columns

  4. Pagination correctly fetches all documents (tested with datasets of more than 100 documents).

  5. CSV export still works exactly as before.

  6. No console errors.

  7. Analytics events trigger correctly.
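
The column-selection and file-naming behavior verified above can be sketched as below. `buildJsonExport` is a hypothetical helper for illustration, not a function from the console codebase; the real code serializes the filtered rows and hands them to the browser Blob API.

```typescript
// Sketch of "export respects selected columns" plus the ${tableName}.json
// file name. `buildJsonExport` is a hypothetical illustration helper.
type Row = Record<string, unknown>;

function buildJsonExport(
    rows: Row[],
    selectedCols: string[],
    tableName: string
): { filename: string; payload: string } {
    // Keep only the columns the user ticked in the export wizard.
    const filtered = rows.map((row) =>
        Object.fromEntries(
            Object.entries(row).filter(([key]) => selectedCols.includes(key))
        )
    );
    return {
        // File name is derived from the table/collection name, e.g. "users.json".
        filename: `${tableName}.json`,
        payload: JSON.stringify(filtered, null, 2)
    };
}

// In the browser the payload would then be downloaded via something like:
//   URL.createObjectURL(new Blob([payload], { type: 'application/json' }))
const { filename, payload } = buildJsonExport(
    [{ $id: 'a', name: 'Ada', secret: 'x' }],
    ['$id', 'name'],
    'users'
);
console.log(filename); // users.json
```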

Related PRs and Issues

Closes Issue #2891

Have you read the Contributing Guidelines on issues?

Yes, I have read and followed the contributing guidelines.

Summary by CodeRabbit

  • New Features
    • Added JSON export capability for database tables with column selection
    • Export interface now supports both CSV and JSON formats with format selector
    • JSON export includes options to apply current table filters during export
    • Export wizard provides "Select all/Deselect all" and expandable column toggles


appwrite bot commented Feb 28, 2026

Console (appwrite/console)

Project ID: 688b7bf400350cbd60e9

Sites (1)

  • console-stage (688b7cf6003b1842c9dc): build Failed, logs Failed



coderabbitai bot commented Feb 28, 2026

Walkthrough

This PR adds JSON export functionality to the database table interface. The changes introduce a new DatabaseExportJson tracking event to the analytics enums, update the export button tooltip text, create a new JSON export page component with column selection and filter support, and refactor the existing export page to support both CSV and JSON formats with format-specific UI and export logic.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The title accurately describes the main change: adding JSON export functionality to database documents in the console, which aligns with the core purpose of this PR.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte (1)

15-15: ⚠️ Potential issue | 🟡 Minor

Emit Click.DatabaseExportJson when JSON export is initiated.

The JSON path currently tracks submit/error but not click. Since this PR introduces Click.DatabaseExportJson, add it when the JSON export action starts.

📈 Proposed analytics wiring
-import { Submit, trackEvent, trackError } from '$lib/actions/analytics';
+import { Click, Submit, trackEvent, trackError } from '$lib/actions/analytics';
@@
         } else {
             // JSON export logic
+            trackEvent(Click.DatabaseExportJson);
             $isSubmitting = true;
             try {

Also applies to: 132-193

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte
at line 15: when the JSON export flow starts, emit the Click.DatabaseExportJson
analytics event: in the handler that initiates the export (the Submit call path
in this file where trackEvent and trackError are imported), call
trackEvent('Click.DatabaseExportJson') immediately before starting the JSON
export submission; ensure the same addition is applied to the other JSON export
branch referenced around lines 132-193 so both JSON export entry points call
trackEvent('Click.DatabaseExportJson') prior to invoking Submit and before any
trackError handling.
🧹 Nitpick comments (1)
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte (1)

12-12: Use route alias import instead of relative import.

Please replace ../store with the configured alias style for route imports.

As per coding guidelines **/*.{js,ts,svelte}: Use $lib, $routes, and $themes path aliases for imports.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
at line 12: replace the relative import "import { table } from '../store'" with
the configured route-alias import (use the $routes alias per guidelines) so the
symbol table is imported via the route alias instead of a relative path; update
the import statement in +page.svelte to use $routes (and fix any other relative
route imports in this file) to comply with the **/*.{js,ts,svelte} alias rule.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte:
- Around line 94-125: The loop currently uses offset-based pagination (pageSize,
offset and Query.offset) which should be changed to cursor-based to match the
other export path: replace offset and Query.offset with a cursor variable and
pass Query.cursorAfter(cursor) to sdk.forProject(...).tablesDB.listRows; in the
loop update cursor from the listRows response (response.cursor /
response.nextCursor / response.cursorAfter depending on the SDK field) and break
when the response indicates no further cursor or rows, while keeping the
selectedCols filtering and pushing into allRows unchanged.

---

Outside diff comments:
In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte:
- Line 15: When the JSON export flow starts, emit the Click.DatabaseExportJson
analytics event: in the handler that initiates the export (the Submit call path
in this file where trackEvent and trackError are imported), call
trackEvent('Click.DatabaseExportJson') immediately before starting the JSON
export submission; ensure the same addition is applied to the other JSON export
branch referenced around lines 132-193 so both JSON export entry points call
trackEvent('Click.DatabaseExportJson') prior to invoking Submit and before any
trackError handling.

---

Nitpick comments:
In src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte:
- Line 12: Replace the relative import "import { table } from '../store'" with
the configured route-alias import (use the $routes alias per guidelines) so the
symbol table is imported via the route alias instead of a relative path; update
the import statement in +page.svelte to use $routes (and fix any other relative
route imports in this file) to comply with the **/*.{js,ts,svelte} alias rule.

ℹ️ Review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ec5b6ad and 05cbf96.

📒 Files selected for processing (4)
  • src/lib/actions/analytics.ts
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/+page.svelte
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export-json/+page.svelte
  • src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte


greptile-apps bot commented Feb 28, 2026

Greptile Summary

This PR successfully adds JSON export functionality to the database documents interface. The implementation extends the existing Export wizard to support both CSV and JSON formats through a format selector.

Key changes:

  • Renamed the "Export CSV" wizard to a generic "Export" wizard supporting both formats
  • Implemented client-side JSON export using cursor-based pagination (Query.cursorAfter()) with 100-document batches
  • Added format-specific UI controls (delimiter and header options only shown for CSV)
  • Export respects active filters, search queries, and selected columns
  • Properly tracks analytics events (Submit.DatabaseExportJson)
  • Removed redundant export-json wizard in cleanup commit

Implementation notes:

  • JSON export loads all data into browser memory before download, suitable for moderate datasets but may struggle with very large collections (thousands of documents)
  • CSV export continues to use the backend service, better for large datasets
  • Pagination logic correctly handles edge cases (empty results, filtered queries, exact multiples of page size)

Confidence Score: 4/5

  • Safe to merge with awareness of client-side memory limitations for very large datasets
  • The implementation is well-structured with proper pagination, error handling, and analytics tracking. The cursor-based pagination logic correctly handles edge cases. The main consideration is that JSON export loads all data into browser memory, which is acceptable for moderate datasets but could be problematic for very large collections. CSV export via backend service remains available for large-scale exports.
  • No files require special attention

Important Files Changed

Filename Overview
src/routes/(console)/project-[region]-[project]/databases/database-[database]/table-[table]/export/+page.svelte Implemented JSON export with format selector, cursor-based pagination, and client-side file generation

Flowchart

%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[User clicks Export] --> B{Select Format}
    B -->|CSV| C[Call backend API]
    C --> D[Show success notification]
    D --> E[Navigate back to table]
    
    B -->|JSON| F[Initialize pagination state]
    F --> G{fetched less than total?}
    G -->|No| N[Create JSON blob]
    G -->|Yes| H[Build query with limit 100 and filters]
    H --> I{Has lastId?}
    I -->|Yes| J[Add Query.cursorAfter to pagination]
    I -->|No| K[First page - no cursor]
    J --> L[Fetch page via tablesDB.listRows]
    K --> L
    L --> M{Empty result?}
    M -->|Yes| N
    M -->|No| O[Filter rows to selected columns]
    O --> P[Append to allRows array]
    P --> Q[Update fetched count and lastId]
    Q --> G
    N --> R[Download file via Blob API]
    R --> S[Show completion notification]
    S --> E

Last reviewed commit: 3a392e9

@greptile-apps bot left a comment
4 files reviewed, 1 comment

Edit Code Review Agent Settings | Greptile

