feat(storage): Adds the complete storage layer for the Smart Notes vault system (#5)

Nihallllll wants to merge 2 commits into AOSSIE-Org:main
Conversation
- DB-first architecture using better-sqlite3 (Node.js + Bun compatible)
- Full note lifecycle: create, read, update, rename, move, trash, restore
- Folder CRUD with rename/move cascade to all child note paths
- Tag management synced to DB and file frontmatter
- Conflict detection on concurrent external edits
- File watcher (chokidar) with debounce and internal-write suppression
- RAG pipeline bridge: getNotesNeedingReindex, updateEmbeddingStatus
- Event system via EventEmitter3 (noteCreated, noteUpdated, folderMoved, etc.)
- 78 vitest tests covering all operations, edge cases, and events
Walkthrough

Adds a complete vault-based storage app: SQLite-backed DB, filesystem watcher with internal-write suppression and conflict handling, frontmatter and file utilities, a VaultManager API for notes/folders/tags/trash, types, tests, and project config files.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant App as Application
    participant VM as VaultManager
    participant DB as Database
    participant FS as FileSystem
    participant W as Watcher
    App->>VM: initializeVault(path)
    VM->>FS: walk vault for .md files
    FS-->>VM: list of files
    VM->>VM: process files -> NoteMeta[]
    VM->>DB: bulkUpsertNotes(NoteMeta[])
    DB-->>VM: ok
    VM->>App: emit(VAULT_READY)
    App->>VM: startWatcher()
    VM->>W: start(handler)
    W->>FS: subscribe to file events
    FS->>W: file change (add/change/unlink)
    W->>VM: deliver event (relativePath, isInternal)
    VM->>FS: read file content (if needed)
    FS-->>VM: content
    VM->>VM: detectConflict(dbHash, diskHash)
    alt conflict
        VM->>FS: saveConflict(...)
        VM->>DB: update conflict state
        VM->>App: emit(CONFLICT_DETECTED)
    else no conflict
        VM->>DB: insert/update note
        DB-->>VM: ok
        VM->>App: emit(NOTE_CREATED/UPDATED/DELETED)
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ Passed checks (2 passed)
Actionable comments posted: 22
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/storage/.gitignore`:
- Around line 15-16: The .gitignore patterns are too specific/malformed: replace
the single-character pattern "_.log" with a broader "*.log" to ignore all log
files, and fix the report pattern "report.[0-9]_.[0-9]_.[0-9]_.[0-9]_.json" to a
valid glob such as "report.[0-9]*_[0-9]*_[0-9]*_[0-9]*.json" (or "report.*.json"
if you want a simpler catch-all) so generated report files are correctly
ignored; update the patterns accordingly in the .gitignore.
In `@apps/storage/index.ts`:
- Line 1: The top-level entry currently only contains a side-effect
console.log("Hello via Bun!"), which prevents consumers from importing the
storage API; remove that debug log and replace it with proper re-exports of the
package public API (e.g., delete the console.log line in apps/storage/index.ts
and add exports such as export * from './storage' and/or explicit exports like
export { StorageClient, upload, download } from their implementing modules) so
consumers can import the storage functions/classes; ensure you reference the
actual module filenames in the repository when adding the export statements.
In `@apps/storage/README.md`:
- Line 1: Add a top-level H1 by prefixing the first line with "# " so the first
line becomes "# What it does — in plain terms" and remove any extra blank lines
at the end so the file ends with exactly one trailing newline (ensure only a
single final newline character at EOF).
In `@apps/storage/src/conflicts.ts`:
- Around line 80-95: The logConflict function has a read-modify-write race that
can lose entries under concurrent calls; change logConflict to avoid reading the
file first and instead append each conflict atomically (e.g., write one NDJSON
line per conflict with fs.appendFile or use an atomic file-locking library) and
write a newline-terminated JSON object rather than rewriting a JSON array; also
update getUnresolvedConflicts to parse the new NDJSON format (read file
line-by-line and JSON.parse each line) so both functions use the same format.
In `@apps/storage/src/DB.ts`:
- Around line 469-473: The updateFolder method builds SQL from arbitrary keys
leading to SQL injection; mirror the fix used in updateNote by whitelisting
allowed folder fields (exclude "folder_id"), filter the incoming updates object
to only those keys, throw or no-op if no valid fields remain, then construct the
SET clause from the validated keys and run the prepared statement with the
corresponding values plus folderId; updateFolder should reference the same
allowed-field list/validation helper used by updateNote (or introduce one) to
ensure only safe column names are used.
- Around line 197-201: The updateNote method builds SQL column names directly
from updates keys which can lead to SQL injection; change updateNote to
validate/whitelist incoming keys against an explicit allowed set derived from
NoteMeta (excluding "note_id") before constructing the fields string, reject or
ignore any unknown keys, build the SET clause only from validated fields, keep
using parameter placeholders for values, and throw or return early if there are
no valid update fields; reference updateNote, NoteMeta, note_id, fields, values
and this.db.prepare when making the change.
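The whitelist pattern both of these fixes describe can be sketched as below. The column names in the set are illustrative; the real list would be derived from `NoteMeta`:

```typescript
// Only these column names may appear in a SET clause; note_id is excluded.
const ALLOWED_NOTE_FIELDS = new Set([
  "title", "relative_path", "content_hash", "updated_at",
  "embedding_status", "last_embedded_at",
])

export function buildNoteUpdate(
  updates: Record<string, unknown>
): { setClause: string; values: unknown[] } {
  const keys = Object.keys(updates).filter((k) => ALLOWED_NOTE_FIELDS.has(k))
  if (keys.length === 0) {
    throw new Error("updateNote: no valid fields to update")
  }
  // Column names come only from the whitelist; values stay parameterized.
  const setClause = keys.map((k) => `${k} = ?`).join(", ")
  const values = keys.map((k) => updates[k])
  return { setClause, values }
}

// Assumed better-sqlite3-style usage inside updateNote:
//   const { setClause, values } = buildNoteUpdate(updates)
//   this.db.prepare(`UPDATE notes SET ${setClause} WHERE note_id = ?`)
//          .run(...values, noteId)
```

`updateFolder` would reuse the same helper with its own allowed-field set (excluding `folder_id`).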
- Around line 94-102: The migration swallowing errors in the block that alters
notes columns (the checks using currentVersion, calls to this.db.exec for adding
embedding_status, last_embedded_at, original_path, and the subsequent
this.db.prepare update and this.setConfig(STORAGE_VERSION)) should be changed to
catch errors into a variable (e.g., catch (err)) and only suppress/ignore
expected "column already exists" errors; for any other error log the full error
(via console.error or the repo's logger) with context including the failing SQL
and rethrow or fail the migration. Update each try { this.db.exec(...) } catch
(_) {} to catch (err) { if (!/column.*exists|duplicate column|already
exists/i.test(String(err))) console.error(`[DB] Migration error adding column
<column_name>:`, err); else /* continue silently */ } and preserve the existing
UPDATE and setConfig calls after handling errors.
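A minimal sketch of that selective error handling, with the `exec` callback standing in for `this.db.exec` (an assumption, so the helper stays database-free):

```typescript
// Runs a migration statement, silently ignoring only "column already exists"
// errors; anything else is logged with the failing SQL and rethrown.
export function execIgnoringDuplicateColumn(
  exec: (sql: string) => void,
  sql: string
): void {
  try {
    exec(sql)
  } catch (err) {
    if (/column.*exists|duplicate column|already exists/i.test(String(err))) {
      return // expected when the migration already ran
    }
    console.error(`[DB] Migration error running: ${sql}`, err)
    throw err
  }
}
```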
- Around line 13-24: The constructor currently calls the async ensureDir without
awaiting, causing a race where Database(this.dbPath), initializeSchema, and
runMigrations can run before the directory exists; fix by replacing the async
call with a synchronous directory creation or a factory: either add and call
ensureDirSync(grimoireDir) (implement ensureDirSync using mkdirSync with
{recursive:true}) before computing this.db and invoking new
Database/initializeSchema/runMigrations, or convert the constructor to private
and create a static async create(vaultPath) factory that awaits
ensureDir(vaultPath) then constructs and returns the DB instance; update uses of
constructor accordingly and ensure symbols referenced are constructor,
ensureDir/ensureDirSync, grimoireDir, this.dbPath, Database, initializeSchema,
and runMigrations.
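The synchronous-constructor option is the simpler of the two; a sketch (the `grimoireDir`/`dbPath` layout is assumed from the comment):

```typescript
import { mkdirSync } from "fs"

// mkdirSync with recursive: true is idempotent: it creates missing parents
// and does not throw if the directory already exists.
export function ensureDirSync(dir: string): void {
  mkdirSync(dir, { recursive: true })
}

// In the DB constructor, before `new Database(this.dbPath)`:
//   const grimoireDir = path.join(vaultPath, ".grimoire")
//   ensureDirSync(grimoireDir)
//   this.dbPath = path.join(grimoireDir, "vault.db")
//   this.db = new Database(this.dbPath)
```

The static `create()` factory is the better choice if other async setup is ever needed, but it forces every caller to `await`.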
In `@apps/storage/src/frontmatter.ts`:
- Around line 9-30: Add an explicit return type to parseFrontmatter by declaring
a FrontmatterResult (or similar) TypeScript interface describing the exact
shape: title: string | null, tags: string[], created_at: string | null,
description: string | null, userFrontmatter: Record<string, unknown> (or a more
specific type), content: string, rawText: string; then annotate the function
signature export function parseFrontmatter(rawText: string): FrontmatterResult
so IDEs and consumers get proper typings and docs.
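The suggested return type, written out (field names are taken directly from the comment above; any narrowing of `userFrontmatter` would depend on the actual parser):

```typescript
export interface FrontmatterResult {
  title: string | null
  tags: string[]
  created_at: string | null
  description: string | null
  userFrontmatter: Record<string, unknown>
  content: string
  rawText: string
}

// The signature then becomes:
//   export function parseFrontmatter(rawText: string): FrontmatterResult
```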
In `@apps/storage/src/fsutils.ts`:
- Around line 24-25: The current atomicWrite implementation uses
fs.copyFile(tempPath, filePath) which can expose partial data; change it to
perform an atomic rename-based swap: ensure the temp file is created in the same
directory as filePath, replace the fs.copyFile call with await
fs.rename(tempPath, filePath) inside atomicWrite (and remove or stop using
fs.copyFile), and wrap the rename in try/catch to clean up tempPath on failure;
optionally fsync the containing directory after rename for durability.
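A rename-based `atomicWrite` along those lines might look like this (the temp-name scheme is an assumption; any collision-safe name in the same directory works):

```typescript
import { promises as fs } from "fs"
import * as path from "path"

export async function atomicWrite(filePath: string, data: string): Promise<void> {
  const dir = path.dirname(filePath)
  // Temp file must live in the same directory so rename is a same-filesystem,
  // atomic replace (POSIX rename semantics) rather than a copy.
  const tempPath = path.join(dir, `.${path.basename(filePath)}.${process.pid}.tmp`)
  await fs.writeFile(tempPath, data, "utf8")
  try {
    await fs.rename(tempPath, filePath)
  } catch (err) {
    await fs.unlink(tempPath).catch(() => {}) // best-effort cleanup
    throw err
  }
}
```

Readers therefore see either the old file or the new one, never a partially written file.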
- Line 29: Change the catch parameter type from `any` to `unknown` in the catch
clause (replace `catch (err: any)` with `catch (err: unknown)`) in the
fsutils.ts function where the error is re-thrown; no further inspection is
required because the code simply re-throws the caught error (ensure any
subsequent uses of `err` are compatible with `unknown` or are type-guarded
before use).
In `@apps/storage/src/gray-matter.d.ts`:
- Around line 28-33: The two self-referential aliases (GrayMatterOption and
GrayMatterFile) must be replaced so they reference the actual interface/type
definitions in the module instead of themselves: change the alias
GrayMatterOption<I extends Input, O extends ...> to point to the module-level
options/interface type (e.g., the exported GrayMatterOptions/Options interface
or a new GrayMatterOptionsInterface that uses generics I and O) and change
GrayMatterFile<I extends Input> to reference the module-level GrayMatterFile
interface/type (or create a GrayMatterFileInterface generic) so they are not
circular; also replace all occurrences of any (notably the ones flagged) with
explicit types like unknown, Record<string, unknown>, or a narrower union (e.g.,
string|number|boolean) that matches the expected value shape for the fields or
data properties referenced by functions/methods using Input, GrayMatterOption,
and GrayMatterFile to keep typing strict and consistent.
- Line 17: Replace the uses of `any` in this declaration by using safer explicit
types: change the `data: { [key: string]: any }` property to `data:
Record<string, unknown>` (or a more specific union if you know exact metadata
value shapes) and update the recursive generic instantiation
`GrayMatterOption<string, GrayMatterOption<string, any>>` to use `unknown`
instead of `any` (e.g., `GrayMatterOption<string, GrayMatterOption<string,
unknown>>`) or a more precise constraint for the nested option type; update all
references to the `data` property and the `GrayMatterOption` generic in the file
to maintain type consistency.
In `@apps/storage/src/vaultManager.ts`:
- Around line 959-963: Remove the unnecessary non-null assertion and await when
calling the synchronous DB method; change the assignment to call
this.db.restoreNoteFromTrash(noteId) (no await, no !) and then keep the existing
null check for originalRelPath so you still throw new Error(`Cannot restore:
original path unknown for note ${noteId}`) when it returns null.
- Around line 249-265: The handleWatcherEvent function currently declares its
parameter as event: any which loses type safety; import and use the explicit
WatcherEvent type (e.g., add an import for WatcherEvent) and change the
signature to handleWatcherEvent(event: WatcherEvent): Promise<void>, keeping the
existing switch on event.type and uses of event.relativePath intact; update any
callers or mock tests to pass a WatcherEvent if necessary so the type checks
pass.
- Around line 267-283: The handleExternalAdd flow can throw UNIQUE constraint
errors when insertNote is called for an already-existing note; modify
handleExternalAdd to defensively check for existing notes (e.g., query this.db
for an existing record by note_id or path derived from noteMeta) before calling
this.db!.insertNote, or wrap the insertNote call in a try/catch that
specifically ignores UNIQUE constraint errors and only rethrows unexpected
errors; ensure you still emit EVENTS.NOTE_CREATED only when the insert actually
happened (or adjust the payload logic accordingly).
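The try/catch variant can be isolated into a small helper; the `insertNote` callback and the boolean return contract are assumptions modeled on the comment, not the repo's exact API:

```typescript
// Attempts an insert; returns true if it happened, false if the row already
// existed (SQLite UNIQUE violation), and rethrows anything else.
export function insertIfAbsent(insertNote: () => void): boolean {
  try {
    insertNote()
    return true // insert happened -> safe to emit NOTE_CREATED
  } catch (err) {
    if (/UNIQUE constraint failed/i.test(String(err))) {
      return false // note already exists; skip the event
    }
    throw err
  }
}
```

`handleExternalAdd` would then emit `EVENTS.NOTE_CREATED` only when this returns true.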
In `@apps/storage/src/Watcher.ts`:
- Line 1: The import of FSWatcher from chokidar is used only as a type; change
the import to a type-only import by replacing the current import declaration
that references FSWatcher (e.g., the line importing chokidar, { FSWatcher })
with an import that uses the TypeScript "import type" syntax for FSWatcher while
keeping the default chokidar import for runtime usage.
- Line 173: The call to the handler at this.handler(event) can return a Promise
(WatcherHandler) but is invoked without awaiting or catching rejections; update
the invocation so async rejections are handled — for example wrap the call with
Promise.resolve(this.handler(event)).catch(err => { /* log or handle error via
Watcher instance logger/emit */ }) or await it inside an async method with
try/catch; reference the Watcher class and the this.handler(event) invocation
and ensure any caught error is forwarded to the Watcher’s error logging/handling
path.
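The wrapper form can be sketched as below; the `WatcherEvent` shape is assumed from the surrounding comments:

```typescript
type WatcherEvent = { type: string; relativePath: string } // assumed shape
type WatcherHandler = (event: WatcherEvent) => void | Promise<void>

// Promise.resolve().then(...) normalizes sync and async handlers, and the
// .catch ensures a rejected handler can never become an unhandled rejection.
export function deliver(
  handler: WatcherHandler,
  event: WatcherEvent,
  onError: (err: unknown) => void
): void {
  Promise.resolve()
    .then(() => handler(event))
    .catch(onError)
}
```

Inside the Watcher class, `onError` would forward to its existing error logging or an `error` event.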
- Around line 79-86: stop() currently closes this.watcher but leaves pending
debounce timers and suppression state that can survive a restart; update stop()
to clear and null out any debounce timers (e.g. clearTimeout(this.debounceTimer)
and set this.debounceTimer = null), reset suppression state (e.g.
this.suppressedPaths or this.suppressionMap = new Map() or null), remove any
attached listeners/handlers (ensure this.handler is removed and set to null),
and ensure any other related flags are reset so the Watcher is fully cleanly
stopped and can restart without stale timers/state.
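A sketch of a fully-resetting `stop()`; the field names (`watcher`, `debounceTimers`, `recentWrites`, `handler`) are assumptions based on the review comments, not the file's actual members:

```typescript
class WatcherSketch {
  private watcher: { close(): Promise<void> } | null = null
  private debounceTimers = new Map<string, ReturnType<typeof setTimeout>>()
  private recentWrites = new Map<string, number>()
  private handler: ((e: unknown) => void) | null = null

  async stop(): Promise<void> {
    for (const timer of this.debounceTimers.values()) clearTimeout(timer)
    this.debounceTimers.clear() // no stale debounced events after restart
    this.recentWrites.clear()   // drop internal-write suppression state
    this.handler = null         // detach the event handler
    if (this.watcher) {
      await this.watcher.close()
      this.watcher = null
    }
  }
}
```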
- Around line 92-99: The markInternalWrite method uses the raw relativePath as
the suppression key which can differ from the POSIX-normalized event paths (so
Windows backslashes leak); normalize the path before using it as the key (e.g.,
convert backslashes to forward slashes or call path.posix.normalize) and then
use that normalized key for recentWrites.set, the expiry calculation, and for
deletion inside the setTimeout; update references to suppressionWindow and
recentWrites in markInternalWrite accordingly.
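The normalization itself is a one-liner; the `markInternalWrite` usage shown in comments assumes the `recentWrites`/`suppressionWindow` names from the review:

```typescript
// Convert Windows backslash-separated paths to the POSIX form chokidar
// events are compared against, so suppression keys always match.
export function toPosixKey(relativePath: string): string {
  return relativePath.split("\\").join("/")
}

// Inside markInternalWrite, the normalized key is used everywhere:
//   const key = toPosixKey(relativePath)
//   this.recentWrites.set(key, Date.now() + this.suppressionWindow)
//   setTimeout(() => this.recentWrites.delete(key), this.suppressionWindow)
```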
In `@apps/storage/test/vaultManager.test.ts`:
- Around line 59-61: The test currently uses non-null asserts like notes[0]!
after calling vault.listNotes(); instead, first assert the array is non-empty
(e.g., expect(notes).toHaveLength(1) or expect(notes.length).toBeGreaterThan(0))
or assert the element is defined (expect(notes[0]).toBeDefined()), then access
notes[0].note_id without the `!`; apply the same change for the other usages of
notes in this test file (the subsequent assertions that use notes[0] around the
other test cases) so test failures surface clearly instead of being masked by
non-null assertions.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (17)
apps/storage/.gitignore, apps/storage/README.md, apps/storage/index.ts, apps/storage/package.json, apps/storage/src/DB.ts, apps/storage/src/Watcher.ts, apps/storage/src/conflicts.ts, apps/storage/src/events.ts, apps/storage/src/frontmatter.ts, apps/storage/src/fsutils.ts, apps/storage/src/gray-matter.d.ts, apps/storage/src/index.ts, apps/storage/src/types.ts, apps/storage/src/vaultManager.ts, apps/storage/test/vaultManager.test.ts, apps/storage/tsconfig.json, apps/storage/vitest.config.ts
> `@@ -0,0 +1,43 @@`
> What it does — in plain terms
Fix markdownlint violations (MD041, MD047).
Add a top-level H1 at Line 1 and ensure the file ends with a single trailing newline.
📝 Proposed fix
```diff
-What it does — in plain terms
+# Smart Notes Storage Layer
+
+What it does — in plain terms
```

(Also ensure one final newline at EOF.)
Also applies to: 43-43
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)
[warning] 1-1: First line in a file should be a top-level heading
(MD041, first-line-heading, first-line-h1)
```ts
async function logConflict(vaultPath: string, conflict: Conflict) {
  const conflictsLog = path.join(vaultPath, ".grimoire", "conflicts.json")

  let conflicts: Conflict[] = []

  try {
    const existing = await fs.readFile(conflictsLog, "utf8")
    conflicts = JSON.parse(existing)
  } catch (err) {
    // File doesn't exist yet, that's fine
  }

  conflicts.push(conflict)

  await fs.writeFile(conflictsLog, JSON.stringify(conflicts, null, 2), "utf8")
}
```
Potential race condition in concurrent conflict logging.
The read-modify-write pattern in logConflict can cause data loss if multiple conflicts are detected simultaneously. Consider using file locking or atomic append operations.
🔒 Suggested improvement for concurrent safety
```diff
 async function logConflict(vaultPath: string, conflict: Conflict) {
   const conflictsLog = path.join(vaultPath, ".grimoire", "conflicts.json")
-  let conflicts: Conflict[] = []
-
-  try {
-    const existing = await fs.readFile(conflictsLog, "utf8")
-    conflicts = JSON.parse(existing)
-  } catch (err) {
-    // File doesn't exist yet, that's fine
-  }
-
-  conflicts.push(conflict)
-
-  await fs.writeFile(conflictsLog, JSON.stringify(conflicts, null, 2), "utf8")
+  // Use append-only NDJSON format to avoid race conditions
+  const line = JSON.stringify(conflict) + "\n"
+  await fs.appendFile(conflictsLog, line, "utf8")
 }
```

Note: This would require updating getUnresolvedConflicts to parse NDJSON format instead of a JSON array.

Alternatively, since conflicts are relatively rare events, documenting this limitation may be acceptable for now.
```ts
export async function markConflictResolved(
  vaultPath: string,
  noteId: NoteID,
  timestamp: number
) {
  const conflictsLog = path.join(vaultPath, ".grimoire", "conflicts.json")

  try {
    const data = await fs.readFile(conflictsLog, "utf8")
    const conflicts: Conflict[] = JSON.parse(data)

    const conflict = conflicts.find(
      (c) => c.note_id === noteId && c.timestamp === timestamp
    )

    if (conflict) {
      conflict.resolved = true
      await fs.writeFile(
        conflictsLog,
        JSON.stringify(conflicts, null, 2),
        "utf8"
      )
    }
  } catch (err) {
    // Log doesn't exist, that's fine
  }
}
```
Same race condition concern in markConflictResolved.
This function has the same read-modify-write pattern. If this becomes an issue in practice, consider the NDJSON approach or using a database table for conflicts.
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/storage/README.md`:
- Line 45: The README.md file is missing a trailing newline; update
apps/storage/README.md to ensure the file ends with a single newline character
(add one blank line terminator at EOF) so it conforms to Markdown convention
(MD047).
In `@apps/storage/src/frontmatter.ts`:
- Around line 36-42: Change the frontmatter parameter type in
stringifyFrontmatter from Record<string, any> to Record<string, unknown> to
avoid using any; update any internal access that assumes specific value types to
narrow/cast safely (e.g., when serializing or reading properties) or validate
types before use, and adjust callers that relied on the old type by adding
explicit casts or transforming their data to the new safe shape so the function
and its usages compile and maintain proper runtime checks.
In `@apps/storage/src/Watcher.ts`:
- Line 4: Remove the unused type import NoteID from the import statement (import
type { NoteID } from "./types") in Watcher.ts; update the import to only include
types or symbols actually used in the file (or remove the entire import line if
nothing else is imported) so the unused NoteID import is eliminated.
In `@apps/storage/test/vaultManager.test.ts`:
- Around line 391-392: The test is bypassing TS checks by casting the result of
vault.listFolders("") to any[]; remove the cast and use the actual return type
instead—either destructure directly (const [parent] = await
vault.listFolders("")) so TypeScript infers the correct type, or import/annotate
the proper Folder[] (or Folder) type and update the test to use it; if
listFolders' signature is wrong, update the vault.listFolders return type to the
correct typed array so tests no longer need an any cast (references:
vault.listFolders and vault.renameFolder).
- Around line 201-206: The test currently types the events array as any[] which
bypasses type checking; change its declaration to the specific conflict event
payload type exported by the vault/events module (e.g., ConflictDetectedPayload
or the exported type corresponding to EVENTS.CONFLICT_DETECTED) and import that
type at the top of the test, then use it for events (e.g., const events:
ConflictDetectedPayload[] = []) so the handler
vault.on(EVENTS.CONFLICT_DETECTED, ...) and assertions (events[0].note_id) are
type-checked against the correct payload shape.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (7)
apps/storage/.gitignore, apps/storage/README.md, apps/storage/index.ts, apps/storage/package.json, apps/storage/src/Watcher.ts, apps/storage/src/frontmatter.ts, apps/storage/test/vaultManager.test.ts
> Tells the RAG pipeline which notes need to be embedded (new, changed, or previously failed)
> Tells the RAG pipeline which notes changed since a specific time (for incremental updates)
> Lets the RAG pipeline report back when it finishes indexing a note
> Bulk-read all notes with content in one call for initial indexing
> (No newline at end of file)
Add trailing newline at end of file.
The file should end with a single newline character per Markdown conventions (MD047).
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)
[warning] 45-45: Files should end with a single newline character
(MD047, single-trailing-newline)
```ts
export function stringifyFrontmatter(
  content: string,
  frontmatter?: Record<string, any>
): string {
  if (!frontmatter || Object.keys(frontmatter).length === 0) {
    return content // No frontmatter, just content
  }
```
🧹 Nitpick | 🔵 Trivial
Avoid using any; prefer unknown or a more specific type.
The frontmatter parameter uses Record<string, any> which allows unchecked access. Consider using Record<string, unknown> for better type safety.
♻️ Proposed fix
```diff
 export function stringifyFrontmatter(
   content: string,
-  frontmatter?: Record<string, any>
+  frontmatter?: Record<string, unknown>
 ): string {
```

As per coding guidelines, "Avoid 'any', use explicit types".
```ts
import chokidar from "chokidar"
import type { FSWatcher } from "chokidar"
import path from "path"
import type { NoteID } from "./types"
```
🧹 Nitpick | 🔵 Trivial
Remove unused import NoteID.
The NoteID type is imported but never used in this file.
♻️ Proposed fix
```diff
-import type { NoteID } from "./types"
```

📝 Committable suggestion

‼️ IMPORTANT: Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const events: any[] = []
vault.on(EVENTS.CONFLICT_DETECTED, (p) => events.push(p))
await expect(vault.updateNote(meta.note_id, "my attempted change")).rejects.toThrow("Conflict detected")
expect(events).toHaveLength(1)
expect(events[0]!.note_id).toBe(meta.note_id)
})
```
🧹 Nitpick | 🔵 Trivial
Avoid using any; use the proper event payload type.
The events array is typed as any[] which bypasses type checking. Use the appropriate payload type for better type safety.
♻️ Proposed fix
```diff
- const events: any[] = []
- vault.on(EVENTS.CONFLICT_DETECTED, (p) => events.push(p))
+ const events: { note_id: string; conflictPath: string }[] = []
+ vault.on(EVENTS.CONFLICT_DETECTED, (p: { note_id: string; conflictPath: string }) => events.push(p))
```

As per coding guidelines, "Avoid 'any', use explicit types".
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
const events: { note_id: string; conflictPath: string }[] = []
vault.on(EVENTS.CONFLICT_DETECTED, (p: { note_id: string; conflictPath: string }) => events.push(p))
await expect(vault.updateNote(meta.note_id, "my attempted change")).rejects.toThrow("Conflict detected")
expect(events).toHaveLength(1)
expect(events[0]!.note_id).toBe(meta.note_id)
})
```
```ts
const [parent] = await vault.listFolders("") as any[]
await vault.renameFolder(parent.folder_id, "renamed-parent")
```
🧹 Nitpick | 🔵 Trivial
Avoid as any[] cast; use the proper return type.
The cast bypasses type checking. If listFolders returns a properly typed array, the cast should be unnecessary. If the return type is incomplete, consider fixing it at the source.
♻️ Proposed fix
```diff
- const [parent] = await vault.listFolders("") as any[]
+ const [parent] = await vault.listFolders("")
```

As per coding guidelines, "Avoid 'any', use explicit types".
What this includes:

- VaultManager — main API (init, CRUD, rename, move, trash, restore, search)
- DB — SQLite metadata store via better-sqlite3
- Watcher — chokidar file watcher, detects external edits and syncs DB

Run tests:

```sh
cd apps/storage && bun run test
```
Summary by CodeRabbit