feat(notifications): bridge CEF providers into core ingest pipeline with dedup#874

Merged
graycyrus merged 8 commits into tinyhumansai:main from M3gA-Mind:feat/718-phase1-ingestion-gap
Apr 24, 2026
Conversation


@M3gA-Mind M3gA-Mind commented Apr 24, 2026

Summary

Closes part of #718 — Phase 1 of the notification intelligence pipeline.

  • Closes the ingestion gap for CDP-migrated providers (Slack, WhatsApp, Discord, Telegram): handleFired in webviewNotifications/service.ts now calls ingestNotification(...) after dispatching to the in-memory slice, mirroring the existing recipe-based path for Gmail/LinkedIn/Google Meet. On a non-skipped result the new notification is also dispatched to integrationNotifications Redux state.
  • Adds content-hash deduplication: new store::exists_recent checks for an identical (provider, account_id, title, body) notification within the last 60 seconds before inserting, preventing duplicates from webview retries or reloads.
  • Standardises log prefix to [notification_intel] throughout notifications/rpc.rs for consistent observability.
  • Unit test for exists_recent added in store.rs.
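
The dedup rule above can be sketched as an in-memory model. This is an illustrative stand-in (the real implementation runs the equivalent check as SQL inside store.rs; the names and the epoch-seconds clock parameter here are assumptions, not the actual API):

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

const WINDOW_SECS: u64 = 60;

/// In-memory model of the store's dedup rule: an identical
/// (provider, account_id, title, body) tuple seen within the
/// last 60 seconds is treated as a duplicate.
struct DedupWindow {
    last_seen: HashMap<u64, u64>, // content key -> epoch seconds of last insert
}

impl DedupWindow {
    fn new() -> Self {
        Self { last_seen: HashMap::new() }
    }

    fn key(provider: &str, account_id: Option<&str>, title: &str, body: &str) -> u64 {
        let mut h = DefaultHasher::new();
        (provider, account_id, title, body).hash(&mut h);
        h.finish()
    }

    /// Returns true if the notification was "inserted", false if it
    /// was skipped as a recent duplicate.
    fn insert_if_not_recent(
        &mut self,
        provider: &str,
        account_id: Option<&str>,
        title: &str,
        body: &str,
        now_epoch: u64,
    ) -> bool {
        let k = Self::key(provider, account_id, title, body);
        if let Some(&t) = self.last_seen.get(&k) {
            if now_epoch.saturating_sub(t) < WINDOW_SECS {
                return false; // duplicate within the window: skip
            }
        }
        self.last_seen.insert(k, now_epoch);
        true
    }
}

fn main() {
    let mut w = DedupWindow::new();
    // First fire inserts; an identical fire 30s later is skipped;
    // the same content 61s later is accepted again.
    assert!(w.insert_if_not_recent("slack", Some("acct1"), "Hi", "Body", 0));
    assert!(!w.insert_if_not_recent("slack", Some("acct1"), "Hi", "Body", 30));
    assert!(w.insert_if_not_recent("slack", Some("acct1"), "Hi", "Body", 61));
    println!("dedup window ok");
}
```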

Test plan

  • Open a connected Slack or Discord webview account — a new notification from that account should now appear in the integration notifications store (visible via openhuman.notification_list RPC).
  • Trigger the same notification twice within 60 seconds — second call returns { skipped: true, reason: "duplicate" }.
  • Gmail / LinkedIn notifications continue to work as before (recipe path unchanged).
  • cargo test -p openhuman notifications::store::exists_recent_detects_duplicate passes.

Summary by CodeRabbit

  • New Features
    • Notifications are forwarded asynchronously and, when accepted, are recorded with a received timestamp and marked "unread".
  • Bug Fixes
    • Duplicate notifications within a 60‑second window are suppressed and not re-added.
  • Reliability
    • Failed intake attempts are handled gracefully and won't create partial notification entries.
  • Tests
    • WebSocket tests improved to more robustly handle different frame types and termination conditions.

…ith dedup

Wire CDP-migrated providers (Slack, WhatsApp, Discord, Telegram) into the
core triage pipeline by calling ingestNotification in handleFired after the
existing Redux dispatches. Add a 60-second content-hash deduplication check
(exists_recent) in the Rust store, invoke it at the top of handle_ingest to
drop duplicate fires, and standardise observability prefixes to
[notification_intel] throughout rpc.rs.
@M3gA-Mind M3gA-Mind requested a review from a team April 24, 2026 11:24

coderabbitai Bot commented Apr 24, 2026

📝 Walkthrough

Walkthrough

Frontend forwards webview-fired notifications into an async ingest pipeline (fire-and-forget); backend store transactionally skips inserts for recent (60s) duplicates and returns { skipped: true } to avoid spawning triage; frontend logs/skips or dispatches addNotification based on ingest result.

Changes

  • Frontend notification ingestion (app/src/lib/webviewNotifications/service.ts): handleFired forwards payloads to ingestNotification asynchronously (fire-and-forget); logs forwarding; on ingest success dispatches addNotification with id, status: 'unread', and received_at; logs skip reasons or errors.
  • Backend RPC handlers (src/openhuman/notifications/rpc.rs): Replaced "[notifications::rpc]" prefixes with "[notification_intel]"; handle_ingest uses store::insert_if_not_recent to detect recent duplicates and returns { skipped: true, reason: "duplicate" } when skipping; avoids spawning triage on skip; updated log prefixes in other handlers and parameter/store error messages.
  • Store duplicate detection & tests (src/openhuman/notifications/store.rs): Added pub fn insert_if_not_recent(...) -> Result<bool> implementing transactional content/account-based deduplication within a 60s window; new compound index and tests validating duplicate skipping and expiry behavior.
  • Tests / Mock WebSocket (tests/webview_apis_bridge.rs): Mock WebSocket server receive loop now matches on all frame outcomes (not only successful text frames); explicitly terminates on close/errors, skips non-text frames, and continues to deserialize/respond to text messages.

Sequence Diagram

sequenceDiagram
    actor Webview
    participant Frontend as "WebviewNotifications Service"
    participant Backend as "notification_intel RPC"
    participant DB as "Notifications Store"
    participant Redux as "Redux Slice"

    Webview->>Frontend: webview notification fired
    Frontend->>Backend: ingestNotification (async, fire-and-forget)
    Backend->>DB: insert_if_not_recent(provider, account_id, title, body, received_at)
    DB-->>Backend: bool (inserted)
    alt Insert skipped (duplicate)
        Backend-->>Frontend: { skipped: true, reason: "duplicate" }
        Frontend->>Frontend: log skip reason
    else Inserted
        Backend->>DB: persist notification (completed)
        Backend->>Backend: spawn triage task
        Backend-->>Frontend: { id, status: "unread", received_at }
        Frontend->>Redux: dispatch addNotification
    end
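
The skip-vs-insert branching in the diagram can be modeled as a small result type. The field names mirror the JSON shapes quoted in the summary but are otherwise illustrative, not the real TypeScript/Rust types:

```rust
/// Illustrative model of the two ingest outcomes described above.
#[derive(Debug, PartialEq)]
enum IngestResult {
    Skipped { reason: &'static str },
    Inserted { id: u64, status: &'static str, received_at: u64 },
}

/// Stand-in for what the frontend does with the result:
/// log-and-drop on skip, dispatch addNotification on insert.
fn handle_result(res: &IngestResult) -> &'static str {
    match res {
        IngestResult::Skipped { reason } => {
            eprintln!("skip reason: {reason}"); // no Redux dispatch on skip
            "logged-skip"
        }
        IngestResult::Inserted { .. } => "dispatched-addNotification",
    }
}

fn main() {
    let dup = IngestResult::Skipped { reason: "duplicate" };
    // received_at value is an arbitrary illustrative epoch.
    let fresh = IngestResult::Inserted { id: 1, status: "unread", received_at: 1_714_000_000 };
    assert_eq!(handle_result(&dup), "logged-skip");
    assert_eq!(handle_result(&fresh), "dispatched-addNotification");
}
```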

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested reviewers

  • senamakel

Poem

🐰 I hopped a note from webview light,
Sniffed for repeats within sixty's sight.
If fresh, I trumpet, mark it new —
If twin, I pause and let it snooze.
Tiny paws keep logs polite ✨

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
  • Description Check ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed: The title accurately summarizes the main objective: bridging CDP-migrated providers into the core ingest pipeline and implementing deduplication logic.
  • Docstring Coverage ✅ Passed: Docstring coverage is 100.00%, which meets the required threshold of 80.00%.
  • Linked Issues Check ✅ Passed: Check skipped because no linked issues were found for this pull request.
  • Out of Scope Changes Check ✅ Passed: Check skipped because no linked issues were found for this pull request.



@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 5

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@app/src/lib/webviewNotifications/service.ts`:
- Around line 89-90: The debug log call using log('[notification_intel]
forwarding to core ingest provider=%s account=%s', provider, accountId) must not
emit raw PII; update the log in service.ts to keep provider but redact accountId
(e.g., replace accountId with a deterministic hash, a truncated fingerprint, or
a simple boolean like accountPresent=true/false) so the code still conveys
identity presence without logging the full email/identifier; ensure you
reference the same log call and variables (log, provider, accountId) when making
the change.

In `@src/openhuman/notifications/rpc.rs`:
- Around line 55-60: The debug log in the is_dup branch currently emits raw user
content via %req.title; change it to avoid logging PII by replacing the raw
title with a non-sensitive identifier (e.g., title length or a stable hash/ID).
Update the tracing::debug call inside the if is_dup block (around the
notification_intel duplicate check) to log provider and either title_len =
req.title.len() or title_hash = sha256(req.title) or an internal
notification_id, and remove %req.title from the message.
- Around line 45-63: The dedup race comes from calling store::exists_recent(...)
and then store::insert(...) in separate DB connections; replace those two calls
with a single transactional store API (e.g. add store::insert_if_not_recent or
store::insert_notification_tx) that opens one SQLite connection/transaction,
checks for a recent duplicate and inserts only if none is found, returning
whether the insert was skipped; update notification_ingest to call that new
transactional method instead of exists_recent and insert so the check+insert is
atomic and prevents concurrent duplicate inserts.

In `@src/openhuman/notifications/store.rs`:
- Around line 470-480: The test exists_recent_detects_duplicate only checks that
a freshly inserted matching row is found but doesn't verify that an older
(stale) matching row outside the 60-second window is ignored; update the test to
insert a second notification with the same channel/title/body but with
received_at set to older than 60 seconds (e.g., now - Duration::from_secs(61))
using sample_notification or by mutating the Notification before calling insert,
then assert that exists_recent(&config, "slack", None, "Test notification",
"Test body").unwrap() returns false for that stale row; keep the original fresh
insert assertions as well so you cover both fresh and stale behavior.
- Around line 237-264: The query in exists_recent compares the RFC3339 TEXT
column received_at lexicographically to datetime('now','-60 seconds'), causing
incorrect matches; update both SQL branches in exists_recent
(integration_notifications table) to compare timestamps numerically, e.g., use
strftime('%s', received_at) >= strftime('%s','now','-60 seconds') (or an
equivalent julianday/epoch conversion) so the comparison is by epoch seconds
rather than string ordering; keep the same params and error handling but replace
the received_at predicate in both the Some(account_id) and None branches.
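
The comparison pitfall called out above is visible with plain string ordering: SQLite's datetime('now', ...) renders timestamps with a space separator, while the stored received_at column is RFC3339 with a 'T', so a TEXT comparison disagrees with chronology regardless of the actual times. A minimal stdlib sketch (the timestamps are chosen for illustration):

```rust
fn main() {
    // Stored column value: RFC3339, 'T' separator (11:00 UTC).
    let stored = "2026-04-24T11:00:00Z";
    // Cutoff as SQLite's datetime('now','-60 seconds') formats it:
    // space separator, no zone suffix (11:30).
    let cutoff = "2026-04-24 11:30:00";

    // Chronologically stored < cutoff, so `received_at >= cutoff`
    // should be false here. Byte-wise it is true, because at index 10
    // 'T' (0x54) sorts after ' ' (0x20):
    assert!(stored > cutoff);
    println!("lexicographic comparison disagrees with chronology");
}
```

Hence the suggested fix of applying strftime('%s', ...) to both sides, so the predicate compares epoch seconds rather than text.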
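
The log-redaction suggestions above (for both service.ts and rpc.rs) amount to replacing raw identifiers with a fingerprint. A stdlib-only sketch; note that DefaultHasher is not guaranteed stable across Rust releases, so a production version would use a stable hash such as SHA-256 (external crate), as the comments suggest:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Turn a sensitive identifier into a short, grep-able fingerprint
/// for debug logs, so the raw value never reaches the log line.
fn log_fingerprint(value: &str) -> String {
    let mut h = DefaultHasher::new();
    value.hash(&mut h);
    format!("{:016x}", h.finish())
}

fn main() {
    let account_id = "someone@example.com"; // hypothetical identifier
    let fp = log_fingerprint(account_id);
    // Same input -> same fingerprint within one process...
    assert_eq!(fp, log_fingerprint(account_id));
    // ...and the raw identifier is not present in what gets logged.
    assert!(!fp.contains("example.com"));
    println!("[notification_intel] ingest provider=slack account_fp={fp}");
}
```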

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 82a015d2-4515-4f44-a3b8-961e9bedcfaa

📥 Commits

Reviewing files that changed from the base of the PR and between b81d04d and 64bb8ec.

📒 Files selected for processing (3)
  • app/src/lib/webviewNotifications/service.ts
  • src/openhuman/notifications/rpc.rs
  • src/openhuman/notifications/store.rs

Make ingest dedup atomic in a single transaction, compare recent-window timestamps by epoch, and redact account identifiers from debug logs while adding regression coverage for stale duplicate detection.

Made-with: Cursor

@graycyrus graycyrus left a comment


🔍 Code Review — PR #874

Walkthrough

This PR closes the ingestion gap for CDP-migrated providers (Slack, WhatsApp, Discord, Telegram) by wiring handleFired in the webview notifications service to call ingestNotification(...), bridging into the Rust core triage pipeline. It adds atomic content-hash deduplication via a new insert_if_not_recent store function (single SQLite transaction), standardises log prefixes to [notification_intel], and ships 3 new unit tests.

Changes

  • app/src/lib/webviewNotifications/service.ts: Wire handleFired → ingestNotification (fire-and-forget); dispatch addNotification to Redux on success.
  • src/openhuman/notifications/rpc.rs: Replace store::insert with atomic store::insert_if_not_recent; add duplicate-skip branch; rename all log prefixes.
  • src/openhuman/notifications/store.rs: Add insert_if_not_recent (transactional dedup + insert), exists_recent (standalone check), and 3 new tests.

Actionable Comments

See inline comments below.

Trigger a fresh PR merge commit against current main so CI re-evaluates checks with the up-to-date base state.

Made-with: Cursor
Enrich forwarded raw payload fields for triage context, make optimistic notification shape explicit, add a dedup query index, and remove unused exists_recent paths in favor of transactional insert_if_not_recent coverage.

Made-with: Cursor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (1)
src/openhuman/notifications/store.rs (1)

421-554: Consider splitting this module to stay under the Rust file-size guideline.

store.rs is now 555 lines. Moving tests (or dedup logic) into a dedicated submodule would keep this easier to maintain.

As per coding guidelines, "src/**/*.rs: Source files should be ≤ ~500 lines; split modules when growing to improve maintainability".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/openhuman/notifications/store.rs` around lines 421 - 554, The tests
module in store.rs has grown too large (≈555 lines); move the #[cfg(test)] mod
tests into a new test-only file or submodule to keep store.rs under ~500 lines.
Create a new file (e.g., store_tests.rs or tests/mod.rs) or a submodule (mod
tests;) and relocate the entire tests block, ensuring you import the same
symbols (insert, list, unread_count, mark_read, update_triage,
insert_if_not_recent, get_settings, upsert_settings, sample_notification,
test_config) and adjust visibility/imports (use super::* or
crate::openhuman::notifications::store::* as needed) so all tests compile
unchanged. Update store.rs to reference the new tests module with #[cfg(test)]
mod tests;.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/openhuman/notifications/store.rs`:
- Around line 503-525: Add a unit test that covers the Some(account_id)
deduplication branch by creating notifications with account_id set (use
sample_notification and then set n.account_id = Some("acct1".into())), insert
the first, then call insert_if_not_recent to assert that a duplicate with the
same account_id is rejected, and also assert that a notification with a
different account_id (e.g. Some("acct2".into())) is accepted; use
TempDir::new(), test_config, insert, and insert_if_not_recent to mirror the
existing tests (similar to insert_if_not_recent_skips_duplicate and
insert_if_not_recent_rejects_expired_window_only) so coverage exercises the
account_id IS NOT NULL path.
- Around line 119-177: Add development-oriented debug/trace logs inside
insert_if_not_recent: log an entry message at the start of insert_if_not_recent
with a stable grep-friendly prefix (e.g. "[notifications::store]
insert_if_not_recent entry") including non-PII correlation fields such as
n.provider and a boolean account_id.is_some(); log the dedup branch decision
when count > 0 with the same prefix and the dedup result ("duplicate") and then
an exit log before returning false; log a success exit after the INSERT commit
with the prefix and result ("inserted"); and log failures on query/insert/commit
error paths via debug/error with the same prefix but without logging PII (do not
log title, body, raw_payload, or full account identifiers). Ensure logs use
Rust's debug/trace macros and reference insert_if_not_recent and the transaction
(tx) operations to place them near the dedup query, the duplicate return, and
the post-insert commit.
- Around line 121-149: The DEFERRED transaction started by
unchecked_transaction() lets concurrent callers both see count==0 and leads to
duplicate inserts; replace unchecked_transaction() with
transaction_with_behavior(TransactionBehavior::Immediate) so the code acquires a
write lock before running the COUNT query and subsequent insert/commit (i.e.,
change how tx is created in the insert_if_not_recent flow so the COUNT + insert
are serialized), keeping the existing COUNT query, params, and commit/error
handling intact.
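
The race described above is the classic check-then-act gap; SQLite's IMMEDIATE behavior closes it by acquiring the write lock before the duplicate check runs. The same idea, modeled with a Mutex held across both steps (an in-memory stand-in for the transaction, not the rusqlite API):

```rust
use std::collections::HashSet;
use std::sync::{Arc, Mutex};
use std::thread;

/// Analogue of an IMMEDIATE transaction: the lock is taken *before*
/// the duplicate check, so check + insert form one atomic step.
fn insert_if_not_recent(store: &Mutex<HashSet<String>>, key: &str) -> bool {
    let mut guard = store.lock().unwrap(); // lock covers check AND insert
    if guard.contains(key) {
        return false; // duplicate: skip
    }
    guard.insert(key.to_string());
    true
}

fn main() {
    let store = Arc::new(Mutex::new(HashSet::new()));
    // Many concurrent "fires" of the same notification content:
    let handles: Vec<_> = (0..8)
        .map(|_| {
            let store = Arc::clone(&store);
            thread::spawn(move || insert_if_not_recent(&store, "slack|acct1|Hi|Body"))
        })
        .collect();
    let inserted: usize = handles
        .into_iter()
        .map(|h| h.join().unwrap())
        .filter(|&ok| ok)
        .count();
    // Exactly one insert wins. A DEFERRED-style gap between the check
    // and the insert could let several threads through.
    assert_eq!(inserted, 1);
    println!("atomic dedup ok");
}
```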

---

Nitpick comments:
In `@src/openhuman/notifications/store.rs`:
- Around line 421-554: The tests module in store.rs has grown too large (≈555
lines); move the #[cfg(test)] mod tests into a new test-only file or submodule
to keep store.rs under ~500 lines. Create a new file (e.g., store_tests.rs or
tests/mod.rs) or a submodule (mod tests;) and relocate the entire tests block,
ensuring you import the same symbols (insert, list, unread_count, mark_read,
update_triage, insert_if_not_recent, get_settings, upsert_settings,
sample_notification, test_config) and adjust visibility/imports (use super::* or
crate::openhuman::notifications::store::* as needed) so all tests compile
unchanged. Update store.rs to reference the new tests module with #[cfg(test)]
mod tests;.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: c77e1072-cd11-49ae-9005-155b0e9f83da

📥 Commits

Reviewing files that changed from the base of the PR and between e1bcd45 and 3fe34c3.

📒 Files selected for processing (2)
  • app/src/lib/webviewNotifications/service.ts
  • src/openhuman/notifications/store.rs
🚧 Files skipped from review as they are similar to previous changes (1)
  • app/src/lib/webviewNotifications/service.ts

Drop the second base64 key in app/src-tauri/Cargo.toml so cargo metadata, format checks, and tauri build/test jobs can parse the manifest in PR merge context.

Made-with: Cursor
Handle non-text websocket frames in the mock bridge loop so request_round_trips_list_labels_through_mock_server does not time out intermittently in CI.

Made-with: Cursor

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@tests/webview_apis_bridge.rs`:
- Around line 88-95: The message loop currently silently breaks or continues on
send failures, close frames, non-text messages, and receive errors; update the
loop (around the sink.send(Message::Text(...)).await and the match arms for
Ok(Message::Close(_)), Ok(_), and Err(_)) to log meaningful development-oriented
messages (including context like the payload or error) at each branch before
breaking/continuing — e.g., log send failure with the response text and error,
log receipt of Close with the close reason, log unexpected non-text Message with
its variant, and log receive errors with the error details; use the project’s
test/logging facility (e.g., tracing or test logger) so these paths surface
during flaky test debugging.
- Around line 92-95: Add unit tests in tests/webview_apis_bridge.rs that
exercise the non-text/close/error branches shown in the match
(Message::Close(_), non-text Ok(_), and Err(_)). Specifically, add one test
sending a Close message and assert the handler breaks/terminates as expected
(matching existing behavior used in the text tests), another test sending a
non-text successful message (e.g., a Binary/NonText Message variant) and assert
the handler continues/skips appropriately, and a third test simulating an Err
from the message stream and assert the handler breaks. Locate the
message-handling function used by the current text RPC tests (the same test
helper or function invoked by those tests) and reuse its setup to drive these
three new cases so they cover the newly introduced branches.
- Around line 62-64: The mock WebSocket server currently uses
serde_json::from_str(&text).unwrap() and
req["id"].as_str().unwrap()/req["method"].as_str().unwrap(), which will panic on
malformed frames; change the logic around req, id, and method so you gracefully
handle errors: parse serde_json::from_str(&text) with match/if let and on Err
send a JSON error response (or log and continue) instead of panicking, then
extract id and method with safe checks (e.g., req.get("id").and_then(|v|
v.as_str()) and same for "method") and send a proper error response if missing,
continuing the spawned server task rather than unwrapping. Ensure the error
responses use the same WS response path the test harness expects so the server
task remains alive.
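
The loop shape these comments describe can be sketched with a stdlib-only frame enum standing in for the websocket library's Message type (variant names and the string error type are illustrative):

```rust
/// Stand-in for the websocket Message type used by the mock server.
#[derive(Debug)]
enum Frame {
    Text(String),
    Binary(Vec<u8>),
    Close,
}

/// Receive loop with the behavior described above: respond to text
/// frames, skip non-text frames, terminate on close or receive error.
fn run_loop(frames: Vec<Result<Frame, String>>) -> Vec<String> {
    let mut responses = Vec::new();
    for frame in frames {
        match frame {
            Ok(Frame::Text(text)) => {
                // Real code parses JSON-RPC here; echoing keeps the sketch small.
                responses.push(format!("handled: {text}"));
            }
            Ok(Frame::Close) => break, // peer closed: stop cleanly
            Ok(other) => {
                // Binary/other frames: skip instead of wedging the loop.
                eprintln!("skipping non-text frame: {other:?}");
                continue;
            }
            Err(e) => {
                eprintln!("receive error, terminating: {e}");
                break;
            }
        }
    }
    responses
}

fn main() {
    let frames = vec![
        Ok(Frame::Text("req-1".into())),
        Ok(Frame::Binary(vec![0, 1])),
        Ok(Frame::Text("req-2".into())),
        Ok(Frame::Close),
        Ok(Frame::Text("never-seen".into())),
    ];
    // Both text frames are answered; the binary frame is skipped and
    // the frame after Close is never processed.
    assert_eq!(run_loop(frames), vec!["handled: req-1", "handled: req-2"]);
    println!("frame loop ok");
}
```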

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 79b59d01-b458-4055-a59b-2d285e78af6f

📥 Commits

Reviewing files that changed from the base of the PR and between 3fe34c3 and bdf79eb.

📒 Files selected for processing (1)
  • tests/webview_apis_bridge.rs


@graycyrus graycyrus left a comment


All review feedback addressed — dead code removed, dedup index added, raw_payload enriched, optional fields explicit. CI green. LGTM.

@graycyrus graycyrus merged commit 208c276 into tinyhumansai:main Apr 24, 2026
7 checks passed