[BUG]: Redundant Sync Re-initialization & Subscription Overlap in sync.ts #190

Closed
opened 2026-04-11 16:13:37 +00:00 by rajanarahul93 · 4 comments

Summary

A structural flaw in src/app/core/sync.ts causes the entire synchronization state for a Space to be torn down and restarted whenever a user navigates between rooms. Additionally, overlapping subscriptions result in duplicate network traffic for every event.

Root Cause

  1. Redundant Re-initialization: The syncSpaces store subscription includes $page.params.h in its key derivation. Navigating between rooms changes the roomsKey, which triggers unsubscriber() and aborts all active connections for the space before immediately restarting them.
  2. Subscription Overlap: syncSpace initializes a relay-wide listener for content kinds while simultaneously spawning dedicated per-room listeners for the same kinds, so every message is requested and processed twice.
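The re-initialization problem can be modeled as a keyed restart (a minimal sketch; KeyedSync and all names here are hypothetical, not the actual sync.ts API):

```typescript
// Minimal model of the teardown behavior: a keyed manager restarts its
// subscriptions whenever the derived key changes.
type Unsubscriber = () => void

class KeyedSync {
  private key?: string
  private unsub?: Unsubscriber
  restarts = 0

  update(key: string, start: () => Unsubscriber) {
    if (key === this.key) return // same key: keep connections alive
    this.unsub?.()               // key changed: tear everything down
    this.key = key
    this.unsub = start()
    this.restarts++
  }
}

const start = () => () => {}

// Bug: the key includes the room param, so each navigation restarts the sync
const buggy = new KeyedSync()
buggy.update("space:room-a", start)
buggy.update("space:room-b", start) // buggy.restarts is now 2

// Fix: derive the key from the space alone, so room navigation is a no-op
const fixedSync = new KeyedSync()
fixedSync.update("space", start)
fixedSync.update("space", start) // fixedSync.restarts stays 1
```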

Impact

  • Network Churn: Excessive CLOSE/REQ cycles on every navigation.
  • Overhead: Double bandwidth and CPU consumption per room.
  • Scalability: High risk of reaching relay subscription limits in large spaces.

Suggested Fix

  • Decouple room-level synchronization from the top-level $page-dependent space sync.
  • Deduplicate listeners by merging room-specific filters into the broader space-level subscription where possible.
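The second suggestion can be sketched as a filter merge (a hedged illustration; mergeRoomFilters and the filter shape are my own names, not the codebase's):

```typescript
// Sketch of deduplicating listeners: rather than one REQ per room plus a
// relay-wide REQ for the same kinds, fold the open rooms into one filter
// scoped by the `h` tag. Kinds and shapes here are illustrative.
type Filter = { kinds: number[]; "#h"?: string[] }

const mergeRoomFilters = (kinds: number[], rooms: string[]): Filter[] =>
  rooms.length > 0
    ? [{ kinds, "#h": rooms }] // one subscription covering every room
    : [{ kinds }]              // space-wide fallback when no room is targeted

const merged = mergeRoomFilters([9, 11], ["room-a", "room-b"])
// merged is a single filter covering both rooms
```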
Author

@hodlbod can i work on this!!

Owner

The redundancy in syncSpace is by design: some relay implementations are picky and won't respond to filters without an h tag. If the relay supports negentropy, it mostly prevents duplicate syncing, although you're right that there are multiple listeners.

You're right that the page sync is redundant and harmful. In an effort to understand it I came up with this patch: 9f386f69

If there's anything else that needs to be done let me know, and feel free to open a PR.
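The h-tag point above can be illustrated with two filter shapes (illustration only; the kind number and variable names are mine, not the codebase's):

```typescript
// Why per-room listeners exist alongside the broad one: some relays ignore
// a bare kind filter for group content, so a filter scoped with the `h` tag
// is sent as well.
type Filter = { kinds: number[]; "#h"?: string[] }

const broad: Filter = { kinds: [9] }                    // may get no response
const scoped: Filter = { kinds: [9], "#h": ["room-a"] } // relay-friendly
```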

rajanarahul93 was assigned by hodlbod 2026-04-11 17:22:37 +00:00
hodlbod added the dev label 2026-04-11 17:22:39 +00:00
hodlbod added this to the Current milestone 2026-04-11 17:22:41 +00:00
Author

No additional changes needed from this patch [9f386f69](https://gitea.coracle.social/coracle/flotilla/commit/9f386f69)
@hodlbod
Owner

Great, thanks!

Reference: coracle/flotilla#190