## Bug Description
When using `datastar-patch-signals` (via the Go SDK's `PatchSignals`) to update an array signal with fewer elements than the current array, the old elements beyond the new array's length are not removed. This leaves "ghost" entries in the signal that cause downstream computed signals to produce `NaN` and break subsequent operations.
## Reproduction
Given a signal store initialized with a 3-element array:

```html
<div data-signals='{"items":[{"name":"a","value":1},{"name":"b","value":2},{"name":"c","value":3}]}'></div>
<div data-computed:total="$items.reduce((s,i)=>s+i.value,0)"></div>
<span data-text="$total"></span> <!-- shows 6 -->
```
Send an SSE `datastar-patch-signals` event with a 2-element array:

```json
{"items":[{"name":"a","value":1},{"name":"c","value":3}]}
```
**Expected:** `$items` has 2 elements and `$total` shows 4.

**Actual:** `$items` retains a ghost 3rd element with `undefined` properties, so `$total` shows `NaN` (because `undefined + number` is `NaN`). Subsequent operations that read `$items` (e.g., `.map()` for a POST body) include the ghost element, producing invalid data.
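The downstream `NaN` can be verified in isolation; in this sketch, the empty object stands in for the ghost entry left behind by the merge:

```javascript
// The ghost element's properties are undefined, and undefined + number = NaN,
// which then propagates through the computed reduce.
const items = [{ name: 'a', value: 1 }, { name: 'c', value: 3 }, {}] // {} ~ ghost entry
const total = items.reduce((s, i) => s + i.value, 0)
// total is NaN, so data-text="$total" renders "NaN"
```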
## Root Cause
The `mergePatch` → Proxy `set` path creates a new reactive proxy via `deep()` for the incoming array, but the internal merge logic iterates array indices as object keys rather than performing a wholesale array replacement. Old indices beyond the new array's length survive in the proxy store.
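A minimal simulation of that failure mode (my sketch, not Datastar's actual merge code): walking the patch's own indices as object keys never shrinks the target array, so trailing elements survive:

```javascript
// Simulated key-wise merge: copies each index present in the patch but
// never touches the target's length.
function mergeByKeys(target, patch) {
  for (const k of Object.keys(patch)) target[k] = patch[k]
  return target
}

const items = [{ name: 'a', value: 1 }, { name: 'b', value: 2 }, { name: 'c', value: 3 }]
mergeByKeys(items, [{ name: 'a', value: 1 }, { name: 'c', value: 3 }])
// items.length is still 3: index 2 was never removed
```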
This appears related to the RC.5 fix for array `length` setting; the loop in the Proxy `set` trap:

```typescript
if (isArr && prop === 'length') {
  const diff = (deepObj[prop] as unknown as number) - newValue
  deepObj[prop] = newValue
  if (diff > 0) {
    for (let i = newValue; i < deepObj[prop]; i++) { // uses NEW value → no-op
      patch[i] = null
    }
  }
}
```
After `deepObj[prop] = newValue`, the loop condition `i < deepObj[prop]` compares against the already-updated length, so the loop never executes and the removed indices are never nulled out of the patch.
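A hypothetical corrected sketch (not the actual Datastar fix) captures the old length before mutating it, so the truncation loop actually runs:

```javascript
// Fix sketch: remember the old length before assigning the new one, then
// record a null patch entry for each truncated index.
function setArrayLength(deepObj, newValue, patch) {
  const oldLength = deepObj.length
  deepObj.length = newValue
  if (oldLength > newValue) {
    for (let i = newValue; i < oldLength; i++) { // compares against the OLD length
      patch[i] = null
    }
  }
  return patch
}
```

With this version, `setArrayLength([1, 2, 3], 2, {})` truncates the array and records `{ 2: null }`, so subscribers see the deletion.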
## Workaround
Direct signal assignment via a Datastar expression works correctly:

```javascript
// This properly replaces the array (same mechanism as the SortableJS reorder)
$items = evt.detail.items
```

So dispatching a custom event from `ExecuteScript` and handling it with `data-on:itemspatched="$items = evt.detail.items"` bypasses the broken merge path.
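For reference, a sketch of the script the server could send via `ExecuteScript`; the helper name is illustrative (not SDK API), and the exact event target depends on where the `data-on` listener is attached:

```javascript
// Hypothetical helper that builds the script payload for ExecuteScript.
// The client side handles it with data-on:itemspatched="$items = evt.detail.items".
function buildItemsPatchedScript(items) {
  const detail = JSON.stringify({ items })
  return `document.dispatchEvent(new CustomEvent('itemspatched', { detail: ${detail} }))`
}
```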
## Environment
- Datastar JS: v1.0.0-RC.8
- Datastar Go SDK: v1.1.0