View Issue Details

ID: 0000552 | Project: GTA Connected | Category: Grand Theft Auto IV | View Status: public | Last Update: 2026-05-03 20:23
Reporter: skitzo | Assigned To: (none)
Priority: urgent | Severity: crash | Reproducibility: always
Status: new | Resolution: open
Platform: x64 | OS: Windows | OS Version: 11
Summary: 0000552: GTA:C game.receiveNetworkEvent CTD on identical payload after remote ped re-stream
Description

We’ve encountered a reproducible client crash (CTD) in GTAC 1.7.3 when re-injecting a byte-identical WEAPON_DAMAGE_EVENT via game.receiveNetworkEvent. The crash occurs only after a remote player ped has gone through a stream-out → stream-in cycle, suggesting an inconsistency in the internal netID-to-entity mapping after reallocation. The same payload works perfectly without prior streaming or on fresh stream-in, which indicates the issue is not data-related but tied to entity lifecycle/state.

A full, well-structured breakdown of the issue, including reproduction steps, logs, and analysis, is provided in the attached markdown file below.

Tags: No tags attached.
Game

Activities

skitzo

2026-05-03 20:23

reporter  

2026-05-03-gtac-receivenetworkevent-ctd-report.md (7,446 bytes)   
# GTAC `game.receiveNetworkEvent` CTD on identical payload after remote ped re-stream

**Subject:** `game.receiveNetworkEvent` CTD on byte-identical `WEAPON_DAMAGE_EVENT` payload after a remote ped stream-out → stream-in cycle

Hi GTAC team,

We have a clean A/B reproducer for a client CTD in GTAC 1.7.3 where re-injecting a *byte-identical* `WEAPON_DAMAGE_EVENT` payload via `game.receiveNetworkEvent` either applies damage cleanly OR crashes the IV process — depending solely on whether the receiving client's local network entity for the source player has been through a stream-out → stream-in cycle.

## Setup

- GTAC version: **1.7.3** (image built from `Dockerfile`, `GTAC_VERSION=1.7.3`).
- Two clients connected over LAN; both stock GTA IV 1.0.8.0.
- Server-side gamemode in TypeScript bundled with esbuild.
- We use the **default GTA IV nametags** — a name plus a circled dot beside it. Each player receives a **randomly-assigned color** for that nametag/dot, which turns out to be a useful debug indicator: when the engine reallocates a remote ped's network entity locally, the color visibly flips (e.g. red before separation, blue after re-stream).

## The relay

Server side — forwards every native IV multiplayer event to all clients:

```ts
addEventHandler("OnAddIVNetworkEvent", function (event, client, type, name, data, data2) {
    // Broadcast the raw IV event to all clients, appending the source client's index
    triggerNetworkEvent("ReceiveIVNetworkEvent", null, type, name, data, data2, client.index);
});
```

Client side — re-injects into the local IV engine:

```ts
addNetworkHandler("ReceiveIVNetworkEvent", (type, name, data, data2, from) => {
    // Re-inject the forwarded event into the local IV engine on behalf of `from`
    game.receiveNetworkEvent(0, from, type, 0, data, data2);
});
```

This relay is what powers our HP/damage system — the native `WEAPON_DAMAGE_EVENT` payloads from `OnAddIVNetworkEvent` are forwarded to all clients and re-injected via `game.receiveNetworkEvent` so that shooting and melee register cross-client.

## Reproducer

We've narrowed it to a single requirement: **the receiving client's local network entity for the source player must have gone through a stream-out → stream-in cycle.** Initial spawn proximity, who teleports, who attacks — none of that matters in isolation. There are two distinct crash-free shapes and one reliable crash shape, all distinguished only by streaming history.

### Crash-free — no prior stream-out

**Scenario A1 — co-located the entire time:**

1. A and B spawn close together at the login spawn, so streaming kicks in immediately. They never separate far enough to stream out.
2. Each gets the other's nametag with a randomly-assigned color (no subsequent flips).
3. They punch / shoot each other freely. Damage propagates normally and HP updates correctly on both sides. No crash.

**Scenario A2 — joined later, no prior streaming overlap:**

1. A is already in-game far from spawn.
2. B logs in at the spawn — A is well outside B's streaming range, so A is never streamed in for B.
3. B then teleports / walks to A. The first stream-in happens here, and the engine builds the network entity from scratch with no prior stale state.
4. B punches A three times. Damage applies normally each time. No crash.

### Crash case — stream-out followed by re-stream

**Scenario B — co-located → separate → reunite:**

1. A and B spawn co-located (e.g. login spawn) — they stream each other in. Each gets the other's nametag with a random color.
2. B teleports far away — both peds stream OUT on the opposite client.
3. A teleports to B's new location, triggering a fresh stream-IN.
4. **The nametag color above B's head visibly changes on A's screen** at this moment (e.g. it was red before separation, now blue) — visual artifact of A's engine recreating B's network entity after the free → reallocate cycle.
5. B punches A. **A's GTA IV process CTDs immediately** — silent exit, no engine error overlay.

## Captured payloads — these are byte-identical

We instrumented `OnAddIVNetworkEvent` server-side to dump every event with a hex view. Here are the actual punch events from each scenario:
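The dump format in the captures below comes from a helper along these lines (a simplified sketch of our instrumentation, not the exact parser code we'd be sharing):

```ts
// Hex-dump helper: 16 bytes per row, offset gutter on the left,
// printable-ASCII gutter on the right (non-printables shown as ".").
function hexDump(bytes: Uint8Array): string {
  const lines: string[] = [];
  for (let off = 0; off < bytes.length; off += 16) {
    const chunk = bytes.subarray(off, off + 16);
    const hex = Array.from(chunk, (b) => b.toString(16).padStart(2, "0")).join(" ");
    const ascii = Array.from(chunk, (b) =>
      b >= 0x20 && b < 0x7f ? String.fromCharCode(b) : "."
    ).join("");
    // 47 = width of a full 16-byte hex column (16 * 3 - 1)
    lines.push(`  ${off.toString(16).padStart(4, "0")}  ${hex.padEnd(47)}  |${ascii}|`);
  }
  return lines.join("\n");
}
```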

**Scenario A2 (any of three successful punches):**

```
[IVNet#6] type=3/WEAPON_DAMAGE_EVENT name="WEAPON_DAMAGE_EVENT" dataLen=3 data2Len=2
[hexDump] #6 data (3 bytes):
  0000  c0 81 a4   |...|
[hexDump] #6 data2 (2 bytes):
  0000  6e 68      |nh|
parsed data2: hitEntityNetId=26734 hitEntityType=255 flags=0x00
                (note: data2 is only 2 bytes; type/flags fields read past buffer end)
```

**Scenario B (the one punch that CTDs A):**

```
[IVNet#4] type=3/WEAPON_DAMAGE_EVENT name="WEAPON_DAMAGE_EVENT" dataLen=3 data2Len=2
[hexDump] #4 data (3 bytes):
  0000  c0 81 8e   |...|
[hexDump] #4 data2 (2 bytes):
  0000  6e 68      |nh|
parsed data2: hitEntityNetId=26734 hitEntityType=255 flags=0x00
```

Same shape, same source, same `hitEntityNetId` (26734 — A's ped from B's perspective). Only the third byte of `data` differs (a damage sub-value), and its value in Scenario B falls within the same range as across A2's three successful punches. **The payload is not the problem.**
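For completeness, the `data2` parse shown in the dumps boils down to a little-endian uint16 read at offset 0 (sketch; the bounds flag is why the type/flags fields printed garbage in our earlier output):

```ts
// data2 parsing sketch: hitEntityNetId is a little-endian uint16 at offset 0.
// Our original parser also printed type/flags fields that sit past the end of
// this 2-byte buffer, hence the "read past buffer end" note in the A2 capture.
function parseData2(data2: Uint8Array): { hitEntityNetId: number; hasTypeAndFlags: boolean } {
  if (data2.length < 2) throw new RangeError("data2 too short for hitEntityNetId");
  const view = new DataView(data2.buffer, data2.byteOffset, data2.byteLength);
  return {
    hitEntityNetId: view.getUint16(0, /* littleEndian */ true),
    hasTypeAndFlags: data2.length >= 4, // false for these 2-byte captures
  };
}
```

With the captured bytes `6e 68`, this yields `hitEntityNetId: 26734` (0x686e), matching both scenarios.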

## What we believe is happening

- In Scenarios A1 and A2, A's local IV engine has either a long-lived entity for B (A1) or a freshly-built one with no prior state (A2). `game.receiveNetworkEvent` resolves the netID correctly against a consistent entity table.
- In Scenario B, A's engine first allocates an entity for B (login co-location), frees it when B teleports out of streaming range, then allocates a *new* entity when A teleports back into B's range. The internal netID → entity mapping is left inconsistent — visible as the nametag color flipping.
- The next `game.receiveNetworkEvent` call dereferences a stale or freed pointer → segfault, no error overlay.
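
This hypothesis can be modeled in miniature. The following is purely our speculation about the lifecycle, not actual GTAC or IV engine code — a toy table where stream-out frees an entity but a previously-resolved handle survives:

```ts
// Toy model of the suspected bug: a netID → entity table where stream-out
// frees the entity while an event path still holds the old resolution.
interface Entity { netId: number; generation: number; freed: boolean; }

class EntityTable {
  private map = new Map<number, Entity>();
  private gen = 0;

  streamIn(netId: number): Entity {
    const e: Entity = { netId, generation: this.gen++, freed: false };
    this.map.set(netId, e);
    return e;
  }
  streamOut(netId: number): void {
    const e = this.map.get(netId);
    if (e) e.freed = true; // freed, but any cached reference still points here
  }
  resolve(netId: number): Entity | undefined {
    return this.map.get(netId);
  }
}

// Scenario B in miniature: resolve once, stream out, re-stream in.
const table = new EntityTable();
table.streamIn(26734);
const cached = table.resolve(26734)!; // stale handle kept somewhere in the event path
table.streamOut(26734);
table.streamIn(26734); // reallocation → generation bumps, old object is dead

// In managed code `cached.freed === true` is observable; in native code the
// equivalent dereference is a use-after-free → silent CTD.
```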

## Notable: teleport itself fires no surfaced IV events

We expected a flurry of `OBJECT_ID_FREED_EVENT` / `MARK_AS_NO_LONGER_NEEDED_EVENT` / `REQUEST_CONTROL_EVENT` / `GIVE_CONTROL_EVENT` around the teleport in Scenario B's server log, but **none of those fire through `OnAddIVNetworkEvent`**. The only events between spawn and CTD are 3× `RESURRECTED_LOCAL_PLAYER_EVENT` (spawn) and the lethal `WEAPON_DAMAGE_EVENT`. So either teleport doesn't generate those events, or they fire only client-internally and aren't surfaced to scripts.

## What we'd like to know

1. Is the relay pattern (`OnAddIVNetworkEvent` → `triggerNetworkEvent` → `game.receiveNetworkEvent` on each client) actually a supported way to propagate IV-native systems (notably weapon-hit registration) in a GTAC server-authoritative setup? If not, what's the recommended approach?

2. Is `game.receiveNetworkEvent(0, from, type, 0, data, data2)` the correct signature? What should arguments 1 and 4 (the two zeroes) be when re-injecting a forwarded event?

3. Should `game.receiveNetworkEvent` validate netID → entity resolution before dereferencing? The crash is silent and total — there's no error overlay.

4. Is there an API to query whether a remote `client.index` is *currently streamed in* on the local client? We could gate re-injection on this and add a brief grace window after a fresh stream-in to let the engine settle.

5. Has anyone else hit this pattern? Our codebase has a prior commit that disables the relay entirely as a workaround, but the root cause was never identified until now.
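
The gate we have in mind for question 4 would reduce to a pure predicate like this (a sketch; `GRACE_MS` is an assumed tunable, and the stream-in timestamp would have to come from whatever API — if any — exposes local streamed-in state, which is exactly what we're asking about):

```ts
// Only re-inject when the source ped is streamed in locally AND has been
// streamed in for at least a short grace window, letting the engine settle.
const GRACE_MS = 500; // assumed settle time, would be tuned empirically

function shouldInject(streamedInSinceMs: number | null, nowMs: number): boolean {
  if (streamedInSinceMs === null) return false; // not streamed in at all
  return nowMs - streamedInSinceMs >= GRACE_MS; // still settling → hold the event
}
```

The client handler would then consult `shouldInject` before calling `game.receiveNetworkEvent`, dropping (or queuing) events during the window after a fresh stream-in.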

Happy to share the full gtac container logs for both scenarios, our parser code, and the client handler.

Thanks for any pointers!

— LCRP team

Issue History

Date Modified Username Field Change
2026-05-03 20:23 skitzo New Issue
2026-05-03 20:23 skitzo File Added: 2026-05-03-gtac-receivenetworkevent-ctd-report.md