Information Disclosure

MEDIUM
facebook/react
Commit: 894bc73cb493
Affected: 19.2.0 - 19.2.3
2026-04-03 23:35 UTC

Description

The commit hardens the handling of Flight server references to prevent information disclosure: server-side function source code could previously leak through stringification or implicit toString conversions. It adds safeguards so that server references do not expose their implementation body when converted to strings, introduces explicit error handling for invalid payloads, and improves robustness against Promise cycles and early stream termination. It also changes how server references are encoded (serializeServerReferenceID now emits a '$h' prefix instead of '$F') to reduce the chance of leaking internal identifiers. In short, the patch mitigates an information-disclosure risk associated with server function references and addresses related resource-handling concerns (cycle protection, stream cleanup).
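As an illustration of the stringification safeguard described above, here is a minimal sketch of hiding a function body from String()/toString while preserving its name. This is not React's actual implementation; makeOpaqueReference and getSecret are hypothetical names used only for demonstration.

```javascript
// Sketch: hide a function's source from String()/toString conversions
// while keeping its `name`, mirroring the behavior the patch describes.
// makeOpaqueReference and getSecret are illustrative, not React APIs.
function makeOpaqueReference(fn) {
  Object.defineProperty(fn, 'toString', {
    value: () => `function ${fn.name}() { /* implementation hidden */ }`,
    writable: false,
    configurable: false,
  });
  return fn;
}

function getSecret() {
  const secret = 'default-secret'; // stands in for sensitive server logic
  return { secret };
}

const ref = makeOpaqueReference(getSecret);
// The body no longer appears in any string conversion:
console.log(String(ref));
// The name is still preserved, matching the commit's note about the `name` field:
console.log(ref.name);
```

Defining toString as an own, non-configurable property shadows Function.prototype.toString, so both explicit String(ref) calls and implicit coercions see the placeholder instead of the source text.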

Proof of Concept

Proof-of-concept (exploit) before the fix:

1) Set up a server that exposes a Flight server function returning sensitive logic or secrets:

    // Server (Flight server function)
    export function getSecret() {
      const secret = process.env.SECRET || 'default-secret';
      return { secret };
    }

2) The client consumes this server function as part of a payload passed through Flight (i.e., as data, not as a plain function reference):

    // Client-side (receives a payload containing a server reference)
    fetchFlightPayload().then((payload) => {
      // payload.ref is a server function reference that originates from the server
      console.log(String(payload.ref));     // Before the fix, this can print the server function source code
      console.log(JSON.stringify(payload)); // May also leak the function body if treated as data
    });

Expected behavior before the fix:

- String(payload.ref) would produce the server function's source code body (e.g., "function getSecret() { const secret = ...; return { secret }; }"), leaking internal implementation details to the client log and enabling information disclosure.

3) After applying the patch, stringification of the server function reference is sanitized:

- String(payload.ref) yields a placeholder such as "[ServerFunction]" rather than the actual function body.
- Internal identifiers and source code are not leaked via serialized payloads.

Concrete steps to verify the mitigation (post-fix behavior):

- Retrieve a Flight payload that contains a server function reference.
- Evaluate String(payload.ref) and confirm the output is a sanitized placeholder, not the function body.
- Optionally run JSON.stringify(payload) and verify that no sensitive server code is included in the serialized data.
- Confirm that toString or JSON encoding of the reference no longer reveals server implementation details.
Notes: - The PoC demonstrates the information-disclosure risk that the patch targets and what you would expect to see before vs after the fix. The exact placeholder string may vary by environment, but the key expectation is that server function bodies are not exposed in string representations or serialized payloads after the patch.
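The verification steps above can be sketched as a small helper. assertNoSourceLeak is a hypothetical name, and the sanitized reference below is simulated; a real check would run against an actual Flight payload.

```javascript
// Sketch: scan a serialized payload for sensitive source fragments.
// assertNoSourceLeak is illustrative; adapt it to a real Flight payload.
function assertNoSourceLeak(payload, sensitiveMarkers) {
  // Functions are normally dropped by JSON.stringify, so the replacer
  // forces them through String() to expose what their toString reveals.
  const serialized = JSON.stringify(payload, (key, value) =>
    typeof value === 'function' ? String(value) : value
  );
  const leaks = sensitiveMarkers.filter((marker) => serialized.includes(marker));
  if (leaks.length > 0) {
    throw new Error(`Serialized payload leaks: ${leaks.join(', ')}`);
  }
  return serialized;
}

// Simulate a post-fix reference whose toString yields a placeholder:
const sanitizedRef = () => {};
sanitizedRef.toString = () => '[ServerFunction]';
const serialized = assertNoSourceLeak(
  { ref: sanitizedRef },
  ['process.env', 'default-secret']
);
console.log(serialized);
```

A pre-fix reference whose toString returned the real function body would make the same call throw, since the body's markers would appear in the serialized string.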

Commit Details

Author: Sebastian Markbåge

Date: 2025-12-11 20:24 UTC

Message:

[Flight] Patch Promise cycles and toString on Server Functions (#35345)

Server Functions can be stringified (sometimes implicitly) when passed as data. This adds an override to hide the source code in that case - just in case someone puts sensitive information in there. Note that this still preserves the `name` field, but this is also available on the export and in practice is likely minified anyway. There's nothing else on these references we'd consider unsafe unless you explicitly expose expandos which are part of the `"use server"` export.

This adds a safety check to ensure you don't encode cyclic Promises. This isn't a parser bug per se. Promises do have a safety mechanism that avoids infinite looping. However, since we use custom Thenables, what can happen is that every time a native Promise awaits one, another Promise wrapper is created around the Thenable, which foils the ECMAScript Promise cycle detection and can lead to an infinite loop.

This also ensures that embedded `ReadableStream` and `AsyncIterable` streams are properly closed if the source stream closes early, both on the Server and the Client. This doesn't cause an infinite loop, but it makes sure resource cleanup can proceed properly.

We're also adding some more explicit, clear errors for invalid payloads since we no longer need to obfuscate the original issue.
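The stream-lifecycle part of this change adds a `closed` flag so that close and error handlers run at most once (visible in the diff for startReadableStream and startAsyncIterable). A standalone sketch of that guard pattern, where createCloseOnce is a hypothetical name:

```javascript
// Sketch of the "close once" guard the patch adds around stream
// close/error handlers. createCloseOnce is illustrative, not a React export.
function createCloseOnce(controller) {
  let closed = false;
  return {
    close() {
      if (closed) {
        return; // already settled; ignore late close signals
      }
      closed = true;
      controller.close();
    },
    error(reason) {
      if (closed) {
        return; // already settled; ignore errors after close
      }
      closed = true;
      controller.error(reason);
    },
  };
}

// Usage: even if the source signals close twice and then errors,
// the underlying controller is settled exactly once.
let settleCount = 0;
const guarded = createCloseOnce({
  close() { settleCount++; },
  error() { settleCount++; },
});
guarded.close();
guarded.close();
guarded.error(new Error('late error'));
```

Without the flag, a source stream that closes early and later errors could call into an already-closed controller, which is exactly the double-settle scenario the patch guards against.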

Triage Assessment

Vulnerability Type: Information Disclosure

Confidence: MEDIUM

Reasoning:

The commit adds safeguards to prevent leaking server function source code when stringifying server references, which mitigates information disclosure risks. It also adds cycle protection for Promises and ensures streams are closed properly, addressing potential resource-related issues that could be exploited (e.g., infinite loops / DoS). These changes collectively harden security behavior in server-function data handling.
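The Promise-cycle risk mentioned above arises because re-wrapping custom Thenables defeats native ECMAScript cycle detection. A sketch of an explicit structural check (findThenableCycle is a hypothetical name; React's actual check lives inside the Flight serializer):

```javascript
// Sketch: detect a thenable whose resolution data refers back to itself,
// rather than relying on native Promise cycle detection, which the commit
// notes is foiled by repeated wrapping of custom Thenables.
// Conservative: a shared (diamond) reference to the same thenable is
// also flagged, which is acceptable for a safety check.
function findThenableCycle(value, seen = new Set()) {
  if (value === null || typeof value !== 'object') {
    return false;
  }
  if (typeof value.then === 'function') {
    if (seen.has(value)) {
      return true; // this thenable is reachable from itself: a cycle
    }
    seen.add(value);
  }
  for (const key of Object.keys(value)) {
    if (findThenableCycle(value[key], seen)) {
      return true;
    }
  }
  return false;
}

// A custom thenable whose resolution value contains itself:
const cyclic = { then(resolve) { resolve(cyclic.result); }, result: null };
cyclic.result = { inner: cyclic };
console.log(findThenableCycle(cyclic)); // cycle detected
```

Erroring out on such a cycle turns a potential infinite encoding loop (a DoS vector) into an explicit, diagnosable failure.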

Verification Assessment

Vulnerability Type: Information Disclosure

Confidence: MEDIUM

Affected Versions: 19.2.0 - 19.2.3

Code Diff

diff --git a/packages/react-client/src/ReactFlightClient.js b/packages/react-client/src/ReactFlightClient.js index 49da30ccf4c1..0db63ecd3efa 100644 --- a/packages/react-client/src/ReactFlightClient.js +++ b/packages/react-client/src/ReactFlightClient.js @@ -894,6 +894,7 @@ function resolveModuleChunk<T>( const resolvedChunk: ResolvedModuleChunk<T> = (chunk: any); resolvedChunk.status = RESOLVED_MODULE; resolvedChunk.value = value; + resolvedChunk.reason = null; if (__DEV__) { const debugInfo = getModuleDebugInfo(value); if (debugInfo !== null) { @@ -1114,6 +1115,8 @@ export function reportGlobalError( // because we won't be getting any new data to resolve it. if (chunk.status === PENDING) { triggerErrorOnChunk(response, chunk, error); + } else if (chunk.status === INITIALIZED && chunk.reason !== null) { + chunk.reason.error(error); } }); if (__DEV__) { @@ -1462,15 +1465,95 @@ function fulfillReference( ): void { const {handler, parentObject, key, map, path} = reference; - for (let i = 1; i < path.length; i++) { + try { + for (let i = 1; i < path.length; i++) { + while ( + typeof value === 'object' && + value !== null && + value.$$typeof === REACT_LAZY_TYPE + ) { + // We never expect to see a Lazy node on this path because we encode those as + // separate models. This must mean that we have inserted an extra lazy node + // e.g. to replace a blocked element. We must instead look for it inside. + const referencedChunk: SomeChunk<any> = value._payload; + if (referencedChunk === handler.chunk) { + // This is a reference to the thing we're currently blocking. We can peak + // inside of it to get the value. 
+ value = handler.value; + continue; + } else { + switch (referencedChunk.status) { + case RESOLVED_MODEL: + initializeModelChunk(referencedChunk); + break; + case RESOLVED_MODULE: + initializeModuleChunk(referencedChunk); + break; + } + switch (referencedChunk.status) { + case INITIALIZED: { + value = referencedChunk.value; + continue; + } + case BLOCKED: { + // It is possible that we're blocked on our own chunk if it's a cycle. + // Before adding the listener to the inner chunk, let's check if it would + // result in a cycle. + const cyclicHandler = resolveBlockedCycle( + referencedChunk, + reference, + ); + if (cyclicHandler !== null) { + // This reference points back to this chunk. We can resolve the cycle by + // using the value from that handler. + value = cyclicHandler.value; + continue; + } + // Fallthrough + } + case PENDING: { + // If we're not yet initialized we need to skip what we've already drilled + // through and then wait for the next value to become available. + path.splice(0, i - 1); + // Add "listener" to our new chunk dependency. + if (referencedChunk.value === null) { + referencedChunk.value = [reference]; + } else { + referencedChunk.value.push(reference); + } + if (referencedChunk.reason === null) { + referencedChunk.reason = [reference]; + } else { + referencedChunk.reason.push(reference); + } + return; + } + case HALTED: { + // Do nothing. We couldn't fulfill. + // TODO: Mark downstreams as halted too. + return; + } + default: { + rejectReference( + response, + reference.handler, + referencedChunk.reason, + ); + return; + } + } + } + } + value = value[path[i]]; + } + while ( typeof value === 'object' && value !== null && value.$$typeof === REACT_LAZY_TYPE ) { - // We never expect to see a Lazy node on this path because we encode those as - // separate models. This must mean that we have inserted an extra lazy node - // e.g. to replace a blocked element. We must instead look for it inside. 
+ // If what we're referencing is a Lazy it must be because we inserted one as a virtual node + // while it was blocked by other data. If it's no longer blocked, we can unwrap it. const referencedChunk: SomeChunk<any> = value._payload; if (referencedChunk === handler.chunk) { // This is a reference to the thing we're currently blocking. We can peak @@ -1491,132 +1574,57 @@ function fulfillReference( value = referencedChunk.value; continue; } - case BLOCKED: { - // It is possible that we're blocked on our own chunk if it's a cycle. - // Before adding the listener to the inner chunk, let's check if it would - // result in a cycle. - const cyclicHandler = resolveBlockedCycle( - referencedChunk, - reference, - ); - if (cyclicHandler !== null) { - // This reference points back to this chunk. We can resolve the cycle by - // using the value from that handler. - value = cyclicHandler.value; - continue; - } - // Fallthrough - } - case PENDING: { - // If we're not yet initialized we need to skip what we've already drilled - // through and then wait for the next value to become available. - path.splice(0, i - 1); - // Add "listener" to our new chunk dependency. - if (referencedChunk.value === null) { - referencedChunk.value = [reference]; - } else { - referencedChunk.value.push(reference); - } - if (referencedChunk.reason === null) { - referencedChunk.reason = [reference]; - } else { - referencedChunk.reason.push(reference); - } - return; - } - case HALTED: { - // Do nothing. We couldn't fulfill. - // TODO: Mark downstreams as halted too. - return; - } - default: { - rejectReference( - response, - reference.handler, - referencedChunk.reason, - ); - return; - } } } + break; } - value = value[path[i]]; - } - while ( - typeof value === 'object' && - value !== null && - value.$$typeof === REACT_LAZY_TYPE - ) { - // If what we're referencing is a Lazy it must be because we inserted one as a virtual node - // while it was blocked by other data. 
If it's no longer blocked, we can unwrap it. - const referencedChunk: SomeChunk<any> = value._payload; - if (referencedChunk === handler.chunk) { - // This is a reference to the thing we're currently blocking. We can peak - // inside of it to get the value. - value = handler.value; - continue; - } else { - switch (referencedChunk.status) { - case RESOLVED_MODEL: - initializeModelChunk(referencedChunk); + const mappedValue = map(response, value, parentObject, key); + parentObject[key] = mappedValue; + + // If this is the root object for a model reference, where `handler.value` + // is a stale `null`, the resolved value can be used directly. + if (key === '' && handler.value === null) { + handler.value = mappedValue; + } + + // If the parent object is an unparsed React element tuple, we also need to + // update the props and owner of the parsed element object (i.e. + // handler.value). + if ( + parentObject[0] === REACT_ELEMENT_TYPE && + typeof handler.value === 'object' && + handler.value !== null && + handler.value.$$typeof === REACT_ELEMENT_TYPE + ) { + const element: any = handler.value; + switch (key) { + case '3': + transferReferencedDebugInfo(handler.chunk, fulfilledChunk); + element.props = mappedValue; + break; + case '4': + // This path doesn't call transferReferencedDebugInfo because this reference is to a debug chunk. + if (__DEV__) { + element._owner = mappedValue; + } break; - case RESOLVED_MODULE: - initializeModuleChunk(referencedChunk); + case '5': + // This path doesn't call transferReferencedDebugInfo because this reference is to a debug chunk. 
+ if (__DEV__) { + element._debugStack = mappedValue; + } + break; + default: + transferReferencedDebugInfo(handler.chunk, fulfilledChunk); break; } - switch (referencedChunk.status) { - case INITIALIZED: { - value = referencedChunk.value; - continue; - } - } - } - break; - } - - const mappedValue = map(response, value, parentObject, key); - parentObject[key] = mappedValue; - - // If this is the root object for a model reference, where `handler.value` - // is a stale `null`, the resolved value can be used directly. - if (key === '' && handler.value === null) { - handler.value = mappedValue; - } - - // If the parent object is an unparsed React element tuple, we also need to - // update the props and owner of the parsed element object (i.e. - // handler.value). - if ( - parentObject[0] === REACT_ELEMENT_TYPE && - typeof handler.value === 'object' && - handler.value !== null && - handler.value.$$typeof === REACT_ELEMENT_TYPE - ) { - const element: any = handler.value; - switch (key) { - case '3': - transferReferencedDebugInfo(handler.chunk, fulfilledChunk); - element.props = mappedValue; - break; - case '4': - // This path doesn't call transferReferencedDebugInfo because this reference is to a debug chunk. - if (__DEV__) { - element._owner = mappedValue; - } - break; - case '5': - // This path doesn't call transferReferencedDebugInfo because this reference is to a debug chunk. 
- if (__DEV__) { - element._debugStack = mappedValue; - } - break; - default: - transferReferencedDebugInfo(handler.chunk, fulfilledChunk); - break; + } else if (__DEV__ && !reference.isDebug) { + transferReferencedDebugInfo(handler.chunk, fulfilledChunk); } - } else if (__DEV__ && !reference.isDebug) { - transferReferencedDebugInfo(handler.chunk, fulfilledChunk); + } catch (error) { + rejectReference(response, reference.handler, error); + return; } handler.deps--; @@ -1882,6 +1890,7 @@ function loadServerReference<A: Iterable<any>, T>( const initializedChunk: InitializedChunk<T> = (chunk: any); initializedChunk.status = INITIALIZED; initializedChunk.value = handler.value; + initializedChunk.reason = null; if (resolveListeners !== null) { wakeChunk(response, resolveListeners, handler.value, initializedChunk); } else { @@ -2359,7 +2368,7 @@ function parseModelString( // Symbol return Symbol.for(value.slice(2)); } - case 'F': { + case 'h': { // Server Reference const ref = value.slice(2); return getOutlinedModel( @@ -3138,6 +3147,7 @@ function startReadableStream<T>( streamState: StreamState, ): void { let controller: ReadableStreamController = (null: any); + let closed = false; const stream = new ReadableStream({ type: type, start(c) { @@ -3195,6 +3205,10 @@ function startReadableStream<T>( } }, close(json: UninitializedModel): void { + if (closed) { + return; + } + closed = true; if (previousBlockedChunk === null) { controller.close(); } else { @@ -3205,6 +3219,10 @@ function startReadableStream<T>( } }, error(error: mixed): void { + if (closed) { + return; + } + closed = true; if (previousBlockedChunk === null) { // $FlowFixMe[incompatible-call] controller.error(error); @@ -3265,6 +3283,7 @@ function startAsyncIterable<T>( (chunk: any); initializedChunk.status = INITIALIZED; initializedChunk.value = {done: false, value: value}; + initializedChunk.reason = null; if (resolveListeners !== null) { wakeChunkIfInitialized( response, @@ -3294,6 +3313,9 @@ function 
startAsyncIterable<T>( nextWriteIndex++; }, close(value: UninitializedModel): void { + if (closed) { + return; + } closed = true; if (nextWriteIndex === buffer.length) { buffer[nextWriteIndex] = createResolvedIteratorResultChunk( @@ -3321,6 +3343,9 @@ function startAsyncIterable<T>( } }, error(error: Error): void { + if (closed) { + return; + } closed = true; if (nextWriteIndex === buffer.length) { buffer[nextWriteIndex] = diff --git a/packages/react-client/src/ReactFlightReplyClient.js b/packages/react-client/src/ReactFlightReplyClient.js index f75f54f4ead0..4dc13ce48607 100644 --- a/packages/react-client/src/ReactFlightReplyClient.js +++ b/packages/react-client/src/ReactFlightReplyClient.js @@ -104,7 +104,7 @@ function serializePromiseID(id: number): string { } function serializeServerReferenceID(id: number): string { - return '$F' + id.toString(16); + return '$h' + id.toString(16); } function serializeTemporaryReferenceMarker(): string { @@ -112,7 +112,6 @@ function serializeTemporaryReferenceMarker(): string { } function serializeFormDataReference(id: number): string { - // Why K? F is "Function". D is "Date". What else? return '$K' + id.toString(16); } @@ -474,8 +473,22 @@ export function processReply( } } + const existingReference = writtenObjects.get(value); + // $FlowFixMe[method-unbinding] if (typeof value.then === 'function') { + if (existingReference !== undefined) { + if (modelRoot === value) { + // This is the ID we're currently emitting so we need to write it + // once but if we discover it again, we refer to it by id. + modelRoot = null; + } else { + // We've already emitted this as an outlined object, so we can + // just refer to that by its existing ID. + ... [truncated]