[POC] Compression for sync streams #329
Draft
rkistner wants to merge 8 commits into main from compression
Commits (8)
ef0e1bd Enable gzip, zstd compression for sync streams.
e67ae61 Actually enable compression.
6d79a1f Fix error handling in compressed streams.
da3da38 [WIP] stream tests.
bf5885b Log encoding used.
256b95a Semi-functional test.
1df8769 Add metric for actual bytes sent, powersync_data_sent_bytes_total.
9976ff8 Merge remote-tracking branch 'origin/main' into compression
Filter by extension
Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
There are no files selected for viewing
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters.
Learn more about bidirectional Unicode characters
Original file line number | Diff line number | Diff line change |
---|---|---|
@@ -0,0 +1,78 @@
import { PassThrough, pipeline, Readable, Transform } from 'node:stream';
import type Negotiator from 'negotiator';
import * as zlib from 'node:zlib';
import { RequestTracker } from '../sync/RequestTracker.js';

/**
 * Compress a streamed response.
 *
 * `@fastify/compress` can do something similar, but does not appear to work as well on streamed responses.
 * The manual implementation is simple enough, and gives us more control over the low-level details.
 *
 * @param negotiator Negotiator from the request, used to negotiate the response encoding
 * @param stream plain-text stream
 * @param tracker request tracker, used to record the number of compressed bytes sent
 * @returns the (possibly compressed) stream, along with the content-encoding header to set
 */
export function maybeCompressResponseStream(
  negotiator: Negotiator,
  stream: Readable,
  tracker: RequestTracker
): { stream: Readable; encodingHeaders: { 'content-encoding'?: string } } {
  // Cast to `any`: the `preferred` option is not present in the Negotiator typings.
  const encoding = (negotiator as any).encoding(['identity', 'gzip', 'zstd'], { preferred: 'zstd' });
  if (encoding == 'zstd') {
    tracker.setCompressed(encoding);
    return {
      stream: transform(
        stream,
        // Available since Node v23.8.0, v22.15.0.
        // This does the actual compression in a background thread pool.
        zlib.createZstdCompress({
          // We need to flush the frame after every new input chunk, to avoid delaying data
          // in the output stream.
          flush: zlib.constants.ZSTD_e_flush,
          params: {
            // The default compression level is 3. We reduce this slightly to limit CPU overhead.
            [zlib.constants.ZSTD_c_compressionLevel]: 2
          }
        }),
        tracker
      ),
      encodingHeaders: { 'content-encoding': 'zstd' }
    };
  } else if (encoding == 'gzip') {
    tracker.setCompressed(encoding);
    return {
      stream: transform(
        stream,
        zlib.createGzip({
          // We need to flush the frame after every new input chunk, to avoid delaying data
          // in the output stream.
          flush: zlib.constants.Z_SYNC_FLUSH
        }),
        tracker
      ),
      encodingHeaders: { 'content-encoding': 'gzip' }
    };
  } else {
    return {
      stream: stream,
      encodingHeaders: {}
    };
  }
}

function transform(source: Readable, compressor: Transform, tracker: RequestTracker) {
  // pipe() does not forward error events automatically, resulting in unhandled error
  // events. pipeline() forwards errors to the callback, where we destroy the output stream.
  const out = new PassThrough();
  const trackingTransform = new Transform({
    transform(chunk, encoding, callback) {
      tracker.addCompressedDataSent(chunk.length);
      callback(null, chunk);
    }
  });
  pipeline(source, compressor, trackingTransform, out, (err) => {
    if (err) out.destroy(err);
  });
  return out;
}
@@ -0,0 +1,59 @@
import {
  BucketStorageFactory,
  createCoreAPIMetrics,
  MetricsEngine,
  OpenTelemetryMetricsFactory,
  RouteAPI,
  RouterEngine,
  ServiceContext,
  StorageEngine,
  SyncContext,
  SyncRulesBucketStorage
} from '@/index.js';
import { MeterProvider } from '@opentelemetry/sdk-metrics';

export function mockServiceContext(storage: Partial<SyncRulesBucketStorage> | null) {
  // This is very incomplete - just enough to get the current tests passing.

  const storageEngine: StorageEngine = {
    activeBucketStorage: {
      async getActiveStorage() {
        return storage;
      }
    } as Partial<BucketStorageFactory>
  } as any;

  const meterProvider = new MeterProvider({
    readers: []
  });
  const meter = meterProvider.getMeter('powersync-tests');
  const metricsEngine = new MetricsEngine({
    disable_telemetry_sharing: true,
    factory: new OpenTelemetryMetricsFactory(meter)
  });
  createCoreAPIMetrics(metricsEngine);
  const service_context: Partial<ServiceContext> = {
    syncContext: new SyncContext({ maxBuckets: 1, maxDataFetchConcurrency: 1, maxParameterQueryResults: 1 }),
    routerEngine: {
      getAPI() {
        return {
          getParseSyncRulesOptions() {
            return { defaultSchema: 'public' };
          }
        } as Partial<RouteAPI>;
      },
      addStopHandler() {
        return () => {};
      }
    } as Partial<RouterEngine> as any,
    storageEngine,
    metricsEngine: metricsEngine,
    // Not used
    configuration: null as any,
    lifeCycleEngine: null as any,
    migrations: null as any,
    replicationEngine: null as any,
    serviceMode: null as any
  };
  return service_context as ServiceContext;
}
@@ -0,0 +1,84 @@
import { BasicRouterRequest, Context, SyncRulesBucketStorage } from '@/index.js';
import { logger, RouterResponse, ServiceError } from '@powersync/lib-services-framework';
import { SqlSyncRules } from '@powersync/service-sync-rules';
import { Readable, Writable } from 'stream';
import { pipeline } from 'stream/promises';
import { beforeEach, describe, expect, it, vi } from 'vitest';
import { syncStreamed } from '../../../src/routes/endpoints/sync-stream.js';
import { mockServiceContext } from './mocks.js';

describe('Stream Route', () => {
  describe('compressed stream', () => {
    it('handles missing sync rules', async () => {
      const context: Context = {
        logger: logger,
        service_context: mockServiceContext(null)
      };

      const request: BasicRouterRequest = {
        headers: {},
        hostname: '',
        protocol: 'http'
      };

      const error = (await (syncStreamed.handler({ context, params: {}, request }) as Promise<RouterResponse>).catch(
        (e) => e
      )) as ServiceError;

      expect(error.errorData.status).toEqual(500);
      expect(error.errorData.code).toEqual('PSYNC_S2302');
    });

    it('handles a stream error with compression', async () => {
      // This primarily tests that an underlying storage error doesn't result in an uncaught error
      // when compressing the stream.

      const storage = {
        getParsedSyncRules() {
          return new SqlSyncRules('bucket_definitions: {}');
        },
        watchCheckpointChanges: async function* (options) {
          throw new Error('Simulated storage error');
        }
      } as Partial<SyncRulesBucketStorage>;
      const serviceContext = mockServiceContext(storage);

      const context: Context = {
        logger: logger,
        service_context: serviceContext,
        token_payload: {
          exp: Date.now() / 1000 + 10000,
          iat: Date.now() / 1000 - 10000,
          sub: 'test-user'
        }
      };

      // It may be worth eventually doing this via Fastify to test the full stack.

      const request: BasicRouterRequest = {
        headers: {
          'accept-encoding': 'gzip'
        },
        hostname: '',
        protocol: 'http'
      };

      const response = await (syncStreamed.handler({ context, params: {}, request }) as Promise<RouterResponse>);
      expect(response.status).toEqual(200);
      const stream = response.data as Readable;
      const r = await drainWithTimeout(stream).catch((error) => error);
      expect(r.message).toContain('Simulated storage error');
    });
  });
});

export async function drainWithTimeout(readable: Readable, ms = 2_000) {
  // Discard everything that is read.
  const devNull = new Writable({
    write(_chunk, _enc, cb) {
      cb();
    }
  });

  // Throws an AbortError and destroys the stream if draining takes longer than `ms`.
  await pipeline(readable, devNull, { signal: AbortSignal.timeout(ms) });
}
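`drainWithTimeout` relies on `stream.pipeline`'s `AbortSignal` support: when the signal fires, pipeline destroys the streams and rejects with an `AbortError`. A small standalone sketch of that behavior (the endless source is made up for illustration):

```typescript
import { Readable, Writable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

// A source that keeps producing data and never ends.
const endless = new Readable({
  read() {
    this.push('tick\n');
  }
});
const devNull = new Writable({
  write(_chunk, _enc, cb) {
    cb();
  }
});

async function abortedName(): Promise<string> {
  try {
    await pipeline(endless, devNull, { signal: AbortSignal.timeout(50) });
    return 'no error';
  } catch (err: any) {
    // pipeline destroys both streams and rejects once the timeout signal fires.
    return err.name;
  }
}

abortedName().then((name) => console.log(name));
```

This is why the test above can assert on the storage error directly: a stream that ends with an error rejects the pipeline with that error instead, while a stream that never ends is cut off by the timeout.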