This PR was opened by the Changesets release GitHub action. When you're ready to do a release, you can merge this and the packages will be published to npm automatically. If you're not ready to do a release yet, that's fine, whenever you add more changesets to main, this PR will be updated.
Releases
@walkthru-earth/objex@1.2.1
Patch Changes
aa62ae4 Thanks @yharby! - v1.2.1 focuses on making authenticated reads from S3-compatible buckets actually work in the browser, and fixing a handful of smaller bugs surfaced along the way. No breaking changes. Both packages bump together via the changesets `fixed` config.

**Authenticated S3-compatible reads (the headline fix)**
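The mechanism at the heart of this fix is SigV4 query-string presigning. As orientation for the details that follow, here is a minimal, illustrative sketch of roughly what `aws4fetch`-style query signing produces; the connection field names are hypothetical, the encoding is simplified, and production code should use a vetted library:

```typescript
import { createHmac, createHash } from "node:crypto";

// Illustrative SigV4 query-string presigner: roughly what aws4fetch's
// signQuery option produces. Field names on `conn` are hypothetical.
interface PresignConn {
  accessKeyId: string;
  secretAccessKey: string;
  region: string;
  endpoint: string; // e.g. "https://storage.googleapis.com"
}

function presignSketch(
  conn: PresignConn,
  bucket: string,
  key: string,
  expiresIn = 604800, // 7 days: the SigV4 protocol maximum
  now = new Date(),
): string {
  const amzDate = now.toISOString().replace(/[-:]/g, "").slice(0, 15) + "Z";
  const date = amzDate.slice(0, 8);
  const scope = `${date}/${conn.region}/s3/aws4_request`;
  const host = new URL(conn.endpoint).host;
  const path = `/${bucket}/` + key.split("/").map(encodeURIComponent).join("/");

  const params = new URLSearchParams({
    "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
    "X-Amz-Credential": `${conn.accessKeyId}/${scope}`,
    "X-Amz-Date": amzDate,
    "X-Amz-Expires": String(expiresIn),
    "X-Amz-SignedHeaders": "host",
  });
  params.sort(); // SigV4 requires a sorted canonical query string

  const canonicalRequest = [
    "GET",
    path,
    params.toString(),
    `host:${host}\n`, // canonical headers block (host only)
    "host", // signed headers list
    "UNSIGNED-PAYLOAD", // payload hash placeholder for presigned GETs
  ].join("\n");

  const stringToSign = [
    "AWS4-HMAC-SHA256",
    amzDate,
    scope,
    createHash("sha256").update(canonicalRequest).digest("hex"),
  ].join("\n");

  // Derive the signing key: chained HMACs over date, region, service.
  let k = createHmac("sha256", "AWS4" + conn.secretAccessKey).update(date).digest();
  for (const part of [conn.region, "s3", "aws4_request"]) {
    k = createHmac("sha256", k).update(part).digest();
  }
  const signature = createHmac("sha256", k).update(stringToSign).digest("hex");
  params.set("X-Amz-Signature", signature);
  return `${conn.endpoint}${path}?${params.toString()}`;
}
```

Because the signature travels in the query string, the browser never attaches an `Authorization` header, which is what keeps the CORS preflight out of the picture.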
Before:
`signed-s3` connections produced `s3://bucket/key` URLs. DuckDB-WASM's httpfs and the other fetchers signed each request with `Authorization: AWS4-HMAC-SHA256 ...`. The `Authorization` header triggers a CORS preflight, and the preflight is fragile on GCS, where `responseHeader` is dual-purpose (`Access-Control-Expose-Headers` AND `Access-Control-Allow-Headers`): any request header the browser sends that is not listed is silently dropped from the preflight response, the preflight returns 200 without `Access-Control-Allow-Origin`, and the browser blocks the real request.

After: a new
`presignHttpsUrl(conn, key, expiresIn?)` helper in `storage/presign.ts` uses `aws4fetch.signQuery` to return a presigned HTTPS URL with `X-Amz-Signature` in the query string. `buildHttpsUrlAsync` and `buildDuckDbUrlAsync` (new in `utils/url.ts`) surface it to callers, and `resolveTableSourceAsync` (new in `query/source.ts`) wires it into the table-source pipeline. DuckDB httpfs and every other range-request fetcher can now issue `GET` with only a `Range` header, keeping the preflight trivial. The 7-day expiry matches the SigV4 protocol maximum, the hard cap on every provider in the registry.

Viewers migrated to await the async builders so their external fetchers receive a self-authenticating URL:
`TableViewer` (via `resolveTableSourceAsync`; re-populates the editor only if the user has not edited the generated SQL during the await), `CogViewer`, `CopcViewer`, `ArchiveViewer`, `FlatGeobufViewer`, `PmtilesViewer`, `StacMapViewer`, `CodeViewer`, `ZarrViewer`, `ZarrMapViewer`. `PmtilesMapView` drops its unused sync import.

`configureStorage(conn, connId, sourceRef?)` in `query/wasm.ts` now short-circuits the full `SET s3_access_key_id / secret / region / endpoint / url_style` block whenever the source ref points at a presigned HTTPS URL (`isHttpsSourceRef(ref)`). Every caller threads the ref or raw SQL through: schema / row-count / CRS probes pass `source.ref`; data-query paths (`query`, `queryForMap`, `queryCancellable`, `queryForMapCancellable`) pass the raw SQL (the regex matches `read_parquet('https://...')` embedded in SQL too). Net effect: one worker round-trip saved per query on every presigned tab, not just at tab open.

**Secondary fixes kept from the same workstream:**
- `configureStorage` falls back to `resolveProviderEndpoint()` when the connection's `endpoint` field is empty and the provider is not plain S3. Covers GCS, DO Spaces, Wasabi, B2, Storj, Contabo, Hetzner, Linode, and OVHcloud, so auto-detected `?url=` connections that omit the endpoint still route DuckDB to the correct host on the `s3://` fallback path.
- `configureStorage` hardened against Svelte-proxied `connId` values: template-literal use of a proxied primitive could throw `TypeError: can't convert symbol to string` inside the swallowed catch. `connId` is normalized to a plain string at the top of the function.
- GCS CORS help (`CORS_HELP.gcs`) updated. The `cors.json` template now includes `Authorization`, `x-amz-content-sha256`, `x-amz-date`, plus `x-amz-*` and `x-goog-*` wildcards, and adds `Range` plus the conditional `If-Match` / `If-Modified-Since` / `If-None-Match` / `If-Unmodified-Since` headers so DuckDB httpfs partial reads pass the preflight. The accompanying note explains that `responseHeader` is dual-purpose and that missing entries cause silent preflight rejections.

**Credential prompt for private `?url=` buckets**

Auto-detected buckets opened via the `?url=` query param were always saved with `anonymous: true`. When the URL pointed at a private bucket, the first LIST request failed silently and no credential prompt opened; the only workaround was to manually edit the connection in the sidebar.

Now
`BrowserCloudAdapter.listPageS3`, `listPageGcs`, and `BrowserAzureAdapter.listPage` throw a typed `AuthRequiredError` on 401 / 403. The browser store catches it during the first LIST of an anonymous connection and surfaces it on a reactive `authRequired` field. `Sidebar.svelte` watches that field, flips the connection to `anonymous: false`, and calls `ensureCredentials()`, which opens the credential dialog so the user can paste HMAC keys or a SAS token. Public buckets keep the zero-click auto-open flow: the LIST returns 200 and `authRequired` is never triggered.

**Arrow DECIMAL values render correctly**
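A sketch of the two-shape scale parsing and the scaled rendering described in the next paragraph. Helper names follow the changelog, but the actual implementations in `query/wasm.ts` may differ; this only illustrates the technique:

```typescript
// Parse the decimal scale out of either type spelling:
//   Arrow toString():  "Decimal[10e+2]"  (precision, "e", signed scale)
//   DuckDB DESCRIBE:   "DECIMAL(10,2)"
// Returns the scale, or null when the type string is not a decimal.
function decimalScale(typeStr: string): number | null {
  const m =
    /^decimal\s*(?:\[\s*\d+\s*e\s*([+-]?\d+)\s*\]|\(\s*\d+\s*,\s*(\d+)\s*\))/i.exec(typeStr);
  if (!m) return null;
  return Number(m[1] ?? m[2]);
}

// Render a raw unscaled decimal (here modeled as bigint; Arrow may hand
// back a Uint32Array that must first be recombined) at the given scale,
// e.g. 12345n at scale 2 -> "123.45".
function formatDecimal(raw: bigint, scale: number): string {
  const neg = raw < 0n;
  const digits = (neg ? -raw : raw).toString().padStart(scale + 1, "0");
  const whole = digits.slice(0, digits.length - scale) || "0";
  const frac = scale > 0 ? "." + digits.slice(digits.length - scale) : "";
  return (neg ? "-" : "") + whole + frac;
}
```

With only the DESCRIBE branch of the regex, `decimalScale("Decimal[10e+2]")` returns null, which is exactly the failure mode the paragraph below describes.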
`query()` / `queryCancellable()` in `query/wasm.ts` derive column types from `String(field.type)` on the Arrow schema, which emits `Decimal[10e+2]` (precision, `e`, signed scale), not the DuckDB `DESCRIBE` form `DECIMAL(10,2)`. The initial `decimalScale()` regex matched only the DESCRIBE shape, so `decimalCols` stayed empty and every DECIMAL column fell through to `.get(i)` and rendered as raw `Uint32Array` / `BigInt` (for example, `"12345,0,0,0"` for `123.45`). The regex now matches both shapes, so `formatDecimal()` actually runs and cells render as scaled decimal strings.

**Geometry column auto-detection no longer false-positives**
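The token-boundary matching described below can be sketched like this; the hint list and helper names are illustrative, not the actual `findGeoColumn` internals:

```typescript
// Token-based matching: split identifiers on snake_case, kebab-case,
// camelCase, and numeric boundaries, then require an exact token hit
// instead of a substring search.
const GEO_HINTS = new Set(["geom", "geo", "wkb", "shape"]); // illustrative subset

function tokenize(name: string): string[] {
  return name
    .replace(/([a-z0-9])([A-Z])/g, "$1 $2") // camelCase boundary
    .replace(/([A-Za-z])([0-9])/g, "$1 $2") // letter-to-digit boundary
    .replace(/([0-9])([A-Za-z])/g, "$1 $2") // digit-to-letter boundary
    .split(/[\s_\-]+/)
    .filter(Boolean)
    .map((t) => t.toLowerCase());
}

function looksLikeGeometryColumn(name: string): boolean {
  return tokenize(name).some((t) => GEO_HINTS.has(t));
}
```

Under this scheme `n_geographic_entities` tokenizes to `n` / `geographic` / `entities`, none of which is an exact hint, while `wkb_geometry` still matches on `wkb`.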
`findGeoColumn` matched its name hints (`geom`, `geo`, `wkb`, `shape`, ...) with `String.includes`, so a column like `n_geographic_entities` (INT) was detected as a geometry column because it contains `geo`. The fallback now tokenizes column names on snake_case / kebab-case / camelCase / numeric boundaries and requires an exact token match, eliminating the false positives. Earlier priorities (exact known names via `GEO_NAMES`, typed GEOMETRY / WKB_GEOMETRY columns) are unchanged.

**Invalid-TIFF surface message in `CogViewer`**

`@developmentseed/geotiff` throws `Only tiff supported version: <n>` when the first four bytes of the file do not match a TIFF / BigTIFF signature (`II*\0`, `MM\0*`, `II+\0`, `MM\0+`). This fires on files that advertise `image/tiff` but are corrupt, encrypted, or a different format entirely (GDAL returns "not recognized as being in a supported file format" on the same bytes). `CogViewer` now traps that error during the pre-flight read and shows a clear, localized `map.cogInvalidTiff` message instead of letting `COGLayer` re-invoke the loader and crash uncaught.

**`@walkthru-earth/objex-utils` packaging and surface**

- `exports["."]` split into nested `import.types` → `./dist/index.d.ts` and `require.types` → `./dist/index.d.cts`, so CJS consumers resolve to the `.d.cts` emitted by `tsup`. `publint` now reports "All good!" on the package build.
- Newly exported: `QuerySource`, `AccessMode`, `AccessModeInput`, `getAccessMode`, `isPubliclyStreamable`, `resolveProviderEndpoint`, plus the previously-missed `exportToCsv` and `exportToJson`.
- `docs/cog.md` trimmed to only the pure, peer-dep-free helpers actually re-exported. The render-pipeline helpers (`selectCogPipeline`, `createConfigurableGetTileData`, `normalizeCogGeotiff`, `createEpsgResolver`, `fitCogBounds`, `renderNonTiledBitmap`, ...) are now explicitly called out as "not re-exported here" so consumers know to depend on the full `@walkthru-earth/objex` package if they need them.
- `docs/storage.md` documents `resolveProviderEndpoint` and the tightened GCS CORS guidance.

@walkthru-earth/objex-utils@1.2.1
Patch Changes
aa62ae4 Thanks @yharby! - v1.2.1 focuses on making authenticated reads from S3-compatible buckets actually work in the browser, and fixing a handful of smaller bugs surfaced along the way. No breaking changes. Both packages bump together via the changesets `fixed` config. The changeset body is identical to the `@walkthru-earth/objex@1.2.1` entry above.
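For reference, the dual-types `exports["."]` split called out in the packaging notes might look like this in `package.json`; the `.d.ts` / `.d.cts` paths come from the changelog, while the package name placement and the `default` entries are illustrative tsup output names:

```json
{
  "name": "@walkthru-earth/objex-utils",
  "exports": {
    ".": {
      "import": {
        "types": "./dist/index.d.ts",
        "default": "./dist/index.js"
      },
      "require": {
        "types": "./dist/index.d.cts",
        "default": "./dist/index.cjs"
      }
    }
  }
}
```

Nesting `types` under each condition (rather than one top-level `types`) is what lets CJS consumers resolve the `.d.cts` file instead of the ESM declarations.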