Fix bugs and add features #3103

Open
DevFlex-AI wants to merge 6 commits into onlook-dev:main from DevFlex-AI:main

Conversation


DevFlex-AI commented Apr 26, 2026

Description

I added more features that were missing according to the README, and improved on my last PR, which was not safe to merge; this version is safer.

Related Issues

Type of Change

  • Bug fix
  • New feature
  • Documentation
  • Refactor
  • Other (please describe):

Testing

I built the project and the build completed successfully.

Screenshots (if applicable)

Additional Notes

Summary by CodeRabbit

  • New Features

    • Image download and save functionality in chat context
    • Figma added as import option alongside GitHub and local imports
    • GitHub pull request creation workflow from branches
    • Collaboration features including comment management and MCP support
    • Git repository metadata integration for project imports
  • Documentation

    • Added unified implementation roadmap with feature catalog and rollout guidelines

DevFlex-AI and others added 6 commits April 25, 2026 17:25

coderabbitai Bot commented Apr 26, 2026

📝 Walkthrough

Walkthrough

This pull request introduces image save functionality for chat contexts, extends project creation with GitHub metadata (repo URL, base branch), adds GitHub PR workflow endpoints, restructures the import page to support multiple import types (including Figma), and introduces a new workspace router with collaboration features (PixelRaft comments, MCP command execution, image storage). A documentation page outlining the unified Onlook implementation roadmap is also added.

Changes

Cohort / File(s) Summary
Image Save to Project
image-pill.tsx, input-context-pills.tsx
Adds save button to external images in chat. Sanitizes filenames from MIME types, checks for naming conflicts with counter loop, fetches image bytes from context, writes to project's image folder via sandbox, updates ImageMessageContext with local path and metadata, and surfaces outcomes via toast notifications.
GitHub Metadata & PR Integration
use-repo-import.ts, project.ts, github.ts
Extends project creation to include github-import tag and Git metadata (repoUrl, baseBranch). Adds two new protected endpoints: getBranchGitState (validates branch access, returns parsed repo metadata) and createPullRequestFromBranch (computes file diffs via sandbox provider, syncs changes to GitHub head branch, creates PR with fallback to existing open PR).
Import Page Refactoring
import/page.tsx
Refactors from hardcoded two-card layout to data-driven configuration with ImportType union (local, github, figma) and importCards metadata. Uses importPathMap with Routes constants for navigation, renders all configured import options.
Workspace Collaboration Router
workspace.ts
New TRPC router providing protected endpoints for per-branch collaboration: PixelRaft comment management (create, update resolution, delete via collaboration-comments.json), MCP command execution (load/save config, compose and run commands), and base64 image storage with extension derivation and sanitized filename handling.
Router Integration & Routes
root.ts, routers/index.ts, constants/index.ts
Adds workspaceRouter to appRouter via new workspace namespace. Exports workspace module through barrel file. Introduces IMPORT_FIGMA route constant (/projects/import/figma).
Documentation
meta.json, unified-onlook-implementation-plan.mdx
Adds navigation entry for new implementation plan documentation page. Comprehensive roadmap specifying governance rules, quality gates, risk matrix, acceptance criteria, 306 mapped UI features, and per-milestone sign-off checklist.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant UI as GitHub Router Endpoint
    participant Branch as Branch/Project DB
    participant Sandbox as Sandbox Provider
    participant GitHub as GitHub API
    participant Fallback as Fallback PR Lookup

    User->>UI: createPullRequestFromBranch(branchId, repoUrl)
    UI->>Branch: Validate user has access & repo connected
    alt Access Denied
        UI-->>User: TRPCError (UNAUTHORIZED)
    end
    UI->>Sandbox: Get provider for sandboxId
    Sandbox->>Sandbox: Compute changed files (added/modified/deleted)
    UI->>GitHub: Ensure head branch ref exists
    UI->>Sandbox: For each changed file: read from sandbox
    loop For each file
        alt File modified/added
            GitHub->>GitHub: Sync to head branch (create/update)
        else File deleted
            GitHub->>GitHub: Delete from head branch
        end
    end
    UI->>GitHub: Create pull request (base → head)
    alt PR creation fails
        UI->>Fallback: Look up existing open PR (base/head)
        Fallback-->>UI: Return existing PR or null
    end
    UI-->>User: Success with PR data or error

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

Poem

🐰 A rabbit's ode to collaboration
When images hop into local ground,
And branches dance with GitHub's sound,
While workspaces sing comment tunes,
Our features grow like spring's bright blooms! 🌱✨

🚥 Pre-merge checks | ✅ 2 | ❌ 3

❌ Failed checks (2 warnings, 1 inconclusive)

  • Description check (⚠️ Warning): The description lacks specifics about the actual changes; it contains typos and vague claims without concrete details about what features were added or bugs fixed. Resolution: provide a detailed description of the major changes (image saving from chat, GitHub PR creation flow, workspace collaboration features, Figma import routing), include links to related issues, and clarify which bugs were fixed.
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 0.00%, which is insufficient; the required threshold is 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
  • Title check (❓ Inconclusive): The title is vague and generic, using non-descriptive terms; it does not convey what the changeset actually implements despite substantial feature additions. Resolution: replace it with a specific title that captures the main change, such as 'Add image save and GitHub PR creation features' or 'Implement PixelRaft collaboration and workspace router'.
✅ Passed checks (2 passed)
  • Linked Issues check (✅ Passed): Check skipped because no linked issues were found for this pull request.
  • Out of Scope Changes check (✅ Passed): Check skipped because no linked issues were found for this pull request.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.




vercel Bot commented Apr 26, 2026

@cto-new[bot] is attempting to deploy a commit to the Onlook Team on Vercel.

A member of the Team first needs to authorize it.


chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 35e66df02b

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

const importPathMap: Record<ImportType, string> = {
    local: `${Routes.IMPORT_PROJECT}/local`,
    github: Routes.IMPORT_GITHUB,
    figma: Routes.IMPORT_FIGMA,
};


P1 Badge Disable Figma import card until route exists

This maps the new Figma card to Routes.IMPORT_FIGMA (/projects/import/figma), but there is no corresponding App Router page under apps/web/client/src/app/projects/import/figma, so clicking a primary import option now sends users to a 404. Either add that route or keep the card non-clickable/hidden until the flow is implemented.


const parseGitHubRepoUrl = (repoUrl: string): { owner: string; repo: string } => {
    const normalizedUrl = repoUrl.trim();

    const httpsMatch = normalizedUrl.match(/^https?:\/\/github\.com\/([^/]+)\/([^/.]+)(\.git)?$/i);


P2 Badge Accept dotted repo names when parsing GitHub URLs

The repo segment regex ([^/.]+) rejects valid GitHub repository names that contain dots (for example org/my.repo), so getBranchGitState and createPullRequestFromBranch will fail with Invalid GitHub repository URL for those projects even when clone_url is valid. The parser should allow dots in repo names and only trim an optional trailing .git suffix.

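One way to implement this suggestion is sketched below. The function name and return shape mirror the excerpt above, but the exact patterns are an assumption, not the PR's actual code:

```typescript
// Sketch of a dot-tolerant GitHub repo URL parser. The repo capture is
// non-greedy so an optional trailing '.git' (and an optional trailing slash)
// stay outside it, allowing dotted repo names like org/my.repo.
const parseGitHubRepoUrl = (repoUrl: string): { owner: string; repo: string } => {
    const normalizedUrl = repoUrl.trim();

    const httpsMatch = normalizedUrl.match(
        /^https?:\/\/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/i,
    );
    const sshMatch = normalizedUrl.match(/^git@github\.com:([^/]+)\/([^/]+?)(?:\.git)?$/i);

    const match = httpsMatch ?? sshMatch;
    if (!match?.[1] || !match?.[2]) {
        throw new Error(`Invalid GitHub repository URL: ${repoUrl}`);
    }
    return { owner: match[1], repo: match[2] };
};
```

With these patterns, `https://github.com/org/my.repo` parses to repo `my.repo`, and `git@github.com:onlook-dev/onlook.git` still strips the `.git` suffix.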

Comment on lines +536 to +538
throw new TRPCError({
    code: 'INTERNAL_SERVER_ERROR',
    message: error instanceof Error ? error.message : 'Failed to create pull request',


P2 Badge Preserve existing TRPC error codes in PR creation

This catch block converts all failures to INTERNAL_SERVER_ERROR, including intentional TRPCErrors raised earlier in the same mutation (for example BAD_REQUEST when no files were changed), which turns expected client-handled cases into 500s. Re-throw TRPCError instances unchanged and only wrap unknown errors.

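The suggested catch-block pattern can be sketched as follows. `KnownApiError` is a self-contained stand-in for `TRPCError` (so the sketch runs without `@trpc/server`), and the wrapper is shown synchronously for brevity; in the PR the check would be `error instanceof TRPCError`:

```typescript
// Stand-in for TRPCError: carries a status-like code alongside the message.
class KnownApiError extends Error {
    constructor(public readonly code: string, message: string) {
        super(message);
    }
}

// Re-throw intentional errors unchanged; only wrap unknown failures.
const runPullRequestCreation = (doWork: () => void): void => {
    try {
        doWork();
    } catch (error) {
        if (error instanceof KnownApiError) {
            // Expected, client-handled cases (e.g. BAD_REQUEST for "no files
            // changed") keep their original code instead of becoming 500s.
            throw error;
        }
        throw new KnownApiError(
            'INTERNAL_SERVER_ERROR',
            error instanceof Error ? error.message : 'Failed to create pull request',
        );
    }
};
```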


coderabbitai Bot left a comment


Actionable comments posted: 8

🧹 Nitpick comments (6)
apps/web/client/src/app/projects/import/page.tsx (1)

12-43: Move user-facing strings to next-intl.

Titles, descriptions, and ariaLabel are hardcoded English. The repository convention is to source user-facing copy from next-intl messages/hooks rather than literals. Since you're refactoring this surface anyway, consider making the importCards entries use translation keys (e.g., t('import.local.title')) so the rebrand and future locales don't require touching component code.

As per coding guidelines: "Avoid hardcoded user-facing text; use next-intl messages/hooks".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/app/projects/import/page.tsx` around lines 12 - 43, The
importCards array contains hardcoded user-facing strings (title, description,
ariaLabel); replace these literals with next-intl translation lookups (e.g.,
t('import.local.title'), t('import.local.description'),
t('import.local.ariaLabel')) by importing and using useTranslations from
next-intl inside the component that renders importCards or by creating a factory
function that accepts t and returns the array; update the three card entries
(type 'local', 'github', 'figma') to use distinct translation keys and ensure
corresponding message keys are added to your locale files so icons
(Icons.Upload, Icons.GitHubLogo, Icons.Layout) and the ImportType typing remain
unchanged.
apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx (1)

38-45: Extension detection misses dotted display names.

sanitizedName.includes('.') treats any dot as an extension marker. A displayName like "screenshot.from.web" or "foo.bar" (no real extension) skips the MIME-derived extension entirely, producing a sandbox file with the wrong/missing extension. Consider matching only a trailing extension (e.g., /\.[A-Za-z0-9]{1,8}$/) before deciding whether to append.

♻️ Proposed refactor
 const getSanitizedFileName = (displayName: string, mimeType: string) => {
     const sanitizedName = sanitizeFilename(displayName || `chat-image-${Date.now()}`);
-    if (sanitizedName.includes('.')) {
+    if (/\.[A-Za-z0-9]{1,8}$/.test(sanitizedName)) {
         return sanitizedName;
     }
     const extension = getExtensionFromMimeType(mimeType);
     return `${sanitizedName}.${extension}`;
 };
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/client/src/app/project/`[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx
around lines 38 - 45, getSanitizedFileName currently treats any dot in
sanitizedName as an extension (using sanitizedName.includes('.')), causing names
like "screenshot.from.web" to skip adding a proper MIME-derived extension;
update the check to only consider a trailing extension by testing sanitizedName
against a regex that matches a final dot plus 1–8 alphanumeric chars (e.g.,
/\.[A-Za-z0-9]{1,8}$/) before returning sanitizedName, otherwise call
getExtensionFromMimeType(mimeType) and append the extension; keep using
sanitizeFilename(displayName || `chat-image-${Date.now()}`) to produce
sanitizedName.
apps/web/client/src/server/api/routers/workspace.ts (2)

121-127: Brittle MIME → extension mapping; duplicated on the client.

mimeType.includes('png') is order-sensitive and partial: e.g. image/jpeg falls to the default 'jpg' (correct by coincidence), image/avif silently becomes jpg, and image/svg+xml matches both 'svg' and 'svg+xml' cases inconsistently if reordered. The client at apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx (lines 22–36) already maintains a stricter exact-match map; consider lifting that into @onlook/utility and reusing it here for parity.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/workspace.ts` around lines 121 - 127,
The detectExtensionFromMimeType function is brittle due to partial includes and
duplicated logic on the client; replace the loose includes-check in
detectExtensionFromMimeType with a strict lookup map (exact mime-type →
extension) shared with the client by extracting that map into the shared utility
package (e.g., `@onlook/utility`) and importing it into workspace.ts and the
client component (input-context-pills.tsx); update detectExtensionFromMimeType
to do an exact-key lookup (with a sensible default like 'jpg') against the
shared map and remove any duplicated mapping logic from the client so both use
the single source of truth.

97-108: readJsonFile swallows real read/parse errors.

The bare catch on line 105 returns the fallback for any error — including a corrupted JSON file or an authorization/network error from the provider. Comments could silently appear "empty" while the underlying file still has data, and the next write would clobber the corrupted-but-recoverable contents with a fresh array. Consider distinguishing "file does not exist" from "read/parse failed" and surfacing the latter.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/workspace.ts` around lines 97 - 108,
The readJsonFile function currently swallows all errors and returns fallback;
change it so only "file not found" returns fallback while
read/parse/network/authorization errors are surfaced: call provider.readFile
within a try and if it throws detect a not-found condition (check error.code,
error.status, or error.message patterns from the provider) and return fallback
only in that case, otherwise rethrow the error; then separate the JSON.parse
into its own try/catch and if parse fails rethrow a descriptive parse error (do
not return fallback); update references to readJsonFile, provider.readFile, and
JSON.parse accordingly so callers can handle real failures instead of silently
clobbering data.
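The split the comment asks for might look like this sketch. The `FileProvider` interface and the ENOENT check are assumptions about the sandbox provider's error shape:

```typescript
interface FileProvider {
    readFile(path: string): Promise<string>;
}

// Heuristic not-found check; the real provider's error codes may differ.
const isNotFound = (error: unknown): boolean =>
    error instanceof Error && /ENOENT|not found/i.test(error.message);

const readJsonFile = async <T>(provider: FileProvider, path: string, fallback: T): Promise<T> => {
    let raw: string;
    try {
        raw = await provider.readFile(path);
    } catch (error) {
        if (isNotFound(error)) {
            return fallback; // Missing file: safe to start fresh.
        }
        throw error; // Auth/network failures must not look like an empty file.
    }
    try {
        return JSON.parse(raw) as T;
    } catch {
        // Corrupted JSON may be recoverable by a human; surface it instead of
        // returning the fallback and letting the next write clobber the file.
        throw new Error(`Failed to parse JSON at ${path}`);
    }
};
```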
apps/web/client/src/server/api/routers/github.ts (2)

134-141: Avoid as any cast on installation account.

(installation.data.account as any).name discards types. Octokit's account union for installations distinguishes User (has login) from Enterprise (has name). Discriminate on the type field instead of casting.

♻️ Proposed refactor
-                        login:
-                            'login' in installation.data.account
-                                ? installation.data.account.login
-                                : (installation.data.account as any).name || '',
+                        login:
+                            'login' in installation.data.account
+                                ? installation.data.account.login
+                                : 'name' in installation.data.account
+                                  ? installation.data.account.name ?? ''
+                                  : '',

As per coding guidelines: "Avoid using the any type unless absolutely necessary" (applies to {apps,packages}/**/*.{ts,tsx}).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/github.ts` around lines 134 - 141, The
code currently uses an unsafe cast "(installation.data.account as any).name" —
update the conditional to discriminate on installation.data.account.type (e.g.,
if installation.data.account.type === 'User' use
installation.data.account.login, else if type === 'Enterprise' use
installation.data.account.name) and remove the any cast; ensure you narrow the
union for installation.data.account before accessing .name so TypeScript
statically verifies the property access (update the object construction around
login/avatar_url/description accordingly).

419-487: Sequential N+1 GitHub API calls; non-atomic sync.

For each changed file the loop performs an extra getContent followed by createOrUpdateFileContents (or getContent + deleteFile). For a branch with many changed files this is slow, races against secondary rate limits, and — more importantly — is non-atomic: a mid-loop failure leaves the head branch in a half-applied state with no rollback.

Consider building a single tree+commit via the Git Data API (git.createTree with base_tree, git.createCommit, git.updateRef) so the whole change set lands as one commit and one network round-trip per stage.

Want me to sketch a Git Trees-based replacement for the per-file loop?

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/github.ts` around lines 419 - 487, The
current per-file loop (iterating changedFiles and calling provider.statFile,
getFileContentAsBase64, octokit.rest.repos.getContent,
createOrUpdateFileContents, and deleteFile) causes N+1 API calls and non-atomic
partial updates; replace it with a single Git Data API commit: fetch the base
commit/tree for headBranchName (e.g., via octokit.rest.git.getCommit or getRef +
getCommit), for each changedFile create blobs with octokit.rest.git.createBlob
using the base64 content returned by getFileContentAsBase64 (skip createBlob for
deletions), build a new tree with entries for updated/added files (path, mode
"100644", type "blob", blob sha) and omit entries for deleted files so they are
removed, call octokit.rest.git.createTree with base_tree set to the base tree
sha, then octokit.rest.git.createCommit with the new tree and parent commit sha,
and finally octokit.rest.git.updateRef to move headBranchName to the new commit;
remove the per-file
octokit.rest.repos.createOrUpdateFileContents/getContent/deleteFile calls and
the processedFiles incremental commit behavior.
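The tree-construction step of that suggestion can be sketched with a pure helper. The types are simplified stand-ins for Octokit's `git.createTree` parameters, and blob shas would come from `git.createBlob` in practice; note that with `base_tree` set, the Git Data API removes a path when its tree entry carries `sha: null` (deletions are not expressed by omitting the entry):

```typescript
type ChangedFile = { path: string; status: 'added' | 'modified' | 'deleted'; blobSha?: string };
type TreeEntry = { path: string; mode: '100644'; type: 'blob'; sha: string | null };

// Build one tree entry per changed file so the whole change set can land as
// a single createTree + createCommit + updateRef sequence.
const buildTreeEntries = (files: ChangedFile[]): TreeEntry[] =>
    files.map((file) => ({
        path: file.path,
        mode: '100644',
        type: 'blob',
        // null sha (with base_tree set) tells Git to delete the path.
        sha: file.status === 'deleted' ? null : file.blobSha ?? null,
    }));
```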
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In
`@apps/web/client/src/app/project/`[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx:
- Around line 78-122: Replace hardcoded toast strings in this component by using
next-intl translations: import and call useTranslations (e.g., const t =
useTranslations('ContextPills') at top of the component) and replace
toast.success('Image already exists in project assets') with
toast.success(t('image.alreadyExists')), replace toast.success(`Saved image to
${targetPath}`) with toast.success(t('image.saved', { path: targetPath })), and
replace toast.error('Failed to save image to project assets') with
toast.error(t('image.saveFailed')). Add the three keys (image.alreadyExists,
image.saved, image.saveFailed) to the appropriate locale message files and
update interpolation for path where needed; keep all other logic
(getSanitizedFileName, editorEngine.activeSandbox.writeFile,
editorEngine.chat.context mapping) unchanged.
- Around line 89-110: Capture the current branch id from
editorEngine.branches.activeBranch.id before any awaits and async file
operations (before the while loop/fetch/writeFile) and store it in a local const
(e.g., savedBranchId), then use that savedBranchId when building
updatedImageContext.branchId (instead of reading
editorEngine.branches.activeBranch.id after awaits); ensure the same
savedBranchId is used for any subsequent metadata related to
targetPath/imageContext to avoid branch mismatch.

In `@apps/web/client/src/app/projects/import/page.tsx`:
- Around line 71-78: The Card rendered with role="button" (component Card) and
onClick handler is not keyboard-activatable; update the component usage so
keyboard users can trigger handleCardClick(card.type): either replace the Card
wrapper with a semantic interactive element (<button> or <a>) so it natively
supports Enter/Space and focus styles, or add an onKeyDown handler that listens
for Enter and Space and calls handleCardClick(card.type) and preventDefault for
Space; also ensure tabIndex remains 0 and add a visible focus style
(focus-visible) so focused cards are discoverable.
- Around line 19-43: The three user-facing card descriptions still reference
"Pixelraft"; update the description strings in the objects with type 'local',
'github', and 'figma' to remove the "Pixelraft" branding (and "Pixelraft-ready")
and replace with the new non-branded phrasing (e.g., "the app", "your project",
or the new product name), leaving icons/ariaLabel values unchanged; locate the
objects by their type property in the cards array in page.tsx and edit only the
description text.

In `@apps/web/client/src/server/api/routers/github.ts`:
- Around line 27-44: The regex in parseGitHubRepoUrl rejects valid repo names
containing dots and may fold the optional .git into the slug; update both
patterns to allow any characters except slash for owner and repo and make the
.git suffix optional outside the repo capture. Replace the https and ssh
patterns with something like
/^https?:\/\/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/i and
/^git@github\.com:([^/]+)\/([^/]+?)(?:\.git)?$/i so owner uses ([^/]+) and repo
uses a non-greedy ([^/]+?) with (?:\.git)? after it, then return { owner:
match[1], repo: match[2] } as before and keep the TRPCError handling unchanged.
- Around line 389-545: The outer catch currently wraps every error (including
TRPCError instances thrown earlier like the BAD_REQUESTs) into a new
INTERNAL_SERVER_ERROR, losing original status codes; update the catch after the
main try so that if (error instanceof TRPCError) you re-throw the same error,
otherwise wrap into a new TRPCError as before (preserving message and cause).
Locate the catch that constructs new TRPCError (referencing TRPCError,
provider.destroy, and the surrounding code that calls octokit.rest.pulls.create)
and change it to first check instanceof TRPCError and throw error directly,
falling back to creating the INTERNAL_SERVER_ERROR only for non-TRPCError
errors.

In `@apps/web/client/src/server/api/routers/workspace.ts`:
- Around line 364-399: The saveImageToProjectAssets mutation currently accepts
imageDataUrl as z.string().min(1) which allows unbounded payloads and can cause
memory pressure when parseDataUrl decodes the full base64; update the input
schema for saveImageToProjectAssets (the z.object with imageDataUrl) to enforce
a sensible maximum (e.g., .max(...)) and additionally validate decoded size
after parseDataUrl by checking parsed.data.byteLength and throwing a controlled
error if it exceeds the allowed limit before creating the Uint8Array or calling
provider.writeFile; ensure the error is descriptive so callers know the payload
is too large.
- Around line 167-269: The three handlers (addComment, setCommentResolved,
deleteComment) perform read→mutate→write on COMMENTS_FILE and can lose updates
under concurrent requests; wrap the critical section around readJsonFile/parse,
mutation, and writeJsonFile in a per-branch mutex (keyed by input.branchId)
obtained right after getBranchProvider and released in the finally block so only
one mutation runs at a time for a branch; implement a lightweight in-memory lock
(or use an existing mutex utility) that awaits acquisition before reading,
performs the change (using commentsSchema.parse on the read result), writes
back, then releases the lock and still calls provider.destroy(). Optionally add
a small retry with backoff if lock acquisition fails to reduce latency under
contention.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 859f7a76-93bf-4b28-8c25-8508b707f34a

📥 Commits

Reviewing files that changed from the base of the PR and between a242be5 and 35e66df.

⛔ Files ignored due to path filters (1)
  • bun.lock is excluded by !**/*.lock
📒 Files selected for processing (12)
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/image-pill.tsx
  • apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx
  • apps/web/client/src/app/projects/import/github/_hooks/use-repo-import.ts
  • apps/web/client/src/app/projects/import/page.tsx
  • apps/web/client/src/server/api/root.ts
  • apps/web/client/src/server/api/routers/github.ts
  • apps/web/client/src/server/api/routers/index.ts
  • apps/web/client/src/server/api/routers/project/project.ts
  • apps/web/client/src/server/api/routers/workspace.ts
  • apps/web/client/src/utils/constants/index.ts
  • docs/content/docs/developers/meta.json
  • docs/content/docs/developers/unified-onlook-implementation-plan.mdx

Comment on lines +78 to +122
    toast.success('Image already exists in project assets');
    return;
}

try {
    const destinationFolder = `${DefaultSettings.IMAGE_FOLDER}/images`;
    const fileName = getSanitizedFileName(imageContext.displayName, imageContext.mimeType);

    let targetPath = `${destinationFolder}/${fileName}`;
    let counter = 1;

    while (await editorEngine.activeSandbox.fileExists(targetPath)) {
        const lastDotIndex = fileName.lastIndexOf('.');
        const baseName = lastDotIndex > 0 ? fileName.slice(0, lastDotIndex) : fileName;
        const extension = lastDotIndex > 0 ? fileName.slice(lastDotIndex) : '';
        targetPath = `${destinationFolder}/${baseName}-${counter}${extension}`;
        counter += 1;
    }

    const imageResponse = await fetch(imageContext.content);
    const imageBuffer = await imageResponse.arrayBuffer();
    const imageData = new Uint8Array(imageBuffer);

    await editorEngine.activeSandbox.writeFile(targetPath, imageData);

    const savedImageName = targetPath.split('/').pop() ?? fileName;
    const updatedImageContext: ImageMessageContext = {
        ...imageContext,
        source: 'local',
        path: targetPath,
        branchId: editorEngine.branches.activeBranch.id,
        displayName: savedImageName,
    };

    editorEngine.chat.context.context = editorEngine.chat.context.context.map((context) =>
        context === imageContext ? updatedImageContext : context,
    );

    toast.success(`Saved image to ${targetPath}`);
} catch (error) {
    console.error('Failed to save image to project assets:', error);
    toast.error('Failed to save image to project assets');
}

⚠️ Potential issue | 🟡 Minor

Hardcoded user-facing toast strings.

The toast messages on lines 78, 118 and 121 are hardcoded English strings shown to users. Route them through next-intl so they can be localized.

As per coding guidelines: "Avoid hardcoded user-facing text; use next-intl messages/hooks instead" (applies to apps/web/client/src/**/*.tsx).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx`
around lines 78 - 122: Replace hardcoded toast strings in this component by
using next-intl translations: import and call useTranslations (e.g., const t =
useTranslations('ContextPills') at top of the component) and replace
toast.success('Image already exists in project assets') with
toast.success(t('image.alreadyExists')), replace toast.success(`Saved image to
${targetPath}`) with toast.success(t('image.saved', { path: targetPath })), and
replace toast.error('Failed to save image to project assets') with
toast.error(t('image.saveFailed')). Add the three keys (image.alreadyExists,
image.saved, image.saveFailed) to the appropriate locale message files and
update interpolation for path where needed; keep all other logic
(getSanitizedFileName, editorEngine.activeSandbox.writeFile,
editorEngine.chat.context mapping) unchanged.

Comment on lines +89 to +110
while (await editorEngine.activeSandbox.fileExists(targetPath)) {
const lastDotIndex = fileName.lastIndexOf('.');
const baseName =
lastDotIndex > 0 ? fileName.slice(0, lastDotIndex) : fileName;
const extension =
lastDotIndex > 0 ? fileName.slice(lastDotIndex) : '';
targetPath = `${destinationFolder}/${baseName}-${counter}${extension}`;
counter += 1;
}

const imageResponse = await fetch(imageContext.content);
const imageBuffer = await imageResponse.arrayBuffer();
const imageData = new Uint8Array(imageBuffer);

await editorEngine.activeSandbox.writeFile(targetPath, imageData);

const savedImageName = targetPath.split('/').pop() ?? fileName;
const updatedImageContext: ImageMessageContext = {
...imageContext,
source: 'local',
path: targetPath,
branchId: editorEngine.branches.activeBranch.id,

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Confirm whether activeSandbox and branches.activeBranch are guaranteed to refer to the same branch.
ast-grep --pattern $'get activeSandbox() {
  $$$
}'
rg -nP --type=ts -C3 '\bactiveSandbox\b' -g '!**/node_modules/**'
rg -nP --type=ts -C3 '\bactiveBranch\b' -g '!**/node_modules/**'

Repository: onlook-dev/onlook

Length of output: 50373


🏁 Script executed:

rg -nP --type=ts 'activeBranchData\s*=' | head -30

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

rg -nP --type=ts 'setActiveBranch|switchBranch|activeBranch\s*=' -A 3 | head -50

Repository: onlook-dev/onlook

Length of output: 2001


🏁 Script executed:

ast-grep --pattern 'switchBranch($$$) { $$$ }' | head -100

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

rg -nP --type=ts 'activeBranchData' -B 2 -A 2 | head -80

Repository: onlook-dev/onlook

Length of output: 3604


🏁 Script executed:

rg -nP --type=ts 'private.*activeBranchData|this\.activeBranchData\s*=' -A 2 | head -60

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

rg -nP --type=ts 'currentBranchId\s*=' -B 3 -A 3 | head -100

Repository: onlook-dev/onlook

Length of output: 4394


🏁 Script executed:

# Search for branch switching logic
fd -e ts -x grep -l "switchBranch\|selectBranch\|setActiveBranch" {} \; | head -10

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

# Get the full activeBranchData getter implementation
sed -n '80,88p' apps/web/client/src/components/store/editor/branch/manager.ts

Repository: onlook-dev/onlook

Length of output: 523


🏁 Script executed:

# Check if there are any async operations during branch switch that could cause divergence
rg -nP --type=ts 'switchToBranch|async.*branch' -A 10 apps/web/client/src/components/store/editor/branch/manager.ts | head -80

Repository: onlook-dev/onlook

Length of output: 2885


🏁 Script executed:

# Check if there's any scenario where currentBranchId could change between activeSandbox and activeBranch calls
rg -nP --type=ts 'branchMap' apps/web/client/src/components/store/editor/branch/manager.ts | head -30

Repository: onlook-dev/onlook

Length of output: 1336


🏁 Script executed:

# Look at the full context of the file being reviewed to understand the sequence
sed -n '75,115p' apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx

Repository: onlook-dev/onlook

Length of output: 1915


🏁 Script executed:

# Verify if there are any safeguards preventing branch switches during file operations
rg -nP --type=ts 'fileExists|writeFile' -B 5 -A 5 apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx | head -60

Repository: onlook-dev/onlook

Length of output: 1183


🏁 Script executed:

# Check if there's any locking mechanism or state guard in BranchManager
rg -nP --type=ts 'isSwitching|isLoading|locked' apps/web/client/src/components/store/editor/branch/manager.ts

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

# Search for patterns that guard against branch switches during operations
rg -nP --type=ts 'try.*finally|abort|cancel' apps/web/client/src/components/store/editor/ | grep -i branch | head -20

Repository: onlook-dev/onlook

Length of output: 43


🏁 Script executed:

# Look for similar patterns in the codebase where branchId is captured during async ops
rg -nP --type=tsx 'activeBranch\.id.*await|await.*activeBranch\.id' -B 3 -A 3 | head -60

Repository: onlook-dev/onlook

Length of output: 89


🏁 Script executed:

# Search for how other code captures branchId before async operations
rg -nP --type=ts 'const.*branchId.*=.*branches\.activeBranch\.id' -A 5 | head -60

Repository: onlook-dev/onlook

Length of output: 785


🏁 Script executed:

# Look at the specific pattern: is branchId captured before or after async operations?
sed -n '75,115p' apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx

Repository: onlook-dev/onlook

Length of output: 1915


🏁 Script executed:

# Check how the image-tab pattern handles this - does it capture branchId early?
sed -n '20,80p' apps/web/client/src/app/project/[id]/_components/left-panel/design-panel/image-tab/index.tsx

Repository: onlook-dev/onlook

Length of output: 1996


🏁 Script executed:

# Search for the pattern: capture branchId early before async operations
rg -nP --type=ts 'const branchId.*activeBranch.*id' -A 15 | head -80

Repository: onlook-dev/onlook

Length of output: 2055


🏁 Script executed:

# Confirm the exact issue: capture branchId after async ops vs before
# Show the full context around line 110 in the reviewed file
sed -n '103,112p' apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx

Repository: onlook-dev/onlook

Length of output: 505


🏁 Script executed:

# Check if there are other similar patterns in the codebase that capture branchId late
rg -nP --type=ts 'await.*sandbox.*\n.*branchId.*activeBranch' | head -20

Repository: onlook-dev/onlook

Length of output: 43


Capture branchId before async operations to prevent mismatch.

The code writes files through editorEngine.activeSandbox, then captures branchId after multiple awaits (fileExists, fetch, writeFile). If the user switches branches during this sequence, the saved path will exist in one branch's sandbox but be tagged with a different branch's ID. Capture branchId early before file operations, similar to the pattern in image-tab/index.tsx line 22.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@apps/web/client/src/app/project/[id]/_components/right-panel/chat-tab/context-pills/input-context-pills.tsx`
around lines 89 - 110: Capture the current branch id from
editorEngine.branches.activeBranch.id before any awaits and async file
operations (before the while loop/fetch/writeFile) and store it in a local const
(e.g., savedBranchId), then use that savedBranchId when building
updatedImageContext.branchId (instead of reading
editorEngine.branches.activeBranch.id after awaits); ensure the same
savedBranchId is used for any subsequent metadata related to
targetPath/imageContext to avoid branch mismatch.
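The "capture before await" pattern the review recommends can be shown with a minimal stand-in engine (the shape here is hypothetical; the point is that the branch id is read once, synchronously, before any await lets the active branch change underneath):

```typescript
// branchId is captured before the awaits, so the returned metadata stays
// consistent with the sandbox the file was actually written into, even if
// the user switches branches mid-operation.
type Engine = {
    branches: { activeBranch: { id: string } };
    writeFile: (path: string, data: string) => Promise<void>;
};

async function saveTagged(engine: Engine, path: string, data: string) {
    const branchId = engine.branches.activeBranch.id; // captured before awaits
    await engine.writeFile(path, data); // branch may switch during this await
    return { path, branchId }; // still tagged with the branch we wrote into
}
```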

Comment on lines +19 to +43
{
type: 'local',
title: 'Import a Local Project',
description:
'Select a directory from your computer to start working with your project in Pixelraft.',
icon: <Icons.Upload className="w-6 h-6 text-primary" />,
ariaLabel: 'Import local project',
},
{
type: 'github',
title: 'Import from GitHub',
description:
'Connect your GitHub account to import repositories and create pull requests from Pixelraft.',
icon: <Icons.GitHubLogo className="w-6 h-6 text-primary" />,
ariaLabel: 'Import from GitHub',
},
{
type: 'figma',
title: 'Import from Figma',
description:
'Paste a Figma file URL to generate a Pixelraft-ready starter project and synced design metadata.',
icon: <Icons.Layout className="w-6 h-6 text-primary" />,
ariaLabel: 'Import from Figma',
},
];

⚠️ Potential issue | 🟠 Major

User-visible copy still says "Pixelraft" — contradicts the rebrand goal added in this same PR.

This same PR adds unified-onlook-implementation-plan.mdx whose M1 exit criteria require "no user-visible Pixelraft branding on active surfaces" and whose final acceptance criteria states "Pixelraft naming removed from user-visible surfaces." The three card descriptions on lines 23, 31, and 39 still reference "Pixelraft" / "Pixelraft-ready", which directly violates that intent on the Import page (a primary user-facing surface).

✏️ Proposed copy fix
-        description:
-            'Select a directory from your computer to start working with your project in Pixelraft.',
+        description:
+            'Select a directory from your computer to start working with your project in Onlook.',
@@
-        description:
-            'Connect your GitHub account to import repositories and create pull requests from Pixelraft.',
+        description:
+            'Connect your GitHub account to import repositories and create pull requests from Onlook.',
@@
-        description:
-            'Paste a Figma file URL to generate a Pixelraft-ready starter project and synced design metadata.',
+        description:
+            'Paste a Figma file URL to generate an Onlook-ready starter project and synced design metadata.',
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/app/projects/import/page.tsx` around lines 19 - 43: The
three user-facing card descriptions still reference "Pixelraft"; update the
description strings in the objects with type 'local', 'github', and 'figma' to
remove the "Pixelraft" branding (and "Pixelraft-ready") and replace with the new
non-branded phrasing (e.g., "the app", "your project", or the new product name),
leaving icons/ariaLabel values unchanged; locate the objects by their type
property in the cards array in page.tsx and edit only the description text.

Comment on lines +71 to +78
<Card
key={card.type}
className="w-full h-64 cursor-pointer transition-all duration-200 bg-background/80 backdrop-blur-xl hover:shadow-lg hover:scale-[1.02] border-[0.5px] border-foreground-tertiary/50"
onClick={() => handleCardClick(card.type)}
tabIndex={0}
role="button"
aria-label={card.ariaLabel}
>

⚠️ Potential issue | 🟠 Major

Card exposed as role="button" is not keyboard-activatable.

The <Card> renders as a div with onClick, tabIndex={0}, and role="button", but there is no onKeyDown/onKeyUp handler. Keyboard users who focus the card cannot trigger navigation with Enter or Space, which breaks the implied button contract and the keyboard-nav requirements outlined in the same PR's plan (DQA-04/CMP-05). Either render an actual <button> (or <a> for navigation) or attach a key handler.

♿ Proposed accessibility fix
-                    <Card
-                        key={card.type}
-                        className="w-full h-64 cursor-pointer transition-all duration-200 bg-background/80 backdrop-blur-xl hover:shadow-lg hover:scale-[1.02] border-[0.5px] border-foreground-tertiary/50"
-                        onClick={() => handleCardClick(card.type)}
-                        tabIndex={0}
-                        role="button"
-                        aria-label={card.ariaLabel}
-                    >
+                    <Card
+                        key={card.type}
+                        className="w-full h-64 cursor-pointer transition-all duration-200 bg-background/80 backdrop-blur-xl hover:shadow-lg hover:scale-[1.02] border-[0.5px] border-foreground-tertiary/50 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-primary"
+                        onClick={() => handleCardClick(card.type)}
+                        onKeyDown={(e) => {
+                            if (e.key === 'Enter' || e.key === ' ') {
+                                e.preventDefault();
+                                handleCardClick(card.type);
+                            }
+                        }}
+                        tabIndex={0}
+                        role="button"
+                        aria-label={card.ariaLabel}
+                    >

A focus-visible style is also worth adding so the focused card is discoverable. Based on learnings, this surface should ideally use a real <button> element where possible.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
<Card
key={card.type}
className="w-full h-64 cursor-pointer transition-all duration-200 bg-background/80 backdrop-blur-xl hover:shadow-lg hover:scale-[1.02] border-[0.5px] border-foreground-tertiary/50"
onClick={() => handleCardClick(card.type)}
tabIndex={0}
role="button"
aria-label={card.ariaLabel}
>
<Card
key={card.type}
className="w-full h-64 cursor-pointer transition-all duration-200 bg-background/80 backdrop-blur-xl hover:shadow-lg hover:scale-[1.02] border-[0.5px] border-foreground-tertiary/50 focus-visible:outline-none focus-visible:ring-2 focus-visible:ring-primary"
onClick={() => handleCardClick(card.type)}
onKeyDown={(e) => {
if (e.key === 'Enter' || e.key === ' ') {
e.preventDefault();
handleCardClick(card.type);
}
}}
tabIndex={0}
role="button"
aria-label={card.ariaLabel}
>
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/app/projects/import/page.tsx` around lines 71 - 78: The
Card rendered with role="button" (component Card) and onClick handler is not
keyboard-activatable; update the component usage so keyboard users can trigger
handleCardClick(card.type): either replace the Card wrapper with a semantic
interactive element (<button> or <a>) so it natively supports Enter/Space and
focus styles, or add an onKeyDown handler that listens for Enter and Space and
calls handleCardClick(card.type) and preventDefault for Space; also ensure
tabIndex remains 0 and add a visible focus style (focus-visible) so focused
cards are discoverable.
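The Enter/Space activation contract behind that fix can be isolated into a plain function, testable without rendering (the simplified event shape is an assumption for illustration):

```typescript
// role="button" elements must activate on both Enter and Space; Space also
// needs preventDefault so it does not scroll the page.
function isActivationKey(key: string): boolean {
    return key === 'Enter' || key === ' ';
}

function handleActivation(
    key: string,
    activate: () => void,
    preventDefault: () => void,
): void {
    if (isActivationKey(key)) {
        preventDefault();
        activate();
    }
}
```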

Comment on lines +27 to +44
const parseGitHubRepoUrl = (repoUrl: string): { owner: string; repo: string } => {
const normalizedUrl = repoUrl.trim();

const httpsMatch = normalizedUrl.match(/^https?:\/\/github\.com\/([^/]+)\/([^/.]+)(\.git)?$/i);
if (httpsMatch?.[1] && httpsMatch[2]) {
return { owner: httpsMatch[1], repo: httpsMatch[2] };
}

const sshMatch = normalizedUrl.match(/^git@github\.com:([^/]+)\/([^/.]+)(\.git)?$/i);
if (sshMatch?.[1] && sshMatch[2]) {
return { owner: sshMatch[1], repo: sshMatch[2] };
}

throw new TRPCError({
code: 'BAD_REQUEST',
message: 'Invalid GitHub repository URL',
});
};

⚠️ Potential issue | 🟠 Major

Regex rejects valid GitHub repo names with dots.

GitHub allows . in repository names (e.g., actions/setup-node, foo/bar.js, vercel/next.js). The pattern [^/.]+ for the repo segment fails for any such repo, throwing BAD_REQUEST even though the URL is legitimate. The owner pattern is also slightly off — owners may not contain /, but the trailing .git should be stripped from the repo, not folded into the slug match.

🛠️ Proposed fix
-    const httpsMatch = normalizedUrl.match(/^https?:\/\/github\.com\/([^/]+)\/([^/.]+)(\.git)?$/i);
-    if (httpsMatch?.[1] && httpsMatch[2]) {
-        return { owner: httpsMatch[1], repo: httpsMatch[2] };
-    }
-
-    const sshMatch = normalizedUrl.match(/^git@github\.com:([^/]+)\/([^/.]+)(\.git)?$/i);
-    if (sshMatch?.[1] && sshMatch[2]) {
-        return { owner: sshMatch[1], repo: sshMatch[2] };
-    }
+    const stripGit = (s: string) => s.replace(/\.git$/i, '');
+    const httpsMatch = normalizedUrl.match(/^https?:\/\/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/i);
+    if (httpsMatch?.[1] && httpsMatch[2]) {
+        return { owner: httpsMatch[1], repo: stripGit(httpsMatch[2]) };
+    }
+
+    const sshMatch = normalizedUrl.match(/^git@github\.com:([^/]+)\/([^/]+?)(?:\.git)?\/?$/i);
+    if (sshMatch?.[1] && sshMatch[2]) {
+        return { owner: sshMatch[1], repo: stripGit(sshMatch[2]) };
+    }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/github.ts` around lines 27 - 44: The
regex in parseGitHubRepoUrl rejects valid repo names containing dots and may
fold the optional .git into the slug; update both patterns to allow any
characters except slash for owner and repo and make the .git suffix optional
outside the repo capture. Replace the https and ssh patterns with something like
/^https?:\/\/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/i and
/^git@github\.com:([^/]+)\/([^/]+?)(?:\.git)?$/i so owner uses ([^/]+) and repo
uses a non-greedy ([^/]+?) with (?:\.git)? after it, then return { owner:
match[1], repo: match[2] } as before and keep the TRPCError handling unchanged.
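A standalone version of the corrected parser, returning null instead of throwing `TRPCError` so it runs outside tRPC, shows the dotted-repo and `.git` handling:

```typescript
// Accepts dots in repo names (e.g. vercel/next.js) and strips a trailing
// .git, for both HTTPS and SSH GitHub remote URLs.
function parseRepoUrl(repoUrl: string): { owner: string; repo: string } | null {
    const normalized = repoUrl.trim();
    const httpsMatch = normalized.match(
        /^https?:\/\/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?\/?$/i,
    );
    if (httpsMatch?.[1] && httpsMatch[2]) {
        return { owner: httpsMatch[1], repo: httpsMatch[2] };
    }
    const sshMatch = normalized.match(/^git@github\.com:([^/]+)\/([^/]+?)(?:\.git)?$/i);
    if (sshMatch?.[1] && sshMatch[2]) {
        return { owner: sshMatch[1], repo: sshMatch[2] };
    }
    return null;
}
```

The non-greedy `([^/]+?)` followed by `(?:\.git)?` is what lets `next.js.git` resolve to repo `next.js` instead of swallowing the suffix.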

Comment on lines +389 to +545
try {
const status = await provider.gitStatus({});
const changedFiles = status.changedFiles.filter((file) => !!file.trim());

if (changedFiles.length === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No changed files found in this branch',
});
}

const baseRef = await octokit.rest.git.getRef({
owner,
repo,
ref: `heads/${baseBranch}`,
});

try {
await octokit.rest.git.createRef({
owner,
repo,
ref: `refs/heads/${headBranchName}`,
sha: baseRef.data.object.sha,
});
} catch (error) {
if (!(error instanceof Error && error.message.includes('Reference already exists'))) {
throw error;
}
}

let processedFiles = 0;

for (const changedFile of changedFiles) {
const sandboxPath = normalizeSandboxPath(changedFile);
const repoPath = normalizeRepoPath(changedFile);

if (!repoPath) {
continue;
}

let existsInSandbox = true;
try {
await provider.statFile({ args: { path: sandboxPath } });
} catch {
existsInSandbox = false;
}

if (existsInSandbox) {
const content = await getFileContentAsBase64(provider, sandboxPath);

let existingSha: string | undefined;
try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});
if (!Array.isArray(existing.data)) {
existingSha = existing.data.sha;
}
} catch {
}

await octokit.rest.repos.createOrUpdateFileContents({
owner,
repo,
path: repoPath,
message: `pixelraft: update ${repoPath}`,
content,
branch: headBranchName,
...(existingSha ? { sha: existingSha } : {}),
});
processedFiles += 1;
continue;
}

try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});

if (!Array.isArray(existing.data)) {
await octokit.rest.repos.deleteFile({
owner,
repo,
path: repoPath,
message: `pixelraft: delete ${repoPath}`,
sha: existing.data.sha,
branch: headBranchName,
});
processedFiles += 1;
}
} catch {
}
}

if (processedFiles === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No applicable file changes were synced to GitHub',
});
}

try {
const pullRequest = await octokit.rest.pulls.create({
owner,
repo,
title: input.title,
head: headBranchName,
base: baseBranch,
body: input.body,
});

return {
pullRequestNumber: pullRequest.data.number,
pullRequestUrl: pullRequest.data.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
} catch (error) {
const existingPullRequests = await octokit.rest.pulls.list({
owner,
repo,
head: `${owner}:${headBranchName}`,
base: baseBranch,
state: 'open',
});

const existingPullRequest = existingPullRequests.data[0];
if (existingPullRequest) {
return {
pullRequestNumber: existingPullRequest.number,
pullRequestUrl: existingPullRequest.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
}

throw error;
}
} catch (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: error instanceof Error ? error.message : 'Failed to create pull request',
cause: error,
});
} finally {
await provider.destroy().catch(() => {
});
}
}),

⚠️ Potential issue | 🟠 Major

Outer catch flattens specific TRPC error codes to INTERNAL_SERVER_ERROR.

The try block deliberately throws BAD_REQUEST (lines 393-398, 489-494) for legitimate user errors (no changed files, no applicable changes). The outer catch on line 535 rewraps every error — including these TRPCErrors — as INTERNAL_SERVER_ERROR, so clients lose the original status code (and may surface a misleading "internal error" to users for what is actually a 4xx). Re-throw existing TRPCErrors as-is.

🛠️ Proposed fix
             } catch (error) {
+                if (error instanceof TRPCError) {
+                    throw error;
+                }
                 throw new TRPCError({
                     code: 'INTERNAL_SERVER_ERROR',
                     message: error instanceof Error ? error.message : 'Failed to create pull request',
                     cause: error,
                 });
             } finally {
Confirmed via GitHub community: "Dashes and periods are the only punctuation marks allowed in repo names" — so the regex must accept dots in repo names.
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
try {
const status = await provider.gitStatus({});
const changedFiles = status.changedFiles.filter((file) => !!file.trim());
if (changedFiles.length === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No changed files found in this branch',
});
}
const baseRef = await octokit.rest.git.getRef({
owner,
repo,
ref: `heads/${baseBranch}`,
});
try {
await octokit.rest.git.createRef({
owner,
repo,
ref: `refs/heads/${headBranchName}`,
sha: baseRef.data.object.sha,
});
} catch (error) {
if (!(error instanceof Error && error.message.includes('Reference already exists'))) {
throw error;
}
}
let processedFiles = 0;
for (const changedFile of changedFiles) {
const sandboxPath = normalizeSandboxPath(changedFile);
const repoPath = normalizeRepoPath(changedFile);
if (!repoPath) {
continue;
}
let existsInSandbox = true;
try {
await provider.statFile({ args: { path: sandboxPath } });
} catch {
existsInSandbox = false;
}
if (existsInSandbox) {
const content = await getFileContentAsBase64(provider, sandboxPath);
let existingSha: string | undefined;
try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});
if (!Array.isArray(existing.data)) {
existingSha = existing.data.sha;
}
} catch {
}
await octokit.rest.repos.createOrUpdateFileContents({
owner,
repo,
path: repoPath,
message: `pixelraft: update ${repoPath}`,
content,
branch: headBranchName,
...(existingSha ? { sha: existingSha } : {}),
});
processedFiles += 1;
continue;
}
try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});
if (!Array.isArray(existing.data)) {
await octokit.rest.repos.deleteFile({
owner,
repo,
path: repoPath,
message: `pixelraft: delete ${repoPath}`,
sha: existing.data.sha,
branch: headBranchName,
});
processedFiles += 1;
}
} catch {
}
}
if (processedFiles === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No applicable file changes were synced to GitHub',
});
}
try {
const pullRequest = await octokit.rest.pulls.create({
owner,
repo,
title: input.title,
head: headBranchName,
base: baseBranch,
body: input.body,
});
return {
pullRequestNumber: pullRequest.data.number,
pullRequestUrl: pullRequest.data.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
} catch (error) {
const existingPullRequests = await octokit.rest.pulls.list({
owner,
repo,
head: `${owner}:${headBranchName}`,
base: baseBranch,
state: 'open',
});
const existingPullRequest = existingPullRequests.data[0];
if (existingPullRequest) {
return {
pullRequestNumber: existingPullRequest.number,
pullRequestUrl: existingPullRequest.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
}
throw error;
}
} catch (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: error instanceof Error ? error.message : 'Failed to create pull request',
cause: error,
});
} finally {
await provider.destroy().catch(() => {
});
}
}),
try {
const status = await provider.gitStatus({});
const changedFiles = status.changedFiles.filter((file) => !!file.trim());
if (changedFiles.length === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No changed files found in this branch',
});
}
const baseRef = await octokit.rest.git.getRef({
owner,
repo,
ref: `heads/${baseBranch}`,
});
try {
await octokit.rest.git.createRef({
owner,
repo,
ref: `refs/heads/${headBranchName}`,
sha: baseRef.data.object.sha,
});
} catch (error) {
if (!(error instanceof Error && error.message.includes('Reference already exists'))) {
throw error;
}
}
let processedFiles = 0;
for (const changedFile of changedFiles) {
const sandboxPath = normalizeSandboxPath(changedFile);
const repoPath = normalizeRepoPath(changedFile);
if (!repoPath) {
continue;
}
let existsInSandbox = true;
try {
await provider.statFile({ args: { path: sandboxPath } });
} catch {
existsInSandbox = false;
}
if (existsInSandbox) {
const content = await getFileContentAsBase64(provider, sandboxPath);
let existingSha: string | undefined;
try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});
if (!Array.isArray(existing.data)) {
existingSha = existing.data.sha;
}
} catch {
}
await octokit.rest.repos.createOrUpdateFileContents({
owner,
repo,
path: repoPath,
message: `pixelraft: update ${repoPath}`,
content,
branch: headBranchName,
...(existingSha ? { sha: existingSha } : {}),
});
processedFiles += 1;
continue;
}
try {
const existing = await octokit.rest.repos.getContent({
owner,
repo,
path: repoPath,
ref: headBranchName,
});
if (!Array.isArray(existing.data)) {
await octokit.rest.repos.deleteFile({
owner,
repo,
path: repoPath,
message: `pixelraft: delete ${repoPath}`,
sha: existing.data.sha,
branch: headBranchName,
});
processedFiles += 1;
}
} catch {
}
}
if (processedFiles === 0) {
throw new TRPCError({
code: 'BAD_REQUEST',
message: 'No applicable file changes were synced to GitHub',
});
}
try {
const pullRequest = await octokit.rest.pulls.create({
owner,
repo,
title: input.title,
head: headBranchName,
base: baseBranch,
body: input.body,
});
return {
pullRequestNumber: pullRequest.data.number,
pullRequestUrl: pullRequest.data.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
} catch (error) {
const existingPullRequests = await octokit.rest.pulls.list({
owner,
repo,
head: `${owner}:${headBranchName}`,
base: baseBranch,
state: 'open',
});
const existingPullRequest = existingPullRequests.data[0];
if (existingPullRequest) {
return {
pullRequestNumber: existingPullRequest.number,
pullRequestUrl: existingPullRequest.html_url,
headBranchName,
baseBranch,
changedFiles: processedFiles,
};
}
throw error;
}
} catch (error) {
if (error instanceof TRPCError) {
throw error;
}
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: error instanceof Error ? error.message : 'Failed to create pull request',
cause: error,
});
} finally {
await provider.destroy().catch(() => {
});
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/github.ts` around lines 389 - 545: The
outer catch currently wraps every error (including TRPCError instances thrown
earlier like the BAD_REQUESTs) into a new INTERNAL_SERVER_ERROR, losing original
status codes; update the catch after the main try so that if (error instanceof
TRPCError) you re-throw the same error, otherwise wrap into a new TRPCError as
before (preserving message and cause). Locate the catch that constructs new
TRPCError (referencing TRPCError, provider.destroy, and the surrounding code
that calls octokit.rest.pulls.create) and change it to first check instanceof
TRPCError and throw error directly, falling back to creating the
INTERNAL_SERVER_ERROR only for non-TRPCError errors.
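The general shape of that fix is to re-throw domain errors unchanged and wrap only unknown failures. In this sketch `AppError` stands in for `TRPCError` so it runs without `@trpc/server`:

```typescript
// Preserve 4xx-style codes such as BAD_REQUEST; wrap everything else as an
// internal error. Rewrapping indiscriminately would flatten every error to
// INTERNAL_SERVER_ERROR and hide the original status from clients.
class AppError extends Error {
    constructor(public code: string, message: string) {
        super(message);
    }
}

async function withWrappedErrors<T>(fn: () => Promise<T>): Promise<T> {
    try {
        return await fn();
    } catch (error) {
        if (error instanceof AppError) {
            throw error; // keep the original code intact
        }
        throw new AppError(
            'INTERNAL_SERVER_ERROR',
            error instanceof Error ? error.message : 'Unknown failure',
        );
    }
}
```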

Comment on lines +167 to +269
    addComment: protectedProcedure
        .input(
            z.object({
                branchId: z.string().uuid(),
                content: z.string().min(1).max(5000),
                targetFrameId: z.string().optional(),
            }),
        )
        .mutation(async ({ ctx, input }) => {
            const { provider } = await getBranchProvider({
                branchId: input.branchId,
                userId: ctx.user.id,
                db: ctx.db,
            });

            try {
                const comments = commentsSchema.parse(
                    await readJsonFile<CollaborationComment[]>(provider, COMMENTS_FILE, []),
                );
                const now = new Date().toISOString();
                const newComment: CollaborationComment = {
                    id: uuidv4(),
                    content: input.content.trim(),
                    createdBy: ctx.user.id,
                    targetFrameId: input.targetFrameId ?? null,
                    resolved: false,
                    createdAt: now,
                    updatedAt: now,
                };

                comments.push(newComment);
                await writeJsonFile(provider, COMMENTS_FILE, comments);
                return newComment;
            } finally {
                await provider.destroy().catch(() => {});
            }
        }),

    setCommentResolved: protectedProcedure
        .input(
            z.object({
                branchId: z.string().uuid(),
                commentId: z.string(),
                resolved: z.boolean(),
            }),
        )
        .mutation(async ({ ctx, input }) => {
            const { provider } = await getBranchProvider({
                branchId: input.branchId,
                userId: ctx.user.id,
                db: ctx.db,
            });

            try {
                const comments = commentsSchema.parse(
                    await readJsonFile<CollaborationComment[]>(provider, COMMENTS_FILE, []),
                );
                const comment = comments.find((item) => item.id === input.commentId);

                if (!comment) {
                    throw new TRPCError({
                        code: 'NOT_FOUND',
                        message: 'Comment not found',
                    });
                }

                comment.resolved = input.resolved;
                comment.updatedAt = new Date().toISOString();
                await writeJsonFile(provider, COMMENTS_FILE, comments);
                return comment;
            } finally {
                await provider.destroy().catch(() => {});
            }
        }),

    deleteComment: protectedProcedure
        .input(
            z.object({
                branchId: z.string().uuid(),
                commentId: z.string(),
            }),
        )
        .mutation(async ({ ctx, input }) => {
            const { provider } = await getBranchProvider({
                branchId: input.branchId,
                userId: ctx.user.id,
                db: ctx.db,
            });

            try {
                const comments = commentsSchema.parse(
                    await readJsonFile<CollaborationComment[]>(provider, COMMENTS_FILE, []),
                );
                const updatedComments = comments.filter((item) => item.id !== input.commentId);
                await writeJsonFile(provider, COMMENTS_FILE, updatedComments);
                return true;
            } finally {
                await provider.destroy().catch(() => {});
            }
        }),

⚠️ Potential issue | 🟡 Minor

Lost-update race on collaboration-comments.json.

addComment, setCommentResolved, and deleteComment all read → mutate → write the same JSON file with no locking, version check, or compare-and-swap. Two concurrent mutations (multiple browser tabs, simultaneous resolve+add, etc.) will silently drop one writer's change because the second writer's read snapshot doesn't include the first writer's mutation.

Mitigations to consider:

  • Append-only event log (writes are merges, not full rewrites).
  • Version/etag field on the file with retry-on-conflict.
  • Per-branch in-process mutex around the read-modify-write window for short-term safety.
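
The mutex option can be sketched as a small in-process lock keyed by branch id. `acquireBranchLock` and `withBranchLock` are illustrative names, not helpers that exist in the Onlook codebase, and an in-memory map only protects writers within a single server process:

```typescript
// Sketch of a per-branch in-process mutex (assumed helper, not part of
// the repository). Each acquirer chains onto the previous holder's
// promise, so read-modify-write sections for the same branchId serialize.
type Release = () => void;

const branchLocks = new Map<string, Promise<void>>();

async function acquireBranchLock(branchId: string): Promise<Release> {
    const previous = branchLocks.get(branchId) ?? Promise.resolve();
    let release!: Release;
    const current = new Promise<void>((resolve) => {
        release = resolve;
    });
    // The next acquirer must wait for both the previous holder and us.
    // Note: entries are never evicted here; a production version would
    // clean up the map once the chain drains.
    branchLocks.set(branchId, previous.then(() => current));
    await previous;
    return release;
}

async function withBranchLock<T>(branchId: string, fn: () => Promise<T>): Promise<T> {
    const release = await acquireBranchLock(branchId);
    try {
        return await fn();
    } finally {
        release();
    }
}
```

Each mutation handler would wrap its read-modify-write section in `withBranchLock(input.branchId, ...)` right after `getBranchProvider`, keeping `provider.destroy()` in its own `finally` as before.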
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/workspace.ts` around lines 167 - 269,
The three handlers (addComment, setCommentResolved, deleteComment) perform
read→mutate→write on COMMENTS_FILE and can lose updates under concurrent
requests; wrap the critical section around readJsonFile/parse, mutation, and
writeJsonFile in a per-branch mutex (keyed by input.branchId) obtained right
after getBranchProvider and released in the finally block so only one mutation
runs at a time for a branch; implement a lightweight in-memory lock (or use an
existing mutex utility) that awaits acquisition before reading, performs the
change (using commentsSchema.parse on the read result), writes back, then
releases the lock and still calls provider.destroy(). Optionally add a small
retry with backoff if lock acquisition fails to reduce latency under contention.

Comment on lines +364 to +399
    saveImageToProjectAssets: protectedProcedure
        .input(
            z.object({
                branchId: z.string().uuid(),
                imageDataUrl: z.string().min(1),
                displayName: z.string().min(1).max(250),
                mimeType: z.string().min(1),
            }),
        )
        .mutation(async ({ ctx, input }) => {
            const { provider } = await getBranchProvider({
                branchId: input.branchId,
                userId: ctx.user.id,
                db: ctx.db,
            });

            try {
                const parsed = parseDataUrl(input.imageDataUrl);
                const extension = detectExtensionFromMimeType(input.mimeType || parsed.mimeType);
                const baseName = sanitizeFilename(input.displayName.replace(/\.[^.]+$/, '')) || 'image';
                const fileName = `${baseName}-${Date.now()}.${extension}`;
                const assetDirectory = '/public/pixelraft-assets';
                const assetPath = `${assetDirectory}/${fileName}`;

                try {
                    await provider.createDirectory({ args: { path: assetDirectory } });
                } catch {
                    // Directory may already exist; writeFile below surfaces real failures.
                }

                await provider.writeFile({
                    args: {
                        path: assetPath,
                        content: new Uint8Array(parsed.data),
                        overwrite: true,
                    },
                });

⚠️ Potential issue | 🟡 Minor

Bound imageDataUrl size to avoid memory pressure.

imageDataUrl: z.string().min(1) accepts arbitrarily large payloads. A malicious or buggy client can post a multi-hundred-MB data URL, which then gets fully buffered (Buffer.from(match[2], 'base64')) before any size check. Add a .max(...) on the input (and/or check parsed.data.byteLength after decode) sized to your largest expected asset.

🛡️ Proposed fix
-                imageDataUrl: z.string().min(1),
+                imageDataUrl: z.string().min(1).max(10 * 1024 * 1024), // ~7.5MB binary
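
The decoded size can also be checked before a large buffer is handed to the provider. A minimal sketch, assuming a `parseDataUrl` shaped like the one referenced in the comment (the regex and the 10 MB limit are illustrative, not the repository's actual helper):

```typescript
// Sketch only: MAX_IMAGE_BYTES and this parseDataUrl are assumptions,
// not the helper defined in workspace.ts.
const MAX_IMAGE_BYTES = 10 * 1024 * 1024; // 10 MB of decoded image data

function parseDataUrl(dataUrl: string): { mimeType: string; data: Buffer } {
    const match = /^data:([^;,]+);base64,(.+)$/.exec(dataUrl);
    if (!match) {
        throw new Error('Invalid data URL');
    }
    // Base64 inflates binary by ~4/3, so a cheap length check rejects
    // oversized payloads before decoding allocates the full buffer.
    const approxDecodedBytes = Math.floor((match[2]!.length * 3) / 4);
    if (approxDecodedBytes > MAX_IMAGE_BYTES) {
        throw new Error(`Image exceeds ${MAX_IMAGE_BYTES} byte limit`);
    }
    const data = Buffer.from(match[2]!, 'base64');
    if (data.byteLength > MAX_IMAGE_BYTES) {
        throw new Error(`Image exceeds ${MAX_IMAGE_BYTES} byte limit`);
    }
    return { mimeType: match[1]!, data };
}
```

Combined with the schema-level `.max(...)`, this rejects oversized uploads both before and after decoding.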
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/web/client/src/server/api/routers/workspace.ts` around lines 364 - 399,
The saveImageToProjectAssets mutation currently accepts imageDataUrl as
z.string().min(1) which allows unbounded payloads and can cause memory pressure
when parseDataUrl decodes the full base64; update the input schema for
saveImageToProjectAssets (the z.object with imageDataUrl) to enforce a sensible
maximum (e.g., .max(...)) and additionally validate decoded size after
parseDataUrl by checking parsed.data.byteLength and throwing a controlled error
if it exceeds the allowed limit before creating the Uint8Array or calling
provider.writeFile; ensure the error is descriptive so callers know the payload
is too large.

@DevFlex-AI
Author

@coderabbitai you remember me? I made a new version of my last PR, which was not safe to merge. Can you tell me if this one is actually safe to merge?
Also, please do an autofix tool call and push a commit to fix the problems in this repo.

@coderabbitai

coderabbitai Bot commented Apr 26, 2026

Only users with a collaborator, contributor, member, or owner role can interact with CodeRabbit.
