Add Hibachi vault adapter. Renamed Hyperliquid functions. Fixed USDC address #2651
IsmaAG wants to merge 2 commits into
Conversation
📝 Walkthrough

The APY adaptor expands from a single Hyperliquid data source to support multiple sources (Hyperliquid and Hibachi). Hyperliquid logic is refactored into focused helper functions, and a complete Hibachi integration is added with data fetching, daily interval grouping, and APY computation. The main export orchestrates both sources sequentially, aggregating results and gracefully skipping failed sources.

Changes: Multi-Source APY Collection
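The sequential, failure-tolerant orchestration the walkthrough describes can be sketched as follows. This is a minimal illustration, not the adaptor's actual code; `collectPools` and the source names are hypothetical.

```javascript
// Minimal sketch: run each APY source in turn, keep successful results,
// and log-and-skip any source that throws. Names are hypothetical.
async function collectPools(sources) {
  const pools = [];
  for (const [name, compute] of Object.entries(sources)) {
    try {
      pools.push(...(await compute()));
    } catch (err) {
      // Tolerate non-Error rejections so the reason is never lost.
      console.error(`skipping ${name}: ${err?.message ?? err}`);
    }
  }
  return pools;
}

// Example: one healthy source, one failing source.
collectPools({
  hyperliquid: async () => [{ pool: 'a' }],
  hibachi: async () => {
    throw new Error('upstream down');
  },
}).then((pools) => console.log(pools.length)); // → 1
```

A failed source only costs its own pools; the other sources' results still flow through.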
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~30 minutes
🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/adaptors/growihf/index.js (1)
33-35: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

`reject` with a string yields `undefined` in the `err.message` log. Line 34 rejects with the bare string `'Failed to parse vaultDetails'`. The new error handler at line 232 logs `err.message`, which is `undefined` for a string rejection — so a parse failure surfaces as `growihf: skipping hyperliquid: undefined`, hiding the root cause. Reject with an `Error` instance instead.

🛠️ Proposed fix

```diff
-      } catch (err) {
-        reject('Failed to parse vaultDetails');
-      }
+      } catch (err) {
+        reject(new Error(`Failed to parse vaultDetails: ${err.message}`));
+      }
```

You may also want to harden the apy handler logger to tolerate non-Error rejections, e.g. ``console.error(`growihf: skipping ${name}: ${err?.message ?? err}`)``.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@src/adaptors/growihf/index.js` around lines 33 - 35, The promise rejection in the catch block that handles parsing vaultDetails currently calls reject('Failed to parse vaultDetails'), which makes downstream logging (which reads err.message) show undefined; change that rejection to reject(new Error('Failed to parse vaultDetails')) so err.message is populated, and while here also make the apy handler logging resilient to non-Error rejections (e.g., use err?.message ?? String(err)) when logging in the handler that references err.message to ensure useful output for any rejection type.
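The logging difference the comment describes can be demonstrated with a small self-contained example; `parseVault` here is a hypothetical stand-in for the adaptor's parsing code, not the real function.

```javascript
// Rejecting with a bare string leaves err.message undefined; rejecting with an
// Error preserves the cause. parseVault is a hypothetical stand-in.
function parseVault(raw, rejectWithError) {
  return new Promise((resolve, reject) => {
    try {
      resolve(JSON.parse(raw));
    } catch (err) {
      if (rejectWithError) {
        reject(new Error(`Failed to parse vaultDetails: ${err.message}`));
      } else {
        reject('Failed to parse vaultDetails'); // bare string rejection
      }
    }
  });
}

parseVault('not json', false).catch((err) => {
  console.log('string rejection, err.message =', err.message); // → undefined
});
parseVault('not json', true).catch((err) => {
  // message starts with "Failed to parse vaultDetails:" and carries the JSON error
  console.log('Error rejection, err.message =', err.message);
});
```

Any handler that reads `err.message` sees `undefined` in the first case, which is exactly how the root cause gets hidden.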
🧹 Nitpick comments (3)
src/adaptors/growihf/index.js (3)
176-193: 💤 Low value | Minor: `infoData` round-trip and redundant access on line 193.

Two small cleanups in `computeHibachiAPY7Day`:

- Lines 177–179 fetch `/vault/info` only to confirm the vault exists; the returned `info` object is never used afterwards. If `/vault/performance` already errors clearly when `vaultId` is unknown, this extra request can be dropped (one fewer hop, one fewer failure mode).
- Line 193 reads `performanceData.vaultPerformanceIntervals.at(-1)` after you've already aliased that array to `intervals` (line 182) and validated it (line 183). Use `intervals.at(-1)` for consistency and to avoid the `|| []` fallback being silently bypassed here.

♻️ Diff

```diff
-  const tvlUsdLatest =
-    parseFloat(performanceData.vaultPerformanceIntervals.at(-1).totalValueLocked);
+  const tvlUsdLatest = parseFloat(intervals.at(-1).totalValueLocked);
```

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@src/adaptors/growihf/index.js` around lines 176 - 193, In computeHibachiAPY7Day remove the redundant fetch of `${HIBACHI_DATA_API}/vault/info` and the unused variables infoData/info (drop the fetch and existence check), rely on the `${HIBACHI_DATA_API}/vault/performance` call and its errors instead; also replace performanceData.vaultPerformanceIntervals.at(-1) with intervals.at(-1) (using the already-declared intervals array that was validated) when computing tvlUsdLatest to avoid the redundant fallback and keep consistency.
45-132: ⚡ Quick win | Extract the duplicated TWR/annualized-yield computation.

`computeHyperliquidAPY7Day` duplicates the entire deltaPNL → TWR → cumulative product → `Math.pow(..., 365/days) - 1` block from `computeHyperliquidAPYInception` (lines 49–76 vs. 91–118). Pulling the per-portfolio computation into one helper that takes the portfolio entry and returns `{ tvlUsd, annYield }` would eliminate ~30 lines of copy/paste and make future fixes (e.g., the `days <= 0` guard, NaN protection) apply in one place.

♻️ Sketch

```diff
+function computeHyperliquidPortfolioYield(portfolioEntry) {
+  const accountValueHistory = portfolioEntry.accountValueHistory;
+  const pnlHistory = portfolioEntry.pnlHistory;
+  if (accountValueHistory.length < 3 || pnlHistory.length < 3)
+    throw new Error('Not enough data points');
+
+  const timestamps = accountValueHistory.map(([t]) => t);
+  const values_tvl = accountValueHistory.map(([, v]) => parseFloat(v));
+  const values_pnl = pnlHistory.map(([, v]) => parseFloat(v));
+
+  let twrAcc = 1;
+  for (let i = 1; i < timestamps.length; i++) {
+    const delta = values_pnl[i] - values_pnl[i - 1];
+    const r = values_tvl[i - 1] !== 0 ? delta / values_tvl[i - 1] : 0;
+    twrAcc *= 1 + r;
+  }
+  const days = (timestamps[timestamps.length - 1] - timestamps[0]) / 86400000;
+  if (days <= 0) throw new Error('Non-positive time span');
+  return {
+    tvlUsd: values_tvl[values_tvl.length - 1],
+    annYield: Math.pow(twrAcc, 365 / days) - 1,
+  };
+}
```

Note: while restructuring, also double-check the TWR denominator — using `values_tvl[i]` (current) instead of `values_tvl[i - 1]` (start-of-period) is non-standard for time-weighted returns and may bias the result, particularly when TVL changes materially due to deposits/withdrawals.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@src/adaptors/growihf/index.js` around lines 45 - 132, computeHyperliquidAPY7Day duplicates the deltaPNL→TWR→cumulative-product→annualized-yield logic from computeHyperliquidAPYInception; extract that block into a helper (e.g., computePortfolioAnnYield) that accepts a portfolio entry (the array with accountValueHistory and pnlHistory) and returns { tvlUsd, annYield }. Replace the duplicated code in both computeHyperliquidAPYInception and computeHyperliquidAPY7Day to call this helper, and inside the helper add guards for days <= 0, NaN protection, and prefer using the period-start TVL (values_tvl[i-1]) as the TWR denominator (or at least make the denominator choice consistent and documented).
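For intuition on the TWR math discussed above, here is a runnable numeric sketch using the start-of-period denominator. The data points are invented and the function is an illustration of the technique, not the adaptor's code.

```javascript
// Numeric sketch of the time-weighted return (TWR) computation, using
// start-of-period TVL as the denominator. The data points are invented.
function timeWeightedAnnualYield(accountValueHistory, pnlHistory) {
  const timestamps = accountValueHistory.map(([t]) => t);
  const tvl = accountValueHistory.map(([, v]) => parseFloat(v));
  const pnl = pnlHistory.map(([, v]) => parseFloat(v));
  let twr = 1;
  for (let i = 1; i < timestamps.length; i++) {
    const delta = pnl[i] - pnl[i - 1];
    const r = tvl[i - 1] !== 0 ? delta / tvl[i - 1] : 0; // start-of-period denominator
    twr *= 1 + r;
  }
  const days = (timestamps[timestamps.length - 1] - timestamps[0]) / 86400000;
  if (days <= 0) throw new Error('Non-positive time span');
  return Math.pow(twr, 365 / days) - 1;
}

// Two daily samples, each a +1% gain on start-of-period TVL:
const day = 86400000;
const values = [[0, '1000'], [day, '1010'], [2 * day, '1020.1']];
const pnls = [[0, '0'], [day, '10'], [2 * day, '20.1']];
console.log(timeWeightedAnnualYield(values, pnls)); // ≈ 36.78, i.e. 1% daily compounded over a year
```

With the start-of-period denominator, two exact +1% days annualize to `1.01^365 - 1`; using the end-of-period value would understate each period's return when TVL has just grown from a deposit.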
11-43: ⚡ Quick win | Add timeouts and HTTP status checks to both fetchers.

Neither `fetchHyperliquidVaultDetails` nor `fetchHibachiDataAPI` sets a request timeout, so a hung connection to either upstream will stall the adaptor indefinitely (the worker that runs `apy()` has no per-source bound). Additionally, `fetchHibachiDataAPI` calls `JSON.parse(body)` regardless of `res.statusCode`, so a 4xx/5xx HTML or text error page produces a confusing parse error rather than a clear HTTP failure.

♻️ Suggested hardening for `fetchHibachiDataAPI` (apply the same pattern to the Hyperliquid fetcher)

```diff
 function fetchHibachiDataAPI(url) {
   return new Promise((resolve, reject) => {
-    https
-      .get(url, (res) => {
+    const req = https.get(url, { timeout: 15000 }, (res) => {
+      if (res.statusCode < 200 || res.statusCode >= 300) {
+        res.resume();
+        return reject(new Error(`HTTP ${res.statusCode} from ${url}`));
+      }
       let body = '';
       res.on('data', (chunk) => (body += chunk));
       res.on('end', () => {
         try {
           resolve(JSON.parse(body));
         } catch (err) {
           reject(err);
         }
       });
-      })
-      .on('error', reject);
+    });
+    req.on('timeout', () => req.destroy(new Error(`Timeout fetching ${url}`)));
+    req.on('error', reject);
   });
 }
```

Also applies to: 134-150
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@src/adaptors/growihf/index.js` around lines 11 - 43, Update both fetchHyperliquidVaultDetails and fetchHibachiDataAPI to add request timeouts and HTTP status checks: set a timeout on the https request (use req.setTimeout or an AbortController pattern) to abort and reject with a clear timeout error if the upstream hangs, and in the response handler check res.statusCode and reject with an error containing the statusCode and response body for non-2xx responses instead of blindly calling JSON.parse; only call JSON.parse on 2xx responses and include parsing errors in the rejection message.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 89ccdf66-ea2a-4dd4-ac79-622edbf33e00
⛔ Files ignored due to path filters (1)

`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (1)
src/adaptors/growihf/index.js
```js
async function computeHibachiAPY7Day() {
  const infoData = await fetchHibachiDataAPI(`${HIBACHI_DATA_API}/vault/info`);
  const info = (infoData || []).find((v) => v.vaultId === GROWI_ALPHA_VAULT_ID);
  if (!info) throw new Error('Growi Alpha Vault info not found');

  const performanceData = await fetchHibachiDataAPI(`${HIBACHI_DATA_API}/vault/performance?vaultId=${GROWI_ALPHA_VAULT_ID}&timeRange=All`);
  const intervals = performanceData.vaultPerformanceIntervals || [];
  if (intervals.length < 3) throw new Error('Not enough data points');

  const apyBaseInception = computeHibachiAPYInterval(intervals);

  const dailyBuckets = groupHibachiIntervalsByDay(intervals);
  const week = dailyBuckets.slice(-7);
  const apy7d =
    week.length >= 2 ? computeHibachiAPYInterval(week) : apyBaseInception;

  const tvlUsdLatest =
    parseFloat(performanceData.vaultPerformanceIntervals.at(-1).totalValueLocked);

  return {
    pool: `growihf-alpha-vault-hibachi`,
    chain: 'hibachi',
    project: 'growihf',
    symbol: 'USDT',
    tvlUsd: tvlUsdLatest,
    apy: apy7d * 100,
    apyBaseInception: apyBaseInception * 100,
    underlyingTokens: [USDT_ADDRESS_ARBITRUM],
    poolMeta: 'Hibachi Vault',
    url: 'https://app.hf.growi.fi/',
  };
}
```
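The snippet above calls `groupHibachiIntervalsByDay`, which is not shown in this hunk. A plausible sketch of such daily bucketing follows; the `timestamp` field (milliseconds) and the keep-last-per-day policy are assumptions about the Hibachi payload, not confirmed behavior.

```javascript
// Hypothetical sketch of daily bucketing: keep the last interval seen for each
// UTC day. The `timestamp` field name is an assumption, not the confirmed payload.
function groupIntervalsByDay(intervals) {
  const byDay = new Map();
  for (const interval of intervals) {
    const day = Math.floor(interval.timestamp / 86400000);
    byDay.set(day, interval); // later intervals in the same day overwrite earlier ones
  }
  return [...byDay.keys()].sort((a, b) => a - b).map((day) => byDay.get(day));
}
```

Whatever the real grouping does, `week = dailyBuckets.slice(-7)` then takes the trailing seven daily entries for the 7-day APY window.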
🧩 Analysis chain
🏁 Script executed:

```shell
#!/bin/bash
# Look at how other adaptors pair chain with underlyingTokens to see whether
# cross-chain token addresses are an accepted pattern in this repo.
rg -nP --type=js -C3 "chain:\s*['\"]hibachi['\"]"
rg -nP --type=js -C2 "underlyingTokens" src/adaptors/growihf/
```

Repository: DefiLlama/yield-server
Length of output: 1188
🌐 Web query:
Hibachi exchange Growi Alpha Vault deposit token USDT or USDC
💡 Result:
Hibachi (including its Growi Alpha Vault / GAV vaults) accepts deposits in stablecoins USDC and USDT (collateral), on supported networks (Base and Arbitrum). [1] What to use for “deposit token”: - USDC or USDT are both supported as deposit/collateral assets on Hibachi. [1][2] - Deposit availability can vary by vault, but the docs state users can deposit USDC or USDT into Hibachi vaults. [1] Notes (important for deposits): - Hibachi currently supports deposits/withdrawals only on Base and Arbitrum; sending to other networks can lock funds on-chain. [3] If you meant something else by “Hibachi exchange” (e.g., a specific UI deposit screen for Growi Alpha Vault), tell me the exact vault link/screen text and I can map which token (USDT vs USDC) it allows there.
Citations:
- 1: https://docs.hibachi.xyz/hibachi-docs/vaults/how-vaults-work
- 2: https://docs.hibachi.xyz/hibachi-docs/getting-started/signing-up
- 3: https://docs.hibachi.xyz/faq
🏁 Script executed:

```shell
#!/bin/bash
# Check the address definitions and any comments about tokens in the growihf adaptor
cat -n src/adaptors/growihf/index.js | head -50
```

Repository: DefiLlama/yield-server
Length of output: 1924
🏁 Script executed:

```shell
#!/bin/bash
# Look for USDT_ADDRESS_ARBITRUM and ARBITRUM_NATIVE_USDC_ADDRESS definitions
rg -n "USDT_ADDRESS_ARBITRUM|ARBITRUM_NATIVE_USDC_ADDRESS" src/adaptors/growihf/
```

Repository: DefiLlama/yield-server
Length of output: 458
🏁 Script executed:

```shell
#!/bin/bash
# Check if there are any Hibachi-specific token addresses referenced in the codebase
rg -n "hibachi" -i src/adaptors/growihf/ -A2 -B2
```

Repository: DefiLlama/yield-server
Length of output: 3719
🏁 Script executed:

```shell
#!/bin/bash
# Look at how other adaptors handle chain definitions to see if cross-chain addresses are common
rg -n "chain:" src/adaptors/ --type=js -A1 | grep -i "arbitrum\|ethereum\|polygon" | head -20
```

Repository: DefiLlama/yield-server
Length of output: 1522
🌐 Web query:
Growi Alpha Vault Hibachi deposit token specifications USDT USDC
💡 Result:
Growi Alpha Vault (GAV) deposit token specifications (USDT / USDC) 1) Which tokens are supported for deposits - Hibachi vaults support deposits of either USDC or USDT (deposit availability may vary by vault) [1]. 2) Network / chain support - Deposits of USDC or USDT are supported on Base and Arbitrum [1]. 3) Token-specific details (contract address, decimals, chain-specific behavior) - The Hibachi docs content available in the retrieved sources does not specify the ERC-20 contract addresses, decimals, or other token-level parameters specifically for USDT and USDC within the Growi Alpha Vault (GAV) page/snippets we accessed [2][1]. - What is explicitly described for vault deposits is the supported deposit assets (USDC/USDT) and that the deposit configuration and availability can vary by vault [1]. Net: Based on the authoritative Hibachi documentation excerpts retrieved, the Growi Alpha Vault deposit tokens are USDC and USDT, available to deposit on Base and Arbitrum, but the sources provided here do not include the USDC/USDT token contract addresses or other low-level token specs for GAV [1].
Citations:
- 1: https://docs.hibachi.xyz/hibachi-docs/vaults/how-vaults-work
- 2: https://docs.hibachi.xyz/hibachi-docs/vaults/growi-alpha-vault-gav
🏁 Script executed:

```shell
#!/bin/bash
# Search for 'hibachi' chain in other parts of the codebase
rg -rn "hibachi" --type=js src/adaptors/ | grep -v "growihf"
```

Repository: DefiLlama/yield-server
Length of output: 48
🏁 Script executed:

```shell
#!/bin/bash
# Look for chain definitions or valid chains list
find src -name "*.ts" -o -name "*.js" | xargs grep -l "chain.*=\|chains.*=" | head -10
```

Repository: DefiLlama/yield-server
Length of output: 480
🏁 Script executed:

```shell
#!/bin/bash
# Check if there's any configuration or index defining valid chains
rg -n "hibachi|arbitrum|ethereum" src/utils/ src/config/ -A1 -B1 2>/dev/null | head -30
```

Repository: DefiLlama/yield-server
Length of output: 1518
🏁 Script executed:

```shell
#!/bin/bash
# Look at the full Hibachi function to see if there are clues about deposit token
sed -n '176,207p' src/adaptors/growihf/index.js
```

Repository: DefiLlama/yield-server
Length of output: 1316
🌐 Web query:
Growi Alpha Vault GAV Hibachi specifically USDT or USDC which token deposit
💡 Result:
The Growi Alpha Vault (GAV) on Hibachi accepts deposits of USDC or USDT, as Hibachi vaults generally support these stablecoins on their supported chains (Base for USDC, Arbitrum for USDT).[1][2] Specific deposit availability for GAV may vary by allocation periods, such as top point earners.[3][4] Growi Finance's related products, like Growi HF on Hyperliquid (via Arbitrum), also use USDC deposits.[5][6]
Citations:
- 1: https://docs.hibachi.xyz/hibachi-docs/vaults/how-vaults-work
- 2: https://docs.hibachi.xyz/hibachi-docs/getting-started/signing-up
- 3: https://docs.hibachi.xyz/hibachi-docs/vaults
- 4: https://docs.hibachi.xyz/hibachi-docs/vaults/growi-alpha-vault-gav
- 5: https://www.hypurr.co/ecosystem-projects/growi-finance
- 6: https://growi-finance.gitbook.io/hf-docs
🏁 Script executed:

```shell
#!/bin/bash
# Search for chain validation or chain list in the codebase
find src -type f \( -name "*.ts" -o -name "*.js" \) -exec grep -l "const.*CHAIN\|export.*chain" {} \; | head -10
```

Repository: DefiLlama/yield-server
Length of output: 7970
🏁 Script executed:

```shell
#!/bin/bash
# Look for any documentation or schema validation for pool object structure
find src -type f \( -name "*.ts" -o -name "*.js" \) -exec grep -l "underlyingTokens\|poolSchema\|validatePool" {} \; | head -5
```

Repository: DefiLlama/yield-server
Length of output: 35210
🏁 Script executed:

```shell
#!/bin/bash
# Check if there's any reference material about Hibachi being a chain vs protocol
rg -n "Hibachi\|hibachi" --type=js src/ | head -20
```

Repository: DefiLlama/yield-server
Length of output: 48
🌐 Web query:
DefiLlama yield-server supported chains list hibachi chain
💡 Result:
DefiLlama yield-server “supported chains list” for the Hibachi chain: Hibachi is on the chains Arbitrum and Base.[1][2] What I can and can’t confirm from the yield-server code: 1) The DefiLlama yield-server repository defines how an adaptor returns pools including a required chain field, which should match DefiLlama’s chain names.[3] However, the sources I retrieved don’t include the Hibachi adaptor’s actual exported pool objects, so I can’t enumerate the exact chain strings as implemented in the yield-server adaptor. 2) Hibachi’s DefiLlama chain pages indicate its ecosystem/charts are associated with Arbitrum and Base.[1][2] So, the grounded answer to “hibachi chain” is: Arbitrum and Base.[1][2]
Citations:
- 1: https://defillama.com/chain/hibachi
- 2: https://defillama2.llamao.fi/protocol/hibachi?events=false&perpVolume=true&tvl=false
- 3: https://github.com/DefiLlama/yield-server
🌐 Web query:
Hibachi protocol blockchain settlement layer which network Arbitrum Base
💡 Result:
Hibachi’s blockchain settlement layer is implemented as zk-verified on-chain settlement (“collateral can only move on-chain if the movement has been validated via a zk proof”). [1] Hibachi is deployed on both Arbitrum and Base (so settlement happens on the user’s chosen network). [2][3][4] What network connects to which chain: - Arbitrum: users must have assets and ETH on Arbitrum; Hibachi accepts USDT as collateral on Arbitrum. [2][5] - Base: users must have assets and ETH on Base; Hibachi accepts USDC as collateral on Base. [2][5] Additional detail: Hibachi’s “verification/settlement layer” is described as zk-verification, but the docs and other sources do not present a separate third “settlement layer chain” beyond the fact that Hibachi runs on Arbitrum and Base for on-chain settlement. [1][5]
Citations:
- 1: https://docs.hibachi.xyz/hibachi-docs/verification
- 2: https://docs.hibachi.xyz/hibachi-docs/getting-started/signing-up
- 3: https://docs.hibachi.xyz/faq
- 4: https://blog.hibachi.xyz/introducing-hibachi-ccfc853a7c12?gi=7bd9e03a33ce
- 5: https://insidecryptoreview.com/en/dex/hibachi
Change `chain: 'hibachi'` to `chain: 'arbitrum'` to match the Arbitrum USDT address.
Hibachi is a protocol that settles on Arbitrum and Base, not a settlement layer itself. Using `chain: 'hibachi'` with `USDT_ADDRESS_ARBITRUM` creates a mismatch that will likely break token resolution in DefiLlama's pipeline. Since the underlying token is explicitly an Arbitrum address, the pool should declare `chain: 'arbitrum'` to ensure proper token pricing and metadata.
Verify which stablecoin (USDT vs USDC) the Growi Alpha Vault actually accepts on Arbitrum and confirm the correct address is being used.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@src/adaptors/growihf/index.js` around lines 176 - 207, In
computeHibachiAPY7Day, change the pool chain field from the incorrect literal
'hibachi' to the actual settlement chain 'arbitrum' to match the underlying
token constant USDT_ADDRESS_ARBITRUM and ensure proper token resolution; update
the value assigned to the chain property in the returned object and as a
follow-up verify the vault accepts USDT on Arbitrum and that
USDT_ADDRESS_ARBITRUM is the correct token constant for this vault (switch to
USDC_ADDRESS_ARBITRUM if the vault uses USDC).
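If the review's suggestion is adopted, the corrected pool object would look roughly like this. The numeric values are placeholders, and the address shown is the commonly cited bridged USDT contract on Arbitrum One; both should be verified against the adaptor's own constants before use.

```javascript
// Sketch of the corrected pool shape: chain now matches the network of the
// underlying token. tvlUsd/apy values are placeholders, not real data.
const USDT_ADDRESS_ARBITRUM = '0xFd086bC7CD5C481DCC9C85ebE478A1C0b69FCbb9';

const pool = {
  pool: 'growihf-alpha-vault-hibachi',
  chain: 'arbitrum', // was 'hibachi'; must be a chain DefiLlama recognizes
  project: 'growihf',
  symbol: 'USDT',
  tvlUsd: 0, // placeholder
  apy: 0, // placeholder
  underlyingTokens: [USDT_ADDRESS_ARBITRUM],
  poolMeta: 'Hibachi Vault',
  url: 'https://app.hf.growi.fi/',
};
console.log(pool.chain); // → arbitrum
```

The key property is that `chain` and every entry of `underlyingTokens` refer to the same network, so downstream token pricing resolves correctly.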
The growihf adapter exports pools. Test Suites: 1 passed, 1 total
Added Growi Alpha Vault (GAV) on Hibachi. It displays vault APY since inception and over the last 7 days. https://hibachi.xyz/vaults
Hyperliquid vault yield calculations are unchanged, but some functions were renamed. Fixed the USDC address to correspond with native USDC on Arbitrum.