Add Hibachi vault adapter. Renamed Hyperliquid functions. Fixed USDC address #2651

Open

IsmaAG wants to merge 2 commits into DefiLlama:master from Growi-DeFi:master

Conversation


@IsmaAG IsmaAG commented May 5, 2026

Added Growi Alpha Vault (GAV) on the Hibachi chain. It displays vault APY since inception and over the last 7 days: https://hibachi.xyz/vaults

Hyperliquid vault yield calculations are unchanged, but some functions were renamed. Fixed the USDC address to match native USDC on Arbitrum.

Summary by CodeRabbit

  • New Features
    • Added Hibachi protocol integration to expand APY data sources
    • APY aggregation now retrieves and combines data from multiple protocols including Hyperliquid and Hibachi
    • Improved system resilience with automatic error handling and fallback for unavailable data sources
    • Updated metadata formatting to enhance data clarity and accuracy


coderabbitai Bot commented May 5, 2026

📝 Walkthrough

The APY adaptor expands from a single Hyperliquid data source to support multiple sources (Hyperliquid and Hibachi). Hyperliquid logic is refactored into focused helper functions, and a complete Hibachi integration is added with data fetching, daily interval grouping, and APY computation. The main export orchestrates both sources sequentially, aggregating results and gracefully skipping failed sources.
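The orchestration described above can be sketched as follows. This is a minimal illustration of the pattern (sequential source builders with per-source error handling), not the adaptor's actual code; the builder names and pool shapes here are hypothetical.

```javascript
// Sketch of the multi-source pattern: run each source builder in sequence,
// log and skip any source that fails, and return the pools that succeeded.
// Builder names and pool fields are illustrative only.
async function collectPools(sources) {
  const pools = [];
  for (const { name, build } of sources) {
    try {
      pools.push(await build());
    } catch (err) {
      // Tolerate non-Error rejections when logging.
      console.error(`growihf: skipping ${name}: ${err?.message ?? err}`);
    }
  }
  return pools;
}

// Example: one source succeeds, one fails; only the good pool is returned.
collectPools([
  { name: 'hyperliquid', build: async () => ({ pool: 'hl-pool', tvlUsd: 1 }) },
  { name: 'hibachi', build: async () => { throw new Error('API down'); } },
]).then((pools) => console.log(pools.length)); // logs 1
```

The per-source try/catch is what lets one upstream outage degrade the response instead of failing the whole adaptor.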

Changes

Multi-Source APY Collection

| Layer | File(s) | Summary |
|---|---|---|
| Configuration & Constants | `src/adaptors/growihf/index.js` (lines 1–11) | Added Hyperliquid and Arbitrum USDC address constants, plus the Hibachi API base URL and vault identifiers for both data sources. |
| Hyperliquid Refactoring | `src/adaptors/growihf/index.js` (lines 11–133) | Split Hyperliquid logic into `fetchHyperliquidVaultDetails`, `computeHyperliquidAPYInception`, and `computeHyperliquidAPY7Day`. Updated returned pool metadata to include `apyBaseInception`, `underlyingTokens` (Arbitrum USDC), and `symbol` (USDC). |
| Hibachi Integration | `src/adaptors/growihf/index.js` (lines 134–207) | Added `fetchHibachiDataAPI`, `groupHibachiIntervalsByDay`, `computeHibachiAPYInterval`, and `computeHibachiAPY7Day` to fetch Hibachi vault data, bucket performance intervals by day, compute annualized APY from share price ratios, and derive 7-day APY (with fallback to inception APY). Returns a Hibachi pool object with chain, symbol, underlying tokens, and computed yields. |
| Multi-Source Orchestration | `src/adaptors/growihf/index.js` (lines 209–238) | Modified `module.exports.apy` to define a sources array for Hyperliquid and Hibachi, sequentially execute each source builder, catch per-source errors with logged warnings, and return the combined pools array. |
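The "annualized APY from share price ratios" step in the Hibachi integration can be illustrated with a small sketch. The interval shape (`{ timestamp, sharePrice }`) and the function name are assumptions for illustration; the real Hibachi API fields and `computeHibachiAPYInterval` may differ.

```javascript
// Sketch: annualize a return from the first and last share prices across a
// set of performance intervals. Interval shape is assumed, not the actual
// Hibachi API schema.
function annualizedAPY(intervals) {
  const first = intervals[0];
  const last = intervals[intervals.length - 1];
  const ratio = last.sharePrice / first.sharePrice; // growth over the window
  const days = (last.timestamp - first.timestamp) / 86400000; // ms -> days
  if (days <= 0) throw new Error('Non-positive time span');
  return Math.pow(ratio, 365 / days) - 1; // compound up to a year
}

// Example: share price grows 1% over 7 days.
const apy = annualizedAPY([
  { timestamp: 0, sharePrice: 1.0 },
  { timestamp: 7 * 86400000, sharePrice: 1.01 },
]);
console.log((apy * 100).toFixed(1)); // prints "68.0" (about 68% annualized)
```

The same formula applied to the trailing 7-day bucket versus the full history yields the `apy` and `apyBaseInception` pair described above.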

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~30 minutes

Poem

🐰 A rabbit's ode to dual flows,
where Hyperliquid and Hibachi grows,
APYs bundled in one sweet dance,
share prices bucketed by chance,
errors skip gracefully, pools combine—
a multi-source treasure most fine!

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
|---|---|---|---|
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, which is insufficient. The required threshold is 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |
✅ Passed checks (4 passed)
| Check name | Status | Explanation |
|---|---|---|
| Title check | ✅ Passed | The title accurately reflects the three main changes: adding the Hibachi vault adapter, renaming Hyperliquid functions, and fixing the USDC address. |
| Linked Issues check | ✅ Passed | Check skipped because no linked issues were found for this pull request. |
| Out of Scope Changes check | ✅ Passed | Check skipped because no linked issues were found for this pull request. |
| Description check | ✅ Passed | Check skipped: CodeRabbit's high-level summary is enabled. |


@coderabbitai coderabbitai Bot left a comment
Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/adaptors/growihf/index.js (1)

33-35: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

reject with a string yields undefined in the err.message log.

Line 34 rejects with the bare string 'Failed to parse vaultDetails'. The new error handler at line 232 logs err.message, which is undefined for a string rejection — so a parse failure surfaces as growihf: skipping hyperliquid: undefined, hiding the root cause. Reject with an Error instance instead.

🛠️ Proposed fix
-        } catch (err) {
-          reject('Failed to parse vaultDetails');
-        }
+        } catch (err) {
+          reject(new Error(`Failed to parse vaultDetails: ${err.message}`));
+        }

You may also want to harden the `apy` handler logger to tolerate non-Error rejections, e.g. ``console.error(`growihf: skipping ${name}: ${err?.message ?? err}`)``.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/adaptors/growihf/index.js` around lines 33 - 35, The promise rejection in
the catch block that handles parsing vaultDetails currently calls reject('Failed
to parse vaultDetails'), which makes downstream logging (which reads
err.message) show undefined; change that rejection to reject(new Error('Failed
to parse vaultDetails')) so err.message is populated, and while here also make
the apy handler logging resilient to non-Error rejections (e.g., use
err?.message ?? String(err)) when logging in the handler that references
err.message to ensure useful output for any rejection type.
🧹 Nitpick comments (3)
src/adaptors/growihf/index.js (3)

176-193: 💤 Low value

Minor: infoData round-trip and redundant access on line 193.

Two small cleanups in computeHibachiAPY7Day:

  • Lines 177–179 fetch /vault/info only to confirm the vault exists; the returned info object is never used afterwards. If /vault/performance already errors clearly when vaultId is unknown, this extra request can be dropped (one fewer hop, one fewer failure mode).
  • Line 193 reads performanceData.vaultPerformanceIntervals.at(-1) after you've already aliased that array to intervals (line 182) and validated it (line 183). Use intervals.at(-1) for consistency and to avoid the || [] fallback being silently bypassed here.
♻️ Diff
-  const tvlUsdLatest =
-    parseFloat(performanceData.vaultPerformanceIntervals.at(-1).totalValueLocked);
+  const tvlUsdLatest = parseFloat(intervals.at(-1).totalValueLocked);
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/adaptors/growihf/index.js` around lines 176 - 193, In
computeHibachiAPY7Day remove the redundant fetch of
`${HIBACHI_DATA_API}/vault/info` and the unused variables infoData/info (drop
the fetch and existence check), rely on the
`${HIBACHI_DATA_API}/vault/performance` call and its errors instead; also
replace performanceData.vaultPerformanceIntervals.at(-1) with intervals.at(-1)
(using the already-declared intervals array that was validated) when computing
tvlUsdLatest to avoid the redundant fallback and keep consistency.

45-132: ⚡ Quick win

Extract the duplicated TWR/annualized-yield computation.

computeHyperliquidAPY7Day duplicates the entire deltaPNL → TWR → cumulative product → Math.pow(..., 365/days) - 1 block from computeHyperliquidAPYInception (lines 49–76 vs. 91–118). Pulling the per-portfolio computation into one helper that takes the portfolio entry and returns { tvlUsd, annYield } would eliminate ~30 lines of copy/paste and make future fixes (e.g., the days <= 0 guard, NaN protection) apply in one place.

♻️ Sketch
+function computeHyperliquidPortfolioYield(portfolioEntry) {
+  const accountValueHistory = portfolioEntry.accountValueHistory;
+  const pnlHistory = portfolioEntry.pnlHistory;
+  if (accountValueHistory.length < 3 || pnlHistory.length < 3)
+    throw new Error('Not enough data points');
+
+  const timestamps = accountValueHistory.map(([t]) => t);
+  const values_tvl = accountValueHistory.map(([, v]) => parseFloat(v));
+  const values_pnl = pnlHistory.map(([, v]) => parseFloat(v));
+
+  let twrAcc = 1;
+  for (let i = 1; i < timestamps.length; i++) {
+    const delta = values_pnl[i] - values_pnl[i - 1];
+    const r = values_tvl[i - 1] !== 0 ? delta / values_tvl[i - 1] : 0;
+    twrAcc *= 1 + r;
+  }
+  const days = (timestamps[timestamps.length - 1] - timestamps[0]) / 86400000;
+  if (days <= 0) throw new Error('Non-positive time span');
+  return {
+    tvlUsd: values_tvl[values_tvl.length - 1],
+    annYield: Math.pow(twrAcc, 365 / days) - 1,
+  };
+}

Note: while restructuring, also double-check the TWR denominator — using values_tvl[i] (current) instead of values_tvl[i - 1] (start-of-period) is non-standard for time-weighted returns and may bias the result, particularly when TVL changes materially due to deposits/withdrawals.
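A toy demonstration of why the denominator choice matters (illustration only, not code from the adaptor): suppose PnL is +10 over a period in which TVL jumps from 100 to 200 because of a deposit.

```javascript
// Toy demonstration of the TWR denominator choice. With a 100 -> 200 TVL
// jump (deposit) and +10 PnL in the period, dividing by start-of-period TVL
// gives a 10% period return; dividing by end-of-period TVL gives only 5%.
const tvl = [100, 200];
const pnl = [0, 10];

const delta = pnl[1] - pnl[0];
const rStart = delta / tvl[0]; // standard TWR: start-of-period denominator
const rEnd = delta / tvl[1];   // non-standard: end-of-period denominator

console.log(rStart); // 0.1
console.log(rEnd);   // 0.05
```

The larger the deposit/withdrawal flows relative to TVL, the more the end-of-period denominator understates (or overstates) the true time-weighted return.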

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/adaptors/growihf/index.js` around lines 45 - 132,
computeHyperliquidAPY7Day duplicates the
deltaPNL→TWR→cumulative-product→annualized-yield logic from
computeHyperliquidAPYInception; extract that block into a helper (e.g.,
computePortfolioAnnYield) that accepts a portfolio entry (the array with
accountValueHistory and pnlHistory) and returns { tvlUsd, annYield }. Replace
the duplicated code in both computeHyperliquidAPYInception and
computeHyperliquidAPY7Day to call this helper, and inside the helper add guards
for days <= 0, NaN protection, and prefer using the period-start TVL
(values_tvl[i-1]) as the TWR denominator (or at least make the denominator
choice consistent and documented).

11-43: ⚡ Quick win

Add timeouts and HTTP status checks to both fetchers.

Neither fetchHyperliquidVaultDetails nor fetchHibachiDataAPI sets a request timeout, so a hung connection to either upstream will stall the adaptor indefinitely (the worker that runs apy() has no per-source bound). Additionally, fetchHibachiDataAPI calls JSON.parse(body) regardless of res.statusCode, so a 4xx/5xx HTML or text error page produces a confusing parse error rather than a clear HTTP failure.

♻️ Suggested hardening for `fetchHibachiDataAPI` (apply the same pattern to the Hyperliquid fetcher)
 function fetchHibachiDataAPI(url) {
   return new Promise((resolve, reject) => {
-    https
-      .get(url, (res) => {
+    const req = https.get(url, { timeout: 15000 }, (res) => {
+        if (res.statusCode < 200 || res.statusCode >= 300) {
+          res.resume();
+          return reject(new Error(`HTTP ${res.statusCode} from ${url}`));
+        }
         let body = '';
         res.on('data', (chunk) => (body += chunk));
         res.on('end', () => {
           try {
             resolve(JSON.parse(body));
           } catch (err) {
             reject(err);
           }
         });
-      })
-      .on('error', reject);
+      });
+    req.on('timeout', () => req.destroy(new Error(`Timeout fetching ${url}`)));
+    req.on('error', reject);
   });
 }

Also applies to: 134-150

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/adaptors/growihf/index.js` around lines 11 - 43, Update both
fetchHyperliquidVaultDetails and fetchHibachiDataAPI to add request timeouts and
HTTP status checks: set a timeout on the https request (use req.setTimeout or an
AbortController pattern) to abort and reject with a clear timeout error if the
upstream hangs, and in the response handler check res.statusCode and reject with
an error containing the statusCode and response body for non-2xx responses
instead of blindly calling JSON.parse; only call JSON.parse on 2xx responses and
include parsing errors in the rejection message.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 89ccdf66-ea2a-4dd4-ac79-622edbf33e00

📥 Commits

Reviewing files that changed from the base of the PR and between da87652 and ba5ce1b.

⛔ Files ignored due to path filters (1)
  • package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (1)
  • src/adaptors/growihf/index.js

Comment on lines +176 to +207
async function computeHibachiAPY7Day() {
  const infoData = await fetchHibachiDataAPI(`${HIBACHI_DATA_API}/vault/info`);
  const info = (infoData || []).find((v) => v.vaultId === GROWI_ALPHA_VAULT_ID);
  if (!info) throw new Error('Growi Alpha Vault info not found');

  const performanceData = await fetchHibachiDataAPI(`${HIBACHI_DATA_API}/vault/performance?vaultId=${GROWI_ALPHA_VAULT_ID}&timeRange=All`);
  const intervals = performanceData.vaultPerformanceIntervals || [];
  if (intervals.length < 3) throw new Error('Not enough data points');

  const apyBaseInception = computeHibachiAPYInterval(intervals);

  const dailyBuckets = groupHibachiIntervalsByDay(intervals);
  const week = dailyBuckets.slice(-7);
  const apy7d =
    week.length >= 2 ? computeHibachiAPYInterval(week) : apyBaseInception;

  const tvlUsdLatest =
    parseFloat(performanceData.vaultPerformanceIntervals.at(-1).totalValueLocked);

  return {
    pool: `growihf-alpha-vault-hibachi`,
    chain: 'hibachi',
    project: 'growihf',
    symbol: 'USDT',
    tvlUsd: tvlUsdLatest,
    apy: apy7d * 100,
    apyBaseInception: apyBaseInception * 100,
    underlyingTokens: [USDT_ADDRESS_ARBITRUM],
    poolMeta: 'Hibachi Vault',
    url: 'https://app.hf.growi.fi/',
  };
}

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Look at how other adaptors pair chain with underlyingTokens to see whether
# cross-chain token addresses are an accepted pattern in this repo.
rg -nP --type=js -C3 "chain:\s*['\"]hibachi['\"]"
rg -nP --type=js -C2 "underlyingTokens" src/adaptors/growihf/

Repository: DefiLlama/yield-server

Length of output: 1188


🌐 Web query:

Hibachi exchange Growi Alpha Vault deposit token USDT or USDC

💡 Result:

Hibachi (including its Growi Alpha Vault / GAV vaults) accepts deposits in stablecoins USDC and USDT (collateral), on supported networks (Base and Arbitrum). [1] What to use for “deposit token”: - USDC or USDT are both supported as deposit/collateral assets on Hibachi. [1][2] - Deposit availability can vary by vault, but the docs state users can deposit USDC or USDT into Hibachi vaults. [1] Notes (important for deposits): - Hibachi currently supports deposits/withdrawals only on Base and Arbitrum; sending to other networks can lock funds on-chain. [3] If you meant something else by “Hibachi exchange” (e.g., a specific UI deposit screen for Growi Alpha Vault), tell me the exact vault link/screen text and I can map which token (USDT vs USDC) it allows there.


🏁 Script executed:

#!/bin/bash
# Check the address definitions and any comments about tokens in the growihf adaptor
cat -n src/adaptors/growihf/index.js | head -50

Repository: DefiLlama/yield-server

Length of output: 1924


🏁 Script executed:

#!/bin/bash
# Look for USDT_ADDRESS_ARBITRUM and ARBITRUM_NATIVE_USDC_ADDRESS definitions
rg -n "USDT_ADDRESS_ARBITRUM|ARBITRUM_NATIVE_USDC_ADDRESS" src/adaptors/growihf/

Repository: DefiLlama/yield-server

Length of output: 458


🏁 Script executed:

#!/bin/bash
# Check if there are any Hibachi-specific token addresses referenced in the codebase
rg -n "hibachi" -i src/adaptors/growihf/ -A2 -B2

Repository: DefiLlama/yield-server

Length of output: 3719


🏁 Script executed:

#!/bin/bash
# Look at how other adaptors handle chain definitions to see if cross-chain addresses are common
rg -n "chain:" src/adaptors/ --type=js -A1 | grep -i "arbitrum\|ethereum\|polygon" | head -20

Repository: DefiLlama/yield-server

Length of output: 1522


🌐 Web query:

Growi Alpha Vault Hibachi deposit token specifications USDT USDC

💡 Result:

Growi Alpha Vault (GAV) deposit token specifications (USDT / USDC) 1) Which tokens are supported for deposits - Hibachi vaults support deposits of either USDC or USDT (deposit availability may vary by vault) [1]. 2) Network / chain support - Deposits of USDC or USDT are supported on Base and Arbitrum [1]. 3) Token-specific details (contract address, decimals, chain-specific behavior) - The Hibachi docs content available in the retrieved sources does not specify the ERC-20 contract addresses, decimals, or other token-level parameters specifically for USDT and USDC within the Growi Alpha Vault (GAV) page/snippets we accessed [2][1]. - What is explicitly described for vault deposits is the supported deposit assets (USDC/USDT) and that the deposit configuration and availability can vary by vault [1]. Net: Based on the authoritative Hibachi documentation excerpts retrieved, the Growi Alpha Vault deposit tokens are USDC and USDT, available to deposit on Base and Arbitrum, but the sources provided here do not include the USDC/USDT token contract addresses or other low-level token specs for GAV [1].


🏁 Script executed:

#!/bin/bash
# Search for 'hibachi' chain in other parts of the codebase
rg -rn "hibachi" --type=js src/adaptors/ | grep -v "growihf"

Repository: DefiLlama/yield-server

Length of output: 48


🏁 Script executed:

#!/bin/bash
# Look for chain definitions or valid chains list
find src -name "*.ts" -o -name "*.js" | xargs grep -l "chain.*=\|chains.*=" | head -10

Repository: DefiLlama/yield-server

Length of output: 480


🏁 Script executed:

#!/bin/bash
# Check if there's any configuration or index defining valid chains
rg -n "hibachi|arbitrum|ethereum" src/utils/ src/config/ -A1 -B1 2>/dev/null | head -30

Repository: DefiLlama/yield-server

Length of output: 1518


🏁 Script executed:

#!/bin/bash
# Look at the full Hibachi function to see if there are clues about deposit token
sed -n '176,207p' src/adaptors/growihf/index.js

Repository: DefiLlama/yield-server

Length of output: 1316


🌐 Web query:

Growi Alpha Vault GAV Hibachi specifically USDT or USDC which token deposit

💡 Result:

The Growi Alpha Vault (GAV) on Hibachi accepts deposits of USDC or USDT, as Hibachi vaults generally support these stablecoins on their supported chains (Base for USDC, Arbitrum for USDT).[1][2] Specific deposit availability for GAV may vary by allocation periods, such as top point earners.[3][4] Growi Finance's related products, like Growi HF on Hyperliquid (via Arbitrum), also use USDC deposits.[5][6]


🏁 Script executed:

#!/bin/bash
# Search for chain validation or chain list in the codebase
find src -type f \( -name "*.ts" -o -name "*.js" \) -exec grep -l "const.*CHAIN\|export.*chain" {} \; | head -10

Repository: DefiLlama/yield-server

Length of output: 7970


🏁 Script executed:

#!/bin/bash
# Look for any documentation or schema validation for pool object structure
find src -type f \( -name "*.ts" -o -name "*.js" \) -exec grep -l "underlyingTokens\|poolSchema\|validatePool" {} \; | head -5

Repository: DefiLlama/yield-server

Length of output: 35210


🏁 Script executed:

#!/bin/bash
# Check if there's any reference material about Hibachi being a chain vs protocol
rg -n "Hibachi\|hibachi" --type=js src/ | head -20

Repository: DefiLlama/yield-server

Length of output: 48


🌐 Web query:

DefiLlama yield-server supported chains list hibachi chain

💡 Result:

DefiLlama yield-server “supported chains list” for the Hibachi chain: Hibachi is on the chains Arbitrum and Base.[1][2] What I can and can’t confirm from the yield-server code: 1) The DefiLlama yield-server repository defines how an adaptor returns pools including a required chain field, which should match DefiLlama’s chain names.[3] However, the sources I retrieved don’t include the Hibachi adaptor’s actual exported pool objects, so I can’t enumerate the exact chain strings as implemented in the yield-server adaptor. 2) Hibachi’s DefiLlama chain pages indicate its ecosystem/charts are associated with Arbitrum and Base.[1][2] So, the grounded answer to “hibachi chain” is: Arbitrum and Base.[1][2]


🌐 Web query:

Hibachi protocol blockchain settlement layer which network Arbitrum Base

💡 Result:

Hibachi’s blockchain settlement layer is implemented as zk-verified on-chain settlement (“collateral can only move on-chain if the movement has been validated via a zk proof”). [1] Hibachi is deployed on both Arbitrum and Base (so settlement happens on the user’s chosen network). [2][3][4] What network connects to which chain: - Arbitrum: users must have assets and ETH on Arbitrum; Hibachi accepts USDT as collateral on Arbitrum. [2][5] - Base: users must have assets and ETH on Base; Hibachi accepts USDC as collateral on Base. [2][5] Additional detail: Hibachi’s “verification/settlement layer” is described as zk-verification, but the docs and other sources do not present a separate third “settlement layer chain” beyond the fact that Hibachi runs on Arbitrum and Base for on-chain settlement. [1][5]


Change chain: 'hibachi' to chain: 'arbitrum' to match the Arbitrum USDT address.

Hibachi is a protocol that settles on Arbitrum and Base—not a settlement layer itself. Using chain: 'hibachi' with USDT_ADDRESS_ARBITRUM creates a mismatch that will likely break token resolution in DefiLlama's pipeline. Since the underlying token is explicitly an Arbitrum address, the pool should declare chain: 'arbitrum' to ensure proper token pricing and metadata.

Verify which stablecoin (USDT vs USDC) the Growi Alpha Vault actually accepts on Arbitrum and confirm the correct address is being used.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@src/adaptors/growihf/index.js` around lines 176 - 207, In
computeHibachiAPY7Day, change the pool chain field from the incorrect literal
'hibachi' to the actual settlement chain 'arbitrum' to match the underlying
token constant USDT_ADDRESS_ARBITRUM and ensure proper token resolution; update
the value assigned to the chain property in the returned object and as a
follow-up verify the vault accepts USDT on Arbitrum and that
USDT_ADDRESS_ARBITRUM is the correct token constant for this vault (switch to
USDC_ADDRESS_ARBITRUM if the vault uses USDC).


github-actions Bot commented May 5, 2026

The growihf adapter exports pools:

Test Suites: 1 passed, 1 total
Tests: 16 passed, 16 total
Snapshots: 0 total
Time: 0.247 s
Ran all test suites.

Nb of pools: 2
 

Sample pools:
┌─────────┬───────────────────────────────┬───────────────┬───────────┬────────┬────────────────┬───────────────────┬───────────────────┬──────────────────────────────────────────────────┬─────────────────────┬────────────────────────────┐
│ (index) │ pool                          │ chain         │ project   │ symbol │ tvlUsd         │ apy               │ apyBaseInception  │ underlyingTokens                                 │ poolMeta            │ url                        │
├─────────┼───────────────────────────────┼───────────────┼───────────┼────────┼────────────────┼───────────────────┼───────────────────┼──────────────────────────────────────────────────┼─────────────────────┼────────────────────────────┤
│ 0       │ 'growihf-vault-hyperliquid'   │ 'hyperliquid' │ 'growihf' │ 'USDC' │ 8023496.70594  │ 45.12804732348845 │ 64.25357673724133 │ [ '0xaf88d065e77c8cC2239327C5EDb3A432268e5831' ] │ 'Hyperliquid Vault' │ 'https://app.hf.growi.fi/' │
│ 1       │ 'growihf-alpha-vault-hibachi' │ 'hibachi'     │ 'growihf' │ 'USDT' │ 1554502.782698 │ 49.41958408669069 │ 24.29808441496386 │ [ '0xFd086bC7CD5C481DCC9C85ebE478A1C0b69FCbb9' ] │ 'Hibachi Vault'     │ 'https://app.hf.growi.fi/' │
└─────────┴───────────────────────────────┴───────────────┴───────────┴────────┴────────────────┴───────────────────┴───────────────────┴──────────────────────────────────────────────────┴─────────────────────┴────────────────────────────┘
