refactor: migrate to pnpm and enhance header component flexibility#38
Conversation
⚠️ Rate limit exceeded: further reviews are re-allowed after a brief timeout.
ℹ️ Review info: Configuration used: Organization UI · Review profile: CHILL · Plan: Pro · Run ID:
⛔ Files ignored due to path filters (4)
📒 Files selected for processing (52)
Walkthrough

This PR switches the project from npm to pnpm package management, introduces version generation and runtime environment validation, overhauls CLI dependency installation to use pnpm-driven builds with retry logic, adds PDF page numbering support, and expands component APIs with new sizing options, layout variants, and a new document context system.
🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed (1 warning)
…dge and Avatar

Introduce 'xxs' (extra extra small) as a new named size preset for Badge, Avatar, and AvatarGroup components. Refactor Badge sizing to use a named size map with consistent mm-based measurements for container padding, icon dimensions, and gaps. Add support for numeric font sizes (in pt) that interpolate layout dimensions from named presets, enabling custom typography while maintaining consistent spacing. Update ListTable to use the Badge component for tags instead of inline spans, improving consistency and maintainability. Replace the gray color palette with slate throughout components for better visual hierarchy. Add a resolveSizeVariant utility to safely resolve variant lookups with fallback behavior and console warnings for unknown values, preventing SSR crashes from typos. Update example documentation and package.json scripts to use pnpm instead of npm.

BREAKING CHANGE: Badge size type now includes 'xxs' and number; components using hardcoded 'xs' should be reviewed for appropriate sizing.
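The numeric-size interpolation this commit describes can be sketched as below. The preset names and mm values here are illustrative assumptions, not the component's actual size map.

```typescript
// Hypothetical presets: font size (pt) -> layout dimensions (mm).
// The values are made up for illustration.
const SIZE_PRESETS = [
  { font: 5, padding: 0.6, icon: 1.8 }, // 'xxs'
  { font: 7, padding: 0.9, icon: 2.4 }, // 'xs'
  { font: 9, padding: 1.2, icon: 3.0 }, // 'sm'
];

// Linearly interpolate padding/icon dimensions for an arbitrary numeric
// font size, clamping to the smallest/largest preset.
function interpolateSize(fontPt: number): { font: number; padding: number; icon: number } {
  if (fontPt <= SIZE_PRESETS[0].font) return { ...SIZE_PRESETS[0], font: fontPt };
  const last = SIZE_PRESETS[SIZE_PRESETS.length - 1];
  if (fontPt >= last.font) return { ...last, font: fontPt };
  for (let i = 0; i < SIZE_PRESETS.length - 1; i++) {
    const a = SIZE_PRESETS[i];
    const b = SIZE_PRESETS[i + 1];
    if (fontPt <= b.font) {
      const t = (fontPt - a.font) / (b.font - a.font);
      return {
        font: fontPt,
        padding: a.padding + t * (b.padding - a.padding),
        icon: a.icon + t * (b.icon - a.icon),
      };
    }
  }
  return last;
}
```

A 6pt badge would then sit halfway between the 'xxs' and 'xs' spacing, which is the "custom typography while maintaining consistent spacing" behavior the message describes.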
…eader component flexibility

- Replace npm with pnpm across all documentation, task definitions, and examples for improved dependency management
- Add logoClass and className props to FlanksourceHeader for flexible styling and sizing
- Refactor Header component to support custom className and style props with a merged utility function
- Remove hardcoded padding from CSS and move it to component-level DEFAULT_CLASSES for better control
- Update HeaderSolid example with detailed comments showing 30mm hero and 14mm compact header configurations
- Add Fira Code as primary monospace font in Tailwind config
- Update .gitignore to exclude .agent/ and .sandbox.yaml files

These changes improve component reusability and allow per-instance header customization without CSS modifications.
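A deliberately naive sketch of the kind of class-merging helper the commit describes: it keys each utility on its first dash-separated segment so the right-most caller value wins. This is not the component's actual `merged` implementation, and it is too coarse for real Tailwind (where `tailwind-merge` is the usual choice); it only illustrates deterministic last-wins overriding.

```typescript
// Naive last-wins class merge: a later utility replaces an earlier one that
// shares the same first segment ("px" for both "px-[5mm]" and "px-[10mm]").
// Too coarse for real Tailwind (e.g. "text-sm" vs "text-red-500" would
// collide), but it shows the deterministic override idea.
function mergeClasses(...parts: Array<string | undefined>): string {
  const byKey = new Map<string, string>();
  for (const part of parts) {
    if (!part) continue;
    for (const token of part.trim().split(/\s+/)) {
      byKey.set(token.split('-')[0], token);
    }
  }
  return [...byKey.values()].join(' ');
}
```

Usage: `mergeClasses('py-[1mm] px-[5mm]', 'px-[10mm]')` keeps the caller's `px-[10mm]` while preserving the default `py-[1mm]`.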
Actionable comments posted: 20
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
cli/src/server/routes.ts (1)
497-501: ⚠️ Potential issue | 🟠 Major

Silent fallback masks `pnpm` missing at runtime.

Same `pnpm exec` concern as `cli/src/generators/html.ts` line 114 (see comment there). Additionally, the catch on lines 499–501 only emits at `debug` level — if `pnpm` isn't on PATH on the server host, every render silently falls back to the (likely incomplete) Vite CSS without operators noticing. Consider logging at `warn` level here, or using a direct binary path as suggested in the root-cause comment.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/server/routes.ts` around lines 497-501: the catch currently swallows errors from the shell invocation of pnpm (`await $`cd ${facetRoot} && pnpm exec tailwindcss -i ${stylesInput} --content ${tempHtmlPath} -o ${outputCssPath}`.quiet()`) and only logs at debug level; change the catch to accept an error (e) and log a warning with the error details via logger.warn, including which command failed and the relevant variables (facetRoot, stylesInput, tempHtmlPath, outputCssPath), so missing pnpm on PATH is visible at runtime; additionally, consider using a direct binary path (or a resolved pnpm exec path) instead of relying on plain `pnpm` in the shell to avoid PATH issues.

.github/workflows/test.yml (1)
22-53: ⚠️ Potential issue | 🟠 Major

Workflow still uses npm despite the PR's stated pnpm migration.

The PR objective is "migrate to pnpm", but this workflow continues to:

- cache against `cli/package-lock.json` (line 27), which no longer exists (the repository now uses `cli/pnpm-lock.yaml`),
- run `npm install` and `cd cli && npm install` (lines 46–47),
- run `cd cli && npm run build:lib` (line 53).

Since the repository has migrated to pnpm (evidenced by `pnpm-lock.yaml` at root and cli, and `"packageManager": "pnpm@8.13.1"` in package.json), this workflow will fail: the cache step will not find the referenced npm lockfile, and `npm install` will not respect the pnpm lockfiles.

Switch the setup, cache, install, and build steps to pnpm (e.g., `pnpm/action-setup`, `cache: "pnpm"`, `pnpm install --frozen-lockfile`, `pnpm --filter cli build:lib`).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/test.yml around lines 22-53: the workflow still uses npm-specific steps and cache keys (actions/setup-node with cache-dependency-path referencing cli/package-lock.json, and the commands npm install, cd cli && npm install, cd cli && npm run build:lib) but the repo uses pnpm/pnpm-lock.yaml; update the job to use pnpm (install pnpm via pnpm/action-setup, change cache: "npm" to cache: "pnpm" and cache-dependency-path to refer to pnpm-lock.yaml files, replace npm install commands with pnpm install --frozen-lockfile (run from root only, or use pnpm --filter cli install where appropriate), and replace cd cli && npm run build:lib with pnpm --filter cli build:lib) so the CI respects pnpm lockfiles and caching.

examples/kitchen-sink/UberKitchenSink.tsx (1)
1-40: ⚠️ Potential issue | 🟡 Minor

Add `AIModelCard` to the CLI runtime types in `facet-types.ts`.

`AIModelCard` is properly exported from the package's public API (`src/components/index.tsx`), but it is missing from `cli/src/server/facet-types.ts`. Templates rendered through the CLI will lack TypeScript support for this component.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/kitchen-sink/UberKitchenSink.tsx` around lines 1-40: the CLI runtime type definitions are missing the AIModelCard component type; update the facet runtime types by adding AIModelCard to the exported component/type list in facet-types.ts (the same union/interface where other components like Badge, StatCard, and Theme are declared), ensure its TypeScript type matches the public export signature (as in src/components/index.tsx), and export it so templates rendered through the CLI receive proper typings.

cli/src/utils/pdf-multipass.ts (1)
499-566: ⚠️ Potential issue | 🟡 Minor

Silent skip when placeholders are present but `browser` or `html` is missing.

When `headerHasPlaceholders.get(key)` is true and either `browser` or `html` is `undefined`, the placeholder branch is skipped, `embedded` remains `undefined`, and the header (or footer) is not drawn at all for that page. Since `embeddedHeaders.set(key, …)` was also skipped at line 507 specifically because placeholders were detected, there is no pre-embedded fallback either.

Both arguments are optional in the new signature, so a caller that passes only `pageSizeMap` (or only one of `browser`/`html`) silently loses headers/footers on pages that were supposed to receive page numbers. At minimum, log a warning when this combination occurs so the regression isn't invisible; consider falling back to the unsubstituted overlay (`await embedOverlay(doc, buf)`) so static portions still render.

🛡️ Suggested guard

```diff
 for (const [key, buf] of overlays.headers) {
   const has = htmlHasTokens || bufferHasPlaceholders(buf);
   headerHasPlaceholders.set(key, has);
-  if (!has) embeddedHeaders.set(key, await embedOverlay(doc, buf));
+  if (!has || !browser || !html) embeddedHeaders.set(key, await embedOverlay(doc, buf));
 }
 for (const [key, buf] of overlays.footers) {
   const has = htmlHasTokens || bufferHasPlaceholders(buf);
   footerHasPlaceholders.set(key, has);
-  if (!has) embeddedFooters.set(key, await embedOverlay(doc, buf));
+  if (!has || !browser || !html) embeddedFooters.set(key, await embedOverlay(doc, buf));
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/pdf-multipass.ts` around lines 499-566: the code currently skips rendering headers/footers when headerHasPlaceholders.get(key) or footerHasPlaceholders.get(key) is true but browser or html are missing; update the branches in the header and footer handling (the blocks using headerHasPlaceholders/footerHasPlaceholders, embeddedHeaders/embeddedFooters, overlays.headers/overlays.footers, renderElementPdf, and embedOverlay) to detect this condition, emit a warning (so the regression is visible) and fall back to embedding the raw overlay (await embedOverlay(doc, buf)) so static parts still render; apply the same change for both the header and footer paths and ensure the fallback assigns embedded so page.drawPage still runs.
🧹 Nitpick comments (26)
README.md (1)
402-405: Redundant explicit `prepublishOnly` invocation.

`prepublishOnly` is a lifecycle script that npm/pnpm run automatically before `publish` — invoking it manually duplicates the build. Either drop the manual step (just document `npm publish` / `pnpm publish`) or update the comment to clarify it's only for testing the prepublish step in isolation.

Minor — pre-existing nit surfaced by this diff.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@README.md` around lines 402-405: the README currently shows an explicit pnpm run prepublishOnly followed by npm publish, which duplicates npm/pnpm's automatic running of the prepublishOnly lifecycle; update that snippet to either remove the manual prepublishOnly call and show just "npm publish" / "pnpm publish", or keep the manual command but change the comment to state it's only for testing the prepublishOnly step in isolation (reference the prepublishOnly lifecycle and the npm publish / pnpm publish commands in the change).

package.json (1)
62-62: `source-map-support` as a runtime dep — intentional for consumers of the library?

`@flanksource/facet` is published as a library (per `main`/`exports`). Listing `source-map-support` under `dependencies` means every downstream consumer of the React component library installs it, even though it's primarily useful at the Node CLI runtime. If it's only needed by the CLI, consider moving it to `cli/package.json`'s `dependencies` (and out of root) so library consumers don't pull it transitively.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` at line 62: the package.json currently lists "source-map-support" as a root dependency, which makes every consumer of the published library install it; move "source-map-support" out of the root "dependencies" and into the CLI package's dependencies instead (e.g., add it to cli/package.json dependencies and remove it from the root package.json) so only the CLI runtime pulls it; update any root code that imports it to either lazy-require from the CLI entrypoint or ensure the CLI package provides the import path.

src/components/PageNo.tsx (1)
9-18: Document the placeholder semantics — bare `_PG_`/`_TL_` will render in non-PDF contexts.

When this component is rendered in a browser/Storybook (i.e. anywhere outside the PDF multipass that substitutes `_PG_`/`_TL_`), users will see literal `_PG_ / _TL_` text. That's surprising without a doc comment. A short JSDoc on the component noting that the output is intended to be post-processed by `cli/src/utils/pdf-multipass.ts` would prevent confusion.

📝 Proposed JSDoc

```diff
+/**
+ * Renders a placeholder span whose text is later substituted with the actual
+ * page number / total during PDF post-processing (see cli/src/utils/pdf-multipass.ts).
+ * In non-PDF rendering contexts the raw `_PG_`/`_TL_` markers will be visible.
+ */
 export default function PageNo({
   format = '${page} / ${total}',
   className,
 }: PageNoProps) {
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/PageNo.tsx` around lines 9-18: add a short JSDoc comment above the PageNo component explaining that the format prop uses placeholders PAGE_PLACEHOLDER (`_PG_`) and TOTAL_PLACEHOLDER (`_TL_`) which are intended to be replaced later by the PDF multipass processor (see cli/src/utils/pdf-multipass.ts), and note that in non-PDF contexts (browser/Storybook) these raw placeholders will be rendered literally; reference the component name PageNo and the symbols PAGE_PLACEHOLDER and TOTAL_PLACEHOLDER so maintainers know where to look.

cli/src/utils/package-manager.test.ts (1)
59-59: Use the top-level `fs/promises` import instead of dynamic `import()`.

`fs/promises` is already statically imported at line 2; pulling in `mkdir` again via `await import('fs/promises')` inside each test adds noise without benefit.

♻️ Diff

```diff
-import { mkdtemp, rm, writeFile } from 'fs/promises';
+import { mkdir, mkdtemp, rm, writeFile } from 'fs/promises';
@@
-  const nested = join(workDir, 'a', 'b');
-  const { mkdir } = await import('fs/promises');
-  await mkdir(nested, { recursive: true });
+  const nested = join(workDir, 'a', 'b');
+  await mkdir(nested, { recursive: true });
@@
-  const nested = join(workDir, 'child');
-  const { mkdir } = await import('fs/promises');
-  await mkdir(nested);
+  const nested = join(workDir, 'child');
+  await mkdir(nested);
```

Also applies to: 71-71

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/package-manager.test.ts` at line 59: replace the dynamic await import of 'fs/promises' and its destructured mkdir with the top-level static import already present in this test module: remove occurrences of "const { mkdir } = await import('fs/promises')" and use the statically imported mkdir symbol directly in the tests (both places where the dynamic import is used). This keeps the code consistent with the existing import at the top and removes unnecessary dynamic imports.

cli/src/utils/package-manager.ts (1)
79-79: Nit: `?? raw` is unreachable — `String.prototype.split` always returns a non-empty array.

`raw.split('@')[0]` is `string` (never `undefined`), so the nullish fallback is dead. Either drop it, or simplify with `pkg.packageManager.split('@', 1)[0]`. Functionally fine either way.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/package-manager.ts` at line 79: the nullish fallback is unreachable when computing the variable name from raw.split; remove the dead fallback or simplify the split logic. Update the code that assigns name (the variable computed from raw.split) to either drop the "?? raw" fallback entirely or use a single-item split on the packageManager field (e.g., call split with a limit of 1 on pkg.packageManager) so the result is clearly a string without needing a fallback.

examples/kitchen-sink/package.json (1)
12-12: Consider `link:../../` for the local facet dependency.

`file:../../` works with pnpm but may copy the package at install time, requiring reinstallation to reflect changes in `src/`. With pnpm, `link:../../` creates a direct symlink and provides live updates during development, which is typically better for in-repo examples.

If you set up a `pnpm-workspace.yaml` at the repository root, `"@flanksource/facet": "workspace:*"` would be another option, but that requires additional monorepo configuration.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/kitchen-sink/package.json` at line 12: update the local dependency spec for "@flanksource/facet" in package.json to use a pnpm symlink instead of a file copy by replacing the value "file:../../" with "link:../../" (or use "workspace:*" if you add a pnpm-workspace.yaml); edit the dependency entry for "@flanksource/facet" so installs create a live symlink to the repo source rather than copying it.

src/components/Page.tsx (1)
14-24: `product` prop in PageProps is unused and should be removed or wired through to rendering.

The `product` prop is declared in the PageProps interface but never destructured or used in the Page component. However, multiple callers are already passing this prop (examples/kitchen-sink/TypographyTest.tsx, examples/kitchen-sink/UberKitchenSink.tsx, cli/src/server/playground-html.ts), suggesting it was intended to display in the page header. Either remove it from the public API to clarify the interface, or wire it through to the title/Header rendering logic.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/Page.tsx` around lines 14-24: the PageProps interface declares a product prop that is never used by the Page component; either remove it from PageProps or pass it into the header rendering — preferably wire it through: update the Page component to destructure product from its props (alongside title, titleClassName, etc.) and include it in the header/title rendering logic (e.g., render product next to or below title in the Header/title block, or pass product into the existing Header component if one is used) so callers like TypographyTest and UberKitchenSink see the product string; if you choose removal, delete product from PageProps and remove usages in callers.

src/components/CalloutBox.tsx (1)
60-60: Nit: helper name `resolveSizeVariant` reads oddly when applied to a non-size variant.

`CalloutBox.variant` is a semantic kind (info/warning/success/default), not a size. The lookup logic is identical, but naming the helper around "size" creates a small cognitive mismatch with every non-size caller (CalloutBox, and any others that key on color/theme/variant). Consider renaming to `resolveVariant` or `resolveByKey` in a follow-up; not worth blocking here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/CalloutBox.tsx` at line 60: the helper name resolveSizeVariant is misleading for semantic variants like CalloutBox.variant; rename the helper (and its exported identifier) to something generic such as resolveVariant or resolveByKey and update all usages (e.g., the call in CalloutBox that currently uses resolveSizeVariant, the helper definition, any imports/exports, and tests) so callers still pass the same args (variant, variantStyles, default, 'CalloutBox') but the function name reflects generic variant lookup; ensure types/signatures remain unchanged and update any references in other components to avoid breaking imports.

cli/package.json (2)
16-28: `gen-version` runs twice during `pnpm run build`; also consider pinning the package manager.

- `prebuild` runs `gen-version`, then `prebuild:cli` runs it again before `build:cli`. Not a correctness issue but redundant — drop one of them (typically `prebuild:cli` is sufficient since `build:lib` doesn't need the generated file, or rely solely on `prebuild` and remove `prebuild:cli`).
- Since the PR standardizes on pnpm, adding a top-level `"packageManager": "pnpm@<version>"` field makes the toolchain reproducible and lets Corepack pick it up automatically — otherwise contributors running `npm install` won't get a clear failure.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/package.json` around lines 16-28: remove the redundant double invocation of the gen-version script by deleting either the "prebuild:cli" script or the "prebuild" script so "gen-version" runs only once during the build (keep "prebuild:cli" if you want generation only before build:cli, or keep "prebuild" to run it before the whole build chain); and add a top-level "packageManager": "pnpm@<pin-version>" field to package.json to pin pnpm (replace <pin-version> with the project's chosen pnpm version) so Corepack and contributors use a consistent package manager.
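On the suggested `packageManager` pin: Corepack reads a `name@version` string from that field. A minimal parsing sketch (the helper name is invented; note that, as the earlier `split('@', 1)` nit observes, no fallback is needed for the name):

```typescript
// Parse a packageManager field like "pnpm@8.13.1" into name and version.
// indexOf starts at 1 so a hypothetical leading '@' in a scoped name
// would not be treated as the separator.
function parsePackageManager(field: string): { name: string; version: string | undefined } {
  const at = field.indexOf('@', 1);
  if (at === -1) return { name: field, version: undefined };
  return { name: field.slice(0, at), version: field.slice(at + 1) };
}
```

With `"packageManager": "pnpm@8.13.1"` this yields `pnpm` / `8.13.1`, which is what Corepack uses to activate the pinned tool and what lets `npm install` fail fast instead of silently ignoring the pnpm lockfile.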
70-84: Remove stale Jest tooling now that tests run on Bun.

`test` and `test:watch` have been migrated to `bun test`, and the test files use `bun:test` imports, but `jest`, `@types/jest`, and `ts-jest` are still listed in `devDependencies`. They're unreachable from any script and only add install weight and supply-chain surface. Drop them unless something outside this PR still depends on them.

♻️ Proposed cleanup

```diff
 "devDependencies": {
   "@types/bun": "^1.3.0",
-  "@types/jest": "^29.5.0",
   "@types/node": "^20.11.0",
   "@types/node-forge": "^1.3.14",
   "@types/semver": "^7.5.0",
   "@types/react": "^18.3.12",
   "@types/react-dom": "^18.3.1",
-  "jest": "^29.7.0",
   "node-forge": "^1.3.3",
   "pdfassembler": "^0.1.0-beta.8",
-  "ts-jest": "^29.1.0",
   "tsdown": "^0.16.0",
   "typescript": "^5.6.3"
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/package.json` around lines 70-84: the devDependencies still include stale Jest packages ("jest", "@types/jest", "ts-jest") even though tests were migrated to Bun; remove those three entries from the "devDependencies" object in package.json to eliminate unused install weight and supply-chain surface, and then verify no scripts or other files reference Jest symbols before committing (look for occurrences of "jest", "@types/jest", and "ts-jest" to confirm safe removal).

cli/src/utils/remote-resolver.test.ts (1)
229-242: Opt-in network test depends on a third-party package; consider pinning to a more reliable fixture.

`leftpad@0.0.1` is fetched live from the public registry. Even though this is gated behind `FACET_TEST_NPM=1`, using an unmaintained third-party package as a fixture leaves the test exposed to registry/availability flakiness, unpublishes, or supply-chain incidents. A tiny first-party fixture package (or a well-known stable package like `is-number`/`@flanksource/*`) would be a safer long-term choice.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/remote-resolver.test.ts` around lines 229-242: replace the external leftpad fixture in the itNetwork test by pointing npmRef to a small, pinned test package maintained by us (or a highly stable public package); update the npmRef object (repoUrl and ref fields) inside the itNetwork('resolveNpmRef installs a real package via pnpm' ...) test and adjust the expected templateFile and resolvedSha assertions to match the chosen fixture, then run the test to confirm resolveRemoteRef still writes the expected file under result.consumerRoot; specifically modify the npmRef declaration and the expect(...) checks that reference result.templateFile and result.resolvedSha so they align with the new fixture package and its file layout.

src/stories/components/AIModelCard.stories.tsx (1)
74-74: Minor: type `MultipleCards` as `Story` for consistency.

All other stories use `Story` (i.e. `StoryObj<typeof meta>`). Bare `StoryObj` here loses the `meta` linkage and the controls/argTypes inference, which is a small consistency wart even though `MultipleCards` only uses `render`.

♻️ Proposed change

```diff
-export const MultipleCards: StoryObj = {
+export const MultipleCards: Story = {
   render: () => (
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/stories/components/AIModelCard.stories.tsx` at line 74: the story export MultipleCards is currently typed as StoryObj which breaks linkage to meta and argTypes; change its type to Story (i.e. StoryObj<typeof meta> or the project's alias Story) to restore proper meta inference and consistency with other stories — locate the MultipleCards export in AIModelCard.stories.tsx and update its type annotation from StoryObj to Story (or StoryObj<typeof meta> if using the generic form).

src/components/utils/resolveSizeVariant.ts (1)
1-31: Optional: cap `warnedKeys` to avoid unbounded growth in long-running processes.

`warnedKeys` is a module-singleton `Set` that lives for the process lifetime. Under normal use this is fine (small, finite valid+invalid key space), but in long-running SSR/dev servers a misconfigured caller passing dynamically generated `size` values would accumulate entries indefinitely. A trivial size cap (or LRU) prevents that without changing the "warn once" UX.

♻️ Possible bound

```diff
 const warnedKeys = new Set<string>();
+const MAX_WARN_KEYS = 1024;
@@
-  if (!warnedKeys.has(key) && size != null) {
+  if (!warnedKeys.has(key) && size != null) {
+    if (warnedKeys.size >= MAX_WARN_KEYS) warnedKeys.clear();
     warnedKeys.add(key);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/utils/resolveSizeVariant.ts` around lines 1-31: warnedKeys is an unbounded module-level Set which can grow indefinitely; replace it with a bounded, insertion-ordered store (e.g., a Map used as an LRU FIFO) and evict the oldest entry when capacity is exceeded. Concretely, change the top-level warnedKeys to a Map<string, boolean> (or similar) with a MAX_WARNED_KEYS constant, update checks in resolveSizeVariant to use warnedKeys.has(key) and warnedKeys.set(key, true), and after adding a new key, if warnedKeys.size > MAX_WARNED_KEYS remove the oldest entry by iterating warnedKeys.keys().next().value and deleting it so the "warn once" behavior is preserved without unbounded memory growth.

src/components/Header.tsx (1)
17-21: Tailwind class merging: user `className` won't reliably override `DEFAULT_CLASSES` for overlapping utilities.

With Tailwind's JIT, utilities at the same layer have equal CSS specificity, so the last rule emitted in the generated stylesheet wins — not the last token in the className string. A consumer passing `className="px-[10mm]"` to override `px-[5mm]` from `DEFAULT_CLASSES` may or may not actually take effect depending on Tailwind's class ordering. Since the whole point of this refactor is per-instance customization, this can silently bite consumers.

Consider using `tailwind-merge` so the right-most caller-supplied utility wins deterministically:

♻️ Proposed fix using tailwind-merge

```diff
 import React from 'react';
+import { twMerge } from 'tailwind-merge';

 export type PageType = 'first' | 'default' | 'last';
 ...
 const DEFAULT_CLASSES = 'py-[1mm] px-[5mm]';

-function merged(...parts: Array<string | undefined>): string {
-  return parts.filter(Boolean).join(' ');
-}
+function merged(...parts: Array<string | undefined>): string {
+  return twMerge(parts.filter(Boolean).join(' '));
+}
```

Also applies to: 41-41

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/Header.tsx` around lines 17-21: DEFAULT_CLASSES and the merged(...) helper use naive string concat, so a consumer className like "px-[10mm]" may not override "px-[5mm]" reliably; replace the simple join with tailwind-merge to deterministically resolve conflicts. Import twMerge (or merge from 'tailwind-merge') and change merged to call twMerge(DEFAULT_CLASSES, ...parts) (and update any other usage at the other occurrence mentioned around the merged call), ensuring the caller-supplied utilities win when appropriate.

src/stories/components/DatasheetTemplate.stories.tsx (1)
26-53: Optional: extract the repeated margins literal.

The same `{ top: 10, bottom: 10, left: 10, right: 10 }` is repeated four times. Lifting it to a constant (e.g., `const MARGINS = { top: 10, bottom: 10, left: 10, right: 10 } as const;`) would make future tweaks a single-line change.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/stories/components/DatasheetTemplate.stories.tsx` around lines 26-53: extract the repeated margins object into a single constant (e.g., MARGINS) at the top of the story and replace every inline `{ top: 10, bottom: 10, left: 10, right: 10 }` with that constant in the Page usages; specifically update the Page props used in the default story and the MultiPage story so all four Page components reference MARGINS (you can declare it as `const MARGINS = { top: 10, bottom: 10, left: 10, right: 10 } as const`), ensuring all references to Page's margins prop use the new symbol.

cli/src/bundler/vite-builder.ts (2)
97-108: Regex classification looks correct, but consider unmatched-export cases.

The patterns correctly anchor on a non-`.`/non-`/` first character to exclude relative/absolute path failures. One pattern that occasionally surfaces from broken installs is `SyntaxError: The requested module '<pkg>' does not provide an export named ...`, which is also typically resolved by reinstalling (e.g., when a dep gets bumped under your feet). Worth adding if you see it in practice.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/bundler/vite-builder.ts` around lines 97-108: add detection for "does not provide an export named" errors to RUNTIME_REINSTALL_PATTERNS used by isMissingDepError: update the RUNTIME_REINSTALL_PATTERNS array to include a regex that matches messages like "SyntaxError: The requested module '<pkg>' does not provide an export named" (case-insensitive, allow single/double quotes and variable export names), so isMissingDepError(stderr) returns true for these broken-install export errors; ensure the new pattern follows the same non-relative/module-name anchoring style as the other entries and is pushed into the existing array constant RUNTIME_REINSTALL_PATTERNS referenced by isMissingDepError.
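If the unmatched-export case is adopted, a pattern in the same spirit as the existing ones might look like the sketch below. The constant name and exact regex are assumptions, not the file's actual contents:

```typescript
// Matches "The requested module '<pkg>' does not provide an export named ..."
// only for bare package specifiers: the first character inside the quotes
// must not be '.' or '/', which excludes relative/absolute path failures
// that a reinstall would not fix.
const MISSING_EXPORT_PATTERN =
  /The requested module ['"][^./'"][^'"]*['"] does not provide an export named/;
```

A bare specifier like `'some-pkg'` matches, while `'./local'` does not, mirroring the anchoring style the review describes for the other patterns.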
118-150: Loader stdout is discarded on failure.

`runLoaderWithRetry` only inspects `firstErr?.stderr` when classifying and surfacing errors. Bun's `$` populates both `stdout` and `stderr` on a non-zero exit, and Vite SSR sometimes prints the most useful frames (e.g., transformed source location) to stdout. If a non-missing-dep failure occurs, users will see only the stderr fragment in the wrapped `Vite SSR build failed:` message and may lose the original stack context.

♻️ Suggested change to include stdout in the surfaced error

```diff
-  } catch (firstErr: any) {
-    const stderr = firstErr?.stderr?.toString?.() || '';
-    if (!isMissingDepError(stderr)) {
-      const raw = stderr || (firstErr instanceof Error ? firstErr.message : String(firstErr));
-      throw new Error(`Vite SSR build failed:\n${deduplicateOutput(raw)}`);
-    }
+  } catch (firstErr: any) {
+    const stderr = firstErr?.stderr?.toString?.() || '';
+    const stdout = firstErr?.stdout?.toString?.() || '';
+    if (!isMissingDepError(stderr) && !isMissingDepError(stdout)) {
+      const raw = [stderr, stdout].filter(Boolean).join('\n')
+        || (firstErr instanceof Error ? firstErr.message : String(firstErr));
+      throw new Error(`Vite SSR build failed:\n${deduplicateOutput(raw)}`);
+    }
```

Apply the same to the second-attempt catch block.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/bundler/vite-builder.ts` around lines 118-150: the error handling in runLoaderWithRetry only inspects firstErr.stderr (and secondErr.stderr), so useful output printed to stdout by runLoaderOnce (via bun's $) is lost; change the classification and error construction to include both stdout and stderr (e.g., combine firstErr.stdout + firstErr.stderr into a single string before calling isMissingDepError and before passing to deduplicateOutput), and apply the same combination in the second catch block for secondErr so both attempts surface full output from runLoaderOnce.

cli/src/commands/doctor.ts (3)
352-355: `.npmrc` secret regex has redundant alternations and may false-positive on comments.

`_auth` matches inside `_authToken`, so the alternations partially overlap. More importantly, the pattern is unanchored, so a comment like `# remember to set _authToken later` triggers the path. Anchor to `^` after optional whitespace and split on `=`/`:` to reduce false positives:

♻️ Suggested tighter detection

```diff
- const hasSecret = /_authToken|_password|_auth|certfile|keyfile/.test(raw);
+ const hasSecret = raw.split(/\r?\n/).some(line => {
+   const t = line.replace(/^\s*[#;].*/, '').trim();
+   return /^([@\w./:-]+:)?(_authToken|_password|_auth|certfile|keyfile)\s*=/.test(t);
+ });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/commands/doctor.ts` around lines 352 - 355, The current unanchored regex in the hasSecret check (/ _authToken|_password|_auth|certfile|keyfile/) can false-positive on comments and has overlapping alternations; update the test to match credential keys only at the start of a line (allowing leading whitespace) and require a separator (= or :) so comments won't trigger it. Replace the /_authToken|_password|_auth|certfile|keyfile/.test(raw) usage with a multiline-anchored pattern like /^\s*(?:_authToken|_password|_auth|certfile|keyfile)\s*(?:=|:)/m.test(raw) (put longer tokens before shorter to avoid accidental partial matches) so hasSecret only returns true for actual key assignments in doctor.ts.
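For illustration, a standalone sketch of the anchored check; the function name `hasSecretLine` is hypothetical (the shipped code inlines this as `hasSecret` in doctor.ts):

```typescript
// Hypothetical helper demonstrating the anchored detection suggested above.
function hasSecretLine(raw: string): boolean {
  return raw.split(/\r?\n/).some(line => {
    const t = line.replace(/^\s*[#;].*/, '').trim(); // strip full-line comments
    return /^([@\w./:-]+:)?(_authToken|_password|_auth|certfile|keyfile)\s*=/.test(t);
  });
}

console.log(hasSecretLine('//registry.npmjs.org/:_authToken=${NPM_TOKEN}')); // true
console.log(hasSecretLine('# remember to set _authToken later'));            // false
```

A real key assignment still matches because the optional `([@\w./:-]+:)` group absorbs the scoped-registry prefix, while comment lines are emptied before the test.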
356-365: `.gitignore` parsing is line-exact and misses common patterns.

The check accepts only `.facet`, `.facet/`, `/.facet`, or `/.facet/`. A user who ignores via `**/.facet/`, or who relies on a parent `.gitignore`, will be flagged as leaking secrets even when the directory is ignored. Consider using a small gitignore parser (e.g., `ignore` from npm) or `git check-ignore`:

```diff
+import { execSync } from 'child_process';
+...
+  try {
+    execSync(`git -C ${JSON.stringify(consumerRoot)} check-ignore -q .facet/`, { stdio: 'ignore' });
+    ignored = true;
+  } catch { ignored = false; }
```

This delegates the check to git itself, which is the source of truth.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/commands/doctor.ts` around lines 356 - 365, The current .gitignore check in doctor.ts is too strict and only matches exact lines like '.facet' or '/.facet/'; replace this ad-hoc parsing by delegating to git or a proper gitignore parser: use 'git check-ignore --no-index --quiet --path' (or spawn git check-ignore) on the path join(consumerRoot, '.facet') to determine ignore status, or install and use the 'ignore' npm package to load the .gitignore contents and test the relative path; update the logic around gitignore, gitignore variable and ignored flag so ignored accurately reflects whether the directory is ignored (including parent .gitignore and patterns like '**/.facet/') and handle errors/fallbacks when git is not available.
196-203: Native bindings check may pass when an unrelated, wrong-arch binding is co-installed.

The loop iterates `CRITICAL_NATIVE_PACKAGES`, finds matching entries, then asserts `some(m => m.includes(expectedSuffix))`. If the package has multiple arch-specific subpackages installed (e.g., both `@rollup+rollup-darwin-arm64@…` and `@rollup+rollup-linux-x64@…` got hoisted in a transferred `node_modules`), `some(...)` returns true and the check passes, but the runtime may still load the foreign binary first and fail. Consider asserting that all matching-arch suffixes equal `expectedSuffix`, or that no foreign-arch suffix is present:

♻️ Tighter check

```diff
-  const archMatch = matches.some(m => m.includes(expectedSuffix));
-  if (!archMatch) missing.push(prefix);
+  const archSpecific = matches.filter(m => /-(darwin|linux|win32|freebsd)-(x64|arm64|ia32|arm)\b/.test(m));
+  const goodArch = archSpecific.some(m => m.includes(expectedSuffix));
+  const foreignArch = archSpecific.some(m => !m.includes(expectedSuffix));
+  if (!goodArch || foreignArch) missing.push(prefix);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/commands/doctor.ts` around lines 196 - 203, The current arch-check uses archMatch = matches.some(m => m.includes(expectedSuffix)) which can pass when a wrong-arch build is present alongside a correct one; change the logic around CRITICAL_NATIVE_PACKAGES/entries/matches/expectedSuffix so that you first collect arch-specific matches (e.g., matches.filter(m => /-\w+-\w+(@|$)/.test(m))) and then consider the package OK only if either there are no arch-specific matches or every arch-specific match ends with the expectedSuffix (i.e., archSpecific.every(m => m.includes(expectedSuffix))); if any arch-specific match has a different suffix, treat the package as missing and push to missing.

cli/src/builders/facet-directory.test.ts (1)
13-20: Minor: silence info/warn in tests for cleaner output.
`new Logger(false)` still emits `info`/`warn` per the comment. With ~15 tests exercising broken/foreign-install paths that intentionally trigger warning logs, output noise can swamp real failures. Consider a quieter test logger or capturing logs via `mock.module` if your test runner supports it. Not a blocker.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/builders/facet-directory.test.ts` around lines 13 - 20, The tests create a Logger in beforeEach with new Logger(false) which still prints info/warn; replace this with a test-silent logger by either (A) providing a dedicated test logger that implements the same API but no-ops info/warn (e.g., replace new Logger(false) with new QuietTestLogger() or createQuietLogger()) or (B) if your test framework supports mocking, stub Logger.prototype.info and Logger.prototype.warn to no-op before tests run (or spy and restore) so that the logger variable used in beforeEach emits only errors; update the beforeEach to instantiate or install that quiet logger so tests no longer print info/warn.

src/components/StatCard.tsx (1)
698-718: `summary` variant: long values can overflow the bordered card.

`label` is constrained with `whitespace-nowrap overflow-hidden text-ellipsis`, but the value `<div>` has no overflow handling. With a long `formattedValue` (e.g., `"$1,234,567,890.12"` or a long string), the parent's `min-w-*` will be exceeded and the card grows past its intended report-band width, defeating the "compact KPI" purpose. Consider mirroring the label clamping on the value too:

♻️ Optional clamp

```diff
-  <div className={`font-bold ${colorClasses.text} ${sizeClasses.summaryValueText} ${conditionalClasses || ''} ${valueClassName || ''}`.trim()}>
+  <div className={`font-bold ${colorClasses.text} ${sizeClasses.summaryValueText} truncate ${conditionalClasses || ''} ${valueClassName || ''}`.trim()}>
     {formattedValue}
   </div>
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/StatCard.tsx` around lines 698 - 718, The summary variant's value div (the element that renders formattedValue in the 'summary' branch of StatCard) lacks overflow/clamping and can force the card wider; update that div's class string to include the same truncation/overflow utilities used on the label (e.g., whitespace-nowrap overflow-hidden text-ellipsis or the Tailwind shorthand truncate) so long values are clipped instead of expanding the card, and ensure this change composes with valueClassName/conditionalClasses (avoid allowing those props to override the overflow behavior).

src/components/ListTable.tsx (1)
167-172: `diffDays` math is sensitive to DST transitions.

`(startOfToday.getTime() - dateStart.getTime()) / 86400000` doesn't account for the 23-/25-hour days at DST boundaries; `Math.floor` can land off-by-one for dates that straddle a transition, putting (e.g.) "exactly 7 days ago" into the 30-day week bucket on some days of the year. Consider computing the difference in calendar days directly:

```ts
const msPerDay = 86_400_000;
const utcToday = Date.UTC(now.getFullYear(), now.getMonth(), now.getDate());
const utcThen = Date.UTC(d.getFullYear(), d.getMonth(), d.getDate());
const diffDays = Math.round((utcToday - utcThen) / msPerDay);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/components/ListTable.tsx` around lines 167 - 172, The diffDays calculation in getTimeBucket is DST-sensitive and can produce off-by-one results; replace the current local-ms math using startOfToday/getTime with a UTC-based day difference: compute utcToday = Date.UTC(now.getFullYear(), now.getMonth(), now.getDate()) and utcThen = Date.UTC(d.getFullYear(), d.getMonth(), d.getDate()), set msPerDay = 86400000 and compute diffDays = Math.round((utcToday - utcThen) / msPerDay); update the code around the diffDays variable in getTimeBucket to use these UTC values to ensure correct calendar-day differences across DST transitions.

examples/kitchen-sink/FindingsReport.tsx (1)
421-432: Default metadata is a fallback for missing `data.metadata`, not for an empty object.

`data.metadata ? data.metadata as ReportMetadata : {…defaults}` substitutes the defaults only when `data.metadata` is falsy. If a caller passes an explicit `metadata: {}` (empty but truthy) the report will render with no reporting window, classification, etc., and the footer/header strings will collapse to bare defaults like `"Internal"` and the current date. Probably fine for an example, but worth being aware of if this file is the template consumers copy from.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/kitchen-sink/FindingsReport.tsx` around lines 421 - 432, The metadata fallback currently treats any truthy data.metadata (including an empty object) as authoritative; update the logic in FindingsReport to merge defaults with any provided metadata so missing fields are filled and an empty object doesn't wipe defaults. Locate the metadata variable in FindingsReport and replace the ternary check on data.metadata with a merge pattern that defines a defaults object (reportingWindow, environment, classification, audience, preparedBy, scope) and then spreads ...(data.metadata as Partial<ReportMetadata> || {}) on top, so callers can override specific fields but empty or partial metadata still inherits sensible defaults.

cli/src/utils/pdf-multipass.ts (2)
434-477: `replaceInPdfStreams` is unused and should be removed.

This function is defined but never called anywhere in the codebase. The compositing path (lines 528-565) uses a different approach (regenerating the overlay via Puppeteer rather than rewriting PDF streams), making this dead code from an earlier design. Retaining it creates maintenance burden and divergent placeholder logic between this and `replaceInBuffer`.

Remove the function to clean up the codebase.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/pdf-multipass.ts` around lines 434 - 477, The function replaceInPdfStreams is dead code (defined but never used) and should be removed: delete the entire async function replaceInPdfStreams(buf: Buffer, replacements: [string, string][]) and any related unused imports or constants that only support it (e.g., PDFDocument/PDFRawStream/PDFName/zlib references if they become unused after removal), keeping replaceInBuffer and the existing Puppeteer-based compositing logic as the canonical implementation.
405-419: `bufferHasPlaceholders` may inflate on every literal `'x'` byte.

`Buffer.from([0x78])` matches the byte `0x78`, which is also the ASCII code for `'x'` and appears throughout typical PDF content streams (text, names, hex strings). For each occurrence the function attempts `zlib.inflateSync(buf.subarray(offset))`, which throws and is caught, but the throw path through Node's zlib still does meaningful work and can dominate runtime on multi-MB overlay PDFs.

Two cheaper options:

- Match the actual zlib magic more strictly (e.g., `0x78 0x01`, `0x78 0x5E`, `0x78 0x9C`, `0x78 0xDA`) before calling `inflateSync`.
- Iterate `doc.context.enumerateIndirectObjects()` and only inflate `PDFRawStream` entries whose `Filter` is `/FlateDecode` (the same approach `replaceInPdfStreams` already uses at lines 437-444).

Optional given that overlays are typically small, but worth noting if this ends up running on large content buffers.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cli/src/utils/pdf-multipass.ts` around lines 405 - 419, bufferHasPlaceholders currently tries to inflate on every 0x78 byte which causes expensive throw paths; change the zlib detection to first check two-byte sequences (0x78 followed by one of 0x01,0x5E,0x9C,0xDA) before calling zlib.inflateSync on buf.subarray(offset), so only likely zlib streams are inflated, and keep checks for PAGE_MARKER, TOTAL_MARKER, PAGE_MARKER_HEX, TOTAL_MARKER_HEX as before; alternatively (if you prefer stream-aware scanning) restrict inflation to actual PDF stream objects by iterating doc.context.enumerateIndirectObjects() and inflating only PDFRawStream entries with Filter /FlateDecode (same approach used in replaceInPdfStreams).
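The first option can be sketched in a few lines; the names here (`looksLikeZlibStream`, `ZLIB_SECOND_BYTES`) are illustrative, and the listed second bytes are the valid CMF/FLG pairs zlib emits for its standard compression levels:

```typescript
import zlib from 'node:zlib';

// Only attempt inflation when the two-byte header matches a known zlib
// CMF/FLG pair, instead of on every literal 0x78 ('x') byte.
const ZLIB_SECOND_BYTES = new Set([0x01, 0x5e, 0x9c, 0xda]);

function looksLikeZlibStream(buf: Buffer, offset: number): boolean {
  return (
    buf[offset] === 0x78 &&
    offset + 1 < buf.length &&
    ZLIB_SECOND_BYTES.has(buf[offset + 1])
  );
}

const deflated = zlib.deflateSync(Buffer.from('hello')); // default level emits 0x78 0x9c
const literalX = Buffer.from('xylophone');               // starts with a literal 'x' (0x78)
console.log(looksLikeZlibStream(deflated, 0)); // true
console.log(looksLikeZlibStream(literalX, 0)); // false
```

This keeps the expensive `inflateSync` attempts to byte pairs that can actually begin a FlateDecode stream.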
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 43dc156a-5928-4282-abaa-23e8ef677731
⛔ Files ignored due to path filters (4)
- `cli/package-lock.json` is excluded by `!**/package-lock.json`
- `cli/pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
- `package-lock.json` is excluded by `!**/package-lock.json`
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (70)
- .github/workflows/test.yml
- .gitignore
- .storybook/main.ts
- README.md
- Taskfile.yml
- cli/ARCHITECTURE.md
- cli/README.md
- cli/STATUS.md
- cli/package.json
- cli/scripts/gen-version.cjs
- cli/src/builders/facet-directory.test.ts
- cli/src/builders/facet-directory.ts
- cli/src/bundler/vite-builder.test.ts
- cli/src/bundler/vite-builder.ts
- cli/src/cli.ts
- cli/src/commands/doctor.ts
- cli/src/generators/html.ts
- cli/src/server/facet-cache.ts
- cli/src/server/facet-types.ts
- cli/src/server/routes.ts
- cli/src/utils/package-manager.test.ts
- cli/src/utils/package-manager.ts
- cli/src/utils/pdf-generator.ts
- cli/src/utils/pdf-multipass.ts
- cli/src/utils/remote-resolver.test.ts
- cli/src/utils/remote-resolver.ts
- cli/src/version-generated.ts
- cli/test/page-number.test.ts
- cli/tsconfig.json
- cli/tsdown.config.ts
- cli/vite-ssr-loader.ts
- examples/kitchen-sink/BadgeExamples.tsx
- examples/kitchen-sink/FindingsReport.tsx
- examples/kitchen-sink/FlanksourceHeader.tsx
- examples/kitchen-sink/HeaderSolid.tsx
- examples/kitchen-sink/ListTableExample.tsx
- examples/kitchen-sink/UberKitchenSink.tsx
- examples/kitchen-sink/package.json
- examples/kitchen-sink/sample-findings.json
- examples/quickstart/README.md
- package.json
- src/components/AIModelCard.tsx
- src/components/Avatar.tsx
- src/components/AvatarGroup.tsx
- src/components/Badge.tsx
- src/components/BulletList.tsx
- src/components/CalloutBox.tsx
- src/components/DatasheetTemplate.tsx
- src/components/Document.tsx
- src/components/Finding.tsx
- src/components/Header.tsx
- src/components/Heatmap.tsx
- src/components/ListTable.tsx
- src/components/Page.tsx
- src/components/PageNo.tsx
- src/components/ProgressBar.tsx
- src/components/ScoreGauge.tsx
- src/components/SeverityStatCard.tsx
- src/components/Shield/Shield.tsx
- src/components/StatCard.test.tsx
- src/components/StatCard.tsx
- src/components/Status.tsx
- src/components/index.tsx
- src/components/utils/resolveSizeVariant.ts
- src/stories/components/AIModelCard.stories.tsx
- src/stories/components/Badge.stories.tsx
- src/stories/components/DatasheetTemplate.stories.tsx
- src/stories/components/StatCard.stories.tsx
- src/styles.css
- tailwind.config.js
💤 Files with no reviewable changes (6)
- cli/src/version-generated.ts
- cli/STATUS.md
- src/components/Heatmap.tsx
- src/styles.css
- cli/src/server/facet-cache.ts
- cli/ARCHITECTURE.md
```js
let sha = process.env.GIT_COMMIT || 'unknown';
let dirty = false;
try {
  sha = execSync('git rev-parse --short HEAD', { encoding: 'utf8' }).trim();
  dirty = execSync('git status --porcelain', { encoding: 'utf8' }).trim().length > 0;
} catch {}

const commit = sha + (dirty ? '-dirty' : '');
const buildDate = new Date().toISOString().replace('T', ' ').slice(0, 16);

const out = `// DO NOT MODIFY — this file is auto-generated by scripts/gen-version.js
```
Suppress git stderr and fix the header filename.
execSync inherits stderr by default, so when this script runs outside a git working tree (e.g. installed-from-tarball, shallow CI checkouts) git emits fatal: not a git repository … to stderr before the catch {} swallows the throw, polluting build logs. Pipe stderr to ignore so the fallback path is silent.
Also, the auto-generated header refers to scripts/gen-version.js, but the file is gen-version.cjs.
♻️ Proposed fix
```diff
 try {
-  sha = execSync('git rev-parse --short HEAD', { encoding: 'utf8' }).trim();
-  dirty = execSync('git status --porcelain', { encoding: 'utf8' }).trim().length > 0;
+  const opts = { encoding: 'utf8', stdio: ['ignore', 'pipe', 'ignore'] };
+  sha = execSync('git rev-parse --short HEAD', opts).trim();
+  dirty = execSync('git status --porcelain', opts).trim().length > 0;
 } catch {}
@@
-const out = `// DO NOT MODIFY — this file is auto-generated by scripts/gen-version.js
+const out = `// DO NOT MODIFY — this file is auto-generated by scripts/gen-version.cjs
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
 let sha = process.env.GIT_COMMIT || 'unknown';
 let dirty = false;
 try {
-  sha = execSync('git rev-parse --short HEAD', { encoding: 'utf8' }).trim();
-  dirty = execSync('git status --porcelain', { encoding: 'utf8' }).trim().length > 0;
+  const opts = { encoding: 'utf8', stdio: ['ignore', 'pipe', 'ignore'] };
+  sha = execSync('git rev-parse --short HEAD', opts).trim();
+  dirty = execSync('git status --porcelain', opts).trim().length > 0;
 } catch {}
 const commit = sha + (dirty ? '-dirty' : '');
 const buildDate = new Date().toISOString().replace('T', ' ').slice(0, 16);
-const out = `// DO NOT MODIFY — this file is auto-generated by scripts/gen-version.js
+const out = `// DO NOT MODIFY — this file is auto-generated by scripts/gen-version.cjs
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cli/scripts/gen-version.cjs` around lines 8 - 18, Suppress git stderr by
calling execSync with options that ignore stderr (e.g., include a stdio array
where stderr is 'ignore') for the two execSync invocations that set sha and
dirty, so git errors don't leak into build logs, and update the auto-generated
header string in the out variable to reference "scripts/gen-version.cjs" instead
of "scripts/gen-version.js"; locate the execSync calls and the out template via
the symbols execSync, sha, dirty, commit, buildDate, and out and apply these two
changes.
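The suggested `stdio` shape can be observed in isolation with a child process that writes to both streams; the `node -e` command below is just a stand-in for the git invocations:

```typescript
import { execSync } from 'node:child_process';

// stdin ignored, stdout captured, stderr discarded: the child's stderr
// noise never reaches the parent's logs, while stdout is still returned.
const out = execSync(
  `node -e "console.error('fatal: noise'); process.stdout.write('ok')"`,
  { encoding: 'utf8', stdio: ['ignore', 'pipe', 'ignore'] },
);
console.log(out); // "ok"
```

With `stdio: ['ignore', 'pipe', 'ignore']`, a failing `git rev-parse` outside a repository would throw into the `catch {}` silently instead of printing `fatal: not a git repository` to the build log.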
```ts
// Resolve any pnpm/npm local-path protocol (file:, link:, portal:, workspace:)
// to an absolute form anchored at `pkgDir`, so the reference survives being
// rewritten into `.facet/package.json` which lives at a different depth.
function resolveLocalProtocol(version: string, pkgDir: string): string {
  for (const proto of ['file:', 'link:', 'portal:']) {
    if (version.startsWith(proto)) {
      return proto + resolve(pkgDir, version.slice(proto.length));
    }
  }
  return version;
}
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Are there tests/examples exercising workspace: deps in .facet/?
rg -nP --type=ts --type=json -C2 '"workspace:'

Repository: flanksource/facet
Length of output: 43
🏁 Script executed:
#!/bin/bash
# Find the resolveLocalProtocol function and its tests
fd -e ts -e js | xargs rg -l "resolveLocalProtocol" 2>/dev/null

Repository: flanksource/facet
Length of output: 97
🏁 Script executed:
#!/bin/bash
# Search for workspace references (case-insensitive, broader context)
rg -i "workspace" --type=ts --type=js -C 2 | head -100

Repository: flanksource/facet
Length of output: 4264
🏁 Script executed:
#!/bin/bash
# Check if there are any tests in the test directory for facet-directory.ts
fd -path "*/test*/*facet-directory*" -o -path "*/spec*/*facet-directory*"

Repository: flanksource/facet
Length of output: 230
🏁 Script executed:
#!/bin/bash
# Look at the full file to understand context and usage
wc -l cli/src/builders/facet-directory.ts

Repository: flanksource/facet
Length of output: 101
🏁 Script executed:
#!/bin/bash
# Look at doctor.ts context around workspace: check
sed -n '1,200p' cli/src/commands/doctor.ts | head -100

Repository: flanksource/facet
Length of output: 3136
🏁 Script executed:
#!/bin/bash
# Find the exact line with workspace: in doctor.ts and show context
rg -n "workspace:" cli/src/commands/doctor.ts -B 3 -A 3

Repository: flanksource/facet
Length of output: 385
🏁 Script executed:
#!/bin/bash
# Look at resolveLocalProtocol usage in facet-directory.ts
rg -n "resolveLocalProtocol" cli/src/builders/facet-directory.ts -B 2 -A 5

Repository: flanksource/facet
Length of output: 818
🏁 Script executed:
#!/bin/bash
# Check how resolveOverrideValues (which calls resolveLocalProtocol) is used
rg -n "resolveOverrideValues\|resolveLocalProtocol" cli/src/builders/facet-directory.ts -B 3 -A 3

Repository: flanksource/facet
Length of output: 43
🏁 Script executed:
#!/bin/bash
# Look at the full context of where the protocol resolution happens
sed -n '50,100p' cli/src/builders/facet-directory.ts

Repository: flanksource/facet
Length of output: 2275
🌐 Web query:
pnpm workspace: protocol resolution absolute path
💡 Result:
In pnpm workspaces, the "workspace:" protocol resolves exclusively to local workspace packages matching the specified range (e.g., "workspace:" resolves to the available workspace package version). pnpm first parses the workspace specifier using workspacePrefToNpm (converting "workspace:" to ""), then uses @pnpm/resolve-workspace-range to select a matching version from workspacePackages (a map of package names to available versions in the workspace, found via pnpm-workspace.yaml). If no match, installation fails. The resolution produces a ResolveResult with resolvedVia: "workspace".

For linking:
- Default (linkWorkspacePackages=true): creates symlinks via the "link:" protocol with relative paths from the project dir to the workspace package dir (e.g., "link:../sibling-pkg").
- With injectWorkspacePackages=true: uses the "file:" protocol with relative paths from the lockfile dir (workspace root) to the package dir, enabling hardlinks.

In pnpm-lock.yaml:
- The importers section retains the original specifier (e.g., "workspace:").
- The packages section uses a normalized ID like "link:../pkg" or "file:packages/pkg" with resolution: {type: "link"|"directory", directory: "relative/path/to/pkg"}.

Absolute paths appear in some cases (e.g., file: overrides) but are bugs being fixed; workspace: uses relative paths. Relative paths ensure portability across machines. For publishing, workspace: specs convert to semver versions.
Citations:
- 1: https://pnpm.io/workspaces
- 2: https://pnpm.io/next/workspaces
- 3: https://github.com/pnpm/pnpm/blob/2b14c742/resolving/resolver-base/src/index.ts
- 4: https://github.com/pnpm/pnpm/blob/2b14c742/resolving/npm-resolver/src/index.ts
- 5: https://registry.npmjs.org/%40pnpm%2Fresolve-workspace-range
- 6: injectWorkspacePackages does not work with relative path (workspace:../package) pnpm/pnpm#10446
🏁 Script executed:
#!/bin/bash
# Check if there's any other workspace: handling in the codebase
rg "workspace:" cli/src/ -B 2 -A 2

Repository: flanksource/facet
Length of output: 1286
workspace: protocol is referenced in the comment but not handled.
The leading comment lists workspace: as one of the protocols this helper resolves, but the loop only iterates ['file:', 'link:', 'portal:']. Furthermore, workspace: references are inherently workspace-contextual (resolved via pnpm-workspace.yaml to symlinks or semver ranges) and cannot be converted to absolute paths like file:, link:, and portal: can. If a consumer specifies workspace:* in pnpm.overrides, it will be passed unchanged into .facet/package.json, where pnpm — running with --ignore-workspace and without workspace context — will fail to resolve it.
Either:

- Update the comment to remove `workspace:` from the list of supported protocols, or
- Detect and reject `workspace:` references with a clear error message explaining why they cannot be used with facet overrides.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cli/src/builders/facet-directory.ts` around lines 58 - 68, The comment
incorrectly claims workspace: is handled by resolveLocalProtocol but it isn't;
update resolveLocalProtocol to explicitly detect values starting with
"workspace:" and throw a clear, user-facing error explaining workspace:
references cannot be converted to absolute paths and cannot be used in facet
overrides (so consumers must use a file:/link:/portal: or a proper version), and
also update the leading comment to either remove workspace: from the supported
list or note that workspace: will be rejected; locate the logic in the
resolveLocalProtocol function and add the workspace: check before other
protocols so the function fails fast with an explanatory message.
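The fail-fast shape recommended above can be sketched as follows; the `Strict` suffix and the exact error wording are illustrative, not the shipped implementation:

```typescript
import { resolve } from 'node:path';

// Sketch: reject workspace: before attempting any path-protocol rewrite.
function resolveLocalProtocolStrict(version: string, pkgDir: string): string {
  if (version.startsWith('workspace:')) {
    // workspace: specs are resolved via pnpm-workspace.yaml and cannot be
    // rewritten to an absolute path for .facet/package.json.
    throw new Error(
      `workspace: references cannot be used in facet overrides (got "${version}"); ` +
      'use file:/link:/portal: or a concrete version instead',
    );
  }
  for (const proto of ['file:', 'link:', 'portal:']) {
    if (version.startsWith(proto)) {
      return proto + resolve(pkgDir, version.slice(proto.length));
    }
  }
  return version; // plain semver ranges pass through untouched
}
```

Failing fast here surfaces a clear message at build time instead of an opaque pnpm resolution error inside `.facet/` running with `--ignore-workspace`.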
```ts
const unsetVar = /\$\{([A-Z_][A-Z0-9_]*)\}/g;
let skip = false;
for (const match of line.matchAll(unsetVar)) {
  const envName = match[1];
  if (!process.env[envName]) {
    this.logger.warn(`Skipping .npmrc line referencing unset \${${envName}}: ${key}`);
    skip = true;
    break;
  }
}
```
Unset-variable regex only matches uppercase env-var names.
/\$\{([A-Z_][A-Z0-9_]*)\}/g won't match lowercase or mixed-case references like ${authtoken} or ${myToken}. Such lines are kept verbatim (with the literal ${authtoken}), which pnpm may or may not expand depending on its own rules. npm/pnpm .npmrc env interpolation is case-sensitive but doesn't restrict to uppercase, so silently keeping unresolved references can produce harder-to-debug auth failures than the warn-and-skip path.
Suggest broadening the character class:
```diff
- const unsetVar = /\$\{([A-Z_][A-Z0-9_]*)\}/g;
+ const unsetVar = /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g;
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
-const unsetVar = /\$\{([A-Z_][A-Z0-9_]*)\}/g;
+const unsetVar = /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g;
 let skip = false;
 for (const match of line.matchAll(unsetVar)) {
   const envName = match[1];
   if (!process.env[envName]) {
     this.logger.warn(`Skipping .npmrc line referencing unset \${${envName}}: ${key}`);
     skip = true;
     break;
   }
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cli/src/builders/facet-directory.ts` around lines 599 - 608, The
unset-variable regex only matches uppercase names; update the pattern used in
unsetVar (and the loop using line.matchAll and envName) to accept
lowercase/mixed-case env var names too (e.g. change /\$\{([A-Z_][A-Z0-9_]*)\}/g
to /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g) so the check correctly warns-and-skips
lines that reference any valid-cased env var, leaving the rest of the logic
(this.logger.warn message and setting skip = true) unchanged.
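The behavioral difference is easy to see in isolation; `referencedVars` is a throwaway helper for this demo, not part of the codebase:

```typescript
// Broadened pattern: now flags lowercase / mixed-case references such as
// ${myToken}, which the uppercase-only class silently passed through.
const unsetVar = /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g;

function referencedVars(line: string): string[] {
  return [...line.matchAll(unsetVar)].map(m => m[1]);
}

console.log(referencedVars('//registry/:_authToken=${myToken} ${NPM_TOKEN}'));
// → [ 'myToken', 'NPM_TOKEN' ]
```

With the old `[A-Z_][A-Z0-9_]*` class, only `NPM_TOKEN` would have been detected, so a line referencing an unset `${myToken}` would be kept verbatim instead of warned about and skipped.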
```ts
// doctor command — environment preflight
program
  .command('doctor')
  .description('Check environment health (Node, arch, pnpm, Chromium, native bindings, FACET_PACKAGE_PATH, .facet/.npmrc leakage)')
  .option('--json', 'Emit machine-readable JSON instead of human output')
  .option('-v, --verbose', 'Enable verbose logging')
  .action(async (options: any) => {
    const { runDoctor } = await import('./commands/doctor.js');
    const exitCode = await runDoctor({
      consumerRoot: process.cwd(),
      verbose: !!options.verbose,
      json: !!options.json,
    });
    process.exit(exitCode);
  });
```
Inconsistent error handling vs. sibling commands.
Every other action handler in this file (html, pdf, serve, lint) wraps its body in try/catch, formats failures via Logger, optionally prints the stack on --verbose, and exits with 1. The new doctor action does none of this — a thrown error from import('./commands/doctor.js') or a bug inside runDoctor will surface as an unhandled rejection with a raw Node stack trace, which is jarring UX and harder to script against (no consistent stderr format, no JSON envelope when --json is set).
♻️ Suggested change
```diff
   .action(async (options: any) => {
-    const { runDoctor } = await import('./commands/doctor.js');
-    const exitCode = await runDoctor({
-      consumerRoot: process.cwd(),
-      verbose: !!options.verbose,
-      json: !!options.json,
-    });
-    process.exit(exitCode);
+    const logger = new Logger(options.verbose);
+    try {
+      const { runDoctor } = await import('./commands/doctor.js');
+      const exitCode = await runDoctor({
+        consumerRoot: process.cwd(),
+        verbose: !!options.verbose,
+        json: !!options.json,
+      });
+      process.exit(exitCode);
+    } catch (error) {
+      logger.error(`Doctor failed: ${error instanceof Error ? error.message : String(error)}`);
+      if (options.verbose && error instanceof Error && error.stack) {
+        logger.debug(error.stack);
+      }
+      process.exit(1);
+    }
   });
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```diff
 // doctor command — environment preflight
 program
   .command('doctor')
   .description('Check environment health (Node, arch, pnpm, Chromium, native bindings, FACET_PACKAGE_PATH, .facet/.npmrc leakage)')
   .option('--json', 'Emit machine-readable JSON instead of human output')
   .option('-v, --verbose', 'Enable verbose logging')
   .action(async (options: any) => {
-    const { runDoctor } = await import('./commands/doctor.js');
-    const exitCode = await runDoctor({
-      consumerRoot: process.cwd(),
-      verbose: !!options.verbose,
-      json: !!options.json,
-    });
-    process.exit(exitCode);
+    const logger = new Logger(options.verbose);
+    try {
+      const { runDoctor } = await import('./commands/doctor.js');
+      const exitCode = await runDoctor({
+        consumerRoot: process.cwd(),
+        verbose: !!options.verbose,
+        json: !!options.json,
+      });
+      process.exit(exitCode);
+    } catch (error) {
+      logger.error(`Doctor failed: ${error instanceof Error ? error.message : String(error)}`);
+      if (options.verbose && error instanceof Error && error.stack) {
+        logger.debug(error.stack);
+      }
+      process.exit(1);
+    }
   });
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cli/src/cli.ts` around lines 266 - 280, The doctor command handler currently
awaits import('./commands/doctor.js') and calls runDoctor(...) without error
handling; wrap the entire .action(async (options) => { ... }) body in a
try/catch like the other handlers (html/pdf/serve/lint): catch any error from
the dynamic import or runDoctor, use Logger (the same logger used by siblings)
to format the failure to stderr, emit a JSON error envelope when options.json is
true, and when options.verbose is set include the error.stack; finally ensure
process.exit(1) is called on error, otherwise exit with the code returned by
runDoctor.
| try { | ||
| // Run tailwindcss CLI: -i input.css --content "output.html" -o output.css | ||
| const tailwindCmd = $`cd ${facetRoot} && npx tailwindcss -i ${stylesInput} --content ${tempHtmlPath} -o ${outputCssPath}`; | ||
| const tailwindCmd = $`cd ${facetRoot} && pnpm exec tailwindcss -i ${stylesInput} --content ${tempHtmlPath} -o ${outputCssPath}`; |
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Confirm tailwindcss is installed into .facet/node_modules during setup, so a direct binary path is viable
rg -nP --type=ts -C3 'tailwindcss' cli/src
fd -t f 'package.json' cli | head -20 | xargs -I{} sh -c 'echo "=== {} ==="; cat "{}"'

Repository: flanksource/facet
🏁 Script executed:
# Search for where .facet directory is created and used
rg -n '\.facet' cli/src --type ts -B 2 -A 2
🏁 Script executed:
# Find where dependencies (including tailwindcss) are installed during facet setup
rg -n 'pnpm.*install|npm.*install' cli/src --type ts -B 3 -A 3 | head -60
🏁 Script executed:
# Check the facet-directory.ts file more thoroughly for the full context of the dependency list
sed -n '400,430p' cli/src/builders/facet-directory.ts
🏁 Script executed:
# Check README for pnpm documentation
find . -name "README*" -o -name "readme*" | head -5 | xargs cat 2>/dev/null | grep -i pnpm -B 3 -A 3 | head -40
🏁 Script executed:
# Check if pnpm is validated/checked before execution
rg -n 'resolvePackageManager|checkPnpm|pnpm.*--version' cli/src --type ts -B 2 -A 5 | head -80
🏁 Script executed:
# Verify the exact locations and context of the pnpm exec calls
sed -n '110,120p' cli/src/generators/html.ts
sed -n '490,505p' cli/src/server/routes.ts
Document pnpm as a runtime prerequisite and consider using the resolved tailwindcss binary instead of pnpm exec.
The README offers `npm install -g @flanksource/facet` as an alternative installation method without noting that pnpm must also be available at runtime. While the code enforces pnpm as a hard requirement during `.facet/` setup (resolved in `vite-builder.ts`), users installing via npm won't be aware of this runtime dependency before attempting HTML/PDF generation.
Additionally, since tailwindcss is already installed into .facet/node_modules/.bin/ during the pnpm install step, you could invoke the binary directly instead of relying on pnpm exec:
const tailwindBin = join(facetRoot, 'node_modules/.bin/tailwindcss');
const tailwindCmd = $`${tailwindBin} -i ${stylesInput} --content ${tempHtmlPath} -o ${outputCssPath}`;

This eliminates the implicit assumption that pnpm is available in PATH during CSS generation and would work regardless of which package manager was used to install the CLI.
Add a note to the README install section clarifying that pnpm must be present at runtime, or apply the direct binary path approach to both cli/src/generators/html.ts (line 114) and cli/src/server/routes.ts (line 497).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cli/src/generators/html.ts` at line 114, The current CSS generation uses
`pnpm exec` which forces a runtime pnpm requirement; update
`cli/src/generators/html.ts` to resolve and call the installed tailwind binary
directly by constructing `tailwindBin` from `facetRoot` (e.g., join `facetRoot`
with `node_modules/.bin/tailwindcss`) and replace the `tailwindCmd` invocation
that currently references `stylesInput`, `tempHtmlPath`, and `outputCssPath` to
call that binary; apply the same change in `cli/src/server/routes.ts` (the
analogous tailwind invocation around line ~497) and also add a short note in the
README install section documenting that pnpm is required at runtime if you
choose not to call the local binary.
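If the direct-binary route is taken, the resolution might be sketched like this (a hedged sketch; `facetRoot` and the Windows `.cmd` shim name are assumptions, not verified against the repo):

```typescript
import { join } from 'node:path';

// Resolve the tailwindcss binary installed into .facet/node_modules/.bin
// during the pnpm install step, instead of shelling out via `pnpm exec`.
function resolveTailwindBin(facetRoot: string): string {
  // pnpm writes .bin shims on install; on Windows the shim is a .cmd file.
  const shim = process.platform === 'win32' ? 'tailwindcss.cmd' : 'tailwindcss';
  return join(facetRoot, 'node_modules', '.bin', shim);
}

console.log(resolveTailwindBin('/project/.facet'));
```

This keeps the hard pnpm requirement confined to `.facet/` setup, where it is already validated.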
| function getTimeBucket(value: string): TimeBucket { | ||
| const d = new Date(value); | ||
| const now = new Date(); | ||
| const startOfToday = new Date(now.getFullYear(), now.getMonth(), now.getDate()); | ||
| const diffDays = Math.floor((startOfToday.getTime() - new Date(d.getFullYear(), d.getMonth(), d.getDate()).getTime()) / 86400000); | ||
|
|
||
| if (diffDays < 0) { | ||
| return { key: 'future', label: formatDayLabel(d), dateFormat: 'time' }; | ||
| } | ||
| if (diffDays === 0) { | ||
| return { key: 'today', label: formatDayLabel(d), dateFormat: 'time' }; | ||
| } | ||
| if (diffDays <= 6) { | ||
| return { key: `day-${diffDays}`, label: formatDayLabel(d), dateFormat: 'time' }; | ||
| } | ||
| if (diffDays <= 30) { | ||
| const weekStart = new Date(d); | ||
| weekStart.setDate(d.getDate() - ((d.getDay() + 6) % 7)); | ||
| const weekEnd = new Date(weekStart); | ||
| weekEnd.setDate(weekStart.getDate() + 4); | ||
| return { | ||
| key: `week-${formatMonthDay(weekStart.toISOString())}`, | ||
| label: `${formatMonthDay(weekStart.toISOString())} – ${formatMonthDay(weekEnd.toISOString())}`, | ||
| dateFormat: 'monthDay', | ||
| }; | ||
| } | ||
| return { | ||
| key: `month-${d.getFullYear()}-${d.getMonth()}`, | ||
| label: d.toLocaleDateString('en-US', { month: 'long', year: 'numeric' }), | ||
| dateFormat: 'monthDay', | ||
| }; | ||
| } |
getTimeBucket collapses all future-dated rows into a single "future" group.
For any diffDays < 0, the function returns key: 'future' regardless of how far ahead the date is. buildGroups keys its groupMap on key, so:
- All future-dated rows are merged into one group, even if they span multiple days/weeks/months.
- The visible group label (formatDayLabel(d)) is fixed by whichever future row is iterated first — subsequent future rows for different days are hidden under that one label.
Past-day buckets correctly use distinct keys (day-${diffDays}, week-…, month-…). Future dates should follow the same pattern, e.g. mirror the past logic with negated diffDays:
if (diffDays < 0) {
- return { key: 'future', label: formatDayLabel(d), dateFormat: 'time' };
+ const ahead = -diffDays;
+ if (ahead <= 6) {
+ return { key: `future-day-${ahead}`, label: formatDayLabel(d), dateFormat: 'time' };
+ }
+ return {
+ key: `future-month-${d.getFullYear()}-${d.getMonth()}`,
+ label: d.toLocaleDateString('en-US', { month: 'long', year: 'numeric' }),
+ dateFormat: 'monthDay',
+ };
}

📝 Committable suggestion
| function getTimeBucket(value: string): TimeBucket { | |
| const d = new Date(value); | |
| const now = new Date(); | |
| const startOfToday = new Date(now.getFullYear(), now.getMonth(), now.getDate()); | |
| const diffDays = Math.floor((startOfToday.getTime() - new Date(d.getFullYear(), d.getMonth(), d.getDate()).getTime()) / 86400000); | |
| if (diffDays < 0) { | |
| return { key: 'future', label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| if (diffDays === 0) { | |
| return { key: 'today', label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| if (diffDays <= 6) { | |
| return { key: `day-${diffDays}`, label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| if (diffDays <= 30) { | |
| const weekStart = new Date(d); | |
| weekStart.setDate(d.getDate() - ((d.getDay() + 6) % 7)); | |
| const weekEnd = new Date(weekStart); | |
| weekEnd.setDate(weekStart.getDate() + 4); | |
| return { | |
| key: `week-${formatMonthDay(weekStart.toISOString())}`, | |
| label: `${formatMonthDay(weekStart.toISOString())} – ${formatMonthDay(weekEnd.toISOString())}`, | |
| dateFormat: 'monthDay', | |
| }; | |
| } | |
| return { | |
| key: `month-${d.getFullYear()}-${d.getMonth()}`, | |
| label: d.toLocaleDateString('en-US', { month: 'long', year: 'numeric' }), | |
| dateFormat: 'monthDay', | |
| }; | |
| } | |
| function getTimeBucket(value: string): TimeBucket { | |
| const d = new Date(value); | |
| const now = new Date(); | |
| const startOfToday = new Date(now.getFullYear(), now.getMonth(), now.getDate()); | |
| const diffDays = Math.floor((startOfToday.getTime() - new Date(d.getFullYear(), d.getMonth(), d.getDate()).getTime()) / 86400000); | |
| if (diffDays < 0) { | |
| const ahead = -diffDays; | |
| if (ahead <= 6) { | |
| return { key: `future-day-${ahead}`, label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| return { | |
| key: `future-month-${d.getFullYear()}-${d.getMonth()}`, | |
| label: d.toLocaleDateString('en-US', { month: 'long', year: 'numeric' }), | |
| dateFormat: 'monthDay', | |
| }; | |
| } | |
| if (diffDays === 0) { | |
| return { key: 'today', label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| if (diffDays <= 6) { | |
| return { key: `day-${diffDays}`, label: formatDayLabel(d), dateFormat: 'time' }; | |
| } | |
| if (diffDays <= 30) { | |
| const weekStart = new Date(d); | |
| weekStart.setDate(d.getDate() - ((d.getDay() + 6) % 7)); | |
| const weekEnd = new Date(weekStart); | |
| weekEnd.setDate(weekStart.getDate() + 4); | |
| return { | |
| key: `week-${formatMonthDay(weekStart.toISOString())}`, | |
| label: `${formatMonthDay(weekStart.toISOString())} – ${formatMonthDay(weekEnd.toISOString())}`, | |
| dateFormat: 'monthDay', | |
| }; | |
| } | |
| return { | |
| key: `month-${d.getFullYear()}-${d.getMonth()}`, | |
| label: d.toLocaleDateString('en-US', { month: 'long', year: 'numeric' }), | |
| dateFormat: 'monthDay', | |
| }; | |
| } |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/ListTable.tsx` around lines 167 - 198, getTimeBucket currently
returns a single key 'future' for any diffDays < 0 which causes buildGroups'
groupMap (keyed by the returned key) to merge all future-dated rows into one
group; change getTimeBucket to mirror the past-date branching for negative
diffDays so future rows get unique keys (e.g. use `day-${-diffDays}` for
near-future days, a `week-...` key built from the week's start for mid-range
future dates, and `month-{year}-{month}` for farther future dates) and return
corresponding labels and dateFormat values so buildGroups' groupMap and visible
labels remain distinct per future bucket.
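The collapse is easy to demonstrate with a stripped-down sketch of the keying logic (illustrative names only; the real getTimeBucket and buildGroups also carry labels and date formats):

```typescript
// buildGroups keys its map on bucket.key, so rows with the same key merge.
function bucketKey(diffDays: number, perDayFutureKeys: boolean): string {
  if (diffDays < 0) {
    // Original behaviour: every future row returns the literal key 'future'.
    return perDayFutureKeys ? `future-day-${-diffDays}` : 'future';
  }
  if (diffDays === 0) return 'today';
  return `day-${diffDays}`;
}

function groupCount(diffs: number[], perDayFutureKeys: boolean): number {
  return new Set(diffs.map((d) => bucketKey(d, perDayFutureKeys))).size;
}

// Two rows dated tomorrow (-1) and next week (-7):
console.log(groupCount([-1, -7], false)); // collapses to one 'future' group
console.log(groupCount([-1, -7], true)); // tomorrow and next week stay distinct
```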
| valueColor?: string; | ||
| sublabel?: React.ReactNode; | ||
| sublabelClassName?: string; | ||
| shrink?: boolean; |
shrink prop is missing from the public CLI types and JSDoc.
StatCardProps.shrink is added here, but cli/src/server/facet-types.ts (the runtime declaration string consumed by SSR templates) does not declare shrink, and the JSDoc block at lines 153–185 doesn't mention it either. Consumers using the CLI typings will get a type error if they try to pass shrink.
📝 Add `shrink` to facet-types.ts
sublabelClassName?: string;
+ shrink?: boolean;
conditionalStyles?: ConditionalStyle[];

(in cli/src/server/facet-types.ts, inside StatCardProps)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/StatCard.tsx` at line 24, Add the new optional boolean prop
"shrink" to the public runtime/CLI declaration for StatCard by updating the
StatCardProps entry in the facet-types runtime string (ensure it is declared as
shrink?: boolean) and also update the JSDoc block for StatCardProps to document
the shrink prop and its behavior; make sure the type and name exactly match the
StatCardProps interface used in src/components/StatCard.tsx so CLI consumers no
longer get a missing-property type error.
| <div className="inline-flex flex-row items-center"> | ||
| <span | ||
| className={`inline-block h-3 w-3 flex-shrink-0 rounded-full shadow-md ${className} ${color}`} | ||
| aria-hidden="true" | ||
| /> | ||
| className={`${indicatorClassName} ${className}`} |
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Find existing call sites passing className to Status to assess impact.
rg -nP -C2 --type=tsx --type=ts '<Status\b[^>]*\bclassName='
🏁 Script executed:
#!/bin/bash
# First, locate the Status.tsx file
fd -name 'Status.tsx'
🏁 Script executed:
#!/bin/bash
# Search for Status component usages with className (using correct syntax without -t flag)
rg -nP -C2 '<Status[^>]*className='
🏁 Script executed:
#!/bin/bash
# Also look for the Status.tsx file and read the implementation around lines 70-72
fd -name 'Status.tsx' | head -1 | xargs wc -l
🏁 Script executed:
#!/bin/bash
# Find Status.tsx file
find . -name 'Status.tsx' -type f
🏁 Script executed:
#!/bin/bash
# Search for Status component usage more broadly
rg 'Status[^a-zA-Z]' --type-list | grep -i tsx
🏁 Script executed:
#!/bin/bash
# Let's just look for all tsx files and their count
find . -name '*.tsx' -o -name '*.ts' | wc -l
🏁 Script executed:
#!/bin/bash
# Read Status.tsx to see the implementation around lines 70-72
cat -n ./src/components/Status.tsx | head -85 | tail -20
🏁 Script executed:
#!/bin/bash
# Also get a fuller view to understand the component structure
cat -n ./src/components/Status.tsx
🏁 Script executed:
#!/bin/bash
# Search for all usages of the Status component
rg 'Status' --type-list
🏁 Script executed:
#!/bin/bash
# Search for Status component usage - look for imports and usage
rg 'import.*Status' -A 5 -B 5
🏁 Script executed:
#!/bin/bash
# More specific search for Status component usage
rg '<Status' -A 3 -B 1
Move className prop to the outer wrapper <div> for conventional component API.
The className prop is currently applied to the inner indicator <span> (line 72) alongside layout/sizing classes (h-3 w-3 rounded-full, etc.), rather than the root element. This is unconventional—consumers typically expect className on a component's root element to control margins, alignment, and other wrapper-level styling.
Since no existing usages pass className to the Status component, this can be safely moved to the outer <div> without breaking changes. Alternatively, if the current behavior is intentional, introduce a separate indicatorClassName prop (following the pattern already established elsewhere in this PR) to make the intent explicit.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/Status.tsx` around lines 70 - 72, The Status component
currently applies the incoming className to the inner indicator <span> (combined
with indicatorClassName) instead of the root wrapper <div>; update the component
so the root <div className="inline-flex flex-row items-center"> receives the
incoming className prop (merging with its existing classes) and the inner <span>
uses only indicatorClassName (or a new indicatorClassName prop if you prefer to
keep span customization explicit); adjust the component props/signature
accordingly (remove className from the span, apply it to the div, and ensure any
existing indicatorClassName behavior remains intact).
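A minimal sketch of the suggested prop split, with plain functions standing in for the JSX so only the class-merging behaviour is shown (class lists copied from the review):

```typescript
// className styles the root wrapper; indicatorClassName styles only the dot.
function statusClasses(className = '', indicatorClassName = '') {
  return {
    root: `inline-flex flex-row items-center ${className}`.trim(),
    indicator: `inline-block h-3 w-3 flex-shrink-0 rounded-full shadow-md ${indicatorClassName}`.trim(),
  };
}

// Wrapper-level margin goes to the root; the color utility stays on the dot.
console.log(statusClasses('ml-2', 'bg-green-500'));
```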
| fontFamily: { | ||
| 'sans': ["Open Sans", "system-ui", "-apple-system", "BlinkMacSystemFont", "Segoe UI", "Roboto", "sans-serif"], | ||
| 'mono': ["Consolas", "Monaco", "Courier New", "monospace"], | ||
| 'mono': ["Fira Code", "Consolas", "Monaco", "Courier New", "monospace"], |
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Look for any Fira Code loading mechanism
rg -nP -i 'fira[\s_-]*code' --type=css --type=ts --type=tsx --type=html --type=js
rg -nP 'fonts\.googleapis|@font-face' src
🏁 Script executed:
#!/bin/bash
# Search for Fira Code mentions (case-insensitive)
echo "=== Searching for Fira Code ==="
rg -i 'fira' --type-list | head -20
rg -i 'fira' --type=css --type=js --type=html
echo -e "\n=== Searching for Google Fonts or @font-face ==="
rg 'fonts\.googleapis|@font-face' --type=css --type=js --type=html
echo -e "\n=== Checking for font files in public/assets ==="
find . -type f \( -name "*.woff*" -o -name "*.ttf" -o -name "*.otf" \) 2>/dev/null | head -20
echo -e "\n=== Looking at main HTML files ==="
find . -maxdepth 3 -name "*.html" -type f | head -10
🏁 Script executed:
#!/bin/bash
echo "=== Content of src/styles.css ==="
cat -n src/styles.css
echo -e "\n=== Content of cli/playground.html ==="
cat -n cli/playground.html | head -50
Confirm Fira Code is actually loaded for PDF/HTML output.
Listing "Fira Code" first in the Tailwind config and CSS font-family rules only changes the CSS stack—the font still needs to be loaded (Google Fonts <link>, @font-face, or installed system font). Puppeteer's headless Chromium has no Fira Code by default, so PDF output will silently fall back to Consolas/Monaco/Courier New, defeating the change.
Add a <link> to fonts.googleapis.com or an @font-face declaration in your styles, and ensure fonts are loaded before Puppeteer renders.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@tailwind.config.js` at line 14, The Tailwind 'mono' entry lists "Fira Code"
but the font isn't actually loaded for PDF/HTML output; add an actual font load
(either a Google Fonts <link> or an `@font-face` in your main stylesheet that
includes Fira Code) so the browser has the face available, and ensure
Puppeteer's render waits for fonts to finish loading (e.g., wait for
document.fonts.ready before calling page.pdf / page.screenshot). Update the
CSS/HTML that supplies your global fonts (the stylesheet that Tailwind's 'mono'
will use) to include the Fira Code load and add a wait for font loading in the
code path that invokes Puppeteer rendering.
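Both halves of the fix can be sketched as follows; the woff2 URL is illustrative, while `document.fonts.ready` is the standard Font Loading API promise that Puppeteer's `page.evaluate` can await:

```typescript
// Build the @font-face rule to add to the global stylesheet so headless
// Chromium actually has Fira Code available (URL is a placeholder).
function firaCodeFontFace(woff2Url: string): string {
  return [
    '@font-face {',
    "  font-family: 'Fira Code';",
    `  src: url('${woff2Url}') format('woff2');`,
    '  font-display: swap;',
    '}',
  ].join('\n');
}

console.log(firaCodeFontFace('/assets/FiraCode-Regular.woff2'));

// In the Puppeteer render path, wait for fonts before generating the PDF:
//   await page.evaluate(() => document.fonts.ready);
//   await page.pdf({ path: outputPdfPath });
```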
5b11709 to
a7a45a7
|
Warning Review the following alerts detected in dependencies. According to your organization's Security Policy, it is recommended to resolve "Warn" alerts. Learn more about Socket for GitHub.
|
The pnpm migration in a7a45a7 left npm references in CI/build: - test.yml setup-node cached cli/package-lock.json (no longer exists) - Dockerfile builder ran npm run build, but prebuild calls pnpm - Dockerfile final stage copied package-lock.json (no longer exists) - release.yml installed and built via npm and tried to git-add stale package-lock files Switch all of them to pnpm@8.13.1 via pnpm/action-setup, replace npm install/run with pnpm equivalents, and update lockfile paths.
pnpm 8 disables pre/post lifecycle scripts by default for security. cli/package.json relies on prebuild → gen-version to emit src/version-generated.ts; without it, the Docker build fails with "Could not resolve '../version-generated.js'". Add .npmrc with enable-pre-post-scripts=true at the root and in cli/.
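The fix amounts to a one-line config file; a quick local sanity check (written to a temp dir here, while the real file goes at the repo root and in cli/):

```shell
# pnpm 8 skips pre/post lifecycle scripts unless explicitly re-enabled, so
# prebuild (gen-version) never runs and the build cannot resolve
# version-generated.js. enable-pre-post-scripts restores npm-like behaviour.
tmp="$(mktemp -d)"
printf 'enable-pre-post-scripts=true\n' > "$tmp/.npmrc"
cat "$tmp/.npmrc"
```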
facet shells out to pnpm when materialising .facet/node_modules for a template. The final image only had pnpm in the builder stage, so the in-image warmup (and any runtime PDF render) failed with: "facet requires pnpm, but it was not found on PATH". Install pnpm@8.13.1 globally in the runtime stage too.
What
- `logoClass` and `className` props to FlanksourceHeader for flexible styling
- `className` and `style` props
- `agent/` and `.sandbox.yaml`

Why
Summary by CodeRabbit
New Features
- `facet doctor` command for environment validation and diagnostics.

Improvements