
.pr_agent_auto_best_practices


Pattern 1: Add proper error handling for database queries and API calls by checking for errors in the response and throwing meaningful error messages instead of returning undefined or continuing with potentially invalid data.

Example code before:

const { data } = await supabase.from('table').select('*').single()
return data

Example code after:

const { data, error } = await supabase.from('table').select('*').single()
if (error) {
  throw new Error(`Failed to fetch data: ${error.message}`)
}
return data
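
A minimal reusable sketch of the same idea, assuming a thenable Supabase-style query; the helper name `fetchSingleOrThrow` is illustrative, not from the codebase:

```typescript
// Hypothetical helper: awaits a .single() query and throws a descriptive
// error instead of letting callers continue with undefined data.
async function fetchSingleOrThrow<T>(
  query: PromiseLike<{ data: T | null; error: { message: string } | null }>,
  label: string,
): Promise<T> {
  const { data, error } = await query
  if (error) {
    throw new Error(`Failed to fetch ${label}: ${error.message}`)
  }
  if (data === null) {
    throw new Error(`${label} not found`)
  }
  return data
}

// Usage (illustrative):
// const session = await fetchSingleOrThrow(
//   supabase.from('design_sessions').select('*').eq('id', id).single(),
//   'design session',
// )
```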
Relevant past accepted suggestions:
Suggestion 1:

Add error handling for database query

The function doesn't handle potential errors from the Supabase query. If the query fails or returns an error, the function will return undefined without any indication of what went wrong. Add proper error handling to distinguish between "not found" and actual query errors.

frontend/apps/app/components/SessionDetailPage/services/designSessionWithTimelineItems/fetchDesignSessionWithTimelineItems.ts [3-31]

 export async function fetchDesignSessionWithTimelineItems(
   supabase: SupabaseClientType,
   designSessionId: string,
 ) {
-  const { data } = await supabase
+  const { data, error } = await supabase
     .from('design_sessions')
     .select(`
         id,
         organization_id,
         timeline_items (
           id,
           content,
           type,
           user_id,
           created_at,
           organization_id,
           design_session_id,
           building_schema_version_id
         )
       `)
     .eq('id', designSessionId)
     .order('created_at', {
       ascending: true,
       referencedTable: 'timeline_items',
     })
     .single()
 
+  if (error) {
+    throw new Error(`Failed to fetch design session: ${error.message}`)
+  }
+
   return data
 }
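
The accepted diff throws on every error; if callers also need to distinguish "not found" from real query failures, a hedged refinement could branch on the error code (assuming PostgREST's `PGRST116` code for a `.single()` query that matched no rows):

```typescript
if (error) {
  // Assumption: PGRST116 is PostgREST's "no rows returned" code for .single()
  if (error.code === 'PGRST116') {
    return null // "not found" rather than a hard failure
  }
  throw new Error(`Failed to fetch design session: ${error.message}`)
}
```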

Suggestion 2:

Add sessionStorage error handling

Add error handling for sessionStorage access since it can throw exceptions in private browsing mode or when storage is disabled. Wrap the sessionStorage call in a try-catch block to prevent runtime errors.

frontend/apps/app/components/Chat/Chat.tsx [40-45]

 const getAutoStartExecuted = () => {
   if (typeof window === 'undefined') return false
-  return (
-    sessionStorage.getItem(`autoStartExecuted_${designSession.id}`) === 'true'
-  )
+  try {
+    return (
+      sessionStorage.getItem(`autoStartExecuted_${designSession.id}`) === 'true'
+    )
+  } catch {
+    return false
+  }
 }

Suggestion 3:

Log parsing errors for debugging

The catch block silently swallows all parsing errors without logging them. This makes debugging difficult when the PMAgent returns malformed JSON. Log the parsing error before falling back to the default structure.

frontend/internal-packages/agent/src/chat/workflow/nodes/analyzeRequirementsNode.ts [42-49]

-} catch {
+} catch (error) {
+  // Log parsing error for debugging
+  state.logger.log(`[${NODE_NAME}] Failed to parse response: ${error}`)
   // Fallback: treat response as single requirement
   analysisResult = {
     brd: response || 'Failed to parse requirements',
     functionalRequirements: {},
     nonFunctionalRequirements: {},
   }
 }

Suggestion 4:

Add error handling

Add error handling for the knowledge generation task. If the task fails to trigger, the function should still return success for the feedback resolution but log the error.

frontend/apps/app/features/migrations/actions/resolveReviewFeedback.ts [100-109]

 // Trigger the knowledge suggestion creation task with reviewFeedbackId
-const taskHandle = await generateKnowledgeFromFeedbackTask.trigger({
-  projectId,
-  reviewFeedback: updatedFeedback[0],
-  title: `Knowledge from resolved feedback #${feedbackId}`,
-  reasoning: `This knowledge suggestion was automatically created from resolved feedback #${feedbackId}`,
-  overallReview: completeOverallReview,
-  branch: completeOverallReview.branchName,
-})
+let taskId: string | null = null;
+try {
+  const taskHandle = await generateKnowledgeFromFeedbackTask.trigger({
+    projectId,
+    reviewFeedback: updatedFeedback[0],
+    title: `Knowledge from resolved feedback #${feedbackId}`,
+    reasoning: `This knowledge suggestion was automatically created from resolved feedback #${feedbackId}`,
+    overallReview: completeOverallReview,
+    branch: completeOverallReview.branchName,
+  })
+  taskId = taskHandle.id;
+} catch (error) {
+  console.error('Failed to trigger knowledge generation task:', error);
+  // Continue execution - don't fail the feedback resolution
+}

Suggestion 5:

Remove duplicate client creation

You're creating a Supabase client twice in the same function - once before checking the user and again inside the try block. This is inefficient and could lead to connection management issues. Reuse the existing client that was created earlier in the function.

frontend/apps/app/features/migrations/actions/reviewFeedbackComments.ts [54-57]

 try {
-  const supabase = await createClient()
-
   // Insert the comment

Pattern 2: Use proper identifier escaping for SQL table and column names by wrapping them in double quotes (and doubling any embedded double quotes) to prevent SQL injection and handle special characters correctly in PostgreSQL.

Example code before:

return `ALTER TABLE ${tableName} ADD COLUMN ${columnName} TEXT;`

Example code after:

return `ALTER TABLE "${tableName}" ADD COLUMN "${columnName}" TEXT;`
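
Quoting alone is not enough when an identifier itself contains a double quote; a small escaping helper (a sketch mirroring the accepted suggestion below) handles both cases:

```typescript
// Quote a PostgreSQL identifier, doubling any embedded double quotes.
function escapeIdentifier(identifier: string): string {
  return `"${identifier.replace(/"/g, '""')}"`
}

// escapeIdentifier('users')      // => "users"
// escapeIdentifier('weird"name') // => "weird""name"
```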
Relevant past accepted suggestions:
Suggestion 1:

Add SQL identifier escaping

**The function should properly escape SQL identifiers to prevent SQL injection and handle special characters in table/column names. Use proper identifier quoting for PostgreSQL.**

[frontend/packages/db-structure/src/deparser/postgresql/utils.ts [131-137]](https://github.com/liam-hq/liam/pull/2139/files#diff-953bd04b5c3caedcff5d0e578ef9480183849538cf27562d74930e726ee96439R131-R137)

```diff
 export function generateRenameColumnStatement(
   tableName: string,
   oldColumnName: string,
   newColumnName: string,
 ): string {
-  return `ALTER TABLE ${tableName} RENAME COLUMN ${oldColumnName} TO ${newColumnName};`
+  return `ALTER TABLE "${tableName}" RENAME COLUMN "${oldColumnName}" TO "${newColumnName}";`
 }
```

The full accepted change introduces an `escapeIdentifier` helper and applies it to each DDL generator in the same file:

```diff
+/**
+ * Escape SQL identifiers (table names, column names) for PostgreSQL.
+ * Wraps identifier in double quotes and escapes internal double quotes.
+ */
+function escapeIdentifier(identifier: string): string {
+  // Escape double quotes by doubling them and wrap in double quotes
+  return `"${identifier.replace(/"/g, '""')}"`
+}
+
 /**
  * Generate ADD COLUMN statement for a column
  */
 export function generateAddColumnStatement(
@@ -60,11 +69,11 @@
   column: Column,
 ): string {
   const columnDefinition = generateColumnDefinition(column)
-  let ddl = `ALTER TABLE ${tableName} ADD COLUMN ${columnDefinition};`
+  let ddl = `ALTER TABLE ${escapeIdentifier(tableName)} ADD COLUMN ${columnDefinition};`

   // Add column comment if exists
   if (column.comment) {
-    ddl += `\n\nCOMMENT ON COLUMN ${tableName}.${column.name} IS '${escapeString(column.comment)}';`
+    ddl += `\n\nCOMMENT ON COLUMN ${escapeIdentifier(tableName)}.${escapeIdentifier(column.name)} IS '${escapeString(column.comment)}';`
   }

   return ddl
@@ -82,11 +91,11 @@
     .join(',\n  ')

   // Basic CREATE TABLE statement
-  let ddl = `CREATE TABLE ${tableName} (\n  ${columnDefinitions}\n);`
+  let ddl = `CREATE TABLE ${escapeIdentifier(tableName)} (\n  ${columnDefinitions}\n);`

   // Add table comment
   if (table.comment) {
-    ddl += `\n\nCOMMENT ON TABLE ${tableName} IS '${escapeString(table.comment)}';`
+    ddl += `\n\nCOMMENT ON TABLE ${escapeIdentifier(tableName)} IS '${escapeString(table.comment)}';`
   }

   // Add column comments
@@ -107,7 +116,7 @@
   for (const column of Object.values(table.columns) as Column[]) {
     if (column.comment) {
       comments.push(
-        `COMMENT ON COLUMN ${tableName}.${column.name} IS '${escapeString(column.comment)}';`,
+        `COMMENT ON COLUMN ${escapeIdentifier(tableName)}.${escapeIdentifier(column.name)} IS '${escapeString(column.comment)}';`,
       )
     }
   }
@@ -122,7 +131,7 @@
   tableName: string,
   columnName: string,
 ): string {
-  return `ALTER TABLE ${tableName} DROP COLUMN ${columnName};`
+  return `ALTER TABLE ${escapeIdentifier(tableName)} DROP COLUMN ${escapeIdentifier(columnName)};`
 }

 /**
@@ -133,14 +142,14 @@
   oldColumnName: string,
   newColumnName: string,
 ): string {
-  return `ALTER TABLE ${tableName} RENAME COLUMN ${oldColumnName} TO ${newColumnName};`
+  return `ALTER TABLE ${escapeIdentifier(tableName)} RENAME COLUMN ${escapeIdentifier(oldColumnName)} TO ${escapeIdentifier(newColumnName)};`
 }

 /**
  * Generate DROP TABLE statement
  */
 export function generateRemoveTableStatement(tableName: string): string {
-  return `DROP TABLE ${tableName};`
+  return `DROP TABLE ${escapeIdentifier(tableName)};`
 }

 /**
@@ -152,7 +161,9 @@
 ): string {
   const uniqueKeyword = index.unique ? ' UNIQUE' : ''
   const indexMethod = index.type ? ` USING ${index.type}` : ''
-  const columnList = index.columns.join(', ')
+  const columnList = index.columns
+    .map((col) => escapeIdentifier(col))
+    .join(', ')
-  return `CREATE${uniqueKeyword} INDEX ${index.name} ON ${tableName}${indexMethod} (${columnList});`
+  return `CREATE${uniqueKeyword} INDEX ${escapeIdentifier(index.name)} ON ${escapeIdentifier(tableName)}${indexMethod} (${columnList});`
```

Suggestion 2:

Fix type name typo

There's a typo in the imported type name CheckConstarintDetail. It should be CheckConstraintDetail to maintain consistency with other constraint-related types.

frontend/packages/db-structure/src/diff/types.ts [2]

 import type {
-  CheckConstarintDetail,
+  CheckConstraintDetail,
   Column,
   ColumnCheck,

Pattern 3: Fix boolean and falsy value handling in conditional checks by explicitly checking for null or undefined instead of using truthy/falsy evaluation when dealing with valid boolean or numeric values like false or 0.

Example code before:

if (!data) return null

Example code after:

if (data === null || data === undefined) return null
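
A small self-contained sketch of the difference (the function name is illustrative), showing that only null and undefined count as "missing" while false is preserved:

```typescript
// Returns the flag unchanged (including false) and reports "missing"
// only when the value is genuinely absent.
function readUniqueFlag(unique: boolean | undefined): boolean | null {
  // `== null` matches both null and undefined, but not false, 0, or ''
  if (unique == null) return null
  return unique
}

readUniqueFlag(false)     // => false (a valid value, not treated as missing)
readUniqueFlag(undefined) // => null
```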
Relevant past accepted suggestions:
Suggestion 1:

Fix dot product calculation bug

The cosine similarity calculation has a critical bug in the dot product computation. The condition vecB[i] ? sum + a * vecB[i] : sum will skip multiplication when vecB[i] is 0, which is incorrect since 0 is a valid vector component. This will produce wrong similarity scores and affect matching accuracy.

frontend/internal-packages/schema-bench/src/nameSimilarity/nameSimilarity.ts [19-24]

 function cosineSimilarity(vecA: number[], vecB: number[]): number {
-  const dot = vecA.reduce((sum, a, i) => (vecB[i] ? sum + a * vecB[i] : sum), 0)
+  const dot = vecA.reduce((sum, a, i) => sum + a * vecB[i], 0)
   const normA = Math.sqrt(vecA.reduce((sum, a) => sum + a * a, 0))
   const normB = Math.sqrt(vecB.reduce((sum, b) => sum + b * b, 0))
   return dot / (normA * normB)
 }

Suggestion 2:

Fix boolean value handling

The current implementation returns null when the unique flag is false, which is a valid boolean value. This prevents detecting changes from true to false. Consider checking if the data is undefined instead of treating false as a falsy value.

frontend/packages/db-structure/src/diff/indexes/buildIndexUniqueDiffItem.ts [22-27]

 const data =
   status === 'removed'
     ? before.tables[tableId]?.indexes[indexId]?.unique
     : after.tables[tableId]?.indexes[indexId]?.unique
 
-if (!data) return null
+if (data === undefined) return null

Suggestion 3:

Fix empty string handling

The current implementation will skip empty strings as they evaluate to falsy in the condition. This could prevent detecting changes where the type is set to an empty string. Check for undefined instead.

frontend/packages/db-structure/src/diff/indexes/buildIndexTypeDiffItem.ts [22-27]

 const data =
   status === 'removed'
     ? before.tables[tableId]?.indexes[indexId]?.type
     : after.tables[tableId]?.indexes[indexId]?.type
 
-if (!data) return null
+if (data === undefined) return null

Suggestion 4:

Fix falsy value handling

The check for !data will incorrectly return null for default values that are falsy (like 0 or false). Since default can be a number or boolean, you should check if data is strictly null or undefined.

frontend/packages/db-structure/src/diff/columns/buildColumnDefaultDiffItem.ts [27]

-if (!data) return null
+if (data === null || data === undefined) return null

Pattern 4: Use useRef instead of createRef in functional components to maintain the same reference across renders and prevent unnecessary re-creation of refs on every render.

Example code before:

const myRef = createRef<HTMLElement>()

Example code after:

const myRef = useRef<HTMLElement>(null)
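
A minimal functional-component sketch (the component name is illustrative) showing that `useRef` creates the ref object once and keeps it stable across renders:

```tsx
import { type FC, useEffect, useRef } from 'react'

const ExampleDialog: FC = () => {
  // Created once; the same ref object is reused on every render.
  const closeButtonRef = useRef<HTMLButtonElement>(null)

  useEffect(() => {
    // Safe to read .current after mount; it points at the rendered button.
    closeButtonRef.current?.focus()
  }, [])

  return <button ref={closeButtonRef}>Close</button>
}
```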
Relevant past accepted suggestions:
Suggestion 1:

Use useRef instead of createRef

Using createRef in a functional component creates a new ref on every render, which can cause performance issues and unexpected behavior. Use useRef instead to maintain the same reference across renders.

frontend/packages/erd-core/src/features/erd/components/ERDRenderer/CommandPalette/CommandPalette.tsx [18]

-const closeButtonRef = createRef<HTMLButtonElement>()
+const closeButtonRef = useRef<HTMLButtonElement>(null)

Suggestion 2:

Move custom hook outside component

Move the useDeviceType hook outside of the component to prevent it from being recreated on each render. Custom hooks should be defined at the module level, not inside component functions.

frontend/packages/erd-core/src/features/erd/components/ERDRenderer/ERDRenderer.tsx [96-107]

+// Move this outside the component
 const useDeviceType = () => {
   const [isMobile, setIsMobile] = useState(false);  
   useEffect(() => {
     const handleResize = () => {
       setIsMobile(window.innerWidth < 768);
     };  
     handleResize();  
     window.addEventListener('resize', handleResize);  
     return () => window.removeEventListener('resize', handleResize);
   }, []);  
   return { isMobile, isDesktop: !isMobile };
 };
 
+export const ERDRenderer: FC<Props> = ({
+  // ...rest of the component
+

Pattern 5: Add validation for numeric parameters and array bounds before accessing array elements or parsing string values to prevent runtime errors from invalid input.

Example code before:

const id = Number.parseInt(organizationId)
const lastItem = pathSegments[pathSegments.length - 1]

Example code after:

const id = Number.parseInt(organizationId)
if (isNaN(id)) throw new Error('Invalid ID')
if (pathSegments.length === 0) return undefined
const lastItem = pathSegments[pathSegments.length - 1]
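
A compact sketch of both guards as reusable helpers (names are illustrative):

```typescript
// Parse a numeric route/query parameter, failing fast on invalid input.
function parsePositiveInt(raw: string, label: string): number {
  const value = Number.parseInt(raw, 10)
  if (Number.isNaN(value) || value <= 0) {
    throw new Error(`Invalid ${label}: ${raw}`)
  }
  return value
}

// Return the last path segment, or undefined when there is none.
function lastSegment(path: string): string | undefined {
  const segments = path.split('/').filter(Boolean)
  // Check the bounds before indexing so an empty path cannot index at -1
  return segments.length > 0 ? segments[segments.length - 1] : undefined
}

parsePositiveInt('42', 'organizationId') // => 42
lastSegment('/app/projects/123')         // => '123'
lastSegment('/')                         // => undefined
```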
Relevant past accepted suggestions:
Suggestion 1:

Add fallback for undefined terminal columns

The code assumes process.stdout.columns is always defined, but it can be undefined in certain environments (like CI/CD or when stdout is redirected). Add a fallback to prevent potential runtime errors.

frontend/packages/cli/src/cli/banner.ts [38-41]

 const asciiArt =
-  process.stdout.columns > longAsciiArtSafeWidth
+  (process.stdout.columns || 80) > longAsciiArtSafeWidth
     ? longAsciiArt
     : shortAsciiArt

Suggestion 2:

Validate vector length compatibility

The function assumes both input vectors have the same length but doesn't validate this. If vectors have different lengths, accessing vecB[i] when i >= vecB.length will return undefined, leading to incorrect calculations. Add a length validation or handle mismatched vector sizes appropriately.

frontend/internal-packages/schema-bench/src/nameSimilarity/nameSimilarity.ts [19-24]

 function cosineSimilarity(vecA: number[], vecB: number[]): number {
-  const dot = vecA.reduce((sum, a, i) => (vecB[i] ? sum + a * vecB[i] : sum), 0)
+  if (vecA.length !== vecB.length) {
+    throw new Error('Vectors must have the same length')
+  }
+  const dot = vecA.reduce((sum, a, i) => sum + a * vecB[i], 0)
   const normA = Math.sqrt(vecA.reduce((sum, a) => sum + a * a, 0))
   const normB = Math.sqrt(vecB.reduce((sum, b) => sum + b * b, 0))
+  if (normA === 0 || normB === 0) return 0
   return dot / (normA * normB)
 }

Suggestion 3:

Add validation to ensure array has elements before accessing its items to prevent potential runtime errors

The function doesn't validate if pathSegments array has any elements before accessing the last segment. If the URL path is empty or has no segments, pathSegments.length - 1 could be -1, causing a runtime error. Add a check to ensure the array has elements before accessing the last segment.

frontend/apps/app/app/(app)/app/projects/[projectId]/layout.tsx [19-30]

 const getDefaultTabFromPath = async (): Promise<
   ProjectTabValue | undefined
 > => {
   const headersList = await headers()
   const urlPath = headersList.get('x-url-path') || ''
   const pathSegments = urlPath.split('/')
+  
+  if (pathSegments.length === 0) {
+    return undefined
+  }
+  
   const lastSegment = pathSegments[pathSegments.length - 1]
 
   const result = safeParse(ProjectTabSchema, lastSegment)
 
   return result.success ? result.output : undefined
 }

Suggestion 4:

Add validation to ensure API responses contain data in the expected format before processing or returning them

The API route doesn't validate the response from getProjects() before returning it. Even though this is a mock implementation, it's good practice to validate that the returned data is in the expected format to prevent potential runtime errors. Add a check to ensure that projects is an array before returning it.

frontend/apps/app/app/api/projects/route.ts [4-17]

 export async function GET(_request: NextRequest): Promise<NextResponse> {
   try {
     // This is a mock implementation. In a real app, you would fetch projects from a database.
     const projects = await getProjects()
-
+    
+    if (!Array.isArray(projects)) {
+      return NextResponse.json(
+        { error: 'Invalid projects data format' },
+        { status: 500 },
+      )
+    }
+    
     return NextResponse.json(projects)
   } catch (error) {
     console.error('Error in projects API route:', error)
     return NextResponse.json(
       { error: 'Failed to fetch projects' },
       { status: 500 },
     )
   }
 }

Suggestion 5:

Add validation for numeric parameters to prevent potential errors when parsing invalid input values

The code is parsing the organizationId parameter without validating if it's a valid number. If the parameter contains non-numeric characters, Number.parseInt() could return NaN, which might cause unexpected behavior when used in the database query. Add explicit validation to ensure the parameter is a valid number before using it.

frontend/apps/app/app/api/projects/search/route.ts [20-22]

 if (organizationId) {
-  dbQuery = dbQuery.eq('organizationId', Number.parseInt(organizationId))
+  const parsedId = Number.parseInt(organizationId);
+  if (isNaN(parsedId)) {
+    return NextResponse.json(
+      { error: 'Invalid organizationId parameter' },
+      { status: 400 },
+    )
+  }
+  dbQuery = dbQuery.eq('organizationId', parsedId)
 }

[Auto-generated best practices - 2025-07-15]
