---
name: pr-reviewer
description: Structured PR code review workflow. Use when asked to "review this PR", "code review", "review pull request", or "check this PR". Works with both GitHub and Gitea via platform-aware scripts. Fetches PR metadata and diff, analyzes against quality criteria, generates structured review files, and posts feedback only after explicit approval.
version: 2.0.0
category: code-review
author: Adapted from SpillwaveSolutions/pr-reviewer-skill
license: MIT
---
# PR Reviewer
Structured, two-stage PR code review workflow. Nothing is posted until explicit user approval.
## Prerequisites
- `~/.config/mosaic/rails/git/` — Platform-aware git scripts must be available
- `python3` — For review file generation
- Current working directory must be inside the target git repository
## Workflow

### Stage 1: Collect and Analyze

#### 1. Gather PR Data
Run from inside the repo directory:
```bash
# Get PR metadata as JSON
~/.config/mosaic/rails/git/pr-metadata.sh -n <PR_NUMBER> -o /tmp/pr-review/metadata.json

# Get PR diff
~/.config/mosaic/rails/git/pr-diff.sh -n <PR_NUMBER> -o /tmp/pr-review/diff.patch

# View PR details (human-readable)
~/.config/mosaic/rails/git/pr-view.sh -n <PR_NUMBER>
```
#### 2. Analyze the Changes
With the diff and metadata collected, analyze the PR against these criteria:
| Category | Key Questions |
|---|---|
| Functionality | Does the code solve the stated problem? Edge cases handled? |
| Security | OWASP top 10? Secrets in code? Input validation? |
| Testing | Tests exist? Cover happy paths and edge cases? |
| Performance | Efficient algorithms? N+1 queries? Bundle size impact? |
| Readability | Clear naming? DRY? Consistent with codebase patterns? |
| Architecture | Follows project conventions? Proper separation of concerns? |
| PR Quality | Focused scope? Clean commits? Clear description? |
Priority levels for findings:
- Blocker — Must fix before merge
- Important — Should be addressed
- Nit — Nice to have, optional
- Suggestion — Consider for future
- Question — Clarification needed
- Praise — Good work worth calling out
#### 3. Generate Review Files
Create a findings JSON and run the generator:
```bash
python3 ~/.config/mosaic/skills/pr-reviewer/scripts/generate_review_files.py \
  /tmp/pr-review --findings /tmp/pr-review/findings.json
```
This creates:
- `pr/review.md` — Detailed analysis (internal use)
- `pr/human.md` — Clean version for posting (no emojis, concise)
- `pr/inline.md` — Proposed inline comments with code context
Findings JSON schema:
```json
{
  "summary": "Overall assessment of the PR",
  "metadata": {
    "repository": "owner/repo",
    "number": 123,
    "title": "PR title",
    "author": "username",
    "head_branch": "feature-branch",
    "base_branch": "main"
  },
  "blockers": [
    {
      "category": "Security",
      "issue": "Brief description",
      "file": "src/path/file.ts",
      "line": 45,
      "details": "Explanation of the problem",
      "fix": "Suggested solution",
      "code_snippet": "relevant code"
    }
  ],
  "important": [],
  "nits": [],
  "suggestions": ["Consider adding..."],
  "questions": ["Is this intended to..."],
  "praise": ["Excellent test coverage"],
  "inline_comments": [
    {
      "file": "src/path/file.ts",
      "line": 42,
      "comment": "Consider edge case",
      "code_snippet": "relevant code",
      "start_line": 40,
      "end_line": 44
    }
  ]
}
```
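As a sketch, a minimal findings file matching this schema can be produced programmatically before invoking the generator. All values below are placeholders, not output from a real review:

```python
import json
from pathlib import Path

# Skeleton findings dict mirroring the schema above (placeholder values).
findings = {
    "summary": "Small, focused change; no blocking issues found.",
    "metadata": {
        "repository": "owner/repo",
        "number": 123,
        "title": "PR title",
        "author": "username",
        "head_branch": "feature-branch",
        "base_branch": "main",
    },
    "blockers": [],
    "important": [],
    "nits": [],
    "suggestions": [],
    "questions": [],
    "praise": [],
    "inline_comments": [],
}

# Write to the working directory the generator script expects.
out = Path("/tmp/pr-review/findings.json")
out.parent.mkdir(parents=True, exist_ok=True)
out.write_text(json.dumps(findings, indent=2))
```

Empty lists are valid for every findings category, so starting from this skeleton and filling in only the categories that apply keeps the file schema-complete.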
### Stage 2: Review and Post

#### 4. Present to User
Show the user:
- The summary from `pr/review.md`
- Count of blockers, important issues, nits
- Overall recommendation (approve vs request changes)
- Ask for approval before posting
NOTHING is posted without explicit user consent.
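The pre-posting summary can be sketched as a small helper. The function name is hypothetical; it assumes the findings dict follows the Stage 1 schema and maps "any blockers" to a request-changes recommendation:

```python
def summarize(findings: dict) -> str:
    """Build the short summary shown to the user before posting (sketch)."""
    blockers = len(findings.get("blockers", []))
    important = len(findings.get("important", []))
    nits = len(findings.get("nits", []))
    # Simple heuristic: any blocker means the PR should not be approved as-is.
    verdict = "request-changes" if blockers else "approve"
    return (f"{blockers} blocker(s), {important} important, {nits} nit(s) "
            f"-> recommended action: {verdict}")

# Example with an in-memory findings dict:
print(summarize({"blockers": [{"issue": "SQL injection"}], "nits": [{}, {}]}))
# -> 1 blocker(s), 0 important, 2 nit(s) -> recommended action: request-changes
```

The recommendation is only a default to present; the user's explicit choice still decides which action is posted.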
#### 5. Post the Review
After user approval:
```bash
# Option A: Approve
~/.config/mosaic/rails/git/pr-review.sh -n <PR_NUMBER> -a approve -c "$(cat /tmp/pr-review/pr/human.md)"

# Option B: Request changes
~/.config/mosaic/rails/git/pr-review.sh -n <PR_NUMBER> -a request-changes -c "$(cat /tmp/pr-review/pr/human.md)"

# Option C: Comment only (no verdict)
~/.config/mosaic/rails/git/pr-review.sh -n <PR_NUMBER> -a comment -c "$(cat /tmp/pr-review/pr/human.md)"
```
## Review Criteria Reference

See `references/review_criteria.md` for the complete checklist.

### Quick Checklist
- Code compiles and passes CI
- Tests cover new functionality
- No hardcoded secrets or credentials
- Error handling is appropriate
- No obvious performance issues (N+1, unnecessary re-renders, large bundles)
- Follows project CLAUDE.md and AGENTS.md conventions
- Commit messages follow project format
- PR description explains the "why"
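Some checklist items, notably the secrets check, can be partially automated. Below is a rough sketch that flags added diff lines resembling hardcoded credentials; the patterns are illustrative only, and a dedicated scanner is more reliable:

```python
import re

# Illustrative patterns only -- far from exhaustive.
SECRET_PATTERNS = [
    re.compile(r"(?i)(api[_-]?key|secret|password|token)\s*[:=]\s*['\"][^'\"]+['\"]"),
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID shape
]

def added_lines_with_secrets(diff_text: str) -> list[str]:
    """Return added unified-diff lines that look like hardcoded credentials."""
    hits = []
    for line in diff_text.splitlines():
        # Only inspect added lines; skip the "+++ b/file" header.
        if line.startswith("+") and not line.startswith("+++"):
            if any(p.search(line) for p in SECRET_PATTERNS):
                hits.append(line)
    return hits
```

Running this over `/tmp/pr-review/diff.patch` gives a quick first pass, but a clean result does not replace a manual look at configuration and test fixtures.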
## Best Practices

### Communication
- Frame feedback as suggestions, not demands
- Explain WHY something matters, not just WHAT is wrong
- Acknowledge good work — praise is part of review
- Prioritize: blockers first, nits last
### Efficiency
- Focus on what automated tools can't catch (logic, architecture, intent)
- Don't nitpick formatting — that's the linter's job
- For large PRs (>400 lines), suggest splitting
- Review in logical chunks (API layer, then UI, then tests)
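The >400-line guideline can be checked mechanically. A sketch that counts changed lines in the collected diff, assuming a standard unified diff such as the one written by `pr-diff.sh`:

```python
def changed_line_count(diff_text: str) -> int:
    """Count added/removed lines in a unified diff, excluding file headers."""
    return sum(
        1
        for line in diff_text.splitlines()
        if (line.startswith("+") or line.startswith("-"))
        and not line.startswith(("+++", "---"))
    )

def should_suggest_split(diff_text: str, limit: int = 400) -> bool:
    """Apply the ~400-changed-line guideline for suggesting a PR split."""
    return changed_line_count(diff_text) > limit
```

The threshold is a guideline, not a hard rule: a 500-line mechanical rename may be fine, while 300 lines of intertwined logic changes may still warrant splitting.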
## Platform Notes
- Gitea: Inline code comments are posted as regular PR comments with file/line references in the text
- GitHub: Full inline comment API available
- Both platforms support approve/reject/comment review actions via our scripts