PBI-9 + PBI-47: worktree foundation, product-worktrees, P0 fixes, PAUSED flow

Adds two interlocking PBIs:

PBI-9 — Worktree foundation + persistent product-worktrees for idea-jobs
  - src/git/worktree-paths.ts: centralised root + skip-set + lock-path helpers
  - src/git/file-lock.ts: proper-lockfile wrapper, deadlock-safe ordered acquire
  - src/git/product-worktree.ts: detached-HEAD worktree per product, .scratch/
    excluded via git rev-parse --git-path (handles linked .git file)
  - src/git/job-locks.ts: setupProductWorktrees + releaseLocksOnTerminal
  - wait-for-job.ts: idea-branch wires product-worktrees for IDEA_GRILL/MAKE_PLAN
  - update-job-status.ts + pbi-cascade.ts + stale-reset: release on all four
    server-side terminal transitions (DONE/FAILED/CANCELLED/stale)
  - cleanup-my-worktrees: skip _products/ + *.lock
  - README: worktrees section with single-host invariant + advisory-lock path

PBI-47 — Sprint-flow P0 corrections + PAUSED flow with rich pause_context
  - prisma schema: ClaudeJob.{base_sha,head_sha} + SprintRun.pause_context
  - tryClaimJob captures base_sha; prepareDoneUpdate captures head_sha
  - verify-task-against-plan diffs vs base_sha (no more origin/main fallback);
    rejects with MISSING_BASE_SHA when null — fixes per-task verify-scope P0
  - pr.ts: createPullRequest enableAutoMerge default false; new
    enableAutoMergeOnPr with --match-head-commit guard + 5-category typed
    EnableAutoMergeResult — fixes STORY auto-merge timing P0
  - src/flow/{effects,worktree-lease,pr-flow,sprint-run}.ts: pure transition
    modules + idempotent declarative effects executor
  - update-job-status: STORY auto-merge fires only on the last task of the
    story (story.status === DONE), with head_sha as merge guard; MERGE_CONFLICT
    routes to sprint-run flow which produces CREATE_CLAUDE_QUESTION +
    SET_SPRINT_RUN_STATUS effects with rich pause_context

Tests: 31 test files, 242 passing. Pure-transition tests cover STORY 3-task
auto-merge timing, SPRINT draft→ready, MERGE_CONFLICT pause/resume, file-lock
deadlock prevention, worktree-lease lifecycle, delete-only verify (ALIGNED),
per-job verify scope (base_sha isolation), 5-category auto-merge errors.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
This commit is contained in:
Madhura68 2026-05-06 21:09:48 +02:00
parent 454d96ee04
commit f7f5a487ec
29 changed files with 1731 additions and 46 deletions

View file

@ -335,3 +335,33 @@ npx @modelcontextprotocol/inspector node dist/index.js
- **Production database** — verify against a preview database before
  running against prod. The token check enforces user scope but does
  not gate reads of unrelated products you happen to be a member of.
## Worktrees
Scrum4Me-mcp uses git worktrees rooted at `~/.scrum4me-agent-worktrees/` (override via `SCRUM4ME_AGENT_WORKTREE_DIR`).
### Two kinds of worktrees
- **Per-job task-worktrees** (`<jobId>/`) — one per `TASK_IMPLEMENTATION` job. Created at claim, cleaned up on `DONE`/`FAILED`/`CANCELLED` via `cleanup_my_worktrees`.
- **Persistent product-worktrees** (`_products/<productId>/`) — one per product with `repo_url`, used by `IDEA_GRILL` and `IDEA_MAKE_PLAN`. **Detached HEAD on `origin/main`**, hard-reset at every job start. `.scratch/` holds throw-away work and is wiped on each claim.
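An idea-job wires this through `setupProductWorktrees` and releases on any terminal transition. A minimal usage sketch (the helpers are real and appear in `src/git/job-locks.ts`; `job` and `resolveRepoRoot` are simplified stand-ins):
```ts
import { setupProductWorktrees, releaseLocksOnTerminal } from './src/git/job-locks.js'

// Claim-time: lock the products, then create/reuse + hard-reset their worktrees.
// resolveRepoRoot is whatever maps a productId to a local clone; null skips it.
const worktrees = await setupProductWorktrees(job.id, job.productIds, resolveRepoRoot)
for (const { productId, worktreePath } of worktrees) {
  console.log(`product ${productId} ready at ${worktreePath} (detached on origin/main)`)
}
// Terminal transition (DONE/FAILED/CANCELLED/stale): release is idempotent.
await releaseLocksOnTerminal(job.id)
```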
### Concurrency: file-locks
Product-worktrees are serialised via `proper-lockfile` on `_products/<productId>.lock`. Two parallel idea-jobs on the same product wait for each other. For multi-product idea-jobs, locks are acquired in alphabetical order to prevent deadlocks.
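For illustration, a sketch of the multi-product case (helper names are real, from `src/git/file-lock.ts` and `src/git/worktree-paths.ts`; the product ids are invented):
```ts
import { acquireFileLocksOrdered } from './src/git/file-lock.js'
import { getProductWorktreeLockPath } from './src/git/worktree-paths.js'

// Multi-product idea-job: lock every product before touching any worktree.
// acquireFileLocksOrdered sorts the paths first, so a concurrent job passing
// ['prod-b', 'prod-a'] acquires in the same [prod-a, prod-b] order (no deadlock).
const release = await acquireFileLocksOrdered(
  ['prod-a', 'prod-b'].map(getProductWorktreeLockPath),
)
try {
  // ...sync worktrees, run the idea-job...
} finally {
  await release() // idempotent: safe to call again on a terminal transition
}
```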
### Single-host invariant
`proper-lockfile` only works when all MCP-server processes run on the same host. Migrate to Postgres `pg_advisory_lock` when:
- multiple MCP instances on different machines serve workers, or
- the worktree directory is shared over NFS/CIFS.
Migration path: replace `acquireFileLock` in `src/git/file-lock.ts` with a `pg_try_advisory_lock(hashtext(path)::bigint)` wrapper via the existing Prisma connection. The API stays identical.
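A minimal sketch of what that wrapper could look like (not shipped; `acquireAdvisoryLock` and the `DATABASE_URL` env var are assumptions). One caveat: Postgres advisory locks are session-scoped, so acquire and release must run on a single dedicated connection, which is why this sketch uses the already-present `pg` client directly rather than a pooled Prisma query:
```ts
import { Client } from 'pg'

// Sketch only: a possible drop-in for acquireFileLock once multi-host is needed.
export async function acquireAdvisoryLock(lockPath: string): Promise<() => Promise<void>> {
  // Dedicated connection: the advisory lock belongs to this session.
  const client = new Client({ connectionString: process.env.DATABASE_URL })
  await client.connect()
  // hashtext() maps the lock path to the same bigint key on every host;
  // pg_advisory_lock blocks until granted (the file variant retries 60x1s).
  await client.query('SELECT pg_advisory_lock(hashtext($1)::bigint)', [lockPath])
  let released = false
  return async () => {
    if (released) return // idempotent, mirroring the proper-lockfile wrapper
    released = true
    await client.query('SELECT pg_advisory_unlock(hashtext($1)::bigint)', [lockPath])
    await client.end()
  }
}
```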
### Manual cleanup
`cleanup_my_worktrees` skips `_products/` and `*.lock` automatically. To clean up a product-worktree manually (e.g. after a product is archived or its repo is renamed):
```bash
git worktree remove --force ~/.scrum4me-agent-worktrees/_products/<productId>
rm ~/.scrum4me-agent-worktrees/_products/<productId>.lock # if still present
```

View file

@ -73,6 +73,17 @@ describe('listWorktreeJobIds', () => {
mockReaddir.mockRejectedValue(Object.assign(new Error('ENOENT'), { code: 'ENOENT' }))
expect(await listWorktreeJobIds(WORKTREE_PARENT)).toEqual([])
})
it('skips _products/ system dir and *.lock files (PBI-9)', async () => {
mockReaddir.mockResolvedValue([
makeDirent('job-aaa'),
makeDirent('_products'),
makeDirent('product-abc.lock'),
makeDirent('job-bbb'),
])
const ids = await listWorktreeJobIds(WORKTREE_PARENT)
expect(ids).toEqual(['job-aaa', 'job-bbb'])
})
})
describe('cleanupWorktrees', () => {

View file

@ -0,0 +1,22 @@
import { describe, it, expect } from 'vitest'
import { executeEffects } from '../../src/flow/effects.js'
describe('effects executor', () => {
it('RELEASE_WORKTREE_LOCKS for unknown jobId is a no-op (no throw)', async () => {
const out = await executeEffects([{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'no-such-job' }])
expect(out).toEqual([])
})
it('multiple effects execute in order; failure in one is logged but does not abort', async () => {
const out = await executeEffects([
{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'a' },
{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'b' },
])
expect(out).toEqual([])
})
it('empty effects array returns empty outcomes', async () => {
const out = await executeEffects([])
expect(out).toEqual([])
})
})

View file

@ -0,0 +1,78 @@
import { describe, it, expect } from 'vitest'
import { transition, type PrFlowState } from '../../src/flow/pr-flow.js'
describe('pr-flow STORY-mode 3-tasks scenario', () => {
it('opens PR early; auto-merge only fires on the last task', () => {
let state: PrFlowState = { kind: 'none', strategy: 'STORY' }
const allEffects: Array<Record<string, unknown>> = []
// Task 1 DONE → PR_CREATED
let r = transition(state, { type: 'PR_CREATED', prUrl: 'https://github.com/o/r/pull/1' })
state = r.nextState
allEffects.push(...r.effects)
expect(state.kind).toBe('pr_opened')
expect(allEffects.filter((e) => e.type === 'ENABLE_AUTO_MERGE')).toHaveLength(0)
// Task 2 DONE → no STORY_COMPLETED yet, no transition emitted
r = transition(state, { type: 'TASK_DONE', taskId: 't2', headSha: 'abc123' })
state = r.nextState
allEffects.push(...r.effects)
expect(state.kind).toBe('pr_opened')
expect(allEffects.filter((e) => e.type === 'ENABLE_AUTO_MERGE')).toHaveLength(0)
// Task 3 DONE = STORY_COMPLETED → ENABLE_AUTO_MERGE with head guard
r = transition(state, { type: 'STORY_COMPLETED', storyId: 's1', headSha: 'def456' })
state = r.nextState
allEffects.push(...r.effects)
expect(state.kind).toBe('waiting_for_checks')
const enableEffects = allEffects.filter((e) => e.type === 'ENABLE_AUTO_MERGE')
expect(enableEffects).toHaveLength(1)
expect(enableEffects[0]).toMatchObject({ expectedHeadSha: 'def456' })
// CI green + merge OK
r = transition(state, { type: 'MERGE_RESULT' })
state = r.nextState
expect(state.kind).toBe('auto_merge_enabled')
})
it('CHECKS_FAILED → checks_failed (no pause)', () => {
const state: PrFlowState = {
kind: 'waiting_for_checks',
strategy: 'STORY',
prUrl: 'x',
headSha: 'y',
}
const r = transition(state, { type: 'MERGE_RESULT', reason: 'CHECKS_FAILED' })
expect(r.nextState.kind).toBe('checks_failed')
})
it('MERGE_CONFLICT → merge_conflict_paused', () => {
const state: PrFlowState = {
kind: 'waiting_for_checks',
strategy: 'STORY',
prUrl: 'x',
headSha: 'y',
}
const r = transition(state, { type: 'MERGE_RESULT', reason: 'MERGE_CONFLICT' })
expect(r.nextState.kind).toBe('merge_conflict_paused')
})
})
describe('pr-flow SPRINT-mode', () => {
it('draft stays draft until SPRINT_COMPLETED → MARK_PR_READY effect', () => {
let state: PrFlowState = { kind: 'none', strategy: 'SPRINT' }
let r = transition(state, { type: 'PR_CREATED', prUrl: 'x' })
expect(r.nextState.kind).toBe('draft_opened')
expect(r.effects).toHaveLength(0)
state = r.nextState
r = transition(state, { type: 'TASK_DONE', taskId: 't1', headSha: 'a' })
expect(r.nextState.kind).toBe('draft_opened')
expect(r.effects).toHaveLength(0)
state = r.nextState
r = transition(state, { type: 'SPRINT_COMPLETED', sprintRunId: 'sr1' })
expect(r.nextState.kind).toBe('ready_for_review')
expect(r.effects.filter((e) => e.type === 'MARK_PR_READY')).toHaveLength(1)
})
})

View file

@ -0,0 +1,82 @@
import { describe, it, expect } from 'vitest'
import { transition, type SprintRunState } from '../../src/flow/sprint-run.js'
describe('sprint-run pure transitions', () => {
it('queued + CLAIM_FIRST_JOB → running with SET_SPRINT_RUN_STATUS effect', () => {
const state: SprintRunState = { kind: 'queued', sprintRunId: 'sr1' }
const r = transition(state, { type: 'CLAIM_FIRST_JOB' })
expect(r.nextState.kind).toBe('running')
expect(r.effects).toEqual([
{ type: 'SET_SPRINT_RUN_STATUS', sprintRunId: 'sr1', status: 'RUNNING' },
])
})
it('running + MERGE_CONFLICT → paused_merge_conflict + 2 effects in order', () => {
const state: SprintRunState = { kind: 'running', sprintRunId: 'sr1' }
const r = transition(state, {
type: 'MERGE_CONFLICT',
prUrl: 'https://github.com/o/r/pull/1',
prHeadSha: 'abc123',
conflictFiles: ['a.ts', 'b.ts'],
resumeInstructions: 'Resolve and push',
})
expect(r.nextState.kind).toBe('paused_merge_conflict')
expect(r.effects).toHaveLength(2)
expect(r.effects[0].type).toBe('CREATE_CLAUDE_QUESTION')
expect(r.effects[1].type).toBe('SET_SPRINT_RUN_STATUS')
if (r.effects[1].type === 'SET_SPRINT_RUN_STATUS') {
expect(r.effects[1].status).toBe('PAUSED')
expect(r.effects[1].pauseContextDraft).toMatchObject({
pause_reason: 'MERGE_CONFLICT',
pr_url: 'https://github.com/o/r/pull/1',
pr_head_sha: 'abc123',
conflict_files: ['a.ts', 'b.ts'],
})
}
})
it('paused + USER_RESUMED → running + CLOSE_CLAUDE_QUESTION + clear pause_context', () => {
const state: SprintRunState = {
kind: 'paused_merge_conflict',
sprintRunId: 'sr1',
pauseContext: {
pause_reason: 'MERGE_CONFLICT',
pr_url: 'x',
pr_head_sha: 'y',
conflict_files: [],
claude_question_id: 'q1',
resume_instructions: 'r',
paused_at: new Date().toISOString(),
},
}
const r = transition(state, { type: 'USER_RESUMED' })
expect(r.nextState.kind).toBe('running')
expect(r.effects[0]).toEqual({ type: 'CLOSE_CLAUDE_QUESTION', questionId: 'q1' })
expect(r.effects[1]).toMatchObject({
type: 'SET_SPRINT_RUN_STATUS',
status: 'RUNNING',
clearPauseContext: true,
})
})
it('running + TASK_FAILED → failed (no PAUSE)', () => {
const state: SprintRunState = { kind: 'running', sprintRunId: 'sr1' }
const r = transition(state, { type: 'TASK_FAILED', taskId: 't1', error: 'CI red' })
expect(r.nextState.kind).toBe('failed')
expect(r.effects[0]).toMatchObject({ status: 'FAILED' })
})
it('running + ALL_DONE → done + SET_SPRINT_RUN_STATUS DONE', () => {
const state: SprintRunState = { kind: 'running', sprintRunId: 'sr1' }
const r = transition(state, { type: 'ALL_DONE' })
expect(r.nextState.kind).toBe('done')
expect(r.effects[0]).toMatchObject({ status: 'DONE' })
})
it('forbidden transition (running + CLAIM_FIRST_JOB) keeps state and emits no effects', () => {
const state: SprintRunState = { kind: 'running', sprintRunId: 'sr1' }
const r = transition(state, { type: 'CLAIM_FIRST_JOB' })
expect(r.nextState).toEqual(state)
expect(r.effects).toEqual([])
})
})

View file

@ -0,0 +1,82 @@
import { describe, it, expect } from 'vitest'
import { transition, type WorktreeLeaseState } from '../../src/flow/worktree-lease.js'
describe('worktree-lease pure transitions', () => {
it('idle + JOB_CLAIMED → acquiring_lock, no effects', () => {
const r = transition({ kind: 'idle' }, { type: 'JOB_CLAIMED', jobId: 'j1', productIds: ['p1'] })
expect(r.nextState.kind).toBe('acquiring_lock')
expect(r.effects).toEqual([])
})
it('acquiring_lock + LOCK_ACQUIRED → creating_or_reusing', () => {
const state: WorktreeLeaseState = {
kind: 'acquiring_lock',
jobId: 'j1',
productIds: ['p1'],
}
const r = transition(state, { type: 'LOCK_ACQUIRED' })
expect(r.nextState.kind).toBe('creating_or_reusing')
expect(r.effects).toEqual([])
})
it('acquiring_lock + LOCK_TIMEOUT → lock_timeout', () => {
const state: WorktreeLeaseState = {
kind: 'acquiring_lock',
jobId: 'j1',
productIds: ['p1'],
}
const r = transition(state, { type: 'LOCK_TIMEOUT' })
expect(r.nextState.kind).toBe('lock_timeout')
})
it('creating_or_reusing + WORKTREE_READY → syncing', () => {
const r = transition(
{ kind: 'creating_or_reusing', jobId: 'j1', productIds: ['p1'] },
{ type: 'WORKTREE_READY' },
)
expect(r.nextState.kind).toBe('syncing')
})
it('syncing + SYNC_DONE → ready (no release effect yet)', () => {
const r = transition(
{ kind: 'syncing', jobId: 'j1', productIds: ['p1'] },
{ type: 'SYNC_DONE' },
)
expect(r.nextState.kind).toBe('ready')
expect(r.effects).toEqual([])
})
it('syncing + SYNC_FAILED → sync_failed + RELEASE_WORKTREE_LOCKS effect', () => {
const r = transition(
{ kind: 'syncing', jobId: 'j1', productIds: ['p1'] },
{ type: 'SYNC_FAILED', error: 'boom' },
)
expect(r.nextState.kind).toBe('sync_failed')
expect(r.effects).toEqual([{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'j1' }])
})
it('ready + JOB_TERMINAL → releasing + RELEASE_WORKTREE_LOCKS effect', () => {
const r = transition(
{ kind: 'ready', jobId: 'j1', productIds: ['p1'] },
{ type: 'JOB_TERMINAL', jobId: 'j1' },
)
expect(r.nextState.kind).toBe('releasing')
expect(r.effects).toEqual([{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'j1' }])
})
it('ready + STALE_RESET → stale_released + RELEASE_WORKTREE_LOCKS effect', () => {
const r = transition(
{ kind: 'ready', jobId: 'j1', productIds: ['p1'] },
{ type: 'STALE_RESET', jobId: 'j1' },
)
expect(r.nextState.kind).toBe('stale_released')
expect(r.effects).toEqual([{ type: 'RELEASE_WORKTREE_LOCKS', jobId: 'j1' }])
})
it('forbidden transition (idle + LOCK_ACQUIRED) keeps state, no effects', () => {
const state: WorktreeLeaseState = { kind: 'idle' }
const r = transition(state, { type: 'LOCK_ACQUIRED' })
expect(r.nextState).toEqual(state)
expect(r.effects).toEqual([])
})
})

View file

@ -0,0 +1,96 @@
import { describe, it, expect, beforeEach, afterEach } from 'vitest'
import * as fs from 'node:fs/promises'
import * as os from 'node:os'
import * as path from 'node:path'
import { acquireFileLock, acquireFileLocksOrdered } from '../../src/git/file-lock.js'
describe('file-lock', () => {
let tmpDir: string
beforeEach(async () => {
tmpDir = await fs.mkdtemp(path.join(os.tmpdir(), 'file-lock-'))
})
afterEach(async () => {
await fs.rm(tmpDir, { recursive: true, force: true })
})
it('acquires and releases a lock; lockfile is gone after release', async () => {
const lockPath = path.join(tmpDir, 'a.lock')
const release = await acquireFileLock(lockPath)
// proper-lockfile creates a directory at <lockPath>.lock for the actual lock
const stat = await fs.stat(`${lockPath}.lock`).catch(() => null)
expect(stat).not.toBeNull()
await release()
// After release, the .lock dir should be gone
const after = await fs.stat(`${lockPath}.lock`).catch(() => null)
expect(after).toBeNull()
})
it('release is idempotent (second call is no-op)', async () => {
const lockPath = path.join(tmpDir, 'b.lock')
const release = await acquireFileLock(lockPath)
await release()
await expect(release()).resolves.toBeUndefined()
})
it('second acquire blocks until first release', async () => {
const lockPath = path.join(tmpDir, 'c.lock')
const release1 = await acquireFileLock(lockPath)
let secondAcquired = false
const second = acquireFileLock(lockPath).then((r) => {
secondAcquired = true
return r
})
// Give the second acquire a moment to attempt
await new Promise((r) => setTimeout(r, 200))
expect(secondAcquired).toBe(false)
await release1()
const release2 = await second
expect(secondAcquired).toBe(true)
await release2()
}, 10_000)
it('acquireFileLocksOrdered sorts paths alphabetically (deadlock-free for crossed sets)', async () => {
const a = path.join(tmpDir, 'A.lock')
const b = path.join(tmpDir, 'B.lock')
// Two concurrent multi-locks with crossed orders both sort to [A, B]
const r1Promise = acquireFileLocksOrdered([b, a])
// First should grab both since paths sort the same
const r1 = await r1Promise
let secondAcquired = false
const r2Promise = acquireFileLocksOrdered([a, b]).then((r) => {
secondAcquired = true
return r
})
await new Promise((r) => setTimeout(r, 200))
expect(secondAcquired).toBe(false)
await r1()
const r2 = await r2Promise
expect(secondAcquired).toBe(true)
await r2()
}, 15_000)
it('partial failure releases held locks', async () => {
// Force the second acquire to fail by writing a regular file at the lockfile
// location proper-lockfile wants to create as a directory.
const a = path.join(tmpDir, 'A.lock')
const bPath = path.join(tmpDir, 'B.lock')
// Create a regular file at `${bPath}.lock` so proper-lockfile's mkdir fails with EEXIST
await fs.writeFile(`${bPath}.lock`, 'blocked')
await expect(acquireFileLocksOrdered([a, bPath])).rejects.toThrow()
// After failure, A's lock should be released — re-acquire immediately
const r = await acquireFileLock(a)
await r()
}, 90_000)
})

View file

@ -0,0 +1,121 @@
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest'
import * as fs from 'node:fs/promises'
import * as os from 'node:os'
import * as path from 'node:path'
import { execFile } from 'node:child_process'
import { promisify } from 'node:util'
import {
registerJobLockReleases,
releaseLocksOnTerminal,
setupProductWorktrees,
_resetJobReleasesForTest,
} from '../../src/git/job-locks.js'
const exec = promisify(execFile)
describe('job-locks: registerJobLockReleases + releaseLocksOnTerminal', () => {
beforeEach(() => _resetJobReleasesForTest())
it('releaseLocksOnTerminal for unknown job is a no-op', async () => {
await expect(releaseLocksOnTerminal('nonexistent')).resolves.toBeUndefined()
})
it('runs registered releases and clears the entry', async () => {
const release = vi.fn().mockResolvedValue(undefined)
registerJobLockReleases('job-1', [release])
await releaseLocksOnTerminal('job-1')
expect(release).toHaveBeenCalledTimes(1)
// Second call → no-op (cleared)
await releaseLocksOnTerminal('job-1')
expect(release).toHaveBeenCalledTimes(1)
})
it('failures in one release do not abort others', async () => {
const r1 = vi.fn().mockRejectedValue(new Error('boom'))
const r2 = vi.fn().mockResolvedValue(undefined)
registerJobLockReleases('job-2', [r1, r2])
await expect(releaseLocksOnTerminal('job-2')).resolves.toBeUndefined()
expect(r1).toHaveBeenCalled()
expect(r2).toHaveBeenCalled()
})
it('append-mode: multiple registers accumulate', async () => {
const r1 = vi.fn().mockResolvedValue(undefined)
const r2 = vi.fn().mockResolvedValue(undefined)
registerJobLockReleases('job-3', [r1])
registerJobLockReleases('job-3', [r2])
await releaseLocksOnTerminal('job-3')
expect(r1).toHaveBeenCalledTimes(1)
expect(r2).toHaveBeenCalledTimes(1)
})
})
describe('job-locks: setupProductWorktrees', () => {
let tmpRoot: string
let originalEnv: string | undefined
let bareRepo: string
let originRepo: string
beforeEach(async () => {
_resetJobReleasesForTest()
tmpRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'job-locks-'))
originalEnv = process.env.SCRUM4ME_AGENT_WORKTREE_DIR
process.env.SCRUM4ME_AGENT_WORKTREE_DIR = path.join(tmpRoot, 'agent-worktrees')
// Set up a bare repo as origin and a clone with origin/main
bareRepo = path.join(tmpRoot, 'origin.git')
await exec('git', ['init', '--bare', '-b', 'main', bareRepo])
originRepo = path.join(tmpRoot, 'work')
await exec('git', ['init', '-b', 'main', originRepo])
await exec('git', ['config', 'user.email', 't@t.local'], { cwd: originRepo })
await exec('git', ['config', 'user.name', 'Test'], { cwd: originRepo })
await exec('git', ['remote', 'add', 'origin', bareRepo], { cwd: originRepo })
await fs.writeFile(path.join(originRepo, 'README.md'), '# init\n')
await exec('git', ['add', '-A'], { cwd: originRepo })
await exec('git', ['commit', '-m', 'init'], { cwd: originRepo })
await exec('git', ['push', '-u', 'origin', 'main'], { cwd: originRepo })
})
afterEach(async () => {
if (originalEnv) process.env.SCRUM4ME_AGENT_WORKTREE_DIR = originalEnv
else delete process.env.SCRUM4ME_AGENT_WORKTREE_DIR
await fs.rm(tmpRoot, { recursive: true, force: true })
})
it('returns empty when productIds is empty', async () => {
const result = await setupProductWorktrees('j1', [], async () => null)
expect(result).toEqual([])
})
it('creates a product-worktree, registers a lock-release, and releases it', async () => {
const result = await setupProductWorktrees('j2', ['prod-a'], async () => originRepo)
expect(result).toHaveLength(1)
expect(result[0].productId).toBe('prod-a')
expect(result[0].worktreePath).toContain('_products/prod-a')
// Worktree dir exists with detached HEAD on origin/main
const stat = await fs.stat(result[0].worktreePath)
expect(stat.isDirectory()).toBe(true)
// Lockfile is held during the job (proper-lockfile creates a .lock dir)
const lockDir = path.join(
process.env.SCRUM4ME_AGENT_WORKTREE_DIR!,
'_products',
'prod-a.lock.lock',
)
const lockStat = await fs.stat(lockDir).catch(() => null)
expect(lockStat).not.toBeNull()
await releaseLocksOnTerminal('j2')
const lockAfter = await fs.stat(lockDir).catch(() => null)
expect(lockAfter).toBeNull()
})
it('skips products where resolveRepoRoot returns null', async () => {
const result = await setupProductWorktrees('j3', ['no-repo'], async () => null)
expect(result).toEqual([])
// Lock was still acquired and registered — release cleans up
await releaseLocksOnTerminal('j3')
})
})

View file

@ -0,0 +1,75 @@
import { describe, it, expect, vi, beforeEach } from 'vitest'
// Mock node:child_process before importing the module under test
vi.mock('node:child_process', () => ({
execFile: vi.fn(),
}))
import { execFile } from 'node:child_process'
import { enableAutoMergeOnPr } from '../../src/git/pr.js'
const mockExecFile = vi.mocked(execFile) as unknown as ReturnType<typeof vi.fn>
function mockGhFailure(stderr: string) {
mockExecFile.mockImplementation(((_cmd: string, _args: string[], _opts: unknown, cb: any) => {
cb(Object.assign(new Error('gh exit'), { stderr }))
}) as never)
}
function mockGhSuccess() {
mockExecFile.mockImplementation(((_cmd: string, _args: string[], _opts: unknown, cb: any) => {
cb(null, { stdout: '', stderr: '' })
}) as never)
}
describe('enableAutoMergeOnPr — typed errors (PBI-47 C2 layer 1)', () => {
beforeEach(() => {
mockExecFile.mockReset()
})
it('returns ok:true on green merge', async () => {
mockGhSuccess()
const result = await enableAutoMergeOnPr({ prUrl: 'x', expectedHeadSha: 'sha1' })
expect(result.ok).toBe(true)
})
it('classifies GH_AUTH_ERROR for 401/403 / permission strings', async () => {
mockGhFailure('gh: HTTP 403: permission denied')
const result = await enableAutoMergeOnPr({ prUrl: 'x', expectedHeadSha: 'sha1' })
expect(result.ok).toBe(false)
if (!result.ok) expect(result.reason).toBe('GH_AUTH_ERROR')
})
it('classifies AUTO_MERGE_NOT_ALLOWED for repo-setting refusal', async () => {
mockGhFailure('auto-merge is not allowed for this repository')
const result = await enableAutoMergeOnPr({ prUrl: 'x', expectedHeadSha: 'sha1' })
expect(result.ok).toBe(false)
if (!result.ok) expect(result.reason).toBe('AUTO_MERGE_NOT_ALLOWED')
})
it('classifies MERGE_CONFLICT for dirty merge state', async () => {
mockGhFailure('pull request is not in a mergeable state (dirty)')
const result = await enableAutoMergeOnPr({ prUrl: 'x', expectedHeadSha: 'sha1' })
expect(result.ok).toBe(false)
if (!result.ok) expect(result.reason).toBe('MERGE_CONFLICT')
})
it('classifies UNKNOWN for unrecognised stderr', async () => {
mockGhFailure('unexpected gh error')
const result = await enableAutoMergeOnPr({ prUrl: 'x', expectedHeadSha: 'sha1' })
expect(result.ok).toBe(false)
if (!result.ok) expect(result.reason).toBe('UNKNOWN')
})
it('passes --match-head-commit when expectedHeadSha provided', async () => {
mockGhSuccess()
await enableAutoMergeOnPr({ prUrl: 'pr-url', expectedHeadSha: 'abc123' })
const callArgs = mockExecFile.mock.calls[0]
expect(callArgs[0]).toBe('gh')
const args = callArgs[1] as string[]
expect(args).toContain('--match-head-commit')
expect(args).toContain('abc123')
expect(args).toContain('--auto')
expect(args).toContain('--squash')
})
})

View file

@ -0,0 +1,59 @@
import { describe, it, expect } from 'vitest'
import { classifyDiffAgainstPlan } from '../../src/verify/classify.js'
describe('classify — delete-only commits (PBI-47 C5)', () => {
it('returns ALIGNED when the deleted path is in the plan', () => {
const diff = `diff --git a/app/todos/page.tsx b/app/todos/page.tsx
deleted file mode 100644
index 1234567..0000000
--- a/app/todos/page.tsx
+++ /dev/null
@@ -1,3 +0,0 @@
-export default function TodosPage() {
- return null
-}`
const plan = '- Delete `app/todos/page.tsx`\n- Delete related imports'
const result = classifyDiffAgainstPlan({ diff, plan })
expect(result.result).toBe('ALIGNED')
})
it('returns ALIGNED for multi-file delete-only when both paths in plan', () => {
const diff = `diff --git a/app/todos/page.tsx b/app/todos/page.tsx
deleted file mode 100644
--- a/app/todos/page.tsx
+++ /dev/null
@@ -1,2 +0,0 @@
-line 1
-line 2
diff --git a/components/todo-list.tsx b/components/todo-list.tsx
deleted file mode 100644
--- a/components/todo-list.tsx
+++ /dev/null
@@ -1,1 +0,0 @@
-line`
const plan = '- `app/todos/page.tsx`\n- `components/todo-list.tsx`'
const result = classifyDiffAgainstPlan({ diff, plan })
expect(result.result).toBe('ALIGNED')
})
it('returns PARTIAL when only some plan deletes appear in the diff', () => {
const diff = `diff --git a/a.ts b/a.ts
deleted file mode 100644
--- a/a.ts
+++ /dev/null
@@ -1,1 +0,0 @@
-x`
const plan = '- `a.ts`\n- `b.ts`' // b.ts missing
const result = classifyDiffAgainstPlan({ diff, plan })
expect(result.result).toBe('PARTIAL')
})
it('returns EMPTY for a no-op diff', () => {
const result = classifyDiffAgainstPlan({ diff: '', plan: 'irrelevant' })
expect(result.result).toBe('EMPTY')
})
})

View file

@ -0,0 +1,55 @@
import { describe, it, expect, beforeAll, afterAll } from 'vitest'
import * as fs from 'node:fs/promises'
import * as path from 'node:path'
import * as os from 'node:os'
import { execFile } from 'node:child_process'
import { promisify } from 'node:util'
import { getDiffInWorktree } from '../../src/tools/verify-task-against-plan.js'
const exec = promisify(execFile)
describe('verify scope per-job (PBI-47 P0)', () => {
let tmpRepo: string
let baseSha: string
let task1Sha: string
beforeAll(async () => {
tmpRepo = await fs.mkdtemp(path.join(os.tmpdir(), 'verify-scope-'))
await exec('git', ['init', '-b', 'main'], { cwd: tmpRepo })
await exec('git', ['config', 'user.email', 't@t.local'], { cwd: tmpRepo })
await exec('git', ['config', 'user.name', 'Test'], { cwd: tmpRepo })
await fs.writeFile(path.join(tmpRepo, 'README.md'), '# init\n')
await exec('git', ['add', '-A'], { cwd: tmpRepo })
await exec('git', ['commit', '-m', 'init'], { cwd: tmpRepo })
const baseRev = await exec('git', ['rev-parse', 'HEAD'], { cwd: tmpRepo })
baseSha = baseRev.stdout.trim()
// Simulate task 1: add a.ts
await fs.writeFile(path.join(tmpRepo, 'a.ts'), 'task 1\n')
await exec('git', ['add', '-A'], { cwd: tmpRepo })
await exec('git', ['commit', '-m', 'task 1'], { cwd: tmpRepo })
const t1Rev = await exec('git', ['rev-parse', 'HEAD'], { cwd: tmpRepo })
task1Sha = t1Rev.stdout.trim()
// Simulate task 2: add b.ts
await fs.writeFile(path.join(tmpRepo, 'b.ts'), 'task 2\n')
await exec('git', ['add', '-A'], { cwd: tmpRepo })
await exec('git', ['commit', '-m', 'task 2'], { cwd: tmpRepo })
})
afterAll(async () => {
await fs.rm(tmpRepo, { recursive: true, force: true })
})
it('diff vs base = origin/main → both task 1 and task 2 visible', async () => {
const diff = await getDiffInWorktree(tmpRepo, baseSha)
expect(diff).toContain('a.ts')
expect(diff).toContain('b.ts')
})
it('diff vs base = task1_sha → only task 2 visible', async () => {
const diff = await getDiffInWorktree(tmpRepo, task1Sha)
expect(diff).not.toContain('a.ts')
expect(diff).toContain('b.ts')
})
})

21
package-lock.json generated
View file

@ -13,7 +13,9 @@
"@modelcontextprotocol/sdk": "^1.29.0", "@modelcontextprotocol/sdk": "^1.29.0",
"@prisma/adapter-pg": "^7.8.0", "@prisma/adapter-pg": "^7.8.0",
"@prisma/client": "^7.8.0", "@prisma/client": "^7.8.0",
"@types/proper-lockfile": "^4.1.4",
"pg": "^8.13.1", "pg": "^8.13.1",
"proper-lockfile": "^4.1.2",
"yaml": "^2.8.4", "yaml": "^2.8.4",
"zod": "^4.0.0" "zod": "^4.0.0"
}, },
@ -1327,6 +1329,15 @@
"pg-types": "^2.2.0" "pg-types": "^2.2.0"
} }
}, },
"node_modules/@types/proper-lockfile": {
"version": "4.1.4",
"resolved": "https://registry.npmjs.org/@types/proper-lockfile/-/proper-lockfile-4.1.4.tgz",
"integrity": "sha512-uo2ABllncSqg9F1D4nugVl9v93RmjxF6LJzQLMLDdPaXCUIDPeOJ21Gbqi43xNKzBi/WQ0Q0dICqufzQbMjipQ==",
"license": "MIT",
"dependencies": {
"@types/retry": "*"
}
},
"node_modules/@types/react": { "node_modules/@types/react": {
"version": "19.2.14", "version": "19.2.14",
"resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz", "resolved": "https://registry.npmjs.org/@types/react/-/react-19.2.14.tgz",
@ -1338,6 +1349,12 @@
"csstype": "^3.2.2" "csstype": "^3.2.2"
} }
}, },
"node_modules/@types/retry": {
"version": "0.12.5",
"resolved": "https://registry.npmjs.org/@types/retry/-/retry-0.12.5.tgz",
"integrity": "sha512-3xSjTp3v03X/lSQLkczaN9UIEwJMoMCA1+Nb5HfbJEQWogdeQIyVtTvxPXDQjZ5zws8rFQfVfRdz03ARihPJgw==",
"license": "MIT"
},
"node_modules/@vitest/expect": { "node_modules/@vitest/expect": {
"version": "4.1.5", "version": "4.1.5",
"resolved": "https://registry.npmjs.org/@vitest/expect/-/expect-4.1.5.tgz", "resolved": "https://registry.npmjs.org/@vitest/expect/-/expect-4.1.5.tgz",
@ -2332,7 +2349,6 @@
"version": "4.2.11", "version": "4.2.11",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz", "resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
"integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==", "integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==",
"devOptional": true,
"license": "ISC" "license": "ISC"
}, },
"node_modules/grammex": { "node_modules/grammex": {
@ -3277,7 +3293,6 @@
"version": "4.1.2", "version": "4.1.2",
"resolved": "https://registry.npmjs.org/proper-lockfile/-/proper-lockfile-4.1.2.tgz", "resolved": "https://registry.npmjs.org/proper-lockfile/-/proper-lockfile-4.1.2.tgz",
"integrity": "sha512-TjNPblN4BwAWMXU8s9AEz4JmQxnD1NNL7bNOY/AKUzyamc379FWASUhc/K1pL2noVb+XmZKLL68cjzLsiOAMaA==", "integrity": "sha512-TjNPblN4BwAWMXU8s9AEz4JmQxnD1NNL7bNOY/AKUzyamc379FWASUhc/K1pL2noVb+XmZKLL68cjzLsiOAMaA==",
"devOptional": true,
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"graceful-fs": "^4.2.4", "graceful-fs": "^4.2.4",
@ -3289,7 +3304,6 @@
"version": "3.0.7", "version": "3.0.7",
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz", "resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.7.tgz",
"integrity": "sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==", "integrity": "sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==",
"devOptional": true,
"license": "ISC" "license": "ISC"
}, },
"node_modules/proxy-addr": { "node_modules/proxy-addr": {
@ -3444,7 +3458,6 @@
"version": "0.12.0", "version": "0.12.0",
"resolved": "https://registry.npmjs.org/retry/-/retry-0.12.0.tgz", "resolved": "https://registry.npmjs.org/retry/-/retry-0.12.0.tgz",
"integrity": "sha512-9LkiTwjUh6rT555DtE9rTX+BKByPfrMzEAtnlEtdEwr3Nkffwiihqe2bWADg+OQRjt9gl6ICdmB/ZFDCGAtSow==", "integrity": "sha512-9LkiTwjUh6rT555DtE9rTX+BKByPfrMzEAtnlEtdEwr3Nkffwiihqe2bWADg+OQRjt9gl6ICdmB/ZFDCGAtSow==",
"devOptional": true,
"license": "MIT", "license": "MIT",
"engines": { "engines": {
"node": ">= 4" "node": ">= 4"

View file

@ -32,7 +32,9 @@
"@modelcontextprotocol/sdk": "^1.29.0", "@modelcontextprotocol/sdk": "^1.29.0",
"@prisma/adapter-pg": "^7.8.0", "@prisma/adapter-pg": "^7.8.0",
"@prisma/client": "^7.8.0", "@prisma/client": "^7.8.0",
"@types/proper-lockfile": "^4.1.4",
"pg": "^8.13.1", "pg": "^8.13.1",
"proper-lockfile": "^4.1.2",
"yaml": "^2.8.4", "yaml": "^2.8.4",
"zod": "^4.0.0" "zod": "^4.0.0"
}, },

View file

@ -2,7 +2,6 @@ generator client {
provider = "prisma-client-js" provider = "prisma-client-js"
} }
datasource db { datasource db {
provider = "postgresql" provider = "postgresql"
} }
@ -314,6 +313,7 @@ model SprintRun {
failure_reason String?
failed_task Task? @relation("SprintRunFailedTask", fields: [failed_task_id], references: [id], onDelete: SetNull)
failed_task_id String?
pause_context Json?
created_at DateTime @default(now())
updated_at DateTime @updatedAt
jobs ClaudeJob[]
@ -385,6 +385,8 @@ model ClaudeJob {
cache_read_tokens Int?
cache_write_tokens Int?
plan_snapshot String?
base_sha String?
head_sha String?
branch String?
pr_url String?
summary String?

View file

@ -13,6 +13,7 @@ import {
getPullRequestState,
} from '../git/pr.js'
import { deleteRemoteBranch } from '../git/push.js'
import { releaseLocksOnTerminal } from '../git/job-locks.js'
export type CascadeOutcome = {
cancelled_job_ids: string[]
@ -88,6 +89,9 @@ async function runCascade(failedJobId: string): Promise<CascadeOutcome> {
error: 'cancelled_by_pbi_failure',
},
})
// PBI-9: release product-worktree locks for cancelled jobs.
// No-op for jobs without registered locks (TASK_IMPLEMENTATION).
for (const j of eligible) await releaseLocksOnTerminal(j.id)
}
const outcome: CascadeOutcome = {

192
src/flow/effects.ts Normal file
View file

@ -0,0 +1,192 @@
// PBI-9 + PBI-47: declarative effects produced by pure transitions.
// Executor handles each effect idempotently; failures are logged, not thrown.
export type PauseContext = {
pause_reason: 'MERGE_CONFLICT'
pr_url: string
pr_head_sha: string
conflict_files: string[]
claude_question_id: string
resume_instructions: string
paused_at: string
}
export type FlowEffect =
| { type: 'RELEASE_WORKTREE_LOCKS'; jobId: string }
| { type: 'ENABLE_AUTO_MERGE'; prUrl: string; expectedHeadSha: string }
| { type: 'MARK_PR_READY'; prUrl: string }
| {
type: 'CREATE_CLAUDE_QUESTION'
sprintRunId: string
prUrl: string
files: string[]
}
| { type: 'CLOSE_CLAUDE_QUESTION'; questionId: string }
| {
type: 'SET_SPRINT_RUN_STATUS'
sprintRunId: string
status: 'QUEUED' | 'RUNNING' | 'PAUSED' | 'DONE' | 'FAILED' | 'CANCELLED'
pauseContextDraft?: Omit<PauseContext, 'claude_question_id'>
clearPauseContext?: boolean
}
export type AutoMergeOutcome =
| { effect: 'ENABLE_AUTO_MERGE'; ok: true }
| {
effect: 'ENABLE_AUTO_MERGE'
ok: false
reason: 'CHECKS_FAILED' | 'MERGE_CONFLICT' | 'GH_AUTH_ERROR' | 'AUTO_MERGE_NOT_ALLOWED' | 'UNKNOWN'
stderr: string
}
/**
 * Execute a list of effects in order. Returns outcome objects only for
 * effects whose result the caller needs to react to (an auto-merge failure
 * triggers a MERGE_CONFLICT event in update-job-status). Other failures
 * are logged but swallowed.
 *
 * CREATE_CLAUDE_QUESTION → SET_SPRINT_RUN_STATUS chains: the question_id
 * created in the first effect is injected into the pause_context of the
 * second.
 */
export async function executeEffects(
effects: FlowEffect[],
): Promise<AutoMergeOutcome[]> {
const outcomes: AutoMergeOutcome[] = []
let lastQuestionId: string | undefined
for (const effect of effects) {
try {
if (effect.type === 'CREATE_CLAUDE_QUESTION') {
lastQuestionId = await createOrReuseClaudeQuestion(effect)
continue
}
if (effect.type === 'SET_SPRINT_RUN_STATUS') {
await applySprintRunStatus(effect, lastQuestionId)
continue
}
const outcome = await executeEffect(effect)
if (outcome) outcomes.push(outcome)
} catch (err) {
console.warn(`[effects] effect ${effect.type} failed (idempotent skip):`, err)
}
}
return outcomes
}
async function executeEffect(effect: FlowEffect): Promise<AutoMergeOutcome | undefined> {
switch (effect.type) {
case 'RELEASE_WORKTREE_LOCKS': {
const { releaseLocksOnTerminal } = await import('../git/job-locks.js')
await releaseLocksOnTerminal(effect.jobId)
return undefined
}
case 'ENABLE_AUTO_MERGE': {
const { enableAutoMergeOnPr } = await import('../git/pr.js')
const result = await enableAutoMergeOnPr({
prUrl: effect.prUrl,
expectedHeadSha: effect.expectedHeadSha,
})
if (result.ok) return { effect: 'ENABLE_AUTO_MERGE', ok: true }
return { effect: 'ENABLE_AUTO_MERGE', ok: false, reason: result.reason, stderr: result.stderr }
}
case 'MARK_PR_READY': {
const { markPullRequestReady } = await import('../git/pr.js')
const result = await markPullRequestReady({ prUrl: effect.prUrl })
if ('error' in result) {
console.warn(`[effects] MARK_PR_READY failed for ${effect.prUrl}: ${result.error}`)
}
return undefined
}
case 'CLOSE_CLAUDE_QUESTION': {
const { prisma } = await import('../prisma.js')
await prisma.claudeQuestion.updateMany({
where: { id: effect.questionId, status: 'open' },
data: { status: 'closed' },
})
return undefined
}
// CREATE_CLAUDE_QUESTION + SET_SPRINT_RUN_STATUS handled in executeEffects.
case 'CREATE_CLAUDE_QUESTION':
case 'SET_SPRINT_RUN_STATUS':
return undefined
}
}
async function createOrReuseClaudeQuestion(effect: {
sprintRunId: string
prUrl: string
files: string[]
}): Promise<string> {
const { prisma } = await import('../prisma.js')
// Reuse existing open question for the same SprintRun + PR if present.
const existing = await prisma.claudeQuestion.findFirst({
where: {
status: 'open',
options: { path: ['sprint_run_id'], equals: effect.sprintRunId } as never,
},
orderBy: { created_at: 'desc' },
select: { id: true },
})
if (existing) return existing.id
// Need product_id + asker (user) to create. Resolve via SprintRun.
const sprintRun = await prisma.sprintRun.findUnique({
where: { id: effect.sprintRunId },
select: {
started_by_id: true,
sprint: { select: { product_id: true } },
},
})
if (!sprintRun) {
throw new Error(`SprintRun ${effect.sprintRunId} not found`)
}
const fileList =
effect.files.length === 0
? '(unknown files — check the PR)'
: effect.files.slice(0, 5).join(', ')
+ (effect.files.length > 5 ? ` + ${effect.files.length - 5} more` : '')
const created = await prisma.claudeQuestion.create({
data: {
product_id: sprintRun.sprint.product_id,
asked_by: sprintRun.started_by_id,
question:
`Merge-conflict on ${effect.prUrl}. Conflict files: ${fileList}. `
+ `Resolve on the branch and push, then resume the sprint.`,
options: {
sprint_run_id: effect.sprintRunId,
pr_url: effect.prUrl,
conflict_files: effect.files,
},
status: 'open',
expires_at: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000), // 7 days
},
select: { id: true },
})
return created.id
}
async function applySprintRunStatus(
effect: Extract<FlowEffect, { type: 'SET_SPRINT_RUN_STATUS' }>,
lastQuestionId: string | undefined,
): Promise<void> {
const { prisma, Prisma } = await (async () => {
const mod = await import('../prisma.js')
const prismaPkg = await import('@prisma/client')
return { prisma: mod.prisma, Prisma: prismaPkg.Prisma }
})()
const data: Record<string, unknown> = { status: effect.status }
if (effect.pauseContextDraft && lastQuestionId) {
data.pause_context = {
...effect.pauseContextDraft,
claude_question_id: lastQuestionId,
}
}
if (effect.clearPauseContext) {
data.pause_context = Prisma.JsonNull
}
await prisma.sprintRun.update({ where: { id: effect.sprintRunId }, data })
}

110
src/flow/pr-flow.ts Normal file
View file

@ -0,0 +1,110 @@
import type { FlowEffect } from './effects.js'
import type { AutoMergeFailReason } from '../git/pr.js'
export type PrStrategy = 'STORY' | 'SPRINT'
export type PrFlowState =
| { kind: 'none'; strategy: PrStrategy }
| { kind: 'branch_pushed'; strategy: PrStrategy; prUrl?: string }
| { kind: 'pr_opened'; strategy: 'STORY'; prUrl: string }
| { kind: 'draft_opened'; strategy: 'SPRINT'; prUrl: string }
| { kind: 'waiting_for_checks'; strategy: 'STORY'; prUrl: string; headSha: string }
| { kind: 'auto_merge_enabled'; strategy: 'STORY'; prUrl: string; headSha: string }
| { kind: 'ready_for_review'; strategy: 'SPRINT'; prUrl: string }
| { kind: 'merged'; strategy: PrStrategy; prUrl: string }
| { kind: 'checks_failed'; strategy: PrStrategy; prUrl: string }
| { kind: 'merge_conflict_paused'; strategy: PrStrategy; prUrl: string; headSha: string }
export type PrFlowEvent =
| { type: 'PR_CREATED'; prUrl: string }
| { type: 'TASK_DONE'; taskId: string; headSha: string }
| { type: 'STORY_COMPLETED'; storyId: string; headSha: string }
| { type: 'SPRINT_COMPLETED'; sprintRunId: string }
| { type: 'MERGE_RESULT'; reason?: AutoMergeFailReason }
export type TransitionResult = { nextState: PrFlowState; effects: FlowEffect[] }
export function transition(state: PrFlowState, event: PrFlowEvent): TransitionResult {
if (state.strategy === 'STORY') {
switch (state.kind) {
case 'none':
case 'branch_pushed':
if (event.type === 'PR_CREATED') {
return {
nextState: { kind: 'pr_opened', strategy: 'STORY', prUrl: event.prUrl },
effects: [],
}
}
break
case 'pr_opened':
if (event.type === 'STORY_COMPLETED') {
return {
nextState: {
kind: 'waiting_for_checks',
strategy: 'STORY',
prUrl: state.prUrl,
headSha: event.headSha,
},
effects: [
{ type: 'ENABLE_AUTO_MERGE', prUrl: state.prUrl, expectedHeadSha: event.headSha },
],
}
}
break
case 'waiting_for_checks':
if (event.type === 'MERGE_RESULT' && !event.reason) {
return {
nextState: {
kind: 'auto_merge_enabled',
strategy: 'STORY',
prUrl: state.prUrl,
headSha: state.headSha,
},
effects: [],
}
}
if (event.type === 'MERGE_RESULT' && event.reason === 'MERGE_CONFLICT') {
return {
nextState: {
kind: 'merge_conflict_paused',
strategy: 'STORY',
prUrl: state.prUrl,
headSha: state.headSha,
},
effects: [],
}
}
if (event.type === 'MERGE_RESULT' && event.reason === 'CHECKS_FAILED') {
return {
nextState: { kind: 'checks_failed', strategy: 'STORY', prUrl: state.prUrl },
effects: [],
}
}
break
}
}
if (state.strategy === 'SPRINT') {
switch (state.kind) {
case 'none':
case 'branch_pushed':
if (event.type === 'PR_CREATED') {
return {
nextState: { kind: 'draft_opened', strategy: 'SPRINT', prUrl: event.prUrl },
effects: [],
}
}
break
case 'draft_opened':
if (event.type === 'SPRINT_COMPLETED') {
return {
nextState: { kind: 'ready_for_review', strategy: 'SPRINT', prUrl: state.prUrl },
effects: [{ type: 'MARK_PR_READY', prUrl: state.prUrl }],
}
}
break
}
}
return { nextState: state, effects: [] }
}

136
src/flow/sprint-run.ts Normal file
View file

@ -0,0 +1,136 @@
import type { FlowEffect, PauseContext } from './effects.js'
export type SprintRunStateKind =
| 'queued'
| 'running'
| 'paused_merge_conflict'
| 'done'
| 'failed'
| 'cancelled'
export type SprintRunState = {
kind: SprintRunStateKind
sprintRunId: string
pauseContext?: PauseContext
}
export type SprintRunEvent =
| { type: 'CLAIM_FIRST_JOB' }
| { type: 'TASK_DONE'; taskId: string }
| { type: 'TASK_FAILED'; taskId: string; error: string }
| {
type: 'MERGE_CONFLICT'
prUrl: string
prHeadSha: string
conflictFiles: string[]
resumeInstructions: string
}
| { type: 'USER_RESUMED' }
| { type: 'USER_CANCELLED' }
| { type: 'ALL_DONE' }
export type TransitionResult = { nextState: SprintRunState; effects: FlowEffect[] }
export function transition(state: SprintRunState, event: SprintRunEvent): TransitionResult {
switch (state.kind) {
case 'queued':
if (event.type === 'CLAIM_FIRST_JOB') {
return {
nextState: { ...state, kind: 'running' },
effects: [
{ type: 'SET_SPRINT_RUN_STATUS', sprintRunId: state.sprintRunId, status: 'RUNNING' },
],
}
}
break
case 'running':
if (event.type === 'TASK_DONE') {
return { nextState: state, effects: [] }
}
if (event.type === 'TASK_FAILED') {
return {
nextState: { ...state, kind: 'failed' },
effects: [
{ type: 'SET_SPRINT_RUN_STATUS', sprintRunId: state.sprintRunId, status: 'FAILED' },
],
}
}
if (event.type === 'ALL_DONE') {
return {
nextState: { ...state, kind: 'done' },
effects: [
{ type: 'SET_SPRINT_RUN_STATUS', sprintRunId: state.sprintRunId, status: 'DONE' },
],
}
}
if (event.type === 'MERGE_CONFLICT') {
const pauseContextDraft: Omit<PauseContext, 'claude_question_id'> = {
pause_reason: 'MERGE_CONFLICT',
pr_url: event.prUrl,
pr_head_sha: event.prHeadSha,
conflict_files: event.conflictFiles,
resume_instructions: event.resumeInstructions,
paused_at: new Date().toISOString(),
}
return {
nextState: { ...state, kind: 'paused_merge_conflict' },
effects: [
{
type: 'CREATE_CLAUDE_QUESTION',
sprintRunId: state.sprintRunId,
prUrl: event.prUrl,
files: event.conflictFiles,
},
{
type: 'SET_SPRINT_RUN_STATUS',
sprintRunId: state.sprintRunId,
status: 'PAUSED',
pauseContextDraft,
},
],
}
}
if (event.type === 'USER_CANCELLED') {
return {
nextState: { ...state, kind: 'cancelled' },
effects: [
{ type: 'SET_SPRINT_RUN_STATUS', sprintRunId: state.sprintRunId, status: 'CANCELLED' },
],
}
}
break
case 'paused_merge_conflict':
if (event.type === 'USER_RESUMED') {
const closeQuestionEffects: FlowEffect[] = state.pauseContext
? [{ type: 'CLOSE_CLAUDE_QUESTION', questionId: state.pauseContext.claude_question_id }]
: []
return {
nextState: { ...state, kind: 'running', pauseContext: undefined },
effects: [
...closeQuestionEffects,
{
type: 'SET_SPRINT_RUN_STATUS',
sprintRunId: state.sprintRunId,
status: 'RUNNING',
clearPauseContext: true,
},
],
}
}
if (event.type === 'USER_CANCELLED') {
return {
nextState: { ...state, kind: 'cancelled', pauseContext: undefined },
effects: [
{
type: 'SET_SPRINT_RUN_STATUS',
sprintRunId: state.sprintRunId,
status: 'CANCELLED',
clearPauseContext: true,
},
],
}
}
break
}
return { nextState: state, effects: [] }
}

103
src/flow/worktree-lease.ts Normal file
View file

@ -0,0 +1,103 @@
import type { FlowEffect } from './effects.js'
export type WorktreeLeaseState =
| { kind: 'idle' }
| { kind: 'acquiring_lock'; jobId: string; productIds: string[] }
| { kind: 'creating_or_reusing'; jobId: string; productIds: string[] }
| { kind: 'syncing'; jobId: string; productIds: string[] }
| { kind: 'ready'; jobId: string; productIds: string[] }
| { kind: 'releasing'; jobId: string }
| { kind: 'released'; jobId: string }
| { kind: 'lock_timeout'; jobId: string; productIds: string[] }
| { kind: 'sync_failed'; jobId: string; productIds: string[]; error: string }
| { kind: 'stale_released'; jobId: string }
export type WorktreeLeaseEvent =
| { type: 'JOB_CLAIMED'; jobId: string; productIds: string[] }
| { type: 'LOCK_ACQUIRED' }
| { type: 'LOCK_TIMEOUT' }
| { type: 'WORKTREE_READY' }
| { type: 'SYNC_DONE' }
| { type: 'SYNC_FAILED'; error: string }
| { type: 'JOB_TERMINAL'; jobId: string }
| { type: 'STALE_RESET'; jobId: string }
export type TransitionResult = {
nextState: WorktreeLeaseState
effects: FlowEffect[]
}
export function transition(
state: WorktreeLeaseState,
event: WorktreeLeaseEvent,
): TransitionResult {
switch (state.kind) {
case 'idle':
if (event.type === 'JOB_CLAIMED') {
return {
nextState: { kind: 'acquiring_lock', jobId: event.jobId, productIds: event.productIds },
effects: [],
}
}
break
case 'acquiring_lock':
if (event.type === 'LOCK_ACQUIRED') {
return {
nextState: { kind: 'creating_or_reusing', jobId: state.jobId, productIds: state.productIds },
effects: [],
}
}
if (event.type === 'LOCK_TIMEOUT') {
return {
nextState: { kind: 'lock_timeout', jobId: state.jobId, productIds: state.productIds },
effects: [],
}
}
break
case 'creating_or_reusing':
if (event.type === 'WORKTREE_READY') {
return {
nextState: { kind: 'syncing', jobId: state.jobId, productIds: state.productIds },
effects: [],
}
}
break
case 'syncing':
if (event.type === 'SYNC_DONE') {
return {
nextState: { kind: 'ready', jobId: state.jobId, productIds: state.productIds },
effects: [],
}
}
if (event.type === 'SYNC_FAILED') {
return {
nextState: {
kind: 'sync_failed',
jobId: state.jobId,
productIds: state.productIds,
error: event.error,
},
effects: [{ type: 'RELEASE_WORKTREE_LOCKS', jobId: state.jobId }],
}
}
break
case 'ready':
if (event.type === 'JOB_TERMINAL') {
return {
nextState: { kind: 'releasing', jobId: state.jobId },
effects: [{ type: 'RELEASE_WORKTREE_LOCKS', jobId: state.jobId }],
}
}
if (event.type === 'STALE_RESET') {
return {
nextState: { kind: 'stale_released', jobId: state.jobId },
effects: [{ type: 'RELEASE_WORKTREE_LOCKS', jobId: state.jobId }],
}
}
break
case 'releasing':
return { nextState: { kind: 'released', jobId: state.jobId }, effects: [] }
}
// Unknown or forbidden transition — keep current state, no effects
return { nextState: state, effects: [] }
}

38
src/git/file-lock.ts Normal file
View file

@ -0,0 +1,38 @@
import lockfile from 'proper-lockfile'
export async function acquireFileLock(lockPath: string): Promise<() => Promise<void>> {
const release = await lockfile.lock(lockPath, {
realpath: false,
stale: 30_000,
update: 5_000,
retries: { retries: 60, factor: 1, minTimeout: 1_000, maxTimeout: 1_000 },
})
let released = false
return async () => {
if (released) return
released = true
await release()
}
}
export async function acquireFileLocksOrdered(
lockPaths: string[],
): Promise<() => Promise<void>> {
const sorted = [...lockPaths].sort()
const releases: Array<() => Promise<void>> = []
try {
for (const p of sorted) {
releases.push(await acquireFileLock(p))
}
} catch (err) {
for (const r of releases.reverse()) {
await r().catch(() => {})
}
throw err
}
return async () => {
for (const r of releases.reverse()) {
await r().catch(() => {})
}
}
}

69
src/git/job-locks.ts Normal file
View file

@ -0,0 +1,69 @@
import * as fs from 'node:fs/promises'
import * as path from 'node:path'
import { acquireFileLocksOrdered } from './file-lock.js'
import {
getProductWorktreeLockPath,
getWorktreeRoot,
} from './worktree-paths.js'
import {
getOrCreateProductWorktree,
syncProductWorktree,
} from './product-worktree.js'
type JobReleases = Map<string, Array<() => Promise<void>>>
const jobReleases: JobReleases = new Map()
export async function setupProductWorktrees(
jobId: string,
productIds: string[],
resolveRepoRoot: (productId: string) => Promise<string | null>,
): Promise<Array<{ productId: string; worktreePath: string }>> {
if (productIds.length === 0) return []
// Ensure parent dir exists so lockfile creation succeeds
await fs.mkdir(path.join(getWorktreeRoot(), '_products'), { recursive: true })
// Lock-first, alphabetically sorted (deadlock prevention for multi-product idea-jobs)
const sorted = [...productIds].sort()
const lockPaths = sorted.map(getProductWorktreeLockPath)
const releaseAll = await acquireFileLocksOrdered(lockPaths)
registerJobLockReleases(jobId, [releaseAll])
// After lock-acquire, create/reuse worktrees and sync
const out: Array<{ productId: string; worktreePath: string }> = []
for (const productId of sorted) {
const repoRoot = await resolveRepoRoot(productId)
if (!repoRoot) continue
const { worktreePath } = await getOrCreateProductWorktree({ repoRoot, productId })
await syncProductWorktree({ worktreePath })
out.push({ productId, worktreePath })
}
return out
}
export function registerJobLockReleases(
jobId: string,
releases: Array<() => Promise<void>>,
): void {
const existing = jobReleases.get(jobId) ?? []
jobReleases.set(jobId, [...existing, ...releases])
}
export async function releaseLocksOnTerminal(jobId: string): Promise<void> {
const releases = jobReleases.get(jobId)
if (!releases) return // idempotent — already released or never locked
jobReleases.delete(jobId)
for (const release of releases) {
try {
await release()
} catch (err) {
console.warn(`[job-locks] release failed for job ${jobId}:`, err)
}
}
}
// For tests
export function _resetJobReleasesForTest(): void {
jobReleases.clear()
}

View file

@ -1,7 +1,7 @@
import { execFile } from 'node:child_process'
import { promisify } from 'node:util'
import * as path from 'node:path'
import * as os from 'node:os'
import { getWorktreeRoot } from './worktree-paths.js'
const exec = promisify(execFile)
@ -12,10 +12,17 @@ export async function createPullRequest(opts: {
body: string
/** Open as draft PR (a human must set it ready-for-review later). Default false. */
draft?: boolean
/** Enable auto-merge (squash). Default true. For sprint-mode: false. */
/**
* PBI-47 (P0): default changed to false. Auto-merge is now enabled
* separately via `enableAutoMergeOnPr` only on the **last task** of a
* STORY-mode story, with a head-SHA guard to prevent racing earlier
* task merges. Callers may still pass `true` for one-off PRs that
* are immediately ready to merge; in that case we use the new typed
* helper rather than the previous fire-and-forget gh call.
*/
enableAutoMerge?: boolean
}): Promise<{ url: string } | { error: string }> {
const { worktreePath, branchName, title, body, draft = false, enableAutoMerge = true } = opts
const { worktreePath, branchName, title, body, draft = false, enableAutoMerge = false } = opts
let url: string
try {
@ -40,21 +47,14 @@ export async function createPullRequest(opts: {
return { error: `gh pr create failed: ${msg.slice(0, 300)}` }
}
// Best-effort: enable auto-merge (squash) on the freshly created PR. If the
// repo doesn't have "Allow auto-merge" turned on, or the token lacks scope,
// gh exits non-zero and we just log. The PR is still valid; auto-merge can
// be turned on manually. We do NOT fail the whole createPullRequest call —
// the URL was successfully obtained which is the contract this returns.
// For draft + sprint-flow we skip this: the PR must first be set
// ready-for-review manually or via markPullRequestReady.
// Legacy opt-in: enableAutoMerge=true and not draft → fire the new typed
// helper without head-SHA guard (caller didn't supply one). Result is
// logged but not propagated — same shape as before.
if (enableAutoMerge && !draft) {
try {
await exec('gh', ['pr', 'merge', '--auto', '--squash', url], { cwd: worktreePath })
} catch (err) {
const stderr =
(err as { stderr?: string }).stderr ?? (err as Error).message ?? ''
console.warn(
`[createPullRequest] auto-merge enable failed for ${url}: ${stderr.slice(0, 200)}`,
)
}
const result = await enableAutoMergeOnPr({ prUrl: url, cwd: worktreePath })
if (!result.ok) {
console.warn(
`[createPullRequest] auto-merge enable failed for ${url}: ${result.reason} ${result.stderr.slice(0, 200)}`,
)
}
}
@ -62,6 +62,51 @@ export async function createPullRequest(opts: {
return { url }
}
export type AutoMergeFailReason =
| 'CHECKS_FAILED'
| 'MERGE_CONFLICT'
| 'GH_AUTH_ERROR'
| 'AUTO_MERGE_NOT_ALLOWED'
| 'UNKNOWN'
export type EnableAutoMergeResult =
| { ok: true }
| { ok: false; reason: AutoMergeFailReason; stderr: string }
function classifyAutoMergeError(stderr: string): AutoMergeFailReason {
if (/conflict|not in mergeable state|dirty/i.test(stderr)) return 'MERGE_CONFLICT'
if (/checks? failed|status check|required check/i.test(stderr)) return 'CHECKS_FAILED'
if (/authentication|HTTP 401|HTTP 403|permission|gh auth/i.test(stderr)) return 'GH_AUTH_ERROR'
if (/auto-?merge.*not.*allowed|auto-?merge.*disabled/i.test(stderr)) return 'AUTO_MERGE_NOT_ALLOWED'
return 'UNKNOWN'
}
+/**
+ * Enable auto-merge (squash) on a PR with an optional head-SHA guard.
+ *
+ * PBI-47 (P0): when `expectedHeadSha` is provided we pass `--match-head-commit`
+ * so GitHub only activates auto-merge if the remote head still matches the
+ * SHA the caller observed. This prevents racing late pushes from another
+ * worker triggering a merge of a different commit set.
+ */
+export async function enableAutoMergeOnPr(opts: {
+  prUrl: string
+  expectedHeadSha?: string
+  cwd?: string
+}): Promise<EnableAutoMergeResult> {
+  try {
+    const args = ['pr', 'merge', '--auto', '--squash']
+    if (opts.expectedHeadSha) args.push('--match-head-commit', opts.expectedHeadSha)
+    args.push(opts.prUrl)
+    await exec('gh', args, opts.cwd ? { cwd: opts.cwd } : {})
+    return { ok: true }
+  } catch (err) {
+    const stderr =
+      (err as { stderr?: string }).stderr ?? (err as Error).message ?? ''
+    return { ok: false, reason: classifyAutoMergeError(stderr), stderr: stderr.slice(0, 500) }
+  }
+}
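A call-site sketch for the guarded path (illustrative; `pauseSprintRun` is a hypothetical stand-in for the sprint-run routing shown later in this diff):

// observedHeadSha is whatever SHA the caller captured right after the push.
const result = await enableAutoMergeOnPr({
  prUrl,
  expectedHeadSha: observedHeadSha,
  cwd: worktreePath,
})
if (!result.ok) {
  // The five categories let callers route failures differently:
  // MERGE_CONFLICT pauses the sprint run; the rest stay warnings.
  if (result.reason === 'MERGE_CONFLICT') await pauseSprintRun(result) // hypothetical
  else console.warn(`auto-merge not enabled: ${result.reason}`)
}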
 // Move a draft PR to "ready for review". Used in sprint mode when all
 // stories in the SprintRun are DONE; the human reviews and merges it themselves.
 export async function markPullRequestReady(opts: {
@@ -163,8 +208,7 @@ export async function createRevertPullRequest(opts: {
     pbiCode,
   } = opts
 
-  const worktreeDir =
-    process.env.SCRUM4ME_AGENT_WORKTREE_DIR ?? path.join(os.homedir(), '.scrum4me-agent-worktrees')
+  const worktreeDir = getWorktreeRoot()
   const wtPath = path.join(worktreeDir, `revert-${jobId}`)
   const revertBranch = `revert/${originalBranch}-${jobId.slice(-8)}`

src/git/product-worktree.ts Normal file

@@ -0,0 +1,66 @@
+import { execFile } from 'node:child_process'
+import { promisify } from 'node:util'
+import * as fs from 'node:fs/promises'
+import * as path from 'node:path'
+import { getProductWorktreePath } from './worktree-paths.js'
+
+const exec = promisify(execFile)
+
+export async function getOrCreateProductWorktree(opts: {
+  repoRoot: string
+  productId: string
+}): Promise<{ worktreePath: string; created: boolean }> {
+  const worktreePath = getProductWorktreePath(opts.productId)
+  await fs.mkdir(path.dirname(worktreePath), { recursive: true })
+  try {
+    await fs.access(worktreePath)
+    return { worktreePath, created: false }
+  } catch {
+    // Path does not exist; create it below
+  }
+  await exec('git', ['fetch', 'origin', '--prune'], { cwd: opts.repoRoot })
+  await exec('git', ['worktree', 'add', '--detach', worktreePath, 'origin/main'], {
+    cwd: opts.repoRoot,
+  })
+  // Resolve the REAL exclude path (a linked worktree has .git as a file, not a directory)
+  const { stdout } = await exec('git', ['rev-parse', '--git-path', 'info/exclude'], {
+    cwd: worktreePath,
+  })
+  const excludePath = path.resolve(worktreePath, stdout.trim())
+  const existing = await fs.readFile(excludePath, 'utf8').catch(() => '')
+  if (!existing.split('\n').includes('.scratch/')) {
+    const sep = existing === '' || existing.endsWith('\n') ? '' : '\n'
+    await fs.appendFile(excludePath, `${sep}.scratch/\n`)
+  }
+  return { worktreePath, created: true }
+}
+export async function syncProductWorktree(opts: { worktreePath: string }): Promise<void> {
+  const { worktreePath } = opts
+  await exec('git', ['fetch', 'origin', '--prune'], { cwd: worktreePath })
+  await exec('git', ['reset', '--hard', 'origin/main'], { cwd: worktreePath })
+  await exec('git', ['clean', '-fd', '-e', '.scratch/'], { cwd: worktreePath })
+  // Wipe the contents of .scratch/, keep the directory itself
+  const scratch = path.join(worktreePath, '.scratch')
+  await fs.rm(scratch, { recursive: true, force: true })
+  await fs.mkdir(scratch, { recursive: true })
+}
+export async function removeProductWorktree(opts: {
+  repoRoot: string
+  productId: string
+}): Promise<{ removed: boolean }> {
+  const worktreePath = getProductWorktreePath(opts.productId)
+  try {
+    await exec('git', ['worktree', 'remove', '--force', worktreePath], {
+      cwd: opts.repoRoot,
+    })
+    return { removed: true }
+  } catch {
+    return { removed: false }
+  }
+}
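Typical claim-time use of these helpers, as wired up by the job-locks layer (illustrative sketch; `repoRoot` and `productId` come from the claiming job's context):

// At job claim: ensure the worktree exists, then force it back to a clean
// detached origin/main state. .scratch/ survives `git clean` via the
// info/exclude entry above, but is wiped explicitly on every claim.
const { worktreePath } = await getOrCreateProductWorktree({ repoRoot, productId })
await syncProductWorktree({ worktreePath })
// The idea-job then reads the repo and keeps throw-away files under
// path.join(worktreePath, '.scratch').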

src/git/worktree-paths.ts Normal file

@@ -0,0 +1,19 @@
+import * as os from 'node:os'
+import * as path from 'node:path'
+
+export const SYSTEM_WORKTREE_DIRS = new Set(['_products'])
+
+export function getWorktreeRoot(): string {
+  return (
+    process.env.SCRUM4ME_AGENT_WORKTREE_DIR
+    ?? path.join(os.homedir(), '.scrum4me-agent-worktrees')
+  )
+}
+
+export function getProductWorktreePath(productId: string): string {
+  return path.join(getWorktreeRoot(), '_products', productId)
+}
+
+export function getProductWorktreeLockPath(productId: string): string {
+  return path.join(getWorktreeRoot(), '_products', `${productId}.lock`)
+}
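Resolved paths for a hypothetical product id `p123`, assuming no env override:

getProductWorktreePath('p123')     // ~/.scrum4me-agent-worktrees/_products/p123
getProductWorktreeLockPath('p123') // ~/.scrum4me-agent-worktrees/_products/p123.lock
// SYSTEM_WORKTREE_DIRS ('_products') and the *.lock suffix are exactly what
// cleanup_my_worktrees skips when it enumerates per-job worktree dirs below.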

src/git/worktree.ts

@@ -1,8 +1,8 @@
 import { execFile } from 'node:child_process'
 import { promisify } from 'node:util'
 import * as path from 'node:path'
-import * as os from 'node:os'
 import * as fs from 'node:fs/promises'
+import { getWorktreeRoot } from './worktree-paths.js'
 
 const exec = promisify(execFile)
@@ -50,9 +50,7 @@ export async function createWorktreeForJob(opts: {
   const { repoRoot, jobId, baseRef = 'origin/main', reuseBranch = false } = opts
   let { branchName } = opts
 
-  const parent =
-    process.env.SCRUM4ME_AGENT_WORKTREE_DIR ??
-    path.join(os.homedir(), '.scrum4me-agent-worktrees')
+  const parent = getWorktreeRoot()
   await fs.mkdir(parent, { recursive: true })
@@ -121,9 +119,7 @@ export async function removeWorktreeForJob(opts: {
 }): Promise<{ removed: boolean }> {
   const { repoRoot, jobId, keepBranch = false } = opts
 
-  const parent =
-    process.env.SCRUM4ME_AGENT_WORKTREE_DIR ??
-    path.join(os.homedir(), '.scrum4me-agent-worktrees')
+  const parent = getWorktreeRoot()
   const worktreePath = path.join(parent, jobId)

cleanup-my-worktrees.ts

@@ -1,12 +1,11 @@
 import { z } from 'zod'
 import * as fs from 'node:fs/promises'
-import * as path from 'node:path'
-import * as os from 'node:os'
 import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'
 import { prisma } from '../prisma.js'
 import { requireWriteAccess } from '../auth.js'
 import { toolJson, withToolErrors } from '../errors.js'
 import { removeWorktreeForJob } from '../git/worktree.js'
+import { getWorktreeRoot, SYSTEM_WORKTREE_DIRS } from '../git/worktree-paths.js'
 import { resolveRepoRoot } from './wait-for-job.js'
 
 const TERMINAL_STATUSES = new Set(['DONE', 'FAILED', 'CANCELLED'])
@@ -15,16 +14,20 @@ const ACTIVE_STATUSES = new Set(['QUEUED', 'CLAIMED', 'RUNNING'])
 const inputSchema = z.object({})
 
 export async function getWorktreeParent(): Promise<string> {
-  return (
-    process.env.SCRUM4ME_AGENT_WORKTREE_DIR ??
-    path.join(os.homedir(), '.scrum4me-agent-worktrees')
-  )
+  return getWorktreeRoot()
 }
 
 export async function listWorktreeJobIds(worktreeParent: string): Promise<string[]> {
   try {
     const entries = await fs.readdir(worktreeParent, { withFileTypes: true })
-    return entries.filter((e) => e.isDirectory()).map((e) => e.name)
+    return entries
+      .filter(
+        (e) =>
+          e.isDirectory()
+          && !SYSTEM_WORKTREE_DIRS.has(e.name)
+          && !e.name.endsWith('.lock'),
+      )
+      .map((e) => e.name)
   } catch {
     return []
   }

update-job-status.ts

@@ -11,11 +11,30 @@ import { prisma } from '../prisma.js'
 import { requireWriteAccess } from '../auth.js'
 import { toolJson, toolError, withToolErrors } from '../errors.js'
 import { removeWorktreeForJob } from '../git/worktree.js'
+import { getWorktreeRoot } from '../git/worktree-paths.js'
+import { releaseLocksOnTerminal } from '../git/job-locks.js'
 import { resolveRepoRoot } from './wait-for-job.js'
 import { pushBranchForJob } from '../git/push.js'
 import { createPullRequest, markPullRequestReady } from '../git/pr.js'
 import { cancelPbiOnFailure } from '../cancel/pbi-cascade.js'
 import { propagateStatusUpwards } from '../lib/tasks-status-update.js'
+import { transition as prFlowTransition } from '../flow/pr-flow.js'
+import { transition as sprintRunTransition } from '../flow/sprint-run.js'
+import { executeEffects } from '../flow/effects.js'
+import { execFile as execFileCb } from 'node:child_process'
+import { promisify } from 'node:util'
+
+const execGh = promisify(execFileCb)
+
+async function fetchConflictFiles(prUrl: string): Promise<string[]> {
+  try {
+    const { stdout } = await execGh('gh', ['pr', 'view', prUrl, '--json', 'files'])
+    const parsed = JSON.parse(stdout) as { files?: Array<{ path: string }> }
+    return parsed.files?.map((f) => f.path) ?? []
+  } catch {
+    return []
+  }
+}
 
 const inputSchema = z.object({
   job_id: z.string().min(1),
@@ -85,26 +104,37 @@ export type DoneUpdatePlan = {
   branchOverride: string | undefined
   errorOverride: string | undefined
   skipWorktreeCleanup: boolean
+  headSha: string | undefined
 }
 
 export async function prepareDoneUpdate(
   jobId: string,
   branch: string | undefined,
 ): Promise<DoneUpdatePlan> {
-  const worktreeDir =
-    process.env.SCRUM4ME_AGENT_WORKTREE_DIR ?? path.join(os.homedir(), '.scrum4me-agent-worktrees')
+  const worktreeDir = getWorktreeRoot()
   const worktreePath = path.join(worktreeDir, jobId)
   const branchName = branch ?? `feat/job-${jobId.slice(-8)}`
   const pushResult = await pushBranchForJob({ worktreePath, branchName })
 
   if (pushResult.pushed) {
+    let headSha: string | undefined
+    try {
+      const { execFile } = await import('node:child_process')
+      const { promisify } = await import('node:util')
+      const exec = promisify(execFile)
+      const { stdout } = await exec('git', ['rev-parse', 'HEAD'], { cwd: worktreePath })
+      headSha = stdout.trim()
+    } catch (err) {
+      console.warn(`[prepareDoneUpdate] failed to resolve HEAD sha for job ${jobId}:`, err)
+    }
     return {
       dbStatus: 'DONE',
       pushedAt: new Date(),
      branchOverride: branchName,
       errorOverride: undefined,
       skipWorktreeCleanup: false,
+      headSha,
     }
   }
@@ -115,6 +145,7 @@ export async function prepareDoneUpdate(
       branchOverride: undefined,
       errorOverride: undefined,
       skipWorktreeCleanup: false,
+      headSha: undefined,
     }
   }
@@ -126,6 +157,7 @@ export async function prepareDoneUpdate(
     branchOverride: undefined,
     errorOverride: `push failed (${pushResult.reason}): ${snippet}`,
     skipWorktreeCleanup: true,
+    headSha: undefined,
   }
 }
@@ -376,6 +408,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
     let branchToWrite = branch
     let errorToWrite = error
     let skipWorktreeCleanup = false
+    let headShaToWrite: string | undefined
 
     if (status === 'done') {
       // M12: idea-jobs have no task/plan_snapshot/branch; skip the
@@ -401,6 +434,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
         if (plan.branchOverride !== undefined) branchToWrite = plan.branchOverride
         if (plan.errorOverride !== undefined) errorToWrite = plan.errorOverride
         skipWorktreeCleanup = plan.skipWorktreeCleanup
+        headShaToWrite = plan.headSha
       }
     }
@@ -414,9 +448,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
       job.kind === 'TASK_IMPLEMENTATION' &&
       job.task_id
     ) {
-      const worktreeDir =
-        process.env.SCRUM4ME_AGENT_WORKTREE_DIR ??
-        path.join(os.homedir(), '.scrum4me-agent-worktrees')
+      const worktreeDir = getWorktreeRoot()
       prUrl = await maybeCreateAutoPr({
         jobId: job_id,
         productId: job.product_id,
@@ -443,6 +475,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
       ...(summary !== undefined ? { summary } : {}),
       ...(errorToWrite !== undefined ? { error: errorToWrite } : {}),
       ...(prUrl !== null ? { pr_url: prUrl } : {}),
+      ...(headShaToWrite !== undefined ? { head_sha: headShaToWrite } : {}),
       ...(model_id !== undefined ? { model_id } : {}),
       ...(input_tokens !== undefined ? { input_tokens } : {}),
       ...(output_tokens !== undefined ? { output_tokens } : {}),
@@ -468,6 +501,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
     // sibling-cancel within the same SprintRun on FAILED.
     // Idea-jobs have no task_id and are skipped here.
     let sprintRunBecameDone = false
+    let storyBecameDone = false
     if (
       (actualStatus === 'done' || actualStatus === 'failed') &&
       job.kind === 'TASK_IMPLEMENTATION' &&
@@ -479,6 +513,7 @@ export function registerUpdateJobStatusTool(server: McpServer) {
           actualStatus === 'done' ? 'DONE' : 'FAILED',
         )
         sprintRunBecameDone = actualStatus === 'done' && propagation.sprintRunChanged
+        storyBecameDone = actualStatus === 'done' && propagation.storyChanged
       } catch (err) {
         console.warn(
           `[update_job_status] propagateStatusUpwards error for task ${job.task_id}:`,
@@ -487,6 +522,72 @@ export function registerUpdateJobStatusTool(server: McpServer) {
       }
     }
+    // PBI-47 (P0): STORY-mode auto-merge timing fix.
+    // Only enable auto-merge when this DONE was the *last* task of a STORY
+    // (story.status flipped to DONE) and pr_strategy === STORY. The
+    // pr-flow transition emits ENABLE_AUTO_MERGE with the head_sha guard.
+    if (
+      storyBecameDone &&
+      updated.pr_url &&
+      headShaToWrite &&
+      job.kind === 'TASK_IMPLEMENTATION'
+    ) {
+      const storyCtx = await prisma.claudeJob.findUnique({
+        where: { id: job_id },
+        select: {
+          task: { select: { story: { select: { status: true } } } },
+          sprint_run: { select: { pr_strategy: true } },
+        },
+      })
+      if (
+        storyCtx?.sprint_run?.pr_strategy === 'STORY'
+        && storyCtx.task?.story.status === 'DONE'
+      ) {
+        const result = prFlowTransition(
+          { kind: 'pr_opened', strategy: 'STORY', prUrl: updated.pr_url },
+          {
+            type: 'STORY_COMPLETED',
+            storyId: '',
+            headSha: headShaToWrite,
+          },
+        )
+        const outcomes = await executeEffects(result.effects)
+        // PBI-47 (C2): route MERGE_CONFLICT to sprint-run flow → PAUSED.
+        // Other reasons (CHECKS_FAILED, GH_AUTH_ERROR, AUTO_MERGE_NOT_ALLOWED, UNKNOWN)
+        // remain warnings; CHECKS_FAILED is already covered by the task-FAIL cascade.
+        for (const o of outcomes) {
+          if (o.effect === 'ENABLE_AUTO_MERGE' && !o.ok) {
+            console.warn(
+              `[update_job_status] auto-merge fail for ${updated.pr_url}: ${o.reason} ${o.stderr.slice(0, 200)}`,
+            )
+            if (o.reason === 'MERGE_CONFLICT') {
+              const sprintRunId = await prisma.claudeJob
+                .findUnique({
+                  where: { id: job_id },
+                  select: { sprint_run_id: true },
+                })
+                .then((j) => j?.sprint_run_id)
+              if (sprintRunId) {
+                const conflictFiles = await fetchConflictFiles(updated.pr_url)
+                const conflictResult = sprintRunTransition(
+                  { kind: 'running', sprintRunId },
+                  {
+                    type: 'MERGE_CONFLICT',
+                    prUrl: updated.pr_url,
+                    prHeadSha: headShaToWrite ?? '',
+                    conflictFiles,
+                    resumeInstructions:
+                      'Resolve the conflict on this branch, push, then resume the sprint via the UI.',
+                  },
+                )
+                await executeEffects(conflictResult.effects)
+              }
+            }
+          }
+        }
+      }
+    }
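A pure-transition test sketch for this timing fix (illustrative; the test runner and the exact effect field names are assumptions inferred from the call sites above):

import { describe, it, expect } from 'vitest' // assuming a vitest-style runner
import { transition } from '../src/flow/pr-flow.js'

describe('STORY auto-merge timing', () => {
  it('emits ENABLE_AUTO_MERGE with the head-SHA guard on STORY_COMPLETED', () => {
    const result = transition(
      { kind: 'pr_opened', strategy: 'STORY', prUrl: 'https://github.com/x/y/pull/1' },
      { type: 'STORY_COMPLETED', storyId: 's1', headSha: 'abc123' },
    )
    // Field names on the effect object are an assumption.
    expect(result.effects).toContainEqual(
      expect.objectContaining({ effect: 'ENABLE_AUTO_MERGE', headSha: 'abc123' }),
    )
  })
})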
     // SPRINT mode: when the sprint goes DONE, set the draft PR ready-for-review.
     // The human reviews and merges it themselves; no auto-merge in this mode.
     if (sprintRunBecameDone && updated.pr_url) {
@@ -581,6 +682,12 @@ export function registerUpdateJobStatusTool(server: McpServer) {
       await cancelPbiOnFailure(job_id)
     }
 
+    // PBI-9: release product-worktree locks on terminal transitions.
+    // No-op for jobs without registered locks (i.e. TASK_IMPLEMENTATION).
+    if (actualStatus === 'done' || actualStatus === 'failed') {
+      await releaseLocksOnTerminal(job_id)
+    }
+
     const queueCount = await prisma.claudeJob.count({
       where: { user_id: userId, status: 'QUEUED' },
     })

verify-task-against-plan.ts

@@ -15,8 +15,15 @@ const inputSchema = z.object({
   worktree_path: z.string().min(1),
 })
 
-export async function getDiffInWorktree(worktreePath: string): Promise<string> {
-  const { stdout } = await exec('git', ['diff', 'origin/main...HEAD'], { cwd: worktreePath })
+export async function getDiffInWorktree(
+  worktreePath: string,
+  baseSha?: string,
+): Promise<string> {
+  // PBI-47 (P0): when base_sha is provided, diff against the per-job base
+  // captured at claim-time so verify only sees the current task's changes.
+  // Falls back to origin/main only for legacy callers without base_sha.
+  const range = baseSha ? `${baseSha}...HEAD` : 'origin/main...HEAD'
+  const { stdout } = await exec('git', ['diff', range], { cwd: worktreePath })
   return stdout
 }
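The three-dot range diffs from the merge-base, and since base_sha is an ancestor of HEAD, the diff covers exactly the commits made after the claim. A sketch with hypothetical SHAs:

// Shared story branch, two tasks in sequence:
//   m1 -- m2 (origin/main)
//           \
//            t1 -- t2 (task A, already pushed) -- t3 -- t4 (task B, this job)
//
// base_sha for task B was captured at claim time and equals t2.
await getDiffInWorktree(worktreePath, 't2') // sees only t3..t4
await getDiffInWorktree(worktreePath)       // legacy: sees t1..t4 (too broad)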
@@ -58,7 +65,7 @@ export function registerVerifyTaskAgainstPlanTool(server: McpServer) {
           where: { status: { in: ['CLAIMED', 'RUNNING'] } },
           orderBy: { created_at: 'desc' },
           take: 1,
-          select: { id: true, plan_snapshot: true },
+          select: { id: true, plan_snapshot: true, base_sha: true },
         },
       },
     })
@@ -67,9 +74,19 @@ export function registerVerifyTaskAgainstPlanTool(server: McpServer) {
     const activeJob = task.claude_jobs[0] ?? null
 
+    // PBI-47 (P0): require base_sha so diff is scoped to this job's work,
+    // not the full origin/main...HEAD which would include sibling commits
+    // on a reused story/sprint branch.
+    if (activeJob && !activeJob.base_sha) {
+      return toolError(
+        'MISSING_BASE_SHA: This claim has no base_sha. '
+          + 'Re-claim the task (cancel + wait_for_job) so a fresh base_sha is captured.',
+      )
+    }
+
     let diff: string
     try {
-      diff = await getDiffInWorktree(worktree_path)
+      diff = await getDiffInWorktree(worktree_path, activeJob?.base_sha ?? undefined)
     } catch (err) {
       return toolError(
         `git diff failed in worktree (${worktree_path}): ${(err as Error).message ?? 'unknown error'}`,

wait-for-job.ts

@@ -7,10 +7,15 @@ import { Client } from 'pg'
 import * as fs from 'node:fs/promises'
 import * as os from 'node:os'
 import * as path from 'node:path'
+import { execFile } from 'node:child_process'
+import { promisify } from 'node:util'
 import { prisma } from '../prisma.js'
+
+const execFileP = promisify(execFile)
+
 import { requireWriteAccess } from '../auth.js'
 import { toolJson, toolError, withToolErrors } from '../errors.js'
 import { createWorktreeForJob } from '../git/worktree.js'
+import { setupProductWorktrees, releaseLocksOnTerminal } from '../git/job-locks.js'
 
 /** Parse `https://github.com/<owner>/<name>(.git)?` → `<name>`. */
 export function repoNameFromUrl(repoUrl: string | null | undefined): string | null {
@@ -181,6 +186,26 @@ export async function attachWorktreeToJob(
       branchName,
       reuseBranch: reused,
     })
+
+    // PBI-47 (P0): capture base_sha so verify_task_against_plan can diff
+    // against the claim-time HEAD instead of origin/main. For reused branches
+    // (siblings already pushed), base_sha = SHA of the worktree HEAD now.
+    // For fresh branches, base_sha = origin/main HEAD which createWorktreeForJob
+    // just checked out.
+    let baseSha: string | null = null
+    try {
+      const { stdout } = await execFileP('git', ['rev-parse', 'HEAD'], { cwd: worktreePath })
+      baseSha = stdout.trim()
+    } catch (err) {
+      console.warn(`[attachWorktreeToJob] failed to resolve base_sha for ${jobId}:`, err)
+    }
+    if (baseSha) {
+      await prisma.claudeJob.update({
+        where: { id: jobId },
+        data: { base_sha: baseSha },
+      })
+    }
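A compressed view of the SHA handshake this enables, using the names from this diff (summary sketch, not new code):

// claim (attachWorktreeToJob):  base_sha = git rev-parse HEAD  → ClaudeJob.base_sha
// verify (getDiffInWorktree):   git diff `${base_sha}...HEAD`  → only this job's work
// done (prepareDoneUpdate):     head_sha = git rev-parse HEAD  → ClaudeJob.head_sha
// merge (enableAutoMergeOnPr):  --match-head-commit head_sha   → guarded auto-merge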
     return { worktree_path: worktreePath, branch_name: actualBranch, reused_branch: reused }
   } catch (err) {
     await rollbackClaim(jobId)
@@ -234,6 +259,11 @@ export async function resetStaleClaimedJobs(userId: string): Promise<void> {
   if (failedRows.length === 0 && requeuedRows.length === 0) return
 
+  // PBI-9: release any product-worktree locks held by these stale jobs.
+  // No-op for jobs without registered locks (TASK_IMPLEMENTATION).
+  for (const j of failedRows) await releaseLocksOnTerminal(j.id)
+  for (const j of requeuedRows) await releaseLocksOnTerminal(j.id)
+
   // Notify UI via SSE for each transition (best-effort)
   try {
     const pg = new Client({ connectionString: process.env.DATABASE_URL })
@@ -371,6 +401,9 @@ async function getFullJobContext(jobId: string) {
       idea: {
         include: {
           pbi: { select: { id: true, code: true, title: true } },
+          secondary_products: {
+            include: { product: { select: { id: true, repo_url: true } } },
+          },
         },
       },
       product: { select: { id: true, name: true, repo_url: true, definition_of_done: true } },
@@ -384,6 +417,21 @@ async function getFullJobContext(jobId: string) {
   if (!job.idea) return null
   const { idea } = job
   const { getIdeaPromptText } = await import('../lib/idea-prompts.js')
+
+  // Set up persistent product-worktrees for this idea-job (PBI-9).
+  // Primary product is gated by repo_url via resolveRepoRoot returning null.
+  // Secondary products from IdeaProduct[] need an explicit repo_url filter.
+  const involvedProductIds: string[] = []
+  if (idea.product_id) involvedProductIds.push(idea.product_id)
+  for (const ip of idea.secondary_products ?? []) {
+    if (ip.product?.repo_url && !involvedProductIds.includes(ip.product_id)) {
+      involvedProductIds.push(ip.product_id)
+    }
+  }
+  const worktrees = involvedProductIds.length > 0
+    ? await setupProductWorktrees(job.id, involvedProductIds, (pid) => resolveRepoRoot(pid))
+    : []
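`setupProductWorktrees` lives in src/git/job-locks.ts and is not shown in this diff. A minimal sketch of the shape this call site implies, assuming per-product locks taken in sorted order (the `acquireLock` helper is hypothetical):

export async function setupProductWorktrees(
  jobId: string,
  productIds: string[],
  resolveRepoRoot: (productId: string) => Promise<string | null>,
): Promise<Array<{ productId: string; worktreePath: string }>> {
  const out: Array<{ productId: string; worktreePath: string }> = []
  // Sorted order gives every job the same acquisition order, so no deadlocks.
  for (const productId of [...productIds].sort()) {
    const repoRoot = await resolveRepoRoot(productId)
    if (!repoRoot) continue // product has no repo_url
    await acquireLock(getProductWorktreeLockPath(productId), jobId) // assumed helper
    const { worktreePath } = await getOrCreateProductWorktree({ repoRoot, productId })
    await syncProductWorktree({ worktreePath })
    out.push({ productId, worktreePath })
  }
  return out
}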
   return {
     job_id: job.id,
     kind: job.kind,
@@ -408,6 +456,11 @@ async function getFullJobContext(jobId: string) {
     repo_url: job.product.repo_url,
     prompt_text: getIdeaPromptText(job.kind),
     branch_suggestion: `feat/idea-${idea.code.toLowerCase()}-${job.kind === 'IDEA_GRILL' ? 'grill' : 'plan'}`,
+    product_worktrees: worktrees.map((w) => ({
+      product_id: w.productId,
+      worktree_path: w.worktreePath,
+    })),
+    primary_worktree_path: worktrees[0]?.worktreePath ?? null,
   }
 }