Migrating 700 Tests With AI Assistance
How I used Claude Code to migrate 700 tests from Jest to Vitest in 2 days. The strategy, the gotchas, and the patterns that scaled.
Test migrations are the kind of project that nobody wants to do. They are repetitive, error-prone, and boring. You know the framework is better, you know the tests will run faster, but the thought of touching 700 test files -- each with slightly different patterns, edge cases, and quirks -- is enough to keep the ticket in the backlog forever.
That is exactly where this project sat for six months: a Jira ticket labeled "Migrate from Jest to Vitest" with the estimate "3 weeks, low priority." Then I decided to try it with Claude Code. Two days later, the migration was complete, all 700 tests were passing, and CI was green.
This is not a tutorial on Vitest. It is a case study on how AI handles large-scale, repetitive code transformations -- and where it struggles.
Key Takeaways
- Pattern-based migrations are AI's ideal task -- the changes are mechanical, repetitive, and follow consistent rules
- The strategy matters more than the prompts -- batch by similarity, not by directory, to maximize AI's pattern recognition
- 80% of tests migrated with zero manual intervention, 15% needed minor fixes, and 5% required human judgment
- The hardest part was not the syntax changes but the behavior differences between test runners -- timing, mocking, and module resolution
- Claude Code's ability to run tests after each batch and self-correct is what made the timeline possible
The Migration Landscape
What We Were Migrating
- 700 test files across a Next.js application
- Jest with TypeScript using ts-jest for compilation
- React Testing Library for component tests
- MSW (Mock Service Worker) for API mocking
- Custom test utilities for database fixtures
Why Vitest
Vitest runs tests 3-5x faster than Jest for TypeScript projects because it uses Vite's native ESM transformation instead of ts-jest. On our project, the full test suite went from 4 minutes to 50 seconds. That alone justified the migration.
The Strategy: Batch by Pattern, Not by Directory
My first instinct was to migrate directory by directory. Start with tests/utils/, then tests/components/, then tests/api/. This seems logical but is wrong for AI-assisted migration.
The better approach: batch by test pattern. Group tests that share the same structure and migrate them together. This lets Claude learn the pattern once and apply it consistently.
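To make the batching concrete, here is a minimal sketch of how files could be sorted into categories by inspecting their source text. The category names and regexes are my own illustration, not the exact heuristics used in the migration; the order matters because timer tests usually also contain mocks.

```typescript
// Hypothetical batching helper: assign a test file's source text to a
// migration category so that similar files are migrated together.
type Category = 'pure' | 'component' | 'mocking' | 'timers'

function categorize(source: string): Category {
  // Check the hardest patterns first: timer tests typically also use mocks.
  if (/jest\.(useFakeTimers|advanceTimersByTime)/.test(source)) return 'timers'
  if (/jest\.(mock|fn|spyOn)/.test(source)) return 'mocking'
  if (source.includes('@testing-library')) return 'component'
  return 'pure'
}
```

Running this over the repo first gives a rough count per category, which is also a sanity check on the batch sizes before any migration starts.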
Pattern Categories
Category 1: Pure function tests (250 tests)
Simple input/output tests with no mocking. The easiest category.
```ts
// Jest
import { slugify } from '../utils/slugify'

describe('slugify', () => {
  it('converts spaces to hyphens', () => {
    expect(slugify('hello world')).toBe('hello-world')
  })
})
```

```ts
// Vitest (minimal changes)
import { describe, it, expect } from 'vitest'
import { slugify } from '../utils/slugify'

describe('slugify', () => {
  it('converts spaces to hyphens', () => {
    expect(slugify('hello world')).toBe('hello-world')
  })
})
```
Change: Add explicit import { describe, it, expect } from 'vitest'. Everything else is identical.
Category 2: Component tests with React Testing Library (300 tests)
Tests that render React components and assert on DOM output.
```tsx
// Jest
import { render, screen } from '@testing-library/react'
import { SkillCard } from '../components/SkillCard'

test('renders skill title', () => {
  render(<SkillCard skill={mockSkill} />)
  expect(screen.getByText('My Skill')).toBeInTheDocument()
})
```

```tsx
// Vitest
import { describe, it, expect } from 'vitest'
import { render, screen } from '@testing-library/react'
import { SkillCard } from '../components/SkillCard'

it('renders skill title', () => {
  render(<SkillCard skill={mockSkill} />)
  expect(screen.getByText('My Skill')).toBeInTheDocument()
})
```
Change: Import test functions from vitest. Everything else is the same because React Testing Library works identically with both runners.
Category 3: Tests with mocking (100 tests)
Tests that use jest.fn(), jest.mock(), or jest.spyOn(). This is where differences emerge.
```ts
// Jest
jest.mock('../lib/supabase/client', () => ({
  createClient: jest.fn(() => mockSupabaseClient),
}))

const mockFn = jest.fn()
jest.spyOn(console, 'error').mockImplementation(() => {})
```

```ts
// Vitest
vi.mock('../lib/supabase/client', () => ({
  createClient: vi.fn(() => mockSupabaseClient),
}))

const mockFn = vi.fn()
vi.spyOn(console, 'error').mockImplementation(() => {})
```
Change: Replace jest.fn() with vi.fn(), jest.mock() with vi.mock(), jest.spyOn() with vi.spyOn().
Category 4: Tests with timers and async patterns (50 tests)
The hardest category. Jest's fake timer implementation and Vitest's have subtle differences.
The Execution: Day 1
Morning: Configuration and Category 1
I started by setting up the Vitest configuration and migrating the simplest tests.
```bash
claude "We're migrating from Jest to Vitest. Install vitest and configure it for
our Next.js + TypeScript project. Keep the existing test structure.
Here's our current jest.config.ts for reference."
```
Claude generated the vitest.config.ts, updated package.json scripts, and added the necessary dependencies. This took 15 minutes.
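For reference, a vitest.config.ts for this kind of setup typically looks something like the sketch below. The plugin choice, setup file path, and include glob are assumptions about this project, not the exact generated config:

```typescript
// Sketch of a vitest.config.ts for a Next.js + TypeScript project.
// The setup file path and include pattern are illustrative.
import { defineConfig } from 'vitest/config'
import react from '@vitejs/plugin-react'

export default defineConfig({
  plugins: [react()],
  test: {
    environment: 'jsdom',              // DOM APIs for React Testing Library
    globals: false,                    // keep explicit imports from 'vitest'
    setupFiles: ['./tests/setup.ts'],  // jest-dom matchers, MSW server, etc.
    include: ['tests/**/*.test.{ts,tsx}'],
  },
})
```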
Then, Category 1:
```bash
claude "Migrate all pure function tests to Vitest. The only change needed is
adding explicit imports from 'vitest' for describe, it, expect, and beforeEach.
Process all files matching tests/**/*.test.ts that do not import from
@testing-library or use jest.mock."
```
Claude processed 250 files in about 20 minutes. It added the import line to each file and changed nothing else. I ran the tests: 248 passed, 2 failed.
The 2 failures were tests that used jest.setTimeout() (needs to be vi.setConfig({ testTimeout: ... })). Claude fixed both in under a minute.
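The fix is a one-line swap, sketched here for reference (Jest's call shown as a comment since the two APIs cannot coexist in one file):

```typescript
import { vi } from 'vitest'

// Jest's per-file timeout:
//   jest.setTimeout(10_000)
// becomes the Vitest runtime config call:
vi.setConfig({ testTimeout: 10_000 })
```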
Afternoon: Category 2
```bash
claude "Migrate all React Testing Library tests to Vitest. Add vitest imports.
Do not change the RTL usage -- it's the same for both frameworks.
Process files matching tests/**/*.test.tsx."
```
300 files in 30 minutes. 295 passed immediately. The 5 failures were:
- 2 tests that used jest.requireActual() (changed to vi.importActual())
- 1 test that relied on Jest's automatic JSX transform configuration
- 2 tests with jest.useFakeTimers() (changed to vi.useFakeTimers())
Claude fixed all 5 with specific prompts for each failure pattern. Total time for Category 2: about 1.5 hours including fixes. For more on testing patterns, see the testing skills guide.
The Execution: Day 2
Morning: Category 3 (Mocking)
This was the category I expected to be hardest, and it was -- but not for the reason I expected.
```bash
claude "Migrate all tests that use jest.mock, jest.fn, or jest.spyOn to their
Vitest equivalents: vi.mock, vi.fn, vi.spyOn. Process all remaining test files."
```
The syntax replacement was mechanical and Claude handled it perfectly. 95 of 100 tests passed after the initial migration.
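The mechanical part of that replacement is simple enough to express as a string transform. This is an illustration of the rule, not a tool used in the migration (the actual rewriting went through Claude, and a real codemod would use an AST rather than regexes):

```typescript
// Minimal codemod sketch: rewrite the three mechanical jest.* calls to vi.*
// and prepend the vitest import. Only covers jest.fn / jest.mock / jest.spyOn.
function jestToVi(source: string): string {
  const rewritten = source.replace(/\bjest\.(fn|mock|spyOn)\b/g, 'vi.$1')
  if (rewritten === source) return source // nothing to migrate
  return `import { vi } from 'vitest'\n${rewritten}`
}
```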
The 5 failures were all caused by a behavioral difference: both runners hoist their mock calls to the top of the file, but Jest's factories may reference out-of-scope variables whose names start with mock, while a Vitest vi.mock() factory must be self-contained. Tests that declared mock implementations before vi.mock() needed restructuring.
```ts
// This works in Jest but not in Vitest
const mockRouter = { push: jest.fn(), back: jest.fn() }
jest.mock('next/navigation', () => ({ useRouter: () => mockRouter }))
```

```ts
// Vitest needs the factory to be self-contained
vi.mock('next/navigation', () => ({
  useRouter: () => ({ push: vi.fn(), back: vi.fn() })
}))
```
Claude understood the difference after I showed it the first failure and fixed the remaining 4 automatically.
Afternoon: Category 4 (Timers and Async)
The final 50 tests. These were the tests I had been dreading, and they lived up to the reputation.
Vitest's fake timer implementation handles setInterval differently from Jest's. Tests that relied on jest.advanceTimersByTime() needed adjustment:
```ts
// Jest
jest.useFakeTimers()
jest.advanceTimersByTime(1000)
expect(callback).toHaveBeenCalledTimes(1)
```

```ts
// Vitest (sometimes needs explicit tick)
vi.useFakeTimers()
vi.advanceTimersByTime(1000)
await vi.runAllTimersAsync() // This was needed for some async patterns
expect(callback).toHaveBeenCalledTimes(1)
```
40 of the 50 tests passed after Claude's initial migration. The remaining 10 required individual debugging. Claude fixed 7 of them after seeing the error messages. The final 3 I fixed manually because they involved complex async patterns where the root cause was ambiguous.
Results
| Category | Tests | Auto-Migrated | Minor Fix | Manual Fix |
|---|---|---|---|---|
| Pure functions | 250 | 248 (99%) | 2 (1%) | 0 |
| Component tests | 300 | 295 (98%) | 5 (2%) | 0 |
| Mocking tests | 100 | 95 (95%) | 5 (5%) | 0 |
| Timer/Async tests | 50 | 40 (80%) | 7 (14%) | 3 (6%) |
| Total | 700 | 678 (97%) | 19 (2.7%) | 3 (0.4%) |
97% of tests migrated without manual intervention. The entire project took 16 hours across 2 days. The original estimate was 3 weeks.
Lessons for Large-Scale AI Migrations
Batch by Similarity
Group files by the type of change needed, not by their location in the file system. Claude's pattern recognition works best when it processes many similar files consecutively.
Run Tests After Every Batch
Never migrate all files before running tests. Migrate a batch, test, fix, then move to the next batch. This catches issues early and prevents error accumulation.
Show Claude the First Failure
When a migration pattern fails, show Claude the specific error message. It will often identify the root cause and apply the fix to all similar files in the batch.
Know When to Fix Manually
The last 3 tests I fixed manually took about 20 minutes each. Claude had spent 15 minutes on each without resolving them. Recognizing when to take over manually saves time. For more on the overall workflow, see our AI dev workflow guide.
FAQ
Does this approach work for migrating between other test frameworks?
Yes. The pattern (categorize by change type, batch, migrate, test, fix) works for any test framework migration. Mocha to Jest, Jasmine to Vitest, or any other combination.
What about test coverage -- did any tests stop covering the right code?
I ran coverage reports before and after. Coverage was identical. The migration changed the test runner, not the test logic. The assertions and test scenarios remained the same.
How do you handle test fixtures and custom utilities?
Migrate those first, before the test files. They are shared dependencies that many tests rely on. If a fixture breaks, it cascades to every test that uses it.
Is it worth migrating if our test suite is small?
For under 50 tests, manual migration is probably faster than setting up the AI-assisted process. The AI approach pays off at scale -- the more repetitive files, the higher the ROI.
What about snapshot tests?
Snapshot tests need special handling. Vitest uses a different snapshot format than Jest. You need to regenerate all snapshots after migration by running vitest --update rather than trying to convert the snapshot files.
Explore production-ready AI skills at aiskill.market/browse or submit your own skill to the marketplace.
Sources
- Vitest Migration Guide - Official Jest-to-Vitest migration reference
- Vitest vs Jest - Feature comparison and differences
- React Testing Library - Framework-agnostic testing utilities