Basics
Maestro is designed to be used through conversation. You may be used to working with systems designed to handle narrowly scoped requests; think of Maestro as your engineering partner instead.
Sessions
Currently, all Maestro sessions are independent. A single session is a checkpointable, resumable artefact, used to create an Agent.
Memories
Dialog history, containing records of previous turns. All memories can be forgotten or compacted.
Files
Full support for source code and data in complex, enterprise-scale projects.
Tools
Comprehensive toolbox to manage and interact with files, gather information and accomplish goals.
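These concepts aren’t tied to concrete code anywhere in this page, but a minimal sketch can make the relationships clearer: a session owns the dialog history, can be checkpointed and resumed, and its memories can be forgotten or compacted. Every name below (`Session`, `Memory`, and their methods) is hypothetical and for illustration only; it is not Maestro’s actual interface.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the concepts above -- not Maestro's actual API.

@dataclass
class Memory:
    turn: int      # which dialog turn this record came from
    content: str   # what was said or done on that turn

@dataclass
class Session:
    memories: list[Memory] = field(default_factory=list)

    def checkpoint(self) -> "Session":
        """Snapshot the dialog history so the session can be resumed later."""
        return Session(memories=[Memory(m.turn, m.content) for m in self.memories])

    def forget(self, turn: int) -> None:
        """Drop one memory from the dialog history."""
        self.memories = [m for m in self.memories if m.turn != turn]

    def compact(self, summary: str) -> None:
        """Replace the full history with a condensed summary."""
        self.memories = [Memory(0, summary)]
```

The point of the sketch is only the shape: checkpoints snapshot state for resumption, while compaction trades full history for a summary.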
The Maestro Partnership Model
User’s Role: Quality Controller & Strategic Guide
- Set clear objectives and success criteria
- Challenge logical inconsistencies and shortcuts
- Enforce proper validation methodology
- Push back when standards aren’t met
Maestro’s Role: Technical Implementation Partner
- Deep technical implementation capability
- Systematic analysis and problem-solving
- Comprehensive testing and validation
- Learning from feedback and course-correction
The Dynamic That Works
Best Sessions Feature: Active user oversight with immediate feedback
- Users who catch errors early prevent larger mistakes
- Professional criticism improves output quality
- Insistence on evidence leads to better solutions
- Partnership creates better results than either alone
Complex Feature Implementation Phases
1. Discovery & Understanding
Goal: Deep comprehension of the existing system
Pattern: “Clone X and walk me through how subsystem Y works”
Success Criteria: Maestro demonstrates understanding of architecture, constraints, and integration points
Time Investment: Essential upfront work that prevents later architectural mistakes
2. Strategic Analysis
Goal: Identify the highest-value implementation approach
Pattern: “What are the 3 most valuable ways to extend this with Z?”
Success Criteria: Clear rationale for the chosen approach, understanding of alternatives
Risk Mitigation: Prevents over-engineering or choosing the wrong approach
3. Specification-Driven Development
Goal: Complete technical specification before implementation
Pattern: “Create a full spec for X, then implement it”
Success Criteria: Comprehensive spec covering edge cases, performance targets, testing requirements
Quality Gate: Implementation should follow the spec, not evolve organically
4. Implementation with Continuous Validation
Goal: Working implementation with proper integration
Success Criteria: Code compiles, basic functionality works, no obvious regressions
Early Warning: Watch for compilation issues and integration problems
5. Professional Validation
Goal: Systematic testing and performance validation
Pattern: “Validate your work systematically”
Success Criteria: All tests pass, performance meets targets, no regressions
User Vigilance Required: This is where user oversight is most critical
6. Comprehensive Integration
Goal: Complete test coverage, documentation updates, clean codebase
Pattern: “Run ALL tests, update docs, clean up WIP code”
Success Criteria: Production-ready code with full documentation
What Makes Sessions Successful
🎯 Clear Success Criteria
Best Practice: Define measurable outcomes upfront
- “Performance should exceed baseline by X%”
- “All existing tests must continue passing”
- “Implementation must be fully Redis-compatible”
🔍 Demand Evidence, Not Claims
Pattern: Always ask for validation
- ❌ Don’t Accept: “The implementation is performing well”
- ✅ Demand: “Show me benchmarks against the baseline using the same methodology”
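One way to make that demand actionable is to insist that performance claims reduce to a script anyone can rerun. The sketch below is illustrative only: the file names and the `ops_per_sec` field are placeholders, not anything Maestro produces.

```python
import json

# Hypothetical sketch: compare a candidate benchmark run against the
# baseline, using the same metric for both. File names and the
# "ops_per_sec" field are placeholder assumptions.

def load_ops(path: str) -> float:
    """Read the throughput metric from a benchmark result file."""
    with open(path) as f:
        return float(json.load(f)["ops_per_sec"])

baseline = load_ops("baseline.json")
candidate = load_ops("candidate.json")
delta_pct = (candidate - baseline) / baseline * 100

print(f"baseline:  {baseline:.0f} ops/s")
print(f"candidate: {candidate:.0f} ops/s ({delta_pct:+.1f}%)")

# Fail loudly instead of accepting "the implementation is performing well".
assert delta_pct >= 0, f"regression vs. baseline: {delta_pct:.1f}%"
```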
🚫 Never Accept Shortcuts
Quality Standards:
- Zero test failures tolerated
- Every performance claim must be validated
- All edge cases must be tested
- Regressions are unacceptable
🛠️ Use Existing Infrastructure
Principle: Don’t reinvent testing/benchmark tools
- “Use the existing test suite structure”
- “Run the benchmark scripts already in the codebase”
- “Follow the established patterns”
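In practice this often means driving the project’s own entry points rather than writing new harnesses. A hedged sketch, assuming a Python project with a pytest suite under `tests/` and a benchmark script at `scripts/run_benchmarks.py` (both paths are placeholders):

```python
import subprocess
import sys

# Hypothetical sketch: reuse the repo's existing test and benchmark
# entry points instead of inventing new ones. Paths are placeholders.
commands = [
    [sys.executable, "-m", "pytest", "tests/"],      # existing test suite
    [sys.executable, "scripts/run_benchmarks.py"],   # existing benchmark script
]

for cmd in commands:
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)  # raises on the first failure, surfacing regressions
```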

