Merged
6 changes: 3 additions & 3 deletions .github/agents/implementation.agent.md
@@ -26,7 +26,7 @@ counting how many retries have occurred.

## RESEARCH State (start)

- Call the built-in @explore sub-agent with:
+ Call the built-in explore sub-agent with:

- **context**: the user's request and any current quality findings
- **goal**: analyze the implementation state and develop a plan to implement the request
@@ -35,7 +35,7 @@ Once the explore sub-agent finishes, transition to the DEVELOPMENT state.

## DEVELOPMENT State

- Call the @developer sub-agent with:
+ Call the developer sub-agent with:

- **context** the user's request and the current implementation plan
- **goal** implement the user's request and any identified quality fixes
@@ -47,7 +47,7 @@ Once the developer sub-agent finishes:

## QUALITY State

- Call the @quality sub-agent with:
+ Call the quality sub-agent with:

- **context** the user's request and the current implementation report
- **goal** check the quality of the work performed for any issues
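The three-state loop this agent file describes (RESEARCH → DEVELOPMENT → QUALITY, with a retry counter) can be sketched in Python. This is an illustrative sketch only: the `call_subagent` dispatcher, the `"PASS"` sentinel, and the `MAX_RETRIES` limit are assumptions, not part of the agent definition, which only says retries are counted.

```python
from dataclasses import dataclass

MAX_RETRIES = 3  # assumed limit; the agent file only says retries are counted


@dataclass
class ImplementationAgent:
    retries: int = 0
    state: str = "RESEARCH"

    def run(self, request: str) -> str:
        # RESEARCH: the explore sub-agent analyzes state and develops a plan
        plan = self.call_subagent("explore", context=request,
                                  goal="analyze state and develop a plan")
        self.state = "DEVELOPMENT"
        while True:
            # DEVELOPMENT: the developer sub-agent implements the plan
            # plus any identified quality fixes
            report = self.call_subagent("developer", context=request,
                                        goal="implement request and quality fixes")
            self.state = "QUALITY"
            # QUALITY: the quality sub-agent checks the work for issues
            findings = self.call_subagent("quality", context=report,
                                          goal="check quality of work")
            if findings == "PASS" or self.retries >= MAX_RETRIES:
                return findings
            self.retries += 1  # quality failed: loop back to DEVELOPMENT
            self.state = "DEVELOPMENT"

    def call_subagent(self, name: str, context: str, goal: str) -> str:
        # Placeholder: a real runner would dispatch to the named sub-agent.
        return "PASS"
```

The retry counter bounds the DEVELOPMENT↔QUALITY cycle so a persistently failing quality gate cannot loop forever.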
156 changes: 76 additions & 80 deletions .github/agents/quality.agent.md
@@ -13,76 +13,14 @@ DEMA Consulting standards and Continuous Compliance practices.

# Standards-Based Quality Assessment

- This assessment is a quality control system of the project and MUST be performed.
+ This assessment is a quality control system of the project and MUST be performed systematically.

1. **Analyze completed work** to identify scope and changes made
2. **Read relevant standards** from `.github/standards/` as defined in AGENTS.md based on work performed
- 3. **Execute comprehensive quality checks** across all compliance areas - EVERY checkbox item must be evaluated
+ 3. **Execute comprehensive quality assessment** using the structured evaluation criteria in the reporting template
4. **Validate tool compliance** using ReqStream, ReviewMark, and language tools
5. **Generate quality assessment report** with findings and recommendations

## Requirements Compliance

- [ ] Were requirements updated to reflect functional changes?
- [ ] Were new requirements created for new features?
- [ ] Do requirement IDs follow semantic naming standards?
- [ ] Were source filters applied appropriately for platform-specific requirements?
- [ ] Does ReqStream enforcement pass without errors?
- [ ] Is requirements traceability maintained to tests?

## Design Documentation Compliance

- [ ] Were design documents updated for architectural changes?
- [ ] Were new design artifacts created for new components?
- [ ] Are design decisions documented with rationale?
- [ ] Is system/subsystem/unit categorization maintained?
- [ ] Is design-to-implementation traceability preserved?

## Code Quality Compliance

- [ ] Are language-specific standards followed (from applicable standards files)?
- [ ] Are quality checks from standards files satisfied?
- [ ] Is code properly categorized (system/subsystem/unit/OTS)?
- [ ] Is appropriate separation of concerns maintained?
- [ ] Was language-specific tooling executed and passing?

## Testing Compliance

- [ ] Were tests created/updated for all functional changes?
- [ ] Is test coverage maintained for all requirements?
- [ ] Are testing standards followed (AAA pattern, etc.)?
- [ ] Does test categorization align with code structure?
- [ ] Do all tests pass without failures?

## Review Management Compliance

- [ ] Were review-sets updated to include new/modified files?
- [ ] Do file patterns follow include-then-exclude approach?
- [ ] Is review scope appropriate for change magnitude?
- [ ] Was ReviewMark tooling executed and passing?
- [ ] Were review artifacts generated correctly?

## Documentation Compliance

- [ ] Was README.md updated for user-facing changes?
- [ ] Were user guides updated for feature changes?
- [ ] Does API documentation reflect code changes?
- [ ] Was compliance documentation generated?
- [ ] Does documentation follow standards formatting?
- [ ] Is documentation organized under `docs/` following standard folder structure?
- [ ] Do Pandoc collections include proper `introduction.md` files with Purpose and Scope sections?
- [ ] Are auto-generated markdown files left unmodified?
- [ ] Do README.md files use absolute URLs and include concrete examples?
- [ ] Is documentation integrated into ReviewMark review-sets for formal review?

## Process Compliance

- [ ] Was Continuous Compliance workflow followed?
- [ ] Did all quality gates execute successfully?
- [ ] Were appropriate tools used for validation?
- [ ] Were standards consistently applied across work?
- [ ] Was compliance evidence generated and preserved?

# Reporting

Upon completion create a summary in `.agent-logs/[agent-name]-[subject]-[unique-id].md`
@@ -100,26 +38,84 @@ of the project consisting of:
- **Standards Applied**: [Standards files used for assessment]
- **Categories Evaluated**: [Quality check categories assessed]

## Quality Check Results

- **Requirements Compliance**: <PASS/FAIL> - [Summary]
- **Design Documentation**: <PASS/FAIL> - [Summary]
- **Code Quality**: <PASS/FAIL> - [Summary]
- **Testing Compliance**: <PASS/FAIL> - [Summary]
- **Review Management**: <PASS/FAIL> - [Summary]
- **Documentation**: <PASS/FAIL> - [Summary]
- **Process Compliance**: <PASS/FAIL> - [Summary]

## Findings

- **Issues Found**: [List of compliance issues]
- **Recommendations**: [Suggested improvements]
## Requirements Compliance: (PASS|FAIL|N/A)

- Were requirements updated to reflect functional changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Were new requirements created for new features? (PASS|FAIL|N/A) - [Evidence/Details]
- Do requirement IDs follow semantic naming standards? (PASS|FAIL|N/A) - [Evidence/Details]
- Do requirement files follow kebab-case naming convention? (PASS|FAIL|N/A) - [Evidence/Details]
- Are requirement files organized under `docs/reqstream/` with proper folder structure? (PASS|FAIL|N/A) - [Evidence/Details]
- Are OTS requirements properly placed in `docs/reqstream/ots/` subfolder? (PASS|FAIL|N/A) - [Evidence/Details]
- Were source filters applied appropriately for platform-specific requirements? (PASS|FAIL|N/A) - [Evidence/Details]
- Does ReqStream enforcement pass without errors? (PASS|FAIL|N/A) - [Evidence/Details]
- Is requirements traceability maintained to tests? (PASS|FAIL|N/A) - [Evidence/Details]

## Design Documentation Compliance: (PASS|FAIL|N/A)

- Were design documents updated for architectural changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Were new design artifacts created for new components? (PASS|FAIL|N/A) - [Evidence/Details]
- Do design folder names use kebab-case convention matching source structure? (PASS|FAIL|N/A) - [Evidence/Details]
- Are design files properly named ({subsystem-name}.md, {unit-name}.md patterns)? (PASS|FAIL|N/A) - [Evidence/Details]
- Is `docs/design/introduction.md` present with required Software Structure section? (PASS|FAIL|N/A) - [Evidence/Details]
- Are design decisions documented with rationale? (PASS|FAIL|N/A) - [Evidence/Details]
- Is system/subsystem/unit categorization maintained? (PASS|FAIL|N/A) - [Evidence/Details]
- Is design-to-implementation traceability preserved? (PASS|FAIL|N/A) - [Evidence/Details]

## Code Quality Compliance: (PASS|FAIL|N/A)

- Are language-specific standards followed (from applicable standards files)? (PASS|FAIL|N/A) - [Evidence/Details]
- Are quality checks from standards files satisfied? (PASS|FAIL|N/A) - [Evidence/Details]
- Is code properly categorized (system/subsystem/unit/OTS)? (PASS|FAIL|N/A) - [Evidence/Details]
- Is appropriate separation of concerns maintained? (PASS|FAIL|N/A) - [Evidence/Details]
- Was language-specific tooling executed and passing? (PASS|FAIL|N/A) - [Evidence/Details]

## Testing Compliance: (PASS|FAIL|N/A)

- Were tests created/updated for all functional changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Is test coverage maintained for all requirements? (PASS|FAIL|N/A) - [Evidence/Details]
- Are testing standards followed (AAA pattern, etc.)? (PASS|FAIL|N/A) - [Evidence/Details]
- Does test categorization align with code structure? (PASS|FAIL|N/A) - [Evidence/Details]
- Do all tests pass without failures? (PASS|FAIL|N/A) - [Evidence/Details]

## Review Management Compliance: (PASS|FAIL|N/A)

- Were review-sets updated to include new/modified files? (PASS|FAIL|N/A) - [Evidence/Details]
- Do file patterns follow include-then-exclude approach? (PASS|FAIL|N/A) - [Evidence/Details]
- Is review scope appropriate for change magnitude? (PASS|FAIL|N/A) - [Evidence/Details]
- Was ReviewMark tooling executed and passing? (PASS|FAIL|N/A) - [Evidence/Details]
- Were review artifacts generated correctly? (PASS|FAIL|N/A) - [Evidence/Details]

## Documentation Compliance: (PASS|FAIL|N/A)

- Was README.md updated for user-facing changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Were user guides updated for feature changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Does API documentation reflect code changes? (PASS|FAIL|N/A) - [Evidence/Details]
- Was compliance documentation generated? (PASS|FAIL|N/A) - [Evidence/Details]
- Does documentation follow standards formatting? (PASS|FAIL|N/A) - [Evidence/Details]
- Is documentation organized under `docs/` following standard folder structure? (PASS|FAIL|N/A) - [Evidence/Details]
- Do Pandoc collections include proper `introduction.md` files with Purpose and Scope sections? (PASS|FAIL|N/A) - [Evidence/Details]
- Are auto-generated markdown files left unmodified? (PASS|FAIL|N/A) - [Evidence/Details]
- Do README.md files use absolute URLs and include concrete examples? (PASS|FAIL|N/A) - [Evidence/Details]
- Is documentation integrated into ReviewMark review-sets for formal review? (PASS|FAIL|N/A) - [Evidence/Details]

## Process Compliance: (PASS|FAIL|N/A)

- Was Continuous Compliance workflow followed? (PASS|FAIL|N/A) - [Evidence/Details]
- Did all quality gates execute successfully? (PASS|FAIL|N/A) - [Evidence/Details]
- Were appropriate tools used for validation? (PASS|FAIL|N/A) - [Evidence/Details]
- Were standards consistently applied across work? (PASS|FAIL|N/A) - [Evidence/Details]
- Was compliance evidence generated and preserved? (PASS|FAIL|N/A) - [Evidence/Details]

## Overall Findings

- **Critical Issues**: [Count and description of critical findings]
- **Recommendations**: [Suggested improvements and next steps]
- **Tools Executed**: [Quality tools used for validation]

## Compliance Status

- - **Standards Adherence**: [Overall compliance rating]
- - **Quality Gates**: [Status of automated quality checks]
+ - **Standards Adherence**: [Overall compliance rating with specific standards]
+ - **Quality Gates**: [Status of automated quality checks with tool outputs]
```

Return this summary to the caller.
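Because every checklist line in the template has the fixed shape `question? STATUS - details`, the caller can tally results mechanically. A minimal sketch, assuming a filled-in report in which each `(PASS|FAIL|N/A)` placeholder has been replaced by a single status token:

```python
import re

# Matches filled-in checklist lines such as:
#   "- Do all tests pass without failures? FAIL - 2 integration tests failing"
LINE_RE = re.compile(
    r"^- (?P<question>.+?)\s+(?P<status>PASS|FAIL|N/A)\s*-\s*(?P<detail>.*)$"
)


def tally_report(text: str) -> dict:
    """Count PASS/FAIL/N/A checklist results in a quality report."""
    results = {"PASS": 0, "FAIL": 0, "N/A": 0}
    for line in text.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            results[m.group("status")] += 1
    return results


sample = """\
- Were tests created/updated for all functional changes? PASS - unit tests added
- Do all tests pass without failures? FAIL - 2 integration tests failing
- Were source filters applied appropriately? N/A - no platform-specific work
"""
```

A caller could treat any nonzero `FAIL` count as grounds for returning the implementation agent to its DEVELOPMENT state.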
142 changes: 142 additions & 0 deletions .github/standards/design-documentation.md
@@ -0,0 +1,142 @@
# Design Documentation Standards

This document defines DEMA Consulting standards for design documentation
within Continuous Compliance environments, extending the general technical
documentation standards with specific requirements for software design
artifacts.

# Core Principles

Design documentation serves as the bridge between requirements and
implementation, providing detailed technical specifications that enable:

- **Formal Code Review**: Reviewers can verify implementation matches design
- **Compliance Evidence**: Auditors can trace requirements through design to code
- **Maintenance Support**: Developers can understand system structure and interactions
- **Quality Assurance**: Testing teams can validate against detailed specifications

# Required Structure and Documents

Design documentation must be organized under `docs/design/` with folder structure
mirroring source code organization because reviewers need clear navigation from
design to implementation:

```text
docs/design/
├── introduction.md # Design overview with software structure
├── system.md # System-level design documentation
├── {subsystem-name}/ # Subsystem design documents (kebab-case folder names)
│ ├── {subsystem-name}.md # Subsystem overview and design
│ └── {unit-name}.md # Unit-level design documents
└── {unit-name}.md # Top-level unit design documents (if not in subsystem)
```

## introduction.md (MANDATORY)

The `introduction.md` file serves as the design entry point and MUST include
these sections because auditors need clear scope boundaries and architectural
overview:

### Purpose Section

Clear statement of the design document's purpose, audience, and regulatory
or compliance drivers.

### Scope Section

Define what software items are covered and what is explicitly excluded.
Specify version boundaries and applicability constraints.

### Software Structure Section (MANDATORY)

Include a text-based tree diagram showing the software organization across
System, Subsystem, and Unit levels. Agents MUST read `software-items.md`
to understand these classifications before creating this section.

Example format:

```text
ProjectName (System)
├── ComponentA (Subsystem)
│ ├── ClassX (Unit)
│ └── ClassY (Unit)
├── ComponentB (Subsystem)
│ └── ClassZ (Unit)
└── UtilityClass (Unit)
```
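A tree like the one above can be generated from a small data model, which keeps the Software Structure section synchronized with a single source of truth. The `Item` class and `render` helper below are hypothetical illustrations, not part of any DEMA tooling:

```python
from dataclasses import dataclass, field


@dataclass
class Item:
    name: str
    level: str  # "System", "Subsystem", or "Unit" per software-items.md
    children: list = field(default_factory=list)


def render(item: Item, indent: int = 0) -> list:
    """Render an Item hierarchy as indented 'Name (Level)' lines."""
    lines = [" " * indent + f"{item.name} ({item.level})"]
    for child in item.children:
        lines.extend(render(child, indent + 2))
    return lines


structure = Item("ProjectName", "System", [
    Item("ComponentA", "Subsystem",
         [Item("ClassX", "Unit"), Item("ClassY", "Unit")]),
    Item("UtilityClass", "Unit"),
])
```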

### Folder Layout Section (MANDATORY)

Include a text-based tree diagram showing how the source code folders
mirror the software structure, with file paths and brief descriptions.

Example format:

```text
src/ProjectName/
├── ComponentA/
│ ├── ClassX.cs — brief description
│ └── ClassY.cs — brief description
├── ComponentB/
│ └── ClassZ.cs — brief description
└── UtilityClass.cs — brief description
```

## system.md (MANDATORY)

The `system.md` file contains system-level design documentation including:

- System architecture and major components
- External interfaces and dependencies
- Data flow and control flow
- System-wide design constraints and decisions
- Integration patterns and communication protocols

## Subsystem and Unit Design Documents

For each subsystem identified in the software structure:

- Create a kebab-case folder matching the subsystem name (enables automated tooling)
- Include `{subsystem-name}.md` with subsystem overview and design
- Include unit design documents for complex units within the subsystem

For significant units requiring detailed design:

- Document data models, algorithms, and key methods
- Describe interactions with other units
- Include sufficient detail for formal code review
- Place in appropriate subsystem folder or at design root level

# Software Items Integration (CRITICAL)

Before creating design documentation, agents MUST:

1. **Read `.github/standards/software-items.md`** to understand System/Subsystem/Unit classifications
2. **Apply proper categorization** when creating software structure diagrams
3. **Ensure consistency** between software structure and folder layout
4. **Validate mapping** from design categories to source code organization
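Steps 3 and 4 above lend themselves to a mechanical check. The sketch below validates a `docs/design/` file listing against the folder and naming rules in this standard; the function name and the pure-listing interface (paths relative to `docs/design/`, no filesystem access) are assumptions for illustration:

```python
import re

KEBAB = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")


def check_design_layout(paths: list) -> list:
    """Validate a docs/design/ listing; returns problems (empty = clean)."""
    path_set = set(paths)
    problems = []
    # Mandatory entry-point and system-level documents
    for required in ("introduction.md", "system.md"):
        if required not in path_set:
            problems.append(f"missing {required}")
    # Every subsystem folder must be kebab-case and hold its overview doc
    folders = {p.split("/", 1)[0] for p in path_set if "/" in p}
    for folder in sorted(folders):
        if not KEBAB.match(folder):
            problems.append(f"folder not kebab-case: {folder}")
        if f"{folder}/{folder}.md" not in path_set:
            problems.append(f"missing overview {folder}/{folder}.md")
    return problems


layout = [
    "introduction.md",
    "system.md",
    "component-a/component-a.md",
    "component-a/class-x.md",
]
```

A check of this shape could run as a CI quality gate, feeding its findings into the quality agent's Design Documentation Compliance section.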

# Writing Guidelines

Design documentation must be technical and specific because it serves as the
implementation specification for formal code review:

- **Implementation Detail**: Provide sufficient detail for code review and implementation
- **Architectural Clarity**: Clearly define component boundaries and interfaces
- **Traceability**: Link to requirements where applicable using ReqStream patterns
- **Concrete Examples**: Use actual class names, method signatures, and data structures
- **Current Information**: Keep synchronized with code changes and refactoring

# Quality Checks

Before submitting design documentation, verify:

- [ ] `introduction.md` includes both Software Structure and Folder Layout sections
- [ ] Software structure correctly categorizes items as System/Subsystem/Unit per `software-items.md`
- [ ] Folder layout matches actual source code organization
- [ ] `system.md` provides comprehensive system-level design
- [ ] Subsystem folders use kebab-case naming matching source code
- [ ] Design documents contain sufficient implementation detail
- [ ] All documents follow technical documentation formatting standards
- [ ] Content is current with implementation and requirements
- [ ] Documents are integrated into ReviewMark review-sets for formal review