SEPTEMBER 2025 | Volume 46, Issue 3
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Technical Articles
- Kernel Model Validation: How To Do It, And Why You Should Care
- Confidence-Based Skip-Lot Sampling
- Eucalyptus – An Analysis Suite for Fault Trees with Uncertainty Quantification
- Digital Twins in Reliability Engineering: Innovations, Challenges and Opportunities
- Competence Measure Enhanced Ensemble Learning Voting Schemes
- Advancing the Test Science of LLM-enabled Systems: A Survey of Factors and Conditions that Matter Most
- Beyond Accuracy: Evaluating Bayesian Neural Networks in a Real-world Application
- Balancing Structure and Flexibility: Evaluating Agile, Waterfall, and Hybrid Methodologies in Aerospace and Defense Projects
Workforce of the Future
- Building Confidence, Interest, and Opportunity: A Social Cognitive Career Theory-Based Analysis of the Young Women in Engineering Outreach Program
News
- Association News
- Chapter News
- Corporate Member News
Balancing Structure and Flexibility: Evaluating Agile, Waterfall, and Hybrid Methodologies in Aerospace and Defense Projects

Maryam H. Gracias
Department of Systems Engineering, Colorado State University, 6029 Campus Delivery, Fort Collins, CO, USA 80523

Erika E. Gallegos
Associate Professor, Department of Systems Engineering, Colorado State University
Abstract
The Aerospace and Defense (A&D) industry operates within a uniquely demanding and highly regulated environment where the Waterfall methodology has traditionally been the preferred approach due to its structured processes and compliance-centric focus. Waterfall is a sequential development model in which each phase—requirements, design, implementation, testing, and deployment—must be completed before moving on to the next. It is often used in highly regulated fields because it provides clear documentation, predictable planning, and strict traceability. However, the increasing complexity of sustainment projects—characterized by evolving operational requirements and the need to manage costs—has highlighted the constraints of purely sequential development models. Agile, by contrast, is an iterative methodology that emphasizes flexibility, collaboration, and continuous feedback. It is designed to respond quickly to change, delivering value incrementally rather than all at once. While Agile methodologies offer enhanced adaptability, iterative development, and stakeholder engagement, they often struggle to satisfy the stringent documentation and certification standards inherent to A&D programs. Hybrid approaches have emerged as a middle ground, combining Waterfall’s structure with Agile’s adaptability. These methods allow critical compliance-driven activities (such as safety certification) to follow Waterfall, while more dynamic elements (such as software updates) can be managed with Agile cycles. This paper investigates the application of hybrid development frameworks that strategically integrate Waterfall’s rigorous governance with Agile’s flexibility, including the critical junctures of systems integration testing and performance testing. 
Using the SILA (Sustainment Integration for Legacy Aircraft) system as a representative case study (a pseudonym for a sustainment solution for aging aircraft subsystems), this research analyzes how hybrid approaches facilitate requirement traceability, continuous validation, and responsive change management. The findings demonstrate that hybrid frameworks can effectively balance the demands of regulatory compliance, structured systems engineering, and test and evaluation with operational agility, providing a viable path forward for complex sustainment projects within the A&D sector.
1.0 Introduction
The Aerospace and Defense (A&D) industry operates within a uniquely high stakes, regulated, and technically demanding environment. Systems are often large-scale, safety-critical, and subject to strict compliance, which has historically favored the adoption of the Waterfall methodology (Hiekata et al., 2016). Waterfall’s linear and sequential development model offers benefits such as traceability, predictable planning, and robust documentation—essential attributes for ensuring safety assurance, airworthiness certification, and regulatory compliance (Maharao, 2024). However, this model assumes stable requirements throughout the development lifecycle, which is rarely the case in complex sustainment projects. With growing pressures to extend the life of aging aircraft, respond quickly to operational changes, and accommodate emergent technologies, the limitations of a purely Waterfall approach have become increasingly evident.
In contrast, Agile methodologies have gained traction across multiple engineering domains due to their iterative nature, customer-centric development, and ability to rapidly incorporate changing requirements. Agile’s emphasis on frequent stakeholder feedback, minimal documentation overhead, and incremental value delivery makes it attractive in dynamic environments where change is constant (Fasching, 2023). Despite these advantages, Agile also presents challenges in the defense context. Its focus on flexibility and continuous iteration can clash with the rigid governance structures, safety constraints, and certification requirements that are non-negotiable in A&D programs. For example, systems often require configuration management, end-to-end validation, and third-party verification before release—activities that are traditionally Waterfall-oriented and harder to implement under Agile principles (Gupta et al., 2016). Put differently, Agile trades some of Waterfall’s upfront planning and documentation for speed and responsiveness, which works well in fast-changing software environments but can be difficult to align with defense programs that demand strict, traceable compliance.
Recognizing that neither methodology fully addresses the needs of contemporary A&D projects, organizations are increasingly adopting hybrid approaches (Gracias & Gallegos, 2024). Hybrid approaches combine the strengths of both models—retaining Waterfall’s rigor for highly regulated subsystems while applying Agile practices to less constrained areas, such as software diagnostics or iterative tool development. These approaches strategically integrate elements of both Waterfall and Agile frameworks, tailoring the methodology to suit the nature of the project, the maturity of the requirements, and the criticality of the system under development. Hybrid models provide the structure necessary for high-assurance components while also enabling agility in areas that require adaptability, such as software updates, or integration with emerging technologies.
A growing body of systems engineering research underscores the relevance of hybrid and model-based approaches in system sustainment and supply chain resilience. For example, Donelli et al. (2023) propose a concurrent value-driven framework that links system design to supply chain integration, which aligns with the strategic complexity observed in SILA (Sustainment Integration for Legacy Aircraft). Oliver et al. (2022) and Mousavi et al. (2022) highlight the importance of using systems thinking and Model-Based Systems Engineering (MBSE) to anticipate and mitigate disruptions in aerospace and semiconductor supply chains, respectively. Most directly relevant, Kazakevich and Joiner (2025) propose a framework that integrates Agile and Waterfall approaches, aiming to address the increasing prevalence of software-intensive systems while accommodating the entrenched cultural and procedural traditions of Waterfall-based acquisitions. These findings underscore the need for adaptable, model-driven development frameworks in mission-critical environments.
Moreover, Morgan et al. (2021) explore how modularity and software containers can be synergized with MBSE to proactively address system obsolescence – an increasingly important factor in sustainment-focused programs like the one presented in this paper. Together, these studies reinforce the importance of integrating systems-level thinking, Agile responsiveness, and MBSE-enabled traceability in sustainment operations – foundational assumptions that shape the methods and analytical lens applied throughout this case study.
This paper examines the potential of hybrid methodologies to overcome the limitations of single-model frameworks in the A&D sector, using the SILA system as a representative case study. In this paper, we refer to the subject sustainment project as SILA (Sustainment Integration for Legacy Aircraft), a pseudonym adopted to maintain confidentiality and comply with proprietary disclosure agreements. SILA is a sustainment-focused software system developed by an organization to support the ongoing maintenance of legacy aircraft platforms. Comparable subsystems of SILA were developed using Waterfall, Agile, and Hybrid methodologies, making it an ideal case study for comparing these methods. This case study is based on a real project but anonymized to protect proprietary and/or classified data. The SILA system is modeled using MagicDraw, a widely used modeling tool that supports the development of complex systems through a Model-Based Systems Engineering (MBSE) approach. MagicDraw enables the engineering team to represent SILA’s functional architecture, behavioral logic, and interface definitions in a structured, visual format, allowing for a clear and comprehensive understanding of the system at various levels of abstraction (Planas et al., 2020).
The SILA project involves both regulated, tightly controlled subsystems that benefit from Waterfall’s structured development and more fluid elements that require iterative change, frequent feedback, and rapid prototyping—characteristics aligned with Agile practices. For example, software updates to ground-based diagnostic tools may require quick turnarounds, whereas modifications to flight-critical hardware must adhere to certification and safety standards. In this dual-paced environment, a hybrid approach has allowed SILA teams to align development practices with system-specific needs, improving both responsiveness and compliance.
By analyzing the implementation of hybrid methodologies across different aspects of the SILA project, this paper aims to generate a deeper understanding of the conditions under which hybrid approaches are most effective. In particular, this study evaluates how Waterfall, Agile, and Hybrid models influence key performance factors in aerospace sustainment projects, including requirement change frequency, change management efficiency, and cost and time effectiveness. The paper also considers the tools and practices required to enable such integration, including governance structures, feedback loops, and role definition between Agile teams and compliance authorities.
1.1 Research Objectives
This research explores effective sustainment strategies within A&D projects by analyzing how hybrid development approaches—blending elements of both Waterfall and Agile—were applied in the SILA system to address evolving operational demands and regulatory requirements. The study aims to:
- Quantify the impact of Waterfall, Agile, and Hybrid approaches on requirement change frequency and cost/time efficiency.
- Identify bottlenecks in change management and sustainment workflows caused by rigid or poorly integrated requirement structures.
- Determine whether a Hybrid development approach can optimize sustainment performance while maintaining regulatory compliance and minimizing rework.
2.0 Methods
This study adopts a qualitative case study methodology to evaluate the use of three development approaches (Waterfall, Agile, and Hybrid) within the SILA system. SILA is a de-identified, real-world aircraft sustainment initiative that supports aging aircraft systems. By examining comparable subsystems developed under each approach, this study investigates how the choice of methodology affects change management processes, cost, and schedule efficiency within complex A&D environments.
The SILA system presents unique challenges in requirements management. Ensuring operational readiness while addressing part obsolescence, diminishing manufacturing sources, and evolving maintenance needs requires a development approach that balances stability and adaptability. The system exemplifies the organization’s commitment to supporting in-service aircraft long after production lines have closed. SILA’s core objectives include:
- Proactive Maintenance: Modeling subsystems, predicting failure points, and establishing preventive maintenance strategies to minimize unplanned aircraft downtime.
- Streamlined Sustainment Efforts: Managing spare parts, maintenance workflows, and system updates to keep aging aircraft operational and compliant with evolving mission requirements.
This research focuses on three distinct categories of SILA subsystems, classified by development approach:
- Waterfall SILA Subsystems: These subsystems followed a sequential, documentation-heavy process where all requirements were defined upfront, reviewed, and approved before design and implementation.
- Agile SILA Subsystems: These components were developed using iterative sprints, with evolving requirements, continuous integration, and active stakeholder feedback throughout development.
- Hybrid SILA Subsystems: These subsystems blended structured Waterfall practices for compliance-driven tasks with Agile practices for adaptability in less regulated updates. Specifically, requirement baseline establishment, formal validation, and final system qualification followed Waterfall methods, while design elaboration, prototyping, and subsystem integration used Agile cycles to allow iterative refinement.
The three classifications provide a direct comparison across development styles, helping identify trade-offs and potential best practices for sustainment-focused projects.
2.1 Research Design
A case study design was chosen for its effectiveness in providing rich, context-specific insights into complex engineering and organizational dynamics. The SILA system—designed to sustain legacy aircraft subsystems after production—offers a practical platform for analyzing how Hybrid methodologies operate under varying developmental constraints. By examining how both Waterfall and Agile methodologies were applied, and more importantly, how hybrid strategies emerged to bridge the two, the research captures critical lessons for future implementation.
This design enables an in-depth analysis of project execution, from managing evolving requirements to maintaining regulatory compliance. The research design also evaluates how hybrid methodologies influence cost, time, and sustainment outcomes, providing actionable recommendations for integrating flexible, scalable development models suitable for the dynamic demands of the A&D sector.
2.2 Hybrid Methodology Definition
This study categorizes SILA subsystems into three approaches: Waterfall SILA, Agile SILA, and Hybrid SILA. Each category represents a distinct methodological approach applied to comparable subsystems—ensuring that the analysis reflects differences attributable to development methodology.
The Hybrid SILA approach is defined as a deliberate and structured integration of select Waterfall and Agile practices, applied to balance the traceability and compliance requirements of the aerospace domain with the adaptability needed for evolving sustainment needs. The Hybrid model was an intentional architectural response to the limitations observed in using either methodology exclusively. Specifically, Hybrid SILA subsystems adopted the following Waterfall components:
- Formal requirements baselining and approval at project initiation, ensuring traceability to original stakeholder intents.
- Structured design reviews and verification gates to support compliance with safety and airworthiness standards.
- Rigorous configuration management and change control processes, aligning with aerospace regulatory expectations for documentation and auditability.
In parallel, the Agile components embedded in Hybrid SILA subsystems included:
- Use of sprint cycles and incremental development to accelerate prototyping and subsystem refinement.
- Continuous stakeholder engagement, including maintainers and field engineers, to rapidly incorporate operational feedback.
- Frequent internal demonstrations and backlog grooming, allowing early identification and resolution of emergent issues.
This hybridization allowed teams to partition development activities: high-assurance tasks (e.g., interface definition, regulatory compliance) were conducted using Waterfall practices, while lower-risk or iterative tasks (e.g., interface mockups, non-safety-critical logic updates) followed Agile workflows. In doing so, Hybrid SILA sought to maximize responsiveness to change without compromising system integrity or certification requirements.
The classification of subsystems into Waterfall, Agile, and Hybrid categories was derived through a combination of document analysis of lifecycle artifacts (e.g., requirement specifications, sprint logs, system review packages); interviews with engineering leads and project managers to understand the rationale behind methodological choices; and configuration reviews that identified the formal development practices applied to each subsystem. Data were collected over a 13-month period, allowing the capture of impacts such as change propagation, rework costs, and time-to-field metrics.
Each subsystem included in the study (see Table 1) was selected based on functional similarity and scope, ensuring valid comparisons across the three methodological approaches. This structured classification enabled a systematic evaluation of how development methodology influences outcomes such as requirement volatility, cost efficiency, and sustainment adaptability. In summary, the Hybrid SILA methodology represents a tailored engineering strategy that draws upon the predictability of Waterfall and the agility of iterative methods, designed specifically for the nuanced challenges of long-term aircraft subsystem sustainment.
Table 1: Summary of Subsystems and Methodologies Compared
| Lifecycle Element | Methodology | SILA Subsystem |
| --- | --- | --- |
| Requirement Changes | Waterfall | Propulsion system software |
| Requirement Changes | Agile | Propulsion system software |
| Requirement Changes | Hybrid | Propulsion system software |
| System Integration and Test | Waterfall | Integrated avionics management system |
| System Integration and Test | Agile | Flight management software suite |
| System Integration and Test | Hybrid | Aircraft health monitoring system |
| Performance Testing and Optimization | Waterfall | Autonomous navigation and guidance module |
| Performance Testing and Optimization | Agile | Engine performance monitoring system |
| Performance Testing and Optimization | Hybrid | Cybersecurity interface |
| Earned Value Management System | Waterfall | Flight Control Data |
| Earned Value Management System | Agile | Navigation Planning Processor |
| Earned Value Management System | Hybrid | Avionics Health and Diagnostics |
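For the comparative analysis that follows, Table 1 can be thought of as a simple lookup matrix. The sketch below is illustrative only (the dictionary layout and function name are our own, not part of the SILA toolchain); it indexes each studied subsystem by lifecycle element and methodology:

```python
# Table 1 as a nested mapping: lifecycle element -> methodology -> SILA subsystem.
# (Illustrative structure; per Section 3.1, the Requirement Changes comparison
# used three comparable propulsion subsystem projects, one per methodology.)
STUDY_MATRIX = {
    "Requirement Changes": {
        "Waterfall": "Propulsion system software",
        "Agile": "Propulsion system software",
        "Hybrid": "Propulsion system software",
    },
    "System Integration and Test": {
        "Waterfall": "Integrated avionics management system",
        "Agile": "Flight management software suite",
        "Hybrid": "Aircraft health monitoring system",
    },
    "Performance Testing and Optimization": {
        "Waterfall": "Autonomous navigation and guidance module",
        "Agile": "Engine performance monitoring system",
        "Hybrid": "Cybersecurity interface",
    },
    "Earned Value Management System": {
        "Waterfall": "Flight Control Data",
        "Agile": "Navigation Planning Processor",
        "Hybrid": "Avionics Health and Diagnostics",
    },
}

def subsystem_for(element: str, methodology: str) -> str:
    """Look up which SILA subsystem was studied for a given cell of Table 1."""
    return STUDY_MATRIX[element][methodology]
```

Each lifecycle element is paired with functionally similar subsystems across the three methodologies, which is what licenses the row-wise comparisons in Section 3.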
3.0 Results
The cost estimates have been rounded to the nearest half million to protect sensitive data while maintaining the integrity of the comparative analysis. This rounding preserves relative trends and outcomes across Waterfall, Agile, and Hybrid methodologies without disclosing precise financial figures.
3.1 SILA System Requirement Changes Comparison
Table 2 presents a comparative analysis of how propulsion system software requirements for the SILA system were managed under Waterfall, Agile, and Hybrid methodologies, drawing on three comparable propulsion subsystem projects developed independently under each approach. Each row outlines how key engineering and project management factors were handled; requirement scopes were functionally equivalent to allow fair benchmarking of time, cost, rework, and stakeholder outcomes.
The Waterfall-based subsystem experienced late-stage requirement gaps and minimal stakeholder input, resulting in a 9-month timeline (2 months beyond the planned schedule) and a total cost of $4 million, including $1 million in rework due to inflexible architecture and delayed validation. In contrast, the Agile approach supported iterative refinement of requirements, particularly for diagnostics and fault detection, enabled by early stakeholder involvement and simulation-in-the-loop validation. This led to on-time delivery in 15 weeks and a reduced total cost of $2 million, with $800K saved by avoiding the late-stage defects and user interface mismatches seen in Waterfall. While Agile was both the fastest and most cost-effective, the Hybrid approach yielded a strategic middle ground: it combined upfront planning with controlled iteration, completing delivery in 18 weeks at the same total cost of $2 million. Hybrid limited rework to $400K by identifying requirement changes early (e.g., via model reuse and staged updates) and improving stakeholder alignment through monthly feedback sessions; this also enabled a 35% reduction in V&V duration, with only one defect escaping testing. Though not as rapid as Agile, Hybrid demonstrated superior control and consistency, traits critical for sustaining complex legacy systems, by balancing structure and flexibility to mitigate risks and align with regulatory and operational constraints. All cost savings and timeline deviations are measured against the contractual estimates provided to the client. The data support that while Agile maximizes speed and cost savings, Hybrid offers a robust alternative for environments that demand adaptability without sacrificing system integrity.
Table 2: System Requirement Changes of the SILA system for Propulsion System Software
| Factor | Waterfall | Agile | Hybrid |
| --- | --- | --- | --- |
| Requirement Stability | Initial requirement: “The subsystem must support fixed-point diagnostics for engine anomalies using predefined fault codes.” Issue: Inflexible design led to a 3-month delay and $1M in rework when unexpected sensor types were introduced. | Evolved requirement: “The diagnostic engine must process both fixed and variable fault data.” Result: Refined over 6 iterations in 8 weeks, with no delay. | Combined initial planning with staged updates. Result: Adaptable requirement set integrated after 2 iterations, completing changes in 10 weeks with only $250K in adjustments. |
| Change Management | Example change: Add fault detection for variable-speed compressor anomalies. Process: Took 6 months due to documentation and formal review cycles. Impact: $500K in rework. | Same change as in Waterfall implemented after Sprint 2. Result: Completed in 3 weeks, saving $250K under budget. | Same change identified in iteration 1 and implemented in 5 weeks. Result: $300K saved through partial reuse of previously verified models. |
| Stakeholder Collaboration | Frequency: Quarterly milestone reviews. Issue: UI mismatches with pilot expectations noticed too late. Impact: $500K redesign. | Frequency: Bi-weekly stakeholder demos. Result: UI refinements made mid-development; no redesign required. | Frequency: Monthly reviews with integrated feedback sessions. Result: UI issues caught early and resolved in design sprint; reduced stakeholder review time by 40%. |
| Verification & Validation | Initial Requirement (Waterfall): “The system shall validate fault flags post-cycle via simulation only.” Issue: Late-stage defects detected during final test phase. Impact: $800K in rework. | Continuous V&V using simulation-in-the-loop throughout development. Result: Reduced test cycle time by 50% (from 4 weeks to 2 weeks). | V&V conducted through phased simulation and lab testing. Result: Only 1 defect escaped final test; saved $400K in rework (vs. $800K in Waterfall); test duration reduced from 4 weeks to 2.6 weeks (35%). |
| Time & Cost Efficiency | Timeline: 9 months (2 months overrun) Total Cost: $4M Rework: $1M | Timeline: 15 weeks (delivered on time) Total Cost: $2M Savings: $800K saved by avoiding late-stage rework and UI redesigns (compared to Waterfall). | Timeline: 18 weeks Total Cost: $2M Rework: $400K Result: Balanced performance—60% less rework than Waterfall, with more controlled changes than Agile. Completed in 2 iterations vs. Agile’s 6. |
| Impact | High cost and schedule overrun driven by late discovery of requirement issues and rigid processes. | High adaptability enabled rapid delivery and cost control but required strict sprint discipline and heavy stakeholder involvement. | Demonstrated optimal balance: up-front clarity combined with flexible delivery. Enabled sustained cost control, reduced testing burden, and improved stakeholder alignment. |
3.1.1. Requirement Stability
The Waterfall approach suffered from rigidity in managing evolving requirements. The initial requirement— “The subsystem must support fixed-point diagnostics for engine anomalies using predefined fault codes”—resulted in a 3-month delay and $1M in rework when unexpected sensor types were introduced late in development. In contrast, Agile’s evolved requirement— “The diagnostic engine must process both fixed and variable fault data”—was refined iteratively over 6 sprints in 8 weeks without delays. Hybrid combined structured upfront planning with staged adaptability, completing changes in 10 weeks after just 2 iterations, with only $250K in adjustments. This highlights that while Agile excels at rapid iteration, Hybrid offers flexibility with fewer change cycles and tighter control.
3.1.2. Change Management
In Waterfall, implementing changes such as “adding fault detection for variable-speed compressor anomalies” required 6 months due to lengthy documentation and review cycles, incurring $500K in rework. Agile integrated the same change in just 3 weeks after Sprint 2 using telemetry logs, saving $250K under budget. Hybrid flagged the change early during its first iteration and implemented it within 5 weeks, achieving $300K savings through partial reuse of previously verified models. Hybrid’s proactive identification of changes reduced both cost and time compared to Waterfall, though Agile remained the fastest.
3.1.3. Stakeholder Collaboration
Waterfall limited stakeholder collaboration to quarterly milestone reviews, which led to misaligned user interfaces (UI) with pilot expectations and a $500K redesign. Agile employed bi-weekly stakeholder demos, allowing UI refinements mid-development with no redesigns needed. Hybrid implemented monthly reviews and integrated feedback sessions, enabling early UI adjustments and reducing review time by 40%. This demonstrates that while Agile provides the fastest feedback loops, Hybrid still captured early insights effectively, reducing late-stage changes.
3.1.4. Verification & Validation (V&V)
In Waterfall, verification was performed late in the cycle, relying solely on simulation. The requirement— “The system shall validate fault flags post-cycle via simulation only”—resulted in $800K of rework due to late defect detection. Agile incorporated continuous V&V through simulation-in-the-loop, reducing test cycle time by 50% (from 4 weeks to 2 weeks). Hybrid phased V&V with both incremental simulation and lab testing, saving $400K (compared to Waterfall’s $800K in rework) and reducing the test phase to 2.6 weeks—a 35% improvement over Waterfall.
3.1.5. Time & Cost Efficiency
The Waterfall approach took 9 months (2 months overrun) with a total cost of $4M, including $1M in rework. Agile, by contrast, delivered in 15 weeks on budget at $2M, avoiding $800K in rework through early defect detection and iterative stakeholder collaboration. Hybrid delivered in 18 weeks at the same total cost of $2M but with reduced rework of $400K, 60% lower than Waterfall. While Agile achieved the fastest delivery, Hybrid maintained tighter control with fewer requirement changes (2 iterations vs. Agile’s 6).
3.1.6. Overall Impact
Waterfall was characterized by high costs and schedule overruns caused by rigid processes and late identification of requirement gaps. Agile provided rapid delivery and cost savings but required rigorous sprint discipline and continuous stakeholder involvement. Hybrid demonstrated a balanced performance, combining upfront clarity with flexibility to accommodate evolving requirements, while sustaining cost control, reducing testing effort, and improving stakeholder alignment.
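The headline percentages in Sections 3.1.1 through 3.1.5 follow directly from the Table 2 figures. A minimal Python sketch (figures taken from Table 2; the function name is our own) makes the arithmetic explicit:

```python
# Headline figures from Table 2: rework in $K, V&V test-cycle durations in weeks.
rework_k = {"Waterfall": 1000, "Hybrid": 400}       # Agile avoided late-stage rework
vv_weeks = {"Waterfall": 4.0, "Agile": 2.0, "Hybrid": 2.6}

def reduction_pct(baseline: float, value: float) -> int:
    """Percent reduction of `value` relative to `baseline`."""
    return round(100 * (1 - value / baseline))

# Hybrid's "60% less rework than Waterfall":
print(reduction_pct(rework_k["Waterfall"], rework_k["Hybrid"]))   # 60
# Hybrid's "35% reduction in V&V duration" (4 weeks -> 2.6 weeks):
print(reduction_pct(vv_weeks["Waterfall"], vv_weeks["Hybrid"]))   # 35
# Agile's "reduced test cycle time by 50%" (4 weeks -> 2 weeks):
print(reduction_pct(vv_weeks["Waterfall"], vv_weeks["Agile"]))    # 50
```

The same baseline-relative calculation underlies the schedule and cost deltas quoted elsewhere in Section 3, all measured against the Waterfall figures or the contractual estimates.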
3.2 SILA System Integration and Test (SIT) Phase Comparison
The comparative analysis of the SILA System Integration and Test (SIT) phase in Table 3 highlights distinct trade-offs among Waterfall, Agile, and Hybrid methodologies regarding cost, schedule, adaptability, and validation efficiency. The Waterfall approach used for integrating the Integrated Avionics Management System required the most time—12 months—and the highest expenditure at $4 million. While the system’s inherent complexity contributed to this, the rigid, sequential Waterfall process further extended the schedule by dedicating three months to extensive documentation, six months to integration testing, and three months to validation. The lack of early feedback loops led to significant rework during validation, especially due to hard-coded interfaces that limited flexibility when requirements changed. Conversely, the Agile methodology applied to the Flight Management Software Suite enabled completion in just four months at $3 million. Through four iterative sprints, Agile facilitated early subsystem testing and incorporated maintainer feedback quickly, which enhanced responsiveness. However, this accelerated pace resulted in reduced system-wide traceability and some inconsistencies in subsystem outputs, necessitating additional verification and validation steps after delivery. The Hybrid approach, implemented for the Aircraft Health Monitoring System, balanced these trade-offs by integrating Agile sprints within a structured Waterfall milestone framework. Over six months and $3 million, it achieved moderate cost and time efficiency while maintaining compliance and traceability.
Table 3: System Integration and Test (SIT) Phase of Aircraft Systems Sustainment
| Model | Subsystem | Duration | Cost | Approach | Highlights | Challenges |
| --- | --- | --- | --- | --- | --- | --- |
| Waterfall | Integrated Avionics Management System | 12 months | $4M | Sequential (Plan, Build, Test) | 3 months for documentation; 6 months integration testing; 3 months validation | Delayed feedback; rework in validation phase; hard-coded interfaces led to issues when updates were needed |
| Agile | Flight Management Software Suite | 4 months | $3M | 4 iterative sprints | Feedback from maintainers; subsystem testing in Sprint 1; interface tweaks integrated by Sprint 2 | Lacked traceability; inconsistencies in subsystem outputs; required additional V&V steps post-delivery |
| Hybrid | Aircraft Health Monitoring System | 6 months | $3M | Agile sprints aligned to Waterfall milestones | Milestone 1: compliance planning (1 month); Sprints 1–3: development & testing (3 months); Milestone 2: formal reviews (2 months) | Maintained traceability and compliance; enabled quick adaptation to maintenance feedback; reduced rework by 50% compared to Waterfall |
3.3 SILA Performance Testing and Optimization
The SILA Performance Testing and Optimization data in Table 4 highlights distinct differences in efficiency and effectiveness among the three development methodologies. The Waterfall method, applied to the Autonomous Navigation and Guidance Module, was the most time- and cost-intensive, requiring six months and $2.5 million to complete. The process was heavily sequential, with three months dedicated to initial simulations and stress testing, followed by bottleneck analysis and a final month of optimization and certification. Issues were often discovered late, leading to high rework efforts and delays primarily due to the inflexible documentation and lack of iterative testing cycles. In contrast, the Agile approach used for the Engine Performance Monitoring System completed performance testing in just three months at a cost of $2 million. This was achieved through three focused sprints addressing stress testing, bottleneck identification and fixes, and final optimization with iterative testing. However, despite faster delivery, Agile faced challenges in maintaining certification traceability and sometimes missed broader system-wide impacts due to sprint-focused efforts. The Hybrid model, implemented for the Cybersecurity Interface, struck a strong balance between speed, cost, and rigor. Completed in 4.5 months and costing $2 million, it began with Waterfall-style simulations and baseline setup, followed by two months of Agile sprints for performance tuning, and concluded with 1.5 months of certification alignment and automated traceability. This combined approach reduced bottleneck resolution time by 40% compared to Waterfall while preserving necessary documentation and compliance rigor. Overall, the Hybrid model demonstrated superior adaptability and control, making it well-suited for complex, compliance-driven environments demanding both responsiveness and audit readiness.
Table 4: Performance Testing and Optimization of Autonomous Systems Resilience
| Model | Subsystem | Duration | Cost | Activities | Challenges |
| --- | --- | --- | --- | --- | --- |
| Waterfall | Autonomous Navigation and Guidance Module | 6 months | $2.5 million | – 3 months: Initial simulations and stress testing – 2 months: Bottleneck analysis – 1 month: Final optimization and certification | – Late discovery of issues – High rework effort – Inflexible documentation and testing cycles |
| Agile | Engine Performance Monitoring System | 3 months | $2 million | – Sprint 1: Initial stress testing – Sprint 2: Bottleneck identification and fix – Sprint 3: Final optimization and iterative testing | – Certification traceability issues – Sprint focus sometimes missed system-wide impacts |
| Hybrid | Cybersecurity Interface | 4.5 months | $2 million | – Month 1: Waterfall-style simulation and baseline setup – Months 2–3: Agile sprints for performance tuning – Final 1.5 months: Certification alignment and automated traceability | – Balanced iterative feedback with documentation rigor – Reduced time to resolve bottlenecks by 40% |
3.4 SILA Earned Value Management System (EVMS) Analysis
Table 5 illustrates evolving cost and value trends across three comparable subsystems developed under distinct methodologies within the Aircraft Mission Support: the Flight Control Data (Waterfall), the Navigation Planning Processor (Agile), and the Avionics Health and Diagnostics (Hybrid). Over the eight-month period, the Waterfall-based Flight Control Data shows a structured increase in Planned Value (PV), rising from $0.5M in Month 1 to $4.5M by Month 8. However, Earned Value (EV) begins to fall behind planned value from Month 3 and ends at $4.0M, while Actual Cost (AC) accelerates, also ending at $4.5M. This indicates early and sustained cost overruns, with a final $0.5M gap between expected value and actual cost—highlighting inefficiencies and slower progress than planned. The Navigation Planning Processor, developed using Agile, shows a more proportional trend. Planned and expected values remain closely aligned from start to finish, both ending at $4.0M. Actual cost stays consistently controlled, mirroring expected value across most months and also finishing at $4.0M. The tight alignment among the three metrics suggests efficient execution, with minimal variance between what was planned, accomplished, and spent. The Hybrid approach, used for the Avionics Health and Diagnostics, demonstrates the most consistent and balanced performance. Planned value rises steadily to $4.5M, closely matched by expected value, which also reaches $4.5M by Month 8. Actual cost climbs in parallel, finishing at $4.0M. Variances between all three indicators remain small throughout the period, reflecting controlled costs and predictable progress. The hybrid model effectively combines structured planning with adaptive delivery, leading to strong schedule and budget alignment across the development timeline.
Table 5: Earned Value Management System of Aircraft Mission Support
| Month | Waterfall Planned Value | Waterfall Expected Value | Waterfall Actual Cost | Agile Planned Value | Agile Expected Value | Agile Actual Cost | Hybrid Planned Value | Hybrid Expected Value | Hybrid Actual Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | $0.5M | $0.5M | $0.5M | $0.5M | $0.5M | $0.5M | $0.5M | $0.5M | $0.5M |
| 2 | $1.0M | $1.0M | $1.5M | $1.0M | $1.0M | $1.0M | $1.0M | $1.0M | $1.0M |
| 3 | $2.0M | $1.5M | $2.0M | $2.0M | $1.5M | $1.5M | $1.5M | $1.5M | $2.0M |
| 4 | $2.5M | $2.0M | $2.5M | $2.5M | $2.5M | $2.0M | $2.5M | $2.0M | $2.5M |
| 5 | $3.0M | $3.0M | $3.5M | $3.0M | $2.5M | $2.5M | $3.0M | $3.0M | $2.5M |
| 6 | $3.5M | $3.5M | $4.0M | $3.5M | $3.0M | $3.0M | $3.5M | $3.5M | $3.0M |
| 7 | $4.0M | $4.0M | $4.0M | $4.0M | $3.5M | $3.5M | $4.0M | $4.0M | $3.5M |
| 8 | $4.5M | $4.0M | $4.5M | $4.0M | $4.0M | $4.0M | $4.5M | $4.5M | $4.0M |
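The trends discussed above can be checked numerically from the Table 5 series. The following minimal Python sketch (values transcribed from the table, in $M, with the table's "Expected Value" columns treated as Earned Value per Section 3.4.2) computes the standard EVM cost variance (CV = EV − AC) and schedule variance (SV = EV − PV) for each methodology; it is an illustration of the analysis, not an artifact of the SILA program.

```python
# Monthly cost and schedule variance from the Table 5 EVMS data ($M).
# CV = EV - AC (negative means overspend); SV = EV - PV (negative means behind plan).

data = {
    "Waterfall": {  # Flight Control Data
        "PV": [0.5, 1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5],
        "EV": [0.5, 1.0, 1.5, 2.0, 3.0, 3.5, 4.0, 4.0],
        "AC": [0.5, 1.5, 2.0, 2.5, 3.5, 4.0, 4.0, 4.5],
    },
    "Agile": {      # Navigation Planning Processor
        "PV": [0.5, 1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 4.0],
        "EV": [0.5, 1.0, 1.5, 2.5, 2.5, 3.0, 3.5, 4.0],
        "AC": [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0],
    },
    "Hybrid": {     # Avionics Health and Diagnostics
        "PV": [0.5, 1.0, 1.5, 2.5, 3.0, 3.5, 4.0, 4.5],
        "EV": [0.5, 1.0, 1.5, 2.0, 3.0, 3.5, 4.0, 4.5],
        "AC": [0.5, 1.0, 2.0, 2.5, 2.5, 3.0, 3.5, 4.0],
    },
}

def variances(series):
    """Return per-month (CV, SV) pairs for one methodology."""
    return [
        (round(ev - ac, 2), round(ev - pv, 2))
        for pv, ev, ac in zip(series["PV"], series["EV"], series["AC"])
    ]

for model, series in data.items():
    cv_final, sv_final = variances(series)[-1]
    print(f"{model}: Month-8 CV = {cv_final:+.1f}M, SV = {sv_final:+.1f}M")
```

Run against the table, this reproduces the narrative: Waterfall ends Month 8 with CV and SV of −$0.5M, Agile ends at zero variance on both, and Hybrid ends with a positive $0.5M cost variance.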
3.4.1 Planned Value (PV) Comparison
Looking at the Planned Value (PV) graph in Figure 1, all three subsystems exhibit structured cumulative budget allocation across eight months. The Flight Control Data (Waterfall) subsystem maintains a steady planned value increase from $0.5M in Month 1 to $4.5M by Month 8. This front-loaded approach aligns with Waterfall’s planning-intensive philosophy, where funding is committed early to support documentation, design, and review cycles (Boehm et al., 2003). In contrast, both the Navigation Planning Processor (Agile) and Avionics Health and Diagnostics (Hybrid) subsystems demonstrate similarly gradual planned value growth patterns—reaching $4.0M and $4.5M respectively—reflecting more adaptive budgeting models. Agile’s incremental allocation supports sprint-based delivery and reprioritization, while Hybrid balances early funding for critical components with iterative reallocation, integrating both flexibility and oversight (Highsmith, 2001; Serrador & Pinto, 2015).

Figure 1: Planned Value (PV) in SILA
3.4.2 Earned Value (EV) Comparison
The Earned Value (EV) graph in Figure 2 reveals notable differences in delivery efficiency. By Month 8, Avionics Health and Diagnostics (Hybrid) leads with $4.5M in expected value, followed by Navigation Planning Processor (Agile) and Flight Control Data (Waterfall), both at $4.0M. Hybrid’s ability to align earned value with planned value demonstrates an effective combination of pre-defined structure and agile responsiveness. The Hybrid team baselined critical system elements early—mimicking Waterfall strengths—while using Agile iterations to incorporate feedback and refine deliverables. This allowed for fast prototyping and high user engagement without compromising traceability or documentation (Boehm et al., 2003; Vinekar et al., 2006). By contrast, Agile’s strong start slowed around Months 5–7, likely due to limited upfront architecture planning and decentralized coordination (Stettina et al., 2014), although the final expected value ($4.0M) still matched its planned value. The Waterfall-driven Flight Control Data subsystem reflected the most delayed progress, with expected value plateauing at $4.0M—half a million below its planned value. The linear sequencing inherent in Waterfall, which delays value realization until phase completion, limited its adaptability to emerging needs (Maharao, 2024), further reinforcing the relative strength of Hybrid models in sustainment environments (Conforto et al., 2014).
Figure 2: Earned Value (EV) in SILA
3.4.3 Actual Cost (AC) Comparison
The Actual Cost (AC) graph in Figure 3 reveals clear distinctions in expenditure behavior. By Month 8, the Flight Control Data subsystem incurs the highest cost at $4.5M, matching its planned value but exceeding its EV. This consistent overrun began in Month 2, when AC jumped to $1.5M while expected value remained at $1.0M—indicating inefficiencies, likely caused by delayed requirement validation or rework during integration (Boehm et al., 2003). In contrast, the Navigation Planning Processor (Agile) maintained tight cost control, concluding with an actual cost of $4.0M, in exact alignment with its planned and expected values. This reflects Agile’s efficient cycle management and reduced overhead stemming from early stakeholder involvement and scope refinement (Highsmith, 2009). Meanwhile, the Avionics Health and Diagnostics (Hybrid) subsystem presents a middle ground—reaching $4.0M in actual cost, which is lower than its final EV of $4.5M. The Hybrid cost curve remains steady, reflecting strategic planning for compliance-heavy tasks early in the cycle and efficient iterative refinement for less critical updates. This stratified resource allocation contributed to predictable spending and reduced volatility across the timeline (Conforto et al., 2016; Serrador & Pinto, 2015).

Figure 3: Actual Cost (AC) in SILA
4.0 Discussion
The findings from the SILA case study illustrate that neither Waterfall nor Agile, when applied in isolation, fully meets the operational and compliance demands of aerospace and defense sustainment projects. Instead, the results reinforce the value of a strategically tailored hybrid methodology that draws on the strengths of both approaches – leveraging Agile’s adaptability for rapid, low-risk updates while retaining Waterfall’s governance for high-assurance phases. This balance proved particularly effective in SILA, where subsystem-level iterations could proceed at speed without compromising the formal verification, documentation, and certification processes essential to regulated environments.
4.1 Requirement Changes Overall
One of the most critical areas of divergence across the three methodologies was requirement stability. In the SILA case, each propulsion subsystem shared a comparable diagnostic requirement, yet the ability to manage requirement change varied significantly by method. Under the Waterfall approach, requirements were rigidly front-loaded and documented with minimal room for adaptation. When new sensor types were introduced mid-development, the design lacked the flexibility to accommodate them. This led to a 3-month schedule delay and $1 million in rework costs, as late-stage changes required revalidation of integrated components. This outcome reflects the common limitations of linear models in dynamic sustainment environments, where requirements may shift based on operational feedback or evolving system interfaces (Silva et al., 2023).
In contrast, Agile’s iterative model allowed requirements to evolve through six development sprints across eight weeks. The team was able to adapt the fault diagnostics algorithm to handle both fixed and variable fault data types without triggering any schedule delay. This was enabled through continuous collaboration and simulation-in-the-loop testing, which allowed incremental refinement aligned with real-time feedback. While Agile clearly excelled in responsiveness, it also required a higher degree of coordination and sprint discipline, and its lightweight documentation practices posed traceability concerns—particularly in highly regulated sectors (Arthur et al., 2017).
The Hybrid approach demonstrated a more structured yet adaptive path. Requirements were defined up front and refined through two planned iterations, completing the updates in 10 weeks with only $250K in adjustment costs. This structure preserved traceability while avoiding the rework costs of Waterfall and the overhead of repeated Agile iteration cycles. The outcome shows that strategic flexibility, when intentionally embedded within staged development gates, can minimize cost and schedule impacts without compromising traceability (Dugbartey et al., 2025).
Change management further illustrated the burden of rigid process adherence. A representative engineering update—adding fault detection for variable-speed compressor anomalies—required six months to integrate under Waterfall due to formal documentation and approval processes, incurring $500K in rework. Agile addressed the same change after Sprint 2 using real-time telemetry logs and completed it in just 3 weeks, coming in $250K under budget. However, Agile’s rapid integration also posed configuration management risks when formal change logs were delayed or incomplete. The Hybrid model implemented the change within 5 weeks during its first iteration, leveraging partial reuse of previously verified model elements. This saved $300K compared to Waterfall, while maintaining full traceability. The result supports the notion that Hybrid methods can offer speed without sacrificing auditability.
Stakeholder collaboration revealed a similar pattern. In Waterfall, feedback was limited to quarterly milestone reviews, which proved too infrequent to identify UI misalignments with pilot expectations. This resulted in a $500K redesign late in the cycle. Agile’s bi-weekly demos enabled earlier input from stakeholders and resolved UI issues during development—eliminating the need for redesign altogether. The Hybrid approach scheduled monthly feedback sessions with interim design reviews, enabling UI refinements within the first design sprint and reducing review effort by 40%. This cadence effectively captured feedback without overwhelming stakeholders, offering a middle ground between Waterfall’s inflexibility and Agile’s intensity (Shereni, 2015).
Verification and Validation (V&V) strategies also diverged in terms of timing, traceability, and cost impact. Waterfall deferred all validation to a post-development test phase, where several critical defects emerged. This resulted in $800K in rework and a 4-week test window. Agile integrated simulation-in-the-loop testing during development, allowing earlier defect identification and reducing the final test phase to 2 weeks, avoiding the rework entirely. Hybrid followed a phased testing approach, combining simulation and physical lab validation at key milestones. This approach limited escaped defects to one, saved $400K in rework, and shortened the final V&V phase to 2.6 weeks—a 35% reduction compared to Waterfall. Importantly, Hybrid preserved formal test documentation needed for certification, striking a balance between flexibility and compliance (Shekhar, 2019).
Finally, the cost and schedule performance summarized the trade-offs across methodologies. Waterfall took 9 months and cost $4 million, with $1 million attributed to rework. Agile completed the project in 15 weeks at a total cost of $2 million, saving $800K by resolving issues earlier and avoiding rework. Hybrid required 18 weeks, slightly longer than Agile, but matched the $2 million budget and reduced rework to just $400K—a 60% improvement over Waterfall. While Agile was the fastest and most cost-efficient, Hybrid offered more predictability in change management, traceable artifacts, and compliance support—attributes essential for sustainment programs within aerospace and defense domains.
These results reinforce Objective 1, confirming that Agile approaches deliver speed and cost efficiency in dynamic, change-prone environments. However, the Hybrid methodology emerged as the most balanced, offering a viable alternative in sustainment contexts where traceability, certification readiness, and controlled evolution of requirements are equally important. Waterfall retained value in traceability but showed significant performance degradation in the face of evolving system needs.
4.2 System Integration and Test (SIT) Phase Overall
The SILA System Integration and Test (SIT) phase results reveal important distinctions in how development methodologies impact integration duration, cost effectiveness, and defect resolution. Using the Waterfall approach, the SIT phase spanned 12 months and incurred costs of approximately $4 million. This extended timeline was partly due to the system’s complexity but was further prolonged by Waterfall’s rigid, sequential structure, which dedicated the first three months exclusively to documentation and interface specifications before any integration began. This front-loaded process delayed critical feedback from integration testing, leading to defects being discovered late during validation and necessitating significant rework, which caused schedule overruns and increased expenses. These outcomes align with prior studies highlighting inefficiencies caused by delayed defect detection and rigid phase transitions typical of sequential models (Boehm et al., 2003).
Conversely, the Agile implementation completed the SIT phase in only 4 months at a reduced cost of $3 million, representing a 30% cost saving. Agile’s iterative sprints embedded integration and testing activities early and continuously, enabling earlier identification of interface mismatches and faster incorporation of stakeholder feedback. This iterative cadence allowed for quicker issue resolution and minimized the propagation of integration defects—a pattern observed in other large defense projects employing Agile methodologies effectively (Tanner et al., 2014). However, Agile’s rapid pace sometimes compromised formal traceability and led to inconsistencies among subsystem outputs, particularly when deliverables from different sprints were not fully synchronized (Womack et al., 2008).
These challenges underscore the benefits of hybrid methodologies. The Hybrid approach adopted in the SILA SIT phase combined Agile sprints with structured milestone-based reviews, effectively balancing agility with regulatory compliance. By front-loading compliance planning and integrating formal validation gates, the Hybrid model reduced rework by 50% compared to Waterfall, while maintaining traceability and adapting quickly to feedback. Such hybrid frameworks provide a strategic middle ground, embedding flexibility within a rigorously controlled process—a necessity for high-assurance aerospace projects (Shekhar, 2019).
4.3 Performance Testing and Optimization Overall
The results of the SILA performance testing and optimization phase further highlight systemic inefficiencies associated with traditional Waterfall approaches, especially within sustainment workflows. Waterfall’s fixed-phase progression delayed the detection and resolution of performance bottlenecks, with the testing cycle taking six months and $2.5 million to complete. Bottlenecks were only identified during the later stages, leading to extensive rework and delayed certification. This lag in feedback echoes longstanding critiques of Waterfall’s rigidity, particularly its limited capacity to adapt to emergent system behaviors during integration and testing (Larman et al., 2003). By contrast, the Agile model achieved a 50% reduction in duration and a 20% reduction in cost, completing testing in three months at $2 million. Agile’s iterative, feedback-driven development and early validation practices embedded stress testing within sprints, enabling earlier bottleneck detection and resolution before issues could propagate downstream. These results align with prior studies demonstrating that Agile’s continuous integration and testing enhances responsiveness and reduces late-stage failures (Dingsøyr et al., 2010).
The Hybrid approach, combining upfront structured planning with iterative tuning and automated traceability, provided the most balanced solution. Completed in 4.5 months and costing $2 million, it reduced bottleneck resolution time by 40% compared to Waterfall, while maintaining documentation rigor essential for certification. This supports Boehm and Turner’s (2003) argument that hybrid models enable organizations to leverage Agile’s adaptability and speed while preserving plan-driven controls. Furthermore, hybrid strategies reduce integration debt by aligning iterative development with formal milestones (Stettina et al., 2014). These findings directly address Objective 2 of this study, quantifying how Waterfall’s rigid phase-gated structure contributes to rework and inefficiency in sustainment workflows. Agile’s shortened feedback loops are especially valuable in environments requiring ongoing optimization based on evolving operational data. However, the traceability and audit readiness limitations suggest that Agile alone may not fully satisfy regulated industry standards. Therefore, a hybrid approach emerges as the optimal strategy for performance testing in Aerospace & Defense contexts, balancing faster issue resolution with compliance, traceability, and certification integrity.
4.4 Earned Value Management System (EVMS) Overall
Figures 1, 2, and 3 collectively provide a comparative Earned Value Management (EVM) analysis for three SILA subsystems, each developed under a distinct methodology: the Flight Control Data subsystem (Waterfall), the Navigation Planning Processor subsystem (Agile), and the Avionics Health and Diagnostics subsystem (Hybrid). EVM is a critical project control technique that integrates scope, cost, and schedule parameters to provide an objective measurement of project performance. It does so through three core metrics: Planned Value (PV)—the authorized budget assigned to scheduled work; Actual Cost (AC)—the realized cost incurred for the performed work; and Earned Value (EV)—the value of work completed to date. These indicators, when analyzed collectively, offer a multidimensional view of cost-efficiency, schedule adherence, and overall value delivery. This is particularly vital in Aerospace and Defense (A&D) programs, where complex interdependencies, long development cycles, and evolving requirements often lead to significant scope creep and increased sustainment costs if not proactively managed (Kerzner, 2017).
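The three core metrics defined above are commonly condensed into the standard EVM performance indices, CPI = EV / AC (cost efficiency) and SPI = EV / PV (schedule efficiency), where values below 1.0 signal an overrun and values above 1.0 indicate better-than-planned performance. The following Python sketch applies these standard formulas to the Month-8 figures from Table 5; the function name is illustrative, not part of the SILA tooling.

```python
# Standard EVM performance indices from the Month-8 Table 5 figures ($M).
# CPI = EV / AC; SPI = EV / PV. Below 1.0 signals an overrun.

month8 = {
    #             PV,   EV,   AC
    "Waterfall": (4.5, 4.0, 4.5),  # Flight Control Data
    "Agile":     (4.0, 4.0, 4.0),  # Navigation Planning Processor
    "Hybrid":    (4.5, 4.5, 4.0),  # Avionics Health and Diagnostics
}

def evm_indices(pv, ev, ac):
    """Cost and schedule performance indices for one subsystem."""
    return {"CPI": round(ev / ac, 2), "SPI": round(ev / pv, 2)}

for model, (pv, ev, ac) in month8.items():
    idx = evm_indices(pv, ev, ac)
    print(f"{model}: CPI={idx['CPI']:.2f}, SPI={idx['SPI']:.2f}")
```

The indices mirror the prose: Waterfall sits below 1.0 on both counts, Agile lands exactly at 1.0, and Hybrid finishes at or above 1.0 on both dimensions.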
The Flight Control Data (Waterfall) subsystem shows strong alignment with planned effort but suffers from higher costs and slower value realization due to integration bottlenecks and late rework. The Navigation Planning Processor (Agile) subsystem demonstrates strong cost control and value alignment, although its performance decelerates slightly under complex integration scenarios. The Avionics Health and Diagnostics (Hybrid) subsystem outperforms both in value delivery and cost-effectiveness, maintaining consistent trajectories across planned value, expected value, and actual cost metrics. These findings reinforce broader project management literature that advocates for hybrid frameworks in regulated, evolving environments like A&D, where both structure and flexibility are essential to program success (Kerzner, 2017).
4.5 Synthesis of Hybrid Methodology
Based on the findings, a conceptual diagram (Figure 4) was developed to illustrate how Hybrid methodologies can be adapted by drawing from both Agile and Waterfall approaches, depending on factors such as time pressure and safety-critical requirements. Specifically, when time constraints are tight and safety demands are high, certain elements of Waterfall may be prioritized to ensure rigor and documentation, whereas in low-risk, time-flexible scenarios, Agile practices may offer greater efficiency and adaptability. The diagram highlights this spectrum and provides guidance on how to balance these methodologies in various operational contexts.

Figure 4: Hybrid Methodologies Recommendations Based on SILA Project
Overall, within the Hybrid methodologies quadrant diagram (Figure 4), the Earned Value Management System (EVMS) discussion highlights the specific benefits of adopting hybrid methodologies in scenarios characterized by lower safety requirements and minimal time pressures. Under these conditions, SILA’s implementation of a hybrid approach enabled regular reassessments and flexible, phased implementation, aligning closely with evolving project goals. Through moderate documentation and targeted stakeholder involvement, the EVMS updates allowed for precise tracking of project costs, resource allocation, and schedule adherence, resulting in enhanced adaptability and responsiveness without the excessive administrative overhead typical of pure Waterfall methodologies. This approach fostered a balanced environment that supported informed decision-making and rapid adjustments, crucial for optimizing the value derived from project investments.
In the Performance Testing and Optimization discussion, the effectiveness of hybrid methodologies emerged clearly in high safety but low urgency contexts. Initially defined yet flexible requirements provided a strong foundation, allowing structured iterations and moderate documentation. These controlled iterative cycles enabled the team to integrate real-time performance data and make incremental improvements while maintaining compliance with rigorous safety standards. Stakeholder engagement was consistently balanced, involving the right individuals at strategic milestones, thus enabling informed adaptations based on comprehensive performance insights. The result was a finely tuned optimization process that maintained system integrity and reliability, demonstrating the critical role hybrid methodologies can play in environments where safety and performance must coexist with adaptive flexibility.
The System Integration and Test (SIT) Phase discussion emphasized Waterfall’s particular strengths and limitations when managing projects with stringent regulatory and safety requirements under high time pressure. The detailed, sequential process and extensive documentation inherent in Waterfall methodologies ensured comprehensive verification and validation (V&V), crucial for meeting strict aerospace regulations. However, this rigid structure presented substantial drawbacks, especially when unforeseen requirements emerged or operational conditions rapidly changed. The prolonged feedback loops and limited adaptability led to delays and cost escalations. These limitations underscore the need for more flexible methodologies or hybrid approaches for certain components or project phases, where adaptability could mitigate these traditional drawbacks without sacrificing the meticulous compliance required in high-stakes environments.
The Requirement Changes discussion distinctly illustrated Agile methodologies’ superior capability to manage projects facing dynamic requirements and high urgency with lower safety constraints. Agile’s iterative sprints, frequent stakeholder interactions, and continuous feedback mechanisms empowered the SILA team to respond rapidly to evolving propulsion system requirements. This iterative approach reduced turnaround times significantly and minimized rework by swiftly identifying and addressing changes as they arose. Although Agile’s minimal documentation posed some risks regarding traceability and regulatory compliance, its rapid responsiveness greatly enhanced SILA’s operational agility, ultimately resulting in improved system performance and reduced overall sustainment costs.
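The quadrant logic described in the four preceding paragraphs can be condensed into a simple decision rule. The following Python sketch is purely illustrative: the function name and boolean inputs are assumptions introduced here for clarity, and the returned labels reflect the methodology emphasis the SILA discussion associates with each quadrant, not a prescriptive algorithm from the program.

```python
# Illustrative decision rule for the Figure 4 quadrants: map the
# safety-criticality and time pressure of a work item to the methodology
# emphasis discussed in the SILA case study.

def recommend_methodology(high_safety: bool, high_time_pressure: bool) -> str:
    """Return the methodology emphasis for one Figure 4 quadrant."""
    if high_safety and high_time_pressure:
        return "Waterfall"   # e.g., System Integration and Test (SIT)
    if high_safety and not high_time_pressure:
        return "Hybrid"      # e.g., Performance Testing and Optimization
    if not high_safety and high_time_pressure:
        return "Agile"       # e.g., Requirement Changes
    return "Hybrid"          # e.g., EVMS Analysis (low safety, low pressure)

print(recommend_methodology(True, True))   # Waterfall
```

In practice the boundary between quadrants is a judgment call, which is precisely why the paper frames Figure 4 as guidance for tailoring rather than a fixed rule.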
A comprehensive analysis of the quadrant diagram further reveals the inherent value of strategically deploying different methodologies based on clearly identified project contexts. The SILA case study demonstrates the limitations and strengths of each approach, emphasizing the critical advantage of hybrid methodologies in effectively navigating the nuanced trade-offs between documentation rigor, adaptability, regulatory compliance, and operational responsiveness. Hybrid frameworks, which integrate structured Waterfall phases with Agile flexibility, emerge as particularly advantageous, offering an optimal blend tailored specifically to the project’s risk profile, stakeholder expectations, regulatory demands, and time constraints. Thus, the quadrant analysis not only provides a valuable visual and strategic tool but also articulates a clear pathway towards methodological agility and excellence in aerospace sustainment projects.
4.6 Limitations
While the SILA case study provides valuable insights into the practical application of Hybrid methodologies in aerospace and defense sustainment, several limitations should be acknowledged. First, the analysis is based on a single program within a specific organizational and regulatory context, which may limit the generalizability of findings to other A&D projects with different scales, governance structures, or operational priorities. Second, the study relied on retrospective data and stakeholder accounts, introducing the potential for recall bias and subjective interpretation of process effectiveness. Additionally, quantitative measures of cost and time efficiency were constrained by the availability and granularity of project documentation, which may have obscured smaller-scale variations in performance. Finally, the research did not include a formal experimental comparison between Hybrid, Agile, and Waterfall approaches, meaning causality between the methodology and observed outcomes cannot be definitively established. These limitations suggest the need for broader, multi-program studies with prospective data collection to more rigorously evaluate hybrid frameworks across varied sustainment environments.
5.0 Conclusions
The SILA case study underscores the growing necessity of hybrid methodologies in the sustainment of complex aerospace and defense systems. Traditional Waterfall approaches delivered vital structure, formal documentation, and regulatory compliance—elements that remain indispensable in high-assurance, safety-critical domains. However, their rigidity and lengthy change cycles often hindered timely adaptation to evolving operational needs, especially as new failure modes and sustainment demands emerged over time. In contrast, Agile methodologies significantly improved SILA’s responsiveness by enabling iterative development, rapid stakeholder feedback, and continuous integration of diagnostic enhancements. These qualities proved critical in reducing turnaround times, lowering cost overruns, and enhancing the system’s predictive maintenance capabilities. Yet, Agile’s inherent informality introduced risks to consistency, traceability, and certification—core requirements in defense sustainment programs. This paper illustrates that a hybrid methodology, combining Waterfall’s structured compliance with Agile’s iterative adaptability, provides the most effective framework for sustainment success. Within SILA, hybrid implementation allowed the organization to retain the control and traceability demanded by regulatory bodies, while embedding short, Agile-driven sprint cycles to manage subsystem updates, respond to maintenance insights, and refine diagnostic algorithms in near real-time. Notably, the hybrid model enabled earlier defect detection, reduced rework through continuous testing, and promoted cross-functional collaboration across engineering, program management, and maintenance personnel.
For example, Agile sprints allowed updates to predictive algorithms to be delivered within weeks, particularly beneficial in scenarios characterized by high time pressure and low safety concerns, where rapid iterations and minimal documentation enhanced efficiency. Concurrently, Waterfall milestones ensured that subsystem integration and testing—critical in high safety, high time-pressure contexts—aligned with overarching certification and contractual requirements, ensuring robust formal documentation and rigorous verification processes. In situations marked by high safety yet lower urgency, such as SILA’s Performance Testing and Optimization, the hybrid methodology balanced initial requirement definitions with controlled iterative developments and moderate documentation, effectively harmonizing structure with flexibility. For the Earned Value Management System (EVMS) Analysis, which presented lower safety and time pressures, the hybrid approach facilitated adaptive planning, phased implementations, and regular stakeholder reviews, optimizing cost control and adaptability.
This approach not only reduced sustainment costs but also improved system reliability and responsiveness. While Agile clearly delivers superior performance in fast-turnaround sustainment environments, it is not universally applicable across all phases of aerospace engineering. Hybrid methodologies offer a pragmatic solution, enabling teams to adopt Agile’s sprint-driven approach for lower-risk iterative activities, while preserving Waterfall’s rigor for phases involving certification, documentation, and V&V. In the SILA case, a hybrid structure was trialed by integrating Agile practices into subsystem-level development while applying Waterfall gates for formal reviews and compliance checks. This allowed continuous stakeholder engagement and rapid prototyping while maintaining traceability for regulatory approval. This approach optimized the balance between speed and governance—achieving the goals of Objective 3.
In conclusion, the SILA case study demonstrates that a Hybrid methodology offers the optimal balance between stability and adaptability in aerospace sustainment projects. As illustrated in Figure 4 (Hybrid quadrant diagram), by strategically blending the formal rigor of Waterfall with the responsiveness of Agile—tailored specifically to the safety and time-pressure contexts—organizations can better navigate the complexities of system evolution, regulatory compliance, and operational readiness, ultimately ensuring more resilient, cost-effective, and mission-aligned sustainment outcomes. To fully realize the benefits of hybrid development in aerospace and defense, greater emphasis must be placed on workforce training and formal education—particularly through institutions like the Defense Acquisition University (DAU). Most current engineering programs still treat Agile and Waterfall as mutually exclusive, leaving a gap in preparing future engineers for hybrid application in regulated environments. Embedding hybrid methodology into DAU coursework and related university curricula would better equip acquisition professionals and engineers to strategically tailor development approaches based on safety, compliance, and time-pressure demands.
References
Boehm, B., & Turner, R. (2003). Balancing Agility and Discipline: A Guide for the Perplexed. Addison-Wesley Professional.
Conforto, E. C., Salum, F., Amaral, D. C., Silva, S. L., & Almeida, L. F. M. (2014). Can Agile Project Management Be Adopted by Industries Other than Software Development? Project Management Journal, 45(3), 21–34. https://doi.org/10.1002/pmj.21410
Dingsøyr, T., Dybå, T., & Moe, N. B. (2010). Agile Software Development. Springer. https://doi.org/10.1007/978-3-642-12575-1
Donelli, G., Boggero, L., & Nagel, B. (2023). Concurrent Value-Driven Decision-Making Process for the Aircraft, Supply Chain and Manufacturing Systems Design. Systems, 11(12), 578.
Fasching, C. M. (2023). Towards a framework for risk identification and mitigation in Agile software re-engineering: A case study. University of Central Lancashire. https://clok.uclan.ac.uk/id/eprint/52511/1/MRes_Fasching.pdf
Gracias, M. H., & Gallegos, E. E. (2024). Transitioning perspectives: agile and waterfall perceptions in the integration of model-based systems engineering (MBSE) within aerospace and defense industries. ITEA Journal of Test and Evaluation, 45(4). https://doi.org/10.61278/itea.45.4.1006
Gupta, R. K., Manikreddy, P., & GV, A. (2016). Challenges in Adapting Agile Testing in a Legacy Product. IEEE 11th International Conference on Global Software Engineering (ICGSE), pp. 104–108. https://doi.org/10.1109/ICGSE.2016.21
Hiekata, K., Mitsuyuki, T., Goto, T., & Moser, B. (2016). Design of software development architecture comparison of waterfall and agile using reliability growth model. Transdisciplinary Engineering: Crossing Boundaries (pp. 471-480). IOS Press.
Highsmith, J. (2009). Agile Project Management: Creating Innovative Products. Addison-Wesley.
Highsmith, J., & Cockburn, A. (2001). Agile software development: The business of innovation. Computer, 34(9), 120–127. https://doi.org/10.1109/2.947100
Kazakevich B. & Joiner K. (2025, submitted), Agile framework for minimum viable capabilities in traditionally linear engineering acquisition cultures. International Journal of Agile Systems and Management.
Kerzner, H. (2017). Project Management: A Systems Approach to Planning, Scheduling, and Controlling (13th ed.). Wiley.
Larman, C., & Basili, V. R. (2003). Iterative and Incremental Development: A Brief History. Computer, 36(6), 47–56. https://doi.org/10.1109/MC.2003.1204375
Maharao, C. S. (2024). A study on Agile project management in IT: Challenges and best practices. ShodhKosh Journal of Visual and Performing Arts, 5(1). https://doi.org/10.29121/shodhkosh.v5.i1.2024.2284
Morgan, M., Holzer, T., & Eveleigh, T. (2021). Synergizing model-based systems engineering, modularity, and software container concepts to manage obsolescence. Systems Engineering, 24(5), 369–380.
Mousavi, B. A., et al. (2022). Use of Model-Based System Engineering methodology and tools for disruption analysis of supply chains: A case in semiconductor manufacturing. Journal of Industrial Information Integration, 28, 100335.
Oliver, E., Mazzuchi, T., & Sarkani, S. (2022). A resilience systemic model for assessing critical supply chain disruptions. Systems Engineering, 25(5), 510–533.
Planas, E., & Cabot, J. (2020). How are UML class diagrams built in practice? A usability study of two UML tools: Magicdraw and Papyrus. Computer Standards & Interfaces, 67, 103363.
Serrador, P., & Pinto, J. K. (2015). Does Agile work? A quantitative analysis of agile project success. International Journal of Project Management, 33(5), 1040–1051. https://doi.org/10.1016/j.ijproman.2015.01.006
Shekhar, P. C. (2019). Agile vs. Waterfall: A Comprehensive Analysis of Software Testing Method. International Journal of Innovative Research and Creative Technology, 5(5), 1–12.
Stettina, C. J., & Hörz, J. (2015). Agile portfolio management: An empirical perspective on the practice in use. International Journal of Project Management, 33(1), 140–152. https://doi.org/10.1016/j.ijproman.2014.03.008
Tanner, M., & Willingh, U. (2014). Factors leading to the success and failure of Agile projects implemented in traditionally waterfall environments. International Conference on Human Capital without Borders: Knowledge and Learning for Quality of Life, Portoroz, Slovenia, pp. 693–701.
Vinekar, V., Slinkman, C. W., & Nerur, S. (2006). Can Agile and Traditional Systems Development Approaches Coexist? An Ambidextrous View. Information Systems Management, 23(3), 31–42. https://doi.org/10.1201/1078.10580530/46108.23.3.20060601/93705.4
Womack, J. P., & Jones, D. T. (2008). Lean Thinking: Banish Waste and Create Wealth in Your Corporation (2nd ed.).
Author Biographies
Maryam H. Gracias received her B.S. (2018) and M.S. (2020) degrees in Engineering from Embry-Riddle Aeronautical University (ERAU), Daytona Beach, Florida, where she also obtained her pilot’s license. She is finishing her Ph.D. in Systems Engineering (2025) at Colorado State University, Fort Collins, Colorado.
Her doctoral research examines Agile, Waterfall, and Hybrid methodologies in the Aerospace and Defense sector, with a focus on Model-Based Systems Engineering (MBSE) in relation to requirement stability, schedule efficiency, and verification and validation processes. She has several years of systems engineering experience in the Aerospace and Defense industry.
Erika E. Gallegos is an Associate Professor in the Department of Systems Engineering at Colorado State University. She received her B.S. in Civil Engineering from Oregon State University (2010), and her M.S. (2013) and Ph.D. (2018) in Civil Engineering from the University of Washington.
Her research is centered on integrating humans with complex systems to enhance safety, performance, and sustainability in the design and evaluation of new and existing infrastructure. Dr. Gallegos’ work focuses on modeling human behavior and cognitive workload over time to evaluate the interactions between humans and machines, with an emphasis on developing appropriate trust, maintaining situational awareness, and improving decision making of human operators.
Dewey Classification: L 681 12

