JUNE 2025 | Volume 46, Issue 2
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Workforce of the Future
- Encouraging Diversity in AI Test and Evaluation
Technical Articles
- Model Based Test and Evaluation Master Plan Technical Introduction
- Integrating RAG, HCD, and PD in MBSE for Mission Problem Framing
- Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities
- Surpass the Adversary: Enhanced Mission Training through Digital Engineering
- Adaptive Algorithms for LIDAR Semantic Segmentation on Edge Devices
- 2025 AI in T&E Forum
- UC UK ITEA Event Summary
- AI and ML Methods in Verification and Validation
News
- Association News
- Chapter News
- Corporate Member News
Integrating RAG, HCD, and PD in MBSE for Mission Problem Framing

Rafi Soule – PhD Candidate
Department of Engineering Management and Systems Engineering;
Old Dominion University; Norfolk, VA
Abstract
The initial phase of Mission Engineering (ME) is critical for defining mission problems or opportunities, but traditional methods in defense and space contexts rely on manual processes that are time-intensive and prone to knowledge gaps. This article introduces a Model-Based Systems Engineering (MBSE) driven framework that integrates Retrieval-Augmented Generation (RAG) with Human-Centered Design (HCD) and Participatory Design (PD) methodologies to improve early mission problem framing. Our approach embeds stakeholder workshops and surveys into a structured modeling workflow, where RAG dynamically retrieves relevant technical and operational knowledge to augment human insights (NVIDIA, 2025). This unified process yields demonstrable practitioner benefits: it produces more accurate mission problem definitions, fosters stakeholder consensus on objectives, and shortens the time required to reach alignment. We illustrate applications in defense and aerospace mission scenarios (INCOSE, 2007), showing how a single MBSE model can unify diverse information sources and drive clearer, faster decision-making.
Keywords: Mission Engineering; Model-Based Systems Engineering; Retrieval-Augmented Generation; Human-Centered Design; Participatory Design.
Introduction
Mission Engineering (ME) serves as a critical bridge between operational contexts and system design in order to achieve desired mission outcomes. The U.S. Department of Defense defines ME as a process encompassing the technical efforts to analyze, design, and integrate current and emerging capabilities for mission success (DoD MEG 2.0, 2023). Importantly, the very first phase of any ME effort is to define the Mission Problem or Opportunity, establishing the mission’s purpose, key questions, and decision needs that will guide all subsequent analysis and system integration (ASD (MC), 2024; DoD MEG 2.0, 2023). In fact, the DoD’s Mission Engineering Guide 2.0 places problem definition at the forefront of the ME process (Figure 1), underscoring its foundational importance.
Figure 1. Mission Engineering Methodology from the DoD MEG 2.0
Despite the importance of proper problem framing, current ME practices often struggle with ambiguous mission problem statements and fragmented understanding across stakeholders. Problem definitions tend to be unclear or overly technical, stakeholder misalignment creates implementation barriers, knowledge is siloed across organizations, and mission context is poorly visualized (Rudder, 2024; Wilking et al., 2024). A key reason for these shortcomings is that traditional ME efforts have historically emphasized technical system performance over human factors and operational context (Rudder, 2024). Focusing exclusively on quantitative system metrics can yield a false sense of objectivity while neglecting the qualitative aspects of how humans will use and interact with the system. In other words, a mission defined only by technical parameters may satisfy engineering specifications yet still miss critical user needs or contextual nuances, leading to gaps and ambiguities in the problem understanding. As a result, the solutions derived from such a narrow problem definition risk being misaligned with real-world conditions and end-user expectations (Herrera, 2024; Whitmore, 2023).
One approach to improve clarity and consistency in the problem-definition phase is to employ a Model-Based Systems Engineering (MBSE) methodology. MBSE provides a formalized approach to capturing mission models, requirements, and system-of-systems relationships in a single environment (INCOSE, 2007). By creating a digital model, MBSE makes complexity explicit and maintains traceability of decisions. However, purely technical MBSE tools can suffer from usability issues and often exclude non-technical stakeholders, leaving gaps in human and organizational context (Ma et al., 2022; Pasupuleti, 2023; Wilking et al., 2024). In effect, a purely MBSE-driven process may still fall short of addressing the human and organizational dimensions of the mission problem.
To bridge this gap, there is growing recognition that mission problem framing should incorporate human-centered perspectives and collaborative practices alongside technical modeling. Recent DoD research and guidance have called for more interdisciplinary and user-inclusive approaches to ME (Elmer L. Roman, 2023; Hernandez & Pollman, 2022). In line with this, Human-Centered Design (HCD) and Participatory Design (PD) offer promising methodologies to enrich the early stages of mission engineering. HCD is a design approach that prioritizes end-user needs and experiences, ensuring solutions are not only technically sound but also usable and effective in real-world scenarios (Gordon et al., n.d.; NASA/JSC, 2024). PD, on the other hand, actively involves stakeholders (especially end users and domain experts) in co-designing solutions, fostering collaboration and buy-in while ensuring that outcomes align with user expectations and operational context (DoD MEG 2.0, 2023; Watson et al., 2017). By engaging users and stakeholders early through HCD/PD techniques – such as workshops, interviews, and iterative prototyping – ME practitioners can surface context-specific insights and integration issues that purely technical analysis might overlook. This human-centric approach helps ensure that the defined mission problem addresses root causes rather than just symptoms, and that the resulting requirements truly reflect operational realities.
Recent AI advances can further improve mission problem framing. In particular, Retrieval-Augmented Generation (RAG) augments language models with external knowledge: it dynamically retrieves relevant documents or data to ground its outputs. RAG has been shown to enhance the accuracy and reliability of AI-generated content by incorporating factual, domain-specific information (NVIDIA, 2025), automatically pulling in up-to-date material (e.g., doctrine documents, prior mission lessons learned, or subject-matter expert inputs) to ground AI-generated analyses in factual context (Lewis et al., 2020). In mission engineering, a RAG-assisted process could automatically gather context (e.g. threat data, lessons learned) to inform the evolving mission definition.
This article proposes an integrated MBSE framework that unifies RAG, HCD, and PD into the mission problem-definition workflow. Human-centered co-design activities feed insights into the MBSE model, which in turn triggers RAG-based retrieval of relevant organizational knowledge. By integrating these elements, the framework aims to produce mission problem definitions that are clearer, more comprehensive, and more stakeholder-aligned than those developed by traditional methods (Elmer L. Roman, 2023; Hernandez & Pollman, 2022). By embedding all inputs and updates in a single model, the framework addresses fragmentation across technical, organizational, and human dimensions. The result is a more accurate and aligned mission problem definition, with faster consensus among stakeholders early in the engineering process.
Technical Content
Problem Statement
Poorly framed missions can lead to costly delays or mismatched capabilities. For example, NASA's Venus program illustrates this risk: the VERITAS spacecraft, originally slated to launch in 2027, was deferred to no earlier than 2031 as funds were diverted, putting its engineering team on hold and breaking the plan to supply foundational data for sister missions (Kuthunur, 2023). In the defense sector, repeated continuing resolutions have similarly disrupted acquisition. A stopgap FY2024 budget locked the Pentagon at prior-year levels, blocking new-start programs and creating funding misalignments (e.g. $26B in Navy appropriations) that stalled procurement of key systems such as Patriot missiles and Javelin anti-tank weapons (McDougall, 2024). These cases, where mission contexts and requirements were incomplete or shifted late, directly delayed deployments and risked fielding systems misaligned with warfighter needs.
- NASA VERITAS delay: The decadal Venus mission was frozen after its budget was repurposed, disbanding the team and jeopardizing related missions (Kuthunur, 2023).
- Defense procurement freeze: Continuing resolutions in FY2023–FY2024 kept the DoD at FY23 funding, blocking new acquisition starts and stalling procurement of air defense and missile systems (e.g. Patriots, Javelins) (McDougall, 2024).
These examples highlight how insufficient mission engineering (e.g. unclear mission threads, late stakeholder feedback) can fragment understanding and postpone fielding of the right capabilities.
Technological Fragmentation
Current ME processes demonstrate insufficient focus at the system-of-systems level, with inadequate tradeoff analyses at the true mission level. Knowledge integration remains fragmented, with mission data scattered across disparate systems without cohesive integration mechanisms. This fragmentation limits the ability to generate comprehensive, contextually appropriate problem statements that reflect organizational realities (Elmer L. Roman, 2023).
Organizational Fragmentation
Inconsistent ME approaches between service groups inhibit cross-mission thread analysis. Significant disconnects exist between leadership perceptions and engineering implementation, with many projects operating against isolated operational concept documents rather than integrated mission frameworks. Poor stakeholder alignment leads to scope conflicts, delayed decisions, and diminished project value (Berggren et al., 2001).
Human Dimension Fragmentation
ME processes typically emphasize technical system performance over human factors and stakeholder engagement. This neglect of the human dimension results in systems that may meet technical specifications but fail to address actual user needs or operational realities. Mission engineering lacks systematic approaches for capturing and integrating stakeholder perspectives into mission problem definitions (Errida & Lotfi, 2021; Raheb, 1997).
Figure 2. The differences between PD and UCD, SD and HCD.
HCD blends meta-design and service design and is closely related to anthropology; it is applied more often in social development than in service development (Hagan, 2024).
Retrieval-Augmented Generation (RAG)
RAG is an advanced AI framework that combines the generative capabilities of Large Language Models (LLMs) with real-time retrieval of external, domain-specific knowledge. This integration enhances the accuracy, relevance, and contextual understanding of AI-generated responses by grounding them in factual data. By 2025, RAG had evolved beyond simple text retrieval to incorporate multimodal content integration, real-time knowledge graphs, and hybrid AI architectures (Blynn et al., 2021; M. K. Norman et al., 2021).
In the context of mission problem definition, RAG addresses knowledge fragmentation by:
- Dynamically retrieving relevant information from organizational knowledge repositories
- Minimizing “hallucinations” by grounding problem statements in factual organizational data
- Enabling access to real-time, domain-specific knowledge without requiring extensive retraining
- Enhancing traceability through source references for retrieved information
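To make the retrieval-and-grounding idea concrete, the minimal Python sketch below ranks repository entries against a query and carries source IDs forward so every claim in the draft remains traceable. The corpus entries and the keyword-overlap scoring are illustrative stand-ins for a real embedding-based retriever; none of the names come from a specific tool.

```python
# Minimal sketch of RAG-style grounded retrieval with source traceability.
# The corpus and keyword-overlap scoring are illustrative stand-ins for a
# real embedding-based retriever over an organizational repository.

def tokenize(text):
    return set(text.lower().split())

def retrieve(query, corpus, top_k=2):
    """Rank documents by keyword overlap with the query; keep source IDs."""
    scored = []
    for doc_id, text in corpus.items():
        overlap = len(tokenize(query) & tokenize(text))
        scored.append((overlap, doc_id))
    scored.sort(reverse=True)
    return [doc_id for overlap, doc_id in scored[:top_k] if overlap > 0]

def grounded_statement(query, corpus):
    """Draft a problem-statement stub that cites its retrieved sources."""
    sources = retrieve(query, corpus)
    evidence = " ".join(corpus[s] for s in sources)
    return {"query": query, "evidence": evidence, "sources": sources}

# Hypothetical knowledge repository entries
corpus = {
    "doctrine-3.1": "air defense mission threads require layered sensor coverage",
    "lessons-2023": "prior air defense exercises showed sensor coverage gaps at low altitude",
    "spec-veh-88": "ground vehicle maintenance intervals and fuel logistics",
}

draft = grounded_statement("define air defense sensor coverage problem", corpus)
```

Because the returned `sources` list accompanies the drafted evidence, a reviewer can audit exactly which repository entries grounded each part of the problem statement.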
Figure 3. RAG Workflow Diagram.
This diagram illustrates the RAG process, where user queries are enhanced with relevant information retrieved from knowledge sources before being processed by a large language model, resulting in more accurate and context-aware responses (Núria, 2025).
Human-Centered Design (HCD)
HCD is a problem-solving methodology that prioritizes user needs, desires, and experiences throughout the design process. Its key principles include being people-centered, understanding and solving root problems, thinking of everything as a system, and implementing small, iterative interventions (Townsend & Romme, 2024). Originally popularized by Don Norman, HCD has evolved to emphasize designing with people rather than simply for them (D. Norman, 2014).
In mission problem definition, HCD contributes by:
- Focusing on stakeholder needs and operational contexts
- Addressing underlying mission challenges rather than symptoms
- Viewing mission problems within interconnected systems
- Enabling iterative refinement of problem statements based on stakeholder feedback
Figure 4. The Human-Centered Design (Julia Braga, 2019).
Participatory Design (PD)
PD actively involves stakeholders, especially end users, throughout the design process. This approach emphasizes collaboration, empowerment, iteration, contextual understanding, and user advocacy. Unlike traditional design approaches that often rely on the designer’s expertise with less direct user involvement, PD enables co-creation and fosters a sense of ownership among stakeholders (Schuler & Namioka, 1993; Spinuzzi, 2005; Tremblay-Boire, 2013).
For mission problem definition, PD enhances the process through:
- Collaborative activities where stakeholders contribute knowledge and insights
- Empowering users to actively influence problem formulation
- Iterative refinement based on continuous feedback
- Deepening contextual understanding of mission environments
- Ensuring stakeholder interests are prioritized in problem framing
Figure 5. Co-design Process Cycle (McKercher, 2020).
MBSE-Based Co-Design Framework (RAG + HCD/PD Integration)
We propose an MBSE-based co-design framework that iteratively integrates AI-driven knowledge retrieval with human-centered design activities. To support MBSE, one could fine-tune a custom LLM (or use a commercial LLM service) on systems engineering text and link it to the model repository. For example, tools like PivotPoint’s MBSEmaestro pair GPT-4o with SysML tools (Cameo Systems Modeler) to generate executable architecture models from architectural patterns. Such integration allows an engineer to ask natural-language questions about a model (e.g. “What operational scenarios should System X support?”) and receive answers grounded in the indexed knowledge base (MBSEmaestro, 2024).
Figure 6 depicts this co-design process cycle as a continuous series of stages, illustrating how automated data analytics and stakeholder collaboration feed into each other. The process synchronizes three critical inputs – stakeholder feedback, organizational knowledge, and AI-driven analysis – to ensure that both human insights and data-driven context shape the mission problem definition (Cornford & Feather, 2016; Fant & Pettit, 2023).
Figure 6. Co-Design Process Cycle for Mission Problem Framing – adapted from McKercher, 2020
The stages of this continuous co-design cycle are as follows:
1. Data Collection & Preparation: The process begins with a new mission situation or problem prompt that triggers a structured survey of stakeholders. This structured survey is a set of well-defined questions or prompts (e.g. about mission objectives, constraints, perceived challenges, and priorities) designed to capture the mission context from those with operational knowledge. The survey itself can be generated using predefined mission-analysis templates or AI assistance to ensure comprehensive coverage of relevant topics. Stakeholder responses to this survey provide explicit inputs about the mission needs and context. In parallel, engineers gather and organize existing mission data – such as doctrine, prior mission reports, lessons learned, and system specs – into an organizational knowledge repository. The result of Stage 1 is a collection of human inputs and documentation ready for analysis (including a knowledge base indexed for RAG).
Figure 7. Structured Data Gathering for Mission Context.
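Stage 1 can be illustrated with a short sketch: a structured survey paired with stakeholder answers, and documents indexed by ID into a repository ready for later retrieval. The survey questions, stakeholder roles, and document entries below are hypothetical examples, not a prescribed schema.

```python
# Sketch of Stage 1: a structured stakeholder survey plus a knowledge
# repository prepared for later RAG retrieval. All field names and
# documents are illustrative.

SURVEY_TEMPLATE = [
    "What is the primary mission objective?",
    "What constraints (time, budget, policy) apply?",
    "What do you see as the main operational challenge?",
    "Which outcomes would you prioritize first?",
]

def collect_responses(stakeholders, answers):
    """Pair each stakeholder's answers with the survey questions."""
    return {
        name: dict(zip(SURVEY_TEMPLATE, answers[name]))
        for name in stakeholders
    }

def build_repository(documents):
    """Index documents by ID; a real system would also embed them for RAG."""
    return {doc["id"]: doc for doc in documents}

stakeholders = ["commander", "operator"]
answers = {
    "commander": ["Deny hostile ISR", "FY26 budget cap", "Sensor gaps", "Coverage"],
    "operator": ["Deny hostile ISR", "Crew workload limits", "Alert fatigue", "Usability"],
}
responses = collect_responses(stakeholders, answers)

documents = [
    {"id": "doctrine-3.1", "text": "layered sensor coverage doctrine"},
    {"id": "lessons-2023", "text": "low-altitude coverage gap observed"},
]
repo = build_repository(documents)
```

The output of this stage, structured responses plus an indexed repository, is exactly the pair of inputs the Stage 2 analysis consumes.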
2. RAG-Enabled Context Analysis: In Stage 2, the RAG engine automatically retrieves relevant information from the curated knowledge repository and uses it to draft an initial mission context and problem statement. In practice, the stakeholder inputs from Stage 1 (survey results and any additional queries from the mission team) are fed into a semantic search of the knowledge base. The RAG system employs embedding models and smart filters to pull out pertinent data, which is then stored (e.g. in a vector database) for use by a language model. The language model (LLM) generator uses the retrieved data along with its trained knowledge to produce a draft problem definition that is grounded in the organizational context. This output might include a summary of the mission environment, key challenges, and objectives, complete with traceable references to source documents (Lewis et al., 2020). The RAG-generated draft ensures the initial problem framing is evidence-based rather than reliant solely on the team’s memory or perspective, thereby beginning to integrate fragmented knowledge from across the organization into a cohesive context.
Figure 8. AI-Powered Contextualization of Mission Inputs.
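The semantic-search step in Stage 2 can be sketched in a few lines: documents and queries are mapped to vectors and ranked by cosine similarity. The three-dimensional vectors here are toy stand-ins for real embedding-model outputs held in a vector database.

```python
# Sketch of Stage 2's semantic search: documents and queries live in a
# shared vector space and are ranked by cosine similarity. The 3-D vectors
# are toy stand-ins for real embedding-model outputs.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_search(query_vec, index, top_k=1):
    """Return IDs of the documents most similar to the query vector."""
    ranked = sorted(index, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:top_k]]

# Hypothetical embedded repository (vectors would come from an embedding model)
index = [
    {"id": "doctrine-3.1", "vec": [0.9, 0.1, 0.0]},   # air defense doctrine
    {"id": "lessons-2023", "vec": [0.8, 0.3, 0.1]},   # coverage lessons learned
    {"id": "spec-veh-88",  "vec": [0.0, 0.1, 0.9]},   # unrelated vehicle spec
]

query_vec = [1.0, 0.2, 0.0]  # stakeholder query mapped into the same space
hits = semantic_search(query_vec, index, top_k=2)
```

The retrieved hits would then be passed, with their source IDs, to the LLM generator to ground the draft problem definition.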
3. Stakeholder Co-Design Workshops: Stage 3 introduces human-centric refinement by convening HCD/PD-driven co-design sessions with stakeholders to review and iteratively improve the RAG draft. In facilitated workshops, diverse stakeholders – e.g. operational commanders, end users (operators), subject matter experts, and even adversarial perspective experts – are invited to critique and modify the draft problem statement collaboratively. Using participatory design methods (brainstorming, storyboarding mission scenarios, voting on priorities, etc.), the group identifies errors, clarifies ambiguities, and contributes missing perspectives or requirements. Any preliminary models or diagrams (mission thread sketches, use-case diagrams, etc.) can be updated in real time during these sessions. This stage embodies co-design: rather than engineers defining the problem in isolation, stakeholders and designers share decision-making. The RAG-proposed statement from Stage 2 serves as a starting point or “strawman” which the human participants validate against real-world experience (HCD’s influence) and reshape as needed (PD’s influence). By the end of Stage 3, the mission problem statement and top-level requirements are co-created and agreed upon by the group, ensuring that human factors (e.g. usability, workflow constraints, stakeholder priorities) are fully considered. This participatory stage is crucial for building consensus and stakeholder alignment early, as the problem definition now reflects a shared understanding across technical and operational communities.
Figure 9. Collaborative Refinement Through Co-Design.
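One participatory mechanic from these workshops, voting on priorities, can be sketched as a Borda-style tally of ranked ballots. The ballot contents and the choice of a Borda count are illustrative assumptions, not part of the framework's prescription.

```python
# Sketch of a Stage 3 participatory-design mechanic: tallying stakeholder
# priority votes on candidate problem-statement elements. Participants,
# items, and the Borda-style scoring are illustrative.
from collections import Counter

def tally_votes(ballots):
    """Each ballot ranks items best-first; award points by rank (Borda count)."""
    scores = Counter()
    for ranking in ballots:
        for points, item in enumerate(reversed(ranking), start=1):
            scores[item] += points
    return scores

ballots = [
    ["sensor coverage", "crew workload", "logistics"],   # commander's ranking
    ["crew workload", "sensor coverage", "logistics"],   # operator's ranking
    ["sensor coverage", "logistics", "crew workload"],   # SME's ranking
]
scores = tally_votes(ballots)
top_priority = scores.most_common(1)[0][0]
```

A tally like this makes the group's priority ordering explicit and auditable, which supports the consensus-building goal of the workshop stage.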
4. Final Problem Definition (MBSE Capture): In Stage 4, the outcomes of the co-design workshops are synthesized into a finalized problem definition and formally captured in the MBSE environment. This involves translating the refined mission problem statement, objectives, and key requirements into the structured format of a model-based representation. For example, systems engineers might update a SysML requirements diagram to include the mission objectives and constraints identified, or create an OV-1 operational view to graphically illustrate the mission scenario and context. The result is an authoritative, model-based mission problem definition artifact that will guide subsequent engineering activities (solution design, analysis, and testing). Because the definition is recorded in an MBSE repository, it is now part of the mission’s digital thread – linked to other model elements (stakeholder needs, use cases, initial operational concepts) and ready to be traced forward into design and evaluation. The MBSE formalization ensures nothing from the collaborative sessions is lost and provides a single source of truth for the mission problem going forward.
Figure 10. Modeling the Mission Definition in MBSE – Visual Concept.
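The MBSE capture step can be sketched as structured elements with explicit traceability links, in the spirit of a SysML requirements model; the element names, IDs, and source labels below are hypothetical.

```python
# Sketch of Stage 4: capturing the agreed problem definition as structured
# model elements with traceability links back to sources and workshop
# outcomes. Names and IDs are illustrative.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    derived_from: list = field(default_factory=list)  # source docs / workshop notes

@dataclass
class MissionProblem:
    statement: str
    requirements: list = field(default_factory=list)

    def trace(self, req_id):
        """Return the sources a requirement traces back to."""
        for req in self.requirements:
            if req.req_id == req_id:
                return req.derived_from
        raise KeyError(req_id)

problem = MissionProblem(statement="Close low-altitude sensor coverage gaps")
problem.requirements.append(
    Requirement("REQ-001", "Provide layered low-altitude sensor coverage",
                derived_from=["doctrine-3.1", "workshop-2025-03"])
)
```

Keeping the `derived_from` links inside the model is what lets any later change, a new document or a revised workshop decision, propagate through the digital thread rather than being lost in meeting notes.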
Importantly, this four-stage framework is not a one-time linear sequence but a continuous cycle. After Stage 4, if new information arises or stakeholder priorities change, the process can iterate again: the team updates the data repository or gathers new stakeholder input, triggers the RAG analysis again, and reconvenes stakeholders to refine the problem definition further. In other words, Figure 6’s co-design cycle is intended to be ongoing, allowing the problem definition to evolve as the mission context itself evolves. Throughout this cycle, the MBSE backbone facilitates seamless integration of the AI-driven outputs and human inputs. As stakeholders modify mission needs in Stage 3, those changes are immediately reflected in the model-based documentation; by Stage 4 the MBSE environment contains a living model of the mission problem space. This tight coupling of RAG (AI) with HCD/PD (human-centered co-design), all anchored by MBSE, enables traceability and responsiveness. The framework ensures that any update – whether from a new piece of data or a stakeholder insight – propagates through the model. This way, the problem definition remains current, consistent, and well-understood by all parties, thereby addressing the common pitfalls of fragmentation and misalignment in traditional approaches.
Benefits and Expected Outcomes
By applying the above human-centered, model-supported process, the framework is expected to yield several key benefits in the mission problem framing stage:
Enhanced Problem Definition Clarity: Integrating stakeholder perspectives with organizational knowledge through a structured MBSE approach produces clearer and more precise problem statements. Because the initial problem definition is grounded in real data (via RAG) and vetted by end-users and decision-makers, the resulting statement avoids the ambiguity and bias that often plague traditional mission definitions. These problem statements are tightly aligned with operational realities and mission objectives, providing a solid foundation for subsequent engineering work. In essence, the framework ensures the right problem is being addressed, as the definition has been sharpened by both evidence and user insight (Cornford & Feather, 2016; Fant & Pettit, 2023).
Improved Stakeholder Alignment: The participatory nature of the framework ensures that diverse stakeholder perspectives are captured and integrated into the problem definition. Rather than a top-down formulation, stakeholders from different echelons (leadership, operators, engineers) collaboratively shape the mission statement. This co-creation process reduces misalignment between leadership intent and engineering interpretation, fostering greater consensus around the defined mission problem or opportunity (Evans et al., 2009; Weiland, 2021). Moreover, because the agreed-upon mission needs and assumptions are documented in an MBSE model, all stakeholders share a common reference point. The MBSE repository acts as a communication bridge – a single source of truth that everyone can consult – further solidifying stakeholder buy-in and understanding. In short, the framework not only builds alignment through collaboration but also maintains that alignment by capturing it in an authoritative model that guides future work.
Facilitated Knowledge Integration: By leveraging RAG technology within the workflow, the framework enables seamless integration of organizational knowledge into the problem-definition process. Critical information from doctrine, past missions, technical reports, and other data sources is dynamically retrieved and synthesized into the evolving mission context, which helps break down knowledge silos (Lewis et al., 2020). Additionally, MBSE plays an important role in institutionalizing this knowledge integration. The relevant data and context obtained via RAG are not just presented informally – they are incorporated into the model (for example, linked to specific requirements, design constraints, or rationale in the digital thread). This means the typically fragmented expertise and historical knowledge become part of a structured representation accessible to all team members. The result is a problem definition enriched with corporate memory and lessons learned, ensuring that the mission engineering effort benefits from the full spectrum of available knowledge. This integrated approach addresses one of the key limitations of traditional ME, where information often remains scattered and hard to trace.
Enhanced Visualization and Communication: An MBSE-based approach inherently provides improved visualization capabilities for complex mission architectures and problem spaces. By capturing the mission problem in models and diagrams, the framework makes the problem more transparent and communicable to various stakeholders. For example, graphical views like operational concept diagrams, hierarchy diagrams of involved systems, or stakeholder requirement matrices can be automatically generated or updated as part of the MBSE repository. Such visual representations help translate complex technical information into formats that are easier to understand. This greatly enhances communication across diverse groups – engineers, analysts, commanders, and non-technical stakeholders can all engage with the model at the level of detail appropriate for them. As a result, mission problems and context are more accessible to a broad audience, which improves shared understanding and expedites feedback cycles (Fant & Pettit, 2023). In summary, the combination of RAG’s data-driven summaries with MBSE’s visualization tools provides stakeholders a clearer window into the problem space than traditional text documents would, thereby facilitating more effective discussion and decision-making.
Future Work and Validation
While the proposed framework is promising, further work is needed to refine and validate it, as well as to ease its adoption in practice. Key areas for future work include:
- Tool Support and Automation: Developing dedicated tools and methodological guides to support implementation of this integrated MBSE/RAG/HCD/PD approach. This may involve software that automates the structured survey generation and RAG querying, or plugins for MBSE platforms to streamline the import of RAG outputs and the management of stakeholder input within the model. Effective tool support will lower the barrier to adoption and ensure consistency in how the process is applied.
- Validation Studies: Conducting rigorous validation studies in a variety of mission contexts to test and evaluate the effectiveness of the framework. For example, pilot implementations could be carried out in controlled mission planning exercises or wargame scenarios, where one team uses the integrated framework and another uses a traditional problem-definition approach. Metrics such as the clarity and completeness of the final problem definitions, the degree of stakeholder satisfaction and consensus, and the time or iterations required to reach a finalized definition can be collected. Through these studies, the framework’s impact on mission engineering outcomes can be measured empirically. Emphasizing formal test and evaluation in this manner will provide evidence as to whether the human-centered MBSE approach yields statistically significant improvements over the status quo. The insights from these evaluations will also guide any needed refinements to the process before broader deployment.
- Integration with Existing Processes: Exploring how to integrate the proposed approach with existing Mission Engineering frameworks, guides, and tools in the defense community. Rather than supplanting current processes, the framework could complement them – for instance, by embedding the co-design cycle within the early phases of the DoD’s Mission Engineering Guide workflow. Future efforts will examine interoperability with legacy requirements-management systems, compliance with documentation standards, and ways to export the MBSE-defined problem context into downstream analysis or acquisition tools. Demonstrating that the framework can plug into current organizational workflows will be important for real-world adoption.
- Broadening to Other Domains: Investigating the applicability of the approach to different domains beyond the initial defense use-cases. The core challenges of mission problem framing – stakeholder alignment, knowledge integration, and so on – appear in various fields such as disaster response, cybersecurity, healthcare, and infrastructure planning. Future research will apply the framework to case studies in some of these domains to confirm its generalizability. This will help identify any domain-specific adjustments needed (for example, tailoring the structured survey questions or knowledge base configuration for a humanitarian mission versus a military one). Proving the framework’s value across multiple contexts will strengthen the argument for its adoption and could foster cross-domain learning in complex system-of-systems engineering.
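The metrics proposed for the validation studies above can be operationalized simply. The sketch below computes a consensus score and iterations-to-alignment from hypothetical pilot data; the stakeholder roles, the 0.8 threshold, and the vote histories are all illustrative assumptions.

```python
# Sketch of validation-study metrics: a consensus score (share of
# stakeholders approving the problem definition) and iterations needed
# to reach an alignment threshold. All data below is hypothetical.

def consensus_score(approvals):
    """Fraction of stakeholders who approve the current problem definition."""
    return sum(approvals.values()) / len(approvals)

def iterations_to_alignment(history, threshold=0.8):
    """First iteration (1-indexed) at which consensus meets the threshold."""
    for i, approvals in enumerate(history, start=1):
        if consensus_score(approvals) >= threshold:
            return i
    return None

# Hypothetical pilot data: framework team vs. baseline team
framework_history = [
    {"cmd": True, "op": False, "sme": True},   # iteration 1: 2/3 approve
    {"cmd": True, "op": True, "sme": True},    # iteration 2: 3/3 approve
]
baseline_history = [
    {"cmd": True, "op": False, "sme": False},  # iteration 1: 1/3 approve
    {"cmd": True, "op": True, "sme": False},   # iteration 2: 2/3 approve
    {"cmd": True, "op": True, "sme": True},    # iteration 3: 3/3 approve
]
```

Collecting such measures for both teams in a controlled exercise would give the empirical comparison the validation studies call for.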
By pursuing these future work threads, we aim to both validate the benefits of the integrated approach and address practical considerations for its use. In particular, well-designed validation and testing efforts will provide concrete evidence to decision-makers about the framework’s efficacy, ensuring that the proposed co-design cycle delivers real improvements in mission engineering outcomes.
Conclusion
The proposed MBSE-based framework addresses the critical dimensions of fragmentation (technological, organizational, and human) that traditionally limit ME effectiveness. By integrating RAG’s dynamic knowledge retrieval capabilities with HCD’s emphasis on user needs and PD’s collaborative methods, the framework significantly improves the accuracy of mission problem definitions, fosters stakeholder consensus, and reduces the time required to reach alignment.
As mission engineering grows increasingly essential for tackling complex challenges in defense, space, and other mission-critical domains, this integrated approach offers a transformative methodology. Practitioners gain practical tools to unify diverse knowledge sources and human perspectives within a coherent MBSE environment, directly enhancing decision-making clarity and speed. Additionally, the approach provides theoretical advancements by bridging systems thinking with artificial intelligence and human-centered methodologies.
Future validation activities such as planned pilot studies and comparative exercises will further confirm the framework’s efficacy and refine its implementation. Ultimately, this integrated methodology represents a meaningful evolution in mission engineering practice, delivering clearer mission contexts, stronger stakeholder alignment, and improved mission outcomes.
References
ASD(MC), Assistant Secretary of Defense for Mission Capabilities. (2024). Mission Engineering – ASD(MC). https://ac.cto.mil/mission-engineering/
Berggren et al. (2001). Clients, contractors, and consultants: the consequences of organizational fragmentation in contemporary project environments. https://www.pmi.org/learning/library/organizational-fragmentation-contemporary-environments-5288
Blynn, E., Harris, E., Wendland, M., Chang, C., Kasungami, D., Ashok, M., & Ayenekulu, M. (2021). Integrating Human-Centered Design to Advance Global Health: Lessons From 3 Programs. Global Health: Science and Practice, 9(Supplement 2), S261–S273. https://doi.org/10.9745/GHSP-D-21-00279
Cornford, S. L., & Feather, M. S. (2016). Model Based Mission Assurance in a Model Based Systems Engineering (MBSE) Framework State-of-the-Art Assessment. http://www.sti.nasa.gov
DoD MEG 2.0. (2023). Department of Defense Mission Engineering Guide 2.0. https://ac.cto.mil/mission-engineering/
Roman, E. L. (2023). Mission Engineering Implementation – Impacts & Challenges. https://ndiastorage.blob.core.usgovcloudapi.net/ndia/2023/systems/Tue_1560075_Roman_Panel.pdf
Errida, A., & Lotfi, B. (2021). The determinants of organizational change management success: Literature review and case study. International Journal of Engineering Business Management, 13. https://doi.org/10.1177/18479790211016273
Evans, J., Cornford, S., & Feather, M. S. (2009). Model Based Mission Assurance: NASA’s Assurance Future.
Fant, J. S., & Pettit, R. G. (2023). MBSE mission assurance. Handbook of Model-Based Systems Engineering, 861–893. https://doi.org/10.1007/978-3-030-93582-5_72/TABLES/3
Gordon, P., Kramer, J., Moore, G., Yeung, W., & Agogino, A. (n.d.). A Systematic Review of Human-Centered Design for Development in Academic Research.
Hagan, M. (2024). A Brief History of Design Thinking – Open Law Lab. Open Law Lab. https://www.openlawlab.com/2013/09/09/a-brief-history-of-design-thinking-2/
Hernandez, A. S., & Pollman, A. G. (2022). Characterizing integration challenges in mission engineering to form solution strategies. Systems Engineering, 25(4), 404–418. https://doi.org/10.1002/SYS.21621
Herrera, E. (2024). Human Factors & Performance – NASA. https://www.nasa.gov/reference/jsc-human-factors-performance/
INCOSE. (2007). Systems Engineering Vision 2020.
Braga, J. (2019). Is human-centered design broken? UX Collective. https://uxdesign.cc/is-human-centred-design-broken-cac130eecc48
Kuthunur, S. (2023). Delays to NASA’s VERITAS mission a major blow for Venus exploration. Space.com. https://www.space.com/veritas-mission-delay-affect-venus-exploration
Lewis, P., Perez, E., Piktus, A., Petroni, F., Karpukhin, V., Goyal, N., Küttler, H., Lewis, M., Yih, W. T., Rocktäschel, T., Riedel, S., & Kiela, D. (2020). Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks. Advances in Neural Information Processing Systems, 2020-December. https://arxiv.org/abs/2005.11401v4
Ma, J., Wang, G., Lu, J., Vangheluwe, H., Kiritsis, D., & Yan, Y. (2022). Systematic Literature Review of MBSE Tool-Chains. Applied Sciences 2022, Vol. 12, Page 3431, 12(7), 3431. https://doi.org/10.3390/APP12073431
MBSEmaestro. (2024). MBSEmaestro AI Toolkit™. https://mbsemaestro.ai/toolkits/
McDougall, S. (2024). Congress Delays FY24 DoD Budget Again, Raising Fears of a Year-Long Continuing Resolution – Defense Security Monitor. https://dsm.forecastinternational.com/2024/03/01/congress-delays-fy24-dod-budget-again-raising-fears-of-a-year-long-continuing-resolution/
McKercher, K. A. (2020). What is co-design? — Beyond Sticky Notes. Beyond Sticky Notes. https://www.beyondstickynotes.com/what-is-codesign
NASA/JSC. (2024). Human Factors Engineering Requirements for the International Space Station.
Norman, D. (2014). The Design of Everyday Things. https://mitpress.mit.edu/9780262525671/the-design-of-everyday-things/
Norman, M. K., Hamm, M. E., Schenker, Y., Mayowski, C. A., Hierholzer, W., Rubio, D. M., & Reis, S. E. (2021). Assessing the application of human-centered design to translational research. Journal of Clinical and Translational Science, 5(1), e130. https://doi.org/10.1017/CTS.2021.794
Núria, E. (2025). Retrieval Augmented Generation (RAG) as a Solution to LLM. https://blog.bismart.com/en/what-is-retrieval-augmented-generation-rag
NVIDIA. (2025). What is Retrieval-Augmented Generation (RAG)? https://www.nvidia.com/en-us/glossary/retrieval-augmented-generation/
Pasupuleti, S. (2023). Model-Based Systems Engineering (MBSE) for the Design and Integration of Complex Robotics Systems. ESP Journal of Engineering & Technology Advancements, 3(3), 126–132. https://doi.org/10.56472/25832646/JETA-V3I7P116
Raheb, M. (1997). Mission in the Context of Fragmentation. International Review of Mission. https://www.academia.edu/10284947/MISSION_IN_THE_CONTEXT_OF_FRAGMENTATION
Rudder, S. (2024). Using a model-based systems engineering framework for human-centric design. Human Factors in Design, Engineering, and Computing, 159(159). https://doi.org/10.54941/AHFE1005598
Schuler, D., & Namioka, A. (Eds.). (1993). Participatory design: Principles and practices. https://psycnet.apa.org/record/1993-97696-000
Spinuzzi, C. (2005). The Methodology of Participatory Design.
Townsend, M., & Romme, A. G. L. (2024). The Emerging Concept of the Human-Centered Organization: A Review and Synthesis of the Literature. Humanistic Management Journal, 9(1), 53–74. https://doi.org/10.1007/S41463-024-00168-W/FIGURES/1
Tremblay-Boire, J. (2013). Change Can Be Good: A New Perspective on Mission Drift.
Watson, M. E., Rusnock, C. F., Colombi, J. M., & Miller, M. E. (2017). Human-Centered Design Using System Modeling Language. Journal of Cognitive Engineering and Decision Making, 11(3), 252–269. https://doi.org/10.1177/1555343417705255
Weiland, K. J. (2021). Future Model-Based Systems Engineering Vision and Strategy Bridge for NASA. http://www.sti.nasa.gov
Whitmore, M. (2023). Role of Human Factors Engineering and Human-Systems Integration in Future Space Exploration.
Wilking, F., Horber, D., Goetz, S., & Wartzack, S. (2024). Utilization of system models in model-based systems engineering: definition, classes and research directions based on a systematic literature review. Design Science, 10, e6. https://doi.org/10.1017/DSJ.2024.3
Author Biographies
Rafi Soule is a PhD Candidate in Systems Engineering at Old Dominion University. Her research focuses on integrating AI technologies with human-centered methodologies to enhance Mission Engineering practices. Her work advances model-based approaches for conceptual mission analysis and supports the ethical integration of AI into mission engineering practices. Rafi has professional experience in systems engineering and has contributed to multiple research projects involving MBSE and AI applications in complex mission environments.
Dewey Classification: L 681 12

