7.3.1 Human Understandability for Optimizing Architectures
Author(s) - Smith, Darold K.
Publication year - 2004
Publication title - INCOSE International Symposium
Language(s) - English
Resource type - Journals
ISSN - 2334-5837
DOI - 10.1002/j.2334-5837.2004.tb00586.x
Subject(s) - computer science , clarity , ambiguity , robustness (evolution) , stakeholder , architecture , software engineering , process management , process (computing) , information system , risk analysis (engineering) , knowledge management , engineering , systems engineering
Humans define, create, and use systems. Therefore, to minimize opportunities for problems in a system's lifecycle, the work products that affect the system definition and program lifecycle activities must be highly understandable and not subject to multiple interpretations. The antidote to complexity is (human) understandability. Understandability is driven by the amount and clarity of information presented and accessible to stakeholders at a particular level in the system architecture. High-level information needs to be more general, while lower-level information must provide sufficient detail to prevent ambiguity and omissions. The issue is how to package information at each system level to maximize stakeholder understandability at that level. This paper describes ways to increase human understandability, and thereby increase system robustness, of both the architecture elements and the interfaces between them during the systems engineering (SE) process. The result is a system definition that is more likely to satisfy the SE goal of delivering an error-free product.