Record metadata:

rowid: 68
timestamp: 20120507040927
memento: https://web.archive.org/web/20120507040927/http://www.defence.gov.au/dgta/Documents/DAVENG/Software%20Symposium%20documents/2008/Presentations/An%20Approach%20to%20Software%20Dependability%20Evaluation%20(Uzunov).ppt
first_capture: 2012-05-07
last_capture: 2012-05-07
current_status: 404

Extracted text:

Software Dependability Evaluation
Kiril Uzunov
Airborne Mission Systems, Air Operations Division
08 8259 5824, kiril.uzunov@dsto.defence.gov.au
ADF Software Symposium ’08 – RAAF Williams, Laverton
"Software Support Agencies – Assuring the Continuing Dependability of Software"

Dependability is a generic term that includes:
• reliability,
• availability,
• maintainability,
• safety, and
• integrity
Also known as RAMSS.

Agenda
• Software quality and software architecture
• Evaluation of software architectures
• Quality attributes of Airborne Mission Systems
• The AMS_BN model
• Questions

Trends
• The challenge is to meet the quality expectations, not just the functionality
• Software architecture is coming into the limelight as a framework for capturing all major decisions
• Increased role of software architecture during system evolution
• Quality attributes are constrained by the architecture to a large extent
• Increased use of quantitative assessment of quality
• Need for predicting quality
• "Testing by itself does not improve software quality. Test results are an indicator of quality, but in and of themselves, they don't improve it. Trying to improve software quality by increasing the amount of testing is like trying to lose weight by weighing yourself more often. ...
If you want to improve your software, don't test more; develop better." – Steve McConnell, Code Complete

Quality framework
• Management-oriented view of quality
• Software-oriented attributes
• Quantitative measures of these attributes
[Diagram: Quality Factor → Quality Criteria → Metrics (McCall 1994)]

[Flowchart (Bosch 2001): a functionality-based architecture design is derived from the requirement specification (functional requirements); its quality factors are evaluated against the quality requirements; if not OK, an architecture transformation is applied and the cycle repeats; if OK, the application architecture is done]

Architecture evaluation techniques (Bosch 2001)
• Architecture-oriented (review-based)
• Quality-attribute-focused (3+1 techniques):
  – Scenario-based
  – Experience-based reasoning
  – Modeling/metrics
  – Simulation

Specifics of AMS
• Complexity
• Long development time
• Long term of deployment – technology ageing
• Safety-critical
• Mission-critical
• System-of-systems – interoperability

Our goals
• Identify quality factors, project and product characteristics
• Identify metrics for them
• Make assessments/predictions concerning project health and project risks

Our approach
[Diagram: quality factors are refined into quality criteria and then metrics, supported by guidelines and checklists over people, process and architecture attributes]

Quality factors vs. criteria
[Table: criteria (rows) against the factors Reliability, Maintainability, Safety and Security (columns); Organisational capability is marked against all four factors, Robustness and Testability against three each, Modularity against two, and Openness, Scalability and Modifiability against one each]

Criterion – Organisational capability
• Process maturity {high (level 4–5), nominal (3), low (1–2)}
• Skill retention {high (<5% turnover), medium, low (>20%)}
• Management {high (>7 years), low}
• Architecture quality
  – Team experience {high (>5 years), low (<3)}
  – Requirements quality {high, low}
  – Relevant standards {yes, no}

Criterion – Robustness
• Memory management
• Error propagation
• Cohesion
• Task management

Criterion – Testability
• Complexity
• Technology independence
• Interfaces
• Modularity
  – Coupling
  – Change propagation
  – Technology independence

Criterion – Scalability
• Interfaces
• Capacity margins
• Technology independence
• Hardware isolation from software
• Scheduling
• Architecture style

Criterion – Modularity
• Technology independence
• Change propagation
• Coupling

Criterion – Modifiability
• Capacity margins
• Technology independence
• Modularity
• Architecture style
• Views documentation
• Complexity

Criterion – Openness
• Interfaces
• Technology independence
• Capacity margins

Attribute capacity margins – checklist
• CPU capacity
  – Usage is below the defined maximum under worst-case tasking
  – A prototype has been built to demonstrate processor capacity
• Memory utilisation under worst-case tasking
• Interconnection margins
• Execution threads meet deadlines under worst-case load
• Database size/storage
  – Is the database large enough for pre-loaded flight information?
  – Can the database record everything that occurs during a mission?
• Number of tracks the system can support/process
• Data link capacity – type, loads
• Power margins – should be in excess of the current maximum demand
• Etc.

AMS_BN model connectivity
[Diagram: Bayesian network connecting Organisation_Capability, Architecture_Quality, Information_Assurance, Robustness, Security, FaultTolerance, Complexity, Modifiability, Modularity, Coupling, Change_Propagation, Skill_Retention, Management, Process_maturity, Experience, Quality_of_Requirements, Safety, Testability, Maintainability, Reliability, Capacity, HW_Isolation, Scheduling, Architecture_Style, Scalability, Technology_Independence, Error_Propagation, Cohesion, Task_Management, Interoperability, ViewsDoc, Interface, OpenArchitecture, Relevant_Standards, Fault_prevention, Memory_Management]

Conditional probability table for Architecture_Quality (node prior: High 0.7, Low 0.3):

Quality of Requirements | Experience | Arch_Quality High | Arch_Quality Low
---|---|---|---
High | High | 0.9 | 0.1
High | Low | 0.6 | 0.4
Low | High | 0.6 | 0.4
Low | Low | 0.1 | 0.9

Upper management and staff experience contribute 120 percent to productivity; effective methods/processes contribute only 35 percent. Staff inexperience: negative 177 percent; ineffective methods/processes: negative 41 percent.
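As an aside, the 73.8% marginal quoted for Architecture_Quality can be reproduced from the conditional probability table above together with the Experience (0.8/0.2) and Quality_of_Requirements (0.7/0.3) priors listed with the model. A minimal sketch of that enumeration in Python (variable and function names are mine, not from the model):

```python
# Sketch: computing the marginal P(Architecture_Quality = High) in the AMS_BN
# model by enumerating over its two parents. Priors and CPT entries are the
# values shown on the slides.

# Parent priors: P(node = High)
p_qor_high = 0.7   # Quality_of_Requirements
p_exp_high = 0.8   # Experience

# CPT: P(Architecture_Quality = High | Quality_of_Requirements, Experience)
cpt_aq_high = {
    ("High", "High"): 0.9,
    ("High", "Low"): 0.6,
    ("Low", "High"): 0.6,
    ("Low", "Low"): 0.1,
}

def marginal_aq_high():
    """Sum P(AQ=High | qor, exp) * P(qor) * P(exp) over all parent states."""
    total = 0.0
    for qor in ("High", "Low"):
        for exp in ("High", "Low"):
            p_qor = p_qor_high if qor == "High" else 1 - p_qor_high
            p_exp = p_exp_high if exp == "High" else 1 - p_exp_high
            total += cpt_aq_high[(qor, exp)] * p_qor * p_exp
    return total

print(round(marginal_aq_high(), 3))  # 0.738, i.e. the 73.8% reported for Architecture_Quality
```

The same enumeration, with evidence entered at the leaf nodes, is how a Bayesian belief network tool would update the dependability-related marginals of the full AMS_BN model.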
(Jones C., 2000)

Node marginals (%) computed by the model (the Process_maturity prior is based on CMMI 2006):
• Architecture_Quality: High 73.8, Low 26.2
• Organisation_Capability: High 70.0, Low 30.0
• Management: High 80.0, Low 20.0
• Process_maturity: Level 3–5 60.4, Level 1–2 39.6
• Experience: High 80.0, Low 20.0
• Quality_of_Requirements: High 70.0, Low 30.0

Bayes' theorem: since P(A,B) = P(B,A), we have P(A|B) = P(B|A)·P(A) / P(B)

To recapitulate
• We described a quality framework
• "Customized" this framework to AMS software architectures
• Selected a suitable evaluation method (Bayesian belief networks) for predicting dependability
• Described the topology of the model
• Further development

References
• Avizienis et al. 2004 – A. Avizienis, J.-C. Laprie, B. Randell, C. Landwehr, "Basic Concepts and Taxonomy of Dependable and Secure Computing", IEEE Transactions on Dependable and Secure Computing, Vol. 1, No. 1, pp. 11–33, Jan–Mar 2004
• Bosch 2001 – J. Bosch, presentation on Software Architecture Assessment, Summer School on Software Architecture, Turku Centre for Computer Science, Finland, August 2001
• CMMI 2006 – Capability Maturity Model® Integration (CMMI®) Version 1.2 Overview, SEI, CMU, 2006
• Fenton et al. 2007 – N. Fenton, M. Neil, D. Marquez, "Using Bayesian Networks to Predict Software Defects and Reliability", The 37th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN 2007), Edinburgh, 2007
• Gurp 2003 – J. van Gurp, "SAABNet: Managing Qualitative Knowledge in Software Architecture Assessment", in On the Design & Preservation of Software Systems, PhD thesis, pp. 73–88, 2003
• Jones 2000 – C. Jones, Software Assessments, Benchmarks and Best Practices, Addison-Wesley, 2000
• McCall 1994 – J. A. McCall, "Quality Factors", in Encyclopedia of Software Engineering, Vol. 2 (O–Z), John J. Marciniak (ed.), John Wiley & Sons, pp. 958–969, 1994

mime: application/vnd.ms-powerpoint
status: 200
url: http://www.defence.gov.au/dgta/Documents/DAVENG/Software%20Symposium%20documents/2008/Presentations/An%20Approach%20to%20Software%20Dependability%20Evaluation%20(Uzunov).ppt
urlkey: au,gov,defence)/dgta/documents/daveng/software%20symposium%20documents/2008/presentations/an%20approach%20to%20software%20dependability%20evaluation%20(uzunov).ppt
digest: IVJSPKEN5BFZTFJX3I7IITKTKP5OBPK2
length: 1110267
file_path: domains/defence-gov-au/powerpoints/original/au-gov-defence-dgta-documents-daveng-software-20symposium-20documents-2008-presentations-an-20approach-20to-20software-20dependability-20evaluation-20-uzunov-ppt-20120507040927.ppt
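The Organisational capability criterion earlier in the deck rates process maturity, skill retention and team experience against fixed thresholds. A minimal sketch of such a checklist rating, assuming the thresholds quoted on the slide (the function names and the treatment of the unspecified 3–5-year experience band are hypothetical):

```python
# Sketch of the "Organisational capability" checklist from the slides.
# The numeric thresholds come from the presentation; the rating functions
# themselves are an illustration, not the authors' implementation.

def rate_process_maturity(cmmi_level: int) -> str:
    """high = CMMI level 4-5, nominal = level 3, low = level 1-2."""
    if cmmi_level >= 4:
        return "high"
    if cmmi_level == 3:
        return "nominal"
    return "low"

def rate_skill_retention(turnover_pct: float) -> str:
    """high = <5% staff turnover, low = >20%, medium otherwise."""
    if turnover_pct < 5:
        return "high"
    if turnover_pct > 20:
        return "low"
    return "medium"

def rate_team_experience(years: float) -> str:
    """high = >5 years of relevant experience, low = <3 years."""
    if years > 5:
        return "high"
    if years < 3:
        return "low"
    return "medium"  # the slide leaves the 3-5 year band unspecified; treated as medium here

print(rate_process_maturity(3), rate_skill_retention(4.0), rate_team_experience(7))
# nominal high high
```

Ratings like these would supply the discrete states ("high", "nominal", "low") that the AMS_BN model's leaf nodes expect as evidence.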