February 3, 2005
 
Testimony of Mr. Gary P. Pulliam, Vice President, Civil and Commercial Operations, The Aerospace Corporation, Before the Senate Committee on Appropriations, Subcommittee on Commerce, Justice, State and the Judiciary

February 3, 2005 Mr. Chairman, distinguished committee members, and staff:

I am pleased to represent The Aerospace Corporation and appear before you today as you deliberate Trilogy and the Virtual Case File System.

As a private, nonprofit corporation, The Aerospace Corporation has provided engineering and scientific services to government organizations for over 40 years. We provide a stable, objective, expert source of analysis. We are focused on the government’s best interests, with no profit motive or predilection for any particular design or technical solution.

As its primary activity, Aerospace operates a Federally Funded Research and Development Center (FFRDC) sponsored by the Under Secretary of the Air Force, and managed by the Space and Missile Systems Center (SMC) in El Segundo, California. The Aerospace Corporation also undertakes projects for civil agencies that are in the national interest and are consistent with our corporate role. Over 350 staff members focus exclusively on computer systems, software, and information technology.

Our unique "trusted agent" role provided to the Air Force has become known throughout the Intelligence Community. In executing our FFRDC mission, and more specifically through our support to the National Reconnaissance Office, our technical core competencies have become known to the FBI.

1. Introduction
In 2001, the Federal Bureau of Investigation (FBI) began a major information technology upgrade commonly known as the Trilogy Program. The User Applications Component (UAC) is one of three basic elements of Trilogy. Organizations such as the Government Accountability Office, the Department of Justice Inspector General, and the National Research Council have voiced serious concern about the progress in completing Trilogy, and specifically the UAC. In response to these concerns, the FBI developed and implemented a “corrective action plan” in June 2004. As part of the corrective action plan, the FBI requested that The Aerospace Corporation (hereafter, Aerospace) conduct an independent verification and validation of the UAC; specifically, the Virtual Case File (VCF) Delivery 1. This testimony summarizes findings and recommendations from the independent verification and validation (IV&V) review of the VCF Delivery 1, conducted by Aerospace. This testimony is extracted from Aerospace Report No. ATR-2005(5154)-4, “Independent Verification and Validation of the Trilogy Virtual Case File, Delivery 1: Final Report,” delivered to the FBI on January 21, 2005. The FBI and the Vice President, Civil and Commercial Operations, of The Aerospace Corporation have approved release of this information.

The overall scope of the IV&V assessments of the VCF Delivery 1 included the system design, software design, overall security, and the maturity of the development contractor’s software development processes. Each assessment comprised reviews and analyses of pertinent documentation, source code, and process-related materials. In addition, the assessment of the maturity of the development contractor’s software development processes included a site visit (November 9, 2004) with interviews of key contractor personnel involved in the VCF Delivery 1. The assessments summarized in this testimony were conducted in the period August–December 2004.

It is important to clarify that this effort was not an IV&V in the traditional sense of verifying that all requirements have been satisfied, though requirement satisfaction was part of the assessment. Neither was it an independent program assessment that focused on the entire range of management, programmatic, contractual, and technical issues. Rather, Aerospace conducted a detailed engineering assessment of VCF Delivery 1 requirements and design documentation, source code, and artifacts to provide a recommendation to the FBI on discarding or remediating VCF Delivery 1 products.

Specifically, Aerospace was asked to address the following business questions:

Question 1. Did the incumbent contractor meet the stated requirements?
a. User Needs
b. System Requirements
c. Software Requirements

Question 2. Did the incumbent contractor develop a complete and correct Concept of Operations, System Architecture, and System Requirements?

Question 3. What should the FBI do with VCF Delivery 1?
a. Keep all of it?
b. Keep parts of it?
c. Discard it?

The remainder of the testimony is organized as follows: Section 2 describes the methodology used in assessing the system design associated with VCF Delivery 1, as well as the software design, security, and the maturity of the development contractor’s software development processes. Section 3 summarizes the findings made by the assessment teams in terms of topics whose state influences the answers to the three business questions. These topical groupings are (1) architecture, (2) requirements, (3) software quality, (4) performance, (5) security, and (6) contractor processes. More detailed finding statements are found in the Appendices. Section 4 presents conclusions formed by examining the findings across all six items of interest, as well as inferred findings based on possible observed trends; this section addresses Business Questions 1 and 2. Section 5 presents a framework for addressing Business Question 3 and a recommendation based on the framework. In addition, general recommendations are given based on Aerospace observations.

2. Approach
The IV&V review consisted of assessments of the UAC documentation and artifacts relating to system design, software design, security, and the maturity of the development contractor’s software development processes. In addition, the IV&V assessment of the maturity of contractor processes included a fact-finding trip to the contractor’s facility to conduct interviews and view additional materials. In general, the methods used were tailored versions of those employed by Aerospace in performing IV&V reviews of national security space systems. The specific approaches utilized by a given assessment team are summarized in the following sections.

Because IV&V is the process of verifying that requirements are satisfied and validating that user needs are met, and because Aerospace was limited primarily to documentation and artifacts, most of the assessment effort was spent examining the quality of, and traceability through, the documentation and artifacts. This is in keeping with an essential tenet of systems engineering: necessary conditions for a system to be successfully implemented are that (1) documentation and artifacts be complete, clear, concise, precise, and mutually consistent, and (2) requirements be properly decomposed with bi-directional tracing between successive levels of the system (e.g., user needs trace to system requirements, system requirements trace to subsystem requirements, and so forth through design, implementation, and test). Not only do these conditions increase the probability of successfully implementing a system, they are required for effective maintenance.

When possible, the assessment team used industry and government standards as benchmarks against which the program documentation and artifacts were measured. Although standards were not required on the VCF development contract, standards were used in the assessment because they encapsulate known best practices that should be used whether or not they are required of a contractor. The use of standards also eliminates a level of subjectivity from the assessment.

Given the scope and time constraints of the IV&V review, Aerospace focused on a sample of program documentation and other artifacts. Two notable exceptions were that (1) the group assessing the maturity of contractor software development processes conducted a 1-day site visit with the contractor to obtain answers to process questions and to view sample reports and artifacts, and (2) a limited number of Aerospace personnel attended a 1-day design review. In taking this overall approach, it is important to note:

• With the exception of the 1-day site visit and the 1-day design review, Aerospace did not have direct contact with the incumbent contractor to address comments on the documentation and potentially alleviate some concerns.
• With the exception of database performance testing, access was not provided to the tests that occurred or the results of those tests (hence, the review does not directly address how well VCF Delivery 1 satisfies the user requirements but does so by inference).

2.1 System Design Assessment
The system design assessment provided the system-level portion of the IV&V review. It was divided into two smaller assessment activities: an evaluation (i.e., cross-checking) of the system-level documentation and a system-level IV&V appraisal of VCF Delivery 1. The latter consisted of an examination of requirements traceability, requirements satisfaction, performance, and security.

2.1.1 System Level Documentation Assessment
To objectively assess the system-level documentation, Aerospace identified standards against which the documents could be compared. This section describes the ways these standards were used in the assessment.

The Concept of Operations (CONOPS) was reviewed and its content compared against the reference standard embodied in the Department of Defense (DOD) Data Item Description (DID) Operational Concept Description (OCD) [ ]. (The emerging guide for preparing CONOPS documents [ ] that is being created by the American Institute of Aeronautics and Astronautics (AIAA), in conjunction with the International Council on Systems Engineering (INCOSE), was also consulted for content and language.) In the review, particular attention was given to the CONOPS with respect to:

• The description of the current system (e.g., operational environment; major system components; interfaces to external systems or procedures; capabilities and functions of the current system; diagrams/charts depicting data flow and processes; quality attributes such as reliability, availability, maintainability, flexibility, extensibility; personnel; support concept for the current system). • The justification for and the nature of changes (e.g., description of the needed changes; priorities among the changes; changes considered but not included; assumptions and constraints). • The description of the new system. • Operational scenarios (e.g., the role of the system and interactions with users; events, actions, interactions, stimuli). • The new system’s operational and organizational impacts. • The analysis of the proposed system (e.g., summary of advantages; summary of disadvantages/limitations; alternatives and trade-offs considered).

The System Architecture Design Document (SADD) was reviewed and its content compared to the reference standard found in the DOD DID System/Subsystem Design Description (SSDD) [ ]. (The Institute of Electrical and Electronics Engineers (IEEE) Recommended Practice for Architectural Description of Software-Intensive Systems, IEEE Std 1471-2000 [ ], was consulted for content and language.) The SADD was examined with respect to its:

• Presentation of system-wide design decisions. Specifically, decisions regarding system behavior and the selection and design of components; inputs, outputs, and interfaces; actions the system would perform in response to inputs or conditions; description of physical systems; selected algorithms; how databases would appear to the user; approaches to meeting safety, security, and privacy requirements; design and construction choices. • Descriptions of the system architectural design (e.g., hardware configuration items, computer software configuration items, and manual operations; concept of execution; interface design; requirements traceability).

The System Requirements Specification (SRS) was also reviewed and compared against two applicable standards: DOD Military (MIL) Standard (STD) 498, Software Development and Documentation [ ], and DOD DID System/Subsystem Specification (SSS) [ ]. (The INCOSE Systems Engineering Handbook [ ] was consulted for content.) The SRS was assessed against the full breadth of possible requirements, to include:

• Definition of required states and modes • Internal and external interface requirements • Internal data requirements • Safety requirements • Environment requirements • Computer-related requirements (e.g., resources, hardware, resource utilization, software, computer communications) • Quality factors

In addition to performing reviews of the SADD, CONOPS, and SRS against particular standards, the system-level documents were assessed for their mutual consistency, completeness, and reasonableness.

2.1.2 VCF Delivery 1 Assessment

2.1.2.1 Requirement Traceability
Aerospace examined the completeness and consistency of user need statements and their maturation into system requirements. Aerospace extracted all system and software requirements from traceability tables found in the SRS and the SRD, and examined parent-child relationships between these documents. Comparisons were made of each system requirement statement within the body of the SRS to that found in the SRS traceability matrix. A similar comparison was made with software requirements in the SRD. Validation and verification were performed on subsets of the system-level requirements involving access control and workflow (these requirement areas were chosen, in consultation with the FBI, based on their importance to the UAC). Specifically, Aerospace identified 22 system-level access control requirements and assessed all of them. Of the more than 120 system-level workflow requirements identified, 52 were assessed. The 74 system-level access control and workflow requirement statements were assessed against the following quality attributes provided in The Engineering Design of Systems [ ]:

1. Clear and concise. The requirement has only one interpretation and does not contain more than it should. When clarity was in question, the UAC Requirements Terms and Definitions Document (RTDD) was used as the primary source for clarification.
2. In-scope. The requirement does not impose anything unnecessary on the system.
3. Design- and implementation-free. The requirement does not impose a design or implementation solution.
4. Verifiable. The requirement uses concrete terms and measurable quantities.
5. Free of TBD/TBR. The requirement does not contain placeholder statements or values.
6. Free of conflict or duplication. The requirement neither overlaps nor opposes another requirement.
7. Appropriate decomposition. The traced-to software requirements make sense and are complete.
8. Complete requirement set. There is no appearance of missing requirements related to the requirement being examined.

2.1.2.2 Requirements Satisfaction
Actual requirement satisfaction, as determined through a review of requirement testing results, was not considered because test results were not made available. For this reason, Aerospace relied on secondary indicators of requirement satisfaction. For example, the assessment of traceability of the CONOPS, SADD, and SRS was performed within the system-level requirement traceability activity (Section 2.1.2.1), while traceability of software requirements was examined in the software source code and traceability analyses (Sections 2.2.3 and 2.2.5). Other facets of requirement satisfaction were provided by other analyses.

2.1.2.3 Performance
Contractor test methodology and database performance test results, as found in the Interim Scaling and Performance Test Report, were examined to assess the performance of VCF Delivery 1. The goal of the database performance evaluation was to identify areas of high performance risk in the database schema and database Structured Query Language (SQL) query code. Network, application server, and web server performance were not examined.

In addition to examining the contractor test data, independent checks on database performance were conducted through the following means:

• Creation of an Entity Relationship diagram based on the contractor database Data Definition Language (DDL) code, from which further analysis of the database could be conducted.
• Examination of SQL code with respect to (1) system queries, especially with respect to the use of table joins in clauses, nested queries, outer joins, and cursors; (2) code complexity; (3) performance risk factors; and (4) identifying the SQL code critical path.
• Review of the database structure for signs of performance enhancement attributes (e.g., table partitioning, table splitting, denormalization, materialized views, and rollup tables).
• Review of the database indexing to determine if table indexes were selected for maximum SQL code performance.
• Analysis of the Virtual Private Database (VPD) implementation performance risks (i.e., looking at the where clause predicates that would be added to each and every SQL query).
• Evaluation of system scalability requirements through an extrapolation of reported test results.

2.2 Software Design Assessment
The software design assessment comprised six distinct analyses: software architecture, software requirements, source code traceability, source code documentation, requirements traceability, and security.

2.2.1 Software Architecture Analysis
The analysis began with a review of the CONOPS, SADD, SRD, Software Design Document (SDD), and accompanying component SDDs. In addition, IEEE Std 1471-2000 [4] was reviewed because it was referenced in the SADD. The software architecture was examined using an abbreviated form of the Architecture and Tradeoff Analysis Method (ATAM) developed by the Software Engineering Institute [ ]. Critical system and software requirements (known as quality attribute requirements in the ATAM) were identified in Exhibit 3-2 of the SADD, reviewed, and laid out to form a quality attribute tree, with specification down to the scenario level. (These system quality factors address scalability, extensibility, reliability, performance, security, and evolvability.) Software architectural approaches based on the high-priority quality factors were then iteratively elicited and analyzed, with risks, sensitivity points, and tradeoff points identified. Part of the iterative process included brainstorming and prioritizing the scenarios generated in the utility tree based on stakeholder needs (in this case, because access to the actual stakeholders was not possible, the prioritization was based on information in the document artifacts); in the second pass of the process, the scenarios were treated as test cases for the architecture analysis.

2.2.2 Software Requirements Analysis
Software requirements analysis was conducted on data access control and basic workflow requirements after a review of the SRD, SDD (and corresponding volumes), and thread design documents, and consultation with the FBI. The quality of these software requirements was evaluated against the following attributes found in IEEE Std 830-1998 [ ]:

1. Unambiguous and clear. The requirement has only one interpretation. The UAC Requirements Terms and Definitions Document (RTDD) was the primary source for clarification, followed by Webster’s Dictionary [ ].
2. Consistent. The requirements do not conflict, and requirements use the same terms to mean the same things.
3. Non-redundant. There are no superfluous requirements. Each requirement adds something new to the SRD.
4. Complete. Nothing is missing from the requirement. Each requirement defines a user type, employs the verb “shall” once, and specifies an end result. Most requirements should also have a performance or timing criterion.
5. Single requirement and concise. The requirement does not contain more than it should. The requirement has no superfluous detail and expresses only one need.
6. Design- and implementation-independent. The requirement does not prescribe any design or implementation solution.
7. Testable/verifiable. The requirement uses concrete terms and measurable quantities. Words like “good,” “well,” and “usually” signal that a requirement is not testable.
8. Complete requirement set. No requirements are missing. The set of requirements defines those actions the software will take given all possible types of input data when in all possible states.

Information and findings were shared with and by the system design assessment team to increase overall understanding of critical requirements.

2.2.3 Source Code Traceability Analysis
This section summarizes the combined processes of the source code traceability analysis and the software requirements traceability analysis (Section 2.2.5). Requirements in the areas of access control and basic workflow were identified and traced from the software requirements to threads and SDD volumes to the source code, using the SDD and corresponding volumes (e.g., Workflow Volume), thread design documents, Test Plan, and the RequisitePro® database. (The initial process of tracing from software requirements to threads was abandoned after the FBI notified Aerospace that the contractor had developed new documentation.) In conducting these traceability analyses, emphasis was placed on:

• Correctness (e.g., does the documented design and source code address the software requirements allocated to it?)
• Consistency (e.g., is the allocation of software to design and code consistent across the documentation and supporting requirements management tools; are allocations at the same level of detail?)
• Completeness (e.g., are all software requirements allocated to design elements and code; do the design elements clearly and concisely satisfy the allocated requirements given the design level of detail?)

Tracings were examined from software requirements through software design and code, and from software requirements to tests.

2.2.4 Source Code Documentation Analysis
Java source code complexity was determined for all modules. PL/SQL source code complexity was examined for modules related to security, basic workflow, administration, and case management software components. The complexity of modules written in Java was determined using McCabeQA®; ClearSQL® was used for modules written in PL/SQL. Module size, in terms of source lines of code (SLOC), was determined for the respective Java and PL/SQL modules because size is another indicator of complexity. Those modules with the greatest complexity, size, or relationship to other modules were then subjected to a peer review: 191 Java modules, from the functional areas of data access control, workflow, case management, administration, and components, out of 309 high-risk modules; all 667 PL/SQL modules related to the functional areas of workflow, security, administration, and case management, 98 of which were determined to be high risk; and 42 JSP modules in the functional areas of workflow, security, administration, and case management, based on size and relationship to other JSP modules. The underlying source code of the selected modules was compared to contractor documentation (SDD and corresponding volumes, thread design documents, Software Development Plan (SDP)), especially with respect to design and test. Documentation was examined for correctness, consistency, completeness, and suitability. The Java and PL/SQL peer reviews focused on data and control flow, traceability of modules from design documentation, correctness of comments, and other elements of coding practices as defined by the development contractor’s coding standards expressed in the SDP.

2.2.5 Requirements Traceability Analysis
The activities of the source code traceability analysis (Section 2.2.3) and the software requirements traceability analysis were tightly coupled. For that reason, the process description and status of the two analyses are combined and reported in Section 2.2.3 above.

2.3 Security Assessment
The security assessment was based on the DOD Information Technology System Certification and Accreditation Process (DITSCAP) [ , ] and the National Information Assurance Certification and Accreditation Process (NIACAP) [ ]. Project documentation reviewed as part of the assessment included the SRS, SRD, SADD, CONOPS, Security CONOPS, SDD, Security Volume, Admin Volume, Security Architecture, Security Plan and associated support package, Privileged Users Guide, and Certification and Accreditation Methodology. Security-related requirements were identified from the available documentation: the SRS, SRD, and the Security Volume of the SDD. The design of the system was then examined with respect to this subset of requirements to determine the completeness and accuracy of the system design against this requirement set. Certification and accreditation material contained in the System Security Plan and System Security Plan Support Package was also reviewed to determine its suitability and completeness with respect to what Aerospace experience has shown is necessary for such an activity.

2.4 Software Development Maturity Assessment
The software development maturity assessment was conducted using the same processes Aerospace employs for national security space systems, but tailored to meet the time constraints of this project. A questionnaire was developed, based on the U.S. Air Force Software Development Capability Evaluation (SDCE) [ ], that addresses risks, key requirements, and five areas of specific interest:

• Systems engineering (e.g., system requirements development, management and control)
• Software engineering (e.g., software requirements management, software design, software coding and unit testing, software integration and test)
• Quality management and product control (e.g., quality management, quality assurance, defect control, peer review, software configuration management)
• Organizational resources and program support (e.g., organizational process management)
• Program-specific technologies (e.g., database management, COTS, trusted systems)

Answers to some questions were found in a review of the available documentation: SRS; Master Plan; Configuration Management (CM), Risk Management (RM), and Quality Assurance (QA) Plans; Software Development Plan (SDP); Master Test Plan and Delivery 1 Test Plan; and System Security Architecture. Questions that could not be answered from the documentation, or for which additional information was needed, were presented to the FBI and the contractor in preparation for an on-site fact-finding visit. At the time of the fact-finding visit (November 9, 2004), Aerospace interviewed selected members of the contractor staff according to areas identified in the questionnaire. The current Deputy Program Director, who was the VCF Delivery 1 Program Manager, provided interviewees with the questionnaire and scheduled interviews with most of the VCF Delivery 1 managers. The Program Manager accessed the IBM Rational® ClearCase®, ClearQuest®, and TestManager® files during the interview sessions. Time constraints did not permit an in-depth review of the files, but sample reports were printed, and examples of parts of the Software Development Folders (SDFs) were reviewed. Prior to the visit, Aerospace requested that the following documents and artifacts be available for review: SDFs, the System Engineering Master Plan, documentation from preliminary and critical design reviews, deficiency report databases or spreadsheets, Rational Rose® artifacts, metrics plans and reports, peer review reports, and quality assurance reports. All requested items were made available and reviewed, with the following exceptions:

• The System Engineering Master Plan was not provided. The review team elected not to review it because it was not part of the development contract baseline.
• Rational Rose artifacts were not reviewed. The review team focused on the SDFs because coding was accomplished based on the SDF contents.
• No system-level preliminary or critical design review materials were reviewed because these events were not conducted. Materials from In-Progress Reviews (IPRs) and the System Requirements Review were reviewed.

3. Topical Findings
The results of the Aerospace IV&V are grouped into six topic areas:

• Architectures (e.g., enterprise-, system-, and software-level architectures) • Requirements (e.g., concept analysis, system analysis, requirement analysis, requirement quality, traceability) • Software quality (e.g., software functionality, structure, testing, documentation, thread methodology, database software) • Performance (e.g., overall system performance of the database) • Security (e.g., certification and accreditation, system security administration, security requirements definition, security design documentation) • Contractor processes (e.g., processes defined by the contractor that were or were not followed, processes that worked or did not work)

Findings in each area are summarized in the following sections. Each summary lists strengths and weaknesses, provides a high-level summary of the most important strengths and weaknesses (individually or in groups) and their implications, and gives an overall appraisal of the topic area. Conclusions based on the findings are summarized in Section 4.

With the exception of the software development maturity assessment, all of the assessments were based strictly on documentation and artifacts delivered to Aerospace. This has two consequences. The first is that such a review usually notes more weaknesses than strengths: if there is sufficient ambiguity or uncertainty about what is intended in a document, a negative finding is generated, even if a short conversation with the contractor could have resolved the issue. The perceived state of what is being evaluated can therefore be more negative than the actual state warrants. Aerospace did three things to reduce both the likelihood of this happening and the associated impact. First, a fact-finding visit was made to the development contractor’s facility to resolve questions about their software development processes. Second, industry and government standards were used to provide objective measures of quality and practices. Lastly, Aerospace looked at both documentation and product (i.e., source code) for possible strengths or weaknesses in each area.

The second consequence of basing the IV&V review largely on documentation is that it tests the ability to transfer the existing document set from the development contractor to a replacement contractor. In this instance, many weaknesses could indicate either that there are significant problems with the documentation or that the concepts being developed are not clearly stated. In either case, it would be very unlikely that a replacement contractor could pick up where the original left off, thereby closing the door on a possible acquisition or maintenance strategy.

3.1 Architecture
This section summarizes the strengths, weaknesses, and Aerospace assessment of the architecture.

3.1.1 Strengths
The incumbent contractor specified a standard three-tiered Web-based design pattern for the VCF architecture. A well-designed and implemented system of this type should be highly flexible, extensible, and scalable, and should easily integrate new functionality. The theoretical strength of the approach is that it is highly componentized, generic, and built on open standards.

3.1.2 Weaknesses
Though the fundamental strength of the architecture lies in its classic three-tier model, the fundamental weakness is the failure to actually implement the system according to the specified architectural concept. As a result, the ability to maintain the system, change components (e.g., COTS, GOTS), reuse software, or add new functionality is at risk. Maintainability and reuse are negatively impacted by the tightly coupled, threaded design. Performance and scalability are likely to be limited by the decision to implement VCF in a centralized rather than a distributed fashion. Furthermore, it is possible that certain types of distributed architectures would provide greater reliability through redundancy.

Though maximizing the use of COTS was a stated goal of the VCF program, Aerospace found limited use of COTS application products, and a design approach whereby functionality available in COTS was rejected and then reimplemented in VCF custom code. In addition, no non-Oracle COTS search and analysis tools were found to be acceptable, because none were compatible with the Virtual Private Database and its associated access controls.

The use of most COTS software is precluded by the decision to implement security and access controls at the data level. The VCF system uses two types of access controls: functional access controls and data access controls. Functional access controls are implemented primarily in application code written in PL/SQL within the data tier. Data access controls are implemented using the VPD. Because all of the access control mechanisms are enforced by the database, they cannot be utilized by external applications. This is a fundamental limitation of the VCF architecture: virtually all functionality available in COTS that requires access control (including document management, workflow, tasking, and delegation) must instead be implemented by developers in custom VCF application code. This limitation extends to highly capable COTS search and analysis applications, including link analysis and specialized applications used elsewhere in the law enforcement and intelligence communities.
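
To illustrate the kind of coupling described above, the following minimal sketch shows a functional access check enforced by PL/SQL inside the data tier; the table, column, role, and routine names are hypothetical and are not taken from the VCF code. Because the rule lives inside a database routine, an external COTS product that bypasses this routine is not governed by the check, so equivalent functionality must be rebuilt in custom code.

-- Hypothetical sketch: a functional access check buried in the data tier.
CREATE OR REPLACE FUNCTION get_case_title (
  p_case_id IN NUMBER,
  p_user_id IN NUMBER
) RETURN VARCHAR2
IS
  v_allowed NUMBER;
  v_title   VARCHAR2(200);
BEGIN
  -- Functional access control: verify the caller holds a role permitted
  -- to view case data before returning anything.
  SELECT COUNT(*)
    INTO v_allowed
    FROM user_roles
   WHERE user_id   = p_user_id
     AND role_name = 'CASE_VIEWER';

  IF v_allowed = 0 THEN
    RAISE_APPLICATION_ERROR(-20001, 'User is not authorized to view case data');
  END IF;

  SELECT title
    INTO v_title
    FROM cases
   WHERE case_id = p_case_id;

  RETURN v_title;
END get_case_title;
/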

The manner in which the access controls were implemented in the VPD feature of the Oracle database also imposes significant and unacceptable performance delays. While most implementations use only some of the controls available in VPD and apply them to a restricted subset of database tables, this implementation uses all of the control mechanisms and applies them to the tables that participate in virtually every join operation required to answer a normal database query, resulting in significant performance degradation.
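
For context, the sketch below illustrates the VPD mechanism at issue; the schema, table, context, and function names are hypothetical. A policy function returns a predicate string, and once the policy is registered the database silently appends that predicate to every statement against the protected table. When, as described above, the protected tables participate in virtually every join, every query carries this added cost.

-- Hypothetical VPD policy function: returns the predicate Oracle appends
-- to statements against the protected table.
CREATE OR REPLACE FUNCTION case_access_predicate (
  p_schema IN VARCHAR2,
  p_table  IN VARCHAR2
) RETURN VARCHAR2
IS
BEGIN
  -- Restrict rows to those the session user is cleared to see.
  RETURN 'case_id IN (SELECT case_id FROM case_acl '
      || 'WHERE user_id = SYS_CONTEXT(''VCF_CTX'', ''USER_ID''))';
END case_access_predicate;
/

-- Register the policy. Afterward, "SELECT * FROM cases" behaves as
-- "SELECT * FROM cases WHERE case_id IN (SELECT ...)" on every execution.
BEGIN
  DBMS_RLS.ADD_POLICY(
    object_schema   => 'VCF',
    object_name     => 'CASES',
    policy_name     => 'CASE_ACCESS',
    function_schema => 'VCF',
    policy_function => 'CASE_ACCESS_PREDICATE',
    statement_types => 'SELECT,UPDATE,DELETE');
END;
/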

Remediation of these weaknesses would require a complete reevaluation of the approach to security access control.

Lastly, the software architecture documentation does not conform to the best practices identified in IEEE Std 1471-2000. For example, stakeholder concerns are not directly mapped to the software architectural responses, there is no viewpoint specification for the software architecture description, a specific methodology is not identified to represent architectural views, and known inconsistencies among architectural description elements are not noted. Failing to adhere to best practices can impact functionality, timeliness, and schedule throughout the development cycle.

3.1.3 Appraisal
Decisions on architecture and the accompanying high-level design are fundamentally important, yet critical architecture goals have not been met. There was a failure to appropriately assess the use of COTS products. It appears that inadequate attention was given to the performance requirements in relation to the choice of the Virtual Private Database and the associated Access Control List (ACL) table to implement the discretionary access control requirements. Analysis aimed at determining the objects to be protected with discretionary access controls, and the methods of protecting those objects, might have resulted in alternative design choices with more attractive performance characteristics. Likewise, by allowing the original three-tier architecture to collapse to two tiers (thus failing to adhere strictly to the Web-based design pattern), the architectural tenet of separation of concerns was violated. Consequently, future technology insertion is at risk, and maintenance and reuse of the VCF software will be more difficult.

3.2 Requirements
This section summarizes the strengths, weaknesses, and Aerospace assessment of system and software requirements (development, analysis, and documentation).

3.2.1 Strengths
All of the system-level requirements examined were found to be within scope. There is no evidence of unnecessary features that could constrain design and increase cost.

The System Requirements Specification (SRS) did not contain TBD (to be determined) or TBR (to be reviewed) markings. This is generally a positive indicator for systems that have progressed from the conceptual phase to the development phase, since a lack of TBDs and TBRs usually means that the requirements baseline is stable. However, the lack of TBDs and TBRs provides no assurance that requirements are not missing. Friedman and Sage [ ] have pointed out that the lack of TBDs can indicate that requirements have been suppressed or ignored, thus creating what they call “silent specs.”

Design and implementation details were not found in either the system-level or software requirements, indicating that in this respect the developer adhered to expected system and software development practices.

None of the examined data access control and basic workflow software requirements duplicated another. Avoiding duplicate requirements eliminates needless requirements analysis and redundancies in development and testing.

3.2.2 Weaknesses
The CONOPS is incomplete in that it lacks summaries of advantages, disadvantages, limitations, and alternatives and tradeoffs considered. It fails to show through analysis that the Information Presentation and Transportation Network Components provide the necessary infrastructure to meet UAC requirements.

The CONOPS does not agree with the SRS, resulting in concepts that are not articulated as requirements in the SRS and requirements that do not correspond to operational concepts. The expected relationship between the CONOPS and the SRS is that the CONOPS should contain statements of operational activities; the SRS should specify system functions through functional requirements. A relationship should exist between the operational activities and the system functions. Contrast this relationship with that between the UAC CONOPS and the UAC SRS: the relationship between operational activities and system functions is missing; there is little correspondence between the statements made in the UAC CONOPS and the UAC SRS functional requirements.

Neither the SRS nor the SRD addresses all of the requirements expected in a specification. Failure to address the full range of applicable requirements can result in a system that is implemented in such a way as to be unacceptable to the user or other stakeholders. Incorporating the additional sections at this point in the life cycle would require a major effort: the design documents would have to be rewritten, the source code changed to accommodate the resulting design changes, and additional integration and test performed.

The System Architecture Design Document (SADD) is incomplete relative to expectations. Although the SADD lists architecture constraints and goals, it does not describe how the architecture meets them. The SADD includes neither decisions nor rationale for the external interfaces, scalability, extensibility, maintainability, and other items important to the architecture. The incomplete description of the system design could lead to unspecified and untraceable software requirements, which, in turn, leads to a system that does not meet users’ needs.

Inconsistencies exist between the Interface Definition Document (IDD), the Interface Control Documents (ICDs), and the SRS. For example, not all ICDs are referenced in the IDD, and some external systems noted in the SRS do not have a corresponding ICD. Although the SRS identifies external systems that currently interface with ACS and the types of interface to be supported by VCF to ensure legacy support, there are no requirements in the SRS that indicate the VCF must ensure such support. The IDD itself contains only seven requirements (“shall” statements), six of which relate to the frequency of interface execution. Inadequate interface definition puts at risk the ability of VCF Delivery 1 to operate with legacy systems.

In addition to reviewing the requirements-related documentation for inclusion of information typically expected in the documents, a quality review of the system-level and software requirements was conducted. Quality deficiencies include problems such as compound requirements, conflicting requirements, ambiguous and undefined terms, use of “and/or” in system requirements, use of “et cetera” in system requirements, use of unverifiable words in system requirements, lack of specified user category in software requirements, lack of response time constraints in software requirements, and redundant system requirements. These quality deficiencies may result in a system implementation that does not meet the expectations of users and other stakeholders. Specific problems and examples are provided in the finding summaries.

The SRS did not completely cover requirements. Gaps in expected system level functionality were found. Additionally, the Requirements Terms and Definitions Document (RTDD) contained implied requirements. Placing implied requirements within the RTDD does not ensure that the expected functionality will be implemented.

Traceability was assessed on various levels and from various perspectives. The expected relationship between need statements, system requirements, and requirements for lower-level elements (e.g., hardware and software) is that there is a strict downward flow of requirements. Every need statement maps to a system requirement and every system requirement maps to a need statement. Thus, there are neither “childless” need statements nor “orphan” system requirements. The process continues in a like manner for the system requirements and lower-level element requirements. Contrast this with what is observed in the UAC need statements and requirement documents: need statements do not flow exclusively to system requirements, and in many cases they bypass the system requirements completely. The lack of traceability from need statements to system requirements could result in a system design that does not meet user needs and may implement features that are not required.

Traceability was also assessed in the SRS review by conducting a decomposition analysis on a set of requirements from the SRS to the software level. The analysis identified problems such as incomplete decompositions and decompositions that were more restrictive than the system level requirement. Additional traceability analyses assessed the mapping of system and software requirements to the traceability matrix; errors were found in the trace. The mapping of business rules to software requirements was also incomplete. Here again, the lack of traceability from system requirements through design means that the design may not meet requirements and may implement features that are not required.

Finally, analyses were conducted of traceability from software requirements to software design and source code, and from software requirements to tests. There were three sets of artifacts that provided traceability between the software requirements and the design: the RequisitePro database, the thread documents, and the SDD volumes. The RequisitePro database traced to the name of a thread, which was associated with the corresponding portion of the SDD volume for basic workflow and for data access control. The RequisitePro database was consistent with the traceability in the SDD volumes (with one exception), but not consistent with the traceability in the thread documents. It is Aerospace’s understanding that the thread documents were the original design documents and that the SDD volumes reflected the as-built software. All but one of the software traceability findings deal with the SDD volumes as opposed to the thread documents.

There was poor traceability from the software requirements for basic workflow (BW) and data access control (DAC) through the design to the software components. A spot-check analysis of the workflow code shows that some software requirements do not appear to be covered in the code itself. There are some DAC software requirements that are inconsistently traced between the RequisitePro database and the Security Volume of the Software Design Document. Lack of adequate requirement traceability into software design and code results in risk that the software will not meet its stated requirements and greater difficulty of modifying software when requirement changes occur.

There were several BW software requirements that were not assigned to tests in the Delivery 1 Test Plan. Without these requirements being validated in assigned tests, there is no certainty that the users’ requirements are completely met.

3.2.3 Appraisal
The requirements, analysis, and documentation associated with the UAC and VCF Delivery 1 contain significant information deficiencies that must be corrected to ensure an adequate system definition and development process; a majority of the system and software requirements examined contain quality deficiencies; and the requirements decomposition and traceability chain, from the SRS to the SRD to software design components to source code to test documents, is weak because of missing information and inaccuracies. Extrapolating these observations to the entirety of the requirements, analysis, and documentation leads to serious concerns about the maintainability and reusability of VCF Delivery 1. Remediation could be very time-consuming and, because of the traceability concerns, may not ensure that all problems would be addressed.

3.3 Software Quality
This section summarizes the strengths, weaknesses, and Aerospace assessment of the software, to include database software.

3.3.1 Strengths
VCF Java Package Standards, set forth in Appendix A of the Software Development Plan, were followed. In particular, the Struts framework was followed, compelling developers to follow a Model View Controller design. This was significant because developers familiar with Model View Controller design and the Struts framework should be able to understand the flow of information in the source code and maintain the presentation layer of software with little difficulty. Also, the software components described in the Workflow Volume of the SDD were almost all found in the code. This is important for maintenance purposes.

3.3.2 Weaknesses
Commenting standards were not consistently followed in the source code files that were analyzed. For example, very few functions or files included a change history, and there were no references to the design documents associated with each class. This inconsistency indicates that the coding standards listed in the Software Development Plan were not always followed. Not following a formal software development process for such a large system implies a lack of a disciplined approach, a lack of coordination among developers, and a lack of standards enforcement. The result of inconsistent comments is that the burden of source code maintenance increases, because programmers are forced to search through the documentation every time the code needs changing or when checking for possible side effects of changes to different classes. This weakness can be corrected only by going through all the source code files and writing the needed comments. There are also comments in the code that mention work that remains to be done. This means that either the code is incomplete or that misleading comments were never removed from completed code. This code should be examined in detail and compared to the design to determine its status, and it should be tested to ensure it runs without errors. Then these comments should be removed to eliminate confusion.

Some Java classes have modules with incomplete code and unused code. This code cannot be validated because its purpose is unknown. Such code can affect the safety of the system by performing unexpected and unplanned operations. If the code is not fully validated, then the proper operation of the system cannot be assured.

Some Java code uses constants inconsistently: some values are hard coded, while others are defined as constants or stored in database files. The preferred method is to use constants and database files so that any future change can be made to the constant or to the database, thereby ensuring the completeness of the change. Hard coding requires that changes be made to every instance, and some may be missed.

Discrepancies were found between the thread design documents and the Software Design Document volumes for data access control and basic workflow. In addition, there were cases where more detailed design was found in the thread documents than in the design documents. Inconsistent design documentation is confusing to anyone trying to understand, maintain, or modify the software.

The design documentation reviewed does not bridge the gap between the Software Design Document volumes and the source code. Missing design information included the relationships between the software components, class parameter details, full definitions of class interfaces, and details on the purpose and logic of each function. The existence of the code is listed in the high-level design, but not the code behavior and interactions, which should be reflected in the lower-level, detailed design. Examples of missing design details include: (1) the PL/SQL code for workflow contained a total of 111 modules, of which 57 were not mentioned in the SDD; and (2) the Java files listed in the SDD volumes did not match the source code provided to Aerospace. The lack of a detailed design document that includes all source code modules makes maintenance and modifications to the source code more difficult and time-consuming, and would subsequently drive up the cost of any future changes to the system.

The PL/SQL code has timing and design issues. With regard to timing, each module writes a character string to the debug log (in one module printing is initiated through the use of a debugging switch; in all other cases the printing is hard coded). This has a negative impact on code performance because it increases execution time; this practice would be tolerable only during prototype development. As to design, the PL/SQL code uses literals rather than symbolic constant variables in the arguments of “IF” and “WHERE” statements. This code would be nearly impossible to maintain by anyone other than the programmer who developed it because there are no references to design documents that define literals, such as the integer-type values. This is an example of not following coding standards, or of not enforcing them. It would take time to fix this code properly by replacing the literals with symbolic constant variables so that it could be understood by anyone other than the original programmer.
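
A minimal, hypothetical fragment may help illustrate the two practices described above; the procedure, table, column names, and numeric values are illustrative assumptions, not drawn from the VCF code. The commented lines show the pattern criticized above (bare literals and an unconditional debug write), and the procedure shows an alternative that names the constants and guards the debug output with a switch.

-- Pattern described above (illustrative only): bare literals whose meaning
-- lives only in external design documents, plus an unconditional debug write.
--   IF v_status = 3 THEN
--     UPDATE case_task SET state = 7 WHERE task_id = p_task_id;
--   END IF;
--   DBMS_OUTPUT.PUT_LINE('approve_task entered');

-- Alternative sketch: named constants and a guarded debug call.
CREATE OR REPLACE PROCEDURE approve_task (
  p_task_id IN NUMBER,
  p_status  IN NUMBER
)
IS
  c_status_pending CONSTANT NUMBER  := 3;     -- meaning documented in the code
  c_state_approved CONSTANT NUMBER  := 7;
  c_debug_enabled  CONSTANT BOOLEAN := FALSE; -- debug switch, off in production
BEGIN
  IF p_status = c_status_pending THEN
    UPDATE case_task
       SET state = c_state_approved
     WHERE task_id = p_task_id;
  END IF;

  IF c_debug_enabled THEN
    DBMS_OUTPUT.PUT_LINE('approve_task entered');
  END IF;
END approve_task;
/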

3.3.3 Appraisal
The source code appears to have been produced without adherence to the procedures and standards stipulated in the Software Development Plan. The source code examined is not maintainable with its current documentation. Without reverse-engineering the missing documentation and conducting thorough testing, the code should not be used for any operational system. To reverse-engineer this system and bring it up to the level needed for proper maintenance and support would cost about one-quarter to one-half of the original cost of development. In most systems, about 15% of the cost is attributable to documentation across all development stages, and testing, including its documentation, typically accounts for about 40%. Approximately half of the documentation would need to be completed, and the remaining testing is expected to run between half and all of the testing cost of a typical system, depending on the problems found during testing.

3.4 Performance
This section summarizes the strengths, weaknesses, and Aerospace assessment of system and database performance.

3.4.1 Strengths
None identified.

3.4.2 Weaknesses
The VCF system that was tested by the contractor was a development version (VCF Delivery 1), which does not implement all requirements and is inadequate for operations. The system did not implement a number of features, such as the Virtual Private Database (VPD) or the full production scale of hundreds of millions of rows in the database. The measurements (as documented in the Interim Scaling and Performance Test Report) present only some CPU utilizations and end-to-end response times from/to the web server. Actual disk, bus, and network performance figures were not included in the performance report provided to us and are presumed not to have been tested.

The reported system performance and its performance analysis approach are at best marginal. Only the least-complex transactions were reported, and a number of those did not meet requirements even for the scaled-down database without VPD. A fully populated production VCF system based on VCF Delivery 1 would not meet requirements. In some cases the response time would be hundreds of percent longer than is required, and in the worst cases thousands of percent longer. Such long response times are essentially nonresponsive.

The VCF database has the attributes of a logical database model: large numbers of tables, a lack of denormalization, subtype entities modeled directly as physical tables, and other logical data model features. Logical models are rarely performance-optimal. The typical database objects available for performance optimization (e.g., performance-based index selection, materialized views, table partitioning) are absent from the VCF database. At the current estimated row counts, the database will require heavy optimization in order to scale properly; the developers had not done this.
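
For illustration, the sketch below shows, with hypothetical table and object names, the kinds of physical-design structures named above (performance-based indexes, table partitioning, and a materialized rollup view) that production-scale Oracle schemas typically carry and that were absent from the VCF database.

-- Index chosen for a known high-volume access path (hypothetical names).
CREATE INDEX ix_case_document_case ON case_document (case_id, doc_type);

-- Range partitioning so a very large table can be loaded, scanned, and
-- maintained one date range at a time.
CREATE TABLE case_event (
  event_id   NUMBER         NOT NULL,
  case_id    NUMBER         NOT NULL,
  event_date DATE           NOT NULL,
  summary    VARCHAR2(4000)
)
PARTITION BY RANGE (event_date) (
  PARTITION p_2004   VALUES LESS THAN (TO_DATE('2005-01-01', 'YYYY-MM-DD')),
  PARTITION p_future VALUES LESS THAN (MAXVALUE)
);

-- Materialized rollup view so common aggregates are not recomputed per query.
CREATE MATERIALIZED VIEW mv_case_event_counts
  BUILD IMMEDIATE
  REFRESH ON DEMAND
AS
  SELECT case_id, COUNT(*) AS event_count
    FROM case_event
   GROUP BY case_id;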

The database load estimates were created using historical ACS usage. While using historical ACS usage was a good starting point, a more thorough analysis of the probable usage of VCF should have been performed before translating these estimates into a testing protocol.

The database SQL code is not performance-optimized. The SQL code throughout the system uses many of the constructs that are specifically noted in the database manufacturer documentation [ , ] as performance risks. Compounding the problem is the use of the Oracle VPD feature for database security. The VCF implementation of this feature causes additional poorly performing SQL to be appended to each and every SQL statement in the database.
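
As a generic example of such a construct (the query and table names are hypothetical, not taken from the VCF code), wrapping a function around a filtered column in the WHERE clause prevents an ordinary index on that column from being used and forces a full table scan; comparing the stored value directly, or creating a function-based index, avoids the penalty.

-- Performance risk: the function applied to the indexed column defeats an
-- ordinary index on case_number and forces a full table scan.
SELECT case_id, title
  FROM cases
 WHERE UPPER(case_number) = '2004-HQ-0001';

-- Lower risk: compare the stored value directly so the index can be used
-- (or, if case-insensitive matching is required, create a function-based
-- index on UPPER(case_number)).
SELECT case_id, title
  FROM cases
 WHERE case_number = '2004-HQ-0001';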

The executed performance tests were flawed in two ways. First, the contractor did not isolate the database when CPU utilization was tested. Aerospace was unable to conclude whether the database CPU was underutilized because it had no difficulty servicing requests or because it was waiting on another dependent system. The database CPU could also have been waiting on internal database hardware, such as bus data transfers. The second flaw is that the test database was loaded with substantially lower row counts than what is estimated to be needed, even for the ACS database migration.

An enterprise system such as the VCF needs to be managed by a Network Operations Center (NOC). The NOC would include a Network Management System (NMS), archiving, availability monitoring, and other system and operational functions.

The NMS would include functions such as a help desk, a trouble ticket system, a network management console, and network management agents on the managed workstations and servers. The NMS would use protocols such as SNMP (Simple Network Management Protocol), RMON (Remote Monitoring), and others. The lack of a requirement for an NMS would result in a system that cannot be operated in a production environment, especially after it is fully scaled to global production size.

Any production system requires routine archiving. Most systems have an incremental or even a full backup daily, and a full backup at least weekly. Without archiving, work could be lost, evidence misplaced or destroyed, and investigations could lose their integrity. The lack of a requirement for archiving would result in a system that cannot be operated in a production environment, especially after it is fully scaled to global production size.

Any production system must meet availability requirements commensurate with its mission. A system that is unavailable could interrupt an investigation through lack of access to investigation data, or through the inability to record new information that is crucial to progress in the current investigation and in other investigations that depend on the new evidence collected. In addition, investigative resources would be lost while the system is down. The lack of a requirement for system availability would result in a system that cannot be operated in a production environment, especially after it is fully scaled to global production size.

3.4.3 Appraisal
The system falls short of meeting requirements as tested. In addition, the scaled-up system, with the VPD running, is highly unlikely to meet requirements, particularly for the type of complex queries needed by VCF. Even simple queries of the kind tested by the incumbent contractor would run hundreds of percent slower, and the situation would be far worse for complex queries running on the scaled-up system. The system would fall short of requirements with extremely long response times, thousands of percent longer than is required. Such long response times are essentially nonresponsive.

The database has many characteristics of a database still in development: a physical implementation of the logical database model that will undergo significant modification well before production, and SQL coding statements structured in a way that is logically sound and easily understood, but not optimized for performance. Developers typically develop code in this manner, expecting that time will be allocated to performance optimization once the code is functionally correct. Code modifications are also easier before optimization.

The database hardware selection appears adequate for the raw amounts of data that must be processed, but the database subsystem requires a realistic test with all features active, especially the VPD security and a full ACS migration data load. The production hardware and COTS software (i.e., Oracle database, Sun server, and the Hitachi Storage Area Network (SAN)) are technically capable products. However, the current VCF database schema and SQL code implementation do not contain the performance enhancements that would allow the hardware and COTS database server to perform optimally.

3.5 Security
This section summarizes the strengths, weaknesses, and Aerospace assessment of security.

3.5.1 Strengths
It appears that planning for system security was done at a high level early in the program. Such planning increases the likelihood that required security features (e.g., access control, audit) will be addressed in the requirements and design, which, in turn, provides a cost-effective path to certification and accreditation. Select areas of the system not generally covered in initial system security reviews (e.g., infrastructure devices such as routers and switches that nonetheless contain functionality that must be addressed from a security perspective) were addressed in some detail.

The system design provides for a limited interface controlled by the VCF application and infrastructure (for non-administrative users to interact with the VCF). This approach prevents exposure to security vulnerabilities that may exist in the interfaces provided by underlying products (not visible at the user interface), such as the command line for an underlying operating system.

At a high level, these strengths point to an approach that, if followed, would produce an accreditable system.

3.5.2 Weaknesses
Several weaknesses were discovered that create a significant risk that the system will not be accreditable.

Broad areas of security requirements were neither well-defined nor correctly decomposed to lower-level requirements. Although the coverage area of the lower-tier requirements was the same as that in the higher-level documents, the lower-tier requirements did not provide the necessary detail to implement and test the system in support of the certification and accreditation effort. Furthermore, the documents identified as the primary means for the certification and accreditation effort (the System Security Plan and the System Security Plan Support Package) did not map to the requirements specified for the system. This failure to identify the requirements to which the system would be accredited greatly increases the risk that the system would not receive accreditation, even if built to the requirements specified.

Weaknesses were also found in design and implementation. The Privileged User Guide should contain the information needed to manage and configure the system in a secure manner. However, there are many sections marked TBD, as well as sections that do not provide the detailed procedures required to perform critical configuration steps (e.g., specific configuration instructions for the boundary devices so that fundamental assumptions noted in higher-level documents can be achieved). Some of the detail provided in this guide also appears to have been copied from other sources and not modified for application to the VCF system. Without specific configuration information, the trustworthiness of the system cannot be assessed and the system will not be accreditable. Furthermore, if the security features that are needed do not exist, or do not support all of the capability being depended upon by the architecture, then significant schedule and dollar costs will be incurred.

The design documentation for the audit subsystem does not describe how the audit requirements are being met, especially in the area of management of the audit trail. While the Privileged User Guide contains COTS audit configuration steps, there is no discussion concerning how the VCF audit is managed, and how the VCF audit can be integrated with the audit trails produced by the COTS products to provide a coherent audit trail.

3.5.3 Appraisal
At a high level, the system security description appears to be a good start in describing the functionality necessary to build an accreditable system. However, in specifying and designing the system to meet that functionality, there appear to be significant shortfalls. Select requirements specifying the functionality are imprecise and incorrectly decomposed. The designs of the critical identification and authentication and audit subsystems do not implement a significant portion of the requirements for those subsystems. The documents supporting the certification and accreditation of the system and the security configuration are not complete.

While all of these issues can be remedied, at this point in the product lifecycle there is a high risk that the system implementation will not meet the security requirements, and that significant additional costs (both to the schedule and in dollars spent) will be incurred in trying to address the issues identified. There is a high likelihood that the system as it currently stands will not be able to be accredited without significant additional effort.

 
 