
Testing for Service-Oriented Architecture (SOA) combines the typical challenges of software testing and certification with the additional needs of accommodating the distributed nature of the resources, broader access by a less bounded consumer population, and the desired flexibility to create new solutions from existing components over which the solution developer has little, if any, control. The purpose of testing is to demonstrate a required level of reliability, correctness, and effectiveness that enables prospective consumers to have adequate confidence in using a service.

Capabilities related to Conformance Testing within the Semantic Infrastructure are summarized as:

  • SOA Testing Environment, including definition, development, and maintenance of monitoring service suites; inclusion of test points in service descriptions; test resource discovery services; and automated testing
  • Semantic validation, including harmonization validation, terminology consistency, and consistency in model representation (see the sketch after this list)
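
As a rough illustration of how automated testing and terminology-consistency validation might be exercised together, the sketch below uses only the Python standard library to call a declared test point and check that coded values in the response belong to a bound value set. The service URL, the payload shape, and the allowed code set are hypothetical assumptions for illustration, not artifacts of the Semantic Infrastructure.

```python
import json
import unittest
import urllib.request

# Hypothetical test point published in a service description; a real suite
# would discover this through the registry's test resource discovery service.
SERVICE_URL = "https://example.org/diagnosis-service/lookup?code=C34"

# Hypothetical value set standing in for a bound terminology (e.g., a set of
# codes the specification says the service must emit).
ALLOWED_CODES = {"C34", "C50", "C61"}


class ConformanceSuite(unittest.TestCase):
    def test_service_responds(self):
        """Automated availability check against a declared test point."""
        with urllib.request.urlopen(SERVICE_URL, timeout=5) as resp:
            self.assertEqual(resp.status, 200)

    def test_terminology_consistency(self):
        """Semantic validation: every coded value in the payload must belong
        to the terminology bound in the service specification."""
        with urllib.request.urlopen(SERVICE_URL, timeout=5) as resp:
            payload = json.load(resp)
        for record in payload.get("results", []):
            self.assertIn(record["code"], ALLOWED_CODES)


if __name__ == "__main__":
    unittest.main()
```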

Functional Profile

  • 4.3.1 - SOA Testing: Testing for SOA combines the typical challenges of software testing and certification with the additional needs of accommodating the distributed nature of the resources, broader access by a less bounded consumer population, and the desired flexibility to create new solutions from existing components over which the solution developer has little, if any, control. The purpose of testing is to demonstrate a required level of reliability, correctness, and effectiveness that enables prospective consumers to have adequate confidence in using a service. Adequacy is defined by the consumer based on the consumer's needs and context of use. Absolute correctness and completeness cannot be proven by testing; for SOA, however, it is critical for the prospective consumer to know what testing has been performed, how it was performed, and what the results were.
  • 4.3.2 - Validate: Conformance testing leverages the artifact and service metadata to validate that an implementation adequately addresses the requirements stated in the service specification. An example of a service requirement is the ability to specify a response time in the specification (design time) and verify that an implementation of the service meets it at run time (see the sketch below). Additional test points include, but are not limited to, binding to specific terminologies and domain models.
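
As an illustration of the response-time test point, the following sketch polls an implementation and compares each observed latency with the limit stated in the specification. The endpoint URL, the metadata shape (`endpoint`, `max_response_time_ms`), and the sample count are assumptions made for this example, not part of the Semantic Infrastructure specification.

```python
import time
import urllib.request

# Hypothetical design-time requirement drawn from a service specification.
# In practice this would be read from the service's published metadata.
spec = {
    "endpoint": "https://example.org/diagnosis-service/lookup?code=C34",
    "max_response_time_ms": 500,
}


def validate_response_time(spec: dict, samples: int = 10) -> bool:
    """Invoke the implementation repeatedly and check each observed latency
    against the response time stated in the specification."""
    for _ in range(samples):
        start = time.monotonic()
        with urllib.request.urlopen(spec["endpoint"], timeout=5):
            pass
        elapsed_ms = (time.monotonic() - start) * 1000
        if elapsed_ms > spec["max_response_time_ms"]:
            return False
    return True


if __name__ == "__main__":
    print("conforms:", validate_response_time(spec))
```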