OVP 2.0 Requirements and Testing Principles


Status: DRAFT

Overview

The objective of the OVP 2.0 (or OVP Cloud Native) initiative is to provide an automated mechanism to measure either a Cloud Native Network Function (CNF) or a Network Functions Virtualization Infrastructure (NFVI) against a standard set of requirements, presumably defined by the CNTT and related efforts.  Through this measurement, a provider of either CNFs or NFVI can acquire community badges under the OPNFV Verified Program to signal their compliance.  This will ease the integration of CNFs into operator environments that host compatible NFVIs, thereby reducing the cost, complexity, and time of integration.

The overall program requires the close coordination of the following:

  • Requirements - The agreed upon rules and recommendations to which a compliant CNF or NFVI must adhere
  • Tests - The verification mechanism that determines that a given CNF or NFVI complies with one or more requirements
  • Conformance Specifications - The definition of the requirements, tests, and circumstances (pre-conditions, etc.) that must be met to be deemed conformant.  

Without clear traceability and strong links between these three components, it becomes difficult to understand how to receive a badge and to maintain community alignment on the meaning of the badges.

With this in mind, the OVP Requirements Activities Work Stream has defined a set of recommended principles for each of the three components to follow. The hope is that adherence to these principles will provide the following:

  • Enable clear progress tracking and linkage between independent projects (i.e. know what has and hasn't been covered, and track changes over time)
  • Help users better understand mandatory vs. optional requirements
  • Provide a stable set of point-in-time requirements and tests to achieve conformance.  While tests and requirements will change over time, applicants for a badge must be able to operate against a stable base for any given version of a badge.
  • Reduce ambiguity in either testing, requirements, or conformance

Requirement Principles

  • All requirements must be assigned a requirements ID and not be embedded in narrative text.  This is to ensure that readers do not have to infer if a requirement exists and is applicable.
  • Requirements must have a unique ID for tracking and reference purposes.
  • The requirement ID should include a prefix to delineate the source project
  • Requirements must state the level of compliance (ex: MUST, SHOULD, MAY) per RFC 2119
  • Mandatory requirements must be defined in such a way that they are unambiguously verifiable via automated testing
  • Requirements should be publishable or extractable into a machine-readable format such as JSON
  • Requirements should include information about the impact of non-conformance and the rationale for their existence
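A machine-readable requirement record satisfying the principles above might look like the following sketch. The field names and ID scheme are illustrative assumptions only, not an agreed OVP/CNTT schema:

```python
import json

# Hypothetical requirement record; field names and the ID scheme are
# illustrative assumptions, not a defined OVP/CNTT schema.
requirement = {
    "id": "cntt.req.001",  # unique ID with a prefix naming the source project
    "level": "MUST",       # RFC 2119 compliance level
    "text": "The workload must not require privileged containers.",
    "rationale": "Privileged containers weaken isolation between workloads.",
    "impact_of_non_conformance": "The CNF may be rejected by hardened NFVIs.",
}

# Publishing the record as JSON makes it consumable by test tooling.
print(json.dumps(requirement, indent=2))
```

Keeping the rationale and impact alongside the normative text lets tooling surface *why* a check exists whenever it reports a violation.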

Testing Principles

  • Tests must provide a mapping to the requirement being validated
  • Failures should explicitly reference the requirement IDs violated (optionally they can provide a link or text of the requirement)
  • Failures should provide additional content to inform the user where or how the requirement was violated (ex: which file or resource violated the requirement).  Put another way, don’t require the user to read the test to understand what went wrong.
  • Testing tools should allow users to select between validation of mandatory and optional requirements (if validation of optional requirements is supported).
  • Testing tools should support selection of tests based on category or profile.  These categories and profiles should relate to the conformance badges where possible.
  • Result reports must clearly delineate violations of mandatory vs. optional requirements
  • Tests must be available to run locally by both CNF and NFVI providers
  • Testing tools must produce machine-readable result formats which can be used as input into the badging program (OVP already defines a format). Alternatively, a higher level test execution framework could convert test results of specific tools into an OVP compatible format.
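A machine-readable result record embodying these principles might look like the sketch below. The shape is an illustrative assumption, not the actual OVP result format: each failure names the violated requirement IDs and points at the offending resource, and the report can delineate mandatory from optional violations without the user reading the test source:

```python
import json

# Hypothetical test results; the record shape is an assumption,
# not the actual OVP-defined result format.
results = [
    {
        "test_id": "check-privileged-containers",
        "outcome": "FAIL",
        "mandatory": True,                         # mandatory requirement
        "requirements_violated": ["cntt.req.001"], # traceability to requirement IDs
        "detail": "Container 'app' in deployment.yaml sets privileged: true",
    },
    {
        "test_id": "check-resource-limits",
        "outcome": "PASS",
        "mandatory": False,                        # optional requirement
        "requirements_violated": [],
        "detail": "",
    },
]

# Delineate violations of mandatory requirements, as the report must.
mandatory_failures = [r for r in results
                      if r["outcome"] == "FAIL" and r["mandatory"]]
print(json.dumps([r["test_id"] for r in mandatory_failures]))
```

Because each record is plain JSON, a higher-level execution framework could collect and convert such results into an OVP-compatible format.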

Conformance Specifications

  • Conformance specifications must refer to or define the versioned requirements that must be satisfied to achieve a given OVP badge
    • Question: There has been discussion of supporting scoring results to qualify for differing levels (ex: Bronze, Silver, Gold) of compliance.  This is a departure from prior OVP badges and the decision could impact some of the principles in the conformance specification.
  • Conformance specifications must refer to the versioned test implementations that must be used to validate the requirements
  • Since the OVP badge is specific to the Linux Foundation Networking family of products, all requirements must be defined within an LFN project.  This does not preclude an LFN requirement referring to an external requirement or set of requirements as long as those requirements can be identified and versioned appropriately.
  • Conformance specifications must define the expected preconditions and environment requirements for any test tooling
  • Conformance specifications must define which tests must be executed in the given testing tools to achieve the badge
  • If the testing tool does not provide explicit mapping between tests and requirements, then the conformance specification must provide the mapping between tests and requirements to demonstrate traceability and coverage.
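The traceability the specification must provide can be sketched as a versioned mapping between tests and requirements. All keys, version strings, and tool names below are illustrative assumptions:

```python
# Hypothetical conformance-specification fragment; keys, versions, and
# tool names are illustrative assumptions, not an actual OVP document.
conformance_spec = {
    "badge": "OVP Cloud Native",
    "badge_version": "2.0",
    "requirements_version": "cntt-reference-1.0",  # versioned requirements
    "test_tool": {"name": "example-validator",     # versioned test implementation
                  "version": "0.1.0"},
    "mappings": [
        {"test_id": "check-privileged-containers",
         "requirement_ids": ["cntt.req.001"]},
        {"test_id": "check-resource-limits",
         "requirement_ids": ["cntt.req.002"]},
    ],
}

# Traceability check: every mapped requirement is covered by at least one test.
covered = sorted({rid
                  for m in conformance_spec["mappings"]
                  for rid in m["requirement_ids"]})
print(covered)
```

Pinning both the requirements version and the tool version gives badge applicants the stable point-in-time base called for in the overview.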