...

Moreover, instances of the RIVP will be made available to the teams to test their software interface compliance before delivery to the ROC.

The way to use these instances is described in the "RCS Interface Validation Pipeline User Manual" (to be provided soon).

...

The RCS integration tests are performed by the ROC team, using sufficiently representative RCS input/output "test" data samples. The objective is to ensure that the RCS produce the same L1R/L2 CDF files when they run as "stand-alone" software (i.e., at the RCS team site) and as a module of the ROC pipeline (i.e., on the ROC server at LESIA).

The tests consist of calling the RCS with a "pre-prod" instance of the ROC pipelines, in order to reproduce at the LESIA site the expected "test" output data. The ROC will use a "CDF comparison" tool to verify that the L1R/L2 CDF data generated by the RCS at LESIA match those provided by the teams.
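The comparison step could be sketched as follows. This is only an illustrative sketch, not the actual ROC "CDF comparison" tool: the `IGNORED_ATTRS` set, the dictionary representation of CDF contents, and the function name are assumptions, and a real implementation would load the files with a CDF library (e.g. cdflib) and apply the ROC's own tolerance rules.

```python
# Hypothetical sketch of a "CDF comparison" check: the two files must hold
# the same variables with the same values, while run-specific global
# attributes (generation date, software version, ...) are ignored.
# CDF contents are modelled here as plain dictionaries for illustration.

IGNORED_ATTRS = {"Generation_date", "Software_version"}  # assumed run-specific metadata

def compare_cdf_records(reference: dict, candidate: dict) -> list:
    """Return a list of human-readable differences between two CDF-like records."""
    diffs = []

    # Both sides must expose the same set of variables.
    ref_vars = set(reference["variables"])
    cand_vars = set(candidate["variables"])
    for name in sorted(ref_vars ^ cand_vars):
        diffs.append(f"variable '{name}' present on one side only")

    # Common variables must hold identical data values.
    for name in sorted(ref_vars & cand_vars):
        if reference["variables"][name] != candidate["variables"][name]:
            diffs.append(f"variable '{name}' differs")

    # Global attributes must match, except the run-specific ones.
    for key in sorted(set(reference["attrs"]) | set(candidate["attrs"])):
        if key in IGNORED_ATTRS:
            continue
        if reference["attrs"].get(key) != candidate["attrs"].get(key):
            diffs.append(f"attribute '{key}' differs")

    return diffs

# Example: a candidate that matches the reference except for the generation date.
ref = {"variables": {"EPOCH": [1, 2, 3], "V": [0.1, 0.2, 0.3]},
       "attrs": {"Logical_source": "solo_L2_rpw", "Generation_date": "2020-01-01"}}
cand = {"variables": {"EPOCH": [1, 2, 3], "V": [0.1, 0.2, 0.3]},
        "attrs": {"Logical_source": "solo_L2_rpw", "Generation_date": "2020-06-15"}}
print(compare_cdf_records(ref, cand))  # → []
```

With this approach, a run that reproduces the delivered "test" output yields an empty difference list, while any mismatch in variables or non-ignored attributes is reported explicitly.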

The "test" data samples shall be delivered by the RCS teams, as specified in the REGU (see ROC-GEN-SYS-NTT-00019-LES in the ROC Documents). The ROC team can however help the teams to identify and supply the input "test" data (for instance, some L2 and L1R output data CDF require input L1/HK CDF that can only be generated by the ROC pipelines).

Warning
  • There must be one set of "test" data files for each RCS data product and each ROC pipeline (ROC-SGSE or RODP).
  • The aim of the integration tests is not to check the validity of the L1R/L2 output data content (e.g., in terms of data quality/calibration), but just to ensure that the behaviour of the RCS is as expected when it is run into the ROC pipelines.

RCS VALIDATION TESTS

The RCS validation tests shall allow the ROC to validate the requirements and specifications related to the RCS and their data products.

...