Completed the test case document.
--- a/testcases/process.html Thu Dec 06 14:54:09 2012 +0000
+++ b/testcases/process.html Thu Dec 06 15:40:22 2012 +0000
@@ -112,8 +112,7 @@
validity (as specified in [[PROV-CONSTRAINTS]]). The test cases are specially designed to
cover a small set of constraints (called unit test cases, see <a href="#unit-test-cases" class="sectionRef">Section 2.1</a>) or to exercise a subset of PROV statements. By
successfully determining the validity of such a test case, an implementation indirectly
-demonstrates its support for the PROV feature(s) covered by the test case.
-</p>
+demonstrates its support for the PROV feature(s) covered by the test case.</p>
<section id="testcase-identifier">
@@ -155,15 +154,13 @@
<p>For example, the available files for the test case <b>ordering-derivation1-PASS-c42</b>
are: <b>ordering-derivation1-PASS-c42.ttl</b>, <b>ordering-derivation1-PASS-c42.provn</b>,
-and <b>ordering-derivation1-PASS-c42.provx</b>. Although all the thee representations are
-provided for every test case, your implementation will only need to validate one
-representation (out of the three) in order to report the result for one.</p>
+and <b>ordering-derivation1-PASS-c42.provx</b>. Please note that, although we provide three different representations, you only need to validate one per test case.</p>
<p>For your convenience, the download links for all the available test cases
-(see <a href="#test-case-catalogues" class="sectionRef">Section 2</a>)
+(as enumerated in <a href="#test-case-catalogues" class="sectionRef">Section 2</a>)
for each representation are respectively provided in the following text files:
<a href="rdf-tests.txt">rdf-tests.txt</a>, <a href="provn-tests.txt">provn-tests.txt</a>,
-and <a href="xml-tests.txt">xml-tests.txt</a>. Please note that, all though we provide three different representations, you only need to validate one per each test case.</p>
+and <a href="xml-tests.txt">xml-tests.txt</a>.</p>
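As an illustration only (not part of the submission procedure), a short script could read one of these list files and fetch each test case. The helper names below are hypothetical, and the assumption that each line holds exactly one download URL follows from the files' stated purpose rather than from an explicit statement in this document:

```python
from pathlib import Path
from urllib.request import urlretrieve

def read_testcase_urls(list_file):
    """Return the non-empty lines of a per-representation list file
    (e.g. provn-tests.txt), assumed to hold one download URL per line."""
    lines = Path(list_file).read_text().splitlines()
    return [line.strip() for line in lines if line.strip()]

def download_all(list_file, dest_dir="testcases"):
    """Fetch every listed test case into dest_dir (requires network access)."""
    dest = Path(dest_dir)
    dest.mkdir(exist_ok=True)
    for url in read_testcase_urls(list_file):
        # Save under the test case's own file name.
        urlretrieve(url, dest / url.rsplit("/", 1)[-1])
```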
</section>
<section>
@@ -176,27 +173,28 @@
The <b>pass.txt</b> file MUST contain all the identifiers (one on each line) of
the test cases that have been successfully validated, and <b>fail.txt</b>
MUST contain the identifiers of all those that have failed the validation.
-Test cases that are not supported by the implementation SHOULD NOT be included
+The identifiers MUST also include the file extension, e.g. <b>ordering-derivation1-PASS-c42.provn</b>, to indicate the representations that were actually tested. Test cases that are not supported by the implementation SHOULD NOT be included
in either of the files.</p>
-<p>For example, if a validator can <strong>only</strong> process the three test cases
+<p>For example, if a validator can <strong>only</strong> process the PROV-N representations of the three test cases
<b>ordering-derivation1-PASS-c42</b>, <b>ordering-derivation2-FAIL-c42</b>, and <b>ordering-derivation3-PASS-c41-c42</b>, we
-expect the result files's contents to be similar to the below.</p>
+expect the result files' contents to be similar to the following.</p>
<pre class="example">
pass.txt:
- ordering-derivation1-PASS-c42
- ordering-derivation3-PASS-c41-c42
+ ordering-derivation1-PASS-c42.provn
+ ordering-derivation3-PASS-c41-c42.provn
fail.txt:
- ordering-derivation2-FAIL-c42
+ ordering-derivation2-FAIL-c42.provn
</pre>
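The partitioning shown above can be sketched as follows. Both `write_results` and the `validator` callable are hypothetical names introduced for this sketch; the only format this document prescribes is one identifier, with its file extension, per line:

```python
from pathlib import Path

def write_results(testcases, validator, pass_file="pass.txt", fail_file="fail.txt"):
    """Partition test case file names into pass.txt and fail.txt.

    `testcases` holds file names with extension (e.g.
    "ordering-derivation1-PASS-c42.provn") so the tested representation
    is recorded; `validator` returns True when it finds the document valid.
    """
    passed, failed = [], []
    for t in testcases:
        (passed if validator(t) else failed).append(t)
    # One identifier per line, as required for the submitted files.
    Path(pass_file).write_text("".join(t + "\n" for t in passed))
    Path(fail_file).write_text("".join(t + "\n" for t in failed))
    return passed, failed
```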
-<p>Implementers MAY also chose to include the file extentions with the identifiers, e.g. <b>ordering-derivation1-PASS-c42.provn</b>, in the reported results to indicate the versions that were actually tested.</p>
-<p>Please email the test result files to the PROV Working Group at <a href="mailto:public-prov-wg@w3.org">public-prov-wg@w3.org</a>. </p>
-
+<p>Please email the test result files to the PROV Working Group at <a href="mailto:public-prov-wg@w3.org">public-prov-wg@w3.org</a>. Your implementation test results will
+ be collated and added to the
+ <a href="http://dvcs.w3.org/hg/prov/raw-file/default/reports/prov-implementations.html">implementation
+ report</a>.</p>
<p class="note">Dong: Another option is to have implementers provide some basic information about their
-implementations in a questionnaire on WBS and copy the contents of the two results files into text fields.</p>
+ implementations in a questionnaire on WBS and copy the contents of the two results files into text fields.</p>
</section>
@@ -206,13 +204,19 @@
<h2>Test Case Catalogues</h2>
-<p class="note">TODO: Introduction about the different types of test cases and how to get them.</p>
+<p>This section enumerates all test cases available for PROV-CONSTRAINTS. There are two types of test cases:</p>
+<dl>
+ <dt>Unit test cases</dt>
+ <dd>Unit test cases are designed to cover particular constraints (see <a href="#unit-test-cases"
+  class="sectionRef">Section 2.1</a>). Please note that for a small number of test cases we do not provide an RDF representation, since the (invalid) constructs therein cannot be expressed in valid RDF.</dd>
+ <dt>Example test cases</dt>
+  <dd>These are examples from PROV documents, which exercise various PROV statements. They are typically complex examples and cover numerous constraints. Therefore, we did not include those constraints in the test cases' identifiers. However, any PROV-CONSTRAINTS implementation SHOULD be able to successfully validate all the example test cases. Test cases for examples from [[PROV-DM]] and [[PROV-O]] are listed in <a href="#prov-dm-test-cases" class="sectionRef">Section 2.2</a> and <a href="#prov-o-test-cases" class="sectionRef">Section 2.3</a> respectively.</dd>
+</dl>
<section>
<h3>Unit Test Cases</h3>
- <p>All test cases need to be checked for the constraints they cover and their expected
- validation results (i.e. Pass or Fail).</p>
+  <p>All test cases SHOULD be checked for their validity with respect to the constraints they cover. For a test result to count as a success, the determined validity MUST be the same as the expected validation result provided in Table 2 below.</p>
<table class="simple" id="table-unit-test-cases">
<caption>Table 2. PROV-CONSTRAINTS test cases</caption>
@@ -2204,24 +2208,12 @@
</tr>
</table>
- <p>For each report, the successful and failed validations will need to be checked against
- the expected validation results in Table 2 above. A test case is considered to be
- successful if its validation result is the same as the expected result in Table 2.</p>
-
- <p>The test case results will also need to be collated against the constraints they cover
- to indicate which constraints have/have not been successfully tested. This information will
- be added to the
- <a href="http://dvcs.w3.org/hg/prov/raw-file/default/reports/prov-implementations.html">implementation
- report</a> for the validator for which those test cases result are submitted.</p>
-
- <p>The collated result will be made available to the implementer soon after they submit
- their result.</p>
-
+
</section>
<section>
-
-<h3>PROV-DM Test Cases</h3>
+ <h3>PROV-DM Test Cases</h3>
+ <p>This section enumerates the example test cases from [[PROV-DM]].</p>
<table class="simple" id="table-provdm-testcases">
<caption>Table 3. Test cases from [[PROV-DM]] examples</caption>
@@ -2588,6 +2580,7 @@
<section>
<h3>PROV-O Test Cases</h3>
+<p>This section enumerates the example test cases from [[PROV-O]].</p>
<table class="simple" id="table-provo-testcases">
<caption>Table 4. Test cases from [[PROV-O]] examples</caption>
@@ -3252,6 +3245,7 @@
<section class="appendix">
<h2>Constraint Coverage</h2>
+<p>Table 5 in this section provides a lookup of the constraints in [[PROV-CONSTRAINTS]] against the unit test cases that cover them.</p>
<table class="simple" id="table-coverage">
<caption>Table 5. Constraint coverage by unit test cases</caption>