The Linked Data Platform specification, informally LDP, describes the use of HTTP for accessing, updating, creating, and deleting resources from servers that expose their resources as Linked Data. This document introduces the conditions that LDP servers must satisfy in order to conform to the specification and presents a common format for describing LDP test results. The test cases both illustrate the features of the specification and can be used to test conformance. [[LINKED-DATA-PLATFORM]]


This document introduces a test suite that can be used to evaluate the conformance of LDP servers to the LDP specification [[LINKED-DATA-PLATFORM]]. The document also presents the design principles that guided the development of the test suite, a testing process, and a common format for describing test results.

The purpose of the test cases is to illustrate the specification features and to help in testing conformance. The provided set of test cases is "incomplete" in the sense that passing all the tests does not prove that a given system conforms to the LDP specification; failing a test does, however, prove that the system does not conform to the specification.

The presented format is intended to facilitate the use of tests by LDP server developers, e.g., in a test harness, as well as the extension of the test suite with new tests. Developers can check the LDP Primer [[LDP-PRIMER]] for concrete examples of inputs and expected outputs that can be used for testing.

Design principles

Generic vs domain-specific servers

There will be two types of servers implementing the LDP specification: generic servers, which accept and manage arbitrary resource representations, and domain-specific servers, which only accept resources that conform to their particular domain model.

In order to cover both types of servers, the test suite provides some basic input data as well as a mechanism for supplying server-specific input data. It is up to the evaluator to define the specific input data for a given server, and evaluators must include these input data along with the results when reporting the results of that server.

Protocol evaluation vs data evaluation

The LDP specification includes restrictions on LDP servers at the protocol level and at the data level. Currently, the restrictions at the data level are minimal and servers are not forced to behave in a particular way when processing LDPR representations. Therefore, the test suite evaluates LDP servers mostly at the protocol level; the only exception is the case of LDPCs, since they are required to include rdf:type, containment, and membership statements in their representations.
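For instance, a minimal LDPC representation checked at the data level would look like the following sketch; the container and member URIs are illustrative only, and for an ldp:BasicContainer the membership triples coincide with the containment triples by default.

```turtle
@prefix ldp: <http://www.w3.org/ns/ldp#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

# Hypothetical container; the test suite checks that the representation
# includes the required rdf:type and containment/membership statements.
<http://example.org/container/> a ldp:BasicContainer;
    dcterms:title "An example container";
    ldp:contains <http://example.org/container/resource1>,
                 <http://example.org/container/resource2>.
```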

It is out of the scope of the test suite to test LDP servers in terms of the restrictions imposed by their underlying data models.

Test suite coverage

The test suite covers the requirements present in the LDP specification at every compliance level: MUST, SHOULD, and MAY. The absolute (MUST) requirements identify the core subset of the LDP specification, LDP Core from now on, and any LDP server that satisfies these absolute requirements will be an LDP Core conformant server.

It is out of the scope of the test suite to test other levels of conformance in terms of optional capabilities (e.g., paging, patch formats).

Furthermore, the LDP specification [[LINKED-DATA-PLATFORM]] contains a number of requirements that cannot be validated by automated means; these are identified in a coverage report for the [[LINKED-DATA-PLATFORM]]. These requirements will need to be validated by some non-automated method and their results evaluated.

Separation of results and assertions

Instead of defining expected results for tests, which would depend on specific implementations, we have defined the assertions to be made over test results. In order to pass a test, all of its assertions must be satisfied.

Separating test outputs from assertions has other benefits: it makes it simpler to report tool results, and assertions can be made by a third party.
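As a sketch of the third-party case, the same test result could be asserted by an evaluator other than the implementer; the URIs below are hypothetical, using only the EARL vocabulary and the test case identifier defined later in this document.

```turtle
# Hypothetical assertion made by an independent evaluator rather than
# by the software under test itself.
:TCR1-Assertion-ThirdParty a earl:Assertion;
    earl:subject :SomeServer;
    earl:test ldpt:CommonResource-GetResource;
    earl:assertedBy :IndependentEvaluator;
    earl:mode earl:manual;
    earl:result [
        a earl:TestResult;
        earl:outcome earl:failed
    ] .
```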

Traceability of test cases

Any test case, together with its produced results and assertions, should be related to the documents that are relevant to it (e.g., specifications, use cases, etc.).

Testing process

The LDP Test Cases are defined within Java source code [[LDP-TESTCASES]]. Details about each individual test case, such as whether it can be executed by automated means or manually, can be found in the Java source code annotations. The annotations also record the status of each test case: approved by the LDP Working Group, awaiting approval, or not yet implemented. [[LDP-TESTSUITE-COVERAGE]]

  1. The person or agent in charge of executing the test cases on a specific LDP server takes the test case definitions and runs every test case against that server. The execution of the test cases must produce a test execution report for the LDP server, in RDF, that contains for every test case: the specific inputs used during its execution, the produced outputs, and the assertion of whether the test case passed. The test execution report must be supplied as defined in the document on implementation conformance reports. [[LDP-CONFORM]]
  2. A report generator takes all the LDP server execution reports and generates an implementation report that includes the results of all the LDP servers. [[LDP-CONFORM]]


Submitting results

Here is a summary of the steps needed for an assertor to submit the compliance results for an implementation.

Describing testing artifacts in RDF

Namespaces used

The following vocabularies are reused for describing the testing artifacts: DOAP (doap), Dublin Core (dcterms) [[DC11]], FOAF (foaf) [[FOAF]], and W3C Test Metadata (td) [[TEST-METADATA]].

All the new required entities that are not covered by those vocabularies, as well as the LDP test cases themselves, have been defined under a new namespace (ldpt).

Next we present the definition of these namespaces and of all the namespaces used in the examples.

dcterms: <http://purl.org/dc/terms/>
doap: <http://usefulinc.com/ns/doap#>
earl: <http://www.w3.org/ns/earl#>
foaf: <http://xmlns.com/foaf/0.1/>
mf: <http://www.w3.org/2001/sw/DataAccess/tests/test-manifest#>
rdfs: <http://www.w3.org/2000/01/rdf-schema#>
rdft: <http://www.w3.org/ns/rdftest#>
td: <http://www.w3.org/2006/03/test-description#>
ldpt: <>

Test case description

A test case is defined as an instance of the td:TestCase class and it can be further described using the following properties:

An excerpt is defined as an instance of the tn:Excerpt class and it can be further described using the following properties:

The following example contains the description of one of the LDP test cases.

ldpt:CommonResource-GetResource a td:TestCase;
         rdfs:label "CommonResource-GetResource";
         mf:name "CommonResource-GetResource";
         dcterms:title "GET on an LDPR";
         dcterms:description "Tests making a GET request on an existing LDPR";
         dcterms:contributor :RaulGarciaCastro;
         td:reviewStatus td:approved;
         rdfs:seeAlso <>;
         dcterms:subject "MUST" ;
         td:specificationReference [
             a tn:Excerpt;
             rdfs:seeAlso <>;
             tn:includesText "LDP servers MUST support the HTTP GET Method for LDPRs"
         ] .

:RaulGarciaCastro a foaf:Person;
                    rdfs:label "Raúl García-Castro";
                    owl:sameAs <>.

Test case assertion description

An assertion is defined as an instance of the earl:Assertion class and it can be further described using the following properties:

The following example contains the description of one test assertion.

:TCR1-Assertion-SomeServer a earl:Assertion;
	earl:subject :AwesomeLDP;
	earl:test ldpt:CommonResource-GetResource;
	earl:assertedBy :AwesomeLDP;
	earl:mode earl:automatic;
	earl:result [
		a earl:TestResult ;
		dcterms:date "2014-07-06T09:30:10";
		earl:outcome earl:passed
	] .

:AwesomeLDP a doap:Project, earl:TestSubject, earl:Software, earl:Assertor;
	doap:name "Awesome LDP";
	doap:description "Awesome LDP implementation";
	doap:developer [
		a foaf:Person ;
		foaf:mbox "" ;
		foaf:name "Dope Developer"
	];
	doap:homepage <>;
	doap:programming-language "JavaScript".

Change history