Added the implementation table to the report
author Trung Dong Huynh <tdh@ecs.soton.ac.uk>
Wed, 06 Feb 2013 11:52:41 +0000
changeset 5483 c5ea10a65199
parent 5482 ba174af1434b
child 5484 ee4e86bf5145
reports/prov-implementations.html
--- a/reports/prov-implementations.html	Tue Feb 05 23:52:54 2013 +0100
+++ b/reports/prov-implementations.html	Wed Feb 06 11:52:41 2013 +0000
@@ -2,7 +2,7 @@
 
 <html><head> 
     <title>PROV Implementation Report</title> 
-    <script src="http://dev.w3.org/2009/dap/ReSpec.js/js/respec.js" class="remove"></script>
+    <script src="http://dev.w3.org/2009/dap/ReSpec.js/js/respec.js" class="remove"></script>
     <script src="../model/provbib.js" class="remove"></script>
     <script class="remove"> 
       var respecConfig = {
@@ -110,17 +110,17 @@
 </p>
     </section>
     <section id="implementations">
-      <h2>Implementations</h2>
+      <h2>Implementations</h2>
    
      The following table lists each reported implementation, its type, the PROV encodings it supports, and its URL. 
-         <p>Implementation Type:
-        <ul>
-            <li>Application</li>
-            <li>Framework/API</li>
-            <li>Service</li>
-            <li>Vocabulary</li>
-            <li>Constraints Validator</li>
-        </ul>
+      <p>Implementation Type:
+        <ul>
+            <li>Application</li>
+            <li>Framework/API</li>
+            <li>Service</li>
+            <li>Vocabulary</li>
+            <li>Constraints Validator</li>
+        </ul>
       </p>
       <table border="1" cellspacing="0">
         <caption id="implementations-table">Table 1: List of implementations reported to the PROV Working Group.</caption>
@@ -130,30 +130,312 @@
           <th scope="col">Type</th>
           <th scope="col">PROV Encodings</th>
           <th scope="col">URL</th>
+          <th scope="col">Description</th>
         </tr>
-        <tr>
+        <tr id="1">
           <td>1</td>
-          <th scope="row">ProvPy</th>
+          <th scope="row">WebLab-PROV</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O, PROV-N</td>
+          <td><a href="http://weblab-project.org/index.php?title=WebLab-PROV">http://weblab-project.org/index.php?title=WebLab-PROV</a></td>
+          <td>Application of provenance on the WebLab platform, using the PROV ontology.</td>
+        </tr>
+        <tr id="2">
+          <td>2</td>
+          <th scope="row">StatJR eBook system</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O, PROV-JSON</td>
+          <td><a href="http://www.bristol.ac.uk/cmm/software/statjr/index.html">http://www.bristol.ac.uk/cmm/software/statjr/index.html</a></td>
+          <td>StatJR is a statistical modelling software package designed to be open and extensible. Its target audience is researchers in the social sciences, and there is a strong pedagogical aspect to the software via the eBook interface. This interface makes heavy use of PROV to represent the interactions and computations that occur as a user reads eBooks. PROV is also used to drive some of the functionality of the system, such as caching executions and allowing the user to navigate to past versions of the eBook. The system is described in our IPAW 2012 paper, <br>
+              DEEP: A Provenance-Aware Executable Document System.</td>
+        </tr>
+        <tr id="3">
+          <td>3</td>
+          <th scope="row">PoN</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-N, PROV-JSON (and RDF serialization in future) via the provpy module.</td>
+          <td><a href="http://tina.ecs.soton.ac.uk/djangopon/">http://tina.ecs.soton.ac.uk/djangopon/</a></td>
+          <td>PoN is a web application for collecting, organizing and browsing archeological research data and notes. The web application can be accessed through both computers and smart phones. All versions of artefacts are retained, allowing users to see how interpretations have developed. <br>
+              PoN has been developed as part of the PATINA project, which aims to revolutionise the design of technologies for supporting research. The PATINA project is funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Arts and Humanities Research Council (AHRC) through the RCUK Digital Economy programme.</td>
+        </tr>
+        <tr id="4">
+          <td>4</td>
+          <th scope="row">WingsProvenanceExport</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/dgarijo/WingsProvenanceExport">https://github.com/dgarijo/WingsProvenanceExport</a></td>
+          <td>Provenance export for the Wings workflow system, using the OPMW vocabulary. OPMW extends PROV core concepts to represent scientific workflows. More information at: http://www.opmw.org/node/8<br>
+              Input: OWL ontologies and TTL files used to specify templates and summaries of traces, respectively. Output: two files, one in PROV and one in OPM. <br>
+              The application generates provenance; it does not consume it.</td>
+        </tr>
+        <tr id="5">
+          <td>5</td>
+          <th scope="row">CollabMap</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-JSON</td>
+          <td><a href="http://www.collabmap.org/">http://www.collabmap.org/</a></td>
+          <td>CollabMap is a platform designed for crowdsourcing the task of identifying building evacuation routes to a large number of users by offering them freely available data, such as satellite imagery (e.g. Google Maps), panoramic views (e.g. Google Streetview) and building shapes to carry out this task. The application tracks the provenance of its users for quality verification purposes.</td>
+        </tr>
+        <tr id="6">
+          <td>6</td>
+          <th scope="row">Taverna</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/wf4ever/taverna-prov/">https://github.com/wf4ever/taverna-prov/</a></td>
+          <td>Plugin for the Taverna workflow system for exporting workflow runs as PROV RDF. Provenance data includes information about the start/stop of individual activities in a workflow definition, the workflow as a whole, and any nested workflows. Data items (entities) are identified as used and generated by the different steps. <br>
+              The RDF uses the vocabularies PROV-O, wfprov and a customization of the two called tavernaprov (see the separate vocabulary registration), and is saved as a single Turtle file together with file representations of input, output and intermediate values.</td>
+        </tr>
+        <tr id="7">
+          <td>7</td>
+          <th scope="row">ProvToolbox</th>
           <td>Framework / API</td>
-          <td>PROV-N, Other</td>
-          <td><a href="https://github.com/trungdong/w3-prov">https://github.com/trungdong/w3-prov</a></td>
+          <td>PROV-O, PROV-N, PROV-XML, PROV-JSON</td>
+          <td><a href="https://github.com/lucmoreau/ProvToolbox">https://github.com/lucmoreau/ProvToolbox</a></td>
+          <td>ProvToolbox is a Java toolbox to create W3C PROV representations and convert them between Java, RDF, XML, PROV-N, JSON, and dot.</td>
         </tr>
-        <tr>
-          <td>2</td>
-          <th scope="row">ProvToolbox</th>
-          <td>Application</td>
-          <td>PROV-N, PROV-RDF, PROV-XML, Other</td>
-          <td><a href="https://github.com/lucmoreau/ProvToolbox">https://github.com/lucmoreau/ProvToolbox</a></td>
+        <tr id="8">
+          <td>8</td>
+          <th scope="row">Provenance for Earth Science</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="tbd">tbd</a></td>
+          <td>Extends PROV-O with a provenance representation for use in NASA's Earth Science Data Systems. Also assessing the overlap of terms in ISO 19115-* Lineage.</td>
+        </tr>
+        <tr id="9">
+          <td>9</td>
+          <th scope="row">Provenance Environment (ProvEn) Services</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="Coming soon">Coming soon</a></td>
+          <td>ProvEn Services enable scientific teams to publish, share, link, and discover knowledge about their scientific research results. In science, provenance is produced in many different manual and automated ways and is highly expressive. Scientific teams producing results need a means to provide a composite origin story of a dataset to future consumers while maintaining privacy. ProvEn Services provides Extract, Translate, Load (ETL) services for users to provide native sources of provenance to build a composite history. It relies on PROV-O, Dublin Core and other foundational ontologies so that diverse scientific knowledge can be cross-referenced.</td>
+        </tr>
+        <tr id="10">
+          <td>10</td>
+          <th scope="row">Annotation Inference Framework</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O, PROV-N, PROV-XML, PROV-JSON</td>
+          <td><a href="http://users.ecs.soton.ac.uk/ask2g10/prov/">http://users.ecs.soton.ac.uk/ask2g10/prov/</a></td>
+          <td>Annotation inference is a form of inference that, given a provenance graph with some annotations, infers new annotations for the same graph. Annotation is introduced as a generic mechanism enabling users to attach any information to the elements of a provenance graph for domain-specific interpretation of the provenance of data. This framework can be instantiated based on various needs.</td>
+        </tr>
+        <tr id="11">
+          <td>11</td>
+          <th scope="row">PROVoKing</th>
+          <td>Framework / API</td>
+          <td>PROV-O</td>
+          <td><a href="https://sites.google.com/site/provokinglibrary/">https://sites.google.com/site/provokinglibrary/</a></td>
+          <td>A general Java library for supporting the use of W3C PROV in Java applications.</td>
+        </tr>
+        <tr id="12">
+          <td>12</td>
+          <th scope="row">Triplify</th>
+          <td>Service</td>
+          <td>PROV-O</td>
+          <td><a href="http://triplify.org/">http://triplify.org/</a></td>
+          <td>Triplify is a small plugin for Web applications which reveals the semantic structures encoded in relational databases by making database content available as Linked Data. The customizable metadata component that, by default, adds some provenance information to the published data is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=Triplify_Metadata_Extension</td>
+        </tr>
+        <tr id="13">
+          <td>13</td>
+          <th scope="row">Prov-gen</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-N; the author also implemented an intermediate object model to represent PROV instances, called PROV-Java, available here: 
+https://github.com/PaoloMissier/ProvToolbox/tree/master/prov-java</td>
+          <td><a href="https://github.com/PaoloMissier/ProvToolbox/tree/master/prov-gen">https://github.com/PaoloMissier/ProvToolbox/tree/master/prov-gen</a></td>
+          <td>A simple generator that produces a large PROV instance from a seed PROV instance, which can be defined using PROV-N or interactively through a simple GUI.<br>
+              The generated PROV instance is encoded as PROV-N. <br>
+              This is described in an M.Sc. dissertation: https://github.com/PaoloMissier/ProvToolbox/blob/master/prov-gen/WilliamMartin-MSc-Dissertation.pdf</td>
+        </tr>
+        <tr id="14">
+          <td>14</td>
+          <th scope="row">OBIAMA (Ontology-Based Integrated Action Modelling Arena)</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/garypolhill/obiama">https://github.com/garypolhill/obiama</a></td>
+          <td>A prototype discrete-event simulation environment that represents the state and structure of the model at any one time using OWL ontologies. OBIAMA is capable of generating provenance about actions performed by agents in a simulation model using PROV-O.</td>
+        </tr>
+        <tr id="15">
+          <td>15</td>
+          <th scope="row">Amalgame</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://semanticweb.cs.vu.nl/amalgame/">http://semanticweb.cs.vu.nl/amalgame/</a></td>
+          <td>Amalgame is a tool for finding, evaluating and managing vocabulary alignments interactively. Because the process is interactive, we need PROV to record exactly what workflow has been executed to be able to interpret the results later.</td>
+        </tr>
+        <tr id="16">
+          <td>16</td>
+          <th scope="row">D2R Server</th>
+          <td>Service</td>
+          <td>PROV-O</td>
+          <td><a href="http://d2rq.org/d2r-server">http://d2rq.org/d2r-server</a></td>
+          <td>D2R Server is a tool for publishing relational databases on the Semantic Web. It enables RDF and HTML browsers to navigate the content of the database, and allows querying the database using the SPARQL query language. The customizable metadata component that, by default, adds some provenance information to the published data is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=D2R_Server_Metadata_Extension</td>
+        </tr>
+        <tr id="17">
+          <td>17</td>
+          <th scope="row">Provenance server</th>
+          <td>Service</td>
+          <td>PROV-N, PROV-JSON</td>
+          <td><a href="https://provenance.ecs.soton.ac.uk/store">https://provenance.ecs.soton.ac.uk/store</a></td>
+          <td>The Provenance Server is a web service that allows storing, browsing and managing provenance documents. The server can be accessed via a Web interface or via a REST API (using an API key or OAuth authentication).</td>
+        </tr>
+        <tr id="18">
+          <td>18</td>
+          <th scope="row">agentSwitch</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-N, PROV-JSON</td>
+          <td><a href="http://hac.ecs.soton.ac.uk/agentswitch/">http://hac.ecs.soton.ac.uk/agentswitch/</a></td>
+          <td>agentSwitch is a personalized energy tariff-recommender system which analyses electricity consumption and offers recommendations on:<br>
+              - the best tariffs to reduce electricity bills<br>
+              - shifting usage to night-time to benefit from lower electricity rates</td>
+        </tr>
+        <tr id="19">
+          <td>19</td>
+          <th scope="row">Far</th>
+          <td>Far</td>
+          <td>Far, PROV-O, PROV-XML</td>
+          <td><a href="Oracle Enterprise Transactions Controls Governor 8.6.4 ">Oracle Enterprise Transactions Controls Governor 8.6.4 </a></td>
+          <td>https://updates.oracle.com/Orion/Services/download/p14786779_864_Linux-x86-64.zip?aru=15596267<br>
+              patch_file=p14786779_864_Linux-x86-64.zip<br>
+              Oracle Enterprise Transactions Controls Governor 8.6.4 </td>
+        </tr>
+        <tr id="20">
+          <td>20</td>
+          <th scope="row">Pubby</th>
+          <td>Service</td>
+          <td>PROV-O</td>
+          <td><a href="http://wifo5-03.informatik.uni-mannheim.de/pubby/">http://wifo5-03.informatik.uni-mannheim.de/pubby/</a></td>
+          <td>Pubby can be used to add Linked Data interfaces to SPARQL endpoints. The customizable metadata component that, by default, adds some provenance information to the published data is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=Pubby_Metadata_Extension</td>
+        </tr>
+        <tr id="21">
+          <td>21</td>
+          <th scope="row">Semantic Proteomics Dashboard (SemPoD)</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://physiomimi.case.edu/sempod/index.php/Main_Page">http://physiomimi.case.edu/sempod/index.php/Main_Page</a></td>
+          <td>The SemPoD platform, currently in use at the Case Center for Proteomics and Bioinformatics (CPB), extends the PROV Ontology (PROV-O) to support provenance-aware querying of 1153 mass-spectrometry experiments from 20 different projects. SemPoD consists of three components: an Ontology-driven Visual Query Composer, a Result Explorer, and a Query Manager. SemPoD includes a dynamic query composition interface, which automatically updates the components of the query interface based on previous user selections and efficiently prunes the result set using a “smart filtering” approach based on the provenance of the results.</td>
+        </tr>
+        <tr id="22">
+          <td>22</td>
+          <th scope="row">DeFacto</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://defacto.aksw.org">http://defacto.aksw.org</a></td>
+          <td>DeFacto (Deep Fact Validation) is an algorithm for validating statements by finding confirming sources for them on the web. It takes a statement (such as “Jamaica Inn was directed by Alfred Hitchcock”) as input and then tries to find evidence for the truth of that statement by searching for information on the web.</td>
+        </tr>
+        <tr id="23">
+          <td>23</td>
+          <th scope="row">Quality Assessment Framework</th>
+          <td>Framework / API</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/cbaillie/QualityAssessmentFramework">https://github.com/cbaillie/QualityAssessmentFramework</a></td>
+          <td>This framework enables quality assessment of a given (RDF) entity based on user-defined requirements. Provenance described using the PROV-O ontology can be included as part of the quality requirement rules, which are described using SPIN-SPARQL rules. At present, quality requirements are based on the Data Quality Management ontology (http://semwebquality.org/dqm-vocabulary/v1/dqm). The framework can capture the provenance of the quality assessment process using PROV-O so that future quality assessments can make decisions about re-using existing quality scores.</td>
+        </tr>
+        <tr id="24">
+          <td>24</td>
+          <th scope="row">Global Change Information System - Information Model and Semantic Application Prototypes</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://tw.rpi.edu/web/project/gcis-imsap">http://tw.rpi.edu/web/project/gcis-imsap</a></td>
+          <td>The Tetherless World Constellation (TWC) at Rensselaer Polytechnic Institute (RPI) proposes to facilitate vocabulary and ontology development within the context of the overall development of semantic prototypes for the National Climate Assessment (NCA) portals, using a combination of environmental inter-agency collaborations in a use-case-focused workshop setting, information modelling, and software development and deployment. The prototypes are intended to provide search and browse options that inspire confidence that all relevant information has been found; data providers will be citable, with detailed provenance generation. Expected deliverables are: information models, vocabulary and ontology services for vetted climate assessment settings, and search/browse prototypes.</td>
+        </tr>
+        <tr id="25">
+          <td>25</td>
+          <th scope="row">OpenUp Prov</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="TSO">TSO</a></td>
+          <td>TSO's implementation of PROV provides an object-oriented model for non-semantic developers to generate provenance RDF in compliance with the PROV specification. <br>
+              The Java-based tool has an integrated Apache Jena 2.7.2 to support RDF serialisation. It can have a triple-store backend, where the provenance trail can be published in a progressive manner.<br>
+              To-do list: digital signature of provenance, PROV validation</td>
+        </tr>
+        <tr id="26">
+          <td>26</td>
+          <th scope="row">APROVeD: Automatic Provenance Derivation</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-N, PROV-JSON</td>
+          <td><a href="http://users.ugent.be/~tdenies/APROVeD/">http://users.ugent.be/~tdenies/APROVeD/</a></td>
+          <td>In this project, we develop a new approach for the automatic derivation of provenance at multiple levels of granularity. To accomplish this, we detect entity derivations, relying on clustering algorithms, linked data and semantic similarity. The resulting derivations are structured in compliance with the Provenance Data Model (PROV-DM). While the proposed approach is purposely kept general, allowing adaptation to many use cases, we provide an implementation for one of these use cases, namely discovering the sources of news articles.</td>
+        </tr>
+        <tr id="27">
+          <td>27</td>
+          <th scope="row">Raw2LD</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://github.com/Data2Semantics/raw2ld">http://github.com/Data2Semantics/raw2ld</a></td>
+          <td>Conversion scripts for converting Adverse Events reports to RDF. Uses PROV to describe a provenance trail of all conversion steps.</td>
+        </tr>
+        <tr id="28">
+          <td>28</td>
+          <th scope="row">PROV-N to Neo4J DB mapping</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-N, Maps to a graph schema for the Neo4J DB.</td>
+          <td><a href="https://github.com/PaoloMissier/PROV_neo4j">https://github.com/PaoloMissier/PROV_neo4j</a></td>
+          <td>A simple Java program to upload a PROV-N encoded PROV instance into a Neo4J database.<br>
+              This core implementation only covers a few basic relations; more should be added.<br>
+              Built on the older 0.0.1-SNAPSHOT of ProvToolbox (see its entry in this table). Needs updating.<br>
+              As this was part of a graduate student project in 2012, fresh resources are needed to upgrade and complete the implementation.</td>
+        </tr>
+        <tr id="29">
+          <td>29</td>
+          <th scope="row">Earth System Science Server</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-XML, PROV-JSON</td>
+          <td><a href="http://es3.eri.ucsb.edu">http://es3.eri.ucsb.edu</a></td>
+          <td>The Earth System Science Server (ES3) is a software environment for data-intensive Earth science. ES3 has unique capabilities for automatically and transparently capturing, managing and reconstructing the provenance of arbitrary, unmodified computational sequences.</td>
+        </tr>
+        <tr id="30">
+          <td>30</td>
+          <th scope="row">prov-api</th>
+          <td>Framework / API</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/dcorsar/prov-api/">https://github.com/dcorsar/prov-api/</a></td>
+          <td>Java API for creating and manipulating provenance graphs. The API currently only implements the PROV core terms.  We have two implementations of the API: one using Jena and one based on SPARQL v1.1.  The Jena implementation supports building and querying a provenance model using an in memory ontology model.  The SPARQL implementation builds a provenance graph by generating a series of SPARQL updates, which can then be used against an appropriate resource (e.g. Jena model, SPARQL v1.1 endpoint); querying is performed by generating SPARQL queries which are executed using a provided query engine.</td>
+        </tr>
+        <tr id="31">
+          <td>31</td>
+          <th scope="row">Policy Reasoning Framework</th>
+          <td>Framework / API</td>
+          <td>PROV-O</td>
+          <td><a href="https://github.com/epignotti/PolicyReasoner">https://github.com/epignotti/PolicyReasoner</a></td>
+          <td>A policy reasoning framework based on OWL and the SPIN-SPARQL reasoner. This framework infers the provenance  of the policy reasoning process using SPARQL rules. The inferred provenance is represented using PROV-O.</td>
+        </tr>
+        <tr id="32">
+          <td>32</td>
+          <th scope="row">Informed Rural Passenger Information Infrastructure</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O</td>
+          <td><a href="http://www.dotrural.ac.uk/irp/index.php?page=software">http://www.dotrural.ac.uk/irp/index.php?page=software</a></td>
+          <td>As part of the Informed Rural Passenger project, we are developing a real-time passenger information system supported by a framework of web services that integrate and use data from various sources (the crowd, open data providers, etc.). PROV-O is used to capture the provenance of various entities within the framework, for the purpose of supporting assessments of data quality and trustworthiness; see http://www.dotrural.ac.uk/irp/index.php?page=software for more details.</td>
+        </tr>
+        <tr id="33">
+          <td>33</td>
+          <th scope="row">PubFlow Provenance Archive</th>
+          <td>Application (consumes / generates provenance)</td>
+          <td>PROV-O, PROV-XML</td>
+          <td><a href="http://www.pubflow.de/en/provenanceArchive">www.pubflow.de/en/provenanceArchive</a></td>
+          <td>The PubFlow provenance archive stores the provenance information collected by the PubFlow research data publication framework. PubFlow is a framework for transferring research data from a local repository to publicly available archives. Whenever PubFlow works on the research data, the corresponding provenance information is collected and stored in the provenance archive.</td>
+        </tr>
+        <tr id="34">
+          <td>34</td>
+          <th scope="row">PROV Python library</th>
+          <td>Framework / API</td>
+          <td>PROV-N, PROV-JSON</td>
+          <td><a href="http://pypi.python.org/pypi/prov">http://pypi.python.org/pypi/prov</a></td>
+          <td>The library provides an implementation of the PROV Data Model in Python. It contains a number of sub-modules:<br>
+              - prov.model: in-memory classes for PROV assertions, JSON serialisation and deserialisation, and PROV-N serialisation<br>
+              - prov.persistence: a Django app for storing and loading ProvBundle instances to/from databases using the Django ORM<br>
+              - prov.tracking: a logging-like module to facilitate tracking provenance in Python programs</td>
         </tr>
       </table>
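Several entries above (e.g. the PROV Python library and the Provenance Server) exchange provenance as PROV-JSON. As a rough illustration of what that encoding looks like, here is a minimal, dependency-free sketch that assembles a tiny PROV-JSON document in plain Python; the `ex:report`, `ex:compile`, and `ex:editor` identifiers are invented for this example and are not taken from any implementation in the table.

```python
import json

# Hypothetical example data: a report entity generated by a compile
# activity, which is associated with an editor agent. PROV-JSON groups
# statements by statement type, keyed by (possibly blank) identifiers.
doc = {
    "prefix": {"ex": "http://example.org/"},
    "entity": {"ex:report": {}},
    "activity": {"ex:compile": {}},
    "agent": {"ex:editor": {}},
    "wasGeneratedBy": {
        "_:gen1": {"prov:entity": "ex:report", "prov:activity": "ex:compile"}
    },
    "wasAssociatedWith": {
        "_:assoc1": {"prov:activity": "ex:compile", "prov:agent": "ex:editor"}
    },
}

print(json.dumps(doc, indent=2))
```

Because the encoding is plain JSON, a document like this can be posted directly to services that accept PROV-JSON, with no PROV-specific tooling required on the client side.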
 
     </section>
 
   <section id="prov-terms">
-    <h2>PROV Language Implementation</h2>
-    <p>This section enumerates the PROV-DM terms [[PROV-DM]] that are consumed (<img src="consume.png" width="27" height="16" alt="Consume Icon" />),
-    produced (<img src="produce.png" width="27" height="16" alt="Produce Icon" />),
-    or both consumed and produced (<img src="conprod.png" width="27" height="16" alt="Consume and Produce Icon" />)
+    <h2>PROV Language Implementation</h2>
+    <p>This section enumerates the PROV-DM terms [[PROV-DM]] that are consumed (<img src="consume.png" width="27" height="16" alt="Consume Icon" />),
+    produced (<img src="produce.png" width="27" height="16" alt="Produce Icon" />),
+    or both consumed and produced (<img src="conprod.png" width="27" height="16" alt="Consume and Produce Icon" />)
    by a particular implementation.</p> Hover over the numbers to see the implementation name.
     <table class="feature-table">
       <caption id="prov-terms-table">Table 2: Coverage of PROV-DM terms in implementations of type Application, Framework / API, or Service.</caption>
@@ -822,4 +1104,4 @@
       <p>TODO: Acknowledgements to people who reported their implementations to the working group.</p> 
     </section> 
   </body>
-</html>
+</html>