...
The set is split into several classes describing the data. At the moment, four classes (Dataset, Catalogue, Resource, and Agent) are mandatory. Each class is populated by a set of mandatory and recommended variables. You can find the descriptions of all variables and classes here: Core Metadata Schema Specification
...
Metadata mapping is the process of establishing connections between corresponding metadata fields or values across different systems. In simple terms, it ensures that metadata described in your own schema is transformed to the HRI metadata schema correctly. It involves identifying similar pieces of metadata information in one system and linking them to the relevant content or data elements in another. This mapping ensures consistency and coherence between disparate datasets or databases, allowing for efficient data integration and interoperability. By associating equivalent metadata fields or values, metadata mapping enables seamless communication and exchange of information between systems, facilitating accurate data discovery, retrieval, and interpretation.
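At its simplest, a metadata mapping can be sketched as a lookup table that translates your local field names into the target schema's field names. The field names below are illustrative assumptions, not the official HRI schema terms:

```python
# A minimal sketch of metadata mapping: a lookup table translating local
# field names into target-schema field names. The names below are
# illustrative, not the official HRI schema terms.

FIELD_MAP = {
    "Sample ID": "identifier",
    "Clinical Diagnosis": "theme",
    "Metadata Created By": "creator",
    "Metadata Creation Date": "issued",
}

def map_metadata(record: dict) -> dict:
    """Rename each known local field to its target-schema equivalent;
    unknown fields are kept under their original name."""
    return {FIELD_MAP.get(key, key): value for key, value in record.items()}

local = {"Sample ID": "BS001", "Clinical Diagnosis": "Hypertension"}
print(map_metadata(local))  # {'identifier': 'BS001', 'theme': 'Hypertension'}
```

A real mapping is rarely a pure rename: some fields need value conversions (dates, controlled vocabularies) and some target fields are built from several source fields, but the lookup-table idea stays the same.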
Below is an example of simple metadata for a blood sample. It describes the important information about the sample, including the sample ID, the patient ID, and a diagnosis:
```
Blood Sample Metadata:
- Sample ID: BS001
- Patient ID: P123456
- Patient Gender: Female
- Clinical Diagnosis: Hypertension
- Storage Conditions: -20°C freezer
- Processing Steps: Centrifugation at 3000 rpm for 10 minutes, aliquoting into 1 mL tubes
- Research Use: Yes
- Consent Status: Informed consent obtained
- Metadata Created By: Lab Technician, Sarah Lee
- Metadata Creation Date: January 15, 2024, 10:00 AM
```
...
To map your metadata, you first need to understand the structure of your metadata, its semantic meaning, and the ontology (vocabulary) used to describe your data in a Resource Description Framework (RDF) format, in our case DCAT v3. The general outline of the mapping pipeline can be found here: https://health-ri.atlassian.net/wiki/spaces/FSD/pages/edit-v2/290291734?draftShareId=ff45a2e2-80ee-49aa-b6d6-c04dedb6f9f8
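As a rough sketch of what the transformation step produces, a mapped record can be rendered as a `dcat:Dataset` in Turtle. The subject IRI and the choice of Dublin Core properties below are illustrative assumptions, not the exact HRI mapping:

```python
# Sketch: render a mapped record as RDF (Turtle) describing a dcat:Dataset.
# The subject IRI and the choice of DCAT/Dublin Core properties are
# illustrative assumptions, not the official HRI mapping.

def to_turtle(record: dict) -> str:
    prefixes = (
        "@prefix dcat: <http://www.w3.org/ns/dcat#> .\n"
        "@prefix dct:  <http://purl.org/dc/terms/> .\n\n"
    )
    subject = f"<https://example.org/dataset/{record['identifier']}>"
    lines = [f"{subject} a dcat:Dataset ;"]
    lines.append(f'    dct:identifier "{record["identifier"]}" ;')
    lines.append(f'    dct:title "{record["title"]}" .')
    return prefixes + "\n".join(lines)

ttl = to_turtle({"identifier": "BS001", "title": "Blood sample BS001"})
print(ttl)
```

In practice this serialization is usually done with an RDF library rather than string templating, but the output is the same kind of Turtle document.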
Next steps
After mapping/transforming your data properties to the classes and variables of the HRI model, you need to validate your model. This step ensures that the new model both accurately represents the original data and adheres to the HRI metadata structure.
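Validation of RDF metadata is typically done against SHACL shapes; as a much simpler stand-in for the idea, the check below verifies that every mandatory property is present in a mapped record (the mandatory list is an assumption for illustration, not the HRI requirement set):

```python
# Sketch of the validation idea: check that a mapped record carries every
# property the target schema marks as mandatory. Real validation of RDF
# metadata is done against SHACL shapes; the property list below is only
# an illustrative assumption.

MANDATORY = {"identifier", "title", "creator"}

def validate(record: dict) -> list[str]:
    """Return the mandatory properties missing from the record."""
    return sorted(MANDATORY - record.keys())

print(validate({"identifier": "BS001", "title": "Blood sample"}))
# ['creator']
```

An empty result means the record passes this minimal check; anything returned points at fields you still need to map.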
Once your RDF data is ready, you can publish it to a FAIR Data Point (FDP), where it can be harvested by the Catalogue. More information about this step can be found here: 3. Exposing metadata
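A FAIR Data Point exposes an HTTP API that accepts RDF. As a hedged sketch (the endpoint URL below is hypothetical, and a real FDP requires authentication and its own resource paths), the publishing request could be prepared like this:

```python
# Sketch of publishing Turtle metadata to a FAIR Data Point over HTTP.
# The endpoint URL is hypothetical, and a real FDP requires authentication
# and specific resource paths; this only prepares the request object.

from urllib import request

ttl = "<https://example.org/dataset/BS001> a <http://www.w3.org/ns/dcat#Dataset> ."

req = request.Request(
    "https://fdp.example.org/dataset",   # hypothetical FDP endpoint
    data=ttl.encode("utf-8"),
    headers={"Content-Type": "text/turtle"},
    method="POST",
)
# request.urlopen(req) would actually send it; omitted here.
print(req.method, req.get_header("Content-type"))  # POST text/turtle
```

Consult the FDP documentation linked above for the actual endpoints, required metadata layers, and authentication before sending anything.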
...