...
This is based on the comment attached to “Actor perspective” written over here and should be discussed and elaborated further. Which steps do you generally need to take? Really drafty.
General Approach
Define FAIR Objectives & deliverables (Guidance: /wiki/spaces/FSD/pages/294584321 Define your objectives for making data FAIR)
Here we assume the goal is to:
Create a new FAIR clinical dataset
Publish this dataset to make it available to others
?
...
Should the communities already come into play here?
Assuming you want to collect the data in a standardised way from the start, build the study using knowledge from the communities, reusing existing eCRFs and codebooks?
(Find out whether work has already been done that you can reuse. Perhaps link to a page about the communities, or to a page about reusing what is already out there?)
Build the eCRFs with the fields of interest in the EDC
Collect the (meta)data (Guidance: see XYZ)
Collect the necessary and desired metadata, following the relevant metadata models (a minimal sketch follows after this step):
Required: Core metadata model
Desired: Health metadata model (is this an extension of the core?)
Desired: Domain-specific model(s) (Petals)
Collect the data
Collect the data in the EDC
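To make the metadata step a bit more concrete: below is a minimal sketch, assuming the core metadata model is DCAT-based (check the guidance linked above for the authoritative list of properties). The dataset URI, title, publisher and keywords are illustrative placeholders, not actual required fields.

```python
# Minimal sketch, assuming the core metadata model is DCAT-based.
# The dataset URI, title, publisher and keywords below are illustrative
# placeholders; take the authoritative property list from the core
# metadata model guidance.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

g = Graph()
g.bind("dcat", DCAT)
g.bind("dcterms", DCTERMS)

dataset = URIRef("https://example.org/dataset/diagnosis-study")  # placeholder identifier

g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Diagnosis study dataset", lang="en")))
g.add((dataset, DCTERMS.description,
       Literal("Clinical Diagnosis data collected in an EDC.", lang="en")))
g.add((dataset, DCTERMS.publisher, URIRef("https://example.org/org/my-institute")))  # placeholder
g.add((dataset, DCAT.keyword, Literal("diagnosis")))
g.add((dataset, DCAT.keyword, Literal("clinical data")))

print(g.serialize(format="turtle"))
```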
...
Publish the dataset (Guidance: XYZ)
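Publishing typically means registering the dataset’s metadata in a catalogue (for example a FAIR Data Point) rather than moving the clinical data itself. A rough sketch, assuming a catalogue that accepts dataset metadata as Turtle over HTTP; the endpoint URL, token and file name are placeholders, not the API of any specific catalogue.

```python
# Rough sketch: registering dataset metadata with a metadata catalogue over
# HTTP. The endpoint URL, bearer token and file name are placeholders; the
# real catalogue (e.g. a FAIR Data Point) will have its own API,
# authentication and required parent catalogue, so check its documentation.
import requests

with open("dataset_metadata.ttl", "rb") as f:  # e.g. the Turtle produced in the sketch above
    turtle_metadata = f.read()

response = requests.post(
    "https://catalogue.example.org/dataset",         # placeholder endpoint
    data=turtle_metadata,
    headers={
        "Content-Type": "text/turtle",
        "Authorization": "Bearer <your-api-token>",  # placeholder credential
    },
    timeout=30,
)
response.raise_for_status()
print("Registered at:", response.headers.get("Location"))
```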
Applied Example
...
Use case: A researcher will collect Diagnosis data in Castor EDC and wants to make this dataset FAIR.
...
Step 2: Build study for data collection
The researcher searches the HRI codebook library for a suitable codebook with Diagnosis items
The researcher builds a Diagnosis eCRF in Castor using suitable tools, reusing those codebook items (a sketch of such a field-to-codebook mapping follows below)
(In case items are missing, does she join something like the clinical community? Maybe she should, but I’m not sure whether most researchers would.)
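To make the reuse step more tangible, here is a hypothetical sketch of recording, per eCRF field, which codebook item (and terminology concept) it was taken from, so the collected data stays interpretable outside the EDC. The field names, codebook identifiers and concept codes are invented for illustration and not taken from an actual HRI codebook.

```python
# Hypothetical sketch: linking each eCRF field to the codebook item and
# terminology concept it reuses. All field names, identifiers and codes
# below are invented for illustration, not taken from an actual HRI codebook.
from dataclasses import dataclass

@dataclass
class FieldMapping:
    ecrf_field: str     # variable name as defined in the EDC
    codebook_item: str  # identifier of the reused codebook item
    concept_code: str   # terminology binding (e.g. a SNOMED CT code)
    label: str

diagnosis_ecrf = [
    FieldMapping("diag_primary", "codebook:diagnosis-primary", "SCTID-placeholder-1", "Primary diagnosis"),
    FieldMapping("diag_date", "codebook:diagnosis-date", "SCTID-placeholder-2", "Date of diagnosis"),
    FieldMapping("diag_certainty", "codebook:diagnosis-certainty", "SCTID-placeholder-3", "Diagnosis certainty"),
]

for field in diagnosis_ecrf:
    print(f"{field.ecrf_field}: {field.label} -> {field.codebook_item} ({field.concept_code})")
```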