Open Science Framework (OSF)

This research guide is based on the University of Washington's "Open Science Framework (OSF)" research guide, created by Jenny Muilenburg.

OSF Best Practices

OSF Support provides guides on best practices for using OSF in your research workflow. Topics include:

  • File management: naming files consistently and organizing them into a clear folder structure.
  • Version control: good version control leads to more efficient collaboration and more accurate research results. OSF provides built-in version control for every file stored in a project.
  • Data management: best practices include creating a data dictionary and sharing data and other research outputs.
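Consistent file naming is one of the simplest file-management practices to automate. As a minimal sketch (the function name, project slug, and naming pattern below are hypothetical examples, not an OSF convention), a small helper can build sortable file names that combine an ISO date, a project label, a short description, and a zero-padded version number:

```python
from datetime import date

def versioned_filename(project, description, version, ext, when=None):
    """Build a descriptive, alphabetically sortable file name,
    e.g. '2024-05-01_osf-demo_survey-data_v02.csv'.
    ISO dates and zero-padded version numbers keep files in
    chronological and version order when sorted by name."""
    when = when or date.today()
    return f"{when.isoformat()}_{project}_{description}_v{version:02d}.{ext}"

# Hypothetical usage:
name = versioned_filename("osf-demo", "survey-data", 2, "csv",
                          when=date(2024, 5, 1))
# name == "2024-05-01_osf-demo_survey-data_v02.csv"
```

Zero-padding the version (`v02` rather than `v2`) matters because plain alphabetical sorting would otherwise place `v10` before `v2`.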

Research Data Management

The University of Maryland Libraries Research Data Services guide can help you manage your research data.

Our librarians provide expert guidance, project consultation, and technical assistance on various aspects of data management and curation. You can consult us about data management planning, data sharing and publishing, long-term preservation, and related topics. In some cases, we can provide data architecture and governance, data transformation and manipulation, and custom software development.

If you need to write a data-management or -sharing plan for a grant proposal or journal submission, please see our guide to data management plans.

Organization

Why should you organize your data?

The organizational structure of your data can help you easily locate files when revisiting a past project and can help secondary users find, identify, select, and obtain the data they require.

How do you organize your data?

For best results, model your data structure fully, from top to bottom and beginning to end, during the planning phase of a project.

You'll want to devise ways to express the following:

  • The context of data collection: project history, aims, objectives, and hypotheses
  • Data collection methods: sampling, the data collection process, instruments, hardware and software used, scale and resolution, temporal and geographic coverage, and any secondary data sources used
  • Dataset structure: data files, study cases, and the relationships between files
  • Data validation, checking, proofing, cleaning, and other quality assurance procedures carried out
  • Changes made to the data over time since their original creation, and identification of different versions of data files
  • Information on access and use conditions, or data confidentiality
(adapted from the UK Data Archive, UKDA)
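Much of the dataset-level documentation above is often captured in a data dictionary: a table with one row per variable recording its name, type, units, allowed values, and meaning. As a minimal sketch (the variable names, fields, and output filename below are hypothetical examples), a data dictionary can be written as a CSV file that travels alongside the data:

```python
import csv

# Illustrative data dictionary: one row per variable in the dataset.
# All variables and values here are hypothetical.
data_dictionary = [
    {"variable": "subject_id", "type": "string", "units": "",
     "allowed_values": "S001-S999",
     "description": "Anonymized participant identifier"},
    {"variable": "visit_date", "type": "date (ISO 8601)", "units": "",
     "allowed_values": "2023-01-01 to 2024-12-31",
     "description": "Date of study visit"},
    {"variable": "score", "type": "integer", "units": "points",
     "allowed_values": "0-100",
     "description": "Total assessment score"},
]

# Write the dictionary as a CSV so it can be opened in any
# spreadsheet program and archived next to the data files.
with open("data_dictionary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(data_dictionary[0].keys()))
    writer.writeheader()
    writer.writerows(data_dictionary)
```

Keeping the dictionary in a plain-text format such as CSV means secondary users can read it without special software, which supports the goal of making data findable and usable by others.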