  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
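
For example, the following illustrative queries (hypothetical search terms, used only to demonstrate the operators listed above) could be entered into the search field:

  • oceanograph* finds oceanography, oceanographic, etc.
  • "research data" +repository requires the exact phrase and the term repository
  • genomics | proteomics matches either term
  • climate -model matches climate but excludes model
  • (soil | water) +quality uses parentheses to set precedence
  • managment~1 tolerates one edit, so it still matches management (fuzziness)
  • "data repository"~2 allows up to two intervening words (slop)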
Found 17 result(s)
The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is a team of researchers, data specialists and computer system developers who support the development of a data management system to store scientific data generated by Gulf of Mexico researchers. The Master Research Agreement between BP and the Gulf of Mexico Alliance that established the Gulf of Mexico Research Initiative (GoMRI) included provisions that all data collected or generated through the agreement must be made available to the public; GRIIDC is the vehicle through which GoMRI is fulfilling this requirement. The mission of GRIIDC is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem.
The Universitat de Barcelona Digital Repository is an institutional resource containing open-access digital versions of publications related to the teaching, research and institutional activities of the UB's teaching staff and other members of the university community, including research data.
The UC San Diego Library Digital Collections website gathers two categories of content managed by the Library: library collections (including digitized versions of selected collections covering topics such as art, film, music, history and anthropology) and research data collections (including research data generated by UC San Diego researchers).
Research Data Online (RDO) provides access to research datasets held at the University of Western Australia. RDO is managed by the University Library. The information about each dataset has been provided by UWA research groups. Information about the datasets in this service is automatically harvested into Research Data Australia (RDA: https://researchdata.ands.org.au/).
The RADAR service offers the ability to search research data descriptions of the Natural Resources Institute Finland (Luke). The service includes descriptions of research data for the agriculture, forestry and food sectors, game management, fisheries and the environment. The public web service aims to make the subjects of natural resources studies easier to discover. In addition to Luke's research data descriptions, one can also search the metadata of the Finnish Environment Institute (SYKE): the interface between the Luke and SYKE metadata services combines Luke's research data descriptions and SYKE's descriptions of spatial datasets and data systems into a unified search service.
Edmond is the institutional repository of the Max Planck Society for public research data. It enables Max Planck scientists to create citable scientific assets by describing, enriching, sharing, exposing, linking, publishing and archiving research data of all kinds. A unique feature of Edmond is its dedicated metadata management, which supports a non-restrictive metadata schema definition, as simple as you like or as complex as your parameters require. Furthermore, all objects within Edmond have a unique identifier and can therefore be clearly referenced in publications or reused in other contexts.
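As an illustration of what such a non-restrictive schema permits, the short Python sketch below contrasts a minimal and a more detailed record; the field names are hypothetical and do not reflect Edmond's actual metadata schema.

# Hypothetical metadata records illustrating a non-restrictive schema:
# a repository of this kind could accept both the minimal and the detailed form.

minimal_record = {
    "title": "Optical measurements, run 42",
    "creator": "Example, A.",
}

detailed_record = {
    "title": "Optical measurements, run 42",
    "creator": "Example, A.",
    "description": "Raw interferometer output, 2024-05 campaign",
    "instrument": {"name": "Interferometer X", "wavelength_nm": 632.8},
    "license": "CC BY 4.0",
    "related_publication": "doi:10.xxxx/example",  # placeholder identifier
}

# Both records are valid under a schema that fixes only a few core fields
# and leaves the rest open to whatever parameters a dataset requires.
for record in (minimal_record, detailed_record):
    print(record["title"], "-", len(record), "fields")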
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, documents or images. It serves as the backbone of data curation and, for most of its content, it is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. A direct data upload into the ETH Data Archive has to be considered in the following cases:
- Software code has to be uploaded and registered according to ETH transfer’s requirements for Software Disclosure.
- A substantial number of files has to be submitted regularly for long-term archiving and/or publishing and browser-based upload is not an option: the ETH Data Archive may offer automated data and metadata transfers from source applications (e.g. from a LIMS) via API (see the sketch below).
- Files for a project on a local computer have to be collected and described with metadata before uploading to the ETH Data Archive: for this, the local file editor docuteam packer is provided. Docuteam packer allows the depositor to structure, describe and organise data for an upload into the ETH Data Archive, and the depositor decides when submission is due.
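The ETH Data Archive's actual transfer interface is not documented here; the following minimal Python sketch only illustrates the general pattern of an automated metadata-plus-file deposit from a source application, with a hypothetical endpoint, token and field names standing in for whatever the archive actually expects.

import json
import requests  # third-party HTTP library

# Hypothetical deposit endpoint and token: placeholders, not the real ETH Data Archive API.
ARCHIVE_URL = "https://archive.example.org/api/deposits"
API_TOKEN = "replace-with-issued-token"

# Metadata exported from a source application (e.g. a LIMS); fields are illustrative only.
metadata = {
    "title": "Measurement series 2024-05",
    "creator": "Example Lab",
    "description": "Automated nightly export from the LIMS",
}

def deposit(file_path: str) -> str:
    """Send one file plus its metadata to the (hypothetical) archive endpoint."""
    with open(file_path, "rb") as handle:
        response = requests.post(
            ARCHIVE_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            files={"file": handle},
            data={"metadata": json.dumps(metadata)},
        )
    response.raise_for_status()
    return response.json().get("id", "")  # assumed to return a deposit identifier

if __name__ == "__main__":
    print(deposit("run_2024_05_01.csv"))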
The Africa Centre offers longitudinal datasets from a rural population in KwaZulu-Natal, South Africa, where HIV prevalence is extremely high. The data may be filtered by demographics, years, or individual questionnaires. The Africa Centre requests that anyone downloading the data notify them. Since January 2000, the Africa Centre for Population Health has built up an extensive longitudinal database of demographic, social, medical and economic information about the members of its Demographic Surveillance Area, which is situated in a rural area of northern KwaZulu-Natal. From this database it has developed the following suite of datasets, which can be used both internally within the organisation and by other researchers.
The Allele Frequency Net Database (AFND) is a public database containing frequency information for several immune genes, such as Human Leukocyte Antigens (HLA), Killer-cell Immunoglobulin-like Receptors (KIR), Major histocompatibility complex class I chain-related (MIC) genes, and a number of cytokine gene polymorphisms. AFND provides a central source, freely available to all, for the storage of allele frequencies from different polymorphic regions of the human genome. Users can contribute the results of their work to one common database and can perform searches on information already available. Data are currently collected in allele, haplotype and genotype format; the success of the website depends on users continuing to contribute their data.
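As a reminder of how allele frequencies relate to the genotype-format data mentioned above, the short Python sketch below derives the frequency of an allele A from genotype counts; the counts are made up purely for illustration.

# Hypothetical genotype counts for a biallelic locus (A/a) in a sample of N individuals.
n_AA, n_Aa, n_aa = 360, 480, 160
N = n_AA + n_Aa + n_aa  # 1000 individuals, i.e. 2000 allele copies

# Each AA individual carries two copies of A, each Aa individual carries one.
freq_A = (2 * n_AA + n_Aa) / (2 * N)
freq_a = 1 - freq_A

print(f"f(A) = {freq_A:.3f}, f(a) = {freq_a:.3f}")  # f(A) = 0.600, f(a) = 0.400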
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices.
- Document and archive studies: move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or simply forgetting where they were put.
- Share and find materials: with a click, make study materials public so that other researchers can find, use and cite them, and find materials by other researchers to avoid reinventing something that already exists.
- Detail individual contribution: assign citable contributor credit to any research material - tools, analysis scripts, methods, measures, data.
- Increase transparency: make as much of the scientific workflow public as desired, as it is developed or after publication of reports. Public projects can be browsed on the site.
- Registration: registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection. Public registrations can likewise be discovered on the site.
- Manage scientific workflow: a structured, flexible system can provide efficiency gains to the workflow and clarity to project objectives.
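OSF content is also accessible programmatically through its public REST API (https://api.osf.io/v2/); the short Python sketch below assumes the /nodes/ endpoint and its JSON:API response layout and simply lists the titles of some public projects.

import requests  # third-party HTTP library

# Query the public OSF API for nodes (projects); field names assume the v2 JSON:API layout.
response = requests.get("https://api.osf.io/v2/nodes/")
response.raise_for_status()

for node in response.json()["data"]:
    print(node["id"], "-", node["attributes"]["title"])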
Academic Commons is a freely accessible digital collection of research and scholarship produced at Columbia University or one of its affiliate institutions (Barnard College, Teachers College, Union Theological Seminary, and Jewish Theological Seminary). The mission of Academic Commons is to collect and preserve the digital outputs of research and scholarship produced at Columbia and its affiliate institutions and present them to a global audience. Academic Commons accepts articles, dissertations, research data, presentations, working papers, videos, and more.
openLandscapes is an open-access information portal for landscape research. Amongst other things, the platform provides information about current research projects. In addition, it offers the scientific community the possibility to maintain a wiki on landscape-related content and, in the future, to make primary data from landscape research available. In openLandscapes, all technical content is stored and organised in a networked manner, enabling technical terms to be linked to experts, to institutions and, in the future, to data.