Filter
  • Subjects
  • Content Types
  • Countries
  • AID systems
  • API
  • Certificates
  • Data access
  • Data access restrictions
  • Database access
  • Database access restrictions
  • Database licenses
  • Data licenses
  • Data upload
  • Data upload restrictions
  • Enhanced publication
  • Institution responsibility type
  • Institution type
  • Keywords
  • Metadata standards
  • PID systems
  • Provider types
  • Quality management
  • Repository languages
  • Software
  • Syndications
  • Repository types
  • Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms and give them priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
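The operator set above resembles Elasticsearch-style query-string syntax. As a minimal sketch of how the operators combine, the following helper functions are hypothetical (not part of any re3data API) and simply assemble query strings in the notation documented above:

```python
# Hypothetical helpers for composing search queries from the operators
# listed above. Assumption: the search box accepts Elasticsearch-style
# query-string syntax; these functions only build strings in that notation.

def phrase(text):
    """Wrap a multi-word term in quotes for an exact-phrase search."""
    return f'"{text}"'

def wildcard(prefix):
    """A trailing * matches any completion of the keyword."""
    return f"{prefix}*"

def fuzzy(word, distance=1):
    """~N after a word allows up to N character edits (fuzziness)."""
    return f"{word}~{distance}"

def all_of(*terms):
    """+ joins terms with AND (the default operator)."""
    return " + ".join(terms)

def any_of(*terms):
    """| joins terms with OR; parentheses give the group priority."""
    return "(" + " | ".join(terms) + ")"

def exclude(term):
    """- negates a term (NOT)."""
    return f"-{term}"

# Example: match repositories on "isotop*" or a fuzzy spelling of
# "archaeology", while excluding an exact phrase:
query = all_of(any_of(wildcard("isotop"), fuzzy("archaeology", 1)),
               exclude(phrase("test repository")))
print(query)  # (isotop* | archaeology~1) + -"test repository"
```

The same string could be typed directly into the search field; the helpers only make the operator precedence explicit.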
Found 32 result(s)
IsoArcH is an open access isotope web-database for bioarchaeological samples from prehistoric and historical periods all over the world. With 40,000+ isotope-related data points obtained on 13,000+ specimens (i.e., humans, animals, plants and organic residues) coming from 500+ archaeological sites, IsoArcH is now one of the world's largest repositories for isotopic data and metadata deriving from archaeological contexts. IsoArcH makes it possible to launch big data initiatives, and it also highlights research gaps in certain regions or time periods. Among other things, it supports the creation of sound baselines, the undertaking of multi-scale analyses, and the realization of extensive studies and syntheses on various research issues such as paleodiet, food production, resource management, migrations, paleoclimate and paleoenvironmental changes.
For datasets big and small: store your research data online. Quickly and easily upload files of any type, and we will host your research data for you. Your experimental research data will have a permanent home on the web that you can refer to.
OLOS is a Swiss-based data management portal tailored for researchers and institutions. Powerful yet easy to use, OLOS works with most tools and formats across all scientific disciplines to help researchers safely manage, publish and preserve their data. The solution was developed as part of a larger project focusing on Data Life Cycle Management (dlcm.ch) that aims to develop various services for research data management. Thanks to its highly modular architecture, OLOS can be adapted both to small institutions that need a "turnkey" solution and to larger ones that can rely on OLOS to complement what they have already implemented. OLOS is compatible with all formats in use in the different scientific disciplines and is based on modern technology that interconnects with researchers' environments (such as Electronic Laboratory Notebooks or Laboratory Information Management Systems).
As a member of SWE-CLARIN, the Humanities Lab will provide tools and expertise related to language archiving and corpus and (meta)data management, with a continued emphasis on multimodal corpora, many of which contain Swedish resources, as well as other (often endangered) languages and multilingual or learner corpora. As a CLARIN K-centre, we provide advice on multimodal and sensor-based methods, including EEG, eye-tracking, articulography, virtual reality, motion capture, and audio-visual recording. Current work targets automatic data retrieval from multimodal data sets, as well as the linking of measurement data (e.g. EEG, fMRI) or geo-demographic data (GIS, GPS) to language data (audio, video, text, annotations). We also provide assistance with speech and language technology related matters to various projects. A primary resource in the Lab is the Humanities Lab corpus server, containing a varied set of multimodal language corpora with standardised metadata and linked layers of annotations and other resources.
To address the multidisciplinary, broad-scale nature of empirical educational research in the Federal Republic of Germany, a networked research data infrastructure is required which brings together disparate services from different research data providers, delivering services to researchers in a usable, needs-oriented way. The Verbund Forschungsdaten Bildung (Educational Research Data Alliance, VFDB) therefore aims to cooperate with relevant actors from science, politics and research funding institutes to set up a powerful infrastructure for empirical educational research. This service is meant to adequately capture the specific needs of the scientific communities and support empirical educational research in carrying out excellent research.
The EUR Data Repository (EDR) is the institutional data repository of Erasmus University Rotterdam. The EUR Data Repository is an online platform where you can showcase your research and make it findable, citable, and reusable for others.
MIDRC aims to develop a high-quality repository for medical images related to COVID-19 and associated clinical data, and develop and foster medical image-based artificial intelligence (AI) for use in the detection, diagnosis, prognosis, and monitoring of COVID-19.
The mission of the World Data Center for Climate (WDCC) is to provide central support for the German and European climate research community. The WDCC is a member of the ISC's World Data System. Emphasis is on the development and implementation of best practice methods for Earth System data management. Data for and from climate research are collected, stored and disseminated. The WDCC is restricted to data products. Cooperations exist with thematically corresponding data centres of, e.g., earth observation, meteorology, oceanography, paleoclimate and environmental sciences. The services of WDCC are also available to external users at cost price. A special service for the direct integration of research data in scientific publications has been developed. The editorial process at WDCC ensures the quality of metadata and research data in collaboration with the data producers. A citation code and a digital identifier (DOI) are provided and registered together with citation information at the DOI registration agency DataCite.
Note (checked 20.03.2017): SumsDB was offline; for more information and the archive see http://brainvis.wustl.edu/sumsdb/ SumsDB (the Surface Management System DataBase) is a repository of brain-mapping data (surfaces & volumes; structural & functional data) from many laboratories.
GESIS preserves (mainly quantitative) social research data to make it available to the scientific research community. The data is described in a standardized way, secured for the long term, provided with a permanent identifier (DOI), and can be easily found and reused through browser-optimized catalogs (https://search.gesis.org/).
UltraViolet is part of a suite of repositories at New York University that provide a home for research materials, operated as a partnership of the Division of Libraries and NYU IT's Research and Instruction Technology. UltraViolet provides faculty, students, and researchers within our university community with a place to deposit scholarly materials for open access and long-term preservation. UltraViolet also houses some NYU Libraries collections, including proprietary data collections.
DIAS aims to collect and store earth observation data; to analyze such data in combination with socio-economic data and convert it into information useful for crisis management with respect to global-scale environmental disasters and other threats; and to make this information available within Japan and overseas.
EMSC collects real-time parametric data (source parameters and phase pickings) provided by 65 seismological networks of the Euro-Mediterranean region. These data are provided to the EMSC either by email or via QWIDS (Quake Watch Information Distribution System, developed by ISTI). The collected data are automatically archived in a database, made available via an autoDRM, and displayed on the website. The collected data are automatically merged to produce automatic locations, which are sent to several seismological institutes in order to perform quick moment tensor determinations.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, it is a "dark archive" without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich's Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
The KNB Data Repository is an international repository intended to facilitate ecological, environmental and earth science research in the broadest sense. For scientists, the KNB Data Repository is an efficient way to share, discover, access and interpret complex ecological, environmental, earth science, and sociological data and the software used to create and manage those data. Due to the rich contextual information provided with data in the KNB, scientists are able to integrate and analyze data with less effort. The data originate from a highly distributed set of field stations, laboratories, research sites, and individual researchers. The KNB supports rich, detailed metadata to promote data discovery as well as automated and manual integration of data into new projects. The KNB supports a rich set of modern repository services, including the ability to assign Digital Object Identifiers (DOIs) so data sets can be confidently referenced in any publication, the ability to track the versions of datasets as they evolve through time, and metadata to establish the provenance relationships between source and derived data.
JCVI is a world leader in genomic research. The Institute studies the societal implications of genomics in addition to genomics itself. The Institute's research involves genomic medicine; environmental genomic analysis; clean energy; synthetic biology; and ethics, law, and economics.
The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is a team of researchers, data specialists and computer system developers who are supporting the development of a data management system to store scientific data generated by Gulf of Mexico researchers. The Master Research Agreement between BP and the Gulf of Mexico Alliance that established the Gulf of Mexico Research Initiative (GoMRI) included provisions that all data collected or generated through the agreement must be made available to the public. The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is the vehicle through which GoMRI is fulfilling this requirement. The mission of GRIIDC is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico Ecosystem.
NAKALA is a repository dedicated to SSH research data in France. Given its generalist and multi-disciplinary nature, all types of data are accepted, although certain formats are recommended to ensure long-term data preservation. It has been developed and is hosted by Huma-Num, the French national research infrastructure for digital humanities.
BOARD (Bicocca Open Archive Research Data) is the institutional data repository of the University of Milano-Bicocca. BOARD is an open, free-to-use research data repository, which enables members of the University of Milano-Bicocca to make their research data publicly available. By depositing their research data in BOARD, researchers can:
  • Make their research data citable
  • Share their data privately or publicly
  • Ensure long-term storage for their data
  • Keep access to all versions
  • Link their article to their data