  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply priority (grouping)
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
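These operators can be combined into a single query string. The sketch below is purely illustrative: the keywords are hypothetical examples, and only the operator syntax listed above is assumed.

```python
# Illustrative query strings using the search syntax described above.
# The keywords themselves are hypothetical examples.
queries = [
    'ocean*',                        # wildcard: matches ocean, oceans, oceanography, ...
    '"research data"',               # phrase search for the exact phrase
    'climate + model',               # AND search (also the default between terms)
    'neuroscience | neuroimaging',   # OR search: either term may match
    'data - software',               # NOT: exclude results containing "software"
    '(ocean | marine) + biology',    # parentheses set evaluation priority
    'oceonography~2',                # fuzziness: up to 2 character edits from the term
    '"ocean data"~3',                # slop: phrase terms may be up to 3 positions apart
]

for q in queries:
    print(q)
```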
Found 55 results
The German Neuroinformatics Node's data infrastructure (GIN) services provide a platform for comprehensive and reproducible management and sharing of neuroscience data. Building on well-established versioning technology, GIN offers the power of a web-based repository management service combined with distributed file storage. The service addresses the full range of research data workflows, from data analysis on the local workstation to remote collaboration and data publication.
SODHA is the federal Belgian data archive for social sciences and the digital humanities. SODHA is a new service of the State Archives of Belgium and acts as the Belgian service provider for the Consortium of European Social Science Data Archives (CESSDA).
The World Ocean Database (WOD) is a collection of scientifically quality-controlled ocean profile and plankton data that includes measurements of temperature, salinity, oxygen, phosphate, nitrate, silicate, chlorophyll, alkalinity, pH, pCO2, TCO2, Tritium, Δ13Carbon, Δ14Carbon, Δ18Oxygen, Freon, Helium, Δ3Helium, Neon, and plankton. WOD contains all data of "World Data Service Oceanography" (WDS-Oceanography).
Note: this repository is no longer available. The programme "International Oceanographic Data and Information Exchange" (IODE) of the "Intergovernmental Oceanographic Commission" (IOC) of UNESCO was established in 1961. Its purpose is to enhance marine research, exploitation and development by facilitating the exchange of oceanographic data and information between participating Member States, and by meeting the needs of users for data and information products.
Strong-motion data of engineering and scientific importance from the United States and other seismically active countries are served through the Center for Engineering Strong Motion Data (CESMD). The CESMD now automatically posts strong-motion data from an increasing number of seismic stations in California within a few minutes following an earthquake as an Internet Quick Report (IQR). As appropriate, IQRs are updated by more comprehensive Internet Data Reports that include reviewed versions of the data and maps showing, for example, the finite fault rupture along with the distribution of recording stations. Automated processing of strong-motion data will be extended to post the strong-motion records of the regional seismic networks of the Advanced National Seismic System (ANSS) outside California.
The Information Marketplace for Policy and Analysis of Cyber-risk & Trust (IMPACT) program supports global cyber risk research & development by coordinating, enhancing and developing real world data, analytics and information sharing capabilities, tools, models, and methodologies. In order to accelerate solutions around cyber risk issues and infrastructure security, IMPACT makes these data sharing components broadly available as national and international resources to support the three-way partnership among cyber security researchers, technology developers and policymakers in academia, industry and the government.
GEOMAR Helmholtz Centre for Ocean Research Kiel is one of the leading marine science institutions in Europe. GEOMAR investigates the chemical, physical, biological, and geological processes in the oceans, as well as their interactions with the seafloor and the atmosphere. OceanRep is an open access digital collection containing the research output of GEOMAR staff and students. Included are journal articles, conference papers, book chapters, theses and more, with full text where available. Research data are linked to the publication entries.
The ESCAPE Open-source Scientific Software and Service Repository (OSSR) is a sustainable open-access repository to share scientific software, services and datasets to the astro-particle-physics-related communities and enable open science. It is built as a curated Zenodo community (https://zenodo.org/communities/escape2020) integrated with several tools to enable a complete software life-cycle. The ESCAPE Zenodo community welcomes entries that support the software and service projects in the OSSR such as user-support documentation, tutorials, presentations and training activities. It also encourages the archival of documents and material that disseminate and support the goals of ESCAPE.
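Because the OSSR is implemented as a curated Zenodo community, its entries can be listed programmatically through Zenodo's public REST API. The following is a minimal sketch, assuming the standard /api/records endpoint and its communities filter; the response field names follow Zenodo's documented JSON layout and should be checked against the current API documentation.

```python
import requests

# List a few entries from the ESCAPE OSSR Zenodo community (escape2020).
# Endpoint and parameters follow Zenodo's public REST API; treat the
# response field names as assumptions to verify against the API docs.
resp = requests.get(
    "https://zenodo.org/api/records",
    params={"communities": "escape2020", "size": 5},
    timeout=30,
)
resp.raise_for_status()

for hit in resp.json().get("hits", {}).get("hits", []):
    metadata = hit.get("metadata", {})
    print(metadata.get("title"), "-", hit.get("doi", "no DOI"))
```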
Specification Patterns is an online repository for information about property specification for finite-state verification. The intent of this repository is to collect patterns that occur commonly in the specification of concurrent and reactive systems.
DataverseNO (https://dataverse.no) is a curated, FAIR-aligned national generic repository for open research data from all academic disciplines. DataverseNO is committed to ensuring that published data remain accessible and (re)usable in the long term. The repository is owned and operated by UiT The Arctic University of Norway. DataverseNO accepts submissions primarily from researchers at Norwegian research institutions. Datasets in DataverseNO are grouped into institutional collections as well as special collections. The technical infrastructure of the repository is based on the open source application Dataverse (https://dataverse.org), which is developed by an international developer and user community led by Harvard University.
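Since DataverseNO runs on the Dataverse software, its published holdings can be queried with the Dataverse Search API. A minimal sketch, assuming the standard /api/search endpoint that Dataverse installations expose; the search term is an arbitrary example and the response fields should be verified against the Dataverse API guide.

```python
import requests

# Search published datasets in DataverseNO via the standard Dataverse
# Search API. The query term "climate" is an arbitrary example.
resp = requests.get(
    "https://dataverse.no/api/search",
    params={"q": "climate", "type": "dataset", "per_page": 5},
    timeout=30,
)
resp.raise_for_status()

for item in resp.json().get("data", {}).get("items", []):
    print(item.get("name"), "-", item.get("global_id"))
```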
Merritt is a curation repository for the preservation of and access to the digital research data of the ten-campus University of California system and external project collaborators. Merritt is supported by the University of California Curation Center (UC3) at the California Digital Library (CDL). While Merritt itself is content agnostic, accepting digital content regardless of domain, format, or structure, it is being used for management of research data, and it forms the basis for a number of domain-specific repositories, such as the ONEShare repository for earth and environmental science and the DataShare repository for life sciences. Merritt provides persistent identifiers, storage replication, fixity auditing, complete version history, a REST API, a comprehensive metadata catalog for discovery, ATOM-based syndication, curatorially defined collections, access control rules, and data use agreements (DUAs). Merritt content upload and download may each be curatorially designated as public or restricted. Merritt DOIs are provided by UC3's EZID service, which is integrated with DataCite. All DOIs and associated metadata are automatically registered with DataCite and are harvested by Ex Libris PRIMO and the Thomson Reuters Data Citation Index (DCI) for high-level discovery. Merritt is also a member node in the DataONE network; curatorially designated data submitted to Merritt are automatically registered with DataONE for additional replication and federated discovery through the ONEMercury search/browse interface.
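Because Merritt DOIs and their metadata are registered with DataCite, the descriptive record for a deposited object can be retrieved from DataCite's public REST API once its DOI is known. A minimal sketch; the DOI below is a placeholder rather than a real Merritt identifier, and the attribute names follow DataCite's documented JSON:API response format.

```python
import requests

# Look up a DOI's registered metadata in DataCite's public REST API.
# Replace the placeholder with an actual Merritt-minted DOI.
doi = "10.xxxx/example"  # placeholder, not a real identifier

resp = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
resp.raise_for_status()

attributes = resp.json()["data"]["attributes"]
titles = [t.get("title") for t in attributes.get("titles", [])]
print(titles, attributes.get("url"), attributes.get("publisher"))
```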
Neuroimaging Tools and Resources Collaboratory (NITRC) is currently a free one-stop-shop environment for science researchers who need resources such as neuroimaging analysis software, publicly available data sets, and computing power. Since its debut in 2007, NITRC has helped the neuroscience community use software and data produced by research that, before NITRC, were routinely lost or disregarded, to make further discoveries. NITRC provides free access to data and enables pay-per-use cloud-based access to unlimited computing power, enabling worldwide scientific collaboration with minimal startup effort and cost. With NITRC and its components—the Resources Registry (NITRC-R), Image Repository (NITRC-IR), and Computational Environment (NITRC-CE)—a researcher can obtain pilot or proof-of-concept data to validate a hypothesis for a few dollars.
Kaggle is a platform for predictive modelling and analytics competitions in which statisticians and data miners compete to produce the best models for predicting and describing the datasets uploaded by companies and users. This crowdsourcing approach relies on the fact that there are countless strategies that can be applied to any predictive modelling task, and it is impossible to know beforehand which technique or analyst will be most effective.
OpenAgrar is an open access repository which publishes, stores, archives and distributes publications, publication references and research data. Its resources can be searched and used by everyone. It contains amongst others theses, reports, conference proceedings, journal articles, books, institutional documents, research datasets, videos and interviews.
Unidata's mission is to provide the data services, tools, and cyberinfrastructure leadership that advance earth-system science, enhance educational opportunities, and broaden participation. Unidata's main RAMADDA server provides access to a variety of datasets, including the full IDD feed, Case Studies, and other project data. RAMADDA is a content management system for Earth Science data. More information is available here: https://ramadda.org/?
The National High Energy Physics Science Data Center (NHEPSDC) is a repository for high-energy physics. In 2019, it was designated as a national-level scientific data center by the Ministry of Science and Technology of China (MOST). NHEPSDC is constructed and operated by the Institute of High Energy Physics (IHEP) of the Chinese Academy of Sciences (CAS). NHEPSDC consists of a main data center in Beijing, a branch center in the Guangdong-Hong Kong-Macao Greater Bay Area, and a branch center in the Huairou District of Beijing. The mission of NHEPSDC is to provide services for data collection, archiving, long-term preservation, access and sharing, software tools, and data analysis. The services of NHEPSDC are mainly for high-energy physics and related scientific research activities. The data collected can be roughly divided into two categories: raw data from large scientific facilities, and data generated by general scientific and technological projects (usually supported by government funding), hereafter referred to as generic data. More than 70 people currently work at NHEPSDC: 18 in high-energy physics, 17 in computer science, 15 in software engineering, 20 in data management, and a number of other operations engineers. NHEPSDC is equipped with a hierarchical storage system, high-performance computing power, high-bandwidth domestic and international network links, and a professional service support system. In the past three years, the average data increment has been about 10 PB per year. By integrating data resources with the IT environment, a state-of-the-art data processing platform is provided to users for scientific research; the volume of data accessed each year exceeds 400 PB, with more than 10 million visits.
HIstome: The Histone Infobase is a database of human histones, their post-translational modifications, and modifying enzymes. HIstome is a combined effort of researchers from two institutions: the Advanced Center for Treatment, Research and Education in Cancer (ACTREC), Navi Mumbai, and the Center of Excellence in Epigenetics, Indian Institute of Science Education and Research (IISER), Pune.
Swedish National Data Service (SND) is a research data infrastructure designed to assist researchers in preserving, maintaining, and disseminating research data in a secure and sustainable manner. The SND Search function makes it easy to find, use, and cite research data from a variety of scientific disciplines. Together with an extensive network of almost 40 Swedish higher education institutions and other research organisations, SND works for increased access to research data, nationally as well as internationally.
The Biological and Chemical Oceanography Data Management Office (BCO-DMO) is a publicly accessible earth science data repository created to curate, publicly serve (publish), and archive digital data and information from biological, chemical, and biogeochemical research conducted in coastal, marine, Great Lakes, and laboratory environments. The BCO-DMO repository works closely with investigators funded through the NSF OCE Division's Biological and Chemical Sections and the Division of Polar Programs Antarctic Organisms & Ecosystems. The office provides services that span the full data life cycle, from data management planning support and DOI creation to archiving with appropriate national facilities.
The Earth System Grid Federation (ESGF) is an international collaboration with a current focus on serving the World Climate Research Programme's (WCRP) Coupled Model Intercomparison Project (CMIP) and supporting climate and environmental science in general. Data are searchable and available for download at the Federated ESGF-CoG Nodes: https://esgf.llnl.gov/nodes.html
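ESGF datasets are discoverable through the federation's search API, which the index nodes expose as a RESTful endpoint returning Solr-style JSON. A minimal sketch, assuming the esg-search endpoint on the LLNL index node; node URLs and parameter names change over time, so verify them against the current ESGF search documentation.

```python
import requests

# Query the ESGF federated search index for CMIP6 datasets.
# The index node URL is one of several and may change over time.
resp = requests.get(
    "https://esgf-node.llnl.gov/esg-search/search",
    params={
        "project": "CMIP6",
        "type": "Dataset",
        "limit": 5,
        "format": "application/solr+json",
    },
    timeout=60,
)
resp.raise_for_status()

for doc in resp.json().get("response", {}).get("docs", []):
    print(doc.get("id"))
```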
The Harvard Dataverse is open to all scientific data from all disciplines worldwide. It includes the world's largest collection of social science research data. It is hosting data for projects, archives, researchers, journals, organizations, and institutions.
Note: this repository is no longer available. BioVeL is a virtual e-laboratory that supports research on biodiversity issues using large amounts of data from cross-disciplinary sources. BioVeL supports the development and use of workflows to process data. It offers the possibility either to use ready-made workflows or to create your own. BioVeL workflows are stored in MyExperiment - Biovel Group http://www.myexperiment.org/groups/643/content. They are underpinned by a range of analytical and data processing functions (generally provided as Web Services or R scripts) to support common biodiversity analysis tasks. You can find the Web Services catalogued in the BiodiversityCatalogue.