  • * at the end of a keyword enables wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
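As a quick illustration, the operators above can be combined into query strings. The snippet below is a minimal sketch; the search terms are hypothetical examples, not drawn from the catalog itself:

```python
# Hypothetical query strings illustrating each operator of the search
# syntax described above. The terms themselves are made up for illustration.
queries = {
    "wildcard": "climat*",                  # matches climate, climatology, ...
    "phrase":   '"earth sciences"',         # exact phrase
    "and":      "ocean + temperature",      # both terms (the default)
    "or":       "seismology | earthquake",  # either term
    "not":      "data - genomics",          # exclude a term
    "grouping": "(ocean | marine) + data",  # parentheses set precedence
    "fuzzy":    "pangea~1",                 # terms within edit distance 1
    "slop":     '"open data"~2',            # phrase with up to 2 words between
}

for name, q in queries.items():
    print(f"{name:9s} {q}")
```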
Found 49 result(s)
<<<!!!<<< This repository is no longer available. This record is outdated. >>>!!!>>> The ONS Challenge contains open solubility data: experiments with raw data from different scientists and institutions. It is part of the Open Notebook Science wiki community, which is ideally suited for community-wide collaborative research projects involving mathematical modeling and computer simulation, as it allows researchers to document model development step by step, link model predictions to the experiments that test the model, and in turn use feedback from those experiments to evolve the model. By making our laboratory notebooks public, the evolution of a model can be followed in its totality by the interested reader. Researchers from laboratories around the world can follow the progress of our research day to day, borrow models at various stages of development, comment or advise on model developments, discuss experiments, ask questions, provide feedback, or otherwise contribute to the progress of science in any manner possible.
The GHDx is our user-friendly and searchable data catalog for global health, demographic, and other health-related datasets. It provides detailed information about datasets worldwide, ranging from censuses and surveys to health records and vital statistics. It also serves as a platform for data owners to share their data with the public. The GBD Compare visualization, which lets the user see the rate of change in disease incidence, globally or by country, by age or across all ages, is an especially powerful tool. Be sure to try adding a bottom chart, such as the map, to augment the treemap that loads by default in the top chart.
GEOFON seeks to facilitate cooperation in seismological research and earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and by keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real-time access to data from this network and those of many partner networks and plate boundary observatories, and providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation, encouraging the full exploitation of all acquired data, and serving as the permanent archive for the Geophysical Instrument Pool at Potsdam (GIPP). It also organises the exchange of real-time and archived data with partner institutions and international centres.
The Digital Archaeological Record (tDAR) is an international digital repository for the digital records of archaeological investigations. tDAR's use, development, and maintenance are governed by Digital Antiquity, an organization dedicated to ensuring the long-term preservation of irreplaceable archaeological data and to broadening access to these data.
<<<!!!<<< This repository is no longer available. >>>!!!>>> The programme "International Oceanographic Data and Information Exchange" (IODE) of the "Intergovernmental Oceanographic Commission" (IOC) of UNESCO was established in 1961. Its purpose is to enhance marine research, exploitation and development, by facilitating the exchange of oceanographic data and information between participating Member States, and by meeting the needs of users for data and information products.
IDOC-DATA is a department of IDOC. IDOC (Integrated Data & Operation Center) has existed since 2003 as a satellite operations center and data center for the Institute of Space Astrophysics (IAS) in Orsay, France. Since then, it has operated within OSUPS (Observatoire des Sciences de l'Univers de l'Université Paris-Saclay, the first French university in the Shanghai ranking), which includes three institutes: IAS, AIM (Astrophysique, Interprétation, Modélisation - IRFU, CEA), and GEOPS (Geosciences Paris-Saclay). IDOC participates in the space missions of OSUPS and its partners, from mission design to long-term scientific data archiving. For each phase of a mission, IDOC offers services in the scientific themes of OSUPS; its activities are accordingly divided into three departments: IDOC-INSTR (instrument design and testing), IDOC-OPE (instrument operations), and IDOC-DATA (data management and the data value chain), which produces the different levels of data constructed from observations of these instruments and makes them available to users for ergonomic and efficient scientific interpretation. Its responsibilities include: building access to these datasets; offering the corresponding services, such as catalogue management, visualization tools, and software pipeline automation; and preserving the availability and reliability of this hardware and software infrastructure, its confidentiality where applicable, and its security.
The HEASARC is a multi-mission astronomy archive for the EUV, X-ray, and gamma-ray wave bands. Because EUV, X-rays, and gamma rays cannot reach the Earth's surface, telescopes and sensors must be placed on spacecraft. The HEASARC now holds data from 25 observatories covering over 30 years of X-ray, extreme-ultraviolet, and gamma-ray astronomy. Data and software from many of the older missions were restored by the HEASARC staff. Examples of these archived missions include ASCA, BeppoSAX, Chandra, Compton GRO, HEAO 1, Einstein Observatory (HEAO 2), EUVE, EXOSAT, HETE-2, INTEGRAL, ROSAT, Rossi XTE, Suzaku, Swift, and XMM-Newton.
PANGAEA - Data Publisher for Earth & Environmental Sciences has an almost 30-year history as an open-access library for archiving, publishing, and disseminating georeferenced data from the Earth, environmental, and biodiversity sciences. Originally evolving from a database for sediment cores, it is operated as a joint facility of the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) and the Center for Marine Environmental Sciences (MARUM) at the University of Bremen. PANGAEA holds a mandate from the World Meteorological Organization (WMO) and is accredited as a World Radiation Monitoring Center (WRMC). It was further accredited as a World Data Center by the International Council for Science (ICSU) in 2001 and has been certified with the CoreTrustSeal since 2019. The successful cooperation between PANGAEA and the publishing industry, along with the corresponding technical implementation, enables the cross-referencing of scientific publications and the datasets archived as supplements to those publications. PANGAEA is the recommended data repository of numerous international scientific journals.
The Mikulski Archive for Space Telescopes (MAST) is a NASA-funded project to support and provide to the astronomical community a variety of astronomical data archives, with a primary focus on scientifically related data sets in the optical, ultraviolet, and near-infrared parts of the spectrum. MAST is located at the Space Telescope Science Institute (STScI).
Cocoon ("COllections de COrpus Oraux Numériques") is a technical platform that supports producers of oral resources in creating, organizing, and archiving their corpora; a corpus may consist of recordings (usually audio), possibly accompanied by annotations of those recordings. Deposited resources are first catalogued and stored, and then archived in the archive of the TGIR Huma-Num. The author and their institution are responsible for deposits and may be granted restricted, secure access to their data for a defined period if the content is considered sensitive. The COCOON platform is jointly operated by two joint research units: the Laboratoire de Langues et civilisations à tradition orale (LACITO - UMR7107 - Université Paris 3 / INALCO / CNRS) and the Laboratoire Ligérien de Linguistique (LLL - UMR7270 - Universités d'Orléans et de Tours, BnF, CNRS).
DBpedia is a crowd-sourced community effort to extract structured information from Wikipedia and make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia, and to link other data sets on the Web to Wikipedia data. We hope that this work will make it easier for the huge amount of information in Wikipedia to be used in new and interesting ways. Furthermore, it might inspire new mechanisms for navigating, linking, and improving the encyclopedia itself.
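As a sketch of what such a structured query can look like, the snippet below builds a request against DBpedia's public SPARQL endpoint (https://dbpedia.org/sparql). The query itself is illustrative: it asks for English abstracts of resources in one Wikipedia category, and the dct:, dbc:, and dbo: prefixes are assumed to be predefined by the endpoint.

```python
from urllib.parse import urlencode

# Illustrative SPARQL query: English abstracts of resources in the
# "Free software" category. The dct:, dbc:, and dbo: prefixes are assumed
# to be predefined by DBpedia's endpoint.
query = """SELECT ?resource ?abstract WHERE {
  ?resource dct:subject dbc:Free_software ;
            dbo:abstract ?abstract .
  FILTER (lang(?abstract) = "en")
} LIMIT 5"""

# The endpoint accepts the query as a URL parameter and can return JSON;
# the assembled URL could be fetched with any HTTP client.
params = urlencode({"query": query, "format": "application/sparql-results+json"})
request_url = "https://dbpedia.org/sparql?" + params
print(request_url[:60])
```

No network request is made here; the point is only the shape of the query and the request URL.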
<<<!!!<<< Data originally published in the JCB DataViewer has been moved to BioStudies. Note that while the majority of data were moved, some authors opted to remove their data completely. Migrated data can be found at https://www.ebi.ac.uk/biostudies/JCB/studies. Screen data are available in the Image Data Resource repository: http://idr.openmicroscopy.org/webclient/?experimenter=-1 >>>!!!>>> The DataViewer was decommissioned in 2018 as the journal evolved to an all-encompassing archive policy for original source data, and as new data repositories emerged that go beyond archiving data and allow investigators to make new connections between datasets, potentially driving discovery. JCB authors are encouraged to make all datasets included in a manuscript available from the date of online publication, either in a publicly available database or as supplemental materials hosted on the journal website. We recommend that our authors store and share their data in appropriate publicly available databases based on data type and/or community standards.
The Copernicus Marine Environment Monitoring Service (CMEMS) provides regular and systematic reference information on the physical and biogeochemical state, variability, and dynamics of the ocean and marine ecosystems for the global ocean and the European regional seas. The observations and forecasts produced by the service support all marine applications, including marine safety; marine resources; the coastal and marine environment; and weather, seasonal forecasting, and climate. For instance, the provision of data on currents, winds, and sea ice helps to improve ship routing services, offshore operations, and search and rescue operations, thus contributing to marine safety. The service also contributes to the protection and sustainable management of living marine resources, in particular for aquaculture, sustainable fisheries management, and the decision-making processes of regional fishery organisations. Physical and marine biogeochemical components are useful for water quality monitoring and pollution control. Sea level rise is a key indicator of climate change and helps to assess coastal erosion. Sea surface temperature elevation has direct consequences for marine ecosystems and the development of tropical cyclones. As a result, the service supports a wide range of coastal and marine environment applications. Many of the data delivered by the service (e.g. temperature, salinity, sea level, currents, wind, and sea ice) also play a crucial role in the domain of weather, climate, and seasonal forecasting.
The WashU Research Data repository accepts any publishable research data set, including textual, tabular, geospatial, imagery, computer code, or 3D data files, from researchers affiliated with Washington University in St. Louis. Datasets include metadata and are curated and assigned a DOI to align with FAIR data principles.
SHARE - Stations at High Altitude for Research on the Environment - is an integrated Project for environmental monitoring and research in the mountain areas of Europe, Asia, Africa and South America responding to the call for improving environmental research and policies for adaptation to the effects of climate changes, as requested by International and Intergovernmental institutions.
The Southern California Earthquake Data Center (SCEDC) operates at the Seismological Laboratory at Caltech and is the primary archive of seismological data for southern California. The 1932-to-present Caltech/USGS catalog maintained by the SCEDC is the most complete archive of seismic data for any region in the United States. Our mission is to maintain an easily accessible, well-organized, high-quality, searchable archive for research in seismology and earthquake engineering.
FactGrid is a Wikibase instance designed to be used by historians with a focus on international projects. The database is hosted by the University of Erfurt and coordinated at the Gotha Research Centre. Partners in joint ventures are Wikimedia Germany as the software provider and the German National Library in a project to open the GND to international research.
SILVA is a comprehensive, quality-controlled web resource for up-to-date aligned ribosomal RNA (rRNA) gene sequences from the Bacteria, Archaea and Eukaryota domains alongside supplementary online services. In addition to data products, SILVA provides various online tools such as alignment and classification, phylogenetic tree calculation and viewer, probe/primer matching, and an amplicon analysis pipeline. With every full release a curated guide tree is provided that contains the latest taxonomy and nomenclature based on multiple references. SILVA is an ELIXIR Core Data Resource.
eLaborate is an online work environment in which scholars can upload scans, transcribe and annotate text, and publish the results as an online text edition that is freely available to all users. Brief information about, and links to, already published editions is presented on the Editions page under Published. Information about editions currently in preparation is posted on the Ongoing projects page. The eLaborate work environment for the creation and publication of online digital editions is developed by the Huygens Institute for the History of the Netherlands of the Royal Netherlands Academy of Arts and Sciences. Although the institute considers itself primarily a research facility and does not maintain a public collection profile, Huygens ING actively maintains almost 200 digitally available resource collections.
Swedish National Data Service (SND) is a research data infrastructure designed to assist researchers in preserving, maintaining, and disseminating research data in a secure and sustainable manner. The SND Search function makes it easy to find, use, and cite research data from a variety of scientific disciplines. Together with an extensive network of almost 40 Swedish higher education institutions and other research organisations, SND works for increased access to research data, nationally as well as internationally.
The European Environment Agency (EEA) is an agency of the European Union. Our task is to provide sound, independent information on the environment. We are a major information source for those involved in developing, adopting, implementing, and evaluating environmental policy, as well as for the general public. Currently, the EEA has 33 member countries. The EEA's mandate is to help the Community and member countries make informed decisions about improving the environment, integrating environmental considerations into economic policies, and moving towards sustainability, and to coordinate the European environment information and observation network (Eionet).
OpenWorm aims to build the first comprehensive computational model of Caenorhabditis elegans (C. elegans), a microscopic roundworm. With only a thousand cells, it solves basic problems such as feeding, mate-finding, and predator avoidance. Despite being extremely well studied, this organism still eludes a deep, principled understanding of its biology. We are using a bottom-up approach, aimed at observing worm behaviour emerge from a simulation of data derived from scientific experiments carried out over the past decade. To do so, we are incorporating the data available in the scientific community into software models. We are engineering Geppetto and Sibernetic, open-source simulation platforms, to be able to run these different models in concert. We are also forging new collaborations with universities and research institutes to collect data that fill in the gaps. All the code we produce in the OpenWorm project is open source and available on GitHub.