
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) implies priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
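The operators above can be combined in a single query. A few illustrative examples follow; the query terms are made up for demonstration, and the underlying parser is assumed to follow an Elasticsearch-style query-string syntax:

```python
# Illustrative query strings for the search syntax described above.
# The terms themselves are hypothetical; the engine is assumed to use
# an Elasticsearch-style query-string parser.
examples = {
    'geo*': "wildcard: matches 'geology', 'geoscience', ...",
    '"climate data"': "exact phrase search",
    'ocean + temperature': "AND search (also the default)",
    'satellite | spacecraft': "OR search",
    'imaging - medical': "NOT: excludes results containing 'medical'",
    '(gravity | magnetic) + field': "parentheses set evaluation priority",
    'spectometry~2': "fuzzy term: up to 2 single-character edits",
    '"data archive"~3': "phrase match with a slop of up to 3 positions",
}

for query, meaning in examples.items():
    print(f"{query:30} -> {meaning}")
```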
Found 38 result(s)
CHAMP (CHAllenging Minisatellite Payload) was a German small satellite mission for geoscientific and atmospheric research and applications, managed by GFZ. With its highly precise, multifunctional and complementary payload elements (magnetometer, accelerometer, star sensor, GPS receiver, laser retro-reflector, ion drift meter) and its orbit characteristics (near-polar, low altitude, long duration), CHAMP generated, for the first time, simultaneous highly precise gravity and magnetic field measurements over a five-year period. This made it possible to detect not only the spatial variations of both fields but also their variability over time. The CHAMP mission opened a new era in geopotential research and became a significant contributor to the Decade of Geopotentials. In addition, with the radio occultation measurements on board the spacecraft and the infrastructure developed on the ground, CHAMP became a pilot mission for the pre-operational use of space-borne GPS observations for atmospheric and ionospheric research and applications in weather prediction and space weather monitoring. The CHAMP mission ended on September 19, 2010, after ten years, two months and four days, and 58,277 orbits.
Yoda publishes research data on behalf of researchers affiliated with Utrecht University, its research institutes, and consortia for which it acts as a coordinating body. Data packages are not limited to a particular field of research or license. Yoda publishes data packages via DataCite. To find data publications, use https://public.yoda.uu.nl/ or the DataCite search engine: https://search.datacite.org/repositories/delft.uu
LiceBase is a database for sea lice genomics. LiceBase provides the genome annotation of the Atlantic salmon louse Lepeophtheirus salmonis, a genome browser, BLAST functionality and access to related high-throughput genomics data.
The PAIN Repository is a recently funded NIH initiative, which has two components: an archive for already collected imaging data (Archived Repository), and a repository for structural and functional brain images and metadata acquired prospectively using standardized acquisition parameters (Standardized Repository) in healthy control subjects and patients with different types of chronic pain. The PAIN Repository provides the infrastructure for storage of standardized resting state functional, diffusion tensor imaging and structural brain imaging data and associated biological, physiological and behavioral metadata from multiple scanning sites, and provides tools to facilitate analysis of the resulting comprehensive data sets.
METLIN is the largest collection of MS/MS data, with database entries generated at multiple collision energies and in positive and negative ionization modes. The data are generated on multiple instrument types, including SCIEX, Agilent, Bruker and Waters QTOF mass spectrometers.
ForestPlots.net is a web-accessible secure repository for forest plot inventories in South America, Africa and Asia. The database includes plot geographical information; location, taxonomic information and diameter measurements of trees inside each plot; and participants in plot establishment and re-measurement, including principal investigators, field assistants, students.
The National Science Foundation (NSF) Ultraviolet (UV) Monitoring Network provides data on ozone depletion and the associated effects on terrestrial and marine systems. Data are collected from 7 sites in Antarctica, Argentina, the United States, and Greenland. The network provides data to researchers studying the effects of ozone depletion on terrestrial and marine biological systems. Network data are also used for the validation of satellite observations and for the verification of models describing the transfer of radiation through the atmosphere.
The twin GRACE satellites were launched on March 17, 2002. Since that time, the GRACE Science Data System (SDS) has produced and distributed estimates of the Earth gravity field on an ongoing basis. These estimates, in conjunction with other data and models, have provided observations of terrestrial water storage changes, ice-mass variations, ocean bottom pressure changes and sea-level variations. This portal, together with PODAAC, is responsible for the distribution of the data and documentation for the GRACE project.
The Virtual Research Environment (VRE) is an open-source data management platform that enables medical researchers to store, process and share data in compliance with the European Union (EU) General Data Protection Regulation (GDPR). The VRE addresses the present lack of digital research data infrastructures fulfilling the need for (a) data protection for sensitive data, (b) capability to process complex data such as radiologic imaging, (c) flexibility for creating own processing workflows, (d) access to high performance computing. The platform promotes FAIR data principles and reduces barriers to biomedical research and innovation. The VRE offers a web portal with graphical and command-line interfaces, segregated data zones and organizational measures for lawful data onboarding, isolated computing environments where large teams can collaboratively process sensitive data privately, analytics workbench tools for processing, analyzing, and visualizing large datasets, automated ingestion of hospital data sources, project-specific data warehouses for structured storage and retrieval, graph databases to capture and query ontology-based metadata, provenance tracking, version control, and support for automated data extraction and indexing. The VRE is based on a modular and extendable state-of-the-art cloud computing framework, a RESTful API, open developer meetings, hackathons, and comprehensive documentation for users, developers, and administrators. The VRE, with its concerted technical and organizational measures, can be adopted by other research communities and thus facilitates the development of a co-evolving interoperable platform ecosystem with an active research community.
Kadi4Mat instance for use at the Karlsruhe Institute of Technology (KIT) and for collaborations, including the Cluster of Competence for Solid-state Batteries (FestBatt), the Battery Competence Cluster Analytics/Quality Assurance (AQua), and more. Kadi4Mat is the Karlsruhe Data Infrastructure for Materials Science, an open-source software for managing research data. It is being developed as part of several research projects at the Institute for Applied Materials - Microstructure Modelling and Simulation (IAM-MMS) of the Karlsruhe Institute of Technology (KIT). The goal of this project is to combine the ability to manage and exchange data (the repository) with the possibility to analyze, visualize and transform said data (the electronic lab notebook, ELN). Kadi4Mat supports close cooperation between experimenters, theorists and simulators, especially in materials science, to enable the acquisition of new knowledge and the development of novel materials. This is made possible by a modular and generic architecture, which makes it possible to cover the specific needs of different scientists, each utilizing unique workflows. At the same time, this opens up the possibility of covering other research disciplines as well.
MEMENTO aims to become a valuable tool for identifying regions of the world ocean that should be targeted in future work to improve the quality of air-sea flux estimates.
The term GNSS (Global Navigation Satellite Systems) comprises the different navigation satellite systems such as GPS, GLONASS and the future Galileo, as well as raw data from GNSS microwave receivers and processed or derived higher-level products and required auxiliary data. The results of the GFZ GNSS technology-based projects are used as contributions to maintaining and studying the Earth's rotational behavior and the global terrestrial reference frame, to studying neotectonic processes along plate boundaries and in the interior of plates, and as input to short-term weather forecasting and atmosphere/climate research. Currently, only selected products such as observation data, navigation data (ephemerides), meteorological data and quality data with limited spatial coverage are provided by the GNSS ISDC.
Note: The repository is no longer available; this record is outdated. The Matter Lab provides the archived 2012 and 2013 database versions at https://www.matter.toronto.edu/basic-content-page/data-download. Data linked from the World Community Grid - The Clean Energy Project is available at https://www.worldcommunitygrid.org/research/cep1/overview.do and on figshare: https://figshare.com/articles/dataset/moldata_csv/9640427. The Clean Energy Project Database (CEPDB) was a massive reference database for organic semiconductors with a particular emphasis on photovoltaic applications. It was created to store and provide access to data from computational as well as experimental studies, on both known and virtual compounds. It was a free and open resource designed to support researchers in the field of organic electronics in their scientific pursuits. The CEPDB was established as part of the Harvard Clean Energy Project (CEP), a virtual high-throughput screening initiative to identify promising new candidates for the next generation of carbon-based solar cell materials.
The name Earth Online derives from ESA's Earthnet programme. Earthnet prepares and attracts new ESA Earth Observation missions by setting up the international cooperation scheme, preparing the basic infrastructure, and building the scientific and application community and competency in Europe to define and set up its own European programmes in consultation with member states. Earth Online is the entry point for scientific and technical information on Earth Observation activities of the European Space Agency (ESA). The web portal provides a vast amount of content, grown and collected over more than a decade: detailed technical information on Earth Observation (EO) missions; satellites and sensors; EO data products and services; online resources such as catalogues and a library; applications of satellite data; and access to promotional satellite imagery. After 10 years of operations on distinct sites, the two principal portals of ESA Earth Observation - Earth Online (earth.esa.int) and the Principal Investigator's Portal (eopi.esa.int) - have moved to a new platform. ESA's technical and scientific Earth Observation user communities will from now on be served from a single portal, providing a modern and easy-to-use interface to its services and data.
The Central Neuroimaging Data Archive (CNDA) allows sharing of complex imaging data with investigators around the world through a simple web portal. The CNDA is an imaging informatics platform that provides secure data management services for Washington University investigators, including source DICOM imaging data sharing with external investigators through a web portal, cnda.wustl.edu. The CNDA's services include automated archiving of imaging studies from all of the University's research scanners, automated quality control and image processing routines, and secure web-based access to acquired and post-processed data for data sharing, in compliance with NIH data sharing guidelines. The CNDA is currently accepting datasets only from Washington University affiliated investigators. Through this platform, the data are available for broad sharing with researchers both internal and external to Washington University. The CNDA overlaps with data in oasis-brains.org (https://www.re3data.org/repository/r3d100012182), but the CNDA is a larger data set.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, it is a "dark archive" without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich's Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open Source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
sciencedata.dk is a research data store provided by DTU, the Technical University of Denmark, aimed specifically at researchers and scientists at Danish academic institutions. The service is intended for working with and sharing active research data as well as for safekeeping of large datasets. The data can be accessed and manipulated via a web interface, synchronization clients, file transfer clients or the command line. The service is built on and with open-source software from the ground up: FreeBSD, ZFS, Apache, PHP, ownCloud/Nextcloud. DTU is actively engaged in community efforts on developing research-specific functionality for data stores. Our servers are attached directly to the 10-Gigabit backbone of "Forskningsnettet" (the National Research and Education Network of Denmark), implying that upload and download speeds from Danish academic institutions are, in principle, comparable to those of an external USB hard drive. The service is a data store for research data allowing private sharing and sharing via links / persistent URLs.
The main function of the GGSP (Galileo Geodetic Service Provider) is to provide a terrestrial reference frame, in the broadest sense of the word, to both the Galileo Core System (GCS) as well as to the Galileo User Segment (all Galileo users). This implies that the GGSP should enable all users of the Galileo System, including the most demanding ones, to access and realise the GTRF with the precision required for their specific application. Furthermore, the GGSP must ensure the proper interfaces to all users of the GTRF, especially the geodetic and scientific user groups. In addition the GGSP must ensure the adherence to the defined standards of all its products. Last but not least the GGSP will play a key role to create awareness of the GTRF and educate users in the usage and realisation of the GTRF.
TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) is the first bistatic SAR mission in space. TanDEM-X and its twin satellite TerraSAR-X fly in a closely controlled formation with typical distances between 250 and 500 meters. The primary mission objective is the generation of a consistent global digital elevation model with a height accuracy at the few-meter level. Beyond that, GFZ equipped TanDEM-X with a geodetic-grade GPS receiver for precise baseline determination and for radio occultation measurements. TanDEM-X was launched on June 21, 2010 for a 5-year mission lifetime. The GPS radio occultation data of the German TanDEM-X satellite are analysed, and globally distributed vertical atmospheric profiles (bending angles, refractivity, temperature, water vapor) are derived and provided for the international user community.
The Cancer Cell Line Encyclopedia project is a collaboration between the Broad Institute and the Novartis Institutes for Biomedical Research and its Genomics Institute of the Novartis Research Foundation to conduct a detailed genetic and pharmacologic characterization of a large panel of human cancer models, to develop integrated computational analyses that link distinct pharmacologic vulnerabilities to genomic patterns, and to translate cell line integrative genomics into cancer patient stratification. The CCLE provides public access to genomic data, analysis and visualization for about 1000 cell lines.
WISER is a self-service platform for data of the Global Networks of Isotopes in Precipitation (GNIP) and Rivers (GNIR), hosted within the IAEA's repository for technical resources (NUCLEUS). GNIP in WISER currently contains over 130,000 records; stable isotopes are current to the end of 2013 and will be updated as verified data come in. Parts of the GNIR water isotope data are online as well (synoptic/time series), although we are still in the process of verifying and completing GNIR data uploads and other isotopic parameters over the next year. Check back occasionally for GNIR updates. Tritium data after 2009 are in the process of being updated over the next year. When accessing WISER through the URL https://nucleus.iaea.org/wiser, you will be forwarded to the NUCLEUS log-in page. After entering your user credentials and validation, you will be forwarded to the WISER landing page.
The aim of the EPPO Global Database is to provide, in a single portal, all pest-specific information that has been produced or collected by EPPO. The full database is available via the Internet, but when no Internet connection is available, a subset of the database called 'EPPO GD Desktop' can be run as standalone software (now replacing PQR).
The DMC is designed to provide registered users with access to non-confidential petroleum exploration and production data from offshore Nova Scotia, subject to certain conditions. The DMC is housed in the CNSOPB's Geoscience Research Centre located in Dartmouth, Nova Scotia. Initially, the DMC will manage and distribute the following digital petroleum data: well data (i.e. logs and reports), seismic image files (e.g. TIFF, PDF), and production data. In the future the DMC could be expanded to include operational, safety, environmental, fisheries data, etc.