Search syntax:
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
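The operators above can be sketched as a small helper that composes a query string. This is only an illustration of how the operators combine; the function and its parameter names are hypothetical and not part of any published API.

```python
# Illustrative sketch: compose a search query from the operators
# documented above. Function and parameter names are assumptions.
def build_query(phrase=None, all_of=(), any_of=(), none_of=(), fuzzy=None):
    """Build a query string using phrase, AND, OR, NOT and fuzziness operators."""
    parts = []
    if phrase:
        parts.append(f'"{phrase}"')               # quoted phrase search
    parts += [f"+{term}" for term in all_of]      # + marks AND terms (default)
    if any_of:
        parts.append("(" + " | ".join(any_of) + ")")  # | is OR; ( ) set priority
    parts += [f"-{term}" for term in none_of]     # - excludes a term
    if fuzzy:
        term, distance = fuzzy
        parts.append(f"{term}~{distance}")        # ~N sets the edit distance
    return " ".join(parts)

print(build_query(phrase="marine geology", all_of=["sediment*"],
                  any_of=["core", "dredge"], none_of=["lake"]))
# → "marine geology" +sediment* (core | dredge) -lake
```

The wildcard `*` is simply appended to the keyword by the caller, as in `sediment*` above.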
Found 803 result(s)
STRING is a database of known and predicted protein interactions. The interactions include direct (physical) and indirect (functional) associations; they are derived from four sources: genomic context, high-throughput experiments, (conserved) coexpression, and previous knowledge. STRING quantitatively integrates interaction data from these sources for a large number of organisms, and transfers information between these organisms where applicable.
The Space Physics Data Facility (SPDF) leads in the design and implementation of unique multi-mission and multi-disciplinary data services and software to strategically advance NASA's solar-terrestrial program, to extend our science understanding of the structure, physics and dynamics of the Heliosphere of our Sun and to support the science missions of NASA's Heliophysics Great Observatory. Major SPDF efforts include multi-mission data services such as the Heliophysics Data Portal (formerly VSPO), CDAWeb and CDAWeb Inside IDL, and OMNIWeb Plus (including COHOWeb, ATMOWeb, HelioWeb and CGM), science planning and orbit services such as SSCWeb, data tools such as the CDF software and tools, and a range of other science and technology research efforts. The staff supporting SPDF includes scientists and information technology experts.
A collection of data at the Agency for Healthcare Research and Quality (AHRQ) supporting research that helps people make more informed decisions and improves the quality of health care services. The portal contains the U.S. Health Information Knowledgebase (USHIK), the Systematic Review Data Repository (SRDR) and other sources concerning the cost, quality, accessibility and evaluation of healthcare and medical insurance.
The BGS is a data-rich organisation with over 400 datasets in its care, including environmental monitoring data, digital databases, physical collections (borehole core, rocks, minerals and fossils), records and archives. Our data is managed by the National Geoscience Data Centre.

Bugwood is the host website of the Center for Invasive Species and Ecosystem Health at the University of Georgia (formerly: Bugwood Network). The Center aims to develop, consolidate and disseminate information and programmes focused on invasive species, forest health, natural resources and agricultural management through technology development, programme implementation, training, applied research and public awareness at state, regional, national and international levels. The site gives details of its products (Bugwood Image Database, Early Detection and Distribution Mapping, and Bugwoodwiki). Details of its projects, services and personnel are provided. Users can also access image databases on forestry, insects, IPM, invasive species, forest pests, weeds and bark beetles.
The Index to Marine and Lacustrine Geological Samples is a tool to help scientists locate and obtain geologic material from sea floor and lakebed cores, grabs, and dredges archived by participating institutions around the world. Data and images related to the samples are prepared and contributed by the institutions for access via the IMLGS and long-term archive at NGDC. Before proposing research on any sample, please contact the curator for sample condition and availability. A consortium of curators guides the IMLGS, which has been maintained on behalf of the group by NGDC since 1977.
Copernicus is a European system for monitoring the Earth. Copernicus consists of a complex set of systems which collect data from multiple sources: earth observation satellites and in situ sensors such as ground stations, airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities. In fact, several economic studies have already demonstrated a huge potential for job creation, innovation and growth.
Welcome to INTERMAGNET - the global network of observatories, monitoring the Earth's magnetic field. At this site you can find data and information from geomagnetic observatories around the world. The INTERMAGNET programme exists to establish a global network of cooperating digital magnetic observatories, adopting modern standard specifications for measuring and recording equipment, in order to facilitate data exchanges and the production of geomagnetic products in close to real time.
The CancerData site is an effort of the Medical Informatics and Knowledge Engineering team (MIKE for short) of Maastro Clinic, Maastricht, the Netherlands. Our activities in the field of medical image analysis and data modelling are visible in a number of projects we are running. CancerData offers several datasets. They are grouped in collections and can be public or private. You can search for public datasets in the NBIA (National Biomedical Imaging Archive) image archives without logging in.
CHAMP (CHAllenging Minisatellite Payload) was a German small satellite mission for geoscientific and atmospheric research and applications, managed by GFZ. With its highly precise, multifunctional and complementary payload elements (magnetometer, accelerometer, star sensor, GPS receiver, laser retro-reflector, ion drift meter) and its orbit characteristics (near-polar, low altitude, long duration), CHAMP generated, for the first time, simultaneous highly precise gravity and magnetic field measurements over a five-year period. This made it possible to detect not only the spatial variations of both fields but also their variability over time. The CHAMP mission opened a new era in geopotential research and became a significant contributor to the Decade of Geopotentials. With the radio occultation measurements onboard the spacecraft and the infrastructure developed on the ground, CHAMP also became a pilot mission for the pre-operational use of space-borne GPS observations for atmospheric and ionospheric research and applications in weather prediction and space weather monitoring. The CHAMP mission ended on 19 September 2010, after ten years, two months and four days, and 58,277 orbits.
Due to changes at the individual IGS analysis centers over the years, the resulting time series of global geodetic parameters are inhomogeneous and inconsistent. A geophysical interpretation of these long series and the realization of a high-accuracy global reference frame are therefore difficult and questionable. The GPS reprocessing project GPS-PDR (Potsdam Dresden Reprocessing), initiated by TU München and TU Dresden and continued by GFZ Potsdam and TU Dresden, provides selected products of a homogeneously reprocessed global GPS network, such as GPS satellite orbits and Earth rotation parameters.
China Earthquake Data Center provides seismic data, geomagnetic data, geoelectric data, terrain data and underground fluid change data. Access is restricted to the Seismological Bureau.
EMBL-EBI provides freely available data from life science experiments covering the full spectrum of molecular biology. The EBI Metagenomics service is an automated pipeline for the analysis and archiving of metagenomic data that aims to provide insights into the phylogenetic diversity as well as the functional and metabolic potential of a sample.
The Health and Medical Care Archive (HMCA) is the data archive of the Robert Wood Johnson Foundation (RWJF), the largest philanthropy devoted exclusively to health and health care in the United States. Operated by the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan, HMCA preserves and disseminates data collected by selected research projects funded by the Foundation and facilitates secondary analyses of the data. Our goal is to increase understanding of health and health care in the United States through secondary analysis of RWJF-supported data collections.
MGI is the international database resource for the laboratory mouse, providing integrated genetic, genomic, and biological data to facilitate the study of human health and disease. The projects contributing to this resource are: the Mouse Genome Database (MGD) Project, the Gene Expression Database (GXD) Project, the Mouse Tumor Biology (MTB) Database Project, the Gene Ontology (GO) Project at MGI, the MouseMine Project, and the MouseCyc Project at MGI.
The USGS currently houses the Land Cover Institute (LCI) at the Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota. The LCI addresses land cover topics from local to global scales, in both domestic and international settings. Through the Land Cover Institute, the USGS serves as a facilitator for land cover and land use science, applications, and production functions. The institute assists in the availability and technical support of land cover data sets by increasing public and scientific awareness of the importance of land cover science. Since the reorganization of the World Data Centers in 2009, LCI has continued to serve as the World Data Center (WDC) for land cover, providing access to, and information about, land cover data of the world.
DataFed is web services-based software that non-intrusively mediates between autonomous, distributed data providers and users. The main goals of DataFed are to aid air quality management and science through effective use of relevant data, to facilitate the access and flow of atmospheric data from providers to users, and to support the development of user-driven data processing value chains. The DataFed Catalog links searchable DataFed applications worldwide.
The GAVO data center at Zentrum für Astronomie Heidelberg provides VO publication services to all interested parties on behalf of the German Astrophysical Virtual Observatory. It is a growing collection of data and services.
Jason is a remote-controlled deep-diving vessel that gives shipboard scientists immediate, real-time access to the sea floor. Instead of making short, expensive dives in a submarine, scientists can stay on deck and guide Jason as deep as 6,500 meters (4 miles) to explore for days on end. Jason is a type of remotely operated vehicle (ROV), a free-swimming vessel connected by a long fiberoptic tether to its research ship. The 10-km (6 mile) tether delivers power and instructions to Jason and fetches data from it.
Seafloor Sediments Data Collection is a collection of more than 14,000 archived marine geological samples recovered from the seafloor. The inventory includes long, stratified sediment cores, as well as rock dredges, surface grabs, and samples collected by the submersible Alvin.
The Objectively Analyzed air-sea Fluxes (OAFlux) project is a research and development project focusing on global air-sea heat, moisture, and momentum fluxes. The project is committed to producing high-quality, long-term, global ocean surface forcing datasets from the late 1950s to the present to serve the needs of the ocean and climate communities in the characterization, attribution, modeling, and understanding of variability and long-term change in the atmosphere and the oceans.
NED is a comprehensive database of multiwavelength data for extragalactic objects, providing a systematic, ongoing fusion of information integrated from hundreds of large sky surveys and tens of thousands of research publications. The contents and services span the entire observed spectrum from gamma rays through radio frequencies. As new observations are published, they are cross-identified or statistically associated with previous data and integrated into a unified database to simplify queries and retrieval. Seamless connectivity is also provided to data in NASA astrophysics mission archives (IRSA, HEASARC, MAST), to the astrophysics literature via ADS, and to other data centers around the world.
The Stanford Microarray Database (SMD) is a DNA microarray research database that provides a large amount of data for public use.