  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set priority (grouping)
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
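Taken together, the operators above compose into ordinary query strings. A minimal sketch of how they combine (in Python, purely to keep the examples checkable; the search terms themselves are hypothetical, not drawn from the registry):

```python
# Illustrative query strings for the search operators listed above.
# The terms are made-up examples; only the operators come from the
# bullet list in this document.
examples = {
    "wildcard": "climat*",                      # matches climate, climatology, ...
    "phrase":   '"ocean acidification"',        # exact phrase match
    "and":      "genome + cancer",              # both terms required (default)
    "or":       "turtle | tortoise",            # either term
    "not":      "precipitation - snow",         # exclude a term
    "priority": "(genome | genomics) + cancer", # parentheses set priority
    "fuzzy":    "hydrografy~1",                 # edit distance 1 ("hydrography")
    "slop":     '"sleep data"~2',               # phrase words within 2 positions
}

for name, query in examples.items():
    print(f"{name:8s} {query}")
```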
Found 24 result(s)
In February 1986 the NIST measurements were communicated to appropriate astronomers for use in ground-based testing and calibration programs for the GHRS, and in 1990 the NIST group published the new wavelengths for about 3000 lines in the Supplement Series of the Astrophysical Journal. The full report on the NIST measurements in the form of a complete and detailed atlas of the platinum/neon spectrum presented in this special issue of the Journal of Research of NIST will be highly useful to a wide range of scientists.
The Site Survey Data Bank (SSDB) is a repository for site survey data submitted in support of International Ocean Discovery Program (IODP) proposals and expeditions. SSDB serves different roles for different sets of users.
Reactome is a manually curated, peer-reviewed pathway database. Its aim is to share visual representations of biological pathways in a computationally accessible format. Pathway annotations are authored by expert biologists, in collaboration with Reactome editorial staff, and cross-referenced to many bioinformatics databases. These include the NCBI Gene, Ensembl and UniProt databases, the UCSC and HapMap Genome Browsers, the KEGG Compound and ChEBI small-molecule databases, PubMed, and Gene Ontology.
The OpenMadrigal project seeks to develop and support an online database for geospace data. The project has been led by MIT Haystack Observatory since 1980, but now has active support from Jicamarca Observatory and other community members. Madrigal is a robust, Web-based system capable of managing and serving archival and real-time data, in a variety of formats, from a wide range of ground-based instruments. Madrigal is installed at a number of sites around the world. Data at each Madrigal site is locally controlled and can be updated at any time, but the metadata shared between Madrigal sites allows all sites to be searched at once from any one of them. Data is local; metadata is shared.
The Cancer Genome Atlas (TCGA) Data Portal provides a platform for researchers to search, download, and analyze data sets generated by TCGA. It contains clinical information, genomic characterization data, and high level sequence analysis of the tumor genomes. The Data Coordinating Center (DCC) is the central provider of TCGA data. The DCC standardizes data formats and validates submitted data.
PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts, which were formerly sent based only on event magnitude and location or on population exposure to shaking, will now also be generated based on the estimated range of fatalities and economic losses. PAGER uses these earthquake parameters to calculate estimates of ground shaking using the methodology and software developed for ShakeMaps. ShakeMap sites provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. These maps are used by federal, state, and local organizations, both public and private, for post-earthquake response and recovery, public and scientific information, as well as for preparedness exercises and disaster planning.
The National Sleep Research Resource (NSRR) is an NHLBI-supported repository for sharing large amounts of sleep data (polysomnography, actigraphy and questionnaire-based) from multiple cohorts, clinical trials, and other data sources. Launched in April 2014, the mission of the NSRR is to advance sleep and circadian science by supporting secondary data analysis, algorithmic development, and signal processing through the sharing of high-quality data sets.
The Human Mortality Database (HMD) was created to provide detailed mortality and population data to researchers, students, journalists, policy analysts, and others interested in the history of human longevity. The HMD contains original calculations of death rates and life tables for national populations (countries or areas), as well as the input data used in constructing those tables. The input data consist of death counts from vital statistics, plus census counts, birth counts, and population estimates from various sources.
The Brown Digital Repository (BDR) is a place to gather, index, store, preserve, and make available digital assets produced via the scholarly, instructional, research, and administrative activities at Brown.
On February 24, 2000, Terra began collecting what would ultimately become a 15-year global data set on which to base scientific investigations about our complex home planet. Together with the entire fleet of EOS spacecraft, Terra is helping scientists unravel the mysteries of climate and environmental change. Terra's data collection instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and the Earth's Radiant Energy System (CERES), Multi-angle Imaging Spectro-Radiometer (MISR), Moderate-resolution Imaging Spectroradiometer (MODIS), and Measurement of Pollution in the Troposphere (MOPITT).
Digital Case is Case Western Reserve University's digital library, institutional repository and digital archive. Digital Case stores, disseminates, and preserves the intellectual output of Case faculty, departments and research centers in digital formats (both "born digital" items as well as materials of historical interest that have been digitized). Kelvin Smith Library manages Digital Case on behalf of the university. With Digital Case, KSL assumes an active role in the scholarly communication process, providing expertise in the form of a set of services (metadata creation, secure environment, preservation over time) for access and distribution of the university’s collective intellectual product.
The Digital Collections repository is a service that provides free and open access to the scholarship and creative works produced and owned by the Texas State University community. The Wittliff Collections, located on the seventh floor of the Albert B. Alkek Library at Texas State University, were founded by William D. Wittliff in 1987 and comprise two collections. 1. The Southwestern Writers Collection: this collection holds the papers of numerous 20th-century writers. The film holdings contain over 500 film and television screenplays as well as complete production archives for several popular films, including the television miniseries Lonesome Dove. The music holdings represent the breadth and scope of popular Texas sounds. 2. The Southwestern & Mexican Photography Collection: this collection assembles a broad range of photographic work from the Southwestern United States and Mexico, from the 19th century to the present day.
Note: the Cancer Genomics Hub mission is now completed. The Cancer Genomics Hub was established in August 2011 to provide a repository for The Cancer Genome Atlas, the childhood cancer initiative Therapeutically Applicable Research to Generate Effective Treatments, and the Cancer Genome Characterization Initiative. CGHub rapidly grew to be the largest database of cancer genomes in the world, storing more than 2.5 petabytes of data and serving downloads of nearly 3 petabytes per month. As the central repository for the foundational genome files, CGHub streamlined team science efforts as data became as easy to obtain as downloading from a hard drive. The convenient access to Big Data, and the collaborations that CGHub made possible, are now essential to cancer research. That work continues at the NCI's Genomic Data Commons, where all files previously stored at CGHub can be found: https://gdc.nci.nih.gov/ The Cancer Genomics Hub (CGHub) was a secure repository for storing, cataloging, and accessing cancer genome sequences, alignments, and mutation information from The Cancer Genome Atlas (TCGA) consortium and related projects. All researchers using CGHub had to meet the access and use criteria established by the National Institutes of Health (NIH) to ensure the privacy, security, and integrity of participant data. CGHub also hosted some publicly available data, in particular data from the Cancer Cell Line Encyclopedia. All metadata is publicly available, and the catalog of metadata and associated BAMs can be explored using the CGHub Data Browser.
MEASURE DHS is advancing global understanding of health and population trends in developing countries through nationally-representative household surveys that provide data for a wide range of monitoring and impact evaluation indicators in the areas of population, health, HIV, and nutrition. The database collects, analyzes, and disseminates data from more than 300 surveys in over 90 countries. MEASURE DHS distributes, at no cost, survey data files for legitimate academic research.
TurtleSAT is a new website where communities are mapping the location of freshwater turtles in waterways and wetlands across the country. Australia's unique freshwater turtles are in crisis - their numbers are declining and your help is needed to record where you see turtles in your local area.
The CCHDO provides access to standard, well-described datasets from reference-quality repeat hydrography expeditions. It curates high-quality, full-water-column Conductivity-Temperature-Depth (CTD), hydrographic, carbon, and tracer data from over 2,500 cruises from ~30 countries. It is the official data center for CTD and water-sample profile data from the Global Ocean Ship-Based Hydrographic Investigations Program (GO-SHIP), as well as for WOCE, US Hydro, and other high-quality repeat hydrography lines (e.g., SOCCOM, HOT, BATS, CARINA).
The OFA databases are core to the organization’s objective of establishing control programs to lower the incidence of inherited disease. Responsible breeders have an inherent responsibility to breed healthy dogs. The OFA databases serve all breeds of dogs and cats, and provide breeders a means to respond to the challenge of improving the genetic health of their breed through better breeding practices. The testing methodology and the criteria for evaluating the test results for each database were independently established by veterinary scientists from their respective specialty areas, and the standards used are generally accepted throughout the world.
The Web-enabled Landsat data (WELD) project combines geophysical and biophysical Landsat data for the purposes of long-term preservation and monitoring of national, regional, and local data. WELD products are already "terrain-corrected and radiometrically calibrated" so as to be more easily accessible to researchers.
This repository is no longer available. TRMM is a research satellite designed to improve our understanding of the distribution and variability of precipitation within the tropics as part of the water cycle in the current climate system. By covering the tropical and sub-tropical regions of the Earth, TRMM provides much needed information on rainfall and its associated heat release that helps to power the global atmospheric circulation that shapes both weather and climate. In coordination with other satellites in NASA's Earth Observing System, TRMM provides important precipitation information using several space-borne instruments to increase our understanding of the interactions between water vapor, clouds, and precipitation, that are central to regulating Earth's climate. The TRMM mission ended in 2015 and final TRMM multi-satellite precipitation analyses (TMPA, product 3B42/3B43) data processing will end December 31st, 2019. As a result, this TRMM webpage is in the process of being retired and some TRMM imagery may not be displaying correctly. Some of the content will be moved to the Precipitation Measurement Missions website https://gpm.nasa.gov/ and our team is exploring ways to provide some of the real-time products using GPM data. Please contact us if you have any additional questions.
The Lamont-Doherty Core Repository (LDCR) contains one of the world's most important collections of scientific samples from the deep sea. Sediment cores from every major ocean and sea are archived at the Core Repository. The collection contains approximately 72,000 meters of core composed of 9,700 piston cores; 7,000 trigger weight cores; and 2,000 other cores such as box, kasten, and large-diameter gravity cores. We also hold 4,000 dredge and grab samples, including a large collection of manganese nodules, many of which were recovered by submersibles. Over 100,000 residues are stored and are available for sampling where core material is expended. In addition to physical samples, a database of the Lamont core collection has been maintained for nearly 50 years and contains information on the geographic location of each collection site, core length, mineralogy and paleontology, lithology, and structure, and more recently, the full text of megascopic descriptions.
This site provides a central location for integrated near real-time or recent data relating to coral reefs, and also provides ecological forecasts (through artificial intelligence technology) as to the occurrence of specified environmental conditions, as prescribed by modelers, oceanographers and marine biologists.
This facility permits selective searches of some atomic data files compiled by R. L. Kurucz (Harvard-Smithsonian Center for Astrophysics). The data provided are: vacuum wavelength (in nm) [above 200 nm, calculated using Edlen, Metrologia, Vol. 2, No. 2, 1966]; air wavelength (in nm) above 200 nm; log(gf); E [in cm-1], j, parity, and configuration for the levels (lower, upper); and information regarding the source of the data. CD-ROM 18 contains the spectrum synthesis programs ATLAS7V, SYNTHE, SPECTRV, ROTATE, BROADEN, PLOTSYN, etc., and sample runs, found in directory PROGRAMS; the atomic line data files BELLHEAVY.DAT, BELLLIGHT.DAT, GFIRONLAB.DAT, GULLIVER.DAT, NLTELINES.DAT, and GFIRONQ.DAT (obsolete, merged into GFALL), found in directory LINELISTS; the molecular line data files C2AX.ASC, C2BA.ASC, C2DA.ASC, C2EA.ASC, CNAX.ASC, CNBX.ASC, COAX.ASC, COXX.ASC, H2.ASC, HYDRIDES.ASC, SIOAX.ASC, SIOEX.ASC, and SIOXX.ASC, also in directory LINELISTS; and Kurucz's solar flux atlas for test calculations, SOLARFLUX.ASC.
Welcome to INTERMAGNET - the global network of observatories, monitoring the Earth's magnetic field. At this site you can find data and information from geomagnetic observatories around the world. The INTERMAGNET programme exists to establish a global network of cooperating digital magnetic observatories, adopting modern standard specifications for measuring and recording equipment, in order to facilitate data exchanges and the production of geomagnetic products in close to real time.