Filter
  • Subjects
  • Content Types
  • Countries
  • AID systems
  • API
  • Certificates
  • Data access
  • Data access restrictions
  • Database access
  • Database licenses
  • Data licenses
  • Data upload
  • Data upload restrictions
  • Enhanced publication
  • Institution responsibility type
  • Institution type
  • Keywords
  • Metadata standards
  • PID systems
  • Provider types
  • Quality management
  • Repository languages
  • Software
  • Syndications
  • Repository types
  • Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply precedence (grouping)
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
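
The snippet below illustrates the query syntax listed above with a few example query strings. It is only a sketch: the queries and the helper function are hypothetical examples, and no particular search endpoint is assumed or contacted.

# Illustrative examples of the search syntax described above.
# The queries are hypothetical samples; no search endpoint is contacted here.
example_queries = [
    'climat*',                  # trailing * : wildcard search (climate, climatology, ...)
    '"research data"',          # quotes: phrase search
    'astronomy + physics',      # + : AND (also the default between terms)
    'aerosol | cloud',          # | : OR
    'data - software',          # - : NOT ("data" but not "software")
    '(lidar | radar) + cloud',  # parentheses: grouping / precedence
    'climate~2',                # ~N after a word: fuzzy match with edit distance 2
    '"coupled model"~3',        # ~N after a phrase: slop of 3 between the words
]

def phrase(words, slop=None):
    """Hypothetical helper: build a quoted phrase query, optionally with slop."""
    q = '"' + " ".join(words) + '"'
    return q if slop is None else f"{q}~{slop}"

if __name__ == "__main__":
    for q in example_queries + [phrase(["surface", "radiation"], slop=2)]:
        print(q)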
Found 18 result(s)
The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for processing, archiving, and distribution of NASA Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC specializes in atmospheric data important to understanding the causes and processes of global climate change and the consequences of human activities on the climate.
The Astromaterials Data System (AstroMat) is a data infrastructure to store, curate, and provide access to laboratory data acquired on samples curated in the Astromaterials Collection of the Johnson Space Center. AstroMat is developed and operated at the Lamont-Doherty Earth Observatory of Columbia University and funded by NASA.
The Astrophysics Source Code Library (ASCL) is a free online registry for source codes of interest to astronomers and astrophysicists and lists codes that have been used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is citable by using the unique ascl ID assigned to each code. The ascl ID can be used to link to the code entry by prefacing the number with ascl.net (i.e., ascl.net/1201.001).
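
As a small illustration of the linking pattern just described, the sketch below builds an entry URL from an ascl ID. The https scheme and the helper name are assumptions for illustration; the text above only gives the ascl.net/1201.001 pattern.

# Minimal sketch: turn an ascl ID into a link to its ASCL entry,
# following the ascl.net/<ID> pattern described above.
# The https scheme and this helper name are assumptions for illustration.
def ascl_url(ascl_id: str) -> str:
    return f"https://ascl.net/{ascl_id}"

if __name__ == "__main__":
    print(ascl_url("1201.001"))  # -> https://ascl.net/1201.001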
The Solar Dynamics Observatory (SDO) studies the solar atmosphere on small scales of space and time, in multiple wavelengths. This is a searchable database of all SDO data, including citizen scientist images, space weather and near real time data, and helioseismology data.
BSRN is a project of the Radiation Panel (now the Data and Assessment Panel) of the Global Energy and Water Cycle Experiment (GEWEX) under the umbrella of the World Climate Research Programme (WCRP). It is the global baseline network for surface radiation for the Global Climate Observing System (GCOS), contributing to the Global Atmosphere Watch (GAW) and forming a cooperative network with the Network for the Detection of Atmospheric Composition Change (NDACC).
Finding planets orbiting nearby stars has been a holy grail in astronomy for more than 400 years. We began working on this problem 30 years ago, at a time when there were no known extrasolar planets. In late 1995 we began routinely finding planets around the nearest stars. Since then we have found several hundred planets, including the first sub-Saturn-mass planet, the first Neptune-mass planet, the first terrestrial-mass planet, the first multiple-planet system, and the first transiting planet.
The Infrared Space Observatory (ISO) is designed to provide detailed infrared properties of selected Galactic and extragalactic sources. The sensitivity of the telescopic system is about one thousand times greater than that of the Infrared Astronomical Satellite (IRAS), since the ISO telescope enables integration of infrared flux from a source for several hours. Density waves in the interstellar medium, its role in star formation, the giant planets, asteroids, and comets of the solar system are among the objects of investigation. ISO was operated as an observatory, with the majority of its observing time distributed to the general astronomical community. One consequence of this is that the data set is not homogeneous, as would be expected from a survey. The observational data underwent sophisticated data processing, including validation and accuracy analysis. In total, the ISO Data Archive contains about 30,000 standard observations, 120,000 parallel, serendipity, and calibration observations, and 17,000 engineering measurements. In addition to the observational data products, the archive also contains satellite data, documentation, historical data, and externally derived products, for a total of more than 400 GBytes stored on magnetic disks. The ISO Data Archive is being continuously improved in both content and functionality throughout the Active Archive Phase, ending in December 2006.
The University of Cape Town (UCT) uses Figshare for Institutions for its data repository, ZivaHub: Open Data UCT, which was launched in 2017. ZivaHub serves principal investigators at the University of Cape Town who need a repository to store and openly disseminate the data that support their published research findings. The repository service is provided in terms of the UCT Research Data Management Policy. It provides open access to supplementary research data files and links to their respective scholarly publications (e.g., theses, dissertations, and papers) hosted on other platforms, such as OpenUCT.
The National High Energy Physics Science Data Center (NHEPSDC) is a repository for high-energy physics. In 2019, it was designated as a national-level scientific data center by the Ministry of Science and Technology of China (MOST). NHEPSDC is constructed and operated by the Institute of High Energy Physics (IHEP) of the Chinese Academy of Sciences (CAS). NHEPSDC consists of a main data center in Beijing, a branch center in the Guangdong-Hong Kong-Macao Greater Bay Area, and a branch center in the Huairou District of Beijing. The mission of NHEPSDC is to provide services for data collection, archiving, long-term preservation, access and sharing, software tools, and data analysis. These services are mainly for high-energy physics and related scientific research activities. The data collected can be roughly divided into two categories: raw data from large scientific facilities, and data generated by general scientific and technological projects (usually supported by government funding), hereafter referred to as generic data. More than 70 people now work at NHEPSDC: 18 in high-energy physics, 17 in computer science, 15 in software engineering, 20 in data management, plus other operations engineers. NHEPSDC is equipped with a hierarchical storage system, high-performance computing power, high-bandwidth domestic and international network links, and a professional service support system. Over the past three years, the average data increment has been about 10 PB per year. By integrating data resources with the IT environment, a state-of-the-art data processing platform is provided to users for scientific research; more than 400 PB of data are accessed every year, across more than 10 million visits.
The Deep Carbon Observatory (DCO) is a global community of multi-disciplinary scientists unlocking the inner secrets of Earth through investigations into life, energy, and the fundamentally unique chemistry of carbon. The Deep Carbon Observatory Digital Object Registry (“DCO-VIVO”) is a centrally managed digital object identification, registration, and metadata management service for the DCO. Digital object registration includes DCO-ID generation based on the global Handle System infrastructure and metadata collection using VIVO. Users will be able to deposit their data into the DCO Data Repository and have that data discoverable and accessible by others.
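
As a rough illustration of Handle-based identification, the sketch below queries the public Handle System HTTP proxy (hdl.handle.net) for a handle record. The API path is the proxy's documented JSON interface, while the handle value and the helper name are hypothetical placeholders, not actual DCO-IDs or DCO tooling.

# Minimal sketch, assuming the public Handle System HTTP proxy JSON API
# (GET https://hdl.handle.net/api/handles/<handle>). The handle below is a
# hypothetical placeholder, not a real DCO-ID.
import json
import urllib.request
from urllib.error import HTTPError

def resolve_handle(handle):
    """Look up a handle record via the Handle proxy's JSON API."""
    url = f"https://hdl.handle.net/api/handles/{handle}"
    try:
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)
    except HTTPError as err:
        print(f"Handle {handle!r} could not be resolved (HTTP {err.code})")
        return None

if __name__ == "__main__":
    record = resolve_handle("11121/EXAMPLE-DCO-ID")  # placeholder handle value
    if record:
        for value in record.get("values", []):
            print(value.get("type"), "->", value.get("data", {}).get("value"))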
The ASTER Project consists of two parts, each having a Japanese and a U.S. component. Mission operations are split between Japan Space Systems (J-spacesystems) and the Jet Propulsion Laboratory (JPL) in the U.S. J-spacesystems oversees monitoring instrument performance and health, developing the daily schedule command sequence, processing Level 0 data to Level 1, and providing higher-level data processing, archiving, and distribution. The JPL ASTER project provides scheduling support for U.S. investigators, calibration and validation of the instrument and data products, coordination of the U.S. Science Team, and maintenance of the science algorithms. The joint Japan/U.S. ASTER Science Team has about 40 scientists and researchers. Data are accessible via NASA Reverb, the ASTER Japan site, EarthExplorer, GloVis, GDEx, and LP DAAC; see https://asterweb.jpl.nasa.gov/data.asp. In addition, data are available through the newly implemented ASTER Volcano Archive (AVA) at https://ava.jpl.nasa.gov/.
CERN, DESY, Fermilab and SLAC have built the next-generation High Energy Physics (HEP) information system, INSPIRE. It combines the successful SPIRES database content, curated at DESY, Fermilab and SLAC, with the Invenio digital library technology developed at CERN. INSPIRE is run by a collaboration of CERN, DESY, Fermilab, IHEP, IN2P3 and SLAC, and interacts closely with HEP publishers, arXiv.org, NASA-ADS, PDG, HEPDATA and other information resources. INSPIRE represents a natural evolution of scholarly communication, built on successful community-based information systems, and provides a vision for information management in other fields of science.
Under the World Climate Research Programme (WCRP), the Working Group on Coupled Modelling (WGCM) established the Coupled Model Intercomparison Project (CMIP) as a standard experimental protocol for studying the output of coupled atmosphere-ocean general circulation models (AOGCMs). CMIP provides a community-based infrastructure in support of climate model diagnosis, validation, intercomparison, documentation, and data access. This framework enables a diverse community of scientists to analyze GCMs in a systematic fashion, a process which serves to facilitate model improvement. Virtually the entire international climate modeling community has participated in this project since its inception in 1995. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) archives much of the CMIP data and provides other support for CMIP. We are now beginning the process towards the IPCC Fifth Assessment Report and, with it, the CMIP5 intercomparison activity. The CMIP5 (CMIP Phase 5) experiment design has been finalized with the following suites of experiments: (I) decadal hindcast and prediction simulations, (II) "long-term" simulations, and (III) "atmosphere-only" (prescribed SST) simulations for especially computationally demanding models. The new ESGF peer-to-peer (P2P) enterprise system (http://pcmdi9.llnl.gov) is now the official site for CMIP5 model output. The old gateway (http://pcmdi3.llnl.gov) is deprecated and has been shut down permanently.
The CALIPSO satellite provides new insight into the role that clouds and atmospheric aerosols play in regulating Earth's weather, climate, and air quality. CALIPSO combines an active lidar instrument with passive infrared and visible imagers to probe the vertical structure and properties of thin clouds and aerosols over the globe. CALIPSO was launched on April 28, 2006, with the CloudSat satellite. CALIPSO and CloudSat are highly complementary and together provide new, never-before-seen 3D perspectives of how clouds and aerosols form, evolve, and affect weather and climate. CALIPSO and CloudSat fly in formation with three other satellites in the A-train constellation to enable an even greater understanding of our climate system.
Edmond is the institutional repository of the Max Planck Society for public research data. It enables Max Planck scientists to create citable scientific assets by describing, enriching, sharing, exposing, linking, publishing, and archiving research data of all kinds. Furthermore, all objects within Edmond have a unique identifier and can therefore be clearly referenced in publications or reused in other contexts.