  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) can be used to group terms and set priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
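These operators can be combined into compound queries. The examples below are hypothetical illustrations of the syntax above; the search terms are arbitrary and not taken from the catalogue.

    # Hypothetical example queries illustrating the operators above.
    example_queries = [
        'climat*',                     # wildcard: matches climate, climatology, ...
        '"cosmic ray"',                # exact phrase
        'geomagnetism + india',        # AND (also the default between terms)
        'lidar | radar',               # OR
        'nuclear -weapons',            # NOT
        '(ocean | marine) + biology',  # parentheses group terms and set priority
        'astro~2',                     # fuzzy term, edit distance 2
        '"star formation"~3',          # phrase with a slop of 3
    ]
    for q in example_queries:
        print(q)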
Found 98 result(s)
The aim of the KCDC project (KASCADE Cosmic Ray Data Centre) is to install and establish a public data centre for high-energy astroparticle physics based on data from the KASCADE experiment. KASCADE was a very successful large detector array that recorded data for more than 20 years on the site of KIT Campus North, Karlsruhe, Germany (formerly Forschungszentrum Karlsruhe) at 49.1°N, 8.4°E, 110 m a.s.l. Over its lifetime KASCADE collected more than 1.7 billion events, of which some 433 million survived all quality cuts. Initially, about 160 million events are available here for public use.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, it is a "dark archive" without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich's Research Collection, which is the primary repository for members of the university and the first point of contact for publishing data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open-source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
sciencedata.dk is a research data store provided by DTU, the Technical University of Denmark, specifically aimed at researchers and scientists at Danish academic institutions. The service is intended for working with and sharing active research data as well as for the safekeeping of large datasets. The data can be accessed and manipulated via a web interface, synchronization clients, file transfer clients or the command line. The service is built on open-source software from the ground up: FreeBSD, ZFS, Apache, PHP, ownCloud/Nextcloud. DTU is actively engaged in community efforts to develop research-specific functionality for data stores. The servers are attached directly to the 10-gigabit backbone of "Forskningsnettet" (the National Research and Education Network of Denmark), so upload and download speeds from Danish academic institutions are, in principle, comparable to those of an external USB hard drive. It is a data store for research data that allows private sharing and sharing via links / persistent URLs.
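Because the service is built on ownCloud/Nextcloud, command-line and scripted access typically goes through a WebDAV endpoint. The sketch below is a minimal illustration only: the endpoint path and credentials are placeholders modelled on a standard Nextcloud-style deployment, not values documented by sciencedata.dk.

    import requests

    # Placeholder WebDAV endpoint and credentials (assumptions, not the
    # documented sciencedata.dk values).
    BASE = "https://sciencedata.dk/remote.php/webdav"
    AUTH = ("user@institution.dk", "app-password")

    # List the top-level folder with a WebDAV PROPFIND request.
    listing = requests.request("PROPFIND", BASE + "/", auth=AUTH, headers={"Depth": "1"})
    listing.raise_for_status()
    print(listing.text)  # raw XML multistatus listing

    # Upload a file with a plain HTTP PUT.
    with open("results.csv", "rb") as fh:
        requests.put(BASE + "/results.csv", data=fh, auth=AUTH).raise_for_status()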
More than 25 years ago, FIZ Karlsruhe started depositing crystal structure data linked to publications in German journals. At that time it was irrelevant whether the deposited structures were organic or inorganic. Today, FIZ Karlsruhe is responsible for storing the structure data of inorganic compounds, while organic structure data are stored by the Cambridge Crystallographic Data Centre. Nowadays, many publishers inform their authors that, in parallel to publication in a scientific journal, crystal structure data should also be deposited in the Crystal Structure Depot at FIZ Karlsruhe. A CSD number is assigned to the data for later reference in the publication, and the data can then be ordered from the Crystal Structure Depot at FIZ Karlsruhe.
Nuclear Data Services contains atomic, molecular and nuclear data sets for the development and maintenance of nuclear technologies. It includes energy-dependent reaction probabilities (cross sections), the energy and angular distributions of reaction products for many combinations of target and projectile, and the atomic and nuclear properties of excited states and their radioactive decay data. Its main purpose is to provide the data required to design a modern nuclear reactor for electricity production. Approximately 11.5 million nuclear data points have been measured and compiled into computerized form.
SMOKA provides public science data obtained at the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, the 105 cm Schmidt telescope at Kiso Observatory (University of Tokyo), MITSuME, and the KANATA Telescope at Higashi-Hiroshima Observatory. It is intended mainly for astronomical researchers.
The Australian Antarctic Data Centre (AADC) provides data collection and data management services for Australia's Antarctic Science Program. The AADC manages science data from Australia's Antarctic research, maps Australia's areas of interest in the Antarctic region, manages Australia's Antarctic state-of-the-environment reporting, and provides advice, education, and a range of other products.
The TDB project aims to produce a database that: contains data for all the elements of interest in radioactive waste disposal systems; documents why and how the data were selected; gives recommendations based on original experimental data, rather than on compilations and estimates; documents the sources of experimental data used; is internally consistent; and treats all solids and aqueous species of the elements of interest for nuclear waste storage performance assessment calculations. The database compiles formation data (Gibbs energies, enthalpies, entropies and heat capacities) for each aqueous species and solid phase of interest, as well as chemical reactions and their corresponding thermodynamic data. Non-thermodynamic data (diffusion or kinetics) and sorption data are not considered in the TDB project.
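Internal consistency in this context means that the selected values obey the standard thermodynamic relations, so the formation data and the reaction data determine each other. In standard notation (these are general thermodynamic identities, not TDB-specific conventions):

    \[
    \Delta_r G^\circ = \sum_i \nu_i\,\Delta_f G^\circ(i),
    \qquad
    \Delta_r G^\circ = \Delta_r H^\circ - T\,\Delta_r S^\circ,
    \qquad
    \log_{10} K = -\frac{\Delta_r G^\circ}{RT\ln 10}
    \]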
The CSSDP project provides space scientists with access to a wide range of space data, observations, and investigative tools. It provides a seamless, single point of access to these resources through a custom web portal. To date, more than 350 scientists are registered users of the CSSDP portal. The project integrates data from sources such as the Canadian Geospace Monitoring Program and anticipates serving data from the NASA THEMIS satellite probes, the Canadian High-Arctic Ionospheric Network (CHAIN), and the Alberta-based Enhanced Polar Outflow Probe (ePOP) satellite mission. This collection and presentation of space data is used to study the influence of the sun on the near-Earth space environment, including phenomena such as geomagnetic storms, which cause the northern and southern lights. Geomagnetic storms are also known for causing power outages, disturbances in polar communications, and the failure of satellites. The effects of space weather can also cause transpolar flight paths to be diverted, adding significant fuel costs for airlines and disruptions for travellers.
The World Data Centre for Geomagnetism, Mumbai is part of the Indian Institute of Geomagnetism, an autonomous research institute under the Department of Science and Technology, Government of India. The Centre has been part of the ICSU World Data Centre System since 1971. It has collected a comprehensive set of analog and digital geomagnetic data, as well as indices of geomagnetic activity, supplied by a worldwide network of magnetic observatories.
The CALIPSO satellite provides new insight into the role that clouds and atmospheric aerosols play in regulating Earth's weather, climate, and air quality. CALIPSO combines an active lidar instrument with passive infrared and visible imagers to probe the vertical structure and properties of thin clouds and aerosols over the globe. CALIPSO was launched on April 28, 2006, with the CloudSat satellite. CALIPSO and CloudSat are highly complementary and together provide new, never-before-seen 3D perspectives of how clouds and aerosols form, evolve, and affect weather and climate. CALIPSO and CloudSat fly in formation with three other satellites in the A-train constellation to enable an even greater understanding of our climate system.
The Data Bank operates a computer program service related to nuclear energy applications. The software library collects programs, compiles and verifies them in an appropriate computer environment, ensuring that each computer program package is complete and adequately documented. This collection of material contains more than 2,000 documented packages and group cross-section data sets. The codes are distributed on CD-ROM, DVD and via electronic transfer to about 900 nominated NEA Data Bank establishments (see the rules for requesters). Standard software verification procedures are used, following an ANSI/ANS standard.
SciCat allows users to access the metadata of raw and derived data taken at experimental facilities. Scientific datasets are linked to proposals and samples, and can be linked to publications (DOI, PID). SciCat helps keep track of data provenance (i.e. the steps leading to the final results) and allows users to find data based on metadata (both your own data and other people's public data). In the long term, SciCat will help to automate scientific analysis workflows.
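Metadata search in SciCat is exposed through a REST API in typical deployments. The snippet below is an illustrative sketch only: the instance URL, endpoint path and filter syntax are assumptions modelled on a generic SciCat installation and should be checked against the facility's own API documentation.

    import json
    import requests

    # Hypothetical SciCat instance (placeholder URL).
    BASE = "https://scicat.example.org/api/v3"

    # Ask for a few published datasets; the LoopBack-style "filter"
    # parameter used here is an assumption about the deployed backend.
    query = {"where": {"isPublished": True}, "limit": 5}
    resp = requests.get(BASE + "/Datasets", params={"filter": json.dumps(query)})
    resp.raise_for_status()

    for dataset in resp.json():
        print(dataset.get("pid"), dataset.get("datasetName"))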
The WDC is concerned with the collection, management, distribution and utilization of data from Chinese provinces, autonomous regions and counties, including: resource data (management, distribution and utilization of land, water, climate, forest, grassland, minerals, energy, etc.); environmental data (pollution, environmental quality, change, natural disasters, soil erosion, etc.); biological resources (animals, plants, wildlife); social economy (agriculture, industry, transport, commerce, infrastructure, etc.); population and labor; and geographic background data at scales of 1:4M, 1:1M, 1:(1/2)M, 1:2500, etc.
The CRC1211DB is the project database of the Collaborative Research Centre 1211 "Earth – Evolution at the Dry Limit" (CRC1211, https://sfb1211.uni-koeln.de/), funded by the German Research Foundation (DFG, project number 268236062). The project database is a new implementation of the TR32DB and has been online since 2016. It handles all data, including metadata, created by the participating project members from several institutions (e.g. the Universities of Cologne, Bonn and Aachen, and the Research Centre Jülich) and research fields (e.g. soil and plant sciences, biology, geography, geology, meteorology and remote sensing). The data result from several field measurement campaigns, meteorological monitoring, remote sensing, laboratory studies and modelling approaches. Furthermore, outcomes of the scientists, such as publications, conference contributions, PhD reports and corresponding images, are collected.
The main goal of the CLUES project is to provide constrained simulations of the local universe, designed to be used as a numerical laboratory for the current paradigm. The simulations will be used for unprecedented analysis of the complex dark matter and gas-dynamical processes which govern the formation of galaxies. The predictions of these experiments can be easily compared with detailed observations of our galactic neighborhood. Some of the CLUES data is now publicly available via the CosmoSim database (https://www.cosmosim.org/). This includes AHF halo catalogues from the Box 64, WMAP3 resimulations of the Local Group with 4096³ particle resolution.
The repository is offline. The Space Physics Interactive Data Resource from NOAA's National Geophysical Data Center allows solar-terrestrial physics customers to intelligently access and manage historical space physics data for integration with environment models and space weather forecasts.
The ASTER Volcano Archive (AVA) is the world's largest specialty archive of volcano data. For the 1,549 recently active volcanoes listed by the Smithsonian Global Volcanism Program, the AVA has collected the entirety of high-resolution multispectral ASTER data and made it available to the public. Also included are digital elevation maps, NOAA ash advisories, alteration zone imagery, and thermal anomaly reports. Landsat 7 data are also being processed.
The Astronomical Data Archives Center (ADAC) provides access to astronomical data from all over the world, with links to online data catalogs, journal archives, imaging services and data archives. Users can access the VizieR catalogue service as well as the Hubble Ultra Deep Field data by requesting password access. ADAC also provides access to the SMOKA public science data obtained with the Subaru Telescope in Hawaii, the Schmidt telescope at the University of Tokyo, MITSuME, and the KANATA Telescope at Higashi-Hiroshima Observatory. Users may need to contact the ADAC for password access or create user accounts for the various data services accessible through the ADAC site.
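For the VizieR catalogue service mentioned above, one common programmatic route is the astroquery package. The example below is a generic illustration: the keyword and the catalogue identifier are arbitrary choices, not ADAC-specific endpoints.

    from astroquery.vizier import Vizier

    # Find VizieR catalogues whose description mentions a keyword.
    catalogues = Vizier.find_catalogs("Ultra Deep Field")
    for name, resource in catalogues.items():
        print(name, resource.description)

    # Query catalogues around a named object, keeping the output small.
    Vizier.ROW_LIMIT = 10
    tables = Vizier.query_object("M31", catalog="II/246")  # 2MASS point sources
    for table in tables:
        print(table[:3])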
The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is a team of researchers, data specialists and computer system developers supporting the development of a data management system to store scientific data generated by Gulf of Mexico researchers. The Master Research Agreement between BP and the Gulf of Mexico Alliance that established the Gulf of Mexico Research Initiative (GoMRI) included provisions that all data collected or generated through the agreement must be made available to the public. GRIIDC is the vehicle through which GoMRI is fulfilling this requirement. The mission of GRIIDC is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem.
Herschel has been designed to observe the 'cool universe': it is observing structure formation in the early universe, resolving the far-infrared cosmic background, revealing the cosmologically evolving AGN/starburst symbiosis and galaxy evolution at the epochs when most stars in the universe were formed, unveiling the physics and chemistry of the interstellar medium and its molecular clouds, the wombs of the stars, and unravelling the mechanisms governing the formation and evolution of stars and their planetary systems, including our own solar system, putting it into context. In short, Herschel is opening a new window on how the universe has evolved to become the universe we see today, and how our star the Sun, our planet the Earth, and we ourselves fit in.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices. Document and archive studies: move the organization and management of study materials from the desktop into the cloud, so labs can organize, share, and archive study materials among team members; web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing. Share and find materials: with a click, make study materials public so that other researchers can find, use and cite them, and find materials by other researchers to avoid reinventing something that already exists. Detail individual contribution: assign citable contributor credit to any research material - tools, analysis scripts, methods, measures, data. Increase transparency: make as much of the scientific workflow public as desired, as it is developed or after publication of reports. Registration: registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection. Manage scientific workflow: a structured, flexible system can provide efficiency gains to the workflow and clarity to project objectives.
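Public OSF projects can also be reached programmatically through the OSF v2 JSON API; unauthenticated requests only see public material. The filter and pagination parameters below are assumptions to be verified against the current API documentation.

    import requests

    API = "https://api.osf.io/v2"

    # Search public projects (nodes) by a title keyword.
    resp = requests.get(
        API + "/nodes/",
        params={"filter[title]": "replication", "page[size]": 5},
    )
    resp.raise_for_status()

    for node in resp.json()["data"]:
        attrs = node["attributes"]
        print(node["id"], attrs["title"], attrs.get("date_created"))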
The COordinated Molecular Probe Line Extinction Thermal Emission Survey of Star Forming Regions (COMPLETE) provides a range of data complementary to the Spitzer Legacy Program "From Molecular Cores to Planet Forming Disks" (c2d) for the Perseus, Ophiuchus and Serpens regions. In combination with the Spitzer observations, COMPLETE will allow for detailed analysis and understanding of the physics of star formation on scales from 500 A.U. to 10 pc.
The ODIN Portal hosts scientific databases in the domains of structural materials and hydrogen research and is operated on behalf of the European energy research community by the Joint Research Centre, the European Commission's in-house science service providing independent scientific advice and support to policies of the European Union. ODIN contains engineering databases (Mat-Database, Hiad-Database, Nesshy-Database, HTR-Fuel-Database, HTR-Graphit-Database) and document management sites and other information related to European research in the area of nuclear and conventional energy.