Filter

Subjects

Content Types

Countries

AID systems

API

Certificates

Data access

Data access restrictions

Database access

Database access restrictions

Database licenses

Data licenses

Data upload

Data upload restrictions

Enhanced publication

Institution responsibility type

Institution type

Keywords

Metadata standards

PID systems

Provider types

Quality management

Repository languages

Software

Syndications

Repository types

Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
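The operators above can be combined into full query strings. The sketch below illustrates each operator with hypothetical example terms; the `queries` mapping is purely illustrative and not part of the registry's interface.

```python
# Illustrative query strings for the search operators listed above.
# The example search terms are hypothetical; only the operator syntax
# follows the rules in the list.
queries = {
    "wildcard": "climat*",                  # matches climate, climatology, ...
    "phrase":   '"sediment core"',          # exact phrase match
    "and":      "ocean + sediment",         # both terms required (default)
    "or":       "geomagnetism | magnetism", # either term may match
    "not":      "data -software",           # exclude a term
    "grouping": "(ocean | marine) + core",  # parentheses set precedence
    "fuzzy":    "hydrografy~1",             # edit distance 1 (fuzziness)
    "slop":     '"coupled model"~2',        # phrase words within 2 positions
}

for name, query in queries.items():
    print(f"{name:8s} {query}")
```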
Found 260 result(s)
The British Ocean Sediment Core Research Facility (BOSCORF) is based at the Southampton site of the National Oceanography Centre and is Britain’s national deep-sea core repository. BOSCORF is responsible for long-term storage and curation of sediment cores collected through UKRI-NERC research programmes. We promote secondary usage of sediment core samples and analytical data relating to the sample collection.
ROHub is a holistic solution for the storage, lifecycle management and preservation of scientific investigations, campaigns and operational processes via research objects. It makes these resources available to others, allows users to publish and release them through a DOI, and supports the discovery and reuse of pre-existing scientific knowledge. Built entirely around the research object concept and inspired by sustainable software management principles, ROHub is the reference platform natively implementing the full research object model and paradigm, providing the backbone to a wealth of RO-centric applications and interfaces across different scientific communities.
ITESO's Institutional Repository (ReI) is a digital repository that integrates the university's academic output, managing, preserving and making available in open access the works of researchers, professors and students of the university.
The UK Hydrographic Office (UKHO) is a world-leading centre for hydrography, specialising in marine geospatial data to support safe, secure and thriving oceans. UK Hydrographic Office Bathymetry Data Archive Centre (UKHO DAC) is the UK national repository for bathymetry data. It is provided by the UK Hydrographic Office (UKHO) as part of the wider Marine Environmental Data and Information Network (MEDIN) https://medin.org.uk/. The UKHO DAC holds bathymetry data assets from a wide range of sources – Government funded, commercial, environmental and defence. The ADMIRALTY Marine Data Portal https://www.gov.uk/guidance/inspire-portal-and-medin-bathymetry-data-archive-centre provides access to marine data sets held by the UK Hydrographic Office within the UK Exclusive Economic Zone (EEZ).
The demand for high-value environmental data and information has dramatically increased in recent years. To improve our ability to meet that demand, NOAA's three former data centers—the National Climatic Data Center, the National Geophysical Data Center, and the National Oceanographic Data Center, which includes the National Coastal Data Development Center—have merged into the National Centers for Environmental Information (NCEI). The National Coastal Data Development Center, a division of the National Oceanographic Data Center, is dedicated to building the long-term coastal data record to support environmental prediction, scientific analysis, and formulation of public policy.
The Data Library and Archives (DLA) is part of the joint library system supported by the Marine Biological Laboratory and the Woods Hole Oceanographic Institution. The DLA holds collections of administrative records, photographs, scientists' data and papers, film and video, historical instruments, as well as books, journals and technical reports.
SISSA Open Data is the SISSA repository for research data management. It is an institutional repository that captures, stores, preserves, and redistributes the data of the SISSA scientific community in digital form. SISSA Open Data is managed by the SISSA Library as a service to the SISSA scientific community.
The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and applied nuclear technologies. The NNDC is a worldwide resource for nuclear data. The information available to the users of NNDC services is the product of the combined efforts of the NNDC and cooperating data centers and other interested groups, both in the United States and worldwide. The NNDC specializes in the following areas: - Nuclear structure and low-energy nuclear reactions - Nuclear databases and information technology - Nuclear data compilation and evaluation
We are moving content from our Hydra repository (hydra.hull.ac.uk) to new repositories. Hydra is a repository for digital materials at the University of Hull. It can hold and manage any type of digital material, and is being developed in response to the growth in the amount of digital material that is generated through the research, education and administrative activities within the University. Hydra contains different collections of datasets from University of Hull research projects, such as ARCdoc, the Domesday dataset, the History of Marine Animal Populations (HMAP) and others.
The repository is offline. The Space Physics Interactive Data Resource from NOAA's National Geophysical Data Center allows solar-terrestrial physics customers to intelligently access and manage historical space physics data for integration with environment models and space weather forecasts.
As a World Data Center for Geomagnetism, the WDC collects geomagnetic data from all over the globe and distributes those data to researchers and data users.
Under the World Climate Research Programme (WCRP) the Working Group on Coupled Modelling (WGCM) established the Coupled Model Intercomparison Project (CMIP) as a standard experimental protocol for studying the output of coupled atmosphere-ocean general circulation models (AOGCMs). CMIP provides a community-based infrastructure in support of climate model diagnosis, validation, intercomparison, documentation and data access. This framework enables a diverse community of scientists to analyze GCMs in a systematic fashion, a process which serves to facilitate model improvement. Virtually the entire international climate modeling community has participated in this project since its inception in 1995. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) archives much of the CMIP data and provides other support for CMIP. We are now beginning the process towards the IPCC Fifth Assessment Report and with it the CMIP5 intercomparison activity. The CMIP5 (CMIP Phase 5) experiment design has been finalized with the following suites of experiments: (I) decadal hindcast and prediction simulations, (II) "long-term" simulations, and (III) "atmosphere-only" (prescribed-SST) simulations for especially computationally demanding models. The new ESGF peer-to-peer (P2P) enterprise system (http://pcmdi9.llnl.gov) is now the official site for CMIP5 model output. The old gateway (http://pcmdi3.llnl.gov) is deprecated and now shut down permanently.
The Coriolis Data Centre handles operational oceanography measurements made in situ, complementing the measurement of the ocean surface made using instruments aboard satellites. This work is realised through the establishment of permanent networks with data collected by ships or autonomous systems that are either fixed or drifting. This data can be used to construct a snapshot of water mass structure and current intensity.
The DCS allows you to search a catalogue of metadata (information describing data) to discover and gain access to NERC's data holdings and information products. The metadata are prepared to a common NERC Metadata Standard and are provided to the catalogue by the NERC Data Centres.
Chapman University Digital Commons is an open access digital repository and publication platform designed to collect, store, index, and provide access to the scholarly and creative output of Chapman University faculty, students, staff, and affiliates. In it are faculty research papers and books, data sets, outstanding student work, audiovisual materials, images, special collections, and more, all created by members of or owned by Chapman University. The datasets are listed in a separate collection.
WISER is a self-service platform for data of the Global Networks of Isotopes in Precipitation (GNIP) and Rivers (GNIR), hosted within the IAEA's repository for technical resources (NUCLEUS). GNIP in WISER currently contains over 130,000 records; stable isotope data are current to the end of 2013 and will be updated as verified data come in. Parts of the GNIR water isotope data (synoptic/time series) are online as well, although we are still in the process of verifying and completing GNIR data uploads, and uploads for other isotopic parameters, over the next year. Check back occasionally for GNIR updates. Tritium data after 2009 are in the process of being updated over the next year. When accessing WISER through the URL https://nucleus.iaea.org/wiser, you will be forwarded to the NUCLEUS log-in page. After entering your user credentials and validation, you will be forwarded to the WISER landing page.
Europeana is the trusted source of cultural heritage brought to you by the Europeana Foundation and a large number of European cultural institutions, projects and partners. It's a real piece of teamwork. Ideas and inspiration can be found within the millions of items on Europeana. These objects include: images (paintings, drawings, maps, photos and pictures of museum objects); texts (books, newspapers, letters, diaries and archival papers); sounds (music and spoken word from cylinders, tapes, discs and radio broadcasts); and videos (films, newsreels and TV broadcasts). All texts are CC BY-SA; images and media are licensed individually.
With the creation of the Metabolomics Data Repository managed by the Data Repository and Coordination Center (DRCC), the NIH acknowledges the importance of data sharing for metabolomics. Metabolomics represents the systematic study of low molecular weight molecules found in a biological sample, providing a "snapshot" of the current and actual state of the cell or organism at a specific point in time. Thus, the metabolome represents the functional activity of biological systems. As with other 'omics', metabolites are conserved across animal, plant and microbial species, facilitating the extrapolation of research findings in laboratory animals to humans. Common technologies for measuring the metabolome include mass spectrometry (MS) and nuclear magnetic resonance spectroscopy (NMR), which can measure hundreds to thousands of unique chemical entities. Data sharing in metabolomics will include primary raw data and the biological and analytical metadata necessary to interpret these data. Through cooperation between investigators, metabolomics laboratories and data coordinating centers, these data sets should provide a rich resource for the research community to enhance preclinical, clinical and translational research.
The repository is no longer available; for the print version see: https://www.taylorfrancis.com/books/mono/10.1201/9781003220435/encyclopedia-astronomy-astrophysics-murdin This unique resource covers the entire field of astronomy and astrophysics, and this online version includes the full text of over 2,750 articles, plus sophisticated search and retrieval functionality and links to the primary literature, and is frequently updated with new material. An active editorial team, headed by the Encyclopedia's editor-in-chief, Paul Murdin, oversees the continual commissioning, reviewing and loading of new and revised content. In a unique collaboration, Nature Publishing Group and Institute of Physics Publishing published the most extensive and comprehensive reference work in astronomy and astrophysics in both print and online formats. First published as a four-volume print edition in 2001, the initial Web version went live in 2002; it contained the original print material and was rapidly supplemented with numerous updates and newly commissioned material. Since July 2006 the Encyclopedia has been published solely by Taylor & Francis.
Herschel has been designed to observe the 'cool universe': it is observing structure formation in the early universe, resolving the far-infrared cosmic background, revealing the cosmologically evolving AGN/starburst symbiosis and galaxy evolution at the epochs when most stars in the universe were formed, unveiling the physics and chemistry of the interstellar medium and its molecular clouds, the wombs of the stars, and unravelling the mechanisms governing the formation and evolution of stars and their planetary systems, including our own solar system, putting it into context. In short, Herschel is opening a new window to study how the universe has evolved to become the universe we see today, and how our star the Sun, our planet the Earth, and we ourselves fit in.
The NCI National Research Data Collection is Australia’s largest collection of research data, encompassing more than 10 PB of nationally and internationally significant datasets.
The NCAR is a federally funded research and development center committed to research and education in atmospheric science and related scientific fields. NCAR seeks to support and enhance the scientific community nationally and globally by monitoring and researching the atmosphere and related physical and biological systems. Users can access climate and earth models created to better understand the atmosphere, the Earth and the Sun; as well as data from various NCAR research programs and projects. NCAR is sponsored by the National Science Foundation in addition to various other U.S. agencies.
The Registry of Open Data on AWS provides a centralized repository of public data sets that can be seamlessly integrated into AWS cloud-based applications. AWS is hosting the public data sets at no charge to their users. Anyone can access these data sets from their Amazon Elastic Compute Cloud (Amazon EC2) instances and start computing on the data within minutes. Users can also leverage the entire AWS ecosystem and easily collaborate with other AWS users.