

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to control precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
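The operators above can be combined into a single query string. The small helpers below are purely illustrative (they are not part of any real search client for this site); they just show how the wildcard, phrase, and fuzziness notations compose:

```python
# Hypothetical helpers for building query strings in the syntax
# described above. The function names are this sketch's own; only
# the operator notation (*, "", +, |, -, parentheses, ~N) comes
# from the search help text.

def phrase(text: str) -> str:
    """Quote a multi-word phrase for exact matching."""
    return f'"{text}"'

def fuzzy(word: str, distance: int = 1) -> str:
    """Append ~N to request an edit-distance (fuzziness) match."""
    return f"{word}~{distance}"

def wildcard(prefix: str) -> str:
    """Append * for wildcard (prefix) matching."""
    return f"{prefix}*"

# + is AND (the default), | is OR, - is NOT,
# parentheses control precedence:
query = (
    f"({phrase('climate data')} | {wildcard('ocean')}) "
    f"+ {fuzzy('repository', 2)} - restricted"
)
print(query)
# ("climate data" | ocean*) + repository~2 - restricted
```
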
Found 41 result(s)
This repository is no longer available. The programme "International Oceanographic Data and Information Exchange" (IODE) of the "Intergovernmental Oceanographic Commission" (IOC) of UNESCO was established in 1961. Its purpose is to enhance marine research, exploitation and development by facilitating the exchange of oceanographic data and information between participating Member States, and by meeting the needs of users for data and information products.
The ILO Department of Statistics is the focal point to the United Nations on labour statistics. It develops international standards for better measurement of labour issues and enhanced international comparability; provides relevant, timely and comparable labour statistics; and helps Member States develop and improve their labour statistics.
coastDat is a model-based data bank developed mainly for the assessment of long-term changes in data-sparse regions. A sequence of numerical models is employed to reconstruct all aspects of marine climate (such as storms, waves and surges) over many decades, relying only on large-scale information such as large-scale atmospheric conditions or bathymetry.
The CBU Dataverse is a research data repository for Cape Breton University. Files are held securely on Canadian servers and can be made openly accessible to further research, gain citations and promote our world-class research.
Our research focuses mainly on past and present bio- and geodiversity and the evolution of animals and plants. The Information Technology Center of the Staatliche Naturwissenschaftliche Sammlungen Bayerns (SNSB) is the institutional repository for the scientific data of the SNSB. Its major tasks focus on the management of bio- and geodiversity data using different kinds of information-technology infrastructure. The facility guarantees sustainable curation, storage, archiving and provision of such data.
The University of Toronto Dataverse is a research data repository for our faculty, students, and staff. Files are held in a secure environment on Canadian servers. Researchers can choose to make content available publicly, to specific individuals, or to restrict access.
mentha archives evidence collected from different sources and presents these data in a complete and comprehensive way. Its data come from manually curated protein-protein interaction databases that adhere to the IMEx consortium. The aggregated data form an interactome that includes many organisms. mentha offers a series of tools to analyse selected proteins in the context of a network of interactions. Protein interaction databases archive protein-protein interaction (PPI) information from published articles; however, no single database has sufficient literature coverage to offer a complete resource for investigating "the interactome". mentha's approach generates a consistent interactome (graph) every week; most importantly, the procedure assigns each interaction a reliability score that takes into account all the supporting evidence. mentha offers eight interactomes (Homo sapiens, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Escherichia coli K12, Mus musculus, Rattus norvegicus, Saccharomyces cerevisiae) plus a global network that comprises every organism, including those not listed. The website and the graphical application are designed to make the data stored in mentha accessible to and analysable by all users. Source databases are MINT, IntAct, DIP, MatrixDB and BioGRID.
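The aggregation step described above — merging interactions reported by several source databases and scoring each one by its supporting evidence — can be sketched as follows. The scoring rule here (fraction of source databases supporting the pair) is a hypothetical stand-in, not mentha's actual weighting scheme:

```python
# Minimal sketch of evidence aggregation for a scored interactome.
# The per-source scoring below is illustrative only; mentha's real
# reliability score is more elaborate.
from collections import defaultdict

N_SOURCES = 5  # MINT, IntAct, DIP, MatrixDB, BioGRID

def build_interactome(records):
    """records: iterable of (protein_a, protein_b, source_db) tuples.

    Returns {(protein, protein): score} with scores in (0, 1].
    """
    evidence = defaultdict(set)
    for a, b, source in records:
        pair = tuple(sorted((a, b)))  # interactions are undirected
        evidence[pair].add(source)
    # Hypothetical score: fraction of source databases reporting the pair.
    return {pair: len(sources) / N_SOURCES
            for pair, sources in evidence.items()}

records = [
    ("TP53", "MDM2", "IntAct"),
    ("MDM2", "TP53", "BioGRID"),   # same pair, second independent source
    ("TP53", "EP300", "MINT"),
]
scores = build_interactome(records)
print(scores[("MDM2", "TP53")])   # 0.4 — two of five sources agree
```
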
The Square Kilometre Array (SKA) is a radio telescope with around one million square metres of collecting area, designed to study the Universe with unprecedented speed and sensitivity. The SKA is not a single telescope, but a collection of various types of antennas, called an array, to be spread over long distances. The SKA will be used to answer fundamental questions of science and about the laws of nature, such as: how did the Universe, and the stars and galaxies contained in it, form and evolve? Was Einstein’s theory of relativity correct? What is the nature of ‘dark matter’ and ‘dark energy’? What is the origin of cosmic magnetism? Is there life somewhere else in the Universe?
ICRISAT performs crop improvement research, using conventional methods as well as methods derived from biotechnology, on the following crops: chickpea, pigeonpea, groundnut, pearl millet, sorghum and small millets. ICRISAT's data repository collects, preserves and facilitates access to the datasets produced by ICRISAT researchers for all interested users. Data include phenotypic, genotypic, social science, spatial, soil and weather data.
The Humanitarian Data Exchange (HDX) is an open platform for sharing data across crises and organisations. Launched in July 2014, the goal of HDX is to make humanitarian data easy to find and use for analysis. HDX is managed by OCHA's Centre for Humanitarian Data, which is located in The Hague. OCHA is part of the United Nations Secretariat and is responsible for bringing together humanitarian actors to ensure a coherent response to emergencies. The HDX team includes OCHA staff and a number of consultants who are based in North America, Europe and Africa.
The mission of the World Data Center for Climate (WDCC) is to provide central support for the German and European climate research community. The WDCC is a member of the ISC's World Data System. Emphasis is on the development and implementation of best-practice methods for Earth system data management. Data for and from climate research are collected, stored and disseminated. The WDCC is restricted to data products. Cooperation exists with thematically related data centres in, e.g., earth observation, meteorology, oceanography, paleoclimate and environmental sciences. The services of the WDCC are also available to external users at cost price. A special service for the direct integration of research data into scientific publications has been developed. The editorial process at the WDCC ensures the quality of metadata and research data in collaboration with the data producers. A citation code and a digital identifier (DOI) are provided and registered, together with citation information, at the DOI registration agency DataCite.
This repository is no longer available. BioVeL is a virtual e-laboratory that supports research on biodiversity issues using large amounts of data from cross-disciplinary sources. BioVeL supports the development and use of workflows to process data; users can either reuse ready-made workflows or create their own. BioVeL workflows are stored in the myExperiment BioVeL group (http://www.myexperiment.org/groups/643/content). They are underpinned by a range of analytical and data-processing functions (generally provided as Web Services or R scripts) to support common biodiversity analysis tasks. The Web Services are catalogued in the BiodiversityCatalogue.
The Australian National University undertakes work to collect and publish metadata about research data held by ANU and, in four discipline areas (Earth Sciences, Astronomy, Phenomics and Digital Humanities), to develop pipelines and tools that enable the publication of research data using a common and repeatable approach. Aims and outcomes: to identify and describe research data held at ANU; to develop a consistent approach to publishing metadata on the University's data holdings; to identify and curate significant orphan data sets that might otherwise be lost or inadvertently destroyed; and to develop a culture of data sharing and data re-use.
EMSC collects real-time parametric data (source parameters and phase pickings) provided by 65 seismological networks of the Euro-Med region. These data are provided to the EMSC either by email or via QWIDS (Quake Watch Information Distribution System, developed by ISTI). The collected data are automatically archived in a database, made available via an autoDRM, and displayed on the web site. They are also automatically merged to produce automatic locations, which are sent to several seismological institutes for quick moment tensor determination.
The Abacus Data Network is a data repository collaboration involving Libraries at Simon Fraser University (SFU), the University of British Columbia (UBC), the University of Northern British Columbia (UNBC) and the University of Victoria (UVic).
A data repository for researchers affiliated with Toronto Metropolitan University. This resource is part of Borealis which is a service provided by the Ontario Council of University Libraries.
Queen's University Dataverse is the institutional open access research data repository for Queen's University, featuring Queen's University Biological Station (QUBS) which includes research related to ecology, evolution, resource management and conservation, GIS, climate data, and environmental science.
Forestry Images provides an accessible and easy-to-use archive of high-quality images related to forest health and silviculture.
The EPN (EUREF Permanent Network) is a voluntary organization of several European agencies and universities that pool resources and permanent GNSS station data to generate precise GNSS products. The EPN was created under the umbrella of the International Association of Geodesy, more precisely by its sub-commission EUREF. The European Terrestrial Reference System 89 (ETRS89) is used as the standard precise GPS coordinate system throughout Europe. Supported by EuroGeographics and endorsed by the EU, this reference system forms the backbone for all geographic and geodynamic projects on European territory at both the national and international level.
The Prototype Data Portal allows users to retrieve data from World Data System (WDS) members. WDS ensures the long-term stewardship and provision of quality-assessed data and data services to the international science community and other stakeholders.
Nuclear Data Services contains atomic, molecular and nuclear data sets for the development and maintenance of nuclear technologies. It includes energy-dependent reaction probabilities (cross sections), the energy and angular distributions of reaction products for many combinations of target and projectile, and the atomic and nuclear properties of excited states together with their radioactive decay data. Its main concern is providing the data required to design modern nuclear reactors for electricity production. Approximately 11.5 million nuclear data points have been measured and compiled into computerized form.
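Energy-dependent cross sections such as those described above are distributed as tabulated (energy, cross-section) points, and consumers interpolate between them. The sketch below uses log-log interpolation, a common convention for smooth cross sections in evaluated nuclear data files; it is an illustration of the data format's use, not the IAEA's actual retrieval API:

```python
# Illustrative log-log interpolation of a tabulated cross section.
# The table values below are toy numbers, not evaluated data.
import math

def cross_section(energy, table):
    """Log-log interpolate a cross section (barns) at `energy` (eV).

    table: list of (energy_eV, sigma_barns) points, sorted by energy.
    """
    if not table[0][0] <= energy <= table[-1][0]:
        raise ValueError("energy outside tabulated range")
    for (e0, s0), (e1, s1) in zip(table, table[1:]):
        if e0 <= energy <= e1:
            # Linear interpolation in (ln E, ln sigma) space.
            f = (math.log(energy) - math.log(e0)) / (math.log(e1) - math.log(e0))
            return math.exp(math.log(s0) + f * (math.log(s1) - math.log(s0)))

# Toy 1/v-like capture cross section (sigma falls off with energy):
table = [(1.0, 100.0), (100.0, 10.0), (10000.0, 1.0)]
print(cross_section(10.0, table))   # ~31.62 — log-log midpoint of first segment
```
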
Arca Data is Fiocruz's official repository for archiving, publishing, disseminating, preserving and sharing digital research data produced by the Fiocruz community or in partnership with other research institutions, with the aim of promoting new research, ensuring the reproducibility and replicability of existing research, and advancing open and citizen science. It seeks to stimulate the wide circulation of scientific knowledge, strengthening the institutional commitment to Open Science and free access to health information, while providing transparency and fostering collaboration among researchers, educators, academics, managers and graduate students for the advancement of knowledge and the creation of solutions that meet the demands of society.