Search tips:
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) indicate grouping and precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
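For illustration, a few example queries combining these operators (the search terms are only examples, not drawn from this result set):
  climat*                    matches climate, climatic, climatology, etc.
  "research data" +archive   requires the exact phrase "research data" and the word archive
  (marine | aquatic) -war    matches records containing marine or aquatic but not war
  color~1                    also matches colour (edit distance 1)
  "data archive"~2           matches the words of the phrase within a slop of 2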
Found 150 result(s)
ArkeoGIS is a unified scientific data publishing platform. It is a multilingual Geographic Information System (GIS), initially developed to pool archaeological and paleoenvironmental data from the Rhine Valley. Today, it allows the pooling of spatialized scientific data concerning the past, from prehistory to the present day. The databases come from the work of institutional researchers, doctoral students, master's students, private companies and archaeological services. They are stored on the TGIR Huma-Num service grid and archived as part of the Huma-Num/CINES long-term archiving service. Because of their sensitive nature, which could lead to the looting of archaeological deposits, access to the tool is reserved for archaeological professionals from research institutions or non-profit organizations. Each user can query online all or part of the available databases and export the results of their query to other tools.
The figshare service for The Open University was launched in 2016 and allows researchers to store, share and publish research data. It makes research data more accessible by storing metadata alongside datasets. Additionally, every uploaded item receives a Digital Object Identifier (DOI), which makes the data citable and sustainable. If there are ethical or copyright concerns about publishing a dataset, it is possible to publish the associated metadata to aid discoverability while sharing the data itself via a private channel, subject to manual approval.
CAPE began as a collection of UK local governments' Climate Action Plans, and has expanded to include a number of useful datapoints around climate, carbon emissions and local government. The Climate Action Plan Explorer collects UK Council Climate Action Plans in a single database, alongside some data on area emissions estimates within the scope of influence of councils. It allows anyone to quickly and easily find out if their council has a plan, and put those plans into context.
The Repository of the Faculty of Science is an institutional repository that gathers, permanently stores and provides access to the results of the scientific work and intellectual property of the Faculty of Science, University of Zagreb. The objects that can be stored in the repository include research data, scientific articles, conference papers, theses, dissertations, books, teaching materials, images, video and audio files, and presentations. To improve searchability, all materials are described with a predetermined set of metadata.
The Polar Data Catalogue is an online database of metadata and data that describes, indexes and provides access to diverse data sets generated by polar researchers. These records cover a wide range of disciplines from natural sciences and policy, to health, social sciences, and more.
Atmosphere to Electrons (A2e) is a multi-year, multi-stakeholder U.S. Department of Energy (DOE) research and development initiative tasked with improving wind plant performance and mitigating risk and uncertainty to achieve a substantial reduction in the cost of wind energy production. The A2e strategic vision will enable a new generation of wind plant technology, in which smart wind plants are designed to achieve optimized performance stemming from more complete knowledge of the inflow wind resource and of the complex flow through the wind plant.
The Digital Repository of Ireland (DRI) is a national trusted digital repository (TDR) for Ireland’s social and cultural data. We preserve, curate, and provide sustained access to a wealth of Ireland’s humanities and social sciences data through a single online portal. The repository houses unique and important collections from a variety of organisations including higher education institutions, cultural institutions, government agencies, and specialist archives. DRI has staff members from a wide variety of backgrounds, including software engineers, designers, digital archivists and librarians, data curators, policy and requirements specialists, educators, project managers, social scientists and humanities scholars. DRI is certified by the CoreTrustSeal, the current TDR standard widely recommended for best practice in Open Science. In addition to providing trusted digital repository services, the DRI is also Ireland’s research centre for best practices in digital archiving, repository infrastructures, preservation policy, research data management and advocacy at the national and European levels. DRI contributes to policy making nationally (e.g. via the National Open Research Forum and the IRC), and internationally, including European Commission expert groups, the DPC, RDA and the OECD.
The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. The BAOBAB data portal therefore provides access to a large amount and variety of data: 250 local observation datasets collected since 1850 by operational networks, long-term monitoring research networks and intensive scientific campaigns; 1350 outputs of a socio-economic questionnaire; 60 operational satellite products and several research products; and 10 output sets from meteorological and ocean operational models plus 15 from research simulations. Data documentation complies with international metadata standards, and data are delivered in standard formats. The data request interface takes full advantage of the database's relational structure and enables users to build multi-criteria queries (period, area, property, …).
The main objective of the project is to digitize the data collected by the Maritime Administration and make it available for reuse: digitizing analog resources, integrating and harmonizing the data, building a digital repository, and disseminating information about the resources collected in the system. The aim is to make maritime administration data sets available on the Internet.
In a changing climate, water raises increasingly complex challenges: concerning its quantity, quality, availability, allocation, use and significance as a habitat, resource and cultural medium. Dharmae, a ‘Data Hub of Australian Research on Marine and Aquatic Ecocultures’, brings together multi-disciplinary research data relating to water in all these forms. The term “ecoculture” guides the development of this collection and its approach to data discovery. Ecoculture recognizes that, since nature and culture are inextricably linked, there is a corresponding need for greater interconnectedness of the different knowledge systems applied to them.
High spatial resolution, contemporary data on human population distributions are a prerequisite for the accurate measurement of the impacts of population growth, for monitoring changes and for planning interventions. The WorldPop project aims to meet these needs through the provision of detailed and open access population distribution datasets built using transparent approaches. The WorldPop project was initiated in October 2013 to combine the AfriPop, AsiaPop and AmeriPop population mapping projects. It aims to provide an open access archive of spatial demographic datasets for Central and South America, Africa and Asia to support development, disaster response and health applications. The methods used are designed with full open access and operational application in mind, using transparent, fully documented and peer-reviewed methods to produce easily updatable maps with accompanying metadata and measures of uncertainty.
The Gateway to Astronaut Photography of Earth hosts the best and most complete online collection of astronaut photographs of the Earth from 1961 through the present. This service is provided by the International Space Station program and the JSC Earth Science & Remote Sensing Unit, ARES Division, Exploration Integration Science Directorate.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection, which is the primary repository for members of the university and the first point of contact for the publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
Note: the digital archive of the Historical Data Centre Saxony-Anhalt has been transferred to the share-it repository (https://www.re3data.org/repository/r3d100013014). The Historical Data Centre Saxony-Anhalt was founded in 2008. Its main tasks are the computer-aided provision, processing and evaluation of historical research data, the development of theoretically consolidated normative data and vocabularies, and the further development of methods in the context of digital humanities, research data management and quality assurance. The Historical Data Centre Saxony-Anhalt sees itself as a central institution for the data service of historical data in the federal state of Saxony-Anhalt and is thus part of a nationally and internationally linked infrastructure for long-term data storage and use. The Centre primarily acquires individual-level microdata for the analysis of life courses, employment histories and biographies (primarily quantitative, but also qualitative data), which offer a broad interdisciplinary and international analytical framework and meet clearly defined methodological and technical requirements. The studies are processed, archived and, in compliance with data protection and copyright conditions, made available to the scientifically interested public in accordance with internationally recognized standards. The degree of preparation depends on the type and quality of the study and on demand. Reference studies and studies in high demand are comprehensively documented, often in cooperation with primary researchers or experts, and summarized in data collections. The Historical Data Centre supports researchers in meeting the high demands of research data management. This includes advisory support across the entire life cycle of data: data production, documentation, analysis, evaluation, publication, long-term archiving and, finally, the subsequent use of data. In cooperation with other infrastructure facilities of the state of Saxony-Anhalt as well as national and international, interdisciplinary data repositories, the Data Centre provides tools and infrastructures for the publication and long-term archiving of research data. Together with the University and State Library of Saxony-Anhalt, the Data Centre operates its own data repository as well as special workstations for the digitisation and analysis of data. The Historical Data Centre aims to be a contact point for very different users of historical sources. We collect data relating to historical persons, events and historical territorial units.
The Canadian Disaster Database (CDD) contains detailed disaster information on more than 1000 natural, technological and conflict events (excluding war) that have happened since 1900 at home or abroad and that have directly affected Canadians. Message since 2022-01: The Canadian Disaster Database geospatial view is temporarily out of service. We apologize for the inconvenience. The standard view of the database is still available.
diversitydata.org is an online tool for exploring quality of life data across metropolitan areas for people of different racial/ethnic groups in the United States. It provides values and rankings for the largest U.S. metropolitan areas on different indicators in 8 areas of life (domains), including demographics, education, economic opportunity, housing, neighborhoods, and health. It also provides a simple mapping utility, showing the range of indicator values for metros across the U.S. Data from the 1999 indicators is archived in the companion Diversity Data Archive (https://diversitydata-archive.org/). For a wider selection of data on child wellbeing, visit our partner site, diversitydatakids.org (https://www.diversitydatakids.org/). diversitydata.org has been named a Health Data All Star by the Health Data Consortium, a list compiled in consultation with leading health researchers, government officials, entrepreneurs, advocates and others to identify the health data resources that matter most.
Ag-Analytics is an online open-source database of various economic and environmental data. It automates the collection, formatting, and processing of several commonly used datasets from sources such as the National Agricultural Statistics Service (NASS), the Agricultural Marketing Service (AMS), the Risk Management Agency (RMA), the PRISM weather database, and the U.S. Commodity Futures Trading Commission (CFTC). All the data have been cleaned and well documented to save users the inconvenience of scraping and cleaning the data themselves.