
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms and set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
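These operators can be combined into compound queries. The examples below are illustrative only; the terms are not guaranteed to match indexed records:

```
climat* +"data archive"     terms starting with "climat" AND the exact phrase
ocean | atmosphere -model   "ocean" OR "atmosphere", excluding "model"
(emission | inventory) +global   grouped OR terms, required together with "global"
archeology~1                terms within edit distance 1, e.g. "archaeology"
"data repository"~2         the phrase with up to two intervening words
```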
Found 113 result(s)
The main goal of the ECCAD project is to provide scientific and policy users with datasets of surface emissions of atmospheric compounds, along with ancillary data, i.e. data required to estimate or quantify surface emissions. The supply of ancillary data - such as maps of population density, fire spots, burnt areas, and land cover - can help improve and encourage the development of new emissions datasets. ECCAD offers:
  • Access to global and regional emission inventories and ancillary data, in a standardized format
  • Quick visualization of emission and ancillary data
  • Rationalization of the use of input data in algorithms or emission models
  • Analysis and comparison of emissions datasets and ancillary data
  • Tools for the evaluation of emissions and ancillary data
ECCAD is a dynamic and interactive database that provides the most up-to-date datasets, including data used within ongoing projects. Users are welcome to add their own datasets, or to have their regional masks included in order to use ECCAD tools.
Note: all user content from this site has been deleted; the SeedMeLab (https://seedmelab.org/) project is offered as a new option for data hosting. SeedMe is the result of a decade of onerous experience in preparing and sharing visualization results from supercomputing simulations with many researchers at different geographic locations using different operating systems. This had been a labor-intensive process, unsupported by useful tools and procedures for sharing information. SeedMe provides secure and easy-to-use functionality for efficiently and conveniently sharing results, and aims to create transformative impact across many scientific domains.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, it is a "dark archive" without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich's Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open Source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
The KNB Data Repository is an international repository intended to facilitate ecological, environmental and earth science research in the broadest sense. For scientists, the KNB Data Repository is an efficient way to share, discover, access and interpret complex ecological, environmental, earth science, and sociological data and the software used to create and manage those data. Due to the rich contextual information provided with data in the KNB, scientists are able to integrate and analyze data with less effort. The data originate from a highly distributed set of field stations, laboratories, research sites, and individual researchers. The KNB supports rich, detailed metadata to promote data discovery as well as automated and manual integration of data into new projects. The KNB supports a rich set of modern repository services, including the ability to assign Digital Object Identifiers (DOIs) so data sets can be confidently referenced in any publication, the ability to track the versions of datasets as they evolve through time, and metadata to establish the provenance relationships between source and derived data.
The International Ocean Discovery Program (IODP) is an international marine research collaboration that explores Earth's history and dynamics using ocean-going research platforms to recover data recorded in seafloor sediments and rocks and to monitor subseafloor environments. IODP depends on facilities funded by three platform providers with financial contributions from five additional partner agencies. Together, these entities represent 26 nations whose scientists are selected to staff IODP research expeditions conducted throughout the world's oceans. IODP expeditions are developed from hypothesis-driven science proposals aligned with the program's science plan, Illuminating Earth's Past, Present, and Future. The science plan identifies 14 challenge questions in the four areas of climate change, deep life, planetary dynamics, and geohazards. Until 2013 the program operated under the name Integrated Ocean Drilling Program.
In the framework of the Collaborative Research Centre/Transregio 32 'Patterns in Soil-Vegetation-Atmosphere Systems: Monitoring, Modelling, and Data Assimilation' (CRC/TR32, www.tr32.de), funded by the German Research Foundation from 2007 to 2018, an RDM system was self-designed and implemented. The so-called CRC/TR32 project database (TR32DB, www.tr32db.de) has been operating online since early 2008. The TR32DB handles all data, including metadata, created by the involved project participants from several institutions (e.g. the Universities of Cologne, Bonn and Aachen, and the Research Centre Jülich) and research fields (e.g. soil and plant sciences, hydrology, geography, geophysics, meteorology, remote sensing). The data result from several field measurement campaigns, meteorological monitoring, remote sensing, laboratory studies and modelling approaches. Furthermore, outcomes of the scientists such as publications, conference contributions, PhD reports and corresponding images are collected in the TR32DB.
The Digital Repository of Ireland (DRI) is a national trusted digital repository (TDR) for Ireland’s social and cultural data. We preserve, curate, and provide sustained access to a wealth of Ireland’s humanities and social sciences data through a single online portal. The repository houses unique and important collections from a variety of organisations including higher education institutions, cultural institutions, government agencies, and specialist archives. DRI has staff members from a wide variety of backgrounds, including software engineers, designers, digital archivists and librarians, data curators, policy and requirements specialists, educators, project managers, social scientists and humanities scholars. DRI is certified by the CoreTrustSeal, the current TDR standard widely recommended for best practice in Open Science. In addition to providing trusted digital repository services, the DRI is also Ireland’s research centre for best practices in digital archiving, repository infrastructures, preservation policy, research data management and advocacy at the national and European levels. DRI contributes to policy making nationally (e.g. via the National Open Research Forum and the IRC), and internationally, including European Commission expert groups, the DPC, RDA and the OECD.
ArkeoGIS is a unified scientific data publishing platform. It is a multilingual Geographic Information System (GIS), initially developed in order to pool archaeological and paleoenvironmental data of the Rhine Valley. Today, it allows the pooling of spatialized scientific data concerning the past, from prehistory to the present day. The databases come from the work of institutional researchers, doctoral students, master's students, private companies and archaeological services. They are stored on the TGIR Huma-Num service grid and archived as part of the Huma-Num/CINES long-term archiving service. Because of their sensitive nature, which could lead to the looting of archaeological deposits, access to the tool is reserved for archaeological professionals from research institutions or non-profit organizations. Each user can query online all or part of the available databases and export the results of their queries to other tools.
The Arctic Data Centre (ADC) is a service provided by the Norwegian Meteorological Institute (MET) and is a legacy of the International Polar Year (IPY). ADC is based on the FAIR guiding principles for data management and on free and open access to data. While the Norwegian Meteorological Institute uses CC BY as its data license, ADC manages data on behalf of other data owners that may have other preferences. ADC primarily hosts data within meteorology, oceanography and glaciology, but through active metadata harvesting it also points to data within other disciplines. ADC normally offers data in CF-NetCDF adhering to the Climate and Forecast Conventions (exceptions may occur), and supports services on top of the data such as OPeNDAP and OGC WMS. Machine interfaces to the catalogue include OAI-PMH, OGC CSW and OpenSearch. Information is provided in the native format MET Metadata (MMD), ISO 19115 and GCMD DIF (others are being considered).
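As a sketch of what such machine interfaces look like in practice, the helpers below build standard OGC WMS and OAI-PMH request URLs. The base URLs in the usage comments are placeholders for illustration, not verified ADC endpoints:

```python
# Minimal sketch of building requests for two standard catalogue/service
# interfaces mentioned above: OGC WMS and OAI-PMH. Endpoint URLs are
# hypothetical placeholders, not actual ADC addresses.
from urllib.parse import urlencode


def wms_getcapabilities_url(base_url: str) -> str:
    """Build a standard OGC WMS 1.3.0 GetCapabilities request URL."""
    params = {"service": "WMS", "version": "1.3.0", "request": "GetCapabilities"}
    return f"{base_url}?{urlencode(params)}"


def oai_pmh_listrecords_url(base_url: str, metadata_prefix: str = "oai_dc") -> str:
    """Build an OAI-PMH ListRecords request for harvesting catalogue metadata."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    return f"{base_url}?{urlencode(params)}"


# Example (placeholder endpoint):
#   oai_pmh_listrecords_url("https://example.org/oai")
#   -> "https://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc"
```

For the OPeNDAP interface, a CF-NetCDF dataset can typically be opened directly from its OPeNDAP URL with tools such as xarray's `open_dataset`, without downloading the file first.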
The CALIPSO satellite provides new insight into the role that clouds and atmospheric aerosols play in regulating Earth's weather, climate, and air quality. CALIPSO combines an active lidar instrument with passive infrared and visible imagers to probe the vertical structure and properties of thin clouds and aerosols over the globe. CALIPSO was launched on April 28, 2006, with the CloudSat satellite. CALIPSO and CloudSat are highly complementary and together provide new, never-before-seen 3D perspectives of how clouds and aerosols form, evolve, and affect weather and climate. CALIPSO and CloudSat fly in formation with three other satellites in the A-train constellation to enable an even greater understanding of our climate system.
The NCAR Climate Data Gateway provides data discovery and access services for global and regional climate model data, knowledge, and software. The NCAR Climate Data Gateway supports community access to data products from many of NCAR's community modeling efforts, including the IPCC, PCM, AMPS, CESM, NARCCAP, and NMME activities. Data products are generally open and available; however, download access may require a login.
The WDC is concerned with the collection, management, distribution and utilization of data from Chinese provinces, autonomous regions and counties, including:
  • Resource data: management, distribution and utilization of land, water, climate, forest, grassland, minerals, energy, etc.
  • Environmental data: pollution, environmental quality, change, natural disasters, soil erosion, etc.
  • Biological resources: animals, plants, wildlife
  • Social economy: agriculture, industry, transport, commerce, infrastructure, etc.
  • Population and labor
  • Geographic background data on scales of 1:4M, 1:1M, 1:(1/2)M, 1:2500, etc.
HYdrological cycle in the Mediterranean EXperiment (HyMeX). Considering the science and societal issues motivating HyMeX, the programme aims to improve our understanding of the water cycle, with emphasis on extreme events, by monitoring and modelling the Mediterranean atmosphere-land-ocean coupled system, its variability from the event scale to the seasonal and interannual scales, and its characteristics over one decade (2010-2020) in the context of global change; and to assess the social and economic vulnerability to extreme events and the capacity for adaptation. The multidisciplinary research and the database developed within HyMeX should contribute to improved observational and modelling systems, especially for coupled systems; better prediction of extreme events; more accurate simulation of the long-term water cycle; and guidelines for adaptation measures, especially in the context of global change.
The LRIS portal is the first element of scinfo.org.nz, a new repository of authoritative New Zealand science datasets and information. It has been created in response to a growing expectation that government and publicly funded science data should be readily available in authoritative, human- and machine-readable forms.
The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is a team of researchers, data specialists and computer system developers who are supporting the development of a data management system to store scientific data generated by Gulf of Mexico researchers. The Master Research Agreement between BP and the Gulf of Mexico Alliance that established the Gulf of Mexico Research Initiative (GoMRI) included provisions that all data collected or generated through the agreement must be made available to the public. The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is the vehicle through which GoMRI is fulfilling this requirement. The mission of GRIIDC is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico Ecosystem.
The DCS allows you to search a catalogue of metadata (information describing data) to discover and gain access to NERC's data holdings and information products. The metadata are prepared to a common NERC Metadata Standard and are provided to the catalogue by the NERC Data Centres.
The Social Science Data Archive (SSDA) is still active and maintained as part of the UCLA Library Data Science Center. SSDA Dataverse is one of several archiving options offered by SSDA: data can be archived by SSDA itself, by ICPSR, by the UCLA Library, or by the California Digital Library. The Social Science Data Archive serves the UCLA campus as an archive of faculty and graduate student survey research. We provide long-term storage of data files and documentation. We ensure that the data are usable in the future by migrating files to new operating systems. We follow government standards and archival best practices. The mission of the Social Science Data Archive has been and continues to be to provide a foundation for social science research, with faculty support throughout an entire research project involving original data collection or the reuse of publicly available studies. Data Archive staff and researchers work as partners throughout all stages of the research process: beginning when a hypothesis or area of study is being developed, during grant and funding activities, while data collection and/or analysis is ongoing, and finally in the long-term preservation of research results. Our role is to provide a collaborative environment where the focus is on understanding the nature and scope of the research approach and the management of research output throughout the entire life cycle of the project. Instructional support, especially support that links research with instruction, is also a mainstay of operations.
The NCI National Research Data Collection is Australia’s largest collection of research data, encompassing more than 10 PB of nationally and internationally significant datasets.
NCAR is a federally funded research and development center committed to research and education in atmospheric science and related scientific fields. NCAR seeks to support and enhance the scientific community nationally and globally by monitoring and researching the atmosphere and related physical and biological systems. Users can access climate and earth models created to better understand the atmosphere, the Earth and the Sun, as well as data from various NCAR research programs and projects. NCAR is sponsored by the National Science Foundation in addition to various other U.S. agencies.
The Marine Data Archive (MDA) is an online repository specifically developed to independently archive data files in a fully documented manner. The MDA can serve individuals, consortia, working groups and institutes to manage data files and file versions for a specific context (project, report, analysis, monitoring campaign), as a personal or institutional archive or back-up system and as an open repository for data publication.
This hub supports the geospatial modeling, data analysis and visualization needs of the broad research and education communities through hosting of groups, datasets, tools, training materials, and educational contents.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices.
  • Document and archive studies. Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing.
  • Share and find materials. With a click, make study materials public so that other researchers can find, use and cite them. Find materials by other researchers to avoid reinventing something that already exists.
  • Detail individual contribution. Assign citable, contributor credit to any research material - tools, analysis scripts, methods, measures, data.
  • Increase transparency. Make as much of the scientific workflow public as desired - as it is developed or after publication of reports.
  • Registration. Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection.
  • Manage scientific workflow. A structured, flexible system can provide efficiency gains to workflow and clarity to project objectives.
The UCD Digital Library is a platform for exploring cultural heritage, engaging with digital scholarship, and accessing research data. The UCD Digital Library allows you to search, browse and explore a growing collection of historical materials, photographs, art, interviews, letters, and other exciting content, that have been digitised and made freely available.