  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms and set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
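For illustration, the operators above can be combined into query strings such as the following (a minimal sketch; the search terms themselves are made-up examples, not recommended queries):

    # Illustrative query strings for the search syntax described above.
    # The terms are hypothetical examples.
    example_queries = [
        'climat*',                          # wildcard: climate, climatology, ...
        '"sea level"',                      # exact phrase
        'ocean + temperature',              # AND (also the default)
        'glacier | "ice sheet"',            # OR
        'carbon - "carbon dioxide"',        # NOT
        '(soil | sediment) + geochemistry', # parentheses set precedence
        'meterology~1',                     # fuzzy match within edit distance 1
        '"water level series"~3',           # phrase with a slop of 3
    ]
    for q in example_queries:
        print(q)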
Found 29 result(s)
NIAID’s TB Portals Program is a multi-national collaboration for TB data sharing and analysis to advance TB research. As a global consortium of clinicians, scientists, and IT professionals from 40 sites in 16 countries throughout eastern Europe, Asia, and sub-Saharan Africa, the TB Portals Program is a web-based, open-access repository of multi-domain TB data and tools for its analysis. Researchers can find linked socioeconomic/geographic, clinical, laboratory, radiological, and genomic data from over 7,500 international published TB patient cases with an emphasis on drug-resistant tuberculosis.
The EUROLAS Data Center (EDC) is one of the two data centers of the International Laser Ranging Service (ILRS). It collects, archives and distributes tracking data, predictions and other tracking-relevant information from the global SLR network. Additionally, EDC holds a mirror of the official web pages of the ILRS at Goddard Space Flight Center (GSFC). As a result of the activities of the Analysis Working Group (AWG) of the ILRS, DGFI has been selected as an analysis center (AC) and as a backup combination center (CC). This task includes the weekly processing of SLR observations to LAGEOS-1/2 and ETALON-1/2 to compute station coordinates and Earth orientation parameters, as well as the combination of SLR solutions from the various analysis centers into a combined ILRS SLR solution.
The Index to Marine and Lacustrine Geological Samples is a tool to help scientists locate and obtain geologic material from sea floor and lakebed cores, grabs, and dredges archived by participating institutions around the world. Data and images related to the samples are prepared and contributed by the institutions for access via the IMLGS and long-term archive at NGDC. Before proposing research on any sample, please contact the curator for sample condition and availability. A consortium of curators has guided the IMLGS since 1977; NGDC maintains the index on behalf of the group.
The Alternative Fuels Data Center (AFDC) is a comprehensive clearinghouse of information about advanced transportation technologies. The AFDC offers transportation decision makers unbiased information, data, and tools related to the deployment of alternative fuels and advanced vehicles. The AFDC launched in 1991 in response to the Alternative Motor Fuels Act of 1988 and the Clean Air Act Amendments of 1990. It originally served as a repository for alternative fuel performance data. The AFDC has since evolved to offer a broad array of information resources that support efforts to reduce petroleum use in transportation. The AFDC serves Clean Cities stakeholders, fleets regulated by the Energy Policy Act, businesses, policymakers, government agencies, and the general public.
We aim to provide a “one-stop shop” for data. To this end, we provide information on Strait of Georgia data that can be found within this Data Centre, as well as in other existing databases and locations. Clicking on the different categories in our Marine Data BC open data portal provides information on 1) all the data held within this SoG Data Centre, 2) links to custodians who can provide other data sets that cannot be directly downloaded from our Data Centre, and 3) links to other existing data search engines where data can be downloaded immediately.
EarthWorks is a discovery tool for geospatial (a.k.a. GIS) data. It allows users to search and browse the GIS collections owned by Stanford University Libraries, as well as data collections from many other institutions. Data can be searched spatially, by manipulating a map; by keyword search; by selecting search limiting facets (e.g., limit to a given format type); or by combining these options.
The Environmental Information Data Centre (EIDC) is part of the Natural Environment Research Council's (NERC) Environmental Data Service and is hosted by the UK Centre for Ecology & Hydrology (UKCEH). We manage nationally-important datasets concerned with the terrestrial and freshwater sciences.
Data are the key to successful scientific work. Sophisticated data management guarantees the long-term availability of observational data and metadata, allows for easy data search and retrieval, supplements the international data exchange, and provides data products for scientific, political, industrial and public stakeholders.
AusGeochem is an easy-to-use platform for uploading, visualising, analysing and discovering georeferenced sample information and data produced by various geoscience research institutions such as universities, geological survey agencies and museums. With respect to analytical research laboratories, AusGeochem provides a centralised repository allowing laboratories to upload, archive, disseminate and publish their datasets. The intuitive user interface (UI) allows users to access national publicly funded data quickly through the ability to view an area of interest, synthesise a variety of geochemical data in real-time, and extract the required data, gaining novel scientific insights through multi-method data collation. Lithodat Pty Ltd has integrated built-in data synthesis functions into the platform, such as cumulative age histograms, age vs elevation plots, and step-heating diagrams, allowing for rapid inter-study comparisons. Data can be extracted in multiple formats for re-use in a variety of software systems, allowing for the integration of regional datasets into machine learning and AI systems.
As part of the Copernicus Space Component programme, ESA manages the coordinated access to the data procured from the various Contributing Missions and the Sentinels, in response to the Copernicus users' requirements. The Data Access Portfolio documents the data offer and the access rights per user category. The CSCDA portal is the access point to all data, including Sentinel missions, for Copernicus Core Users as defined in the EU Copernicus Programme Regulation (e.g. the Copernicus Services). The Copernicus Space Component (CSC) Data Access system is the interface for accessing the Earth Observation products from the Copernicus Space Component. The system's overall space capacity relies on several EO missions contributing to Copernicus, and it is continuously evolving, with new missions becoming available over time and others ending and/or being replaced.
The Database for Hydrological Time Series of Inland Waters (DAHITI) was developed by the Deutsches Geodätisches Forschungsinstitut der Technischen Universität München (DGFI-TUM) in 2013. DAHITI provides water level time series of lakes, reservoirs, rivers, and wetlands derived from multi-mission satellite altimetry for hydrological applications. All water level time series are freely available to the user community after a short registration process.
NKN is now Research Computing and Data Services (RCDS)! We provide data management support for UI researchers and their regional, national, and international collaborators. This support keeps researchers at the cutting edge of science and increases our institution's competitiveness for external research grants. Quality data and metadata developed in research projects and curated by RCDS (formerly NKN) are a valuable, long-term asset upon which to develop and build new research and science.
NMDB is a real-time database for high-resolution Neutron Monitor measurements. Its goal is to provide easy access to Neutron Monitor measurements from stations around the world, both real-time and historical, through an easy-to-use interface.
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an independent intergovernmental organisation supported by 34 states. ECMWF is both a research institute and a 24/7 operational service, producing and disseminating numerical weather predictions to its Member States. This data is fully available to the national meteorological services in the Member States. The Centre also offers a catalogue of forecast data that can be purchased by businesses worldwide and other commercial customers. Forecasts, analyses, climate re-analyses, reforecasts and multi-model data are available from our archive (MARS), via dedicated data servers, or via point-to-point dissemination.
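A minimal sketch of retrieving data programmatically from the MARS archive, assuming the ecmwf-api-client Python package is installed and an API key has been configured; the dataset name and request fields below are illustrative only, and the availability of specific datasets changes over time:

    # Sketch only: assumes the ecmwf-api-client package and a configured API key.
    # Dataset name and request fields are illustrative, not a guaranteed recipe.
    from ecmwfapi import ECMWFDataServer

    server = ECMWFDataServer()
    server.retrieve({
        "class": "ei",                 # ERA-Interim (example reanalysis dataset)
        "dataset": "interim",
        "date": "2015-01-01/to/2015-01-31",
        "levtype": "sfc",              # surface fields
        "param": "167.128",            # 2-metre temperature (GRIB parameter ID)
        "stream": "oper",
        "type": "an",                  # analysis fields
        "target": "era_interim_t2m_201501.grib",
    })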
The data publishing portal of Marine Scotland, the directorate of the Scottish Government responsible for the management of Scotland's seas.
EUMETSAT's primary objective is to establish, maintain and exploit European systems of operational meteorological satellites. EUMETSAT is responsible for the launch and operation of the satellites and for delivering satellite data to end-users as well as contributing to the operational monitoring of climate and the detection of global climate changes. The EUMETSAT Product Navigator is the catalogue for all EUMETSAT data and products.
The KNB Data Repository is an international repository intended to facilitate ecological, environmental and earth science research in the broadest senses. For scientists, the KNB Data Repository is an efficient way to share, discover, access and interpret complex ecological, environmental, earth science, and sociological data and the software used to create and manage those data. Due to rich contextual information provided with data in the KNB, scientists are able to integrate and analyze data with less effort. The data originate from a highly-distributed set of field stations, laboratories, research sites, and individual researchers. The KNB supports rich, detailed metadata to promote data discovery as well as automated and manual integration of data into new projects. The KNB supports a rich set of modern repository services, including the ability to assign Digital Object Identifiers (DOIs) so data sets can be confidently referenced in any publication, the ability to track the versions of datasets as they evolve through time, and metadata to establish the provenance relationships between source and derived data.
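Because datasets in the repository carry DOIs, a published dataset can also be resolved programmatically. The sketch below uses generic DOI content negotiation via doi.org to fetch citation metadata; the DOI shown is a hypothetical placeholder, not a real KNB dataset:

    # Sketch: fetch citation metadata for a dataset DOI via doi.org content negotiation.
    # The DOI below is a hypothetical placeholder.
    import requests

    doi = "10.5063/EXAMPLE"  # hypothetical dataset DOI
    resp = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    resp.raise_for_status()
    metadata = resp.json()
    print(metadata.get("title"), "-", metadata.get("publisher"))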
The term GNSS (Global Navigation Satellite Systems) comprises the different navigation satellite systems such as GPS, GLONASS and the future Galileo, as well as raw data from GNSS microwave receivers and processed or derived higher-level products and required auxiliary data. The results of the GFZ GNSS technology based projects are used as a contribution to maintaining and studying the Earth's rotational behavior and the global terrestrial reference frame, to studying neotectonic processes along plate boundaries and in the interior of plates, and as input to short-term weather forecasting and atmosphere/climate research. Currently only selected products such as observation data, navigation data (ephemerides), meteorological data and quality data with a limited spatial coverage are provided by the GNSS ISDC.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices.
  • Document and archive studies. Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing.
  • Share and find materials. With a click, make study materials public so that other researchers can find, use and cite them. Find materials by other researchers to avoid reinventing something that already exists.
  • Detail individual contribution. Assign citable, contributor credit to any research material - tools, analysis scripts, methods, measures, data.
  • Increase transparency. Make as much of the scientific workflow public as desired - as it is developed or after publication of reports.
  • Registration. Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection.
  • Manage scientific workflow. A structured, flexible system can provide efficiency gains to workflow and clarity to project objectives.
The Met Office is the UK's National Weather Service. We have a long history of weather forecasting and have been working in the area of climate change for more than two decades. As a world leader in providing weather and climate services, we employ more than 1,800 staff at 60 locations throughout the world. We are recognised as one of the world's most accurate forecasters, using more than 10 million weather observations a day, an advanced atmospheric model and a high-performance supercomputer to create 3,000 tailored forecasts and briefings a day. These are delivered to a huge range of customers, from the Government to businesses, the general public, the armed forces, and other organisations.
The LINZ Data Service provides free online access to New Zealand’s most up-to-date land and seabed data. The data can be searched, browsed and downloaded. The LINZ web services can be also integrated into other applications.
The Climate Change Centre Austria - Data Centre provides the central national archive for climate data and information. The data made accessible includes observation and measurement data, scenario data, quantitative and qualitative data, as well as the measurement data and findings of research projects.
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10¹⁰ particles to follow the dark matter distribution in a cubic region 500 h⁻¹ Mpc on a side, and has a spatial resolution of 5 h⁻¹ kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10⁷ galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and haloes, as well as to the spatial and temporal relations between them. Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
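To give a flavour of the SQL interface, the sketch below submits a query over HTTP against the small, openly accessible milli-Millennium volume; the endpoint URL, table name and column names are assumptions for illustration, and registered accounts are needed for the full databases:

    # Sketch only: endpoint, table and column names are illustrative assumptions.
    import requests

    sql = """
    SELECT TOP 10 galaxyId, snapnum, stellarMass, sfr
    FROM millimil..DeLucia2006a      -- assumed name of the milli-Millennium galaxy table
    WHERE snapnum = 63               -- final (z = 0) snapshot
      AND stellarMass > 1            -- assumed units of 10^10 Msun/h
    ORDER BY stellarMass DESC
    """

    resp = requests.get(
        "http://gavo.mpa-garching.mpg.de/MyMillennium/",  # assumed query endpoint
        params={"action": "doQuery", "SQL": sql},
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.text)  # results returned as plain-text rows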