Filter

Subjects

Content Types

Countries

AID systems

API

Data access

Data access restrictions

Database access

Database access restrictions

Database licenses

Data licenses

Data upload

Data upload restrictions

Enhanced publication

Institution responsibility type

Institution type

Keywords

Metadata standards

PID systems

Provider types

Quality management

Repository languages

Software

Syndications

Repository types

Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotation marks can be used to search for exact phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) indicate grouping (precedence)
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
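To illustrate how these operators combine, here is a minimal sketch of building query strings. The terms are invented examples; the sketch assumes only the operator syntax listed above, not any particular facet or backend.

    # Phrase search combined with a wildcard, an OR group and an exclusion:
    # match the exact phrase "neutron monitor" AND anything starting with
    # "ocean" OR the word "marine", but NOT "simulation".
    query = '"neutron monitor" + (ocean* | marine) -simulation'

    # Fuzziness and slop: ~N after a word allows up to N character edits,
    # ~N after a phrase allows up to N intervening or reordered words.
    fuzzy_query = "oceanograhpy~2"     # still matches "oceanography"
    sloppy_query = '"data center"~3'   # matches e.g. "data and computing center"

    print(query)
    print(fuzzy_query)
    print(sloppy_query)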
Found 22 result(s)
The Alternative Fuels Data Center (AFDC) is a comprehensive clearinghouse of information about advanced transportation technologies. The AFDC offers transportation decision makers unbiased information, data, and tools related to the deployment of alternative fuels and advanced vehicles. The AFDC launched in 1991 in response to the Alternative Motor Fuels Act of 1988 and the Clean Air Act Amendments of 1990. It originally served as a repository for alternative fuel performance data. The AFDC has since evolved to offer a broad array of information resources that support efforts to reduce petroleum use in transportation. The AFDC serves Clean Cities stakeholders, fleets regulated by the Energy Policy Act, businesses, policymakers, government agencies, and the general public.
The Center for Operational Oceanographic Products and Services (CO-OPS) site offers operational data in near-real-time and historical contexts. The focus is on tides and currents, but the site also includes information on harmful algal blooms, weather, and related topics. Data access is provided through geospatial web interfaces as well as OPeNDAP services.
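Since the entry mentions OPeNDAP services, a minimal access sketch follows; the dataset URL and the variable name are hypothetical placeholders rather than actual CO-OPS endpoints, and the sketch assumes the xarray package is installed.

    # Minimal OPeNDAP access sketch. The URL and the "water_level" variable
    # are HYPOTHETICAL placeholders; consult the CO-OPS site for real endpoints.
    import xarray as xr

    OPENDAP_URL = "https://opendap.example.gov/dodsC/water_level/station_example.nc"  # placeholder

    ds = xr.open_dataset(OPENDAP_URL)                    # opens the remote dataset lazily
    print(ds)                                            # inspect variables, dimensions, attributes
    subset = ds["water_level"].isel(time=slice(0, 24))   # request only the first 24 time steps
    print(subset.values)                                 # values are transferred on access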
The OpenMadrigal project seeks to develop and support an online database for geospace data. The project has been led by MIT Haystack Observatory since 1980, but now has active support from Jicamarca Observatory and other community members. Madrigal is a robust, web-based system capable of managing and serving archival and real-time data, in a variety of formats, from a wide range of ground-based instruments. Madrigal is installed at a number of sites around the world. Data at each Madrigal site is locally controlled and can be updated at any time, but shared metadata between Madrigal sites allows all Madrigal sites to be searched at once from any single site. Data is local; metadata is shared.
OBIS strives to document the diversity, distribution and abundance of life in the ocean. Created by the Census of Marine Life, OBIS is now part of the Intergovernmental Oceanographic Commission (IOC) of UNESCO, under its International Oceanographic Data and Information Exchange (IODE) programme.
The EUROLAS Data Center (EDC) is one of the two data centers of the International Laser Ranging Service (ILRS). It collects, archives and distributes tracking data, predictions and other tracking-relevant information from the global SLR network. Additionally, EDC hosts a mirror of the official ILRS web pages maintained at Goddard Space Flight Center (GSFC). As a result of the activities of the ILRS Analysis Working Group (AWG), DGFI has been selected as an analysis center (AC) and as a backup combination center (CC). This task includes the weekly processing of SLR observations to LAGEOS-1/2 and ETALON-1/2 to compute station coordinates and Earth orientation parameters, as well as the combination of SLR solutions from the various analysis centres into a combined ILRS SLR solution.
The Keck Observatory Archive (KOA) is a collaboration between the NASA Exoplanet Science Institute (NExScI) and the W. M. Keck Observatory (WMKO), funded by NASA. KOA has been archiving data from the High Resolution Echelle Spectrograph (HIRES) since August 2004 and data acquired with the Near InfraRed echelle SPECtrograph (NIRSPEC) since May 2010. The archived data extend back to 1994 for HIRES and to 1999 for NIRSPEC. KOA ingests and curates data from the following instruments: DEIMOS, ESI, HIRES, KI, LRIS, MOSFIRE, NIRC2, and NIRSPEC.
As part of the Copernicus Space Component programme, ESA manages coordinated access to the data procured from the various Contributing Missions and the Sentinels, in response to the requirements of Copernicus users. The Data Access Portfolio documents the data offer and the access rights per user category. The CSCDA portal is the access point to all data, including Sentinel missions, for Copernicus Core Users as defined in the EU Copernicus Programme Regulation (e.g. Copernicus Services). The Copernicus Space Component (CSC) Data Access system is the interface for accessing Earth Observation products from the Copernicus Space Component. The system's overall space capacity relies on several EO missions contributing to Copernicus, and it is continuously evolving, with new missions becoming available over time and others ending or being replaced.
The term GNSS (Global Navigation Satellite Systems) comprises the different navigation satellite systems such as GPS, GLONASS and the future Galileo, as well as raw data from GNSS microwave receivers, processed or derived higher-level products, and the required auxiliary data. The results of the GFZ GNSS technology-based projects are used as a contribution to maintaining and studying the Earth's rotational behavior and the global terrestrial reference frame, to studying neotectonic processes along plate boundaries and in the interior of plates, and as input to short-term weather forecasting and atmosphere/climate research. Currently only selected products such as observation data, navigation data (ephemerides), meteorological data and quality data with limited spatial coverage are provided by the GNSS ISDC.
Launched in November 1995, RADARSAT-1 provided Canada and the world with an operational radar satellite system capable of timely delivery of large amounts of data. Equipped with a powerful synthetic aperture radar (SAR) instrument, it acquired images of the Earth day or night, in all weather and through cloud cover, smoke and haze. RADARSAT-1 was a Canadian-led project involving the Canadian federal government, the Canadian provinces, the United States, and the private sector. It provided useful information to both commercial and scientific users in such fields as disaster management, interferometry, agriculture, cartography, hydrology, forestry, oceanography, ice studies and coastal monitoring. In 2007, RADARSAT-2 was launched, and it has produced over 75,000 images per year since. In 2019, the RADARSAT Constellation Mission was deployed, using its three-satellite configuration for all-condition coverage. For more information about RADARSAT-2 see https://mda.space/en/geo-intelligence/; the RADARSAT-2 portal is available at https://gsiportal.mda.space/gc_cp/#/map
NMDB is a real-time database for high-resolution Neutron Monitor measurements. It provides access to Neutron Monitor measurements from stations around the world, covering real-time as well as historical data, with the goal of making all of these measurements available through an easy-to-use interface.
The European Centre for Medium-Range Weather Forecasts (ECMWF) is an independent intergovernmental organisation supported by 34 states. ECMWF is both a research institute and a 24/7 operational service, producing and disseminating numerical weather predictions to its Member States. This data is fully available to the national meteorological services in the Member States. The Centre also offers a catalogue of forecast data that can be purchased by businesses and other commercial customers worldwide. Forecasts, analyses, climate re-analyses, reforecasts and multi-model data are available from the ECMWF archive (MARS), via dedicated data servers, or via point-to-point dissemination.
Note: all user content from this site has been deleted; the SeedMeLab project (https://seedmelab.org/) is offered as a new option for data hosting. SeedMe is the result of a decade of onerous experience in preparing and sharing visualization results from supercomputing simulations with many researchers at different geographic locations using different operating systems. This has been a labor-intensive process, unsupported by useful tools and procedures for sharing information. SeedMe provides secure and easy-to-use functionality for efficiently and conveniently sharing results, and aims to create transformative impact across many scientific domains.
The Coastal Data Information Program (CDIP) is an extensive network for monitoring waves and beaches along the coastlines of the United States. Since its inception in 1975, the program has produced a vast database of publicly-accessible environmental data for use by coastal engineers and planners, scientists, mariners, and marine enthusiasts. The program has also remained at the forefront of coastal monitoring, developing numerous innovations in instrumentation, system control and management, computer hardware and software, field equipment, and installation techniques.
Thousands of temperature and salinity profiles obtained by means of Nansen hydrographic casts, and previously available only as station sheets, have been digitized at the German Maritime and Hydrographic Agency (BSH). In a cooperative effort between the KlimaCampus of the University of Hamburg and the German Oceanographic Data Centre (DOD, Hamburg), about 7500 hydrographic profiles were checked and identified as missing from the international oceanographic databases. Since most of the profiles were obtained in the decades before the Second World War, they represent an important extension of the international historical database and a corresponding contribution to the IOC Global Oceanographic Data Archeology and Rescue Project (GODAR). Since 2009, these efforts have resulted in locating about 7500 hydrographic profiles that were not yet available to the oceanographic community.
The Registry of Open Data on AWS provides a centralized repository of public data sets that can be seamlessly integrated into AWS cloud-based applications. AWS is hosting the public data sets at no charge to their users. Anyone can access these data sets from their Amazon Elastic Compute Cloud (Amazon EC2) instances and start computing on the data within minutes. Users can also leverage the entire AWS ecosystem and easily collaborate with other AWS users.
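As a sketch of what "start computing on the data within minutes" can look like, the snippet below lists and downloads objects from a public bucket using anonymous (unsigned) requests; the bucket and key names are hypothetical placeholders rather than any particular registry dataset.

    # Minimal sketch of anonymous access to a public dataset bucket on AWS.
    # The bucket and key names are HYPOTHETICAL placeholders; real names are
    # listed on each dataset's Registry of Open Data page.
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))  # no credentials needed

    BUCKET = "example-open-dataset"  # placeholder

    # List a handful of objects to see what the dataset contains.
    resp = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=5)
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])

    # Download one object for local processing (placeholder key).
    # s3.download_file(BUCKET, "path/to/part-0000.csv", "part-0000.csv")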
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10^10 particles to follow the dark matter distribution in a cubic region 500 h^-1 Mpc on a side, and has a spatial resolution of 5 h^-1 kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10^7 galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and halos, as well as to the spatial and temporal relations between them. Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
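As an illustration of the kind of SQL access described above, here is a hedged sketch of submitting a query over HTTP; the endpoint URL, request parameter, table name and column names are hypothetical placeholders, since the real schema and query interface are documented on the database site itself.

    # Hedged sketch of querying a Millennium-style SQL web service from Python.
    # Endpoint, parameter name, table and columns are HYPOTHETICAL placeholders.
    import requests

    SQL = """
    SELECT TOP 10 galaxyId, stellarMass, sfr   -- placeholder columns
    FROM   galaxies                            -- placeholder table
    WHERE  stellarMass > 1.0
    ORDER BY stellarMass DESC
    """

    resp = requests.get(
        "https://millennium.example.org/query",  # placeholder URL
        params={"sql": SQL},                     # placeholder parameter name
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.text)  # results typically come back as a plain table (e.g. CSV)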
Data.gov increases the ability of the public to easily find, download, and use datasets that are generated and held by the Federal Government. Data.gov provides descriptions of the Federal datasets (metadata), information about how to access the datasets, and tools that leverage government datasets.
The geophysical database, GERDA, is a strong tool for data storage, handling and QC. Data are uploaded to and downloaded from the GERDA database through this website. GERDA is the Danish national database on shallow geophysical data. Since its establishment in 1998-2000, the database has been continuously developed. The database is hosted by the Geological Survey of Denmark and Greenland (GEUS).