
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount (see the example queries below)
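The operators above can be combined into full query strings. The short Python sketch below simply prints a few illustrative queries, one per operator; the search terms are made-up examples, not queries tied to this particular result set.

    # Illustrative query strings using the search syntax described above.
    # The terms are hypothetical examples; only the operators follow the documented syntax.
    example_queries = [
        'earthquake*',                    # wildcard at the end of a keyword
        '"strong ground motion"',         # quoted phrase search
        'seismic + hazard',               # AND search (the default)
        'ocean | atmosphere',             # OR search
        'climate - model',                # NOT operation
        '(seismic | volcanic) + hazard',  # parentheses set priority
        'seismolgy~2',                    # ~N after a word: edit distance (fuzziness) of 2
        '"land cover data"~3',            # ~N after a phrase: slop of 3
    ]

    for query in example_queries:
        print(query)
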
Found 83 result(s)
SCEC's mission includes gathering data on earthquakes, both in Southern California and in other locales; integrating the information into a comprehensive understanding of earthquake phenomena; and communicating useful knowledge for reducing earthquake risk to society at large. The SCEC community consists of more than 600 scientists from 16 core institutions and 47 additional participating institutions. SCEC is funded by the National Science Foundation and the U.S. Geological Survey.
The China Earthquake Data Center provides seismic data, geomagnetic data, geoelectric data, terrain data and underground fluid change data. It is open only within the Seismological Bureau.
As the national oceanographic data centre for Canada, MEDS maintains centralized repositories of some oceanographic data types collected in Canada, coordinates data exchanges between DFO and recognized intergovernmental organizations, and acts as a central point for oceanographic data requests. Real-time, near real-time (for operational oceanography) or historical data are made available as appropriate.
The Geoscience Data Repository (GDR) is a collection of Earth Sciences Sector geoscience databases that is managed and accessed by a series of Information Services (GDRIS). This site allows you to discover, view and download information using these services. About 27 data resources are listed and many are also listed in the GeoConnections Discovery Portal.
The National Earthquake Database (NEDB) comprises a number of separate databases that together act as the national repository for all raw seismograph data, measurements, and derived parameters arising from the Canadian National Seismograph Network (CNSN), the Yellowknife Seismological Array (YKA), previous regional telemetered networks in eastern and western Canada (ECTN, WCTN), local telemetered networks (CLTN, SLTN), the Regional Analogue Network, and the former Standard Seismograph Network (CSN). It supports the efforts of Earthquakes Canada in Canadian seismicity monitoring, global seismic monitoring, verification of the Comprehensive Nuclear-Test-Ban Treaty, and international data exchange. It also supports the Nuclear Explosion Monitoring project.
The European Database of Seismogenic Faults (EDSF) was compiled in the framework of the EU Project SHARE, Work Package 3, Task 3.2. EDSF includes only faults that are deemed to be capable of generating earthquakes of magnitude equal to or larger than 5.5 and aims at ensuring a homogeneous input for use in ground-shaking hazard assessment in the Euro-Mediterranean area. Several research institutions participated in this effort with the contribution of many scientists (see the Database section for a full list). The EDSF database and website are hosted and maintained by INGV.
The USGS currently houses the institute at the Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota. The LCI will address land cover topics from local to global scales, in both domestic and international settings. The USGS, through the Land Cover Institute, serves as a facilitator for land cover and land use science, applications, and production functions. The institute assists in the availability and technical support of land cover data sets through increasing public and scientific awareness of the importance of land cover science. After the reorganization of the World Data Centers in 2009, the LCI continues to serve as the World Data Center (WDC) for land cover, providing access to, or information about, land cover data of the world.
Archiving data and housing geological collections is an important role the Bureau of Geology plays in improving our understanding of the geology of New Mexico. Aside from our numerous publications, several datasets are available to the public. Data in this repository supplements the papers in our publications. Please refer to both the published material and the repository documentation before using this data. Please cite repository data as shown in each repository listing.
The Index to Marine and Lacustrine Geological Samples is a tool to help scientists locate and obtain geologic material from sea floor and lakebed cores, grabs, and dredges archived by participating institutions around the world. Data and images related to the samples are prepared and contributed by the institutions for access via the IMLGS and long-term archive at NGDC. Before proposing research on any sample, please contact the curator for sample condition and availability. Since 1977, a consortium of curators has guided the IMLGS, which is maintained on behalf of the group by NGDC.
The VDC is a public, web-based search engine for accessing worldwide earthquake strong ground motion data. While the primary focus of the VDC is on data of engineering interest, it is also an interactive resource for scientific research and government and emergency response professionals.
A Climate Data Record (CDR) is a time series of measurements of sufficient length, consistency and continuity to determine climate variability and change. The fundamental CDRs include sensor data, such as calibrated radiances and brightness temperatures, that scientists have improved and quality-controlled along with the data used to calibrate them. The thematic CDRs include geophysical variables derived from the fundamental CDRs, such as sea surface temperature and sea ice concentration, and they are specific to various disciplines.
Copernicus is a European system for monitoring the Earth. Copernicus consists of a complex set of systems which collect data from multiple sources: earth observation satellites and in situ sensors such as ground stations, airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities. In fact, several economic studies have already demonstrated a huge potential for job creation, innovation and growth.
The BGS is a data-rich organisation with over 400 datasets in its care, including environmental monitoring data, digital databases, physical collections (borehole core, rocks, minerals and fossils), records and archives. Our data is managed by the National Geoscience Data Centre.
GAWSIS is being developed and maintained by the Federal Office of Meteorology and Climatology MeteoSwiss in collaboration with the WMO GAW Secretariat, the GAW World Data Centres and other GAW representatives to improve the management of information about the GAW network of ground-based stations. The application is presently hosted by the Swiss Laboratories for Materials Testing and Research (Empa). GAWSIS provides the GAW community and other interested people with an up-to-date, searchable database of site descriptions, measurement programs, available data, contact people, and bibliographic references. Linked data collections are hosted at the World Data Centres of the WMO Global Atmosphere Watch.
HALO-DB is the web platform of a data retrieval and long-term archive system. The system was established to hold and manage a wide range of data from observations made by the HALO research aircraft, together with data related to those observations. HALO (High-Altitude and LOng-range aircraft) is the new German research aircraft of the German science community (DFG). The aircraft, a Gulfstream GV-550 business jet, is strongly modified for use as a research platform. HALO offers several advantages for scientific campaigns, such as a long range of more than 10,000 km, a high maximum altitude of more than 15 km, and a relatively high payload.
On 29 August 2008, the company RapidEye AG of Brandenburg placed five satellites into orbit that can be pointed at any location on Earth within a day. The data are of interest to a number of large and small companies for applications ranging from harvest planning to the assessment of insurance claims in the case of natural disasters. Via the RapidEye Science Archive (RESA), science users can receive, free of charge, optical image data of the RapidEye satellite fleet. Imagery is allocated based on a proposal to be submitted via the RESA Portal, which will be evaluated by independent experts.
The AOML Environmental Data Server (ENVIDS) provides interactive, on-line access to various oceanographic and atmospheric datasets residing at AOML. The in-house datasets include Atlantic Expendable Bathythermograph (XBT), Global Lagrangian Drifting Buoy, Hurricane Flight Level, and Atlantic Hurricane Tracks (North Atlantic Best Track and Synoptic). Other available datasets include Pacific Conductivity/Temperature/Depth Recorder (CTD) and World Ocean Atlas 1998.
PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts, which were formerly sent based only on event magnitude and location or on population exposure to shaking, will now also be generated based on the estimated range of fatalities and economic losses. PAGER uses these earthquake parameters to calculate estimates of ground shaking using the methodology and software developed for ShakeMap. ShakeMap sites provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. These maps are used by federal, state, and local organizations, both public and private, for post-earthquake response and recovery, public and scientific information, as well as for preparedness exercises and disaster planning.
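As a rough illustration of the exposure-based approach described above, the sketch below multiplies the population exposed at each shaking-intensity level by an empirical fatality rate and sums the result. The intensity bins, exposure counts, and rates are hypothetical placeholder values, not PAGER's calibrated loss models.

    # Minimal sketch of an exposure-based fatality estimate in the spirit of PAGER:
    # sum over shaking-intensity bins of (exposed population x fatality rate).
    # All numbers below are hypothetical placeholders, not PAGER model values.

    exposure_by_mmi = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}  # people exposed per MMI level
    fatality_rate_by_mmi = {6: 1e-5, 7: 1e-4, 8: 1e-3, 9: 1e-2}        # deaths per exposed person

    expected_fatalities = sum(
        exposure_by_mmi[mmi] * fatality_rate_by_mmi[mmi] for mmi in exposure_by_mmi
    )
    print(f"Estimated fatalities: {expected_fatalities:.0f}")
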
Strong-motion data of engineering and scientific importance from the United States and other seismically active countries are served through the Center for Engineering Strong Motion Data (CESMD). The CESMD now automatically posts strong-motion data from an increasing number of seismic stations in California within a few minutes following an earthquake as an Internet Quick Report (IQR). As appropriate, IQRs are updated by more comprehensive Internet Data Reports that include reviewed versions of the data and maps showing, for example, the finite fault rupture along with the distribution of recording stations. Automated processing of strong-motion data will be extended to post the strong-motion records of the regional seismic networks of the Advanced National Seismic System (ANSS) outside California.
The global data compilation consisting of ca. 60,000 data points may be downloaded in csv/xml format. This compilation does not contain the descriptive codes relating to metadata that were included in the previous compilations. Users are advised to consult the references and make their own interpretations as to the quality of the data.
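Since the compilation is distributed as CSV, a quick first inspection after download could look like the sketch below; the file name is a placeholder, and the column names depend on the actual release.

    # Minimal sketch of loading the downloaded CSV compilation for a first look.
    # "compilation.csv" is a placeholder file name, not the actual download name.
    import csv

    with open("compilation.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    print(f"{len(rows)} data points loaded")          # expected: ca. 60,000
    if rows:
        print("columns:", ", ".join(rows[0].keys()))  # inspect available fields
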
SCISAT, also known as the Atmospheric Chemistry Experiment (ACE), is a Canadian Space Agency small satellite mission for remote sensing of the Earth's atmosphere using solar occultation. The satellite was launched on 12 August 2003 and continues to function perfectly. The primary mission goal is to improve our understanding of the chemical and dynamical processes that control the distribution of ozone in the stratosphere and upper troposphere, particularly in the Arctic. The high precision and accuracy of solar occultation make SCISAT useful for monitoring changes in atmospheric composition and for the validation of other satellite instruments. The satellite carries two instruments. A high-resolution (0.02 cm⁻¹) infrared Fourier transform spectrometer (FTS) operating from 2 to 13 microns (750-4400 cm⁻¹) measures the vertical distribution of trace gases, particles and temperature. This provides vertical profiles of atmospheric constituents, including essentially all of the major species associated with ozone chemistry. Aerosols and clouds are monitored using the extinction of solar radiation at 1.02 and 0.525 microns as measured by two filtered imagers. The vertical resolution of the FTS is about 3-4 km from the cloud tops up to about 150 km. Peter Bernath of the University of Waterloo is the principal investigator. A dual optical spectrograph called MAESTRO (Measurement of Aerosol Extinction in the Stratosphere and Troposphere Retrieved by Occultation) covers the 400-1030 nm spectral region and measures primarily ozone, nitrogen dioxide and aerosol/cloud extinction. It has a vertical resolution of about 1-2 km. Tom McElroy of Environment and Climate Change Canada is the principal investigator. ACE data are freely available from the University of Waterloo website. SCISAT was designated an ESA Third Party Mission in 2005, and ACE data are also freely available through an ESA portal.
The Keck Observatory Archive (KOA) is a collaboration between the NASA Exoplanet Science Institute (NExScI) and the W. M. Keck Observatory (WMKO). The collaboration is funded by NASA. KOA has been archiving data from the High Resolution Echelle Spectrograph (HIRES) since August 2004 and data acquired with the Near InfraRed echelle SPECtrograph (NIRSPEC) since May 2010. The archived data extend back to 1994 for HIRES and 1999 for NIRSPEC. The W. M. Keck Observatory Archive (KOA) ingests and curates data from the following instruments: DEIMOS, ESI, HIRES, KI, LRIS, MOSFIRE, NIRC2, and NIRSPEC.
BLLAST is a research programme aimed at exploring the late afternoon transition of the atmospheric boundary layer. The late afternoon period of the diurnal cycle of the boundary layer is poorly understood, yet it is an important transition period that affects the transport and dilution of water vapour and trace species. The main questions addressed by the project are: How does turbulence activity fade when heating by the surface decreases? What is the impact on the transport of chemical species? How can the relevant processes be represented in numerical models? To answer these questions, a field campaign was carried out during the summer of 2011 (from June 14 to July 8). Many observation systems were deployed and operated by research teams from France and abroad, spanning a large spectrum of space and time scales in order to achieve a comprehensive description of boundary layer processes. The observation strategy consisted in intensifying operations in the late afternoon with tethered balloons, research aircraft and UAVs.