  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set the priority of operations
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
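The operators above can be freely combined into a single query string. A minimal sketch of how such strings compose (the helper names are purely illustrative — the search box accepts the raw strings directly):

```python
# Small helpers that emit query strings in the search syntax documented above.
# These helper names are hypothetical; only the resulting strings matter.

def phrase(text):
    """Wrap a phrase in quotes so it is matched as a unit."""
    return f'"{text}"'

def wildcard(stem):
    """A trailing * matches any completion of the stem."""
    return f"{stem}*"

def fuzzy(word, n):
    """~N after a word allows up to N character edits."""
    return f"{word}~{n}"

def any_of(*terms):
    """| is OR; parentheses set the priority."""
    return "(" + " | ".join(terms) + ")"

def exclude(term):
    """- negates a term (NOT)."""
    return f"-{term}"

# e.g. repositories about climate or oceans, excluding satellite data:
query = " + ".join([any_of(wildcard("climat"), "ocean"), exclude("satellite")])
# → (climat* | ocean) + -satellite
```

Because + (AND) is the default operator, simply listing terms side by side has the same effect as joining them with +.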
Found 31 result(s)
The Open Government Data Portal of Tamil Nadu is a platform, designed by the National Informatics Centre, for the Open Data initiative of the Government of Tamil Nadu. The portal publishes datasets collected by the Tamil Nadu Government for public use from different perspectives. It was created under the Software as a Service (SaaS) model of the Open Government Data (OGD) Platform and publishes datasets in open formats such as CSV, XLS, ODS/OTS, XML, RDF, KML and GML. The portal comprises the following modules: (a) a Data Management System (DMS) through which state government agencies contribute data catalogs, which are made available on the front-end website after a due approval process through a defined workflow; (b) a Content Management System (CMS) for managing and updating functionality and content types; (c) Visitor Relationship Management (VRM) for collating and disseminating viewer feedback on data catalogs; and (d) a Communities module through which community users can interact and share their views and common interests. It includes both geospatial and non-spatial datasets, classified as shareable and non-shareable data. Geospatial data consist primarily of satellite data, maps, etc.; non-spatial data are derived from national accounts statistics, price indices, censuses and surveys produced by a statistical mechanism. It follows the principles of data sharing and accessibility: openness, flexibility, transparency, quality, security and machine readability.
Copernicus is a European system for monitoring the Earth. It consists of a complex set of systems which collect data from multiple sources: Earth observation satellites and in situ sensors such as ground stations and airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities; several economic studies have already demonstrated substantial potential for job creation, innovation and growth.
The Portal is intended to be used as a catalog of datasets published by ministries, departments and organizations of the Government of India for public use, in order to enhance transparency in the functioning of the Government and to enable innovative visualizations of the data. This National Data Portal is updated frequently and is designed to be as accessible as possible to all, irrespective of physical challenges or the technology used.
GEOFON seeks to facilitate cooperation in seismological research and earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real time access to data from this network and those of many partner networks and plate boundary observatories, providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation and encouraging the full exploitation of all acquired data and serving as the permanent archive for the Geophysical Instrument Pool at Potsdam (GIPP). It also organises the data exchange of real-time and archived data with partner institutions and international centres.
PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts – which were formerly sent based only on event magnitude and location, or population exposure to shaking – will now also be generated based on the estimated range of fatalities and economic losses. PAGER uses these earthquake parameters to calculate estimates of ground shaking using the methodology and software developed for ShakeMaps. ShakeMap sites provide near-real-time maps of ground motion and shaking intensity following significant earthquakes. These maps are used by federal, state, and local organizations, both public and private, for post-earthquake response and recovery, public and scientific information, as well as for preparedness exercises and disaster planning.
The Bavarian Natural History Collections (Staatliche Naturwissenschaftliche Sammlungen Bayerns, SNSB) are a research institution for natural history in Bavaria. They encompass five State Collections (zoology, botany, paleontology and geology, mineralogy, anthropology and paleoanatomy), the Botanical Garden Munich-Nymphenburg and eight museums with public exhibitions in Munich, Bamberg, Bayreuth, Eichstätt and Nördlingen. Our research focuses mainly on the past and present bio- and geodiversity and the evolution of animals and plants. To achieve this we have large scientific collections (almost 35,000,000 specimens), see "joint projects".
BLLAST is a research programme aimed at exploring the late-afternoon transition of the atmospheric boundary layer. The late-afternoon period of the boundary layer's diurnal cycle is poorly understood, yet it is an important transition that affects the transport and dilution of water vapour and trace species. The main questions addressed by the project are: How does turbulence activity fade when heating by the surface decreases? What is the impact on the transport of chemical species? How can the relevant processes be represented in numerical models? To answer these questions, a field campaign was carried out during the summer of 2011 (from June 14 to July 8). Many observation systems were deployed and operated by research teams from France and abroad, spanning a large range of space and time scales in order to achieve a comprehensive description of boundary-layer processes. The observation strategy consisted of intensifying operations in the late afternoon with tethered balloons, research aircraft and UAVs.
The MGDS MediaBank contains high quality images, illustrations, animations and video clips that are organized into galleries. Media can be sorted by category, and keyword and map-based search options are provided. Each item in the MediaBank is accompanied by metadata that provides access into our cruise catalog and data repository.
Protectedplanet.net combines crowdsourcing and authoritative sources to enrich and provide data on protected areas around the world. Data are provided in partnership with the World Database on Protected Areas (WDPA). The data include the location, designation type, status year and size of the protected areas, as well as species information.
Geochron is a global database that hosts geochronologic and thermochronologic information from detrital minerals. Information included with each sample consists of a table with the essential isotopic information and ages, a table with basic geologic metadata (e.g., location, collector, publication, etc.), a Pb/U Concordia diagram, and a relative age probability diagram. This information can be accessed and viewed with any web browser, and depending on the level of access desired, can be designated as either private or public. Loading information into Geochron requires the use of U-Pb_Redux, a Java-based program that also provides enhanced capabilities for data reduction, plotting, and analysis. Instructions are provided for three different levels of interaction with Geochron:
  • accessing samples that are already in the Geochron database;
  • preparing information for new samples, then transferring it to Arizona LaserChron Center personnel for uploading to Geochron;
  • preparing information and uploading it to Geochron using U-Pb_Redux.
ReefTEMPS is a network of temperature, pressure, salinity and other sensors in the coastal areas of the South, West and South-West Pacific Ocean, operated by UMR ENTROPIE. It is an observatory service of the French national research infrastructure ILICO for coastal environments. Some of the network's sensors have been deployed since 1958. Nearly a hundred sensors are currently deployed in 14 countries, covering an area spanning more than 8,000 km from east to west. Data are acquired at different rates (from 1 s to 30 min) depending on the sensor and site. They are processed and described using the Climate and Forecast (CF) Metadata Convention at the end of the oceanographic campaigns organized every 6 months to 2 years to replace the sensors.
The Humanitarian Data Exchange (HDX) is an open platform for sharing data across crises and organisations. Launched in July 2014, the goal of HDX is to make humanitarian data easy to find and use for analysis. HDX is managed by OCHA's Centre for Humanitarian Data, which is located in The Hague. OCHA is part of the United Nations Secretariat and is responsible for bringing together humanitarian actors to ensure a coherent response to emergencies. The HDX team includes OCHA staff and a number of consultants who are based in North America, Europe and Africa.
The Earth System Grid Federation (ESGF) is an international collaboration with a current focus on serving the World Climate Research Programme's (WCRP) Coupled Model Intercomparison Project (CMIP) and supporting climate and environmental science in general. Data is searchable and available for download at the Federated ESGF-CoG Nodes https://esgf.llnl.gov/nodes.html
The NIH 3D Print Exchange (the “Exchange”) is an open, comprehensive, and interactive website for searching, browsing, downloading, and sharing biomedical 3D print files, modeling tutorials, and educational material. "Biomedical" includes models of cells, bacteria, or viruses, molecules like proteins or DNA, and anatomical models of organs, tissue, and body parts. The NIH 3D Print Exchange provides models in formats that are readily compatible with 3D printers, and offers a unique set of tools to create and share 3D-printable models related to biomedical science.
Funded by the National Science Foundation (NSF) and proudly operated by Battelle, the National Ecological Observatory Network (NEON) program provides open, continental-scale data across the United States that characterize and quantify complex, rapidly changing ecological processes. The Observatory’s comprehensive design supports greater understanding of ecological change and enables forecasting of future ecological conditions. NEON collects and processes data from field sites located across the continental U.S., Puerto Rico, and Hawaii over a 30-year timeframe. NEON provides free and open data that characterize plants, animals, soil, nutrients, freshwater, and the atmosphere. These data may be combined with external datasets or data collected by individual researchers to support the study of continental-scale ecological change.
Climate Data Online (CDO) provides free access to NCDC's archive of global historical weather and climate data, in addition to station history information. These data include quality-controlled daily, monthly, seasonal, and yearly measurements of temperature, precipitation, wind, and degree days, as well as radar data and 30-year Climate Normals.
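CDO data can be retrieved programmatically as well as through the website. A minimal sketch of composing a request URL, assuming the public CDO v2 web-service endpoint and parameter names (real requests additionally require a free personal token sent in a `token` HTTP header; the station ID below is only an example):

```python
from urllib.parse import urlencode

# Assumed CDO v2 web-service endpoint; an API token (sent as the
# 'token' HTTP header) is required to actually fetch data.
CDO_BASE = "https://www.ncdc.noaa.gov/cdo-web/api/v2/data"

def build_cdo_url(datasetid, stationid, startdate, enddate):
    """Compose a query URL for observations from one station."""
    params = {
        "datasetid": datasetid,   # e.g. GHCND = daily summaries
        "stationid": stationid,   # hypothetical example station below
        "startdate": startdate,   # ISO dates, e.g. 2020-01-01
        "enddate": enddate,
        "units": "metric",
        "limit": 1000,
    }
    return f"{CDO_BASE}?{urlencode(params)}"

url = build_cdo_url("GHCND", "GHCND:USW00094728", "2020-01-01", "2020-01-31")
```

The JSON response would then be paged through using the `limit` and `offset` parameters.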
The repository is no longer available. CCRIS information has been migrated to PubChem (https://www.ncbi.nlm.nih.gov/pcsubstance?term=%22Chemical%20Carcinogenesis%20Research%20Information%20System%20(CCRIS)%22%5BSourceName%5D%20AND%20hasnohold%5Bfilt%5D). Help for CCRIS users in PubChem: https://www.nlm.nih.gov/toxnet/Accessing_CCRIS_Content_from_PubChem.html (PDF: https://www.nlm.nih.gov/toxnet/Accessing_CCRIS_Content_from_PubChem.pdf).
OpenStreetMap (https://www.openstreetmap.org/export#map=6/51.324/10.426) is built by a community of mappers that contribute and maintain data about roads, trails, cafés, railway stations, and much more, all over the world. Planet.osm is the OpenStreetMap data in one file.
The UTM Data Centre is responsible for managing spatial data acquired during oceanographic cruises on board the CSIC research vessels (RV Sarmiento de Gamboa, RV García del Cid) and RV Hespérides. The aim is, on the one hand, to disseminate which data exist and where, how and when they were acquired, and on the other hand, to provide access to as much interoperable data as possible, following the FAIR principles, so that they can be used and reused. For this purpose, the UTM maintains a national-level Spatial Data Infrastructure consisting of several services:
  • Oceanographic Cruise and Data Catalogue: metadata from more than 600 cruises carried out since 1991, with links to documentation associated with each cruise, navigation maps and datasets;
  • Geoportal: a geospatial data mapping interface;
  • Underway Plot & QC: visualization, quality control and conversion to standard format of meteorological data and of surface-water temperature and salinity.
At the international level, the UTM is a National Oceanographic Data Centre (NODC) of the distributed European marine data infrastructure SeaDataNet, to which it provides metadata published in the Cruise Summary Report Catalog and the Common Data Index Catalog, as well as public data to be shared.
This platform supports the Open Data initiative of the Government of Odisha and publishes datasets collected by the government for public use. It supports widely used file formats suitable for machine processing, opening avenues for many more innovative uses of government data from different perspectives. The portal was created under the Software as a Service (SaaS) model of NIC's Open Government Data (OGD) Platform India. The data available in the portal are owned by the various departments and organizations of the Government of Odisha. Data sharing and accessibility follow the principles of openness, flexibility, transparency, quality, security and machine readability.
The Social Science Data Archive (SSDA) is still active and maintained as part of the UCLA Library Data Science Center. The SSDA Dataverse is one of SSDA's archiving options; data can also be archived by SSDA itself, by ICPSR, by the UCLA Library, or by the California Digital Library. The Social Science Data Archive serves the UCLA campus as an archive of faculty and graduate-student survey research. We provide long-term storage of data files and documentation, and we ensure that the data remain usable in the future by migrating files to new operating systems, following government standards and archival best practices. The mission of the Social Science Data Archive has been, and continues to be, to provide a foundation for social science research, with faculty support throughout an entire research project involving original data collection or the reuse of publicly available studies. Data Archive staff and researchers work as partners through all stages of the research process: when a hypothesis or area of study is being developed, during grant and funding activities, while data collection and/or analysis is ongoing, and finally in the long-term preservation of research results. Our role is to provide a collaborative environment focused on understanding the nature and scope of the research approach and on managing research output throughout the entire life cycle of the project. Instructional support, especially support that links research with instruction, is also a mainstay of operations.