  • * at the end of a keyword enables wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set grouping and precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
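The operators above can be freely combined in the search box. A minimal sketch of how such queries might look, with made-up search terms purely for illustration (this snippet only assembles example query strings; it does not call the registry's search service):

```python
# Hypothetical example queries illustrating the search operators above.
# The terms themselves (climat*, zebrafish, etc.) are invented for this sketch.
examples = {
    "wildcard":  "climat*",                    # matches climate, climatology, ...
    "phrase":    '"land cover"',               # exact phrase match
    "and":       "earthquake + California",    # both terms required (default)
    "or":        "zebrafish | medaka",         # either term
    "not":       "satellite - radar",          # exclude a term
    "grouping":  "(ocean | marine) + bathymetry",
    "fuzziness": "achive~1",                   # words within edit distance 1, e.g. "archive"
    "slop":      '"data archive"~2',           # phrase terms up to 2 positions apart
}

for name, query in examples.items():
    print(f"{name:10s} {query}")
```

Fuzziness (~N after a word) tolerates typos, while slop (~N after a phrase) tolerates intervening or reordered words inside the quoted phrase.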
Found 30 result(s)
SCEC's mission includes gathering data on earthquakes, both in Southern California and other locales; integrating the information into a comprehensive understanding of earthquake phenomena; and communicating useful knowledge for reducing earthquake risk to society at large. The SCEC community consists of more than 600 scientists from 16 core institutions and 47 additional participating institutions. SCEC is funded by the National Science Foundation and the U.S. Geological Survey.
>>>!!!<<< 2019-01: The Global Land Cover Facility has gone offline (see https://spatialreserves.wordpress.com/2019/01/07/global-land-cover-facility-goes-offline/); http://www.landcover.org is no longer accessible. >>>!!!<<< The Global Land Cover Facility (GLCF) provides earth science data and products to help everyone better understand global environmental systems. In particular, the GLCF develops and distributes remotely sensed satellite data and products that explain land cover from the local to global scales.
The Community Data Program (CDP) is a membership-based community development initiative open to any Canadian public, non-profit or community sector organization with a local service delivery or public policy mandate. The program facilitates access to the evidence needed to tell our stories and inform effective and responsive policy and program design and implementation. The CDP makes data accessible and useful for all members with training and capacity building resources. Through its vibrant network, the CDP facilitates and supports dialogue and the sharing of best practices in the use of community data. The CDP has emerged as a unique Canada-wide platform for generating information, convening and collaborating.
WorldData.AI comes with a built-in workspace – the next-generation hyper-computing platform powered by a library of 3.3 billion curated external trends. WorldData.AI allows you to save your models in its “My Models Trained” section. You can make your models public and share them on social media with interesting images, model features, summary statistics, and feature comparisons. Empower others to leverage your models. For example, if you have discovered a previously unknown impact of interest rates on new-housing demand, you may want to share it through “My Models Trained.” Upload your data and combine it with external trends to build, train, and deploy predictive models with one click! WorldData.AI inspects your raw data, applies feature processors, chooses the best set of algorithms, trains and tunes multiple models, and then ranks model performance.
Copernicus is a European system for monitoring the Earth. Copernicus consists of a complex set of systems which collect data from multiple sources: earth observation satellites and in situ sensors such as ground stations, airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities. In fact, several economic studies have already demonstrated a huge potential for job creation, innovation and growth.
The Wolfram Data Repository is a public resource that hosts an expanding collection of computable datasets, curated and structured to be suitable for immediate use in computation, visualization, analysis and more. Building on the Wolfram Data Framework and the Wolfram Language, the Wolfram Data Repository provides a uniform system for storing data and making it immediately computable and useful. With datasets of many types and from many sources, the Wolfram Data Repository is built to be a global resource for public data and data-backed publication.
Since January 2012, two previously independent resources, "ViFaArt – Virtual Library for Contemporary Art" and "arthistoricum.net – Virtual Library for Art History", have been joined together, forming a new service called arthistoricum.net. This union now makes it possible to research the whole subject spectrum of Art History. The special interest collection of Art History focuses on Medieval and Early European Art History, including art influenced by Europe in the USA, Canada and Australia, continuing chronologically from the Early Christian era until 1945. The special interest collection of Contemporary Art extends the art historical subject spectrum to include European and North American Art History from 1945. arthistoricum.net contains text and image resources as well as comprehensive, academically relevant information dealing with all media from the Middle Ages up to the present. arthistoricum.net pools the resources and know-how of the responsible partner institutions, thus making this portal an essential forum for research and teaching.
The European Xenopus Resource Centre (EXRC) is situated in Portsmouth, United Kingdom and provides tools and services to support researchers using Xenopus models. The EXRC depends on researchers to obtain and deposit Xenopus transgenic and mutant lines, Xenopus in-situ hybridization clones, Xenopus-specific antibodies and other resources with the centre. EXRC staff perform quality assurance testing on these reagents and then make them available to the community at cost. The EXRC also supplies wild type Xenopus, embryos, oocytes, egg extracts, X. tropicalis fosmids, X. laevis BACs and ORFeomes.
DARTS primarily archives high-level data products obtained by JAXA's space science missions in astrophysics (X-rays, radio, infrared), solar physics, solar-terrestrial physics, and lunar and planetary science. In addition, we archive related space science data products obtained by other domestic or foreign institutes, and provide data services to facilitate use of these data.
The NIST Data Gateway provides easy access to many of the NIST scientific and technical databases. These databases cover a broad range of substances and properties from many different scientific disciplines. The Gateway includes links to free online NIST data systems as well as to information on NIST PC databases available for purchase.
The World Data Center for Remote Sensing of the Atmosphere, WDC-RSAT, offers scientists and the general public free access (in the sense of a "one-stop shop") to a continuously growing collection of atmosphere-related satellite-based data sets (ranging from raw to value-added data), information products and services. The focus is on atmospheric trace gases, aerosols, dynamics, radiation, and cloud physical parameters. Complementary information and data on surface parameters (e.g. vegetation index, surface temperatures) are also provided. This is achieved either by giving access to data stored at the data center or by acting as a portal containing links to other providers.
FactSage is a fully integrated Canadian thermochemical database system which couples proven software with self-consistent critically assessed thermodynamic data. It currently contains data on over 5000 chemical substances as well as solution databases representing over 1000 non-ideal multicomponent solutions (oxides, salts, sulfides, alloys, aqueous, etc.). FactSage is available for use with Windows.
The UK Data Archive, based at the University of Essex, is curator of the largest collection of digital data in the social sciences and humanities in the United Kingdom. With several thousand datasets relating to society, both historical and contemporary, our Archive is a vital resource for researchers, teachers and learners. We are an internationally acknowledged centre of expertise in the areas of acquiring, curating and providing access to data. We are the lead partner in the UK Data Service (https://service.re3data.org/repository/r3d100010230) through which data users can browse collections online and register to analyse and download them. Open Data collections are available for anyone to use. The UK Data Archive is a Trusted Digital Repository (TDR) certified against the CoreTrustSeal (https://www.coretrustseal.org/) and certified against ISO27001 for Information Security (https://www.iso.org/isoiec-27001-information-security.html).
A consolidated feed from 35 million instruments provides sophisticated normalized data, streamlining analysis and decisions from front office to operations. And with flexible delivery options including cloud and API, timely accurate data enables the enterprise to capture opportunities, evaluate risk and ensure compliance in fast-moving markets.
The UK Hydrographic Office (UKHO) is a world-leading centre for hydrography, specialising in marine geospatial data to support safe, secure and thriving oceans. UK Hydrographic Office Bathymetry Data Archive Centre (UKHO DAC) is the UK national repository for bathymetry data. It is provided by the UK Hydrographic Office (UKHO) as part of the wider Marine Environmental Data and Information Network (MEDIN) https://medin.org.uk/. The UKHO DAC holds bathymetry data assets from a wide range of sources – Government funded, commercial, environmental and defence. The ADMIRALTY Marine Data Portal https://www.gov.uk/guidance/inspire-portal-and-medin-bathymetry-data-archive-centre provides access to marine data sets held by the UK Hydrographic Office within the UK Exclusive Economic Zone (EEZ).
The Norwegian Meteorological Institute supplies climate observations and weather data and forecasts for the country and surrounding waters (including the Arctic). In addition, commercial services are provided to fit customers' requirements. Data are served through a number of subsystems (information provided in the repository link) and cover data from internal services of the institute, from external services operated by the institute, and from research projects in which the institute participates. Further information is provided on the landing page, which also contains entry points to some of the data portals operated by the institute.
Arachne is the central object-database of the German Archaeological Institute (DAI). In 2004 the DAI and the Research Archive for Ancient Sculpture at the University of Cologne (FA) joined the effort to support Arachne as a tool for free internet-based research. Arachne's database design uses a model that builds on one of the most basic assumptions one can make about archaeology, classical archaeology or art history: all activities in these areas can most generally be described as contextualizing objects. Arachne tries to avoid the basic mistakes of earlier databases, which limited their object modeling to specific project-oriented aspects, thus creating separated containers of only a small number of objects. All objects inside Arachne share a general part of their object model, to which a more class-specific part is added that describes the specialised properties of a category of material like architecture or topography. Seen on the level of the general part, a powerful pool of material can be used for general information retrieval, whereas on the level of categories and properties, very specific structures can be displayed.
Europeana is the trusted source of cultural heritage brought to you by the Europeana Foundation and a large number of European cultural institutions, projects and partners. It's a real piece of team work. Ideas and inspiration can be found within the millions of items on Europeana. These objects include:
  • Images - paintings, drawings, maps, photos and pictures of museum objects
  • Texts - books, newspapers, letters, diaries and archival papers
  • Sounds - music and spoken word from cylinders, tapes, discs and radio broadcasts
  • Videos - films, newsreels and TV broadcasts
All texts are CC BY-SA; images and media are licensed individually.
The Arabidopsis Information Resource (TAIR) maintains a database of genetic and molecular biology data for the model higher plant Arabidopsis thaliana. Data available from TAIR include the complete genome sequence along with gene structure, gene product information, metabolism, gene expression, DNA and seed stocks, genome maps, genetic and physical markers, publications, and information about the Arabidopsis research community. Gene product function data are updated every two weeks from the latest published research literature and community data submissions. Gene structures are updated 1-2 times per year using computational and manual methods as well as community submissions of new and updated genes. TAIR also provides extensive linkouts from its data pages to other Arabidopsis resources.
The Alaska Climate Research Center archives and provides digital climate records, climate statistics, and monthly weather summaries on Alaska and the polar regions. The Alaska Climate Research Center is part of the Geophysical Institute at the University of Alaska Fairbanks.
An online materials database (known as the PAULING FILE project) with nearly 2 million entries: physical properties, crystal structures, and phase diagrams, available via API and ready for modern data-intensive applications. These entries are derived from about 0.5 million peer-reviewed publications in materials science, processed over the last 30 years by an international team of PhD editors. The results are presented online with a quick search interface. Basic access is provided for free.
The EZRC at KIT houses the largest experimental fish facility in Europe, with a capacity of more than 300,000 fish. Zebrafish stocks are maintained mostly as frozen sperm; frequently requested lines are also kept alive, as are a selection of wildtype strains. The collection includes several thousand mutations in protein-coding genes generated by TILLING in the Stemple lab of the Sanger Centre, Hinxton, UK, lines generated by ENU mutagenesis in the Nüsslein-Volhard lab, and transgenic lines and mutants generated by KIT groups or brought in through collaborations. We also accept submissions on an individual basis and ship fish upon request to PIs in Europe and elsewhere. The EZRC also provides screening services and technologies such as imaging and high-throughput sequencing. Key areas include automation of embryo handling and automated image acquisition and processing. Our platform also involves the development of novel microscopy techniques (e.g. SPIM, DSLM, robotic macroscope) to permit high-resolution, real-time imaging in 4D. Through its association with the ComPlat platform, the EZRC can also support chemical screens and offers libraries with up to 20,000 compounds in total for external users. As another service to the community, the EZRC provides plasmids (cDNAs, transgenes, TALEN, CRISPR/Cas9) maintained by the Helmholtz Repository of Bioparts (HERBI) to the scientific community. In addition, the fish facility keeps a range of medaka stocks, maintained by the Loosli group.
Knoema is a knowledge platform. The basic idea is to connect data with analytical and presentation tools, resulting in one uniform platform for users to access, present and share data-driven content. Within Knoema, we capture most aspects of a typical data use cycle: accessing data from multiple sources, bringing relevant indicators into a common space, visualizing figures, applying analytical functions, creating a set of dashboards, and presenting the outcome.