  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply grouping priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
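The operators above follow Lucene-style query syntax. The sketch below illustrates them with invented example queries (none are taken from the site itself), plus a plain dynamic-programming edit distance to show what the ~N fuzziness measures:

```python
# Hypothetical example queries illustrating each operator listed above.
examples = [
    ("climat*",                     "wildcard: matches climate, climatology, ..."),
    ('"land cover"',                "exact phrase"),
    ("ocean + temperature",        "AND (the default)"),
    ("earthquake | seismic",       "OR"),
    ("archive - commercial",       "NOT"),
    ("(ocean | sea) + temperature", "grouping with priority"),
    ("seismolgy~1",                "fuzzy: terms within edit distance 1"),
    ('"data archive"~2',           "phrase slop: up to 2 positions of movement"),
]
for query, meaning in examples:
    print(f"{query:30} {meaning}")

def levenshtein(a: str, b: str) -> int:
    """Standard dynamic-programming edit distance, the quantity ~N bounds."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# "seismolgy" is one insertion away from "seismology",
# so the fuzzy query seismolgy~1 would match that term.
print(levenshtein("seismolgy", "seismology"))  # → 1
```

The edit-distance function is only a model of what the server computes; the actual matching happens in the search backend.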
Found 30 result(s)
SCEC's mission includes gathering data on earthquakes, both in Southern California and other locales; integrating the information into a comprehensive understanding of earthquake phenomena; and communicating useful knowledge for reducing earthquake risk to society at large. The SCEC community consists of more than 600 scientists from 16 core institutions and 47 additional participating institutions. SCEC is funded by the National Science Foundation and the U.S. Geological Survey.
The National Archives and Records Administration (NARA) is the nation's record keeper. Of all documents and materials created in the course of business conducted by the United States Federal government, only 1%-3% are so important for legal or historical reasons that they are kept by us forever. Those valuable records are preserved and are available to you, whether you want to see if they contain clues about your family’s history, need to prove a veteran’s military service, or are researching an historical topic that interests you.
>>>!!!<<< 2019-01: Global Land Cover Facility goes offline; see https://spatialreserves.wordpress.com/2019/01/07/global-land-cover-facility-goes-offline/ ; no more access to http://www.landcover.org >>>!!!<<< The Global Land Cover Facility (GLCF) provides earth science data and products to help everyone better understand global environmental systems. In particular, the GLCF develops and distributes remotely sensed satellite data and products that explain land cover from the local to global scales.
As a member of SWE-CLARIN, the Humanities Lab will provide tools and expertise related to language archiving, corpus and (meta)data management, with a continued emphasis on multimodal corpora, many of which contain Swedish resources, but also other (often endangered) languages, multilingual or learner corpora. As a CLARIN K-centre we provide advice on multimodal and sensor-based methods, including EEG, eye-tracking, articulography, virtual reality, motion capture, and AV recording. Current work targets automatic data retrieval from multimodal data sets, as well as the linking of measurement data (e.g. EEG, fMRI) or geo-demographic data (GIS, GPS) to language data (audio, video, text, annotations). We also provide assistance with speech and language technology related matters to various projects. A primary resource in the Lab is The Humanities Lab corpus server, containing a varied set of multimodal language corpora with standardised metadata and linked layers of annotations and other resources.
DSpace@MIT is a service of the MIT Libraries that provides MIT faculty, researchers, and their supporting communities with stable, long-term storage for their digital research and teaching output, and maximizes exposure of their content to a world audience. DSpace@MIT content includes conference papers, images, peer-reviewed scholarly articles, preprints, technical reports, theses, working papers, research datasets and more. This collection of more than 60,000 high-quality works is recognized as among the world's premier scholarly repositories and receives, on average, more than 1 million downloads per month.
The National Science Digital Library provides high-quality online educational resources for teaching and learning, with current emphasis on the science, technology, engineering, and mathematics (STEM) disciplines—both formal and informal, institutional and individual, in local, state, national, and international educational settings. The NSDL collection contains structured descriptive information (metadata) about web-based educational resources held on other sites by their providers. These providers have contributed this metadata to NSDL for organized search and open access to educational resources via this website and its services.
GTS AI is an artificial intelligence company that offers data services to its clients. We use high-definition images and high-quality data to support analysis and machine learning. We are a dataset provider and collect data for artificial intelligence applications.
PARADISEC (the Pacific And Regional Archive for Digital Sources in Endangered Cultures) offers a facility for digital conservation and access to endangered materials from all over the world. Our research group has developed models to ensure that the archive can provide access to interested communities, and conforms with emerging international standards for digital archiving. We have established a framework for accessioning, cataloguing and digitising audio, text and visual material, and preserving digital copies. The primary focus of this initial stage is safe preservation of material that would otherwise be lost, especially field tapes from the 1950s and 1960s.
The Wolfram Data Repository is a public resource that hosts an expanding collection of computable datasets, curated and structured to be suitable for immediate use in computation, visualization, analysis and more. Building on the Wolfram Data Framework and the Wolfram Language, the Wolfram Data Repository provides a uniform system for storing data and making it immediately computable and useful. With datasets of many types and from many sources, the Wolfram Data Repository is built to be a global resource for public data and data-backed publication.
The ZINC Database contains commercially available compounds for structure-based virtual screening; all of its compounds can simply be purchased. It is provided in ready-to-dock, 3D formats with molecules represented in biologically relevant forms. It is available in subsets for general screening as well as target-, chemotype- and vendor-focused subsets. ZINC is free for everyone to use and download at the website zinc.docking.org.
>>>!!!<<< On June 1, 2020, the Academic Seismic Portal repositories at UTIG were merged into a single collection hosted at Lamont-Doherty Earth Observatory. Content here was removed July 1, 2020. Visit the Academic Seismic Portal @LDEO! https://www.marine-geo.org/collections/#!/collection/Seismic#summary (https://www.re3data.org/repository/r3d100010644) >>>!!!<<<
TiU Dataverse is the central online repository for research data at Tilburg University. The TiU Dataverse is managed by the Research Data Office (RDO) at Library and IT Services (LIS). TiU Dataverse is part of the DataverseNL network. DataverseNL is a shared data service of several Dutch universities and institutions. The data management is in the hands of the member organizations, while the national organization Data Archiving and Networked Services (DANS) manages the network.
>>>!!!<<< The demand for high-value environmental data and information has dramatically increased in recent years. To improve our ability to meet that demand, NOAA’s former three data centers—the National Climatic Data Center, the National Geophysical Data Center, and the National Oceanographic Data Center, which includes the National Coastal Data Development Center—have merged into the National Centers for Environmental Information (NCEI). >>>!!!<<< The NOAA National Centers for Environmental Information (formerly the National Geophysical Data Center) provide scientific stewardship, products and services for sea floor and lakebed data, including geophysics (gravity, magnetics, seismic reflection, bathymetry, water column sonar), and data derived from sediment and rock samples. NCEI compiles coastal and global digital elevation models and high-resolution models for tsunami inundation studies, provides stewardship for NOS data supporting charts and navigation, and is the US national long-term archive for MGG data.
The Antarctic and Southern Ocean Data Portal, part of the US Antarctic Data Consortium, provides access to geoscience data, primarily marine, from the Antarctic region. The synthesis began in 2003 as the Antarctic Multibeam Bathymetry and Geophysical Data Synthesis (AMBS) with a focus on multibeam bathymetry field data and other geophysical data from the Southern Ocean collected with the R/V N. B. Palmer. In 2005, the effort was expanded to include all routine underway geophysical and oceanographic data collected with both the R/V N. B. Palmer and R/V L. Gould, the two primary research vessels serving the US Antarctic Program.
dictyBase is an integrated genetic and literature database that contains published Dictyostelium discoideum literature, genes, expressed sequence tags (ESTs), as well as the chromosomal and mitochondrial genome sequences. Direct access to the genome browser, a BLAST search tool, the Dictyostelium Stock Center, research tools, colleague databases, and much more is just a mouse click away. dictyBase is a genome portal for the Amoebozoa and is funded by a grant from the National Institute of General Medical Sciences.
The Macaulay Library is the world's largest and oldest scientific archive of biodiversity audio and video recordings. The library collects and preserves recordings of each species' behavior and natural history, facilitates the ability of others to collect and preserve such recordings, and actively promotes the use of these recordings for diverse purposes spanning scientific research, education, conservation, and the arts. The collection's archived analog recordings date back to 1929.
The Fungal Genetics Stock Center has preserved and distributed strains of genetically characterized fungi since 1960. The collection includes over 20,000 accessioned strains of classical and genetically engineered mutants of key model, human, and plant pathogenic fungi. These materials are distributed as living stocks to researchers around the world.
The LISS panel (Longitudinal Internet Studies for the Social sciences) is the principal component of the MESS project. It consists of 5000 households, comprising approximately 7500 individuals. The panel is based on a true probability sample of households drawn from the population register by Statistics Netherlands. Households that could not otherwise participate are provided with a computer and Internet connection. In addition to the LISS panel, an Immigrant panel was available from October 2010 until December 2014. This Immigrant panel consisted of around 1,600 households (2,400 individuals), of which 1,100 households (1,700 individuals) were of non-Dutch origin. The data from this panel are still available through the LISS data archive (https://www.dataarchive.lissdata.nl/study_units/view/162). Panel members complete online questionnaires of about 15 to 30 minutes in total every month. They are paid for each completed questionnaire. One member in the household provides the household data and updates this information at regular intervals.
The SAR Data Center has a large data archive of Synthetic Aperture Radar (SAR) from a variety of sensors available at no cost. Much of the SAR data in the ASF SDC archive is limited in distribution to the scientific research community and U.S. Government Agencies. In accordance with the Memoranda of Understanding (MOU) between the relevant flight agencies (CSA, ESA, JAXA) and the U.S. State Department, the ASF SDC does not distribute SAR data for commercial use. The research community can access the data (ERS-1, ERS-2, JERS-1, RADARSAT-1, and ALOS PALSAR) via a brief proposal process.
The UK Data Archive, based at the University of Essex, is curator of the largest collection of digital data in the social sciences and humanities in the United Kingdom. With several thousand datasets relating to society, both historical and contemporary, our Archive is a vital resource for researchers, teachers and learners. We are an internationally acknowledged centre of expertise in the areas of acquiring, curating and providing access to data. We are the lead partner in the UK Data Service (https://service.re3data.org/repository/r3d100010230) through which data users can browse collections online and register to analyse and download them. Open Data collections are available for anyone to use. The UK Data Archive is a Trusted Digital Repository (TDR) certified against the CoreTrustSeal (https://www.coretrustseal.org/) and certified against ISO27001 for Information Security (https://www.iso.org/isoiec-27001-information-security.html).
The National Archives is home to millions of historical documents, known as records, which were created and collected by UK central government departments and major courts of law. Data from the former National Digital Archive of Datasets (NDAD) collection, which was active from 1997 to 2010 and preserved and provided online access to archived digital datasets and documents from UK central government departments, is integrated. The National Archives also provides access to records held by more than 2,500 other archives.
sciencedata.dk is a research data store provided by DTU, the Danish Technical University, specifically aimed at researchers and scientists at Danish academic institutions. The service is intended for working with and sharing active research data as well as for safekeeping of large datasets. The data can be accessed and manipulated via a web interface, synchronization clients, file transfer clients or the command line. The service is built on and with open-source software from the ground up: FreeBSD, ZFS, Apache, PHP, ownCloud/Nextcloud. DTU is actively engaged in community efforts on developing research-specific functionality for data stores. Our servers are attached directly to the 10-Gigabit backbone of "Forskningsnettet" (the National Research and Education Network of Denmark), implying that upload and download speeds from Danish academic institutions are in principle comparable to those of an external USB hard drive. The service is a data store for research data allowing private sharing and sharing via links / persistent URLs.
The "Database for Spoken German (DGD)" is a corpus management system in the program area Oral Corpora of the Institute for German Language (IDS). It has been online since the beginning of 2012 and since mid-2014 replaces the spoken German database, which was developed in the "Deutsches Spracharchiv (DSAv)" of the IDS. After single registration, the DGD offers external users a web-based access to selected parts of the collection of the "Archive Spoken German (AGD)" for use in research and teaching. The selection of the data for external use depends on the consent of the respective data provider, who in turn must have the appropriate usage and exploitation rights. Also relevant to the selection are certain protection needs of the archive. The Archive for Spoken German (AGD) collects and archives data of spoken German in interactions (conversation corpora) and data of domestic and non-domestic varieties of German (variation corpora). Currently, the AGD hosts around 50 corpora comprising more than 15000 audio and 500 video recordings amounting to around 5000 hours of recorded material with more than 7000 transcripts. With the Research and Teaching Corpus of Spoken German (FOLK) the AGD is also compiling an extensive German conversation corpus of its own. !!! Access to data of Datenbank Gesprochenes Deutsch (DGD) is also provided by: IDS Repository https://www.re3data.org/repository/r3d100010382 !!!
The Measures of Effective Teaching (MET) project is the largest study of classroom teaching ever conducted in the United States. The University of Michigan compiled the MET data and video files into a rich research collection called the MET Longitudinal Database. Approved researchers can access the restricted MET quantitative and video data using secure online technical systems. The MET Longitudinal Database consists of a Web-based application for searching the collection and viewing the videos with accompanying metadata, and a Virtual Data Enclave that provides secure remote access to the quantitative data and documentation files.
A consolidated feed from 35 million instruments provides sophisticated normalized data, streamlining analysis and decisions from front office to operations. And with flexible delivery options, including cloud and API, timely, accurate data enables the enterprise to capture opportunities, evaluate risk and ensure compliance in fast-moving markets.