
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) indicate grouping priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount (see the example queries after this list)
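To make the operator list above concrete, here is a small illustrative sketch of how such queries could be combined and URL-encoded; the query strings and the "query" parameter name are assumptions chosen for illustration, not a documented client for this site.

from urllib.parse import quote_plus

# Example query strings combining the operators listed above (illustrative only).
queries = [
    'climat*',                    # wildcard: matches climate, climatology, ...
    '"flow cytometry"',           # exact phrase
    'precipitation + satellite',  # AND (also the default between terms)
    'yeast | metabolome',         # OR
    'archive - software',         # NOT
    '(ocean | marine) + archive', # parentheses set grouping priority
    'metabolome~2',               # fuzzy term match with edit distance 2
    '"data center"~3',            # phrase match allowing a slop of 3
]

for q in queries:
    # URL-encode each query as it might appear in a search URL (hypothetical parameter name).
    print("?query=" + quote_plus(q))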
Found 235 result(s)
The Yeast Metabolome Database (YMDB) is a manually curated database of small molecule metabolites found in or produced by Saccharomyces cerevisiae (also known as Baker’s yeast and Brewer’s yeast). This database covers metabolites described in textbooks, scientific journals, metabolic reconstructions and other electronic databases.
The Immunology Database and Analysis Portal (ImmPort) archives clinical study and trial data generated by NIAID/DAIT-funded investigators. Data types housed in ImmPort include subject assessments (e.g., medical history, concomitant medications, and adverse events) as well as mechanistic assay data such as flow cytometry, ELISA, and ELISPOT. An ImmPort account is not needed to search for studies or to browse study demographics, interventions, and mechanistic assays. However, downloading a study and examining each experiment in detail, including individual ELISA results and flow cytometry files, or analyzing flow cytometry files with FLOCK in the ImmPort flow cytometry module, requires registering for ImmPort access.
SourceForge is dedicated to making open source projects successful. We thrive on community collaboration to help us create the leading resource for open source software development and distribution. IT professionals come to SourceForge to develop, download, review, and publish open source software. SourceForge is the largest, most trusted destination for open source software discovery and development on the web.
The Unidata community of over 260 universities is building a system for disseminating near real-time earth observations via the Internet. Unlike other systems, which are based on data centers where the information can be accessed, the Unidata IDD is designed so that a university can request that certain data sets be delivered to computers at its site as soon as they are available from the observing system. The IDD system also allows any site with access to specialized observations to inject the dataset into the IDD for delivery to other interested sites.
This repository is no longer available. TRMM is a research satellite designed to improve our understanding of the distribution and variability of precipitation within the tropics as part of the water cycle in the current climate system. By covering the tropical and sub-tropical regions of the Earth, TRMM provides much needed information on rainfall and its associated heat release that helps to power the global atmospheric circulation that shapes both weather and climate. In coordination with other satellites in NASA's Earth Observing System, TRMM provides important precipitation information using several space-borne instruments to increase our understanding of the interactions between water vapor, clouds, and precipitation that are central to regulating Earth's climate. The TRMM mission ended in 2015, and final TRMM multi-satellite precipitation analyses (TMPA, product 3B42/3B43) data processing will end December 31, 2019. As a result, this TRMM webpage is in the process of being retired and some TRMM imagery may not be displaying correctly. Some of the content will be moved to the Precipitation Measurement Missions website https://gpm.nasa.gov/ and our team is exploring ways to provide some of the real-time products using GPM data. Please contact us if you have any additional questions.
The International Network of Nuclear Reaction Data Centres (NRDC) constitutes a worldwide cooperation of nuclear data centres under the auspices of the International Atomic Energy Agency. The Network was established to coordinate the world-wide collection, compilation and dissemination of nuclear reaction data.
SMHI's observation stations collect large quantities of data, including temperature, precipitation, wind, air pressure, lightning, solar radiation and ozone. Satellites and radar installations are also important sources. Data is presented continuously on smhi.se and used in SMHI's various weather services. In SMHI's data explorer ( http://opendata-catalog.smhi.se/explore/ ) you can find the data that is available with open access (in Swedish). Information in English is available on oceanographic observations, model data (HIROMB BS01), machine-to-machine feeds, and conditions of use.
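As an illustration of the machine-to-machine feeds mentioned above, the sketch below fetches the latest hourly observations from SMHI's open data API for meteorological observations; the entry point, parameter id (assumed here to be air temperature) and station id are assumptions taken for the example and should be verified against the catalogue at http://opendata-catalog.smhi.se/explore/ .

import json
import urllib.request

# Assumed entry point of SMHI's open data API for meteorological observations;
# verify the path, parameter id and station id against the open data catalogue.
BASE = "https://opendata-download-metobs.smhi.se/api/version/latest"
PARAMETER = 1    # assumed: air temperature, hourly value
STATION = 98230  # assumed: an example station id

url = f"{BASE}/parameter/{PARAMETER}/station/{STATION}/period/latest-hour/data.json"
with urllib.request.urlopen(url) as response:
    payload = json.load(response)

# Each sample is expected to carry a timestamp (ms since epoch) and a value string.
for sample in payload.get("value") or []:
    print(sample["date"], sample["value"])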
CLARIN.SI is the Slovenian node of the European CLARIN (Common Language Resources and Technology Infrastructure) Centers. The CLARIN.SI repository is hosted at the Jožef Stefan Institute and offers long-term preservation of deposited linguistic resources, along with their descriptive metadata. The integration of the repository with the CLARIN infrastructure gives the deposited resources wide exposure, so that they can be known, used and further developed beyond the lifetime of the projects in which they were produced. Among the resources currently available in the CLARIN.SI repository are the multilingual MULTEXT-East resources, the CC version of Slovenian reference corpus Gigafida, the morphological lexicon Sloleks, the IMP corpora and lexicons of historical Slovenian, as well as many other resources for a variety of languages. Furthermore, several REST-based web services are provided for different corpus-linguistic and NLP tasks.
SimTK is a free project-hosting platform for the biomedical computation community that enables researchers to easily share their software, data, and models and provides the infrastructure so they can support and grow a community around their projects. It has over 126,656 members, hosts 1,648 projects from researchers around the world, and has had more than 2,095,783 files downloaded from it. Individuals have created SimTK projects to meet publisher and funding agencies’ software and data sharing requirements, run scientific challenges, create a collection of their community’s resources, and much more.
Sound and Vision has one of the largest audiovisual archives in Europe. The institute manages over 70 percent of the Dutch audiovisual heritage. The collection contains more than a million hours of television, radio, music and film from the beginning in 1898 until today. All programs of the Dutch public broadcasters come in digitally every day. Individuals and institutions entrust their collection to Sound and Vision as well. The institute ensures that the material is optimally preserved for (re)use. Broadcasters, producers and editors use the archive for the creation of new programs. The collection is also used to develop products and services for a wide audience, such as exhibitions, iPhone applications, DVD boxes and various websites. The collection of Sound and Vision contains the complete radio and television archives of the Dutch public broadcasters; films of virtually every leading Dutch documentary maker; newsreels; the national music depot; various audiovisual corporate collections; advertising, radio and video material of cultural and social organizations, of scientific institutes and of all kinds of educational institutions. There are also collections of images and articles from the history of Dutch broadcasting itself, like the elaborate collection of historical television sets.
COW seeks to facilitate the collection, dissemination, and use of accurate and reliable quantitative data in international relations. Key principles of the project include a commitment to standard scientific principles of replication, data reliability, documentation, review, and the transparency of data collection procedures. More specifically, we are committed to releasing data sets freely to the research community, to releasing data in a timely manner after data collection is completed, to providing version numbers for data set and replication tracking, to providing appropriate dataset documentation, and to attempting to update, document, and distribute follow-on versions of datasets where possible. We intend to use our website as the center of our data distribution efforts, to serve as a central site for the collection of possible error information and questions, to provide a forum for interaction with users of Correlates of War data, and as a way for the international relations community to contribute to the continuing development of the project.
The Climate Change Centre Austria - Data Centre provides the central national archive for climate data and information. The data made accessible includes observation and measurement data, scenario data, quantitative and qualitative data, as well as the measurement data and findings of research projects.
The Service Centre of the Federal Government for Geo-Information and Geodesy (Dienstleistungszentrum des Bundes für Geoinformation und Geodäsie - DLZ) provides geodetic and geo-topographic reference data of the Federal Government centrally to federal institutions, public administrations, economy, science and citizens. The establishment of the Service Centre is based on the Federal Geographic Reference Data Act (Bundesgeoreferenzdatengesetz − BGeoRG), which came into effect on 1 November 2012. This act regulates use, quality and technology of the geodetic and geo-topographic reference systems, networks and data.
The CosmoSim database provides results from cosmological simulations performed within different projects: the MultiDark and Bolshoi projects, and the CLUES project. The CosmoSim webpage provides access to several cosmological simulations, with a separate database for each simulation. A simulations overview is available at https://www.cosmosim.org/cms/simulations/simulations-overview/ . CosmoSim is a contribution to the German Astrophysical Virtual Observatory.
GroupLens is a research lab in the Department of Computer Science and Engineering at the University of Minnesota, Twin Cities specializing in recommender systems, online communities, mobile and ubiquitous technologies, digital libraries, and local geographic information systems.
The purpose of the Dataset Catalogue is to enhance discovery of GNS Science datasets. At a minimum, users will be able to determine whether a dataset on a specific topic exists and then whether it pertains to a specific place and/or a specific date or period. Some datasets include a web link to an online resource. In addition, contact details are provided for the custodian of each dataset as well as conditions of use.
Ag Data Commons provides access to a wide variety of open data relevant to agricultural research. We are a centralized repository for data already on the web, as well as for new data being published for the first time. While compliance with the U.S. Federal public access and open data directives is important, we aim to surpass them. Our goal is to foster innovative data re-use, integration, and visualization to support bigger, better science and policy.
The PhenoGen website shares experimental data with a worldwide community of investigators and provides a flexible, integrated, multi-resolution repository of neuroscience transcriptomic genetic data for collaborative research on genomic disorders. The main development focus is on providing Hybrid Rat Diversity Panel transcriptomic data (sequencing, genome coverage, reconstructed totalRNA/smallRNA transcriptomes, quantification of the transcriptome, eQTLs, and WGCNA) and on integrating additional tools to provide a platform for visualization and analysis of HRDP transcriptome data.
The U.S. Antarctic Program Data Center (USAP-DC) provides the central Project Catalog for projects funded by the NSF for the U.S. Antarctic Program, and a Data Repository for multi-disciplinary investigator research datasets derived from these projects. The services provided support investigators in documenting, preserving, and disseminating their research results. All data are openly accessible to the international community for browsing, searching, and download. Datasets are registered in the Antarctic Master Directory to comply with the Antarctic Treaty.
The US Department of Energy’s Atmospheric Radiation Measurement (ARM) Data Center is a long-term archive and distribution facility for various ground-based, aerial and model data products in support of atmospheric and climate research. The ARM facility currently operates over 400 instruments at various observatories (https://www.arm.gov/capabilities/observatories/). The ARM Data Center (ADC) Archive currently holds over 11,000 data products with a total holding of over 3 petabytes of data dating back to 1993; these include data from instruments, value-added products, model outputs, field campaigns, and PI-contributed data. The data center archive also includes data collected by ARM from related programs (e.g., external data such as NASA satellite data).
Remote Sensing Systems is a world leader in processing and analyzing microwave data from satellite microwave sensors. We specialize in algorithm development, instrument calibration, ocean product development, and product validation. We have worked with more than 30 satellite microwave radiometer, sounder, and scatterometer instruments over the past 40 years. Currently, we operationally produce satellite retrievals for SSMIS, AMSR2, WindSat, and ASCAT. The geophysical retrievals obtained from these sensors are made available in near-real-time (NRT) to the global scientific community and general public via FTP and this web site.
The sources of the data sets include data sets donated by researchers, surveys carried out by SRDA, and surveys carried out by government departments and other academic organizations. Prior to the release of data sets, the confidentiality and sensitivity of every survey data set are evaluated. Standard data management and cleaning procedures are applied to ensure data accuracy and completeness. In addition, metadata and relevant supplementary files are also edited and attached.
GitHub is the best place to share code with friends, co-workers, classmates, and complete strangers. Over three million people use GitHub to build amazing things together. With the collaborative features of GitHub.com, our desktop and mobile apps, and GitHub Enterprise, it has never been easier for individuals and teams to write better code, faster. Originally founded by Tom Preston-Werner, Chris Wanstrath, and PJ Hyett to simplify sharing code, GitHub has grown into the largest code host in the world.
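As a small illustration of programmatic access to hosted projects, the sketch below queries GitHub's public REST API for basic repository metadata; the repository used is GitHub's own demo repository, chosen only as an example, and unauthenticated requests are rate-limited.

import json
import urllib.request

# Fetch basic metadata for a repository via GitHub's public REST API.
# "octocat/Hello-World" is GitHub's demo repository, used here as an example.
url = "https://api.github.com/repos/octocat/Hello-World"
request = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})

with urllib.request.urlopen(request) as response:
    repo = json.load(response)

print(repo["full_name"])         # e.g. "octocat/Hello-World"
print(repo["description"])       # short project description
print(repo["stargazers_count"])  # number of stars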
The demand for high-value environmental data and information has dramatically increased in recent years. To improve our ability to meet that demand, NOAA's former three data centers (the National Climatic Data Center, the National Geophysical Data Center, and the National Oceanographic Data Center, which includes the National Coastal Data Development Center) have merged into the National Centers for Environmental Information (NCEI). NCEI, which hosts the World Data Service for Oceanography, is a national environmental data center operated by the National Oceanic and Atmospheric Administration (NOAA) of the U.S. Department of Commerce. NCEI is responsible for hosting and providing access to one of the most significant archives on Earth, with comprehensive oceanic, atmospheric, and geophysical data. From the depths of the ocean to the surface of the sun and from million-year-old sediment records to near real-time satellite images, NCEI is the Nation's leading authority for environmental information. The primary mission of the World Data Service for Oceanography is to ensure that global oceanographic datasets collected at great cost are maintained in a permanent archive that is easily and openly accessible to the world science community and to other users.