The repository is no longer available. As of 2021-01-25, there is no more access to the California Water CyberInfrastructure.
Jason is a remote-controlled deep-diving vessel that gives shipboard scientists immediate, real-time access to the sea floor. Instead of making short, expensive dives in a submarine, scientists can stay on deck and guide Jason as deep as 6,500 meters (4 miles) to explore for days on end. Jason is a type of remotely operated vehicle (ROV), a free-swimming vessel connected by a long fiberoptic tether to its research ship. The 10-km (6 mile) tether delivers power and instructions to Jason and fetches data from it.
Stanford Network Analysis Platform (SNAP) is a general-purpose network analysis and graph mining library. It is written in C++ and easily scales to massive networks with hundreds of millions of nodes and billions of edges. It efficiently manipulates large graphs, calculates structural properties, generates regular and random graphs, and supports attributes on nodes and edges. SNAP is also available through NodeXL, a graphical front-end that integrates network analysis into Microsoft Office and Excel. The SNAP library has been actively developed since 2004 and is growing organically as a result of our research on the analysis of large social and information networks. The largest network we have analyzed so far using the library is the Microsoft Instant Messenger network from 2006, with 240 million nodes and 1.3 billion edges. The datasets available on the website were mostly collected (scraped) for the purposes of our research. The website was launched in July 2009.
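As a rough illustration of the kind of analysis the library supports, the sketch below uses the snap.py Python bindings (the snap-stanford pip package); the bindings and the specific function calls are assumptions drawn from the SNAP documentation, not from the description above, and may differ across versions.

```python
import snap  # pip install snap-stanford (Python bindings for SNAP)

# Generate a random Erdos-Renyi graph with 1,000 nodes and 5,000 edges
G = snap.GenRndGnm(snap.PUNGraph, 1000, 5000)

# Basic structural properties of the graph
print("nodes:", G.GetNodes(), "edges:", G.GetEdges())
print("avg clustering coefficient:", snap.GetClustCf(G, -1))
print("approx. diameter (BFS from 100 start nodes):", snap.GetBfsFullDiam(G, 100))
```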
The IMEx consortium is an international collaboration between a group of major public interaction data providers who have agreed to: share curation effort; develop and work to a single set of curation rules when capturing data, whether directly deposited or drawn from publications in peer-reviewed journals; capture full details of an interaction in a “deep” curation model; perform a complete curation of all protein-protein interactions experimentally demonstrated within a publication; make these interactions available in a single search interface on a common website; provide the data in standards-compliant download formats; and make all IMEx records freely accessible under the Creative Commons Attribution License.
NC OneMap is a public service providing comprehensive discovery and access to North Carolina's geospatial data resources. NC OneMap, the State's Clearinghouse for geospatial information, relies on data sharing and partnerships.
Government of Yukon open data provides an easy way to find, access and reuse the government's public datasets. This service brings all of the government's data together in one searchable website. Our datasets are created and managed by different government departments. We cannot guarantee the quality or timeliness of all data. If you have any feedback, you can get in touch with the department that produced the dataset. This is a pilot project. We are in the process of adding a quality framework to make it easier for you to access high-quality, reliable data.
The WHOI Ship DataGrabber system provides the oceanographic community on-line access to underway ship data collected on the R/V Atlantis, Knorr, Oceanus, and Tioga (TBD). All the shipboard data is co-registered with the ship's GPS time and navigation systems.
The Nuclear Data Portal is a new generation of nuclear data services built on modern and powerful DELL servers, Sybase relational database software, and the Linux operating system, with programming in Java. The Portal includes nuclear structure, decay and reaction data, as well as literature information. Data can be searched for using optimized query forms; results are presented in tables and interactive plots. Additionally, a number of nuclear science tools, codes, applications, and links are provided. The databases included are: CINDA - Computer Index of Nuclear Reaction Data, CSISRS alias EXFOR - Experimental nuclear reaction data, ENDF - Evaluated Nuclear Data File, ENSDF - Evaluated Nuclear Structure Data File, MIRD - Medical Internal Radiation Dose, NSR - Nuclear Science References, NuDat - Nuclear Structure & Decay Data, XUNDL - Experimental Unevaluated Nuclear Data List, and the Chart of Nuclides. The Nuclear Data Portal is a web service of the National Nuclear Data Center.
The Rat Genome Database is a collaborative effort between leading research institutions involved in rat genetic and genomic research. Its goal, as stated in RFA HL-99-013, is the establishment of a Rat Genome Database to collect, consolidate, and integrate data generated from ongoing rat genetic and genomic research efforts and make these data widely available to the scientific community. A secondary, but critical, goal is to provide curation of mapped positions for quantitative trait loci, known mutations, and other phenotypic data.
Kaggle is a platform for predictive modelling and analytics competitions in which statisticians and data miners compete to produce the best models for predicting and describing the datasets uploaded by companies and users. This crowdsourcing approach relies on the fact that there are countless strategies that can be applied to any predictive modelling task and it is impossible to know beforehand which technique or analyst will be most effective.
dictyBase is an integrated genetic and literature database that contains published Dictyostelium discoideum literature, genes, expressed sequence tags (ESTs), as well as the chromosomal and mitochondrial genome sequences. Direct access to the genome browser, a BLAST search tool, the Dictyostelium Stock Center, research tools, colleague databases, and much more is just a mouse click away. dictyBase is a genome portal for the Amoebozoa. dictyBase is funded by a grant from the National Institute of General Medical Sciences.
The NCBI Short Genetic Variations database, commonly known as dbSNP, catalogs short variations in nucleotide sequences from a wide range of organisms. These variations include single nucleotide variations, short nucleotide insertions and deletions, short tandem repeats and microsatellites. Short genetic variations may be common, thus representing true polymorphisms, or they may be rare. Some rare human entries have additional information associated with them, including disease associations, genotype information and allele origin, as some variations are somatic rather than germline events. Note: NCBI will phase out support for non-human organism data in dbSNP and dbVar beginning on September 1, 2017.
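dbSNP records can also be retrieved programmatically through NCBI's generic E-utilities interface; the endpoint, parameters, and the example rs identifier in the sketch below are assumptions based on NCBI's public documentation, not part of the description above.

```python
import requests

# Hypothetical illustration: fetch the summary record for rs7412 from dbSNP
# via the NCBI E-utilities esummary endpoint (id is the rs number without
# the "rs" prefix).
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"
resp = requests.get(BASE, params={"db": "snp", "id": "7412", "retmode": "json"})
resp.raise_for_status()

record = resp.json()["result"]["7412"]
print(sorted(record.keys()))  # inspect which summary fields are returned
```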
ASAP (a systematic annotation package for community analysis of genomes) is a relational database and web interface developed to store, update and distribute genome sequence data and gene expression data collected by or in collaboration with researchers at the University of Wisconsin - Madison. ASAP was designed to facilitate ongoing community annotation of genomes and to grow with genome projects as they move from the preliminary data stage through post-sequencing functional analysis. The ASAP database includes multiple genome sequences at various stages of analysis, and gene expression data from preliminary experiments.
The JPL Tropical Cyclone Information System (TCIS) was developed to support hurricane research. There are three components to TCIS: a global archive of multi-satellite hurricane observations from 1999-2010 (the Tropical Cyclone Data Archive), the North Atlantic Hurricane Watch, and the NASA Convective Processes Experiment (CPEX) aircraft campaign. Together, data and visualizations from the real-time system and the data archive can be used to study hurricane processes, validate and improve models, and assist in developing new algorithms and data assimilation techniques.
The CONP portal is a web interface for the Canadian Open Neuroscience Platform (CONP) to facilitate open science in the neuroscience community. CONP simplifies global researcher access to and sharing of datasets and tools. The portal internalizes the cycle of a typical research project: starting with data acquisition, followed by processing using already existing/published tools, and ultimately publication of the obtained results, including a link to the original dataset. For more information on CONP, please visit https://conp.ca
Welcome to Smithsonian Open Access, where you can download, share, and reuse millions of the Smithsonian’s images—right now, without asking. With new platforms and tools, you have easier access to nearly 3 million 2D and 3D digital items from our collections—with many more to come. This includes images and data from across the Smithsonian’s 19 museums, nine research centers, libraries, archives, and the National Zoo.
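The collection can also be reached through the Smithsonian Open Access API; the endpoint, the api.data.gov key requirement, and the query parameters in the sketch below are assumptions taken from the API's public documentation rather than from the description above.

```python
import requests

# Hypothetical sketch: search the Smithsonian Open Access API.
# Requires a free api.data.gov key (placeholder below).
API_URL = "https://api.si.edu/openaccess/api/v1.0/search"
params = {"q": "hummingbird", "rows": 5, "api_key": "YOUR_API_DATA_GOV_KEY"}

resp = requests.get(API_URL, params=params)
resp.raise_for_status()
print(list(resp.json().keys()))  # inspect the top-level structure of the reply
```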
The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and applied nuclear technologies. The NNDC is a worldwide resource for nuclear data. The information available to the users of NNDC services is the product of the combined efforts of the NNDC and cooperating data centers and other interested groups, both in the United States and worldwide. The NNDC specializes in the following areas: nuclear structure and low-energy nuclear reactions; nuclear databases and information technology; and nuclear data compilation and evaluation.
The Solar Dynamics Observatory (SDO) studies the solar atmosphere on small scales of space and time, in multiple wavelengths. This is a searchable database of all SDO data, including citizen scientist images, space weather and near real time data, and helioseismology data.
The Social Computing Data Repository hosts data from a collection of many different social media sites, most of which have blogging capacity. Some of the prominent social media sites included in this repository are BlogCatalog, Twitter, MyBlogLog, Digg, StumbleUpon, del.icio.us, MySpace, LiveJournal, The Unofficial Apple Weblog (TUAW), Reddit, etc. The repository contains various facets of blog data, including blog site metadata such as user-defined tags, predefined categories, and blog site descriptions; blog post-level metadata such as user-defined tags and the date and time of posting; blog posts; blog post mood (defined as the blogger's emotions when he or she wrote the post); blogger names; blog post comments; and the blogger's social network.
The Energy Data eXchange (EDX) is an online collection of capabilities and resources that advance research and customize energy-related needs. EDX is developed and maintained by NETL-RIC researchers and technical computing teams to support private collaboration for ongoing research efforts and tech transfer of finalized DOE NETL research products. EDX supports NETL-affiliated research by: coordinating historical and current data and information from a wide variety of sources to facilitate access to research that crosscuts multiple NETL projects/programs; providing external access to technical products and data published by NETL-affiliated research teams; and collaborating with a variety of organizations and institutions in a secure environment through EDX's Collaborative Workspaces.
The Alvin Frame-Grabber system provides the NDSF community on-line access to Alvin's video imagery co-registered with vehicle navigation and attitude data for shipboard analysis, planning deep submergence research cruises, and synoptic review of data post-cruise. The system is built upon the methodology and technology developed for the JasonII Virtual Control Van and a prototype system that was deployed on 13 Alvin dives in the East Pacific Rise and the Galapagos (AT7-12, AT7-13). The deployed prototype system was extremely valuable in facilitating real-time dive planning, review, and shipboard analysis.
The NASA Exoplanet Archive collects and serves public data to support the search for and characterization of extra-solar planets (exoplanets) and their host stars. The data include published light curves, images, spectra and parameters, and time-series data from surveys that aim to discover transiting exoplanets. Tools are provided to work with the data, particularly the display and analysis of transit data sets from Kepler and CoRoT. All data are validated by the Exoplanet Archive science staff and traced to their sources. The Exoplanet Archive is the U.S. data portal for the CoRoT mission.
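For programmatic access, the archive also offers a Table Access Protocol (TAP) service; the endpoint, the ps (Planetary Systems) table, and the column names in the sketch below come from the archive's public API documentation and are assumptions, not part of the description above.

```python
import requests

# Hypothetical sketch: query the NASA Exoplanet Archive TAP service with ADQL
# for a few confirmed planets from the Planetary Systems ("ps") table.
TAP_URL = "https://exoplanetarchive.ipac.caltech.edu/TAP/sync"
query = ("select top 10 pl_name, hostname, disc_year "
         "from ps where default_flag = 1")

resp = requests.get(TAP_URL, params={"query": query, "format": "csv"})
resp.raise_for_status()
print(resp.text)  # CSV table: planet name, host star, discovery year
```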