Filter
  • Subjects
  • Content Types
  • Countries
  • AID systems
  • API
  • Certificates
  • Data access
  • Data access restrictions
  • Database access
  • Database licenses
  • Data licenses
  • Data upload
  • Data upload restrictions
  • Enhanced publication
  • Institution responsibility type
  • Institution type
  • Keywords
  • Metadata standards
  • PID systems
  • Provider types
  • Quality management
  • Repository languages
  • Software
  • Syndications
  • Repository types
  • Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
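The operators above compose into a single query string. As an illustration, here is a small hypothetical Python helper (not part of the search service itself) that builds queries using the +, |, -, quoting, and wildcard conventions listed:

```python
# Hypothetical helper illustrating the query operators described above.
# The operator semantics (+ AND, | OR, - NOT, "…" phrases, * wildcards)
# follow the list; the function itself is an assumption for illustration.

def build_query(all_of=(), any_of=(), none_of=()):
    """Compose a query string from required, alternative, and excluded terms."""
    def quote(term):
        # Multi-word terms are phrases and must be wrapped in double quotes.
        return f'"{term}"' if " " in term else term

    parts = [f"+{quote(t)}" for t in all_of]          # + is an AND search
    if any_of:
        # ( and ) group the alternatives; | is an OR search
        parts.append("(" + " | ".join(quote(t) for t in any_of) + ")")
    parts += [f"-{quote(t)}" for t in none_of]        # - is a NOT operation
    return " ".join(parts)

query = build_query(all_of=["earthquake"],
                    any_of=["seismolog*", "ground motion"],
                    none_of=["simulation"])
print(query)
# → +earthquake (seismolog* | "ground motion") -simulation
```

The trailing * in "seismolog*" would match seismology, seismological, and so on under the wildcard rule above.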
Found 10 result(s)
SCEC's mission includes gathering data on earthquakes, both in Southern California and in other locales; integrating the information into a comprehensive understanding of earthquake phenomena; and communicating useful knowledge for reducing earthquake risk to society at large. The SCEC community consists of more than 600 scientists from 16 core institutions and 47 additional participating institutions. SCEC is funded by the National Science Foundation and the U.S. Geological Survey.
The Precipitation Processing System (PPS) evolved from the Tropical Rainfall Measuring Mission (TRMM) Science Data and Information System (TSDIS). The purpose of the PPS is to process, analyze, and archive data from the Global Precipitation Measurement (GPM) mission, partner satellites, and the TRMM mission. The PPS also supports TRMM by providing validation products from TRMM ground radar sites. All GPM, TRMM, and partner public data products are available to the science community and the general public from the TRMM/GPM FTP Data Archive. Please note that you need to register to be able to access this data. Registered users can also search for GPM, partner, and TRMM data, order custom subsets, and set up subscriptions using our PPS Data Products Ordering Interface (STORM).
The Information Marketplace for Policy and Analysis of Cyber-risk & Trust (IMPACT) program supports global cyber risk research & development by coordinating, enhancing and developing real world data, analytics and information sharing capabilities, tools, models, and methodologies. In order to accelerate solutions around cyber risk issues and infrastructure security, IMPACT makes these data sharing components broadly available as national and international resources to support the three-way partnership among cyber security researchers, technology developers and policymakers in academia, industry and the government.
Merritt is a curation repository for the preservation of and access to the digital research data of the ten-campus University of California system and external project collaborators. Merritt is supported by the University of California Curation Center (UC3) at the California Digital Library (CDL). While Merritt itself is content agnostic, accepting digital content regardless of domain, format, or structure, it is being used for management of research data, and it forms the basis for a number of domain-specific repositories, such as the ONEShare repository for earth and environmental science and the DataShare repository for life sciences. Merritt provides persistent identifiers, storage replication, fixity audit, complete version history, a REST API, a comprehensive metadata catalog for discovery, ATOM-based syndication, and curatorially defined collections, access control rules, and data use agreements (DUAs). Merritt content upload and download may each be curatorially designated as public or restricted. Merritt DOIs are provided by UC3's EZID service, which is integrated with DataCite. All DOIs and associated metadata are automatically registered with DataCite and are harvested by Ex Libris PRIMO and Thomson Reuters Data Citation Index (DCI) for high-level discovery. Merritt is also a member node in the DataONE network; curatorially designated data submitted to Merritt are automatically registered with DataONE for additional replication and federated discovery through the ONEMercury search/browse interface.
Kaggle is a platform for predictive modelling and analytics competitions in which statisticians and data miners compete to produce the best models for predicting and describing the datasets uploaded by companies and users. This crowdsourcing approach relies on the fact that there are countless strategies that can be applied to any predictive modelling task and it is impossible to know beforehand which technique or analyst will be most effective.
RunMyCode is a novel cloud-based platform that enables scientists to openly share the code and data that underlie their research publications. The web service only requires a web browser as all calculations are done on a dedicated cloud computer. Once the results are ready, they are automatically displayed to the user.
GitHub is the best place to share code with friends, co-workers, classmates, and complete strangers. Over three million people use GitHub to build amazing things together. With the collaborative features of GitHub.com, our desktop and mobile apps, and GitHub Enterprise, it has never been easier for individuals and teams to write better code, faster. Originally founded by Tom Preston-Werner, Chris Wanstrath, and PJ Hyett to simplify sharing code, GitHub has grown into the largest code host in the world.
California Digital Library (CDL) seeks to be a catalyst for deeply collaborative solutions providing a rich, intuitive and seamless environment for publishing, sharing and preserving our scholars’ increasingly diverse outputs, as well as for acquiring and accessing information critical to the University of California’s scholarly enterprise. University of California Curation Center (UC3) is the digital curation program within CDL. The mission of UC3 is to provide transformative preservation, curation, and research data management systems, services, and initiatives that sustain and promote open scholarship.