
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms and set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
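As an illustration of the syntax above, the following sketch builds a few example query strings and shows how they might be submitted to a search endpoint. The endpoint URL and the "query" parameter name are placeholders for illustration only, not the registry's actual API.

```python
# A minimal sketch illustrating the search syntax described above.
# The endpoint URL and parameter name are assumptions; substitute the
# real search API of the registry before calling search().
import requests

EXAMPLE_QUERIES = [
    'climat*',                   # * wildcard: matches climate, climatology, ...
    '"research data"',           # quotes: exact phrase search
    'ozone + satellite',         # +: AND search (also the default)
    'repository | archive',      # |: OR search
    'data -software',            # -: NOT operation
    '(ozone | aerosol) + NASA',  # parentheses group terms / set precedence
    'archiv~1',                  # ~N after a word: edit distance (fuzziness)
    '"data archive"~2',          # ~N after a phrase: slop amount
]

def search(query, url="https://example.org/api/search"):  # placeholder URL
    """Send one query string to the (assumed) search endpoint."""
    response = requests.get(url, params={"query": query}, timeout=10)
    response.raise_for_status()
    return response.json()

for q in EXAMPLE_QUERIES:
    print(q)  # inspect the queries; call search(q) against a real endpoint
```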
Found 32 result(s)
The Ozone Mapping and Profiler Suite (OMPS) measures the ozone layer in our upper atmosphere, tracking the status of global ozone distributions, including the 'ozone hole.' It also monitors ozone levels in the troposphere, the lowest layer of our atmosphere. OMPS extends the 40-year-long record of ozone layer measurements while also providing improved vertical resolution compared to previous operational instruments. Closer to the ground, OMPS's measurements of harmful ozone improve air quality monitoring and, when combined with cloud predictions, help to create the Ultraviolet Index, a guide to safe levels of sunlight exposure. OMPS has two sensors, both new designs, composed of three advanced hyperspectral imaging spectrometers: a downward-looking nadir mapper, a nadir profiler and a limb profiler. The entire OMPS suite currently flies on board the Suomi NPP spacecraft and is scheduled to fly on the JPSS-2 satellite mission. NASA will provide the OMPS-Limb profiler.
The Research Collection is ETH Zurich's publication platform. It unites the functions of a university bibliography, an open access repository and a research data repository within one platform. Researchers who are affiliated with ETH Zurich, the Swiss Federal Institute of Technology, may deposit research data from all domains. They can publish data as a standalone publication, publish it as supplementary material for an article, dissertation or another text, share it with colleagues or a research group, or deposit it for archiving purposes. Research-data-specific features include flexible access rights settings, DOI registration and a DOI preview workflow, content previews for ZIP and TAR containers, as well as download statistics and altmetrics for published data. All data uploaded to the Research Collection are also transferred to the ETH Data Archive, ETH Zurich's long-term archive.
Research Data Australia is the data discovery service of the Australian National Data Service (ANDS). We do not store the data itself here but provide descriptions of, and links to, the data from our data publishing partners. ANDS is funded by the Australian Government through the National Collaborative Research Infrastructure Strategy (NCRIS).
CaltechDATA is an institutional data repository for Caltech. The Caltech Library runs the repository to preserve the accomplishments of Caltech researchers and share their results with the world. Caltech-associated researchers can upload data, link data with their publications, and assign a permanent DOI so that others can reference the data set. The repository also preserves software and has automatic GitHub integration. All files present in the repository are open access or embargoed, and all metadata is always available to the public.
The UC San Diego Library Digital Collections website gathers two categories of content managed by the Library: library collections (including digitized versions of selected collections covering topics such as art, film, music, history and anthropology) and research data collections (including research data generated by UC San Diego researchers).
The Duke Research Data Repository is a service of the Duke University Libraries that provides curation, access, and preservation of research data produced by the Duke community. Duke's RDR is a discipline-agnostic institutional data repository that is intended to preserve and make public data related to the teaching and research mission of Duke University, including data linked to a publication, research project, and/or class, as well as supplementary software code and documentation used to provide context for the data.
The Precipitation Processing System (PPS) evolved from the Tropical Rainfall Measuring Mission (TRMM) Science Data and Information System (TSDIS). The purpose of the PPS is to process, analyze and archive data from the Global Precipitation Measurement (GPM) mission, partner satellites and the TRMM mission. The PPS also supports TRMM by providing validation products from TRMM ground radar sites. All GPM, TRMM and Partner public data products are available to the science community and the general public from the TRMM/GPM FTP Data Archive. Please note that you need to register to be able to access this data. Registered users can also search for GPM, partner and TRMM data, order custom subsets and set up subscriptions using our PPS Data Products Ordering Interface (STORM).
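For registered users, retrieving a product from an FTP archive of this kind can be scripted. The sketch below uses Python's standard ftplib; the host name, directory layout and file name are placeholders rather than actual PPS paths, and the credentials should be replaced with those issued at registration.

```python
# A minimal sketch of fetching one product file from an FTP data archive
# such as the TRMM/GPM one described above. Host, path and file name are
# assumptions for illustration; substitute the values published by PPS.
from ftplib import FTP

HOST = "ftp.example.org"                 # placeholder for the PPS FTP host
USER = "registered.user@example.org"     # placeholder registered credentials
PASSWORD = "registered.user@example.org"

with FTP(HOST) as ftp:
    ftp.login(user=USER, passwd=PASSWORD)
    ftp.cwd("/gpmdata/2024/01/01/")      # assumed directory layout
    with open("example_product.HDF5", "wb") as fh:
        ftp.retrbinary("RETR example_product.HDF5", fh.write)
```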
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, documents or images. It serves as the backbone of data curation and, for most of its content, it is a "dark archive" without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich's Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data that was produced in the context of research at ETH Zurich can be published and archived in the Research Collection. In the following cases, however, a direct data upload into the ETH Data Archive should be considered:
  • Upload and registration of software code according to ETH transfer's requirements for Software Disclosure.
  • A substantial number of files has to be regularly submitted for long-term archiving and/or publishing and browser-based upload is not an option: the ETH Data Archive may offer automated data and metadata transfers from source applications (e.g. from a LIMS) via API.
  • Files for a project on a local computer have to be collected and metadata has to be added before uploading the data to the ETH Data Archive: the local file editor docuteam packer is provided for this purpose; it allows depositors to structure, describe and organise data for an upload into the ETH Data Archive, and the depositor decides when submission is due.
>>>!!!<<< 2018-01-18: no data nor programs can be found >>>!!!<<< These archives contain public domain programs for calculations in physics, as well as other programs that we expect will be helpful when working with a computer. Physical constants and experimental or theoretical data, such as cross sections, rate constants, swarm parameters, etc., that are necessary for physical calculations are stored here, too. The programs are mainly intended for IBM PC-compatible computers. Programs that do not use graphics units can also be run on other computers; in other cases, the graphics parts of the programs need to be reprogrammed.
Merritt is a curation repository for the preservation of and access to the digital research data of the ten-campus University of California system and external project collaborators. Merritt is supported by the University of California Curation Center (UC3) at the California Digital Library (CDL). While Merritt itself is content agnostic, accepting digital content regardless of domain, format, or structure, it is being used for management of research data, and it forms the basis for a number of domain-specific repositories, such as the ONEShare repository for earth and environmental science and the DataShare repository for life sciences. Merritt provides persistent identifiers, storage replication, fixity audit, complete version history, a REST API, a comprehensive metadata catalog for discovery, ATOM-based syndication, curatorially-defined collections, access control rules, and data use agreements (DUAs). Merritt content upload and download may each be curatorially designated as public or restricted. Merritt DOIs are provided by UC3's EZID service, which is integrated with DataCite. All DOIs and associated metadata are automatically registered with DataCite and are harvested by Ex Libris PRIMO and Thomson Reuters Data Citation Index (DCI) for high-level discovery. Merritt is also a member node in the DataONE network; curatorially-designated data submitted to Merritt are automatically registered with DataONE for additional replication and federated discovery through the ONEMercury search/browse interface.
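Because Merritt exposes ATOM-based syndication, its collection feeds can be consumed with standard tooling. The sketch below fetches and parses a feed with Python's requests and xml.etree; the feed URL is a placeholder, not a real Merritt collection address.

```python
# A minimal sketch of consuming an ATOM syndication feed of the kind
# Merritt exposes for its collections. The feed URL is an assumption;
# the actual per-collection feed URLs are published by Merritt/CDL.
import requests
import xml.etree.ElementTree as ET

FEED_URL = "https://example.org/merritt/collection.atom"  # placeholder
ATOM = "{http://www.w3.org/2005/Atom}"

resp = requests.get(FEED_URL, timeout=30)
resp.raise_for_status()
root = ET.fromstring(resp.content)

# List the entries (objects) announced in the feed.
for entry in root.findall(f"{ATOM}entry"):
    title = entry.findtext(f"{ATOM}title")
    updated = entry.findtext(f"{ATOM}updated")
    print(title, updated)
```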
Kaggle is a platform for predictive modelling and analytics competitions in which statisticians and data miners compete to produce the best models for predicting and describing the datasets uploaded by companies and users. This crowdsourcing approach relies on the fact that there are countless strategies that can be applied to any predictive modelling task and it is impossible to know beforehand which technique or analyst will be most effective.
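Datasets published on the platform can also be retrieved programmatically. The sketch below uses the official kaggle Python package and assumes an API token has been placed in ~/.kaggle/kaggle.json; the dataset slug is given purely as an example.

```python
# A minimal sketch of downloading a public dataset with the official
# "kaggle" package (pip install kaggle). It assumes a valid API token
# in ~/.kaggle/kaggle.json; the dataset slug below is only an example.
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()  # reads the credentials from ~/.kaggle/kaggle.json

# Download and unzip a dataset identified by "owner/dataset-slug".
api.dataset_download_files("zynicide/wine-reviews", path="data/", unzip=True)
```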
MedEffect Canada's Adverse Reaction Online Database contains information on suspected adverse reaction reports related to marketed health products that were submitted to Health Canada by consumers and health professionals, who submit reports voluntarily, as well as by Market Authorization Holders (manufacturers and distributors), who are required to submit reports according to the Food and Drugs Regulations.
RunMyCode is a novel cloud-based platform that enables scientists to openly share the code and data that underlie their research publications. The web service only requires a web browser as all calculations are done on a dedicated cloud computer. Once the results are ready, they are automatically displayed to the user.
DesignSafe-ci.org will provide a comprehensive environment for experimental, theoretical, and computational engineering and science, offering a place not only to steward data from its creation through archiving, but also a workspace in which to understand, analyze, collaborate on and publish that data. At the heart of the cyberinfrastructure, the Data Depot is the central shared data repository that supports the full research lifecycle, from data creation to analysis to curation and publication. The Data Depot will accept any data the user wishes to supply into a local workspace, even if the data type is unknown or only partial metadata is provided. The Discovery Workspace will be a web-based environment that provides researchers with access to data analysis tools, computational simulation tools, visualization tools, educational tools, and user-contributed tools within the cloud to support research workflows, learning, and discovery. The Reconnaissance Integration Portal will be the main access point to data collected during the reconnaissance of windstorm and earthquake events.
Strasbourg astronomical Data Center (CDS) is dedicated to the collection and worldwide distribution of astronomical data and related information. Alongside data curation and service maintenance responsibilities, the CDS undertakes R&D activities that are fundamental to ensure long-term sustainability in a domain in which technology evolves very quickly. R&D areas include informatics, big data, and development of the astronomical Virtual Observatory (VO). CDS is a major actor in the VO with leading roles in European VO projects, the French Virtual Observatory and the International Virtual Observatory Alliance (IVOA). The CDS hosts the SIMBAD astronomical database, the world reference database for the identification of astronomical objects; VizieR, the catalogue service for the CDS reference collection of astronomical catalogues and tables published in academic journals; and the Aladin interactive software sky atlas for access, visualization and analysis of astronomical images, surveys, catalogues, databases and related data.
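SIMBAD and VizieR can also be queried programmatically. The sketch below uses the community astroquery package (a third-party tool, not a CDS product itself) to resolve an object in SIMBAD and to search VizieR catalogue descriptions.

```python
# A minimal sketch of querying the CDS services with the community
# "astroquery" package (pip install astroquery); an assumption about
# tooling, since the services can also be used via their web interfaces.
from astroquery.simbad import Simbad
from astroquery.vizier import Vizier

# Resolve an object in SIMBAD, the reference database for identification
# of astronomical objects.
result = Simbad.query_object("M 31")
print(result)

# Find VizieR catalogues whose descriptions mention a keyword.
catalogs = Vizier.find_catalogs("Gaia DR3")
for name, resource in catalogs.items():
    print(name, resource.description)
```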
BioVeL is a virtual e-laboratory that supports research on biodiversity issues using large amounts of data from cross-disciplinary sources. BioVeL supports the development and use of workflows to process data. It offers the possibility to either use ready-made workflows or create your own. BioVeL workflows are stored in MyExperiment - Biovel Group http://www.myexperiment.org/groups/643/content. They are underpinned by a range of analytical and data processing functions (generally provided as Web Services or R scripts) to support common biodiversity analysis tasks. You can find the Web Services catalogued in the BiodiversityCatalogue.
The interdisciplinary data platform INPTDAT provides easy access to research data and information from all fields of applied plasma physics and plasma medicine. It aims to support the findability, accessibility, interoperability and re-use of data for the low-temperature plasma physics community.
The NCEP/NCAR Reanalysis Project is a joint project between the National Centers for Environmental Prediction (NCEP, formerly "NMC") and the National Center for Atmospheric Research (NCAR). The goal of this joint effort is to produce new atmospheric analyses using historical data (from 1948 onwards), as well as to produce analyses of the current atmospheric state (Climate Data Assimilation System, CDAS).
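Reanalysis products of this kind are distributed as NetCDF files and are convenient to inspect with xarray. The sketch below assumes a locally downloaded monthly-mean file and an "air" variable name; both should be checked against the actual product.

```python
# A minimal sketch of inspecting an NCEP/NCAR Reanalysis NetCDF file with
# xarray. The file name and the "air" variable name are assumptions about
# a typical monthly-mean air temperature download.
import xarray as xr

ds = xr.open_dataset("air.mon.mean.nc")   # assumed local download
print(ds)                                  # variables, coordinates, attributes

air = ds["air"]                            # assumed variable name
# Simple (unweighted) spatial mean as a quick look at the time series.
print(air.mean(dim=["lat", "lon"]))
```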
The Flanders Marine Institute (VLIZ) is a centre for marine and coastal research. As a partner in various projects and networks it promotes and supports the international image of Flemish marine scientific research and international marine education. In its capacity as a coordination and information platform, the Flanders Marine Institute (VLIZ) supports some thousand marine scientists in Flanders by disseminating their knowledge to policymakers, educators, the general public and scientists.
ResearchGate is a network where 15+ million scientists and researchers worldwide connect to share their work. Researchers can upload data of any type and receive DOIs, detailed statistics and real-time feedback. In the Data discovery section of ResearchGate, you can explore the added datasets.
The FAIRDOMHub is built upon the SEEK software suite, which is an open source web platform for sharing scientific research assets, processes and outcomes. FAIRDOM will establish a support and service network for European Systems Biology. It will serve projects in standardizing, managing and disseminating data and models in a FAIR manner: Findable, Accessible, Interoperable and Reusable. FAIRDOM is an initiative to develop a community and establish an internationally sustained Data and Model Management service for the European Systems Biology community. FAIRDOM is a joint action of ERA-Net EraSysAPP and the European Research Infrastructure ISBE.
The CAD-60 and CAD-120 data sets comprise RGB-D video sequences of humans performing activities, recorded using the Microsoft Kinect sensor. Being able to detect human activities is important for making personal assistant robots useful in performing assistive tasks. Our CAD dataset comprises twelve different activities (composed of several sub-activities) performed by four people in different environments, such as a kitchen, a living room, and an office. The data have been tested on robots reactively responding to the detected activities.
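Working with such RGB-D recordings typically starts by loading the colour and depth frames. The sketch below is a generic example using Pillow and NumPy; the file names, bit depths and millimetre depth units are assumptions to be checked against the dataset documentation.

```python
# A minimal sketch of loading one RGB-D frame pair from an activity dataset
# such as CAD-60/CAD-120. File names and formats are assumptions; consult
# the dataset documentation for the actual naming scheme of the recordings.
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("RGB_0001.png"))      # assumed 8-bit colour frame
depth = np.asarray(Image.open("Depth_0001.png"))  # assumed 16-bit depth map

print("RGB frame:", rgb.shape, rgb.dtype)
print("Depth frame:", depth.shape, depth.dtype)

# Convert raw depth (assumed millimetres, as produced by the Kinect) to metres.
depth_m = depth.astype(np.float32) / 1000.0
print("Depth range (m):", depth_m.min(), depth_m.max())
```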