  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set the precedence of operations
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
Worked examples of these operators are shown below.
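The search terms in the following examples ("climate", "ocean", "marine biology") are arbitrary illustrations of the syntax, not terms taken from the registry:

  geo*                        wildcard: matches geology, geophysics, geoscience, ...
  "marine biology"            matches the exact phrase
  climate + ocean             both terms must match (AND, the default)
  climate | ocean             either term may match (OR)
  climate - model             excludes records containing "model" (NOT)
  (climate | ocean) + data    parentheses set the precedence
  oceanograhpy~2              fuzzy match: terms within edit distance 2 of the (misspelled) keyword
  "open access data"~3        phrase match allowing a slop of up to 3 words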
Found 106 result(s)
<<<!!!<<< The ZACAT server is end-of-life and has been taken offline. The software driving the portal has been unmaintained for several years and could no longer be reasonably sustained. We have expanded https://search.gesis.org to include information at the studies' variable level where available, covering a superset of the studies in ZACAT. Please use the variable search on https://search.gesis.org to identify and download datasets. >>>!!!>>>
GovData, the data portal for Germany, offers consistent and central access to administrative data at the federal, state, and local levels. Its objective is to make data more available and easier to use from a single location. In keeping with the concept of "open data", the portal aims to facilitate the use of open licenses and to increase the supply of machine-readable raw data.
The project analyzes educational processes in Germany from early childhood to late adulthood. The National Educational Panel Study (NEPS) has been set up to find out more about the acquisition of education in Germany, to plot the consequences of education for individual biographies, and to describe central educational processes and trajectories across the entire life span. An interdisciplinary consortium of research institutes, research groups, and research personalities has been assembled in Bamberg for this purpose. In addition, the competencies and experience with longitudinal research available at numerous other locations have been networked to form a cluster of excellence.
PubChem comprises three databases. 1. PubChem BioAssay: The PubChem BioAssay Database contains bioactivity screens of chemical substances described in PubChem Substance. It provides searchable descriptions of each bioassay, including descriptions of the conditions and readouts specific to that screening procedure. 2. PubChem Compound: The PubChem Compound Database contains validated chemical depiction information provided to describe substances in PubChem Substance. Structures stored within PubChem Compound are pre-clustered and cross-referenced by identity and similarity groups. 3. PubChem Substance: The PubChem Substance Database contains descriptions of samples, from a variety of sources, and links to biological screening results that are available in PubChem BioAssay. If the chemical contents of a sample are known, the description includes links to PubChem Compound.
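As a sketch of programmatic access, records like those described above can also be retrieved through PUG REST, PubChem's public web interface (not mentioned in the entry itself); the compound name "aspirin" and the requested properties below are arbitrary examples:

  import requests

  # Minimal sketch: look up a compound by name via PubChem's PUG REST interface
  # and print a few computed properties. "aspirin" and the property list are
  # arbitrary examples.
  url = (
      "https://pubchem.ncbi.nlm.nih.gov/rest/pug/compound/name/aspirin/"
      "property/MolecularFormula,MolecularWeight,CanonicalSMILES/JSON"
  )
  data = requests.get(url, timeout=30).json()
  for record in data["PropertyTable"]["Properties"]:
      print(record["CID"], record["MolecularFormula"], record["MolecularWeight"])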
GEOFON seeks to facilitate cooperation in seismological research and earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and by keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real-time access to data from this network and those of many partner networks and plate boundary observatories, and providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation, encouraging the full exploitation of all acquired data, and serving as the permanent archive for the Geophysical Instrument Pool at Potsdam (GIPP). It also organises the exchange of real-time and archived data with partner institutions and international centres.
The Digital Archaeological Record (tDAR) is an international digital repository for the digital records of archaeological investigations. tDAR's use, development, and maintenance are governed by Digital Antiquity, an organization dedicated to ensuring the long-term preservation of irreplaceable archaeological data and to broadening access to these data.
<<<!!!<<< This repository is no longer available. >>>!!!>>> The programme "International Oceanographic Data and Information Exchange" (IODE) of the "Intergovernmental Oceanographic Commission" (IOC) of UNESCO was established in 1961. Its purpose is to enhance marine research, exploitation and development, by facilitating the exchange of oceanographic data and information between participating Member States, and by meeting the needs of users for data and information products.
<<<!!!<<< This repository is no longer available. >>>!!!>>> TeachingWithData.org is a portal where faculty can find resources and ideas to reduce the challenges of bringing real data into post-secondary classes. It allows faculty to introduce and build students' quantitative reasoning abilities with readily available, user-friendly, data-driven teaching materials.
The Research Collection is ETH Zurich's publication platform. It unites the functions of a university bibliography, an open access repository and a research data repository within one platform. Researchers who are affiliated with ETH Zurich, the Swiss Federal Institute of Technology, may deposit research data from all domains. They can publish data as a standalone publication, publish it as supplementary material for an article, dissertation or another text, share it with colleagues or a research group, or deposit it for archiving purposes. Research-data-specific features include flexible access rights settings, DOI registration and a DOI preview workflow, content previews for zip- and tar-containers, as well as download statistics and altmetrics for published data. All data uploaded to the Research Collection are also transferred to the ETH Data Archive, ETH Zurich’s long-term archive.
The Gene database provides detailed information for known and predicted genes defined by nucleotide sequence or map position. Gene supplies gene-specific connections in the nexus of map, sequence, expression, structure, function, citation, and homology data. Unique identifiers are assigned to genes with defining sequences, genes with known map positions, and genes inferred from phenotypic information. These gene identifiers are used throughout NCBI's databases and tracked through updates of annotation. Gene includes genomes represented by NCBI Reference Sequences (RefSeqs) and is integrated into NCBI's Entrez and E-Utilities systems for indexing, query, and retrieval.
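As an illustration of retrieval through the E-Utilities mentioned above, a minimal sketch might look like the following; the gene symbol "BRCA1" and the organism filter are arbitrary examples:

  import requests

  # Minimal sketch: search NCBI Gene through the public E-utilities and print
  # short summaries. The query term is an arbitrary example.
  BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

  # esearch: find Gene UIDs matching a query
  search = requests.get(f"{BASE}/esearch.fcgi", params={
      "db": "gene",
      "term": "BRCA1[gene] AND human[orgn]",
      "retmode": "json",
  }, timeout=30).json()
  uids = search["esearchresult"]["idlist"]

  # esummary: retrieve document summaries for those UIDs
  summary = requests.get(f"{BASE}/esummary.fcgi", params={
      "db": "gene",
      "id": ",".join(uids),
      "retmode": "json",
  }, timeout=30).json()

  for uid in uids:
      doc = summary["result"][uid]
      print(uid, doc["name"], doc["description"])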
CLARIN is a European Research Infrastructure for the Humanities and Social Sciences, focusing on language resources (data and tools). It is being implemented and constantly improved at leading institutions in a large and growing number of European countries, with the aim of improving Europe's multilingual competence. CLARIN provides several services, such as access to language data and tools to analyze those data, the possibility to deposit research data, and direct access to knowledge about relevant topics in relation to (research on and with) language resources. The main tool is the 'Virtual Language Observatory', which provides metadata and access to the different national CLARIN centers and their data.
Galaxies, made up of billions of stars like our Sun, are the beacons that light up the structure of even the most distant regions in space. Not all galaxies are alike, however. They come in very different shapes and have very different properties; they may be large or small, old or young, red or blue, regular or confused, luminous or faint, dusty or gas-poor, rotating or static, round or disky, and they live either in splendid isolation or in clusters. In other words, the universe contains a very colourful and diverse zoo of galaxies. For almost a century, astronomers have been discussing how galaxies should be classified and how they relate to each other in an attempt to attack the big question of how galaxies form. Galaxy Zoo (Lintott et al. 2008, 2011) pioneered a novel method for performing large-scale visual classifications of survey datasets. This webpage allows anyone to download the resulting GZ classifications of galaxies in the project.
The Research Data Centre of the Robert Koch Institute (FDZ RKI) publishes the data of population-representative health surveys in the form of public use files (PUFs). The main purpose of health surveys is to generate a maximum amount of information on the state of health and health-related behaviour of Germany's resident population while ensuring an optimum use of funds. The methodology - i.e. the sample design, the principles on operationalization and measurement, and data-collection techniques - is largely modelled on the tried-and-tested methods of empirical social research. Health interview surveys (HIS) use established survey techniques such as filling out questionnaires, computer-assisted telephone interviews (CATI), computer-assisted personal interviews (CAPI), and online polling via the internet or email. The main difference compared to purely sociological surveys lies in the additional biomedical examinations, tests and medical-biochemical measurements, which generate significant added value in addition to the results of the surveys; this part is referred to internationally as the health examination survey (HES).
ModelDB is a curated database of published models in the broad domain of computational neuroscience. It addresses the need for access to such models in order to evaluate their validity and extend their use. It can handle computational models expressed in any textual form, including procedural or declarative languages (e.g. C++, XML dialects) and source code written for any simulation environment. The model source code doesn't even have to reside inside ModelDB; it just has to be available from some publicly accessible online repository or WWW site.
The Health and Retirement Study (HRS) is a longitudinal panel study that surveys a representative sample of more than 26,000 Americans over the age of 50 every two years. The study has collected information about income, work, assets, pension plans, health insurance, disability, physical health and functioning, cognitive functioning, genetic information and health care expenditures.
The Cancer Genome Atlas (TCGA) Data Portal provides a platform for researchers to search, download, and analyze data sets generated by TCGA. It contains clinical information, genomic characterization data, and high level sequence analysis of the tumor genomes. The Data Coordinating Center (DCC) is the central provider of TCGA data. The DCC standardizes data formats and validates submitted data.
AVISO stands for "Archiving, Validation and Interpretation of Satellite Oceanographic data". Here, you will find data, articles, news and tools to help you discover or improve your skills in the altimetry domain through four key themes: ocean, coast, hydrology and ice. Altimetry is a technique for measuring height. Satellite altimetry measures the time taken by a radar pulse to travel from the satellite antenna to the surface and back to the satellite receiver. Combined with precise satellite location data, altimetry measurements yield sea-surface heights.
GLOBE (Global Collaboration Engine) is an online collaborative environment that enables land change researchers to share, compare and integrate local and regional studies with global data to assess the global relevance of their work.
<<<!!!<<< This site is going away on April 1, 2021. General access to the site has been disabled and community users will see an error upon login. >>>!!!>>> Socrata's cloud-based solution allows government organizations to put their data online, make data-driven decisions, operate more efficiently, and share insights with citizens.
DBpedia is a crowd-sourced community effort to extract structured information from Wikipedia and make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link the different data sets on the Web to Wikipedia data. We hope that this work will make it easier for the huge amount of information in Wikipedia to be used in new and interesting ways. Furthermore, it might inspire new mechanisms for navigating, linking, and improving the encyclopedia itself.
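Such queries are expressed in SPARQL against DBpedia's public endpoint; a minimal sketch (the resource dbr:Berlin is an arbitrary example) might look like this:

  import requests

  # Minimal sketch: ask DBpedia's public SPARQL endpoint for the English
  # abstract of a Wikipedia article. dbr:Berlin is an arbitrary example.
  query = """
  PREFIX dbo: <http://dbpedia.org/ontology/>
  PREFIX dbr: <http://dbpedia.org/resource/>
  SELECT ?abstract WHERE {
    dbr:Berlin dbo:abstract ?abstract .
    FILTER (lang(?abstract) = "en")
  }
  """

  resp = requests.get(
      "https://dbpedia.org/sparql",
      params={"query": query},
      headers={"Accept": "application/sparql-results+json"},
      timeout=30,
  )
  for row in resp.json()["results"]["bindings"]:
      print(row["abstract"]["value"][:200])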
GEOMAR Helmholtz Centre for Ocean Research Kiel is one of the leading marine science institutions in Europe. GEOMAR investigates the chemical, physical, biological, and geological processes in the oceans, as well as their interactions with the seafloor and the atmosphere. OceanRep is an open access digital collection containing the research output of GEOMAR staff and students. Included are journal articles, conference papers, book chapters, theses, and more, with full text where available. Research data are linked to the publication entries.