
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for exact phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) set priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
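The operators above follow the familiar Lucene-style query-string syntax. As a minimal sketch, the snippet below composes example search strings for each operator; the search terms themselves are hypothetical and only illustrate the syntax, not actual queries against this registry.

```python
def quote_phrase(words):
    """Wrap words in double quotes so they are searched as an exact phrase."""
    return '"' + " ".join(words) + '"'

# Wildcard: matches "astronomy", "astrophysics", ...
wildcard = "astro*"

# AND (the default, written with '+'), OR ('|'), and NOT ('-')
and_query = "solar + telescope"
or_query = "solar | stellar"
not_query = "solar - wind"

# Parentheses set priority when combining operators.
grouped = "(solar | stellar) + archive"

# ~N after a word is the allowed edit distance; after a phrase it is the slop.
fuzzy = "galxy~1"                            # tolerates one edit, e.g. "galaxy"
sloppy = quote_phrase(["solar", "data"]) + "~2"

print(sloppy)  # "solar data"~2
```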
Found 39 result(s)
BASS2000 archives ground-based solar survey data and long-term data from France's observatories. The database contains data from spectroheliographs, radioheliographs, and coronagraphs, as well as synoptic maps. BASS2000 provides data as GIF, PNG, JPEG, MPEG, PS, and compressed files.
German astronomical observatories own a considerable collection of photographic plates. While these observations led to significant discoveries in the past, they are also of interest for scientists today and in the future. In particular, these measurements are of immense scientific value for the study of the long-term variability of many types of stars. There are about 85,000 plates in the archives of Hamburger Sternwarte, Dr. Karl Remeis-Sternwarte Bamberg, and Leibniz-Institut für Astrophysik Potsdam (AIP). The plates are digitized with high-resolution flatbed scanners. In addition, the corresponding plate envelopes and observation logbooks are digitized, and further metadata are entered into the database. The work is carried out within the project "Digitalisierung astronomischer Fotoplatten und ihre Integration in das internationale Virtual Observatory" (digitization of astronomical photographic plates and their integration into the international Virtual Observatory), which is funded by the DFG.
IRSA is chartered to curate the calibrated science products from NASA's infrared and sub-millimeter missions, including five major large-area/all-sky surveys. IRSA exploits a reusable architecture to deploy cost-effective archives for customers, including: the Spitzer Space Telescope; the 2MASS and IRAS all-sky surveys; and multi-mission datasets such as COSMOS, WISE, and Planck.
OpenKIM is an online suite of open source tools for molecular simulation of materials. These tools help to make molecular simulation more accessible and more reliable. Within OpenKIM, you will find an online resource for standardized testing and long-term warehousing of interatomic models and data, and an application programming interface (API) standard for coupling atomistic simulation codes and interatomic potential subroutines.
The Solar Dynamics Observatory (SDO) studies the solar atmosphere on small scales of space and time, in multiple wavelengths. This is a searchable database of all SDO data, including citizen scientist images, space weather and near real time data, and helioseismology data.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices.
  • Document and archive studies. Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials to computer malfunction, changing personnel, or simply forgetting where you put them.
  • Share and find materials. With a click, make study materials public so that other researchers can find, use, and cite them. Find materials by other researchers to avoid reinventing something that already exists.
  • Detail individual contributions. Assign citable contributor credit for any research material: tools, analysis scripts, methods, measures, data.
  • Increase transparency. Make as much of the scientific workflow public as desired, as it is developed or after publication of reports.
  • Registration. Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection.
  • Manage scientific workflow. A structured, flexible system can provide efficiency gains in workflow and clarity about project objectives.
The Astronomy data repository at Harvard is currently open to all scientific data from astronomical institutions worldwide, incorporating the Astroinformatics of Galaxies and Quasars Dataverse. The Astronomy Dataverse is connected to the indexing services provided by the SAO/NASA Astrophysics Data System (ADS).
The CosmoSim database provides results from cosmological simulations performed within different projects: the MultiDark and Bolshoi projects and the CLUES project. The CosmoSim webpage provides access to several cosmological simulations, with a separate database for each simulation. Simulations overview: https://www.cosmosim.org/cms/simulations/simulations-overview/ . CosmoSim is a contribution to the German Astrophysical Virtual Observatory.
Constellation is a digital object identifier (DOI) based science network for supercomputing data. Constellation makes it possible for OLCF researchers to obtain DOIs for large data collections by tying them together with the associated resources and processes that went into the production of the data (e.g., jobs, collaborators, projects), using a scalable database. It also allows the scientific record to be annotated with rich metadata, and enables the cataloging and publishing of the artifacts for open access, aiding in scalable data discovery. OLCF users can use the DOI service to publish datasets even before the publication of the paper, and retain key data even after project expiration. From a center standpoint, DOIs enable the stewardship of data and better management of the scratch and archival storage.
The Paris Astronomical Data Centre aims to provide VO access to its data collections, to participate in the development of international standards, and to implement VO-compliant simulation codes and data visualization and analysis software. The centre hosts permanent, high-level activities for the distribution of tools and data in the form of reference services. These sustainable services are recognized at the national level as CNRS-labeled services. The various activities are organized as portals whose functions are to provide visibility and information on the projects and to encourage collaboration.
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10^10 particles to follow the dark matter distribution in a cubic region 500 h^-1 Mpc on a side, and has a spatial resolution of 5 h^-1 kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10^7 galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory, we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and haloes, as well as to the spatial and temporal relations between them. Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with a volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
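To give a flavor of the SQL access described above, the sketch below runs a toy selection against an in-memory SQLite database rather than the actual Millennium server. The schema (a `galaxies` table with `galaxyId`, `snapnum`, and `stellarMass` columns) is a hypothetical simplification for illustration only, not the real database layout.

```python
import sqlite3

# A toy stand-in for a galaxy catalogue; schema and values are invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE galaxies (galaxyId INTEGER, snapnum INTEGER, stellarMass REAL)"
)
conn.executemany(
    "INSERT INTO galaxies VALUES (?, ?, ?)",
    [(1, 63, 5.2), (2, 63, 0.3), (3, 41, 7.9)],
)

# The kind of selection a user might run: massive galaxies at one snapshot,
# ordered by stellar mass.
query = """
    SELECT galaxyId, stellarMass
    FROM galaxies
    WHERE snapnum = 63 AND stellarMass > 1.0
    ORDER BY stellarMass DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1, 5.2)]
```

Queries against the real databases follow the same SELECT/WHERE pattern, with the actual table and column names documented by the Millennium database service.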
The NASA Exoplanet Archive collects and serves public data to support the search for and characterization of extra-solar planets (exoplanets) and their host stars. The data include published light curves, images, spectra and parameters, and time-series data from surveys that aim to discover transiting exoplanets. Tools are provided to work with the data, particularly the display and analysis of transit data sets from Kepler and CoRoT. All data are validated by the Exoplanet Archive science staff and traced to their sources. The Exoplanet Archive is the U.S. data portal for the CoRoT mission.
Launched in December 2013, Gaia is destined to create the most accurate map yet of the Milky Way. By making accurate measurements of the positions and motions of stars in the Milky Way, it will answer questions about the origin and evolution of our home galaxy. The first data release (2016) contains three-dimensional positions and two-dimensional motions of a subset of two million stars. The second data release (2018) increases that number to over 1.6 billion. Gaia's measurements are as precise as planned, paving the way to a better understanding of our galaxy and its neighborhood. The AIP hosts the Gaia data as one of the external data centers along with the main Gaia archive maintained by ESAC, and provides access to the Gaia data releases as part of the Gaia Data Processing and Analysis Consortium (DPAC).
Note: This repository is no longer available.
The Deep Carbon Observatory (DCO) is a global community of multi-disciplinary scientists unlocking the inner secrets of Earth through investigations into life, energy, and the fundamentally unique chemistry of carbon. The Deep Carbon Observatory Digital Object Registry ("DCO-VIVO") is a centrally managed digital object identification, object registration, and metadata management service for the DCO. Digital object registration includes DCO-ID generation based on the global Handle System infrastructure and metadata collection using VIVO. Users will be able to deposit their data into the DCO Data Repository and have that data discoverable and accessible by others.