
Search syntax:
  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for phrases
  • + represents an AND search (the default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms to set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
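Combined, the operators above allow queries such as the following (hypothetical examples, not taken from the site):

```
network*                       matches network, networks, networking, …
"research data"                matches the exact phrase
data + repository              both terms must occur (same as: data repository)
graph | network                either term may occur
data - software                data, but not software
(graph | network) + analysis   parentheses set precedence
grafh~1                        fuzzy match within edit distance 1
"network analysis"~2           phrase match allowing a slop of 2
```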
Found 11 result(s)
Stanford Network Analysis Platform (SNAP) is a general-purpose network analysis and graph mining library. It is written in C++ and easily scales to massive networks with hundreds of millions of nodes and billions of edges. It efficiently manipulates large graphs, calculates structural properties, generates regular and random graphs, and supports attributes on nodes and edges. SNAP is also available through NodeXL, a graphical front-end that integrates network analysis into Microsoft Office and Excel. The SNAP library has been actively developed since 2004 and is growing organically as a result of our research pursuits in the analysis of large social and information networks. The largest network we have analysed so far using the library was the Microsoft Instant Messenger network from 2006, with 240 million nodes and 1.3 billion edges. The datasets available on the website were mostly collected (scraped) for the purposes of our research. The website was launched in July 2009.
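"Structural properties" here means quantities such as a graph's degree distribution. As a minimal sketch of that kind of computation, the following plain-Python snippet builds an adjacency list from a toy edge list and counts how many nodes have each degree (the toy data is illustrative; SNAP itself is a C++ library built for graphs many orders of magnitude larger):

```python
from collections import Counter

# Toy undirected graph as an edge list (hypothetical example data).
edges = [(1, 2), (1, 3), (2, 3), (3, 4)]

# Build an adjacency list: node -> set of neighbours.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Degree distribution: how many nodes have each degree.
degree_dist = Counter(len(neighbours) for neighbours in adj.values())
print(sorted(degree_dist.items()))  # [(1, 1), (2, 2), (3, 1)]
```

Here one node (4) has degree 1, two nodes (1 and 2) have degree 2, and one node (3) has degree 3.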
<<<!!!<<< The repository is no longer available. For further information and data, see the Oxford University Research Archive: https://www.re3data.org/repository/r3d100011230 >>>!!!>>>
Figshare has been chosen as the University of Adelaide's official data and digital object repository with unlimited local storage. All current staff and HDR students can access and publish research data and digital objects on the University of Adelaide's Figshare site. Because Figshare is cloud-based, you can access it anywhere and at any time.
DBpedia is a crowd-sourced community effort to extract structured information from Wikipedia and make this information available on the Web. DBpedia allows you to ask sophisticated queries against Wikipedia and to link other data sets on the Web to Wikipedia data. We hope that this work will make it easier for the huge amount of information in Wikipedia to be used in new and interesting ways. Furthermore, it might inspire new mechanisms for navigating, linking, and improving the encyclopedia itself.
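The "sophisticated queries" are typically SPARQL queries against DBpedia's public endpoint. As a minimal sketch, the following builds such a request with the Python standard library; the query itself (people born in Berlin) is an illustrative example, and actually sending the request requires network access, so only the URL construction is shown:

```python
from urllib.parse import urlencode

# Illustrative SPARQL query; dbo:, dbr: and rdfs: are prefixes the
# DBpedia endpoint predefines.
query = """
SELECT ?person ?name WHERE {
  ?person a dbo:Person ;
          dbo:birthPlace dbr:Berlin ;
          rdfs:label ?name .
  FILTER (lang(?name) = "en")
} LIMIT 10
"""

# Build the GET URL for DBpedia's public SPARQL endpoint.
url = "https://dbpedia.org/sparql?" + urlencode(
    {"query": query, "format": "application/sparql-results+json"}
)
print(url[:40])  # the request URL, ready for an HTTP GET
```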
The Leibniz Data Manager (LDM) is a scientific repository for research data from the fields of science and technology. The service supports better re-usability of research data for scientific projects. The LDM fosters the management of and access to heterogeneous research data publications and assists researchers in selecting the data sets relevant to their respective disciplines. The LDM currently offers the following functions for the visualization of research data:
  • Support for data collections and publications in different formats
  • Different views of the same data set (2D and 3D support)
  • Visualization of AutoCAD files
  • Jupyter Notebooks for demonstrating live code
  • RDF descriptions of data collections
OSTI is the DOE office that collects, preserves, and disseminates DOE-sponsored R&D results that are the outcomes of R&D projects or other funded activities at DOE labs and facilities nationwide and grantees at universities and other institutions. The information is typically in the form of technical documents, conference papers, articles, multimedia, and software, collectively referred to as scientific and technical information (STI).
>>>!!!<<< stated 13.02.2020: the repository is offline >>>!!!<<< Data.DURAARK provides a unique collection of real-world datasets from the architectural profession. The repository is unique in that it provides several different data types, such as 3D scans, 3D models, and classifying metadata and geodata, for real-world physical buildings. Many of the datasets stem from architectural stakeholders and thereby give the community insights into the range of working methods that the practice employs on large and complex building data.
Enlighten: research data is the institutional repository for research data of the University of Glasgow. As part of the CERIF 4 Datasets (C4D) project, the University is exploring an extension of the CERIF standard. We have trialled methods of recording information about datasets to make them more visible, retrievable and usable.
CLAPOP is the portal of the Dutch CLARIN community. It brings together all relevant resources that were created within the CLARIN NL project and that are now part of the CLARIN NL infrastructure, or that were created by other projects but are essential for the functioning of the CLARIN (NL) infrastructure. CLARIN-NL has cooperated closely with CLARIN Flanders in a number of projects; the common results of this cooperation, including those created by CLARIN Flanders, are included here as well.
This is the KONECT project, a project in the area of network science with the goal of collecting network datasets, analysing them, and making all analyses available online. KONECT stands for Koblenz Network Collection, as the project has its roots at the University of Koblenz–Landau in Germany. All source code is made available as Free Software and includes a network analysis toolbox for GNU Octave, a network extraction library, as well as the code used to generate these web pages, including all statistics and plots. KONECT contains over a hundred network datasets of various types, including directed, undirected, bipartite, weighted, unweighted, signed and rating networks. The networks of KONECT are collected from many diverse areas such as social networks, hyperlink networks, authorship networks, physical networks, interaction networks and communication networks. The KONECT project has developed network analysis tools which are used to compute network statistics, to draw plots and to implement various link prediction algorithms. The results of these analyses are presented on these pages. Whenever we are allowed to do so, we provide a download of the networks.
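The simplest of the link-prediction algorithms such a toolbox implements is common-neighbours scoring: rank each unconnected node pair by how many neighbours the two nodes share. A minimal sketch on toy data (not a KONECT dataset):

```python
from itertools import combinations

# Toy undirected graph as an edge set (hypothetical example data).
edges = {(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)}

# Build an adjacency list: node -> set of neighbours.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def common_neighbours(u, v):
    """Link-prediction score: number of shared neighbours."""
    return len(adj[u] & adj[v])

# Score every node pair that is not already an edge.
candidates = {
    (u, v): common_neighbours(u, v)
    for u, v in combinations(sorted(adj), 2)
    if (u, v) not in edges and (v, u) not in edges
}
print(candidates)  # {(1, 4): 2} -- nodes 1 and 4 share neighbours 2 and 3
```

A high score suggests the pair is a likely future or missing edge; here the only unconnected pair, (1, 4), shares two neighbours.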