  • * at the end of a keyword allows wildcard searches
  • " quotes can be used to search for phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) group terms and set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
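These operators can be combined in a single query string. The sample queries below are illustrative only (the search terms are invented for this sketch, and the exact matching behaviour depends on the search engine behind the site):

  • wildcard: climat* also finds climatology, climatic, and similar terms
  • phrase: "research data management" finds only the exact phrase
  • AND: repository + geoscience requires both terms (the same as the default)
  • OR: archive | repository matches either term
  • NOT: data - genomics excludes results containing genomics
  • grouping: (survey | census) + longitudinal applies the OR before the AND
  • fuzziness: repositry~1 still matches repository (one edit away)
  • slop: "data archive"~2 matches the phrase with up to two intervening words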
Found 50 result(s)
The Digital Repository of Ireland (DRI) is a national trusted digital repository (TDR) for Ireland’s social and cultural data. We preserve, curate, and provide sustained access to a wealth of Ireland’s humanities and social sciences data through a single online portal. The repository houses unique and important collections from a variety of organisations including higher education institutions, cultural institutions, government agencies, and specialist archives. DRI has staff members from a wide variety of backgrounds, including software engineers, designers, digital archivists and librarians, data curators, policy and requirements specialists, educators, project managers, social scientists and humanities scholars. DRI is certified by the CoreTrustSeal, the current TDR standard widely recommended for best practice in Open Science. In addition to providing trusted digital repository services, the DRI is also Ireland’s research centre for best practices in digital archiving, repository infrastructures, preservation policy, research data management and advocacy at the national and European levels. DRI contributes to policy making nationally (e.g. via the National Open Research Forum and the IRC), and internationally, including European Commission expert groups, the DPC, RDA and the OECD.
The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. The BAOBAB data portal therefore provides access to a large amount and wide variety of data: 250 local observation datasets collected by operational networks (since 1850), long-term monitoring research networks and intensive scientific campaigns; 1350 outputs of a socio-economics questionnaire; 60 operational satellite products and several research products; and 10 output sets of meteorological and ocean operational models plus 15 of research simulations. Data documentation complies with international metadata standards, and data are delivered in standard formats. The data request interface takes full advantage of the database's relational structure and enables users to build multi-criteria requests (period, area, property…).
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation, and for most of its content it is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open Source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
<<<!!!<<< The digital archive of the Historical Data Center Saxony-Anhalt was transferred to the share-it repository https://www.re3data.org/repository/r3d100013014 >>>!!!>>> The Historical Data Centre Saxony-Anhalt was founded in 2008. Its main tasks are the computer-aided provision, processing and evaluation of historical research data, the development of theoretically consolidated normative data and vocabularies, and the further development of methods in the context of digital humanities, research data management and quality assurance. The Historical Data Centre Saxony-Anhalt sees itself as a central institution for the data service of historical data in the federal state of Saxony-Anhalt and is thus part of a nationally and internationally linked infrastructure for long-term data storage and use. The Centre primarily acquires individual-level microdata for the analysis of life courses, employment histories and biographies (primarily quantitative, but also qualitative data), which offer a broad interdisciplinary and international analytical framework and meet clearly defined methodological and technical requirements. The studies are processed, archived and, in compliance with data protection and copyright conditions, made available to the scientifically interested public in accordance with internationally recognized standards. The degree of preparation depends on the type and quality of the study and on demand. Reference studies and studies in high demand are comprehensively documented, often in cooperation with primary researchers or experts, and summarized in data collections. The Historical Data Centre supports researchers in meeting the high demands of research data management. This includes advisory support across the entire life cycle of data, starting with data production and continuing through documentation, analysis, evaluation, publication, long-term archiving and, finally, the subsequent use of data. In cooperation with other infrastructure facilities of the state of Saxony-Anhalt as well as national and international interdisciplinary data repositories, the Data Centre provides tools and infrastructures for the publication and long-term archiving of research data. Together with the University and State Library of Saxony-Anhalt, the Data Centre operates its own data repository as well as special workstations for the digitisation and analysis of data. The Historical Data Centre aims to be a contact point for very different users of historical sources. We collect data relating to historical persons, events and historical territorial units.
Ag-Analytics is an online open source database of various economic and environmental data. It automates the collection, formatting, and processing of several commonly used datasets, such as those from the National Agricultural Statistics Service (NASS), the Agricultural Marketing Service (AMS), the Risk Management Agency (RMA), the PRISM weather database, and the U.S. Commodity Futures Trading Commission (CFTC). All the data have been cleaned and well documented to save users the inconvenience of scraping and cleaning the data themselves.
Discovery is the digital repository of research, and related activities, undertaken at the University of Dundee. The content held in Discovery is varied, ranging from traditional research outputs, such as peer-reviewed articles, conference papers, books, chapters, post-graduate research theses and data, to records for artefacts, exhibitions, multimedia and software. Where possible, Discovery provides full-text access to a version of the research. Discovery is the data catalogue for datasets resulting from research undertaken at the University of Dundee and, in some instances, the publisher of research data.
The Research Data Repository of FID move is a digital long-term repository for open data from the field of transport and mobility research. All datasets are provided with an open licence and are assigned a persistent DataCite DOI (Digital Object Identifier). Both data search and archiving are free. The Specialised Information Service for Mobility and Transport Research (FID move) has been set up by the Saxon State and University Library Dresden (SLUB) and the German TIB – Leibniz Information Centre for Science and Technology as part of the DFG funding programme "Specialised Information Services".
The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is a team of researchers, data specialists and computer system developers who are supporting the development of a data management system to store scientific data generated by Gulf of Mexico researchers. The Master Research Agreement between BP and the Gulf of Mexico Alliance that established the Gulf of Mexico Research Initiative (GoMRI) included provisions that all data collected or generated through the agreement must be made available to the public. The Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC) is the vehicle through which GoMRI is fulfilling this requirement. The mission of GRIIDC is to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico Ecosystem.
Attention! Datasets are no longer updated. Please visit the BonaRes Repository for new datasets. Open Research Data provides quality-assessed data and their metadata, such as context information on measurement objectives, equipment, methods, testing and investigation areas. The purpose of the repository is to secure the quality, integrity and long-term availability of landscape and ecosystem research data, as well as to enhance the accessibility of free data from ZALF long-term monitoring campaigns, landscape laboratories (Agro-ScapeLabs), field trials and experiments. The Leibniz Centre for Agricultural Landscape Research (ZALF) explores ecosystems in agricultural landscapes and the development of ecologically and economically viable land use systems. ZALF combines scientific expertise from agricultural science, geosciences, biosciences and socio-economics.
EGO examines 500 years of modern European history by transcending national, disciplinary and methodological boundaries. Ten thematic threads tie together processes of intercultural exchange whose influence extended beyond national and cultural borders. These range from religion, politics, science and law to art and music, as well as to the economy, technology and the military. EGO employs the newest research to present European transfer processes comprehensively in a way that is easy to understand. The articles link to images, sources, statistics, animated and interactive maps, and audio and visual clips. EGO thereby takes full advantage of the Internet's multi-media potential.
The DesignSafe Data Depot Repository (DDR) is the platform for curation and publication of datasets generated in the course of natural hazards research. The DDR is an open access data repository that enables data producers to safely store, share, organize, and describe research data, towards permanent publication, distribution, and impact evaluation. The DDR allows data consumers to discover, search for, access, and reuse published data in an effort to accelerate research discovery. It is a component of the DesignSafe cyberinfrastructure, which represents a comprehensive research environment that provides cloud-based tools to manage, analyze, curate, and publish critical data for research to understand the impacts of natural hazards. DesignSafe is part of the NSF-supported Natural Hazards Engineering Research Infrastructure (NHERI), and aligns with its mission to provide the natural hazards research community with open access, shared-use scholarship, education, and community resources aimed at supporting civil and social infrastructure prior to, during, and following natural disasters. It serves a broad national and international audience of natural hazard researchers (both engineers and social scientists), students, practitioners, policy makers, as well as the general public. It has been in operation since 2016, and also provides access to legacy data dating from about 2005. These legacy data were generated as part of the NSF-supported Network for Earthquake Engineering Simulation (NEES), a predecessor to NHERI. Legacy data and metadata belonging to NEES were transferred to the DDR for continuous preservation and access.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices. Document and archive studies. Move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members. Web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing. Share and find materials. With a click, make study materials public so that other researchers can find, use and cite them. Find materials by other researchers to avoid reinventing something that already exists. Detail individual contributions. Assign citable contributor credit to any research material - tools, analysis scripts, methods, measures, data. Increase transparency. Make as much of the scientific workflow public as desired - as it is developed or after publication of reports. Find public projects here. Registration. Registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle such as manuscript submission or the onset of data collection. Discover public registrations here. Manage scientific workflow. A structured, flexible system can provide efficiency gains to workflow and clarity to project objectives.
The European Social Survey (the ESS) is a biennial multi-country survey covering over 30 nations. The first round was fielded in 2002/2003, the fifth in 2010/2011. The questionnaire includes two main sections, each consisting of approximately 120 items: a 'core' module which remains relatively constant from round to round, plus two or more 'rotating' modules repeated at intervals. The core module aims to monitor change and continuity in a wide range of social variables, including media use; social and public trust; political interest and participation; socio-political orientations; governance and efficacy; moral, political and social values; social exclusion; national, ethnic and religious allegiances; well-being; health and security; human values; and demographics and socio-economics.
The UCD Digital Library is a platform for exploring cultural heritage, engaging with digital scholarship, and accessing research data. The UCD Digital Library allows you to search, browse and explore a growing collection of historical materials, photographs, art, interviews, letters, and other exciting content that has been digitised and made freely available.
To help flatten the COVID-19 curve, public health systems need better information on whether preventive measures are working and how the virus may spread. Facebook Data for Good offers maps on population movement that researchers and nonprofits are already using to understand the coronavirus crisis, using aggregated data to protect people’s privacy.
DAIS - Digital Archive of the Serbian Academy of Sciences and Arts is a joint digital repository of the Serbian Academy of Sciences and Arts (SASA) and the research institutes under the auspices of SASA. The aim of the repository is to provide open access to publications and other research outputs resulting from the projects implemented by the SASA and its institutes. The repository uses a DSpace-based software platform developed and maintained by the Belgrade University Computer Centre (RCUB).
The purpose of the Canadian Urban Data Repository (CUDR) is to provide a “home” for urban datasets. While primarily focused on datasets created by academe, it will also contain datasets created by NGOs, governments, citizens, and industry. Datasets stored in the repository will be open-access and will not contain personally identifiable information. The purpose of the Canadian Urban Data Catalogue (CUDC) is to enhance awareness of urban datasets that exist across Canada by providing a catalogue of Canadian and Canadian-created urban datasets. It will catalogue datasets available in the CUDR as well as external datasets available on other platforms and as web services. These external datasets may be open or closed. The CUDC uses a rich metadata model that supports the documentation of, and search for, datasets relevant to a user’s needs. Catalogue entry metadata may be exported from and imported into the CUDC.
Apollo (previously DSpace@Cambridge) is the University of Cambridge’s Institutional Repository (IR), preserving and providing access to content created by members of the University. The repository stores a range of content and provides different levels of access, but its primary focus is on providing open access to the University’s research publications.
Data.gov increases the ability of the public to easily find, download, and use datasets that are generated and held by the Federal Government. Data.gov provides descriptions of the Federal datasets (metadata), information about how to access the datasets, and tools that leverage government datasets.