The Scholarly Database (SDB) at Indiana University aims to serve researchers and practitioners interested in the analysis, modeling, and visualization of large-scale scholarly datasets. The online interface provides access to six datasets: MEDLINE papers, registered Clinical Trials, U.S. Patent and Trademark Office patents (USPTO), National Science Foundation (NSF) funding, National Institutes of Health (NIH) funding, and National Endowment for the Humanities funding – over 26 million records in total.
Chempound is a new-generation repository architecture based on RDF, semantic dictionaries and linked data. It has been developed to hold any type of chemical object expressible in CML and is exemplified by crystallographic experiments and computational chemistry calculations. In both examples, the repository can hold >50k entries, which can be searched through SPARQL endpoints and pre-indexed key fields. The Chempound architecture is general and adaptable to other fields of data-rich science. The Chempound software is hosted at http://bitbucket.org/chempound and is available under the Apache License, Version 2.0.
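As an illustration of the SPARQL access mentioned above, here is a minimal Python sketch that queries a Chempound-style endpoint. The endpoint URL and the title predicate are placeholders, not documented Chempound values; a real installation publishes its own endpoint address and vocabulary.

```python
# Minimal sketch: query a (hypothetical) Chempound SPARQL endpoint for entry titles.
import requests

ENDPOINT = "http://localhost:8080/chempound/sparql"  # placeholder, not a documented URL

QUERY = """
SELECT ?entry ?title WHERE {
  ?entry <http://purl.org/dc/terms/title> ?title .
} LIMIT 10
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["entry"]["value"], "-", row["title"]["value"])
```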
The RRUFF Project is creating a complete set of high-quality spectral data from well-characterized minerals and is developing the technology to share this information with the world. The collected data provides a standard for mineralogists, geoscientists, gemologists and the general public for the identification of minerals both on earth and for planetary exploration. Electron microprobe analysis is used to determine the chemistry of each mineral.
--- This repository is no longer available; this record is outdated. --- The ONS challenge contains open solubility data: experiments with raw data from different scientists and institutions. It is part of The Open Notebook Science wiki community, ideally suited for community-wide collaborative research projects involving mathematical modeling and computer simulation work, as it allows researchers to document model development in a step-by-step fashion, then link model predictions to experiments that test the model, and in turn use feedback from experiments to evolve the model. By making our laboratory notebooks public, the evolutionary process of a model can be followed in its totality by the interested reader. Researchers from laboratories around the world can now follow the progress of our research day-to-day, borrow models at various stages of development, comment or advise on model developments, discuss experiments, ask questions, provide feedback, or otherwise contribute to the progress of science in any manner possible.
The Ningaloo Atlas was created in response to the need for more comprehensive and accessible environmental and socio-economic data on the greater Ningaloo region. As such, the Ningaloo Atlas is a web portal not only to access and share information, but also to celebrate and promote the biodiversity, heritage, value, and way of life of the greater Ningaloo region.
The Northern California Earthquake Data Center (NCEDC) is a permanent archive and distribution center primarily for multiple types of digital data relating to earthquakes in central and northern California. The NCEDC is located at the Berkeley Seismological Laboratory, and has been accessible to users via the Internet since mid-1992. The NCEDC was formed as a joint project of the Berkeley Seismological Laboratory (BSL) and the U.S. Geological Survey (USGS) at Menlo Park in 1991, and current USGS funding is provided under a cooperative agreement for seismic network operations.
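The NCEDC's waveform archive can be reached programmatically through its FDSN web services; the sketch below uses ObsPy, which ships with "NCEDC" as a named data centre. The station, channel and time window are illustrative choices only.

```python
# Sketch: fetch ten minutes of waveform data from the NCEDC FDSN services with ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("NCEDC")                  # NCEDC is a named FDSN data centre in ObsPy
t0 = UTCDateTime("2014-08-24T10:20:44")   # around the 2014 South Napa earthquake
st = client.get_waveforms(network="BK", station="CMB", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 600)
print(st)
```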
The OpenMadrigal project seeks to develop and support an on-line database for geospace data. The project has been led by MIT Haystack Observatory since 1980, but now has active support from Jicamarca Observatory and other community members. Madrigal is a robust, World Wide Web based system capable of managing and serving archival and real-time data, in a variety of formats, from a wide range of ground-based instruments. Madrigal is installed at a number of sites around the world. Data at each Madrigal site is locally controlled and can be updated at any time, but shared metadata between Madrigal sites allows searching of all Madrigal sites at once from any Madrigal site. Data is local; metadata is shared.
In the digital collections, you can view the digitized prints from the holdings of the ULB Düsseldorf free of charge. In its special collections, the ULB brings together rare, valuable and unique holdings that are kept as an ensemble. Deposits, unpublished works, donations, acquisitions of rare books, etc. were and are an important source for the constant growth of the library. These treasures and specialties - beyond their academic value - also contribute substantially to the profile of the ULB.
The Open Government Data Portal of Tamil Nadu is a platform (designed by the National Informatics Centre) for the Open Data initiative of the Government of Tamil Nadu. The portal is intended to publish datasets collected by the Tamil Nadu Government for public use from different perspectives. It has been created under the Software as a Service (SaaS) model of Open Government Data (OGD) and publishes datasets in open formats like CSV, XLS, ODS/OTS, XML, RDF, KML, GML, etc. The portal has the following modules: (a) a Data Management System (DMS) through which various state government agencies contribute data catalogs, which are made available on the front-end website after a due approval process through a defined workflow; (b) a Content Management System (CMS) for managing and updating various functionalities and content types; (c) a Visitor Relationship Management (VRM) module for collating and disseminating viewer feedback on various data catalogs; and (d) a Communities module for community users to interact and share their views and common interests with others. It includes different types of datasets, both geospatial and non-spatial, classified as shareable and non-shareable data. Geospatial data consists primarily of satellite data, maps, etc.; non-spatial data is derived from national accounts statistics, price indices, censuses and surveys produced by a statistical mechanism. The portal follows the principles of data sharing and accessibility through openness, flexibility, transparency, quality, security and machine readability.
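Because the portal publishes datasets in open formats such as CSV, they can be loaded directly with standard tooling. The sketch below uses pandas with a placeholder resource URL; actual file URLs come from the portal's catalog pages.

```python
# Sketch: load a CSV dataset published on the Tamil Nadu OGD portal into pandas.
import pandas as pd

CSV_URL = "https://tn.data.gov.in/sites/default/files/example_dataset.csv"  # placeholder URL

df = pd.read_csv(CSV_URL)   # pandas can read directly from a URL
print(df.shape)
print(df.head())
```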
Yoda publishes research data on behalf of researchers who are affiliated with Utrecht University, its research institutes, and consortia for which it acts as a coordinating body. Data packages are not limited to a particular field of research or license. Yoda publishes data packages via DataCite. To find data publications, use https://public.yoda.uu.nl/ or the DataCite search engine: https://search.datacite.org/repositories/delft.uu
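Because Yoda registers its data packages with DataCite, they can also be discovered through the public DataCite REST API. In the sketch below, the repository identifier "delft.uu" is taken from the search URL above and is assumed to select Yoda's records; check the DataCite documentation for the current filter names.

```python
# Sketch: list a few Yoda data publications via the DataCite REST API.
import requests

resp = requests.get(
    "https://api.datacite.org/dois",
    params={"client-id": "delft.uu", "page[size]": 5},  # "delft.uu" assumed from the search URL
)
resp.raise_for_status()
for record in resp.json()["data"]:
    attrs = record["attributes"]
    titles = attrs.get("titles") or [{}]
    print(attrs["doi"], "-", titles[0].get("title", ""))
```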
GEOFON seeks to facilitate cooperation in seismological research and earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and by keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real-time access to data from this network and those of many partner networks and plate boundary observatories, and providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation, encouraging the full exploitation of all acquired data, and serving as the permanent archive for the Geophysical Instrument Pool at Potsdam (GIPP). It also organises the exchange of real-time and archived data with partner institutions and international centres.
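Source parameters of large earthquakes and waveform data from GEOFON can be requested over its FDSN web services. The sketch below uses ObsPy and assumes GEOFON is reachable under ObsPy's "GFZ" key and exposes an FDSN event service; the time window and magnitude threshold are arbitrary.

```python
# Sketch: request recent large-earthquake source parameters from GEOFON via ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("GFZ")  # GEOFON data centre key in ObsPy (assumed here)
catalog = client.get_events(starttime=UTCDateTime("2023-02-01"),
                            endtime=UTCDateTime("2023-02-10"),
                            minmagnitude=7.0)
for event in catalog:
    origin = event.preferred_origin() or event.origins[0]
    magnitude = event.preferred_magnitude() or event.magnitudes[0]
    print(origin.time, magnitude.mag)
```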
Sharing and preserving data are central to protecting the integrity of science. DataHub, a Research Computing endeavor, provides tools and services to meet scientific data challenges at Pacific Northwest National Laboratory (PNNL). DataHub helps researchers address the full data life cycle for their institutional projects and provides a path to creating findable, accessible, interoperable, and reusable (FAIR) data products. Although open science data is a crucial focus of DataHub’s core services, we are interested in working with evidence-based data throughout the PNNL research community.
The NORPERM permafrost database provides information on ground temperatures from boreholes and from the near-surface using miniloggers (MTDs). The database was established during the International Polar Year as one of the main goals of the project TSP Norway - "A Contribution to the Thermal State of Permafrost in Norway and Svalbard".
Jason is a remote-controlled deep-diving vessel that gives shipboard scientists immediate, real-time access to the sea floor. Instead of making short, expensive dives in a submarine, scientists can stay on deck and guide Jason as deep as 6,500 meters (4 miles) to explore for days on end. Jason is a type of remotely operated vehicle (ROV), a free-swimming vessel connected by a long fiberoptic tether to its research ship. The 10-km (6 mile) tether delivers power and instructions to Jason and fetches data from it.
--- This repository is no longer available. --- The programme "International Oceanographic Data and Information Exchange" (IODE) of the "Intergovernmental Oceanographic Commission" (IOC) of UNESCO was established in 1961. Its purpose is to enhance marine research, exploitation and development, by facilitating the exchange of oceanographic data and information between participating Member States, and by meeting the needs of users for data and information products.
The European Monitoring and Evaluation Programme (EMEP) is a scientifically based and policy driven programme under the Convention on Long-range Transboundary Air Pollution (CLRTAP) for international co-operation to solve transboundary air pollution problems.
The Cary Institute data repository allows researchers to store, share and publish their research data, supplementary information and associated metadata. Each published item is assigned a Digital Object Identifier (DOI), which allows the data to be citable and sustainable. This repository is a member node of DataONE.
The ICOS Carbon Portal is the data portal of the Integrated Carbon Observation System. It provides observational data on the state of the carbon cycle in Europe and the world. The Carbon Portal is the data center of the ICOS infrastructure. ICOS collects greenhouse gas concentration and flux observations from three separate networks; these observations are carried out to support research into how the Earth's greenhouse gas balance works, where many large uncertainties remain.
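Carbon Portal metadata can also be queried programmatically. The sketch below assumes a SPARQL endpoint at https://meta.icos-cp.eu/sparql and deliberately runs a trivial query, since meaningful queries depend on the portal's metadata vocabulary; consult the ICOS Carbon Portal documentation before relying on either assumption.

```python
# Sketch: round-trip a trivial SPARQL query against the (assumed) ICOS metadata endpoint.
import requests

ENDPOINT = "https://meta.icos-cp.eu/sparql"  # assumed endpoint address

QUERY = "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5"

resp = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["s"]["value"], row["p"]["value"], row["o"]["value"])
```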
The Spiral Digital Repository is the Imperial College London institutional open access repository. This system allows you, as an author, to make your research documents open access without incurring additional publication costs. When you self-archive a research document in Spiral it becomes free for anyone to read. You can upload copies of your publications to Spiral using Symplectic Elements. All deposited content becomes searchable online.
The HEASARC is a multi-mission astronomy archive for the EUV, X-ray, and gamma-ray wave bands. Because EUV, X-rays and gamma rays cannot reach the Earth's surface, it is necessary to place the telescopes and sensors on spacecraft. The HEASARC now holds the data from 25 observatories covering over 30 years of X-ray, extreme-ultraviolet and gamma-ray astronomy. Data and software from many of the older missions were restored by the HEASARC staff. Examples of these archived missions include ASCA, BeppoSAX, Chandra, Compton GRO, HEAO 1, Einstein Observatory (HEAO 2), EUVE, EXOSAT, HETE-2, INTEGRAL, ROSAT, Rossi XTE, Suzaku, Swift, and XMM-Newton.
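HEASARC catalogs can be queried from Python through Virtual Observatory protocols. The sketch below uses pyvo's TAP client; the service URL, the 'rosmaster' (ROSAT observation master catalog) table name and its column names are assumptions to be checked against HEASARC's Xamin documentation.

```python
# Sketch: run an ADQL query against the (assumed) HEASARC TAP service with pyvo.
import pyvo

service = pyvo.dal.TAPService("https://heasarc.gsfc.nasa.gov/xamin/vo/tap")  # assumed URL
result = service.search("SELECT TOP 5 name, ra, dec FROM rosmaster")         # assumed table/columns
for row in result:
    print(row["name"], row["ra"], row["dec"])
```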
The Magnetics Information Consortium (MagIC) improves research capacity in the Earth and Ocean sciences by maintaining an open community digital data archive for rock magnetic, geomagnetic, archeomagnetic (archaeomagnetic) and paleomagnetic (palaeomagnetic) data. Different parts of the website allow users to archive, search, visualize, and download these data. MagIC supports the international rock magnetism, geomagnetism, archeomagnetism (archaeomagnetism), and paleomagnetism (palaeomagnetism) research community and endeavors to bring data out of private archives, making them accessible to all and (re-)usable for new, creative, collaborative scientific and educational activities. The data in MagIC is used for many types of studies including tectonic plate reconstructions, geomagnetic field models, paleomagnetic field reversal studies, magnetohydrodynamical studies of the Earth's core, magnetostratigraphy, and archeology. MagIC is a domain-specific data repository and is directed by PIs who are both producers and consumers of rock, geo, and paleomagnetic data. Funded by NSF since 2003, MagIC forms a major part of https://earthref.org which integrates four independent cyber-initiatives rooted in various parts of the Earth, Ocean and Life sciences and education.
CaltechDATA is an institutional data repository for Caltech. The Caltech Library runs the repository to preserve the accomplishments of Caltech researchers and share their results with the world. Caltech-associated researchers can upload data, link data with their publications, and assign a permanent DOI so that others can reference the data set. The repository also preserves software and has automatic GitHub integration. All files present in the repository are open access or embargoed, and all metadata is always available to the public.
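CaltechDATA records can also be searched over HTTP. The sketch below assumes the repository exposes an InvenioRDM-style /api/records search endpoint at data.caltech.edu; the endpoint path, parameters and response shape should be verified against the CaltechDATA API documentation.

```python
# Sketch: search CaltechDATA records, assuming an InvenioRDM-style REST API.
import requests

resp = requests.get(
    "https://data.caltech.edu/api/records",   # assumed endpoint
    params={"q": "climate", "size": 3},       # free-text query, illustrative only
)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    metadata = hit.get("metadata", {})
    print(hit.get("id"), "-", metadata.get("title"))
```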