Filter
  • Subjects
  • Content Types
  • Countries
  • AID systems
  • API
  • Certificates
  • Data access
  • Data access restrictions
  • Database access
  • Database access restrictions
  • Database licenses
  • Data licenses
  • Data upload
  • Data upload restrictions
  • Enhanced publication
  • Institution responsibility type
  • Institution type
  • Keywords
  • Metadata standards
  • PID systems
  • Provider types
  • Quality management
  • Repository languages
  • Software
  • Syndications
  • Repository types
  • Versioning

  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) can be used to group terms and set precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
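
As an illustration of how these operators combine, the short Python sketch below lists a few example query strings; the search terms themselves are invented, and only the operator syntax follows the rules above.

    # Illustrative query strings for the search syntax described above.
    # The terms are invented; only the operators follow the documented rules.
    queries = [
        'ocean*',                               # wildcard: ocean, oceans, oceanography, ...
        '"surface ozone"',                      # phrase search
        'climate + model',                      # AND (also the default between terms)
        'antarctic | arctic',                   # OR
        'repository - software',                # NOT
        '("data archive" | portal) + europe',   # parentheses set precedence
        'ozone~1',                              # fuzzy match within edit distance 1
        '"ozone measurements"~2',               # phrase search with a slop of 2
    ]
    for q in queries:
        print(q)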
Found 68 results
Through the EnviDat programme, WSL is developing a unified, managed access portal for its rich reservoir of environmental monitoring and research data. EnviDat is designed as a portal to publish, connect and search across existing data, but it is not intended to become a large data centre hosting original data. While data sharing is facilitated centrally, data management remains decentralised, and the know-how and responsibility for curating research data remain with the original data providers.
bonndata is the institutional, FAIR-aligned, curated, cross-disciplinary repository for the publication of research data by all researchers at the University of Bonn. The repository is fully embedded into the University IT and Data Center and is curated by the Research Data Service Center (https://www.forschungsdaten.uni-bonn.de/en). bonndata is based on the open source software Dataverse (https://dataverse.org).
The Research Collection is ETH Zurich's publication platform. It unites the functions of a university bibliography, an open access repository and a research data repository within one platform. Researchers who are affiliated with ETH Zurich, the Swiss Federal Institute of Technology, may deposit research data from all domains. They can publish data as a standalone publication, publish it as supplementary material for an article, dissertation or another text, share it with colleagues or a research group, or deposit it for archiving purposes. Research-data-specific features include flexible access rights settings, DOI registration and a DOI preview workflow, content previews for ZIP and TAR containers, as well as download statistics and altmetrics for published data. All data uploaded to the Research Collection are also transferred to the ETH Data Archive, ETH Zurich's long-term archive.
CaltechDATA is an institutional data repository for Caltech. The Caltech Library runs the repository to preserve the accomplishments of Caltech researchers and share their results with the world. Caltech-associated researchers can upload data, link data with their publications, and assign a permanent DOI so that others can reference the dataset. The repository also preserves software and has automatic GitHub integration. All files in the repository are open access or embargoed, and all metadata is always available to the public.
Copernicus is a European system for monitoring the Earth. Copernicus consists of a complex set of systems which collect data from multiple sources: Earth observation satellites and in situ sensors such as ground stations, airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities. Several economic studies have already demonstrated considerable potential for job creation, innovation and growth.
The Tropospheric Ozone Assessment Report (TOAR) database of global surface observations is the world's most extensive collection of surface ozone measurements and also includes data on other air pollutants and on weather for some regions. Measurements from 1970 to 2019 (Version 1) have been collected in a relational database and are made available via a graphical web interface, a REST service (https://toar-data.fz-juelich.de/api/v1) and as aggregated products on PANGAEA (https://doi.pangaea.de/10.1594/PANGAEA.876108). Measurements from 1970 to the present (Version 2) are being collected in a relational database and are made available via a REST service (https://toar-data.fz-juelich.de/api/v2).
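
As an illustration, the Version 2 REST service mentioned above can be queried over plain HTTP. The Python sketch below uses the documented base URL, but the resource path ("stationmeta") and the query parameter ("limit") are assumptions made for illustration only; consult the TOAR API documentation for the actual endpoints.

    # Minimal sketch of a request to the TOAR REST service (Version 2).
    # Base URL taken from the description above; the endpoint and parameters
    # below are illustrative assumptions, not confirmed API paths.
    import requests

    BASE_URL = "https://toar-data.fz-juelich.de/api/v2"

    response = requests.get(f"{BASE_URL}/stationmeta/", params={"limit": 10}, timeout=30)
    response.raise_for_status()

    for station in response.json():
        print(station)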
GAWSIS is being developed and maintained by the Federal Office of Meteorology and Climatology MeteoSwiss in collaboration with the WMO GAW Secretariat, the GAW World Data Centres and other GAW representatives to improve the management of information about the GAW network of ground-based stations. The application is presently hosted by the Swiss Laboratories for Materials Testing and Research (Empa). GAWSIS provides the GAW community and other interested people with an up-to-date, searchable database of site descriptions, measurement programmes, available data, contact persons and bibliographic references. Linked data collections are hosted at the World Data Centres of the WMO Global Atmosphere Watch.
The Genomic Observatories Meta-Database (GEOME) is a web-based database that captures the who, what, where, and when of biological samples and associated genetic sequences. GEOME helps users ensure that the metadata from their biological samples are findable, accessible, interoperable, and reusable; improve the quality of their data and comply with global data standards; and integrate with R, ease publication to NCBI's Sequence Read Archive, and work with an associated LIMS. The initial use case for GEOME came from the Diversity of the Indo-Pacific Network (DIPnet) resource.
B2SHARE allows researchers to publish research data together with the accompanying metadata. It supports different research communities with specific metadata schemas. This server is provided for researchers of the Research Centre Juelich and related communities.
The Bavarian Natural History Collections (Staatliche Naturwissenschaftliche Sammlungen Bayerns, SNSB) are a research institution for natural history in Bavaria. They encompass five State Collections (zoology, botany, paleontology and geology, mineralogy, anthropology and paleoanatomy), the Botanical Garden Munich-Nymphenburg and eight museums with public exhibitions in Munich, Bamberg, Bayreuth, Eichstätt and Nördlingen. The SNSB's research focuses mainly on past and present bio- and geodiversity and on the evolution of animals and plants, supported by large scientific collections of almost 35,000,000 specimens (see "joint projects").
DataverseNO (https://dataverse.no) is a curated, FAIR-aligned, national, generic repository for open research data from all academic disciplines. DataverseNO is committed to keeping published data accessible and (re)usable over the long term. The repository is owned and operated by UiT The Arctic University of Norway. DataverseNO accepts submissions primarily from researchers at Norwegian research institutions. Datasets in DataverseNO are grouped into institutional collections as well as special collections. The technical infrastructure of the repository is based on the open source application Dataverse (https://dataverse.org), which is developed by an international developer and user community led by Harvard University.
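
Because DataverseNO runs the standard Dataverse application, its holdings can in principle be queried through the generic Dataverse Search API. The Python sketch below assumes that this API is exposed at https://dataverse.no and that unauthenticated searches are permitted; the search term is arbitrary.

    # Minimal sketch: searching a Dataverse installation via the Dataverse Search API.
    # Assumes the API is reachable at https://dataverse.no and allows anonymous queries.
    import requests

    SEARCH_URL = "https://dataverse.no/api/search"

    params = {"q": "arctic", "type": "dataset", "per_page": 5}
    response = requests.get(SEARCH_URL, params=params, timeout=30)
    response.raise_for_status()

    for item in response.json()["data"]["items"]:
        print(item.get("global_id"), "-", item.get("name"))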
Scripps Institution of Oceanography (SIO) Explorer includes five federated collections: SIO Cruises, SIO Historic Photographs, the Seamounts, Marine Geological Samples, and the Educator's Collection, all part of the US National Science Digital Library (NSDL). Each collection represents a unique resource of irreplaceable scientific research. The effort is a collaboration among researchers at Scripps, computer scientists from the San Diego Supercomputer Center (SDSC), and archivists and librarians from the UCSD Libraries. In 2005 SIOExplorer was extended to the Woods Hole Oceanographic Institution through the Multi-Institution Scalable Digital Archiving project, funded through the joint NSF/Library of Congress digital archiving and preservation program, creating a harvesting methodology and a prototype collection of cruises, Alvin submersible dives and Jason ROV lowerings.
The Antarctic and Southern Ocean Data Portal, part of the US Antarctic Data Consortium, provides access to geoscience data, primarily marine, from the Antarctic region. The synthesis began in 2003 as the Antarctic Multibeam Bathymetry and Geophysical Data Synthesis (AMBS) with a focus on multibeam bathymetry field data and other geophysical data from the Southern Ocean collected with the R/V N. B. Palmer. In 2005, the effort was expanded to include all routine underway geophysical and oceanographic data collected with both the R/V N. B. Palmer and R/V L. Gould, the two primary research vessels serving the US Antarctic Program.
Protectedplanet.net combines crowdsourcing and authoritative sources to enrich and provide data for protected areas around the world. Data are provided in partnership with the World Database on Protected Areas (WDPA). The data include the location, designation type, status year, and size of the protected areas, as well as species information.
The Space Physics Data Facility (SPDF) leads in the design and implementation of unique multi-mission and multi-disciplinary data services and software to strategically advance NASA's solar-terrestrial program, to extend our science understanding of the structure, physics and dynamics of the Heliosphere of our Sun and to support the science missions of NASA's Heliophysics Great Observatory. Major SPDF efforts include multi-mission data services such as the Heliophysics Data Portal (formerly VSPO), CDAWeb and CDAWeb Inside IDL, and OMNIWeb Plus (including COHOWeb, ATMOWeb, HelioWeb and CGM), science planning and orbit services such as SSCWeb, data tools such as the CDF software and tools, and a range of other science and technology research efforts. The staff supporting SPDF includes scientists and information technology experts.
The UK Polar Data Centre (UK PDC) is the focal point for Arctic and Antarctic environmental data management in the UK. Part of the Natural Environment Research Council's (NERC) network of environmental data centres and based at the British Antarctic Survey, it coordinates the management of polar data from UK-funded research and supports researchers in complying with national and international data legislation and policy.
ICRISAT performs crop improvement research, using conventional methods as well as methods derived from biotechnology, on the following crops: chickpea, pigeonpea, groundnut, pearl millet, sorghum and small millets. ICRISAT's data repository collects, preserves and facilitates access to the datasets produced by ICRISAT researchers for all interested users. Data include phenotypic, genotypic, social science, spatial, soil and weather data.
ReefTEMPS is a network of sensors measuring temperature, pressure, salinity and other observables in the coastal areas of the South, West and South-West Pacific Ocean, operated by UMR ENTROPIE. It is an observatory service of the French national research infrastructure ILICO for "coastal environments". Some of the network's sensors have been deployed since 1958. Nearly a hundred sensors are currently deployed in 14 countries, covering an area of more than 8,000 km from east to west. Data are acquired at different rates (from 1 second to 30 minutes) depending on the sensor and site. They are processed and described using the Climate and Forecast (CF) Metadata Convention at the end of the oceanographic campaigns organised every 6 months to 2 years to replace the sensors.
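
To give an idea of what describing data with the Climate and Forecast (CF) Metadata Convention involves, the Python sketch below writes a tiny netCDF file with CF-style attributes. The variable names, attribute values and data are invented for illustration and do not reflect the actual ReefTEMPS files.

    # Minimal sketch of a CF-style netCDF file (invented values, not real ReefTEMPS data).
    # Requires the netCDF4 package.
    from netCDF4 import Dataset

    nc = Dataset("example_sea_temperature.nc", "w")
    nc.Conventions = "CF-1.8"
    nc.title = "Illustrative coastal sea temperature series"

    nc.createDimension("time", None)
    time = nc.createVariable("time", "f8", ("time",))
    time.units = "seconds since 1970-01-01 00:00:00"
    time.standard_name = "time"

    temp = nc.createVariable("sea_water_temperature", "f4", ("time",))
    temp.units = "degree_Celsius"
    temp.standard_name = "sea_water_temperature"

    time[:] = [0.0, 1800.0, 3600.0]   # e.g. one value every 30 minutes
    temp[:] = [27.1, 27.3, 27.2]
    nc.close()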
The Humanitarian Data Exchange (HDX) is an open platform for sharing data across crises and organisations. Launched in July 2014, the goal of HDX is to make humanitarian data easy to find and use for analysis. HDX is managed by OCHA's Centre for Humanitarian Data, which is located in The Hague. OCHA is part of the United Nations Secretariat and is responsible for bringing together humanitarian actors to ensure a coherent response to emergencies. The HDX team includes OCHA staff and a number of consultants who are based in North America, Europe and Africa.
The mission of the World Data Center for Climate (WDCC) is to provide central support for the German and European climate research community. The WDCC is a member of the ISC's World Data System. Emphasis is on the development and implementation of best-practice methods for Earth system data management. Data for and from climate research are collected, stored and disseminated. The WDCC is restricted to data products. Cooperation exists with thematically related data centres in, e.g., Earth observation, meteorology, oceanography, paleoclimate and environmental sciences. The services of the WDCC are also available to external users at cost price. A special service for the direct integration of research data into scientific publications has been developed. The editorial process at the WDCC ensures the quality of metadata and research data in collaboration with the data producers. A citation code and a digital object identifier (DOI) are provided and registered, together with citation information, at the DOI registration agency DataCite.
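
As an illustration of what the DataCite registration makes possible, a dataset DOI registered this way can be resolved to machine-readable metadata via content negotiation at doi.org. The Python sketch below uses a placeholder DOI; replace it with a real WDCC dataset DOI to retrieve actual metadata.

    # Minimal sketch: fetching DataCite metadata for a dataset DOI via content
    # negotiation at doi.org. The DOI below is a placeholder, not a real dataset.
    import requests

    doi = "10.1234/placeholder-dataset"
    response = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.datacite.datacite+json"},
        timeout=30,
    )
    response.raise_for_status()

    metadata = response.json()
    print(metadata.get("titles"), metadata.get("publicationYear"))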