  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
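For illustration, these operators can be combined in a single query; the following is a hypothetical example, not a query taken from the registry:
  ("land cover" | "water cycle"~2) +precipitat* -model climate~1
Here precipitat* matches precipitation and related word forms, "water cycle"~2 allows up to two words of slop within the phrase, the parentheses group the OR alternatives, +precipitat* makes the wildcard term required, -model excludes records mentioning model, and climate~1 tolerates an edit distance of one.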
Found 32 result(s)
NASA’s Precipitation Measurement Missions – TRMM and GPM – provide advanced information on rain and snow characteristics and detailed three-dimensional knowledge of precipitation structure within the atmosphere, which help scientists study and understand Earth's water cycle, weather and climate.
>>>!!!<<< 2019-01: The Global Land Cover Facility went offline; see https://spatialreserves.wordpress.com/2019/01/07/global-land-cover-facility-goes-offline/. http://www.landcover.org is no longer accessible. >>>!!!<<< The Global Land Cover Facility (GLCF) provided earth science data and products to help everyone better understand global environmental systems. In particular, the GLCF developed and distributed remotely sensed satellite data and products that describe land cover from local to global scales.
The Research Collection is ETH Zurich's publication platform. It unites the functions of a university bibliography, an open access repository and a research data repository within one platform. Researchers who are affiliated with ETH Zurich, the Swiss Federal Institute of Technology, may deposit research data from all domains. They can publish data as a standalone publication, publish it as supplementary material for an article, dissertation or another text, share it with colleagues or a research group, or deposit it for archiving purposes. Research-data-specific features include flexible access rights settings, DOI registration and a DOI preview workflow, content previews for zip- and tar-containers, as well as download statistics and altmetrics for published data. All data uploaded to the Research Collection are also transferred to the ETH Data Archive, ETH Zurich’s long-term archive.
The BGS is a data-rich organisation with over 400 datasets in its care, including environmental monitoring data, digital databases, physical collections (borehole core, rocks, minerals and fossils), records and archives. Our data are managed by the National Geoscience Data Centre.
The Global Precipitation Measurement (GPM) mission is an international network of satellites that provide the next-generation global observations of rain and snow. Building upon the success of the Tropical Rainfall Measuring Mission (TRMM), the GPM concept centers on the deployment of a “Core” satellite carrying an advanced radar / radiometer system to measure precipitation from space and serve as a reference standard to unify precipitation measurements from a constellation of research and operational satellites.
Geochron is a global database that hosts geochronologic and thermochronologic information from detrital minerals. Information included with each sample consists of a table with the essential isotopic information and ages, a table with basic geologic metadata (e.g., location, collector, publication, etc.), a Pb/U Concordia diagram, and a relative age probability diagram. This information can be accessed and viewed with any web browser, and depending on the level of access desired, can be designated as either private or public. Loading information into Geochron requires the use of U-Pb_Redux, a Java-based program that also provides enhanced capabilities for data reduction, plotting, and analysis. Instructions are provided for three different levels of interaction with Geochron: 1. Accessing samples that are already in the Geochron database. 2. Preparation of information for new samples, and then transfer to Arizona LaserChron Center personnel for uploading to Geochron. 3. Preparation of information and uploading to Geochron using U-Pb_Redux.
ICRISAT performs crop improvement research, using conventional methods as well as methods derived from biotechnology, on the following crops: Chickpea, Pigeonpea, Groundnut, Pearl millet, Sorghum and Small millets. ICRISAT's data repository collects, preserves and facilitates access to the datasets produced by ICRISAT researchers for all interested users. Data include Phenotypic, Genotypic, Social Science, Spatial, Soil and Weather data.
The NSF-supported Critical Zone Observatory (CZO) program serves the international scientific community through research, infrastructure, data, and models. We focus on how components of the Critical Zone interact, shape Earth's surface, and support life. ARCHIVED CONTENT: In December 2020, the CZO program was succeeded by the Critical Zone Collaborative Network (CZ Net): https://criticalzone.org/
The Australian National University undertakes work to collect and publish metadata about research data held by ANU and, in four discipline areas (Earth Sciences, Astronomy, Phenomics and Digital Humanities), to develop pipelines and tools that enable the publication of research data using a common and repeatable approach. Aims and outcomes: to identify and describe research data held at ANU; to develop a consistent approach to the publication of metadata on the University's data holdings; to identify and curate significant orphan datasets that might otherwise be lost or inadvertently destroyed; and to develop a culture of data sharing and data re-use.
The Repository of the Faculty of Science is the institutional repository that gathers, permanently stores and provides access to the results of the scientific and intellectual work of the Faculty of Science, University of Zagreb. The objects that can be stored in the repository are research data, scientific articles, conference papers, theses, dissertations, books, teaching materials, images, video and audio files, and presentations. To improve searchability, all materials are described with a predetermined set of metadata.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation and, for most of its content, it is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).
The British Oceanographic Data Centre (BODC) is a national facility for looking after and distributing data concerning the marine environment. We deal with biological, chemical, physical and geophysical data, and our databases contain measurements of nearly 22,000 different variables. Many of our staff have direct experience of marine data collection and analysis. They work alongside information technology specialists to ensure that data are documented and stored for current and future use.
Repository for the New Mexico Experimental Program to Stimulate Competitive Research (NM EPSCoR) Data Collection. It provides access to data generated by the Energize New Mexico project as well as data gathered in our previous project, which focused on Climate Change Impacts (RII 3). NM EPSCoR contributes its data to the DataONE network as a member node: https://search.dataone.org/#profile/NMEPSCOR
The South African Marine Information Management System (MIMS) is an Open Archival Information System (OAIS) repository that plays a multifaceted role in archiving, publishing, and preserving marine-related datasets. As an IODE-accredited Associate Data Unit (ADU), MIMS serves as a national node for the IODE of the IOC of UNESCO. It archives and publishes collections and subsets of marine-related datasets for the National Department of Forestry, Fisheries, and the Environment (DFFE) and its regional partners. As an IOC member organization, DFFE is committed to supporting the long-term preservation and archival of marine and coastal data for South Africa and its regional partners, promoting open access to data, and encouraging scientific collaboration. Tasked with the long-term preservation of South Africa's marine and coastal data, MIMS functions as an institutional data repository. It provides primary access to all data collected by the DFFE Oceans and Coastal Research Directorate and acts as a trusted broker of scientific marine data for a wide range of South African institutions. MIMS hosts the IODE AFROBIS Node, an OBIS Node that coordinates and collates data management activities within the sub-Saharan African region. As part of the OBIS Steering Group, MIMS represents sub-Saharan Africa on issues around biological (biodiversity) data standards. It also facilitates data and metadata publishing for the region through the GBIF and OBIS networks. Operating on the Findable, Accessible, Interoperable, and Reusable (FAIR) data principles, MIMS aligns its practices to maximize ocean data exchange and use while respecting the conditions stipulated by the Data Provider. By integrating various functions and commitments, MIMS stands as a vital component in the marine and coastal data landscape, fostering collaboration, standardization, and accessibility in alignment with international standards and regional needs.
The objective of this database is to stimulate the exchange of information and collaboration between researchers within the ChArMEx community. However, the community is not exclusive: researchers not directly involved in ChArMEx who wish to contribute to ChArMEx scientific and/or educational goals are welcome to join in. The database is a repository for all the data collected during the various projects that contribute to the ChArMEx coordinated program. It aims at documenting, storing and distributing the data produced or used by the project community. It is also intended to host datasets that were produced outside the ChArMEx program but which are meaningful to ChArMEx scientific and/or educational goals. Any data owner who wishes to add or link their dataset to the ChArMEx database is welcome to contact the database manager for help and support. The ChArMEx database includes past and recent geophysical in situ observations, satellite products and model outputs. The database organizes data management and provides data services to end users of ChArMEx data. The database system provides a detailed description of the products and uses standardized formats whenever possible. It defines the access rules to the data and details the mutual rights and obligations of data providers and users (see the ChArMEx data and publication policy). The database is developed jointly by SEDOO (OMP, Toulouse), ICARE (Lille) and ESPRI (IPSL, Paris).
The main focus of tambora.org is Historical Climatology. Years of meticulous work in this field in research groups around the world have resulted in large data collections on climatic parameters such as temperature, precipitation, storms, floods, etc. with different regional, temporal and thematic foci. tambora.org enables researchers to collaboratively interpret the information derived from historical sources. It provides a database for original text quotations together with bibliographic references and the extracted places, dates and coded climate and environmental information.
The DMC is designed to provide registered users with access to non-confidential petroleum exploration and production data from offshore Nova Scotia, subject to certain conditions. The DMC is housed in the CNSOPB's Geoscience Research Centre located in Dartmouth, Nova Scotia. Initially, the DMC will manage and distribute the following digital petroleum data: well data (i.e. logs and reports), seismic image files (e.g. TIFF, PDF), and production data. In the future the DMC could be expanded to include operational, safety, environmental, fisheries data, etc.
The Marine Data Archive (MDA) is an online repository specifically developed to independently archive data files in a fully documented manner. The MDA can serve individuals, consortia, working groups and institutes to manage data files and file versions for a specific context (project, report, analysis, monitoring campaign), as a personal or institutional archive or back-up system and as an open repository for data publication.
The Open Science Framework (OSF) is part network of research materials, part version control system, and part collaboration software. The purpose of the software is to support the scientist's workflow and help increase the alignment between scientific values and scientific practices. Document and archive studies: move the organization and management of study materials from the desktop into the cloud. Labs can organize, share, and archive study materials among team members, and web-based project management reduces the likelihood of losing study materials due to computer malfunction, changing personnel, or just forgetting where you put the damn thing. Share and find materials: with a click, make study materials public so that other researchers can find, use and cite them, and find materials by other researchers to avoid reinventing something that already exists. Detail individual contribution: assign citable contributor credit to any research material - tools, analysis scripts, methods, measures, data. Increase transparency: make as much of the scientific workflow public as desired, as it is developed or after publication of reports. Registration: registering materials can certify what was done in advance of data analysis, or confirm the exact state of the project at important points of the lifecycle, such as manuscript submission or the onset of data collection. Manage scientific workflow: a structured, flexible system can provide efficiency gains to workflow and clarity to project objectives.