  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) imply grouping precedence
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
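For illustration, a few hypothetical queries combining these operators (the search terms are invented examples, not taken from the result list):
  • climat* finds climate, climatology, climatic, etc.
  • "earth tides" matches the exact phrase
  • ocean + carbon requires both terms (equivalent to the default AND)
  • ocean | atmosphere matches either term
  • geophysics -seismics excludes records containing seismics
  • (ozone | UV) + Antarctica evaluates the OR group before the AND
  • tempratur~2 tolerates up to two character edits of the word
  • "sea level"~1 allows one intervening word within the phrase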
Found 46 result(s)
>>>!!!<<< The repository is no longer available. >>>!!!<<< C3-Grid was a project within D-Grid, the initiative to promote a grid-based e-Science framework in Germany; the project has since been completed. The goal of C3-Grid was to support the workflow of Earth system researchers by implementing a grid infrastructure for efficient distributed data processing and inter-institutional data exchange. The aim of the effort was to develop an infrastructure for uniform access to heterogeneous data and distributed data processing. The work was structured in two projects funded by the Federal Ministry of Education and Research. The first project was part of the D-Grid initiative; it explored the potential of grid technology for climate research and developed a prototype infrastructure. Details about the C3Grid architecture are described in “Earth System Modelling – Volume 6”. In the second phase, "C3Grid - INAD: Towards an Infrastructure for General Access to Climate Data", this infrastructure was improved, especially with respect to interoperability with the Earth System Grid Federation (ESGF). Furthermore, the portfolio of available diagnostic workflows was expanded. These workflows can now be re-used in adjacent infrastructures: in the MiKlip Evaluation Tool (http://www.fona-miklip.de/en/index.php) and as Web Processes within the Birdhouse Framework (http://bird-house.github.io/). The Birdhouse Framework is now funded as part of the European Copernicus Climate Change Service (https://climate.copernicus.eu/) managed by ECMWF and will be extended to provide scalable processing services for ESGF-hosted data at DKRZ as well as IPSL and BADC.
Yoda publishes research data on behalf of researchers affiliated with Utrecht University, its research institutes, and consortia for which it acts as a coordinating body. Data packages are not limited to a particular field of research or license. Yoda publishes data packages via DataCite. To find data publications, use https://public.yoda.uu.nl/ or the DataCite search engine: https://search.datacite.org/repositories/delft.uu
One of the world’s largest banks of biological, psychosocial and clinical data on people suffering from mental health problems. The Signature center systematically collects biological, psychosocial and clinical indicators from patients admitted to the psychiatric emergency department at four points throughout their journey in the hospital: upon arrival at the emergency room (state of crisis), at the end of their hospital stay, and at the beginning and the end of outpatient treatment. For all hospital clients who agree to participate, blood specimens are collected to measure metabolic, genetic, toxic and infectious biomarkers, saliva samples are collected to measure sex hormones, and hair samples are collected to measure stress hormones. Questionnaires have been selected to cover important dimensional aspects of mental illness such as Behaviour and Cognition (Psychosis, Depression, Anxiety, Impulsiveness, Aggression, Suicide, Addiction, Sleep), Socio-demographic Profile (Spiritual beliefs, Social functioning, Childhood experiences, Demographics, Family background) and Medical Data (Medication, Diagnosis, Long-term health, RAMQ data). As of May 2016, there were more than 1,150 participants, of whom 400 were in the longitudinal follow-up.
ForestPlots.net is a web-accessible secure repository for forest plot inventories in South America, Africa and Asia. The database includes plot geographical information; location, taxonomic information and diameter measurements of trees inside each plot; and participants in plot establishment and re-measurement, including principal investigators, field assistants, students.
The National Science Foundation (NSF) Ultraviolet (UV) Monitoring Network provides data on ozone depletion and the associated effects on terrestrial and marine systems. Data are collected from seven sites in Antarctica, Argentina, the United States, and Greenland. The network provides data to researchers studying the effects of ozone depletion on terrestrial and marine biological systems. Network data are also used for the validation of satellite observations and for the verification of models describing the transfer of radiation through the atmosphere.
THEREDA (Thermodynamic Reference Database) is a joint project dedicated to the creation of a comprehensive, internally consistent thermodynamic reference database, to be used with suitable codes for the geochemical modeling of aqueous electrolyte solutions up to high concentrations.
All ADNI data are shared without embargo through the LONI Image and Data Archive (IDA), a secure research data repository. Interested scientists may obtain access to ADNI imaging, clinical, genomic, and biomarker data for the purposes of scientific investigation, teaching, or planning clinical research studies. "The Alzheimer’s Disease Neuroimaging Initiative (ADNI) unites researchers with study data as they work to define the progression of Alzheimer’s disease (AD). ADNI researchers collect, validate and utilize data, including MRI and PET images, genetics, cognitive tests, CSF and blood biomarkers as predictors of the disease. Study resources and data from the North American ADNI study are available through this website, including Alzheimer’s disease patients, mild cognitive impairment subjects, and elderly controls."
>>>!!!<<< There are no more data available. >>>!!!<<< HalOcAt brings together global oceanic and atmospheric data of mainly short-lived brominated and iodinated trace gases.
Health Data Nova Scotia (HDNS) is a data repository based in the Department of Community Health and Epidemiology, Faculty of Medicine, at Dalhousie University, focused on supporting data-driven research for a healthier Nova Scotia. HDNS facilitates research and innovation in Nova Scotia by providing access to linkable administrative health data and analysis for research and health service assessment purposes in a secure, controlled environment, while respecting the privacy and confidentiality of Nova Scotians.
Over 1,000 detailed, fully referenced and verified datasets for steels, aluminium and titanium alloys, cast irons/steels, and weld metals. Materials can be searched according to a number of different criteria. Initial search results are presented in the form of a table from which materials can be selected for presentation as a detailed report or for a comparison overview (up to 5 materials). In addition to material information and values of properties/parameters, images of microstructure and specimens, as well as stress-strain and stress- and strain-life curves (if available), can be reviewed.
MEMENTO aims to become a valuable tool for identifying regions of the world ocean that should be targeted in future work to improve the quality of air-sea flux estimates.
LIAG's Geophysics Information System (FIS GP) stores and supplies geophysical measurements and evaluations of LIAG and its partners. The architecture of the overall system comprises a universal part (superstructure) and several subsystems dedicated to geophysical methods (borehole geophysics, gravimetry, magnetics, 1D/2D geoelectrics, underground temperatures, seismics, VSP, helicopter geophysics and rock physics). Additional subsystems are planned.
The Africa Health Research Institute (AHRI) has published its updated analytical datasets for 2016. The datasets cover socio-economic, education and employment information for individuals and households in AHRI’s population research area in rural northern KwaZulu-Natal. The datasets also include details on the migration patterns of the individuals and households who migrated into and out of the surveillance area, as well as data on probable causes of death for individuals who passed away. Data collection for the 2016 individual interviews – which involve a dried blood spot sample being taken – is still in progress, and therefore the datasets on HIV status and general health only go up to 2015 for now. Over the past 16 years, researchers have developed an extensive longitudinal database of demographic, social, economic, clinical and laboratory information about people over the age of 15 living in the AHRI population research area. During this time researchers have followed more than 160,000 people, of whom 92,000 are still in the programme.
>>>!!!<<< The IGETS data base at GFZ Potsdam (http://www.re3data.org/repository/r3d100010300) continues the activities of the International Center for Earth Tides (ICET), in particular collecting, archiving and distributing Earth tide records from long series of gravimeters, tiltmeters, strainmeters and other geodynamic sensors. >>>!!!<<< The ICET Data Bank contains results from 360 tidal gravity stations: hourly values, main tidal waves obtained by least-squares analyses, residual vectors, and oceanic attraction and loading vectors. The Data Bank also contains data from tiltmeters and extensometers. ICET is responsible for the Information System and Data Center of the Global Geodynamic Project (GGP). The tasks ascribed to ICET are: to collect all available measurements of Earth tides (its task as World Data Centre C); to evaluate these data by suitable methods of analysis in order to reduce the very large amount of measurements to a limited number of parameters containing all the desired and needed geophysical information; to compare the data from different instruments and different stations distributed all over the world, and to evaluate their precision and accuracy with respect to internal as well as external errors; to help solve the basic problem of calibration and to organize reference stations or build reference calibration devices; to fill gaps in information or data as far as feasible; to build a data bank allowing immediate and easy comparison of Earth tide parameters with different Earth models and other geodetic and geophysical parameters such as geographical position, Bouguer anomaly, crustal thickness and age, and heat flow; and to ensure a broad diffusion of the results and information to all interested laboratories and individual scientists.
In 2003, the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) at NIH established Data, Biosample, and Genetic Repositories to increase the impact of current and previously funded NIDDK studies by making their data and biospecimens available to the broader scientific community. These Repositories enable scientists not involved in the original study to test new hypotheses without any new data or biospecimen collection, and they provide the opportunity to pool data across several studies to increase the power of statistical analyses. In addition, most NIDDK-funded studies are collecting genetic biospecimens and carrying out high-throughput genotyping, making it possible for other scientists to use Repository resources to match genotypes to phenotypes and to perform informative genetic analyses.
The Common Cold Project began in 2011 with the aim of creating, documenting, and archiving a database that combines final research data from 5 prospective viral-challenge studies that were conducted over the preceding 25 years: the British Cold Study (BCS); the three Pittsburgh Cold Studies (PCS1, PCS2, and PCS3); and the Pittsburgh Mind-Body Center Cold Study (PMBC). These unique studies assessed predictor (and hypothesized mediating) variables in healthy adults aged 18 to 55 years, experimentally exposed them to a virus that causes the common cold, and then monitored them for development of infection and signs and symptoms of illness.
The Central Neuroimaging Data Archive (CNDA) allows sharing of complex imaging data with investigators around the world through a simple web portal. The CNDA is an imaging informatics platform that provides secure data management services for Washington University investigators, including sharing of source DICOM imaging data with external investigators through a web portal, cnda.wustl.edu. The CNDA’s services include automated archiving of imaging studies from all of the University’s research scanners, automated quality control and image processing routines, and secure web-based access to acquired and post-processed data for data sharing, in compliance with NIH data sharing guidelines. The CNDA is currently accepting datasets only from Washington University-affiliated investigators. Through this platform, the data are available for broad sharing with researchers both internal and external to Washington University. The CNDA overlaps with data in oasis-brains.org (https://www.re3data.org/repository/r3d100012182), but the CNDA is a larger dataset.
The aim of the project KCDC (KASCADE Cosmic Ray Data Centre) is the installation and establishment of a public data centre for high-energy astroparticle physics based on the data of the KASCADE experiment. KASCADE was a very successful large detector array which recorded data for more than 20 years on the site of KIT Campus North, Karlsruhe, Germany (formerly Forschungszentrum Karlsruhe) at 49.1°N, 8.4°E, 110 m a.s.l. KASCADE collected within its lifetime more than 1.7 billion events, of which some 433,000,000 survived all quality cuts. Initially, about 160 million events are available here for public usage.
The main goal of the ECCAD project is to provide scientific and policy users with datasets of surface emissions of atmospheric compounds and ancillary data, i.e. data required to estimate or quantify surface emissions. The supply of ancillary data - such as maps of population density, maps of fire spots, burnt areas, and land cover - could help improve and encourage the development of new emissions datasets. ECCAD offers:
  • Access to global and regional emission inventories and ancillary data in a standardized format
  • Quick visualization of emission and ancillary data
  • Rationalization of the use of input data in algorithms or emission models
  • Analysis and comparison of emissions datasets and ancillary data
  • Tools for the evaluation of emissions and ancillary data
ECCAD is a dynamic and interactive database, providing the most up-to-date datasets, including data used within ongoing projects. Users are welcome to add their own datasets or have their regional masks included in order to use ECCAD tools.
From April 2020 to March 2023, the Covid-19 Immunity Task Force (CITF) supported 120 studies to generate knowledge about immunity to SARS-CoV-2. The subjects addressed by these studies include the extent of SARS-CoV-2 infection in Canada, the nature of immunity, vaccine effectiveness and safety, and the need for booster shots among different communities and priority populations in Canada. The CITF Databank was developed to further enhance the impact of CITF funded studies by allowing additional research using the data collected from CITF-supported studies. The CITF Databank centralizes and harmonizes individual-level data from CITF-funded studies that have met all ethical requirements to deposit data in the CITF Databank and have completed a data sharing agreement. The CITF Databank is an internationally unique resource for sharing epidemiological and laboratory data from studies about SARS-CoV-2 immunity in different populations. The types of research that are possible with data from the CITF Databank include observational epidemiological studies, mathematical modelling research, and comparative evaluation of surveillance and laboratory methods.
ETH Data Archive is ETH Zurich's long-term preservation solution for digital information such as research data, digitised content, archival records, or images. It serves as the backbone of data curation, and for most of its content it is a “dark archive” without public access. In this capacity, the ETH Data Archive also archives the content of ETH Zurich’s Research Collection, which is the primary repository for members of the university and the first point of contact for publication of data at ETH Zurich. All data produced in the context of research at ETH Zurich can be published and archived in the Research Collection. An automated connection to the ETH Data Archive in the background ensures the medium- to long-term preservation of all publications and research data. Direct access to the ETH Data Archive is intended only for customers who need to deposit software source code within the framework of ETH transfer Software Registration. Open Source code packages and other content from legacy workflows can be accessed via ETH Library @ swisscovery (https://library.ethz.ch/en/).