  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) indicate grouping (precedence)
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
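These operators can be combined into a single query string. The sketch below is a minimal illustration in Python (standard library only) of how such queries might be URL-encoded for a search endpoint; the base URL and the q parameter name are placeholders chosen for the example, not the registry's documented API.

    from urllib.parse import urlencode

    # Example query strings built from the operators listed above.
    queries = [
        'genom*',                        # wildcard: matches "genome", "genomics", ...
        '"research data"',               # quoted phrase search
        'climate + ocean',               # explicit AND (also the default between terms)
        'seismology | earthquake',       # OR
        'biodiversity -marine',          # NOT: exclude results containing "marine"
        '(genetics | genomics) + human', # parentheses set precedence
        'oceanogaphy~2',                 # fuzzy term: tolerates 2 edits (matches "oceanography")
        '"gene expression"~3',           # phrase search with a slop of 3
    ]

    # Hypothetical endpoint, used only to show how the queries would be encoded.
    BASE_URL = "https://example.org/search"

    for q in queries:
        print(f"{BASE_URL}?{urlencode({'q': q})}")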
Found 34 result(s)
GPO’s govinfo system is an ISO 16363 certified Trustworthy Digital Repository that ensures free online access to current and historical information from all three branches of the United States Federal Government today and into the future.
GEOFON seeks to facilitate cooperation in seismological research and earthquake and tsunami hazard mitigation by providing rapid transnational access to seismological data and source parameters of large earthquakes, and keeping these data accessible in the long term. It pursues these aims by operating and maintaining a global network of permanent broadband stations in cooperation with local partners, facilitating real-time access to data from this network and those of many partner networks and plate boundary observatories, and providing a permanent and secure archive for seismological data. It also archives and makes accessible data from temporary experiments carried out by scientists at German universities and institutions, thereby fostering cooperation and encouraging the full exploitation of all acquired data, and serving as the permanent archive for the Geophysical Instrument Pool at Potsdam (GIPP). It also organises the data exchange of real-time and archived data with partner institutions and international centres.
The Research Collection is ETH Zurich's publication platform. It unites the functions of a university bibliography, an open access repository and a research data repository within one platform. Researchers who are affiliated with ETH Zurich, the Swiss Federal Institute of Technology, may deposit research data from all domains. They can publish data as a standalone publication, publish it as supplementary material for an article, dissertation or another text, share it with colleagues or a research group, or deposit it for archiving purposes. Research-data-specific features include flexible access rights settings, DOI registration and a DOI preview workflow, content previews for zip- and tar-containers, as well as download statistics and altmetrics for published data. All data uploaded to the Research Collection are also transferred to the ETH Data Archive, ETH Zurich’s long-term archive.
CLARIN is a European Research Infrastructure for the Humanities and Social Sciences, focusing on language resources (data and tools). It is being implemented and continuously improved at leading institutions in a large and growing number of European countries, with the aim of strengthening Europe's multilingual competence. CLARIN provides several services, such as access to language data and tools to analyze those data, the possibility to deposit research data, and direct access to knowledge about topics relevant to research on and with language resources. The main tool is the 'Virtual Language Observatory', which provides metadata and access to the different national CLARIN centers and their data.
>>>!!!<<< This site is going away on April 1, 2021. General access to the site has been disabled and community users will see an error upon login. >>>!!!<<< Socrata’s cloud-based solution allows government organizations to put their data online, make data-driven decisions, operate more efficiently, and share insights with citizens.
The Copernicus Marine Environment Monitoring Service (CMEMS) provides regular and systematic reference information on the physical and biogeochemical state, variability and dynamics of the ocean and marine ecosystems for the global ocean and the European regional seas. The observations and forecasts produced by the service support all marine applications, including: Marine safety; Marine resources; Coastal and marine environment; Weather, seasonal forecasting and climate. For instance, the provision of data on currents, winds and sea ice helps to improve ship routing services, offshore operations or search and rescue operations, thus contributing to marine safety. The service also contributes to the protection and the sustainable management of living marine resources, in particular for aquaculture, sustainable fisheries management or regional fisheries organisations' decision-making processes. Physical and marine biogeochemical components are useful for water quality monitoring and pollution control. Sea level rise is a key indicator of climate change and helps to assess coastal erosion. Sea surface temperature elevation has direct consequences on marine ecosystems and the appearance of tropical cyclones. As a result, the service supports a wide range of coastal and marine environment applications. Many of the data delivered by the service (e.g. temperature, salinity, sea level, currents, wind and sea ice) also play a crucial role in the domain of weather, climate and seasonal forecasting.
DataverseNO (https://dataverse.no) is a curated, FAIR-aligned national generic repository for open research data from all academic disciplines. DataverseNO is committed to ensuring that published data remain accessible and (re)usable in the long term. The repository is owned and operated by UiT The Arctic University of Norway. DataverseNO accepts submissions primarily from researchers at Norwegian research institutions. Datasets in DataverseNO are grouped into institutional collections as well as special collections. The technical infrastructure of the repository is based on the open source application Dataverse (https://dataverse.org), which is developed by an international developer and user community led by Harvard University.
Launched in 2000, WormBase is an international consortium of biologists and computer scientists dedicated to providing the research community with accurate, current, accessible information concerning the genetics, genomics and biology of C. elegans and some related nematodes. In addition to their curation work, all sites have ongoing programs in bioinformatics research to develop the next generation of WormBase structure, content and accessibility.
PharmGKB is a comprehensive resource that curates knowledge about the impact of genetic variation on drug response for clinicians and researchers. PharmGKB brings together the relevant data in a single place and adds value by combining disparate data on the same relationship and by interpreting the data, making it easier to search and to view the key aspects. PharmGKB provides clinical interpretations of these data, curated pathways and VIP summaries which are not found elsewhere.
The UK Polar Data Centre (UK PDC) is the focal point for Arctic and Antarctic environmental data management in the UK. Part of the Natural Environment Research Council's (NERC) network of environmental data centres and based at the British Antarctic Survey, it coordinates the management of polar data from UK-funded research and supports researchers in complying with national and international data legislation and policy.
eLaborate is an online work environment in which scholars can upload scans, transcribe and annotate text, and publish the results as an online text edition which is freely available to all users. Short information about, and a link to, already published editions is presented on the Editions page under Published. Information about editions currently being prepared is posted on the Ongoing projects page. The eLaborate work environment for the creation and publication of online digital editions is developed by the Huygens Institute for the History of the Netherlands of the Royal Netherlands Academy of Arts and Sciences. Although the institute considers itself primarily a research facility and does not maintain a public collection profile, Huygens ING actively maintains almost 200 digitally available resource collections.
CDC.gov is the Centers for Disease Control and Prevention's primary online communication channel. CDC.gov provides users with credible, reliable health information on Data and Statistics, Diseases and Conditions, Emergencies and Disasters, Environmental Health, Healthy Living, Injury, Violence and Safety, Life Stages and Populations, Travelers' Health, and Workplace Safety and Health.
<<<!!!<<< This repository is no longer available. >>>!!!>>> BioVeL is a virtual e-laboratory that supports research on biodiversity issues using large amounts of data from cross-disciplinary sources. BioVeL supports the development and use of workflows to process data. It offers the possibility either to use ready-made workflows or to create one's own. BioVeL workflows are stored in MyExperiment - Biovel Group http://www.myexperiment.org/groups/643/content. They are underpinned by a range of analytical and data processing functions (generally provided as Web Services or R scripts) to support common biodiversity analysis tasks. You can find the Web Services catalogued in the BiodiversityCatalogue.
ArrayExpress is one of the major international repositories for high-throughput functional genomics data from both microarray and high-throughput sequencing studies, many of which are supported by peer-reviewed publications. Data sets are submitted directly to ArrayExpress and curated by a team of specialist biological curators. In the past (until 2018) datasets from the NCBI Gene Expression Omnibus database were imported on a weekly basis. Data is collected to MIAME and MINSEQE standards.
The Neuroscience Information Framework is a dynamic index of data, materials, and tools. Please note, we do not accept direct data deposits, but if you wish to make your data repository or database available through our search, please contact us. An initiative of the NIH Blueprint for Neuroscience Research, NIF advances neuroscience research by enabling discovery and access to public research data and tools worldwide through an open source, networked environment.
Biological collections are replete with taxonomic, geographic, temporal, numerical, and historical information. This information is crucial for understanding and properly managing biodiversity and ecosystems, but is often difficult to access. Canadensys, operated from the Université de Montréal Biodiversity Centre, is a Canada-wide effort to unlock the biodiversity information held in biological collections.
The Electron Microscopy Data Bank (EMDB) is a public repository for electron microscopy density maps of macromolecular complexes and subcellular structures. It covers a variety of techniques, including single-particle analysis, electron tomography, and electron (2D) crystallography.
ClinicalTrials.gov is a website and online database of clinical research studies and information about their results. The purpose of ClinicalTrials.gov is to provide information about clinical research studies to the public, researchers, and health care professionals. The U.S. government does not review or approve the safety and science of all studies listed on this website.
myExperiment is a collaborative environment where scientists can safely publish their workflows and in silico experiments, share them with groups and find those of others. Workflows, other digital objects and bundles (called Packs) can now be swapped, sorted and searched like photos and videos on the Web. Unlike Facebook or MySpace, myExperiment fully understands the needs of the researcher and makes it really easy for the next generation of scientists to contribute to a pool of scientific methods, build communities and form relationships — reducing time-to-experiment, sharing expertise and avoiding reinvention. myExperiment is now the largest public repository of scientific workflows.
GWAS Central (previously the Human Genome Variation database of Genotype-to-Phenotype information) is a database of summary level findings from genetic association studies, both large and small. We actively gather datasets from public domain projects, and encourage direct data submission from the community.
The UTM Data Centre is responsible for managing spatial data acquired during oceanographic cruises on board CSIC research vessels (RV Sarmiento de Gamboa, RV García del Cid) and RV Hespérides. The aim is, on the one hand, to disseminate which data exist and where, how and when they have been acquired, and, on the other hand, to provide access to as much of the interoperable data as possible, following the FAIR principles, so that they can be used and reused. For this purpose, the UTM operates a national-level Spatial Data Infrastructure consisting of several services: the Oceanographic Cruise and Data Catalogue, including metadata from more than 600 cruises carried out since 1991, with links to documentation associated with each cruise, navigation maps and datasets; the Geoportal, a geospatial data mapping interface; and Underway Plot & QC, for visualization, quality control and conversion to a standard format of meteorological data and of surface-water temperature and salinity. At an international level, the UTM is a National Oceanographic Data Centre (NODC) of the Distributed European Marine Data Infrastructure SeaDataNet, to which the UTM provides metadata published in the Cruise Summary Report Catalog and in the Common Data Index data catalog, as well as public data to be shared.