  • * at the end of a keyword allows wildcard searches
  • " quotes can be used for searching phrases
  • + represents an AND search (default)
  • | represents an OR search
  • - represents a NOT operation
  • ( and ) implies priority
  • ~N after a word specifies the desired edit distance (fuzziness)
  • ~N after a phrase specifies the desired slop amount
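For example (the search terms here are arbitrary illustrations, not terms from the registry itself): climat* matches "climate" as well as "climatology"; "air quality" + Canada requires the exact phrase and the word Canada; ozone | aerosol - satellite matches records mentioning ozone or aerosol but not satellite; (ozone | aerosol) + Canada applies the OR before the AND; ozon~1 tolerates one character edit; and "air quality"~2 allows the two words of the phrase to appear up to two positions apart.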
Found 14 result(s)
The Northern California Earthquake Data Center (NCEDC) is a permanent archive and distribution center primarily for multiple types of digital data relating to earthquakes in central and northern California. The NCEDC is located at the Berkeley Seismological Laboratory, and has been accessible to users via the Internet since mid-1992. The NCEDC was formed as a joint project of the Berkeley Seismological Laboratory (BSL) and the U.S. Geological Survey (USGS) at Menlo Park in 1991, and current USGS funding is provided under a cooperative agreement for seismic network operations.
The Ozone Mapping and Profiler Suite (OMPS) measures the ozone layer in our upper atmosphere, tracking the status of global ozone distributions, including the ‘ozone hole’. It also monitors ozone levels in the troposphere, the lowest layer of our atmosphere. OMPS extends the 40-year record of ozone layer measurements while providing improved vertical resolution compared to previous operational instruments. Closer to the ground, OMPS’s measurements of harmful ozone improve air quality monitoring and, combined with cloud predictions, help create the Ultraviolet Index, a guide to safe levels of sunlight exposure. OMPS has two sensors, both new designs, composed of three advanced hyperspectral imaging spectrometers: a downward-looking nadir mapper, a nadir profiler and a limb profiler. The OMPS suite currently flies on board the Suomi NPP spacecraft and is scheduled to fly on the JPSS-2 satellite mission. NASA will provide the OMPS-Limb profiler.
Copernicus is a European system for monitoring the Earth. It consists of a complex set of systems that collect data from multiple sources: Earth observation satellites and in situ sensors such as ground stations, airborne and sea-borne sensors. It processes these data and provides users with reliable and up-to-date information through a set of services related to environmental and security issues. The services address six thematic areas: land monitoring, marine monitoring, atmosphere monitoring, climate change, emergency management and security. The main users of Copernicus services are policymakers and public authorities who need the information to develop environmental legislation and policies or to take critical decisions in the event of an emergency, such as a natural disaster or a humanitarian crisis. Based on the Copernicus services and on the data collected through the Sentinels and the contributing missions, many value-added services can be tailored to specific public or commercial needs, resulting in new business opportunities. Several economic studies have already demonstrated a huge potential for job creation, innovation and growth.
NCEP delivers national and global weather, water, climate and space weather guidance, forecasts, warnings and analyses to its partners and external user communities. The National Centers for Environmental Prediction (NCEP), an arm of NOAA's National Weather Service (NWS), comprises nine distinct centers and the Office of the Director, which provide a wide variety of national and international weather guidance products to National Weather Service field offices, government agencies, emergency managers, private sector meteorologists, and meteorological organizations and societies throughout the world. NCEP is a critical national resource for national and global weather prediction and the starting point for nearly all weather forecasts in the United States. The centers are: Aviation Weather Center (AWC), Climate Prediction Center (CPC), Environmental Modeling Center (EMC), NCEP Central Operations (NCO), National Hurricane Center (NHC), Ocean Prediction Center (OPC), Storm Prediction Center (SPC), Space Weather Prediction Center (SWPC) and Weather Prediction Center (WPC).
The Precipitation Processing System (PPS) evolved from the Tropical Rainfall Measuring Mission (TRMM) Science Data and Information System (TSDIS). The purpose of the PPS is to process, analyze and archive data from the Global Precipitation Measurement (GPM) mission, partner satellites and the TRMM mission. The PPS also supports TRMM by providing validation products from TRMM ground radar sites. All GPM, TRMM and partner public data products are available to the science community and the general public from the TRMM/GPM FTP Data Archive. Please note that registration is required to access these data. Registered users can also search for GPM, partner and TRMM data, order custom subsets and set up subscriptions using the PPS Data Products Ordering Interface (STORM).
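As a rough sketch of scripted access to the FTP archive mentioned above, the Python snippet below lists the contents of one directory using the standard ftplib module. The host name, the credential convention (registered e-mail address used as both user name and password) and the directory layout are assumptions for illustration only; consult the PPS documentation for the actual values.

    # Minimal sketch of scripted access to a public FTP data archive.
    # Host, credentials and paths are hypothetical placeholders.
    from ftplib import FTP

    HOST = "ftp.pps.example.gov"            # hypothetical archive host
    USER = "registered.user@example.org"    # registration is required
    PASSWD = "registered.user@example.org"  # placeholder credential

    with FTP(HOST) as ftp:
        ftp.login(user=USER, passwd=PASSWD)
        ftp.cwd("/gpmdata/2024/01/01")      # hypothetical directory layout
        for name in ftp.nlst():
            print(name)                     # list available product files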
The central data management system of the USGS for water data, providing access to water-resources data collected at approximately 1.5 million sites in all 50 states, the District of Columbia, Puerto Rico, the Virgin Islands, Guam, American Samoa and the Commonwealth of the Northern Mariana Islands. It includes data on water use and quality, groundwater, and surface water.
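For programmatic access, the sketch below pulls recent streamflow values for one site from the public USGS water data web services using only the Python standard library. The endpoint, the example site number, the parameter code and the JSON layout reflect the publicly documented interface as best understood here and should be treated as assumptions to verify against the current USGS documentation.

    # Sketch: fetch the last 24 hours of streamflow for one site from the
    # USGS water services. Endpoint, site number, parameter code and JSON
    # structure are assumptions; verify against the USGS documentation.
    import json
    import urllib.parse
    import urllib.request

    BASE = "https://waterservices.usgs.gov/nwis/iv/"  # instantaneous values (assumed)
    params = {
        "format": "json",
        "sites": "01646500",     # example site number (illustrative)
        "parameterCd": "00060",  # 00060 = discharge, cubic feet per second (assumed)
        "period": "P1D",         # last 24 hours
    }

    url = BASE + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    # WaterML-style JSON layout assumed here: value -> timeSeries -> values.
    for series in data.get("value", {}).get("timeSeries", []):
        site = series["sourceInfo"]["siteName"]
        latest = series["values"][0]["value"][-1]
        print(site, latest["dateTime"], latest["value"])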
The data page makes the data that PCIC collects and produces publicly available with an open license. The page presently provides access to BC Station Data, High-Resolution Climatology, Downscaled Climate Scenarios and VIC Hydrologic Model Output and Extreme Indices calculated from CMIP5.
The National Air Pollution Surveillance (NAPS) Program provides accurate and long-term air quality data of a uniform standard across Canada. The NAPS Network has a Canada-Wide database of criteria air contaminants from the early 1970s to the present for designated NAPS sites, as well as provincial, territorial and other sites. Trace contaminants are also monitored at several stations in the network and analyzed by the laboratory at River Road.
The NCEP/NCAR Reanalysis Project is a joint project between the National Centers for Environmental Prediction (NCEP, formerly "NMC") and the National Center for Atmospheric Research (NCAR). The goal of this joint effort is to produce new atmospheric analyses using historical data (1948 onwards), as well as analyses of the current atmospheric state (Climate Data Assimilation System, CDAS).
Web Soil Survey (WSS) provides soil data and information produced by the National Cooperative Soil Survey. It is operated by the USDA Natural Resources Conservation Service (NRCS) and provides access to the largest natural resource information system in the world. NRCS has soil maps and data available online for more than 95 percent of the nation’s counties and anticipates having 100 percent in the near future. The site is updated and maintained online as the single authoritative source of soil survey information.
The AERONET (AErosol RObotic NETwork) program is a federation of ground-based remote sensing aerosol networks established by NASA and PHOTONS (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire; Univ. of Lille 1, CNES, and CNRS-INSU) and is greatly expanded by networks (e.g., RIMA, AeroSpan, AEROCAN, and CARSNET) and collaborators from national agencies, institutes, universities, individual scientists, and partners. The program provides a long-term, continuous and readily accessible public domain database of aerosol optical, microphysical and radiative properties for aerosol research and characterization, validation of satellite retrievals, and synergism with other databases. The network imposes standardization of instruments, calibration, processing and distribution.
When published in 2005, the Millennium Run was the largest ever simulation of the formation of structure within the ΛCDM cosmology. It uses 10¹⁰ particles to follow the dark matter distribution in a cubic region 500 h⁻¹ Mpc on a side, and has a spatial resolution of 5 h⁻¹ kpc. Application of simplified modelling techniques to the stored output of this calculation allows the formation and evolution of the ~10⁷ galaxies more luminous than the Small Magellanic Cloud to be simulated for a variety of assumptions about the detailed physics involved. As part of the activities of the German Astrophysical Virtual Observatory we have created relational databases to store the detailed assembly histories both of all the haloes and subhaloes resolved by the simulation, and of all the galaxies that form within these structures for two independent models of the galaxy formation physics. We have implemented a Structured Query Language (SQL) server on these databases. This allows easy access to many properties of the galaxies and haloes, as well as to the spatial and temporal relations between them. Information is output in table format compatible with standard Virtual Observatory tools. With this announcement (from 1/8/2006) we are making these structures fully accessible to all users. Interested scientists can learn SQL and test queries on a small, openly accessible version of the Millennium Run (with volume 1/512 that of the full simulation). They can then request accounts to run similar queries on the databases for the full simulations. In 2008 and 2012 the simulations were repeated.
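To make the SQL access described above concrete, the sketch below submits a query to the openly accessible milli-Millennium database over HTTP from Python. The query URL, the table name and the column names are assumptions based on the published schema and may differ from the current interface; the database documentation has the authoritative details.

    # Sketch: run an SQL query against the openly accessible
    # (milli-)Millennium database over HTTP. The URL, table name and
    # column names are assumptions; check the database documentation.
    import urllib.parse
    import urllib.request

    QUERY_URL = "http://gavo.mpa-garching.mpg.de/MyMillennium?action=doQuery&SQL="  # assumed

    # Ten most massive galaxies at the final snapshot (placeholder schema).
    sql = (
        "SELECT TOP 10 galaxyId, stellarMass, x, y, z "
        "FROM millimil..DeLucia2006a "
        "WHERE snapnum = 63 "
        "ORDER BY stellarMass DESC"
    )

    with urllib.request.urlopen(QUERY_URL + urllib.parse.quote(sql)) as resp:
        print(resp.read().decode())  # results are returned as plain text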