Format: NC, NETCDF (47 records)
Records 1-10 of 47
  • This dataset provides a global Look-Up Table (LUT) of physiological ratios for the real-time adjustment of chlorophyll-a fluorescence measured by biogeochemical Argo (BGC-Argo) profiling floats. The physiological ratios aim to account for the global variability in the relationship between fluorescence and chlorophyll-a concentration, as influenced by phytoplankton physiology. The LUT was developed using two different gap-filled observational Argo-based products (SOCA machine-learning-based methodology; Sauzède et al., 2016; Sauzède et al., 2024). The first product provides gap-filled chlorophyll-a data derived from fluorescence corrected for dark signal and non-photochemical quenching (NPQ) following Schmechtig et al. (2023), while the second product provides chlorophyll-a concentrations derived from light attenuation. The latter is based on the downward irradiance at 490 nm (ED490) derived from the SOCA-light method (Renosh et al., 2023). From this, the diffuse attenuation coefficient (KD490) is computed, which is subsequently used to estimate the chlorophyll-a concentration through the bio-optical relationships described by Morel et al. (2007). These two products, based on fluorescence and radiometry, enable the derivation of spatially varying correction factors, or physiological ratios. These ratios provide a validated framework for adjusting real-time fluorescence observations from OneArgo floats to chlorophyll-a concentrations. The LUT is distributed in NetCDF format and is provided on a regular 1°×1° latitude–longitude grid covering the global ocean. Each grid cell contains the temporal mean, averaged over the water column (from the surface to 1.5 times the euphotic depth), of the physiological ratio. The file also includes metadata describing variable definitions, units, and other relevant information.
Variables included:
    - physiological_ratio — fluorescence-to-radiometry-based chlorophyll correction factor (dimensionless)
    - physiological_ratio_sd — temporal standard deviation (over the twelve months) of the fluorescence-to-radiometry-based chlorophyll correction factor (dimensionless)
    - lat, lon — spatial coordinates (degrees north/east)
    - Global attributes — dataset description, reference citation, and contact information
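Since the LUT sits on a regular 1°×1° global grid, finding the cell for a float position can be sketched with plain index arithmetic. This is a minimal illustration assuming cell edges on integer degrees and rows/columns ordered south-to-north and west-to-east; the actual coordinate conventions should be checked against the file's lat/lon variables and metadata:

```python
import numpy as np

def lut_index(lat, lon):
    """Map a (lat, lon) position to indices on a hypothetical
    1-degree x 1-degree global grid with cell edges on integer
    degrees: rows 0..179 cover -90..90N, columns 0..359 cover
    -180..180E."""
    i = int(np.floor(lat)) + 90
    j = int(np.floor(lon)) + 180
    return i, j

# Hypothetical stand-in for the 'physiological_ratio' array
# (180 x 360); the real values come from the NetCDF file.
ratio = np.ones((180, 360))

i, j = lut_index(47.5, -20.25)   # example North Atlantic position
print(i, j)                       # 137 159
```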

  • This data set contains the gridded hydrographic and transport data for the biennial GO-SHIP A25 Greenland–Portugal OVIDE section from 2002 to 2012. The properties and transports are mapped on a 7 km × 1 m grid. Using a common grid facilitates comparison between the different occupations of the line, as well as averaging. This data set was used in Daniault et al. (2016, Progress in Oceanography), to which the reader is referred for a description of the gridding method.

  • This product contains observations and gridded files from two up-to-date carbon and biogeochemistry community data products: the Surface Ocean Carbon ATlas (SOCATv2023) and the GLobal Ocean Data Analysis Project (GLODAPv2.2023). The SOCATv2023-OBS dataset contains >25 million observations of the fugacity of CO2 (fCO2) in the global surface ocean from 1957 to early 2023. The quality control procedures are described in Bakker et al. (2016). These observations form the basis of the gridded products included in SOCATv2023-GRIDDED: monthly, yearly and decadal averages of fCO2 on a 1×1 degree grid over the global ocean, and a monthly, 0.25×0.25 degree average for the coastal ocean. GLODAPv2.2023-OBS contains >1 million observations from individual seawater samples of temperature, salinity, oxygen, nutrients, dissolved inorganic carbon, total alkalinity and pH from 1972 to 2021. These data were subjected to the extensive quality control and bias correction described in Olsen et al. (2020). GLODAPv2-GRIDDED contains global climatologies for temperature, salinity, oxygen, nitrate, phosphate, silicate, dissolved inorganic carbon, total alkalinity and pH on a 1×1 degree horizontal grid at 33 standard depths, using the observations from the previous major iteration of GLODAP, GLODAPv2. SOCAT and GLODAP are based on community, largely volunteer efforts, and the data providers ask that those who use the data cite the corresponding articles (see References below) in order to support the future sustainability of the data products.
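The basic idea behind a gridded mean such as SOCAT's 1×1 degree product is binning scattered observations into grid cells and averaging per cell. A minimal sketch on synthetic fCO2 values (the real processing involves quality flags, cruise weighting and more, which are not shown here):

```python
import numpy as np

# Hypothetical scattered surface-ocean fCO2 observations (uatm)
lats = np.array([10.2, 10.7, 10.4, -35.1])
lons = np.array([140.1, 140.8, 140.3, 2.9])
fco2 = np.array([380.0, 384.0, 382.0, 360.0])

# Bin to a 1x1 degree global grid and average within each cell
edges_lat = np.arange(-90, 91, 1.0)
edges_lon = np.arange(-180, 181, 1.0)
sums, _, _ = np.histogram2d(lats, lons, bins=[edges_lat, edges_lon],
                            weights=fco2)
counts, _, _ = np.histogram2d(lats, lons, bins=[edges_lat, edges_lon])
with np.errstate(invalid="ignore"):
    gridded = sums / counts   # NaN where a cell has no observations

cell = gridded[100, 320]      # cell covering 10-11N, 140-141E
print(round(cell, 1))         # 382.0
```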

  • The observations of the campe glider on the imedia deployment (Mediterranean Sea, western basin) are distributed in 4 files:
    - EGO NetCDF time-series (data, metadata, derived sea water current)
    - NetCDF profiles extracted from the above time-series
    - Raw data
    - JSON metadata used by the decoder
    The following parameters are provided:
    - Practical salinity
    - Sea temperature, in-situ ITS-90 scale
    - Electrical conductivity
    - Sea water pressure (equals 0 at sea level)

  • This dataset contains OAC-P results from application to Argo data in the World Ocean:
    - the 2000-2015 climatology of OAC-P results mapped onto a 0.5×0.5 degree grid, with mapping error estimates;
    - the 2000-2015 probability density function of the permanent pycnocline potential density (referenced to the sea surface) vs the squared Brunt-Väisälä frequency.
    OAC-P is an "Objective Algorithm for the Characterization of the permanent Pycnocline" developed to characterize subtropical gyre stratification features with both observed and modeled potential density profiles. OAC-P estimates the following properties:
    - for the permanent pycnocline: depth, upper and lower thicknesses, squared Brunt-Väisälä frequency, potential density, temperature and salinity;
    - for the surface mode water overlying the permanent pycnocline: depth, squared Brunt-Väisälä frequency, potential density, temperature and salinity.
    Argo data were downloaded from the Coriolis Argo GDAC on 8 February 2016. Only Argo data with QC = 1, 2, 5 or 8 were used.
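The squared Brunt-Väisälä frequency that OAC-P reports can be illustrated from a density profile via the standard definition N² = (g/ρ₀)·dρ/dz (depth positive downward). A minimal sketch on synthetic data, not the OAC-P implementation itself:

```python
import numpy as np

# Squared Brunt-Vaisala frequency from a potential-density profile.
# z is depth (m, positive downward); rho in kg/m^3. Synthetic values.
g = 9.81          # gravitational acceleration, m s^-2
rho0 = 1025.0     # reference density, kg m^-3

z = np.array([0.0, 50.0, 100.0, 200.0, 500.0])
rho = np.array([1024.0, 1024.5, 1026.0, 1026.8, 1027.2])

# N^2 = (g / rho0) * d(rho)/dz, with the gradient evaluated on the
# nonuniform depth grid
n2 = (g / rho0) * np.gradient(rho, z)

# The maximum of N^2 marks the most strongly stratified layer,
# i.e. the pycnocline in this idealized profile
print(z[np.argmax(n2)])
```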

  • This dataset provides a World Ocean Atlas of Argo-inferred statistics. The primary data are exclusively Argo profiles. The statistics are computed over the whole time range covered by the Argo data, starting in July 1997. The atlas is provided at 0.25° horizontal resolution and 63 depths from 0 m to 2,000 m in the vertical. The statistics include means of Conservative Temperature (CT), Absolute Salinity, compensated density, compressibility factor and vertical isopycnal displacement (VID); standard deviations of CT, VID and the squared Brunt-Väisälä frequency; skewness and kurtosis of VID; and Eddy Available Potential Energy (EAPE). The compensated density is the product of the in-situ density and the compressibility factor. It generalizes the virtual density used in Roullet et al. (2014). The compressibility factor is defined so as to remove the pressure dependency of the in-situ density. The compensated density is used in the computation of the VID and the EAPE.

  • Ensemble simulations of the ecosystem model Apecosm (https://apecosm.org) forced by the IPSL-CM6-LR climate model under the climate change scenario SSP5-8.5. The output files contain yearly mean biomass density for 3 communities (epipelagic, mesopelagic migratory and mesopelagic residents) and 100 size classes (ranging from 0.12 cm to 1.96 m). The model grid file is also provided. Units are J/m2 and can be converted to kg/m2 by dividing by 4e6. These outputs are associated with the paper "Assessing the time of emergence of marine ecosystems from global to local scales using IPSL-CM6A-LR/APECOSM climate-to-fish ensemble simulations" from the Earth's Future "Past and Future of Marine Ecosystems" Special Collection.
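The unit conversion stated above (divide energy density by 4e6 to get wet mass) can be written as a one-line helper; the function name is illustrative, only the 4e6 factor comes from the dataset description:

```python
# Convert Apecosm biomass density from energy (J/m2) to wet mass
# (kg/m2), using the conversion factor given in the dataset notes.
JOULES_PER_KG = 4e6   # J per kg of biomass, per the description above

def biomass_kg_per_m2(energy_j_per_m2):
    return energy_j_per_m2 / JOULES_PER_KG

print(biomass_kg_per_m2(8e6))  # 2.0
```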

  • GOSUD aims at assembling in-situ observations of the world ocean surface collected by a variety of ships, and at distributing quality-controlled datasets. At present, the variables considered by GOSUD are temperature and salinity. The GOSUD data are mostly collected using thermosalinographs (TSG) installed on research vessels, on commercial ships and, in some cases, on sailing exploration ships. GOSUD manages both near-real-time data and delayed-mode (reprocessed) data.

  • These monthly gridded climatologies were produced using MBT, XBT, profiling float, glider, and ship-based CTD data from different databases, collected in the Mediterranean Sea between 1969 and 2013. The Mixed Layer Depth (MLD) is calculated on individual profiles with a ΔT = 0.1 °C criterion relative to a 10 m reference level. The Depth of the Bottom of the Seasonal Thermocline (DBST) is calculated on individual profiles as the maximum of two values: 1) the depth of the temperature minimum in the upper 200 m; 2) the MLD. This double criterion for the calculation of DBST is necessary in areas where the mixed layer exceeds 200 m depth. DBST is the integration depth used in the calculation of the upper-ocean heat storage rate. For more details about the data and the methods used, see: Houpert et al. 2015, Seasonal cycle of the mixed layer, the seasonal thermocline and the upper-ocean heat storage rate in the Mediterranean Sea derived from observations, Progress in Oceanography, http://doi.org/10.1016/j.pocean.2014.11.004
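The two criteria described above can be sketched on a single synthetic profile. This is a minimal illustration of the threshold logic, not the Houpert et al. (2015) code; depth is assumed positive downward and the profile regularly sampled:

```python
import numpy as np

def mld(z, t, dt=0.1, zref=10.0):
    """First depth at which temperature departs from the 10 m
    reference temperature by more than dt (threshold-criterion MLD)."""
    tref = np.interp(zref, z, t)
    mask = (z >= zref) & (np.abs(t - tref) > dt)
    return float(z[mask][0]) if mask.any() else float(z[-1])

def dbst(z, t):
    """Depth of the bottom of the seasonal thermocline: maximum of
    the MLD and the depth of the temperature minimum above 200 m."""
    upper = z <= 200.0
    z_tmin = float(z[upper][np.argmin(t[upper])])
    return max(mld(z, t), z_tmin)

# Synthetic profile: mixed to ~50 m, then cooling with depth
z = np.arange(0.0, 301.0, 10.0)
t = np.where(z <= 50.0, 20.0, 20.0 - 0.05 * (z - 50.0))
print(mld(z, t), dbst(z, t))  # 60.0 200.0
```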

  • A quantitative understanding of the integrated ocean heat content depends on our ability to determine how heat is distributed in the ocean and what the associated coherent patterns are. This dataset contains the results of the Maze et al. (2017, Progress in Oceanography) study demonstrating how this can be achieved using unsupervised classification of Argo temperature profiles. The dataset contains:
    - a NetCDF file with classification results (labels and probabilities) and coordinates (lat/lon/time) of 100,684 Argo temperature profiles in the North Atlantic;
    - a NetCDF file with a Profile Classification Model (PCM) that can be used to classify new temperature profiles from observations or numerical models.
    The classification method used is a Gaussian Mixture Model that decomposes the Probability Density Function of the dataset into a weighted sum of Gaussian modes. North Atlantic Argo temperature profiles between 0 and 1400 m depth were interpolated onto a regular 5 m grid, then compressed using Principal Component Analysis and finally classified using a Gaussian Mixture Model. To use the NetCDF PCM file to classify new data, you can check out our PCM Matlab and Python toolbox here: https://github.com/obidam/pcm
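The PCA-then-GMM sequence described above can be sketched with scikit-learn on synthetic profiles. This shows the generic pipeline only; the actual PCM (normalization, number of classes, trained parameters) lives in the dataset's NetCDF file and toolbox:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for Argo profiles already interpolated onto a
# regular vertical grid (the study uses 0-1400 m at 5 m spacing)
depth = np.linspace(0.0, 1400.0, 281)
warm = 20.0 * np.exp(-depth / 300.0)   # two idealized profile shapes
cold = 8.0 * np.exp(-depth / 800.0)
profiles = np.vstack([
    warm + 0.1 * rng.standard_normal((100, depth.size)),
    cold + 0.1 * rng.standard_normal((100, depth.size)),
])

# Compress with PCA, then classify the reduced profiles with a
# Gaussian Mixture Model, the sequence described in the text
reduced = PCA(n_components=5).fit_transform(profiles)
labels = GaussianMixture(n_components=2,
                         random_state=0).fit_predict(reduced)
print(len(set(labels)))  # 2 classes recovered
```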