Format: NC, NETCDF

40 record(s)
Records 1-10 of 40
  • This dataset comprises two netcdf files. The first file contains the six global two-dimensional maps necessary to implement the tidal mixing parameterization presented in de Lavergne et al. (2020). Four power fields (E_wwi, E_sho, E_cri and E_hil) represent depth-integrated internal tide energy dissipation, with units of Watts per square meter. Each power field corresponds to a specific dissipative process and associated vertical structure of turbulence production. The two remaining fields, H_cri and H_bot, are decay heights (with units of meters) that enter the vertical structures of the E_cri and E_hil components, respectively. The second file contains three-dimensional fields of turbulence production (with units of Watts per kilogram) obtained by application of the parameterization to the WOCE global hydrographic climatology. The file includes the total turbulence production (epsilon_tid), its four components (epsilon_wwi, epsilon_sho, epsilon_cri, epsilon_hil), and the underlying hydrographic fields, as a function of longitude, latitude and depth. All maps have a horizontal resolution of 0.5°. Detailed documentation of the parameterization can be found in the following publication: de Lavergne, C., Vic, C., Madec, G., Roquet, F., Waterhouse, A.F., Whalen, C.B., Cuypers, Y., Bouruet-Aubertot, P., Ferron, B., Hibiya, T. A parameterization of local and remote tidal mixing. Journal of Advances in Modeling Earth Systems, 12, e2020MS002065 (2020). https://doi.org/10.1029/2020MS002065
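As a minimal sketch of how the 2D maps combine, the four power fields sum to the total depth-integrated internal-tide energy dissipation. The snippet below uses synthetic stand-in arrays on the 0.5° global grid; only the field names come from the description above, and everything else (grid shape, values) is illustrative, not taken from the dataset files.

```python
import numpy as np

# Synthetic stand-ins for the four depth-integrated power fields
# (E_wwi, E_sho, E_cri, E_hil), each in W m^-2 on a 0.5-degree grid.
# In practice these would be read from the first netcdf file of the dataset.
nlat, nlon = 360, 720  # 0.5-degree global grid (illustrative shape)
rng = np.random.default_rng(0)
E_wwi, E_sho, E_cri, E_hil = (rng.uniform(0, 1e-3, (nlat, nlon)) for _ in range(4))

# Total depth-integrated internal-tide energy dissipation (W m^-2):
# the sum of the four process-specific components.
E_tot = E_wwi + E_sho + E_cri + E_hil
```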

  • The Southern Ocean plays a fundamental role in regulating the global climate. It also contains a rich and highly productive ecosystem that is potentially vulnerable to climate change. Major national and international efforts are directed towards modeling physical oceanographic processes in order to predict the response of the Southern Ocean to global climate change and the role played by large-scale ocean climate processes. However, these modeling efforts are greatly limited by the lack of in situ measurements, especially at high latitudes and during winter months. The standard data needed to study ocean circulation are vertical profiles of temperature and salinity, from which the density of seawater can be deduced. These are collected with CTD (Conductivity-Temperature-Depth) sensors, usually deployed from research vessels or, more recently, on autonomous Argo profilers. Using conventional research vessels to collect these data is very expensive and does not guarantee access to areas covered by sea ice during the winter months. A recent alternative is the use of autonomous Argo floats; however, this technology is not easy to use in ice-covered areas. In this context, the collection of hydrographic profiles from CTDs mounted on marine mammals is very advantageous. Species, sex, and age classes can be chosen so as to selectively obtain data in particularly under-sampled areas such as under the sea ice or on continental shelves. Among marine mammals, elephant seals are particularly interesting: they routinely dive to great depths (590 ± 200 m on average, with maxima around 2000 m) for long durations (25 ± 15 min on average, up to 80 min). The Conductivity-Temperature-Depth Satellite Relay Data Logger (CTD-SRDL) was developed in the early 2000s to sample temperature and salinity vertical profiles during marine mammal dives (Boehme et al. 2009, Fedak 2013). The CTD-SRDL is attached to the seal on land; it then records hydrographic profiles during the seal's foraging trips, transmitting the data via the ARGOS satellite system whenever the seal returns to the surface. While the principal intent of seal instrumentation was to improve understanding of seal foraging strategies (Biuw et al., 2007), it has also provided, as a by-product, a viable and cost-effective method of sampling hydrographic properties in many regions of the Southern Ocean (Charrassin et al., 2008; Roquet et al., 2013).

  • Observations of sea surface temperature and salinity are now obtained from voluntary sailing ships using medium- or small-size sensors. They complement the networks installed on research vessels and commercial ships. The delayed-mode dataset proposed here is updated annually as a contribution to GOSUD (http://www.gosud.org).

  • These monthly gridded climatologies were produced using MBT, XBT, profiling float, glider, and ship-based CTD data from different databases, collected in the Mediterranean Sea between 1969 and 2013. The Mixed Layer Depth (MLD) is calculated on individual profiles with a ΔT = 0.1 °C criterion relative to a 10 m reference level. The Depth of the Bottom of the Seasonal Thermocline (DBST) is calculated on individual profiles as the maximum of two values: 1) the depth of the temperature minimum in the upper 200 m; 2) the MLD. This double criterion for the calculation of DBST is necessary in areas where the mixed layer exceeds 200 m depth. DBST is the integration depth used in the calculation of the upper-ocean heat storage rate. For more details about the data and the methods used, see: Houpert et al. 2015, Seasonal cycle of the mixed layer, the seasonal thermocline and the upper-ocean heat storage rate in the Mediterranean Sea derived from observations, Progress in Oceanography, http://doi.org/10.1016/j.pocean.2014.11.004
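The two criteria above can be sketched in a few lines. This is an illustrative implementation only (the function name, the idealized profile, and the exact handling of edge cases are my own assumptions, not the processing code used to build the climatology):

```python
import numpy as np

def mld_and_dbst(depth, temp, dt=0.1, ref_depth=10.0, tmin_layer=200.0):
    """Sketch of the MLD and DBST criteria described above.

    MLD: shallowest depth below the 10 m reference where temperature
    deviates from the reference temperature by more than dt (0.1 degC).
    DBST: max(depth of the temperature minimum in the upper 200 m, MLD).
    """
    t_ref = np.interp(ref_depth, depth, temp)       # temperature at 10 m
    candidates = np.where((depth > ref_depth) & (np.abs(temp - t_ref) > dt))[0]
    mld = depth[candidates[0]] if candidates.size else depth[-1]
    upper = depth <= tmin_layer
    z_tmin = depth[upper][np.argmin(temp[upper])]   # depth of upper-200 m T minimum
    return mld, max(z_tmin, mld)

# Idealized summer profile: a 30 m warm mixed layer over a thermocline
depth = np.arange(0, 500, 5.0)
temp = 20.0 - 6.0 * (1 - np.exp(-np.maximum(depth - 30, 0) / 50))
mld, dbst = mld_and_dbst(depth, temp)
```

For this idealized profile the MLD is found just below the 30 m homogeneous layer, while the DBST is set by the deeper temperature minimum in the upper 200 m.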

  • This dataset is composed of the climatological seasonal fields of ocean salinity stratification, defined from the Brunt-Väisälä frequency limited to the upper 300 m depth. The details are given in Maes, C., and T. J. O’Kane (2014), Seasonal variations of the upper ocean salinity stratification in the Tropics, J. Geophys. Res. Oceans, 119, 1706–1722, doi:10.1002/2013JC009366.
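The underlying quantity can be sketched as follows: the squared Brunt-Väisälä frequency computed from a density profile and restricted to the upper 300 m. This is a generic illustration on an idealized profile; the dataset's exact definition, including how the salinity contribution is isolated, follows Maes and O'Kane (2014), and all names and values below are assumptions.

```python
import numpy as np

G = 9.81        # gravitational acceleration (m s^-2)
RHO0 = 1025.0   # reference seawater density (kg m^-3)

def n2_upper_300m(depth, rho):
    """Squared Brunt-Vaisala frequency N^2 = (g/rho0) d(rho)/dz for a
    profile with depth positive downward, restricted to the upper 300 m."""
    n2 = (G / RHO0) * np.gradient(rho, depth)   # s^-2
    mask = depth <= 300.0
    return depth[mask], n2[mask]

# Idealized, stably stratified profile: density increasing with depth
depth = np.arange(0, 1000, 10.0)
rho = 1024.0 + 0.002 * depth
z300, n2 = n2_upper_300m(depth, rho)
```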

  • This data set contains the gridded hydrographic and transport data for the biennial GO-SHIP A25 Greenland–Portugal OVIDE section from 2002 to 2012. The properties and transports are mapped on a 7 km × 1 m grid. Using a common grid facilitates comparison between the different occupations of the line, as well as averaging. This data set was used in Daniault et al. (2016, Progress in Oceanography), to which the reader is referred for a description of the gridding method.

  • The Coriolis Ocean Dataset for Reanalysis for the Ireland-Biscay-Iberia region (hereafter CORA-IBI) is a regional dataset of in situ temperature and salinity measurements. The latest version of the product covers the period 1950-2014. The CORA-IBI observations come from many different sources collected by the Coriolis data centre in collaboration with the In Situ Thematic Centre of the Copernicus Marine Service (CMEMS INSTAC). The observations integrated in the CORA-IBI product have been acquired both by autonomous platforms (Argo profilers, fixed moorings, gliders, drifters, sea mammals, the fishery observing system of the RECOPESCA program) and by research or opportunity vessels (CTDs, XBTs, ferrybox). The CORA-IBI product has been controlled using an objective analysis (statistical tests) method and a visual quality control (QC). This QC procedure was developed with the main objective of improving the quality of the dataset to the level required by climate applications and physical ocean reanalysis activities. It provides individual T and S profiles on their original levels with QC flags. The reference level of measurements is immersion (in meters) or pressure (in decibars). It is a subset, over the IBI (Ireland-Biscay-Iberia) region, of the CMEMS product referenced hereafter. The main new features of this regional product compared with previous global CORA products are the incorporation of coastal profiles from the fishery observing system (RECOPESCA programme) in the Bay of Biscay and the English Channel, as well as the use of a historical dataset collected by the Service hydrographique et océanographique de la Marine (SHOM).
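Since the product delivers profiles on their original levels together with QC flags, a typical first step for users is to mask levels that fail QC. The snippet below is illustrative only: the flag convention shown (1 = good, 4 = bad) is the common Argo/Copernicus scheme, and the array names are assumptions rather than the CORA-IBI file specification.

```python
import numpy as np

# One hypothetical temperature profile with per-level QC flags
# (flag convention assumed: 1 = good data, 4 = bad data).
temp = np.array([12.3, 12.1, 11.8, 35.0, 11.2])   # degC; one spurious level
temp_qc = np.array([1, 1, 1, 4, 1])               # per-level QC flags

# Keep only levels flagged good; mask the rest as NaN
good = temp_qc == 1
temp_good = np.where(good, temp, np.nan)
```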

  • This product integrates observations aggregated and validated from the Regional EuroGOOS consortium (Arctic-ROOS, BOOS, NOOS, IBI-ROOS, MONGOOS and Black Sea GOOS) as well as from National Oceanographic Data Centres (NODCs), JCOMM global systems (Argo, GOSUD, OceanSITES, GTSPP, DBCP) and the Global Telecommunication System (GTS) used by the met offices. Data are available in a directory dedicated to waves (INSITU_GLO_WAV_REP_OBSERVATIONS_013_045) of the GLOBAL Distribution Unit, with one file per platform. This directory is updated twice a year. Data are distributed in two datasets: one with the original time sampling, and the other with hourly data and rounded timestamps. The information distributed includes wave parameters and wave spectral information. The latest version of the Copernicus delayed-mode wave product is distributed through the Copernicus Marine catalogue. Additional credits: the American wave data are collected from the US NDBC (National Data Buoy Center). The Australian wave data are collected from the Integrated Marine Observing System (IMOS); IMOS is enabled by the National Collaborative Research Infrastructure Strategy (NCRIS) and is operated by a consortium of institutions as an unincorporated joint venture, with the University of Tasmania as lead agent. The Canadian data are collected from Fisheries and Oceans Canada.

  • The OceanGliders initiative (formerly EGO) is a gathering of several teams of oceanographers interested in developing the use of gliders for ocean observations. OceanGliders started in Europe with members from France, Germany, Italy, Norway, Spain, and the United Kingdom. The partners of OceanGliders have been funded by both European and national agencies to operate gliders for various purposes and at different sites. Coordinated actions are being set up at these sites to demonstrate the capability of a fleet of gliders to sample the ocean with a given scientific and/or operational objective. Gliders have been developed since the 1990s to carry out in situ observations of the upper 1 km of the ocean, filling the gaps left by the existing observing systems. Gliders are small autonomous underwater vehicles that use an engine to change their buoyancy. While gliding from the surface to about 1000 meters, gliders provide real-time physical and biogeochemical data along their transit. They observe temperature, salinity, pressure, and biogeochemical or acoustic data. The OceanGliders GDAC, hosted at Ifremer (France), aggregates the data and metadata from glider deployments provided by the DACs or PIs. The OceanGliders unique DOI publishes the quarterly snapshot of the whole GDAC content and preserves its successive quarterly versions (a unique DOI for easy citability, preservation of quarterly versions for reproducibility). The OceanGliders unique DOI references all individual glider deployment DOIs provided by the DACs or PIs with data in the GDAC. DACs or PIs may use the data processing chain published at http://doi.org/10.17882/45402 to generate glider NetCDF GDAC files.

  • A quantitative understanding of the integrated ocean heat content depends on our ability to determine how heat is distributed in the ocean and what the associated coherent patterns are. This dataset contains the results of the Maze et al., 2017 (Prog. Oce.) study demonstrating how this can be achieved using unsupervised classification of Argo temperature profiles. The dataset contains: a netcdf file with classification results (labels and probabilities) and coordinates (lat/lon/time) of 100,684 Argo temperature profiles in the North Atlantic, and a netcdf file with a Profile Classification Model (PCM) that can be used to classify new temperature profiles from observations or numerical models. The classification method used is a Gaussian Mixture Model, which decomposes the probability density function of the dataset into a weighted sum of Gaussian modes. North Atlantic Argo temperature profiles between 0 and 1400 m depth were interpolated onto a regular 5 m grid, then compressed using Principal Component Analysis and finally classified using a Gaussian Mixture Model. To use the netcdf PCM file to classify new data, check out our PCM Matlab and Python toolbox here: https://github.com/obidam/pcm
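The interpolate/compress/classify pipeline described above can be sketched with scikit-learn. This is an illustrative toy version, not the published model: the synthetic profiles, the number of classes, and the number of retained principal components are all assumptions; the trained PCM itself is in the dataset's netcdf file and the obidam/pcm toolbox.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
z = np.arange(0, 1400, 5.0)          # regular 5 m vertical grid, 0-1400 m
n = 300                              # toy sample size

# Two synthetic "regimes" standing in for real Argo temperature profiles
warm = 20 - 12 * (z / 1400)[None, :] + rng.normal(0, 0.3, (n // 2, z.size))
cold = 8 - 4 * (z / 1400)[None, :] + rng.normal(0, 0.3, (n // 2, z.size))
profiles = np.vstack([warm, cold])

X = StandardScaler().fit_transform(profiles)   # normalize each depth level
X_pca = PCA(n_components=5).fit_transform(X)   # compress vertical structure
gmm = GaussianMixture(n_components=2, random_state=0).fit(X_pca)
labels = gmm.predict(X_pca)                    # class label per profile
probs = gmm.predict_proba(X_pca)               # posterior class probabilities
```

The `labels` and `probs` arrays mirror the "labels and probabilities" stored in the dataset's classification-results file.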