Joint ECMWF/OceanPredict workshop on Advances in Ocean Data Assimilation

Abstracts

This abstract is not assigned to a timetable spot.

A Neural Network-Based Observation Operator for Coupled Ocean-Acoustic Variational Data Assimilation

Andrea Storto 1, Giovanni De Magistris 2, Silvia Falchetti 2, Paolo Oddo 2

1CMRE; CNR-ISMAR, 2CMRE

Variational data assimilation requires implementing the tangent-linear and adjoint (TA/AD) version of any operator. This intrinsically hampers the use of complicated observations. Here, we assess a new data-driven approach to assimilate acoustic underwater propagation measurements (Transmission Loss, TL) into a regional ocean forecasting system. TL measurements depend on the underlying sound speed fields, mostly temperature, and their inversion would require heavy coding of the TA/AD of an acoustic underwater propagation model. In this study, the non-linear version of the acoustic model is applied to an ensemble of perturbed oceanic conditions. TL outputs are used to formulate both a statistical linear operator based on canonical correlation analysis (CCA), and a neural network-based (NN) operator. For the latter, two linearization strategies are compared, the best-performing one relying on reverse-mode automatic differentiation. The new observation operator is applied in data assimilation experiments over the Ligurian Sea (Mediterranean Sea), using the Observing System Simulation Experiments (OSSE) methodology to assess the impact of TL observations on oceanic fields. TL observations are extracted from a nature run with perturbed surface boundary conditions and stochastic ocean physics. Sensitivity analyses indicate that the NN reconstruction of TL is significantly better than CCA. Both CCA and NN are able to improve the upper ocean skill scores in forecast experiments, with NN outperforming CCA on average. The use of the NN observation operator is computationally affordable, and its general formulation appears promising for the adjoint-free assimilation of any remote sensing observing network.
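As a sketch of the linearisation idea, the toy example below (random, untrained weights, not the operator from the study) builds the tangent-linear and adjoint of a small one-hidden-layer network via reverse-mode differentiation and checks them with the standard adjoint identity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an NN observation operator: a one-hidden-layer MLP mapping
# a temperature profile (n_z levels) to transmission loss at n_tl points.
# The weights are random placeholders, not trained values.
n_z, n_h, n_tl = 20, 16, 5
W1, b1 = 0.3 * rng.standard_normal((n_h, n_z)), np.zeros(n_h)
W2, b2 = 0.3 * rng.standard_normal((n_tl, n_h)), np.zeros(n_tl)

def forward(x):
    """Non-linear observation operator H(x); also returns the hidden state."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2, h

def tangent_linear(x, dx):
    """Jacobian-vector product dH/dx @ dx (forward mode)."""
    _, h = forward(x)
    return W2 @ ((1.0 - h**2) * (W1 @ dx))

def adjoint(x, dy):
    """Transpose-Jacobian-vector product, as obtained by reverse-mode AD."""
    _, h = forward(x)
    return W1.T @ ((1.0 - h**2) * (W2.T @ dy))

# Adjoint test: <M dx, dy> must equal <dx, M* dy> to machine precision.
x0 = rng.standard_normal(n_z)
dx = rng.standard_normal(n_z)
dy = rng.standard_normal(n_tl)
lhs = tangent_linear(x0, dx) @ dy
rhs = dx @ adjoint(x0, dy)
print(abs(lhs - rhs) < 1e-10)  # True
```

The same identity is the standard sanity check for any hand-coded or automatically generated TA/AD pair.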

Mechanism of Interannual Cross-equatorial Overturning Anomalies in the Pacific Ocean

Devanarayana Rao Mohan Rao, Neil F Tandon 1

1Department of Earth and Space Science and Engineering, York University, Toronto.

The meridional overturning circulation (MOC) transports heat and mass within the ocean. MOC variations are driven by changes in wind stress and density. A recent study (Tandon et al., J. Phys. Oceanogr., 2020) has shown that interannual variability of the global MOC is dominated by variability in the Pacific MOC, and this variability is characterized by a prominent cross-equatorial cell (CEC) spanning the tropics between 20°S and 20°N. This CEC is a potentially important influence on interannual climate variability, but the mechanism responsible for this CEC is not understood. Our research seeks to elucidate the mechanism of the CEC. In this study, we investigate the CEC mechanism using version 4.2 of the Estimating the Circulation and Climate of the Ocean (ECCO) state estimate covering the period 1992-2011. Our analysis shows that the CEC is driven by the following mechanistic chain: 1) Interannual anomalies of meridional wind stress generate temperature anomalies near the equator. 2) These temperature anomalies in turn generate equatorially antisymmetric anomalies of sea surface height (SSH). 3) These SSH anomalies drive cross-equatorial flow in the upper Pacific Ocean (above approximately 1000 m). 4) This anomalous cross-equatorial flow in the upper Pacific drives compensating flow in the deep Pacific. This mechanism contrasts with that responsible for anomalous cross-equatorial overturning on seasonal timescales, which is primarily the Ekman response to equatorially antisymmetric anomalies of zonal wind stress (Jayne and Marotzke, Rev. Geophys., 2001). On interannual timescales, however, the zonal wind stress anomalies associated with the CEC are equatorially symmetric (rather than antisymmetric), and steric SSH variations are the dominant driver of the CEC.

Lessons learnt by applying an EnOI system to a gridded observational product

Peter Oke 1

1CSIRO

Ensemble Optimal Interpolation (EnOI) is used by several groups for ocean forecasting and reanalysis. Here, EnOI is applied to a gridded observational product – mapping Argo and satellite data to produce weekly analyses on a global 1/10 degree grid. Starting with a configuration that has long been applied under Bluelink for forecasting and reanalysis, many shortcomings quickly became evident. Analyses included small-scale noise, unresolved by observations. The locations of observations were clearly evident, particularly altimeter tracks, indicating over-fitting. Background and analysis innovations were larger than expected, and grew with time, signalling poor constraint. When such analyses are used to initialise a dynamical model, the model seems to “hide” many of these characteristics – and in the resulting daily-mean fields that most practitioners analyse to assess performance, most of the problems are not evident. But in the absence of a model, these features are disturbingly obvious. What was the cause? The ensembles all included noisy anomalies and did not include the necessary scales of interest. The localising length-scales were too short for the observing system, and there was insufficient data to really constrain the resolved scales. The solution was to use a much larger ensemble (~400 members compared to ~100), longer localising length-scales (~1000 km instead of ~200 km), and damping towards climatology (to make up for insufficient observations). The resulting analysis system seems to out-perform most dynamical forecast systems – both in terms of analysis innovations (i.e., it fits the observations more closely), and persistence forecasting (i.e., a persisted analysis seems to “beat” most dynamical forecasts). Most EnOI-based studies use a modest ensemble size (typically 100 members), poorly considered ensemble generation, and localisation that is too short. Perhaps our EnOI-based systems can do much better, if configured with care.
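For readers unfamiliar with EnOI, a minimal one-dimensional sketch of a localised analysis step might look as follows; the data, grid, and length scales are synthetic and illustrative, not the Bluelink configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, k = 50, 5, 100                      # state size, observations, members
ens = rng.standard_normal((n, k))         # static ensemble (synthetic)
A = (ens - ens.mean(axis=1, keepdims=True)) / np.sqrt(k - 1)

x_b = rng.standard_normal(n)              # background state
x_t = x_b + A @ rng.standard_normal(k)    # synthetic truth
obs_idx = np.arange(m) * 10               # observe every 10th grid point
y = x_t[obs_idx] + 0.1 * rng.standard_normal(m)
R = 0.01 * np.eye(m)                      # observation-error covariance

# Localise the ensemble covariance with a Gaussian taper of grid distance.
L = 15.0
d = np.subtract.outer(np.arange(n), np.arange(n))
P = np.exp(-0.5 * (d / L) ** 2) * (A @ A.T)

# Kalman-type gain restricted to the observed points, then the analysis.
K = P[:, obs_idx] @ np.linalg.inv(P[np.ix_(obs_idx, obs_idx)] + R)
x_a = x_b + K @ (y - x_b[obs_idx])
print(np.sum((x_a[obs_idx] - y) ** 2) < np.sum((x_b[obs_idx] - y) ** 2))  # True
```

With too short a taper length L and too few members, the analysis imprints the observation locations, which is exactly the over-fitting behaviour described above.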

Biases at the base of the mixed layer induced by 3DVar assimilation of sea surface temperature observations.

James While 1, Matthew Martin 1, Robert King 2

1Met Office, 2UK Met Office

Propagating sea surface temperature (SST) information to depth is a non-trivial problem in data assimilation, since the vertical correlations of temperature are complex and varying, and information about mixed layer salinity is sparse. At the Met Office, a parameterised function is used to specify the vertical length scales over which SST information is spread. At the surface the vertical length scale is set to the depth of the mixed layer, reducing to twice the vertical grid scale at the base of the mixed layer. While this methodology produces accurate SST analyses and forecasts, it can generate undesirable features near the base of the mixed layer; this is a particular problem when the only available data are SST observations. A complex interaction between the mixed layer depth and the assimilation increments often leads to positive increments being projected deeper on average than negative increments, producing a positive temperature bias below the mixed layer. Another issue is the generation of excess mixing across the mixed layer.
We have been investigating ways to reduce these issues, and we show results from a number of experiments testing different techniques. Experiments have been conducted using both a 1-D model and a global ocean forecasting system, both based on NEMO version 3.6. Assimilation was done with a 3DVar methodology implemented using the NEMOVAR code. It has been found that applying a short (a few days) forward-pass exponential filter to the mixed layer depth used to parameterise the vertical length scales can be effective. Likewise, applying a balancing increment to salinity, and thereby adjusting the density and density gradient, has a positive effect. None of the methods investigated completely eliminated the problems, but several significantly reduced them. To make further progress, it is likely that the mixed layer evolution through the assimilation window would need to be controlled.
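The two ingredients discussed above can be sketched as follows; the linear blending form of the length-scale function and the e-folding time are assumptions for illustration, not the Met Office settings:

```python
import numpy as np

def vertical_length_scale(z, mld, dz):
    """Hypothetical blend: equals the MLD at the surface, 2*dz at the MLD base."""
    w = np.clip(z / mld, 0.0, 1.0)        # 0 at the surface, 1 at the MLD base
    return (1.0 - w) * mld + w * 2.0 * dz

def exp_filter(series, tau_days=3.0, dt_days=1.0):
    """Forward-pass exponential filter with an e-folding time of a few days."""
    alpha = dt_days / (tau_days + dt_days)
    out = np.empty_like(series, dtype=float)
    out[0] = series[0]
    for i in range(1, len(series)):
        out[i] = (1.0 - alpha) * out[i - 1] + alpha * series[i]
    return out

mld = np.array([50.0, 80.0, 45.0, 90.0, 60.0, 55.0])  # noisy daily MLD (m)
smoothed = exp_filter(mld)
print(smoothed.std() < mld.std())  # True: day-to-day MLD jumps are damped
```

Smoothing the MLD in this way prevents the vertical length scales, and hence the depth over which increments are spread, from jumping abruptly between assimilation cycles.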

Strongly coupled data assimilation with the coupled ocean-atmosphere model AWI-CM: comparison with the weakly coupled data assimilation

Qi Tang 1, Longjiang Mu 2, Helge Goessling 3, Tido Semmler 3, Lars Nerger 4

1Institute of Geographic Sciences and Natural Resources Research, CAS, 2Alfred Wegener Institute for Polar and Marine Research, 3Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, 4Alfred Wegener Institute

We compare the results of strongly coupled data assimilation (SCDA) and weakly coupled data assimilation (WCDA) by analyzing the effect of assimilation on the prediction of ocean as well as atmosphere variables. The AWI climate model (AWI-CM), which couples the ocean model FESOM and the atmospheric model ECHAM, is coupled with the parallel data assimilation framework (PDAF, http://pdaf.awi.de). Satellite sea surface temperature is assimilated. In WCDA, only the ocean variables are directly updated by the assimilation, while the atmospheric variables are influenced through the model; in SCDA, both the ocean and the atmospheric variables are directly updated by the assimilation algorithm. The results are evaluated by comparing the estimated ocean variables with dependent/independent observational data, and the estimated atmospheric variables with ERA-Interim data. In the ocean, both WCDA and SCDA improve the prediction of temperature, and the two give the same RMS error of SST. In the atmosphere, WCDA gives slightly better results for the 2 m temperature and 10 m wind velocity than SCDA. In the free atmosphere, SCDA yields smaller errors for temperature, wind velocity and specific humidity than WCDA in the Arctic region, while in the tropical region the errors are generally larger.

A hybrid nonlinear-Kalman ensemble transform filter for data assimilation in systems with different degrees of nonlinearity

Lars Nerger 1

1Alfred Wegener Institute

The second-order exact particle filter NETF (nonlinear ensemble transform filter) is combined with the local ensemble transform Kalman filter (LETKF) to build a hybrid filter method (LKNETF). The hybrid combines the stability of the LETKF with the nonlinear properties of the NETF to obtain improved assimilation results for small ensemble sizes. Both filter components are localized in a consistent way so that the filter can be applied with high-dimensional models. The degree of filter nonlinearity is defined by a hybrid weight, which shifts the analysis between the LETKF and the NETF. Since the NETF is more sensitive to sampling errors than the LETKF, the latter should be preferred in linear cases. It is discussed how an adaptive hybrid weight can be defined based on the nonlinearity of the system, so that the adaptivity yields good filter performance in both linear and nonlinear situations. The filter behavior is illustrated with experiments with the chaotic Lorenz-96 model, in which the nonlinearity can be controlled by the length of the forecast phase, and with an idealized configuration of the ocean model NEMO.
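One plausible way to build such an adaptive weight (an assumed formulation for illustration, not necessarily the one used in LKNETF) is to derive it from the effective sample size of the NETF particle weights:

```python
import numpy as np

def netf_weights(innovations, obs_var):
    """NETF-style particle weights from the observation likelihood.

    innovations: (k,) distance of each member's predicted observation from y.
    """
    logw = -0.5 * innovations**2 / obs_var
    w = np.exp(logw - logw.max())          # subtract max for numerical safety
    return w / w.sum()

def adaptive_hybrid_weight(w):
    """Hybrid weight from the effective sample size N_eff in [1, k].

    Near 1 (lean on the LETKF) when the weights degenerate; near 0 (lean on
    the NETF) when the weights are uniform and the nonlinear update is well
    sampled.
    """
    k = len(w)
    n_eff = 1.0 / np.sum(w**2)
    return 1.0 - (n_eff - 1.0) / (k - 1.0)

k = 20
uniform = netf_weights(np.zeros(k), 1.0)           # linear-looking case
degenerate = netf_weights(np.linspace(0, 8, k), 1.0)
print(adaptive_hybrid_weight(uniform))     # ~0: weights uniform, NETF is safe
print(adaptive_hybrid_weight(degenerate))  # near 1: fall back towards the LETKF
```

The appeal of such a diagnostic is that it is computed locally from quantities the filter already has, so the weight can vary in space and time with the degree of nonlinearity.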

Improving Met Office predictions of Arctic sea ice through assimilation of CryoSat-2 and SMOS thickness data

Davi Carneiro 1, Emma Fiedler 1, Ed Blockley 1, Matthew Martin 1

1Met Office

Arctic sea ice is one of the most rapidly and visibly changing components of the global climate system. Although global analysis and forecasting systems have been used successfully for mid-latitude ocean prediction for some time, their application to Arctic sea ice is less mature, since observations are much less abundant and data assimilation techniques less advanced in the polar regions than at lower latitudes. In this work, we aim to implement the assimilation of sea-ice thickness data from satellite measurements, specifically CryoSat-2 and Soil Moisture and Ocean Salinity (SMOS), within the Met Office Forecast Ocean Assimilation Model (FOAM). The FOAM data assimilation scheme is NEMOVAR, a three-dimensional variational (3D-Var) scheme with first guess at appropriate time (FGAT). We derive sea-ice thickness (SIT) from CryoSat-2 along-track sea-ice freeboard measurements, focusing on the assimilation of thicker ice, whereas the SMOS assimilation is particularly focused on thinner ice. We will show results of FOAM runs with the assimilation of SIT from CryoSat-2 and SMOS individually, as well as combined.

Collocated Argo-Hyperspectral Infrared Satellite Measurements: An Idea

Santha Akella 1, Yosuke Fujii 2

1NASA, 2JMA

Argo temperature and salinity (T & S) profiles form an integral part of ocean data assimilation (DA) systems, as do hyperspectral infrared (IR) satellite measurements for atmospheric DA systems. However, these two “essential” measurement platforms are not necessarily collocated in space and/or time. In this presentation we explore a few possibilities for collocating them, the impact on coupled DA and, broadly speaking, earth system analyses.

There is no single observational platform that can measure from the abyss of the ocean to the top of the atmosphere. However, the goal of a coupled DA system is to come up with an estimate of the state of the earth system that precisely spans that space. In this talk we explore how present technology could be effectively leveraged to yield observations across the air-sea interface. We propose timing the ascent of Argo floats (i.e., the time at which they measure T & S profiles) to match the hyperspectral IR measurements from polar-orbiting satellites. Although implementing this proposal would require a coordinated effort between the oceanographic in situ observation communities and space agencies, it is expected that such cooperation would have immense benefits. Just as the Argo T & S profiles provide “full” column information for the ocean, the hyperspectral IR soundings provide atmospheric column information through brightness temperature (BT) measurements at various wavelengths. Coupled DA systems include numerical weather prediction systems, which routinely assimilate BTs using a radiative transfer model that relates physical state variables (not only atmospheric temperature and humidity but also sea surface temperature, salinity, wind speed, etc.) to BT. In the first half of our talk we will lay out the details of these ideas and possible challenges. The second half will focus on using this information to improve and calibrate coupled background error covariances, which is key to successful coupled data assimilation. We anticipate this topic will bring together common interests of the OceanPredict task teams (OS-Eval TT, CP-TT and DA-TT) to tackle the bigger problem of how to maximize synergy between in situ and satellite observations, including a range of ocean observing systems (moorings, buoys, floats) and satellite IR, microwave, scatterometer and radar altimetry.
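The proposed matching could be sketched as a simple scheduling rule; the overpass times and the acceptable shift window below are hypothetical:

```python
from datetime import datetime, timedelta

def collocate(nominal, overpasses, max_shift=timedelta(hours=3)):
    """Shift a float's surfacing time to the nearest IR overpass, if close enough.

    If no overpass lies within max_shift, keep the nominal ascent time.
    """
    best = min(overpasses, key=lambda t: abs(t - nominal))
    return best if abs(best - nominal) <= max_shift else nominal

# Hypothetical twice-daily overpasses of a polar orbiter above one float.
overpasses = [datetime(2021, 6, 1, 1, 30) + i * timedelta(hours=12)
              for i in range(4)]

nominal = datetime(2021, 6, 1, 11, 0)    # the float's nominal ascent time
print(collocate(nominal, overpasses))    # snaps to the 13:30 overpass
```

In practice the rule would also have to respect each float's park-and-profile cycle and energy budget, which is where the coordination between communities comes in.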

Ocean biogeochemical model parameter uncertainties: Application of ensemble data assimilation to a one-dimensional model

Nabir Mamnun 1, Lars Nerger 2, Christoph Völker 1, Mihalis Vrekoussis 3

1Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, 2Alfred Wegener Institute, 3Institute of Environmental Physics and Remote Sensing, University of Bremen

Ocean biogeochemical models are increasingly used in Earth system modelling efforts for climate simulations, and also for the development of marine environmental applications and services. As in other geoscientific models, many processes in biogeochemical models are represented by simplified schemes called parameterizations. However, the parameter values can be poorly constrained and involve unknown uncertainties, and uncertainty in the parameter values translates into uncertainty in the model outputs. Quantifying the uncertainty of biogeochemical model predictions therefore requires a systematic approach to the uncertainties in their parameters. In this study, we apply an ensemble data assimilation method to quantify the uncertainty arising from parameterization within biogeochemical models. We apply the ensemble-based local error-subspace transform Kalman filter to a one-dimensional vertical configuration of the biogeochemical model Regulated Ecosystem Model 2 at the Bermuda Atlantic Time-series Study station. In situ nutrient and oxygen profiles and satellite chlorophyll-a concentration data are assimilated to estimate 10 selected biogeochemical parameters. We present convergence and interdependence features of the estimated parameters in relation to the major biological processes and discuss their uncertainties.
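The state-augmentation idea behind ensemble parameter estimation can be illustrated with a toy scalar example (a generic stochastic EnKF update, not the error-subspace transform filter used in the study):

```python
import numpy as np

rng = np.random.default_rng(2)

# An uncertain parameter is appended to the state vector, so that ensemble
# state-parameter covariances carry observation information to the parameter.
k, true_p = 40, 0.7
p_ens = 1.0 + 0.2 * rng.standard_normal(k)            # biased prior parameters
x_ens = 10.0 * p_ens + 0.1 * rng.standard_normal(k)   # toy model: x = 10 * p

y, r = 10.0 * true_p, 0.1**2          # observation of x consistent with true p
cov_xx = np.var(x_ens, ddof=1)
cov_px = np.cov(p_ens, x_ens, ddof=1)[0, 1]
gain_x = cov_xx / (cov_xx + r)
gain_p = cov_px / (cov_xx + r)        # parameter gain via the cross-covariance

innov = y + np.sqrt(r) * rng.standard_normal(k) - x_ens  # perturbed-obs innovations
p_ens = p_ens + gain_p * innov
x_ens = x_ens + gain_x * innov
print(round(p_ens.mean(), 2))  # pulled from the biased prior towards 0.7
```

Because the parameter is never observed directly, everything rests on the quality of the sampled parameter-observation covariance, which is why convergence and interdependence of the estimated parameters are worth examining.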

Assimilation of multi-platform observations into two-way coupled physical-biogeochemical model on the North-West European Shelf

Jozef Skakala 1, Jorn Bruggeman 1, David Ford 2, Stefano Ciavatta 1

1Plymouth Marine Laboratory, 2Met Office

The North-West European (NWE) Shelf is a region of major importance for both the European economy and climate. We present two important developments to the operational physical-biogeochemical model on the NWE Shelf. Firstly, we introduce the assimilation of novel physical and biogeochemical data from an AlterEco glider mission, expanding the existing operational biogeochemical assimilative capacity into a multi-platform (e.g. satellite, glider) system. Secondly, we use a novel bio-optical module to drive the heat fluxes in the physical model, which introduces an important feedback from the biogeochemical model state to the physical model. We demonstrate that both developments have the capacity to improve the skill of the simulated physics and biogeochemistry. The two-way coupled physical-biogeochemical model will be used in the future to progress from the current weakly coupled physical-biogeochemical assimilative system towards a strongly coupled one.

Assessing the impact of SWOT in a global high-resolution data assimilation system

Mounir Benkiran 1, Pierre-Yves Le Traon 2, Elisabeth Rémy 2

1Mercator Ocean, 2Mercator Ocean

This study describes the global Observing System Simulation Experiment (OSSE) system developed to assess the impact of the future Surface Water and Ocean Topography (SWOT) mission on global ocean analysis and prediction. The 1/12° global ocean circulation model and the data assimilation system (SAM2) are described. The data assimilation system differs from the one used operationally by Mercator Ocean for the Copernicus Marine Service: the main updates are the use of a 4D version of the data assimilation scheme and a different parametrization of the model error covariance. These improvements are detailed and their contributions quantified. The Nature Run used to represent the "truth ocean" is validated by comparison with altimeter observations. The Nature Run is used to simulate the pseudo-observations required for the OSSEs (sea surface height from three nadir altimeters and SWOT, sea surface temperature from satellite observations, and temperature and salinity profiles from the CORA database). Observation errors (noise) are added to all these simulated observations. The simulated data set is then assimilated in an Assimilated Run that uses a different model with different initial conditions and forcings. The OSSE design is validated by comparing OSSE results with results from Observing System Experiments (OSEs) for conventional nadir altimeters.
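Generating pseudo-observations from a Nature Run follows a simple recipe, sketched below with illustrative placeholder noise levels (not the Mercator Ocean error budget):

```python
import numpy as np

rng = np.random.default_rng(3)

nature_ssh = 0.1 * rng.standard_normal(1000)       # "true" SSH field (m)
track = rng.choice(1000, size=200, replace=False)  # along-track sample points
noise_std = 0.03                                   # assumed nadir noise (m)

# Pseudo-observations = truth sampled at observation points + simulated error.
pseudo_obs = nature_ssh[track] + noise_std * rng.standard_normal(200)

resid = pseudo_obs - nature_ssh[track]
print(round(float(resid.std()), 2))  # close to the prescribed 0.03
```

Because the truth is known everywhere, the subsequent Assimilated Run can be scored against it directly, which is the core advantage of the OSSE methodology.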

Developing the NEMOVAR ocean data assimilation system towards Graphical Processing Unit based High Performance Computing systems

Wayne Gaudin 1, Davi Carneiro 2, David Norton 3, Marcin Chrust 4, Matthew Martin 2, Andrew Porter 5, Anthony Weaver 6, Martin Price 2, David Case 7, Andrea Piacentini 6

1Nvidia, 2Met Office, 3The Portland Group / Nvidia, 4ECMWF, 5Science and Technology Facilities Council Hartree Centre, 6CERFACS, 7National Centre for Atmospheric Science

Graphical Processing Units (GPUs) are increasingly being used in compute-intensive non-graphical applications due to their large performance potential for tasks that can be made highly parallel and are bound by memory bandwidth, and their potentially improved power efficiency for appropriate workloads.

During a series of hackathons we are porting key sections of the ocean data assimilation code NEMOVAR to run on GPUs, focussing first on the diffusion operator that is used to model background error covariances. Key challenges are identifying the highly parallel sections of the code and inserting directives to run them on a GPU; and managing the transfer of data between the physically and logically separate GPU and main memory systems.

We present an overview of our approaches, which have included the use of explicit versus compiler-managed data transfers, and the potential for the PSyclone code-generation and translation system to automatically insert the required directives. The most compute-intensive sections of the diffusion operator have been run on the GPU, but good performance will only be achieved by making a larger region of the diffusion operator ‘resident’ on the GPU to reduce repeated transfers of data between the two memory systems. We will show some initial findings about the performance potential of the system on GPU based on the work so far, and give an overview of the potential and challenges of a full port of NEMOVAR to GPU-equipped systems.

Evaluation of the new Black Sea Reanalysis system

Leonardo Nascimento Lima 1, Stefania Angela Ciliberti 2, Ali Aydogdu 3, Romain Escudier 2, Simona Masina 3, Diana Azevedo 2, Elisaveta Peneva 4, Salvatore Causio 2, Andrea Cipollone 3, Emanuela Clementi 2, Sergio Cretì 2, Laura Stefanizzi 2, Rita Lecci 2, Francesco Palermo 2, Giovanni Coppini 2, Nadia Pinardi 5, Atanas Palazov 6

1Euro-Mediterranean Center on Climate Change, 2Centro Euro-Mediterraneo sui Cambiamenti Climatici, 3CMCC, 4Sofia University “St. Kliment Ohridski”, 5University of Bologna “Alma Mater Studiorum”, 6Institute of Oceanology, Bulgarian Academy of Sciences

Ocean reanalyses are becoming increasingly important for reconstructing and providing an overview of the ocean state from the past to the present day. In the scope of the Copernicus Marine Environment Monitoring Service (CMEMS), a new Black Sea (BS) reanalysis, BS-REA (BSE3R1 system), has been produced using an advanced variational data assimilation method to combine the best available observations with a state-of-the-art ocean general circulation model over the period 1993-2019. The hydrodynamical model is based on the Nucleus for European Modelling of the Ocean (NEMO, v3.6), implemented for the BS domain with a horizontal resolution of 1/27° x 1/36° and 31 unevenly distributed vertical levels. NEMO is forced by atmospheric surface fluxes computed via bulk formulae from the ECMWF ERA5 atmospheric reanalysis product. At the surface, the model temperature is relaxed to daily objective analysis fields of sea surface temperature from the CMEMS SST TAC. The exchange with the Mediterranean Sea is simulated through relaxation of the temperature and salinity near the Bosporus toward a monthly climatology computed from a high-resolution multi-year simulation, and the barotropic Bosporus Strait transport is corrected to balance the variations of the freshwater flux and the sea surface height measured by multi-satellite altimetry observations. A 3D-Var ocean data assimilation scheme (OceanVar) is used to assimilate along-track sea level anomaly observations from the CMEMS SL TAC and available in situ vertical profiles of temperature and salinity from the SeaDataNet and CMEMS INS TAC products. Background-error covariances are decomposed into vertical covariances and horizontal correlations: the former are modelled through 15-mode multivariate Empirical Orthogonal Functions, and the latter through a first-order recursive filter.
Comparisons against the previous Black Sea reanalysis (BSE2R2 system) show important improvements for temperature and salinity, with errors decreasing significantly (by about 50%). Temperature fields present a continuous warming in the 25-150 m layer, which contains the Black Sea Cold Intermediate Layer (CIL). SST exhibits a positive bias, and relatively high root mean square error (RMSE) values are present in the summer season. Spatial maps of sea level anomaly reveal the largest RMSE close to the shelf areas, related to the mesoscale activity along the Rim Current. The BSE3R1 system has produced very accurate estimates, which makes it well suited to assessing realistic climate trends and indicators for important ocean properties.
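The first-order recursive filter used to model horizontal correlations can be sketched in one dimension; the coefficient and iteration count below are illustrative:

```python
import numpy as np

def recursive_filter_1d(field, alpha, n_iter=2):
    """First-order recursive filter: paired forward and backward passes.

    Each pair of passes spreads information symmetrically; repeating the
    pair makes the implied correlation function smoother (more Gaussian).
    """
    out = field.astype(float).copy()
    for _ in range(n_iter):
        for i in range(1, len(out)):            # forward (causal) pass
            out[i] = alpha * out[i - 1] + (1 - alpha) * out[i]
        for i in range(len(out) - 2, -1, -1):   # backward (anti-causal) pass
            out[i] = alpha * out[i + 1] + (1 - alpha) * out[i]
    return out

spike = np.zeros(41)
spike[20] = 1.0                        # a single-point increment
response = recursive_filter_1d(spike, alpha=0.6)
print(response.argmax())  # 20: smoothing spreads, but does not move, the spike
```

The response to a spike is the implied horizontal correlation function, so this one-dimensional test is a standard way to check the filter's effective length scale.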

The data assimilation in the ocean forecast system based on FIO-COM

Xunqiang Yin 1

1The First Institute of Oceanography, Ministry of Natural Resources, Qingdao 266061, China

The ocean forecasting system (OFS) based on the surface wave-tide-circulation coupled ocean model developed by the First Institute of Oceanography (FIO-COM), Ministry of Natural Resources, China, has been running operationally since May 2016. An ensemble adjustment Kalman filter (EAKF) is used in this OFS to assimilate near-real-time oceanic observations, including satellite-derived sea surface temperature (SST), sea surface height (SSH) from satellite altimeters, and temperature and salinity profiles from Argo. Ten ensemble members are used in the implementation of the EAKF. To maintain a reasonable ensemble spread, random noise is introduced into the initial conditions and ensemble inflation with a factor of 5% is applied at each daily assimilation. Although the ensemble size is relatively small, the computational cost is still considerable. Recently, a sampling method based on historical simulations, inspired by the NMC method, has been developed to provide a larger ensemble for the EAKF. Sensitivity experiments were conducted to compare the performance of the original EAKF and the new scheme. The results indicate that the new scheme gives comparable performance with a similar ensemble size, while a larger sampled ensemble improves both the analysis and the forecasts. Using the sampling method, or combining it with ensemble model runs, could be employed in this system to reduce the computational cost and improve the performance of the OFS.
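The 5% multiplicative inflation can be sketched as follows (a generic formulation; variable names and values are illustrative):

```python
import numpy as np

def inflate(ens, factor=0.05):
    """Scale ensemble anomalies about the mean by (1 + factor).

    This counteracts the spread collapse typical of small ensembles,
    leaving the ensemble mean unchanged.
    """
    mean = ens.mean(axis=0)
    return mean + (1.0 + factor) * (ens - mean)

ens = np.array([[1.0, 2.0],      # 3 members, 2 state variables
                [3.0, 4.0],
                [5.0, 0.0]])
inflated = inflate(ens)
print(inflated.std(axis=0) / ens.std(axis=0))  # both components scaled by 1.05
```

With only 10 members, some such spread maintenance is essential: without it the filter quickly becomes overconfident and rejects new observations.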

Multiscale FGAT data assimilation in OceanMAPS (v3.4) Forecasting System.

Prasanth Divakaran, Matthew Chamberlain 1, Pavel Sakov 2, Gary Brassington 2, Peter Oke 1, Russell Fiedler 1

1CSIRO, 2Bureau of Meteorology

The Ocean Model, Analysis and Prediction System (OceanMAPS) is the ocean forecasting system implemented at the Bureau of Meteorology. OceanMAPS provides 7-day forecasts on a daily basis on a near-global, eddy-resolving horizontal grid. The system is based on the Modular Ocean Model (MOM 5), with data assimilation using the EnKF-C software in Ensemble Optimal Interpolation (EnOI) mode. An efficient, two-stage, multiscale FGAT (First Guess at Appropriate Time) EnOI data assimilation is used in OceanMAPS v3.4. The first EnOI stage uses a static ensemble of climatological anomalies constructed from a coarse, 1° global ocean model. This is followed by an FGAT EnOI stage based on intraseasonal anomalies from a free run of the eddy-resolving ocean model (the same as the OceanMAPS ocean model configuration). The coarse-resolution ensemble is intended to correct broad scales, while the high-resolution ensemble corrects eddy scales. Corrections from the coarse stage are more effective at reducing biases in the subsurface ocean, whereas the high-resolution corrections are largely restricted to vertically uniform corrections associated with mesoscale eddies. With the implementation of FGAT, more observations are assimilated into the system, especially satellite-derived sea surface temperature, giving a closer fit to observations in surface and near-surface temperature. The talk covers the configuration of the two-stage data assimilation system and a statistical comparison with the previous version of OceanMAPS.
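The two-stage idea can be illustrated on a synthetic one-dimensional problem, using Gaussian covariances with a broad and a short length scale (all scales illustrative, not the OceanMAPS settings):

```python
import numpy as np

n = 200
i = np.arange(n)
# Truth = a broad-scale signal plus a short "eddy-scale" signal.
x_t = np.sin(2 * np.pi * i / n) + 0.3 * np.sin(2 * np.pi * i / 10)
x_b = np.zeros(n)                          # background with errors at both scales

def stage(x, y, length_scale, obs_err=0.01):
    """One EnOI-like step with a Gaussian covariance of the given length scale."""
    d = np.subtract.outer(i, i)
    P = np.exp(-0.5 * (d / length_scale) ** 2)
    K = P @ np.linalg.inv(P + obs_err * np.eye(n))
    return x + K @ (y - x)

x1 = stage(x_b, x_t, length_scale=40.0)    # broad-scale stage
x2 = stage(x1, x_t, length_scale=3.0)      # eddy-scale stage on the residual
e0, e1, e2 = (np.abs(v - x_t).mean() for v in (x_b, x1, x2))
print(e2 < e1 < e0)  # True: each stage removes its own scales of error
```

The broad-scale stage cannot see the eddy-scale error (it lies outside its covariance), and vice versa, which is why the sequential two-stage design reduces the total error further than either stage alone.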

Controlling Lateral Boundary Conditions of a Regional Ocean Model by an Approximate Kalman Filter

Tianran Liu 1, Naoki Hirose 1

1Research Institute for Applied Mechanics, Kyushu University

Satellite altimeter data have been assimilated into ocean models to better estimate the current and past ocean state and thus produce more accurate predictions. In this study, we develop a regional, short-term ocean forecasting system in which the lateral boundary conditions are modified through assimilation of satellite sea surface height data by an approximate Kalman filter. Traditionally, the system error (process noise) has been attributed mainly to the surface meteorological conditions; however, the resulting assimilation impacts decay rapidly with depth and do not propagate into the deep layers. The lateral boundary condition assimilation proposed in this study extends the effective persistence of data assimilation both temporally and spatially. We hope this forecasting system can provide swift and accurate predictions of background fields at all levels for applications such as fisheries and ocean energy.

Assessing Impacts of Ensemble Kalman Filter (EnKF) on the Remo Ocean Data Assimilation System (RODAS) Over the South Western Atlantic

Filipe Bitencourt Costa , Clemente Augusto Souza Tanajura 1

1Physics Institute, Federal University of Bahia (UFBA), Salvador, Brazil.

This work presents the implementation of the Ensemble Kalman Filter (EnKF) in the REMO Ocean Data Assimilation System (RODAS) with the Hybrid Coordinate Ocean Model (HYCOM) at 1/12° over the western tropical and South Atlantic. The new version of RODAS employs a joint, multivariate assimilation of hydrographic profiles, UK Met Office OSTIA Sea Surface Temperature (SST) and AVISO Absolute Dynamic Topography (ADT). Three experiments were performed for six months with an assimilation cycle of ten days: (i) Control, with no assimilation; (ii) A_EnOI, employing Ensemble Optimal Interpolation (EnOI); and (iii) A_EnKF, employing the EnKF and forced with perturbed atmospheric fields. A_EnKF was successfully implemented, as the ensemble spread was maintained around 0.35 °C, 0.03 m and 0.05 psu for temperature, Sea Surface Height (SSH) and salinity, respectively. Also, the mean Root Mean Squared Deviation (RMSD) over all ensemble members was greater than the RMSD of the ensemble-mean run for temperature and salinity. The mean correlation of SSH with respect to AVISO was 0.12, 0.33 and 0.31, and the RMSD of SST with respect to OSTIA was 0.92, 0.52 and 0.47 °C for Control, A_EnOI and A_EnKF, respectively. For the subsurface, the RMSD with respect to Argo was 0.22, 0.20 and 0.18 psu for salinity and 1.42, 0.91 and 1.09 °C for temperature for Control, A_EnOI and A_EnKF, respectively. Impacts on the Brazil Current are still being assessed. A_EnOI showed better SSH correlation and smaller temperature errors, while A_EnKF presented smaller errors for SST and salinity; A_EnKF therefore shows quality comparable to the previous version of RODAS. For future work, it is expected that, with an increase in ensemble members from eleven to thirty-three, the new version of RODAS will outperform the previous one for SSH, SST and subsurface temperature and salinity.

This abstract is not assigned to a timetable spot.

Ocean data assimilation for ICON-ESM for climate predictions

Holger Pohlmann 1, Sebastian Brune 2, Kristina Fröhlich 3, Johanna Baehr 4

1DWD, 2University of Hamburg, 3Deutscher Wetterdienst, 4Institute of Oceanography, CEN, Uni Hamburg

The newly developed coupled climate model "Icosahedral Nonhydrostatic Earth System Model" (ICON-ESM) has recently become available. The pre-industrial control run has a stable climate with closed energy and water budgets. The global mean temperature is 13.73°C and the Atlantic meridional overturning circulation at 26°N has a strength of 16 Sv. ICON-ESM will be used for seasonal to decadal climate predictions produced at the "Deutscher Wetterdienst" (DWD) in Germany. As a first step towards weakly coupled data assimilation, observed ocean 3D temperature and salinity fields are assimilated with the Ensemble Kalman Filter (EnKF). An EnKF assimilation run over the period 1960-2014 is produced and analysed in terms of the degree of realism of its climate variability.

This abstract is not assigned to a timetable spot.

Parameter estimation for ocean biogeochemical component in a global model using Ensemble Kalman Filter: a twin experiment

Tarkeshwar Singh 1, François Counillon 2, Jerry F. Tjiputra 3, Mohamad El Gharamti 4

1NERSC, Bergen, Norway, 2NERSC/UoB, 3NORCE Norwegian Research Centre AS and Bjerknes Centre for Climate Research, Bergen, Norway, 4National Center for Atmospheric Research, Boulder, Colorado, USA

Ocean biogeochemical (BGC) models utilise a large number of poorly constrained global parameters to mimic unresolved processes and reproduce the observed complex spatio-temporal patterns. Large model errors stem primarily from inaccuracies in these parameters, whose optimal values can vary in both space and time. This study aims to demonstrate the ability of ensemble data assimilation (DA) methods to provide high-quality, improved BGC parameters within an Earth system model in an idealised twin-experiment framework. We use the Norwegian Climate Prediction Model (NorCPM), which combines the Norwegian Earth System Model with the Dual-One-Step-ahead smoothing-based Ensemble Kalman Filter (DOSA-EnKF). The work follows Gharamti et al. (2017), which successfully demonstrated the approach for one-dimensional idealised ocean BGC models. We aim to estimate five spatially varying BGC parameters by assimilating salinity and temperature hydrographic profiles and surface BGC (phytoplankton, nitrate, phosphorus, silicate and oxygen) observations in a strongly coupled DA framework, i.e. jointly updating the ocean and BGC state and parameters during the assimilation. The method converges quickly (in less than a year), largely reducing the errors in the BGC parameters, and eventually performs nearly as well as the system with true parameter values. Optimal parameter values can also be recovered by assimilating climatological BGC observations and with challenging, sparse observational networks. The findings of this study demonstrate the applicability of the approach for tuning the system in a real-world framework.
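
The state-augmentation idea behind ensemble parameter estimation can be illustrated with a toy scalar model (this is a generic sketch, not NorCPM or the DOSA-EnKF): the uncertain parameter is appended to the state vector and updated through its ensemble cross-covariance with the observed state.

```python
# Toy twin experiment: recover an unknown model parameter by augmenting the
# EnKF state vector with the parameter and using its cross-covariance with
# the observed state.
import numpy as np

rng = np.random.default_rng(1)
a_true = 0.8                             # "true" parameter to be recovered
n_ens, n_cycles, obs_err = 100, 60, 0.05

def step(x, a, t):
    """Toy dynamics standing in for the BGC model: AR(1) with periodic forcing."""
    return a * x + np.sin(0.3 * t)

x = np.zeros(n_ens)                      # ensemble of states
a = rng.uniform(0.3, 0.95, n_ens)        # broad prior on the uncertain parameter
x_true = 0.0

for t in range(n_cycles):
    x = step(x, a, t)                    # ensemble forecast (each member uses its own a)
    x_true = step(x_true, a_true, t)
    y = x_true + rng.normal(0.0, obs_err)          # synthetic observation of the state
    d = y + rng.normal(0.0, obs_err, n_ens) - x    # perturbed-observation innovations
    denom = x.var(ddof=1) + obs_err**2
    a += (np.cov(a, x)[0, 1] / denom) * d          # parameter update via cross-covariance
    x += (x.var(ddof=1) / denom) * d               # state update

print(f"estimated parameter: {a.mean():.2f} (truth {a_true})")
```

The same mechanism generalises to spatially varying parameters: each grid point carries its own augmented entry, updated with localised covariances.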

This abstract is not assigned to a timetable spot.

Skills and biases of an 11-yr reanalysis produced by the HYCOM+REMO ocean data assimilation system (RODAS) in the South Atlantic

Filipe Bitencourt Costa , Clemente Tanajura 1

1Federal University of Bahia (UFBA)

An 11-yr reanalysis (2008-2018) was produced with the HYCOM+REMO ocean data assimilation system (RODAS) in the South Atlantic. RODAS is based on the ensemble optimal interpolation method. It assimilated OSTIA sea surface temperature (SST), AVISO gridded sea surface height (SSH) and 61,031 vertical profiles of temperature and salinity (T/S) from Argo floats and XBTs, in addition to data from 8 PIRATA moorings, every 3 days. The HYCOM+RODAS system was implemented with 1/12° horizontal resolution and 32 vertical layers in the regional domain 45°S-10°N, 68°W-18°W. The system was forced by the 6-hourly CFSR NCEP/NOAA atmospheric reanalysis. Lateral boundary conditions were imposed by another HYCOM+RODAS run covering the whole Atlantic, in which only the mean dynamic topography was assimilated. Considering each 3-day period after each assimilation, the SST root mean squared deviation (RMSD) was 0.51°C, the SSH correlation with AVISO was 0.63, and the T and S RMSDs in the top 2000 m with respect to Argo data were 0.97°C and 0.15 psu. These values are comparable to the HYCOM+NCODA reanalysis (0.49°C, 0.60, 0.64°C and 0.14 psu, respectively) and the GLORYS/Mercator reanalysis (0.60°C, 0.78, 0.46°C and 0.08 psu, respectively). The largest HYCOM+RODAS biases occurred in the Brazil-Malvinas Confluence region for SST (1.5°C), in the crest of the South Atlantic gyre (10 cm) and in the thermocline region (2°C and 0.20 psu). The results show the reanalysis is suitable for investigating ocean processes, particularly in the upper ocean, where intraseasonal circulation variability was well captured when compared to in situ data.

This abstract is not assigned to a timetable spot.

Impacts of the assimilation of satellite sea surface temperature data on estimates of the volume and heat budgets of the North Sea

Wei Chen , Johannes Schulz-Stellenfleth 1, Sebastian Grayek 1, Joanna Staneva 2

1Institute of Coastal Research, Helmholtz-Zentrum Geesthacht (HZG), 2Helmholtz-Zentrum Geesthacht

The different mechanisms controlling the heat budget of the North Sea are investigated based on a combination of satellite sea surface temperature measurements and numerical model simulations. Lateral heat fluxes across the shelf edge and into the Baltic Sea are considered, as well as vertical ocean-atmosphere heat exchange. A 3DVAR data assimilation (DA) scheme is applied, with assumed model error correlations that depend on the mixed-layer depth derived from a coupled circulation/ocean-wave model. The simulated seawater temperature is improved both at the surface and at greater water depths. DA is shown to change the current velocity field and decrease the lateral advective volume/heat exchanges between the North Sea and the Atlantic, yielding an increased heat flux from the Atlantic into the North Sea and more heat flux from the sea to the atmosphere. The largest DA impact on volume/heat transport is found at the Norwegian Channel, where the dominant process is Eulerian transport, followed by tidal pumping and wind pumping, while other processes, such as Stokes transport, transport driven by the annual mean wind stress, and tide-wind interactions, are negligible. Further analysis reveals an acceleration of the along-shelf current at the northern edge of the North Sea and a decrease in the horizontal pressure gradient from the Atlantic to the North Sea. DA changes the velocity field inside the Norwegian Channel and the instability of the water column, which in turn reduces the Eulerian transport of heat and water outward from the North Sea.

This abstract is not assigned to a timetable spot.

Coupled reanalyses of NorCPM1 contributed to CMIP6 DCPP

Ingo Bethke 1, Noel Keenlyside 2, Yiguo Wang 3, François Counillon 4

1UoB, 2Geophysical Institute, University of Bergen, 3NERSC, 4NERSC/UoB

The Norwegian Climate Prediction Model version 1 (NorCPM1) is a new research tool for performing climate reanalyses and seasonal-to-decadal climate predictions. It has been used to contribute output to the Decadal Climate Prediction Project (DCPP) as part of the Climate Model Intercomparison Project phase 6 (CMIP6). It combines the Norwegian Earth System Model version 1 (NorESM1) with the Ensemble Kalman Filter (EnKF), assimilating SST and T/S-profile observations. NorCPM1 contributed two reanalyses based on anomaly assimilation (AA) to CMIP6 DCPP: the first (assim-i1) uses a 1980-2010 reference climatology for computing anomalies, and AA only updates the physical ocean state; the second (assim-i2) uses a 1950-2010 reference climatology and additionally updates the sea-ice state via AA. First, we demonstrate that AA successfully synchronises model variability without greatly affecting the model climatology. Compared to the historical experiment without assimilation, the reanalyses generally have lower RMSE and higher correlation coefficients for both assimilated and unassimilated variables with respect to observations. The model bias in the reanalyses is practically unchanged by AA, even for the biogeochemistry, apart from a reduced sea-ice thickness bias in assim-i2 caused by the sea-ice update. The two reanalysis products overall show comparable performance, except for the AMOC trend. To attribute the differences in AMOC trends to the assimilation settings, we consider a new reanalysis (assim-i3) that uses a 1950-2010 reference climatology and does not update the sea-ice state via AA. By comparing assim-i1 with assim-i3, we assess the impact of the reference climatology on the reanalysis; by comparing assim-i2 with assim-i3, we assess the impact of the AA update of the sea-ice state.
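
The anomaly-assimilation update described above can be sketched as follows; the numbers and the scalar gain k are illustrative, not NorCPM1 values. The key property is that the model climatology (and hence its bias) is preserved by construction.

```python
# Hedged sketch of anomaly assimilation (AA): observations and model state
# are converted to anomalies relative to their own reference climatologies,
# the update is done in anomaly space, and the model climatology is added
# back, so the model's mean state (and bias) is deliberately left untouched.
import numpy as np

def anomaly_update(model_state, obs, model_clim, obs_clim, k=0.5):
    """One scalar AA update; k stands in for the ensemble Kalman gain."""
    model_anom = model_state - model_clim        # e.g. vs a 1980-2010 climatology
    obs_anom = obs - obs_clim
    analysis_anom = model_anom + k * (obs_anom - model_anom)
    return model_clim + analysis_anom            # model bias preserved by construction

# toy numbers: model is 1.0 degC too warm (bias), both see a warm anomaly
analysis = anomaly_update(model_state=16.8, obs=15.6,
                          model_clim=16.0, obs_clim=15.0, k=0.5)
print(analysis)   # 16.7: anomaly nudged from 0.8 toward 0.6, bias of 1.0 kept
```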

This abstract is not assigned to a timetable spot.

Leveraging Hessian Uncertainty Quantification to Design Ocean Climate Observing Systems

Nora Loose 1, Patrick Heimbach 2

1University of Colorado, Boulder, 2University of Texas at Austin

Designing effective ocean observing networks warrants deliberate, quantitative strategies, given the high cost and logistical challenges of ocean observing. We leverage Hessian uncertainty quantification (UQ) within the ECCO (Estimating the Circulation and Climate of the Ocean) data assimilation framework to explore a quantitative approach to designing ocean climate observing systems. Here, an observing system is considered optimal if it minimizes uncertainty in a set of investigator-defined design goals or quantities of interest (QoIs), such as oceanic transports or other key climate indices. Hessian UQ unifies three design concepts. (1) An observing system reduces uncertainty in a target QoI most effectively when it is sensitive to the same dynamical controls as the QoI. The dynamical controls are exposed by the Hessian eigenvector patterns of the model-data misfit function. (2) Orthogonality of the Hessian eigenvectors rigorously accounts for complementarity versus redundancy between distinct members of the observing system. (3) The Hessian eigenvalues determine the overall effectiveness of the observing system, and are controlled by the sensitivity-to-noise ratio of the observational assets (analogous to the statistical signal-to-noise ratio). We illustrate Hessian UQ and its three underlying concepts in a North Atlantic case study. Sea surface temperature observations inform mainly local air-sea fluxes. In contrast, subsurface temperature observations reduce uncertainty over basin-wide scales, and can therefore inform transport QoIs at great distances. This research provides insight into the design of effective observing systems that maximally inform the target QoIs while complementing the existing observational database.
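
The three concepts can be illustrated with a toy Gauss-Newton Hessian (random matrices standing in for the ECCO operators, not the actual system): a QoI aligned with a leading Hessian eigenvector sees a large uncertainty reduction, while one in the Hessian's null space sees none.

```python
# Sketch of Hessian-based UQ: eigenpairs of the misfit Hessian determine how
# much an observing system reduces uncertainty in a QoI q = g^T x.
import numpy as np

rng = np.random.default_rng(2)
n, m = 20, 5                      # control-space and observation dimensions
B = np.eye(n)                     # prior (control) covariance, identity for simplicity
H = rng.normal(size=(m, n))       # linearised observation operator
R = 0.1 * np.eye(m)               # observation-error covariance

misfit_hessian = H.T @ np.linalg.inv(R) @ H            # Gauss-Newton misfit Hessian
post_cov = np.linalg.inv(np.linalg.inv(B) + misfit_hessian)

evals, evecs = np.linalg.eigh(misfit_hessian)          # eigenpairs: "dynamical controls"

# QoI aligned with the best-observed direction vs one in the unobserved null space
g_best = evecs[:, -1]             # leading eigenvector (largest eigenvalue)
g_worst = evecs[:, 0]             # null-space direction (eigenvalue ~ 0)
for name, g in [("aligned", g_best), ("orthogonal", g_worst)]:
    prior_var = g @ B @ g
    post_var = g @ post_cov @ g
    print(f"{name:10s}: prior var {prior_var:.2f} -> posterior var {post_var:.3f}")
```

With B = I the posterior variance along an eigenvector with eigenvalue lambda is 1/(1 + lambda), which is the sensitivity-to-noise argument of concept (3) in miniature.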

This abstract is not assigned to a timetable spot.

How well are mesoscale eddies represented in ocean models?

Anass El Aouni , Arthur Vidard 1

1Inria Grenoble

The ocean is a fundamentally turbulent system characterized by several processes occurring at different scales, particularly the mesoscale (ranging from 5 km up to several hundreds of kilometres), which is the most energetic scale in the ocean. The mesoscale dominates the ocean's circulation through a variety of physical structures comprising eddies, meandering currents, filaments and jets. The term mesoscale is often followed by the word "eddies", which reflects their predominance in the ocean.
Over the past 20 years, the combination of satellite remote sensing and ocean models has shown the ubiquity of mesoscale eddies and the key role they play in the climate machinery.
In fact, these physical structures form the main source of the ocean's kinetic energy and can have a crucial impact on it. In addition to their ability to stir and mix surrounding water, they also play a major role in climate change through their influence on the circulation, controlling the distribution of water properties and modulating energy and momentum.
However, these processes have mostly been studied "heuristically" from the Eulerian perspective, where persistent correlations between flow quantities are sought in a fixed spatial domain.
The aim of this work is to evaluate, from a Lagrangian viewpoint, how well mesoscale eddies are observed and adequately represented in climate prediction models. Here, we use a recent Lagrangian method from the nonlinear dynamics field and evaluate the impact of data assimilation on these fine-scale oceanic processes. We do this by revealing several of their physical processes, each time from oceanic outputs with and without assimilation and from satellite geostrophic currents, and then carry out a statistical comparison to evaluate the role played by data assimilation in the realistic representation of fine-scale oceanic processes.

This abstract is not assigned to a timetable spot.

Biogeochemical reanalysis of the Arctic Ocean with joint state-parameter estimation

Tsuyoshi Wakamatsu 1, Annette Samuelsen 2, Jiping Xie 2, Çağlar Yurumktepe 2, Laurent Bertino 2

1The Nansen Center, 2NERSC

An operational system producing a biogeochemical reanalysis of the Arctic Ocean (north of 60°N) has been developed. The reanalysis production is based on a fixed-lag ensemble Kalman smoother with joint state-parameter estimation and a coupled physical-biogeochemical ocean model cycled over eight-day assimilation windows. Satellite chlorophyll-a and in-situ nutrients (nitrate, silicate and phosphate) are assimilated into ensemble states of the coupled physical-biogeochemical model HYCOM-ECOSMO. Eight global parameters of the biogeochemical model (the growth and mortality rates of two phytoplankton groups, the mortality rates of two zooplankton groups, the carbon-to-silicate ratio, and the sinking velocities of detritus and opal) are estimated jointly with the biogeochemical model state. Pilot reanalysis products from 2007 to 2009 are validated over the Atlantic sector of the Arctic Ocean. State and parameter estimation jointly contribute to reducing the HYCOM-ECOSMO model biases, characterised by a late and overly strong spring bloom, at the smoother update. However, an out-of-phase bloom still emerges in some regions after the eight-day forecast. Introducing inflation in the ensemble parameters was found to be important for preserving their seasonal changes. Validation against in situ nutrients in the Barents Sea and the Norwegian Sea shows reduced overestimation of silicate and nitrate, but validation of nutrients is difficult in general over the Arctic Ocean due to the lack of in-situ data. In areas with significant model bias in the spring bloom, phytoplankton and zooplankton biomass compositions change significantly after assimilation.

This abstract is not assigned to a timetable spot.

Coupling ocean & land processes: differences & similarities. What can we learn?

Gianpaolo Balsamo 1

1ECMWF

Land & ocean processes in the ECMWF coupled system are represented by a variety of dedicated surface and 3D models (CHTESSEL, FLake, ECWAM, NEMO, LIM2), with more modular developments in the pipeline that will also allow the land water cycle to be connected directly to the ocean fresh-water inflow (via the CAMA-Flood river discharge).

I plan to illustrate how the timescales of physical processes guide & justify some of the simplifications made in these models as used in non-linear integrations (i.e. in all forecasts), and in their simplified & tangent-linear versions, with further reduced complexity, that are used in the 4D-Var inner-loop trajectories (i.e. in analyses).

The validity of these approximations varies across spatial scales, particularly when approaching km-scale resolution.

For instance, the need for lateral water & heat exchanges may become more important, as is evident in mountain environments (drier tops & wetter valleys in the real world, but not in models neglecting lateral transports) and in coastal areas, where high gradients in salinity & temperature distributions are present near the coast.

For open water, a diversity of choices is currently present over the oceans and inland water bodies, where different models are used (NEMO3.4, LIM2 and EC-WAM over the oceans, with simplified and TL/AD models for the inner loops, and the FLake model for inland & coastal water).

If developments were solely guided by the needs of process representation, a mixed-layer scheme with wave physics and a sea-ice column physics, interacting with the atmosphere column, would probably be best adapted for all open-water points. In the simplified and TL/AD models (where the surface dynamics of ocean and sea ice can be largely neglected over sub-daily assimilation windows), such a scheme would facilitate coupled data assimilation and augmented-control-vector developments to affect the surface sea state directly.

Reference:
Balsamo et al. 2017 WGNE Blue-book
http://bluebook.meteoinfo.ru/uploads/2017/docs/09_Balsamo_Gianpaolo_CouplingOceansLandECMWF.pdf

This abstract is not assigned to a timetable spot.

Data assimilation of ocean wind waves using Neural Networks

Kathrin Wahle 1, Joanna Staneva 2, Günther Heinz 1

1HZG, 2Helmholtz-Zentrum Geesthacht

A novel approach to data assimilation based on Neural Networks (NNs) is presented and applied to the wave model WAM; the case study demonstrated here is the German Bight. The method takes advantage of the ability of NNs to emulate models and to invert them. Combining the forward and inverse NN models with the Levenberg–Marquardt algorithm provides boundary values or wind fields in agreement with measured integrated wave parameters. Synthesized HF-radar wave data are used to test the technique for two academic cases. The approach proposed here uses an 'inverse' NN to estimate the wave parameters at the wave model's open boundary from the observations. These estimated boundary conditions are used as input for a run with a physically based model (the wave model WAM). A forward NN is trained to generate output for a limited number of output locations. The twin-experiment results are promising and confirm the practicability of this ML-based assimilation technique. The method has several advantages over other methods: it can easily be implemented for other wave models and regions, since it only requires model output and measurements. Additionally, it can be adapted to specific problems (deriving improved wind fields and/or boundary conditions, or any other model parameter of interest). Data assimilation using Neural Networks is computationally very efficient compared with other advanced (non-local) assimilation strategies.
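
A minimal sketch of the forward-NN-plus-Levenberg–Marquardt inversion might look like the following; the tiny fixed-weight network here is a hypothetical stand-in for the trained WAM emulator, not the system described above.

```python
# Sketch: given a forward network mapping boundary values to wave parameters
# at the output locations, Levenberg-Marquardt finds boundary values whose
# predicted observations match the measurements.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
n_boundary, n_hidden, n_obs = 4, 8, 6

# fixed random weights standing in for a trained forward emulator
W1, b1 = rng.normal(size=(n_hidden, n_boundary)), rng.normal(size=n_hidden)
W2, b2 = rng.normal(size=(n_obs, n_hidden)), rng.normal(size=n_obs)

def forward_nn(boundary):
    """Forward NN: boundary values -> wave parameters at observation points."""
    return W2 @ np.tanh(W1 @ boundary + b1) + b2

truth = rng.normal(size=n_boundary)          # "true" boundary conditions
observed = forward_nn(truth)                 # synthetic (noise-free) observations

# Levenberg-Marquardt inversion of the emulator
result = least_squares(lambda b: forward_nn(b) - observed,
                       x0=np.zeros(n_boundary), method="lm")
print("residual norm:", np.linalg.norm(forward_nn(result.x) - observed))
```

Because the emulator is differentiable and cheap, each LM iteration costs a few network evaluations rather than a wave-model run, which is where the computational efficiency claimed above comes from.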

This abstract is not assigned to a timetable spot.

Assimilating synthetic Biogeochemical-Argo data into a global ocean model to inform observing system design

David Ford 1

1Met Office

Argo has revolutionised our understanding of ocean physics, and Biogeochemical-Argo (BGC-Argo) is now extending the concept to biogeochemistry. There are currently around 300 operational floats measuring one or more biogeochemical variables, and over the coming years an array of BGC-Argo floats will be established to provide regular profiles of oxygen, nitrate, pH, chlorophyll, suspended particles and downwelling irradiance. In preparation for this, and to inform decision-making around future deployments, a set of observing system simulation experiments (OSSEs) has been performed as part of the AtlantOS project. To perform the OSSEs, the MEDUSA biogeochemical model was coupled with the global FOAM reanalysis system, and the capability to assimilate 3D profiles of oxygen, nitrate, pH and chlorophyll, as well as surface chlorophyll from ocean colour, was developed. The OSSEs test the impact on the system of assimilating simulated BGC-Argo observations for two potential scenarios: having BGC sensors on the full current Argo array (~4000 floats), and having BGC sensors on ¼ of the current Argo array (~1000 floats). BGC-Argo was found to provide complementary information to the existing ocean colour satellite constellation, while improving the representation of other variables throughout the water column, and of air-sea CO2 flux. Assimilating ~1000 floats, which is the current target array size, provided clear benefits. Results suggest that increasing the array size further would bring further benefit, though similar or greater improvements could potentially be achieved through development of the data assimilation scheme.
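
The OSSE sampling step (drawing synthetic float observations from a nature run for the two array scenarios) can be sketched as below; the grid size, noise level and the `sample_floats` helper are illustrative assumptions, not the AtlantOS setup.

```python
# Schematic OSSE sampling: synthetic BGC-Argo observations are drawn from a
# nature-run field at random float locations, observation noise is added,
# and the two array scenarios differ only in how many floats are sampled.
import numpy as np

rng = np.random.default_rng(5)
nature_run = rng.normal(5.0, 1.0, size=(180, 360))     # e.g. a surface oxygen field

def sample_floats(field, n_floats, noise_std=0.1, rng=rng):
    """Draw synthetic float observations: field value at random positions + noise."""
    i = rng.integers(0, field.shape[0], n_floats)
    j = rng.integers(0, field.shape[1], n_floats)
    obs = field[i, j] + rng.normal(0.0, noise_std, n_floats)
    return np.column_stack([i, j, obs])                # (row, col, value) per float

full_array = sample_floats(nature_run, 4000)     # BGC sensors on the full Argo array
quarter_array = sample_floats(nature_run, 1000)  # sensors on 1/4 of the array
print(full_array.shape, quarter_array.shape)
```

Both synthetic datasets are then assimilated into the same (different-from-nature) model, so any skill difference is attributable to array size alone.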

This abstract is not assigned to a timetable spot.

The Met Office operational global ocean forecasting system at 1/12th degree resolution

Christine Pequignet 1, Matthew Martin 1, James While 1, Robert King 2, Martin Price 1, John Siddorn 1, Mike Bell 1, Jennifer Waters 1, Ana Aguiar 1, Kerry Smout-Day 1, Gordon Inverarity 1, Jan Maksymczuk 1, Daniel Lea 1

1Met Office, 2UK Met Office

The Met Office’s operational Forecasting Ocean Assimilation Model (FOAM) global system has recently been upgraded from an eddy-permitting 1/4 degree resolution (FOAM-ORCA025) to an eddy-resolving 1/12th degree resolution (FOAM-ORCA12). The increase in resolution allows mesoscale processes to be resolved over a much larger range of latitudes, finer bathymetric features and coastlines to be represented, and a larger number of islands to be resolved, which can play an important role in ocean circulation.

FOAM uses a multivariate incremental variational data assimilation scheme called NEMOVAR, which assimilates SST, temperature and salinity profiles, altimeter SLA and satellite sea-ice concentration observations with a 1-day assimilation window. The FOAM-ORCA12 configuration keeps the data assimilation at 1/4 degree resolution while the model is run at the higher 1/12th degree resolution. This approach significantly reduces the computational cost and allows us to make use of the well-established 1/4 degree data assimilation configuration.
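
The resolution split can be pictured as interpolating increments from the analysis grid to the model grid before they are applied; this 1-D linear sketch (hypothetical numbers, not the NEMOVAR implementation) shows the idea for a refinement factor of 3, i.e. 1/4° to 1/12°.

```python
# Sketch: increments computed on a coarse analysis grid are interpolated to
# the finer model grid before being added to the model state.
import numpy as np

def interp_increment_1d(coarse_incr, refine=3):
    """Linearly interpolate a 1-D increment from the coarse grid onto a grid
    `refine` times finer (3 ~ going from 1/4 deg to 1/12 deg)."""
    nc = coarse_incr.size
    x_coarse = np.arange(nc)
    x_fine = np.linspace(0, nc - 1, (nc - 1) * refine + 1)
    return np.interp(x_fine, x_coarse, coarse_incr)

coarse = np.array([0.0, 0.3, -0.1, 0.2])   # SST increment on the 1/4-deg grid
fine = interp_increment_1d(coarse)         # increment on the 1/12-deg grid
print(fine.round(2))                       # fine-grid model state += fine
```

The interpolated increment contains no scales finer than the analysis grid, which is consistent with the observation that the 1/12° model, not the assimilation, supplies the extra mesoscale detail.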

We will present a description of the new 1/12th degree system alongside some initial results. Qualitatively, the new FOAM system appears to better represent the details of mesoscale features in SST and surface currents. However, traditional statistical verification methods suggest that the new system performs similarly to, or slightly worse than, the pre-existing FOAM configuration. It is known that comparisons of models running at different resolutions suffer from a double-penalty effect, whereby higher-resolution models are penalised more than lower-resolution models for features that are offset in time and space. Neighbourhood verification methods seek to make a fairer comparison using a common spatial scale for both models, and FOAM-ORCA12 appears to perform better than FOAM-ORCA025 when these metrics are used.

This abstract is not assigned to a timetable spot.

A scalable, weak formulation of the continuity constraint across domain boundaries for global ocean eddy-resolving applications

Andrea Cipollone 1, Andrea Storto 2, Simona Masina 1

1CMCC, 2CMRE; CNR-ISMAR

The ocean forecasting community is devoting a growing effort to the prediction of mesoscale processes at global scale. New high-resolution global simulations are able to resolve mesoscale structures over large parts of the basin. On the other hand, a realistic representation of mesoscale variability depends on the capability of the assimilation schemes to efficiently ingest and combine several observing networks at the same model resolution, and requires DA codes to be massively parallelized. One of the main bottlenecks that arises in this case is how to handle long correlations in the presence of domain boundaries. Exact, continuous solutions over the halo/overlap regions of different domains require multiple communications with neighbours that can potentially destroy the scalability of the code.
In this presentation, we treat the restoration of continuity over the halo regions by including a new term in the cost function that drives the solution towards the continuous one [1]. This corresponds to relaxing the strong continuity constraint across the boundaries into a weak formulation and defining a maximum allowed discontinuity between different solutions over the same halo regions (i.e. a "boundary continuity error"). The formulation forces possible boundary discontinuities to be smaller than a prescribed error, and minimizes the parallel communication compared to the standard method. Theoretically, the exact solution is recovered as the boundary error decreases towards zero. In practice, it is shown that the accuracy increases until a lower bound is reached (set by minimizer accuracy, mesh, etc.).
Results are assessed in terms of both scalability and accuracy, exploiting a global ocean grid at 1/16° resolution. We also show the benefit of using a global eddy-resolving grid by comparing two experiments with different spatial resolutions at the assimilation level (1/16° and 1/4°) and the same resolution at the forecast level (1/16°) [1].
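
A toy version of the weak formulation may help fix ideas: the cost function gains a quadratic term penalising halo disagreement, weighted by a prescribed boundary-continuity error sigma_b, and the halo solution approaches the neighbour's value as sigma_b tends to zero. This 1-D scalar sketch is only illustrative, not the scheme of [1].

```python
# Weak continuity constraint as an extra cost-function term:
# J(x) = Jb + Jo + (1 / 2 sigma_b^2) * || x[halo] - x_other[halo] ||^2
import numpy as np

def cost(x, background, obs, x_other_halo, halo_idx, sigma_b):
    jb = 0.5 * np.sum((x - background) ** 2)        # background term
    jo = 0.5 * np.sum((x - obs) ** 2)               # observation term (H = I)
    jc = 0.5 * np.sum((x[halo_idx] - x_other_halo) ** 2) / sigma_b**2  # weak continuity
    return jb + jo + jc

bg = np.zeros(3)
obs = np.array([1.0, 0.5, 0.2])
other_halo = np.array([0.6])        # neighbour's current solution on the shared halo
for sigma_b in (1.0, 0.1, 0.01):
    w = 1.0 / sigma_b**2
    x = (bg + obs) / 2.0            # unconstrained minimiser at non-halo points
    # closed-form minimiser at the shared halo point (index 1) of this quadratic cost
    x[1] = (bg[1] + obs[1] + w * other_halo[0]) / (2.0 + w)
    print(f"sigma_b={sigma_b}: halo value {x[1]:.3f}, "
          f"J={cost(x, bg, obs, other_halo, [1], sigma_b):.3f}")
```

As the printout shows, shrinking sigma_b pulls the halo value towards the neighbour's 0.6, recovering the strongly constrained (continuous) solution in the limit, without any extra communication during the minimisation.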

[1] Cipollone, A., Storto, A., and Masina, S. (2020), JTECH, 37(10), https://doi.org/10.1175/JTECH-D-19-0099.1

Starts at 08:20 on 17 May.

Evaluation of the coupled atmosphere-ocean reanalysis and future development of the coupled data assimilation system in Japan Meteorological Agency

Yosuke Fujii 1, Chiaki Kobayashi 1, Ichiro Ishikawa 1, Yuhei Takaya 1, Toshiyuki Ishibashi 1, Tamaki Yasuda 1

1JMA/MRI

The Meteorological Research Institute (MRI) of the Japan Meteorological Agency (JMA) has developed a coupled atmosphere-ocean data assimilation system, MRI-CDA1, based on the coupled atmosphere-ocean general circulation model and the separate atmosphere and ocean analysis routines adopted in JMA’s operational weather and climate prediction systems. In this presentation, we introduce the improvement of tropical precipitation and surface air temperature (SAT) fields in a coupled analysis generated by MRI-CDA1 over an uncoupled reanalysis using the same system. In the coupled analysis, the adjustment of sea surface temperature (SST) to the atmosphere amplifies the lead/lag correlations between subseasonal variations of SST and precipitation. The atmosphere-ocean coupling generates SST variations associated with tropical instability waves, and the SAT field responds to these SST variations. The SST-precipitation and SST-SAT relationships on the weather timescale are also recovered in the coupled reanalysis, although they are hardly seen in the uncoupled one. The coupled model physics generates weather-timescale SST variations consistent with the atmospheric state, and the atmospheric parameters respond to these SST variations through the coupled model physics. In this presentation, we will also introduce the future development of the coupled data assimilation system, using the new global ocean data assimilation system, based on a four-dimensional variational method, for JMA’s coupled predictions.

Starts at 08:50 on 17 May.

NOAA-NCEP Next Generation Global Ocean Data Assimilation System (NG-GODAS): Evaluation of 40-year Reanalysis

Jong Kim 1, Daryl Kleist 2, Yi-Cheng Teng 3, Shastri Paturi 4, JieShun Zhu 5, Guillaume Vernieres 6, Travis Sluka 6, Yan Hao 3

1IMGS@NOAA/NWS/NCEP/EMC, 2NOAA/NWS/NCEP/EMC, 3IMSG@NOAA/NWS/NCEP/EMC, 4IMSG @ NOAA/NWSNCEP/EMC, 5University of Maryland-CISESS, 6JCSDA

NOAA’s Unified Forecast System (UFS) incorporates the MOM6 ocean and CICE6 sea-ice models as the ocean component of the future operational models: the global weather (GFS), sub-seasonal (GEFS) and seasonal (SFS) forecasting systems. Furthermore, the UFS modeling infrastructure has been combined with the Joint Effort for Data Assimilation Integration (JEDI) project to establish NOAA’s Next Generation Global Ocean Data Assimilation System (NG-GODAS). An interim 40-year NG-GODAS reanalysis experiment is underway, assimilating various types of satellite and in-situ observations: satellite sea surface temperature, sea surface salinity, in-situ temperature & salinity, absolute dynamic topography, sea-ice concentration, and sea-ice freeboard thickness. A data atmosphere model with bias-corrected atmospheric forcing sets is used in the reanalysis experiment: the NCEP Climate Forecast System Reanalysis (CFSR) for 1979-1999 and the Global Ensemble Forecast System (GEFS) for 2000-2020. Preliminary results show that NG-GODAS provides significantly improved temperature and salinity analysis fields compared to the current operational systems. We present initial diagnostic validation results with a detailed overview of the NG-GODAS analysis system.

Starts at 09:10 on 17 May.

Uncertainties in reconstruction of the past ocean climate with ocean reanalyses

Hao Zuo 1, Magdalena Alonso Balmaseda 1, Eric de Boisseson 1, Steffen Tietsche 1, Michael Mayer 2, Chris Roberts 1, Patricia de Rosnay 1

1ECMWF, 2ECMWF/University of Vienna

A historical reconstruction of ocean and sea-ice states, or ocean reanalysis (ORA), can be produced using an ocean and sea-ice model simulation constrained by boundary forcing fluxes and by observations via data assimilation. Long-term ocean reanalyses can provide invaluable information for climate monitoring. However, a reliable reconstruction of the past ocean climate strongly relies on the effectiveness of the ocean reanalysis system, as well as on the availability and consistency of the global ocean observing system. Here we assess uncertainties in some key climate change indicators (CCIs) estimated using the ECMWF ORA systems (ORAS4, ORAS5, ORAP6). An extended ensemble approach has been used in this study, taking into account errors from the climate model and boundary forcing fluxes, as well as deficiencies in the observing networks and data assimilation methods. Climate signals such as changes in ocean heat and salt content, and the inter-annual variability of sea-level and sea-ice changes, are presented. Transports from large-scale overturning circulations such as the AMOC are also discussed. Even though robust climate signals are achievable in the ORAs in the post-Argo era, large uncertainties remain in the data-sparse earlier decades, or where observing networks are intermittent or discontinued. Quantifying and characterising these uncertainties in the ocean climate can provide valuable guidance for future enhancement of the global ocean observing system, and can also help improve long-term predictions such as decadal forecasts or climate projections.

Starts at 10:00 on 17 May.

The Estimating the Circulation and Climate of the Ocean (ECCO) “Central Estimate”: a Multi-decadal, Coupled Ocean Reanalysis

Ian Fenty 1, Ichiro Fukumori 2, Patrick Heimbach 3

1NASA Jet Propulsion Laboratory, 2NASA Jet Propulsion Laboratory/California Institute of Technology, 3University of Texas at Austin

The Estimating the Circulation and Climate of the Ocean (ECCO) Consortium has been producing dynamically and kinematically-consistent global ocean state estimates for nearly two decades. Our current focus is Version 4 of the “Central Estimate”, a data-constrained global, 1-degree, coupled ocean, sea-ice, and thermodynamic ice-sheet model that spans the period 1992-present. The coupled ocean model is made consistent with a diverse and heterogeneous set of ocean, sea-ice, and ice-sheet data in a least-squares sense by iteratively adjusting a set of control parameters using the gradient of an uncertainty-weighted model-data misfit cost function. The gradient of the cost function is provided by the automatically-derived adjoint of the model (MITgcm).

By construction, ECCO state estimates perfectly satisfy the laws of physics and thermodynamics encoded in the numerical model and therefore conserve heat, salt, volume, and momentum. Our philosophy of strict adherence to these conservation principles ensures that ECCO reanalyses are useful for investigating the causal origins of observed ocean climate variability. However, because of the enormous scale of the nonlinear optimization problem, strictly obeying conservation laws involves a trade-off with goodness-of-fit; on the whole, ECCO reanalyses are unlikely to reproduce observations as well as ocean reanalyses that allow incremental adjustments to their state vectors through time.

Here we summarize our efforts to date with a focus on addressing recent challenges associated with (i) coupling to the sea-ice and thermodynamic ice-sheet models, (ii) adding novel data constraints such as ocean bottom pressure from GRACE and GRACE-FO, and (iii) increasing the spatial resolution of the state estimation system to achieve eddy-resolving scales.

Starts at 10:20 on 17 May.

Implementation and Evaluation of a High-Efficiency Coupled Data Assimilation System Using Multi-Timescale EnOI-Like Filtering with a Coupled General Circulation Model

Lv Lu 1, Shaoqing Zhang 1, Yingjing Jiang 1, Xiaolin Yu 1

1Ocean University of China

A multi-timescale high-efficiency approximate EnKF (MSHea-EnKF), which consists of stationary, slow-varying, and fast-varying filters using the time series of a single-model solution, has been implemented in the Geophysical Fluid Dynamics Laboratory’s global fully coupled climate model (CM2.1) to improve the representation of low-frequency background error statistics and enhance computational efficiency. Here, the MSHea-EnKF is evaluated in a biased twin-experiment framework and a 27-year real-observation coupled data assimilation (CDA) experiment. Results show that, at only about one-twelfth the computational cost of traditional ensemble coupled data assimilation (ECDA), the ocean state estimation quality improves by 30.3% and 10.7% for upper-500 m salinity and temperature, respectively, while the atmosphere state estimation has almost the same quality as traditional ECDA. This is mainly because MSHea-EnKF improves the representation primarily of slow-varying background flows. The MSHea-EnKF also yields a more reasonable standard deviation distribution for the Atlantic meridional overturning circulation (AMOC) and stronger meridional transport at 26.5°N below 2000 m, closer to the RAPID estimates.

Starts at 10:40 on 17 May.

Biogeochemical, ocean, and sea-ice data assimilation in the Southern Ocean

Matthew Mazloff 1, Ariane Verdy 1, Bruce Cornuelle 1

1SIO-UCSD

We introduce biogeochemical – ocean – sea ice state estimates in the Southern Ocean. Atmospheric fields are adjusted to fit observations from profiling floats, shipboard data, underway measurements, and satellites. These atmospheric adjustments shed light on biases in downwelling radiative fluxes in existing atmospheric reanalysis models. We demonstrate the validity of adjoint method optimization for coupled physical-biogeochemical state estimation using a series of gradient check experiments. The presentation demonstrates the readiness of the method for synthesizing in situ biogeochemical observations as they become more available.

Starts at 11:00 on 17 May.

The role of flow-dependent oceanic background-error covariance information in air-sea coupled data assimilation during tropical cyclones: a case study

Tsz Yan Leung 1, Polly Smith 1, Amos Lawless 1, Nancy Nichols 1, Matthew Martin 2

1University of Reading, 2Met Office

In variational data assimilation, background-error covariance structures have the ability to spread information from an observed part of the system to unobserved parts. An accurate specification of these structures is therefore crucially important for the success of assimilation systems, and hence of the forecasts that their outputs initiate. For oceanic models, background-error covariances have traditionally been modelled by parametrisations which mainly depend on macroscopic properties of the ocean and have limited dependence on local conditions. This can be problematic during the passage of tropical cyclones, when the spatial and temporal variability of the ocean state departs from its characteristic structures. Furthermore, the traditional method of estimating oceanic background-error covariances can amplify imbalances across the air-sea interface when weakly coupled data assimilation is applied, with detrimental impacts on forecasts of cyclones. Using the case study of Cyclone Titli, which affected the Bay of Bengal in 2018, we explore hybrid methods that combine the traditional modelling strategy with flow-dependent estimates of the ocean’s error covariance structures based on the latest-available short-range ensemble forecast. This hybrid approach is investigated in the idealised context of a single-column model as well as in the UK Met Office’s state-of-the-art system. The idealised model helps inform how the inclusion of ensemble information can improve coupled forecasts. Different methods for producing the ensemble are explored, with the goal of generating a limited-size ensemble that best represents the uncertainty in the ocean fields. We then demonstrate the power of this hybrid approach in changing the analysed structure of oceanic fields in the Met Office system, and explain the difference between the traditional and hybrid approaches in light of the ways the assimilation systems respond to single synthetic observations.
Finally, we discuss the benefits that the hybrid approach in ocean data assimilation can bring to atmospheric forecasts of the cyclone.
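The hybrid idea, and the single-observation response it produces, can be sketched in one dimension (illustrative only; this is not the Met Office configuration, and all sizes, weights, and fields are invented): a parametrised static covariance is blended with an ensemble sample covariance, and the blended matrix determines how a single observation's innovation is spread into an analysis increment.

```python
import numpy as np

# Schematic hybrid background-error covariance and single-observation
# response (illustrative only). B = beta_c * B_clim + beta_e * B_ens,
# and the analysis increment is dx = B H^T (H B H^T + R)^{-1} (y - H x_b).
rng = np.random.default_rng(1)
n = 10

# parametrised "static" covariance: Gaussian-shaped correlations
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B_clim = np.exp(-((dist / 3.0) ** 2))

# flow-dependent part: sample covariance of a small perturbation ensemble
ens = rng.normal(size=(20, n)).cumsum(axis=1)   # crude correlated "flow"
pert = ens - ens.mean(axis=0)
B_ens = pert.T @ pert / (len(ens) - 1)

beta_c, beta_e = 0.5, 0.5                       # hybrid blending weights
B = beta_c * B_clim + beta_e * B_ens

# a single observation of state component 4
H = np.zeros((1, n))
H[0, 4] = 1.0
R = np.array([[0.1]])
innovation = np.array([1.0])                    # y - H x_b
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)    # gain
dx = (K @ innovation).ravel()                   # increment spreads via B
```

Changing the weights beta_c and beta_e changes the shape of dx away from the observed point, which is precisely the kind of difference the single-synthetic-observation experiments in the abstract diagnose.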

Starts at 12:30 on 17 May.

Can data assimilation of physical and biological satellite observations inform subsurface distributions in the Gulf of Mexico?

Bin Wang 1, Katja Fennel 1, Liuqian Yu 2

1Dalhousie University, 2The Hong Kong University of Science and Technology

The multivariate Deterministic Ensemble Kalman Filter (DEnKF) has been implemented to assimilate physical and biological observations into a biogeochemical model of the Gulf of Mexico. First, the biogeochemical model component was tuned using BGC-Argo observations. Then, observations of sea surface height, sea surface temperature, and surface chlorophyll were assimilated, and profiles of both physical and biological variables were updated based on the surface information. We assessed whether this results in improved subsurface distributions, especially of biological properties, using observations from five BGC-Argo floats that were not assimilated but were used in the a priori tuning. Results show that assimilation of the satellite data improves the model representation of major circulation features, which translates into improved three-dimensional distributions of temperature and salinity. The multivariate assimilation also improves the agreement of subsurface nitrate through its tight correlation with temperature, but the improvements in subsurface chlorophyll were initially modest due to suboptimal choices of the light-attenuation parameters in the model’s optical module. Adjustment of the light-attenuation parameters greatly improved the subsurface distribution of chlorophyll. Given that the abundance of BGC-Argo profiles in the Gulf of Mexico is so far insufficient for sequential assimilation, the alternative of updating 3D biological properties in a model that has been well calibrated represents an intermediate step toward full assimilation of the new data types. We have shown that even sparse BGC-Argo observations can provide substantial benefits to biogeochemical prediction by enabling a priori model tuning.

Starts at 12:50 on 17 May.

Gaussian approximations in filters and smoothers for data assimilation

Matthias Morzfeld 1, Daniel Hodyss 2

1Scripps Institution of Oceanography, 2Naval Research Laboratory

We present mathematical arguments and experimental evidence that suggest that Gaussian approximations of posterior distributions are appropriate even if the physical system under consideration is nonlinear. The reason for this is a regularizing effect of the observations that can turn multi-modal prior distributions into nearly Gaussian posterior distributions. This has important ramifications for data assimilation (DA) algorithms because the various algorithms (ensemble Kalman filters/smoothers, variational methods, particle filters (PF)/smoothers (PS)) apply Gaussian approximations to different distributions, which leads to different approximate posterior distributions, and, subsequently, different degrees of error in their representation of the true posterior distribution. In particular, we explain that, in problems with ‘medium’ nonlinearity, (i) smoothers and variational methods tend to outperform ensemble Kalman filters; (ii) smoothers can be as accurate as PFs, but may require fewer ensemble members; (iii) localization of PFs can introduce errors that are more severe than errors due to Gaussian approximations. In problems with ‘strong’ nonlinearity, posterior distributions are not amenable to Gaussian approximation. This happens, e.g., when posterior distributions are multi-modal. PFs can be used on these problems, but the required ensemble size is expected to be large (hundreds to thousands), even if the PFs are localized. Moreover, the usual indicators of performance (small root mean square error and comparable spread) may not be useful in strongly nonlinear problems. We arrive at these conclusions using a combination of theoretical considerations and a suite of numerical DA experiments with low- and high-dimensional nonlinear models in which we can control the nonlinearity.
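The regularizing effect of observations can be seen in a one-dimensional toy calculation (illustrative only; the values are invented): a bimodal prior multiplied by a sufficiently sharp Gaussian likelihood yields a unimodal, nearly Gaussian posterior.

```python
import numpy as np

# Toy Bayes update on a grid: bimodal prior x sharp Gaussian likelihood
# -> unimodal posterior. Purely illustrative.
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

prior = 0.5 * gauss(x, -2.0, 0.7) + 0.5 * gauss(x, 2.0, 0.7)   # two modes
likelihood = gauss(x, 1.5, 0.5)         # observation y = 1.5, obs error 0.5
posterior = prior * likelihood
posterior /= posterior.sum() * dx       # normalize to a density

def n_modes(p):
    # count strict interior local maxima on the grid
    return int(np.sum((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])))
```

The observation collapses the mass near the unobserved mode, leaving a single, approximately Gaussian bump; with a much broader likelihood the posterior would stay bimodal, which is the 'strong' nonlinearity regime discussed above.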

Starts at 13:20 on 17 May.

C3S Seasonal Initialization and Global Reanalysis: Enabling an Ensemble of Data Assimilation for the Ocean

Arthur Vidard 1, Matthew Martin 2, Andrea Storto 3, Anthony Weaver 4

1Inria, 2Met Office, 3CMRE; CNR-ISMAR, 4CERFACS

This presentation will give an overview of recent developments in NEMOVAR that took place within the Copernicus-funded project ERGO, whose aim is to improve ocean data assimilation capabilities at ECMWF, used both in the initialization of seasonal forecasts and in the generation of coupled Earth-system reanalyses. In particular, the project has significantly improved NEMOVAR’s ensemble generation capabilities, which resulted in improved parameterisations of the existing background error covariance model. A more sophisticated, hybrid formulation has also been implemented, offering the possibility to represent fully flow-dependent background error covariances with multiple spatial scales. Finally, developments were made toward improved use of surface satellite data (SST and SSH). In parallel, significant effort was put into improving numerical efficiency, involving the development of multi-grid strategies, code optimisation, and GPU and mixed-precision capabilities. A significant effort has also been put into performing scout experiments and providing relevant diagnostics to evaluate the benefits of the proposed developments. All these aspects will be covered in detail in other presentations during this workshop.

Starts at 13:35 on 17 May.

An overview of ensemble covariance developments in NEMOVAR

Anthony Weaver 1, Marcin Chrust 2, Benjamin Menetrier 3, Andrea Piacentini 4, Arthur Vidard 5

1CERFACS, 2ECMWF, 3IRIT - JCSDA, 4CERFACS , 5Inria

This presentation provides an overview of methods for using ensembles to define background-error covariances in variational data assimilation (DA) with an emphasis on the global ocean. The methods that are described have been developed for NEMOVAR in support of operational DA at ECMWF and the Met Office. Various localized-ensemble and hybrid formulations of the background-error covariance matrix (B) have been implemented in NEMOVAR. While the basic methodologies are similar to those used in NWP, the underlying modelling and estimation algorithms are substantially different in order to account for specific characteristics of the ocean DA problem, such as the presence of irregular lateral boundaries and the diversity of ocean scales. The computational cost of applying ensemble-based covariance operators with high-resolution global ocean models is significant, so considerable effort has been devoted to the design and optimization of the algorithms. Key features include the use of inexpensive and scalable iterative diffusion solvers for parameter filtering and correlation modelling; the capacity to apply a diffusion-based localization operator on a coarse-resolution global grid; and the availability of an affordable method for estimating accurate normalization factors when vertical correlation parameters are flow dependent. This presentation will focus on the computational aspects of ensemble B modelling in NEMOVAR. Other presentations at the workshop will describe results from the ECMWF and Met Office NEMOVAR-based systems which use different ensemble B formulations.

Starts at 14:30 on 17 May.

Ensemble-variational assimilation with NEMOVAR at ECMWF

Marcin Chrust 1, Anthony Weaver 2, Philip Browne 1, Hao Zuo 1, Andrea Storto 3, Magdalena Alonso Balmaseda 1

1ECMWF, 2CERFACS, 3CMRE; CNR-ISMAR

This presentation will summarize the work by the NEMOVAR consortium (ECMWF, CERFACS, UK Met Office, INRIA) to develop an ensemble-variational data assimilation system for the NEMO model, enabling effective assimilation of ocean observations. A holistic approach has been adopted: revisiting our static B-matrix formulation, developing various flavours of a flow-dependent B matrix, and improving our ensemble generation scheme by implementing stochastic physics in the NEMO ocean model. The focus of the presentation will be on the configuration that is most likely to be implemented in the OCEAN6 system. It consists of a modelled B matrix in which an ensemble of climatological perturbations is used to specify the static parameters: background-error standard deviations and correlation length scales. The standard deviations are combined with errors of the day captured by ensemble perturbations. While it is not straightforward to implement hybrid, flow-dependent horizontal correlation length scales in operational settings, owing to the costly re-computation of the normalization factors that ensure the correlation matrix has a unit diagonal, we will show that it is feasible to envisage a configuration with fully flow-dependent vertical length scales. With ECMWF’s plans to develop its own SST analysis going ahead, and given the still-prohibitive cost of a fully ensemble-based B matrix, we consider such a configuration crucial for effective assimilation of surface observations.

Starts at 14:45 on 17 May.

Assessing the impact of Hybrid DA and inflation settings in a global ocean ensemble system at the Met Office

Daniel Lea 1, Matthew Martin 1, James While 1, Anthony Weaver 2, Andrea Storto 3, Marcin Chrust 4

1Met Office, 2CERFACS, 3CMRE; CNR-ISMAR, 4ECMWF

A global ocean and sea-ice ensemble forecasting system is being developed based on the present operational FOAM (Forecasting Ocean Assimilation Model) system run at the Met Office. It uses a 1/4° resolution NEMO ocean model and the CICE sea-ice model, and assimilates data using the NEMOVAR system. NEMOVAR is primarily a variational data assimilation system, but now has the capability to perform hybrid ensemble-variational assimilation. An ensemble of hybrid 3DEnVars with perturbed observations (values and locations) has been set up, with each member forced at the surface by a separate member of the Met Office ensemble atmospheric prediction system (MOGREPS). The ensemble size is 37 members (including an unperturbed member). The system includes stochastic model perturbations developed at CMRE, and an ensemble inflation method based on Relaxation to Prior Spread (RTPS).

We perform several reanalysis runs of the ensemble system with different weights for the ensemble and modelled components of the hybrid background-error covariance, and with different ensemble inflation factors. This is done to test the sensitivity of the results to these settings, with a view to finding optimal values. The performance of these runs is assessed by examining the impact on the innovation statistics, the ensemble reliability, and the ensemble skill.
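The RTPS inflation mentioned above can be sketched in a few lines (schematic values only; the actual FOAM/NEMOVAR settings are not implied): analysis perturbations are rescaled so that the analysis spread relaxes back toward the background spread.

```python
import numpy as np

# Sketch of Relaxation to Prior Spread (RTPS) inflation (schematic values).
# Analysis perturbations are rescaled so that the analysis spread relaxes
# toward the background spread: sigma_new = (1 - alpha)*sigma_a + alpha*sigma_b.
rng = np.random.default_rng(2)
n_ens, n = 37, 5
bkg = rng.normal(scale=2.0, size=(n_ens, n))   # background ensemble
ana = bkg * 0.4                                # assimilation shrank the spread

alpha = 0.7                                    # relaxation factor
sigma_b = bkg.std(axis=0, ddof=1)              # background spread, per point
sigma_a = ana.std(axis=0, ddof=1)              # analysis spread, per point
factor = ((1 - alpha) * sigma_a + alpha * sigma_b) / sigma_a

mean_a = ana.mean(axis=0)
ana_inflated = mean_a + (ana - mean_a) * factor   # rescale perturbations only
sigma_new = ana_inflated.std(axis=0, ddof=1)
```

Because only the perturbations are rescaled, the ensemble mean (the analysis itself) is untouched; alpha controls how far the spread is pulled back toward the prior.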

Starts at 15:00 on 17 May.

Learning from earth system observations: machine learning or data assimilation?

Alan Geer 1

1ECMWF

Billions of new observations are added every day to an already vast record of earth system observations from satellite and surface-based measuring devices. The future will see increasing diversity from sources like smallsats and networked devices such as smartphones. There are also important and often unique observing capabilities in research campaigns and field sites. Earth system observations are used for two purposes: to make analyses of the evolving geophysical state, and to validate and improve physical models of the system. The current state of the art, both for analysis and model parameter estimation, is data assimilation (DA). The new wave of machine learning (ML) for earth sciences may offer possibilities including the complete replacement of the DA process and the learning of new model components from scratch. But ML will have to contend with the characteristics of real observations: that they are indirect, ambiguous, sparse, diverse, only partially representative, and affected by many uncertainties. Current DA methods have the tools to handle these issues in a statistically optimal manner, whereas current ML approaches are typically only applied to regular, 'perfect' data. However, there is no conflict between ML and DA since they are both founded in Bayesian probabilistic methods and they have an exact mathematical equivalence. The DA and ML methods applied in the earth sciences can learn from each other, and the future is likely to be the combination of both.

Starts at 15:30 on 17 May.

Data Learning: Integrating Data Assimilation and Machine Learning

Rossella Arcucci 1

1Imperial College London

In recent years, Data Assimilation (DA) has increased in sophistication to better fit application requirements and circumvent implementation issues. Nevertheless, these approaches are incapable of fully overcoming their unrealistic assumptions. Machine Learning (ML) shows great capability in approximating nonlinear systems and extracting high-dimensional features. ML algorithms are capable of assisting or replacing traditional forecasting methods. However, the data used during training in any ML algorithm include numerical, approximation and round-off errors, which are trained into the forecasting model. Integration of ML with DA increases the reliability of prediction by including information with a physical meaning. This work provides an introduction to Data Learning, a field that integrates Data Assimilation and Machine Learning to overcome limitations in applying these fields to real-world data. The fundamental equations of DA and ML are presented and developed to show how they can be combined into Data Learning. We present a number of Data Learning methods and results for some test cases, though the equations are general and can easily be applied elsewhere.

Starts at 08:00 on 18 May.

A Simplified Smoother applied to the FOAM/Glosea Ocean Reanalysis

Keith Haines 1, Bo Dong , Matthew Martin 2

1University of Reading, 2Met Office

We present an ocean smoother designed to adjust real reanalysis products by utilizing knowledge of future increments. By using increments, rather than innovations as a true Kalman smoother would, considerable simplification is obtained. A decay-time parameter with 3-D spatial variations is applied to the smoother increments to account for memory-decay timescales in the ocean. The result differs from simply time-smoothing the reanalysis itself: only the increments are smoothed in time, so the reanalysis product retains the high-frequency variability that is internally generated by the model and by atmospheric forcing. The smoother is applied to the daily Met Office FOAM/GloSea global ¼ degree ocean reanalysis over a 19-month period in 2015-16. Results show significant improvement over the original reanalysis in the temperature and salinity state and its variability. Comparisons are made directly against temperature and salinity observations, and the smoother, more realistic time variability in ocean heat and salt content is also discussed.
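One plausible reading of the scheme, as a minimal 1-D sketch (the actual FOAM/GloSea implementation, including the 3-D spatially varying decay field, is not reproduced here): an increment applied at one analysis time is also spread back to earlier times with an exponential decay governed by a memory timescale.

```python
import numpy as np

# Minimal 1-D sketch of a simplified increment smoother (assumed reading):
# each increment is spread backward in time with exponential decay, so
# earlier states feel a damped version of future increments.
n_t = 10
increments = np.zeros(n_t)
increments[6] = 1.0          # a single increment applied at analysis time t = 6
tau = 2.0                    # memory-decay timescale (in time steps)

smoothed = np.zeros(n_t)
for t in range(n_t):
    for t_inc in range(n_t):
        if t_inc >= t:       # only current or future increments contribute
            smoothed[t] += increments[t_inc] * np.exp(-(t_inc - t) / tau)
```

Because only the increment field is smoothed, any high-frequency variability in the underlying reanalysis states is left intact, which is the key property claimed above.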

Starts at 08:20 on 18 May.

High-resolution Ensemble Kalman Filter with a low-resolution model using a machine learning super-resolution approach

Sébastien Barthélémy 1, Julien Brajard 2, Laurent Bertino 2

1University of Bergen, 2NERSC

Going from low- to high-resolution models is an efficient way to improve the data assimilation process in three ways: it makes better use of high-resolution observations, it represents more accurately the small scale features of the dynamics and it provides a high-resolution field that can further be used as an initial condition of a forecast. Of course, the pitfall of such an approach is the cost of computing a forecast with a high-resolution numerical model. This drawback is even more acute when using an ensemble data assimilation approach, such as the ensemble Kalman filter, for which an ensemble of forecasts is to be issued by the numerical model.
In our approach, we propose to use a cheap low-resolution model to provide the forecast while still performing the assimilation step in a high-resolution space. The principle of the algorithm is based on a machine learning approach: from a low-resolution forecast, a neural network (NN) emulates a high-resolution field that can then be used to assimilate high-resolution observations. This NN super-resolution operator is trained on a single high-resolution simulation. The new data assimilation approach, denoted "super-resolution data assimilation" (SRDA), is built on an ensemble Kalman filter (EnKF) algorithm.
We applied SRDA to a quasi-geostrophic model representing simplified ocean dynamics of the surface layer, with a resolution up to four times coarser than the reference high resolution (so that the cost of the model is divided by 64). We show that this approach outperforms both the standard low-resolution data assimilation approach and an SRDA variant using standard interpolation instead of a neural network as the super-resolution operator. For the reduced cost of a low-resolution model, SRDA provides a high-resolution field with an error close to that of the field that would be obtained using a high-resolution model.
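One SRDA analysis step can be sketched schematically (illustrative only: a trivial repetition operator stands in for the trained NN, a stochastic EnKF update is used, and all sizes and values are invented): low-resolution forecasts are lifted to high-resolution space, the update is performed there against a high-resolution observation, and the analysis is projected back to the low-resolution model grid.

```python
import numpy as np

# Schematic SRDA analysis step (illustrative only). A repetition operator
# stands in for the trained NN super-resolution operator.
rng = np.random.default_rng(3)
n_lo, factor = 8, 4
n_hi = n_lo * factor
n_ens = 20

def upsample(x):                     # stand-in for the NN operator
    return np.repeat(x, factor)

def downsample(x):                   # projection back to the low-res grid
    return x.reshape(n_lo, factor).mean(axis=1)

ens_lo = rng.normal(size=(n_ens, n_lo))            # low-res forecast ensemble
ens_hi = np.array([upsample(m) for m in ens_lo])   # lift to high-res space

# assimilate one high-res observation at grid point h (stochastic EnKF)
obs, r = 2.0, 0.1
h = 10
mean_hi = ens_hi.mean(axis=0)
pert = ens_hi - mean_hi
cov_xy = pert.T @ pert[:, h] / (n_ens - 1)         # cov(state, observed point)
var_y = pert[:, h] @ pert[:, h] / (n_ens - 1)
K = cov_xy / (var_y + r)                           # gain for this observation
for i in range(n_ens):
    d = obs + rng.normal(scale=np.sqrt(r)) - ens_hi[i, h]
    ens_hi[i] += K * d
ens_lo_analysis = np.array([downsample(m) for m in ens_hi])
```

In the actual SRDA the quality of the upsampling operator matters: the repetition stand-in here plays the role that interpolation plays in the baseline, and the trained NN is what closes most of the gap to a true high-resolution EnKF.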

Starts at 08:40 on 18 May.

Machine Learning for Earth System Assimilation and Prediction

Massimo Bonavita 1, Marcin Chrust 1, Sebastien Massart 1, Patrick Laloyaux 1

1ECMWF

Machine Learning has proved to be an innovative, disruptive set of technologies capable of revolutionising many fields of applied science and engineering. A crucial scientific question is whether Machine Learning can have the same impact on Earth system assimilation and prediction, both in a holistic sense and for improving the separate Earth system components. The recent ECMWF-ESA Machine Learning Workshop hosted by ECMWF in October 2020 has provided initial answers to this question and has highlighted some of the opportunities and challenges that need to be overcome to realise the potential of these new technologies. In this talk we will discuss the main ideas that have emerged from the Workshop’s presentations and discussions and provide examples of the on-going work in this area at ECMWF and elsewhere from a data assimilation perspective. Finally, we discuss current examples and future perspectives of the application of machine learning techniques in the Ocean Data Assimilation context.

Starts at 09:10 on 18 May.

Relating model bias and prediction skill in the equatorial Atlantic

François Counillon 1, Noel Keenlyside 2, Thomas Toniazzo 3, Koseki Shunya 4, Teferi Demissie 5, Bethke Ingo 4, Wang Yiguo 6

1NERSC/UoB, 2UoB/NERSC, 3NORCE/UoB, 4UoB, 5NORCE/CGIAR, 6NERSC

We investigate the impact of large climatological biases in the tropical Atlantic on reanalysis and seasonal prediction performance using the Norwegian Climate Prediction Model (NorCPM) in a standard and an anomaly-coupled configuration. Anomaly coupling corrects the climatological surface wind and sea surface temperature (SST) fields exchanged between the oceanic and atmospheric models, and thereby significantly reduces the climatological model biases in precipitation and SST. NorCPM combines the Norwegian Earth system model (NorESM) with the Ensemble Kalman Filter and assimilates SST and hydrographic profiles. We perform a reanalysis for the period 1980-2010 and a set of seasonal predictions for the period 1985-2010 with both model configurations. Anomaly coupling improves the accuracy and the reliability of the reanalysis in the tropical Atlantic, because the corrected model enables a dynamical reconstruction that better satisfies the observations and their uncertainty. Anomaly coupling also enhances seasonal prediction skill in the equatorial Atlantic to the level of the best models of the North American multi-model ensemble, while the standard model is among the worst. However, anomaly coupling slightly damps the amplitude of Atlantic Niño and Niña events. The skill enhancements achieved by anomaly coupling are largest for forecasts started in August and February. There is a strong spring predictability barrier, with little skill in predicting conditions in June. The anomaly-coupled system shows some skill in predicting, from 1 August, the secondary Atlantic Niño-II SST variability that peaks in November-December.

Starts at 10:00 on 18 May.

A New Stochastic Ocean Physics Package and its Application To Hybrid-Covariance Data Assimilation

Andrea Storto 1, Panagiotis Andriopoulos

1CMRE; CNR-ISMAR

Generating optimal perturbations is a key requirement of several data assimilation schemes. Here, we present a newly developed stochastic physics package for ocean models, implemented in the NEMO ocean general circulation model. The package includes three schemes applied simultaneously: stochastically perturbed parameterization tendencies (SPPT), stochastically perturbed parameters (SPP) and stochastic kinetic energy backscatter (SKEB). The three schemes allow for different temporal and spatial perturbation scales. Within a limited-area ocean model configuration, ensemble free-running simulations were performed to assess the impact and reliability of the schemes. They prove complementary in increasing the ensemble spread at different scales and for different diagnostics. The ensemble spread appears reliable; for instance, it proves consistent with the root mean square differences with respect to higher-resolution (sub-mesoscale) simulations that here represent the “truth” (in the sense that they include “unresolved physics”). Interestingly, both the SPPT and the SKEB schemes lead to an increase of eddy kinetic energy at small spatial scales (2-10 km), and contribute to modifying the ensemble mean state, mitigating warm biases near the thermocline through the enhancement of upper-ocean vertical mixing. As an application of the stochastic package, the ensemble anomaly covariances from the ensemble free-running simulations are used to feed large-scale anisotropic covariances that complement smaller-scale ones in a hybrid-covariance regional analysis and forecast system in the Mediterranean Sea. Ensemble-derived covariances are formulated as slowly varying, three-dimensional, low-resolution Empirical Orthogonal Functions (EOFs).
The improvements due to the addition of such covariances to the stationary ones are found to be significant in real-data experiments, as measured by verification skill scores against glider profile data, remotely sensed observations, and current-speed measurements from drifters, radar and moorings.

Starts at 10:20 on 18 May.

Recent development of a supermodel - an interactive multi-model ensemble

Shuo Wang 1, François Counillon 2, Koseki Shunya 3, Noel Keenlyside 4, Alok Kumar Gupta 5, Maolin Shen 6

1Geophysical Institute, University of Bergen, 2NERSC/UoB, 3UoB, 4UoB/NERSC, 5NORCE Norwegian Research Centre AS, 6Geophysical Institute and Bjerknes Centre for Climate Research, University of Bergen

An interactive multi-model ensemble (termed a supermodel), based on three state-of-the-art earth system models (NorESM, MPIESM and CESM), has been developed. The models are synchronized every month by data assimilation. The data assimilation method used is the Ensemble Optimal Interpolation (EnOI) scheme, for which the covariance matrix is constructed from a historical ensemble. The assimilated data are a weighted combination of the monthly output sea surface temperature (SST) of the individual models, while the full ocean state is constrained by the covariance matrix. The synchronization of the models during the simulation distinguishes this approach from the traditional multi-model ensemble approach, in which model outputs are combined a posteriori.
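An EnOI-style update of this kind can be sketched as follows (illustrative only; the sizes, fields, and weights are invented, and the observation operator is taken as the identity): a static ensemble of historical anomalies supplies the covariance, and the pseudo-observations are a weighted combination of the models' SST fields.

```python
import numpy as np

# Schematic EnOI update for supermodel synchronization (illustrative only).
# A static historical-anomaly ensemble supplies the covariance; no ensemble
# is propagated forward in time.
rng = np.random.default_rng(4)
n = 12
hist = rng.normal(size=(30, n)).cumsum(axis=1)     # historical model states
anomalies = hist - hist.mean(axis=0)               # static anomaly ensemble
B = anomalies.T @ anomalies / (len(hist) - 1)      # static covariance matrix

x_b = rng.normal(size=n)                           # one model's current state
# pseudo-observations: weighted combination of the models' SST fields
sst_models = rng.normal(size=(3, n))
weights = np.array([0.5, 0.3, 0.2])
y = weights @ sst_models                           # all n points observed
R = 0.2 * np.eye(n)                                # observation-error covariance
K = B @ np.linalg.inv(B + R)                       # gain, with H = identity
x_a = x_b + K @ (y - x_b)                          # nudge toward the weighted SST
```

Repeating this update every (model) month for each member is what keeps the three models synchronized, while the static B spreads the SST constraint into the rest of the state.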

We compare different approaches for estimating the supermodel weights: equal weights, and spatially varying weights based on minimisation of the bias. The performance of these supermodels is compared to that of the individual models and the multi-model ensemble for the period 1980 to 2006. SST synchronisation is achieved in most oceans and in dynamical regimes such as ENSO. The supermodel with spatially varying weights outperforms the supermodel with equal weights, reducing the SST bias by over 30% compared to the multi-model ensemble. The temporal variability of the supermodel is slightly on the low side but improved compared to the multi-model ensemble. The simulations are being extended to 2100 to assess the simulation of climate variability and climate change.

Starts at 10:40 on 18 May.

Near-surface SST and SSS variability: modeling and observations

Santha Akella , Eric Hackert 1

1GMAO/NASA

In this presentation we focus on coupled atmosphere-ocean interactions characterizing the high temporal and spatial variability of temperature and salinity. For example, near-surface temperature goes through diurnal cycles in sea surface temperature (SST) due to the exchange of heat and momentum. In addition, near-surface salinity changes rapidly with rain and wind-mixing events. For a proper representation of such (diurnal) variability, both the model and the observing systems need to be jointly modified and tuned. This presentation highlights the importance of both model tuning and the impact of various observing strategies.

Only recently, by combining "sufficiently" high vertical and horizontal resolution (e.g. 75 levels, 1/4 degree) and sub-daily atmospheric forcing fields, have ocean models started to resolve realistic diurnal variability. However, the computational expense of such high vertical resolution is burdensome in the context of coupled modeling and data assimilation (DA). An alternative approach is to parameterize this diurnal variability with a prognostic model embedded within the ocean model. In the first part of this presentation, we present the formulation of such a model and illustrate its effectiveness in modeling SST diurnal cycles.

The second half of our talk focuses on the observational aspects of diurnal variation. Moored buoys report temperature and salinity (T & S) profiles at frequencies that vary from 10-minute to hourly averages. Following the modeled diurnal fields, these observations should ideally be assimilated at their "native" temporal frequency to extract the most information from them. However, many ocean DA systems assimilate daily mean temperature and salinity profiles, e.g. UMD SODA version 3 (Carton et al., 2018), GMAO S2S version 2 (Molod et al., 2020), NCEP GODAS (Behringer 2007), ECMWF OCEAN5/ORAS5 (Zuo et al., 2019), Met Office FOAM (Waters et al., 2015), and JMA MOVE-G (Fujii et al., 2012). We assess the nature of the errors due to this inconsistency and their implications for heat and salt fluxes. These issues are highly relevant to the development of "seamless" coupled DA and reanalysis systems, especially as coupled DA systems advance toward strongly coupled DA, where the combined atmosphere and ocean error covariances are retained.

Starts at 11:00 on 18 May.

Accurate parameter estimation for a Global Tide and Surge Model with Model Order Reduction

Xiaohui Wang 1, Martin Verlaan 2, Hai Xiang Lin 1

1Delft University of Technology, 2Deltares

Accurate parameter estimation for a global tide model benefits from the use of long time-series at many locations. However, as the number of measurements increases, so do the computational time and memory requirements of the assimilation, especially for ensemble-based methods that assimilate the measurements in one batch. We developed a memory-efficient and computationally cheaper parameter estimation scheme using a model order reduction approach. Proper Orthogonal Decomposition (POD) is a technique for reducing the state variables of a high-dimensional system to a smaller linear subspace; it is generally applied to spatial patterns, but we instead used it to reduce the temporal patterns of the model output. In our application, an iterative least-squares algorithm called DUD is used to estimate bathymetry for the high-resolution Global Tide and Surge Model (GTSM). The observations are 1973 time-series derived from the FES2014 data-set. We successfully described the model output with a smaller subspace of temporal patterns. To further improve the estimation accuracy, an outer-loop iteration was developed, similar to its common use in incremental 4D-Var, in which the model increments are evaluated on a coarser grid to reduce the computational cost. The outer loop uses the optimized parameters obtained from the previous DUD process as the new first guess to update the initial high-resolution model output and restart the next DUD procedure, which leads to better agreement with the high-resolution model. Experiments show that memory requirements can be sharply reduced, by a factor of 20 in our case, without loss of estimation accuracy. Implementing the outer-loop iteration further improves the estimation performance: the RMSE is reduced from 5.6 cm in the initial model to 3.67 cm after the estimation.
The strong performance is also demonstrated by a one-year forecast analysis, in both the time and frequency domains, against the FES2014 and UHSLC data-sets.
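The temporal-pattern reduction described above can be illustrated with a minimal sketch: POD via an SVD of the (locations x times) output matrix, keeping the leading right singular vectors as temporal patterns. All names, the synthetic data, and the variance threshold below are illustrative assumptions, not the GTSM/DUD implementation:

```python
import numpy as np

def pod_time_reduction(Y, energy=0.99):
    """Reduce the time dimension of model output Y (n_locations x n_times)
    by projecting onto leading temporal patterns (right singular vectors).
    Hypothetical sketch; the energy threshold is an assumption."""
    # Center each time series on its mean
    Y0 = Y - Y.mean(axis=1, keepdims=True)
    # SVD: rows of Vt are temporal patterns
    U, s, Vt = np.linalg.svd(Y0, full_matrices=False)
    # Keep enough modes to capture the requested fraction of variance
    frac = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(frac, energy)) + 1
    basis = Vt[:r]            # r x n_times temporal basis
    coeffs = Y0 @ basis.T     # reduced representation, n_locations x r
    return basis, coeffs, r

# Example: 1973 series of 720 hourly values, dominated by one tide-like
# constituent, compress to a handful of temporal modes
rng = np.random.default_rng(0)
t = np.linspace(0, 30, 720)
signal = np.sin(2 * np.pi * t / 0.52)
Y = np.outer(rng.normal(size=1973), signal) + 0.01 * rng.normal(size=(1973, 720))
basis, coeffs, r = pod_time_reduction(Y)
print(r)  # number of retained temporal modes
```

The reduced coefficients, rather than the full time-series, can then be passed to an iterative least-squares estimator, which is what makes the batch assimilation affordable in memory.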

Starts at 12:30 on 18 May.

Impact assessment of satellite observation in the Mercator Ocean global 1/12° system

Elisabeth Remy 1, Jean-Michel LELLOUCHE 2, Mounir Benkiran 3

1Mercator Ocean International, 2Mercator Ocean, 3Mercator Ocean

The use of ocean reanalyses and forecasts is becoming increasingly common for a large variety of applications. User requirements trend toward higher resolution, leading to increased model resolution and complexity to better represent a larger spectrum of ocean phenomena. In parallel, ocean observing systems are also evolving to better capture smaller-scale and higher-frequency ocean features.
To benefit from new ocean observations and model evolution, developments are made in the system to better control the mesoscale dynamics and the variability of the ocean surface and mixed layer. Impact assessment studies are regularly conducted with new or improved observation data sets. We will present and discuss the ongoing effort to improve the efficiency of high-resolution observation data assimilation in the global Mercator Ocean system at 1/12°. The talk will focus on satellite observations: sea level, including the MDT, sea surface salinity, and sea ice. It has also been shown that the assimilation of physical observations has an impact on tracer transport, visible in particle trajectories or nutrients, and thus affects the BGC forecasts. Such indirect diagnostics will also be presented when assessing the impact of physical observations, as they give a complementary view to the usual innovation-based statistical diagnostics.

Starts at 13:00 on 18 May.

Assimilating wide-swath altimeter observations in a high-resolution shelf-seas analysis and forecasting system

Robert King 1, Matthew Martin 2

1UK Met Office, 2Met Office

The impact of assimilating simulated wide-swath altimetry observations from the upcoming SWOT mission has been assessed using Observing System Simulation Experiments (OSSEs). This mission has the potential to bring about a step change in our ability to observe the ocean mesoscale, but work to ameliorate the effects of correlated errors in the processing of the SWOT observations and the assimilation is likely to be crucial. Our experiments use the Met Office 1.5 km resolution North-West European Shelf analysis and forecasting system. In an effort to understand the importance of future work to account for correlated errors in the data assimilation scheme and to reduce the magnitude of these errors in the observations themselves, we simulated SWOT observations with and without realistic correlated errors. These were assimilated in OSSEs along with simulated observations emulating the standard observing network, also with realistic errors added. We will discuss the potential impact of assimilating SWOT observations and the effectiveness of simple measures to reduce the impact of the large correlated errors expected with this instrument.

Starts at 13:20 on 18 May.

The impact of assimilating novel observations on prediction of transport and eddies in Australia’s Western Boundary Current System

Collette Kerry 1, Moninya Roughan 1, Brian Powell 2, Peter Oke 3

1University of New South Wales, NSW, Australia, 2University of Hawaii at Manoa, 3CSIRO Marine and Atmospheric Research

In the South Pacific’s Western Boundary Current, the East Australian Current (EAC) System, we combine a high-resolution (2.5-6 km) numerical ocean model with an unprecedented observational data set, using 4-dimensional variational data assimilation. In addition to the traditional data streams (satellite-derived SSH and SST, Argo profiling floats and XBT lines), we exploit novel observations collected as part of Australia's Integrated Marine Observing System (IMOS, www.imos.org.au). These include velocity and hydrographic observations from a deep-water mooring array and several moorings on the continental shelf, radial surface velocities from a high-frequency (HF) radar array, and hydrographic observations from a suite of ocean glider missions. The impact of the novel observations on estimates of the WBC System is assessed in two ways. Firstly, a comparison of experiments with and without the novel observations allows us to assess their value in state estimation and prediction of WBC transport and eddy structure. Secondly, variational methods allow us to quantify directly how each observation contributes to the state-estimate solution. Using the reanalysis, we calculate the impacts of observations from various platforms in informing model estimates of volume transport and eddy kinetic energy in the EAC. The most influential observations are, in this order: the satellite-derived SST, the radials from an HF radar array midway along the coast, the satellite-derived SSH, the ocean glider observations, and data from a full-depth mooring array in the northern, upstream portion of the domain. Not only do the HF radar observations have a high impact on transport estimates at the array location, they also have a significant impact both upstream and downstream. Likewise, the impact of the mooring array is far-reaching, contributing to transport estimates hundreds of kilometres downstream of its location. The observation impact of deep gliders deployed into eddies is particularly high.
Significantly, we find that observations taken in regions with greater natural variability contribute most to constraining the model estimates, and subsurface observations have a high impact relative to their number. The challenge of correctly representing the depth structure of the current and its eddies in data assimilation is discussed. This work provides new information on the value of specific observation platforms for prediction of the EAC and motivates further work on improving prediction of the current’s separation and eddy-shedding dynamics.

Starts at 13:40 on 18 May.

Investigating the impact of satellite total surface current velocities assimilation in global ocean forecasting systems

Jennifer Waters 1, Matthew Martin 1, Isabelle Mirouze 2, Elisabeth Remy 3, Robert King 4, Ubelmann Clement 5, Gaultier Lucile 6

1Met Office, 2Capgemini DEMS, 3Mercator Ocean International, 4UK Met Office, 5OceanNext, 6OceanDataLab

Prediction of ocean surface velocities remains a challenging and crucial aspect of operational ocean forecasting systems. Accurate surface velocities are important for coupled ocean/atmosphere/sea-ice/wave forecasting and for applications such as search and rescue, offshore oil and gas operations, and shipping. Surface velocities are not routinely assimilated in global forecasting systems, largely due to the very limited number of ocean velocity observations. New opportunities for assimilating surface velocities into these systems should be provided by proposed satellite missions designed to observe ocean surface velocities, such as Sea surface KInematics Multiscale monitoring (SKIM).

The ESA Assimilation of Total Surface Current Velocity (A-TSCV) project focuses on the design and implementation of the assimilation of synthetic SKIM total surface current velocities, and on reporting its impact. The project will use observing system simulation experiments (OSSEs) to test the assimilation methodology and provide feedback on the observation requirements for future satellite missions. Synthetic observations are being generated from a high-resolution nature run for all standard data types (sea surface temperature, sea-ice concentration, sea level anomaly and profiles of temperature and salinity), as well as for the new observations expected from SKIM-like satellite missions. Two operational global ocean forecasting systems are being developed to assimilate these data in a set of coordinated OSSEs: the FOAM system run at the Met Office and the Mercator Ocean system. We will present an overview of the project, the design of the experiments and the data assimilation developments being made to effectively assimilate the surface velocity data into these systems.

Starts at 14:30 on 18 May.

The Joint Effort for Data assimilation Integration

Yannick Tremolet 1

1JCSDA

The long-term objective of the Joint Effort for Data assimilation Integration (JEDI) is to provide a unified data assimilation framework for research and operational use, for different components of the Earth system (including coupled systems), and for different applications. The aim is to reduce or avoid redundant work within the community and to increase the efficiency both of research and of the transition from development teams to operations.

In a well-designed software system, teams can develop different aspects in parallel without interfering with other teams' work and without breaking the components they are not working on. Scientists can be more efficient by focusing on their area of expertise without having to understand all aspects of the system. JEDI fully implements this separation of concerns. The concept of models is clearly separated from observation handling through clear interfaces. The data assimilation algorithms are themselves separated from the model-space and observation-space components. Generic code is used wherever possible to further reduce duplication of effort. This includes generic observation quality control, bias correction and observation operators on the observation side, as well as generic background error covariance matrices.

In this talk, an overview of the system and the current status of the implementation for several Earth-system components, with an emphasis on marine components, will be presented. Since JEDI can handle data assimilation for all major components of the Earth-system in a generic manner, generic coupled data assimilation can be explored. Initial steps in this direction will be discussed.

Starts at 15:00 on 18 May.

Integration of Ocean Data Assimilation System in the NOAA-UFS R2O Project

Rahul Mahajan 1, Kleist Daryl 2, Kim Jong 3, Vernieres Guillaume 4, Sluka Travis 4, Teng Yi-Cheng 3, Liu Xiao 3, Liu Ling 3, Li Xu 3

1NOAA/NWS/NCEP, 2NOAA/NWS/NCEP/EMC, 3IMSG @ NOAA/NWS/NCEP/EMC, 4JCSDA

NOAA’s current operational ocean forecast and monitoring systems are based on various models and analysis systems. The Real-Time Ocean Forecast System (RTOFS v2.0) is based on 1/12-degree HYCOM-CICE4 with the Navy Coupled Ocean Data Assimilation (NCODA), while the operational Global Ocean Data Assimilation System (GODAS) uses an older-generation ocean model, MOM3, at 1-degree resolution for ocean monitoring and climate prediction. In the context of NOAA’s forecasting system modernization effort, we provide an overview of the scope of the NOAA Unified-Forecasting-System Research-to-Operations (UFS-R2O) project, with a focus on the integration of the ocean data assimilation systems through the Joint Effort for Data assimilation Integration (JEDI). The MOM6 and CICE6 models form the core of the NOAA-NCEP Next Generation Global Ocean Data Assimilation System (NG-GODAS). We plan to apply the JEDI-based NG-GODAS to NOAA’s future operational versions of the Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS). In this presentation, an assessment of the JEDI-based development is discussed, with interim 40-year reanalysis experiment results for the MOM6-CICE6 global 1-degree model configuration. The software compatibility of the prototype NG-GODAS system is also demonstrated with various model configurations and data assimilation applications. Latest updates and key milestones of other JEDI-based NOAA NCEP ocean data assimilation projects are summarized, including the UFS sub-seasonal-to-seasonal (S2S) initialization for ¼-degree MOM6 and CICE6 model configurations, development of a near-surface sea temperature analysis, biogeochemical data assimilation of satellite ocean color products, and high-resolution MOM6 regional data assimilation activity to support forecasting of extreme weather events.

Starts at 15:20 on 18 May.

Application of the BUMP library in the SOCA system

Benjamin Menetrier 1, Guillaume Vernieres 2, Travis Sluka 2

1IRIT - JCSDA, 2JCSDA

The BUMP library (B matrix on an Unstructured Mesh Package) is a core component of the JEDI project (Joint Effort for Data assimilation Integration), led by the JCSDA. Using ensembles of forecasts, this generic tool can estimate parameters for various background error covariance models (static, localized ensemble, hybrid), and it also implements their efficient application to a vector. It can work on any kind of horizontal grid and handle complex boundaries, which makes it well suited to ocean DA systems. The first part of this talk describes BUMP: its motivations, capabilities and implementation strategies.

The JCSDA has also developed a MOM6 interface to the JEDI project, which is currently being implemented within the UFS at NOAA for global and regional initialization of the ocean and cryosphere. It is also being implemented at the GMAO within a weakly coupled DA system targeting reanalysis and NWP forecast initialization. The workhorse static B matrix used for these implementations is based on parameterized background errors and simple balance operators for modeling the cross-covariances. The purpose of this study is to design and test a suite of covariance models based on newly available features in BUMP. These covariance models are tested against the current configuration of the JCSDA ocean and sea-ice reanalysis system over a period of several months. The metrics of comparison include observation-space statistics of innovations as well as standard ocean and ice diagnostics.

Starts at 08:40 on 19 May.

PDAF - features and recent developments

Lars Nerger 1, Qi Tang 2, Longjiang Mu 1

1Alfred Wegener Institute, 2Institute of Geographic Sciences and Natural Resources Research, CAS

PDAF, the Parallel Data Assimilation Framework (http://pdaf.awi.de), is an open-source framework for ensemble data assimilation. PDAF is designed to be particularly easy to use, so that a data assimilation system can be built quickly while remaining computationally efficient. PDAF consists of an ensemble component that provides online-coupled data assimilation functionality, with data transfers in memory using the MPI parallelization standard, achieved by inserting three function calls into the model code. These additions convert a numerical model into a data-assimilative model, which can be run like the original model but with additional options. While this approach is particularly efficient, it is also possible to use separate programs to compute the forecasts and the assimilation analysis update. PDAF further provides data assimilation methods (solvers), in particular ensemble Kalman filters and particle filters. Tools for diagnostics, ensemble generation, and the generation of synthetic observations for OSSEs or twin experiments provide additional functionality. PDAF is used for research and teaching, but also operationally: it runs at the CMEMS forecasting center for the Baltic Sea and in the Chinese Global Ocean Forecasting System (CGOFS). A recent addition to PDAF is OMI, the Observation Module Infrastructure, a library extension for observation handling. OMI is inspired by object-oriented programming but, for ease of use, is not coded using classes. Recent developments further include support for strongly coupled data assimilation across the components of Earth system models; model bindings for NEMO, SCHISM, and the climate model AWI-CM; and an ensemble-variational solver, which is under development. This presentation discusses PDAF's features and recent infrastructure developments.
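As a rough illustration of the online-coupling pattern described above (an ensemble loop wrapped around an unchanged model time step, with an analysis call at observation times), here is a minimal toy sketch. The function names and the simple stochastic EnKF update are our own illustrative stand-ins, not the actual PDAF Fortran API:

```python
import numpy as np

def init_ensemble(n_x, n_ens, rng):
    """Replace the single model state with an ensemble (the 'init' call)."""
    return rng.normal(0.0, 1.0, size=(n_ens, n_x))

def assimilate(ensemble, obs, obs_idx, obs_err, rng):
    """Analysis update at observation times (the 'assimilate' call).
    A simple stochastic EnKF with perturbed observations (illustrative)."""
    n_ens = ensemble.shape[0]
    X = ensemble.T.copy()                       # n_x x n_ens
    A = X - X.mean(axis=1, keepdims=True)       # ensemble perturbations
    HA = A[obs_idx, :]                          # observed perturbations
    S = HA @ HA.T / (n_ens - 1) + obs_err**2 * np.eye(len(obs_idx))
    K = (A @ HA.T / (n_ens - 1)) @ np.linalg.inv(S)  # Kalman gain
    for m in range(n_ens):
        pert_obs = obs + rng.normal(0.0, obs_err, size=len(obs))
        X[:, m] += K @ (pert_obs - X[obs_idx, m])
    return X.T

# Time loop of the original model, with the DA calls inserted
rng = np.random.default_rng(0)
n_x, n_ens = 50, 20
ens = init_ensemble(n_x, n_ens, rng)        # inserted call 1: ensemble init
truth = np.full(n_x, 2.0)                   # synthetic truth for the toy OSSE
for step in range(1, 101):
    ens = 0.95 * ens + 0.1                  # unchanged model step (per member)
    truth = 0.95 * truth + 0.1
    if step % 10 == 0:                      # inserted calls 2-3: fetch & analyze
        obs_idx = np.arange(0, n_x, 5)
        obs = truth[obs_idx] + rng.normal(0.0, 0.1, size=len(obs_idx))
        ens = assimilate(ens, obs, obs_idx, obs_err=0.1, rng=rng)

err = np.abs(ens.mean(axis=0) - truth).mean()
print(err)
```

The point of the pattern is that the model's own time loop is untouched; the ensemble dimension and the analysis step are added around it, which is what keeps the online-coupled approach efficient.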

Starts at 09:00 on 19 May.

Ensemble forecasting greatly expands the prediction horizon for internal “weather” of the ocean

Sergey Frolov, Prasad Thoppil 1, Clark Rowley 2, Carolyn Reynolds 2, Gregg Jacobs 1, Joseph Metzger, Patrick Hogan 3, Neil Barton 1, Alan Wallcraft 4, Ole Martin Smedstad 5, Jay Shriver 1

1NRL, 2US Naval Research Laboratory, 3NOAA/NCEI, 4FSU, 5Perspecta

Mesoscale eddies dominate the energetics of the ocean and modify mass, heat and freshwater transport and primary production in the upper ocean. Eddy-resolving ocean models (horizontal resolution finer than 10 km in mid-latitudes) show improved representation of mesoscale dynamics. However, mesoscale eddies, which are hard to constrain using available observations, are large contributors to the forecast error. As a consequence, the forecast skill horizon for ocean mesoscales in current operational models is shorter than 10 days. Here we show that this lack of predictive skill is due to high uncertainty in the initial location and forecast of mesoscale features that is not captured by the current generation of deterministic ocean modeling and assimilation systems. Using ensemble simulations, we account for this uncertainty, filter out unconstrained scales, and, as a result, significantly extend the predictability of the ocean mesoscales (to between 20 and 40 days) relative to deterministic models. The results of this research suggest that leveraging advancements in ensemble analysis and forecasting should complement the current focus on high-resolution modeling of the ocean.

Starts at 10:00 on 19 May.

Requirements on Ocean Data Assimilation Methods to meet Seamless Predictions needs

Magdalena Alonso Balmaseda 1, Frederic Vitart 1, Chris Roberts 1, Michael Mayer 2, Beena Balan-Sarojini 1, Steffen Tietsche 1, Tim Stockdale 1, Antje Weisheimer 3, Hao Zuo 1

1ECMWF, 2ECMWF/University of Vienna, 3ECMWF/University of Oxford

A priority for ocean data assimilation methods is the appropriate initialization of the ocean for seamless forecasts of weather and climate, from days to decades. The relevant ocean processes span a wide range of spatial and temporal scales, and their correct initialization poses a major challenge for data assimilation methodology. At the time range of days to weeks, assimilation methods targeting the accurate and balanced initialization of sharp SST fronts and the ocean mixed layer are required. As we move to seasonal time scales, the balanced initialization of the thermal structure in the upper few hundred meters and of equatorial waves becomes important. Decadal forecasts require initialization of the deeper parts of the ocean and the associated transports. Methods for consistent and efficient assimilation of sea-ice information are also needed. Extended-range, seasonal and decadal predictions require historical reforecasts spanning several decades, which are initialized from ocean or coupled reanalyses. Consistency between the historical reanalysis and the real-time ocean initial conditions is essential. Reliable multi-decadal ocean reanalyses need methods for dealing with model error, so as to prevent spurious climate signals arising from the changing ocean observing system. This methodology should be robust across a variety of climate regimes. This presentation illustrates these different aspects with experiments from the ECMWF extended-range and seasonal forecasting systems. It also discusses the need for evaluation methodologies for the development of multi-scale data assimilation methods.

Starts at 10:30 on 19 May.

Evaluation of eddy-properties in operational oceanographic analysis systems

Gregory Smith 1, Anne-Sophie Fortin 2

1ECCC, 2ECCC & McGill University

Recent studies have shown that the presence of oceanic eddies affects the intensification of high-impact tropical cyclones. Many operational weather prediction systems (e.g. in Canada, the UK and Europe) have now moved to fully coupled atmosphere-ocean prediction models. As a result, the accuracy with which ocean analysis systems are able to constrain the presence and properties of oceanic eddies may affect tropical cyclone forecast skill. While numerous eddy identification and tracking methods have been developed for oceanic eddies, specific methods and metrics tailored to verifying the skill of ocean analyses and forecasts in capturing these features are lacking. Here we apply open-source eddy-tracking software and adapt it to match eddies between gridded observational analyses and two ocean analysis products of different resolution (1/4° and 1/12°). The ocean analysis products are the Global and Regional Ice Ocean Prediction Systems run operationally at Environment and Climate Change Canada. The systems share a common data assimilation approach, with the main differences between them being the model resolution and the inclusion of tides in the regional system. A contingency-table approach is taken to identify hits, misses and false alarms, providing statistics on the probability of detection and the false alarm ratio. These statistics are investigated in terms of their sensitivity to eddy properties (radius, amplitude). The results clearly demonstrate the added value of higher resolution in accurately representing eddy features. The higher-resolution analyses provide a higher probability of detection with a lower false alarm ratio. Errors in eddy radii are also reduced in the 1/12° analyses.
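The contingency-table scores mentioned above follow the standard verification definitions; a minimal sketch (with illustrative counts, not the study's results) is:

```python
def detection_stats(hits, misses, false_alarms):
    """Contingency-table verification scores for matched eddies.
    POD = hits / (hits + misses); FAR = false_alarms / (hits + false_alarms).
    CSI (critical success index) penalizes both misses and false alarms."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    csi = hits / (hits + misses + false_alarms)
    return pod, far, csi

# Hypothetical counts for one analysis product
pod, far, csi = detection_stats(hits=80, misses=20, false_alarms=10)
print(pod, far, csi)  # 0.8, ~0.111, ~0.727
```

Comparing these scores between the 1/4° and 1/12° products, stratified by eddy radius and amplitude, is the kind of sensitivity analysis the abstract describes.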

Starts at 10:50 on 19 May.

Regional Analysis of Indian OceaN (RAIN)

Balaji Baduru 1, Arya Paul 1, Biswamoy Paul 1, Francis P.A. 1

1Indian National Centre for Ocean Information Services

Authors: Balaji B 1,2,3; Arya Paul 1; Biswamoy Paul 1; Francis P. A. 1;

Affiliation:

  1. Indian National Centre for Ocean Information Services, Ministry of Earth Sciences, Govt. of India, Hyderabad, 500090, India
  2. Indian Institute of Tropical Meteorology, Ministry of Earth Sciences, Govt. of India, Pune, 411008, India
  3. Department of Marine Geology, Mangalore University, Mangalagangotri, Karnataka, 574199, India

RAIN (Regional Analysis of Indian OceaN) is a data assimilation system developed at INCOIS, in which ROMS (Regional Ocean Modeling System), an ocean general circulation model suited to regional basins and used by INCOIS as the forecast model for the Indian Ocean, is interfaced with the Local Ensemble Transform Kalman Filter (LETKF) data assimilation scheme. The system assimilates in-situ temperature and salinity profiles and satellite track data of sea-surface temperature (SST). The ensemble members of the assimilation system are initialized with different model coefficients, such as diffusion and viscosity parameters, and with different atmospheric forcing. In addition, the ensemble members use two different mixing schemes – K-profile parameterization and Mellor-Yamada. This strategy exploits the benefits of varied mixing parameterizations and helps arrest filter divergence. The assimilation system is validated extensively against multiple observations, from RAMA moorings to ADCP and satellite observations, covering both assimilated variables, such as temperature and salinity, and independent variables, such as sea-level anomaly and ocean currents. The assimilation system simulates the ocean state better than the previous operational ROMS setup. The improvement extends to all vertical levels, with better correlation with observations and reduced root-mean-square error.
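The analysis step of the LETKF scheme mentioned above can be sketched in ensemble space. The following is an illustrative implementation of the standard (global) ETKF update that LETKF applies independently within each local region; the variable names and the toy example are our own, not the RAIN configuration:

```python
import numpy as np

def etkf_analysis(X, y, obs_idx, r_var, infl=1.0):
    """One ETKF analysis step, the building block LETKF applies per local
    region. X: n_x x k ensemble (members as columns); y: observations;
    obs_idx: observed grid points (H is a selection operator here);
    r_var: observation-error variance; infl: covariance inflation factor.
    Illustrative sketch following the standard ETKF equations."""
    n_x, k = X.shape
    xm = X.mean(axis=1, keepdims=True)
    Xp = X - xm                            # state perturbations
    Yp = Xp[obs_idx, :]                    # observed perturbations
    d = y - X[obs_idx, :].mean(axis=1)     # innovation of the ensemble mean
    C = Yp.T / r_var                       # Yp^T R^{-1} (diagonal R)
    Pa = np.linalg.inv((k - 1) / infl * np.eye(k) + C @ Yp)
    wm = Pa @ (C @ d)                      # weights for the mean update
    # Symmetric square root of (k-1) Pa gives the perturbation transform
    evals, evecs = np.linalg.eigh((k - 1) * Pa)
    W = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    return xm + Xp @ (wm[:, None] + W)     # analysis ensemble

# Toy usage: 8 members, 3-point state, first two points observed near zero
rng = np.random.default_rng(0)
X = 5.0 + rng.normal(size=(3, 8))
Xa = etkf_analysis(X, y=np.zeros(2), obs_idx=np.array([0, 1]), r_var=0.01)
print(np.abs(Xa.mean(axis=1)))  # analysis mean pulled toward the observations
```

In LETKF proper, this update is repeated for each grid point using only nearby observations, which is what makes the filter local and parallelizable.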

Starts at 11:10 on 19 May.

Forecast Sensitivity to Observations and the U.S. Integrated Ocean Observing System

Andrew Moore 1, Christopher Edwards 1, Julia Levin 2, Hernan G. Arango 2, John Wilkin 2, Brian Powell 3

1University of California Santa Cruz, 2Rutgers University, 3University of Hawaii

The U.S. Integrated Ocean Observing System (IOOS) forms the backbone of real-time ocean analysis-forecast systems of U.S. territorial waters. In addition to satellite remote sensing, the IOOS is augmented with in situ observations from a variety of platforms including Argo floats, buoys and gliders. Remote sensing observations of surface currents are also available from an extensive national network of coastal HF radars. Maintenance of these observing systems is labor-intensive and costly. Routine monitoring of the impact of data from each element of the observing network is therefore recognized as an important activity, not only for maintaining the array and demonstrating its value, but also as an aid for planning future expansions of the observing system. This talk will focus on current efforts to quantify forecast sensitivity to observations (FSO) in analysis-forecast systems of the U.S. west coast and east coast circulations.