Joint ECMWF/OceanPredict workshop on Advances in Ocean Data Assimilation

Timezone: Europe/London
Description

Workshop objectives

This workshop aimed to bring together experts in ocean and coupled data assimilation to discuss the latest progress in the field, outline the main challenges and identify new directions for research.

Background

Ocean data assimilation underpins many forecasting and reanalysis applications. Seasonal forecasts rely on ocean data assimilation to produce initial conditions for the ocean component of coupled forecasts. The production of multi-decadal ocean and coupled reanalyses has enabled the calibration of these forecasts and allows studies into the climate of the ocean’s recent past. Higher-resolution global and regional short-range forecasts of the ocean also require initialisation through data assimilation, which has been the focus of groups contributing to the Data Assimilation Task Team (DA-TT) of OceanPredict. With the advent of coupled NWP at many operational centres, including ECMWF, improving methods for ocean and coupled data assimilation is ever more important.

Meeting format

The workshop took place online over four days:

• The first three days consisted of invited and contributed oral presentations, together with dedicated poster sessions.

• A series of working group discussions took place on the fourth day, with a final plenary session to discuss the main recommendations.

Programme
    • 09:00–09:20
      Introduction and welcome 20m
      Speakers: Andy Brown (ECMWF), Matthew Martin (Met Office)
    • 09:20–12:00
      Theme 1: Ocean and coupled reanalysis
      Conveners: Dr Andrea Storto (CMRE; CNR-ISMAR), Magdalena Alonso Balmaseda (ECMWF)
      • 09:20
        Invited talk: Evaluation of the coupled atmosphere-ocean reanalysis and future development of the coupled data assimilation system in Japan Meteorological Agency 30m

        The Meteorological Research Institute (MRI) of the Japan Meteorological Agency (JMA) has developed a coupled atmosphere-ocean data assimilation system, MRI-CDA1, based on the coupled atmosphere-ocean general circulation model and the separate atmosphere and ocean analysis routines adopted in JMA’s operational weather and climate prediction systems. In this presentation, we show how tropical precipitation and sea surface air temperature (SAT) fields are improved in a coupled analysis generated by MRI-CDA1 relative to an uncoupled reanalysis produced with the same system. In the coupled analysis, the sea surface temperature (SST) adjustment to the atmosphere amplifies the lead/lag correlations between subseasonal variations of SST and precipitation. The atmosphere-ocean coupling generates SST variations associated with tropical instability waves, and the SAT field responds to these SST variations. The SST-precipitation and SST-SAT relationships on the weather timescale are also recovered in the coupled reanalysis, although they are hardly seen in the uncoupled one. The coupled model physics generates weather-timescale SST variations consistent with the atmospheric state, and the atmospheric parameters respond to the SST variations through the coupled model physics. We will also introduce the future development of the coupled data assimilation system, using the new global ocean data assimilation system for JMA’s coupled predictions based on a four-dimensional variational method.

        Speaker: Fujii Yosuke (JMA/MRI)
      • 09:50
        NOAA-NCEP Next Generation Global Ocean Data Assimilation System (NG-GODAS): Evaluation of 40-year Reanalysis 20m

        NOAA’s Unified Forecast System (UFS) incorporates the MOM6 ocean and CICE6 sea-ice models as the ocean component of the future operational global weather (GFS), sub-seasonal (GEFS) and seasonal (SFS) forecasting systems. Furthermore, the UFS modeling infrastructure has been combined with the Joint Effort for Data Assimilation Integration (JEDI) project to establish NOAA’s Next Generation Global Ocean Data Assimilation System (NG-GODAS). An interim 40-year NG-GODAS reanalysis experiment is underway, assimilating various types of satellite and in-situ observations: satellite sea surface temperature, sea surface salinity, in-situ temperature and salinity, absolute dynamic topography, sea-ice concentration, and sea-ice freeboard thickness. A data atmosphere model with bias-corrected atmospheric forcing is used in the reanalysis experiment: the NCEP Climate Forecast System Reanalysis (CFSR) for 1979-1999 and the Global Ensemble Forecast System (GEFS) for 2000-2020. Preliminary results show that NG-GODAS provides significantly improved temperature and salinity analysis fields compared to the current operational systems. We provide initial diagnostic validation results together with a detailed overview of the NG-GODAS analysis system.

        Speaker: Dr Jong Kim (IMGS@NOAA/NWS/NCEP/EMC)
      • 10:10
        Uncertainties in reconstruction of the past ocean climate with ocean reanalyses 20m

        A historical reconstruction of ocean and sea-ice states, or ocean reanalysis (ORA), can be produced using an ocean and sea-ice model simulation constrained by boundary forcing fluxes and by observations through data assimilation. Long-term ocean reanalyses can provide invaluable information for climate monitoring. However, a reliable reconstruction of the past ocean climate strongly relies on the effectiveness of the ocean reanalysis system, as well as on the availability and consistency of the global ocean observing system. Here we assess uncertainties in some key climate change indicators (CCIs) estimated using the ECMWF ORA systems (ORAS4, ORAS5, ORAP6). An extended ensemble approach is used, taking into account errors from the climate model and boundary forcing fluxes, as well as deficiencies in the observing networks and data assimilation methods. Climate signals such as changes in ocean heat and salt content, and inter-annual variability of sea level and sea ice, are presented. Transports from large-scale overturning circulations such as the AMOC are also discussed. Even though robust climate signals are achievable in the post-Argo era in ORAs, large uncertainties still appear in the data-sparse earlier decades, or where observing networks are intermittent or have been discontinued. Quantifying and characterising these uncertainties in the ocean climate can provide valuable guidance for future enhancement of the global ocean observing system, and can also help improve long-term predictions such as decadal forecasts and climate projections.

        Speaker: Dr Hao Zuo (ECMWF)
      • 10:30
        Discussion 10m
      • 10:40
        Coffee break 20m
      • 11:00
        The Estimating the Circulation and Climate of the Ocean (ECCO) “Central Estimate”: a Multi-decadal, Coupled Ocean Reanalysis 20m

        The Estimating the Circulation and Climate of the Ocean (ECCO) Consortium has been producing dynamically and kinematically-consistent global ocean state estimates for nearly two decades. Our current focus is Version 4 of the “Central Estimate”, a data-constrained global, 1-degree, coupled ocean, sea-ice, and thermodynamic ice-sheet model that spans the period 1992-present. The coupled ocean model is made consistent with a diverse and heterogeneous set of ocean, sea-ice, and ice-sheet data in a least-squares sense by iteratively adjusting a set of control parameters using the gradient of an uncertainty-weighted model-data misfit cost function. The gradient of the cost function is provided by the automatically-derived adjoint of the model (MITgcm).

        By construction, ECCO state estimates perfectly satisfy the laws of physics and thermodynamics encoded in the numerical model and therefore conserve heat, salt, volume, and momentum. Our philosophy of strict adherence to these conservation principles ensures that ECCO reanalyses are useful for investigating the causal origins of observed ocean climate variability. However, because of the enormous scale of the nonlinear optimization problem, strictly obeying conservation laws involves a trade-off with goodness-of-fit; on the whole, ECCO reanalyses are unlikely to reproduce observations as well as ocean reanalyses that allow incremental adjustments to their state vectors through time.

        Here we summarize our efforts to date with a focus on addressing recent challenges associated with (i) coupling to the sea-ice and thermodynamic ice-sheet models, (ii) adding novel data constraints such as ocean bottom pressure from GRACE and GRACE-FO, and (iii) increasing the spatial resolution of the state estimation system to achieve eddy-resolving scales.

        Speaker: Ian Fenty (NASA Jet Propulsion Laboratory)
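
        As a rough illustration of the least-squares problem described in the abstract above (a schematic form only, assuming Gaussian error statistics; the actual ECCO cost function contains many more terms and weights), the control vector u is adjusted to minimize

            J(u) = \sum_k [y_k - H_k(x_k(u))]^T R_k^{-1} [y_k - H_k(x_k(u))] + u^T Q^{-1} u,

        where x_k(u) is the model state at time k obtained by integrating the MITgcm forward from the adjusted controls u, H_k maps the state to the observations y_k, and R_k and Q are observation- and control-error covariances. The automatically derived adjoint supplies the gradient of J with respect to u that drives the iterative minimization.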
    • 11:20–13:50
      Theme 2: Coupled data assimilation
      Conveners: Matthew Martin (Met Office), Philip Browne (ECMWF)
      • 11:20
        Implementation and Evaluation of a High-Efficiency Coupled Data Assimilation System Using Multi-Timescale EnOI-Like Filtering with a Coupled General Circulation Model 20m

        A multi-timescale high-efficiency approximate EnKF (MSHea-EnKF), which consists of stationary, slow-varying and fast-varying filters using the time series of a single-model solution, has been implemented in the Geophysical Fluid Dynamics Laboratory’s global fully coupled climate model (CM2.1) to improve the representation of low-frequency background error statistics and enhance computational efficiency. The MSHea-EnKF is evaluated in a biased twin-experiment framework and in a 27-year coupled data assimilation (CDA) experiment with real observations. Results show that, at only about one twelfth of the computational cost of traditional ensemble coupled data assimilation (ECDA), the ocean state estimation quality improves by 30.3% and 10.7% for upper-500 m salinity and temperature, respectively, while the atmospheric state estimation has almost the same quality as traditional ECDA. This is mainly because the MSHea-EnKF primarily improves the representation of slow-varying background flows. The MSHea-EnKF also produces a more reasonable standard deviation distribution for the Atlantic meridional overturning circulation (AMOC) and stronger meridional transport at 26.5°N below 2000 m, which is closer to RAPID estimates.

        Speaker: Lv Lu (Ocean University of China)
      • 11:40
        Biogeochemical, ocean, and sea-ice data assimilation in the Southern Ocean 20m

        We introduce biogeochemical – ocean – sea ice state estimates in the Southern Ocean. Atmospheric fields are adjusted to fit observations from profiling floats, shipboard data, underway measurements, and satellites. These atmospheric adjustments shed light on biases in downwelling radiative fluxes in existing atmospheric reanalysis models. We demonstrate the validity of adjoint method optimization for coupled physical-biogeochemical state estimation using a series of gradient check experiments. The presentation demonstrates the readiness of the method for synthesizing in situ biogeochemical observations as they become more available.

        Speaker: Matthew Mazloff (SIO-UCSD)
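
        A minimal sketch of the kind of gradient check mentioned in the abstract above (an illustration on a toy quadratic cost function, not the actual adjoint-model machinery): the analytic/adjoint gradient is compared with finite-difference approximations of the cost function over a range of step sizes, and the ratio should approach one.

            import numpy as np

            def gradient_check(cost, grad, x0, direction, epsilons):
                """Compare an adjoint/analytic gradient with finite differences of the cost."""
                J0 = cost(x0)
                g_dot_d = np.dot(grad(x0), direction)  # directional derivative from the gradient
                for eps in epsilons:
                    fd = (cost(x0 + eps * direction) - J0) / eps  # one-sided finite difference
                    print(f"eps={eps:8.1e}  fd={fd: .6e}  grad={g_dot_d: .6e}  ratio={fd / g_dot_d:.6f}")

            # Toy example: quadratic cost with a known gradient standing in for the model/adjoint pair
            A = np.diag([1.0, 2.0, 3.0])
            cost = lambda x: 0.5 * x @ A @ x
            grad = lambda x: A @ x
            rng = np.random.default_rng(0)
            d = rng.standard_normal(3)
            d /= np.linalg.norm(d)
            gradient_check(cost, grad, np.array([1.0, -1.0, 0.5]), d, epsilons=[1e-2, 1e-4, 1e-6])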
      • 12:00
        The role of flow-dependent oceanic background-error covariance information in air-sea coupled data assimilation during tropical cyclones: a case study 20m

        In variational data assimilation, background-error covariance structures have the ability to spread information from an observed part of the system to unobserved parts. Hence an accurate specification of these structures is crucially important for the success of assimilation systems and therefore of forecasts that their outputs initiate. For oceanic models, background-error covariances have traditionally been modelled by parametrisations which mainly depend on macroscopic properties of the ocean and have limited dependence on local conditions. This can be problematic during passage of tropical cyclones, when the spatial and temporal variability of the ocean state depart from their characteristic structures. Furthermore, the traditional method of estimating oceanic background-error covariances could amplify imbalances across the air-sea interface when weakly coupled data assimilation is applied, thereby bringing a detrimental impact to forecasts of cyclones. Using the case study of Cyclone Titli, which affected the Bay of Bengal in 2018, we explore hybrid methods that combine the traditional modelling strategy with flow-dependent estimates of the ocean’s error covariance structures based on the latest-available short-range ensemble forecast. This hybrid approach is investigated in the idealised context of a single-column model as well as in the UK Met Office’s state-of-the-art system. The idealised model helps inform how the inclusion of ensemble information can improve coupled forecasts. Different methods for producing the ensemble are explored, with the goal of generating a limited-sized ensemble that best represents the uncertainty in the ocean fields. We then demonstrate the power of this hybrid approach in changing the analysed structure of oceanic fields in the Met Office system, and explain the difference between the traditional and hybrid approaches in light of the ways the assimilation systems respond to single synthetic observations. Finally, we discuss the benefits that the hybrid approach in ocean data assimilation can bring to atmospheric forecasts of the cyclone.

        Speaker: Dr Tsz Yan Leung (University of Reading)
      • 12:30
        Lunch break 1h
      • 13:30
        Can data assimilation of physical and biological satellite observations inform subsurface distributions in the Gulf of Mexico? 20m

        The multivariate Deterministic Ensemble Kalman Filter (DEnKF) has been implemented to assimilate physical and biological observations into a biogeochemical model of the Gulf of Mexico. First, the biogeochemical model component was tuned using BGC-Argo observations. Then, observations of sea surface height, sea surface temperature, and surface chlorophyll were assimilated and profiles of both physical and biological variables were updated based on the surface information. We assessed whether this results in improved subsurface distributions, especially of biological properties, using observations from five BGC-Argo floats that were not assimilated, but used in the a priori tuning. Results show that assimilation of the satellite data improves model representation of major circulation features, which translate into improved three-dimensional distributions of temperature and salinity. The multivariate assimilation also improves agreement of subsurface nitrate through its tight correlation with temperature, but the improvements in subsurface chlorophyll were modest initially due to suboptimal choices of the light attenuation parameters in the model’s optical module. Adjustment of light attenuation parameters greatly improved the subsurface distribution of chlorophyll. Given that the abundance of BGC-Argo profiles in the Gulf of Mexico so far is insufficient for sequential assimilation, the alternative of updating 3D biological properties in a model that has been well calibrated represents an intermediate step toward full assimilation of the new data types. We have shown that even sparse BGC-Argo observations can provide substantial benefits to biogeochemical prediction by enabling a priori model tuning.

        Speaker: Mr Bin Wang (Dalhousie University)
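
        A minimal sketch of a deterministic EnKF (DEnKF) analysis step in the spirit of Sakov and Oke (2008), on which the filter named above is based (toy code only, assuming a linear observation operator; localization, the multivariate balances and the biogeochemical model are not represented): the ensemble mean is updated with the full Kalman gain, while the anomalies are updated with half the gain.

            import numpy as np

            def denkf_analysis(E, y, H, R):
                """DEnKF analysis step. E: (n, m) ensemble, y: (p,) obs, H: (p, n), R: (p, p)."""
                n, m = E.shape
                xb = E.mean(axis=1)
                A = E - xb[:, None]                           # forecast anomalies
                HA = H @ A
                S = HA @ HA.T / (m - 1) + R                   # innovation covariance H Pf H^T + R
                K = (A @ HA.T / (m - 1)) @ np.linalg.inv(S)   # Kalman gain
                xa = xb + K @ (y - H @ xb)                    # mean: full gain
                Aa = A - 0.5 * K @ HA                         # anomalies: half gain (deterministic update)
                return xa[:, None] + Aa

            # Toy usage: 100-element state, 20 members, every 10th element observed
            rng = np.random.default_rng(1)
            E = rng.standard_normal((100, 20))
            H = np.eye(100)[::10]
            Ea = denkf_analysis(E, rng.standard_normal(10), H, 0.5 * np.eye(10))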
    • 13:50–16:00
      Theme 3: Data assimilation methods
      Conveners: Andrew Moore (University of California Santa Cruz), Massimo Bonavita (ECMWF)
      • 13:50
        Invited talk: Gaussian approximations in filters and smoothers for data assimilation 30m

        We present mathematical arguments and experimental evidence that suggest that Gaussian approximations of posterior distributions are appropriate even if the physical system under consideration is nonlinear. The reason for this is a regularizing effect of the observations that can turn multi-modal prior distributions into nearly Gaussian posterior distributions. This has important ramifications on data assimilation (DA) algorithms because the various algorithms (ensemble Kalman filters/smoothers, variational methods, particle filters (PF)/smoothers (PS)) apply Gaussian approximations to different distributions, which leads to different approximate posterior distributions, and, subsequently, different degrees of error in their representation of the true posterior distribution. In particular, we explain that, in problems with ‘medium’ nonlinearity, (i) smoothers and variational methods tend to outperform ensemble Kalman filters; (ii) smoothers can be as accurate as PF, but may require fewer ensemble members; (iii) localization of PFs can introduce errors that are more severe than errors due to Gaussian approximations. In problems with ‘strong’ nonlinearity, posterior distributions are not amenable to Gaussian approximation. This happens, e.g. when posterior distributions are multi-modal. PFs can be used on these problems, but the required ensemble size is expected to be large (hundreds to thousands), even if the PFs are localized. Moreover, the usual indicators of performance (small root mean square error and comparable spread) may not be useful in strongly nonlinear problems. We arrive at these conclusions using a combination of theoretical considerations and a suite of numerical DA experiments with low- and high-dimensional nonlinear models in which we can control the nonlinearity.

        Speaker: Matthias Morzfeld (Scripps Institution of Oceanography)
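
        As a generic reminder of where the Gaussian approximations discussed above enter (a schematic only, not specific to any particular system): Bayes' rule gives the posterior

            p(x | y) \propto p(y | x) p(x),

        and a Gaussian approximation replaces it by N(x_a, P_a), for example with x_a = argmin_x J(x), where J(x) = -log p(y | x) - log p(x), and P_a approximated by the inverse Hessian of J at x_a (the Laplace approximation implicit in variational methods). Ensemble Kalman filters instead impose Gaussianity on the prior and apply a linear update, while particle filters avoid the Gaussian assumption at the price of larger ensembles.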
      • 14:20
        C3S Seasonal Initialization and Global Reanalysis: Enabling an Ensemble of Data Assimilation for the Ocean 15m

        This presentation will give an overview of recent developments in NEMOVAR that took place within the Copernicus-funded project ERGO. The project’s aim is to improve ocean data assimilation capabilities at ECMWF, used both in the initialization of seasonal forecasts and in the generation of coupled Earth system reanalyses. In particular, it has significantly improved NEMOVAR’s ensemble generation capabilities, which resulted in improved parameterisations of the existing background error covariance model. A more sophisticated, hybrid formulation has also been implemented, offering the possibility to represent fully flow-dependent background error covariances with multiple spatial scales. Finally, developments were made toward improved use of surface satellite data (SST and SSH). In parallel, significant effort was put into improving numerical efficiency, involving the development of multi-grid strategies, code optimisation, and GPU and mixed-precision capabilities. A significant effort has also gone into performing scout experiments and providing relevant diagnostics to evaluate the benefits of the proposed developments. All these aspects will be covered in detail in other presentations during this workshop.

        Speaker: Arthur Vidard (Inria)
      • 14:35
        An overview of ensemble covariance developments in NEMOVAR 15m

        This presentation provides an overview of methods for using ensembles to define background-error covariances in variational data assimilation (DA) with an emphasis on the global ocean. The methods that are described have been developed for NEMOVAR in support of operational DA at ECMWF and the Met Office. Various localized-ensemble and hybrid formulations of the background-error covariance matrix (B) have been implemented in NEMOVAR. While the basic methodologies are similar to those used in NWP, the underlying modelling and estimation algorithms are substantially different in order to account for specific characteristics of the ocean DA problem, such as the presence of irregular lateral boundaries and the diversity of ocean scales. The computational cost of applying ensemble-based covariance operators with high-resolution global ocean models is significant, so considerable effort has been devoted to the design and optimization of the algorithms. Key features include the use of inexpensive and scalable iterative diffusion solvers for parameter filtering and correlation modelling; the capacity to apply a diffusion-based localization operator on a coarse-resolution global grid; and the availability of an affordable method for estimating accurate normalization factors when vertical correlation parameters are flow dependent. This presentation will focus on the computational aspects of ensemble B modelling in NEMOVAR. Other presentations at the workshop will describe results from the ECMWF and Met Office NEMOVAR-based systems which use different ensemble B formulations.

        Speaker: Dr Anthony Weaver (CERFACS)
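
        A schematic of the diffusion-based correlation modelling mentioned above (a generic sketch in the spirit of Weaver and Courtier (2001); the actual NEMOVAR operators, boundary treatment and normalization details are more involved):

            C = \Gamma D \Gamma,    with    D \approx (I - b \nabla^2)^{-M},

        where D represents M steps of an implicitly solved diffusion equation whose coefficient b is set from the desired correlation length scale and the number of steps, and \Gamma is a diagonal matrix of normalization factors chosen so that diag(C) = 1. Because each step requires only an inexpensive, scalable linear solve, the same machinery can also be applied on a coarse grid to provide the localization operator referred to in the abstract.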
      • 14:50
        Discussion 10m
      • 15:00
        Coffee break 30m
      • 15:30
        Ensemble-variational assimilation with NEMOVAR at ECMWF 15m

        This presentation will summarize the work by the NEMOVAR consortium (ECMWF, CERFACS, UK Met Office, INRIA) to develop an ensemble-variational data assimilation system for the NEMO model enabling effective assimilation of ocean observations. A holistic approach has been adopted by revisiting our static B matrix formulation, developing various flavours of a flow-dependent B matrix and improving our ensemble generation scheme by implementing stochastic physics in the NEMO ocean model. The focus of the presentation will be on the configuration that is most likely to be implemented in the OCEAN6 system. It consists of a modelled B matrix in which an ensemble of climatological perturbations is used to specify the static parameters: background error standard deviations and correlation length scales. The standard deviations are combined with errors of the day captured by ensemble perturbations (a schematic of this kind of hybrid formulation is sketched after this entry). While it is not straightforward to implement hybrid/flow-dependent horizontal correlation length scales in operational settings, owing to the costly re-computation of the normalization factors that ensure the correlation matrix has a unit diagonal, we will show that it is feasible to envisage a configuration with fully flow-dependent vertical length scales. With ECMWF’s plans to develop its own SST analysis going ahead, and given the still prohibitive cost of a fully ensemble-based B matrix, we consider such a configuration crucial for effective assimilation of surface observations.

        Speaker: Marcin Chrust (ECMWF)
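
        One schematic way to write the kind of hybrid formulation described above (the weights and the exact combination are illustrative, not the operational choice):

            B = \Sigma C \Sigma,    with    \sigma^2 = \beta_c^2 \sigma_clim^2 + \beta_e^2 \sigma_ens^2,

        where C is the modelled correlation operator, \sigma_clim are background error standard deviations estimated from the ensemble of climatological perturbations, \sigma_ens are the "errors of the day" estimated from the latest ensemble perturbations, and \beta_c, \beta_e are hybrid weights. A fully hybrid B would additionally combine this modelled covariance with a localized ensemble covariance, at a correspondingly higher cost.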
      • 15:45
        Assessing the impact of Hybrid DA and inflation settings in a global ocean ensemble system at the Met Office 15m

        A global ocean and sea-ice ensemble forecasting system is being developed based on the present operational FOAM (Forecasting Ocean Assimilation Model) system run at the Met Office. This uses a 1/4° resolution NEMO ocean model and CICE sea-ice model, and assimilates data using the NEMOVAR system. NEMOVAR is primarily a variational data assimilation system, but now has the capability to perform hybrid ensemble-variational assimilation. An ensemble of hybrid 3DEnVars with perturbed observations (values and locations) has been set up, with each member forced at the surface by a separate member of the Met Office ensemble atmospheric prediction system (MOGREPS). The ensemble size is 37 members (including an unperturbed member). The system includes stochastic model perturbations developed at CMRE, and an ensemble inflation method based on Relaxation to Prior Spread (RTPS).

        We perform several reanalysis runs of the ensemble system with different weights for the ensemble and modelled components of the hybrid background error covariance and with different ensemble inflation factors. This is done to test the sensitivity of the results to these choices, with a view to finding optimal settings. The performance of these runs is assessed by looking at the impact on innovation statistics, ensemble reliability and ensemble skill. (A minimal sketch of the RTPS inflation scheme follows this entry.)

        Speaker: Daniel Lea (Met Office)
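
        A minimal sketch of the RTPS inflation mentioned above (generic form following Whitaker and Hamill (2012); Met Office implementation details, such as which fields are inflated, are not represented):

            import numpy as np

            def rtps_inflation(Xb, Xa, alpha):
                """Relaxation To Prior Spread. Xb, Xa: (n, m) prior/posterior ensembles; alpha in [0, 1]."""
                xa_mean = Xa.mean(axis=1, keepdims=True)
                Ab = Xb - Xb.mean(axis=1, keepdims=True)
                Aa = Xa - xa_mean
                sig_b = Ab.std(axis=1, ddof=1)          # prior ensemble spread, per state element
                sig_a = Aa.std(axis=1, ddof=1)          # posterior ensemble spread
                factor = 1.0 + alpha * (sig_b - sig_a) / np.maximum(sig_a, 1e-12)
                return xa_mean + Aa * factor[:, None]   # relax the posterior spread back toward the prior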
    • 16:00–21:40
      Theme 4: Machine learning in data assimilation
      Conveners: Arthur Vidard (Inria), Marcin Chrust (ECMWF)
      • 16:00
        Invited talk: Learning from earth system observations: machine learning or data assimilation? 30m

        Billions of new observations are added every day to an already vast record of earth system observations from satellite and surface-based measuring devices. The future will see increasing diversity from sources like smallsats and networked devices such as smartphones. There are also important and often unique observing capabilities in research campaigns and field sites. Earth system observations are used for two purposes: to make analyses of the evolving geophysical state, and to validate and improve physical models of the system. The current state of the art, both for analysis and model parameter estimation, is data assimilation (DA). The new wave of machine learning (ML) for earth sciences may offer possibilities including the complete replacement of the DA process and the learning of new model components from scratch. But ML will have to contend with the characteristics of real observations: that they are indirect, ambiguous, sparse, diverse, only partially representative, and affected by many uncertainties. Current DA methods have the tools to handle these issues in a statistically optimal manner, whereas current ML approaches are typically only applied to regular, `perfect' data. However, there is no conflict between ML and DA since they are both founded in Bayesian probabilistic methods and they have an exact mathematical equivalence. The DA and ML methods applied in the earth sciences can learn from each other, and the future is likely to be the combination of both.

        Speaker: Alan Geer (ECMWF)
      • 16:30
        Invited talk: Data Learning: Integrating Data Assimilation and Machine Learning 30m

        Over the past years, Data Assimilation (DA) has increased in sophistication to better fit application requirements and circumvent implementation issues. Nevertheless, these approaches are incapable of fully overcoming their unrealistic assumptions. Machine Learning (ML) shows great capability in approximating nonlinear systems and extracting high-dimensional features. ML algorithms are capable of assisting or replacing traditional forecasting methods. However, the data used during training in any ML algorithm include numerical, approximation and round-off errors, which are trained into the forecasting model. Integration of ML with DA increases the reliability of prediction by including information with a physical meaning. This work provides an introduction to Data Learning, a field that integrates Data Assimilation and Machine Learning to overcome limitations in applying these fields to real-world data. The fundamental equations of DA and ML are presented and developed to show how they can be combined into Data Learning. We present a number of Data Learning methods and results for some test cases, though the equations are general and can easily be applied elsewhere.

        Speaker: Rossella Arcucci (Imperial College London)
      • 17:00
        Discussion 10m
      • 17:10
        Ice breaker and catch-up with the speakers 35m
    • 09:00–09:40
      Theme 4 (cont.): Machine learning in data assimilation
      Conveners: Arthur Vidard (Inria), Marcin Chrust (ECMWF)
      • 09:00
        A Simplified Smoother applied to the FOAM/Glosea Ocean Reanalysis 20m

        We present an ocean smoother designed to adjust real reanalysis products by utilizing knowledge of future increments. By using increments, rather than the innovations that a true Kalman smoother would use, considerable simplification is obtained. A decay-time parameter, which also varies spatially in 3-D, is applied to the smoother increments to account for memory decay timescales in the ocean (a minimal sketch of this weighting is given after this entry). The result is different from simply time-smoothing the reanalysis itself: since only the increments are smoothed in time, the reanalysis product retains the high-frequency variability that is internally generated by the model and by the atmospheric forcing. The smoother is applied to the daily Met Office FOAM/GloSea global ¼ degree ocean reanalysis over a 19-month period in 2015-16. Results show significant improvement over the original reanalysis in the temperature and salinity state and its variability. Comparisons are made directly against temperature and salinity observations, and smoother and more realistic time variability in the ocean heat and salt content is also discussed.

        Speaker: Keith Haines (University of Reading)
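
        A purely illustrative sketch of this kind of increment-based smoothing (the exact weighting, 3-D decay times and implementation used in the study are not reproduced): future increments are added to the reanalysis at earlier times with a weight that decays exponentially with the time separation.

            import numpy as np

            def smooth_with_future_increments(states, increments, state_times, inc_times, tau):
                """states: (T, n) reanalysis; increments: (K, n) applied at inc_times; tau: decay time."""
                smoothed = states.copy()
                for k, tk in enumerate(inc_times):
                    past = state_times < tk                      # states preceding this increment
                    w = np.exp(-(tk - state_times[past]) / tau)  # exponential decay back in time
                    smoothed[past] += w[:, None] * increments[k]
                return smoothed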
      • 09:20
        High-resolution Ensemble Kalman Filter with a low-resolution model using a machine learning super-resolution approach 20m

        Going from low- to high-resolution models improves the data assimilation process in three ways: it makes better use of high-resolution observations, it represents the small-scale features of the dynamics more accurately, and it provides a high-resolution field that can further be used as the initial condition of a forecast. The pitfall of such an approach is, of course, the cost of computing a forecast with a high-resolution numerical model. This drawback is even more acute when using an ensemble data assimilation approach, such as the ensemble Kalman filter, for which an ensemble of forecasts has to be issued by the numerical model.
        In our approach, we propose to use a cheap low-resolution model to provide the forecast while still performing the assimilation step in a high-resolution space. The principle of the algorithm is based on a machine learning approach: from a low-resolution forecast, a neural network (NN) emulates a high-resolution field that can then be used to assimilate high-resolution observations. This NN super-resolution operator is trained on one high-resolution simulation. This new data assimilation approach, denoted "super-resolution data assimilation" (SRDA), is built on an ensemble Kalman filter (EnKF) algorithm (a toy sketch of one SRDA cycle is given after this entry).
        We applied SRDA to a quasi-geostrophic model representing simplified ocean dynamics of the surface layer, with a resolution up to four times coarser than the reference high resolution (so that the cost of the model is divided by 64). We show that this approach outperforms the standard low-resolution data assimilation approach as well as the SRDA method using standard interpolation instead of a neural network as the super-resolution operator. For the reduced cost of a low-resolution model, SRDA provides a high-resolution field with an error close to that of the field that would be obtained using a high-resolution model.

        Speaker: Dr Sébastien Barthélémy (University of Bergen)
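
        A toy sketch of one SRDA cycle as described above (illustrative assumptions: the trained neural network is replaced by a trivial repetition operator, and a perturbed-observation EnKF stands in for the filter used in the study):

            import numpy as np

            rng = np.random.default_rng(1)

            def upscale(E_lr, factor=4):
                """Stand-in for the trained super-resolution NN: simple repetition."""
                return np.repeat(E_lr, factor, axis=0)

            def downscale(E_hr, factor=4):
                """Project the high-resolution analysis back onto the low-resolution model grid."""
                n_hr, m = E_hr.shape
                return E_hr.reshape(n_hr // factor, factor, m).mean(axis=1)

            def enkf_perturbed_obs(E, y, H, r_std):
                """Stochastic EnKF update with perturbed observations."""
                m = E.shape[1]
                A = E - E.mean(axis=1, keepdims=True)
                HA = H @ A
                K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (m - 1) * r_std**2 * np.eye(len(y)))
                Y = y[:, None] + r_std * rng.standard_normal((len(y), m))
                return E + K @ (Y - H @ E)

            # One cycle: LR forecast -> NN upscaling -> HR analysis -> back to LR for the next forecast
            m, n_lr, factor = 20, 8, 4
            E_lr = rng.standard_normal((n_lr, m))              # low-resolution forecast ensemble
            E_hr = upscale(E_lr, factor)                       # emulated high-resolution ensemble
            H = np.eye(n_lr * factor)[::4]                     # observe every 4th high-resolution point
            y = rng.standard_normal(H.shape[0])
            E_lr_next = downscale(enkf_perturbed_obs(E_hr, y, H, r_std=0.5), factor)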
    • 09:40–13:30
      Theme 5: Model error
      Conveners: Dr Hao Zuo (ECMWF), Jennifer Waters (Met Office)
      • 09:40
        Invited talk: Machine Learning for Earth System Assimilation and Prediction 30m

        Machine Learning has proved to be an innovative, disruptive set of technologies capable of revolutionising many fields of applied science and engineering. A crucial scientific question is whether Machine Learning can have the same impact on Earth system assimilation and prediction, both in a holistic sense and for improving the separate Earth system components. The recent ECMWF-ESA Machine Learning Workshop hosted by ECMWF in October 2020 has provided initial answers to this question and has highlighted some of the opportunities and challenges that need to be overcome to realise the potential of these new technologies. In this talk we will discuss the main ideas that have emerged from the Workshop’s presentations and discussions and provide examples of the on-going work in this area at ECMWF and elsewhere from a data assimilation perspective. Finally, we discuss current examples and future perspectives of the application of machine learning techniques in the Ocean Data Assimilation context.

        Speaker: Massimo Bonavita (ECMWF)
      • 10:10
        Relating model bias and prediction skill in the equatorial Atlantic 20m

        We investigate the impact of large climatological biases in the tropical Atlantic on reanalysis and seasonal prediction performance using the Norwegian Climate Prediction Model (NorCPM) in a standard and an anomaly-coupled configuration. Anomaly coupling corrects the climatological surface wind and sea surface temperature (SST) fields exchanged between the oceanic and atmospheric models, and thereby significantly reduces the climatological model biases of precipitation and SST. NorCPM combines the Norwegian Earth system model (NorESM) with the Ensemble Kalman Filter and assimilates SST and hydrographic profiles. We perform a reanalysis for the period 1980-2010 and a set of seasonal predictions for the period 1985-2010 with both model configurations. Anomaly coupling improves the accuracy and the reliability of the reanalysis in the tropical Atlantic, because the corrected model enables a dynamical reconstruction that better satisfies the observations and their uncertainty. Anomaly coupling also enhances seasonal prediction skill in the equatorial Atlantic to the level of the best models of the North American multi-model ensemble, while the standard model is among the worst. However, anomaly coupling slightly damps the amplitude of Atlantic Niño and Niña events. The skill enhancements achieved by anomaly coupling are largest for forecasts started in August and February. There is a strong spring predictability barrier, with little skill in predicting conditions in June. The anomaly-coupled system shows some skill in predicting the secondary Atlantic Niño-II SST variability that peaks in November-December for forecasts started on 1 August.

        Speaker: François Counillon (NERSC/UoB)
      • 10:30
        Discussion 10m
      • 10:40
        Coffee break 20m
      • 11:00
        A New Stochastic Ocean Physics Package and its Application To Hybrid-Covariance Data Assimilation 20m

        Generating optimal perturbations is a key requirement of several data assimilation schemes. Here, we present a newly developed stochastic physics package for ocean models, implemented in the NEMO ocean general circulation model. The package includes three schemes applied simultaneously: stochastically perturbed parameterization tendencies (SPPT), stochastically perturbed parameters (SPP) and stochastic kinetic energy backscatter (SKEB) (a minimal sketch of the SPPT idea is given after this entry). The three schemes allow for different temporal and spatial perturbation scales. Within a limited-area ocean model configuration, ensemble free-running simulations were performed to assess the impact and reliability of the schemes. They prove complementary in increasing the ensemble spread at different scales and for different diagnostics. The ensemble spread appears reliable; for instance, it proves consistent with the root mean square differences with respect to higher-resolution (sub-mesoscale) simulations that here represent the "truth" (in the sense that they include "unresolved physics"). Interestingly, both the SPPT and the SKEB schemes lead to an increase of eddy kinetic energy at small spatial scales (2-10 km) and contribute to modifying the ensemble mean state, mitigating warm biases near the thermocline through the enhancement of upper-ocean vertical mixing. As an application of the stochastic package, the ensemble anomaly covariances from the free-running ensemble simulations are used to feed large-scale anisotropic covariances that complement smaller-scale ones in a hybrid-covariance regional analysis and forecast system in the Mediterranean Sea. Ensemble-derived covariances are formulated as slowly varying three-dimensional low-resolution Empirical Orthogonal Functions (EOFs). The improvements due to the addition of such covariances to the stationary ones are found to be significant in real-data experiments, in terms of verification skill scores against glider profile data, remotely sensed observations and current-speed measurements from drifters, radar and moorings.

        Speaker: Dr Andrea Storto (CMRE; CNR-ISMAR)
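
        A minimal sketch of the SPPT idea referred to above (illustrative only; the actual NEMO package also includes spatial correlation of the pattern, vertical tapering, and the SPP and SKEB schemes):

            import numpy as np

            rng = np.random.default_rng(42)

            def ar1_pattern(prev, dt, tau, sigma):
                """Advance an AR(1)-in-time random pattern with decorrelation time tau and std sigma."""
                phi = np.exp(-dt / tau)
                # In a real scheme the noise would also be correlated in space; omitted in this sketch.
                return phi * prev + sigma * np.sqrt(1.0 - phi**2) * rng.standard_normal(prev.shape)

            nx, ny = 64, 64
            r = np.zeros((nx, ny))
            dt, tau, sigma = 3600.0, 3 * 86400.0, 0.5
            for step in range(24):
                r = ar1_pattern(r, dt, tau, sigma)
                mult = 1.0 + np.clip(r, -0.95, 0.95)      # keep the multiplicative factor positive
                tendency = rng.standard_normal((nx, ny))  # stand-in for the total physics tendency
                perturbed_tendency = mult * tendency      # SPPT: perturb the parameterized tendency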
      • 11:20
        Recent development of a supermodel - an interactive multi-model ensemble 20m

        An interactive multi-model ensemble (referred to as a supermodel), based on three state-of-the-art Earth system models (NorESM, MPI-ESM and CESM), has been developed. The models are synchronized every month by data assimilation. The data assimilation method used is the Ensemble Optimal Interpolation (EnOI) scheme, for which the covariance matrix is constructed from a historical ensemble. The assimilated data are a weighted combination of the monthly sea surface temperature (SST) output of the individual models, while the full ocean state is constrained through the covariance matrix. The synchronization of the models during the simulation makes this approach different from the traditional multi-model ensemble approach, in which model outputs are combined a posteriori.

        We compare different approaches to estimating the supermodel weights: equal weights, and spatially varying weights based on minimisation of the bias. The performance of these supermodels is compared to that of the individual models and of the multi-model ensemble for the period 1980 to 2006. SST synchronisation is achieved in most oceans and in dynamical regimes such as ENSO. The supermodel with spatially varying weights outperforms the supermodel with equal weights. It reduces the SST bias by over 30% compared to the multi-model ensemble. The temporal variability of the supermodel is slightly on the low side but improved compared to the multi-model ensemble. The simulations are being extended to 2100 to assess the simulation of climate variability and climate change.

        Speaker: Shuo Wang (Geophysical Institute, University of Bergen)
      • 11:40
        Modeling near-surface SST and SSS variability: modeling and observations 20m

        In this presentation we focus on coupled air-sea interactions characterized by high temporal and spatial variability in near-surface temperature and salinity. For example, the near-surface ocean goes through diurnal cycles in sea surface temperature (SST) due to the exchange of heat and momentum. In addition, near-surface salinity changes rapidly with rain and wind-mixing events. For a proper representation of such (diurnal) variability, both the model and the observing systems need to be jointly modified and tuned. This presentation highlights the importance of both model tuning and the impact of various observing strategies.

        Only recently, by combining sufficiently high vertical and horizontal resolution (e.g. 75 levels, 1/4 degree) with sub-daily atmospheric forcing fields, have ocean models started to resolve realistic diurnal variability. However, the computational expense of such high vertical resolution is burdensome in the context of coupled modeling and data assimilation (DA). An alternative approach is to parameterize this diurnal variability with a prognostic model embedded within the ocean model. In the first part of this presentation, we present the formulation of such a model and illustrate its effectiveness in modeling SST diurnal cycles. The second half of our talk focuses on the observational aspect of diurnal variation. Moored buoys report temperature and salinity (T & S) profiles at various frequencies, ranging from 10-minute to hourly averages. To be consistent with the modeled diurnal fields, these observations should ideally be assimilated at their native temporal frequency to extract the most information from them. However, many ocean DA systems assimilate daily-mean temperature and salinity profiles, e.g. UMD SODA version 3 (Carton et al., 2018), GMAO S2S version 2 (Molod et al., 2020), NCEP GODAS (Behringer 2007), ECMWF OCEAN5/ORAS5 (Zuo et al., 2019), Met Office FOAM (Waters et al., 2015) and JMA MOVE-G (Fujii et al., 2012). We assess the nature of the errors due to this inconsistency and their implications for heat and salt fluxes. These issues are highly relevant to the development of "seamless" coupled DA and reanalysis systems, especially as coupled DA systems advance toward strongly coupled DA, where the combined atmosphere and ocean error covariances are retained.

        Speaker: Santha Akella (NASA)
      • 12:00
        Accurate parameter estimation for a Global Tide and Surge Model with Model Order Reduction 20m

        Accurate parameter estimation for a global tide model benefits from the use of long time series at many locations. However, as the number of measurements increases, the computational time and memory requirements of the assimilation also increase, especially for ensemble-based methods that assimilate the measurements in one batch. We have developed a memory-efficient and computationally cheaper parameter estimation scheme using a model order reduction approach. Proper Orthogonal Decomposition (POD) is a technique that reduces the state variables of a high-dimensional system to a smaller linear subspace; it is generally applied to spatial patterns, but here we use it to reduce the temporal patterns of the model output instead (a toy sketch of this reduction is given after this entry). In our application, an iterative least-squares algorithm called DUD is used to estimate bathymetry for the high-resolution Global Tide and Surge Model (GTSM). The observations are 1973 time series derived from the FES2014 data set. We successfully described the model output with a smaller subspace corresponding to temporal patterns. To further improve the estimation accuracy, an outer-loop iteration is developed, similar to that commonly used in incremental 4D-Var, in which the model increments are evaluated on a coarser grid to reduce the computational cost. The outer loop uses the optimized parameters obtained from the previous DUD process as a new first guess to update the initial high-resolution model output and restart the next DUD procedure, which leads to better agreement with the high-resolution model. Experiments show that memory requirements can be sharply reduced, by a factor of 20 in our case, without loss of estimation accuracy, and that the outer-loop iteration further improves the estimation performance. The RMSE is reduced from 5.6 cm in the initial model to 3.67 cm after the estimation. The good performance is also demonstrated in a one-year forecast analysis, in both the time and frequency domains, compared with the FES2014 and UHSLC data sets.

        Speaker: Xiaohui Wang (Delft University of Technology)
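
        A toy sketch of the time-pattern POD reduction described above (schematic only; the GTSM model, the DUD algorithm and the outer loop are not reproduced): the simulated time series are stacked into a matrix, an SVD provides the leading temporal patterns, and residuals are evaluated in that reduced subspace.

            import numpy as np

            def pod_time_basis(Y, energy=0.99):
                """Y: (n_series, n_times) simulated time series. Returns (n_times, r) temporal modes."""
                Yc = Y - Y.mean(axis=1, keepdims=True)
                _, s, Vt = np.linalg.svd(Yc, full_matrices=False)
                r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1
                return Vt[:r].T

            # Toy usage (much smaller than the 1973 series of the actual application)
            rng = np.random.default_rng(0)
            Y_model = rng.standard_normal((200, 500))
            Y_obs = Y_model + 0.1 * rng.standard_normal(Y_model.shape)
            V_r = pod_time_basis(Y_model)
            reduced_residual = (Y_obs - Y_model) @ V_r   # this reduced misfit would feed the least squares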
      • 12:20
        Discussion 10m
      • 12:30
        Lunch break 1h
    • 13:30–15:30
      Theme 6: Assimilation of novel observations
      Conveners: Daniel Lea (Met Office), Patricia de Rosnay (ECMWF)
      • 13:30
        Invited talk: Impact assessment of satellite observation in the Mercator Ocean global 1/12° system 30m

        The use of ocean reanalyses and forecasts is becoming more and more common for a large variety of applications. User requirements are moving toward higher resolution, leading to increased model resolution and complexity to better represent a larger spectrum of ocean phenomena. In parallel, ocean observing systems also evolve to better capture smaller-scale and higher-frequency ocean features.
        To benefit from new ocean observations and model evolution, developments are made in the system to better control the mesoscale dynamics and the ocean surface and mixed-layer variability. Impact assessment studies are regularly conducted with new or improved observation data sets. We will present and discuss the ongoing effort to improve the efficiency of high-resolution observation data assimilation in the global Mercator Ocean system at 1/12°. The talk will focus on satellite observations: sea level (including the mean dynamic topography, MDT), sea surface salinity, but also sea ice. It has also been shown that the assimilation of physical observations has an impact on tracer transport, which can be visible in particle trajectories or nutrient fields and thus affects the BGC forecasts. Such indirect diagnostics will also be presented when assessing the impact of physical observations, as they give a complementary view to the usual innovation-based statistical diagnostics.

        Speaker: Dr Elisabeth Remy (Mercator Ocean)
      • 14:00
        Assimilating wide-swath altimeter observations in a high-resolution shelf-seas analysis and forecasting system 20m

        The impact of assimilating simulated wide-swath altimetry observations from the upcoming SWOT mission has been assessed using Observing System Simulation Experiments (OSSEs). This mission has the potential to bring about a step change in our ability to observe the ocean mesoscale, but work to ameliorate the effects of correlated errors in the processing of the SWOT observations and the assimilation is likely to be crucial. Our experiments use the Met Office 1.5 km resolution North-West European Shelf analysis and forecasting system. In an effort to understand the importance of future work to account for correlated errors in the data assimilation scheme and to reduce the magnitude of these errors in the observations themselves, we simulated SWOT observations with and without realistic correlated errors. These were assimilated in OSSEs along with simulated observations emulating the standard observing network, also with realistic errors added. We will discuss the potential impact of assimilating SWOT observations and the effectiveness of simple measures to reduce the impact of the large correlated errors expected with this instrument.

        Speaker: Robert King (UK Met Office)
      • 14:20
        The impact of assimilating novel observations on prediction of transport and eddies in Australia’s Western Boundary Current System 20m

        In the South Pacific’s Western Boundary Current, the East Australian Current (EAC) System, we combine a high-resolution (2.5-6 km) numerical ocean model with an unprecedented observational data set, using 4-dimensional variational data assimilation. In addition to the traditional data streams (satellite-derived SSH and SST, Argo profiling floats and XBT lines) we exploit novel observations that were collected as part of Australia's Integrated Marine Observing System (IMOS, www.imos.org.au). These include velocity and hydrographic observations from a deep-water mooring array and several moorings on the continental shelf, radial surface velocities from a high-frequency (HF) radar array and hydrographic observations from a suite of ocean glider missions. The impact of the novel observations on estimates of the WBC System is assessed in two ways. Firstly, a comparison of experiments with and without the novel observations allows us to assess their value in state estimation and prediction of WBC transport and eddy structure. Secondly, variational methods allow us to quantify how each observation contributes to the state-estimate solution directly. Using the reanalysis we calculate the impacts of observations from various platforms in informing model estimates of volume transport and eddy kinetic energy in the EAC. The most influential observations are, in this order, the satellite-derived SST, the radials from an HF radar array midway along the coast, the satellite-derived SSH, the ocean glider observations and data from a full-depth mooring array in the northern, upstream portion of the domain. Not only do the HF radar observations have high impact on transport estimates at the array location, they have significant impact both upstream and downstream. Likewise, the impact of the mooring array is far-reaching, contributing to transport estimates hundreds of kilometres downstream of its location. The observation impact of deep gliders deployed into eddies is particularly high. Significantly, we find that observations taken in regions with greater natural variability contribute most to constraining the model estimates, and subsurface observations have a high impact relative to the number of observations. The challenge of correctly representing the depth structure of the current and its eddies upon data assimilation is discussed. This work provides new information on the value of specific observation platforms for prediction of the EAC and motivates further work into improving prediction of the current’s separation and eddy shedding dynamics.

        Speaker: Colette Kerry (University of New South Wales)
      • 14:40
        Investigating the impact of satellite total surface current velocities assimilation in global ocean forecasting systems 20m

        Prediction of ocean surface velocities remains a challenging and crucial aspect of operational ocean forecasting systems. Accurate surface velocities are important for coupled ocean/atmosphere/sea-ice/wave forecasting and for applications such as search and rescue, offshore oil and gas operations and shipping. Surface velocities are not routinely assimilated in global forecasting systems, largely due to the very limited number of ocean velocity observations. New opportunities for assimilation of surface velocities into these systems should be provided by proposed satellite missions designed to observe ocean surface velocities, such as Sea surface KInematics Multiscale monitoring (SKIM).

        The ESA Assimilation of Total Surface Current Velocity (A-TSCV) project focuses on the design, implementation and reporting on the impact of synthetic SKIM total surface current velocity assimilation. The project will use observing system simulation experiments (OSSEs) to test the assimilation methodology and provide feedback on the observation requirements for future satellite missions. Synthetic observations are being generated from a high-resolution nature run for all standard data types (sea surface temperature, sea-ice concentration, sea level anomaly and profiles of temperature and salinity) as well as the new observations expected from SKIM-like satellite missions. Two operational global ocean forecasting systems are being developed to assimilate these data in a set of coordinated OSSEs: the FOAM system run at the Met Office and the Mercator Ocean system. We will present an overview of the project, the design of the experiments and the data assimilation developments being made to effectively assimilate the surface velocity data into these systems.

        Speaker: Jennifer Waters (Met Office)
      • 15:00
        Discussion 10m
      • 15:10
        Coffee break 20m
    • 15:30–16:40
      Theme 7: Recent assimilation infrastructure developments
      Conveners: Anthony Weaver (CERFACS), Marcin Chrust (ECMWF)
      • 15:30
        Invited talk: The Joint Effort for Data assimilation Integration 30m

        The long-term objective of the Joint Effort for Data assimilation Integration (JEDI) is to provide a unified data assimilation framework for research and operational use, for different components of the Earth system including coupled systems, and for different applications, in order to reduce or avoid redundant work within the community and to increase the efficiency of research and of the transition from development teams to operations.

        In a well-designed software system, teams can develop different aspects in parallel without interfering with other teams’ work and without breaking the components they are not working on. Scientists can be more efficient by focusing on their area of expertise without having to understand all aspects of the system. JEDI fully implements this separation of concerns. The concept of models is clearly separated from the observation handling, with clear interfaces. The data assimilation algorithms are themselves separated from the model-space and observation-space components. Generic code is used wherever possible to further reduce duplication of effort. This includes generic observation quality control, bias correction and observation operators on the observation side, as well as generic background error covariance matrices.

        In this talk, an overview of the system and the current status of the implementation for several Earth-system components, with an emphasis on marine components, will be presented. Since JEDI can handle data assimilation for all major components of the Earth-system in a generic manner, generic coupled data assimilation can be explored. Initial steps in this direction will be discussed.

        Speaker: Yannick Tremolet (JCSDA)
      • 16:00
        Integration of Ocean Data Assimilation System in the NOAA-UFS R2O Project 20m

        NOAA’s current operational ocean forecast and monitoring systems are based on various models and analysis systems. The Real-Time Ocean Forecast System (RTOFS v2.0) is based on a 1/12-degree HYCOM-CICE4 configuration with the Navy Coupled Ocean Data Assimilation (NCODA), while the operational Global Ocean Data Assimilation System (GODAS) uses an older-generation ocean model, MOM3, at 1-degree resolution for ocean monitoring and climate prediction. In the context of NOAA’s forecasting system modernization effort, we provide an overview of the scope of the NOAA Unified Forecast System Research-to-Operations (UFS-R2O) project, with a focus on the integration of the ocean data assimilation systems through the Joint Effort for Data Assimilation Integration (JEDI). The MOM6 and CICE6 models form the core of the NOAA-NCEP Next Generation Global Ocean Data Assimilation System (NG-GODAS). We plan to apply the JEDI-based NG-GODAS to NOAA’s future operational versions of the Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS). In this presentation, an assessment of the JEDI-based development is discussed with interim 40-year reanalysis experiment results for the MOM6-CICE6 global 1-degree model configuration. The software compatibility of the prototype version of the NG-GODAS system is also demonstrated with various model configurations and data assimilation applications. The latest updates and key milestones of other JEDI-based NOAA NCEP ocean data assimilation projects are summarized, including the UFS sub-seasonal-to-seasonal (S2S) initialization for ¼-degree MOM6 and CICE6 model configurations, the development of a near-surface sea temperature analysis, biogeochemical data assimilation of the satellite ocean color product, and high-resolution MOM6 regional data assimilation activity to support forecasting of extreme weather events.

        Speaker: Rahul Mahajan (NOAA/NWS/NCEP/EMC)
      • 16:20
        Application of the BUMP library in the SOCA system 20m

        The BUMP library (B matrix on an Unstructured Mesh Package) is a core component of the JEDI project (Joint Effort for Data assimilation Integration), led by the JCSDA. Using ensembles of forecasts, this generic tool can estimate parameters for various background error covariance models (static, localized ensemble, hybrid), and it also implements their efficient application to a vector. It can work on any kind of horizontal grid and handle complex boundaries, which makes it useful for ocean DA systems. The first part of this talk is a description of BUMP, its motivations, capabilities and implementation strategies.

        The JCSDA has also developed a MOM6 interface to the JEDI project, which is currently being implemented within the UFS at NOAA for global and regional initialization of the ocean and cryosphere. It is also being implemented at the GMAO within a weakly coupled DA system targeting reanalysis and NWP forecast initialization. The workhorse static B matrix used for these implementations is based on parameterized background errors and simple balance operators for the modeling of cross-covariances. The purpose of this study is to design and test a suite of covariance models based on newly available features in BUMP. The above-mentioned covariance models are tested against the current configuration of the JCSDA ocean and sea-ice reanalysis system over a period of several months. The metrics of comparison include observation-space statistics of innovations as well as standard ocean and ice diagnostics.

        Speaker: Benjamin Ménétrier (IRIT - JCSDA)
    • 16:40–18:00
      Poster session 1
      • 16:40
        Short poster pre-recorded introduction 20m
        Speaker: Dr Andrea Storto (CMRE; CNR-ISMAR)
      • 17:00
        Posters 1h
    • 08:10–09:40
      Poster session 2
      • 08:10
        Short poster pre-recorded introduction 20m
        Speaker: Arthur Vidard (Inria)
      • 08:30
        Posters 1h 10m
    • 09:40–10:00
      Theme 7 (cont.): Recent assimilation infrastructure developments
      Conveners: Anthony Weaver (CERFACS), Marcin Chrust (ECMWF)
      • 09:40
        PDAF - features and recent developments 15m

        PDAF, the Parallel Data Assimilation Framework (http://pdaf.awi.de), is an open-source framework for ensemble data assimilation. PDAF is designed to be particularly easy to use, so that a data assimilation system can be built quickly while PDAF ensures computational efficiency. PDAF consists of an ensemble component that provides online-coupled data assimilation functionality, with data transfers in memory using the MPI parallelization standard, obtained by inserting three function calls into the model code. These additions convert a numerical model into a data-assimilative model, which can be run like the original model but with additional options. While this approach is particularly efficient, it is also possible to use separate programs to compute the forecasts and the assimilation analysis update. PDAF further provides data assimilation methods (solvers), in particular ensemble Kalman filters and particle filters. Tools for diagnostics, ensemble generation and the generation of synthetic observations for OSSEs or twin experiments provide additional data assimilation functionality. PDAF is used for research, for teaching, and also operationally. In the operational context, PDAF is used at the CMEMS forecasting center for the Baltic Sea and in the Chinese Global Ocean Forecasting System (CGOFS). A recent addition to PDAF is OMI, the Observation Module Infrastructure, a library extension for observation handling. OMI is inspired by object-oriented programming but, for ease of use, it is not coded using classes. Recent developments further include support for strongly coupled data assimilation across components of Earth system models, and model bindings for NEMO, SCHISM and the climate model AWI-CM. Furthermore, an ensemble-variational solver is under development. This presentation discusses PDAF's features and recent infrastructure developments.

        Speaker: Lars Nerger (Alfred Wegener Institute)
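
        To make the three-call coupling pattern described above concrete, the following is a minimal, self-contained Python sketch. The routine names and the toy analysis step are illustrative of the pattern only; they are not PDAF's actual (Fortran) API.

        import numpy as np

        def init_parallel_da():
            # Call 1: set up the parallel environment. In the real framework this
            # splits the MPI communicators so that several ensemble tasks run
            # concurrently; it is a no-op in this serial toy.
            pass

        def init_da(state, n_members=10, seed=0):
            # Call 2: initialize the DA layer, i.e. create the ensemble around the
            # model state and (conceptually) select the filter to use.
            rng = np.random.default_rng(seed)
            return state + 0.1 * rng.standard_normal((n_members, state.size))

        def model_time_step(ensemble):
            # Stands in for the original, unchanged model time step.
            return 0.99 * ensemble

        def assimilate(ensemble, step, interval=4):
            # Call 3: hand the forecast ensemble to the DA layer after each step;
            # an analysis update (a toy nudge here) is applied only when an
            # assimilation time is reached.
            if step % interval == 0:
                ensemble = ensemble + 0.05 * (1.0 - ensemble)
            return ensemble

        state = np.zeros(3)
        init_parallel_da()                        # inserted before model initialization
        ensemble = init_da(state)                 # inserted after model initialization
        for step in range(1, 13):                 # the model's existing time loop
            ensemble = model_time_step(ensemble)
            ensemble = assimilate(ensemble, step) # inserted inside the time loop
        print(ensemble.mean(axis=0))

        The point of the pattern is that the model's time-stepping loop itself is untouched: only the three calls are added, which is what turns the model into a data-assimilative model that can still be run like the original.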
      • 09:55
        Discussion 5m
    • 10:00 13:30
      Theme 8: Development and assessment of data assimilation in forecasting applications
      Conveners: Dr Eric de Boisseson (ECMWF), Matthew Martin (Met Office)
      • 10:00
        Invited talk: Ensemble forecasting greatly expands the prediction horizon for internal “weather” of the ocean 30m

        Mesoscale eddies dominate the energetics of the ocean and modify mass, heat and freshwater transport as well as primary production in the upper ocean. Eddy-resolving ocean models (horizontal resolution finer than 10 km in mid-latitudes) show improved representation of mesoscale dynamics. However, mesoscale eddies, which are hard to constrain using available observations, are large contributors to the forecast error. As a consequence, the forecast skill horizon for ocean mesoscales in current operational models is shorter than 10 days. Here we show that this lack of predictive skill is due to high uncertainty in the initial location and forecast of mesoscale features that is not captured by the current generation of deterministic ocean modeling and assimilation systems. Using ensemble simulations, we account for this uncertainty, filter out unconstrained scales and, as a result, significantly extend the predictability of ocean mesoscales (to between 20 and 40 days) relative to deterministic models (see the sketch after this entry). Results of this research suggest that leveraging advancements in ensemble analysis and forecasting should complement the current focus on high-resolution modeling of the ocean.

        Speaker: Sergey Frolov (NOAA)
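
        A generic argument, independent of the particular system described above, helps explain why ensemble averaging extends the apparent skill horizon. Once a scale is no longer constrained by the initial conditions, the errors of individual members become uncorrelated with the truth, so for an N-member ensemble mean and climatological variance \sigma^2 the mean-squared errors saturate at

        MSE(ensemble mean) -> \sigma^2 (1 + 1/N),    MSE(deterministic forecast) -> 2 \sigma^2 .

        The ensemble mean therefore damps the unpredictable scales and saturates near \sigma rather than \sqrt{2}\,\sigma, while the scales still constrained by the analysis are retained, consistent with the longer skill horizon reported above.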
      • 10:30
        Discussion 10m
      • 10:40
        Coffee break 20m
      • 11:00
        Invited talk: Requirements on Ocean Data Assimilation Methods to meet Seamless Predictions needs 30m

        A priority for ocean data assimilation methods is the appropriate initialization of the ocean for seamless forecasts of weather and climate, from days to decades. The relevant ocean processes span a wide range of spatial and time scales, and their correct initialization poses a major challenge for the data assimilation methodology. Thus, at the time range of days to weeks, assimilation methods targeting the accurate and balanced initialization of sharp SST fronts and the ocean mixed layer are required. As we move to seasonal time scales, the balanced initialization of the thermal structure in the upper few hundred meters and of equatorial waves becomes important. Decadal forecasts require initialization of the deeper parts of the ocean and associated transports. Methods for consistent and efficient assimilation of sea-ice information are also needed. Extended-range, seasonal and decadal prediction require historical reforecasts spanning several decades, which are initialized from ocean or coupled reanalyses. Consistency between the historical reanalysis and the real-time ocean initial conditions is essential. Reliable multi-decadal ocean reanalyses need methods for dealing with model error, so as to prevent spurious climate signals arising from the changing ocean observing systems. This methodology should be robust across a variety of climate regimes. This presentation illustrates these different aspects with experiments from the ECMWF extended-range and seasonal forecasting systems. It also discusses the need for evaluation methodology for the development of multi-scale data assimilation methods.

        Speaker: Dr Magdalena Balmaseda (ECMWF)
      • 11:30
        Evaluation of eddy-properties in operational oceanographic analysis systems 20m

        Recent studies have shown that the presence of oceanic eddies affects the intensification of high-impact tropical cyclones. Many operational weather prediction systems (e.g. in Canada, the UK and Europe) have now moved to using fully coupled atmosphere-ocean prediction models. As a result, the accuracy with which ocean analysis systems are able to constrain the presence and properties of oceanic eddies may affect tropical cyclone forecast skill. While numerous eddy identification and tracking methods have been developed for oceanic eddies, specific methods and metrics tailored to verifying the skill of ocean analyses and forecasts in capturing these features are lacking. Here we apply open-source eddy-tracking software and adapt it to match eddies between gridded observational analyses and two ocean analysis products of different resolution (1/4° and 1/12°). The ocean analysis products are the Global and Regional Ice Ocean Prediction Systems run operationally at Environment and Climate Change Canada. The systems share a common data assimilation approach, with the main differences between them being the model resolution and the inclusion of tides in the regional system. A contingency-table approach is taken to identify hits, misses and false alarms, providing statistics on the probability of detection and the false alarm ratio (defined in the sketch after this entry). These statistics are investigated in terms of their sensitivity to eddy properties (radius, amplitude). The results clearly demonstrate the added value of higher resolution in accurately representing eddy features: the higher-resolution analyses provide a higher probability of detection with a lower false alarm ratio, and errors in eddy radius are also reduced in the 1/12° analyses.

        Speaker: Dr Gregory Smith (ECCC)
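
        For reference, the detection statistics quoted above follow the standard contingency-table definitions: with H matched eddies (hits), M observed eddies with no analysis counterpart (misses) and F analysis eddies with no observed counterpart (false alarms),

        POD = H / (H + M),    FAR = F / (H + F),

        so that a perfect analysis gives POD = 1 and FAR = 0.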
      • 11:50
        Regional Analysis of Indian OceaN (RAIN) 20m

        Authors: Balaji B. (1,2,3), Arya Paul (1), Biswamoy Paul (1), Francis P. A. (1)

        Affiliation:

        1. Indian National Centre for Ocean Information Services, Ministry of Earth Sciences, Govt. of India, Hyderabad, 500090, India
        2. Indian Institute of Tropical Meteorology, Ministry of Earth Sciences, Govt. of India, Pune, 411008, India
        3. Department of Marine Geology, Mangalore University, Mangalagangotri, Karnataka, 574199, India

        Abstract:
        RAIN (Regional Analysis of Indian OceaN) is a data assimilation system developed at INCOIS in which ROMS (Regional Ocean Modeling System), an ocean general circulation model suited to regional basins and used by INCOIS as the forecast model for the Indian Ocean, is interfaced with the Local Ensemble Transform Kalman Filter (LETKF) assimilation scheme (the standard LETKF update is sketched after this entry). The system assimilates in-situ temperature and salinity profiles and satellite track data of sea-surface temperature (SST). The ensemble members of the assimilation system are initialized with different model coefficients, such as diffusion and viscosity parameters, and with different atmospheric forcings. In addition, the ensemble members use two different mixing schemes, K-profile parameterization and Mellor-Yamada. This strategy exploits the benefits of varied mixing parameterizations and helps arrest filter divergence. The assimilation system is validated extensively against multiple observations, ranging from RAMA moorings to ADCP and satellite observations, for both dependent variables such as temperature and salinity and independent variables such as sea-level anomaly and ocean currents. The assimilative system simulates the ocean state better than the previous operational ROMS setup, with improvements at all vertical levels: higher correlation with observations and reduced root-mean-square error.

        Speaker: Balaji Baduru (Indian National Centre for Ocean Information Services)
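
        For orientation, the LETKF update that schemes of this kind are built on (Hunt et al., 2007) can be written, in each local region, as

        \tilde{P}^a = [ (k-1) I + (Y^b)^T R^{-1} Y^b ]^{-1}
        \bar{w}^a   = \tilde{P}^a (Y^b)^T R^{-1} ( y^o - \bar{y}^b )
        \bar{x}^a   = \bar{x}^b + X^b \bar{w}^a,    X^a = X^b [ (k-1) \tilde{P}^a ]^{1/2}

        where X^b is the matrix of background ensemble perturbations, Y^b its projection into observation space, k the ensemble size, R the observation error covariance and y^o the local observations; covariance inflation, used in practice to counter filter divergence, is omitted from this sketch.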
      • 12:10
        Forecast Sensitivity to Observations and the U.S. Integrated Ocean Observing System 20m

        The U.S. Integrated Ocean Observing System (IOOS) forms the backbone of real-time ocean analysis-forecast systems for U.S. territorial waters. In addition to satellite remote sensing, the IOOS is augmented with in situ observations from a variety of platforms including Argo floats, buoys and gliders. Remote-sensing observations of surface currents are also available from an extensive national network of coastal HF radars. Maintaining these observing systems is labor-intensive and costly. Routine monitoring of the impact of data from each element of the observing network is therefore recognized as an important activity, not only for maintaining the array and demonstrating its value, but also as an aid for planning future expansions of the observing system. This talk will focus on current efforts to quantify forecast sensitivity to observations (FSO) in analysis-forecast systems for the U.S. west coast and east coast circulations (see the sketch after this entry).

        Speaker: Prof. Andrew Moore (University of California Santa Cruz)
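
        As background, a widely used adjoint-based formulation for this kind of observation-impact monitoring is that of Langland and Baker (2004); it is given here as a hedged sketch of the general approach rather than the exact algorithm of the systems above. The change in a scalar forecast-error measure e attributable to the assimilated observations is approximated as

        \delta e \approx ( y - H(x^b) )^T K^T M^T \partial e / \partial x_f

        where y - H(x^b) are the innovations, K^T is the adjoint of the analysis gain, M^T the adjoint of the forecast model and \partial e / \partial x_f the gradient of the error measure with respect to the forecast state. Because \delta e is an inner product over the observations, the terms belonging to each platform (Argo floats, gliders, HF radar, ...) can be summed separately to estimate that platform's impact.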
      • 12:30
        Lunch break 1h
    • 13:30 15:15
      Working Group session 1
      • 13:30
        Introduction to the Working Groups 15m
        Speakers: Andrew Moore (University of California Santa Cruz), Matthew Martin (Met Office), Magdalena Alonso Balmaseda (ECMWF)
      • 13:45
        Working Group 1 - Chairs: Andrea Storto, Ann Kristin Sperrevik - Rapporteur: Phil Browne 1h
      • 13:45
        Working Group 2 - Chairs: Mike Bell, Elisabeth Remy - Rapporteur: Matt Martin 1h
      • 14:45
        Coffee break 30m
    • 15:15 19:30
      Working Group session 2
      • 15:15
        Working Group 1 - Chairs: Andrea Storto, Ann Kristin Sperrevik - Rapporteur: Phil Browne 2h 15m
      • 15:15
        Working Group 2 - Chairs: Mike Bell, Elisabeth Remy - Rapporteur: Matt Martin 2h 15m
      • 16:30
        Working Group 3 - Chairs: Patrick Heimbach, Emily Smith - Rapporteur: Marcin Chrust 3h
    • 08:00 15:30
      Working Group session 3
      • 08:00
        Working Group 4 - Chairs: Peter Oke, Colette Kerry - Rapporteur: Hao Zuo 2h
      • 09:00
        Working Group 1 - Chairs: Andrea Storto, Ann Kristin Sperrevik - Rapporteur: Phil Browne 3h
      • 09:00
        Working Group 2 - Chairs: Mike Bell, Elisabeth Remy - Rapporteur: Matt Martin 3h
      • 12:00
        Working Group chairs prepare reports 1h
      • 13:00
        Lunch break 1h
      • 14:00
        Plenary: reports from the Working Groups, final discussion & close 1h 30m
        Speakers: Magdalena Alonso Balmaseda (ECMWF), Andrew Moore (University of California Santa Cruz)