1ECMWF
There are minor changes to mean analysis fields when aircraft or radiosonde data are withheld from the ECMWF analysis system. Aircraft data have dramatically increased in volume over the last three decades and so associated changes in biases may be of concern to reanalysis users.
With the removal of aircraft data there are minor changes to mean winds at 200 hPa, especially over the North Atlantic.
Mean zonal wind at 200 hPa over east Asia, especially China, changes somewhat in winter when radiosonde data are excluded, possibly due to problems at low radar elevation angles (note that at least 60% of radiosonde winds now come from GPS rather than radar).
Aircraft temperatures are biased warm on average, and they are bias-corrected in the ECMWF system; this reduces the problem but does not eliminate it. Assimilating the data slightly increases temperatures at cruise levels (~250 hPa) and at lower levels over some airports in the USA. There is an air-speed correction to the raw temperature measurement, and there is some evidence that this correction could be improved (in the on-board processing) to remove the bias.
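The air-speed correction referred to here follows from the standard total-air-temperature relation for a moving probe; a minimal sketch (the recovery factor and Mach number below are illustrative assumptions, not ECMWF's operational values):

```python
def static_air_temperature(t_total_k, mach, recovery=1.0):
    """Recover static air temperature (K) from a probe's measured total
    air temperature using T_total = T_static * (1 + 0.2 * r * M^2),
    where 0.2 = (gamma - 1) / 2 for dry air (gamma = 1.4).
    The recovery factor r is instrument-specific; r = 1 is an idealised probe."""
    return t_total_k / (1.0 + 0.2 * recovery * mach**2)

# At cruise (Mach ~0.8) dynamic heating is of order 30 K, so a small error
# in the assumed recovery factor maps into a temperature bias of a few tenths
# of a kelvin, comparable to the warm bias discussed above.
t_static = static_air_temperature(t_total_k=250.0, mach=0.8, recovery=0.98)
```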
1Institute of Atmospheric Physics ASCR, 2IAP CAS
During the last decades, several reanalyses have been released, and they are widely used in middle-atmosphere climatology and trend studies. The S-RIP project analysed most of the commonly used reanalyses except ERA5. In our study, we analyse newer reanalyses such as JRA-3Q, R21C and some NOAA reanalyses. We examine the climatology of basic parameters (temperature, winds and relative humidity) at all available pressure levels. Next, we analyse the behaviour of several major SSWs. Results are compared with available observations. We also aim to show the temperature trends of the reanalyses mentioned.
1Institute of Oceanography, Universität Hamburg, 2University of Bordeaux, CNRS, Bordeaux INP, EPOC, 3Met Office, 4Shanghai Jiao Tong University, School of Oceanography, 5Meteorological Institute, Universität Hamburg
To initialize decadal climate predictions (DCPs), the widespread practice is to introduce separate ocean-only and atmosphere-only reanalyses into a coupled climate model through nudging and then to begin DCPs from the nudged states. Possible inconsistencies that may lead to initialization shocks from introducing external uncoupled reanalyses into the ocean and atmosphere components of a prediction system are a known and long-established issue in the seasonal and decadal climate prediction communities. Initialization shocks might lead to a loss of prediction skill at longer lead times. For example, it is known that full-field initialization can result in a disruption of the Atlantic meridional overturning circulation (AMOC). However, how critical such AMOC issues at the assimilation step are for the prediction skill, e.g., at the surface of the ocean, is not yet fully understood.
This study is concerned with deriving a coupled reanalysis as a source of coupled initial conditions for DCPs that are dynamically consistent with each other and with the prediction system. The reanalysis is based on the coupled adjoint method designed for the Earth System Model of intermediate complexity CESAM (Centrum für Erdsystemforschung und Nachhaltigkeit Erdsystem Assimilations-Modell). Test versions of the coupled reanalysis show clearly that different settings of the assimilation affect the behavior of the AMOC. In a model-consistent approach, the study compares initialization of the AMOC based on the coupled reanalysis with initialization based on coupled nudging toward the separate ocean and atmosphere reanalyses. We also analyze the AMOC from the multi-model CMIP6 DCPs to identify whether possible flaws in the AMOC assimilations could be linked to issues with the prediction skill for important climate indices. The results of this study aim to guide future initialization developments for DCPs with comprehensive Earth System Models.
1SMHI
The Copernicus European regional reanalysis (https://climate.copernicus.eu/regional-reanalysis-europe) is produced as part of the Copernicus Climate Change Service (C3S). The presentation will introduce the service and its main objectives and give an overview of the available data. Data quality will be demonstrated by comparison with ERA5 and other gridded datasets.
The Copernicus European Regional ReAnalysis (CERRA) is produced with a setup of the HARMONIE-ALADIN model system, including a 3D-Var data assimilation scheme for upper-air observations and an OI scheme for surface observations. The model domain covers all of Europe at a horizontal resolution of 5.5 km. The system provides eight analyses per day – at 00 UTC, 03 UTC, 06 UTC, … and 21 UTC. Between the analyses, data are available at hourly resolution from the forecast model. More than fifty parameters are available on various level types, including surface parameters and data up to 1 hPa. Data are available for the period September 1984 – June 2021 through the Copernicus Climate Data Store (CDS).
In addition to CERRA, the service produced a reanalysis with an ensemble data assimilation (EDA) system, called CERRA-EDA. CERRA-EDA consists of 10 members and is produced with a horizontal resolution of 11 km. Uncertainty information from the ensemble system is used as the flow-dependent part of the B-matrix for CERRA. Another product of the service is CERRA-Land. CERRA-Land provides daily precipitation analyses as well as parameters from the surface/soil model SURFEX, driven with CERRA data.
In the presentation, the production chain will be illustrated and the availability of data will be clarified. The focus will be on CERRA. The quality of the regional reanalysis will be demonstrated by comparison with other gridded datasets, among others ERA5. For instance, investigations of the winter storm Gudrun (January 2005, southern Sweden) will be presented.
1BIRA-IASB, 2Oskar Klein Centre for Cosmoparticle Physics, Department of Physics, Stockholm University, Stockholm, Sweden, 7Imperial Centre for Inference and Cosmology, Department of Physics, Imperial College London, Blackett Laboratory, London, UK, 3Universidad Complutense de Madrid, Madrid, Spain, 4National Institute of Water and Atmospheric Research, Lauder, New Zealand, 5National Center for Atmospheric Research, Boulder, CO, USA, 6Institute of Astrophysics and Geophysics, UR SPHERES, University of Liège, Liège, Belgium, 7Institute of Meteorology and Climate Research (IMK-ASF), Karlsruhe Institute of Technology, Karlsruhe, Germany, 8School of Chemistry, University of Wollongong, Wollongong, Australia, 9Department of Earth, Space and Environment, Chalmers University of Technology, Gothenburg, Sweden
The Brewer-Dobson Circulation (BDC) determines the distribution of long-lived tracers in the stratosphere; changes in these tracers can therefore be used to diagnose changes in the BDC. We evaluate decadal (2005-2018) trends of nitrous oxide (N2O) in a chemistry-transport model (CTM) driven by four different reanalyses by comparing them with space-borne measurements from the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) and with ground-based Fourier Transform Infrared (FTIR) measurements. The limited sensitivity of the FTIR measurements can hide negative N2O trends in the mid-stratosphere because of the large increase in the lowermost stratosphere. When the ACE-FTS measurement sampling is applied to the CTM experiments, the reanalyses from the European Centre for Medium-Range Weather Forecasts (ECMWF) compare best with ACE-FTS, but the N2O trends are consistently exaggerated. Model sensitivity tests show that the decadal N2O trends reflect changes in stratospheric transport. While few ideal observational datasets currently exist, this reanalysis study of N2O trends still provides new insights into the BDC and its changes, thanks to the careful analysis of the ACE-FTS sampling and the relevant sensitivity tests.
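The effect of applying the satellite sampling to model output before computing trends can be illustrated with a toy example (all numbers below are invented for illustration; this is not ACE-FTS data or the study's actual CTM output):

```python
import numpy as np

def decadal_trend(years, series):
    """Least-squares linear trend expressed in % per decade
    relative to the series mean."""
    slope, _ = np.polyfit(years, series, 1)
    return 10.0 * slope / np.mean(series) * 100.0

rng = np.random.default_rng(0)
years = np.arange(2005, 2019, dtype=float)
# Hypothetical annual-mean model N2O (ppb) with a steady increase plus noise.
model_n2o = 320.0 + 0.9 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Satellite sampling mask: True where coverage exists that year
# (gap years chosen arbitrarily for the example).
sampled = np.ones(years.size, dtype=bool)
sampled[[3, 7, 11]] = False

full_trend = decadal_trend(years, model_n2o)
sampled_trend = decadal_trend(years[sampled], model_n2o[sampled])
```

Comparing `full_trend` and `sampled_trend` shows how uneven sampling alone can shift a diagnosed trend, which is why the study applies the ACE-FTS sampling to the CTM before comparison.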
1SPASCIA, 2ECMWF, 3University of Reading
Satellite missions during the 1960s and 1970s generated significant amounts of Earth observation data. The majority of these data, however, are not currently exploited in climate reanalysis, despite their potential value for constraining the evolution of global weather. This paper highlights work from a C3S satellite data rescue project, aiming to recover, assess, improve and prepare a selection of early satellite data records that will help to improve the ECMWF’s next centennial climate reanalysis ERA6, and beyond.
These early datasets were measured by a range of infrared and microwave radiometers, sounders and imagers, flown mainly on the Nimbus series of satellites from 1964 to 1979. The sounding instruments in particular have the potential to greatly improve the quality of future reanalyses during the 1970s, especially in the upper atmosphere where few in-situ measurements were taken.
We present examples of improvements made to early satellite data quality, such as revised timing and geolocation, quantifying and reducing errors typical of these heritage datasets. Extensive quality flagging enhances the overall quality and usability of each dataset. Comparison (O-A) of the observations with ERA5-based radiative transfer model (RTTOV) simulations provides an additional means of assessing data quality and characterising biases in the data, as well as information about the quality of the reanalysis. We find evidence of model temperature biases in the upper atmosphere from the study of spatial and temporal trends in O-A across different instrument datasets, supported by independent observations from radiosondes and rocketsondes.
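The O-A diagnostic described above amounts to per-channel statistics of observed minus simulated brightness temperatures; a rough sketch (the brightness temperatures and channel biases here are synthetic, purely for illustration):

```python
import numpy as np

def oma_stats(obs_bt, sim_bt):
    """Per-channel O-A statistics.
    obs_bt, sim_bt: (n_obs, n_channels) brightness temperatures in K.
    Returns the mean (bias) and sample standard deviation of O-A."""
    oma = obs_bt - sim_bt
    return {"bias": oma.mean(axis=0), "std": oma.std(axis=0, ddof=1)}

rng = np.random.default_rng(1)
# Synthetic RTTOV-like simulations for 3 channels, plus observations with
# invented per-channel biases (+0.5, -1.2, 0.0 K) and 0.8 K random error.
sim = 220.0 + rng.normal(0.0, 2.0, (1000, 3))
obs = sim + np.array([0.5, -1.2, 0.0]) + rng.normal(0.0, 0.8, (1000, 3))
stats = oma_stats(obs, sim)
```

Trends of the `bias` term in time or across the scan are what reveal both instrument problems (e.g. geolocation, calibration drift) and model biases, as discussed above.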
We have also tested ECMWF’s existing variational bias correction model on the reprocessed satellite datasets and suggest improvements to these models to account for the specific error characteristics of each sensor. The final set of data will then be used to provide valuable information for the 1960s and 1970s, potentially as part of the ERA6 reanalysis, and beyond.
1RCAST, University of Tokyo, 2Japan Meteorological Agency, 3Tohoku University, 4Meteorological Research Institute
An overview is provided of ClimCORE (Climate change actions with CO-creation powered by Regional weather information and E-technology; Project Leader: H. Nakamura), a 10-year project initiated in December 2020 with funding from the Japan Science and Technology Agency. It aims to construct high-resolution regional atmospheric reanalysis datasets for Japan and to promote their use for various academic and industrial purposes. To produce RRJ-ClimCORE, a high-quality, high-resolution (5 km) regional reanalysis dataset for Japan, the Research Center for Advanced Science and Technology (RCAST) of the University of Tokyo has been collaborating with the Japan Meteorological Agency (JMA) to implement the latest version of the operational non-hydrostatic meso-scale forecast system MSM (with 96 levels) and its 4D-Var assimilation system on the University’s supercomputer. Most of the observational data that have been used in the JMA operational global and meso-scale analyses will be assimilated, including conventional and satellite observations as well as hourly 1-km resolution gridded precipitation analysis data over Japan based on radar and rain-gauge measurements. An experimental production was conducted with the previous version of the MSM (76 levels) for 20 months from September 2020. It shows that RRJ-ClimCORE can realistically represent stormy winds and topographically intensified regional rainfall associated with strong typhoons, in addition to organized convective systems that caused torrential rainfall in Kyushu. ClimCORE also supports the production of RRJ-Conv, a 60-year high-resolution (5 km) reanalysis dataset based only on conventional observations and tracked typhoon centers. That production uses the JMA non-hydrostatic regional climate model with LETKF assimilation, under a collaboration between Tohoku University and the Meteorological Research Institute.
RRJ-Conv is found to represent the frequency and spatial distribution of heavy rainfall events (100 mm/day) reasonably well over Japan for the most recent 20 years. We note that JMA and ClimCORE will jointly host the WCRP 6th International Conference on Reanalysis in Tokyo in the fall of 2024.
1SMHI
A gridded dataset (SMHIGridClim) was produced in 2021 to meet the need for a high-resolution climate reference dataset at SMHI. Here, we present this dataset, along with the current work aiming to extend the data.
The present dataset covers the years 1961–2018 over an area including the Nordic countries with 2.5 km horizontal resolution. The variables included are two-metre temperature and relative humidity at 1-, 3- or 6-hourly resolution, depending on the time period, along with daily minimum and maximum temperatures, daily precipitation and daily snow depth.
Observations for the analysis were provided by the Swedish, Finnish and Norwegian meteorological institutes, and additional data were retrieved from ECMWF's MARS archive. Quality checks were performed using the open-source software TITAN (https://github.com/metno/TITAN), developed at the Norwegian Meteorological Institute. The gridding was done using optimal interpolation with the gridpp open-source software (https://github.com/metno/gridpp), also from the Norwegian Meteorological Institute.
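The optimal-interpolation step can be sketched in its simplest single-pass form (a minimal numpy illustration with assumed Gaussian correlations and error variances; this is not gridpp's actual API or configuration):

```python
import numpy as np

def oi_update(xb_grid, grid_pts, yo, xb_obs, obs_pts,
              length_scale=50e3, sigma_b=1.0, sigma_o=0.5):
    """Single optimal-interpolation update,
        x_a = x_b + B H^T (H B H^T + R)^(-1) (y - H x_b),
    with Gaussian background-error correlations of a given length scale.
    xb_grid: background on analysis points (n_g,); grid_pts: (n_g, 2) in m;
    yo: observations (n_o,); xb_obs: background interpolated to obs points;
    obs_pts: (n_o, 2) observation locations in m."""
    def corr(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return np.exp(-0.5 * (d / length_scale) ** 2)
    b_go = sigma_b**2 * corr(grid_pts, obs_pts)   # grid-to-obs covariances (B H^T)
    b_oo = sigma_b**2 * corr(obs_pts, obs_pts)    # obs-to-obs covariances (H B H^T)
    r = sigma_o**2 * np.eye(yo.size)              # uncorrelated obs errors
    return xb_grid + b_go @ np.linalg.solve(b_oo + r, yo - xb_obs)

# A single observation coinciding with a grid point pulls the analysis
# sigma_b^2 / (sigma_b^2 + sigma_o^2) = 1 / 1.25 = 80% of the way towards it.
xa = oi_update(np.array([0.0]), np.array([[0.0, 0.0]]),
               np.array([1.0]), np.array([0.0]), np.array([[0.0, 0.0]]))
```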
The first guess is a statistically downscaled forecast from the UERRA-HARMONIE reanalysis at 11 km horizontal resolution. The downscaling is based on a linear regression between the operational MEPS NWP system at 2.5 km and UERRA-HARMONIE. Currently, we are preparing the switch of the first guess to CERRA, the successor of UERRA-HARMONIE. A comparison over the overlapping time period 1985-2018 will be carried out.
Moreover, we are working on adding 10-metre wind speed as a new parameter to the dataset. However, wind observations are often not representative, since they are affected by very local conditions. Hence, the 10-metre wind speed will be derived independently of observations, using the fit from MEPS only.
1University of Ljubljana, Faculty of Mathematics and Physics, 2University of Bergen
The strength of the Hadley circulation (HC) affects the precipitation distribution in the tropics and subtropics. An accurate description of this circulation is crucial for future climate projections. However, recent studies have identified opposing trends of Hadley circulation strength between climate models and reanalyses. These disparities have been attributed to artefacts in the representation of latent heating in reanalyses or the inability of climate models to capture internal climate system variability.
We investigate whether these artefacts in the representation of latent heating are present in the latest ERA5 reanalysis. We use the extended Kuo-Eliassen equation to decompose the mean meridional circulation and identify the meridional gradient of diabatic heating as the main physical process and, therefore, the main driver of both HC strength variability and its recent strengthening. We analyse the relationship between HC strength, meridional gradient of diabatic heating and observed meridional gradient of precipitation from the Global Precipitation Climatology Project (GPCP) and the Tropical Rainfall Measuring Mission (TRMM) datasets. The results demonstrate consistent standardised trends and variability among these variables.
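HC strength in such analyses is commonly defined via the zonal-mean meridional mass streamfunction; a minimal sketch of that standard diagnostic follows (illustrative code, not the authors' implementation; the extended Kuo-Eliassen decomposition itself is not reproduced here):

```python
import numpy as np

A_EARTH = 6.371e6  # Earth radius, m
G = 9.81           # gravitational acceleration, m s^-2

def mass_streamfunction(v_zm, lat_deg, p_pa):
    """Zonal-mean meridional mass streamfunction (kg/s),
        psi(lat, p) = (2 pi a cos(lat) / g) * integral_0^p [v] dp'.
    v_zm: (n_lat, n_p) zonal-mean meridional wind (m/s);
    p_pa: (n_p,) pressure levels increasing from the model top."""
    coslat = np.cos(np.deg2rad(lat_deg))[:, None]
    # cumulative trapezoidal integral in pressure, starting from the top
    dp = np.diff(p_pa)
    vbar = 0.5 * (v_zm[:, 1:] + v_zm[:, :-1])
    integral = np.concatenate(
        [np.zeros((v_zm.shape[0], 1)), np.cumsum(vbar * dp, axis=1)], axis=1)
    return 2.0 * np.pi * A_EARTH * coslat / G * integral

# HC strength is then often taken as the extremum of psi in the tropics
# (typically around 500 hPa), one value per hemispheric cell.
```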
Comparison of HC strength in ERA5 with different decadal-to-multidecadal variability indices reveals a strong correlation between both Hadley cells and the Atlantic multidecadal variability (AMV). AMV is characterised by sea surface temperature fluctuations in the Northern Atlantic with a temporal period of 50-70 years. Other reanalyses also confirm this correlation for the southern Hadley cell over a longer time interval. Recent studies also indicate that ERA5 strengthens the meridional sea-level pressure gradient, which is directly proportional to the HC strength. This trend, however, contradicts observations. These results indicate that further analysis is needed to understand the differences in trends of Hadley circulation strength between climate models and reanalyses.
1NASA JPL, 2Royal Netherlands Meteorological Institute (KNMI), 3Jet Propulsion Laboratory/California Institute of Technology, 4Graduate School of Environmental Studies, Nagoya University, 5Japan Agency for Marine-Earth Science and Technology, 6Japan Agency for Marine-Earth Science and Technology
Global lockdown measures to prevent the spread of the 2019 novel coronavirus (COVID-19) led to reductions in air pollutant emissions. While the COVID-19 lockdown impacts on both trace gases and total particulate pollutants have been widely investigated, secondary aerosol formation from trace gases remains unclear. To that end, we quantify the COVID-19 lockdown impacts on NO$_{\rm x}$ and SO$_2$ emissions and on sulfate-nitrate-ammonium aerosols using multi-constituent satellite data assimilation and model simulations. The assimilated satellite observations were obtained from S5P/TROPOMI for NO$_2$ and SO$_2$ during January-June in 2019 and 2020. We find that anthropogenic emissions over major polluted regions were reduced by 19-25% for NO$_{\rm x}$ and 14-20% for SO$_2$ during April 2020. These emission reductions led to 8-21% decreases in sulfate and nitrate aerosols over highly polluted areas, corresponding to $>$ 34% of the observed declines in aerosol optical depth and a global aerosol radiative forcing of $+$0.14 W$\cdot$m$^{-2}$ relative to the business-as-usual scenario. These results point to the critical importance of secondary aerosol pollutants in quantifying the climate impacts of future mitigation measures.
1University of Vienna
Oceanic transports of heat, volume and salinity are an integral part of the Earth's energy and mass budgets and play a key role in regulating the Earth's climate. There are several measuring lines, like the Arctic gateways or the OSNAP and RAPID arrays further south, consisting of moorings and other instruments that can measure deep-water velocities and other sea state variables. It is desirable to compare the transports calculated from these instruments with ocean reanalyses or climate models. However, this is challenging because the moorings are not aligned with the model grids, and the ocean model grids become increasingly complicated at high northern latitudes.
We have developed StraitFlux, a new set of tools for calculating accurate volume, heat and salinity transports through any oceanic section. StraitFlux works on several curvilinear model grids, including different versions of the ORCA grid as used by NEMO, and thus the GREP ensemble. It incorporates two methods: the first, using line integration, provides integrated net transports, while the second uses vector-projection algorithms to produce cross sections of currents, temperature and salinity in the vertical plane. This allows a consistent comparison with observational flux estimates.
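On a discretised section, the line-integration method reduces to summing the velocity component normal to the section times the cell areas; a minimal sketch (illustrative only, not StraitFlux's implementation, which additionally handles curvilinear grids and partial cells):

```python
import numpy as np

def net_volume_transport(v_normal, dx, dz):
    """Net volume transport (Sv) through a section discretised into cells.
    v_normal: (n_z, n_x) velocity normal to the section (m/s, sign = direction);
    dx: (n_x,) cell widths (m); dz: (n_z,) cell thicknesses (m)."""
    area = dz[:, None] * dx[None, :]          # cell face areas, m^2
    return np.sum(v_normal * area) / 1e6      # m^3/s -> Sverdrup

# e.g. a 100 km wide, 1000 m deep section with a uniform 0.1 m/s inflow:
v = np.full((10, 20), 0.1)
transport = net_volume_transport(v, dx=np.full(20, 5e3), dz=np.full(10, 100.0))
# -> 10.0 Sv
```

Heat and salinity transports follow the same pattern, weighting the velocity by temperature (times rho and c_p) or salinity in each cell.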
We have used StraitFlux to calculate transports in the main Arctic gateways, the Greenland-Scotland Ridge, as well as in the narrow passages of the Indonesian Throughflow Region and compare them to available observations. While we find some biases, especially in straits that are narrow and bathymetrically complicated, the results generally show that reanalyses capture the main current patterns quite well. As StraitFlux works on various modelling grids it can also be applied to output from the Coupled Model Intercomparison Project Phase 6 (CMIP6). We use reanalyses to validate oceanic transports from historical CMIP6 model runs and find larger and often systematic deviations from the mooring and reanalysis output.
1eo-winds.net
This poster will address the use of ERA5 data for Wind and Metocean Site Condition Assessments, with a focus on offshore wind projects. The added value of ERA5 compared to other reanalysis datasets will be demonstrated via comparisons against high-quality, wind-energy-specific in-situ measurements. Similarly, shortcomings of the ERA5 dataset, in particular the overestimation of surface drag in strong, young wind-sea conditions, will be discussed in detail. The purpose of sharing this information is twofold: to provide end-user feedback, and to advocate for the use of publicly available, high-quality, wind-energy-specific in-situ measurements in future validation work (ERA6 in particular).
1Meteorological Research Institute, 2Numerical Prediction Development Center / Japan Meteorological Agency, 3JMA, 4Meteorological Research Institute / Japan Meteorological Agency
The Japan Meteorological Agency (JMA) has developed the third Japanese global atmospheric reanalysis, the Japanese Reanalysis for Three Quarters of a Century (JRA-3Q). The objective of JRA-3Q is to improve the quality and extend the period of long-term reanalysis products. JRA-3Q is the third long-term reanalysis after the Japanese 25-year Reanalysis (JRA-25) and the Japanese 55-year Reanalysis (JRA-55), and it covers the period from September 1947 to the present, extending back about 10 years earlier than JRA-55. JRA-3Q is based on the TL479 version of the JMA global numerical weather prediction (NWP) system as of December 2018 and uses results of developments in the operational global NWP system, boundary conditions, and forcing fields achieved at JMA since JRA-55. The enrichment of observations through data-rescue activities and satellite data reprocessing by meteorological and satellite centers has also helped to improve the JRA-3Q product.
The initial quality evaluation revealed major improvements over JRA-55 in the global energy budget and in the representation of tropical cyclones (TCs). The large upward imbalances in the global mean net energy flux at the top of the atmosphere and at the surface, among the major problems of JRA-55, have been significantly reduced in JRA-3Q. The artificial weakening trend of TCs apparent in JRA-55 has been resolved using a method that generates TC bogus data based on the JMA operational system. The quality evaluation for the pre-1957 period, which was not covered by the previous JMA reanalyses, shows that the representation of major typhoons, such as Typhoon Kathleen in 1947 and Typhoon Marie in 1954, is generally consistent with the weather maps produced in those days. Several problems remain, including the diminished representation of stratospheric warming after large volcanic eruptions. This presentation discusses the causes of such problems and possible solutions for future reanalyses.
1National Centre for Atmospheric Science, University of Reading
Extreme wind events are among the costliest natural disasters in Europe. Significant effort is dedicated to understanding the risk of such events, usually analysing observed storms in the modern era. However, it is likely that some historical windstorms were more extreme and/or followed different tracks from those in the modern era. Producing plausible reanalyses of such events and translating them into a warmer world would improve the quantification of current and future windstorm risks.
Billions of historical climatological observations remain unavailable to science as they exist only on paper, stored in numerous archives around the world. We demonstrate how the rescue of such paper observations has improved our understanding of an extreme windstorm that occurred in February 1903 and its significant impacts. By assimilating newly rescued atmospheric pressure observations into the 20th Century Reanalysis system, the storm is now credibly represented in an improved reanalysis of the event. In some locations this storm produced stronger winds than any event during the modern era. As a result, estimates of risk from severe storms, based on modern period data, may need to be revised.
In addition, we use novel reanalysis experiments to translate this windstorm into a warmer world to quantify how it might be different both in the present and in the future. We find that the same storm produces more intense rainfall and stronger winds in a warmer climate, providing a new approach to quantifying how extreme weather events will change as the world is warming.
1Ocean University of China, 2Pacific Northwest National Laboratory
Extreme weather events, such as heat waves and extreme precipitation, have occurred frequently and can also substantially affect air quality. Understanding how extreme weather events respond to a warming climate is critically important, and numerical models are important tools for understanding the associated mechanisms. With the aid of advanced supercomputers, we have recently made progress in the development and assessment of ultra-high-resolution Earth system models (HRESMs) based on the Community Earth System Model (CESM), including configurations with a 25 km atmosphere and 10 km ocean, up to a 5 km atmosphere and 3 km ocean. We will show how a warming climate modulates climate extremes, including land and marine heat waves, extreme precipitation and its association with atmospheric rivers, as well as the climate effect on air quality, through comparisons between HRESMs and commonly used CMIP ensembles (CMIP5, CMIP6).
1ECMWF, 2University of Reading
To establish climatologies that facilitate the contextualization of extreme, high-impact weather events in relation to historical occurrences or to comprehend the influence of climate change on their intensity and frequency, an extensive collection of observations extending as far back as possible is essential. Yet, these observations exhibit inaccuracies and uneven distributions in space and time. These attributes may lead to a distorted representation of past weather and climate, particularly for variables like rainfall, which can exhibit substantial variations in space and time.
Reanalyses and reforecasts fill the gaps in the observational records. Existing literature has demonstrated that both reanalysis and reforecast datasets offer a more accurate representation of past weather and climate, owing to their global completeness and temporal consistency. Nonetheless, reanalyses and reforecasts may not adequately depict localized and/or rare events as effectively as observational climatologies might do (provided sufficient observations are available) due to the coarse spatial resolutions of both modelled datasets. This misrepresentation is particularly pertinent for discontinuous variables, such as precipitation.
In this presentation, we will examine the representation of point-rainfall climatologies by four distinct global modelled datasets: ERA5_EDA (reanalysis, 62km), ERA5 (reanalysis, 31km), ECMWF reforecasts (reforecasts, 18 km), and ERA5_ePoint (reanalysis, point-scale over an 18km grid). Furthermore, we will discuss the implications of this study on future calculations of reference climatologies for localized extreme precipitation events.
1MATE Hungarian University of Agriculture and Life Sciences
Siphoned pluviographs have provided a significant share of rainfall-intensity data, and they were most common before the wider adoption of tipping-bucket and weighing gauges. The pluviographs operate periodically: as the rainfall height reaches the upper edge of the slowly rotating registration tape, the pen must return to the starting position, and during this process the registration is paused. There were sometimes other correction procedures as well, comparing the measured rain depth to a normal rain gauge and redistributing the difference in some way. These correction procedures did not account for rainfall intensity, which has a strong influence on the error. This systematic error causes under-measurement, mainly when the rainfall intensity is significant. If the rainfall registration tape is available, the error can be fixed using a method such as Luyckx and Berlamont’s. In most cases, however, the tape is no longer available, and the data exist only in an extracted form: the tape had been processed, and only the most intensive rainfall intensities for practical durations (5, 10, etc. minutes) were extracted into data charts. For these cases, the above-mentioned methods cannot be applied. A new correction formula was developed to fix the siphonage error in processed data sets. The importance of this formula is the correction of extracted data, restoring more realistic values of the peak intensities or rain depths detected by siphoned pluviographs. This development can provide a better reference for climate-change investigations on the one hand, while on the other its direct use in drainage engineering can be important in facility revision projects.
1RCAST, University of Tokyo, 2Japan Meteorological Agency
A major purpose of ClimCORE (Climate change actions with CO-creation powered by Regional weather information and E-technology; Project Leader: H. Nakamura), a 10-year project funded by the Japan Science and Technology Agency, is to produce RRJ-ClimCORE, a high-quality, high-resolution regional atmospheric reanalysis dataset for Japan. For this purpose, the Research Center for Advanced Science and Technology (RCAST) of the University of Tokyo has been collaborating with the Japan Meteorological Agency (JMA) for the last two years to implement the latest version of the operational meso-scale forecast system (MSM), based on a non-hydrostatic regional model (ASUCA) with 5-km horizontal resolution and 96 vertical levels, together with a meso-scale four-dimensional variational data assimilation system, on a supercomputer (Wisteria/BDEC-01) of the University. The lateral boundary condition is taken from the latest version of the JMA global reanalysis (JRA-3Q), whereas MGDSST (0.25º resolution) is used as the lower boundary condition. To produce RRJ-ClimCORE, most of the observational data that have been used in the JMA operational global/meso-scale analyses or JRA-3Q will be assimilated, including conventional and satellite observations as well as hourly 1-km resolution gridded precipitation analysis data over Japan based on radar and rain-gauge measurements. The latter is a unique input to RRJ-ClimCORE and will be reprocessed before being assimilated, for improved representation over offshore regions. Owing to the availability of satellite data, RRJ-ClimCORE will be produced for roughly the last twenty years, but it is expected to show high reproducibility of extreme events, including the organized mesoscale convective systems characteristic of the Japanese region. Based on preliminary output, the performance of RRJ-ClimCORE will be evaluated in comparison with other data.
1University of Cologne
The Atacama Desert exhibits notably lower summer rainfall than other deserts worldwide, such as the Namib and the Sahara. This feature, combined with its high altitude and low atmospheric humidity, makes the desert a hotspot for astronomical facilities, as well as for the study of the evolution of life in environments with a low water supply. However, a series of extreme summer and early-fall rainfall events have been observed in the last decade, impacting both the coastal regions and the hyperarid core of the Atacama. These episodes challenge our present knowledge of the mechanisms governing humidity and precipitation during this season. To overcome the scarcity of in situ measurements in the desert, we complemented climate observations for the period 1991-2019 with the ERA5 reanalysis, as well as regional-scale simulations (WRF). We found that humidity of Pacific origin triggers rainfall across the Atacama. The humidity is transported in the lower free troposphere along the west coast of South America, resembling a moist-conveyor-belt structure. Moisture is transported inland by the diurnal circulation, triggering cloud formation across the desert due to topographic instability and the steep slope of the Andes. Rainfall induced by these moist northerlies accounts for a major proportion of summer precipitation in the desert, as well as in the western Altiplano. Furthermore, these events have increased markedly in the last decade due to a synoptic configuration that leads to more frequent moist northerlies. Potential trends in humidity and rainfall, as well as their connections with climate change, are investigated.
1Bureau of Meteorology, 2Australian Bureau of Meteorology
The newly formed Australian Climate Service (ACS) is currently funding version 2 of the Bureau of Meteorology Atmospheric high-resolution Regional Reanalysis for Australia (BARRA2). Once complete, the reanalysis will cover the period from 1979 to the present day over the Australasian region. This new climate data set is being developed to support a national climate risk service for Australia, in alignment with the development and production of next-generation national climate projections based on CMIP6. BARRA2, together with the new climate projections, can provide a seamless view of past and future natural hazards at local scales.
The suite of new systems includes a deterministic reanalysis on a 12 km horizontal grid (BARRA-R2) over Australia, New Zealand and parts of southeast Asia, a downscaled 22-member lagged ensemble on a 24 km grid (BARRA-RE2), and a downscaled reanalysis on a 4 km grid over Australia (BARRA-C2). Building on BARRA version 1, BARRA2 benefits from nesting in the ERA5 global reanalysis and from using the ERA5 EDA to support ensemble generation. It uses recent modelling physics of the UK Met Office Unified Model (UM) and the Joint UK Land Environment Simulator (JULES), assimilates more observations in a 4D variational analysis, and improves soil moisture initialisation with an offline land surface simulation. The production of BARRA-R2 and BARRA-RE2 is underway (to be completed before the end of 2023), and the production of BARRA-C2 will commence in mid-2023.
Here we present results from assessing the first tranche of the BARRA-R2 reanalysis (2008-2018) and from BARRA-C2 trials. These include case studies and comparisons with BARRA1, ERA5 and gridded observations to illustrate where BARRA2 adds value and where future improvements are needed. The resulting reanalysis data sets will be outlined, and future plans for BARRA will also be shared.
1Environment and Climate Change Canada, 2MSc
Long-term surface objective analyses (OA) of chemical species are valuable tools for understanding the historical evolution of pollution. They are also useful for evaluating forecast models over the long term, building a climatology of surface pollutants, evaluating the efficiency of existing pollution controls and regulations, and supporting epidemiological studies.
In order to generate the first 10-km air quality reforecast system over North America, the air quality forecast model GEM-MACH from Environment and Climate Change Canada (ECCC) has been used. GEM-MACH (Global Environmental Multi-scale - Modelling Air quality and Chemistry) is an in-line chemical transport model (CTM) with a full description of atmospheric chemistry and meteorological processes.
In this project, the GEM-MACH model is driven with year-specific emissions from various sectors (point and area sources) and with meteorological fields from a regional reforecast model to calculate the ambient concentrations of selected pollutants. Those model outputs are then blended with air quality surface observations from Canadian regional networks and the U.S. EPA’s AIRNow observation network to produce the surface analyses. The required error statistics were computed using a first-order autoregressive model for each pollutant and surface station, for the four seasons and for every hour. The error correlations are terrain-dependent (land/water, mountain), which addresses known problems in coastal and mountainous areas. Results from hourly surface analyses of four pollutants (O3, PM2.5, NO2 and SO2) covering North America over the multiyear period 2002-2017 will be shown. Future work and conclusions will also be presented.
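The per-station error statistics described above can be sketched as a first-order autoregressive fit to an observation-minus-model innovation series. The function name and the synthetic series below are illustrative assumptions, not the operational GEM-MACH code:

```python
import numpy as np

def ar1_error_stats(innovations):
    """Fit a first-order autoregressive model x_t = phi * x_{t-1} + eps_t
    to a station's observation-minus-model innovation series; return the
    lag-1 coefficient phi and the residual standard deviation sigma."""
    x = np.asarray(innovations, dtype=float)
    x = x - x.mean()                              # remove the mean bias first
    phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])    # lag-1 least-squares fit
    resid = x[1:] - phi * x[:-1]
    return phi, resid.std(ddof=1)

# Synthetic check: recover known AR(1) parameters from a simulated series.
rng = np.random.default_rng(0)
series = np.zeros(5000)
for t in range(1, series.size):
    series[t] = 0.8 * series[t - 1] + rng.standard_normal()
phi, sigma = ar1_error_stats(series)
```

In practice such a fit would be repeated per pollutant, station, season and hour of day, as the abstract describes, to populate the error-covariance inputs of the blending step.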
1KNMI
A new temporally evolving quotient applied to the Ensemble of Data Assimilations (EDA) technique for estimating background error covariances has been developed for the Copernicus European Regional Re-Analysis (CERRA). The B-matrix is modelled in bi-Fourier space for a limited-area model. Background errors are assumed isotropic, homogeneous and non-separable. Linearized geostrophic and hydrostatic balances are incorporated as multivariate relationships coupling vorticity and geopotential, extended to the mass-wind and specific humidity fields via the f-plane approximation. The EDA comprises two main pools: seasonal and daily. The seasonal pool comprises winter and summer EDA forecast differences at the reanalysis resolution (5.5 km). The new time quotient function varies, over time, the mixture of differences from each season that makes up the seasonal component. The daily pool is an 11 km moving 2.5-day average updated in real time. The subsequent B-matrix computation ingests forecast differences from both pools, with a fixed 80%-20% seasonal-daily split, every 2 days. The sourcing of these forecast differences from the seasonal and daily pools is therefore in continuous temporal flux. We consider a case study to illustrate the potential of estimating weather regime change using CERRA-EDA with varying proportions of seasonal-daily mixing, while including the settings used for CERRA production. Our case study shows that the most influential factors are differences in the observation networks between the given years, their spatial distribution across the CERRA domain, and the proportion of the seasonal-daily split. It is shown that our method provides an improvement over a static B-matrix.
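The pool bookkeeping described above can be sketched as follows. The cosine ramp, function names and sample counts are hypothetical illustrations of the fixed 80%-20% seasonal-daily split with a time-varying winter/summer mixture; they are not the actual CERRA quotient function:

```python
import numpy as np

def seasonal_weights(day_of_year):
    """Fractions of winter vs. summer forecast differences in the seasonal
    pool, ramping smoothly through the year (illustrative cosine shape,
    peaking in mid-January)."""
    w_winter = 0.5 * (1.0 + np.cos(2.0 * np.pi * (day_of_year - 15) / 365.25))
    return w_winter, 1.0 - w_winter

def sample_counts(day_of_year, n_total=500, daily_frac=0.20):
    """Split n_total forecast differences into daily, winter and summer
    pool contributions, keeping the fixed 80%-20% seasonal-daily split."""
    n_daily = int(round(daily_frac * n_total))       # the 20% daily share
    n_seasonal = n_total - n_daily                   # the 80% seasonal share
    w_winter, _ = seasonal_weights(day_of_year)
    n_winter = int(round(w_winter * n_seasonal))
    return {"daily": n_daily, "winter": n_winter, "summer": n_seasonal - n_winter}

counts = sample_counts(15)   # mid-January: seasonal share drawn fully from winter
```

Recomputing these counts every 2 days, as in the production cycle described above, is what keeps the forecast-difference mixture in continuous temporal flux.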