Although state-of-the-art climate prediction models are based on the laws of physics, computational limits preclude exact solutions. To obtain practical solutions, physical processes operating on small scales are parameterized in terms of large-scale variables. The resulting parameterizations involve free parameters whose values are unknown and must be chosen empirically, a process called tuning. Model tuning raises significant challenges that require intensive collaboration between physical scientists, data scientists, mathematicians, and statisticians. In this talk, I discuss two new advances in the statistical aspects of model tuning. The first is a proposal for a benchmark algorithm based on the Kalman Filter. The lack of a standard algorithm is a serious barrier to progress. A standard reference would enable discovery of the most promising algorithms and allow lessons learned at one modeling center to be shared efficiently with other centers. The Kalman Filter is an attractive benchmark because it is the optimal Bayesian solution for normal distributions and has been studied extensively. However, existing implementations of the Kalman Filter require selecting hyperparameters, which is commonly done offline, increasing the computational burden. We propose a new Kalman Filter algorithm that estimates both model parameters and hyperparameters simultaneously. The second advance is an experimental design for choosing model parameter values that optimizes the information content of dynamical model runs. These new tuning algorithms will be illustrated with idealized models and Earth System Models.
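To make the optimality claim concrete, the sketch below shows the Gaussian Bayesian update that underlies the Kalman Filter: when the prior and the observation error are both normal, the Kalman update yields the exact posterior. This is a minimal scalar illustration with made-up numbers, not the proposed tuning algorithm; the function name and values are purely illustrative.

```python
def kalman_update(prior_mean, prior_var, obs, obs_var):
    """Combine a Gaussian prior over a parameter with a Gaussian
    observation. For normal distributions this is the exact
    Bayesian posterior (illustrative sketch only)."""
    gain = prior_var / (prior_var + obs_var)          # Kalman gain
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Example: prior N(0, 4), observation 2.0 with error variance 1
m, v = kalman_update(0.0, 4.0, 2.0, 1.0)
# gain = 4/5, so the posterior is N(1.6, 0.8)
```

In practice the prior variance and observation-error variance play the role of hyperparameters; the algorithm proposed in the talk estimates such quantities jointly with the model parameters rather than fixing them offline.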