This talk gives an overview of the research done in working group IV (WG4). It briefly introduces some of the problems the group studies: deterministic Bayesian computation, big data reduction, estimation of normalizing constants in Bayesian sampling, and stochastic programming. It then presents two ongoing WG4 projects.

In the first part of the talk, we propose a new active data reduction method for large-scale Gaussian process (GP) modeling. GP modeling with big data is notoriously computationally demanding, requiring O(N^3) work and O(N^2) memory, where N >> 1 is the number of data points. The goal here is to first reduce this big dataset to a smaller one of n << N points, then use the reduced data for efficient GP modeling. Our reduction method is guided by a bound on GP prediction error, which yields an exploration-exploitation trade-off for sequentially selecting the reduced data. We demonstrate the effectiveness of this approach in simulations and in a climate model application.

In the second part, we present a new adaptive modeling framework for estimating the normalizing constant of an (unnormalized) posterior density that is expensive to evaluate. The method first uses posterior samples to fit an approximating surface, then employs this fit to obtain a closed-form estimate of the normalizing constant. A key novelty is the use of semidefinite (convex) programming, which allows for efficient and adaptive estimation from a small number of posterior samples. We explore the effectiveness of our approach in simulations and in a real-world application.
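To make the first part's scaling concern concrete, the following is a minimal sketch of subset-selection GP reduction in plain NumPy. It greedily picks the candidate with the largest GP posterior predictive variance, a pure exploration criterion; the talk's bound-based criterion, which also exploits observed responses, is not reproduced here. The RBF kernel, its parameters, and the greedy seeding are illustrative assumptions.

```python
# Minimal sketch (not the talk's method): greedy data reduction for GP
# regression via maximum posterior predictive variance.
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def greedy_reduce(X, n, lengthscale=1.0, noise=1e-6):
    """Select n of the N rows of X, one at a time, by posterior variance."""
    chosen = [0]  # seed with an arbitrary first point (an assumption)
    for _ in range(n - 1):
        S = X[chosen]
        K = rbf_kernel(S, S, lengthscale) + noise * np.eye(len(chosen))
        Kinv = np.linalg.inv(K)
        k_star = rbf_kernel(X, S, lengthscale)  # shape (N, |chosen|)
        # Posterior variance of each candidate given the chosen subset
        # (prior variance is 1 for the RBF kernel above).
        var = 1.0 - np.einsum("ij,jk,ik->i", k_star, Kinv, k_star)
        var[chosen] = -np.inf  # never re-pick a selected point
        chosen.append(int(np.argmax(var)))
    return np.array(chosen)

# Usage: reduce N = 5000 points to n = 50; an exact GP fit on the
# subset then costs O(n^3) instead of O(N^3).
X = np.random.default_rng(0).random((5000, 2))
idx = greedy_reduce(X, n=50)
print(idx.shape)  # (50,)
```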
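The closed-form idea in the second part can be illustrated with a much simpler stand-in: fit a concave quadratic to the unnormalized log-density at the posterior samples by least squares, then integrate the resulting Gaussian exactly. This is not the talk's semidefinite-programming fit; it only shows how a fitted surface yields a closed-form normalizing-constant estimate. The function names and the one-dimensional test target are assumptions.

```python
# Minimal 1-D sketch (least squares stands in for the talk's SDP fit):
# estimate Z = integral of exp(log_p(x)) dx by fitting
# log p(x) ~ a*x^2 + b*x + c and integrating the Gaussian in closed form.
import numpy as np

def quadratic_fit_logZ(samples, log_p_vals):
    """Fit a quadratic to log-density values and return the exact integral."""
    A = np.column_stack([samples**2, samples, np.ones_like(samples)])
    (a, b, c), *_ = np.linalg.lstsq(A, log_p_vals, rcond=None)
    assert a < 0, "fitted surface must be concave for a finite integral"
    # integral of exp(a*x^2 + b*x + c) dx = exp(c - b^2/(4a)) * sqrt(pi/-a)
    return np.exp(c - b**2 / (4 * a)) * np.sqrt(np.pi / -a)

# Usage on an unnormalized Gaussian exp(-x^2/2), whose true Z = sqrt(2*pi):
rng = np.random.default_rng(0)
samples = rng.normal(size=200)             # stand-in posterior samples
log_p = -0.5 * samples**2                  # expensive log-density, evaluated once
print(quadratic_fit_logZ(samples, log_p))  # ~= 2.5066 = sqrt(2*pi)
```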