Thursday January 24, 2013
10:00 AM - 11:30 AM
Reza Yaesoubi, Harvard School of Public Health
Recursive Constrained Linear Regression with Applications in Stochastic and Approximate Dynamic Programming
Linear regression is commonly used to model the behavior of a dependent variable (output) as a function of a set of independent variables (inputs or regressors). In this talk, I will consider the problem of constrained linear regression when data accumulate over time and the error terms are not necessarily exogenous. In many large-scale stochastic or dynamic programming problems, finding the exact optimal solution is computationally prohibitive. Thus, we attempt to approximate optimal policies by identifying good approximations for recourse functions (in stochastic programming) or value functions (in dynamic programming). In certain applications, it is essential to account for the structural properties of the problem (e.g., theoretical bounds on the regression parameters) in these approximations in order to achieve faster convergence and higher-quality solutions. I will demonstrate that including such regularity constraints enhances the stability of the approximations by replacing the assumption of no collinearity in the incoming data with a milder condition. Other applications of this problem include a wide range of stochastic and dynamic programs whose recourse or value functions can be adequately approximated by separable concave or convex functions. I will discuss how constrained least squares permits the use of separable linear or smooth splines to approximate these functions while satisfying the desired convexity or concavity conditions. I will present an algorithm that combines a primal active-set method with Schur-complement updates to solve constrained least squares problems in real time as data accumulate. Finally, I will establish a set of conditions under which the estimates produced by the proposed algorithm are consistent, i.e., they converge in probability to the true parameter values.
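To make the idea of structurally constrained regression concrete, here is a minimal sketch (not the speaker's algorithm, and solved in batch rather than recursively): fitting a quadratic model to data while enforcing convexity of the fit by constraining the leading coefficient to be nonnegative. It uses `scipy.optimize.lsq_linear` as a stand-in constrained least-squares solver; the data and model are illustrative assumptions.

```python
# Illustrative sketch: constrained least squares that enforces a structural
# property (convexity) on the fitted model. Not the talk's active-set /
# Schur-complement algorithm -- a simple batch solve via SciPy instead.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)
y = x**2 + 0.05 * rng.standard_normal(x.size)  # noisy samples of a convex function

# Design matrix for the model y ~ a + b*x + c*x^2
A = np.column_stack([np.ones_like(x), x, x**2])

# Convexity of the fitted quadratic requires c >= 0; a and b are unconstrained.
res = lsq_linear(A, y, bounds=([-np.inf, -np.inf, 0.0],
                               [np.inf, np.inf, np.inf]))
coef = res.x  # [a, b, c], with c >= 0 guaranteed by the bound constraint
```

In the setting of the talk, data arrive sequentially, so rather than re-solving from scratch one would update the solution incrementally as new observations come in, which is what the proposed active-set and Schur-complement scheme is designed to do.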