In many engineering design and optimization problems, the presence of uncertainty in the data is a critical issue. There are different ways to describe this uncertainty and to devise designs that are partly insensitive, or robust, to it. This book examines uncertain systems in control engineering and general decision or optimization problems with uncertain data. Written by leading researchers in optimization and robust control, it highlights the interactions between these two fields. Part I describes theory and solution methods for probability-constrained and stochastic optimization problems; Part II focuses on numerical methods for solving randomly perturbed convex programs and semi-infinite optimization problems by probabilistic techniques; Part III details the theory and applications of randomized techniques to the analysis and design of robust control systems. It will interest researchers, academics, and postgraduates in control engineering and operations research, as well as practitioners in these fields.