The main purpose of this book is to give a rigorous introduction to the most important and useful solution methods for various types of stochastic control problems for jump diffusions, together with their applications. Both the dynamic programming method and the maximum principle method are discussed, as well as the relation between them. Corresponding verification theorems involving the Hamilton-Jacobi-Bellman equation and/or (quasi-)variational inequalities are formulated. The text emphasises applications, mostly to finance. All the main results are illustrated by examples, and exercises with complete solutions appear at the end of each chapter. This will help the reader understand the theory and see how to apply it. The book assumes some basic knowledge of stochastic analysis, measure theory and partial differential equations.

In the 2nd edition there is a new chapter on optimal control of stochastic partial differential equations driven by Lévy processes. There is also a new section on optimal stopping with delayed information. Moreover, corrections and other improvements have been made.
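For orientation, a minimal sketch of the kind of Hamilton-Jacobi-Bellman equation that arises here (generic notation, not quoted from the book): for a one-dimensional controlled jump diffusion with drift $b$, diffusion coefficient $\sigma$, jump size $\gamma$, Lévy measure $\nu$, and running profit $f$, a candidate value function $\varphi$ is required to satisfy

$$
\sup_{u \in U}\Big\{ f(x,u) + \mathcal{L}^{u}\varphi(x) \Big\} = 0,
$$

where $\mathcal{L}^{u}$ is the integro-differential generator

$$
\mathcal{L}^{u}\varphi(x) = b(x,u)\,\varphi'(x) + \tfrac{1}{2}\sigma^{2}(x,u)\,\varphi''(x)
+ \int_{\mathbb{R}} \big\{ \varphi\big(x+\gamma(x,u,z)\big) - \varphi(x) - \gamma(x,u,z)\,\varphi'(x) \big\}\,\nu(\mathrm{d}z).
$$

The integral term, absent in the purely continuous diffusion case, accounts for the jumps of the underlying Lévy process; verification theorems of the type formulated in the book give conditions under which a solution of such an equation is indeed the value function of the control problem.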