The paper deals with a game-theoretic approach to dynamic control problems. The concept of a differential game is defined and developed. The basic setting for control problems under conflict or uncertainty is discussed, along with formalized procedures for feedback control. The relationship between this abstract construction and practically implementable approximation procedures is analyzed.
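As a rough illustration of how an abstract feedback construction can be turned into an implementable approximation, the sketch below simulates a scalar differential game in which the feedback control is re-evaluated only on a discrete time grid. The dynamics, bounds, and control rule here are illustrative assumptions, not the paper's specific construction.

```python
# Illustrative sketch (assumed example, not taken from the paper):
# a scalar game x' = u + v, where the minimizer chooses |u| <= 1 to
# drive |x(T)| to zero and the maximizer chooses |v| <= rho < 1.
# The minimizer applies the sampled feedback u = -sign(x), held
# constant between grid points -- a discrete-time approximation of a
# continuous feedback strategy.

def simulate(x0=2.0, T=4.0, n=400, rho=0.5):
    dt = T / n
    x = x0
    for _ in range(n):
        u = -1.0 if x > 0 else 1.0   # minimizer's sampled feedback
        v = rho if x > 0 else -rho   # worst-case maximizing response
        x += (u + v) * dt            # Euler step with piecewise-constant controls
    return x

if __name__ == "__main__":
    print(abs(simulate()))
```

Because the minimizer's control bound exceeds the disturbance bound, the sampled feedback still drives the state near zero; refining the grid (larger `n`) shrinks the residual chattering, which is the sense in which the discrete procedure approximates the idealized feedback.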