Most multiobjective optimization algorithms consider multiple objectives as a whole when solving multiobjective optimization problems (MOPs). However, in MOPs, different objective functions may possess different properties. Hence, it can be beneficial to build an objective-wise optimization strategy for each objective separately. In this paper, we first propose a single objective guided multiobjective optimization (SOGMO) framework to solve continuous MOPs. In the SOGMO framework, a solution is selected from an archive, and an objective-wise learning strategy is then applied to promote the evolution of each objective of the selected solution. Thus, all objectives of the considered solution can be optimized simultaneously in parallel through the cooperation of the objective-wise learning processes. A specific instantiation of the SOGMO framework is implemented, in which a neighborhood field optimization (NFO) algorithm serves as the objective-wise learning strategy and an ϵ-dominance archive is designed. The proposed SOGMO implementation, called SOGMO-NFO, is systematically compared with several state-of-the-art multiobjective evolutionary algorithms (MOEAs). Simulation results on 13 benchmark problems from the CEC 2009 competition show that SOGMO-NFO outperforms the compared MOEAs.
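To make the framework's control flow concrete, the following is a minimal Python sketch of one SOGMO iteration, assuming the standard additive ϵ-dominance relation for a minimization problem. The names sogmo_step, objective_wise_learn, and update_archive are hypothetical stand-ins: the abstract does not specify the archive selection rule or the NFO update, so a generic per-objective learner and uniform random selection are used here.

    import random

    def sogmo_step(archive, objectives, objective_wise_learn, epsilon):
        """One hypothetical SOGMO iteration: select, learn per objective, archive."""
        # Select a solution from the archive (uniform random here; the
        # paper's actual selection rule is not given in the abstract).
        x = random.choice(archive)

        # Objective-wise learning: produce one candidate per objective,
        # each aimed at improving that single objective. In SOGMO-NFO this
        # role is played by NFO; a generic learner stands in for it here.
        candidates = [objective_wise_learn(x, f) for f in objectives]

        # Update the epsilon-dominance archive with every candidate.
        for c in candidates:
            update_archive(archive, c, objectives, epsilon)

    def eps_dominates(a, b, objectives, epsilon):
        """Additive epsilon-dominance for minimization: a eps-dominates b."""
        fa = [f(a) for f in objectives]
        fb = [f(b) for f in objectives]
        return (all(va - epsilon <= vb for va, vb in zip(fa, fb))
                and any(va - epsilon < vb for va, vb in zip(fa, fb)))

    def update_archive(archive, candidate, objectives, epsilon):
        """Insert candidate unless eps-dominated; drop members it eps-dominates."""
        if any(eps_dominates(a, candidate, objectives, epsilon) for a in archive):
            return
        archive[:] = [a for a in archive
                      if not eps_dominates(candidate, a, objectives, epsilon)]
        archive.append(candidate)

Because each candidate targets a single objective, the per-objective learners can run independently and in parallel, with the ϵ-dominance archive serving as the shared point of cooperation; the grid-like thinning induced by ϵ also bounds the archive size.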