Following up on previous work of Cathabard et al. (in: Proceedings of Foundations of Genetic Algorithms (FOGA'11), ACM, 2011), we analyze variants of the (1 + 1) evolutionary algorithm (EA) for problems with unknown solution length. For their setting, in which the solution length is sampled from a geometric distribution, we provide mutation rates that yield, for both benchmark functions OneMax and LeadingOnes, an expected optimization time of the same order as that of the (1 + 1) EA knowing the solution length. Moreover, we show that almost the same run times can be achieved even if no a priori information on the solution length is available. We also consider the situation in which neither the number nor the positions of the bits with an influence on the fitness function are known. Solving an open problem from Cathabard et al., we show that, for arbitrary $s \in \mathbb{N}$, such OneMax and LeadingOnes instances can be solved, simultaneously for all $n \in \mathbb{N}$, in expected time $O(n (\log (n))^2 \log \log (n) \cdots \log ^{(s-1)}(n) (\log ^{(s)}(n))^{1+\varepsilon })$ and $O(n^2 \log (n) \log \log (n) \cdots \log ^{(s-1)}(n) (\log ^{(s)}(n))^{1+\varepsilon })$, respectively; that is, in almost the same time as if $n$ and the relevant bit positions were known. For the LeadingOnes case, we prove lower bounds of the same asymptotic order of magnitude apart from the $(\log ^{(s)}(n))^{\varepsilon }$ factor. Aiming at closing this arbitrarily small remaining gap, we realize that there is no asymptotically best performance for this problem: for any algorithm solving, for all $n$, all instances of size $n$ in expected time at most $T(n)$, there is an algorithm doing the same in time $T'(n)$ with $T' = o(T)$. For OneMax we show results of a similar flavor.
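To make the setting concrete, the following is a minimal sketch of a (1 + 1) EA for OneMax with unknown solution length, using a summable position-dependent mutation rate of the form $p_i \approx 1/(2 i (\log_2(i+1))^{1+\varepsilon})$. The exact constants, the finite bit-array cutoff `max_bits`, and the function names are illustrative assumptions, not the precise construction analyzed in the paper.

```python
import math
import random


def onemax(x, n):
    # Only the first n bits influence the fitness; the algorithm itself
    # never uses n and thus does not know the solution length.
    return sum(x[:n])


def one_plus_one_ea_unknown_length(n, eps=0.1, max_bits=None, budget=10**6):
    """(1 + 1) EA sketch with position-dependent mutation rates.

    Bit i (1-indexed) is flipped with probability
    p_i = min(1/2, 1 / (2 * i * log2(i + 1)**(1 + eps))),
    a summable rate of the flavor discussed in the abstract; the constants
    are illustrative assumptions.
    """
    if max_bits is None:
        max_bits = 4 * n  # finite representation for the sketch only
    x = [random.randint(0, 1) for _ in range(max_bits)]
    fx = onemax(x, n)
    for t in range(budget):
        if fx == n:
            return t  # iterations until the optimum was found
        y = x[:]
        for i in range(max_bits):
            p = min(0.5, 1.0 / (2 * (i + 1) * math.log2(i + 2) ** (1 + eps)))
            if random.random() < p:
                y[i] = 1 - y[i]
        fy = onemax(y, n)
        if fy >= fx:  # standard (1 + 1) EA acceptance: keep if not worse
            x, fx = y, fy
    return None  # budget exhausted


if __name__ == "__main__":
    print(one_plus_one_ea_unknown_length(n=50))
```

The key design point reflected here is that the mutation probabilities decay quickly enough in the bit position that the total expected number of flipped bits stays bounded, while the rate at the (unknown) relevant positions remains large enough to make progress.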