Hyperparameter optimization (HPO) and neural architecture search (NAS) of machine learning (ML) models are among the core implementation steps of AI-enabled systems. Because they involve multi-objective and multi-level optimization of complex ML models, HPO and NAS are widely regarded as NP-hard problems: the size of the search space grows exponentially with the number of hyperparameters, candidate architecture elements, and configurations. In 2017, the first quantum computing (QC)-enabled approach to HPO and NAS was proposed. Simultaneously, advances in quantum neural networks (QNNs) yielded more powerful ML models through their deployment on QC infrastructure. Consequently, the quantum architecture search (QAS) problem arose as an analogous problem, aiming to find optimal configurations of quantum circuits. Although classical approaches to these problems have been thoroughly studied in the literature, a systematic overview of quantum-based methods is still missing. Our work addresses this gap and provides the first Systematization of Knowledge (SoK) that differentiates, and bridges the gap, between using QC to optimize ML and using it for learning itself. Specifically, we provide a qualitative and empirical analysis of related works, and we classify the properties of QC-based HPO, NAS, and QAS optimization systems. Additionally, we present a taxonomy of the studied works and identify four main types of quantum methods used to address these problems. Finally, we set the agenda for this new field by identifying promising directions and open issues for future research.