Manually designing neural network models is a demanding test of a researcher's expertise, so using Neural Architecture Search (NAS) to construct neural networks automatically is becoming increasingly popular. This paper builds a classification network model with an improved Differentiable Architecture Search (DARTS), a gradient-based NAS algorithm. Standard DARTS selects operations locally, edge by edge, so this paper proposes a selection algorithm based on global selection. The improved algorithm ensures that all operations connected to the same intermediate node are compared fairly, which keeps the selection unbiased and allows a wider variety of neural network architectures to be searched. In addition, DARTS builds the network from a fixed set of candidate operations, so the resulting structures are relatively uniform, skip connections tend to dominate the searched network, and the many approximation strategies DARTS relies on can easily reduce model accuracy. To address these problems, we add a SENet attention mechanism after the cell output. SENet recalibrates the feature maps along the channel dimension, which not only improves the search performance of the network but also effectively increases the diversity and robustness of the network. The final test error on CIFAR-10 reaches 2.48%.
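To illustrate the difference between local and global selection, the sketch below contrasts the standard DARTS retention rule (per-edge softmax, then keep the strongest edges) with one plausible reading of a global rule, where all (edge, operation) pairs feeding the same intermediate node compete under a single softmax. The function names and the exact global criterion are illustrative assumptions, not the paper's implementation; NumPy stands in for the architecture-weight tensors.

```python
import numpy as np

def softmax(a, axis=None):
    e = np.exp(a - a.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_select(alphas, k=2):
    """Standard DARTS-style retention: softmax per edge, take each edge's
    strongest operation, then keep the k edges with the highest such weights."""
    w = softmax(alphas, axis=1)            # (edges, ops), one softmax per edge
    best_op = w.argmax(axis=1)             # strongest op on each edge
    best_w = w.max(axis=1)
    kept_edges = np.argsort(-best_w)[:k]   # k strongest incoming edges
    return [(int(e), int(best_op[e])) for e in kept_edges]

def global_select(alphas, k=2):
    """Hypothetical global variant: one softmax over every (edge, op) pair
    feeding the node, so all candidates are compared on a common scale."""
    w = softmax(alphas.ravel())            # joint distribution over all pairs
    top = np.argsort(-w)[:k]               # global top-k pairs
    n_ops = alphas.shape[1]
    return [(int(i // n_ops), int(i % n_ops)) for i in top]

# toy node with 3 incoming edges and 3 candidate operations
alphas = np.array([[2.0, 0.1, 0.5],
                   [1.5, 2.5, 0.2],
                   [0.3, 0.4, 3.0]])
print(local_select(alphas))   # [(2, 2), (0, 0)]
print(global_select(alphas))  # [(2, 2), (1, 1)]
```

On this toy input the two rules disagree: local selection keeps edge 0 because op 0 dominates within that edge's own softmax, while global selection prefers the (edge 1, op 1) pair, which carries more weight on the shared scale.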
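The channel-wise recalibration that SENet performs after the cell output can be sketched as a squeeze-and-excitation block: global average pooling compresses each channel to a scalar, a two-layer bottleneck with ReLU and sigmoid produces a per-channel gate in (0, 1), and the gate rescales the original feature map. The weight shapes and reduction ratio below are illustrative; this is a minimal NumPy sketch, not the paper's training code.

```python
import numpy as np

def se_block(x, w1, b1, w2, b2):
    """Squeeze-and-Excitation over a feature map x of shape (C, H, W)."""
    s = x.mean(axis=(1, 2))                    # squeeze: (C,) channel summary
    h = np.maximum(0.0, w1 @ s + b1)           # excitation FC -> ReLU, (C/r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))   # FC -> sigmoid gates, (C,)
    return x * g[:, None, None]                # rescale each channel

# toy usage: 4 channels, reduction ratio r = 2
rng = np.random.default_rng(0)
C, r = 4, 2
x = rng.standard_normal((C, 8, 8))
w1, b1 = rng.standard_normal((C // r, C)), np.zeros(C // r)
w2, b2 = rng.standard_normal((C, C // r)), np.zeros(C)
y = se_block(x, w1, b1, w2, b2)
print(y.shape)  # (4, 8, 8)
```

Because the sigmoid gate lies strictly between 0 and 1, the block can only attenuate channels, never amplify them; the spatial shape of the feature map is unchanged.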