In recent years, single-image super-resolution (SISR) based on deep convolutional neural networks has achieved excellent performance, but this performance relies on a huge number of parameters and FLOPs, making such networks difficult to deploy on computationally constrained devices. To alleviate this problem, many lightweight SISR networks have been proposed, yet their parameter counts and FLOPs still need to be reduced further. To this end, in this paper we propose an ultra-lightweight SISR network (ULN) with only 276K parameters and 61.1G FLOPs, 283K parameters and 27.8G FLOPs, and 292K parameters and 16.1G FLOPs for scale factors of 2, 3, and 4, respectively. Specifically, ULN has an extremely simple network structure: we reduce redundant parameters and FLOPs by replacing standard convolutions with blueprint separable convolutions (BSConv), and we design and introduce a more efficient attention module to enhance model performance. Extensive experimental results show that our proposed ULN achieves SOTA performance while keeping the number of parameters and FLOPs to a minimum. The code and model are available at https://github.com/kptx666/ULN.
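The central efficiency lever mentioned above is swapping standard convolutions for BSConv. As a rough illustration only, not the released ULN code (which is available at the repository linked above), the following PyTorch sketch shows the unconstrained BSConv variant, where a 1x1 pointwise convolution is followed by a depthwise convolution; the class name BSConvU and the chosen channel sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class BSConvU(nn.Module):
    """Minimal sketch of unconstrained blueprint separable convolution:
    a 1x1 pointwise convolution followed by a depthwise convolution."""
    def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, padding=1):
        super().__init__()
        # Pointwise convolution mixes channels first ...
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
        # ... then a depthwise convolution applies one spatial filter per channel.
        self.depthwise = nn.Conv2d(out_channels, out_channels, kernel_size,
                                   stride=stride, padding=padding,
                                   groups=out_channels, bias=True)

    def forward(self, x):
        return self.depthwise(self.pointwise(x))

if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    bsconv = BSConvU(64, 64)
    print(bsconv(x).shape)  # torch.Size([1, 64, 32, 32])
    # Parameter comparison against a standard 3x3 convolution (illustrative):
    std = nn.Conv2d(64, 64, kernel_size=3, padding=1)
    print(sum(p.numel() for p in bsconv.parameters()),  # 4,736
          sum(p.numel() for p in std.parameters()))     # 36,928
```

Under these assumed channel sizes, the BSConv layer uses roughly an order of magnitude fewer parameters than a standard 3x3 convolution, which is the kind of saving the abstract attributes to this substitution.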