Stochastic configuration networks (SCNs) are a recently proposed class of randomized neural networks (RNNs) that have demonstrated excellent capabilities in large-scale data analytics. As data volumes continue to grow, distributed algorithms become increasingly important. In this paper, the data set is stored across multiple agents, and the centralized SCN training problem is reformulated as a set of equivalent sub-problems. The input weights and biases are randomly generated under a supervisory inequality constraint, which guarantees the universal approximation property of the model. A distributed algorithm based on the Alternating Direction Method of Multipliers (ADMM) is then proposed: L1 regularization is applied to obtain sparse output weights, thereby improving the stochastic neural network model. The method proves effective and greatly reduces the amount of information that needs to be exchanged between computing agents.
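The distributed scheme described above can be illustrated with a minimal sketch of consensus ADMM for an L1-regularized least-squares problem, the structure of the output-weight computation once the hidden layer is fixed. This is an illustrative assumption, not the paper's exact algorithm: `consensus_admm_lasso`, the agent count, and all parameter values are hypothetical, and each `A_i` stands for the hidden-layer output matrix held locally by agent `i`.

```python
import numpy as np

def soft_threshold(a, kappa):
    # Proximal operator of the L1 norm (element-wise shrinkage).
    return np.sign(a) * np.maximum(np.abs(a) - kappa, 0.0)

def consensus_admm_lasso(A_parts, b_parts, lam=0.1, rho=10.0, iters=300):
    """Consensus ADMM for  min_w  sum_i 0.5*||A_i w - b_i||^2 + lam*||w||_1,
    where agent i holds only its local data (A_i, b_i).  Agents exchange
    only d-dimensional weight estimates, never their raw data."""
    N = len(A_parts)
    d = A_parts[0].shape[1]
    x = [np.zeros(d) for _ in range(N)]   # local primal variables
    u = [np.zeros(d) for _ in range(N)]   # scaled dual variables
    z = np.zeros(d)                       # global consensus variable
    # Pre-factor each agent's regularized normal-equation matrix once.
    invs = [np.linalg.inv(A.T @ A + rho * np.eye(d)) for A in A_parts]
    for _ in range(iters):
        # x-update: each agent solves its local ridge-like subproblem.
        for i in range(N):
            x[i] = invs[i] @ (A_parts[i].T @ b_parts[i] + rho * (z - u[i]))
        # z-update: average the local estimates, then soft-threshold.
        # This averaging is the only communication step per iteration.
        z = soft_threshold(np.mean([x[i] + u[i] for i in range(N)], axis=0),
                           lam / (N * rho))
        # u-update: accumulate each agent's consensus violation.
        for i in range(N):
            u[i] += x[i] - z
    return z

# Hypothetical usage: recover a sparse weight vector from data
# whose rows are split across 3 agents.
rng = np.random.default_rng(0)
w_true = np.zeros(20)
w_true[[2, 7, 15]] = [1.5, -2.0, 1.0]
A = rng.standard_normal((300, 20))
b = A @ w_true + 0.01 * rng.standard_normal(300)
A_parts, b_parts = np.split(A, 3), np.split(b, 3)
w = consensus_admm_lasso(A_parts, b_parts)
```

Because the z-update exchanges only the d-dimensional consensus vector rather than the raw data matrices, the per-iteration communication cost is independent of the sample size, which is the source of the communication savings claimed above.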