In the field of image recognition, the all-MLP architecture MLP-Mixer has shown superior performance. However, the current MLP-Mixer is built solely from fully connected layers, whose nonlinear capability is relatively weak, and its simple stacked structure has limitations under complex conditions. Therefore, inspired by the diversity of neurons in the human brain, we propose DMixNet, an innovative dendritic multi-layer perceptron architecture. Rooted in the theory of dendritic neurons from neuroscience, we propose a dendritic neural unit (DNU) that endows DMixNet with stronger biological interpretability and more robust nonlinear processing capability. The flexibility of dendritic structures allows the DNU to adjust its architecture to realize different functionalities. Based on the DNU, we propose a novel channel fusion network DNU_E and a dendritic classifier DNU_C. The DNU_E substitutes for the traditional two fully connected layers as the channel mixer, forming a dendritic mixer layer that enhances the fusion of channel information within the entire framework. Meanwhile, the DNU_C replaces the traditional linear classifier, effectively improving the model's classification performance. Experimental results demonstrate that DMixNet achieves improvements of 2.13%, 4.79%, 4.71%, and 23.14% on the CIFAR-10, CIFAR-100, Tiny-ImageNet, and COIL-100 benchmark image recognition datasets, respectively, as well as a 14.78% enhancement on the medical image classification dataset PathMNIST, outperforming other state-of-the-art architectures. Code is available at https://github.com/KarilynXu/DMixNet.
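For context, the baseline that DNU_E replaces is the standard MLP-Mixer channel-mixing block: two fully connected layers with a GELU nonlinearity, applied independently to each token's channel vector. Below is a minimal NumPy sketch of that baseline (the shapes and variable names are illustrative, not taken from the paper; the DNU itself is defined in the full text, not in this abstract):

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU, the activation used in MLP-Mixer
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def channel_mixer(tokens, w1, b1, w2, b2):
    """Baseline MLP-Mixer channel mixing: two fully connected layers
    applied per token along the channel dimension.
    tokens: (num_tokens, channels)"""
    hidden = gelu(tokens @ w1 + b1)  # expand channels to a hidden width
    return hidden @ w2 + b2          # project back to the channel width

# Toy shapes (hypothetical): 4 tokens, 8 channels, hidden width 16.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w1 = rng.standard_normal((8, 16)); b1 = np.zeros(16)
w2 = rng.standard_normal((16, 8)); b2 = np.zeros(8)
y = channel_mixer(x, w1, b1, w2, b2)  # output keeps the (4, 8) token layout
```

DMixNet's dendritic mixer layer drops into this position: per the abstract, the DNU_E takes over the role of `channel_mixer` while the token layout of the mixer layer is preserved.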