Physics-informed neural networks (PINNs) have been widely applied in different fields due to their effectiveness in solving partial differential equations (PDEs). However, the accuracy and efficiency of PINNs need to be considerably improved for scientific and commercial purposes. To address this issue, we systematically propose a novel dimension-augmented physics-informed neural network (DaPINN), which simultaneously and significantly improves the accuracy and efficiency of the base PINN. In the DaPINN model, we manipulate the dimensionality of the network input by inserting additional sample features and then incorporate the expanded dimensionality into the loss function. Moreover, we verify the effectiveness of power series augmentation, Fourier series augmentation and replica augmentation in both forward and inverse problems. In most experiments, the error of DaPINN is 1∼2 orders of magnitude lower than that of the base PINN. The results show that DaPINN outperforms the original PINN in terms of both accuracy and efficiency, with a reduced dependence on the number of sample points. We also discuss the computational complexity of DaPINN, its implications for network size, other implementations of DaPINN, and the compatibility of DaPINN's methods with residual-based adaptive refinement (RAR), self-adaptive physics-informed neural networks (SA-PINNs) and gradient-enhanced physics-informed neural networks (gPINNs).

© 2023 Elsevier Inc. All rights reserved.
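To make the input-augmentation idea concrete, the sketch below shows how a sample coordinate vector might be expanded with power-series and Fourier-series features before being fed to the network. The helper names and the exact feature sets are illustrative assumptions, not the paper's implementation, which may choose different orders, frequencies, or combinations of features.

```python
import numpy as np


def power_augment(x, order=3):
    """Append power-series features x^2, ..., x^order to the input.

    x: array of shape (n_points, n_dims).
    Hypothetical helper; DaPINN's actual augmentation may differ.
    """
    feats = [x] + [x**k for k in range(2, order + 1)]
    return np.concatenate(feats, axis=1)


def fourier_augment(x, n_freqs=2):
    """Append Fourier features sin(k*x), cos(k*x) for k = 1..n_freqs."""
    feats = [x]
    for k in range(1, n_freqs + 1):
        feats.append(np.sin(k * x))
        feats.append(np.cos(k * x))
    return np.concatenate(feats, axis=1)


# 5 sample points in a 1D domain: the augmented array would serve as the
# higher-dimensional network input described in the abstract.
x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
print(power_augment(x, order=3).shape)    # (5, 3): columns [x, x^2, x^3]
print(fourier_augment(x, n_freqs=2).shape)  # (5, 5)
```

The augmented columns are functions of the original coordinates, so derivatives with respect to them can be chained into the PDE residual when the expanded dimensionality is incorporated into the loss function.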