Localized spot patterns, in which one or more solution components concentrate at certain points in the domain, are a common class of localized patterns for reaction-diffusion systems, and they arise in a wide range of modeling scenarios. Although there is a rather well-developed theoretical understanding of this class of localized pattern in one and two space dimensions, the theoretical study of such patterns in a three-dimensional setting is largely a new frontier. In an arbitrary bounded three-dimensional domain, the existence, linear stability, and slow dynamics of localized multispot patterns are analyzed for the well-known singularly perturbed Gierer-Meinhardt activator-inhibitor system in the limit of a small activator diffusivity $\varepsilon^{2} \ll 1$. Our main focus is to classify the different types of multispot patterns and to predict their linear stability properties for different asymptotic ranges of the inhibitor diffusivity $D$. For the range $D = \mathcal{O}(\varepsilon^{-1}) \gg 1$, although both symmetric and asymmetric quasi-equilibrium spot patterns can be constructed, the asymmetric patterns are shown to be always unstable. On this range of $D$, it is shown that symmetric spot patterns can undergo either competition instabilities or a Hopf bifurcation, leading to spot annihilation or temporal spot amplitude oscillations, respectively. For $D = \mathcal{O}(1)$, only symmetric spot quasi-equilibria exist, and they are linearly stable on $\mathcal{O}(1)$ time intervals. On this range, it is shown that the spot locations evolve slowly on an $\mathcal{O}(\varepsilon^{-3})$ time scale toward their equilibrium locations according to an ODE gradient flow, which is determined by a discrete energy involving the reduced-wave Green's function. Emphasis is placed on the central role of the far-field behavior of a certain core problem, which characterizes the profile of a localized spot, both in the construction of quasi-equilibria in the $D = \mathcal{O}(1)$ and $D = \mathcal{O}(\varepsilon^{-1})$ regimes and in establishing some of their linear stability properties. Finally, for the range $D = \mathcal{O}(\varepsilon^{2})$, it is shown that spot quasi-equilibria can undergo a peanut-splitting instability, which leads to a cascade of spot self-replication events. Predictions of the linear stability theory are all illustrated with full PDE numerical simulations of the Gierer-Meinhardt model.
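For reference, a commonly used dimensionless form of the singularly perturbed Gierer-Meinhardt activator-inhibitor system on a bounded domain $\Omega \subset \mathbb{R}^{3}$ is the following sketch; the precise scaling of the nonlinear feed term in the inhibitor equation is an assumption here and depends on the convention adopted in the paper:
\[
  v_t = \varepsilon^{2}\Delta v - v + \frac{v^{2}}{u}, \qquad
  \tau u_t = D\,\Delta u - u + \varepsilon^{-2} v^{2}, \qquad x \in \Omega ,
\]
with homogeneous Neumann conditions $\partial_n v = \partial_n u = 0$ on $\partial\Omega$, where $v$ is the activator, $u$ is the inhibitor, and $\tau \geq 0$ is a reaction-time parameter. In the $D = \mathcal{O}(1)$ regime, the reduced-wave Green's function mentioned in the abstract is the solution $G(x;\xi)$ of
\[
  \Delta G - \frac{1}{D}\,G = -\delta(x-\xi) \ \text{ in } \Omega , \qquad
  \partial_n G = 0 \ \text{ on } \partial\Omega ,
\]
and the slow spot dynamics take the schematic gradient-flow form $\dot{x}_j \propto -\nabla_{x_j}\mathcal{H}(x_1,\dots,x_N)$ on the $\mathcal{O}(\varepsilon^{-3})$ time scale, where the discrete energy $\mathcal{H}$ couples the spot locations through pairwise interactions involving $G$ and its regular part; the precise weighting by the spot source strengths is determined in the analysis.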