An existing phase-field model (Bhave et al. (2023) [19]) is applied to investigate why the literature reports such wide variation in the effectiveness of pure Ni coatings at reducing corrosion by molten salt. We simulate the impact of Ni diffusion-barrier coatings on the corrosion of Ni-Cr alloys by molten FLiBe using 2D simulations. We first compare the corrosion behavior of a Ni-20Cr alloy exposed to molten FLiBe at 700 °C with and without a pure Ni coating. The coating reduces the mass loss after 1000 hours by a factor of ten, consistent with experimental results from the literature. The model is then used with Latin hypercube sampling in 100 simulations with different coating thicknesses and average alloy and coating grain sizes. As the coating grain size increases, the model predicts that the mass loss and the corrosion depth into the alloy decrease, because fewer fast diffusion paths are available along the coating grain boundaries (GBs) for Cr to reach the salt. As the alloy grain size increases, the model predicts that the mass loss decreases but the corrosion depth increases: a larger grain size provides less GB area for Cr depletion, reducing mass loss, but the reduced GB area also allows the Cr depletion to penetrate deeper into the alloy. In addition, the model predicts that as the coating thickness increases, the mass loss decreases rapidly and the impact of both grain sizes diminishes. Thus, controlling the coating grain size is less important for thicker coatings.
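For illustration, a minimal sketch of how such a Latin hypercube parameter study can be set up, assuming Python with SciPy's `qmc` module; the parameter bounds and the `run_simulation` driver are hypothetical placeholders, not values or code from the original study.

```python
# Sketch: Latin hypercube sampling of coating thickness and average
# coating/alloy grain sizes for a 100-simulation parameter study.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_samples = sampler.random(n=100)  # 100 points in the 3D unit cube

# Assumed bounds in micrometers (illustrative only):
# [coating thickness, coating grain size, alloy grain size]
l_bounds = [1.0, 1.0, 5.0]
u_bounds = [50.0, 50.0, 100.0]
params = qmc.scale(unit_samples, l_bounds, u_bounds)

for thickness, coating_gs, alloy_gs in params:
    # Each row defines one 2D phase-field corrosion simulation;
    # run_simulation stands in for the external phase-field solver.
    # run_simulation(thickness, coating_gs, alloy_gs)
    pass
```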