In machine learning, Gaussian process regression (GPR) has gained popularity owing to its nonparametric Bayesian formulation. However, the traditional GPR model is designed for continuous real-valued outputs under a Gaussian assumption, which does not hold in some engineering applications. For example, when the output variable is count data, the assumptions of the GPR model are violated. Generalized Gaussian process regression (GGPR) overcomes this drawback of conventional GPR by allowing the model output to follow any member of the exponential family of distributions, and it is therefore more flexible than GPR. However, because GGPR is a nonlinear kernel-based method, the effect of each input variable on the model output is not readily interpretable. To address this issue, a sensitivity analysis of GGPR (SA-GGPR) is proposed in this work. SA-GGPR identifies the factors that exert the greatest influence on the model output by exploiting the partial derivatives of the GGPR model output with respect to its inputs. The proposed method was applied to a nonlinear count-data system. The application results demonstrate that the proposed SA-GGPR outperforms the PLS-Beta, PLS-VIP, and SA-GPR methods in identification accuracy.
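For illustration only, the sketch below (not the authors' implementation) shows the derivative-based sensitivity idea in its simplest setting: a standard GPR posterior mean with an RBF kernel, whose partial derivative with respect to each input is available in closed form. Averaging the squared derivative over a set of evaluation points yields a per-input sensitivity score. All names, hyperparameters, and the synthetic data are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel k(a, b) = exp(-||a - b||^2 / (2 * l^2))."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-0.5 * sq_dists / lengthscale**2)

def gpr_sensitivity(X_train, y_train, X_eval, lengthscale=1.0, noise=1e-2):
    """Derivative-based sensitivity scores for a GPR posterior mean.

    The posterior mean is m(x) = sum_i alpha_i k(x, x_i), so its partial
    derivative w.r.t. input dimension d has the closed form
        dm/dx_d = sum_i alpha_i * (x_i[d] - x[d]) / l^2 * k(x, x_i).
    The score for each input is the mean squared derivative over X_eval.
    """
    K = rbf_kernel(X_train, X_train, lengthscale) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)                  # (n_train,)
    K_star = rbf_kernel(X_eval, X_train, lengthscale)    # (n_eval, n_train)
    diff = X_train[None, :, :] - X_eval[:, None, :]      # (n_eval, n_train, n_dim)
    grads = np.einsum('mi,mid,i->md', K_star, diff, alpha) / lengthscale**2
    return np.mean(grads**2, axis=0)                     # one score per input

# Hypothetical example: x0 drives the output, x1 is irrelevant.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(100, 2))
y = np.sin(2.0 * X[:, 0])                                # depends only on x0
print(gpr_sensitivity(X, y, X))                          # score for x0 should dominate
```

In the GGPR setting described in the abstract, the output mean is a nonlinear (link-function) transformation of a latent GP, so the corresponding derivative would be obtained through the chain rule rather than directly from the posterior mean as above.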