In this work, the performance loss rates of eleven grid-connected photovoltaic (PV) systems of different technologies were evaluated by applying linear regression (LR) and trend extraction methods to Performance Ratio (R_P) time series. In particular, model-based methods such as Classical Seasonal Decomposition (CSD), Holt-Winters (HW) exponential smoothing and Autoregressive Integrated Moving Average (ARIMA), as well as the non-parametric filtering method LOcally wEighted Scatterplot Smoothing (LOESS), were used to extract the trend from the monthly R_P time series of the first five years of operation of each PV system. The results showed that applying LR to the raw time series produced the lowest performance loss rates for most systems, but with significant autocorrelation in the residuals, indicating a statistically inadequate fit. The application of CSD and HW substantially reduced the residual autocorrelation, since the seasonal component was extracted from the time series, and yielded comparable performance loss rates for eight of the eleven PV systems, with a mean absolute percentage error (MAPE) of 6.22 % between the rates calculated by the two methods. Finally, the optimal use of multiplicative ARIMA resulted in Gaussian white noise (GWN) residuals and the statistically most accurate model of the R_P time series. ARIMA produced higher performance loss rates than LR for all technologies except the amorphous silicon (a-Si) system. The non-parametric LOESS method produced results directly comparable to those of multiplicative ARIMA, with a MAPE of -2.04 % between the performance loss rates calculated by the two methods, whereas LR, CSD and HW deviated further from ARIMA, with MAPEs of 25.14 %, -13.71 % and -6.39 %, respectively.
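As an illustration of the workflow described above, the following is a minimal Python sketch of trend extraction and loss-rate estimation from a monthly performance ratio series. It uses statsmodels' seasonal_decompose, lowess and SARIMAX as stand-ins for the paper's CSD, LOESS and multiplicative ARIMA steps; the synthetic series, the model orders and the smoothing span (frac=0.4) are illustrative assumptions, not the study's data or identified models.

```python
# Minimal sketch: trend extraction and loss-rate estimation from a monthly
# performance ratio (R_P) series. The data are synthetic; model orders and
# the LOESS span are illustrative assumptions, not the study's.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.nonparametric.smoothers_lowess import lowess
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

# Synthetic five-year monthly R_P series: ~1 %/yr degradation plus seasonality.
rng = np.random.default_rng(0)
t = np.arange(60)
months = pd.date_range("2010-01", periods=60, freq="MS")
rp = pd.Series(0.85 * (1 - 0.01 * t / 12)            # slow linear loss
               + 0.03 * np.sin(2 * np.pi * t / 12)   # annual seasonal cycle
               + rng.normal(0, 0.005, 60), index=months)

def annual_loss_rate(trend):
    """OLS slope of a monthly trend, as % of its initial level per year."""
    x = sm.add_constant(np.arange(len(trend)) / 12.0)
    fit = sm.OLS(np.asarray(trend, dtype=float), x, missing="drop").fit()
    return 100.0 * fit.params[1] / fit.params[0]

# 1) LR on the raw series: the seasonal cycle stays in the residuals.
print(f"LR on raw series: {annual_loss_rate(rp):+.2f} %/yr")

# 2) CSD: remove the multiplicative seasonal component, then regress the trend.
csd_trend = seasonal_decompose(rp, model="multiplicative", period=12).trend
print(f"CSD trend:        {annual_loss_rate(csd_trend.dropna()):+.2f} %/yr")

# 3) LOESS/LOWESS non-parametric smoothing as the trend extractor.
smooth = lowess(rp.values, t, frac=0.4, return_sorted=False)
print(f"LOESS trend:      {annual_loss_rate(smooth):+.2f} %/yr")

# 4) Seasonal ARIMA fit plus a Ljung-Box check that the residuals resemble
#    Gaussian white noise (the adequacy criterion cited in the abstract).
arima = SARIMAX(rp, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(acorr_ljungbox(arima.resid, lags=[12]))
```

On these clean synthetic data, with five complete years the seasonal cycle largely averages out of the plain regression as well, so all the estimates cluster near the injected -1 %/yr; on real, noisier R_P series the methods diverge, and that spread is what the MAPE figures above quantify.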