In this paper we consider tracking of an optimal filter modeled as a stationary vector process. We interpret the Recursive Least-Squares (RLS) adaptive filtering algorithm as a filtering operation on the optimal filter process and the instantaneous gradient noise (induced by the measurement noise). The filtering operation carried out by the RLS algorithm depends on the window used in the least-squares criterion. Arriving at a recursive LS algorithm requires that the window impulse response can be expressed recursively (as the output of an IIR filter). In practice, only two popular window choices exist (each with one tuning parameter): the exponential weighting (W-RLS) and the rectangular window (SWC-RLS). However, the rectangular window can be generalized, at a small cost for the resulting RLS algorithm, to a window with three parameters (GSW-RLS) instead of just one, encompassing both SWC- and W-RLS as special cases. Since the complexity of SWC-RLS essentially doubles with respect to W-RLS, it is generally believed that this increase in complexity allows for some improvement in tracking performance. We show that, with equal estimation noise, W-RLS generally outperforms SWC-RLS in causal tracking, with GSW-RLS performing better still, whereas for non-causal tracking SWC-RLS is by far the best (with GSW-RLS unable to improve on it). When the window parameters are optimized for causal tracking MSE, GSW-RLS outperforms W-RLS, which outperforms SWC-RLS. We also derive the optimal window shapes for causal and non-causal tracking of arbitrary variation spectra. It turns out that W-RLS is optimal for causal tracking of AR(1) parameter variations, whereas SWC-RLS is optimal for non-causal tracking of integrated white jumping parameters, all optimal filter parameters having proportional variation spectra in both cases.
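For concreteness, the window impulse responses referred to above can be sketched as follows. The exponential and rectangular forms are standard; the three-parameter generalized form shown last is an assumed parameterization, consistent with the description above, that contains both as special cases (the exact form used is given in the body of the paper). With $w(j)$ weighting the error at lag $j$ in the LS criterion $\sum_{j\ge 0} w(j)\,|e(k-j)|^2$,
\[
  w_{\mathrm{W}}(j)=\lambda^{j},\qquad
  w_{\mathrm{SWC}}(j)=\begin{cases}1, & 0\le j\le L-1\\[2pt] 0, & j\ge L\end{cases},\qquad
  w_{\mathrm{GSW}}(j)=\begin{cases}\lambda^{j}, & 0\le j\le L-1\\[2pt] \beta\,\lambda^{j}, & j\ge L .\end{cases}
\]
The exponential window admits the one-term recursion $w(j)=\lambda\,w(j-1)$, whereas the rectangular window requires both an update for the incoming sample and a downdate for the sample leaving the window, which is why the complexity of SWC-RLS is essentially double that of W-RLS. The assumed generalized form reduces to W-RLS for $\beta=1$ (or $L=0$) and to SWC-RLS for $\lambda=1$, $\beta=0$.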