We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1 \times d_2}$ of rank $r \ll \min(d_1, d_2)$ from incomplete linear observations, solving a sequence of low-complexity linear problems. The easily implementable algorithm, which we call harmonic mean iteratively reweighted least squares (HM-IRLS), optimizes a non-convex Schatten-$p$ quasi-norm penalization to promote low-rankness and carries three major strengths, in particular for the matrix completion setting. First, we observe a remarkable global convergence behavior of the algorithm's iterates to the low-rank matrix in relevant, interesting cases for which any other state-of-the-art optimization approach fails to recover the matrix. Second, HM-IRLS exhibits an empirical recovery probability close to 1 even for a number of measurements very close to the theoretical lower bound $r(d_1 + d_2 - r)$, i.e., already for significantly fewer linear observations than any other tractable approach in the literature. Third, HM-IRLS exhibits a locally superlinear rate of convergence (of order $2-p$) if the linear observations fulfill a suitable null space property. While for the first two properties we have so far only strong empirical evidence, we prove the third property as our main theoretical result.
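For orientation, the Schatten-$p$ penalized recovery problem behind the algorithm can be sketched as follows; the symbols $\Phi$ for the linear measurement operator and $y$ for the observed data are notation introduced here, and the one-sided weight matrix in the second display is a standard simplification that does not reproduce the harmonic-mean weight operator specific to HM-IRLS.
\[
\min_{X \in \mathbb{C}^{d_1 \times d_2}} \ \|X\|_{S_p}^p = \sum_{i=1}^{\min(d_1, d_2)} \sigma_i(X)^p
\quad \text{subject to} \quad \Phi(X) = y, \qquad 0 < p < 1,
\]
where $\sigma_i(X)$ denotes the $i$-th singular value of $X$. A generic IRLS scheme replaces this non-convex problem by a sequence of weighted least squares problems of the form
\[
X^{(k+1)} = \operatorname*{arg\,min}_{\Phi(X) = y} \ \operatorname{tr}\!\big( W^{(k)} X^* X \big),
\qquad
W^{(k)} = \big( X^{(k)*} X^{(k)} + \varepsilon_k^2 \, \mathrm{Id} \big)^{p/2 - 1},
\]
with a smoothing parameter $\varepsilon_k > 0$ that is typically decreased along the iterations; for $\varepsilon_k \to 0$ the quadratic surrogate matches $\sum_i \sigma_i(X^{(k)})^p$ at the current iterate.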