We present a method for automated analysis of human semen quality using microscopic video sequences of live semen samples. The videos are captured through an automated microscope at 400x magnification. In each video frame, objects of interest are extracted using image processing techniques. A deep convolutional neural network (CNN) is used to distinguish sperm cells from non-sperm objects. The frame-wise count of sperm cells is used to estimate sperm concentration per unit volume of semen. In each video, individual sperm cells are tracked across frames using a predictive approach that handles collisions and occlusions well. Based on their computed trajectories, sperm cells are classified as progressively motile, non-progressively motile, or immotile, as per the WHO manual. In certain samples, for various reasons, all visible objects drift in a common direction; we present a method for identifying and compensating for this drift. Experimental results are presented on a set of more than 100 semen samples collected from a clinical laboratory. The results correlate well with those of an accepted standard, the SQA-V Gold analyzer, for sperm concentration as well as motility parameters.