Deep Neural Networks (DNNs) have established themselves as a dominant technique in machine learning. DNNs have been top performers on a wide variety of tasks, including image classification, speech recognition, and face recognition.(1-3) Convolutional neural networks (CNNs) have been used in nearly all of the top-performing methods on the Labeled Faces in the Wild (LFW) dataset.(3-6) In this talk and accompanying paper, I review and summarize the deep learning techniques used in state-of-the-art systems. In addition, I highlight the need for larger and more challenging public datasets to benchmark these systems.

Despite the ability of DNNs and autoencoders to perform unsupervised feature learning, modern face recognition pipelines still require domain-specific engineering in the form of face alignment. For example, in Facebook's recent DeepFace paper, a 3D "frontalization" step lies at the beginning of the pipeline. This step fits a 3D face model to the incoming image and then applies a series of affine transformations to the fiducial points to "frontalize" the image. Frontalization enables the DeepFace system to use a neural network architecture with locally connected layers without weight sharing, as opposed to standard convolutional layers.(6)

Deep learning techniques combined with large datasets have allowed research groups to surpass human-level performance on the LFW dataset.(3,5) The high accuracy (99.63% for FaceNet at the time of publication) and reliance on outside data (hundreds of millions of images in the case of Google's FaceNet) suggest that current face verification benchmarks such as LFW may no longer be challenging enough, nor provide enough data, for current techniques.(3,5) A variety of organizations with mobile photo-sharing applications would be capable of releasing a very large scale and highly diverse dataset of facial images captured on mobile devices.
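The 2D essence of such alignment can be sketched as a least-squares affine fit mapping detected fiducial points onto a canonical face template. This is a minimal illustration, not the paper's actual 3D frontalization; the template and landmark coordinates below are invented for the example.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares 2D affine transform mapping src points to dst points.

    src, dst: (N, 2) arrays of corresponding fiducial points (N >= 3,
    not all collinear). Returns a 2x3 matrix A such that
    dst ~= src @ A[:, :2].T + A[:, 2].
    """
    n = src.shape[0]
    # Design matrix in homogeneous coordinates: each row is [x, y, 1].
    X = np.hstack([src, np.ones((n, 1))])          # (N, 3)
    # Solve X @ P ~= dst for P (3x2); transpose to the usual 2x3 form.
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P.T                                      # (2, 3)

def apply_affine(points, A):
    """Apply a 2x3 affine transform to an (N, 2) array of points."""
    return points @ A[:, :2].T + A[:, 2]

# Hypothetical canonical template: eyes, nose tip, mouth corners
# in a 100x100 aligned crop (illustrative values only).
template = np.array([[30, 35], [70, 35], [50, 55],
                     [35, 75], [65, 75]], dtype=float)
# Hypothetical landmarks detected in an off-pose input image.
detected = np.array([[120, 80], [200, 90], [165, 130],
                     [130, 170], [190, 175]], dtype=float)

A = estimate_affine(detected, template)
aligned = apply_affine(detected, A)  # landmarks warped toward the template
```

In a real pipeline the same transform would be applied to the image pixels (e.g. via a warp), and DeepFace's approach goes further by using a 3D model so that out-of-plane rotation can also be corrected.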
Such an "ImageNet for Face Recognition" would likely receive a warm welcome from researchers and practitioners alike.