People can identify emotion from facial expressions with ease, but doing so computationally remains difficult. Recent advances in computational intelligence, however, have made it feasible to recognize emotions directly from images. Emotional responses are mental states that arise without conscious effort and are naturally coupled to the facial muscles, producing distinct expressions such as happiness, sadness, anger, contempt, fear, and surprise. Emotions are therefore an important nonverbal cue to a person's inner state. Companion robots are expanding into every domain, whether meeting the needs of elderly people, supporting psychiatric patients, assisting in child rehabilitation, or even providing childcare, as human-robot interaction grows daily with the rising demand for automation across industries. Over the last few years, many researchers have applied a variety of machine learning methods to recognize human emotion. In this research article, we evaluate eight frequently used machine learning techniques on the FER 2013 dataset to determine which performs best at categorizing human facial expressions: some achieve high accuracy while others largely fail to detect emotions. The measured accuracies are 37% for Logistic Regression, 33% for the K-Nearest Neighbors classifier, 100% for the Decision Tree classifier, 78% for Random Forest, 57% for AdaBoost, 100% for Gaussian Naive Bayes, 33% for Linear Discriminant Analysis (LDA), and 99% for Quadratic Discriminant Analysis (QDA). In particular, the experimental results show that the Decision Tree and Gaussian Naive Bayes classifiers correctly identify all of the emotions in the FER 2013 dataset with 100% classification accuracy, while QDA does so with 99% accuracy.
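
To make the comparison concrete, the following is a minimal sketch of how such an eight-classifier benchmark can be set up with scikit-learn. It assumes the FER 2013 CSV (here named fer2013.csv, with an "emotion" label column and space-separated 48x48 grayscale "pixels") has been downloaded locally; the file name, the 80/20 split, and the default hyperparameters are illustrative assumptions, not the exact protocol of this study.

```python
# Sketch: benchmark eight classifiers on FER 2013 (assumed local fer2013.csv).
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.metrics import accuracy_score

# Parse each space-separated pixel string into a flat 48*48 = 2304 feature
# vector and scale intensities to [0, 1].
df = pd.read_csv("fer2013.csv")
X = np.stack([np.array(p.split(), dtype=np.float32) for p in df["pixels"]]) / 255.0
y = df["emotion"].to_numpy()

# Assumed 80/20 train/test split; the paper's exact split may differ.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# The eight methods compared in this article, with default hyperparameters.
models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "K-Nearest Neighbors": KNeighborsClassifier(),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(),
    "AdaBoost": AdaBoostClassifier(),
    "Gaussian NB": GaussianNB(),
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
}

# Fit each model and report held-out classification accuracy.
for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.2%}")
```

Evaluating on a held-out split, as in this sketch, is what separates genuine generalization from memorization; scoring a flexible model such as a decision tree on the same data it was fitted to would yield near-perfect accuracy by construction.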