Emotional states strongly influence many aspects of our daily lives, such as learning, decision making, and interaction with others. The ability to detect and recognize a person's emotional state is therefore essential for intelligent Human-Machine Interaction (HMI). The aim of this study was to develop a new system that can sense and communicate changes in emotion expressed by the Central Nervous System (CNS) through EEG signals. More specifically, this study developed an EEG-based, subject-dependent affect recognition system to quantitatively measure and categorize three affect states: positively excited, neutral, and negatively excited. In this paper, we discussed implementation issues associated with each key stage of a fully automated affect recognition system: the emotion elicitation protocol, feature extraction, and classification. EEG recordings from five subjects in the eNTERFACE06 database, elicited with IAPS images as stimuli, were used for simulation purposes. Discriminating features were extracted in both the time and frequency domains (statistical, narrow-band, Higher Order Crossings (HOC), and wavelet entropy features) to better capture the oscillatory nature of brain waves. Using a k-Nearest Neighbor (kNN) classifier with k = 5, we obtained a mean correct classification rate of 90.77% over the three emotion classes, demonstrating the feasibility of brain waves as a means of categorizing a user's emotional state. Second, we also assessed the suitability of commercially available EEG headsets, such as the Emotiv EPOC, for emotion recognition applications. This assessment was carried out by comparing their sensor locations and signal integrity with those of the Biosemi Active II system. A new set of recognition results obtained with a reduced number of channels was presented.
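The pipeline described above (time-domain feature extraction followed by kNN classification) can be sketched in a few lines. The sketch below is illustrative only, not the authors' implementation: it computes two simple statistical features plus Higher Order Crossings (HOC, zero-crossing counts of the signal and its successive differences) per epoch, and classifies with a majority-vote kNN using Euclidean distance. The synthetic slow/fast oscillations standing in for two affect classes, the sampling rate of 128 Hz, and all function names are assumptions for demonstration; real input would be preprocessed EEG epochs.

```python
import numpy as np

def hoc_features(x, order=4):
    """Higher Order Crossings: zero-crossing counts of the (mean-removed)
    signal and its successive differences, a simple time-domain feature set."""
    z = np.asarray(x, dtype=float)
    z = z - z.mean()
    feats = []
    for _ in range(order):
        s = np.sign(z)
        feats.append(int(np.sum(s[:-1] * s[1:] < 0)))  # count sign changes
        z = np.diff(z)
    return feats

def extract_features(epoch):
    """Statistical + HOC features for one single-channel EEG epoch (1-D array)."""
    x = np.asarray(epoch, dtype=float)
    return np.array([x.mean(), x.std()] + hoc_features(x))

def knn_predict(train_X, train_y, query, k=5):
    """Majority vote among the k nearest training feature vectors (Euclidean)."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Demo on synthetic "epochs": slow vs. fast oscillations stand in for two
# hypothetical affect classes (real data would come from EEG recordings).
rng = np.random.default_rng(0)
t = np.arange(128) / 128.0  # 1 s at an assumed 128 Hz sampling rate

def make_epoch(freq):
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(t.size)

train_X = np.array([extract_features(make_epoch(f))
                    for f in [5, 6, 5.5, 30, 28, 32]])
train_y = np.array([0, 0, 0, 1, 1, 1])
query = extract_features(make_epoch(29))
print(knn_predict(train_X, train_y, query, k=5))  # fast query -> class 1
```

In the real system, one such feature vector would be built per channel and epoch (with the narrow-band and wavelet-entropy features concatenated alongside), and k = 5 matches the setting that yielded the reported 90.77% mean accuracy.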