Despite the increasing attention given to inertial sensors for Human Activity Recognition (HAR), efforts have focused principally on fitness applications, where quasi-periodic activities such as walking or running are studied. In contrast, activities such as eating or drinking cannot be considered periodic or quasi-periodic; instead, they are composed of sporadically occurring gestures within continuous data streams. This paper presents an approach to gesture recognition for an Ambient Assisted Living (AAL) environment, focusing on food and drink intake gestures. First, waist-worn tri-axial accelerometer data is used to develop a low-computational-cost model that recognizes whether a person is in a moving, sitting, or standing state. With this information, data from a wrist-worn tri-axial Micro-Electro-Mechanical System (MEMS) is used to recognize a set of similar eating and drinking gestures. The promising preliminary results show that states can be recognized with 100% classification accuracy using a low-computational-cost model on a reduced 4-dimensional feature vector. Additionally, the recognition rate achieved for eating and drinking gestures was above 99%. Altogether, these results suggest that it is possible to develop a continuous monitoring system based on a bi-nodal inertial unit. This work is part of a larger project that aims to develop a continuous monitoring system for detecting self-neglect in older adults living independently.