Eye tracking and the analysis of gaze behaviour are established tools for gaining insights into how humans observe their surroundings and consume visual multimedia content. For example, gaze recordings can be used directly to study attention allocation towards areas and objects of interest. Furthermore, segmenting raw gaze traces into their constituent eye movements has applications in the assessment of subjective quality and mental load, and may also improve computational saliency prediction for the content. Eye trackers are currently beginning to be integrated into commodity virtual and augmented reality set-ups, which allow more diverse stimuli to be presented, including 360-degree content. However, because of the more complex eye-head coordination patterns that emerge, the definitions and well-established methods developed for monitor-based eye tracking are often no longer directly applicable. The main contributions of this work to the field of 360-degree content analysis are threefold: First, we collect and partially annotate a new eye tracking data set for naturalistic 360-degree videos. Second, we propose a new two-stage pipeline, implemented in a flexible user interface, for reliable manual annotation of both "traditional" (fixations and saccades) and more complex eye movement types. Lastly, we develop and test a proof-of-concept algorithm for the automatic classification of all the eye movement types in our data set. The data set and the source code for both the annotation tool and the algorithm are publicly available at https://gin.g-node.org/ioannis.agtzidis/360_em_dataset.