Due to the serious impact of falls on the quality of life of the elderly and on the economic sustainability of health systems, the study of new monitoring systems capable of automatically alerting about falls has gained much research interest during the last decade. In the field of Human Activity Recognition, Fall Detection Systems (FDSs) can be regarded as pattern recognition architectures able to discriminate falls from ordinary Activities of Daily Living (ADLs). In this regard, the combined application of cellular communications and wearable devices that integrate inertial sensors offers a cost-efficient solution to track user mobility almost ubiquitously. The Inertial Measurement Units (IMUs) typically employed in these architectures embed an accelerometer and a gyroscope. This paper investigates whether the use of the angular velocity (captured by the gyroscope) as an input feature of the movement classifier introduces any benefit with respect to the most common case, in which the classification decision is based solely on the accelerometry signals. For this purpose, the work assesses the performance of a deep learning architecture (a convolutional neural network) which is optimized to differentiate falls from ADLs as a function of the raw data measured by the two inertial sensors (gyroscope and accelerometer). The system is evaluated on a well-known public dataset containing a large number of mobility traces (falls and ADLs) measured from the movements of a wide group of experimental users.
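The kind of classifier described above can be sketched as a small 1-D convolutional network operating on fixed-length windows of the six raw inertial channels (triaxial accelerometer plus triaxial gyroscope). The following PyTorch sketch is purely illustrative: the layer sizes, window length, and class names are assumptions, not the architecture actually optimized in the paper.

```python
import torch
import torch.nn as nn

class FallDetectorCNN(nn.Module):
    """Illustrative 1-D CNN that labels a window of raw IMU samples
    (3 accelerometer + 3 gyroscope channels) as fall vs. ADL.
    Layer sizes and window length are hypothetical choices."""

    def __init__(self, in_channels: int = 6, window_len: int = 128):
        super().__init__()
        # Two convolution/pooling stages over the time axis.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Each pooling stage halves the window, hence window_len // 4.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (window_len // 4), 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # two output classes: fall / ADL
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# A batch of 4 windows, each with 6 inertial channels of 128 samples.
model = FallDetectorCNN()
logits = model(torch.randn(4, 6, 128))
print(logits.shape)  # torch.Size([4, 2])
```

Dropping the three gyroscope channels (setting `in_channels=3` and feeding only the accelerometer signals) yields the accelerometry-only baseline against which the gyroscope's contribution can be compared.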