The analysis of multimodal data collected from innovative imaging sensors, Internet of Things (IoT) devices, and user interactions can enable smart, automatic remote monitoring of patients and reveal valuable insights for the early detection and/or prevention of events related to their health condition. In this paper, we present a platform called ICT4LIFE that, starting from low-level data capture and multimodal fusion to extract relevant features, performs high-level reasoning to provide relevant information on the monitoring and evolution of the patient and to trigger appropriate actions for improving the patient's quality of life.