1 Introduction
- to live in their own home, maintaining their autonomy, independence and quality of life in a safe and secure context;
- to be active and participate in community life in order to reduce their sense of loneliness and general negative feelings;
- to retain control over their own lives even when they need care and assistance;
- to increase the attention of doctors and caregivers to their health.
- The elasticity of the cloud allows allocating additional hardware resources (storage and processing) as the number of connected agents and required services increases, without discontinuity or service faults.
- The resource redundancy of the cloud makes it highly available and more fault tolerant than the classical server approach.
- The cloud can manage a huge number of simultaneous connections from smart agents for data collection and processing, enabling big-data processing and the execution of learning algorithms in the field of AAL and assistive robotics.
2 Related Work
- User health status assessment [16].
| Information | | Robotic application | |
|---|---|---|---|
| Type | Data | Assistive | Social |
| Environmental | Light | X | X |
| | Temperature | X | |
| | Humidity | X | |
| | Gas presence | X | |
| | People presence | X | X |
| | Door status | X | |
| | Water leakages | X | |
| | Seat occupancy | X | X |
| | Room size | X | X |
| Personal | Position | X | X |
| | Gaze | X | X |
| | Posture | X | X |
| | Age | X | |
| | Mood | X | X |
| | Abilities | X | X |
| | Experience with technology | X | |
| Actor | Action |
|---|---|
| Caregiver | The caregiver remotely sets, on a web calendar app, the day and time for the robotic drug reminder and delivery service, according to the user's therapy |
| End-user | The user forgets to take his medicine, and is alone in the house or roaming in a nursing home |
| WSNs | The WSNs collect useful data for user localization and send them to the cloud platform |
| ULM | The cloud module estimates the user's position and sends it to the DBMS |
| ESM | When the time for service delivery comes, the module triggers a drug reminder service for the user on the HRIM |
| HRIM | At the triggered event, the module retrieves the user's position and all the information required to set the appropriate robot behavior. As an example, it computes the proper distance, direction and speed to engage the user. Then it sends the robot the commands and data to perform the service |
| Robot | The robot receives the command and plans the trajectory to reach the user and the mode of engagement. When in front of the user, the robot interacts using the embedded interfaces and natural language to remind the user about the medicine |
| End-user | The end-user takes the medicine |
3 System Architecture
| Actor | Action |
|---|---|
| WSNs | The environmental sensors continually send raw data to the cloud for environmental monitoring and user localization |
| ULM | The cloud module estimates the user's position and sends it to the DBMS |
| EMM | This module autonomously analyses the environmental data to check for abnormal situations. In case of a critical situation, the information on the environmental status and suggestions on how to restore a safe and secure environment are sent to the HRIM, and an alert service is triggered |
| HRIM | At the triggered event, the module retrieves the user's position and all the information required to set the appropriate robot behavior. As an example, it computes the proper distance, direction and speed with which to engage the user. Then it sends the robot the commands and data to perform the service |
| Robot | The robot receives the command and plans the proper trajectory to reach the user and the mode of engagement. When in front of the user, the robot uses natural language to alert the user |
| Caregiver | The caregiver is informed of the critical situation by means of e-mail, SMS and visual feedback on the web GUI |
| End-user | The senior can ask the robot for walking support to check the environment, or for a Skype call service to request support from carers |
3.1 Agents
- Service robot The service robot was developed in the Robot-Era project [39] using a SCITOS G5 platform (Metralabs, Germany) as a basis. The robot was designed to provide both physical and cognitive support to older adults. In particular, it features an integrated robotic arm for object manipulation, a tray for transporting objects and a handle for walking support. It communicates with the user through an embedded touch screen and through speakers and microphones for speech synthesis and recognition. Users can ask the robot for a service, or control it, using the embedded microphone or a Bluetooth-connected wearable microphone. In particular, the robot can recognize specific spoken keywords corresponding to the commands or services it can perform. Through the speakers, the robot performs speech synthesis to interact with the user; for example, it can remind the user to take a medication or about an appointment. A SICK3000 laser scanner (Sick AB, Germany) was installed on the front of the robot to detect obstacles and navigate in unstructured indoor environments. An embedded PC collected the data from the robot's sensors, performed path planning, and provided the robot with autonomous navigation and obstacle avoidance capabilities. The robot exchanged data with the cloud through a Wi-Fi module to obtain the user's position and the required services. Whenever a user requested a service, the robot retrieved the user's position from the cloud, autonomously computed the path to reach the user and performed the required service. No camera was used for navigation or for user detection, recognition or localization, in order to comply with the AAL privacy requirements and make this kind of robotic service more acceptable.
- Wireless sensor networks (WSNs) Two ZigBee WSNs were included in this system: one for user localization (LNet) and one for environmental monitoring (SNet). A mesh network topology was implemented for both the SNet and the LNet, allowing the devices to exchange data with each other and providing more dependable message routing than the classical star and tree topologies. Multi-hop message routing was enabled to carry data beyond the devices' radio range and extend the services provided by the smart environments over large areas such as condominiums and nursing homes. The LNet was designed for multiple-user localization, observing the Received Signal Strength (RSS) [50] of the messages exchanged between the radios. It was composed of a ZigBee Coordinator (ZC), a data logger (DL), a wearable Mobile Node (MN) and a set of ZigBee anchors (ZAs). The MN periodically sent messages to all ZAs within one communication hop. Each ZA computed the RSS of the received messages, as the ratio between the received and transmitted electromagnetic power [51], and transmitted this value to the DL. The ZAs were placed in fixed, known positions in the environment; in particular, they were installed on walls and furniture to monitor the most accessed or interesting areas of the rooms and achieve in-room localization accuracy, as suggested in [35]. Each ZA was equipped with a 60\(^{\circ }\) sectorial, horizontally linearly polarized antenna that covered the workspace along the antenna boresight, whereas the MN used an embedded omnidirectional, horizontally linearly polarized antenna for data transmission. Sectorial antennas were introduced to improve the signal-to-noise ratio of the RSS observations for user localization [52]. The LNet was designed to locate up to three users at the same time, in both the Domocasa and the Angen experimental sites. The network provided data at a refresh rate sufficient to locate a user once every second (1 Hz).
The user position refresh rate was a trade-off between the number of devices installed in the environment, which must share the same communication medium without interfering with each other, and the number of simultaneously traceable users. The refresh rate was kept in the range of 0.2 to 2 Hz, which the literature ([35] and [53], respectively) considers sufficient for delivering assistive location-based services. The LNet provided the data needed to run RSS-based localization algorithms on the proposed cloud platform and locate the users inside its workspace.
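As a rough sketch of the RSS observation just described: the RSS is the ratio of received to transmitted power, usually expressed in dB, and can be inverted into a range estimate for the range-based methods. The log-distance path-loss inversion below is an assumption of this example, not a detail given in the text; `rss0`, `n` and `d0` stand for hypothetical per-anchor calibration parameters.

```python
import math

def rss_db(p_rx_mw: float, p_tx_mw: float) -> float:
    """RSS as the ratio of received to transmitted power, in dB."""
    return 10.0 * math.log10(p_rx_mw / p_tx_mw)

def range_from_rss(rss: float, rss0: float, n: float, d0: float = 1.0) -> float:
    """Invert a log-distance path-loss model into a range estimate.

    rss0: expected RSS at reference distance d0 (per-anchor calibration)
    n:    path-loss exponent (~2 in free space, higher indoors)
    """
    return d0 * 10.0 ** ((rss0 - rss) / (10.0 * n))
```

Each ZA would report `rss_db` values to the DL, and a range-based module on the cloud could turn them into distances for trilateration.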
- The SNet was developed for home monitoring and the passive localization of people. It comprised a ZC, a DL, and a set of sensor nodes (SNs). Each SN contained a selection of sensors chosen to improve social assistive robots, as in Table 1, such as Passive InfraRed (PIR) sensors, pressure sensors placed under a chair or a bed, switches on doors or drawers, gas and water leak sensors, and sensors for temperature, humidity and light. The LNet and SNet were set to different channels to avoid interference and ensure the proper bandwidth for the localization and environmental monitoring services. Each DL node was connected to a PC via USB, to upload data to the cloud.
3.2 Cloud Platform
- DB and DBMS The cloud DBMS and the DB were designed to store and organize the information needed to improve the robot's social behaviors and the quality of the service. The cloud DB improves the scalability of the system, because of its huge storage capability and high accessibility. The DBMS manages all the DB entries and queries and ensures privacy and data security by limiting access to authorized users only. The DB comprises a number of tables, collecting data on the status of the monitored environments, the installed sensors, and the users.
- The DB is conceptually divided into three different parts, to improve the system's scalability over the users and the environments. In particular, it comprises three main entities: one related to the sensors, one to the users, and one to the environments (Fig. 3).
- The list of the sensors installed in the LNet and SNet was reported in entity S. A unique identification number (e.g. the ZigBee sensor's EUI64) was used as the primary key of an n-tuple that contains the sensor type (e.g. light, temperature, ZigBee anchor, presence detection), the position in the environment (x, y coordinates), the calibration parameters if needed, and the sensing workspace in square meters. For each i-th sensor type, a specific entity (Mi) collected the sensor output over time. The DB also provided data for multi-environment and multi-user services, and the information regarding the users was reported in entity U. The user entries include the information needed to improve user-robot interaction and to provide the proper services, such as the user's name, age, height, gender and propensity to use robotic services. The table P recorded the users' positions in terms of x, y coordinates and also included semantic labels to identify the occupied room or area of interest in a human-readable manner.
- The KF_Matrix entity collects, over time, the Kalman filter state, state covariance and measurement covariance matrices, which are specific to each user. The alarm table reports the complete list of the alarms that have occurred in daily life. The House and Room entities report a complete description of the multi-environment context. In particular, they include data on the physical dimensions of the connected houses and of their rooms, and useful semantic human-readable information (e.g. bedroom, kitchen, corridor). For each room, the DB included the numerical and semantic description of one or more areas of interest. The areas of interest were selected taking into account the European report on how Europeans spend their time, from EUROSTAT [54]. EU residents aged 20 to 74 spend 40% of their free time watching TV, 18% socializing and 10% reading, whereas sleeping takes almost 35% of the entire day. Meals and personal care take up to 2 h and 22 min per day, and some of the most time-consuming home activities, cooking and cleaning dishes, are performed in the kitchen (57 min per day). This suggested selecting areas of interest in the kitchen at the sink, stove and table, in the bathroom, near the sofa in the living room, and in the bed areas of the bedroom. In this way, the sensors installed in these areas allow monitoring of the most accessed areas of the home. As future work, the DB will also be able to ingest data from the sensors of the connected robots, to improve the situation awareness of the intelligent software agents in the cloud.
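A minimal sketch of how the S, U and P entities described above could be laid out in a relational DBMS; the column names are illustrative assumptions of this example, not the paper's actual schema.

```python
import sqlite3

# Hypothetical minimal schema mirroring the S (sensors), U (users)
# and P (positions) entities; an in-memory DB is used for brevity.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE S (                  -- installed sensors
    eui64        TEXT PRIMARY KEY, -- unique ID, e.g. the ZigBee EUI64
    type         TEXT,             -- light, temperature, anchor, PIR, ...
    x REAL, y REAL,                -- position in the environment
    calibration  TEXT,             -- calibration parameters, if needed
    workspace_m2 REAL              -- sensing workspace in square meters
);
CREATE TABLE U (                  -- users
    user_id INTEGER PRIMARY KEY,
    name TEXT, age INTEGER, height_cm REAL,
    gender TEXT, robot_propensity TEXT
);
CREATE TABLE P (                  -- estimated user positions
    user_id INTEGER REFERENCES U(user_id),
    ts TEXT,                       -- timestamp
    x REAL, y REAL,
    semantic_label TEXT            -- e.g. 'kitchen', 'sofa area'
);
""")
```

The semantic label in P is what lets the GUI report "the user is in the kitchen" rather than raw coordinates.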
- User localization module (ULM) This software module was designed to locate several users in several environments and to support robots in sustaining a continual care service. The software acquired data from the heterogeneous commercial and ad-hoc sensors in the connected SNets and LNets to estimate the users' positions. A sensor fusion approach was implemented to compute the users' positions in a robust and scalable manner. The accuracy and cost of the indoor localization service depend on the type and number of sensors installed. In the case of a sensor fault, the user's position was estimated by fusing data from the remaining sensors, improving the reliability and robustness of the service. The ULM can simultaneously process data from the connected MNs and locate users across different environments.
- The sensor fusion approach was based on a Kalman filter (KF) for user localization. It was implemented exploiting both range-free [55, 56] and range-based [57] localization methods, as suggested in [39]. The range-free localization and presence detection methods described in [58] and [59] were used to minimize the impact of installation mistakes and calibration issues on the system's accuracy, according to [60]. The trilateration method introduced in [57] was implemented to improve the localization accuracy in the anchors' neighborhood. The ULM was designed to be independent of the type of connected sensors, leveraging the information stored in the cloud DB. Whenever a sensor provided data to the ULM, the ULM queried the DB to retrieve useful information on the sensor, such as its position, its type and the unit of measure of the provided data. Once the sensor's observation was recognized, the information was sent to the KF for processing. In this design, the ULM is technology agnostic, and the data for user localization may come from commercial or ad-hoc WSNs, smart devices or IoT agents.
- Environmental monitoring module (EMM) This module processed all the data concerning the environmental conditions, for remote room monitoring and the detection of critical situations. The EMM was in charge of triggering events concerning user safety and security, such as the detection of intruders, the presence of wet and slippery floors, gas leakages and uncomfortable climatic or living conditions.
- Event scheduler module (ESM) The Google Calendar tools and API [61] were integrated into the ESM to demonstrate the opportunity to include third-party software and services in the system, improving its maintainability. The ESM was designed as a general-purpose event scheduler, able to retrieve appointments and service requests from the calendar and trigger the appropriate commands and service requests to the connected robotic agents. It can be used for medication and care management, for the management of daily life activities, and to promote social activities and foster healthy lifestyles. Depending on the users' cognitive abilities, the calendar can be set by the users themselves or by their caregivers.
- Human robot interaction module (HRIM) The HRIM was designed as a proof of concept, to address some issues regarding the way robots navigate to users to attempt a service or an interaction. A software module was dedicated to the definition of the user approach strategy. This module waits for a human-robot interaction event or an interaction request. If a human-robot interaction occurs during the service, the HRIM retrieves from the cloud DB all the necessary data on the user and the environment to estimate the proper robot proxemics. For each service involving interaction with a human, the robot could be directed to the human at a different speed and positioned at a specific distance and orientation, depending on the user's position and posture, the dimensions of the room and the lighting conditions, as suggested in [40, 41].
- GUI The GUI consisted of a web application for remote home monitoring and the supervision of the users' locations. It connected directly to the DB on the cloud, which exposed a public static IP. For security, GUI access was restricted to authorized people. The interface home page provided the mean values of lighting, humidity and temperature for each sensorized room. In addition, an alarm web page listed the alarms that had occurred, while the localization web page reported the rooms where the users were located.
4 Experimentation and Methodology
4.1 Pilot Sites Description
- DomoCasa Lab (IT) The DomoCasa Lab is located in Peccioli (Italy) within the Living Lab of Scuola Superiore Sant'Anna. It is a 200 \(\mathrm{m^2}\) furnished apartment that attracts people to contribute to experimentation with companion robots. It comprises a living room, a kitchen, a restroom and two bedrooms. Each room was instrumented as in Fig. 2 with at least a temperature, a humidity and a light sensor, while fifteen anchors, six PIRs, and five sensorized carpets and pillows were installed for user localization.
- Angen nursing home (SE) The Angen site is a five-floor residential facility composed of private flats, common areas and two domotic apartments dedicated to research activities (see Fig. 3). The two apartments, furnished as real homes, were used as a living lab. The localization and sensor network workspace covered an area of approximately 145 \(\mathrm{m^2}\), distributed over the two smart apartments on the first floor and the common laundry area on the fifth floor. The ZigBee stack made it possible to tackle the challenge of monitoring such a five-floor indoor environment, by leveraging the multi-hop message routing and the mesh networking of the installed localization and sensor networks. The LNet in the Angen site was instrumented with 18 ZAs, distributed over the two apartments on the first floor and the laundry on the fifth floor. The particular configuration of the Angen site required the installation of two ZAs equipped with an omnidirectional antenna instead of a sectorial one, to bridge messages between the first and the fifth floors. The two devices implemented the multi-hop message routing between the two floors and provided the opportunity to continually locate the user over the entire workspace. The SNet comprised eight sensor boards measuring the internal temperature, humidity and light, while a gas sensor was placed in the kitchen to detect gas leakages. A switch, two pressure sensors, and three PIR sensors were placed for presence detection. In particular, the switch was installed at the kitchen door, while one pressure sensor was placed on a chair in the kitchen and the other on the sofa in the living room of the first apartment. Again, to ensure the opportunity to continually monitor the laundry, two SN devices were installed in the stairwell to act as a wireless bridge between the condominium floors.
4.2 Experimental Settings
| Location | With the LNet (m) | Without the LNet (m) | Error reduction (%) |
|---|---|---|---|
| Chair in the kitchen | 0.53 | 1.24 | 57 |
| Sofa in the living room | 1.06 | 1.48 | 28 |
| Switch at the kitchen door | 1.09 | 1.34 | 19 |
4.3 Metrics for Assessing the Responsiveness and Reliability of the DBMS
4.4 Metrics for Assessing the Accuracy of the ULM
5 Results
- The RTT, defined as the time a robot waits for the user's position after a request to the server. The RTT differs from the classical ping measure, since it includes the processing time in the application layer.
- The DL, defined as the number of services undelivered due to information loss divided by the total number of service requests, expressed as a percentage (Table 4).
| Location | RTT 24 h | RTT night | RTT day | DL 24 h (%) |
|---|---|---|---|---|
| Processing time | 9.21 | 9.21 | 9.21 | - |
| Domocasa (IT) | 40.38 | 28.64 | 46.98 | 0.49 |
| Angen (SE) | 134.57 | 121.59 | 142.20 | 0.0018 |
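The two metrics above can be sketched as follows; `query_user_position` stands in for the real cloud call (an assumption of this example) and blocks until the position arrives, so the timer covers the full application-layer round trip, including server-side processing, unlike an ICMP ping.

```python
import time

def measure_rtt_ms(query_user_position, user_id: str) -> float:
    """Application-layer RTT: request issued until position received."""
    t0 = time.perf_counter()
    query_user_position(user_id)   # blocks until the position arrives
    return (time.perf_counter() - t0) * 1000.0

def delivery_loss_pct(undelivered: int, total: int) -> float:
    """DL: undelivered service requests as a percentage of all requests."""
    return 100.0 * undelivered / total
```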