@Home Assistive Living Apartment

Project Description

As the population ages, more and more people require additional health care, whether at home, in the workplace, or in a nursing facility. A need therefore exists for health monitoring outside of hospital conditions, and this makes the technology attractive for health-care monitoring systems that can be deployed in many different environments, including the home. Other systems in development employ a wide range of sensors, including cameras, and record the information for processing. These systems all seed an apartment environment with sensors for detecting human behavior and activities. While they are embedded in assistive environments, they lack a comprehensive approach to describing events and do not support general, rapid deployment into different configurations using wireless technology. We present our ongoing project of deploying sensors into an assistive environment. Our goal is a system that can be rapidly deployed without requiring additional wiring or unwanted intrusion into the patient's life.

The @Home Assistive Living Apartment
Initial floorplan of the @Home Assistive Living Apartment

People Involved

Faculty
- Fillia Makedon

Postdocs
- Zhengyi Le
- Kyungseo Park

Students
- Eric Becker (PhD)
- Vangelis Metsis (PhD)
- Yong Lin (PhD)
- Roman Arora (MS)
- Jyothi Vinjumur (MS)
- Kapil Vyas (Undergrad)
 

Former Project Members
- Yurong Xu
- Ganesh Palaniappan

 


Papers

Y. Lin, E. Becker, K. Park, Z. Le, and F. Makedon, “Decision making in assistive environments using multimodal observations,” Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece: ACM, 2009, pp. 1-8.

K. Park, E. Becker, J.K. Vinjumur, Z. Le, and F. Makedon, “Human behavioral detection and data cleaning in assisted living environment using wireless sensor networks,” Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece: ACM, 2009, pp. 1-8.

E. Becker, Z. Le, K. Park, Y. Lin, and F. Makedon, “Event-based experiments in an assistive environment using wireless sensor networks and voice recognition,” Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments, Corfu, Greece: ACM, 2009, pp. 1-8.

E. Becker, G. Guerra-Filho, and F. Makedon, “Automatic sensor placement in a 3D volume,” Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments, 2009, p. 36.

E. Becker, Y. Xu, S. Ledford, and F. Makedon, “A wireless sensor network architecture and its application in an assistive environment,” Proceedings of the 1st International Conference on PErvasive Technologies Related to Assistive Environments, Athens, Greece: ACM, 2008, pp. 1-7.

E. Becker, Y. Xu, H. Huang, and F. Makedon, “Requirements for implementation of localization into real-world assistive environments,” Proceedings of the 1st International Conference on PErvasive Technologies Related to Assistive Environments, Athens, Greece: ACM, 2008, pp. 1-8.

Y. Xu, J. Ford, E. Becker, and F.S. Makedon, “A BP-Neural Network Improvement to Hop-Counting for Localization in Wireless Sensor Networks,” Proceedings of the International Workshop on Applications with Artificial Intelligence (AAI), held in conjunction with the 19th IEEE International Conference on Tools with Artificial Intelligence (ICTAI), Patras, Greece, 2007.

 

 


Technology

The SunSPOT motes are the backbone of the current system. Each mote can run a multi-threaded Java program and can access five onboard sensor channels: three accelerometer axes, a thermistor, and a photocell. The units also have a 10-bit A/D converter and limited digital I/O (DIO) functionality.
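As a concrete illustration, a minimal on-mote polling loop is sketched below. It assumes the SunSPOT demo sensor-board API (EDemoBoard and its accessor interfaces); exact class and method names may differ between SDK releases.

    import com.sun.spot.sensorboard.EDemoBoard;
    import com.sun.spot.sensorboard.peripheral.IAccelerometer3D;
    import com.sun.spot.sensorboard.peripheral.ILightSensor;
    import com.sun.spot.sensorboard.peripheral.ITemperatureInput;

    // Minimal polling loop over the onboard sensor channels.
    // Assumes the SunSPOT demo sensor-board API; names may vary by SDK release.
    public class SensorPoller {
        public void run() throws Exception {
            EDemoBoard board = EDemoBoard.getInstance();
            IAccelerometer3D accel = board.getAccelerometer();
            ILightSensor light = board.getLightSensor();
            ITemperatureInput temp = board.getADCTemperature();
            while (true) {
                double ax = accel.getAccelX();   // acceleration in g
                double ay = accel.getAccelY();
                double az = accel.getAccelZ();
                int lux = light.getValue();      // raw 10-bit A/D reading
                double c = temp.getCelsius();    // thermistor reading
                System.out.println(ax + "," + ay + "," + az + "," + lux + "," + c);
                Thread.sleep(100);               // 10 Hz sampling, placeholder rate
            }
        }
    }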
The Moven suit is an accelerometer-based motion capture suit. One suggested goal is to have a subject wear the suit while triggering events in the apartment; the combination of the two would be worth investigating.
The Peoplebot is a complex human-interaction robot. Currently, this device is being used for research in the Active Service Robot portion of the project. The robot would respond to events in the apartment and use its onboard peripherals to confirm the status of the subject.
The Corobot is a small, Microsoft Windows-based robot. It is currently under review due to problems with its command libraries.
The Bioloid is a complex kit used for creating many kinds of robots, from wheeled robots to small humanoids. Several attempts have been made to deploy the Bioloid in the apartment; so far, a rolling car configuration with a wall detector has been tried out.
The MoteIV motes were the original platform for the apartment project. The company, however, has ceased production and retired the line. While waiting for a replacement product, the project switched to the SunSPOT motes.

Progress

Fall 2009

Goals for Fall 2009 include:

- Application of Hidden Markov Models
- Implementation of Hinge and Drawer events
- Improvement of the Path event
- Investigation into grammar methods


Summer 2009

Presented work at the PETRA 2009 conference.
 

Investigation of Hidden Markov Models:
The use of a single integer observation for the HMM and the use of an array of real numbers were both investigated for the apartment, using synthetic data based on the real-world data for the Bed, Bump, and Path events. Synthetic data was also created for the proposed Hinge and Drawer events. The idea is that the event models should be generated from a series of real-world experiments and then used for event detection with HMMs.
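As a point of reference, a minimal sketch of the discrete (single-integer observation) case is below: the forward algorithm scoring an observation sequence against a model. The two-state model in the example is an illustrative placeholder, not one of the project's trained event models.

    // Minimal discrete-observation HMM scorer (forward algorithm).
    // The model used in main() is an illustrative placeholder.
    public class DiscreteHmm {
        private final double[] pi;   // initial state distribution
        private final double[][] a;  // a[i][j] = P(state j | state i)
        private final double[][] b;  // b[i][k] = P(symbol k | state i)

        public DiscreteHmm(double[] pi, double[][] a, double[][] b) {
            this.pi = pi; this.a = a; this.b = b;
        }

        /** Log-likelihood of an integer observation sequence under this model. */
        public double logLikelihood(int[] obs) {
            int n = pi.length;
            double[] alpha = new double[n];
            double logScale = 0.0;
            for (int i = 0; i < n; i++) alpha[i] = pi[i] * b[i][obs[0]];
            for (int t = 1; t < obs.length; t++) {
                double[] next = new double[n];
                double sum = 0.0;
                for (int j = 0; j < n; j++) {
                    double s = 0.0;
                    for (int i = 0; i < n; i++) s += alpha[i] * a[i][j];
                    next[j] = s * b[j][obs[t]];
                    sum += next[j];
                }
                // rescale to avoid underflow on long sequences
                for (int j = 0; j < n; j++) next[j] /= sum;
                logScale += Math.log(sum);
                alpha = next;
            }
            double tail = 0.0;
            for (int i = 0; i < n; i++) tail += alpha[i];
            return logScale + Math.log(tail);
        }

        public static void main(String[] args) {
            double[] pi = {0.6, 0.4};
            double[][] a = {{0.7, 0.3}, {0.4, 0.6}};
            double[][] b = {{0.8, 0.2}, {0.3, 0.7}};
            DiscreteHmm model = new DiscreteHmm(pi, a, b);
            System.out.println(model.logLikelihood(new int[]{0, 1, 0, 0}));
        }
    }

Each candidate event type (Bed, Bump, Path, Hinge, Drawer) would get its own trained model, and an incoming sequence would be labeled with whichever model gives the highest log-likelihood.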

Accelerometer Events - Bump, Bed, Hinge, and Drawer events can be treated as a triplet of x, y, and z accelerometer readings. In the ideal case, a Bump event shows activity on all three axes; a Bed event on the y and z axes; a Hinge event on the x and z axes; and a Drawer event along a single axis.
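A simple way to realize this taxonomy is to threshold the change on each axis and match the resulting active-axis pattern; a minimal sketch follows, where the 0.5 g threshold and the returned labels are illustrative placeholders.

    // Classify an accelerometer event from which axes show significant change.
    // The axis-pattern table follows the ideal-case taxonomy above; the
    // 0.5 g threshold is an illustrative placeholder, not a calibrated value.
    public class AxisEventClassifier {
        private static final double THRESHOLD = 0.5; // change in g, placeholder

        public static String classify(double dx, double dy, double dz) {
            boolean x = Math.abs(dx) > THRESHOLD;
            boolean y = Math.abs(dy) > THRESHOLD;
            boolean z = Math.abs(dz) > THRESHOLD;
            if (x && y && z) return "Bump";   // all three axes
            if (!x && y && z) return "Bed";   // y and z only
            if (x && !y && z) return "Hinge"; // x and z only
            int active = (x ? 1 : 0) + (y ? 1 : 0) + (z ? 1 : 0);
            if (active == 1) return "Drawer"; // a single axis
            return "Unknown";
        }
    }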

Path events have been problematic, producing low readings on the photocell. Investigation showed that removing the tinted plastic cover from the SunSPOT motes greatly increases the effectiveness of the photocell.

Spring 2009

Event Detection - Motes currently broadcast only when a defined event has been detected. At that time, the event and its related data points are sent to the base station for analysis.
 

Data events sampled using the data collection tool
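To make the event-triggered broadcast concrete, here is a minimal on-mote sketch using the SunSPOT radiogram protocol; the port number, payload layout, and event codes are assumptions for illustration, not the project's actual wire format.

    import javax.microedition.io.Connector;
    import javax.microedition.io.Datagram;
    import com.sun.spot.io.j2me.radiogram.RadiogramConnection;

    // Send a detected event to the base station over the SunSPOT radiogram
    // protocol. Port 37, the payload layout, and the event codes are
    // illustrative assumptions.
    public class EventBroadcaster {
        private final RadiogramConnection conn;

        public EventBroadcaster() throws Exception {
            conn = (RadiogramConnection) Connector.open("radiogram://broadcast:37");
        }

        public void sendEvent(int eventCode, double[] samples) throws Exception {
            Datagram dg = conn.newDatagram(conn.getMaximumLength());
            dg.writeLong(System.currentTimeMillis()); // timestamp from mote clock
            dg.writeInt(eventCode);                   // e.g. 1 = Bump, 2 = Bed, ...
            dg.writeInt(samples.length);
            for (int i = 0; i < samples.length; i++) {
                dg.writeDouble(samples[i]);           // data points around the event
            }
            conn.send(dg);
        }
    }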

Data Cleaning - Incoming event data is first checked for outliers, an outlier being an implausibly large change that indicates a hardware error. Once this is done, the events are combined into a sequence in order to establish what has occurred.
 

Events mapped over a period of sixty minutes.
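A minimal version of the outlier check described above might discard samples whose jump from the previous accepted sample exceeds a physically plausible bound; the bound and the list-based interface below are illustrative.

    import java.util.ArrayList;
    import java.util.List;

    // Drop samples whose jump from the previous accepted sample is implausibly
    // large (a hardware glitch rather than real motion). MAX_JUMP is an
    // illustrative placeholder, not a calibrated value.
    public class OutlierFilter {
        private static final double MAX_JUMP = 4.0; // max plausible change in g

        public static List<Double> clean(List<Double> raw) {
            List<Double> out = new ArrayList<Double>();
            double last = Double.NaN;
            for (double v : raw) {
                if (!Double.isNaN(last) && Math.abs(v - last) > MAX_JUMP) {
                    continue; // outlier: skip, keep previous reference value
                }
                out.add(v);
                last = v;
            }
            return out;
        }
    }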


A 3D model of the apartment has been created and its volume calculated. The portion of the volume visible to a model of a sensor or camera is then used to vote for the optimum locations and angles.

 

A 3D model of the apartment in Google SketchUp
Preliminary results from the CamHunt application for camera placement
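A heavily simplified 2D sketch of the visibility-voting idea follows: discretize the floor area into cells, count how many cells fall inside a candidate sensor's field of view, and keep the best-scoring candidate. The real CamHunt approach works on a 3D volume with occlusion; everything here, including grid size, range, and field of view, is an illustrative reduction.

    // Simplified 2D version of visibility voting for sensor/camera placement.
    // Real placement works over a 3D volume with occlusion testing; this sketch
    // only scores candidates by how many grid cells fall in their field of view.
    public class PlacementVoter {
        static final int W = 40, H = 30;                   // grid cells (placeholder)
        static final double RANGE = 15.0;                  // sensor range in cells
        static final double HALF_FOV = Math.toRadians(30); // half field of view

        /** Number of cells a sensor at (sx, sy) facing 'heading' can see. */
        static int score(double sx, double sy, double heading) {
            int seen = 0;
            for (int x = 0; x < W; x++) {
                for (int y = 0; y < H; y++) {
                    double dx = x - sx, dy = y - sy;
                    if (Math.sqrt(dx * dx + dy * dy) > RANGE) continue;
                    double angle = Math.atan2(dy, dx) - heading;
                    // normalize angle to [-pi, pi]
                    while (angle > Math.PI) angle -= 2 * Math.PI;
                    while (angle < -Math.PI) angle += 2 * Math.PI;
                    if (Math.abs(angle) <= HALF_FOV) seen++;
                }
            }
            return seen;
        }

        public static void main(String[] args) {
            int best = -1; double bx = 0, by = 0, bh = 0;
            for (double x = 0; x < W; x += 5)
                for (double y = 0; y < H; y += 5)
                    for (double h = 0; h < 2 * Math.PI; h += Math.PI / 4) {
                        int s = score(x, y, h);
                        if (s > best) { best = s; bx = x; by = y; bh = h; }
                    }
            System.out.println("best: (" + bx + "," + by + ") heading " + bh
                    + " sees " + best + " cells");
        }
    }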


Fall 2008

Presented material at PETRA 2008.

Ongoing changes to the data.

After discussions with Dr. Park and a review of the available components, some new ideas about how to put this system together have arisen. The next step is an inventory of the available pieces, and of what can be done to run actual experiments in the apartment instead of speculation and discussion.
 

Combining Components into a System


With these various pieces in hand, one discussion with Dr. Park concerned the ability to find key signatures in a family of data. Given that each sensor has a known task, data mining can be applied once the data is collected and stored in the system, allowing both ordinary and exceptional activities to be identified.

An example follows:

We place four sensors in two rooms, the bedroom and the bathroom: a photocell in the bedroom, a photocell in the bathroom, an accelerometer attached to the bed, and an accelerometer attached to a chair in the bedroom.

Scenario:
A subject is lying on the bed and has to visit the bathroom. While the subject lies on the bed, the accelerometers are at a steady level, but when the subject gets off the mattress, the bed moves and this is detected by an accelerometer (Event A). Next, the subject turns on the light in the bedroom, and the photocell starts reporting a high voltage (Event B). The subject then walks to the bathroom and bumps into a chair en route; the force applied to the chair causes the accelerometer to react (Event C). Next, the subject enters the bathroom and turns on the bathroom light; the photocell in the bathroom suddenly goes high (Event D). Afterward, the subject leaves the bathroom, switching off the light (Event E). This time, the subject does not collide with the chair, but gets to the light switch, turns off the light, and then sits on the bed.

 


After the data is collected and the events identified, the record is reduced from a graph of voltages and collected data to a sequence of key events:

[Bed-Shake]->[Bedroom-Lights On]->[Chair-Shake]->[Bathroom-Lights On]->[Bathroom-Lights Off]->[Bedroom-Lights Off]->[Bed-Shake]

Generating the events from the scenario requires three things: the location of the sensor, the type of the sensor, and how the data is extracted from the readings. Most noticeably, it is the sharp change in the readings that makes a clearly marked event.
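As a sketch of that observation, the fragment below flags an event whenever a reading jumps sharply relative to the last stable level, emitting labeled on/off events; the labels and the threshold are illustrative placeholders.

    import java.util.ArrayList;
    import java.util.List;

    // Turn a raw reading stream into labeled key events by flagging sharp
    // changes against the last stable level. Labels and the threshold are
    // illustrative placeholders, not the project's actual configuration.
    public class SharpChangeDetector {
        private final String label;     // e.g. "Bedroom-Lights"
        private final double threshold; // minimum jump that counts as an event
        private double baseline;
        private boolean primed = false;
        private boolean high = false;   // current on/off state

        public SharpChangeDetector(String label, double threshold) {
            this.label = label;
            this.threshold = threshold;
        }

        /** Returns "[label-On]" or "[label-Off]" on a sharp change, else null. */
        public String feed(double reading) {
            if (!primed) { baseline = reading; primed = true; return null; }
            double jump = reading - baseline;
            if (!high && jump > threshold) {
                high = true; baseline = reading;
                return "[" + label + "-On]";
            }
            if (high && jump < -threshold) {
                high = false; baseline = reading;
                return "[" + label + "-Off]";
            }
            return null;
        }

        public static void main(String[] args) {
            SharpChangeDetector bedroomLight =
                    new SharpChangeDetector("Bedroom-Lights", 200);
            double[] photocell = {40, 42, 41, 700, 705, 702, 60, 41};
            List<String> sequence = new ArrayList<String>();
            for (double v : photocell) {
                String ev = bedroomLight.feed(v);
                if (ev != null) sequence.add(ev);
            }
            System.out.println(sequence); // [[Bedroom-Lights-On], [Bedroom-Lights-Off]]
        }
    }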

To build the signatures, additional timestamping of the data collection is needed. Eric is going to work on adding timestamping to each mote, and then meet again with Dr. Park about constructing a database.

The SunSPOT motes expose their millisecond clock through a Date function; a long integer timestamp can be broadcast with the data as an additional field.

So, a sample tuple could be:

"Mote ID,Time Collected, Time Sent, Time Received, Sensor Type, Data Value"

Or possibly two tables...

(MoteID, Time Received, Time Sent) -1-<>-N- (Time Collected, Data Reading)

Need to get the data into a file first, and then see about building an exporter from Java into MySQL.
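A minimal sketch of such an exporter, following the two-table layout suggested above over JDBC, is below; the table names, columns, and connection details are assumptions for illustration, not the project's actual schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Minimal JDBC exporter for the two-table layout sketched above.
    // Table names, columns, and connection details are illustrative assumptions.
    public class MoteDataExporter {
        public static void main(String[] args) throws Exception {
            Connection db = DriverManager.getConnection(
                    "jdbc:mysql://localhost/apartment", "user", "password");

            db.createStatement().execute(
                    "CREATE TABLE IF NOT EXISTS packet (" +
                    "  packet_id INT AUTO_INCREMENT PRIMARY KEY," +
                    "  mote_id INT, time_received BIGINT, time_sent BIGINT)");
            db.createStatement().execute(
                    "CREATE TABLE IF NOT EXISTS reading (" +
                    "  packet_id INT, time_collected BIGINT," +
                    "  sensor_type VARCHAR(16), data_value DOUBLE)");

            // Example insert of one packet with one reading (placeholder values).
            PreparedStatement p = db.prepareStatement(
                    "INSERT INTO packet (mote_id, time_received, time_sent) " +
                    "VALUES (?, ?, ?)", Statement.RETURN_GENERATED_KEYS);
            p.setInt(1, 3);
            p.setLong(2, System.currentTimeMillis());
            p.setLong(3, System.currentTimeMillis() - 15);
            p.executeUpdate();
            ResultSet keys = p.getGeneratedKeys();
            keys.next();
            int packetId = keys.getInt(1);

            PreparedStatement r = db.prepareStatement(
                    "INSERT INTO reading (packet_id, time_collected, sensor_type, data_value) " +
                    "VALUES (?, ?, ?, ?)");
            r.setInt(1, packetId);
            r.setLong(2, System.currentTimeMillis() - 20);
            r.setString(3, "photocell");
            r.setDouble(4, 412.0);
            r.executeUpdate();
            db.close();
        }
    }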

Summer 2008

Ongoing discussions of what to do next with the technology: whether to identify events, or how to combine older research.

Stage 1: GDL
The GDL routine runs on the sensors, and their locations are recorded. Half of GDL's routines run on the motes, collecting hop counts and hop coordinates and tracking the neighborhoods. The base station uses this information to generate a distance matrix for localization.

GDL doesn't look to be a go at this time. Need to move to another area.


Stage 2: Data Acquisition
All the available sensors will be active. The time, location, and value of each reading will have to be stored.
 

Stage 3: Data Storage
Data has to be stored for future use, perhaps in a database, but at the very least it must be saved for each test.

Stage 4: Middleware
Once the data is gathered, it has to be interpreted. For example, a sensor tripped by itself does not provide much information; a series or combination of sensors over time gives more information about what is going on.
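One minimal way to implement that interpretation is to check whether a known activity pattern occurs, in order, within the recent event stream; the sketch below does this with an ordered-subsequence match, and the pattern and event names are illustrative.

    import java.util.Arrays;
    import java.util.List;

    // Interpret combinations of sensor events by checking whether a known
    // activity pattern occurs as an ordered (not necessarily contiguous)
    // subsequence of recent events. Pattern and event names are illustrative.
    public class MiddlewareMatcher {
        static boolean matches(List<String> events, List<String> pattern) {
            int i = 0;
            for (String e : events) {
                if (i < pattern.size() && e.equals(pattern.get(i))) i++;
            }
            return i == pattern.size();
        }

        public static void main(String[] args) {
            List<String> recent = Arrays.asList(
                    "Bed-Shake", "Bedroom-Lights On", "Chair-Shake",
                    "Bathroom-Lights On", "Bathroom-Lights Off",
                    "Bedroom-Lights Off", "Bed-Shake");
            List<String> bathroomVisit = Arrays.asList(
                    "Bed-Shake", "Bathroom-Lights On",
                    "Bathroom-Lights Off", "Bed-Shake");
            System.out.println("bathroom visit detected: "
                    + matches(recent, bathroomVisit));
        }
    }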

Stage 5: Visualization
Since no cameras are available for this project, what are the alternatives?

Sensors->Data->Middleware->Script

The script then feeds into the visualization tool.

Changing to SunSPOTs
Current Status:
The MoteIV system is currently powered down and put away. The MoteIVs work for a demonstration, but are not capable of receiving instructions from the base.

New Material:
A small SunSPOT program, with both base and mote components, is working for all the sensors on the mote.
The next task is threefold.

The first is to have something ready that can be shown to guests who visit the lab: something visually interesting that responds in real time.

The second is to have something that actually performs the data capture required for research work; not nearly so pretty, but it will dump the data into files or a database where it can be post-processed.

The third is to have the system open enough to allow other pieces to be added to the data capture. Security, efficient power, localization, and tracking are all additional capabilities that need to work within the system.


Spring 2008

Spring 2008 saw the application of SunSPOT wireless motes to the wireless sensor network. A key effort at this time was enhancing the current GDL algorithm to work beyond one dimension.

The real-world positions of the motes for the figure on the right
The results from enhanced geographical distance localization (EGDL) using hop counting across the WSN


Fall 2007

The MoteIV units were used for a simple test of target localization, using the photocells to relate the behavior of a person in an environment to the digital data. The first experiments were done with simple targets and simple areas, and were expanded up to a simulated apartment.

The first tests marked out a working area and placed a target within the zone to see how the system would react. The idea was based on the notion that a deployable wireless sensor could be used to check the occupancy of a parking space. The setup consisted of three chambers with three sensors each. As each target (in the figure on the left, a Styrofoam cup bearing the face of Jack Kilby) interacted with the sensors, the signal traveled from the sensor to the base station, from the base station to the Ethernet via a gateway, and from the Ethernet to a webpage.
 

 

The second experiment consisted of placing twenty sensors in an apartment setting and then having the subject move through the room, sitting on furniture and getting in and out of beds. The result of this activity showed how sensors should be placed to get a reaction from the motion. The monitoring tool on the other side consisted of a grid of twenty indicators that chimed on detection of either a shadow (blue) or light (red) whenever a person moved past a sensor.

 

The third experiment consisted of putting sensors in various rooms: a bedroom, a kitchen, and a hallway. As a person moved from one room to another, the various sensors in each room would be triggered. In the graph above, the sensors in the three different rooms would go low, forming the three columns in the graph. Green indicates the bedroom, red the hallway along the black strip in the map on the right, and blue the sensors in the kitchen area.