Guo has had success transforming large amounts of data using spectral transformation techniques. These techniques rely on manifold harmonics to first convert 3-D images into points representing the surface of an object; the data is then compressed into a smaller form that can be transmitted faster over networks.
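The article does not give implementation details, but the general idea behind spectral compression can be sketched as follows: project the surface's vertex coordinates onto the lowest-frequency eigenvectors of a graph Laplacian (the "manifold harmonics") and transmit only those few coefficients. This is a minimal illustration using a simple ring graph in place of a real mesh; actual manifold-harmonics methods use the cotangent Laplacian of the 3-D mesh, and all names here are hypothetical.

```python
import numpy as np

def ring_laplacian(n):
    # Combinatorial graph Laplacian of a closed ring of n vertices
    # (a stand-in for a real mesh Laplacian).
    L = 2 * np.eye(n)
    for i in range(n):
        L[i, (i - 1) % n] = -1
        L[i, (i + 1) % n] = -1
    return L

def compress(coords, k):
    # Project vertex coordinates onto the k lowest-frequency
    # Laplacian eigenvectors -- the "manifold harmonics".
    n = coords.shape[0]
    _, vecs = np.linalg.eigh(ring_laplacian(n))  # eigenvalues ascending
    basis = vecs[:, :k]                          # n x k, low frequencies first
    return basis, basis.T @ coords               # k x 3 spectral coefficients

def decompress(basis, coeffs):
    # Reconstruct an approximate surface from the retained coefficients.
    return basis @ coeffs

# Example surface: a noisy circle embedded in 3-D (100 vertices).
np.random.seed(0)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
verts = np.stack([np.cos(t), np.sin(t), 0.01 * np.random.randn(100)], axis=1)

# 100 x 3 coordinates shrink to 10 x 3 spectral coefficients.
basis, coeffs = compress(verts, k=10)
approx = decompress(basis, coeffs)
print(coeffs.shape)  # (10, 3)
```

Only the low-frequency coefficients cross the network; the receiver, which can precompute the same basis, reconstructs a smooth approximation of the surface, which is why this kind of transform tolerates aggressive compression.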
People using this platform would wear body sensors similar to those built into smartphones, which detect whether the device is being held in portrait or landscape orientation.
"If we put body sensors on the patients, then his or her movements can be tracked with high accuracy," Prabhakaran said. "The advantage of the sensor is the data that is generated is only a few bytes large, so it is easily transmitted over the network.
"You need a 3-D model to provide visual perspective, but if you are dealing with a lousy network and can not have consistent visual perspective, the body sensors could provide that information."
Dr. Roozbeh Jafari, assistant professor of electrical engineering at UT Dallas and a co-principal investigator of the project, is an expert in cyber-physical systems. He has built wearable computers for monitoring different aspects of human health, behavior and thought, and is developing sensors for this project.
Researchers at the University of California, Berkeley and the University of Illinois at Urbana-Champaign are working on other aspects of the system, such as refining the overall user experience and coordination of the cameras used to visually capture the movements and interactions. Rehabilitation specialists at the Dallas VA Medical Center will test the system on patients.
While the main goal of the research, which is about halfway complete, is telemedicine, other applications include dance instruction or any type of education in which people need to be in the same space, Prabhakaran said.
Contact: LaKisha Ladson
University of Texas at Dallas