Research
MEDICAL ROBOTICS LABORATORY
UNIVERSITY of HOUSTON
We are developing a novel cyber-physical system for multimodal image-guided robot-assisted surgery. We anticipate that the intellectual merit and far-reaching impact of this work will be a computational methodology that enables a leap in minimally invasive procedures (a) from “keyhole” visualization (i.e., laparoscopy) to in situ real-time imaging guidance, and (b) toward the use of emerging steerable or shape-conformable tools.
The aim of this project is to develop and investigate an approach to multimodality imaging that bridges biosensing at the molecular, near-cellular, and macroscopic levels. Our hypothesis is that combining such multi-level sensing of complementary information may allow us to assess pathophysiology in situ in a comprehensive manner.