Systems and user interfaces for dynamic interaction with two- and three-dimensional medical image data using hand gestures
First Claim
1. A medical image computing system comprising:
an electronic display;
one or more sensors configured to detect a hand of a user;
a storage device configured to store electronic software instructions;
one or more data stores storing at least one set of volumetric medical image data; and
one or more computer processors in communication with the electronic display, the one or more sensors, the storage device, and the one or more data stores, the one or more computer processors configured to execute the stored software instructions to cause the computing system to:
access, from the one or more data stores, a set of volumetric medical image data;
render a three-dimensional view of the set of volumetric medical image data;
generate user interface data for rendering an interactive user interface on the electronic display, the interactive user interface including at least the rendered three-dimensional view of the set of volumetric medical image data;
determine a virtual origin location in physical space about which a user may move their hand;
receive sensor data from the one or more sensors, the sensor data indicative of a user input provided via the hand of the user, the user input comprising at least a position of the hand;
determine, based on the sensor data, the position of the hand with respect to the virtual origin location;
calculate, based on the position of the hand with respect to the virtual origin location, a rotation of the set of volumetric medical image data;
render, based on the rotation, an updated three-dimensional view of the set of volumetric medical image data; and
update the user interface data to include the updated three-dimensional view of the set of volumetric medical image data.
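The rotation limitation above can be sketched in code. The following is a minimal illustration, assuming a trackball-style mapping in which hand displacement from the virtual origin along the sensor's x and y axes yaws and pitches the volume; the claim itself does not fix any particular formula, sensor, or renderer, so the mapping and sensitivity constant here are hypothetical.

```python
import numpy as np

def rotation_from_hand_position(hand_pos, origin, sensitivity=1.0):
    """Map a hand position, measured relative to a virtual origin in
    physical space, to a 3x3 rotation matrix for the volume view.

    Displacement along x yaws the volume; displacement along y pitches
    it (an assumed trackball-style mapping, not taken from the claim).
    """
    dx, dy, _ = np.asarray(hand_pos, float) - np.asarray(origin, float)
    yaw = sensitivity * dx      # radians per metre of hand travel
    pitch = sensitivity * dy

    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    r_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    r_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    return r_pitch @ r_yaw      # applied when re-rendering the 3D view

# A hand 10 cm to the right of the virtual origin yaws the volume 0.1 rad.
R = rotation_from_hand_position((0.10, 0.0, 0.0), (0.0, 0.0, 0.0))
```

The resulting matrix would be applied to the volume before re-rendering the three-dimensional view, per the final two limitations of the claim.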
Abstract
Embodiments of the present disclosure relate to systems and techniques for accessing data stores of medical images and displaying the medical images in substantially real-time to provide information in an interactive user interface. Systems are disclosed that may advantageously provide highly efficient, intuitive, and rapid dynamic interaction with two- and three-dimensional medical image data using hand gestures. The systems may include interactive user interfaces that are dynamically updated to provide tracking of a user's hand in a virtual 3D space for interaction with two- and/or three-dimensional image data. A user may use the systems described herein to interact with image data, including two-dimensional images, three-dimensional image data, and/or series of image data, more quickly, thoroughly, and efficiently than with previous systems.
20 Claims
1. A medical image computing system comprising the elements recited in the First Claim, above. - View Dependent Claims (2, 3, 4, 5, 6, 7, 8, 9)
10. A computer-implemented method comprising:
by one or more computer processors executing software instructions:
accessing, from one or more data stores, a set of volumetric medical image data;
rendering a three-dimensional view of the set of volumetric medical image data;
generating user interface data for rendering an interactive user interface on an electronic display, the interactive user interface including at least the rendered three-dimensional view of the set of volumetric medical image data;
determining a virtual origin location in physical space about which a user may move their hand;
receiving sensor data from one or more sensors configured to detect the hand of the user, the sensor data indicative of a user input provided via the hand of the user, the user input comprising at least a position of the hand;
determining, based on the sensor data, the position of the hand with respect to the virtual origin location;
calculating, based on the position of the hand with respect to the virtual origin location, a rotation of the set of volumetric medical image data;
rendering, based on the rotation, an updated three-dimensional view of the set of volumetric medical image data; and
updating the user interface data to include the updated three-dimensional view of the set of volumetric medical image data.
- View Dependent Claims (11, 12, 13, 14, 15)
16. A non-transitory computer-readable storage medium having software instructions embodied therewith, the software instructions executable by one or more processors to cause the one or more processors to:
access, from one or more data stores, a set of volumetric medical image data;
render a three-dimensional view of the set of volumetric medical image data;
generate user interface data for rendering an interactive user interface on an electronic display, the interactive user interface including at least the rendered three-dimensional view of the set of volumetric medical image data;
determine a virtual origin location in physical space about which a user may move their hand;
receive sensor data from one or more sensors configured to detect the hand of the user, the sensor data indicative of a user input provided via the hand of the user, the user input comprising at least a position of the hand;
determine, based on the sensor data, the position of the hand with respect to the virtual origin location;
calculate, based on the position of the hand with respect to the virtual origin location, a rotation of the set of volumetric medical image data;
render, based on the rotation, an updated three-dimensional view of the set of volumetric medical image data; and
update the user interface data to include the updated three-dimensional view of the set of volumetric medical image data.
- View Dependent Claims (17, 18, 19, 20)
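The flow recited in the method and storage-medium claims (determine a virtual origin, receive per-frame hand positions, derive a rotation from the hand's position relative to that origin) can be sketched as a small controller. This is an illustrative sketch under stated assumptions: capturing the origin from the first hand sample of a gesture is one plausible way to "determine a virtual origin location," and the sensitivity constant is hypothetical; neither is dictated by the claims.

```python
class GestureRotationController:
    """Determine a virtual origin, then map each subsequent hand sample
    to a view rotation expressed as (yaw, pitch) in radians."""

    def __init__(self, radians_per_metre=10.0):
        self.origin = None              # virtual origin in physical space
        self.k = radians_per_metre      # assumed sensitivity constant

    def on_frame(self, hand_pos):
        """Process one sensor sample (x, y, z in metres)."""
        if self.origin is None:
            self.origin = hand_pos      # first sample fixes the origin
        dx = hand_pos[0] - self.origin[0]   # position relative to origin
        dy = hand_pos[1] - self.origin[1]
        return self.k * dx, self.k * dy     # drives the re-rendered view

ctrl = GestureRotationController()
ctrl.on_frame((0.0, 0.0, 0.5))                  # establishes the origin
yaw, pitch = ctrl.on_frame((0.05, -0.02, 0.5))  # hand 5 cm right, 2 cm down
```

In a full system the returned angles would feed the volume renderer each frame, producing the "updated three-dimensional view" the claims recite.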
Specification