Automated steering systems and methods for a robotic endoscope
Abstract
Systems and methods for automated steering control of a robotic endoscope, e.g., a colonoscope, are provided. The control system may comprise: a) a first image sensor configured to capture a first input data stream comprising a series of two or more images of a lumen; and b) one or more processors that are individually or collectively configured to generate a steering control output signal based on an analysis of data derived from the first input data stream using a machine learning architecture, wherein the steering control output signal adapts to changes in the data of the first input data stream in real time.
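Read as a control loop, the abstract describes per-frame processing: each incoming image is analysed by a machine learning model and an updated steering control signal is emitted before the next frame arrives. The sketch below illustrates that loop under stated assumptions; `image_stream`, `model`, and `send_control_signal` are hypothetical placeholders for the image sensor interface, the trained model, and the actuator interface, none of which are specified here.

```python
import numpy as np

def steering_control_loop(image_stream, model, send_control_signal):
    """Minimal real-time loop: analyse each new frame and emit an updated
    steering control signal immediately, so the output adapts to changes in
    the input data stream frame by frame.

    `image_stream`, `model`, and `send_control_signal` are hypothetical
    placeholders for the image sensor interface, the trained machine
    learning model, and the steering actuator interface.
    """
    for frame in image_stream:                    # series of images of the lumen
        features = np.asarray(frame, dtype=np.float32) / 255.0  # normalise pixels
        steering = model.predict(features)        # e.g. a (pitch, yaw) command
        send_control_signal(steering)             # refreshed on every frame
```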
Claims (42)
1. A control system for providing an adaptive steering control output signal for steering a robotic endoscope, the control system comprising:

a) a first image sensor configured to capture a first input data stream comprising a series of two or more images of a lumen; and

b) one or more processors that are individually or collectively configured to:

i) perform automated image processing and feature extraction on one or more of the images using a combination of at least two different image processing algorithms simultaneously to generate an input data set comprising at least two different estimates of a location of a lumen center;

ii) output a steering direction based on an analysis of the input data set using an artificial neural network that has been trained using a training data set, wherein the training data set comprises recorded image data from previous endoscopic procedures and empirical steering control instructions provided to a virtual steering mechanism by a surgeon or other skilled operator during simulated endoscopic procedures, wherein the steering direction adapts to changes in the input data set in real time, and wherein the steering direction need not be aligned with the estimated location(s) of the lumen center; and

iii) generate the adaptive steering control output signal in real time.

Dependent claims: 2-16.
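Claim 1 does not name the two image processing algorithms. The sketch below uses two hypothetical estimators (a dark-region centroid and an intensity-weighted center of mass) to show how two simultaneous estimates of the lumen center could form the input data set for a trained network, whose output steering direction need not coincide with either estimate; `ann` is assumed to be any trained model exposing a `predict` method.

```python
import numpy as np

def lumen_center_estimates(gray: np.ndarray) -> np.ndarray:
    """Run two different estimators on the same single-channel frame and
    return both (x, y) estimates, normalised to [0, 1], as one input data set.

    The dark-region centroid and the intensity-weighted center of mass are
    hypothetical choices; the claim only requires two different algorithms.
    """
    h, w = gray.shape
    ys, xs = np.indices(gray.shape)

    # Estimate 1: centroid of the darkest 5% of pixels (the lumen typically
    # appears as the darkest region of an endoscopic frame).
    dark = gray <= np.percentile(gray, 5)
    e1 = np.array([xs[dark].mean(), ys[dark].mean()])

    # Estimate 2: center of mass of the inverted-intensity image.
    wgt = gray.max() - gray
    e2 = np.array([(xs * wgt).sum(), (ys * wgt).sum()]) / wgt.sum()

    return np.concatenate([e1 / (w, h), e2 / (w, h)])

def steering_direction(gray: np.ndarray, ann) -> np.ndarray:
    """Feed both estimates to a trained network; its output is the steering
    direction, which need not be aligned with either estimate."""
    return ann.predict(lumen_center_estimates(gray)[None, :])[0]
```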
17. A method for providing an adaptive steering control output signal for steering a robotic endoscope, the method comprising:

a) providing a first input data stream comprising a series of two or more images of a lumen;

b) performing automated image processing and feature extraction on one or more of the images using a combination of at least two different image processing algorithms simultaneously to generate an input data set comprising at least two different estimates of a location of a lumen center;

c) outputting a steering direction based on an analysis of the input data set using an artificial neural network that has been trained using a training data set, wherein the training data set comprises recorded image data from previous endoscopic procedures and empirical steering control instructions provided to a virtual steering mechanism by a surgeon or other skilled operator during simulated endoscopic procedures, wherein the steering direction adapts to changes in the input data set in real time, and wherein the steering direction need not be aligned with the estimated location(s) of the lumen center; and

d) generating the adaptive steering control output signal in real time.

Dependent claims: 18-32.
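A minimal training sketch for step c), assuming the recorded image data has already been reduced to per-frame feature vectors and paired with the operator's commands to a virtual steering mechanism. The file names and the scikit-learn `MLPRegressor` standing in for the artificial neural network are illustrative assumptions, not the method's actual training pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training arrays: per-frame feature vectors derived from recorded
# image data of previous procedures (two lumen-center estimates per frame),
# paired with the steering commands an operator gave to a virtual steering
# mechanism during simulated procedures. File names are illustrative only.
recorded_features = np.load("recorded_lumen_estimates.npy")    # shape (N, 4)
operator_commands = np.load("operator_steering_commands.npy")  # shape (N, 2)

# A small feed-forward network standing in for the artificial neural network;
# it learns a mapping from image-derived features to a steering direction.
ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
ann.fit(recorded_features, operator_commands)

# At run time, the trained model outputs a steering direction for new frames.
live_features = np.load("live_lumen_estimates.npy")            # shape (M, 4)
steering_directions = ann.predict(live_features)               # shape (M, 2)
```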
33. A robotic endoscope, comprising:

a) an elongated body structure comprising one or more actuation units and a steerable distal portion; and

b) a control system comprising:

i) a first image sensor configured to capture a first input data stream comprising a series of two or more images of a lumen; and

ii) one or more processors that are individually or collectively configured to:

i) perform automated image processing and feature extraction on one or more of the images using a combination of at least two different image processing algorithms simultaneously to generate an input data set comprising at least two different estimates of a location of a lumen center;

ii) output a steering direction based on an analysis of the input data set using an artificial neural network that has been trained using a training data set, wherein the training data set comprises recorded image data from previous endoscopic procedures and empirical steering control instructions provided to a virtual steering mechanism by a surgeon or other skilled operator during simulated endoscopic procedures, wherein the steering direction adapts to changes in the input data set in real time, and wherein the steering direction need not be aligned with the estimated location(s) of the lumen center; and

iii) generate an adaptive steering control output signal to control the one or more actuation units in real time.

Dependent claims: 34-42.
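Claim 33 leaves the actuation layout unspecified. The sketch below assumes, purely for illustration, four antagonistic tendon-type actuation units and a linear mapping from a two-dimensional steering direction to per-tendon commands; the claim only requires that the adaptive steering control output signal drive one or more actuation units in real time.

```python
import numpy as np

def actuation_commands(steering: np.ndarray, gain: float = 1.0) -> dict:
    """Map a two-dimensional steering direction (yaw, pitch) onto four
    antagonistic tendon-type actuation units that bend the steerable distal
    portion. The four-tendon layout and the linear mapping are assumptions
    made only for illustration.
    """
    yaw, pitch = float(steering[0]), float(steering[1])
    return {
        "tendon_left":  max(0.0, -gain * yaw),    # pull left tendon for negative yaw
        "tendon_right": max(0.0,  gain * yaw),    # pull right tendon for positive yaw
        "tendon_up":    max(0.0,  gain * pitch),  # pull upper tendon for positive pitch
        "tendon_down":  max(0.0, -gain * pitch),  # pull lower tendon for negative pitch
    }
```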
Specification