Abstract:
A vocally activated control system for controlling an apparatus in a surgical setting, the system comprising: a. a voice sensor configured to detect vocal commands generated by a surgeon during surgery; b. a signal transmitter connected to the voice sensor, the transmitter configured to convert a vocal command into a transmittable signal and to transmit it; c. a processor connected to the signal transmitter and configured to receive the transmitted vocal signal, the processor configured to convert the vocal signal into a predetermined set of operative instructions associated with the apparatus, the predetermined set of operative instructions comprising at least one instruction; and d. control means connected to the processor and to the apparatus, the control means configured to receive the predetermined set of operative instructions and to cause the apparatus to operate accordingly. Said voice sensor and said transmitter are integrated within a wearable element.
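The conversion in element c — a recognized vocal command mapped to a predetermined set of operative instructions — can be sketched as a simple lookup. This is a minimal illustration only; the command names and instruction tuples below are assumptions for demonstration and do not come from the patent.

```python
# Hypothetical table mapping recognized vocal commands to predetermined
# operative instructions (device, operation, argument). Illustrative only.
COMMAND_TABLE = {
    "zoom in":   ("camera", "set_zoom", +1),
    "zoom out":  ("camera", "set_zoom", -1),
    "lights up": ("lamp", "set_intensity", +10),
}

def vocal_command_to_instructions(command: str):
    """Convert a recognized vocal command into a predetermined set of
    operative instructions (comprising at least one instruction), or
    None if the command is not recognized."""
    instruction = COMMAND_TABLE.get(command.strip().lower())
    return [instruction] if instruction is not None else None
```

The returned list would then be passed to the control means, which executes each instruction against the apparatus.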
Abstract:
The present invention discloses a surgical maneuvering system, in which movement of a moving element is used to control a surgical device such as a surgical tool or an endoscope. The moving element can be a tool or a portion of an operator's body. The relationship between the moving element and the surgical device can be any of: motion of a moving element controlling motion of a surgical device, motion of a moving element controlling an action of a surgical device, a command of a moving element controlling motion of a surgical device, or a command of a moving element controlling an action of a surgical device. Commands are typically arbitrary movements, such as shaking a tool. Actions are changes in a surgical device that do not change its overall position, such as closing a grasper.
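The motion/command and motion/action relationships above can be sketched as a small dispatcher. The event shapes, gesture name, and response names below are hypothetical, chosen only to illustrate the mapping.

```python
# Illustrative sketch of the moving-element/surgical-device relationships
# described above. All names are assumptions for demonstration.

def interpret(moving_element_event):
    """Map a moving-element event to a surgical-device response.

    moving_element_event = (kind, payload):
      ("motion", displacement) -> the element's motion drives the device
      ("command", gesture)     -> an arbitrary movement, e.g. "shake"
    """
    kind, payload = moving_element_event
    if kind == "motion":
        # Motion of the moving element controls motion of the device.
        return ("move_device", payload)
    if kind == "command" and payload == "shake":
        # A command (arbitrary movement) controls an action that does not
        # change the device's overall position, e.g. closing a grasper.
        return ("close_grasper", None)
    return ("no_op", None)
```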
Abstract:
An intelligent surgical tool control system, comprising a tool management system; an indicating means to indicate at least one surgical event; a communicable database for storing, for each item of interest, its identity, its present 3D position and at least one previous 3D position; and at least one processor to identify, from a surgical event, an output surgical procedure. The tool management system can comprise a maneuvering mechanism to maneuver a surgical tool in at least two dimensions; and a controller to control at least one of activation and deactivation of a surgical tool and articulation of a surgical tool. The indicating means can indicate a surgical event selected from movement of a moving element and presence of an item of interest, where movement is determinable if the current 3D position of the moving element is substantially different from a previous 3D position of the same element.
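The movement criterion — current 3D position "substantially different" from a previous one — amounts to a distance test against some tolerance. A minimal sketch, assuming a Euclidean metric and an arbitrary threshold (the abstract does not specify either):

```python
import math

def has_moved(current, previous, threshold=1.0):
    """Movement is determinable if the current 3D position differs
    substantially (here: Euclidean distance greater than `threshold`,
    an assumed tolerance) from a previous 3D position of the same
    element. Positions are (x, y, z) tuples."""
    return math.dist(current, previous) > threshold
```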
Abstract:
A surgical controlling system, comprising: at least one surgical tool configured to be inserted into a surgical environment of a human body; at least one location estimating means configured for real-time localization of the 3D spatial position of said at least one surgical tool at any given time t; at least one movement detection means communicable with a movement database and with said location estimating means; a controller having a processing means communicable with a controller database, said controller database being in communication with said movement detection means; and at least one display configured to provide, in real time, an image of at least a portion of said surgical environment; wherein said controller is configured to provide instructions directing said surgical tool to a specified location; further wherein said location is updated in real time on said display as said at least one surgical tool is moved.
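The closing clauses describe a loop: the controller steps the tool toward a target location while the displayed position tracks it in real time. A minimal sketch under assumed names (fixed step size, position as an (x, y, z) tuple; none of these specifics appear in the abstract):

```python
def step_toward(position, target, step=1.0):
    """Advance each axis by at most `step` toward `target`, clamped so
    the tool never overshoots."""
    return tuple(
        p + max(-step, min(step, t - p)) for p, t in zip(position, target)
    )

def direct_tool(position, target, max_steps=100):
    """Iterate steps until the tool reaches the target. Each intermediate
    position in the returned trail stands in for a real-time update of
    the tool's location on the display."""
    trail = [position]
    for _ in range(max_steps):
        if position == target:
            break
        position = step_toward(position, target)
        trail.append(position)
    return trail
```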
Abstract:
A surgical controlling system that includes: a surgical tool insertable into a surgical environment of a human body for a surgical procedure; logic configured to locate, in real time, the 3D spatial position of the at least one surgical tool at any given time t; at least one movement detector; and a controller in communication with a controller database.
Abstract:
The present invention provides a structured-light based system for providing a 3D image of at least one object within a field of view within a body cavity, comprising: a. an endoscope; b. at least one camera located in the endoscope's proximal end, configured to provide, in real time, at least one 2D image of at least a portion of said field of view by means of at least one lens; c. a light source configured to illuminate, in real time, at least a portion of said at least one object within at least a portion of said field of view with at least one time- and space-varying predetermined light pattern; d. a sensor configured to detect light reflected from said field of view; and e. a computer program which, when executed by data processing apparatus, is configured to generate a 3D image of said field of view.
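The underlying structured-light principle: a known projected pattern viewed from a sensor offset by a baseline shifts with surface depth, so each feature's disparity yields depth by triangulation. A toy illustration of that geometry, using a pinhole model with assumed baseline and focal-length values (the abstract specifies neither):

```python
def depth_from_disparity(disparity_px, baseline_mm=5.0, focal_px=500.0):
    """Pinhole triangulation: depth = baseline * focal_length / disparity.
    `baseline_mm` and `focal_px` are illustrative assumptions, not
    parameters taken from the patent."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px
```

Applying this per detected pattern feature, and varying the pattern over time and space to disambiguate correspondences, yields the 3D image the computer program of element e generates.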
Abstract:
The present invention provides a system for altering the field of view of an endoscope image, comprising: at least one endoscope having a wide-angle lens in said endoscope's distal end; at least one camera located in said endoscope's proximal end, adapted to image the field of view of said endoscope by means of said wide-angle lens; and a computer program which, when executed by data processing apparatus, is configured to select at least a portion of said field of view; wherein said portion of said field of view is selectable without physically maneuvering said endoscope or said wide-angle lens, such that a virtual maneuvering of said field of view is provided.
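The "virtual maneuvering" described above amounts to selecting a sub-window of the full wide-angle frame rather than physically moving the endoscope: panning and zooming become choices of window position and size. A minimal sketch, with a nested list standing in for a camera frame (the representation is an assumption):

```python
def select_view(frame, top, left, height, width):
    """Return the selected portion of the field of view as a new image.
    Moving the window pans the view; shrinking it and scaling up zooms,
    all without moving the endoscope or its lens."""
    if (top < 0 or left < 0
            or top + height > len(frame)
            or left + width > len(frame[0])):
        raise ValueError("requested window lies outside the field of view")
    return [row[left:left + width] for row in frame[top:top + height]]
```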