US20200297357A1 - System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices - Google Patents
- Publication number
- US20200297357A1 (Application US16/695,310)
- Authority
- US
- United States
- Prior art keywords
- drill
- depth stop
- depth
- effector
- guide
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/14—Fixators for body parts, e.g. skull clamps; Constructional details of fixators, e.g. pins
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/1613—Component parts
- A61B17/162—Chucks or tool parts which are to be held in a chuck
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/17—Guides or aligning means for drills, mills, pins or wires
- A61B17/1739—Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/03—Automatic limiting or abutting means, e.g. for safety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/08—Accessories or related features not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/10—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
- A61B90/11—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/16—Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
- A61B17/1695—Trepans or craniotomes, i.e. specially adapted for drilling thin bones such as the skull
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00367—Details of actuation of instruments, e.g. relations between pushing buttons, or the like, and activation of the tool, working tip, or the like
- A61B2017/00407—Ratchet means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2072—Reference field transducer attached to an instrument or patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/03—Automatic limiting or abutting means, e.g. for safety
- A61B2090/033—Abutting means, stops, e.g. abutting on tissue or skin
- A61B2090/034—Abutting means, stops, e.g. abutting on tissue or skin abutting on parts of the device itself
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/08—Accessories or related features not otherwise provided for
- A61B2090/0807—Indication means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/309—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/363—Use of fiducial points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3954—Markers, e.g. radio-opaque or breast lesions markers magnetic, e.g. NMR or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3966—Radiopaque markers visible in an X-ray image
Definitions
- the present disclosure relates to medical devices and systems, and more particularly, systems for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices.
- Position recognition systems for robot-assisted surgeries are used to determine the position of, and track, a particular object in three dimensions (3D).
- Certain objects, such as surgical instruments, need to be tracked with a high degree of precision as the instrument is being positioned and moved by a robot or by a physician, for example.
- Position recognition systems may use passive and/or active sensors or markers for registering and tracking the positions of the objects. Using these sensors, the system may geometrically resolve the 3-dimensional position of the sensors based on information from or with respect to one or more cameras, signals, or sensors, etc. These surgical systems can therefore utilize position feedback to precisely guide movement of robotic arms and tools relative to a patient's surgical site. Thus, there is a need for a system that efficiently and accurately provides neuronavigation registration and robotic trajectory guidance in a surgical environment.
- End-effectors used in robotic surgery may be limited to use in only certain procedures, or may suffer from other drawbacks or disadvantages.
- a surgical robot system is configured for surgery on an anatomical feature of a patient, and includes a surgical robot, a robot arm connected to such surgical robot, and an end-effector connected to the robot arm.
- the end-effector has a surgical tool selectively connected to it
- the robot system includes a memory accessible by a suitable processor circuit which processes machine-readable instructions.
- the instructions, when executed by the system, determine a drill target and an associated drill trajectory in response to user input.
- the instructions also may be executed to cause the end-effector to move to a position corresponding to such determined target and determined drill trajectory. Once positioned in this manner, the system and corresponding instructions will permit advancement of the aforementioned drill toward the determined target.
- the system includes a drill guide which is connectible between the end-effector and the drill.
- the drill guide is configured to stop advancement of the drill at a pre-selected drill depth associated with the determined target, and is further configured to guide the drill along the determined drill trajectory during advancement of the drill.
- the drill guide provides further assurance against over-penetration or over-engagement of the anatomical feature being operated upon and likewise assists in guiding the trajectory of the drill, including when such trajectory is oblique to the anatomical feature.
- the drill guide includes a guide shaft with a bore extending longitudinally therethrough and having proximal and distal openings to the bore.
- the guide shaft and the corresponding bore are sized to slideably receive a drill bit of the drill therethrough, such drill bit having an associated drill tip and being operatively connected to advancement and rotating mechanisms of the drill at a proximal location by a suitable chuck.
- the drill guide includes a depth stop which is mounted to the proximal end of the guide shaft and selectively slideable longitudinally to vary the length of the guide shaft relative to the predetermined length of the drill bit received in the drill.
- the depth stop has a surface located and disposed to engage the distal surface of the chuck of the drill. As such, engagement between the chuck and the depth stop limits the depth to which the drill tip is advanceable by an amount corresponding to the position of the depth stop.
- the drill guide makes use of a depth indicator having indicia corresponding to graduated depths of advancement of the drill bit associated with the drill guide during the operative procedures contemplated by the surgical robot system.
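- A minimal arithmetic sketch of the depth-limiting relationship described above; the specific geometry (bit length, guide length, stop offset) is assumed purely for illustration and is not taken from the disclosure:

```python
def max_drill_penetration(bit_length_mm, guide_length_mm, stop_offset_mm):
    """With the chuck bottomed out against the depth stop, the drill tip can
    protrude past the distal end of the guide shaft by the bit length minus
    the effective guide length (shaft length plus the depth-stop offset
    selected on the graduated indicia).  Illustrative geometry only."""
    return bit_length_mm - (guide_length_mm + stop_offset_mm)

# Example: a 120 mm bit in a 90 mm guide with the stop set 10 mm proud
# limits penetration beyond the guide tip to 20 mm.
print(max_drill_penetration(120.0, 90.0, 10.0))  # -> 20.0
```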
- the depth stop may be formed so as to include a longitudinal stem, this stem being slideably received within the bore of the guide shaft.
- the guide shaft has a window formed therein which is sized and located to reveal the longitudinal stem of the depth stop when it is received in the guide shaft.
- the aforementioned indicia associated with the graduated depth may be located either on the longitudinal stem visible through the window or on portions adjacent to the window, and such indicia move relative to a pointer or other indicator as the depth stop is adjusted. In this manner, the desired depth limit may be manually selected by visually reading the numerical value of the graduated depth aligned with the associated pointer or indicator.
- the drill guide has an engagement mechanism to set the depth stop at one of a plurality of selectable, longitudinal positions corresponding to the desired drill depth.
- One suitable engagement mechanism may include a ratchet assembly engageable at one of a plurality of longitudinally spaced locations between the proximal and distal ends of the depth stop.
- the ratchet assembly includes a ratchet that may be disengaged and engaged by actuation or release, respectively, of a spring-biased trigger.
- Such trigger may have a trigger lock operatively associated therewith so that once the ratchet of the ratchet assembly has engaged the depth stop at a selected longitudinal position, such trigger lock inhibits inadvertent actuation of the trigger which would cause disengagement of the ratchet and potential longitudinal movement of the depth stop.
- the drill guide may include a handle operatively connected to the engagement mechanism so as to manually set the depth stop at the selected one of the longitudinal positions.
- the guide shaft has at least one bushing, and such bushing may be sized to engage the longitudinal surface at the distal end of the drill bit in such a manner so as to exert a force on such distal drill bit end to oppose lateral displacement of the drill bit within the guide shaft.
- Such opposing force may be sufficient to reduce deviation of the drill bit from the determined drill trajectory, especially when such trajectory is at an oblique angle to the anatomical feature being engaged.
- One such method involves guiding a drill during robot-assisted surgery on an anatomical feature of a patient, such method including the determination by suitable computer means of a drill target and an associated drill trajectory. Thereafter, the end-effector is caused to move, such as by means of a computer, to a position corresponding to the determined target and the determined drill trajectory.
- the depth stop connected to the end-effector may be manually set at one of a plurality of selectable positions corresponding to a desired depth of advancement of the drill relative to the anatomical feature. In this manner, the drill is limited from penetrating the anatomical feature beyond the desired depth set manually by the depth stop.
- the depth stop is displaced relative to visually perceptible indicia corresponding to graduated depths of advancement of the drill associated with the system.
- the depth stop may be secured at a position corresponding to one of the graduated depths indicated on the indicia.
- the steps of displacing and securing the depth stop may involve actuating a spring-biased trigger to disengage the depth stop from a first position, longitudinally sliding the depth stop relative to the visually perceptible indicia to a second position corresponding to the desired position of the depth stop, and then releasing the trigger to re-engage the depth stop at the desired position.
- FIG. 1A is an overhead view of an arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a surgical procedure, according to some embodiments;
- FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a cranial surgical procedure, according to some embodiments;
- FIG. 2 illustrates a robotic system including positioning of the surgical robot and a camera relative to the patient according to some embodiments
- FIG. 3 is a flowchart diagram illustrating computer-implemented operations for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;
- FIG. 4 is a diagram illustrating processing of data for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;
- FIGS. 5A-5C illustrate a system for registering an anatomical feature of a patient using a computerized tomography (CT) localizer, a frame reference array (FRA), and a dynamic reference base (DRB), according to some embodiments;
- FIGS. 6A and 6B illustrate a system for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments
- FIG. 7 illustrates a system for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments;
- FIGS. 8A and 8B illustrate systems for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments
- FIG. 9 illustrates a system for registering an anatomical feature of a patient using a navigated probe and fiducials for point-to-point mapping of the anatomical feature, according to some embodiments
- FIG. 10 illustrates a two-dimensional visualization of an adjustment range for a centerpoint-arc mechanism, according to some embodiments
- FIG. 11 illustrates a two-dimensional visualization of virtual point rotation mechanism, according to some embodiments.
- FIG. 12 is an isometric view of one possible implementation of an end-effector according to the present disclosure.
- FIG. 13 is an isometric view of another possible implementation of an end-effector of the present disclosure.
- FIG. 14 is a partial cutaway, isometric view of still another possible implementation of an end-effector according to the present disclosure.
- FIG. 15 is a bottom angle isometric view of yet another possible implementation of an end-effector according to the present disclosure.
- FIG. 16 is an isometric view of one possible tool stop for use with an end-effector according to the present disclosure
- FIGS. 17 and 18 are top plan views of one possible implementation of a tool insert locking mechanism of an end-effector according to the present disclosure
- FIGS. 19 and 20 are top plan views of the tool stop of FIG. 16 , showing open and closed positions, respectively;
- FIG. 21 is a side, elevational view of another implementation of the robot system disclosed herein, including an end-effector and associated drill guide;
- FIG. 22 is a close-up, side-elevational view of the drill guide of FIG. 21 ;
- FIG. 23 is a front elevational view of the drill guide of FIGS. 21-22 ;
- FIG. 24 is a cross-sectional view of the drill guide of FIGS. 21-23 taken along line A-A of FIG. 23 ;
- FIG. 25 is an exploded, side-elevational view of the drill guide of FIGS. 21-24 .
- systems for neuronavigation registration and robotic trajectory guidance, and related methods and devices are disclosed.
- a first image having an anatomical feature of a patient, a registration fixture that is fixed with respect to the anatomical feature of the patient, and a first plurality of fiducial markers that are fixed with respect to the registration fixture is analyzed, and a position is determined for each fiducial marker of the first plurality of fiducial markers.
- a position and orientation of the registration fixture with respect to the anatomical feature is determined.
- a data frame comprising a second plurality of tracking markers that are fixed with respect to the registration fixture is also analyzed, and a position is determined for each tracking marker of the second plurality of tracking markers.
- a position and orientation of the registration fixture with respect to a robot arm of a surgical robot is determined.
- a position and orientation of the anatomical feature with respect to the robot arm is determined, which allows the robot arm to be controlled based on the determined position and orientation of the anatomical feature with respect to the robot arm.
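- The two determinations above can be viewed as rigid-body transforms that share the registration fixture as a common reference. A minimal Python/NumPy sketch of how such transforms might be chained (the function and variable names are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def invert(T):
    """Invert a 4x4 rigid-body (homogeneous) transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def anatomy_to_robot(T_image_from_fixture, T_robot_from_fixture):
    """Chain the two registrations through the shared registration fixture.

    T_image_from_fixture: fixture pose in image (anatomy) coordinates,
                          recovered from the fiducial markers in the first image.
    T_robot_from_fixture: fixture pose in robot-arm coordinates,
                          recovered from the tracking markers in the data frame.
    Returns a transform mapping image/anatomy coordinates into robot-arm
    coordinates, which is what the robot needs to act on a planned target.
    """
    return T_robot_from_fixture @ invert(T_image_from_fixture)
```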
- embodiments include the ability to combine neuronavigation and robotic trajectory alignment into one system, with support for a wide variety of different registration hardware and methods.
- embodiments may support both computerized tomography (CT) and fluoroscopy (fluoro) registration techniques, and may utilize frame-based and/or frameless surgical arrangements.
- FIG. 1A illustrates a surgical robot system 100 in accordance with an embodiment.
- Surgical robot system 100 may include, for example, a surgical robot 102 , one or more robot arms 104 , a base 106 , a display 110 , an end-effector 112 , for example, including a guide tube 114 , and one or more tracking markers 118 .
- the robot arm 104 may be movable along and/or about an axis relative to the base 106 , responsive to input from a user, commands received from a processing device, or other methods.
- the surgical robot system 100 may include a patient tracking device 116 also including one or more tracking markers 118 , which is adapted to be secured directly to the patient 210 (e.g., to a bone of the patient 210 ).
- the tracking markers 118 may be secured to or may be part of a stereotactic frame that is fixed with respect to an anatomical feature of the patient 210 .
- the stereotactic frame may also be secured to a fixture to prevent movement of the patient 210 during surgery.
- FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system 100 , patient 210 , surgeon 120 , and other medical personnel during a cranial surgical procedure.
- the robot 102 may be positioned behind the head 128 of the patient 210 .
- the robot arm 104 of the robot 102 has an end-effector 112 that may hold a surgical instrument 108 during the procedure.
- a stereotactic frame 134 is fixed with respect to the patient's head 128 , and the patient 210 and/or stereotactic frame 134 may also be secured to a patient base 211 to prevent movement of the patient's head 128 with respect to the patient base 211 .
- the patient 210 , the stereotactic frame 134 and/or the patient base 211 may be secured to the robot base 106 , such as via an auxiliary arm 107 , to prevent relative movement of the patient 210 with respect to components of the robot 102 during surgery.
- Different devices may be positioned with respect to the patient's head 128 and/or patient base 211 as desired to facilitate the procedure, such as an intra-operative CT device 130 , an anesthesiology station 132 , a scrub station 136 , a neuro-modulation station 138 , and/or one or more remote pendants 140 for controlling the robot 102 and/or other devices or systems during the procedure.
- the surgical robot system 100 in the examples of FIGS. 1A and/or 1B may also use a sensor, such as a camera 200 , for example, positioned on a camera stand 202 .
- the camera stand 202 can have any suitable configuration to move, orient, and support the camera 200 in a desired position.
- the camera 200 may include any suitable camera or cameras, such as one or more cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active or passive tracking markers 118 (shown as part of patient tracking device 116 in FIG. 2 ) in a given measurement volume viewable from the perspective of the camera 200 .
- the camera 200 may scan the given measurement volume and detect the light that comes from the tracking markers 118 in order to identify and determine the position of the tracking markers 118 in three-dimensions.
- active tracking markers 118 may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and/or passive tracking markers 118 may include retro-reflective markers that reflect infrared or other light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on the camera 200 or other suitable sensor or other device.
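- As one illustrative way a stereophotogrammetric tracker can resolve a marker's 3D position, the midpoint of closest approach between the two camera rays toward the marker can be computed. A hedged sketch (camera centers and ray directions are assumed inputs; a real tracker also handles calibration and marker identification):

```python
import numpy as np

def triangulate_marker(p1, d1, p2, d2):
    """Estimate a marker's 3D position as the midpoint of closest approach
    between two camera rays.

    p1, p2: 3-vectors, optical centers of the two cameras.
    d1, d2: 3-vectors, directions from each camera toward the detected marker.
    """
    d1 = np.asarray(d1, dtype=float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, dtype=float) / np.linalg.norm(d2)
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    # Solve [d1  -d2] [t1, t2]^T = (p2 - p1) in the least-squares sense.
    A = np.stack([d1, -d2], axis=1)               # 3x2 system
    t = np.linalg.lstsq(A, p2 - p1, rcond=None)[0]
    closest1 = p1 + t[0] * d1                     # closest point on ray 1
    closest2 = p2 + t[1] * d2                     # closest point on ray 2
    return 0.5 * (closest1 + closest2)            # estimated marker position
```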
- one or more targets of surgical interest are localized to an external reference frame.
- stereotactic neurosurgery may use an externally mounted stereotactic frame that facilitates patient localization and implant insertion via a frame mounted arc.
- Neuronavigation is used to register, e.g., map, targets within the brain based on pre-operative or intraoperative imaging.
- links and associations can be made between the imaging and the actual anatomical structures in a surgical environment, and these links and associations can be utilized by robotic trajectory systems during surgery.
- various software and hardware elements may be combined to create a system that can be used to plan, register, place and verify the location of an instrument or implant in the brain.
- These systems may integrate a surgical robot, such as the surgical robot 102 of FIGS. 1A and/or 1B , and may employ a surgical navigation system and planning software to program and control the surgical robot.
- the surgical robot 102 may be remotely controlled, such as by nonsterile personnel.
- the robot 102 may be positioned near or next to patient 210 , and it will be appreciated that the robot 102 can be positioned at any suitable location near the patient 210 depending on the area of the patient 210 undergoing the operation.
- the camera 200 may be separated from the surgical robot system 100 and positioned near or next to patient 210 as well, in any suitable position that allows the camera 200 to have a direct visual line of sight to the surgical field 208 .
- the surgeon 120 may be positioned across from the robot 102 , but is still able to manipulate the end-effector 112 and the display 110 .
- a surgical assistant 126 may be positioned across from the surgeon 120 again with access to both the end-effector 112 and the display 110 . If desired, the locations of the surgeon 120 and the assistant 126 may be reversed. The traditional areas for the anesthesiologist 122 and the nurse or scrub tech 124 may remain unimpeded by the locations of the robot 102 and camera 200 .
- the display 110 can be attached to the surgical robot 102 and in other embodiments, the display 110 can be detached from surgical robot 102 , either within a surgical room with the surgical robot 102 , or in a remote location.
- the end-effector 112 may be coupled to the robot arm 104 and controlled by at least one motor.
- end-effector 112 can comprise a guide tube 114 , which is able to receive and orient a surgical instrument 108 used to perform surgery on the patient 210 .
- the term "end-effector" is used interchangeably herein with the terms "end-effectuator" and "effectuator element."
- end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery.
- end-effector 112 can comprise any known structure for effecting the movement of the surgical instrument 108 in a desired manner.
- the surgical robot 102 is able to control the translation and orientation of the end-effector 112 .
- the robot 102 is able to move end-effector 112 along x-, y-, and z-axes, for example.
- the end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively controlled.
- selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that use, for example, a six degree of freedom robot arm comprising only rotational axes.
- the surgical robot system 100 may be used to operate on patient 210 , and robot arm 104 can be positioned above the body of patient 210 , with end-effector 112 selectively angled relative to the z-axis toward the body of patient 210 .
- the position of the surgical instrument 108 can be dynamically updated so that surgical robot 102 can be aware of the location of the surgical instrument 108 at all times during the procedure. Consequently, in some embodiments, surgical robot 102 can move the surgical instrument 108 to the desired position quickly without any further assistance from a physician (unless the physician so desires). In some further embodiments, surgical robot 102 can be configured to correct the path of the surgical instrument 108 if the surgical instrument 108 strays from the selected, preplanned trajectory. In some embodiments, surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or the surgical instrument 108 .
- a physician or other user can operate the system 100 , and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or the surgical instrument 108 .
- Further details of surgical robot system 100 including the control and movement of a surgical instrument 108 by surgical robot 102 can be found in co-pending U.S. Patent Publication No. 2013/0345718, which is incorporated herein by reference in its entirety.
- the surgical robot system 100 can comprise one or more tracking markers configured to track the movement of robot arm 104 , end-effector 112 , patient 210 , and/or the surgical instrument 108 in three dimensions.
- a plurality of tracking markers can be mounted (or otherwise secured) thereon to an outer surface of the robot 102 , such as, for example and without limitation, on base 106 of robot 102 , on robot arm 104 , and/or on the end-effector 112 .
- one or more tracking markers can be mounted or otherwise secured to the end-effector 112 .
- One or more tracking markers can further be mounted (or otherwise secured) to the patient 210 .
- the plurality of tracking markers can be positioned on the patient 210 spaced apart from the surgical field 208 to reduce the likelihood of being obscured by the surgeon, surgical tools, or other parts of the robot 102 .
- one or more tracking markers can be further mounted (or otherwise secured) to the surgical instruments 108 (e.g., a screw driver, dilator, implant inserter, or the like).
- the tracking markers enable each of the marked objects (e.g., the end-effector 112 , the patient 210 , and the surgical instruments 108 ) to be tracked by the surgical robot system 100 .
- system 100 can use tracking information collected from each of the marked objects to calculate the orientation and location, for example, of the end-effector 112 , the surgical instrument 108 (e.g., positioned in the tube 114 of the end-effector 112 ), and the relative position of the patient 210 .
- Further details of surgical robot system 100 including the control, movement and tracking of surgical robot 102 and of a surgical instrument 108 can be found in U.S. Patent Publication No. 2016/0242849, which is incorporated herein by reference in its entirety.
- pre-operative imaging may be used to identify the anatomy to be targeted in the procedure. If desired by the surgeon, the planning package will allow for the definition of a reformatted coordinate system. This reformatted coordinate system will have coordinate axes anchored to specific anatomical landmarks, such as the anterior commissure (AC) and posterior commissure (PC) for neurosurgery procedures.
- multiple pre-operative exam images (e.g., CT or magnetic resonance (MR) images) may be co-registered with one another.
- registration is the process of determining the coordinate transformations from one coordinate system to another.
- co-registering a CT scan to an MR scan means that it is possible to transform the coordinates of an anatomical point from the CT scan to the corresponding anatomical location in the MR scan.
- a common registration fixture, such as a dynamic reference base (DRB), may be used in this registration process.
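- In code terms, a completed co-registration can be summarized as a 4x4 homogeneous transform; mapping an anatomical point from CT space to MR space is then a single matrix-vector product. A minimal sketch (the identity transform below is only a placeholder standing in for a real registration result):

```python
import numpy as np

# Hypothetical result of co-registering a CT scan to an MR scan:
# a 4x4 homogeneous transform mapping CT coordinates to MR coordinates.
T_mr_from_ct = np.eye(4)   # identity used here only as a placeholder

def ct_point_to_mr(p_ct, T=T_mr_from_ct):
    """Map an anatomical point (x, y, z) from CT space to MR space."""
    p = np.append(np.asarray(p_ct, dtype=float), 1.0)   # homogeneous coordinates
    return (T @ p)[:3]
```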
- FIG. 3 is a flowchart diagram illustrating computer-implemented operations 300 for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments.
- the operations 300 may include receiving a first image volume, such as a CT scan, from a preoperative image capture device at a first time (Block 302 ).
- the first image volume includes an anatomical feature of a patient and at least a portion of a registration fixture that is fixed with respect to the anatomical feature of the patient.
- the registration fixture includes a first plurality of fiducial markers that are fixed with respect to the registration fixture.
- the operations 300 further include determining, for each fiducial marker of the first plurality of fiducial markers, a position of the fiducial marker relative to the first image volume (Block 304 ).
- the operations 300 further include determining, based on the determined positions of the first plurality of fiducial markers, positions of an array of tracking markers on the registration fixture (frame reference array, or FRA) with respect to the anatomical feature (Block 306 ).
- the operations 300 may further include receiving a tracking data frame from an intraoperative tracking device comprising a plurality of tracking cameras at a second time that is later than the first time (Block 308 ).
- the tracking frame includes positions of a plurality of tracking markers that are fixed with respect to the registration fixture (FRA) and a plurality of tracking markers that are fixed with respect to the robot.
- the operations 300 further include determining, based on the positions of the tracking markers of the registration fixture, a position and orientation of the anatomical feature with respect to the tracking cameras (Block 310 ).
- the operations 300 further include determining, based on the determined positions of the plurality of tracking markers on the robot, a position and orientation of the robot arm of a surgical robot with respect to the tracking cameras (Block 312 ).
- the operations 300 further include determining, based on the determined position and orientation of the anatomical feature with respect to the tracking cameras and the determined position and orientation of the robot arm with respect to the tracking cameras, a position and orientation of the anatomical feature with respect to the robot arm (Block 314 ).
- the operations 300 further include controlling movement of the robot arm with respect to the anatomical feature, e.g., along and/or rotationally about one or more defined axes, based on the determined position and orientation of the anatomical feature with respect to the robot arm (Block 316 ).
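- Once the anatomy-to-robot registration of Block 314 is available as a 4x4 transform, a planned target and approach direction expressed in image coordinates can be mapped into robot-arm coordinates for use in Block 316. A minimal sketch under that assumption (note that points transform homogeneously while directions are only rotated):

```python
import numpy as np

def trajectory_to_robot_space(T_robot_from_image, target_image, direction_image):
    """Map a planned target point and approach direction from image space
    into robot-arm space.

    T_robot_from_image : 4x4 homogeneous transform from the registration chain
    target_image       : (x, y, z) planned target in image coordinates
    direction_image    : planned trajectory direction in image coordinates
    """
    R = T_robot_from_image[:3, :3]
    target_robot = (T_robot_from_image @ np.append(np.asarray(target_image, float), 1.0))[:3]
    direction_robot = R @ np.asarray(direction_image, dtype=float)  # directions rotate only
    direction_robot = direction_robot / np.linalg.norm(direction_robot)
    return target_robot, direction_robot
```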
- FIG. 4 is a diagram illustrating a data flow 400 for a multiple coordinate transformation system, to enable determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments.
- data from a plurality of exam image spaces 402 may be transformed and combined into a common exam image space 404 .
- the data from the common exam image space 404 and data from a verification image space 406 may be transformed and combined into a registration image space 408 .
- Data from the registration image space 408 may be transformed into patient fiducial coordinates 410 , which is transformed into coordinates for a DRB 412 .
- a tracking camera 414 may detect movement of the DRB 412 (represented by DRB 412 ′) and may also detect a location of a probe tracker 416 to track coordinates of the DRB 412 over time.
- a robotic arm tracker 418 determines coordinates for the robot arm based on transformation data from a Robotics Planning System (RPS) space 420 or similar modeling system, and/or transformation data from the tracking camera 414 .
- these and other features may be used and combined in different ways to achieve registration of image space, i.e., coordinates from image volume, into tracking space, i.e., coordinates for use by the surgical robot in real-time.
- these features may include fiducial-based registration such as stereotactic frames with CT localizer, preoperative CT or MRI registered using intraoperative fluoroscopy, calibrated scanner registration where any acquired scan's coordinates are pre-calibrated relative to the tracking space, and/or surface registration using a tracked probe, for example.
- FIGS. 5A-5C illustrate a system 500 for registering an anatomical feature of a patient.
- the stereotactic frame base 530 is fixed to an anatomical feature 528 of patient, e.g., the patient's head.
- the stereotactic frame base 530 may be affixed to the patient's head 528 prior to registration using pins clamping the skull or other method.
- the stereotactic frame base 530 may act as both a fixation platform, for holding the patient's head 528 in a fixed position, and a registration and tracking platform, for alternately holding the CT localizer 536 or the FRA fixture 534 .
- the CT localizer 536 includes a plurality of fiducial markers 532 (e.g., N-pattern radio-opaque rods or other fiducials), which are automatically detected in the image space using image processing. Due to the precise attachment mechanism of the CT localizer 536 to the base 530 , these fiducial markers 532 are in known space relative to the stereotactic frame base 530 .
- a 3D CT scan of the patient with CT localizer 536 attached is taken, with an image volume that includes both the patient's head 528 and the fiducial markers 532 of the CT localizer 536 . This registration image can be taken intraoperatively or preoperatively, either in the operating room or in radiology, for example.
- the captured 3D image dataset is stored to computer memory.
- the CT localizer 536 is removed from the stereotactic frame base 530 and the frame reference array fixture 534 is attached to the stereotactic frame base 530 .
- the stereotactic frame base 530 remains fixed to the patient's head 528 , however, and is used to secure the patient during surgery, and serves as the attachment point of a frame reference array fixture 534 .
- the frame reference array fixture 534 includes a frame reference array (FRA), which is a rigid array of three or more tracked markers 539 , which may be the primary reference for optical tracking.
- Mount points on the FRA fixture 534 and stereotactic frame base 530 may be designed such that the FRA fixture 534 attaches reproducibly to the stereotactic frame base 530 with minimal (i.e., submillimetric) variability. These mount points on the stereotactic frame base 530 can be the same mount points used by the CT localizer 536 , which is removed after the scan has been taken.
- An auxiliary arm (such as auxiliary arm 107 of FIG. 1B , for example) or other attachment mechanism can also be used to securely affix the patient to the robot base to ensure that the robot base is not allowed to move relative to the patient.
- a dynamic reference base (DRB) 540 may also be attached to the stereotactic frame base 530 .
- the DRB 540 in this example includes a rigid array of three or more tracked markers 542 .
- the DRB 540 and/or other tracked markers may be attached to the stereotactic frame base 530 and/or directly to the patient's head 528 using auxiliary mounting arms 541 , pins, or other attachment mechanisms.
- the DRB 540 in general may be attached as needed for allowing unhindered surgical and equipment access.
- registration which was initially related to the tracking markers 539 of the FRA, can be optionally transferred or related to the tracking markers 542 of the DRB 540 .
- the surgeon may remove the FRA fixture 534 and navigate using only the DRB 540 .
- the surgeon could opt to navigate from the FRA markers 539 , without using a DRB 540 , or may navigate using both the FRA markers 539 and the DRB 540 .
- the FRA fixture 534 and/or DRB 540 uses optical markers, the tracked positions of which are in known locations relative to the stereotactic frame base 530 , similar to the CT localizer 536 , but it should be understood that many other additional and/or alternative techniques may be used.
- FIGS. 6A and 6B illustrate a system 600 for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments.
- image space is registered to tracking space using multiple intraoperative fluoroscopy (fluoro) images taken using a tracked registration fixture 644 .
- the anatomical feature of the patient (e.g., the patient's head 628 ) is held in rigid fixation by a clamping apparatus 643 , which can be a three-pin fixation system such as a Mayfield clamp, a stereotactic frame base attached to the surgical table, or another fixation method, as desired.
- the clamping apparatus 643 may also function as a support structure for a patient tracking array or DRB 640 as well.
- the DRB may be attached to the clamping apparatus using auxiliary mounting arms 641 or other means.
- the fluoro fixture 644 is attached to the fluoro unit's x-ray-collecting image intensifier (not shown) and secured by tightening clamping feet 632 .
- the fluoro fixture 644 contains fiducial markers (e.g., metal spheres laid out across two planes in this example, not shown) that are visible on 2D fluoro images captured by the fluoro image capture device and can be used to calculate the location of the x-ray source relative to the image intensifier, which is typically about 1 meter away contralateral to the patient, using a standard pinhole camera model.
- Detection of the metal spheres in the fluoro image captured by the fluoro image capture device also enables the software to de-warp the fluoro image (i.e., to remove pincushion and s-distortion).
- the fluoro fixture 644 contains 3 or more tracking markers 646 for determining the location and orientation of the fluoro fixture 644 in tracking space.
- software can project vectors through a CT image volume, based on a previously captured CT image, to generate synthetic images based on contrast levels in the CT image that appear similar to the actual fluoro images (i.e., digitally reconstructed radiographs (DRRs)).
- the location of the patient's head 628 relative to the x-ray source and detector is calculated. Because the tracking markers 646 on the fluoro fixture 644 track the position of the image intensifier and the position of the x-ray source relative to the image intensifier is calculated from metal fiducials on the fluoro fixture 644 projected on 2D images, the position of the x-ray source and detector in tracking space are known and the system is able to achieve image-to-tracking registration.
- two or more shots are taken of the head 628 of the patient by the fluoro image capture device from two different perspectives while tracking the array markers 642 of the DRB 640 , which is fixed to the registration fixture 630 via a mounting arm 641 , and tracking markers 646 on the fluoro fixture 644 .
- an algorithm computes the location of the head 628 or other anatomical feature relative to the tracking space for the procedure. Through image-to-tracking registration, the location of any tracked tool in the image volume space can be calculated.
- a first fluoro image taken from a first fluoro perspective can be compared to a first DRR constructed from a first perspective through a CT image volume, and a second fluoro image taken from a second fluoro perspective can be compared to a second DRR constructed from a second perspective through the same CT image volume. Based on the comparisons, it may be determined that the first DRR is substantially equivalent to the first fluoro image with respect to the projected view of the anatomical feature, and that the second DRR is substantially equivalent to the second fluoro image with respect to the projected view of the anatomical feature.
- Equivalency confirms that the position and orientation of the x-ray path from emitter to collector on the actual fluoro machine as tracked in camera space matches the position and orientation of the x-ray path from emitter to collector as specified when generating the DRRs in CT space, and therefore registration of tracking space to CT space is achieved.
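- A highly simplified sketch of the DRR-comparison idea: generate a synthetic projection from the CT volume and score its similarity to the tracked fluoro shot. For brevity this uses a parallel-beam sum projection and normalized cross-correlation; the approach described above projects vectors through the CT volume using the tracked pinhole (cone-beam) geometry, so the code below is an assumption-laden illustration only:

```python
import numpy as np

def drr_parallel(ct_volume, axis=0):
    """Crude digitally reconstructed radiograph: parallel-beam sum projection.
    (A cone-beam/pinhole projection matching the tracked x-ray geometry would
    be used in practice; a parallel projection keeps this sketch short.)"""
    return np.asarray(ct_volume, dtype=float).sum(axis=axis)

def normalized_cross_correlation(a, b):
    """Similarity score between a fluoro image and a candidate DRR."""
    a = (np.asarray(a, float) - np.mean(a)) / (np.std(a) + 1e-9)
    b = (np.asarray(b, float) - np.mean(b)) / (np.std(b) + 1e-9)
    return float((a * b).mean())

# Hypothetical usage: choose the candidate CT orientation whose DRR best
# matches the tracked fluoro shot (resample_ct and candidate_orientations
# are placeholders, not part of the disclosed system):
# best = max(candidate_orientations, key=lambda R: normalized_cross_correlation(
#            fluoro_image, drr_parallel(resample_ct(ct_volume, R))))
```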
- FIG. 7 illustrates a system 700 for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments.
- As shown in FIG. 7 , a fiducial-based image-to-tracking registration can be utilized that uses an intraoperative CT fixture (ICT) 750 having a plurality of tracking markers 751 and radio-opaque fiducial reference markers 732 to register the CT space to the tracking space.
- the anatomical feature 728 (e.g., the patient's head) is held by a clamping apparatus 730 , such as a three-pin Mayfield frame and/or a stereotactic frame.
- the surgeon will affix the ICT 750 to the anatomical feature 728 , DRB 740 , or clamping apparatus 730 , so that it is in a static position relative to the tracking markers 742 of the DRB 740 , which may be held in place by mounting arm 741 or other rigid means.
- a CT scan is captured that encompasses the fiducial reference markers 732 of the ICT 750 while also capturing relevant anatomy of the anatomical feature 728 .
- the system auto-identifies (through image processing) locations of the fiducial reference markers 732 of the ICT within the CT volume, which are in a fixed position relative to the tracking markers of the ICT 750 , providing image-to-tracking registration.
- This registration, which was initially based on the tracking markers 751 of the ICT 750 , is then related or transferred to the tracking markers 742 of the DRB 740 , and the ICT 750 may then be removed.
- FIG. 8A illustrates a system 800 for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments.
- An intraoperative scanner 852 , such as an X-ray machine or other scanning device, may have a tracking array 854 with tracking markers 855 mounted thereon for registration. Based on the fixed, known position of the tracking array 854 on the scanning device, the system may be calibrated to directly map (register) the tracking space to the image space of any scan acquired by the system. Once registration is achieved, the registration, which is initially based on the tracking markers 855 (e.g., gantry markers) of the scanner's array 854 , is related or transferred to the tracking markers 842 of a DRB 840 , which may be fixed to a clamping fixture 830 holding the patient's head 828 by a mounting arm 841 or other rigid means. After transferring registration, the markers on the scanner are no longer used and can be removed, deactivated, or covered if desired. Registering the tracking space to any image acquired by the scanner in this way may avoid the need for fiducials or other reference markers in the image space in some embodiments.
- FIG. 8B illustrates an alternative system 800 ′ that uses a portable intraoperative scanner, referred to herein as a C-arm scanner 853 .
- the C-arm scanner 853 includes a c-shaped arm 856 coupled to a movable base 858 to allow the C-arm scanner 853 to be moved into place and removed as needed, without interfering with other aspects of the surgery.
- The arm 856 is positioned around the patient's head 828 intraoperatively, and the arm 856 is rotated and/or translated with respect to the patient's head 828 to capture the X-ray or other type of scan needed to achieve registration, at which point the C-arm scanner 853 may be removed from the patient.
- Another registration method for an anatomical feature of a patient may be to use a surface contour map of the anatomical feature, according to some embodiments.
- a surface contour map may be constructed using a navigated or tracked probe, or other measuring or sensing device, such as a laser pointer, 3D camera, etc.
- a surgeon may drag or sequentially touch points on the surface of the head with the navigated probe to capture the surface across unique protrusions, such as zygomatic bones, superciliary arches, bridge of nose, eyebrows, etc.
- the system compares the resulting surface contours to contours detected from the CT and/or MR images, seeking the location and orientation of contour that provides the closest match.
- each contour point is related to tracking markers on a DRB on the patient at the time it is recorded. Since the location of the contour map is known in tracking space from the tracked probe and tracked DRB, tracking-to-image registration is obtained once the corresponding contour is found in image space.
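- The sketch below illustrates one way a candidate image-space pose could be scored against the probe-collected contour points (mean distance to the CT/MR surface); the pose search itself and any particular matching algorithm are not specified in the disclosure, so the approach and names are assumptions.

```python
# Sketch: scoring how well a candidate image-space pose explains the contour
# points collected in tracking space. A KD-tree gives nearest-surface
# distances; the search over candidate poses is outside this snippet.
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_distance(contour_pts_tracking: np.ndarray,
                          T_image_from_tracking: np.ndarray,
                          image_surface_pts: np.ndarray) -> float:
    """Mean distance from the transformed contour points to the CT/MR surface points."""
    pts_h = np.c_[contour_pts_tracking, np.ones(len(contour_pts_tracking))]
    pts_img = (T_image_from_tracking @ pts_h.T).T[:, :3]
    dists, _ = cKDTree(image_surface_pts).query(pts_img)
    return float(dists.mean())
```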
- FIG. 9 illustrates a system 900 for registering an anatomical feature of a patient using a navigated or tracked probe and fiducials for point-to-point mapping of the anatomical feature 928 (e.g., a patient's head), according to some embodiments.
- Software would instruct the user to point with a tracked probe to a series of anatomical landmark points that can be found in the CT or MR image.
- the system captures a frame of tracking data with the tracked locations of tracking markers on the probe and on the DRB. From the tracked locations of markers on the probe, the coordinates of the tip of the probe are calculated and related to the locations of markers on the DRB.
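- As a minimal sketch of that per-frame computation, assuming 4x4 camera-from-rigid-body transforms and a known tip offset in the probe's own frame (all names illustrative):

```python
# Hedged sketch: computing the probe tip in DRB coordinates for one frame of
# tracking data. The tip offset in the probe's own frame is assumed known
# from the probe's geometry or calibration.
import numpy as np

def probe_tip_in_drb(T_camera_from_probe: np.ndarray,
                     T_camera_from_drb: np.ndarray,
                     tip_offset_probe: np.ndarray) -> np.ndarray:
    """Return the 3D probe-tip position expressed in the DRB frame."""
    tip_h = np.append(tip_offset_probe, 1.0)        # homogeneous point
    tip_cam = T_camera_from_probe @ tip_h           # tip in camera space
    tip_drb = np.linalg.inv(T_camera_from_drb) @ tip_cam
    return tip_drb[:3]
```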
- The surgeon will attach fiducials 954 (i.e., fiducial markers, such as sticker fiducials or metal fiducials) to the patient. The fiducials 954 are constructed of material that is opaque on imaging, for example containing metal if used with CT or Vitamin E if used with MR. Imaging (CT or MR) will occur after placing the fiducials 954.
- the surgeon or user will then manually find the coordinates of the fiducials in the image volume, or the software will find them automatically with image processing.
- the surgeon or user may also locate the fiducials 954 in physical space relative to the DRB 940 by touching the fiducials 954 with a tracked probe while simultaneously recording tracking markers on the probe (not shown) and on the DRB 940 . Registration is achieved because the coordinates of the same points are known in the image space and the tracking space.
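- A common way to compute such a registration from corresponding point pairs is a least-squares rigid fit (Kabsch/SVD); the sketch below is illustrative and is not necessarily the specific method used by the system.

```python
# Minimal paired-point rigid registration (Kabsch/SVD): the same fiducials
# located in tracking space and in image space yield the registration
# transform. Assumes at least three non-collinear point pairs.
import numpy as np

def paired_point_registration(pts_tracking: np.ndarray,
                              pts_image: np.ndarray) -> np.ndarray:
    """Return a 4x4 transform mapping tracking-space points onto image space."""
    ct, ci = pts_tracking.mean(axis=0), pts_image.mean(axis=0)
    H = (pts_tracking - ct).T @ (pts_image - ci)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = ci - R @ ct
    return T
```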
- One use for the embodiments described herein is to plan trajectories and to control a robot to move into a desired trajectory, after which the surgeon will place implants such as electrodes through a guide tube held by the robot.
- Additional functionalities include exporting coordinates used with existing stereotactic frames, such as a Leksell frame, which uses five coordinates: X, Y, Z, Ring Angle and Arc Angle. These five coordinates are established using the target and trajectory identified in the planning stage relative to the image space and knowing the position and orientation of the ring and arc relative to the stereotactic frame base or other registration fixture.
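- The sketch below only illustrates that a target point plus a unit trajectory direction determine five such coordinates under a simple spherical convention; the actual Leksell angle conventions and offsets differ, so the formulas here are assumptions for illustration.

```python
# Illustrative only: converting a planned target and trajectory direction into
# five frame-style coordinates (X, Y, Z, ring, arc). A real stereotactic
# frame's angular conventions differ from this simple parameterization.
import numpy as np

def frame_coordinates(target_xyz: np.ndarray, trajectory_dir: np.ndarray):
    """Return (x, y, z, ring_deg, arc_deg) for a unit trajectory direction."""
    d = trajectory_dir / np.linalg.norm(trajectory_dir)
    ring_deg = np.degrees(np.arctan2(d[1], d[0]))                 # rotation about the vertical axis (assumed convention)
    arc_deg = np.degrees(np.arccos(np.clip(d[2], -1.0, 1.0)))     # tilt from vertical (assumed convention)
    x, y, z = target_xyz
    return float(x), float(y), float(z), float(ring_deg), float(arc_deg)
```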
- Stereotactic frames allow a target location 1058 of an anatomical feature 1028 (e.g., a patient's head) to be treated as the center of a sphere, and the trajectory can pivot about the target location 1058.
- the trajectory to the target location 1058 is adjusted by the ring and arc angles of the stereotactic frame (e.g., a Leksell frame). These coordinates may be set manually, and the stereotactic frame may be used as a backup or as a redundant system in case the robot fails or cannot be tracked or registered successfully.
- The linear x, y, z offsets to the center point (i.e., target location 1058) are adjusted via the mechanisms of the frame.
- a cone 1060 is centered around the target location 1058 , and shows the adjustment zone that can be achieved by modifying the ring and arc angles of the Leksell or other type of frame. This figure illustrates that a stereotactic frame with ring and arc adjustments is well suited for reaching a fixed target location from a range of angles while changing the entry point into the skull.
- FIG. 11 illustrates a two-dimensional visualization of a virtual point rotation mechanism, according to some embodiments.
- the robotic arm is able to create a different type of point-rotation functionality that enables a new movement mode that is not easily achievable with a 5-axis mechanical frame, but that may be achieved using the embodiments described herein.
- this mode allows the user to pivot the robot's guide tube about any fixed point in space.
- the robot may pivot about the entry point 1162 into the anatomical feature 1128 (e.g., a patient's head). This entry point pivoting is advantageous as it allows the user to make a smaller burr hole without limiting their ability to adjust the target location 1164 intraoperatively.
- the cone 1160 represents the range of trajectories that may be reachable through a single entry hole. Additionally, entry point pivoting is advantageous as it allows the user to reach two different target locations 1164 and 1166 through the same small entry burr hole. Alternately, the robot may pivot about a target point (e.g., location 1058 shown in FIG. 10 ) within the skull to reach the target location from different angles or trajectories, as illustrated in FIG. 10 .
- Such interior pivoting robotically has the same advantages as a stereotactic frame as it allows the user to approach the same target location 1058 from multiple approaches, such as when irradiating a tumor or when adjusting a path so that critical structures such as blood vessels or nerves will not be crossed when reaching targets beyond them.
- the robot adjusts the pivot point through controlled activation of axes and the robot can therefore dynamically adjust its pivot point and switch as needed between the modes illustrated in FIGS. 10 and 11 .
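- A minimal sketch of such a pivot move follows, assuming the guide-tube pose is represented as a 4x4 transform; applying a rotation about the chosen pivot point leaves that point stationary while re-orienting the tube. Inverse kinematics and joint-level control are outside the sketch, and the names are illustrative.

```python
# Sketch of the "pivot about an arbitrary fixed point" move: rotate the
# guide-tube pose about a chosen pivot (entry point or target point) so the
# tube's axis sweeps a cone while the pivot stays put.
import numpy as np

def pivot_pose_about_point(T_world_from_tube: np.ndarray,
                           pivot_world: np.ndarray,
                           R_delta: np.ndarray) -> np.ndarray:
    """Return the new 4x4 guide-tube pose after rotating by R_delta about pivot_world."""
    T = np.eye(4)
    T[:3, :3] = R_delta
    T[:3, 3] = pivot_world - R_delta @ pivot_world  # keeps the pivot point fixed
    return T @ T_world_from_tube
```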
- these and other embodiments may allow for implant locations to be verified using intraoperative imaging. Placement accuracy of the instrument or implant relative to the planned trajectory can be qualitatively and/or quantitatively shown to the user.
- One option for comparing planned to placed position is to merge a postoperative verification CT image to any of the preoperative images. Once pre- and post-operative images are merged and the plan is shown overlaid, the shadow of the implant on the postoperative CT can be compared to the plan to assess accuracy of placement. Detection of the shadow artifact on the post-op CT can be performed automatically through image processing, and the offset displayed numerically in terms of millimeters of offset at the tip and entry and angular offset along the path. This option does not require any fiducials to be present in the verification image since image-to-image registration is performed based on bony anatomical contours.
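- A sketch of that numeric comparison, assuming planned and detected entry/tip points are already expressed in the same merged image space (names illustrative):

```python
# Sketch of the accuracy report: millimeter offsets at tip and entry plus the
# angular deviation between the planned and detected implant paths.
import numpy as np

def placement_offsets(planned_entry, planned_tip, placed_entry, placed_tip):
    planned_entry, planned_tip = np.asarray(planned_entry), np.asarray(planned_tip)
    placed_entry, placed_tip = np.asarray(placed_entry), np.asarray(placed_tip)
    tip_offset_mm = np.linalg.norm(placed_tip - planned_tip)
    entry_offset_mm = np.linalg.norm(placed_entry - planned_entry)
    u = planned_tip - planned_entry
    v = placed_tip - placed_entry
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    angle_deg = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return tip_offset_mm, entry_offset_mm, angle_deg
```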
- a second option for comparing planned position to the final placement would utilize intraoperative fluoro with or without an attached fluoro fixture.
- Two out-of-plane fluoro images will be taken and these fluoro images will be matched to DRRs generated from pre-operative CT or MR as described above for registration. Unlike some of the registration methods described above, however, it may be less important for the fluoro images to be tracked because the key information is where the electrode is located relative to the anatomy in the fluoro image.
- the linear or slightly curved shadow of the electrode would be found on a fluoro image, and once the DRR corresponding to that fluoro shot is found, this shadow can be replicated in the CT image volume as a plane or sheet that is oriented in and out of the ray direction of the fluoro image and DRR. That is, the system may not know how deep in or out of the fluoro image plane the electrode lies on a given shot, but can calculate the plane or sheet of possible locations and represent this plane or sheet on the 3D volume. In a second fluoro view, a different plane or sheet can be determined and overlaid on the 3D image. Where these two planes or sheets intersect on the 3D image is the detected path of the electrode.
- the system can represent this detected path as a graphic on the 3D image volume and allow the user to reslice the image volume to display this path and the planned path from whatever perspective is desired, also allowing automatic or manual calculation of the deviation from planned to placed position of the electrode.
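- Geometrically, the detected path is the intersection line of the two back-projected planes; a minimal sketch follows, assuming each plane is given as a normal vector and offset in image coordinates.

```python
# Sketch: the detected electrode path as the intersection line of the two
# back-projected planes/sheets (one per fluoro view) in the 3D image volume.
import numpy as np

def plane_intersection_line(n1, d1, n2, d2):
    """Each plane is {x : n . x = d}. Returns (point_on_line, unit_direction)."""
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        raise ValueError("planes are parallel; the two views are not independent")
    # minimum-norm point satisfying both plane equations
    A = np.vstack([n1, n2])
    b = np.array([d1, d2], float)
    point = np.linalg.lstsq(A, b, rcond=None)[0]
    return point, direction / norm
```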
- Tracking the fluoro fixture is unnecessary but may be done to help de-warp the fluoro images and calculate the location of the x-ray emitter to improve accuracy of DRR calculation, the rate of convergence when iterating to find matching DRR and fluoro shots, and placement of sheets/planes representing the electrode on the 3D scan.
- Two primary methods to establish and maintain navigation integrity include: tracking the position of a surveillance marker relative to the markers on the DRB, and checking landmarks within the images.
- In the first method, if the tracked position of the surveillance marker relative to the markers on the DRB changes beyond a tolerance, the system may alert the user of a possible loss of navigation integrity.
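- A sketch of one possible form of that check follows, comparing the current surveillance-marker-to-DRB-marker distances against those recorded at registration; the tolerance value and marker ordering are assumptions.

```python
# Sketch of a surveillance-marker integrity check: the distances from the
# surveillance marker to each DRB marker should stay constant; drift beyond a
# tolerance suggests the DRB (or the surveillance marker) has moved.
import numpy as np

def integrity_ok(surv_ref, drb_ref, surv_now, drb_now, tol_mm=1.0) -> bool:
    ref = np.linalg.norm(np.asarray(drb_ref) - np.asarray(surv_ref), axis=1)
    now = np.linalg.norm(np.asarray(drb_now) - np.asarray(surv_now), axis=1)
    return bool(np.max(np.abs(now - ref)) <= tol_mm)
```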
- In the second method, if a landmark check shows that the anatomy represented in the displayed slices on screen does not match the anatomy at which the tip of the probe points, then the surgeon will also become aware that there is a loss of navigation integrity.
- the surgeon has the option to re-attach the FRA, which mounts in only one possible way to the frame base, and to restore tracking-to-image registration based on the FRA tracking markers and the stored fiducials from the CT localizer 536 .
- This registration can then be transferred or related to tracking markers on a repositioned DRB. Once registration is transferred the FRA can be removed if desired.
- end-effector 112 may be equipped with components, configured, or otherwise include features so that one end-effector may remain attached to a given one of robot arms 104 without changing to another end-effector for multiple different surgical procedures, such as, by way of example only, Deep Brain Stimulation (DBS), Stereoelectroencephalography (SEEG), or Endoscopic Navigation and Tumor Biopsy.
- End-effector 112 may be orientable to oppose an anatomical feature of a patient in a manner so as to be in operative proximity thereto, and to receive one or more surgical tools for operations contemplated on the anatomical feature proximate to the end-effector 112.
- Motion and orientation of end-effector 112 may be accomplished through any of the navigation, trajectory guidance, or other methodologies discussed herein or as may be otherwise suitable for the particular operation.
- End-effector 112 is suitably configured to permit a plurality of surgical tools 129, such as a stylet 113 (FIG. 13) or an electrode driver 115 (FIG. 14), to be selectively connectable to end-effector 112.
- a processor circuit includes various subroutines and other machine-readable instructions configured to cause, when executed, end-effector 112 to move, such as by GPS movement, relative to the anatomical feature, at predetermined stages of associated surgical operations, whether pre-operative, intra-operative or post-operative.
- End-effector 112 includes various components and features to either prevent or permit end-effector movement depending on whether and which tools 129 , if any, are connected to end-effector 112 .
- end-effector 112 includes a tool-insert locking mechanism 117 located on and connected to proximal surface 119 .
- Tool-insert locking mechanism 117 is configured so as to secure any selected one of a plurality of surgical tools, such as the aforesaid stylet 113 , electrode driver 115 , or any other tools for different surgeries mentioned previously or as may be contemplated by other applications of this disclosure.
- the securement of the tool by tool-insert locking mechanism 117 is such that, for any of multiple tools capable of being secured to locking mechanism 117 , each such tool is operatively and suitably secured at the predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient, such that multiple tools may be secured to the same end-effector 112 in respective positions appropriate for the contemplated procedure.
- Tool stop 121 is located on distal surface 123 of end-effector 112, that is, the surface generally opposing the patient.
- Tool stop 121 has a stop mechanism 125 and a sensor 127 operatively associated therewith, as seen with reference to FIGS. 16, 19, and 20 .
- Stop mechanism 125 is mounted to end-effector 112 so as to be selectively movable relative thereto between an engaged position to prevent any of the tools from being connected to end-effector 112 and a disengaged position which permits any of the tools 129 to be selectively connected to end-effector 112 .
- Sensor 127 may be located on or within the housing of end-effector 112 at any suitable location ( FIGS.
- sensor 127 detects whether stop mechanism 125 is in the engaged or disengaged position.
- Sensor 127 may assume any form suitable for such detection, such as any type of mechanical switch or any type of magnetic sensor, including Reed switches, Hall Effect sensors, or other magnetic field detecting devices.
- Sensor 127 has two portions, a Hall Effect sensor portion (not shown) and a magnetic portion 131, the two portions moving relative to each other so as to generate and detect two magnetic fields corresponding to the respective engaged and disengaged positions.
- the magnetic portion comprises two rare earth magnets 131 which move relative to the complementary sensing portion (not shown) mounted in the housing of end effector 112 in operative proximity to magnets 131 to detect change in the associated magnetic field from movement of stop mechanism 125 between engaged and disengaged positions.
- the Hall effect sensor is bipolar and can detect whether a North pole or South pole of a magnet opposes the sensor. Magnets 131 are configured so that the North pole of one magnet faces the path of the sensor and the South pole of the other magnet faces the path of the sensor.
- The sensor senses an increased signal when it is near one magnet (for example, in the disengaged position), a decreased signal when it is near the other magnet (for example, in the engaged position), and an unchanged signal when it is not in proximity to either magnet.
- In response to detection of stop mechanism 125 being in the disengaged position shown in FIGS. 13 and 19, sensor 127 causes the processor of surgical robot system 100 to execute suitable instructions to prevent movement of end-effector 112 relative to the anatomical feature. Such movement prevention may be appropriate for any number of reasons, such as when a tool is connected to end-effector 112, such tool potentially interacting with the anatomical feature of the patient.
- Alternatively, a sensor 127 for detecting whether tool stop mechanism 125 is engaged or disengaged could comprise a single magnet behind the housing (not shown) and two Hall Effect sensors located where magnets 131 are shown in the preferred embodiment.
- monopolar Hall Effect sensors are suitable and would be configured so that Sensor 1 detects a signal when the magnet is in proximity due to the locking mechanism being disengaged, while Sensor 2 detects a signal when the same magnet is in proximity due to the locking mechanism being engaged. Neither sensor would detect a signal when the magnet is between positions or out of proximity to either sensor.
- Although a configuration could be conceived in which a sensor is active for the engaged position and inactive for the disengaged position, a configuration with three signals indicating engaged, disengaged, or transitional is preferred to ensure correct behavior in case of power failure.
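- A sketch of that three-state interpretation follows, mapping two monopolar sensor signals to engaged, disengaged, or transitional and defaulting to transitional whenever the reading is ambiguous; the polarity, naming, and thresholds are assumptions.

```python
# Hedged sketch of the preferred three-state interpretation: two monopolar
# sensor signals map to engaged / disengaged / transitional, and anything
# ambiguous is treated as transitional (a conservative default, e.g. after a
# power loss).
from enum import Enum

class StopState(Enum):
    ENGAGED = "engaged"
    DISENGAGED = "disengaged"
    TRANSITIONAL = "transitional"

def classify_stop_state(sensor_disengaged_active: bool,
                        sensor_engaged_active: bool) -> StopState:
    if sensor_engaged_active and not sensor_disengaged_active:
        return StopState.ENGAGED
    if sensor_disengaged_active and not sensor_engaged_active:
        return StopState.DISENGAGED
    return StopState.TRANSITIONAL  # neither (between positions) or both (fault)
```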
- End-effector 112 , tool stop 121 , and tool-insert locking mechanism 117 each have co-axially aligned bores or apertures such that any selected one of the plurality of surgical tools 129 may be received through such bores and apertures.
- End-effector 112 has a bore 133, and tool stop 121 and tool-insert locking mechanism 117 have respective apertures 135 and 137.
- Stop mechanism 125 includes a ring 139 axially aligned with bore 133 and aperture 135 of tool stop 121 . Ring 139 is selectively, manually rotatable in the directions indicated by arrow A ( FIG. 16 ) so as to move stop mechanism 125 between the engaged position and the disengaged position.
- Ring 139 includes features which enable it to be locked in either the disengaged or engaged position, thereby controlling the selective rotation of ring 139.
- a detent mechanism 141 is located on and mounted to ring 139 in any suitable way to lock ring 139 against certain rotational movement out of a predetermined position, in this case, such position being when stop mechanism 125 is in the engaged position.
- one suitable arrangement has a manually accessible head extending circumferentially outwardly from ring 139 and having a male protrusion (not shown) spring-loaded axially inwardly to engage a corresponding female detent portion (not shown).
- Detent mechanism 141 is manually actuatable to unlock ring 139 from its engaged position to permit ring 139 to be manually rotated to cause stop mechanism 125 to move from the engaged position ( FIG. 20 ) to the disengaged position ( FIG. 19 ).
- Tool stop 121 includes a lever arm 143 pivotally mounted adjacent aperture 135 of tool stop 121 so that an end of lever arm 143 selectively pivots in the directions indicated by arrow B (FIGS. 16, 19 and 20).
- Lever arm 143 is operatively connected to stop mechanism 125 such that it closes aperture 135 of tool stop 121 in response to stop mechanism 125 being in the engaged position, as shown in FIG. 20.
- Lever arm 143 is also operatively connected so as to pivot back in direction of arrow B to open aperture 135 in response to stop mechanism 125 being in the disengaged position. As such, movement of stop mechanism 125 between engaged and disengaged positions results in closure or opening of aperture 135 , respectively, by lever arm 143 .
- Lever arm 143 in this implementation, is not only pivotally mounted adjacent aperture 135 , but also pivots in parallel with a distal plane defined at a distal-most point of distal surface 123 of end-effector 112 . In this manner, any one of the surgical tools 129 , which is attempted to be inserted through bore 133 and aperture 135 , is stopped from being inserted past the distal plane in which lever arm 143 rotates to close aperture 135 .
- a connector 145 is configured to meet with and secure any one of the surgical tools 129 at their appropriate height, angle of orientation, and rotational position relative to the anatomical feature of the patient.
- Connector 145 comprises a rotatable flange 147 which has at least one slot 149 formed therein to receive therethrough a corresponding tongue 151 associated with a selected one of the plurality of tools 129. So, for example, in FIG. 14, the particular electrode driver 115 has multiple tongues, one of which, tongue 151, is shown.
- Rotatable flange 147 may comprise a collar 153 , which collar, in turn, has multiple ones of slots 149 radially spaced on a proximally oriented surface 155 , as best seen in FIG. 12 .
- Multiple slots 149 arranged around collar 153 are sized or otherwise configured so as to receive therethrough corresponding ones of multiple tongues 151 associated with a selected one of the plurality of tools 129. Therefore, as seen in FIG. 13, multiple slots 149 and corresponding tongues 151 may be arranged to permit securing of a selected one of the plurality of tools 129 only when the selected tool is in the correct, predetermined angle of orientation and rotational position relative to the anatomical feature of the patient.
- tongues 151 (one of which is shown in a cutaway of FIG. 14 ) have been received in radially spaced slots 149 arrayed so that electrode driver 115 is received at the appropriate angle of orientation and rotational position.
- Rotatable flange 147 has, in this implementation, a grip 173 to facilitate manual rotation between an open and closed position as shown in FIGS. 17 and 18 , respectively.
- multiple sets of mating slots 149 and tongues 151 are arranged at different angular locations, in this case, locations which may be symmetric about a single diametric chord of a circle but otherwise radially asymmetric, and at least one of the slots has a different dimension or extends through a different arc length than other slots.
- dimensions of tongues 151 and slots 149 are selected so that when rotatable flange 147 is rotated to the closed position, flange portions 157 are radially translated to overlie or engage portions of tongues 151 , such engagement shown in FIG. 18 and affixing tool 129 (or adapter 155 ) received in connector 145 at the desired, predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient.
- Tongues 151, described as being associated with tools 129, may either be directly connected to such tools 129, or may be located on and mounted to the above-mentioned adapter 155, such as that shown in FIGS. 12, 17 and 18, such adapter 155 being configured to interconnect at least one of the plurality of surgical tools 129 with end-effector 112.
- Adapter 155 includes two operative portions: a tool receiver 157 adapted to connect the selected one or more surgical tools 129, and one or more tongues 151 which may, in this implementation, be mounted and connected to the distal end of adapter 155.
- Adapter 155 has an outer perimeter 159 which, in this implementation, is sized to oppose an inner perimeter 161 of rotatable flange 147 .
- Adapter 155 extends between proximal and distal ends 163 , 165 , respectively and has an adapter bore 167 extending between ends 163 , 165 .
- Adapter bore 167 is sized to receive at least one of the plurality of surgical tools 129 , and similarly, the distance between proximal and distal ends 163 , 165 is selected so that at least one of tools 129 is secured to end-effector 112 at the predetermined, appropriate height for the surgical procedure associated with such tool received in adapter bore 167 .
- system 100 includes multiple ones of adapter 155 , configured to be interchangeable inserts 169 having substantially the same, predetermined outer perimeters 159 to be received within inner perimeter 161 of rotatable flange 147 . Still further in such implementation, the interchangeable inserts 169 have bores of different, respective diameters, which bores may be selected to receive corresponding ones of the tools 129 therein. Bores 167 may comprise cylindrical bushings having inner diameters common to multiple surgical tools 129 . One possible set of diameters for bores 167 may be 12, 15, and 17 millimeters, suitable for multiple robotic surgery operations, such as those identified in this disclosure.
- inner perimeter 161 of rotatable flange 147 and outer perimeter 159 of adapter 155 are circular, having central, aligned axes and corresponding radii.
- Slots 149 of rotatable flange 147 extend radially outwardly from the central axis of rotatable flange 147 in the illustrated implementation, whereas tongues 151 of adapter 155 extend radially outwardly from adapter 155 .
- end-effector 112 may be equipped with at least one illumination element 171 ( FIGS. 14 and 15 ) orientable toward the anatomical feature to be operated upon.
- Illumination element 171 may be in the form of a ring of LEDs 177 ( FIG. 14 ) located within adapter 167 , which adapter is in the form of a bushing secured to tool locking mechanism 117 .
- Illumination element 171 may also be a single LED 179 mounted on the distal surface 123 of end-effector 112 .
- the spacing and location of illumination element or elements 171 may be selected so that tools 129 received through bore 133 of end-effector 112 do not cast shadows or otherwise interfere with illumination from element 171 of the anatomical feature being operated upon.
- Tool stop 121 is rotatable, selectively lockable, and movable between engaged and disengaged positions, and a sensor prevents movement of end-effector 112 when in such disengaged position, due to the potential presence of a tool which may not be advisably moved during such disengaged position.
- Tool-insert locking mechanism 117 is likewise rotatable between open and closed positions to receive one of a plurality of interchangeable inserts 169 and tongues 151 of such inserts, wherein selected tools 129 may be received in such inserts 169 ; alternately, tongues 151 may be otherwise associated with tools 129 , such as by having tongues 151 directly connected to such tools 129 , which tongue-equipped tools likewise may be received in corresponding slots 149 of tool-insert locking mechanism 117 .
- Tool-insert locking mechanism 117 may be rotated from its open position in which tongues 151 have been received in slots 149 , to secure associated adapters 155 and/or tools 129 so that they are at appropriate, respective heights, angles of orientation, and rotational positions relative to the anatomical feature of the patient.
- For those implementations with multiple adapters 155, the dimensions of such adapters 155, including bore diameters, height, and other suitable dimensions, are selected so that a single or a minimized number of end-effectors 112 can be used for a multiplicity of surgical tools 129.
- Adapters 155 such as those in the form of interchangeable inserts 169 or cylindrical bushings, may facilitate connecting an expanded set of surgical tools 129 to the end-effector 112 , and thus likewise facilitate a corresponding expanded set of associated surgical features using the same end-effector 112 .
- end-effector 212 is suitably connected to a robot arm, such as that described previously with reference to robot arm 104 , and is orientable to oppose an anatomical feature a so as to be in operative proximity thereto.
- End-effector 212 may include features similar to those described with reference to end-effector 112 of FIGS. 12-18, such as a tool-insert locking mechanism 217 and tool stop 221, which correspond to tool-insert locking mechanism 117 and tool stop 121 described previously.
- With reference to FIGS. 19-24, it will be appreciated that such features are not required in end-effector 212, and various other features or configurations of end-effector 212 are contemplated by the disclosure.
- End-effector 212 has a surgical tool comprising a drill 223 selectively connectible thereto.
- Processing circuitry, memory, and suitable machine-readable instructions are associated with this implementation of surgical robot system 100 so as to determine, for drill 223 , a drill target and an associated drill trajectory relative to anatomical feature a.
- Suitable instructions are likewise provided such that, when executed, the end-effector may be automatically moved to a position corresponding to the determined target and determined drill trajectory. Such instructions also permit advancement of the drill toward the target. Manual manipulations of end effector 212 to drill locations and trajectories are likewise contemplated.
- a drill guide 225 is releasably connected between drill 223 and end-effector 212 .
- drill guide 225 may be configured and otherwise includes features to stop advancement of drill 223 at a preselected drill depth associated with the determined target.
- Drill guide 225 may be further configured with features to guide the drill along the determined drill trajectory during the advancement of the drill.
- Drill guide 225 includes a guide shaft 227 having a bore 229 extending longitudinally therethrough between proximal and distal ends of guide shaft 227 .
- Bore 229 is sized to slideably receive a drill bit 231 which extends distally from a chuck 233 of drill 223 .
- Such drill bit 231 terminates in a drill tip 235 capable of engaging the anatomical feature a upon a suitable amount of advancement of drill bit 231 along the determined drill trajectory.
- drill bit 231 has a known, that is, predetermined, length A corresponding to the distance between the distal surface 237 of chuck 233 and the end of drill tip 235 .
- Guide shaft 227 has a known, that is, predetermined, second length B.
- Drill guide 225 includes a depth stop 239 having respective proximal and distal depth stop ends. Depth stop 239 is mounted so that its distal depth stop end is selectively and slideably received at the proximal end of guide shaft 227 . As such, depth stop 239 , when slid relative to guide shaft 227 , varies the second length B of guide shaft 227 .
- movement of depth stop 239 varies the predetermined length B of guide shaft 227 among a plurality or set of length values corresponding to amounts by which drill tip 235 extends beyond distal end of guide shaft 227 , such amounts thus corresponding to available drill depths.
- selected lengths B are less than the known, predetermined length A of drill bit 231 , and thereby permit the distal end of drill bit 231 and its drill tip 235 to extend distally from the distal end of guide shaft 227 by selected lengths corresponding to desired depths of engagement of the anatomical feature.
- Depth stop 239 has a surface 241 oriented and located to engage the distal surface 237 of chuck 233 of drill 223. Accordingly, the depth to which drill tip 235 is advanceable by drill 223 is limited by an amount corresponding to the position of depth stop 239.
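- The implied arithmetic is simply that the achievable protrusion of the drill tip beyond the guide shaft is the bit length A minus the effective guide length B set by the depth stop; a trivial sketch, with units assumed to be millimeters:

```python
# Simple arithmetic implied above: with the chuck face bottomed out on the
# depth stop, the drill tip can protrude past the distal end of the guide
# shaft by (bit length A) - (effective guide length B).
def max_drill_depth_mm(bit_length_a_mm: float, guide_length_b_mm: float) -> float:
    protrusion = bit_length_a_mm - guide_length_b_mm
    return max(protrusion, 0.0)  # a stop set longer than the bit exposes nothing
```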
- drill guide 225 includes a visually perceptible depth indicator 251 operatively associated with depth stop 239 .
- depth stop 239 has a longitudinal stem 245 having indicia 243 disposed thereon, so that slideable movement of depth stop 239 slides longitudinal stem 245 and the indicia 243 .
- Guide shaft 227 has portions forming a window 247 sized and located to reveal at least a portion of longitudinal stem 245 bearing indicia 243 .
- the portions adjacent to window 247 have one or more structures or indicia thereon to form a pointer 249 .
- Indicia 243, in this case, include a graduated numerical value scale associated with available depths of penetration of anatomical feature a, the pointer and numerical value scale being moveable relative to each other, in this case by sliding longitudinal stem 245 of depth stop 239 relative to window 247 of guide shaft 227.
- pointer 249 and graduated depths of advancement appearing as numerical values in indicia 243 may be disposed in alternative configurations, such as having the pointer on the slideable longitudinal stem 245 and the graduated depths of advancement disposed longitudinally along portions of window 247 . Still further variations are possible.
- drill guide 225 makes use of an engagement mechanism 253 which sets depth stop 239 at one of the available, selectable, longitudinal positions corresponding to the desired or predetermined drill depth.
- Engagement mechanism 253 may include a ratchet assembly 255 which has features, such as mating teeth 257 as shown, to engage and set depth stop 239 at one of the longitudinally spaced locations between the proximal and distal ends of depth stop 239 .
- teeth 257 are longitudinally disposed along a suitable outer surface of longitudinal stem 245 , and a ratchet 259 with a confronting surface feature, such as one or more mating teeth 257 , is located to oppose teeth 257 disposed on stem 245 and is spring-biased to selectively engage stem 245 .
- Ratchet 259 and corresponding ratchet assembly 255 may be operated by a spring-biased trigger 261 .
- trigger 261 is manually pulled against a spring-biasing force to disengage engagement mechanism 253 from a first one of the longitudinal positions to which it had been previously set.
- depth stop 239 is operated to slide longitudinal stem 245 relative to guide shaft 227 to a selected or desired drill depth as indicated by pointer 249 relative to the scale of indicia 243 .
- spring-biased trigger 261 may be released, allowing the spring-biased force to set engagement mechanism 253 at a second one of the available longitudinal positions, the second longitudinal position corresponding to the desired drill depth.
- A trigger lock 263 is operatively connected to trigger 261 such that, when engaged, trigger lock 263 holds ratchet 259 engaged with the corresponding teeth 257 of depth stop 239.
- Trigger lock 263 may be in the form of a locking ring 265, which may be threadably or otherwise engaged to act as a stop against disengagement of mating teeth 257 of ratchet 259, or more generally, to prevent radially outward movement of ratchet 259 relative to the longitudinal axis of depth stop 239.
- Actuation of engagement mechanism 253 may be facilitated by providing drill guide 225 with a handle 267 .
- Handle 267 may be pulled radially outwardly from the longitudinal axis of drill guide 225 and, by virtue of connection to engagement mechanism 253 , such outward pulling of handle 267 overcomes longitudinally inward spring-biasing force and disengages engagement mechanism 253 from depth stop 239 .
- release of handle 267 after desired sliding of depth stop 239 relative to guide shaft 227 operates to set drill guide 225 at a desired drill depth.
- Drill guide 225 may include features to reduce deviation of drill bit 231 from the determined drill trajectory (FIG. 21).
- such trajectory guiding features comprise at least one bushing 269 disposed within bore 229 .
- Bushing or bushings 269 are sized to receive outer longitudinal surface 273 of drill bit 231 slideably and rotatably therethrough. Accordingly, bushing 269 has an internal diameter sized to not only moveably engage longitudinal surface 273, but to thereby exert a force (shown in the direction of arrow F) opposing lateral displacement of drill bit 231 within guide shaft 227 (FIG. 22).
- one bushing 269 is located within guide shaft 227 toward distal end thereof and thereby engages drill bit 231 closer to where drill tip 235 engages anatomical feature a.
- Such distal locations of drill bit 231 often experience greater lateral displacement forces by virtue of their proximity to anatomical feature a to be engaged, especially in the event the drill trajectory is at an oblique angle.
- a second bushing 269 is disposed at a proximal location along guide shaft 227 and thereby may generate a countervailing force opposing lateral displacement at the proximal end of drill bit 231 , such as would result from a cantilevering of drill bit 231 upon oblique engagement of anatomical feature a.
- drill 223 may be guided during robot-assisted surgery, including any of the surgeries described herein on anatomical feature a of a patient.
- a drill target and an associated drill trajectory are determined, such as by computer processing.
- End-effector 212 is manually or automatically moved to a position corresponding to the determined target and the determined drill trajectory.
- depth stop 239 may be mechanically or manually set to one of a plurality of selectable positions corresponding to the desired depth of advancement of the drill associated with the contemplated surgery on the anatomical feature. In this way, drill 223 is limited from penetrating the anatomical feature beyond the selected, desired depth.
- the displacement of the depth stop is done with the aid of visually perceptible indicia corresponding to graduated depths of advancement of the drill and, upon such visual perception of the desired depth, the depth stop is secured at such desired position.
- a spring-biased trigger 261 is disengaged relative to depth stop 239 from a first position, depth stop 239 is then longitudinally slid relative to visually perceptible indicia 243 to a second position, such second position corresponding to the desired position of the depth stop and the desired depth of drilling on anatomical feature a.
- trigger 261 is released to re-engage depth stop 239 at the desired position. Thereafter, trigger lock 263 may be further engaged to avoid inadvertent disengagement and potential movement of depth stop 239 and thereby avoid over drilling.
- the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
- the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia,” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
- the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
- These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
- Embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
Abstract
Description
- This application is a continuation-in-part of U.S. patent application Ser. No. 16/452,737, filed Jun. 26, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 16/361,863, filed on Mar. 22, 2019, the entire contents of each of which are hereby incorporated herein by reference for all purposes.
- The present disclosure relates to medical devices and systems, and more particularly, to systems for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices.
- Position recognition systems for robot assisted surgeries are used to determine the position of and track a particular object in 3-dimensions (3D). In robot assisted surgeries, for example, certain objects, such as surgical instruments, need to be tracked with a high degree of precision as the instrument is being positioned and moved by a robot or by a physician, for example.
- Position recognition systems may use passive and/or active sensors or markers for registering and tracking the positions of the objects. Using these sensors, the system may geometrically resolve the 3-dimensional position of the sensors based on information from or with respect to one or more cameras, signals, or sensors, etc. These surgical systems can therefore utilize position feedback to precisely guide movement of robotic arms and tools relative to a patient's surgical site. Thus, there is a need for a system that efficiently and accurately provides neuronavigation registration and robotic trajectory guidance in a surgical environment.
- End-effectors used in robotic surgery may be limited to use in only certain procedures, or may suffer from other drawbacks or disadvantages.
- According to some implementations, a surgical robot system is configured for surgery on an anatomical feature of a patient, and includes a surgical robot, a robot arm connected to such surgical robot, and an end-effector connected to the robot arm. The end-effector has a surgical tool selectively connected to it, and the robot system includes a memory accessible by a suitable processor circuit which processes machine-readable instructions. Among the instructions which are executable by the system are ones which, in response to user input, determine a drill target and an associated drill trajectory. The instructions also may be executed to cause the end-effector to move to a position corresponding to such determined target and determined drill trajectory. Once positioned in this manner, the system and corresponding instructions will permit advancement of the aforementioned drill toward the determined target.
- In certain implementations, the system includes a drill guide which is connectible between the end-effector and the drill. The drill guide is configured to stop advancement of the drill at a pre-selected drill depth associated with the determined target, and is further configured to guide the drill along the determined drill trajectory during advancement of the drill. As such, the drill guide provides further assurance against over-penetration or over-engagement of the anatomical feature being operated upon and likewise assists in guiding the trajectory of the drill, including when such trajectory is oblique to the anatomical feature.
- The drill guide includes a guide shaft with a bore extending longitudinally therethrough and having proximal and distal openings to the bore. The guide shaft and the corresponding bore are sized to slideably receive a drill bit of the drill therethrough, such drill bit having an associated drill tip and being operatively connected to advancement and rotating mechanisms of the drill at a proximal location by a suitable chuck. The drill guide includes a depth stop which is mounted to the proximal end of the guide shaft and selectively slideable longitudinally to vary the length of the guide shaft relative to the predetermined length of the drill bit received in the drill. The depth stop has a surface located and otherwise disposed to engage the distal surface of the chuck of the drill. As such, engagement between the chuck and the depth stop limits the depth to which the drill tip is advanceable by an amount corresponding to the position of the depth stop.
- In certain implementations, the drill guide makes use of a depth indicator having indicia corresponding to graduated depths of advancement of the drill bit associated with the drill guide during the operative procedures contemplated by the surgical robot system. The depth stop may be formed so as to include a longitudinal stem, this stem being slideably received within the bore of the guide shaft. The guide shaft, in turn, has a window formed therein which is sized and located to reveal the longitudinal stem of the depth stop when it is received in the guide shaft. With such an arrangement, the aforementioned indicia associated with the graduated depth may be located either on the longitudinal stem visible through the window or on portions adjacent to the window, and such indicia are movable relative to a pointer or other indicator relative to which the graduated depth scale moves. In this manner, the desired depth limit may be manually selected by visually perceiving the numerical value of the graduated depths aligned with the associated pointer or indicator.
- In still other aspects of the disclosed implementations, the drill guide has an engagement mechanism to set the depth stop at one of a plurality of selectable, longitudinal positions corresponding to the desired drill depth. One suitable engagement mechanism may include a ratchet assembly engageable at one of a plurality of longitudinally spaced locations between the proximal and distal ends of the depth stop. In one suitable implementation, the ratchet assembly includes a ratchet that may be disengaged and engaged by actuation or release, respectively, of a spring-biased trigger. Such trigger may have a trigger lock operatively associated therewith so that once the ratchet of the ratchet assembly has engaged the depth stop at a selected longitudinal position, such trigger lock inhibits inadvertent actuation of the trigger which would cause disengagement of the ratchet and potential longitudinal movement of the depth stop. The drill guide may include a handle operatively connected to the engagement mechanism so as to manually set the depth stop at the selected one of the longitudinal positions.
- In still other implementations, the guide shaft has at least one bushing, and such bushing may be sized to engage the longitudinal surface at the distal end of the drill bit in such a manner so as to exert a force on such distal drill bit end to oppose lateral displacement of the drill bit within the guide shaft. Such opposing force may be sufficient to reduce deviation of the drill bit from the determined drill trajectory, especially when such trajectory is at an oblique angle to the anatomical feature being engaged.
- The above-described system and its various features may be associated with a variety of related procedures or processes, collectively referred to herein as methods. One such method involves guiding a drill during robot-assisted surgery on an anatomical feature of a patient, such method including the determination by suitable computer means of a drill target and an associated drill trajectory. Thereafter, the end-effector is caused to move, such as by means of a computer, to a position corresponding to the determined target and the determined drill trajectory. At any other point prior to drill advancement during operations, the depth stop connected to the end-effector may be manually set at one of a plurality of selectable positions corresponding to a desired depth of advancement of the drill relative to the anatomical feature. In this manner, the drill is limited from penetrating the anatomical feature beyond the desired depth set manually by the depth stop.
- In another possible method hereunder, the depth stop is displaced relative to visually perceptible indicia corresponding to graduated depths of advancement of the drill associated with the system. Once a suitable location of the depth stop is determined by visual perception, the depth stop may be secured at a position corresponding to one of the graduated depths indicated on the indicia. The steps of displacing and securing the depth stop may involve actuating a spring-biased trigger to disengage the depth stop from a first position, longitudinally sliding the depth stop relative to the visually perceptible indicia to a second position corresponding to the desired position of the depth stop, and then releasing the trigger to re-engage the depth stop at the desired position.
- Other methods and related devices and systems, and corresponding methods and computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such devices and systems, and corresponding methods and computer program products be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.
- The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate certain non-limiting embodiments of inventive concepts. In the drawings:
- FIG. 1A is an overhead view of an arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a surgical procedure, according to some embodiments;
- FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system, patient, surgeon, and other medical personnel during a cranial surgical procedure, according to some embodiments;
- FIG. 2 illustrates a robotic system including positioning of the surgical robot and a camera relative to the patient, according to some embodiments;
- FIG. 3 is a flowchart diagram illustrating computer-implemented operations for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;
- FIG. 4 is a diagram illustrating processing of data for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments;
- FIGS. 5A-5C illustrate a system for registering an anatomical feature of a patient using a computerized tomography (CT) localizer, a frame reference array (FRA), and a dynamic reference base (DRB), according to some embodiments;
- FIGS. 6A and 6B illustrate a system for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments;
- FIG. 7 illustrates a system for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments;
- FIGS. 8A and 8B illustrate systems for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments;
- FIG. 9 illustrates a system for registering an anatomical feature of a patient using a navigated probe and fiducials for point-to-point mapping of the anatomical feature, according to some embodiments;
- FIG. 10 illustrates a two-dimensional visualization of an adjustment range for a centerpoint-arc mechanism, according to some embodiments;
- FIG. 11 illustrates a two-dimensional visualization of a virtual point rotation mechanism, according to some embodiments;
- FIG. 12 is an isometric view of one possible implementation of an end-effector according to the present disclosure;
- FIG. 13 is an isometric view of another possible implementation of an end-effector of the present disclosure;
- FIG. 14 is a partial cutaway, isometric view of still another possible implementation of an end-effector according to the present disclosure;
- FIG. 15 is a bottom angle isometric view of yet another possible implementation of an end-effector according to the present disclosure;
- FIG. 16 is an isometric view of one possible tool stop for use with an end-effector according to the present disclosure;
- FIGS. 17 and 18 are top plan views of one possible implementation of a tool insert locking mechanism of an end-effector according to the present disclosure;
- FIGS. 19 and 20 are top plan views of the tool stop of FIG. 16, showing open and closed positions, respectively;
- FIG. 21 is a side, elevational view of another implementation of the robot system disclosed herein, including an end-effector and associated drill guide;
- FIG. 22 is a close-up, side-elevational view of the drill guide of FIG. 21;
- FIG. 23 is a front elevational view of the drill guide of FIGS. 21-22;
- FIG. 24 is a cross-sectional view of the drill guide of FIGS. 21-23, taken along the line A-A of FIG. 23; and
- FIG. 25 is an exploded, side-elevational view of the drill guide of FIGS. 21-24.
- It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the description herein or illustrated in the drawings. The teachings of the present disclosure may be used and practiced in other embodiments and practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass both direct and indirect mountings, connections, supports, and couplings. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings.
- The following discussion is presented to enable a person skilled in the art to make and use embodiments of the present disclosure. Various modifications to the illustrated embodiments will be readily apparent to those skilled in the art, and the principles herein can be applied to other embodiments and applications without departing from embodiments of the present disclosure. Thus, the embodiments are not intended to be limited to embodiments shown, but are to be accorded the widest scope consistent with the principles and features disclosed herein. The following detailed description is to be read with reference to the figures, in which like elements in different figures have like reference numerals. The figures, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the embodiments. Skilled artisans will recognize the examples provided herein have many useful alternatives and fall within the scope of the embodiments.
- According to some other embodiments, systems for neuronavigation registration and robotic trajectory guidance, and related methods and devices are disclosed. In some embodiments, a first image having an anatomical feature of a patient, a registration fixture that is fixed with respect to the anatomical feature of the patient, and a first plurality of fiducial markers that are fixed with respect to the registration fixture is analyzed, and a position is determined for each fiducial marker of the first plurality of fiducial markers. Next, based on the determined positions of the first plurality of fiducial markers, a position and orientation of the registration fixture with respect to the anatomical feature is determined. A data frame comprising a second plurality of tracking markers that are fixed with respect to the registration fixture is also analyzed, and a position is determined for each tracking marker of the second plurality of tracking markers. Based on the determined positions of the second plurality of tracking markers, a position and orientation of the registration fixture with respect to a robot arm of a surgical robot is determined. Based on the determined position and orientation of the registration fixture with respect to the anatomical feature and the determined position and orientation of the registration fixture with respect to the robot arm, a position and orientation of the anatomical feature with respect to the robot arm is determined, which allows the robot arm to be controlled based on the determined position and orientation of the anatomical feature with respect to the robot arm.
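- A minimal sketch of that final composition step, assuming 4x4 homogeneous transforms with the registration fixture as the common frame (names illustrative):

```python
# Sketch of the transform chain summarized above: combine the fixture's pose
# relative to the anatomy (from fiducials in the image) with the fixture's
# pose relative to the robot arm (from tracking markers) to obtain the
# anatomy's pose relative to the robot arm.
import numpy as np

def anatomy_in_robot_frame(T_arm_from_fixture: np.ndarray,
                           T_anat_from_fixture: np.ndarray) -> np.ndarray:
    """Return T_arm_from_anatomy = T_arm_from_fixture @ inv(T_anat_from_fixture)."""
    return T_arm_from_fixture @ np.linalg.inv(T_anat_from_fixture)
```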
- Advantages of this and other embodiments include the ability to combine neuronavigation and robotic trajectory alignment into one system, with support for a wide variety of different registration hardware and methods. For example, as will be described in detail below, embodiments may support both computerized tomography (CT) and fluoroscopy (fluoro) registration techniques, and may utilize frame-based and/or frameless surgical arrangements. Moreover, in many embodiments, if an initial (e.g. preoperative) registration is compromised due to movement of a registration fixture, registration of the registration fixture (and of the anatomical feature by extension) can be re-established intraoperatively without suspending surgery and re-capturing preoperative images.
- Referring now to the drawings,
FIG. 1A illustrates asurgical robot system 100 in accordance with an embodiment.Surgical robot system 100 may include, for example, asurgical robot 102, one ormore robot arms 104, abase 106, adisplay 110, an end-effector 112, for example, including aguide tube 114, and one ormore tracking markers 118. Therobot arm 104 may be movable along and/or about an axis relative to thebase 106, responsive to input from a user, commands received from a processing device, or other methods. Thesurgical robot system 100 may include apatient tracking device 116 also including one ormore tracking markers 118, which is adapted to be secured directly to the patient 210 (e.g., to a bone of the patient 210). As will be discussed in greater detail below, the trackingmarkers 118 may be secured to or may be part of a stereotactic frame that is fixed with respect to an anatomical feature of thepatient 210. The stereotactic frame may also be secured to a fixture to prevent movement of thepatient 210 during surgery. - According to an alternative embodiment,
FIG. 1B is an overhead view of an alternate arrangement for locations of a robotic system 100, patient 210, surgeon 120, and other medical personnel during a cranial surgical procedure. During a cranial procedure, for example, the robot 102 may be positioned behind the head 128 of the patient 210. The robot arm 104 of the robot 102 has an end-effector 112 that may hold a surgical instrument 108 during the procedure. In this example, a stereotactic frame 134 is fixed with respect to the patient's head 128, and the patient 210 and/or stereotactic frame 134 may also be secured to a patient base 211 to prevent movement of the patient's head 128 with respect to the patient base 211. In addition, the patient 210, the stereotactic frame 134 and/or the patient base 211 may be secured to the robot base 106, such as via an auxiliary arm 107, to prevent relative movement of the patient 210 with respect to components of the robot 102 during surgery. Different devices may be positioned with respect to the patient's head 128 and/or patient base 211 as desired to facilitate the procedure, such as an intra-operative CT device 130, an anesthesiology station 132, a scrub station 136, a neuro-modulation station 138, and/or one or more remote pendants 140 for controlling the robot 102 and/or other devices or systems during the procedure. - The
surgical robot system 100 in the examples ofFIGS. 1A and/or 1B may also use a sensor, such as acamera 200, for example, positioned on acamera stand 202. The camera stand 202 can have any suitable configuration to move, orient, and support thecamera 200 in a desired position. Thecamera 200 may include any suitable camera or cameras, such as one or more cameras (e.g., bifocal or stereophotogrammetric cameras), able to identify, for example, active or passive tracking markers 118 (shown as part ofpatient tracking device 116 inFIG. 2 ) in a given measurement volume viewable from the perspective of thecamera 200. In this example, thecamera 200 may scan the given measurement volume and detect the light that comes from the trackingmarkers 118 in order to identify and determine the position of the trackingmarkers 118 in three-dimensions. For example,active tracking markers 118 may include infrared-emitting markers that are activated by an electrical signal (e.g., infrared light emitting diodes (LEDs)), and/orpassive tracking markers 118 may include retro-reflective markers that reflect infrared or other light (e.g., they reflect incoming IR radiation into the direction of the incoming light), for example, emitted by illuminators on thecamera 200 or other suitable sensor or other device. - In many surgical procedures, one or more targets of surgical interest, such as targets within the brain for example, are localized to an external reference frame. For example, stereotactic neurosurgery may use an externally mounted stereotactic frame that facilitates patient localization and implant insertion via a frame mounted arc. Neuronavigation is used to register, e.g., map, targets within the brain based on pre-operative or intraoperative imaging. Using this pre-operative or intraoperative imaging, links and associations can be made between the imaging and the actual anatomical structures in a surgical environment, and these links and associations can be utilized by robotic trajectory systems during surgery.
- According to some embodiments, various software and hardware elements may be combined to create a system that can be used to plan, register, place and verify the location of an instrument or implant in the brain. These systems may integrate a surgical robot, such as the
surgical robot 102 ofFIGS. 1A and/or 1B , and may employ a surgical navigation system and planning software to program and control the surgical robot. In addition or alternatively, thesurgical robot 102 may be remotely controlled, such as by nonsterile personnel. - The
robot 102 may be positioned near or next topatient 210, and it will be appreciated that therobot 102 can be positioned at any suitable location near thepatient 210 depending on the area of thepatient 210 undergoing the operation. Thecamera 200 may be separated from thesurgical robot system 100 and positioned near or next topatient 210 as well, in any suitable position that allows thecamera 200 to have a direct visual line of sight to thesurgical field 208. In the configuration shown, thesurgeon 120 may be positioned across from therobot 102, but is still able to manipulate the end-effector 112 and thedisplay 110. Asurgical assistant 126 may be positioned across from thesurgeon 120 again with access to both the end-effector 112 and thedisplay 110. If desired, the locations of thesurgeon 120 and theassistant 126 may be reversed. The traditional areas for theanesthesiologist 122 and the nurse orscrub tech 124 may remain unimpeded by the locations of therobot 102 andcamera 200. - With respect to the other components of the
robot 102, thedisplay 110 can be attached to thesurgical robot 102 and in other embodiments, thedisplay 110 can be detached fromsurgical robot 102, either within a surgical room with thesurgical robot 102, or in a remote location. The end-effector 112 may be coupled to therobot arm 104 and controlled by at least one motor. In some embodiments, end-effector 112 can comprise aguide tube 114, which is able to receive and orient asurgical instrument 108 used to perform surgery on thepatient 210. As used herein, the term “end-effector” is used interchangeably with the terms “end-effectuator” and “effectuator element.” Although generally shown with aguide tube 114, it will be appreciated that the end-effector 112 may be replaced with any suitable instrumentation suitable for use in surgery. In some embodiments, end-effector 112 can comprise any known structure for effecting the movement of thesurgical instrument 108 in a desired manner. - The
surgical robot 102 is able to control the translation and orientation of the end-effector 112. Therobot 102 is able to move end-effector 112 along x-, y-, and z-axes, for example. The end-effector 112 can be configured for selective rotation about one or more of the x-, y-, and z-axis such that one or more of the Euler Angles (e.g., roll, pitch, and/or yaw) associated with end-effector 112 can be selectively controlled. In some embodiments, selective control of the translation and orientation of end-effector 112 can permit performance of medical procedures with significantly improved accuracy compared to conventional robots that use, for example, a six degree of freedom robot arm comprising only rotational axes. For example, thesurgical robot system 100 may be used to operate onpatient 210, androbot arm 104 can be positioned above the body ofpatient 210, with end-effector 112 selectively angled relative to the z-axis toward the body ofpatient 210. - In some embodiments, the position of the
surgical instrument 108 can be dynamically updated so thatsurgical robot 102 can be aware of the location of thesurgical instrument 108 at all times during the procedure. Consequently, in some embodiments,surgical robot 102 can move thesurgical instrument 108 to the desired position quickly without any further assistance from a physician (unless the physician so desires). In some further embodiments,surgical robot 102 can be configured to correct the path of thesurgical instrument 108 if thesurgical instrument 108 strays from the selected, preplanned trajectory. In some embodiments,surgical robot 102 can be configured to permit stoppage, modification, and/or manual control of the movement of end-effector 112 and/or thesurgical instrument 108. Thus, in use, in some embodiments, a physician or other user can operate thesystem 100, and has the option to stop, modify, or manually control the autonomous movement of end-effector 112 and/or thesurgical instrument 108. Further details ofsurgical robot system 100 including the control and movement of asurgical instrument 108 bysurgical robot 102 can be found in co-pending U.S. Patent Publication No. 2013/0345718, which is incorporated herein by reference in its entirety. - As will be described in greater detail below, the
surgical robot system 100 can comprise one or more tracking markers configured to track the movement ofrobot arm 104, end-effector 112,patient 210, and/or thesurgical instrument 108 in three dimensions. In some embodiments, a plurality of tracking markers can be mounted (or otherwise secured) thereon to an outer surface of therobot 102, such as, for example and without limitation, onbase 106 ofrobot 102, onrobot arm 104, and/or on the end-effector 112. In some embodiments, such as the embodiment ofFIG. 3 below, for example, one or more tracking markers can be mounted or otherwise secured to the end-effector 112. One or more tracking markers can further be mounted (or otherwise secured) to thepatient 210. In some embodiments, the plurality of tracking markers can be positioned on thepatient 210 spaced apart from thesurgical field 208 to reduce the likelihood of being obscured by the surgeon, surgical tools, or other parts of therobot 102. Further, one or more tracking markers can be further mounted (or otherwise secured) to the surgical instruments 108 (e.g., a screw driver, dilator, implant inserter, or the like). Thus, the tracking markers enable each of the marked objects (e.g., the end-effector 112, thepatient 210, and the surgical instruments 108) to be tracked by thesurgical robot system 100. In some embodiments,system 100 can use tracking information collected from each of the marked objects to calculate the orientation and location, for example, of the end-effector 112, the surgical instrument 108 (e.g., positioned in thetube 114 of the end-effector 112), and the relative position of thepatient 210. Further details ofsurgical robot system 100 including the control, movement and tracking ofsurgical robot 102 and of asurgical instrument 108 can be found in U.S. Patent Publication No. 2016/0242849, which is incorporated herein by reference in its entirety. - In some embodiments, pre-operative imaging may be used to identify the anatomy to be targeted in the procedure. If desired by the surgeon the planning package will allow for the definition of a reformatted coordinate system. This reformatted coordinate system will have coordinate axes anchored to specific anatomical landmarks, such as the anterior commissure (AC) and posterior commissure (PC) for neurosurgery procedures. In some embodiments, multiple pre-operative exam images (e.g., CT or magnetic resonance (MR) images) may be co-registered such that it is possible to transform coordinates of any given point on the anatomy to the corresponding point on all other pre-operative exam images.
- As used herein, registration is the process of determining the coordinate transformations from one coordinate system to another. For example, in the co-registration of preoperative images, co-registering a CT scan to an MR scan means that it is possible to transform the coordinates of an anatomical point from the CT scan to the corresponding anatomical location in the MR scan. It may also be advantageous to register at least one exam image coordinate system to the coordinate system of a common registration fixture, such as a dynamic reference base (DRB), which may allow the
camera 200 to keep track of the position of the patient in the camera space in real-time so that any intraoperative movement of an anatomical point on the patient in the room can be detected by therobot system 100 and accounted for by compensatory movement of thesurgical robot 102. -
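- As a concrete illustration of such a coordinate transformation, the sketch below maps a single anatomical point from a CT exam space into a co-registered MR exam space; the 4x4 matrix and the point values are placeholders standing in for whatever the co-registration step actually produces:

    import numpy as np

    def transform_point(T, p):
        # Apply a 4x4 homogeneous transform to a 3D point given in millimeters.
        ph = np.append(np.asarray(p, dtype=float), 1.0)
        return (T @ ph)[:3]

    # Placeholder CT-to-MR rigid transform (small rotation about z plus a translation).
    theta = np.deg2rad(5.0)
    T_mr_from_ct = np.array([
        [np.cos(theta), -np.sin(theta), 0.0,  2.0],
        [np.sin(theta),  np.cos(theta), 0.0, -1.5],
        [0.0,            0.0,           1.0,  4.0],
        [0.0,            0.0,           0.0,  1.0],
    ])

    point_ct = [10.0, 20.0, 30.0]          # anatomical point in CT coordinates
    point_mr = transform_point(T_mr_from_ct, point_ct)

The same composition pattern extends the chain to the DRB: once an image-to-DRB transform is established, each new tracking frame of the DRB updates the image-to-camera relationship so that patient motion can be compensated.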
FIG. 3 is a flowchart diagram illustrating computer-implemented operations 300 for determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments. The operations 300 may include receiving a first image volume, such as a CT scan, from a preoperative image capture device at a first time (Block 302). The first image volume includes an anatomical feature of a patient and at least a portion of a registration fixture that is fixed with respect to the anatomical feature of the patient. The registration fixture includes a first plurality of fiducial markers that are fixed with respect to the registration fixture. The operations 300 further include determining, for each fiducial marker of the first plurality of fiducial markers, a position of the fiducial marker relative to the first image volume (Block 304). The operations 300 further include determining, based on the determined positions of the first plurality of fiducial markers, positions of an array of tracking markers on the registration fixture (fiducial registration array or FRA) with respect to the anatomical feature (Block 306). - The operations 300 may further include receiving a tracking data frame from an intraoperative tracking device comprising a plurality of tracking cameras at a second time that is later than the first time (Block 308). The tracking data frame includes positions of a plurality of tracking markers that are fixed with respect to the registration fixture (FRA) and a plurality of tracking markers that are fixed with respect to the robot. The operations 300 further include determining, based on the positions of the tracking markers of the registration fixture, a position and orientation of the anatomical feature with respect to the tracking cameras (Block 310). The operations 300 further include determining, based on the determined positions of the plurality of tracking markers on the robot, a position and orientation of the robot arm of the surgical robot with respect to the tracking cameras (Block 312).
- The operations 300 further include determining, based on the determined position and orientation of the anatomical feature with respect to the tracking cameras and the determined position and orientation of the robot arm with respect to the tracking cameras, a position and orientation of the anatomical feature with respect to the robot arm (Block 314). The operations 300 further include controlling movement of the robot arm with respect to the anatomical feature, e.g., along and/or rotationally about one or more defined axes, based on the determined position and orientation of the anatomical feature with respect to the robot arm (Block 316).
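- The fiducial-based portion of these operations (Blocks 304-306) is often realized as a least-squares rigid fit (the Kabsch/SVD method) between the fiducial positions detected in the image volume and the same fiducials' known coordinates in a model of the registration fixture; the sketch below assumes the two point lists are already in corresponding order and is illustrative rather than a description of the actual implementation:

    import numpy as np

    def fit_rigid_transform(points_fixture, points_image):
        # Least-squares rigid transform mapping fixture-model coordinates to
        # image coordinates (Kabsch algorithm). Both inputs are (N, 3) arrays
        # with rows in corresponding order, N >= 3 and not collinear.
        A = np.asarray(points_fixture, dtype=float)
        B = np.asarray(points_image, dtype=float)
        ca, cb = A.mean(axis=0), B.mean(axis=0)
        H = (A - ca).T @ (B - cb)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:            # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cb - R @ ca
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T

Because the FRA tracking markers sit at known positions in the same fixture model, the fitted transform also locates them with respect to the anatomical feature, as Block 306 requires.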
-
FIG. 4 is a diagram illustrating adata flow 400 for a multiple coordinate transformation system, to enable determining a position and orientation of an anatomical feature of a patient with respect to a robot arm of a surgical robot, according to some embodiments. In this example, data from a plurality ofexam image spaces 402, based on a plurality of exam images, may be transformed and combined into a commonexam image space 404. The data from the commonexam image space 404 and data from averification image space 406, based on a verification image, may be transformed and combined into aregistration image space 408. Data from theregistration image space 408 may be transformed into patientfiducial coordinates 410, which is transformed into coordinates for aDRB 412. A trackingcamera 414 may detect movement of the DRB 412 (represented byDRB 412′) and may also detect a location of aprobe tracker 416 to track coordinates of theDRB 412 over time. Arobotic arm tracker 418 determines coordinates for the robot arm based on transformation data from a Robotics Planning System (RPS)space 420 or similar modeling system, and/or transformation data from the trackingcamera 414. - It should be understood that these and other features may be used and combined in different ways to achieve registration of image space, i.e., coordinates from image volume, into tracking space, i.e., coordinates for use by the surgical robot in real-time. As will be discussed in detail below, these features may include fiducial-based registration such as stereotactic frames with CT localizer, preoperative CT or MRI registered using intraoperative fluoroscopy, calibrated scanner registration where any acquired scan's coordinates are pre-calibrated relative to the tracking space, and/or surface registration using a tracked probe, for example.
- In one example,
FIGS. 5A-5C illustrate asystem 500 for registering an anatomical feature of a patient. In this example, thestereotactic frame base 530 is fixed to ananatomical feature 528 of patient, e.g., the patient's head. As shown byFIG. 5A , thestereotactic frame base 530 may be affixed to the patient'shead 528 prior to registration using pins clamping the skull or other method. Thestereotactic frame base 530 may act as both a fixation platform, for holding the patient'shead 528 in a fixed position, and registration and tracking platform, for alternatingly holding theCT localizer 536 or theFRA fixture 534. TheCT localizer 536 includes a plurality of fiducial markers 532 (e.g., N-pattern radio-opaque rods or other fiducials), which are automatically detected in the image space using image processing. Due to the precise attachment mechanism of theCT localizer 536 to thebase 530, thesefiducial markers 532 are in known space relative to thestereotactic frame base 530. A 3D CT scan of the patient withCT localizer 536 attached is taken, with an image volume that includes both the patient'shead 528 and thefiducial markers 532 of theCT localizer 536. This registration image can be taken intraoperatively or preoperatively, either in the operating room or in radiology, for example. The captured 3D image dataset is stored to computer memory. - As shown by
FIG. 5B , after the registration image is captured, theCT localizer 536 is removed from thestereotactic frame base 530 and the framereference array fixture 534 is attached to thestereotactic frame base 530. Thestereotactic frame base 530 remains fixed to the patient'shead 528, however, and is used to secure the patient during surgery, and serves as the attachment point of a framereference array fixture 534. The framereference array fixture 534 includes a frame reference array (FRA), which is a rigid array of three or more trackedmarkers 539, which may be the primary reference for optical tracking. By positioning the trackedmarkers 539 of the FRA in a fixed, known location and orientation relative to thestereotactic frame base 530, the position and orientation of the patient'shead 528 may be tracked in real time. Mount points on theFRA fixture 534 andstereotactic frame base 530 may be designed such that theFRA fixture 534 attaches reproducibly to thestereotactic frame base 530 with minimal (i.e., submillimetric) variability. These mount points on thestereotactic frame base 530 can be the same mount points used by theCT localizer 536, which is removed after the scan has been taken. An auxiliary arm (such asauxiliary arm 107 ofFIG. 1B , for example) or other attachment mechanism can also be used to securely affix the patient to the robot base to ensure that the robot base is not allowed to move relative to the patient. - As shown by
FIG. 5C , a dynamic reference base (DRB) 540 may also be attached to the stereotactic frame base 530. The DRB 540 in this example includes a rigid array of three or more tracked markers 542. In this example, the DRB 540 and/or other tracked markers may be attached to the stereotactic frame base 530 and/or directly to the patient's head 528 using auxiliary mounting arms 541, pins, or other attachment mechanisms. Unlike the FRA fixture 534, which mounts in only one way for unambiguous localization of the stereotactic frame base 530, the DRB 540 in general may be attached as needed to allow unhindered surgical and equipment access. Once the DRB 540 and FRA fixture 534 are attached, registration, which was initially related to the tracking markers 539 of the FRA, can optionally be transferred or related to the tracking markers 542 of the DRB 540. For example, if any part of the FRA fixture 534 blocks surgical access, the surgeon may remove the FRA fixture 534 and navigate using only the DRB 540. However, if the FRA fixture 534 is not in the way of the surgery, the surgeon could opt to navigate from the FRA markers 539, without using a DRB 540, or may navigate using both the FRA markers 539 and the DRB 540. In this example, the FRA fixture 534 and/or DRB 540 use optical markers, the tracked positions of which are in known locations relative to the stereotactic frame base 530, similar to the CT localizer 536, but it should be understood that many other additional and/or alternative techniques may be used. -
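- The registration transfer from the FRA markers to the DRB markers described above can be expressed as one additional composition, using a single tracking frame in which the camera sees both arrays; the names below are illustrative and assume the convention T_a_from_b (mapping frame-b coordinates into frame a):

    import numpy as np

    def invert(T):
        # Invert a 4x4 rigid-body transform.
        R, t = T[:3, :3], T[:3, 3]
        Ti = np.eye(4)
        Ti[:3, :3], Ti[:3, 3] = R.T, -R.T @ t
        return Ti

    def transfer_registration(T_image_from_fra, T_camera_from_fra, T_camera_from_drb):
        # Re-base an image-to-FRA registration onto the DRB, using one tracking
        # frame in which the camera sees both the FRA and the DRB.
        T_fra_from_drb = invert(T_camera_from_fra) @ T_camera_from_drb
        return T_image_from_fra @ T_fra_from_drb   # image-to-DRB registration

After this step the FRA can be removed and navigation can continue from the DRB alone, as the paragraph above describes.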
FIGS. 6A and 6B illustrate asystem 600 for registering an anatomical feature of a patient using fluoroscopy (fluoro) imaging, according to some embodiments. In this embodiment, image space is registered to tracking space using multiple intraoperative fluoroscopy (fluoro) images taken using a trackedregistration fixture 644. The anatomical feature of the patient (e.g., the patient's head 628) is positioned and rigidly affixed in aclamping apparatus 643 in a static position for the remainder of the procedure. Theclamping apparatus 643 for rigid patient fixation can be a three-pin fixation system such as a Mayfield clamp, a stereotactic frame base attached to the surgical table, or another fixation method, as desired. Theclamping apparatus 643 may also function as a support structure for a patient tracking array orDRB 640 as well. The DRB may be attached to the clamping apparatus usingauxiliary mounting arms 641 or other means. - Once the patient is positioned, the
fluoro fixture 644 is attached to the fluoro unit's x-ray collecting image intensifier (not shown) and secured by tightening clamping feet 632. The fluoro fixture 644 contains fiducial markers (e.g., metal spheres laid out across two planes in this example, not shown) that are visible on 2D fluoro images captured by the fluoro image capture device and can be used to calculate the location of the x-ray source relative to the image intensifier, which is typically about 1 meter away contralateral to the patient, using a standard pinhole camera model. Detection of the metal spheres in the fluoro image captured by the fluoro image capture device also enables the software to de-warp the fluoro image (i.e., to remove pincushion and s-distortion). Additionally, the fluoro fixture 644 contains three or more tracking markers 646 for determining the location and orientation of the fluoro fixture 644 in tracking space. In some embodiments, software can project vectors through a CT image volume, based on a previously captured CT image, to generate synthetic images based on contrast levels in the CT image that appear similar to the actual fluoro images (i.e., digitally reconstructed radiographs (DRRs)). By iterating through theoretical positions of the fluoro beam until the DRRs match the actual fluoro shots, a match can be found between fluoro image and DRR in two or more perspectives, and based on this match, the location of the patient's head 628 relative to the x-ray source and detector is calculated. Because the tracking markers 646 on the fluoro fixture 644 track the position of the image intensifier, and the position of the x-ray source relative to the image intensifier is calculated from the metal fiducials on the fluoro fixture 644 projected on the 2D images, the positions of the x-ray source and detector in tracking space are known and the system is able to achieve image-to-tracking registration. - As shown by
FIGS. 6A and 6B , two or more shots are taken of thehead 628 of the patient by the fluoro image capture device from two different perspectives while tracking thearray markers 642 of theDRB 640, which is fixed to theregistration fixture 630 via a mountingarm 641, and trackingmarkers 646 on thefluoro fixture 644. Based on the tracking data and fluoro data, an algorithm computes the location of thehead 628 or other anatomical feature relative to the tracking space for the procedure. Through image-to-tracking registration, the location of any tracked tool in the image volume space can be calculated. - For example, in one embodiment, a first fluoro image taken from a first fluoro perspective can be compared to a first DRR constructed from a first perspective through a CT image volume, and a second fluoro image taken from a second fluoro perspective can be compared to a second DRR constructed from a second perspective through the same CT image volume. Based on the comparisons, it may be determined that the first DRR is substantially equivalent to the first fluoro image with respect to the projected view of the anatomical feature, and that the second DRR is substantially equivalent to the second fluoro image with respect to the projected view of the anatomical feature. Equivalency confirms that the position and orientation of the x-ray path from emitter to collector on the actual fluoro machine as tracked in camera space matches the position and orientation of the x-ray path from emitter to collector as specified when generating the DRRs in CT space, and therefore registration of tracking space to CT space is achieved.
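- The DRR-matching procedure described above can be sketched as a search over candidate emitter/detector poses, scoring each synthetic projection against the actual fluoro shot; the DRR generator is passed in as a stand-in, since the disclosure does not prescribe a particular renderer or similarity metric:

    import numpy as np

    def normalized_cross_correlation(a, b):
        # Similarity between a DRR and a fluoro image (both 2D arrays).
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())

    def best_pose(ct_volume, fluoro_image, candidate_poses, render_drr):
        # render_drr(ct_volume, pose) -> synthetic 2D projection (a DRR) for a
        # hypothetical emitter/detector pose; it is assumed here, not defined.
        best, best_score = None, -np.inf
        for pose in candidate_poses:
            drr = render_drr(ct_volume, pose)
            score = normalized_cross_correlation(drr, fluoro_image)
            if score > best_score:
                best, best_score = pose, score
        return best, best_score

Running the same search for two roughly orthogonal fluoro shots yields the two matched poses that, together with the tracked fluoro fixture, tie the CT volume to tracking space.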
-
FIG. 7 illustrates asystem 700 for registering an anatomical feature of a patient using an intraoperative CT fixture (ICT) and a DRB, according to some embodiments. As shown inFIG. 7 , in one application, a fiducial-based image-to-tracking registration can be utilized that uses an intraoperative CT fixture (ICT) 750 having a plurality of trackingmarkers 751 and radio-opaquefiducial reference markers 732 to register the CT space to the tracking space. After stabilizing the anatomical feature 728 (e.g., the patient's head) usingclamping apparatus 730 such as a three-pin Mayfield frame and/or stereotactic frame, the surgeon will affix theICT 750 to theanatomical feature 728,DRB 740, or clampingapparatus 730, so that it is in a static position relative to the trackingmarkers 742 of theDRB 740, which may be held in place by mountingarm 741 or other rigid means. A CT scan is captured that encompasses thefiducial reference markers 732 of theICT 750 while also capturing relevant anatomy of theanatomical feature 728. Once the CT scan is loaded in the software, the system auto-identifies (through image processing) locations of thefiducial reference markers 732 of the ICT within the CT volume, which are in a fixed position relative to the tracking markers of theICT 750, providing image-to-tracking registration. This registration, which was initially based on the trackingmarkers 751 of theICT 750, is then related to or transferred to the trackingmarkers 742 of theDRB 740, and theICT 750 may then be removed. -
FIG. 8A illustrates asystem 800 for registering an anatomical feature of a patient using a DRB and an X-ray cone beam imaging device, according to some embodiments. Anintraoperative scanner 852, such as an X-ray machine or other scanning device, may have atracking array 854 with trackingmarkers 855, mounted thereon for registration. Based on the fixed, known position of thetracking array 854 on the scanning device, the system may be calibrated to directly map (register) the tracking space to the image space of any scan acquired by the system. Once registration is achieved, the registration, which is initially based on the tracking markers 855 (e.g. gantry markers) of the scanner'sarray 854, is related or transferred to the trackingmarkers 842 of aDRB 840, which may be fixed to aclamping fixture 830 holding the patient'shead 828 by a mountingarm 841 or other rigid means. After transferring registration, the markers on the scanner are no longer used and can be removed, deactivated or covered if desired. Registering the tracking space to any image acquired by a scanner in this way may avoid the need for fiducials or other reference markers in the image space in some embodiments. -
FIG. 8B illustrates an alternative system 800′ that uses a portable intraoperative scanner, referred to herein as a C-arm scanner 853. In this example, the C-arm scanner 853 includes a c-shaped arm 856 coupled to a movable base 858 to allow the C-arm scanner 853 to be moved into place and removed as needed, without interfering with other aspects of the surgery. The arm 856 is positioned around the patient's head 828 intraoperatively, and the arm 856 is rotated and/or translated with respect to the patient's head 828 to capture the X-ray or other type of scan needed to achieve registration, at which point the C-arm scanner 853 may be removed from the patient. - Another registration method for an anatomical feature of a patient, e.g., a patient's head, may be to use a surface contour map of the anatomical feature, according to some embodiments. A surface contour map may be constructed using a navigated or tracked probe, or other measuring or sensing device, such as a laser pointer, 3D camera, etc. For example, a surgeon may drag or sequentially touch points on the surface of the head with the navigated probe to capture the surface across unique protrusions, such as zygomatic bones, superciliary arches, the bridge of the nose, eyebrows, etc. The system then compares the resulting surface contours to contours detected from the CT and/or MR images, seeking the location and orientation of the contour that provides the closest match. To account for movement of the patient and to ensure that all contour points are taken relative to the same anatomical feature, each contour point is related to tracking markers on a DRB on the patient at the time it is recorded. Since the location of the contour map is known in tracking space from the tracked probe and tracked DRB, tracking-to-image registration is obtained once the corresponding contour is found in image space.
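- Surface-contour registration of this kind is commonly implemented as an iterative-closest-point (ICP) style loop: each probe-collected contour point is matched to its nearest point on the image-derived skin surface, a rigid fit is recomputed, and the process repeats. The sketch below is a simplified illustration and borrows the least-squares helper sketched earlier rather than defining one of its own:

    import numpy as np

    def icp_contour_registration(contour_pts, surface_pts, fit_rigid_transform,
                                 iterations=30):
        # contour_pts: (N, 3) points collected with the tracked probe (DRB space).
        # surface_pts: (M, 3) points sampled from the CT/MR skin surface (image space).
        # fit_rigid_transform: least-squares rigid fit, e.g. the Kabsch helper
        # sketched earlier; passed in here to keep this sketch short.
        T = np.eye(4)
        pts = np.asarray(contour_pts, dtype=float)
        surf = np.asarray(surface_pts, dtype=float)
        for _ in range(iterations):
            moved = pts @ T[:3, :3].T + T[:3, 3]
            # Nearest image-surface point for every contour point (brute force).
            d = np.linalg.norm(moved[:, None, :] - surf[None, :, :], axis=2)
            matches = surf[d.argmin(axis=1)]
            T = fit_rigid_transform(pts, matches)
        return T   # tracking (DRB) space -> image space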
-
FIG. 9 illustrates asystem 900 for registering an anatomical feature of a patient using a navigated or tracked probe and fiducials for point-to-point mapping of the anatomical feature 928 (e.g., a patient's head), according to some embodiments. Software would instruct the user to point with a tracked probe to a series of anatomical landmark points that can be found in the CT or MR image. When the user points to the landmark indicated by software, the system captures a frame of tracking data with the tracked locations of tracking markers on the probe and on the DRB. From the tracked locations of markers on the probe, the coordinates of the tip of the probe are calculated and related to the locations of markers on the DRB. Once 3 or more points are found in both spaces, tracking-to-image registration is achieved. As an alternative to pointing to natural anatomical landmarks, fiducials 954 (i.e., fiducial markers), such as sticker fiducials or metal fiducials, may be used. The surgeon will attach thefiducials 954 to the patient, which are constructed of material that is opaque on imaging, for example containing metal if used with CT or Vitamin E if used with MR. Imaging (CT or MR) will occur after placing thefiducials 954. The surgeon or user will then manually find the coordinates of the fiducials in the image volume, or the software will find them automatically with image processing. After attaching aDRB 940 with trackingmarkers 942 to the patient through a mountingarm 941 connected to aclamping apparatus 930 or other rigid means, the surgeon or user may also locate thefiducials 954 in physical space relative to theDRB 940 by touching thefiducials 954 with a tracked probe while simultaneously recording tracking markers on the probe (not shown) and on theDRB 940. Registration is achieved because the coordinates of the same points are known in the image space and the tracking space. - One use for the embodiments described herein is to plan trajectories and to control a robot to move into a desired trajectory, after which the surgeon will place implants such as electrodes through a guide tube held by the robot. Additional functionalities include exporting coordinates used with existing stereotactic frames, such as a Leksell frame, which uses five coordinates: X, Y, Z, Ring Angle and Arc Angle. These five coordinates are established using the target and trajectory identified in the planning stage relative to the image space and knowing the position and orientation of the ring and arc relative to the stereotactic frame base or other registration fixture.
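- The export of frame coordinates can be illustrated, in simplified form, by reducing a planned entry/target pair to the target point plus two spherical angles describing the approach direction; mapping such angles onto a particular frame's Ring and Arc scales is vendor-specific and is not shown, so the sketch below is a geometric illustration only:

    import numpy as np

    def frame_coordinates(target_xyz, entry_xyz):
        # target_xyz, entry_xyz: planned points in stereotactic-frame coordinates (mm).
        # Returns the target point (the X, Y, Z settings in this simplified model)
        # plus two spherical angles describing the approach direction.
        target = np.asarray(target_xyz, dtype=float)
        entry = np.asarray(entry_xyz, dtype=float)
        v = entry - target
        v = v / np.linalg.norm(v)            # unit vector from target toward entry
        polar = np.degrees(np.arccos(np.clip(v[2], -1.0, 1.0)))   # from the +z axis
        azimuth = np.degrees(np.arctan2(v[1], v[0]))              # in the x-y plane
        return target, polar, azimuth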
- As shown in
FIG. 10 , stereotactic frames allow atarget location 1058 of an anatomical feature 1028 (e.g., a patient's head) to be treated as the center of a sphere and the trajectory can pivot about thetarget location 1058. The trajectory to thetarget location 1058 is adjusted by the ring and arc angles of the stereotactic frame (e.g., a Leksell frame). These coordinates may be set manually, and the stereotactic frame may be used as a backup or as a redundant system in case the robot fails or cannot be tracked or registered successfully. The linear x,y,z offsets to the center point (i.e., target location 1058) are adjusted via the mechanisms of the frame. Acone 1060 is centered around thetarget location 1058, and shows the adjustment zone that can be achieved by modifying the ring and arc angles of the Leksell or other type of frame. This figure illustrates that a stereotactic frame with ring and arc adjustments is well suited for reaching a fixed target location from a range of angles while changing the entry point into the skull. -
FIG. 11 illustrates a two-dimensional visualization of a virtual point rotation mechanism, according to some embodiments. In this embodiment, the robotic arm is able to create a different type of point-rotation functionality that enables a new movement mode that is not easily achievable with a 5-axis mechanical frame, but that may be achieved using the embodiments described herein. Through coordinated control of the robot's axes using the registration techniques described herein, this mode allows the user to pivot the robot's guide tube about any fixed point in space. For example, the robot may pivot about the entry point 1162 into the anatomical feature 1128 (e.g., a patient's head). This entry point pivoting is advantageous as it allows the user to make a smaller burr hole without limiting their ability to adjust the target location 1164 intraoperatively. The cone 1160 represents the range of trajectories that may be reachable through a single entry hole. Additionally, entry point pivoting is advantageous as it allows the user to reach two different target locations through a single entry point. Alternatively, the robot may pivot about a target location (such as the target location 1058 shown in FIG. 10 ) within the skull to reach the target location from different angles or trajectories, as illustrated in FIG. 10 . Such interior pivoting robotically has the same advantages as a stereotactic frame, as it allows the user to approach the same target location 1058 from multiple approaches, such as when irradiating a tumor or when adjusting a path so that critical structures such as blood vessels or nerves will not be crossed when reaching targets beyond them. Unlike a stereotactic frame, which relies on fixed ring and arc articulations to keep a target/pivot point fixed, the robot adjusts the pivot point through controlled activation of its axes, and the robot can therefore dynamically adjust its pivot point and switch as needed between the modes illustrated in FIGS. 10 and 11 . - Following the insertion of implants or instrumentation using the robot or ring and arc fixture, these and other embodiments may allow for implant locations to be verified using intraoperative imaging. Placement accuracy of the instrument or implant relative to the planned trajectory can be qualitatively and/or quantitatively shown to the user. One option for comparing planned to placed position is to merge a postoperative verification CT image with any of the preoperative images. Once pre- and post-operative images are merged and the plan is shown overlaid, the shadow of the implant on the postoperative CT can be compared to the plan to assess accuracy of placement. Detection of the shadow artifact on the post-op CT can be performed automatically through image processing and the offset displayed numerically in terms of millimeters of offset at the tip and entry and angular offset along the path. This option does not require any fiducials to be present in the verification image since image-to-image registration is performed based on bony anatomical contours.
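- The tip, entry, and angular offsets mentioned above reduce to a few vector operations once the planned and detected paths are expressed in the same merged image space; a minimal sketch, with all inputs assumed to be 3D points in millimeters:

    import numpy as np

    def placement_offsets(plan_entry, plan_tip, placed_entry, placed_tip):
        # Returns (tip offset in mm, entry offset in mm, angular offset in degrees)
        # between a planned trajectory and the trajectory detected on post-op CT.
        pe, pt = np.asarray(plan_entry, float), np.asarray(plan_tip, float)
        qe, qt = np.asarray(placed_entry, float), np.asarray(placed_tip, float)
        tip_offset = np.linalg.norm(qt - pt)
        entry_offset = np.linalg.norm(qe - pe)
        u = (pt - pe) / np.linalg.norm(pt - pe)
        v = (qt - qe) / np.linalg.norm(qt - qe)
        angle = np.degrees(np.arccos(np.clip(np.dot(u, v), -1.0, 1.0)))
        return tip_offset, entry_offset, angle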
- A second option for comparing planned position to the final placement would utilize intraoperative fluoro with or without an attached fluoro fixture. Two out-of-plane fluoro images will be taken and these fluoro images will be matched to DRRs generated from pre-operative CT or MR as described above for registration. Unlike some of the registration methods described above, however, it may be less important for the fluoro images to be tracked because the key information is where the electrode is located relative to the anatomy in the fluoro image. The linear or slightly curved shadow of the electrode would be found on a fluoro image, and once the DRR corresponding to that fluoro shot is found, this shadow can be replicated in the CT image volume as a plane or sheet that is oriented in and out of the ray direction of the fluoro image and DRR. That is, the system may not know how deep in or out of the fluoro image plane the electrode lies on a given shot, but can calculate the plane or sheet of possible locations and represent this plane or sheet on the 3D volume. In a second fluoro view, a different plane or sheet can be determined and overlaid on the 3D image. Where these two planes or sheets intersect on the 3D image is the detected path of the electrode. The system can represent this detected path as a graphic on the 3D image volume and allow the user to reslice the image volume to display this path and the planned path from whatever perspective is desired, also allowing automatic or manual calculation of the deviation from planned to placed position of the electrode. Tracking the fluoro fixture is unnecessary but may be done to help de-warp the fluoro images and calculate the location of the x-ray emitter to improve accuracy of DRR calculation, the rate of convergence when iterating to find matching DRR and fluoro shots, and placement of sheets/planes representing the electrode on the 3D scan.
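- Intersecting the two back-projected planes (or locally planar sheets) described above is a standard geometric construction; in the sketch below each plane is represented by a point on it and its unit normal in the 3D image space, and the result is a point and direction for the detected electrode path:

    import numpy as np

    def plane_intersection(p1, n1, p2, n2):
        # Each plane is given by (point on plane, unit normal). Returns a point on
        # the intersection line and the line's unit direction, or None if the
        # planes are (nearly) parallel.
        n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d = np.cross(n1, n2)
        if np.linalg.norm(d) < 1e-9:
            return None
        # Solve for a point satisfying both plane equations (n . x = n . p).
        A = np.array([n1, n2, d])
        b = np.array([np.dot(n1, p1), np.dot(n2, p2), 0.0])
        point = np.linalg.solve(A, b)
        return point, d / np.linalg.norm(d)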
- In this and other examples, it is desirable to maintain navigation integrity, i.e., to ensure that the registration and tracking remain accurate throughout the procedure. Two primary methods to establish and maintain navigation integrity include: tracking the position of a surveillance marker relative to the markers on the DRB, and checking landmarks within the images. In the first method, should this position change due to, for example, the DRB being bumped, then the system may alert the user of a possible loss of navigation integrity. In the second method, if a landmark check shows that the anatomy represented in the displayed slices on screen does not match the anatomy at which the tip of the probe points, then the surgeon will also become aware that there is a loss of navigation integrity. In either method, if using the registration method of CT localizer and frame reference array (FRA), the surgeon has the option to re-attach the FRA, which mounts in only one possible way to the frame base, and to restore tracking-to-image registration based on the FRA tracking markers and the stored fiducials from the
CT localizer 536. This registration can then be transferred or related to tracking markers on a repositioned DRB. Once registration is transferred the FRA can be removed if desired. - Referring now to
FIGS. 12-18 generally, with reference to thesurgical robot system 100 shown inFIG. 1A , end-effector 112 may be equipped with components, configured, or otherwise include features so that one end-effector may remain attached to a given one ofrobot arms 104 without changing to another end-effector for multiple different surgical procedures, such as, by way of example only, Deep Brain Stimulation (DBS), Stereoelectroencephalography (SEEG), or Endoscopic Navigation and Tumor Biopsy. As discussed previously, end-effector 112 may be orientable to oppose an anatomical feature of a patient in the manner so as to be in operative proximity thereto, and, to be able to receive one or more surgical tools for operations contemplated on the anatomical feature proximate to the end-effector 112. Motion and orientation of end-effector 112 may be accomplished through any of the navigation, trajectory guidance, or other methodologies discussed herein or as may be otherwise suitable for the particular operation. - End-
effector 112 is suitably configured to permit a plurality ofsurgical tools 129 to be selectively connectable to end-effector 112. Thus, for example, a stylet 113 (FIG. 13 ) may be selectively attached in order to localize an incision point on an anatomical feature of a patient, or an electrode driver 115 (FIG. 14 ) may be selectively attached to the same end-effector 112. - With reference to the previous discussion of robot
surgical system 100, a processor circuit, as well as memory accessible by such processor circuit, includes various subroutines and other machine-readable instructions configured to cause, when executed, end-effector 112 to move, such as by GPS movement, relative to the anatomical feature, at predetermined stages of associated surgical operations, whether pre-operative, intra-operative or post-operative. - End-
effector 112 includes various components and features to either prevent or permit end-effector movement depending on whether and whichtools 129, if any, are connected to end-effector 112. Referring more particularly toFIG. 12 , end-effector 112 includes a tool-insert locking mechanism 117 located on and connected toproximal surface 119. Tool-insert locking mechanism 117 is configured so as to secure any selected one of a plurality of surgical tools, such as theaforesaid stylet 113,electrode driver 115, or any other tools for different surgeries mentioned previously or as may be contemplated by other applications of this disclosure. The securement of the tool by tool-insert locking mechanism 117 is such that, for any of multiple tools capable of being secured to lockingmechanism 117, each such tool is operatively and suitably secured at the predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient, such that multiple tools may be secured to the same end-effector 112 in respective positions appropriate for the contemplated procedure. - Another feature of the end-
effector 112 is atool stop 121 located ondistal surface 123 of end-effector 112, that is, the surface generally opposing the patient.Tool stop 121 has astop mechanism 125 and asensor 127 operatively associated therewith, as seen with reference toFIGS. 16, 19, and 20 .Stop mechanism 125 is mounted to end-effector 112 so as to be selectively movable relative thereto between an engaged position to prevent any of the tools from being connected to end-effector 112 and a disengaged position which permits any of thetools 129 to be selectively connected to end-effector 112.Sensor 127 may be located on or within the housing of end-effector 112 at any suitable location (FIGS. 12, 14, 16 ) so thatsensor 127 detects whetherstop mechanism 125 is in the engaged or disengaged position.Sensor 127 may assume any form suitable for such detection, such as any type of mechanical switch or any type of magnetic sensor, including Reed switches, Hall Effect sensors, or other magnetic field detecting devices. In one possible implementation,sensor 127 has two portions, a Hall Effect sensor portion (not shown) and amagnetic portion 131, the two portions moving relative to each other so as to generate and detect two magnetic fields corresponding to respective engaged and disengaged position. In the illustrated implementation, the magnetic portion comprises tworare earth magnets 131 which move relative to the complementary sensing portion (not shown) mounted in the housing ofend effector 112 in operative proximity tomagnets 131 to detect change in the associated magnetic field from movement ofstop mechanism 125 between engaged and disengaged positions. In this implementation the Hall effect sensor is bipolar and can detect whether a North pole or South pole of a magnet opposes the sensor.Magnets 131 are configured so that the North pole of one magnet faces the path of the sensor and the South pole of the other magnet faces the path of the sensor. In this configuration, the sensor senses an increased signal when it is near one magnet (for example, in disengaged position), a decreased signal when it is near the other magnet (for example, in engaged position), and unchanged signal when it is not in proximity to any magnet. In this implementation, in response to detection ofstop mechanism 125 being in the disengaged position shown inFIGS. 13 and 19 ,sensor 127 causes the processor ofsurgical robot system 100 to execute suitable instructions to prevent movement of end-effector 112 relative to the anatomical feature. Such movement prevention may be appropriate for any number of reasons, such as when a tool is connected to end-effector 112, such tool potentially interacting with the anatomical feature of the patient. - Another implementation of a
sensor 127 for detecting engaged or disengagedtool stop mechanism 125 could comprise a single magnet behind the housing (not shown) and two Hall Effect sensors located wheremagnets 131 are shown in the preferred embodiment. In such a configuration, monopolar Hall Effect sensors are suitable and would be configured so thatSensor 1 detects a signal when the magnet is in proximity due to the locking mechanism being disengaged, while Sensor 2 detects a signal when the same magnet is in proximity due to the locking mechanism being engaged. Neither sensor would detect a signal when the magnet is between positions or out of proximity to either sensor. Although a configuration could be conceived in which a sensor is active for engaged position and inactive for disengaged position, a configuration with three signals indicating engaged, disengaged, or transitional is preferred to ensure correct behavior in case of power failure. - End-
effector 112,tool stop 121, and tool-insert locking mechanism 117 each have co-axially aligned bores or apertures such that any selected one of the plurality ofsurgical tools 129 may be received through such bores and apertures. In this implementation end-effector has abore 133 and tool stop 121 and tool-insert locking mechanism 117 haverespective apertures Stop mechanism 125 includes aring 139 axially aligned withbore 133 andaperture 135 oftool stop 121.Ring 139 is selectively, manually rotatable in the directions indicated by arrow A (FIG. 16 ) so as to movestop mechanism 125 between the engaged position and the disengaged position. - In one possible implementation, the selective rotation of
ring 139 includes features which enablering 139 to be locked in either the disengaged or engaged position. So, for example, as illustrated, adetent mechanism 141 is located on and mounted to ring 139 in any suitable way to lockring 139 against certain rotational movement out of a predetermined position, in this case, such position being whenstop mechanism 125 is in the engaged position. Although various forms of detent mechanism are contemplated herein, one suitable arrangement has a manually accessible head extending circumferentially outwardly fromring 139 and having a male protrusion (not shown) spring-loaded axially inwardly to engage a corresponding female detent portion (not shown).Detent mechanism 141, as such, is manually actuatable to unlockring 139 from its engaged position to permitring 139 to be manually rotated to causestop mechanism 125 to move from the engaged position (FIG. 20 ) to the disengaged position (FIG. 19 ). -
Tool stop 121 includes a lever arm 143 pivotally mounted adjacent aperture 135 of tool stop 121 so that the end of lever arm 143 selectively pivots in the directions indicated by arrow B (FIGS. 16, 19 and 20 ). Lever arm 143 is operatively connected to stop mechanism 125, meaning that it closes aperture 135 of tool stop 121 in response to stop mechanism 125 being in the engaged position, as shown in FIG. 20 . Lever arm 143 is also operatively connected so as to pivot back in the direction of arrow B to open aperture 135 in response to stop mechanism 125 being in the disengaged position. As such, movement of stop mechanism 125 between the engaged and disengaged positions results in closure or opening of aperture 135, respectively, by lever arm 143. -
Lever arm 143, in this implementation, is not only pivotally mountedadjacent aperture 135, but also pivots in parallel with a distal plane defined at a distal-most point ofdistal surface 123 of end-effector 112. In this manner, any one of thesurgical tools 129, which is attempted to be inserted throughbore 133 andaperture 135, is stopped from being inserted past the distal plane in whichlever arm 143 rotates to closeaperture 135. - Turning now to tool-insert locking mechanism 117 (
FIG. 13, 17, 18 ), a connector 145 is configured to mate with and secure any one of the surgical tools 129 at its appropriate height, angle of orientation, and rotational position relative to the anatomical feature of the patient. In the illustrated implementation, connector 145 comprises a rotatable flange 147 which has at least one slot 149 formed therein to receive therethrough a corresponding tongue 151 associated with a selected one of the plurality of tools 129. So, for example, in FIG. 14 , the particular electrode driver 115 has multiple tongues, one of which tongue 151 is shown. Rotatable flange 147, in some implementations, may comprise a collar 153, which collar, in turn, has multiple ones of slots 149 radially spaced on a proximally oriented surface 155, as best seen in FIG. 12 . Multiple slots 149 arranged around collar 153 are sized or otherwise configured so as to receive therethrough corresponding ones of multiple tongues 151 associated with a selected one of the plurality of tools 129. Therefore, as seen in FIG. 13 , multiple slots 149 and corresponding tongues 151 may be arranged to permit securing of a selected one of the plurality of tools 129 only when the selected tool is in the correct, predetermined angle of orientation and rotational position relative to the anatomical feature of the patient. Similarly, with regard to the electrode driver shown in FIG. 14 , tongues 151 (one of which is shown in a cutaway of FIG. 14 ) have been received in radially spaced slots 149 arrayed so that electrode driver 115 is received at the appropriate angle of orientation and rotational position. -
Rotatable flange 147 has, in this implementation, agrip 173 to facilitate manual rotation between an open and closed position as shown inFIGS. 17 and 18 , respectively. As seen inFIG. 17 , multiple sets ofmating slots 149 andtongues 151 are arranged at different angular locations, in this case, locations which may be symmetric about a single diametric chord of a circle but otherwise radially asymmetric, and at least one of the slots has a different dimension or extends through a different arc length than other slots. In this slot-tongue arrangement, and any number of variations contemplated by this disclosure, there is only one rotational position of the tool 129 (oradapter 155 discussed later) to be received in tool-insert locking mechanism 117 whenrotatable flange 147 is in the open position shown inFIG. 17 . In other words, when the user ofsystem 100 moves a selected tool 129 (or tool adapter 155) to a single appropriate rotational position, correspondingtongues 151 may be received throughslots 149. Upon placement oftongues 151 intoslots 149,tongues 151 confront abase surface 175 withinconnector 145 ofrotatable flange 147. Upon receivingtongues 151 intoslots 149 and having them rest onunderlying base surface 175, dimensions oftongues 151 andslots 149, especially with regard to height relative torotatable flange 147, are selected so that whenrotatable flange 147 is rotated to the closed position,flange portions 157 are radially translated to overlie or engage portions oftongues 151, such engagement shown inFIG. 18 and affixing tool 129 (or adapter 155) received inconnector 145 at the desired, predetermined height, angle of orientation, and rotational position relative to the anatomical feature of the patient. -
Tongues 151 described as being associated withtools 129 may either be directly connected tosuch tools 129, and/ortongues 151 may be located on and mounted to the above-mentionedadapter 155, such as that shown inFIGS. 12, 17 and 18 ,such adapter 155 configured to interconnect at least one of the plurality ofsurgical tools 129 with end-effector 112. In the described implementation,adapter 155 includes two operative portions—atool receiver 157 adapted to connect the selected one or moresurgical tools 129, and the second operative part being one ormore tongues 151 which may, in this implementation, be mounted and connected to the distal end ofadapter 155. -
Adapter 155 has an outer perimeter 159 which, in this implementation, is sized to oppose an inner perimeter 161 of rotatable flange 147. Adapter 155 extends between its proximal and distal ends, with adapter bore 167 extending therethrough between those ends and sized to receive corresponding ones of the surgical tools 129; similarly, the distance between the proximal and distal ends is selected so that each of the tools 129 is secured to end-effector 112 at the predetermined, appropriate height for the surgical procedure associated with such tool received in adapter bore 167. - In one possible implementation,
system 100 includes multiple ones ofadapter 155, configured to be interchangeable inserts 169 having substantially the same, predeterminedouter perimeters 159 to be received withininner perimeter 161 ofrotatable flange 147. Still further in such implementation, the interchangeable inserts 169 have bores of different, respective diameters, which bores may be selected to receive corresponding ones of thetools 129 therein. Bores 167 may comprise cylindrical bushings having inner diameters common to multiplesurgical tools 129. One possible set of diameters for bores 167 may be 12, 15, and 17 millimeters, suitable for multiple robotic surgery operations, such as those identified in this disclosure. - In the illustrated implementation,
inner perimeter 161 ofrotatable flange 147 andouter perimeter 159 ofadapter 155 are circular, having central, aligned axes and corresponding radii.Slots 149 ofrotatable flange 147 extend radially outwardly from the central axis ofrotatable flange 147 in the illustrated implementation, whereastongues 151 ofadapter 155 extend radially outwardly fromadapter 155. - In still other implementations, end-
effector 112 may be equipped with at least one illumination element 171 (FIGS. 14 and 15 ) orientable toward the anatomical feature to be operated upon.Illumination element 171 may be in the form of a ring of LEDs 177 (FIG. 14 ) located within adapter 167, which adapter is in the form of a bushing secured totool locking mechanism 117.Illumination element 171 may also be asingle LED 179 mounted on thedistal surface 123 of end-effector 112. Whether in the form ofLED ring 177 or asingle element LED 179 mounted on distal surface of end-effector 112, or any other variation, the spacing and location of illumination element orelements 171 may be selected so thattools 129 received throughbore 133 of end-effector 112 do not cast shadows or otherwise interfere with illumination fromelement 171 of the anatomical feature being operated upon. - The operation and associated features of end-
effector 112 are readily apparent from the foregoing description.Tool stop 121 is rotatable, selectively lockable, and movable between engaged and disengaged positions, and a sensor prevents movement of end-effector 112 when in such disengaged position, due to the potential presence of a tool which may not be advisably moved during such disengaged position. Tool-insert locking mechanism 117 is likewise rotatable between open and closed positions to receive one of a plurality of interchangeable inserts 169 andtongues 151 of such inserts, wherein selectedtools 129 may be received in such inserts 169; alternately,tongues 151 may be otherwise associated withtools 129, such as by havingtongues 151 directly connected tosuch tools 129, which tongue-equipped tools likewise may be received in correspondingslots 149 of tool-insert locking mechanism 117. Tool-insert locking mechanism 117 may be rotated from its open position in whichtongues 151 have been received inslots 149, to secure associatedadapters 155 and/ortools 129 so that they are at appropriate, respective heights, angles of orientation, and rotational positions relative to the anatomical feature of the patient. - For those implementations with
- For those implementations with multiple adapters 155, the dimensions of such adapters 155, including bore diameters, height, and other suitable dimensions, are selected so that a single or a minimized number of end-effectors 112 can be used for a multiplicity of surgical tools 129. Adapters 155, such as those in the form of interchangeable inserts 169 or cylindrical bushings, may facilitate connecting an expanded set of surgical tools 129 to the end-effector 112, and thus likewise facilitate a corresponding expanded set of associated surgical features using the same end-effector 112. - Another possible embodiment of
surgical robot system 100 shown in FIG. 1A is described below and shown with reference to FIGS. 21-25. In this implementation, end-effector 212 is suitably connected to a robot arm, such as that described previously with reference to robot arm 104, and is orientable to oppose an anatomical feature a so as to be in operative proximity thereto. End-effector 212 may include features similar to those described with reference to end-effector 112 of FIGS. 12-18, such as a tool-insert locking mechanism 217 and tool stop 221, which correspond to tool-insert locking mechanism 117 and tool stop 121 described previously. However, it will be appreciated that such features are not required in end-effector 212 and various other features or configurations of end-effector 212 are contemplated by the disclosure with reference to FIGS. 19-24. - End-
effector 212 has a surgical tool comprising a drill 223 selectively connectible thereto. Processing circuitry, memory, and suitable machine-readable instructions are associated with this implementation of surgical robot system 100 so as to determine, for drill 223, a drill target and an associated drill trajectory relative to anatomical feature a. Suitable instructions are likewise provided such that, when executed, the end-effector may be automatically moved to a position corresponding to the determined target and determined drill trajectory. Such instructions also permit advancement of the drill toward the target. Manual manipulations of end-effector 212 to drill locations and trajectories are likewise contemplated.
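- One way to picture "a position corresponding to the determined target and determined drill trajectory" is to place the guide axis on the planned trajectory at a standoff distance proximal to the target. The sketch below is a minimal illustration under that assumption; the function name, standoff value, and coordinate conventions are not taken from the disclosure.

```python
# Minimal sketch, assuming a target point and a unit trajectory vector expressed in one
# common frame (e.g., image or robot base): the end-effector guide is positioned back
# along the trajectory so that the guide axis passes through the target.
import numpy as np

def end_effector_position(target_mm: np.ndarray,
                          trajectory_unit: np.ndarray,
                          standoff_mm: float = 80.0) -> np.ndarray:
    """Place the guide on the planned trajectory, offset proximally from the target."""
    trajectory_unit = trajectory_unit / np.linalg.norm(trajectory_unit)
    return target_mm - standoff_mm * trajectory_unit

target = np.array([10.0, -25.0, 140.0])          # planned drill target (mm)
direction = np.array([0.0, 0.0, 1.0])            # planned trajectory toward the target
print(end_effector_position(target, direction))  # [ 10. -25.  60.]
```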
- A drill guide 225 is releasably connected between drill 223 and end-effector 212. As explained below, drill guide 225 may be configured with features to stop advancement of drill 223 at a preselected drill depth associated with the determined target. Drill guide 225 may be further configured with features to guide the drill along the determined drill trajectory during the advancement of the drill. -
Drill guide 225 includes a guide shaft 227 having a bore 229 extending longitudinally therethrough between proximal and distal ends of guide shaft 227. Bore 229 is sized to slideably receive a drill bit 231 which extends distally from a chuck 233 of drill 223. Such drill bit 231 terminates in a drill tip 235 capable of engaging the anatomical feature a upon a suitable amount of advancement of drill bit 231 along the determined drill trajectory. - Referring more particularly to
FIG. 22, drill bit 231 has a known, that is, predetermined, length A corresponding to the distance between the distal surface 237 of chuck 233 and the end of drill tip 235. Guide shaft 227, in turn, has a known, that is, predetermined, second length B. Drill guide 225 includes a depth stop 239 having respective proximal and distal depth stop ends. Depth stop 239 is mounted so that its distal depth stop end is selectively and slideably received at the proximal end of guide shaft 227. As such, depth stop 239, when slid relative to guide shaft 227, varies the second length B of guide shaft 227. In particular, movement of depth stop 239 varies the predetermined length B of guide shaft 227 among a plurality or set of length values corresponding to amounts by which drill tip 235 extends beyond the distal end of guide shaft 227, such amounts thus corresponding to available drill depths. Such selected lengths B are less than the known, predetermined length A of drill bit 231, thereby permitting the distal end of drill bit 231 and its drill tip 235 to extend distally from the distal end of guide shaft 227 by selected lengths corresponding to desired depths of engagement of the anatomical feature. In the disclosed implementation, depth stop 239 has surface 241 oriented and located to engage the distal surface 237 of chuck 233 of drill 223. Accordingly, the depth to which drill tip 235 is advanceable by drill 223 is limited to an amount corresponding to the position of depth stop 239.
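- The depth relationship described above reduces to simple arithmetic: with drill-bit length A (chuck surface 237 to drill tip 235) and effective guide length B (guide shaft 227 as extended by depth stop 239), the tip can protrude at most A − B beyond the guide's distal end, which is the achievable drill depth. The sketch below works an example; the numeric values and function names are illustrative assumptions.

```python
# Worked sketch of the A/B depth relationship. Values are illustrative only.
def max_drill_depth_mm(bit_length_a_mm: float, guide_length_b_mm: float) -> float:
    """Depth by which drill tip 235 can extend past the distal end of guide shaft 227."""
    return bit_length_a_mm - guide_length_b_mm

def guide_length_for_depth_mm(bit_length_a_mm: float, desired_depth_mm: float) -> float:
    """Effective length B to which depth stop 239 should be set for a desired depth."""
    return bit_length_a_mm - desired_depth_mm

A = 180.0                                   # example bit length A (mm)
print(max_drill_depth_mm(A, 172.0))         # 8.0 mm of exposed tip
print(guide_length_for_depth_mm(A, 15.0))   # set B to 165.0 mm for a 15 mm depth
```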
- In one possible implementation, drill guide 225 includes a visually perceptible depth indicator 251 operatively associated with depth stop 239. By way of example, depth stop 239 has a longitudinal stem 245 having indicia 243 disposed thereon, so that slideable movement of depth stop 239 slides longitudinal stem 245 and the indicia 243. Guide shaft 227, in turn, has portions forming a window 247 sized and located to reveal at least a portion of longitudinal stem 245 bearing indicia 243. The portions adjacent to window 247 have one or more structures or indicia thereon to form a pointer 249. Indicia 243, in this case, includes a graduated, numerical value scale associated with available depths of penetration of anatomical feature a, the pointer and numerical value scale being moveable relative to each other, in this case by sliding longitudinal stem 245 of depth stop 239 relative to window 247 of guide shaft 227.
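- The indicator relationship is linear in the simplest case: sliding stem 245 moves indicia 243 past pointer 249 at window 247, so the value shown is a linear function of stem displacement. The sketch below assumes a zero offset and a 1:1 scale purely for illustration.

```python
# Illustrative sketch of the pointer/scale relationship for depth indicator 251.
def indicated_depth_mm(stem_displacement_mm: float,
                       zero_offset_mm: float = 0.0,
                       mm_per_mm_of_travel: float = 1.0) -> float:
    """Depth value of indicia 243 that lines up with pointer 249 for a given displacement."""
    return (stem_displacement_mm - zero_offset_mm) * mm_per_mm_of_travel

print(indicated_depth_mm(12.5))  # 12.5 mm visible at window 247
```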
- It will be appreciated that the above-described relative arrangement of pointer 249 and graduated depths of advancement appearing as numerical values in indicia 243 may be disposed in alternative configurations, such as having the pointer on the slideable longitudinal stem 245 and the graduated depths of advancement disposed longitudinally along portions of window 247. Still further variations are possible. - In a still further possible implementation,
drill guide 225 makes use of an engagement mechanism 253 which sets depth stop 239 at one of the available, selectable, longitudinal positions corresponding to the desired or predetermined drill depth. Engagement mechanism 253 may include a ratchet assembly 255 which has features, such as mating teeth 257 as shown, to engage and set depth stop 239 at one of the longitudinally spaced locations between the proximal and distal ends of depth stop 239. In the ratchet assembly 255 shown, teeth 257 are longitudinally disposed along a suitable outer surface of longitudinal stem 245, and a ratchet 259 with a confronting surface feature, such as one or more mating teeth 257, is located to oppose teeth 257 disposed on stem 245 and is spring-biased to selectively engage stem 245.
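- Because the ratchet can only hold the depth stop at discrete, longitudinally spaced tooth positions, a requested depth is realised as the nearest available detent. The sketch below illustrates that snapping behaviour; the 2 mm tooth pitch and the depth range are assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the detent behaviour of engagement mechanism 253.
def nearest_detent_depth_mm(desired_depth_mm: float,
                            tooth_pitch_mm: float = 2.0,
                            min_depth_mm: float = 0.0,
                            max_depth_mm: float = 30.0) -> float:
    """Snap a desired depth to the closest depth reachable at a ratchet tooth."""
    clamped = min(max(desired_depth_mm, min_depth_mm), max_depth_mm)
    return min_depth_mm + round((clamped - min_depth_mm) / tooth_pitch_mm) * tooth_pitch_mm

print(nearest_detent_depth_mm(14.7))  # -> 14.0
```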
- Ratchet 259 and corresponding ratchet assembly 255 may be operated by a spring-biased trigger 261. In the illustrated embodiment, trigger 261 is manually pulled against a spring-biasing force to disengage engagement mechanism 253 from a first one of the longitudinal positions to which it had been previously set. During such disengagement, depth stop 239 is operated to slide longitudinal stem 245 relative to guide shaft 227 to a selected or desired drill depth as indicated by pointer 249 relative to the scale of indicia 243. After movement of depth stop 239 to the desired longitudinal position, spring-biased trigger 261 may be released, allowing the spring-biasing force to set engagement mechanism 253 at a second one of the available longitudinal positions, the second longitudinal position corresponding to the desired drill depth. - Once a desired drill depth has been set by the ratchet assembly or other features of
engagement mechanism 253, such setting may be locked or secured against inadvertent movement by a trigger lock 263 which is operatively connected to trigger 261, meaning that, when engaged, trigger lock 263 holds ratchet 259 engaged with corresponding teeth 257 of the depth stop. In the illustrated implementation, trigger lock 263 may be in the form of a locking ring 265, which may be threadably or otherwise engaged to act as a stop against disengagement of mating teeth 257 of ratchet 259, or more generally, to prevent radially outward movement of ratchet 259 relative to the longitudinal axis of depth stop 239.
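- The trigger and trigger-lock sequence described above can be summarised as a small state machine: the depth stop can only slide while the trigger is pulled, and the trigger cannot be pulled while the lock is engaged. The class and method names below are illustrative assumptions, not a disclosed interface.

```python
# Hypothetical state sketch of the trigger 261 / trigger lock 263 sequence.
class DepthStopSetting:
    def __init__(self, depth_mm: float = 0.0):
        self.depth_mm = depth_mm
        self.trigger_pulled = False
        self.lock_engaged = False

    def pull_trigger(self):
        if self.lock_engaged:
            raise RuntimeError("Release trigger lock 263 before pulling trigger 261")
        self.trigger_pulled = True          # ratchet 259 disengaged from teeth 257

    def slide_to(self, depth_mm: float):
        if not self.trigger_pulled:
            raise RuntimeError("Depth stop 239 is held by ratchet 259; pull trigger first")
        self.depth_mm = depth_mm            # stem 245 slid relative to guide shaft 227

    def release_trigger(self):
        self.trigger_pulled = False         # spring bias re-engages ratchet 259

    def engage_lock(self):
        if self.trigger_pulled:
            raise RuntimeError("Release trigger 261 before engaging trigger lock 263")
        self.lock_engaged = True            # locking ring 265 prevents disengagement

setting = DepthStopSetting()
setting.pull_trigger()
setting.slide_to(15.0)
setting.release_trigger()
setting.engage_lock()
print(setting.depth_mm)  # 15.0
```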
- Actuation of engagement mechanism 253, including, for example, engagement and disengagement of depth stop 239, may be facilitated by providing drill guide 225 with a handle 267. Handle 267 may be pulled radially outwardly from the longitudinal axis of drill guide 225 and, by virtue of connection to engagement mechanism 253, such outward pulling of handle 267 overcomes the longitudinally inward spring-biasing force and disengages engagement mechanism 253 from depth stop 239. Conversely, release of handle 267 after desired sliding of depth stop 239 relative to guide shaft 227 operates to set drill guide 225 at a desired drill depth. - In addition to limiting penetration of drill 223 to a desired drill depth,
drill guide 225 may include features to reduce deviation of drill bit 231 from the determined drill trajectory β (FIG. 21). In one suitable implementation, such trajectory-guiding features comprise at least one bushing 269 disposed within bore 229. Bushing or bushings 269 are sized to receive outer longitudinal surface 273 of drill bit 231 slideably and rotatably therethrough. Accordingly, bushing 269 has an internal diameter sized to not only moveably engage longitudinal surface 273, but to thereby exert a force, shown having the direction of arrow F, opposing lateral displacement of drill bit 231 within guide shaft 227 (FIG. 22). In one possible implementation, one bushing 269 is located within guide shaft 227 toward the distal end thereof and thereby engages drill bit 231 closer to where drill tip 235 engages anatomical feature a. Such distal locations of drill bit 231 often experience greater lateral displacement forces by virtue of their proximity to anatomical feature a to be engaged, especially in the event the drill trajectory is at an oblique angle. - In the illustrated embodiment, a
second bushing 269 is disposed at a proximal location along guide shaft 227 and thereby may generate a countervailing force opposing lateral displacement at the proximal end of drill bit 231, such as would result from a cantilevering of drill bit 231 upon oblique engagement of anatomical feature a.
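- The benefit of spacing two bushings apart can be illustrated with a rough rigid-body estimate: with radial clearance c at each bushing 269, bushing span L_s, and tip extension L_e beyond the distal bushing, the worst-case tilt is roughly 2c / L_s and the worst-case tip offset roughly c + (2c / L_s)·L_e. This is a simplified model with assumed numbers, not a tolerance analysis from the disclosure.

```python
# Simplified, illustrative estimate of lateral play of a rigid drill bit held by two
# bushings 269 within guide shaft 227. All numbers are assumptions for illustration.
def worst_case_tip_offset_mm(clearance_mm: float, bushing_span_mm: float,
                             tip_extension_mm: float) -> float:
    tilt = 2.0 * clearance_mm / bushing_span_mm              # radians, small-angle
    return clearance_mm + tilt * tip_extension_mm

# Example: 0.05 mm clearance, bushings 120 mm apart, tip extending 20 mm past the guide.
print(round(worst_case_tip_offset_mm(0.05, 120.0, 20.0), 4))  # ~0.0667 mm
```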
- From the foregoing description, operations and related methods of surgical robot system 100 and its drill guide 225 will be readily appreciated. For example, in one method of operation, drill 223 may be guided during robot-assisted surgery, including any of the surgeries described herein on anatomical feature a of a patient. A drill target and an associated drill trajectory are determined, such as by computer processing. End-effector 212 is manually or automatically moved to a position corresponding to the determined target and the determined drill trajectory. After or before such movement of end-effector 212, depth stop 239 may be mechanically or manually set to one of a plurality of selectable positions corresponding to the desired depth of advancement of the drill associated with the contemplated surgery on the anatomical feature. In this way, drill 223 is limited from penetrating the anatomical feature beyond the selected, desired depth. - In one possible method, the displacement of the depth stop is done with the aid of visually perceptible indicia corresponding to graduated depths of advancement of the drill and, upon such visual perception of the desired depth, the depth stop is secured at such desired position. To set the depth stop, a spring-biased
trigger 261 is disengaged relative to depth stop 239 from a first position, and depth stop 239 is then longitudinally slid relative to visually perceptible indicia 243 to a second position, such second position corresponding to the desired position of the depth stop and the desired depth of drilling on anatomical feature a. Once the desired depth has been selected by movement of depth stop 239, trigger 261 is released to re-engage depth stop 239 at the desired position. Thereafter, trigger lock 263 may be further engaged to avoid inadvertent disengagement and potential movement of depth stop 239 and thereby avoid over-drilling. - In the above description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.
- It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.
- As used herein, the terms "comprise", "comprising", "comprises", "include", "including", "includes", "have", "has", "having", or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation "e.g.", which derives from the Latin phrase "exempli gratia," may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation "i.e.", which derives from the Latin phrase "id est," may be used to specify a particular item from a more general recitation.
- Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
- These computer program instructions may also be stored in a tangible computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as “circuitry,” “a module” or variants thereof.
- It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.
- Although several embodiments of inventive concepts have been disclosed in the foregoing specification, it is understood that many modifications and other embodiments of inventive concepts will come to mind to one skilled in the art to which inventive concepts pertain, having the benefit of teachings presented in the foregoing description and associated drawings. It is thus understood that inventive concepts are not limited to the specific embodiments disclosed hereinabove, and that many modifications and other embodiments are intended to be included within the scope of the appended claims. It is further envisioned that features from one embodiment may be combined or used with features from a different embodiment(s) described herein. Moreover, although specific terms are employed herein, as well as in the claims which follow, they are used only in a generic and descriptive sense, and not for the purposes of limiting the described inventive concepts, nor the claims which follow. The entire disclosure of each patent and patent publication cited herein is incorporated by reference herein in its entirety, as if each such patent or publication were individually incorporated by reference herein. Various features and/or potential advantages of inventive concepts are set forth in the following claims.
Claims (15)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/695,310 US20200297357A1 (en) | 2019-03-22 | 2019-11-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
CN202011345423.9A CN112932668A (en) | 2019-11-26 | 2020-11-26 | Surgical robotic system for performing surgery on anatomical features of a patient |
EP20210131.7A EP3827760B1 (en) | 2019-11-26 | 2020-11-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related devices |
JP2020195918A JP7239545B2 (en) | 2019-11-26 | 2020-11-26 | Systems for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US17/662,437 US11944325B2 (en) | 2019-03-22 | 2022-05-09 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/361,863 US20200297426A1 (en) | 2019-03-22 | 2019-03-22 | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices |
US16/452,737 US11317978B2 (en) | 2019-03-22 | 2019-06-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US16/695,310 US20200297357A1 (en) | 2019-03-22 | 2019-11-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/452,737 Continuation-In-Part US11317978B2 (en) | 2019-03-22 | 2019-06-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/662,437 Division US11944325B2 (en) | 2019-03-22 | 2022-05-09 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200297357A1 true US20200297357A1 (en) | 2020-09-24 |
Family
ID=72516229
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/695,310 Abandoned US20200297357A1 (en) | 2019-03-22 | 2019-11-26 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
US17/662,437 Active US11944325B2 (en) | 2019-03-22 | 2022-05-09 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/662,437 Active US11944325B2 (en) | 2019-03-22 | 2022-05-09 | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices |
Country Status (1)
Country | Link |
---|---|
US (2) | US20200297357A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200188134A1 (en) * | 2018-12-14 | 2020-06-18 | Howmedica Osteonics Corp. | Augmented, Just-in-Time, Patient-Specific Implant Manufacture |
US20220104901A1 (en) * | 2020-10-07 | 2022-04-07 | Zimmer, Inc. | Depth limiter for robotically assisted arthroplasty |
US20220168055A1 (en) * | 2020-11-30 | 2022-06-02 | Medtech S.A. | Hybrid control of surgical robot for fast positioning onto planned trajectories |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3827778A1 (en) * | 2019-11-28 | 2021-06-02 | DePuy Ireland Unlimited Company | Surgical system and method for triggering a position change of a robotic device |
US20240109195A1 (en) * | 2022-10-04 | 2024-04-04 | Aescape, Inc. | Method and system for electromechanical safety for robotic manipulators |
Family Cites Families (606)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2614083B2 (en) | 1976-04-01 | 1979-02-08 | Siemens Ag, 1000 Berlin Und 8000 Muenchen | X-ray film device for the production of transverse slice images |
US5354314A (en) | 1988-12-23 | 1994-10-11 | Medical Instrumentation And Diagnostics Corporation | Three-dimensional beam localization apparatus and microscope for stereotactic diagnoses or surgery mounted on robotic type arm |
US5246010A (en) | 1990-12-11 | 1993-09-21 | Biotrine Corporation | Method and apparatus for exhalation analysis |
US5417210A (en) | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5631973A (en) | 1994-05-05 | 1997-05-20 | Sri International | Method for telemanipulation with telepresence |
US6963792B1 (en) | 1992-01-21 | 2005-11-08 | Sri International | Surgical method |
US5657429A (en) | 1992-08-10 | 1997-08-12 | Computer Motion, Inc. | Automated endoscope system optimal positioning |
US5397323A (en) | 1992-10-30 | 1995-03-14 | International Business Machines Corporation | Remote center-of-motion robot for surgery |
EP0699053B1 (en) | 1993-05-14 | 1999-03-17 | Sri International | Surgical apparatus |
US5423832A (en) | 1993-09-30 | 1995-06-13 | Gildenberg; Philip L. | Method and apparatus for interrelating the positions of a stereotactic Headring and stereoadapter apparatus |
IL107523A (en) | 1993-11-07 | 2000-01-31 | Ultraguide Ltd | Articulated needle guide for ultrasound imaging and method of using same |
JP3378401B2 (en) | 1994-08-30 | 2003-02-17 | 株式会社日立メディコ | X-ray equipment |
US6646541B1 (en) | 1996-06-24 | 2003-11-11 | Computer Motion, Inc. | General purpose distributed operating room control system |
US6978166B2 (en) | 1994-10-07 | 2005-12-20 | Saint Louis University | System for use in displaying images of a body part |
EP1201199B1 (en) | 1994-10-07 | 2006-03-15 | St. Louis University | Surgical navigation systems including reference and localization frames |
US5882206A (en) | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
US5887121A (en) | 1995-04-21 | 1999-03-23 | International Business Machines Corporation | Method of constrained Cartesian control of robotic mechanisms with active and passive joints |
US6122541A (en) | 1995-05-04 | 2000-09-19 | Radionics, Inc. | Head band for frameless stereotactic registration |
US5649956A (en) | 1995-06-07 | 1997-07-22 | Sri International | System and method for releasably holding a surgical instrument |
US5825982A (en) | 1995-09-15 | 1998-10-20 | Wright; James | Head cursor control interface for an automated endoscope system for optimal positioning |
US5772594A (en) | 1995-10-17 | 1998-06-30 | Barrick; Earl F. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
US5855583A (en) | 1996-02-20 | 1999-01-05 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive cardiac procedures |
SG64340A1 (en) | 1996-02-27 | 1999-04-27 | Inst Of Systems Science Nation | Curved surgical instruments and methods of mapping a curved path for stereotactic surgery |
US6167145A (en) | 1996-03-29 | 2000-12-26 | Surgical Navigation Technologies, Inc. | Bone navigation system |
US5792135A (en) | 1996-05-20 | 1998-08-11 | Intuitive Surgical, Inc. | Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity |
US6167296A (en) | 1996-06-28 | 2000-12-26 | The Board Of Trustees Of The Leland Stanford Junior University | Method for volumetric image navigation |
US7302288B1 (en) | 1996-11-25 | 2007-11-27 | Z-Kat, Inc. | Tool position indicator |
US7727244B2 (en) | 1997-11-21 | 2010-06-01 | Intuitive Surgical Operation, Inc. | Sterile surgical drape |
US8529582B2 (en) | 1996-12-12 | 2013-09-10 | Intuitive Surgical Operations, Inc. | Instrument interface of a robotic surgical system |
US6205411B1 (en) | 1997-02-21 | 2001-03-20 | Carnegie Mellon University | Computer-assisted surgery planner and intra-operative guidance system |
US6012216A (en) | 1997-04-30 | 2000-01-11 | Ethicon, Inc. | Stand alone swage apparatus |
US5820559A (en) | 1997-03-20 | 1998-10-13 | Ng; Wan Sing | Computerized boundary estimation in medical images |
US5911449A (en) | 1997-04-30 | 1999-06-15 | Ethicon, Inc. | Semi-automated needle feed method and apparatus |
US6231565B1 (en) | 1997-06-18 | 2001-05-15 | United States Surgical Corporation | Robotic arm DLUs for performing surgical tasks |
US6200274B1 (en) | 1997-07-17 | 2001-03-13 | Minrad Inc. | Removable needle rule |
EP2362283B1 (en) | 1997-09-19 | 2015-11-25 | Massachusetts Institute Of Technology | Robotic apparatus |
US6226548B1 (en) | 1997-09-24 | 2001-05-01 | Surgical Navigation Technologies, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
US5951475A (en) | 1997-09-25 | 1999-09-14 | International Business Machines Corporation | Methods and apparatus for registering CT-scan data to multiple fluoroscopic images |
US5987960A (en) | 1997-09-26 | 1999-11-23 | Picker International, Inc. | Tool calibrator |
US6157853A (en) | 1997-11-12 | 2000-12-05 | Stereotaxis, Inc. | Method and apparatus using shaped field of repositionable magnet to guide implant |
US6212419B1 (en) | 1997-11-12 | 2001-04-03 | Walter M. Blume | Method and apparatus using shaped field of repositionable magnet to guide implant |
US6031888A (en) | 1997-11-26 | 2000-02-29 | Picker International, Inc. | Fluoro-assist feature for a diagnostic imaging device |
US6165170A (en) | 1998-01-29 | 2000-12-26 | International Business Machines Corporation | Laser dermablator and dermablation |
US7169141B2 (en) | 1998-02-24 | 2007-01-30 | Hansen Medical, Inc. | Surgical instrument |
US6298262B1 (en) | 1998-04-21 | 2001-10-02 | Neutar, Llc | Instrument guidance for stereotactic surgery |
FR2779339B1 (en) | 1998-06-09 | 2000-10-13 | Integrated Surgical Systems Sa | MATCHING METHOD AND APPARATUS FOR ROBOTIC SURGERY, AND MATCHING DEVICE COMPRISING APPLICATION |
US6477400B1 (en) | 1998-08-20 | 2002-11-05 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided orthopaedic surgery system with intraoperative registration |
DE19839825C1 (en) | 1998-09-01 | 1999-10-07 | Siemens Ag | Diagnostic X=ray device |
US6033415A (en) | 1998-09-14 | 2000-03-07 | Integrated Surgical Systems | System and method for performing image directed robotic orthopaedic procedures without a fiducial reference system |
DE19842798C1 (en) | 1998-09-18 | 2000-05-04 | Howmedica Leibinger Gmbh & Co | Calibration device |
US6340363B1 (en) | 1998-10-09 | 2002-01-22 | Surgical Navigation Technologies, Inc. | Image guided vertebral distractor and method for tracking the position of vertebrae |
US8527094B2 (en) | 1998-11-20 | 2013-09-03 | Intuitive Surgical Operations, Inc. | Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures |
US6659939B2 (en) | 1998-11-20 | 2003-12-09 | Intuitive Surgical, Inc. | Cooperative minimally invasive telesurgical system |
US6325808B1 (en) | 1998-12-08 | 2001-12-04 | Advanced Realtime Control Systems, Inc. | Robotic system, docking station, and surgical tool for collaborative control in minimally invasive surgery |
US7125403B2 (en) | 1998-12-08 | 2006-10-24 | Intuitive Surgical | In vivo accessories for minimally invasive robotic surgery |
US6322567B1 (en) | 1998-12-14 | 2001-11-27 | Integrated Surgical Systems, Inc. | Bone motion tracking system |
US6073512A (en) | 1998-12-14 | 2000-06-13 | Delaware Capital Formation, Inc. | Manual quick change tool changer |
US6451027B1 (en) | 1998-12-16 | 2002-09-17 | Intuitive Surgical, Inc. | Devices and methods for moving an image capture device in telesurgical systems |
US7016457B1 (en) | 1998-12-31 | 2006-03-21 | General Electric Company | Multimode imaging system for generating high quality images |
DE19905974A1 (en) | 1999-02-12 | 2000-09-07 | Siemens Ag | Computer tomography scanning method using multi-line detector |
US6560354B1 (en) | 1999-02-16 | 2003-05-06 | University Of Rochester | Apparatus and method for registration of images to physical space using a weighted combination of points and surfaces |
US6501981B1 (en) | 1999-03-16 | 2002-12-31 | Accuray, Inc. | Apparatus and method for compensating for respiratory and patient motions during treatment |
US6144875A (en) | 1999-03-16 | 2000-11-07 | Accuray Incorporated | Apparatus and method for compensating for respiratory and patient motion during treatment |
US6778850B1 (en) | 1999-03-16 | 2004-08-17 | Accuray, Inc. | Frameless radiosurgery treatment system and method |
US6470207B1 (en) | 1999-03-23 | 2002-10-22 | Surgical Navigation Technologies, Inc. | Navigational guidance via computer-assisted fluoroscopic imaging |
JP2000271110A (en) | 1999-03-26 | 2000-10-03 | Hitachi Medical Corp | Medical x-ray system |
US6424885B1 (en) | 1999-04-07 | 2002-07-23 | Intuitive Surgical, Inc. | Camera referenced control in a minimally invasive surgical apparatus |
US6565554B1 (en) | 1999-04-07 | 2003-05-20 | Intuitive Surgical, Inc. | Friction compensation in a minimally invasive surgical apparatus |
US6594552B1 (en) | 1999-04-07 | 2003-07-15 | Intuitive Surgical, Inc. | Grip strength with tactile feedback for robotic surgery |
US6301495B1 (en) | 1999-04-27 | 2001-10-09 | International Business Machines Corporation | System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan |
DE19927953A1 (en) | 1999-06-18 | 2001-01-11 | Siemens Ag | X=ray diagnostic apparatus |
US6314311B1 (en) | 1999-07-28 | 2001-11-06 | Picker International, Inc. | Movable mirror laser registration system |
US6788018B1 (en) | 1999-08-03 | 2004-09-07 | Intuitive Surgical, Inc. | Ceiling and floor mounted surgical robot set-up arms |
US8004229B2 (en) | 2005-05-19 | 2011-08-23 | Intuitive Surgical Operations, Inc. | Software center and highly configurable robotic systems for surgery and other uses |
US8271130B2 (en) | 2009-03-09 | 2012-09-18 | Intuitive Surgical Operations, Inc. | Master controller having redundant degrees of freedom and added forces to create internal motion |
US7594912B2 (en) | 2004-09-30 | 2009-09-29 | Intuitive Surgical, Inc. | Offset remote center manipulator for robotic surgery |
US6312435B1 (en) | 1999-10-08 | 2001-11-06 | Intuitive Surgical, Inc. | Surgical instrument with extended reach for use in minimally invasive surgery |
US7366562B2 (en) | 2003-10-17 | 2008-04-29 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US8239001B2 (en) | 2003-10-17 | 2012-08-07 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US8644907B2 (en) | 1999-10-28 | 2014-02-04 | Medtronic Navigaton, Inc. | Method and apparatus for surgical navigation |
US6499488B1 (en) | 1999-10-28 | 2002-12-31 | Winchester Development Associates | Surgical sensor |
US6235038B1 (en) | 1999-10-28 | 2001-05-22 | Medtronic Surgical Navigation Technologies | System for translation of electromagnetic and optical localization systems |
US6379302B1 (en) | 1999-10-28 | 2002-04-30 | Surgical Navigation Technologies Inc. | Navigation information overlay onto ultrasound imagery |
US20010036302A1 (en) | 1999-12-10 | 2001-11-01 | Miller Michael I. | Method and apparatus for cross modality image registration |
US7635390B1 (en) | 2000-01-14 | 2009-12-22 | Marctec, Llc | Joint replacement component having a modular articulating surface |
US6377011B1 (en) | 2000-01-26 | 2002-04-23 | Massachusetts Institute Of Technology | Force feedback user interface for minimally invasive surgical simulator and teleoperator and other similar apparatus |
AU2001233019A1 (en) | 2000-01-28 | 2001-08-07 | Intersense, Inc. | Self-referenced tracking |
US6725080B2 (en) | 2000-03-01 | 2004-04-20 | Surgical Navigation Technologies, Inc. | Multiple cannula image guided tool for image guided procedures |
US6996487B2 (en) | 2000-03-15 | 2006-02-07 | Orthosoft Inc. | Automatic calibration system for computer-aided surgical instruments |
US6535756B1 (en) | 2000-04-07 | 2003-03-18 | Surgical Navigation Technologies, Inc. | Trajectory storage apparatus and method for surgical navigation system |
US6484049B1 (en) | 2000-04-28 | 2002-11-19 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US6856827B2 (en) | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US6856826B2 (en) | 2000-04-28 | 2005-02-15 | Ge Medical Systems Global Technology Company, Llc | Fluoroscopic tracking and visualization system |
US6614453B1 (en) | 2000-05-05 | 2003-09-02 | Koninklijke Philips Electronics, N.V. | Method and apparatus for medical image display for surgical tool planning and navigation in clinical environments |
US6645196B1 (en) | 2000-06-16 | 2003-11-11 | Intuitive Surgical, Inc. | Guided tool change |
US6782287B2 (en) | 2000-06-27 | 2004-08-24 | The Board Of Trustees Of The Leland Stanford Junior University | Method and apparatus for tracking a medical instrument based on image registration |
US6837892B2 (en) | 2000-07-24 | 2005-01-04 | Mazor Surgical Technologies Ltd. | Miniature bone-mounted surgical robot |
US6902560B1 (en) | 2000-07-27 | 2005-06-07 | Intuitive Surgical, Inc. | Roll-pitch-roll surgical tool |
DE10037491A1 (en) | 2000-08-01 | 2002-02-14 | Stryker Leibinger Gmbh & Co Kg | Process for three-dimensional visualization of structures inside the body |
US6823207B1 (en) | 2000-08-26 | 2004-11-23 | Ge Medical Systems Global Technology Company, Llc | Integrated fluoroscopic surgical navigation and imaging workstation with command protocol |
WO2002035454A1 (en) | 2000-09-25 | 2002-05-02 | Z-Kat, Inc. | Fluoroscopic registration artifact with optical and/or magnetic markers |
US7953470B2 (en) | 2000-10-23 | 2011-05-31 | Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts | Method, device and navigation aid for navigation during medical interventions |
US6718194B2 (en) | 2000-11-17 | 2004-04-06 | Ge Medical Systems Global Technology Company, Llc | Computer assisted intramedullary rod surgery system with enhanced features |
US6666579B2 (en) | 2000-12-28 | 2003-12-23 | Ge Medical Systems Global Technology Company, Llc | Method and apparatus for obtaining and displaying computed tomography images using a fluoroscopy imaging system |
US6840938B1 (en) | 2000-12-29 | 2005-01-11 | Intuitive Surgical, Inc. | Bipolar cauterizing instrument |
CN100491914C (en) | 2001-01-30 | 2009-05-27 | Z-凯特公司 | Tool calibrator and tracker system |
US7220262B1 (en) | 2001-03-16 | 2007-05-22 | Sdgi Holdings, Inc. | Spinal fixation system and related methods |
FR2822674B1 (en) | 2001-04-03 | 2003-06-27 | Scient X | STABILIZED INTERSOMATIC MELTING SYSTEM FOR VERTEBERS |
WO2002083003A1 (en) | 2001-04-11 | 2002-10-24 | Clarke Dana S | Tissue structure identification in advance of instrument |
US6783524B2 (en) | 2001-04-19 | 2004-08-31 | Intuitive Surgical, Inc. | Robotic surgical tool with ultrasound cauterizing and cutting instrument |
US7824401B2 (en) | 2004-10-08 | 2010-11-02 | Intuitive Surgical Operations, Inc. | Robotic tool with wristed monopolar electrosurgical end effectors |
US6994708B2 (en) | 2001-04-19 | 2006-02-07 | Intuitive Surgical | Robotic tool with monopolar electro-surgical scissors |
US8398634B2 (en) | 2002-04-18 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Wristed robotic surgical tool for pluggable end-effectors |
US6636757B1 (en) | 2001-06-04 | 2003-10-21 | Surgical Navigation Technologies, Inc. | Method and apparatus for electromagnetic navigation of a surgical probe near a metal object |
US7607440B2 (en) | 2001-06-07 | 2009-10-27 | Intuitive Surgical, Inc. | Methods and apparatus for surgical planning |
ATE371414T1 (en) | 2001-06-13 | 2007-09-15 | Volume Interactions Pte Ltd | GUIDANCE SYSTEM |
US6584339B2 (en) | 2001-06-27 | 2003-06-24 | Vanderbilt University | Method and apparatus for collecting and processing physical space data for use while performing image-guided surgery |
WO2003001987A2 (en) | 2001-06-29 | 2003-01-09 | Intuitive Surgical, Inc. | Platform link wrist mechanism |
US7063705B2 (en) | 2001-06-29 | 2006-06-20 | Sdgi Holdings, Inc. | Fluoroscopic locator and registration device |
US20040243147A1 (en) | 2001-07-03 | 2004-12-02 | Lipow Kenneth I. | Surgical robot and robotic controller |
ITMI20011759A1 (en) | 2001-08-09 | 2003-02-09 | Nuovo Pignone Spa | SCRAPER DEVICE FOR PISTON ROD OF ALTERNATIVE COMPRESSORS |
US7708741B1 (en) | 2001-08-28 | 2010-05-04 | Marctec, Llc | Method of preparing bones for knee replacement surgery |
US6728599B2 (en) | 2001-09-07 | 2004-04-27 | Computer Motion, Inc. | Modularity system for computer assisted surgery |
US6587750B2 (en) | 2001-09-25 | 2003-07-01 | Intuitive Surgical, Inc. | Removable infinite roll master grip handle and touch sensor for robotic surgery |
US6619840B2 (en) | 2001-10-15 | 2003-09-16 | Koninklijke Philips Electronics N.V. | Interventional volume scanner |
US6839612B2 (en) | 2001-12-07 | 2005-01-04 | Institute Surgical, Inc. | Microwrist system for surgical procedures |
US6947786B2 (en) | 2002-02-28 | 2005-09-20 | Surgical Navigation Technologies, Inc. | Method and apparatus for perspective inversion |
US8996169B2 (en) | 2011-12-29 | 2015-03-31 | Mako Surgical Corp. | Neural monitor-based dynamic haptics |
CN1643371B (en) | 2002-03-19 | 2011-07-06 | 麦德特尼克航空公司 | Computer X-ray tomography device equipped with detector moving with number axis X-ray source |
US7164968B2 (en) | 2002-04-05 | 2007-01-16 | The Trustees Of Columbia University In The City Of New York | Robotic scrub nurse |
US7099428B2 (en) | 2002-06-25 | 2006-08-29 | The Regents Of The University Of Michigan | High spatial resolution X-ray computed tomography (CT) system |
US7248914B2 (en) | 2002-06-28 | 2007-07-24 | Stereotaxis, Inc. | Method of navigating medical devices in the presence of radiopaque material |
US7630752B2 (en) | 2002-08-06 | 2009-12-08 | Stereotaxis, Inc. | Remote control of medical devices using a virtual device interface |
WO2004015369A2 (en) | 2002-08-09 | 2004-02-19 | Intersense, Inc. | Motion tracking system and method |
US7231063B2 (en) | 2002-08-09 | 2007-06-12 | Intersense, Inc. | Fiducial detection system |
EP1531749A2 (en) | 2002-08-13 | 2005-05-25 | Microbotics Corporation | Microsurgical robot system |
US6892090B2 (en) | 2002-08-19 | 2005-05-10 | Surgical Navigation Technologies, Inc. | Method and apparatus for virtual endoscopy |
US7331967B2 (en) | 2002-09-09 | 2008-02-19 | Hansen Medical, Inc. | Surgical instrument coupling mechanism |
ES2204322B1 (en) | 2002-10-01 | 2005-07-16 | Consejo Sup. De Invest. Cientificas | FUNCTIONAL BROWSER. |
JP3821435B2 (en) | 2002-10-18 | 2006-09-13 | 松下電器産業株式会社 | Ultrasonic probe |
US7319897B2 (en) | 2002-12-02 | 2008-01-15 | Aesculap Ag & Co. Kg | Localization device display method and apparatus |
US7318827B2 (en) | 2002-12-02 | 2008-01-15 | Aesculap Ag & Co. Kg | Osteotomy procedure |
US8814793B2 (en) | 2002-12-03 | 2014-08-26 | Neorad As | Respiration monitor |
US7386365B2 (en) | 2004-05-04 | 2008-06-10 | Intuitive Surgical, Inc. | Tool grip calibration for robotic surgery |
US7945021B2 (en) | 2002-12-18 | 2011-05-17 | Varian Medical Systems, Inc. | Multi-mode cone beam CT radiotherapy simulator and treatment machine with a flat panel imager |
US7505809B2 (en) | 2003-01-13 | 2009-03-17 | Mediguide Ltd. | Method and system for registering a first image with a second image relative to the body of a patient |
GB2397234A (en) | 2003-01-20 | 2004-07-21 | Armstrong Healthcare Ltd | A tool holder arrangement |
US7542791B2 (en) | 2003-01-30 | 2009-06-02 | Medtronic Navigation, Inc. | Method and apparatus for preplanning a surgical procedure |
US7660623B2 (en) | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
US6988009B2 (en) | 2003-02-04 | 2006-01-17 | Zimmer Technology, Inc. | Implant registration device for surgical navigation system |
WO2004069040A2 (en) | 2003-02-04 | 2004-08-19 | Z-Kat, Inc. | Method and apparatus for computer assistance with intramedullary nail procedure |
US7083615B2 (en) | 2003-02-24 | 2006-08-01 | Intuitive Surgical Inc | Surgical tool having electrocautery energy supply conductor with inhibited current leakage |
US6840895B2 (en) | 2003-03-12 | 2005-01-11 | Ati Industrial Automation, Inc. | Tool side robotic safety interlock |
JP4163991B2 (en) | 2003-04-30 | 2008-10-08 | 株式会社モリタ製作所 | X-ray CT imaging apparatus and imaging method |
US9060770B2 (en) | 2003-05-20 | 2015-06-23 | Ethicon Endo-Surgery, Inc. | Robotically-driven surgical instrument with E-beam driver |
US7194120B2 (en) | 2003-05-29 | 2007-03-20 | Board Of Regents, The University Of Texas System | Methods and systems for image-guided placement of implants |
US7171257B2 (en) | 2003-06-11 | 2007-01-30 | Accuray Incorporated | Apparatus and method for radiosurgery |
US9002518B2 (en) | 2003-06-30 | 2015-04-07 | Intuitive Surgical Operations, Inc. | Maximum torque driving of robotic surgical tools in robotic surgical systems |
US7960935B2 (en) | 2003-07-08 | 2011-06-14 | The Board Of Regents Of The University Of Nebraska | Robotic devices with agent delivery components and related methods |
US7042184B2 (en) | 2003-07-08 | 2006-05-09 | Board Of Regents Of The University Of Nebraska | Microrobot for surgical applications |
WO2005004722A2 (en) | 2003-07-15 | 2005-01-20 | Koninklijke Philips Electronics N.V. | Computed tomography scanner with large gantry bore |
US7313430B2 (en) | 2003-08-28 | 2007-12-25 | Medtronic Navigation, Inc. | Method and apparatus for performing stereotactic surgery |
US7131974B2 (en) | 2003-10-14 | 2006-11-07 | Keyer Thomas R | Surgical drill guide |
US7835778B2 (en) | 2003-10-16 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
US20050171558A1 (en) | 2003-10-17 | 2005-08-04 | Abovitz Rony A. | Neurosurgery targeting and delivery system for brain structures |
US7840253B2 (en) | 2003-10-17 | 2010-11-23 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
US20050096502A1 (en) | 2003-10-29 | 2005-05-05 | Khalili Theodore M. | Robotic surgical device |
US9393039B2 (en) | 2003-12-17 | 2016-07-19 | Brainlab Ag | Universal instrument or instrument set for computer guided surgery |
US7466303B2 (en) | 2004-02-10 | 2008-12-16 | Sunnybrook Health Sciences Center | Device and process for manipulating real and virtual objects in three-dimensional space |
US20060100610A1 (en) | 2004-03-05 | 2006-05-11 | Wallace Daniel T | Methods using a robotic catheter system |
US20080287781A1 (en) | 2004-03-05 | 2008-11-20 | Depuy International Limited | Registration Methods and Apparatus |
US20080269596A1 (en) | 2004-03-10 | 2008-10-30 | Ian Revie | Orthpaedic Monitoring Systems, Methods, Implants and Instruments |
US7657298B2 (en) | 2004-03-11 | 2010-02-02 | Stryker Leibinger Gmbh & Co. Kg | System, device, and method for determining a position of an object |
US8475495B2 (en) | 2004-04-08 | 2013-07-02 | Globus Medical | Polyaxial screw |
WO2005112563A2 (en) | 2004-04-13 | 2005-12-01 | The University Of Georgia Research Foundation, Inc. | Virtual surgical system and methods |
KR100617974B1 (en) | 2004-04-22 | 2006-08-31 | 한국과학기술원 | Command-following laparoscopic system |
US7567834B2 (en) | 2004-05-03 | 2009-07-28 | Medtronic Navigation, Inc. | Method and apparatus for implantation between two vertebral bodies |
US7379790B2 (en) | 2004-05-04 | 2008-05-27 | Intuitive Surgical, Inc. | Tool memory-based software upgrades for robotic surgery |
US7974674B2 (en) | 2004-05-28 | 2011-07-05 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic surgical system and method for surface modeling |
US8528565B2 (en) | 2004-05-28 | 2013-09-10 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic surgical system and method for automated therapy delivery |
US7097357B2 (en) | 2004-06-02 | 2006-08-29 | General Electric Company | Method and system for improved correction of registration error in a fluoroscopic image |
FR2871363B1 (en) | 2004-06-15 | 2006-09-01 | Medtech Sa | ROBOTIZED GUIDING DEVICE FOR SURGICAL TOOL |
US7327865B2 (en) | 2004-06-30 | 2008-02-05 | Accuray, Inc. | Fiducial-less tracking with non-rigid image registration |
ITMI20041448A1 (en) | 2004-07-20 | 2004-10-20 | Milano Politecnico | APPARATUS FOR THE MERGER AND NAVIGATION OF ECOGRAPHIC AND VOLUMETRIC IMAGES OF A PATIENT USING A COMBINATION OF ACTIVE AND PASSIVE OPTICAL MARKERS FOR THE LOCALIZATION OF ECHOGRAPHIC PROBES AND SURGICAL INSTRUMENTS COMPARED TO THE PATIENT |
US7440793B2 (en) | 2004-07-22 | 2008-10-21 | Sunita Chauhan | Apparatus and method for removing abnormal tissue |
US7979157B2 (en) | 2004-07-23 | 2011-07-12 | Mcmaster University | Multi-purpose robotic operating system and method |
US9072535B2 (en) | 2011-05-27 | 2015-07-07 | Ethicon Endo-Surgery, Inc. | Surgical stapling instruments with rotatable staple deployment arrangements |
GB2422759B (en) | 2004-08-05 | 2008-07-16 | Elekta Ab | Rotatable X-ray scan apparatus with cone beam offset |
US7702379B2 (en) | 2004-08-25 | 2010-04-20 | General Electric Company | System and method for hybrid tracking in surgical navigation |
US7555331B2 (en) | 2004-08-26 | 2009-06-30 | Stereotaxis, Inc. | Method for surgical navigation utilizing scale-invariant registration between a navigation system and a localization system |
DE102004042489B4 (en) | 2004-08-31 | 2012-03-29 | Siemens Ag | Medical examination or treatment facility with associated method |
CA2581009C (en) | 2004-09-15 | 2011-10-04 | Synthes (U.S.A.) | Calibrating device |
US8602971B2 (en) | 2004-09-24 | 2013-12-10 | Vivid Medical. Inc. | Opto-Electronic illumination and vision module for endoscopy |
US8480566B2 (en) | 2004-09-24 | 2013-07-09 | Vivid Medical, Inc. | Solid state illumination for endoscopy |
US20090185655A1 (en) | 2004-10-06 | 2009-07-23 | Koninklijke Philips Electronics N.V. | Computed tomography method |
US7831294B2 (en) | 2004-10-07 | 2010-11-09 | Stereotaxis, Inc. | System and method of surgical imagining with anatomical overlay for navigation of surgical devices |
US7983733B2 (en) | 2004-10-26 | 2011-07-19 | Stereotaxis, Inc. | Surgical navigation using a three-dimensional user interface |
US7062006B1 (en) | 2005-01-19 | 2006-06-13 | The Board Of Trustees Of The Leland Stanford Junior University | Computed tomography with increased field of view |
US7763015B2 (en) | 2005-01-24 | 2010-07-27 | Intuitive Surgical Operations, Inc. | Modular manipulator support for robotic surgery |
US7837674B2 (en) | 2005-01-24 | 2010-11-23 | Intuitive Surgical Operations, Inc. | Compact counter balance for robotic surgical systems |
US20060184396A1 (en) | 2005-01-28 | 2006-08-17 | Dennis Charles L | System and method for surgical navigation |
US7231014B2 (en) | 2005-02-14 | 2007-06-12 | Varian Medical Systems Technologies, Inc. | Multiple mode flat panel X-ray imaging system |
ES2784219T3 (en) | 2005-03-07 | 2020-09-23 | Hector O Pacheco | Cannula for improved access to vertebral bodies for kyphoplasty, vertebroplasty, vertebral body biopsy or screw placement |
US8375808B2 (en) | 2005-12-30 | 2013-02-19 | Intuitive Surgical Operations, Inc. | Force sensing for surgical instruments |
WO2006102756A1 (en) | 2005-03-30 | 2006-10-05 | University Western Ontario | Anisotropic hydrogels |
US8496647B2 (en) | 2007-12-18 | 2013-07-30 | Intuitive Surgical Operations, Inc. | Ribbed force sensor |
US7720523B2 (en) | 2005-04-20 | 2010-05-18 | General Electric Company | System and method for managing power deactivation within a medical imaging system |
US8208988B2 (en) | 2005-05-13 | 2012-06-26 | General Electric Company | System and method for controlling a medical imaging device |
CN101193603B (en) | 2005-06-06 | 2010-11-03 | 直观外科手术公司 | Laparoscopic ultrasound robotic surgical system |
US8398541B2 (en) | 2006-06-06 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for robotic minimally invasive surgical systems |
JP2007000406A (en) | 2005-06-24 | 2007-01-11 | Ge Medical Systems Global Technology Co Llc | X-ray ct method and x-ray ct apparatus |
US7840256B2 (en) | 2005-06-27 | 2010-11-23 | Biomet Manufacturing Corporation | Image guided tracking array and method |
US8241271B2 (en) | 2005-06-30 | 2012-08-14 | Intuitive Surgical Operations, Inc. | Robotic surgical instruments with a fluid flow control system for irrigation, aspiration, and blowing |
US20070038059A1 (en) | 2005-07-07 | 2007-02-15 | Garrett Sheffer | Implant and instrument morphing |
US20080302950A1 (en) | 2005-08-11 | 2008-12-11 | The Brigham And Women's Hospital, Inc. | System and Method for Performing Single Photon Emission Computed Tomography (Spect) with a Focal-Length Cone-Beam Collimation |
US7787699B2 (en) | 2005-08-17 | 2010-08-31 | General Electric Company | Real-time integration and recording of surgical image data |
US8800838B2 (en) | 2005-08-31 | 2014-08-12 | Ethicon Endo-Surgery, Inc. | Robotically-controlled cable-based surgical end effectors |
US20070073133A1 (en) | 2005-09-15 | 2007-03-29 | Schoenefeld Ryan J | Virtual mouse for use in surgical navigation |
US7643862B2 (en) | 2005-09-15 | 2010-01-05 | Biomet Manufacturing Corporation | Virtual mouse for use in surgical navigation |
US7835784B2 (en) | 2005-09-21 | 2010-11-16 | Medtronic Navigation, Inc. | Method and apparatus for positioning a reference frame |
US8079950B2 (en) | 2005-09-29 | 2011-12-20 | Intuitive Surgical Operations, Inc. | Autofocus and/or autoscaling in telesurgery |
EP1946243A2 (en) | 2005-10-04 | 2008-07-23 | Intersense, Inc. | Tracking objects with markers |
US20090216113A1 (en) | 2005-11-17 | 2009-08-27 | Eric Meier | Apparatus and Methods for Using an Electromagnetic Transponder in Orthopedic Procedures |
US7711406B2 (en) | 2005-11-23 | 2010-05-04 | General Electric Company | System and method for detection of electromagnetic radiation by amorphous silicon x-ray detector for metal detection in x-ray imaging |
EP1795142B1 (en) | 2005-11-24 | 2008-06-11 | BrainLAB AG | Medical tracking system using a gamma camera |
US7689320B2 (en) | 2005-12-20 | 2010-03-30 | Intuitive Surgical Operations, Inc. | Robotic surgical system with joint motion controller adapted to reduce instrument tip vibrations |
US7955322B2 (en) | 2005-12-20 | 2011-06-07 | Intuitive Surgical Operations, Inc. | Wireless communication in a robotic surgical system |
US8672922B2 (en) | 2005-12-20 | 2014-03-18 | Intuitive Surgical Operations, Inc. | Wireless communication in a robotic surgical system |
US8182470B2 (en) | 2005-12-20 | 2012-05-22 | Intuitive Surgical Operations, Inc. | Telescoping insertion axis of a robotic surgical system |
CN101340852B (en) | 2005-12-20 | 2011-12-28 | 直观外科手术操作公司 | Instrument interface of a robotic surgical system |
US7762825B2 (en) | 2005-12-20 | 2010-07-27 | Intuitive Surgical Operations, Inc. | Electro-mechanical interfaces to mount robotic surgical arms |
US7819859B2 (en) | 2005-12-20 | 2010-10-26 | Intuitive Surgical Operations, Inc. | Control system for reducing internally generated frictional and inertial resistance to manual positioning of a surgical manipulator |
US8054752B2 (en) | 2005-12-22 | 2011-11-08 | Intuitive Surgical Operations, Inc. | Synchronous data communication |
ES2292327B1 (en) | 2005-12-26 | 2009-04-01 | Consejo Superior Investigaciones Cientificas | MINI CAMERA GAMMA AUTONOMA AND WITH LOCATION SYSTEM, FOR INTRACHIRURGICAL USE. |
EP2289455B1 (en) | 2005-12-30 | 2019-11-13 | Intuitive Surgical Operations, Inc. | Modular force sensor |
US7930065B2 (en) | 2005-12-30 | 2011-04-19 | Intuitive Surgical Operations, Inc. | Robotic surgery system including position sensors using fiber bragg gratings |
US7907166B2 (en) | 2005-12-30 | 2011-03-15 | Intuitive Surgical Operations, Inc. | Stereo telestration for robotic surgery |
US7533892B2 (en) | 2006-01-05 | 2009-05-19 | Intuitive Surgical, Inc. | Steering system for heavy mobile medical equipment |
KR100731052B1 (en) | 2006-01-23 | 2007-06-22 | 한양대학교 산학협력단 | Bi-planar fluoroscopy guided robot system for a minimally invasive surgical |
US8162926B2 (en) | 2006-01-25 | 2012-04-24 | Intuitive Surgical Operations Inc. | Robotic arm with five-bar spherical linkage |
US8142420B2 (en) | 2006-01-25 | 2012-03-27 | Intuitive Surgical Operations Inc. | Robotic arm with five-bar spherical linkage |
US20110290856A1 (en) | 2006-01-31 | 2011-12-01 | Ethicon Endo-Surgery, Inc. | Robotically-controlled surgical instrument with force-feedback capabilities |
US7845537B2 (en) | 2006-01-31 | 2010-12-07 | Ethicon Endo-Surgery, Inc. | Surgical instrument having recording capabilities |
EP1815950A1 (en) | 2006-02-03 | 2007-08-08 | The European Atomic Energy Community (EURATOM), represented by the European Commission | Robotic surgical system for performing minimally invasive medical procedures |
US8219178B2 (en) | 2007-02-16 | 2012-07-10 | Catholic Healthcare West | Method and system for performing invasive medical procedures using a surgical robot |
US8219177B2 (en) | 2006-02-16 | 2012-07-10 | Catholic Healthcare West | Method and system for performing invasive medical procedures using a surgical robot |
US10893912B2 (en) | 2006-02-16 | 2021-01-19 | Globus Medical Inc. | Surgical tool systems and methods |
US8526688B2 (en) | 2006-03-09 | 2013-09-03 | General Electric Company | Methods and systems for registration of surgical navigation data and image data |
EP2004071B1 (en) | 2006-03-30 | 2013-05-15 | Koninklijke Philips Electronics N.V. | Targeting device, computer readable medium and program element |
US20070233238A1 (en) | 2006-03-31 | 2007-10-04 | Medtronic Vascular, Inc. | Devices for Imaging and Navigation During Minimally Invasive Non-Bypass Cardiac Procedures |
US8500132B2 (en) | 2006-04-04 | 2013-08-06 | Ati Industrial Automation, Inc. | Rotating coupling for robotic tool changer with one-way clutch and dual-button handle mechanism |
US8601667B2 (en) | 2006-04-04 | 2013-12-10 | Ati Industrial Automation, Inc. | Rotating coupling for robotic tool changer with actuation mechanism |
CA2649320C (en) | 2006-04-14 | 2011-09-20 | William Beaumont Hospital | Tetrahedron beam computed tomography |
US8112292B2 (en) | 2006-04-21 | 2012-02-07 | Medtronic Navigation, Inc. | Method and apparatus for optimizing a therapy |
US8021310B2 (en) | 2006-04-21 | 2011-09-20 | Nellcor Puritan Bennett Llc | Work of breathing display for a ventilation system |
US7940999B2 (en) | 2006-04-24 | 2011-05-10 | Siemens Medical Solutions Usa, Inc. | System and method for learning-based 2D/3D rigid registration for image-guided surgery using Jensen-Shannon divergence |
WO2007131561A2 (en) | 2006-05-16 | 2007-11-22 | Surgiceye Gmbh | Method and device for 3d acquisition, 3d visualization and computer guided surgery using nuclear probes |
US20080004523A1 (en) | 2006-06-29 | 2008-01-03 | General Electric Company | Surgical tool guide |
DE102006032127B4 (en) | 2006-07-05 | 2008-04-30 | Aesculap AG & Co. KG | Calibration method and calibration device for a surgical referencing unit |
US20080013809A1 (en) | 2006-07-14 | 2008-01-17 | Bracco Imaging, Spa | Methods and apparatuses for registration in image guided surgery |
EP1886640B1 (en) | 2006-08-08 | 2009-11-18 | BrainLAB AG | Planning method and system for adjusting a free-shaped bone implant |
US7945012B2 (en) | 2006-08-17 | 2011-05-17 | Koninklijke Philips Electronics N.V. | Computed tomography image acquisition |
DE102006041033B4 (en) | 2006-09-01 | 2017-01-19 | Siemens Healthcare Gmbh | Method for reconstructing a three-dimensional image volume |
US8231610B2 (en) | 2006-09-06 | 2012-07-31 | National Cancer Center | Robotic surgical system for laparoscopic surgery |
WO2008031077A2 (en) | 2006-09-08 | 2008-03-13 | Hansen Medical, Inc. | Robotic surgical system with forward-oriented field of view guide instrument navigation |
US8150498B2 (en) | 2006-09-08 | 2012-04-03 | Medtronic, Inc. | System for identification of anatomical landmarks |
US8532741B2 (en) | 2006-09-08 | 2013-09-10 | Medtronic, Inc. | Method and apparatus to optimize electrode placement for neurological stimulation |
US8150497B2 (en) | 2006-09-08 | 2012-04-03 | Medtronic, Inc. | System for navigating a planned procedure within a body |
US8248413B2 (en) | 2006-09-18 | 2012-08-21 | Stryker Corporation | Visual navigation system for endoscopic surgery |
EP2074383B1 (en) | 2006-09-25 | 2016-05-11 | Mazor Robotics Ltd. | C-arm computerized tomography |
US8660635B2 (en) | 2006-09-29 | 2014-02-25 | Medtronic, Inc. | Method and apparatus for optimizing a computer assisted surgical procedure |
US8052688B2 (en) | 2006-10-06 | 2011-11-08 | Wolf Ii Erich | Electromagnetic apparatus and method for nerve localization during spinal surgery |
US20080144906A1 (en) | 2006-10-09 | 2008-06-19 | General Electric Company | System and method for video capture for fluoroscopy and navigation |
US20080109012A1 (en) | 2006-11-03 | 2008-05-08 | General Electric Company | System, method and apparatus for tableside remote connections of medical instruments and systems using wireless communications |
US8551114B2 (en) | 2006-11-06 | 2013-10-08 | Human Robotics S.A. De C.V. | Robotic surgical device |
US20080108912A1 (en) | 2006-11-07 | 2008-05-08 | General Electric Company | System and method for measurement of clinical parameters of the knee for use during knee replacement surgery |
US20080108991A1 (en) | 2006-11-08 | 2008-05-08 | General Electric Company | Method and apparatus for performing pedicle screw fusion surgery |
US8682413B2 (en) | 2006-11-15 | 2014-03-25 | General Electric Company | Systems and methods for automated tracker-driven image selection |
CA2670261A1 (en) | 2006-11-16 | 2008-05-29 | Vanderbilt University | Apparatus and methods of compensating for organ deformation, registration of internal structures to images, and applications of same |
US7935130B2 (en) | 2006-11-16 | 2011-05-03 | Intuitive Surgical Operations, Inc. | Two-piece end-effectors for robotic surgical tools |
US8727618B2 (en) | 2006-11-22 | 2014-05-20 | Siemens Aktiengesellschaft | Robotic device and method for trauma patient diagnosis and therapy |
US7835557B2 (en) | 2006-11-28 | 2010-11-16 | Medtronic Navigation, Inc. | System and method for detecting status of imaging device |
US8320991B2 (en) | 2006-12-01 | 2012-11-27 | Medtronic Navigation Inc. | Portable electromagnetic navigation system |
US7683332B2 (en) | 2006-12-08 | 2010-03-23 | Rush University Medical Center | Integrated single photon emission computed tomography (SPECT)/transmission computed tomography (TCT) system for cardiac imaging |
US7683331B2 (en) | 2006-12-08 | 2010-03-23 | Rush University Medical Center | Single photon emission computed tomography (SPECT) system for cardiac imaging |
US8556807B2 (en) | 2006-12-21 | 2013-10-15 | Intuitive Surgical Operations, Inc. | Hermetically sealed distal sensor endoscope |
US20080177203A1 (en) | 2006-12-22 | 2008-07-24 | General Electric Company | Surgical navigation planning system and method for placement of percutaneous instrumentation and implants |
DE102006061178A1 (en) | 2006-12-22 | 2008-06-26 | Siemens Ag | Medical system for carrying out and monitoring a minimally invasive intervention, especially for treating electrophysiological diseases, having X-ray equipment and a control/evaluation unit |
US20080161680A1 (en) | 2006-12-29 | 2008-07-03 | General Electric Company | System and method for surgical navigation of motion preservation prosthesis |
US9220573B2 (en) | 2007-01-02 | 2015-12-29 | Medtronic Navigation, Inc. | System and method for tracking positions of uniform marker geometries |
US8684253B2 (en) | 2007-01-10 | 2014-04-01 | Ethicon Endo-Surgery, Inc. | Surgical instrument with wireless communication between a control unit of a robotic system and remote sensor |
US8374673B2 (en) | 2007-01-25 | 2013-02-12 | Warsaw Orthopedic, Inc. | Integrated surgical navigational and neuromonitoring system having automated surgical assistance and control |
CA2920567C (en) | 2007-02-01 | 2019-03-05 | Ravish V. Patwardhan | Surgical navigation system for guiding an access member |
US8146874B2 (en) | 2007-02-02 | 2012-04-03 | Hansen Medical, Inc. | Mounting support assembly for suspending a medical instrument driver above an operating table |
US8600478B2 (en) | 2007-02-19 | 2013-12-03 | Medtronic Navigation, Inc. | Automatic identification of instruments used with a surgical navigation system |
US8233963B2 (en) | 2007-02-19 | 2012-07-31 | Medtronic Navigation, Inc. | Automatic identification of tracked surgical devices using an electromagnetic localization system |
DE102007009017B3 (en) | 2007-02-23 | 2008-09-25 | Siemens Ag | Arrangement for supporting a percutaneous procedure |
US10039613B2 (en) | 2007-03-01 | 2018-08-07 | Surgical Navigation Technologies, Inc. | Method for localizing an imaging device with a surgical navigation system |
US8098914B2 (en) | 2007-03-05 | 2012-01-17 | Siemens Aktiengesellschaft | Registration of CT volumes with fluoroscopic images |
US20080228068A1 (en) | 2007-03-13 | 2008-09-18 | Viswanathan Raju R | Automated Surgical Navigation with Electro-Anatomical and Pre-Operative Image Data |
US8821511B2 (en) | 2007-03-15 | 2014-09-02 | General Electric Company | Instrument guide for use with a surgical navigation system |
US20080235052A1 (en) | 2007-03-19 | 2008-09-25 | General Electric Company | System and method for sharing medical information between image-guided surgery systems |
US8150494B2 (en) | 2007-03-29 | 2012-04-03 | Medtronic Navigation, Inc. | Apparatus for registering a physical space to image space |
US7879045B2 (en) | 2007-04-10 | 2011-02-01 | Medtronic, Inc. | System for guiding instruments having different sizes |
US8560118B2 (en) | 2007-04-16 | 2013-10-15 | Neuroarm Surgical Ltd. | Methods, devices, and systems for non-mechanically restricting and/or programming movement of a tool of a manipulator along a single axis |
CA2684472C (en) | 2007-04-16 | 2015-11-24 | Neuroarm Surgical Ltd. | Methods, devices, and systems for automated movements involving medical robots |
US8108025B2 (en) | 2007-04-24 | 2012-01-31 | Medtronic, Inc. | Flexible array for use in navigated surgery |
US8301226B2 (en) | 2007-04-24 | 2012-10-30 | Medtronic, Inc. | Method and apparatus for performing a navigated procedure |
US8311611B2 (en) | 2007-04-24 | 2012-11-13 | Medtronic, Inc. | Method for performing multiple registrations in a navigated procedure |
US20090012509A1 (en) | 2007-04-24 | 2009-01-08 | Medtronic, Inc. | Navigated Soft Tissue Penetrating Laser System |
US8010177B2 (en) | 2007-04-24 | 2011-08-30 | Medtronic, Inc. | Intraoperative image registration |
US8062364B1 (en) | 2007-04-27 | 2011-11-22 | Knee Creations, Llc | Osteoarthritis treatment and device |
DE102007022122B4 (en) | 2007-05-11 | 2019-07-11 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Gripping device for a surgery robot arrangement |
US8057397B2 (en) | 2007-05-16 | 2011-11-15 | General Electric Company | Navigation and imaging system synchronized with respiratory and/or cardiac activity |
US20080287771A1 (en) | 2007-05-17 | 2008-11-20 | General Electric Company | Surgical navigation system with electrostatic shield |
US8934961B2 (en) | 2007-05-18 | 2015-01-13 | Biomet Manufacturing, Llc | Trackable diagnostic scope apparatus and methods of use |
US20080300478A1 (en) | 2007-05-30 | 2008-12-04 | General Electric Company | System and method for displaying real-time state of imaged anatomy during a surgical procedure |
US20080300477A1 (en) | 2007-05-30 | 2008-12-04 | General Electric Company | System and method for correction of automated image registration |
US9096033B2 (en) | 2007-06-13 | 2015-08-04 | Intuitive Surgical Operations, Inc. | Surgical system instrument sterile adapter |
US9468412B2 (en) | 2007-06-22 | 2016-10-18 | General Electric Company | System and method for accuracy verification for image based surgical navigation |
EP2170564A4 (en) | 2007-07-12 | 2015-10-07 | Univ Nebraska | Methods and systems of actuation in robotic devices |
US7834484B2 (en) | 2007-07-16 | 2010-11-16 | Tyco Healthcare Group Lp | Connection cable and method for activating a voltage-controlled generator |
JP2009045428A (en) | 2007-07-25 | 2009-03-05 | Terumo Corp | Operating mechanism, medical manipulator and surgical robot system |
US8100950B2 (en) | 2007-07-27 | 2012-01-24 | The Cleveland Clinic Foundation | Oblique lumbar interbody fusion |
US8035685B2 (en) | 2007-07-30 | 2011-10-11 | General Electric Company | Systems and methods for communicating video data between a mobile imaging system and a fixed monitor system |
US8328818B1 (en) | 2007-08-31 | 2012-12-11 | Globus Medical, Inc. | Devices and methods for treating bone |
CA2737938C (en) | 2007-09-19 | 2016-09-13 | Walter A. Roberts | Direct visualization robotic intra-operative radiation therapy applicator device |
US20090080737A1 (en) | 2007-09-25 | 2009-03-26 | General Electric Company | System and Method for Use of Fluoroscope and Computed Tomography Registration for Sinuplasty Navigation |
US9050120B2 (en) | 2007-09-30 | 2015-06-09 | Intuitive Surgical Operations, Inc. | Apparatus and method of user interface with alternate tool mode for robotic surgical tools |
US9522046B2 (en) | 2010-08-23 | 2016-12-20 | Gip | Robotic surgery system |
CN101848679B (en) | 2007-11-06 | 2014-08-06 | 皇家飞利浦电子股份有限公司 | Nuclear medicine SPECT-CT machine with integrated asymmetric flat panel cone-beam CT and SPECT system |
US9265589B2 (en) | 2007-11-06 | 2016-02-23 | Medtronic Navigation, Inc. | System and method for navigated drill guide |
DE102007055203A1 (en) | 2007-11-19 | 2009-05-20 | Kuka Roboter Gmbh | A robotic device, medical workstation and method for registering an object |
US8561473B2 (en) | 2007-12-18 | 2013-10-22 | Intuitive Surgical Operations, Inc. | Force sensor temperature compensation |
US20100274120A1 (en) | 2007-12-21 | 2010-10-28 | Koninklijke Philips Electronics N.V. | Synchronous interventional scanner |
US8400094B2 (en) | 2007-12-21 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Robotic surgical system with patient support |
US8864798B2 (en) | 2008-01-18 | 2014-10-21 | Globus Medical, Inc. | Transverse connector |
KR20100120183A (en) | 2008-01-30 | 2010-11-12 | 더 트러스티이스 오브 콜롬비아 유니버시티 인 더 시티 오브 뉴욕 | Systems, devices, and methods for robot-assisted micro-surgical stenting |
US20090198121A1 (en) | 2008-02-01 | 2009-08-06 | Martin Hoheisel | Method and apparatus for coordinating contrast agent injection and image acquisition in c-arm computed tomography |
US8573465B2 (en) | 2008-02-14 | 2013-11-05 | Ethicon Endo-Surgery, Inc. | Robotically-controlled surgical end effector system with rotary actuated closure systems |
US8696458B2 (en) | 2008-02-15 | 2014-04-15 | Thales Visionix, Inc. | Motion tracking system and method using camera and non-camera sensors |
US7925653B2 (en) | 2008-02-27 | 2011-04-12 | General Electric Company | Method and system for accessing a group of objects in an electronic document |
US20090228019A1 (en) | 2008-03-10 | 2009-09-10 | Yosef Gross | Robotic surgical system |
US8282653B2 (en) | 2008-03-24 | 2012-10-09 | Board Of Regents Of The University Of Nebraska | System and methods for controlling surgical tool elements |
US8808164B2 (en) | 2008-03-28 | 2014-08-19 | Intuitive Surgical Operations, Inc. | Controlling a robotic surgical tool with a display monitor |
CN104780540B (en) | 2008-03-28 | 2018-12-14 | 爱立信电话股份有限公司 | Identification of manipulated or defective base stations during handover |
US8333755B2 (en) | 2008-03-31 | 2012-12-18 | Intuitive Surgical Operations, Inc. | Coupler to transfer controller motion from a robotic manipulator to an attached instrument |
US7886743B2 (en) | 2008-03-31 | 2011-02-15 | Intuitive Surgical Operations, Inc. | Sterile drape interface for robotic surgical instrument |
US7843158B2 (en) | 2008-03-31 | 2010-11-30 | Intuitive Surgical Operations, Inc. | Medical robotic system adapted to inhibit motions resulting in excessive end effector forces |
US9002076B2 (en) | 2008-04-15 | 2015-04-07 | Medtronic, Inc. | Method and apparatus for optimal trajectory planning |
US9345875B2 (en) | 2008-04-17 | 2016-05-24 | Medtronic, Inc. | Method and apparatus for cannula fixation for an array insertion tube set |
US8167793B2 (en) | 2008-04-26 | 2012-05-01 | Intuitive Surgical Operations, Inc. | Augmented stereoscopic visualization for a surgical robot using time duplexing |
ES2764964T3 (en) | 2008-04-30 | 2020-06-05 | Nanosys Inc | Dirt-resistant surfaces for reflective spheres |
US9579161B2 (en) | 2008-05-06 | 2017-02-28 | Medtronic Navigation, Inc. | Method and apparatus for tracking a patient |
US20110022229A1 (en) | 2008-06-09 | 2011-01-27 | Bae Sang Jang | Master interface and driving method of surgical robot |
DE202008009571U1 (en) | 2008-07-16 | 2008-10-23 | Brainlab Ag | Adapter for fixing a medical device |
TW201004607A (en) | 2008-07-25 | 2010-02-01 | Been-Der Yang | Image guided navigation system and method thereof |
US8054184B2 (en) | 2008-07-31 | 2011-11-08 | Intuitive Surgical Operations, Inc. | Identification of surgical instrument attached to surgical robot |
US8771170B2 (en) | 2008-08-01 | 2014-07-08 | Microaccess, Inc. | Methods and apparatus for transesophageal microaccess surgery |
JP2010035984A (en) | 2008-08-08 | 2010-02-18 | Canon Inc | X-ray imaging apparatus |
US9248000B2 (en) | 2008-08-15 | 2016-02-02 | Stryker European Holdings I, Llc | System for and method of visualizing an interior of body |
WO2010022088A1 (en) | 2008-08-18 | 2010-02-25 | Encision, Inc. | Enhanced control systems including flexible shielding and support systems for electrosurgical applications |
DE102008041813B4 (en) | 2008-09-04 | 2013-06-20 | Carl Zeiss Microscopy Gmbh | Method for the depth analysis of an organic sample |
US8857821B2 (en) | 2008-09-05 | 2014-10-14 | Ati Industrial Automation, Inc. | Manual robotic tool changer with rotating piston |
US7900524B2 (en) | 2008-09-09 | 2011-03-08 | Intersense, Inc. | Monitoring tools |
US8165658B2 (en) | 2008-09-26 | 2012-04-24 | Medtronic, Inc. | Method and apparatus for positioning a guide relative to a base |
US8073335B2 (en) | 2008-09-30 | 2011-12-06 | Intuitive Surgical Operations, Inc. | Operator input device for a robotic surgical system |
CN102177430B (en) | 2008-10-10 | 2014-04-02 | 皇家飞利浦电子股份有限公司 | Method and apparatus to improve CT image acquisition using a displaced geometry |
KR100944412B1 (en) | 2008-10-13 | 2010-02-25 | (주)미래컴퍼니 | Surgical slave robot |
US8781630B2 (en) | 2008-10-14 | 2014-07-15 | University Of Florida Research Foundation, Inc. | Imaging platform to provide integrated navigation capabilities for surgical guidance |
JP5762962B2 (en) | 2008-10-20 | 2015-08-12 | ザ・ジョンズ・ホプキンス・ユニバーシティ | Environmental characteristic estimation and image display |
EP2179703B1 (en) | 2008-10-21 | 2012-03-28 | BrainLAB AG | Integration of surgical instrument and display device for supporting image-based surgery |
US8784385B2 (en) | 2008-10-31 | 2014-07-22 | The Invention Science Fund I, Llc | Frozen piercing implements and methods for piercing a substrate |
KR101075363B1 (en) | 2008-10-31 | 2011-10-19 | 정창욱 | Surgical Robot System Having Tool for Minimally Invasive Surgery |
US9033958B2 (en) | 2008-11-11 | 2015-05-19 | Perception Raisonnement Action En Medecine | Surgical robotic system |
TWI435705B (en) | 2008-11-20 | 2014-05-01 | Been Der Yang | Surgical position device and image guided navigation system using the same |
JP5384521B2 (en) | 2008-11-27 | 2014-01-08 | 株式会社日立メディコ | Radiation imaging device |
US8483800B2 (en) | 2008-11-29 | 2013-07-09 | General Electric Company | Surgical navigation enabled imaging table environment |
CN102300512B (en) | 2008-12-01 | 2016-01-20 | 马佐尔机器人有限公司 | Robot-guided oblique spinal stabilization |
ES2341079B1 (en) | 2008-12-11 | 2011-07-13 | Fundacio Clinic Per A La Recerca Biomedica | Equipment for improved infrared vision of vascular structures, applicable to assist fetoscopic, laparoscopic and endoscopic interventions, and signal processing method to improve such vision |
US8021393B2 (en) | 2008-12-12 | 2011-09-20 | Globus Medical, Inc. | Lateral spinous process spacer with deployable wings |
US8594841B2 (en) | 2008-12-31 | 2013-11-26 | Intuitive Surgical Operations, Inc. | Visual force feedback in a minimally invasive surgical procedure |
US8184880B2 (en) | 2008-12-31 | 2012-05-22 | Intuitive Surgical Operations, Inc. | Robust sparse image matching for robotic surgery |
US8374723B2 (en) | 2008-12-31 | 2013-02-12 | Intuitive Surgical Operations, Inc. | Obtaining force information in a minimally invasive surgical procedure |
US8830224B2 (en) | 2008-12-31 | 2014-09-09 | Intuitive Surgical Operations, Inc. | Efficient 3-D telestration for local robotic proctoring |
EP2586374B1 (en) | 2009-01-21 | 2015-03-18 | Koninklijke Philips N.V. | Method and apparatus for large field of view imaging and detection and compensation of motion artifacts |
US8611985B2 (en) | 2009-01-29 | 2013-12-17 | Imactis | Method and device for navigation of a surgical tool |
KR101038417B1 (en) | 2009-02-11 | 2011-06-01 | 주식회사 이턴 | Surgical robot system and control method thereof |
US8418073B2 (en) | 2009-03-09 | 2013-04-09 | Intuitive Surgical Operations, Inc. | User interfaces for electrosurgical tools in robotic surgical systems |
US8918207B2 (en) | 2009-03-09 | 2014-12-23 | Intuitive Surgical Operations, Inc. | Operator input device for a robotic surgical system |
US8120301B2 (en) | 2009-03-09 | 2012-02-21 | Intuitive Surgical Operations, Inc. | Ergonomic surgeon control console in robotic surgical systems |
US9737235B2 (en) | 2009-03-09 | 2017-08-22 | Medtronic Navigation, Inc. | System and method for image-guided navigation |
US20120053597A1 (en) | 2009-03-10 | 2012-03-01 | Mcmaster University | Mobile robotic surgical system |
US8335552B2 (en) | 2009-03-20 | 2012-12-18 | Medtronic, Inc. | Method and apparatus for instrument placement |
WO2010110560A2 (en) | 2009-03-24 | 2010-09-30 | 주식회사 래보 | Surgical robot system using augmented reality, and method for controlling same |
US20100249571A1 (en) | 2009-03-31 | 2010-09-30 | General Electric Company | Surgical navigation system with wireless magnetoresistance tracking sensors |
US8882803B2 (en) | 2009-04-01 | 2014-11-11 | Globus Medical, Inc. | Orthopedic clamp and extension rod |
WO2010124285A1 (en) | 2009-04-24 | 2010-10-28 | Medtronic Inc. | Electromagnetic navigation of medical instruments for cardiothoracic surgery |
US8737708B2 (en) | 2009-05-13 | 2014-05-27 | Medtronic Navigation, Inc. | System and method for automatic registration between an image and a subject |
US8225798B2 (en) | 2009-05-18 | 2012-07-24 | Loma Linda University | Method and devices for performing minimally invasive surgery |
US8308043B2 (en) | 2009-05-19 | 2012-11-13 | Covidien Lp | Recognition of interchangeable component of a device |
US8556815B2 (en) | 2009-05-20 | 2013-10-15 | Laurent Pelissier | Freehand ultrasound imaging systems and methods for guiding fine elongate instruments |
ES2388029B1 (en) | 2009-05-22 | 2013-08-13 | Universitat Politècnica De Catalunya | Robotic system for laparoscopic surgery |
CN101897593B (en) | 2009-05-26 | 2014-08-13 | 清华大学 | Computed tomography imaging device and method |
WO2010141839A2 (en) | 2009-06-04 | 2010-12-09 | Virginia Tech Intellectual Properties, Inc. | Multi-parameter x-ray computed tomography |
WO2011013164A1 (en) | 2009-07-27 | 2011-02-03 | 株式会社島津製作所 | Radiographic apparatus |
RU2550542C2 (en) | 2009-08-06 | 2015-05-10 | Конинклейке Филипс Электроникс Н.В. | Method and device for forming computed tomography images using offset detector geometries |
WO2011019742A1 (en) | 2009-08-10 | 2011-02-17 | Re2, Inc. | Automated tool change assembly for robotic arm |
US10828786B2 (en) | 2009-08-17 | 2020-11-10 | Mazor Robotics Ltd. | Device for improving the accuracy of manual operations |
US9844414B2 (en) | 2009-08-31 | 2017-12-19 | Gregory S. Fischer | System and method for robotic surgical intervention in a magnetic resonance imager |
EP2298223A1 (en) | 2009-09-21 | 2011-03-23 | Stryker Leibinger GmbH & Co. KG | Technique for registering image data of an object |
US8465476B2 (en) | 2009-09-23 | 2013-06-18 | Intuitive Surgical Operations, Inc. | Cannula mounting fixture |
WO2011038759A1 (en) | 2009-09-30 | 2011-04-07 | Brainlab Ag | Two-part medical tracking marker |
NL1037348C2 (en) | 2009-10-02 | 2011-04-05 | Univ Eindhoven Tech | Surgical robot, instrument manipulator, combination of an operating table and a surgical robot, and master-slave operating system. |
US8679183B2 (en) | 2010-06-25 | 2014-03-25 | Globus Medical | Expandable fusion device and method of installation thereof |
US8685098B2 (en) | 2010-06-25 | 2014-04-01 | Globus Medical, Inc. | Expandable fusion device and method of installation thereof |
US8556979B2 (en) | 2009-10-15 | 2013-10-15 | Globus Medical, Inc. | Expandable fusion device and method of installation thereof |
US8062375B2 (en) | 2009-10-15 | 2011-11-22 | Globus Medical, Inc. | Expandable fusion device and method of installation thereof |
EP2493411A4 (en) | 2009-10-28 | 2015-04-29 | Imris Inc | Automatic registration of images for image guided surgery |
USD631966S1 (en) | 2009-11-10 | 2011-02-01 | Globus Medical, Inc. | Basilar invagination implant |
US8521331B2 (en) | 2009-11-13 | 2013-08-27 | Intuitive Surgical Operations, Inc. | Patient-side surgeon interface for a minimally invasive, teleoperated surgical instrument |
US20110137152A1 (en) | 2009-12-03 | 2011-06-09 | General Electric Company | System and method for cooling components of a surgical navigation system |
US8277509B2 (en) | 2009-12-07 | 2012-10-02 | Globus Medical, Inc. | Transforaminal prosthetic spinal disc apparatus |
CN102651998B (en) | 2009-12-10 | 2015-08-05 | 皇家飞利浦电子股份有限公司 | For the scanning system of differential contrast imaging |
US8694075B2 (en) | 2009-12-21 | 2014-04-08 | General Electric Company | Intra-operative registration for navigated surgical procedures |
US8353963B2 (en) | 2010-01-12 | 2013-01-15 | Globus Medical | Expandable spacer and method for use thereof |
US9381045B2 (en) | 2010-01-13 | 2016-07-05 | Jcbd, Llc | Sacroiliac joint implant and sacroiliac joint instrument for fusing a sacroiliac joint |
EP2523621B1 (en) | 2010-01-13 | 2016-09-28 | Koninklijke Philips N.V. | Image integration based registration and navigation for endoscopic surgery |
WO2011085814A1 (en) | 2010-01-14 | 2011-07-21 | Brainlab Ag | Controlling and/or operating a medical device by means of a light pointer |
AU2011207550B2 (en) | 2010-01-20 | 2016-03-10 | Conventus Orthopaedics, Inc. | Apparatus and methods for bone access and cavity preparation |
US9039769B2 (en) | 2010-03-17 | 2015-05-26 | Globus Medical, Inc. | Intervertebral nucleus and annulus implants and method of use thereof |
US20140330288A1 (en) | 2010-03-25 | 2014-11-06 | Precision Automation And Robotics India Ltd. | Articulating Arm for a Robotic Surgical Instrument System |
US20110238080A1 (en) | 2010-03-25 | 2011-09-29 | Date Ranjit | Robotic Surgical Instrument System |
IT1401669B1 (en) | 2010-04-07 | 2013-08-02 | Sofar Spa | Robotic surgery system with improved control |
US8870880B2 (en) | 2010-04-12 | 2014-10-28 | Globus Medical, Inc. | Angling inserter tool for expandable vertebral implant |
IT1399603B1 (en) | 2010-04-26 | 2013-04-26 | Scuola Superiore Di Studi Universitari E Di Perfez | Robotic system for minimally invasive surgical interventions |
US8717430B2 (en) | 2010-04-26 | 2014-05-06 | Medtronic Navigation, Inc. | System and method for radio-frequency imaging, registration, and localization |
CA2797302C (en) | 2010-04-28 | 2019-01-15 | Ryerson University | System and methods for intraoperative guidance feedback |
WO2012169990A2 (en) | 2010-05-04 | 2012-12-13 | Pathfinder Therapeutics, Inc. | System and method for abdominal surface matching using pseudo-features |
US8738115B2 (en) | 2010-05-11 | 2014-05-27 | Siemens Aktiengesellschaft | Method and apparatus for selective internal radiation therapy planning and implementation |
DE102010020284A1 (en) | 2010-05-12 | 2011-11-17 | Siemens Aktiengesellschaft | Determination of 3D positions and orientations of surgical objects from 2D X-ray images |
US8603077B2 (en) | 2010-05-14 | 2013-12-10 | Intuitive Surgical Operations, Inc. | Force transmission for robotic surgical instrument |
US8883210B1 (en) | 2010-05-14 | 2014-11-11 | Musculoskeletal Transplant Foundation | Tissue-derived tissuegenic implants, and methods of fabricating and using same |
KR101181569B1 (en) | 2010-05-25 | 2012-09-10 | 정창욱 | Surgical robot system capable of implementing both of single port surgery mode and multi-port surgery mode and method for controlling same |
US20110295370A1 (en) | 2010-06-01 | 2011-12-01 | Sean Suh | Spinal Implants and Methods of Use Thereof |
DE102010026674B4 (en) | 2010-07-09 | 2012-09-27 | Siemens Aktiengesellschaft | Imaging device and radiotherapy device |
US8675939B2 (en) | 2010-07-13 | 2014-03-18 | Stryker Leibinger GmbH & Co. KG | Registration of anatomical data sets |
WO2012007036A1 (en) | 2010-07-14 | 2012-01-19 | Brainlab Ag | Method and system for determining an imaging direction and calibration of an imaging apparatus |
US20120035507A1 (en) | 2010-07-22 | 2012-02-09 | Ivan George | Device and method for measuring anatomic geometries |
US8740882B2 (en) | 2010-07-30 | 2014-06-03 | Lg Electronics Inc. | Medical robotic system and method of controlling the same |
WO2012018816A2 (en) | 2010-08-02 | 2012-02-09 | The Johns Hopkins University | Tool exchange interface and control algorithm for cooperative surgical robots |
US20120071753A1 (en) | 2010-08-20 | 2012-03-22 | Mark Hunter | Apparatus and method for four dimensional soft tissue navigation including endoscopic mapping |
JP2012045278A (en) | 2010-08-30 | 2012-03-08 | Fujifilm Corp | X-ray imaging apparatus and x-ray imaging method |
WO2012030304A1 (en) | 2010-09-01 | 2012-03-08 | Agency For Science, Technology And Research | A robotic device for use in image-guided robot assisted surgical training |
KR20120030174A (en) | 2010-09-17 | 2012-03-28 | 삼성전자주식회사 | Surgery robot system and surgery apparatus and method for providing tactile feedback |
EP2431003B1 (en) | 2010-09-21 | 2018-03-21 | Medizinische Universität Innsbruck | Registration device, system, kit and method for a patient registration |
US8679125B2 (en) | 2010-09-22 | 2014-03-25 | Biomet Manufacturing, Llc | Robotic guided femoral head reshaping |
US8657809B2 (en) | 2010-09-29 | 2014-02-25 | Stryker Leibinger GmbH & Co. KG | Surgical navigation system |
US8718346B2 (en) | 2011-10-05 | 2014-05-06 | Saferay Spine Llc | Imaging system and method for use in surgical and interventional medical procedures |
US8526700B2 (en) | 2010-10-06 | 2013-09-03 | Robert E. Isaacs | Imaging system and method for surgical and interventional medical procedures |
US9913693B2 (en) | 2010-10-29 | 2018-03-13 | Medtronic, Inc. | Error correction techniques in surgical navigation |
US20120123417A1 (en) | 2010-11-04 | 2012-05-17 | Smith & Nephew, Inc. | Drill Guide with Depth Stop |
EP2651295A4 (en) | 2010-12-13 | 2015-11-18 | Ortho Kinematics Inc | Methods, systems and devices for clinical data reporting and surgical navigation |
US8876866B2 (en) | 2010-12-13 | 2014-11-04 | Globus Medical, Inc. | Spinous process fusion devices and methods thereof |
WO2012088321A1 (en) | 2010-12-22 | 2012-06-28 | Viewray Incorporated | System and method for image guidance during medical procedures |
US20130281821A1 (en) | 2011-01-13 | 2013-10-24 | Koninklijke Philips Electronics N.V. | Intraoperative camera calibration for endoscopic surgery |
KR101181613B1 (en) | 2011-02-21 | 2012-09-10 | 윤상진 | Surgical robot system for performing surgery based on displacement information determined by user designation and control method therefor |
US20120226145A1 (en) | 2011-03-03 | 2012-09-06 | National University Of Singapore | Transcutaneous robot-assisted ablation-device insertion navigation system |
US9026247B2 (en) | 2011-03-30 | 2015-05-05 | University of Washington Through Its Center for Commercialization | Motion and video capture for tracking and evaluating robotic surgery and associated systems and methods |
US9308050B2 (en) | 2011-04-01 | 2016-04-12 | Ecole Polytechnique Federale De Lausanne (Epfl) | Robotic system and method for spinal and other surgeries |
US20120256092A1 (en) | 2011-04-06 | 2012-10-11 | General Electric Company | Ct system for use in multi-modality imaging system |
WO2012139031A1 (en) | 2011-04-06 | 2012-10-11 | The Trustees Of Columbia University In The City Of New York | System, method and computer-accessible medium for providing a panoramic cone beam computed tomography (cbct) |
WO2012149548A2 (en) | 2011-04-29 | 2012-11-01 | The Johns Hopkins University | System and method for tracking and navigation |
EP2719353A4 (en) | 2011-06-06 | 2015-04-22 | Nozomu Matsumoto | Method for manufacturing registration template |
US8498744B2 (en) | 2011-06-30 | 2013-07-30 | Mako Surgical Corporation | Surgical robotic systems with manual and haptic and/or active control modes |
US9089353B2 (en) | 2011-07-11 | 2015-07-28 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
US8818105B2 (en) | 2011-07-14 | 2014-08-26 | Accuray Incorporated | Image registration for image-guided surgery |
KR20130015146A (en) | 2011-08-02 | 2013-02-13 | 삼성전자주식회사 | Method and apparatus for processing medical image, robotic surgery system using image guidance |
US10866783B2 (en) | 2011-08-21 | 2020-12-15 | Transenterix Europe S.A.R.L. | Vocally activated surgical control system |
US9427330B2 (en) | 2011-09-06 | 2016-08-30 | Globus Medical, Inc. | Spinal plate |
US8864833B2 (en) | 2011-09-30 | 2014-10-21 | Globus Medical, Inc. | Expandable fusion device and method of installation thereof |
US9060794B2 (en) | 2011-10-18 | 2015-06-23 | Mako Surgical Corp. | System and method for robotic surgery |
US8894688B2 (en) | 2011-10-27 | 2014-11-25 | Globus Medical Inc. | Adjustable rod devices and methods of using the same |
DE102011054910B4 (en) | 2011-10-28 | 2013-10-10 | Ovesco Endoscopy Ag | Magnetic end effector and means for guiding and positioning same |
US8693730B2 (en) | 2011-11-15 | 2014-04-08 | Macdonald Dettwiler & Associates Inc. | Method of real-time tracking of moving/flexible surfaces |
FR2983059B1 (en) | 2011-11-30 | 2014-11-28 | Medtech | Robotic-assisted method of positioning a surgical instrument in relation to the body of a patient and device for carrying out said method |
WO2013084221A1 (en) | 2011-12-05 | 2013-06-13 | Mazor Robotics Ltd. | Active bed mount for surgical robot |
KR101901580B1 (en) | 2011-12-23 | 2018-09-28 | 삼성전자주식회사 | Surgical robot and control method thereof |
FR2985167A1 (en) | 2011-12-30 | 2013-07-05 | Medtech | Robotized medical method for monitoring patient breathing and correcting the robotic trajectory |
US9265583B2 (en) | 2011-12-30 | 2016-02-23 | Mako Surgical Corp. | Method for image-based robotic surgery |
CA2862402C (en) | 2011-12-30 | 2020-01-07 | Mako Surgical Corp. | System for image-based robotic surgery |
KR20130080909A (en) | 2012-01-06 | 2013-07-16 | 삼성전자주식회사 | Surgical robot and method for controlling the same |
JP5107468B1 (en) | 2012-02-01 | 2012-12-26 | 順 岡田 | Drill stopper for a surgical bone drill |
US9138297B2 (en) | 2012-02-02 | 2015-09-22 | Intuitive Surgical Operations, Inc. | Systems and methods for controlling a robotic surgical system |
WO2013126659A1 (en) | 2012-02-22 | 2013-08-29 | Veran Medical Technologies, Inc. | Systems, methods, and devices for four dimensional soft tissue navigation |
US11207132B2 (en) | 2012-03-12 | 2021-12-28 | Nuvasive, Inc. | Systems and methods for performing spinal surgery |
US8855822B2 (en) | 2012-03-23 | 2014-10-07 | Innovative Surgical Solutions, Llc | Robotic surgical system with mechanomyography feedback |
KR101946000B1 (en) | 2012-03-28 | 2019-02-08 | 삼성전자주식회사 | Robot system and Control Method thereof for surgery |
US8888821B2 (en) | 2012-04-05 | 2014-11-18 | Warsaw Orthopedic, Inc. | Spinal implant measuring system and method |
WO2013158655A1 (en) | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Imaging system with rigidly mounted fiducial markers |
WO2013158640A1 (en) | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Wireless imaging system |
US20140142591A1 (en) | 2012-04-24 | 2014-05-22 | Auris Surgical Robotics, Inc. | Method, apparatus and a system for robotic assisted surgery |
US10383765B2 (en) | 2012-04-24 | 2019-08-20 | Auris Health, Inc. | Apparatus and method for a global coordinate system for use in robotic surgery |
US9020613B2 (en) | 2012-05-01 | 2015-04-28 | The Johns Hopkins University | Method and apparatus for robotically assisted cochlear implant surgery |
JP2015519108A (en) | 2012-05-02 | 2015-07-09 | 医百科技股份有限公司 | Auxiliary guide method during intraoral surgery |
US9125556B2 (en) | 2012-05-14 | 2015-09-08 | Mazor Robotics Ltd. | Robotic guided endoscope |
EP2849650A4 (en) | 2012-05-18 | 2016-01-20 | Carestream Health Inc | Cone beam computed tomography volumetric imaging system |
KR20130132109A (en) | 2012-05-25 | 2013-12-04 | 삼성전자주식회사 | Supporting device and surgical robot system adopting the same |
WO2013181503A1 (en) | 2012-06-01 | 2013-12-05 | Intuitive Surgical Operations, Inc. | Manipulator arm-to-patient collision avoidance using a null-space |
CN108113755B (en) | 2012-06-01 | 2020-11-27 | 直观外科手术操作公司 | Multi-port surgical robot system architecture |
US20180325610A1 (en) | 2012-06-21 | 2018-11-15 | Globus Medical, Inc. | Methods for indicating and confirming a point of interest using surgical navigation systems |
US11896446B2 (en) | 2012-06-21 | 2024-02-13 | Globus Medical, Inc | Surgical robotic automation with tracking markers |
JP2015528713A (en) | 2012-06-21 | 2015-10-01 | グローバス メディカル インコーポレイティッド | Surgical robot platform |
US20190029765A1 (en) | 2012-06-21 | 2019-01-31 | Globus Medical, Inc. | Surgical robotic systems providing transfer of registration and related methods and computer program products |
US11399900B2 (en) | 2012-06-21 | 2022-08-02 | Globus Medical, Inc. | Robotic systems providing co-registration using natural fiducials and related methods |
CA2876846C (en) | 2012-06-22 | 2021-04-06 | Board Of Regents Of The University Of Nebraska | Local control robotic surgical devices and related methods |
US20130345757A1 (en) | 2012-06-22 | 2013-12-26 | Shawn D. Stad | Image Guided Intra-Operative Contouring Aid |
US20140005678A1 (en) | 2012-06-28 | 2014-01-02 | Ethicon Endo-Surgery, Inc. | Rotary drive arrangements for surgical instruments |
US8880223B2 (en) | 2012-07-16 | 2014-11-04 | Florida Institute for Human & Machine Cognition | Anthro-centric multisensory interface for sensory augmentation of telesurgery |
US20140031664A1 (en) | 2012-07-30 | 2014-01-30 | Mako Surgical Corp. | Radiographic imaging device |
KR101997566B1 (en) | 2012-08-07 | 2019-07-08 | 삼성전자주식회사 | Surgical robot system and control method thereof |
WO2014025399A1 (en) | 2012-08-08 | 2014-02-13 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
US9770305B2 (en) | 2012-08-08 | 2017-09-26 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
US10110785B2 (en) | 2012-08-10 | 2018-10-23 | Karl Storz Imaging, Inc. | Deployable imaging system equipped with solid state imager |
WO2014028703A1 (en) | 2012-08-15 | 2014-02-20 | Intuitive Surgical Operations, Inc. | Systems and methods for cancellation of joint motion using the null-space |
WO2014032046A1 (en) | 2012-08-24 | 2014-02-27 | University Of Houston | Robotic device and systems for image-guided and robot-assisted surgery |
US20140080086A1 (en) | 2012-09-20 | 2014-03-20 | Roger Chen | Image Navigation Integrated Dental Implant System |
US8892259B2 (en) | 2012-09-26 | 2014-11-18 | Innovative Surgical Solutions, LLC. | Robotic surgical system with mechanomyography feedback |
US9757160B2 (en) | 2012-09-28 | 2017-09-12 | Globus Medical, Inc. | Device and method for treatment of spinal deformity |
KR102038632B1 (en) | 2012-11-06 | 2019-10-30 | 삼성전자주식회사 | Surgical instrument, supporting device, and surgical robot system adopting the same |
CN104780862A (en) | 2012-11-14 | 2015-07-15 | 直观外科手术操作公司 | Smart drapes for collision avoidance |
KR102079945B1 (en) | 2012-11-22 | 2020-02-21 | 삼성전자주식회사 | Surgical robot and method for controlling the surgical robot |
US9008752B2 (en) | 2012-12-14 | 2015-04-14 | Medtronic, Inc. | Method to determine distribution of a material by an infused magnetic resonance image contrast agent |
US9393361B2 (en) | 2012-12-14 | 2016-07-19 | Medtronic, Inc. | Method to determine a material distribution |
US9001962B2 (en) | 2012-12-20 | 2015-04-07 | Triple Ring Technologies, Inc. | Method and apparatus for multiple X-ray imaging applications |
DE102013004459A1 (en) | 2012-12-20 | 2014-06-26 | avateramedical GmbH | Holding and positioning device of a surgical instrument and/or an endoscope for minimally invasive surgery and a robotic surgical system |
DE102012025101A1 (en) | 2012-12-20 | 2014-06-26 | avateramedical GmbH | Active positioning device of a surgical instrument and a surgical robotic system comprising it |
US9002437B2 (en) | 2012-12-27 | 2015-04-07 | General Electric Company | Method and system for position orientation correction in navigation |
US10028788B2 (en) | 2012-12-31 | 2018-07-24 | Mako Surgical Corp. | System for image-based robotic surgery |
KR20140090374A (en) | 2013-01-08 | 2014-07-17 | 삼성전자주식회사 | Single port surgical robot and control method thereof |
CN103969269B (en) | 2013-01-31 | 2018-09-18 | Ge医疗系统环球技术有限公司 | Method and apparatus for geometric calibration CT scanner |
US20140221819A1 (en) | 2013-02-01 | 2014-08-07 | David SARMENT | Apparatus, system and method for surgical navigation |
CN105101903B (en) | 2013-02-04 | 2018-08-24 | 儿童国家医疗中心 | Hybrid Control Surgical Robotic System |
KR20140102465A (en) | 2013-02-14 | 2014-08-22 | 삼성전자주식회사 | Surgical robot and method for controlling the same |
KR102117270B1 (en) | 2013-03-06 | 2020-06-01 | 삼성전자주식회사 | Surgical robot system and method for controlling the same |
KR20140110685A (en) | 2013-03-08 | 2014-09-17 | 삼성전자주식회사 | Method for controlling of single port surgical robot |
KR20140110620A (en) | 2013-03-08 | 2014-09-17 | 삼성전자주식회사 | surgical robot system and operating method thereof |
US9314308B2 (en) | 2013-03-13 | 2016-04-19 | Ethicon Endo-Surgery, Llc | Robotic ultrasonic surgical device with articulating end effector |
KR102119534B1 (en) | 2013-03-13 | 2020-06-05 | 삼성전자주식회사 | Surgical robot and method for controlling the same |
KR20140112207A (en) | 2013-03-13 | 2014-09-23 | 삼성전자주식회사 | Augmented reality imaging display system and surgical robot system comprising the same |
US9743987B2 (en) | 2013-03-14 | 2017-08-29 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers |
WO2014139024A1 (en) | 2013-03-15 | 2014-09-18 | Synaptive Medical (Barbados) Inc. | Planning, navigation and simulation systems and methods for minimally invasive therapy |
EP2996545B1 (en) | 2013-03-15 | 2021-10-20 | Board of Regents of the University of Nebraska | Robotic surgical systems |
US9629595B2 (en) | 2013-03-15 | 2017-04-25 | Hansen Medical, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
KR102117273B1 (en) | 2013-03-21 | 2020-06-01 | 삼성전자주식회사 | Surgical robot system and method for controlling the same |
ITMI20130516A1 (en) | 2013-04-05 | 2014-10-06 | Sofar Spa | Surgical system with sterile drapes |
KR20140121581A (en) | 2013-04-08 | 2014-10-16 | 삼성전자주식회사 | Surgical robot system |
KR20140123122A (en) | 2013-04-10 | 2014-10-22 | 삼성전자주식회사 | Surgical Robot and controlling method of thereof |
US9414859B2 (en) | 2013-04-19 | 2016-08-16 | Warsaw Orthopedic, Inc. | Surgical rod measuring system and method |
US8964934B2 (en) | 2013-04-25 | 2015-02-24 | Moshe Ein-Gal | Cone beam CT scanning |
KR20140129702A (en) | 2013-04-30 | 2014-11-07 | 삼성전자주식회사 | Surgical robot system and method for controlling the same |
US20140364720A1 (en) | 2013-06-10 | 2014-12-11 | General Electric Company | Systems and methods for interactive magnetic resonance imaging |
DE102013012397B4 (en) | 2013-07-26 | 2018-05-24 | Rg Mechatronics Gmbh | Surgical robot system |
US10786283B2 (en) | 2013-08-01 | 2020-09-29 | Musc Foundation For Research Development | Skeletal bone fixation mechanism |
US20150085970A1 (en) | 2013-09-23 | 2015-03-26 | General Electric Company | Systems and methods for hybrid scanning |
US10507067B2 (en) | 2013-10-07 | 2019-12-17 | Technion Research & Development Foundation Ltd. | Needle steering by shaft manipulation |
US9848922B2 (en) | 2013-10-09 | 2017-12-26 | Nuvasive, Inc. | Systems and methods for performing spine surgery |
EP3973899B1 (en) | 2013-10-09 | 2024-10-30 | Nuvasive, Inc. | Surgical spinal correction |
JP7107635B2 (en) | 2013-10-24 | 2022-07-27 | グローバス メディカル インコーポレイティッド | Surgical tool system and method |
ITBO20130599A1 (en) | 2013-10-31 | 2015-05-01 | Cefla Coop | Method and apparatus to increase the field of view in a computerized tomographic acquisition with cone-beam technique |
US20150146847A1 (en) | 2013-11-26 | 2015-05-28 | General Electric Company | Systems and methods for providing an x-ray imaging system with nearly continuous zooming capability |
EP3228254B1 (en) | 2014-02-21 | 2020-01-01 | 3DIntegrated ApS | A set comprising a surgical instrument |
US20150252940A1 (en) | 2014-03-05 | 2015-09-10 | BlueSky Designs, Inc. | Mounting and positioning apparatus for increased user independence |
CN111839737A (en) | 2014-03-17 | 2020-10-30 | 直观外科手术操作公司 | System and method for breakaway clutching in an articulated arm |
EP3136942A4 (en) | 2014-04-29 | 2018-01-17 | Boston Scientific Scimed, Inc. | Lumen-less illumination system |
DE102014208283B4 (en) | 2014-05-02 | 2019-01-10 | Peter Brehm | Apparatus for attaching a positioning means to a patient's bone, apparatus for processing a patient's bone and hip implant system |
AU2015277134B2 (en) | 2014-06-17 | 2019-02-28 | Nuvasive, Inc. | Systems and methods for planning, performing, and assessing spinal correction during surgery |
US10765438B2 (en) | 2014-07-14 | 2020-09-08 | KB Medical SA | Anti-skid surgical instrument for use in preparing holes in bone tissue |
JP6653317B2 (en) | 2014-08-12 | 2020-02-26 | インブイティ・インコーポレイテッド | Illuminated electrosurgical system and method of use |
US10327855B2 (en) | 2014-09-17 | 2019-06-25 | Intuitive Surgical Operations, Inc. | Systems and methods for utilizing augmented Jacobian to control manipulator joint movement |
US10631907B2 (en) | 2014-12-04 | 2020-04-28 | Mazor Robotics Ltd. | Shaper for vertebral fixation rods |
US20160166329A1 (en) | 2014-12-15 | 2016-06-16 | General Electric Company | Tomographic imaging for interventional tool guidance |
DE102014226240A1 (en) | 2014-12-17 | 2016-06-23 | Kuka Roboter Gmbh | System for robot-assisted medical treatment |
EP3878392B1 (en) | 2015-04-15 | 2024-06-12 | Mobius Imaging LLC | Integrated medical imaging and surgical robotic system |
US10180404B2 (en) | 2015-04-30 | 2019-01-15 | Shimadzu Corporation | X-ray analysis device |
CN107787208B (en) | 2015-06-23 | 2021-08-06 | 柯惠Lp公司 | Robotic surgical assembly |
EP3352698B1 (en) | 2015-09-25 | 2021-11-17 | Covidien LP | Surgical robotic assemblies and instrument adapters thereof |
US20170143284A1 (en) | 2015-11-25 | 2017-05-25 | Carestream Health, Inc. | Method to detect a retained surgical object |
US10070939B2 (en) | 2015-12-04 | 2018-09-11 | Zaki G. Ibrahim | Methods for performing minimally invasive transforaminal lumbar interbody fusion using guidance |
AU2017210124B2 (en) | 2016-01-22 | 2021-05-20 | Nuvasive, Inc. | Systems and methods for facilitating spine surgery |
US10448910B2 (en) | 2016-02-03 | 2019-10-22 | Globus Medical, Inc. | Portable medical imaging system |
US10842453B2 (en) | 2016-02-03 | 2020-11-24 | Globus Medical, Inc. | Portable medical imaging system |
US11058378B2 (en) | 2016-02-03 | 2021-07-13 | Globus Medical, Inc. | Portable medical imaging system |
US9962133B2 (en) | 2016-03-09 | 2018-05-08 | Medtronic Navigation, Inc. | Transformable imaging system |
EP3241518B1 (en) | 2016-04-11 | 2024-10-23 | Globus Medical, Inc | Surgical tool systems |
CA2959215C (en) | 2016-05-10 | 2018-06-12 | Piotr KUCHNIO | Multispectral synchronized imaging |
US10172630B2 (en) | 2016-05-19 | 2019-01-08 | Medos International Sarl | Drill guide with adjustable stop |
JP7131834B2 (en) | 2016-05-25 | 2022-09-06 | ザクト ロボティクス リミテッド | Automatic insertion device |
KR101861176B1 (en) | 2016-08-16 | 2018-05-28 | 주식회사 고영테크놀러지 | Surgical robot for stereotactic surgery and method for controlling a stereotactic surgery robot |
CN116269696A (en) | 2016-08-25 | 2023-06-23 | 内布拉斯加大学董事会 | Quick release tool coupler and related systems and methods |
US9931025B1 (en) | 2016-09-30 | 2018-04-03 | Auris Surgical Robotics, Inc. | Automated calibration of endoscopes with pull wires |
CN109952070B (en) | 2016-10-05 | 2022-02-01 | 纽文思公司 | Surgical navigation system and related methods |
JP7145599B2 (en) | 2016-10-10 | 2022-10-03 | グローバス メディカル インコーポレイティッド | Method and system for improving convergence of 2D-3D registration |
JP2018110841A (en) | 2016-11-10 | 2018-07-19 | グローバス メディカル インコーポレイティッド | Systems and methods of checking positioning for surgical systems |
GB2552855B (en) | 2017-01-31 | 2019-02-13 | Cmr Surgical Ltd | Surgical instrument engagement detection |
US10682129B2 (en) | 2017-03-23 | 2020-06-16 | Mobius Imaging, Llc | Robotic end effector with adjustable inner diameter |
- 2019-11-26 US US16/695,310 patent/US20200297357A1/en not_active Abandoned
- 2022-05-09 US US17/662,437 patent/US11944325B2/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200188134A1 (en) * | 2018-12-14 | 2020-06-18 | Howmedica Osteonics Corp. | Augmented, Just-in-Time, Patient-Specific Implant Manufacture |
US12115083B2 (en) * | 2018-12-14 | 2024-10-15 | Howmedica Osteonics Corp. | Augmented, just-in-time, patient-specific implant manufacture |
US20220104901A1 (en) * | 2020-10-07 | 2022-04-07 | Zimmer, Inc. | Depth limiter for robotically assisted arthroplasty |
US11986258B2 (en) * | 2020-10-07 | 2024-05-21 | Zimmer, Inc. | Depth limiter for robotically assisted arthroplasty |
US20240261043A1 (en) * | 2020-10-07 | 2024-08-08 | Zimmer, Inc. | Depth limiter for robotically assisted arthroplasty |
US20220168055A1 (en) * | 2020-11-30 | 2022-06-02 | Medtech S.A. | Hybrid control of surgical robot for fast positioning onto planned trajectories |
Also Published As
Publication number | Publication date |
---|---|
US11944325B2 (en) | 2024-04-02 |
US20220330954A1 (en) | 2022-10-20 |
Similar Documents
Publication | Title
---|---
US11850012B2 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US12127803B2 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
EP3711700B1 (en) | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11944325B2 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11737696B2 (en) | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
EP3827760B1 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related devices
US11744598B2 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US20200297430A1 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
EP3881791A1 (en) | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
JP7112534B2 (en) | A system for neuronavigation registration and robot trajectory guidance
EP3756609B1 (en) | System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related devices
EP3733112A1 (en) | System for robotic trajectory guidance for navigated biopsy needle
US20240108417A1 (en) | System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US20200297451A1 (en) | System for robotic trajectory guidance for navigated biopsy needle, and related methods and devices
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: GLOBUS MEDICAL, INC., PENNSYLVANIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CAMERON, HAYDEN;MANTZAVINOS, SPIROS;CRAWFORD, NEIL R.;AND OTHERS;SIGNING DATES FROM 20191126 TO 20191210;REEL/FRAME:051226/0192
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING RESPONSE FOR INFORMALITY, FEE DEFICIENCY OR CRF ACTION
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION