`
`PROGRAM
`
`TECHNICAL FIELD
`
`The present invention relates to a surgery assistance device and a surgery assistance
`
`program with which navigation during surgery is performed.
`
`BACKGROUND ART
`
`In a medical facility, surgery assistance devices that allow surgery to be simulated are
`
`employed in order to perform better surgery.
`
`A conventional surgery assistance device comprised, for example, a tomographic
`
`image information acquisition section for acquiring tomographic image information, such as
`
`an image acquired by PET (positron emission tomography), a nuclear magnetic resonance
`
`image (MRI), or an X-ray CT image, a memory connected to the tomographic image
`
`information acquisition section, a volume rendering computer connected to the memory, a
`
`display for displaying the computation results of the volume rendering computer, and an
`
`input section for giving resecting instructions with respect to a displayed object that is being
`
`displayed on the display.
`
For example, Patent Literature 1 discloses an endoscopic surgery assistance device with which the coordinates of a three-dimensional image of the endoscope actually being used and the coordinates of three-dimensional volume image data produced using a tomographic image are integrated, and these are displayed superposed over the endoscopic video. This allows an image of the surgical site region to be superposed at the corresponding location over the endoscopic image in real time, following changes in the position of the endoscope or surgical instrument.
`
`CITATION LIST
`
`PATENT LITERATURE
`
`Patent Literature 1: Japanese Patent No. 4,152,402 (registered July 11, 2008)
`
`SUMMARY
`
`TECHNICAL PROBLEM
`
`However, the following problem was encountered with the conventional surgery
`
`assistance device discussed above.
`
`
Specifically, with the surgery assistance device disclosed in the above publication, since an image of the surgical site region is displayed superposed at that location over the endoscopic image in real time, the distance between the surgical instrument distal end and a specific region can be calculated. What is disclosed there, however, is not navigation during surgery, but merely a warning and a display of the distance to a blood vessel, organ, or other site with which the surgical instrument must not come into contact.
`
`It is an object of the present invention to provide a surgery assistance device and a
`
`surgery assistance program with which proper navigation can be performed during surgery
`
`while the user views the resection site, which is resected using a surgical instrument.
`
`SOLUTION TO PROBLEM
`
`The surgery assistance device pertaining to the first invention is a surgery assistance
`
`device for performing navigation while displaying a three-dimensional simulation image
`
`produced from tomographic image information during surgery in which a resection-use
`
`
`
`surgical instrument is used while the user views an endoscopic image, the device comprising
`
`a tomographic image information acquisition section, a memory, a volume rendering
`
`computer, an endoscope/surgical instrument position sensor, a registration computer, a
`
`simulator, a distance calculator, and a navigator. The tomographic image information
`
`acquisition section acquires tomographic image information about a patient. The memory is
`
`connected to the tomographic image information acquisition section and stores voxel
`
`information for the tomographic image information. The volume rendering computer is
`
`connected to the memory and samples voxel information in a direction perpendicular to the
`
`sight line on the basis of the voxel information. The endoscope/surgical instrument position
`
sensor sequentially senses the three-dimensional positions of the endoscope and the surgical
`
`instrument. The registration computer integrates the coordinates of a three-dimensional
`
`image produced by the volume rendering computer and the coordinates of the endoscope and
`
`the surgical instrument sensed by the endoscope/surgical instrument position sensor. The
`
`simulator stores the resection portion scheduled for surgery and virtually resected on the
`
`three-dimensional image produced by the volume rendering computer, in the memory after
`
`associating it with the voxel information. The distance calculator calculates a distance
`
`between the working end of the surgical instrument on the three-dimensional image and the
`
`voxel information indicating the resection portion and stored in the memory. The navigator
`
`displays the working end of the surgical instrument on the three-dimensional image by using
`
`the coordinates of the surgical instrument during surgery, and displays the distance between
`
`the working end and the voxel information indicating the resection portion stored in the
`
`
`memory, along with the endoscopic image displayed during surgery.
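The distance computation performed by the distance calculator can be sketched in a few lines: given the tracked position of the working end and the set of voxel coordinates stored as the resection portion, take the shortest Euclidean distance. This is an illustrative sketch, not the device's actual implementation; the function and variable names are hypothetical.

```python
import numpy as np

def distance_to_resection(working_end, resection_voxels):
    """Shortest Euclidean distance from the instrument's working end
    (a 3-D point) to any voxel marked as the resection portion."""
    diffs = resection_voxels - working_end          # vector to each voxel
    return float(np.min(np.linalg.norm(diffs, axis=1)))

# Hypothetical example: three resection voxels (in mm), tool tip at origin.
voxels = np.array([[10.0, 0.0, 0.0], [0.0, 5.0, 0.0], [0.0, 0.0, 20.0]])
tip = np.array([0.0, 0.0, 0.0])
print(distance_to_resection(tip, voxels))  # nearest voxel is 5 mm away
```

In the device this value would be recomputed every frame from the sensed instrument pose and shown alongside the endoscopic image.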
`
`
`
Here, for example, after a resection simulation is conducted in a state in which the area around a specific bone, blood vessel, organ, or the like is displayed using a three-dimensional image produced from a plurality of X-ray CT images, when surgery is performed using an endoscope, the three-dimensional positions of the endoscope and surgical instrument actually used in the surgery are sequentially sensed, and the coordinates of the three-dimensional image formed from the X-ray CT images and the coordinates of the actual three-dimensional positions of the endoscope and the surgical instrument are integrated. Then, the distance from the distal end (the working end) of the actual surgical instrument to the site to be resected in the resection simulation performed using the three-dimensional image is calculated, and this distance is displayed along with the three-dimensional image to advise the surgeon, so that surgical navigation is performed seamlessly from the resection simulation.
`
`Here, the above-mentioned tomographic image includes, for example, two-
`
`dimensional images acquired using X-ray CT, MRI, PET, or another such medical device.
`
The above-mentioned surgical instrument includes resection instruments for resecting organs, bones, and so forth. The above-mentioned "working end" means the tooth portion, etc., of the surgical instrument that cuts out the bone, organ, or the like.
`
`Consequently, in surgery for resecting a specific organ by using an endoscope, for
`
`example, the surgeon can accurately ascertain how far the distal end of the surgical
`
`
`instrument is from the site that is to be resected, while moving the resection instrument or
`
`other surgical instrument toward the resection site. This allows the surgeon to navigate
`
`
`
`properly while inserting the surgical instrument, without feeling any uncertainty due to not
`
`knowing how far apart the surgical instrument distal end and the resection site are.
`
`The surgery assistance device pertaining to the second invention is the surgery
`
`assistance device pertaining to the first invention, wherein the simulator senses the depth of
`
the surgical site during pre-surgery resection simulation and computes the degree of change
`
`in depth or discontinuity, and stops the resection or does not update the resection data if the
`
`degree of change exceeds a specific threshold.
`
`Here, the simulator sets a threshold for virtual resection, and provides a restriction
`
`when resection simulation is performed.
`
Consequently, if the change in depth, etc., exceeds the threshold, the site is not displayed in a post-resection state on the simulation image. This also avoids situations in which, while the threshold is updated during resection simulation, the threshold value becomes too small or the resection is halted too often.
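The restriction described above, in which resection data are only updated while the change in depth stays below a threshold, can be sketched as follows; the data layout and names are hypothetical.

```python
def apply_resection(depth_map, new_depths, threshold):
    """Update the virtual resection depth only where the change stays
    within the threshold; larger jumps (discontinuities) are rejected,
    so the resection is stopped and the data are not updated there."""
    updated = dict(depth_map)
    for voxel, depth in new_depths.items():
        change = abs(depth - depth_map.get(voxel, 0.0))
        if change <= threshold:      # within the allowed change: resect
            updated[voxel] = depth
        # else: leave the stored depth untouched
    return updated

state = {(0, 0): 1.0, (0, 1): 1.0}
state = apply_resection(state, {(0, 0): 1.5, (0, 1): 9.0}, threshold=2.0)
print(state)  # (0, 0) updated to 1.5; the 8 mm jump at (0, 1) is rejected
```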
`
`The surgery assistance device pertaining to the third invention is the surgery
`
assistance device pertaining to the first or second invention, wherein the navigator models the working end of the surgical instrument on the three-dimensional image using a multi-point model.
`
`Here, the multi-point model is a model for sampling a plurality of points on the outer
`
`edge of the site where collision is expected to occur.
`
Consequently, when a sensor for sensing the position, angle, etc., is provided at a specific position on the actual surgical instrument, for example, the surgical instrument is represented by multiple points in a virtual space, using the position of this sensor as a reference, and the distance to the resection portion can be calculated from these multiple points and displayed.
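One way to read the multi-point model is as a set of offsets, fixed in the instrument's local frame, that are mapped into virtual space from the sensor pose; the reported distance is then the minimum over the sampled points. The following is a sketch under that reading; the sensor pose, offsets, and names are all hypothetical.

```python
import numpy as np

def tool_points_from_sensor(sensor_pos, rotation, local_offsets):
    """Place the multi-point model of the working end in virtual space,
    using the sensor pose (position + 3x3 attitude) as the reference."""
    return sensor_pos + local_offsets @ rotation.T

def min_distance(points, target):
    """Minimum distance from any sampled point to a target position."""
    return float(min(np.linalg.norm(p - target) for p in points))

# Hypothetical: sensor at origin, identity attitude, three points sampled
# on the outer edge of the tool tip, 100 mm along the shaft (z axis).
offsets = np.array([[2.0, 0.0, 100.0], [-2.0, 0.0, 100.0], [0.0, 2.0, 100.0]])
pts = tool_points_from_sensor(np.zeros(3), np.eye(3), offsets)
print(min_distance(pts, np.array([0.0, 0.0, 110.0])))
```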
`
`The surgery assistance device pertaining to the fourth invention is the surgery
`
assistance device pertaining to any of the first to third inventions, wherein the navigator uses, as the vector of the distance, a vector having a component in the direction of the voxel information indicating the portion to be resected by the surgical instrument during surgery.
`
Consequently, sampling can be performed in the direction in which the surgical instrument moves closer to the resection site, and the positional relation between the resection site and the surgical instrument distal end can be displayed more effectively to the surgeon, such as by changing the display mode according to the speed, acceleration, and direction at which the multiple points approach.
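A distance vector with a component toward the resection portion can, for instance, be used to derive the approach speed by projecting the tip's velocity onto the tip-to-target direction; a positive value means the instrument is closing in, which could then drive the display mode. A hypothetical sketch of that idea:

```python
import numpy as np

def approach_speed(prev_tip, cur_tip, target, dt):
    """Signed speed (mm/s) at which the tip closes on the target:
    the tip velocity projected onto the tip-to-target direction.
    Positive values mean the instrument is approaching."""
    direction = target - cur_tip
    direction = direction / np.linalg.norm(direction)
    velocity = (cur_tip - prev_tip) / dt
    return float(velocity @ direction)

# Hypothetical: tip moved 1 mm straight toward the target in 0.1 s.
speed = approach_speed(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                       np.array([0.0, 0.0, 11.0]), dt=0.1)
print(speed)  # 10.0 mm/s toward the target
```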
`
`The surgery assistance device pertaining to the fifth invention is the surgery assistance
`
`device pertaining to any of the first to fourth inventions, wherein the navigator changes the
`
`display color of the voxels for each equidistance from the resection portion.
`
`Here, the range of equidistance, centered on the resection portion, is displayed as
`
`spheres of different colors on the navigation screen during surgery.
`
`Consequently, in navigation during surgery, the surgeon can easily see the distance
`
`from the portion where resection is performed to the surgical instrument distal end, which
`
`facilitates navigation.
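Changing the voxel display color for each equidistance band from the resection portion amounts to a banded lookup on the distance; the thresholds and colors below are made up purely for illustration.

```python
def band_color(distance_mm,
               bands=((5.0, "red"), (10.0, "yellow"), (20.0, "green"))):
    """Pick a display color from the distance to the resection portion,
    one color per equidistance band (hypothetical thresholds/colors)."""
    for limit, color in bands:
        if distance_mm <= limit:
            return color
    return "none"  # beyond the outermost sphere: no highlight

print(band_color(3.0), band_color(8.0), band_color(50.0))
```

On the navigation screen each band would appear as a differently colored sphere centered on the resection portion, as described above.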
`
`The surgery assistance device pertaining to the sixth invention is the surgery
`
`assistance device pertaining to any of the first to fifth inventions, wherein, after integrating
`
the coordinates of a three-dimensional image and the coordinates of the endoscope and the
`
`
`
`surgical instrument, the registration computer checks the accuracy of this coordinate
`
`integration, and corrects deviation in the coordinate integration if this accuracy exceeds a
`
`specific range.
`
Here, the accuracy of the registration in which the coordinates of the three-dimensional image produced on the basis of a plurality of X-ray CT images, etc., are integrated with the actual coordinates of the endoscope and surgical instrument is checked, and registration is performed again if a specific level of accuracy is not met.
`
`
This allows the position of the endoscope or surgical instrument to be displayed more accurately in the three-dimensional image.
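The accuracy check after coordinate integration can be sketched as a residual over feature points: map the sensed points through the current registration transform, compare against their image-space positions, and re-register when the RMS error leaves a tolerance. The transform, points, and tolerance below are hypothetical.

```python
import numpy as np

def registration_rms(image_points, sensed_points, transform):
    """RMS residual between feature points in image coordinates and the
    sensed points mapped through the registration transform (4x4
    homogeneous matrix)."""
    homog = np.hstack([sensed_points, np.ones((len(sensed_points), 1))])
    mapped = (transform @ homog.T).T[:, :3]
    return float(np.sqrt(np.mean(np.sum((mapped - image_points) ** 2, axis=1))))

image_pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
sensed_pts = image_pts + 1.0    # sensed coordinates offset by 1 mm per axis
rms = registration_rms(image_pts, sensed_pts, np.eye(4))
if rms > 1.0:                   # hypothetical tolerance in mm
    print("deviation too large, re-register:", rms)
```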
`
`The surgery assistance device pertaining to the seventh invention is the surgery
`
`assistance device pertaining to any of the first to sixth inventions, wherein the navigator sets
`
`and displays a first display area acquired by the endoscope and produced by the volume
`
`rendering computer, and a second display area in which the display is restricted by the
`
`surgical instrument during actual surgery.
`
`Here, in the three-dimensional image displayed on the monitor screen, etc., during
`
`surgery, the display shows the portion of the field of view that is restricted by the surgical
`
`instrument into which the endoscope is inserted.
`
`Therefore, the display is in a masked state, for example, so that the portion restricted
`
`by the retractor or other such tubular surgical instrument cannot be seen, and this allows a
`
`
`three-dimensional image to be displayed in a state that approximates the actual endoscopic
`
`image.
`
`
`
`The surgery assistance device pertaining to the eighth invention is the surgery
`
`assistance device pertaining to any of the first to seventh inventions, further comprising a
`
`display component that displays the three-dimensional image, an image of the distal end of
`
`the surgical instrument, and the distance.
`
`The surgery assistance device here comprises a monitor or other such display
`
`component.
`
Therefore, surgery can be assisted while a three-dimensional image that approximates
`
`the actual video from an endoscope is displayed on the display component during surgery in
`
`which an endoscope is used.
`
`
`The surgery assistance program pertaining to the ninth invention is a surgery
`
`assistance program that performs navigation while displaying a three-dimensional simulation
`
image produced from tomographic image information, during surgery in which a resection-use surgical instrument is used while the user views an endoscopic image, wherein the surgery assistance
`
program causes a computer to execute a surgery assistance method comprising the steps
`
`of acquiring tomographic image information about a patient, storing voxel information for the
`
`tomographic image information, sampling voxel information in a direction perpendicular to
`
`the sight line on the basis of the voxel information, sequentially sensing the three-
`
`dimensional positions of the endoscope and surgical instrument, integrating the coordinates
`
of the three-dimensional image and the coordinates of the endoscope and the surgical
`
`instrument, calculating the distance between the working end of the surgical instrument and
`
`the resection site included in the video acquired by the endoscope, and displaying the
`
`working end of the surgical instrument on the three-dimensional image by using the
`
`
`
`coordinates of the surgical instrument during surgery, and combining and displaying an
`
`image indicating the distal end of the surgical instrument, and the distance between the
`
`resection site and the distal end of the surgical instrument, while navigation is performed
`
`during surgery.
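The steps recited above can be arranged as a per-frame loop. The sketch below wires the steps together with dummy callables purely to show the data flow; every name is hypothetical, and each real step would of course be far more involved.

```python
def surgery_assistance_loop(acquire, store, render, sense, register,
                            distance, display):
    """Skeleton of the recited method steps, one navigation frame:
    acquire tomographic info, store voxels, render, sense poses,
    integrate coordinates, compute the distance, and display it."""
    voxels = store(acquire())             # tomographic info -> voxel store
    image = render(voxels)                # sample perpendicular to sight line
    scope_pose, tool_pose = sense()       # 3-D positions of scope and tool
    tool_in_image = register(image, scope_pose, tool_pose)
    d = distance(tool_in_image, voxels)
    display(image, tool_in_image, d)
    return d

# Minimal dummy wiring just to exercise the flow:
result = surgery_assistance_loop(
    acquire=lambda: "ct-slices",
    store=lambda info: {"resection": (0.0, 0.0, 5.0)},
    render=lambda vox: "3d-image",
    sense=lambda: ((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)),
    register=lambda img, scope, tool: tool,
    distance=lambda tool, vox: abs(vox["resection"][2] - tool[2]),
    display=lambda img, tool, d: None,
)
print(result)  # 4.0 mm between working end and resection portion
```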
`
`Here, for example, after a resection simulation is conducted in a state in which the
`
area around a specific bone, blood vessel, organ, or the like is displayed using a three-dimensional image produced from a plurality of X-ray CT images, when surgery is performed
`
using an endoscope, three-dimensional positions of the endoscope or surgical instrument
`
`actually used in the surgery are sequentially sensed, and the coordinates of a three-
`
`dimensional image formed from a plurality of X-ray CT images and the coordinates of the
`
`actual three-dimensional position of the endoscope and the surgical instrument are integrated.
`
`Then, the distance to the distal end of the actual surgical instrument with respect to the site to
`
`be resected in the resection simulation performed using a three-dimensional image is
`
`calculated, and this distance is displayed along with the three-dimensional image to advise
`
`the surgeon, so that surgical navigation is performed seamlessly from the resection simulation.
`
`Here, the above-mentioned tomographic image includes, for example, two-
`
`dimensional images acquired using X-ray CT, MRI, PET, or another such medical device.
`
The above-mentioned surgical instrument includes resection instruments for resecting organs,
`
`bones, and so forth.
`
`Consequently, in surgery for resecting a specific organ by using an endoscope, for
`
`example, the surgeon can accurately ascertain how far the distal end of the surgical
`
`instrument is from the site that is to be resected, while moving the resection instrument or
`
`
`
`
`other surgical instrument toward the resection site. This allows the surgeon to navigate
`
`properly while inserting the surgical instrument, without feeling any uncertainty due to not
`
`knowing how far apart the surgical instrument distal end and the resection site are.
`
`The surgery assistance device pertaining to the tenth invention is a surgery assistance
`
`device for performing navigation while displaying a three-dimensional simulation image
`
`produced from tomographic image information, during surgery in which a resection-use
`
`surgical instrument is used while the user views an endoscopic image, the device comprising
`
`a simulator and a navigator. The simulator stores the resection portion scheduled for surgery
`
`and virtually resected on the three-dimensional image produced by sampling voxel
`
`information for the tomographic image information of the patient in a direction perpendicular
`
`to the sight line, after associating it with the voxel information. The navigator calculates a
`
`distance between the working end of the surgical instrument on the three-dimensional image
`
`and the voxel information indicating the resection portion stored in the memory, displays the
`
`working end of the surgical instrument on the three-dimensional image by using the
`
`coordinates of the surgical instrument during surgery, and displays the distance between the
`
`working end and the voxel information indicating the resection portion, along with the
`
`endoscopic image displayed during surgery.
`
`BRIEF DESCRIPTION OF DRAWINGS
`
`FIG. 1 shows the configuration of a surgery assistance system that includes a personal
`
`computer (surgery assistance device) pertaining to an embodiment of the present invention;
`
`FIG. 2 is an oblique view of the personal computer included in the surgery assistance
`
`system in FIG. 1;
`
`
`
`
`FIG. 3 is a control block diagram of the personal computer in FIG. 2;
`
`FIG. 4 is a block diagram of the configuration of an endoscope parameter storage
`
`section in a memory included in the control blocks in FIG. 3;
`
FIG. 5 is a block diagram of the configuration of a surgical instrument parameter storage section in the memory included in the control blocks in FIG. 3;
`
`FIGS. 6A and 6B are a side view and a plan view of an oblique endoscope included in
`
`the surgery assistance system in FIG. 1 and a three-dimensional sensor attached to this
`
`endoscope;
`
`FIG. 7A is an operational flowchart of the personal computer in FIG. 2, FIG. 7B is an
`
`
`operational flowchart of the flow in S6 of FIG. 7A, and FIG. 7C is an operational flowchart
`
`of the flow in S8 in FIG. 7A;
`
`FIG. 8 shows a navigation screen displayed on the display of the surgery assistance
`
`system in FIG. 1;
`
`FIG. 9 shows a navigation screen displayed on the display of the surgery assistance
`
`
`system in FIG. 1;
`
`FIG. 10 shows a navigation screen displayed on the display of the surgery assistance
`
`system in FIG. 1;
`
`FIG. 11 shows a navigation screen displayed on the display of the surgery assistance
`
`system in FIG. 1;
`
`
`FIG. 12 shows a navigation screen displayed on the display of the surgery assistance
`
`system in FIG. 1;
`
`
`
`
`FIGS. 13A and 13B illustrate mapping from two-dimensional input with a mouse to
`
`three-dimensional operation with an endoscope when a tubular surgical instrument (retractor)
`
`is used;
`
`FIG. 14 illustrates mapping from two-dimensional input with a mouse to three-
`
`dimensional operation with an endoscope;
`
`FIG. 15 illustrates the display of a volume rendering image that shows the desired
`
`oblique angle with an oblique endoscope;
`
`FIGS. 16A to 16C show displays when the distal end position of an oblique
`
endoscope and the sight line vector are shown in a three-panel view;
`
`FIG. 17 shows an oblique endoscopic image that is displayed by the personal
`
`computer in FIG. 2;
`
`FIG. 18A shows an oblique endoscopic image pertaining to this embodiment, and FIG.
`
18B shows an endoscopic image when using a direct-view endoscope instead of an oblique
`
`endoscope;
`
`FIG. 19 shows a monitor screen that shows the restricted display area of an oblique
`
`endoscope;
`
FIGS. 20A to 20C show an endoscopic image centered on a resection site C, an endoscopic view cropped from a three-dimensional image corresponding to a portion of this site, and a monitor screen displaying an image in which the endoscopic image and the endoscopic view are superposed;
`
`
`
`
FIGS. 21A to 21C show an endoscopic image, a three-dimensional image (VR image)
`
`corresponding to that portion, and a monitor screen displaying an image in which the
`
`endoscopic image and the VR image are superposed;
`
`FIG. 22 shows a monitor screen displaying a registration interface screen for setting
`
`feature points;
`
`FIG. 23 illustrates coordinate conversion in registration;
`
`FIGS. 24A and 24B show a correction value setting interface in registration, and a
`
`display example of the coordinate axis and feature points on a volume rendering image;
`
`FIG. 25A is a side view of a surgical instrument included in the surgery assistance
`
`system in FIG. 1, and a three-dimensional sensor attached thereto, and FIG. 25B is a side
`
`view in which the distal end of a surgical instrument is modeled by multi-point modeling in a
`
`virtual space in which the sensor in FIG. 25A is used as a reference;
`
`FIG. 26 illustrates the step of calculating and displaying the distance from the distal
`
`end of the surgical instrument in FIG. 25B to the resection site;
`
`FIG. 27 shows a display example in which a region of equidistance from the resection
`
`site in virtual space is displayed;
`
`FIG. 28 illustrates a case in which resection control encompassing the concept of
`
`threshold summing valid points is applied to a method for updating a threshold in which
`
`resection is restricted in resection simulation;
`
`FIG. 29 illustrates a case in which resection control not encompassing the concept of
`
`threshold summing valid points is applied to a method for updating a threshold in which
`
`resection is restricted in resection simulation;
`
`
`
`
`FIGS. 30A and 30B are a side view and a plan view showing an endoscope and sensor
`
`used in the surgery assistance system pertaining to another embodiment of the present
`
`invention; and
`
`FIGS. 31A and 31B are a side view and a plan view showing an endoscope and sensor
`
`used in the surgery assistance system pertaining to yet another embodiment of the present
`
`invention.
`
`DESCRIPTION OF EMBODIMENTS
`
`The personal computer (surgery assistance device) pertaining to an embodiment of the
`
`
`present invention will now be described through reference to FIGS. 1 to 29.
`
`In this embodiment, a case is described in which navigation is performed in surgery
`
`for lumbar spinal stenosis using an endoscope and a resection tool or other such surgical
`
`instrument, but the present invention is not limited to this.
`
`As shown in FIG. 1, the personal computer 1 pertaining to this embodiment
`
`constitutes a surgery assistance system 100 along with a display (display component) 2, a
`
`position and angle sensing device 29, an oblique endoscope (endoscope) 32, and a
`
`positioning transmitter (magnetic field generator) 34.
`
`The personal computer 1 functions as a surgery assistance device by reading a surgery
`
`assistance program that causes a computer to execute the surgery assistance method of this
`
`embodiment. The configuration of the personal computer 1 will be discussed in detail below.
`
`The display (display component) 2 displays a three-dimensional image for performing
`
`resection simulation or navigation during surgery (discussed below), and also displays a
`
`setting screen, etc., for surgical navigation or resection simulation.
`
`
`
`
`Since the display component for displaying navigation during surgery needs to
`
`display a navigation screen that is easy for the surgeon to understand during surgery, a large
`
`liquid crystal display 102 that is included in the surgery assistance system 100 in FIG. 1 is
`
`also used in addition to the display 2 of the personal computer 1.
`
`The position and angle sensing device 29 is connected to the personal computer 1, the
`
`positioning transmitter 34, and the oblique endoscope 32, and the position and attitude of the
`
`oblique endoscope 32 or the surgical instrument 33 during actual surgery are sensed on the
`
basis of the sensing result of a three-dimensional sensor 32a (see FIG. 6A, etc.) or a three-dimensional sensor 33b (see FIG. 25A) attached to the oblique endoscope 32, the surgical
`
`
`instrument 33, etc.
`
`The oblique endoscope (endoscope) 32 is inserted from the body surface near the
`
`portion undergoing surgery, into a tubular retractor 31 (discussed below), and acquires video
`
of the surgical site. The three-dimensional sensor 32a is attached to the oblique endoscope 32.
`
`The positioning transmitter (magnetic field generator) 34 is disposed near the surgical
`
`
`table on which the patient is lying, and generates a magnetic field. Consequently, the
`
`position and attitude of the oblique endoscope 32 and the surgical instrument 33 can be
`
sensed by sensing the magnetic field generated by the positioning transmitter 34 at the three-dimensional sensor 32a or the three-dimensional sensor 33b attached to the oblique
`
`endoscope 32 and the surgical instrument 33.
`
`
`Personal Computer 1
`
`
`
`
`As shown in FIG. 2, the personal computer 1 comprises the display (display
`
`component) 2 and various input components (a keyboard 3, a mouse 4, and a tablet 5 (see
`
`FIG. 2)).
`
`The display 2 displays three-dimensional images of bones, organs, or the like formed
`
`from a plurality of tomographic images such as X-ray CT images (an endoscopic image is
`
`displayed in the example in FIG. 2), and also displays the results of resection simulation and
`
`the content of surgical navigation.
`
`As shown in FIG. 3, control blocks such as the tomographic image information
`
`acquisition section 6 are formed in the interior of the personal computer 1.
`
`
`The tomographic image information acquisition section 6 is connected via the voxel
`
`information extractor 7 to the tomographic image information section 8. That is, the
`
`tomographic image information section 8 is supplied with tomographic image information
`
`from a device that captures tomographic images, such as CT, MRI, or PET, and this
`
`tomographic image information is extracted as voxel information by the voxel information
`
`
`extractor 7.
`
`The memory 9 is provided inside the personal computer 1, and has the voxel
`
`information storage section 10, the voxel label storage section 11, the color information
`
`storage section 12, the endoscope parameter storage section 22, and the surgical instrument
`
`parameter storage section 24. The memory 9 is connected to the volume rendering computer
`
`
`13 (distance calculator, display controller).
`
`The voxel information storage section 10 stores voxel information received from the
`
`voxel information extractor 7 via the tomographic image information acquisition section 6.
`
`
`
`
`The voxel label storage section 11 has a first voxel label storage section, a second
`
`voxel label storage section, and a third voxel label storage section. These first to third voxel
`
`label storage sections are provided corresponding to a predetermined range of CT values
`
(discussed below), that is, to the organ to be displayed. For instance, the first voxel label
`
`storage section corresponds to a range of CT values displaying a liver, the second voxel label
`
`storage section corresponds to a range of CT values displaying a blood vessel, and the third
`
`voxel label storage section corresponds to a range of CT values displaying a bone.
`
`The color information storage section 12 has a plurality of storage sections in its
`
`
`interior. These storage sections are each provided corresponding to a predetermined range of
`
`CT values, that is, to the bone, blood vessel, nerve, organ, or the like to be displayed. For
`
`instance, there may be a storage section corresponding to a range of CT values displaying a
`
`liver, a storage section corresponding to a range of CT values displaying a blood vessel, and a
`
`storage section corresponding to a range of CT values displaying a bone. Here, the various
`
`storage sections are set to different color information for each of the bone, blood vessel,
`
`nerve, or organ to be displayed. For example, white color information may be stored for the
`
`range of CT values corresponding to a bone, and red color information may be stored for the
`
`range of CT values corresponding to a blood vessel.
`
The CT values set for the bone, blood vessel, nerve, or organ to be displayed are the result of digitizing the extent of X-ray absorption in the body, and are expressed as relative values (in units of HU), with water at zero. For instance, the range of CT values in which a
`
`bone is displayed is 500 to 1000 HU, the range of CT values in which blood is displayed is
`
`
`
`
`30 to 50 HU, the range of CT values in which a liver is displayed is 60 to 70 HU, and the
`
`range of CT values in which a kidney is displayed is 30 to 40 HU.
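The CT-value ranges above can be expressed as a small lookup table. Note that the example ranges overlap (blood 30 to 50 HU, kidney 30 to 40 HU), so a single value can match more than one tissue, which is why this illustrative sketch returns a list of candidates; the table and names are taken only from the examples in the text.

```python
# HU ranges as given in the description above (illustrative values).
HU_RANGES = {
    "bone": (500, 1000),
    "blood": (30, 50),
    "liver": (60, 70),
    "kidney": (30, 40),
}

def tissues_for(ct_value):
    """All tissue labels whose CT-value range contains the given value.
    Ranges may overlap, so more than one label can match."""
    return sorted(name for name, (lo, hi) in HU_RANGES.items()
                  if lo <= ct_value <= hi)

print(tissues_for(35), tissues_for(600), tissues_for(0))
```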
`
`As shown in FIG. 4, the endoscope parameter storage section 22 has a first endoscope
`
`parameter storage section 22a, a second endoscope parameter storage section 22b, and a third
`
`endoscope parameter storage section 22c. The first to third endoscope parameter storage
`
`sections 22a to 22c store endoscope oblique angles, viewing angles, positions, attitudes, and
`
`other such information. The endoscope parameter storage section 22 is connected to an
`
`endoscope parameter setting section 23, as shown in FIG. 3.
`
`The endoscope parameter setting section 23 sets the endoscope parameters inputted
`
`via the keyboard 3 or the mouse 4, and sends them to the endoscope parameter storage
`
`section 22.
`
`As shown in FIG. 5, the surgical instrument parameter storage section 24 has a first
`
`surgical instrument parameter storage section 24a, a second surgical instrument parameter
`
`storage section 24b, and a third surgical instrument parameter storage section 24c. The first
`
`to third surgical instrument parameter storage sections 24a to 24c each store information such
`
`as the length, distal end shape, position, and attitude of the drill (if the surgical instrument is a
`
drill), for example. As shown in FIG. 3, the surgical instrument parameter storage section 24
`
`is connected to a surgical instrument parameter setting section 25.
`
`The surgical instrument parameter setting section 25 sets surgical instrument
`
parameters for the retractor 31, drill, etc., that are inputted via the keyboard 3 or the mouse 4,
`
`and sends them to the surgical instrument parameter storage section 24.
`
`
`
`
`
`An endoscope/surgical instrument position and attitude acquisition section
`
`(endoscope/surgical instrument position sensor) 26 receives via a bus 16 the sensing result
`
`from the position and angle sensing device 29, which senses the position and angle of the
`
`endoscope or surgical instrument, and sends this result to the volume rendering computer 13
`
`and a registration computer 27.
`
`The volume rendering computer 13 acquires a plurality of sets of slice information at
`
`a specific spacing in the Z direction and perpendicular to the sight line, on the basis of the
`
`voxel information stored in the voxel information storage section 10, the voxel labels stored
`
`in the voxel label storage section 11, and the color information stored in the color information
`
`storage section 12. The volume rendering computer 13 then displays this computation result
`
`as a three-dimensional image on the display 2.
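The sampling performed by the volume rendering computer 13, that is, slice planes taken at a fixed spacing along the sight line direction, can be sketched by computing the slice centers; the parameters and names are hypothetical.

```python
import numpy as np

def sample_slices(origin, sight_line, spacing, count):
    """Centers of slice planes perpendicular to the sight line,
    taken at a fixed spacing along it."""
    direction = np.asarray(sight_line, dtype=float)
    direction = direction / np.linalg.norm(direction)  # unit view vector
    origin = np.asarray(origin, dtype=float)
    return [origin + i * spacing * direction for i in range(count)]

centers = sample_slices(origin=(0, 0, 0), sight_line=(0, 0, 2),
                        spacing=1.5, count=4)
print(centers[-1])  # last slice center, 4.5 units along the view direction
```

In the device, the voxel information, labels, and color information would be resampled on each such plane to build the displayed image.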
`
`The volume rendering computer 13 also gives a real-time display that combines the
`
`movements of the actual endoscope or surgical instrument into a three-dimensional image on
`
`the basis of endoscope information stored in the endoscope parameter storage section 22,
`
`surgical instrument information stored in the surgical instrument parameter storage section 24,
`
`and the sensing result from the endoscope/surgical instrument position and attitude
`
`
`acquisition section 26.
`
`The volume rendering computer 13 also displays a virtual endoscopic image on the
`
`display 2 in a masked state that reflects image information in which the field of view is
`
`
`restricted by the retractor 31, with respect to the image information obtained by the
`
endoscope, on the basis of the above-mentioned endoscopic information and surgical
`
`instrument information. More specifically, the volume rendering computer 13 sets an
`
`
`
`
`endoscopic image display area (first display area) A1 (see FIG. 10, etc.) acquired by the
`
`endoscope, and a restricted display area (second display area) A2 (see FIG. 10, etc.), on the
`
`basis of information related to the endoscope stored in the endoscope parameter storage
`
section 22 (oblique angle, view angle, position, etc.) and information related to the surgical
`
`instrument stored in the surgical instrument parameter storage section 24 (diameter, length,
`
`etc.).
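The masked state, in which the display outside the retractor's circular opening is hidden, can be sketched as a boolean mask over the rendered image; the image size, center, and radius below are hypothetical.

```python
import numpy as np

def retractor_mask(height, width, center, radius):
    """True inside the circular field of view left open by a tubular
    retractor; False where the retractor restricts the display."""
    ys, xs = np.mgrid[0:height, 0:width]
    return (ys - center[0]) ** 2 + (xs - center[1]) ** 2 <= radius ** 2

mask = retractor_mask(100, 100, center=(50, 50), radius=40)
print(bool(mask[50, 50]), bool(mask[0, 0]))  # center visible, corner masked
```

Applying such a mask to the volume-rendered image yields a display that approximates the actual restricted endoscopic view.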
`
`The endoscopic image display area A1 here is a display area that is displayed on the
`
`monitor screen of the display 2 during actual endoscopic surgery. The restricted display area
`
`A2 is a display area in which the display acquired by