Surgical Instruments Checker

Welcome

Welcome to the website of the Surgical Instrument Checker, or SIC for short. Below you can find a description of why we chose this particular project and what our main aim is. You can also get a brief overview of the user interface and of the project members.


Initial situation

Nowadays, technology plays an increasingly significant role in medicine. Surgery in particular would be hard to imagine without technical applications and aids. Despite all the technical complexity and medical know-how, in rare cases it can happen that a surgical device is not prepared for an operation or remains in the patient's body afterwards. Exactly this is what a technical solution should prevent in the future.


The OECD statistics from 2011 show how many surgical instruments are left behind in the human body per 100,000 hospital discharges. The European average is about 3.8 surgical tools. Switzerland shows by far the highest rate, with 11.6 retained surgical instruments per 100,000 hospital discharges.

The German Coalition for Patient Safety estimates that about 600 to 700 people die in Germany every year as a result of such mistakes.

Aim

A picture of the surgical instruments is taken with an industrial camera before and after an operation. The surgical instruments in this picture are checked for completeness with the development software MATLAB. If the set is incomplete, an alarm should be triggered and the missing surgical tool should be displayed. The data, for example the operation method, the surgeon's name and the required surgical equipment, are managed in a database.
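As an illustration only, such a record could be managed in a simple MATLAB table as sketched below; the actual database design of the project may differ, and all names and values here are placeholders.

% Minimal sketch of one operation record, assuming a MATLAB table is used
% as a simple database back end (method, surgeon and instruments are placeholders).
operations = table( ...
    "Stent", ...                               % operation method
    "Dr. Example", ...                         % surgeon's name
    {["scissors"; "clamp"; "forceps"]}, ...    % required surgical instruments
    datetime("today"), ...                     % operation date
    'VariableNames', {'Method', 'Surgeon', 'Instruments', 'Date'});

% Look up the required instruments for a given method.
required = operations.Instruments{operations.Method == "Stent"};

% Persist the table between sessions.
save('operations_db.mat', 'operations');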

System overview

User Interface

On the start page the operation method can be selected and the name can be entered with a double-click. There you also have to select whether the instruments are placed on one or on two different tables. The date is displayed here, too. With the Next button you get an overview of the required surgical tools.

If, for example, the stent method has been selected, all the required tools are listed. If necessary, the number of surgical instruments can be changed. Use the Next button to get to the actual completeness check.

If the appropriate method cannot be found on the first page, you can create your own method. There, the name of the surgeon and the method name can be entered, and the user can choose the surgical instruments and their numbers in the tables on the right.

On this screen the operation method as well as the name can be checked once again. The pictures taken beforehand with the industrial camera are shown here. With Start Detection the tools can be checked for completeness; this is done by filtering and cross-correlation. Click the end button to continue.

After pressing the end button, you get to the last screen. Before saving the data, you can check the required surgical tools for completeness one more time by using the control button. If an instrument is missing, the user is warned by a pop-up window. The data can be saved with the save button, and you can then return to the start screen.

If you use the data button on the start page, you can see which operations have already been performed. The date, the surgeon and the selected operation method are listed here.

Implementation

To guarantee a robust implementation, three different detection methods were developed with MATLAB during the project. Every method has its own advantages and disadvantages.

Matching Points

One approach to finding the surgical instruments is matching points. Here, templates are compared with the group image; a template, i.e. a picture of every single tool, must be captured and stored beforehand. Because of problems with the lighting, we use Canny edge images. We have to take care that both images contain a similar amount of detail, otherwise the instrument won't be found. The implemented MATLAB function uses SURF and BRISK features to find blobs that correspond in the two pictures. Blob detection means that regions are detected which differ from the surrounding area in brightness or colour (SURF uses an integral image, also called a summed-area table, for this). Within a blob the pixels have the same or similar properties. This method is scale- and rotation-invariant.
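The MATLAB sketch below illustrates this matching-points idea; the file names, the use of SURF alone and the match-count threshold are assumptions for illustration, not taken from the project code.

% Minimal sketch of feature-based matching between a single-instrument
% template and the group image (file names are placeholders).
template = rgb2gray(imread('template_scissors.png'));
groupImg = rgb2gray(imread('group_image.png'));

% Canny edge images reduce the influence of uneven lighting; both images
% should contain a comparable amount of detail.
templateEdges = uint8(edge(template, 'canny')) * 255;
groupEdges    = uint8(edge(groupImg, 'canny')) * 255;

% Detect blob-like interest points with SURF
% (BRISK works the same way via detectBRISKFeatures).
ptsTemplate = detectSURFFeatures(templateEdges);
ptsGroup    = detectSURFFeatures(groupEdges);

% Extract descriptors and match them between the two images.
[featTemplate, validTemplate] = extractFeatures(templateEdges, ptsTemplate);
[featGroup,    validGroup]    = extractFeatures(groupEdges, ptsGroup);
indexPairs = matchFeatures(featTemplate, featGroup);

matchedTemplate = validTemplate(indexPairs(:, 1));
matchedGroup    = validGroup(indexPairs(:, 2));

% If enough points match, the instrument is considered present
% (the threshold of 10 matches is chosen here only for illustration).
isPresent = size(indexPairs, 1) >= 10;
showMatchedFeatures(template, groupImg, matchedTemplate, matchedGroup, 'montage');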

Cross-Correlation

Cross-correlation is one of the three methods we use to detect the surgical instruments in our images. For this, single pictures of the instruments to be found later, called templates, are needed. Each template is searched for in the recorded group images with cross-correlation: the template is moved over the group image and the correlation coefficient is calculated at every position. The highest coefficient marks the position of the tool. By itself this method is neither scale- nor rotation-invariant; scale invariance is only achieved with an additional marker that has to be placed next to the surgical instruments.
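A minimal MATLAB sketch of this template search with normalized cross-correlation is given below; the file names and the acceptance threshold are placeholders, not the project's actual values.

% Sketch of template matching with normalized cross-correlation.
template = rgb2gray(imread('template_clamp.png'));
groupImg = rgb2gray(imread('group_image.png'));

% Slide the template over the group image and compute the normalized
% cross-correlation coefficient at every position.
c = normxcorr2(template, groupImg);

% The position of the maximum coefficient marks the best match.
[peakValue, peakIdx] = max(c(:));
[yPeak, xPeak] = ind2sub(size(c), peakIdx);

% Convert from correlation-map coordinates to the top-left corner of the
% match in the group image.
yTopLeft = yPeak - size(template, 1) + 1;
xTopLeft = xPeak - size(template, 2) + 1;

% Treat the instrument as found only if the peak is high enough
% (the threshold of 0.6 is chosen here only for illustration).
isFound = peakValue > 0.6;
fprintf('Peak %.2f at (%d, %d), found: %d\n', peakValue, xTopLeft, yTopLeft, isFound);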

Generalized Hough Transformation

The Hough transformation is a detection method for simple as well as more complex objects and is normally neither rotation- nor scale-invariant, although with some extra work it can be made both. MATLAB offers predefined functions for detecting lines and circles, but for the generalized Hough transformation that we need, we have to implement the detection ourselves. The first step consists of learning the different objects: the centre point is determined and the gradient information of every edge point is stored in a look-up table. After that, the object can be found in another picture. For a first try, we searched for the scissors in the same image. The main task is to find the centre point again: possible centre points are calculated from the edge points, and the pixel with the most votes is the potential centre. With the extension mentioned above, this method is also rotation- and scale-invariant.
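The sketch below shows one possible MATLAB implementation of the learning and voting steps for a fixed rotation and scale; the file names, the number of gradient bins and the use of the template centroid as reference point are assumptions, not the project code.

% Minimal sketch of the generalized Hough transform (fixed rotation/scale).
template  = rgb2gray(imread('template_scissors.png'));
searchImg = rgb2gray(imread('group_image.png'));

% --- Learning phase: build the look-up table (R-table) from the template ---
tEdges = edge(template, 'canny');
[Gx, Gy] = imgradientxy(double(template));
[yT, xT] = find(tEdges);
centre = [mean(xT), mean(yT)];                 % reference (centre) point

nBins = 64;                                    % quantized gradient directions
rtable = cell(nBins, 1);
for k = 1:numel(xT)
    phi = atan2(Gy(yT(k), xT(k)), Gx(yT(k), xT(k)));
    bin = mod(round((phi + pi) / (2*pi) * (nBins - 1)), nBins) + 1;
    rtable{bin}(end+1, :) = centre - [xT(k), yT(k)];   % displacement vector
end

% --- Detection phase: every edge point votes for possible centre points ---
sEdges = edge(searchImg, 'canny');
[GxS, GyS] = imgradientxy(double(searchImg));
[yS, xS] = find(sEdges);
acc = zeros(size(searchImg));
for k = 1:numel(xS)
    phi = atan2(GyS(yS(k), xS(k)), GxS(yS(k), xS(k)));
    bin = mod(round((phi + pi) / (2*pi) * (nBins - 1)), nBins) + 1;
    for r = 1:size(rtable{bin}, 1)
        cx = round(xS(k) + rtable{bin}(r, 1));
        cy = round(yS(k) + rtable{bin}(r, 2));
        if cx >= 1 && cx <= size(acc, 2) && cy >= 1 && cy <= size(acc, 1)
            acc(cy, cx) = acc(cy, cx) + 1;     % cast a vote
        end
    end
end

% The accumulator cell with the most votes is the potential centre point.
[~, idx] = max(acc(:));
[cyBest, cxBest] = ind2sub(size(acc), idx);
fprintf('Most likely centre at (%d, %d)\n', cxBest, cyBest);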

Team members


Vanessa Di Vora


User interface

Intensity-based object recognition

Irina Dobrianski

Image acquisition

Feature-based object recognition



Dipl.-Ing. Dr. techn. Pierre Elbischger

Project supervisor

Image processing
