One PhD position is open in the lab of Professor Louis Collins at the Montreal Neurological Institute and Hospital, McGill University, in the Department of Biological and Biomedical Engineering or the Integrated Program in Neuroscience, to develop software tools for image-guided neurosurgery of the brain.

Start date: Fall 2019 (rolling)

Stipend: $25,000/year

Graduate programs: Biological and Biomedical Engineering or Integrated Program in Neuroscience

Description: For patients with brain tumours, the extent of surgical resection has been demonstrated to be a significant independent factor for good prognosis. Unfortunately, studies have found residual tumour in up to 82% of cases, reducing survival. The candidate will develop tools to integrate data from multiple sensors and multiple algorithms to improve patient-to-image alignment and tissue identification, enabling accurate and robust guidance throughout surgery and thus more complete resections. Projects include research on multi-modal image registration and pathology segmentation, tumour identification from MR spectroscopy, augmented reality visualization in surgery, and clinical evaluation.

Requirements: The successful candidate will work with a team of engineers, computer scientists and clinicians in an open-software environment, integrating new tools into our publicly available IBIS neuronavigation software platform (http://ibisneuronav.org). Candidates should have a master's degree in computer science, math, physics, engineering or neuroscience, with strong analytical and programming skills (C, C++, Python), the ability to work independently, and good communication skills. Experience with ITK and/or 3D Slicer is a plus.

Context: This position fits within an NSERC- and CIHR-funded multi-site research project to improve precision in image-guided neurosurgery through an open-source collaborative software environment.

Our group has explored multiple intra-operative ultrasound (iUS) solutions (vessel-based, tissue-based, and edge-based iUS-MRI registration) to estimate the non-linear transformation needed to map the pre-operative images to the deformed brain during surgery. However, all solutions, ours and others', have limitations in terms of accuracy, cost, anatomical coverage, computational expense and robustness. To ensure both acceptance by surgeons and translation to industry, system robustness is as important as accuracy. A technique is needed that guarantees that the registration error remains minimal and the transformation never diverges, so that guidance stays accurate throughout surgery. To address this issue, we propose a novel method to fuse information from multiple sensors and registration techniques to robustly estimate an accurate non-linear deformation between patient and images. Each sensor (e.g., intraoperative B-mode and Doppler ultrasound, microscope images, manually identified landmarks and pre-operative anatomical models) provides partial and overlapping registration information. These will be combined in a multi-channel Kalman filter setting, which has the advantage of integrating confidence estimates and expertly labeled landmarks with image-based registration in a principled manner.
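The core idea of confidence-weighted fusion can be illustrated with a minimal sketch. This is not the project's implementation; it reduces the problem to a single scalar registration parameter and two hypothetical sensors (the variances and measurement values below are made up for illustration), showing how a Kalman update weights each measurement by its confidence:

```python
def kalman_update(x, P, z, R):
    """Fuse one scalar measurement z (variance R) into estimate x (variance P)."""
    K = P / (P + R)      # Kalman gain: higher when the measurement is more confident
    x = x + K * (z - x)  # corrected estimate, pulled toward the measurement
    P = (1 - K) * P      # uncertainty shrinks after every fusion step
    return x, P

# Prior estimate of one displacement component (mm) with large uncertainty
x, P = 0.0, 4.0

# Sequentially fuse measurements from two hypothetical sensors:
# an image-based registration (moderate confidence) and a
# manually identified landmark (high confidence, small variance)
for z, R in [(1.2, 1.0), (0.8, 0.25)]:
    x, P = kalman_update(x, P, z, R)

print(x, P)  # fused estimate sits nearest the most confident measurement
```

Because each sensor contributes through its own variance, an unreliable channel cannot pull the estimate far off, which is the robustness property the paragraph above argues for.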

The multi-channel Kalman filter framework will enable us to integrate new registration methods as they are developed. Not only will this framework improve registration accuracy, it will make registrations more robust and thus facilitate clinical and commercial use. The software will be developed in the 3D Slicer open-source environment, leveraging the large 3D Slicer user base to reduce development costs while improving software robustness through testing with more users than possible at our 3 collaborating sites. In line with the MNI's Open Science Policy, our open-source platform will be usable by many academic groups, hospitals and industry partners, so that future Tri-council funding in image-guided surgery is more effective and translation from laboratory to the operating room is streamlined, thus benefiting patients sooner.

Follow the instructions here to apply, quoting JobID Job2270.