Show simple item record

dc.contributor.author: Riasatian, Abtin
dc.date.accessioned: 2020-09-11 19:19:27 (GMT)
dc.date.available: 2020-09-11 19:19:27 (GMT)
dc.date.issued: 2020-09-11
dc.date.submitted: 2020-09-03
dc.identifier.uri: http://hdl.handle.net/10012/16290
dc.description.abstract: With the recent progress in deep learning, a common approach to representing images is extracting deep features. A primitive way to do this is to use off-the-shelf models. However, these features can be improved through fine-tuning, or even by training a network from scratch with domain-specific images. This desirable task is hindered by the lack of annotated or labeled images in the field of histopathology. In this thesis, a new network, namely KimiaNet, is proposed; it uses an existing dense topology but is tailored for generating informative and discriminative deep features from histopathology images for image representation. The model is based on the existing DenseNet-121 architecture but is trained with more than 240,000 image patches of 1000 × 1000 pixels acquired at 20× magnification. Considering the high cost of histopathology image annotation, which makes manual labeling impractical at a large scale, a high-cellularity mosaic approach is suggested that can serve as a weak or soft labeling method. The patches used for training KimiaNet are extracted from 7,126 whole slide images of formalin-fixed paraffin-embedded (FFPE) biopsy samples, spanning 30 cancer sub-types and publicly available through The Cancer Genome Atlas (TCGA) repository. The quality of the features generated by KimiaNet is tested via two types of image search: (i) given a query slide, searching among all slides and finding the ones with a tissue type similar to the query's, and (ii) searching among slides within the query slide's tumor type and finding slides with the same cancer sub-type as the query slide's. Compared to the pre-trained DenseNet-121 and its fine-tuned versions, KimiaNet predominantly achieved the best results for both search modes. To gain an intuition of how training from scratch affects the expressiveness of the deep features, the deep features of randomly selected patches from each cancer sub-type are extracted using both KimiaNet and the pre-trained DenseNet-121 and visualized after reducing their dimensionality with t-distributed Stochastic Neighbor Embedding (t-SNE). This visualization shows that for KimiaNet the instances of each class can easily be distinguished from the others, while for the pre-trained DenseNet the instances of almost all classes are mixed together. This comparison is further verification of how discriminative the features become when training uses domain-specific images. In addition, four simpler networks made up of repeated convolutional, batch-normalization and Rectified Linear Unit (ReLU) layers (CBR networks) are implemented and compared against KimiaNet to check whether the network design could be further simplified. The experiments demonstrate that KimiaNet features are far better than those of the CBR networks, which validates DenseNet-121 as a good choice for KimiaNet's architecture.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.title: KimiaNet: Training a Deep Network for Histopathology using High-Cellularity
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: Systems Design Engineering
uws-etd.degree.discipline: Systems Design Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Applied Science
uws.contributor.advisor: Tizhoosh, Hamid
uws.contributor.affiliation1: Faculty of Engineering
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
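
The abstract above describes extracting deep features with a DenseNet-121 backbone and inspecting them in two dimensions with t-SNE. The following is a minimal, illustrative sketch of that general workflow, not the thesis code: it assumes PyTorch/torchvision, scikit-learn and matplotlib, uses ImageNet-pretrained DenseNet-121 weights rather than the KimiaNet weights, and the patch loading, input resizing, and t-SNE settings are all assumptions.

```python
# Illustrative sketch, not the thesis implementation: extract deep features
# from histopathology patches with an ImageNet-pretrained DenseNet-121 and
# visualize them in 2-D with t-SNE.
import torch
from torchvision import models, transforms
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# ImageNet-pretrained backbone; KimiaNet's own TCGA-trained weights are not used here.
backbone = models.densenet121(weights=models.DenseNet121_Weights.IMAGENET1K_V1)
backbone.classifier = torch.nn.Identity()  # expose the 1024-d pooled features
backbone.eval()

# Assumption: 1000 x 1000 patches are downscaled to the backbone's usual input size.
preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(patches):
    """patches: list of PIL images; returns an (N, 1024) NumPy feature matrix."""
    batch = torch.stack([preprocess(p) for p in patches])
    return backbone(batch).cpu().numpy()

def plot_tsne(features, labels):
    """Reduce features to 2-D with t-SNE and colour points by cancer sub-type label."""
    embedded = TSNE(n_components=2, init="pca", perplexity=30).fit_transform(features)
    plt.scatter(embedded[:, 0], embedded[:, 1], c=labels, s=5, cmap="tab20")
    plt.title("t-SNE of deep features")
    plt.show()
```

Swapping the KimiaNet weights into the same pipeline, in place of the ImageNet ones, corresponds to the comparison the abstract describes between KimiaNet features and pre-trained DenseNet-121 features.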
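The CBR baselines mentioned in the abstract are built from repeated convolution, batch-normalization and ReLU blocks. Below is a hedged sketch of what such a network could look like; the block widths, strides, and depth are assumptions rather than the thesis configuration, and only the 30-class output matches the 30 cancer sub-types named in the abstract.

```python
import torch.nn as nn

def cbr_block(in_ch, out_ch):
    # One Conv -> BatchNorm -> ReLU unit; stride 2 halves the spatial resolution.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class CBRNet(nn.Module):
    """Hypothetical CBR baseline: stacked CBR blocks, global pooling, linear classifier."""
    def __init__(self, widths=(64, 128, 256, 512), num_classes=30):
        super().__init__()
        channels = [3, *widths]
        self.features = nn.Sequential(
            *[cbr_block(c_in, c_out) for c_in, c_out in zip(channels, channels[1:])]
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(widths[-1], num_classes)

    def forward(self, x):
        x = self.pool(self.features(x)).flatten(1)
        return self.classifier(x)
```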

