Please use this identifier to cite or link to this item: http://hdl.handle.net/10263/7507
Full metadata record
DC Field | Value | Language
dc.contributor.author | Dey, Pratap | -
dc.date.accessioned | 2025-02-05T11:12:18Z | -
dc.date.available | 2025-02-05T11:12:18Z | -
dc.date.issued | 2024-06 | -
dc.identifier.citation | 39p. | en_US
dc.identifier.uri | http://hdl.handle.net/10263/7507 | -
dc.description | Dissertation under the supervision of Dr. Malay Bhattacharyya | en_US
dc.description.abstract | Graph neural networks (GNNs) have become essential tools for graph representation learning, with models such as Graph Convolutional Networks (GCNs), GraphSAGE, and Graph Attention Networks (GATs) achieving notable success in various applications. HSGATv2, a recent advancement, enhances attention mechanisms for nodes that share a class label. However, traditional GNN weight-assignment methods, which often depend on node degrees or pair-wise representations, are less effective in heterophilic networks, in which the labels or properties of connected nodes differ. It has been shown that most existing models are tailored to homophilic graphs and generalize poorly to heterophilic settings, where multi-layer perceptrons and other models that ignore the graph structure sometimes outperform them. This dissertation examines the effectiveness of GNNs for node classification in heterophilic or low-homophily environments, where many common GNNs perform poorly, and introduces a representation learning methodology suitable for both homophilic and heterophilic graphs. By thoroughly examining local structure and heterophily distributions, the proposed approach handles networks with diverse homophily ratios. Additionally, we propose a regularized optimization function to improve model adaptability to any graph structure. Evaluations on various node classification datasets demonstrate that the proposed method is competitive with standard baseline models and generalizes promisingly. | en_US
dc.language.iso | en | en_US
dc.publisher | Indian Statistical Institute, Kolkata | en_US
dc.relation.ispartofseries | MTech(CS) Dissertation;22-21 | -
dc.subject | Graph neural networks (GNNs) | en_US
dc.subject | Graph Convolutional Networks (GCNs) | en_US
dc.subject | Graph Attention Networks (GATs) | en_US
dc.subject | GraphSAGE | en_US
dc.title | Expressive Power of Message Passing in Graph Neural Networks | en_US
dc.type | Other | en_US
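
Note: the following is an illustrative sketch, not material from the dissertation itself. It shows the edge homophily ratio, a measure commonly used in the heterophily-GNN literature to make the homophilic/heterophilic distinction in the abstract concrete: the fraction of edges whose endpoints share a class label (close to 1 for homophilic graphs, close to 0 for heterophilic ones). The function name and example data below are hypothetical.

    def edge_homophily(edges, labels):
        # Fraction of edges whose endpoints share a class label.
        # edges: iterable of (u, v) node pairs; labels: sequence or dict mapping node -> class.
        edges = list(edges)
        if not edges:
            return 0.0
        same = sum(1 for u, v in edges if labels[u] == labels[v])
        return same / len(edges)

    # Hypothetical 4-node cycle with two classes:
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    labels = [0, 0, 1, 1]
    print(edge_homophily(edges, labels))  # 0.5 -> neither strongly homophilic nor heterophilic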
Appears in Collections: Dissertations - M Tech (CS)

Files in This Item:
File | Description | Size | Format
Pratap_Dey-cs2221.pdf | Dissertations - M Tech (CS) | 2.14 MB | Adobe PDF

