Author: Giri, Bibek
Date issued: 2025-07-23
Date accessioned/available: 2026-01-02
Extent: 27 p.
URI: http://hdl.handle.net/10263/7634
Note: Dissertation under the supervision of Prof. Subhankar Mishra & Prof. Debrup Chakraborty

Abstract: Graph Neural Networks (GNNs) have demonstrated impressive performance across a range of graph-based learning tasks. However, their application to domains with sensitive relational data raises serious privacy concerns, as the graph structure itself may leak confidential information. This thesis investigates a decentralized framework for enforcing edge-level local differential privacy (LDP) in graph-structured data. We introduce two mechanisms that perturb a node's neighborhood in a privacy-preserving yet utility-aware manner. The first approach replaces randomly selected neighbors with feature-similar nodes from the 2-hop neighborhood, ensuring structural realism while preserving degree. The second approach eliminates the need for explicit 2-hop propagation and dummy vectors, instead relying on randomized feature queries to identify plausible substitutes. Both approaches are evaluated on benchmark graph datasets such as Cora, PubMed, and LastFM using GNN architectures like GCN, GraphSAGE, and GAT. Experimental results show that our methods achieve a favorable trade-off between structure privacy and learning utility, while avoiding the overhead and privacy leakage risks of centralized or semi-local protocols.

Language: en
Keywords: Differential Privacy, Local Differential Privacy, Graph Neural Networks, Privacy-Utility Trade-off
Title: Structural Differential Privacy in Graph Neural Networks
Type: Thesis
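For context on the edge-level LDP setting the abstract describes, the standard decentralized baseline is randomized response applied to each node's adjacency bits: every node perturbs and reports its own neighborhood, so no party ever sees the true edge set. The sketch below illustrates that baseline only — it is not the thesis's neighbor-replacement mechanism, and the function name is illustrative.

```python
import numpy as np


def randomized_response_adjacency(adj_row, epsilon, rng=None):
    """Perturb one node's adjacency bit-vector under epsilon-edge-LDP.

    Each bit is kept with probability e^eps / (1 + e^eps) and flipped
    otherwise (Warner's randomized response). Any two adjacency rows
    differing in a single edge then yield outputs whose likelihood
    ratio is at most e^eps, which is the edge-LDP guarantee.
    """
    rng = np.random.default_rng() if rng is None else rng
    p_keep = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    flip = rng.random(adj_row.shape) >= p_keep  # True where the bit flips
    return np.where(flip, 1 - adj_row, adj_row)


# Example: a node with 5 potential neighbors reports a privatized row.
row = np.array([0, 1, 1, 0, 0], dtype=np.int8)
noisy = randomized_response_adjacency(row, epsilon=2.0)
```

Note the trade-off this baseline exposes: flipping bits independently distorts node degrees and injects implausible edges, which is exactly the utility loss the abstract's degree-preserving, feature-aware substitution mechanisms aim to avoid.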