    Make LLMs Private
    (Indian Statistical Institute, Kolkata, 2024-07) Pradhan, Sayan
Large Language Models (LLMs) are machine learning models trained on large text corpora, with the capability to understand and generate human language. Cryptographic techniques can protect both the input and output privacy of their users. Secure Multi-Party Computation (MPC) is a cryptographic technique that allows multiple parties to jointly compute a function on their secret inputs without revealing any information about those inputs. One application of MPC lies in the domain of privacy-preserving machine learning (PPML). In this project, “Make LLMs Private”, our goal is to choose a large language model and use MPC to make it private. We choose BERT, a large language model, and use different MPC protocols to construct an efficient combined MPC framework for evaluating it. Our focus is only on BERT inference. To achieve this, we require MPC protocols for multiplication, comparison, truncation, softmax, division, and normalization. ABY3 [10] and Falcon [17] provide three-party MPC protocols for multiplication, division, and comparison. Protocols based on function secret sharing have smaller communication costs but larger computation costs: Sigma [6] introduces two-party MPC protocols for multiplication, comparison, exponentiation, truncation, and maximum using function secret sharing. In a three-party setup, MPC becomes significantly faster, enabling practical deployment even for larger machine learning models. Computing the exponential in MPC is very expensive. SecureML [12] replaces the exponential with ReLU in an efficient way, while Puma [5] and Bolt [13] approximate the exponential by polynomials. Bicoptor [19] introduces two-round MPC protocols for multiplication, comparison, and maximum with no pre-processing. We use these techniques to protect the BERT model.
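As a concrete illustration of the secret-sharing substrate that protocols like ABY3 and Falcon build on, here is a minimal plaintext sketch of additive secret sharing over a prime field. The modulus, the three-party count, and the function names are illustrative assumptions, not details taken from the thesis; real protocols also need interaction for multiplication, which this sketch omits.

```python
import random

PRIME = 2**61 - 1  # illustrative field modulus (a Mersenne prime)

def share(secret, n=3):
    """Split `secret` into n additive shares mod PRIME.
    Any n-1 shares look uniformly random and reveal nothing."""
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod PRIME."""
    return sum(shares) % PRIME

def add_shares(x_shares, y_shares):
    """Secure addition: each party adds its local shares.
    No communication between parties is needed."""
    return [(x + y) % PRIME for x, y in zip(x_shares, y_shares)]
```

Addition is free in this scheme; it is the non-linear operations (multiplication, comparison, truncation, softmax) that require the dedicated protocols surveyed above.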
We integrate various basic MPC protocols to construct MPC protocols for complex functions, such as softmax, optimizing both the number of rounds and the communication cost. We propose combined three-party MPC frameworks tailored to different scenarios in the BERT model.
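The ReLU-based replacement for the exponential mentioned above (the SecureML [12] idea) can be sketched in plain Python. The function name and the uniform fallback for all-non-positive inputs are our own illustrative choices; an MPC deployment would evaluate the same arithmetic on secret shares rather than on clear values:

```python
def relu_softmax(xs):
    """Approximate softmax by replacing exp(x_i) with relu(x_i),
    then normalizing. This avoids evaluating the exponential,
    which is very expensive inside MPC."""
    relus = [max(x, 0.0) for x in xs]
    total = sum(relus)
    if total == 0.0:
        # degenerate case: all inputs non-positive, fall back
        # to the uniform distribution (illustrative choice)
        return [1.0 / len(xs)] * len(xs)
    return [r / total for r in relus]
```

The remaining cost is one secure division per output, for which the division protocols listed in the abstract are used.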
