arxiv:2505.13362

Dynamic Probabilistic Noise Injection for Membership Inference Defense

Published on Feb 22

Abstract

DynaNoise is an adaptive inference-time defense that dynamically adjusts noise injection according to query sensitivity, improving the privacy-utility trade-off in machine learning models.

AI-generated summary

Membership Inference Attacks (MIAs) expose privacy risks by determining whether a specific sample was part of a model's training set. These threats are especially serious in sensitive domains such as healthcare and finance. Traditional mitigation techniques, such as static differential privacy, rely on injecting a fixed amount of noise during training or inference. However, this often leads to a detrimental trade-off: the noise may be insufficient to counter sophisticated attacks or, when increased, can substantially degrade model accuracy. To address this limitation, we propose DynaNoise, an adaptive inference-time defense that modulates injected noise based on per-query sensitivity. DynaNoise estimates risk using measures such as Shannon entropy and scales the noise variance accordingly, followed by a smoothing step that re-normalizes the perturbed outputs to preserve predictive utility. We further introduce MIDPUT (Membership Inference Defense Privacy-Utility Trade-off), a scalar metric that captures both privacy gains and accuracy retention. Our evaluation on several benchmark datasets demonstrates that DynaNoise substantially lowers attack success rates while maintaining competitive accuracy, achieving strong overall MIDPUT scores compared to state-of-the-art defenses.
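The summary describes the mechanism at a high level but gives no formulas, so the following is a minimal illustrative sketch only: it assumes Gaussian noise added to output logits, Shannon entropy normalized by its maximum as the risk signal, and a softmax re-normalization as the smoothing step. All function names and the exact scaling rule are our assumptions, not the paper's.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logit vector.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def shannon_entropy(probs, eps=1e-12):
    # Shannon entropy in nats; eps guards against log(0).
    return float(-np.sum(probs * np.log(probs + eps)))

def dynanoise(logits, base_sigma=0.1, scale=1.0, rng=None):
    """Hypothetical sketch of entropy-modulated noise injection.

    A confident (low-entropy) prediction is treated as higher
    membership risk and receives a larger noise variance; the
    perturbed logits are then re-normalized to a distribution.
    """
    rng = rng if rng is not None else np.random.default_rng()
    probs = softmax(logits)
    h = shannon_entropy(probs)
    max_h = np.log(len(probs))          # entropy of the uniform distribution
    risk = 1.0 - h / max_h              # in [0, 1]; 1 = most confident
    sigma = base_sigma * (1.0 + scale * risk)
    noisy = logits + rng.normal(0.0, sigma, size=logits.shape)
    return softmax(noisy)               # smoothing / re-normalization step
```

For example, `dynanoise(np.array([4.0, 0.5, 0.1]))` perturbs a confident prediction with more noise than `dynanoise(np.array([1.0, 0.9, 0.8]))` would, while both return valid probability distributions.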

