Local Differential Privacy for Person-to-Person Interactions
IEEE Open Journal of the Computer Society, Pub Date: 2022-12-14, DOI: 10.1109/ojcs.2022.3228999
Yuichi Sei, Akihiko Ohsuga

Currently, many global organizations collect personal data for marketing, recommendation system improvement, and other purposes. Some organizations collect personal data securely based on a technique known as $\epsilon$-local differential privacy (LDP). Under LDP, a privacy budget is allocated to each user in advance. Each time the user's data are collected, part of the user's privacy budget is consumed, and their privacy is protected by ensuring that the remaining privacy budget stays greater than or equal to zero. Existing research and organizations assume that each individual's data are completely unrelated to other individuals' data. However, this assumption does not hold when interaction data between users are collected. In this case, each user's privacy is not sufficiently protected because the privacy budget is actually overspent. In this study, the issue of local differential privacy for person-to-person interactions is clarified, and we propose a mechanism that satisfies LDP in a person-to-person interaction scenario. Mathematical analysis and experimental results show that, compared with existing methods, the proposed mechanism maintains high data utility while ensuring LDP.
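The budget-accounting issue described in the abstract can be illustrated with a minimal sketch (this is not the authors' proposed mechanism): a standard randomized-response primitive that satisfies $\epsilon$-LDP for a single bit, combined with a naive per-user budget ledger. Because an interaction record involves two people, a correct accounting must charge both participants' budgets; charging only the reporting user leads to the overspending the paper points out. The names `BudgetLedger` and `report_interaction` are illustrative assumptions, not part of the paper.

```python
# Minimal sketch, assuming Warner's randomized response as the epsilon-LDP
# primitive and a toy per-user budget ledger. Not the paper's mechanism.
import math
import random


def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1), else flip it.
    This satisfies epsilon-LDP for a single binary value."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else not true_bit


class BudgetLedger:
    """Tracks each user's remaining privacy budget."""

    def __init__(self, total_budget: float):
        self.total = total_budget
        self.remaining: dict[str, float] = {}

    def spend(self, user: str, epsilon: float) -> None:
        left = self.remaining.setdefault(user, self.total)
        if left - epsilon < 0:
            raise RuntimeError(f"privacy budget of {user} would be overspent")
        self.remaining[user] = left - epsilon


def report_interaction(ledger: BudgetLedger, user_a: str, user_b: str,
                       interacted: bool, epsilon: float) -> bool:
    """An interaction record (a, b) reveals information about both endpoints,
    so both users' budgets must be charged -- charging only the reporter is
    the kind of hidden overspending the paper addresses."""
    ledger.spend(user_a, epsilon)
    ledger.spend(user_b, epsilon)
    return randomized_response(interacted, epsilon)


if __name__ == "__main__":
    ledger = BudgetLedger(total_budget=1.0)
    noisy = report_interaction(ledger, "alice", "bob", True, epsilon=0.5)
    print(noisy, ledger.remaining)  # both alice and bob have 0.5 budget left
```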

Updated: 2022-12-14