METHOD AND APPARATUS FOR UTILITY-AWARE PRIVACY PRESERVING MAPPING AGAINST INFERENCE ATTACKS
First Claim
1. A method for processing user data for a user, comprising:
accessing the user data, which includes private data and public data, the private data corresponding to a first category of data, and the public data corresponding to a second category of data;
decoupling dependencies between the first category of data and the second category of data, from dependencies between the second category of data and released data;
determining a privacy preserving mapping that maps the second category of data to the released data responsive to the dependencies between the second category of data and the released data;
modifying the public data for the user based on the privacy preserving mapping; and
releasing the modified data to at least one of a service provider and a data collecting agency.
Abstract
The present principles address the privacy-utility tradeoff encountered by a user who wishes to release some public data (denoted by X), which is correlated with his private data (denoted by S), to an analyst, in the hope of getting some utility. The public data is distorted before its release according to a probabilistic privacy preserving mapping mechanism, which limits information leakage under utility constraints. In particular, this probabilistic privacy mechanism is modeled as a conditional distribution, P_(Y|X), where Y is the data actually released to the analyst. The present principles design utility-aware privacy preserving mapping mechanisms against inference attacks when only partial, or no, statistical knowledge of the prior distribution, P_(S,X), is available. Specifically, using maximal correlation techniques, the present principles provide a separability result on the information leakage that leads to the design of the privacy preserving mapping.
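The probabilistic mechanism described in the abstract can be sketched in code as follows. This is a minimal illustration only: the alphabet sizes, the matrix values, and the function name `release` are made-up assumptions, not taken from the patent.

```python
import numpy as np

# Illustrative sketch (not the patent's implementation): the public data X
# is distorted into the released data Y by sampling from a conditional
# distribution P(Y|X), represented here as a row-stochastic matrix.
rng = np.random.default_rng(0)

# Hypothetical alphabets: 3 public-data symbols, 2 released-data symbols.
P_Y_given_X = np.array([
    [0.9, 0.1],   # P(Y | X = 0)
    [0.2, 0.8],   # P(Y | X = 1)
    [0.5, 0.5],   # P(Y | X = 2)
])
assert np.allclose(P_Y_given_X.sum(axis=1), 1.0)  # each row is a distribution

def release(x: int) -> int:
    """Sample the released symbol Y from P(Y | X = x)."""
    return int(rng.choice(P_Y_given_X.shape[1], p=P_Y_given_X[x]))

# Distort a short sequence of public-data symbols before release.
ys = [release(x) for x in [0, 1, 2, 0, 1]]
```

The row-stochastic matrix is exactly the conditional distribution P_(Y|X) named in the abstract; choosing its entries to limit leakage under utility constraints is the optimization the patent describes.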
37 Citations
21 Claims
1. A method for processing user data for a user, comprising:
accessing the user data, which includes private data and public data, the private data corresponding to a first category of data, and the public data corresponding to a second category of data;
decoupling dependencies between the first category of data and the second category of data, from dependencies between the second category of data and released data;
determining a privacy preserving mapping that maps the second category of data to the released data responsive to the dependencies between the second category of data and the released data;
modifying the public data for the user based on the privacy preserving mapping; and
releasing the modified data to at least one of a service provider and a data collecting agency.
Dependent claims: 2, 3, 4, 5, 6, 7, 8, 9, 10.
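A brief numeric illustration of the decoupling step in claim 1, under the Markov chain S → X → Y that the claim implies (the private data S influences the released data Y only through the public data X): the leakage I(S; Y) then depends on the prior P(S, X) and the mapping P(Y|X) separately, through their product. All distributions below are made-up examples, not values from the patent.

```python
import numpy as np

# Hypothetical joint distribution P(S, X) over binary private/public data.
P_SX = np.array([
    [0.30, 0.10],
    [0.10, 0.50],
])
# Hypothetical privacy preserving mapping P(Y | X), rows indexed by X.
P_Y_given_X = np.array([
    [0.8, 0.2],
    [0.3, 0.7],
])

# Markov chain S -> X -> Y: P(S, Y) = sum_x P(S, x) P(Y | x),
# i.e. a matrix product of the two decoupled factors.
P_SY = P_SX @ P_Y_given_X
P_S = P_SY.sum(axis=1, keepdims=True)
P_Y = P_SY.sum(axis=0, keepdims=True)

# Mutual information I(S; Y) in bits -- the leakage the mapping limits.
nz = P_SY > 0
leakage = float((P_SY[nz] * np.log2(P_SY[nz] / (P_S @ P_Y)[nz])).sum())
```

Designing P_Y_given_X to keep `leakage` small while keeping Y useful is the utility-aware optimization the claim's "determining" step refers to.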
11. An apparatus for processing user data for a user, comprising:
a processor configured to access the user data, which includes private data and public data, the private data corresponding to a first category of data, and the public data corresponding to a second category of data;
a privacy preserving mapping decision module coupled to the processor and configured to decouple dependencies between the first category of data and the second category of data, from dependencies between the second category of data and released data, and determine a privacy preserving mapping that maps the second category of data to the released data responsive to the dependencies between the second category of data and released data; and
a privacy preserving module configured to modify the public data for the user based on the privacy preserving mapping, and release the modified data to at least one of a service provider and a data collecting agency.
Dependent claims: 12, 13, 14, 15, 16, 17, 18, 19, 20.
21. (canceled)
Specification