enow.com Web Search

Search results

  1. k-anonymity - Wikipedia

    en.wikipedia.org/wiki/K-anonymity

    To use k-anonymity to process a dataset so that it can be released with privacy protection, a data scientist must first examine the dataset and decide whether each attribute (column) is an identifier (identifying), a non-identifier (not-identifying), or a quasi-identifier (somewhat identifying). (A minimal sketch of this classification step, with a k-anonymity check, appears after the result list.)

  2. Data re-identification - Wikipedia

    en.wikipedia.org/wiki/Data_re-identification

    Multiple U.S. federal agencies and departments covered by the Protection of Human Subjects regulations (the 'Common Rule'), including the U.S. Department of Health and Human Services, warn that re-identification is becoming gradually easier because of "big data": the abundance and constant collection and analysis of information, along with the evolution ...

  3. Datafly algorithm - Wikipedia

    en.wikipedia.org/wiki/Datafly_algorithm

    The Datafly algorithm is an algorithm for providing anonymity in medical data. It was developed by Latanya Arvette Sweeney in 1997–98. [1] [2] Anonymization is achieved by automatically generalizing, substituting, inserting, and removing information as appropriate without losing many of the details found within the data. (A simplified generalize-and-suppress sketch appears after the result list.)

  4. Data anonymization - Wikipedia

    en.wikipedia.org/wiki/Data_anonymization

    According to the EDPS and AEPD, no one, including the data controller, should be able to re-identify data subjects in a properly anonymized dataset. [8] Research by data scientists at Imperial College London and UCLouvain in Belgium, [9] as well as a ruling by Judge Michal Agmon-Gonen of the Tel Aviv District Court, [10] highlights the ...

  5. Latanya Sweeney - Wikipedia

    en.wikipedia.org/wiki/Latanya_Sweeney

    Medical dataset de-anonymization: In 1998 Sweeney published a now-famous example of data de-anonymization, demonstrating that a medical dataset in the public domain could be used to identify individuals, despite the removal of all explicit identifiers, when it was combined with a public voter list.

  6. Spatial cloaking - Wikipedia

    en.wikipedia.org/wiki/Spatial_cloaking

    Spatial cloaking is a privacy mechanism that is used to satisfy specific privacy requirements by blurring users’ exact locations into cloaked regions. [1] [2] This technique is usually integrated into applications in various environments to minimize the disclosure of private information when users request location-based services. (A toy grid-based cloaking sketch appears after the result list.)

  7. l-diversity - Wikipedia

    en.wikipedia.org/wiki/L-diversity

    The l-diversity model addresses some of the weaknesses of the k-anonymity model: protecting identities to the level of k individuals is not equivalent to protecting the corresponding sensitive values that were generalized or suppressed, especially when the sensitive values within a group are homogeneous. (A minimal distinct-l-diversity check appears after the result list.)

  8. Pseudonymization - Wikipedia

    en.wikipedia.org/wiki/Pseudonymization

    One example of applying a pseudonymization procedure is the creation of datasets for de-identification research by replacing identifying words with words from the same category (e.g. replacing a name with a random name from a names dictionary); [11] [12] [13] in this case, however, it is generally not possible to track the data back to its origins. (A toy name-substitution sketch appears after the result list.)
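
Code sketches for some of the techniques above

The k-anonymity result describes classifying each attribute as an identifier, a non-identifier, or a quasi-identifier before release. The following minimal Python sketch (the records, column names, and the helper is_k_anonymous are hypothetical, not taken from the article) shows that classification driving a simple check that every combination of quasi-identifier values occurs at least k times:

```python
from collections import Counter

# Hypothetical records; column roles assigned by hand, as the snippet describes:
# "name" is an identifier (dropped before release), "zip" and "age" are
# quasi-identifiers, and "diagnosis" is a sensitive, non-identifying attribute.
records = [
    {"name": "Alice", "zip": "02138", "age": 29, "diagnosis": "flu"},
    {"name": "Bob",   "zip": "02138", "age": 29, "diagnosis": "asthma"},
    {"name": "Carol", "zip": "02139", "age": 31, "diagnosis": "flu"},
]
quasi_identifiers = ["zip", "age"]

def is_k_anonymous(rows, qi_cols, k):
    """True if every combination of quasi-identifier values occurs at least k times."""
    groups = Counter(tuple(row[c] for c in qi_cols) for row in rows)
    return all(count >= k for count in groups.values())

# Drop the identifier column before release, then check k-anonymity.
released = [{c: v for c, v in r.items() if c != "name"} for r in records]
print(is_k_anonymous(released, quasi_identifiers, k=2))  # False: ("02139", 31) occurs once
```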
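
The Datafly result describes anonymizing by generalizing and suppressing values. Below is a simplified sketch in that spirit rather than Sweeney's actual algorithm: it repeatedly generalizes the quasi-identifier with the most distinct values (here by masking trailing characters, a stand-in for a real generalization hierarchy) and finally suppresses rows left in groups smaller than k. All names and data are made up.

```python
from collections import Counter

def mask_last_char(value):
    """Hypothetical one-step generalization: mask one more trailing character with '*'."""
    s = str(value)
    kept = s.rstrip("*")
    return kept[:-1] + "*" * (len(s) - len(kept) + 1) if kept else s

def datafly_like(rows, qi_cols, k, max_steps=10):
    """Generalize, then suppress, until every quasi-identifier group has at least k rows."""
    rows = [dict(r) for r in rows]
    for _ in range(max_steps):
        groups = Counter(tuple(r[c] for c in qi_cols) for r in rows)
        if all(n >= k for n in groups.values()):
            return rows
        # Datafly-style heuristic: generalize the attribute with the most distinct values.
        col = max(qi_cols, key=lambda c: len({r[c] for r in rows}))
        for r in rows:
            r[col] = mask_last_char(r[col])
    # Suppress any rows that still sit in groups smaller than k.
    groups = Counter(tuple(r[c] for c in qi_cols) for r in rows)
    return [r for r in rows if groups[tuple(r[c] for c in qi_cols)] >= k]

rows = [{"zip": "02138", "age": "29"}, {"zip": "02139", "age": "31"},
        {"zip": "02141", "age": "34"}]
print(datafly_like(rows, ["zip", "age"], k=3))  # all rows end up in one group: zip '021**', age '**'
```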
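
For the spatial cloaking result, one very simple way to blur a location into a region is to report only the grid cell that contains the exact coordinates. This is a toy illustration of the general idea, not a mechanism from the article; the cell size and coordinates are invented.

```python
import math

def cloak(lat, lon, cell_deg=0.01):
    """Replace an exact position with the bounding box of its grid cell."""
    lat0 = math.floor(lat / cell_deg) * cell_deg
    lon0 = math.floor(lon / cell_deg) * cell_deg
    return (lat0, lon0, lat0 + cell_deg, lon0 + cell_deg)

# The location-based service only ever receives the cloaked region,
# roughly (42.37, -71.12, 42.38, -71.11) for this made-up point.
print(cloak(42.37312, -71.11098))
```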
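
The l-diversity result points at the homogeneity problem: a k-anonymous group whose sensitive values are all identical still discloses those values. The sketch below checks the simplest "distinct l-diversity" reading of the model; records and column names are again hypothetical.

```python
from collections import defaultdict

def is_distinct_l_diverse(rows, qi_cols, sensitive_col, l):
    """True if every quasi-identifier group holds at least l distinct sensitive values."""
    groups = defaultdict(set)
    for r in rows:
        groups[tuple(r[c] for c in qi_cols)].add(r[sensitive_col])
    return all(len(values) >= l for values in groups.values())

# A 2-anonymous group whose sensitive values are identical fails 2-diversity.
rows = [
    {"zip": "0213*", "age": "2*", "diagnosis": "flu"},
    {"zip": "0213*", "age": "2*", "diagnosis": "flu"},
]
print(is_distinct_l_diverse(rows, ["zip", "age"], "diagnosis", l=2))  # False
```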
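
The pseudonymization result describes swapping identifying words for random words of the same category. The toy sketch below does exactly that with an invented name pool; because the replacements are random and no mapping is kept, the output generally cannot be traced back to the original, which is the point the snippet makes.

```python
import random

NAME_POOL = ["Taylor", "Jordan", "Morgan", "Casey", "Riley"]  # hypothetical names dictionary

def pseudonymize(text, names_to_replace, rng=random):
    """Replace each identifying name with a random name from the same category."""
    out = text
    for name in names_to_replace:
        out = out.replace(name, rng.choice(NAME_POOL))
    return out  # no mapping is kept, so the substitution cannot be reversed from the data alone

print(pseudonymize("Alice met Bob at the clinic.", ["Alice", "Bob"]))
```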