
Privacy concerns with emerging technology and their redressal through PDP Bill

These newer concerns emerge against the backdrop of the right to informational privacy. The right to privacy was recognised as part of Article 21 of the Indian Constitution in K.S. Puttaswamy v. Union of India (2017). That judgment discussed the idea of autonomy within the right to informational privacy: such decisional autonomy refers to every person's right to control who may process or use her personal data.

The traditional means of exercising this control has been the notice-and-consent mechanism, under which individuals are informed of the terms on which their data will be processed and given the choice to accept or reject such processing. This model places the individual's choice at the forefront and treats the act of giving consent as an economic choice made by a rational consumer.

Seen through this lens, the behaviour of most consumers, who claim to value privacy yet routinely consent to their data being processed, makes little sense. This disparity between thought and action has been dubbed the ‘privacy paradox’.

In recent years, the ‘rational choice’ view has been challenged in light of how consent actually plays out in practice. For instance, digital operators reportedly use ‘dark patterns’ to prevent users from learning the full extent and terms on which their data is processed, or to make doing so inconvenient. These are cognitive manipulations that nudge users towards a preferred, pre-selected choice, or bury crucial terms of use behind complex website design.
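To make the ‘pre-selected choice’ concrete, the following is a minimal, purely illustrative sketch in TypeScript; the option labels and form structure are invented for this example and are not drawn from any particular operator. It shows how a form that pre-checks the most data-permissive option records consent through the user's mere inaction:

```typescript
// Hypothetical illustration of a "pre-selected choice" dark pattern.
// All option names and labels below are invented for this sketch.

interface ConsentOption {
  id: string;
  label: string;
  preChecked: boolean; // the state the user sees before touching anything
}

// Dark-pattern layout: the broadest data sharing is already ticked,
// while the privacy-preserving choice is unticked and harder to find.
const darkPatternForm: ConsentOption[] = [
  { id: "accept-all", label: "Accept all cookies and personalised ads", preChecked: true },
  { id: "essential-only", label: "Essential cookies only (see the full policy)", preChecked: false },
];

// Neutral layout: nothing is pre-selected, so consent requires an
// affirmative act rather than mere inaction.
const neutralForm: ConsentOption[] = darkPatternForm.map(o => ({ ...o, preChecked: false }));

// If the user clicks "Continue" without reading, whatever was
// pre-checked is recorded as consented to.
function consentRecordedOnSubmit(form: ConsentOption[]): string[] {
  return form.filter(o => o.preChecked).map(o => o.id);
}

console.log(consentRecordedOnSubmit(darkPatternForm)); // ["accept-all"]
console.log(consentRecordedOnSubmit(neutralForm));     // []
```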

On the other hand, having to give consent at almost every step of digital life has led to ‘consent fatigue’: users develop a tendency to accept privacy notices without reading them, owing to bulky, jargon-laden notices and the constant information overload that accompanies every choice made online. These factors explain the privacy paradox to an extent, and further highlight the inadequacy of ‘notice-and-consent’ as a mechanism for protecting informational privacy.

The issues raised above by the personal data processing practices of AI and IoT applications point to the inadequacy of consent mechanisms in ensuring informational self-determination and privacy. A better way to safeguard informational privacy and preserve control over data processing is to provide access to and control over one's data throughout the period of its processing, and not just at the point of collection.
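As a rough sketch of what control ‘throughout the period of processing’ could look like operationally, the interface below lists rights a user might exercise at any time after collection. The interface and method names are hypothetical illustrations, not the text of the PDP Bill or any existing system:

```typescript
// Hypothetical sketch: rights a data principal can exercise at any point
// during processing, not only at the consent screen. Names are illustrative
// and do not quote the PDP Bill or any specific platform's API.

interface DataPrincipalRights {
  // See what personal data is held and for which purposes it is processed.
  accessSummary(userId: string): Promise<Record<string, unknown>>;

  // Correct inaccurate or outdated personal data while processing is ongoing.
  requestCorrection(userId: string, field: string, value: unknown): Promise<void>;

  // Withdraw consent for a specific purpose mid-way through processing.
  withdrawConsent(userId: string, purpose: string): Promise<void>;

  // Ask for erasure once the purpose of processing is exhausted.
  requestErasure(userId: string): Promise<void>;
}
```

The point is not this particular set of methods, but that each right remains available for as long as the data is in use, rather than being exhausted at the moment of consent.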

Empowering users at an individual level, along with broader community-level norms, allows for a continuing exercise of the right to informational privacy. Further, such broad-based engagement with personal data at the individual and societal level, throughout the use of that data, addresses some of the concerns raised by AI and IoT applications.

To enable this continuing exercise of privacy and data protection norms at both the individual and the societal level, a regulatory framework that sets out individual rights and obligations, lays down measures and norms based on the functionality and features of data processing, and provides an enforcement mechanism with remedies and penalties would offer a more cogent approach to the data processing concerns of the 21st century.

Source: Bar and Bench
