The Algorithmic Accountability Paradox under India’s DPDP Act, 2023: Regulating Significant Data Fiduciaries

Introduction

The Digital Personal Data Protection Act, 2023 is India’s first comprehensive statute dedicated to digital personal data protection, enacted after years of deliberation following the landmark judgment in Justice K.S. Puttaswamy (Retd.) v. Union of India [1]. Within this legislative architecture lies a particularly intriguing provision that elevates certain data handlers to a status of heightened scrutiny and responsibility. These entities, designated as Significant Data Fiduciaries under Section 10 of the DPDP Act, sit at the intersection of technological power and legal accountability. The provision creates what can be termed an “algorithmic accountability paradox”: entities wielding immense data processing capabilities face obligations that demand transparency in systems whose very value proposition depends on proprietary algorithmic complexity. This article examines how the DPDP Act attempts to regulate these powerful actors, the legal framework governing their operations, and the inherent tensions that arise when law seeks to govern algorithmic opacity.

The Constitutional Foundation: Privacy as a Fundamental Right

Before examining the specific provisions governing Significant Data Fiduciaries, one must understand the constitutional bedrock upon which the DPDP Act rests. On August 24, 2017, a nine-judge bench of the Supreme Court of India unanimously recognized privacy as a fundamental right guaranteed under Article 21 of the Constitution [1]. The Court in Justice K.S. Puttaswamy (Retd.) v. Union of India established that privacy is intrinsic to the right to life and personal liberty. Justice D.Y. Chandrachud, writing the plurality opinion, articulated that privacy encompasses three essential elements: the right to make autonomous decisions regarding intimate personal choices, the right to control dissemination of personal information, and the expectation of privacy against state surveillance. This judgment fundamentally altered the trajectory of data protection discourse in India and necessitated the creation of statutory mechanisms to operationalize this constitutional guarantee.

The Puttaswamy judgment did not merely declare privacy a fundamental right; it established a three-pronged test for any law that seeks to restrict it. Any such restriction must satisfy legality, which requires the existence of a valid law; need, defined in terms of a legitimate state aim; and proportionality, which demands a rational nexus between the objects sought and the means adopted, with no less intrusive alternative reasonably available. This framework became the constitutional lodestar for the DPDP Act, compelling the legislature to balance individual privacy rights against legitimate interests of data processing entities and the state. The Act’s provisions concerning Significant Data Fiduciaries must therefore be understood not merely as regulatory requirements but as constitutional obligations flowing from the fundamental right to privacy.

Understanding Data Fiduciaries and the Concept of Significance

Defining Data Fiduciaries under DPDP Act

The DPDP Act introduces terminology that departs from the European General Data Protection Regulation’s framework while maintaining conceptual similarity. Under Section 2(i) of the Act, a “Data Fiduciary” is defined as any person who alone or in conjunction with other persons determines the purpose and means of processing of personal data. This definition closely mirrors the GDPR’s concept of a “data controller,” but the use of the term “fiduciary” is deliberate and significant. It invokes the legal concept of a fiduciary relationship, one characterized by trust, confidence, and the duty to act in the best interests of another party. By employing this terminology, the Act imposes not merely contractual obligations but a higher standard of care rooted in trust law principles.

The Designation of Significant Data Fiduciaries Under DPDP Act

Section 10 of the DPDP Act empowers the Central Government to notify any Data Fiduciary or class of Data Fiduciaries as Significant Data Fiduciaries based on an assessment of relevant factors [2]. The Act explicitly enumerates six criteria for such designation: the volume and sensitivity of personal data processed; risk to the rights of Data Principals; potential impact on the sovereignty and integrity of India; risk to electoral democracy; security of the State; and public order. This designation mechanism represents a risk-based approach to data protection regulation, recognizing that not all data processing activities pose equal threats to individual privacy or societal interests.

The discretionary nature of this designation power is both a strength and a potential vulnerability. On one hand, it allows regulatory flexibility to respond to evolving technological landscapes and emerging threats. The government could, for instance, designate a relatively smaller entity as a Significant Data Fiduciary if it processes highly sensitive biometric or genetic data, while exempting a larger entity engaged in less sensitive processing. This contextual approach prevents rigid thresholds that might become obsolete or inappropriate. On the other hand, the absence of quantifiable metrics or procedural safeguards in the designation process raises concerns about predictability, consistency, and potential for arbitrary exercise of power.

Enhanced Obligations of Significant Data Fiduciaries under the DPDP Act

Appointment of Data Protection Officer

Section 10(2)(a) of the DPDP Act mandates that Significant Data Fiduciaries appoint a Data Protection Officer who must be based in India and serve as an individual responsible to the Board of Directors or similar governing body [2]. The DPO must act as the point of contact for the grievance redressal mechanism under the Act. This requirement elevates data protection from a compliance function to a governance imperative, embedding privacy considerations at the highest level of organizational decision-making. The mandate that the DPO be based in India ensures regulatory accessibility and reflects the Act’s broader emphasis on territorial presence for accountability purposes.

The Digital Personal Data Protection Rules, 2025, notified in November 2025, provide additional clarity on the DPO’s role [3]. The Rules specify that the DPO must possess expertise in privacy law, data governance, and risk management. The officer serves as the primary interface between the organization, data principals, and the Data Protection Board of India. This positioning creates an inherent tension: the DPO must simultaneously serve organizational interests while acting as a champion for data principal rights and regulatory compliance. Navigating this dual mandate requires not merely technical competence but ethical judgment and institutional independence.

Independent Data Auditor and Annual Assessments

Section 10(2)(b) of the DPDP Act requires Significant Data Fiduciaries to appoint an independent data auditor to evaluate compliance with the Act’s provisions [2]. Coupled with this is the requirement under Section 10(2)(c) for periodic Data Protection Impact Assessments. Rule 12 of the DPDP Rules, 2025 operationalizes these provisions by mandating that DPIAs and audits be conducted once every twelve months from the date of notification as a Significant Data Fiduciary [3]. The auditor must furnish a report containing “significant observations” to the Data Protection Board of India, creating a mandatory disclosure mechanism that brings regulatory oversight directly into the heart of organizational data practices.

Data Protection Impact Assessments serve a preventive function in the regulatory architecture. They require organizations to conduct a systematic evaluation before implementing new processing activities, particularly those involving novel technologies or large-scale processing of sensitive data. The DPIA must include a description of the rights of Data Principals and the purpose of processing, an assessment and management of risks to these rights, and measures to mitigate identified risks. While the DPDP Act’s DPIA requirements are less prescriptive than those under Article 35 of the GDPR, they nonetheless compel organizations to engage in structured risk thinking rather than reactive compliance.
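
Because the Act and Rules do not prescribe a DPIA format, organizations have latitude in how they structure this risk thinking. As a purely illustrative sketch, a fiduciary might maintain a machine-readable risk register along the following lines; the field names, scoring scale, and escalation threshold here are hypothetical conventions, not requirements drawn from the statute:

```python
from dataclasses import dataclass

@dataclass
class ProcessingRisk:
    """One entry in a hypothetical DPIA risk register (illustrative only)."""
    description: str   # risk to Data Principal rights
    likelihood: int    # 1 (rare) .. 5 (near-certain)
    impact: int        # 1 (negligible) .. 5 (severe)
    mitigation: str    # measure proposed to manage the risk

    @property
    def score(self) -> int:
        # Simple likelihood-times-impact scoring, a common DPIA heuristic
        return self.likelihood * self.impact

register = [
    ProcessingRisk("Re-identification from pseudonymised records", 2, 4,
                   "Aggregate outputs; enforce a k-anonymity threshold"),
    ProcessingRisk("Profiling drift beyond the notified purpose", 3, 3,
                   "Quarterly purpose-limitation review"),
]

# Risks above an (arbitrary) threshold are escalated for mitigation sign-off
high = [r.description for r in register if r.score >= 9]
print(high)
```

A register of this kind gives the independent auditor a concrete artifact to test the assessment against, rather than relying on narrative description alone.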

Algorithmic Due Diligence

Perhaps the most forward-looking provision in Rule 12 is the requirement that Significant Data Fiduciaries observe due diligence to verify that algorithmic software deployed for hosting, display, uploading, modification, publishing, transmission, storage, updating, or sharing of personal data does not pose risks to Data Principal rights [3]. This provision acknowledges a crucial contemporary reality: algorithms themselves, not merely data processing practices, can infringe privacy and autonomy. An algorithm that engages in discriminatory profiling, manipulative targeting, or opaque decision-making poses fundamental risks that traditional data protection principles of notice and consent cannot adequately address.

The algorithmic due diligence requirement represents an attempt to impose transparency and accountability on what are often considered “black box” systems. However, the provision faces significant implementation challenges. What constitutes adequate “due diligence” in verifying algorithmic risk? Must organizations conduct algorithmic impact assessments, maintain model cards documenting training data and performance metrics, or implement explainability mechanisms? The Rules provide no detailed guidance, leaving organizations and regulators to navigate this terrain through iterative practice and potential litigation. This gap between aspiration and operationalization exemplifies the paradox at the heart of algorithmic accountability.
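
One plausible building block for such due diligence, borrowed from machine learning documentation practice rather than from the Rules themselves, is a “model card” kept for each deployed algorithm. The sketch below is entirely hypothetical: the field names, example model, and metric are invented for illustration, and the DPDP framework prescribes no particular documentation format:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class ModelCard:
    """Hypothetical due-diligence record for one deployed algorithm."""
    name: str
    purpose: str                  # notified purpose of processing
    training_data: str            # provenance of the training data
    performance: dict = field(default_factory=dict)  # metric -> value
    known_risks: list = field(default_factory=list)  # risks to Data Principals

card = ModelCard(
    name="content-ranker-v3",
    purpose="Ordering of user feed items",
    training_data="12 months of engagement logs, India region",
    performance={"auc": 0.81},
    known_risks=["May amplify engagement-optimised content"],
)

# Serialise the card so it can accompany the annual audit report
print(json.dumps(asdict(card), indent=2))
```

Maintaining cards like this would at least give auditors and the Board a stable, comparable record of what each algorithm does and what risks the fiduciary has acknowledged.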

The Data Protection Board of India: Regulator and Adjudicator

Section 18 of the DPDP Act establishes the Data Protection Board of India as the principal regulatory and adjudicatory authority for matters arising under the Act [4]. The Board is constituted as a body corporate with perpetual succession, possessing the power to acquire, hold, and dispose of property, and to enter into contracts. Its primary function is to adjudicate disputes between Data Principals and Data Fiduciaries, hear complaints regarding alleged breaches of obligations under the Act, and impose penalties for non-compliance.

The Board’s penalty powers are substantial. Section 33, read with the Schedule to the Act, authorizes penalties of up to INR 250 crores depending on the nature and gravity of the breach [4]. Failure to take reasonable security safeguards resulting in a personal data breach attracts the maximum penalty of INR 250 crores; failure to notify the Board and affected Data Principals of a breach can draw up to INR 200 crores, and breach of the additional obligations of a Significant Data Fiduciary up to INR 150 crores. These financial penalties are among the highest in Indian regulatory law, reflecting the seriousness with which the legislature views data protection violations. The Board must consider whether the penalty is proportionate to the specific breach before imposing it, incorporating the constitutional principle of proportionality directly into the penalty framework.

However, the Board’s institutional design raises concerns about independence and accountability. Members and the Chairperson are appointed by the Central Government without clear provisions for multi-stakeholder involvement or parliamentary oversight [4]. This structure contrasts with data protection authorities in jurisdictions like the European Union, where regulators possess greater structural independence from government. The concern is particularly acute given Section 17 of the Act, which grants extensive exemptions to government instrumentalities for processing undertaken in the interests of sovereignty, integrity, security of the state, public order, and other specified purposes. A Board appointed entirely by the executive may face challenges in robustly scrutinizing government data processing activities that implicate fundamental privacy rights.

Judicial Review and Constitutional Safeguards

Recognizing the potential limitations of the administrative enforcement mechanism, the DPDP Act incorporates appellate safeguards. Under Section 29, orders of the Data Protection Board may be appealed to the Appellate Tribunal, namely the Telecom Disputes Settlement and Appellate Tribunal (TDSAT), within a prescribed period [5]. This creates a tiered system in which the Board serves as the specialized first-instance adjudicator, the TDSAT hears appeals, and the constitutional courts retain supervisory jurisdiction. The structure ensures that data protection disputes can benefit from specialized expertise while remaining subject to judicial scrutiny under constitutional principles.

The Supreme Court’s jurisprudence post-Puttaswamy has begun shaping the contours of privacy protection in the digital age. In Anuradha Bhasin v. Union of India (2020), the Court emphasized that any restrictions on fundamental rights, including privacy, must satisfy the tests of necessity, proportionality, and legality [6]. This principle directly informs the interpretation of the DPDP Act’s provisions, particularly the broad exemptions granted to government entities and the discretionary designation of Significant Data Fiduciaries. Courts can examine whether these provisions, as applied in specific cases, violate constitutional guarantees.

More recently, in Frank Vitus v. Narcotics Control Bureau (2024), the Supreme Court struck down a bail condition requiring the accused to share his location by dropping a pin on Google Maps, holding that such continuous surveillance violated the right to privacy under Article 21 [7]. This judgment demonstrates judicial willingness to scrutinize surveillance mechanisms even when deployed for legitimate law enforcement purposes. The tension between the Frank Vitus precedent and Section 17’s exemptions for law enforcement processing under the DPDP Act suggests that courts will play a crucial role in demarcating the boundaries of permissible government data processing, potentially requiring procedural safeguards beyond those specified in the statute.

The Paradox of Algorithmic Transparency

The Competitive Opacity Dilemma

The enhanced obligations imposed on Significant Data Fiduciaries create a fundamental tension between transparency mandates and commercial imperatives. Many of these entities derive competitive advantage from proprietary algorithms that analyze data to generate insights, predictions, or recommendations. The economic value of platforms operated by large technology companies often resides not in the raw data itself but in the algorithmic models that process this data to deliver personalized services, targeted advertising, or predictive analytics. Requiring extensive disclosure of algorithmic functioning through DPIAs, audits, and due diligence processes potentially exposes trade secrets and undermines competitive positioning.

This paradox is not unique to India; data protection regimes worldwide grapple with balancing transparency against legitimate confidentiality interests. The GDPR attempts to address this through provisions like Article 15(1)(h), which grants data subjects the right to meaningful information about the logic involved in automated decision-making, while simultaneously recognizing that this must not adversely affect the rights and freedoms of others, including trade secrets. The DPDP Act, however, provides less nuanced guidance. The algorithmic due diligence requirement in Rule 12 demands verification that algorithms do not pose risks to Data Principal rights but does not specify how this verification should be conducted, what standards should apply, or how to balance transparency against confidentiality [3].

The Explainability Challenge

Beyond commercial concerns lies a deeper technical challenge: the inherent opacity of certain algorithmic systems, particularly those employing machine learning and artificial intelligence. Modern deep learning models often function as “black boxes” where even their creators cannot fully explain how specific inputs generate particular outputs. These systems identify complex patterns in training data that may not correspond to human-intuitive reasoning. When such algorithms make consequential decisions affecting individuals—whether in credit scoring, employment screening, insurance pricing, or content moderation—the inability to provide clear explanations creates acute accountability problems.
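
One family of auditing techniques sidesteps the opacity by treating the model as an opaque function and measuring how its output shifts when each input is perturbed. The sketch below applies this idea to a toy scoring function; the function, its weights, and the feature names are invented for illustration, and real audits would typically use established attribution tooling such as SHAP or LIME rather than this crude one-feature-at-a-time probe:

```python
def opaque_score(features: dict) -> float:
    """Stand-in for a proprietary model the auditor cannot inspect directly.

    The weights below are hypothetical and, in a genuine black box,
    would be hidden from the person running the audit.
    """
    return (0.6 * features["income"]
            - 0.3 * features["debt"]
            + 0.1 * features["tenure"])

def sensitivity(model, features: dict, delta: float = 1.0) -> dict:
    """Crude local explanation: output change per unit change in each input."""
    base = model(features)
    effects = {}
    for name in features:
        perturbed = dict(features)
        perturbed[name] += delta       # nudge one feature at a time
        effects[name] = model(perturbed) - base
    return effects

applicant = {"income": 50.0, "debt": 20.0, "tenure": 3.0}
print(sensitivity(opaque_score, applicant))
```

Even this minimal probe can surface which inputs dominate a decision, giving a DPO or auditor a starting point for questioning whether the dominant features are appropriate ones.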

The DPDP Act does not directly mandate algorithmic explainability or a “right to explanation” for automated decisions, unlike some interpretations of the GDPR. Section 6 requires consent to be “informed,” and Section 8 obligates Data Fiduciaries to ensure accuracy and completeness of data, but these provisions do not clearly extend to explaining algorithmic logic [4]. The algorithmic due diligence requirement in Rule 12 could potentially be interpreted to necessitate explainability mechanisms as part of verifying that algorithms do not pose risks, but this remains subject to regulatory guidance or judicial interpretation.

International Perspectives and Comparative Analysis

India’s approach to regulating Significant Data Fiduciaries occupies a distinctive position in the global data protection landscape. The European Union’s GDPR does not create an explicit category of “significant” controllers, though it imposes heightened obligations on controllers engaged in large-scale processing or processing of special categories of data. The GDPR’s emphasis on data minimization, purpose limitation, and granular consent requirements applies uniformly to all controllers, albeit with proportionate implementation based on risk and scale.

The United States lacks federal omnibus data protection legislation, instead relying on sector-specific laws and state-level initiatives like the California Consumer Privacy Act. The CCPA and its successor, the California Privacy Rights Act, do not employ the concept of significant data fiduciaries but impose heightened obligations on businesses meeting certain revenue or data volume thresholds. China’s Personal Information Protection Law creates a category of “Personal Information Processors with Large User Scale” subject to enhanced requirements including impact assessments and appointment of protection officers, conceptually similar to India’s approach.

What distinguishes the DPDP Act is its explicit linkage of the Significant Data Fiduciary designation to national security and sovereignty concerns. The criteria enumerated in Section 10 include not only data protection considerations (volume, sensitivity, risk to Data Principal rights) but also broader state interests (sovereignty, integrity, electoral democracy, security, public order) [2]. This reflects India’s strategic approach to data governance as implicating not merely individual privacy but national interest. The potential for designation based on impact on electoral democracy, for instance, could encompass social media platforms whose algorithmic amplification of content might influence electoral outcomes. This jurisdictional assertion of data sovereignty distinguishes India’s model from purely rights-based frameworks.

Sectoral Implications and Practical Challenges

Technology Platforms and Social Media

Large technology platforms operating social media services, search engines, and digital marketplaces are prime candidates for Significant Data Fiduciary designation given the vast volumes of personal data they process and their societal impact. These entities face particular challenges in complying with the DPDP Act’s requirements. The algorithmic curation and recommendation systems that drive user engagement on social platforms rely on processing extensive behavioral data to predict user preferences and optimize content delivery. Conducting meaningful DPIAs for these systems requires assessing not only direct privacy risks but also downstream societal harms like echo chambers, polarization, or manipulation.

Financial Services and Fintech

The financial services sector already operates under stringent data localization and security requirements imposed by sectoral regulators like the Reserve Bank of India. Banks, payment system operators, and fintech companies processing financial data are likely Significant Data Fiduciary candidates. These entities must navigate the interaction between the DPDP Act and existing RBI regulations, which the Act explicitly preserves. The challenge is particularly acute for algorithmic credit scoring and fraud detection systems, where explainability demands may conflict with the statistical complexity of risk models and the competitive sensitivity of scoring methodologies.
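
Bias auditing of such scoring systems can begin with simple group-outcome statistics that require no access to the model internals. The sketch below computes a disparate impact ratio (each group’s approval rate divided by the most-favoured group’s rate), a metric borrowed from US employment-law practice; it is used here purely as an illustration, since the DPDP framework mandates no specific fairness metric, and the sample data is invented:

```python
from collections import defaultdict

def disparate_impact(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs.

    Returns each group's approval rate divided by the highest group's
    approval rate. Values well below 1.0 flag a disparity worth
    investigating; the US "four-fifths rule" uses 0.8 as its cutoff.
    """
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical loan decisions: group A approved 8/10, group B 5/10
sample = ([("A", True)] * 8 + [("A", False)] * 2
          + [("B", True)] * 5 + [("B", False)] * 5)
print(disparate_impact(sample))
```

A low ratio does not by itself prove unlawful discrimination, but it is the kind of cheap, reproducible check an independent data auditor could run without the fiduciary disclosing its scoring methodology.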

Healthcare and Genomic Data

Healthcare providers and particularly genomic testing companies exemplify the sensitivity-based designation pathway. A relatively smaller genomic testing startup could be designated a Significant Data Fiduciary due to the extreme sensitivity of genetic data, which not only identifies individuals but reveals hereditary health predispositions affecting entire families. The algorithmic due diligence requirement takes on heightened importance in this context, as algorithms analyzing genetic data to predict disease risk or recommend treatments must be rigorously validated to avoid medical harm from inaccurate or biased predictions.

Enforcement Challenges and Future Trajectory

The DPDP Act’s effectiveness in regulating Significant Data Fiduciaries will ultimately depend on implementation and enforcement. Several challenges loom large. First, the Data Protection Board must develop institutional capacity and technical expertise to effectively oversee entities employing sophisticated data processing technologies. Evaluating whether algorithms pose risks to Data Principal rights requires understanding of machine learning architectures, bias auditing methodologies, and fairness metrics—capabilities that may require time to develop within a newly constituted regulatory body.

Second, the Act’s phased implementation timeline creates transitional uncertainty. While the DPDP Rules, 2025 were notified in November 2025, companies have been granted a 12-18 month compliance window, with full enforcement expected by May 2027 [3]. During this transition, the government must issue notifications designating which entities or classes qualify as Significant Data Fiduciaries. The absence of such notifications creates planning challenges for organizations uncertain whether they will be subject to enhanced obligations.

Third, the global nature of data flows and digital services complicates enforcement. Many Significant Data Fiduciaries will be multinational corporations with complex organizational structures spanning multiple jurisdictions. Ensuring compliance with the requirement that Data Protection Officers be based in India and that audits and DPIAs meaningfully assess India-specific processing activities requires extraterritorial regulatory reach. Section 3 of the Act asserts applicability to processing outside India if related to offering goods or services to Data Principals in India, mirroring the GDPR’s extraterritorial scope [4]. However, practical enforcement against non-resident entities remains challenging absent international cooperation frameworks.

Conclusion

The concept of Significant Data Fiduciaries under the Digital Personal Data Protection Act, 2023 represents an ambitious attempt to impose heightened accountability on entities whose data processing activities pose substantial risks to individual privacy and societal interests. By mandating Data Protection Officers, independent audits, periodic impact assessments, and algorithmic due diligence, the Act seeks to transform data protection from a compliance checklist into a governance imperative embedded in organizational culture and decision-making processes.

Yet the framework also reveals the inherent tensions in regulating algorithmic systems. The transparency and accountability that the law demands often conflict with the commercial opacity on which business models depend and with the technical limits of explaining complex machine learning systems. This algorithmic accountability paradox—the expectation that entities will be transparent about systems whose value lies partly in their inscrutability—defines the central challenge of contemporary data protection law.

The path forward requires moving beyond binary framings of transparency versus secrecy toward more nuanced approaches. Regulatory frameworks might embrace graduated disclosure mechanisms where different stakeholders receive different levels of algorithmic transparency. Data Protection Officers and auditors might receive detailed technical access to algorithms while the public receives high-level descriptions of processing purposes and safeguards. Independent technical auditing, perhaps through regulatory sandboxes or trusted third parties, could verify algorithmic fairness without full public disclosure. The development of explainability methods that provide meaningful insight without exposing proprietary details represents another promising direction.

As India’s data protection regime matures through the coming years of implementation, judicial interpretation will prove crucial. Courts will need to articulate standards for what constitutes adequate algorithmic due diligence, how to balance transparency against legitimate confidentiality interests, and when government exemptions impermissibly infringe the fundamental right to privacy established in Puttaswamy. The Data Protection Board’s early decisions in cases involving Significant Data Fiduciaries will set important precedents regarding the practical meaning of enhanced obligations.

Ultimately, the success of the Significant Data Fiduciary framework will be measured not merely by formal compliance but by substantive outcomes: whether it genuinely reduces privacy harms, whether it fosters trustworthy algorithmic systems, and whether it empowers individuals with meaningful control over their personal data in an increasingly algorithmically mediated world. The DPDP Act provides the legal architecture, but building effective algorithmic accountability requires sustained commitment from regulators, judiciary, industry, and civil society alike.

References

[1] Justice K.S. Puttaswamy (Retd.) v. Union of India, (2017) 10 SCC 1. Available at: https://www.scobserver.in/cases/puttaswamy-v-union-of-india-fundamental-right-to-privacy-case-background/

[2] The Digital Personal Data Protection Act, 2023, No. 22 of 2023, § 10. Available at: https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf

[3] Digital Personal Data Protection Rules, 2025. Available at: https://www.hoganlovells.com/en/publications/indias-digital-personal-data-protection-act-2023-brought-into-force-

[4] The Digital Personal Data Protection Act, 2023, No. 22 of 2023. Available at: https://en.wikipedia.org/wiki/Digital_Personal_Data_Protection_Act,_2023

[5] KS&K Advocates, “Judicial Review and Appeals under India’s DPDP Act, 2023.” Available at: https://ksandk.com/data-protection-and-data-privacy/judicial-review-and-appeals-under-indias-dpdp-act-2023/

[6] Anuradha Bhasin v. Union of India, (2020) 3 SCC 637. Available at: https://lawfullegal.in/indias-digital-personal-data-protection-act-2023-a-legal-shift-towards-data-sovereignty-and-privacy/

[7] Frank Vitus v. Narcotics Control Bureau (2024). Available at: https://www.jurist.org/commentary/2024/09/unconstitutional-movement-tracking-exploring-the-tension-between-recent-indian-supreme-court-jurisprudence-and-data-protection-legislation/

Published and Authorized by Rutvik Desai