Algorithms Under the Microscope: SEBI's Binding AI/ML Accountability Framework
India's capital markets regulator has taken a decisive step in the country's AI governance story. Through binding amendments to the Intermediaries Regulations and subsequent sweeping governance directives, the Securities and Exchange Board of India (SEBI) has placed every regulated entity, from the largest stock exchange to the smallest retail algorithmic trader, under a stringent new accountability regime for Artificial Intelligence and Machine Learning (AI/ML) usage. The message is unambiguous: the era of the unattributed "black box" algorithm is over.
The regulatory instruments establish what SEBI describes as a "glass box" standard. AI-driven decisions must be explainable, auditable, and traceable to a legally accountable human or institution. Under the newly inserted Chapter IIIB and Regulation 16C of the SEBI (Intermediaries) Regulations, 2008, regulated entities bear sole, non-delegable responsibility for any output generated by an AI/ML tool, regardless of whether it was built in-house or procured from a third-party vendor.
For stockbrokers, mutual funds, investment advisers, and Market Infrastructure Institutions (MIIs), the compliance calculus has fundamentally shifted. With the full enforcement deadline of April 1, 2026 now in force, board-approved AI frameworks and mandatory pre-deployment testing are no longer aspirational best practices; they are the strict regulatory baseline.
1. The Binding Foundation: Regulation 16C Explained
Regulation 16C is the centerpiece of the new framework. Its scope is intentionally expansive, applying to "any person regulated by the Board who uses artificial intelligence and machine learning tools... irrespective of the scale and scenario of adoption."
The regulation imposes accountability across three distinct axes:
- Data Privacy and Security: The entity is solely responsible for the privacy and integrity of investor data processed by the AI tool, running parallel to the strict mandates of the Digital Personal Data Protection Act (DPDP Act), 2023.
- Output Accuracy: The entity is accountable for the accuracy of every output generated, eliminating the "the tool is neutral" defense.
- Overall Accountability: Any consequence—be it a trading loss, discriminatory advice, or a flash crash—rests with the regulated entity, granting SEBI full enforcement authority.
2. The Six-Pillar Governance Architecture
To enforce Regulation 16C, SEBI has structured a principles-based governance architecture. Regulated entities must immediately align their internal systems with these six pillars:
1. Ethics
AI systems must align with investor interests. SEBI emphasizes data integrity, non-deception, and investor-centricity, drawing from OECD principles.
2. Accountability
Every deployment must have a traceable human accountable for oversight. Firms must establish a Board-approved AI Governance Framework and designate a senior technical officer to oversee the AI lifecycle.
3. Transparency
Mandatory plain-language client disclosures are required for AI tools that directly affect investors (e.g., robo-advisory). Disclosures must cover the tool's purpose, limitations, accuracy metrics, and risks.
4. Auditability
Entities must maintain comprehensive documentation (input datasets, validation reports, output records) for a minimum of five years. Independent periodic audits are mandatory.
5. Data Privacy & Security
Systems must operate within a robust cybersecurity framework. SEBI requires "circuit breakers"—automated cut-offs that halt AI-driven trading if market stability indicators are breached.
6. Fairness
AI models must be trained on diverse datasets and regularly tested for algorithmic bias to ensure they do not discriminate against specific investor demographics.
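The "circuit breaker" requirement under the Data Privacy & Security pillar can be illustrated with a minimal sketch. The specific indicators, threshold values, and class names below are illustrative assumptions, not metrics prescribed by SEBI; the point is the latching behavior, where a breach halts AI-driven order flow until a human intervenes.

```python
from dataclasses import dataclass

@dataclass
class MarketIndicators:
    """Snapshot of stability indicators an AI trading system might monitor.

    These particular metrics are hypothetical examples, not SEBI-mandated ones.
    """
    index_move_pct: float      # absolute intraday index move, in percent
    order_reject_rate: float   # fraction of orders rejected by the exchange
    model_drift_score: float   # divergence of live data from training data

class CircuitBreaker:
    """Halts AI-driven order flow when any stability indicator is breached."""

    def __init__(self, max_index_move=5.0, max_reject_rate=0.10, max_drift=0.25):
        self.max_index_move = max_index_move
        self.max_reject_rate = max_reject_rate
        self.max_drift = max_drift
        self.halted = False

    def check(self, m: MarketIndicators) -> bool:
        """Return True if trading may continue; latch a halt otherwise."""
        if (abs(m.index_move_pct) > self.max_index_move
                or m.order_reject_rate > self.max_reject_rate
                or m.model_drift_score > self.max_drift):
            # Latched on purpose: an automated reset would defeat the
            # human-oversight intent of the accountability pillar.
            self.halted = True
        return not self.halted
```

The latch is the key design choice: once tripped, the breaker stays open until a designated senior officer reviews the event and resets it, which is what makes the cut-off auditable rather than merely reactive.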
3. The Algorithmic Trading Overhaul: April 2026 Deadlines
Running parallel to Chapter IIIB is SEBI's framework for "Safer Participation of Retail Investors in Algorithmic Trading," which overhauls the legal architecture for automated strategies.
- Mandatory Exchange Approval: Every algorithmic strategy (exceeding de minimis thresholds) must be pre-approved by the stock exchange before deployment.
- Black Box vs. White Box Classification: SEBI now distinguishes between transparent, rule-based algorithms (White Box) and opaque, proprietary algorithms (Black Box). Providers of Black Box algorithms are now legally required to hold a SEBI Research Analyst (RA) license.
- Order-Per-Second (OPS) Threshold: Strategies executing more than 10 orders per second are classified as High-Frequency Trading (HFT) and attract rigorous pre-deployment testing and real-time monitoring mandates.
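The OPS threshold lends itself to a simple broker-side check. The sliding-window approach and all names below are assumptions for illustration; only the 10-orders-per-second figure comes from the framework described above.

```python
from collections import deque

HFT_OPS_THRESHOLD = 10  # orders per second, per SEBI's algo-trading framework

class OrderRateMonitor:
    """Counts orders in a rolling one-second window to flag HFT-level activity."""

    def __init__(self, threshold=HFT_OPS_THRESHOLD):
        self.threshold = threshold
        self.timestamps = deque()  # send times of recent orders, in seconds

    def record_order(self, now: float) -> bool:
        """Record an order at time `now`; return True once the strategy
        exceeds the OPS threshold and should be treated as HFT."""
        self.timestamps.append(now)
        # Drop orders that have fallen out of the one-second window.
        while self.timestamps and now - self.timestamps[0] >= 1.0:
            self.timestamps.popleft()
        return len(self.timestamps) > self.threshold
```

A production gateway would key such a monitor per strategy ID and feed a positive flag into the exchange-approval and monitoring workflow rather than simply returning a boolean.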
The Enforcement Deadline: Non-compliant brokers have been barred from onboarding new retail API clients since January 5, 2026. Full mandatory compliance for all stockbrokers is strictly enforced as of April 1, 2026.
4. Impact Analysis by Stakeholder
The scope of the framework impacts the entire capital market ecosystem:
- Stockbrokers & Sub-Brokers: Bear the heaviest burden. Must implement Vulnerability Assessment & Penetration Testing (VAPT), deploy kill switches, and retain five-year logs of all algo activity. As "Principals," they are liable for third-party algorithms hosted on their platforms.
- Mutual Funds & AMCs: Must secure board approval for AI governance frameworks used in portfolio optimization and customer segmentation, and report AI usage periodically to SEBI.
- Fintechs & SaaS Providers: While a pure SaaS provider does not bear direct SEBI regulatory liability, the regulated broker using its software does. This is driving significant contractual shifts, with brokers demanding aggressive indemnification clauses and audit rights from their fintech vendors.
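The five-year log-retention duty described above implies logs that can survive an audit, not just accumulate. A minimal sketch of a tamper-evident algo activity log follows; the record schema and hash-chaining scheme are illustrative assumptions, not a format specified by SEBI.

```python
import hashlib
import json
from datetime import datetime, timezone

class AlgoAuditLog:
    """Append-only log of algo activity with hash chaining for tamper evidence."""

    def __init__(self):
        self.records = []
        self._prev_hash = "0" * 64  # genesis value for the hash chain

    def append(self, event: dict) -> dict:
        """Append an event, chaining its hash to the previous record."""
        record = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "event": event,
            "prev_hash": self._prev_hash,
        }
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        record["hash"] = digest
        self._prev_hash = digest
        self.records.append(record)
        return record

    def verify_chain(self) -> bool:
        """Recompute every hash to confirm no record was altered or dropped."""
        prev = "0" * 64
        for r in self.records:
            if r["prev_hash"] != prev:
                return False
            body = {k: r[k] for k in ("ts", "event", "prev_hash")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if digest != r["hash"]:
                return False
            prev = r["hash"]
        return True
```

In practice such records would be flushed to write-once storage with a five-year retention policy; the in-memory list here only demonstrates the chaining idea.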
5. Global Context: SEBI vs. The EU AI Act
SEBI's approach aligns broadly with global consensus but maintains distinct Indian characteristics. While the EU AI Act (2024) relies on a prescriptive, risk-tiered classification system (banning "unacceptable risk" systems entirely), SEBI has opted for a principles-based "sole responsibility" model. By collapsing the distinction between the "developer" and the "deployer," SEBI places the entire compliance burden squarely on the regulated user (the registered intermediary).
Conclusion
The implementation of Chapter IIIB and Regulation 16C signifies that SEBI views algorithmic opacity not as a technological feature, but as a systemic risk. With the April 1, 2026 deadline effectively placing these rules into active enforcement, market participants must immediately transition their AI governance from paper policies to hardcoded, auditable technical realities.