Disclaimer: As per the rules of the Bar Council of India, this content is for educational and informational purposes only. It does not constitute legal advice or solicitation.

Algorithms Under the Microscope: SEBI's Binding AI/ML Accountability Framework

By M S Sulthan Legal Associates, Kozhikode | April 9, 2026 | Securities Law / Technology Law

India's capital markets regulator has fired a decisive shot in the country's AI governance narrative. Through binding amendments to the Intermediaries Regulations and subsequent sweeping governance directives, the Securities and Exchange Board of India (SEBI) has placed every regulated entity—from the largest stock exchange to the smallest retail algorithmic trader—under a stringent new regime of accountability for Artificial Intelligence and Machine Learning (AI/ML) usage. The message is unambiguous: the era of the unattributed "black box" algorithm is over.

The regulatory instruments establish what SEBI describes as a "glass box" standard. AI-driven decisions must be explainable, auditable, and traceable to a legally accountable human or institution. Under the newly inserted Chapter IIIB and Regulation 16C of the SEBI (Intermediaries) Regulations, 2008, regulated entities bear sole, non-delegable responsibility for any output generated by an AI/ML tool, regardless of whether it was built in-house or procured from a third-party vendor.

For stockbrokers, mutual funds, investment advisers, and Market Infrastructure Institutions (MIIs), the compliance calculus has fundamentally shifted. With the full enforcement deadline of April 1, 2026 now in force, board-approved AI frameworks and mandatory pre-deployment testing are no longer aspirational best practices; they are the strict regulatory baseline.

1. The Binding Foundation: Regulation 16C Explained

Regulation 16C is the centerpiece of the new framework. Its scope is intentionally expansive, applying to "any person regulated by the Board who uses artificial intelligence and machine learning tools... irrespective of the scale and scenario of adoption."

The "Non-Delegable Liability" Rule: Regulation 16C imposes sole responsibility on the regulated entity. There is no longer a credible legal basis for attributing erroneous or harmful AI outputs to the third-party software vendor or the underlying LLM itself. If an AI tool violates a regulation, the SEBI-registered entity bears the absolute liability.

The regulation imposes accountability across three distinct axes:

  • Data Privacy and Security: The entity is solely responsible for the privacy and integrity of investor data processed by the AI tool, running parallel to the strict mandates of the Digital Personal Data Protection Act (DPDP Act), 2023.
  • Output Accuracy: The entity is accountable for the accuracy of every output generated, eliminating the "the tool is neutral" defense.
  • Overall Accountability: Any consequence—be it a trading loss, discriminatory advice, or a flash crash—rests with the regulated entity, granting SEBI full enforcement authority.

2. The Six-Pillar Governance Architecture

To enforce Regulation 16C, SEBI has structured a principles-based governance architecture. Regulated entities must immediately align their internal systems with these six pillars:

1. Ethics

AI systems must align with investor interests. SEBI emphasizes data integrity, non-deception, and investor-centricity, drawing from OECD principles.

2. Accountability

Every deployment must have a traceable human accountable for oversight. Firms must establish a Board-approved AI Governance Framework and designate a senior technical officer to oversee the AI lifecycle.

3. Transparency

Mandatory plain-language client disclosures are required for AI tools that directly affect investors (e.g., robo-advisory). Disclosures must cover the tool's purpose, limitations, accuracy metrics, and risks.

4. Auditability

Entities must maintain comprehensive documentation (input datasets, validation reports, output records) for a minimum of five years. Independent periodic audits are mandatory.
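The five-year retention window can be illustrated with a minimal sketch. The record schema and the simplified day-count below are hypothetical assumptions for illustration, not a SEBI-prescribed calculation:

```python
from datetime import date, timedelta

# Minimum five-year retention described above (simplified to 365-day years).
RETENTION_DAYS = 5 * 365

def must_retain(record_created: date, today: date) -> bool:
    """True while an AI audit record is still inside the retention window."""
    return (today - record_created) < timedelta(days=RETENTION_DAYS)

# A validation report created on the enforcement date must still be held in 2028.
print(must_retain(date(2026, 4, 1), date(2028, 1, 1)))  # True
```

In practice a compliance system would apply such a check to every artefact class the pillar names: input datasets, validation reports, and output records alike.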

5. Data Privacy & Security

Systems must operate within a robust cybersecurity framework. SEBI requires "circuit breakers"—automated cut-offs that halt AI-driven trading if market stability indicators are breached.

6. Fairness

AI models must be trained on diverse datasets and regularly tested for algorithmic bias to ensure they do not discriminate against specific investor demographics.
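The "circuit breaker" requirement under Pillar 5 can be sketched as an automated cut-off check. The indicator fields and threshold values below are hypothetical assumptions chosen for illustration; SEBI does not prescribe these specific parameters:

```python
from dataclasses import dataclass

@dataclass
class MarketIndicators:
    """Hypothetical stability indicators a firm might monitor."""
    index_move_pct: float      # absolute intraday index move, in percent
    order_reject_rate: float   # fraction of orders rejected by the exchange

# Illustrative cut-off thresholds (assumed, not SEBI-prescribed).
MAX_INDEX_MOVE_PCT = 5.0
MAX_REJECT_RATE = 0.20

def circuit_breaker_tripped(ind: MarketIndicators) -> bool:
    """Return True if AI-driven order flow should be halted."""
    return (ind.index_move_pct >= MAX_INDEX_MOVE_PCT
            or ind.order_reject_rate >= MAX_REJECT_RATE)

# A 6% index move trips the breaker even with a healthy reject rate.
print(circuit_breaker_tripped(MarketIndicators(6.0, 0.05)))  # True
```

The design point is that the halt is mechanical: once an indicator breaches its limit, AI-driven order flow stops without waiting for human judgement.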

3. The Algorithmic Trading Overhaul: April 2026 Deadlines

Running parallel to Chapter IIIB is SEBI's framework for "Safer Participation of Retail Investors in Algorithmic Trading," which overhauls the legal architecture for automated strategies.

  • Unique Exchange-Provided Strategy IDs: Every order generated by an algorithm must now carry a unique Strategy ID acting as a digital fingerprint. This allows SEBI to trace any market anomaly directly back to the specific algorithm and the responsible entity.
  • Mandatory Exchange Approval: Every algorithmic strategy (exceeding de minimis thresholds) must be pre-approved by the stock exchange before deployment.
  • Black Box vs. White Box Classification: SEBI now distinguishes between transparent, rule-based algorithms (White Box) and opaque, proprietary algorithms (Black Box). Providers of Black Box algorithms are now legally required to hold a SEBI Research Analyst (RA) license.
  • Orders-Per-Second (OPS) Threshold: Strategies executing more than 10 orders per second are classified as High-Frequency Trading (HFT) and attract rigorous pre-deployment testing and real-time monitoring mandates.
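The OPS classification and Strategy ID tagging described above can be sketched in a few lines. The order fields and the sample ID `NSE-ALGO-000123` are hypothetical; in practice the Strategy ID is issued by the exchange:

```python
# Framework threshold described above: strictly more than 10 orders/second is HFT.
HFT_OPS_THRESHOLD = 10

def classify_strategy(peak_orders_per_second: int) -> str:
    """Classify a strategy against the OPS threshold."""
    return "HFT" if peak_orders_per_second > HFT_OPS_THRESHOLD else "non-HFT"

def tag_order(order: dict, strategy_id: str) -> dict:
    """Attach the exchange-provided Strategy ID to every outgoing order."""
    return {**order, "strategy_id": strategy_id}

print(classify_strategy(12))  # HFT
print(tag_order({"symbol": "RELIANCE", "qty": 10}, "NSE-ALGO-000123"))
```

Note that a strategy peaking at exactly 10 orders per second stays non-HFT under this reading, since the rule applies to strategies exceeding the threshold.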

The Enforcement Deadline: Non-compliant brokers have been barred from onboarding new retail API clients since January 5, 2026. Full mandatory compliance for all stockbrokers is strictly enforced as of April 1, 2026.

4. Impact Analysis by Stakeholder

The scope of the framework impacts the entire capital market ecosystem:

  • Stockbrokers & Sub-Brokers: Bear the heaviest burden. Must implement Vulnerability Assessment & Penetration Testing (VAPT), deploy kill switches, and retain five-year logs of all algo activity. As "Principals," they are liable for third-party algorithms hosted on their platforms.
  • Mutual Funds & AMCs: Must secure board approval for AI governance frameworks used in portfolio optimization and customer segmentation, and report AI usage periodically to SEBI.
  • Fintechs & SaaS Providers: While a pure SaaS provider does not bear direct SEBI regulatory liability, the regulated broker using their software does. This is creating massive contractual shifts, with brokers demanding aggressive indemnification clauses and audit rights from their fintech vendors.

5. Global Context: SEBI vs. The EU AI Act

SEBI's approach aligns broadly with global consensus but maintains distinct Indian characteristics. While the EU AI Act (2024) relies on a prescriptive, risk-tiered classification system (banning "unacceptable risk" systems entirely), SEBI has opted for a principles-based "sole responsibility" model. By collapsing the distinction between the "developer" and the "deployer," SEBI places the entire compliance burden squarely on the regulated user (the registered intermediary).

Conclusion

The implementation of Chapter IIIB and Regulation 16C signifies that SEBI views algorithmic opacity not as a technological feature, but as a systemic risk. With the April 1, 2026 deadline effectively placing these rules into active enforcement, market participants must immediately transition their AI governance from paper policies to hardcoded, auditable technical realities.

Frequently Asked Questions (FAQ)

1. Who is legally responsible if a third-party AI tool causes a trading loss?
Under Regulation 16C of the SEBI Intermediaries Regulations, the SEBI-registered entity (e.g., the stockbroker or mutual fund) bears sole, non-delegable responsibility for the output of the AI tool, even if it was procured from an external SaaS vendor. The regulated entity cannot pass regulatory liability to the tech provider.
2. What is the difference between a White Box and Black Box algorithm under SEBI rules?
A White Box algorithm operates on transparent, rule-based logic that is fully explicable to regulators. A Black Box algorithm relies on opaque, proprietary reasoning (often deep learning models) with limited disclosure. In 2026, providers of Black Box algorithms must hold a formal SEBI Research Analyst (RA) license to operate.
3. What are the record-keeping requirements for AI systems?
SEBI mandates that regulated entities maintain comprehensive documentation—including input datasets, model parameters, internal approval trails, and final output records—for a minimum period of five years to ensure retrospective auditability.
4. Does a retail trader using algorithms need exchange approval?
Yes. Any retail algorithmic strategy executing orders above the de minimis threshold (currently >10 Orders Per Second) is classified as High-Frequency Trading. It must be routed through an empanelled broker, receive formal exchange pre-approval, and carry a unique Strategy ID for traceability.

Are your proprietary trading algorithms and third-party SaaS contracts compliant with SEBI's 2026 mandates? Contact our Technology and Securities Law desk for a comprehensive governance audit.

Email: contact@mssulthan.com

© 2026 M S Sulthan Legal Associates, Kozhikode. All Rights Reserved.
