
Deep-fake abuse challenges India’s new Bharatiya Nyaya Sanhita as experts call for stronger laws, definitions, and digital safeguards against AI-driven identity manipulation.
Imagine a video of you shared online in which you never spoke or acted. That is the power, and the peril, of a deep-fake. In India, as synthetic media technologies leap ahead, an urgent question arises: is our criminal law framework ready for such manipulation? The Bharatiya Nyaya Sanhita, 2023 (BNS) sets out the country's new substantive criminal law. But when it comes to deep-fake abuse (non-consensual fabrication, impersonation, misinformation), there are gaps, calls for amendment and evolving jurisprudence. This article explores how deep-fakes fit (or do not fit) into the BNS, what needs to change, and why this matters.
Sections under BNS relevant to deep-fake abuse
Here we explain the key BNS sections that can be invoked in deep-fake contexts. Before each list, brief context connects how the old framework, the Indian Penal Code, 1860 (IPC), touched these areas and how the BNS adapts them.
1. Offences affecting public tranquillity and misinformation
Before diving into the pointers, note that deep-fakes may disturb public order, mislead citizens and impersonate persons. The IPC contained offences of public mischief, false statements and the like; the BNS carries updated provisions.
- Section 353(1)(b) of the BNS imposes punishment for "making a false statement which is likely to cause alarm or danger to the public" or "spreading a rumour or report which is likely to cause fear or alarm to the public" (or words to that effect).
- Section 111 deals with organised crime. While not specific to deep-fakes, deep-fake content created as part of organised cyber operations may fall under this umbrella.
- The application is extra-territorial: BNS Sections 1(4) and 1(5) make offences committed beyond India punishable as if committed within India.
2. Offences against women, children, impersonation & non-consensual imagery
Deep-fakes often involve non-consensual sexual imagery or impersonation that violates dignity. The IPC contained offences such as voyeurism and assault. The BNS retains many of these but does not yet explicitly list deep-fake creation as a separate offence. Key pointers:
- Offences relating to sexual imagery or assault may still apply: for example, if a deep-fake depicts someone in a sexual act without consent, existing sections on sexual offences may be invoked.
- Impersonation or identity theft may be framed as an offence via fraud, cheating or misrepresentation under the chapters on property and documents.
- However, the BNS currently lacks a dedicated "synthetic media manipulation" clause that names deep-fakes; analysts have flagged this gap.
3. Offences against documents and property marks
Since deep-fakes can falsify identities, images or videos that may constitute documents or digital records, the BNS provisions on falsification can be relevant.
- For example, a deep-fake video impersonating a public servant may intersect with offences under the chapter on offences relating to documents.
- The BNS largely replicates the corresponding IPC chapters (Chapter XVIII and others) in structure, so existing jurisprudence on falsification offences remains a useful guide.
4. General exceptions and liability
Before listing the pointers, note that digital technologies often raise questions of intent, knowledge and harm. Under the BNS, the general exceptions (such as mistake of fact and intoxication) still apply.
- Who made the deep-fake, why, and how it was distributed are all relevant to establishing mens rea under the BNS.
- The BNS retains the concepts of attempt, abetment and conspiracy (Chapter IV). Deep-fake fabrication plus dissemination might therefore attract liability for criminal attempt or conspiracy.
Impact
Deep-fake technology has a wide-ranging impact that makes the need for legal clarity urgent. Below are the key areas of concern.
- Personal dignity and privacy: A person can be depicted in fabricated video or audio that damages reputation or causes emotional harm. Without an explicit deep-fake-specific law, victims may struggle for redress.
- Public order and misinformation: Deep-fakes of political figures or critical events can trigger panic, unrest or electoral interference. The BNS provision on false statements (Section 353) becomes crucial here.
- Economic fraud and impersonation: Deep-fakes may be used to impersonate individuals in financial transactions, boardrooms or secure systems. The existing provisions on cheating, fraud and property may apply but lack specificity.
- Technological and enforcement challenges: Detecting synthetic media, attributing it to a creator or distributor, cross-jurisdiction issues and proving intent all complicate matters. The gap in legal definition may hamper speedy action.
- Chilling effect on free speech: Overly broad regulation risks stifling legitimate uses of synthetic media (satire, parody, art). Any amendment must therefore balance rights and harms.
Amendments: What needs to change
The current law can be strengthened by adding targeted provisions that acknowledge advances in synthetic media, set clear definitions, calibrate punishments and ensure procedural effectiveness. The following pointers capture the major areas.
- Introduce a specific offence of creating, disseminating or publishing non-consensual synthetic media (deep-fakes) that impersonates a person or causes harm. This would close the gap noted in the literature.
- Define "synthetic media" and "deep-fake" in the statute (as distinct from mere "false statements"), so that courts and investigators have clarity.
- Introduce aggravated penalties where a deep-fake impersonates a public servant or political figure, or affects national security or public peace (linking with Sections 353 and 111).
- Provide victim-centric provisions, such as a right to deletion or takedown, protection of the victim's identity and remedies for image rehabilitation.
- Empower investigators with digital-forensics mandates, preservation orders, cross-platform takedown obligations and swift interim relief.
- Introduce safe-harbour obligations for intermediaries, coordinated with the Information Technology Act, 2000 and the Digital Personal Data Protection Act, 2023, so that platforms are obliged to remove deep-fake content once notified.
- Add a periodic-review clause: given rapid AI development, mandate a review of the law every few years to keep pace with technology.
Landmark cases
Here are a few notable cases that have brought deep-fakes into legal focus in India. They illustrate how BNS provisions are being applied and where gaps remain.
- In one reported case from Gujarat, a man was arrested for sharing a deep-fake video of the Prime Minister in a WhatsApp group. He was booked under BNS Sections 197(1)(d) and 353(1)(b) along with Section 66C of the IT Act.
- In another case, from Indore, an individual uploaded altered images of deities and explicit content. The police invoked BNS Sections 196, 294 and 299 along with Sections 67 and 67A of the IT Act.
- Legal commentary (such as from the Observatory for Economic & Security Studies) points out that while the BNS covers false statements and impersonation, courts are still awaiting clear jurisprudence on when a deep-fake becomes an independent offence.
These cases underline how existing sections are being stretched to address deep-fakes, but also that a dedicated statutory treatment would aid consistency and deterrence.
Key changes
Here are the major takeaways, summarised:
- The BNS introduces new structural offences (organised crime, terrorism) and extra-territorial reach, which can be leveraged in deep-fake abuse cases.
- Existing sections such as 353 (false statements causing public mischief) and 111 (organised crime) provide tools, but they are not tailor-made for deep-fakes.
- The gap: no dedicated deep-fake offence is defined in the statute, which complicates detection, prosecution and sentencing.
- Victim protection, takedown and platform liability remain weakly defined in the BNS; enforcement depends heavily on external frameworks (the IT Act and IT Rules).
- Legal commentators recommend amendment rather than piecemeal application of old provisions; a proactive statute would provide clearer deterrence.
- Technology is evolving fast; the law must be adaptive, with periodic reviews and digital-forensics mandates built in.
- Practically, law enforcement and the judiciary are already using BNS provisions to arrest deep-fake offenders, but consistent jurisprudence is yet to emerge.
Conclusion
Deep-fakes pose a new frontier in criminal law, blending technology, identity, reputation, public order and rights. The Bharatiya Nyaya Sanhita, 2023 provides a base structure and incorporates many offences that can be applied to deep-fake harms. But legal experts and practitioners rightly call for amendments: a clear statutory definition of deep-fake abuse, dedicated offences, enhanced victim remedies and duties on platforms. If the law remains reactive, deep-fake misuse will outpace enforcement. For India's vision of a safe digital space, the BNS must evolve soon to meet this challenge head-on.
Source – PIB
Also read – Bharatiya Nyaya Sanhita, 2023






