India Takes Swift Action Against Deepfakes: Legal and Digital Forensic Implications
Introduction:
Deepfake technology has become a pressing concern globally, and India is no exception. The Indian government has recently taken a significant step to address this issue by issuing advisories to leading social media platforms, compelling them to remove reported deepfake content within 36 hours. Failure to comply results in the loss of 'safe harbour immunity,' exposing these platforms to criminal and judicial proceedings under Indian law. In this updated blog, we will explore how these new measures affect the digital forensic landscape and their legal implications in India.
Government's Response to Deepfakes:
The Ministry of Electronics and IT has issued advisories to significant social media intermediaries, emphasizing the need for due diligence and reasonable efforts to identify misinformation and deepfakes. The advisory particularly targets content that violates rules, regulations, and user agreements and instructs platforms to take prompt actions against such content. This move aligns with the IT Rules 2021 and is aimed at strengthening online content regulation.
Key Points from the Advisory:
1. Swift Removal: Social media platforms are directed to ensure the removal of deepfake content within 36 hours of receiving a report. This rapid response time is crucial in curbing the spread of manipulated content.
2. Rule 7 of IT Rules 2021: Non-compliance with the advisory could trigger Rule 7 of the IT Rules 2021, causing platforms to lose the protection available under Section 79(1) of the Information Technology Act, 2000. Section 79 is designed to shield intermediaries from liability for third-party content hosted on their platforms.
Legal Implications:
The advisory underscores the importance of India's legal framework in tackling deepfake technology. Under the IT Rules 2021, platforms have a legal obligation to prevent the spread of misinformation and deepfakes. Failure to meet these requirements empowers aggrieved individuals to take platforms to court under the provisions of the Indian Penal Code (IPC). This legal recourse is critical in holding platforms accountable for damages resulting from deepfake content.
Encouragement for Victims:
Union Minister of State for Electronics and IT, Rajeev Chandrasekhar, has urged individuals affected by deepfakes to file police complaints and seek remedies under the IT Act. The IT Act provides for jail time and financial penalties against those involved in the creation and dissemination of deepfakes. This encouragement empowers victims to take legal action against the spread of this harmful content, providing them with a path to justice.
Real-World Example:
The recent viral deepfake video of actress Rashmika Mandanna underscores the urgency of addressing this issue. The video, in which the actress's face was morphed onto another person's body, prompted calls for legal action and highlighted the need for more stringent regulations.
Conclusion:
The Indian government's swift response to deepfake technology through advisories to social media platforms is a significant step toward addressing the challenges posed by deepfakes. The legal framework in India plays a crucial role in holding platforms accountable and providing remedies to victims. Digital forensics experts will be pivotal in assisting law enforcement in the investigation and verification of deepfake-related crimes. As deepfake technology continues to evolve, it is essential that India remains proactive in adapting its legal and technical infrastructure to combat the threats it presents.