Social Media

ShareChat

Ready Score 39/100
ANALYSIS SUPERVISED BY Sushant Pasumarty
📅 21 Feb 2026

ShareChat and Moj serve India's vernacular social media users, processing content that reveals regional identity, religious affinity, political leanings, and cultural practices. At 39/100, the combination of cultural profiling, facial data from short videos, and a large minor user base creates one of the most complex DPDP compliance challenges.

⚠️ Compliance Gaps

  • No DPDP Act 2023 reference
  • Vernacular content reveals regional, religious, and political identity
  • Moj short video facial data handling undefined
  • User-generated content data rights under DPDP unclear
  • No data retention timelines
  • Data Protection Board not referenced
  • Advertising data profiling based on language and cultural preferences
  • Minor users on Moj platform lack Section 9 protections

✅ Strengths

  • Basic security measures described
  • Grievance officer designated
  • Content moderation systems referenced

Overview

ShareChat and Moj (its short-video platform competing with Instagram Reels) serve India's vernacular internet users, primarily in Tier 2/3 cities, in regional languages. The content users create and consume reveals regional identity (language choice), religious affinity (devotional content), political views (engagement with political content), caste indicators, and cultural practices. The Moj platform adds facial/visual data through video creation.

DPDP Readiness: Section-by-Section Analysis

ShareChat/Moj's consent covers all data collection under one set of standard terms, with no purpose-level separation. Unique concerns:

  • Language selection: Reveals regional and potentially ethnic identity
  • Content engagement: the religious and political content a user interacts with amounts to belief profiling
  • Video creation on Moj: Face filters process facial geometry data
  • User-generated content: Text posts and images containing personal information
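The purpose-level separation these concerns point to can be sketched as a consent ledger that keeps feed personalization and ad profiling as distinct grants. Everything below (names, purposes, storage) is a hypothetical illustration, not ShareChat's actual data model:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str          # e.g. "feed_personalization", "ad_profiling"
    granted: bool
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

class ConsentLedger:
    """Denies any purpose that was never explicitly granted."""

    def __init__(self) -> None:
        self._records: dict[tuple[str, str], ConsentRecord] = {}

    def record(self, rec: ConsentRecord) -> None:
        # Latest record per (user, purpose) wins, so withdrawal overwrites.
        self._records[(rec.user_id, rec.purpose)] = rec

    def allowed(self, user_id: str, purpose: str) -> bool:
        rec = self._records.get((user_id, purpose))
        return rec is not None and rec.granted

ledger = ConsentLedger()
ledger.record(ConsentRecord("u1", "feed_personalization", granted=True))
print(ledger.allowed("u1", "feed_personalization"))  # True
print(ledger.allowed("u1", "ad_profiling"))          # False: never granted
```

The design choice is deny-by-default: a purpose never explicitly granted, such as ad profiling built on language choice, is treated as refused rather than covered by blanket terms.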

Section 9 — DPDP Children's Data 🔴

Moj's short-video platform attracts a significant number of minor users. Under DPDP Section 9:

  • No verified parental consent
  • Behavioral monitoring through feed algorithms
  • Facial filter data from minors
  • No time limits or child safety provisions
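A minimal sketch of what a Section 9 gate could look like, assuming a hypothetical service layer (function names are illustrative, and this is not legal advice). Section 9 requires verifiable parental consent before processing a child's data and separately bars tracking and behavioural monitoring of children:

```python
CHILD_AGE_LIMIT = 18  # DPDP defines a child as an individual below 18

def may_create_account(age: int, parental_consent_verified: bool) -> bool:
    """Account creation: adults pass; children need verified parental consent."""
    return age >= CHILD_AGE_LIMIT or parental_consent_verified

def may_run_behavioural_feed(age: int) -> bool:
    """Behavioural monitoring of children is barred outright under Section 9,
    so no consent flag can switch it on for a minor."""
    return age >= CHILD_AGE_LIMIT

print(may_create_account(15, parental_consent_verified=True))   # True
print(may_run_behavioural_feed(15))                             # False
```

The key point the sketch encodes: parental consent can unlock the account, but it cannot unlock algorithmic behavioural monitoring for a minor.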

Section 11 — Rights of Data Principal 🔴

  • Can users delete cultural/religious profiling inferences?
  • Video content deletion — are copies retained?
  • No data portability
  • No nomination rights
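One way to make erasure answer the first two questions, covering inferred profiles as well as uploaded content and surfacing retained copies, can be sketched as follows. The stores are hypothetical in-memory stand-ins for illustration only:

```python
def erase_data_principal(user_id, content_store, inference_store, copy_index):
    """Erase both raw content and derived inferences for one user, and
    return any tracked copies (backups, caches) still awaiting purge."""
    erased = {
        "content_items": len(content_store.pop(user_id, [])),
        "inference_keys": len(inference_store.pop(user_id, {})),
    }
    # Copies must be indexed somewhere, or deletion can never propagate:
    pending_copies = list(copy_index.get(user_id, []))
    return erased, pending_copies

content = {"u1": ["post1", "video1"]}
inferences = {"u1": {"interest:devotional": 0.9, "lang": "te"}}
copies = {"u1": ["cdn-cache-7", "backup-2026-02"]}

erased, pending = erase_data_principal("u1", content, inferences, copies)
print(erased)    # {'content_items': 2, 'inference_keys': 2}
print(pending)   # ['cdn-cache-7', 'backup-2026-02']
```

Deleting the video row while leaving the inferred "devotional content" interest category in place would satisfy neither the letter nor the spirit of an erasure request.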

Risk Assessment

Category                       Risk Level   Potential Impact
Cultural/religious profiling   Critical     Content engagement reveals protected characteristics
Facial data (Moj)              Critical     Video filters process biometric-adjacent data
Minor users                    Critical     Large young user base without Section 9 compliance
Content data                   High         User-generated content = personal information
Data retention                 High         Cultural profiles retained indefinitely

Recommendations

  1. Implement cultural data protections — do not use language or content engagement to build religious or political profiles for advertising
  2. Define Moj facial data handling — e.g. "Face filter processing: on-device only, never stored on servers"
  3. Deploy Section 9 compliance — age verification and parental consent for Moj's minor users
  4. Add content ownership clarity — state clearly who controls user-generated content data under DPDP
  5. Build cultural profile transparency — let users see and control cultural/religious interest categorizations
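The missing retention timelines flagged in the gaps above could be expressed as a simple per-category schedule. The day counts below are illustrative placeholders for discussion, not figures from the analysed policy and not legal advice:

```python
from datetime import date

RETENTION_DAYS = {
    "raw_video": 90,
    "engagement_logs": 180,
    "inferred_interest_profile": 30,
    "face_filter_geometry": 0,   # i.e. never retained server-side at all
}

def purge_due(category: str, stored_on: date, today: date) -> bool:
    """True once an item of this category has exceeded its retention window."""
    return (today - stored_on).days >= RETENTION_DAYS[category]

print(purge_due("raw_video", date(2026, 1, 1), date(2026, 2, 21)))
# False: 51 days stored, window is 90
print(purge_due("inferred_interest_profile", date(2026, 1, 1), date(2026, 2, 21)))
# True: 51 days exceeds the 30-day window for inferred profiles
```

Publishing a schedule like this, even with different numbers, would address the "cultural profiles retained indefinitely" risk directly.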

How Does Your Policy Compare?

πŸ” Run Your Free DPDP Audit β†’


Analysis conducted by DPDP Consulting, a Meridian Bridge Strategy initiative. For a comprehensive compliance roadmap, book a free consultation.

Fix these compliance gaps today.

Book 1:1 Consultation
📞 Free Consultation