The Essay
How India’s Education Policy Should Address the Ownership of AI-Generated Student Work
India's education policy must urgently clarify the pressing question of who owns AI-generated student work.
By Krishna Mohan
The rise of generative AI has brought us to an inflection point in higher education. A recent Harvard Business Review analysis noted that over 60 percent of college students globally have already experimented with tools like ChatGPT for assignments or idea generation. Meanwhile, an MIT Sloan study highlighted that professionals using generative AI were able to complete knowledge tasks 40 percent faster and with greater accuracy compared to peers.
If such productivity gains are reshaping the workplace, it is natural that students, as future knowledge workers, will embrace these tools as well. Yet, in India, where the National Education Policy (NEP) 2020 is championing critical thinking and experiential learning, a pressing question remains unresolved:
Who owns the output of student work when it is generated or co-created with AI?
This is not merely a legal or philosophical curiosity. The answer will shape assessment fairness, academic integrity, intellectual property rights and ultimately, how India prepares its next generation for the AI-augmented workforce.
Why Ownership Matters in an Indian Context
India’s education system has long been designed around examination-driven evaluation. From CBSE boards to university exams, the implicit assumption has always been that submitted work reflects the student’s original effort. AI now destabilises this assumption.
Consider three real-world scenarios I’ve observed in my work with academic institutions and corporate learning teams:
1. The Shortcut Risk
A student feeds an essay prompt into a generative AI system and turns in the output verbatim. If a grade is awarded, does it reward the student’s skill or the AI’s?
2. The Collaborative Augmentation
A student uses AI to brainstorm ideas, refine drafts or simulate different perspectives, but the final submission reflects their judgment and curation. In this case, shouldn’t ownership of the work remain with the student, much like citing research papers or software libraries?
3. The Commercial Extension
An engineering student builds an AI-assisted code module during coursework and later spins it into a startup product. Should the intellectual property belong fully to the student, or should the institution have a share if AI tools provided by the university were used?
Without a clear framework, universities and policymakers risk creating confusion and, worse, inequity. Wealthier students may access premium AI tools privately, while others depend on institution-provided access, blurring the fairness of evaluation.

Why We Need to Move From Policing AI to Cultivating Originality
The debate over copyright or disclosure, while important, is ultimately a distraction if our education system continues to reward reproducibility over originality. Generative AI thrives on well-posed, common problems because it has been trained on a vast diet of existing code, essays and ideas. But originality (the ability to frame a novel question, apply contextual judgment or integrate multiple perspectives) is where students must be pushed.
In India, this is where education policy needs to intervene. Instead of only asking, “Did the student use AI?”, the more transformative question is: “Did the assessment demand something only a thinking human could create?”
Consider two examples from software engineering education:
Problem Framing vs. Problem Solving
If an exam asks, “Write code to sort a list using merge sort,” ChatGPT or GitHub Copilot can deliver a textbook solution instantly. But if the assignment is reframed as, “Given data from a retailer’s POS system with inconsistent formats and missing fields, how would you design a data pipeline to ensure accuracy before applying any machine learning?”, originality suddenly becomes unavoidable. The student must demonstrate domain understanding, judgment about trade-offs and creativity in framing the problem. AI can assist, but it cannot substitute for solid reasoning.
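To make the contrast concrete, here is a minimal sketch, in Python with pandas, of one cleaning step such a reframed assignment might demand. The column names, formats and rules are hypothetical; the point is that the student, not the tool, has to decide what “accuracy” means for this particular data.

```python
# Illustrative sketch only: cleaning hypothetical POS data before any modelling.
# Assumes columns named "txn_date", "amount" and "store_id".
import pandas as pd

def clean_pos_data(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Inconsistent date formats: coerce unparseable values to NaT rather than guessing.
    df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce", dayfirst=True)
    # Amounts may arrive as strings with currency symbols or commas: strip and coerce.
    df["amount"] = pd.to_numeric(
        df["amount"].astype(str).str.replace(r"[₹,\s]", "", regex=True),
        errors="coerce",
    )
    # Missing store IDs cannot be imputed safely: flag the rows for manual review
    # instead of silently dropping revenue records.
    df["needs_review"] = (
        df["txn_date"].isna() | df["amount"].isna() | df["store_id"].isna()
    )
    return df
```

Every line above encodes a judgment call (coerce or reject, flag or drop) that the student must be able to defend; that defence, not the code itself, is what the assessment should grade.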
Design Trade-offs in Real-World Systems
Instead of asking, “Implement a linked list in C,” consider asking, “If you were designing a payment gateway for small businesses in India, how would you balance low latency with fraud detection checks? Illustrate with pseudo-code and explain trade-offs.” Here, the value isn’t in raw code (AI can produce that), but in the why behind each decision. Students must connect technical architecture with business and societal context, something no generative model can supply reliably.
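One plausible answer to that question, sketched here in Python rather than pseudo-code, is a tiered design: cheap rule-based checks run synchronously inside the latency budget, while riskier payments are deferred to slower, deeper screening. The thresholds, rules and review queue are hypothetical illustrations, not a reference implementation.

```python
# Illustrative sketch of a tiered fraud check: keep the latency-critical path cheap,
# route higher-risk payments to asynchronous, deeper screening. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class Payment:
    amount_inr: float
    merchant_id: str
    is_new_device: bool

def inline_risk_score(p: Payment) -> float:
    """Synchronous rules only: these must fit within the authorisation latency budget."""
    score = 0.0
    if p.amount_inr > 50_000:   # hypothetical high-value threshold for small merchants
        score += 0.4
    if p.is_new_device:
        score += 0.3
    return score

def authorise(p: Payment, review_queue: list) -> str:
    if inline_risk_score(p) < 0.5:
        return "approve"            # fast path: no extra latency for most payments
    review_queue.append(p)          # slow path: deeper checks run off the hot path
    return "hold_for_review"
```

The code is trivial; the defensible part is the decision to keep expensive checks off the synchronous path and to accept a review delay for a minority of payments. That reasoning is what an AI copilot cannot supply on the student’s behalf.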
These are not futuristic hypotheticals. In India’s tech corridors, from Bengaluru’s startup hubs to large IT campuses in Hyderabad, I’ve seen young engineers who lean too heavily on AI copilots falter when asked to design solutions even slightly beyond the template. The true differentiators are those who can step back, frame the original question, and justify their design choices with clarity: skills that no machine can automate.
This is the direction India’s education policy must take: cultivate originality, not reproducibility.

The Policy Vacuum in NEP 2020
NEP 2020 emphasises critical thinking, experiential learning and multidisciplinary education. Yet, the policy remains silent on AI-enabled authorship and ownership. While plagiarism is addressed indirectly through academic integrity guidelines, AI poses a different challenge: it generates original but machine-derived content.
Here lies the crux:
- If NEP envisions students as knowledge creators, then clarity on ownership of AI-augmented work is non-negotiable.
- Without it, we risk either over-policing AI usage (stifling innovation) or ignoring it entirely (eroding academic credibility).
The way forward is not to copy global precedents blindly, but to tailor a uniquely Indian approach: one that acknowledges AI, protects fairness and, most importantly, forces originality through assessment design.
Implementation Pathways for India
How should Indian policymakers translate these ideas into practice? Some pragmatic steps include:
1. Guideline Supplements to NEP 2020
The Ministry of Education can release an addendum clarifying AI usage policies, much like UGC circulars that interpret broader acts for universities.
2. AI Usage Declarations
Institutions could mandate a short disclosure section in assignments, where students indicate whether and how AI was used. This normalises transparency.
3. Assessment Redesign for Originality
Moving from rote essays and standard problem sets to viva voce, project-based evaluations and oral defences ensures students cannot simply submit AI outputs without understanding. Originality should be rewarded explicitly.
4. Ethics and AI Literacy Modules
Introduce a mandatory module on AI ethics, authorship and intellectual property as part of undergraduate foundation courses. Students must learn not just “how to use AI,” but “how to question AI.”
5. Equitable Access
Policies must ensure that AI resources provided by universities are accessible to all, preventing inequality between students who can afford premium AI tools and those who cannot.

From Compliance to Competence
It would be a mistake to frame this debate only in terms of compliance. For India to succeed, students must graduate with AI competence, knowing not just how to use AI, but when not to.
In my work with corporate AI adoption, I’ve observed a pattern: AI makes every professional their own CEO. Just as a chief executive is informed by multiple advisors and reports before making a decision, employees who use AI as a set of intelligent agents to inform them, while still taking the final call themselves, create the greatest value. Those who abdicate judgment entirely, on the other hand, risk losing their leadership in the process.
Our education system must mirror this balance. Instead of penalising students for AI usage, we should encourage reflective assignments: “Explain how AI influenced your answer and where you overrode its suggestions.” This forces students to demonstrate critical thinking rather than blind reliance.
To Conclude
AI in education is not a passing fad; it is the foundation of how future generations will think, learn, and create. For India, a nation of 250 million students and a rapidly growing AI economy, the stakes are enormous.
If we fail to address the ownership of AI-generated student work, we risk creating a generation either penalised unfairly for using the very tools shaping tomorrow’s workplace, or ill-prepared because we ignored the question altogether.
The solution is neither prohibition nor blind acceptance, but clarity and courage in policy: clarity on what constitutes authorship, ownership, and fairness; and courage to redesign learning so originality, not reproduction, is at the centre.
India has the opportunity to set a global benchmark here, crafting education policies that not only uphold academic integrity but also empower students to thrive as AI-augmented knowledge creators.
The Writer, Krishna Mohan, is a leading voice in Generative AI, specialising in AI engineering and building ethical, future-ready AI solutions.

