Rising Scholars

Authorship in the Age of AI: Credit, Disclosure, and the New Rules

By Paras Karmacharya | May 29, 2025 | Research writing | Research skills | Artificial Intelligence

31% of researchers admit to using AI to write papers.

The other 69% might be lying.

Academic writing is changing—fast.

What was once a slow, meticulous process of outlining, drafting, revising, and citing is now often jump-started with a single prompt to a language model.

Need a background section?

A structured abstract?

A summary of related literature?

Tools like ChatGPT can deliver them in seconds.

But this speed comes with a deeper question:

“If a machine helps you write your paper, is it still your paper?”

 

As a researcher, you’ve likely already wrestled with this existential question (I definitely have). You may have used AI to clean up your grammar, brainstorm discussion points, or rephrase sentences for clarity.

That is not unethical.

But it is redefining authorship.

And if we are not thoughtful, we risk eroding the very foundations of academic integrity—without even realizing it.

Let’s break this down.

 

1. Authorship Is No Longer Just About Writing

Traditional authorship criteria (like ICMJE’s) emphasize four things:

  • Substantial intellectual contribution
  • Drafting or revising the work
  • Final approval
  • Accountability

AI cannot meet these criteria. It can generate text, but it cannot make an intellectual contribution it understands, approve a final version, or be held accountable.

So, despite writing large portions of text, it cannot be listed as an author.

COPE, WAME, and major journals like Nature and JAMA have reinforced this:

AI is a tool, not a mind. It cannot think, approve, or take responsibility.

Yet here’s the rub: Many authors now use AI to draft work, then lightly revise. They approve it. They submit it. But they never disclose it.

That’s a problem.

It’s not just a matter of guidelines.

It’s about transparency, integrity, and trust.

 

2. Invisible Help Is Still Help

Let’s be honest: If a human research assistant helped you write half your manuscript, you would either acknowledge them or list them as a co-author—depending on their role.

Why should AI be different?

Not disclosing AI use is akin to unacknowledged ghostwriting.

And in academic publishing, ghostwriting—especially by industry sponsors—is widely considered unethical.

If you used ChatGPT to:

  • Draft your introduction
  • Generate a results summary
  • Write limitations language

…and didn’t disclose it?

You’ve crossed into murky territory.

Many editors now view this as misrepresentation—even if the facts are accurate and the references are real.

 

3. AI Is Becoming a Quiet Co-Author

In a global Elsevier survey (2024), 31% of researchers reported using generative AI in writing. And that’s likely an underestimate.

Researchers already use AI to:

  • Polish grammar
  • Summarize studies
  • Suggest phrasing
  • Draft abstracts

What’s fascinating is that many of these AI-written sections are rated better than human-written ones.

One Nature study found ChatGPT-generated essays outscored student essays on quality and coherence.

In other words, AI is not just assisting.

It’s raising the bar.

If your writing isn’t clearly better than AI’s output, what’s your added value?

That’s the new question authors must ask.

4. The Ethics of Speed

Time is a precious commodity—especially for clinician-researchers juggling patient care and paper deadlines.

So yes, using AI can help you:

  • Write faster
  • Meet deadlines
  • Submit more papers

But speed ≠ credibility. 

The same tools that save time can:

  • Invent references
  • Omit nuance
  • Generate plausible but incorrect claims

In one study, ChatGPT-generated abstracts passed plagiarism checks with 100% originality—but included made-up citations. Human reviewers misclassified 32% of them as genuine.

Let that sink in.

AI doesn’t plagiarize in the traditional sense.

But it fabricates with confidence.

So as researchers, the burden is on us to verify every output.

 

5. Writing with AI Demands New Literacies

We now need to teach (and model) a new kind of academic literacy:

  • Prompt engineering: How to ask AI the right questions
  • Critical editing: How to reshape raw AI output into meaningful prose
  • Fact verification: How to spot hallucinated claims or fake references
  • Ethical disclosure: How to transparently report AI’s role

It’s not enough to write.

You need to know what was generated, what was curated, and what was created.

Because that’s where authorship lives.

 

6. Rethinking Mentorship and Writing Instruction

Academic writing instruction hasn’t caught up.

Most trainees are still taught the traditional 5-step writing process: outline, draft, revise, cite, submit.

But that assumes they’re doing all the writing themselves.

In reality, many students now start with:

“Write an introduction about X using recent references.”

We need new pedagogical strategies that:

  • Encourage AI use as a starting point, not a crutch
  • Teach comparison exercises (e.g., “Write this paragraph yourself, then critique ChatGPT’s version”)
  • Emphasize original thought over polish
  • Normalize disclosure, not secrecy

If we treat AI like a calculator in math—useful but not a substitute for conceptual understanding—we can build stronger writers, not just faster ones.

 

7. In a World Where Everyone Has AI, What’s Your Edge?

Here’s the big question researchers need to ask:

What makes your writing valuable when everyone has access to the same AI tools?

 

The answer isn’t style. AI can mimic that.

The answer is:

  • Your lived clinical or field-specific research experience
  • Your judgment about what matters
  • Your ability to synthesize data and propose next steps
  • Your ethical responsibility to verify, correct, and own your claims

In short: Your unique perspective and discernment are the moat.

 

Final Thoughts: Authorship Isn’t Dead. It’s Evolving.

AI isn’t the death of academic writing.

But it is the end of writing as a solitary act.

In this new landscape:

  • Authorship = Curation + Verification + Judgment
  • Transparency = Trust
  • Prompting = Power
  • And writing = Collaboration—with machines, mentors, and co-authors alike

The researchers who thrive won’t be the ones who write the fastest.

They’ll be the ones who write with clarity, ownership, and ethical rigor—even when AI is part of the process.

So yes, AI can help you write your next paper.

But only you can make it yours.

(And yes, I used AI to write this.)


P.S. I am an NIH-funded physician-scientist, an assistant professor of medicine, and the director of the psoriatic arthritis and spondyloarthritis clinical and research program at Vanderbilt University Medical Center, Nashville, TN, US. I am also the founder of Rising Researcher Academy, where I share Researcher-First, Minimalistic, Ethical AI + Timeless, Proven Clinical Research Systems.

Join my inner circle of 4000+ researchers for exclusive, actionable advice you won’t find anywhere else HERE: https://risingresearcheracademy.com/newsletter/

BONUS: When you subscribe, you instantly unlock my Research Idea GPT and Manuscript Outline Blueprint.

You can also follow me on LinkedIn HERE.

 

Thumbnail image: created by ChatGPT (GPT-4o) on May 4, 2025.

First image: Philip Oroni on Unsplash 
