Study Guide: Alex for Grant Writers

Your reference for applying Alex to NSF/NIH proposals, funding narratives, budget justifications, compliance documentation, and review preparation: ready-to-run prompts built around the hard parts of winning research funding, not generic writing advice.


What This Guide Is Not

This is not a habit formation guide (see Self-Study Guide for that). This is a domain use-case library — the specific ways Alex supports professional grant writing across federal and private funding agencies.


Where to Practice These Prompts

Every prompt in this guide works with any AI assistant — ChatGPT, Claude, GitHub Copilot, Gemini, or whatever tool you prefer. The prompts are the skill; the tool is just where you type them. Pick the one you’re comfortable with and start today.

For an integrated experience, the Alex VS Code extension (free) was purpose-built for this workshop. It understands grant writing context, lets you save effective prompts with /saveinsight, and brings your study guide and practice exercises into one workspace. VS Code is a free editor that takes minutes to set up, even if you’ve never used it before.

You don’t need a specific tool to benefit. You need the habit of reaching for AI when the work is genuinely hard — not just when it’s repetitive.


Core Principle for Grant Writers

Grant writing is persuasion constrained by evidence. The hardest part is not following the formatting guidelines — it is constructing a narrative that convinces reviewers your work is important, your approach is sound, and your team can execute. The failure mode is writing a technically accurate proposal that no reviewer champions, because it never answers the question reviewers actually ask: “Why should we fund this instead of the other 200 proposals?”

Your primary discipline with Alex: use it to pressure-test your narrative logic, anticipate reviewer objections, and ensure every section of the proposal answers “so what?” — not just “what.”

Important: Always verify specific agency requirements, page limits, formatting rules, and review criteria against the current solicitation. Funding agency guidelines change frequently.


The Seven Use Cases

1. Specific Aims and Research Narrative

The grant writer’s narrative challenge: The Specific Aims page is the most important page of a federal research grant. Reviewers form their opinion in the first paragraph. A technically sound proposal with a weak narrative loses to a well-told story with solid methods. The aims page must establish significance, identify the gap, position your approach as the logical next step, and make the reviewers care — all in one page.

Prompt pattern:

I am writing a [grant type: R01, R21, NSF CAREER, foundation letter of intent] for [funder].
Research topic: [what I study and why it matters].
Gap in current knowledge: [what is not known and why it matters].
My approach: [what I propose to do — methods, not just topic].
Preliminary data: [what I have that makes this credible].
Long-term vision: [what this enables beyond the grant period].

Help me:
1. Draft a Specific Aims page that hooks the reviewer in the opening paragraph
2. Ensure each aim is independent (failure of one does not doom the others)
3. Articulate the significance in terms a non-specialist reviewer can grasp
4. Identify the "so what" gap — where the proposal assumes the reader cares without earning it

Follow-up prompts:

A colleague read my aims page and said it was "technically fine but not exciting." How do I add urgency without overpromising?
Read this as a skeptical reviewer who works in a related but different subfield. What questions would you raise in the study section?

Try this now: You are writing an R01 for NIH NIGMS studying a novel antibiotic resistance mechanism in Gram-negative bacteria. Your preliminary data shows a previously undescribed efflux pump variant in clinical E. coli isolates, and the RFA emphasizes innovative approaches to the AMR crisis. Paste your aims and preliminary data summary into the narrative prompt. The response will help you find the “so what” framing that connects your molecular mechanism to the clinical urgency reviewers care about.


2. Budget Justification

The grant writer’s budget challenge: A budget that looks inflated loses credibility. A budget that looks too lean signals naivety about what the work actually costs. The budget justification is where reviewers assess whether you understand what it takes to execute your plan — and whether you are being honest about it.

Prompt pattern:

I need to write a budget justification for a [amount] [grant type] over [duration].
Personnel: [PI effort, co-Is, postdocs, students, technicians — with FTE and roles].
Equipment: [major items over $5,000].
Supplies: [categories and estimated costs].
Travel: [conferences, field work, collaboration visits].
Other: [subawards, participant costs, publication fees].
Agency: [funder — for specific cost policies].

Help me:
1. Write a justification that ties every line item to specific aims (not "general support")
2. Ensure personnel effort matches the scope of work (reviewers check this)
3. Flag items that commonly trigger reviewer concern (too much travel, unclear equipment need)
4. Verify the budget aligns with agency cost principles and limits
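Before writing the justification, it can help to sanity-check the draft numbers the same way a reviewer skims them: what share of direct costs each category takes, and whether anything stands out. The sketch below is illustrative only; the dollar amounts, category names, and flag thresholds are assumptions for the example, not agency rules, so always check the solicitation's actual cost policies.

```python
# Illustrative sanity check on a draft budget before writing the justification.
# All amounts and thresholds below are example assumptions, not agency policy.

budget = {
    "personnel": 182_000,
    "equipment": 24_000,
    "supplies": 18_000,
    "travel": 15_000,
    "other": 14_000,
}

total = sum(budget.values())
shares = {cat: amt / total for cat, amt in budget.items()}

flags = []
if shares["travel"] > 0.05:      # illustrative threshold, not a rule
    flags.append("travel exceeds 5% of direct costs -- justify each trip")
if shares["personnel"] < 0.50:   # research budgets are usually personnel-heavy
    flags.append("personnel under 50% -- reviewers may question the effort plan")

print(f"total direct costs: ${total:,}")
for cat, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"  {cat:<10} {share:6.1%}")
for f in flags:
    print("FLAG:", f)
```

The point is not the script; it is the habit of looking at your budget as proportions before a reviewer does.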

3. Broader Impacts and Dissemination

The grant writer’s impacts challenge: Broader impacts are not an afterthought — at NSF, they carry equal weight to intellectual merit. The failure mode is listing activities (mentoring, outreach, workshops) without connecting them to the research or explaining why they are effective. Reviewers have seen “I will mentor undergraduates” in every proposal. What makes yours credible?

Prompt pattern:

I need to write a broader impacts section for [grant type, funder].
Research area: [domain].
Target communities: [who benefits beyond the immediate research community].
Existing impact activities: [what I already do — teaching, mentoring, outreach, industry partnerships].
Institutional support: [what resources or programs my institution provides].
Constraints: [budget for BI activities, geographic limitations, time].

Help me:
1. Connect impact activities to the research — not bolted on, but integrated
2. Go beyond the generic list: what specific, measurable outcomes will these activities produce?
3. Identify partnerships or institutional programs that add credibility
4. Anticipate the reviewer who thinks "this sounds like a boilerplate BI section" — what makes mine authentic?

4. Review Criteria Alignment and Self-Assessment

The grant writer’s alignment challenge: Every funder publishes review criteria. Most applicants read them once and write what they want to write. The proposals that score well are the ones written to the criteria — where every section explicitly addresses what reviewers are asked to evaluate.

Prompt pattern:

Here are the review criteria for [grant program]:
[Paste the review criteria or scoring rubric from the solicitation]

And here is my draft proposal:
[Paste the relevant section]

Help me:
1. Map each review criterion to where my proposal addresses it (or does not)
2. Identify criteria that are addressed weakly or only implicitly
3. Suggest specific revisions to strengthen alignment with each criterion
4. Rate my proposal on each criterion as a reviewer would: strong / moderate / weak — with reasoning

Follow-up prompts:

What would a reviewer write in the "weaknesses" section of their review? Give me the honest critique.
If I had to strengthen one section to move this proposal from "good" to "must fund," which section and what change?

5. Resubmission Strategy

The grant writer’s resubmission challenge: Most funded grants are resubmissions. The art of the resubmission is not just fixing what reviewers complained about — it is demonstrating responsiveness while maintaining the coherence of the original vision. Over-responding (changing everything reviewers mentioned) can weaken the proposal as much as under-responding (ignoring valid critiques).

Prompt pattern:

My proposal was not funded. Here are the reviewer comments:
[Paste summary statement or reviewer critiques]

My original proposal: [key points or paste relevant sections].
New preliminary data since submission: [what I have now that I did not have before].
Changes in the field: [anything that affects the significance or approach].

Help me:
1. Categorize reviewer concerns: valid critique I must address / misunderstanding I must clarify / difference of opinion I must acknowledge
2. Design the response strategy — what to change, what to clarify, what to stand firm on
3. Draft the introduction to the resubmission that demonstrates responsiveness without being defensive
4. Identify how to strengthen the proposal beyond just responding to reviewers — what would make this proposal clearly fundable?

6. Compliance and Formatting

The grant writer’s compliance challenge: Administrative rejection for formatting violations is the most preventable failure in grant writing. The proposal with margins half an inch narrower than required, the wrong font size in a figure caption, or a missing required section gets desk-rejected before a reviewer ever sees it.

Prompt pattern:

I am finalizing a submission to [funder, program].
Solicitation requirements: [paste formatting rules — page limits, margins, font, required sections, supplementary materials].
My current draft status: [what sections are complete, what is pending].
Submission deadline: [date].

Help me:
1. Create a compliance checklist from the solicitation requirements
2. Identify the commonly missed requirements for this program (agency-specific gotchas)
3. Build a submission timeline working backward from the deadline
4. Flag sections where my draft may not comply (based on the rules you can see)
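"Working backward from the deadline" is just date arithmetic: subtract each milestone's lead time from the submission date. The sketch below shows the idea; the milestone names, lead times, and the example due date are assumptions for illustration, so replace them with your own institution's internal deadlines (sponsored-programs offices often require the final budget days before the agency deadline).

```python
# Minimal sketch: build a submission timeline backward from the deadline.
# Milestones, lead times, and the example date are illustrative assumptions.
from datetime import date, timedelta

deadline = date(2025, 2, 5)  # example due date -- use your solicitation's

# days before the deadline each milestone should be complete (illustrative)
lead_times = {
    "aims page locked": 35,
    "budget finalized with grants office": 21,
    "full draft assembled": 14,
    "internal compliance review complete": 7,
}

for milestone, days in sorted(lead_times.items(), key=lambda kv: -kv[1]):
    print(f"{(deadline - timedelta(days=days)).isoformat()}  {milestone}")
print(f"{deadline.isoformat()}  SUBMIT")
```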

7. Collaborative Grant Writing

The grant writer’s collaboration challenge: Multi-PI grants and center grants are coordination nightmares. Each collaborator writes their section in isolation, voice and formatting drift, and the assembled proposal reads like what it is: five people who each wrote their part. The proposals that win read as one coherent document with one narrative arc.

Prompt pattern:

I am coordinating a [multi-PI / center / collaborative] grant with [number] investigators.
My role: [PI / contact PI / project manager / grant coordinator].
Collaborators and their contributions: [who is writing what].
Integration challenges: [different writing styles, inconsistent terminology, overlapping scope, missing connections].
Timeline: [when sections are due, when assembly happens, when internal review occurs].

Help me:
1. Create the integration checklist: terminology, acronyms, formatting, cross-references, tone
2. Draft the coordination document — what each collaborator needs to know about the overall narrative
3. Identify overlap or gaps between collaborator sections
4. Design the internal review process that catches problems before submission

What Great Looks Like

After consistent use, the pattern becomes clear:

The grant writer who will thrive with AI assistance is not the one who generates draft text fastest. It is the one who uses AI to strengthen their narrative logic, anticipate reviewer objections, and submit proposals that reviewers want to fund.


Your AI toolkit: These prompts work in ChatGPT, Claude, Copilot, Gemini — and in the Alex VS Code extension, which was designed around them. Start with whatever you have. The skill transfers across all of them.

Your First Week Back: Practice Plan

Day 1: Run the Review Criteria Alignment pattern on a proposal you are working on (25 min)
Day 2: Revise your Specific Aims page using the narrative pattern (30 min)
Day 3: Strengthen your Broader Impacts section with the integration pattern (20 min)
Day 4: Use the Budget Justification pattern to audit your budget narrative (20 min)
Day 5: Save three reusable prompt patterns with /saveinsight (10 min)

Month 2–3: Advanced Applications

Reviewer Feedback Archive

Capture patterns from reviews to improve future proposals:

/saveinsight title="Review pattern: [critique type]" insight="Reviewer concern: [what they said]. Root cause: [why our proposal triggered this]. Fix: [how we addressed it]. Prevention: [how to avoid it in future proposals]." tags="grant-writing,review,pattern"

Successful Narrative Patterns

Track narrative approaches that scored well:

/saveinsight title="Narrative: [proposal/section]" insight="Approach: [how we structured the argument]. Key move: [the specific choice that makes the narrative work]. Score: [how it was rated]. Reusable element: [what can be applied to other proposals]." tags="grant-writing,narrative"

Continue your practice: Self-Study Guide — the 30/60/90-day habit guide.

Skills Alex brings to this discipline: ai-writing-avoidance, research-first-development, documentation-quality-assurance, bootstrap-learning.
Completed this study guide?

Show the world you've mastered using AI in grant writing. Add your certificate to LinkedIn.

📚 Want to go deeper?

Alex was a co-author of two books — a documentary biography and a work of fiction. Both explore human-AI collaboration from angles the workshop only touches.
