
    lyndonkl/scientific-manuscript-review
    Research
    17
    2 installs


    About

    Use when reviewing or editing research manuscripts, journal articles, reviews, or perspectives...

    SKILL.md

    Scientific Manuscript Review

    Table of Contents

    • Core Principles
    • Workflow
    • Section-by-Section Review
    • Language Guidelines
    • Guardrails
    • Quick Reference

    Related skills: Grant proposals → grant-proposal-assistant | Recommendation letters → academic-letter-architect | Emails → scientific-email-polishing

    Core Principles

    Seven foundational beliefs guiding manuscript review:

    1. Clarity over cleverness: Scientific clarity is more important than stylistic elegance
    2. Narrative shapes comprehension: Structure and story arc determine reader understanding
    3. Audience dictates tone: Expert vs. general audience requires different depth and framing
    4. Format signals credibility: Professional formatting reflects scientific rigor
    5. Claims require evidence: Strong assertions need strong data and appropriate hedging
    6. Each section has a job: Introduction sells the problem, Results show the data, Discussion interprets
    7. Constraints shape structure: Word limits and journal guidelines determine emphasis

    Workflow

    Copy this checklist and track your progress:

    Manuscript Review Progress:
    - [ ] Step 1: Identify manuscript type and extract core message
    - [ ] Step 2: Structural pass - map and evaluate overall organization
    - [ ] Step 3: Introduction review - gap statement, focus, hypothesis
    - [ ] Step 4: Results review - question, approach, finding, interpretation
    - [ ] Step 5: Discussion review - synthesis, context, limitations
    - [ ] Step 6: Scientific clarity check - claims, controls, hedging
    - [ ] Step 7: Language polish - terminology, voice, jargon
    - [ ] Step 8: Formatting check - journal compliance
    

    Step 1: Identify Manuscript Type and Core Message

    Determine document type (research article, review, perspective, short communication). Extract the ONE finding or message readers must remember. Ask: "If readers remember only one thing, what should it be?" See resources/methodology.md for extraction techniques.

    Step 2: Structural Pass

    Map overall organization against standard IMRaD (Introduction, Methods, Results, Discussion) or review structure. Check logical sequencing - does each section flow into the next? Identify unclear transitions or missing context. See resources/methodology.md for structure evaluation.

    Step 3: Introduction Review

    Evaluate using the Introduction Arc: Broad context → Narrow focus → Knowledge gap → Hypothesis/Objective. Check that gap statement is explicit and compelling. Verify ending with clear hypothesis or objective. See resources/template.md for template.

    Step 4: Results Review

    For each figure/table/experiment: Question addressed? → Approach used? → Key finding (with statistics)? → Interpretation (what it means)? Flag data-dump writing that lacks interpretation. Ensure findings build toward core message. See resources/template.md for results structure.

    Step 5: Discussion Review

    Verify structure: Revisit hypothesis → Interpret findings in field context → Place in broader literature → Acknowledge limitations → Suggest future directions. Check for overclaiming (speculation presented as fact). Ensure clear separation of data interpretation vs. speculation. See resources/methodology.md for discussion framework.

    Step 6: Scientific Clarity Check

    Run the clarity checklist: Claims supported by data? Quantitative details present (statistics, n values)? Controls adequately described? Interpretations appropriately hedged? Mechanistic explanations where needed? See resources/template.md for full checklist.

    Step 7: Language Polish

    Ensure terminology consistency throughout. Remove or define jargon on first use. Prefer active voice when it aids clarity. Standardize abbreviations. Check for hedging language ("suggests" vs "proves"). See resources/methodology.md for specific guidance.

    Step 8: Formatting Check

    Verify compliance with target journal guidelines (word limits, reference format, figure requirements). Check section headings match journal requirements. Ensure abstract follows structured/unstructured requirement. Validate using resources/evaluators/rubric_scientific_manuscript.json. Minimum standard: Average score ≥ 3.5.
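The rubric gate in Step 8 can be checked mechanically. A minimal sketch, assuming the rubric JSON holds a "criteria" list whose entries carry a numeric "score" field; the actual schema of resources/evaluators/rubric_scientific_manuscript.json may differ:

```python
def rubric_average(rubric):
    """Mean of all criterion scores in a loaded rubric dict.

    Assumes each entry in rubric["criteria"] has a numeric "score" field;
    the real rubric schema may differ.
    """
    scores = [c["score"] for c in rubric["criteria"]]
    return sum(scores) / len(scores)

def passes_minimum(rubric, threshold=3.5):
    """True when the average score meets the minimum standard (>= 3.5)."""
    return rubric_average(rubric) >= threshold
```

Load the rubric file with `json.load` and pass the resulting dict; anything averaging below 3.5 fails the minimum standard.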

    Section-by-Section Review

    Introduction Structure

    Goal: Convince readers the problem matters and your approach is sound

    The Funnel Structure:

    [Broad context - establish field importance, 1-2 sentences]
            ↓
    [Narrow to specific area - what's been done]
            ↓
    [Knowledge gap - what's missing, why it matters]
            ↓
    [Your hypothesis/objective - what you will address]
    

    Common problems:

    • Gap statement buried or implicit (make it explicit: "However, X remains unknown")
    • Too broad opening (readers don't need history of the universe)
    • No clear hypothesis at end (readers don't know what to expect)
    • Overlong literature review (move details to Discussion)

    Results Structure

    Goal: Present data clearly with interpretation, not just numbers

    Per-paragraph/figure structure:

    [Question this experiment addresses]
    [Approach/method used]
    [Key finding - with quantification]
    [Brief interpretation - what this means]
    

    Common problems:

    • Data dump (listing results without interpretation)
    • Missing statistics (p-values, n values, confidence intervals)
    • Vague descriptions ("we found differences" vs. "we found a 3-fold increase")
    • Figures not referenced in logical order
    • Key findings buried in text (highlight important results)
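The "missing statistics" and "vague descriptions" problems above lend themselves to a crude automated first pass. A sketch; the vague-word list and statistics regex are illustrative starting points, not a substitute for reading the section:

```python
import re

# Illustrative, not exhaustive: quantifiers that should be replaced by numbers.
VAGUE = {"some", "many", "several", "various", "numerous"}
# Rough patterns for p-values, n values, and confidence intervals.
STATS = re.compile(r"(p\s*[<=>]\s*0?\.\d+|n\s*=\s*\d+|95%\s*CI)", re.IGNORECASE)

def flag_results_sentence(sentence):
    """Return warnings for a Results sentence: vague quantifiers, or a
    comparison verb with no statistic attached."""
    warnings = []
    words = {w.strip(".,;()").lower() for w in sentence.split()}
    if words & VAGUE:
        warnings.append("vague quantifier - give a number instead")
    if ("increased" in words or "decreased" in words) and not STATS.search(sentence):
        warnings.append("comparison without statistics (p-value, n, CI)")
    return warnings
```

A sentence like "Expression increased 3-fold (p < 0.01, n = 6)." passes clean, while "We saw some increase in many samples." is flagged.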

    Discussion Structure

    Goal: Interpret findings and place in broader context

    Standard flow:

    [Restate main finding and hypothesis status]
            ↓
    [Interpret key results in field context]
            ↓
    [Compare to prior literature - agreements/disagreements]
            ↓
    [Mechanistic implications (if applicable)]
            ↓
    [Limitations - honest acknowledgment]
            ↓
    [Future directions - what comes next]
            ↓
    [Concluding statement - big picture significance]
    

    Common problems:

    • Overclaiming (data doesn't support conclusions)
    • Repeating Results section (discuss, don't recapitulate)
    • Missing limitations (reviewers will note them anyway)
    • Speculation unmarked (clearly label "we speculate that...")
    • No connection to field (discuss in isolation)

    Language Guidelines

    Active vs. Passive Voice:

    • Use active for clarity: "We measured" not "Measurements were made"
    • Use passive when agent is obvious or unimportant: "Samples were incubated at 37°C"
    • Avoid dangling modifiers: Not "Having analyzed the data, the conclusion was..." but "Having analyzed the data, we concluded..."

    Hedging Language:

    • Strong data: "demonstrates", "shows", "establishes"
    • Moderate confidence: "suggests", "indicates", "supports"
    • Speculation: "may", "might", "could potentially"
    • Match hedge strength to evidence strength
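The tiers above map naturally to a lookup that can flag a mismatch between verb strength and evidence. A sketch: the three tiers mirror the bullets above, and the "overclaim" set is an assumed extra category for words stronger than data usually allows:

```python
# Claim verbs grouped by the evidence tier they imply (per the guideline above).
HEDGE_TIERS = {
    "strong": {"demonstrates", "shows", "establishes"},
    "moderate": {"suggests", "indicates", "supports"},
    "speculative": {"may", "might", "could"},
}
# Assumed extra tier: verbs that almost always overstate the evidence.
OVERCLAIM = {"proves"}

def hedge_tier(verb):
    """Classify a claim verb by implied evidence strength."""
    v = verb.lower()
    if v in OVERCLAIM:
        return "overclaim"
    for tier, verbs in HEDGE_TIERS.items():
        if v in verbs:
            return tier
    return "unclassified"
```

During the Step 6 clarity check, any "overclaim" hit should send you back to the data to ask whether a "strong" or "moderate" verb is actually earned.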

    Jargon Management:

    • Define on first use: "polymerase chain reaction (PCR)"
    • Avoid unnecessary jargon when plain language works
    • Field-standard terms don't need definition (DNA, protein, cell)
    • Reader-appropriate: more definition for broad audience journals

    Terminology Consistency:

    • Pick one term and stick with it (don't alternate between "subjects", "participants", "patients")
    • Create terminology table for complex manuscripts
    • Check abbreviations defined before use
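The "abbreviations defined before use" check can be scripted as a heuristic: treat any short all-caps token as an abbreviation and a parenthesized occurrence as its definition. A sketch; the field-standard whitelist is an assumption to be extended per field:

```python
import re

# Terms the guideline says need no definition; extend per field.
FIELD_STANDARD = {"DNA", "RNA", "ATP"}

def undefined_abbreviations(text):
    """Flag 2-5 letter all-caps tokens used before (or without) a
    "full name (ABBR)" style definition. Heuristic first pass only."""
    issues = set()
    for m in re.finditer(r"\b([A-Z]{2,5})\b", text):
        abbr = m.group(1)
        if abbr in FIELD_STANDARD:
            continue
        definition = text.find(f"({abbr})")
        is_definition = definition == m.start() - 1      # this occurrence IS the definition
        defined_earlier = definition != -1 and definition < m.start() - 1
        if not (is_definition or defined_earlier):
            issues.add(abbr)
    return sorted(issues)
```

So "polymerase chain reaction (PCR) was used. PCR products..." passes, while an abbreviation that first appears bare is flagged for a definition.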

    Guardrails

    Key requirements:

    1. Preserve author voice: Edit for clarity, not voice. Avoid inventing claims or changing meaning. Mark suggestions clearly when proposing new content.

    2. Claims match data: Every conclusion must be supported by presented results. Flag overclaiming immediately. Speculation must be labeled.

    3. Quantitative rigor: Statistics required for comparisons. N values for all experiments. Significance thresholds stated. Variability measures included.

    4. Logical flow: Each section should flow naturally to the next. Transitions explicit. Conclusions follow from premises.

    5. Appropriate hedging: Strong claims need strong evidence. Use hedging language proportional to certainty.

    6. Consistent terminology: Same concept = same term throughout. Abbreviations defined before use.

    Common pitfalls:

    • ❌ Overclaiming: "This proves X" when data only suggests
    • ❌ Missing context: Results without interpretation
    • ❌ Buried lede: Important finding hidden in paragraph
    • ❌ Inconsistent terms: Alternating between synonyms
    • ❌ Dense paragraphs: Walls of text without breaks
    • ❌ Vague descriptions: "Some increase" instead of "a 3-fold increase"

    Quick Reference

    Key resources:

    • resources/methodology.md: Detailed review methods, structural assessment, language guidelines
    • resources/template.md: Introduction arc, results paragraph, clarity checklist
    • resources/evaluators/rubric_scientific_manuscript.json: Quality scoring criteria

    Introduction checklist:

    • Broad context establishes importance
    • Narrows to specific problem
    • Gap statement explicit ("However, X remains unknown")
    • Ends with clear hypothesis or objective

    Results checklist:

    • Each experiment has question, approach, finding, interpretation
    • Statistics present (p-values, n, confidence intervals)
    • Quantitative descriptions (numbers, not "some/many")
    • Figures referenced in logical order
    • Key findings highlighted

    Discussion checklist:

    • Opens by revisiting hypothesis
    • Interprets (doesn't just repeat) results
    • Places in literature context
    • Acknowledges limitations
    • Suggests future directions
    • Speculation clearly labeled

    Typical review time:

    • Quick review (structure + major issues): 20-30 minutes
    • Standard review (full checklist): 45-60 minutes
    • Deep revision (rewriting sections): 2-3 hours

    Inputs required:

    • Manuscript draft (any stage)
    • Target journal (if known)
    • Specific concerns from author (if any)

    Outputs produced:

    • Edited manuscript with tracked changes
    • Commentary on major structural/logic changes
    • Summary of key improvements made