Sample AFWERX Open Topic Evaluation: What Reviewers Write About Your Proposal
Defense Grant Writers · March 14, 2026
AFWERX does not publish sample evaluations. When your Open Topic proposal is reviewed, you receive a selection or non-selection notification with minimal feedback. Most applicants never see what reviewers actually wrote about their submission.
We created a realistic sample evaluation to show you what the process looks like from the reviewer's side. This is a simulated document based on our experience writing hundreds of AFWERX proposals and our team's direct knowledge of the evaluation process. Every detail is fictional, but the format, scoring criteria, and types of comments reflect how actual evaluations work.
Download the sample AFWERX evaluation (PDF)
How AFWERX Open Topic Proposals Are Evaluated
AFWERX evaluates Open Topic proposals on three primary criteria, each scored on a 1-5 scale:
- Technical Merit and Feasibility: Is the technology sound? Is there evidence it works? Are the Phase I milestones achievable?
- Team and Company Qualifications: Does the team have relevant expertise? Prior SBIR track record? Domain knowledge?
- Potential for Impact and Commercialization: Is there a credible path to Air Force or Space Force adoption? Dual-use potential?
Proposals need a minimum composite score of 3.0 to be considered for award. In practice, most selected proposals score 3.5 or above.
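To make the arithmetic concrete, here is a minimal sketch of how a composite score could be computed from the three criterion ratings. AFWERX does not publish its weighting, so this assumes an unweighted average; the function names and the example numbers are illustrative only.

```python
# Hypothetical sketch of AFWERX-style composite scoring: three criteria,
# each rated on a 1-5 scale, averaged into a composite score.
# ASSUMPTION: equal weighting across criteria (not published by AFWERX).

def composite_score(technical: float, team: float, commercialization: float) -> float:
    """Average the three criterion scores, rounded to one decimal place."""
    for score in (technical, team, commercialization):
        if not 1 <= score <= 5:
            raise ValueError("criterion scores must be between 1 and 5")
    return round((technical + team + commercialization) / 3, 1)

def clears_floor(composite: float) -> bool:
    """3.0 is the stated minimum for award consideration."""
    return composite >= 3.0

score = composite_score(technical=4.5, team=4.0, commercialization=4.0)
print(score, clears_floor(score))  # 4.2 True
```

With ratings of 4.5, 4.0, and 4.0, the composite lands at 4.2, matching the sample evaluation discussed below, though the individual criterion scores here are invented for illustration.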
What Separates Selected from Non-Selected
In the sample evaluation, the proposal scored 4.2 overall. Here is what drove the strong scores and what held it back:
What worked: The proposal included specific, quantified results from a TRL 3 prototype (94% classification accuracy across 12 signal types). It identified specific military end-users by name (PEO C3T and AFRL/RY). The team had prior SBIR experience with a Phase II transition. These are concrete, verifiable claims that give reviewers confidence.
Where it lost points: The commercial market analysis was thin. Revenue projections for the non-military segment were not well supported. And critically, no STRATFI/TACFI program office champion had been identified, which weakens the proposal's follow-on funding story.
Key Takeaways for Your Next AFWERX Submission
- Lead with data, not claims. Reviewers scored technical merit highest when the proposal included specific, quantified results from prior work.
- Name your military customer. Identifying a specific program office or end-user unit is the single highest-impact thing you can do for the commercialization score.
- Do not neglect the commercial side. Even though AFWERX is a military program, the dual-use commercialization criterion carries real weight. Generic market sizing is not enough.
- Think about STRATFI/TACFI early. If you cannot identify a program office champion during Phase I, your follow-on funding pathway is weaker.
Download the full sample AFWERX evaluation (PDF)
For a deeper look at the differences between AFWERX tracks, see our guide: AFWERX Open Topic vs Specific Topic: Which Track Is Right for You?
Need Help With Your Defense Proposal?
Our writers have served on DoD review panels. They know what scores well and what gets triaged. Fixed pricing from $1,995.
Book Free Consultation
For NSF, NIH, and civilian agency SBIR/STTR proposals, visit sbirgrantwriters.com.