The Prayaas Methodology
Question paper QA, examination integrity, evaluator training and the diagnostic-tone discipline. The four pillars behind every Prayaas report.
Why this page exists
Most assessment platforms publish a marketing claim. We publish a methodology. The difference is what a parent, a principal, or an independent reviewer can verify when they ask the question that matters most: how exactly is this score produced?
This page documents, in full, the four pillars of the Prayaas methodology. It exists so that the diagnostic report a parent receives is supported by a published, defensible process - not by a reassuring brand voice. Every claim on the report can be traced back to a step on this page.
Four Pillars
The four disciplines behind every report.
Question Paper QA
Five-stage paper preparation workflow. Setter draft, technical review, blueprint compliance check, language pass, final approval by the Academic Lead.
Examination Integrity
Sealed paper transit, neutral test centres (no coaching institutes), trained invigilators, anonymised answer scripts. Audit trail at every step.
Evaluator Training & IRR
Every evaluator trained on the rubric and calibrated against gold-standard scripts. Inter-Rater Reliability threshold of 0.85 before live evaluation.
Diagnostic Tone
Reports are interpretive, not evaluative. Five tone principles guide every word: action-oriented, parent-comprehensible, non-judgemental, paired with strength, evidence-based.
Pillar 1
Question paper QA in five stages.
A Prayaas question paper goes through five distinct stages before a student ever sits it. Each stage has a named owner, a written checklist, and an explicit pass/fail criterion. Papers that fail any stage are returned to the previous owner; they do not silently merge into the live cycle.
Setter draft
A subject expert with classroom or examination experience drafts the paper to the latest CBSE blueprint, with the marking scheme and difficulty mix specified upfront.
Technical review
A second subject expert reviews each question for accuracy, ambiguity, and curriculum alignment. Questions that fail review are returned to the setter.
Blueprint compliance
The full paper is checked against the CBSE marks distribution, weightage by chapter, and difficulty mix. Out-of-range papers are reweighted.
Language pass
A copy editor reviews every question for clarity. Indian-English usage, no ambiguous instructions, no leading phrasing. Hindi-medium versions get a parallel pass.
Final approval
The Academic Lead signs off the paper. Signed papers go to sealed transit; unsigned drafts never leave the system.
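The five stages above form a simple gated pipeline: a paper only moves forward on a pass, and a failure sends it back one stage. A minimal sketch of that gating logic, with hypothetical stage and owner names (the real checklists and owners are whatever the workflow document specifies):

```python
# Illustrative sketch only: stage names and owner titles are assumptions,
# not the actual Prayaas workflow records.

STAGES = [
    ("setter_draft",         "Subject Expert (setter)"),
    ("technical_review",     "Second Subject Expert"),
    ("blueprint_compliance", "Blueprint Checker"),
    ("language_pass",        "Copy Editor"),
    ("final_approval",       "Academic Lead"),
]

def run_workflow(paper: dict, passes_stage, max_rounds: int = 100) -> list[str]:
    """Advance a paper through the stages in order.

    `passes_stage(paper, stage_name)` stands in for the written
    pass/fail checklist. A failed stage returns the paper to the
    previous owner; it never skips forward into the live cycle.
    """
    log, i, rounds = [], 0, 0
    while i < len(STAGES) and rounds < max_rounds:
        stage, owner = STAGES[i]
        if passes_stage(paper, stage):
            log.append(f"{stage}: passed ({owner})")
            i += 1
        else:
            log.append(f"{stage}: failed, returned to previous owner")
            i = max(i - 1, 0)  # rework, never silent merge
        rounds += 1
    return log
```

The key property the sketch preserves is the one the text insists on: there is no path into "final approval" that bypasses an earlier failed gate.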
Pillar 2
Examination integrity from setter to script.
A neutral test centre means nothing if the paper itself leaks before the cycle. Our examination integrity protocol covers the full chain from sealed paper transit through invigilation, anonymised answer scripts, and the audit trail behind every cycle.
The full protocol is published as a separate, dedicated page so that a sceptical principal or a journalist can read it in detail.
Read the Examination Integrity Protocol
Pillar 3
Evaluator training and Inter-Rater Reliability.
Two trained CBSE examiners independently mark every Prayaas script. A third examiner adjudicates any disagreement greater than 5 percent of the paper's total marks. Before any examiner evaluates a live script, they pass a calibration round against gold-standard scripts.
Inter-Rater Reliability (IRR) is the statistical measure of agreement between two independent evaluators marking the same script. Prayaas uses a Cohen's kappa threshold of 0.85 across the calibration set; evaluators who fall below the threshold are returned for additional training before being assigned live scripts.
Why this matters: a single examiner's judgement can vary by mood, fatigue, or context. Two independent examiners checked against a quantitative threshold remove that variance from the reported score.
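The two quantitative rules in this pillar, the 5-percent adjudication trigger and the 0.85 kappa gate, can be sketched directly. This is an illustrative sketch, not the production scoring code; function names are assumptions:

```python
# Hypothetical names; the two thresholds come from the methodology text.
ADJUDICATION_FRACTION = 0.05   # third examiner if marks differ by >5% of paper total
KAPPA_THRESHOLD = 0.85         # Cohen's kappa gate before live evaluation

def needs_adjudication(score_a: float, score_b: float, total_marks: float) -> bool:
    """True when two independent marks diverge by more than 5% of the total."""
    return abs(score_a - score_b) > ADJUDICATION_FRACTION * total_marks

def cohens_kappa(ratings_a: list[int], ratings_b: list[int]) -> float:
    """Cohen's kappa for two raters scoring the same calibration items."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement, from each rater's marginal distribution.
    expected = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected) if expected < 1 else 1.0

def cleared_for_live_scripts(kappa: float) -> bool:
    return kappa >= KAPPA_THRESHOLD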
Pillar 4
The diagnostic-tone discipline.
Five tone principles shape every line of every report. Each principle has a don't-say example and a do-say example. Every report is reviewed against this checklist before it is released.
Action-oriented
Do say
"Focus the next four weeks on Trigonometry chapters 3 and 4."
Don't say
"Performance in Trigonometry was below average."
Parent-comprehensible
Do say
"Your child correctly solved 6 of 10 questions on Quadratic Equations."
Don't say
"Mastery score on Algebra II topic 4.2 = 0.62 (sigma 0.12)."
Non-judgemental
Do say
"This is a chapter that needs more practice before the boards."
Don't say
"Your child is weak in this chapter."
Paired with a strength
Do say
"Strong on Mensuration; revise Coordinate Geometry."
Don't say
"Coordinate Geometry: poor performance."
Evidence-based
Do say
"Lost 8 marks across 3 long-form questions in section C."
Don't say
"Could improve significantly with more practice."
The Score
How the Board Readiness Score is computed.
The Board Readiness Score (BRS) is a 100-point composite metric. It combines three measurable dimensions of a student's mock board performance, each weighted according to how strongly it correlates with actual board outcomes.
The composite formula
BRS = (Mock Board Paper Score × 0.60)
+ (Chapter-Wise Mastery Distribution × 0.25)
+ (Cohort Percentile Placement × 0.15)
Each component is normalised to a 100-point scale before weighting. The result is a single, interpretable number between 0 and 100.
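The composite above can be expressed as a short function. A minimal sketch, assuming each component has already been normalised to a 0-100 scale as the text describes (names are illustrative, not the production code):

```python
# Weights from the published formula; component names are hypothetical.
WEIGHTS = {
    "mock_paper": 0.60,        # Mock Board Paper Score
    "chapter_mastery": 0.25,   # Chapter-Wise Mastery Distribution
    "cohort_percentile": 0.15, # Cohort Percentile Placement
}

def board_readiness_score(mock_paper: float,
                          chapter_mastery: float,
                          cohort_percentile: float) -> float:
    """Weighted composite of three 0-100 components; result is also 0-100."""
    components = {
        "mock_paper": mock_paper,
        "chapter_mastery": chapter_mastery,
        "cohort_percentile": cohort_percentile,
    }
    for name, value in components.items():
        if not 0 <= value <= 100:
            raise ValueError(f"{name} must be normalised to 0-100, got {value}")
    return round(sum(WEIGHTS[k] * components[k] for k in WEIGHTS), 1)
```

Because the weights sum to 1.0 and every input is bounded to 0-100, the composite is guaranteed to land in the same 0-100 range.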
What the BRS bands mean
90 - 100
Boards-Ready
Strong command across most chapters; refine timing and presentation.
75 - 89
Approaching Ready
On the right track; close the gap on 3 to 5 specific chapters.
60 - 74
Building Ready
Foundation present; needs structured chapter-wise revision.
Below 60
Foundational Work Needed
Concept gaps to address; consider extra time on first principles.
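The band boundaries above map to a simple threshold lookup. A hypothetical helper, shown only to make the cut-offs explicit:

```python
def brs_band(score: float) -> str:
    """Map a 0-100 BRS to its published readiness band."""
    if score >= 90:
        return "Boards-Ready"
    if score >= 75:
        return "Approaching Ready"
    if score >= 60:
        return "Building Ready"
    return "Foundational Work Needed"
```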
What we measure, and what we do not.
The BRS measures preparation, not potential. It reports what a student can demonstrate today, on a 3-hour board-style paper, across the chapters of the current syllabus. It is a snapshot, calibrated against a defined cohort, with a defined methodology.
The BRS does not measure intelligence, learning ability, future performance, character, or any other attribute beyond demonstrated current preparation. We do not predict a child's exact board score with certainty - no instrument can. We do not compare students against those from other boards or curricula. We do not score effort or engagement.
Honesty about the boundary of the measurement is part of the methodology. A diagnostic that overclaims is not a diagnostic.
Versioned, annually reviewed
This methodology page is versioned. The current version is v1.0 (April 2026), built on the lessons of Edition 1 (December 2024). After each cycle, the Academic Lead and Founder review the methodology against what the cycle revealed, and a new version is published with a changelog.
Why publish the version: a serious assessment organisation does not change its methodology silently. Parents, schools, and journalists deserve a record they can refer back to.
A diagnostic that cannot be audited is not a diagnostic.
The Methodology Promise
