
# Question Timing Analysis

## ⏱️ Current Per-Question Timing (Approximate)

Based on the `RandomizedWait` ranges, here's how long each question type takes:

### Question Answer Times

| Question Type   | Min Time | Max Time | Average | Notes                           |
|-----------------|----------|----------|---------|---------------------------------|
| Rating Scale    | 1s       | 4s       | ~2.5s   | Quick selection (1-5 stars)     |
| Multiple Choice | 2s       | 6s       | ~4s     | Reading options (A, B, C, D, E) |
| True/False      | 1s       | 3s       | ~2s     | Binary choice (True/False)      |
| Open Ended      | 5s       | 15s      | ~10s    | Typing text response            |
| Matrix          | 3s       | 8s       | ~5.5s   | Multiple cell selections        |
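The ranges above could be implemented with a small helper like this. This is a sketch with assumed names and values; the actual `RandomizedWait` implementation and configuration may differ:

```python
import random
import time

# Wait ranges in seconds, mirroring the table above.
# These keys and values are assumptions, not the actual
# RandomizedWait configuration.
ANSWER_WAIT_RANGES = {
    "rating_scale": (1, 4),
    "multiple_choice": (2, 6),
    "true_false": (1, 3),
    "open_ended": (5, 15),
    "matrix": (3, 8),
}

def draw_answer_wait(question_type: str) -> float:
    """Pick a random answer delay within the question type's range."""
    low, high = ANSWER_WAIT_RANGES[question_type]
    return random.uniform(low, high)

def randomized_wait(question_type: str) -> float:
    """Sleep for a randomized, human-like duration and return it."""
    delay = draw_answer_wait(question_type)
    time.sleep(delay)
    return delay
```

Separating the draw (`draw_answer_wait`) from the sleep makes the ranges easy to unit-test without actually waiting.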

### Navigation Times

| Action     | Min Time | Max Time | Average |
|------------|----------|----------|---------|
| Click Next | 1s       | 3s       | ~2s     |
| Page Load  | 1s       | 3s       | ~2s     |

### Total Time Per Question (Approximate)

**Formula:** Answer Time + Navigation Time

| Question Type   | Min Total | Max Total | Average Total |
|-----------------|-----------|-----------|---------------|
| Rating Scale    | 2s        | 7s        | ~4.5s         |
| Multiple Choice | 3s        | 9s        | ~6s           |
| True/False     | 2s        | 6s        | ~4s           |
| Open Ended      | 6s        | 18s       | ~12s          |
| Matrix          | 4s        | 11s       | ~7.5s         |

### Example: 108 Questions (Emotional Intelligence)

If all questions are Multiple Choice (the most common type):

- Average: 108 questions × 6s = 648s ≈ 10.8 minutes
- Min: 108 questions × 3s = 324s ≈ 5.4 minutes
- Max: 108 questions × 9s = 972s ≈ 16.2 minutes

If mixed question types:

- Average: ~8-12 minutes (depending on the mix)
- Min: ~5-7 minutes
- Max: ~15-20 minutes
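These estimates can be reproduced with a tiny calculator using the average totals from the table above:

```python
# Average total seconds per question (answer + navigation),
# taken from the "Total Time Per Question" table.
AVG_TOTAL_SECONDS = {
    "rating_scale": 4.5,
    "multiple_choice": 6.0,
    "true_false": 4.0,
    "open_ended": 12.0,
    "matrix": 7.5,
}

def estimate_minutes(question_counts: dict) -> float:
    """Estimate total run time in minutes for a given question mix."""
    total_seconds = sum(
        AVG_TOTAL_SECONDS[qtype] * count
        for qtype, count in question_counts.items()
    )
    return total_seconds / 60

# 108 all-multiple-choice questions: 108 * 6s = 648s = 10.8 minutes
print(estimate_minutes({"multiple_choice": 108}))  # 10.8
```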

## 📊 What We Currently Track

### In Load Test Results

**Tracked:**

- Total duration per student (entire flow)
- Total questions answered
- Average duration across students
- Success/failure rate

**NOT tracked:**

- Time per individual question
- Time breakdown by question type
- Question-level performance metrics

## 🔧 Adding Per-Question Timing (Optional)

If you want to track per-question timing, I can add:

1. Question-level timing in load test results
2. Breakdown by question type (average time per type)
3. Detailed metrics in the JSON report

Would you like me to add this?
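As a sketch, the collector behind those metrics could look like this. It is a hypothetical helper, not part of the current load test code; the names and report shape are assumptions:

```python
from collections import defaultdict

class QuestionTimer:
    """Collects per-question durations grouped by question type.

    Hypothetical collector for the proposed per-question metrics.
    """

    def __init__(self):
        self._samples = defaultdict(list)

    def record(self, question_type: str, duration_seconds: float) -> None:
        """Store one measured duration for a question type."""
        self._samples[question_type].append(duration_seconds)

    def report(self) -> dict:
        """Per-type summary suitable for inclusion in the JSON report."""
        return {
            qtype: {
                "count": len(times),
                "avg_s": round(sum(times) / len(times), 2),
                "min_s": min(times),
                "max_s": max(times),
            }
            for qtype, times in self._samples.items()
        }
```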


## 💡 Current Behavior

### How Questions Are Answered

```
# For each question:
1. Detect question (0.5-1s)
2. Answer question (1-15s, depending on type)
3. Wait for answer confirmation (0.5-1s)
4. Click Next (1-3s)
5. Wait for next question to load (1-3s)
```

Total: ~4-20s per question (depending on type)
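Those five steps can be simulated to sanity-check the per-question total. The ranges come from the list above; the function name is an assumption:

```python
import random

# (min, max) seconds for each step in the answer loop above.
STEP_RANGES = [
    (0.5, 1.0),   # 1. detect question
    (1.0, 15.0),  # 2. answer question (varies by type)
    (0.5, 1.0),   # 3. wait for answer confirmation
    (1.0, 3.0),   # 4. click Next
    (1.0, 3.0),   # 5. wait for next question to load
]

def simulate_question_total() -> float:
    """Draw one simulated per-question total from the step ranges."""
    return sum(random.uniform(low, high) for low, high in STEP_RANGES)
```

Summing the minimums gives 4s and summing the maximums gives 23s, so the ~4-20s figure above is a rounded typical range rather than a hard upper bound.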

### Realistic Timing

The automation uses randomized waits to simulate human behavior:

- Not too fast (doesn't look like a bot)
- Not too slow (efficient testing)
- Varies per question type (realistic)

## 📈 Backend Perspective

The backend tracks time per question when answers are submitted:

- The frontend sends `timeSpent` with each answer
- The backend stores it in the database
- It's available in backend analytics

But our automation doesn't currently track this; we just use realistic wait times.
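A submission carrying `timeSpent` might look roughly like this; every field name other than `timeSpent` is an assumption about the payload shape, not the actual API contract:

```python
import json
import time

start = time.monotonic()
# ... the automation answers the question here ...
payload = {
    "questionId": "q-001",  # hypothetical identifier
    "answer": "B",          # hypothetical selected option
    # Seconds spent on this question; the backend stores this value.
    "timeSpent": round(time.monotonic() - start, 2),
}
print(json.dumps(payload))
```

Using `time.monotonic()` rather than `time.time()` keeps the measured duration immune to system clock adjustments.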


## 🎯 Summary

**Current State:**

- We know approximate times per question type
- We track total duration per student
- We don't track individual question times

**Approximate Times:**

- Fast questions (True/False, Rating Scale): ~2-4s each
- Medium questions (Multiple Choice, Matrix): ~4-8s each
- Slow questions (Open Ended): ~6-18s each

**For 108 questions:**

- Average: ~8-12 minutes
- Range: ~5-20 minutes (depending on question types)

Would you like me to add detailed per-question timing tracking to the load test?