first working version

howard
2025-10-22 20:14:31 +08:00
parent c9767b830b
commit 8dc869634e
118 changed files with 22518 additions and 0 deletions

# Automatic Scoring Guide
## ✅ Automatic Scoring Implemented!
The system now automatically scores exams that contain **ONLY** single choice and true/false questions.
## How It Works
### Scoring Rules
**Automatic scoring is activated when:**
- Exam contains ONLY `single_choice` questions
- OR exam contains ONLY `true_false` questions
- OR exam contains a mix of ONLY `single_choice` AND `true_false` questions
**Automatic scoring is NOT activated when:**
- Exam contains `essay` questions
- Exam contains `code_simple` questions
- Exam contains `code_exercise` questions
- Exam contains any mix that includes a non-auto-gradable type (see the eligibility sketch below)
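A minimal sketch of this eligibility check in TypeScript (the `Question` shape and helper names are assumptions for illustration, not the actual implementation):
```typescript
// Question types named in the rules above; the interface shape is assumed.
type QuestionType =
  | "single_choice"
  | "true_false"
  | "essay"
  | "code_simple"
  | "code_exercise";

interface Question {
  type: QuestionType;
}

const AUTO_GRADABLE: QuestionType[] = ["single_choice", "true_false"];

// Auto-scoring is activated only when every question is auto-gradable.
function isAutoScorable(questions: Question[]): boolean {
  return (
    questions.length > 0 &&
    questions.every((q) => AUTO_GRADABLE.includes(q.type))
  );
}
```
A single `essay`, `code_simple`, or `code_exercise` question makes `every` return `false`, which matches the deactivation rules above.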
### Scoring Calculation
For auto-scorable exams:
- Each correct answer earns full points for that question
- Each incorrect answer earns 0 points
- Total score = sum of earned points
- Percentage = (total score / max score) × 100
- Passing threshold = 70% (a worked sketch of this calculation follows)
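A minimal sketch of this arithmetic (the per-question result shape is an assumption; only the formulas mirror the rules above):
```typescript
// Assumed per-question result: full points if correct, 0 otherwise.
interface PerQuestionResult {
  earned: number;
  max: number;
}

const PASSING_THRESHOLD = 70; // percent

function scoreExam(results: PerQuestionResult[]) {
  const totalScore = results.reduce((sum, r) => sum + r.earned, 0);
  const maxScore = results.reduce((sum, r) => sum + r.max, 0);
  const percentage = maxScore > 0 ? (totalScore / maxScore) * 100 : 0;
  return { totalScore, maxScore, percentage, passed: percentage >= PASSING_THRESHOLD };
}

// Example: 8 of 10 five-point questions correct -> 40/50, 80%, passed.
const results = Array.from({ length: 10 }, (_, i) => ({
  earned: i < 8 ? 5 : 0,
  max: 5,
}));
console.log(scoreExam(results)); // { totalScore: 40, maxScore: 50, percentage: 80, passed: true }
```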
### What You See
**When you submit an auto-scored exam:**
1. Immediate score display
2. Percentage shown in large circle
3. Total points (e.g., "40 / 50")
4. PASSED or NOT PASSED status
5. Number of correct answers
6. Question-by-question breakdown in output JSON
**When you submit a manually-graded exam:**
1. Submission confirmation
2. Message: "Manual grading required for some questions"
3. No immediate score
4. Results saved for later review
## Current Exam
### Python Fundamentals - Easy Level
**Exam Details:**
- Subject: Python
- Difficulty: Easy
- Duration: 45 minutes
- Total Points: 50
- **Auto-Scored:** ✓ Yes
**Questions:**
- 5 Multiple Choice (5 points each = 25 points)
- 5 True/False (5 points each = 25 points)
- **Total:** 10 questions, 50 points
### Question Breakdown
**Multiple Choice Questions (25 points):**
1. How to create a list? → Answer: A
2. Type of 'hello'? → Answer: B
3. Keyword for function? → Answer: B
4. What does len() return? → Answer: A
5. Valid variable name? → Answer: C
**True/False Questions (25 points):**
6. Python is case-sensitive → True
7. Lists are immutable → False
8. print() displays output → True
9. Python uses {} for blocks → False
10. # for comments → True
### Perfect Score Example
If you answer all correctly:
- Score: 50 / 50
- Percentage: 100%
- Status: ✓ PASSED
### Passing Score
- Need: 35 / 50 points (70% of 50 = 35)
- That's 7 out of 10 questions correct (35 ÷ 5 points per question = 7)
## Testing the System
### Via Browser
1. **Login:**
- Go to http://localhost/login
- Use: `testuser` / `test123`
2. **Take Exam:**
- Click "Start Exam" on "Python Fundamentals - Easy Level"
- Answer all 10 questions
- Click "Submit Exam"
3. **See Score:**
- Large circle showing percentage
- Green = Passed (≥70%)
- Red = Not Passed (<70%)
- Total points displayed
- Correct count shown
4. **View in History:**
- Click "View History"
- See your attempt with score
- Click "View Results" to see details
### Score Display Features
**Score Circle:**
- Green background if passed
- Red background if not passed
- Large percentage number
- Points fraction below
**Additional Info:**
- Correct answers count
- Pass/fail status
- Attempt ID
- Links to history and home
## Creating Auto-Scored Exams
### Template
```json
{
  "examId": "your-exam-id",
  "subject": "your-subject",
  "title": "Your Exam Title",
  "difficulty": "easy",
  "durationMinutes": 30,
  "sections": [
    {
      "id": "mcq",
      "title": "Multiple Choice",
      "questions": [
        {
          "id": "q1",
          "type": "single_choice",
          "prompt": "Your question?",
          "choices": [
            { "key": "A", "text": "Option A" },
            { "key": "B", "text": "Option B" }
          ],
          "answer": "A",
          "points": 10
        }
      ]
    },
    {
      "id": "tf",
      "title": "True/False",
      "questions": [
        {
          "id": "q2",
          "type": "true_false",
          "prompt": "Statement here.",
          "answer": true,
          "points": 10
        }
      ]
    }
  ],
  "metadata": {
    "version": "1.0.0",
    "totalPoints": 20,
    "autoScored": true
  }
}
```
**Important:**
- Use ONLY `single_choice` and `true_false` types
- Include `answer` field for each question
- Don't include essay or code questions (a validation sketch for these rules follows)
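A minimal validation sketch for these rules, assuming the exam shape from the template above (only the fields used here are declared; the helper is hypothetical):
```typescript
// Minimal shapes matching the template above.
interface TemplateQuestion {
  type: string;
  answer?: unknown;
}

interface ExamTemplate {
  sections: { questions: TemplateQuestion[] }[];
}

// Returns a list of problems; an empty list means the exam can be auto-scored.
function validateAutoScoredExam(exam: ExamTemplate): string[] {
  const errors: string[] = [];
  for (const section of exam.sections) {
    for (const q of section.questions) {
      if (q.type !== "single_choice" && q.type !== "true_false") {
        errors.push(`question type "${q.type}" is not auto-gradable`);
      }
      if (q.answer === undefined) {
        errors.push(`missing "answer" field on a "${q.type}" question`);
      }
    }
  }
  return errors;
}
```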
## Output Format
### Auto-Scored Output
The output JSON includes a `score` field in the attempt:
```json
{
  "exam": { ... },
  "attempt": {
    "attemptId": "...",
    "userId": "1",
    "examId": "python-easy-v1",
    "status": "finished",
    "startedAt": "...",
    "submittedAt": "...",
    "answers": [ ... ],
    "score": {
      "totalScore": 40,
      "maxScore": 50,
      "percentage": 80.0,
      "passed": true,
      "byQuestion": [
        {
          "questionId": "q1",
          "earned": 5,
          "max": 5,
          "correct": true
        },
        {
          "questionId": "q2",
          "earned": 0,
          "max": 5,
          "correct": false
        }
      ]
    }
  }
}
```
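For code that consumes this output, the `score` object can be described with a type like the following (field names are taken from the JSON above; wrapping them in interfaces is a sketch, not the system's actual type definitions):
```typescript
// Field names mirror the auto-scored output JSON above.
interface QuestionScore {
  questionId: string;
  earned: number;
  max: number;
  correct: boolean;
}

interface AttemptScore {
  totalScore: number;
  maxScore: number;
  percentage: number;
  passed: boolean;
  byQuestion: QuestionScore[];
}
```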
### Manually-Graded Output
No `score` field in attempt:
```json
{
  "exam": { ... },
  "attempt": {
    "attemptId": "...",
    "status": "finished",
    "answers": [ ... ]
    // No score field
  }
}
```
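Since the presence of `score` is the only structural difference between the two outputs, a consumer can branch on it; a minimal sketch (the attempt shape here is assumed):
```typescript
// Assumed attempt shape: `score` is present only when the exam was auto-scored.
interface AttemptSummary {
  attemptId: string;
  status: string;
  score?: { totalScore: number; maxScore: number; percentage: number };
}

function describeAttempt(attempt: AttemptSummary): string {
  return attempt.score
    ? `Scored: ${attempt.score.totalScore}/${attempt.score.maxScore} (${attempt.score.percentage}%)`
    : "Manual grading required; no immediate score.";
}
```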
## Examples
### Example 1: Auto-Scored Exam
- 10 MCQ questions
- **Result:** Immediate score
### Example 2: Mixed Exam
- 5 MCQ questions
- 5 True/False questions
- **Result:** Immediate score
### Example 3: Not Auto-Scored
- 5 MCQ questions
- 1 Essay question
- **Result:** No immediate score (essay requires manual grading)
### Example 4: Not Auto-Scored
- 5 MCQ questions
- 1 Coding exercise
- **Result:** No immediate score (code requires evaluation)
## Tips
1. **For quizzes:** Use only MCQ and T/F for instant scoring
2. **For comprehensive exams:** Mix in essay/code, accept manual grading
3. **Check exam type** before taking to know if you'll get instant feedback
4. **Metadata hint:** Add `"autoScored": true` to exam metadata (optional, for reference)
## Current Status
- Auto-scoring logic implemented
- Score calculation working
- Frontend displays score
- Python Easy exam created (auto-scored)
- 10 questions (5 MCQ + 5 T/F)
- 50 total points
- 70% passing threshold
**Ready to test!** Go to http://localhost and take the exam! 🎯