CS7IS3 Assignment 2 - Evaluation & Leaderboard Guide
This guide shows you how to run the evaluation workflow and automatically submit your MAP scores to the leaderboard.
Required Files
Before running the evaluation, ensure your repository has these files and directories. You can download them from the links below:
Required Files (Root Directory):
- `pom.xml` - Maven project configuration - Download pom.xml
- `topics` - Search topics file - Download topics
- `qrels.assignment2.part1` - Relevance judgments for evaluation
Required Files (Tools Directory):
- `tools/evaluate.py` - Python script for evaluating search results - Download evaluate.py
Required Files (GitHub Workflow):
- `.github/workflows/evaluation.yml` - Evaluation workflow file - Download evaluation.yml (UPDATED)
Required Directory:
- `Assignment Two/` - Dataset directory containing:
  - `fbis/` - FBIS documents
  - `fr94/` - Federal Register documents (subdirectories 01-12)
  - `ft/` - Financial Times documents (subdirectories ft911-ft944)
  - `latimes/` - LA Times documents
  - `dtds/` - DTD files for document parsing: `fbisdtd.dtd`, `fr94dtd`, `ftdtd`, `latimesdtd.dtd`
- Download Assignment Two dataset
Tip: After downloading, extract the `Assignment Two` directory to your repository root. For other files, place them in the exact locations shown in the File Structure Reference section below.
Your Java Source Code:
- `src/main/java/App.java`
- `src/main/java/Indexer.java`
- `src/main/java/Searcher.java`
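How you split the work between these classes is up to you. As a rough orientation only, a minimal `App.java` might wire indexing and searching together as in the sketch below; the field names, paths, sample document id, and use of Apache Lucene's `StandardAnalyzer` are illustrative assumptions, not requirements of the evaluation.

```java
import java.nio.file.Paths;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.store.FSDirectory;

public class App {
    public static void main(String[] args) throws Exception {
        StandardAnalyzer analyzer = new StandardAnalyzer();
        FSDirectory indexDir = FSDirectory.open(Paths.get("index"));

        // Indexing stage: real code would parse every file under "Assignment Two/"
        // and add one Lucene Document per TREC document.
        try (IndexWriter writer = new IndexWriter(indexDir, new IndexWriterConfig(analyzer))) {
            Document doc = new Document();
            doc.add(new StringField("docno", "FBIS3-0001", Field.Store.YES)); // hypothetical doc id
            doc.add(new TextField("text", "example document text", Field.Store.NO));
            writer.addDocument(doc);
        }

        // Searching stage: real code would loop over every topic in the "topics" file.
        try (DirectoryReader reader = DirectoryReader.open(indexDir)) {
            IndexSearcher searcher = new IndexSearcher(reader);
            QueryParser parser = new QueryParser("text", analyzer);
            ScoreDoc[] hits = searcher.search(parser.parse("example"), 1000).scoreDocs;
            System.out.println("Retrieved " + hits.length + " hits for the sample query.");
        }
    }
}
```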
Required for leaderboard submission:
- GitHub Secrets: `LEADERBOARD_API_URL` and `LEADERBOARD_API_TOKEN`
- Optional: `TEAM_NAME` - Your team name (if not set, your repository name will be used)
- Optional: `TEAM_MEMBERS` - Names of team members, comma-separated (e.g., "John Doe, Jane Smith")
Quick Start
Step 1: Configure GitHub Secrets (One-time setup)
To submit scores to the leaderboard, add these secrets to your repository:
Go to your repository → Settings → Secrets and variables → Actions
Click "New repository secret" and add:
Secret 1:
- Name: `LEADERBOARD_API_URL`
- Value: `https://leaderboard.qrameez.in`
Secret 2:
- Name: `LEADERBOARD_API_TOKEN`
- Value: (provided by your instructor)
Secret 3 (Optional):
- Name: `TEAM_NAME`
- Value: Your team name (e.g., "Team Lucene", "Query Rangers")
- If not set, your repository name will be used as the team name
Secret 4 (Optional):
- Name: `TEAM_MEMBERS`
- Value: Names of team members, comma-separated (e.g., "John Doe, Jane Smith, Bob Johnson")
- If not set, your GitHub username will be used
⚠️ Note: Without `LEADERBOARD_API_URL` and `LEADERBOARD_API_TOKEN`, the evaluation will still run but won't submit to the leaderboard.
Step 2: Run the Evaluation
The evaluation runs automatically when you push code to your repository. You can also trigger it manually:
- Go to Actions tab
- Select "CS7IS3 Assignment 2 - Search Engine Evaluation"
- Click "Run workflow" → "Run workflow"
Step 3: Check Results
View workflow output:
- Go to the Actions tab → Click the latest workflow run
- Check the "Submit metrics to leaderboard" step (it should show a green checkmark)
- View detailed metrics in the workflow summary
View leaderboard:
- Visit: https://leaderboard.qrameez.in
- Find your team and team members' names along with your MAP score
What the Evaluation Does
The workflow automatically:
- Validates your project structure (checks for required files)
- Builds your project (`mvn clean package`)
- Indexes documents from the `Assignment Two/` directory
- Searches all topics from the `topics` file
- Evaluates results using `qrels.assignment2.part1`
- Submits scores to the leaderboard (if secrets are configured)
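The evaluation step reads the run file your searcher writes (the workflow looks for `runs/student.run`). TREC-style run files conventionally contain six whitespace-separated columns per retrieved document: topic id, the literal `Q0`, document number, rank, score, and a run tag. Whether `evaluate.py` expects exactly this layout is an assumption here, so confirm against the script; the helper class and `docno` field name below are likewise illustrative.

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;

// Hypothetical helper sketching the conventional TREC run-file layout.
public class RunFileWriter {

    /** Appends the results for one topic, one line per retrieved document. */
    public static void writeTopic(IndexSearcher searcher, Query query,
                                  String topicId, PrintWriter out) throws IOException {
        ScoreDoc[] hits = searcher.search(query, 1000).scoreDocs;
        for (int rank = 0; rank < hits.length; rank++) {
            // "docno" is assumed to be the stored field holding the TREC document number.
            String docno = searcher.doc(hits[rank].doc).get("docno");
            out.printf("%s Q0 %s %d %.4f myRunTag%n",
                       topicId, docno, rank + 1, hits[rank].score);
        }
    }

    public static void main(String[] args) throws IOException {
        // Make sure the runs/ directory exists before writing runs/student.run into it.
        Files.createDirectories(Paths.get("runs"));
    }
}
```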
Understanding Your Scores
The evaluation calculates these metrics:
- MAP (Mean Average Precision) - Primary ranking metric (0.0 to 1.0, higher is better)
- P@5 - Precision at 5 (fraction of top 5 results that are relevant)
The leaderboard ranks by MAP score.
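For intuition, the self-contained sketch below shows how these two numbers are defined: average precision for a single topic averages the precision at each rank where a relevant document appears (normalised by the total number of relevant documents), MAP is the mean of those per-topic values, and P@5 counts relevant documents in the top five. This illustrates the standard definitions only, not the exact code in `evaluate.py`.

```java
import java.util.List;
import java.util.Set;

public class MetricsDemo { // illustrative only

    /** Average precision: mean of precision@k over ranks k where a relevant doc appears. */
    static double averagePrecision(List<String> ranked, Set<String> relevant) {
        int hits = 0;
        double sum = 0.0;
        for (int i = 0; i < ranked.size(); i++) {
            if (relevant.contains(ranked.get(i))) {
                hits++;
                sum += (double) hits / (i + 1);   // precision at this rank
            }
        }
        return relevant.isEmpty() ? 0.0 : sum / relevant.size();
    }

    /** P@5: fraction of the top five retrieved documents that are relevant. */
    static double precisionAt5(List<String> ranked, Set<String> relevant) {
        int hits = 0;
        for (int i = 0; i < Math.min(5, ranked.size()); i++) {
            if (relevant.contains(ranked.get(i))) hits++;
        }
        return hits / 5.0;
    }

    public static void main(String[] args) {
        List<String> ranked = List.of("d1", "d2", "d3", "d4", "d5", "d6");
        Set<String> relevant = Set.of("d1", "d4", "d9");
        System.out.printf("AP = %.3f, P@5 = %.3f%n",
                averagePrecision(ranked, relevant), precisionAt5(ranked, relevant));
        // MAP over all topics = mean of the per-topic AP values.
    }
}
```

With the sample data in `main`, the relevant documents retrieved at ranks 1 and 4 contribute precisions 1/1 and 2/4, so AP = (1.0 + 0.5) / 3 ≈ 0.5, and P@5 = 2/5 = 0.4.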
Updating Your Score
Every time you:
- Push new code
- Create a pull request
- Manually trigger the workflow
Your leaderboard entry automatically updates with the latest scores. No extra steps needed!
Troubleshooting
Workflow fails at "Build and Test Search Engine"
Check:
- `pom.xml` exists in the root directory
- Java source files are in `src/main/java/`
- The `topics` file exists in the root directory
- The `Assignment Two/` directory exists with dataset files
Fix: Add missing files and push again.
Workflow fails at "Evaluate Results"
Check:
- `tools/evaluate.py` exists in your repository
- The `qrels.assignment2.part1` file exists in the root directory
- The `runs/student.run` file was generated (check the previous step's logs)
Fix: Ensure `tools/evaluate.py` is present and the search step completed successfully.
"Submit metrics to leaderboard" step is skipped
Problem: Step shows as gray (skipped)
Solution:
- Go to Settings → Secrets and variables → Actions
- Verify both `LEADERBOARD_API_URL` and `LEADERBOARD_API_TOKEN` are set
- Ensure the URL is exactly `https://leaderboard.qrameez.in` (no trailing slash)
Scores show 0.0 on leaderboard
Possible causes:
- Search engine didn't produce results
- The `runs/student.run` file is empty
- Build or search step failed
Solution:
- Check workflow logs in the βBuild and Test Search Engineβ step
- Verify your search engine produces output
- Ensure evaluation completed successfully
Scores not appearing on leaderboard
Check:
- Did the "Submit metrics to leaderboard" step succeed? (green checkmark)
- Wait 10-30 seconds and refresh the leaderboard page
- Verify secrets are correct
If still not working:
- Check workflow logs for error messages
- Verify the API URL in your secrets matches `https://leaderboard.qrameez.in`
File Structure Reference
Your repository should look like this:
your-repo/
├── pom.xml
├── topics
├── qrels.assignment2.part1
├── src/
│   └── main/
│       └── java/
│           ├── App.java
│           ├── Indexer.java
│           └── Searcher.java
├── Assignment Two/
│   ├── fbis/
│   ├── fr94/
│   │   ├── 01/
│   │   ├── 02/
│   │   └── ... (subdirectories 03-12)
│   ├── ft/
│   │   ├── ft911/
│   │   ├── ft921/
│   │   └── ... (other ft subdirectories)
│   ├── latimes/
│   └── dtds/
│       ├── fbisdtd.dtd
│       ├── fr94dtd
│       ├── ftdtd
│       └── latimesdtd.dtd
├── tools/
│   └── evaluate.py
└── .github/
    └── workflows/
        └── evaluation.yml
Best Practices
Test locally first:
- Run `mvn clean package` and fix build errors before pushing
Check workflow logs:
- Always review the Actions tab after pushing
- Look for warnings or errors in each step
Keep secrets secure:
- Never commit secrets to your code
- Never share your `LEADERBOARD_API_TOKEN`
Push frequently:
- The leaderboard shows your latest successful evaluation
- Push after each improvement to update your score
Quick Reference
- Leaderboard: https://leaderboard.qrameez.in
- Workflow: `.github/workflows/evaluation.yml`
- Secrets: Settings → Secrets and variables → Actions
- Actions Tab: https://github.com/YOUR_USERNAME/YOUR_REPO/actions
Contact
If you have any questions or encounter issues, please contact:
Rameez Qureshi
moquresh@tcd.ie
Good luck with your search engine implementation!