Analytics
TBL specific analytics
Real-time analytics show you, at every step of the way, how far your students have advanced within the assessment (be it an iRAT, tRAT or AE).
These analytics can greatly help you facilitate your TBL lesson: they give you insight into which areas of the topic students are struggling to understand, how much time to allocate to each assessment, and how to better manage the overall lesson experience.
Let's start with the iRAT Analytics.
For the sake of the documentation, we will use the words exam, assessment and test interchangeably.
When a TBL lesson is running and your students are completing the different assessments, you can get real-time analytics as your students answer questions. Here are some important iRAT analytics that will help you manage and diagnose your lesson.
These are charts that are available via the Assessment monitoring page. Refer to TBL Monitoring for details.
The first chart, "Completion", displays the number of students who have not started, are in progress, or have completed the assessment.
It is split into three categories:
Not started: students who are part of the lesson but have not yet started the assessment.
In progress: students who are in the middle of the assessment.
Completed: students who have completed the exam.
The “Students Progress” chart aggregates the number of questions answered by each student. This chart tells you how many questions students have answered as they progress through the exam.
This is a bar chart, so you will see the bars "move" as students advance question by question, from right to left.
The x-axis shows the number of answered questions and the y-axis the number of students. As an example, assume the exam has 10 questions. When a student has answered 5 questions, he/she is counted in the 5th position on the x-axis. As he/she answers further questions, you can follow the progress until all the exam questions are completed.
Use the Student progress chart to help you set time limits and keep your lesson on track.
Both of these charts are progressive, meaning they are updated continuously as students advance through the test.
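To make the aggregation behind the Students Progress chart concrete, here is a minimal sketch of how per-student answer counts could be turned into bar heights. The function name and input shape are assumptions for illustration, not the platform's actual implementation.

```python
from collections import Counter

def progress_histogram(answered_counts, total_questions):
    """Aggregate per-student answered-question counts into bar heights.

    answered_counts: one entry per student, the number of questions
    that student has answered so far.
    Returns a list indexed by number of answered questions (x-axis),
    where each value is the number of students at that point (y-axis).
    """
    counts = Counter(answered_counts)
    return [counts.get(n, 0) for n in range(total_questions + 1)]

# Example: a 10-question exam with five students mid-way through.
# One student has not started, one has answered 3 questions, two have
# answered 5, and one has finished all 10.
bars = progress_histogram([5, 5, 3, 10, 0], total_questions=10)
print(bars)  # → [1, 0, 0, 1, 0, 2, 0, 0, 0, 0, 1]
```

As students answer more questions, their count moves along the x-axis, which is exactly the "moving bars" effect described above.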
As students submit their answers on the iRAT, the "Students' Choices" table will display the students' answer distribution.
The rows represent each question (in order of display) and the columns the available answer choices. Each cell shows the percentage of students who chose that option.
If the cell is green, it means that this option is the correct answer.
For example, in the table above, we can see that for Question #2, 92.59% of the students chose answer "C". In this case, they were correct.
However, less than 2% of the students chose the correct answer for Question #9.
Students' Choices
This table is very useful to identify potential knowledge gaps and can prepare you for the discussions ahead.
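The percentages in the Students' Choices table amount to a per-question distribution over the answer options. A minimal sketch of that computation, with a hypothetical function name and an assumed A–D option set:

```python
from collections import Counter

def choice_distribution(responses, options=("A", "B", "C", "D")):
    """Percentage of students who selected each option for one question."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: round(100 * counts.get(opt, 0) / total, 2) for opt in options}

# Example: 27 students answering a question; 25 chose "C".
answers = ["C"] * 25 + ["A", "B"]
print(choice_distribution(answers))
# → {'A': 3.7, 'B': 3.7, 'C': 92.59, 'D': 0.0}
```

A cell with a very low percentage on the correct option, as with Question #9 above, is an immediate flag for the upcoming discussion.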
If you want to see the full text of a question, click on the question number and an overlay with all the details of the question will be displayed.
To make an analytics comparison, you can start at the Analytics tab.
The first chart shows the overall scores for all students regardless of their team.
The chart displays the distribution of marks attained by all students. You can zoom in and out on the chart for better granularity of the results.
The table below shows the different summary scores: arithmetic mean, median, mode, and the lowest and highest marks.
When you scroll down, you can get the breakdown analytics by teams.
Each team shows its own analytics breakdown: arithmetic mean, median, mode, and the lowest and highest marks.
Item analysis is the act of analysing students' responses to individual exam questions with the intention of evaluating exam quality. It is an important tool to uphold test effectiveness and fairness. Item analysis is likely something educators do both consciously and unconsciously on a regular basis.
For each iRAT (or any assessment instance), you can get specific item analysis for that "exam run".
Head to the "Analytics" tab and view the Item Analysis section.
The item analysis shows you the following indexes:
Difficulty index: The difficulty index represents the proportion of students who answered the question correctly. This metric takes a value between 0 and 1. High values indicate that the question is easy, while low values indicate that it is difficult.
Discrimination Index: The discrimination index measures how well a question distinguishes students with more knowledge or skill from those with less. Values close to +1 indicate that the question does a good job of discriminating between high performers and low performers.
Point biserial: The point-biserial coefficient measures the correlation between the scores on the entire exam and the scores on the single question (where 1 = correct answer and 0 = incorrect answer). A large correlation indicates that the question measures the same construct as the overall exam.
These indexes will help you assess each question within your test. Take into account the number of students in the test when making decisions about the effectiveness and/or efficacy of a question.
The Mighty Question Bank also gives you the item analysis for a question across various exams and/or semesters. We strongly suggest you look at the Question Bank analysis as it provides richer data.
Sometimes you need to drill down and get specifics on a team or a student. For this, head back to the Summary tab and dig into each team.
Each team or group has its own summary section. You can expand and collapse each section to better fit your comparison needs.
Once you open a team section, you will see the names of all the students in the group and their overall scores in the exam (in this case the iRAT).
If you want to see the specific answers for each student, select the student's name to see his/her answers and other details.
For each student, you will see the answers they selected, the confidence level they chose, as well as the resulting score.
You can also see the report by question rather than by team. Just select the question you want to report on and get the full details.
At times, you might want to do your own research analysis using tools like R or SPSS. For this, you can export all data from the exam to an Excel file that you can then manipulate and use as best suits your needs.
The Excel file contains a breakdown of the data by group and student, and provides timestamps for each answer or attempt.
tRAT Analytics are best used together with the preceding iRAT results. Make sure you use them in conjunction with the iRAT Analytics, as that is when they will give you better insights into the students' knowledge gaps and strengths.
Important TBL Insights
For further comparison between the iRAT and tRAT results see the iRAT and tRAT Insights.
As the leader of each team selects an option for a tRAT question, you are able to view their choices in real time:
Each team is represented as a row and the questions as columns.
The tRAT summary shows you which answers each team selected for each question and the order in which they selected the answers.
Unlike the iRAT, the tRAT allows teams multiple attempts at each question. If the team does not select the correct answer on their first attempt, they can try again until they finally reach the correct answer.
Therefore, the green and red letters you see represent the options each team selected and the sequence in which they chose them.
For instance, in the first row, team "The Renagades" selected answer "B" for Question 10 as their first attempt, which was not correct. They then opted for "D", which was also incorrect, and finally selected "C", which was the correct answer.
The Total and Total % columns show you the number of questions each team answered correctly on the first attempt.
The Total and Total % rows show you the number of teams that answered each question correctly on their first attempt.
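Since teams retry until they reach the correct answer, the Total columns can be derived from the attempt sequences alone: a question was answered correctly on the first attempt exactly when its sequence has length one. A sketch, assuming a hypothetical input format where each question maps to the ordered list of options the team tried (with the correct option always last):

```python
def first_attempt_totals(team_attempts):
    """Count questions a team answered correctly on the first attempt.

    team_attempts: per question, the ordered list of options the team
    tried; teams retry until correct, so the last entry is always the
    correct answer. Returns (total, percentage).
    """
    total = sum(1 for attempts in team_attempts if len(attempts) == 1)
    pct = round(100 * total / len(team_attempts), 1)
    return total, pct

# Example: a 4-question tRAT; the second question took three tries
# ("B", then "D", then "C"), mirroring the Question 10 walkthrough above.
renegades = [["A"], ["B", "D", "C"], ["C"], ["B", "A"]]
print(first_attempt_totals(renegades))  # → (2, 50.0)
```

The Total % row for a question is the same calculation taken over the column of teams instead of the row of questions.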
Insights!
Note that the cells in RED denote the questions where all teams struggled the most. Focusing on these insights as the students answer the questions will give you a good idea of where the students' knowledge gaps are.
Clicking on a question lets you view it in an overlay.
And clicking on the team name shows you the full list of answers for the team, including the sequence of their responses, scores, burning questions and discussion (in case the tRAT is conducted asynchronously).
You can see specific scores for each team
The name highlighted and marked with a star refers to the Team Leader. The ticks next to all other team members denote that they are currently present in the team activity.
By clicking on the team leader's name, you will be able to see the complete report for the team (see tRAT Team Responses above).
There is also a report by question in the tRAT.
The Analytics Tab shows you the distribution chart and scores of all teams.
As with the iRAT item analysis, you can find the tRAT item analysis. Refer to the iRAT section for a complete explanation of the indexes.
As with the iRAT, you can export all the available tRAT data at any time to an Excel file to pursue your own research analytics using other statistical tools. Refer to the "Export" button on this page.
The new TBL Monitor Teams now compares the results of the RAP question across iRAT and tRAT giving you clear insights!
By looking at the iRAT results, you can see where students might have knowledge gaps or misunderstandings about the content. In this example, it is very clear that students are struggling with Question 10.
As tRAT results render in real time, you can see that, in most cases, questions students struggled with in the iRAT are likely to cause difficulty again during the tRAT.
When you compare the answers question by question between the two, it is clear that students are struggling with Questions 9 and 10 in both the iRAT and the tRAT.
As you can see in the iRAT results chart, the iRAT mean (or arithmetic average) is 6.22, whereas the tRAT mean is 7.5, meaning that, working as teams, students got on average a 20.6% increase in their score. By the way, the data used for this guide is real data with groups and names changed.
This is consistent with available research.
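For clarity, the 20.6% figure above is the relative increase of the tRAT mean over the iRAT mean, which is quick to verify:

```python
irat_mean, trat_mean = 6.22, 7.5

# Relative increase of the team (tRAT) mean over the individual (iRAT) mean.
increase_pct = (trat_mean - irat_mean) / irat_mean * 100
print(round(increase_pct, 1))  # → 20.6
```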
It's a common practice in TBL that when half of the students have completed the exam, you give the remaining students just a few extra minutes. This helps you keep activities on time and prevents students from getting distracted.
RAP results comparison charts