When LAMS detects that your design is based on TBL, it provides a specialised set of reports that make it easier to analyse the students' performance in a TBL lesson.
For further results analysis, please see TBL Analytics.
Once you open the monitoring interface, LAMS detects that the lesson is based on the TBL process and adds a new tab:
TBL Monitor (see red highlight).
TBL Monitor has a set of tabs that facilitate the analysis of the students' and teams' progress and performance. More importantly, it provides real-time data insights that help you see where and how students might be struggling, allowing you to plan your facilitation and interventions in the lesson.
Let's look at each of the tabs in TBL Monitor:
The Teams tab shows you a thorough summary of the TBL assessments at a glance.
From this tab, you can set up the teams as well as visualise the individual student performance (iRAT) compared with the team performance (tRAT).
The team button leads you to create the teams.
The Summary chart compares each team's average number of correct iRAT answers with the respective team's tRAT score.
The Summary table shows you the data reflected in the chart. It also highlights the highest and lowest iRAT and tRAT scores for each team.
More importantly, it shows the percentage increase or decrease (Delta Δ) between the iRAT and the tRAT score.
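As an illustration of how such a delta could be computed (a minimal sketch; this mirrors the Delta column conceptually, not LAMS's actual implementation):

```python
def irat_trat_delta(irat_team_average: float, trat_score: float) -> float:
    """Percentage increase (positive) or decrease (negative) from a team's
    average iRAT score to its tRAT score.

    Illustrative only -- not LAMS's internal code."""
    return (trat_score - irat_team_average) / irat_team_average * 100

# A team averaging 6.0 on the iRAT that scores 9.0 on the tRAT
# shows a delta of +50%.
print(irat_trat_delta(6.0, 9.0))  # 50.0
```

A positive delta typically indicates that team discussion improved on the individual results, which is the usual expectation in TBL.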
You can click on a team's name to get the breakdown of scores for individuals within that team.
Here you can compare the individual iRAT scores for each member of a team, find out the lowest and highest scores and use the chart to compare the students.
Gates allow you to manage the flow of students in your TBL lessons. From this page you can manage your gates by opening and closing them. You can also see who opened a gate and when. As always, gates can be opened selectively, to allow certain students/teams to continue if need be.
This tab shows you two sections. The first is the list of all questions in the iRAT. If you click on the "Show students' choices" button, the second section reports the iRAT analytics based on students' answers.
At any time you can export these iRAT results including analysis and time stamps for each answer. See Analytics Export iRAT results.
In this view, you can track the completion of all the students in real time as they progress through the iRAT.
It's split into three categories:
Not started: students that are part of the lesson but have not yet started the assessment.
In progress: students that are in the middle of the assessment.
Completed: the number of students that have now completed the assessment.
The Students' Progress chart aggregates the number of questions answered by the students. This chart tells you how many questions students have answered as they progress through the exam.
This is a bar chart, so you will see the bars "move" as the students advance question by question, from right to left.
The x-axis shows the number of answered questions and the y-axis the number of students. As an example, let's assume that our exam has 10 questions. When a student has answered 5 questions, he/she is counted at position 5 on the x-axis. As he/she answers further questions, you can see the progress until he/she has completed all the exam questions.
Both of these charts (student progress and completion) are progressive, meaning that they are updated continuously as the students are advancing in the test.
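Conceptually, the Students' Progress chart is a simple aggregation: count how many students have answered each possible number of questions. A minimal sketch with hypothetical data (names and counts are invented for illustration):

```python
from collections import Counter

# Hypothetical snapshot: number of questions each student has answered so far
answered = {"Ana": 5, "Ben": 5, "Cai": 7, "Dee": 10, "Eli": 0}

# x-axis: questions answered; y-axis: number of students at that point
progress = Counter(answered.values())

print(progress[5])   # 2 -- two students have answered exactly 5 questions
print(progress[0])   # 1 -- one student has not answered any question yet
```

Re-running this aggregation on each new snapshot is what makes the bars appear to "move" as students advance.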
As students submit their answers on the iRAT, the
Students' Choices table will display the students' answer distribution.
The rows represent each question (in order of display) and the columns the available answers. Each cell shows the percentage of students who chose that answer.
If the cell is green, it means that this option/choice is the correct answer.
For example, in the table above, we can see that for question #2, 92.59% of the students chose answer "C". In this case, they were correct.
However, less than 2% of the students chose the correct answer for Question #9.
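The percentages in the Students' Choices table are per-question answer distributions. A sketch of how they could be derived (the answer data below is hypothetical, sized to reproduce the 92.59% figure from the example above):

```python
# Hypothetical answers submitted for one question (one entry per student)
answers = ["C"] * 25 + ["A"] + ["B"]  # 27 students in total

def distribution(answers):
    """Percentage of students per answer choice, rounded to two decimals."""
    total = len(answers)
    return {opt: round(answers.count(opt) / total * 100, 2)
            for opt in sorted(set(answers))}

print(distribution(answers)["C"])  # 92.59
```

A cell showing a low percentage on the correct option, as with Question #9 above, is a quick signal of a question worth discussing with the class.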
If you want to see the full text for each question, click on the question number and an overlay with all the details of the question will be displayed.
See iRAT item analysis.
See Analytics - iRAT export results for further analysis.
If you need to allocate time for students to complete the iRAT, here's where you do it.
There are three options for managing times in the iRAT:
Everyone gets the same amount of time to complete the iRAT: time starts counting down from the moment the student enters the iRAT.
Hard time limits: the teacher specifies an exact time for all students to finish (i.e. "All students must finish within 5 minutes").
Time extension: the teacher grants one or more students additional time to complete the iRAT.
See iRAT Time Management for a full explanation of these options.
tRAT analytics are best used together with the earlier iRAT results. Make sure you use these results in conjunction with the iRAT analytics, as that is when they give you better insights into the students' knowledge gaps and strengths.
As the leader of each team selects an option for a tRAT question, you are able to view their choices in real time:
Each team is represented in rows and the questions as columns.
The tRAT summary shows you which answers each team selected for each question and the order in which they selected the answers.
Unlike in the iRAT, the tRAT allows teams multiple attempts at each question. If the team does not select the correct answer on their first attempt, they can try again until they finally reach the correct answer.
Therefore the green and red letters that you see represent the options that each team selected and the sequence in which they chose them.
For instance, in the first row, team "The Renagades" first selected answer "B" for Question 10, which was not correct. They then opted for "D", which was also incorrect, and finally selected "C", the correct answer.
The Total and Total % columns show you the number of questions that each team answered correctly on the first attempt.
The Total and Total % rows show you the number of teams that answered the question correctly on their first attempt.
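The first-attempt totals can be thought of as follows (an illustrative sketch; the team names, attempt sequences and answer key below are hypothetical, not real LAMS data):

```python
# Hypothetical tRAT data: for each team, the ordered attempts per question
attempts = {
    "Team 1": {"Q1": ["C"], "Q2": ["B", "D"]},       # Q2 took two attempts
    "Team 2": {"Q1": ["A", "C"], "Q2": ["D"]},       # Q1 took two attempts
}
correct = {"Q1": "C", "Q2": "D"}

def first_attempt_total(team):
    """Number of questions this team answered correctly on the first try."""
    return sum(attempts[team][q][0] == correct[q] for q in correct)

print(first_attempt_total("Team 1"))  # 1
```

The per-question row totals work the same way, just summed over teams instead of questions.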
Clicking on a question, you are able to view it in an overlay.
And clicking on the team name, you see the full list of answers for the team, including the sequence of their responses, scores, burning questions and discussion (in case the tRAT is conducted asynchronously).
See Analytics - tRAT export results for further analysis.
If you need to allocate completion times for the tRAT, here's how you do it:
There are three options for managing times in the tRAT:
All teams get the same amount of time to complete the tRAT: time starts counting down from the moment the leader of the team enters the tRAT.
Hard time limits: the teacher specifies an exact time for all teams to finish (i.e. "All teams must finish within 5 minutes").
Time extension: the teacher grants a team or teams additional time to complete the tRAT.
See tRAT Time Management for details.
Teams might raise burning questions, appeals or challenges about one or more questions, a particular answer, or a concept that they might need help with. When students do, these are presented as burning questions and can be seen in the Burning Qs tab.
When team leaders raise burning questions for a question, these immediately show in the Burning Qs tab. When you expand the question, you are able to see the burning question and the team or teams that have raised it.
As teams complete their tRAT, they are also able to see burning questions raised by other teams. Team leaders, in agreement with their team members, can then like other teams' burning questions. These likes accumulate to help the facilitator/content expert know which burning questions should be given priority.
This page has been designed so the facilitator can share it with the class and encourage discussion. When discussing a question with the class, the facilitator can display this page on the classroom board or share the screen via Zoom or another conferencing facility, encouraging teams to address the burning questions first.
Note that in the picture above the answers for the question are hidden, so students can have a discussion prior to the facilitator disclosing the answers and the correct answer.
For instance, say your lesson is based on a design that contains two AEs. The first AE (AE1) is based on the Assessment tool and the second one uses the doKu tool (AE2):
When you select the AEs tab in TBL Monitor, you will see two tabs that display both AEs (AE1 and AE2). The main purpose of aggregating the AEs is to allow better comparison and to make it easier to manage AEs in a single section.
Let's start looking at AE1, which is based on the Assessment tool.
Assessment-based AEs first show the list of questions and options to disclose answers.
If you want to track scores and teams' progress, click on "Show students' choices":
Similar to the completion and progress page for the iRAT, the Assessment activity displays almost the same charts. However, in this case, as the AE is performed in teams, the details presented are based on teams instead.
The two charts at the top show you the completion as well as the number of questions that teams have completed in real-time (Groups Progress).
As teams submit their answers on the AE, the
Students' Choices table will display the teams' choices.
The rows represent each team and the columns the questions. Each cell shows the selected answer by that team.
If the cell is green, it means that this option/choice is the correct answer.
As the iRAT and AEs use the same Assessment tool, you can get the same Item Analysis report. The main difference between the two is that the AE Item Analysis report is aggregated by teams instead.
During the lesson, when teams have submitted their responses, you can disclose their answers to the rest of the class.
For MCQ questions, there's a two-step disclosure. You can disclose the teams' answers first, then facilitate an intra-team discussion where each team might defend their choice, and once the correct answer becomes apparent to the class, disclose the correct answer.
The disclosure of MCQs can be done question by question or all questions at once (see "Disclose All..." buttons above), depending on your preference.
For essay responses, you can disclose the answers at once and follow up with each team's defence of their response. Other teams can then respond and rate.
doKu, in essence, is a collaborative real-time editor, allowing students to simultaneously create and edit a text document. As they edit the document, students can see each other editing in real time. This tool is ideal for AEs as it allows students to collaboratively provide a response for a case or create an extended essay response.
In the AEs tab, when the AE is a doKu, you are able to see the document responses from all teams and watch all students within each team writing on the document in real time.
As you are able to monitor each team's progress, you can launch the Gallery Walk whenever you think the students are ready. To do so, just press the "Start Gallery Walk" button and confirm that you want to start it; the students will then be able to see other teams' documents and comment on and rate each other's documents.
Whenever you think students have completed the Gallery Walk, you can end it, and everyone will simultaneously be able to see the comments on their document as well as all teams' ratings.
Just like iRAT and tRAT, you can apply time management limitations for AEs in Assessment and/or doKu tools.
Here you are able to see a summary of all the self and peer evaluation feedback based on the criteria that you set for the students to perform their own assessment.
For each team, you are able to see a report for the evaluation criteria. Clicking on a criterion gives you the full report for the ratings.
When reviewing any of the criteria, you are able to preview the email that will be sent to each individual student. When you preview the email, the following report is shown:
The report shows:
SPA factor for each criterion
Self and peers' ratings for each criterion
SPA Factor (average) and SAPA Factor
SPA explanation
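For orientation, the SPA and SAPA factors are commonly defined as in SPARK-style peer assessment: SPA compares the ratings a student received against the team average, and SAPA compares a student's self-rating against the average rating given by peers. Whether LAMS uses exactly these formulas is an assumption, and the names and ratings below are hypothetical:

```python
from math import sqrt

# Hypothetical ratings: total rating each member received across all criteria,
# each member's self-rating, and the average rating given by their peers.
received = {"Ana": 18.0, "Ben": 12.0, "Cai": 15.0}
self_rating = {"Ana": 6.0, "Ben": 5.0, "Cai": 5.0}
peer_avg = {"Ana": 6.0, "Ben": 3.5, "Cai": 5.0}

team_avg = sum(received.values()) / len(received)

def spa(member):
    # SPARK-style SPA: square root of the member's total over the team average.
    return sqrt(received[member] / team_avg)

def sapa(member):
    # SAPA: self-rating relative to the average rating given by peers.
    return self_rating[member] / peer_avg[member]
```

A SPA factor of 1.0 means the member was rated exactly at the team average; a SAPA above 1.0 suggests the student rates themselves higher than their peers do.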
If students are missing from the lesson but have been allocated to teams, the calculations above might take them into account, so you need to remove them for the purpose of the factors mentioned above. To do so, click on Manage Students and unselect the students that are missing.
The sequence or design tab shows you the original design as well as where all the students in your lesson currently are.
You can double-click on any activity to monitor what students are doing in it at any time.