Next week we are headed to the Assessment Institute in Indianapolis. We hope to see many of you there, and if you do catch us at our booth, we will be more than happy to give you a tour of some of the new features we just released in Aqua. We are pretty excited about the additions!
If you won’t be in Indy, or are just dying to hear more, here’s what we have been up to for our latest Aqua release…
Course-level scoring in Aqua!
Aqua is built to support scoring by teams, or committees, of faculty, making it easy to distribute student evidence randomly and blindly for scoring. Until now, evaluators could only be grouped by specific learning outcomes. With this release, assessment coordinators can choose to have work scored by the faculty associated with each course in the project, making it easy to engage course instructors in collecting and scoring work on the selected learning outcomes.
Aqua will pre-populate the evaluator assignments based on faculty course enrollments, but we’ve made it easy for you to adjust those assignments or fill in gaps as needed.
Evaluators can add qualitative feedback
Evaluators can now add overall comments on each learning outcome while scoring. We designed this as a simple, unobtrusive option so the user experience stays focused on the rubric and the student work.
Comments are included in the export of results, and in upcoming releases we plan to surface them in other reports for quick review.
Find and review scored work
While Aqua has always made it easy to get to data quickly at the end of scoring, we have a new report that should be a big help to institutions that want to spend more time reviewing the work that was scored.
In the “View Results” area for a project, you will see a new option to “Browse Scored Submissions.” Here you can browse all of the scored work, filter scores by outcome or by specific criteria, and quickly view the student evidence. We think this is a really nice way to support reflection and analysis at the end of a project: you can easily pull exemplar work, or samples from different performance levels, to support calibration exercises.
As you can see in the screenshot above, we have given each rubric criterion a short abbreviation (C1, C2, etc.) to keep the layout clean, but you can filter by the full title and hover over each column to see the title as you sort.
Preview collected work before submitting for scoring
Previously, coordinators had to download each file to review it. Now they can preview student work to be scored, just as evaluators will see it, right from the “Manage Submissions” area.
That’s it for now. Stay tuned for the next round of developments coming in a couple of months.