Collaborative Learning Analytics (CLA) tools have recently emerged as a potential solution to the onerous process of monitoring and providing timely feedback on the collaboration skills of higher education students. However, prior studies of the efficacy of such tools have mainly been carried out in small, controlled settings. This study measures the impact of a specific CLA tool, one that can be deployed at larger scale with minimal instructor effort, in real-world online group work activities. It also examines how the characteristics of the collaborative activity may influence the tool's effectiveness. The CLA tool under investigation displays speaking participation time and peer evaluation scores for students engaged in online collaborative activities as part of their regular courses. The tool was evaluated with five instructors and 156 students over one semester. Its effects on students' speaking participation and peer evaluation scores were quantitatively measured and tested, and a qualitative analysis of reflections from both students and instructors supplemented the quantitative results. The main finding is that the tool has a small overall positive impact. Its effectiveness is primarily moderated by whether the instructor is present synchronously or asynchronously: students tend to interact more naturally and feel less scrutinized in the absence of instructor evaluation. Based on these findings, this research offers design insights for future CLA tools at scale aimed at supporting the development of online collaboration skills.