TY - GEN
T1 - Evaluating Software Documentation Quality
AU - Tang, Henry
AU - Nadi, Sarah
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - The documentation of software libraries is an essential resource for learning how to use the library. Bad documentation may demotivate a developer from using the library or may result in incorrect usage of the library. Therefore, as developers select which libraries to use and learn, it would be beneficial to know the quality of the available documentation. In this paper, we follow a systematic process to create an automatic documentation quality evaluation tool. We identify several documentation quality aspects from the literature and design metrics that measure these aspects. We design a documentation quality overview visualization to visualize and present these metrics, and receive intermediate feedback through a focused interview study. Based on the received feedback, we implement a prototype for a web service that can evaluate a given documentation page for Java, JavaScript, and Python libraries. We use this web service to conduct a survey with 26 developers where we evaluate the usefulness of our metrics as well as whether they reflect developers' experiences when using the library. Our results show that participants rated most of our metrics highly, with Text Readability and Code Readability (of examples) receiving the highest ratings. We also found several libraries where our evaluation reflected developers' experiences using the library, indicating the accuracy of our metrics.
UR - http://www.scopus.com/inward/record.url?scp=85166318282&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85166318282&partnerID=8YFLogxK
U2 - 10.1109/MSR59073.2023.00023
DO - 10.1109/MSR59073.2023.00023
M3 - Conference contribution
AN - SCOPUS:85166318282
T3 - Proceedings - 2023 IEEE/ACM 20th International Conference on Mining Software Repositories, MSR 2023
SP - 67
EP - 78
BT - Proceedings - 2023 IEEE/ACM 20th International Conference on Mining Software Repositories, MSR 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 20th IEEE/ACM International Conference on Mining Software Repositories, MSR 2023
Y2 - 15 May 2023 through 16 May 2023
ER -