Here, we describe the two evaluations carried out with end users to assess the Science Tree platform: the first based on a questionnaire and the second conducted in person with prospective users.
Evaluation based on questionnaire
For this evaluation, we designed a dedicated questionnaire using the Google Forms service. The questionnaire was divided into three distinct parts. The first, mandatory, was aimed at identifying the respondents’ profile and consisted of five questions that anonymously collected the following data: gender, age, area of expertise, level of education, and occupation. The second part included 13 multiple-choice questions based on the Likert scale for assessing the users’ impressions of the platform, with the following six alternatives: “strongly agree”, “agree”, “neutral”, “disagree”, “strongly disagree,” and “I have no opinion,” the last one meaning that the respondent had no specific position on that statement (e.g., a participant who did not explore the filter options might not have an opinion about them). The statements that comprised this part of the questionnaire were as follows:
1. The visualization of the researchers’ metrics is clear.
2. The metrics on the researcher’s performance as supervisor are useful.
3. When selecting a researcher, the data presented about her/him is clear.
4. The platform is esthetically pleasant.
5. The trees are clearly visualized.
6. The colors used on the platform do not cause confusion.
7. The filter options available are clear.
8. The platform is easy to use.
9. It was easy to learn how to use the platform.
10. Navigating through the platform is intuitive.
11. The platform is useful for you as a researcher.
12. Exploring the academic genealogy trees is useful to assess the researchers’ performance.
13. The visualization of the advisor-advisee relationships by an academic genealogy tree is useful.
Note that the above statements were elaborated following a logical sequence among them, but they were shuffled and randomly ordered in the questionnaire presented to the researchers to avoid biasing their answers.
Finally, the third part of the questionnaire included a single three-choice question (“yes”, “no,” and “in some points”) to assess whether the user had any difficulty understanding the platform’s data flow, followed by an open-ended question in which the respondents could optionally include suggestions, comments, or criticism.
Regarding its application, the questionnaire was sent to a list of 5834 e-mail addresses of researchers associated with the National Institutes of Science and Technology (INCTs), a program sponsored by CNPq to support multidisciplinary research initiatives involving several Brazilian research institutions. Altogether, 291 of the contacted researchers answered the questionnaire, of whom 286 were considered for analysis because they completed the questionnaire in its entirety, corresponding to 4.9% of the researchers contacted by e-mail. Regarding the final open-ended question, 147 of the 286 respondents (51.4%) left a comment. However, 15 of these were not actual comments, consisting only of an e-mail address or a congratulatory message. Thus, in the end, 132 comments (corresponding to 46.2% of the respondents) were considered valid and addressed in our qualitative analysis (see the “Results” subsection).
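As a sanity check, the percentages reported above follow directly from the raw counts. The counts are taken from the source; the snippet below is only an illustrative verification of the arithmetic:

```python
# Raw counts reported for the questionnaire-based evaluation
emails_sent = 5834        # e-mail invitations sent
complete_responses = 286  # fully completed questionnaires analyzed
commenters = 147          # respondents who left an open-ended comment
valid_comments = 132      # comments kept after discarding 15 non-comments

# Response rate among all contacted researchers
response_rate = round(complete_responses / emails_sent * 100, 1)  # 4.9%

# Share of respondents who commented, and share with valid comments
comment_rate = round(commenters / complete_responses * 100, 1)    # 51.4%
valid_rate = round(valid_comments / complete_responses * 100, 1)  # 46.2%

print(response_rate, comment_rate, valid_rate)
```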
Out of the 286 respondents, 179 (62.5%) declared themselves male, 103 (36.0%) female, and 4 (1.5%) chose not to declare their gender. The respondents’ ages ranged from 24 to 75 years, with the largest number in the 50-to-59 age group. This may indicate that most respondents are still-active researchers with consolidated careers in their respective research areas. Another interesting fact is that 19 respondents (6.6%) were over 70 years old, showing that such researchers, possibly retired, are also interested in knowing their own academic genealogy trees and, possibly, in exploring those of other colleagues. With respect to professional occupation, 220 respondents (76.9%) were university faculty members, 54 (18.9%) were researchers at public research centers, and 6 (2.1%) were lab technicians; the remaining six (2.1%) were students. Finally, the respondents came from distinct knowledge areas such as biochemistry, biology, chemistry, computer science, engineering, health, mathematics, pharmacology, physics, and social sciences, among others.
Evaluation with prospective users
For this second evaluation, we selected as participants faculty members from relevant graduate programs at the Universidade Federal de Minas Gerais (UFMG) and from other federal institutions in the region, since participation required a face-to-face session. We sent invitations to 27 faculty members from distinct knowledge areas and at different stages of their academic careers (assistant, associate, and full professors). The invitation included a brief explanation of the evaluation process and a short description of the Science Tree platform. From the 27 invitations sent out, we obtained seven positive replies. The evaluation process involved performing a few tasks using the platform and participating in a semi-structured interview, in which the participants had the opportunity to express their opinions about the platform.
Only one participant was present at each evaluation session. During these sessions, the participants were asked to perform three predefined tasks that involved identifying specific information on a researcher’s genealogy tree or associated with the presented metrics. Next, they were asked to freely explore a tree of their interest (their own or a colleague’s) and to think aloud [31] as they did so. Afterwards, a semi-structured interview was conducted that explored the following questions:
1. What was your general opinion about the platform? Please cite its main strengths and weaknesses.
2. Did you find any difficulty in performing the given tasks? If yes, which one and why?
3. About the platform metrics chosen in some of the tasks, which one did you find interesting and why?
4. What did you think about the visualization of the trees? Justify.
5. What did you think about the configuration of the visualization options?
6. Would you use the platform in your day-to-day activities? In which circumstances?
7. Do you have any suggestions to improve the platform?
The user evaluations took place between February and March 2020 on the premises of UFMG. Of the seven participants, aged between 30 and 60 years, six were from the Universidade Federal de Minas Gerais and one was from the Universidade Federal de Ouro Preto. Six of them were full professors with an average of 20 completed supervisions in their respective graduate programs, and one was an assistant professor who had not yet supervised any graduate students. Their research interests were in electrical engineering, mathematics, physics, sociology, biology, and communication. The sessions took 37 min on average. The users’ interactions and comments were recorded and later transcribed for analysis.