Authors:
(1) Oscar Pedreira, Universidade da Coruna, Centro de Investigacion CITIC, Laboratorio de Bases de Datos, Facultade de Informatica;
(2) Felix García, Universidad de Castilla-La Mancha, Grupo Alarcos, Escuela Superior de Informatica, Paseo de la Universidad;
(3) Mario Piattini, Universidad de Castilla-La Mancha, Grupo Alarcos, Escuela Superior de Informatica, Paseo de la Universidad;
(4) Alejandro Cortinas, Universidade da Coruna, Centro de Investigacion CITIC, Laboratorio de Bases de Datos, Facultade de Informatica;
(5) Ana Cerdeira-Pena, Universidade da Coruna, Centro de Investigacion CITIC, Laboratorio de Bases de Datos, Facultade de Informatica.
This subsection summarizes the main aspects of the execution of each phase of the case study.
5.5.1 Scope and solution definition
As we explained in the background to the case study, the tool suite of SC is composed of many tools, as shown in Table 7. The most important one is SC-Manage, which supports project management and requirements management, and which is a custom development of SC. However, SC also uses Redmine for issue management, TestLink for test plans, and JUnit for unit testing.
In this project, the goal of SC was not to gamify just one single tool, but to include all of them in the same gamified environment. This meant that all the tools we have mentioned were taken into account within the scope of the case study. Table 7 shows the list of behaviors considered in the design of the gamified environment, along with the particular tool where the employees carry out those behaviors.
All of the behaviors included in the list were simple behaviors, except for “Report task effort”, “Complete a task”, and “Run unit tests”, which were considered task behaviors. For the first two behaviors, the attributes used are directly related to the task. In the effort report, only the “Real effort” attribute is used, with the value of the reported work hours. For “Complete a task”, all the attributes of the behavior were used, as in the examples we presented in the description of the engine. In the case of “Run unit tests”, the attribute “Grade” was used to indicate the percentage of unit tests that were run without errors.
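To make the distinction between simple and task behaviors concrete, the following sketch shows how such behavior notifications might be structured when reported to the engine. The attribute names (“Real effort”, “Grade”) follow the description above; the `BehaviorEvent` class and the player identifiers are hypothetical, not part of the engine's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class BehaviorEvent:
    """A behavior notification sent by a tool to the gamification engine
    (illustrative structure only)."""
    player: str
    behavior: str
    attributes: dict = field(default_factory=dict)  # empty for simple behaviors

# A simple behavior carries no attributes.
login = BehaviorEvent(player="jdoe", behavior="Log in")

# Task behaviors carry attributes describing the task.
effort = BehaviorEvent(player="jdoe", behavior="Report task effort",
                       attributes={"Real effort": 6})  # reported work hours

tests = BehaviorEvent(player="jdoe", behavior="Run unit tests",
                      attributes={"Grade": 92})  # % of tests run without errors
```

The engine can then treat all incoming notifications uniformly, ignoring the attribute map for simple behaviors and inspecting it for task behaviors.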
The rule shown in Table 2 as an example was actually taken from the case study, that is, it is a real gamification rule used by SWComp. We do not show the details of all the rules since they do not add much to the description of the case study.
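Although we do not reproduce the rules here, the general shape of rule evaluation can be sketched as follows: a rule fires when a notified behavior matches and a condition on its attributes holds, and it then awards points or some other reward. The rule below (completing a task within its estimated effort earns points) is a hypothetical example, not one of SWComp's actual rules, and the `evaluate_rule` function is a simplification of what the engine does.

```python
def evaluate_rule(event, rule):
    """Return the points awarded by `rule` for `event`, or 0 if the rule
    does not fire. A rule here is a (behavior, condition, points) triple;
    real engine rules are richer, this is only an illustration."""
    behavior, condition, points = rule
    if event["behavior"] != behavior:
        return 0
    return points if condition(event.get("attributes", {})) else 0

# Hypothetical rule: completing a task within its estimate earns 10 points.
on_time_rule = (
    "Complete a task",
    lambda attrs: attrs.get("Real effort", 0) <= attrs.get("Estimated effort", 0),
    10,
)

event = {"behavior": "Complete a task",
         "attributes": {"Real effort": 5, "Estimated effort": 8}}
```

Evaluating every active rule against each incoming behavior notification in this way keeps the gamification logic in the engine, outside the individual tools.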
5.5.2 Analysis and design of the gamified environment
One of the challenging aspects of the case study was the integration of the company's different tools with the gamification engine, since the environment includes custom developments along with COTS tools, such as TestLink, Redmine, and JUnit. Of these three, JUnit is a special case: TestLink and Redmine are tools that run continuously, whereas JUnit is run on demand.
Figure 10 shows a diagram with the architecture of the gamified environment. There are two central elements in it: SC-Manage and the gamification engine. Since SC-Manage is a custom development, it was easy to modify this software to communicate directly with the engine. An integration component was developed, and used to carry out the communication of those behaviors related to project management and requirements management.
As regards TestLink and Redmine, there were two design choices. Because they are both open-source tools, they could be modified to communicate directly with the engine. However, they also provide APIs through which the information they manage can be accessed. In the case study the second option was chosen. As we can see in the diagram, SC-Manage integrates the information managed by TestLink and Redmine, and communicates it to the gamification engine when it detects that one of the behaviors considered has happened.
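As an illustration of this API-based integration, the snippet below sketches how issue records fetched from Redmine's REST API might be mapped to behavior events before being forwarded to the engine. The nested field names mirror the shape of Redmine's issue JSON, but the behavior name "Close an issue" and the mapping function are assumptions for illustration, not part of the case study's actual design.

```python
def issues_to_events(issues):
    """Map Redmine-style issue records to behavior events for the
    gamification engine. Only issues whose status is 'Closed' produce
    an event; the behavior name is illustrative."""
    events = []
    for issue in issues:
        if issue.get("status", {}).get("name") == "Closed":
            events.append({
                "player": issue["assigned_to"]["name"],
                "behavior": "Close an issue",
                "attributes": {"issue_id": issue["id"]},
            })
    return events

# Example records, shaped like Redmine's REST API issue JSON.
issues = [
    {"id": 101, "status": {"name": "Closed"},
     "assigned_to": {"name": "jdoe"}},
    {"id": 102, "status": {"name": "In Progress"},
     "assigned_to": {"name": "asmith"}},
]
```

In this design, SC-Manage periodically polls the APIs, applies a transformation like this one, and notifies the engine only of the events that correspond to gamified behaviors.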
The case of JUnit was trickier since, as we have mentioned, this tool is run on demand and does not store the results of its executions in a database. In order to integrate this tool with the engine, a wrapper was developed around JUnit. This wrapper runs the unit tests, gets the results, and communicates them to the engine.
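The wrapper's core responsibility can be sketched as follows: derive the "Grade" attribute (the percentage of unit tests that ran without errors, as defined earlier) from JUnit-style result counts and build the corresponding behavior event. The function names and the `send` callback are hypothetical; in the real wrapper the event would be posted to the gamification engine.

```python
def grade_from_results(run, failures, errors):
    """Compute the 'Grade' attribute: the percentage of unit tests
    that ran without failures or errors."""
    if run == 0:
        return 0.0
    passed = run - failures - errors
    return round(100.0 * passed / run, 1)

def report_test_run(player, run, failures, errors, send):
    """Build the 'Run unit tests' behavior event and hand it to `send`
    (which, in the real wrapper, delivers it to the engine)."""
    event = {
        "player": player,
        "behavior": "Run unit tests",
        "attributes": {"Grade": grade_from_results(run, failures, errors)},
    }
    send(event)
    return event
```

Because the wrapper intercepts each on-demand execution, no persistent storage on the JUnit side is needed: the results are turned into a behavior notification at the moment the tests run.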
This paper is available on arxiv under CC BY 4.0 DEED license.