APA 6th Edition Jadrić, M., Ćukušić, M., & Bralić, A. (2014). Comparison of discrete event simulation tools in an academic environment. Croatian Operational Research Review, 5(2), 203-219. https://doi.org/10.17535/crorr.2014.0008
MLA 8th Edition Jadrić, Mario, et al. "Comparison of discrete event simulation tools in an academic environment." Croatian Operational Research Review, vol. 5, no. 2, 2014, pp. 203-219. https://doi.org/10.17535/crorr.2014.0008. Accessed 9 Aug. 2020.
Chicago 17th Edition Jadrić, Mario, Maja Ćukušić, and Antonia Bralić. "Comparison of discrete event simulation tools in an academic environment." Croatian Operational Research Review 5, no. 2 (2014): 203-219. https://doi.org/10.17535/crorr.2014.0008
Harvard Jadrić, M., Ćukušić, M. and Bralić, A. (2014). 'Comparison of discrete event simulation tools in an academic environment', Croatian Operational Research Review, 5(2), pp. 203-219. https://doi.org/10.17535/crorr.2014.0008
Vancouver Jadrić M, Ćukušić M, Bralić A. Comparison of discrete event simulation tools in an academic environment. Croatian Operational Research Review [Internet]. 2014 [cited 2020 Aug 9];5(2):203-219. https://doi.org/10.17535/crorr.2014.0008
IEEE M. Jadrić, M. Ćukušić and A. Bralić, "Comparison of discrete event simulation tools in an academic environment", Croatian Operational Research Review, vol. 5, no. 2, pp. 203-219, 2014. [Online]. Available: https://doi.org/10.17535/crorr.2014.0008
Abstract A new research model for simulation software evaluation is proposed, consisting of three main categories of criteria: the modeling capabilities, the simulation capabilities, and the input/output analysis possibilities of the explored tools, each with respective sub-criteria. Using the presented model, two discrete event simulation tools are evaluated in detail using a task-centred scenario. Both tools (Arena and ExtendSim) had been used for teaching discrete event simulation in preceding academic years. To inspect their effectiveness and to help determine which tool is more suitable for students, i.e. for academic purposes, we used a simple simulation model of entities competing for limited resources. The main goal was to measure subjective (primarily attitudinal) and objective indicators while the tools were used on the same simulation scenario. The subjects were first-year students of the Master's programme in Information Management at the Faculty of Economics in Split, taking a course in Business Process Simulations (BPS). In a controlled environment (a computer lab), two groups of students were given detailed, step-by-step instructions for building models in both tools, first in ExtendSim and then in Arena or vice versa. Subjective indicators (students' attitudes) were collected through an online survey completed immediately after building each model; they primarily comprise students' personal assessments of Arena's and ExtendSim's capabilities/features for model building, model simulation and result analysis. Objective indicators were measured using specialised software that logs information on user behaviour while a particular task is performed on the computer, such as the distance travelled by the mouse during model building, the number of mouse clicks, the usage of the mouse wheel and the speed achieved. The results indicate that ExtendSim is clearly preferred over Arena with regard to the subjective indicators, while the objective indicators are better for Arena.
Objectively, students completed the given scenario faster and with fewer mouse movements in Arena, yet they still preferred ExtendSim and perceived it as the better tool with respect to its characteristics and functionalities.
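The kind of model the students built, entities competing for a limited resource, can be illustrated with a minimal hand-rolled discrete event sketch. This is only an assumption about the style of model described in the abstract, not the authors' actual Arena/ExtendSim scenario; both of those tools are graphical, so the Python below merely mirrors the underlying queueing logic (arriving entities wait until one of a fixed pool of servers frees up).

```python
import heapq

def simulate(arrivals, service_time, servers=1):
    """Minimal discrete event sketch: entities compete for a fixed
    pool of servers. `arrivals` is a list of arrival times; service
    times are constant. Returns a (start, finish) pair per entity,
    in order of arrival."""
    free_at = [0.0] * servers        # time at which each server next becomes free
    heapq.heapify(free_at)           # min-heap: earliest-free server on top
    results = []
    for t in sorted(arrivals):
        earliest = heapq.heappop(free_at)  # next server to become available
        start = max(t, earliest)           # entity waits if all servers are busy
        finish = start + service_time
        results.append((start, finish))
        heapq.heappush(free_at, finish)    # server is busy until `finish`
    return results

# One server, entities arriving faster than they can be served:
# each entity queues behind the previous one.
print(simulate([0, 1, 2, 3], 2.0, servers=1))
# → [(0, 2.0), (2.0, 4.0), (4.0, 6.0), (6.0, 8.0)]
```

With `servers=2` the same arrivals would be served two at a time, which is exactly the contention effect a discrete event tool lets students observe by varying resource capacity.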