To make them easier to navigate, I have grouped my publications into the following categories:
- User Experience Questionnaire (UEQ): all publications created in the course of my work on the UEQ. The UEQ is the User Experience Questionnaire and can be downloaded at www.ueq-online.org. This questionnaire measures the UX of an interactive product along several UX factors.
- Measuring User Experience: publications that deal with measuring user experience in general.
- Miscellaneous: all publications that I could not assign directly to one of the categories above.
User Experience Questionnaire (UEQ) Publications
A. Hinderks, M. Schrepp, F.J. Domínguez Mayo, M.J. Escalona, and J. Thomaschewski, Developing a UX KPI based on the user experience questionnaire, Computer Standards & Interfaces 65 (2019), pp. 38–44.
Abstract: Decisions in companies are typically made using a number of entirely different key figures. A user experience key figure is one of many important key figures that represents one aspect of the success of the company or its products. In this article, we present a method for those responsible for a product to develop a user experience key performance indicator (UX KPI) from a UX questionnaire. We have developed a UX KPI for use in organizations based on the User Experience Questionnaire (UEQ). To achieve this, we added six questions to the UEQ to measure the importance of the UEQ scales. Based on the UEQ scales and the scores given for importance, we then developed a User Experience Questionnaire KPI (UEQ KPI). In a first study with 882 participants, we calculated and discussed the UEQ KPI using Amazon and Skype. The results show that the six supplementary questions could be answered independently of the UEQ itself. In our opinion, the extension can be implemented without any problems. The resulting UEQ KPI can be used for communication within an organization as a key performance indicator.
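The abstract describes combining the UEQ scale means with per-scale importance ratings into a single figure. A minimal sketch of such an importance-weighted aggregation (the exact formula is defined in the paper; normalizing the importance ratings into weights, and all numbers below, are illustrative assumptions):

```python
def ueq_kpi(scale_means, importance):
    """Illustrative UX KPI: importance-weighted mean of UEQ scale means.

    scale_means: dict scale -> mean rating on the UEQ scale range (-3..+3)
    importance:  dict scale -> importance score from the six extra questions
    """
    total = sum(importance.values())
    # Normalize importance scores so the weights sum to 1.
    weights = {scale: imp / total for scale, imp in importance.items()}
    return sum(weights[scale] * scale_means[scale] for scale in scale_means)

# Hypothetical example values, not data from the study:
means = {"Attractiveness": 1.8, "Perspicuity": 2.1, "Efficiency": 1.5,
         "Dependability": 1.2, "Stimulation": 0.9, "Novelty": 0.4}
imp = {"Attractiveness": 5, "Perspicuity": 6, "Efficiency": 6,
       "Dependability": 5, "Stimulation": 3, "Novelty": 2}
kpi = ueq_kpi(means, imp)
```

Because the weights are normalized, the resulting KPI stays within the UEQ rating range and can be tracked over time like any other key figure.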
A. Hinderks, F.J. Domínguez-Mayo, A.-L. Meiners, and J. Thomaschewski, Applying Importance-Performance Analysis (IPA) to Interpret the Results of the User Experience Questionnaire (UEQ), Journal of Web Engineering 19.
Abstract: In recent years, user experience questionnaires have established themselves to measure various aspects of User Experience (UX). In addition to these questionnaires, an evaluation tool is usually offered so that the results of a study can be evaluated in the light of the questionnaire. As a rule, the evaluation consists of preparing the data and comparing it with a benchmark. Often this interpretation of the data is not sufficient as it only evaluates the current User Experience. However, it is desirable to determine exactly where there is a need for action. The User Experience Questionnaire (UEQ) is a common and valid questionnaire with an evaluation tool to measure and analyse the User Experience for a product or service. In our article we present an approach that evaluates the results from the User Experience Questionnaire using the importance-performance analysis (IPA). The aim is to create another possibility to interpret the results of the UEQ and to derive recommendations for action from them. In a study with 467 participants, we validated the approach presented with YouTube, WhatsApp, and Facebook. The results show that the IPA provides additional insights from which further recommendations for action can be derived.
A. Hinderks, A.-L. Meiners, F. Mayo, and J. Thomaschewski, Interpreting the Results from the User Experience Questionnaire (UEQ) using Importance-Performance Analysis (IPA), in: Proceedings of the 15th International Conference on Web Information Systems and Technologies, 15th International Conference on Web Information Systems and Technologies, Vienna, Austria. SCITEPRESS – Science and Technology Publications, 2019, pp. 388–395.
Abstract: The User Experience Questionnaire is a common and valid method to measure the User Experience (UX) for a product or service. In recent years, these questionnaires have established themselves to measure various aspects of UX. In addition to the questionnaire, an evaluation tool is usually offered so that the results of a study can be evaluated in the light of the questionnaire. As a rule, the evaluation consists of preparing the data and comparing it with a benchmark. Often this interpretation of the data is not sufficient, as it only evaluates the current User Experience. However, it is desirable to determine exactly where there is a need for action. In our article we present an approach that evaluates the results from the User Experience Questionnaire (UEQ) using the importance-performance analysis (IPA). The aim is to create another possibility to interpret the results of the UEQ and to derive recommendations for action from them. In a first study with 219 participants, we validated the approach presented with YouTube and WhatsApp. The results show that the IPA provides additional insights from which further recommendations for action can be derived.
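Both IPA papers above build on the classic importance-performance grid: each UEQ scale is placed by its measured performance and its rated importance, and the quadrant it falls into suggests a recommendation for action. A minimal sketch of that quadrant classification (the labels follow the standard IPA scheme; using the means over all scales as the grid crosshairs, and the numbers below, are illustrative assumptions, not figures from the papers):

```python
def ipa_quadrant(performance, importance, perf_cut, imp_cut):
    """Classify one scale on the importance-performance grid.

    perf_cut / imp_cut are the grid crosshairs, here taken as the
    means over all scales in the study.
    """
    if importance >= imp_cut:
        return "keep up the good work" if performance >= perf_cut else "concentrate here"
    return "possible overkill" if performance >= perf_cut else "low priority"

# Hypothetical (performance, importance) pairs per scale, not study data:
scales = {"Efficiency": (1.9, 6.2), "Novelty": (0.3, 3.1)}
perf_cut = sum(p for p, _ in scales.values()) / len(scales)
imp_cut = sum(i for _, i in scales.values()) / len(scales)
result = {s: ipa_quadrant(p, i, perf_cut, imp_cut)
          for s, (p, i) in scales.items()}
```

Scales landing in the "concentrate here" quadrant (high importance, low performance) are the ones from which a need for action would be derived.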
A. Hinderks, M. Schrepp, and J. Thomaschewski, A Benchmark for the Short Version of the User Experience Questionnaire, in: Proceedings of the 14th International Conference on Web Information Systems and Technologies, 3rd International Special Session on Advanced practices in Model-Driven Web Engineering, Seville, Spain, Sep. 18-20, 2018. SCITEPRESS – Science and Technology Publications, 2018, pp. 373–377.
Abstract: To enable an interactive product to provide adequate user experience (UX), it is important to ensure the quantitative measurability of this parameter. The User Experience Questionnaire (UEQ) is a well-known and popular method for such a UX measurement. One of the key features of this questionnaire is a benchmark that helps to interpret measurement results by a comparison with a large dataset of results obtained for other products. For situations where filling out the entire UEQ is too time-consuming, there is a short version (UEQ-S). However, there is currently no sufficient data available to construct an independent and interpretable benchmark for this short version. This paper examines the efficiency of using a modified version of the existing benchmark of the full UEQ for this purpose. The paper also presents some additional evaluation results concerning the UEQ-S.
M. Schrepp, A. Hinderks, and J. Thomaschewski, Design and Evaluation of a Short Version of the User Experience Questionnaire (UEQ-S), IJIMAI 4 (2017), p. 103.
Abstract: The user experience questionnaire (UEQ) is a widely used questionnaire to measure the subjective impression of users towards the user experience of products. The UEQ is a semantic differential with 26 items. Filling out the UEQ takes approximately 3-5 minutes, i.e., the UEQ is already reasonably efficient concerning the time required to answer all items. However, there exist several valid application scenarios where filling out the entire UEQ appears impractical. This paper deals with the creation of an 8-item short version of the UEQ, which is optimized for these specific application scenarios. First validations of this short version are also described.
M. Schrepp, M. Pérez Cota, R. Gonçalves, A. Hinderks, and J. Thomaschewski, Adaption of user experience questionnaires for different user groups, in: Universal Access in the Information Society Vol. 16, 2017.
Abstract: Products should guarantee a sufficiently high user experience for all intended user groups. A good user experience means not only that it is sufficient to allow users to work effectively and efficiently with a product, but also that non-task-related quality aspects need to be considered. Attaining various levels of user experience thus requires an efficient method to assess the subjective impressions of a larger number of users toward the product. The authors show that this can be done with little effort by using a user experience questionnaire. However, such questionnaires must be adapted to the language and even to the level of language understanding of the intended target groups. Using the example of the creation of a Portuguese language version and a special version for children, the paper illustrates how the quality of the resulting adapted questionnaire can be evaluated.
M. Schrepp, A. Hinderks, and J. Thomaschewski, Construction of a Benchmark for the User Experience Questionnaire (UEQ), International Journal of Interactive Multimedia and Artificial Intelligence 4 (2017), pp. 40–44.
Abstract: Questionnaires are a cheap and highly efficient tool for achieving a quantitative measure of a product’s user experience (UX). However, it is not always easy to decide if a questionnaire result can really show whether a product satisfies this quality aspect, so a benchmark is useful: it allows comparing the results of one product to a large set of other products. In this paper we describe a benchmark for the User Experience Questionnaire (UEQ), a widely used evaluation tool for interactive products. We also describe how the benchmark can be applied to the quality assurance process for concrete projects.
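The benchmark described above maps each UEQ scale mean to a category by comparing it with the result distribution of a large set of evaluated products. A minimal sketch of such a category lookup (the threshold values and percentage splits below are placeholders for illustration, not the published benchmark intervals, which also differ per scale):

```python
# Placeholder category boundaries, NOT the published UEQ benchmark values.
BENCHMARK = [
    (1.75, "excellent"),      # e.g. among the best results in the data set
    (1.50, "good"),
    (1.00, "above average"),
    (0.50, "below average"),
]

def benchmark_category(scale_mean):
    """Map a UEQ scale mean (range -3..+3) to a benchmark category."""
    for threshold, label in BENCHMARK:
        if scale_mean >= threshold:
            return label
    return "bad"

category = benchmark_category(1.6)
```

Presenting each scale as a category rather than a raw mean is what lets non-experts interpret a study result without knowing the UEQ's scale range.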
Measuring User Experience
A. Hinderks, D. Winter, M. Schrepp, and J. Thomaschewski, Applicability of User Experience and Usability Questionnaires, Journal of Universal Computer Science (2020), pp. 1717–1735.
Abstract: To be successful, interactive products need to fulfil user expectations and create a positive user experience (UX). An established method to measure UX involves questionnaires. In this paper, we present a list of user experience and usability questionnaires and their applicability to different digital products. A total of 13 questionnaires on usability and UX were analysed for this paper, and 25 factors were extracted from those questionnaires. A study was conducted based on this collection of factors with N = 61 students. The study investigated the perceived importance of usability and UX factors for seven digital products. The goal was to have a collection of usability and UX factors that could be combined for a suitable product evaluation. The results of the study revealed that no questionnaire covered all the factors perceived important by the participants.
A. Hinderks, F.J. Domínguez Mayo, J. Thomaschewski, and M.J. Escalona, An SLR-Tool: Search Process in Practice: A tool to conduct and manage Systematic Literature Review (SLR), in: Proceedings of the 42nd International Conference on Software Engineering (ICSE), 2020, pp. 81–84.
Abstract: Systematic Literature Reviews (SLRs) have established themselves as a method in the field of software engineering. The aim of an SLR is to systematically analyze existing literature in order to answer a research question. In this paper, we present a tool to support an SLR process. The main focus of the SLR tool (https://www.slr-tool.com/) is to create and manage an SLR project, to import search results from search engines, and to manage search results by including or excluding each paper. A demo video of our SLR tool is available at https://youtu.be/Jan8JbwiE4k.