The past decades have been marked by the rapid digitization of almost every part of human life. Our work and private lives, healthcare, education, governmental institutions, and society as a whole have been undergoing an era of profound change driven by new forms of digital communication, automation, new information sources, emerging digital markets, and social networks, as well as new digital practices and workflows. With this transition, which some analysts refer to as a ‘digital transformation’ and which some intellectuals speculate to be a ‘2nd Neolithic revolution’, come many ethical and value-related questions and challenges of high relevance for humanity. At the core, we must ask ourselves how we should technically and organizationally design our IT-driven world so that humans can flourish in it (Spiekermann 2016). How can we build “Technology for Humanity”, as the IEEE standardization organization promotes in its logo?
The pressing nature of these grand challenges demands a more comprehensive understanding of the ongoing transformation, one that goes beyond the study of standalone values. In response, policymakers have launched numerous initiatives, for example at the international level by announcing the seventeen Sustainable Development Goals and at the local level by creating large-scale, publicly financed research projects such as the Weizenbaum Institute for the Networked Society in Berlin. In a similar vein, many BISE/IS institutes have answered this call by establishing important initiatives such as the Sustainability Lab at Vienna University of Economics and Business or by collaborating on joint research projects that address these issues. In addition to investigating privacy issues and personal data markets (e.g., Spiekermann et al. 2015; Krasnova et al. 2010; Trang et al. 2020), BISE/IS scholars have embarked on studying the fundamental issues of human control, autonomy, and freedom vis-à-vis technology (e.g., Spiekermann and Pallas 2005). Among others, these research efforts have contributed to a better understanding of potential biases inherent in AI design (e.g., Lambrecht and Tucker 2019; Bauer et al. 2020), algorithmic work management (e.g., Möhlmann et al. 2021), echo chambers and filter bubbles (e.g., Kitchens et al. 2020), the health and well-being implications of ICT use in work (e.g., Benlian 2020) and social media (e.g., Krasnova et al. 2015; Krause et al. 2021) contexts, and user perceptions of online targeted political advertising (e.g., Baum et al. 2021). In doing so, the undifferentiated human entity formerly called simply the “user” has begun to be concretized into the many distinct roles created by new ICTs: patients are now studied in relation to health applications (Mueller et al. 2018), citizens as natural participants in e-government processes (e.g., Tan et al. 2013; Nishant et al. 2019), crowd-workers in relation to their algorithmic supervisors (e.g., Straub et al. 2015; Cram et al. 2020), social networking users in the face of gamified platforms intentionally designed to be addictive (e.g., Turel and Serenko 2012), and life-loggers with a view to personal data markets (e.g., Risius et al. 2020; Trang et al. 2020), to name a few.
Against the background of these developments, we invite papers in this area for the BISE Special Issue “Technology for Humanity”. Submissions are encouraged from all theoretical and methodological perspectives. They can investigate people in their specific roles and reactions to technology, relate to technology design or design methods that foster humanity and sustainability, be of a conceptual nature to better grasp the technological and economic changes we are witnessing, or review user studies and/or technologies of interest in this research domain. Authors must clearly outline why their contribution is new and interesting for research and practice and how it contributes to human and social value creation.
Possible topics include, but are not limited to:
- Dignity and respect
- Inequality
- Freedom, liberty, transparency and autonomy
- Privacy, trust, control and freedom
- Friendship, social support and inclusion
- Bias, fairness and transparency
- Digital work, digital labor markets and the gig economy
- Personal data markets
- Digital society (e.g., fake news, online radicalization, cyberbullying)
- Physical health and mental well-being in the context of IT use
- Individual behavior and perceptions
- Moral behavior
- Technology-mediated human judgement
- Value-based system design
- Privacy-sensitive design, privacy by design
- Attention-sensitive systems
Submission Guidelines
Please submit papers by 15 May 2022 at the latest via the journal’s online submission system (http://www.editorialmanager.com/buis/). Please observe the instructions regarding the format and size of contributions to Business & Information Systems Engineering (BISE). Papers should adhere to the general BISE author guidelines (http://www.bise-journal.com/author_guidelines).
All papers will be reviewed anonymously (double-blind process) by at least two referees with regard to relevance, originality, and research quality. In addition to the editors of the journal, including those of this special issue, distinguished international scholars will be involved in the review process.
Schedule
- Deadline for submission: 15 May 2022
- Notification of the authors, 1st round: 1 August 2022
- Completion Revision 1: 1 October 2022 / 15 December 2022
- Notification of the authors, 2nd round: 1 December 2022 / 31 January 2023
- Completion Revision 2: 15 January 2023 / 28 February 2023
- Final Notification: 31 March 2023
- Online publication: ASAP
- Anticipated print publication: October 2023
Editors of the Special Issue
Sarah Spiekermann-Hoff, Prof. Dr.
Institute for Information Systems & Society
Vienna University of Economics and Business (WU Vienna)
sspieker@wu.ac.at
Hanna Krasnova, Prof. Dr.
Professor for Information Systems, especially Social Media and Society
University of Potsdam
krasnova@uni-potsdam.de
Oliver Hinz, Prof. Dr.
Professor of Information Systems and Information Management
Goethe University Frankfurt
ohinz@wiwi.uni-frankfurt.de
References:
Bauer, K., Pfeuffer, N., Abdel-Karim, B. M., Hinz, O. & Kosfeld, M. (2020). The terminator of social welfare? The economic consequences of algorithmic discrimination. SAFE Working Paper, 287
Baum, K., Meissner, S. & Krasnova, H. (2021). Partisan self-interest is an important driver for people’s support for the regulation of targeted political advertising. PLOS ONE, in press
Benlian, A. (2020). A daily field investigation of technology-driven stress spillovers from work to home. MIS Quarterly, 44(3), 1259–1300
Cram, W.A., Wiener, M., Tarafdar, M. & Benlian, A. (2020) Algorithmic controls and their implications for gig worker well-being and behavior. In: Proceedings of the 41st International Conference on Information Systems (ICIS)
Kitchens, B., Johnson, S.L. & Gray, P. (2020) Understanding echo chambers and filter bubbles: the impact of social media on diversification and partisan shifts in news consumption. MIS Quarterly, 44(4), 1–32
Krasnova, H., Spiekermann, S., Koroleva, K. & Hildebrand, T. (2010) Online social networks: why we disclose. Journal of Information Technology, 25(2), 89
Krasnova, H., Widjaja, T., Buxmann, P., Wenninger, H. & Benbasat, I. (2015) Research note – Why following friends can hurt you: an exploratory investigation of the effects of envy on social networking sites among college-age users. Information Systems Research, 26(3), 585–605
Krause, H. V., Baum, K., Baumann, A., & Krasnova, H. (2021). Unifying the detrimental and beneficial effects of social network site use on self-esteem: a systematic literature review. Media Psychology, 24(1), 10-47
Lambrecht, A., Tucker, C. (2019). Algorithmic bias? An empirical study of apparent gender-based discrimination in the display of STEM career ads. Management Science, 65(7), 2966-2981
Möhlmann, M., Zalmanson, L., Henfridsson, O. & Gregory, R. W. (2021). Algorithmic Management of Work on Online Labor Platforms: When Matching Meets Control. MIS Quarterly, forthcoming
Mueller, M., Heger, O., Niehaves, B. (2018). Investigating Ethical Design Requirements for Digitalized Healthcare Support: The Case of Ambulatory Physiotherapeutic Assistance Systems. Proceedings of the 51st Hawaii International Conference on System Sciences (HICSS-51)
Nishant, R., Srivastava, S. C., & Teo, T. S. (2019). Using polynomial modeling to understand service quality in e-government websites. MIS Quarterly, 43(3), 807-826
Risius, M., Baumann, A., & Krasnova H. (2020). Developing a New Paradigm: Introducing the Intention-Behaviour Gap to the Privacy Paradox Phenomenon. Proceedings of the 28th European Conference on Information Systems (ECIS2020)
Spiekermann, S., Pallas, F. (2005). Technology Paternalism – Wider Implications of RFID and Sensor Networks. Poiesis & Praxis – International Journal of Ethics of Science and Technology Assessment, 4(1), 6-18
Spiekermann, S. (2016). Ethical IT innovation – a value-based system design approach. CRC Press, New York
Spiekermann, S., Böhme, R., Acquisti, A. et al. (2015). Personal data markets. Electronic Markets, 25, 91–93
Straub, T., Gimpel, H., Teschner, F. et al. (2015). How (not) to Incent Crowd Workers. Business & Information Systems Engineering, 57, 167–179
Tan, C. W., Benbasat, I., Cenfetelli, R. T. (2013). IT-mediated customer service content and delivery in electronic governments: An empirical investigation of the antecedents of service quality. MIS Quarterly, 37(1), 77-109
Trang, S., Trenz, M., Weiger, W. W., Tarafdar, M., & Cheung, C. M. K. (2020). One App to Trace Them All? Examining App Specifications for Mass Acceptance of Contact-Tracing Apps. European Journal of Information Systems, forthcoming.
Turel, O., Serenko, A. (2012). The benefits and dangers of enjoyment with social networking websites. European Journal of Information Systems, 21(5), 512-528