
Open Access 12.07.2022 | Catchword

Corporate Digital Responsibility

Author: Benjamin Mueller

Published in: Business & Information Systems Engineering | Issue 5/2022

Notes
Accepted after one revision by Ulrich Frank.

1 Introduction

In 2020, the US Senate held a series of antitrust hearings – involving some of the world’s leading tech companies – which highlighted the potentially double-edged nature of emerging digital technologies such as artificial intelligence or the Internet of Things. Beyond these technologies alone, digitalization (Legner et al. 2017) leveraging these technologies drives an ever stronger and more fundamental transformation of social and economic processes (Wessel et al. 2021). While these transformations are often linked to opportunities for social and economic growth, we are beginning to realize that these technologies also cause potentially undesirable side-effects. The hearings covered a range of contentious topics – such as the role of free speech on social media or bias in algorithmic decision making – emphasizing that the ability to identify, analyze, and potentially mitigate ethical tensions related to digital technologies and data is a key skill in the transformation toward a digital economy and society. Consequently, the need for scrutiny and safeguards becomes paramount if progress is not only to be driven by what is technologically possible, but by what is societally desirable and sustainable.
One way for organizations to minimize the ethical risks associated with new digital technologies is to put in place policies that encourage a responsible approach to their development, use, and modification. While such policies should be considered part of an organization’s larger corporate responsibility, recent literature (e.g., Herden et al. 2021; Lobschat et al. 2021) has begun to propose the concept of corporate digital responsibility (CDR). The Business and Information Systems Engineering (BISE) community, too, has set its sights on the concept and the research opportunities connected to it (e.g., Mihale-Wilson et al. 2022). This is evidenced by the attention dedicated to digital forms of corporate responsibility through, for instance, conferences (e.g., WI’23) and special issues.
In an effort to catalyze the emergence of a discourse on CDR in the BISE community and beyond, this catchword briefly reviews the motivational background and conceptual roots of CDR. It further provides an overview of extant definitions and contributions, synthesizing two key domains of CDR – a content-oriented perspective on digital ethics and an instrumental perspective on governance. Before concluding with an outlook, the catchword looks at debates and tensions in the still young body of literature on CDR, providing inspiration and guidance for potential future research in the BISE community.

2 Background

When Christopher Wylie revealed how Cambridge Analytica was using covertly collected profile data of up to 87 million Facebook users for political advertising, the understanding that digital technologies and data allow tech-savvy corporations to exert a profound pressure on a variety of social processes – from individual decision-making to the political discourse – turned from a lingering and often implicit suspicion into a widely appreciated fact. Wylie’s revelations in early 2018 were given particular nuance by the fact that the European Union’s General Data Protection Regulation (GDPR) came into enforceable effect only a few weeks later. While both of these examples are far from the only instances of technology- and data-related scandals or regulations, public attention and novel regulatory pressures drove increased scrutiny of how digital technologies and the data they rely on are used in corporations to the top of many executives’ lists of priorities.
While the above examples have had global impact and have intensified the debate on data security, privacy, and the regulation of technology and its use, some countries entered the debate with a head start. In Germany – a country known for its notoriously strict data protection and privacy laws – the public and political history of data protection predates the more recent scandals by decades. In particular, and even though mostly grounded in the debate on data use by public authorities rather than corporate entities, the principle of informational self-determination is the bedrock of much of Germany’s history in this regard. First established in the early 1970s (Deutscher Bundestag 1972), the principle received seminal attention when used in the 1983 “Volkszählungsurteil” and has become a lodestar for much of Germany’s legislative, regulatory, and social response to technology- and data-based innovations since. Arguably, it can still be seen shining through the EU’s recent efforts in this vein, too.
On the face of it, much of the recently intensified corporate debate on the matter is driven by considerations of risk and an aversion to scandal and liability. Returning to the example of Germany highlights links to the country’s rich tradition of technology assessment and its application to technology-driven innovation (Grunwald 2012). Nonetheless, more and more organizations – from startups to long-established players – are beginning to recognize that the way technology is used and data is handled opens up new opportunities to differentiate themselves in the eyes of the public, their customers, the talent they seek to attract and retain, or financial investors. But in order to exhibit corporate behaviors that are in line with or even surpass these stakeholders’ expectations, corporations need a coherent and organized approach to handling data and technology responsibly. For example, Merck – a 250-year-old industrial conglomerate – recently announced its efforts to help steer the company’s increasingly digitized business with the help of a digital ethics advisory board.3 Similarly, Elisa – a mobile network operator in Finland – has committed to a series of digital ethics values in its annual corporate responsibility reports for a number of years now and has received a series of recognitions for this work.4 Beyond these, the recently held first edition of the CDR Awards also illustrates other corporate initiatives to pursue digital responsibility, such as corporate digital activism that sponsors the building of digital skills in schools or efforts to upskill a corporation’s employees in order to sensitize them toward extant ethical issues connected to digital technologies and data.5

3 Conceptual Roots

At the crossroads of these developments is a concept that is beginning to gain more and more widespread attention among executives, policy makers, and scholars: corporate digital responsibility. This concept has important roots in discussions on computer ethics and business ethics that this section reviews.

3.1 Computer Ethics

The basic recognition that advanced means for data collection and processing can have widespread implications for human behaviors and society’s social fabric dates back to the mid-twentieth century. In particular, the then emergent field of cybernetics reflected on the impacts of computer technology, and personalities such as Norbert Wiener are often credited with the conviction that “[…] the integration of computer technology into society will require the remaking of society – a ‘second industrial revolution’ – destined to affect every major aspect of life” (Bynum 2000, pp. 109–110).
Since then, the idea of computer ethics has mainly evolved in computer science and came to a first full fruition in the late 1970s to mid-1980s with seminal works such as, for example, Walter Maner’s (1980) ‘starter kit in computer ethics’ or Deborah G. Johnson’s (1985) first textbook on the matter. Today, a 1985 special issue on computer ethics published by Metaphilosophy is generally recognized as a decisive crystallization point in the debate (Bynum 2001), especially because of notable contributions such as James H. Moor’s (1985) discussion of why computer technology deserves special and separate moral consideration. His arguments are also of interest in the current debates on CDR, especially in terms of how separate CDR can and should be thought of vis-à-vis a second important conceptual pillar – corporate social responsibility (CSR) – which I will discuss later.
Since the 1980s, the debates on computer ethics have advanced greatly in both science and practice. In terms of the former, a series of authors have since contributed an ever-increasing number of specific ethical theories that are particularly suited to the computer and information age. Most notably, concepts such as information ethics (Floridi 1999; Siponen 2004), machine or robot ethics (e.g., Lin et al. 2011; Moor 2006), internet ethics (Tavani 1999), or cyberethics (Spinello 2000) provide a variety of moral norms that seek to offer guidance in light of the ethical dilemmas posed by the digital revolution, especially when thought of as a form of applied ethics (Capurro 1990). More recent efforts in the domain are focused on conceptually synthesizing this diverse landscape (Capurro 2009; Floridi et al. 2019; Müller, forthcoming) – efforts that also often see a rebranding from computer ethics to the more contemporary label of digital ethics. Complementary to this, the literature has also begun to advocate a more holistic approach to the responsible use of digital technologies by going beyond a focus on singular technologies (e.g., the ethics of AI). Most notably in this vein, Stahl (2021) recently suggested focusing on the ethics of digital ecosystems.
While not as intense as in computer science, discussions of digital ethics are also prevalent in the BISE community. While early contributions took an often broad approach (e.g., Mason 1986), much of the debate since has predominantly been associated with the design of information systems (e.g., Chatterjee et al. 2009; Floyd 1999; Maedche 2017; Mumford 1995; Stahl 2007). In practice, ethical guidelines and codes of conduct by professional associations – such as the GI, ACM, or IEEE – seek to inform both education and professional practice of ICT professionals. This, too, has been a subject of research in the BISE community (e.g., Wakunuma and Stahl 2014; Walsham 1996).
Taken altogether, computer or digital ethics can be seen as providing an important underpinning for CDR through their development and discussion of pertinent ethical theories and moral norms specific to the information age.

3.2 Business Ethics and Corporate Social Responsibility

A second conceptual pillar for discussions of CDR is rooted in a general appreciation of ethical behaviors in the corporate world. One of the key reference disciplines in this regard is the field of business ethics, which can broadly be defined as the norms and standards that govern judgment and choices in business-related matters (Moriarty 2016; Treviño et al. 2006). Business ethics, interpreted this way, functions as a form of applied ethics that seeks to govern and guide the behaviors of a broad set of corporate actors – from senior executives to shopfloor workers – such that their decisions can be judged as ‘good’ against some defined moral code of the corporation (Lewis 1985).
The importance of such governance and guidance is particularly pronounced in decisional instances when behavioral options are not clear or when competing options exist. The recent attention to concepts such as the ‘triple bottom line’ (Elkington 1998) or ESG frameworks illustrates this: When corporations accept that they ought to follow more than a pure economic rationale in their decision making (e.g., profit maximization), how should they decide which of the goals to prioritize or how to manage often imperfect trade-offs between them (e.g., how much loss in profit can be justified in order to reduce a corporation’s carbon footprint)?
In this context, corporate social responsibility (CSR) has been proposed as a concept to help corporations define necessary moral norms and corresponding governance schemes to facilitate ethical decision making. In his seminal contribution, Carroll (1991) proposed a pyramidal hierarchy of responsibilities: from economic responsibilities at the bottom through to legal and ethical responsibilities all the way to philanthropic responsibilities – where the former two are often considered core for corporate entities while the latter are desirable extensions (Matten and Moon 2008). Expanding an earlier argument by Porter and Kramer (2006), Hamadi and Manzo (2021) suggest that accounting for these responsibilities is important for a corporation’s long-term competitiveness and for its ability to thrive in a healthy and sustainable environment.
While a full account of the CSR literature is beyond the scope of this article (see Herden et al. 2021, for an up-to-date discussion on the matter), the discussion above shows that corporations rely on CSR to help them translate their underpinning morality into specific decisional guidance both explicitly and implicitly (Matten and Moon 2008). This provides important insight into the mechanisms that organizations require to develop a coherent moral compass for the information age.

4 Corporate Digital Responsibility

As a concept, CDR is positioned between the conceptual roots discussed above. CDR is a relatively new term; its first notable mentions occur around 2017 (cf. Driesens et al. 2017). While the term has since seen growing resonance in the corporate world – with a seeming focus on European countries – appreciation of and engagement with the term is still in its nascent stage in academia. Recently, works such as Hamadi and Manzo (2021) or Herden et al. (2021) have provided good reviews of the emergent literature on the term. Table 1 presents select CDR definitions in the extant literature:
Table 1 Exemplary definitions of CDR in the literature

BMUV (2021) defines CDR as “[…] a voluntary corporate activity, particularly considering the consumers’ perspective, which strives to go beyond what is required by law to shape the digital world for the advancement of society.”

Driesens et al. (2017) define CDR as “[…] a voluntary commitment. It starts with the need to conform to legal requirements and standards – for handling customer data, confidential, intellectual property and so on – but it also extends to wider ethical considerations and the fundamental values that an organization operates by.”

Herden et al. (2021) define CDR as “[…] an extension of a firm’s responsibilities which takes into account the ethical opportunities and challenges of digitalization.”

Joynson (2018) defines CDR as “[… being] about recognizing that the organizations driving forward the advancement of technology, and those that leverage technology to engage and provide services to the citizen, have a responsibility to do so in a manner that is fundamentally leading us toward a positive future.”

Lobschat et al. (2021) define CDR as “[…] the set of shared values and norms guiding an organization’s operations with respect to the creation and operation of digital technology and data.”

Price (2018) defines CDR as “[… being] about protecting people’s rights around data (in line with regulation), about ensuring that trust is maintained because they see that products and services save them personal time, help them with their health and ageing, and protect them from less acceptable or threatening uses of those same technologies.”

Wade (2020) defines CDR as “[…] a set of practices and behaviors that help an organization use data and digital technologies in a way that is socially, economically, technologically, and environmentally responsible.”

Weißenberger and Marrocco (2022) define CDR as “[…] a voluntary corporate orientation to ensure a responsible use of digital technologies.”
Looking at these definitions reveals two main domains in the emergent CDR literature. First, CDR is concerned with ensuring that corporations exhibit behaviors that comply with a larger understanding of good or positive behaviors. To adjudicate, CDR requires “[…] values and specific norms that govern an organization’s judgments and choices in matters that relate specifically to digital issues” (Lobschat et al. 2021, p. 876). In this regard, CDR is strongly related to computer/digital ethics as a source of the underlying values or norms.
Corporations can either develop a specific set of values and norms for their idiosyncratic situation (e.g., Becker et al. 2022) or try to identify a specific ethical theory to base their CDR efforts on. Some of the systems of moral norms discussed earlier – such as, for example, information ethics (Floridi 1999; Siponen 2004) or cyberethics (Spinello 2000) – can serve as a foundation for these efforts. Beyond such specific ethical theories from general moral philosophy or the digital ethics discourse, the literature also hints at the suitability of frameworks such as the Declaration of Human Duties and Responsibilities (DHDR), the Universal Declaration of Human Rights (UDHR), or the UN’s Sustainable Development Goals (UNSDGs) as potential normative underpinnings for corporations’ CDR efforts (Lobschat et al. 2021). Alternatively, the literature also suggests different frameworks of guiding principles or questions that help organizations govern their corporate behaviors in a digital context (e.g., Brey 2012; Mihale-Wilson et al. 2021; Stahl et al. 2017; Wright 2011). Beyond this immediate, content-related role, computer/digital ethics can also play an important role by providing guidance on the procedural aspects of developing and maintaining a pertinent set of values and norms (e.g., Mingers and Walsham 2010). Beyond roots in moral philosophy, the literature also suggests that a corporation’s CDR norms should reflect other contextual considerations such as pertinent legal or philanthropic frameworks (Joynson 2018; Price 2018); an orientation that aligns well with Carroll’s (1991) thinking and can also be found in recent discussions of CDR in the CSR literature (Herden et al. 2021). In this vein, accounting for the stipulations of the GDPR can be considered an exemplary input to guide organizations’ digital responsibility efforts. However, inputs such as this can be understood as a universally obligated internalization of social costs (Johnston et al. 2021). Correspondingly, CDR regimes that are solely based on implementing legal requirements or basic social expectations are less likely to yield any potential to differentiate an organization from its competitors in a meaningful fashion (Lobschat et al. 2021). One noticeable aspect across the emergent literature, though, is that some sources refrain from specifically identifying any norms or values in a normative fashion and rather focus on the second, more governance-oriented domain.
Second, CDR serves an important governance function in that a CDR regime also seeks to define how to effectuate corporate behaviors across levels that are compliant with the norms and values discussed above. Across the body of emergent literature on the issue, four general dimensions can be recognized.
  • Stakeholders: In this dimension, corporations’ CDR efforts are generally seen to identify and recognize the pertinent stakeholders that need to be considered in and represented by a corporation’s CDR regime. Beyond purely internal stakeholders (e.g., managers, users, non-users), external stakeholders are increasingly recognized, both in terms of individual actors (e.g., customers) and institutional actors (e.g., regulators). In this perspective, companies need to be aware that efforts to scope their CDR can have substantial implications for which stakeholders are considered and which are not (Albrechtslund 2007; Reynolds 2011). Beyond stakeholders in terms of social actors, the literature is also beginning to call for considering artificial/technological actors (e.g., algorithms or software agents) when discussing CDR (Lobschat et al. 2021). This is in line with recent calls for IS research to review and expand its theorizing in an effort to explicitly incorporate agentic IS artifacts (Baird and Maruping 2021), especially in settings in which human and technological actors collaborate (Seeber, Bittner, et al. 2020; Seeber, Waizenegger, et al. 2020a, b).
  • Artifacts: This dimension addresses the question of how an often-latent understanding of CDR-related norms and values can be made more manifest inside a corporation. For instance, the work of software engineers could be guided on the basis of codified standards (e.g., Association for Computing Machinery 2018; Gotterbarn et al. 1997; Mumford 1995), or organizations can define specific codes of conduct (e.g., Becker et al. 2022). Alternatively, guidance regarding desirable behaviors could be embedded into software artifacts that scaffold employee behaviors (e.g., Hadasch et al. 2013; Morana et al. 2019) or are taken into account by users when they plan their behaviors (Mueller et al. 2016). Similarly, linked to the arguments regarding artificial/technological actors above, research in this area also seeks to understand how to embed moral norms into machines and ensure that they too exhibit moral behavior – either by following hard-coded rules or by exhibiting precursors of machine-based ethical reasoning (e.g., Allen et al. 2006; Moor 2006; Nallur 2020; Tóth et al. 2022); a minimal sketch of such a hard-coded rule follows after this list. Beyond such manifestations, this dimension also calls for attention to the less manifest forms in which rules such as moral norms materialize. Here, recognizing social structures such as institutions – including symbolic elements, resources, and practices – suggests increased sensitivity to aspects of organizing such as corporate mythology (e.g., storytelling or rituals) (Lobschat et al. 2021). Both the manifest and the non-manifest aspects of this perspective strongly inform and shape each other, even though the question of whether artifacts “have politics” (Winner 1986), or whether technology’s implications for social life are purely enacted (Orlikowski and Scott 2008), remains contested in the literature.
  • Processes and structures: While not yet as advanced and explicit as its counterpart in the wider CSR discourse, this dimension seeks to understand the various means through which CDR is implemented and enforced in corporations. This applies to changes in an organization’s hierarchical and procedural structures as well as to an adaptation of roles, rules, and responsibilities. For instance, Wade (2020) identifies a set of practices suggested to ensure corporate digital responsibility with respect to social, economic, technological, and environmental aspects. However, the literature still promotes a wide array of different approaches. For instance, Wade (2020) can be read to advocate a centralized approach to CDR where a corresponding office or corporate officer is given the authority and resources to police and enforce CDR. On the contrary, Lobschat et al. (2021) propose a more culture-oriented approach that seeks to decentralize responsibility for CDR-compliant behaviors. This stream of the literature is most specific where CDR is looked upon from a CSR perspective (e.g., Herden et al. 2021; Weißenberger and Marrocco 2022), which enables an at least implicit transfer of processes and practices from the CSR literature.
  • Impacts: This dimension encourages corporations to define the various outcome dimensions and relevant impacts associated with CDR. Beyond purely economic and competitive considerations, an increasing variety of environmental and social aspects are also associated with CDR. For example, Elliott et al. (2021) and Wade (2020) consider economic, social, and environmental aspects. With these considerations being akin to extant thinking on the relevance of ESG goals in the context of corporate responsibility at large, some authors even suggest a more emancipated consideration of digital aspects on par with other aspects of the triple bottom line (e.g., Wade 2020). Others consider such separate accounts of digital aspects a more transitory phase toward an equal consideration of ESG dimensions in both the physical and the digital realm (e.g., Doerr 2021). Herden et al. (2021) propose that impacts relevant for CDR can be derived from a classical understanding of corporate responsibility, for instance based on Carroll’s (1991) CSR pyramid.
This dimension also includes the competitive impact of defining and implementing CDR. Early conceptual work suggests that customers, future talent, and investors will be favorably influenced in their decision making when an organization adopts a CDR regime and exhibits compliant behaviors (Lobschat et al. 2021); a suggestion that is supported by emergent empirical work (e.g., Clausen et al. 2022; Mihale-Wilson et al. 2021).
Complementarily, this dimension also contains the question of key performance indicators (KPIs) that organizations can use to assess and manage their corporate behaviors with regard to CDR. While approaches as detailed as CDR balanced scorecards are yet to be developed, recent efforts in research (e.g., Mihale-Wilson et al. 2021) can be seen to point in this direction.
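To make the Artifacts dimension above more tangible, the following minimal Python sketch illustrates what embedding a hard-coded moral norm into a software artifact could look like: a purpose-limitation rule, in the spirit of the GDPR, that blocks data processing not covered by the data subjects’ consent. The classes, names, and the rule itself are hypothetical illustrations rather than elements of any framework cited above.

```python
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class DataAsset:
    """A data set together with the processing purposes its subjects consented to (hypothetical model)."""
    name: str
    consented_purposes: set[str] = field(default_factory=set)


@dataclass
class ProcessingRequest:
    """A planned use of a data asset by some corporate system or team (hypothetical model)."""
    asset: DataAsset
    purpose: str
    requested_by: str


def violates_purpose_limitation(request: ProcessingRequest) -> bool:
    """Hard-coded norm: data may only be processed for purposes covered by consent."""
    return request.purpose not in request.asset.consented_purposes


def execute(request: ProcessingRequest) -> str:
    """Scaffold compliant behavior: block non-compliant requests instead of executing them."""
    if violates_purpose_limitation(request):
        # A real system might also log the violation and escalate it to a (hypothetical) digital ethics board.
        return f"BLOCKED: '{request.purpose}' is not covered by consent for '{request.asset.name}'"
    return f"ALLOWED: processing '{request.asset.name}' for '{request.purpose}'"


if __name__ == "__main__":
    crm_data = DataAsset("crm_contacts", consented_purposes={"service_notifications"})
    print(execute(ProcessingRequest(crm_data, "service_notifications", "support_team")))
    print(execute(ProcessingRequest(crm_data, "behavioral_advertising", "marketing_team")))
```

Such hard-coded rules can only cover the most clear-cut norms; for the many cases that cannot be decided mechanically, the culture- and governance-oriented mechanisms discussed in the other dimensions remain indispensable.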
Beyond these two core CDR themes – the content-related domain of digital ethics and the instrumental domain concerned with governance – returning to the definitions reveals a set of issues that characterize this literature. Especially literature anchored in the German-speaking area tends to emphasize the voluntariness of corporations’ commitment to CDR, often pointing out the need to exceed the minimum level of compliance mandated by law. Most definitions are also explicit in highlighting that digital responsibility is equally about data and technology, and that digital responsibility needs to transcend the creation of digital assets such that a continued engagement with their use and impacts is possible – all the way to enabling interventions and changes or the possible retirement of digital assets when their impacts are no longer in line with the underlying norms and values.
Taken together, corporations will have to invest considerable efforts to build an effective CDR regime for their organizations. Figure 1 represents a stylized conceptual synthesis of the discussion above.
In the figure, the foundational position of norms and values reflects an emergent consensus in the still young literature that a clear guiding framework is needed if a corporation’s CDR efforts are to be coherent and effective. Literature explicitly discussing such norms and values (Brey 2012; Stahl et al. 2017; Wright 2011) shows that an understanding of which impacts matter to which stakeholders – and why so – can then be derived from these norms. With this in place, both discussions on artifacts as well as on processes and structures then complement the CDR efforts in seeking to find ways to effectuate desirable behaviors across levels (Lobschat et al. 2021).
Note, however, that the literature on CDR does not seem to have carved out a consensual understanding of how these domains relate to one another, and that questions on how exactly the domains and dimensions inform and depend on each other remain the subject of future research. The framework shared above should thus not be interpreted as a conceptual contribution, but as an attempt to synthesize and map different pieces of the forming CDR puzzle onto one another.
Additionally, returning to the definitions reveals a set of issues that are still a matter of debate in the emergent literature. For instance, some privilege a consumer/user perspective (e.g., BMUV 2021), while others call for a broader consideration of stakeholders (e.g., Lobschat et al. 2021). Similarly, and while extant definitions agree that adding digital aspects to the mix needs to extend a corporation’s current understanding of corporate responsibility, there does not seem to be any consensus on how corporate digital responsibility and general corporate social responsibility relate to one another. This points toward the need to advance the conceptualization of CDR per se as well as to review how it relates to other concepts.
While these are issues I will also highlight as key debates and future research opportunities below, this line of thinking seems critical to our understanding of whether CDR can be an emancipated research topic of interest for the BISE community (Mihale-Wilson et al. 2022). In the literature, a two-fold approach to emancipating CDR from CSR can be found.
On the one side, digital technology is argued to be unique in its characteristics such that extant moral norms are likely to either require massive updating or no longer apply at all. For instance, Broadbent et al. (2015) highlight how traditional ideas of responsibility are challenged by the increasing use of digital technologies. By and large, this position seems to match the public perception of legislators’ attempts to regulate the digital sphere, which are often said to struggle to keep pace with technological developments. Historically, this line of thinking is rooted in arguments that portray digital technologies as general purpose tools (esp. Moor 1985). In this spirit, related discourses provide a rich discussion of the unique characteristics of digital technologies that provide sufficient grounds for conceptual differentiation. An example is the discussion on innovation and innovation management: authors such as Fichman et al. (2014), Nambisan et al. (2017), or Ciriello et al. (2018) discuss characteristics such as network effects or dynamic problem–solution designs to justify why digital innovation needs to be considered conceptually separate from traditional innovation studies. The discourses on, for instance, digital business strategy (e.g., Bharadwaj et al. 2013) or digital transformation (e.g., Wessel et al. 2021) employ a similar strategy. Recently, Mihale-Wilson et al. (2022) continued this thinking and added aspects like agency, recombinant capabilities, pervasiveness, and opacity.
On the other side, literature also suggests that the moral regulation of digital technologies is a pressing and contemporary issue, thus justifying the at least temporary highlighting of the topic (Mihale-Wilson et al. 2022). Doerr (2021) complements these thoughts by compiling a list of contemporary business trends that demonstrate the salience of digital topics and explains how this salience expands traditional corporate responsibility regimes. In conclusion, Mihale-Wilson et al. (2022) establish “that a distinction between CDR and CSR is necessary because technology reshapes and extends the traditional corporate responsibilities unprecedentedly” (p. 128); arguments which align well with Moor (1985).

5 Current Debates and Tensions

While these arguments can support the stand-alone conceptualization and consideration of CDR, various streams of the literature are in disagreement over the exact definition of CDR and its connection to related work. The discussion below returns to these issues, but expands the list by also calling for future research on the scope and impact of CDR. Additionally, I argue that the tension between normative and descriptive work on CDR is a source of future research opportunities in BISE.

5.1 Advanced Conceptualization of CDR

While early efforts provide starting points for comprehensive conceptualizations of CDR as a phenomenon (e.g., Herden et al. 2021; Lobschat et al. 2021; Mihale-Wilson et al. 2021), further work in this regard is needed. An immediate opportunity for the BISE community is to carefully review existing CDR frameworks and conceptually synthesize them. In this, studying which domains need to be covered and governed by digital ethics and CDR beyond issues immediately related to data handling and technology design seems especially interesting. Other digital topics – digital business strategy (Bharadwaj et al. 2013), for example – have shown that digital issues increasingly permeate all areas of an organization. In light of these experiences, exploring the role of CDR across all of an organization’s areas will be important. Especially if future CDR conceptualizations seek to grow beyond mere theories of description (Gregor 2006), a closer investigation of the factors that influence the appropriateness of certain CDR regimes over others and of the effects of CDR adoption and compliance on, for example, firm performance is called for. Such efforts could build on the framework provided above as a conceptual scaffold but will need to uncover and refine the inherent conceptual structure of CDR as a phenomenon.
Beyond conceptual clarity and internal coherence (Suddaby 2010), future work addressing this open issue must also make sure to carefully reflect on contextual factors because contextual differences are likely to influence a phenomenon like CDR (Johns 2006). This seems particularly true given the increasingly globalized stage on which organizations act, especially in terms of their digital activities. In this regard, both moral norms and relevant cultural factors are likely to introduce additional complexity into CDR-related considerations. Correspondingly, contextual awareness seems critical when crafting CDR theory that avoids blind spots or tensions (Barkema et al. 2015).
Future conceptualizations of CDR also need to carefully consider how to advance their theoretical underpinning. Early frameworks such as Lobschat et al. (2021) or Herden et al. (2021) draw on arguments that are rooted in cultural considerations or Carroll’s (1991) pyramid, respectively. Future research could further expand these foundations by taking an institutional perspective, which seems particularly interesting when taking into account the pluralistic and often competing institutional logics (Berente et al. 2019) that also characterize moral dilemmas. Such an institutional perspective could also investigate how organizations deal with violations of their CDR norms and regimes, along with the kind of maintenance work that is essential to keep the corporation going (Lok and De Rond 2013). In a similar vein, future CDR research can also draw on a social mechanisms lens (Avgerou 2013) to explain how CDR regimes effectuate compliant behaviors.

5.2 Relation Between CSR and CDR

Continuing the thinking of this first tension, one of the issues in conceptualizing and better understanding CDR is rooted in questions regarding its relationship with CSR. While some argue that CDR is supposed to be conceptually and organizationally distinct from CSR (Lobschat et al. 2021) – following the arguments sketched out above – others suggest that CDR is subsumed in digitally-conscious approaches to CSR (Herden et al. 2021). This line of thinking seems to be supported by arguments which point out that some issues governed by CDR have strong interrelations with aspects traditionally governed by CSR (e.g., sustainability or diversity) (Doerr 2021; Wade 2020).
As discussed in reference to computer ethics earlier, this is not a new controversy. In the early roots of computer ethics, Moor (1985) argued that “computers provide us with new capabilities and these in turn give us new choices for action. Often, either no policies for conduct in these situations exist or existing policies seem inadequate” (p. 266). Similarly, Maner (1996) argues “that there are issues and problems that are unique to computer ethics [because of] an essential involvement of computing technology. Except for this technology, these issues would not have arisen, or would not have arisen in their highly altered form” (p. 152). While this thinking supports the idea that CDR should be considered separate from CSR, Johnson (1985) provides indirect counterarguments by proposing that the ‘digital’ creates new variants of known ethical problems and dilemmas and that commonly known ethical theories and moral norms can be transposed; some contextualization notwithstanding. Following this line of thinking suggests that terms such as computer ethics or digital ethics are no longer needed to single out a subset of ethical issues arising from the use of information technology. Computer technology would be absorbed into the fabric of life, and computer ethics would thus be effectively absorbed into ordinary ethics and CDR becomes a mere part of corporate responsibility (Bynum 2001; Müller, forthcoming).
With the original tensions in computer ethics still unresolved (or relegated to a matter of conviction at least), future research has the opportunity to investigate how CDR and CSR relate to one another – also in corporate reality. While the conceptual discussion of digital issues in the IS literature shared above – such as digital innovation or digital business strategy – provides arguments for a separate consideration, other research suggests that a parallel existence of different views on adequate behavior is not without risk because such norm fragmentation induces conflict in organizations (Diefenbach and Ullrich 2018). This can be interpreted as an argument supporting the integration of CDR into more general considerations of CSR (Herden et al. 2021).
Beyond such an either-or position, Weißenberger and Marrocco (2022) propose that CDR should be considered as a transversal function within CSR. Doerr (2021) similarly reasons that pertinent digital issues provide a new layer to problematics that traditional CSR approaches consider in relation to the physical world only.
Taking a longer-term view, thus far unrelated literature suggests that a separate consideration of digital phenomena might be a transient phenomenon; with a separate consideration sensible at first before intellectual traditions then fuse in the future (Mueller et al. 2021; Parmiggiani et al. 2020). This position seems to be able to reconcile the two camps in the long run, while at least temporarily allowing for the separate consideration called for by Mihale-Wilson et al. (2022).
To help diffuse this tension, future work will have to pick up and intensify the discussion of the conceptual home of CDR. On a very pragmatic level, work that seeks to position CDR as more than just a digitally-conscious approach to CSR should seek to identify and analyze examples of how CDR-related considerations challenge the status quo of CSR management. In any case, a more discursive perspective on theory – see Dirk Hovorka’s point in Bichler et al. (2016) – suggests that the emergent research community on CDR will likely reach beyond the traditional disciplinary boundaries that this tension implicitly draws. A related issue is how digital ethics are appreciated and represented in the larger discourses on (moral) philosophy. Müller (forthcoming) notes that topics related to digital ethics have only seen increasing recognition at institutions that are central to the philosophical community over the last five years, but is confident that the field is going to pick up the topic more intensely in the years ahead. Chances are that the CDR-vs-CSR tension will then be supplanted by an ethics-vs-applied-ethics debate.

5.3 Scope and Impacts of CDR

Historically, the earliest contributions on CDR propose that CDR is a voluntary commitment on the side of corporations (e.g., Driesens et al. 2017). Since then, corporations have been confronted with a series of changes that, at least in part, can be seen to mandate CDR-relevant aspects. GDPR, for instance, mandates standards of care and good practice in the context of personal data. Far from being a voluntary commitment, GDPR sets a lower regulatory bound that corporations’ CDR-related efforts have to incorporate. This debate raises two immediate issues that further research should investigate.
The first relates to voluntariness: research should seek insights into how this voluntariness shifts over time. Issues that emerge from any partial mandate (e.g., a focus on data alone) could also be of interest, especially when potentially competing norms arise (e.g., the EU’s current attempt to build a regulatory framework for AI).6 From this, a second issue highlights the question of the scope of CDR – both conceptually and within an organization. In terms of the former, a series of recent publications investigates CDR in the context of specific technologies – such as AI (e.g., Elliott et al. 2021; Hamadi and Manzo 2021) or the IoT (e.g., Kohlmann 2019) – or specific industries – such as digital services (Wirtz et al. 2021). Here, further research is needed to shed light on the question of whether CDR needs to be thought of as an integrated matter or whether more focused or domain-specific approaches are warranted; also as a possible remedy to potentially competing CDR approaches as hinted at in the previous paragraph. In terms of the latter, further research should study how corporations actually do CDR – whether they follow a centralized approach as advocated by Wade (2020) or whether a decentralized approach is more pertinent in practice (e.g., CDR implicitly incorporated in corporate efforts regarding IT security, business intelligence and data analytics, marketing, etc.). As an early and notable empirical contribution in this domain, Mihale-Wilson et al. (2021) reveal that consumers currently seem to value certain variants of CDR norms (mostly related to data security and transparency) more than others, which suggests that a focus on some areas of CDR over others could be an interesting strategy for corporations starting their efforts in this domain. On a more general level, such research contributes to a better understanding of how CDR efforts are perceived, evaluated, and rewarded or sanctioned by relevant stakeholders.
More indirectly, issues of voluntariness and scope can also have ripple effects on the question of impacts. Particularly, Lobschat et al. (2021) point out a potential tension between CDR approaches that are focused on the avoidance of negative consequences (loss of reputation, fines and liabilities, etc.) and those that employ CDR in efforts to establish and pursue a competitively relevant positioning in the market (e.g., toward investors, future talent, etc.); especially when acknowledging larger frameworks of corporate responsibility and the role digital technologies play (e.g., Elliott et al. 2021; Wade 2020).

5.4 Descriptive vs. Normative CDR Research

As in many disciplines that deal with matters of morality or social structure, the question of whether research on CDR should confine itself to objectively describing ‘what is’ or should actively strive to shape ‘what should be’ will also drive future discourse in the CDR community.
The domain of values and norms provides a good illustration: Should researchers limit themselves to discovering and describing which values and norms are being used in practice, or should they specifically seek to propose frameworks of norms and values that exert normative pressure on practice? And, if so, is there a boundary between frameworks that establish a baseline (i.e., a checklist to ensure that no relevant domains or dimensions have been overlooked) on one side and specific norms and values on the other? While it is beyond the scope of this article to argue for one of these approaches over the other, the thus far mostly conceptual literature on norms and values has focused on suggesting a series of approaches that can help govern digital technologies in practice – both outside of BISE (e.g., Brey 2012; Wright 2011) and within it (e.g., Mason 1986; Stahl et al. 2017) – not least because explicit empirical studies on the matter are still lacking. But as the empirically oriented literature on the matter expands, this tension will become more salient. Complementarily, greater salience of ethical aspects in BISE research will potentially require an update of the limited guidance on critical research (Myers and Klein 2011).
A variant of this research opportunity that is particularly interesting to the BISE community is the role of ethics in designing (increasingly autonomous) systems. Here, CDR-inspired work in BISE could be based on the recognition that designing systems is akin to the creation of new (life-)worlds, and that alternative designs can lead to vastly different trajectories for future development (Frank 2009). As such, especially normative or prescriptive work on CDR in BISE will likely be among the first discourses to heed recent calls for providing speculatively engaging futures through our research (Hovorka and Peter 2021). Corresponding research can further explore how CDR needs to be organized such that it (a) ethically guides the efforts of creating digital products and services, (b) makes sure that the design of the resultant artifacts reflects relevant norms of digital ethics, and (c) advances the ability of digital artifacts to exhibit ethical behaviors themselves. Any research in this vein will have to keep a close eye on the inherent duality of design theory as both a theory of design and a theory used for design (Gregor and Jones 2007).

6 Outlook

Currently, CDR seems to be gaining traction in both research and practice. Thus far, the emergent research landscape on CDR is mostly conceptual in nature, but a galloping development in practice will afford rich opportunities to explore the issue empirically. For example, the German Federal Ministry for the Environment, Nature Conservation, Nuclear Safety and Consumer Protection currently spearheads a ‘CDR Initiative’ in collaboration with a series of larger corporations with more or less direct ties to digital business.7 One of the goals of this initiative is to formulate a comprehensive ‘CDR Codex’8 which promises to provide guidance on aspects of both the content domain (i.e., identifying relevant norms) and the governance domain (i.e., how to implement and enforce responsible behaviors) of CDR. At the same time, the German Association for the Digital Economy (BVDW) is pursuing a collaborative effort with its members which has recently culminated in the release of a set of CDR best practices called ‘CDR Building Bloxx.’9 In Switzerland, the Swiss Digital Initiative attempts to establish a ‘Digital Trust Label’ that encourages corporations to consider and implement important CDR-related issues and seeks to establish itself as an important signaling device.10 Internationally, CDR frameworks emerge such as, for instance, the ‘Digital Ethics Compass,’11 the ‘Digital Responsibility Goals,’12 or the ‘CDR Manifesto.’13 This plurality and potential competition among the frameworks promise to offer interesting opportunities to observe (a) how these frameworks evolve and adapt, (b) how the ostensive understanding they incorporate gets enacted in practice, and (c) what factors influence one framework’s suitability over another. These questions also resonate with the future research opportunities recently proposed by Mihale-Wilson et al. (2022) who called for an intensified study of how to implement CDR in a corporation’s day-to-day operations.
At the same time, events like the ‘Digital Ethics Forum’14 are increasingly drawing larger audiences, hinting toward the expansion of the relevant discourse community in academia, practice, and public policy. All of these developments provide ample opportunity to further explore the domains of CDR highlighted above, their interplay, and the further development of the concept both from a descriptive as well as from a normative point of view. Especially considering the appreciation of BISE scholars toward theories for design and action, the relevance of CDR is likely to increase in our discipline.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References
Avgerou C (2013) Social mechanisms for causal explanation in social theory based IS research. J Assoc Inf Syst 14(8):399–419
Bharadwaj A, El Sawy OA, Pavlou PA, Venkatraman N (2013) Digital business strategy: toward a next generation of insights. MIS Q 37(2):471–482
Broadbent S, Dewandre N, Ess CM, Floridi L, Ganascia J-G, Hildebrandt M, Laouris Y, Lobet-Maris C, Oates S, Pagallo U, Simon J, Thorseth M, Verbeek P-P (2015) The onlife manifesto. In: Floridi L et al (eds) The onlife manifesto – being human in a hyperconnected era. Springer, Cham, pp 6–13
Diefenbach S, Ullrich D (2018) Disrespectful technologies: social norm conflicts in digital worlds. In: International conferences on usability & user experience and human factors and assistive technology, Orlando
Doerr S (2021) Corporate digital responsibility. Springer, Heidelberg
Fichman RG, Dos Santos BL, Zheng Z (2014) Digital innovation as a fundamental and powerful concept in the information systems curriculum. MIS Q 38(2):329–353
Frank U (2009) Die Konstruktion möglicher Welten als Chance und Herausforderung der Wirtschaftsinformatik. In: Becker J et al (eds) Wissenschaftstheorie und gestaltungsorientierte Wirtschaftsinformatik. Physica, Heidelberg, pp 161–173
Gregor S (2006) The nature of theory in information systems. MIS Q 30(3):611–642
Gregor S, Jones D (2007) The anatomy of a design theory. J Assoc Inf Syst 8(5):313–335
Grunwald A (2012) Technikzukünfte als Medium von Zukunftsdebatten und Technikgestaltung. KIT Scientific Publishing, Karlsruhe
Hadasch F, Li Y, Mueller B (2013) IS security policy enforcement with technological agents: a field experiment. In: 21st European Conference on Information Systems, Utrecht
Herden CJ, Alliu E, Cakici A, Cormier T, Deguelle C, Gambhir S, Griffiths C, Gupta S, Kamani SR, Kiratli Y-S, Kispataki M, Lange G, Moles de Matos L, Tripero Moreno L, Betancourt Nunez HA, Pilla V, Raj B, Roe J, Skoda M, Edinger-Schons LM (2021) Corporate digital responsibility. Sustain Manag Forum 29(1):13–29. https://doi.org/10.1007/s00550-020-00509-x
Johns G (2006) The essential impact of context on organizational behavior. Acad Manag Rev 31(2):386–408
Johnson DG (1985) Computer ethics. Prentice-Hall
Kohlmann P (2019) Kapitel 7: Corporate digital responsibility for internet of things technology. In: Spraul K (ed) Nachhaltigkeit und Digitalisierung: Wie digitale Innovationen zu den Sustainable Development Goals beitragen. Nomos, Kreuzberg, pp 165–182
Maner W (1980) Starter kit in computer ethics. Helvetia Press Natl Inf Res Center Teach Philos 3:1978
Mason RO (1986) Four ethical issues of the information age. MIS Q 10(1):5–12
Mingers J, Walsham G (2010) Toward ethical information systems: the contribution of discourse ethics. MIS Q 34(4):833–854
Mumford E (1995) Effective systems design and requirements analysis: the ETHICS approach. Macmillan, London
Myers MD, Klein HK (2011) A set of principles for conducting critical research in information systems. MIS Q 35(1):17–36
Nambisan S, Lyytinen K, Majchrzak A, Song M (2017) Digital innovation management: reinventing innovation management research in a digital world. MIS Q 41(1):223–238
Porter ME, Kramer MR (2006) Strategy & society: the link between competitive advantage and corporate social responsibility. Harv Bus Rev 84(12):78–92
Reynolds M (2011) Critical thinking and systems thinking: towards a critical literacy for systems thinking in practice. In: Horvath CP, Forte JM (eds) Critical thinking. Nova Science Publishers, New York, pp 37–68
Spinello RA (2000) CyberEthics: morality and law in cyberspace. Jones and Bartlett, Burlington
Stahl BC (2007) ETHICS, morality and critique: an essay on Enid Mumford’s socio-technical approach. J Assoc Inf Syst 8(9):479–480
Weißenberger BE, Marrocco A (2022) Corporate Digital Responsibility und ihre Integration in die Unternehmensführung. In: Roth S, Corsten H (eds) Handbuch Digitalisierung. Vahlen, Berlin, pp 41–58
Wessel L, Baiyere A, Ologeanu-Taddei R, Cha J, Blegind-Jensen T (2021) Unpacking the difference between digital transformation and IT-enabled organizational transformation. J Assoc Inf Syst 22(1):102–129
Winner L (1986) The whale and the reactor: a search for limits in an age of high technology. University of Chicago Press