
Open Access 12.02.2021 | Discussion

Token Economy

Authors: Ali Sunyaev, Niclas Kannengießer, Roman Beck, Horst Treiblmaier, Mary Lacity, Johann Kranz, Gilbert Fridgen, Ulli Spankowski, André Luckow

Published in: Business & Information Systems Engineering | Issue 4/2021


1 Unchaining the Token Economy Through Cross-Ledger Interoperability

Ali Sunyaev, Niclas Kannengießer
Transfers of ownership of assets (e.g., fiat money, company shares, or usage rights) between agents (here, individuals or organizations) are often mediated by trusted third parties (TTPs) such as banks or notaries to increase the reliability of the transfer process. The involvement of TTPs often introduces drawbacks, like increased costs, longer processing times, and the presence of single points of failure. These drawbacks motivate the automation and decentralization of several services offered by TTPs. Technological advances have enabled the digital representation and management of asset ownership using tokens on decentralized digital platforms without the need for TTPs. A token is a sequence of characters that serves as an identifier for a specific asset (e.g., a personalized usage right) or asset type (e.g., a cryptocurrency). The abilities to represent assets in the form of digital tokens on a decentralized digital platform and to assign ownership of these assets to agents in a fraud-resistant way can help to reduce drawbacks related to TTPs (e.g., the presence of single points of failure) and enable a new type of economy: the token economy. In tackling drawbacks related to TTPs, the token economy holds a large transformative value (Benlian et al. 2018) that can strongly affect businesses (e.g., by enabling novel business models and increasing the transparency of business processes) and our daily life (e.g., by enabling us to monetize our own personal data instead of just giving it away).
This chapter discusses the key concept of decentralization, which the token economy is built on, from two fundamental perspectives (i.e., technical and political decentralization) and provides propositions to discuss decentralization. Moreover, this chapter explicates the need for interdisciplinary research (e.g., information systems research, computer science, management science, and social science) to embrace both perspectives.
In the token economy, technical protocols take over several tasks that traditional TTPs previously handled. For example, technical protocols running decentralized digital platforms can check individual agents’ legitimate ownership of assets and create a tamper-resistant record of the transfer of their ownership. Moreover, the use of decentralized digital platforms can increase flexibility in using tokens because agents can implement and use tokens as identifiers for various types of assets (e.g., usage rights, land ownership, or money). By reducing the need for traditional TTPs and increasing flexibility, the token economy holds the potential to support collaborations and cooperation between agents (e.g., in terms of trust; Conway and Garimella 2020; Tian 2017). Moreover, the token economy ultimately allows for novel business models (e.g., decentralized crowdsourcing) and improved business relations (e.g., through increased transparency of business processes) by being able to transfer ownership of physical or digital assets using tokens.
Like traditional asset transfers, token transfers require strong security guarantees that decentralized digital platforms must reliably provide. In the token economy, for example, a decentralized digital platform must guarantee that users cannot simultaneously use the same tokens multiple times (i.e., double-spending), while being highly available and tamper-resistant. Many of the required security guarantees for business ecosystems are addressed by the security characteristics of distributed ledger technology (DLT), including fraud-resistance, high availability, and tamper-resistance. DLT enables the operation of a highly available, append-only distributed database (i.e., a distributed ledger) with distributed storage and computing devices (i.e., nodes) in an untrustworthy environment (Kannengießer et al. 2020a; Sunyaev 2019) – an environment with arbitrary occurrences of (temporarily) unreachable nodes or fraudulent actions (e.g., double-spending).
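To make these guarantees more concrete, the following minimal Python sketch (our own illustration; the class, method, and token names are hypothetical and not taken from any particular DLT protocol) shows an append-only transfer log that rejects transfers whose sender is not the token's current owner, i.e., a basic double-spending check:

```python
# Minimal, illustrative sketch of an append-only ledger with a double-spending
# check; heavily simplified, no consensus or cryptography is modeled.
from dataclasses import dataclass


@dataclass(frozen=True)
class Transfer:
    token_id: str   # identifier of the asset (the token)
    sender: str     # agent claiming current ownership
    recipient: str  # agent receiving ownership


class Ledger:
    def __init__(self) -> None:
        self._log: list[Transfer] = []    # append-only record of transfers
        self._owner: dict[str, str] = {}  # current owner per token

    def mint(self, token_id: str, owner: str) -> None:
        """Create a token and assign its initial owner."""
        if token_id in self._owner:
            raise ValueError("token already exists")
        self._owner[token_id] = owner

    def append(self, transfer: Transfer) -> None:
        """Append a transfer only if the sender is the legitimate owner."""
        if self._owner.get(transfer.token_id) != transfer.sender:
            raise ValueError("illegitimate transfer or double-spend rejected")
        self._log.append(transfer)        # the log is never rewritten
        self._owner[transfer.token_id] = transfer.recipient


ledger = Ledger()
ledger.mint("token-42", owner="alice")
ledger.append(Transfer("token-42", "alice", "bob"))      # accepted
# ledger.append(Transfer("token-42", "alice", "carol"))  # would be rejected
```

In a real DLT system, this check is performed redundantly by many nodes that agree on the order of transfers via a consensus mechanism, which is what provides the availability and tamper-resistance discussed above.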
From the microeconomic perspective, there can be multiple instances of the token economy. To create an instance of the token economy based on DLT, there are two principal creational options: first, to use custom tokens of a dedicated distributed ledger; second, to create tokens on an existing distributed ledger (Tönnissen et al. 2020). The first option is typically pursued by consortia of agents that operate a private distributed ledger (e.g., using a private Ethereum blockchain), where only authorized agents can join, read transactions from, and append new transactions to the distributed ledger (see Table 1; Kannengießer et al. 2020a). The second option requires an agent to select an existing distributed ledger on which to instantiate the token economy and to create custom tokens on it using a smart contract.
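As an illustration of the second creational option, the following Python sketch mimics the core logic that a custom-token smart contract typically encodes (account balances plus a transfer function). It is a conceptual sketch under our own simplifying assumptions, not code for a specific smart-contract platform; on Ethereum, for instance, token standards such as ERC-20 specify a comparable interface.

```python
# Conceptual sketch of a custom fungible token as it might be realized in a
# smart contract on an existing distributed ledger; names are hypothetical.
class CustomToken:
    def __init__(self, name: str, total_supply: int, issuer: str) -> None:
        self.name = name
        self.balances = {issuer: total_supply}  # issuer initially holds all tokens

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        """Move `amount` tokens from sender to recipient if the balance suffices."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount


# Conceptually, an agent deploys the token on the chosen ledger and transfers it:
share = CustomToken("ConsortiumShare", total_supply=1_000_000, issuer="issuer-A")
share.transfer("issuer-A", "member-B", amount=500)
```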
Each creational option has benefits and drawbacks. For example, private distributed ledgers are often more flexible in terms of establishing their own rules among the agents in the consortium, whereas public ones are more complex to regulate (e.g., because of their openness to arbitrary nodes). Nonetheless, using private distributed ledgers limits the transparency benefits for external agents that are provided by public ones and can hinder the use of tokens across token economy instances. Thus, network effects are reduced, which narrows the reach of the individual token economy instances.
Besides creational options, agents must gauge operational options. Operational options comprise the assignment of responsibilities to agents in the token economy instance (e.g., the operation of a node) and the selection among different distributed ledgers, considering their technical capabilities (e.g., transaction throughput). For example, private distributed ledgers often offer a shorter transaction confirmation latency compared to public-permissionless distributed ledgers, but have a smaller degree of decentralization and, thus, are less fraud-resistant compared to public ones (Kannengießer et al. 2020a). Despite a vast variety of possibilities related to operational options, trade-offs between DLT characteristics (e.g., availability vs. consistency) hinder distributed ledgers from simultaneously fulfilling the requirements of all token economy instances (Kannengießer et al. 2020a; O’Donoghue et al. 2019). This incapability fuels the development of new and specialized distributed ledgers designed for similar or different purposes (e.g., Ethereum and Tezos have a strong focus on decentralized computations, while IOTA focuses on supporting the Internet of Things with a lightweight protocol).
The increasing diversity of available distributed ledgers that can be used to instantiate a token economy, combined with the creational and operational options pertaining to the creation of token economy instances, causes heterogeneity within the token economy. This heterogeneity can isolate token economy instances because distributed ledgers are still hardly capable of decentralized interoperability (e.g., for the interaction between agents and the dynamic emergence of relations between agents; Moore 2006; Peltoniemi 2005).
Cross-ledger interoperability (CLI) is needed to solve the challenges related to heterogeneity and to realize the full potential of the token economy (e.g., regarding business process innovations; Weking et al. 2020). CLI refers to the communication between distributed ledgers, for example, to carry out cross-ledger asset transfers or to execute smart contracts across distributed ledgers (Kannengießer et al. 2020b). So far, progress on CLI has been largely theoretical (e.g., Back et al. 2014; Herlihy 2018; Zamyatin et al. 2019). Based on these theoretical contributions, researchers and practitioners have developed various CLI artifacts (e.g., Cosmos and Polkadot) from which first architectural patterns for the design of CLI artifacts have emerged, including notary schemes and sidechains (e.g., Deng et al. 2018; Kannengießer et al. 2020b; Koens and Poll 2018). These research contributions and practical implementations represent major advances in CLI, but also highlight new challenges, such as the atomicity of cross-ledger transactions and understanding of creational and operational options beyond the boundaries of individual distributed ledgers. We categorize these challenges into two groups: technical decentralization and political decentralization of CLI.
Technical Decentralization of CLI. The degree of technical decentralization of CLI increases (or decreases) with the number of distributed, interconnected nodes that operate independently without a central authority (e.g., a static leader node in consensus finding).
Determining the appropriate degree of technical decentralization currently represents a core challenge within CLI because it strongly affects the design of CLI artifacts (e.g., regarding the information flow between distributed ledgers) and their security characteristics (e.g., atomicity and availability). The degree of technical decentralization can be understood as a continuum that ranges from no decentralization to full decentralization. No decentralization of CLI implies that a single node (i.e., a connector) manages all communication between distributed ledgers. This single connector represents a single point of failure, which makes no decentralization most comparable to asset transfers using traditional TTPs (e.g., notaries). In contrast, full decentralization of CLI requires all nodes of a distributed ledger to have the capabilities to connect to any other node of any other distributed ledger, which eliminates that single point of failure but can cause large communication overhead. For example, when a consortium operates a private distributed ledger and decides to interoperate with another one, each consortium member may set up its own connector to enable interoperability and avoid dependencies on other consortium members’ (potentially fraudulent) connectors. Each consortium member is then in charge of maintaining its own connector, which increases the overall maintenance effort for the distributed ledger compared to the use of only one connector for the entire distributed ledger (no decentralization). Because of the individual benefits and drawbacks of the different degrees of decentralization, agents must individually decide which degree of decentralization suits their purpose(s) when connecting one distributed ledger to another.
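A simple back-of-the-envelope comparison (our own illustration with hypothetical node counts) makes this trade-off tangible: no decentralization needs only a single connector but concentrates all cross-ledger traffic on one node, whereas full decentralization between a ledger with n nodes and one with m nodes requires on the order of n·m connections.

```python
# Illustrative comparison of communication links for CLI between two
# distributed ledgers with n and m nodes; the numbers are hypothetical.
def links_no_decentralization(n: int, m: int) -> int:
    return 1          # a single connector mediates all cross-ledger traffic


def links_full_decentralization(n: int, m: int) -> int:
    return n * m      # every node can reach every node of the other ledger


n, m = 20, 50
print(links_no_decentralization(n, m))    # 1 link, but a single point of failure
print(links_full_decentralization(n, m))  # 1000 links, no single point of failure
```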
CLI can be achieved in a direct or indirect manner (see Fig. 1). In direct CLI, neither a traditional TTP nor a decentralized digital platform is required to enable interoperability between separated distributed ledgers; nodes of one distributed ledger can directly communicate with those of the target distributed ledger. In contrast, indirect CLI requires a (centralized or decentralized) TTP that mediates the communication between distributed ledgers. Although direct CLI is desirable (e.g., because of fewer single points of failure), indirect CLI facilitates interoperability between multiple distributed ledgers because individual distributed ledgers must only comply with the specifications of the CLI artifact instead of the specifications of all target distributed ledgers.
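The following Python sketch illustrates the indirect variant in simplified form (the interface and class names are our own assumptions, not a real CLI artifact's API): each distributed ledger only needs to implement the minimal interface expected by the CLI artifact, and one or more connectors relay transfers between ledgers.

```python
# Simplified sketch of indirect CLI via a connector (notary-style); hypothetical
# interface, no inclusion proofs, finality checks, or consensus are modeled.
from typing import Protocol


class LedgerInterface(Protocol):
    """Minimal interface a distributed ledger exposes to the CLI artifact."""
    def read_outgoing_transfers(self) -> list[dict]: ...
    def append_incoming_transfer(self, transfer: dict) -> None: ...


class Connector:
    """Relays transfers between two ledgers; in a more decentralized setup,
    several independent connectors would perform this role redundantly."""

    def __init__(self, source: LedgerInterface, target: LedgerInterface) -> None:
        self.source = source
        self.target = target

    def relay(self) -> None:
        for transfer in self.source.read_outgoing_transfers():
            # A production connector would first verify inclusion and finality
            # of the transfer on the source ledger before relaying it.
            self.target.append_incoming_transfer(transfer)
```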
The limited interoperability between distributed ledgers (e.g., caused by their large heterogeneity) resembles the incompatibility of (early) electronic data processing approaches before international standards were introduced (e.g., SWIFT for financial transactions). To facilitate communication between heterogeneous distributed ledgers, interfaces and procedures are required (e.g., as in electronic data interchange during supplier onboarding or in open banking). For example, standardized interfaces broaden the compatibility of CLI artifacts with more distributed ledgers and ease the development of flexible wallets supporting decentralized cross-ledger transactions.
Proposition 1: The design and productive operation of token economy instances require agents to balance centralization and decentralization for distributed ledgers and CLI.
Political Decentralization of CLI. Political decentralization of CLI refers to the degree of equal distribution of permissions and responsibilities across all agents that independently act according to their individual incentives and work jointly on a common goal. Current studies point out the complexity of political decentralization, for example, regarding decentralized governance and related political challenges (e.g., the organization of decision rights; Beck et al. 2018; Reijers et al. 2016). To date, agents usually take part in the governance of the distributed ledger(s) to which they contribute (e.g., by providing a node). With CLI, agents can create cross-ledger business relations, which will affect the creational options (e.g., because of potentially larger network effects) and the governance of the involved distributed ledgers. For example, tokens stored on the target distributed ledger will become usable for agents of other token economy instances. Technical advancements can become more difficult because of the additional dependencies between distributed ledgers. These exemplary challenges point out the need for decentralized cross-ledger governance beyond the widely discussed decentralized governance of individual distributed ledgers.
Depending on the degree of technical decentralization and the choice of direct or indirect asset transfers, decentralized cross-ledger governance may introduce political centralization through a hierarchical organization of nodes. For example, when a consortium uses a single connector maintained by a single agent (low degree of technical and political decentralization) to achieve indirect interoperability, a two-level hierarchical structure emerges from the roles of the different nodes: regular nodes in the distributed ledger and the connector. The emergence of this hierarchy indicates a low degree of technical and political decentralization because permissions and responsibilities are not equal among nodes. In the previous example, the single agent controlling the connector represents a kind of TTP, introduces a single point of failure, and can impede CLI and the token economy instances on separate distributed ledgers, for example, by delaying or blocking agents’ asset transfers or becoming a performance bottleneck. Agents of the separate distributed ledgers depend on a few agents controlling the respective connectors (e.g., regarding the implementation of technical updates for CLI), potentially giving rise to hierarchy in the governance of CLI and, eventually, of the individual distributed ledgers. In full technical decentralization with direct interoperability, all nodes (and consequently all agents that control nodes) reside on the same hierarchical level. This favors democratic decisions of nodes across distributed ledgers, the integration of CLI artifacts, and interoperability with token economy instances across distributed ledgers.
Proposition 2: The degree of decentralization of the token economy depends on the degree of decentralization of individual token economy instances and their interoperability considering multiple perspectives (e.g., political and technical).
Drawing from the introduced limitations of separated token economy instances and our propositions, we conclude that there is a pressing need for CLI resulting from the inherent characteristics of the token economy that reflect those attributed to business ecosystems (e.g., competition and cooperation between organizations). A single (kind of) CLI artifact will not interconnect all distributed ledgers, and we will witness a large heterogeneity of distributed ledgers and CLI artifacts that use the full spectrum of the degree of decentralization and direct and indirect CLI. For example, no decentralization of CLI could be used for confidential data management, while full decentralization of CLI better prevents censorship across distributed ledgers.
The consideration of human actors in the design of distributed ledgers and CLI artifacts points out the complex and strong interdependence between technical decentralization and political decentralization in achieving true decentralization. The complexity of this interdependence makes it challenging to understand the relationship between the social and the technical and to explain the effects of decentralization on the token economy and information systems (IS) in general when only one of the two domains is considered. To truly understand the decentralization of IS, research should combine the social and the technical and take a sociotechnical perspective on decentralization (Sarker et al. 2019). Thereby, IS research can be the missing link between the technical and the political perspectives on decentralization.
To discuss emerging areas for interdisciplinary research related to technical and political decentralization in the token economy and especially point out the importance of IS research and innovations in this emerging field, we invited researchers and practitioners with different foci on the token economy. The invitees present emerging trends related to the token economy and draw propositions from their scientific and practical works.
Roman Beck (European Blockchain Center, IT University of Copenhagen) sheds light on the standardization in DLT from a macroeconomic perspective and discusses the token economy as a foundation for network goods represented as tokens.
Horst Treiblmaier (Modul University Vienna) synthesizes the concept of sustainability with the many facets of the token economy and concludes with an agenda for token sustainability. Thereby, he clarifies the role of the token economy based on decentralized digital platforms to innovate existing business models and even create new ones.
Mary Lacity (Blockchain Center of Excellence, University of Arkansas) introduces existing applications of the token economy in supply chain management to track and trace assets and shows novel directions for future research and practice.
Johann Kranz (Ludwig-Maximilians-Universität Munich) elaborates on DLT systems' potential for decentralizing the digital economy. He argues that decentralization is a possible remedy to mitigate the excessive concentration of epistemic and economic power of a few dominant firms which raises increasing social, political, and economic concerns.
Gilbert Fridgen (University of Luxembourg) elaborates on the innovations emerging from the token economy from the perspectives of industry and individuals and discusses the decentralization of token economies.
Ulli Spankowski (Börse Stuttgart) discusses the innovations through the token economy from the finance perspective and proposes potential transformations of the financial market through tokenization based on DLT.
André Luckow (BMW Group) presents the potential of the token economy for the automotive industry to improve supply chain transparency, using the example of the DLT system PartChain.

2 Standardized Tokens as Network Goods and Source of Value Creation

Roman Beck
The token economy depends on the widespread acceptance and use of interoperable DLT protocols as an interaction standard in order to benefit from positive network effects in inter-organizational networks and to avoid challenges that can occur when different DLT protocols compete in the same industry or market. Competition between DLT protocol standards can jeopardize value creation in inter-organizational networks, as DLT protocols share characteristics of club, common, and public goods that only unfold their full potential when assimilated widely. Once a DLT protocol is instantiated in a specific way, it is called a DLT system.
In other words, the standardization of DLT protocols to harvest their potential benefits is more complicated than the standardization of goods that predominantly have a stand-alone value, as is typically the case with private goods. Furthermore, tokens cannot unfold their potential unless a supporting system is in place, comprising commonly accepted norms, agreements off-ledger, and technical norms and rules enforced on-ledger. Only after commonly accepted sociotechnical systems have been assimilated can value creation in inter-organizational networks take place.

2.1 Tokens as Public Good

The reason why companies struggle to realize the potential of the emerging token or DLT economy is that it is not about innovating a private good, but about innovating a network good, which shares more similarities with public goods than with private goods. Commercial models generally only address private goods scenarios.
Public goods are also referred to as “collective consumption goods” (Samuelson 1954, p. 387), as they can be used simultaneously while at the same time nobody can be excluded from their use. From this, the principles of non-rivalry and non-exclusion in consumption are derived to classify goods (Samuelson 1954).
If a good, such as a token standard, is characterized by non-rivalry, then the consumption of the standard is not interfered with by the simultaneous use of the same standard by someone else. Several entities can use the same good to the same extent under the same conditions. If a good possesses non-exclusion properties, then no one can be excluded from the consumption of the good. The ability to exclude someone from consumption is a necessary condition for the supply of private goods. However, the ability to exclude someone is not given per se, but by the assignment of property rights (Musgrave 1959, p. 9).
A pure public good is given if a good fulfills both criteria. The opposite is the pure private good, characterized by rivalry in consumption and the possibility to exclude others from consumption: if a unit of the private good is consumed, then it is no longer available to other consumers. Table 2 depicts a classification of pure public and private goods together with possible other combinations. Club goods are characterized by the fact that consumers are excludable and that consumption is – to a certain degree – non-rival. Typical examples of this category are permissioned public DLT systems where access is only granted if certain conditions are fulfilled. Common goods possess rivalry in consumption, but exclusion is not possible.
Proposition 3: DLT-based tokens have public goods characteristics as a driver of value creation in inter-organizational use cases.

2.2 Standards and Network Goods

Digital goods constituting network standards, such as the internet, can be used anywhere in the world (Shapiro et al. 1998) and, thus, are not restricted to a certain geography or jurisdiction. Hence, the scope and potential application of digital goods is considerably broader compared to physical goods (Choi et al. 1997), which is why the standardization of DLT systems in general and DLT-based tokens specifically is of enormous macroeconomic importance for decades to come.
Digital public goods with strong positive network effect characteristics are called network goods. Network goods constitute quasi-standards as they require a critical mass of significant size which is why one will rarely find them covering just a small market segment (Economides and Himmelberg 1995). DLT systems and tokens based upon them are network goods, which extend the traditional goods classification as illustrated in Table 2, as will be explained in the following and is illustrated in Table 3 (adapted from Beck 2007).
Pure private goods are characterized by perfect competition in all product and factor markets, perfect information (complete, accurate, and freely available) on the relevant prices and characteristics of products and factors, and perfect mobility of all resources. Furthermore, and in a direct distinction to public goods, pure private goods must not have any kind of externalities (positive or negative) in the production and consumption of goods or any other interdependence in consumption between consumers. To guarantee a functioning market, private goods must always have an excludability property, meaning that everyone but the buyer of the good is excluded from its benefits. Some goods have the characteristics partly of private goods (no effects or spillovers on third parties in the case of a pure private good) and partly of public goods at the same time. It is possible for the market to produce such club goods to a limited extent (Buchanan 1965), but not at an appropriately satisfying level for all market participants.
DLT systems can be implemented as permissioned public systems, where access must be granted by a governing body (club network goods), or as permissionless public systems (pure network goods). In both cases, collective use is not only possible but necessary, while the value of such a public good is not fixed but rather increases with the number of collective users.
Proposition 4: DLT systems and tokens based upon them are network goods where collective use is not only possible, but necessary to increase value creation in inter-organizational use cases.

2.3 Token Standardization and Network Effects

Network externalities can be considered as effects “in which the equilibrium exhibits unexploited gains from trade regarding network participants” (Liebowitz and Margolis 1995). These gains can only be realized if the same solution is used across the whole network, which is why standardization is important. Network effects exist horizontally among users of DLT systems (direct) and vertically from the availability of supporting products and services (indirect). In both cases, the individual assimilation decision to use a certain DLT system and related tokens affects the assimilation behavior of other market participants. This interdependency is characterized as bandwagon, herd, avalanche, and Veblen effects (Ceci and Kain 1982; Choi 1997; Leibenstein 1950).
Network-effects-generating DLT systems call for a large number of users in order to generate value in inter-organizational networks. As the assimilation outcome of network goods can lead to multiple market equilibria (Arthur 1983, 1989), a formalized standardization process can guide the assimilation process toward a collectively preferred and stable outcome. As network goods tend to create natural monopolies with strong lock-in effects, it is crucial to define standards for DLT systems that account for economic and societal implications.
Proposition 5: Standardization of DLT systems and tokens based upon them that are characterized by strong direct and indirect network effects has strong economic implications for providing value in inter-organizational use cases.

2.4 Standards and Value Creation

Like any profit-oriented organization, operators of existing DLT-based commercial models mainly focus on maximizing revenue, i.e., building an installed base of users, increasing market share, and keeping operating costs low. When an operator of commercial DLT systems considers interoperability with other DLT systems, standards are needed, but the integrity of the existing commercial model also needs to be assured. The commercial value of standards can lie in engaging with a complementing system (e.g., a logistics chain connecting to a finance chain) or a competing solution (i.e., two logistics chains).
In the case of cooperation with a complementing system, the purpose is to make one’s own DLT-based services more complete, e.g., through the exchange of tokens, so that users may find it beneficial to interact and conduct transactions between chains, which in turn increases the number of users of both systems. In the case of cooperation with a competing system, the commercial situation is more complicated. While the standards-based ability to cooperate may increase the number of transactions that otherwise could not be generated, the risk emerges that users may migrate from one DLT system to another, thereby abandoning one or the other completely. This can slow the growth of the user base or even bring about a decrease as users move to the competing solution. In this case, a DLT system standard allows for competition within the same system, among two or more competing DLT systems and token providers.
Nevertheless, similar to the complementary cooperation scenario, it must be ensured that the potential increase in operating costs does not exceed the expected positive network effects. Thus, comprehensive preparation and risk mitigation strategies need to be developed upfront before one engages in cooperation with a competing system.
Apart from direct commercial models of DLT system cooperation, there are also indirect commercial models that enable standards-based cooperation among DLT systems. For example, some DLT cooperation solutions are based on an intermediary platform, e.g., Blockchain-as-a-Service (Kernahan et al. 2021), a commercial model that has emerged to meet the need for cooperation. An overview of standards-based cooperation forms and related value creation in inter-organizational systems is illustrated in Table 4. The cost of outsourcing interoperability services to a provider should be less than the cost of self-implementing interoperability with other DLT systems.
Proposition 6: Standardization of DLT systems and tokens based upon them enables new forms of cooperation to generate value directly and indirectly in inter-organizational use cases.
Where DLT systems interact, the management of user roles, permission rights, and the exchange of tokens indicating ownership becomes more complex. Governance instruments need to be agreed upon in interoperability relations that clarify decision rights, the distribution of incentives, and accountabilities between two or more involved DLT and non-DLT systems, enacted on-chain in the autonomous interplay between chains, or off-chain in a clearing and settlement process that requires organizational involvement.

3 The Token Economy and Sustainability: Silver Bullet or Hype?

Horst Treiblmaier
As of 2020, it is hard to ignore the substantial problems that mankind or, more precisely, the planet earth faces. Societal and economic injustice result in armed conflicts all over the world, while rapidly dwindling resources and increasing environmental pollution are altering the face of the earth. The environmental forecast is particularly alarming with extreme weather incidents increasing, species disappearing and substantial damage being done to the oceans, all of which will dramatically affect societies and economies (Winston 2018). In view of these threats, it is not surprising that new technologies are eagerly scrutinized regarding their ability to contribute to a sustainable environmental, social, and economic development. To shed light on how tokens and token economies can have a positive impact on sustainability, we first need to clarify the underlying terms, namely sustainability and token economy, the latter of which is based on distributed ledger technology (DLT).

3.1 The Broad Concept of Sustainability

A widely used sustainability definition stems from the World Commission on Environment and Development (1987, p. 37): “Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs”. According to Murphy (2012) there are three reasons why sustainability is a so-called wicked problem for which no easy solution exists. First, the consequences of unsustainable practices may be distant in both time and space. Second, successful local micro sustainability initiatives do not necessarily work on a greater scale. Third, numerous sustainability threats require instant change, but different views of sustainability embedded in our society impede fast solutions.
The huge importance of sustainability was recognized by the IS community and led to several calls for an increased contribution (vom Brocke et al. 2013). For example, more than a decade ago a research agenda was already proposed to establish the new subfield of energy informatics, which applies IS thinking and skills to increase energy efficiency (Watson et al. 2010). Yet, the current sustainability situation demands increased efforts and “sustainability should be a core imperative in IS research” (Seidel et al. 2017, p. 46).
The so-called weak sustainability perspective, which is endorsed by many international organizations and nation states, advocates a substitution of natural capital by man-made capital. In contrast, the strong sustainability perspective, mostly endorsed by ecological economists and natural and social scientists, rejects this assumption of substitutability (Daly and Cobb 1989). More specifically, it considers a healthy environment to be the basis for further social and economic development. In response to the manifold environmental, economic and societal problems, the 193 member countries of the United Nations adopted a comprehensive set of 17 Sustainable Development Goals (SDGs) in the 2015 United Nations General Assembly. These SDGs comprise numerous subgoals that include a wide array of environmental and humanitarian objectives (United Nations 2018). While undoubtedly all of these goals are important, the broad coverage of the goals makes it fairly easy to label a specific project or technology application as “sustainable” – especially, if the sum of its impact is not considered and a weak sustainability perspective is taken that allows for the substitution of capital.

3.2 The Many Facets of the Token Economy

Tokens residing on distributed ledgers come in many forms and shapes and offer a huge number of different use cases (Tönnissen et al. 2020). They can be generated at the protocol layer inherent to a specific ledger, in which case they are frequently labeled as ‘coins’, or they can reside on the application layer, where they are minted by smart contracts. Tokens can represent digital or physical assets and enable alternative and direct ways of raising capital in the form of initial coin offerings (ICOs), security token offerings (STOs), or equity token offerings (ETOs), with the latter two gaining increasing importance in recent years (Kranz et al. 2019). Tokens can also cater for payment purposes as a digital representation of value that is not created by a central bank, while utility tokens grant access rights to specific services. A core characteristic of tokens is the ease with which they can be created and distributed. In many cases, they enable a direct interaction between token users and token creators, which facilitates complex market structures by avoiding intermediaries. Considering the numerous forms that tokens can take, it is not surprising that many of their (envisioned) use cases can have an environmental, social, or economic impact.

3.3 Tokens as Enablers and Drivers of Sustainable Development

The most notorious association between distributed ledgers and their environmental impact is presumably the huge energy consumption of Bitcoin, caused by the underlying proof-of-work (PoW) consensus mechanism that is required to secure the public and permissionless distributed ledger using automated and decentralized governance. An objective and thorough assessment of the total environmental impact of Bitcoin in light of its (envisaged) societal benefits goes far beyond the scope of this brief discussion but is urgently needed for a comprehensive and fair evaluation of its total impact (Sedlmeir et al. 2020). At least, this example illustrates how the manifold positive effects of a cryptocurrency do not come without detrimental side effects, neither of which can be easily assessed in terms of sustainability. In this regard, it is also noteworthy that alternative consensus mechanisms exist or are currently under development that might soon serve as alternatives or supplements to PoW. It is therefore crucial to point out that DLT, or, more specifically, its mix of fundamental building blocks, including linked timestamping, digital cash, proof of work, Byzantine fault tolerance, asymmetric cryptography, and smart contracts (Narayanan and Clark 2017), is under constant development, and every use case needs to be assessed regarding its specific implementation and outcomes rather than asserting that the whole token economy will exert a specific impact on sustainability.
Considering the lack of specificity of the fundamental concepts surrounding distributed ledgers, the question arises of whether the use of tokens can substantially contribute to a more sustainable development in the first place, and, if so, how such an impact can be objectively assessed. As a gross classification, two types of use cases need to be differentiated, namely those that provide an improvement over the current status quo and those that would be impossible without an underlying token.
In the first group of use cases, tokens act as moderators that make the current flow of operations smoother and, in doing so, create a positive effect on sustainability. An exemplary use case is the trading of CO2 certificates, where carbon credits are converted into tokens that can easily be traded via a distributed ledger. The facilitation of trading in combination with increased transparency is supposed to yield several positive outcomes. For example, the efficiency of decentralized trading platforms can make peer-to-peer trading viable and allow for the trading of energy within urban neighborhoods. Additionally, enhanced transparency can help to create a system that is less prone to fraud and is resilient against, for example, the manipulation of measurements, the sale of fake carbon credits, or the theft of such credits. The security of distributed ledger tokens thus prevents many fraudulent activities and presents a solution that is robust to fraud (Lockley et al. 2019).
Another example is the application of tokens in supply chains to improve transparency and yield desirable effects such as improved food safety, reduced food waste, and provable fair labor conditions. In this context, tokens representing assets constitute an important building block of a more comprehensive concept that is labeled the “Physical Internet”. This concept aims to combine physical, informational, and financial flows to create value chains that operate with the highest possible efficiency, flexibility, and productivity. The resulting gains pertain to environmental (e.g., reduction of emissions), social (e.g., fair working conditions and wages), and economic (e.g., fair sharing of revenues) sustainability (Treiblmaier 2019).
Taken together, all these positive effects would be much harder to achieve without an underlying distributed ledger and digital tokens that represent specific assets and allow for easy tracking and tracing of physical and data flows.
Proposition 7: DLT-based tokens increase the efficiency of existing sustainability use cases.
In the second group of use cases, tokens provide a ‘conditio sine qua non’. An example is the creation of a monetary system that prioritizes financial inclusion and the provision of a stable monetary supply that fosters long-term economic development. History has shown that in the long run governments and central banks cannot resist the temptation to increase the monetary supply with the goal of supporting the economy in case of a recession, but also of bailing out banks and financing wars (Ammous 2018). The consequence of such an increase is usually a fall in interest rates, which should increase economic activity but regularly also causes inflation. An economy in which the money supply is regulated by code rules out all temptations to interfere with the monetary system, spawning numerous positive and negative implications. The former help to create sustainable societies from an economic and social perspective.
The potential of DLT does not stop here, as is evidenced by so-called demurrage currencies, in which the respective token loses value over time. This is intended to encourage spending rather than hoarding currency in order to support the economy and human wellbeing (Leonard and Treiblmaier 2019). On a small scale, experiments with demurrage currencies have shown positive effects on communities in Austria and Germany during times of recession in the 1930s, but they were never applied on a large scale due to a lack of scalability and, even more importantly, strong resistance from the Austrian central bank. Of course, this example should not be misunderstood as a recommendation to try out a rather untested and potentially risky monetary system but should serve as an example of how technology might enable visionary ideas whose realization has so far been impossible from a technical perspective. Other use cases, which might sound less disruptive, include the transformation of market structures caused by disintermediation, as is currently happening through the tokenization of value networks (Lohmer and Lasch 2020).
Proposition 8: DLT-based tokens enable novel and innovative sustainability use cases.

3.4 A Token Sustainability Agenda

Given the growing popularity of DLT and token economies, it is foreseeable that a substantial number of upcoming studies will investigate the application of tokens to create a more sustainable future. While this development is laudable, it is also crucial that every study clearly describes both positive and negative effects which result from the deployment of tokens.
Rather than simply labeling a specific token project as sustainable and highlighting one specific sustainability dimension, the sum of all external effects needs to be described as thoroughly as possible. For example, tokens that allow for financial inclusion, but rely on public DLTs using a PoW-based consensus mechanism might help to alleviate poverty (SDG1) and reduce inequality (SDG10), but simultaneously might not be energy efficient (SDG12) and, thus, have a negative impact on climate change (SDG13). Chances are that in most cases such trade-offs cannot be avoided. Obviously, this should not stop the industry and academia from developing and applying token-based solutions for pending problems and researchers from investigating how the token economy can help to improve sustainability. Rather, an open and critical discussion is needed which includes the positive and negative effects that a specific (token-based) solution creates. A comprehensive research agenda for the impact of token economies on sustainability thus necessitates a common understanding of the core terms, the development of measurement tools that allow for the comparability of use cases, and the creation of agreement on what sustainability actually means and which of its many dimensions deserve priority.

4 Tracking Assets in Supply Chains with Distributed Ledger Technologies

Mary Lacity
For the past three decades, companies have been improving internal supply chain operations through the adoption of ERP, sensor devices, Six Sigma quality improvement programs, and improved inventory management practices. Many supply chain operations are lean, but only within the boundaries of the firm. Across firm boundaries, supply chain partners still face significant challenges trying to synchronize the data about the flow of physical goods with the actual flow of goods. Because each partner independently maintains its own systems of record, the records often do not match across supply chain partners, which results in disputes, delays, lost products, and expensive reconciliations. For example, 70 percent of invoices among Walmart Canada’s freight carriers were disputed due to inconsistent data about the location, status, and pricing of freight (Wolfson 2020). Distributed ledger technology (DLT) offers a new approach based on asset tokenization, smart contracts for processing transactions pertaining to the asset, and a tamper-proof ledger shared by all authorized parties (van Hoek et al. 2019; Westerkamp et al. 2018). When Walmart Canada and its freight operators adopted DLT, invoice disputes fell from 70 percent to less than two percent (Wolfson 2020).
Several other DLT applications for tracking assets across supply chain partners are in use today, led by GE Digital, Golden State Foods, Everledger, the IBM Food Trust, MediLedger, TradeLens, VeriTX, and EY WineChain. When trading partners agree to the status of an asset, business value results in terms of lower transaction costs due to fewer disputes, better authentication of assets, counterfeit prevention, better product quality and freshness, and less time to process food and drug recalls.
In our research, we explore how assets are tokenized in DLT applications to automate transactions for supply chain partners (Lacity 2020). While tokens may represent fungible (non-unique) assets like cryptocurrencies, loyalty rewards, and airline miles, our interest is in using tokens to represent non-fungible assets, i.e., a particular asset in the real world. For example, a token is created by hashing a unique ID (UID) to represent a particular diamond, shipping container, medical device, plot of land, work of art, or sellable unit of a pharmaceutical or food product. Smart contracts are programmed to process sensor readings so that when events happen to the physical asset – such as a physical movement, a change of ownership, a change of status, or a physical transformation – its digital counterpart is updated on the DLT application. The supply chain partners, if given permission rights, all store identical replications of a tamper-resistant record of events.
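The hashing step itself can be very simple, as the following Python sketch shows (illustrative only; the UID format is hypothetical, and real applications add ledger-specific encodings, signatures, and access control):

```python
# Illustrative sketch: derive a token from a unique identifier (UID) by hashing.
import hashlib


def tokenize_uid(uid: str) -> str:
    """Return the hex-encoded SHA-256 digest of a UID; the digest, rather than
    the raw UID, is what gets recorded on the distributed ledger."""
    return hashlib.sha256(uid.encode("utf-8")).hexdigest()


# Hypothetical UID of a sellable unit (e.g., a box of beef patties):
token = tokenize_uid("PLANT-AL-2020-10-07-BOX-000123")
print(token)  # 64-character hexadecimal token identifying the physical asset
```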
Studies of DLT applications revealed three ways that first generation DLTs tokenize UIDs: stickers/stamps, branding/watermarks, and self-identification (see Table 5). Each approach has its benefits and limitations.
Stickers/Stamps. The enterprise assigns and adheres a machine-readable sticker or stamp of a unique identifier (UID) to the physical product and creates a token of the UID by hashing the unique ID. Golden State Foods and EY’s WineChain use this approach.
Golden State Foods (GSF), a $5 billion US-based Foods Services company, uses RFID, IoT sensors and DLT to ensure product freshness of its beef patties through the entire supply chain. At the GSF manufacturing plant in Alabama, GSF encodes a UID for a sellable unit (a box of patties) in an RFID tag for the physical asset and records the hashed value on a permissioned DLT application. Boxes are loaded and sealed into a pallet which then gets a unique pallet ID that is also recorded on the distributed ledger. Pallets also have IoT temperature sensors that automatically update the DLT application along the supply chain. The sensor is read as it exits the manufacturing plant, on all the trucks involved with delivery, at the distribution center, and at the retail stores, which must have IoT devices within their refrigerators. After final delivery, GSF continues to assist the retail store owners by notifying them if a box is about to expire or if their cooler is not maintaining proper temperatures (such as if an employee failed to close the cooler door). This way, GSF helps retailers ensure that their beef patties are always safe and fresh (Zemsky et al. 2020).
EY developed WineChain to restore trust in the wine supply chain. Wine fraud is a chronic problem, with over €2.7 billion worth of counterfeit wines and spirits sold to Europeans each year (Smith 2017). With WineChain, each wine bottle gets a unique QR code that is posted to the public Ethereum blockchain. Customers can scan the wine with their smartphone and get verification of whether the wine is legitimate. Wine producers, brokers, importers, wholesalers, distributors, and retailers rely on a private Ethereum blockchain to track the bottle as it moves through the supply chain. As of April 2020, 15 million bottles had been tokenized and more than 100 wineries participated (Lacity 2020).
Stickers and stamps are inexpensive and do not require specialized readers. However, de-coupling of the physical asset and its digital counterpart may occur on a small scale, for example, if stickers are damaged or removed. Additionally, small-scale counterfeits might occur, say by consuming a fine wine, re-using the wine bottle, and re-selling the fake, but the perpetrator would be caught if they tried to scale the scheme; the winery still has chain-of-custody visibility and would be able to pinpoint the common source of repeated quality complaints and identify this discrete point of failure in the process.
Branding/Watermarks. A physical product can be branded, similar to livestock branding or to watermarks embedded in paper money. With this method, the asset is assigned a UID and a machine-readable version of the UID is embedded into the physical product. The branding or watermark ensures the physical product is always properly identified. For now, this method only works for durable goods, not for food, beverages, or other liquids. VeriTX uses this approach.
VeriTX is a platform for decentralized manufacturing, where customers print parts where they need them, when they need them. Customers can buy tamper-resistant printing instructions from sellers on the platform. VeriTX explored a number of approaches to represent assets. One way was to embed a unique hash value as a “watermark” in the printed part that can be viewed with a camera on a smartphone. The hash value is permanently stored on the distributed ledger at the time of origin. Additionally, the DLT application will also store the part’s every movement and every transfer of ownership, thus enabling the part to be tracked through the supply chain (Lacity 2020).
Self-Identification. With self-identification, the physical asset itself can serve as its own identifier for non-fungible assets. The physical asset is scanned to extract physical or chemical fingerprints to create a UID, which is then tokenized on a distributed ledger. Everledger and VeriTX use this approach.
Everledger tracks diamonds from mines to retail stores. Founded in 2015, Everledger aims to help stop “blood diamonds” – diamonds mined to finance conflicts in such places as Sierra Leone, Liberia, Angola, and the Ivory Coast – by better tracking the warranties associated with fair trade practices established by the United Nations in 2003. Known as the “Kimberley Process Certification”, the process requires sellers of rough and polished diamonds to insert a warranty declaration on invoices. Everledger creates a unique digital token of the physical diamond by specifying 40 metadata points using high-resolution photographs. Over 1 million diamonds were represented on the ledger as of March 2017. Everledger has since expanded its business model to track and trace other valuable assets, such as gemstones, luxury goods, art, wine, e-recycling, and antiquities (Lacity 2020).
Returning to VeriTX, the company also creates a unique fingerprint based on 54,000 surface characteristics of the grain structure of the part. VeriTX tested the sensitivity of the UID by dropping parts on concrete floors, grinding them with handheld motors, and grit blasting them. The fingerprint was still identifiable even when only 3,700 unique surface features remained on the original part. This fingerprint is used both for identification and counterfeit mitigation.
The next generation of DLT applications for supply chains. The first generation of DLT applications presented here is effective in synchronizing the data about the flow of physical goods with the actual flow of goods, provided the sensor data is connected to the DLT application. So far, the synchronization and the trading partners’ agreement on the status of an asset rely mostly on deterministic systems, i.e., on if-then-else rules encoded in smart contracts. Future DLT applications will gain more business value by including probabilistic data, for example, by estimating the probability that food is no longer safe to consume based on the number and duration of temperature excursions traced by IoT devices. Researchers and enterprises are exploring the use of machine learning to predict and prevent failures, schedule maintenance, and model “what if” scenarios for DLT applications (Bevilacqua et al. 2020). In addition to more technically focused research, more organizational research is needed. In our case studies, the selection of technologies and the coding and testing of applications accounted for between 20 and 30 percent of the effort. Up to 80 percent of the effort required trading partners to agree on data and event standards, shared governance models, intellectual property rights, and compliance assurance. Application lock-in was also a serious issue, which is why more research on interoperability is so critical (e.g., Kannengießer et al. 2020b). The overarching research question is, “How can we apply these emerging DLTs to deliver business value?”.
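As a stylized illustration of such probabilistic reasoning (our own sketch; the thresholds and the logistic mapping are assumptions, not taken from the cases above), a spoilage probability could be estimated from the temperature excursions recorded by IoT sensors:

```python
# Stylized sketch of probabilistic reasoning over IoT sensor data; the model
# and its parameters are hypothetical, not derived from the cited cases.
import math


def spoilage_probability(excursions: list[tuple[float, float]],
                         threshold_c: float = 4.0) -> float:
    """Estimate the probability that a chilled product is no longer safe, given
    (temperature in °C, duration in hours) readings above the threshold."""
    # Accumulate degree-hours above the threshold as a simple risk score.
    risk = sum(max(temp - threshold_c, 0.0) * hours for temp, hours in excursions)
    # Map the risk score to a probability with a logistic function.
    return 1.0 / (1.0 + math.exp(-(risk - 10.0) / 3.0))


readings = [(6.5, 2.0), (8.0, 1.5)]   # two excursions above 4 °C
print(f"{spoilage_probability(readings):.2f}")
```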

5 Imagining a Decentralized Digital Economy: Less Concentration of Epistemic and Economic Power through DLT systems and Tokens?

Johann Kranz
Data is often regarded as the oil of the digital economy (e.g., Varian 2018). Although the comparison is not quite accurate, as oil is a private, exclusive good whereas data can be used simultaneously and multiple times by many entities without losing its value (i.e., non-rival use of data), the availability of both oil and data has brought about groundbreaking innovation and economic growth. Similar to oil at the beginning of the twentieth century, today an oligopoly of dominant firms controls access to data – the most valuable resource of our times. Based on their dominant positions in essential online service markets such as search, social networking, commerce, cloud computing, streaming, or smart assistants, online service providers (OSPs) like Google, Apple, Facebook, Microsoft, or Amazon are amassing constantly growing amounts of data that their users provide wittingly or unwittingly (Easley et al. 2018; Morey et al. 2015; Spiekermann and Korunovska 2017).
The enormous amounts of proprietary and rich data on consumer behavior have served as the basis for dominant OSPs to expand into numerous online and offline markets (Autor et al. 2020). In recent years, dominant OSPs could expand their economic and social power at a rapid pace by exploiting the competitive edge that their proprietary data silos give them in data-driven innovation, especially in artificial intelligence (Berners-Lee 2018). For competitors, these large, proprietary data silos have created an uneven playing field by erecting high barriers to market entry, competition, and innovation (Haucap 2019).
Concentration of epistemic and economic power. An increasing number of academics, policymakers, and activists argue that the once free and open internet has been seized by a few dominant OSPs that cement their position by taking unfair advantage of their market power (Zuboff 2019). Based on investigations of the functioning of digital markets (e.g., House Judiciary Committee 2020), EU and U.S. policymakers accuse dominant OSPs of exploiting their gatekeeper power by dictating unreasonable terms to the marketplace, engaging in self-preferencing practices, using collected data about rivals’ businesses for their own commercial activities, and neutralizing emerging competition through so-called “killer acquisitions”. Policymakers in the EU and the U.S. are expected to introduce new, ex-ante regulatory frameworks to constrain dominant players’ potential scope for anticompetitive behaviors (e.g., the EU’s Digital Services Act) and have recently filed antitrust lawsuits against Google, Facebook, and Amazon.
However, dominant OSPs not only undermine competition in digital markets; individual privacy, agency, and eventually democratic principles are also endangered by the economic logic of “surveillance capitalism” (Zuboff 2015). Dominant OSPs have created expansive digital ecosystems that treat user data as a free commodity, based on which they sell in-depth psychological user profiles and are able to manipulate human behavior (Lanier 2013; Zuboff 2019). The proprietary data silos give OSPs tremendous epistemic and economic power (Jones and Tonetti 2020; Mager and Kranz 2020; Zuboff 2019). Epistemic power is based on the extreme knowledge asymmetries that arise from dominant OSPs’ exclusive access to huge amounts of rich behavioral data. Economic power accrues from dominant OSPs’ ability to leverage their epistemic advantage for improved and faster decision-making, data-driven innovation, and learning cycles.
Therefore, the time is ripe to think about effective strategies that mitigate the epistemic and economic power of dominant OSPs not only through privacy and antitrust laws but by design. An important part of the conversation on effective strategies focuses on shifting towards a more decentralized architecture of the digital economy (see Fig. 2). DLT systems can enable this shift in two major ways.
Decentralized data storage systems. First, DLT can be an integral part of decentralized data storage solutions that allow for individual data controllership and the separation of the data and service layers. Individual data controllership enables users to manage, control, and track online services' access to and usage of data in a repository controlled by the users themselves. Technically, this can be achieved by using DLT systems to control data access and to retrieve data stored decentrally on peer-to-peer, off-ledger storage systems such as the InterPlanetary File System (IPFS), Swarm, or Kademlia (Truong et al. 2019). These decentralized storage solutions use distributed hash tables to store and retrieve data in a peer-to-peer network. User-controlled decentralized data storage enhances privacy by design, as the hash codes needed to retrieve data are only known to users and the data can thus only be accessed or made accessible to other actors by individual users. Applying asymmetric encryption ensures that only intended actors can decrypt the data. Furthermore, transparency increases for users because every access to their data can be immutably recorded on the underlying DLT systems.
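As a minimal Python sketch of this setup (the store, ledger, and function names are hypothetical; a dictionary stands in for an off-ledger, content-addressed store such as IPFS, an append-only list stands in for a DLT system, and encryption of the payload is omitted for brevity), data is stored off-ledger under its content hash while every grant and access is recorded on the simulated ledger:

```python
import hashlib
import json
import time

# Simulated components (assumptions for illustration): an off-ledger,
# content-addressed store and an append-only access ledger.
OFF_LEDGER_STORE = {}   # content hash -> raw bytes (would be encrypted in practice)
ACCESS_LEDGER = []      # append-only log of storage, grant, and access events


def store_user_data(owner: str, payload: bytes) -> str:
    """Store data off-ledger under its content hash and log the event."""
    content_hash = hashlib.sha256(payload).hexdigest()
    OFF_LEDGER_STORE[content_hash] = payload
    ACCESS_LEDGER.append({"event": "store", "owner": owner,
                          "content_hash": content_hash, "timestamp": time.time()})
    return content_hash  # initially only the owner knows this hash


def grant_access(owner: str, content_hash: str, grantee: str) -> None:
    """The owner makes data accessible to another actor; the grant is logged."""
    ACCESS_LEDGER.append({"event": "grant", "owner": owner, "grantee": grantee,
                          "content_hash": content_hash, "timestamp": time.time()})


def retrieve(requester: str, content_hash: str) -> bytes:
    """Return data if a grant (or ownership) is recorded; log the access."""
    allowed = any(
        e["content_hash"] == content_hash
        and (e.get("grantee") == requester or e.get("owner") == requester)
        for e in ACCESS_LEDGER
    )
    if not allowed:
        raise PermissionError(f"{requester} has no recorded access right")
    ACCESS_LEDGER.append({"event": "access", "requester": requester,
                          "content_hash": content_hash, "timestamp": time.time()})
    return OFF_LEDGER_STORE[content_hash]


if __name__ == "__main__":
    h = store_user_data("alice", json.dumps({"steps": 9241}).encode())
    grant_access("alice", h, "fitness_service")
    print(retrieve("fitness_service", h))
    print(json.dumps(ACCESS_LEDGER, indent=2))
```

The sketch only illustrates the separation of layers: the service never holds the data itself, and every interaction with the user-controlled repository leaves a traceable record.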
In this technological setup, data management and controllership are uncoupled from service provision by design, which could lead to a more level playing field in online service markets because users could more easily switch between services and "multihome". Thus, the power of network effects, which frequently leads to "winner-takes-all" situations in the digital economy, would be mitigated. Also, data would become more portable across services, allowing competition to take place at the service level. Hence, the competitiveness of dominant market players' rivals would increase, since the competitive advantage gained from proprietary data silos would be rendered less important.
As the use of data is non-rival, the same data could be shared with, or sold by users to, multiple entities. This would create new opportunities for users' economic participation and unlock data for innovation. Individual data controllership promises social and economic gains, as allocating data property rights to users generates a better allocation than data ownership by firms (Jones and Tonetti 2020). Thus, placing more control over data property rights in the hands of users would unlock proprietary data silos and facilitate a provision of data that is close to the social optimum (Jones and Tonetti 2020).
Proposition 9 a, b: The uncoupling of data management and controllership from service provision will (a) decrease customer lock-in such that market concentration in online service markets will decrease and (b) lead to social and economic gains through an improved usage of data resources.
Decentralized applications. Second, DLT systems can serve as the backbone of decentralized applications (DApps) such as the social network Steemit or the rating service Yup. In contrast to the conventional backend of web applications, the backends of DApps build upon decentralized peer-to-peer networks such as Ethereum, TRON, or EOS. These DLT systems manage the applications' business logic via a nexus of modular services represented by smart contracts, which are executed autonomously (Glaser 2017). The source code of smart contracts is stored on a DLT system; if predefined conditions are met, the contract is automatically executed without human interference. Thus, instead of following the dominant design of current centralized digital platforms, in which a single entity controls data and information flows between all stakeholders, DApps reduce power and economic imbalances by design.
DApp providers often issue cryptographic tokens – representations of assets or rights residing on a DLT system – with a limited supply on their own or on existing DLT systems; users need these tokens to use the DApp (Kranz et al. 2019). The tokens are used to build viable business models around DApps and to incentivize producers of complementary products and services, as well as users, to buy and use the token. While different economic models of tokens exist, these models are mostly designed such that the token supply is restricted or dynamically adapted by DApp providers to ensure that the value of tokens rises with increasing demand for the DApp. Thus, the economic model of a DApp token is a key instrument for aligning all stakeholders' interests around the shared goal of creating and maintaining a thriving and robust DApp ecosystem. Additional instruments for achieving the long-term sustainability of a DApp ecosystem and for incentivizing stakeholders' contributions are revenue sharing and participative governance mechanisms. These mechanisms democratize the created value and decision rights, which enables new ways of organizing in the digital space, such as cooperatives or decentralized autonomous organizations (DAOs; Beck et al. 2018; Hsieh et al. 2018; Kollmann et al. 2020).
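A minimal Python sketch of such a token economic model (not tied to any particular DApp or token standard; the class, account names, and fee logic are illustrative assumptions) shows the two ingredients described above: a capped supply enforced at issuance and token consumption when the DApp is used, which couples demand for the service to demand for the scarce token:

```python
class CappedUtilityToken:
    """Toy token with a fixed maximum supply, loosely mirroring on-chain logic."""

    def __init__(self, max_supply: int):
        self.max_supply = max_supply
        self.total_minted = 0
        self.balances = {}  # account -> token balance

    def mint(self, to: str, amount: int) -> None:
        # The provider can issue tokens only up to the capped supply.
        if self.total_minted + amount > self.max_supply:
            raise ValueError("mint would exceed the maximum supply")
        self.total_minted += amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        # Users can only spend tokens they actually hold.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

    def pay_for_usage(self, user: str, dapp_treasury: str, fee: int) -> None:
        # Using the DApp consumes tokens, coupling service demand with token demand.
        self.transfer(user, dapp_treasury, fee)


if __name__ == "__main__":
    token = CappedUtilityToken(max_supply=1_000_000)
    token.mint("early_user", 100)
    token.pay_for_usage("early_user", "dapp_treasury", 5)
    print(token.balances)
```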
Proposition 10 a, b: Increasing diffusion of DApps will lead to (a) enhanced economic and governance participation of stakeholders in the digital economy and (b) more decentralized, IS-enabled economic organizing.
As the token economy builds upon decentralized business logic and data layers, it may prove to be a promising alternative to the current centralized design of the digital economy, which has created powerful oligopolies in essential online service markets. However, many challenges and open questions remain. First and foremost, we have not yet seen widespread adoption of DLT systems and DApps, although their unique benefits and potential to transform markets are largely undisputed. But so are DLT systems' current weaknesses, such as a lack of interoperability, scalability, and ease of use, and the energy thirst of proof-of-work consensus mechanisms, which have to be addressed to overcome adoption barriers and make sensible use of DLT in the digital economy (Pedersen et al. 2019).
Proposition 11: Joint and balanced increases in DLT systems' interoperability, scalability, and ease of use will spur adoption of DLT systems in the digital economy.
As the BISE/IS field has expertise in investigating and designing complex technological, economic, and social systems, I believe that we are in an excellent position, and may even be obligated, to shape a more competitive, innovative, and nondiscriminatory future digital economy. Thus, our community should become more involved in the growing transdisciplinary academic efforts to develop solutions for attenuating the concentration of epistemic and economic power in the digital economy. Our research also needs to take a closer look at the unintended consequences and negative externalities that powerful gatekeepers' control of essential online services has for competition, innovation, people, and societies (Constantinides et al. 2018; Easley et al. 2018).
However, a decentralized digital economy enabled by DLT systems, as delineated above, is no panacea and only one possible design option for a future digital economy that promotes innovation, respects privacy, and operates on a level playing field. Other market and technology designs or regulatory frameworks may emerge that prove more effective in correcting current market failures in the digital economy. Notwithstanding, our community is called upon to address the increasing social, political, and economic concerns about high economic and epistemic power in the hands of a few private, largely unregulated companies.

6 On Tokenization, Transaction Costs, Intermediaries, and Participation

Gilbert Fridgen
Let us assume that tokenization is nothing more than a cryptographic solution replacing anything that would, in the past, have been certified on a more or less tamper-proof and unique piece of paper. This is a very technical definition that focuses only on the medium of storage: paper vs. a cryptographically secured, digital medium. It is also a definition that covers most, if not all, use cases of tokenization to date: cryptocurrencies that replace cash, security tokens that replace paper-based securities, tokens that replace signed documents in business processes, tokens that replace paper tickets, and even verifiable credentials that can, for example, replace paper-based passports, drivers' licenses, or university diplomas. Finally, this definition leaves open whether tokens feature scarcity, i.e., whether or not they can be duplicated. For example, uniqueness is an essential requirement for cryptocurrencies or security tokens but might be irrelevant or even counterproductive for tokens used in business processes.
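As a minimal sketch of this "digital instead of paper" view, the following Python example uses the third-party cryptography package to sign and verify a credential; the credential fields and identifiers are made up for illustration and do not follow any particular credential standard. The issuer's signature plays the role that a stamp and signature play on a paper document:

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Illustrative credential payload; field names are assumptions for this sketch.
credential = {
    "type": "UniversityDiploma",
    "holder": "did:example:alice",
    "degree": "M.Sc. Information Systems",
    "issued": "2021-02-12",
}
payload = json.dumps(credential, sort_keys=True).encode()

# The issuer (e.g., a university) holds the private key and signs the payload.
issuer_key = Ed25519PrivateKey.generate()
signature = issuer_key.sign(payload)

# Any verifier with the issuer's public key can check the credential
# without contacting the issuer again, the role the paper stamp used to play.
issuer_public_key = issuer_key.public_key()
try:
    issuer_public_key.verify(signature, payload)
    print("credential is authentic")
except InvalidSignature:
    print("credential has been tampered with")
```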

6.1 Tokenization and Transaction Costs

A major driver of tokenization is the prospect of lower transaction costs, which resembles historical expectations for most digital technologies (Ciborra 1983; Cordella 2006): handling a digital medium generally causes lower costs than handling paper during a transaction. It is worth noting that this also holds true for distributed ledgers: presumed problems regarding their energy usage, and thus cost, are questionable (Sedlmeir et al. 2020), especially as reduced paper handling might even lower resource usage in several use cases (e.g., Jensen et al. 2019). However, judging by the public discussion, most hopes are set not on avoiding the transaction costs caused by paper handling but on avoiding those caused by intermediaries.
Digital currencies, security tokens, digitally signed documents, digital tickets, and even verifiable credentials can, in principle, all be implemented without advanced cryptography but with the help of intermediaries. The banking system already provides digital money transfers and custody without any tokenization, and "login with" offers by companies like Apple, Google, or Facebook already provide verifiable credentials.
However, all of these solutions rely on some degree of trust in intermediaries that take on the role of a TTP. Besides their operating costs, their proprietary solutions often lack interoperability from a technical or regulatory perspective. Moreover, competing intermediaries will avoid integrating their respective solutions, which leads to different, incompatible standards. Finally, intermediaries build business models that directly or indirectly monetize their role as a TTP (e.g., through fees or the monetization of collected data). With the hope of replacing TTPs through tokenization thus comes the hope of removing the costs of intermediaries.
Proposition 12: Market participants expect lower transaction costs in tokenized markets as tokenization replaces the role of trusted third parties and thus avoids the cost of intermediaries.

6.2 What Will Tokenization Mean for Intermediaries?

For the traditional financial sector, Allen and Santomero (1997, p. 1462) already suggested that the "emphasis on the role of intermediaries as reducing the frictions of transaction costs […] is too strong." They further outline that "reducing participation costs, which are the costs of learning about effectively using markets as well as participating in them on a day-to-day basis" is "an important service provided by these firms". As they see it, a major role of financial intermediaries is to help customers "deal with the increasingly complex maze of financial instruments and markets". Coming from the finance discipline, they especially point to mutual funds as one means to reach this goal. Today, however, financial service providers could also reach this goal by offering services that use modern IT, for example, data analytics to identify investment opportunities that match consumers' needs. Intermediaries' role of enabling consumers to participate thus does not depend on their role as a TTP and is, moreover, unlikely to vanish with tokenization.
Proposition 13: Despite the vanishing need for a trusted third party, intermediaries will retain an important market role.
Lower transaction costs might even enable use cases that were costly and cumbersome to implement using traditional means. In the financial sector, tokenization could increase the granularity of tradeable investment products. With tokenization, financial markets could evolve from trading the stock of a whole company (e.g., a car manufacturer) to investing in individual projects (e.g., the electric vehicle division) or even individual production machines (e.g., the battery assembly). New segments of investors could become interested in participating in financial markets, for example, when fans can not only buy the stock of a publicly listed football club but even directly co-fund the contracts of single players. In all these scenarios, we will need intermediaries that enable investors to participate, not only in terms of regulatory compliant market access but also in dealing with the increasingly complex maze of tokenized financial instruments and markets.

6.3 From Managing Transactions to Enabling Participation

Future intermediaries need to determine, for example, adequate market prices through (sensor) data analytics around production machinery or football players. They will furthermore need to understand new market dynamics that come with utility tokens, which are not only an object of investment but also a consumable means to run a service (Drasch et al. 2020). Finally, it is possible to store valuables like gold or securities certificates in one's own safe; yet many people value the professional custody services of a specialized company, for digital and non-digital assets alike. Dan Schulman, president and CEO of PayPal, described the company's motivation to offer services around cryptocurrencies as "the opportunity, and the responsibility, to help facilitate the understanding, redemption, and interoperability of these new instruments of exchange" (PayPal 2020). With this, Schulman essentially restates Allen and Santomero's argument about intermediaries and participation (1997, p. 1462).
This look into the financial sector is, however, without loss of generality: consumers will also need to store other kinds of tokens, for example, verifiable credentials for their passports, drivers' licenses, or university diplomas. Again, it would be technically possible for users to store these on their personal devices. Still, functionalities like secure and easy-to-use backups or multi-device access could be business opportunities and, thus, new roles for intermediaries.
Proposition 14: Tokenization removes old but also creates new business opportunities for intermediaries, especially in enabling consumers to participate in the token economy.
Both the internet and the world-wide web were initially perceived to aim at decentralization (Mathew 2016). The Mozilla Foundation’s Internet Health Report (Mozilla Foundation 2019, p. 98) criticizes, however, that “the digital world is dominated by eight American and Chinese companies: Alphabet (Google’s parent company), Alibaba, Amazon, Apple, Baidu, Facebook, Microsoft, and Tencent. These companies and their subsidiaries have outsized control over the internet.”
Against the background of the internet's originally decentralized character and its later centralization, we cannot be sure about the effects of tokenization. Reasons for the internet's centralization are presumably economies of scale (De Filippi and McCarthy 2012) and platform ecosystems (Tiwana 2014). While tokenization promotes decentralization at first sight, it might as well lay the ground for new centralized business models. Are we sure that a token economy is not prone to economies of scale and the mechanisms of platform ecosystems? Tokenization will create another economic playing field for both established players and newcomers in the online world, possibly leading to centralized market structures again.
Proposition 15: Tokenization will not create a decentralized internet economy.
To sum up, tokenization will further lower transaction costs. As intermediaries lose their role as trusted third parties, they need to refocus on market roles that benefit from the lower transaction costs. Facilitating market participation will be a promising role for intermediaries. This might again lead to centralized market structures.
Being successful in this environment and in the long run will require openness towards technical innovation and strategic foresight. Research in the BISE/IS field is in an ideal position to analyze in detail the market dynamics in various industries that face the effects of tokenization. The BISE/IS community can thus strongly support businesses in strategically positioning themselves in a token economy.

7 Tokenization in Financial Markets: Efficiency Gains and New Investment Opportunities

Ulli Spankowski
Tokenization, from our point of view, describes the digital securitization of rights and goods via a distributed ledger. It allows for the transfer of existing financial products and the creation of new digital assets and thereby enables a transformation process along all stages of the financial market value chain. A wide variety of rights and goods can be digitally securitized in tokens. At Börse Stuttgart, we distinguish three types of tokens: Payment tokens are digital means of payment that do not require a central entity and have little or no additional functionality; they include cryptocurrencies such as Bitcoin or Ripple (XRP). Security tokens correspond to classic securities in their design in that they resemble stocks (Lambert et al. 2020). Finally, utility tokens, like vouchers, grant access to services or goods of the respective issuer. The following sections describe three major opportunities of tokenization: direct investor access to exchanges, leaner clearing and settlement processes, and an increase in investment opportunities.

7.1 Direct Investor Access to Exchanges Without Intermediaries

New globally and directly accessible marketplaces for tokens are emerging (e.g., tZERO, OpenFinance Network, Börse Stuttgart Digital Exchange). These marketplaces can serve as a secondary market for previously issued tokens. DLT takes over the tasks of brokers and guarantees the security and transparency of transactions. Institutional and private investors worldwide can connect directly to token marketplaces via customer-specific interfaces, without brokers as TTPs and the associated costs. Depending on the legal framework, a marketplace can also provide processes for proving the identity of customers and measures to combat money laundering. Preliminary verification of available financial resources becomes unnecessary, as the exchange of money for tokens between buyer and seller can take place in near real time. These aspects could make brokers as we know them today obsolete in the process chain.

7.2 Leaner Clearing and Settlement Processes

Tokenization might help to break up established monopolies in the settlement and custody of traditional securities trading and enable leaner processes. If the receivables, liabilities, and delivery obligations of market participants can be displayed and viewed at any time on the ledger, DLT could replace the clearing house and reduce settlement and counterparty risks. Clearing and transfer of tokens could be done almost in real time via the distributed ledger. In that case, a dedicated custody instance would no longer be necessary for digital assets – everybody can take care of the custody of their tokens themselves if they wish. With the DLT assuming the tasks of the clearing house and eliminating the need for a custody instance, the overall clearing and settlement process becomes leaner.
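A minimal Python sketch of such ledger-based settlement (simulated in memory; not modeled on any specific market infrastructure, and all balances are made up) illustrates the delivery-versus-payment idea behind leaner clearing: the security leg and the cash leg move in one atomic state transition, so neither party bears counterparty risk while waiting for a clearing house.

```python
from copy import deepcopy

# Simulated ledger state: cash and security token balances per participant.
ledger = {
    "cash":     {"buyer": 10_000, "seller": 0},
    "security": {"buyer": 0,      "seller": 50},
}


def settle_dvp(state: dict, buyer: str, seller: str, quantity: int, price: int) -> dict:
    """Atomic delivery-versus-payment: both legs succeed or nothing changes."""
    new_state = deepcopy(state)  # work on a copy so failures leave the ledger untouched
    cash_leg = quantity * price

    if new_state["cash"][buyer] < cash_leg:
        raise ValueError("buyer lacks funds")
    if new_state["security"][seller] < quantity:
        raise ValueError("seller lacks securities")

    # Cash leg and delivery leg are applied together in one state transition.
    new_state["cash"][buyer] -= cash_leg
    new_state["cash"][seller] += cash_leg
    new_state["security"][seller] -= quantity
    new_state["security"][buyer] += quantity
    return new_state


if __name__ == "__main__":
    ledger = settle_dvp(ledger, buyer="buyer", seller="seller", quantity=10, price=100)
    print(ledger)
```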

7.3 Increase in Investment Opportunities for (Private) Investors

In the future, the tokenization of assets could open completely new possibilities for (private) investors. For instance, tokenization might allow investments in fractions of real estate, in pieces of art, or direct investments in individual projects of companies. Tokenization may create liquid markets for previously illiquid assets and could offer investors secure access to investment opportunities that were previously not available to them.
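To make the fractionalization idea concrete, the following Python sketch (purely illustrative; the asset, its valuation, and the number of fractions are assumptions) splits a single asset into a fixed number of fungible fractional tokens and derives the minimum investment ticket from the asset's valuation:

```python
from dataclasses import dataclass, field


@dataclass
class FractionalizedAsset:
    """Illustrative asset split into a fixed number of fungible fractional tokens."""
    name: str
    valuation: float          # current valuation of the underlying asset
    total_fractions: int      # number of fractional tokens issued
    holdings: dict = field(default_factory=dict)

    @property
    def price_per_fraction(self) -> float:
        # The minimum investment ticket is one fraction of the asset.
        return self.valuation / self.total_fractions

    def buy(self, investor: str, fractions: int) -> float:
        issued = sum(self.holdings.values())
        if issued + fractions > self.total_fractions:
            raise ValueError("not enough fractions left")
        self.holdings[investor] = self.holdings.get(investor, 0) + fractions
        return fractions * self.price_per_fraction  # amount the investor pays


if __name__ == "__main__":
    # Hypothetical example: a building worth 2,000,000 split into 10,000 tokens.
    building = FractionalizedAsset("Apartment building", valuation=2_000_000,
                                   total_fractions=10_000)
    paid = building.buy("retail_investor", fractions=5)
    print(f"minimum ticket: {building.price_per_fraction:.2f}, paid: {paid:.2f}")
```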
Proposition 16 a, b, c: Tokenization will transform financial markets by (a) giving investors direct access to exchanges and assets, (b) streamlining the investment process, and (c) increasing investment opportunities.
However, significant challenges remain that slow down tokenization in financial markets. A major challenge regarding security tokens is regulatory uncertainty. As the digital equivalent of securities, security tokens are subject to capital market regulation. The various elements of the typical investment value chain, from issuance through trading to custody, fall under the extensive regulatory framework for equities, such as the Prospectus Regulation (PR), the Markets in Financial Instruments Directive (MiFID II), the Markets in Financial Instruments Regulation (MiFIR), and the Central Securities Depositories Regulation (CSDR). The classification as regular securities results, among other things, in trading and ad-hoc disclosure obligations (Blandin et al. 2019). However, there is still no clear token taxonomy, especially regarding the regulation of hybrid token forms. Furthermore, other important regulatory foundations are missing for security tokens to exploit their potential as efficient, flexible, and fungible digital assets (Nägele 2020). While important legal aspects regarding the trade and transfer of security tokens still await clarification, first issues have already been addressed within the European Union. To foster innovation, the European Commission has proposed the DLT MTF Pilot Regime, a sandbox that enables tokenization without necessarily applying all of the listed capital market frameworks in full. The implications of this sandbox for the final regulation of security tokens are not yet clear. On a national level, the German legislator, for instance, is working on security token regulation by gradually opening securities law to digital securities, with the digital global certificate joining the previously mandatory physical certificate. However, the new regulation considered so far, which mainly applies to digital bonds, might not be sufficient, since shares and other securities would also need to be covered for issuers and investors to gain access to the full range of financing and investment opportunities in the digital world.
To summarize, tokenization has huge potential to transform the financial sector and to improve the investment process by reducing complexity and increasing investment opportunities. To realize this potential, however, regulators will need to address a number of existing regulatory ambiguities, such as the unclear token taxonomy and the implications of regulatory pilot regimes.

8 Tokenization: Opportunities and Challenges for the Automotive Industry

André Luckow
The automotive value chain is highly complex and comprises many participants, e.g., suppliers, logistics companies, and dealers. Complexity arises from many sources, for example, the need to integrate thousands of hardware and software components across multilayer supply chains. Besides customer demands for highly personalized products and for details about the sources of product components, ensuring safety, quality, and environmental standards increases the complexity of supply chain management. Supply chain transparency refers to practices that improve the availability and quality of data along the supply chain and thereby enable these capabilities. At the same time, supply chain transparency is crucial for agility and resilience (e.g., the ability to quickly respond to demand changes). Increasingly, linear supply chains evolve toward more flexible business ecosystems supported by information technologies that can respond to new business needs.
Tokenization is an important mechanism for establishing business ecosystems in the automotive value chain (e.g., for supply chains). Tokens can provide trusted, verifiable information that can be used in cross-organizational processes and replace existing paper-based processes. An automotive supply chain can involve tens of thousands of partners distributed globally across multiple tiers. This complexity of direct and indirect business relationships results in manifold opportunities for tokenization, including the exchange of product and logistics data, proofs of origin, and customs processing.
Tokenization offers the means to establish verifiable and trustworthy artifacts, such as documents and certificates, and to share them across ecosystems (cf. Sect. 6). Further, tokenization can increase the attractiveness of ecosystems by lowering entry barriers for participants (e.g., suppliers and logistics companies) and by reducing transaction costs (Laurent et al. 2020). Tokenization also allows the physical and digital worlds to be linked. For supply chain transparency, non-fungible assets (Lacity 2020) are of particular interest (e.g., the representation of physical vehicle sub-components, components, and parts via a unique digital identifier). DLT is an essential technology for tracking tokens, sharing data, automating processes, and facilitating the trading of tokenized assets. However, to successfully instantiate a token economy, technological, standardization, and governance aspects must be considered.

8.1 Evolution of the Automotive Industry

The automotive industry traditionally evolved as a vertically integrated value chain, rooted in the need to integrate every aspect of the process for mass production and to achieve economies of scale. Since the 1930s, however, the automotive industry's degree of vertical integration has declined, and the industry has evolved toward a more decoupled, decentralized business network (Langlois and Robertson 1989). Coase (1937) contrasts markets, where a pricing mechanism governs resource allocation, with firms, in which the entrepreneur and managers coordinate production. Firms remove the friction of complicated market structures and thus reduce transaction costs; a firm emerges when coordination can be done more efficiently in a central organization. The balance between centralization and decentralization is continuously evolving as technologies (e.g., DLT) reduce coordination overheads and as customer needs and organizations change. For example, the four ACES trends (i.e., autonomous, connected, electric, and services; Holland-Letz et al. 2018) impact the automotive industry and lead to reconsiderations of vertical integration, particularly concerning software (Fletcher et al. 2018), data, and energy storage. In other areas, token-based ecosystems may allow for more flexible and decentralized automotive value chains (e.g., for commodity parts).
This dynamic, unpredictable landscape of varying relationships between partners in the automotive value chain has led to the rise of business ecosystems that provide some coordination and reduce transaction costs compared to markets (Pidun et al. 2019).

8.2 Tokenization in the Automotive Value Chain

Automotive supply chain networks are vast and involve many partners distributed across the globe. At the same time, many manual and paper-based processes hinder efficient cross-organizational collaboration. To address these challenges, we evaluate tokenization and DLT regarding their usage across the entire automotive value chain, including supply chain management, logistics, and vehicle-related services (e.g., driver license verification or charging services; Garrido et al. 2020; Gudymenko et al. 2020).
In the following, we focus on the challenge of supply chain transparency. Because of their complexity, supply chain networks often lack transparency and trust; keeping track of all components, materials, orders, and locations is challenging. As described by Lacity (see Sect. 4), due to the many stakeholders involved and the systems these stakeholders manage, no global view of all data exists. In case of issues, this typically leads to highly manual and error-prone reconciliation processes.
To address these challenges, we developed PartChain (Miehle et al. 2019), a system for enhancing the traceability of vehicle components and materials within complex international supply chain networks. The objective of PartChain is to enhance supply chain visibility and to ensure the authenticity of every component. PartChain uses distributed ledger technologies, in particular Hyperledger Fabric, and can be deployed on different infrastructures. While traditional supply chain systems shield data between the different parties, PartChain provides a unified way to share and verify data.
As described by Lacity (see Sect. 4), tokenization requires UIDs for representing physical assets as non-fungible tokens. A challenge is the definition and standardization of the UID format, which involves trading off the versatility of the key, for example, by using a natural key, a synthetic key, or a composite key. A natural key is based on attributes that exist in the real world and allows for easier process integration because it contains critical identifying information that can be used without further queries. In PartChain, we utilize synthetic UIDs generated from the hash values of individual components' serial numbers. The UIDs are associated with further meta-data, such as the manufacturing date and location of the asset.
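The following Python sketch illustrates such a synthetic UID; the exact key derivation and metadata fields used in PartChain are not specified here, so including the manufacturer in the hash input and the metadata field names are assumptions for illustration:

```python
import hashlib


def synthetic_uid(serial_number: str, manufacturer: str) -> str:
    """Derive a synthetic UID by hashing a component's serial number.

    Adding the manufacturer is an assumption made in this sketch to avoid
    collisions between overlapping serial number ranges of different suppliers;
    the actual PartChain key derivation may differ.
    """
    return hashlib.sha256(f"{manufacturer}:{serial_number}".encode()).hexdigest()


if __name__ == "__main__":
    uid = synthetic_uid(serial_number="SN-4711-0042", manufacturer="supplier_a")
    # The UID reveals nothing about the underlying serial number but is
    # reproducible by anyone who knows the inputs.
    metadata = {"uid": uid, "manufactured": "2020-11-05", "location": "Leipzig"}
    print(metadata)
```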
Initially, we focus on the traceability of original components to reduce the risk of manipulation and fraud and to improve the diagnostics of part issues. We utilize flexible and extensible data models that allow the tracking of complex hierarchical component data. Generated tokens are passed on from partner to partner, including the OEM in the final stage. Every participant in the system can generate new component tokens (i.e., records describing component attributes, in particular serial number, manufacturer, location, and date). A component token references all components that were used for assembling the component.
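A minimal sketch of such a hierarchical component token in Python (field names follow the attributes mentioned above; the structure is illustrative and simplified compared to PartChain's actual data model) shows how each token references the tokens of its assembled sub-components, so that the full provenance tree can be reconstructed:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ComponentToken:
    """Illustrative token describing a component and referencing its sub-components."""
    uid: str
    serial_number: str
    manufacturer: str
    location: str
    date: str
    assembled_from: List["ComponentToken"] = field(default_factory=list)

    def provenance(self, indent: int = 0) -> None:
        """Walk the assembly tree to reconstruct the component's provenance."""
        print(" " * indent +
              f"{self.manufacturer}: {self.serial_number} ({self.location}, {self.date})")
        for part in self.assembled_from:
            part.provenance(indent + 2)


if __name__ == "__main__":
    cell = ComponentToken("uid-cell", "CELL-001", "supplier_b", "Ulm", "2020-10-01")
    module = ComponentToken("uid-mod", "MOD-017", "supplier_a", "Leipzig", "2020-10-20",
                            assembled_from=[cell])
    battery = ComponentToken("uid-bat", "BAT-003", "oem", "Dingolfing", "2020-11-05",
                             assembled_from=[module])
    battery.provenance()
```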
An important requirement is the ability to authorize access to the data. For example, it is essential that only the data owner can control access to the data she provided on a very fine-grained level. For this purpose, we utilize Hyperledger Fabric’s private data collections. Currently, access to a record is only granted to immediate neighbors in the value chain.
In the future, PartChain can be extended to support traceability and full transparency for raw materials from mine to factory. For example, materials like cobalt or wolframite often originate from sources in developing countries, making it difficult to monitor, for example, working conditions, quality, and standards. As supply chains involve many intermediaries, tracking and verifying the origins of all components is challenging and prone to fraud. To digitize the existing practice of paper-based proofs, the ability to create tamper-resistant digital proofs (e.g., certificates) plays an essential role. To fully support this use case, self-identification methods, such as chemical fingerprints, are critical building blocks (see Sect. 4).

8.3 Practical Implications and Challenges

The number of use cases and opportunities for tokenization in the automotive realm is vast. However, significant technological and business challenges hinder the uptake of this technology. First, IT systems are still designed for end-to-end, vertical processes rather than for business ecosystems. In practice, this often means complex onboarding processes and many one-to-one instead of network transactions. With the availability of clouds and distributed ledger technologies, the technological means to digitize and optimize the automotive value chain are readily available. While we demonstrated the scalability and maturity of distributed ledgers (Sedlmeier et al. 2021), various challenges remain, including the technological integration of distributed ledgers with individual partners' legacy systems, the standardization of data formats and protocols, and the design of balanced incentives that encourage participation in and governance of the ecosystem.
Proposition 17: To successfully establish token ecosystems, a holistic consideration of business models, technology standards, and governance is required.
While systems such as PartChain have demonstrated the suitability of distributed ledgers for securely exchanging and verifying component data, the viability of their ecosystems still needs to be shown. A first critical maturity level is achieved by establishing a digital token for assets, which can provide the technological basis for the digitization of additional value streams in the future. However, achieving a critical mass of partners and activities is challenging; the initial costs of technological integration as well as the lack of standards, governance, operating, and business models slow adoption. Balancing the needs of ecosystem participants and users, technology providers, and governance entities is challenging and requires the reconciliation of competing interests. Finding incentive models that work across the partner ecosystem thus often requires intensive experimentation.
While the long-term and strategic benefits of token-based ecosystems are apparent, creating such ecosystems is challenging in practice. Whereas online service providers can easily pursue platform-based business models (see Sect. 5), blueprints for decentralized ecosystems do not exist. Platform providers can often subsidize the initial use and adoption of the platform; after establishing a network effect, they can monetize their user base by facilitating and controlling supply and demand (Hein et al. 2020).
In decentralized ecosystems, incentive structures are more complicated. Conflicting interests as well as long standardization and governance processes often slow down the creation of vibrant ecosystems. A successful approach requires the confluence of use case, standards, technology, and governance model. It is therefore critical to define an appropriate tactical approach for incrementally working towards ecosystems and to adapt it if necessary. It is instrumental to develop the ecosystem in a business-centric way and to emphasize the shared benefits of all participants, ensuring that the token-based ecosystem continuously moves forward.
Table 1
Overview of types of distributed ledgers according to their permission models, with private/public referring to read permissions and permissioned/permissionless referring to write permissions

Private and permissioned: only authorized nodes can join the network; only authorized nodes can participate in consensus finding
Private and permissionless: only authorized nodes can join the network; all connected nodes can participate in consensus finding
Public and permissioned: any node can join the network; only defined nodes can participate in consensus finding
Public and permissionless: any node can join the network; all connected nodes can participate in consensus finding
Table 2
Classification of goods

Rivalry and excludable: private goods
Rivalry and non-excludable: common goods
Non-rivalry and excludable: club goods
Non-rivalry and non-excludable: public goods
Table 3
Extended classification of goods

Collective use impossible (rival): private goods (excludable); common goods (non-excludable)
Collective use (non-rival) possible: club goods (excludable); public goods (non-excludable)
Collective use (non-rival) necessary: club network goods (excludable); network goods (non-excludable)
Table 4
Direct and indirect cooperation models

Direct value, complementing cooperation: new services and value creation across complementing systems
Direct value, competing cooperation: new fee and service exchange models where cooperation is unavoidable, or customer bases need to be shared
Indirect value, complementing cooperation: new supporting service providers such as Blockchain-as-a-Service, jointly operated trusted oracles to provide synergy effects
Indirect value, competing cooperation: new support service providers to protect intellectual property, encrypt data, or provide Chinese walls to protect commercial claims
Table 5
Three examples of ways to represent assets on DLT systems based on unique IDs (UIDs)

Stickers/stamps. Description: The digital token represents a hash value of a UID that is adhered to the physical product's packaging with a sticker or stamp. Examples: Golden State Foods and EY WineChain. Benefits: Suitable for food and beverages; low cost; ease of use.
Branding/watermarks. Description: The digital token represents a hash value of a UID embedded within the physical product. Examples: VeriTX. Benefits: Strong coupling of physical and digital token.
Self-identification. Description: The digital token represents a UID created by scanning the physical properties of the asset; every time the physical asset is scanned, it generates the same UID. Examples: Everledger and VeriTX. Benefits: Strong coupling of physical and digital token.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


Footnotes
1 A hash function is an algorithm that transforms original data into a unique number (referred to as a hash value). The same original data will always produce that unique number, but the function only works one way: given the hash value, it is computationally infeasible to figure out the original data. The procedure of creating a hash value from data is called hashing.
2 There are several innovations under development to embed UIDs into food, beverages, and medicines using nanotechnology in the future. IBM calls these ultra-miniaturized cryptographic anchors (Prisco 2018).
References
Ammous S (2018) The Bitcoin standard: the decentralized alternative to central banking. Wiley, Hoboken
Arthur WB (1983) On competing technologies and historical small events: the dynamics of choice under increasing returns. IIASA, Laxenburg, pp 83–90
Arthur WB (1989) Competing technologies, increasing returns, and lock-in by historical events. Econ J 99(394):116–131
Autor D, Dorn D, Katz LF, Patterson C, Van Reenen J (2020) The fall of the labor share and the rise of superstar firms. Q J Econ 135(2):645–709
Beck R (2007) The network(ed) economy: the nature, adoption and diffusion of communication standards. Springer, Heidelberg
Beck R, Müller-Bloch C, King JL (2018) Governance in the blockchain economy: a framework and research agenda. J Assoc Inf Syst 19:1020–1034
Bevilacqua M, Bottani E, Ciarapica FE, Costantino F, Di Donato L, Ferraro A, Mazzuto G, Monteriù A, Nardini G, Ortenzi M, Paroncini M, Pirozzi M, Prist M, Quatrini E, Tronci M, Vignali G (2020) Digital twin reference model development to prevent operators' risk in process plants. Sustainability 12(3):1088. https://doi.org/10.3390/su12031088
Buchanan JM (1965) An economic theory of clubs. Economica 32(125):1–14
Ceci SJ, Kain EL (1982) Jumping on the bandwagon with the underdog: the impact of attitude polls on polling behavior. Public Opin Q 46(2):228–242
Choi S-Y, Stahl DO, Whinston AB (1997) The economics of electronic commerce. Macmillan, Indianapolis
Daly HE, Cobb JB (1989) For the common good: redirecting the economy toward community, the environment, and a sustainable future. Beacon Press, Boston
De Filippi P, McCarthy S (2012) Cloud computing: centralization and data sovereignty. Europ J Law Technol 3(2):1–18
Deng L, Chen H, Zeng J, Zhang L-J (2018) Research on cross-chain technology based on sidechain and hash-locking. In: Liu S et al (eds) International conference on edge computing. Springer, Cham, pp 144–151
Economides N, Himmelberg C (1995) Critical mass and network evolution in telecommunications. In: Toward a competitive telecommunications industry: selected papers from the 1994 telecommunications policy research conference, University of Maryland, College Park, pp 47–63
Garrido GM, Miehle D, Luckow A, Matthes F (2020) A blockchain-based flexibility market platform for EV fleets. In: 2020 Clemson University power systems conference (PSC), pp 1–8
Gudymenko I, Khalid A, Siddiqui H, Idrees M, Clauß S, Luckow A, Bolsinger M, Miehle D (2020) Privacy-preserving blockchain-based systems for car sharing leveraging zero-knowledge protocols. In: 2020 IEEE international conference on decentralized applications and infrastructures (DAPPS), pp 114–119
Herlihy M (2018) Atomic cross-chain swaps. In: 2018 ACM symposium on principles of distributed computing, pp 245–254
van Hoek R, Fugate B, Davletshin M, Waller MA (2019) Integrating blockchain into supply chain management: a toolkit for practical implementation. Kogan Page, London
Kannengießer N, Pfister M, Greulich M, Lins S, Sunyaev A (2020) Bridges between islands: cross-chain technology for distributed ledger technology. In: Proceedings of the 53rd Hawaii international conference on system sciences, pp 5298–5307
Kernahan U, Bernskov A, Beck R (2021) Blockchain out of the box – where is the blockchain in blockchain-as-a-service? In: Proceedings of the 54th Hawaii international conference on system sciences, pp 4281–4290
Kranz J, Nagel E, Yoo Y (2019) Blockchain token sale. Bus Inf Syst Eng 61(6):745–753
Lacity MC (2020) Blockchain foundations: for the internet of value. Epic Books
Langlois RN, Robertson PL (1989) Explaining vertical integration: lessons from the American automobile industry. J Econ Hist 49(2):361–375
Lanier J (2013) Who owns the future? Simon & Schuster, New York
Leibenstein H (1950) Bandwagon, snob, and Veblen effects in the theory of consumers' demand. Q J Econ 64(2):183–207
Leonard D, Treiblmaier H (2019) Can cryptocurrencies help to pave the way to a more sustainable economy? Questioning the economic growth paradigm. In: Treiblmaier H, Beck R (eds) Business transformation through blockchain, vol II. Palgrave Macmillan, Cham
Liebowitz SJ, Margolis SE (1995) Are network externalities a new source of market failure? Res Law Econ 17:1–22
Miehle D, Henze D, Seitz A, Luckow A, Bruegge B (2019) PartChain: a decentralized traceability application for multi-tier supply chain networks in the automotive industry. In: 2019 IEEE international conference on decentralized applications and infrastructures, pp 140–145
Morey T, Forbath T, Schoop A (2015) Customer data: designing for transparency and trust. Harv Bus Rev 93(5):96–105
Murphy R (2012) Sustainability: a wicked problem. Sociol 6(2):1–24
Musgrave RA (1959) Theory of public finance: a study in public economy. McGraw-Hill, New York
O'Donoghue O, Vazirani AA, Brindley D, Meinert E (2019) Design choices and trade-offs in health care blockchain implementations: systematic review. J Med Internet Res 21(5):e12426
Pedersen A, Risius M, Beck R (2019) Blockchain decision path: when to use blockchains? Which blockchains do you mean? MIS Q Exec 18(2):1540–1960
Peltoniemi M (2005) Business ecosystem: a conceptual model of an organisation population from the perspectives of complexity and evolution. e-BRC Research Reports, vol 18, Tampere
Samuelson PA (1954) The pure theory of public expenditure. Rev Econ Stat 36(4):387–389
Sedlmeier J, Ross P, Miehle D, Luckow A, Fridgen G (2021) The DLPS: a new framework for benchmarking blockchains. In: Proceedings of the 54th Hawaii international conference on system sciences, pp 6855–6864
Seidel S, Bharati P, Watson RT, Boudreau M-C, Kruse LC, Karsten H, Melville N, Toland J, Fridgen G, Albizri A, Butler T, Guzman I, Lee H, Rush D, Watts S (2017) The sustainability imperative in information systems research. Comm Assoc Inf Syst 40:40–52
Shapiro C, Varian HR (1998) Information rules: a strategic guide to the network economy. Harv Bus Press, Boston
Spiekermann S, Korunovska J (2017) Towards a value theory for personal data. J Inf Technol 32(1):62–84
Sunyaev A (2019) Distributed ledger technology. In: Internet computing: principles of distributed systems and emerging internet-based technologies. Springer, Cham, pp 265–292
Tian F (2017) A supply chain traceability system for food safety based on HACCP, blockchain & internet of things. In: 2017 international conference on service systems and service management, Dalian, pp 1–6. https://doi.org/10.1109/ICSSSM.2017.7996119
Tiwana A (2014) Platform ecosystems: aligning architecture, governance, and strategy. Morgan Kaufmann, Amsterdam
Truong NB, Sun K, Guo Y (2019) Blockchain-based personal data management: from fiction to solution. In: IEEE 18th international symposium on network computing and applications, pp 1–8
Zamyatin A, Harz D, Lind J, Panayiotou P, Gervais A, Knottenbelt W (2019) XCLAIM: trustless, interoperable, cryptocurrency-backed assets. In: 2019 IEEE symposium on security and privacy, San Francisco, pp 193–210. https://doi.org/10.1109/SP.2019.00085
Zuboff S (2019) The age of surveillance capitalism: the fight for a human future at the new frontier of power. Profile Books, London