1 Introduction

There is a persistent mythology about the relationship between science and technology, which claims that knowledge originates in laboratories via deliberate hypothesis-driven investigations that we call basic research and that new technology is the practical application of such scientific production of knowledge. The origins of this view are mysterious, though it is clear that it was either exploited or exaggerated by Vannevar Bush and other scientists in the 1950s as justification for basic research funding. The myth gained traction as the details of the Manhattan Project became public and seemed to strengthen after the success of the Apollo missions to the Moon. In the popular consciousness, these two massive federal research and development programs were spearheaded by nuclear physicists and rocket scientists, respectively, who tutored and inspired applied scientists and engineers to make new devices and machines.

While there may be some basis for this mythology, at least in the sense that it has some explanatory power, there exist extraordinary exceptions to controlled experimentation as the origin of knowledge, including:

  • discoveries that were purely accidental and thus, by accounts of the discoveries themselves, devoid of scientific structuring (Kennedy 2016),

  • biomimicry (Benyus 1997), which demonstrates that knowledge originates in the natural world as the result of an infinite variety of spontaneous, evolutionary experiments rather than structured investigation,

  • practical discoveries, such as the steam engine (von Baeyer 1998) or the invention of a clock that keeps reliable time at sea (Sobel 1995), that predate organized, scientific knowledge of their underlying principles (thermodynamics and materials science, respectively) and prove that knowledge can originate in technology and thus flow backwards into basic research,

  • whimsical inventions that were neither practical nor scientific in origin, but that made critical contributions to, or expansions of, some fundamental knowledge base, such as digital programming (Johnson 2016), and finally,

  • the general argument popularized by Taleb (2012) that technology has no need of basic research at all.

In any case, scholars of science and technology studies now widely accept that a linear progression of knowledge from basic research to practical application is at best only one of many approaches to discovery (Meyer 2000), at least insofar as technology is knowledge (Layton 1974; Johnson 2005).

2 Reframing the role of technology in processes of science

In contrast to thinking of technology exclusively as a product of science, there is another way of talking about the relationship between science and technology which emphasizes the role technology plays in enhancing the processes of science and extends the related discussion in Layton (1974) and subsequent literature (e.g., Hughes 1986). Figure 1 illustrates this view, in which technology (taken literally, the study of technique—Chaharbaghi and Willis 2000) improves science in at least three ways:

Fig. 1 Three ways technology improves science

  1. Enhances observational powers to create data (sensing). New scientific instruments expand the techniques of sensing phenomena to make observations and measurements in the natural world. For example, satellite imagery enabled space-based measurements of stratospheric ozone, which led to new fundamental discoveries in atmospheric chemistry. In this way, technological advancement in one field may result in new powers of observation in others. While basic science can certainly produce such advances, not all of the enabling technologies of measurement resulted from organized scientific inquiry. It was the work of craftsmen that created the advanced glass lenses for microscopes and telescopes, extending vision to the scales of both the microbe and the solar system (Yong 2016).

  2. Accelerates calculation, computation, and organization of data (thinking). Technological systems of measurement, computation, modeling, and organization of data may have been the enabling technologies that differentiated the scientific method from other modes of inquiry after the Renaissance (Crosby 1997). The logical positivist view that elevates prediction and empirical verification to the highest form of evidence in the scientific method only became possible after the invention of telescopes, chronometers, sextants, differential calculus, and the ephemerides necessary to record and model the positions of celestial bodies.

  3. Amplifies communication of results (sharing). Though the ancient Greeks relied principally on the spoken word for sharing knowledge, it is only by their written works that we retain and share their knowledge today. Since at least 1665, when Philosophical Transactions became the first scientific journal, the technology of archival record and science communication has been the printing press (Triggle and Triggle 2016). Even today, despite extraordinary advances in information communication technologies, the journal article (albeit in portable document format, or PDF) remains the fundamental unit of scientific expression. Nonetheless, as the marginal cost of data and computational effort decays to essentially zero, science may be on the cusp of a major technological transformation in the communication of science and the institutions that curate scientific knowledge.

While we often think of the ways in which technology enhances scientific sensing through new instruments, the most recent technological advances, and those that may become the most impactful in the information age, are in the ways we think about and communicate science. That is, advances in computing technology have produced more rapid data processing and modeling tools that organize the observations made by advanced instrumentation, and advances in information communication technology (ICT) have amplified our powers of scientific communication, curation, and collaboration in ways that may eventually transform what we consider “normal” habits in science (Kuhn 2012).

3 Novel approaches to curating science at the International Symposium on Sustainable Systems and Technology

The International Symposium on Sustainable Systems and Technology (ISSST) experiments with many new archival science communication tools. For example, the Proceedings of the conference are made freely available online, under a CC-BY license, on the science publishing site figshare.com (ISSST 2013, 2014, 2015, 2016). The Proceedings are registered with an international standard serial number (ISSN) assigned by the US Library of Congress. All contributions receive digital object identifiers, and they are indexed by Google Scholar. As of December 2016, the three volumes of the Proceedings had garnered nearly 14,000 views and been downloaded nearly 7,000 times (Figshare 2016).

Advanced ICT tools present novel opportunities for live conversations and platforms for scientists to confer. For example, at the 2013 conference (issst2013.net) in Cincinnati, Ohio, ISSST held a special session that was video-linked to a meeting of the Society of Environmental Toxicology and Chemistry (SETAC) being held concurrently in Glasgow, Scotland. This enabled participants to join both deliberations, albeit electronically, without the cost of travel or sacrificing one opportunity for another. Since the 2014 conference in Oakland, California (issst2014.net), all 3 days of presentations at every ISSST have been livestreamed over YouTube, including the plenary and three simultaneous, parallel sessions. In 2014, this was still a novel and expensive proposition, and the ISSST leadership received a $34,000 quote from a multimedia technology provider for livestreaming over dedicated Ethernet lines. Because that price tag was unaffordable, we enlisted an 18-year-old college freshman volunteer, who made the technology work over hotel WiFi using less than $3,000 of hardware built for videogame streaming. The technology allows the option of exclusively online participation, including remote presentations, and we have been using that platform ever since.

At the ISSST, conference participants experiment with ICTs that have the potential to change the fundamental communicative unit of science from the journal article to a much more diverse portfolio of digital research products, including open datasets, post-publication peer review, video streaming, interactive posters, and other emerging digital media. In fact, we have already seen an explosion of alternative routes for dissemination of science online, including Twitter, YouTube and Vimeo, Twitch, SlideShare, and digital publication platforms like Scribd, Kindle, WordPress, and LinkedIn. For example, in one 2015 session in Dearborn, Michigan (issst2015.net), a presenter in Germany, speaking to an audience in the USA, answered a question from India, all in real time.

“Preprint” platforms are proliferating (Nature News 2016; Berg et al. 2016). There are new efforts to launch archives such as bioRxiv, SocArXiv, and engrXiv, the last two of which are part of the Open Science Framework (OSF) preprint service. The goal of these platforms seems to be to provide a common mechanism to cross-post to multiple communities at once. For example, SocArXiv has a “Social and Behavioral Sciences” subarea, and engrXiv has an entire taxonomy of engineering subject areas. Similarly, ArXiv.org, a pre-publication repository now more than 25 years old, contains over a million documents. Newer platforms include The Winnower and figshare.com. Additionally, science-specific social networks such as researchgate.net and academia.edu have recently become available, as have new platforms for identifying and recognizing scholars and contributions to scholarship, such as ORCID and Publons.

It is impossible to foretell what will emerge from the evolving relationship between technology and science. However, it is unlikely that a single technological platform like the printing press will dominate the next 400 years of communication, curation, and collaboration in science. Certain trends, such as increasing open access and post-publication review, are undeniable. Social pressures to reduce cost, shorten time from submission to publication, increase reliability, and overcome obstacles to meritocracy are unlikely to subside. Taken in aggregate, the only thing certain is more rapid change, both in the process of producing knowledge and in the forms and channels through which it is communicated.

4 Overview of the special section

This special section of Environment Systems and Decisions is inspired by the mutualistic relationship between science and technology that is enabled by advances in ICT (Seager 2014a). All four papers emerged from the annual Proceedings of the International Symposium on Sustainable Systems and Technology (issst2016.net) held in May 2016 in Phoenix, Arizona. The four papers included herein demonstrate and highlight how technological improvements in computer programming and ICT continue to reshape normative expectations of the execution and communication of science.

Each of the papers addresses the treatment of uncertainty in data. Many of our mathematical habits in science (e.g., representing whole datasets with highly aggregated point estimates such as the mean, or assuming a normal, bell-shaped distribution) were developed during times of computational scarcity. Because working with highly aggregated point estimates is computationally cheaper than working with whole datasets, and because normal (or Gaussian) distributions lend themselves readily to closed-form manipulations that are computationally inexpensive, our habits of data analysis formed around these techniques. These habits have become so ingrained that they have persisted into an age that is no longer limited by computational constraints. Existing practices for treating and communicating life cycle assessment data now seem overly simplistic (Refsgaard et al. 2007; Ascough et al. 2008), given the ease with which data can be represented as probability distributions and uncertainty can be explored with software tools like Analytica (Lumina Decision Systems) or Excel plug-ins like @Risk (Palisade) and Crystal Ball (Oracle) that automate Monte Carlo sampling. Ease of computation is no longer a defensible reason for publishing software tools or models that work exclusively with point estimates or highly aggregated data, or that fail to report levels of precision in model outputs. Although challenges remain, including the treatment of Bayesian or Markovian dependencies between model variables, the trend toward incorporation and publication of explicit representations of uncertainty is unmistakable.
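As a minimal sketch of the contrast between these two habits (the toy model, variable names, and distributions below are illustrative assumptions, not drawn from any of the papers in this issue), a few lines of Python are enough to propagate input uncertainty through a simple impact calculation by Monte Carlo sampling rather than collapsing everything to a single point estimate:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo sample size; trivial on modern hardware

# Hypothetical life cycle inventory inputs expressed as distributions
# rather than single point estimates (all values are illustrative only)
electricity_kwh = rng.normal(loc=12.0, scale=1.5, size=n)         # kWh per functional unit
grid_intensity = rng.triangular(0.35, 0.45, 0.60, size=n)         # kg CO2e per kWh
transport = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=n)    # kg CO2e per functional unit

# Toy impact model: total greenhouse gas emissions per functional unit
total = electricity_kwh * grid_intensity + transport

# The old habit: a single number computed from aggregated point estimates
point_estimate = 12.0 * 0.45 + 0.8

# The now-inexpensive alternative: report the distribution of outcomes
print(f"point estimate:   {point_estimate:.2f} kg CO2e")
print(f"Monte Carlo mean: {total.mean():.2f} kg CO2e")
print(f"90% interval:     {np.percentile(total, 5):.2f} to {np.percentile(total, 95):.2f} kg CO2e")
```

Spreadsheet plug-ins like @Risk and Crystal Ball automate this same kind of sampling behind the scenes; the point here is only that reporting an interval rather than a single number no longer carries any meaningful computational cost.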

The papers in this special issue contradict and counteract these old habits. For example, Kuczenski et al. (2017) tackled the vexing issue of data security in a new era of transparency, while Dundar et al. (2017) explicitly explored scenario-driven uncertainty in assessment of the capacity of local food production to sustain a small community in the heart of US farm country. Hanes and Carpenter (2017) and Collier et al. (2017) both explored environmental assessment in the highly uncertain world of emerging technologies, applying computationally intensive approaches to explore possibility envelopes and orient development toward environmentally preferable outcomes.

5 The future of scientific societies as networks

For 2017, the ISSST will be held in conjunction with the biennial meeting of the International Society for Industrial Ecology, June 25–29, 2017, in Chicago, IL (issst2017.net), and the partnership calls for a less experimental approach to technology, though it remains experimental in terms of merging two differentiated (albeit overlapping) communities. Nonetheless, the ISSST has spawned a novel idea for a new kind of scientific society that leverages the tools of ICT to accelerate collaboration and co-production of knowledge of interest to the readers of Environment Systems and Decisions. The new society is called the Sustainability Conoscente Network (or “Conoscente” for short), and it is registered as a federally tax-exempt 501(c)(3) organization in the USA (Seager 2014b). The purpose of the Conoscente is to further experiments in conferring, reviewing, publishing, collaborating, and structuring cross-disciplinary scientific organizations through communication.

What makes the Conoscente unique is the realization that science communication is not just something that happens within an organization, but is the process by which the organization itself is structured (Hinrichs et al. 2016). Therefore, as ICT empowers scientists to communicate in increasingly interconnected ways, the structure of the organization is likely to take on new forms that are consistent with the new modes and patterns of communication. Those modes will be less hierarchical and more decentralized, and they will take on topologies that are less tree-like and more web-like (Lima 2012). In short, the most effective scientific organizations of the twenty-first century are likely to function more like networks, in which participants are highly interconnected with one another. This mode of organizing contrasts with the classic configuration of a scientific community, the center, in which all participants are connected to a single, central hub. Thus, the mission of the Conoscente is to connect the readers of this special issue, and participants in ISSST, in a network that is closer to the leading edge of sustainability science and technology in a post-industrial age.
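As a rough illustration of this structural difference (the community size and graph parameters below are arbitrary assumptions, not measurements of any actual scientific society), a short sketch using the networkx library contrasts a hub-and-spoke "center" with a web-like network:

```python
import networkx as nx

n = 100  # illustrative community size

# Classic "center": every participant connects only to a single central hub
center = nx.star_graph(n - 1)  # node 0 is the hub

# Web-like network: participants are densely interconnected with one another
web = nx.connected_watts_strogatz_graph(n, k=6, p=0.2, seed=1)

for name, g in (("center (star)", center), ("web (small-world)", web)):
    # How interconnected are participants among themselves?
    clustering = nx.average_clustering(g)
    # What remains connected if the single best-connected node disappears?
    hub = max(g.degree, key=lambda nd: nd[1])[0]
    remainder = g.subgraph(set(g.nodes) - {hub})
    largest = max(len(c) for c in nx.connected_components(remainder))
    print(f"{name:18s} clustering = {clustering:.2f}, "
          f"largest component after hub removal = {largest} of {n - 1}")
```

In this toy contrast, the star falls apart as soon as its hub is removed, while the web retains nearly all of its participants in a single connected component, which is the structural intuition behind organizing a scientific society as a network rather than around a central hub.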