In 370 BCE, Plato, speaking through the mouth of Socrates, lamented the introduction of writing, fearing it would weaken people's ability to memorise. Since then, every development in communication has been vilified by some, and its positive aspects undervalued.
The first long-distance communication media were the electric telegraph (1837) and, later, the telephone (Alexander Bell, 1876). Both relied on vast networks of wires to carry information. In 1895, the controlled generation of specific electromagnetic waves, wirelessly in free space, laid the foundation for wireless telegraphy (radio transmission) and became the springboard for technologies that are now ubiquitous.
British scientists accelerated the speed at which people on different continents could communicate. It was William Thomson's application of scientific principles and technological innovation that brought about the first successful transatlantic underwater electric telegraph, in 1866. This paved the way for rapid global communication through insulated cables, bringing world-wide social and material benefits. It helped Great Britain capture a pre-eminent place in world communications, connecting the country to its Empire and trading partners throughout the world, and became a key component of economic inter-connectedness.
Nowadays, over 90% of the world's data and web traffic travels close to the speed of light through fibre-optic cables across the sea floor, in the underwater web.
Alexander Bell was groomed and educated to follow in his father's and grandfather's footsteps, studying the mechanics of speech, teaching elocution, and working with the deaf community. In 1844, Samuel Morse sent his first telegraph message. Bell wanted to transmit human speech rather than the clicks of Morse code. His first step was a 'harmonic telegraph' based on six steel reeds in parallel, each responding to one specific frequency. A corresponding reed at the far end of the line vibrated in sympathy, allowing several messages to be sent simultaneously over a single telegraph wire.
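The multiplexing idea behind the harmonic telegraph can be sketched in a few lines of code: several tones share one wire, and each receiving reed resonates only at its own frequency. The six frequencies below are hypothetical stand-ins, not Bell's actual reed values, and resonance is approximated by correlating the line signal with each reed's tone.

```python
import math

REED_FREQS = [400, 500, 600, 700, 800, 900]  # hypothetical reed pitches (Hz)
RATE = 8000                                   # samples per second
N = 800                                       # 0.1 s of signal

def line_signal(active_freqs):
    """Sum the tones of the currently vibrating reeds onto one wire."""
    return [sum(math.sin(2 * math.pi * f * t / RATE) for f in active_freqs)
            for t in range(N)]

def reed_responds(signal, freq):
    """A receiving reed resonates when its own frequency is present in the
    shared signal: modelled as correlation with that tone."""
    c = sum(s * math.sin(2 * math.pi * freq * t / RATE)
            for t, s in enumerate(signal))
    return abs(c) / N > 0.1                   # resonance threshold

sig = line_signal([500, 800])                 # two messages share the wire
detected = [f for f in REED_FREQS if reed_responds(sig, f)]
# detected -> [500, 800]: each reed picks out only its own message
```

Because the tones complete whole numbers of cycles in the sampled window, the correlations for absent frequencies vanish, which is the same separation the tuned reeds achieved mechanically.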
Just as the density of air varies when a sound passes through it, Bell conceived that an electric current could be made to change in response to sound. Believing that human speech sets up a wave-like pattern in air, he aimed to produce an electric wave following the same pattern. By 1875, Bell, aged only twenty-eight, and the electrician Thomas Watson had created a crude telephone apparatus, leading to a patent application.
Bell's 1876 US patent was one of the most lucrative ever granted, making him extremely rich. The first intelligible telephone communication soon followed and, in 1877, the first 'speaking phone' became available for commercial use. The transmitter and receiver depended on the principle of Faraday's electromagnetic induction. Responding to a sound, a vibrating membrane moved a magnet set within a coil, inducing fluctuations in an electric current that corresponded to the sound. When this undulating current passed through an electromagnetic coil in the receiver, it caused a magnet to vibrate against a membrane, reproducing the original audible sound.
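The analogue chain described above can be sketched numerically: by Faraday's law the induced current follows the rate of change of the flux driven by the membrane, and the receiver's electromagnet converts that undulating current back into membrane motion, recovering the wave shape. The single test tone is an illustrative stand-in for speech; the units are arbitrary.

```python
import math

RATE = 8000
tone = [math.sin(2 * math.pi * 440 * t / RATE) for t in range(800)]

def transmit(sound):
    """Induction: the current tracks the *rate of change* of the
    membrane-driven flux, approximated by a finite difference."""
    return [sound[t + 1] - sound[t] for t in range(len(sound) - 1)]

def receive(current):
    """The receiver accumulates the undulating current back into
    membrane displacement, i.e. it integrates the signal."""
    out, level = [], 0.0
    for i in current:
        level += i
        out.append(level)
    return out

reproduced = receive(transmit(tone))
# reproduced follows the original tone's wave shape sample for sample
```

Differentiation at the transmitter and integration at the receiver cancel, which is why the listener hears the original waveform rather than its derivative.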
The telegraph and the telephone were both wire-based electrical systems, and Bell's success with the telephone came as a direct result of his attempts to improve the telegraph. The reproduction of human speech via the telephone refashioned the way people communicate: they could converse remotely and directly, without leaving their homes and without any intermediary. It became indispensable to households, businesses, and governments.
Combined with radio technology, the mobile telephone has become a ubiquitous wireless tool for both global communication and information exchange. It is claimed that, in 2020, 68% of the world's 7.8 billion inhabitants had a mobile device, almost half of them 'smartphones'.
We have long been fascinated by mathematics (witness Newton and Maxwell) and by how its application can extend our understanding of natural phenomena. In addition, to assist with the collection, manipulation, and analysis of data, we have helped develop the world of computers.
In 1833, Babbage, and a century later Alan Turing, conceptualised the structural architecture of a computer. From an early age, Turing was obsessed with puzzles and codes; his code-breaking prowess would prove invaluable during the Second World War. His teacher described him as a genius. He went on to take a first-class honours degree with distinction at Cambridge University, where he was elected a Fellow at the age of twenty-two.
In 1936, around the time he began his PhD at Princeton University, Turing developed the notion of a universal computing machine, describing the basic principles of a computer in what became known as the Universal Turing Machine. Tommy Flowers and co-workers at Bletchley Park went on to build 'Colossus', the first programmable electronic computer, in 1944, to attack the German Lorenz teleprinter cipher; Turing's own code-breaking expertise, embodied in the electromechanical 'Bombe', helped decipher military messages from the German encrypting machine, Enigma, intercepted during WW2. Churchill called Bletchley his 'secret weapon'; its work may have shortened the war by up to two years and saved countless lives.
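Turing's idea can be made concrete in a few lines: a machine that reads one tape cell at a time and acts according to a finite table of rules. The rule table below (a unary increment machine, appending one mark to a row of marks) is an illustrative example, not one from the text.

```python
def run(tape, rules, state="start"):
    """Simulate a simple Turing machine: tape is a dict of cell -> symbol,
    rules maps (state, symbol) -> (symbol_to_write, move, next_state)."""
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, "_")             # "_" denotes a blank cell
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return tape

# Scan right over the 1s, write one more 1 on the first blank, then halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
tape = run({0: "1", 1: "1", 2: "1"}, rules)     # unary 3 -> unary 4
```

The striking point, as Turing showed, is that one such machine can be made *universal*: given a description of any other machine's rule table on its tape, it can simulate that machine, which is the basic principle of the stored-program computer.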
The Industrial Revolution had revealed that customised machines could do what vast swathes of human beings did, but more efficiently. The work at Bletchley demonstrated that a computing machine, based on electronics, could automate the efforts of thousands of human computing assistants. This innovation spawned a technology that became inextricably woven into the industrial and social life of the late twentieth and twenty-first centuries. Omnipresent computers are now so indispensable to our societies that life grinds to a halt when they stop working.
The UK emerged from the Second World War with a technological edge in electronics, computers, and programming. However, that lead did not flourish, owing to benign neglect of the United Kingdom's manufacturing industry. It has no Intel, Samsung, Lenovo, Hewlett-Packard, Dell, Apple, Sony, Siemens or Google.
Turing himself was always ahead of his time. In 1950, he grappled with the question, "Can machines think like humans?". Six years later the term 'artificial intelligence' appeared. In 1966, the first Turing Award, the 'Nobel Prize' of computing, was presented. Its 2016 recipient was Tim Berners-Lee, another Briton who made a major contribution to the way we communicate via computers.
The Internet is a huge network of disparate computers all linked together. Enabling useful, interactive connections between networks called for common protocols. In 1990, for the benefit of his computer-using co-workers at CERN, Tim Berners-Lee created an "internal web" of pathways for the free flow of documents. Extending his horizons, he then laid a "world-wide information web" over the pre-existing Internet. Some of the enablers already existed, but three were devised by Berners-Lee himself: HTML, the document publishing language of the Web; the URI, a unique name and address for each resource on the Web; and HTTP, which allowed retrieval of linked resources.
Berners-Lee wrote the key code both for the client computer seeking information and for the server that releases information to that client, as well as for the Web pages where the information is held. His hypertext system quickly became a universal infrastructure for on-line communication and the foundation of many other industries. He intended his system to be powerful and immediately useful rather than perfect. His specifications for the nuts and bolts of the World Wide Web (WWW, W3) have since been refined, but remain essentially the same.
The number of websites grew from just one in August 1991 to over 1.94 billion in January 2019, and over 4 billion people (53% of the world's population) use the Internet. New industries emerged to fill in the missing capabilities for a host of commercial applications. Google became the dominant provider of Internet searches; by 2017 it had indexed 135 trillion Web pages, a figure that is constantly growing.
The WWW is a communication superhighway which has fundamentally changed the way we work, shop, play, and correspond with friends and family via social-networking sites, blogs, and video sharing. Whilst "cyber power" is revolutionising the way individuals live their lives, it has also started to influence the way governments protect their citizens and fight wars. The Web has become the most far-reaching anthropological study in human history, and it reveals that an array of unintended and potentially harmful consequences is emerging.
Most importantly, the Web is open, non-proprietary, and free because Berners-Lee and his employer, CERN, as an altruistic gesture, elected not to patent his invention, nor to use any technology that required royalties to be paid.
Berners-Lee made it possible to communicate more effectively across the globe, unrestricted by cables. He has done more to connect the world than anyone before him. Digital devices and infrastructure, almost as important as the invention of the wheel, have become to the 21st century what new transport systems and infrastructure were to the 19th and 20th centuries.
Communication: Telephone, Computers and WWW (Chapter 15)