Internet History: Backbone

Original author: Creatures of Thought
  • Translation


In the early 1970s, Larry Roberts came to AT&T, the huge US telecommunications monopoly, with an interesting offer. At the time he was director of the computing division of the Advanced Research Projects Agency (ARPA), a relatively young organization within the Department of Defense engaged in long-term, blue-sky research. Over the preceding five years, Roberts had overseen the creation of ARPANET, the first significant computer network, linking computers at 25 different sites across the country.

The network was a success, but its long-term operation, and all the bureaucracy that came with it, did not fit ARPA's mission. Roberts was looking for a way to offload the task onto someone else, and so he approached the directors of AT&T to offer them the "keys" to the system. After carefully considering the proposal, AT&T rejected it. The company's senior engineers and managers believed that the fundamental technology of ARPANET was impractical and unstable, and that it had no place in a system designed to provide reliable, universal service.

ARPANET, of course, became the seed around which the Internet crystallized: the prototype of a vast information system spanning the whole world, whose kaleidoscopic capabilities defy enumeration. How could AT&T fail to see such potential? Was it really so stuck in the past? Bob Taylor, who hired Roberts to oversee the ARPANET project in 1966, later put it bluntly: "Working with AT&T would be like working with Cro-Magnons." But before we greet these seemingly ignorant corporate bureaucrats with hostility, let us take a step back. Our subject is the history of the Internet, so it would help first to get a more general idea of what we are talking about.

Of all the technological systems created in the second half of the 20th century, the Internet has arguably mattered most to the society, culture, and economy of the modern world. Its closest rival in this respect is probably jet air travel. Using the Internet, people can instantly share photos, videos, and thoughts, wanted and unwanted alike, with friends and relatives around the world. Young people living thousands of kilometers apart now routinely fall in love, and even marry, within a virtual world. An endless shopping mall is accessible at any hour of the day or night from millions of comfortable homes.

For the most part all of this is familiar, and all of it is true. But, as the author can personally confirm, the Internet has also proved to be perhaps the greatest distraction, time sink, and source of mental corrosion in human history, surpassing even television in this respect, which was no small feat. It has allowed cranks, fanatics, and conspiracy theorists of every kind to spread their rubbish around the globe at the speed of light; some of that material may be harmless, some of it is not. It has allowed many organizations, private and public alike, to slowly accumulate, and in some cases quickly and shamefully lose, vast mountains of data. In all, it has become an amplifier of human wisdom and human folly, and the quantity of the latter is frightening.

But what of the subject itself, its physical structure, all the machinery that allowed these social and cultural changes to take place? What is the Internet? If we could somehow filter this substance out into a glass vessel, we would see it settle into three layers. At the bottom would be the global communications network. This layer predates the Internet by about a hundred years; at first it consisted of copper or iron wires, but these have since been supplanted by coaxial cables, microwave relays, optical fiber, and cellular radio.

The next layer consists of computers that communicate over this network using common languages, or protocols. Among the most fundamental of these are the Internet Protocol (IP), the Transmission Control Protocol (TCP), and the Border Gateway Protocol (BGP). This is the core of the Internet itself, and its concrete expression is a network of special-purpose computers called routers, responsible for finding a path along which a message can travel from its source computer to its target.

Finally, the top layer holds the various applications that people and machines use to work and play over the Internet, many with specialized languages of their own: web browsers, messaging applications, video games, trading applications, and so on. To use the Internet, an application merely has to wrap its message in a format the routers understand. The message might be a chess move, a tiny slice of a movie, or a request to transfer money from one bank account to another; the routers do not care, and will treat it all the same.
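The routing layer's indifference to message content can be sketched in a few lines of code. The toy header below is purely illustrative, not real IP (an actual IPv4 header also carries a version field, TTL, checksum, and more), but it shows the essential idea: the application wraps its payload, and the "router" reads only the header, never the payload.

```python
import struct

def encapsulate(payload: bytes, src: str, dst: str) -> bytes:
    """Wrap an application payload in a toy fixed-size header:
    source address (4 bytes), destination address (4 bytes),
    payload length (2 bytes), all big-endian."""
    def addr_to_int(a: str) -> int:
        p = [int(x) for x in a.split(".")]
        return (p[0] << 24) | (p[1] << 16) | (p[2] << 8) | p[3]
    header = struct.pack("!IIH", addr_to_int(src), addr_to_int(dst), len(payload))
    return header + payload

def route(packet: bytes) -> str:
    """A 'router' inspects only the 10-byte header; the payload is opaque."""
    _src, dst, _length = struct.unpack("!IIH", packet[:10])
    return ".".join(str((dst >> s) & 0xFF) for s in (24, 16, 8, 0))

packet = encapsulate(b"e2e4", "10.0.0.1", "10.0.0.2")  # a chess move, say
print(route(packet))  # prints "10.0.0.2"
```

Whether the payload is a chess move or a slice of a movie, `route()` behaves identically, which is exactly the design philosophy ARPANET's architects defended.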

Our story will weave these three threads together to tell the history of the Internet. First, the global communications network; then the splendid variety of programs that let computer users have fun or do something useful over a network; and finally, binding them together, the technologies and protocols that allow different computers to communicate. The creators of those technologies and protocols built on the achievements of the past (the network) while groping toward a dimly imagined future (the programs to come).

Alongside these creators, one of the constant characters in our story will be the state. This is especially true at the level of the telecommunications networks, which were either run by governments or subject to their strict oversight. Which brings us back to AT&T. However unpleasant it was to admit, the fate of Taylor, Roberts, and their ARPA colleagues was hopelessly bound up with the telecommunications carriers, the bottom layer of the future Internet. Their networks depended entirely on the carriers' services. How, then, to explain their hostility, their conviction that ARPANET represented a new world fundamentally opposed to the retrograde telecom bureaucrats?

In truth, the two groups were separated not by a gap in time but by a gap in philosophy. AT&T's directors and engineers saw themselves as caretakers of a vast, complex machine that provided reliable, universal communication from one person to another; the Bell System was responsible for all of its equipment. The architects of ARPANET saw their system as a conduit for arbitrary bits of data, and believed its operators had no business interfering with how that data was created and used at either end of the wire.

So we must begin with the story of how, through the power of the US government, this impasse over the nature of American telecommunications was resolved.

One system, universal service?

The Internet was born in the particular environment of American telecommunications (in the United States, telephone and telegraph providers were treated quite unlike anywhere else in the world), and there is every reason to believe that this environment shaped the development and spirit of the future Internet. So let us look carefully at how it all came about, going back to the birth of the American telegraph.

American anomaly

In 1843, Samuel Morse and his allies convinced Congress to spend $30,000 on a telegraph line between Washington, DC and Baltimore. They believed it would be the first link in a government-funded network of telegraph lines spanning the entire continent. In a letter to the House of Representatives, Morse proposed that the government buy all rights to his telegraph patents and then contract private companies to build out portions of the network, keeping separate lines for official communications. In that case, Morse wrote, it would not be long before "the whole surface of this country would be channelled for those nerves which are to diffuse, with the speed of thought, a knowledge of all that is occurring throughout the land, making, in fact, one neighborhood of the whole country."

It seemed to him that such a vital communications system naturally served the public interest and therefore fell within the sphere of government concern. Providing communication among the several states through the mails was one of the federal government's tasks specifically enumerated in the US Constitution. But his motives were not entirely a matter of public service: government control offered Morse and his backers a way to bring their venture to a profitable conclusion, a single but substantial payment of public money. In 1845, Cave Johnson, Postmaster General under the eleventh president, James Polk, announced his support for the public telegraph system Morse proposed: "the use of an instrument so powerful for good or evil cannot with safety be left in the hands of private individuals," he wrote. But that was the end of it. The other members of Polk's Democratic administration wanted nothing to do with a public telegraph, and neither did the Democratic Congress. The party disliked Whig schemes that made the government spend money on "internal improvements," which Democrats considered an invitation to favoritism, graft, and corruption.

With the government unwilling to act, one member of Morse's team, Amos Kendall, began developing a scheme for a telegraph network backed by private sponsors. But Morse's patent was not enough to guarantee a monopoly on telegraphy. Within ten years, dozens of competitors had appeared, either licensing alternative telegraph technologies (chiefly Royal House's printing telegraph) or simply operating on shaky legal ground. Lawsuits flew in batches, paper fortunes rose and vanished, and failed companies collapsed or were sold off to rivals after artificially inflating their stock. Out of all this turmoil, by the end of the 1860s, one dominant player had emerged: Western Union.

Alarmed cries of "monopoly!" began to spread. The telegraph had already become essential to several aspects of American life: finance, the railroads, and the newspapers. Never before had a private organization grown to such a size. Proposals for government control of the telegraph took on new life. In the decade after the Civil War, congressional postal committees produced a variety of plans to bring the telegraph into the orbit of the postal service. Three basic options emerged: 1) the postal service sponsors a rival to Western Union, granting it special access to post offices and post roads in exchange for limits on its rates; 2) the postal service launches its own telegraph to compete with WU and the other private operators; 3) the government nationalizes the entire telegraph industry and places it under postal control.

Plans for a postal telegraph gained a few loyal supporters in Congress, including Alexander Ramsey, chairman of the Senate postal committee. But most of the campaign's energy came from outside lobbyists, in particular Gardiner Hubbard, who had experience with public utilities as an organizer of Cambridge's municipal water and gas-lighting systems (he later became Alexander Bell's most important early backer and a founder of the National Geographic Society). Hubbard and his supporters argued that a public system would provide the same beneficial spread of information that the paper mails did, while keeping rates low. This approach, they said, would serve the community better than WU's system, which catered to the business elite. WU, naturally, objected.

In any case, the postal telegraph never gathered enough support to become a serious battleground in Congress. Every proposed bill quietly died. The scale of the monopoly had not yet reached the point where it outweighed fears of government abuse. The Democrats regained control of Congress in 1874, the spirit of national reconstruction that had followed the Civil War faded, and the initially feeble push for a postal telegraph ran out of steam. The idea of placing the telegraph (and later the telephone) under government control resurfaced periodically in later years, but apart from a brief period of (nominal) government control of the telephone during wartime in 1918, nothing came of it.

This government neglect of the telegraph and telephone was an anomaly on the world stage. In France, the telegraph was nationalized even before it was electrified: in 1837, when a private company tried to set up an optical telegraph (using signal towers) alongside the existing government-controlled system, the French parliament passed a law banning any telegraph not authorized by the government. In Britain, a private telegraph was allowed to develop for several decades, but public dissatisfaction with the resulting duopoly led to a government takeover in 1868. Across Europe, governments placed telegraphy and telephony under the control of the state mails, just as Hubbard and his supporters had proposed. [In Russia, the state enterprise Central Telegraph was founded on October 1, 1852 / translator's note]

Outside Europe and North America, most of the world was ruled by colonial powers and therefore had no say in the development or regulation of telegraphy. Where independent governments existed, they usually created state telegraph systems on the European model. These systems generally lacked the funds to expand at the pace seen in the United States and Europe. The Brazilian state telegraph company, for example, operating under the Ministry of Agriculture, Commerce and Labor, had only 2,100 km of telegraph lines by 1869, while the United States, a country of similar size with four times the population, already had 130,000 km by 1866.

New deal

Why did the US take such a singular path? One explanation is the spoils system, under which public offices were handed out to supporters of the party that won the election, a practice that prevailed until the last years of the 19th century. The government bureaucracy, down to the local postmasters, consisted of political appointments with which loyal allies could be rewarded. Neither party wanted to hand its opponents a large new source of patronage, which is exactly what would have happened had the telegraph come under federal control. The simplest explanation, though, is the traditional American distrust of powerful central government, the same reason the structures of American health care, education, and other public institutions differ so much from those in other countries.

Given the growing importance of electrical communications to public life and security, the US could not stand entirely apart from the development of communications. In the first decades of the 20th century, a hybrid system emerged in which private communications companies were checked by two forces: on one side, regulatory bureaucracies that continuously monitored the carriers' rates, ensuring that they did not exploit a monopoly position to extract excessive profits; on the other, the threat of being broken up under the antitrust laws for bad behavior. As we shall see, these two forces could work at cross purposes: the theory of rate regulation held that monopoly was, in certain circumstances, a natural phenomenon, and that duplicating services would be a needless waste of resources. Regulators generally tried to minimize the downsides of monopoly by controlling prices.

Rate regulation was born on the railroads and implemented at the federal level through the Interstate Commerce Commission (ICC), created by Congress in 1887. Small businesses and independent farmers were the law's chief motivating force. They often had no choice but to use the railroads to bring their products to market, and they charged that the railroad companies exploited this, squeezing them for every last dollar while lavishing favorable terms on big corporations. The five-member commission was empowered to oversee railroad services and rates and to prevent abuses of monopoly power, in particular by forbidding the railroads from granting special rates to favored companies (a precursor of the concept we today call network neutrality). The Mann-Elkins Act of 1910 extended the ICC's authority to the telegraph and telephone. But the ICC, focused on transportation, never took much interest in these new responsibilities and practically ignored them.

At the same time, the federal government developed an entirely new weapon against monopolies. The Sherman Act of 1890 gave prosecutors the power to challenge any commercial "combination" suspected of acting "in restraint of trade," that is, of suppressing competition through monopoly power. Over the next two decades the law was used to dismantle several major corporations, including the Supreme Court's 1911 decision to split Standard Oil into 34 parts.

The Standard Oil octopus in a 1904 caricature, before the breakup

By that time telephony, and its dominant provider AT&T, had so far eclipsed telegraphy and WU in importance and resources that in 1909 AT&T was able to buy a controlling stake in WU. Theodore Vail became president of the combined companies and began stitching them together. Vail firmly believed that a benevolent telecommunications monopoly would best serve the public interest, and he promoted the company's new slogan: "One policy, one system, universal service." He had grown ripe for the attention of the trustbusters.

Theodore Vail, ca. 1918

The arrival of the Woodrow Wilson administration in 1913 gave its progressives a good moment to brandish their antitrust club. Postmaster General Sidney Burleson favored fully postalizing the telephone on the European model, but this idea, as usual, gained no traction. Instead, Attorney General George Wickersham took the position that AT&T's continued acquisition of independent telephone companies violated the Sherman Act. Rather than go to court, Vail and his vice president Nathan Kingsbury negotiated an accord with the government, which entered history as the "Kingsbury Commitment," under which AT&T agreed to:
  1. Stop buying independent companies.
  2. Sell its stake in WU.
  3. Allow independent telephone companies to connect to the long-distance network.

But after this moment of peril for the monopoly came decades of calm. The quiet star of rate regulation rose, premised on the existence of natural monopolies in communications. By the early 1920s the pressure had relaxed, and AT&T resumed its acquisition of small independent telephone companies. This arrangement was enshrined in the Communications Act of 1934, which established the Federal Communications Commission (FCC), replacing the ICC as the rate regulator for wire communications. By then the Bell System controlled, by any measure, at least 90% of America's telephone business: 135 of 140 million km of wire, 2.1 of 2.3 billion monthly calls, and 990 million of one billion dollars in annual revenue. But the FCC's chief goal was not to restore competition; it was "to make available, so far as possible, to all the people of the United States, a rapid, efficient, Nation-wide, and world-wide wire and radio communication service with adequate facilities at reasonable charges." If one organization could provide such service, so be it.

In the mid-20th century, local and state telecommunications regulators in the US developed a multi-level system of cross-subsidies to speed the spread of universal service. Regulatory commissions set rates based on the estimated value of the network to each customer, not on the cost of serving that customer. Thus business users, who depended on telephony to conduct commerce, paid more than individuals (for whom the service was a social amenity). Customers in large urban markets, with easy access to many other subscribers, paid more than residents of small towns, despite the greater efficiency of large exchanges. Long-distance users overpaid, even as technology relentlessly drove down the cost of long-distance calls and the profits of the local exchanges swelled. This elaborate system of capital redistribution worked well enough so long as a single monolithic provider encompassed it all.

New technology

We are used to thinking of monopoly as a force of stagnation, breeding complacency and lethargy. We expect a monopoly to guard its position and the status quo jealously, not to serve as an engine of technological, economic, and cultural transformation. Yet it is hard to square that view with AT&T at the peak of its powers, as it produced innovation after innovation, anticipating and hastening each new breakthrough in communications.

In 1922, for example, AT&T opened a commercial broadcast radio station in its Manhattan headquarters, just a year and a half after the debut of the first major station, Westinghouse's KDKA. The following year it used its long-distance network to relay an address by President Warren Harding to local radio stations across the country. A few years later AT&T gained a foothold in the movie industry as well, after Bell Labs engineers developed a machine that synchronized film with recorded sound. Warner Brothers used this "Vitaphone" to release the first Hollywood feature with synchronized music, Don Juan, followed by the first feature-length "talkie," The Jazz Singer.


Walter Gifford, who became president of AT&T in 1925, decided to rid the company of side ventures such as broadcasting and motion pictures, in part to head off an investigation by the antitrust authorities. The Department of Justice had not threatened the company since the Kingsbury Commitment, but it was unwise to attract attention with moves that could be read as an attempt to abuse its monopoly in telephony to gain unfair advantage in other markets. So instead of running its own broadcasting, AT&T became the principal carrier of signals for RCA and the other radio networks, relaying programs from their New York studios and other major cities to affiliate stations across the country.

Meanwhile, in 1927, radiotelephone service spanned the Atlantic, inaugurated by the banal question Gifford put to his counterpart at the British Post Office: "How's the weather in London?" It was no "What hath God wrought!" [the first official message transmitted by Morse's telegraph / translator's note], but it still marked an important milestone: the possibility of intercontinental conversation decades before the laying of an undersea telephone cable, albeit at great cost and low quality.

The developments most important to our story, however, concerned the transmission of large volumes of data over long distances. AT&T always wanted to increase traffic on its long-distance network, which was both its chief competitive advantage over the few surviving independents and a source of large profits. The easiest way to attract customers was to develop new technology that reduced transmission costs, usually meaning the ability to squeeze more calls into the same wires or cables. But, as we have seen, long-distance demand was expanding beyond traditional person-to-person telegraph and telephone messages. Radio networks needed channels of their own, and television was already looming on the horizon with far larger appetites for bandwidth.

The most promising way to meet the new demands was coaxial cable, built from concentric metal cylinders [coaxial: "co-axial," sharing a common axis / translator's note]. The properties of such a conductor had been studied back in the 19th century by the giants of classical physics: Maxwell, Heaviside, Rayleigh, Kelvin, and Thomson. It had enormous theoretical advantages as a transmission line, since it could carry a broadband signal and its very structure shielded it from crosstalk and interference from outside signals. When television development began in the 1920s, no existing technology could provide the megahertz (or more) of bandwidth that high-quality broadcasts required. So Bell Labs engineers set out to turn the cable's theoretical advantages into a working long-distance broadband transmission line, building all the auxiliary equipment needed to generate, amplify, receive, and otherwise process the signals. In 1936, with FCC approval, AT&T began field trials on a cable stretching more than 160 km from Manhattan to Philadelphia. After first testing the system with 27 voice circuits, the engineers succeeded in transmitting video over it by the end of 1937.

Around the same time another demand for high-throughput long-distance links was emerging: radio relay. The radiotelephony used in the 1927 transatlantic service employed a pair of broadcast radio signals to create a two-way voice channel on shortwave. Tying up two radio transmitters and receivers across an entire frequency band for a single telephone conversation made no economic sense for terrestrial communications. But if many conversations could be crammed into a single radio beam, that would be a different story. Although each individual radio station would be fairly expensive, a few hundred such stations should suffice to relay signals clear across the United States.

Two frequency bands competed for use in such a system: ultra-high frequencies (UHF, decimeter waves) and microwaves (centimeter waves). The higher frequencies of microwaves promised greater bandwidth but also posed greater technical difficulty. In the 1930s, responsible opinion at AT&T leaned toward the safer option, UHF.

Microwave technology, however, made a great leap forward during World War II thanks to its intensive use in radar. Bell Labs demonstrated the viability of microwave radio with the AN/TRC-6, a mobile system that could carry eight telephone circuits to another antenna within line of sight. It allowed military headquarters to restore voice communications quickly after redeployment, without waiting for cable to be laid (and without the risk of losing communications when a cable was cut, whether by accident or by enemy action).

A deployed AN/TRC-6 microwave radio relay station

After the war, Harold T. Friis, a Danish-born Bell Labs researcher, led the development of microwave radio relay. A 350 km test line from New York to Boston opened at the end of 1945. The signals hopped between ground towers spaced about 50 km apart, using a principle essentially similar to the optical telegraph, or even a chain of signal fires: up the river to the Hudson Highlands, along the hills of Connecticut, to Mount Asnebumskit in central Massachusetts, and then down to Boston.
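The roughly 50 km hop length follows from simple line-of-sight geometry over the curved Earth. Here is a minimal sketch; the 50 m tower height is an assumed, illustrative figure, and real spacing also depended on terrain and on atmospheric refraction, which stretches the effective horizon:

```python
import math

def horizon_km(height_m: float, radius_km: float = 6371.0) -> float:
    """Distance in km to the radio horizon for an antenna at the given
    height, from the line-of-sight approximation d = sqrt(2 * R * h)."""
    return math.sqrt(2 * radius_km * height_m / 1000.0)

# Two towers of equal height can see each other at roughly twice the
# single-antenna horizon distance (illustrative figures only).
spacing = 2 * horizon_km(50)
print(round(spacing))  # ≈ 50 km
```

Taller towers or hilltop sites extend the hop accordingly, which is why such routes followed high ground.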

AT&T was not the only company interested in microwave communications, or with wartime experience handling microwave signals. Philco, General Electric, Raytheon, and the television broadcasters built or planned their own experimental systems in the postwar years. Philco beat AT&T to the punch, building a link between Washington and Philadelphia in the spring of 1945.

AT&T Microwave Relay Station in Creston (Wyoming), part of the first transcontinental line, 1951.

For more than 30 years AT&T had avoided trouble with antitrust authorities and other government regulators. For the most part it was shielded by the notion of natural monopoly: the idea that it would be terribly wasteful to build many competing, unconnected systems stringing their wires across the country. Microwave relay was the first serious dent in that armor, making it possible for many companies to offer long-distance communications without exorbitant cost.

Microwave transmission drastically lowered the barrier to entry for would-be competitors. Since the technology required only a chain of stations spaced some 50 km apart, a useful system no longer demanded buying thousands of kilometers of right-of-way and maintaining thousands of kilometers of cable. Moreover, the bandwidth of microwaves far exceeded that of traditional cable pairs: each relay station could carry thousands of telephone conversations or several television broadcasts. The competitive advantage of AT&T's existing wired long-distance system was evaporating.

For years, however, the FCC shielded AT&T from the effects of such competition through two decisions made in the 1940s and 1950s. First, the commission refused to issue licenses, other than temporary, experimental ones, to new communications providers that did not offer their services to the public at large (but, say, only carried traffic within a single enterprise). Entering the public market therefore meant risking the loss of one's license. The commissioners worried about a recurrence of the very problem that had plagued broadcasting twenty years earlier and led to the creation of the FCC itself: a cacophony of interference from many transmitters polluting a limited radio band.

The second decision concerned interconnection. Recall that the Kingsbury Commitment required AT&T to let local telephone companies connect to its long-distance network. Did that requirement apply to microwave relay operators? The FCC ruled that it applied only where there was no adequate coverage by the public network. Any competitor building a regional or local network therefore risked being suddenly cut off from the rest of the country whenever AT&T decided to enter its territory. The only alternative for staying connected was to build a new national network of one's own, a daunting prospect under an experimental license.

By the end of the 1950s only one major player remained in the long-distance telecommunications market: AT&T. Its microwave network carried 6,000 telephone circuits on each route and reached every state in the continental US.

AT&T microwave radio relay network in 1960.

The first significant challenge to AT&T's total, all-encompassing control of the telecommunications network, however, would come from an entirely different direction.

What else to read

  • Gerald W. Brock, The Telecommunications Industry: The Dynamics of Market Structure (1981)
  • John Brooks, Telephone: The First Hundred Years (1976)
  • M. D. Fagen, ed., A History of Engineering and Science in the Bell System: Transmission Technology (1985)
  • Joshua D. Wolff, Western Union and the Creation of the American Corporate Order (2013)
