Telecommunications Strategy

In 1877 Alexander Graham Bell and his backers formed the Bell Telephone Company, and in 1878 the first commercial telephone exchange opened in New Haven, Connecticut; the American Telephone and Telegraph Company (AT&T) was incorporated in 1885. The basic service was voice telephony: phones were rented to customers, and a customer who wished to place a call rang a bell on the phone to alert an operator at a central office. The operator spoke to the customer over an analog voiceband circuit, cross-connected the caller's wires to those of the called party, and rang the called party's bell to announce the call. Technology advanced: operators were replaced by electromechanical switching systems that connected wire pairs automatically, central offices were linked by trunks (transmission lines that increasingly used microwave technology), and constant incremental change occurred in every facet of the network, so that roughly 1B telephones were deployed around the globe by 1980.
Government regulation fundamentally distorts the economics of telecommunications. In 1913 Theodore Vail, president of AT&T, cut a deal with the federal and state governments that would stand until the breakup of 1984: in return for accepting regulation, with one interstate tariff and a separate tariff for each state, and for providing universal interconnection of all telephones (in particular to the thousands of independent, non-Bell phone companies), AT&T was allowed to function as a profitable holding company. The Federal Communications Commission, created in 1934, began to distort pricing so that it no longer reflected the true underlying costs: long distance was priced much higher than local access, and business service much higher than residential service, but at the end of the day all parties were happy to receive better and better voice telephone service.

The invention of the transistor in 1947 and the integrated circuit in 1959 led to the introduction of digital transmission in 1962 and to computer control of the switching function, which allowed new services such as call waiting and voicemail.

The 1960s saw the proposal, and the early 1970s the demonstration, of optical fibers with very low loss, along with lasers and detectors that became increasingly powerful and reliable.

The 1960s saw the spread of cable networks that carried television signals over coaxial cable, typically to regions that were not in the line of sight of the original broadcast transmitter. In the 1970s these networks were extended by placing a satellite receiving antenna at the head end, so that television signals from many different locales could be sent over a single cable system.

The 1970s saw the advent of mobile radio handsets, with the radio network reusing frequencies by tessellating a region into cells: each cell used only certain frequencies, adjacent cells used different ones, and the same frequencies were reused in cells farther out.

In the 1990s the thirty-odd major telecom service providers around the globe began to migrate their access networks from copper wire carrying analog transmission to copper carrying high-speed digital transmission, to coaxial cable carrying digital transmission, or to wireless microwave access. The electromechanical switching systems, which could handle only voice, were replaced by digital packet switches with no moving parts that could handle voice, data, television, graphics, and telemetry. The backbone analog microwave network was replaced with digital transmission over optical fiber. The end result is that today virtually every person on the planet has a mobile handset, with over 7B handsets in operation.

This is complemented by the Internet and World Wide Web, whose billions of connected devices are rapidly surpassing the number of mobile handsets.

Ultimately it is necessary to look at the underlying economics of telecommunications to determine how to build a viable, sustainable service business over the long term. There are three capital cost elements: the telephone terminal, the interconnection network, and the switching system. When Alexander Graham Bell chose copper wires to connect telephones to a central switching system, this required either erecting poles and hanging copper wires from them, or digging up the ground and laying conduit to carry the wires. The initial cost of installing poles is far less than that of underground conduit, but either way these costs are far and away the largest capital cost element in the system. The capital outlay for the telephone terminal can be defrayed by renting the phone to the subscriber, as long as the subscriber is willing to sign a long-term service contract. The switch began as plugboards of cable that human operators interconnected for each call; these were replaced by electromechanical switching systems that used relays to interconnect pairs of copper wires. Because the switching system was shared among all subscribers, its cost per subscriber was far less than the cost of the interconnection network. In the jargon of economics, there are fixed costs and variable costs, and the system Bell envisioned had high fixed costs (dominated by the copper wire network) and relatively low variable costs (the telephone terminal and the per-subscriber share of the switching system).
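The fixed-versus-variable split above can be sketched numerically. The dollar figures below are purely illustrative assumptions, not historical costs; the point is only that a shared fixed cost falls per subscriber as the subscriber base grows, while the variable cost does not.

```python
def cost_per_subscriber(fixed_network_cost, variable_cost_per_sub, subscribers):
    """Average capital cost per subscriber when a fixed plant is shared."""
    return fixed_network_cost / subscribers + variable_cost_per_sub

# Hypothetical numbers: a $10M copper plant, $150 per terminal and switch port.
for subs in (1_000, 10_000, 100_000):
    print(subs, cost_per_subscriber(10_000_000, 150, subs))
# prints:
# 1000 10150.0
# 10000 1150.0
# 100000 250.0
```

The wired network only becomes economical at high penetration: the $10M plant swamps the per-subscriber cost until enough subscribers share it.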

For one hundred years the world was dominated by copper wire telecom networks, reaching roughly 1B phones in the 1980s. This led to advances in transmission and switching built on fundamental advances in physics, chemistry, and mathematics, all driven by the underlying economics. Because of the perceived value of telecommunications, government regulation began to distort the underlying economics for the perceived public good: business service was priced far above underlying cost, subsidizing residential service, while long distance interoffice connection (which enjoyed economies of scale from sharing interoffice links among so many subscribers) was priced far above underlying cost in order to subsidize local service. Typically 80% of all telephone calls were handled within a local switching system while 20% interconnected elsewhere, and 20% of all local phone lines served businesses while 80% served residences. The result was highly reliable telephone service at an acceptable price, with the telecom service provider enjoying an acceptable profit to sustain operations. Providers relied on debt to fund operations: once subscribers signed up for service they stayed for years, generating a very predictable cash flow stream per subscriber that could sustain payments of principal and interest.

The advent of solid state electronics with the invention of the transistor in 1947 led to a surge of technology that would overturn all of this. A first example came in the 1950s, when photovoltaic solid state devices were deployed to power remote transmission elements. Bell had chosen copper wire pairs to connect phones to switching systems, and sending an analog signal over such a pair limited its range to roughly 18,000 feet. Vacuum tube amplifiers were invented to extend that range, but they amplified the noise intrinsic to the circuitry along with the signal, which again limited the range. Just as important was the need to supply electric power to these remote amplifiers or repeaters, a significant capital outlay intrinsic to the interconnection network; generating power locally from sunlight, where it was needed, could over time be a major change in the cost structure.

Solid state electronics could handle both analog and digital signals. The realization that an analog signal could be encoded as a digital one meant the digital signal could be regenerated again and again along the interconnection network without amplifying the noise picked up along the way, and at the far end it could be converted back to analog. This led in the 1960s to the deployment of digital transmission systems over copper wires, with a key change: digital transmission allowed control bits to be sent alongside information bits, creating a separate control network to manage the information network. Also in the 1960s, the control of electromechanical switching systems was replaced by digital electronics (a computer with its own software controlling the interconnection of wire pairs); computer control allowed new services such as call waiting, call forwarding, and three-way calling to be deployed cost-effectively.
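Why regeneration beats amplification can be shown with a toy simulation (a sketch with made-up noise levels, not a model of any real carrier system): an analog repeater re-amplifies whatever noise has accumulated so far, while a digital regenerator re-decides each bit at every hop and discards that hop's noise.

```python
import random

random.seed(42)  # deterministic toy run

def analog_link(sample, hops, noise_sigma=0.05):
    """Analog repeaters restore signal level, but the noise picked up at
    each hop is carried (and re-amplified) for the rest of the journey."""
    for _ in range(hops):
        sample += random.gauss(0.0, noise_sigma)  # hop noise rides along forever
    return sample

def digital_link(bit, hops, noise_sigma=0.05):
    """Digital regenerators threshold-decide the bit at each hop,
    discarding that hop's noise before it can accumulate."""
    level = float(bit)
    for _ in range(hops):
        level += random.gauss(0.0, noise_sigma)   # same per-hop noise...
        level = 1.0 if level > 0.5 else 0.0       # ...reset by regeneration
    return int(level)

print(analog_link(1.0, 200))  # drifts away from 1.0 as noise accumulates
print(digital_link(1, 200))   # still exactly 1
```

After 200 hops the analog sample has wandered, while the regenerated bit is intact: noise per hop only has to stay below the decision threshold, not below the cumulative budget of the whole route.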

Solid state electronics led to practical light sources (lasers and light-emitting diodes, or LEDs) and light detectors (photodiodes), and this stimulated the development of ultrapure glass fibers to replace copper wires and microwave links for transmission. The 1970s saw significant advances in glass fibers for communication, which in turn drove the manufacturing processes for fiber, lasers, and LEDs to far greater reliability at significantly lower cost. A spinoff was using doped optical fiber itself as an optical amplifier, so that light signals could be amplified directly: no conversion from light to an electrical signal, electrical amplification, and conversion back to light, but amplification in a single optical stage. This in turn required optical filters that pass light in some frequency bands and block it in others, leading to analog optical transmission systems comparable to the analog microwave transmission systems deployed around the globe from World War II onward. Optical digital transmission standards appeared, initially the Synchronous Optical Network (SONET), but over time digital transmission increasingly migrated to variants of Ethernet.

Solid state electronics could generate and receive microwave signals in different frequency bands. This suggested the possibility of eliminating copper wires, historically the largest cost element, from telecommunications entirely. Radio signals could be transmitted in a given frequency band, and by limiting the transmitted power level the signal attenuates over distance, permitting the same frequencies to be reused wherever the transmitters using them are sufficiently far apart. This created the notion of a cell, a hexagon in space with a set of frequencies used for communication; adjacent cells do not use those frequencies, but cells beyond them can reuse them. This had an economic benefit: the initial cost was a single radio transmitter/receiver, and as subscribers signed up with their phones, revenues grew faster than the cost of bringing more and more frequencies into use: lower fixed costs, with variable costs that let revenues scale up.
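The geometry of hexagonal reuse can be made concrete. In the standard cellular-engineering treatment (an assumption here, not something derived in this article), valid cluster sizes follow N = i² + ij + j², and co-channel cells sit about √(3N) cell radii apart:

```python
import math

def cluster_size(i, j):
    """Valid hexagonal reuse cluster sizes follow N = i^2 + i*j + j^2."""
    return i * i + i * j + j * j

def reuse_distance_ratio(n):
    """Distance D between co-channel cells relative to cell radius R:
    D/R = sqrt(3N)."""
    return math.sqrt(3 * n)

# The classic 7-cell pattern (i=2, j=1): the same frequencies reappear
# only in cells roughly 4.58 radii away.
n = cluster_size(2, 1)
print(n, round(reuse_distance_ratio(n), 2))  # 7 4.58
```

A smaller cluster size reuses spectrum more aggressively (more capacity) at the price of co-channel interference from nearer cells; choosing N is the core radio-planning trade-off.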

All of this came together in the 1990s, roughly a century after telephone service started in New Haven, CT. South Korea was one of the first to launch a ten-year plan, Korea Telecom 2000, aiming to be world class in the year 2000: (1) the existing network was enhanced with an optical fiber backbone transport network interconnected to global undersea optical fiber cables; (2) the backbone fiber connected to cellular base stations, accelerating the adoption of wireless network access; (3) the backbone fiber network sprouted tentacles to homes and businesses, providing direct fiber or wireless network access that over time migrated to variants of Ethernet; (4) the electromechanical switches were replaced with solid state electronic routers. Because this was a ten-year plan, the latter half of the decade was reserved for offering two-way symmetrical data communication services; remember, this was planned in 1990, before software browsers and the rollout of the World Wide Web and Internet. Today over 95% of South Korean homes and businesses have 100 Mbps symmetrical network access.

Once the template was worked out in South Korea, it began to influence other networks, starting with Japan, Australia, and Singapore, then moving to Western Europe: Telefonica in Spain, Telecom Italia in Italy, Deutsche Telekom and Mannesmann in Germany, France Telecom in France, British Telecom in Great Britain, Telia in Scandinavia, and the Benelux countries of the Netherlands, Belgium, and Luxembourg. At the same time it rolled across the United States (AT&T, MCI, Sprint, Frontier, and the regional Bell holding companies) and Canada (Rogers and Bell Canada), as well as Brazil with Embratel.

The underlying economics of the original copper wire access network meant high fixed costs (digging up the ground and pulling cables through conduit, or planting poles and hanging cables from them) and low variable costs (the switching system was shared among all subscribers, which encouraged adding software features such as call forwarding, three-way calling, and voicemail that incurred little capital cost but generated high-margin revenue). The underlying economics of the new wireless access network meant relatively low fixed costs (installing a base station and a switching system) and variable costs that scaled with revenue per subscriber. In wired access, the fixed capital cost is typically called the cost to pass a subscriber, while the total cost adds the cost of the final piece of access that actually connects the subscriber.
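The cost-to-pass distinction can be sketched with a toy model. All the dollar figures and capacities below are invented for illustration; the shape of the curves, not the numbers, is the point: wired outlay is dominated by passing every home up front, whether or not it subscribes, while wireless outlay grows roughly in step with subscribers.

```python
def wired_outlay(subscribers, homes_passed, cost_to_pass=1200.0, drop_cost=300.0):
    """Wired access: pay to pass every home whether or not it subscribes,
    then pay the final drop only for homes that do."""
    return homes_passed * cost_to_pass + subscribers * drop_cost

def wireless_outlay(subscribers, base_station_cost=500_000.0,
                    per_sub_cost=100.0, subs_per_station=2_000):
    """Wireless access: base stations are added roughly in step with demand."""
    stations = -(-subscribers // subs_per_station)  # ceiling division
    return stations * base_station_cost + subscribers * per_sub_cost

# Compare total capital outlay at three take-rates in a 50,000-home territory.
for subs in (500, 5_000, 50_000):
    print(subs, wired_outlay(subs, homes_passed=50_000), wireless_outlay(subs))
```

With these hypothetical numbers the wired operator spends $60M before connecting anyone, while the wireless operator's spend tracks its subscriber count: exactly the fixed-versus-variable contrast described above.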