The Invention and Transformation of Computers: From ENIAC to the Internet Era

Inventing the Computer

Early computer series. In 1943, during World War II, the University of Pennsylvania began building the first electronic computer, ENIAC, which was publicly unveiled in the New York Times in 1946. The invention of the computer was inseparable from war: the military discussed using computers for nuclear weapons analysis, weather forecasting, oil exploration, aircraft design, ballistics calculation, and a series of other functions, and Monte Carlo simulation and weather forecasting happened to become ENIAC's early core workloads. Before the computer, these calculations were done by hand by large teams of people, work usually assigned to female employees, and early computer programming was likewise dominated by women. The government then planned a second machine, EDVAC, which broke through along three main dimensions. First, hardware: high-speed memory plus binary storage. Second, the von Neumann architecture, which likened the computer's switches (then vacuum tubes; the later transistor is essentially also a collection of switches) to neurons and the memory holding programs and data to an organ; both storage and computation used binary digits (bits), and EDVAC had a centralized computing unit, similar to a CPU, connected only to the memory. Third, a set of operation codes consistent with people's habits in scientific computing. EDVAC's basic loop was to read an instruction from memory, decode it, and then execute it.
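EDVAC's read-decode-execute loop is still the heart of every stored-program machine, and is easy to sketch in a few lines. The opcodes, memory layout, and accumulator below are invented for illustration and are not EDVAC's actual instruction set:

```python
# Toy stored-program machine: program and data share one memory,
# and the processor loops over read -> decode -> execute, as in EDVAC.

def run(memory):
    """Execute instructions starting at address 0 until HALT."""
    pc = 0            # program counter: address of the next instruction
    acc = 0           # accumulator register
    while True:
        op, addr = memory[pc]      # read (fetch) the instruction
        pc += 1
        if op == "LOAD":           # decode + execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program: memory[10] = memory[10] + memory[11]
mem = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 10), 3: ("HALT", 0),
       10: 2, 11: 3}
result = run(mem)
print(result[10])  # → 5
```

Note that instructions and data live in the same memory, which is exactly the stored-program idea the paragraph describes.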

Early computer sales. Von Neumann's wife, Klára, wrote the first program for ENIAC, a Monte Carlo simulation. After that, researchers equipped similar computers with CRT monitors, added index registers, increased the number of addresses a single operation code could reference from one to four, and moved from the traditional serial reading of bits to reading them in parallel. To make programs reusable, the assembler was invented; the first symbolic assembler is credited to IBM's computer project team. In 1947, the Association for Computing Machinery (ACM) was founded. Early industry insiders believed the U.S. computer market was very limited, perhaps only 5-6 units; it turned out they had far underestimated the computer's potential. In 1948, Eckert and Mauchly, builders of the first computer, formed a partnership to sell similar machines. Their first-generation Univac could do 465 multiplications per second at a clock frequency of 2.25 MHz, and compared with ENIAC it had a larger memory and read data faster. The first unit was delivered to the U.S. Census Bureau. By 1954, about 20 computers were in use at a unit price of US$1 million (a sky-high price); the customers were mainly government agencies, but also included large companies such as GE and DuPont. Judging from Univac's early application scenarios, beyond scientific research its strongest substitution effect in commercial settings was against the punch-card machines sold by IBM. After GE purchased its machine, for example, it used it mainly for four specific tasks: payroll, raw-material planning and inventory management, order management and payment, and accounting. These scenarios had originally run on IBM's card machines (which also helps explain why IBM was later able to dominate the computer market, since for IBM this was a sustaining innovation), and the manpower saved by the more advanced equipment (card machines required a great deal of labor just to carry cards) was an important reason executives were persuaded to switch.
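Monte Carlo simulation, one of ENIAC's first workloads, estimates quantities by repeated random sampling. A minimal modern example estimates π by throwing random points at a unit square (this is an illustrative stand-in, not the neutron-transport calculation Klára von Neumann actually programmed):

```python
import random

def estimate_pi(samples, seed=0):
    """Estimate pi by sampling points in the unit square and counting
    how many land inside the quarter circle of radius 1."""
    rng = random.Random(seed)   # seeded for reproducibility
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # area of quarter circle / area of square = pi/4
    return 4.0 * inside / samples

print(estimate_pi(100_000))  # close to 3.14159
```

The error shrinks roughly with the square root of the sample count, which is why these workloads soaked up all the machine time early computers could provide.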

Scientific Supercomputers

IBM emerged as the leader. After World War II ended, defense budgets driven by the Cold War remained an important source of support for the computer industry. In 1952, IBM's first electronic computer, the IBM 701, was launched; the first 19 units were all delivered to U.S. defense customers, and the machine was accordingly called the "Defense Calculator." It leased for $15,000 per month. The IBM 701 was roughly similar to the Univac but read memory faster, fetching a full word at a time where the latter read one bit at a time. The 701 was used for weapons design, aircraft design, cryptographic calculation, and similar scenarios. By 1956, thanks to strong sales power and customer resources, IBM had delivered more mainframes than Univac. Its main model, the 704 (140 units delivered), was an upgraded 701: it used core memory (which retains data even when the power is off), supported floating-point arithmetic (numbers expressed as a mantissa and an exponent, very important for scientific computing because it supports operations on very large numbers), and added index registers (making programming simpler), thereby winning customers' favor. The next generation, the IBM 709, further upgraded the hardware with data channels: modules that were nearly standalone computers, supporting high-volume input and output, independent data buffering, and printer management. Finding reliable storage was difficult in the early days: high-end computers used high-speed tape, while lower-priced machines (around US$30,000, versus millions of dollars for the high end) used magnetic drums as storage devices.

Scientific programming languages: Fortran and others. Assembly language expresses human-written instructions for machine execution; a compiler is responsible for converting mathematical formulas and higher-level programming languages into code the machine can execute directly. In 1957, IBM launched the Fortran programming language for the 704. Early programming was so inefficient that the cost of programming and debugging could exceed the cost of running the program, so Fortran appeared in order to free programmers from redundant busywork and let them focus on the problems they actually wanted to solve. Fortran's symbols and syntax for calculation are very close to algebra, and both its programming efficiency and its performance were very strong. Early program sharing was organized privately: in 1955, a group of IBM 701 customers formed the SHARE organization, which soon gathered 62 member institutions and began sharing basic mathematical programs such as matrix inversion. These sharing organizations were part of the foundation of the early open-source community. SHARE's shared programs also included prototypes of early operating systems; because memory was scarce, these programs managed memory while remaining tiny themselves. GM (General Motors) developed a batch-processing system in its laboratory to control the sequencing of jobs, telling the computer whether the next thing to run was a Fortran program or a new job, and it later evolved into an early operating system for IBM computers. Algol, released in 1958, was another important early language. It introduced block structure (a block delimited by begin and end becomes an independent unit), and it allowed a program to call itself and run repeatedly (recursion). The stack data structure was proposed in conjunction with Algol, and a program that recurses without end runs into a stack overflow error.
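Algol-style recursion, the call stack behind it, and the stack overflow that unbounded recursion causes can all be seen directly in any modern language; Python even surfaces the overflow guard as a RecursionError. The function names here are illustrative:

```python
def factorial(n):
    """Each recursive call pushes a new frame onto the call stack;
    the base case is what lets the stack unwind."""
    if n <= 1:
        return 1
    return n * factorial(n - 1)

def runaway():
    """No base case: recurses until the stack-depth limit is hit."""
    return runaway()

print(factorial(10))        # → 3628800
try:
    runaway()
except RecursionError:      # Python's equivalent of a stack overflow
    print("stack overflow")
```

Languages differ only in how the overflow surfaces: Python raises an exception, while C programs typically crash outright.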

Transistors and supercomputers. In 1947, Bell Labs, a subsidiary of AT&T, invented the transistor. Because of antitrust action at the time, AT&T was barred from markets outside telephony, so Bell Labs licensed the transistor to other companies at a low price ($25,000); Texas Instruments acquired the technology this way. Transistors were smaller (especially after the later invention of the integrated circuit) and more reliable, and they quickly replaced vacuum tubes as the computer's important "neurons." For example, a year after the IBM 709 launched, its transistorized version, the 7090, appeared, priced around US$3 million, and several hundred units were sold; the upgraded 7094 rented for US$30,000 per month. Early programmers did not program interactively: they entered a program, waited for it to run, checked its progress by watching indicator lights, and hunted for problems in the printed results. In the mid-1950s, IBM began the Stretch supercomputer project, targeting 100 times the computing power of the IBM 704. Where traditional computers ran strictly through read-decode-execute steps, Stretch could overlap them, working on other instructions while data was being read, and so was more efficient. Stretch also carried IBM's then-"black technology," the hard drive, along with a scheme for making limited memory appear larger, a precursor of modern virtual memory. Officially named the 7030 in 1960, it sold for a whopping $13.5 million (a sky-high price). Its actual performance was only about half of what was planned (though still very strong), and it was a commercial failure, but it incubated the later epoch-making IBM 360 series, so no sweat was wasted. The other main supercomputer maker was Control Data (CDC); Seymour Cray later left CDC to found Cray Research, whose Cray-1, with vector computing among its capabilities, sold about 80 units at roughly $8 million each, mainly to the Department of Defense. Until 1982, the Cray-1 was the fastest computer in the world.

Business Data Processing

IBM dominates business computers. Besides scientific computing, computers found commercial applications. American companies held a genuine belief in the automated office: that computers would one day handle all of a company's transactions and process all of its business data to give managers a basis for decisions. Early on, the IBM 650 was the most successful model: based on magnetic-drum storage and therefore relatively cheap, it reached 2,000 units sold at a monthly rent of $3,500. To popularize computers starting at the source (college students), Watson Jr. let universities buy machines at a 40% discount. The follow-on IBM 1401 (about 7 times faster than the 650) continued to penetrate every customer of IBM's traditional card machines, and its sales reached 12,000. By 1962, just seven years after launching its first electronic computer, IBM's computer revenue surpassed its card-machine revenue, and it held one-third of the U.S. computer market. For high-volume data work, IBM's invention of the hard drive also found its place: compared with traditional tape (which could take 6 minutes to find a piece of data) or drums, a hard drive can fetch any record at high speed, a marked gain in efficiency. Meanwhile, in sharing communities such as SHARE, basic programs for tasks like sorting and payroll circulated widely. As Fortran was to scientific computing, Cobol became the core programming language of data processing: built on a syntax resembling English grammar, it could implement all kinds of data-processing functions.

IBM 360 became the industry standard. From the demand side, scientific and commercial computing differ: the former requires complex calculations but often on modest amounts of data; the latter involves large amounts of data but only simple calculations. IBM therefore initially kept separate product lines for the two scenarios, but soon began planning a single computer product to meet both major needs. In 1964, IBM launched the 360 series, which established its leading position in large commercial computers for the next 50 years. It was an epoch-making product bridging scientific computing (the 7000 series) and commercial computing (the 1401 series). Beyond raw capability, the 360 series offered compatibility: unlike previous IBM models, which were mutually incompatible, the 360 models were compatible with one another, and they could also run programs from earlier IBM lines, with the high-end models able to emulate the scientific 7000 series and the low-end models able to match the capabilities of the earlier commercial series. The 360 was truly a bet-the-company move for IBM: it announced roughly 150 new products together, including new keyboards, hard drives, and tapes. The series received 1,100 orders in the first month after launch, and most of NASA's computers promptly switched to the 360 to support the U.S. moon-landing program of the time. IBM did, however, hit production bottlenecks early on, and software delivery was also a problem: IBM initially planned a single OS/360 operating system, but later announced it was giving that up and instead shipping four different operating systems for large- and small-memory models and for disk- and tape-based models.
Overall, the 360 series succeeded more in the commercial field, while scientific supercomputing remained CDC's market; the commercial market, however, later grew far larger than scientific computing. In the mid-1960s, the computer field was described as Snow White (IBM) and the Seven Dwarfs (RCA, Honeywell, and others). IBM's installed base reached 35,000 computers in 1970, and its net profit exceeded US$1 billion in 1971. The 360 series remained IBM's flagship product into the 1990s.

Databases and the IRS. In the 1960s, computer-based digital management became many people's fantasy. Many believed computers would revolutionize traditional corporate management structures and processes: factors such as employee loyalty to the company would no longer matter, rational argument would count for more (because anyone's contribution could be quantified by computer), and management would become further decentralized. Under the technical conditions of the time, this proved to be just a beautiful phantom: even integrating the different systems of a single large company was a difficult task. Nevertheless, Bachman at GE invented the prototype of the database, the Integrated Data Store (IDS): software that needed data no longer dealt with the data directly but went through IDS instructions, and the database became a key piece of system infrastructure. Bachman won the Turing Award for this invention. The U.S. Internal Revenue Service (IRS) was a major early customer of commercial computers; its tax-system upgrade budget reached US$650-850 million in the 1960s. In the 1960s, it adopted the SSN as a unique identifying number for each U.S. taxpayer, further easing computerized data processing. Later, however, accusations about citizens' privacy (the White House could easily see any citizen's tax status) left IRS system updates behind the times: as of 2018, it was still running Cobol programs written in the 1960s.

Real-Time Control Systems

The SAGE project enables real-time control. Early real-time control systems were used in weapons control, flight simulators, and similar fields. The SAGE (Semi-Automatic Ground Environment) air-defense project, implemented by the U.S. Department of Defense, had as its core purpose the interception of Soviet bombers. During the 1950s, SAGE contributed some $500 million in revenue to IBM, more than IBM earned from its other computers. SAGE is also considered to have cultivated the earliest programmer corps in the United States, and project outsourcing companies such as SDC were established around it; still, in 1957 alone IBM trained 14,000 programmers, making IBM the largest cultivator. In 1957, the Soviet Union launched its satellite, and in 1958 the United States established NASA in response. NASA modified IBM mainframes for real-time spacecraft control, breaking with tradition, since IBM's computers were leased and users were normally not allowed to modify them.

DEC ushered in the minicomputer era. Another direction of Cold War-driven computer development was miniaturization, which gave birth to DEC (Digital Equipment Corporation), founded in 1957 by MIT engineer Ken Olsen. DEC's first computer, the PDP-1, was built on the new possibilities of the transistor: with its distinctive design, it could perform 100,000 additions per second at a price of only $120,000. DEC opened the era of minicomputers based on the latest semiconductor technology. The PDP-8 weighed 250 pounds, dramatically smaller than an IBM mainframe, and cost only $18,000; its sales eventually reached 50,000 units. Many other manufacturers modified the PDP-8 into new products, such as the LS-8, which became the stage-lighting controller for Broadway shows. DEC, closely tied to MIT's research strength, survived IBM's counterattack into minicomputers (in 1969, IBM launched the System/3, renting for US$1,000 a month), and its products began to penetrate IBM's heartland: business data processing.

Invention of the integrated circuit. By "printing" all of a transistor circuit's components onto the board (eliminating separate wires and assembly), the forerunner of the integrated circuit was produced, pushing semiconductor miniaturization to the extreme. The integrated circuit's inventors were Jack Kilby of TI and Bob Noyce, later a co-founder of Intel; the planar process, developed at Noyce's Fairchild by Jean Hoerni, made it possible to manufacture integrated circuits at scale. The earliest application scenarios included hearing aids, which put a premium on portability, but the biggest buyer of early integrated circuits was the aerospace industry, whose purchases accounted for 95% of the industry's output in 1963; the Apollo program's large orders drove the price of a single chip down from $1,000 to $30. Because the military and aerospace demanded high reliability, chip fabrication adopted ultra-clean rooms: any impurity mixed in could cause chip failures, and the sanitary conditions of a chip fab came to be far stricter than a hospital's. Software reliability was also held to a very high standard. IBM won the order for spacecraft control software and solved the safety problem with designed-in redundancy: simply put, IBM prepared three systems (the Space Shuttle was later equipped with five AP-101 computers), so the failure of any one would not affect the normal operation of the others. Service life is a good example: the rover that landed on Mars in 2004 was designed for a lifespan of only three months but ended up working normally for more than 14 years.

In the 1960s, the spread of smaller, cheaper, more reliable minicomputers, together with timesharing systems, jointly turned the computer into a popular, real-time-feedback, general-purpose computing system. The popularity of DEC's PDP-1 minicomputer at MIT produced the first batch of computer "geeks," who developed the earliest computer game, Spacewar, and the earliest computer graphics software, Sketchpad. At the same time, timesharing use of computers became popular and further improved efficiency: while one user was entering a program, the computer could run other users' programs, though this kind of sharing demanded a lot of memory. An MIT team built CTSS, a timesharing operating system on the IBM 7094 that let 30 users (later upgraded to 50) use the computer simultaneously. MIT users also developed a series of applications to use the machine efficiently, such as a text editor and a set of programming tools, and the most popular application, from 1965 onward, was email: different users of the same computer could send mail to one another, the shared machine forming a local-area network of sorts. In addition, programmers at Dartmouth College developed BASIC, an easy-to-learn programming language for building applications on shared computer systems. In the late 1970s, the Matlab software package was developed at the University of New Mexico and was commercialized in the PC era by several companies, including MathWorks.

Problems with timesharing systems. In the late 1960s, the timesharing model of computing approached its climax. Large companies such as GE entered the business, which resembled a prototype of cloud computing, and IBM too adapted the 360 line for timesharing. But the great difficulty of these shared systems lay in handling unpredictable workloads and switching between tasks. To present a feeling of prompt response to many users, the system had to switch rapidly among user programs, and a program had to be stoppable at any point; otherwise one large computing task would monopolize the whole system. These demands were hard to meet with the hardware of the day, which left shared systems with either a poor user experience or high costs (at the time, a computer's power was held to be proportional to the square of its price: a $1 million machine had 100 times the power of a $100,000 machine). As a result, the timesharing market stayed modest through the 1970s; many users found that, weighing price against experience, a timesharing service was less cost-effective than simply buying a minicomputer. Multiprocessor machines did exist to extend a host's computing power, but CompuServe's practical experience showed that connecting multiple minicomputers such as PDP-10s worked better than sharing one supercomputer.
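The task-switching problem described above, giving each user the illusion of a dedicated machine by rotating short time slices, can be sketched as a round-robin scheduler. This is a simplification for illustration; real timesharing systems preempt programs via hardware timer interrupts rather than cooperative bookkeeping:

```python
from collections import deque

def round_robin(jobs, quantum):
    """jobs: {name: units_of_work}. Run each job for at most `quantum`
    units, then preempt it and move it to the back of the queue.
    Returns the order in which jobs finish."""
    queue = deque(jobs.items())
    finished = []
    while queue:
        name, remaining = queue.popleft()
        remaining -= quantum              # run one time slice
        if remaining > 0:
            queue.append((name, remaining))   # preempt and requeue
        else:
            finished.append(name)
    return finished

print(round_robin({"alice": 3, "bob": 1, "carol": 2}, quantum=1))
# → ['bob', 'carol', 'alice']
```

Short jobs finish quickly, which is exactly the "feeling of prompt response" the text mentions; the cost is the constant switching overhead that strained 1960s hardware.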

Software engineering and the Unix operating system. In the 1960s, computer manufacturers were good at hardware but not software: the release of compilers, operating systems, and other products was routinely delayed for long stretches (at this point software engineering essentially meant system software), and software's share of total computer-system development cost rose from 10% in the early 1950s to 80% by the 1970s. In 1968, universities began establishing doctoral programs in computer science. Early large-scale software development mainly used human-wave tactics, and lack of development experience routinely produced schedule overruns (a phenomenon still common into the 1980s, e.g., Windows development). In the 1970s, structured programming offered a top-down approach to building large software. But software engineering is really an iteration of organizational management, requiring mature models of performance evaluation, process control, and assessment. Another solution was the small, lean team (escaping organizational entropy), and this model gave rise to the Unix operating system. It was born at Bell Labs as an operating system for the laboratory's timesharing machine, a DEC PDP-7; with little funding early on, the team could only stay small (a large team probably could not have built it anyway). Unix proposed the "fork" mechanism (a process can directly copy itself and run the copy in memory), which makes launching programs simpler. Unix also favors single-purpose executables, with the output of one program becoming the input of the next.
During its development, Unix gradually migrated from assembly language to the higher-level C language, also invented by Bell Labs engineers and very efficient for operating-system work. C programming breeds bugs easily; a single misplaced ampersand can break a program in ways that drive programmers crazy. But precisely because it was written in C, Unix could later be ported to different computer platforms with relative ease. Unix became very popular in universities and academia, was taken up early by industry, and helped drive the rise of DEC's minicomputer platform: DEC grew from 900 employees in 1965 to 5,800 in 1970 and 36,000 in 1977, and PDP-11 sales reached 110,000 in the 1970s. In 1977, DEC launched the 32-bit super-minicomputer VAX, which could execute about one million instructions per second (1 MIPS); sales reached 100,000 units in its first 10 years.
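Unix's fork mechanism and its "output of one program becomes the input of the next" philosophy are both still visible in the POSIX API. A minimal sketch using Python's stdlib bindings (POSIX-only, so it assumes Linux or macOS; the message string is of course illustrative):

```python
import os

# fork(): the child process starts as a copy of the parent.
# A pipe connects the child's output to the parent's input,
# just as the shell's `|` connects two programs.
read_fd, write_fd = os.pipe()
pid = os.fork()

if pid == 0:                      # child process: the copy
    os.close(read_fd)
    os.write(write_fd, b"hello from child")
    os.close(write_fd)
    os._exit(0)                   # exit without running parent code
else:                             # parent process
    os.close(write_fd)
    data = os.read(read_fd, 1024)
    os.close(read_fd)
    os.waitpid(pid, 0)            # reap the finished child
    print(data.decode())
```

The shell pipeline `ls | wc -l` is this same pattern: fork two processes and splice one's standard output to the other's standard input.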

Computers Enter the Internet Era
——Predecessors of the Internet

The timesharing system as the base. Although timesharing systems were never universally popular, many users sharing one computer naturally made it easy for those users to communicate over a network. The earliest email system was born at MIT (1965), and by the late 1960s email was a standard feature of timesharing computers; prototypes of forums, communities, collaborative office software, and other Internet-style social products began to appear soon after. In 1978, a sociology book, "The Network Nation," imagined a society with no paper publishing, where publishing, mail, and education were all online and networked (futuristic enough). Early computer input devices included the light pen, and the mouse was invented in 1967. In the 1970s, Plato was a timesharing system built on graphical displays, used mainly in education for functions such as online learning. In 1975, 950 Plato terminals were in use and full commercialization began, but with a single display terminal costing up to US$8,000 (including back-end computer costs), it was ultimately a commercial failure (CDC, the company behind it, invested US$800 million). Still, Plato became the prototype of many early Internet systems.

Early development of the Internet. ARPANET (of the Department of Defense) was the Internet's predecessor. Once a timesharing system becomes a node and users grow accustomed to treating computers as communication devices, connecting those computers yields an internet; the core problem to solve is how different computers communicate. The traditional method resembled a telephone call (establishing a direct connection), but ARPANET innovatively adopted packet switching: if a packet is lost in transit, the computer can resend it. In 1969, the network had 4 computers; 10 in 1970; 15 in 1971. ARPANET was officially demonstrated in 1972, moving data between computers over telephone lines leased from AT&T. Email naturally became ARPANET's early killer app; because different timesharing systems had to be distinguished, the @ sign was adopted. By 1973, the standard email format included from, date, and subject fields, and the SMTP mail protocol later came into common use. In the mid-1970s, the TCP/IP protocols were proposed: the core of TCP is that the receiving computer reassembles arriving packets, with lost packets resent by the sender, while IP introduces addresses that determine whether two computers are on the same subnet. Version 4 of TCP/IP (IPv4) was ratified in the 1980s and was still in wide use in 2020. The early Internet was really more a collection of local networks, with many parallel networks such as Usenet (news), CSNET, and BITNET. Commercial Internet-like services appeared around 1980: CompuServe offered users email, news, weather, stocks, sports, lottery, and a series of other services, while IBM and DEC provided services such as email for enterprises.
The carrier AT&T also decided not to distinguish data transmission from telephone service on its lines and introduced the modular jack, so ordinary users could plug a telephone line directly into an Internet modem. France had the largest number of early online users: Minitel, released in 1982, offered services such as a searchable telephone directory; by 1987 it had spread to every corner of France, and by 1993 it had 6.5 million users.
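The TCP behavior described above, reassembling out-of-order packets and detecting losses so the sender can retransmit, can be modeled with sequence numbers. This is a toy receiver, not the real protocol state machine (no windows, acknowledgments, or timers):

```python
def reassemble(packets, total):
    """packets: list of (seq, payload) pairs, possibly out of order
    or duplicated. Returns (message, missing_seqs); any missing
    sequence numbers must be resent by the sender."""
    received = dict(packets)          # dedupe by sequence number
    missing = [seq for seq in range(total) if seq not in received]
    if missing:
        return None, missing          # incomplete: request retransmission
    message = "".join(received[seq] for seq in range(total))
    return message, []

# Packets arrive out of order, and one is lost in transit.
msg, missing = reassemble([(2, "lo"), (0, "he")], total=3)
print(missing)        # → [1]  (receiver asks for seq 1 again)

# After the retransmission, the message reassembles cleanly.
msg, missing = reassemble([(2, "lo"), (0, "he"), (1, "l")], total=3)
print(msg)            # → hello
```

The point of the design is that the network itself may drop, duplicate, or reorder packets freely; reliability lives entirely at the endpoints.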

The Internet itself is decentralized. In 1981, about 200 computers were connected to the network; five years later, 5,000; by 1990, 160,000. DEC's VAX minicomputer running the Unix operating system became the most common hardware-software configuration on the early Internet, mainly on college campuses. In 1988, the first malware exploiting Unix vulnerabilities (the Morris worm) appeared. NSFNET (the National Science Foundation Network), begun in 1986, quickly became the Internet's backbone, with ever-increasing speeds; multiple networks such as NSFNET and BITNET merged, NSFNET soon replaced ARPANET as the center of the Internet, and it became the infrastructure of the later "information superhighway." In the 1990s, ISPs offering Internet access appeared, and traffic began to grow rapidly, from 1 trillion bytes/month in 1992 to 10 trillion bytes/month in 1994. Because IP addresses are hard to remember, the domain name system was invented in imitation of email addresses. The domain allocation authority, IANA, became a rare centralized and important organization atop the Internet's decentralized infrastructure; its managing body, ICANN, is a non-profit public-benefit corporation that later gained independence from U.S. government oversight, and in 2022 it rejected Ukraine's request to kick Russia off the Internet. Looking back at the Internet's development, the basic attribute of decentralization has been essential: data transmission is independent of content, management relies more on social mechanisms (mutual supervision rather than a central administrative body), and any computer can use the Internet to send data.

The PC Begins to Sprout

The CPU and Intel usher in an era. The PC era grew out of the microprocessor, which was in essence the continued development of the semiconductor integrated circuit. The first product to benefit from the chip industry was actually the calculator (addition, subtraction, multiplication, division). In the mid-1960s, the Chinese-American engineer An Wang, trained at Harvard, produced the Wang 300 electronic calculator, cheap and easy to use; HP later launched the HP-9100 calculator, priced at US$5,000. In the 1970s, MOS technology became popular: chips made with it performed worse than bipolar ones but cost less, and low-power CMOS chips were a perfect fit for electronic watches and similar scenarios. Chip prices kept falling, and with them the prices of electronic products such as calculators and watches; a calculator dropped from US$150 in 1972 to US$50 in 1976, with Japan's Casio and Sharp becoming market leaders. Intel's CPU chips then opened the PC era: one chip heralded the arrival of a computer era. Storage inventions such as the EPROM further accelerated the CPU market. Intel packaged CPU, EPROM, and other parts into kits priced at US$10,000 that customers could buy to assemble computers; Intel decided not to sell such a product directly to consumers, to avoid becoming a competitor to its own customers. As Moore's law continued to advance, the cost of a unit of computing power kept falling at an exponential rate.
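The closing claim, that cost per unit of computing power falls exponentially as Moore's law advances, is easy to make concrete. The two-year halving period below is an illustrative assumption, not a figure from the text:

```python
def cost_after(initial_cost, years, halving_period=2.0):
    """Cost of a fixed amount of computing power after `years`,
    assuming it halves every `halving_period` years (exponential decay)."""
    return initial_cost * 0.5 ** (years / halving_period)

# Under this model, $1000 of compute today costs about $31 a decade later.
print(round(cost_after(1000, 10)))  # → 31
```

The same curve, run over the decades separating the $1 million Univac from a $400 TRS-80, is the quiet arithmetic behind every pricing figure in this article.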

Altair and Microsoft. In 1975, MITS launched the Altair, the first computer kit, priced at $360 (costing $75 to build) and marketed as a microcomputer. Early kit customers were computer geeks who mainly bought it to play games. The early Altair had 16 card slots (anyone who played with a learning machine will recognize these), so its functions could be expanded through hardware, and selling add-on cards became a business in itself. Among the Altair’s early users was Bill Gates. He and Paul Allen developed a BASIC interpreter for it; BASIC was easy to understand and had low memory requirements (the first version needed only 4KB), and it became the most important programming language of the early PC era. Microsoft’s first revenue also came from BASIC sales (priced at $60). Because piracy was rampant in those early days, Gates wrote a famous open letter condemning hobbyists who used the software without paying. CP/M, the most popular early operating system, offered core functions such as consolidating fragmented stored files. The company behind CP/M was Digital Research, and its early entanglements with Microsoft and IBM were highly dramatic; see [Reading] Gates Chapter on the Development History of Microsoft: If you want to do it, make it an industry standard.

Early PC products. Computers of this period should really still be called microcomputers; “PC” was IBM’s product name, which later became the industry’s generic term. Early PCs were like toys. Because they used integrated CPU chips, adopting the CPU standard would have meant traditional manufacturers such as DEC abandoning their self-developed semiconductor architectures (there is a joke that an engineer could deduce DEC’s organizational chart from the VAX semiconductor architecture, because each department was responsible for one part of it), so the incumbents were not very interested. This opened the door to new entrants such as Apple, Commodore, and Tandy (a subsidiary of Radio Shack). Tandy’s TRS-80 sold for US$400, used the Zilog Z-80, and was bought mainly to experience computing or play games (much like a learning machine). Apple’s breakthrough product was the Apple II, with Wozniak as its core developer. It combined the strengths of other products on the market while using fewer chips, delivered stronger performance, and was the first to offer color display (computers then mostly displayed through TVs); it was priced at US$495. The rise of the Apple II also relied on the killer app VisiCalc (the prototype of the spreadsheet), and education was its most successful market, with Apple offering special discounts to universities. As mentioned above, Internet-like services became popular in the 1980s, along with prototypes of Internet products such as BBS. By 1989, U.S. Census Bureau data showed PC penetration in American households at 6%.

The core scene of early PCs was games. The earliest commercial form of electronic games was the arcade. Game consoles followed, driven mainly by the spread of proprietary chips (single-chip microcontrollers). PC-based games then became popular too; early computer products such as learning machines relied on ROM cartridge slots for content expansion (you bought game cartridges for them). Early console and content makers such as Atari rose in the 1970s (for this history, see [Reading] 30 Years of American Console Games: Content and Ecosystem Determination). The development of microcomputers also gave rise to concepts and products such as the smart home, and portable personal computing devices like the Wenquxing (a Chinese handheld learning device) became popular as well. The Sinclair ZX81, launched in 1981, sold for US$140; two years later the price fell to 40 pounds, and sales reached 1 million units. Bringing computers into the home also had an unexpected effect: until the mid-1980s the proportion of female computer science students had been rising, but as computers entered the home, games and similar scenes created the impression that computers were designed for male users, and the proportion of men in computer science majors began to climb.

PC opens the office era

Word and spreadsheet processing software. In the 1970s, computers began to truly enter offices (previously they lived in specialized departments). The two major office scenarios were word processing and spreadsheet processing. In the 1970s, word processing mostly took the form of proprietary equipment; in 1971, IBM branded its typing equipment “word processing machines,” and legal documents were the primary word processing scenario. The leading player in dedicated word processing equipment was Wang Labs (founded by Wang An): a Wang word processing computer, hard drive included, sold for $30,000, which was still worthwhile given the labor costs it replaced. Once the PC era arrived, general-purpose devices began to displace special-purpose ones. WordStar (1979) was the first generation of word processing software, adapted to the CP/M operating system; science fiction writers were among its early core users. VisiCalc was spreadsheet software for the Apple II. Because the Apple II could display only 40 characters per line (half the normal number), it was poorly suited to word processing but well suited to spreadsheets, which it displayed smoothly. VisiCalc quickly became a hit application, even a must-have skill for Wall Street financial professionals; it was priced at US$200, and its revenue exceeded US$100,000 in 1981. VisiCalc and the Apple II became a perfect match, each driving the other’s sales.

IBM PC launched. In 1981, IBM released its first microcomputer, named the Personal Computer (PC for short). It came with an external keyboard, a main unit housing the chips, and multiple expansion slots; it can fairly be said that the IBM PC brought together the advantages of the other products on the market. The cheapest version, with 16KB of memory, was priced at US$1,565; a high-end configuration with 32KB of memory, a printer, and other accessories climbed to US$3,405. Microsoft supplied the 16-bit operating system PC-DOS for the IBM PC, which needed only 6KB of memory to run. The IBM PC’s initial success still came from the enterprise market: large corporate departments considered Apple computers unreliable and had good relationships with IBM. But as PC purchasing budgets gradually shifted from department heads to ordinary employees, IBM’s customer-relationship advantage was eroded. In 1980, Intel released the 8087 coprocessor, finally giving PCs floating-point computing capability. In 1983, the IBM PC XT was released; its biggest innovation was a built-in 10MB hard drive, which brought the PC experience close to that of a workstation or minicomputer. Lotus 1-2-3 (spreadsheets), WordPerfect (word processing), and dBase (databases) embraced the IBM PC platform early and became the software leaders of the next PC era. Compared with VisiCalc, Lotus 1-2-3 added charting and database capabilities (the “2” and “3” in the name refer to these additions), and it could record and replay a series of commands; its first-year revenue reached 53 million US dollars.

The IBM PC became the industry standard. By 1983, 107 companies were producing expansion cards (including sound cards) for the IBM PC, and Microsoft began selling the MS-DOS operating system to other computer manufacturers; it could run only one program for one user at a time, so its capabilities were limited. Actually shipping a computer fully compatible with the IBM PC was not easy, however. The first obstacle was the IBM PC’s BIOS code, to which IBM held the copyright. Three engineers from Texas Instruments founded Compaq; they cracked IBM’s BIOS by reverse engineering and wrote new code that achieved the same functions, an inefficient but legal method. Compaq’s first product launched in 1983, its biggest attraction being IBM PC compatibility, and the company’s first-year revenue exceeded US$100 million. As IBM-compatible computers spread, computer costs kept falling, and users gradually expanded from large enterprises and the wealthy to small businesses and the middle class. In 1984, IBM launched the high-end IBM PC AT, priced at US$5,795. It offered an experience comparable to a minicomputer, supported up to 16MB of memory, and shipped with an Intel 286 processor, positioned for multi-tasking, multi-user scenarios (where DOS lagged behind). The AT was soon cracked as well, and clones appeared. Because IBM feared the AT’s performance would cannibalize mainframe sales, it capped the CPU clock at 6MHz; the clones had no such restriction, and their performance was actually better than the IBM AT’s. In 1984, PCs gained LAN functions, letting employees quickly send files between computers, share printers, and so on. The company selling LAN hosts was Novell, which later focused on LAN software.

IBM loses ground, Microsoft gains. In 1987, IBM prepared to release the new PS/2 line, fully patent-protected from the start, ready to kick out all the clone makers just as the IBM 360 mainframe had once swept the industry. Of course, a clone maker could avoid being swept away by paying protection money: 5% of revenue to IBM, and IBM even wanted to reach back and collect on past PC-series clone revenue. IBM also commissioned a new-generation operating system, OS/2, for the new products. Jointly developed by IBM and Microsoft, it was more powerful than DOS, including multithreading and support for larger memory. OS/2 was released in October 1988, but memory prices were soaring at the time, with 3MB of memory selling for as much as $1,000, which made OS/2’s adoption slower than expected (bad timing). The clone makers did not sit still; they united against IBM. Compaq, unwilling to wait for a slow-moving IBM, released the Deskpro 386, and its sales were very successful; Compaq eventually acquired DEC to strengthen its technical capabilities and began improving its sales capabilities. Dell began selling computers directly to consumers and won with its efficient sales channel: users placed orders first and received products later (so cash flow was also excellent). In the falling-out between IBM and the clone manufacturers, Microsoft benefited the most, becoming the true industry standard, and its exponential growth did not end until 2001.

The GUI era

Xerox invented the GUI. Early PCs had graphics capabilities, used for games, drawing software, and the like, but DOS almost entirely ignored them. Any discussion of graphical interfaces must mention Xerox, which became an industry leader in the 1970s on the strength of its copying technology. When many U.S. defense projects were terminated in the 1970s, Xerox took the opportunity to set up a research center in Silicon Valley and recruit scientific talent, developing the next-generation interactive operating system at its California center. Thacker, inventor of the first-generation GUI, later won the Turing Award for this work. Unlike traditional moded operation (where, if the computer is in delete mode, selecting a file deletes it), Xerox’s GUI encouraged users to first select the object to edit and then choose the operation. Xerox engineers also championed object-oriented programming, which gave birth to the Smalltalk language. The end result of the GUI was WYSIWYG: WHAT YOU SEE IS WHAT YOU GET. Combined with the laser printer technology invented at Xerox’s research center, it led an industry transformation; in fact, sales and licensing of laser printing technology alone repaid the research center’s costs. Xerox did seriously try to commercialize GUI technology, but its mistake was pushing the technology to the mass market too early (at the wrong time). In 1981, Xerox launched the Star series of computers, equipped with a GUI and priced at $16,000. The technology was excellent but the price too high; only 25,000 units were sold, and by the end of the 1980s Xerox abandoned the series.

Server and Macintosh. The earliest replacement for minicomputers was not the PC but the workstation. In 1981, Apollo released a 32-bit workstation based on the Motorola 68000 chip, priced at US$40,000; Sun Microsystems soon entered the fray. Jobs saw the potential of Xerox’s GUI and hired Larry Tesler, who had developed the GUI system at Xerox, to simplify the original system’s operation. On this basis Apple developed the Lisa, which sold for as much as $9,995. Lisa’s first-year sales reached 100,000 units, while Apollo sold only a few hundred; but Apple’s goal was to upend the industry, and this volume clearly missed the target, whereas Apollo, as a new industry player, attracted plenty of VC interest. Apple then launched a cheaper successor to the Lisa, the Macintosh, priced at US$3,000. Compared with the Apple II and the IBM PC, the Macintosh was closed and did not support expansion cards. Its sales reached 250,000 units in 1984 but soon began to decline, and Jobs was pushed out of Apple. Still, the Macintosh’s elegant user interface design had a huge impact on the industry. PageMaker design software (later acquired by Adobe) plus Apple laser printers produced high-quality design output, and these professional designers were the core users of Apple computers in the early days (and remain Mac users today).

Minicomputers and PCs merge

From DOS to Windows. In the 1980s, the shortcomings of DOS were well known in the industry, and the incremental improvements from Microsoft and IBM brought no fundamental change: DOS developers had to write drivers for each piece of hardware, DOS lacked multithreading, it lacked a GUI, and so on. In fact, during the 1980s companies such as VisiCorp, Lotus, and Microsoft all worked hard on GUI operating systems. Unfortunately the hardware was not ready, so not only did these products fail, but their huge costs in time and money drained the competitiveness of companies like VisiCorp. IBM’s biggest strategic mistake with OS/2 was supporting only the Intel 286 and failing to follow up promptly on the newer 386. Not until Microsoft released Windows 3.0 in 1990 did a GUI environment truly have the conditions to become popular software. Microsoft seized this platform transition to move its Word and Excel office software (both originally developed for the Apple Mac, and of good quality) over to Windows, where they became the dominant office software. From then on Windows dominated the PC operating system market, with every generation easily selling more than 10 million copies. Even then, Windows remained a key constraint on hardware performance, for example through its inefficient use of memory.

Intel sets industry hardware standards. By the late 1980s, devices such as graphics cards left no room for expansion in the traditional PC case, so Intel launched an integrated motherboard based on its own chips, which took over communication with memory and expansion cards. In 1993, Intel released the Pentium series of CPUs along with PCI motherboards and began selling entire motherboards. By the late 1990s, Intel had a strong say over PC hardware and pushed the adoption of new technologies such as USB. RISC CPUs, based on reduced instruction sets, also emerged in the 1980s as competitors to Intel’s complex-instruction-set (CISC) CPUs, with companies such as MIPS producing chips. In 1992, RISC CPUs could easily reach 150MHz, significantly faster than Intel’s fastest CPU (33MHz), and RISC quickly became popular in the workstation market. After launching the Pentium in 1993, Intel iterated rapidly, absorbing many strengths of RISC design, and the performance gap with RISC chips narrowed steadily. In 1998, Windows NT announced it would drop support for the Alpha RISC chip, and from then on Intel chips, with their scale and cost advantages, came to monopolize the CPU market (having in effect absorbed RISC technology, a convergence of PC and workstation technology). RISC later exploded in the mobile era thanks to its low power consumption, and companies such as ARM began to rise. New technologies and products of the 1980s also included relational databases, represented by companies such as Oracle (with its aggressive sales culture) and Sybase (whose product Microsoft later licensed as SQL Server).
In fact, 35-40% of a company’s IT department’s time is spent maintaining data consistency in files and databases, which also reflects how important the database is as infrastructure.

Windows NT and Windows 95. From the general public’s perspective, PCs basically replaced the traditional minicomputer and workstation markets in the 1990s; from a technical perspective, however, it was the latter’s technology that actually took over the PC. By 2000, the PC’s technical architecture (with Windows NT having become the basis of Windows) resembled the minicomputers of the 1980s far more than the early DOS systems. In 1991, Microsoft announced its next-generation operating system, Windows NT. Its architecture closely resembled DEC’s minicomputer operating system, because its designer Dave Cutler had previously been a core OS designer at DEC. Windows NT had a hardware abstraction layer that could adapt to a variety of CPUs, including Intel and RISC chips such as MIPS (Intel looked somewhat embattled at the time). During NT’s development, Microsoft required every engineer to submit logs daily; if your update crashed the system, you were responsible for fixing the bug (“eating your own dog food”). Windows NT was significantly more capable than Novell’s competing products of the time. In 1997, Microsoft released several NT editions, including an enterprise version priced at $4,000 and a workstation version at $319. In 1995, Windows 95 was released with a marketing budget of 300 million US dollars; it added Internet functions and introduced the Start menu. It sold over 1 million copies within 4 days of release and 7 million within 5 weeks, and more powerfully still, Microsoft began pushing the pre-installation model on new computers. In 1998, Windows NT 5.0 was renamed Windows 2000, followed by Windows XP, Vista, 7, 8, 8.1, and 10 (the naming is completely irregular). It is therefore fair to say the foundation of today’s Windows is Windows NT, that is, minicomputer technology. By 2000, PC penetration in the United States reached 88%.

Computer architecture unifies electronic devices

Digitized audio. The development and popularization of computers actually followed two paths. One was external: from mainframes to minicomputers to PCs. The other was internal: the development of semiconductor integrated circuits meant traditional proprietary equipment was gradually replaced by chip-based computer equipment, with digital signals replacing analog ones, in televisions, audio equipment, telephones, music players, cameras, and more. The fast Fourier transform (FFT) algorithm published in 1965 made it practical to convert time-series signals into superpositions of signals at different frequencies; complex signals can be decomposed into a series of simple ones, making the digitization of analog signals feasible. In Disney’s 1982 film “Tron,” 15 minutes of animation were produced purely by computer (CG animation), ushering in a new era. Manufacturers such as Yamaha began producing electronic synthesizers, which became an important element of modern pop music; keyboard synthesizers made arranging more like programming. In the 1980s, CDs became popular: compared with tapes they stored more, cost less, and sounded better (tape is analog, the CD a digital signal). The CD then became a standard computer storage medium with a capacity of 680MB, optical drives became standard computer hardware, and game consoles later used CD-ROMs to load games. In the 1990s, the MP3 digital audio format became popular, compressing a CD down to 20MB, but the rise of MP3 also fueled music piracy: the early Internet application Napster was forced to shut down over copyright issues. The spread of the MP3 format drove a revolution in music players, most famously Apple’s iPod, which became a hit product through its minimalist, elegant design and its content ecosystem, iTunes (by 2007 Apple had sold 110 million iPods). The iPod also marked the first phase of the resurgence of Apple and Steve Jobs.
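The decomposition described above can be illustrated with a short, self-contained sketch: a direct discrete Fourier transform (DFT) written from scratch in Python. The FFT is simply a fast algorithm for this same computation, and the sample signal below is invented for illustration:

```python
import cmath
import math

def dft(signal):
    """Direct discrete Fourier transform: X[k] = sum_n x[n] * e^(-2*pi*i*k*n/N).
    O(N^2); the 1965 FFT algorithm computes the same result in O(N log N)."""
    n_samples = len(signal)
    return [
        sum(signal[n] * cmath.exp(-2j * cmath.pi * k * n / n_samples)
            for n in range(n_samples))
        for k in range(n_samples)
    ]

# A "complex" signal: the sum of two sine waves, 5 and 12 cycles per window.
N = 64
signal = [math.sin(2 * math.pi * 5 * n / N) + 0.5 * math.sin(2 * math.pi * 12 * n / N)
          for n in range(N)]

spectrum = [abs(x) for x in dft(signal)]
# The magnitudes peak at bins 5 and 12: the transform recovers the two
# simple components hidden inside the combined waveform.
peaks = sorted(range(N // 2), key=lambda k: spectrum[k], reverse=True)[:2]
print(sorted(peaks))  # -> [5, 12]
```

This is exactly why digitization works: once a signal is a sequence of numbers, any analog waveform can be taken apart into simple frequency components, filtered, compressed (as MP3 does, discarding components the ear barely hears), and reassembled.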

Digitized images. Document transmission once relied mainly on fax, and as chip prices fell, fax machines became popular in the 1980s: the United States had 250,000 fax machines in the 1980s and 5 million by 1990, while prices dropped from $7,000 in 1983 to $1,000 by 1985. Scanners became popular at almost the same time, able to scan traditional paper documents into digital images for distribution among computers, and formats such as Adobe’s PDF became industry standards. The popularity of digital cameras relied on the development of CCD technology; in 1975 Kodak built the first digital camera, and after 2000 CMOS sensor technology spread through digital cameras, with ordinary digital camera prices falling to a few hundred dollars by 2006. DVDs became popular around 2000: a DVD player cost as much as $1,000 in 1997, but by about 2003 half of American households owned one and the price had dropped to $50. HDTV (1080p) followed; in 1998 a high-definition TV sold for as much as 8,000 US dollars, in 2009 the United States stopped broadcasting analog TV signals, and chip-based smart TVs became popular. 3D games developed as well: “Doom” ushered in the era of game engines, which further lowered the threshold for 3D game development, and graphics card (GPU) makers such as Nvidia and ATI rose on the 3D gaming wave. In contrast to the falling prices of chips in general, prices of top-end GPUs kept rising; in 2006 Nvidia’s GeForce 8800 sold for as much as $600, the most expensive component in a computer. Game console hardware also shifted to PC architecture: the Xbox Microsoft released in 2001 used an Intel Pentium III CPU.

Internet era & cloud computing era

The Internet grew rapidly. The earliest Internet-related concept, hypertext, was born in 1945 and later became the ancestor of the Internet hyperlink. Berners-Lee invented the URL and the HTTP protocol, which link to corresponding documents, addresses, and so on. The launch of the graphical web browsers Mosaic and Netscape ignited the Internet wave: the Internet connected 10,000 servers in 1994, 24,000 by June 1995, more than 1 million by March 1997, and more than 10 million by December 2001. In 1995, AT&T launched a flat-rate $20-a-month Internet access package, and early proprietary network providers such as AOL (similar to a LAN; because of its huge user base, early Internet users could not tell AOL apart from the Internet) and Microsoft’s MSN quickly completed their transitions into Internet service providers (ISPs). Popular early Internet applications included IM products such as ICQ, and to encourage the Internet’s development, the U.S. government said it would not tax products sold online. Early Internet users’ pain points included finding information, payment, and so on. Portals became users’ core way of indexing information in the early days, and Yahoo became one of the first Internet winners by building a website directory that gathered URLs of different types of sites and attracted enormous user traffic. Early Internet search was based mostly on keyword frequency, which was easy to game and made for a poor user experience; Google instead built a highly efficient search engine on the PageRank algorithm, whose logic resembles judging a paper’s quality by its citation count.

Advertising became an important Internet business model: in the late 1990s, AOL, Yahoo, and Microsoft together held a 43% share of the Internet advertising market, and after 2000 Google launched its pay-per-click performance advertising model and built an advertising network, setting its ad revenue on a path of continuous growth. Cookie-based personalized recommendations further improved user experience and advertising efficiency, and the launch of the HTTPS encryption protocol made online transactions more trustworthy. The “browser war of the century” between Microsoft and Netscape became a sensation of the early Internet. For more industry details, see [Reading] Internet Industry History 1990-2010: Germination, Bubble, and Rise.
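PageRank’s citation-style logic fits in a few lines: each page repeatedly passes its score along its outbound links, with a damping factor modeling a surfer who occasionally jumps to a random page. A toy sketch in Python (the three-page link graph is invented for illustration; real implementations must handle dangling pages, sparse matrices, and billions of nodes):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank. `links` maps each page to the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Base score: the "random jump" share, spread evenly over all pages.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:  # each page passes its score along its links
                new_rank[target] += share
        rank = new_rank
    return rank

# Invented example: A links to B and C, B links to C, C links back to A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(links)
# C is "cited" by both A and B, so it ends up with the highest score,
# just as a heavily cited paper is judged more important.
print(max(ranks, key=ranks.get))  # -> C
```

Unlike keyword frequency, a page cannot raise this score by stuffing its own text; it needs other (themselves well-ranked) pages to link to it, which is what made early search spam much harder.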

Open source software: LAMP. In 1984, AT&T settled its antitrust case by splitting off the regional telephone companies, gaining in exchange the freedom to commercialize non-telecom businesses, and its first commercialization candidate was the Unix operating system. Against this background, Stallman, a hacker at MIT, developed the Unix-compatible GNU system. GNU’s code was open source, and its license required that any future work based on it must also be free and open source, a revolutionary requirement. For Stallman, free software meant not merely software that costs nothing, but software any user is free to use. Building on the GNU toolchain, the Linux system was launched, and in 1999 Linux ran successfully on the IBM 390 mainframe. Although it was essentially impossible to make money selling Linux itself (the license’s terms made this impractical), one could sell supporting services; Red Hat and similar companies went public in 1999, and their stock prices soared. A similar story played out in databases with MySQL, released in 1995, which over the next 15 years gradually narrowed the functional gap with Oracle’s database; in 2010 Oracle acquired MySQL indirectly by acquiring Sun. PHP was the most popular web programming language of the time, and Apache the world’s most popular server software. Linux, Apache, MySQL, and PHP are known collectively as LAMP, the four pillars of open-source software development.

Data centers and cloud computing. In the 1990s, servers used high-end chips, and Intel’s efforts at that end of the market went less well than expected. Google, however, achieved super-scale storage and computing another way, by wiring together piles of ordinary computers: the prototype of the modern data center. Early Google search ran on expensive Sun servers; in 1999, Google engineer Urs Hölzle began assembling hundreds of low-cost servers to deliver efficient, cheap, stable computing and storage, achieving efficient heat dissipation through water cooling and reliability through redundant design. In 2004, Google launched the Gmail email service with 1GB of storage, which depended heavily on its in-house data center capability, and the data center architecture laid the foundation for cloud computing. In 2008, Intel launched its Core processors, further improving CPU performance; Intel’s later product line looked more like BMW’s model tiers, with low-, mid-, and high-end i3, i5, and i7 parts, and by this time CPUs no longer relied on clock-frequency increases. In the Wintel alliance, Microsoft was always the laggard; perhaps Intel simply did its job too well, since many CPU models from 10 years ago can still run Windows and Office smoothly.

Virtualization and streaming. Google’s data center in effect connects many computers so they behave like one supercomputer; virtualization is the opposite, making one computer behave like many virtual ones. The first company to try virtualization was IBM, which invented virtual memory in the 1970s with the aim of virtualizing the IBM 360 mainframe to provide time-sharing services. Virtualization gradually trickled down from mainframes to PCs: VMware released virtualization software supporting Intel chips in 1999, and the two chip giants Intel and AMD began adding targeted hardware support. Ordinary PC users today use virtualization without knowing it; for example, Windows runs legacy programs in a virtual machine mode. Virtualization’s largest application scenario is servers, where it became a key enabling capability for cloud computing. With the arrival of broadband Internet, online streaming video became a major trend: YouTube rode the wave to industry leadership, Netflix transformed from a DVD rental provider into a streaming leader, and video playback came to account for one-third of U.S. Internet traffic. Social media also began to rise, most successfully Facebook, founded by Zuckerberg at Harvard, which started with college students and became a social platform for everyone (for details, see [Reading] Facebook’s Family History: Making the World More Transparent and Information More Relevant). Paid subscriptions and advertising became the Internet’s two most common business models, with exceptions such as Wikipedia, which relies on user donations, and Craigslist, which keeps the simplest of pages and covers its costs by charging for certain listings.

Java and SaaS. With the arrival of the Internet era in the 1990s, the idea of building a new platform on the browser became popular. Java's vision was an interactive platform compatible with every software and hardware environment. Oracle, IBM, and Sun (Java's parent company) joined forces to build a new generation of hardware, the network computer, in effect bypassing Microsoft's Windows to become the next platform. But whether because the network computer's Internet connection was too slow, the Java-based WordPerfect office suite was hard to use, the Java system itself was full of bugs, or Windows was making trouble behind the scenes (Sun was long a leading supporter of antitrust action against Microsoft), the plain result was that Java's platform vision was never realized. The unexpected result is that Java survives as a widely used programming language. The Software-as-a-Service (SaaS) model, represented by Salesforce, is essentially the Internetization of enterprise services; by the 2010s its customer base had expanded from small and medium-sized companies to large enterprises, and it became an important part of cloud computing. SaaS changed software companies' revenue from a project basis to a subscription basis, which further lowers the barrier to adopting software, makes piracy harder, and improves both the sustainability of software companies' income and their ability to raise prices. A typical case is Adobe's Creative suite, which announced its move to SaaS in 2013; Adobe's revenue doubled over the following five years. In 1995, Netscape launched JavaScript (unrelated to Java despite the name), which became the basic support for dynamic web pages. Microsoft's IE soon announced JavaScript support, and Google later built popular applications such as Gmail and Google Docs on top of it.
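To give a flavor of what "dynamic web pages" meant in practice, here is a minimal sketch. It is illustrative only: the function name and markup are invented, not taken from the article. The idea is simply that a script can compute new page content in response to user input, so the page updates without a full round trip to the server, the basic trick that Gmail-style web applications later scaled up.

```javascript
// Illustrative sketch of in-page interactivity. In a browser this would
// be wired to the DOM, e.g.:
//   document.getElementById("out").innerHTML = renderGreeting(input.value);
// so the visible page changes without reloading.
function renderGreeting(name) {
  // Fall back to a default when the input is empty or whitespace.
  const safe = String(name).trim() || "world";
  return "<p>Hello, " + safe + "!</p>";
}

// Runs anywhere JavaScript runs (browser console or Node.js):
console.log(renderGreeting("Ada")); // <p>Hello, Ada!</p>
```

The same pattern, scripted updates to an already-loaded page, is what separated JavaScript-era web pages from the static documents that preceded them.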
In the end, although IE and Netscape fought fiercely and Microsoft won that round with its bundling strategy, the ultimate winner in the browser market was Google's Chrome, launched in 2008. Its minimalist design and markedly more stable performance quickly captured market share, eventually reaching around 70% of the market.

Computers are everywhere

Mobile Internet era. One of the earliest ancestors of the smartphone was the portable computing device, beginning with the handheld calculator. In the 1990s, during the non-Jobs era, Apple invested heavily in the Newton handheld computer (PDA), but it failed because the time was not yet right. Another major ancestor is the mobile phone, which can be traced back to World War II; Motorola's analog handset of the 1980s, the DynaTAC 8000X (known in China as the "Big Brother"), became a classic of its era. The 2G era brought text messaging, and mobile phones began to access the Internet through WAP and similar methods. In 2003, the BlackBerry became popular, a must-have for business people because it could conveniently send and receive email. Another branch is the spread of GPS devices: in 1994, BMW's flagship 7 Series shipped with GPS navigation, and dedicated GPS device makers such as Garmin followed. The earliest leader in "smart" phones was Microsoft, whose Windows CE operating system launched in 1996; related devices from Hewlett-Packard, Compaq, Dell, and other manufacturers were the early smartphone front-runners. The final convergence point of all these branches, however, was Apple's iPhone, which innovatively used a large screen (3.5 inches, very large at the time) as the core display and input interface, paired it with Apple's pioneering multi-touch operation, and created the App Store application ecosystem, ushering in the smartphone era. The mobile Internet era in turn gave rise to a series of mobile-native applications such as Uber for ride hailing, Instagram for social networking, and Grindr for dating.
Since then, Apple has launched the iPad to further expand the smart mobile device market and the Apple Watch to enter smart wearables, while Google has built a competing ecosystem around the open-source Android operating system. Today, smartphones, smart watches, and similar hardware have become people's most indispensable devices, and the arrival of the AI era may further accelerate the trend toward intelligent hardware.

Postscript: Model Subversion and Technology Integration

Compared with company histories, which dwell on details, writing industry history, especially long-run history, always feels a bit like a chronicle of events. But this kind of chronicle has at least one advantage: it makes it easier to connect the causes and consequences of developments, weakening the effect of "not seeing the true face of Mount Lu because you are standing on the mountain." To summarize, a few points from this piece left an impression on me:

Product innovation cannot be planned, but investment in technology is always worthwhile. IBM's early attempt to enter supercomputing failed, yet the technology it built became the foundation of the later System/360 series, which swept the entire computer industry.

The same goes for time-sharing systems. Although time sharing itself did not develop as hoped, it incubated the predecessor of the Internet on one hand (users communicating with each other through a central computer) and the open-source software ecosystem on the other: the Unix operating system and, later, the four open-source kings of the LAMP stack all trace back to time-sharing systems. Viewed through the history of the software industry, closed source and open source are the yin and yang of the same thing, much like today's open-source vs. closed-source paths for large AI models.

Through the lens of "The Innovator's Dilemma," we can see that the PC, with its ToC business model, was hugely disruptive to the ToB model: PCs disrupted mainframes, minicomputers, and workstations, and Microsoft and Intel disrupted traditional enterprise service giants such as IBM. That is the business-model perspective. From the perspective of technological development, however, the story is one of integration: just as Windows NT became the base of Windows 2000 and the RISC chip architecture was absorbed into Intel chips, the technical foundations of minicomputers became core infrastructure of the PC.

The reason the Internet could unify everything and maximize network effects has much to do with its fundamentally decentralized design: the delivery of data packets is independent of the content being sent, any device can join the Internet, the only centralized body, the domain name administrator, was given independent non-profit status, and Internet governance relies more on social mechanisms than on management by any specific institution.
