Computing up until the late 1930s was done by hand or with the aid of some kind of mechanical device, such as an abacus or an arrangement of gears and levers powered manually and, later, electrically. By the end of the Second World War, several digital computers had been constructed using vacuum tube technology to perform simple computations, with programs and data that had to be entered manually prior to each execution. By the early 1950s core memory had been perfected, allowing programs to be stored and modified easily; the punched card and line printer had replaced paper tape as input and output media; and business had begun to take notice of the computer as it moved out of the lab and into the office. The transistor, invented in 1947, had largely replaced vacuum tubes by the late ‘50s, making the machines more reliable, a little cheaper and definitely cooler. By 1960 crude operating systems had been developed, and assemblers and compilers made it possible to develop software faster and more reliably. Old words took on new meanings (programmer, data processing, bits, core, dump, bug and so on), at least to those involved in the fledgling computer “industry.”
The ’70s was the decade of the big and super-big mainframe computers, the courtship between the computer and telecommunications, core giving way to semiconductor memory, huge software development projects, remote terminals and timesharing. Almost as a counterpoint to all this bigness, at least two significant events occurred in 1975 that heralded developments that would change the face of computing forever. Ed Roberts began to advertise his Altair 8800 microcomputer kit in Popular Electronics, and two guys named Gates and Allen founded a company called Microsoft to sell their BASIC interpreter designed to run on a microprocessor.
During the ‘80s the computer industry really took off as hundreds of new enterprises were formed all over the world to exploit the new micro-technologies. It was quite a ride, one that saw mainframe giant IBM lose a big chunk of its market share to upstart companies selling microcomputers, local area networks and horizontal-market software for word processing, spreadsheets and database management, and to others developing vertical-market software addressing the needs of just about every organization one could name. Telcos, with billions invested in outdated analog networks, finally jumped on the bandwagon and began to retool with communications systems designed for a digital world.
The period from the 1990s to the present witnessed enabling developments like the public introduction of the Internet, the transformation of the personal computer into a consumer item in its own right, the micro-miniaturization of processor and memory chips, wireless communications, portable and compact packaging, and the global manufacturing and marketing of technology. These developments and others led to a host of new and innovative “hi-tech” products that packaged complex technologies into user-friendly appliances and devices, each containing at least one microprocessor.
In little more than 70 years the computer has evolved from a scientific curiosity into an omnipresent phenomenon of epic proportions, one that those born during this time now largely take as a given. It has brought about the continuing enrichment of language with words like blog, Googled, wifi, phishing, pod, pad, texting and so on. The future will undoubtedly see the further integration of the key digital technologies, computing and communications, to bring about a totally “wired” planet, and hopefully a more democratic and peaceful one.
The Vernon PC Users’ Club meets the second Tuesday of the month at 7 p.m. in the cafeteria at the Schubert Centre. Call Betty at 542-7024 or Olive at 542-8490 for more information.