Jeff Stephenson, P.E., is the Team Leader of [...]
As civil engineers and surveyors, there are many aspects of our daily jobs that are second nature to us. But we often hear common questions about our field from the public. In this "Did You Know" blog series, we'll answer some of these questions.
We've all heard the buzzwords "cloud computing" and "virtual desktop." These phrases are really just new terms for old technology: "centralized computing."
Centralized digital computing has been around since the 1950s, and it was the predominant method of providing computing resources through the mid-1980s. This was due to the high cost of computer equipment, the tightly controlled environmental conditions it required, and the specialized technical knowledge needed to operate and maintain it. For years, only very large corporations and universities could afford the high operating costs associated with computers.
A centralized computing environment consisted of a few large and powerful (relative to their time) mainframe computers, which provided CPU (central processing unit) processing, data storage, retrieval, backup, and hard-copy output (printing) resources for many end-users. Data input was accomplished with keypunch machines and punched-card input/output devices through the 1970s; these became obsolete with the introduction of inexpensive CRT computer terminals.
Each end-user interacted with a "dumb" terminal connected to the centralized computer. The terminal provided a typewriter-style keyboard for data input and a CRT display for viewing both the input and the "processed" output. Dumb terminals performed no data processing of their own; they simply relayed data to the central computer, which did all the processing, stored the results, and sent them back to the terminal for display.
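The division of labor described above can be sketched in a few lines of Python. This is a hypothetical illustration, not a real terminal protocol: the function names and the sample commands are invented for the example.

```python
# Illustrative sketch of the "dumb terminal" model (hypothetical names):
# the terminal does no processing of its own -- it only forwards the
# user's input to the central computer and displays whatever comes back.

def central_computer(command: str) -> str:
    """All processing and storage happens here, on the mainframe."""
    if command == "DATE":
        return "1979-06-01"  # invented sample response
    return f"PROCESSED: {command.upper()}"

def dumb_terminal(keystrokes: str) -> str:
    # The terminal simply relays input and echoes the mainframe's reply.
    return central_computer(keystrokes)

print(dumb_terminal("list jobs"))  # -> PROCESSED: LIST JOBS
```

The point of the sketch is that `dumb_terminal` contains no logic of its own; every decision lives in `central_computer`, just as every decision in a centralized environment lived on the mainframe.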
The primary advantage of a centralized computing environment is reduced overall operating cost: fewer equipment purchases, less maintenance, simpler system operations, lower energy consumption, and less demanding environmental conditions. With the increased popularity and affordability of "personal computers" in the mid-1980s, centralized computing began to decline, and we saw the emergence of "de-centralized computing."
A "de-centralized" computing environment is one in which each end-user has his or her own dedicated computer equipment. Each personal computer is equipped with its own CPU for data processing and an internal hard disk for program and data storage, along with a keyboard, mouse, and CRT monitor for data input and output. Output devices, such as printers, can be connected directly to the personal computer to provide hard-copy printed output.
When Sain Associates began using computers in the late 1980s, it was based on this de-centralized computing system model. But as our dependency on computers increased over the years, it became apparent that we needed a better way to address ongoing computer equipment replacements, repairs and software upgrade requirements. We needed to reduce not only the costs of these items, but also the time required to perform the work. So, we began looking into the increasingly popular “cloud computing” and “virtual desktops.”
In networking diagrams, WAN circuits (e.g., Frame Relay, MPLS) and the Internet are depicted as a cloud-shaped object above the LANs (Local Area Networks) they connect. This is one believed origin of the term "cloud computing."
In the early 2000s, as high-speed Internet connectivity became widely available, affordable, and dependable, larger companies began offering data storage/backup and hosted email services from their data centers, over the Internet, to other companies and the general consumer.
Although "virtualization" software has been around since the 1970s (IBM's VM operating system), it wasn't until the early 2000s that it matured enough to provide the foundation for "cloud" computing. Virtualization software divides a physical computing device into one or more independent "virtual" devices, each of which can take advantage of otherwise idle computing resources.
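The core idea of virtualization — one physical machine's resources carved up among independent virtual machines — can be sketched with a simple bookkeeping class. This is a hypothetical illustration of the concept, not how any real hypervisor is implemented; the class, method names, and resource figures are invented for the example.

```python
# Hypothetical sketch of the idea behind virtualization: one physical
# host's CPU and memory are partitioned among independent "virtual"
# machines, each claiming a share of otherwise idle capacity.

class Host:
    def __init__(self, cpu_cores: int, ram_gb: int):
        self.free_cpu = cpu_cores   # idle CPU cores on the physical box
        self.free_ram = ram_gb      # idle memory on the physical box
        self.vms = []               # names of virtual machines created

    def create_vm(self, name: str, cpu: int, ram: int) -> bool:
        # A new VM can only claim resources the host still has idle.
        if cpu <= self.free_cpu and ram <= self.free_ram:
            self.free_cpu -= cpu
            self.free_ram -= ram
            self.vms.append(name)
            return True
        return False

host = Host(cpu_cores=8, ram_gb=32)
host.create_vm("desktop-1", cpu=2, ram=8)    # succeeds
host.create_vm("desktop-2", cpu=2, ram=8)    # succeeds
host.create_vm("big-server", cpu=8, ram=32)  # fails: not enough idle capacity
```

Real hypervisors do far more (scheduling, memory overcommit, device emulation), but the accounting above is the essence: virtual machines exist only as software claims against a physical machine's spare capacity.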
To the end-user, a "virtual desktop" is a bit like "virtual reality": each is more a state of mind than a physical reality. A virtual desktop is not a true physical desktop, but an "imagined" one that resides at another physical location. In reality, it is nothing more than a piece of software running on a physical computing device, consuming a share of that device's idle computing resources.
By the mid-2000s, corporations began offering SaaS (Software as a Service), cloud computing, and complete virtual desktop and virtual server services. Today, these offerings continue to expand in availability and popularity.
Since cloud computing and virtual desktops/servers are all based on time-sharing of centralized computing resources, computer technology has come full circle: it started as centralized computing, shifted to de-centralized (distributed) computing, and has now evolved back to centralized computing.
Sain's computer network went all "virtual" in March 2013, coinciding with our move to our current office location. Unlike a lot of companies, we chose to go all virtual (desktops and servers) at the same time. As with any change in operating procedures, it took our users a little time to adjust to the differences between how physical and virtual desktops/servers operate and how they are affected by the performance of the overall IT infrastructure. Overall, though, it's a decision we have been happy with.