The Advancements of Computing Used by Modern Technology Projects

Comparison of early servers to cloud computing. Credit: Javrsmith

Project managers who lead information technology projects must ultimately determine the best deployment platform for the completed application. While many organizations have strict requirements for the target host, cloud computing with the WordPress content management platform is often a strong option. This approach is highly functional, inexpensive, and either already familiar to, or easily learned by, the technical staff of most information technology departments. Depending on the political climate of the agency, it may be difficult to implement, but a cloud computing server can serve a wide variety of needs.

The concept of cloud computing is actually not new at all; centralized hosting was one of the first models of computing. The earliest computers were very large and expensive, and each hosted a set of applications for a number of simultaneous users. In time, network technology advanced to the point where a single server could perform work for users distributed across a very wide area, even around the globe. These servers were large and expensive, and they required specialized equipment rooms, operators, and hardware. As microprocessor technology became cheaper and more capable, application hosting shifted toward a highly distributed model. In the years since 1981, when the IBM PC was released, billions of general-purpose applications have proliferated in every corner of the Earth.

The distribution of applications on personal workstations became very common, even for larger projects. Distributed sales applications were launched on the personal computers of travelling employees and could operate offline when needed. At intervals, each computer would be connected to a central server, which would apply application updates to the network of discrete, standalone personal computers. Some agencies accomplished this kind of periodic update through distribution media, floppy disks or CD-ROMs, shipped to individuals who would apply the needed updates. Because application updates often had to be deployed quickly, distributing program changes to widely scattered personal computers became a significant burden for many project managers. Fortunately, as projects grew more technically complicated, the Internet rose in prominence, giving project teams the ability to push application changes to the separate personal workstations far more easily.

With the continued rise in capabilities of the Internet, substantial improvements in network throughput have allowed project teams to consider the move to cloud servers. These large computers can host large applications for use by a distributed network of users, who may be located close to the server or anywhere in the world. Data security, implemented via powerful encryption utilities, is key to these projects. Essentially, applications distributed widely from a central server have become very similar to the deployment model used with large computers in the early days of computing. Progress has truly come full circle.
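In practice, the encryption the article mentions is handled by TLS between client and server, but the underlying idea of protecting data in transit can be sketched with message authentication from Python's standard library. This is a minimal, illustrative sketch only; the key exchange and the exact packet layout are assumptions for the example, not a production protocol.

```python
import hmac
import hashlib
import secrets

# Shared secret, assumed to be negotiated out of band
# (in a real deployment, TLS handles key exchange for you).
key = secrets.token_bytes(32)

def sign(message: bytes, key: bytes) -> bytes:
    """Prefix the message with an HMAC-SHA256 tag (32 bytes)."""
    tag = hmac.new(key, message, hashlib.sha256).digest()
    return tag + message

def verify(packet: bytes, key: bytes) -> bytes:
    """Split off the tag and reject packets that were tampered with."""
    tag, message = packet[:32], packet[32:]
    expected = hmac.new(key, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message failed integrity check")
    return message

packet = sign(b"order: 42 units", key)
assert verify(packet, key) == b"order: 42 units"
```

The same verify-before-trust pattern is what lets a central server safely accept input from thousands of remote devices it does not control.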

Of course, the current implementation of technical projects bears little resemblance to those released 20 or 30 years ago. Today's distributed user of a modern application works on a remote personal workstation, a laptop, a smartphone, or another independent computing device. The end user has significant computing power at their location, which offloads a great deal of work from the central server. The mundane jobs of network communication, encryption, and graphical display are handled by these very capable remote computers. The central server in a modern project is responsible for data storage, data integrity, and central business control. Where yesterday's host servers devoted much of their computing power to the individual keystrokes of a multitude of distributed terminals, today's host servers concentrate on the business logic. Another difference is that today's servers wield comparatively mighty computing power.

The cloud servers used as project deployment platforms today are very powerful computers. Although inferior to some of the largest earlier servers in certain respects, they eclipse the older machines' abilities in many others. Most new servers would struggle if they had to process the combined keystrokes of thousands of users typing simultaneously, a load many old servers handled with ease. Fortunately, modern projects do not impose such requirements: since the typical remote user is equipped with an actual computer, the keyboard interface is no longer a concern of the application server. This shift has given rise to server-based applications such as WordPress, which uses the server's database to store and retrieve data and builds dynamic web pages on demand for remote users.
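The store-then-render pattern that WordPress uses (WordPress itself is written in PHP against a MySQL database) can be illustrated with a small Python sketch using an in-memory SQLite database. The table name, columns, and sample post are invented for the example; only the pattern, building a page from stored data at request time, reflects how such systems work.

```python
import sqlite3

# In-memory stand-in for the central server's database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
conn.execute(
    "INSERT INTO posts (title, body) VALUES (?, ?)",
    ("Project Update", "Deployment moved to the cloud server."),
)
conn.commit()

def render_page(post_id: int) -> str:
    """Build the HTML for one page at request time, from stored data."""
    title, body = conn.execute(
        "SELECT title, body FROM posts WHERE id = ?", (post_id,)
    ).fetchone()
    return f"<html><h1>{title}</h1><p>{body}</p></html>"

page = render_page(1)
```

Because the page is assembled from the database on every request, editing the stored content changes what every remote user sees the next time they load the page, with no per-device distribution step.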

Servers used to implement modern projects are much more powerful than older models in both storage and processing power. This is especially true per unit of floor space, i.e., power per refrigerator-sized computer cabinet. Old servers used cabinet after cabinet to house the memory, central processing unit, disk drives, and other components. New servers pack far more of the required components into a greatly reduced space; a single refrigerator-sized cabinet might house all of the components of a new server. Such a machine could contain dozens of gigabytes of random access memory, dozens of terabytes of disk storage, and multiple central processing units, all in the space of a standard refrigerator. Only the largest and most expensive of the older central servers would have computing power comparable to what a project team gets from a single modern application server. And the old machine, say an IBM 370 mainframe from the 1970s, would require perhaps six or seven units of space, each the size of a fridge.

Since project managers can now routinely specify that their projects be deployed on central servers, much as the old applications were, they gain a great deal of functionality for their project deliverables. Information sharing between project stakeholders is usually automatic and complete, and data security is easily assured. Instant deployment of application updates is possible because the application is centrally hosted: the distributed users, on their particular devices, simply run the code available at the central server, so when an update is applied there, every user receives the change instantly. There are usually no deployment issues for new projects that rely on central servers. True, particular upgrades may be necessary at the remote end, say a browser installation on a remote workstation, but these are made easier by the application distribution methods in use today.
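The instant-update property described above follows directly from the architecture: every request executes whatever code and content is currently installed on the server, so there is nothing to push to individual devices. A toy sketch (the variable and function names here are invented for illustration):

```python
# The server's single copy of the application state.
app_state = {"banner": "Welcome v1"}

def handle_request() -> str:
    # Every client request runs against the code and data
    # currently installed on the central server.
    return app_state["banner"]

clients = ["laptop", "phone", "workstation"]
before = [handle_request() for _ in clients]   # all see v1

app_state["banner"] = "Welcome v2"             # central update, no rollout
after = [handle_request() for _ in clients]    # all see v2 immediately
```

Contrast this with the floppy-disk era described earlier, where the same change meant shipping media to every standalone machine.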

Many project managers have used such deployment strategies for modern projects, and they have documented important lessons learned along the way at educational sites such as Practical PM Journal.