Uses
The virtual machine offerings designed for clients are convenient in a number of scenarios. If your environment has to support multiple operating systems, virtual machine software can be used by:
• Your Help Desk and support staff, so that they can use a single piece of hardware to run the different operating systems that different users have. In this way, the support staff can replicate a user's environment when providing support.
• End users during an operating system migration. If a particular business application isn't yet supported by more current versions of an operating system, it can stall efforts to move the whole organization to the later OS version. With virtual machine software, all users can be migrated to the new OS, and legacy applications can run on an instance of the older OS.
• Training room PCs, which can run various instances of operating systems, each configured to match a different user scenario. This lets trainers easily provide training that replicates the various user environments.
• Software developers and testers, who can use virtual machine technology to easily test their applications against varying workstation configurations and environments.
Virtual machine offerings for servers also have distinct benefits and are quickly being embraced by organizations as a way to consolidate servers efficiently. Consolidating servers reduces the investment in hardware, but perhaps the most significant advantage is that it requires less physical space in the data center, along with less cooling and electricity. Virtual machines also let IT quickly create new environments for testing and development without having to buy new hardware. Server environments that are virtualized can be moved easily and quickly to different hardware, which greatly simplifies optimizing resources and moving applications to more current hardware.
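To put rough numbers on the consolidation argument, the sketch below compares standalone servers against a smaller set of virtualization hosts. Every figure in it (server costs, the 10:1 consolidation ratio, power costs) is a hypothetical placeholder; substitute your own hardware, power, and space numbers.

```python
# Hypothetical back-of-envelope server consolidation estimate.
# All figures are illustrative placeholders, not vendor pricing.

physical_servers = 40        # standalone servers today
consolidation_ratio = 10     # VMs hosted per virtualization host (assumed)
server_cost = 6_000          # purchase cost per standalone server ($, assumed)
host_cost = 15_000           # cost per (beefier) virtualization host ($, assumed)
power_per_server = 900       # annual power + cooling per box ($, assumed)

hosts_needed = -(-physical_servers // consolidation_ratio)  # ceiling division

hardware_before = physical_servers * server_cost
hardware_after = hosts_needed * host_cost
power_before = physical_servers * power_per_server
power_after = hosts_needed * power_per_server

print(f"Hosts needed after consolidation: {hosts_needed}")
print(f"Hardware: ${hardware_before:,} -> ${hardware_after:,}")
print(f"Annual power/cooling: ${power_before:,} -> ${power_after:,}")
```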
Also, legacy applications can be run on virtual servers without having to set aside dedicated hardware environments for them. It is important to note that with the growing popularity of virtualization, many software vendors have added specific references to this technology in their licensing agreements; if you have multiple copies of their software running on virtualized servers, you need to be sure that each instance is licensed properly. The most popular solutions for virtualization are VMware from EMC and Hyper-V from Microsoft.
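If your hypervisor exposes an API, a short inventory script can support the per-instance license check described above. This is a minimal sketch using the libvirt Python bindings (common with KVM and Xen; VMware and Hyper-V have their own management APIs), and it assumes the libvirt-python package is installed and that qemu:///system is the right connection URI for your host.

```python
# Minimal sketch: count running guest VMs on a libvirt-managed host
# so each software instance can be checked against its license terms.
# Assumes the libvirt-python bindings are installed and that
# qemu:///system is the appropriate URI for your environment.
import libvirt

conn = libvirt.open("qemu:///system")
try:
    running = [d.name() for d in conn.listAllDomains() if d.isActive()]
    print(f"{len(running)} running VM(s):")
    for name in sorted(running):
        print(f"  {name}")
finally:
    conn.close()
```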
Virtual Desktop Infrastructure (VDI)
An offshoot of virtual machines is the virtual desktop infrastructure (VDI). With VDI, end users are given “thin” clients (e.g., low-end workstations, netbooks, even tablet devices), which they use to access a virtual workstation environment hosted on a server farm in the data center. The real horsepower resides in the VM sessions, which have the tools and software the users need. The appeal of VDI is that the user devices are inexpensive, with little configuration and software associated with them (almost like the dumb terminals of mainframe days), and the VMs are more easily managed because they are centralized. Users get an identical experience, with access to the same files and programs, regardless of where they are and which device is in front of them. The number of VMs can be easily scaled to match a changing workforce, so that the environment and the investment are optimized.
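As a rough illustration of how a VDI environment scales with the workforce, the sketch below estimates the number of host servers required. The concurrency ratio, VM density per host, and spare-capacity figures are assumptions for illustration only.

```python
# Hypothetical VDI sizing estimate. Concurrency and density figures
# are placeholder assumptions, not benchmarks.

total_users = 500
concurrency = 0.8        # fraction of users logged in at peak (assumed)
vms_per_host = 60        # desktop VMs one host can carry (assumed)
headroom_hosts = 1       # spare capacity for failover/maintenance

peak_sessions = int(total_users * concurrency)
hosts = -(-peak_sessions // vms_per_host) + headroom_hosts  # ceiling division

print(f"Peak concurrent desktops: {peak_sessions}")
print(f"Hosts required (incl. {headroom_hosts} spare): {hosts}")
```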
However, so far the ROI on VDI has been somewhat illusory. Case studies have shown that the upfront costs can be very high and that there have been problems with user acceptance, performance, and software application compatibility in virtual environments.
5.3 Open Source
For many, many years, there was the Microsoft world of software, and then there was everyone else's world. While the Redmond giant still owns a large portion of some very large markets, and many companies have made inroads into individual markets, it's the Open Source movement that has had the largest impact on Microsoft's share of the operating system market in recent years.
Definition
Open Source Software (OSS) is software (both operating systems and applications) created by the worldwide user community. It isn't owned, developed, or supported by any one company, much less one person. Many individuals, organizations, and companies around the world develop, install, and support OSS, but the software itself isn't theirs, and they don't charge anything for it.
Because the source code for OSS is available for all to see and change at will, there is a certain fear of “who is in control?” In the Open Source movement, the concept of a “benevolent dictator” identifies the person in control. While there may be hundreds or thousands of people suggesting fixes and enhancements to a product, it's the benevolent dictator who evaluates them and determines which will be adopted and when. Of course, that won't stop people from making their own fixes and enhancements; it just means they won't be part of the “official” code. Linus Torvalds is generally recognized as Linux's benevolent dictator. For the pros and cons of Open Source, see Table 5.1.
Table 5.1. Pros and Cons of OSS

Pros:
• Initial cost is low or nonexistent.
• Adherents claim many open source products are simpler and easier to use.
• Major corporations, such as IBM and Hewlett-Packard, support and offer Linux solutions, bringing legitimacy to the once-fledgling movement.
• Because the source code of OSS is available to all, there are more eyes looking at it, which increases the chances that bugs and security flaws will be found (and fixed). In theory, this makes OSS more reliable.

Cons:
• Depending on how it's obtained in your organization, there may be no formal support for the product.
• Many companies are afraid that without an “identified” owner, or if they haven't paid for the software, there will be no one to turn to when they need help (i.e., vendor support). That's why purchasing “distributions” of Open Source (such as the one from Red Hat) is a common option: the vendor can also provide support (for a fee, of course).
• Detractors of OSS say the software is often harder to use and missing important features.
• Some are concerned that since OSS code is free (i.e., no “profit”), there is little incentive for anyone to enhance it, add features, and so on.
• The success of the product depends on the collaboration of hundreds of people around the planet; this is too much risk for some companies to accept. There are many conflicting studies as to which OS is more reliable and secure.
Cost
All OSS is “free.” That is, there is no charge for the software itself. However, within the OSS genre there are “distributions” (or “distros”) that cost money to buy. In the OSS world, a distribution is a collection of OSS software already configured, compiled, and ready to install. These distributions are put together by a third party and are sometimes preferred to the original OSS code because the third party is willing to provide support. Red Hat is the best-known example of a Linux distribution. Many companies prefer to use distributions because there is comfort in knowing that an identifiable party (as opposed to the “OSS community”) is responsible for support.
However, whether software is obtained free as OSS or purchased as a distribution, its costs are much more than the initial expense; when evaluating the purchase of software, you must consider Total Cost of Ownership (TCO). Typical TCO considerations include the long-term components of a purchase, such as installation, training, and financing.
5.4 Managing Software
Managing software is a critical component of an IT Manager's work. There are several key issues, and some very useful tools exist to help the manager cope with them. With the pace of advances in software applications, features, and functionality, software has become a dynamic asset that needs active management.
Total Cost of Ownership (TCO)
Purchasing software is an activity that can range from a simple retail purchase of a graphics tool or a download of a file compression utility to a complex, months-long, meeting-filled, companywide process of buying an Enterprise Resource Planning solution. However, the purchase price isn't the only cost associated with software.
The TCO concept is this: when you are considering buying an asset (e.g., a computer, a car, a house), you should consider not only the initial purchase price (say a business application will cost the department $10,000) but all the costs associated with that purchase over the life of the asset. There will be costs for the time to install the application, the hardware associated with it, vendor maintenance/support fees each year, staff salaries, training, and so on. TCO is as applicable to hardware as it is to software. TCO is discussed in more detail in “TCO and Asset Management: What Are They?” in Chapter 7, Getting Started with the Technical Environment, on page 196.
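To make the concept concrete, here is a minimal TCO sketch built around the $10,000 application mentioned above; every figure other than the purchase price is a hypothetical assumption. Even with these conservative numbers, the purchase price ends up being well under half of the five-year total.

```python
# Hypothetical five-year TCO for the $10,000 application mentioned above.
# Every figure other than the purchase price is an illustrative assumption.

purchase_price = 10_000
years = 5
annual_maintenance = 0.20 * purchase_price   # vendor support fee (assumed 20%/yr)
installation = 3_000                         # one-time setup labor (assumed)
training = 2_500                             # initial user training (assumed)
annual_admin = 4_000                         # share of staff time to run it (assumed)

tco = (purchase_price + installation + training
       + years * (annual_maintenance + annual_admin))

print(f"Purchase price: ${purchase_price:,}")
print(f"{years}-year TCO:  ${tco:,.0f}")
```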
Software Management Techniques
Deployment
One of the most complicated issues related to desktops and laptops in the large corporate environment is the deployment of software. When there is a new software package to be implemented or an update (no matter how minor) to an existing application, the manual distribution and installation can be a very tedious and labor-intensive process. Ensuring that all workstations are comparably set up and maintained can drain a great deal of IT's resources.
To streamline this situation, there are several practices that many environments (small and large) have seized upon. They're gaining in popularity and quickly becoming standard:
• Developing a standard disk image
• Selecting a disk cloning package
• Implementing a software deployment tool
• Having users download software from the IT intranet site
• Locking down the desktop
Develop a Standard Disk Image
Every organization should create a standard disk image to ensure that each workstation's software is set up and configured as consistently as possible. This can reduce a technician's troubleshooting time significantly. Ideally, there will be only one standard image in an environment, but it is common to have more than one for different department needs, types of users, and so on.
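One lightweight way to confirm that deployed workstations still match the standard image is to compare file hashes against a baseline manifest captured from the master image. The sketch below illustrates the idea only; the baseline.json manifest format and scan root are hypothetical, and real configuration-drift tools are far more thorough.

```python
# Simplified sketch: compare files on a workstation against a baseline
# manifest captured from the standard disk image. The paths and the
# manifest format are hypothetical.
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MiB chunks to avoid loading it all into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# baseline.json maps relative file paths to expected SHA-256 digests
baseline = json.loads(Path("baseline.json").read_text())
root = Path("C:/")  # scan root; adjust for your environment

for rel_path, expected in baseline.items():
    target = root / rel_path
    if not target.exists():
        print(f"MISSING  {rel_path}")
    elif sha256_of(target) != expected:
        print(f"CHANGED  {rel_path}")
```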
Select a Disk Cloning Package
A disk cloning package uses very sophisticated techniques to quickly copy large amounts of data in a fraction of the time it would take to use conventional methods. Once the master disk image is made, it's possible to duplicate it to other workstations in a matter of minutes. In larger environments, where there is a constant influx of new computers arriving at the loading dock, a disk cloning package can save an enormous amount of time in putting your standard disk image on new computers (or when you redeploy computers). Also, because it creates an identical duplicate, it virtually eliminates any errors that might creep into a manual effort. Popular disk cloning solutions include Ghost from Symantec, and Snap Deploy from Acronis.
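Conceptually, cloning is a block-level copy of the master image rather than a file-by-file transfer, which is where the speed comes from; commercial tools layer compression, multicasting, verification, and driver fix-ups on top. The following is only a bare-bones illustration of the block-copy idea, with hypothetical file names.

```python
# Bare-bones illustration of block-level copying: stream a master image
# to a target in large chunks. Real cloning tools add compression,
# verification, multicast delivery, and hardware-specific fix-ups.
import shutil

CHUNK = 4 * 1024 * 1024  # copy in 4 MiB blocks (assumed size)

with open("master.img", "rb") as src, open("target.img", "wb") as dst:
    shutil.copyfileobj(src, dst, length=CHUNK)

print("image copied block-by-block")
```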
Implement a Software Deployment Tool
While a standard disk image, along with a cloning tool, ensures that everyone has a similar software configuration when a new workstation is deployed, it does nothing to help with updating or adding software afterward. Numerous tools are available to help with the deployment of software (e.g., LANDesk, Symantec's Altiris, Microsoft's SCCM, Casper from JAMF). These tools operate in a very similar fashion, allowing you to “script” the installation process for a software update, be it as complex as a new companywide application or as small as an updated printer driver. Other alternatives for deploying software include running scripts when users log in, and features of Microsoft's Active Directory.
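The “scripting” these tools perform boils down to unattended installs with checks before and after. As a simplified stand-in for what a deployment tool generates, the sketch below runs a silent Windows Installer package; the package path is hypothetical, while /i, /qn, and /norestart are standard msiexec switches.

```python
# Simplified stand-in for a deployment tool's install script:
# run a silent MSI install and report success or failure.
# The package path is hypothetical.
import subprocess

package = r"\\fileserver\deploy\printer-driver-2.1.msi"  # hypothetical path

result = subprocess.run(
    ["msiexec", "/i", package, "/qn", "/norestart"],
    capture_output=True,
)
if result.returncode == 0:
    print("install succeeded")
else:
    print(f"install failed (exit code {result.returncode})")
```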
Users Download Software from an Intranet
A very common method of deploying software—and one that allows for the user to “pull” the software at their own convenience rather than have it “pushed” to them when they don't need it—is to have users download software from an intranet. This method is particularly useful for software that not every user needs; every user will need Microsoft Word, for example, but most won't need MS Project, much less that $5,000-per-seat tool the Finance department bought.
You can direct users to your IT intranet site to install these packages. A script or installer program you configure will ensure that the software is installed the way you want. This method allows IT to very efficiently deploy software that only a percentage of the entire user base will need. Some of the deployment tools mentioned earlier provide for this sort of self-service functionality for software installations.
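A self-service install typically wraps the same steps in a script the user can launch: fetch the package from the intranet, verify it, and run the installer silently. The URL and expected checksum below are hypothetical placeholders you would publish alongside the package.

```python
# Sketch of a self-service installer: download a package from the IT
# intranet, verify its checksum, then run a silent install.
# The URL and expected digest are hypothetical placeholders.
import hashlib
import subprocess
import urllib.request

URL = "http://intranet.example.com/it/packages/ms-project.msi"
EXPECTED_SHA256 = "<published checksum goes here>"

data = urllib.request.urlopen(URL).read()
if hashlib.sha256(data).hexdigest() != EXPECTED_SHA256:
    raise SystemExit("checksum mismatch; aborting install")

with open("package.msi", "wb") as f:
    f.write(data)

subprocess.run(["msiexec", "/i", "package.msi", "/qn"], check=True)
```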
Lock Down the Desktop
Many organizations now try to limit a user's ability to make troublesome changes by “locking down” the desktop. Typically, this means that the user can't install new programs, alter network settings, or change the way the operating system is configured. Similarly, it prevents the user from deleting key programs and files and from stopping vital operating system services. However, even with a locked-down desktop, you can still give the user the flexibility to alter personal settings, such as wallpaper, default printer, screen saver, colors, and fonts. A number of applications, such as the Group Policy feature of Microsoft's Active Directory, let you define and manage these policies centrally. There is additional discussion of flexibility related to standards and the locked-down desktop in Chapter 10, Working with Users, on page 263.
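Group Policy settings ultimately land in the Windows registry, so a short audit script can verify that lockdown policies actually took effect on a given workstation. This sketch reads one well-known Explorer policy value using Python's standard winreg module; treat the specific value as an example and confirm the relevant policy names for your OS version.

```python
# Windows-only sketch: check whether a lockdown policy applied via
# Group Policy is present in the registry. "NoControlPanel" is one
# well-known Explorer policy value; verify the value names relevant
# to your OS version.
import winreg

KEY = r"Software\Microsoft\Windows\CurrentVersion\Policies\Explorer"

try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
        value, _type = winreg.QueryValueEx(key, "NoControlPanel")
        state = "locked" if value == 1 else "not locked"
        print(f"Control Panel access: {state}")
except FileNotFoundError:
    print("policy key/value not set for this user")
```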