Traditional Methods vs. Best Practices in System Provisioning



[Graphic: a visual representation of License Compliance Software by Dell KACE. Graphic and information provided by Life 4 Hire.]

The hardware and software environment in many organizations has become extremely complex as each department puts different demands on the IT team. This growing complexity is often a result of the proliferation of hardware platforms, operating systems and business applications. Automated system provisioning has made it possible to deploy the necessary software throughout the company and to all these hardware configurations, but there are still some challenges that must be overcome.

In many companies, different departments require their own hardware configurations and, on top of that, may have unique software needs. The engineering department, for example, may need powerful workstation computers loaded with the appropriate design software. At the same time, the sales department might need something more mobile with access to the customer relationship management (CRM) software. Traditional disk imaging and software deployment have made system provisioning easier in some cases, but they can also make things more complex in others.

The Traditional Method

Many companies that employ automated system provisioning have traditionally used a “gold master” disk image for every combination of software and hardware in the company. These are called “fat images” because they include the operating system, the necessary business applications, and all of the updates and patches that have been released up to that point.

In a small company, this can still be an effective method, but it introduces significant challenges as the business grows. As the number of departments increases and more hardware configurations are introduced, the number of fat images the IT team must maintain expands multiplicatively: every hardware platform needs its own image for every department's software stack. If, for example, there are four hardware platforms and four departments with distinct software needs, the company is suddenly faced with 4 x 4 = 16 different images that must be updated and maintained. This kind of expansion tends to add far more complexity than it removes.
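To make that multiplication concrete, here is a minimal Python sketch of the fat-image bookkeeping; the platform and department names are purely hypothetical:

# Hypothetical illustration of fat-image growth: one "gold master"
# image per (hardware platform, department) combination.
platforms = ["laptop", "desktop", "workstation", "tablet"]
departments = ["engineering", "sales", "finance", "hr"]

# Every platform/department pairing needs its own maintained image.
fat_images = [(p, d) for p in platforms for d in departments]

print(f"{len(platforms)} platforms x {len(departments)} departments "
      f"= {len(fat_images)} images to patch and maintain")
# Output: 4 platforms x 4 departments = 16 images to patch and maintain

Every operating system patch or application update now has to be rolled into each of those sixteen images separately.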

Best Practices

A more effective method is to avoid fat images and use the IT team's time more efficiently. A thin image resolves much of the complexity introduced by the traditional method because it includes only the OS and the software that is used across the entire company. Any department-specific software can then be installed directly on the machines that need it, as in the sketch below.
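As a rough sketch of what that post-imaging step can look like (the department names, package names, and the "install-tool" command are all hypothetical stand-ins for whatever deployment tooling the organization actually uses):

# Hypothetical mapping of departments to the software only they need;
# everything company-wide already lives in the thin image itself.
DEPARTMENT_SOFTWARE = {
    "engineering": ["cad-suite", "compiler-toolchain"],
    "sales": ["crm-client"],
    "finance": ["accounting-suite"],
}

def provision(department: str) -> None:
    """Layer department-specific packages onto a machine that has
    already been deployed from the company-wide thin image."""
    for package in DEPARTMENT_SOFTWARE.get(department, []):
        # "install-tool" is a placeholder for the real deployment
        # mechanism; printed here rather than executed.
        command = ["install-tool", "install", package]
        print("would run:", " ".join(command))

provision("engineering")

Under this model, the IT team maintains one image per hardware platform (four in the earlier example) plus these small per-department software lists, instead of sixteen full images.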

This method allows the IT team to maintain only one image per hardware configuration, rather than the number of configurations multiplied by the number of department-specific software stacks. The team can sidestep all the complexity that comes from maintaining so many images and deliver software directly to the departments that need it. In the end, this can reduce the effort required to keep software up to date, reduce the demands on the IT department, and increase overall efficiency.


Bio:

Angela Luke

Angela works with Dell KACE. She is interested in all things related to systems management and deployment. Outside of work, she enjoys reading, hiking, and writing about technology.
