A New Vision for Desktop Management – Part 2


By Jorge Pereira and Rob West
Originally published as a white paper for Kennedy Consulting in January 2007

Continuation of:  A New Vision for Desktop Management – Part 1

Desktop Management has become far more complex today than it was a decade ago. Before the Internet and the Local Area Network, personal computers were largely used as standalone units: applications were installed infrequently, and high-powered computing tasks were rare.

When file-sharing arrived, first over local networks and then over the Internet, things began to change. The personal computer took on the role of a high-powered, high-speed processing machine, and the number of computers within an organization grew as they became a necessity. The complexity of applications increased alongside rising task and automation demands. Workers in remote offices required centralized management and reporting efficiencies, and today we see a prevailing trend toward roaming users within a free-seating office.

These drivers have also increased the need for back-end network support. Disaster recovery services become more important as more data are digitized. Security becomes a major concern as spyware, viruses and Trojan horses place organizations at risk of exposure. In addition, operating systems need to be standardized, with a process in place to deploy patches and updates uniformly. Authorization control, tracking and management of the applications used and deployed throughout the organization is another gargantuan task, and control of remote access servers and Virtual Private Networks is also critical.

It is apparent that as technology advances, the need for similar advances in control and management also increases. IT organizations also face cost pressures from corporate management, and IT Managers are given mandates to lower total cost of ownership, increase efficiency and improve productivity, all at the same time.

Desktop Management rarely takes top priority in many organizations. As a result, IT Managers often lament on many fronts: applications that clash, too much unproductive time spent on installation and troubleshooting tasks, and the difficulty of managing servers and applications across an enterprise environment.

An ideal solution should contain the following attributes in order to be able to address these challenges:

 

Solution requirement  Description
Application Compatibility  Compatible applications facilitate rapid deployment of corporate solutions. As a result, IT support staff enjoy greater productivity by spending less time on troubleshooting and problem-solving tasks.
Improved PC Manageability  Minimal time should be spent on managing individual PCs. This means that controls are put in place to reduce dependency on individual desktop configurations, and to facilitate secure user access to applications and data within the organization.
Flexible Computing Models  Both rich and thin client models should be supported by the solution. Organizations today depend on both, and the best solutions retain this flexibility.
Diagnostics and Help Desk  Users are trained and encouraged to help themselves through diagnostic and self-service tools, reducing Help Desk calls for simple operations. A high-availability Help Desk nevertheless remains available for more complex issues.
Protection for Sensitive Data  Data security should always be a priority. In the event of theft, especially for mobile workers, unauthorized access to corporate data should be prevented.
Safe PCs, anytime, anywhere  Viruses, spyware and Trojan horses threaten not only personal computers but also the corporate networks to which they are attached. A robust solution should protect both the network and each attached computer.
Software Asset Management  As more local applications are installed, effective software asset management is needed to ensure legitimacy and license compliance.

 

Current Desktop Management Practices

Realizing the growing challenges in IT operations management, many corporations have implemented desktop management solutions. In general, traditional Desktop Management (DM) practice brings a host of benefits along with a number of drawbacks. The benefits include the ability to standardize operating systems across an organization and to manage patches and updates, minimizing operational and support requirements, though traditional DM has shown only limited success in this area. On the other hand, applications installed locally on individual PCs are hard to control and restrict, preventing granular, auditable control over licensing and enforcement of corporate policies. Traditional DM offers some tools for this, but none of them fully address the goals mentioned above.

Another strategy of traditional Desktop Management is to prevent users from altering critical settings on their computers, such as the Windows registry or firewall settings. Locking down desktops sounds like a great idea in theory, but geographical, cultural and political differences across larger corporations reveal subtle flaws in that theory. For organizations that have implemented different levels of locked-down desktops, exception management can quickly become a headache and drive up support costs.

In terms of service level agreements, traditional DM practice relies on a high-availability Help Desk and a mobile team of desktop technicians to diagnose, track and solve problems. However, a good number of service calls involve simple operations such as password resets – a significant drain on help desk productivity. Traditional DM solutions have addressed this with self-service password management, and this gradual shift to self-service is realized even more strongly in newer DM practices.

Overall, these solutions have worked well to lower operating costs, improve security, enforce compliance, assist in license auditing, ensure adherence to regulations and increase service availability. However, diminishing returns from ongoing technological advances and constantly changing environments have driven IT organizations to look ahead. These complex changes are gradually leaving desktop management solutions unable to address emerging issues: they may be adequate for today's environments, but not for the needs of the future.

 

The New Age of Desktop Management

Evolving out of traditional DM, new visions of Desktop Management are on the horizon. One such vision combines the following technologies and best practices:

  • Application Virtualization
  • Software as a Service (SaaS)
  • Centralized Management
  • Optimized Desktops (standard images, locked down, easily replaced)
  • Robust reporting and auditability
  • Simplified and streamlined support
  • User-centric design – a “Client Computing Environment”

 

Application Virtualization – the first step toward a new vision of desktop management

An alternative to traditional desktop application delivery is application virtualization.

In a standard environment, applications, together with their configuration information and settings, are installed onto the local operating system. The drawback of this setup is that application configurations can overwrite one another, causing conflicts or even outright malfunction.

With application virtualization, however, applications are not “installed” onto the desktop but are instead delivered to PCs on demand. Application configuration information is loaded into a virtual memory “sandbox,” leaving no permanent “footprint” on the local operating system. This makes for a cleaner desktop environment with fewer chances of application conflict and corruption. On top of this, locally installed applications can still interact with virtualized applications, so to the user, things work just as they should. As a side benefit (some might say a primary benefit for organizations reliant on legacy systems), two versions of the same program can run on the same system without conflict, a feat difficult to achieve before application virtualization.
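The “sandbox” idea can be sketched as a copy-on-write overlay. The class and configuration keys below are invented purely for illustration; real virtualization products implement this at the filesystem and registry level, not with Python dictionaries:

```python
# Illustrative sketch only: a virtualized app reads through to the real
# system configuration, but its writes land in a per-application overlay,
# so the host operating system is never modified.

class ConfigSandbox:
    """Copy-on-write view over a shared configuration store (hypothetical)."""

    def __init__(self, system_config):
        self._system = system_config   # shared store, never modified
        self._overlay = {}             # this app's private writes

    def get(self, key):
        # Reads prefer the app's own overlay, then fall through to the host.
        if key in self._overlay:
            return self._overlay[key]
        return self._system.get(key)

    def set(self, key, value):
        # Writes never reach the underlying system store.
        self._overlay[key] = value

    def discard(self):
        # Closing the app leaves no footprint on the host.
        self._overlay.clear()


system = {"file_assoc/.doc": "WordProcessor v1"}
app_a = ConfigSandbox(system)  # e.g. WordProcessor v2, virtualized
app_b = ConfigSandbox(system)  # another app on the same machine

app_a.set("file_assoc/.doc", "WordProcessor v2")  # private to app A
print(app_a.get("file_assoc/.doc"))  # WordProcessor v2
print(app_b.get("file_assoc/.doc"))  # WordProcessor v1 (unaffected)
print(system)                        # host store unchanged
```

Because each application sees its own overlay, two versions of the same program can coexist on one machine without their settings colliding – the legacy-application scenario described above.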

 

Advantages of Application Virtualization

No installation, no alteration to the operating system  While the application runs on the local machine, no traditional installation is involved. Applications behave normally, as if they had been installed, but do not exist as such. One of the primary benefits is that local administrator privileges are no longer necessary to deploy applications.
Permission-based  As applications are requested by the user, the system intelligently checks for proper authentication and licensing restrictions at launch time. Deploying an application to a user is as easy as assigning that user to a security group.
Centrally served  Only a portion of an application is delivered locally when the user requests it; other portions are streamed to the PC only when required. Server and network resources are thus utilized only when necessary.
Locally executed  Even though the application is delivered from the server, it still executes locally. There are no performance issues with execution, and applications behave as they would had they been locally installed.
Cache with expiry  Delivered applications can be stored in a cache until expiry, enabling operation even when a full-time network connection is unavailable. Laptop and remote users can therefore benefit from virtualized applications without restriction. At the same time, applications can be set to expire from the cache, affording greater control over licensing compliance.
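The cache-with-expiry behaviour in the last row can be sketched as follows. `AppCache` and its lease semantics are hypothetical, a minimal model of the idea rather than any vendor's actual API:

```python
# Hypothetical sketch of "cache with expiry": a delivered application
# package stays usable offline until its lease lapses, after which the
# client must re-authorize against the central server.

import time

class AppCache:
    def __init__(self):
        self._entries = {}  # app name -> (package bytes, expiry timestamp)

    def store(self, name, package, ttl_seconds):
        """Cache a delivered package with a licensing lease."""
        self._entries[name] = (package, time.time() + ttl_seconds)

    def fetch(self, name, now=None):
        """Return the cached package, or None if absent or expired."""
        now = time.time() if now is None else now
        entry = self._entries.get(name)
        if entry is None:
            return None
        package, expiry = entry
        if now > expiry:
            del self._entries[name]  # lease lapsed: enforce compliance
            return None
        return package


cache = AppCache()
cache.store("spreadsheet", b"...streamed blocks...", ttl_seconds=3600)
print(cache.fetch("spreadsheet") is not None)              # usable offline
print(cache.fetch("spreadsheet", now=time.time() + 7200))  # lease expired
```

The second lookup simulates a laptop reconnecting after the lease window: the package is evicted and must be re-delivered, which is how expiry supports licensing control.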

 

Software as a Service – The future vision

As application virtualization takes hold in the industry, we are seeing a gradual migration to the concept of Software as a Service. According to Brian Madden in his article “Providing Windows Applications to Users: Nine Different Theories and Architectures,” the core purpose of IT is to provide access to applications for end users. Madden also stresses that these applications, whether web- or desktop-based, need to be highly available.

If we take this premise seriously, then it is our responsibility as IT professionals to work toward envisioning and designing solutions that will meet the challenging demands of an ever-changing computing environment.

What are the tenets of such a solution? Foundationally, data would be stored centrally in a secured environment that is also easy to access and manage. Each user in the network would be assigned their own profile, accessible through a global directory and not tied to a specific computer. Users would store their data in the central repository and not on a local machine. Applications and services would be managed through this directory as well, with access predefined according to user roles and group membership.
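The directory-driven model above can be illustrated with a small sketch. The group and application names are made up, and a real deployment would query a directory service such as Active Directory rather than in-memory dictionaries:

```python
# Illustrative model of directory-driven access: applications are
# published to groups, and a user's entitlements are the union of the
# applications published to the groups that user belongs to.

DIRECTORY_GROUPS = {            # hypothetical directory contents
    "finance": {"alice", "bob"},
    "engineering": {"carol"},
    "all-staff": {"alice", "bob", "carol"},
}

PUBLISHED_APPS = {              # app -> groups it is published to
    "ledger-suite": {"finance"},
    "cad-tool": {"engineering"},
    "office-suite": {"all-staff"},
}

def entitled_apps(user):
    """Apps this user may launch, derived purely from group membership."""
    groups = {g for g, members in DIRECTORY_GROUPS.items() if user in members}
    return sorted(app for app, gs in PUBLISHED_APPS.items() if groups & gs)

print(entitled_apps("alice"))  # ['ledger-suite', 'office-suite']
print(entitled_apps("carol"))  # ['cad-tool', 'office-suite']
```

The point of the sketch is that entitlements follow the user, not the machine: logging in anywhere yields the same application set, and deploying an app to a user reduces to a group-membership change.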

In this environment, these “managed desktops” would be user-neutral, providing secure and simple computing services based on user login that are easy to manage and support. Application virtualization would meet standardization, performance and flexibility requirements. A single base image, including the operating system and a number of fundamental applications, is loaded onto all clients. A goal of a managed desktop environment is to create client machines that require minimal maintenance, since locally installed updates and applications are kept to a minimum.

Central servers would deliver virtualized applications and configuration changes to the client. Because applications are delivered from remote servers but processed locally, productivity on client machines would not be affected. In addition, data and assets are stored on servers and accessed remotely, providing users with the latest information while allowing tighter control over data security, as no data is stored locally on a user’s computer. Finally, an integrated toolset provides users with additional functional services they may need on an ad-hoc basis, enabling self-service and reducing dependency on IS service staff in the process.

 

Simplified and Streamlined Support: The core framework

Taken together, the combination of managed desktops, virtualized applications, and solid processes as described above forms what we refer to as the Client Computing Environment (CCE).

As with any operating model, an adequate support and service structure needs to be put in place to allow smooth operations with minimal hiccups. An ideal support framework is built on a baseline of best practices.

The baseline framework that makes up the new CCE includes Group Policies, Active Directory, a standardized desktop, policies and processes, application distribution, and image management (including patch management) functions. Managed desktops acquire virtualized applications from core servers. In addition, users are able to access critical information for self-service support or centrally acquire corporate policies and procedures. All user and configuration data is stored on data servers, ready to be accessed on demand from anywhere within the organization.

From a support point of view, a tiered service model is created. At Tier 0, users can self-diagnose and service their own needs. A set of high-availability, web-based tools handles simple matters such as password resets and other traditional help desk items. In more advanced scenarios, application provisioning also lives at this self-service tier.

A Tier 1 service desk provides personal phone support to users in need of assistance. Ideally operational 24 hours a day, 7 days a week, call agents work with a suite of scripts and guides to provide solutions to desktop users, and use remote tools for diagnosis and resolution. In contrast to traditional Tier 1 Help Desk work in unmanaged environments, Help Desk workers in a managed environment face less complex diagnostic trees for application issues, since applications are virtualized and less dependent on individual client configuration.

 

The Tier 2 and 3 groups are focused primarily on Client Computing Environments. These are highly-skilled agents who are ready to address issues that are beyond the capabilities of Tier 1 support. In addition to being users of remote tools, they also utilize integrated system administration tools to solve complex problems.

 

Finally, and as a last resort, an on-site team is ready to be deployed to locations where physical intervention is required. They function as the eyes and hands of the support team as a whole, physically working on machines within the managed desktop environment. While their main role is to swap and fix hardware, they also use the remote tools available in the core framework.
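The tiered flow described above can be summarized in a toy routing sketch. The issue categories assigned to each tier are illustrative assumptions, not a prescribed catalogue:

```python
# Toy escalation model of the tiered support structure: each issue is
# routed to the lowest tier capable of resolving it, with the on-site
# team as the last resort.

TIERS = [
    ("Tier 0 self-service", {"password_reset", "app_provisioning"}),
    ("Tier 1 service desk", {"login_issue", "app_error"}),
    ("Tier 2/3 CCE specialists", {"profile_corruption", "server_fault"}),
    ("On-site team", {"hardware_failure"}),
]

def route(issue):
    """Return the first (lowest) tier that can handle the issue."""
    for tier_name, handled in TIERS:
        if issue in handled:
            return tier_name
    return "On-site team"  # physical intervention as the fallback

print(route("password_reset"))    # Tier 0 self-service
print(route("hardware_failure"))  # On-site team
```

The design point mirrored here is that common, scripted issues never reach a human agent, which is where the cost reduction in the tiered model comes from.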

 

Benefits of the Optimized Desktop – from the viewpoint of IS

 

“Power and flexibility of a PC, managed like a phone”

“Who I am and the services I need are not tied to a device”

 

These are the vision statements of a future-focused Client Computing Environment, one where Application Virtualization and Software-as-a-Service concepts come together to deliver significant advantages for IS as an organization.

 

Agility  Rapid deployment of any application is now possible, as applications are updated and supported centrally. This also helps ensure optimal accessibility to applications and data. All of this contributes to a flexible working environment poised for agility.
Intelligently managed  Tasks such as patch management, application management, diagnosis and repair, and asset management are implemented and controlled centrally. This is an intelligent and productive approach to services management.
Locked Down  With an optimized desktop, IT teams gain better control of user access, enhanced interoperability, easier data recovery and protection, and easier identification of threats to the network.
Cost reduction  Most importantly, optimized desktops help reduce helpdesk costs by reducing variability. Assets are managed in real time without the need for tedious physical checks. In the end, productivity is maximized through greater efficiency in operations and processes.

 

Summary

Changes in our computing environment, driven by demanding business needs, force a constant evolution in desktop management. As a result, we are always in need of better functional processes coupled with technological advancements that improve productivity and save costs at the same time. The optimized desktop meets these needs through technology that is easy to roll out and implement.

In addition, application virtualization eliminates the need for local installations. At the same time, conflicts and regression testing needs are also eliminated. In terms of licensing, greater control and auditability are achieved. One of the greatest benefits for organizations relying on legacy applications is that multiple versions of the same application can be run simultaneously on the same client computer.

Central management and control facilitate continuity in the event of desktop breakdown. Standardization allows desktops to be replaced easily and users to recover quickly, as everything is centrally available.

Finally, from a macro viewpoint, the corporation acquires IT agility as it becomes more responsive to business demands. Elements such as self-provisioning and self-recovery, as well as applications that are available anywhere the user goes provide a boost to productivity within the organization.
