What is Application Virtualization?
Application virtualization, popularly known as application server virtualization, is a subset of virtualization. It is layered on top of other virtualization technologies, such as storage virtualization or machine virtualization, so that computing resources can be distributed dynamically in real time. A virtualized application behaves as if it were running from the local hard disk, when in reality it runs on a virtual machine (such as a server) in another location, using that machine's own operating system (OS), and is merely accessed from the local machine. Running applications virtually can sidestep incompatibilities with the local machine's OS, and can even contain the effects of bugs or poor-quality code in the application itself.
Modern operating systems such as Microsoft Windows and Linux include limited application virtualization; full application virtualization requires a dedicated virtualization layer. This layer replaces part of the runtime environment normally provided by the operating system: it intercepts all disk operations of virtualized applications and transparently redirects them to a virtualized location, often a single file. The application remains unaware that it is accessing a virtual resource instead of a physical one. Because the application now works with one file instead of many files spread throughout the system, it becomes easier to run the application on a different computer, and previously incompatible applications can run side by side.
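The redirection idea described above can be illustrated with a small sketch. The class below is a hypothetical, greatly simplified stand-in for a virtualization layer, not any real product's API: the application opens files by their ordinary paths, but every read and write is transparently redirected into a single container file on disk (here a JSON file, purely for illustration).

```python
import io
import json
import os


class VirtualFileStore:
    """Toy virtualization layer: all of an application's file I/O is
    redirected into ONE container file on disk, mimicking how an
    application virtualization layer intercepts disk operations.
    Hypothetical sketch, not a real product's interface."""

    def __init__(self, container_path):
        self.container_path = container_path
        # Load the container if it already exists; otherwise start empty.
        if os.path.exists(container_path):
            with open(container_path) as f:
                self._files = json.load(f)
        else:
            self._files = {}

    def open(self, virtual_path, mode="r"):
        """Stand-in for the intercepted open() call: the virtual path
        is mapped to an entry inside the single container file."""
        if "w" in mode:
            store = self

            class _Writer(io.StringIO):
                # On close, persist the written data into the container.
                def close(inner):
                    store._files[virtual_path] = inner.getvalue()
                    store._flush()
                    super().close()

            return _Writer()
        # Reads are served from the container, not the real filesystem.
        return io.StringIO(self._files[virtual_path])

    def _flush(self):
        with open(self.container_path, "w") as f:
            json.dump(self._files, f)


# The application believes it is writing to an ordinary path;
# in fact everything lands inside one container file.
vfs = VirtualFileStore("app_container.json")
with vfs.open("C:/Program Files/App/config.ini", "w") as f:
    f.write("setting=1")
with vfs.open("C:/Program Files/App/config.ini") as f:
    print(f.read())  # prints: setting=1
```

Because the entire virtual filesystem lives in `app_container.json`, copying that one file to another computer moves the application's state with it, which is the property the paragraph above attributes to real virtualization layers.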