Top 5 reasons why backup in Windows Server 2016 is complex

June 28, 2016



With the new Windows Server 2016, your backup needs may be significantly more complex than simply finding a compatible backup agent. Here are the top reasons why:

  1. Introduction of Nano Servers

With Windows Server 2016, Microsoft is taking the minimal-footprint idea it started with Server Core in Windows Server 2008 a step further with Nano Server. Nano Server is a bare-bones installation option that includes just enough of the operating system to run the 2016 version of Windows Server, and its footprint is far smaller than even a Server Core deployment. Imagine a server without any form of local user interface: a Nano Server has no local logon and no local PowerShell console, so everything has to be managed remotely.

  2. Use-case limitations

Owing to its small size and stripped-down deployment, Nano Server supports an extremely limited set of workloads. Microsoft recommends it only for hosting Hyper-V, running born-in-the-cloud applications, and acting as a scale-out file server.

Backing up these deployments will not be impossible, but it will definitely require extensive planning. In most cases Nano Servers are expected to run as VMs, and in that scenario a virtualization-aware backup application compatible with Windows Server 2016 should be enough to back up all necessary data on a regular schedule, as sketched below.
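To make the host-level option concrete, here is a minimal Python sketch of that kind of scheduled job: it simply exports a set of Hyper-V VMs with the built-in Export-VM cmdlet into a dated folder. The VM names and backup share are hypothetical placeholders, and a real virtualization-aware backup product would use changed-block tracking and VSS rather than full exports.

```python
import subprocess
from datetime import datetime
from pathlib import Path

# Placeholder values: adjust to your environment.
VM_NAMES = ["nano-web-01", "nano-web-02"]    # hypothetical Nano Server VMs
BACKUP_ROOT = Path(r"\\backup-srv\hyperv")   # hypothetical backup share


def export_vm(vm_name: str, destination: Path) -> None:
    """Export a single Hyper-V VM using the built-in Export-VM cmdlet."""
    cmd = [
        "powershell.exe", "-NoProfile", "-Command",
        f"Export-VM -Name '{vm_name}' -Path '{destination}'",
    ]
    subprocess.run(cmd, check=True)


def main() -> None:
    # One dated folder per run, e.g. \\backup-srv\hyperv\2016-06-28
    run_dir = BACKUP_ROOT / datetime.now().strftime("%Y-%m-%d")
    run_dir.mkdir(parents=True, exist_ok=True)
    for name in VM_NAMES:
        export_vm(name, run_dir)


if __name__ == "__main__":
    main()
```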

For guest-level backups, a suitably compatible backup agent will have to be incorporated into the Nano Server deployment image itself (see the sketch after the note below).

P.S. – always remember that Nano Servers can only be installed from deployment images.
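As a rough sketch of how an agent might be baked into a Nano Server image, the Python wrapper below drives the New-NanoServerImage cmdlet from the NanoServerImageGenerator module that ships on the Windows Server 2016 media. The media, working, and agent paths are hypothetical, and the use of -CopyPath to inject the agent files is an assumption to verify against the cmdlet documentation for your build.

```python
import subprocess

# Illustrative paths only: point these at your own media and agent files.
MEDIA_PATH = "D:\\"                             # mounted WS2016 ISO
BASE_PATH = r"C:\NanoBuild"                     # working folder for the image build
TARGET_PATH = r"C:\NanoBuild\nano-backup.vhdx"  # resulting Nano Server image
AGENT_DIR = r"C:\Agents\BackupAgent"            # hypothetical backup agent payload

# Assumption: the NanoServerImageGenerator module (from the NanoServer folder on
# the WS2016 media) is available, and New-NanoServerImage supports -CopyPath for
# injecting extra files into the image.
ps_script = f"""
Import-Module {MEDIA_PATH}NanoServer\\NanoServerImageGenerator -Verbose
New-NanoServerImage -MediaPath '{MEDIA_PATH}' -BasePath '{BASE_PATH}' `
    -TargetPath '{TARGET_PATH}' -DeploymentType Guest -Edition Standard `
    -ComputerName 'nano-backup-01' -CopyPath '{AGENT_DIR}'
"""

subprocess.run(["powershell.exe", "-NoProfile", "-Command", ps_script], check=True)
```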

  3. The introduction of Windows containers

Containers have not previously been available on Windows. Windows Server 2016 introduces Windows containers, which are managed through the same Docker tooling that has now been made to work with the new versions of Windows. Since this is fairly new technology, most backup vendors are not expected to have a good solution that handles container backup and its related problems. This is an early-adopter problem that will be resolved with time, but those who make the initial switch to Windows Server 2016 will face issues during server backup.

Organizations that have been using Docker for a while generally run it on Linux servers. They typically use scheduled scripts to export their Docker containers as .tar archives, and once created, these archives can be backed up without any hassle, just like any other files.
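The sketch below shows that pattern in Python, with a hypothetical backup path: each running container is exported to a .tar archive with docker export, and the resulting archives can then be swept up by any ordinary file-level backup job. Note that docker export captures the container's filesystem but not its named volumes, which have to be handled separately.

```python
import subprocess
from datetime import datetime
from pathlib import Path

BACKUP_DIR = Path("/backups/docker")   # hypothetical backup target


def running_containers():
    """Return the names of all running containers."""
    out = subprocess.run(
        ["docker", "ps", "--format", "{{.Names}}"],
        check=True, capture_output=True, text=True,
    )
    return [name for name in out.stdout.splitlines() if name]


def export_container(name, dest_dir):
    """Export a container's filesystem to a .tar archive (volumes excluded)."""
    archive = dest_dir / f"{name}-{datetime.now():%Y%m%d}.tar"
    subprocess.run(["docker", "export", "-o", str(archive), name], check=True)
    return archive


def main():
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    for name in running_containers():
        export_container(name, BACKUP_DIR)


if __name__ == "__main__":
    main()
```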

  4. Windows as a Service architecture

Microsoft has been working for quite a while to offer Windows as a Service, and with Windows Server 2016 that model is about to reach the server side. Looking back, Microsoft has a long and uninterrupted history of shipping regular updates for the Windows desktop and Windows Server so that the two complement each other, and Windows as a Service extends this into a steady stream of features, updates and editions delivered to all Windows users over time. If Windows Server adopts this servicing model, backup operators will need tooling that regularly tests their agents against each new build. This is primarily necessary for an auto-updating OS that adds and changes components intermittently, as illustrated below.
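As a simple illustration of that kind of guardrail (the validated-build list and the check itself are hypothetical), the sketch below compares the host's current Windows build number against the builds a backup agent has been tested on and refuses to proceed on an unvalidated build.

```python
import platform
import sys

# Hypothetical list of OS builds the backup agent has been validated against.
VALIDATED_BUILDS = {"14393"}   # e.g. Windows Server 2016 RTM


def current_build():
    """Return the Windows build number, e.g. '14393' from '10.0.14393'."""
    return platform.version().split(".")[-1]


def main():
    build = current_build()
    if build not in VALIDATED_BUILDS:
        print(f"WARNING: build {build} has not been validated with this backup agent.")
        sys.exit(1)
    print(f"Build {build} is validated; proceeding with backup.")


if __name__ == "__main__":
    main()
```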

  5. Use of new storage models

Storage Spaces Direct is Microsoft's new scalable storage option built on servers with local drives. It is a fairly new concept, optimized for Windows Server 2016, so it will take some time and understanding to use it to its full potential. It also brings with it a new generation of disk devices, including SATA SSDs and NVMe drives. Most organizations do not yet have the infrastructure or familiarity to support Storage Spaces Direct.

There are other considerations around adopting the new storage model as well. The new SSD technology lets organizations and service providers build on simple, industry-standard servers with local storage, eliminating much of the usual complexity, and it removes the need for any kind of shared SAS fabric. That, however, is further unfamiliar territory for most organizations that have been relying on shared storage, and their backup tooling has to keep pace with it.

Conclusion

The new backup agents require much more than just the ability to restore lost data. They should be able to manage snapshots, DR elements, VM protection and cloud support in the event of server or system failure. However, the use of Nano Servers and Docker containers may pose serious challenges for the backup process. In addition, a number of other factors need to be considered, including the server load, the size of the organization and the specific business requirements of the server in use. Hence, when it comes to server backup agents, one size never fits all. Those who are switching to the 2016 version of Windows Server may have to wait some time before finding the right backup agent.

Author: Rahul Sharma

By Team FileCloud