Migrating DPM data from one storage to another

Recently I’ve been involved in a project helping a customer set up DPM 2012 R2 to back up a VMware environment. Yes, you heard it correctly: DPM 2012 R2 with UR11 supports VMware backup. You can read more about it here. In our initial pilot stage we used DAS storage on the DPM server itself for test backups. Once we verified that local backup and Azure backup (replicating the local backup copy to Azure) were successful, we wanted to bring in SAN storage for the DPM server. My only challenge was how to move the existing pilot backups to the new storage introduced on the DPM server, since we had been backing up production workloads and I didn’t want to redo that job. Before that, let’s look at my current protection group setup:

image

As you can see, it’s a simple PG (Protection Group) protecting two SAP VMs. Now let’s look at the disk structure from the Disk Management perspective. There are two DAS disks being utilized for the data backup; at the same time, you can see I have introduced three disks connected via SAN to the DPM server.

image

Another view, from the DPM point of view:

image

The challenge is to migrate the data from Disk 1 and Disk 2 to Disk 3 without modifying the protection group settings. For this you can use the DPM PowerShell script MigrateDatasourceDataFromDPM.ps1. But first, let’s identify the disk structure from the PowerShell console:

Run Get-DPMDisk -DPMServerName <DPM Server Name> to display the disks.

image

As you can see in the picture above, Disk 1 and Disk 2 are occupied holding the data. The trick is to identify the correct disk number and not to be misled by the NtDiskId. Once identified, you can use the following command with parameters to transfer the data:
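To make the two numbering schemes easy to tell apart, you can list the disks together with the array index DPM assigns them. This is a sketch; the server name "DPMSERVER01" is a placeholder, and the exact property values will differ in your environment:

```powershell
# Connect to the DPM server and capture its storage-pool disks.
$disk = Get-DPMDisk -DPMServerName "DPMSERVER01"

# Print each disk's array index next to its Windows NtDiskId,
# so the index passed to the migration script is unambiguous.
$i = 0
foreach ($d in $disk) {
    "{0}: Name={1}  NtDiskId={2}" -f $i, $d.Name, $d.NtDiskId
    $i++
}
```

The index shown on the left (0, 1, 2, …) is what you will use with `$disk[n]` below, regardless of what the NtDiskId says.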

./MigrateDatasourceDataFromDPM.ps1 -DPMServerName <DPM Server Name> -Source $disk[n] -Destination $disk[m]

Here [n] and [m] have to be replaced with the exact disk numbers of the source and destination disks. Once you execute the command, DPM will start migrating data from the existing disk to the targeted disk. This may take some time depending on the amount of data on the disk.
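In my setup the whole run looked roughly like the sketch below. The server name, install path, and array indices (Disk 1 and Disk 2 landing at index 0 and 1, the new SAN disk at index 2) are assumptions from my environment; adjust them to match the output of Get-DPMDisk on yours:

```powershell
# The migration script ships in the DPM bin folder
# (the path may differ on your install).
cd "C:\Program Files\Microsoft System Center 2012 R2\DPM\DPM\bin"

# Capture the storage-pool disks; array indices here are what
# the script expects, not the NtDiskIds from Disk Management.
$disk = Get-DPMDisk -DPMServerName "DPMSERVER01"

# Move the data from Disk 1 and Disk 2 onto the new SAN disk.
./MigrateDatasourceDataFromDPM.ps1 -DPMServerName "DPMSERVER01" -Source $disk[0] -Destination $disk[2]
./MigrateDatasourceDataFromDPM.ps1 -DPMServerName "DPMSERVER01" -Source $disk[1] -Destination $disk[2]
```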

image

Now you’ll notice in Disk Management that the DPM replica and recovery point volumes which were located on Disk 1 and Disk 2 have been migrated to Disk 3. Any new recovery points for the respective data sources will now be located on the new volumes on the new disk; the original volumes on Disk 1 and Disk 2 still need to be maintained until the recovery points on them expire. Once all recovery points on the old disk(s) expire, they will appear as entirely unallocated free space in Disk Management. After that we can safely remove them from the DPM storage pool.
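Once the old disks show as fully unallocated, they can be detached from the storage pool from the console, or with a couple of lines of PowerShell. This is a sketch assuming Disk 1 and Disk 2 are still at indices 0 and 1, and "DPMSERVER01" stands in for your server name:

```powershell
# Re-enumerate the storage-pool disks, then remove the retired
# DAS disks from the DPM storage pool. Only do this after every
# recovery point on them has expired.
$disk = Get-DPMDisk -DPMServerName "DPMSERVER01"
Remove-DPMDisk -DPMDisk $disk[0]
Remove-DPMDisk -DPMDisk $disk[1]
```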

Note: Once this task completes you may get replica-inconsistent error messages. This is normal and expected, as changes have been made to the volumes; they will need to be re-synchronized by running a synchronization job with a consistency check.
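Rather than waiting for the next scheduled synchronization, you can kick off the consistency check yourself. A sketch, assuming a single protection group on a server called "DPMSERVER01":

```powershell
# Grab the protection group and its data sources, then start a
# consistency check on each one to clear the inconsistent state.
$pg = Get-DPMProtectionGroup -DPMServerName "DPMSERVER01"
$ds = Get-DPMDatasource -ProtectionGroup $pg[0]
foreach ($d in $ds) {
    Start-DPMDatasourceConsistencyCheck -Datasource $d
}
```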

image

In the next article, let me explain how we can use Azure Import/Export for the Azure Backup workload.

PS: If you don’t want to play around with PowerShell that much and are comfortable with a GUI method, then you’re in luck. Refer to this link, where an MVP has written a script that does this job through a GUI.


System Center Universe #APAC

SCUBanner

If you’re an organization that is using System Center, or planning to use System Center, cloud, and virtualization technology, then this is an event you shouldn’t miss. All the industry gurus and experts will be in one location to share their experience with you. What’s more, you’ll get a chance to meet them in person, share your ideas, and get expert advice free of charge.

System Center Universe is a global event carried out in various parts of the world. The APAC region event will be held on March 5-6. I look forward to seeing most of you there.

Setting up a Highly-Available VMM 2012 R2 Environment

With the introduction of the new Azure Site Recovery features and Windows Azure Pack solutions, the role of VMM is becoming highly important. It is very clear that VMM needs to be in HA (Highly Available) mode. The VMM engineering team has taken the liberty of sharing the details of how to set that up at a detailed level. I believe this is the right time for customers to adopt this, considering the level of cloud adoption they plan to leverage in their organizations.

You can find the step-by-step guide here.

clip_image002

Reported problems after installing Update Rollup 2 for DPM 2012 R2

Soon after the release of Update Rollup 2 for the System Center 2012 R2 products, numerous threads started in the technical forums about issues with DPM 2012 R2. Most of them are related to tape backup and recovery.

The thread below discusses one such issue scenario:

http://social.technet.microsoft.com/Forums/en-US/e9f4801e-6a6f-440f-ad05-65758007db69/dpm-2012-r2-ru2-kb2958100-dpm-accessmanager-service-crash?forum=dataprotectionmanager

The good news is that the Microsoft DPM team acknowledged the issue and quickly withdrew the update from the web site:

http://support.microsoft.com/kb/2958100

Recovering from the issue is quite complex and needs Microsoft support in certain cases. This also reminds us of the rule of thumb of not applying the latest updates to a production environment without proper study. It’s always ideal to test them in a staging environment first.

Software-Defined Networking with Windows Server and System Center Jump Start

This is another invaluable session offered via Microsoft Virtual Academy on the coming 19th of March. I know most IT pros have questions about SDN, plus some VMware folks want to know how we do that in the Microsoft world 🙂 Well, this is it then!

“Is your infrastructure outgrowing your current networking strategy? Want to simplify the process for managing your datacenter? Software-defined networking (SDN) can streamline datacenter implementation through self-service provisioning, take the complexity out of network management, and help increase security with fully isolated environments. Intrigued? Bring specific questions, and get answers from the team who built this popular solution!”

image

The most important part of this session is the Q&A. So I would suggest you do your homework and come up with all the handy questions you have to shoot at these gentlemen, plus the MVP group sitting invisibly to answer those questions.

Note: Click on the above picture to register for this event.

Virtualizing Your Data Center with Hyper-V and System Center: Free online event #MVA

MVA Virtualizing your Datacenter

Virtualizing Your Data Center with Hyper-V and System Center

Free online event with live Q&A: http://aka.ms/virtDC

Wednesday, February 19th from 9am – 5pm PST

If you’re new to virtualization, or if you have some experience and want to see the latest R2 features of Windows Server 2012 Hyper-V or Virtual Machine Manager, join us for a day of free online training with live Q&A to get all your questions answered. Learn how to build your infrastructure from the ground up on the Microsoft stack, using System Center to provide powerful management capabilities. Microsoft virtualization experts Symon Perriman and Matt McSpirit (who are also VMware Certified Professionals) demonstrate how you can help your business consolidate workloads and improve server utilization, while reducing costs. Learn the differences between the platforms, and explore how System Center can be used to manage a multi-hypervisor environment, looking at VMware vSphere 5.5 management, monitoring, automation, and migration. Even if you cannot attend the live event, register today anyway and you will get an email once we release the videos for on-demand replay!

Topics include

• Introduction to Microsoft Virtualization

• Host Configuration

• Virtual Machine Clustering and Resiliency

• Virtual Machine Configuration

• Virtual Machine Mobility

• Virtual Machine Replication and Protection

• Network Virtualization

• Virtual Machine and Service Templates

• Private Clouds and User Roles

• System Center 2012 R2 Data Center

• Virtualization with the Hybrid Cloud

• VMware Management, Integration, and Migration

Register here: http://aka.ms/virtDC

PowerShell Deployment Kit (PDT) for System Center Demos

Have you ever wondered how Microsoft engineers set up their demo labs? Well, now we have been given the opportunity to access their secrets as well 🙂 Microsoft’s Rob Willis has written a hydration kit to automate the entire System Center component setup for a demo lab environment. This is really useful when you want to set up a demo lab with minimal time investment. You can access the PDT toolkit here.

As mentioned above, this is ideal for a demo environment, since some of the setup concepts are not really recommended for a production environment at all. So take note NOT to use this toolkit for production deployments. As per my information gathering, you’ll need a minimum of 16 GB RAM, but 32 GB is recommended. To get the best performance you’ll need to deploy this on SSD disks. You can deploy this on a high-end laptop computer as well.

The only downside is the time it will take to download all the required components. Depending on your internet connection speed, this may take more than 10 hours.