
I have two data center sites, one in Indianapolis and the other in Olympia, with a Dell Data Domain at each site as our backup storage. I am currently moving our Cloud Connect storage from an old HYDRAstor to the Dell Data Domain. With the HYDRAstor I was able to seed backup copy data from a portable NAS directly onto the Cloud Connect repository, since it was presented as a CIFS share. The Data Domain uses something called DD Boost, and I'm not sure how to run a copy job to import that initial backup copy data onto it. Has anyone else been able to successfully seed backup data onto a Data Domain?

I need to…

  1. run the initial backup copy to a local/portable NAS on the customer’s site
  2. disable the job
  3. ship the portable NAS to our data center
  4. copy the backup files onto the Data Domain configured with DD Boost
  5. update the backup copy job to use the cloud repository
  6. enable the job to run
  7. confirm the job is running successfully and call it done
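However the step 4 copy ends up working, I plan to verify it with a checksum manifest captured before the NAS ships. Something along these lines (all paths below are placeholders for our environment):

```shell
# On the portable NAS, before it ships: record a checksum manifest of the
# seeded backup files (the path is a placeholder for our environment).
cd /mnt/portable-nas/seed
find . -type f -print0 | xargs -0 md5sum | sort -k 2 > /tmp/seed.md5

# At the data center, after the copy lands on the Data Domain:
# check every copied file against the manifest.
cd /mnt/datadomain/seed
md5sum -c /tmp/seed.md5
```

That way a shipping mishap or an incomplete copy shows up before I re-enable the job.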

Step 4 is where I am stuck. I have a support case open with Dell and am waiting to hear back, but I figured I'd ask here to see if anyone else has run into this and can share some insight.

I will update this topic with the results of my Dell Support case.

Hi,
please have a look at the Dell Data Domain section of the Veeam Help Center:

https://helpcenter.veeam.com/docs/backup/vsphere/emc_dd.html?ver=110


I have not used Data Domain for… at least 8 years. If I remember correctly, you have to define a file share via the GUI to access a volume over CIFS...


@JMeixner Thank you for your response! I read that article before posting here, and unfortunately it doesn't cover how to "seed" the backup files onto the Data Domain (we are using BoostFS, the DD Boost file system). This storage presents repositories to clients via our Cloud Connect service. I need to run the initial backup copy to a local/portable NAS at the customer's site, disable the job, ship the portable NAS to our data center, copy the backup files onto the Data Domain, update the job to use the cloud repository, and then re-enable the job. I have several customers with backup jobs well over 12 TB, and pushing that much data over the internet is not feasible.

I will update the topic with the work plan I am attempting to use for more context.


I’m currently reading this Dell article https://www.dell.com/support/kbdoc/en-ph/000019159/expedited-configuration-steps-and-details-for-datadomain-boost-fs while I wait for Dell Support to respond. 
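If I'm reading it right, the gist is that BoostFS lets a Linux client mount a DD Boost storage unit like a regular file system, which would give me a copy target for the seed data. A rough sketch of what that looks like (the host, storage-unit, and user names below are made up, and the exact syntax will depend on the DD OS and BoostFS client versions):

```shell
# On the Data Domain (DD OS CLI), assuming DD Boost is already enabled:
# create a storage unit owned by the DD Boost user -- names are examples.
ddboost storage-unit create cloudconnect-seed user ddboostuser

# On a Linux helper host with the BoostFS client package installed:
# mount the storage unit, then copy the seed data onto it.
mkdir -p /mnt/dd-seed
boostfs mount -d datadomain01.example.com -s cloudconnect-seed /mnt/dd-seed
rsync -avh --progress /mnt/portable-nas/seed/ /mnt/dd-seed/
```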


I have worked with Data Domain many times; we use them at the MSP I work for in conjunction with Cloud Connect. When you set up DD Boost on the Data Domain, you have the option to create a CIFS share or NFS export of the DD Boost storage unit. You can copy the files to that, and once they are on the DD, add it to Cloud Connect (or rescan the repository if it is already there) to import the backups. You then direct the tenant to that repository for backups to the site.
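Roughly, the flow looks like this on the DD side (the storage-unit, client, and host names are examples, and the exact commands can vary by DD OS version, so check them against your release):

```shell
# On the Data Domain (DD OS CLI): expose the DD Boost storage unit that
# backs the Cloud Connect repository as an NFS export (names are examples).
nfs add /data/col1/cloudconnect-seed 10.0.0.50 (rw,no_root_squash)
nfs enable

# On a Linux host at the data center: mount the export and copy the
# seeded files from the portable NAS onto the Data Domain.
mkdir -p /mnt/dd-seed
mount -t nfs datadomain01.example.com:/data/col1/cloudconnect-seed /mnt/dd-seed
rsync -avh --progress /mnt/portable-nas/seed/ /mnt/dd-seed/
```

After the copy finishes, rescan the repository in the Veeam console so the imported backups are recognized, then map the tenant to it.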



That’s what I meant 😎 I just could not remember where to do it…

