Recommended Reading

I’ve said it before, but it bears repeating: I’m not very smart. Part of being good at I.T. support is not knowing all the answers, but knowing where to quickly find them. When I first started in I.T. I worked as a desktop analyst supporting Wintel systems, and a quick Google search usually turned up the solution. Moving into backup/recovery/storage, the problems got harder, had more serious consequences, and answers were more difficult to come by. So in addition to reading manuals and training, these three books have been good resources for me. There are many more books on backup and recovery out there, but I generally don’t enjoy reading these things in my spare time, so these are the only three on my shelf (one of which I haven’t read yet). These books followed the evolution of my career, culminating in the final one, which I plan to leverage to influence decision makers for my client.


SANs and NAS

This book was really helpful when I was first getting my feet wet in the backup and storage world.


Backup and Recovery

A great resource to have around. I found this book helpful as a reference for a presentation I was giving to a project manager on backup technologies. The first chapter maps out the evolution of backup and recovery in the enterprise.

Enterprise Backup and Recovery: A Corporate Insurance Policy

I haven’t read this yet, but it looks very promising. It appears to be more about policies and best practices for management and architecture. Preston presents a backup system as not just a collection of servers, software, and targets, but also as policy and ownership within the corporate structure. I may crack it open this week while in Vegas for EMC World. Hopefully it will make me sound smarter.




EMC buys XtremIO

So EMC announced today that they have purchased XtremIO, an Israel-based developer of SSD-based storage solutions, for $430 million in cash. My first question is: why? They already have an all-flash version of their VMAX line.
So what kind of secret sauce does XtremIO have that EMC found compelling? Looking at http://www.xtremio.com/solutions/ there appears to be some deduplication technology used to consolidate data. Interesting, but is it unique? I’m looking forward to hearing more at EMC World in Vegas. SSD is best used to enhance application delivery, but I’m wondering if we will ever see SSD incorporated into backup and recovery architectures. Avamar is already using SSD to enhance performance. The question may be moot, as the backup/recovery and storage architecture worlds are increasingly becoming blurred. One thing is for sure: they make snazzy-looking boxes. Look at that thing! It’s sexy.



EMC launches new support site

EMC is a massive company, and it requires a website like Powerlink to support its employees, customers, and partners. Powerlink’s scope is so wide that navigating it can be difficult at times.
So EMC launched support.emc.com. The interface is clean and a real improvement over Powerlink. It appears EMC wanted to create an online experience where customers can easily find support documentation, get advice from a forum, or open a ticket if required. There is tighter integration with ECN, which is nice, and an emphasis on creating an online community. Logging in was easy; I just used my Powerlink ID and password, and I could see my account was still linked to my support site and products. The tools section has access to all the grab utilities and procedure generators, as well as a few other interesting-looking tools I wasn’t aware of. The EMC Corp Twitter account emphasizes the social aspect of the site, but there did not appear to be any options to link to my other social networks. Navigating from my iPad was nice, but I would love to see an app. Partners will still need Powerlink to access some exclusive content and tools found there. Check out some of the screenshots below.





Selecting a backup as a service (BaaS) partner

It was just a couple of years ago that my company asked me to conduct a proof-of-concept exercise for a particular backup-as-a-service offering. The talk of the cloud had become louder and could no longer be ignored. Management was hoping to find a solution that could create some annuity-based income and would appeal to the SMB market.

Backup technologies can get expensive very fast. A conventional backup solution will require servers, tape or disk targets, and software. If tapes are used, man-hours need to be budgeted to manage them, and an offsite vault would need to be engaged. If a disk solution is used, a colocation facility would be required for DR with replication. It adds up fast, and IT budgets rarely take the importance of backups and DR into account in planning. That’s why BaaS is a great option for SMBs: for a monthly fee, all your backups can be taken care of. Sounds great, right?

It definitely has gotten better. Just 2-3 years ago there were really not a lot of players in the market. There were home consumer products that offered plans for business, but not the intelligence required to properly protect databases and applications. At best, these products, as well as the service I tested, provided a crash-consistent backup. That is, the robustness of the application is relied on to protect itself. If you yanked the power cable out of the wall on your Exchange server, would you be concerned about whether the database would be mountable at startup? You should be. Most likely you’ll be OK, but you are not 100% protected. Would you want to pay somebody a monthly fee to maybe protect your applications or databases?

Now, just a few years later, there are many options out there that claim to provide application-intelligent backups for this data. I say claim because I have not had the opportunity to test any myself since the POC I did a few years ago. So with so many options out there, how do you choose a BaaS provider? Here are some things to consider.


Where will your data be?

It’s important to know where your data will actually be sitting. Is it in the data center of a known, trusted provider, or in the CEO’s basement? A site inspection should be included as part of the due diligence, which brings me to my next point.


Consider where your data will be geographically.

Will your data be crossing any physical borders into another country? If so, you may obviously want to avoid a company that ships your data to geopolitically unstable nations (not that I know of any that do). Most likely your data will be crossing the US/Canada border; your company may or may not have concerns about that, but it should still be taken into account.


It’s good to ask a lot of questions about the technology and ensure you have a solid understanding of how it works. A good BaaS solution should use data deduplication, and it should be done on the client side; this is required to reduce the amount of data that has to be moved. Compression should also factor into the solution, and encryption if required. Is any of your company’s data sitting encrypted on disk? How well will that dedupe? A trial period should be engaged, and backups and restores should be tested to ensure the solution can meet your expectations.
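To make the client-side deduplication point concrete, here is a minimal sketch (not any vendor’s actual implementation, and real products typically use variable-size chunking rather than the fixed chunks shown): the client fingerprints each chunk and only transmits chunks the provider has never seen.

```python
import hashlib
import os

CHUNK_SIZE = 64 * 1024  # fixed-size chunks for simplicity

def dedupe_send(data: bytes, seen_hashes: set) -> tuple:
    """Return (total_bytes, bytes_sent): only chunks whose SHA-256
    the provider has never stored need to cross the wire."""
    sent = 0
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in seen_hashes:   # new chunk: transfer it
            seen_hashes.add(digest)
            sent += len(chunk)
        # known chunk: only the tiny hash reference goes over the wire
    return len(data), sent

seen = set()
day1 = os.urandom(1024 * 1024)                      # first backup: 1 MiB, all new
total1, sent1 = dedupe_send(day1, seen)
day2 = day1[:-CHUNK_SIZE] + os.urandom(CHUNK_SIZE)  # next day: one chunk changed
total2, sent2 = dedupe_send(day2, seen)
```

The second run moves only the single changed chunk, which is why a nightly backup of a mostly-static data set can fit through a modest WAN link.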



Most likely the service will leverage some kind of compression or deduplication technology to limit the amount of data that has to be moved. The question is: can they estimate the length of time required to complete the initial level 0 backup? Can you simultaneously run your existing backup solution during this level 0? Depending on the amount of data, it could take weeks to months to complete a level 0 of all your data. Also consider when your billing will start: on the day the first level 0 begins, or when it completes? A good strategy would be to break the company’s data profile into chunks, use known metrics to estimate the time required to complete a level 0 of each chunk, and incrementally bill from there.
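A back-of-the-envelope calculation helps frame that conversation. The sketch below is purely illustrative: the sizes, link speed, and reduction ratio are assumptions, and real links are never fully dedicated to backup.

```python
def level0_days(data_gb: float, wan_mbps: float, reduction: float = 1.0) -> float:
    """Rough days needed to push a level 0 of data_gb over a wan_mbps uplink.
    reduction = effective dedupe/compression factor (2.0 means half the
    bytes cross the wire). Assumes the link is saturated 24x7."""
    bits_to_move = data_gb * 8 * 1000 ** 3 / reduction
    return bits_to_move / (wan_mbps * 1000 ** 2) / 86400

# e.g. a 5 TB data profile on a 50 Mbps uplink with 2:1 reduction:
days = level0_days(5000, 50, reduction=2.0)   # roughly 4.6 days, best case
```

Run per chunk of the data profile, numbers like these give you both a schedule for the level 0 and a sanity check on any estimate the provider gives you.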


Server/network bandwidth cost?

Another question: are there any throttling options? It’s important to understand the impact the backup will have on network and server resources, as the initial level 0 backup may take weeks to complete, which brings us to our next consideration.


Are there any seeding or shipping options?

Quick recovery of an employee’s spreadsheet over the WAN isn’t an issue, but what about a larger dataset? Some companies provide a service where a backup is completed to a local portable NAS and then shipped to the service provider to seed the initial level 0 backup, in hopes of completing it faster. This option would be required if your company has more than 5 TB of data. Conversely, what if you needed to restore all 5 TB of data in the event of a disaster? Recovery over the WAN would be inefficient to say the least, so could they restore and ship the data on disk? This is an important consideration and could mean the difference between a fast recovery and going out of business in the event of a disaster.
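The same arithmetic shows why shipping disks wins for large restores. The courier and copy times below are illustrative assumptions, not quotes from any provider.

```python
def wan_days(data_gb: float, mbps: float) -> float:
    """Days to move data_gb over an mbps link, link fully dedicated."""
    return data_gb * 8 * 1000 ** 3 / (mbps * 1000 ** 2) / 86400

def shipped_mbps(data_gb: float, hours: float) -> float:
    """Effective throughput of couriering a disk that arrives and is
    copied to local storage within `hours`."""
    return data_gb * 8 * 1000 ** 3 / (hours * 3600) / 1000 ** 2

restore_wan = wan_days(5000, 100)      # full 5 TB restore at 100 Mbps: ~4.6 days
restore_ship = shipped_mbps(5000, 36)  # assumed 24 h courier + 12 h copy: ~300 Mbps effective
```

Even with a generous WAN link, a box of disks on an overnight truck is several times faster for a full-site restore, which is exactly when you can least afford to wait.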



Google Drive


I first heard the rumors about Google Drive just yesterday. Cut me some slack; I’m still rebooting from vacation. I’ve been a Dropbox user for a year and I love it. The rumor of a free 5 GB basic plan sounded great, but I was even more impressed with 25 GB for only $2.50 a month. So I took a look at it today, and here are some quick observations on the differences from Dropbox and how this will fit into the Google toolbox.

I went to Google Drive on my iPad and was surprised to see my Google Docs already there. The revision history and collaborative potential look great. I could not do much else on my iOS device; a real iOS app is coming soon. Expect some Google iOS app updates to feature added integration with G Drive. I would love to see a G Drive option in GoodReader! Companies already invested in Google Docs and Gmail may be tempted to venture further into Google Plus to engage an online collaborative work environment in the cloud.

When I got to a PC I downloaded the Google Drive app. Like Dropbox, an icon is created in the system tray and you define the local storage location for your Google Docs sync, which it completes very quickly, or appears to. What is actually being stored is a link to the actual document in the Google cloud. Nice, but this does not provide any offline functionality. There is an offline option, but it requires a settings change as well as Google Chrome and an additional Chrome plugin. Google is of course promoting its search as a key differentiator, with the ability to search for text in docs as well as images. This is a nice feature that I have leveraged Evernote for in the past.

There are a lot of other competitors out there that I have never tried, like SkyDrive and SugarSync. Some brief research found that Google Drive has the largest and most expensive premium option, 16 TB for $800 a month, as well as the largest file size limit, 10 GB. On cost, G Drive wins over almost all the competition, except SkyDrive, which works out less expensive over a year. Being the data storage junkie I am…





How to get CrashPlan to back up to a network drive (NAS)

I know it appears this blog is exclusively focused on backup technologies for the enterprise, specifically related to EMC products. That is my primary area of focus currently, and I’m using this blog as a repository of my ongoing learnings in this area. But I would not be a very good backup expert if I did not protect my data at home. In the past I’ve used Mozy, Carbonite, and Mainland’s mCloud (for important files only), and now CrashPlan, to provide an offsite copy of my data. These products are great for quick recovery of files, and some have an option to deliver data on external media for large restores, for a price.

I recently bought a Seagate GoFlex NAS. We use it for primary storage of some media we download; all other important documents, pictures, and music are on an internal 2 TB drive. I started using CrashPlan a few months ago, and I really like the ability to perform backups to multiple targets. My plan was to back up to the CrashPlan cloud and also keep a copy on the NAS for quick recovery. The GoFlex, like many home NAS devices, leverages some dark arts to make itself available on the network. It’s *NIX-based, and I found many guides online about hacking in, getting root, and then leveraging Samba. That’s also a great way to void your warranty, and a lot of work for me, as I’m not that smart.

The GoFlex NAS creates some drive mappings, but the CrashPlan app does not allow backups to a mapped network drive. Instead, I used Windows 7’s native VHD function to create what appears to be a local drive that CrashPlan can use, but is actually a file sitting on my NAS. Here is how.


Go to Computer Management, right-click on Disk Management, and select Create VHD.


Browse to the destination where the VHD file will be stored. This should be on your NAS device. Choose a dynamically expanding or fixed-size disk. If fixed, ensure it is large enough to ingest the backup data.


In a moment you should see the new disk. Right-click it and initialize.


Right-click on the volume again, select New Simple Volume, and assign a drive letter.


Then go to CrashPlan > Destinations, select the Folders tab, browse to the new drive letter, and select Start Backup!
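The same steps can be scripted with Windows’ built-in diskpart instead of the GUI. A sketch, where the path, size (maximum is in MB, so 512000 is roughly 500 GB), and drive letter are placeholders for your own; note diskpart runs elevated, so the NAS location may need to be a UNC path or a mapping visible to the elevated session:

```
C:\> diskpart

DISKPART> create vdisk file="\\goflex\backup\crashplan.vhd" maximum=512000 type=expandable
DISKPART> select vdisk file="\\goflex\backup\crashplan.vhd"
DISKPART> attach vdisk
DISKPART> create partition primary
DISKPART> format fs=ntfs quick label=CrashPlan
DISKPART> assign letter=V
```

Handy if you ever need to recreate the target, since the GUI clicks reduce to a script you can keep with your notes.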

Wow! Was that ever fun. Super glad I spent my evening at home configuring backups. Can’t wait until tomorrow to go to work…please kill me.




Removing devices quickly with NSRAdmin


So when I first started working with NetWorker at this particular client, I ran into some issues. I can’t recall exactly what happened, but there was some mismatch with the devices. In my experience from the NetBackup world, it is usually best and easiest to delete and re-add the devices: just let the app scan and find what it will. As long as the OS can see the devices, the app will too.

This NetWorker environment was two years old at the time, built and then configured by staff who did not know much more about NetWorker than I did. I do know one thing: keep your media pools to a minimum. I learned this years ago; it ensures you get the best utilization out of your tape media.

At the time I was not well versed in the NetWorker command line or nsradmin. I attempted to delete the jukebox and associated tape devices, and quickly realized that the devices could not be removed until they were removed from the media pools. OK… no problem… There are 95 media pools!!! O_o. Wow. Let’s not talk about the possible rationale. I was surprised EMC support could not advise a quicker way to remove the devices from the pools than manually deselecting them in the GUI. I was doing some research today and I found this.



If you want to delete all the jukebox definitions and all of the devices, use the nsradmin CLI:

# nsradmin
> . type: NSR jukebox
> show name
> print

(this will list all the jukebox definitions; if you are happy to delete them, continue)

> delete
> delete

(you have to run delete twice, maybe more; just keep running delete until no records are found)

> . type: NSR pool
> show name; devices

(if any devices are owned by pools, that will prevent you from deleting them; update devices to be blank if so)

> update devices:

> . type: NSR device; media family: tape
> show name
> print

(make sure it’s selecting just the devices you want to delete)

> delete
> delete

Wondering if anyone has tried this. I have since whittled down the number of media pools greatly, but I am planning an upgrade soon that will require removing and re-adding the library. I’ll give it a try and let you guys know how it works.



VTL weirdness

I have an EMC VTL in my backup environment. It was purchased just before EMC acquired Data Domain. Let’s just say there is a reason why EMC purchased DD; it really is the superior product. That being said, this VTL 4160 has served us well and, aside from one major outage, has been very stable. The problem I’m going to describe probably has more to do with NetWorker than with the VTL.

I had noticed my clone jobs had been hanging. The job was looking for a particular virtual tape to copy to physical. The tape was in a virtual drive, and the associated message indicated “reading, done.” NetWorker was repeatedly and unsuccessfully trying to unload this tape. As you probably know, it is a bad idea to use anything other than NetWorker to move tapes around; if you do, you MUST perform an inventory so NetWorker is aware of the location of the tapes. So to clear this up, I first manually ejected the tape from the virtual drive via the CDL GUI. Then I performed an inventory of the specific slot I put the tape in and of the drive that NetWorker THOUGHT had the tape.

When the inventory was completed and NetWorker was aware of the location of all tapes in the home slots, I restarted NetWorker. Voila!
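For reference, a slot-specific inventory like this can also be run from the command line with nsrjb; a sketch, where the slot number and device path are placeholders for your own environment:

```
# inventory only slot 57, using the drive NetWorker thought held the tape
nsrjb -I -S 57 -f /dev/rmt/3cbn
```

Limiting the inventory to the affected slot avoids the full-library scan warned about below.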

Note: Be careful with the inventory options. You can accidentally inventory the entire library and have it load and read all tapes. That may take some time, depending on the size of your library.



nsradmin is your friend

I have a confession to make: I like the GUI. Don’t hold it against me. I know most of us in backup and recovery, and the products we support, have roots in UNIX, and I am well aware of how superior the command line can be, and generally is. That’s what this post is about. I find the NetWorker GUI can sometimes be finicky about what it will and won’t let you do, even though there may be a menu option for it. So this week I noticed some incorrect devices hanging around. I had taken a vacation a while ago, and it looks like somebody was having some fun. Grrr!

Do you think I can right-click and delete? Nope! So I configure the device as stand-alone and try to delete it again. Nada! Denied! So what to do? Thanks to EMC support for showing me this some time ago. To the nsradmin utility!

Use Tab to move through the menu. First choose NSR device from the Select menu, then use Next to move through the device list.

When you find the device in question, select Delete.
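If you have a pile of these to clean up, the deletion can also be scripted: nsradmin can read commands from a file with -i. A sketch, where the device name is a made-up placeholder, and the trailing “y” to answer the delete confirmation is an assumption about how your version prompts:

```
# cat rmdev.nsr
. type: NSR device; name: rd=badhost:/dev/nst0
print
delete
y

# nsradmin -i rmdev.nsr
```

The print line lets you confirm the query matched only the device you meant before the delete runs.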


Boom! Take that, careless co-worker with no respect for others’ backup environments!



