It may seem odd, but one of the first topics we covered at Deep Analysis when we launched in 2017 was ‘Cloud File Migration.’ If you are interested, I will find and send you a copy of this lengthy report! Nearly three years on, not a week passes without the topic rearing its head in some form or other, whether as migration, managed file transfer (MFT), or good old large file transfer.

Why? The reason is simple: every time an enterprise wants to move a lot of files, whether because it has upgraded a legacy system or because of a merger or acquisition, ‘how’ to move the files is the last thing it thinks about. When it does think about it, it typically finds out the hard way that the job is not as easy as it sounds. It’s one of the primary reasons for projects stalling and costs escalating. Moving data securely requires secure data transfer tools, and migrating large volumes of files requires a great deal of configuration and management to ensure that the files and their associated links and metadata move intact. Large files and big data stretch and break regular systems that were not designed to deal with them.

I don’t know if we will update our original report, but the research we undertook revealed two things we didn’t previously know. First, well over 50 vendors specialize in file migration; add in the MFT vendors and the large-file-focused vendors, and that number likely tops 100. Second, things go wrong regularly. We talked with many organizations that recounted their horror stories to us; a summary of those conversations would run along the lines of “We thought we could just…”
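To make the ‘move intact’ point concrete: one common sanity check in any file migration, however it is performed, is to build a manifest of checksums on the source, repeat it on the target, and compare the two. This is only an illustrative sketch in Python (the function names and approach are our own, not any vendor’s tooling, and real migrations must also account for permissions, links, and richer metadata):

```python
import hashlib
from pathlib import Path


def build_manifest(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest and size."""
    manifest = {}
    root_path = Path(root)
    for path in sorted(root_path.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(root_path))] = (digest, path.stat().st_size)
    return manifest


def diff_manifests(source: dict, target: dict) -> dict:
    """Report files missing from the target, altered in transit, or unexpectedly added."""
    return {
        "missing": sorted(set(source) - set(target)),
        "changed": sorted(p for p in source if p in target and source[p] != target[p]),
        "extra": sorted(set(target) - set(source)),
    }
```

Even a simple comparison like this is how many teams discover, after the fact, that a migration silently dropped or altered files — which is exactly why purpose-built tools exist.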
To be clear, MFT and file migration tools (for example, MOVEit versus Cloud FastPath) are different things, and large data or file transfer (for example, IBM Aspera) requires somewhat different technology again. What they have in common is that they are often brought in only after a failed first attempt without them. It’s not the most riveting of topics, I’ll grant you, but then neither are spark plugs or lightbulbs, and you can’t drive a car or see in the dark without them.

All this is to say that enterprise software and IT vendors often downplay or oversimplify the challenges of moving from one system or cloud to another. It’s also a reminder that we tend to oversimplify the concepts and complexity behind terms like data, files, or content. We often don’t fully grasp that one million files are a lot more than one hundred thousand files, and that one billion files are on another scale altogether. Finally, it is a reminder that well-managed and well-governed data and content live in well-managed and highly structured environments. Rip them out of that environment, and sparks can fly. That holds just as true for a single highly sensitive and valuable file as it does for a petabyte of data.
The point of all this is simple. I have worked with many enterprise projects over my career; few have gone as smoothly as expected. The Standish Group estimates that around 70% of IT projects fail or fall short of expectations. A decade or more ago, the reason was often that the software (or hardware) itself was questionable. That’s not the case anymore; almost all enterprise software and hardware today works well. Yet our failure rate remains high, and it’s more often the little things that trip us up. Take last Friday, for example: I drove 75 miles to retrieve some belongings from a storage unit, and when I got there, I realized I had the wrong set of keys with me. Whenever you plan to move highly sensitive, large, or complex data and content from A to B, don’t underestimate the work involved, and make sure you have the right keys to unlock the door securely and safely.
Get trusted advice and technology insights for your business from the experts at Deep Analysis.