Backing up files is seen as critical to restoring lost data. While this conventional wisdom still holds - companies should always back up their files - ransomware forces companies to rethink the methods they use to back up data. Some ransomware strains have evolved to target shared network drives in addition to local drives. For example, the attack code for KeRanger, the first working ransomware to target Macs, contained a non-working function to encrypt Time Machine backups. Time Machine is the backup software included in OS X, and it works with any external hard drive.
Files on shared network drives: ransomware’s jackpot
Ransomware no longer infects only a computer’s local hard drive and mapped drives. Shared network drives are just as vulnerable. New strains, such as Locky, are designed to encrypt network shares like central file servers, as well as removable drives connected to the computer at the time of infection. The encryption of shared files is a doomsday scenario for organizations: it only takes one employee on the network executing ransomware to affect the entire company.
Cloud backup services are also vulnerable to ransomware
Cloud backup services aren’t immune to ransomware. Attackers know that organizations are using the cloud to store data and have created ransomware that can infect files kept in the cloud. Some ransomware strains, including a variant of Virlock, use the desktop sync clients of popular cloud services to access and encrypt files stored in the cloud. For example, if the Google Document a person is working on locally gets encrypted, the encrypted file will sync with Google Drive.
If an older version of the file was saved on Drive, a person could manually download it. But someone would have to download each file by hand, since Drive lacks a function to automatically restore an entire drive. This process could prove extremely time-consuming if an organization had several gigabytes of data to recover.
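To make that manual effort concrete, here is a minimal sketch of what restoring a single file's previous revision might look like using the Google Drive API via the google-api-python-client library. The function name, the credential handling, and the assumptions that the file is a binary (non-Google-Docs) file with at least one earlier revision, and that revisions are returned oldest first, are all illustrative; this is a sketch, not a tested recovery tool.

```python
# Hypothetical sketch: restore the previous revision of a single Drive file.
# Assumes google-api-python-client is installed, valid OAuth credentials
# ("creds") already exist, and file_id points at a binary (non-Google-Docs)
# file with at least one earlier revision.
import io

from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload


def restore_previous_revision(creds, file_id, out_path):
    service = build("drive", "v3", credentials=creds)

    # List the file's revisions (assumed here to be ordered oldest first).
    revisions = service.revisions().list(fileId=file_id).execute().get("revisions", [])
    if len(revisions) < 2:
        raise RuntimeError("No earlier revision to restore")

    # The second-to-last revision is the one saved before the encrypted copy synced.
    prev_revision_id = revisions[-2]["id"]

    # Download that revision's content to a local file.
    request = service.revisions().get_media(fileId=file_id, revisionId=prev_revision_id)
    with io.FileIO(out_path, "wb") as fh:
        downloader = MediaIoBaseDownload(fh, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()
```

Even under those assumptions, every encrypted file needs its own revision lookup and download, which is exactly why recovering several gigabytes this way quickly becomes impractical.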
Backing up everything isn’t feasible
Ransomware targets many file types, from the more obvious ones like Word documents, PowerPoint presentations, photos and Excel spreadsheets, to the less obvious files, like those with the extensions css, java, js, mp3, msg, pdb, php, png, ppt, ps, sav, tif, tiff and wav. But some organizations back up only the most important files at regular intervals, if at all, raising the possibility that data deemed less important won’t be recoverable. If ransomware encrypts that data, it could be lost for good.
Deciding to back up more files means more storage, which translates into more money. IT budgets may not have additional capital for servers, and while cloud storage is cheap, it isn’t free. Figuring out which files to back up forces security departments into guessing which files ransomware will encrypt. This approach essentially places security professionals in a constant race against ransomware authors, with the good guys trying to determine which files to back up before the bad guys encrypt them.
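As an illustration of how that guessing game plays out in practice, here is a minimal sketch of an extension-based backup pass. The allowlist and the source and destination paths are assumptions made for the example, not recommendations; the point is that anything not on the list is simply never copied.

```python
# Minimal sketch of a selective, extension-based backup pass.
# The allowlist below is an assumption for illustration: it covers the
# "obviously important" files and nothing else.
import shutil
from pathlib import Path

BACKUP_EXTENSIONS = {".docx", ".xlsx", ".pptx", ".pdf", ".jpg"}


def selective_backup(source_dir: str, dest_dir: str) -> int:
    """Copy only files whose extension is on the allowlist; return the count copied."""
    copied = 0
    for path in Path(source_dir).rglob("*"):
        if path.is_file() and path.suffix.lower() in BACKUP_EXTENSIONS:
            target = Path(dest_dir) / path.relative_to(source_dir)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)
            copied += 1
    return copied
```

Run against the extension list mentioned earlier, files such as js, php or sav fall straight through this allowlist, so the guess about which files matter becomes a gamble against whatever the ransomware chooses to encrypt.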
Increasing the backup frequency is a viable option, but it has its drawbacks. Frequent backups can impact a computer’s performance, since the process consumes CPU and storage resources. They also eat into storage capacity: the common approach is to keep a few previous versions of each file available, so organizations that back up frequently end up with several versions to store, which requires more space. And in a post-attack scenario, while it may be possible to restore an older version of an encrypted file, any changes a person made since the last backup will be gone.
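A rough back-of-the-envelope calculation, using assumed numbers rather than figures from any particular vendor, shows how quickly version retention adds up and how much recent work stays at risk between backups.

```python
# Back-of-the-envelope arithmetic with assumed numbers: how version retention
# multiplies storage, and how large the data-loss window is between backups.
def full_backup_storage_gb(data_gb: float, versions_kept: int) -> float:
    """Storage needed if each retained version is a full copy (no deduplication assumed)."""
    return data_gb * versions_kept


def worst_case_loss_hours(backups_per_day: int) -> float:
    """Longest stretch of changes that can be lost if ransomware hits just before a backup."""
    return 24 / backups_per_day


# Example: 500 GB of data with 7 retained versions needs 3,500 GB of backup
# storage, and with one backup per day, up to 24 hours of changes can be lost.
print(full_backup_storage_gb(500, 7))   # 3500.0
print(worst_case_loss_hours(1))         # 24.0
```

With numbers like these, more frequent backups and longer version histories quickly translate into the extra storage, and extra money, described above.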
Want to learn more?