Fighting Ransomware with Data Backup: Weekly Roundup
Data loss has many culprits, from ransomware attacks carried out by hackers bent on extorting money to outdated IT systems and human error. But companies have several strategies at their disposal to limit — and even prevent — the damage caused by data loss. Here's a look at recent developments in data loss prevention, and other data protection stories you may have missed this week:
How Backup Can Help Fight Against DDoS Attacks
The landing page for Code Spaces, a (former) code-hosting service, tells a story of just how serious data loss can be. Earlier this week, the service was taken down by a DDoS attack from a hacker who intended to extort money. According to backup and recovery expert W. Curtis Preston, the company was unable to restore its website because its "websites, storage, and backups were all stored in the Amazon.com egg basket." The financial and operational losses were so great that Code Spaces shuttered its digital doors.
To avoid being crippled by such extortion attacks, Preston urges companies to diversify their backup strategies. Nat Maple, vice president and general manager at Acronis, recommends that companies and consumers follow a “3-2-1” strategy, which calls for three copies of data, on two different types of media, with one stored offsite.
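As a rough illustration (not from the article), the 3-2-1 rule can be expressed as a simple check over an inventory of backup copies. The `BackupCopy` structure and its field names here are hypothetical, purely for the sketch:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    """One copy of the data; fields are illustrative, not from any real tool."""
    medium: str      # e.g. "disk", "tape", "cloud"
    offsite: bool    # stored away from the primary site?

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """Three copies of the data, on two different media, with one offsite."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

copies = [
    BackupCopy(medium="disk", offsite=False),   # primary copy
    BackupCopy(medium="disk", offsite=False),   # local backup
    BackupCopy(medium="cloud", offsite=True),   # offsite backup
]
print(satisfies_3_2_1(copies))  # True: 3 copies, 2 media, 1 offsite
```

The point of the rule is that no single failure (one device, one media type, or one site such as the "Amazon.com egg basket" above) can wipe out every copy at once.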
Read more at Backup Central
To Err Is Human
Human error accounts for a large share of data loss: 30 percent of it, according to research from Acronis. Data loss prevention strategies, such as the tier classification of a data center, typically focus on data center reliability. That is important, but some argue it's not enough. "If one of the leading causes of data center outages is human error, why do we spend so much time focused on the tier classification or rating of a data center?" asks Rob McClary, vice president and general manager at FORTRUST, in Data Center Knowledge. Time and time again, people, not machines, are to blame for data loss, downtime and security breaches. McClary recommends that companies plan ahead for human error and strategize ways to increase uptime. "At the end of the day, the bottom-line metric of a data center’s success is simple: the years of continuous uptime that you deliver against the number of unplanned downtime events that you experience," McClary says.
Read more at Data Center Knowledge
IT Crashes, Data Loss Keep Financial IT Pros Up At Night
A 2014 survey of financial firms conducted by Fujitsu and Coleman Parkes showed that only 35 percent of the surveyed organizations were confident they could secure customer data in the event of an IT collapse. This is a major concern for financial firms, which risk losing consumer confidence should they lose customer data. Last year, RBS and NatWest experienced multiple IT outages that greatly inconvenienced their customers. According to International Business Times, RBS chief executive Ross McEwan said these outages could have been prevented had the company invested properly in the bank's dated IT systems.
Read more at International Business Times
Need-to-Know Advancements in Backup Tech
Recent advancements in backup technology improve, rather than replace, traditional backup methods. According to IT consultant Chris Evans, newer methods such as replication and snapshots should be used as part of a company's backup strategy, not in place of it. Replication, a process that makes periodic copies of a database from one server to another, is useful when paired with other backup systems. "Remote data replication is sometimes assumed to be equivalent to backup, but this is not the case," says Evans. With replication, he explains, data corruption hidden in the system is replicated along with everything else, making it ineffective as a standalone strategy. It's better, he says, to use replication alongside traditional backup and snapshots (aka image backups), which create point-in-time copies of data, because "short-term snapshots are great for dealing with user errors and some data corruption scenarios."
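Evans's distinction can be sketched with a toy model (my illustration, not from the article): replication mirrors whatever the current state is, corruption included, while a snapshot freezes an earlier, known-good point in time. The dict-based "database" here is purely for demonstration:

```python
import copy

# Toy model: the "database" is just a dict of file names to contents.
primary = {"orders.db": "good data"}
snapshots = []   # point-in-time copies
replica = {}     # continuously replicated mirror

def take_snapshot(db):
    """Snapshot: freeze a point-in-time copy of the data."""
    snapshots.append(copy.deepcopy(db))

def replicate(db):
    """Replication: mirror the current state, whatever it contains."""
    replica.clear()
    replica.update(copy.deepcopy(db))

take_snapshot(primary)            # taken while the data was still good
primary["orders.db"] = "corrupt"  # hidden corruption creeps in
replicate(primary)                # replication faithfully copies the corruption

print(replica["orders.db"])       # "corrupt"  -> the replica is no help
print(snapshots[0]["orders.db"])  # "good data" -> the snapshot can restore
```

This is why Evans calls replication ineffective as a standalone strategy: the replica is only ever as clean as the primary it mirrors.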
Read more at Computer Weekly