
Recovering Data from a Failed Synology NAS

In my tech-filled life, Synology Network Attached Storage (NAS) units have been faithful companions for many years, serving both my personal and small business needs. My personal workhorse at the moment is the Synology 918+, housing a quartet of disks in a RAID 10 configuration. A virtual treasure trove of personal file storage, this NAS serves as the Plex media server for all my music, movies, TV shows, documents, and more, and it also holds my on-site copies of all of my music, game development, and software development projects. Because this vault of data lives on the NAS, my recent computers have been able to enjoy the speed and efficiency of being solid-state-only.

Synology 918+ Network Attached Storage unit (and box)

But this past weekend, my ever-dependable Synology unit fell silent. Rather than a simple power-down, it seems a more substantial hardware failure has occurred. Although the power adaptor appears functional, no amount of coaxing can revive the unit. Thankfully, the designers at Synology have built in several robust backup options, ensuring that all of my important documents are backed up to OneDrive in real time, while a nightly backup of the entire unit was made to an external hard disk using Synology’s Hyper Backup software.

What is a NAS, anyway?

Picking up where we left off, you might be wondering, “What is a NAS, anyway?” Well, NAS stands for Network Attached Storage. It’s essentially a device that provides centralized data access and storage to a local network of computers. A NAS can be a lifesaver for households or small businesses like mine that need to manage a large amount of data across various devices. Think of it as your very own personal cloud storage system.

The brand I use and have come to trust is Synology. The company, headquartered in Taipei, Taiwan, was founded in 2000 and has been a pioneer in Network Attached Storage (NAS) devices, creating solutions not only for businesses but for home users as well. Over the years, they’ve made their name by offering high-quality products with versatile features that cater to a variety of needs. My Synology NAS, the 918+, was a powerful device for its time, able to run multiple tasks simultaneously, stream high-definition media, and store terabytes of data.

While Synology is a popular brand for NAS units, it’s certainly not the only player in the market. QNAP, Netgear, Western Digital, and Seagate are also well-known for their network storage solutions. QNAP is notable for its powerful and high-capacity devices, often geared toward businesses. Netgear, on the other hand, offers a balance of affordability and functionality, ideal for those who need network storage on a budget. Western Digital and Seagate, known for their hard drives, also offer comprehensive NAS solutions that often come pre-populated with their own drives.

Now, back to my current predicament. As I mentioned earlier, the Synology unit I have been relying on has unfortunately given up the ghost. The power supply checks out, but the unit itself refuses to power on. A bad situation for any data hoarder, right? But, as the adage goes, “Don’t put all your eggs in one basket,” or in my case, all your data in one NAS. As I ponder my options for data recovery, I find myself reflecting on past instances of data loss. Regrettably, my carefree attitude toward backing up personal data when I was younger resulted in several heart-wrenching instances of lost files over the years. However, these painful experiences were crucial in teaching me the importance of a good backup strategy, particularly the 1-2-3 rule.

Easy as A, B, C

The 1-2-3 rule (more commonly written as the 3-2-1 rule) is a guideline for backing up your data. The concept is simple but effective: keep at least 1 copy of your data off-site, spread your copies across at least 2 different devices, and keep 3 copies in total. In my case, my crucial files were backed up to OneDrive (off-site) and the entire dataset was mirrored to an external hard disk (on-site), ensuring that even with the NAS failure, I’m not losing any sleep over lost data.

Fortunately, the robust backup options available in Synology units are a saving grace. My documents are safe and sound, backed up in real time to OneDrive, and I have a nightly backup of the entire unit on an external hard disk, created with Synology’s Hyper Backup software.

However, there’s a catch. This backup drive uses the EXT4 filesystem. That’s a standard filesystem for many Linux distributions, but for Windows users like me it poses a challenge. EXT4 isn’t natively supported by Windows, which means I have to use specific software, in this case Synology’s Hyper Backup Explorer, to read and recover the files. But even this software requires access to the files on the device, something that has proven to be a challenge under the present circumstances.

So first, I needed to confirm that the backup disk was actually working. My Synology NAS was configured to alert me to all kinds of errors and issues, and there have been no alerts relating to the backup disk (or the failure, for that matter), so at this point I assumed the disk was mountable. To confirm the disk was readable, I needed to be able to read an EXT4 filesystem; to achieve this, I installed Windows Subsystem for Linux on my laptop.
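For anyone following along, installing WSL on a recent build of Windows 10 or 11 is a single command from an elevated PowerShell prompt. The sketch below assumes the default Ubuntu distribution and that a reboot follows the install.

    # From an elevated PowerShell prompt (Windows 10 2004+ or Windows 11).
    # Installs the WSL components and the default Ubuntu distribution.
    wsl --install -d Ubuntu

    # After rebooting, confirm the distribution is registered and running under WSL 2.
    wsl --list --verbose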

What is Windows Subsystem for Linux?

Windows Subsystem for Linux, often referred to as WSL, is a feature provided by Microsoft that allows developers to run a Linux environment directly on Windows, without the need for a dual-boot setup or a full virtual machine. The original version worked by translating Linux system calls into Windows system calls, allowing Linux applications to run on the Windows operating system seamlessly.

The benefits of WSL are manifold. It grants you the flexibility to use powerful open-source tools like grep, sed, and awk directly on Windows, and to access your Windows files from within a Linux environment. It’s also a boon for developers, as it enables them to write scripts that work across different platforms, use Linux command-line tools like ssh and git, and even run web servers or databases like Apache and MySQL for local development.
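As a small illustration (the folder name here is made up), the wsl launcher lets you run those Linux tools from a regular Windows prompt against your Windows files:

    # Run from a Windows command prompt or PowerShell session.
    # wsl.exe executes the command in the default Linux distribution, starting in the
    # Linux view of the current Windows folder (e.g. /mnt/c/...).
    wsl grep -rn "chorus" ./MusicProjects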

Furthermore, with the release of WSL 2, the performance has been drastically improved with the use of a real Linux kernel provided by Microsoft, making it even more powerful and flexible. This makes WSL not just a feature for developers, but a significant tool for system administrators, IT professionals, and even for people who just want to learn more about Linux. In the context of our data recovery journey, WSL might be our key to interacting with the EXT4 filesystem natively on Windows, potentially simplifying the process of extracting our data.

Onwards to Recovery

With Windows Subsystem for Linux installed and my laptop rebooted, I successfully managed to mount the disk – the Hyper Backup file structure was present and readable, so my assumption is that the backup disk is working as expected and contains the data.
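For reference, attaching a bare disk to WSL is done with the wsl --mount command from an elevated PowerShell prompt. The drive number and mount point below are illustrative; Windows will enumerate the USB disk differently on other machines.

    # Elevated PowerShell: list physical disks to find the USB backup drive.
    GET-CimInstance -query "SELECT * from Win32_DiskDrive"

    # Attach the first partition of that disk to WSL as ext4
    # (replace PHYSICALDRIVE2 with the number reported above).
    wsl --mount \\.\PHYSICALDRIVE2 --partition 1 --type ext4

    # Inside the distribution the volume typically appears under /mnt/wsl/.
    wsl ls /mnt/wsl

    # When finished, detach the disk cleanly.
    wsl --unmount \\.\PHYSICALDRIVE2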

When I tried to point Synology Hyper Backup Explorer at the location, however, I got a “Stored data on the backup destination are corrupted” error.

Synology Hyper Backup Explorer showing the error "Stored data on the backup destination are corrupted"

Unable to install Synology’s Hyper Backup Explorer within the Windows Subsystem for Linux, I was compelled to explore other options. My first assumption was that Hyper Backup Explorer simply doesn’t want to read data from \\wsl$\ and is expecting a “local” resource. The idea of installing Ubuntu within a virtual machine was tempting, offering the prospect of an environment that could natively handle the EXT4 filesystem. Simultaneously, my eyes flicked to my Raspberry Pi, considering its Linux-based system and its potential to come to my rescue in this situation.
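To sanity-check that assumption, the mounted volume is at least visible to Windows over the UNC path that WSL exposes; the distribution name and mount folder below are assumptions based on the defaults.

    # From PowerShell on Windows: browse the WSL filesystem over its UNC path.
    # "Ubuntu" is the default distribution name; wsl --mount names its mount points
    # after the disk and partition, e.g. PHYSICALDRIVE2p1.
    dir \\wsl$\Ubuntu\mnt\wsl\PHYSICALDRIVE2p1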

However, another thought niggled at the back of my mind – the perils of relying on a single copy of my data. The 1-2-3 rule of backup loomed large. Having transitioned to a NAS-first data strategy a few years ago, my aim was to consolidate all of my data in one accessible location with robust backup protocols in place. This has influenced the configuration of my recent computers – they’re SSD-only, lean on storage space, but high on performance. I’ve been working mainly on an Asus ROG Zephyrus G15, which, despite its formidable capabilities, comes with a relatively modest 512GB of SSD storage – nowhere near enough to house my 2TB of compressed Hyper Backup files.

So, I found myself facing the task of creating a second copy of my 2TB of data before any further diagnostics or recovery attempts. The need to have a safety net, a clone of my data, was paramount. This wasn’t simply about the luxury of a backup; it was about ensuring the survival of my data, regardless of what further complications my recovery attempts might unearth.

Thus, before diving into the complexities of data extraction and recovery, my first priority was clear: clone the 2TB of Hyper Backup files to another external hard disk. This step, while perhaps seeming overly cautious to some, is essential to safeguard against the risk of losing data during the recovery process. As any experienced data guardian will tell you, there’s no such thing as being too careful when it comes to preserving your digital assets. For bonus points, the second external hard disk is NTFS-formatted and natively readable in Windows, so once the file transfer finishes I should hopefully have a locally mapped drive letter allowing Hyper Backup Explorer to read the backup.
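The copy itself can be driven from inside WSL, since Windows drive letters are automatically mounted under /mnt. The drive letter and folder names in this sketch are placeholders, not the actual paths on my disks.

    # Inside the WSL distribution. /mnt/e is the NTFS-formatted destination disk
    # (drive E: in Windows); the source is the ext4 volume attached via wsl --mount.
    # -a preserves attributes, -h prints human-readable sizes, --progress shows progress.
    rsync -ah --progress /mnt/wsl/PHYSICALDRIVE2p1/HyperBackup/ /mnt/e/HyperBackup-clone/

A nice property of rsync here is that it is resumable: if a multi-hour copy gets interrupted, re-running the same command skips the files that already made it across.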

I am reminded how infrequently I copy large amounts of data around in my personal life; normally it’s just data incrementing over time. It turns out that copying 2TB from one USB hard disk to another takes longer than I expected…
