The Quest for Storage: Journey from SSDs to a DIY SAN Setup

Recently, I was faced with a predicament: the SSDs in my Proxmox host were wearing out at an alarming rate of 1% per day. With all four SSDs wearing at that pace, it was clear that once they failed, my VMs and containers would grind to a halt, leading to catastrophic data loss.
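
For anyone wondering where that 1% per day figure comes from: Proxmox reports an SSD wearout percentage in its disk view, and you can pull the same number straight from the drive’s SMART data. Below is a rough Python sketch along those lines; the device path is a placeholder, and the attribute name assumes a Samsung SATA SSD (other vendors report wear differently).

```python
# Minimal sketch: read SSD wear from SMART data via smartctl (smartmontools).
# The device path is a placeholder; attribute names vary by vendor --
# Samsung SATA SSDs report "Wear_Leveling_Count".
import subprocess

DEVICE = "/dev/sda"  # hypothetical device path; adjust for your system

def smart_attributes(device: str) -> dict:
    """Return {attribute_name: normalized_value} parsed from `smartctl -A`."""
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True).stdout
    attrs = {}
    for line in out.splitlines():
        parts = line.split()
        # Attribute rows look like: ID# NAME FLAG VALUE WORST THRESH ...
        if len(parts) >= 4 and parts[0].isdigit():
            attrs[parts[1]] = int(parts[3])
    return attrs

if __name__ == "__main__":
    attrs = smart_attributes(DEVICE)
    if "Wear_Leveling_Count" in attrs:
        # The normalized value starts at 100 and counts down, so the
        # difference from 100 approximates percent worn.
        print(f"Approximate wear: {100 - attrs['Wear_Leveling_Count']}%")
    else:
        print("Wear_Leveling_Count not reported; check vendor-specific attributes.")
```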

The issue with SSDs: The SSDs I initially opted for were from Samsung’s QVO series. Although affordable, these QLC-based drives have relatively low write endurance and proved unsuitable for ZFS on Proxmox, wearing out rapidly under its steady write load. To tackle this, I transitioned to the pricier but more durable Samsung 870 EVOs. However, I realized that this was just a temporary fix.

The search begins: I embarked on a journey to find a storage solution that combined high capacity with speed and reliability. My exploration led me to eBay in search of parts for a SAN (Storage Area Network) setup. The primary components on my checklist? A JBOD (Just a Bunch Of Disks) enclosure and an LSI HBA card.

The findings: Luck was on my side when I stumbled upon a 25-slot JBOD for 2.5” drives, priced at only $199. Although SAS disks are notorious for their steep prices when bought brand new, I managed to grab 20 1TB drives for just $20 each, keeping the project cost under $600. For the price, 20TB of raw storage and a JBOD felt like a steal.
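
For the record, here’s the quick math behind that under-$600 figure (counting just the enclosure and drives):

```python
# Quick check on the parts math above: the enclosure plus twenty $20 drives.
jbod_cost = 199              # 25-slot 2.5" JBOD enclosure
drive_count, drive_cost = 20, 20

print(f"Total: ${jbod_cost + drive_count * drive_cost}")  # $599, under the $600 budget
print(f"Raw storage: {drive_count * 1}TB")                # twenty 1TB drives
```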

Choosing the right OS: With the hardware in place, the next big decision was the OS for the SAN. Though I had previously worked with TrueNAS CORE, I was intrigued by its successor, TrueNAS SCALE. The UI was more streamlined, and setting up containers with SSL certs was notably easier. While Unraid was a contender, I eventually settled on TrueNAS SCALE, which fulfilled all my requirements: iSCSI, SMB shares, NFS shares, and even running apps like Plex and Home Assistant.
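
A quick aside on the iSCSI piece: once TrueNAS exposes a target, a Linux client (such as the Proxmox host) only needs to discover it and log in. Here’s a minimal sketch using the standard open-iscsi tooling; the portal address and target IQN are placeholders, and Proxmox can also handle this for you when you add an iSCSI storage entry in its UI.

```python
# Sketch: discover and log into an iSCSI target from a Linux client,
# using the standard open-iscsi CLI (iscsiadm). The portal address and
# target IQN below are placeholders -- substitute your SAN's values.
import subprocess

PORTAL = "192.168.1.50"                      # hypothetical TrueNAS address
TARGET = "iqn.2005-10.org.freenas.ctl:tank"  # hypothetical target IQN

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

# 1. Ask the portal which targets it exposes.
print(run(["iscsiadm", "-m", "discovery", "-t", "sendtargets", "-p", PORTAL]))

# 2. Log into the chosen target; its LUNs then appear as local block devices.
run(["iscsiadm", "-m", "node", "-T", TARGET, "-p", PORTAL, "--login"])
```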

Things to note: One limitation with TrueNAS SCALE was that I couldn’t add drives to an existing VDEV. Since I had set up all my drives as a single RAIDZ3 VDEV, this posed a potential problem for future expansion. To work around it, whenever I got new drives, I added them as hot spares, which also meant I never had to power down the system to initialize new disks in the JBOD. For enhanced performance, I also installed a 10Gb SFP+ NIC and connected it to my Dream Machine Pro, maximizing the bandwidth available for SAN traffic.
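
For a rough sense of what that layout costs in capacity: RAIDZ3 sets aside three drives’ worth of parity per VDEV, and hot spares contribute nothing until they step in for a failed disk. A back-of-the-envelope estimate (real usable space lands a bit lower once ZFS overhead is counted):

```python
# Back-of-the-envelope capacity math for one RAIDZ3 VDEV plus hot spares.
# Ignores ZFS metadata/padding overhead, so real numbers land a bit lower.
PARITY_DRIVES = 3  # RAIDZ3 keeps three drives' worth of parity

def pool_capacity_tb(vdev_drives, drive_tb, hot_spares=0):
    raw = (vdev_drives + hot_spares) * drive_tb
    usable = (vdev_drives - PARITY_DRIVES) * drive_tb  # spares add no capacity
    return raw, usable

# The layout described above: twenty 1TB drives in a single RAIDZ3 VDEV.
raw, usable = pool_capacity_tb(vdev_drives=20, drive_tb=1.0)
print(f"raw: {raw}TB, usable: ~{usable}TB")  # raw: 20.0TB, usable: ~17.0TB
```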

In conclusion, this journey taught me the importance of research and planning when setting up storage solutions. From SSD wear and tear to the intricacies of SAN configuration, the path was filled with learning experiences, culminating in a storage setup that met all my needs.
