Best mini-SAS cables according to redditors
We found 298 Reddit comments discussing the best mini-SAS cables. We ranked the 52 resulting products by number of redditors who mentioned them. Here are the top 20.
Raw storage:
Total 108TB (18 drives)
Actual storage:
Total 72TiB
Case:
Used the two bay 3.5" cage, and three bay 2.5" cage from the Deep Silence 3 case.
Fans:
Used two 120mm case fans from the Deep Silence 3 case between the two stacks of drives.
Motherboard: Supermicro X10SRA-F
CPU: Intel Xeon E5-1620 v3 3.5GHz
Heatsink: Noctua NH-D15
RAM:
Total 48GB
PSU: Corsair AX1500i
Controllers:
Total 20 ports
NIC: Mellanox ConnectX-2 10GbE
OS Disks: 2 x Intel 330 60GB, mdadm RAID1
Storage Disks:
Seven shucked from Best Buy WD easystore externals and two from Amazon as internals.
I originally shucked the Seagates from externals. I have replaced the Seagates as they fail, and I had one fail during this upgrade. Yes, I have had five Seagate failures.
SATA/SAS cables:
OS: Fedora 25 with ZFS on Linux
Cost:
The cost was spread across years. This is more like two builds in one: my old build with the motherboard, memory, heatsink, CPU, and 4TB drives, combined with my new 8TB build. Of the 4TB drives I have replaced five of nine over time, which has driven up the real total cost.
The case is huge, but all the space is nice; you don't feel like you are cramming anything in. I used a Fractal Design R5 for my previous build, and I prefer Fractal Design cases to Nanoxia cases, but the biggest Fractal Design case wouldn't quite suit my needs. Even the Deep Silence 6 was a stretch for this build. I wish the Deep Silence 6 had spots to mount 2.5" drives on the back side like the R5. It is a feature I miss.
I have a few issues. The trays and the screw holes on the WD 8TB drives don't match: the WD drives are missing the middle bottom screw holes. My temporary workaround is strong 3M double-sided foam tape plus two screws. I may drill holes in the sides of the trays. I had to tape down the 2.5" cage, but the drives are so light it is not a big deal.
After building this beast I had the window closed, the door shut, and no room fan for one day. The room was quite warm. I have since opened the window, turned on the fan, and left the door open.
My Kill-A-Watt peaked at 450 watts during boot. It idles between 200 and 220 watts, so I could go back to the AX760 from my previous build with SATA power splitters.
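As a rough sanity check on those numbers, the idle draw translates to a yearly electricity cost like this (the $/kWh rate below is my assumption; plug in your own utility rate):

```python
# Back-of-envelope annual electricity cost at the idle draw above.
# The rate is an assumed figure, not from the original post.
idle_watts = 210                  # midpoint of the 200-220 W idle range
rate_per_kwh = 0.12               # assumed $/kWh
kwh_per_year = idle_watts * 24 * 365 / 1000
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr, about ${cost_per_year:.0f}/yr")
```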
I still have one tray free, but no extra drive or SATA port.
I was originally going to move the four bay 3.5" cage from the Deep Silence 3, but it was just too integrated into the case. I tried adapting it, and it didn't come out well. Even if it had, the bottom tray was going to sit below the lip of the side of the case. So that tray would have been less accessible.
I am currently copying 18tb from the old array to the new array as a burn-in test.
I got the original idea to build with this case from someone else's post. Otherwise I probably would have just bought another Fractal Design R5 and run two systems. I have run two systems for storage before, connected with 10G and using iSCSI. When I did, I used https://romanrm.net/mhddfs to merge the filesystems together. I am considering doing the same again.
With the right cages you could probably fit around 26 3.5" drives in this case.
Over time I have gone from 250GB to 500GB to 1TB to 1.5TB to 2TB to 4TB to 8TB drives. I didn't think I would be upgrading to 8TB anytime soon, until the Best Buy easystore deal. In the past I mostly purchased on Black Fridays; in more recent years, externals from Costco.
TLDR: I built a new server combining an existing 24TiB ZFS array with a new 36TiB ZFS array for the win!
You can get the [non-premium] connector for $1, dunno what the difference actually is.
The data cables I just got off of Amazon. Link
[non-premium]: https://www.moddiy.com/products/SATA-Power-Connector-Black.html
I use these splitters for more SATA power connectors and these hotswap cages for housing the drives; they are often on sale at Newegg for $40-60. This card, flashed to IT mode, will add another 8 SATA connections via two SAS connectors (SFF-8087) and breakout cables.
Currently I am running 8x 3TB drives in my PC with a GTX 970, and my 550-watt PSU handles it just fine.
You can get SAS to SATA breakout cables. SATA hard drives can work with SAS and SATA backplanes/ports, but SAS drives cannot work with SATA backplanes/ports (only SAS).
A SAS controller will work just fine alongside whatever controller is on your motherboard; you can have multiple controller cards running along with the built-in one. I don't have experience with the card you listed, but you can get fairly cheap used/refurbished cards off eBay. I prefer LSI SAS chipsets; there are many rebranded versions of the LSI SAS cards.
More than you need now, but you can also get SAS expanders that work kinda like a network switch, but for hard drives.
I'm not sure what kind of RAID card you're going to end up with but if it's the norm around here you'll likely want one of these and then 3 of these little guys.
You could probably find those cheaper somewhere else but that would work fine. Then you'd just need one SATA power connector from your PSU and it looks like it would power all 3.
You don’t. You find SAS controllers like the H200 or LSI 9200-8e if you need external connections.
Internally you can use one of these if you need SATA.
Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 1.6 Feet https://www.amazon.com/dp/B018YHS8BS/ref=cm_sw_r_cp_api_i_uvl8CbJM89PS0
Wow, for being Data Hoarders, almost no one here has any clue about SAS.
SAS controllers will work with both SAS and SATA drives.
SATA controllers will only work with SATA drives.
SAS drives have twice the data connections of SATA drives. This dual-path design is very commonly used for failover and load balancing.
If your backplane has SAS connectors for the drives but only SATA connections on the back, the kind of drive you put in will dictate what kind of controller you can use.
If you put in SATA drives, you can use a SATA controller. SATA drives will plug in and work fine in this case.
If you put in SAS drives, you need a SAS controller. Yes that will work in this case, but you lose the dual data path feature. It'll work fine, you just lose one of the bigger SAS features.
If you MIX SAS and SATA drives, you need a SAS controller. Some controllers allow mixing, some do not.
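The rules above amount to a small decision table. Here's a throwaway sketch of that logic (the function names are mine, purely illustrative, not from any real library):

```python
# Sketch of the SAS/SATA compatibility rules described above.
# Function names are illustrative, not from any real library.

def works(controller, drive):
    """A SAS controller talks to both drive types;
    a SATA controller only talks to SATA drives."""
    return controller == 'SAS' or drive == 'SATA'

def controller_needed(drive_types):
    """Given the set of drive types in the box, return what you need.
    Any SAS drive (including a mix) forces a SAS controller."""
    return 'SAS' if 'SAS' in drive_types else 'SATA or SAS'

print(works('SAS', 'SATA'))                # True
print(works('SATA', 'SAS'))                # False
print(controller_needed({'SAS', 'SATA'}))  # SAS
```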
To work with your setup, you'll need a cable like this: https://www.amazon.com/HighPoint-Internal-Mini-SAS-SFF8087-Int-MS-1M4S/dp/B001L9DU88 This will go from 8087 to SATA connectors.
You can purchase something like this and then buy two of these breakout cables to add 8 HDDs without using any of the sata ports.
Good news and bad news:
The bad news is that the onboard 3008 is SAS1. The good news is that the backplane is a TQ, which means it's just passthrough. It will work with any size drive; you just need a SAS2 (or better) HBA that can supply 8 separate SATA connections. I recommend finding a cheap H310 (usually around $40 used on eBay), flashing it to IT mode, and getting a pair of breakout cables.
The LSI 9211-8i will suit you nicely. Try to get one already flashed with the P20 firmware, or just flash it yourself if you're comfortable with that.
EDIT: Here's a link to one that's already pre-flashed and from a reputable seller. And you'll need two of these cables to go with it.
Okay, here's what you're going to want to learn.
Mini-SAS comes in two versions (internal - 8087 or external - 8088).
If you want to connect drives internally, you get an LSI card with internal (8i, 16i)
If you want to connect drives externally, you get an LSI card with external (8e, 16e)
Say you have two boxes, you need one external LSI card with 8088 and one passthrough 8088-8087 card.
You'll need 8087-to-SATA cables (an 8i card will have two ports for 2 cables, where each cable supports 4 SATA connections)
You'll need 8088 cables to connect the external cards together
Figure out how many SATA hard drives you want to support.
8e - 8 SATA drives per external card.
16e - 16 SATA drives per external card.
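If your drive count differs from 16, the same topology math scales. A quick sketch, under the assumptions used in the shopping list that follows (4 drives per 8087 breakout, 2 ports per 8088-to-8087 passthrough bracket, 4 external ports per 16e card — all of which you should verify against the actual parts you buy):

```python
import math

def shopping_list(n_drives):
    """Rough part counts for the external-SAS topology described above.
    Assumptions (verify against your actual parts): each SFF-8087
    breakout serves 4 SATA drives, each 8088<->8087 passthrough
    bracket carries 2 ports, and a 16e card has 4 external ports."""
    breakouts = math.ceil(n_drives / 4)      # 8087 -> 4x SATA cables
    ext_cables = breakouts                   # one 8088 run per 8087 port
    passthroughs = math.ceil(breakouts / 2)  # 2 ports per bracket
    cards_16e = math.ceil(n_drives / 16)
    return {'16e cards': cards_16e, '8088 cables': ext_cables,
            'passthrough brackets': passthroughs, 'breakout cables': breakouts}

# For 16 drives: 1 card, 4 external cables, 2 brackets, 4 breakouts
print(shopping_list(16))
```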
Shopping List for 16 External Hard Drives from one computer to another:
External Card ($30): https://www.ebay.com/itm/LSI-6GB-16-Port-SAS-SATA-HBA-Controller-Card-SAS9201-16e-H3-25379-01G-Grade-A/273461892263?hash=item3fab9954a7:g:CSMAAOSwfkFbm-XI:sc:USPSFirstClass!33175!US!-1
Mini-SAS Passthrough (2 x $30): https://www.amazon.com/CableDeconn-SFF-8088-SFF-8087-Adapter-bracket/dp/B00PRXOQFA
8087 to SATA (4 x $8): https://www.amazon.com/CableCreation-SFF-8087-female-Internal-Splitter/dp/B013JP7YI8/ref=pd_lpo_vtph_147_bs_lp_t_1?_encoding=UTF8&psc=1&refRID=AYXPARRHH92MDMM64NJJ
8088 to 8088 (4 x $15): https://www.amazon.com/CableDeconn-SAS26P-SFF-8088-External-Attached/dp/B00S7KTXW6/ref=sr_1_3?s=electronics&ie=UTF8&qid=1537045400&sr=1-3&keywords=8088+to+8088
Edit: Please don't hesitate to ask questions before spending money, just make us a diagram showing where your disks are and where you want to hook them up.
You can definitely stick with the Fractal series. I did because I couldn't have a loud, unsightly machine setup anywhere in my home. I have my main system w/ 10 Drives + 2 SSDs + 3 NVME drives in an R6. That has a DAS connected with 19 drives inside an R5; 8 stock bays + 3 in 2x5.25 bay adapter + extra 3 drive cage + extra 5 drive cage.
As you are in Europe, you might not even have to pay crazy shipping charges to buy spare drive cages from https://www.fractal-design-shop.de/Define-R5_1. In the US I had to source the extra drive cages from r/hardwareswap but that proved to be easier than I expected. Here is a pic I took before I added the 2nd 5-bay drive cage: https://imgur.com/a/TWL8IB1
Edit: Request for more info...
I have not done a build log as I am not yet "finished" with the build, but it looks like there is sufficient demand for parts info so here it goes:
I have an R6 for my main NAS server loaded with the motherboard, 10 3.5 drives and one SSD. The R5 has two extra drive cages (3 + 5) as well a 2x5.25-to-3x3.5 bay adapter.
The expansion cards I use are:
Additional parts I used:
More inspiration can be found here: https://www.serverbuilds.net/16-bay-das
$20 PCI-E SATA cards are cheap for a reason. Many of them are buggy with random drop outs.
Most of us prefer cards with LSI Logic chips (now owned by Avago/Broadcom) which use the enterprise SAS interface. SAS drives are more expensive and most home users have no use for the more advanced features. But since SAS is a superset of SATA you can just connect SATA drives to a SAS card using these $8 cables that convert a SAS internal SFF-8087 port to 4 SATA ports.
https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EMH8/
A new LSI SAS card is expensive, but just buy a used one from one of the many highly rated sellers on eBay.
Literally just search eBay for "LSI SAS" or "LSI SAS 8i" if you want 8 internal ports.
Just buy any generic 8087 to 8087 right angle cable, https://www.amazon.com/CableCreation-Internal-SFF-8087-36Pin-Right/dp/B01KFEVQ4E/ref=sr_1_2?s=electronics&ie=UTF8&qid=1502054140&sr=1-2&keywords=8087+to+8087+right
Those cards slot into your PCI slots on your motherboard, and provide an interface that lets you plug more stuff into it and have them be read as disks. You'll likely also want a cable like this too: https://www.amazon.com/gp/product/B010CMW6S4
Along with a card similar to what you linked to, that let my motherboard interface with 4 more disks.
Just remember to buy SFF-8087 to SATA Forward Breakout cables, not Reverse breakout.
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=sr_1_3?keywords=sas+to+sata&qid=1556200941&s=gateway&sr=8-3
No reason to buy new when there are a million LSI cards out there for sale. Get an LSI card (or clone like IBM M1015, etc) and flash it to IT mode or buy one that already has been flashed. I recently purchased 2 cards for $19 shipped.
As for cables, there are lots of 8087 to SATA cables available. You need a forward breakout cable and they are available at Amazon or Newegg for cheap.
He’s using RAID cards in the PCI slots that have mini SAS ports, then uses mini SAS to SATA breakout cables like this: CableCreation Mini SAS 36Pin (SFF-8087) Male to 4 SATA 7Pin Female Cable, Mini SAS Host/Controller to 4 SATA Target/Backplane, 1.0M https://www.amazon.com/dp/B013G4EOEY/ref=cm_sw_r_cp_api_i_ap3XCbWQEG8PP
You can get SAS->SATA "fan" cables (example) to use this with regular SATA drives, like if you were building a large NAS array in a huge PC case. This would support up to 24 SATA drives off a single controller.
It has a SAS connection, so you use a breakout cable; 1 port becomes 4 SATA connections.
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B018YHS8BS?th=1
That 9211 is like the gold standard. You shouldn’t have to flash to IT mode, but you do want to upgrade the firmware (which accomplishes the same thing). The real ones are trivial to flash, versus like an H200, so I wouldn’t sweat that.
If you want the “modern” version, the LSI-9207-8i has the most recent chipset.
You can get them new, and quite a bit cheaper, on eBay .
Then you just need a pair of breakout cables
Built my plex server over the last week. Bought the Dell T7500 from this thread.
For those contemplating doing the same there were a few pieces I had to buy along with it:
I thought the PC had a VGA port from looking at images, but it didn't, so I bought a cheap graphics card just to get the UNRAID install working. I'm running it headless.
The server couldn't recognize drives over 2TB, so I bought the LSI card. You could save about half the price by buying it unflashed and then flashing it to IT mode yourself. I bought the flashed version to save myself some time.
The case came with 2 caddies, but I had an SSD and two HDDs, so I ordered these. Unless you plan on taking some things apart in the case, you can't really mount hard drives using screws.
Just wanted to post my lessons learned just in case anyone was thinking about buying the Dell t7500.
In saying all of this, I am really happy with the server so far. Just finished setting up most of my dockers in UNRAID this weekend. Have only streamed a couple of things on the local network. Soon I will have someone sharing the library at a different location.
LSI LOGIC SAS 9207-8i Storage Controller LSI00301
Mini-SAS to 4x SATA Forward Breakout Cable
I picked up one of these cards and the breakout cables and it handles 8TB drives, easy to install. Works great
You're correct. The 6/i is slow, limited to drives 2TB or smaller, and doesn't have true IT mode. It's a good card to keep if (1) you don't care about software RAID, and (2) have spinners 2TB or less.
The H200 is a great card. You can leave it in IR mode if you want hardware RAID or flash IT mode if you want software RAID. Will work with any drive.
You'll need new cables. These are the right ones for an LFF model.
I have this cable from Amazon in my R710
On my R710, I bought and flashed an H200 using:
https://techmattr.wordpress.com/2016/04/11/updated-sas-hba-crossflashing-or-flashing-to-it-mode-dell-perc-h200-and-h310/
I have flashed a few, and had to flash them in another box. I never could get my r710 to cooperate with that part. In the other machine, it took 20 minutes and went without a hitch.
As others mentioned, with a non-standard BIOS you are looking at putting it in another slot. Here are the cables I'm using and they fit well: https://www.amazon.com/gp/product/B01KFEVQ4E/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1
Using proxmox, if you want to pass the card through to a VM, here is what you are looking at:
https://pve.proxmox.com/wiki/Pci_passthrough
If memory serves, I did have to take the step to allow unsafe interrupts from the "IOMMU interrupt remapping" section.
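For reference, that "unsafe interrupts" tweak is a one-line modprobe option documented on the Proxmox PCI passthrough wiki linked above. Treat this as a sketch and check the wiki for your Proxmox version before applying it:

```shell
# Allow unsafe interrupts for VFIO passthrough; needed on some boards
# without proper IOMMU interrupt remapping (see the Proxmox wiki).
# Reboot (or reload the vfio modules) after writing this file.
echo "options vfio_iommu_type1 allow_unsafe_interrupts=1" \
  > /etc/modprobe.d/iommu_unsafe_interrupts.conf
```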
You didn't ask me, but you could get a Lenovo SA120, LSI 9200-8e, and the appropriate cable for under $300 – leaving some cash for Lenovo drive trays.
(Note also that none of these parts are necessarily ideal for you; for example, the MSA60 costs less and includes trays but has its own drawbacks. It's hard to say without knowing requirements.)
I bought a 24-bay supermicro 2u case with an old AMD motherboard in it and gutted it into a JBOD array with the help of a few small adapters, like so:
Topology is: SAS expander backplane top and bottom ports (ignore middle) to the two internal ports of the low profile adapter via two 8087 cables, then a standard e-SAS (8088) cable to the LSI 9207-8e in my server from the external ports.
This has worked out fabulously for me.
For added comfort (aka noise and power consumption), I removed the stock dual power supply that the 2u case included and replaced it with the guts of a 230w atx power supply, since I don't have dual sources. That cut the power draw down by ~80w or so. I also replaced the fans with much quieter ones (standard ~50 CFM 80mm units) and then improved airflow by taping over holes with masking tape, and using a thick paperboard to block other areas - the main purpose being to force the airflow through the drive bays.
Edit:
If you prefer LFF drives, there are 12-bay 3.5" already assembled with all the necessary parts from ebay: http://www.ebay.com/itm/222338813833
If running to a SAS backplane (Internal Mini SAS 36-Pin to SFF-8087):
https://www.amazon.com/gp/product/B00S7KU3PC/ref=s9_acsd_top_hd_bw_b7Ps60B_c_x_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-3&pf_rd_r=GE3B55QHG241C2CC6F5Q&pf_rd_r=GE3B55QHG241C2CC6F5Q&pf_rd_t=101&pf_rd_p=0664bd6f-3961-5d2e-b87d-fbb64789498e&pf_rd_p=0664bd6f-3961-5d2e-b87d-fbb64789498e&pf_rd_i=6795231011
If Running to a sata backplane (SFF-8087 to SATA Forward Breakout):
https://www.amazon.com/gp/product/B012BPLYJC/ref=s9_acsd_top_hd_bw_b7Ps60B_c_x_w?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-3&pf_rd_r=GE3B55QHG241C2CC6F5Q&pf_rd_r=GE3B55QHG241C2CC6F5Q&pf_rd_t=101&pf_rd_p=0664bd6f-3961-5d2e-b87d-fbb64789498e&pf_rd_p=0664bd6f-3961-5d2e-b87d-fbb64789498e&pf_rd_i=6795231011
Make sure to grab a battery for the H700. But if you are going to use this for FreeNAS, the H700 will not work and you will be upset.
You could get a SAS RAID card and cables all for ~$27 on Amazon. That would give you 8 ports with those cables, and if you wanted you could use a SAS expander to get more.
OR
Get a $20 4-port SATA RAID card and go with that.
FYI, you can plug a SATA hard drive into a SAS RAID card, just not the other way around, like how you can fit a small box inside a big box but not the reverse.
My recommendation would be to get the SAS RAID card so you have a little expansion room if you want more drives.
If the ports are internal they are probably sff-8087, in which case you need one of these.
If the ports are external sff-8088 then use a DAS box.
I've seen this IT mode mentioned several times, what is it and what's the difference? What do you mean by SAS breakout cables? Is this the sort of cable you mean?
Cables: 2x of these: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/
SAS/SATA Controller Card: https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M/
You might find it on eBay for less. Just posting the Amazon links for clarity.
Plug the card into a free PCI-e slot on your mobo.
Plug the Mini SAS SFF-8087 connectors into the two ports on the HBA card.
Plug the SATA connectors into the back of your 5-drive hotswap bay cage.
Insert the HDDs into the hotswap trays (if it uses trays).
Turn things on. Bob's your uncle.
P.S. If you want the PCIe 3.0 version of the HBA card, you'll need to look for "LSI 93xx" versions of the card. They're more expensive. Also, some folks go for different manufacturers' cards; I prefer the LSI brand.
If you just want to RAID the whole thing, there are cheaper alternatives, but hardware level RAID HBA cards suck IMHO. With this type of HBA SAS/SATA Controller, you can basically pass-through the drives straight to your computer, and they'll show up as individual drives. Later you can then RAID them via software, or not.
Assuming you have PCI Express 3 expansion slots in your computer, get a card like this:
LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/ref=cm_sw_r_cp_apa_i_xQ-jDbVZ3R30V
It will feed data connections to 8 SATA drives all by itself.
It has 2 SFF-8087 ports. Then get the special "forward breakout" cable (well, two really). One end goes into an SFF-8087 port, and it splits out into four SATA data cables, which go into the hard drives of course.
https://www.amazon.com/dp/B012BPLYJC/ref=cm_sw_r_cp_apa_i_2T-jDbHFP0FHC
The card can support hardware RAID, but fewer and fewer folks do that. After all, hardware RAID usually requires identical drives, and us folks at home often have a motley collection of drives of various sizes, speeds, and geometries. So software RAID it is. Folks often use FreeNAS or unRAID; in Windows 10 you can use something called Storage Spaces. Using RAID will allow you to treat all those drives like one device, and have some tolerance for failure (which happens with so many).
Next question.. does your case have room? Do you have enough power connectors?
It would really be more beneficial to just shell out 300 bucks for an R710. I'm pretty sure you'll save more money in the long term since, as you already know, that thing uses a ton of power. Around 300W idle, which adds up depending on your electricity rates! An R710 can idle at a third of that.
If you insist on keeping it, the h200 is a great card and can be crossflashed if needed. You'll just need some breakout cables like these https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC
You get a PCI-express SAS controller to install. They come in 2 and 4 port varieties, and you get SAS to SATA cables (turns one SAS port to 4 SATA connectors) to plug into it to connect your hard drives. That could be 16 drives per card, a couple of PCI-Express slots and you'll have more SATA connectors than you have room for hard drives.
Edit: I'm in no way recommending this specific one, but here is one example of what I'm talking about:
https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC/ref=sr_1_4?keywords=SAS+controller&qid=1567797015&s=gateway&sr=8-4 (it's a 2 port variety).
and for cables:
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=sr_1_3?keywords=SAS+SATA+cables&qid=1567797027&s=gateway&sr=8-3
I've used PCIe SATA expanders with mixed success; sometimes the hard drives would disappear and reappear on a reboot, and you definitely don't want that happening in a RAID setup. An HBA flashed to IT mode with breakout cables works really well; I use it in my server and it is rock solid.
HBA
Cable
I would suggest an LSI 9211-8i flashed to what is known as IT mode. This mode does not use any RAID and passes the disks directly to the operating system. The 9211-8i has 2 internal SFF-8087 SAS ports. SAS can support SAS as well as SATA. You can buy a breakout cable like this one which has 4 SATA connectors on it.
Here's a link to the 9211-8i itself, already flashed to IT mode. You can flash IT mode yourself, it is just a little "involved".
LSI 9211-8I 8PORT Int 6GB Sata+sas Pcie 2.0
Get a card like this and you can add plenty of drives. Note: you can find these much cheaper on eBay, and oftentimes they will include a couple SAS-to-4-SATA adapter cables. Also bear in mind they come in two flavors: one is RAID-controller mode and the other is simple expansion. But you can change that depending on how you plan to use it.
Consider a cheap SSD for your boot drive. Not necessary considering your needs but booting from an HDD gives me a migraine :)
edit: added more info
Awesome. I am by no means an expert but had to do a tonne of reading to come up with my setup and from what I can tell there are a few ways of doing the following:
PSU: Assuming your main PSU is strong enough, you could in theory use extenders to power your DAS from the same PSU, but I think there are a few downfalls to doing this. As a result I would suggest getting a redundant PSU (400W+). You can buy special adapters which fit into the 24-pin plug, or alternatively you can use a paperclip. Either way, what you're doing is making the PSU turn on as soon as it has power from the wall, as opposed to needing a power button. Sample jumper listing
SAS Controller board: This is the component you need plugged into a PCI slot in your primary NAS box. I bought a LSI SAS9200-16e16 - H3-25140-02B. I didn't need to do any flashing with this and it was plug and play with my NAS box which is running Ubuntu Server. Sample listing
SAS Expander board: This is the board which is connected back to the controller board. Think of the controller board as an outlet and the expander board as a multi-plug type situation. The ebay listing title I got was called HP (487738-001) 3G SAS EXPANDER G6, G7 - FH PCIe-x8 INT/EXT (468406-B21). You will also need a pci power blank to mount this board to and power it in place of a motherboard Sample listing
In theory from what I read it is possible there are alternatives to using one of these but this is the route I wanted to go down. The benefits as I understand them is that you can disconnect your DAS from your NAS in a very clean way. Sample Expander board listing
SFF-8088 SAS to SFF-8088 SAS cable: This is the cable which externally connects your NAS to your DAS. In theory, this is optional as you could go down the route of buying a SFF-8088 to 4xSATA cable. The downside of this however is it doesn't allow the NAS and DAS to be clean separate units IMO. These are very expensive at a lot of outlets but can be had for $14.99 here (I am UK based and even with import taxes this is cheap).
SFF-8087 SAS to 4x SATA cable: This is what connects the data from your DAS drives back to the SAS Expander board. A single cable connects to four drives so I have two of these connected to my controller at the moment. Sample listing
Prebuilt NAS from the likes of Synology is a huge waste of money. The ones that can transcode 1080p media properly cost upwards of $600, and that's without the storage.
For well under $600 (again excluding storage) you can build a kick-ass dual socket Xeon based server that will transcode one 4k stream without breaking a sweat, 2 would be a stretch, but maybe.
You won't even need a GPU. Just make sure your monitor has VGA input for setting up the server, and after the initial setup it can run headless. The passmark score on the 2 CPUs is over 20k which is plenty.
For more information check out https://serverbuilds.net site and Discord channel. Based on their guides I built a very capable server for under $400, and it does extremely well transcoding multiple 1080p streams simultaneously. Besides Plex Media Server it also runs all my automation like NZBget, Sonarr, Radarr, Bazarr, Tautulli, MCEBuddy (for converting 4k to 1080p,) Commskip (for removing commercials from recorded OTA programs,) and is my backup target for 4 Duplicati sources. The CPU load never goes above 50%, so I may throw all my home automation on there as well.
You want 8087-to-8482. You'll also need SATA power splitters; there are some linked in Amazon's "frequently bought together" section:
https://www.amazon.com/dp/B013G4FEGG/
What cable are you using to connect the drive to controller? I believe you need something like this.
Edit - If the card's BIOS is booting, then it's not related to the PCIe slot.
[Mini SAS Cable Connector SATA Power, Creation Internal Mini SAS 36pin SFF-8087 to (4) 29pin SFF-8482 connectors with SATA Power,3.3FT] https://www.amazon.com/dp/B013G4FEGG/ref=cm_sw_r_cp_apa_i_7kDSCbS994F6V)
Sorry for the delay, but I wanted to test thoroughly before answering.
I have an ASRock x399 Taichi and an AMD Threadripper 2950x.
Vive camera disabled for all testing.
Motherboard set back to defaults before setting the PCIe speed to Gen 3, bifurcating all the PCIe lanes to sets of x4s, and setting the PCIe switches to the latest gen supported (2).
PCIe slot #3 is notable for being a PCIe 2 x1 and the only slot the WiGig adapter worked at all in.
| WiGig in Slot | GPU in Slot | Result |
|---------------|-------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1 | 4 | WiGig card not found |
| 2 | 4 | WiGig card not found |
| 3 | 4 | WiGig card initializes, pairs, connects, works. BSOD after five-ish minutes in Tilt Brush and Richie's Plank Experience (separately, two tests were done) |
| 5 | 4 | WiGig card not found |
| 2 | 1 | WiGig card not found |
| 3 | 1 | WiGig card initializes, pairs, connects, works. BSOD after five-ish minutes in Tilt Brush and Richie's Plank Experience (separately, two tests were done) |
| 4 | 1 | WiGig card not found |
| 5 | 1 | WiGig card not found |
GPU cannot be in slot 5 because the case/power-supply is in the way of the two-slot wide card.
GPU cannot be in slot 2 because it is, again, two-slots wide and overlaps with slot 3 (the only one that works for the WiGig adapter.) No wait, it can, because I used a PCIe extension cable to route underneath the GPU. This was interesting. It BSODed before I could launch the game. Maybe the cable added too much latency or didn't deliver enough power or was suffering interference. Couldn't say.
I also tried putting the WiGig adapter on a 4x PCIe riser coming out of an NVMe M.2 slot. WiGig card not found.
HTC needs to fix their driver.
edit: I'm glad to see they're trying.
The thing to do is go to eBay and get a used Dell H310/9211-8i that’s been flashed with the latest IT firmware, then use SAS-to-SATA breakout cables. Like these ones: Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 1.6 Feet https://www.amazon.com/dp/B018YHS8BS/
Best to get slightly longer than you think you’ll need.
It’s also good to take off the heat sink, drill two holes in the corners, twist-tie a small fan to it, and then repaste. That’ll keep it from overheating.
Then you’ll have something far more reliable than the chinesium SATA expanders.
There aren't really 6Gbps or 3Gbps cables; SAS cables are SAS cables.
Sideband signal is for connections to backplanes.
Yes, the direction, forward or reverse, breakout does matter, don't buy anything that doesn't specify.
Here's a good one: https://www.amazon.com/dp/B018YHS8BS
For future reference when you have more money: I recently got one of these HBA cards from this vendor and it works fine and was properly updated/flashed: https://www.ebay.com/itm/Dell-H310-6Gbps-SAS-HBA-LSI-9211-8i-P20-IT-Mode-ZFS-FreeNAS-unRAID-High-Air-Flow/162834671120
I use this for a cable: https://www.amazon.com/gp/product/B018YHS8BS/
And use this for cooling the heatsink: https://www.amazon.com/gp/product/B009NQLT0M/
I have a similar build for freenas. I got a m1015 and needed a cable like this: CableCreation Internal Mini SAS(SFF-8087) 36Pin Right Angle Male to Internal Mini SAS (SFF-8087) 36Pin Male Cable, 0.75 Meter https://www.amazon.com/dp/B01KFEVQ4E/ref=cm_sw_r_cp_apa_eHCDAb964EF7P
>Do I just start buying a bunch of hard drives?
Pretty much, yes. Before doing that though, figure out what your current hardware can handle (especially if you go hardware RAID - older RAID adapters don't work with drives greater than 2TB). If you're going software RAID or something else, HDD size is generally a non-issue - just get the biggest you can afford.
>If so, how do I organize / connect them all?
Either straight to the motherboard (for "small" setups, or non-RAID setups), or buy a RAID adapter (the IBM M1015 is popular) for about $115 USD and then buy SFF-8087 to SATA breakout cables for about $20 USD each. This setup will allow you to connect 8 SATA drives to this card in a RAID 0, 1, or 10, OR you can flash it to be an HBA, also known as IT mode (to allow for JBOD + software RAID), which is what people generally buy these particular cards for.
Then it's just a matter of buying the drives and making sure they all physically fit in your computer case (tower, 1/2/3/4u rack server).
>Is there anything wrong with using an old Mac instead of Linux/Windows?
I'm not aware of any native Mac support for the above programs (only because I never bothered looking, as I don't own any Macs), but I wouldn't be surprised if the programs that run via Python (Couchpotato & Sickrage come to mind) work just fine; the others I really don't know about.
Generally people use some flavor of Linux (I'm using CentOS 7 myself - though Debian and its offshoots (like Ubuntu) seem to be more popular, in my opinion), Windows, or FreeBSD.
Again, go with what you're comfortable with - there's no wrong OS to run when you're starting out and just getting the hang of things (despite what some die-hards will tell you).
Google should tell you if the above programs run on a Mac though.
You should look into the sleeved mini-SAS to SATA breakout cables (http://www.amazon.com/HighPoint-Internal-Mini-SAS-SFF8087-Int-MS-1M4S/dp/B001L9DU88). They're not too expensive, and they look much better than the bright red cables you get with your card (looks like an LSI 9260-4i?). If so, I might have an extra BBU or CacheVault I can send your way so you won't have to disable the BBU requirement to enable writeback caching. Also, if it is an LSI card, they have a tendency to run pretty warm, so make sure you're cooling it enough.
I use an LSI 9211-8i flashed to IT mode, so it's plug and play.
Got it from ebay:
https://www.ebay.com/itm/New-IT-Mode-Genuine-LSI-9211-8i-8-port-PCI-E-Card-Bulk-pack-US-SameDayShipping/291641245650?hash=item43e72c3fd2:g:hboAAOSwOgdYxx1I:rk:2:pf:0
For a case you can look into something like a Supermicro 2U like this one:
https://www.ebay.com/itm/Supermicro-CSE-826BE16-R920LPB-2U-Server-Chassis-2x-920W-12-Bay-BPN-SAS2-826EL1/283181634754?epid=1203915313&hash=item41eef0d4c2:g:3icAAOSwG~Vbq8to:rk:5:pf:0
There is a Lenovo 2U that is talked about around here and on the DataHoarder sub, but I can't think of the model at the moment.
The only thing with that Supermicro I listed is that it looks like it has two U.2 connections to connect to the HBA. So you can either get an HBA that has two U.2 ports (costs more) and be done with it, or get the HBA I listed, pull out the U.2 to SFF-8087 connectors, and get one SFF-8087 to SFF-8087 cable and one SFF-8087 to SATA cable:
https://www.amazon.com/CABLEDECONN-Internal-36-Pin-SFF-8087-Cable/dp/B00S7KU3PC/ref=sr_1_3?ie=UTF8&qid=1543276984&sr=8-3&keywords=SFF-8087+to+SFF-8087
https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EMH8/ref=sr_1_3?ie=UTF8&qid=1543277008&sr=8-3&keywords=SFF-8087+to+sata
One will plug into the SAS backplane, and the cable with the breakouts will go to the two hot-swap bays in the back of the case.
When dealing with a case that has a backplane, you are looking for anything that is SAS2 or above if it is using a SAS expander; those backplanes will handle larger HDDs no problem. If it is a direct-attach backplane (i.e. no built-in expander), then SAS will work fine.
Take a decent decommissioned server or PC, throw a couple of $50-100 PCIe SAS controller cards in and connect drives using a few of these $13 cables, then wipe away. You're going to want a pretty beefy power supply.
I saw some people had issues with their Norco backplanes, but mine has been good. For a DAS setup, you'll need a power board (or maybe some cheap motherboard would work?) to turn on the power supply:
https://www.ebay.com/itm/SuperMicro-CSE-PTJBOD-CB2-Power-Board-for-JBOD-/361966696885?hash=item5446e579b5
The mid fan wall is a little tight to the backplane connectors so I went with the right angled ones.
https://www.amazon.com/gp/product/B01KFEVQ4E/
I originally bought a Y splitter for the fans, but it was loose fitting, so I swapped it out for one of these.
https://www.amazon.com/gp/product/B0763FGH6S/
If you want a semi-"professional" look, then you'll need a bracket coming out of each case, plus cables to connect them externally, plus more internal cables $$$... This is where a DIY DAS starts nickel-and-diming you. The other option is just running the SFF-8087 cable coming off the expander through the cases and snaking it into your ProLiant + 9211 card. With that scenario, the expander would have to be secured near the back of the case so a connecting cable would be long enough.
https://www.amazon.com/Cable-Matters-Internal-Mini-Mini-SAS/dp/B011W2F626/
https://www.newegg.com/Product/Product.aspx?Item=9SIA00Y51H7218
https://www.amazon.com/gp/product/B01GPD9QEQ/
https://www.amazon.com/CableCreation-External-26pin-SFF-8088-Cable/dp/B013G4F3A8/
Another thing is that expander card supports dual-link aggregation, so you can double its speed if you have 2x 8087 coming out of one expander, but then you can only have 16 drive inputs. I did the math in the past (not sure if it was correct, though), but I think a single-link expander will be the limiting factor if you try to run all 16+ drives at once. While not an issue for regular access, it would slow down a parity/backup type process that accesses multiple drives at once (SnapRAID comes to mind). Of course, I guess it's not an issue if you run this while you're sleeping or whatever, but overall 20-24 HDDs on one expander should be fine-ish...
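A rough sketch of that uplink math (assuming SAS2 lanes at 6 Gb/s with 8b/10b encoding, so ~600 MB/s usable per lane, an SFF-8087 being a 4-lane wide port, and ~200 MB/s sequential per modern HDD; real-world overheads are ignored):

```python
# Rough check on the expander-uplink math above. Assumptions: SAS2 lane =
# 6 Gb/s with 8b/10b encoding (~600 MB/s usable), SFF-8087 = 4 lanes,
# modern HDD ~200 MB/s sequential. Real-world overheads are ignored.

LANE_MBPS = 600
LANES_PER_8087 = 4
HDD_MBPS = 200

def uplink_share(drives, links=1):
    """MB/s available per drive when all drives stream through the uplink."""
    return LANE_MBPS * LANES_PER_8087 * links / drives

for drives in (8, 16, 24):
    single, dual = uplink_share(drives), uplink_share(drives, links=2)
    flag = " <- below HDD speed" if single < HDD_MBPS else ""
    print(f"{drives} drives: {single:.0f} MB/s single-link, "
          f"{dual:.0f} MB/s dual-link{flag}")
```

With 24 drives on a single link, each drive only gets ~100 MB/s during an all-drives scrub or parity pass, which matches the "limiting factor" concern above.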
Thanks for the reply,
So I just need to buy 2 of this cable, which go from the H200 to the SAS backplane, correct? And the other cables that go to the board I would remove, correct?
https://www.amazon.com/Cable-Matters-Internal-Mini-Mini-SAS/dp/B011W2F626/
Thank you
Board and CPU combo is good: enough single-thread performance for the Minecraft server, enough multi-thread for transcoding 3-4 1080p streams in Plex. (Rule of thumb is 2K PassMark score per 10 Mb/s of video.)
The board is just standard ATX size. It only has 6 SATA ports, so you will need to buy an HBA card to add more ports, or use fewer storage devices.
https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M
That card can handle 8 drives total, 4 per cable.
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC
PCPartPicker Part List
Type|Item|Price
:----|:----|:----
CPU Cooler | ARCTIC Freezer 34 CO CPU Cooler | $31.95 @ Amazon
Memory | G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 Memory | $64.99 @ Newegg
Video Card | Zotac GeForce GT 1030 2 GB Video Card | $84.99 @ Newegg
Case | Antec Three Hundred Two ATX Mid Tower Case | $94.58 @ Walmart
Power Supply | SeaSonic FOCUS Gold 650 W 80+ Gold Certified Semi-modular ATX Power Supply | $79.90 @ Amazon
Case Fan | ARCTIC ACFAN00119A 56.3 CFM 120 mm Fan | $8.52 @ Amazon
Case Fan | ARCTIC ACFAN00119A 56.3 CFM 120 mm Fan | $8.52 @ Amazon
| Prices include shipping, taxes, rebates, and discounts |
| Total | $373.45
| Generated by PCPartPicker 2019-07-26 08:45 EDT-0400 |
CPU cooler to keep the CPU quiet. Bit of overclocking headroom if you want the extra performance. Compatible RAM. Basic GPU that will be able to handle any 4K 60Hz HEVC video decoding. Case with tons of storage room. Efficient power supply for low noise, and a long warranty. Extra 120mm fans for front intakes, to keep the storage cool.
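The PassMark rule of thumb mentioned above can be sketched as a quick calculation (the 2K-per-10Mb/s figure is a community heuristic rather than an official Plex number, and the 8000-point CPU score below is just a made-up example):

```python
# Quick sketch of the "2K PassMark per 10 Mb/s of video" Plex rule of thumb
# mentioned above. This is a community heuristic, not an official Plex
# figure, and the 8000-point CPU below is a made-up example score.

PASSMARK_PER_MBPS = 2000 / 10   # ~200 PassMark points per Mb/s transcoded

def max_streams(cpu_passmark, stream_mbps=10):
    """Roughly how many simultaneous transcodes a CPU can handle."""
    return int(cpu_passmark // (stream_mbps * PASSMARK_PER_MBPS))

print(max_streams(8000))        # 4 typical 1080p streams at 10 Mb/s
print(max_streams(8000, 20))    # 2 streams of higher-bitrate video
```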
I got these forward breakout cables.
I'm thinking I may get an SFF8088 to SFF8087 (like this) converter and try running through port #8. If that still doesn't fix it, at least I can still use the cable to double my bandwidth once I fix the other problems.
Try an LSI raid controller https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M/ref=sr_1_2?crid=P5PBSXR61CYV&keywords=lsi+raid+controller&qid=1569558454&sprefix=lsi%2Caps%2C171&sr=8-2
You will also need these https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=pd_bxgy_147_img_2/136-5105212-8833038?_encoding=UTF8&pd_rd_i=B012BPLYJC&pd_rd_r=89475179-7e61-4fe7-915a-198096ed13b8&pd_rd_w=QQkz9&pd_rd_wg=B6Ezg&pf_rd_p=479b6a22-70ae-47a0-9700-731033f96ce8&pf_rd_r=0A6F6Y69MVNJQX5XC257&psc=1&refRID=0A6F6Y69MVNJQX5XC257
Be sure to back up the data on your current RAID, because the drives will get formatted. After installing the card, you will see a different boot process from it when starting up your computer. It should tell you to hit Ctrl-H (I think). After that, just read carefully, choose the hard drives you want to combine, and choose the RAID level you want. Then boot your computer normally and install the software I linked below. Be sure to extract it before installing, and use the complete installation. It might give you a login screen for the software; it wants your Windows login credentials. I was wary of it at first too, but that's what it wants. My memory is a bit fuzzy, but I believe this is where you finish setting up the RAID for Windows to be able to format it.
I'm using an LSI Logic SAS9260-4i for RAID 6. The only issue I've had with it is that I had to disconnect it while installing Windows. But once that was done, I reconnected it and moved on like normal.
edit: Went to the website for you and searched for the card's software management https://docs.broadcom.com/docs/12354760 that should be it.
edit2: added more information.
One of my debian setups is still in an old desktop case too :)
I run this raid card: http://www.ebay.com/itm/DELL-HV52W-RAID-CONTROLLER-PERC-H310-6GB-S-PCI-E-2-0-X8-0HV52W-/201657131656
I flashed mine to be in IT mode so that it doesn't act like a RAID card anymore, just acts like a bunch of lonely SATA ports: https://techmattr.wordpress.com/2016/04/11/updated-sas-hba-crossflashing-or-flashing-to-it-mode-dell-perc-h200-and-h310/ Help with this can be sought in the #DataHoarder IRC room, there are a few of us there who have done this on a few different models of cards now.
Got 2 of these cables so I can slap 8 disks in that sucker: https://www.amazon.com/dp/B012BPLYJC/ref=cm_cr_ryp_prd_ttl_sol_0
Then I installed ZFS as my filesystem and run my disks in a glorious 50TB array: https://github.com/zfsonlinux/zfs/wiki/Debian
I even slapped an SSD off a mobo SATA channel as a caching disk. Happy building! :)
So, in the server world, they're obviously not using SATA for high-density storage. One solution they use is SAS (Serial Attached SCSI). There are many different types of SAS ports, the most common in the homelab community (and with specific types of servers) being the SFF-8087 connector (mini-SAS) for internal storage. HBAs / RAID controller cards usually have 2 SAS connectors on them. They can be flashed (or bought pre-flashed) to what's called IT mode, which allows them to operate as JBOD (Just a bunch of disks). Something like this. If you shop around a bit you can find better deals on used ones (which you shouldn't be afraid to buy; these things are rugged as hell and kept in nice server environments). You can then pair this with one of two cables, Mini-SAS 8087 to SATA or Mini-SAS 8087 to SFF-8482. If you buy the latter of the two, it will work with any SATA drive you have as well, with the added compatibility for SAS drives (2 in 1!). SAS drives sometimes come in good deals on eBay @ 4TB for $50, so I'd go with the latter if you ever feel like you want bulk storage for cheap. No real harm in it.
Something like this, but bought on eBay for cheaper and flashed to IT mode to just "passthrough" the drives to the OS and not do any management by itself.
And a couple of these to connect your hard drives. :)
https://www.servethehome.com/ibm-serveraid-m1015-part-4/ I recommend having one of these on hand and https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC
So, if you look at the R720 Owner's Manual, you'll see that there are two SATA ports on the board. One is labeled J_SATA_CD and one is J_SATA_TBU, numbered 2 and 3 respectively. These are both standard SATA ports, but they're unfortunately SATA II, so only 3Gb/s. There is also a spot on the board listed as J_SAS_PCH (24) which you can plug an SFF-8087 breakout cable into to give you an additional 4 SATA connections. This port is attached to the built-in S110 "RAID" controller. This is sadly also SATA II. You can buy a SAS9211-8i card for under $100 that will allow you to connect 8 SATA III devices (you'll need a breakout cable), but you'll have to figure out how to power those internal 2.5" SSDs -- I didn't have to. I already had an M.2 SATA SSD, so I bought a StarTech PCI card which has two M.2 SATA slots on it. Because this thing is bus-powered, I didn't have to worry about how to power it.
This is the cable you'll be looking at, and you'll need two of them because they have 4 SATA cables each - https://www.amazon.com/dp/B012BPLYJC
As far as the LSI 9211-8i goes, it's one of the most recommended RAID cards. You can also look at the PERC H310 as it is just a re-branded version of that card and may be a bit cheaper. What RAID configuration are you looking at?
They are SAS/SATA compatible on the HDD side. You would need 2 of those LSI cards with 3 SFF-8087 to SATA breakout cables.
Cable Matters Internal Mini SAS to SATA Cable (SFF-8087 to SATA Forward Breakout) 3.3 Feet https://www.amazon.com/dp/B012BPLYJC/ref=cm_sw_r_cp_apa_i_bVg4Db0FS37N2
That card you linked has 2 SFF-8087 connectors; if that's what you have, you can use it if you want. Each SFF-8087 SAS port is basically 4x SATA ports, so both of your SSDs (and 2 more) should be able to hook up to just one of those ports. The breakout cable that goes from 1 to 4 looks like this: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC
You can get an SFF-8087 to SFF-8088 adapter that converts the internal 4x connection to an external one to connect something like that DS4243, but for about the same money I'd still recommend the LSI adapter if you can verify you can use one more of the PCIe slots. It's better to have a non-RAID card if you're going to use something like the ZFS filesystem.
What I mean about the eSATA is that it's a single SATA port, vs. the SAS ports that are 4x SATA ports. The DS4243 and others like it have a multiplexer that lets all 24 drives connect through that 4x connection. Those iStar eSATA boxes have some kind of multiplexer or controller as well, but how good/reliable they are vs. something enterprise-grade, I don't know.
There are lots of other used SAS disk shelves around as well (Dell, etc.); it's just about finding a good deal on one that has all its caddies and such. If you're lucky, maybe you can find one local on Craigslist, since they are so heavy and shipping is usually half the price.
Here is the HBA I use with FreeNAS.
LSI Logic SAS 9207-8i Storage Controller LSI00301 https://www.amazon.com/dp/B0085FT2JC/
You will also need these to connect drives to that card.
Cable Matters Internal Mini SAS to SATA Cable https://www.amazon.com/dp/B012BPLYJC/
For sure dude, get after it. I've been having a ball with a stack of 170GB 15K RPM drives that I won from /u/storagereview over on the /r/homelabsales sub; still getting my post together for them like 3 months later lol.
Picked up one of these for the machine housing everything: https://www.amazon.com/gp/product/B01C5TG82C as a hot-swap rack; it's pretty excellent. Then you just need something like these breakout cables: https://www.amazon.com/gp/product/B012BPLYJC and you're ready to rock. Do yourself a favor and get the 1.5 ft ones though, 1 meter is too damn long.
Have been having so much fun with this project and I don't even store any data on this array lol, just building and breaking and rebuilding RAID configs.
Servers have SAS backplanes behind the drive slots. Depending on the design, the drive cage might be SATA or SAS and plug into a generic connector on the backplane, although it's much more common to have cages without adapters, which just plug into the backplane. The drive would stick out of the chassis when using an adapter like that.
You select the correct backplane type when ordering the server. When buying from a proper vendor, servers are delivered as functioning units and do not require any internal cabling.
For homebuild like yours, you probably want something like this and this
Grab something like a LSI 9201-8i (like this) and a pair of SAS to sata breakout cables (like this).
Use a SAS to SATA breakout cable.
eg. https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EMH8/
You don't necessarily need two different cards, it just looks nicer that way and it's easier to disconnect the drives. If you want a proper setup, go with multiple cards and cables:
Another option, which is super janky but works is to just get some long SFF-8087 to SATA cables and run them out of the PowerEdge and into the disk shelf case. Definitely not as clean, and kind of a pain in the ass to work on...but much cheaper!
As for powering the whole thing, all you need is a power supply. You can use the paper clip PSU trick and then use the PSU switch to turn it on. Another option if you wanted to mount the power switch elsewhere would be to use a PSU power switch that plugs into the 24-pin connector and provides an actual toggle switch that you could mount wherever you like.
You can likely drive up to 8 SATA drives from this card; you just need the SFF-8087 to SATA breakout cables. It could support even more through SAS expanders / multipliers.
It will run up to 8 of them with 2 of these: https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EMH8/ref=sr_1_5?keywords=sff-8087+sata&qid=1555290849&s=gateway&sr=8-5
I've just built a FreeNAS box on a Dell T310: Xeon 2.4GHz X3430, 8GB. 4x3.5" bays, plus 2x5.25" bays you can do what you like with. I put two of these in to give me 6x3.5" and 4x2.5" bays. There are 6 onboard SATA ports, so I'm using a Dell H200 card for the 3.5" drives and my 2 SSDs, with 8087 to 4xSATA cables. It runs idle at about 80W, and kicks up to 120W when it's transcoding a Plex 1080p stream.
You could use whatever hypervisor you like on it.
It doesn't take just any mishmash of RAM, though, so either find one with 16GB+ installed already, or be ready to shell out a bit more to get yourself to the maximum of 32GB.
The Dell T320 is a great looking box too, and newer, with 8x3.5" bays, bigger RAM capacity, etc.
FWIW, it totally depends on how budget sensitive you are. Based on the hardware you're talking about, it sounds like budget matters quite a bit.
In the USA, you can pick up a Dell H200 or H310 PERC card for ~$30, get the appropriate SAS to 4x SATA breakout cables for under $10 each, and flash the card into IT mode. If you want to get fancy, you can point a fan at the chipset heatsink to account for the lower airflow most cases have in that area (I used a PCI-slot-cover-type fan adapter slow-shipped from China and a single 92mm fan I had around). It's an extremely popular way to get a few more ports into your datahoarder build, and the card is well supported by most software. I operate mine in pass-through, so I have no idea if you should use it for RAID or not.
https://www.ebay.com/itm/47MCV-047MCV-342-0663-DELL-PERC-H200-6Gb-s-PCI-e-SAS-SATA-Raid-Controller/253307058926
https://www.ebay.com/itm/Dell-PERC-H310-8-Port-6Gb-s-SAS-Adapter-RAID-Controller-HV52W-Replaces-Perc-H200/192120843762
https://techmattr.wordpress.com/2016/04/11/updated-sas-hba-crossflashing-or-flashing-to-it-mode-dell-perc-h200-and-h310/
https://tylermade.net/2017/06/27/how-to-crossflash-perc-h310-to-it-mode-lsi-9211-8i-firmware-hba-for-freenas-unraid/
https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EOEY/
https://www.amazon.com/dp/B07N3T1GJP
I have a server motherboard with SAS -> SATA ports so I can plug in one of these and get 4x SATA ports from one port. For power, I use these power splitters that allow me to neatly add power to a column of HDDs. I run one of the splitters per power line from the PSU. Before I had a server motherboard, I used this PCIE SATA card.
Sure:
The RAID card is a Dell H200 (same as an LSI 9211-8i, but I could get the Dell cheaper); I found it on eBay for ~$30 shipped.
I used something like these SAS breakout cables. I have two because it made the internal cable management much easier. Note that the T20 (probably the T30 too) needs the 90-degree version so that the two cables attached to the bottom drive cage don't have to be bent really tight to put the case lid back on. But if you use the 90 on the top drives, the connector angles up towards the top of the case and needs to be looped back down. I found it much cleaner-looking to just use separate cables and leave 2 SATA breakout connectors unused and tucked away.
I've very rarely seen or read about dmraid working well or being worth the complications it brings.
There are always better-equipped solutions than dmraid.
This is why I usually just tell people asking why their hard disks don't show up in Linux installers that motherboard-provided software RAID is generally not supported in Linux.
Have you considered upgrading to full hardware RAID? You would only need a capable RAID controller, breakout cables, and a minimum of two more matching hard disks if you wanted to upgrade to RAID 5.
The raid disk controllers i use are also extremely affordable, reliable and supported natively with Linux.
If you do buy one of these raid controllers you will need the proper cables to connect sata hard disks.
The overall benefit of using a physical hardware RAID controller is that the RAID volume and array are operating-system and hardware agnostic, meaning if your motherboard dies or your OS install implodes, you don't lose your RAID array contents.
Your RAID solution is a good one, but it carries hardware dependencies and design flaws.
I should also mention that specific model of LSI RAID controller only uses PCI Express 2.0, but it will work with PCIe 3.0 after a firmware update. If you wanted to just use a PCIe 3.0 RAID controller, the 9271-8i model is newer and somewhat faster due to newer DRAM on the controller, but the cost increases somewhat.
I've been using the 9260 models for perhaps 8 years and so far haven't lost a single raid volume or filesystem to a hard disk failure.
Ok yeah, they looked fine; just wanted to make sure they were the right connectors: https://www.amazon.com/CableCreation-SFF-8087-Female-Controller-Backplane/dp/B013G4EOEY/ref=mp_s_a_1_20?keywords=sata%2Bto%2Bsas&qid=1555191162&s=gateway&sr=8-20&th=1&psc=1
You'll need one SAS/SATA channel per drive bay, so 24 in total.
Your Intel S2600CP4 does provide up to 14 channels, one per connector on the mainboard - I guess 6 of them SATA, 8 of them SAS.
That means you're at least 10 channels short to be able to use all bays.
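The shortfall arithmetic above can be spelled out (assuming the guessed onboard split of 6 SATA + 8 SAS channels, and that an "8i" HBA adds 8 internal channels via two SFF-8087 x4 ports):

```python
import math

# The channel-shortfall arithmetic above. Assumes the guessed onboard split
# (6 SATA + 8 SAS = 14 channels) and that an "8i" HBA adds 8 internal
# channels (two SFF-8087 x4 ports).

bays = 24
onboard_channels = 6 + 8
shortfall = bays - onboard_channels
print(f"channels short: {shortfall}")      # 10

hbas_needed = math.ceil(shortfall / 8)     # 8i HBAs to cover the gap
print(f"8i HBAs needed: {hbas_needed}")    # 2
```

This matches the note below that option 1 might require two 8i HBAs if a single 16i card can't be found cheaply.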
IMO, the most viable options are:
Note that option 1 might require two HBAs, depending on your budget. You might find a cheap 16i HBA (~ 200 EUR) on eBay, but you might have to settle for two 8i HBAs (~ 50 EUR / each).
The RES2SV240 mentioned in option 2 and in other replies is pretty hard to find in the EU and might run you ~250 EUR. There are cheaper options available for a full-height case (> 3U).
Option 3 might break the bank. Take a look at eBay; the cheapest 24i HBA / RAID controller is at ~630 EUR at the moment.
That said, any plans to actually run SAS drives in the case? If not, then simple SATA controllers might be the cheapest option, although, personally, I would rather not.
Going all SSD? Or a single large array/vdev? Then you might not want to run an expander.
For reference:
It wouldn't be the prettiest thing, but a card like this (not necessarily the cheapest one, just the first one I found) with one of these cables will get you four external SATA ports.
I was looking at using a Sata extender or running them out the back to a separate case with the LSI Logic card using a cable like this
A SAS controller should be all you would need (along with any necessary drivers for the OS). Serverbuilds has a couple of suggestions that can be used on their NAS Killer build guides. One is the LSI 9210-8i (link: https://www.ebay.com/itm/182100909150) and another is the LSI-9201-8i SAS2 (link: https://www.ebay.com/itm/153441341142?ul_noapp=true). You'll also need the mini-SAS to SAS breakout cables to connect the two (link: https://www.amazon.com/Connector-Creation-Internal-SFF-8087-connectors/dp/B013G4FEGG/ref=as_li_ss_tl ).
Edited to remove shortened URL
Ok, so in another thread I started earlier today, /u/JDM_WAAAT may have convinced me to build my own server based loosely on his anniversary build. Couple of changes I just want to run by the fine people of this subreddit to make sure I'm not doing something stupid/incompatible. The items on top are straight from the guide, the items on the bottom of the list are the ones I have questions about.
-Single E5-2630 CPU
-GA-7PESH2 Mobo
-16gb DDR3 1600mhz Ram (incl with mobo)
-4TB Hard Drive (incl with mobo)
--------
-** This case, the Corsair Carbide 330r. The mobo should fit in there, I think, since it supports e-atx. If that case works, can I just use the fan that comes with it or will I need to add more cooling beyond the included fan and the CPU cooler?
-** Is this the correct breakout cable? The one linked on the guide is no longer in stock.
-**EVGA 450W 80+ Bronze PSU. The one in the build guide has gone way up in price, so does this one work?
-**Arctic Freezer 12 CPU Cooler. The one in the guide is no longer sold on Amazon, will this one be ok?
-**Operating System - I'd like to use Windows since I'm the most comfortable with it, would there be any issue with running Win10 on a flash drive with this build?
-ANYTHING else that I am missing that would be required for the build?
Thanks!!
In order to use the drive at full link speed (SAS3) you would need something like this: https://www.amazon.com/LSI-Broadcom-9300-8i-PCI-Express-Profile/dp/B00DSURZYS and this cable to go with it (for a desktop anyhow): https://www.amazon.com/CableCreation-Internal-SFF-8643-SFF-8482-connectors/dp/B01F378UF6
If you don't care about getting the full 12Gb/s from it, you can go with the cheaper LSI 9207-8i controller ( https://www.amazon.com/LSI-Logic-9207-8i-Controller-LSI00301/dp/B0085FT2JC ) and this cable https://www.amazon.com/dp/B013G4FEGG/ which would allow you to get 6Gb/s, which is the current max SATA speed anyhow. (SATA1 is 1.5Gb/s, SATA2 is 3Gb/s, and SATA3 is 6Gb/s, while SAS1 is 3Gb/s, SAS2 is 6Gb/s, and SAS3 is 12Gb/s.)
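Those link generations can be tabulated along with usable throughput (a sketch assuming 8b/10b encoding across all of these generations, so usable bytes/s is line rate times 0.8 divided by 8):

```python
# The link-speed generations listed above, with usable throughput after
# 8b/10b encoding overhead: usable MB/s = Gb/s * 1000 * (8/10) / 8.

GENS_GBPS = {
    "SATA1": 1.5, "SATA2": 3.0, "SATA3": 6.0,
    "SAS1": 3.0, "SAS2": 6.0, "SAS3": 12.0,
}

def usable_mbps(gbps):
    """Line rate (Gb/s) -> usable MB/s after 8b/10b encoding."""
    return gbps * 1000 * 8 / 10 / 8

usable = {name: usable_mbps(gbps) for name, gbps in GENS_GBPS.items()}
for name, mbps in usable.items():
    print(f"{name}: {GENS_GBPS[name]:>4} Gb/s link, ~{mbps:.0f} MB/s usable")
```

So a SATA3/SAS2 link tops out around 600 MB/s of payload, which is still more than any spinning disk can sustain.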
Something like this, perhaps?
I want to add USB 3.0 to my HP Slimline s5710f with a nearly blocked second PCI Express port. I use this as a Plex server. The available PCI Express port has my wifi card installed. I was planning on using this cable to go around the wifi card and have this card mounted above it. Will this work? Can I have the connector on that cable up against the motherboard? Do I need to shield it? Is there a better way to do this?
First question: I need to fix the two tubes connected to the flow meter. I would like to reroute them entirely, but need suggestions. What would be the better route to run them, or do you guys/girls think it's fine? I need to straighten them, but need to order more coolant first.
Second question: I have a sound card I need to connect via pci extension. What are recommended cables for sound cards? I already tried this one: https://www.amazon.com/gp/product/B017QQM80A/ref=ppx_yo_dt_b_asin_title_o02_s00?ie=UTF8&psc=1
This one gets power to the sound card, but my motherboard isn't seeing it. Any suggestions?
/u/clickwir has basically summed it up. The "header cable" that you are describing is actually a board known as a backplane, which your HDDs slot into; on the back are the SATA connections. "Hot swap" is fancy terminology for hard drives that can be easily accessed and replaced without shutting down or stopping the machine.
The reason a SAS add-on card is good is that each SAS port can take on 4 SATA connections, making your wiring look very sleek. The downside is that you will likely have to buy the card and won't be able to take advantage of all your motherboard's SATA ports.
I use several of these cables. This will net you 8 drives. You can always use a SAS Expander for more.
https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B018YHS8BS/ref=sr_1_7?ie=UTF8&qid=1494600842&sr=8-7&keywords=sas+to+sata
Here's a trustworthy cable to try: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B018YHS8BS?th=1
Use your on-board SATA controller for the drive(s) where the VMs will be stored. I have two SSDs in mine since I have a dozen VMs. These drives will show up in the ESXi storage module.
For "raw access" you will need a separate drive controller (aka HBA=host bus adapter) for the disks you want Xpenology to use for its storage pool. This separate drive controller will show up in ESXi and you enable "passthrough" in the hardware configuration screen. After you do this, the separate drive controller can be added during configuration of the XPE VM as an additional "PCI Device" and all drives connected to this controller will show up in XPE after DSM boots. ESXi will have no visibility to these drives at all. Configured this way the drives behave as if they were in an actual Synology box.
There are caveats, however, since not all drive controllers can be passed through, and not all seem to be compatible with Jun's bootloader. There are various LSI models that most people use, with the 9211-8i being one of the more popular ones. There are third-party cards (such as Dell and IBM) that are the same as the 9211-8i and can use its firmware. Secondly, the card needs to be flashed from IR mode to IT mode, which basically disables the built-in RAID function and presents the drives as a JBOD. Here is one example of how to flash an LSI 9211-8i into IT mode: https://nguvu.org/freenas/Convert-LSI-HBA-card-to-IT-mode/
You can also purchase preflashed cards on ebay. Do a search for "9211 it mode" and you'll find many listed. You should be able to grab one for $35-40. I personally use a Dell H310.
The 9211-8i has two SFF-8087 ports, each which supports 4 drives. Use a cable like this which has a standard SATA connector: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B018YHS8BS
Go here for info on how to configure XPE on ESXi: https://xpenology.com/forum/forum/50-dsm-6x/
There are also many Youtube videos available that shows how to configure XPE in ESXi.
Side note: Now that I've done some searches on RDM, I recall that my issue was that RDM was greyed out as an option for me. Maybe it will work fine for you, but the passthrough method is the recommended way. It's also used for other platforms like FreeNAS and unRAID.
I'm not 100% sure if there's some "gotcha", but it should be doable without problems. I'm trying to find the cheapest ones; might be worth trying out if your other solution for your drives is more expensive, more cluttered, or has some other problem.
https://www.amazon.com/dp/B01B1IVEM2/ref=psdc_6795231011_t1_B01BW1U2L2
A comment on there, for example:
> I'm using Intel VROC on motherboard (Z11PA-D8) and two mirrored Seagate Barracuda drives (ST8000DM0004).
You could do what I did: buy an LSI 9201-8i card and this from Amazon:
CableCreation Internal Mini SAS(SFF-8087) 36Pin Left Angle Male to Internal Mini SAS (SFF-8087) 36Pin Male Cable, 0.75 Meter https://www.amazon.com/dp/B01KFEVOPA/ref=cm_sw_r_cp_apa_i_DXrBCbKF36D3S
Did this so I could move a drive already set up with Windows and use it in the Dell.
https://www.amazon.com/dp/B01KFEVOPA
I ordered two of these - CableCreation Internal Mini... https://www.amazon.com/dp/B01KFEVQ4E?ref=ppx_pop_mob_ap_share
They're a bit long, but they work. Make sure you order the correct angle; mine were right angle. The card can be updated with the latest Dell firmware, and it can be placed in the storage slot. This allows drives of higher capacity. Currently have two 3TB drives running and will be adding a few 8s in a few days.
I'm assuming your 710 is the LFF with 3.5in drives?
CableCreation Internal Mini SAS SFF-8087 to Right Angle SFF-8087 Cord, Internal Mini SAS to Mini SAS Cable, Compatible with RAID or PCI Express Controller, 2.5FT /0.75M https://www.amazon.com/dp/B01KFEVQ4E/ref=cm_sw_r_cp_apa_i_Y0sKDbFEJ76W1
These are the exact cables I bought (buy 2) for my R710 II LFF to connect my H200 and it works perfectly.
Yes, you will need new cables. I also started with a PERC 6i, and the existing cables didn't fit my H200 - it has a different connector. I got a lot of info from this YouTube video: https://youtu.be/Lj_FfdPfYM4. The guy has a ton of other helpful videos and even sells pre-flashed cards (though you do pay a premium over doing it yourself).
I had a hard time finding the exact cables I needed (the YouTube video mentions what you need). I ended up with a pair that are a bit longer than needed and just hid the extra slack in the case. It worked out fine and saved me some money and time hunting for the exact cable. Here is what I got off Amazon: https://www.amazon.com/dp/B07CKXFKHT/ref=cm_sw_em_r_mt_dp_U_uQXSDbRECY2G1
What you really want is one of these; add some cables and you have 8 ports per PCIe x8 slot.
Something like this?
WD Reds, currently 1 4TB and 2 3TB installed, 3 more 4TB I haven't moved over yet until I can get it working.

Yes, they worked previously on this card when the card was installed in an old 2500K desktop. Prior to this switch they were in a Rackable 24-bay chassis connected with an external SAS cable to a SAS9200-8e H3-2560-02G.

Had to purchase new mini SAS cables for this as I didn't have any: https://www.amazon.com/gp/product/B01KFEVQ4E/ref=ppx_yo_dt_b_asin_title_o01_s01?ie=UTF8&psc=1

TL;DR: the only new components in this setup are the R720 and the mini SAS cables; everything else was previously working on another build.
https://www.amazon.com/dp/B07CKXFKHT?ref=ppx_pop_mob_ap_share
These are the ones I bought. They work for the H700 and H310.
Yep, forgot to mention that. You need those internal mini-sas cables. You can get them on Amazon. Something like this is perfect and I know for a fact this exact one will fit in terms of length. You will want a right angle bend on one side.
Btw, you can flash the H200/H310 RAID cards to IT mode, which makes them act as an HBA (each disk is passed through individually, so you can do software RAID like ZFS).
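Once the card is in IT mode and the disks show up individually, setting up a ZFS mirror is only a couple of commands. A minimal sketch — the pool name "tank" and the /dev/disk/by-id paths are placeholders, substitute your own drives:

```shell
# List the disks the HBA now passes through individually
lsblk -d -o NAME,SIZE,MODEL

# Create a mirrored pool from two of them ("tank" and the
# by-id paths below are made-up examples)
zpool create tank mirror \
  /dev/disk/by-id/ata-EXAMPLE_DRIVE_1 \
  /dev/disk/by-id/ata-EXAMPLE_DRIVE_2

# Confirm both sides of the mirror are ONLINE
zpool status tank
```

Using /dev/disk/by-id paths instead of /dev/sdX keeps the pool stable if the HBA enumerates drives in a different order after a reboot.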
I recently put in an LSI card to attach 8 HDDs to my array (I have 5 SSDs attached making up my cache - not ideal, but I had the parts so... ;-). Worked out of the box: no flashing, no updates. I ordered mine from Amazon.com. It was $75, but I did not want to risk it, as this is my server (worth the $25 to me for simple peace of mind).
https://www.amazon.com/gp/product/B0085FT2JC/ref=ppx_yo_dt_b_asin_title_o04_s00?ie=UTF8&psc=1
Combined that with the splitter cables (I used these: https://www.amazon.com/gp/product/B07CKX6HVV/ref=ppx_yo_dt_b_asin_title_o03_s00?ie=UTF8&psc=1 ) and I have had 0 issues.
It was by far the most highly recommended card, and I did not want to deal with a bunch of random issues to save $25.
I have one in my R720XD with flexbay; you only need one card.
As for cable length, I tried finding my purchase and I am fairly sure I went with two of these right-angle ones:
https://www.amazon.com/gp/product/B01KFEVQ4E/ref=ppx_yo_dt_b_asin_title_o04_s00?ie=UTF8&th=1
These cables work as well if anyone is looking for some on amazon.
These will work.
https://www.amazon.com/gp/product/B01KFEVQ4E/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
I went with right angle cables as well. These are the ones I used.
https://www.amazon.com/gp/product/B01KFEVQ4E/ref=oh_aui_detailpage_o00_s03?ie=UTF8&psc=1
You can pick up SAS cables that work with the H700 from Amazon; you don't necessarily need the Dell ones.
I got two of these ( linky ) and they work just fine, and fit the LFF backplane.
These are what I ordered a few weeks ago; they work great in my R710+M1015. They are out of stock now, but you might be able to look up the cable on eBay by the same mfg.
https://www.amazon.com/gp/product/B01KFEVQ4E/
if you want to go ghetto-cheap ...
maybe an LSI card that has internal SAS connectors (SFF-8087),
a cable like https://www.amazon.com/HighPoint-Internal-Mini-SAS-SFF8087-Int-MS-1M4S/dp/B001L9DU88
do something like this guy
https://www.reddit.com/r/homelab/comments/7crrhv/so_i_took_my_ghetto_homelab_and_made_a_homemade/
with the 5-in-3 adapters and a PSU, and rig up some LFF drives.
Or maybe repurpose a desktop case that holds 8+ internal HDDs, strip the guts, etc.
https://www.amazon.com/dp/B000NX1YCC/ref=psdc_6795231011_t3_B00OYB6P7I
or
https://www.amazon.com/High-Point-Internal-Mini-SAS-Int-MS-1M4S/dp/B001L9DU88/ref=sr_1_94?s=pc&ie=UTF8&qid=1537927377&sr=1-94
or
https://www.amazon.com/Supermicro-CBL-0388L-Mini-SAS-SFF-8087-Sidebands/dp/B00OYB6P7I/ref=sr_1_103?s=pc&ie=UTF8&qid=1537927463&sr=1-103
But if I had the ones he has, I would just play it by ear. They might work out fine.
Don't forget the cable as well. Mini SAS SFF-8088
I was looking at those pc-pitstop JBOD boxes. You'll save a bit of cash ($100 for the 8-disk model) by buying the one where you have to screw the disks into the trays. Unless you really need to swap them that quickly.
As for the HBA, something with an LSI2008 chip is probably a good choice - found one (Dell-branded) as low as $65 on ebay.
You'll also need to buy two SAS 8088-8088 cables - $25 for a 2m cable on Amazon. It's about another $300 for the models that have a SAS expander to run more drives using a single cable.
Is this the right cable to go from the R710 with an LSI 9200-8E to the MD1200?
https://www.amazon.com/Monoprice-External-26-Pin-SFF-8088-108185/dp/B008VLHJBI/ref=pd_bxgy_147_img_3?_encoding=UTF8&pd_rd_i=B008VLHJBI&pd_rd_r=7EYZV4X1AK6NF3MW1BTS&pd_rd_w=PpJW1&pd_rd_wg=dtb3b&psc=1&refRID=7EYZV4X1AK6NF3MW1BTS#customerReviews
2 of these: http://www.istarusa.com/en/raidage/products.php?model=DAGE416UTL-4MS
Available on Amazon.
You'll also want a couple of these or similar: https://www.newegg.com/lsi00276-sata-sas/p/N82E16816118147
Lastly, some cables to hookup between the two: https://www.amazon.com/CABLEDECONN-SAS26P-SFF-8088-External-Attached/dp/B00S7KTXW6
Drives will be hot-swappable and show up as if they are directly connected.
As far as JBODs go, that's pretty freaking cheap. Hook them up to whatever the hell you want that has PCIe slots.
If you're running Linux, software RAID is likely to be more flexible, and with a decent CPU behind it will give a hardware controller a run for its money in general operations. Go hardware if you want the benefit of hardware acceleration for things like RAID6 parity calculations.
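For the Linux software-RAID route mentioned above, the mdadm side is short. A sketch under assumptions — the device names /dev/sd[b-e] and the config path are placeholders for your own disks and distro:

```shell
# Build a 4-disk RAID6 array (two-disk parity); /dev/sd[b-e] are
# placeholder device names -- point these at your actual drives
mdadm --create /dev/md0 --level=6 --raid-devices=4 \
  /dev/sdb /dev/sdc /dev/sdd /dev/sde

# Watch the initial resync progress
cat /proc/mdstat

# Persist the array layout so it assembles on boot
mdadm --detail --scan >> /etc/mdadm.conf
```

The RAID6 parity math here is exactly the workload where a hardware controller's acceleration pays off; on a modern CPU, though, mdadm's SIMD parity routines are rarely the bottleneck for spinning disks.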
> I'm assuming the H200E would be inside the DAS.
No. H200E is a PCIe card that is installed in the server. The MD1200 or other DAS is nothing but a metal chassis, one or more SAS controllers and one or more power supplies. That's it. You'll need a pair of SFF-8088 cables... I bought these ones for H200E to MD1200 connection.
All you need is SAS 36 pin to SFF-8087 cables. I did the exact same setup last week and got 2 of these cables on Amazon:
https://www.amazon.com/gp/aw/d/B00S7KU3PC/ref=ya_aw_od_pi
Ouch. If you can't sell them you can pick up cheap LSI controllers for $20 and then some of these to get them to work in a standard desktop.
>https://www.amazon.com/CableDeconn-SFF-8087-SFF-8482-Connectors-Power/dp/B010CMW6S4
Or at least that's what I did last time I had some lying around.
So something like this?
https://www.amazon.com/CABLEDECONN-SFF-8087-SFF-8482-Connectors-Power/dp/B010CMW6S4?ref_=fsclp_pl_dp_4
I've ordered four of these and they are working great. H200 with a Rosewill 4500 chassis, SM board.
I just did the same thing. I bought an LSI SAS controller and a data cable:
Controller
Data Cable
Hi, thanks, this is the exact cable I’m using https://www.amazon.co.uk/gp/aw/d/B010CMW6S4?psc=1&ref=ppx_pop_mob_b_pd_title
There was also a cable that came with this https://www.ebay.co.uk/itm/LSI-SAS-9217-8i-RAID-HBA-SAS-CARD-8-INTERNAL-6Gb-s-SAS-SATA3-2-port-cable-/382576399397?txnId=965314397025 that I have tried.
I’ve been feeling the hard drive to see if it’s spinning, and I don’t think it has been; for whatever reason it isn’t drawing power.
SATA drives have worked on the card.
Drive is Western Digital 12TB DC HC520
https://documents.westerndigital.com/content/dam/doc-library/en_us/assets/public/western-digital/product/data-center-drives/ultrastar-dc-hc500-series/data-sheet-ultrastar-dc-hc520.pdf
This depends on the controller you are using. The backplane uses SFF-8087 connectors. If you are using an H700, your other end would be SFF-8087. If it's a PERC 6/i, it would be SFF-8484.
For the PERC 6/i, get something like this. Just make certain to take measurements so it will be long enough:
https://www.amazon.com/Tripp-Lite-Internal-mini-SAS-S510-18N/dp/B00193MCN0?ie=UTF8&*Version*=1&*entries*=0
H700 internal. Again take measurements:
https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Feet/dp/B011W2F626/ref=sr_1_4?s=electronics&ie=UTF8&qid=1464821930&sr=1-4&keywords=sas+8087
I have an r710 and use an h310, I used these (think the 700 uses the same):
Cable Matters Internal Mini-SAS to Internal Mini-SAS Cable 3.3 Feet / 1m https://www.amazon.com/dp/B011W2F626/ref=cm_sw_r_cp_api_7EcDzbAG31YNF
I picked up a couple of R410s a few weeks ago from a friend. I had a similar idea to yours, but mine was to remove the backplane completely and connect the HDDs straight to the onboard SATA ports. Then there was a power issue: how would I power the HDDs with the backplane removed? So I just went ahead and purchased an H310, flashed it to IT mode, and ran these to the backplane.
No, the SAS back panel will also have the single SFF-8087 port - it will look the same as on the Dell H200.
You just need a regular cable like this:
https://www.amazon.com/Cable-Matters-Internal-Mini-Mini-SAS/dp/B011W2F626/
I'm not familiar with the Dell T310, but I have a Dell R510 server. I too swapped the PERC 6i for an H700, but I'm working with a drive backplane. I had to swap the cables for Internal Mini SAS to Mini SAS cables.
https://www.amazon.com/gp/product/B011W2F626/
I was starting fresh, so I don't know if the RAID settings will carry over. Safety first: better back up all your data!
I am looking at rolling my own. Here are the components I am thinking about using.
Cables
https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC/ref=sr_1_3?ie=UTF8&qid=1483843852&sr=8-3&keywords=sas+to+sata
Case
https://www.amazon.com/Rosewill-Rackmount-Computer-Pre-Installed-RSV-L4412/dp/B00N9CXGSO/ref=sr_1_3?ie=UTF8&qid=1483843870&sr=8-3&keywords=rosewill+12
SAS Multiplier
https://www.amazon.com/HighPoint-16-Channel-Port-Multiplier-Rocket-EJ340/dp/B00DWV4SKM/ref=sr_1_1?ie=UTF8&qid=1483843893&sr=8-1&keywords=sas+expander
Not too worried about which type of RAID, as long as it supports the drives at full throughput. The 9211-8i looks pretty nice.
Would this work as a breakout cable for the drives?
https://www.amazon.com/dp/B012BPLYJC
So I just realized the P812 has the Mini SAS on the board whereas the P800 has the larger SAS ports on the board. Therefore, you'll actually need these breakout cables.
https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC
If you get two of those, you attach them to the internal ports and then that gives you 8 total internal drives. If you needed more than that, then you would get the SAS Expander, run SAS cables from the P812 to the SAS expander, and then use more of those breakout cables on the SAS expander to get more internal drives.
I haven't used the SAS expander for HP so I am not sure how well it works or what additional configuration you will need.
You would need those other cables I listed if you were going to use the P800, but I wouldn't recommend it, since that card only supports up to 2TB drives, whereas the P812 supports MUCH larger drives and up to 108 total drives (if you really wanted to).
Maybe getting some of these sorts of SATA cables would make it easier - https://www.aliexpress.com/item/4PCS-Free-Shipping-DIY-Black-sata-3-SATA-III-3-Data-Cable-Dual-channel-aluminum-foil/1582341251.html
Or get a SATA controller that uses Mini-SAS to SATA cables and get these - https://www.amazon.com/Cable-Matters-Internal-Mini-SAS-Breakout/dp/B012BPLYJC
Would make running separate SATA cables a bit easier and more manageable
Use a forward breakout cable?
https://imgur.com/a/OzIi5
I'm currently using an ASRock E3C224D4I-14S, which has an integrated LSI 2308 controller. It has 3x mini SAS ports which I use with breakout cables, giving me 12 drives. To get more ports I can use a SAS expander card or an extra PCIe controller.
>I was mainly thinking I could use the PCI express slot for another SATA card when the time came for upgrading, but again thats a future issue.
Don't go with a SATA card; use SAS. SATA cards are usually stuck with x1 slots and can only connect 4 drives (and even then the x1 slot can start bottlenecking you). SAS cards can connect SATA drives, and they usually have more PCIe lanes, so they will not be a bottleneck. All you need is a SAS card and SAS to SATA breakout cables.
https://www.ebay.com/itm/LSI-9211-8i-P20-IT-Mode-for-ZFS-FreeNAS-unRAID-Dell-H310-6Gbps-SAS-HBA/253955813684?epid=19006955695&hash=item3b20f23134:g:kKkAAOSwMjpb11RL:rk:2:pf:0
+
2x https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC
Do a little research on it before just buying and running with it, but it's generally straight forward.
I looked closer at our chassis setup and it is 4x 5 slot boards, so I'm not actually sure what protocol they run because SAS breakouts should max at 4. We're definitely proprietary compared to the options I am seeing online.
Closest equivalent commercially available part would be something like https://www.amazon.com/Mini-SAS-SFF-8087-Inch-Frame/dp/B00M36C2KK which effectively breaks out an internal SAS port to 4x SAS/SATA interfaces. Looking online, the DL320 should have an unused onboard port.
Alternatively https://www.amazon.com/Aplicata-Quad-NVMe-PCIe-Adapter/dp/B01MTU75X4 or https://www.aliexpress.com/item/NEW-The-adapter-card-PCI-E-16X-TO-4P-NVMe-SSD-Support-RAIDO-PCI-E-16X/32951136605.html will let you run NVME directly off the PCIE slot assuming there isn't some other expansion already there.
So in a dl320 you could probably do one of each so long as you have physical space left and you don't run out of power.
Forgot one other option, which assumes OP can find power and mounting points on their own. https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC
Onboard video chip or an APU.
An APU can save you from needing an onboard video chip or a dGPU (and losing the lone PCIe slot on mITX). But being an APU means less CPU power. That will be okay for most NAS usage, but when someone wants more, more cores are better. I've always asked here about an 8C/16T mobile APU with a very small iGPU for high-end laptops and similar applications. These applications either don't need a powerful GPU, like a server/NAS, or they already have a dGPU, like AIOs, high-end laptops, or SFF systems.

Zen actually supports ECC, but it's up to the motherboard maker whether to implement and fully support it.

8x SATA ports on mITX can be hard (although they exist). But things can be compact if we go more server-style with two mini SAS ports, each of which can handle 4x SATA with simple, low-cost adapters.

These boards should really have at least 2x 1GbE, or better 1x 10GbE + 1x 1GbE, or 2x 10GbE for more high-end versions.
https://www.amazon.com/SAS9211-8I-8PORT-Int-Sata-Pcie/dp/B002RL8I7M/ref=pd_ybh_a_30?_encoding=UTF8&psc=1&refRID=QZ552F2XPHH64TJYWE61
https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B012BPLYJC/ref=pd_ybh_a_29?_encoding=UTF8&psc=1&refRID=QZ552F2XPHH64TJYWE61
this should help
>My only gripe it's lacking in SATA III ports. Any tweeks to meet my above goals would be greatly appreciated.
Flash an LSI (or similarly branded) SAS RAID controller and you'll get 8 SATA 3 ports at your disposal (note: requires SAS to 4x SATA cables [e.g. this]). You can get ex-server ones quite cheap. /r/homelab could probably point you at which ones are worth getting now; I've not looked into it since buying my own 6 years ago.
I second this also. I have 4 LSI 9211-8is and I love them. The only issue I have is that my case needs 5 to be able to use all the bays, but the LSI BIOS will only let me use 4 cards. I paid ~$100 per card, well worth every penny. [LSI 9211-8i](http://www.ebay.com/itm/LSI-Internal-SATA-SAS-PCI-e-RAID-Controller-Card-SAS9211-8i-8-PORT-HBA-/111834008063?hash=item1a09d389ff:g:YhoAAOSwrklVEHVu), [Mini-SAS to SATA Cable](http://www.amazon.com/Cable-Matters%C2%AE-Mini-SAS-Forward-Breakout/dp/B012BPLYJC/ref=sr_1_2?ie=UTF8&qid=1451574103&sr=8-2&keywords=mini+sas+to+sata)
I don't have one to sell, but you may want to consider an IBM M1015. It has 2x SAS ports which can be turned into 8x SATA via two of these.
If the $250 price on Amazon is too much you can get them used (or "new" sometimes) on Ebay for as little as $50. Which also opens up some more options for faster shipment, I suppose.
https://www.amazon.com/dp/B012BPLYJC/ref=psdc_6795231011_t3_B008KF7H8U
edit: 9211-8i has two of those ports, so you'll need two sets of those cables, like this:
https://www.amazon.com/dp/B0728KRZYB/ref=psdc_6795231011_t3_B012BPLYJC
I'm going to give this one a whirl; hopefully there isn't some issue with my motherboard preventing it from working. Got an H310 pre-flashed off eBay and ordered two of these, so let's hope this fixes all my issues.
I tried to put another 8TB in my server this morning and it wouldn't work even on a motherboard port. Not sure what's up.
I need new cables to connect the back panel to the LSI, don't I?
I currently have https://www.amazon.com/gp/product/B012BPLYJC/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
Ordered one of these: https://www.amazon.com/gp/product/B012BPLYJC/
This situation is how I got started with FreeNAS. I started with 3 drives in a standard ATX case. Then had 8 drives in a Full Phanteks tower. Now I have a 15 bay Rosewill case. I built a rack for mine, but it could sit on a shelf somewhere.
I have one of these with these cables to expand on sata ports.
I have bought this cable which should be a forward breakout https://www.amazon.com/gp/product/B012BPLYJC/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1
I have tried to plug it in another port on the HBA, no luck either.
Yes, all drives have SATA power and SATA data and I can hear/feel the spinning.
Well, I only have one "gaming VM" (it has a Radeon HD 6970 and a pair of USB ports passed through, and I've assigned it four vCPU / 6GB RAM), but I'm doing a lot of the rest of your desires. This is going to be a somewhat long post, and I'm not terribly well known for being overly organized with my ramblings, so bear with me... ;)
My host is an HP Z800, and the OS is ESXi 6.0.u2 (with the free license). It has two Xeon X5677's with 32GB of DDR3 (8 4GB Corsair CMZ8GX3M2A1600C9B, if that happens to matter to you at all). Because of the memory ventilation duct, I had to remove the flimsy heat spreaders. It has two fans that blow directly over both RAM banks, and I've not had any issues without the heat spreaders at all. This is the only physical PC in my house, if you don't count my rarely used laptop (it primarily gets used on the rare occasion that I travel, and on game nights to control the RPG mapping VM).
For my primary datastore (where the VMs live), I have an LSI 9260-8i, with a Mini-SAS to 4 SATA (forward) breakout cable connected to one of these, populated with four PNY CS1131 SSDs configured in a RAID5 array. Within my Windows 10 VM, I ran CrystalDiskMark (with its defaults - I'm not terribly familiar with benchmarking), and this was the result. I suspect the slow write speeds are due to 1) parity calculations and 2) write-back cache being disabled because I don't (currently) have a BBU to connect to my 9260-8i.
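For anyone wanting a comparable number from the Linux side, fio can approximate CrystalDiskMark's default sequential pass. A sketch under assumptions — the target path is made up, fio must be installed, and the 1 GiB size is arbitrary:

```shell
# 1 GiB sequential write in 1 MiB blocks with direct I/O (bypasses the
# page cache, like CrystalDiskMark); /mnt/datastore is a placeholder path
fio --name=seqwrite --rw=write --bs=1M --size=1g \
  --filename=/mnt/datastore/fio.test --direct=1 --ioengine=libaio

# Clean up the test file afterwards
rm /mnt/datastore/fio.test
```

On a RAID5 array with write-back cache disabled, the sequential write figure from a run like this will show the same read/write asymmetry the CrystalDiskMark result did.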
At any rate, onto the VM's!
VM1 - "Gaming" / primary usage - Windows 10. As previously noted, it has four vCPUs assigned, 6GB RAM, and a 256GB vHDD on the aforementioned primary datastore. It has a Radeon HD 6970 and a pair of the host USB ports attached via 'pcipassthrough'. As the host lives in an electronics / networking closet in my spare bedroom, I use some Cables2Go RapidRun digital cabling (the specific part numbers I used are now discontinued) to bring the HDMI signal from that space to a spot on one of my living room walls, where the monitor is mounted. I used a cheap USB<->Cat5 extender to bring a USB port out to a cheap USB hub, to which is connected the Logitech universal receiver for my keyboard / mouse, and a crappy USB 'sound card' (which is only used for its MIC input). Before you ask, no, I don't notice any input / display lag with the 50' cabling between my keyboard / monitor / mouse.
VM2 - Media server, "nas" - Windows Home Server 2011. This VM also has four vCPUs assigned, along with 6GB of RAM, but only a 160GB HDD (the minimum WHS2011 requires for installation). This VM has the onboard Intel six-port SAS/SATA controller attached, along with a USB3 PCIe controller. I have a 4-in-1 IcyDock (different model to the one I linked previously, but very similar build), in which live three Samsung / Seagate 2TB 2.5" HDDs. These are controlled / presented to the OS by StableBit's DrivePool. All of my media / other data are stored on this pool. As this VM also handles my media services, it has Plex Media Server, Sonarr, and sabnzbd installed. All downloads / unpacks / media renames / etc happen on the DrivePool, since I don't care how long those operations take (I'm the only one that accesses my media).
VM3 - RPG mapping - Windows 7 - This VM is very basic : two vCPUs and 2GB RAM. It has a Radeon HD 7470 attached, which is connected via a 50' RapidRun analog (yellow, also discontinued) VGA cable. This VM is only powered on / used when I have an RPG group at my house.
All three VMs have Chrome Remote Desktop installed so I can access them from anywhere. The media / RPG VM's are exclusively controlled via this method.
I have a Nexus Player installed at both of my TVs. Each has the Plex app installed so I can watch whatever is on the server.
If you have any specific questions, please feel free to ask. :)
>So I want the power from that socket, but the data from the other SAS controller - does that make sense?
Yup. Get yourself some SATA power extension cables to hijack the power from those plugs, then get a SATA to SAS breakout cable like this to run the drives to the H700. It will take up some space in the case, but you should be able to make it all fit with some finagling.
I just picked up a Dell H310 PCIe card off eBay for $50 CDN a week ago and flashed to IT Mode using this guide
https://tylermade.net/2017/06/27/how-to-crossflash-perc-h310-to-it-mode-lsi-9211-8i-firmware-hba-for-freenas-unraid/
Also picked up 2x SFF-8087 to 4x SATA cables off Amazon
https://www.amazon.com/CableCreation-SFF-8087-female-Controller-Backplane/dp/B013G4EMH8/ref=sr_1_3?ie=UTF8&qid=1517507321&sr=8-3&keywords=SFF-8087
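The crossflash in the linked guide ultimately comes down to erasing the Dell firmware and writing LSI IT firmware with sas2flash. A very rough sketch of the final flashing stage only — the firmware and boot ROM filenames come from the guide's download bundle and may differ, and the full H310 procedure in the guide involves additional steps before this point:

```shell
# Record the controller's SAS address first -- you may need it later
sas2flash -listall

# Erase the existing flash on the first controller (erase type 6)
sas2flash -o -e 6

# Write the LSI 9211-8i IT-mode firmware and (optionally) the boot ROM;
# filenames are from the guide's bundle and are assumptions here
sas2flash -o -f 2118it.bin -b mptsas2.rom
```

Skipping the boot ROM (`-b`) is a common choice on FreeNAS/unRAID boxes: the card still works as an HBA, but you lose the ability to boot from attached disks and shave a few seconds off POST.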
So my next stupid idea is getting an 8087 to 4x SATA cable, plugging those SATA drives into the integrated port, and seeing what happens.
I'm trying to decide between upgrading my current build to support more hard drives vs selling it and building something new. The most simultaneous streams has been 4, generally there are 2 or fewer.
> It's my Plex & file server, but I have it hooked up to my home theater setup where I use PMP & Netflix. Every once in a while, gaming is done on the thing. I don't have any redundancy setup, I just make file list & registry dumps every few weeks using a script to local & cloud storage.
>
> I have a 1st Gen FireTV hooked up to my surround sound receiver as well, but the last time I used it GoT was still on. It pissed me off with Plex a long time ago, so I don't use it.
Current setup:
Cooler Master Storm Scout
AMD FX-8320
Gigabyte 970A-UD3P
16GB (2x8) DDR3
PNY GTX 950 2GB
---
128GB SSD (OS)
5 x 1.5TB (WD Black) only 1 installed
2 x 3TB (WD RED)
1 x 8TB (Shucc) not shucked yet, back from RMA
2 x 10TB (Shucc | will be adding more Black Friday)
BluRay Drive ** disconnected
Note: I am missing the hard drive rails.
Currently Plex Usage: 20TB
I saw someone recommend this:
Dell PERC H310 + (2) of the SFF-8087 to 4 SATA Breakout Cables.
I either need a new tower or find hard drive trays & 5.25 -> 3.5" adapters
I found this guide and it has a link to Broadcom's page (I guess they bought out LSI, sweet)
https://nguvu.org/freenas/Convert-LSI-HBA-card-to-IT-mode/
Seems legit?
I'm gonna give it a shot tonight after work, I have 2 of these cards (I need just one for now) and 4 of these cables, I guess once I confirm my firmware is correct I can swap out cables to check that...
https://www.amazon.com/gp/aw/d/B013G4EMH8/ref=ya_aw_od_pi?ie=UTF8&psc=1
I'm using two of these, myself:
https://www.amazon.com/gp/product/B0085FT2JC
They work great with my 4 and 10 TB HGST NAS drives, but I did have a problem with my Samsung 850 EVO SSD. There is a firmware update available for them that I haven't tried yet (I just moved the EVO to on-board SATA ports and it's fine).
Edit: You'll need cables like these (it doesn't come with them): https://www.amazon.com/gp/product/B013G4EMH8
It has 2 mini SAS ports, but you can use a breakout cable to connect it to 8 SATA drives.
This is one example: https://www.amazon.com/dp/B013G4EOEY/
And just grab a couple of these and I should be good?
I'm super sorry to be bothering you again, but is this the one I should get? https://www.amazon.ca/CableCreation-Reversed-SFF-8087-female-Target/dp/B013G4EQOM/ref=sr_1_1?keywords=sff+reverse&qid=1566273710&s=gateway&sr=8-1
It says reversed cable, where 4x SATA is the host and the target is SFF-8087.
https://imgur.com/a/KojZuIB/
Cable management's a little rough, but I'm using these 1m/3ft cables. If it were an option, I'd go with slightly longer cables because everything's a bit snug, but that's more to do with my system as a whole. If you'd like I can show the inside tomorrow when it's idle.
--I use an old quad-core i3 laptop with a 2-port eSATA Expresscard to connect the 4-bay Probox. Can connect it with a USB3 Expresscard as well, but I don't trust that configuration. I was also able to connect it to an older motherboard that had SATA port expansion with an internal-to-external SATA cable.
3FT eSATA to SATA male to male M/M Shielded Extender Extension HDD Cable 6Gbps
--If I need quicker scrub times, I can take the drives and put them in a 5-bay Sans Digital HDDRACK5 with a PC power supply, and hook them up to one of my SAS cards in the tower server I had built from Fry's a few years ago. It's LSI2008 with the cables routed externally.
Cable: External Mini SAS 26pin (SFF-8088) Male to 4x 7Pin Sata Cable
Cards: SAS9200-8E 8PORT Ext 6GB Sata+sas Pcie 2.0
Fan card: Titan Adjustable Dual Fan PCI Slot VGA Cooler (TTC-SC07TZ)
--Sorry for the late reply, BTW - haven't checked the forum for a few days.
You have exactly the right HBA already. Nice job! All you need now is SFF-8088 to SFF-8088 cables.
$14.99 https://smile.amazon.com/dp/B013G4F3A8
No. The cable I linked to is correct if you are using an H200 internally to the backplane in an R710. The cable you linked to is the connector on the old PERC 6i. I would honestly get rid of that card; it's no good. The backplane of the R710 and the H200 use the same connectors.
So wait now you are getting a DAS? I thought you were getting another server with its own processors and everything? In that case there is no reason to connect it with an R710. You only need to do that with jbod devices that are headless.
The H200 is used internally. The H200E is used externally and uses a different connector. That is the card you would use if connecting to a JBOD device. https://www.amazon.com/CableCreation-External-26pin-SFF-8088-Cable/dp/B013G4F3A8/ref=sr_1_2?s=electronics&ie=UTF8&qid=1499059230&sr=1-2&keywords=sas+external
You told me you were going to go with a Supermicro chassis with 12 bays that has its own processors and everything. That is a standalone server and has no business being connected to an R710. You are still going to need an HBA for it, though, which would need to be internal.
You get what I'm saying?
SFF-8088 cables aren't that expensive.
$15 on amazon
If you want to use those cables, you'll need to get a bunch of Molex splitters, because your power supply only comes with one peripheral cable that has 3 Molex connectors. You may want to look at this SAS cable as an alternative:
https://www.amazon.com/dp/B013G4FEGG/
Then you can use the SATA power cables that came with your supply to power most of the drives.
Regardless of which option you choose, you will need at least a couple splitters, because the supply only has 12 total connectors.
One other option if you don't want to deal with spaghetti in your case would be to roll your own cables. Check out this guide:
https://www.reddit.com/r/buildapc/comments/25pftl/discussion_making_custom_sata_power_cables/
Just an FYI:
if you opted to go with the Mini SAS cable with SATA power connectors and tried the alternative SATA 3.3V cable mod before this particular connector, you would be in for a nasty surprise.
The cable manufacturer shorted the 3.3V pins together within this connector, so the alternative 3.3V approach will not work: the sleep pin will get power from the adjacent 3.3V pin even if it is removed.
I find that the Kapton tape approach works well (I recommend the old 3M sticker sheet and razor blade approach), but a full removal of the 3.3V rail via Molex will likely work (though the risk of a Molex/SATA fire would give me some heartburn).
Yeah, I have these; I just have to hook SATA power to them. I'm also using these converters to go from 8088 to 8087.
Fantastic! I'll probably just buy the LSI SAS card you listed along with this cable then.
Most of my confusion stemmed from motherboards having "SAS support", but I'm assuming that's if you are hooking up directly to the motherboard as opposed to a PCIe controller?
Thanks for the in depth information!
Oh sweet, thanks.
After I read your comment I went back and looked at Amazon again - they have the SATA data and power cables here. Hopefully these work with the H310. I can't find any real information on how the SAS to SATA cables work. I keep reading about "forward", "reverse", and "breakout" cables, but I'm not clear on what any of those are. I'm mildly computer savvy, but this is my first foray into RAID controllers.
Anyway, enough of that - thanks much for your help!
Hmmmm...
Well, depending on your case, you could possibly use a PCIe riser cable and connect the HD60 Pro like that. Something like this: PCI-e PCI Express 36PIN 1X extender extension cable with gold-plated connector
You could then place it in an empty slot of your case lower down. Some cases also have a vertical expansion slot, so this could be a place to mount something. This is not something we've tested for, but it would be one option.
Someone here also found that using a PCI to PCIe adapter board actually worked, but we don't specifically recommend that option. I'm not sure how robust the USB 3 controller on that motherboard is, but if all else fails an HD60 S would be an alternative, as it offers the same image quality and low-latency preview as the HD60 Pro.
But without a place to plug it in... not sure what else to suggest.
You could white box it, but the Xeon CPU brand new is gonna be at least $200.
I went with a TS140. Something like this
Spend another couple hundred on RAM (example RAM here) and an M1015 + breakout cables for passthrough to FreeNAS. (That's what I do.)
EDIT
I'd suggest using ECC memory, especially with a ZFS file system.
You need these: https://www.amazon.com/Cable-Matters-Internal-SFF-8087-Breakout/dp/B018YHS8BS
I have acquired the nVIDIA Tesla K10's. Now for the rest of the shopping list:
* cheap enough to not be affected by changes in budget
From what I have seen thus far, I should be able to finish options 1 and 2 (because I start counting at 0) next month, leaving the server itself to be purchased in November or December. Might leave either the DAS or the sound card as an afterthought...
You've been gracious; would you please double-check my setup for me?
> The motherboard in my supermicro has a SAS2 controller onboard so I just have that 1 red sata-to-8087 miniSAS reverse breakout that connects the motherboard to the case and all 24 bays work.
I had no idea you could do that. So you just need this cable to connect all 24 drives? How does that work with the motherboard? Do you need a special controller built in?
My motherboard has a 32Gbps M.2 slot that I have no use for.
I would also like to add more SATA devices.
Any reason why this wouldn't work?:
M.2 to mini-SAS adaptor:
https://www.newegg.com/Product/Product.aspx?Item=N82E16813998031
mini-SAS to 4 SATA port breakout cable:
https://www.amazon.com/dp/B01B1IVEM2/
Well I decided to go for this. Used this cable to connect the cage to my LSI 9300-8i adapter. So far so good. I'm running a single ST8000NM0065 drive, but will eventually add a second in a ZFS mirror.
Will continue to check the drive as time passes.
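The single-drive-to-mirror plan above can be sketched as follows. This assumes the pool is named `tank` and uses placeholder `/dev/disk/by-id` paths (substitute your own device IDs; the drive model matches the ST8000NM0065 mentioned, but the serial suffixes are made up):

```shell
# Attach a second disk to turn the single-disk vdev into a two-way mirror
zpool attach tank \
  /dev/disk/by-id/ata-ST8000NM0065_SERIAL1 \
  /dev/disk/by-id/ata-ST8000NM0065_SERIAL2

# Watch the resilver finish, then scrub periodically to catch silent errors
zpool status tank
zpool scrub tank

# Ongoing SMART health check on each drive
smartctl -H /dev/disk/by-id/ata-ST8000NM0065_SERIAL2
```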
Glad to help out if you have questions, but the above workflow gets talked about pretty frequently around here, so you'll probably want to do a search. Tons of info about this around this sub.
edit: I don't know if you'll want left- or right-angle SAS cables for an R720. I'm pretty sure I used right-angle ones in my R720xd. It's confusing as hell which kind you need, because the "right" or "left" angle is judged looking along the cable toward the connector, not from the connector back toward the cable. In other words, look at the pictures on Amazon very closely to figure out which one you need.
CableCreation Internal Mini SAS(SFF-8087) 36Pin Right Angle Male to Internal Mini SAS (SFF-8087) 36Pin Male Cable, 0.75 Meter https://www.amazon.com/dp/B01KFEVQ4E/ref=cm_sw_r_cp_apa_5qoKAbRZNNAD5
These are the cables you need. They were $12/pair a couple days ago, but $16 still isn't bad.
I was recently (and still am) in the same boat. I purchased a used R710 to make into a Plex server and bought an H310 on eBay (I paid a premium for a pre-flashed IT-mode card so I wouldn't have to mess around doing it myself). I don't have any real reason for choosing the H310 over the H200; from what I read, both will work as long as they are flashed to IT mode. I just had to buy new cables and install them. I have only tried it with a 1TB hard drive in Unraid, and it seemed to work great. Ultimately I went with Unraid simply because I wanted to be able to add drives easily as my library grew and to have a parity drive, both of which Unraid does.
I tried the amazon ones then a left and right set from eBay. They were all wrong. I just got straight ones in the end.
https://www.amazon.ca/dp/B01KFEVQ4E?ref=ppx_pop_mob_ap_share
It’s amazon.ca; I hope that’s OK.
I don't think that's the right cable (though it might work). For my H700->R710 backplane, I needed right-angle cables that looked like this (these are way too short though):
https://www.amazon.com/CableCreation-Internal-SFF-8087-36Pin-Right/dp/B01KFEVQ4E
H310 and H700 should use the same cables
Dell PN P110M is what I ordered, of which you need two. They worked perfectly in my R710 to go from an H700 to the 3.5" backplane. I mentioned this before, and someone said those might be too short, but it's what I ordered and they worked.
http://www.ebay.com/itm/DELL-POWEREDGE-R710-PERC-H700-H200-6GBPS-SAS-SATA-RAID-CABLE-3-5-SERVER-P110M-/262926379196?hash=item3d37a228bc:g:AV0AAOSw9GhYeZ-2
Edit: oops, that one is from China too. But that's at least the right Dell Part Number to search for.