

A few things to consider when going for an ESXi server configuration for purposes beyond file storage, especially in the multimedia area:

WRT ESXi, I only meant it for whatever "server" machine ends up actually being used, and only for the machine that's already been created. My suggestion would be to start with a VM in Workstation, for example, to get it all working virtually first. That said, I did already know about a lot of these things, so I will address a few of them, but it is good to point them out.

1. Performance - Yes, it can be hit or miss, but unless you are KILLING the machine, these shouldn't be issues in a home environment. I can't imagine throttling a single VM on ANY resource, as it wouldn't make any sense.

2. Hardware limits - True, but again, in this particular environment it wouldn't need to be anything more. If it turned out it did, there are other routes to go (like V2P, virtual-to-physical migration).

3. Graphics Card Acceleration - True. I forgot that may end up playing a big part in any multimedia setup. This could be a deal breaker for a permanent setup, but you can always V2P back once everything is set up.

4. USB Pass-through et al. - I am not sure why you would think this to be the route to take with ANY server. It was already mentioned that the right way to handle this would be to pull/push the data from the server and do intensive multimedia manipulation on his regular machine. This takes the USB and Blu-ray issues out of the picture.

The lab environment doesn't give any indication of many things, but it does help to determine what he would want to look at for actual hardware. If it works out to be a multibox setup, you can figure out how to do so. If it were just to determine what software is needed, again it works. Even network issues can be found this way, though that can be (read: usually is) difficult to determine, and I am not sure if it carries over outside of the paid versions of ESXi. I only pointed out a great way to create a proof of concept, and if it works well in the PoC, then you can move it wholesale to another machine as is. If not, you have options, including a V2P if performance is the only issue and you are inclined to believe that it is due to the virtualization. It is really just a cheap way to figure out what's what.

Before you buy anything, I'd definitely give virtual a try to get a better handle on how to implement this project. No need to worry about hardware right away - they'll have built plenty more by the time you're ready to buy something.

Who knows? It might even end up staying in a virtual environment if it works for you.

Thanks. That last line is the point I was making about the ESXi. It isn't the only option, but if it works well, it is definitely the way to go.

I think one thing has become abundantly clear from this thread: if you want to implement an "ideal" solution for yourself, you need to understand both your own needs (comprehensively) *and* the technology available to meet your needs.

Now obviously you've been trying to learn about the tech by asking for options here, but the key point is this is just a *starting* point. We can point out possible options - and now there seem to be 3 or 4 on the table - but we can't really pick one "best" one for you. Even you can't do that right now, and that's because you don't fully understand the underlying tech.

I suspect that, even if you were to go with one of our recommended solutions, there would be some caveat in it, or lack of understanding of some feature of it, that would end up being an issue for you. The best way is for you to really understand all this stuff. That will take a lot of time but if you're willing to invest a lot of *money* into it, I think it only sensible to invest a similarly significant amount of *time* into learning the technology so you can make your own well informed decision.

What we've got here in this thread is a starting point for much, much more research. When you can look at all this stuff and say confidently for yourself something like "I feel RAID is the best solution", and be saying that from a position of understanding and knowledge of the technology underlying the options, then you'll be making your best decision. Until/unless that happens, I suspect you'll keep flip-flopping until you make a decision, maybe even one made more or less on a "coin toss", and so - not being fully aware of the limitations through your own deeper understanding - you may well be disappointed in what you end up with.

That being said I hope that's not the case. I hope you settle on a solution and it does everything you want it to. But I do frankly suspect this will just be the beginning and that whatever you go with, you will spend a lot of time trying to make it do exactly what you want.


That's my last "2 cents" input on the matter.

- Oshyan

I agree, Oshyan! Thanks for your input so far, it's been very helpful. What I really want to do is find someone who has already done this, go to their house, and just check out their setup. That one guy I posted about lives in Redondo Beach, which is just a few minutes away. I'm very tempted to ask him to show me his setup, but that's "weird" in this day and age for some reason.

What is the deal with DAS? It sounds like something I can connect directly to my workstation, without any OS in between. If I want to avoid all the complications from my previous setups, this is where I would start. I understand now what NAS is, and I like SANs, but even I can tell that's overkill for me at this point. So it sounds like DAS is what I want.

Here's where I'm confused... let's say I find some box that is a DAS that will connect to my workstation. I don't understand how it connects. Right now, I have an external box connected with two hard drives in it. The connection is eSATA. What's annoying is that each drive needs its own eSATA cable. I only have so many eSATA ports on my workstation, and they are currently full. How can I put 10 drives in an external box and connect it? I don't have, nor will I ever have, 10 eSATA ports. Nor do I want them. Isn't there a way of doing this with just a cable or two?

I like DAS because it's directly attached, with no middle man OS or anything. I'll do all my file management through my main workstation anyway, so that's fine. As far as access from remote places, that's negligible at this point, and I've already figured out how to cleverly avoid those kinds of complications.

So my two questions right now are:
1) What is a box that is NOT a rackmount thing that will hold 10+ drives?
2) How do I connect this box to my PC using fewer cables than the number of drives (ideally one cable)?

I will absolutely not consider USB as the connection, nor will I consider FireWire. I like eSATA very much. I'm hoping there's a better way with some kind of card that plugs into a PCI bus or something.

OK, I went back and reread this thread (ping-pong is exactly the correct term!). But that's how I work, sorry if it's frustrating. Believe it or not, all of this helps me a lot, and I appreciate everyone's assistance. Good will come of this, I promise!

So I went back to some of the previous links, and the product I'm really liking is this one: This will be my box of drives. This box will be directly connected to my current workstation through these SAS cables (which I don't know much about right now). But it sounds perfect. I'll need to buy a SAS controller for that box, which is relatively expensive. I think this is the item that a few of you here have talked about to make sure I get business grade quality, which I want to do. If you have any advice as to which manufacturer/model I should get, please let me know.

Next is the connection to my computer. My computer doesn't have any SAS stuff right now, so I'm guessing I need to buy a card or something for it. Any direction on suggested models would be appreciated. Same goes for any cables involved, in case I have to be careful about it.

So that gets mucho storage attached to my desktop. After that, I can experiment to my heart's content. I'll try doing some server stuff with VMs. But mostly I'll just use the storage directly from the desktop.

How is that? I can't really see much overkill in this one, and it's relatively cheap. The box and related items will run me about $1500, and then I'll just hunt around for good hard drive deals. The ones with 5-year warranties are my favorites.

I don't *think* anyone ever suggested USB or FireWire. I wouldn't consider anything less than eSATA; I know I wouldn't suggest it unless it were *MAYBE* USB 3.0, and not even likely then. As for DAS, I believe you are right in that being something very close to what you want. I believe most DAS solutions provide a proprietary card that connects their solution to the machine, making it essentially appear as another internal drive controller. However, I have never really researched them or even know much about them, so I can't guarantee that statement. Also, I don't know that any of them allow you to build your own out of your own box like that.

If you really want to roll your own SAN, there is a way to do it. It will take a lot of time to get set up, but you can do it in any form you desire. You may even be able to build a DAS on it, I don't know. It uses OpenFiler - a SourceForge project, IIRC. I don't know how responsive it would be compared to one you can just buy, but it is always an option if you really want to become a storage expert.

On the other hand, a really nice (if somewhat expensive and otherwise potentially limited) option would be buying something like a Drobo. They have pretty much everything from a basic 2 disk NAS up to a 24 disk rack mount SAN. Many have multiple setup options. I think it is probably the easiest, most elegant storage solution for SMBs in general, but that elegance doesn't come cheap, and it may not be the most efficient system (performance-wise) out there.

HDDs are a personal choice, but if you use WD 2TB WD20EARS (Caviar Green) drives @ $79.99 each from Newegg, then the total is approximately $1836.

If you go for WD 2TB AV-GP WD20EURS drives (designed for always-on streaming) @ $89.99 each, then the total is approximately $1936.
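As a sanity check on those two figures, here's a minimal sketch of the arithmetic. The 10-drive count and the ~$1036 non-drive base (enclosure, controller, cables) are assumptions inferred by working backwards from the quoted totals, not numbers stated outright in the post:

```python
# Back-of-envelope totals for the two drive options.
# ASSUMPTIONS: 10 drives, and a fixed non-drive cost of ~$1036
# (enclosure + controller + cables), inferred from the quoted totals.
base_cost = 1036.10
num_drives = 10

drive_options = {
    "WD20EARS (Caviar Green)": 79.99,
    "WD20EURS (AV-GP)": 89.99,
}

for model, unit_price in drive_options.items():
    total = base_cost + num_drives * unit_price
    print(f"{model}: ${total:,.2f}")
```

Swapping in different drive prices (or a different drive count) makes it easy to re-run the comparison as deals come and go.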

NOTE: I don't know what kind of performance you'd get over Infiniband links, but since they're used for server farms, etc., they must be halfway decent. SJ or 40hz would be more likely to have some kind of experience/knowledge with/of them.

Also, this is still more expensive than simply building a new PC with the requisite amount of ports in a decent size case and running it using FreeNAS, WHS or similar.

4wd, thanks for your posts above. I have to study them this week. But I have a question; maybe you or lotus can help: I've read that if I get a SAS controller card for whatever box I end up using, the controller is compatible with SATA drives. Is this true, and are there any bad side effects of doing this? I like the SAS controller with SATA drives because the SAS controller will allow me to have fewer cables, and SATA drives are way more convenient to buy and use vs SAS drives, which I'm not really interested in. Also, the total cost in the end is not that different. So I like the SAS way, and you've posted Addonics' clever SATA setup. I've always liked Addonics, but your setup seems to be a little more complicated and more cable-y than what I'm picturing in a SAS setup. Any thoughts on that?

I have no idea whether SATA port multipliers will work off of a SAS host controller, so I'm going to err on the side of caution and say they won't.

Stoic's probably the best guy to ask about this stuff... him being an HP tech and all :D

Regarding the external cables: Infiniband cables are used for the connections; see Wikipedia for SAS and Infiniband. If you can't use port multipliers, then you'll need more Infiniband cables. If you can use port multipliers, then it's conceivably one (1) cable per twenty (20) SATA HDDs (assuming a four (4) port SATA compatible host at the PC end).
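The "one cable per twenty SATA HDDs" figure falls out of multiplying the lanes in the cable by the ports on each multiplier; a quick sketch (the 5-port multiplier size is an assumption for illustration, since multipliers come in other sizes too):

```python
# Drives reachable over a single external cable.
# ASSUMPTION: 5-port SATA port multipliers; the 4-lane figure is
# from the SFF 8470 connector mentioned in the thread.
lanes_per_cable = 4          # SFF 8470 Infiniband-style cable: 4 lanes
drives_per_multiplier = 5    # one 5-port multiplier hung off each lane

drives_per_cable = lanes_per_cable * drives_per_multiplier
print(drives_per_cable)  # 20
```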

Addendum: Plus you'll need some way to break out from the storage box's Infiniband input to individual SATA connectors, so possibly an Infiniband socket to SAS plug adapter, then something like this.

OOPS! To answer your final question: the number of cables will be the same or more using SAS controllers (due to the ambiguity of port multiplier suitability).

Please NOTE: Nothing I'm saying in my above posts was in any way designed to push you towards Addonics products or even SATA/SAS. Their site just happens to have all the stuff to put together what you wanted without jumping all over the place. As I said previously, SJ, 40Hz or f0dder, (sorry if I've missed someone), are probably the best people to ask about the suitability of what I proposed.

Added addendum: If you have 4 spare Intel SATA ports on your motherboard, (or will be making a total of 4 available by moving HDDs to the external enclosure), then a slightly different setup as follows:

4wd has inspired me to create a similar list of things to do before my project can be ready for purchasing. I'm now thinking more clearly about these things, thanks to everyone's help here. So my list is posted below, and I'll continue to update it as I progress. I'm not going to tally any price totals right now because too many things are up in the air. My first task is to figure out what my "x" and "y" numbers are so I know how much space I need to get. But anything with a question mark needs to be resolved before I'm done with this.

You require 8 eSATA ports on the computer, one for each drive - in this type of wiring situation I'd forget about RAID if you were thinking of it; too many chances for a cable to be dislodged.

It's the same as stacking 8 separate external eSATA enclosures one on top of the other.

I don't know whether you've noticed or not, but:
DAGE840DE-ES - Trayless
DAGE840-ES - The one you selected; doesn't say trayless.

Alternative using their products:

Storage end:
1x DAGE840DE-2MS - 8 bay trayless storage tower, using the following as inputs (not the ones it comes with):
2x ZAGE-D-4SA70 - These as a replacement for whatever comes with the box.

You'll also need 8 spare SATA ports in your computer, but you would have needed them anyway for the box you selected, so get a good quality multi-port SATA controller (there are alternatives to this, e.g. a SAS controller with Infiniband output or a SAS to SATA adapter).

The above only requires 2 external cables.
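The two-cable count comes from the same lane math; a minimal sketch (the 4-lanes-per-cable figure comes from the SFF 8470-style connectors discussed earlier in the thread):

```python
import math

# Cables needed for the 8-bay tower above, with no port multipliers:
# each external cable carries 4 SATA lanes, one lane per drive.
num_drives = 8
lanes_per_cable = 4

cables_needed = math.ceil(num_drives / lanes_per_cable)
print(cables_needed)  # 2
```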

« Last Edit: September 13, 2011, 01:13:22 PM by 4wd »

I'm liking this. Here's a question about the cabling:
1) If I do a SATA to SAS thing, can I get away with just one cable connecting the storage tower to the desktop?

2) Let's say I want to put the DAS in another room. How would I connect the DAS to my desktop? The SAS or SATA or eSATA cables are only a few feet maximum. So would I have to connect the tower to my router, which would bring it to my desktop? Or does that change the whole setup into a NAS and now I need to have a motherboard, OS, etc. on the tower?

Infiniband connectors support a maximum of 4 devices; see Wiki - under Architecture (SFF 8470).

I'm not saying you can't do it, just that you probably can't using the normal Infiniband cables. Addendum: Unless you use 2 port multipliers - then you only need 1 cable.

Maximum length of eSATA is indeed a few feet, 6.6 feet or 2 metres. SAS is a little longer. From Wiki again:

Because of its higher signaling voltages, SAS can use cables up to 10 m (33 ft) long, SATA has a cable-length limit of 1 m (3 ft) or 2 m (6.6 ft) for eSATA.

However, remember you're trying to work with SATA HDDs, not SAS HDDs, which use a higher signaling voltage. So I'm guessing that you might have trouble trying to push it past 3 metres unless there is an active SAS host/client on the ends of the cable.
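Putting the quoted limits side by side makes the "another room" question concrete; a small sketch (the 5 m room-to-room distance is just an assumed example, not from the thread):

```python
# Cable-length limits quoted above (in metres), checked against a
# hypothetical 5 m run to the next room.
# ASSUMPTION: the 5 m distance is illustrative only.
max_cable_m = {
    "SATA (internal)": 1,
    "eSATA": 2,
    "SAS": 10,
}

room_distance_m = 5
for interface, limit in max_cable_m.items():
    verdict = "reaches" if limit >= room_distance_m else "too short"
    print(f"{interface}: {limit} m -> {verdict}")
```

On these numbers only SAS would reach, and per the caveat above even that assumes SAS-level signaling at both ends of the cable.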