Is there any downside to using a RAID 0+1 software solution, other than the reduced storage size? I'm in the middle of building an editing workstation and I'm ready to configure my hard drives.

I'm interested in the speed that RAID 0 should provide, but I also want the redundancy of mirroring. Right now I think the RAID 0+1 solution makes the most sense for me, but I don't know what I don't know... Will mirroring (adding RAID 1) be slower than a plain RAID 0 solution? If so, does it eliminate the RAID 0 speed improvement? Why don't more video people use this setup?

Though the motherboard has RAID capability, my understanding is that it won't let you create a non-RAID operating-system drive alongside a RAID storage array. I was planning to do this with the Windows disk-management utility instead. And lastly, is that doable now that Windows is already installed?

Software RAID can impact performance, but on a decent 2 GHz or faster machine you normally shouldn't notice. That said, mirroring and striping together might slow you down a little more. Consider getting a Promise FastTrak RAID card. Also, at one point I thought it was only possible to do one or the other in software, but that could have changed.

In the end, I'd strongly suggest making use of the mirroring feature (RAID 1) rather than RAID 0.

Unless your motherboard is really, really cheap, you will not see a performance difference with a $100 Promise RAID card, which is technically still software RAID. With a $1000 server-class RAID card with onboard cache, that's when you'll start to notice. Honestly, software RAID is pretty good. Doing it from within the Windows disk manager is definitely a viable option, and free.

RAID 0+1 is not a very efficient use of four drives; you might be better off with RAID 5.
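To make the efficiency comparison concrete, here's a small sketch of usable capacity per RAID level (the 4 x 1 TB drive counts are just illustrative numbers, not from the thread):

```python
# Usable capacity for common RAID levels, given n identical drives of
# size drive_tb each. Drive counts and sizes are illustrative only.

def usable_tb(level: str, n: int, drive_tb: float) -> float:
    if level == "raid0":
        return n * drive_tb            # striping, no redundancy
    if level == "raid1":
        return drive_tb                # all drives mirror one drive
    if level == "raid10":              # also written 0+1 / 1+0
        return (n // 2) * drive_tb     # half the drives hold mirrors
    if level == "raid5":
        return (n - 1) * drive_tb      # one drive's worth of parity
    raise ValueError(level)

for level in ("raid0", "raid10", "raid5"):
    print(level, usable_tb(level, 4, 1.0), "TB usable from 4 x 1 TB")
```

With four 1 TB drives this gives 4.0 TB for RAID 0, 2.0 TB (50%) for RAID 10/0+1, and 3.0 TB (75%) for RAID 5, which is the efficiency gap being pointed out.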

If the RAID card doesn't have an onboard cache/memory, then it is "fake hardware RAID" and is likely no better than your motherboard's.


In general I agree with you, but the links you provided are not really up to date. The 3ware card is PCI-X only and has no cache. IMO there is only one really good RAID controller, and that is the Areca ARC-1680iX series, with up to 4 GB of cache and a battery backup module (BBM). Promise is about the worst card you can get, but you get what you pay for. Onboard controllers (ICH-R chips) are certainly comparable to Promise, but no match for a really good RAID controller.

Interesting stuff. I'd have to read up on these more advanced cards to see what the return on investment is for someone who is just doing some video editing on their own. I can easily see the value in a server environment that is feeding data out to thousands of users, though.


That 3ware card has 256 MB of cache, but you are correct that I am not up to date. I don't know what the good ones are now; I only wanted to point to it as an example of an alternative. You certainly do get what you pay for!

In my opinion, SAS RAID is extreme overkill for a video workstation. Stick with SATA and spend the $$ on CPU or RAM.


SAS does not by definition mean SCSI disks. Take the Areca ARC-1680iX as an example: it makes sense to attach SATA disks to it, not SCSI. In an ideal situation you could envisage 8 internal SATA disks in RAID 50 for media and 4 external SATA disks in RAID 3 for projects/previews. Such a setup would set you back around €930 for 12 1 TB SATA disks and €700 for the controller, BBM and 2 GB cache (based on current Dutch prices excl. sales tax).
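Rough arithmetic on that setup, using the prices from the post; the usable-capacity figures assume two 4-drive RAID 5 groups inside the RAID 50 and a 3+1 layout for the RAID 3 set, which is my reading, not stated in the post:

```python
# Cost per usable TB for the 8-disk RAID 50 + 4-disk RAID 3 setup
# described above. Prices from the post (ex-VAT); layout assumptions
# (two 4-drive RAID 5 groups, 3+1 RAID 3) are mine.
disks_eur, controller_eur = 930.0, 700.0
raid50_usable = 2 * (4 - 1) * 1.0   # two 4-drive RAID 5 groups, striped
raid3_usable = (4 - 1) * 1.0        # one dedicated parity drive
total_tb = raid50_usable + raid3_usable          # 9 TB usable of 12 raw
eur_per_tb = (disks_eur + controller_eur) / total_tb
print(f"{eur_per_tb:.0f} EUR per usable TB")
```

That works out to roughly €181 per usable TB, controller included.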

I have two 500 GB drives in software RAID 0 and it works great. I don't see the need to buy hardware RAID; motherboard RAID is hardware too, just not dedicated, and IMO workstations are fast enough now that there is plenty of processing power to run everything. I would definitely suggest RAID 0 for active projects: I edit all my raw footage on my RAID 0 drives, and as soon as a project is complete I copy it over to a backup. There is a big performance increase with RAID 0; my HDV plays smoothly and I can edit 4-6 HDV tracks simultaneously in Vegas.

You'll need four drives to use RAID 10. I probably wouldn't put the boot image on any RAID except RAID 1, which is often done in high-availability workstations.

RAID 1, RAID 0, and the combo, RAID 10, are well implemented in Mac OS X, and they might be fine done in software on a PC too. RAID 5 and 6 calculate parity, so a better RAID card with its own CPU is the way to go for those levels. There's no parity calculation in mirroring or RAID 0 striping.
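The parity math the card has to do is essentially XOR across the stripe; a toy sketch of how a RAID 5 stripe can rebuild a lost block (the 4-byte "blocks" are made-up data):

```python
from functools import reduce

# RAID 5 stores, per stripe, the XOR of the data blocks on the other
# drives. Any single lost block can be rebuilt by XOR-ing the survivors.
stripe = [b"\x01\x02\x03\x04", b"\x10\x20\x30\x40", b"\xaa\xbb\xcc\xdd"]
parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*stripe))

# Simulate losing drive 1 and rebuilding its block from the rest:
survivors = [stripe[0], stripe[2], parity]
rebuilt = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))
assert rebuilt == stripe[1]
```

Doing this XOR (and, for RAID 6, the second Reed-Solomon-style syndrome) on every write is the work that a dedicated controller CPU offloads from the host.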

Use SATA/eSATA, or FireWire as a second choice. USB 2 is not a good setup for video editing.

Considering the cost per megabyte of storage, a good RAID 5 card looks expensive. The 50% storage efficiency of RAID 10 may end up cheaper than 80% storage efficiency with RAID 5 plus a $500 RAID card.
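A quick sanity check of that claim; the $100/drive price is a placeholder I picked, only the $500 card figure comes from the post:

```python
# Back-of-envelope cost per usable TB. drive_price is a placeholder
# assumption; the $500 card price is from the post above.
drive_price, drive_tb = 100.0, 1.0

def cost_per_usable_tb(n_drives: int, usable_fraction: float,
                       card_price: float = 0.0) -> float:
    total = n_drives * drive_price + card_price
    return total / (n_drives * drive_tb * usable_fraction)

raid10 = cost_per_usable_tb(4, 0.50)                    # software, free
raid5 = cost_per_usable_tb(5, 0.80, card_price=500.0)   # 5 drives + card
print(f"RAID 10: {raid10:.0f}/TB, RAID 5: {raid5:.0f}/TB")
```

At these assumed prices software RAID 10 comes out at $200 per usable TB versus $250 for RAID 5 with the card, so the point holds: the card can erase RAID 5's efficiency advantage at small drive counts.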

RAID 0 for scratch is usually the first and best use of RAID in video editing; I wouldn't use RAID 0 for anything else. I wouldn't use RAID 5 or 6 unless I really needed them. The data in those arrays is only readable by the RAID board that created them, and when things go wrong with RAID 5 and 6, they go horribly wrong.

Correct me if I'm wrong, but with SATA/eSATA/1394, aren't the speed benefits of RAID 0 diminished the more drives you add? The bandwidth lessens further down the chain. I think the best solution for RAID is a dedicated card with your drives in a separate box connected via Ethernet.

In theory SATA II is still much faster than four drives combined on a port multiplier. I run 2 to 4 drives in RAID 0 on a Mac with a port multiplier and the setup is very fast. If anyone has links to RAID 0 performance comparisons with video, please post.
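The question of whether extra drives still help comes down to whether their summed throughput exceeds the shared link's ceiling. A sketch with rough period-appropriate numbers (all figures here are my ballpark assumptions, not measurements):

```python
# Effective RAID 0 throughput behind one shared link: the lesser of
# the drives' aggregate speed and the link ceiling. All MB/s figures
# are rough assumptions for illustration.
LINK_MBPS = {"sata2": 300, "esata_pm": 300, "fw800": 80, "usb2": 35}

def bottleneck(n_drives: int, per_drive_mbps: int, link: str) -> int:
    aggregate = n_drives * per_drive_mbps
    return min(aggregate, LINK_MBPS[link])

# Four ~70 MB/s drives behind one eSATA port-multiplier link: 280 MB/s,
# just under the ~300 MB/s ceiling, so the fourth drive still helps.
print(bottleneck(4, 70, "esata_pm"))
# The same four drives over USB 2 would be capped at ~35 MB/s total.
print(bottleneck(4, 70, "usb2"))
```

So at these assumed speeds the port-multiplier claim checks out for four drives, but a fifth drive would mostly be wasted, and USB 2 is link-limited with even a single drive.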

RAID 5 or 6 I would do with a HighPoint RAID card. I'm going to need to do this, as I have some 4K files I need to edit.