
Building a VOD 1080p server, multiple streams


Kilm

New Around Here
What's the most efficient yet powerful enough build?
VOD 1080p60 raw camcorder video (TM700)
VOD 1080p24 Blu-ray and x264 MKVs
VOD pictures, music, docs

Will be doing VOD for about 4 TVs in the house.
(4 HTPC media extenders, one per TV: Core 2 Duo / 9400M / 4 GB / SSD)
I'm not concerned about cost since I get everything cheaper through retail,
but I don't want an unnecessarily high electricity bill either.

Here are my concerns:

CPU

i3-540 (built-in VGA, dual core) vs. i7-860
Is the i3 enough to handle the workload? Will I benefit at all from a quad core? I prefer the i3 because of the lower TDP: 73 W vs. 95 W plus a discrete VGA card.


HDD

For HDDs I decided on RAID 0 plus several external eSATA drives for manual backups,
over RAID 5.
RAID 5 reduces write speed, isn't power efficient, and RAID is never enough as a backup anyway; good luck finding a spare for a discontinued HDD...
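Back-of-the-envelope comparison of the two layouts with the 4 x 1 TB drives I'm planning to start with (just a sketch of capacity vs. redundancy, nothing more):

```python
# Rough comparison of RAID 0 vs. RAID 5 for the planned array (assumes 4 x 1 TB drives).
def raid_summary(n_drives, drive_tb):
    return {
        # striping: all capacity usable, any single drive failure loses the array
        "RAID 0": {"usable_tb": n_drives * drive_tb, "failures_survived": 0},
        # one drive's worth of capacity goes to parity, survives one failure
        "RAID 5": {"usable_tb": (n_drives - 1) * drive_tb, "failures_survived": 1},
    }

for level, info in raid_summary(4, 1).items():
    print(level, info)
# RAID 0 {'usable_tb': 4, 'failures_survived': 0}
# RAID 5 {'usable_tb': 3, 'failures_survived': 1}
```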

5400 RPM vs. 7200 RPM? 1 TB or 2 TB drives?

WD no longer offers TLER, so they're ruled out.
The Hitachi 2 TB uses 5 platters: too hot, too noisy.
Seagate, forget about it after their 7200.11 firmware fiasco.
The only one left is Samsung, my choice!
Four F3 1 TB drives in RAID 0 to start, 7200 RPM.
I'd like to go with the 2 TB F3, but it's a 5400 RPM drive. I'm not worried about the speed so much as reliability, since 5400 RPM drives park their heads and show ridiculous load/unload cycle counts in SMART.
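If I do end up with the 5400 RPM drives, something like this (just a sketch; it assumes smartmontools is installed and the drive shows up as /dev/sda) would let me watch how fast the raw Load_Cycle_Count climbs:

```python
# Read the head-parking counter (SMART attribute 193, Load_Cycle_Count) via smartctl.
# Assumes smartmontools is installed and the drive appears as /dev/sda.
import subprocess

def load_cycle_count(device="/dev/sda"):
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "Load_Cycle_Count" in line:
            return int(line.split()[-1])   # raw value is the last column
    return None

print(load_cycle_count())   # drives are typically rated for a few hundred thousand cycles
```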


BACKPLANE

Also, does anyone know how to connect a backplane like this one?
http://cgi.ebay.com/ws/eBayISAPI.dl...&_trkparms=algo=LVI&its=I&otn=2#ht_2246wt_807
Can it connect to an Intel Matrix motherboard with a plain SATA cable, or does it need a RAID controller? Plus, these are usually compatible with only one server motherboard; maybe I can find a generic one that connects to any board.

Otherwise I'd have to use a lot of these single-drive backplanes:
http://www.cs-electronics.com/images-large/ADP-4100.jpg
Very messy with that many cables.


The rest is easy, so feedback from any advanced user is appreciated, thank you.
Rest of the build:
4 GB DDR3 RAM
Seasonic or Corsair 600 W PSU
Self-made aluminum case (metal brake + rivets)
i3 integrated VGA, or a ~20 W card like the Radeon 5450
OCZ 60 GB SSD for boot
OS: Windows 7, Server 2008, or FreeNAS
Gigabit router: Netgear FVS318G, no jumbo frames
 
They do make 73 W i5s. I'm guessing you're looking at the dual core with HT, which will show up as 4 logical CPUs to the OS. So pick the most powerful chip you can afford at that TDP.

You're looking at terabytes of data, right? A striping setup would require a lot of hard drives for a medium performance gain that would still be limited by your gigabit LAN. Are you only using the onboard gigabit NIC? Is it a real Intel NIC? Have you compared its transfer rate with a RAID setup? Do you know what your bottleneck will be?

A RAID 5 setup is the most efficient balance of speed and cost. Have you looked at hardware controllers lately? They're pretty flexible now and don't have the limitations you mention; you can usually mix and match hard drives without issues on newer RAID controllers. And you'd have the same spare-of-the-same-brand/model limitation in a striped setup if you have a picky controller. That's why you'd actually purchase spares and keep them running as hot/cold spares.

All of those are really limitations of hardware RAID. If you run software RAID on Linux, Unix, or FreeNAS, or run ZFS, you can mix and match brands and sizes of drives. If you have to stick with Windows, WHS is easy to implement, and you can duplicate data per folder rather than per drive.
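As a rough illustration of that flexibility difference (the drive sizes below are made up): in a plain Linux md RAID 5 every member only contributes as much as the smallest drive, while WHS-style folder duplication pools everything and just stores a second copy of the folders you mark.

```python
# Sketch of usable space with mixed drive sizes (example sizes are made up).
drives_tb = [1.0, 1.5, 2.0, 2.0]

# Linux md RAID 5: every member is limited to the smallest drive, one drive of parity.
md_raid5_usable = min(drives_tb) * (len(drives_tb) - 1)

# WHS-style folder duplication: pool everything; duplicated folders cost 2x their size.
pool_tb = sum(drives_tb)
duplicated_tb = 1.5                      # pretend 1.5 TB of folders are marked for duplication
whs_usable = pool_tb - duplicated_tb     # the second copy eats into the pool

print(f"md RAID 5 usable: {md_raid5_usable:.1f} TB")   # 3.0 TB
print(f"WHS pool usable:  {whs_usable:.1f} TB")        # 5.0 TB (with 1.5 TB duplicated)
```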

Most of the multi-tens-of-terabytes arrays run Hitachi or Samsung 2 TB drives right now. All drives are the same to me, since I expect every drive to fail within its 5-year lifecycle. What matters to me is how easy it is to replace a drive. That said, WDs are the easiest to replace, and Seagates require signatures, which can be a pro or a con depending on your delivery address. TLER hasn't been an issue in my Linux RAID setups, but others running hardware RAID have to fiddle with it across all manufacturers, not just WD.

I remember looking into that Intel 6-port SAS backplane a while ago and I think it required a special slot or connector to work. Essentially you needed to buy the specific Intel case for it, and it wasn't really moddable. Not sure if things have changed in the past year or so, but a smart lad like yourself can Google around and see if anyone has done the work on it... And why do you need a backplane at all? Wouldn't it be easier to pick up a Norco 4220 with 20 hot-swap bays, or some 5-drives-in-3-bays adapters? They'd be much simpler to use and still hot-swappable afterwards.

Lots to think about; you'll probably want to do some Googling and look at other people's storage server builds for more input.
 
I can get the i7-860 for about $129 (Intel direct for me),
and the i3-530 for $129 (retail).
The i5 Clarkdale is $200 (retail),
so it's more cost effective for me to take the i3 or the i7.
I prefer the i3 because of the lower TDP and built-in VGA, but whether it will be enough for my use is the question I don't know how to answer.

I'm most likely going to use an Intel motherboard like the DX58SO with its built-in NIC, which I believe should have decent performance.
I don't think a $150 NIC is justified if there's no substantial speed increase.

The bottleneck should be the RAID 0 or the i3 CPU, since the LAN is 1 Gbps over Cat6 cabling. (By the way, how much does it cost to fish Cat6 behind walls?)

For a RAID controller, the cheapest ones are $300+, which isn't justified for a 4-to-6-drive RAID 0.
I don't want to buy spares, since I'll be replacing the array with a bigger one once 4 TB HDDs hit $100 within the next 3 years or so; Moore's law makes that more cost effective.

I wish I could run WHS, but it doesn't handle RAID 0, or any RAID for that matter;
it just has a folder duplication feature that amounts to RAID 1.

From Googling, it seems brand-name backplanes are only compatible with their own specific motherboards.
Most of them use mini-SAS connectors.
And yeah, I think I'm just going to buy a pre-made backplane cage, even though they look ugly and like cheap plastic; I wanted to put aluminum handles on it myself...

You don't have an issue with TLER because you most likely bought your drives before 2010. I think in March 2010 WD disabled TLER on all consumer-grade hard drives, even older models like the 1 TB FALS; otherwise it would have been a no-brainer to grab the WD 7200 RPM 1.5 TB for $109.99.

I'm afraid of the 2 TB Samsung; people report the drive showing as 61% healthy out of the box in SMART because of the head parking.

But I didn't know software RAID is more flexible about different makes and sizes for the cold spare. I'll be using Intel Matrix; thanks for the input.
 
A note of caution: on another forum someone asked what OS to use for 4 streams of 1080p. Surprisingly, WHS and ZFS were not on the good list.
Win 7 and Server 2008 were, but the recommendation was for the 32-bit version of Server 2008, for driver compatibility. WHS didn't make it because results were hit and miss: some people can get it to work, others see problems.
 
Not sure if this will help or not, but I went ahead and got some power consumption numbers for you on the i7-860. I swapped out my ATI Radeon 4830 for a Radeon 4350.

Intel Core i7-860 CPU
G.SKILL ECO Series 4 GB (2 x 2 GB) (CAS 7) RAM
MSI P55-GD65 motherboard
WD Caviar Blue 320 GB hard drive
ATI Radeon 4350 video card (Asus)
Antec TruePower Trio 550 W power supply

With this setup my little watt meter shows a low of 50 watts at idle. I don't have a direct comparison, but based on what I've seen, an i3 setup can probably get to 30 watts or lower at idle.
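For the electricity-bill side of it, that 20-watt idle difference works out to very little over a year (the rate below is just a placeholder; plug in your own):

```python
# Yearly cost of the idle-power gap between the two builds (electricity rate is a guess).
i7_idle_w, i3_idle_w = 50, 30
rate_per_kwh = 0.10                         # $/kWh -- substitute your utility's rate

def yearly_cost(watts, rate):
    return watts * 24 * 365 / 1000 * rate   # W -> kWh/year -> dollars

extra = yearly_cost(i7_idle_w, rate_per_kwh) - yearly_cost(i3_idle_w, rate_per_kwh)
print(f"Extra idle cost of the i7 build: ~${extra:.0f}/year")   # ~$18/year
```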

With that said, my opinion is that the i3 setup would probably work just fine for your current goals. If you think you might want to do more on the server, though, I'd consider the i7, since it looks like it wouldn't cost you much more. Basically, if you plan on doing any video editing or virtualization, I think the quad core is a good idea.

As for the OS... if you have access to Windows Server 2008 R2, go for it. FreeNAS and Ubuntu will probably work just fine, but Server 2008 would most likely be easier to set up. WHS is nice, but as you mentioned, no RAID. I believe you can get around that, though, since it's basically just Windows Server 2003 with the WHS layer running on top.

Hope that helps a little bit...

00Roush
 
Yeah, your i7 is at 50 W because of the video card. My i5 and i3 setups both run the same 30-ish watts, identical builds with just the CPU being different. Once I add the dozen drives, it idles under 50 W and peaks around the 70-ish watt range.

Your striped RAID is not going to be your bottleneck; your gigabit LAN is. Take the theoretical limits: the LAN is 1 Gb/s while individual SATA drives link at 3 Gb/s, or even 1.5 Gb/s, so the LAN is the slowest link. Translated into real bytes per second, your LAN tops out around 125 MB/s, and a SATA link is obviously 3x that. In real-world terms, a modern terabyte drive can exceed 100 MB/s on large files, and a striped array gets almost double that. Google around and you'll find people's SATA RAID 0 arrays doing over 200 MB/s; I've seen 350 MB/s read speeds. Adding more drives keeps increasing those speeds for a while.
For a file server where most usage is limited by the gigabit network, RAID 5/6 is preferred because there's not much practical gain from RAID 0 and the network is still the limiting factor either way. Once you hit tens of terabytes of data, you don't want to spend weeks recovering it, and the performance difference isn't enough to justify the risk. RAID 0 is really only useful locally, as a scratch disk or for workloads with lots of disk I/O.
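To put the same point in numbers (the drive figures are ballpark, not benchmarks):

```python
# Gigabit LAN ceiling vs. ballpark disk throughput (all figures approximate).
lan_mb_s = 1000 / 8                            # 1 Gb/s ~= 125 MB/s before protocol overhead
single_drive_mb_s = 100                        # large sequential reads, modern TB drive
raid0_mb_s = 4 * single_drive_mb_s * 0.8       # 4-drive stripe, assuming imperfect scaling

print(f"Gigabit LAN : ~{lan_mb_s:.0f} MB/s")
print(f"One drive   : ~{single_drive_mb_s} MB/s")
print(f"4-drive R0  : ~{raid0_mb_s:.0f} MB/s  -> the network, not the array, caps transfers")
```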

With the gigabit LAN being the bottleneck, those of us with tens of terabytes of storage are adding dual- and quad-port Intel NICs to increase total throughput. Adding a dual-port card and teaming gives you redundancy and load balancing; it's like striping your network cards to get 2 Gb/s instead of 1 Gb/s.
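One wrinkle, at least as I understand teaming: a single client connection normally stays on one physical port, so the extra bandwidth shows up across multiple clients rather than in any one transfer. A simplified model (the numbers are illustrative):

```python
# Simplified model of a 2-port team serving several clients: each client's stream is
# hashed onto one 1 Gb/s port, so no single client exceeds ~125 MB/s, but the total
# capacity scales with the number of ports.
ports, per_port_mb_s, clients = 2, 125, 4

aggregate_mb_s = ports * per_port_mb_s
per_client_ceiling = min(per_port_mb_s, aggregate_mb_s / clients)
print(f"Aggregate: ~{aggregate_mb_s} MB/s, per-client ceiling: ~{per_client_ceiling:.0f} MB/s")
# Aggregate: ~250 MB/s, per-client ceiling: ~62 MB/s
```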

Fishing drops is something like $100+ per drop depending on your area and how complicated your house is. Crawling through attics or going through floors adds cost. I had an estimate of over $600 to run drops from 4 bedrooms to one room and convert a phone line to data use between floors.
 
Thank you for your feedback.

I will go with Windows 7 Pro as the OS, since Windows Server 2008 is too expensive and I have no experience with Active Directory and such.

I will go with the i3 for my file server, since I will only be streaming huge 1080p files and occasionally small picture/doc files; I'll be doing the video editing on a workstation (i7-940 / 6 GB / $200 video card). My electricity bill is about $100 a month and I don't want to make it worse. Canadian cheapo!

A 10 Gbps network card is out of the question considering the price; that's more for businesses.

As for link aggregation, I will only consider it if I see lag while streaming 1080p, and I would try a switch with jumbo frames first. Dual NIC cards are expensive, and from what I understand (correct me if I'm wrong),

I would need a QUAD NIC card for my 1080p server, with 4 RJ45 cables going to my switch or router, which should theoretically give 4 Gbps from the file server to the switch, and a DUAL NIC card in each HTPC, with 2 RJ45 cables going to the switch, for 2 Gbps from switch to HTPC client.
That's like $400 for the quad NIC, plus 4 x $150 for the HTPC dual NIC cards, plus a 16-port switch (10 Gbps???) with jumbo frames. That's a lot of money!
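Although when I actually add up what the streams need (the bitrates below are my guesses, with the raw TM700 footage as the worst case), a single gigabit link already looks like way more than enough:

```python
# Bandwidth budget for 4 simultaneous streams (per-stream bitrates are rough guesses).
streams_mbps = {
    "raw TM700 / Blu-ray 1080p": 40,   # near worst case
    "x264 1080p MKV #1": 15,
    "x264 1080p MKV #2": 15,
    "x264 1080p MKV #3": 15,
}
total = sum(streams_mbps.values())
print(f"Total: {total} Mb/s of 1000 Mb/s -> {100 * total / 1000:.1f}% of one gigabit link")
# Total: 85 Mb/s of 1000 Mb/s -> 8.5% of one gigabit link
```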

My biggest concern is whether the hard drives will be the bottleneck, because when you have multiple reads hitting the same physical hard drive it usually slows down like hell,
so I'm afraid that with 4 streams going I'll experience lag.
It's like when I launch two file-copy jobs at once on my now-aging workstation: the hard drive is too busy to do anything else.
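My rough worry in numbers (the seek penalty is a pessimistic guess on my part):

```python
# Worst-case sketch: 4 readers interleaving on one spindle (numbers are guesses).
sequential_mb_s = 100            # one uninterrupted reader on a 7200 RPM drive
seek_penalty = 0.5               # assume constant seeking between streams halves throughput
n_streams = 4
per_stream_need_mb_s = 40 / 8    # a 40 Mb/s Blu-ray stream needs ~5 MB/s

per_stream_share = sequential_mb_s * seek_penalty / n_streams
print(f"~{per_stream_share:.1f} MB/s per stream vs. ~{per_stream_need_mb_s:.0f} MB/s needed")
# ~12.5 MB/s per stream vs. ~5 MB/s needed -- tight, but big read-ahead buffers help
```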

That's why most people building a 1080p server use unRAID, which is very popular on AVS Forum, because it doesn't stripe and only provides redundancy.

As for the backplane, it was hell to find and can get very expensive. I'm going to use an OEM SAS backplane like this:
http://www.supermicro.com/manuals/other/BPN-SAS-113TQ.pdf (go to page 13)
It has a monitoring chip, which I hope doesn't require drivers to operate.
If it does require drivers and they aren't provided with the OEM backplane package, I'm screwed, since you're not supposed to use a bare backplane with any build other than the chassis it was designed for, and as I said, I'll be building my own aluminum case. :)
 
