This issue occurs when YouTube loads a video partially. I looked at the YouTube video URL part by part and saw &range= at the end, so I tried changing the regex in storeurl.pl, and it seems to be working so far: all parts of a video are cached and HIT normally.

This image shows my access log while loading a YouTube video part by part.

As you can see, several parts of a video are successfully loaded from cache (HIT), while the rest of it is still a MISS. However, a video can be loaded either part by part or normally, so a video can take up double the space in the cache swap (the partial chunks plus the whole video).

Hi, is there a new way to fix the YouTube error? After I added negative_ttl 0 seconds to squid.conf, the problem still occurs. Can you guide me through the new way to fix it? Thanks in advance.

Hmmm, can you please post the updated storeurl.pl, or email it to me? It would be a great help for our open source community. 🙂
OK. Can you give me your email address so we can get in touch?
My email is aacable at hotmail.com

Everything which is cacheable goes into the cache.
You can also use some advanced scripting, like storeurl.pl, to force caching of some non-cacheable objects, for example YouTube videos, antivirus updates, Windows updates, etc. Yes, it's possible.

Well, I have configured the new updated storeurl.pl at two different cable network setups, and so far it's working great, much better than its previous version, so I can confirm that it's working well.

Syed
I would like to know your opinion on what hardware I can use for approximately
200 active users, each with 384kbps-512kbps of bandwidth,
on a 25Mbps network with about 1.5TB of traffic used per month.

I want to build a Squid box without caching .flv files or files larger than 150 MB.
What do you think? What server hardware would you recommend, like DELL or IBM servers, and which model of either if you like any of them?

By the way, I would like to buy one that doesn't need much energy (if any exist with at least a 500W supply).

It all depends on your network scenario and the budget you have. MikroTik doesn't require very heavy hardware for this number of users; also, MikroTik supports only 2 GB of RAM maximum.
But for a proxy, the better the hardware you put into it, the better the performance you will get from it.

Thanks man 🙂 I have more than 8 MikroTiks, all RB433 or RB435, and I want to build a Squid proxy server for my clients,
with just one LAN card, for 200 users. 😀
Now I'll buy a dual Xeon 3.0 with 8GB RAM and a 1TB HDD. 🙂
What I want to know, if possible, is:
Why do some servers come with 2 power supplies?
Do I need to connect both of them to power, or can I connect just one? And if the server has a 750W supply,
does that mean the server will draw that much power, or, with just one HDD and 8GB of RAM, will it draw less than 750W? I ask because sometimes we have power cuts in my town that can last 3 hours. :S

Some servers have 2 power supplies because they provide fault tolerance / redundancy, meaning if one power source fails, the second will keep working, like a UPS
(provided power is available on the second one when the first fails).

Any help? When I paste/edit storeurl.pl from any source and restart Squid, my Squid won't work, but if I paste the original storeurl.pl back, it works again. I edited storeurl.pl with Notepad++ and saved it as Perl, tried editing it with a Perl editor too, and also tried a plain copy-paste without edits from http://pastebin.com/e3TUtigH, but I get the error again. Please help, I'm still a newbie.

There must be some copy-paste error, either due to the blog's encoding or your editor changing the code when saving.
Use PuTTY to log in to your Squid box, then copy the content of storeurl.pl directly from the browser and paste it in PuTTY.

Start SQUID in debug mode with the following parameters; it will show exactly why SQUID is not restarting.
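The usual invocation for this (standard Squid command-line options, not quoted from the original post) runs Squid in the foreground with debug output on the terminal:

```shell
# -N  stay in the foreground (no daemon)
# -C  do not catch fatal signals, so you see the real error
# -d1 print debug output at level 1 to stderr
squid -NCd1
```

Any syntax error in squid.conf or a crashing storeurl.pl helper will then be printed straight to the console instead of disappearing into the logs.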

Most of my posts related to Linux are Ubuntu based, because I found Ubuntu the easiest-to-use Linux flavor. You or others may find another flavor more suitable for various reasons, as I have my own. 🙂

For example: by default FEDORA installs Squid 3, which doesn't support ZPH out of the box; you have to compile Squid 2.7 on Fedora in order to use ZPH, or use LUSCA on Fedora, as LUSCA supports ZPH well.

On the other hand UBUNTU installs squid 2.7 which supports ZPH so no extra work is required.
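For reference, the ZPH side in Squid 2.7 is only a couple of squid.conf lines (the TOS value below is an example of mine, not from the post); they mark cache HITs so a downstream MikroTik queue can pass them at full speed:

```shell
# squid.conf fragment (Squid 2.7): tag local cache HITs with a TOS value
zph_mode tos
zph_local 0x30
```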

It's all about personal preference. In the past, FEDORA was my first choice, but over the past 2 years UBUNTU has managed to become first. 🙂 But don't worry, you can achieve your goal with any flavor of Linux; it doesn't matter whether it's Ubuntu, Fedora, CentOS, FreeBSD, or any OS with Linux as its base OS / kernel, it will work. 🙂

Hi Syed, does this storeurl.pl handle videos that break when cached? For example, I cache a video at 360p and it finishes; then I switch to 720p, but YouTube doesn't reload the video from the beginning and caches only the end of it. Any advice to fix my problem? This causes a lot of trouble for me!

Are my old videos in the proxy still usable, since I used this storeurl.pl from Chudy, or must I re-cache all YouTube videos?
if ($X[1] =~ /(youtube|google).*videoplayback\?/){
@itag = m/[&?](itag=[0-9]*)/;
@id = m/[&?](id=[^\&]*)/;
@range = m/(&range=[^\&\s]*)/;
print $x . "http://video-srv.youtube.com.SQUIDINTERNAL/@id&@itag@range\n";
}
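As a sanity check on what this rewriter does, here is a rough Python translation of the same normalization (the function name and sample URLs are mine, not from the script); it shows how chunk URLs from different YouTube cache hosts collapse to one internal cache key:

```python
import re

def canonical_youtube_key(url):
    """Rough Python equivalent of the storeurl.pl rewrite above."""
    if not re.search(r'(youtube|google).*videoplayback\?', url):
        return None  # not a video chunk; leave the URL alone
    itag = re.search(r'[&?](itag=[0-9]*)', url)
    vid = re.search(r'[&?](id=[^&]*)', url)
    rng = re.search(r'(&range=[^&\s]*)', url)
    # Rebuild a host-independent key, keeping the range so each chunk
    # is stored (and HIT) separately
    return 'http://video-srv.youtube.com.SQUIDINTERNAL/%s&%s%s' % (
        vid.group(1) if vid else '',
        itag.group(1) if itag else '',
        rng.group(1) if rng else '')
```

With this, every CDN mirror serving the same id/itag/range maps to one stored object, which is exactly why all chunks of a video start to HIT.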

It's working better, thanks man, but it still gives errors.
By the way, is it possible to cache Facebook games when the user has chosen secure browsing (HTTPS)?
Did you find a way to cache that stuff? Thank you.

Sir, I tried it on pfSense and it worked fine for a few days, but after that it crashed badly. I have searched a lot on the internet and found claims that VARNISH Cache is 10x faster than Squid; I would highly appreciate it if you could provide a tutorial or how-to for it.

Working with multi-threaded disk access (AIO), Squid queues the tasks to be performed and lets the disk controller work through them as fast as it can. This allows Squid to work on other processing tasks for the same request without being held up waiting for slow disks.

The queue starts off at a default length of 8 queue slots per thread. When that queue space is filled up, Squid will spit out the WARNING, and double the available queue length.

A few of these are OK under very high load. But if you get them very frequently, it's a sign that either the disk I/O is overloaded or you have run out of CPU cycles to handle it.

Workaround

* A few seconds of these after a clean startup can be ignored. They should decrease exponentially as the queue is automatically adjusted to the load.
* For Squid expected to run on a busy network, increasing the default AIO threads available can reduce the annoyance. Using fast disks is essential.
* If these continue without decreasing you need faster disks, or to spread the traffic load over more proxies.
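On Squid 2.x with the aufs store, the AIO thread count is fixed at build time, so "increasing the default AIO threads" means rebuilding; a sketch (the numbers are illustrative, not tuned recommendations):

```shell
# Rebuild with more aufs worker threads per cache_dir
./configure --enable-storeio=aufs --with-aufs-threads=32
make && make install

# squid.conf: spread I/O over several disks, one cache_dir per spindle
# cache_dir aufs /cache1 100000 16 256
# cache_dir aufs /cache2 100000 16 256
```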

Hi, I'm testing the Perl code, and my question is whether Squid 2.7 is caching for me correctly: when I watch the video, it gives me this message again: TCP_REFRESH_HIT/304 202 GET http://www.youtube.com/crossdomain.xml – DIRECT/173.194.42.4 –

Or is this being served to me by another proxy that sits on this network in front of mine? Thanks.

One further question: when you change the resolution of a video, the video gets cached in the new resolution from the point where you switched, and when you later watch the same video, it starts from the point where it was cached earlier, so you can't watch it entirely.
Any ideas about that, or about how to make Squid, on a resolution change, always cache from the start of the video?

Please accept my apologies; I never said that this or that script was written by ME. I added the chudy_fernandez email address at the top, and I added my line only for tracking purposes; it is not copyrighted by me.
Anyhow, thank you for pointing out the right thing. I have added the following note to the post; please check it and let me know if any further amendments are required.
=========================================================
This script is NOT written by me, I only copy-pasted it from the internet.
It was originally written by chudy_fernandez@yahoo.com & has been modified by various persons over the net to fix/add various functions.
For more info, see http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube
=========================================================

As you said, you have added a few functions to it; please let me know your email or name, so I can credit them on your section's lines.

Once again, storeurl.pl is not my creation; it was just copy-pasted, and I did mention the original author's name at the top. However, I have now added more info about it. I hope the picture is clear now.

You did not claim that you wrote the script, but you should not modify it by omitting the header lines (remarks) which mention that th30nly fixed/modified the original script.
The name of th30nly was removed from the script's remarks, and then your name was put in the header under the original writer. I doubt it was for tracking purposes, since the remark lines which mentioned the fixes by th30nly have still not been restored.

Thanks for all these tutorials that you have provided to us.
I just want to ask you something as a network IT person.

1. I have dedicated internet access, 30Mbps down / 30Mbps up, but I'm having problems caching YouTube with your script.
Could it be because my upstream provider has added cache servers to his internet access?
In Squid I see something like this:
0 80.80.171.19 TCP_HIT/200 657 GET http://o-o.preferred.ipko-prn1.v21.lscache3.c.youtube.com/crossdomain.xml – NONE/- text/x-cross-domain-policy

I've tried the command and I see all that. 🙂
Unlike the video you watched, with other videos I only get this crossdomain.xml as a TCP_HIT, not the link of the video itself. 🙂
But I'm leaving it like that; better not to cache YouTube. 🙂

Thanks for everything. 🙂 What I couldn't find in this blog is how to configure MRTG graphs on Ubuntu. 🙂
Could you please add a tutorial on adding some graphs for Squid?

Hi Syed, I tested the new storeurl.pl and the problem is not solved; I can't cache YouTube, because YouTube splits FLV video files into 1.69 MB chunks, so those don't get cached, though the new WEBM format from YouTube does get cached.


(Note: I am not the author of this script. Please use the links/emails provided here to contact the real authors; I only tested it, and the problem seemed to be improved as compared to previous versions of this script.)

Is it true, your statement: "Some research update (someone please confirm and post your comments):
YOUTUBE has split its videos into segments of 1.5 MB each, which is approximately 51 seconds. I am sure YOUTUBE has taken this step to prevent people from caching entire videos. If you have a video which is 100 MB large, it will be split into about 55-60 segments.
As of right now, storeurl.pl won't be able to cache it. Currently the VIDEOCACHE plugin is doing a full cache of YouTube, but at a higher $$$ cost."

Then what is the solution from you? The problem is I can only get cache HITs for roughly the first 50 seconds, and everything after that is a MISS.
So if I watch a YouTube video longer than 3 minutes, only the first ~50 seconds HIT the cache, and from 50 seconds onwards it's all MISS.

No solution for the 51 seconds yet!
YOUTUBE has split its videos into segments of 1.5 MB each, which is approximately 51 seconds. I am sure YOUTUBE has taken this step to prevent people from caching entire videos. If you have a video which is 100 MB large, it will be split into about 55-60 segments.
As of right now, storeurl.pl won't be able to cache it. Currently the VIDEOCACHE plugin is doing a full cache of YouTube, but at a higher $$$ cost 🙂

Syed Jahanzaib, and the others facing problems with YouTube caching: just do these 2 things. Set the minimum object size to 512 and prevent YouTube from sending range requests. Today (14/05/2012) all my YouTube files are working, new and old ones. One thing I've observed is that splitting the file helps us with bandwidth saving; just check access.log after the mods and look for HITs. If you're facing problems, check store.log and see whether you have a SWAPOUT for YouTube files whose content length does not match the size.

Sorry about the delay in answering. It's something like this: if your file gets cached by Squid, you'll get a SWAPOUT in store.log for that file; try to look for this first. YouTube splits the video into files, but you can still get all parts cached; just check store.log and see what's going on when Squid tries to store the files.

and the only thing I can do is restart Squid.
This is a first, simple redirector; I hope the community will improve it.

– Tested on Squid 2.7, Squid 3, and Lusca.
– Requires storeurl.pl to save the videos.
– This could be improved by combining it with http://rg3.github.com/youtube-dl/ to open videos in lower quality by default and thus speed up loading.

Good blog, sir. I have a problem with my proxy: sometimes it can't cache YouTube, and I must reboot my proxy before it can cache again. What is my problem, sir? I'm using your Squid and storeurl.pl. Awaiting your reply; sorry, my English is very poor. Thanks in advance.

OK, thank you, sir. So what about giving high priority to online games, for example Point Blank or Lost Saga? Do you have a Squid rule for that? Sometimes Lost Saga gets a red ping, which makes the game lag. I have a 2Mb link shared among only 10 PCs and 1 server in my internet cafe, sir; please give me some explanation for my problem.

I was working on a better solution than store_url_rewrite for quite a long time and had the idea; now it's workable using ICAP and a database.
It can also be implemented using a url_rewriter, but using ICAP is faster and can be maintained almost without reloading/restarting any server.

So you must match each of these separate cases with one match on the id, with sub-cases for the order within the file.
You can have a look at this specific code I have written in Ruby for that: https://github.com/elico/squid-helpers/blob/master/testers/test3.rb
It works great with an explicit ACL that allows store_url_rewrite only for YouTube cache domains.

I have installed Squid for a network of only 20 users with a 30mbps fibre optic link. I've tried the Squid script shared by you, without YouTube caching (since I don't need that), and also tried Debian with Squid 3.

Everything works fine, but after 5-6 hours of normal use the RAM gets fully used and Ubuntu / Debian gets stuck. In Debian I can restart the service, but in Ubuntu the system crashes.

After a restart, everything works perfectly for a few hours again.

Hardware is P4 3.0, 1GB RAM, 1TB HD.

Please advise whether it's a hardware issue, as I don't see any other problems while it's running.

Again, sorry for the last post; I should do more testing before posting. Anyway, can one of the programmers help make this match a variable instead of a fixed number, because it changes:
$X[1] =~ s/&range=13-1781759//;
range=1781760-3563519
Will this do? $X[1] =~ s/&range=[*]//;
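For what it's worth, [*] inside a character class matches only a literal asterisk, so it won't match the numbers. A pattern like &range=[^&]* strips any range value (in Perl: $X[1] =~ s/&range=[^&]*//;). A quick Python check of the pattern (the sample URLs below are made up):

```python
import re

def strip_range(url):
    # [^&]* eats any range value up to the next parameter, so every
    # chunk of a video normalizes to the same range-less URL
    return re.sub(r'&range=[^&]*', '', url)
```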

I hate people making $$$ from whatever people post here to help. Here is the solution, almost 90% there: all formats, range and non-range, are working. Of course it needs more testing, and some (not all) of the WebM files with content from Disney need more testing.
Please post your tests, and if someone can help make it better and less buggy, it will be appreciated. Thanks to the people who share and help.
storeurl_rewrite_children 1
storeurl_rewrite_concurrency 100
—
if ($X[1] =~ /(youtube|google).*videoplayback\?/){
@itag = m/[&?](itag=[0-9]*)/;
@id = m/[&?](id=[^\&]*)/;
@range = m/(&range=[^\&\s]*)/;
@ptk = m/[&?](ptk=[^\&]*)/;
@oid = m/[&?](oid=[^\&]*)/;
@ptchn = m/[&?](ptchn=[^\&]*)/;
@pltype = m/[&?](pltype=[^\&]*)/;
@begin = m/[&?](begin=[^\&\s]*)/;
@redirect = m/[&?](redirect_counter=[^\&]*)/;

Yeah, some people do make money off other people's code. So is there anyone gentle enough to forget about money for one second and provide us the code to make YouTube caching possible? When you throw food into the sea to feed the fish, even if the fish don't note your good intention, God counts your good deed.
PEACE,

Is there anyone who can post a perfect script/way to cache YouTube videos? If yes, then please post it, because my Squid is still caching only 51 seconds of each YouTube video. If anyone has a solution for perfect video caching, then please, please, please
post it here, because it can't be hidden by anyone; knowledge spreads by sharing.

Syed, I'm having an issue with Ubuntu.
After a crash or a power failure, I always need to press Enter for Ubuntu to start properly.
Is there a way to make it automatic, so it boots the first entry in the boot loader even if I don't press Enter?
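One common fix, assuming GRUB2 on Ubuntu (my suggestion, not from the thread): after an unclean shutdown GRUB sets a "recordfail" flag and waits at the menu indefinitely; giving that state a timeout makes the box boot on its own:

```shell
# /etc/default/grub -- boot the default entry after 5 seconds even
# when the previous shutdown was unclean (recordfail)
GRUB_RECORDFAIL_TIMEOUT=5

# apply the change
sudo update-grub
```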

Here is a clue for getting YouTube non-range HITs: create a url_regex ACL in squid.conf matching the YouTube range URLs and deny it; the video then becomes one non-range file/content. And in storeurl.pl you can use the latest version in this blog.
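The hint above, roughly, as a squid.conf sketch (the ACL name is mine, and denying range requests can slow first-time loads, so treat it as an experiment):

```shell
# squid.conf: refuse the chunked range requests so the player falls
# back to fetching the whole file, which then caches as one object
acl yt_range url_regex -i (youtube|googlevideo).*videoplayback.*[?&]range=
http_access deny yt_range
```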

My proxy is still working well until now; I'm using your Squid Lusca and storeurl.pl. So I have one question, sir: how do I make the cache clear itself automatically using the Squid configuration?
Awaiting your reply. Thanks.

Over a period of time, the space allocated for the caching directories starts to fill up. Squid starts deleting cached objects once the space occupied by the objects crosses a certain threshold, which is determined using the cache_swap_low and cache_swap_high directives. These directives take integer values between 0 and 100.

cache_swap_low 96
cache_swap_high 97

So, in accordance with these values, when the space occupied for a cache directory crosses 96 percent, Squid will start deleting objects from the cache and will try to maintain the utilization near 96 percent. However, if the incoming rate is high and the space utilization starts to touch the high limit (97 percent), the deletion becomes quite frequent until utilization moves towards the lower limit.

Squid's defaults for the low and high limits are 90 percent and 95 percent respectively, which are good if the cache directory is small (like 10 GB). However, if we have a large amount of space for caching (such as a few hundred GBs), we can push the limits a bit higher and closer together, because even 1 percent will mean a difference of more than a gigabyte.
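To make the "even 1 percent" point concrete, a tiny back-of-the-envelope helper (hypothetical, plain arithmetic, not Squid code):

```python
def swap_thresholds(cache_size_gb, low_pct, high_pct):
    # GB marks where Squid starts evicting (low) and evicts
    # aggressively (high), plus the width of the band between them
    low = cache_size_gb * low_pct / 100.0
    high = cache_size_gb * high_pct / 100.0
    return low, high, high - low
```

For a 500 GB cache at 96/97 the eviction band is 5 GB wide, while the default 90/95 on a 10 GB cache leaves a band of only 0.5 GB.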

I want to update everybody about StoreID, which is the alternative to store_url_rewrite.
It comes in the new Squid 3.4 and is now in Squid head, which can be compiled manually.
Feel free to contact me about it: http://wiki.squid-cache.org/Features/StoreID