hello, i have a problem when downloading big files using ftp functions. i have 1gb VOB files that i want to download from the web browser using php

_nero_

kamoricks, I'd do $day=60*60*24; ;)

ZEUGMA

i click the link to the file, it contacts the server and stays waiting forever. is there some setting i could tweak in php.ini to fix this?

Touqen

1 day = 86400 seconds. spare the unnecessary multiplication

jiggster

drop the damn epoch! :P

tmpvar

$timestamp-(date("w",$timestamp)*(3600*24)); thanks guys!
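
A sketch of what tmpvar's expression does, assuming $timestamp is a Unix timestamp and the week doesn't straddle a DST change: date("w") is the day of the week with 0 = Sunday, so subtracting that many whole days of seconds rewinds to the week's Sunday at the same clock time.

```php
<?php
// Rewind a timestamp to the most recent Sunday. date("w") gives
// 0 for Sunday, so we subtract that many whole days of seconds.
// Note this keeps the clock time and ignores DST transitions.
$timestamp = mktime(12, 0, 0, 7, 16, 2008);   // Wed 16 Jul 2008, 12:00
$weekStart = $timestamp - (date("w", $timestamp) * (3600 * 24));
echo date("l, j M Y", $weekStart), "\n";      // the preceding Sunday
```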

_nero_

Touqen, true.. but if you must do the multiplication, it helps to do the proper multiplication.

Xyphoid

leap seconds always make me nervous when going from timestamps to larger intervals

radam

zeugma - are you passing the file through php for the download?
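
radam's question matters because streaming through PHP is where 1 GB downloads usually die. A minimal sketch of the chunked-passthrough approach, assuming the stall is PHP buffering or timing out (the path, filename, and chunk size are placeholders, not from the channel):

```php
<?php
// Stream a big file to the client in fixed-size chunks so PHP never
// holds the whole thing in memory and output leaves as it is read.
function stream_download($path, $chunkSize = 8192)
{
    set_time_limit(0);            // a 1 GB transfer can outlive max_execution_time
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);
        flush();                  // push each chunk to the client now
    }
    fclose($fp);
    return true;
}

// In the web script, send headers first, then stream:
// header('Content-Type: application/octet-stream');
// header('Content-Length: ' . filesize('/path/to/title.vob'));
// header('Content-Disposition: attachment; filename="title.vob"');
// stream_download('/path/to/title.vob');
```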

Crapple

Is there any way to make PHP do multiple things at once (i.e. thread) or speed up the rate at which it does things?

MarkR42

Crapple: You can't start new threads within a web context

Crapple

its a CLI script that verifies 95,000 URL links people have submitted. looks for codes other than 200

MarkR42

I see. in which case some kind of parallelisation would be recommended

Crapple

and deletes them from mysql if it gets one. Yeah its been running for hours and its on record 5000

MarkR42

Crapple: It is possible to run several CLI processes at once. You can either use fork (possibly error prone), or start them from the shell or via system() etc, using the proper shell job control operators
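
One way to do what's being suggested, sketched from scratch (nothing here beyond the idea is from the channel): start several worker copies concurrently with popen() and wait for them all. The worker script name and its start/end arguments are made up.

```php
<?php
// Launch a set of shell commands concurrently and wait for all of
// them. popen() starts each process immediately; pclose() blocks
// until that process exits.
function run_parallel(array $commands)
{
    $handles = array();
    foreach ($commands as $cmd) {
        $handles[] = popen($cmd, 'r');
    }
    $statuses = array();
    foreach ($handles as $h) {
        stream_get_contents($h);       // drain whatever the worker prints
        $statuses[] = pclose($h);      // wait for this worker to finish
    }
    return $statuses;
}

// Hypothetical worker script taking a start and end record number:
$workers = 5;
$total   = 95000;
$chunk   = (int) ceil($total / $workers);
$cmds    = array();
for ($i = 0; $i < $workers; $i++) {
    $cmds[] = sprintf('php checkurls_worker.php %d %d',
                      $i * $chunk, min(($i + 1) * $chunk, $total));
}
// run_parallel($cmds);  // uncomment once the worker exists
```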

Crapple

into like 5 different blocks or something and run all 5 blocks simultaneously. like 0-15000; 15000-30000; 30000-45000, etc?

_nero_

Crapple, if you are going to run multiple CLI processes, have a data file that holds the last record number being acted on..before each record, the script checks the number, grabs the next one, and records it..but you'd need to do some sort of lock.. that might be a PITA..
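
The lock _nero_ is dreading is less painful than it sounds if flock() does the work. A stand-alone sketch of the shared-counter idea (the file path is a placeholder):

```php
<?php
// Each worker calls this to claim the next record number. flock()
// serialises access, so two workers can never grab the same one.
// $counterFile is a placeholder path shared by all workers.
function claim_next_record($counterFile)
{
    $fp = fopen($counterFile, 'c+');   // create if missing, don't truncate
    if ($fp === false) {
        return false;
    }
    flock($fp, LOCK_EX);               // block until we hold the lock
    $next = (int) stream_get_contents($fp);
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) ($next + 1)); // mark this record as taken
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
    return $next;
}
```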

Crapple

maybe I should just decrease how long cURL will wait
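
On shortening the cURL wait: the relevant knobs are CURLOPT_CONNECTTIMEOUT and CURLOPT_TIMEOUT, and since only the status code matters a HEAD request is enough. A sketch (the helper name is made up):

```php
<?php
// Check one URL with short timeouts, returning the HTTP status code
// (0 on timeout, DNS failure, or refused connection).
function check_url($url, $timeout = 10)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);     // cap total time, not just connect
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code;
}
```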

MarkR42

Anyway, if you make a PHP control process which decides what to do, then invokes a number of copies of the workers, which individually only do what they're told

SirFunk

reaVer: yeah, that works.. i was trying to cut down on all the .'s and stuff

reaVer

well, afaik you can have arrays in an echo without it whining about it

Crapple

MarkR42: so write a function called checkurl($start, $end) { and have the control process send starts and ends to each worker process?

reaVer

*can't
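
What reaVer is getting at, sketched: a quoted array key inside a double-quoted string is a parse error, but the curly {$...} form or an unquoted key both work, which trims the concatenation dots SirFunk mentions.

```php
<?php
// Ways to get an array element into a double-quoted string.
$row = array('name' => 'zoe');

echo "hello " . $row['name'] . "\n";  // concatenation: always fine
echo "hello {$row['name']}\n";        // complex (curly) syntax
echo "hello $row[name]\n";            // simple syntax: key is left unquoted
// echo "hello $row['name']\n";       // parse error -- the "whining"
```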

MarkR42

A better alternative might be to use some external process such as "linkchecker", and simply supply the 100k URLs as a datafile into it, then parse the results?

SirFunk

ack.. wait i just changed it back and it's still whining!

Crapple

MarkR42: I was thinking maybe doing a mysql_num_rows and then dividing it by 25, and just having that script call another script 25 times with the start/stops for what IDs each child is supposed to check

reaVer

SirFunk: check if all the ; are there :P

Maquiavelo

Hey guys, what's wrong with this command? INSERT INTO datos_cuenta (email, cedula, password) VALUES ('$_REQUEST[nombre]', '$_REQUEST[apellido]', '$_REQUEST[password]') ? I'm getting "Error: You have an error in your SQL syntax. Check the manual that corresponds to your MySQL server version for the right syntax to use near '0' at line 1" when I try to pass it through mysql_query in PHP
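
An error "near" part of a value usually means an unescaped value broke the quoting. A sketch of building that INSERT with each value escaped first; addslashes() stands in here so the snippet runs stand-alone, but against a live connection mysql_real_escape_string() was the era-appropriate call. Column and key names are from the question; the helper function is made up.

```php
<?php
// Build the INSERT with every request value escaped before it is
// interpolated, so quotes in the data can't break the SQL.
function build_insert(array $row)
{
    $email    = addslashes($row['nombre']);
    $cedula   = addslashes($row['apellido']);
    $password = addslashes($row['password']);
    return "INSERT INTO datos_cuenta (email, cedula, password) "
         . "VALUES ('$email', '$cedula', '$password')";
}

// $sql = build_insert($_REQUEST);
// mysql_query($sql) or die(mysql_error());
```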

MarkR42

Crapple: Yes, but it would be better to just do a select count(*) rather than loading a redundant result set of 100k rows into memory
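
Putting the two ideas together, hypothetically: one SELECT COUNT(*) gives the total, and a small helper turns it into start/end pairs for the 25 workers, with no 100k-row result set in memory. The helper name is made up.

```php
<?php
// Split $count records into roughly equal [start, end) ranges,
// one per worker, without ever loading the rows themselves.
function make_ranges($count, $workers)
{
    $chunk  = (int) ceil($count / $workers);
    $ranges = array();
    for ($start = 0; $start < $count; $start += $chunk) {
        $ranges[] = array($start, min($start + $chunk, $count));
    }
    return $ranges;
}

// e.g. $count = 95000 from SELECT COUNT(*) FROM urls:
$ranges = make_ranges(95000, 25);   // pairs (0,3800), (3800,7600), ...
```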