I had loads of trouble reading with fgets on a WAMP (Windows) server. Locally the file went into a <pre> tag without a hitch, but when I moved the code to a LAMP production server, every \r\n produced two fgets reads and I got extra empty lines.

I tried removing them with $string = str_replace("\r\n", "\n", $string); but it had no effect whatsoever. The solution was to do an fread() of the whole file, explode the contents by PHP_EOL, and loop with foreach ($lines as $line), so that no line got duplicated.
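A minimal sketch of that workaround, with a temporary stand-in file created with Windows "\r\n" endings to mimic the problem case (normalizing first, then splitting once, so a "\r\n" can never be split into two reads):

```php
<?php
// Stand-in for the real data file, written with Windows line endings.
$path = tempnam(sys_get_temp_dir(), 'eol');
file_put_contents($path, "first\r\nsecond\r\nthird");

// Read the whole file at once instead of line-by-line fgets.
$fh = fopen($path, 'rb');
$contents = fread($fh, filesize($path));
fclose($fh);

// Normalize "\r\n" to "\n", then split once. (PHP_EOL is "\n" on the
// LAMP server described above, so exploding by PHP_EOL works the same way.)
$lines = explode("\n", str_replace("\r\n", "\n", $contents));

foreach ($lines as $line) {
    echo $line, "\n";   // no doubled or empty lines
}
unlink($path);
```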

One thing I discovered with fgets, at least in PHP 5.1.6, is that you may have to use an if statement to keep your code from running rampant (and possibly hanging the server). This can cause problems if you do not have root access on the server you are working on.

I have noticed that without the if statement, fgets does not seem to notice when $fh is undefined (i.e., when "filename" does not exist). If that happens, it will keep attempting to read from the nonexistent file handle until the process is killed administratively or the server hangs, whichever comes first.
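A hedged sketch of that guard ('somefile.txt' is a hypothetical filename; the @ suppresses the fopen warning so the failure can be handled explicitly):

```php
<?php
// Guard the read loop with an if, so fgets is never called on a failed handle.
$fh = @fopen('somefile.txt', 'r');   // hypothetical, likely-missing file

if ($fh !== false) {
    while (!feof($fh)) {
        $line = fgets($fh);
        // process $line ...
    }
    fclose($fh);
} else {
    echo "Could not open somefile.txt\n";
}
```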

Note that - afaik - fgets reads a line until it reaches a line feed (\n). Carriage returns (\r) aren't processed as line endings. However, nl2br inserts a <br /> tag before carriage returns as well. This is useful (but not nice - I must admit) when you want to store multiple lines in one.

<?php
function write_lines($text) {
    $file = fopen('data.txt', 'a');
    fwrite($file, str_replace("\n", ' ', $text) . "\n");
    fclose($file);
}

Regarding Leigh Purdie's comment (from 4 years ago) about stream_get_line being better for large files: I decided to test this in case it had been optimized since then, and I found that Leigh's comment is simply incorrect.

fgets actually has slightly better performance, but the test Leigh ran was not set up to produce valid results.

The reason the test is invalid is that its buffer size of 65535 is completely unnecessary:

piping the output of "yes 'this is a test line'" into PHP makes each line 19 characters plus the delimiter.

So while I don't know why stream_get_line performs better with an oversized buffer, if both buffer sizes are correct (or left at the default) the two have a negligible performance difference - although, notably, stream_get_line is more consistent. If you're thinking of switching, make sure you are aware of the difference between the two functions: stream_get_line does NOT append the delimiter to the returned line, and fgets DOES.
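A small sketch of that delimiter difference, using an in-memory stream:

```php
<?php
// fgets keeps the trailing "\n"; stream_get_line strips the delimiter.
$stream = fopen('php://memory', 'r+');
fwrite($stream, "first line\nsecond line\n");
rewind($stream);

$a = fgets($stream);                        // includes the "\n"
$b = stream_get_line($stream, 8192, "\n");  // delimiter removed

var_dump($a === "first line\n");    // true
var_dump($b === "second line");     // true
fclose($stream);
```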

Here are the results on one of my servers:

Buffer size 65535
stream_get_line: 0.340s
fgets: 2.392s

Buffer size of 1024
stream_get_line: 0.348s
fgets: 0.404s

Buffer size of 8192 (the default for both)
stream_get_line: 0.348s
fgets: 0.552s
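A rough, machine-dependent benchmark sketch along the lines of the numbers above. The input mirrors the note's "yes 'this is a test line'" data (19 characters plus the "\n" delimiter); absolute timings will vary per machine:

```php
<?php
// Generate a stand-in input file: 100,000 identical 20-byte lines.
$path = tempnam(sys_get_temp_dir(), 'bench');
file_put_contents($path, str_repeat("this is a test line\n", 100000));

$start = microtime(true);
$fh = fopen($path, 'r');
$fgetsLines = 0;
while (fgets($fh, 8192) !== false) { $fgetsLines++; }
fclose($fh);
printf("fgets:           %d lines in %.3fs\n", $fgetsLines, microtime(true) - $start);

$start = microtime(true);
$fh = fopen($path, 'r');
$sglLines = 0;
while (stream_get_line($fh, 8192, "\n") !== false) { $sglLines++; }
fclose($fh);
printf("stream_get_line: %d lines in %.3fs\n", $sglLines, microtime(true) - $start);

unlink($path);
```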

When working with VERY large files, PHP tends to fall over sideways and die.

Here is a neat way to pull chunks out of a file very fast without stopping mid-line - it stops at the end of the last complete line instead. It pulled a 30+ million line, 900 MB file through in about 24 seconds.

NOTE: $buf just holds the current chunk of data to work with. If you try "$buf .=" (note the dot in front of the '=') to append to $buf, the script will slow to a grinding crawl at around 100 MB of data, so work with the current chunk and then move on!
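The note's own code was not included, but the idea it describes can be sketched as follows (a small generated temp file stands in for the 900 MB file; the chunk size is an assumption to tune for real workloads): read a fixed-size chunk, trim it back to the last complete line, and rewind so the next chunk starts on that line boundary.

```php
<?php
// Stand-in data: 1000 complete lines of 13 bytes each.
$path = tempnam(sys_get_temp_dir(), 'chunk');
file_put_contents($path, str_repeat("line of data\n", 1000));

$fh = fopen($path, 'rb');
$chunkSize = 4096;      // assumption; use something much larger for huge files
$total = 0;

while (!feof($fh)) {
    $buf = fread($fh, $chunkSize);
    if ($buf === false || $buf === '') {
        break;
    }
    $cut = strrpos($buf, "\n");
    if ($cut !== false && !feof($fh)) {
        // Push the trailing partial line back for the next iteration.
        fseek($fh, $cut + 1 - strlen($buf), SEEK_CUR);
        $buf = substr($buf, 0, $cut + 1);
    }
    // Work with the complete lines in $buf here; avoid "$buf .=" appends,
    // which is exactly the memory trap the note warns about.
    $total += substr_count($buf, "\n");
}
fclose($fh);
echo $total, "\n";      // 1000
unlink($path);
```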

...will result in a timeout after the default of 60 seconds on my install. This behavior is non-standard (not POSIX-like) and seems to me to be a bug - or if not, a major caveat which should be documented more clearly.

After the timeout, fgets() will return FALSE (=== FALSE); however, you can check whether the stream (file handle) has really closed by checking feof($stream), e.g.
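A sketch of that check (an empty in-memory stream stands in for the network stream; stream_get_meta_data() and its 'timed_out' field are shown as a complementary, documented way to detect a timeout):

```php
<?php
$stream = fopen('php://temp', 'r+');   // stand-in for a socket stream
$line = fgets($stream);                // empty stream: returns FALSE

if ($line === false) {
    if (feof($stream)) {
        echo "really at EOF / stream closed\n";
    } else {
        // On a real socket, check whether the read simply timed out.
        $meta = stream_get_meta_data($stream);
        echo $meta['timed_out']
            ? "read timed out, stream still open\n"
            : "no data for another reason\n";
    }
}
fclose($stream);
```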

A simple example of how to use "fgets". This sample shows how to read the information from a .txt file:

<?php
$pointer = fopen("protocol.txt", "r"); // r = read
if ($pointer != false) // Was it possible to open the file?
{
    // Reads all information from protocol.txt
    while (!feof($pointer)) // the loop runs until the pointer is at the end of the file
    {
        $row = fgets($pointer); // $row reads one row of the file
        echo "<p>" . $row . "</p>";
    }
    fclose($pointer); // the file must be closed
}
else
{
    echo "<p>It was not possible to open the file!</p>";
}
?>

There seems to be an interaction between sockets and the auto_detect_line_endings setting that can cause rather peculiar behavior. Apparently, if the first line read from a socket is split across two TCP packets, the detector will look at the first TCP packet and determine that the system uses MacOS (\r) line endings, even though the LF is contained in the next packet. For example, this affected the PEAR Net_SMTP package, which would fail mysteriously for only some email servers.
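One possible mitigation - an assumption on my part, not a documented fix for that PEAR bug - is simply to switch the detector off before reading from a socket, so a first line split across TCP packets cannot make PHP guess "\r" endings:

```php
<?php
// auto_detect_line_endings is off by default; forcing it off guarantees
// the packet-boundary misdetection described above cannot occur.
ini_set('auto_detect_line_endings', '0');
var_dump(ini_get('auto_detect_line_endings'));
```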

As you can see, the results are very close, with fgets coming just a little bit ahead. I suspect that fgets is reading backwards through the buffer, or loads everything into itself and then tries to figure it out, whereas a correctly set buffer does the trick. Dade Brandon states that fgets lets you know how the line was delimited; stream_get_line lets you choose what you want to call the delimiter via its third parameter.

fgets has one more important option: you don't have to set the length of the line. So in a case where you may not know the line length - say, when handling the HTTP protocol, or log lines - you can simply leave it off and still get great performance.

Even though this is not really related to PHP and its internals, take care when using fgets to read input from the CLI on Linux systems, as it may behave unexpectedly because of the line-length limits on these systems. For example, doing rtrim(fgets(STDIN), "\n") on user input longer than 4095 characters will cut the input string to 4095 characters. This shortcoming can be worked around by running "stty -icanon" before the script, followed by "stty icanon" after the script has run.
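The workaround above as a small terminal-configuration fragment (my_script.php is a hypothetical script that reads long lines via fgets(STDIN)):

```shell
# Disable canonical input mode so the terminal driver's 4095-character
# line limit no longer applies, then restore it afterwards.
stty -icanon
php my_script.php
stty icanon
```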