Sometimes you need those extra files. For example, the .aux file keeps information about cross-references and the like. They won't work unless LaTeX can read that information from the .aux file.
– Seamus, Feb 15 '11 at 13:43

Try using Gummi. You'll have just the .tex file and the .pdf file. If you check out the screenshots, the "Error Output" will let you look at the log. It just doesn't leave a file behind to clutter things up. Gummi is best for small documents since it is constantly compiling to give you an almost immediate view of what you're creating.
– DJP, Aug 13 '11 at 21:45

FWIW, ConTeXt stores all the auxiliary information in a single file filename.tuc. So overall, only two extra files are written: filename.tuc and filename.log. If you want, you can compile the document using context --purge filename.tex which will delete the .tuc and .log files at the end of the run.
– Aditya, Sep 15 '14 at 22:17

18 Answers
You might not care about these files, but pdflatex cares quite a bit. These files hold information collected during the first run(s) and are needed to build the final PDF with a correct ToC, references, PDF bookmarks, etc.

You can delete these files afterwards, e.g. manually or with a front-end tool like latexmk (its -c option). However, future compilations of the PDF will then again require several compiler runs.

You can define an output directory for all files using the -output-directory command-line argument of pdflatex. After compilation you can then move the PDF into the current directory.
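A sketch of that workflow (the file name main.tex and the directory name build are hypothetical; the compile step assumes a TeX installation and is shown only as a comment):

```shell
#!/bin/bash
# The compile step itself (assumes pdflatex is on PATH; not run here):
#   pdflatex -output-directory=build main.tex
# Afterwards, pull only the finished PDF back next to the source;
# build/ keeps the .aux, .log, .toc and friends.
move_pdf() {
  local dir="$1" name="$2"
  [ -f "$dir/$name.pdf" ] && mv "$dir/$name.pdf" .
}
```

For example, `move_pdf build main` leaves main.pdf in the current directory while the auxiliary files stay in build/.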

Why does LaTeX do this? Why doesn't it do the multiple runs in the background, keeping toc, references, bookmarks, etc. in memory instead of cluttering the filesystem?
– Jonathan Baldwin, Oct 24 '13 at 2:19

@JonathanBaldwin TeX (upon which LaTeX is built) was written in 1978, before the PC era. It was designed to work in resource-constrained environments, and therefore uses explicit multiple passes instead of keeping stuff in memory (and then breaking everything like Word does).
– Paul de Vrieze, Mar 30 '14 at 20:51

Texmaker has an option called "Use a 'build' subdirectory for output files" that will automatically put all extra files in a folder called build.
– qwr, Oct 17 '16 at 5:02

@MartinScharrer How do you automate the moving of the output pdf to a separate directory? Is it possible? Thanks.
– Nanashi No Gombe, Mar 23 '18 at 9:10

In addition to Martin's answer, I thought it might be useful to explain why LaTeX creates all these extra files. Let's take the example of the .aux file.

Let's say you have a \label in your document and a reference to it somewhere before the point where the label occurs. When pdflatex reads your .tex file, it reads the \ref first. At that point it doesn't know what to do with this ref: it hasn't yet encountered what it refers to. When pdflatex later reaches the label, it makes a note of what the label refers to. By "makes a note" I mean it writes something to the .aux file that says, roughly, "when you encounter references to this, this is what is meant".

Then, on a second pdflatex run, when it reaches the reference, it looks in the .aux file and it knows what it is supposed to refer to and can substitute in the relevant text.
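A minimal sketch of this round trip (the file name and the exact .aux line are illustrative; real \newlabel entries can carry more fields):

```latex
% main.tex
\documentclass{article}
\begin{document}
See Section~\ref{sec:proof}.  % 1st run: prints "??" -- label unknown so far
\section{The proof}
\label{sec:proof}             % 1st run: writes to main.aux, roughly:
                              %   \newlabel{sec:proof}{{1}{1}}
\end{document}
% 2nd run: \ref{sec:proof} is resolved from main.aux and prints "1"
```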

Auxiliary files are used for lots of other similar things (like tables of contents, lists of figures and so on). They are annoying, but deleting them after each run would break things. A lot.

I don't mean to start a flamewar here, but maybe it's fairer to say that the .aux file is a kludge due to the current implementation of TeX. When I compile a C program, the compiler doesn't write a file with the memory offsets assigned to every variable, so it isn't clear why TeX should do something similar. Simply put, it's an artifact from the past.
– Federico Poloni, Oct 6 '11 at 11:17

Actually, the C compiler does create such a file: the object file. Without knowing where the variables are, the linker would have a pretty hard time linking everything together. If you run make clean, it usually removes these files for you, which is similar to what you do with tools like latexmk. You are right that in most cases with TeX this could instead be resolved by TeX itself making multiple passes through the file, storing the data in memory. I believe I have seen someone trying to do something like that, but I don't really see why.
– Jan Hlavacek, Oct 23 '11 at 16:23

The C compiler creates the .o file only if it is given an explicit command-line switch to do it. Otherwise it keeps all in memory and spits out only the executable file. Moreover, when you modify only a portion of your program, typically most .o files are unchanged, so they are effectively useful as a "compilation cache" to reduce work. In contrast, typically, when you add a paragraph at the end of a .tex file the references change and you have to compile twice, so there is no saving. Tell me, how many other languages do you know where you normally have to compile your file twice?
– Federico Poloni, Feb 15 '12 at 8:35

You can use the .aux files as a cache too if you use, for example, \includeonly.
– Juri Robl, Jun 11 '12 at 9:51
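For example (a sketch; the chapter file names are hypothetical), with \includeonly the per-file .aux files really do act as a compilation cache:

```latex
\documentclass{book}
% Only chapter2.tex is retypeset on this run; cross-references into
% chapter1 are resolved from the cached chapter1.aux of an earlier run.
\includeonly{chapter2}
\begin{document}
\include{chapter1}
\include{chapter2}
\end{document}
```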

@FedericoPoloni While a C compiler can compile in a single pass (if you don't want fancy optimizations), more modern languages cannot be compiled in a single pass (e.g. Java or C#). In addition, only toy programs get compiled by passing all source files directly to the compiler. I agree with you, though, that in this day and age there is little reason for TeX and LaTeX not to perform those passes by themselves.
– Paul de Vrieze, Mar 30 '14 at 20:56

You already got lots of very good answers explaining why pdflatex needs all those auxiliary files. However you might still feel frustrated about having to live with all those files polluting a directory where (I'm guessing) you would like to keep all your LaTeX documents and their corresponding .pdf outputs.

The best solution is to keep one directory for each document you have.

You can keep, for example, a main Documents folder and then individual Paper1, Paper2, ... folders; each with their own main.tex file. Then you can happily let LaTeX store whatever auxiliary files it wants in their respective folders. The difference is that now, for you, there is a clear structure of where your documents are.

pdflatex also has a command-line option, -aux-directory=dir, so you could simply have all your aux files and such kept elsewhere. I remember seeing an easy way to make an alias to do this with just the pdflatex command, but I can't find it.
– MercurialMadnessMan, Feb 15 '12 at 18:02

@MercurialMadnessMan: The version of pdflatex I have under Debian Squeeze does not include an -aux-directory option. I think that is a MiKTeX option.
– SabreWolfy, Jun 3 '12 at 12:31

I am the boss of my computer. If I don't want my computer to make these files, it should not make these files. I understand that in the '80s it was wasteful to recompute everything. Today, my computer can recompile my 1-2 page document faster than all the time I lose to the auxiliary-file mess that gets created.
– mnr, Sep 28 '18 at 15:31

TeX writes the .log file. It contains more information about processing the job than what is shown on the console. It's very useful for debugging.

LaTeX writes the .aux and .toc files. They are used for managing cross-references and table-of-contents information. Since TeX's organism digests the input document from beginning to end, once per job, there's no other way to have a part of the document change based on later content.

The beamer class writes .snm and .nav files. The .snm file assists with including images of slides in an article version of the document. The .nav file assists in creating navigation bars on slides. Beamer apparently is not set up to suppress writing those files when they are not needed (i.e., when you do not need the functionality they enable).

The hyperref package writes the .out file to assist in creating bookmarks in the PDF file. Sometimes this isn't needed; I looked at the last few jobs I had that used hyperref, and the .out files are empty. Again, this doesn't seem suppressible.

You can write the auxiliary files to a temporary directory. Then you'll have to instruct TeX to look in that temporary directory. Also you will have to make sure that the included auxiliary files are the right ones, not ones placed in the temporary directory by another process.

I think it's best to learn to live with these files. If you don't want them after you're "done" with writing the document, just delete them.

If everyone thought according to your last paragraph, we wouldn't even have the printing press let alone computers and LaTeX. Rejecting the status quo is the first step of progress.
– Jonathan Baldwin, Oct 24 '13 at 2:36

@JonathanBaldwin: Not all stati quo are created equal. IMO the work needed to force LaTeX to not use auxiliary files will not lead to inventions on the level of the printing press.
– Matthew Leingang, Nov 1 '13 at 15:23

If you are running (pdf)latex "by hand" then the only files created will be the .log file and the .dvi (or .pdf) file. If you are using something like synctex then there will be a few more files for controlling the automatic compilation.

I have the (Mac) application Hazel watching my Articles folder and subfolders, with rules that delete all these auxiliary files after a certain interval since they were last modified. Usually it's two days or so. It cleans up files for papers I'm not currently working on.

Addressing the concerns about deletion raised above: if such files are needed in the future, they can be created anew. Though this requires multiple runs, latexmk automatically runs the TeX engine the necessary number of times, so they will be recreated.

#!/bin/bash
# List LaTeX auxiliary files in .hidden so that .hidden-aware
# file managers (e.g. Nautilus) stop showing them.
shopt -s nullglob   # a pattern with no matches expands to nothing
echo -n > .hidden   # create (or empty) the .hidden file
for i in *.out; do echo "$i" >> .hidden; done
for i in *.log; do echo "$i" >> .hidden; done
for i in *.aux; do echo "$i" >> .hidden; done
for i in *.bbl; do echo "$i" >> .hidden; done
for i in *.blg; do echo "$i" >> .hidden; done
for i in *.dvi; do echo "$i" >> .hidden; done
for i in *.toc; do echo "$i" >> .hidden; done
for i in *.synctex.gz; do echo "$i" >> .hidden; done

Put this shell script in a folder on your PATH. To do that, add this line:

export PATH=$PATH:~/scripts

to your ~/.bashrc file (in this example ~/scripts is the folder containing the script above); it can go at the end of that file. You may need to run source ~/.bashrc, or open a new terminal, for the change to take effect, but after that you can run the script from anywhere.

Now, to clean a directory, open a terminal there and run the script. For example, if the script is named tex.clean, you'd cd into the folder needing cleaning and enter

tex.clean

in the terminal; when you then look at the folder in the graphical file browser, it should be clean :)

All the files with the extensions listed in the script will be written to that directory's .hidden file. This means they will be hidden, but still usable :).

In Nautilus (at least on Fedora), Ctrl+H shows or hides hidden files.

P.S. I tried to make this as detailed as I could so that people new to bash and the command line, like me, could use it. I hope I succeeded :).

You can avoid repetition of code by writing for i in *.{out,log,aux,bbl,blg,dvi,toc,synctex.gz}.
– Ryan Reich, Jul 6 '12 at 1:22
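Applied to the script above, that suggestion collapses the eight loops into one (a sketch; wrapped in a function here, with nullglob so patterns that match no files expand to nothing rather than being written to .hidden literally):

```shell
#!/bin/bash
# Same effect as the script above, with a single brace-expanded pattern.
hide_tex_junk() {
  shopt -s nullglob              # unmatched patterns vanish
  echo -n > .hidden              # create (or empty) .hidden
  for i in *.{out,log,aux,bbl,blg,dvi,toc,synctex.gz}; do
    echo "$i" >> .hidden
  done
}
```

Call hide_tex_junk in the directory you want cleaned.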

I didn't know that writing file names to a file called .hidden will hide them like all other files that start with .. In which unices is that supported? (not Mac OS as far as I can tell)
– Matthew Leingang, Nov 20 '13 at 20:59

It's not a Unix feature; only the Nautilus file manager (and possibly others, most certainly its forks Nemo and Caja) supports hiding via .hidden. If you browse that directory with ls you'll continue to see those files. Still, it's a fairly good way to hide bothersome files if you are a Nautilus user.
– p91paul, Mar 30 '15 at 9:48

This has nothing to do with file managers. On Linux (Unix), any file (or directory) whose name starts with a dot is always a hidden file, even in a terminal (plain ls does not show them). What a file manager does is un-hide those files, either by default (mc) or after a shortcut such as Ctrl+H.
– Fran, Feb 2 at 6:18

Of course, ls -a or ls .* can show the hidden files of the current directory, but not ls alone.
– Fran, Feb 2 at 6:26

Though this is an older question, I want to contribute something that might help people who love clean workspaces. I created a shell script that deletes all junk files at once. It will work on Mac and Linux; with some adjustments Windows should be possible too. Download the file "cleanlatexjunk.sh" from my repository and follow the instructions:

In the meta-command for Build & View add | txs:///cleanjunk at the end

Important: as described by others above, the "junk" files are actually needed. Therefore you should also adjust your build workflow to do multiple compiles in a row; this ensures that all ToCs and links are rendered correctly. After two or three compile runs, you can safely use my script as described.

Now every build & view execution should result in a clean workspace :-)

Going back to why there are so many extra files (and why the C compiler doesn't, apparently, do this): C is a language designed for single-pass compilation; names must be defined before they are used. LaTeX isn't. In a world with tight memory, the way to handle this is to use two passes, storing the information needed for the second pass on the way through.

In today’s world with big memories, this is not entirely necessary. A LaTeX compiler could in principle store everything needed in internal tables and go back and patch in information as it became available.

If you really want to manage this properly, you need to work out how to use a makefile (or another similar build manager) to decide automatically when a second run is needed (some cases are obvious, like when you change your bibliography; others less so). Read this if you want to learn about makefiles and LaTeX in detail.
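A minimal sketch of such a Makefile (the target name main.tex is hypothetical; it assumes latexmk is installed, which itself decides how many TeX runs are needed and so side-steps the hard "when to rerun" logic):

```make
# Delegate rerun detection and dependency tracking to latexmk.
main.pdf: main.tex
	latexmk -pdf main.tex

.PHONY: clean
clean:
	latexmk -c    # removes .aux/.log/.toc and friends, keeps the PDF
```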

"C is a language designed for single-pass compilation." False. C uses multiple passes (preprocessing, compiling, maybe some optimizing, linking, maybe more optimizing) and a C compiler may even choose to dump 5 million files into pwd that are useless after the passes are finished. Most compilers don't unless you tell them to.
– Jonathan Baldwin, Nov 7 '13 at 0:42

There's still another option: you can merge the Clean button and the Close button in Texmaker so that when you close, all the auxiliary files are deleted. This way you don't have to remember to Clean all the time, yet you can still use the auxiliary files as long as you don't close.

This can be done by downloading the source code. The important file is texmaker.cpp, and the relevant functions are CleanAll and fileClose.

Texmaker includes an option to "use a build directory for output files", which kind of does the trick.

If you enable this setting, Texmaker creates a 'build' folder in the same directory as the document's .tex file. It automatically cleans the old output and writes into the build folder instead.

The setting works well but isn't perfect. For example, I've found that after adding a new citation to the document, BibTeX complained when it couldn't find the .aux file. I had to copy the .aux file back out of the build folder to compile the document. I'm not sure if there's a way to fix that yet.