I'm the first to extol the virtues of scripting languages, Python and Perl in particular. But they aren't always the best tool for the job. It's easy to forget how powerful the original Unix (and now GNU) text processing utilities are. Someone once asked me how to combine specific columns from multiple CSV files into a new CSV file. They had the start of a Perl solution that wasn't working correctly and wanted advice on it. My advice was to use a one-line shell solution instead:

paste -d, <(cut -d, -f3 file1.csv) <(cut -d, -f3 file2.csv) > output.csv

This combines the third column from each specified file into a new file. It relies on process substitution, a feature of modern Bourne-style shells such as bash: the two parts that look like <(...) each run a command and stand in for a file containing that command's output, which paste then reads as if it were an ordinary file.
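Here it is in action, using two small sample files (the file names and contents are invented for illustration):

```shell
# Create two hypothetical CSV files whose third column we want.
printf 'id,name,score\n1,alice,90\n2,bob,85\n' > a.csv
printf 'id,name,score\n1,alice,12\n2,bob,34\n' > b.csv

# Extract the third column of each and paste them side by side,
# joined with a comma.
paste -d, <(cut -d, -f3 a.csv) <(cut -d, -f3 b.csv) > output.csv

cat output.csv
# score,score
# 90,12
# 85,34
```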

You can paste columns from as many files as you need. One catch, of course, is that this only works with simple CSV data, meaning there are no embedded commas in the data fields themselves. But it is far more understandable than any lengthy scripting-language solution.
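To see why embedded commas are a problem, consider a row with a quoted field (hypothetical data): cut splits on every comma, whether quoted or not, so the field count goes wrong.

```shell
# A row whose second field contains a comma inside quotes.
printf '1,"Smith, John",42\n' > quoted.csv

# cut treats every comma as a delimiter, so what it reports as the
# third field is really the second half of the quoted name.
cut -d, -f3 quoted.csv
# →  John"
```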

One other tip: if you need to get rid of the first row, which might contain column headers, just pipe the output through tail:
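A minimal sketch, with invented file names and contents: tail -n +2 prints from the second line onward, dropping the header row that cut carried through from each file.

```shell
# Two hypothetical input files, each with a header row.
printf 'id,name,score\n1,alice,90\n2,bob,85\n' > a.csv
printf 'id,name,score\n1,alice,12\n2,bob,34\n' > b.csv

# Same paste command as before, with tail skipping the header line.
paste -d, <(cut -d, -f3 a.csv) <(cut -d, -f3 b.csv) | tail -n +2 > output.csv

cat output.csv
# 90,12
# 85,34
```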