I regularly use awk to generate some quick statistics for the firewall, the nameserver and the proxy (e.g. kern.log, daemon.log, squid3/access.log).

Recently I've started to play with IPv6, but realized that IPv6 addresses appear in their extended format in ip6tables, while other services log them in their compressed format. This is quite annoying when you have to switch between the two formats manually, and the compressed format is far more readable than the extended one.

The two awk functions below should do it for you. You can include them (as shown below) in your awk scripts.
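The original functions aren't reproduced here, but a minimal sketch of the expansion direction (compressed form to the full 8-group form that ip6tables logs) might look like the following; the function name `expand6` and the test address are my own, and embedded IPv4 notation isn't handled:

```shell
# Sketch only: expand a compressed IPv6 address to the full 8-group form.
expanded=$(awk '
function expand6(a,    i, n, grp, zeros, parts, out) {
    if (a ~ /::/) {
        n = split(a, parts, ":")
        grp = 0
        for (i = 1; i <= n; i++)
            if (parts[i] != "") grp++
        # rebuild the zero groups that the "::" elides
        zeros = ""
        for (i = 0; i < 8 - grp; i++) zeros = zeros ":0"
        sub(/::/, zeros ":", a)
        sub(/^:/, "", a); sub(/:$/, "", a)
    }
    # left-pad every group to 4 hex digits
    n = split(a, parts, ":")
    out = ""
    for (i = 1; i <= n; i++) {
        while (length(parts[i]) < 4) parts[i] = "0" parts[i]
        out = out parts[i] (i < n ? ":" : "")
    }
    return out
}
BEGIN { print expand6("2001:db8::1") }')
echo "$expanded"   # 2001:0db8:0000:0000:0000:0000:0000:0001
```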

I'm using awk more and more myself, having got hooked sometime last year. There should be a banner someplace that says "awk does more than '{print $1}'", as I often see the likes of "* | grep | sed 's/foo/ba/g' | awk '{print $1}'" when in most cases this could have been handled entirely by awk '/foo/{gsub(/foo/,"ba");print}' <(input).

If you're parsing really large log files and know a little about which fields contain what, then awk can make parsing out specific data a snip, e.g.:

Code:

awk '$23=="DPT=25"' /var/log/iptables.log
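Building on that: once you can select on one field, awk's associative arrays let you aggregate in the same pass. A hedged sketch with a toy input (real iptables field positions differ, so check yours first):

```shell
# Toy stand-in for log lines; in a real kernel log SRC= and DPT= sit at
# whatever field numbers your logging prefix produces. Find them with:
#   awk '{ for (i = 1; i <= NF; i++) print i, $i; exit }' logfile
hits=$(printf 'x SRC=1.2.3.4 DPT=25\nx SRC=5.6.7.8 DPT=80\nx SRC=1.2.3.4 DPT=25\n' |
    awk '$3 == "DPT=25" { n[$2]++ } END { for (k in n) print n[k], k }')
echo "$hits"   # 2 SRC=1.2.3.4
```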

One point though: these are not "functions" but "program files" ... not to be facetious.

Quote:

I'm using awk more and more myself, having got hooked sometime last year. There should be a banner someplace that says "awk does more than '{print $1}'", as I often see the likes of "* | grep | sed 's/foo/ba/g' | awk '{print $1}'" when in most cases this could have been handled entirely by awk '/foo/{gsub(/foo/,"ba");print}' <(input).

True, but watch out when using some of the nice _GNU_ awk features; these are not POSIX and thus not available everywhere!
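For example, `gensub()` is a gawk extension (mawk and BusyBox awk don't have it), while `sub()`/`gsub()` are POSIX. A quick sketch; the gawk call is guarded since gawk may not be installed:

```shell
# POSIX: gsub() edits $0 (or a named variable) in place
posix_out=$(echo 'a-b-c' | awk '{ gsub(/-/, ":"); print }')
echo "$posix_out"   # a:b:c

# gawk-only: gensub() returns the modified copy instead
if command -v gawk >/dev/null 2>&1; then
    gawk 'BEGIN { print gensub(/-/, ":", "g", "a-b-c") }'
fi
```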

Quote:

If you're parsing really large log files and know a little about which fields contain what, then awk can make parsing out specific data a snip, e.g.:

Code:

awk '$23=="DPT=25"' /var/log/iptables.log

I've written many awk programs to generate statistics (iptables/ip6tables, dnsmasq, squid3 and a few others), and I'm even quite proud of the result! But honestly, now that I'm learning Perl, I realize what's written and said everywhere: Perl is good at (among a great deal of other things) parsing text files.

Awk is often installed by default, but so is Perl. I'm now using Perl in my one-liners where I used to use awk. I still think it's important to know how to use awk (e.g. to avoid those grep | sed pipelines where a single awk call would have done it). But if you have some time to kill, you know what you can do!

Quote:

One point though: these are not "functions" but "program files" ... not to be facetious.

Well, I actually shared these two functions to be used within _your_ program files.

Quote:

[...] Awk is often installed by default, but so is Perl. I'm now using Perl in my one-liners where I used to use awk. I still think it's important to know how to use awk (e.g. to avoid those grep | sed pipelines where a single awk call would have done it). But if you have some time to kill, you know what you can do!

I've been put off by Perl somewhat, and though I'd agree it outdoes awk, sed, ed and sh for text handling, I often get the feeling that it lacks something like transparency (for want of a better word). I was attempting to learn it some years back, but I quickly became frustrated when approaching other people's code; it was like staring into a dark recess. This could be seen as an advantage, and no doubt Perl is "flexible" in terms of how it can be wielded, but I never got the sense that I was making any headway. This was no doubt exacerbated by the fact that I was under a heavy workload at the time; I should probably revisit it and see if I fare better now that I'm not doing 12-hour workdays.

truc wrote:

Quote:

One point though: these are not "functions" but "program files" ... not to be facetious.

Well, I actually share these two functions to be used within _your_ program files

Yes, OK, but they are more like programs, and I was merely pointing out that in awk parlance the term "program file" is used. Anyhow, it's semantics: "module", "function", "library" ... take your pick.