Is there a quick way of displaying the last hour of a logfile? I could write a long bash script for that, but maybe there is already a way of doing it.
Reason: I want to see which clients renewed their DHCP leases within the last hour, to see if they are still there.
I have a couple of clients that nmap will not find because they go to sleep (iPhones), but I still need to know whether they are present.

If now is "Oct 21 11:48", that would give me everything from 10:00 to 10:59, but I need everything from 10:48 to 11:48.
Of course I could do it with a for loop etc., but it would consume quite some time, and I wonder if there's an easier way to do this.
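One way to get a rolling one-hour window without a for loop is to turn each line's timestamp into a sortable key and compare it against a cutoff computed once in the shell. This is only a sketch: it assumes classic syslog timestamps ("Oct 21 11:48:32 host dhcpd: ..."), GNU `date` (for `-d '1 hour ago'`), and that all entries are from the current year (it will misbehave briefly around New Year).

```shell
# Print only log lines whose syslog timestamp falls within the last hour.
# Reads the log on stdin. Assumptions: GNU date, "Mon DD HH:MM:SS" prefix,
# current-year entries. The function name last_hour is just for illustration.
last_hour() {
    awk -v limit="$(date -d '1 hour ago' '+%Y%m%d%H%M%S')" \
        -v year="$(date '+%Y')" '
    BEGIN {
        # map month names to numbers for the sortable key
        split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", names, " ")
        for (i = 1; i <= 12; i++) mon[names[i]] = i
    }
    {
        hms = $3; gsub(":", "", hms)                     # "11:48:32" -> "114832"
        key = sprintf("%04d%02d%02d%s", year, mon[$1], $2, hms)
        if (key >= limit) print                          # zero-padded keys compare correctly
    }'
}
```

Usage would be something like `last_hour < /var/log/syslog` or `grep 'dhcpd' /var/log/syslog | last_hour`. On a systemd box, `journalctl --since '1 hour ago'` gets you the same window with no script at all.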

Pointy-haired bosses know they can spend less on such a tool than on the risk of you cobbling together some shit in perl that only you understand and which they can't get to run any more two months after they fire you.
_________________
History teaches us that men and nations behave wisely once they have exhausted all other alternatives. -- Abba Eban

Hey, sometimes the truth hurts. It's called "capturing institutional knowledge" and "automation tactics which facilitate persistent organizational learning".

I'm sure there are other tools out there without the PHB soft-porn. They're likely to actually work, too.

Clever, but Splunk actually works. Want to extract events from a certain category within a certain time frame, and filter out stuff you don't need? You just, like, do it. Fuck your archaic awk skills, son, that's like revving between gear shifts. Evolution.
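For what it's worth, the kind of search being described looks roughly like this in Splunk's search language. The index and field names here are invented for illustration; a real deployment would have its own:

```
index=dhcp sourcetype=dhcpd earliest=-1h
| stats latest(_time) AS last_seen BY client_mac
```

That is, "everything from this category in the last hour, summarized per client" in one expression, which is the point being argued.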
_________________“If You Meet the Buddha on the Road, Kill Him”

Sometimes the right tool for the job is a helicopter, sometimes it's a paper-clip.

The nice thing about being able to use command-line tools and being able to code is that you can investigate any kind of one-off issue and create your own tools, just right for any job (at the cost only of your time).

The nice thing about packaged software is that, often, people have similar problems and can use similar solutions, and there's a big economy of scale to be had in re-using what a lot of people have contributed to and in not reinventing the wheel.

I once saw a consulting firm send an analyst to Microsoft Access training because one of the Partners had it in his head that he needed a "database" to run some "queries" against. It took the poor guy three weeks to build the database, and it turned out to be a one-time requirement (and a spreadsheet would have done just fine).

This also reminds me of one of my clients (a large steel conglomerate) who had their own custom-built office suite, written from scratch, and a team of eleven people maintaining and enhancing it (and it looked like something from 1993).

Also reminds me of the one time cokehabit came in here complaining about Firefox running slowly. Then, for some reason, we saw a screenshot, and he had something like 18 add-ons running (I do not exaggerate).

I agree with you and pjp; I'm just saying that in an environment where you have a bazillion nodes, servers, and whatnot, it's cool to be able to access events from the whole environment in one place, using queries and expressions, instead of sshing into some machine, grepping files, and all that pedestrian stuff.

Especially if you are NOT an admin. It's not only admins that need access to live information, and some admins are so dense that I'd rather bypass them and extract the information I need myself. All one needs is granted access to a class of information, which is less of a security issue (and less procedure) than getting your public key onto live machines.

It's the difference between getting the information in 10 minutes and getting it at the admin's convenience (next day, if you're lucky).

For example, at a payment solutions company, I can extract live data (purchases and whatnot) in a preferred format (JSON), anonymize it, and replay the traffic pattern in a test environment as part of performance tests. So the option is: log into Splunk, enter the expression that defines which events I'm interested in, and get the data. Or I could send an e-mail to an admin and get a reply in the style of "What?"

If you say it works, I'll take your word for it. But if I'm going to pay for something, I'm not interested in 90% of that being marketing voodoo.
_________________
lolgov. 'cause where we're going, you don't have civil liberties.