
Topic: Counting unique lines from a parsed text file

Hey there, forums.

My current predicament is that I am parsing a log file for a website to gather information for an admin page. The log file contains the time, userid, and session-token for each login the website completes. My goal is to end up with an integer that represents the number of unique users who logged into the site on a given day. Any ideas?

Just for some additional information, we open the text file and then parse it using a regex, putting the 3 pieces of information into an array. Hope that's enough information to at least get some questions going. Thanks in advance, everyone.

Re: Counting unique lines from a parsed text file

Can you post the format of your log file/array (perhaps the output of array.inspect)? The cleanest solution would be to generate an array of hashes, then you could do something like:

logins.collect{|login| login[:user]}.uniq.length

Alex

The log file has lines that look basically like this:

Login: <[Date/Time]> UserID: [userid as int] Token: [session token as string]

The regex pulls out the text I put between [ ] and stores the pieces in an array a. The array would look like:

a[0] = date/time
a[1] = userid as int
a[2] = session token
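Given that structure, the counting step can be sketched directly against the array-of-arrays you already have. This is a minimal sketch, assuming logins is an array of those three-element arrays (the sample values are made up):

```ruby
# Hypothetical sample of the parsed log: [date/time, userid, session token]
logins = [
  ["2009-03-01 09:12", 42, "a1b2c3"],
  ["2009-03-01 10:05", 7,  "d4e5f6"],
  ["2009-03-01 11:30", 42, "g7h8i9"]  # user 42 logged in twice
]

# Pull out just the userid column, drop duplicates, and count.
# Ruby auto-splats each sub-array into the block parameters.
unique_users = logins.map { |date, userid, token| userid }.uniq.length

puts unique_users  # user 42's second login is not counted twice
```

The same one-liner works whether the elements are arrays or hashes; only the way you pull the userid out changes.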

I'm at work so I can't get to my code right now. Could you give me a quick rundown of how an array of hashes works? I'm pretty new to ruby so the syntax/structures are still pretty foreign. Thanks so much.
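For what it's worth, an array of hashes is just a regular array whose elements are hashes, so each login becomes a little record with named fields instead of numeric indexes. A rough sketch, where the field names (:time, :user, :token) are assumptions to match Alex's one-liner:

```ruby
# Each element is a hash; fields are looked up by symbol key, not position
logins = [
  { :time => "2009-03-01 09:12", :user => 42, :token => "a1b2c3" },
  { :time => "2009-03-01 10:05", :user => 7,  :token => "d4e5f6" },
  { :time => "2009-03-01 11:30", :user => 42, :token => "g7h8i9" }
]

# Access a field by name rather than remembering which index it lives at
first_user = logins.first[:user]

# Alex's one-liner then reads naturally against this structure
unique_count = logins.collect { |login| login[:user] }.uniq.length
```

The payoff over a[0]/a[1]/a[2] is readability: login[:user] says what it is, and adding a fourth field later doesn't shift anyone's indexes.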