I have a custom search command that simply needs to read the incoming events one by one and process each event separately (I don't even care about sending them further down the pipe). Right now this is extremely slow: the search runs through the events quickly, then spends at least a minute 'Finalizing job', and sometimes hangs outright (to the point where Chrome shows 'The page has become unresponsive' or even goes straight to the 'Aw, Snap!' page). All the script does is call splunk.Intersplunk.getOrganizedResults(), loop over the results (for r in results:) with some simple processing inside the loop, and then pass the results on with splunk.Intersplunk.outputResults(results).
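For reference, the shape of the command described above is roughly the following (a sketch only: the field name and the per-event processing are hypothetical, and splunk.Intersplunk is only importable inside Splunk's own Python, so here it appears in a comment with a small demo driver instead).

```python
# Minimal sketch of the command's structure (hypothetical field names;
# splunk.Intersplunk itself is only available inside Splunk's Python).
def process(results):
    """Do some simple per-event work; here just a placeholder transform."""
    for r in results:
        r["shout"] = r.get("_raw", "").upper()  # hypothetical processing
    return results

if __name__ == "__main__":
    # Inside Splunk the wiring would be roughly:
    #   import splunk.Intersplunk as si
    #   results, dummyresults, settings = si.getOrganizedResults()
    #   si.outputResults(process(results))
    demo = [{"_raw": "event one"}, {"_raw": "event two"}]
    print(process(demo))
```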


1 Answer

Found my problem: for all the seeming simplicity of the processing, my script made some JSON calls for each event (the events are all in JSON format). Those JSON methods turn out to be quite CPU-intensive and time-costly: a mere 99 events took 63 seconds to process - way too much!
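To illustrate the kind of cost involved (a synthetic benchmark with made-up events, not the actual data from the question): doing a full json.loads per event is dramatically slower than just moving the raw strings along untouched.

```python
import json
import time

# Synthetic JSON events standing in for the real ones
events = ['{"user": "u%d", "bytes": %d}' % (i, i * 10) for i in range(10000)]

# Slow path: a full JSON parse for every event
t0 = time.perf_counter()
parsed = [json.loads(e) for e in events]
parse_time = time.perf_counter() - t0

# Fast path: just copy the raw strings along with no parsing at all
t0 = time.perf_counter()
copied = list(events)
copy_time = time.perf_counter() - t0

print("parse: %.4fs  copy: %.4fs" % (parse_time, copy_time))
```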

My solution was to have the script do nothing but copy the relevant fields directly into an intermediate file, and to create another script outside of Splunk for the further processing. The script's run time went down from 63 sec to .013 sec - quite an improvement! Since my initial script's purpose was to create a file with the events parsed and reassembled in a specific way, to be used as an input file for another application, this end result works fine for me.
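A sketch of that two-stage approach (the events, field name, and file path here are all hypothetical): the in-Splunk script only copies the raw field verbatim to an intermediate file, and a second, standalone script does the expensive JSON parsing later.

```python
import json
import os
import tempfile

# Stand-ins for the events Splunk would hand the command
events = [{"_raw": '{"host": "h1", "msg": "hello"}'},
          {"_raw": '{"host": "h2", "msg": "world"}'}]

path = os.path.join(tempfile.mkdtemp(), "events.txt")

# Stage 1 (inside Splunk): copy the relevant field verbatim - no JSON work
with open(path, "w") as f:
    for r in events:
        f.write(r["_raw"] + "\n")

# Stage 2 (outside Splunk): a separate script does the expensive parsing
with open(path) as f:
    parsed = [json.loads(line) for line in f]

print(parsed)
```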

The irony is that I could not export the events directly - they were too long for Splunk to handle (I kid you not: an error stating 'URL is too long' was the entire content of the .csv file I tried to create). That's why I turned to a script for this purpose in the first place.