At the top of the historical data section, select a time range of the 30-60 minutes leading up to the time that the issue appeared.

Scroll down to the Physical Memory Usage panel. You can display the median, average, or maximum value.

Verify that the Splunk software physical memory usage nears or exceeds the capacity of the machine. If the machine runs an operating system supported by platform instrumentation, this should be easy to determine at a glance. See About the platform instrumentation framework.

Next, look at the Physical Memory Usage panel to assess the system memory usage issue and note the growth pattern, or the shape of the data. The growth pattern helps distinguish between a leak and high usage as follows:

A memory leak grows steadily and does not go away until you restart splunkd. A leak is likely a Splunk software defect.

If memory usage grows and then plateaus at a high level (that is, a level near capacity), the issue is probably not a leak. Your Splunk software usage might simply require that much memory.
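If platform instrumentation is enabled, you can also chart the growth pattern yourself by searching the _introspection index. The following is a minimal sketch; it assumes the Hostwide resource usage data model fields (data.mem and data.mem_used, in MB) that platform instrumentation collects on supported operating systems:

```spl
index=_introspection sourcetype=splunk_resource_usage component=Hostwide
| eval mem_used_pct = round('data.mem_used' / 'data.mem' * 100, 2)
| timechart avg(mem_used_pct) AS avg_mem_used_pct
```

A line that climbs steadily without leveling off suggests a leak; a line that rises and then flattens near 100% suggests sustained high usage.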

Finally, identify which process class (search, main splunkd, or other) is involved as follows:

Scroll down to the Physical memory usage by process class panel. Most Splunk software out-of-memory situations are search related, but not all.
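To approximate this breakdown with a search of your own, you can group PerProcess resource usage data by process. This is a rough sketch, not the exact query behind the panel; the classification of search processes by inspecting data.args is an assumption for illustration:

```spl
index=_introspection sourcetype=splunk_resource_usage component=PerProcess
| eval process_class = case(
    'data.process'=="splunkd" AND like('data.args', "search%"), "search",
    'data.process'=="splunkd", "main splunkd",
    true(), "other")
| timechart sum('data.mem_used') AS total_mem_used_mb BY process_class
```

If the "search" series dominates, continue with the search-related solutions that follow.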

Solution

If you confirm that Splunk software is not using a large amount of memory, consult your sysadmin about pruning non-Splunk processes.

For cases that are related to Splunk software but not attributed to search processes (especially if the main splunkd process grows in memory usage over time), contact Splunk Support.

If you have attributed the excessive memory usage to searches, in Splunk Web select Settings > Monitoring Console > Search > Activity > Search activity: Instance. Scroll down to the Top 20 Memory-Consuming Searches panel to identify and review the individual offending searches. The following is a list of solutions to the most common search memory usage problems:

If a few of your searches are using a lot of memory, make sure they are as efficient as possible. Remember to filter early in a search and choose search commands that use memory efficiently. See Quick tips for optimization and Write better searches in the Search Manual.
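As an illustration of filtering early, compare the two searches below. The index, sourcetype, and field names are hypothetical; the point is that restricting events in the base search is far cheaper than retrieving everything and filtering afterward:

```spl
Inefficient: retrieves all events, then filters and keeps raw events in memory

index=web sourcetype=access_combined
| where status=500
| table host status uri

More efficient: filters in the base search and aggregates with stats

index=web sourcetype=access_combined status=500
| stats count BY host
```

Transforming commands such as stats summarize events as they stream through the search pipeline, so they generally use memory more efficiently than commands that must hold raw events.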

Note that certain Splunk apps have additional system requirements. For example, Enterprise Security requires a search head with significantly more memory than Splunk Enterprise requires by default. See Deployment planning in the Enterprise Security documentation.

If you have a single search using unreasonable amounts of memory, and you are not sure why, check Known Issues and file a Support ticket. The problem is especially likely to be caused by a defect if the search process displays a growth pattern indicating a leak.

Remember not to schedule all your reports on the hour. Offset scheduled reports to avoid reaching your concurrent search limit.
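For example, you can stagger hourly reports by giving each a different minute offset in its cron_schedule setting in savedsearches.conf. The stanza names below are hypothetical:

```ini
# savedsearches.conf
[Hourly Error Summary]
cron_schedule = 7 * * * *

[Hourly License Usage Report]
cron_schedule = 23 * * * *
```

Spreading report start times across the hour keeps scheduled searches from all competing for memory and search slots at once.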
