Splunk Blogs
http://blogs.splunk.com
Information Exchange Boosts Threat Intelligence
http://blogs.splunk.com/2015/07/30/information-exchange-boosts-threat-intelligence/
Thu, 30 Jul 2015 16:55:40 +0000

The rash of recent government breaches and continued cyberthreats have accelerated the need for the exchange of information related to these and other known incidents. For many years, DHS has been working with industry and other federal agencies to provide more standardization of content so that security practitioners (and anyone else, for that matter) are speaking the same language across multiple vendor platforms as it pertains to software, configurations and vulnerabilities, to name a few. An early example that pre-dates DHS is Common Vulnerabilities and Exposures (CVE), which Mitre launched in 1999. These efforts can be challenging because gathering consensus and buy-in is never easy across a diverse set of organizations, so finding entities that can shepherd these specifications is key to widespread adoption.

This makes DHS’s announcement last week regarding the STIX (Structured Threat Information eXpression) and TAXII (Trusted Automated eXchange of Indicator Information) specifications exciting. In case you didn’t see it, both specifications are moving to OASIS (Organization for the Advancement of Structured Information Standards). OASIS is a non-profit consortium with members in over 65 countries that focuses on the adoption of open standards globally. Within OASIS, STIX/TAXII will be overseen by the Cyber Threat Intelligence Technical Committee.

STIX and TAXII are a pair of specifications that focus on cyberthreat information and its transfer. STIX has nine constructs that fit together to represent a threat, including indicators, exploit targets and courses of action, among others. TAXII defines the way threat intelligence data is shared and is the preferred way to share STIX insights, including transport over HTTP/HTTPS.
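To make the shape of a STIX document more concrete, here is a hedged sketch of pulling indicator titles and observables out of a STIX-like XML package using only the Python standard library. The element names and namespaces below are heavily simplified illustrations, not the full STIX 1.x schema.

```python
# Hedged sketch: extracting indicators from a minimal STIX-1.x-style XML
# fragment. The sample document and its element layout are simplified
# illustrations of the real schema.
import xml.etree.ElementTree as ET

STIX_SAMPLE = """
<stix:STIX_Package xmlns:stix="http://stix.mitre.org/stix-1"
                   xmlns:indicator="http://stix.mitre.org/Indicator-2">
  <stix:Indicators>
    <stix:Indicator>
      <indicator:Title>Known C2 address</indicator:Title>
      <indicator:Observable>198.51.100.7</indicator:Observable>
    </stix:Indicator>
  </stix:Indicators>
</stix:STIX_Package>
"""

NS = {
    "stix": "http://stix.mitre.org/stix-1",
    "indicator": "http://stix.mitre.org/Indicator-2",
}

def extract_indicators(xml_text):
    """Return (title, observable) pairs found in a STIX-like package."""
    root = ET.fromstring(xml_text)
    out = []
    for ind in root.findall(".//stix:Indicator", NS):
        title = ind.findtext("indicator:Title", namespaces=NS)
        obs = ind.findtext("indicator:Observable", namespaces=NS)
        out.append((title, obs))
    return out

print(extract_indicators(STIX_SAMPLE))
# → [('Known C2 address', '198.51.100.7')]
```

A real consumer would of course validate against the published schemas and walk all nine STIX constructs, but the parse-and-extract pattern is the same.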

Many consider enriching log data with threat intelligence to be an important capability. Threat intelligence offers a neighborhood-watch effect for your log data by providing insights into threats that others may have seen or detected, as well as enhancing your situational awareness. More recently, greater emphasis has been put on providing insights into indicators of compromise (IOCs). The key is that there needs to be a standard means to characterize and transport this threat intelligence.

This is where STIX and TAXII come in. Starting with the Splunk App for Enterprise Security (ES), v3.3, Splunk has added the ability to ingest STIX documents and leverage TAXII feeds for threat intelligence. Artifacts extracted from the STIX documents include:

X509 Certificates

Email

File names/hashes

HTTP

IP/Domains

Processes

Registry entries

Services

Users

Once these threat artifacts have been extracted, they can be correlated with previously collected logs to determine whether any of these indicators currently exist within the enterprise and, if so, when they were first seen. The power of Splunk’s search engine coupled with the STIX/TAXII integration helps analysts look back over their historical data to better answer the question: when did this (IP|Domain|Registry Setting|etc.) first appear? From there, analysts can start applying mitigation strategies to the threat.
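The "when was this first seen?" correlation can be sketched in plain Python. This is a hedged illustration of the idea, not Splunk's implementation; the field names, indicator values and log records below are all invented for the example.

```python
# Hedged sketch: map each matched threat indicator to the earliest
# timestamp at which it appears in a set of historical log records.
from datetime import datetime

indicators = {"203.0.113.9", "evil.example.com"}  # hypothetical IOCs

logs = [
    {"time": "2015-06-01T10:00:00", "dest": "203.0.113.9"},
    {"time": "2015-05-12T08:30:00", "dest": "evil.example.com"},
    {"time": "2015-06-20T23:15:00", "dest": "203.0.113.9"},
    {"time": "2015-06-02T11:00:00", "dest": "10.0.0.5"},  # benign, ignored
]

def first_seen(indicators, logs):
    """Return {indicator: earliest datetime it was observed}."""
    earliest = {}
    for rec in logs:
        dest = rec["dest"]
        if dest in indicators:
            t = datetime.fromisoformat(rec["time"])
            if dest not in earliest or t < earliest[dest]:
                earliest[dest] = t
    return earliest

for ioc, t in sorted(first_seen(indicators, logs).items()):
    print(ioc, "first seen", t.isoformat())
```

In Splunk itself this would be a search over indexed events rather than a Python loop, but the logic — match against the indicator set, keep the minimum timestamp per indicator — is the same.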

Looking back through logs retrospectively is important, but there is still the need to identify these artifacts as they appear in the present. Enterprise Security’s correlation searches can also flag logs that match these indicators as notable events for an analyst, as the logs are being collected.

The Splunk App for Enterprise Security comes with a number of threat sources, including examples from the STIX site relating to APT1 and Poison Ivy as well as a malware domain list from hailataxii.com. If you have access to threat intelligence via a TAXII server, adding this data is as easy as specifying the website and credentials for the TAXII server.

STIX/TAXII specifications are important as the need to share threat intelligence in real-time continues to increase. The Splunk App for Enterprise Security provides the ability to easily gather these artifacts and do something actionable with them. If you are going to be at BlackHat next week, stop by and see us (booth #347) and learn about how we are integrating STIX/TAXII.

Like Malcolm Gladwell, Splunk Cloud Helps You See Things Others Don’t
http://blogs.splunk.com/2015/07/30/like-malcolm-gladwell-splunk-cloud-helps-you-see-things-others-dont/
Thu, 30 Jul 2015 14:29:46 +0000

As I’m sitting in my home office, I glance over at my credenza and spy the Malcolm Gladwell non-fiction book, “David and Goliath: Underdogs, Misfits, and the Art of Battling Giants.” I’m a big Gladwell fan. While I enjoy how he uses powerful storytelling to reshape the way we think about life and the world around us, I also like how he uses research and data to make discoveries many of us might miss. Much like the capabilities offered to companies through Splunk software, Gladwell inspires me to dig deeper and look at things from a different perspective.

The premise of Gladwell’s “David and Goliath” is the Old Testament account of the shepherd boy who takes down a giant warrior using nothing more than a slingshot and a pebble. Gladwell delves deeply into the psychology and mindset of the modern-day underdog who beats the odds, and what happens when ordinary people confront powerful opponents of all kinds. Using research and data, Gladwell hypothesizes that what may appear to be a disadvantage can often be the key to triumph. My mind drifts from Gladwell’s theory to the original Old Testament parable and how both relate to business and technology today.

I think about David and his slingshot. David, the boy, is an unlikely candidate in the battle against the giant, but he beats the odds not by his physical strength but through mental fortitude. He is passionate, resourceful and strategic – which are all key ingredients to winning in the business world. Successful organizations rely on instinct and intellect to build or buy ‘weapons’ that will give them a competitive advantage. These days, the weapons are usually in the form of technology – the great equalizer. If used right, technology can quickly level the playing field.

This thinking brings me right back to data – or, more specifically, the technology that can uncover the data. Like Gladwell, we love data. In fact, at Splunk we know the power of machine data – all the data that’s generated by the machines we use – and we’ve built advanced software that can index and analyze it all. It’s becoming a critical weapon in the business arsenal because it shows us what we’ve been missing and allows us to see things with a new perspective.

What’s even more exciting is that the Davids of the business world can now harness the power of Splunk analytics software with even greater ease and fewer resources by deploying Splunk Cloud. With Splunk Cloud, organizations gain the same valuable benefits as Splunk Enterprise, but in a software-as-a-service model. Splunk Cloud is like the slingshot and the pebble – very simple and easy to use at first glance, but quite powerful when put into the right hands.

Orrstown Bank is a great example of a ‘David’ taking on some powerful opponents in the financial arena. This East Coast community bank greatly fortified its security and customer experience by using Splunk Cloud as its security intelligence platform. In addition to helping Orrstown Bank secure its hybrid cloud and on-premises environment, Splunk Cloud enables the bank to continuously improve reliability and cut costs in IT operations. The bank uses its technology like the stone David wielded against the great warrior-giant.

Splunk Cloud was designed to help organizations in their quest to compete and win. Like the rudimentary slingshot, Splunk Cloud isn’t complex, it isn’t more expensive, and it doesn’t require a lot of people to make it powerful. Similar to David, the team at Orrstown Bank had passionate intent to come out on top using a weapon with precise aim against the most powerful cyber threats. Splunk Cloud is their key to triumph.

We work with small and large businesses every day. No matter what their size, however, we’re seeing how Splunk Cloud addresses very specific needs – from ease of use, to cost, to flexibility, to expandability. But there’s one common denominator in all of it that brings us right back to Malcolm Gladwell and the way he thinks. It comes down to data. Regardless of whether they’re underdogs or giants, our customers are trying to uncover what’s hidden. Like Gladwell, they’re trying to grasp what others may miss and use it to their advantage. At the end of the day, that’s a huge secret to being a top competitor – and it has nothing to do with size.

Getting ready for Business Analytics at .conf2015 – Part 1
http://blogs.splunk.com/2015/07/29/getting-ready-for-business-analytics-at-conf2015-part-1/
Wed, 29 Jul 2015 18:43:25 +0000

It’s almost August! That’s a pretty special time for us here at Splunk because we start working with speakers for our annual user conference. That’s right, .conf2015 is just around the corner and I am super excited to meet Splunkers from around the world, hear all the cool use cases for machine data and learn what makes Splunk one of the most innovative companies in the world. It is a fantastic opportunity to interact with passionate users and learn about the innovative ways in which Splunk users have derived value from data for business and IT.

There are a number of great sessions around business analytics. These customer sessions will showcase innovative uses of Splunk to solve business use cases. Machine data is a new class of data available to business users. Many of the sessions will highlight the tremendous business value of machine data, especially when machine data is mashed up with structured data from relational databases. You will hear a lot about using Splunk to understand and optimize business processes.

You will also hear amazing stories from Otto Group, Northern Trust Bank and Komodo Cloud on understanding and optimizing end-to-end business transactions across multiple channels. Kaiser Permanente has an interesting session on tracking health claims status across multiple formats and systems. These are just a couple of examples of what to expect in the business analytics track at .conf2015.

I have attended a number of conferences throughout my professional career, and I can attest that the Splunk User Conference is by far my favorite one. There is an immense opportunity to learn, interact with industry peers, expand your horizon on the possibilities with data, and further your career at the event. Not to mention the hands-on workshops and training at the conference and the Search Party!

Under the Hood of Cisco IT
http://blogs.splunk.com/2015/07/29/under-the-hood-of-cisco-it/
Wed, 29 Jul 2015 16:41:03 +0000
Do you know which technology is under the hood of Cisco IT?

Do you know what Cisco uses to monitor the health of 70+ of their apps and to respond to security incidents?

We bring you the answers straight from the horse’s mouth.

At the recent SplunkLive! SF and in front of a packed room, Robert Novak (@gallifreyan), Quinn Zuo and Ruby Chiang of the Cisco IT team (@ciscoit) uncovered the mystery and gave us a good look under the hood. They showed how Splunk powers their operations and solves some of their critical IT challenges.


During their presentation, the Cisco IT team also described how their environment is supported by a high performance, highly scalable Cisco UCS infrastructure. UCS is an essential platform for central management of big data infrastructure. Since Splunk scales from a single system to a large scale distributed deployment, it is the perfect solution for operational intelligence on top of UCS.

Cisco has endorsed Splunk as a key technology partner for their UCS platform. Here is a highlight of key assets on the Splunk-UCS integration.

DIY 0 to 60 with Splunk in 3 steps
http://blogs.splunk.com/2015/07/28/diy-0-to-60-with-splunk-in-3-steps/
Tue, 28 Jul 2015 22:35:37 +0000

A lot of folks (particularly developers) often ask me how to get started with building an app in Splunk. Many of the askers have no previous exposure to Splunk. Here are the steps I recommend:

Do the search tutorial. It covers all the basics end to end, from ingesting data, to searches, to dashboards. By the end of the tutorial you will get a good sense of what you can do with Splunk itself.

Follow the fantastic new developer guidance for apps. We worked with real partners and have documented the entire journey of building an app, and captured those learnings for you in the guidance.

By the time you’ve finished, you’ll have a good basic understanding of Splunk and our dev platform.


As I indicated in my preview post, Earning a Seat at the Table, I’m fascinated by the transformation of IT and its increased role at the business strategy table. Enamored by the glory, impact, and success of Silicon Valley unicorns, CIOs aspire to drive innovation within their companies. Oftentimes, however, the cart is put before the horse. Programs that include ideas like blocking out 20% of an employee’s time to innovate are rolled out with the hope of cultivating a few great prototypes that can be shown off to the executive staff. I’ve found that to roll out an effective innovation strategy, it is critical to first improve the ability to execute IT projects, and subsequently to formalize strategies for how to commercialize ideas. The next 4 articles will therefore focus on these two foundational steps that enable IT organizations to “responsibly move at market speed.”

For industries such as financial services and healthcare, forthcoming increases in regulatory oversight, combined with emerging competitive threats, require that large organizations transform their IT capabilities. The old adage, adapt or die, couldn’t be more appropriate in these times.

The goal of the transformation should be to responsibly move at market speed – to quickly take ideas from whiteboard to the hands of customers, and pivot or iterate based on the behavior of the customer, not just the voice of the customer, where:

Product managers identify new features

Developers continuously deliver those features to market

Those features run on a secure and compliant cloud-based application platform

The application platform is fully instrumented, providing actionable commercial insights to the product manager, empowering them to quickly pivot or iterate.

The faster this cycle, the closer they’ll be able to move at market speed. The key word here is “responsibly”. Companies often need to supply evidence to their executives, auditors and regulators proving that they’re operating effectively. Gone are the days when a development director could cut test cases in order to make a date. In this new world, there is little tolerance for the Jedi mind-trick of “there’s nothing to see here” – PowerPoint slides showing that everything is green. Therefore, security and resilience are integrated into the platform by design, with real-time data providing quantitative and qualitative evidence that we’re delivering software that will not jeopardize the business.

Hybrid Cloud strategies drive standardization, automation, and workload portability, eliminating “snowflakes”, where every app topology or virtual machine is unnecessarily unique. Continuous Delivery transforms software development into a highly-tuned factory, where developers can check in code, watch it run through a series of automated checks using whitebox and blackbox verification tools, and receive instant feedback on potential problems. Continuous Insights brings data to the fingertips of key stakeholders: the application code and systems are instrumented, providing a near-realtime view of exactly how the product and systems are running, shifting the culture from reactive to proactive.

Individually, each of these technology programs – Hybrid Cloud, Continuous Delivery, and Continuous Insights – will deliver value to an organization. However, it’s the unique combination of these three concepts that transforms the organization’s ability to execute. It’s more than just applying technology, though. Breaking the processes that stifle us, eliminating the fire drills that distract us from higher-value business problems, and instilling a culture of commercial intensity are equally difficult.

The goal is to bring together emerging technologies, cultural change and process improvements to drive significant commercial impact in a relatively short period of time. Taking ideas from whiteboard to the hands of customers in 15-45 days isn’t just a Silicon Valley luxury; large enterprises can and have done it. These enterprises have evolved IT from a back-office function to a core part of the value they deliver to their customers, fundamentally changing how the business brings digital capabilities to market. They now move forward with a strong IT foundation and the right culture – a culture of bold ideas, lean startup and a focus on the customer – where their commercial intensity and market speed gives them the resources of a global enterprise with the agility of a startup.

I look forward to sharing part 2 in this series next week, when I discuss the details of hybrid cloud with continuous delivery and insights.

Using Data Analytics to Help Secure State and Local Government Networks
http://blogs.splunk.com/2015/07/28/using-data-analytics-to-help-secure-state-and-local-government-networks/
Tue, 28 Jul 2015 18:15:53 +0000

While we eagerly await the government’s 30-day cybersecurity sprint report, it is important to remember that large federal agencies such as OPM aren’t the only ones susceptible to cyberattacks. State and local governments handle and collect confidential data just as frequently as federal agencies, which makes them attractive targets for cyberattackers. As the feds search for answers in the wake of OPM, state and local governments should likewise be reevaluating their cybersecurity approaches.

A lot of talk around cybersecurity focuses on improving data encryption, password protection and authentication practices. But one of the best, and most underutilized, security resources in government is the data already being collected and the insights that information contains. State and local governments need to start embracing new solutions with comprehensive data and behavioral analysis capabilities, which are becoming increasingly important to effectively detect and combat cyber threats.

The Institute for Critical Infrastructure Technology recently published a report that stated legacy technologies were a big problem for OPM. Further, the Institute’s report notes that having a behavioral analytics system to track user activity as a security measure would have benefited OPM. This is something Splunk is highly invested in following its acquisition of Caspida earlier this month. Splunk now incorporates machine learning and behavioral analytics to detect unpredictable insider threats, like compromised credentials.

Investment in enterprise analytics platforms can enable both proactive threat detection and defensive mitigation, as well as support real-time response to breaches. John Zarour, director of state and local government and K-12 for Splunk, discusses these issues and more in a recent GCN article. It’s a quick read that offers some good insights state and local IT leaders would be wise to pay attention to.

Splunk Webinar: Learn how Cerner Extends Splunk to Gain End-to-End Visibility into Complex Business Process
http://blogs.splunk.com/2015/07/24/splunk-webinar-cerner/
Fri, 24 Jul 2015 17:50:38 +0000

I am honored to have the privilege to host a Splunk webinar with Cerner on July 28. In this webinar, Cerner will be discussing one of their many exciting use cases around business process analytics and how they are extending Splunk software to gain end-to-end insights into complex business processes.

One such process is real-time eligibility, a critical and complex process in healthcare. As a part of the process, information for each patient (e.g., name, address, insurance carrier) is entered into the Cerner system, where it is then verified and forwarded to the insurance carrier. The carrier confirms the patient’s coverage and the amount of the deductible. Within moments, the healthcare provider can validate each patient’s eligibility and then provide services.

An error during the real-time eligibility process can result in a rejected transaction, delaying healthcare delivery and impacting the revenue cycle. The Cerner team will be discussing how Splunk is enabling them to identify and catch these errors, and further optimize the process to reduce error rates.

The creativity and ingenuity of Splunk customers continues to amaze; this webinar is a case in point. Whether you are in IT or healthcare, this webinar presents a great opportunity to learn about Splunk from Cerner – one of the world’s largest healthcare software IT companies.

Practical Operational Intelligence for the Internet of Things – Part 1
http://blogs.splunk.com/2015/07/23/practical-operational-intelligence-for-the-internet-of-things-part-1/
Thu, 23 Jul 2015 18:00:06 +0000

Recently, we were lucky to join the Eclipse Foundation’s IoT team for a webinar on “Practical Operational Intelligence for the Internet of Things”. Emphasis was on the practical. As I discussed in a recent blog for the IoT Solutions World Congress, when it comes to the IoT, turning data into insights shouldn’t be so hard. With the proliferation of complex architectures, interfaces, and slow time-to-value, it’s no wonder the “hype” of both big data and the IoT sometimes eclipses their successes.

With that in mind, I’m kicking off a multi-part blog series on “Practical” IoT Operational Intelligence and Analytics with Splunk. The goal here is to get you to value from IoT-generated data as quickly and as easily as possible. General concepts to be discussed in this series include:

Overview (why Splunk?)

Getting Started (Download, Installation and App and Input Configuration)

This approach enables you to easily connect to remote devices and applications, ingest the machine data generated by those sources, and securely deliver that data across complex networks to a massively scalable time-series index. You can then search that data, alert on its content, and extract information from, enrich, and statistically process it. Finally, you can run time-series analytics on that data (including prediction and anomaly detection), and visualize and report on the raw data or the output of any of those analytics using the platform’s native charting libraries or third-party visualization platforms.
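The anomaly-detection idea mentioned above can be illustrated with a minimal sketch. This is a generic z-score check written for illustration, not the platform's built-in commands; the readings and threshold are invented for the example.

```python
# Hedged sketch: flag points in a sensor series that deviate from the
# series mean by more than k population standard deviations.
import statistics

def anomalies(series, k=3.0):
    """Return indices of points more than k stddevs from the mean."""
    mean = statistics.fmean(series)
    sd = statistics.pstdev(series)
    if sd == 0:
        return []
    return [i for i, x in enumerate(series) if abs(x - mean) > k * sd]

readings = [10, 11, 9, 10, 12, 10, 48, 11, 10, 9]  # one obvious spike
print(anomalies(readings, k=2.0))
# → [6]
```

In a deployed system you would run this kind of check over a rolling real-time window rather than a static list, which is exactly the historical-plus-real-time duality described below in the document's own terms.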

The above capabilities are exposed to end users via a web interface, and to developers and third party applications via well-documented APIs and a host of SDKs. Platform configuration and management features (including role-based access management for both data and content) are exposed to administrators via web browser, API and SDK as well.

The architecture is installed with a simple executable or tarball, and can be deployed on a laptop, in the datacenter, in a private cloud, as SaaS, or as a hybrid. It is designed to make machine data accessible, usable and valuable to anyone (that’s actually our mission), and does so through a unique combination of ease of use, self-service deployment and content creation, and fast time to value for solutions. Large corporations and individual developers alike already use Splunk as part of their IoT strategy.

As I see it, there are benefits to this approach vs. common alternatives:

No worries about scalability. You can use the same application to scale from a single machine install to massive horizontal and vertical deployments collecting and analyzing data at any volume, velocity, or variety.

You don’t need to custom code against every new data source. The Modular Input Framework allows you to wrap common communication libraries as a native Splunk input, complete with a web user interface. Teams have already built Modular Inputs for accessing data from MQTT, CoAP, AMQP, REST and JMS. Splunk also has native inputs for file system monitoring, TCP, and UDP, and can run scripts to collect data. Community-supported apps such as Protocol Data Inputs enable collection of data from binary sources and just about anything else you can imagine.

You can collect data from any point of origin – cloud, datacenter, or device. With Splunk’s flexible forwarding, SDKs and APIs, and IoT-relevant add-ons on Splunkbase, data can be securely pulled or pushed from any connected location.
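As an illustration, forwarding is typically configured in outputs.conf on the forwarder. The stanza below is a minimal sketch with hypothetical indexer hostnames; exact settings vary by version and deployment:

```ini
# outputs.conf on a forwarder (hostnames are hypothetical)
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = indexer1.example.com:9997, indexer2.example.com:9997
# Compress the feed before sending it across the network
compressed = true
```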

You can manage your entire stack via a web browser. Yup, you can connect to IoT or industrial environments, forward and process data, enrich time-series data with structured data, search, investigate, analyze, and visualize, all through a web interface. Oh, and you use that same web interface to manage the entire architecture and the users who access it, no matter how distributed or complex the deployment.

You can use one application for both real-time and historical visualization, analytics, and alerting. Splunk’s SPL is an incredibly powerful search and analytics language. Queries written in SPL can be applied to historical and real-time windows of data for time-series analytics and statistical processing. And like many other features of the platform, SPL is extensible via custom Python search commands; the community has taken this a step further by using custom commands to extend SPL with R and with algorithms like haversine.
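As a concrete example of the kind of logic such a custom command wraps, here is a plain-Python sketch of the haversine great-circle distance. This is illustrative only, not the community’s actual Splunk command; the function name is hypothetical:

```python
# Haversine great-circle distance: the sort of calculation the community
# has exposed to SPL via a custom Python search command.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

# San Francisco to New York is roughly 4,130 km along the great circle.
print(round(haversine_km(37.77, -122.42, 40.71, -74.01)))
```

Inside a custom search command, a function like this would compute a new field for each event streaming through the pipeline.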

So hopefully this makes the case for the technical “why”. The next installment will attempt the same for the first bit of the technical “how”, including downloading, installing, and configuring an instance of Splunk on a local machine and in an AWS account, and configuring an MQTT Modular Input on that instance to collect data from a remote broker. If you have any questions or comments on this first installment, or on anything related to Splunk for IoT, feel free to reach out on Twitter:

Smart AnSwerS #30
http://blogs.splunk.com/2015/07/22/smart-answers-30/
Thu, 23 Jul 2015 02:46:21 +0000

Hey there community and welcome to the 30th installment of Smart AnSwerS.

Splunk HQ’s kitchens underwent a total makeover last week, and this beast of an automated hot drink machine appeared on the 1st floor. Splunkers have been frequenting the big shiny new toy, taking all the mugs back to their respective floors and leaving us first floor dwellers with nothing *cries*. Fortunately, the new installation has also brought comic relief. Some new signage was placed on the machine saying “‘OK Coffee’ I am voice activated, please try me.” The machine is not, in fact, voice activated, which has made for some occasional amusement 😉

Real Time Search Performance Considerations: Are there any scenarios where real-time searches would be acceptable?

shailesh030 wanted to know if there were any cases where real-time searches would be recommended in a production environment. Many users never consider this question, assume real-time searches are the only way to go, and then run into performance issues. The response to this and similar questions is almost always the same, but rarely as thorough and detailed as lguinn’s answer. Come see how she describes the performance implications of different search options and the factors to consider for each, so you can gauge what will be optimal for your own environment.
http://answers.splunk.com/answers/242969/real-time-search-performance-considerations-are-th.html

How to edit the email address for all scheduled searches and reports in a single app without editing each one by one?

marees123 had an app with more than 170 scheduled searches and reports, and needed the most convenient way to add email addresses to all of them so search results could be delivered to certain recipients. LukeMurphey shows a way to do this by creating a macro with the sendemail command that can be appended to every search. Although this does require modifying each search once, which marees123 had hoped to avoid, putting the solution in place makes editing recipients’ emails a one-stop shop in the future: since macro references are expanded before the search executes, editing the macro once changes every search that uses it.
http://answers.splunk.com/answers/238393/how-to-edit-the-email-address-for-all-scheduled-se.html
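For illustration, a macro along the lines Luke describes could be defined in the app’s macros.conf roughly as follows. The macro name, address, and exact sendemail options shown here are a sketch, not the precise answer from the post:

```ini
# macros.conf in the app (macro name and address are hypothetical)
[email_results]
definition = sendemail to="team@example.com" subject="Scheduled report results" sendresults=true inline=true
```

Each scheduled search would then end with a reference to the macro, e.g. `... | stats count by host `email_results``, and changing the recipient list in that one definition updates all 170+ searches at once.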

How to convert indexed IP data from hex to decimal format in Splunk?

splunknewby already had indexed IP data in hex format, but needed to convert it to decimal format. MuS uses his Splunk fu search skill level 5000 to construct a search using rex to extract the value and eval to work conversion magic. This answer passed with flying colors, but splunknewby had a follow-up question: how to make sure future searches already have the IP addresses in the new decimal format. Other than the obvious approach of changing the source output to decimal, MuS also suggests using field extraction combined with a lookup table to translate extracted fields into the desired numbers.
http://answers.splunk.com/answers/241068/how-to-convert-indexed-ip-data-from-hex-to-decimal.html
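The core of the conversion is simple: each pair of hex digits becomes one decimal octet. Here is a plain-Python sketch of that logic (the function name is illustrative; in SPL the equivalent parsing can be done with rex plus eval’s base-16 tonumber):

```python
# Hex-to-dotted-decimal IP conversion: each of the four hex digit pairs
# in an 8-character string becomes one decimal octet.
def hex_to_ip(hex_ip):
    """Convert an 8-character hex string like 'C0A80001' to '192.168.0.1'."""
    return ".".join(str(int(hex_ip[i:i + 2], 16)) for i in range(0, 8, 2))

print(hex_to_ip("C0A80001"))  # → 192.168.0.1
```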