SplunkLive! Getting Started with Splunk Enterprise

Speaker Notes

  • Follow along if you like! See the full list of supported platforms in the Installation Manual. You can choose a different directory during installation.
  • A good analogy for Apps is the iPhone/iPad: same data, many uses. Apps change the presentation layer.
  • Illustrate adding data, creating a new index, and the *nix app to show performance metrics. Also, the new Splunk overview app ships with test data for data models and Pivot, etc.
  • This is the unix app in action. In this example, we’re pulling a number of scripted inputs such as top, iostat, network, etc.
  • Wildcards are supported (*). Search terms are case insensitive. Boolean searches are supported with AND, OR, NOT; just remember that Booleans must be uppercase. There is an implied AND between search terms, and for complex searches, use parentheses: (error OR failed). You can also quote phrases, such as “Login Failure”. Search modes!
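    For instance, those rules combine as in the sample searches below (the access_* sourcetype is borrowed from the sample web data, purely for illustration):
      Search > fail*
      Search > error OR failed
      Search > sourcetype=access_* (500 OR 503)
      Search > "login failure"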
  • This is an example of a search by host, excluding events with an error log level.
  • The search assistant offers a quick reference for the Splunk search language that updates as you type. It includes links to online documentation and shows matching searches along with their counts, matching terms, and examples. It also shows your history of searches.
  • A search becomes a job for Splunk to process. While a search is processing, this job can be canceled, paused, sent to the background, and finalized. The ability to cancel is handy if you made a mistake or chose the wrong time range. Finalized = stop processing events but keep the “number of events” count built so far. Jobs can be accessed while running, or afterwards through the jobs menu. There, paused jobs can be resumed and those sent to the background can be accessed. Job results are kept for a configurable time, 10 minutes by default.
  • The Splunk search language is very Unix-like: use the pipe symbol to pass search results to search commands. Search commands can be chained, and you can even create your own custom search commands. These are common commands we find most useful to analyze and filter data (review each command). The Search Reference is available online, in addition to the search assistant, and covers all search commands.
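    As a minimal sketch of chaining (the sourcetype and the status field are hypothetical here):
      Search > sourcetype=access_* status=503 | stats count by host | sort -count | head 5
    This filters to 503 errors, counts them per host, sorts descending, and keeps the top five hosts.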
  • Much like *nix operating systems, chances are you’re not going to memorize all of the commands. You’ll memorize a handful and rely on the “man pages” for additional context on commands. We SEs here at Splunk use maybe twenty terms in our day-to-day work.
  • Fields give you much more precision in searches. Fields are key-value pairs associated with your data by Splunk; for example, host=www1 or status=503. There are two specific types of fields. There are default fields (source, sourcetype, and host), which are added to every event by Splunk during indexing. And there are data-specific fields, such as action=“purchase” or status=“503”.
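    A small illustration, with made-up values:
      Search > host=www1 status=503
    Here host is a default field added at index time, while status is a data-specific field extracted from the events themselves.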
  • What’s the difference between sources, sourcetypes, and hosts? A host is the hostname, IP address, or name of the network host from which events originate; a single Windows server or a specific firewall would each be a host. A source is the name of a file, a stream, or some other input on a server, such as a config file, process, application, or event log. Per our Windows server example, sources on that server might include Exchange logs, DNS/DHCP logs, performance metrics, and the Windows event logs from the Windows event viewer; each of these is a different source. A sourcetype is a specific data format, a high-level group such as ALL Exchange logs or ALL Cisco ASA. For example, you might run searches against a sourcetype of Windows Event Log Security across multiple servers.
  • Event types can help you automatically identify events based on a search. An event type is a field based on a search; it’s a way of classifying data for searching and reporting, and it’s useful for user knowledge capture and sharing. Tags are different, in that they allow you to search for events with related field values. You can assign a tag to any field/value combination. As an example, server names aren’t always helpful and sometimes contain ambiguous information; using tags, you can use a more meaningful term. The Splunk Manager allows you to enable/disable, copy, delete, and edit the tags you’ve created.
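    Under the hood these land in simple configuration files; a rough sketch (stanza names and hosts are illustrative):
      eventtypes.conf:  [failed_login]
                        search = "failed login" OR "FAILED LOGIN" OR "Authentication failure"
      tags.conf:        [host=apache1.splunk.com]
                        web_servers = enabled
    After that, Search > tag=web_servers eventtype=failed_login works as a single readable query.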
  • Extracting fields that aren’t already pulled out at search time is a necessary step to doing more with your data, like reporting. Show an example of field extraction with IFX and an example using rex. Show the other field extractor.
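    A hedged rex example (the sourcetype and message format are hypothetical):
      Search > sourcetype=mail_logs | rex field=_raw "From: (?<from>\S+) To: (?<to>\S+)" | top from
    The named capture groups become search-time fields (from, to) without touching any .conf files.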
  • Use the time range picker to set time boundaries on your searches. You can restrict the search to Preset time ranges, custom Relative time ranges, and custom Real-time time ranges. You can also specify a Date Range, a Date & Time Range, and use more advanced options for specifying the time ranges for a search.
  • Real-time alerts always trigger immediately for every returned result. Real-time monitored alerts monitor a real-time window and can trigger immediately, or you can define conditions. Scheduled alerts run a search on a regular interval that you define and trigger based on conditions that you define.
  • Run an alert in Splunk. Splunk alerts are based on searches and can run either on a regular scheduled interval or in real time. Alerts are triggered when the results of the search meet a specific condition that you define. Based on your needs, alerts can send emails, trigger scripts, and write to RSS feeds.
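    As a sketch, a scheduled alert of this kind lives in savedsearches.conf roughly like this (the name, recipient, and thresholds are placeholders):
      [Failed Password Alert]
      search = "Failed password"
      enableSched = 1
      cron_schedule = */15 * * * *
      dispatch.earliest_time = -15m
      dispatch.latest_time = now
      counttype = number of events
      relation = greater than
      quantity = 10
      action.email = 1
      action.email.to = ops@example.com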
  • Consider how you might use a scripted alert.
  • Demo building a traditional report. Reports can also be turned into dashboards and mailed out.
  • Demo building a report and dashboard.
  • Demo new dashboard workflow
  • Show dashboard examples:
  • Why with the same settings is the shadow so dark?
  • Splunk can be divided into four logical functions. First, from the bottom up, collection. Splunk forwarders come in two packages: the full Splunk distribution or a dedicated “Universal Forwarder”. The full Splunk distribution can be configured to filter data before transmitting, execute scripts locally, or run Splunk Web. This gives you several options depending on the footprint size your endpoints can tolerate. The Universal Forwarder is an ultra-lightweight agent designed to collect data in the smallest possible footprint. Both flavors of forwarder come with automatic load balancing, SSL encryption and data compression, and the ability to route data to multiple Splunk instances or third-party systems. To manage your distributed Splunk environment, there is the Deployment Server. The Deployment Server helps you synchronize the configuration of your search heads during distributed searching, as well as your forwarders, to centrally manage your distributed data collection. Of course, Splunk has a simple flat-file configuration system, so feel free to use your own config management tools if you’re more comfortable with what you already have. The core of the Splunk infrastructure is indexing. An indexer does two things: it accepts and processes new data, adding it to the index and compressing it on disk, and it services search requests, looking through the data it has via its indices and returning the appropriate results to the searcher over a compressed communication channel. Indexers scale out almost limitlessly and with almost no degradation in overall performance, allowing Splunk to scale from single-instance small deployments to truly massive Big Data challenges. Finally, the Splunk most users see is the search head. This is the webserver and app-interpreting engine that provides the primary, web-based user interface. Since most of the data interpretation happens as needed at search time, the role of the search head is to translate user and app requests into actionable searches for its indexer(s) and display the results. The Splunk web UI is highly customizable, either through our own view and app system, or by embedding Splunk searches in your own web apps via includes or our API.
  • Getting data into Splunk is designed to be as flexible and easy as possible. Because the indexing engine is so flexible and doesn’t generally require configuration for most IT data, all that remains is how to collect and ship the data to your Splunk instance. There are many options. First, you can collect data over the network, without an agent. The most common network input is syslog; Splunk is a fully compliant and customizable syslog listener over both TCP and UDP. Further, because Splunk is just software, any remote file share you can mount or symlink to via the operating system is available for indexing as well. To facilitate remote Windows data collection, Splunk has its own WMI query tool that can remotely collect Windows Event logs and performance counters from your Windows systems. Finally, Splunk has an AD monitoring tool that can connect to AD and get your user metadata to enhance your searching context, and monitor AD for replication, policy, or user security changes. When Splunk is running locally as an indexer or forwarder, you have additional options and greater control. Splunk can directly monitor hundreds or thousands of local files, index them, and detect changes. Additionally, many customers use our out-of-the-box scripts and tools to generate data; common examples include performance polling scripts on *nix hosts, API calls to collect hypervisor statistics, and detailed monitoring of custom apps running in debug modes. Also, Splunk has Windows-specific collection tools, including native Event Log access, registry monitoring drivers, performance monitoring, and AD monitoring, that can run locally with a minimal footprint.
  • Historically, a Splunk forwarder was a stripped-down version of the full Splunk distribution. Certain features, such as Splunk Web, were turned off to decrease the footprint on a remote host. Our customers asked us for something even lighter, and we delivered. The Universal Forwarder is a new, dedicated package specifically designed for collecting and sending data to Splunk. It’s super light on resources and easy to install, but still includes all the current Splunk inputs, without requiring Python. Most deployments should only require the Universal Forwarder, but we have kept all forwarding features in the Regular (or Heavy) Forwarder for cases when you need specific capabilities.
  • A single indexer can index 50-100 gigabytes per day, depending on the data sources and the load from searching. If you have terabytes a day, you can linearly scale a single, logical Splunk deployment by adding index servers, using Splunk’s built-in forwarder load balancing to distribute the data, and using distributed search to provide a single view across all of these servers. Unlike some log management products, you get full consolidated reporting and alerting, not simply merged query results. When in doubt, the first rule of scaling is “add another commodity indexer.” Splunk indexers are designed to enable nearly limitless fan-out with linear scalability by leveraging techniques like MapReduce to fan out work in a highly efficient manner.
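    On each forwarder, the built-in auto load balancing is a small outputs.conf stanza; a sketch with hypothetical hostnames:
      [tcpout]
      defaultGroup = my_indexers

      [tcpout:my_indexers]
      server = idx1.example.com:9997, idx2.example.com:9997
      autoLB = true
    Adding another indexer to the server list is, in practice, the “add another commodity indexer” step.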
  • Leverage distributed search to give each locale access to their own data, while providing a combined view to central teams back at headquarters. Whether to optimize your network traffic or meet data segmentation requirements, feel free to build your Splunk infrastructure as it makes sense for your organization. Further, each distributed search head automatically creates the correct app and user context while searching across other datasets. No specific custom configuration management is required; Splunk handles it for you.
  • The insights from your data are mission-critical. With Splunk Enterprise 5 we wanted to deliver a highly available system with enterprise-grade data resiliency, even as you scale on commodity storage, and we wanted to maintain Splunk’s robust, real-time, easy-to-use features. Splunk indexers can now be grouped together to replicate each other’s data, maintaining multiple copies of all data, preventing data loss and delivering highly available data for Splunk search. Using index replication, if one or more indexers fail, incoming data continues to get indexed and indexed data continues to be searchable. By spreading data across multiple indexers, searches can read from many indexers in parallel, improving parallelism of operations and performance. All as you scale on commodity servers and storage, and without a SAN.
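    A bare-bones sketch of index replication in server.conf (hosts, ports, and factors are illustrative):
      On the master node:    [clustering]
                             mode = master
                             replication_factor = 2
                             search_factor = 1
      On each peer indexer:  [replication_port://9887]
                             [clustering]
                             mode = slave
                             master_uri = https://master.example.com:8089
                             pass4SymmKey = <shared secret>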
  • For high availability and scale-out, combine auto load balancing with data cloning. Each clone group has one complete set of the overall data for redundancy, while load balancing within each clone group spreads the load and the data between indexers for efficient scaling. So long as one indexer remains in a clone group, that group will remain synced with the entirety of the data. Search Head Pooling can share the same application and user configurations and coordinate the scheduling of searches. This allows one logical pool of search heads to service large numbers of users with minimal downtime should a search head become unavailable. Additionally, by leveraging LDAP authentication, such as Active Directory, users can be directed to any search head as needed for load balancing or failover. NOTE: the second indexer needs to be licensed with an HA license (50% of a regular enterprise license).
  • Splunk isn’t the only technology that can benefit from IT data collection, so let Splunk help send the data to those systems that need it. For systems that want a direct tap into the raw data, Splunk can forward all or a subset of data in real time via TCP, as raw text or RFC-compliant syslog. This can be done on the forwarder or centrally via the indexer, without incrementing your daily indexing volume. Separately, Splunk can schedule sophisticated correlation searches and configure them to open tickets or insert events into SIEMs or operations event consoles. This allows you to summarize, mash up, and transform the data with the full power of the search language and import data into these other systems in a controlled fashion, even if they don’t natively support all the data types Splunk does. (Think MSSPs, cloud services, etc.)
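    For the raw-data tap, a rough outputs.conf sketch (destination names and ports are hypothetical):
      [syslog:to_siem]
      server = siem.example.com:514
      type = udp

      [tcpout:to_thirdparty]
      server = collector.example.com:9999
      sendCookedData = false
    The syslog stanza emits syslog-formatted events; sendCookedData = false sends plain raw text over TCP.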
  • Your logs and other IT data are important but often cryptic. You can extend Splunk’s search with lookups to external data sources as well as automate tagging of hosts, users, sources, IP addresses and other fields that appear in your IT data. This enables you to find and summarize IT data according to business impact, logical application, user role and other logical business mappings. In the example shown, Splunk is looking up the server’s IP address to determine which domain the servicing web host is located in, and the customer account number to show which local market the customer is coming from. Using these fields, a search user could create reports pivoted on this information easily. Illustrate Lookups:
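    A sketch of the lookup mechanics (the lookup name and fields are invented for illustration):
      Search > sourcetype=access_* | lookup account_info account_id OUTPUT local_market | top local_market
    Or automate it in props.conf so the field simply appears at search time:
      [access_combined]
      LOOKUP-market = account_info account_id OUTPUT local_market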
  • Splunk allows you to extend your existing AAA systems into the Splunk search system for both security and convenience. Splunk can connect to your LDAP based systems, like AD, and directly map your groups and users to Splunk users and roles. From there, define what users and groups can access Splunk, which apps and searches they have access to, and automatically (and transparently) filter their results by any search you can define. That allows you to not only exclude whole events that are inappropriate for a user to see, but also mask or hide specific fields in the data – such as customer names or credit card numbers – from those not authorized to see the entire event.
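    A minimal sketch of such a role in authorize.conf (the role name and filter are examples; NOT tag=PCI echoes the slide):
      [role_support]
      importRoles = user
      srchIndexesAllowed = main
      srchFilter = NOT tag=PCI
    Every search that role runs is silently ANDed with srchFilter, which is how whole events get excluded.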
  • Centralized License Management provides for a holistic approach in your multi-indexer distributed Splunk environment. You can aggregate compatible licenses into stacks of available license volume and define pools of indexers to use license volume from a given stack.
  • Splunk deployments can grow to encompass thousands of Splunk instances, including forwarders, indexers, and search heads. Splunk offers a deployment monitor app that helps you effectively manage medium- to large-scale deployments, keeping track of all your Splunk instances and providing early warning of unexpected or abnormal behavior. The deployment monitor provides chart-rich dashboards and drilldown pages that offer a wealth of information to help you monitor the health of your system. These are some of the things you can monitor: index throughput over time; number of forwarders connecting to the indexer over time; indexer and forwarder abnormalities; details for individual forwarders and indexers, such as status and forwarding volume over time; source types being indexed by the system; and license usage.
  • With thousands of enterprise customers and an order of magnitude more actual users, we have a thriving community. We launched a dev portal a few months back and already have over 1,000 unique visitors per week. We have over 300 apps contributed by ourselves, our partners, and our community. Our knowledge exchange, the Answers site, has more than 20,000 questions answered. And in August 2012 we ran our 3rd users’ conference, with over 1,000 users in attendance, over 100 sessions of content, and customers presenting. Best of all, this community demands more from Splunk and gives us incredible feedback.

Transcript

  • 1. Copyright © 2013 Splunk Inc. Getting Started User Training Workshop (Technical Workshops)
  • 2. Agenda: Getting Started with Splunk; Search; Alert; Dashboard; Deployment and Integration; Community; Help & Questions
  • 3. Getting Started With Splunk
  • 4. Splunk Delivers Value Across IT and the Business: App Dev and App Mgmt., IT Operations, Security and Compliance, Digital Intelligence, Business Analytics. Developer Platform (REST API, SDKs). Small Data. Big Data. Huge Data. Industrial Data and the Internet of Things.
  • 5. Install Splunk: www.splunk.com/download. 32 or 64 bit? Indexer or Universal Forwarder? Splunk Home: WIN: \Program Files\Splunk; other: /opt/splunk (or /Applications/splunk). Start Splunk: WIN: \Program Files\Splunk\bin\splunk.exe start (or start the service); *NIX: /opt/splunk/bin/splunk start
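    For a typical *NIX install, the commands boil down to something like this (the package filename is a placeholder):
      tar xzf splunk-<version>-Linux-x86_64.tgz -C /opt
      /opt/splunk/bin/splunk start --accept-license
      /opt/splunk/bin/splunk enable boot-start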
  • 6. Splunk Licenses. Free: download limits indexing to 500MB/day. Enterprise Trial: license expires after 60 days, then reverts to the Free license. Features disabled in the Free license: multiple user accounts and role-based access controls; distributed search; forwarding to non-Splunk instances; deployment management; scheduled saved searches and alerting; summary indexing. Other license types: Enterprise, Forwarder, Trial.
  • 7. Splunk Web Basics. Default installation on http://localhost:8000. Browser support: Firefox 10.x and latest; Internet Explorer 7, 8, 9 and 10; Safari (latest); Chrome (latest). Index data: Add data; Getting Started App; install an App (Splunk for Windows, *NIX).
  • 8. Splunk Web Basics continued… Splunk Home: provides an interactive portal to the Apps and data; includes a search bar and three panels: 1 – Apps, 2 – Data, 3 – Help. Splunk Apps: Splunk Home > Find more apps; provide different contexts for your data out of sets of views, dashboards, and configurations; default Search App; you can create your own!
  • 9. Optional: add some test data. Download the sample file (follow this link, save the file to your desktop, then unzip): http://bit.ly/UBPFWP (Using Splunk Book). Or, to follow along locally, you can download the slides, lookups and data samples at http://bit.ly/UjkNt6 (Dropbox). To add the file to Splunk: from the Welcome screen, click Add Data; click From files and directories on the bottom half of the screen; select Skip preview; click the radio button next to Upload and index a file; click Save. Install the *nix or Windows app to test drive your local OS data!
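    If you prefer the command line to the Add Data screens, a one-shot load looks roughly like this (the file path and sourcetype are placeholders):
      /opt/splunk/bin/splunk add oneshot /tmp/sampledata.log -sourcetype access_combined -index main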
  • 10. The *nix app in action.
  • 11. Best Practice Suggestion: create an individual index based on sourcetype. Easier to re-index data if you make a mistake; easier to remove data; easier to define permissions and data retention.
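    Creating and targeting such an index is a two-step sketch (the index name and monitored path are illustrative):
      /opt/splunk/bin/splunk add index web
      inputs.conf:  [monitor:///var/log/httpd]
                    sourcetype = access_combined
                    index = web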
  • 12. Search Basics
  • 13. Search app – Summary view. (Screenshot callouts: current view, app navigation, time range picker, search box, start search, global stats.) Selecting data summary by: Host, Source, Sourcetype.
  • 14. Searching. Search > *. Select time range: historical, custom, or real-time. Select mode: Smart, Fast, Verbose. Using the timeline: click events and zoom in and out; click and drag over events for a specific range.
  • 15. Everything is searchable. fail* – wildcards (*) supported. fail* nfs – search terms are case insensitive. Booleans AND, OR, NOT: Booleans must be uppercase; implied AND between terms; use () for complex searches; quote phrases. Examples: error OR 404; error OR failed OR (sourcetype=access_* (500 OR 503)); "login failure".
  • 16. Example search.
  • 17. Search Assistant. Contextual help: advanced type-ahead that updates as you type; shows examples and help; suggests search terms; toggle off/on. History: searches and commands. Search Reference: short/long descriptions and examples.
  • 18. Job Management. Searches can be managed as asynchronous processes (modify job settings; finalize, pause, delete). Jobs can be: scheduled; moved to background tasks; paused, stopped, resumed, finalized; managed; archived; cancelled.
  • 19. Search Commands. Search > error | head 1. Search results are “piped” to the command. Commands for: manipulating fields; formatting; handling results; reporting.
  • 20. Over 130 commands! http://www.splunk.com/base/Documentation/latest/SearchReference/SearchCheatsheet
  • 21. Field Extraction Fun
  • 22. Fields. Default fields: host, source, sourcetype, linecount, etc.; view on the left panel in search results, or view all in the field picker. Where do fields come from? Pre-defined by sourcetypes; automatically extracted key-value pairs; user defined.
  • 23. Sources, Sourcetypes, Hosts. Host - the hostname, IP address, or name of the network host from which the events originated. Source - the name of the file, stream, or other input. Sourcetype - a specific data type or data format.
  • 24. Tagging and Event Typing. Eventtypes: for more human-readable reports; to categorize and make sense of mountains of data; punctuation helps find events with similar patterns. Search > eventtype=failed_login instead of Search > “failed login” OR “FAILED LOGIN” OR “Authentication failure” OR “Failed to authenticate user”. Tags are labels: apply ad-hoc knowledge; create logical divisions or groups; tag hosts, sources, fields, even eventtypes. Search > tag=web_servers instead of Search > host=“apache1.splunk.com” OR host=“apache2.splunk.com” OR host=“apache3.splunk.com”.
  • 25. Extract Fields: Interactive Field Extractor. Generates PCRE; editable regex; preview/save.
  • 26. Extract Fields continued. Interactive Field Extractor (generates PCRE, editable regex, preview/save); configuration-file manual field extraction (including delim-based extractions); and the rex search command:
      props.conf:      [mysourcetype] REPORT-myclass = myFields
      transforms.conf: [myFields] REGEX = ^(\w+)\s FORMAT = myFieldLabel::$1
      rex:             ... | rex field=_raw "From: (?<from>.*) To: (?<to>.*)"
  • 27. Saved Search & Alert Basics
  • 28. Saved Searches: leverage searches for future insights! Reports, dashboards, alerts, eventtypes. Add a time range picker: Preset, Relative, Real-time, Date Range, Date & Time Range, Advanced.
  • 29. Create Alerts: scheduled or real-time. Define time ranges, conditions, thresholds.
  • 30. Alerting continued… Scheduled: searches run on a schedule and fire an alert. Example: run a search for “Failed password” every 15 min over the last 15 min and alert if the number of events is greater than 10. Real-time: searches run in real time and fire an alert. Example: run a search for “Failed password user=john.doe” in a 1 minute window and alert if an event is found.
  • 31. Alerting Actions: send email; RSS; execute a script; track alert details.
  • 32. Report & Dashboard Wackiness
  • 33. Reporting: build reports from the results of any search. Define your search and set your time range, accelerate your search, and more. Choose the type of chart (line, area, column, etc.) and other formatting options.
  • 34. Reporting examples: use the wizard or reporting commands (timechart, top, etc.); build real-time reports with real-time searches; save reports for use on dashboards.
  • 35. Dashboards: create dashboards from search results.
  • 36. Dashboard examples.
  • 37. Manager: settings for all of that cool stuff you just created (and more!): permissions; saved searches/reports; custom views; distributed Splunk; Deployment Server; license usage…
  • 38. Deployment and Integration
  • 39. Splunk has four primary functions: searching and reporting (search head); indexing and search services (indexer); local and distributed management (deployment server); data collection and forwarding (forwarder). A Splunk install can be one or all roles…
  • 40. Getting Data Into Splunk: agent and agent-less approaches for flexibility. Local file monitoring: log files, config files, dumps and trace files. TCP/UDP: syslog-compatible hosts and network devices. Mounted file systems: \\hostname\mount. Scripted inputs: shell scripts, custom parsers, batch loading. WMI (agent-less): Event Logs, performance counters, Active Directory. Windows inputs: Event Logs, performance counters, registry monitoring, Active Directory monitoring. (Diagram: Splunk Forwarder on Unix, Linux and Windows hosts; custom apps and scripted API connections; agent-less data input from Windows hosts.)
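    The collection options above map onto inputs.conf stanzas; a rough sampler (paths and the TCP port are examples):
      [monitor:///var/log/messages]
      sourcetype = syslog

      [tcp://5514]
      sourcetype = syslog

      [WinEventLog://Security]
      # Windows hosts only

      [script://./bin/top.sh]
      interval = 60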
  • 41. Understanding the Universal Forwarder: forward data without negatively impacting production performance. Both the Universal Forwarder and the Regular (Heavy) Forwarder monitor all supported inputs and support central deployment management; the Regular (Heavy) Forwarder additionally offers routing, filtering and cloning, Splunk Web, event-based routing, and Python libraries. (Diagram: logs, metrics, scripts, configurations, messages flowing through the forwarder.) Monitor files, changes and the system registry; capture metrics and status.
  • 42. Horizontal Scaling: load-balanced search and indexing for massive, linear scale-out. (Diagram: distributed search across indexers; forwarder auto load balancing.)
  • 43. Multiple Datacenters: index and store locally; distribute searches to datacenters, networks and geographies. (Diagram: headquarters with distributed search across London, Hong Kong, Tokyo, New York.)
  • 44. High Availability, on Commodity Servers and Storage: index replication. As Splunk collects data, it keeps multiple identical copies. If an indexer fails, incoming data continues to get indexed and indexed data continues to be searchable. Easy setup and administration. Constant uptime; data integrity and resilience without a SAN. (Diagram: Splunk Universal Forwarder pool.)
  • 45. High Availability: combine auto load balancing and cloning for HA at every Splunk tier. (Diagram: shared storage; distributed search; Clone Group 1 and Clone Group 2, each with a complete dataset; data cloning and auto load balancing.)
  • 46. Send Data to Other Systems: route raw data in real time or send alerts based on searches (service desk, event console, SIEM).
  • 47. Integrate External Data: extend search with lookups to external data sources (watch lists; LDAP, AD; CMDB; CRM/ERP). Correlate IP addresses with locations, accounts with regions.
  • 48. Integrate Users and Roles: integrate authentication with LDAP and Active Directory. Map LDAP and AD groups to flexible Splunk roles; define any search as a filter. (Diagram: LDAP/AD users and groups mapped to flexible Splunk roles with capabilities and filters, such as manage indexes, share searches, save searches, manage users, problem investigation; filter examples: NOT tag=PCI, App=ERP.)
  • 49. Centralized Licensing Management: groups, stacks, and pools for enterprise deployments.
  • 50. Deployment Monitoring: keep tabs on your Splunk Enterprise deployment (licenses, sourcetypes, indexers, forwarders).
  • 51. Support and Community
  • 52. Support Through the Splunk Community: Splunkbase; .conf2014.
  • 53. Where to Go for Help: Documentation - http://www.splunk.com/base/Documentation; Technical Support - http://www.splunk.com/support; Videos - http://www.splunk.com/videos; Education - http://www.splunk.com/goto/education; Community - http://answers.splunk.com; Splunk Book - http://splunkbook.com
  • 54. Thank You