Making Logs Sexy Again: Can We Finally Lose The Regexes?
 

Making Logs Sexy Again: Can We Finally Lose The Regexes?, presented at DeepSec 2008 in Vienna, Austria

  • “Will be high-level, but the subject is definitely pretty DEEP… so DeepSec” Making Logs Sexy Again: Can We Finally Lose The Regexes?   A few years ago, a conference organizer complained to me that there hasn’t been anything new in the field of automated log analysis for many years, and that we have not progressed much since regular expressions (regexes) were first used for extracting bits of useful information from logs. This talk is intended to change that! It will go through a set of approaches to the “log problem”: some completely new (text mining of logs), some borrowed from other fields (Bayesian methods for log analysis), and show how people can use these new methods to solve practical problems. Only real-world-tested analysis methods, and only those that can actually be used by the audience, will be shown (and, of course, only the “technically cool” and novel ones!). Example questions that will be answered: Are logs data or text? Which data mining methods work really well on logs while still being simple enough for non-Ph.D.s? Can you make sense of all the keywords and key phrases in logs? Does Bayes work for logs? Can you predict when the system will crash based on logs?

Making Logs Sexy Again: Can We Finally Lose The Regexes? Presentation Transcript

  • Making Logs Sexy Again: Can We Finally Lose The Regexes? Dr. Anton Chuvakin
  • Agenda
      • What do we do with logs?
      • What’s wrong with logs?
        • Logging “Grand Challenges”
      • What has been tried to make it right?
      • What’s next?
    C O N F I D E N T I A L
  • Who is Anton?
    • Dr. Anton Chuvakin [formerly] of LogLogic “is probably the number one authority on system logging in the world”
    • SANS Institute (2008)
  • First: WTH is Logs?
    • I. A structured and timed audit trail of system, network activities?
    • II. A huge messy pile of undocumented, unstructured or poorly structured “stuff” that might or might not mean anything – at all?
  • WTF?!!
    • %PIX|ASA-3-713185 Error: Username too long - connection aborted
    • ERROR: transport error 202: send failed: Success
    • userenv[error] 1030 RCI-CORPwsupx No description available
    • Aug 11 09:11:19 xx null pif ? exit! 0
  • Second: What Do We Do With Logs?
    • Ignore them
      • by far the #1 popular choice!
    • Collect them – and then ignore them
    • Collect them to show them to whoever wants them – and then nobody does
    • Collect and look for “bad” stuff
    • Collect and parse them into a database – ignoring logs now costs more!
    • Analysis?
  • Log Analysis Basics: Summary
    • Manual
    • Filtering
    • Summarization and reports
    • Log searching
    • Correlation
  • Log Analysis Basics: Manual
    • Manual log review
      • Just fire up your trusty ‘tail’, ‘more’, ‘notepad’, ‘vi’, Event Viewer, etc. and get to it!
    • Pros:
        • Easy, no tools required (neither build nor buy)
    • Cons:
        • Try it with a 10GB log file one day
        • Boring as Hell!
  • Log Analysis Basics: Filtering
    • Log Filtering
      • Just show me the bad stuff; here is the list (positive)
      • Just ignore the good stuff; here is the list (negative or “Artificial Ignorance”)
    • Pros:
      • Easy result interpretation: see->act
      • Many tools or write your own
    • Cons:
      • Patterns beyond single messages?
      • Neither good nor bad, but interesting?
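The negative-filtering (“Artificial Ignorance”) idea above can be sketched in a few lines of Python. This is an illustrative sketch, not a tool from the talk; the ignore patterns and sample log lines are invented:

```python
import re

# Patterns for log lines known to be routine ("good stuff" to ignore);
# everything that does NOT match any of them is surfaced for review.
IGNORE_PATTERNS = [
    re.compile(r"Accepted password for \w+"),
    re.compile(r"session (opened|closed) for user \w+"),
]

def artificial_ignorance(lines):
    """Yield only the lines that match none of the known-routine patterns."""
    for line in lines:
        if not any(p.search(line) for p in IGNORE_PATTERNS):
            yield line

logs = [
    "sshd[27577]: Accepted password for kyle from 192.168.138.35",
    "kernel: EXT3-fs error (device sda1): unable to read inode block",
]
print(list(artificial_ignorance(logs)))
```

Note the con from the slide applies directly: the output is everything “neither good nor bad,” and patterns spanning multiple messages are invisible to this per-line filter.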
  • Log Analysis Basics: Summary
    • Summarization and reports
      • Top X Users, Connections by IP,
    • Pros:
      • Dramatically reduces the size of data
      • Suitable for high-level reporting
    • Cons:
      • Loss of information by summarizing
      • Which report to pick for a task?
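A “Top X Users” style summary, as mentioned above, reduces to counting. A minimal Python sketch with invented event data:

```python
from collections import Counter

# Each tuple is (user, source_ip) from parsed login events; data is invented.
events = [
    ("kyle", "192.168.138.35"),
    ("kyle", "192.168.138.35"),
    ("root", "10.14.98.55"),
    ("kyle", "10.4.2.11"),
]

def top_users(events, n=3):
    """'Top X Users' report: count login events per user, keep the top n."""
    return Counter(user for user, _ip in events).most_common(n)

print(top_users(events))
```

The con from the slide is visible here too: the source IPs are summarized away entirely, so detail is lost the moment the report is produced.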
  • Log Analysis Basics: Search
    • Search across stored logs
    • User specifies a time period, log source(s), and an expression; gets back logs that match (regex or index keywords)
    • Pro
      • Easy to understand
      • Quick to do
    • Con
      • What do you search for?
      • A LOT of data back, sometimes
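The search workflow above (time period + expression -> matching records) can be sketched as follows; the log records and query are invented for illustration:

```python
import re
from datetime import datetime

# Stored logs as (timestamp, message) pairs; data is invented.
logs = [
    (datetime(2008, 11, 1, 9, 0), "sshd[27577]: Accepted password for kyle"),
    (datetime(2008, 11, 1, 9, 5), "sshd[27581]: Failed password for root"),
    (datetime(2008, 11, 2, 3, 0), "sshd[28001]: Failed password for root"),
]

def search(logs, pattern, start, end):
    """Return (timestamp, message) records in the window matching the regex."""
    rx = re.compile(pattern)
    return [(ts, msg) for ts, msg in logs if start <= ts <= end and rx.search(msg)]

hits = search(logs, r"Failed password",
              datetime(2008, 11, 1), datetime(2008, 11, 1, 23, 59))
print(hits)
```

Both cons show up immediately: the user must already know to type “Failed password”, and a broad pattern over real volumes returns far too much.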
  • Log Analysis Basics: Correlation
    • Correlation
      • Rule-based and other “correlation” and “Correlation” algorithms
    • Pro
      • Promise of automated analysis
    • Con
      • Needs rules written by experts
      • Often, needs to be operated by experts too
      • Needs tuning for each site
    • Happy Now?
    • Most People Aren’t
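A toy example of the kind of hand-written rule the correlation slide refers to: several login failures from one source followed by a success suggests a brute-force attempt. This is an illustrative sketch with invented events, not a rule from any product:

```python
from collections import defaultdict

def correlate(events, threshold=3):
    """Alert when a source logs >= threshold failures and then a success."""
    fails = defaultdict(int)
    alerts = []
    for src, outcome in events:
        if outcome == "fail":
            fails[src] += 1
        else:  # success: alert if the failure streak was long, then reset
            if fails[src] >= threshold:
                alerts.append(src)
            fails[src] = 0
    return alerts

events = [("10.0.0.5", "fail")] * 4 + [("10.0.0.5", "success"),
                                       ("10.0.0.9", "success")]
print(correlate(events))
```

Even this toy illustrates the cons: an expert chose the rule logic and the threshold, and both would need tuning per site.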
  • Log Management “Grand Challenges?”
    • Given “the state of the art” of log analysis, major problems remain.
    • Logging Grand Challenges
    • Log management BIG and unsolved problems that cause major pain!
    • Problems that people tried to solve – and FAILED!
  • GC1 – Log Chaos
    • Challenge
      • Logs come in a dizzying variety of formats, they look and mean different – how do we understand them? Some of them are just “bad!”
    • Why a grand challenge?
      • Lack of log standards makes log analysis an unreliable and complex art
    • Current approaches?
      • Take logs one by one; write regexes or index
    • Why still a challenge?
      • No credible log standard emerged (work ongoing)
  • Example Log Chaos - Login?
    • <122> Mar 4 09:23:15 localhost sshd[27577]: Accepted password for kyle from ::ffff:192.168.138.35 port 2895 ssh2
    • <13> Fri Mar 17 14:29:38 2006 680 Security SYSTEM User Failure Audit ENTERPRISE Account Logon Logon attempt by: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0 Logon account: POWERUSER
    • <57> Dec 25 00:04:32: %SEC_LOGIN-5-LOGIN_SUCCESS: Login Success [user:yellowdog] [Source:10.4.2.11] [localport:23] at 20:55:40 UTC Fri Feb 28 2006
    • <18> Dec 17 15:45:57 10.14.93.7 ns5xp: NetScreen device_id=ns5xp system-warning-00515: Admin User netscreen has logged on via Telnet from 10.14.98.55:39073 (2002-12-17 15:50:53)
  • GC2 – Secure and Reliable Log Collection
    • Challenge
      • To collect the logs securely, reliably AND without heavy management overhead and complexity of access
    • Why a grand challenge?
      • Agents vs remote grabbing vs stream: all suck. Security and reliability cost major management overhead
    • Current approaches?
      • Agents + remote grab (administrator access) + “silly stream” (syslog UDP)
    • Why still a challenge?
      • All approaches have critical drawbacks
  • GC3 - Log Parsing and Regexes
    • Challenge
      • To turn logs into information, one needs to parse them; to parse them one needs regular expressions (regex)
    • Why a grand challenge?
      • Every log type requires hand-writing a set of regexes
    • Current approaches?
      • UIs, “semi-auto”/assisted regex creators, limited auto-extraction, choosing not to parse, etc
    • Why still a challenge?
      • Despite all tools, log expert must create the rules
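To make GC3 concrete, here is a hand-written regex for exactly one log type, the OpenSSH successful-login line from the “Log Chaos” example above. The named-group layout is my own illustration; every other log format would need its own such regex:

```python
import re

# One regex for one log type (OpenSSH "Accepted ..." lines) -- the
# per-format hand labor that GC3 describes.
SSHD_ACCEPT = re.compile(
    r"sshd\[(?P<pid>\d+)\]: Accepted (?P<method>\w+) for (?P<user>\S+) "
    r"from (?P<src>\S+) port (?P<port>\d+)"
)

line = ("Mar  4 09:23:15 localhost sshd[27577]: "
        "Accepted password for kyle from ::ffff:192.168.138.35 port 2895 ssh2")
m = SSHD_ACCEPT.search(line)
print(m.groupdict())
```

The Windows, Cisco, and NetScreen login lines from the same slide would each defeat this regex completely, which is the point.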
  • GC4 – Automated Meaning Extraction
    • Challenge
      • Automatically analyze logs and gain useful information, across domains (security, ops, compliance)
    • Why a grand challenge?
      • Log analysis is heavily manual, interpretative and domain- and system-specific
    • Current approaches?
      • Rule-based, summarization, filtering, minimum anomaly detection
    • Why still a challenge?
      • “ Log analysis is an art, not science” -> not much automation
  • GC5 – “Fuzzy” Search
    • Challenge
      • How to find the “right” log message(s) without knowing what to look for, exactly?
    • Why a grand challenge?
      • Many uses of logs require searching but users often don’t know what to look for
    • Current approaches?
      • Trying keywords + wildcards + refining search as we go
    • Why still a challenge?
      • No method to incorporate uncertainty in search is found yet
    • What Else Can We Try?
  • Handling The Challenges
    • Data mining
    • “ Search+” or smart search
    • Text mining (or “search++”)
    • Bayesian analysis
    • Context enrichment
    • Visualization
  • Log Mining
    • Why “mine the logs ”?
      • More human-like pattern recognition
      • Dealing with sparse data
      • Prediction ? Probably not (not soon!)
    • Towards “replacing” humans ( trying to…)
      • Offloading conclusion generation to machines
      • “ Better than junior analysts”
  • Preliminary DATA Requirements
    • Mostly the same as for other log analysis, but with some added factors:
    • Centralized
      • To look in just one place
    • Normalized
      • To look across the data sources
    • Quick accessible storage
      • To be used by the mining tools
  • What Do We “Mine” for?
    • How about for something interesting ?
    • One research paper defines “interesting” thus:
      • Unexpected to user (aka not “normal”, not routine)
      • Actionable (we can and/or should do something about it)
  • Simple Example
    • Too many attack types from a single IP address
    • Right next to known vulnerability scanners
    • External IP address
    • Conclusion : potentially dangerous attacker
  • Deeper into interesting - I
    • Approaches to finding interesting stuff in logs without knowing what we look for specifically :
    • Rare things
      • Is compromise rare in your environment?
    • Different things ( NEW , GONE , etc)
      • Is today “just another day” … or not?
    • “ Out of character ” things
      • It always does it… but not today?
  • Can You Guess What Happened?!
  • Deeper into interesting - II
    • Counts of an otherwise uninteresting thing
      • Pings? Connections to port 80? Error 404s?
    • Ratios of otherwise uninteresting things
      • Login failures / login successes?
      • Inbound / outbound connections?
    • Frequencies of things
      • Frequent becoming rare – and vice versa!
    • Time series behaving badly
      • Traffic overall grows, but traffic vs system X slows
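The “ratios of otherwise uninteresting things” idea above is simple enough to sketch directly; the daily counts below are invented for illustration:

```python
# Login failures / login successes: two individually boring counts whose
# ratio becomes interesting when it jumps. Counts are invented.
def failure_ratio(failures, successes):
    return failures / successes if successes else float("inf")

baseline = failure_ratio(20, 400)   # a quiet day
today = failure_ratio(180, 300)     # today
print(today > 5 * baseline)         # ratio jumped -> worth a look
```

Neither count alone would trip a filter or a rule; the ratio (and its change against a baseline) is what carries the signal.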
  • Where Is The DATA?
    • But are logs really data? Looks like [broken] English to me…
    • %PIX-2-214001: Terminating manager session from 10.10.55.2 on interface inside. Reason: incoming encrypted data (18998 bytes) longer than 12453 bytes
    • %PIX-3-109016: Downloaded authorization access-list 101 not found for user sunilp
    • So, is DATA mining appropriate?
  • “ Search+”
    • Workflows to tune searches
    • Advanced search syntax
    • Search data presentation and visualization
    • Statistics on search results
    • Automated text-> data conversion (assisted “parsing”)
  • Text Mining Logs?
    • Text clustering
    • Baselining and profiling of text streams
    • Bayesian mining
    • Example : textalog tool
    • New/gone keywords, keyword pairs or phrases
    • Changes in keyword frequency and mixture
    • New/gone text clusters
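A crude sketch of the Bayesian-mining idea above (not the textalog tool itself): score a log line by the log-odds of its words under “interesting” vs. “routine” word counts, with add-one smoothing. The training lines are invented:

```python
import math
from collections import Counter

def word_counts(lines):
    """Word frequencies across a set of training log lines."""
    c = Counter()
    for line in lines:
        c.update(line.lower().split())
    return c

def log_odds(line, routine, interesting):
    """Naive-Bayes-style log-odds that a line is 'interesting' (> 0 means yes)."""
    r_total, i_total = sum(routine.values()), sum(interesting.values())
    score = 0.0
    for w in line.lower().split():
        p_i = (interesting[w] + 1) / (i_total + 1)   # crude add-one smoothing
        p_r = (routine[w] + 1) / (r_total + 1)
        score += math.log(p_i / p_r)
    return score

routine = word_counts(["connection from a established ok",
                       "user logged in ok"])
interesting = word_counts(["transport error send failed",
                           "critical error process crashed"])
print(log_odds("transport error 202 send failed", routine, interesting))
```

Words like “error” and “failed” push the score positive even for a never-seen line, which is exactly the leverage plain keyword filters lack.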
  • Keywords to Meaning?
      • E.g. Change Management activity
        • Keywords: change*, modif*, add*, delete*, drop, remove*, creat*, restore*, set, clear*, enable*, install*, write*, rename*, alter*, truncate*, renam*, update*, erase*
        • Keyword pairs: policy + change*, object + change*, file + write*, account + added*, audit* + change*, creat* + (user OR account), new + user OR account, change OR modify + restart, config* + erase*, user + add*, config* + writ*, config* + chang*, account + change*
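A minimal sketch of mapping keywords to meaning, using a few of the change-management keyword pairs listed above. Treating each stemmed keyword as a substring is my simplification of the wildcard match:

```python
# A few of the slide's change-management keyword pairs; a line is tagged
# as change activity when both halves of any pair appear in it.
KEYWORD_PAIRS = [
    ("polic", "chang"),
    ("user", "add"),
    ("config", "writ"),
    ("account", "chang"),
]

def is_change_activity(line):
    """Rough stand-in for wildcard matching: stems as substrings."""
    low = line.lower()
    return any(a in low and b in low for a, b in KEYWORD_PAIRS)

print(is_change_activity("audit: user jsmith added to group wheel"))
```

Requiring both halves of a pair cuts down on the false hits a single stem like “add” would produce on its own.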
  • Log Visualization
      • Log Data -> Pictures
        • But can you do “anything but the scan”?
      • Data visualization is MUCH easier than text visualization
      • Example : AfterGlow tool
  • Log Context Enrichment?
  • Finally, Can We Make Better Logs? CEE = Syntax + Vocabulary + Transport + Log Recommendations
    • Common Event Expression Impacts
      • Log management capabilities
      • Log correlation (SIEM) capabilities
      • Device intercommunication enabling autonomic computing
      • Enterprise-level situational awareness
      • Infosec attacker modeling and other security analysis capability
    • Common Event Expression Taxonomy
      • To specify the event in a common representation
    • Common Log Syntax
      • For parsing out relevant data from received log messages
    • Common Log Transport
      • For exchanging log messages
    • Log Recommendations
      • For guiding events and details needed to be logged by devices (OS, IDS, FWs, etc)
  • Conclusions and CALL TO ACTION!
    • As we collect more logs, the issue of “what to do with them?” will come up in force, FINALLY!
    • Current methods of turning logs into useful info mostly suck (or “waaaaaaaaaay too hard”)
    • Promising other methods are KNOWN!
    • What more do you need?
    • Get to work on these big logging problems!
  • Thank You!
    • Dr. Anton Chuvakin
    • “ Log Addict”
    • www.chuvakin.org
    • See www.info-secure.org for my papers, books, reviews and other security and logging resources.
    • Subscribe to my blog at www.securitywarrior.org
  • Backup and Reference Slides
  • “ Top 11 Reasons People Hate Logs”
      • Read any logs lately? Got bored in 5 minutes - or survived for the whopping 10? Congrats, you score a point! But logs are still boooooooooooooooooooooooooooooring .
      • One log, two logs, 10 logs.... 1,000,000,000 logs: rabbits and hamsters cannot match the speed with which logs multiply . Don't you just hate that?
      • You keep hearing people refer to “log data.” Then you run 'tail /var/log/messages' and see text in pidgin English. Where is my data? Hate it!
      • “Real hackers don't get logged”: thus logs are seen as useless - and hated by some “hard core” security pros!
      • If people lie to you, you hate it. Logs do lie too (see 'false positives') - and they are hated too.
      • 'Transport error 202 message repeated 3456 times.' Niiiiice. Now go fix that! Fix what? Ah, hate the log obscurity !
      • Why are there 47 different ways to log that “connection from A to B was established OK?” Or 21 ways to say “user logged in OK?” No, really? Why? Who can I kill to stop this insanity?
      • You MUST do XYZ with logs for compliance . Or you are going to jail, buddy! No, sorry, we can't tell you what XYZ is. Maybe in 7 years; for now, just store everything.
      • 'Critical error: process completed successfully'  and 'Operation successfully failed' engender deep and lasting hatred of logs in most people. They just do ...
      • The book called “Ugliest Logs Ever!” is a fat tome, covering every log source from a Linux system all the way to databases and CRM. Bad logs are popular! Bad logs are all the rage among the programmers! Bad logs are here to stay. Bad logs that mean nothing power the log hatred.
      • “Logs: can't live with them, can't live without them” :-) Hate them we might for different reasons, but we still must collect, protect, review, and analyze them...