HP Discovery and Dependency Mapping Inventory: a deep dive into software application recognition teaching

In this technical session we'll introduce the software application recognition functionality of HP Discovery and Dependency Mapping Inventory. We'll discuss various methods of software application teaching, including traditional file-based recognition and recently introduced techniques and approaches such as express teaching and installed package rule-based recognition. You'll leave this presentation with a wealth of tips, tricks and best practices for a successful implementation of your own software application recognition project for asset management purposes.

HP Discovery and Dependency Mapping Inventory: a deep dive into software application recognition teaching Presentation Transcript

  • 1. HP Discovery and Dependency Mapping Inventory: A deep dive into Software Application Recognition and Teaching. Speakers: Vitaly Miryanov, Functional Architect; Brindusa Kevorkian, R&D Section Manager 1 ©2010 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice
  • 2. Agenda – Introduction – Recognition Engine Details – Teaching Techniques – Best practices, tips and tricks – Q&A 2
  • 3. Introduction 3
  • 4. Software Application Recognition. Software Inventory: part of the inventory information that provides details about installed software applications. Software Application Recognition: the process of application identification with the following normalised details: • Application Name • Publisher Name • Release Name • Version Name 4
  • 5. Sources of Raw Application Data – Windows: • Control Panel > Add/Remove programs (prior to Vista) • Control Panel > Programs and Features (Vista, Server 2008, 7) • WMI (Win32_SoftwareElement) • Version Data found in executable files – Windows/UNIX/Mac OS X: • Native package managers (MSI, RPM, Depot, BFF, PKG, etc.) – List of Running Processes 5
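    For illustration, the raw (un-normalised) data behind two of the Windows sources above can be inspected from a command prompt. A minimal sketch, not from the deck itself: reg query reads the Add/Remove Programs registry data, and wmic queries Win32_SoftwareElement, which is populated mainly for MSI-installed software.

        REM Add/Remove Programs entries live under the registry Uninstall key
        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall" /s /v DisplayName

        REM Raw WMI software element data (can be slow; MSI-installed software only)
        wmic path Win32_SoftwareElement get Name,Version

    The inconsistency of the output across these two sources is exactly the problem the next slide describes.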
  • 6. Drawbacks of These Sources – Software Application Data in these sources is: • Inconsistent • Incomplete • Inaccurate • Not normalised – For these reasons it is not possible to use it for accurate analysis of software applications for Asset Management purposes 6
  • 7. Recognition Engine Details 7
  • 8. Normalised Data – Software Application Library records details of • Software Publishers • Application Names • Release and Version names – Library is stored in Software Application Index (SAI) files • Contains executable file details for each application version • Set of rules for applications that can be recognised using rules 8
  • 9. Recognition Engine. Raw Data in: – File Details (File Name, File Size, File Signature, File Type, Version Data) – OS Package Manager data (Publisher, Application Name, Version, other properties) and the relationship between packages and installed files – OS Details from Hardware Data (UNIX OS recognition). The Recognition Engine, consulting the Software Application Library, produces Application Data out: – Publisher – Application – Release – Version 9
  • 10. Recognition Stages 1. Initialisation 2. File-level Recognition 3. Directory-level Recognition 4. Machine-level Recognition 10
  • 11. Initialisation – Hardware Data is read from the scan file – Evaluation of Installed Package Rules – Installed Packages matching rules are identified – UNIX OS recognition is done – The Recognition Engine is prepared to iterate over all file and directory data collected in the scan file 11
  • 12. File-level Recognition – Files belonging to matched packages are identified – Remaining files participate in the file-based recognition – If a file has a version data rule associated with it, the rule is evaluated based on the file's version data 12
  • 13. Ratings for File Matches – Zero rating (not recognised): File name did not match – Minimum rating: File name matched – Minimum good rating: File name, Executable type and size matched – Good rating: • File name, Executable type, size and signature matched • File name, Executable type and version data matched 13
  • 14. Directory-level Recognition – Triggered when all files in the directory have been processed – Attempts to select version(s) to represent files located in this directory – If the version can be identified at this stage, all files belonging to this version are recognised – Otherwise if a best match cannot be made for a group of files, the files are deferred to Machine-level Recognition 14
  • 15. Best Match Version – The overall rating for each possible version is calculated – If a version has an install string match, its overall rating is slightly increased – The version with the highest overall rating is selected 15
  • 16. Machine-level Recognition – Triggered when all directories have been processed – The files left over from the directory-level recognition are analysed – A Best Match Version is calculated from the list of remaining files 16
  • 17. Release Relations – Relationships are done on the Release level to reduce the number of relations to be entered – Used for: • "Suite" licence purposes – Example: Microsoft Word / Microsoft Office • Saving space in the library – Example: embedded Java JRE 17
  • 18. File Relation to Application – Main • Key application file, a file which is "always there" – Associated • A file that is part of the application, but not the Main file – Third Party • A file from a different publisher (usually common runtime DLLs, etc.) 18
  • 19. Software Application Library – Supplied as a set of Master SAI files: • Master • French • German • UNIX • BaseUnixOS – Updated monthly: available at Software Support Online: • http://support.openview.hp.com/selfsolve/patches – Currently contains • 26,000 application versions • 3,000,000 files • 8,000 installed package rules – Users can create their own user SAI files 19
  • 20. Teaching Techniques 20
  • 21. Teaching Methods 1. From a scan file of the application installed on a “clean” OS installation 2. From a Microsoft Installer file 3. From an output of the MSI scanner 4. From a targeted scan of a particular directory / directories 5. From a UNIX standard package type 6. From a difference between two scan files 7. Express teaching using web UI 8. Teaching in Analysis Workbench when analysing multiple scan files 21
  • 22. Teaching from “Clean” OS Installation – The OS is installed with no patches – Scan the entire image (classic scan) with the scanner after the application installation – “Tools > Import Data from Recognition Result…” menu item in the SAI Editor is used for teaching from the obtained scan file – It is easy to spot the files belonging to the installed application – they are selected and added to the library 22
  • 23. Teaching from “Clean” OS Installation 23
  • 24. Teaching from "Clean" OS Installation. Pros: – Recognition results are shown – File relation to application is automatically assigned – Easy to identify and add files from different directories – Only the files actually installed can be added. Cons: – Clean OS install has to be maintained / reverted to – Application needs to be installed – A scan of the entire disk is required 24
  • 25. Teaching from MSI – Can work even with MSI files embedded into an .exe installer – If the application's installation is always supplied as MSI, simply create an MSI installed package rule – Otherwise teach the files – the process for either an MSI file or an output file of the MSI scanner (.xml) is the same: in the SAI Editor use "Tools > Import Data from MSI…" 25
  • 26. Creating an MSI Package Rule 26
  • 27. Teaching from MSI 27
  • 28. Teaching from MSI. Pros: – No installation required – In many cases it is possible to teach all files. Cons: – File Relation to application is not assigned – May miss some files 28
  • 29. Teaching from a Targeted Scan of a Directory – Useful when it is known that all of an application's files are installed in a separate application directory tree – Scan this directory by specifying its name in the -paths:<dirname> scanner command line switch – "Tools > Import Data from Recognition Result…" menu item in the SAI Editor is used for teaching – All files are selected and added to the library 29
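    For example, a targeted scan of a single directory tree might look like the sketch below. The scanner executable name (scanwin32-x86.exe) and the application path are assumptions for illustration, not taken from the deck.

        REM Hypothetical scanner binary and path; -paths restricts the scan to one directory tree
        scanwin32-x86.exe -paths:"C:\Program Files\MyApp"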
  • 30. Teaching from a UNIX Standard Package – If an application is always supplied as the standard UNIX package type, create an installed package rule – If an application is supplied in multiple media formats, the files belonging to the package need to be added to the library. There are two options: • Extract the files from the standard package to a directory • Install the package 30
  • 31. Teaching from a UNIX Standard Package 31
  • 32. Teaching from Scan File Comparison – Two classic local hard drive scan files are required: • Before the application installation • After the application installation is finished – The SAI Editor is used: "Tools > Import Data from Scanfile Comparison…" 32
  • 33. Teaching from Scan File Comparison 33
  • 34. Teaching from Scan File Comparison 34
  • 35. Teaching from Scan File Comparison. Pros: – Easy to use – All installed files are added. Cons: – File Relation to application is not assigned – Need to scan twice – Need to be careful not to include unrelated files 35
  • 36. Express Teaching – Easy to use “entry” level teaching capability – Works with Windows applications that have consistent version data – From the web UI: in the tree on the left choose “Express Teaching” 36
  • 37. Express Teaching 37
  • 38. Express Teaching 38
  • 39. Express Teaching 39
  • 40. Express Teaching. Pros: – Very easy to use – No scan files are required – works from the database. Cons: – Only works on Windows – Misses 3rd party files, files with no version data in them, or files that do not have consistent version data naming – Cannot see its effect immediately 40
  • 41. Teaching in Analysis Workbench – Offers the most advanced teaching capabilities – Can be used to deeply analyse application recognition of multiple scan files: • Identifies CheckVer and Unrecognised files • Identifies CheckVer applications • Can slice and dice data using advanced tagging, filtering and sorting – Once the files to teach have been identified and tagged, select "Tag > Add to SAI…" in the File Window menu 41
  • 42. Teaching in Analysis Workbench 42
  • 43. Teaching in Analysis Workbench. Pros: – Can analyse multiple scan files – Very flexible – Can be used to teach from any scan file. Cons: – Loading of scan files is required – Need to figure out manually which files to teach – Takes time to learn how to use 43
  • 44. Best Practices and Tips and Tricks 44
  • 45. Top-down Approach: 1. Establish the list of required applications 2. Compare it to what is available in the Master Application Library 3. Work on the delta: either teach missing applications to the user SAI or submit a request for a Master SAI addition 45
  • 46. Top-down Approach – Usually driven by: • An immediate need to address a licence audit from a single or a few software publishers • A need to identify the most popular/expensive software titles • A licence reconciliation project – The list of applications can usually be obtained from: • An asset tracking programme, such as Asset Manager • The company's software purchasing team • The company's software catalogue • The Software Publisher's list – When comparing the application list to the data in the Master library: • Make sure to use the correct publisher name • The application name from the list may not exactly match the name used in the Master library 46
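    One lightweight way to work on that delta is a plain-text comparison of the two name lists. A minimal sketch, assuming hypothetical one-name-per-line exports required_apps.txt and master_sai_apps.txt:

        # comm needs sorted input; -23 keeps lines that appear only in the required list
        sort required_apps.txt > req.sorted
        sort master_sai_apps.txt > sai.sorted
        comm -23 req.sorted sai.sorted > apps_to_teach.txt

    Normalising publisher and application names before the comparison matters here, for exactly the reasons the slide above gives.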
  • 47. Bottom-up Approach: 1. Establish the list of most critical unrecognised files 2. Identify the applications to which the unrecognised files belong 3. Create recognition rules, teach unrecognised files to the user SAI, or submit a request for a Master SAI addition 47
  • 48. Identify Most Critical Unrecognised Files – Using reports: • Reports > Unrecognized File Reports (can sort on Number of devices/Usage) • Reports > Unrecognized Files > Devices with High Risk Files Based on Frequency • Reports > Unrecognized Files > Devices with High Risk Files Based on Usage • Status > Unrecognized Files Distribution – Using Analysis Workbench (File Window) – Using Analysis Workbench (Application Window) 48
  • 49. Identify an Application the Unrecognised File Belongs To – Use Version Data found in Windows applications – Use directory names – Use hardware data under Operating System Data: • OS Installed Applications • Program Shortcuts • Services – If access to the computer is available: • For known programmes, run them to find out the version • Use the UNIX "strings" command to extract text messages – Work with a system administrator / owner of the computer – Research information about a file on the Internet 49
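    For the "strings" tip, a minimal sketch (the binary path is hypothetical) that often surfaces embedded version and publisher banners:

        # -a scans the whole file; the grep narrows output to likely version/publisher lines
        strings -a /opt/unknown_app/bin/mystery_bin | grep -iE 'version|copyright'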
  • 50. Keep Recognition Up-to-date – Update the Master library monthly – Make use of the Definitive Media Library – Include the requirements for the Application Library update into the company's IT processes – Keep track of the recognition statistics • Status > Percent recognized files • Status > Unrecognized Files Distribution • Analysis Workbench: View > Charts > Recognition – Allocate resources for continued user SAI maintenance – If required, submit a Master SAI addition request to DDM Inventory Support 50
  • 51. Tips and Tricks – For teaching, change the scanner configuration to run at full speed – Before adding a new publisher/application/release name to the user SAI check whether it already exists in the Master library – Select “Complete” or “Full” installation options to ensure that all files are installed and taught – Split complex applications and suites into separate applications and use release relationships to link them – Teach pre-requisite applications separately from the core application 51
  • 52. Troubleshooting Recognition – Do not forget to assign a main file – Make a sensible choice for the main file – Do not forget to assign the Install String – The Installed Package rule recognises the application only when at least one file belonging to the package is available – Make sure the scanner is configured properly to scan the needed files/directories and to include information about them in the scan file 52
  • 53. Q&A 53
  • 54. To learn more on this topic, and to connect with your peers after the conference, visit the HP Software Solutions Community: www.hp.com/go/swcommunity 54 ©2010 Hewlett-Packard Development Company, L.P.
  • 56. Backup 56
  • 57. User SAI ID Management – Each user SAI is identified by an integer ID – The SAI ID is stored in the SAI file when it is created – User SAI IDs must be unique – When user SAIs are loaded by the Recognition Engine, items in them are given IDs from the following range: • From: 1,500,000,000 + 50,000 * SAI ID • To: 1,500,000,000 + 50,000 * SAI ID + 49,999 – The Master ID range is used by the master library and is reserved: • From 1 to 1,499,999,999 57
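    A quick worked example of the range formula, assuming a hypothetical user SAI ID of 2:

        # From: 1,500,000,000 + 50,000 * 2          = 1,500,100,000
        # To:   1,500,000,000 + 50,000 * 2 + 49,999 = 1,500,149,999
        echo $((1500000000 + 50000 * 2))          # prints 1500100000
        echo $((1500000000 + 50000 * 2 + 49999))  # prints 1500149999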
  • 58. SAI File Management – Use one computer to create new SAI files – Do not open the same SAI files in multiple tools – Refer to the Scan Data Analysis Guide, Chapter 7, Application Teaching, which has more guidance, including how to use SAI files in the aggregated environment 58
  • 59. Extracting Files from UNIX Packages – RPM: • rpm2cpio <rpm_file> | cpio -idv (extracts to the current directory) – DEB: • dpkg-deb -x <deb_file> <directory> – PKG (Solaris): • pkgadd -s <extract_dir> -d <pkg_file> all – DEPOT: • swcopy -s <depot_file_absolute_path> * @ <output_dir_absolute_path> • tar xvf <depot_file> (extracts to the current directory) – BFF: • restore -x -f <bff_file> (extracts to the current directory) 59
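    Because several of these commands extract into the current directory, a safe pattern is to extract into an empty scratch directory first and then teach from (or run a targeted scan of) that tree. A sketch with hypothetical paths and package name:

        # Extract an RPM into a scratch directory so only the package's files are present
        mkdir -p /tmp/myapp_extract && cd /tmp/myapp_extract
        rpm2cpio /tmp/myapp-1.0-1.x86_64.rpm | cpio -idv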
  • 60. Extracting Files from UNIX Archives – TAR: • tar xvf <tar_file> (extracts to the current directory) – CPIO: (extracts to the current directory) • Linux: cpio -idF <cpio_file> • Solaris: cpio -idI <cpio_file> – BZIP: • bunzip2 -f <bz2_file> – GZIP: • gunzip <gz_file> – ZIP: • unzip -q <zip_file> -d <unzip_dir> – ISO (mount as a file system to an empty directory): • Linux: mount -t iso9660 -o loop <iso_file> <empty_dir> • Solaris: mount -F hsfs -o ro `lofiadm -a <iso_file_absolute_path>` <empty_dir_absolute_path> 60
  • 61. Tips and Tricks – For quick generic identification of in-house applications, make use of the "Do not care about size" option for the main file (make sure the file name is unique) – Group scan files by operating system: • XML Enricher: configure grouping using the hwHostOS hardware field in Administration > System Configuration > Scan File Management > Group processed scan files by • Analysis Workbench: Do a query in "File > Load Scan Files" on hwHostOS – Use the "Limit to items also available in other SAIs" search option in the SAI Editor to find which user SAI applications are also available in the master library – For performance reasons the unrecognised file table is limited to 10,000,000 rows by default, so not all unrecognised files may appear in the reports 61
  • 62. Tips and Tricks – Publisher information found in version data is normalised for Express Teaching. However, not all possible values are available in the Discovery Knowledge (DK) Package – if you find new values that are not covered, report them to DDM Inventory support to be added to the next DK package – Multi-task during the teaching process – scanning can take some time, so use this time to do something else (work with the SAI Editor/Analysis Workbench, do multiple scans in parallel on different computers, etc.) – Loading a scan file in the Viewer with full SAI recognition takes time: • The initial delay is caused by loading SAI files – keep the Viewer open and drop files onto it • If only hardware and file/directory data is needed, select "No Recognition" or "Installed Applications" in its options – When using Analysis Workbench, use a dedicated computer (Client Install) for application teaching. Copy scan files from the DDMI server to this computer 62
  • 63. Tips and Tricks – Use recognition objectives in Analysis Workbench to track how well recognition objectives are met – When using "Clean" OS teaching: • Critical security patches may not be installed, so limit network access to prevent virus infection. When working in a virtualised environment, select "Host Only" networking • Disable OS update features (such as Windows Update) – Pure Java applications: • The scanner needs to be configured to collect *.jar, *.ear and *.war files • Because these files are not identified as executables, change the default for Administration > System Configuration > Scan Processing > Filtering > Use only executable files to No (a similar option is available in the recognition configuration of other Analysis Tools) 63
  • 64. Troubleshooting: Recognition Report – Use as a last resort ("nuclear button") – Enable only for short periods – Enabled in configuration files located in the <DataDir>Conf directory – For every scan file processed, it creates an XML file showing internal details of the recognition engine processing 64