OMAP Verification



  1. Verification and Validation: OMAP™ Verification
     Presented by: Somdipta Basu Roy, Texas Instruments Inc., Dallas, +1 214-236-4382
  2. Outline
     • OMAP™ SoC overview
     • OMAP organization and execution
     • Overall verification methodology
     • Strategy adopted
       – Module level
       – Subsystem
       – Chip
     • Progress tracking / metrics
     • Summary
  3. OMAP™ positioning
  4. OMAP2420™ Overview
     • ARM1136-based SoC including:
       – 330 MHz ARM1136
       – 220 MHz TI TMS320C55x™ DSP
       – 2D/3D graphics accelerator
       – Imaging and video accelerator
       – High-performance system interconnects and industry-standard peripherals
  5. OMAP3430™ Overview
     • New OMAP™ 3 architecture combines mobile entertainment with high-performance productivity applications
     • Industry's first processor with an advanced superscalar ARM® Cortex™-A8 RISC core, enabling a 3x gain in performance
     • Industry's first processor designed in 65-nm CMOS process technology, adding processing performance
     • IVA™ 2+ (Image Video Audio) accelerator enables multi-standard (MPEG4, WMV9, RealVideo, H263, H264) encode/decode at D1 (720x480 pixels) 30 fps
     • Integrated image signal processor (ISP) for faster, higher-quality image capture and lower system cost
     • Flexible system support
       – Composite and S-video TV output
       – XGA (1024x768 pixels), 16M-color (24-bit definition) display support
       – FlatLink™ 3G-compliant serial display and parallel display support
       – High-Speed USB 2.0 On-The-Go support
     • Seamless connectivity to hard disk drive (HDD) devices for mass storage
     • Leverages SmartReflex™ technologies for advanced power reduction
     • M-Shield™ mobile security enhanced with ARM TrustZone™ support
     • Software-compatible with OMAP™ 2 processors
     • HLOS support for customizable interface
  6. OMAP Development Organization
     • The OMAP chip level is divided into several subsystems (e.g. ARMSS/DSPSS/…)
     • Each subsystem consists of key IPs
       – E.g. ARMSS: ARM core, interrupt controller, security block, bus converter bridges
       – Some IPs are reused from earlier programs; some are developed for a target program
     • Each IP (or group of IPs) is developed and delivered to subsystem(s) by IP teams spread across different continents
     • Each subsystem integrates and tests its IPs together and delivers the subsystem to the chip level
     • The chip level integrates subsystems, peripherals, and power and clock hookup, and tests at chip level
  7. How It Is Organized
     [Diagram: validation infrastructure (database, FPGA, silicon, hardware acceleration, flows, tracking systems) shared by chip-level teams, subsystems, and IP teams, each covering RTL, verification, PD, and DFT; each subsystem integrates many IPs]
     • Now imagine that with ~70 IPs, 10–15 subsystems per chip, 4–5 new chips being done simultaneously (in parallel with 5 chips doing revisions), and 5 time zones
  8. How Do We Do It (and Get It Right Most of the Time!)
     • AFV (Architecture for Verification)
     • Strict IP-to-chip release criteria
     • Established IP-to-chip exchange mechanism
     • Automation
     • Common database / infrastructure / tracking
     • And of course by increasing frequent-flier miles
  9. Architecture for Verification
     • The OMAP1 series of products had all kinds of bus protocols and behaviors
     • OMAP2/OMAP3
       – Standard bus-protocol interconnect
       – All masters and slaves follow variations of the same protocol
       – Plug-and-play
     • Not everything is so perfect
     • Power and clock hookup / verification is challenging
     • The debug protocol is complicated
  10. IP-to-Chip Release
     • Pre-defined RTL milestones
     • Ordered by RTL maturity
       – Verification status
       – Physical design step completion
     • Clear exit criteria
     • Same for all IPs / subsystems
     • But
       – Exceptions always exist
       – We had to accept/integrate/test critical IPs before they had completed
  11. IP-to-Chip Milestones
     [Diagram: parallel milestone tracks with reviews at each step — chip: DB setup/planning → integration → RTL verification → physical design; IP: DB setup/planning → basic testing → >80% done → 100% verification]
  12. IP-to-Chip Exchange
     • Design delivery (standard views)
       – RTL
       – Timing related
       – Physical design related
       – …
     • Verification delivery
       – Tests/libraries/macros from processor-based subsystems
       – Test plans of subsystems for chip-level review
  13. Automation
     • Automate a lot of chip-level RTL coding
       – Hookup
       – I/O connection
       – Register configuration
     • Automatically generate tests to check these features
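The "automatically generate tests" step can be pictured as a small generator that walks a register description table and emits a directed self-checking test per register. This is only a sketch: the table format, register names, addresses, and the `read32`/`write32`/`check_eq` helpers referenced in the emitted code are all hypothetical, not TI's actual flow.

```python
# Hypothetical sketch: generate directed register-configuration tests
# from a register map. Names, addresses, and emitted helpers are illustrative.

REGISTER_MAP = [
    # (name, address, reset value, writable-bit mask; mask 0 => read-only)
    ("SYSCONFIG", 0x48002010, 0x00000001, 0x0000031F),
    ("SYSSTATUS", 0x48002014, 0x00000000, 0x00000000),
]

def generate_register_test(name, addr, reset, wmask):
    """Emit C source text for one directed register test."""
    lines = [
        f"/* auto-generated test for {name} */",
        f"check_eq(read32(0x{addr:08X}), 0x{reset:08X}); /* reset value */",
    ]
    if wmask:  # skip the write/read-back phase on read-only registers
        lines += [
            f"write32(0x{addr:08X}, 0x{wmask:08X});",
            f"check_eq(read32(0x{addr:08X}) & 0x{wmask:08X}, 0x{wmask:08X});",
        ]
    return "\n".join(lines)

def generate_suite(regmap):
    """Concatenate the generated tests for a whole register map."""
    return "\n\n".join(generate_register_test(*row) for row in regmap)
```

A real flow would emit these from the same database that drives the generated RTL hookup, so the tests cannot drift from the design.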
  14. Common Database / Infrastructure
     • Centralized infrastructure
     • Common database for delivery / exchange
     • IP delivery and quality tracking
     • Dedicated infrastructure team
  15. Functional Verification Methodology – Same Established Principles
     • Detailed verification plan
     • Reviews at critical design points
     • Thorough tracking
  16. Verification Methodology
     Common to all levels:
     • Verification process – checkpoints / reviews
     • Design verification toolkit / regression manager / verification dashboard
     • Verification metrics – coverage, bugs, regression, formal, cycles, efficiency tracking
     Per level:
     • Module/Block: functional-coverage driven; HVL test bench / scoreboard / checker / assertions; constrained-random testing; reusable test environment; reusable stimulus; exhaustive black/grey-box environment
     • Subsystem: functional-scenario driven; directed and random testing; mimics chip-level constraints; reuses module-level environment and stimulus
     • Chip: application driven; C/ASM-based directed testing; reuse from module level; synthesizable test bench
     • Hardware: same environment as chip level; HDL test bench; application threads; operating-system boot-up
  17. Module-Level Verification
     • Objective
       – Validate the module thoroughly before subsystem/system integration
     • Goal
       – Achieve 100% code and functional coverage
     • Strategy
       – Use a pseudo-random test generator
       – Base infrastructure
         • A common methodology is used for all module verification
         • Common VIPs are used by modules following the same protocols
       – Derived components for specific modules
       – Black-box approach (primarily)
  18. Module-Level Verification
     • Stimulus: directed-random / random
     • Correctness: protocol and data checkers (end-to-end)
     • Coverage: code and functional coverage
     • Property checking for certain blocks
     [Diagram: BFMs with monitors, checkers, and coverage collectors drive the input ports of the design under verification; a monitor on the output port feeds data and register scoreboards that compare expected vs. observed data]
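The scoreboard on this slide reduces to a simple contract: an input-port monitor pushes each stimulus through a reference model into an expected queue, and an output-port monitor checks observed transactions against it in order. A minimal sketch, in Python for brevity (the real environment was HVL-based; the class and callback names are illustrative):

```python
from collections import deque

class Scoreboard:
    """End-to-end data scoreboard: compare expected vs. observed, in order."""

    def __init__(self, reference_model):
        self.model = reference_model   # golden transform of the DUT's behavior
        self.expected = deque()        # filled by the input-port monitor
        self.checked = 0
        self.mismatches = []

    def on_input(self, txn):
        """Input-port monitor callback: predict the DUT output."""
        self.expected.append(self.model(txn))

    def on_output(self, txn):
        """Output-port monitor callback: compare against the prediction."""
        self.checked += 1
        want = self.expected.popleft() if self.expected else None
        if txn != want:
            self.mismatches.append((want, txn))

    def passed(self):
        """Pass iff every observation matched and nothing is left pending."""
        return not self.mismatches and not self.expected
```

For example, a pass-through or inverter DUT would be modeled as `Scoreboard(lambda t: t ^ 0xFF)`; the same skeleton works for any block whose output is a function of its input stream.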
  19. Subsystem-Level Verification
     • Objective
       – Validate the subsystems in the design before top-level integration
       – Debug/isolate problems inside subsystems that are difficult to find in a large SoC
     • Goal
       – 100% completion of directed tests as per the verification spec
         • Core CPU tests
         • Feature-specific directed tests
       – 100% of functional coverage items re-used from IP-level verification
       – 100% coverage of a manual checklist created for test items
     • Strategy
       – Generate test-bench irritation while the processor runs real code
       – Reuse of module components
       – Isolate the subsystem and mimic the system environment to create top-level scenarios in much less simulation time
  20. Subsystem-Level Verification
     [Diagram: subsystem under test driven by clock/reset and interrupt BFMs]
     • Stimulus: C/ASM tests for integration, boundary, and functional testing
     • Correctness: self-checking tests; checkers reused from module level
     • Coverage: toggle at the boundary; directed tests of all target features in the spec
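The self-checking pattern behind these C/ASM tests is: configure, stimulate, read back, compare, and report pass/fail with no external checker. A sketch of the shape, in Python for brevity (a dict stands in for the memory-mapped subsystem, and the scratch-register address is hypothetical):

```python
# Sketch of a self-checking directed test. On silicon or RTL this would be
# C/ASM against real memory-mapped registers; here a dict stands in.

regs = {}

def write32(addr, val):
    regs[addr] = val & 0xFFFFFFFF

def read32(addr):
    return regs.get(addr, 0)

failures = []

def check_eq(actual, expected, what):
    """Self-checking: record failures rather than rely on an external checker."""
    if actual != expected:
        failures.append((what, hex(expected), hex(actual)))

def walking_ones_test(scratch_addr=0x48300000):  # hypothetical scratch register
    """Boundary/integration test: walk a 1 across every bit of a register."""
    for bit in range(32):
        write32(scratch_addr, 1 << bit)
        check_eq(read32(scratch_addr), 1 << bit, f"bit {bit}")
    return "PASS" if not failures else "FAIL"
```

The pass/fail verdict lives in the test itself, which is what lets the same source run in simulation, on the accelerator, and on silicon.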
  21. Example: ARM1136J(F)-S Subsystem Test Scenarios
     • Reuse of the ARM IP test suite
       – Retarget CPU tests at the subsystem level
       – Tests that cover various AHB parameters
     • Basic boot tests
     • Exception testing in the subsystem context
     • Clock and power management tests
     • Feature-specific testing (interrupt handling, security, …)
     • Derivative tests – base tests with varying test-bench parameters
       – Data memory access tests with variable wait states in memory
       – Tests run at random clock speeds within the allowable speed limit
       – Random interrupts
  22. ARM Subsystem Verification Environment Components
     • Mandatory components
       – A clock/reset/idle control block: creates multiple clock frequencies and random/controlled reset and idle
       – An interrupt generator BFM: generates random/controlled simultaneous interrupts and handles them
       – Memory interface and memory with variable/random wait states: memory model supporting instruction read and data read/write with random latency
     • Optional components
       – Internal protocol checkers, mainly re-used from module-level verification
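The variable/random wait-state memory above can be sketched as a model that returns, along with each datum, a randomized cycle count the bus must stall before the access completes. Ranges, the interface, and the seeding convention here are illustrative assumptions:

```python
import random

class WaitStateMemory:
    """Memory model: each access completes after a random number of wait states."""

    def __init__(self, min_wait=0, max_wait=8, seed=1):
        self.mem = {}
        self.rng = random.Random(seed)   # seeded for reproducible regressions
        self.min_wait = min_wait
        self.max_wait = max_wait

    def _latency(self):
        return self.rng.randint(self.min_wait, self.max_wait)

    def read(self, addr):
        """Instruction or data read: returns (data, wait states)."""
        return self.mem.get(addr, 0), self._latency()

    def write(self, addr, data):
        """Data write: returns the wait states before the bus is released."""
        self.mem[addr] = data & 0xFFFFFFFF
        return self._latency()
```

Seeding the generator is the important design choice: random latencies shake out pipeline and bus-arbitration corner cases, but a failing run must be exactly reproducible from its seed.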
  23. Chip-Level Verification
     • Objective
       – Validate chip integration and handshaking
       – Validate real chip-level functional scenarios
     • Goal
       – 100% of scenarios covered as in the plan
     • Strategy
       – Mimic the chip environment
       – Base SW environment for ease of reuse
       – Break into multiple master–slave blocks
       – Mix and match real RTL and bus functional models
  24. Chip-Level Verification
     • Stimulus: C/ASM-based directed tests – chip functional scenarios
     • Correctness: self-checking tests; selected checkers from module level
     • Coverage: 100% completion of all scenarios in the plan
     [Diagram: chip under test surrounded by ARM, DSP, camera, and display BFMs; trace/JTAG, flash, and SDR/DDR models; GPIO and UART/McBSP drivers; I/O drivers; and a clock, reset, idle/power-management control block]
  25. Simulation Environment
     • Flexible environment
       – Replace RTL with BFMs
       – Software models for processors
     • Test bench
       – Synthesizable
     • Dedicated teams for the environment and test bench
  26. Software Base
     • Test cases use library functions
     • The software development library
       – Library routines are developed based on all IP functional specs and stored in a repository database for use at all of these levels of verification:
         • Subsystem level
         • Top level
         • Chip level / actual silicon
       – A standard format is used for all tests/subroutines/libraries
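One way to picture the "standard format" requirement: each routine in the repository carries metadata saying which verification levels it applies to, so one library serves subsystem, top-level, and silicon runs. A hypothetical sketch (the decorator, level names, and placeholder test body are illustrative, not the actual TI format):

```python
# Hypothetical registry sketch for a shared test/subroutine library.
TEST_LIBRARY = {}

def library_test(name, levels):
    """Register a routine in the shared repository with its valid levels."""
    def wrap(fn):
        TEST_LIBRARY[name] = {"run": fn, "levels": frozenset(levels)}
        return fn
    return wrap

def suite_for(level):
    """All registered tests that apply at the given verification level."""
    return sorted(n for n, t in TEST_LIBRARY.items() if level in t["levels"])

@library_test("uart_loopback", levels=("subsystem", "chip", "silicon"))
def uart_loopback():
    return "PASS"   # placeholder body

@library_test("arm_boot", levels=("subsystem",))
def arm_boot():
    return "PASS"   # placeholder body
```

The payoff of a uniform format is exactly the reuse claimed on the slide: a test written once at subsystem level can be picked up unchanged by the chip-level and silicon regression suites.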
  27. Key Aspects Checked at Chip Level
     • Integration of all subsystems (achieve 100% toggle)
     • Basic features
     • Data- and control-path testing
     • Parallel and distributed functionality
     • Latency / performance
     • Power management
     • Application scenarios
     • Debug features
  28. Identify Critical Functional Arcs for OMAP2420™
  29. Beyond RTL
     • Hardware acceleration
       – Used at subsystem level and chip level
       – Stress testing
       – Basic software checkout
     • Prototyping
       – Used at chip level
       – Early software development
  30. Verification Management
     • Detailed test plan at every level – module/subsystem/chip
     • Reviews at critical design points with design/spec/system teams
     • Tracking of
       – Verification plan
       – Test environment development
       – Functional coverage development
       – Coverage achievement (code, functional)
       – Design defects
       – Validation defects
       – Test development
       – Test regressions
       – Test cycles
       – Assertions (formal and simulation)
  31. Metric Process Tracking
     • Bug tracking: internal tool
     • Source code: ClearCase, TDM, CVS
     • Runtime tools: ModelSim, VCS, Specman, IUS
     • Regression engine: internal, others
     • Resource estimator: MS Project, ????
     [Diagram: these sources feed a metrics dashboard serving management requests and trend data – trend analysis, risk analysis, what-if scenarios]
  32. Verification Metrics
     • Required
       – Bug curve (logic, DV)
       – Source code activity (# lines / # edits)
       – Cycles / bug for random testing
       – Passing rate: IP level, integration, system, ECN verification
       – Code coverage (line, branch, toggle)
       – Functional coverage – Level 1: features; Level 2: cross; Level 3: scenario
       – DV checkpoint status
     • Desired
       – Sim farm efficiency: software license stall time, setup/cleanup time, cycles/second (% simulator / % HVL), average/distribution of # of running jobs, cycles/hour
       – Resource stats: resource ramp vs. forecast; resources invested vs. bottoms-up plan
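Several of the required metrics are simple ratios over regression data. A sketch of how they might be computed (the formulas are the obvious ones, not TI's exact internal definitions):

```python
def cycles_per_bug(total_random_cycles, bugs_found):
    """Random-testing efficiency: a rising value suggests a maturing design."""
    if bugs_found == 0:
        return float("inf")
    return total_random_cycles / bugs_found

def passing_rate(results):
    """Regression passing rate in percent; results is a list of booleans."""
    return 100.0 * sum(results) / len(results)

def bug_curve(bugs_per_period):
    """Cumulative bug curve: plotted against time to judge convergence."""
    total, curve = 0, []
    for n in bugs_per_period:
        total += n
        curve.append(total)
    return curve
```

A flattening bug curve together with a rising cycles-per-bug figure is the usual signal that random testing has saturated and effort should shift to directed scenarios.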
  33. DV Dashboard
     [Diagram: regression logs and coverage logs from simulation and formal runs (3rd-party and internal tools) are converted to a common format and uploaded into SQL databases for simulation tests, coverage, regressions, and defects; a coverage monitor database, bug-tracking database, and formal property database back the DV flow]
  34. Overall DV Metric System
     [Diagram: per-design status table (e.g. Design A: RTL 80%, DV 40%, DFT 30%, PD 40%; Design B: RTL 85%, DV 70%, DFT 40%, PD 50%) rolled up for engineering and executive management; inputs include DV methodology and status, bug curves, and coverage over time – some collected manually, most automatically; the DV dashboard combines actual metrics, a methodology-compliance review system, trend analysis of expected vs. actual metrics, regression/bug/coverage database exchange, review checklists, and engineering analysis of 3rd-party-tool coverage data]
  35. Summary
     • OMAP™ verification is a resource- and time-intensive task
     • Detailed plans and reviews at all levels eliminate redundancy and provide maximum coverage of functions
     • Collaboration is needed at every level
       – Architecture
       – Design
       – Infrastructure
       – Verification
     • No magic