Analyzing Kernel Security and Approaches for Improving It

This document discusses analyzing and improving kernel security. It describes how kernels work and why kernel security is important. Methods for analyzing kernel security, such as DIGGER, are presented, which can identify critical kernel objects like pointers without prior knowledge of the kernel's data layout in memory. The document also discusses approaches for improving kernel security, such as protecting generic pointers with techniques like Sentry, which controls access to kernel data structures through object partitioning. Future work areas include automatically detecting all kernel data structures and expanding Sentry's protections.
2. Agenda
• Kernel Introduction
• Necessity for Kernel Security
• Kernel Breaches
• Analyzing Kernel Security
• Approaches for Improvement
• Future Work
Milan Rajpara
October 8, 2013
3. What is a Kernel?
• A computer program that manages input/output requests from software and translates them into data processing instructions for the central processing unit and other electronic components of a computer. [Wikipedia]
• The kernel is a fundamental part of a modern computer's operating system.
• The OS rests on an outer ring, with applications above that.
Fig: Privilege rings for the x86 available in protected mode [Source: Wikipedia]
4. Necessity for Kernel Security
• The kernel is the very basic (core) part of the operating system
• A single vulnerability can expose a large number of systems
• Increasing cloud usage with virtualized systems
• Smartphones are now in every hand
5. Scope of This Talk
• Kernels for general-purpose operating systems
• Some Linux flavors offer a server-optimized kernel
• E.g., Ubuntu releases older than 12.04 offered this option; since 12.04, linux-image-server is merged into linux-image-generic, so there is no difference between the generic and server kernels. [4]
• Windows does not disclose such details.
• Kernels written in C
• Almost all kernels are written in C
• Improvements for monolithic kernels
• All work was performed in virtual environments
• Xen and VMware were used
6. How Is the Kernel Attacked?
• By kernel-level rootkits
• Manipulating pointers
• Manipulating data
• Direct Kernel Object Manipulation (DKOM)
• By boot-kits
• Via hooking techniques
• Direct hardware or firmware injection
7. Effects of These Attacks
• Escalate a process’ privileges by overwriting the process’ credentials
• Hide itself by illicitly removing the data structures identifying its presence from loaded drivers
• Elide task structures for processes from the kernel’s process accounting list
• Alter the overall behavior of the OS without injecting any malicious code into the kernel address space, by pointer manipulation alone
8. How to Analyze Kernel Security
• Find the most critical objects of the kernel, without prior knowledge of the OS kernel data layout in memory
• Identify OS kernel objects for run-time security analysis
• Sort out objects that are vulnerable to hijacking
• Perform kernel data disambiguation
• This makes the system easier to analyze
9. Most Critical Objects in the Kernel
• In Windows and Linux, the core kernel is mostly written in C
• 40% of inter-data-structure relations are pointer-based
• 35% of these are generic pointers
• Pointers whose targets are defined at run time, with no initial value or data type associated
• 28% of kernel data structures are well-known objects
10. The Generic Pointer Problem
• It is the weak link in kernel security
• Use of void* pointers assists attackers in redirecting them elsewhere
• Use of NULL pointers (to implement linked lists) helps attackers hide or change runtime objects
• Use of casting in C
• Enables attackers to exploit the data structure layout in physical memory
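The casting issue can be simulated outside the kernel: the same raw bytes admit multiple typed interpretations, so a static tool cannot tell from a generic pointer alone which layout it references. A minimal Python sketch using `struct` (the field layouts are hypothetical, not actual kernel structures):

```python
import struct

# The same 8 raw bytes, as a C union or void* cast would expose them.
# The two layouts below are illustrative, not real kernel object layouts.
raw = struct.pack("<II", 0x1000, 0x0)  # a "pointer" field and a "flags" field

# Interpretation 1: two 32-bit unsigned fields (pointer + flags)
ptr, flags = struct.unpack("<II", raw)

# Interpretation 2: the same bytes read as one 64-bit counter
(counter,) = struct.unpack("<Q", raw)

print(ptr, flags, counter)
```

Both interpretations are valid views of identical memory, which is exactly the ambiguity a kernel rootkit can hide behind when only a `void*` is visible to the analyzer.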
11. To Find Critical Objects
1. Memory mapping techniques
• Traverse the address space from global variables via pointer dereferencing until reaching running objects
• According to a predefined kernel data definition for each kernel version
2. Value-invariant approaches
• Use the value invariants of certain fields, or of a whole data structure, as a signature to scan memory for matching running instances. Ex. DeepScanner, DIMSIM
• Drawbacks of these approaches:
- Not very accurate
- Require a predefined definition of the kernel data layout
- Not effective when memory mapping and object reachability information is not available
- High performance overhead
12. To Find Critical Objects
3. DIGGER [1]
• Uncovers all system runtime objects without any prior knowledge of the OS kernel data layout in memory
• First it performs offline analysis and constructs a type-graph (used to enable systematic memory traversal of the object details)
• Then it uses a 4-byte pool-memory tagging schema (to uncover kernel runtime objects from the kernel address space)
• (+)
• Accurate results
• Low performance overhead
• Fast and nearly complete coverage
13. DIGGER & KDD
• DIGGER uses KDD (Kernel Data Disambiguator) to precisely model the direct and indirect relations between data structures
• KDD is a static analysis tool that operates offline on an OS kernel’s source code
• It generates a type-graph for the kernel data with direct and indirect relations between structures, and models the data structures [2]
• KDD disambiguates pointer-based relations (including generic pointers) by performing static points-to analysis on the kernel’s source code
• Points-to analysis is the problem of statically determining the set of locations to which a given variable may point at runtime
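As a toy illustration of what points-to analysis computes (this is a minimal flow-insensitive sketch, not KDD's actual algorithm), consider two C statement forms, `p = &a` and `p = q`, and iterate to a fixed point:

```python
from collections import defaultdict

def points_to(statements):
    """Tiny flow-insensitive (Andersen-style) points-to analysis.
    Statements: ("addr", p, a) for p = &a; ("copy", p, q) for p = q."""
    pts = defaultdict(set)
    changed = True
    while changed:  # iterate until no points-to set grows
        changed = False
        for kind, lhs, rhs in statements:
            before = len(pts[lhs])
            if kind == "addr":
                pts[lhs].add(rhs)          # p = &a: p may point to a
            elif kind == "copy":
                pts[lhs] |= pts[rhs]       # p = q: p inherits q's targets
            changed |= len(pts[lhs]) != before
    return {v: targets for v, targets in pts.items() if targets}

# C fragment:  if (cond) p = &a; else p = &b;  q = p;
stmts = [("addr", "p", "a"), ("addr", "p", "b"), ("copy", "q", "p")]
result = points_to(stmts)
print(result)
```

Here the analysis concludes that both `p` and `q` may point to `a` or `b` at runtime, which is the kind of fact KDD needs in order to resolve what a generic pointer field can reference.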
14. KDD Operation
Source: Ref [2]
AST: Abstract Syntax Tree (a high-level intermediate representation of the source code)
15. KDD Operation
• Interprocedural Analysis 1: takes the AST and differentiates it
• Extracts variables, procedure definitions, procedure calls, etc.
• Interprocedural Analysis 2: performs points-to analysis across different files to achieve whole-program analysis
• Context-Sensitive Analysis:
• Uses a Procedure Dependency Graph (PDG), consisting of nodes representing the statements of the data dependencies in the program
• Context-sensitive analysis solves two problems: the calling context and the indirect (implicit) relations between nodes
16. Soundness and Precision of KDD
• The points-to analysis algorithm is sound if the points-to set for each variable contains all its actual runtime targets, and is imprecise if the inferred set is larger than necessary
• Checked on C programs from the SPEC2000 and SPEC2006 benchmark suites
• Achieved a high level of precision and 100% soundness
• And 96% precision on Windows (WRK*, Vista) and the Linux kernel (v3.0.22) [2]
*WRK – Windows Research Kernel, the only available code from Windows [6]
18. DIGGER Approach
• Static Analysis Component: from KDD
• Signature Extraction Component:
• When the object manager allocates a memory pool block, it associates it with a pool tag (a unique four-byte tag for each object type). DIGGER uses this tag to uncover running instances of kernel objects; the tags are static and cannot be changed during an object’s runtime.
• Dynamic Memory Analysis Component: extracts the object details
• From the pool tag, it gets the pool block’s start memory address and the object’s start address
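The pool-tag scan can be sketched as a linear search of a raw memory snapshot for the four-byte tag, recovering each object's start address from the surrounding pool header. In this Python sketch the tag value `b"Proc"` and the header layout (tag then 4-byte size) are illustrative assumptions, not the real Windows pool header format:

```python
import struct

# Hypothetical pool-block layout: 4-byte tag, 4-byte size, then object body.
# Both the tag value and the header layout are assumptions for illustration.
HEADER = struct.Struct("<4sI")

def scan_for_objects(memory: bytes, tag: bytes):
    """Linear scan of a raw memory snapshot for pool blocks carrying `tag`.
    Returns (object_start_offset, object_size) pairs."""
    hits = []
    pos = memory.find(tag)
    while pos != -1:
        _, size = HEADER.unpack_from(memory, pos)
        hits.append((pos + HEADER.size, size))  # object body follows the header
        pos = memory.find(tag, pos + 1)
    return hits

# Build a fake memory snapshot: padding, one tagged block, more padding.
snapshot = b"\x00" * 16 + HEADER.pack(b"Proc", 32) + b"A" * 32 + b"\x00" * 8
objects = scan_for_objects(snapshot, b"Proc")
print(objects)
```

Because the tag is fixed per object type and cannot change at runtime, one pass over the snapshot is enough to enumerate every running instance of that type, which is why DIGGER's signatures can stay small and fast.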
19. Analyzing the Kernel Through DIGGER Gives…
• Disambiguation of the points-to relations between data structures, all without any prior knowledge of the OS kernel data layout
• A robust and quite small signature size for uncovering runtime objects, enhancing performance
• The ability to keep track of all critical objects of the kernel
20. Protection of the Kernel
• Protect the generic pointers
• Microsoft added a feature, PatchGuard, which blocks kernel-mode drivers from altering sensitive parts of the Windows kernel
• But the TDL rootkit manages to circumvent this protection as well, by altering a machine’s MBR so that it can intercept Windows startup routines [7]
• One approach is the use of “object partitioning” to protect kernel data structures [3]
• It uses Sentry, which creates access-control protections for security-critical kernel data
21. Sentry Architecture
• Sentry protects critical data and enforces data access restrictions based upon the origin of the access within the code of the kernel and its modules or drivers [3]
• The data integrity model is straightforward and matches that of the Biba ring policy [9]
• Malicious code that modifies privileges by directly writing to memory is in a loaded module, not in the core kernel code, so Sentry will prevent the write
22. Kernel Memory Access Control
• Protects data structures from DKOM
• Sentry’s design uses a hypervisor to remain isolated from an untrusted kernel
• To keep overhead low, Sentry uses memory partitioning to lay out sensitive data on separate memory pages and protects those pages using the hypervisor
• The policy enforcer mediates attempted writes to protected data and uses the policy to determine when writes should be permitted
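The enforcer's decision can be sketched as a Biba-style "no write up" check: a write to protected data is permitted only when the origin of the write is at least as trusted as the data. The integrity levels and the decision rule below are illustrative, not Sentry's actual implementation:

```python
# Integrity levels in a Biba-style model: higher = more trusted.
# The two levels and the rule below are illustrative assumptions only.
CORE_KERNEL, LOADED_MODULE = 2, 1

def write_permitted(origin_level: int, data_level: int) -> bool:
    """Biba 'no write up': a subject may write only to data at or
    below its own integrity level."""
    return origin_level >= data_level

# Security-critical credentials live at core-kernel integrity.
cred_level = CORE_KERNEL

print(write_permitted(CORE_KERNEL, cred_level))    # core kernel may update creds
print(write_permitted(LOADED_MODULE, cred_level))  # a driver's direct write is blocked
```

This matches the scenario on the Sentry architecture slide: a DKOM rootkit writing credentials directly from a loaded module originates at a lower integrity level than the data, so the mediated write is denied.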
23. Working of Sentry
• Identifying Security-Critical Members
• Activation of mediated access
• Instruction emulation
• Secure execution history extraction
24. Evaluation of Sentry
• Performance
• Low performance overhead
• More performance can be achieved by memory layout optimization
• False-positive analysis
• There were no instances where security-critical kernel data protected by Sentry was directly modified by a benign driver
• Sentry provided a 100% detection rate for DKOM rootkits
25. Future Work
• Detect all kernel data structures automatically, across kernel versions
• DIGGER can currently only be used to analyze Windows kernels
• The current prototype of Sentry only protects two key structures
• Other kernel data structures may also require similar protection
• Including more data structures may make Sentry’s protection more versatile
26. References
[1] Amani S. Ibrahim, James Hamlyn-Harris, John Grundy, Mohamed Almorsy, "Identifying OS Kernel Objects for Run-Time Security Analysis", DOI: 10.1007/978-3-642-34601-9_6
[2] Amani S. Ibrahim, John Grundy, James Hamlyn-Harris, Mohamed Almorsy, "Operating System Kernel Data Disambiguation to Support Security Analysis", DOI: 10.1007/978-3-642-34601-9_20
[3] Abhinav Srivastava, Jonathon Giffin, "Efficient Protection of Kernel Data Structures via Object Partitioning", DOI: 10.1145/2420950.2421012
[4] RFC: Linux kernel merging. https://lists.ubuntu.com/archives/kernel-team/2011-October/017471.html
[5] Rootkits detail by Symantec. http://www.symantec.com/avcenter/reference/windows.rootkit.overview.pdf
[6] Windows Research Kernel. https://www.facultyresourcecenter.com/curriculum/pfv.aspx?ID=7366&c1=enus&c2=0
[7] TDL Rootkit. http://www.theregister.co.uk/2010/11/16/tdl_rootkit_does_64_bit_windows
[8] Windows hooks. http://msdn.microsoft.com/en-us/library/ms644959(v=vs.85).aspx
[9] K. J. Biba. Integrity considerations for secure computer systems. Technical Report MTR-3153, Mitre, Apr. 1977