This report proposes improvements to alleviate traffic congestion in Whitby harbour, based on a desk study and site reconnaissance of the area. The study area has sites of special scientific interest nearby and a history of flooding, erosion, subsidence and seismic activity. Two alternative options are proposed: constructing a new bridge at the existing location or developing a park and ride scheme. A final proposal is presented involving building a new bridge from prefabricated parts, constructing a car park, and installing temporary flood barriers to protect Whitby from flooding.
Logistics Feasibility Study for Ultra Mega Power Plant (UMPP) Genex Logistics
Ultra Mega Power Projects (UMPP) are a series of ambitious power projects planned by the Government of India. Genex Logistics (Projects Division), which specializes in turnkey logistics projects in the power and oil & gas industries, undertook the logistics feasibility study of one of these UMPPs and examined the logistics challenges involved in executing such a mega project.
This roadmap was developed by the Future BNCI Project and other European Union projects to outline the state of the art and future of brain-computer interface (BCI) technologies. It discusses BCI sensors and signals, devices and applications for disabled users, and devices and applications for all users. The roadmap is intended to guide the field over the next five years by addressing challenges and recommending solutions.
Unified communications applications deliver benefits in employee collaboration, mobility, and cost savings, according to a survey of 244 organizations:
- 49% of organizations reported that employees save up to 20 minutes per day by reaching coworkers on the first attempt, thanks to presence and IM integration.
- 50% of organizations saw up to 20 minutes in daily savings per employee from escalating IM chats into web conferences.
- 64% of organizations reduced travel costs by over 10% through increased conferencing and collaboration capabilities.
The document provides design considerations for building a high-performance campus local area network (LAN). It discusses a layered network architecture with access, aggregation and core layers. Key access layer topics include wired/wireless connectivity, power-over-Ethernet, VLANs and wireless solutions. Aggregation design focuses on segmentation and distributed switching. The core layer section reviews high-performance solutions and consolidating the core and aggregation layers. High availability techniques like device redundancy, link aggregation and virtual chassis are also covered.
This document is a master's thesis submitted by Satya Prakash Patel to the Jožef Stefan International Postgraduate School in Ljubljana, Slovenia in June 2011. The thesis examines the Common Service Center (CSC) project in India, which aims to deliver internet-enabled services to rural areas through kiosks. The CSC project is a component of India's National e-Governance Plan and seeks to bridge the digital divide. However, implementation of the CSC project has faced several challenges including delays in deploying government services, lack of entrepreneurship among village-level entrepreneurs operating CSCs, and inconsistent connectivity and power issues. The thesis conducts a SWOT analysis and strategic analysis of the CSC project.
This document provides guidelines and best practices for rural and urban public library systems in Ontario. It addresses trends in rural sustainability and the role of libraries. Guidelines are presented for branch definitions, catchment areas, facilities, hours, staffing, collections, technology and more. Best practices encourage optimal space allocation, staff training, resource sharing between branches, and partnerships to support rural development. The document aims to help administrators deliver quality library services and resources across different community types.
This document provides an offshore wind roadmap for Vietnam outlining two scenarios: low growth and high growth. Under the low growth scenario, Vietnam would have 500 MW of offshore wind capacity by 2030 focused on a few small pilot projects. The high growth scenario envisions 5,000 MW of offshore wind capacity by 2030 through larger commercial-scale projects. Both scenarios assess the impacts on Vietnam's electricity mix, costs, supply chain development, infrastructure needs, and environmental and social impacts. Key recommended actions focus on developing supportive policies, attracting investment, building local skills, and planning necessary transmission infrastructure.
MetaFabric™ Architecture Virtualized Data Center: Design and Implementation G... Juniper Networks
This document provides an overview and design guide for implementing a MetaFabric architecture virtualized data center using Juniper Networks technologies. It describes the key components of the solution including compute, network, storage and applications. The design uses Juniper QFX switches and EX switches for data center switching and routing, SRX firewalls for security, and IBM Flex System servers and Juniper Network Director/Security Director for management. The guide includes configuration details for validating a proof of concept MetaFabric deployment.
The document provides recommendations for minimum technical requirements to ensure nationwide interoperability for the Nationwide Public Safety Broadband Network (NPSBN). It recommends that the NPSBN comply with 3GPP LTE standards, including adopting various 3GPP interfaces and guidelines. It also recommends requirements for user equipment and device management, testing at various levels, approaches for network evolution, standards for handover and mobility, priorities for quality of service and security measures. The recommendations are intended to enable interoperability across public safety networks and with commercial networks.
The document provides an overview of the DotNetNuke platform and its features. It discusses the DotNetNuke Corporation and highlights of the DNN 7.0 release including new features for content editors, enterprises, web designers, and developers. The document also covers the DotNetNuke platform, mobile optimization, SharePoint integration, editions comparison, social features, training, resources, and limitations. It includes appendices on installation and bibliography.
The first live guidelines from the World Health Organization (WHO) on Digital Health. I was a member of the guidelines development group for this report.
This document provides a summary of Colorado's driver handbook. It covers topics such as obtaining a driver's license, license requirements, traffic laws, safe driving practices, sharing the road with others, emergencies, and more. The table of contents outlines over 15 sections that discuss rules of the road, vehicle operation, driving under the influence, passengers, pedestrians, bicycles and organ donation. Overall, the document serves as a guide for drivers in Colorado to understand state traffic laws and ensure safe driving.
This document provides a guide to using WinCAPS 7.4 software. It contains information on installing the software, setting up options, and an overview of the main program areas. The four program areas are the Catalog tab for selecting products, the Sizing and Selection tab for running applications, and the Projects and Documentation tabs. The Sizing and Selection tab allows running question sets for applications like heating, air conditioning, water supply, and more. The document provides details on settings and functions for each question set type.
This is the complete report you will need to prepare an export-import report on India's global trade. Please share your likes and comments, and pass it on to others.
The existing parking conditions in several areas of Emeryville, California were analyzed. In the commercial/residential border area (Area 1), on-street parking occupancy peaked at 90% between 11:00 AM and 12:00 PM, indicating an influx of employees. Block-by-block analysis showed occupancies over 85% on many blocks. Duration data showed both short-term (under 2 hours) and long-term (over 5 hours) parking demand. The Glashaus area (Area 2) also experienced high on-street occupancy throughout the day. Residential areas had typical morning occupancy under 50%, but occupancy increased throughout the day. Off-street parking in lots and garages was highly utilized as well. Additional data
Born wills intelligence_oversight_tk_en copiamag88
This document is a toolkit for overseeing intelligence services published by DCAF, an international foundation dedicated to security sector governance. It contains seven tools that provide guidance on establishing effective intelligence oversight systems. The tools cover topics such as introducing intelligence oversight, establishing oversight systems, balancing transparency and secrecy, conducting oversight activities, overseeing information collection, use of personal data, and information sharing. The overall aim of the toolkit is to help states develop robust yet practical oversight of their intelligence services in accordance with democratic principles and human rights.
The document provides an introduction to artificial intelligence, including:
- A brief history of AI from the 1980s "AI winter" period of failed projects through to recent advances enabled by improved hardware and new research areas like machine learning.
- Knowledge representation and reasoning, rule engines, hybrid reasoning systems, and expert systems are introduced as key concepts in AI.
- The advantages of using a rule engine are discussed, as well as when rule engines are appropriate versus other approaches like scripting engines. The Rete algorithm, which is commonly used in rule engines, is also introduced.
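As a rough illustration of the rule-engine concept mentioned above, here is a minimal forward-chaining loop in Python. This is a naive sketch only: production engines use the Rete algorithm to avoid re-evaluating every rule against every fact on each pass, and the rules and facts here are invented for the example.

```python
# Minimal forward-chaining rule engine sketch (illustrative only; real
# engines such as those built on the Rete algorithm index conditions to
# avoid re-checking every rule on every cycle).

def run_rules(facts, rules):
    """Repeatedly fire rules until no rule adds a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            # Fire the rule if its condition holds and its conclusion is new.
            if condition(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical rules: each pair is (condition over facts, fact to assert).
rules = [
    (lambda f: "rain" in f, "wet_ground"),
    (lambda f: "wet_ground" in f, "slippery"),
]

print(sorted(run_rules({"rain"}, rules)))  # ['rain', 'slippery', 'wet_ground']
```

The loop derives the transitive consequence "slippery" from "rain" in two firings, which is the chaining behaviour that distinguishes a rule engine from a flat list of if-statements.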
This document provides a user guide for the SmartDSP operating system. It describes the features and architecture of the OS, including its kernel components, hardware abstraction layers, and documentation resources. The OS uses a real-time kernel to manage tasks, interrupts, scheduling, memory and hardware resources across multiple processor cores. Device drivers and network stacks provide hardware independence and connectivity.
This document provides policies and procedures for configuration, change, and release management. It contains four main sections: an introduction that defines key terms and processes, policies that establish the overall policy statement and process owners, processes that describe the steps for identifying, approving, implementing, testing, and tracking changes, and procedures that provide detailed instructions for each process. The overall goal is to formally manage all changes to information systems through documentation, approval, testing, and auditing in order to maintain system integrity and stability.
This document outlines the Penetration Testing Execution Standard (PTES), which provides guidelines for conducting a penetration test. It describes the pre-engagement interactions between testers and clients to define the scope of work. It also covers intelligence gathering, threat modeling, vulnerability analysis, exploitation, post-exploitation activities, and reporting. The goal is to simulate real-world attacks to identify security issues, but tests are conducted under an agreed scope and rules of engagement.
LoadRunner enables you to test your system under controlled and peak load conditions. To generate load, LoadRunner runs thousands of Virtual Users that are distributed over a network.
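The "virtual user" idea behind such load-testing tools can be sketched in a few lines of Python: many concurrent workers each repeat a scripted transaction while timings are collected. This is an illustrative stand-in only, not LoadRunner's actual scripting interface; the transaction here is a sleep rather than a real request, and all names and numbers are made up.

```python
# Sketch of concurrent "virtual users" generating load and recording
# per-transaction timings. Illustrative only: the transaction is a stub.

import threading
import time

def virtual_user(iterations, timings):
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(0.001)  # stand-in for one business transaction
        timings.append(time.perf_counter() - start)

timings = []  # list.append is atomic under CPython, so this is safe here
threads = [threading.Thread(target=virtual_user, args=(5, timings))
           for _ in range(10)]  # 10 virtual users, 5 transactions each
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"{len(timings)} transactions, "
      f"avg {sum(timings) / len(timings):.4f}s")
```

Real tools distribute these workers across multiple load-generator machines and correlate the timings with server-side metrics, but the core loop is the same.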
Improving software testing efficiency using automation methods by thuravupala... Ravindranath Tagore
This document presents a project report on developing a test automation framework for software testing of network elements. The project aims to create test automation scripts for Tellabs' 8800 series router to evaluate advantages like cost savings, increased productivity and testability. The report describes developing test automation using TCL scripts, executing tests on the 8800 router, and analyzing results to validate that automation improves efficiency over manual testing. It also covers an economic justification analysis to market the automation framework.
This document is a report that analyzes replacing regular transmission line conductors with superconductors. It discusses the problems with current transmission lines, such as losses due to resistance, instability, and environmental/safety issues. The report proposes using superconducting cables cooled by liquid hydrogen in underground conduits as a solution. It evaluates the feasibility and costs/benefits of implementing this system compared to alternatives. The conclusion recommends superconducting cables as they would eliminate losses and provide a more efficient, reliable and environmentally friendly grid.
The document presents the Rail Technical Strategy 2012, which was developed by the Technical Strategy Leadership Group for the railway industry in Great Britain. It outlines a range of technical solutions to address the industry's principal challenges over the next 30 years related to customer satisfaction, capacity increase, cost reduction, and carbon reduction. The strategy supports delivery of the industry's business plan and aims to provide a transformed technical landscape that improves the railway's business and provides better service and value for customers.
This document provides guidelines for designing and building integrated digital library systems (IDLS) for users who are print disabled. It defines the components of an IDLS and concepts to address prior to design, such as defining the scope, elements to consider like buy vs build options, and developing a business case. It also covers content building, archiving and storage, an e-delivery system, and developing an IDLS project plan. The intended audience is libraries serving print disabled users who are planning digital library projects.
This document is a feasibility study report submitted by Benjamin Kremer for the MSc Computer Science degree at University College London. The report examines the feasibility of constructing a system to verify and quantify collaborative work using blockchain architecture. The project aimed to address the problem of student disengagement by developing an API and mobile application to interact with a blockchain that records collaborative task and team data. While the project did not fully establish a way to verify and quantify collaboration, it demonstrated the concept is feasible with more time and blockchain expertise. The report describes the background, requirements, design, implementation, and testing of the prototype system developed as a proof of concept.
This document is a textbook titled "Programming Fundamentals - A Modular Structured Approach using C++" by Kenneth Leroy Busbee. It covers topics related to programming fundamentals such as data types, operators, functions, input/output, and more using C++ as the programming language. The textbook is divided into chapters that each cover a programming concept and include examples and exercises. It is intended to teach structured programming techniques using a modular approach in C++.
Abstract
The purpose of this study is the development of software that measures the dimensions of
grains and their speed and displacement. Such software is needed because the current study of
sediment transport has no efficient way of measuring these parameters.
This work covers the experimental work carried out to evaluate the efficiency of the
currently available methods and to propose new ways of solving the problem.
Several popular segmentation techniques were implemented in the program to test their
efficiency on the problem at hand. Most of these methods are based on edge detection;
some other methods were used as well.
To test these ideas, a simplified model of the channel bed was constructed and grain
movement was simulated so that the methods could be evaluated on it.
The written codes were tested under different regulation parameters, and the best results
obtained were presented and discussed. The weaknesses of the methods were identified, and
these results formed the basis for selecting an alternative research direction.
After analysis of all the work done, new alternative methods are recommended and
details of their implementation are provided in a descriptive way.
Contents
ABSTRACT............................................................................................................... 1
1.0 INTRODUCTION ........................................................................................... 6
Aims and objectives .................................................................................................................................6
2.0 LITERATURE REVIEW AND BACKGROUND INFORMATION .............. 8
2.1 Sediment definition .....................................................................................................................8
2.2 Subject Relevance .......................................................................................................................9
2.3 Solution of sediment problems ....................................................................................................9
2.4 Sediment Discharge Calculation ............................................................................................... 10
2.5 Bed Load Transport Definition ................................................................................................ 10
2.5.1 Two Schools of thought ............................................................................................................... 10
2.6 Physical description of the bed load transport ......................................................................... 11
2.6.1 General function........................................................................................................................... 11
2.7 Bed Load Transport Relations.................................................................................................. 13
2.8 Possible implementation of theory in software design .............................................................. 13
2.8.1 Direct measure method ................................................................................................................ 13
2.8.2 Estimation method that uses bed load calculation formulas ........................................................ 14
2.9 Digital image properties ............................................................................................................ 14
2.9.1 Digital Image definition ............................................................................................................... 14
2.9.1.1 Raster graphics ........................................................................................................................ 15
2.9.1.2 Important characteristics ..................................................................................................... 16
2.9.1.3 Advantages .......................................................................................................................... 16
2.9.1.4 Disadvantages ..................................................................................................................... 17
2.9.1.5 Formats................................................................................................................................ 17
2.9.1.6 Types of raster images......................................................................................................... 17
2.9.1.7 Grayscale images ................................................................................................................ 17
2.9.2 Image processing ......................................................................................................................... 18
3.0 METHODOLOGY ........................................................................... 19
3.0.1 Package used for software development ................................................................................... 19
3.0.1.1 Matlab Overview ................................................................................................................. 19
3.0.1.2 Image Processing Toolbox .................................................................................................. 20
3.1 Experimental Set up .................................................................................................................. 21
3.1.1 Programming Package features related to software design ........................................................ 23
3.1.1.1 Systems of co-ordinates ...................................................................................................... 23
3.0.1.1 List of Matlab commands and functions used in experiments. .......................................... 23
3.0.1.2 Cycle operators ................................................................................................................... 24
3.0.1.3 PARFOR loop ....................................................................................................... 25
3.0.1.4 Converting image from colour to grayscale ........................................................................ 25
3.2 Proposed software working principles...................................................................................... 26
3.2.1 Proposed program concept Nr. 1 .................................................................................................. 27
3.2.1.1 Description .......................................................................................................................... 27
3.2.1.2 Background ......................................................................................................................... 27
3.2.1.3 Segmentation Method 1 ...................................................................................................... 28
3.0.1.4 Segmentation Method 2 ...................................................................................................... 32
3.0.1.5 Segmentation Method 3 ...................................................................................................... 36
3.0.1.5 Segmentation Method 4 ...................................................................................................... 46
3.0.1.6 Canny edge detector ............................................................................................................ 53
Conclusion on the Program Concept Nr.1 ............................................................................................ 55
3.0.2 Proposed program concept Nr. 2 ................................................................................................. 55
Description............................................................................................................................................ 55
Highlight modified areas ...................................................................................................................... 55
Results .................................................................................................................................................. 56
Conclusion ............................................................................................................................................ 57
DISCUSSIONS ........................................................................................................ 58
RECOMMENDATIONS ......................................................................................... 60
Development of the 3d surface recreation method ................................................................................ 61
3d scanners ................................................................................................................................................ 61
Three-dimensional photo .......................................................................................................................... 62
Analysis of obtained 3d data ..................................................................................................................... 65
Conclusion on proposed recommendation ................................................................................................ 67
Alternative recommendation ................................................................................................................. 67
Development of the segmentation method based on graph theory ............................................................ 67
Method description ............................................................................................................................... 67
Conclusion on proposed method........................................................................................................... 71
CONCLUSIONS OUTLINE.................................................................................... 72
LIST OF REFERENCES ................................................................................................. 73
APPENDIX A : MATLAB CODES OF THE PROPOSED METHODS .................................... 74
1. Function rem_small ........................................................................................................................ 74
2. Function rate_of_change_scan ....................................................................................................... 75
3. Function watershed_segmentation.m ............................................................................................ 76
4. Function horscan_i ......................................................................................................................... 80
5. Function vibr1_1 ............................................................................................................................. 81
6. Function imabsdiffer.m .................................................................................................................. 84
FIGURE 53: THRESH = 0.2 ......................................................................................................................... 54
FIGURE 54: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED
MORE THAN PRESET CRITICAL LEVEL. ..................................................................................... 56
FIGURE 55: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED
MORE THAN PRESET CRITICAL LEVEL. ..................................................................................... 57
FIGURE 56: WEAK EDGES, METHOD1 ................................................................................................... 59
FIGURE 57: WEAK EDGES, CANNY EDGE DETECTOR ...................................................................... 59
FIGURE 58: OVER SEGMENTATION, CANNY EDGE DETECTOR ..................................................... 59
FIGURE 59: NOISE CONTAMINATION, METHOD1 .............................................................................. 59
FIGURE 60: STRUCTURE LIGHT PRINCIPLE ........................................................................................ 63
FIGURE 61: AN EXAMPLE OF TRIANGULATION PRINCIPLE ........................................................... 64
FIGURE 62: AN EXAMPLE OF RESTORATION OF THREE-DIMENSIONAL SHAPE OF A
SURFACE USING A METHOD OF THE STRUCTURED ILLUMINATION: INITIAL OBJECT
(A) AND SHAPE RECONSTRUCTION (A VIEW FROM VARIOUS ANGLES)........................... 65
FIGURE 63: AN EXAMPLE OF MODELLING OF THE IMAGE THE WEIGHED GRAPH. ................. 68
FIGURE 64: AN EXAMPLE OF A MATRIX OF PAIRED DISTANCES FOR A DOT
CONFIGURATION. ............................................................................................................................ 69
FIGURE 65: RESULTS OF WORK OF METHOD NORMALIZED CUTS( IMAGE TAKEN FROM ..... 69
FIGURE 66: RESULT OF WORK NESTED CUTS .................................................................................... 70
FIGURE 67: CONSTRUCTION OF A PYRAMID OF THE WEIGHED GRAPHS FOR THE IMAGE. .. 70
FIGURE 68: COMPARISON OF RESULTS OF WORK OF ALGORITHM SWA, ITS UPDATINGS
AND NORMALIZED CUTS .............................................................................................................. 71
1.0 Introduction
In the science of sediment transport there is a common difficulty in detecting moving
particles. Considering that particle motion is one of the basic properties needed for an
accurate description of the sediment transport rate, an efficient method of grain movement
detection needs to be developed.
According to the project description, the required method is limited to image
analysis; therefore image processing techniques must be applied to the input images to
obtain the necessary grain parameters and dimensions.
A common challenge in analysing objects in an image is image segmentation.
Splitting an image into distinct regions according to some criterion is called image
segmentation. It is assumed that the regions correspond to real objects, or parts of them,
and that the region borders correspond to object borders. Segmentation plays an important
role in image processing and computer vision problems. Therefore, the major and most
important part of creating image analysis techniques is the development of precise and
effective ways to segment an image.
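The thesis implements its segmentation experiments in MATLAB (see Appendix A); purely as an illustration of the edge-based idea described above, the following Python/NumPy sketch marks pixels whose intensity gradient magnitude exceeds a threshold. The synthetic image, the threshold value and the function name are all hypothetical, not taken from the project code.

```python
import numpy as np

def segment_by_edges(img, thresh=0.2):
    """Crude edge-based segmentation: mark pixels where the
    intensity gradient magnitude exceeds a threshold."""
    gy, gx = np.gradient(img.astype(float))  # gradients along rows, columns
    mag = np.hypot(gx, gy)                   # gradient magnitude
    return mag > thresh                      # boolean edge mask

# Synthetic "grain": a bright disc on a dark background.
y, x = np.mgrid[0:64, 0:64]
img = ((x - 32) ** 2 + (y - 32) ** 2 <= 10 ** 2).astype(float)

edges = segment_by_edges(img, thresh=0.2)
print(edges.sum())  # count of pixels flagged as boundary
```

In a real grain image the threshold would have to be tuned to the lighting and noise level, which is exactly the kind of "regulation parameter" the experiments in later sections vary.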
The work required to develop the software is mainly experimental in nature.
Various methods must be tested and compared in terms of their efficiency. Several
segmentation techniques are available for testing. Most of them are based on edge
detection; some rely on other working principles.
A variety of experiments should nevertheless be carried out to build a broad picture of
the efficiency of the tested methods, as that may lead to an optimal solution of the
problem based on the collected experimental statistics.
Aims and objectives
The main aim of the project is the development of software that can effectively
obtain the required parameters of the grain particles (such as size, displacement and
velocity) and use that data to work out the volumetric bed load transport rate.
The objectives of the project are:
Design and construct a simplified model of a river bed to obtain the required
input data for further analysis.
Experiment with different factors that could positively or negatively influence
further analysis (such as lighting mode or the use of a reflector) and identify the best
configuration.
Develop and compare different approaches to obtaining the required grain
parameters. Test a variety of the proposed methods and, based on the results of the
experiments, evaluate the efficiency and prospects of each method's implementation.
Consider alternative approaches instead of sticking with a single method.
Analyse the obtained results to work out an optimal solution of the problem.
2.0 Literature Review and Background Information
2.1 Sediment definition
River sediment consists of solid mineral particles carried by a stream. River deposits are
formed from the products of weathering, denudation and erosion of rocks and soils. Water
erosion, the destruction of the land surface under the influence of flowing water, is the
most active process supplying rivers with deposits.
There are several phases of sediment movement (Fig. 1):
When the stream velocity is low, no deposits move, but as the velocity increases
some of the deposits begin to move sporadically by rolling and sliding.
This type of movement is called the "contact load of the stream".
If the velocity continues to increase, individual sediments start to make short
jumps, leaving the bed for a short time and then either returning to rest or
continuing to move by rolling or further jumping. This type of movement
is called the "saltation load of the stream".
If the velocity increases even more, saltation occurs more frequently. Some
of the sediments are kept in suspension by the upward components of flow
turbulence for appreciable lengths of time. This type of sediment movement is
called the "suspended load of the stream".
FIGURE 1: TYPES OF SEDIMENT MOTION
(HTTP://WWW.DKIMAGES.COM/DISCOVER/PREVIEWS/796/5103379.JPG)
2.2 Subject Relevance
Information on river bedload is essential for designing and building various hydraulic
engineering structures, bridge crossings, channels of different functions, dams and other
constructions, in order to predict and avoid possible negative effects of the sediment.
Some structures in high-velocity streams are damaged by sediment particles in motion.
This process can seriously wear away the surface of such structures or pavements
unless sufficient protection is provided. The damage can be caused by both smaller and
larger particles. The chief damage of this nature is to turbines and pumps, where
sediment-laden water causes excessive wear on runners, vanes, propellers and other
appurtenant parts. (Murthy and Madhaven, 1959)
Deposition in natural or artificial channels and reservoirs can also cause serious problems,
and excess sediment must be removed for the following reasons. In navigable channels,
excessive sediment needs to be removed to maintain the specific depths that are crucial
for safe shipping. In open channel hydraulics, excessive sediment in natural streams has
a large impact on a channel's flood-water capacity and may result in overflow. To avoid
these problems, sediments are removed from problematic channels on a regular basis.
(H. Garcia, 2008)
2.3 Solution of sediment problems
At first sight, the optimal solution for preventing sediment transport problems would be
to stop the erosion at its source and thus prevent new sediment from forming. But bearing
in mind the great lengths of rivers, even if the erosion source is stopped, huge amounts of
sediment will still remain in the river for a considerably long time. It is therefore more
rational to use protection systems that filter out the sediments and store them in special
sediment containers, or that bypass sediments around the risk areas. In some cases more
radical methods, such as dredging, are used. To solve these and other sediment problems,
and to assess how effectively a measure is solving the problem, there must be a clear
understanding of the basic principles of sedimentation and hydraulics. (Steven J. Goldman,
Katharine Jackson, Taras A. Bursztynsky, "Erosion and Sediment Control Handbook")
The basics of sediment bedload transport principles are discussed in the next chapter.
2.4 Sediment Discharge Calculation
Efficient methods for computing sediment discharge are crucial for effectively planning
and designing construction and maintenance works in rivers and canals. At present,
however, the available techniques and methods for computing sediment discharge do not
allow sediment movement to be predicted and estimated efficiently. In practice, engineers
cannot use these methods as the main argument when making decisions; usually they have
to rely on their own experience. It is difficult for an engineer to select a formula for
calculations, because the results often vary significantly between methods, and it is hard
to judge which formula gives the most realistic result unless the engineer makes
observations and compares them with the computed discharge. Many of the formulas that
engineers treat as the most useful and realistic rest only on the subjective experience of
the engineer, so comparing the efficiency of the formulas is a major concern.
(A. Vanoni, 1975)
2.5 Bed Load Transport Definition
2.5.1 Two Schools of thought
There are two main schools of thought in bed load science: one originates with Ralph
Alger Bagnold and the other with Professor Hans Albert Einstein.
Bagnold's (1956) definition of bed load transport is that contact of the particles
with the bed is governed mainly by gravity, while suspended particles
are lifted by random upward impulses exerted on a particle by turbulence.
Einstein gives a somewhat different definition of bed load transport. He assumed that
bed load transport is the movement of grains in a thin layer, a few grain diameters
thick, in which grains move by sliding, rolling and making small jumps over distances
of a few grain diameters.
Einstein considered the turbulent mixing in the bed layer too weak to directly influence
the movement of sediments, so that suspension of particles is not possible in the bed
layer. He assumed that any particle in the bed load travels in a series of movements of
approximately 100 grain diameters, independently of the flow, transport rate
and bed characteristics.
He treated saltating grains as suspended load because the heights of the jumps were much
larger than one or two grain diameters. Bagnold (1956, 1973), by contrast, believed
saltation to be the main process driving bed load transport.
Many further works were based on these two schools. The key element in Einstein's
formulation is the entrainment rate of particles per unit area as a function of parameters
such as the shear stress.
2.6 Physical description of the bed load transport
As the main purpose of the project is the design of software, only the basic principles
necessary for understanding the topic and implementing it successfully in the software
design are covered below. More detailed descriptions can be obtained from the
referenced sources.
2.6.1 General function
In general, the volumetric bed load transport rate is a function of the boundary
shear stress $\tau_b$ and various parameters of the deposits:

$q_b = f(\tau_b,\ \text{grain parameters})$    (Eq. 1)

It can be defined in several ways:

1. As the product of the grain velocity, the thickness of the bed load layer and the
grain concentration:

$q_b = u_b\, c_b\, \delta_b$    (Eq. 2)

where
$q_b$ – volumetric bed load transport rate ($\mathrm{m^2/s}$)
$u_b$ – velocity of moving particles ($\mathrm{m/s}$)
$c_b$ – concentration of particles (volume of sediment per volume of water–sediment mixture)
$\delta_b$ – bed load layer thickness ($\mathrm{m}$)
2. As the product of the grain velocity, the grain volume and the number of moving
grains per unit area:

q_b = u_b V_p N    Eq. 3

q_b – volumetric bed load transport rate (m²/s)
u_b – velocity of the moving particles (m/s)
V_p – volume of a particle (m³)
N – number of moving grains per unit area (1/m²)

The velocity of the particles can also be defined as the ratio of the saltation distance L
to the period of particle movement T:

u_b = L/T    Eq. 4

so that q_b = u_b V_p N can be expressed as

q_b = V_p N L/T    Eq. 5
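The two ways of writing the transport rate can be checked against each other numerically. In the sketch below (Python is used purely for illustration; the project's own code is MATLAB), all values are invented for the example — only the relations between them come from the text.

```python
# Hypothetical values, chosen only to illustrate Eqs. 3-5.
L = 0.2       # saltation distance (m)
T = 0.5       # period of particle movement (s)
V_p = 4.0e-9  # volume of one particle (m^3), roughly a 2 mm grain
N = 5.0e4     # number of moving grains per unit area (1/m^2)

u_b = L / T                # Eq. 4: particle velocity (m/s)
q_b = u_b * V_p * N        # Eq. 3: volumetric bed load transport rate (m^2/s)
q_b_alt = V_p * N * L / T  # Eq. 5: the same rate written via L and T

print(u_b)                         # 0.4
print(abs(q_b - q_b_alt) < 1e-15)  # True: Eq. 3 and Eq. 5 agree
```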
2.7 Bed Load Transport Relations
Most bed load relations can be described in a universal dimensionless form based on the
Einstein (1950) bed load number:

q* = q_b / (D √(R g D))    Eq. 6

In Einstein's entrainment formulation the dimensionless transport rate is introduced as
the product

q* = E* λ*    Eq. 7

where

E* = E_s / √(R g D)  and  λ* = L_s / D    Eq. 8, Eq. 9

q_b is the volumetric transport rate of bed load;
D is the grain diameter;
Re_p is the particle Reynolds number. (The transition from laminar to turbulent
flow occurs once the so-called critical Reynolds number Re_crit is reached: at
Re < Re_crit the flow is laminar, at Re > Re_crit turbulence becomes possible.)
R is the submerged specific gravity of the sediment;
g is the acceleration of gravity;
E_s is the volumetric sediment entrainment rate per unit area;
L_s is the particle travel distance;
τ_b is the boundary shear stress.
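Eq. 6, the Einstein bed load number, is straightforward to evaluate. The sketch below uses the standard form q* = q_b / (D √(R g D)) with invented values; Python is used purely for illustration, since the report's own code is MATLAB.

```python
import math

q_b = 8.0e-5  # volumetric bed load transport rate (m^2/s), assumed
R = 1.65      # submerged specific gravity of quartz sediment
g = 9.81      # acceleration of gravity (m/s^2)
D = 2.0e-3    # grain diameter (m), assumed

# Eq. 6: dimensionless Einstein bed load number
q_star = q_b / (D * math.sqrt(R * g * D))

print(0 < q_star < 1)  # True for these particular values
```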
2.8 Possible implementation of theory in software design
The following assumptions were made about the ways the software could meet the requirements:
2.8.1 Direct measure method
The software is required to detect the motion of the grains and measure the sizes of the
moving grains in order to work out the bed load transport rate. Thus, if it functions
ideally, it should be able to collect all the data necessary to work out the volumetric
transport rate q_b by detecting each grain displacement and estimating each moving
grain's size from its top projection. (For example, the size can be estimated by treating each
particle as a sphere, and the travel distance by using Einstein's (1950) estimation that
each grain travels approximately 100 grain diameters.)
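The sphere assumption in the parenthesis above can be sketched directly: from the area of a grain's top projection, infer an equivalent circle diameter, then take the volume of a sphere of that diameter. The pixel area and scale below are invented for illustration; Python is used only as a sketch language here.

```python
import math

def grain_volume_from_area(area_px, m_per_px):
    """Treat the projected grain as a circle, infer an equivalent diameter,
    and return the volume of a sphere with that diameter (m^3)."""
    area_m2 = area_px * m_per_px ** 2
    d = 2.0 * math.sqrt(area_m2 / math.pi)  # equivalent circle diameter (m)
    return math.pi * d ** 3 / 6.0           # sphere volume (m^3)

# Hypothetical grain: 314 pixels of projected area at 0.1 mm per pixel.
v = grain_volume_from_area(area_px=314, m_per_px=1.0e-4)
print(v > 0)  # True: a ~2 mm grain gives a volume of a few cubic millimetres
```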
2.8.2 Estimation method that uses bed load calculation formulas
If the program is not able to track the motion of the particles efficiently, it could
instead measure the size of each grain in the picture and produce a grain-size distribution.
Then, by combining the grain-size data with flow characteristics measured
simultaneously, it might be possible to estimate the transport using a variety of theories
and formulas introduced by different scientists.
2.9 Digital image properties
2.9.1 Digital Image definition
As the software input will be presented as digital images, it is very important to
understand what a digital image is and how data is stored in it.
Digital imaging differs from traditional photography in that the optical image is formed
on a photo sensor instead of photographic material. An image represented digitally is
suitable for further computer processing, which is why digital imaging is often
considered part of information technology.
Digital technologies are used in digital cameras and video cameras, and in fax and
copying devices with various photo sensors, recording and transferring analog or
digital signals. Advances in photo sensor technology have allowed digital cameras
to supersede film photography in the majority of applications.
There are two basic ways of digitally representing images (fig. 2):
Raster graphics
Vector graphics
FIGURE 2 – VECTOR AND RASTER FORMAT IMAGES
(http://www.edc.uri.edu/criticallands/raster.html)
2.9.1.1 Raster graphics
As the input for the required software will be given as raster images, it is important to
understand the principles of raster graphics.
A raster image is a grid (raster) whose cells are called pixels. A pixel in a raster
image has a strictly defined position and color; hence, any object is represented by the
program as a set of colored pixels, and a user working with raster images in fact works
with the groups of pixels that make them up.
The image is represented as a large number of points: the more points there are,
the better the visual quality of the image, but the file size grows accordingly. That is,
the same picture can be presented with good or bad quality depending on the number of
points per unit of length, i.e. the resolution (usually measured in dots per inch, dpi, or
pixels per inch, ppi).
There are different types of raster images. They differ in the way color or brightness
information of each pixel is represented and stored. Color is formed by mixing several
components, which can be defined in various color systems (color spaces).
The term color depth designates how many bits are necessary to store the color
information of a pixel. Color depth is measured in bits per pixel.
The amount of memory necessary to store a raster image can be calculated by the formula:

V = (X · Y · I) / 8    Eq. 10

X – number of columns;
Y – number of rows;
I – color depth (bits per pixel);
V – memory volume (bytes).
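Eq. 10 translated into a short sketch (Python is used here purely for illustration; the project's code is MATLAB):

```python
# Eq. 10 in code: memory needed for an uncompressed raster image, in bytes.
def raster_bytes(cols, rows, depth_bits):
    return cols * rows * depth_bits / 8

# A 1024x768 image with 8-bit color depth:
print(raster_bytes(1024, 768, 8))   # 786432.0 bytes (= 768 KB)
```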
2.9.1.2 Important characteristics
The important characteristics of a raster image are:
the number of pixels, which can be specified separately for width and
height (1024×768, 640×480, ...) or, more rarely, as the total number of pixels
(usually measured in megapixels);
the number of colors used (the color depth);
the color space: RGB, CMYK, XYZ, YCbCr, etc.
Raster images are edited by means of raster graphic editors. They are created by
cameras and scanners, in raster editors, by export from vector editors, or as
screenshots.
2.9.1.3 Advantages
Raster graphics can create (reproduce) practically any figure, regardless of
complexity, unlike, for example, vector graphics, where it is impossible to
precisely reproduce a gradual transition from one color to another (in
theory it is possible, but a complex image occupying 1 MB in BMP format
may occupy 200 MB in a vector format).
Prevalence: raster graphics are now used practically everywhere, from small
badges to posters.
High speed of processing of complex images, if scaling is not necessary.
Raster representation of an image is natural for the majority of graphic
input/output devices, such as monitors, printers, digital cameras,
scanners, etc.
2.9.1.4 Disadvantages
Large file size even for simple images.
Impossibility of ideal scaling.
Because of these disadvantages, vector graphics are recommended for storing simple
figures, in preference even to compressed raster graphics.
2.9.1.5 Formats
Raster images are usually stored in a compressed form. Depending on the type of
compression, it may or may not be possible to restore the image to the quality it had
before compression (lossless or lossy compression, respectively). A graphic
file can also store additional information: data about the author, the camera
and its settings, the number of points per centimeter for printing, etc.
2.9.1.6 Types of raster images
The following types of raster images exist, each intended for solving a certain range
of problems:
binary
grayscale
indexed
true colour
2.9.1.7 Grayscale images
The grayscale type of image is reviewed here because the input data will be presented as
a set of grayscale raster images.
So-called grayscale or intensity images are images whose pixels can take one of the
intensity values of a single color, in a range from minimal to maximal intensity (fig. 3).
Usually it is assumed that an intensity picture stores gradations of gray in a range
from black to white. Therefore intensity pictures are sometimes called "gray" images or
images in gradations of gray, and the term "brightness of a pixel" is used as a synonym
for "intensity".
Currently, grayscale pictures with a color depth of 8 bits/pixel are the most widely
used. They can store 256 values of brightness (from 0 to 255). Grayscale
pictures with a color depth of 2 to 16 bits/pixel are used less often.
FIGURE 3: VARIATIONS OF GRAYSCALE INTENSITIES
(http://www.kumagera.ne.jp/kkudo/grayscale.jpg)
2.9.2 Image processing
As mentioned above, the software is required to measure certain parameters of the grains
in the image. To achieve that, a method called "image processing" should be used in the
required software.
Image processing is any form of information processing in which the input data is
represented by images, for example photos or video frames. Images can be processed
either to obtain an image as output (for example, preparation for printing, for
broadcasting, etc.) or to obtain other information (for example, text recognition,
counting the number and type of cells in the field of a microscope, etc.). Besides
static two-dimensional images, it is often also necessary to process images changing
with time, such as video.
The variety of purposes and problems of image processing can be classified as
follows:
improvement of image quality;
measurements on images;
spectral analysis of multidimensional signals;
image recognition;
image compression.
(Wikipedia, accessed 03.03.2009; John C. Russ, The Image Processing
Handbook, 2006)
3.0 Methodology
3.0.1 Package used for software development
3.0.1.1 Matlab Overview
MATLAB was selected for writing the code of the required software due to its availability
and convenience. Its Image Processing Toolbox makes working with images much
simpler.
MATLAB as a programming language was developed at the end of the 1970s.
MATLAB (from "Matrix Laboratory") refers both to a package of
applied programs for the solution of technical computing problems and to the
programming language used in this package. MATLAB is used by more than 1,000,000
engineers and scientists, and it works on the majority of modern operating systems,
including GNU/Linux, Mac OS, Solaris and Microsoft Windows.
MATLAB is a high-level programming language that includes matrix-based data
structures, a wide spectrum of functions, an integrated development environment,
object-oriented capabilities and interfaces to programs written in other programming
languages.
Programs written in MATLAB can be of two types: functions and scripts. Functions
have input and output arguments, as well as their own workspace for storing
intermediate results of calculations and variables. Scripts use the shared workspace.
Both scripts and functions are not compiled into machine code but are kept in the form of
text files. There is also an option to save so-called pre-parsed programs: functions
and scripts processed into a form convenient for machine execution. Generally such
programs run more quickly than usual ones, especially if the function contains
commands for constructing graphs and other figures.
The basic feature of the MATLAB programming language is its extensive support for
working with matrices, which the developers of the language have expressed in the
slogan "Think vectorized".
3.0.1.2 Image Processing Toolbox
The Image Processing Toolbox gives scientists, engineers and even artists a wide
spectrum of tools for digital processing and analysis of images. Being closely connected
with the MATLAB application development environment, the Image Processing Toolbox
helps an engineer avoid performing long operations of coding and debugging
algorithms, allowing them to concentrate on solving the main scientific or
practical problems.
The basic capabilities of the package are:
restoration and extraction of image details;
work with a selected region of the image;
image analysis;
linear filtering;
image transformations;
geometrical transformations;
contrast enhancement of important details;
binary transformations;
image processing and statistics;
color transformations;
palette modification;
conversion between image types.
The Image Processing Toolbox gives wide opportunities for the creation and analysis of
graphic representations in the MATLAB environment. The package provides a flexible
interface allowing the user to manipulate images, interactively develop graphics,
visualize data sets and annotate results for descriptions, reports and publications.
Its flexibility and the connection of the package's algorithms with such MATLAB
features as the matrix-vector description make the package very well adapted for
solving any problems in development and representation. MATLAB includes specially
developed procedures that raise the efficiency of its graphics core. In particular, the
following features can be noted:
interactive debugging during the development of graphics;
a profiler for optimizing algorithm execution time;
construction tools for interactive graphical user interfaces (GUI Builder),
accelerating the development of GUI templates that can be adjusted to the goals
and problems of the user.
This package allows the user to spend much less time and effort on the creation of
standard graphic representations and thus to concentrate on the important details and
features of images.
MATLAB and the Image Processing Toolbox are highly adaptable for the development
and introduction of the user's new ideas and methods. For this purpose there is a set of
interfaced packages directed at solving all kinds of specific problems, including
problems in non-standard settings.
The Image Processing package is now used intensively in thousands of companies and
universities worldwide, covering a very broad range of problems, for example space
research, military development, astronomy, medicine, biology, robotics, materials
science, genetics, etc.
3.1 Experimental Set up
A set of high-quality images of sediment had to be taken to test different ideas and
methods during the development of the software. In practice, video footage of bed load
would be taken at water channels or rivers for further analysis, but for the required
experiments a simplified model can be used efficiently.
The simplified model consists of (fig. 4):
a rectangular fish tank filled with water, with gravel added until the whole tank
bottom is uniformly covered with sediment;
a digital camera;
a tripod;
two lighting sources (desk lamps);
a simple white reflector.
FIGURE 4: SIMPLIFIED MODEL OF THE CHANNEL
The fish tank represents a section of a water channel where sediment transport occurs.
The camera is mounted on the tripod perpendicular to the surface of the water. The
tripod ensures the stability of the camera, so that each shot is made from exactly the
same position. This is very important because images will often be analyzed in pairs,
and even minor image displacements can create unnecessary problems.
Each camera shot is assumed to be one frame of the video that will be analyzed by the
software. To simulate movement of the sediment, some grains are manually moved
between camera shots. All the images are converted from colour to grayscale, since in
reality the input images obtained from high-speed cameras will be in grayscale mode.
To efficiently detect the edges of the grains it is important to create lighting that
leaves shaded areas in the gaps between the grains; to disturb those shadows as little
as possible, the lighting source is placed almost parallel to the surface of the sediment.
A white reflector is placed opposite the lighting source. It creates diffuse
light that highlights the surface of the sediment without completely removing the
shadows from the gaps. This configuration was found experimentally and proved
to be the most efficient.
3.1.1 Programming package features related to software design
3.1.1.1 Coordinate systems
Two coordinate systems are used in the IPT (Image Processing Toolbox) package: pixel
and spatial. The majority of the package's functions use the pixel coordinate system,
a number of functions use the spatial system, and some functions can work with both.
Only the pixel coordinate system can be used when writing one's own scripts that
access pixel values in the MATLAB system.
The pixel coordinate system is traditional for digital image processing. In it, an image
is represented as a matrix of discrete pixels. To access a pixel of the image "I" it
is necessary to specify the row number "r" and column number "c" at whose intersection
the pixel is located: I(r,c). Rows are numbered from top to bottom, and columns
from left to right (fig. 5). The top left pixel has coordinates (1,1). Only the pixel
system will be used in the current software. Information about the spatial coordinate
system can be found in the Appendix.
FIGURE 5 -PIXEL SYSTEM OF COORDINATES
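The convention above can be mimicked in a few lines. The Python sketch below uses a nested list as a stand-in for the image matrix (MATLAB itself indexes this way natively, counting from 1):

```python
# Rows are numbered top to bottom, columns left to right; I(r, c) addresses
# row r, column c, and the top-left pixel is I(1, 1).
I = [
    [10, 20, 30],   # row 1
    [40, 50, 60],   # row 2
]

def pixel(image, r, c):
    """1-based lookup mimicking MATLAB's I(r, c)."""
    return image[r - 1][c - 1]

print(pixel(I, 1, 1))  # 10 (top-left pixel)
print(pixel(I, 2, 3))  # 60 (row 2, column 3)
```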
3.1.1.2 List of MATLAB commands and functions used in the experiments
In this part, the main MATLAB and Image Processing Toolbox functions necessary for
understanding the experimental code are discussed:
imread
Reads an image from a file.
The function D = imread(filename, fmt) reads a binary, grayscale or indexed image from
a file named "filename" and places it in the array "D".
If MATLAB cannot find a file named "filename", it searches for a file named "filename"
with the extension "fmt". The parameters "filename" and "fmt" are strings.
imwrite
Writes an image to a file.
The function imwrite(S, filename, fmt) writes a binary, grayscale or indexed image S
to a file named "filename". The file format is defined by the parameter "fmt".
The parameters "filename" and "fmt" are strings.
adapthisteq
The function J = adapthisteq(I) improves the contrast of a grayscale picture I by
transforming the values of its elements using contrast-limited adaptive histogram
equalization (CLAHE).
CLAHE works more effectively on small local neighbourhoods of an image than on the
full image. Contrast, especially in homogeneous areas, is limited in order to avoid
amplifying the noise component.
medfilt2
The function q = medfilt2(q, [A B]) performs median filtering with a kernel of
A×B pixels. The filtering eliminates "salt-and-pepper" noise in the image q in the
following way: all pixel values in the working area of the kernel are sorted in order
of increasing brightness, and the central element of the sorted sequence replaces the
central pixel.
imabsdiff
The function Z = imabsdiff(X, Y) subtracts each element of image Y from the
corresponding element of image X and places the absolute difference of these elements
into the resulting variable Z.
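The behaviour of the last two toolbox functions can be illustrated with pure-Python stand-ins on nested lists representing grayscale images. These sketches are not the toolbox implementations; the median-filter sketch handles a single pixel, clipping the neighbourhood at the borders.

```python
def imabsdiff(X, Y):
    """Element-wise absolute difference of two equally sized images."""
    return [[abs(x - y) for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def medfilt_pixel(img, r, c, k=3):
    """Median of the k x k neighbourhood around (r, c): the neighbourhood
    values are sorted by brightness and the central one replaces the pixel."""
    vals = []
    h, w = len(img), len(img[0])
    for i in range(max(0, r - k // 2), min(h, r + k // 2 + 1)):
        for j in range(max(0, c - k // 2), min(w, c + k // 2 + 1)):
            vals.append(img[i][j])
    vals.sort()
    return vals[len(vals) // 2]

A = [[10, 10], [10, 10]]
B = [[12, 9], [10, 14]]
print(imabsdiff(A, B))  # [[2, 1], [0, 4]]

noisy = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]  # one "salt" pixel
print(medfilt_pixel(noisy, 1, 1))  # 10 – the outlier is removed
```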
3.1.1.3 Loop operators
Similar and repeated operations are performed by means of the loop operators "for" and
"while". The for loop is intended for performing a predetermined number of repeated
operations; the while loop is for operations where the number of required repetitions is
unknown but the condition for continuing the loop is known.
3.1.1.4 PARFOR loop
During the experiments a problem arose: most of the processes in the written functions
were based on loops, and it took more than 10 hours to process an image using standard
loops. That is the reason the PARFOR loop was introduced.
The general purpose of the PARFOR loop is to divide a loop into independent parts and
run them in parallel, rather than running a single sequential loop. This results in a
significant increase in image processing speed. In the current project it reduced more
than 10 hours of processing time to a satisfactory 3-10 minutes.
More detailed information about the parfor loop can be found in the MATLAB help or on
the MathWorks website (http://www.mathworks.com).
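The split-and-run-in-parallel idea behind parfor can be sketched outside MATLAB as well. In the Python illustration below, concurrent.futures plays the role of parfor, and process_row is an invented stand-in for the real per-iteration image work; as with parfor, the iterations must be independent of one another.

```python
from concurrent.futures import ThreadPoolExecutor

def process_row(row):
    # placeholder for the real per-row image work
    return sum(row)

image = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

with ThreadPoolExecutor(max_workers=4) as pool:
    # like parfor: iterations run independently,
    # and results come back in the original order
    row_sums = list(pool.map(process_row, image))

print(row_sums)  # [6, 15, 24]
```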
3.1.1.5 Converting images from colour to grayscale
As previously stated, colour images were taken during the experiments with the
simplified model of a channel. As high-speed cameras that produce grayscale images are
going to be used in real-life conditions, it is necessary to convert the obtained colour
pictures to grayscale.
To do that, a function called "mrgb2gray" (written by Kristian Sveen, 10 Sep 2004) was
taken from the MATLAB Central website. A detailed description of the code can be found
at the hyperlink below.
http://www.mathworks.com/matlabcentral/fileexchange/5855.
The figures below show the original colour image and the converted grayscale image.
All the experiments will be done using the grayscale image (fig. 7).
FIGURE 6 : ORIGINAL COLOUR IMAGE FIGURE 7: IMAGE CONVERTED TO GRAYSCALE
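The conversion step itself can be illustrated per pixel. The internals of mrgb2gray are not reproduced here; the sketch below uses the common ITU-R BT.601 luminance weights, one standard choice for such conversions.

```python
def rgb_to_gray(r, g, b):
    # Weighted sum of the colour channels (BT.601 luminance weights).
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(rgb_to_gray(255, 255, 255))  # 255 (white stays white)
print(rgb_to_gray(0, 0, 0))        # 0   (black stays black)
print(rgb_to_gray(255, 0, 0))      # 76  (pure red maps to dark gray)
```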
3.2 Proposed software working principles
FIGURE 8: PROPOSED ALGORITHM NR. 1
1. Detect the grain edges on the first image and on the second image.
2. Segment and label each grain in both images.
3. Compare the two images and identify the grains that have moved.
4. Measure the area of the moved grains.
5. Approximate the volume of those grains.
6. Work out the volumetric movement rate of the sediment.

FIGURE 9: PROPOSED ALGORITHM NR. 2
1. Calculate the absolute difference between frames and highlight the areas where values have changed.
2. Identify which of the highlighted areas is a grain's movement starting point and which is its stopping point.
3. Estimate the area of the grain from the highlighted area and approximate the volume of the grain.
4. Work out the volumetric flow rate of the sediment.
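The frame-differencing idea of Algorithm Nr. 2 (Figure 9) can be sketched end to end on toy data: take the absolute difference between two frames, threshold it to highlight changed areas, and use the changed-pixel count as a stand-in for moved-grain area. The threshold and frames below are invented for the example.

```python
def changed_area(frame1, frame2, threshold):
    """Count pixels whose brightness changed by at least `threshold`
    between two frames (i.e. where a grain appeared or vanished)."""
    changed = 0
    for row1, row2 in zip(frame1, frame2):
        for a, b in zip(row1, row2):
            if abs(a - b) >= threshold:
                changed += 1
    return changed

f1 = [[50, 50, 200], [50, 50, 200]]   # bright grain on the right
f2 = [[200, 50, 50], [200, 50, 50]]   # grain moved to the left

print(changed_area(f1, f2, threshold=30))  # 4 changed pixels
```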
3.2.1 Proposed program concept Nr. 1
3.2.1.1 Description
The main purpose of the current experiment is to find out whether segmentation based on
object edge detection is efficient in this situation. Considering that the provided image
cannot be classified as easy to segment, due to its properties (non-homogeneity of grain
surfaces, non-sharp edges, shading, etc.), many different methods of edge detection are
used in a set of experiments.
Additionally, a "watershed segmentation" method was tested together with the edge detectors.
3.2.1.2 Background
Grain detection
In this method, grain detection and segmentation is the key problem that needs to be
solved. Once an appropriate solution to this problem is found, work on the other blocks
of the algorithm can start.
Segmentation
Image segmentation is the division or splitting of an image into regions by similarity of
the properties of their points. Primary extraction of the required objects from the
initial grayscale image by means of a segmentation transformation is one of the basic
stages of image analysis. The most widely used transformations are based on brightness
and on contours; some researchers also include textural segmentation among the basic
methods. According to this classification, regions are identified during segmentation
based on an estimate of conformity, by some criterion of similarity of values: either the
brightness of each point, or the first derivative of brightness in some specified
neighbourhood of each point, or one of the textural characteristics of the brightness
distribution in the specified neighbourhood of a point. (C. Gonzalez, 2004)
Edges
Edges are curves in the image along which there is a sharp change of brightness or of its
spatial derivatives. The changes of brightness that reflect important features of the
represented surface are the most interesting. Such places include those where the
surface orientation changes abruptly, where one object occludes another, where the
boundary of a cast shadow falls across an object, or where there is a discontinuity in
the reflective properties of a surface, etc.
Naturally, noisy brightness measurements limit the ability to extract edge information.
A contradiction arises between sensitivity and accuracy: short edges must possess
higher contrast than long ones in order to be distinguished.
Edge detection can be considered complementary to image segmentation, as edges can be
used to split images into areas corresponding to different surfaces.
3.2.1.3 Segmentation Method 1
Description
This experiment was the first one. Its primary purpose was to check how efficient the
simplest method of edge detection is, and its secondary objective was to test how fast
the MATLAB package processes images using the simplest algorithms.
The idea is that ideal borders have a rapid change in grayscale intensity (brightness).
The concept of the method is the comparison of gray intensities of each adjacent pair of
pixels in one image in the vertical, horizontal and diagonal directions. If a certain
predetermined difference is reached, the pixels are marked as belonging to a grain
edge.
function horscan_i
Input
"T" – an image or video frame to be analyzed
"differ" – the minimum intensity difference between adjacent pixels at which a
pixel is marked as an edge (the sensitivity of the edge scanner)
Output
"marked" – a variable containing an image of the grain borders obtained by the
current method.
Basic steps
1) The image is read into a variable:
T = imread('imname.bmp')
2) The image size is measured, for setting the loop step counts later on:
siz = size(T)
3) The critical difference value between adjacent pixels is set:
differ = 8
4) All pairs of adjacent pixels have to be checked for reaching the critical intensity
difference value described in the previous step.
a) The checking process is a loop in which each pair of adjacent pixels is processed
in sequence. When a whole horizontal line is fully processed, the next line is analyzed.
for hor = 1:(siz(1,2) - 1)
parfor ver = 1:(siz(1,1) - 1)
b) The difference between adjacent pixels in the horizontal direction is computed and
checked against the pre-set "differ" value. If it is reached, the pixel is marked as an
edge in a variable that stores the coordinates of the grain edges.
if (max(T(ver,hor), T(ver,hor + 1)) - min(T(ver,hor), T(ver,hor + 1))) >= differ
marked(ver,hor) = 1;
c) The difference between adjacent pixels in the vertical direction is computed and
checked against the pre-set "differ" value. If it is reached, the pixel is marked as an edge:
elseif (max(T(ver,hor), T(ver + 1,hor)) - min(T(ver,hor), T(ver + 1,hor))) >= differ
marked(ver,hor) = 1;
d) The difference between adjacent pixels in the diagonal direction is computed and
checked against the pre-set "differ" value. If it is reached, the pixel is marked as an edge:
elseif (max(T(ver,hor), T(ver + 1,hor + 1)) - min(T(ver,hor), T(ver + 1,hor + 1))) >=
differ
marked(ver,hor) = 1;
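The steps above can be condensed into a compact sketch (Python for illustration; the logic and the "differ" sensitivity follow the MATLAB fragments, and the test frame is invented):

```python
def edge_scan(T, differ):
    """Mark a pixel as an edge when the brightness difference to its right,
    lower, or lower-diagonal neighbour reaches the sensitivity `differ`."""
    h, w = len(T), len(T[0])
    marked = [[0] * w for _ in range(h)]
    for r in range(h - 1):
        for c in range(w - 1):
            neighbours = (T[r][c + 1], T[r + 1][c], T[r + 1][c + 1])
            if any(abs(T[r][c] - n) >= differ for n in neighbours):
                marked[r][c] = 1
    return marked

# A vertical brightness step between columns 2 and 3 is detected as an edge.
frame = [
    [10, 10, 90],
    [10, 10, 90],
    [10, 10, 90],
]
print(edge_scan(frame, differ=8))  # [[0, 1, 0], [0, 1, 0], [0, 0, 0]]
```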
Results
FIGURE 10: ORIGINAL IMAGE FIGURE 11: SENSITIVITY PARAMETER “DIFFER” = 7
FIGURE 12: SENSITIVITY PARAMETER “DIFFER” = 10 FIGURE 13: SENSITIVITY PARAMETER “DIFFER” = 13
As seen from the figures above, the method works to some extent. It captures some of the
grain borders, and the number of captured borders depends on the "differ" parameter.
Decreasing this parameter increases the sensitivity, so more borders are captured; but
with higher sensitivity, more noise comes along with the better borders.
Noise might be acceptable in this case, as long as the number of noise particles allows
it to be efficiently excluded and distinguished from the borders. Noise can possibly be
excluded by size, solidity and other parameters that MATLAB can work with.
Figure 14 below is an example of significant noise reduction, even in images obtained at
high sensitivity ("differ" = 7); in this case the noise was reduced by smoothing the
original image using the function medfilt2 (Appendix A). Figure 15 shows an example of
noise reduced by excluding objects by their area, using a function called rem_small,
written by the author of the report. (The function code can be found in Appendix A.)
FIGURE 14: NOISE REDUCED BY MEDIAN FILTERING
FIGURE 15: NOISE REDUCED BY THE REM_SMALL FUNCTION
It can be seen that some of the grains are heavily contaminated by noise regardless of
the sensitivity parameter. The reason for that is the non-homogeneous surface of those
grains (this is clearly seen on the original image).
Conclusion
Despite all the disadvantages of the method, it captures some of the borders and might
find an application. It might work efficiently in combination with other methods of
grain border detection. It can be used either with low sensitivity, and thus with few or
no noise particles, or with higher sensitivity and more noise, but with efficient
measures that separate grain borders from noise particles.
3.2.1.4 Segmentation Method 2
Description
The idea of this experiment is the detection of grain borders by comparing the rates of
change of pixel intensity values around a control pixel. It is assumed that the rate of
change of gray intensities changes rapidly at an object border.
FIGURE 16: ANALYZED PIXEL LOCATIONS
First of all, the rate of change of intensity from pixel A1 to the central pixel B1 is
calculated by finding the numerical difference between the pixel values. After that, the
same process is repeated between pixels B1 and C1. Finally, the rate of change to the
left of the middle pixel is compared with the rate of change to its right. If the rates
of change differ considerably (by a threshold manually defined by the user), the middle
pixel B1 is marked as a pixel belonging to a border. The same sequence is used to
calculate the vertical rates of change.
MATLAB algorithm based on the proposed method
function rate_of_change_scan
Input
T – an image or video frame to be analyzed
differ – the minimum difference between rates of change at which the central
pixel is marked as an edge (the sensitivity of the edge scanner)
Output
marked – a variable containing an image of the grain borders obtained by the
current method.
Basic steps
1) The image is read into a variable:
T = imread('imname.bmp')
2) The image size is measured, for setting the loop step counts later on:
siz = size(T)
3) The critical rate-of-change difference is set. The lower its value, the more
sensitive the edge detector is.
differ = 8
4) With the aid of loops,
for ver = 2:(siz(1,1) - 1)
parfor hor = 2:(siz(1,2) - 1)
each pixel in turn becomes the control pixel and is checked for the rates of change with
its adjacent pixels.
a) The rate of change with the pixel on the left is calculated:
ROC_left = T(ver,hor) - T(ver,hor - 1)
b) The rate of change with the pixel on the right is calculated:
ROC_right = T(ver,hor) - T(ver,hor + 1)
c) The rate of change with the pixel below is calculated:
ROC_down = T(ver,hor) - T(ver + 1,hor)
d) The rate of change with the pixel above is calculated:
ROC_up = T(ver,hor) - T(ver - 1,hor)
5) Difference of intensity change rates around the control point is calculated
a) In the horizontal plane first
ROC_diff_hor = imabsdiff(ROC_left,ROC_right)
b) Then in the vertical plane
ROC_diff_vert = imabsdiff(ROC_up,ROC_down)
6) Difference obtained in previous step is compared with the critical difference value set
by user. If the critical value is exceeded, the central control pixel is marked in a
variable “marked” that highlights the pixels belonging to the border.
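The steps above can be sketched as follows in Python with NumPy (a minimal illustration of the idea, not the author's Matlab code; the function name, the toy step-edge image in the usage below and the threshold value are assumptions):

```python
import numpy as np

def rate_of_change_scan(T, differ):
    """Mark a pixel as an edge when the rates of change on its two sides
    (left/right or up/down) differ by at least the user-set threshold."""
    T = T.astype(np.int32)  # avoid uint8 wrap-around when subtracting
    marked = np.zeros(T.shape, dtype=np.uint8)
    for ver in range(1, T.shape[0] - 1):
        for hor in range(1, T.shape[1] - 1):
            roc_left = T[ver, hor] - T[ver, hor - 1]
            roc_right = T[ver, hor] - T[ver, hor + 1]
            roc_down = T[ver, hor] - T[ver + 1, hor]
            roc_up = T[ver, hor] - T[ver - 1, hor]
            # Compare the horizontal and vertical rate-of-change pairs.
            if (abs(roc_left - roc_right) >= differ or
                    abs(roc_up - roc_down) >= differ):
                marked[ver, hor] = 1
    return marked
```

On an image with a vertical step edge, only pixels at the step are marked, since elsewhere the left and right rates of change are equal.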
It can be seen in the images above that without preliminary preparation of the processed
image (such as smoothing, contrast improvement, etc.), the given method highlights edges
efficiently only at a high user-set sensitivity, which also produces a large amount of
noise. The situation is very similar to the results of "Experiment 1": increasing the
sensitivity increases the amount of noise as well, so similar measures need to be applied
to cope with the noise.
After a number of tests, some noise-reduction methods proved relatively efficient in this
case:
Preliminary smoothing of the original image using the function medfilt2, with a
6x6-pixel neighbourhood. (figure 23)
A combination of contrast enhancement using the function adapthisteq and image
smoothing using the function medfilt2. (figure 24)
FIGURE 21: SENSITIVITY PARAMETER “DIFFER” = 5
FIGURE 22: SENSITIVITY PARAMETER “DIFFER” = 9
Conclusion
Compared with the method described in "Experiment 1", both of these methods have
similar grain-edge capturing quality.
Used on its own, this method does not capture edges well enough to segment the grains
with even satisfactory quality. Thus the current method cannot be used independently,
but, again, it might be combined with other edge scanners and provide those parts of the
edges that an edge detector based on a different concept could not capture.
3.0.1.5 Segmentation Method 3
Description
This method is called “Marker-Controlled Watershed Segmentation”.
The development of image-processing technology has led to new approaches to the
problem of image segmentation and to their application in many practical problems.
This experiment considers a relatively new approach to the segmentation problem: the
watershed method. The name of the method and its essence are briefly explained below.
The image is treated as a topographic map in which brightness values represent heights
above some reference level. If this terrain is flooded with water, pools form. As the
water level rises further, the pools merge; the places where they merge are marked as
watershed lines.
Separating adjoining objects in an image is one of the important problems of image
processing, and so-called Marker-Controlled Watershed Segmentation is often used to
solve it. In this method, "catchment basins" and "watershed lines" are defined on the
image by processing local areas according to their brightness characteristics.
Matlab algorithm based on the proposed method
During this experiment, instructions described on the website given below were used.
Parts of the code were copied and used.
http://www.mathworks.com/products/image/demos.html?file=/products/demos/shipping/images/ipexwatershed.html
function watershed_segmentation
Basic steps
1) Read the colour image and convert it to grayscale.
Read the data from a file:
rgb = imread('G:\Matlab\tested_image.jpg');
and convert it to a grayscale image:
I = rgb2gray(rgb);
imshow(I)
text(732,501,'Image courtesy of Corel(R)',...
'FontSize',7,'HorizontalAlignment','right')
FIGURE 23: OBTAINED GRAYSCALE PICTURE
2) Use the gradient magnitude as the segmentation function.
The gradient magnitude is computed with the Sobel edge mask, the function imfilter and
some arithmetic. The gradient is large at object borders and (in most cases) small away
from the edges of objects.
hy = fspecial('sobel');
hx = hy';
Iy = imfilter(double(I), hy, 'replicate');
Ix = imfilter(double(I), hx, 'replicate');
gradmag = sqrt(Ix.^2 + Iy.^2);
figure, imshow(gradmag, []), title('Gradient magnitude')
FIGURE 24:GRADIENT SEGMENTATION
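The same gradient-magnitude computation can be sketched in Python with SciPy (an illustration of the principle only, not the author's code; the toy step-edge image is an assumption):

```python
import numpy as np
from scipy import ndimage

# Toy grayscale image with a vertical step edge standing in for an object border.
I = np.zeros((8, 8), dtype=float)
I[:, 4:] = 100.0

# Sobel derivatives along each axis, then the gradient magnitude:
# large at the border, small (here zero) in the flat regions.
Ix = ndimage.sobel(I, axis=1)
Iy = ndimage.sobel(I, axis=0)
gradmag = np.hypot(Ix, Iy)
```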
3) Mark the foreground objects.
Various procedures can be used to mark the foreground objects. Here the morphological
techniques called "opening by reconstruction" and "closing by reconstruction" are used;
these operations make it possible to analyse the interior of the image objects with the
function imregionalmax.
As mentioned above, morphological operations are used when marking the foreground
objects. Some of them will be considered and compared below.
First, morphological opening with the function imopen is performed.
se = strel('disk', 20);
Io = imopen(I, se);
figure, imshow(Io), title('Io')
FIGURE 25: FOREGROUND MARKING
Next, opening-by-reconstruction is computed using the functions imerode and
imreconstruct.
Ie = imerode(I, se);
Iobr = imreconstruct(Ie, I);
figure, imshow(Iobr), title('Iobr')
FIGURE 26:IMAGE ERODE
The subsequent morphological opening and closing operations remove dark blotches
and form markers. Morphological closing is analysed below; for this purpose the
function imclose is used first:
Ioc = imclose(Io, se);
figure, imshow(Ioc), title('Ioc')
FIGURE 27: IMAGE CLOSE
Next, the function imdilate is applied together with the function imreconstruct. For this
use of imreconstruct, the images must first be complemented.
Iobrd = imdilate(Iobr, se);
Iobrcbr = imreconstruct(imcomplement(Iobrd), imcomplement(Iobr));
Iobrcbr = imcomplement(Iobrcbr);
figure, imshow(Iobrcbr), title('Iobrcbr')
FIGURE: IMDILATE
Comparative visual analysis of Iobrcbr and Ioc shows that reconstruction-based opening
and closing is more effective than the standard opening and closing operations.
The local maxima of Iobrcbr are now calculated and the foreground markers obtained.
fgm = imregionalmax(Iobrcbr);
figure, imshow(fgm), title('fgm')
FIGURE 28:IMREGIONALMAX FUNCTION USED
The foreground markers are superimposed on the initial image.
I2 = I;
I2(fgm) = 255;
figure, imshow(I2), title('fgm superimposed on the initial image')
FIGURE 29:IMPOSE MARKERS ON THE IMAGE
Some hidden or occluded objects in the image are not marked. This affects the result:
such objects will not be processed during segmentation. Thus, even under ideal
conditions, the foreground markers cover only the majority of the objects.
Some of the foreground markers reach the edges of the grains. The markers should be
cleaned and shrunk to allow further processing; in particular, morphological operations
can be used.
se2 = strel(ones(5, 5));
fgm2 = imclose(fgm, se2);
fgm3 = imerode(fgm2, se2);
As a result of this operation, isolated pixels disappear from the image. The function
bwareaopen, which removes objects smaller than a set number of pixels, can also be
used.
fgm4 = bwareaopen(fgm3, 20);
I3 = I;
I3(fgm4) = 255;
figure, imshow(I3)
title('fgm4 superimposed on the initial image')
FIGURE 30: BWAREAOPEN FUNCTION
4) Compute the background markers.
Now the background is marked. In the image Iobrcbr the dark pixels belong to the
background, so threshold processing can be applied.
bw = im2bw(Iobrcbr, graythresh(Iobrcbr));
figure, imshow(bw), title('bw')
FIGURE 31: THRESHOLD OPERATION
The background pixels are dark, but simple morphological operations on the background
markers are not enough to obtain the borders of the segmented objects. Instead, the
background is "thinned" to obtain a true skeleton of the image foreground. This is
calculated with a watershed transform of the distance transform (distances to the
watershed lines).
D = bwdist(bw);
DL = watershed(D);
bgm = DL == 0;
figure, imshow(bgm), title('bgm')
FIGURE 32: WATERSHED LINES
5) Calculate the watershed transform of the segmentation function.
The function imimposemin can be applied to define the local minima of the image
exactly. Using the foreground and background markers, imimposemin corrects the
gradient values so that regional minima occur only at the marker locations.
gradmag2 = imimposemin(gradmag, bgm | fgm4);
Finally, the watershed segmentation itself is carried out.
L = watershed(gradmag2);
6) Visualise the processing result.
The foreground markers, the background markers and the borders of the segmented
objects are superimposed on the initial image.
I4 = I;
I4(imdilate(L == 0, ones(3, 3)) | bgm | fgm4) = 255;
figure, imshow(I4)
title('Markers and object borders superimposed on the initial image')
FIGURE 33:BORDERS MARKED
This display makes it possible to analyse visually the locations of the foreground and
background markers.
Displaying the processing result as a colour image is also useful. The label matrix
generated by the functions watershed and bwlabel can be converted into a truecolor
image with the function label2rgb.
Lrgb = label2rgb(L, 'jet', 'w', 'shuffle');
figure, imshow(Lrgb)
title('Lrgb')
FIGURE 34:DISPLAY THE RESULTS
It is also possible to superimpose the pseudo-colour label matrix over the initial image
in a translucent mode.
figure, imshow(I), hold on
himage = imshow(Lrgb);
set(himage, 'AlphaData', 0.3);
title('Lrgb superimposed on the initial image in a translucent mode')
FIGURE 35: RESULTS SUPERIMPOSED ON THE ORIGINAL IMAGE
A combination of the contrast-enhancement function adapthisteq and the smoothing
function medfilt2 was used as preliminary processing, since under the current conditions
it gave the best segmentation result of all the variants tested.
Conclusion
It is seen from Fig. 35 that only a few of the grains are segmented properly, and some of
the captured grains are over-segmented. This unimpressive capture capability can be
explained by non-uniform lighting, the absence of a well-defined background and the
non-homogeneous surface of most grains.
The method's efficiency may vary depending on the conditions (such as lighting,
background, etc.).
3.0.1.5 Segmentation Method 4
Description
This method is very interesting as it was discovered by accident.
During experiments on the pixel-value difference method (described starting from page
55), when comparing two consecutive frames and highlighting the areas whose values
had changed, it was noticed that the grain borders were highlighted to some extent as
well (fig. 36).
FIGURE 36: ACCIDENTALLY HIGHLIGHTED BORDERS
Analysis of the cause showed that the two processed frames had been taken from slightly
different perspectives: the camera was accidentally moved by a few millimetres when the
image-capture button was pressed, which shifted the second image relative to the first.
It was assumed that such a shift could be simulated in the Matlab environment to
construct an edge detector based on this idea.
The main idea is to shift the same image by a few pixels, compare it with the original
image and highlight the pixel-value differences that exceed a critical value set by the
user. If the picture is smoothed beforehand, the grains become relatively homogeneous
and the grain edge lines can be highlighted with little noise.
Matlab algorithm based on the proposed method
function [dif] = vibr1_1(imnam1, pix_move, pix_difference, rem_area)
Input
imnam1 = image name
rem_area = area of particles to be filtered out
pix_move = distance to shift the image, in pixels
pix_difference = critical pixel difference for highlighting, when comparing two frames.
Output
dif = variable containing grain borders obtained by current method
Basic steps
1) The image is read into a variable:
im1 = imread(imnam1)
2) The image size is measured to set the loop bounds later:
siz = size(im1)
3) The critical difference value between compared pixels is set (an input argument of the
function):
pix_difference = 8
4) Shifted images are created and saved into predetermined variables.
a) The image shifting is a loop that shifts the pixels one by one in each of the
possible directions and saves the results into separate variables, so that the borders
obtained from all shift directions can later be analysed and combined.
for ver = (pix_move+1):(ver_size - pix_move)
pix_move_ver_down = ver - pix_move
pix_move_ver_up = ver + pix_move
parfor hor = (pix_move+1):(hor_size - pix_move)
b) Image shifted in the horizontal direction:
im2(ver,hor + pix_move) = im1(ver,hor);
c) Image shifted in the vertical direction:
im3(pix_move_ver_up,hor) = im1(ver,hor);
d) Image shifted in the diagonal direction (vertical + right):
im4(pix_move_ver_down,hor + pix_move) = im1(ver,hor);
e) Image shifted in the diagonal direction (vertical + left):
im5(pix_move_ver_up,hor + pix_move) = im1(ver,hor);
5) Absolute pixel differences are calculated between original and shifted images
a) Between original and horizontally shifted image
im_difference1_2 = imabsdiff(im1,im2);
b) Between original and vertically shifted image
im_difference1_3 = imabsdiff(im1,im3);
c) Between original and diagonally(vertically + right) shifted image
im_difference1_4 = imabsdiff(im1,im4);
d) Between original and diagonally(vertically + left) shifted image
im_difference1_5 = imabsdiff(im1,im5);
6) The difference between the images is checked against the critical value; if it is
reached, the pixel is marked as a border pixel.
a) The process is a loop,
for ver = (pix_move+1):(ver_size - pix_move)
pix_move_ver_down = ver - pix_move
pix_move_ver_up = ver + pix_move
parfor hor = (pix_move+1):(hor_size - pix_move)
in which the pixels are sequentially checked against the critical value predetermined by
the user.
The variable pix_move is used to exclude the grain-border doubling caused by the image
shifting.
b) The differences for all the images are checked against the critical value and the
required pixels are marked.
if im_difference1_2(ver,hor) >= pix_difference
bi_im1_2(ver,hor - pix_move) = 1;
elseif im_difference1_3(ver,hor) >= pix_difference
bi_im1_3(pix_move_ver_down,hor) = 1;
elseif im_difference1_4(ver,hor) >= pix_difference
bi_im1_4(pix_move_ver_up,hor - pix_move) = 1;
elseif im_difference1_5(ver,hor) >= pix_difference
bi_im1_5(pix_move_ver_down,hor - pix_move) = 1;
The figures below (Fig. 37, Fig. 38, Fig. 39, Fig. 40) show the grain edges obtained with
the different shift directions.
FIGURE 37: GRAIN EDGES OBTAINED USING HORIZONTAL SHIFT
FIGURE 38: GRAIN EDGES OBTAINED USING VERTICAL SHIFT
FIGURE 39: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + RIGHT) SHIFT
FIGURE 40: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + LEFT) SHIFT
7) From the figures above it is seen that the obtained edge-part images contain some
noise. As all of these grain-edge parts will later be put together, it is preferable to
remove the noise now, to avoid noise accumulating in the final image.
a) For this purpose a function rem_small was written (Appendix A).
First of all, this function labels all the objects in the image.
Then it measures each object's area.
Finally, it excludes the objects whose area is less than a predetermined value.
b) Noise is removed from each image using function rem_small described above
bi_im1_2_mod = rem_small(bi_im1_2,rem_area)
bi_im1_3_mod = rem_small(bi_im1_3,rem_area)
bi_im1_4_mod = rem_small(bi_im1_4,rem_area)
bi_im1_5_mod = rem_small(bi_im1_5,rem_area)
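The three steps of rem_small can be sketched in Python with SciPy (a hypothetical re-implementation for illustration; the original Matlab function is given in Appendix A):

```python
import numpy as np
from scipy import ndimage

def rem_small(binary, rem_area):
    # 1) Label all connected objects in the binary image.
    lbl, n = ndimage.label(binary)
    # 2) Measure each object's area (pixel count per label).
    areas = np.asarray(ndimage.sum(binary > 0, lbl, index=list(range(1, n + 1))))
    # 3) Keep only the objects whose area reaches rem_area.
    big_labels = np.nonzero(areas >= rem_area)[0] + 1
    return np.isin(lbl, big_labels).astype(np.uint8)
```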
An example of the noise-removal result can be seen in the images below: Fig. 41 is the
image before noise removal and Fig. 42 is the resultant image.
FIGURE 41: IMAGE BEFORE NOISE REMOVAL
FIGURE 42: IMAGE AFTER NOISE REMOVAL
8) All the grain-edge parts have now been cleaned of noise and are put together into one
image. The resultant image is then cleaned of noise again.
a) Simple matrix summation is used to create the final image containing the grain
borders:
bi_im_final = bi_im1_2_mod + bi_im1_3_mod + bi_im1_4_mod + bi_im1_5_mod
b) Noise is removed from the resultant image using function rem_small
bi_im_final = rem_small(bi_im_final,rem_area)
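The shift-and-compare principle of steps 4-8 can be condensed in Python with np.roll (a simplified sketch, not the vibr1_1 function itself: it uses one wrap-around shift per direction and a plain threshold; the function name and parameter defaults are assumptions):

```python
import numpy as np

def vibration_edges(im, pix_move=1, pix_difference=10):
    """Shift the image in several directions, difference each shifted copy
    with the original, and mark pixels that changed more than the critical
    amount. Note: np.roll wraps around at the borders; the original code
    avoids this by restricting its loop ranges instead."""
    im = im.astype(np.int16)  # avoid uint8 wrap-around when subtracting
    edges = np.zeros(im.shape, dtype=np.uint8)
    # Horizontal, vertical and the two diagonal shift directions.
    shifts = [(0, pix_move), (pix_move, 0),
              (-pix_move, pix_move), (pix_move, pix_move)]
    for dv, dh in shifts:
        shifted = np.roll(im, (dv, dh), axis=(0, 1))
        edges |= (np.abs(im - shifted) >= pix_difference).astype(np.uint8)
    return edges
```

On a smoothed image, pixels inside a homogeneous grain match their shifted neighbours, so only the border pixels survive the threshold.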
Results
The original grayscale image and the combined final border image are shown in the
figures below.
FIGURE 43: ORIGINAL GRAYSCALE PICTURE
FIGURE 44: GRAIN BORDERS OBTAINED USING VIBRATION SIMULATION
FIGURE 45: METHOD 1
FIGURE 46: METHOD 2
It is seen that the proposed method captures some of the grains ideally, though some of
the grain borders are not fully highlighted. The amount of noise is very small. Compared
with the best-quality edges captured by method 1 and method 2 (Fig. 45, 46), the edges
captured by the current method (Fig. 44) are considerably stronger and have significantly
less noise.
Conclusion
The method produced comparatively good edge-detection results, but still not good
enough to segment the grains with sufficient quality for further analysis of sediment
discharge. It might still provide reasonable segmentation if used in combination with
other methods. From the experiments described above it can be assumed that the
efficiency of the current method, in terms of detected edge strength and amount of noise,
is higher than that of Method 1 and Method 2.
3.0.1.6 Canny edge detector
Description
This is quite a complex method consisting of a large number of stages. Its essence is a
search for local areas with brightness differences, found by filtering along each axis with
a one-dimensional derivative-of-Gaussian filter. To classify the differences as "weak" or
"strong", the Canny method uses two thresholds, a lower and an upper one. "Weak"
edges are marked in the resultant image only if they are connected to "strong" ones. For
noisy images this method provides the best edge detection in comparison with the other
methods, but it demands more time.
[J. Canny, 1986. A Computational Approach to Edge Detection, IEEE Trans. Pattern
Analysis and Machine Intelligence].
Matlab function: BW = edge(I, 'canny', thresh)
The parameter thresh can be a two-element vector, in which case the first element sets
the lower threshold and the second element the upper threshold. If thresh is a scalar, it
sets the upper threshold and 0.4*thresh is used for the lower threshold. If thresh is not
given, or is an empty array, the threshold values are chosen automatically.
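The two-threshold ("hysteresis") step that distinguishes Canny from single-threshold detectors can be sketched as follows (an illustration of the principle, not Matlab's edge implementation; the toy gradient array in the test is an assumption):

```python
import numpy as np
from scipy import ndimage

def hysteresis(gradmag, low, high):
    """Keep weak edge pixels (>= low) only when their connected component
    also contains at least one strong pixel (>= high)."""
    weak = gradmag >= low
    strong = gradmag >= high
    # Label connected regions of the weak mask (4-connectivity).
    lbl, _ = ndimage.label(weak)
    # Labels of components that touch a strong pixel survive.
    keep = np.unique(lbl[strong])
    return np.isin(lbl, keep[keep > 0])
```

An isolated weak response is discarded as noise, while the same response next to a strong edge is kept as part of that edge.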
Results
The images obtained with the Canny edge detector are either over-segmented (fig. 48,
49) or have weak edges (fig. 50). The over-segmentation is probably caused by the high
sensitivity of the detector. It was found experimentally that there is no "thresh"
sensitivity value that avoids over-segmentation while still capturing the grain borders at a
satisfactory level.
FIGURE 47: ORIGINAL IMAGE
FIGURE 48: THRESH = 0.05
FIGURE 49: THRESH = 0.1
FIGURE 50: THRESH = 0.2
Conclusion
Based on the results of this experiment, it can be stated that the Canny filter is too
sensitive for images of this type. When the sensitivity parameter "thresh" reaches a level
at which the grain edges can be detected efficiently, every single detail on the grains is
segmented as well. This causes over-segmentation that makes further analysis of the
picture impossible.
Conclusion on the Program Concept Nr.1
The flow diagram (fig. 9) shows that the primary step of the proposed algorithm is
detection of the grain borders. Detection methods that work to some extent were found
during the set of experiments described above. Unfortunately, the quality of the obtained
borders and segmentation does not allow proceeding to the next step of the proposed
algorithm. The probable reason is a combination of conditions that seriously complicate
the segmentation (lighting conditions, the non-homogeneous surface of the grains, the
absence of a defined background, shadows, etc.).
This does not mean that grain segmentation and edge detection are impossible in this
case, but based on the results they are probably achievable only with extremely
complicated image-processing techniques.
Before tackling such a serious task, it is recommended to consider other methods that
might be easier to implement.
3.0.2 Proposed program concept Nr. 2
Description
It is assumed that, when comparing consecutive frames between which movement has
occurred, the areas where grains have moved will have noticeably different intensity
values.
The idea is to capture and highlight such areas, since ideally they follow the contours of
the moved grains. After obtaining the direction of grain movement and estimating the
volume, the volumetric bed load could be estimated.
Highlight modified areas
1) Absolute difference is obtained using imabsdiff function
im3 = imabsdiff(im1,im2)
2) The values obtained in the previous step are scanned with nested loops:
for ver = 1:(siz(1,1))
parfor hor = 1:(siz(1,2))
3) Each pixel is sequentially compared with a critical value pix_difference
predetermined by the user. If the value is exceeded, the pixel is highlighted in a variable
bi_im.
if im3(ver,hor) >= pix_difference
bi_im(ver,hor) = 1
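The three steps above amount to a thresholded absolute frame difference, sketched here in Python (a vectorised illustration; the variable names follow the text, but the function name and the example frames are assumptions):

```python
import numpy as np

def highlight_changes(im1, im2, pix_difference):
    # imabsdiff equivalent: absolute per-pixel difference of the two frames.
    im3 = np.abs(im1.astype(np.int16) - im2.astype(np.int16))
    # Mark every pixel whose intensity changed by at least the critical value.
    return (im3 >= pix_difference).astype(np.uint8)
```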
Results
During the experiment, only one grain was displaced manually to simulate the effect of
grain movement in order to check the possibility of capturing grain contour efficiently.
FIGURE 51: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN PRESET
CRITICAL LEVEL.
FIGURE 52: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN PRESET
CRITICAL LEVEL.
Based on the results (figures 51, 52) it is seen that the highlighted difference areas (in
red circles) have, to some extent, contours similar to those of the grains.
Conclusion
Considering the amount of noise that accompanies the highlighted contours, there will be
difficulties in obtaining the grain areas efficiently. The most probable cause of the noise
is the change of shading produced by grain movement and the non-homogeneous surface
of the grains.
Thus it is unlikely that the area of a moved grain could be approximated by this method.
Alternative application of the method
It was concluded that this method may be used in a way it was not designed for.
The method does not easily yield precise grain-area information, but it reliably and
efficiently marks the areas where changes have occurred; there is therefore no need to
process the whole image when the area of interest can be reduced significantly with this
method.
That reduction of the area of interest can significantly simplify the image processing
(segmentation, edge detection, etc.) and the identification of the moved grains and their
positions.
For example, instead of trying to identify a moved grain in an image of 1000 grains, the
number could be reduced to 10 grains, and it would obviously be much easier to identify
the required grain in that case. Since a smaller area of interest contains fewer candidate
moved grains, the probability of incorrect grain identification (error) is also significantly
reduced.
The method might also find an application in detecting grains that did not move but only
produced small "shakes", since all displaced grains will be highlighted.
Accidental discovery
It should also be mentioned that, when taking pictures for this experiment, a vibration of
the camera was accidentally induced by pressing the image-capture button. As a result,
when the difference between the frames was highlighted, most of the grain borders
turned out to be highlighted with relatively good quality (this can be seen in Fig. 36).
This accidentally discovered useful effect of the vibration became the basic working
principle of the grain edge detector described starting from page 44.
Discussions
During analysis of the experimental results described above, the following pattern was
noticed: in the tests of the edge-detection-based segmentation methods, when the images
were analysed at relatively low sensitivities at the beginning of the experiments, exactly
the same parts of the grain edges often could not be captured, even though different
detection methods were used. (Method 1 in figure 53 and the Canny filter in figure 54
are shown as examples of edges obtained at low edge-detector sensitivities.)
This might be evidence that the undetected edges are extremely weak.
FIGURE 53: WEAK EDGES, METHOD 1
FIGURE 54: WEAK EDGES, CANNY EDGE DETECTOR
FIGURE 55: OVER SEGMENTATION, CANNY EDGE DETECTOR
FIGURE 56: NOISE CONTAMINATION, METHOD 1
After the sensitivity of each method was increased further, there was a point at which an
excessive amount of noise or over-segmentation (depending on the method) appeared in
the image (Fig. 55 and 56).
Even though the images were heavily contaminated by noise or excessively segmented
(fig. 55, 56), some edges still remained unrevealed. Considering that these observations
were obtained with methods based on completely different working principles (Method 1
and the Canny edge detector), this might be further confirmation of the weak-edges
hypothesis.