2.4    Sediment Discharge Calculation
It is crucial to have efficient methods for computing sediment discharge to effectiv...
Many further works were based on these two Schools. The key element in the formulation
of Einstein is the entrainment rate...
2. As the product of grain velocity, grain volume and the number of moving
           grains per unit area


          ...
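As a quick numeric illustration of this product definition (Python is used here purely for illustration, and every number below is invented for demonstration, not measured in this project):

```python
# Hypothetical illustration: volumetric bed load transport rate as the
# product of mean grain velocity, mean grain volume and the number of
# moving grains per unit bed area. All values are assumed.

grain_velocity = 0.05     # m/s, mean velocity of moving grains (assumed)
grain_volume = 4.2e-9     # m^3, mean volume of one grain (assumed)
grains_per_area = 1.5e4   # moving grains per m^2 of bed (assumed)

# volumetric transport rate per unit bed area, m^3 / (m^2 * s)
q_b = grain_velocity * grain_volume * grains_per_area
print(q_b)
```

This is exactly the quantity the proposed software aims to supply by measuring grain size, count and velocity from images.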
2.7 Bed Load Transport Relations



Most of the bed load relations can be described in a universal form:
                 ...
ideally it should be able to collect all the data necessary to work out the volumetric
transport rate  by detecting each g...
There are two basic ways of digital representation of images (fig.2):
      Raster graphics
      Vector graphics




  ...
The term colour depth designates how many bits are needed to store the colour
information of a single pixel. Depth of c...
2.9.1.4       Disadvantages
      Large file sizes, even for simple images.
      No possibility of ideal (lossless) scaling.
Because of...
Currently, grayscale images with a colour depth of 8 bits/pixel have the widest
application. They can store 256 values of...
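The relation between colour depth and the number of storable intensity values can be sketched in a couple of lines (Python used here purely for illustration):

```python
# Number of distinct intensity values a pixel can store at a given colour depth.
def intensity_levels(bits_per_pixel):
    return 2 ** bits_per_pixel

print(intensity_levels(1))   # binary image: 2 levels
print(intensity_levels(8))   # 8-bit grayscale: 256 levels
print(intensity_levels(24))  # 24-bit truecolour: 16777216 levels
```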
3.0    Methodology

3.0.1 Package used for software development

3.0.1.1      Matlab Overview
MATLAB was selected for writi...
3.0.1.2    Image Processing Toolbox
The Image Processing Toolbox gives scientists, engineers and even artists a wide spectrum ...
   Tools for constructing interactive graphical user interfaces (GUI Builder) to
       accelerate the development of t...
FIGURE 4: SIMPLIFIED MODEL OF THE CHANNEL




A fish tank represents a section of a water channel where sediment transport...
3.1.1 Programming Package features related to software design

3.1.1.1    Systems of co-ordinates
Two coordinate systems a...
imwrite
Writes an image to a file
The function imwrite(S, filename, fmt) writes a binary, grayscale or indexed image ...
3.0.1.3    PARFOR loop
During the experiments a problem arose. Most of the processes in the written functions were
based on t...
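The report addresses this with MATLAB's PARFOR loop, which runs independent loop iterations in parallel. As a rough illustration of the same idea outside MATLAB (this sketch is an assumption, not the report's code), a Python worker pool can map an independent per-row operation over the rows of a toy image:

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative analogue of a parfor-style loop: each row is processed
# independently, so the iterations can be handed to a worker pool.
# The image and the per-row operation are invented for demonstration.

def row_sum(row):
    # stand-in for any per-row computation with no cross-row dependencies
    return sum(row)

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]

with ThreadPoolExecutor(max_workers=4) as pool:
    sums = list(pool.map(row_sum, image))

print(sums)  # [6, 15, 24]
```

As with parfor, this only pays off when the iterations are genuinely independent of each other.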
3.2 Proposed software working principles




[Flow diagram: Detect the grain ... -> Detect the grain ... -> Calculate the ...]
...
3.2.1 Proposed program concept Nr. 1

3.2.1.1    Description
The main purpose of the current experiment is to find out whether the ...
the cast shadow falls on the object, or there is no continuity in the reflective
properties of a surface, etc.
It is qu...
Basic steps


1) The image is read into a variable:
T = imread('imname.bmp')
2) The image size is measured for setting th...
Results




FIGURE 10: ORIGINAL IMAGE
FIGURE 11: SENSITIVITY PARAMETER “DIFFER” = 7




FIGURE 12: ...
by the method of excluding objects by their area using a function called rem_small is
shown. The function was written by the auth...
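The MATLAB source of rem_small is listed in Appendix A. As a hedged illustration of the idea only (this re-implementation is the editor's sketch, not the author's code), the following Python function labels 4-connected foreground components of a binary image and blanks out any component whose pixel area falls below a threshold:

```python
from collections import deque

# Illustrative sketch of area-based noise removal: flood-fill each
# 4-connected foreground component and erase components smaller than
# min_area pixels. Binary image = nested lists of 0/1.

def remove_small(binary, min_area):
    rows, cols = len(binary), len(binary[0])
    out = [row[:] for row in binary]
    seen = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                # collect one connected component with a BFS flood fill
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(comp) < min_area:      # too small: treat as noise
                    for y, x in comp:
                        out[y][x] = 0
    return out
```

For example, with min_area = 2 an isolated 1-pixel speck is erased while a 2-pixel blob survives; MATLAB's own bwareaopen performs the same operation.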
3.0.1.4     Segmentation Method 2


Description
The idea of this experiment is to detect grain borders by comparing cha...
Basic steps


1) The image is read into a variable:
T = imread('imname.bmp')
2) The image size is measured for setting th...
% In the draft these conditions used a single '=', which is assignment and not
% valid MATLAB inside an if; a comparison against the sensitivity threshold is
% evidently intended (assumed to be >=, i.e. "reaches or exceeds differ").
if ROC_diff_hor >= differ
   marked(ver,hor) = 1;
elseif ROC_diff_vert >= differ
   marked(ver,hor) = 1;
end
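For illustration, the same threshold test can be sketched in Python; the scan loop enclosing the MATLAB fragment is not shown in this excerpt, so the loop structure below is an assumption, as are the exact rate-of-change definitions:

```python
# Illustrative sketch of Segmentation Method 2's core test: mark a pixel
# as a border pixel when the intensity change to its horizontal or
# vertical neighbour reaches the sensitivity parameter `differ`.
# Variable names mirror the MATLAB fragment.

def mark_borders(gray, differ):
    rows, cols = len(gray), len(gray[0])
    marked = [[0] * cols for _ in range(rows)]
    for ver in range(rows - 1):
        for hor in range(cols - 1):
            roc_hor = abs(gray[ver][hor + 1] - gray[ver][hor])   # horizontal change
            roc_vert = abs(gray[ver + 1][hor] - gray[ver][hor])  # vertical change
            if roc_hor >= differ or roc_vert >= differ:
                marked[ver][hor] = 1
    return marked
```

Lowering differ makes the detector more sensitive (more marked pixels, more noise), which matches the figures comparing differ = 5, 7 and 10.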




Results




F...
The images above show that without preliminary preparation of the processed
image (such as smoothing, contrast im...
3.0.1.5    Segmentation Method 3



Description
This method is called “Marker-Controlled Watershed Segmentation”.
Developm...
function watershed_segmentation



Basic steps


1) Read the colour image and transform it to grayscale.

...
Iy = imfilter(double(I), hy, 'replicate');
Ix = imfilter(double(I), hx, 'replicate');
gradmag = sqrt(Ix.^2 + Iy.^2);
...
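A hedged pure-Python sketch of the same gradient-magnitude step may help make it concrete (Sobel kernels as returned by MATLAB's fspecial('sobel'); border pixels are simply skipped here rather than replicated, which simplifies the illustration):

```python
import math

# Illustrative gradient-magnitude computation: correlate the image with
# Sobel kernels and take sqrt(Ix^2 + Iy^2) at each interior pixel.

SOBEL_Y = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]   # hy
SOBEL_X = [[1, 0, -1], [2, 0, -2], [1, 0, -1]]   # hx = hy'

def gradient_magnitude(img):
    rows, cols = len(img), len(img[0])
    grad = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = gy = 0.0
            for i in range(3):
                for j in range(3):
                    px = img[r + i - 1][c + j - 1]
                    gx += SOBEL_X[i][j] * px
                    gy += SOBEL_Y[i][j] * px
            grad[r][c] = math.sqrt(gx * gx + gy * gy)
    return grad
```

On a uniform image the result is zero everywhere; across an intensity step the magnitude peaks, which is why the watershed method segments this gradient image rather than the raw intensities.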
FIGURE 25: FOREGROUND MARKING

Next, an opening-by-reconstruction will be calculated using the functions imerode and imreconstruct.


Ie=imerode (...
FIGURE 27: IMAGE CLOSE

Next, the function imdilate is applied together with the function imreconstruct. For
implementation of op...
fgm = imregionalmax(Iobrcbr);
figure, imshow(fgm), title('fgm')




FIGURE 28: IMREGIONALMAX F...
Some of the foreground markers cross the edges of the grains. Markers should be cleaned
and shrunk to allow further proces...
FIGURE 31: THRESHOLD OPERATION



Background pixels are dark; however, it is impossible to perform a simple morphological
oper...
5) Calculate the Watershed Transformation of the Segmentation Function.


Function imimposemin can be applied for exact de...
Displaying the processing results by means of a colour image is also useful. The matrix
which is generated by the functions w...
FIGURE 35: RESULTS IMPOSED ON ORIGINAL IMAGE



A combination of contrast enhancement function adapthisteq and image smoothi...
values of which have changed, it was noticed that the grain borders are highlighted at
some point as well (fig.39).




  ...
pix_move = distance (in pixels) to shift the image
pix_difference = critical pixel difference for highlighting when comparing two frames.
...
5) Absolute pixel differences are calculated between original and shifted images
    a)      Between original and horizont...
The figures below (Fig. 40, Fig. 41, Fig. 42, Fig. 43) show the grain edges obtained
with different shift directions.
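Step 5 above, the shift-and-subtract operation, can be sketched as follows; only the horizontal case is shown, and the function name is the editor's invention, not the report's:

```python
# Illustrative sketch: shift the image horizontally by pix_move pixels and
# take the absolute per-pixel difference with the original, so that edges
# roughly perpendicular to the shift direction light up.

def shift_difference(img, pix_move):
    rows, cols = len(img), len(img[0])
    diff = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols - pix_move):
            diff[r][c] = abs(img[r][c + pix_move] - img[r][c])
    return diff
```

Repeating this for vertical and both diagonal shifts and combining the results gives the multi-direction edge maps shown in the figures.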




FIGU...
    First of all, this function labels all the objects in the image
       Then it measures each object's area
       Fin...
Results
Original grayscale and combined final border images are shown in the figures below




FIGURE 43: ORIGINAL GRAYSCALE P...
Conclusion
The method produced comparatively good edge detection results, but still not good enough to
segment grains with sufficient qu...
that would make it possible to avoid oversegmentation and capture the grain borders at a satisfactory
level.




FIGURE 47: ORIGINAL IM...
Conclusion on the Program Concept Nr.1


The flow diagram (fig. 9) shows that the primary step of the proposed al...
3) Each pixel is sequentially compared with a critical value pix_difference
       predetermined by the user. If the value is ex...
FIGURE 52: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN PRESET
CRITICAL LEVEL.
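The comparison described in step 3 can be sketched in Python (the frame data and the function name are invented for illustration; the report's actual implementation is the MATLAB code in Appendix A):

```python
# Illustrative sketch of Program Concept Nr. 2's change detection: compare
# two consecutive frames pixel by pixel and highlight every pixel whose
# intensity changed by more than the user-set critical value pix_difference.

def highlight_changes(frame_a, frame_b, pix_difference):
    rows, cols = len(frame_a), len(frame_a[0])
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if abs(frame_b[r][c] - frame_a[r][c]) > pix_difference:
                mask[r][c] = 1   # a grain has likely moved through this pixel
    return mask
```

The resulting mask restricts all further processing to regions where motion actually occurred, which is the reduction of the area of interest discussed below.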




Based on the r...
Such a reduction of the area of interest can significantly simplify the image processing
(segmentation, edge detection, etc.) and id...
FIGURE 53: WEAK EDGES, METHOD1              FIGURE 54: WEAK EDGES, CANNY EDGE DETECTOR




FIGURE 55: OVER SEGMENTATION, C...
My "Grain Motion Detection" Project
Please have a look at my Final Year University Project dedicated to the development of grain motion detection software.


Abstract

The purpose of this study is the development of software that measures the dimensions of grains and their speed and displacement. This is required because in the current study of sediment transport there are no efficient ways of measuring these parameters. This work covers the experimental work carried out to evaluate the efficiency of the currently available methods and to provide new ways of solving the problem.

Several popular segmentation techniques were implemented in the program to test their efficiency on the current problem. Most of these methods are based on edge detection; some other methods were used as well. To test these ideas it was necessary to construct a simplified model of the channel bed and to simulate grain movement. The written code was tested under different regulation parameters, and the best results obtained were presented and discussed. The weaknesses of these methods were identified, and those results formed the basis for selecting a further, alternative research direction. After analysis of all the work done, new alternative methods were recommended, and details regarding their implementation were provided in a very descriptive way.
Contents

ABSTRACT
1.0 INTRODUCTION
    Aims and objectives
2.0 LITERATURE REVIEW AND BACKGROUND INFORMATION
    2.1 Sediment definition
    2.2 Subject Relevance
    2.3 Solution of sediment problems
    2.4 Sediment Discharge Calculation
    2.5 Bed Load Transport Definition
        2.5.1 Two Schools of thought
    2.6 Physical description of the bed load transport
        2.6.1 General function
    2.7 Bed Load Transport Relations
    2.8 Possible implementation of theory in software design
        2.8.1 Direct measure method
        2.8.2 Estimation method that uses bed load calculation formulas
    2.9 Digital image properties
        2.9.1 Digital Image definition
            2.9.1.1 Raster graphics
            2.9.1.2 Important characteristics
            2.9.1.3 Advantages
            2.9.1.4 Disadvantages
            2.9.1.5 Formats
            2.9.1.6 Types of raster images
            2.9.1.7 Grayscale images
        2.9.2 Image processing
3.0 METHODOLOGY
    3.0.1 Package used for software development
        3.0.1.1 Matlab Overview
        3.0.1.2 Image Processing Toolbox
    3.1 Experimental Set up
        3.1.1 Programming Package features related to software design
            3.1.1.1 Systems of co-ordinates
        3.0.1.1 List of Matlab commands and functions used in experiments
        3.0.1.2 Cycle operators
        3.0.1.3 PARFOR loop
        3.0.1.4 Converting image from colour to grayscale
    3.2 Proposed software working principles
        3.2.1 Proposed program concept Nr. 1
            3.2.1.1 Description
            3.2.1.2 Background
            3.2.1.3 Segmentation Method 1
            3.0.1.4 Segmentation Method 2
            3.0.1.5 Segmentation Method 3
            3.0.1.5 Segmentation Method 4
            3.0.1.6 Canny edge detector
        Conclusion on the Program Concept Nr.1
    3.0.2 Proposed program concept Nr. 2
        Description
        Highlight modified areas
        Results
        Conclusion
DISCUSSIONS
RECOMMENDATIONS
    Development of the 3d surface recreation method
        3d scanners
        Three-dimensional photo
        Analysis of obtained 3d data
        Conclusion on proposed recommendation
    Alternative recommendation
        Development of the segmentation method based on graph theory
        Method description
        Conclusion on proposed method
CONCLUSIONS OUTLINE
LIST OF REFERENCES
APPENDIX A: MATLAB CODES OF THE PROPOSED METHODS
    1. Function rem_small
    2. Function rate_of_change_scan
    3. Function watershed_segmentation.m
    4. Function horscan_i
    5. Function vibr1_1
    6. Function imabsdiffer.m
List of figures

FIGURE 1: TYPES OF SEDIMENT MOTION
FIGURE 2: VECTOR AND RASTER FORMAT IMAGES
FIGURE 3: VARIATIONS OF GRAYSCALE INTENSITIES
FIGURE 4: SIMPLIFIED MODEL OF THE CHANNEL
FIGURE 5: PIXEL SYSTEM OF COORDINATES
FIGURE 6: ORIGINAL COLOUR IMAGE
FIGURE 7: IMAGE CONVERTED TO GRAYSCALE
FIGURE 8: PROPOSED ALGORITHM NR.1
FIGURE 9: PROPOSED ALGORITHM NR.2
FIGURE 12: ORIGINAL IMAGE
FIGURE 13: SENSITIVITY PARAMETER “DIFFER” = 7
FIGURE 14: SENSITIVITY PARAMETER “DIFFER” = 10
FIGURE 15: SENSITIVITY PARAMETER “DIFFER” = 13
FIGURE 16: NOISE REDUCED BY MEDIAN FILTERING
FIGURE 17: NOISE REDUCED BY REM_SMALL FUNCTION
FIGURE 18: ANALYZED PIXEL LOCATIONS
FIGURE 19: ORIGINAL GRAYSCALE IMAGE
FIGURE 20: SENSITIVITY PARAMETER “DIFFER” = 10
FIGURE 21: SENSITIVITY PARAMETER “DIFFER” = 7
FIGURE 22: SENSITIVITY PARAMETER “DIFFER” = 5
FIGURE 23: SENSITIVITY PARAMETER “DIFFER” = 5
FIGURE 24: SENSITIVITY PARAMETER “DIFFER” = 9
FIGURE 25: OBTAINED GRAYSCALE PICTURE
FIGURE 26: GRADIENT SEGMENTATION
FIGURE 27: FOREGROUND MARKING
FIGURE 28: IMAGE ERODE
FIGURE 29: IMAGE CLOSE
FIGURE 31: IMREGIONALMAX FUNCTION USED
FIGURE 32: IMPOSE MARKERS ON THE IMAGE
FIGURE 33: BWAREAOPEN FUNCTION
FIGURE 34: THRESHOLD OPERATION
FIGURE 35: WATERSHED LINES
FIGURE 36: BORDERS MARKED
FIGURE 37: DISPLAY THE RESULTS
FIGURE 38: RESULTS IMPOSED ON ORIGINAL IMAGE
FIGURE 39: ACCIDENTALLY HIGHLIGHTED BORDERS
FIGURE 40: GRAIN EDGES OBTAINED USING HORIZONTAL SHIFT
FIGURE 41: GRAIN EDGES OBTAINED USING VERTICAL SHIFT
FIGURE 42: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + RIGHT) SHIFT
FIGURE 43: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + LEFT) SHIFT
FIGURE 44: IMAGE BEFORE NOISE REMOVAL
FIGURE 45: FIG.454235
FIGURE 46: ORIGINAL GRAYSCALE PICTURE
FIGURE 47: GRAIN BORDERS OBTAINED USING VIBRATION SIMULATION
FIGURE 48: METHOD 1
FIGURE 49: METHOD 2
FIGURE 50: ORIGINAL IMAGE
FIGURE 51: THRESH = 0.05
FIGURE 52: THRESH = 0.1
FIGURE 53: THRESH = 0.2
FIGURE 54: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN PRESET CRITICAL LEVEL
FIGURE 55: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN PRESET CRITICAL LEVEL
FIGURE 56: WEAK EDGES, METHOD 1
FIGURE 57: WEAK EDGES, CANNY EDGE DETECTOR
FIGURE 58: OVERSEGMENTATION, CANNY EDGE DETECTOR
FIGURE 59: NOISE CONTAMINATION, METHOD 1
FIGURE 60: STRUCTURED LIGHT PRINCIPLE
FIGURE 61: AN EXAMPLE OF THE TRIANGULATION PRINCIPLE
FIGURE 62: AN EXAMPLE OF RESTORATION OF THE THREE-DIMENSIONAL SHAPE OF A SURFACE USING THE STRUCTURED ILLUMINATION METHOD: INITIAL OBJECT (A) AND SHAPE RECONSTRUCTION (A VIEW FROM VARIOUS ANGLES)
FIGURE 63: AN EXAMPLE OF MODELLING THE IMAGE AS A WEIGHTED GRAPH
FIGURE 64: AN EXAMPLE OF A MATRIX OF PAIRED DISTANCES FOR A DOT CONFIGURATION
FIGURE 65: RESULTS OF THE NORMALIZED CUTS METHOD (IMAGE TAKEN FROM ...)
FIGURE 66: RESULT OF NESTED CUTS
FIGURE 67: CONSTRUCTION OF A PYRAMID OF WEIGHTED GRAPHS FOR THE IMAGE
FIGURE 68: COMPARISON OF RESULTS OF THE SWA ALGORITHM, ITS MODIFICATIONS AND NORMALIZED CUTS
1.0 Introduction

In the science of sediment transport there is a common difficulty: the detection of moving particles. Considering the fact that this is one of the basic properties in an accurate description of a sediment transport rate, an efficient method of grain movement detection needs to be developed. According to the description of the project, the required method is limited to image analysis; therefore image processing techniques must be applied to the input images to obtain the necessary grain parameters and dimensions.

A common challenge in the analysis of objects in an image is image segmentation: the splitting of the image into distinct areas according to some attribute. It is supposed that the areas correspond to real objects, or their parts, and that the borders of the areas correspond to the borders of objects. Segmentation plays an important role in image processing and computer vision. Therefore, the major and most important part of creating an image analysis technique is the development of precise and effective ways to segment an image.

The work required for the development of the software is mainly of an experimental nature. Various methods must be tested and compared in terms of their efficiency. Several segmentation techniques are available for testing. Most of them are based on edge detection; some others rely on different working principles. A variety of experiments should be carried out to get a complete picture of the efficiency of the tested methods, as that may lead to an optimal solution of the problem based on the collected experimental statistics.

Aims and objectives

The main aim of the project is the development of software that will be able to effectively obtain the required parameters of the grain particles (such as size, displacement and velocity) in order to use that data to work out the volumetric bed load transport rate.
The objectives of the project are:

 Design and construction of a simplified model of a river bed to obtain the required input data for further analysis.
 Experiment with different factors that could positively or negatively influence further analysis (such as lighting mode, use of a reflector, etc.) and identify the best configuration.
 Development and comparison of different approaches to obtaining the required grain parameters. Test a variety of the proposed methods and, based on the results of the experiments, evaluate the efficiency and prospects of each method's implementation.
 Consider alternative approaches instead of sticking with a single method.
 Analyse the obtained results to work out an optimal solution of the problem.
2.0 Literature review and Background Information

2.1 Sediment definition
River sediment consists of firm mineral particles carried by a stream. River deposits are formed from the products of weathering, denudation and erosion of rocks and soils. Water erosion, the destruction of the terrestrial surface under the influence of flowing water, represents the most active process enriching the rivers with deposits.

There are several phases of sediment movement (Fig. 1):
 When the stream velocity is low, no deposits move, but with an increase in velocity some of the deposits begin to move sporadically by rolling and sliding. This type of movement is called the “contact load of the stream”.
 If the velocity continues to increase, individual sediments will start to make short jumps, leaving the bed for a short time and then returning to rest, or continuing in motion by rolling or further jumping. This type of movement is called the “saltation load of the stream”.
 If the velocity increases even more, the saltation will occur more frequently. Some of the sediments will be kept in suspension by the upward components of flow turbulence for appreciable lengths of time. This type of sediment movement is called the “suspended load of the stream”.

FIGURE 1: TYPES OF SEDIMENT MOTION (HTTP://WWW.DKIMAGES.COM/DISCOVER/PREVIEWS/796/5103379.JPG)
2.2 Subject Relevance

Information on river bedload is essential for designing and building various hydraulic engineering structures — bridge crossings, channels of different functions, dams and other constructions — in order to predict and avoid possible negative effects of sediment.

Some structures in high-velocity streams are damaged by sediment particles in motion. This process can seriously wear the surface of such structures or pavements unless sufficient protection is provided. The damage can be caused by both smaller and larger particles. The chief damage of this nature is to turbines and pumps, where sediment-laden water causes excessive wear on runners, vanes, propellers and other appurtenant parts. (Murthy and Madhaven, 1959)

Deposition in natural or artificial channels and reservoirs can also cause serious problems, and excessive sediment has to be removed for the following reasons. In navigable channels, excessive sediment must be removed to maintain the depths that are crucial for safe shipping. In open channel hydraulics, excessive sediment in natural streams has a large impact on the flood-water capacity of the channel and may result in overflow. To avoid these problems, sediment is removed from problematic channels on a regular basis. (H. Garcia, 2008)

2.3 Solution of sediment problems

At first sight, the optimal solution for preventing sediment transport problems would be to stop the source of erosion and thus prevent new sediment from forming. But bearing in mind the great lengths of rivers, even if the erosion source is stopped, large amounts of sediment will remain in the river for a considerably long time. It is therefore more rational to use protection systems that filter out the sediment and store it in special sediment containers, or that bypass sediment around the risk areas. In some cases more radical methods, such as dredging, are used.
To solve these and other sediment problems, and to assess how effectively a given measure solves them, a clear understanding of the basic principles of sedimentation and hydraulics is needed. (Steven J. Goldman, Katharine Jackson, Taras A. Bursztynsky, "Erosion and sediment control handbook") The basics of bed load transport are discussed in the next chapter.
2.4 Sediment Discharge Calculation

Efficient methods for computing sediment discharge are crucial for planning and designing construction and maintenance works in rivers and canals. At present, however, the available techniques do not allow sediment movement to be predicted and estimated reliably, so in practice engineers cannot use these methods as the main argument in decision making and usually have to rely on their own experience.

It is difficult for an engineer to select a formula to use in calculations, because the results often vary significantly between methods, and it is hard to judge which formula gives the most realistic result unless observations and comparisons of the discharge are made. Many formulas that engineers treat as the most useful and realistic rest only on the subjective experience of the engineer, so comparing the efficiency of the formulas is a major concern. (A. Vanoni, 1975)

2.5 Bed Load Transport Definition

2.5.1 Two Schools of thought

There are two main schools of thought in bed load science, one founded by Ralph Alger Bagnold and the other by Professor Hans Albert Einstein.

Bagnold's (1956) definition is that in bed load transport the contact of the particles with the bed is governed mainly by gravity, whereas suspended particles are lifted by random upward impulses exerted on them by turbulence.

Einstein gives a somewhat different definition. He assumed that bed load transport is the movement of grains in a thin layer with a thickness of a few grain diameters, where grains move by sliding, rolling and making small jumps over distances of a few grain diameters.
Einstein considered that turbulent mixing in the bed layer is too weak to directly influence the movement of sediment, so suspension of particles is not possible in the bed layer. He assumed that any particle in the bed load travels in a series of movements, each of approximately 100 grain diameters, independently of the flow, transport rate and bed characteristics. He treated saltating grains as suspended load when the heights of the jumps were much larger than one or two grain diameters. Bagnold (1956, 1973), in contrast, believed saltation to be the main process driving bed load transport.
Many further works were based on these two schools. The key element in Einstein's formulation is the entrainment rate of particles per unit area as a function of parameters such as shear stress.

2.6 Physical description of the bed load transport

As the main purpose of the project is the design of software, only the basic principles necessary for understanding the topic and implementing it successfully in the software are covered below. More detailed descriptions can be obtained from the referenced sources.

2.6.1 General function

In general, the volumetric bed load transport rate is a function of the boundary shear stress and various parameters of the deposits:

q_b = f(τ_b, sediment parameters)   Eq.1

It can be defined in several ways:

1. As the product of grain velocity, bed load layer thickness and grain concentration:

q_b = u_b · c_b · δ_b   Eq.2

where
q_b – volumetric bed load transport rate per unit width (m²/s)
u_b – velocity of the moving particles (m/s)
c_b – volumetric concentration of particles in the layer (dimensionless)
δ_b – bed load layer thickness (m)
2. As the product of grain velocity, grain volume and the number of moving grains per unit area:

q_b = u_b · V_p · N_a   Eq.3

where
q_b – volumetric bed load transport rate per unit width (m²/s)
u_b – velocity of the moving particles (m/s)
V_p – volume of a particle (m³)
N_a – number of moving grains per unit bed area (1/m²)

The velocity of the particles can also be defined as the ratio of the saltation distance λ to the period of particle movement T, so that

u_b = λ / T   Eq.4

and q_b can then be expressed as

q_b = λ · V_p · N_a / T   Eq.5
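As a quick sanity check on Eq.2 and Eq.3, the two definitions can be evaluated numerically. The sketch below is illustrative only; every numerical value is hypothetical and chosen simply to show that both products have the units of a volumetric transport rate per unit width (m²/s).

```python
# Illustrative numerical check of the two bed load definitions.
# All values here are hypothetical, chosen only to demonstrate the units.

u_b = 0.05       # velocity of moving particles (m/s)
c_b = 0.01       # volumetric concentration in the bed load layer (-)
delta_b = 0.004  # bed load layer thickness (m)

q_b_eq2 = u_b * c_b * delta_b   # Eq.2: (m/s) * (-) * m = m^2/s

V_p = 4.2e-9     # volume of a single particle (m^3)
N_a = 500.0      # number of moving grains per unit bed area (1/m^2)

q_b_eq3 = u_b * V_p * N_a       # Eq.3: (m/s) * m^3 * (1/m^2) = m^2/s

print(q_b_eq2, q_b_eq3)
```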
2.7 Bed Load Transport Relations

Most bed load relations can be written in a universal form:

q* = f(τ*)   Eq.6

where q* is the Einstein bed load number (1950). Its dimensionless form is introduced by:

q* = q_b / (D · √(R g D))   Eq.7

with

Re_p = D · √(R g D) / ν   and   R = (ρ_s − ρ) / ρ   Eq.8, Eq.9

where
q_b is the volumetric transport rate of bed load;
Re_p is the particle Reynolds number (the transition from laminar to turbulent flow occurs when the so-called critical Reynolds number Re_cr is reached: below Re_cr the flow is laminar, and above it turbulence is possible);
R is the submerged specific gravity of the sediment;
g is the acceleration of gravity;
D is the grain diameter;
E is the volumetric sediment entrainment rate per unit area;
L is the particle travel distance;
τ_b is the shear stress produced.

2.8 Possible implementation of theory in software design

Assumptions made about the way the software could meet the requirements:

2.8.1 Direct measure method

The software is required to detect the motion of grains and measure the sizes of the moving grains, in order to work out the bed load transport rate afterwards. Thus if it is functioning
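The dimensionless groups of Eq.7–Eq.9 can be computed directly. The following sketch assumes quartz sand in water; the densities, viscosity, grain diameter and transport rate are typical textbook values adopted for illustration, not measurements from this project.

```python
import math

rho_s, rho = 2650.0, 1000.0   # sediment and water density (kg/m^3), assumed
g = 9.81                      # acceleration of gravity (m/s^2)
D = 0.002                     # grain diameter (m), assumed
nu = 1.0e-6                   # kinematic viscosity of water (m^2/s)

R = (rho_s - rho) / rho                    # Eq.9: submerged specific gravity
Re_p = math.sqrt(R * g * D) * D / nu       # Eq.8: particle Reynolds number

q_b = 1.0e-5                               # transport rate (m^2/s), assumed
q_star = q_b / (D * math.sqrt(R * g * D))  # Eq.7: Einstein bed load number
```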
ideally, it should be able to collect all the data necessary to work out the volumetric transport rate by detecting each grain displacement and estimating the size of each moving grain from its top projection. (For example, the size can be estimated by treating each particle as a sphere, or by using Einstein's (1950) estimate that the distance travelled by each grain is approximately 100 grain diameters.)

2.8.2 Estimation method that uses bed load calculation formulas

In case the program is not able to track the motion of the particles efficiently, it could instead measure the size of each grain in the picture and produce a grain-size distribution. Then, by combining the grain size data with flow characteristics measured simultaneously, it might be possible to estimate the transport by using the various theories and formulas introduced by different scientists.

2.9 Digital image properties

2.9.1 Digital Image definition

As the software input will be presented as digital images, it is important to understand what a digital image is and how data is stored in it. Digital imaging differs from traditional photography in that the optical image is formed on a photosensor instead of photographic material. An image presented digitally is suitable for further computer processing, so digital imaging is often regarded as part of information technology. Digital technologies are used in digital cameras and video cameras, and in fax and copying devices with various photosensors that record and transfer analog or digital signals. Advances in photosensor technology have allowed digital cameras to supersede film cameras in most spheres of application.
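The direct-measure idea of estimating grain size from the top projection can be sketched as follows. Treating each grain as a sphere whose projected area is a circle is an assumption of this illustration, and the measured area is hypothetical:

```python
import math

A = 1.2e-5                        # projected area of one grain (m^2), hypothetical
d = 2.0 * math.sqrt(A / math.pi)  # diameter of the circle with the same area (m)
V = (math.pi / 6.0) * d ** 3      # volume of the equivalent sphere (m^3)
L = 100.0 * d                     # Einstein's (1950) travel distance estimate (m)
```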
There are two basic ways of digitally representing images (fig.2):

• Raster graphics
• Vector graphics

FIGURE 2 – VECTOR AND RASTER FORMAT IMAGES (http://www.edc.uri.edu/criticallands/raster.html)

2.9.1.1 Raster graphics

As the input for the required software will be given as raster images, it is important to understand the principles of raster graphics. A raster image is a grid (raster) whose cells are called pixels. A pixel in a raster image has a strictly defined location and color; hence, any object is represented by the program as a set of colored pixels, and a user working with raster images works on the groups of pixels that make them up. The image is represented as a considerable quantity of points: the more points there are, the better the visual quality of the image, and the larger the file size. In other words, one and the same picture can be presented with good or bad quality depending on the number of points per unit of length, i.e. the resolution (usually dots per inch, dpi, or pixels per inch, ppi).

There are different types of raster images. They differ from each other in the way the color or brightness information of a pixel is represented and stored. Color is formed by mixing several components, which can be specified in various color systems (color spaces).
The term color depth designates how many bits are necessary to store the color information of a pixel. Color depth is measured in bits per pixel. The volume of memory necessary to store a raster image can be calculated by the formula:

V = (X · Y · I) / 8   Eq.10

where
X – number of columns;
Y – number of lines;
I – color depth (bits per pixel);
V – memory volume (bytes).

2.9.1.2 Important characteristics

The important characteristics of a raster image are:

• the number of pixels, which can be specified separately for width and height (1024*768, 640*480, ...) or, more rarely, as the total number of pixels (usually measured in megapixels);
• the number of colors used (the color depth);
• the color space: RGB, CMYK, XYZ, YCbCr, etc.

Raster images are edited by means of raster graphic editors. They are created by cameras and scanners, in raster editors, by export from vector editors, or as screenshots.

2.9.1.3 Advantages

• Raster graphics can create (reproduce) practically any figure regardless of complexity, unlike, for example, vector graphics, where it is impossible to convey precisely the effect of a transition from one color to another (in theory it is possible, but for complex images a 1 MB file in BMP format could take 200 MB in a vector format).
• Prevalence: raster graphics are now used practically everywhere, from small badges to posters.
• High speed of processing of complex images when scaling is not necessary.
• A raster representation of the image is natural for the majority of graphic input/output devices, such as monitors, printers, digital cameras, scanners, etc.
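Eq.10 can be checked with a few familiar image sizes. This is a throwaway sketch, not part of the project code:

```python
def raster_bytes(cols, rows, depth_bits):
    """Memory needed to store an uncompressed raster image (Eq.10)."""
    return cols * rows * depth_bits // 8

print(raster_bytes(1024, 768, 8))   # 8-bit grayscale image, 1024*768 pixels
print(raster_bytes(640, 480, 24))   # 24-bit true colour image, 640*480 pixels
```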
2.9.1.4 Disadvantages

• Large file size even for simple images.
• Impossibility of ideal scaling.

Because of these disadvantages, for storage of simple figures it is recommended to use vector graphics instead of even compressed raster graphics.

2.9.1.5 Formats

Raster images are usually stored in compressed form. Depending on the type of compression, it may or may not be possible to restore the image to the quality it had before compression (lossless or lossy compression, respectively). A graphic file can also store additional information: author-related information, the camera and its settings, the number of points per centimeter for printing, etc.

2.9.1.6 Types of raster images

There are the following types of raster images, each intended for solving a certain range of problems:

• binary
• grayscale
• indexed
• true colour

2.9.1.7 Grayscale images

The grayscale type of image is reviewed here because the input data will be presented as a set of grayscale raster images. So-called grayscale or intensity images are images whose pixels can take one of the intensity values of a single color, in a range from minimal up to maximal intensity (fig.3). Usually it is assumed that an intensity picture stores gradations of grey in a range from black up to white. Therefore intensity pictures are sometimes called "grey" images, or images in gradations of grey, and the term "brightness of a pixel" is used as a synonym of "intensity".
Currently, grayscale pictures with a color depth of 8 bits/pixel have the widest application. They can store 256 values of brightness (from 0 up to 255). Grayscale pictures with a color depth from 2 up to 16 bits/pixel are used less often.

FIGURE 3: VARIATIONS OF GRAYSCALE INTENSITIES (http://www.kumagera.ne.jp/kkudo/grayscale.jpg)

2.9.2 Image processing

As mentioned above, the software is required to measure certain parameters of the grains in the image. To achieve that, a method called "image processing" should be used. Image processing is any form of information processing for which the input data is presented by an image, for example photographs or video frames. Images can be processed either to obtain an image as the output (for example, preparation for polygraphic duplication, for broadcasting, etc.) or to obtain other information (for example, recognition of text, or counting the number and type of cells in the field of a microscope). Besides static two-dimensional images, it may also be necessary to process images changing with time, such as video.

The variety of purposes and problems of image processing can be classified as follows:

• improvement of image quality;
• measurements on images;
• spectral analysis of multidimensional signals;
• image recognition;
• image compression.

(Wikipedia, accessed on 03.03.2009; John C. Russ, The Image Processing Handbook, 2006)
3.0 Methodology

3.0.1 Package used for software development

3.0.1.1 Matlab Overview

MATLAB was selected for writing the code of the required software due to its availability and convenience. Its Image Processing Toolbox makes working with images much simpler.

MATLAB as a programming language was developed at the end of the 1970s. MATLAB (from "Matrix Laboratory") refers both to a package of applied programs for the solution of technical computing problems and to the programming language used in this package. MATLAB is used by more than 1,000,000 engineers and scientists, and it runs on the majority of modern operating systems, including GNU/Linux, Mac OS, Solaris and Microsoft Windows.

MATLAB is a high-level programming language that includes data structures based on matrices, a wide spectrum of functions, an integrated development environment, object-oriented capabilities, and interfaces to programs written in other programming languages.

Programs written in MATLAB can be of two types: functions and scripts. Functions have input and output arguments, as well as their own workspace for storing intermediate results and variables, whereas scripts use the common workspace. Neither scripts nor functions are compiled to machine code; both are kept in the form of text files. There is also the possibility to keep so-called pre-parsed programs, i.e. functions and scripts processed into a form convenient for machine execution. In general, such programs run more quickly than usual ones, especially if a function contains commands for plotting graphs and other figures.

The basic feature of the MATLAB programming language is its extensive support for working with matrices, which the developers of the language have expressed in the slogan "Think vectorized".
3.0.1.2 Image Processing Toolbox

The Image Processing package gives scientists, engineers and even artists a wide spectrum of tools for digital processing and analysis of images. Being closely connected with the MATLAB application development environment, the Image Processing Toolbox helps an engineer avoid long coding and debugging of algorithms, allowing efforts to be concentrated on solving the main scientific or practical problem.

The basic features of the package are:

• restoration and enhancement of image details
• work with a selected region of the image
• image analysis
• linear filtering
• image transformations
• geometrical transformations
• contrast enhancement of important details
• binary transformations
• image processing and statistics
• color transformations
• palette changes
• conversion between image types

The Image Processing package gives wide possibilities for the creation and analysis of graphic representations in the MATLAB environment. It provides a flexible interface allowing images to be manipulated, graphic pictures to be developed interactively, data sets to be visualized, and results to be annotated for descriptions, reports and publications. Its flexibility, and the connection of the package's algorithms with MATLAB's matrix-vector description, make the package very well adapted for solving almost any problem in development and presentation.

MATLAB includes specially developed procedures that raise the efficiency of the graphics core. It is possible to note, in particular, such features as:

• interactive debugging during the development of graphics;
• a profiler for optimizing algorithm execution time;
• interactive graphical user interface construction tools (GUI Builder) for accelerating the development of GUI templates, allowing them to be adjusted to the goals and problems of the user.

This package allows the user to spend much less time and effort on the creation of standard graphic representations and thus to concentrate on the important details and features of the images. MATLAB and the Image Processing Toolbox are highly adaptable for the development and introduction of the user's new ideas and methods. For this purpose there is a set of companion packages directed at all kinds of specific problems, including problems in nonstandard formulations. The Image Processing package is now used intensively in thousands of companies and universities worldwide, across a very broad range of problems, for example space research, military development, astronomy, medicine, biology, robotics, materials science, genetics, etc.

3.1 Experimental Set up

A set of high-quality images of sediment had to be taken to test different ideas and methods during the development of the software. In practice, a video sample of bed load would be recorded at water channels or rivers for further analysis, but for the required experiments a simplified model can be used efficiently. The simplified model consists of (fig.4):

• a rectangular fish tank filled with water, and gravel to the point where the whole tank bottom is uniformly covered with sediment
• a digital camera
• a tripod
• two lighting sources (desk lamps)
• a simple white reflector
FIGURE 4: SIMPLIFIED MODEL OF THE CHANNEL

The fish tank represents a section of a water channel where sediment transport occurs. The camera is mounted on the tripod perpendicular to the surface of the water. The tripod ensures the stability of the camera, so that each shot is made from exactly the same position. This is very important because the images will often be analyzed in pairs, and even minor image displacements can create unnecessary problems. Each camera shot is assumed to be one frame of the video that will be analyzed by the software. To simulate movement of the sediment, some grains are manually moved between camera shots at regular intervals.

All the images will be converted from colour to grayscale, because in reality the input images obtained from high-speed cameras will be given in grayscale mode.

To detect the edges of the grains efficiently, it is important to create lighting that leaves shaded areas in the gaps between the grains; to have little effect on those shadows in the gaps, the lighting source is placed almost parallel to the surface of the sediment. A white reflector is placed opposite the lighting source. It is used to create diffuse light that helps to highlight the surface of the sediment without totally excluding the shadows from the gaps. This configuration was found experimentally and proved to be the most efficient.
3.1.1 Programming Package features related to software design

3.1.1.1 Systems of co-ordinates

Two coordinate systems are used in the IPT (Image Processing Toolbox) package: pixel and spatial. The majority of the package's functions use the pixel coordinate system, a number of functions use the spatial system, and some functions can work with both. When writing one's own scripts that access pixel values in the MATLAB system, only the pixel coordinate system can be used.

The pixel coordinate system is traditional for digital image processing. In it, an image is represented as a matrix of discrete pixels. To access a pixel of the image I, it is necessary to specify the row number r and the column number c at whose intersection the pixel is located: I(r,c). Rows are numbered from top to bottom, and columns from left to right (fig.5). The top left pixel has coordinates (1,1). Only the pixel system will be used in the current software; information about the spatial system of coordinates can be found in the Appendix.

FIGURE 5 - PIXEL SYSTEM OF COORDINATES

3.0.1.1 List of Matlab commands and functions used in experiments

In this part, the main MATLAB and Image Processing Toolbox functions that are necessary for understanding the experimental code are discussed:

imread
Reads an image from a file. The function D = imread(filename, fmt) reads a binary, grayscale or indexed image from the file named filename and places it in array D. If MATLAB cannot find a file with the name filename, a file with the name filename and extension fmt is searched for. The parameters filename and fmt are strings.
imwrite
Writes an image to a file. The function imwrite(S, filename, fmt) writes a binary, grayscale or indexed image S to a file with the name filename. The file format is defined by the parameter fmt. The parameters filename and fmt are strings.

adapthisteq
The function J = adapthisteq(I) improves the contrast of a grayscale picture I by transforming the values of its elements using contrast-limited adaptive histogram equalization (CLAHE). The CLAHE method works more effectively on small local neighbourhoods of an image than on full images. Contrast, especially in homogeneous neighbourhoods, should be limited in order to avoid amplifying the noise component.

medfilt2
The function q = medfilt2(q, [A B]) performs median filtering with a filter kernel of size A-by-B pixels. The filtering eliminates salt-and-pepper noise in the image q in the following way: all pixel values in the working area of the kernel are sorted in order of increasing brightness, and the middle element of the sorted row is assigned to the central pixel.

imabsdiff
The function Z = imabsdiff(X, Y) subtracts each element of image Y from the corresponding element of image X and places the absolute difference of these elements into the resulting variable Z.

3.0.1.2 Cycle operators

Similar and repeating operations are carried out by means of the loop operators "for" and "while". The for loop is intended for performing a predetermined number of repeated operations; while is for operations where the required number of repetitions is unknown but the condition for continuing the loop is known.
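To make the roles of medfilt2 and imabsdiff concrete, here is a minimal pure-Python sketch of the same two operations on images stored as nested lists. This is an illustration of the principle, not the Toolbox implementation:

```python
def imabsdiff(X, Y):
    """Element-wise absolute difference of two equal-sized images."""
    return [[abs(x - y) for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def medfilt3x3(img):
    """3x3 median filter; border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = sorted(img[r + dr][c + dc]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1))
            out[r][c] = window[4]  # middle of the 9 sorted values
    return out
```

A lone bright "salt" pixel surrounded by dark pixels is replaced by the median of its neighbourhood, which is exactly the behaviour the MATLAB description above relies on.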
3.0.1.3 PARFOR loop

During the experiments a problem arose: most of the processes in the written functions were based on loops, and it took more than 10 hours to process an image using standard loops. That is why the PARFOR loop was introduced. The general purpose of the PARFOR loop is not to run a loop sequentially, but to divide it into independent parts and run them in parallel. This results in a significant increase of the image processing speed; in the current project it helped to reduce more than 10 hours of processing time to a satisfactory 3-10 minutes. More detailed information about the parfor loop can be found in the MATLAB help or on the MathWorks website (http://www.mathworks.com).

3.0.1.4 Converting image from colour to grayscale

As previously stated, colour images were taken for the experiments in the simplified model of a channel. Since high-speed cameras that produce grayscale images are going to be used in real-life conditions, it is necessary to convert the obtained colour pictures to grayscale. To do that, a function called "mrgb2gray" (written by Kristian Sveen, 10 Sep 2004) was taken from the MATLAB Central website. A detailed description of the code can be found at the hyperlink below:
http://www.mathworks.com/matlabcentral/fileexchange/5855
The figures below show the original colour image and the converted grayscale image. All the experiments will be done using the grayscale image (fig.7).

FIGURE 6: ORIGINAL COLOUR IMAGE
FIGURE 7: IMAGE CONVERTED TO GRAYSCALE
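The colour-to-grayscale conversion itself is a weighted sum of the R, G and B channels. The sketch below uses the standard ITU-R BT.601 luma weights, which are the coefficients used by MATLAB's own rgb2gray; whether mrgb2gray uses exactly these coefficients is an assumption here:

```python
def rgb_to_gray(r, g, b):
    """Convert one RGB pixel to a grayscale intensity (BT.601 luma weights)."""
    return 0.2989 * r + 0.5870 * g + 0.1140 * b
```

The weights reflect the eye's greater sensitivity to green, so a pure green pixel maps to a brighter grey than a pure red one.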
3.2 Proposed software working principles

Proposed algorithm Nr.1 (fig.8):
1. Detect the grain edges on the first image and on the second image.
2. Segment and label each grain in both images.
3. Compare the two images and identify the grains that have moved.
4. Measure the area of the moved grains.
5. Approximate the volume of those grains.
6. Work out the volumetric movement rate of the sediment.

Proposed algorithm Nr.2 (fig.9):
1. Calculate the absolute difference between frames 1 and 2 and highlight the areas where values have changed.
2. Identify which of the highlighted areas is the starting point of a grain movement and which is the stopping point of the grain.
3. Estimate the area of the grain from the highlighted area and approximate the volume of the grain.
4. Work out the volumetric flow rate of the sediment.

FIGURE 8: PROPOSED ALGORITHM NR.1
FIGURE 9: PROPOSED ALGORITHM NR.2
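The first steps of algorithm Nr.2 (absolute frame difference, then identification of the changed areas) can be sketched in a few lines. This is an illustrative pure-Python version using nested lists and 4-connected flood fill; the real software works on MATLAB matrices with Toolbox functions:

```python
def diff_regions(frame1, frame2, thresh):
    """Absolute frame difference, threshold, then label 4-connected regions."""
    h, w = len(frame1), len(frame1[0])
    mask = [[abs(frame1[r][c] - frame2[r][c]) > thresh for c in range(w)]
            for r in range(h)]
    labels = [[0] * w for _ in range(h)]
    nlab = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] and labels[r][c] == 0:
                nlab += 1                      # start a new changed region
                stack = [(r, c)]
                while stack:                   # flood-fill the region
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x] and labels[y][x] == 0:
                        labels[y][x] = nlab
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, nlab
```

Each labelled region is a candidate "grain left here" or "grain arrived here" area for the pairing step of the algorithm.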
3.2.1 Proposed program concept Nr. 1

3.2.1.1 Description

The main purpose of the current experiment is to find out whether segmentation based on object edge detection is efficient in the current situation. Considering that the provided image cannot be classified as easy to segment, due to its properties (non-homogeneity of the grain surfaces, non-sharp edges, shading, etc.), many different methods of edge detection are used in the set of experiments. Additionally, a method of "watershed segmentation" was tested together with the edge detectors.

3.2.1.2 Background

Grain detection

In this method, grain detection and segmentation is the key problem that needs to be solved. Once an appropriate solution to this problem is found, work on the other blocks of the algorithm can be started.

Segmentation

Image segmentation is the division or splitting of an image into regions by similarity of the properties of their points. Primary allocation of the required objects in the initial grayscale image by means of a segmentation transformation is one of the basic stages of image analysis. The most widely used transformations are brightness-based and contour-based; some researchers also include textural segmentation among the basic methods. According to this classification, regions are allocated in the process of segmentation on the basis of an estimate of conformity to some similarity criterion: either the brightness of each point, or the first derivative of brightness in some specified neighbourhood of each point, or any of the textural characteristics of the brightness distribution in the specified neighbourhood of a point. (C. Gonsales, 2004)

Edges

Edges are curves in the image along which there is a sharp change of brightness or of its derivatives with respect to the spatial variables. The changes of brightness that reflect important features of the represented surface are the most interesting.
These include places where the surface orientation changes in steps, where one object occludes another, where the border of a cast shadow falls on the object, or where there is a discontinuity in the reflective properties of a surface, etc. Naturally, noisy brightness measurements limit the possibility of extracting information about edges. A contradiction arises between sensitivity and accuracy: short edges must possess higher contrast than long ones in order to be distinguished. Edge detection can be considered a complement to image segmentation, since edges can be used for splitting images into the areas corresponding to different surfaces.

3.2.1.3 Segmentation Method 1

Description

This experiment was the first, and its primary purpose was to check how efficient the simplest method of edge detection is; the secondary objective was to test how quickly the MATLAB package processes images using the simplest algorithms. The idea is that ideal borders show a rapid change in grayscale intensity (brightness). The concept of the method is the comparison of the grey intensities of each adjacent pair of pixels in one image in the vertical, horizontal and diagonal directions. If a certain predetermined difference is reached, the pixels are marked as ones that belong to a grain edge.

function horscan_i

Input
• "T" - the image or video frame that needs to be analyzed
• "differ" - the maximum intensity difference between adjacent pixels (sensitivity of the edge scanner)

Output
• "marked" - a variable that contains an image of the grain borders obtained by the current method
Basic steps

1) The image is read into a variable:
T = imread('imname.bmp')

2) The image size is measured, for setting the loop step counts later on:
siz = size(T)

3) The critical difference value between adjacent pixels is set:
differ = 8

4) All pairs of adjacent pixels are checked for reaching the critical intensity difference value set in the previous step.

a) The checking process is a loop in which each pair of adjacent pixels is processed in sequence. When a whole horizontal line is fully processed, the next line is analyzed:
for hor = 1:(siz(1,2)-1)
    parfor ver = 1:(siz(1,1)-1)

b) The difference between adjacent pixels in the horizontal direction is computed and checked against the preset "differ" value. If it is reached, the pixel is marked as an edge in a variable that stores the coordinates of the grain edges:
if (max(T(ver,hor), T(ver,hor + 1)) - min(T(ver,hor), T(ver,hor + 1))) >= differ
    marked(ver,hor) = 1

c) The difference between adjacent pixels in the vertical direction is computed and checked against the preset "differ" value. If it is reached, the pixel is marked as an edge:
elseif (max(T(ver,hor), T(ver + 1,hor)) - min(T(ver,hor), T(ver + 1,hor))) >= differ
    marked(ver,hor) = 1

d) The difference between adjacent pixels in the diagonal direction is computed and checked against the preset "differ" value. If it is reached, the pixel is marked as an edge:
elseif (max(T(ver,hor), T(ver + 1,hor + 1)) - min(T(ver,hor), T(ver + 1,hor + 1))) >= differ
    marked(ver,hor) = 1
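The steps above amount to thresholding the intensity step between each pixel and its right, lower, and lower-right neighbours. A compact pure-Python equivalent is given below for illustration only; the project code is the MATLAB version above:

```python
def edge_scan(img, differ):
    """Mark a pixel when the intensity step to its right, lower, or
    lower-right neighbour reaches the sensitivity threshold `differ`.
    The last row and column have no such neighbours and stay unmarked."""
    h, w = len(img), len(img[0])
    marked = [[0] * w for _ in range(h)]
    for r in range(h - 1):
        for c in range(w - 1):
            steps = (abs(img[r][c] - img[r][c + 1]),      # horizontal
                     abs(img[r][c] - img[r + 1][c]),      # vertical
                     abs(img[r][c] - img[r + 1][c + 1]))  # diagonal
            if max(steps) >= differ:
                marked[r][c] = 1
    return marked
```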
Results

FIGURE 10: ORIGINAL IMAGE
FIGURE 11: SENSITIVITY PARAMETER “DIFFER” = 7
FIGURE 12: SENSITIVITY PARAMETER “DIFFER” = 10
FIGURE 13: SENSITIVITY PARAMETER “DIFFER” = 13

As seen in the figures above, the method works to some extent. It captures some of the grain borders, and the number of captured borders depends on the “differ” parameter. Decreasing this parameter increases sensitivity, so more borders are captured; but the higher sensitivity also brings more noise along with the better borders. The noise may be acceptable as long as the number of noise particles still allows it to be excluded efficiently and distinguished from the borders. Noise can potentially be excluded by size, solidity and other parameters that MATLAB can work with. Figure 14 below is an example of significant noise reduction even on images obtained at high sensitivity (“differ = 7”); in this case the noise was reduced by smoothing the original image with the function medfilt2 (Appendix A). Figure 15 shows an example of noise reduced
by excluding objects by their area with a function called rem_small, written by the author of this report (the function code can be found in Appendix A).

FIGURE 14: NOISE REDUCED BY MEDIAN FILTERING
FIGURE 15: NOISE REDUCED BY REM_SMALL FUNCTION

It can be seen that some of the grains are heavily contaminated by noise regardless of the sensitivity parameter. The reason is the non-homogeneous surface of those grains (this is clearly visible in the original image).

Conclusion

Despite all its disadvantages, the method captures some of the borders and might find an application. It could work efficiently in combination with other methods of grain border detection. It can be used either with low sensitivity, and thus little or no noise, or with higher sensitivity and more noise but with efficient measures that separate grain borders from noise particles.
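The area-based noise filtering attributed to rem_small (label the objects, measure each object's area, drop the small ones) can be sketched in pure Python/NumPy. This is a hypothetical reimplementation of the behaviour described in the text, not the author's Appendix A code:

```python
import numpy as np
from collections import deque

def rem_small_sketch(mask, min_area):
    """Remove 8-connected components smaller than min_area from a
    binary mask -- a sketch of what rem_small is described to do."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    out = np.zeros_like(mask)
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                # flood-fill one 8-connected component
                comp, q = [], deque([(i, j)])
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny, nx] and not seen[ny, nx]):
                                seen[ny, nx] = True
                                q.append((ny, nx))
                if len(comp) >= min_area:      # keep only large objects
                    for y, x in comp:
                        out[y, x] = 1
    return out
```

In MATLAB the same effect is obtained with bwlabel/regionprops or, more simply, bwareaopen.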
3.0.1.4 Segmentation Method 2

Description

The idea of this experiment is to detect grain borders by comparing the rates of change of pixel intensity around a control pixel. It was assumed that the rate of change of grey intensity changes rapidly at an object border.

FIGURE 16: ANALYZED PIXEL LOCATIONS

First, the rate of change of intensity from pixel A1 to the central pixel B1 is calculated as the numerical difference between the pixel values. The same is then done between pixels B1 and C1. Finally, the rate of change on the left of the middle pixel is compared with the rate of change on its right. If the rates of change differ by more than a reasonable amount (defined manually by the user), the middle pixel B1 is marked as a pixel belonging to the border. The same sequence is used to calculate the vertical rates of change.

MATLAB algorithm based on the proposed method

function rate_of_change_scan

Input
• T - the image or video frame to be analyzed
• differ - the rate-of-change difference at which the central pixel is marked as an edge (sensitivity of the edge scanner)

Output
• marked - a variable containing an image of the grain borders obtained by the current method.
Basic steps

1) The image is read into a variable:
T = imread('imname.bmp')

2) The image size is measured to set the number of loop iterations later on:
siz = size(T)

3) The critical rate of change of intensity is set; the lower its value, the more sensitive the edge detector:
differ = 8

4) With the aid of loops,
for ver = 2:(siz(1,1) - 1);
parfor hor = 2:(siz(1,2) - 1);
each pixel in turn becomes the control pixel and its rates of change with the adjacent pixels are checked.

a) The rate of change with the pixel on the left is calculated:
ROC_left = T(ver,hor) - T(ver,hor - 1)

b) The rate of change with the pixel on the right:
ROC_right = T(ver,hor) - T(ver,hor + 1)

c) The rate of change with the pixel below:
ROC_down = T(ver,hor) - T(ver + 1,hor)

d) The rate of change with the pixel above:
ROC_up = T(ver,hor) - T(ver - 1,hor)

5) The difference of the intensity change rates around the control point is calculated,

a) first in the horizontal direction:
ROC_diff_hor = imabsdiff(ROC_left,ROC_right)

b) then in the vertical direction:
ROC_diff_vert = imabsdiff(ROC_up,ROC_down)

6) The difference obtained in the previous step is compared with the critical value set by the user. If the critical value is reached, the central control pixel is marked in a variable “marked” that highlights the pixels belonging to the border.
if ROC_diff_hor >= differ
marked(ver,hor) = 1;
elseif ROC_diff_vert >= differ
marked(ver,hor) = 1;

Results

FIGURE 17: ORIGINAL GRAYSCALE IMAGE
FIGURE 18: SENSITIVITY PARAMETER “DIFFER” = 10
FIGURE 19: SENSITIVITY PARAMETER “DIFFER” = 7
FIGURE 20: SENSITIVITY PARAMETER “DIFFER” = 5
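The rate-of-change comparison above amounts to a discrete second difference around each control pixel. A NumPy sketch of the same logic (an illustration only; the function name and vectorization are my own):

```python
import numpy as np

def rate_of_change_edges(img, differ=8):
    """Mark pixels where the intensity gradient on one side differs
    from the gradient on the other side by at least `differ`
    (Method 2 logic: a discrete second difference per direction)."""
    img = img.astype(np.int16)                 # avoid uint8 wrap-around
    c = img[1:-1, 1:-1]                        # control pixels
    roc_left  = c - img[1:-1, :-2]
    roc_right = c - img[1:-1, 2:]
    roc_up    = c - img[:-2, 1:-1]
    roc_down  = c - img[2:, 1:-1]
    hor  = np.abs(roc_left - roc_right) >= differ
    vert = np.abs(roc_up - roc_down) >= differ
    marked = np.zeros(img.shape, dtype=np.uint8)
    marked[1:-1, 1:-1] = (hor | vert).astype(np.uint8)
    return marked
```

Note that a uniform intensity ramp gives equal rates of change on both sides and is therefore not marked, which is exactly what distinguishes this detector from the simple pairwise difference of Method 1.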
It can be seen from the images above that, without preliminary preparation of the processed image (smoothing, contrast improvement, etc.), the method highlights edges efficiently only together with a large amount of noise, at a high sensitivity set by the user. The situation is very similar to the results of Experiment 1: as sensitivity increases, the amount of noise increases as well, so similar measures are needed to cope with it. After a number of tests, some noise-reduction methods proved relatively efficient in this case:

• Preliminary smoothing of the original image with the function medfilt2; the filter neighbourhood was 6x6 pixels (figure 23).
• A combination of contrast enhancement with the function adapthisteq and image smoothing with the function medfilt2 (figure 24).

FIGURE 21: SENSITIVITY PARAMETER “DIFFER” = 5
FIGURE 22: SENSITIVITY PARAMETER “DIFFER” = 9

Conclusion

Compared with the method described in Experiment 1, both methods have similar grain-edge capturing quality. Used on its own, this method does not yield edges of even satisfactory quality for segmenting the grains. Thus the current method cannot be used independently but, again, it might be combined with other edge scanners and provide those parts of the edges that an edge detector built on a different concept could not capture.
3.0.1.5 Segmentation Method 3

Description

This method is called “Marker-Controlled Watershed Segmentation”. The development of image-processing technology has led to new approaches to image segmentation and to their application in many practical problems. In this experiment a relatively new approach to the segmentation problem is considered: the watershed method. Briefly, the name of the method and its essence are as follows. The image is treated as a topographic map in which the brightness values represent heights above some level. If this terrain is flooded with water, pools form; as the water rises further, the pools merge, and the places where they merge are marked as watershed lines. Separating adjacent objects in an image is one of the important problems of image processing, and so-called Marker-Controlled Watershed Segmentation is often used to solve it. In this transformation, catchment basins and watershed lines are defined on the image by processing local areas according to their brightness characteristics.

MATLAB algorithm based on the proposed method

In this experiment the instructions described on the website below were followed, and parts of the code were copied and used:
http://www.mathworks.com/products/image/demos.html?file=/products/demos/shipping/images/ipexwatershed.html
function watershed_segmentation

Basic steps

1) Read the colour image and convert it to grayscale. The data are read from a file:
rgb = imread('G:\Matlab\tested_image.jpg');
and presented as a grayscale picture:
I = rgb2gray(rgb);
imshow(I)
text(732,501,'Image courtesy of Corel(R)',...
'FontSize',7,'HorizontalAlignment','right')

FIGURE 23: OBTAINED GRAYSCALE PICTURE

2) Use the gradient magnitude as the segmentation function. The gradient is computed with the Sobel edge mask, the function imfilter and a few further calculations. The gradient is large on object borders and (in most cases) small outside the edges of objects.
hy = fspecial('sobel');
hx = hy';
Iy = imfilter(double(I), hy, 'replicate');
Ix = imfilter(double(I), hx, 'replicate');
gradmag = sqrt(Ix.^2 + Iy.^2);
figure, imshow(gradmag, []), title('Gradient magnitude')

FIGURE 24: GRADIENT SEGMENTATION

3) Mark the foreground objects. Various procedures can be used for marking the foreground; here, morphological techniques called opening-by-reconstruction and closing-by-reconstruction are used. These operations make it possible to analyze the interior of image objects with the function imregionalmax. As mentioned above, morphological operations are also used when marking the foreground objects; some of them will be considered and compared. First, opening with the function imopen is performed:
se = strel('disk', 20);
Io = imopen(I, se);
figure, imshow(Io), title('Io')
FIGURE 25: FOREGROUND MARKING

Next, opening-by-reconstruction is computed with the functions imerode and imreconstruct:
Ie = imerode(I, se);
Iobr = imreconstruct(Ie, I);
figure, imshow(Iobr), title('Iobr')

FIGURE 26: IMAGE ERODE

The subsequent morphological opening and closing remove the dark spots and form the markers. Morphological closing is analyzed below; for this purpose the function imclose is used first:
Ioc = imclose(Io, se);
figure, imshow(Ioc), title('Ioc')
FIGURE 27: IMAGE CLOSE

Then the function imdilate is applied together with the function imreconstruct. For imreconstruct the images must first be complemented:
Iobrd = imdilate(Iobr, se);
Iobrcbr = imreconstruct(imcomplement(Iobrd), imcomplement(Iobr));
Iobrcbr = imcomplement(Iobrcbr);
figure, imshow(Iobrcbr), title('Iobrcbr')

FIGURE: IMDILATE

A comparative visual analysis of Iobrcbr and Ioc shows that the reconstruction-based opening and closing is more effective than the standard opening and closing operations. The local maxima of Iobrcbr are now computed to obtain the foreground markers.
fgm = imregionalmax(Iobrcbr);
figure, imshow(fgm), title('fgm')

FIGURE 28: IMREGIONALMAX FUNCTION USED

The foreground markers are superimposed on the initial image:
I2 = I;
I2(fgm) = 255;
figure, imshow(I2), title('fgm superimposed on the initial image')

FIGURE 29: IMPOSE MARKERS ON THE IMAGE

Some hidden or occluded objects in the image are not marked. This affects the result: such objects will not be processed from the segmentation point of view. Thus, even in ideal conditions, the foreground markers outline only the majority of the objects, not all of them.
Some of the foreground markers cross the edges of the grains. The markers should be cleaned and shrunk to allow further processing; in particular, morphological operations can do this:
se2 = strel(ones(5, 5));
fgm2 = imclose(fgm, se2);
fgm3 = imerode(fgm2, se2);
As a result of this operation, separate isolated pixels of the image disappear. It is also possible to use the function bwareaopen, which removes objects smaller than a set number of pixels:
fgm4 = bwareaopen(fgm3, 20);
I3 = I;
I3(fgm4) = 255;
figure, imshow(I3)
title('fgm4 superimposed on the initial image')

FIGURE 30: BWAREAOPEN FUNCTION

4) Compute the background markers. Now the background is marked. In the image Iobrcbr the dark pixels belong to the background, so threshold processing can be applied:
bw = im2bw(Iobrcbr, graythresh(Iobrcbr));
figure, imshow(bw), title('bw')
FIGURE 31: THRESHOLD OPERATION

The background pixels are dark, but it is not enough to perform simple morphological operations on the background markers to obtain the borders of the segmented objects. The background is therefore thinned to obtain a reliable skeleton of the image. It is computed with the watershed approach applied to a distance transform (distances to the watershed lines):
D = bwdist(bw);
DL = watershed(D);
bgm = DL == 0;
figure, imshow(bgm), title('bgm')

FIGURE 32: WATERSHED LINES
5) Compute the watershed transform of the segmentation function. The function imimposemin can be applied for the exact definition of the local minima of the image. On this basis imimposemin can also correct the gradient values of the image and thus fix the positions of the foreground and background markers:
gradmag2 = imimposemin(gradmag, bgm | fgm4);
Finally, the watershed-based segmentation is carried out:
L = watershed(gradmag2);

6) Visualize the processing result. The superimposed foreground markers, background markers and borders of the segmented objects are displayed on the initial image:
I4 = I;
I4(imdilate(L == 0, ones(3, 3)) | bgm | fgm4) = 255;
figure, imshow(I4)
title('Markers and object borders superimposed on the initial image')

FIGURE 33: BORDERS MARKED

This display makes it possible to visually analyze the location of the foreground and background markers.
It is also useful to display the processing results as a colour image. The label matrix generated by the functions watershed and bwlabel can be converted into a truecolor image with the function label2rgb:
Lrgb = label2rgb(L, 'jet', 'w', 'shuffle');
figure, imshow(Lrgb)
title('Lrgb')

FIGURE 34: DISPLAY THE RESULTS

A translucent mode can also be used to superimpose the pseudo-colour label matrix on the initial image:
figure, imshow(I), hold on
himage = imshow(Lrgb);
set(himage, 'AlphaData', 0.3);
title('Lrgb superimposed on the initial image in translucent mode')
FIGURE 35: RESULTS SUPERIMPOSED ON ORIGINAL IMAGE

A combination of the contrast enhancement function adapthisteq and the image smoothing function medfilt2 was used as preliminary processing, since under the current conditions it gave the best segmentation result of all the variants tested.

Conclusion

It can be seen from Fig. 38 that only a few of the grains are segmented properly, and some of the captured grains are over-segmented. This unimpressive grain-capture capability can be explained by non-uniform lighting, the absence of a well-defined background and the non-homogeneous surface of most grains. The method might work with different efficiency depending on the conditions (lighting, background, etc.).

3.0.1.5 Segmentation Method 4

Description

This method is interesting in that it was discovered by accident. During experiments on the pixel-value difference method (described starting from page 55), when comparing two consecutive frames and highlighting the areas
whose values had changed, it was noticed that at some point the grain borders were highlighted as well (fig. 39).

FIGURE 36: ACCIDENTALLY HIGHLIGHTED BORDERS

Analysis of the reasons for the highlighted grain borders showed that the two processed frames had been taken from slightly different perspectives: the camera was accidentally moved by a few millimetres when the image-capture button was pressed, which shifted the second image relative to the first. It was assumed that such a shift could be simulated in the MATLAB environment to build an edge detector based on this idea. The main idea is to shift the image by a few pixels, compare it with the original image, and highlight those pixel-value differences that exceed a critical value set by the user. If the picture is smoothed beforehand, the grains become relatively homogeneous and the grain edge lines can be highlighted with little noise.

MATLAB algorithm based on the proposed method

function [ dif ] = vibr1_1(imnam1,pix_move,pix_difference,rem_area)

Input
imnam1 = image name
rem_area = area of particles to be filtered out
pix_move = distance to shift the image, in pixels
pix_difference = critical pixel difference for highlighting when comparing two frames

Output
dif = variable containing the grain borders obtained by the current method

Basic steps

1) The image is read into a variable:
im1 = imread(imnam1)

2) The image size is measured to set the number of loop iterations later on:
siz = size(im1)

3) The critical difference value between corresponding pixels is set:
pix_difference = 8

4) The shifted images are created and saved into predetermined variables.

a) The image shifting process is a loop that shifts the pixels in all the required directions one by one and saves the results into different variables, so that the borders obtained from all the shift directions can be analyzed and combined later:
for ver = (pix_move+1):(ver_size - pix_move)
pix_move_ver_down = ver - pix_move
pix_move_ver_up = ver + pix_move
parfor hor = (pix_move+1):(hor_size - pix_move)

b) Image shifted in the horizontal direction:
im2(ver,hor + pix_move) = im1(ver,hor);

c) Image shifted in the vertical direction:
im3(pix_move_ver_up,hor) = im1(ver,hor);

d) Image shifted in the diagonal direction (down + right):
im4(pix_move_ver_down,hor + pix_move) = im1(ver,hor);

e) Image shifted in the diagonal direction (up + right):
im5(pix_move_ver_up,hor + pix_move) = im1(ver,hor);
5) Absolute pixel differences are calculated between the original and the shifted images:

a) between the original and the horizontally shifted image
im_difference1_2 = imabsdiff(im1,im2);

b) between the original and the vertically shifted image
im_difference1_3 = imabsdiff(im1,im3);

c) between the original and the diagonally (down + right) shifted image
im_difference1_4 = imabsdiff(im1,im4);

d) between the original and the diagonally (up + right) shifted image
im_difference1_5 = imabsdiff(im1,im5);

6) The difference between the images is checked against the critical value; where it is reached, the pixel is marked as a border pixel.

a) The process is a loop,
for ver = (pix_move+1):(ver_size - pix_move)
pix_move_ver_down = ver - pix_move
pix_move_ver_up = ver + pix_move
parfor hor = (pix_move+1):(hor_size - pix_move)
in which the pixels are sequentially checked for reaching the critical value predetermined by the user. The variable pix_move is used to undo the coordinate shift and so exclude the doubling of grain borders caused by the image shifting.

b) The differences of all the images are checked for reaching the critical value and the required pixels are marked:
if im_difference1_2(ver,hor) >= pix_difference
bi_im1_2(ver,hor - pix_move) = 1;
elseif im_difference1_3(ver,hor) >= pix_difference
bi_im1_3(pix_move_ver_down,hor) = 1;
elseif im_difference1_4(ver,hor) >= pix_difference
bi_im1_4(pix_move_ver_up,hor - pix_move) = 1;
elseif im_difference1_5(ver,hor) >= pix_difference
bi_im1_5(pix_move_ver_down,hor - pix_move) = 1;
The figures below (Fig. 40, Fig. 41, Fig. 42, Fig. 43) show the grain edges obtained with the different shift directions.

FIGURE 37: GRAIN EDGES OBTAINED USING HORIZONTAL SHIFT
FIGURE 38: GRAIN EDGES OBTAINED USING VERTICAL SHIFT
FIGURE 39: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + RIGHT) SHIFT
FIGURE 40: GRAIN EDGES OBTAINED USING DIAGONAL (VERTICAL + LEFT) SHIFT

7) From the figures above it is seen that the obtained edge-part images contain some amount of noise. As all of these grain edge parts will be combined later, it is preferable to remove the noise now to avoid noise accumulation in the final image.

a) For this purpose the function rem_small was written (Appendix A).
• First, this function labels all the objects in the image.
• Then it measures each object's area.
• Finally, it excludes the objects whose area is less than a predetermined value.

b) Noise is removed from each image using the function rem_small described above:
bi_im1_2_mod = rem_small(bi_im1_2,rem_area)
bi_im1_3_mod = rem_small(bi_im1_3,rem_area)
bi_im1_4_mod = rem_small(bi_im1_4,rem_area)
bi_im1_5_mod = rem_small(bi_im1_5,rem_area)

An example of the noise removal result can be seen in the images below: Fig. 44 shows the image before noise removal and Fig. 45 the resultant image.

FIGURE 41: IMAGE BEFORE NOISE REMOVAL
FIGURE 42: IMAGE AFTER NOISE REMOVAL

8) All the grain edge parts, now cleaned of noise, are combined into one image, and the resultant image is cleaned of noise once more.

a) Simple summation of the matrices is used to create the final image containing the grain borders:
bi_im_final = bi_im1_2_mod + bi_im1_3_mod + bi_im1_4_mod + bi_im1_5_mod

b) Noise is removed from the resultant image using the function rem_small:
bi_im_final = rem_small(bi_im_final,rem_area)
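The shift-and-difference core of this method can be sketched compactly in NumPy. This is an illustrative reimplementation under my own naming, not the vibr1_1 code; the area-based noise filtering is omitted here:

```python
import numpy as np

def vibration_edges(img, pix_move=2, pix_difference=8):
    """Shift the image in four directions, difference each shifted copy
    against the original, and mark pixels whose absolute difference
    reaches pix_difference (the 'simulated vibration' edge detector)."""
    img = img.astype(np.int16)
    m = pix_move
    marked = np.zeros(img.shape, dtype=np.uint8)
    # right, down, down-right, up-right shifts
    shifts = [(0, m), (m, 0), (m, m), (-m, m)]
    for dy, dx in shifts:
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        diff = np.abs(img - shifted) >= pix_difference
        # discard the wrapped-around border introduced by np.roll
        diff[:m, :] = False
        diff[-m:, :] = False
        diff[:, :m] = False
        diff[:, -m:] = False
        marked |= diff.astype(np.uint8)
    return marked
```

Combining all four shift directions before thresholding mirrors the matrix summation in step 8 above.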
Results

The original grayscale image and the combined final border image are shown in the figures below.

FIGURE 43: ORIGINAL GRAYSCALE PICTURE
FIGURE 44: GRAIN BORDERS OBTAINED USING VIBRATION SIMULATION
FIGURE 45: METHOD 1
FIGURE 46: METHOD 2

It is seen that the proposed method captures some of the grains ideally, although some of the grain borders are not fully highlighted. The amount of noise is very small. Compared with the best-quality edges captured by Method 1 and Method 2 (Fig. 42, 43), the edges captured by the current method (fig. 47) are considerably stronger and carry significantly less noise.
Conclusion

The method produced comparatively good edge detection results, but still not good enough to segment grains at the quality needed for further analysis of sediment discharge. It might still provide reasonable segmentation if used in combination with other methods. From the experiments described above it can be assumed that the efficiency of the current method, in terms of detected edge strength and amount of noise, is higher than that of Method 1 and Method 2.

3.0.1.6 Canny edge detector

Description

This is a rather complex method consisting of a large number of stages. Its essence is a search for local regions with brightness differences, which are found by filtering along each axis with a one-dimensional derivative-of-Gaussian filter. To classify the differences as weak or strong, the Canny method uses two thresholds, a lower and an upper one. Weak edges are marked in the resultant image only if they are connected to strong ones. For noisy images this method provides the best edge detection in comparison with the other methods, but it requires more time. [J. Canny, 1986. A Computational Approach to Edge Detection, IEEE Trans. Pattern Analysis and Machine Intelligence]

MATLAB function

BW = edge(I, 'canny', thresh)

The parameter thresh can be a two-element vector, in which case the first element sets the lower threshold and the second element the upper threshold. If thresh is a scalar, it sets the upper threshold, and 0.4*thresh is used for the lower threshold. If thresh is not given or is an empty array, the threshold values are determined automatically.

Results

The images obtained by the Canny edge detector are either over-segmented (fig. 51, 52) or have weak edges (fig. 5). The over-segmentation is probably caused by the high efficiency of the detector.
It was established experimentally that there is no “thresh” sensitivity value
that would avoid over-segmentation and still capture the grain borders at a satisfactory level.

FIGURE 47: ORIGINAL IMAGE
FIGURE 48: THRESH = 0.05
FIGURE 49: THRESH = 0.1
FIGURE 50: THRESH = 0.2

Conclusion

Based on the results of this experiment, it can be stated that the Canny filter is too sensitive for images of this type. Once the sensitivity parameter “thresh” reaches the level at which the grain edges can be detected efficiently, every small detail on the grains is segmented as well. This causes over-segmentation that leaves no possibility for further analysis of the picture.
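The dual-threshold stage described above (weak edges kept only when connected to strong ones) can be sketched in NumPy. This is a minimal illustration of hysteresis thresholding only; the gradient computation and non-maximum suppression stages of Canny are omitted, and the function name is my own:

```python
import numpy as np

def hysteresis_threshold(gradmag, low, high):
    """Keep strong edges (gradient > high) plus any weak edges
    (gradient > low) that are 8-connected to a strong edge."""
    strong = gradmag > high
    weak = gradmag > low            # strong pixels are a subset of weak
    keep = strong.copy()
    changed = True
    while changed:                  # grow strong edges through weak pixels
        grown = keep.copy()
        # dilate `keep` by one pixel in the 8 neighbour directions
        grown[1:, :]    |= keep[:-1, :]
        grown[:-1, :]   |= keep[1:, :]
        grown[:, 1:]    |= keep[:, :-1]
        grown[:, :-1]   |= keep[:, 1:]
        grown[1:, 1:]   |= keep[:-1, :-1]
        grown[:-1, :-1] |= keep[1:, 1:]
        grown[1:, :-1]  |= keep[:-1, 1:]
        grown[:-1, 1:]  |= keep[1:, :-1]
        grown &= weak               # only where the response exceeds `low`
        changed = not np.array_equal(grown, keep)
        keep = grown
    return keep
```

This makes the behaviour observed in the experiment concrete: lowering the upper threshold turns many grain-surface details into strong seeds, from which whole weak-edge networks are kept, producing the over-segmentation seen in the figures.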
Conclusion on the Program Concept Nr. 1

The flow diagram (fig. 9) shows that the primary step of the proposed algorithm is the detection of the grain borders. Detection methods that work to some extent were found in the set of experiments described above. Unfortunately, the quality of the obtained borders and segmentation does not allow proceeding to the next step of the proposed algorithm. The probable reason is a combination of conditions that seriously complicate the segmentation (lighting conditions, the non-homogeneous surface of the grains, the absence of a defined background, shadows, etc.). This does not mean that grain segmentation and edge detection are an impossible task in this case, but the results suggest that they are probably achievable only with extremely complicated image-processing techniques. Before tackling such a serious task, it is recommended to consider other methods that might be easier to implement.

3.0.2 Proposed program concept Nr. 2

Description

It is assumed that, when comparing consecutive frames between which movement has occurred, the areas where grains have moved will have noticeably different intensity values. The idea is to capture and highlight such areas, since ideally these areas contain the contours of the moved grains. After obtaining the direction of grain movement and an estimate of grain volume, the volumetric bed load could be estimated.

Highlight modified areas

1) The absolute difference is obtained using the imabsdiff function:
im3 = imabsdiff(im1,im2)

2) The values obtained in the previous step are processed with the aid of loops:
for ver = 1:(siz(1,1))
parfor hor = 1:(siz(1,2))
3) Each pixel is sequentially compared with a critical value pix_difference predetermined by the user. If the value is reached, the pixel is highlighted in a variable bi_im:
if im3(ver,hor) >= pix_difference
bi_im(ver,hor) = 1

Results

During the experiment only one grain was displaced, manually, to simulate the effect of grain movement and to check whether the grain contour can be captured efficiently.

FIGURE 51: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN THE PRESET CRITICAL LEVEL.
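The imabsdiff-and-threshold step above has a direct NumPy equivalent (an illustration with my own function name, not the report's code):

```python
import numpy as np

def highlight_changes(frame1, frame2, pix_difference=8):
    """Return a binary mask of pixels whose intensity changed by at
    least pix_difference between two consecutive frames."""
    # signed arithmetic avoids uint8 wrap-around in the subtraction
    diff = np.abs(frame1.astype(np.int16) - frame2.astype(np.int16))
    return (diff >= pix_difference).astype(np.uint8)
```

The vectorized comparison replaces the for/parfor loops over every pixel.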
FIGURE 52: HIGHLIGHTED AREAS WHERE PIXEL INTENSITY VALUES HAVE CHANGED MORE THAN THE PRESET CRITICAL LEVEL.

Based on the results (figures 54, 55) it is seen that the highlighted difference areas (in red circles) have, to some extent, contours similar to the contours of the grains.

Conclusion

Considering the amount of noise that comes with the highlighted contours, there will be difficulties in obtaining the area of the grains efficiently. The most probable reason for the noise is the change of shading caused by the grain movement and the non-homogeneous surface of the grains. It is therefore unlikely that a moved grain's area could be approximated by this method.

Alternative application of the method

It was concluded that this method may be used in a way it was not designed for: it does not easily yield precise grain-area information, but it does efficiently mark the areas where changes have occurred. There is therefore no need to process the whole image, since the area of interest can be reduced significantly using this method.
That reduction of the area of interest can significantly simplify the image processing (segmentation, edge detection, etc.) and the identification of the moved grains and their positions. For example, instead of trying to identify a moved grain in an image of 1000 grains, the number could be reduced to 10 grains, in which case it would clearly be much easier to identify the required grain. Since a smaller area of interest contains fewer candidate moved grains, the probability of incorrect grain identification (error) is also significantly reduced. The method might also find an application in detecting the grains that did not move away but only produced small “shakes”, as all displaced grains are highlighted.

Accidental discovery

It should also be mentioned that, when taking pictures for this experiment, a vibration of the camera was accidentally induced by pressing the image-capture button. As a result, when the difference between the frames was highlighted, most of the grain borders turned out to be highlighted with relatively good quality (this can be seen in Fig. 39). This accidentally discovered useful effect of the vibration became the basic working principle of the grain edge detector described starting from page 44.

Discussions

During the analysis of the experiment results described above, the following pattern was noticed: during the tests of the segmentation methods based on edge detection, at the relatively low sensitivities used at the beginning of the experiments, exactly the same parts of the grain edges often could not be captured even when different detection methods were used. (Method 1 in figure 56 and the Canny filter in figure 57 are shown as examples of edges obtained at low edge-detector sensitivities.) This may be evidence that the undetected edges are extremely weak.
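The "reduced area of interest" idea above can be sketched as a bounding box around the changed pixels, so that segmentation only runs on the cropped region. A minimal NumPy sketch (function name and return convention are my own):

```python
import numpy as np

def change_bounding_box(change_mask):
    """Return (row_min, row_max, col_min, col_max) of the changed
    pixels so further processing can be restricted to that region;
    returns None when nothing changed."""
    rows = np.any(change_mask, axis=1)
    cols = np.any(change_mask, axis=0)
    if not rows.any():
        return None
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return int(r0), int(r1), int(c0), int(c1)
```

A subsequent edge detector would then be applied only to, e.g., img[r0:r1+1, c0:c1+1] instead of the full frame.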
FIGURE 53: WEAK EDGES, METHOD 1
FIGURE 54: WEAK EDGES, CANNY EDGE DETECTOR
FIGURE 55: OVER-SEGMENTATION, CANNY EDGE DETECTOR
FIGURE 56: NOISE CONTAMINATION, METHOD 1

As the sensitivity of each method was increased further, a point was reached at which an excessive amount of noise or over-segmentation (depending on the method) appeared in the image (Fig. 58 and 59). Even though the images were heavily contaminated by noise or excessively segmented (fig. 58, 59), some edges still remained unrevealed. Considering that these observations were obtained using methods based on completely different working principles (Method 1, the Canny edge detector), this may be further confirmation of the hypothesis of excessively weak edges mentioned above.