C2Land

Summer Research Assistantship Thesis

Aleksandra Dervisevic

Technische Universität Braunschweig Summer Research
Project: C2Land
Research advisor: Stephan Wolkow
Date: 03.08.2015
Table of Contents

I. Introduction
II. Current status
   i. Line filter
III. Improvement of the algorithm
   i. Debugging
   ii. Fourth filter
IV. Error analysis
V. Further improvements
VI. Conclusion
VII. References
I. Introduction
Optical tracking of obstacles and paths has been a growing object of study in robotics and the automotive industry over the past decade. However, there has been little application and development in the aeronautical field. A system that relies on optical navigation would aid the pilot during the landing approach by detecting the runway and estimating the position of the aircraft. The Instrument Landing System (ILS) is currently in use only in larger airports and aircraft because, despite its accuracy and reliability, it is a very expensive guidance system that requires high-maintenance ground installations. Another procedure, the Satellite-Based Augmentation System (SBAS) LPV-200, is widely used down to a height of 60 meters, from which point the pilot takes control until landing. This is where the need for an additional navigation system arises. The C2Land project was born from the idea of an optical navigation system for the landing approach, one that is most accurate when closest to the runway. This system would support the inertial navigation system (INS/GNSS) and would be accompanied by SBAS.
II. Current status
The C2Land project consists of an image-based navigation system developed to detect the runway during the landing approach. This is done by using a camera placed at the front of the plane and then analyzing the images with image-processing algorithms written in C++.
The code was first tested using videos from Flight Simulation 9. In this testing it functioned flawlessly and produced accurate results. The next step in the testing process was to use recordings from real landing approaches. The videos obtained covered multiple cases, to provide as many different approaches as possible. When these videos were processed by the code, multiple inaccuracies appeared: borders were missing from the selection, or the wrong lines were defined as the right and left borders.
My main task in this project was to work on the code to avoid the wrong detection of lines and to refine the selection of the lines with the best fit. Specifically, I was assigned to improve the Image Analyzer part of the code to achieve stable runway detection.
This thesis summarizes the work done to improve the selection of runway borders in the landing-approach videos. The results are first introduced by an overall description of the logic behind the code that analyzes the lines detected in the images. After this brief description, some of the debugging is explained, as well as the fourth filter developed to refine the selection of lines as the right and left runway borders. To close this thesis, a few visual examples are attached along with the conclusion.
i. Line filter
The Image Analyzer consists of three different filters, from which weight factors are obtained. These are multiplied at the end to obtain an overall weight factor for each detected line. These criteria help distinguish between the right and left borders of the runway (marked in green and red, respectively) and the rest of the lines, which are ignored. While processing the lines, the code analyzes "left lines" and "right lines" separately (classified by their angle relative to the central line).
The first filter is the angle criterion. A detected line will be at some angle with respect to the expected left or right runway line, and the angle between them is measured. If the difference is greater than |2.5°|, the weight factor automatically becomes 0. This parameter was set by testing. If the angle is within the limit, a weight factor between 0 and 1 is obtained.
The second filter is the vanishing-point criterion. The intersection between the expected left line and the expected central line is found, and this point becomes the center of a circle with a radius of 100 pixels (this parameter was also set by testing). The filter consists of finding the intersection between the detected line and the central line and checking whether it lies within the limits of the circle. Depending on how far it is from the center, the line receives a different weight factor; if the intersection lies outside the circle, the weight factor immediately becomes 0.
The third filter is the expected-length criterion. The upper limit for the length is 1000 pixels, again set by testing. The weight factor varies with how close the measured length is to the expected length.
After these three factors are obtained, the values are multiplied. Lines whose overall weight factor is zero are ignored, the rest are compared, and the lines with the highest overall weight factors are marked as the right and left borders of the runway for each time step.
  
	
  
	
  
	
  
III. Improvement of the algorithm
  
The first task toward improving the code was to watch around 20 videos of actual landing approaches and set the right coordinates (pitch, roll and yaw) for the image detection, so that the correct position could be used with whichever video was chosen for testing. After that, the debugging would take place.
i. Debugging
Printing the coordinates of the points that delimit the extremes of the projected runway borders helped visualize what the expected runway looks like. This test was run with the simulation and with several of the real imaging recordings. The finding was an inconsistency in the coordinate-system convention: it appeared to be reversed in some cases in the real imaging recordings (but never in the simulation). The convention assumed that the image and coordinate system should look like figure 1, but sometimes they would switch to figure 2.
The following table shows the values printed from one of the landing approaches tested.

Theta Left    Theta Right    Projected Left Edge Angle    Projected Right Edge Angle
-0.313 rad    0.3651 rad     0.927 rad                    -1.396 rad

Table 1. Angles from an incorrect projection of detected lines (time step 1695, flight 1, on 14-04-15)
Figure 3. Image associated with the time step from which the values in Table 1 were extracted
Table 1 helps visualize the inconsistency of signs. While the detected-line angle theta considered to be on the left is negative, the edge angle of the projected left line is positive. This is inaccurate because "theta" and "edge angle" are complementary angles.
The class of the code that deals with the Image Analyzer was then tested in order to find a solution to this problem. After observing the different plots from the various videos, a trend could be noticed: every time the coordinate system was upside down, one of the lines defining the runway had been shortened because it fell outside the limits of the given region of interest. The size of this region of interest is 1279 x 982 pixels, and one of the coordinates of one of the points delimiting the runway borders would be one of these numbers.
Figure 1. Projected runway borders        Figure 2. Projected runway borders

This meant that the mistake occurred every time the code had to run over the lines that shorten border lines that are too long. The focus of the investigation then switched to the class of the code that deals with the line features calculation. Here, values like the slope, the edge angle (the angle between the line and the ordinate), the y-intercept and, finally, the equation of the line are calculated.
When going over these lines, a mathematical mistake was found. There were two different functions in which the equation of a line had to be defined. In both cases the equation had to look the same, because it was simply the general equation of a line. In the first function it was calculated correctly (Eq 1), but in the second the equation was not accurate (Eq 2).
  
Eq 1.        b = y – tan(θ)*x

Eq 2.        b = atan(y – θ*x)
  
Where b is the y-intercept and θ is the angle obtained from taking atan(slope).
Given this, the second equation was changed to what it should have been, and the unnecessary lines of code were deleted (the if statement was reduced to a single else branch, as seen in figure 4).
This change in the code fixed the output of the cases where the coordinate system appeared upside down. Another coding mistake was found in an if statement: theta was calculated after the statement, so in the cases where the x-coordinates of both points were the same, the slope was not defined and theta was then calculated from 0 (the predefined value of the variable slope), overwriting the correct value.
//Old code

if (pt2.x > pt1.x)
    {
        slope = (pt2.y - pt1.y) / (pt2.x - pt1.x);
    }
else if (pt2.x == pt1.x)
    {
        theta = M_PI/2;
    }
else
    {
        slope = (pt1.y - pt2.y) / (pt1.x - pt2.x);
    }
theta = atan(slope);

//Code with corrections

if (pt2.x != pt1.x)
    {
        slope = (pt2.y - pt1.y) / (pt2.x - pt1.x);
        theta = atan(slope);
    }
else
    {
        slope = DBL_MAX;
        theta = M_PI/2;
    }

Figure 4. Lines of the code from the line features calculation
Another correction made was the following. These lines find the edge angle, which is the angle located between the line and the y-axis. In this context, m_slopeRad is the angle between the line and the x-axis.
if (m_slopeRad > 0)
    {
        m_edgeAngle = M_PI/2 - m_slopeRad;   //angle between ordinate and line
    }
else if (m_slopeRad < 0)
    {
        m_edgeAngle = -M_PI/2 - m_slopeRad;
    }
else
    {
        m_edgeAngle = M_PI/2;
    }

Figure 5. Lines of code corrected
The last else statement had to be added for the code to make sense. Previously, the case of m_slopeRad equal to zero was never taken into account, so the value used for m_edgeAngle would probably have been whatever value was stored in memory. This, together with an increase in the value of the sensitivity parameters used in each filter, immensely increased the accuracy of the results.
The following table is analogous to Table 1, but here the values were obtained after all the previous changes were implemented.
Theta Left    Theta Right    Projected Left Edge Angle    Projected Right Edge Angle
-0.285 rad    0.213 rad      -1.243 rad                   1.260 rad

Table 2. Angles from an accurate projection of detected lines (time step 1695, flight 1, on 14-04-15)
Figure 6. Image associated with the time step from which the values in Table 2 were extracted
The output data was now correct and consistent in its signs at all times, and the left and right borders of the runway are detected more precisely. The work in the LineFeatures class was finished with this last correction.
The detection of the left line used to be nonexistent in some of the recordings tested, but that was no longer the case. All the videos provided with the code were tested with the new code, and one of them was causing some trouble. This video started out very well (figure 7), but as the plane approached the runway, the detection became worse (see figure 8). This is why it was a good idea to develop a fourth filter that would refine the detection of lines, to avoid the selection of the lines marked in figure 8.
	
  
	
  
	
  
Figure 7. Very accurate detection of right and left borders of the runway (flight 1, trial 8, on 16-04-15)

Figure 8. Wrong selection of right and left borders of the runway (the lines detected are too far from the expected runway borders)
  
ii. Fourth filter
  
The idea behind the fourth filter was basically to design the region depicted in figure 9:

Figure 9. Region for Refining Filter
Here, the red and green lines border the region, which surrounds the projected runway lines. The objective is to give a higher weight factor to the lines that lie within the limits of this region. This filter would not, however, eliminate the lines that fall outside the region, because of possible inaccuracies related to an inexact projection of the runway borders.
This design did not work well because of the distance at which the red and green lines were situated from the projected lines. The distance was constant throughout the entire approach, which made it very inaccurate: compared to the real-life geometry, this value should increase as the plane approaches the runway. This is why we decided to make the region dynamic, directly using the values of the x-coordinates of the projected lines' end points in the code (these keep changing as the code runs).
  
	
  
The new design is shown in figure 10:

Figure 10. Final design of the region for the refining filter
This new design improved the selection of lines drastically. As an example, figure 11 shows the same frame as figure 8, but after the implementation of the refining filter, showing high accuracy in the selection of lines.
Figure 11. Accurate detection of right and left borders of the runway
Another change was implemented, which enlarged the projected runway at the bottom (yellow lines) so that it covers the entire runway.
IV. Error analysis
To quantitatively measure the error in the optical position, the coordinates found are compared to those given by the actual INS/GNSS system on board. In figure 12, both sets of (x, y) coordinates are represented.
Figure 12. Representation of position, x vs. y axes, for the new code
This error analysis becomes more meaningful when compared to the analysis made with the algorithm before all the improvements were implemented. Figure 13 depicts the two sets of (x, y) coordinates for the optical and reference positions obtained from the old code.
Figure 13. Representation of position, x vs. y axes, for the old code
It can be observed that the first visualization of the runway starts around 2,000 meters later than in the new code, and that once on the runway, the optical position is highly inaccurate in multiple instances. For a clearer appreciation of the precision of the new optical position, the following plot is drawn.
Figure 14. Error estimation for the (x, y) set
Here, the difference between the reference and the optical position is computed. It can be observed that the values remain between 0 and 20 meters at all times, and below 10 meters when the plane is closest to the runway (see figure 15 for a close-up of the last part of the landing approach).
Figure 15. Error estimation, close-up of the last 2,000 meters of the approach
  
Figure 16, on the other hand, represents the coordinates of the optical and INS/GNSS positions in the (x, z) axes.
Figure 16. Representation of position, x vs. z axes, for the new code
Like in the (x, y) case, the (x, z) coordinates from the old code are plotted in figure 17 to better exemplify the improvement achieved after all the changes were implemented.
Figure 17. Representation of position, x vs. z axes, for the old code
Here, again, the visualization of the runway starts around 2,000 meters later than in the new code. The number of instances in which the runway is processed is much smaller, which decreases the precision with which the borders are defined. This can be inferred from the scarcity of blue dots in the plot.
Figure 18 is then plotted to understand the error between the optical and the reference position estimates. The error is greater until the plane reaches the last 2,000 meters of distance, because before that the plane is still far from the runway and visibility is minimal, as can be observed in figure 20, which shows a snapshot of the view at 4,000 meters of distance. Figure 19 is a close-up of the last 2,000 meters, to better represent the persistently decreasing error estimate, which after 1,000 meters of distance stays under 10 meters.
Figure 18. Error estimation for the (x, z) set
	
  
	
  
	
  
	
  
Figure 19. Error estimation, close-up of the last 2,000 meters of the approach
Figure 20. View of the runway at 4,000 meters of distance
The most noticeable improvement in the code is the resulting higher number of instances in which the runway is processed, which increases the overall quality of the runway detection.
  
	
  
	
  
	
  
V. Further improvements
  
During the investigation on this project, some problems could not be addressed, but their solution would mean a definite improvement to the code. The main issues currently observed in the selection of the right and left borders of the runway were:
• Inaccurate projected runway lines. This is probably the most notable problem. Even if the filters work perfectly, selecting the lines that are closest to the projected runway lines, if these are not accurately drawn over the actual runway, then the lines selected will be wrong, or there will be no selection of lines at all.
Figure 21. The projected runway borders do not overlap the detected lines, so the selection filter does not process them as accurate borders.
• Length criterion. The length criterion does not work very well as a filter in this process. When a line does not lie within the limits set by the sensitivity, it is discarded from the selection process because the weight factor for this filter becomes zero, making the overall weight factor zero as well. As a result, in some time steps no lines are selected as suitable borders, because the otherwise non-zero-weighted candidates were discarded for being too short. To solve this problem, the sensitivity is increased. This may solve the issue for one border, but for the other it changes the selection of the most suitable line to one with a higher weight factor but a shorter length (a line that would previously have been discarded, but now is not).
Figure 22. Here, the green line is selected over longer ones to its right because it is undoubtedly more similar to the projected line than any other. In this case, the sensitivity is large enough for this line to be selected instead of eliminated for its length.
Then, the length sensitivity is modified to avoid selecting such short lines.
Figure 23. After the length sensitivity is increased, the right-border selection improves drastically, but this change also results in the disappearance of any left border, because the length sensitivity rules all of its candidate lines out of the selection.
A possible solution to this last issue would be to change the weight-factor value of a line when it does not comply with the sensitivity. Instead of filtering the line out by giving it a weight factor of zero, this value would be changed to a number between 0 and 1. In this way, the filter would become a refining filter like the fourth filter I developed in my work, and it would be possible to avoid absent runway borders.
  
	
  
	
  
	
  
VI. Conclusion
	
  
The	
   goal	
   of	
   this	
   newly	
   developed	
   system	
   is	
   to	
   detect,	
   process	
   and	
   select	
   the	
   runway	
  
borders	
  during	
  landing	
  approach	
  through	
  the	
  use	
  of	
  a	
  camera	
  positioned	
  at	
  the	
  front	
  of	
  
the	
  plane	
  and	
  image-­‐processing	
  algorithms.	
  
Under the supervision of Stephan Wolkow, my task was to improve the results of the C++ code used in this system. The problematic results occurred in certain time steps, normally when the plane was closer to the runway. These included the missing selection of the lines with the best fit, or the selection of inaccurate lines as borders (lines that were too far from the projected runway lines, or too short compared to others detected).
  
After solving major and minor coding mistakes found by printing the coordinates of the lines selected, a fourth filter was also implemented to refine the selection of the most suitable lines as left and right borders of the runway.
  	
  
In the end, the selection of the appropriate lines as runway borders was improved. In those specific cases where an inaccuracy was observed in the selection of the best lines, there is now an optimal definition of the runway borders.
  	
  
The quantitative comparison of the error in optical position before and after the upgrades were implemented also demonstrates an increase in accuracy when estimating the position.
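The comparison described above can be sketched as follows: for each time step, the optical position estimate is subtracted from the INS/GNSS reference and the horizontal distance is taken. The `Position` struct and the function name are assumptions made for this illustration, not the project's actual types:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal sketch of the error metric behind the quantitative comparison:
// per time step, the distance between the optical position estimate and the
// INS/GNSS reference in the (x, y) plane.
struct Position { double x; double y; };

std::vector<double> positionError(const std::vector<Position>& optical,
                                  const std::vector<Position>& reference)
{
    std::vector<double> error;
    const std::size_t n = std::min(optical.size(), reference.size());
    for (std::size_t i = 0; i < n; ++i) {
        const double dx = optical[i].x - reference[i].x;
        const double dy = optical[i].y - reference[i].y;
        error.push_back(std::hypot(dx, dy));   // Euclidean distance per step
    }
    return error;
}
```

Plotting this quantity against the distance to the runway gives curves like those in figures 14 and 15.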
  
	
  
The following images represent visual examples of the improvement achieved in the selection of the right and left borders of the runway.
  
	
  
	
  
	
  
  18

Flight 1, trial 4 on 15/04/15

Figure 24. Before code improvement

Figure 25. After code improvement

  19

Flight 1, trial 1 on 16/04/15

Figure 26. Before code improvement

Figure 27. After code improvement

  20

Flight 1, trial 1 on 20/04/15

Figure 28. Before code improvement

Figure 29. After code improvement
  21

VII.   References

Angermann, M., Wolkow, S., Schwithal, A., Tonhäuser, C., Hecker, P., "High Precision Approaches Enabled by an Optical-Based Navigation System," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 694-701.

Wolkow, S., Schwithal, A., Tonhäuser, C., Angermann, M., Hecker, P., "Image-Aided Position Estimation Based on Line Correspondences During Automatic Landing Approach," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 702-712.

Tonhäuser, C., Schwithal, A., Wolkow, S., Angermann, M., Hecker, P., "Integrity Concept for Image-Based Automated Landing Systems," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 733-747.

  • 1.   1                 C2Land     Summer  Research  Assistantship  Thesis     Aleksandra  Dervisevic     Technische  Universität  Braunschweig  Summer  Research   Project:  C2Land   Research  advisor:  Stephan  Wolkow   Date:  03.08.2015                                    
  • 2.   2   Table  of  Contents         I. Introduction………………………………………………………………..pg.  3     II. Current  status…………………………………………………………pgs.  3-­‐4   i. Line  filter…………………………………………………………….pg.  4     III. Improvement  of  the  algorithm……………………………….pgs.  4-­‐10   i. Debugging…………………………………………………...….pgs.  4-­‐8   ii. Fourth  filter………………………………………………..…pgs.  9-­‐10     IV. Error  analysis…………………………………………………..…pgs.  10-­‐15     V. Further  improvements………………………………………..pgs.  15-­‐17     VI. Conclusion…………………………………………………….……pgs.  17-­‐20     VII. References………………………………………………………………...pg.  21                
  • 3.   3                         I.     Introduction     Optical  tracking  of  obstacles  and  paths  have  been  a  growing  object  of  study  in   robotics  and  the  automotive  industry  in  the  past  decade.  However,  there  has  been  little   application  and  development  in  the  aeronautical  field.  A  system  that  relies  on  optical   navigation   would   aid   the   pilot   during   landing   approach   by   detecting   the   runway   and   estimating  the  position  of  the  aircraft.  The  Instrument  Landing  System  (ILS)  is  currently   in  use  only  in  larger  airports  and  aircraft  because  despite  of  its  accuracy  and  reliability,   it   is   a   very   expensive   guidance   system   which   includes   high-­‐maintenance   ground   installation.   Another   procedure,   the   Space   Augmentation   System   (SBAS)   LPV   200,   is   widely  used  down  to  the  height  of  60  meters,  and  from  there  the  pilot  takes  control  until   landing.   This   is   from   where   a   need   of   an   additional   navigation   system   comes.   The   C2Land   project   is   born   from   the   idea   of   an   optical   navigation   system   for   landing   approach,   which   is   highly   accurate   when   closest   to   the   runway.   This   system   would   support  the  inertial  navigation  system  (INS/GNSS)  and  would  be  accompanied  by  SBAS.         II.     Current  status     The  C2Land  project  consists  of  an  image-­‐based  navigation  system  developed  to  detect   the  runway  during  landing  approach.  This  is  done  by  using  a  camera  placed  at  the  front   of  the  plane  and  then  analyzing  the  images  using  image-­‐processing  algorithms  in  C++.     The  code  developed  was  tested  using  videos  from  Flight  Simulation  9  at  first.  The  code   showed  perfect  functioning  and  accurate  results  from  this  testing.  
The  next  step  in  the   testing  process  was  to  use  recordings  from  real  landing  approaches.  The  videos  obtained   showed   multiple   cases   to   give   as   many   different   approaches   as   possible.   When   these   videos  were  processed  by  the  code,  multiple  inaccuracies  appeared.  There  were  missing   borders  selected,  or  the  wrong  lines  were  defined  as  right  and  left  borders.     My  main  task  in  this  project  was  to  work  on  the  code  to  avoid  wrong  detection  of  lines   and  refine  the  selection  of  lines  with  the  best  fit.  Specifically,  I  was  assigned  to  improve   the  Image  Analyzer  part  of  the  code  to  achieve  this  stable  runway  detection.   This  thesis  will  summarize  the  work  done  to  achieve  an  improvement  on  the  selection  of   runway  borders  in  the  landing  approach  videos.  The  results  will  first  be  introduced  by  
  • 4.   4   an  overall  description  of  the  logic  behind  the  code  that  analyzes  the  lines  detected  in  the   images.  After  this  brief  description,  some  of  the  debugging  will  be  explained,  as  well  as   the   fourth   filter   developed   to   refine   the   selection   of   lines   as   right   and   left   runway   borders.   To   close   this   thesis,   a   few   visual   examples   are   attached   along   with   the   conclusion.     i.  Line  filter     The   Image   Analyzer   consists   of   three   different   filters   from   which   weight   factors   are   obtained.  These  will  be  multiplied  in  the  end  to  obtain  an  overall  weight  factor  for  each   line  detected.  These  criteria  will  help  distinguish  between  the  right  and  left  borders  of   the  runway  (marking  them  in  green  and  red,  respectively)  and  the  rest,  which  will  be   ignored.   While   processing   the   lines,   the   code   analyzes   “left   line”   and   “right   line”   separately  (based  on  angle  compared  to  central  line).     The  first  filter  is  the  angle  criterion.  The  line  detected  will  be  at  some  angle  with  respect   to  the  expected  left  or  right  lines  of  the  runway,  and  the  angle  between  them  will  be   measured.   If   the   difference   is   greater   than   |2.5°|,   the   weight   factor   will   automatically   become   0.   This   parameter   was   set   by   testing.   If   the   angle   is   within   the   limit,   then   a   weight  factor  that  can  go  from  0  to  1  will  be  obtained.     The  second  filter  is  the  vanishing  point  criterion.  The  intersection  between  the  expected   left  line  and  the  expected  central  line  will  be  found,  and  this  point  will  become  the  center   of  a  circle  of  radius  100  pixels.  This  parameter  was  also  set  by  testing.  
The  filter  will   consist  of  finding  the  intersection  between  the  detected  line  and  the  central  line,  and   observing  if  it  lies  within  the  limits  of  the  circle.  Depending  on  how  far  it  is  from  the   center,   it   will   have   a   different   weight   factor.   If   it   lies   outside   of   the   circle,   it   will   immediately  become  0.   The  third  filter  is  the  expected  length  criterion.  The  upper  limit  for  the  length  is  1000   pixels,  set  again  by  testing.  Depending  on  how  close  the  length  is  to  the  expected  length,   the  weight  factor  will  change.   After  these  three  factors  are  obtained,  the  values  will  be  multiplied.  The  ones  that  are   zero  will  be  ignored  and  the  ones  that  are  not  will  be  compared,  and  the  highest  overall   weight  factors  will  be  marked  as  the  right  and  left  borders  of  the  runway  for  each  time   step.         III.     Improvement  of  the  algorithm     The  first  task  towards  the  improvement  of  the  code  was  to  watch  around  20  videos  of   actual  landing  approaches  and  set  the  right  coordinates  for  the  image  detection  (pitch,   roll  and  yaw)  so  that  the  right  position  could  be  used  with  whichever  video  was  used  for   testing.  After  that,  the  debugging  would  take  place.     i.  Debugging       Printing  the  coordinates  of  the  points  that  delimit  the  extremes  of  the  projected  borders   of  the  runway  helped  visualize  what  the  expected  runway  is.  This  test  was  run  with  the   simulation  and  multiple  of  the  real  imaging  recordings.  The  finding  was  that  there  was   an  inconsistency  with  the  coordinate  system  convention.  It  appeared  as  it  was  reversed  
  • 5.   5   for   some   cases   in   the   real   imaging   recordings   (but   never   in   the   simulation).   The   convention  assumed  that  the  image  and  coordinate  system  should  look  like  in  figure  1,   but  sometimes  it  would  just  switch  to  figure  2.                                                             The  following  table  shows  the  values  printed  from  one  of  the  landing  approaches  tested.     Theta  Left   Theta  right   Projected  Left  Edge  angle   Projected  Right  Edge  Angle   -­‐0.313  rad   0.3651  rad   0.927  rad   -­‐1.396  rad     Table  1.  Angles  from  incorrect  projection  of  detected  lines.  (time  step  1695  flight  1  on  the  14-­‐04-­‐15)         Figure  3.  Image  associated  to  time  step  from  which  the  values  on  table  1  were  extracted     Table  1  helps  visualize  the  inconsistency  of  signs.  While  the  detected  line  angle  theta   considered  left  is  negative,  in  the  projected  left  line,  the  edge  angle  is  positive.  This  is   inaccurate  because  “theta”  and  “edge  angle”  are  complementary  angles.       The  class  of  the  code  that  deals  with  the  Image  Analyzer  was  then  tested  in  order  to  try   to  find  the  solution  to  this  problem.  After  observing  the  different  plots  from  the  various   videos,  it  could  be  noticed  that  there  was  a  trend  were  every  time  the  coordinate  system   was  upside  down:  one  of  the  lines  defining  the  runway  was  always  shortened  because  it   was  out  of  the  limits  of  the  given  region  of  interest.  The  size  of  this  region  of  interest  is   1279  x  982  pixels,  and  one  of  the  coordinates  of  one  of  the  points  delimiting  the  runway   borders  would  be  one  of  these  numbers.     This  meant  that  every  time  the  code  had  to  run  over  the  lines  that  shorten  the  border   lines  because  they  are  too  long,  this  mistake  would  happen.  
The  focus  of  the   investigation  then  switched  towards  the  class  of  the  code  that  deals  with  the  line   x   y   Figure  1.  Projected  runway  borders   Figure  2.  Projected  runway  borders   x   y  
  • 6.   6   features  calculation.  Here,  values  like  the  slope,  the  edge  angle  (angle  between  the  line   and  the  ordinate),  the  y-­‐intercept  and  finally  the  equation  of  the  line  are  calculated.       When  going  over  these  lines,  a  mathematical  mistake  was  found.  There  were  two   different  functions  for  which  the  equation  of  a  line  had  to  be  defined.  In  both  cases,  the   equation  had  to  look  the  same,  because  it  was  simply  the  general  equation  of  a  line.  In   the  first  function  it  was  calculated  correctly  (Eq  1),  but  in  the  second  case  the  equation   was  not  accurate  (Eq  2).     Eq  1.             b  =  y  –  tan(θ)*x       Eq  2.             b  =  atan(y  –  θ*x)     Where  b  is  the  y-­‐intercept  and  θ  is  the  angle  that  comes  from  taking  the  atan(slope).   Given  this,  the  second  equation  was  changed  to  what  it  should  have  been,  and  the  lines   of  the  code  that  were  unnecessary  were  deleted  (made  the  if  loop  with  just  one  else   statement,  as  seen  in  figure  3).   This  change  in  the  code  fixed  the  output  of  the  cases  where  the  coordinate  system   appeared  upside  down.  Another  coding  mistake  was  found  in  an  if  loop.  Theta  was   calculated  after  the  loop,  so  for  the  cases  were  the  x-­‐coordinate  of  both  points  were  the   same,  the  slope  was  not  defined  and  then  theta  was  calculated  out  of  0  (predefined  value   of  the  variable  slope).     
//Old  code     if  (pt2.x  >  pt1.x)     {       slope   =   (pt2.y-­‐  pt1.y)  /  (pt2.x-­‐  pt1.x);           }   else  if  (pt2.x  ==  pt1.x)     {       theta  =  M_PI/2;     }   else   {       slope   =   (pt1.y-­‐  pt2.y)  /  (pt1.x-­‐  pt2.x);     }   theta   =   atan  (  slope  )  ;       //Code  with  corrections     if  (pt2.x  !=  pt1.x)     {       slope    =   (pt2.y-­‐  pt1.y)  /  (pt2.x-­‐  pt1.x);       theta   =   atan  (  m_slope  )  ;     }   else     {       slope  =  DBL_MAX;       theta  =  M_PI/2;     }     Figure  4.  Lines  of  the  code  from  the  line  features  calculation  
  • 7.   7   Another  correction  made  was  the  following.  These  lines  find  the  edge  angle,  which  is  the   angle  located  between  the  line  and  the  y-­‐axis.  In  this  context,  m_slopeRad  is  the  angle   between  the  line  and  the  x-­‐axis.     if  (m_slopeRad  >  0)     {       m_edgeAngle   =   M_PI/2  -­‐  m_slopeRad;  //angle  between   ordinate  and  line     }   else  if  (m_slopeRad  <  0)     {       m_edgeAngle   =   -­‐  M_PI/2  -­‐  m_slopeRad;     }   else     {       m_edgeAngle   =  M_PI/2  ;     }               Figure  5.  Lines  of  code  corrected     The  last  else  statement  had  to  be  added  for  the  code  to  make  sense.  Previously,  the  value   of  m_slopeRad  equal  to  zero  was  never  taken  into  account,  so  the  value  for  m_edgeAngle   that  would  have  been  used  would  have  probably  been  whatever  value  was  stored  in  the   memory.  This,  together  with  an  increase  of  the  value  of  the  parameters  used  in  each   filter  for  the  sensitivity  increased  immensely  the  accuracy  of  the  results.     The  following  table  is  analogous  to  table  1,  but  here  the  values  were  obtained  after  all   the  previous  changes  were  implemented.       Theta  Left   Theta  right   Projected  Left  Edge  angle   Projected  Right  Edge  Angle   -­‐0.285  rad   0.213  rad   -­‐1.243  rad   1.260  rad     Table  2.  Angles  from  accurate  projection  of  detected  lines.  (time  step  1695,  flight  1  on  the  14-­‐04-­‐15)           Figure  6.  Image  associated  to  time  step  from  which  values  in  table  2  were  extracted     The  output  data  was  at  all  times  correct  and  consistent  with  signs,  and  now  the  left  and   right  borders  of  the  runway  are  detected  more  precisely.   The  work  in  the  LineFeatures  class  was  finished  with  this  last  correction.    
  • 8.   8   The  detection  of  the  left  line  used  to  be  inexistent  in  some  of  the  recordings  tested;  but   that  was  not  the  case  anymore.  All  the  videos  provided  with  the  code  were  tested  with   the   new   code,   and   there   was   one   of   them   that   was   causing   some   trouble.   This   video   started   out   very   well   (figure   7),   but   as   the   plane   was   approaching   the   runway,   the   detection  became  worse  (see  figure  8).  This  is  why  it  was  a  good  idea  to  develop  a  fourth   filter  that  would  refine  the  detection  of  lines,  to  avoid  the  selection  of  the  lines  marked   on  figure  8.           Figure  7.  Very  accurate  detection  of  right  and  left  borders  of  the  runway  (flight  1,  trial  8,  on  16-­‐04-­‐15)                                       Figure  8.  Wrong  selection  of  right  and  left  borders  of  the  runway                     Lines  detected  are   too  far  from  the   expected  runway   borders  
  • 9.   9   ii.Fourth  filter     The  idea  behind  the  fourth  filter  was  to  basically  design  the  region  depicted  on  figure  9:                                                                             Figure  9.  Region  for  Refining  Filter     Here,  the  red  and  green  lines  are  bordering  the  region,  which  surrounds  the  projected   runway  lines.  The  objective  is  to  give  a  higher  weight  factor  to  the  lines  that  lie  within   the   limits   of   this   region.   This   filter   would   not,   however,   eliminate   the   lines   that   are   outside  of  the  region,  due  to  possible  inaccuracies  related  to  the  inexact  projection  of  the   runway  borders.     This  design  did  not  work  so  well  because  of  the  distance  at  which  the  red  and  green  lines   were  situated  from  the  projected  lines.  The  distance  was  constant  throughout  the  entire   approach  and  this  made  it  very  inaccurate  because  compared  to  real-­‐life  geometry,  this   value  should  be  increasing  as  the  plane  approaches  the  runway.  This  is  why  we  decided   to  make  a  dynamic  region,  directly  using  the  values  of  the  x-­‐coordinate  of  the  projected   lines  end  points  in  the  code  (these  keep  changing  as  the  code  is  run).     The  new  design  would  become  like  figure  10:                                                       Figure  10.  Final  design  of  the  region  for  refining  filter     x   y   y   x  
This new design improved the selection of lines drastically. As an example, figure 11 shows the same frame as figure 8 but after the implementation of the refining filter, demonstrating high accuracy in the selection of lines.

Figure 11. Accurate detection of right and left borders of the runway (high accuracy in the lines selected; the borders of the region for the fourth filter are also drawn)

Another change was implemented, which enlarged the projected runway in the bottom part (yellow lines) to cover the entire runway.

IV.  Error analysis

To quantitatively measure the error in the optical position, the coordinates found are compared to those given by the actual INS/GNSS system on board. In figure 12 both sets of (x, y) coordinates are represented.

Figure 12. Representation of position, x vs. y axes for the new code
This error analysis becomes more meaningful when compared to the analysis made with the algorithm before the improvements were implemented. Figure 13 depicts the two sets of (x, y) coordinates for the optical and reference positions obtained from the old code.

Figure 13. Representation of position, x vs. y axes for the old code

It can be observed that the first visualization of the runway starts around 2,000 meters later than in the new code, and that once on the runway, in multiple instances, the optical position is highly inaccurate. For a clearer appreciation of the precision of the new optical position, the following plot is drawn.

Figure 14. Error estimation for the (x, y) set
Here, the difference between the reference and the optical position is computed. It can be observed that the values remain between 0 and 20 meters at all times, and below 10 meters when the plane is closest to the runway (see figure 15 for a close-up of the last part of the landing approach).

Figure 15. Error estimation close-up of the last 2,000 meters of the approach

Figure 16, on the other hand, represents the coordinates for the optical and INS/GNSS position in the (x, z) axes.

Figure 16. Representation of position, x vs. z axes for the new code

As in the (x, y) case, the (x, z) coordinates from the old code are plotted in figure 17 to better exemplify the improvement achieved after all changes were implemented.
Figure 17. Representation of position, x vs. z axes for the old code

Here, again, the visualization of the runway starts around 2,000 meters later than in the new code. The number of instances in which the runway is processed is much smaller, which decreases the precision with which the borders are defined. This can be inferred from the scarcity of blue dots in the plot.

Figure 18 is then plotted to show the error between the optical and the reference position estimations. The error is greater up to 2,000 meters of distance because the plane is still far from the runway and the visibility is minimal, as can be observed in figure 20, which shows a snapshot of the view at 4,000 meters of distance. Figure 19 is a close-up of the last 2,000 meters to better represent the steadily decreasing error estimation, which after 1,000 meters of distance stays under 10 meters.
Figure 18. Error estimation for the (x, z) set

Figure 19. Error estimation close-up of the last 2,000 meters of the approach
Figure 20. View of the runway at 4,000 meters of distance

The most noticeable improvement in the code is the resulting higher number of instances in which the runway is processed, which increases the overall quality of the runway detection.

V.  Further improvements

During this project, some problems could not be addressed, but their solution would mean a definite improvement to the code. The main issues observed in the selection of the right and left borders of the runway were:

• Inaccurate projected runway lines. This is probably the most notable problem. Even if the filters work perfectly, selecting the lines that are closest to the projected runway lines, if these projections are not accurately drawn over the actual runway, then the selected lines will be wrong, or no lines will be selected at all.
Figure 21. The projected runway borders do not overlap the detected lines, so the selection filter does not process them as accurate borders.

• Length criterion. The length criterion does not work well as a filter in this process. When a line does not lie within the limits set by the sensitivity, it is discarded from the selection process because the weight factor for this filter becomes zero, making the overall weight factor zero as well. As a result, in some time steps no lines are selected as suitable borders, because the candidates that would otherwise have a non-zero weight factor were discarded for being too short. Increasing the sensitivity may solve the issue for one border, but for the other it changes the selection of the most suitable line to one with a higher weight factor that is shorter (before, this line would have been discarded; now it is not).

Figure 22. Here, the green line is selected over longer ones to the right because it is undoubtedly more similar to the projected line than any other. In this case, the sensitivity is large enough for this line to be selected instead of eliminated for its length.

The length sensitivity is then modified to avoid selecting such short lines.
Figure 23. After the length sensitivity is increased, the right border selection improves drastically, but this change also results in the disappearance of any left border, because the length sensitivity rules all the lines out of the selection.

A possible solution to this last issue would be to change the weight factor of a line when it does not comply with the sensitivity. Instead of filtering the line out by giving it a weight factor of zero, this value would be changed to a number between 0 and 1. The filter would thus become a refining filter, like the fourth filter I developed in my work, and absent runway borders could be avoided.

VI. Conclusion

The goal of this newly developed system is to detect, process and select the runway borders during the landing approach, using a camera positioned at the front of the plane and image-processing algorithms. Under the supervision of Stephan Wolkow, my task was to improve the results of the C++ code used in this system. The problematic results occurred in certain time steps, normally when the plane was closer to the runway. They included the missing selection of best-fit lines, or the selection of inaccurate lines as borders (lines too far from the projected runway lines, or too short compared to others detected). After solving major and minor coding mistakes, found by printing the coordinates of the selected lines, a fourth filter was implemented to refine the selection of the most suitable lines as left and right borders of the runway.
In the end, the selection of the appropriate lines as runway borders was improved. In the specific cases where an inaccuracy had been observed in the selection of the best lines, there is now an optimal definition of the runway borders. The quantitative comparison of the error in the optical position before and after the upgrades were implemented also demonstrates an increase in accuracy when estimating the position. The following images give visual examples of the improvement achieved in the selection of the right and left borders of the runway.
Flight 1, trial 4 on 15/04/15

Figure 24. Before code improvement

Figure 25. After code improvement

Flight 1, trial 1 on 16/04/15

Figure 26. Before code improvement

Figure 27. After code improvement

Flight 1, trial 1 on 20/04/15

Figure 28. Before code improvement

Figure 29. After code improvement
VII. References

Angermann, M., Wolkow, S., Schwithal, A., Tonhäuser, C., Hecker, P., "High Precision Approaches Enabled by an Optical-Based Navigation System," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 694-701.

Wolkow, S., Schwithal, A., Tonhäuser, C., Angermann, M., Hecker, P., "Image-Aided Position Estimation Based on Line Correspondences During Automatic Landing Approach," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 702-712.

Tonhäuser, C., Schwithal, A., Wolkow, S., Angermann, M., Hecker, P., "Integrity Concept for Image-Based Automated Landing Systems," Proceedings of the ION 2015 Pacific PNT Meeting, Honolulu, Hawaii, April 2015, pp. 733-747.