The Development and Testing of the Euclidean
Travelling Salesman Platform:
Single Honours Report MSci
William J. M. Fraser
w.fraser.10@aberdeen.ac.uk
51010160
Friday 16th May, 2014
Abstract
To help in the hunt for an optimised solution to the Travelling Salesman
Problem (TSP), this project follows the development and testing of a platform
called the Euclidean Travelling Salesman Platform (ETSP), designed for testing
TSP heuristics. The project was motivated by previous research into an
optimised TSP solution called QSTSH. A Java platform was designed, developed
and tested to a first version, and was then used to test QSTSH. From this testing it was
concluded that ETSP was easy to use and robust. QSTSH was also tested
by comparing it to a Greedy Nearest Neighbour heuristic, and was found
to be better in both accuracy and efficiency.
Acknowledgements
I would like to acknowledge a couple of people for their contributions to this work.
First and foremost, I would like to acknowledge the support and advice of Dr Martin
Kollingbaum throughout this project.
Secondly, I would like to acknowledge Dr Nir Oren of the University of Aberdeen for
initially sparking my interest in the Travelling Salesman Problem and algorithms in general.
“Nothing is impossible. Some things are just less likely than others.” -Jonathan Winters
Contents
Contents 3
List of Figures 8
List of Tables 9
1 Introduction 10
1.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.2 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.3 Motivations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.4 Investigations & Goals . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.4.1 Primary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.4.2 Secondary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.4.3 Tertiary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.5 Document Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2 Background 14
2.1 History of the Travelling Salesman Problem . . . . . . . . . . . . . . . . 14
2.2 Different Travelling Salesman Types . . . . . . . . . . . . . . . . . . . . . 15
2.2.1 Symmetric and Asymmetric . . . . . . . . . . . . . . . . . . . . . 15
2.2.2 Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.2.3 Metric . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.2.4 Other Information . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.3 Solution Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.3.1 Heuristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.3.2 Exact Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.4 Existing Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.4.1 Existing Research Solutions . . . . . . . . . . . . . . . . . . . . . 18
2.4.2 Existing Real World Solutions . . . . . . . . . . . . . . . . . . . . 18
2.5 Real World Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3 The QSTSH 20
3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.2 License . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.3 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.5 Developed Optimisation and Accuracy . . . . . . . . . . . . . . . . . . . 23
3.6 Development To Date . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
4 Requirements 26
4.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.1.1 ETSP Must: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.1.2 ETSP Should: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
4.1.3 ETSP Could: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.1.4 ETSP Won't: . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . 27
4.2.1 ETSP Must: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.2.2 ETSP Should: . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
4.2.3 ETSP Could: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
4.2.4 ETSP Won't: . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
5 Methodology & Technologies 29
5.1 Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.1.1 Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.1.2 Requirement Acquisition . . . . . . . . . . . . . . . . . . . . . . . 29
5.1.3 System Development . . . . . . . . . . . . . . . . . . . . . . . . . 30
5.1.4 System Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
5.1.5 Documentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
5.2 Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
5.2.1 Licensing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
5.2.2 Java & JavaDoc . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.2.3 IDE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.2.4 Storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
5.2.5 Version Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
5.2.6 Parallels Desktop 8 . . . . . . . . . . . . . . . . . . . . . . . . . . 34
5.2.7 LaTeX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
6 System Design & Architecture 35
6.1 System Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
6.1.1 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
6.1.2 Plugin Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
6.1.3 ETSLib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
6.1.4 File Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
6.2 Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
6.2.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
6.2.2 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
6.2.3 Packages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
7 Implementation 44
7.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
7.2 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
7.2.1 Heuristic tester . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
7.2.2 Project Info Viewer . . . . . . . . . . . . . . . . . . . . . . . . . . 46
7.2.3 ETSP Terminal . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
7.3 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
7.3.1 Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
7.3.2 Functions and Keywords . . . . . . . . . . . . . . . . . . . . . . . 49
7.4 Plugin Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
7.5 ETSPLib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
7.6 Other Developments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
8 Experiment 53
8.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
8.2 Aim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
8.3 Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
8.4 Heuristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
8.4.1 Greedy Nearest Neighbour . . . . . . . . . . . . . . . . . . . . . . 54
8.4.2 QSTSH . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
8.5 Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
8.5.1 Test Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
8.6 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
8.6.1 Runtime . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
8.6.2 Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
8.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
9 Testing and Evaluation 58
9.1 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
9.1.1 HCI & Usability Testing . . . . . . . . . . . . . . . . . . . . . . . 58
9.1.2 Component Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 63
9.1.3 Scalability Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 64
9.1.4 Efficiency Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
9.1.5 Requirement Testing . . . . . . . . . . . . . . . . . . . . . . . . . 66
9.2 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
9.2.1 Functionality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
9.2.2 Usability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
9.2.3 Durability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
9.3 Results & Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
10 Conclusions 71
10.1 Achievements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
10.2 Lessons Learned . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
10.3 Future Work & Known Bugs . . . . . . . . . . . . . . . . . . . . . . . . . 72
10.3.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
10.3.2 Known Bugs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
10.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Bibliography 74
Appendices 77
A User Manual 77
A.1 Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
A.2 Installation, Setup & Initialisation . . . . . . . . . . . . . . . . . . . . . . 77
A.2.1 Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
A.2.2 Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
A.2.3 Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
A.2.4 Closing the Program . . . . . . . . . . . . . . . . . . . . . . . . . 81
A.3 Handy Things to Know . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
A.4 Using the System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
A.4.1 Making a Project File . . . . . . . . . . . . . . . . . . . . . . . . 81
A.4.2 Loading a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
A.4.3 Testing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
A.4.4 Interpreting Plugin Output . . . . . . . . . . . . . . . . . . . . . 84
A.4.5 Modifying the Project . . . . . . . . . . . . . . . . . . . . . . . . 84
A.4.6 Saving a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
A.4.7 Closing a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
A.5 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
A.5.1 Writing a Script . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
A.5.2 Running a Script . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
A.5.3 Interpreting Output . . . . . . . . . . . . . . . . . . . . . . . . . 90
A.6 Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
A.6.1 Writing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
A.6.2 Writing a Plugin Interface . . . . . . . . . . . . . . . . . . . . . . 92
A.6.3 Making a Plugin Runnable . . . . . . . . . . . . . . . . . . . . . . 93
A.6.4 Other Information on Running a Plugin . . . . . . . . . . . . . . 94
B Maintenance Manual 96
B.1 Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
B.1.1 JAR Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
B.1.2 Project Opening in Eclipse . . . . . . . . . . . . . . . . . . . . . . 96
B.1.3 Installing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . 97
B.2 Class Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
B.2.1 gui . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
B.2.2 etspLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
B.2.3 packageInterface . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
B.2.4 storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
B.3 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
B.4 Future Improvement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
B.4.1 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
B.4.2 Plugin Improvements . . . . . . . . . . . . . . . . . . . . . . . . . 102
B.4.3 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
B.5 Directory Content . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
C File Formats 106
C.1 .TSP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
C.1.1 File Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
C.1.2 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
C.2 .TXT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
C.2.1 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
C.3 .TSPRJ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
C.3.1 File Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
C.3.2 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
D User Testing Template and Results 112
D.1 User Testing Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
D.2 User Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
List of Figures
2.1 Example of a weighted graph (Wikipedia) . . . . . . . . . . . . . . . . . . . 16
3.1 QSTSH’s main function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.2 QSTSH’s Constructor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.3 Plots of depths at which QSTSH exceeds GNN logarithmically scaled and
normally scaled respectively. Showing a line of best fit. . . . . . . . . . . . . 25
5.1 Gantt Chart Plan of Documentation. . . . . . . . . . . . . . . . . . . . . . . 31
6.1 Main Screen Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
6.2 Heuristic Tester Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . 36
6.3 Project Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
6.4 ETSPLang Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
6.5 Sample Script . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
6.6 System Architecture Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . 40
6.7 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
6.8 System Packages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
7.1 ETSP overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
7.2 ETSP: Heuristic Testing Tab . . . . . . . . . . . . . . . . . . . . . . . . . . 45
7.3 ETSP: Heuristic Testing Tab in Action . . . . . . . . . . . . . . . . . . . . . 46
7.4 ETSP: Project Info Viewer . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
7.5 ETSP: Project Info Viewer Submit Request . . . . . . . . . . . . . . . . . . 47
7.6 ETSP: ETSPLang Terminal . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
7.7 ETSP: Help and About boxes . . . . . . . . . . . . . . . . . . . . . . . . . . 51
8.1 Runtime in ms plotted by number of points . . . . . . . . . . . . . . . . . . 55
8.2 Accuracy in % plotted by number of points . . . . . . . . . . . . . . . . . . . 56
List of Tables
6.1 Preliminary Vocabulary Design . . . . . . . . . . . . . . . . . . . . . . . . . 38
7.1 Table of ETSPLang keywords Part 1 . . . . . . . . . . . . . . . . . . . . . . 49
7.2 Table of ETSPLang keywords Part 2 . . . . . . . . . . . . . . . . . . . . . . 50
Chapter 1
Introduction
1.1 Overview
The Travelling Salesman Problem, also referred to as the TSP, in its simplest form is the
problem of finding the shortest route through a network of locations such that each location
is visited exactly once. The Travelling Salesman Problem has been studied in great detail
and a large set of solutions has been proposed. This project aims to create a testing platform,
the so-called Euclidean Travelling Salesman Platform or ETSP, for the investigation of
two-dimensional Euclidean TSP solutions. The platform will be used to investigate a
new algorithm for solving a TSP, the QSTSH, and to compare its performance to another
TSP heuristic. An important motivation for the development of the ETSP platform was a
previous personal research project on an optimised solution to the Travelling Salesman
Problem called the Quick Sort Travelling Salesman Heuristic, or QSTSH for short. It was
the desire to compare this heuristic with other heuristics that sparked the development of
this project.
1.2 Objectives
The final outcome of the project is to produce a working version of the Euclidean Travelling
Salesman Platform for testing travelling salesman heuristics. In particular, it will be used
to demonstrate the performance of QSTSH compared to another existing solution, such as
the Greedy Nearest Neighbour (GNN). When fully operational the Euclidean Travelling
Salesman Platform (ETSP) will offer benefits in many different ways:
– A working version of ETSP with the primary components and a usable GUI
– A plugin loader should run without the need for user input and in a user friendly
manner with a standardised format
– The test data should be loadable in many formats to suit the user's needs
– Test data should be manually editable in the system to add to or update
– The user interface should be split into separate sections for the different functionalities
– The output for the user in each section should be usable and understandable, in a
format that is user-oriented
– Of these sections there should be three main focuses: Heuristic Testing, a Project
Information Interface and a Scripting Terminal
– The interface components, such as buttons, should be deactivated and activated as
and when the user needs them
Since this is a whole new platform design, studying current heuristics and incorporating
the information learned will be carried out throughout the whole project, with particular
focus during system design and requirement gathering.
After the development of the system, to illustrate the working functionality, two
heuristics will be compared: QSTSH and the Greedy Nearest Neighbour Heuristic (GNN).
(See Chapter 8).
1.3 Motivations
A big motivation for ETSP was a previous personal research project on an optimised
solution to the Travelling Salesman Problem called the Quick Sort Travelling Salesman
Heuristic, or QSTSH for short. It was the desire to pitch this heuristic against others that
sparked the development of this project.
A further motivation for the development of ETSP is to help drive forward research into
optimised and accurate heuristics for the Travelling Salesman Problem, by allowing
several solutions to be tested against each other, as well as combinations of solutions. The
information gathered from such a system in the developmental process of heuristics as well
as the testing process could prove indispensable. ETSP will also remove the possible bias
created by running individual heuristics on separate systems during experiments: because
every heuristic runs on the same platform, the platform can be treated as a constant.
Along with the research possibilities, ETSP could be used as an education tool in
the teaching of the Travelling Salesman Problem or less specifically the importance of
optimisation and accuracy.
Another motivation is the lack of such a platform readily existing, that is not to say
such a platform does not exist but more that during the research for this project such a
platform could not be found (see Chapter 2).
1.4 Investigations & Goals
Given the motivations, several investigative questions were raised:
Is QSTSH better than a comparative traditional TSP heuristic?
Would a platform prove beneficial to TSP heuristic testing?
Is ETSP convenient to use and run experiments on?
Based on these questions it was concluded that there was a need for a platform that was
easy to use and produced usable output. This led to the following goals. The goals have
been broken down into three subgroups in an ordinal fashion from Primary to Tertiary
where the Primary goals take precedence over Secondary and Secondary over Tertiary.
This means that if achieving a Primary goal prevents a Secondary goal, this is classed as
an acceptable compromise.
1.4.1 Primary
– Create an easy to use system
– Make a simple means of loading in personal solutions to the TSP
– Demonstrate the system meets all set requirements
– Make the system easily usable in both education and research
1.4.2 Secondary
– Make the system with an intuitive GUI
– Build the system to cope with current and future versions of Java
– Build the system to make use of system resources to help with processing speed
– Test QSTSH against a Greedy Nearest Neighbour Heuristic
1.4.3 Tertiary
– Make a platform that can cope with all current major operating systems
– Design a platform so that it can handle multiple input types in both test solutions
and route data
1.5 Document Structure
This document follows a standard structure in a semi-chronological manner, starting with
the motivations for the project and the history behind the project idea, before moving
through requirement building and system design. After the design documentation comes
the development outline, followed by system testing. The testing is split into two chapters:
the first is an experiment testing two plugins, and the second consists of more requirement-related
testing. The document closes with a conclusion chapter before moving on to the
appendices, which contain the User and Maintenance Manuals as well as further miscellaneous
sections.
Chapter 2
Background
2.1 History of the Travelling Salesman Problem
The Travelling Salesman Problem is one of the most famous problems in computer science,
although it is not solely researched by computer scientists. In computer science the TSP
is widely studied for optimisation [16]. The fame of this problem could be down to its
simple definition but complex solutions as postulated by Merz and Freisleben [19].
It is widely claimed that there is no known origin of the travelling salesman problem
and all things considered there probably is no provable origin. A quick thought experiment
can show that finding the shortest route is inherent in all of us and indeed in nature.
Imagine that you are on your way to somewhere important, ahead of you are three known
paths A, B and C. You know B is the guaranteed quickest route and therefore, removing
the desire to sight see and dawdle around, you would pick B. Now let’s extrapolate this
further into basic nature. Picture three animals A, B and C all fighting for survival and
having to choose a route to gather food for the winter. Now imagine that A chooses the
optimal route, allowing it to collect more than enough food for the winter, whereas
B chooses a route suited to collect enough to survive the winter, but only just, and C does
not collect enough to survive the winter. In this chain of events C’s route is not passed on
whereas A’s route is definitely passed on and B’s is passed on if enough food is collected
to survive. This shows how the need to find optimal solutions to such a situation as the
Travelling Salesman Problem is built into nature.
Although no known origin of the problem exists, it would appear to have been first
mathematically formalised by Irish mathematician Sir William Rowan Hamilton and
British mathematician Thomas Penyngton Kirkman in the 1800s [4].
The TSP is also famous for being a known NP-Complete problem [22], that is to say
it is “a problem which is both NP (verifiable in nondeterministic polynomial time) and
NP-hard (any NP-problem can be translated into this problem)” [18]. This means that it
takes a very long time to calculate and verify shortest routes (see Section 2.3.2).
2.2 Different Travelling Salesman Types
2.2.1 Symmetric and Asymmetric
The sheer extent of the research on, and the scale of, the Travelling Salesman Problem
means that the TSP is often split up into different forms. The most common
split made is between the symmetric and asymmetric TSPs. The symmetric TSP is the most
common form, in which the weighting is equal in both directions; that is to say, the distance
between towns A and B is the same travelled in either direction: d(A, B) = d(B, A).
In the asymmetric problem the weighting is unequal between the two directions; that is to
say, the time taken to travel between towns A and B differs depending on the direction of
travel: d(A, B) ≠ d(B, A). As the symmetric problem is in some ways more representative
of the real world, much of the research has been concentrated on it [6].
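As a minimal illustration of the distinction, the following sketch checks whether a weight matrix is symmetric; the class name and matrix values are invented for this example and are not taken from ETSP:

```java
public class SymmetryDemo {
    // Returns true if the weight matrix is symmetric: d(i, j) == d(j, i) for all pairs.
    static boolean isSymmetric(double[][] d) {
        for (int i = 0; i < d.length; i++)
            for (int j = i + 1; j < d.length; j++)
                if (d[i][j] != d[j][i]) return false;
        return true;
    }

    public static void main(String[] args) {
        // Symmetric instance: travelling A -> B costs the same as B -> A.
        double[][] symmetric = {
            {0, 5, 9},
            {5, 0, 4},
            {9, 4, 0}
        };
        // Asymmetric instance: e.g. one-way streets make B -> A cheaper than A -> B.
        double[][] asymmetric = {
            {0, 5, 9},
            {3, 0, 4},
            {9, 4, 0}
        };
        System.out.println(isSymmetric(symmetric));   // true
        System.out.println(isSymmetric(asymmetric));  // false
    }
}
```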
2.2.2 Graph
Weighted graphs can be used to represent TSPs and are especially useful when not all
points are connected. For example, in the real world where travel between two cities directly
may not be possible due to road layouts. Since graphs can be directed or undirected, they
prove useful in adding constraints on the direction of travel between vertices.
A basic example of a weighted graph can be seen in Figure 2.1. In this graph the
vertices A, B, C and D are connected by weighted edges. A weight expresses, for example,
the distance between two vertices (locations in the TSP) or the cost, carbon footprint etc.
The weighted edges can be used to calculate a weighting of an overall route to visit all
four vertices.
Figure 2.1: Example of a weighted graph (Wikipedia)
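The idea can be sketched in code. The adjacency-matrix representation below uses infinity to mark missing edges, and its weights and names are illustrative assumptions, not the weights of Figure 2.1:

```java
public class WeightedGraph {
    // Infinity marks a missing edge (e.g. two cities with no direct road).
    static final double INF = Double.POSITIVE_INFINITY;

    // Sums the edge weights along a route given as a sequence of vertex indices.
    static double routeWeight(double[][] w, int[] route) {
        double total = 0;
        for (int i = 0; i + 1 < route.length; i++)
            total += w[route[i]][route[i + 1]];
        return total;
    }

    public static void main(String[] args) {
        // Four vertices A=0, B=1, C=2, D=3; weights are invented for illustration.
        double[][] w = {
            {0,   2,   INF, 7},
            {2,   0,   3,   INF},
            {INF, 3,   0,   1},
            {7,   INF, 1,   0}
        };
        int[] tour = {0, 1, 2, 3, 0};             // A -> B -> C -> D -> A
        System.out.println(routeWeight(w, tour)); // 2 + 3 + 1 + 7 = 13.0
    }
}
```

A route that uses a missing edge sums to infinity, which naturally disqualifies it when comparing route weights.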
2.2.3 Metric
The metric TSP, or ∆-TSP (delta-TSP), requires that the weightings between points satisfy
the triangle inequality, which states that d(A, B) ≤ d(A, C) + d(C, B). In short, the metric TSP
states that a straight line is never longer than a detour; although this sounds obvious,
under certain conditions and in certain forms of geometry it may not always hold.
The rule is strict when dealing with Euclidean space [15]. An example of the metric TSP
is the Euclidean TSP, which uses Euclidean distances, also known as ordinary distances, such as
a distance measured with a measuring tape.
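A small sketch, using invented points, shows the triangle inequality holding for ordinary Euclidean distances:

```java
public class MetricDemo {
    // Ordinary (Euclidean) distance between two 2D points.
    static double dist(double[] p, double[] q) {
        double dx = p[0] - q[0], dy = p[1] - q[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    // Checks the triangle inequality d(A, B) <= d(A, C) + d(C, B),
    // with a tiny tolerance for floating-point rounding.
    static boolean triangleHolds(double[] a, double[] b, double[] c) {
        return dist(a, b) <= dist(a, c) + dist(c, b) + 1e-12;
    }

    public static void main(String[] args) {
        double[] a = {0, 0}, b = {4, 0}, c = {1, 3};
        // The direct leg is never longer than the detour through c.
        System.out.println(dist(a, b));             // 4.0
        System.out.println(triangleHolds(a, b, c)); // true
    }
}
```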
2.2.4 Other Information
When talking about the Travelling Salesman Problem it is easy to assume it is exclusively
linked to three-dimensional physical space, when this is not actually the case. The most
common forms of the TSP are three-dimensional, symmetric and Euclidean, because these
map onto many real-life scenarios such as the namesake travelling salesman, but the
problem can also be posed in more or fewer dimensions.
2.3 Solution Types
2.3.1 Heuristics
A heuristic, in the computer science sense, is a way of producing a solution to a problem
that trades accuracy off against efficiency. Since the Travelling Salesman Problem is an
NP-Complete problem, the best that can be hoped for is an accurate heuristic. There are
several ways to categorise heuristics, for example as deterministic and nondeterministic. A
deterministic heuristic is one that always produces the same output given the same input,
while a nondeterministic heuristic is one that, through the use of random elements and
similar devices, may not always produce the same output given the same input. There
are situations where introducing a random element can create either a more optimised
solution or, on average, a more accurate one. When deciding on the type of heuristic
it is important to balance the requirements by asking questions such as: ‘what is more
important, efficiency or accuracy?’ and ‘should the heuristic always produce the same
output?’
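As a concrete example of a deterministic heuristic, the following is a minimal sketch of a greedy nearest-neighbour tour. It is an illustrative assumption of how such a heuristic might look, not the GNN implementation used later in this report:

```java
import java.util.ArrayList;
import java.util.List;

public class NearestNeighbourSketch {
    static double dist(double[] p, double[] q) {
        double dx = p[0] - q[0], dy = p[1] - q[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    // Deterministic heuristic: from a fixed start, repeatedly visit the
    // closest unvisited point. The same input always gives the same tour.
    static List<Integer> nearestNeighbourTour(double[][] points, int start) {
        int n = points.length;
        boolean[] visited = new boolean[n];
        List<Integer> tour = new ArrayList<>();
        int current = start;
        visited[current] = true;
        tour.add(current);
        for (int step = 1; step < n; step++) {
            int best = -1;
            double bestDist = Double.POSITIVE_INFINITY;
            for (int j = 0; j < n; j++) {
                if (!visited[j] && dist(points[current], points[j]) < bestDist) {
                    bestDist = dist(points[current], points[j]);
                    best = j;
                }
            }
            visited[best] = true;
            tour.add(best);
            current = best;
        }
        return tour;
    }

    public static void main(String[] args) {
        double[][] points = {{0, 0}, {10, 0}, {1, 1}, {9, 1}};
        System.out.println(nearestNeighbourTour(points, 0)); // [0, 2, 3, 1]
    }
}
```

A nondeterministic variant could, for instance, break near-ties between candidate points at random, producing different tours on different runs.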
2.3.2 Exact Algorithm
The exact algorithm is the fundamental solution to the TSP, so although it is not a viable
option on even the smallest list of points, it is best to outline it to better understand
the nature of the problem. The TSP is described as Superpolynomial, so to simplify the
calculations for the sake of demonstration we assume that the TSP runs in true factorial
time and we assume that each calculation takes just one nanosecond. This would mean
that for a list of 10 elements the algorithm would take just 3.63 ms. Now, with the addition
of 10 elements, making a list of 20, the algorithm would take 77.1 years, which is just above
670 billion (1 × 10^9) times longer than for a list of 10 elements. Finally, add 10 again to
give a list of 30 and it would take about 8.4 quadrillion (1 × 10^15) years to compute [25].
As the maths shows, this solution is simply not viable. In pseudocode this solution could
be described as:
function ExactTSP(P)
    d = ∞
    for each permutation Pi of the point set P do
        if dist(Pi) ≤ d then
            d = dist(Pi) and Pmin = Pi
        end if
    end for
    return Pmin
(adapted from Skiena [25])
It can be seen from the pseudocode why the exact algorithm takes so long, but also
how simple it is. The size of the set of permutations of P is the number of permutations
of a list of N points (SetSize = N!).
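The pseudocode above can be sketched as runnable Java. This brute force is an illustrative assumption (the class and method names are invented); it enumerates every tour, which is exactly why the approach is infeasible beyond tiny inputs:

```java
import java.util.Arrays;

public class ExactTSP {
    static double dist(double[] p, double[] q) {
        double dx = p[0] - q[0], dy = p[1] - q[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    // Length of the closed tour visiting points in the given order.
    static double tourLength(double[][] pts, int[] order) {
        double total = 0;
        for (int i = 0; i < order.length; i++)
            total += dist(pts[order[i]], pts[order[(i + 1) % order.length]]);
        return total;
    }

    static double best;
    static int[] bestOrder;

    // Recursively tries every permutation of the remaining points,
    // keeping the shortest tour found: O(N!) time, as in the pseudocode.
    static void permute(double[][] pts, int[] order, int k) {
        if (k == order.length) {
            double d = tourLength(pts, order);
            if (d < best) { best = d; bestOrder = order.clone(); }
            return;
        }
        for (int i = k; i < order.length; i++) {
            int t = order[k]; order[k] = order[i]; order[i] = t;
            permute(pts, order, k + 1);
            t = order[k]; order[k] = order[i]; order[i] = t;
        }
    }

    static int[] solve(double[][] pts) {
        best = Double.POSITIVE_INFINITY;
        int[] order = new int[pts.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        permute(pts, order, 1); // fix the start point to avoid counting rotations
        return bestOrder;
    }

    public static void main(String[] args) {
        // Unit square: the optimal closed tour goes around the edge, length 4.
        double[][] pts = {{0, 0}, {0, 1}, {1, 1}, {1, 0}};
        int[] tour = solve(pts);
        System.out.println(Arrays.toString(tour));   // [0, 1, 2, 3]
        System.out.println(tourLength(pts, tour));   // 4.0
    }
}
```

Fixing the start point trims the search to (N − 1)! permutations, but the growth remains factorial, matching the timings discussed above.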
2.4 Existing Solutions
During the research for the development of the Euclidean Travelling Salesman Platform
(ETSP), no previous solutions were found that encapsulate its requirements and
design decisions. It is for this reason that, instead of listing existing solutions comparable to ETSP,
solutions for the TSP in general are listed. Some of these solutions are only used for
research and some are used in the real world for everyday tasks.
2.4.1 Existing Research Solutions
In recent years, with the improvement of multi-core systems and the ability to build multi-agent
or distributed solutions to problems, a lot of research has been done on the so-called
Ant System [7–9, 11, 13]. Ant colony optimisation feeds perfectly into research on
genetic algorithms and AI, allowing for some ant colony solutions that use AI ants [12].
Along with distributed solutions and genetic or learning systems, there are larger
systems such as Concorde [3] and the Lin-Kernighan Heuristic, also known as LKH [17].
The reason for LKH's fame is the number of standard library forms of the TSP [24] it has
completed to date, together with its optimisation and the scale of the system. LKH has been shown to
find the optimal route for the 7397-city problem [14].
2.4.2 Existing Real World Solutions
Interestingly, TSP solutions in many forms exist ubiquitously in everyday life,
quite often as some kind of route planner. Although the exact algorithms behind
solutions such as the AA Route Planner [2], the RAC Route Planner [23] and the TomTom
Route Planner [26] are not publicly released, given their function it is easy to see that
they are most likely some form of TSP solution. Other solutions that may exist could
include parcel delivery systems and even air travel planning systems.
2.5 Real World Applications
ETSP is designed to not just be research restrictive but also to fit a wide variety of real
world applications. ETSP’s primary application would be for the testing of heuristics, but
that could be in any sense from education to research. From the point of view of research,
ETSP could be used to verify the efficiency and accuracy of TSP solutions whereas from
the point of view of education, ETSP could be used to help teach the importance of the
efficiency-accuracy trade-off in the development of heuristics, or more specifically could be
used in the education of the Travelling Salesman Problem.
Chapter 3 The QSTSH
3.1 Overview
QSTSH was named after its full title, the Quick Sort Travelling Salesman Heuristic, and
was more than an afternoon's work; in fact, it has been three years in development. The
origin of the Quick Sort Travelling Salesman Heuristic was a single burst of inspiration
whilst attending a lecture on Algorithmic Problem Solving in 2011. The lecture discussed
optimised solutions and broadly touched on Quick Sort as a proof of good optimisation
and on the Travelling Salesman Problem as an example of when optimisation was not
always possible. From this lecture the seeds of QSTSH were planted.
3.2 License
The reason for a license section is to state that QSTSH was not developed for this project
or for any other project and as such remains solely owned by William John MacLeod
Fraser. The use of QSTSH in this paper is purely as a demonstration of ETSP and as
such this chapter is just a formalisation of it. QSTSH is not licensed in any way to be
used for research without the express written permission of the author.
3.3 Background
The background of QSTSH is the Quick Sort algorithm. QSTSH is handed a list of points
which are assigned an xy value corresponding to the closest point on the x = y line. This
list is then copied twice to give three lists of the same points which are then sorted in three
separate ways by their x, y and xy points. After the three sorts have been completed, the
sorted lists are then passed to the main algorithm behind the heuristic where the route is
calculated. From a given starting point, the algorithm searches up and down the three
lists to a chosen depth in order to find the nearest neighbouring point to the starting point
which is then removed from the three lists and added to a solution list. The removed point
is then set as the starting point and the process is repeated until the three sorted lists
are empty. The depth is represented as a decimal percentage of each list. These sublists
would be checked for the nearest neighbour where 0 only checks the points above and
below the current point and 1 checks all points within the lists. The list output from this
heuristic is a good attempt at the shortest route from the start point visiting all points
right round back to the start point. Being deterministic, QSTSH will always give the same
output given the same set of inputs, thus making it easy to test for accuracy.
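The projection step above can be made concrete with a short Java sketch. This is an illustrative reconstruction rather than QSTSH's actual source: the closest point on the x = y line to a point (x, y) is ((x + y)/2, (x + y)/2), so a single scalar key of (x + y)/2 is assumed here to be enough to order points along that diagonal.

```java
// Illustrative sketch (not QSTSH's real code): the three sort keys that the
// background above describes. A point is sorted by x, by y, and by its
// projection onto the x = y line, assumed here to reduce to (x + y) / 2.
public class SortKeys {
    public static double xKey(double x, double y)  { return x; }
    public static double yKey(double x, double y)  { return y; }
    public static double xyKey(double x, double y) { return (x + y) / 2.0; }

    public static void main(String[] args) {
        // The point (3, 1) projects onto (2, 2) on the x = y line,
        // so its xy key is 2.0.
        System.out.println(xyKey(3, 1)); // prints 2.0
    }
}
```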
3.4 Implementation
QSTSH was initially developed in Java and can be easily split up into two parts: Data
Preprocessing and Route Calculation. During the Data Preprocessing all points are read
in and stored in three lists where each list is then sorted on a different perspective.
function QSTSHPreProcessing(SetOfPoints)
XList ← QuickSortOnX(SetOfPoints);
YList ← QuickSortOnY(SetOfPoints);
XYList ← QuickSortOnXY(SetOfPoints);
After the preprocessing, the main body of QSTSH is run on the three lists.
function QSTSH(Double depth, XList, YList, XYList)
SolutionList[0] ← XList[0]
remove XList[0], YList[XList[0]], XYList[XList[0]]
for XList.size-1 do
cPX = nextPoint(XList);
cPY = nextPoint(YList);
cPXY = nextPoint(XYList);
pos = 0;
for pos ≤ depth × XList.size do
if dist(SolutionList[pop],cPX)≥dist(SolutionList[pop],XList[pos]) then
cPX = XList[pos]
if dist(SolutionList[pop],cPY)≥dist(SolutionList[pop],YList[pos]) then
cPY = YList[pos]
if dist(SolutionList[pop],cPXY)≥dist(SolutionList[pop],XYList[pos]) then
cPXY = XYList[pos]
end for
if dist(SolutionList[pop],cPX)≥dist(SolutionList[pop],cPY) then
cPX = cPY;
if dist(SolutionList[pop],cPX)≥dist(SolutionList[pop],cPXY) then
cPX = cPXY;
SolutionList ← cPX
remove XList[cPX], YList[cPX], XYList[cPX]
end for
SolutionList ← XList[0]
Figure 3.1: QSTSH’s main function
The pseudocode of Figure 3.1 shows the inside of QSTSH in its most basic form.
Although the above code looks complex, it can be easily explained in plain English.
Initially the three lists go in and the first point in the sorted XList is added to the solution,
then that point is removed from all three sorted lists to prevent it being reselected. Then
for each point in the lists, barring the last, a process of testing the distance between the
preceding point and both the selected point and the previous best is carried out. The
point giving the shorter of the two distances is then stored. This check is only run through
points up and down from the position of the preceding point to a given depth. Depth
in this instance is a decimal percentage of the size of the list to check; a depth of 0.5
would only check half the list, a quarter up and a quarter down. The point chosen as the
best distance is added to the solution and removed from all input lists then the selection
procedure is run again until only one element is left where the last element is just added
to the solution.
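The depth-bounded scan described in plain English above could be sketched in Java as follows. The names (scanNearest, dist) and the double[] point representation are illustrative assumptions, not QSTSH's real code; the sketch only shows the selection step over a single sorted list.

```java
import java.util.List;

// Hypothetical sketch of the depth-bounded nearest-neighbour scan: from the
// index of the previous point, check at most `window` entries up and down a
// sorted list and return the index of the nearest other point.
public class DepthScan {
    static double dist(double[] a, double[] b) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return Math.sqrt(dx * dx + dy * dy);
    }

    // depth is a decimal fraction of the list size: a depth of 0.5 checks
    // a quarter of the list above and a quarter below the current index.
    public static int scanNearest(List<double[]> sorted, int from, double depth) {
        int window = (int) Math.ceil(depth * sorted.size() / 2.0);
        if (window < 1) window = 1; // depth 0 still checks direct neighbours
        int best = -1;
        double bestDist = Double.MAX_VALUE;
        for (int off = -window; off <= window; off++) {
            int i = from + off;
            if (i < 0 || i >= sorted.size() || i == from) continue;
            double d = dist(sorted.get(from), sorted.get(i));
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return best;
    }
}
```

In the full heuristic this scan would be run over all three sorted lists and the closest of the three candidates kept, as in the pseudocode of Figure 3.1.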
3.5 Developed Optimisation and Accuracy
After development of the original heuristic, several experiments were run to find any possible
optimisation or accuracy improvements. Out of these experiments several optimisations
and one major accuracy improvement were found . As QSTSH was initially developed in
Java some of the optimisation enhancements were purely code based such as allocating
all variables at the top of the class and instantiating all variables only once within the
constructor, as shown in Figure 3.2.
public class QSTSH {
public Vector<Point> listOfPoints, listOfSortedX, listOfSortedY,
listOfSortedXY, cloneList, solution;
public double DEPTH = 0;
public Point beg, last;
public QSTSH(double DTEMP, String index, Vector<Point> all,
Vector<Point> x, Vector<Point> y, Vector<Point> xy)
{
setUpLists();
listOfPoints.addAll(all);
listOfSortedX.addAll(x);
listOfSortedXY.addAll(xy);
listOfSortedY.addAll(y);
DEPTH = DTEMP;
beg = findBeg(index);
...
Figure 3.2: QSTSH’s Constructor
This improvement removed all unnecessary reinstantiation of variables and replaced
them with simple reallocation.
Another interesting discovery during testing was the difference in processing time
between functions on Java Vectors, ArrayLists, Sets and HashMaps. The outcome of the
tests showed that Vectors were by far the most efficient compared to ArrayLists; it is
worth noting that Sets and HashMaps were used for their unordered nature, to prove that
QSTSH required the Quick Sorts. All uses of ordered lists within the Java implementation
of QSTSH were changed to java.util.Vector.
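A minimal timing harness of the kind that could produce such a comparison is sketched below. Note that on modern JVMs an ArrayList is usually at least as fast as a Vector, since Vector's methods are synchronized, so results will vary with JVM version and workload; this sketch only illustrates the methodology, not the thesis's measurements.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

// Minimal micro-benchmark sketch: time n appends to each list type and
// print the elapsed nanoseconds. Real benchmarking would need warm-up runs
// and repetition; this only shows the shape of the comparison.
public class ListBench {
    static long timeAppends(List<Integer> list, int n) {
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) list.add(i);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        int n = 100_000;
        System.out.println("Vector:    " + timeAppends(new Vector<>(), n) + " ns");
        System.out.println("ArrayList: " + timeAppends(new ArrayList<>(), n) + " ns");
    }
}
```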
After the structural Java changes tests were done to find ways in which QSTSH could
be split up and run in multiple threads or processes. This proved difficult for the main
body of QSTSH but the sorts were split between different threads offering a further level
of optimisation. On top of this each sort was made into an in-line recursive form of the
quick sort algorithm, adding another level of efficiency and reducing memory consumption
further.
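Since the three sorts are independent, the threading change described above can be sketched as follows. A point is modelled as double[], the method names are illustrative, and the built-in sort stands in for the thesis's in-line recursive quicksort.

```java
import java.util.Comparator;
import java.util.Vector;

// Hypothetical sketch of the threading change: each of the three independent
// sorts runs on its own thread and is joined before the main body of QSTSH
// starts.
public class ParallelSorts {
    public static void sortAll(Vector<double[]> xs, Vector<double[]> ys,
                               Vector<double[]> xys) throws InterruptedException {
        Thread tx = new Thread(() ->
                xs.sort(Comparator.comparingDouble((double[] p) -> p[0])));
        Thread ty = new Thread(() ->
                ys.sort(Comparator.comparingDouble((double[] p) -> p[1])));
        Thread txy = new Thread(() ->
                xys.sort(Comparator.comparingDouble((double[] p) -> (p[0] + p[1]) / 2.0)));
        tx.start(); ty.start(); txy.start();
        tx.join(); ty.join(); txy.join(); // all three lists are sorted on return
    }
}
```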
The final major improvement was to accuracy, although it also proved beneficial
for efficiency, and it came directly from the experiments in the form of a static variable.
The depth variable allows users to set the checks to run through 100% of the list or
just the direct neighbours in the lists. As a general rule for accuracy the equation
depth = (1.6563 × NumberOfPoints^(−0.388)) − 0.0096 was found to give a rough best
depth. The tests that led to the discovery of this equation were carried out by plotting at
what depths QSTSH was better than the Greedy Nearest Neighbour (GNN) then finding
the line of best fit (the graphs created can be seen in Figure 3.3). When plotted, it
became evident that there must be some form of correlation between depth and the accuracy
of QSTSH, strong enough that a simple equation should be reachable. Although the depth produced
by this equation does not always provide the most accurate output from QSTSH for a
given dataset, it does on average give the best output. However, further experiments need to be
done to determine a better equation.
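Written out as code, the fitted rule is a one-liner; the constants come directly from the equation above, while the class and method names are illustrative.

```java
// The fitted depth rule reported above, with constants taken directly from
// the equation in the text. Names are illustrative, not from QSTSH's source.
public class DepthRule {
    public static double bestDepth(int numberOfPoints) {
        return 1.6563 * Math.pow(numberOfPoints, -0.388) - 0.0096;
    }
}
```

For 1,000 points this gives a depth of roughly 0.10, i.e. about 10% of each sorted list is scanned per selection, and the depth shrinks as the number of points grows.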
Figure 3.3: Plots of the depths at which QSTSH exceeds GNN, logarithmically scaled and
normally scaled respectively, showing a line of best fit.
3.6 Development To Date
To date, this is the first formalisation of QSTSH, but there is still a list of experiments and
improvements to be done in the future. The next major planned development for QSTSH
is to research the depth value and find an equation that better represents the optimal
depth value.
Chapter 4 Requirements
This section deals with the requirements in a MoSCoW fashion [5]: Must, Should, Could
and Won't have, applied to each of the functional and non-functional requirements.
4.1 Functional Requirements
The functional requirements for ETSP are as follows:
4.1.1 ETSP Must:
1. Load in files of the format (.txt, .tsp, .tsproj)
2. Look for and load in plugins with format (*impl.class)
3. Allow the user to save the projects in multiple formats (.txt, .tsproj)
4. Record the runtime of loaded plugins
5. Record the accuracy of a plugin's output
6. Display information on plugin runs accurately and usefully
4.1.2 ETSP Should:
1. Allow the user to program scripts
2. Load in files from other formats such as (.csv)
3. Look for and load in plugins with a class file containing required methods
4.1.3 ETSP Could:
1. Allow the user to program complex scripts for TSP solutions. Such functions that
should be available include:
1.1 Basic list manipulation
1.2 Variable comparison (such as Point b == Point a)
1.3 Operations (such as *, %, -, +) on Point data (such as X, Y values)
2. Load in all plugins in formats of (.JAR and .class) with required methods
3. Command line functionality
4.1.4 ETSP Won't:
1. Calculate the big O value or the big Ω value for given plugins
4.2 Non-Functional Requirements
The non-functional requirements for ETSP are as follows:
4.2.1 ETSP Must:
1. Load and be usable within 5s
2. Switch between tabs without any lag time
3. Save without any lag
4. Load files with up to 5000 Points within 5s
4.2.2 ETSP Should:
1. Work on the most current stable and soon to be stable versions of Java (Java 7 & 8)
2. Have a working/accurate progress bar for script running
3. Not crash when no plugins or no working plugins are given
4.2.3 ETSP Could:
1. Prevent itself from crashing while running outside plugins
4.2.4 ETSP Won't:
1. Be runnable on older versions of Java (pre Java 7)
Chapter 5 Methodology & Technologies
5.1 Methodology
This project was dealt with in an academic manner, as more of a research project than
a development task. For this, four primary steps were taken: research into the field and
previous systems, requirement acquisition, system development and system testing, with
documentation carried out throughout.
5.1.1 Research
Primarily, the research focused upon the history of pre-existing solutions, different forms
and real-life influences in relation to the Travelling Salesman Problem. This was carried
out using methods such as searching through papers on Google Scholar and checking for
resources in the library. Google Scholar provided the bulk of research material linking to
online publishing companies.
5.1.2 Requirement Acquisition
During the first stages of requirement acquisition design elements were chosen to answer
the question: “How can I fairly and accurately test a Travelling Salesman Solution in
a static environment to negate the change in environment from an experiment?” This
managed to raise a large number of requirements that suited the basic functions of the
system. For some of the more advanced functions, such as the functions behind ETSPLang,
a deeper general question was asked: “Is there a group of functions that could prove
useful to have when developing and testing a Travelling Salesman Solution?” Finally the
non-functional requirements were selected based on general knowledge about computer
systems and expected run environments such as a general rule of “Don’t keep the user
waiting”.
5.1.3 System Development
System Development was carried out in an Agile manner, completed in a set
of runs. The requirements chosen for completion within a run were decided based on what
could be completed in a week-long period. After each run the code was reviewed and the
next iteration of code was decided. On completion of version 1 of ETSP, further rounds of
refactoring were planned to allow for robustness of code. During development, version
numbering was carried out in an A.B.C pattern, where A represented the completion of a
set of requirements, B the completion of one requirement and C the completion
of a part of a requirement or a minor update. For this project the target of A was 1, with
the hope of suggesting groups of requirements that could be used in a future iteration.
5.1.4 System Testing
It was planned that the system would be tested in three ways: functional testing, non-functional
testing and HCI testing. Both functional and non-functional testing were carried
out in a yes/no (true/false) manner, as these can be objectively observed, whereas HCI
(Human Computer Interaction) testing was carried out using volunteers, with success
or failure depending on their input. An added step of testing functionality was taken by
running an experiment to compare two plugins.
5.1.5 Documentation
Documentation was carried out alongside the development of ETSP and organised by the
use of a Gantt Chart, see Figure 5.1. In place of breaking documentation down into
“runs”, it was broken down into subsections, each allotted an estimated length of time based
on the importance of the section. Instead of learning a new system for creating Gantt
Charts, it was decided to use the graphing function of Excel, as it adequately fills the role
and makes it easy to add in completed time in a timesheet style.
Figure 5.1: Gantt Chart Plan of Documentation.
5.2 Technology
Throughout the early stages of the project decisions on technologies were taken as and
when they became relevant to the progress of the project, keeping in mind the requirements
and the estimated user group of the project material. Other useful factors for decision
making included time constraints and previously used technologies.
5.2.1 Licensing
As with any use of technology, all licences had to be observed in full. Development was
to be done in Java with several libraries such as Swing [20], which are covered by the Java
Licence. Java applications can be released under the GNU General Public Licence, marking
them as free, or under the Oracle Binary Code License (BCL) free of charge [21]. Oracle
Java licences can be obtained from the Oracle website for Java use on some embedded
computer environments.
5.2.2 Java & JavaDoc
The decision to use Java was heavily weighted by time constraints. It is known that Java is
not the most optimised programming language. When dealing with heuristics designed to
be computationally heavy, it would probably be ideal to have the system run in a procedural
language such as C rather than an Object Oriented language, but this would have required
learning another language. In the end the decision was made to use Java based on prior
skills acquired in this language.
As with most projects it was desired to make ETSP “future proof” and as such it
was developed on Java 8, then periodically compiled and run on Java 7, thus assuring it
was suited to the stable release Java 7 and the future release Java 8. During development
of ETSP, on the 18th of March 2014, Oracle released Java 8 as the current stable version.
ETSP was also documented to follow coding standards using the JavaDoc tools to
allow for easy understanding during future developments.
5.2.3 IDE
Before making a decision on an IDE for the project, several IDEs were reviewed and
analysed for their usefulness in such a system's development. NetBeans, Eclipse and
SublimeText were all tested for suitability. Although SublimeText is arguably a text editor
and not an IDE, it was considered for its cross-platform, lightweight model.
In the end Eclipse (both Luna and Kepler, versions 4.4 and 4.3 respectively) was chosen
on its merits as an IDE that can manage projects and run applications without the need for
packaging. It also has a very user-friendly GUI which had been used and learned for other
projects.
Alongside Eclipse, SublimeText was used for the writing of LaTeX files. This
decision was made based on SublimeText's Build Systems feature, which allowed for the
creation of a simple script that built the PDF out of the respective .tex and .bib files.
5.2.4 Storage
Due to the light use of memory in ETSP it was quickly decided to use plain text files as
the input. This decision was made on the basis of an already established file type .tsp, see
Appendix C.1. Although the TSP file type was very accommodating in what it allowed,
it could not account for some of the session features used in ETSP, and it was deemed
overly complex for the initial loading of material. Because of these points, two other file
formats were chosen: .txt and .tsprj. TXT was chosen to allow for quick and accessible
loading of data; it is widespread and, as used in ETSP, is perfect for a very simplistic
loading file, see Appendix C.2 for details about the format. Alongside TSP and TXT a
more comprehensive file type and layout was created: TSPRJ, a file type that allows for
the storing of certain session information. Given its added complexity, TSPRJ was
primarily developed to be handled solely by ETSP and not really touched by the user,
see Appendix C.3 for details about TSPRJ such as layout and use.
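A loader for the simplest flat format might look like the sketch below. The actual TXT layout is defined in Appendix C.2 and is not reproduced here, so the one-"x y"-pair-per-line format is purely an assumption made for illustration.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Vector;

// Illustrative flat-file loader (the real TXT layout is in Appendix C.2).
// Assumes one "x y" pair per line; blank lines are skipped.
public class TxtLoader {
    public static Vector<double[]> load(Path file) throws IOException {
        Vector<double[]> points = new Vector<>();
        for (String line : Files.readAllLines(file)) {
            line = line.trim();
            if (line.isEmpty()) continue; // skip blank lines
            String[] parts = line.split("\\s+");
            points.add(new double[]{ Double.parseDouble(parts[0]),
                                     Double.parseDouble(parts[1]) });
        }
        return points;
    }
}
```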
5.2.5 Version Control
Throughout the project, version control was taken very seriously and was managed in three
parts. Each folder and structure was stored in DropBox, then the project files were backed
up to GitHub whenever a feature or class was completed, with incremental archiving and
storing of all files as zip files. This three-way control allowed for a much more dynamic
version management system. The use of DropBox negated the need to keep checking
whether files needed to be pulled, as this was managed by DropBox. The use of GitHub
allowed for a constant and well-managed version control system that could be accessed
in an emergency, whereas for simple file checks or a simple roll-back the use of zip archives
negated the need to use messy commands in Git.
This well-rounded version control also allowed for additional benefits such as GitHub's
online portal for viewing the project and DropBox’s portability for opening and editing
single files on the go.
5.2.6 Parallels Desktop 8
Parallels Desktop 8 [1] is a virtual machine tool used for running virtual operating systems
on a Mac, either in Coherence mode, where the Mac and chosen operating system run side
by side, or in boxed mode, where a virtual box is started up. It is licensed software
costing £64.99 per licence and was used for the testing of the project on several
major operating systems.
5.2.7 LaTeX
For the documentation of this project it was decided that LaTeX would be used with an
MScthesis class. LaTeX allows for better typesetting, figure placement and referencing
styles compared to other documenting tools.
Chapter 6 System Design & Architecture
6.1 System Design
6.1.1 User Interface
ETSP was designed to have two primary forms of user interface: terminal input and a GUI.
As GUIs are more sought after for their user experience and usability, the design of the GUI
was focused on most, with particular attention paid to good Human Computer Interaction
(HCI). To help with HCI, ETSP was designed to only allow users to run components
that were available and to disable any that were not. This not only ensures a better user
experience but also a robust system.
Figure 6.1: Main Screen Concept
The main user interface was broken up into four components in the form of three tabs
and the overall window. All tabs had to be housed together in a usable window with a
very simple, standard design. In Figure 6.1 the main window layout can be seen with the
three tabs shown on the top left of the window under the menu bar. For the menu bar
several decisions were made based on other programs such as having a File menu and a
Help menu to give a more conventional feel to the overall program.
Heuristic Tester
The main functionality of ETSP is a simple tester for TSP heuristics which outputs
statistics. These statistics include the runtime, the route cost (length) after the heuristic
has been run and, if a known optimal route has been set, a percentage of how
optimal the current route is.
For this part of ETSP a simple, uncluttered design was drawn up as seen in Figure 6.2
where on the left side there is the information about the working data as it stands and on
the right there is the output of a run heuristic with a run button separating the two. A
heuristic/plugin can be chosen from the plugins menu as seen at the top left of the main
screen concept in Figure 6.1.
Figure 6.2: Heuristic Tester Tab Concept
This design was considered best for its simple click-and-run functionality. Coupled
with this design would be the loading and saving options within the File menu as found in
most common programs.
Project Viewer
Due to the way data is used in ETSP it is important to be able to alter the project
information after it has been loaded into the system. For this a specific tab was designed to
allow for the displaying of all project data and the editing of specific project information.
Figure 6.3: Project Tab Concept
To help with usability the Project View tab, as seen in Figure 6.3, was split into two
parts by type of field where the top of the tab was used for editable fields and the bottom
of the tab was used for uneditable or locked fields. This prevents the user from changing
information they should not be able to change, whilst at the same time clarifying what
information can be changed.
ETSPLang
One of the major design points of ETSP is the ability to build and run scripts as part of
testing heuristics. Following the simple uncluttered design of the heuristic testing tab this
tab has three components: an input screen, output terminal and a run button as seen in
Figure 6.4.
Figure 6.4: ETSPLang Tab Concept
For a scripting system to work a standard language has to be designed and implemented.
This language will require syntax and keywords with definitions. The language design
is simple with commands being split by a new line and the use of a library of defined
commands, see Table 6.1 for the preliminary vocabulary design.
Term     Function
Run      Run a specific heuristic
Compare  Compare two or more heuristics
Log      Output a log file
//       Comment
Help     Display help data
Table 6.1: Preliminary Vocabulary Design
This language was designed so scripts such as that in Figure 6.5 could be run. As it
was preferred the development of the language would be dynamic, no strict design choices
were made until development.
// Give some comment
Run <Heuristic Name>
Compare <Heuristic 1> <Heuristic 2>
Help
Figure 6.5: Sample Script
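A dispatcher for this preliminary vocabulary could be sketched as follows; this is an illustration of the table's intent, not ETSPLang's actual parser, and the returned strings are placeholder actions.

```java
// Hypothetical sketch of a dispatcher for the preliminary vocabulary of
// Table 6.1: commands are split on newlines, comments start with "//", and
// the first token selects the action. Not ETSPLang's real parser.
public class MiniDispatcher {
    public static String dispatch(String line) {
        line = line.trim();
        if (line.isEmpty() || line.startsWith("//")) return ""; // comment
        String[] tokens = line.split("\\s+");
        switch (tokens[0]) {
            case "Run":     return "running " + tokens[1];
            case "Compare": return "comparing " + (tokens.length - 1) + " heuristics";
            case "Log":     return "writing log file";
            case "Help":    return "showing help";
            default:        return "unknown command: " + tokens[0];
        }
    }
}
```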
6.1.2 Plugin Loader
Because of known complexity issues with classpath loading in Java it was decided early on
that all classpath handling would be done without user input to prevent complications.
The plugin loader was designed such that, if a plugin's interface followed some constraints,
ETSP would automatically load it into the classpath and make it runnable.
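One way such a loader could work is sketched below using Java's URLClassLoader. This is an assumption about the mechanism, not ETSP's actual plugin loader; the *impl.class naming constraint is taken from the requirements in Chapter 4.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Sketch of a plugin loader: reject files that do not match the *impl.class
// pattern, add the plugin directory to a URLClassLoader, and load the class
// by the name derived from the file name.
public class PluginScanner {
    public static Class<?> loadPlugin(File pluginDir, String fileName)
            throws Exception {
        if (!fileName.endsWith("impl.class"))
            throw new IllegalArgumentException("not a plugin: " + fileName);
        URL[] urls = { pluginDir.toURI().toURL() };
        try (URLClassLoader loader = new URLClassLoader(urls)) {
            String className =
                    fileName.substring(0, fileName.length() - ".class".length());
            return loader.loadClass(className);
        }
    }
}
```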
6.1.3 ETSLib
It was decided during the design process that, to add to the usability of the system, some
standard libraries should be built to help in the development of plugins suited to ETSP
and out of this idea Euclidean Travelling Salesman Library or ETSLib was born. ETSLib
is designed to handle any additional libraries that could be removed from the main ETSP
system and used in both the plugins and in ETSP. Creating this library would add a level
of standardisation to the data structures used in communication between plugins and
ETSP.
6.1.4 File Loader
Given the way data is handled, and the volume in which it can exist, there was a need for
a file loader and saver of some description. Given this need, a simple flat file loading and
saving system was designed to be integrated as part of the overall ETSP system.
The flat file system decision was based on a pre-existing file type created for TSPs
with the file extension .tsp (see Appendix C.1), while a decision was also taken to design
further file types for use within ETSP, see Appendix C.
6.2 Architecture
6.2.1 Overview
Much deliberation went into the chosen architectural design of the system. Based on the
plugin testing design, it was decided to go for a closed three-tiered architecture. As with
any system that runs outside code, it is imperative to look out for security issues; this
is the reason for the closed architecture, which prevents the user's plugin code from changing
items in higher tiers. Separating the system into three tiers produces: a Presentation Tier
hosting all the GUIs, a Business Logic Tier hosting the processing of plugin runtimes and
statistics as well as the plugin interfaces connecting to the plugins and a Storage Tier
hosting the functions for saving and loading data files, loading plugins at startup and
storage entities. These three tiers are illustrated in Figure 6.6.
Figure 6.6: System Architecture Diagram.
In a strict closed three-tier architecture the system is designed so that
each component is only capable of calling components on the same level or the one below.
Therefore, the Presentation Tier can only call the Business Logic Tier, and the Business
Logic Tier can only call the Storage Tier and connected plugins. A closed architecture
provides the added security stated above and helps when tracing problems in the system,
making it the perfect choice. The system can be cleanly broken up into the three
primary components, and the project is designed to strictly follow this architectural
plan.
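The closed-tier rule can be made structural in Java by giving each tier a reference only to the tier directly below it, as in this sketch; interface and method names are illustrative, not ETSP's real API.

```java
// Sketch of the closed three-tier rule: each tier holds a reference only to
// the tier directly below it, so a call from the presentation tier has no
// path to storage except through the business logic tier.
public class TierDemo {
    interface Storage { String read(); }
    interface Logic   { String process(); }

    static class FileStorage implements Storage {
        public String read() { return "points"; }
    }
    static class BusinessLogic implements Logic {
        private final Storage storage; // the only downward reference
        BusinessLogic(Storage s) { this.storage = s; }
        public String process() { return "processed " + storage.read(); }
    }
    static class Presentation {
        private final Logic logic;     // no Storage reference at all
        Presentation(Logic l) { this.logic = l; }
        public String show() { return logic.process(); }
    }
}
```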
Presentation Tier
Given the chosen architecture this tier should only be coupled to the Business Logic Tier
and should not communicate directly with the Storage Tier or plugins. The Presentation
Tier will house the Graphical User Interface (GUI) and Command Line Interface (CLI) for
user interaction with the system.
Business Logic Tier
This tier will act as an interface for communication between the GUI/CLI and the Storage
Tier and Plugins. This tier will carry out all calculations for runtimes and accuracies and
interface directly with the plugins connected by the users.
Storage Tier
As storage in ETSP will primarily be done in the form of flat files this tier will communicate
with the file system. This tier will carry out read and write commands from and to files
and will only be accessible through the Business Logic Tier to prevent Plugins accessing it.
6.2.2 Class UML
Figure 6.7 shows the Class UML for ETSP as designed and implemented.
Figure 6.7: Class UML
6.2.3 Packages
The system design was broken up into three main sections with several packages within
each and organised based on relevance. One of the primary aims of the system design was
to follow the Object Oriented Paradigm within the selected three-tier architecture. The
three main sections used are outlined with their packages below in Figure 6.8.
Figure 6.8: System Packages
The ETSP section contains the main system itself, from which ETSP can be run.
ETSP is broken down into four packages: the GUI makes up the Presentation Tier, the
PluginInterface and ETSPLang make up the Business Logic Tier, and the Storage
package makes up the Storage Tier. The ETSLIB section contains required and useful
packages for the creation of plugins so they follow a standard format. Although plugins
are created by the users they are a part of the system and as such they are given a section
of their own.
Chapter
7Implementation
7.1 Overview
The implementation of ETSP finalised for this project contained all major functionality
discussed in the requirements, Chapter 4. For the demonstration of ETSP several plugins
were built to be deployed with the system. The user interface was developed with the
original designs in mind and with the desire to find ways to add to the Human Computer
Interaction of the system.
Figure 7.1: ETSP overview
7.2 User Interface
It is worth noting that although all major functionality was produced, a Command Line
Interface (CLI) was not. Although a CLI was originally planned, it was decided that
the time would be better spent adding HCI improvements to the main Graphical User
Interface, thus giving the first full version of ETSP as seen in Figure 7.1. Only some minor,
innocuous changes exist between the conceptual main view and the final product, such
as using the word Project instead of File in the top left corner of the menu bar. Overall,
ETSP's components have either lived up to or exceeded the expectations of their original
designs.
7.2.1 Heuristic tester
When ETSP is launched the Heuristic Tester is the first tab loaded, with all unusable
options and buttons deactivated, as can be seen in Figure 7.1 where the run button, the three
tabs at the top and the plugins menu are all greyed out, stopping the user from using them.
This was done to make ETSP robust against user-caused crashes. Once a file is loaded the
Heuristic Tester becomes completely unlocked to the user, as seen in Figure 7.2.
Figure 7.2: ETSP: Heuristic Testing Tab
It can also be seen from Figure 7.2 that the original three-piece design, with information
before the run to the left, information after the run to the right and the run button in the
middle, was not entirely kept. The basis of the original design was still used, so the before
and after information still sit left and right respectively with a run button separating
them. However, the addition of constant information above and the list of previous runs
below provided a cleaner output.
Figure 7.3: ETSP: Heuristic Testing Tab in Action
If a user runs a plugin, the page allows the user to compare the current information on
the dataset and the outcome of the plugin before deciding whether or not to accept the
new output, as seen in Figure 7.3. This allows the user to compare the new route length
to the previous and the new route accuracy to the previous, while the system disables the
ability to change the heuristic or to change tab.
7.2.2 Project Info Viewer
The project info viewer was kept to the original concept, with the editable information sitting
in a section above the uneditable information, as seen in Figure 7.4. This tab allows the
user to change nearly anything about a loaded project, from the name and the preferred file
path to the list of points.
Figure 7.4: ETSP: Project Info Viewer
If any points were added, removed or edited, all the statistical
information of previous plugin runs had to be reset. To prevent confusion and
to add to the HCI of the project, a simple yes/no dialogue was created to ensure the user
understands that submitting changes will remove previous information, as seen in Figure
7.5.
Figure 7.5: ETSP: Project Info Viewer Submit Request
To help with HCI and robustness, input checks were made preventing damaging input
from being submitted. This is done by checking the input of each element when the user
tries to click elsewhere. If the element contains damaging data the input element will be
highlighted red and prevent the user from continuing with editing.
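As a rough illustration of the kind of per-field check described above, the sketch below validates a field's text when focus would leave it. The method name and the exact set of rejected characters are assumptions (based on the special characters mentioned in the component testing chapter), not code from ETSP.

```java
// Sketch of the per-field check described above: when focus leaves a field its
// text is validated, and fields containing "damaging" characters are rejected
// (and, in ETSP, highlighted red). The rejected character set here is an
// assumption, not taken from the ETSP source.
public class InputCheck {
    static boolean isSafe(String input) {
        return !input.isEmpty() && input.chars().noneMatch(c -> "\"/}".indexOf(c) >= 0);
    }

    public static void main(String[] args) {
        System.out.println(isSafe("wi29"));     // prints true
        System.out.println(isSafe("bad/name")); // prints false
    }
}
```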
7.2.3 ETSP Terminal
The ETSP Terminal is the user interface to ETSPLang and has an extremely simple
three-component design following the original concept, as seen in Figure 7.6. Within the
ETSP Terminal tab the two main components are the input window and the output window.
In the current version of ETSP the output is produced by tying the Java standard out
(StdOut) to a TextPane, meaning the output consists of the Java System.out.print statements
that would normally be pushed to the terminal. This was decided to optimise the code, as
individual outputs would not have to be created and then pushed to the TextPane; instead,
the Java System class handles the output. To improve the system's HCI, the output panel
was made uneditable by the user and set to auto-scroll to the end when output is added.
The input panel of the ETSP Terminal is a simple editable TextPane that, when the button
is clicked, passes its contents to the ETSPLang package to be processed.
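The redirection described above follows a standard Java pattern: System.out is swapped for a PrintStream whose OutputStream appends to the text component. The sketch below is illustrative, not the ETSP source; a StringBuilder stands in for the Swing TextPane so the idea can be shown without a GUI.

```java
import java.io.OutputStream;
import java.io.PrintStream;

// Illustrative sketch of tying StdOut to a text component. The names here
// are assumptions; a StringBuilder stands in for ETSP's TextPane.
public class StdOutRedirect {
    static String capture(Runnable task) {
        final StringBuilder pane = new StringBuilder(); // stand-in for the TextPane
        PrintStream original = System.out;
        System.setOut(new PrintStream(new OutputStream() {
            @Override public void write(int b) { pane.append((char) b); }
        }, true));
        try { task.run(); } finally { System.setOut(original); } // always restore StdOut
        return pane.toString();
    }

    public static void main(String[] args) {
        // The print below goes to the "pane", not the console.
        String shown = capture(() -> System.out.print("route length: 27603"));
        System.out.println(shown); // prints: route length: 27603
    }
}
```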
Figure 7.6: ETSP: ETSPLang Terminal
7.3 ETSPLang
As of version 1.0.0 of ETSP, ETSPLang sits at version 0.5.9. This is due to the unforeseen
complexity of developing a scripting language. Even with the missing features, enough was
developed to run basic scripts and to perform the two most important functions: comparing
two lists of plugins and running a single plugin (an example of a script that has been run
can be seen in Figure 7.6).
7.3.1 Syntax
The current version of ETSPLang was given a basic syntax to make an easy-to-read
scripting language. For ease of reading it was decided that all keywords would be
upper-case, colons would be used as separators for lists, and each new command would be
placed on a new line.
7.3.2 Functions and Keywords
As of ETSPLang version 0.5.9 only eight keywords have been developed, but they were
chosen as the most important.
Keyword     | Function
HELP        | Displays a complete help dialogue.
CLEAR       | Clears the output terminal.
OUTPUT      | Displays a before-and-after output of the script up to this point.
STORE       | Stores the changes to the live project.
            | Denotes a comment.
Table 7.1: Table of ETSPLang keywords, Part 1
Keyword     | Function
START_S     | Denotes the start of a script.
END_S       | Denotes the end of a script.
COMP        | Compares two lists of heuristics split by a colon.
RUN         | Runs an individual list of heuristics split by a colon.
OUTPUTLEVEL | Sets the output level to one of NORMAL, SILENT or VERBOSE.
Table 7.2: Table of ETSPLang keywords, Part 2
Some of the major decisions about the use of the language and its functionalities were
not made until development was under way. This allowed for a more flexible development
procedure, avoiding being tied to predetermined ideals. Within the language it was decided
that start and end commands would prove helpful for readability. This decision was made
on the basis of sensitive and insensitive commands: the commands RUN, COMP and
OUTPUTLEVEL are classed as sensitive and are required to sit within the START_S and
END_S tags, whereas all other commands are not. During development it was noted that
the two simplest necessary commands would be one to compare heuristics and one to run
heuristics, and as such COMP and RUN were developed. A full list of the current keywords
and their functions can be seen in Table 7.1 and Table 7.2.
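Putting the syntax rules and keywords together, a script of the kind described might look like the following sketch. The exact argument layout is an assumption drawn from the keyword descriptions above (COMP taking two heuristic lists split by a colon); it is not a script taken from the ETSP sources.

```
START_S
OUTPUTLEVEL VERBOSE
COMP QSTSH : GNN
END_S
OUTPUT
```

Here the sensitive commands (OUTPUTLEVEL and COMP) sit inside the START_S/END_S tags, while OUTPUT, being insensitive, can appear outside them.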
7.4 Plugin Loader
The plugin loader for ETSP version 1.0.0 was designed with very specific requirements,
and as such plugins have to be written with very specific interfaces and use very specific
data-structures. Currently the plugin loader can only load a plugin that has an interface
class file following the naming convention <plugin name>imple.class. It was intended that
the plugin loader could also handle JAR files as well as class files, but due to time
constraints this was not done for this version. Future advancements are discussed in
Section 10.3.
Within each plugin interface class two functions are required: runAlg(Vector<Point>
vp), which runs the heuristic given a vector of ETSPLib.Storage.Point, and getList(), which
returns a vector of Points. A standardised data-structure was chosen to make developing
plugins easier.
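A minimal sketch of what such a plugin, and the reflective lookup a naming-convention loader has to perform, might look like. Everything here is illustrative: the trivial stand-in "heuristic" keeps the input order, and double[] points stand in for ETSPLib.Storage.Point.

```java
import java.util.Vector;

// Hypothetical plugin whose class name follows the "<plugin name>imple"
// convention described above; illustration only, not code from ETSP.
class Demoimple {
    private Vector<double[]> route = new Vector<>();
    public void runAlg(Vector<double[]> vp) { route = new Vector<>(vp); }
    public Vector<double[]> getList() { return route; }
}

public class PluginLoaderSketch {
    public static void main(String[] args) throws Exception {
        String pluginName = "Demo";
        // ETSP would load the .class file from disk; here the class is already
        // on the classpath, so Class.forName stands in for that step.
        Class<?> cls = Class.forName(pluginName + "imple");
        Object plugin = cls.getDeclaredConstructor().newInstance();

        Vector<double[]> points = new Vector<>();
        points.add(new double[]{1.0, 2.0});
        cls.getMethod("runAlg", Vector.class).invoke(plugin, points);
        Vector<?> out = (Vector<?>) cls.getMethod("getList").invoke(plugin);
        System.out.println(out.size()); // prints 1
    }
}
```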
7.5 ETSPLib
As outlined in the design for ETSP the creation of a library of standard functions and
entities was carried out throughout the development. ETSPLib houses the data-type
Point which is used by ETSP to handle individual coordinates. At the moment only the
Point class is held by ETSPLib. Some beneficial developments were acknowledged but not
carried out and are outlined in Section 10.3
7.6 Other Developments
Other notable developments in ETSP include a help box and an about box, see Figure
7.7, which help create a feeling of familiarity with the program by adding features commonly
found in other programs. Other HCI improvements were added, such as undo (Control + Z)
and redo (Control + Y) functionality, as these two functions have almost become expected
when dealing with a platform in which the user can make a mistake entering information.
On top of this, some background HCI was added in the form of remembering screen-size
preferences and file locations.
Figure 7.7: ETSP: Help and About boxes
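Undo/redo of this kind is usually built on Swing's UndoManager listening to a document's edits. The sketch below shows that standard wiring; how ETSP actually binds Control + Z / Control + Y to it is not shown in the report, so this is an assumed illustration.

```java
import javax.swing.text.PlainDocument;
import javax.swing.undo.UndoManager;

// Standard Swing undo/redo wiring; illustrative, not ETSP's own code.
public class UndoSketch {
    public static void main(String[] args) throws Exception {
        PlainDocument doc = new PlainDocument();  // backing document of a text field/pane
        UndoManager undo = new UndoManager();
        doc.addUndoableEditListener(undo);        // every edit is recorded
        doc.insertString(0, "wi29", null);
        undo.undo();                              // what Control + Z would trigger
        System.out.println(doc.getLength());      // prints 0
        undo.redo();                              // what Control + Y would trigger
        System.out.println(doc.getText(0, doc.getLength())); // prints wi29
    }
}
```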
During development the decision was made to use some operating-system-specific
packages such as the com.apple.eawt package. This allowed the use of some of Apple's
Mac OSx specific functions, such as the hopping icon: when user attention is required,
the application's icon hops in the Mac OSx Dock along the bottom of the screen.
Along with this, full-screen functionality was added to utilise the full-screen ability of
Mac OSx Lion (10.7) and later. These design decisions were taken to give the system
a more “natural” Look and Feel when used on Mac OSx without hindering usability
on other operating systems.
On a strictly development point, several classes were superseded by a major improvement
in some way; where this occurred, rather than delete the older classes, it was decided to
mark them as @deprecated so future developers can look back on previous developmental
decisions.
Chapter 8: Experiment
8.1 Overview
As stated in the motivations of ETSP, one of the desired outcomes was the ability to test
QSTSH against a comparable heuristic using a fair platform. For this test it was decided to
use the Greedy Nearest Neighbour, as it shares several similarities with QSTSH.
8.2 Aim
To determine the accuracy and efficiency of the Quick Sort Travelling Salesman Heuristic
(QSTSH) with reference to the Greedy Nearest Neighbour Heuristic (GNN).
8.3 Hypotheses
Given previous experiments it is valid to state that QSTSH will, on average, prove more
accurate than GNN and given the complexity of the code, QSTSH will prove more efficient
than GNN on larger datasets but less efficient on smaller datasets.
8.4 Heuristics
8.4.1 Greedy Nearest Neighbour
The Greedy Nearest Neighbour (GNN) is one of the simplest deterministic travelling
salesman heuristics and can be summed up in one short sentence: choose a starting point,
find its closest neighbouring point, move to it, and continue until no points are left. GNN
is a deterministic heuristic in that it will give the same output when given the same input.
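The GNN described above can be sketched as follows; the double[]{x, y} point representation and the fixed starting point are illustrative choices, not the ETSP implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Compact sketch of Greedy Nearest Neighbour: start at a fixed point and
// repeatedly move to the closest unvisited point.
public class GreedyNN {
    static List<double[]> tour(List<double[]> points) {
        List<double[]> remaining = new ArrayList<>(points);
        List<double[]> tour = new ArrayList<>();
        double[] current = remaining.remove(0); // fixed start => same input, same output
        tour.add(current);
        while (!remaining.isEmpty()) {
            int best = 0;
            double bestDist = Double.MAX_VALUE;
            for (int i = 0; i < remaining.size(); i++) {
                double dx = remaining.get(i)[0] - current[0];
                double dy = remaining.get(i)[1] - current[1];
                double d = dx * dx + dy * dy; // squared distance suffices for comparison
                if (d < bestDist) { bestDist = d; best = i; }
            }
            current = remaining.remove(best);
            tour.add(current);
        }
        return tour;
    }

    public static void main(String[] args) {
        List<double[]> pts = List.of(new double[]{0, 0}, new double[]{5, 5},
                                     new double[]{1, 0}, new double[]{1, 1});
        for (double[] p : tour(pts)) System.out.println(p[0] + "," + p[1]);
    }
}
```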
8.4.2 QSTSH
In short, QSTSH is a specialised form of nearest-neighbour heuristic with a depth
variable denoting how far through the list to search for the nearest neighbour. The
features that specifically characterise QSTSH are its three sorted lists, sorted on the X-axis,
the Y-axis and the nearest XY coordinate. These sorted lists are searched for a nearest
neighbour in only a fraction of the list around where the previous point was situated.
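A rough sketch of the windowed-search idea, using a single x-sorted list for brevity; the real QSTSH keeps three sorted lists and computes its depth from an equation, and all names here are illustrative rather than the QSTSH source.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative windowed nearest-neighbour search over one x-sorted list.
public class WindowedNN {
    static int nearestWithinDepth(List<double[]> sortedByX, int fromIdx,
                                  boolean[] used, int depth) {
        double[] from = sortedByX.get(fromIdx);
        int best = -1;
        double bestDist = Double.MAX_VALUE;
        // Only inspect up to `depth` entries either side of the previous point,
        // rather than scanning the whole list as plain nearest neighbour would.
        int lo = Math.max(0, fromIdx - depth);
        int hi = Math.min(sortedByX.size() - 1, fromIdx + depth);
        for (int i = lo; i <= hi; i++) {
            if (used[i] || i == fromIdx) continue;
            double dx = sortedByX.get(i)[0] - from[0];
            double dy = sortedByX.get(i)[1] - from[1];
            double d = dx * dx + dy * dy;
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return best; // -1 means no unused point inside the window
    }

    public static void main(String[] args) {
        List<double[]> pts = new ArrayList<>(List.of(
            new double[]{0, 0}, new double[]{1, 4}, new double[]{2, 0}, new double[]{9, 9}));
        pts.sort(Comparator.comparingDouble(p -> p[0]));
        System.out.println(nearestWithinDepth(pts, 0, new boolean[pts.size()], 2)); // prints 2
    }
}
```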
8.5 Method
Each set of test data was prepared from the standard library of TSP data files on
the University of Waterloo website, found at http://www.math.uwaterloo.ca/tsp/world/
countries.html. As both GNN and QSTSH are deterministic in nature, ETSP was
started up and each test file was loaded in. GNN and QSTSH were then run separately,
and the runtime and accuracy of each were recorded. In the results, both accuracy and
runtime are plotted against the number of points in the test file.
8.5.1 Test Environment
For the sake of fairness, a test environment was set up consisting of a Parallels Desktop
Virtual Machine running OSx 10.9.2 Mavericks with 16 Gigabytes of RAM and 4 × 3.2
GHz Intel i5 cores. The version of Java chosen for testing was Java 8.
8.6 Results
8.6 Results
Although the results were collected from the same set of run data, they can be broken
down into two major sections: runtime and accuracy.
8.6.1 Runtime
The runtime results are best represented by the graph in Figure 8.1, where the red
and blue lines show that the growth functions of the two heuristics are startlingly
different. The graph alone supports the hypothesis that QSTSH is the more efficient at
higher point counts than GNN. What cannot be seen in the graph is the outcome on the
smaller datasets: the results showed that up to a dataset size of 980 points, GNN was as
efficient as or more efficient than QSTSH. Using R, an F-test was run on the output data,
giving a p-value of less than 2 × 10^-16, which is significant at the 0.001 level.
[Line graph: runtime (ms) against number of coordinates, for QSTSH and GNN]
Figure 8.1: Runtime in ms plotted by number of points
8.6.2 Accuracy
Results for the accuracy of QSTSH and GNN are plotted in Figure 8.2, where the
nearest-neighbour element of QSTSH is evident. As stated before, at the core of QSTSH
is a simple nearest-neighbour-style heuristic and, as expected, the two share a very
similar wave pattern in the results. GNN had an average accuracy of 78.25% whereas
QSTSH had an average accuracy of 78.58%, which supports the hypothesis that the current
version of QSTSH is, on average, more accurate, though not by much. The accuracy results
were run through an F-test in R, which gave a p-value of 5.29 × 10^-14, significant at the
0.001 level.
[Line graph: route accuracy (%) against number of coordinates, for QSTSH and GNN]
Figure 8.2: Accuracy in % plotted by number of points
8.7 Conclusion
The experiment above shows clearly that QSTSH has a smaller growth function for
runtime and is more accurate on average. However, because the average accuracies are so
close, it is inconclusive whether QSTSH is genuinely more accurate on average, so further
tests would need to be run. It is worth pointing out that QSTSH can be tailored to be
more accurate by finding the depth that is best for each dataset, but
this task can be laborious, defeating the efficiency purpose of QSTSH. A standard equation
was used to find the depth value in order to make this test fairer and keep the depth a
fixed variable, albeit one worked out by an equation. Future research is planned to find
a better equation or function for the depth variable that will increase the average accuracy.
Chapter 9: Testing and Evaluation
9.1 Testing
Unless stated otherwise, all program testing was carried out on a Mid 2012 MacBook Pro
with 8 GB of RAM and a dual-core 2.9 GHz Intel i7 processor. Testing on other operating
systems was carried out using Parallels Desktop Pro 8® on the same MacBook Pro with
1 GB of RAM and 1 core. It is also worth noting that testing was used heavily throughout
development to remove bugs, and as such much of this testing could be deemed obsolete,
as it tests for the very bugs that were preemptively removed. This is not to say the testing
was pointless, as it solidifies the claims made about the system. Furthermore, given the
emphasis on HCI within development, it was decided that user testing would be a good
way to collect HCI and usability results.
9.1.1 HCI & Usability Testing
This section makes reference to the data collected from users testing which can be found
in Appendix D. A template for the user testing can be seen in Appendix D.1 and the
final results can be seen in Appendix D.2 where the User Number at the top of the screen
denotes which user the results relate to.
About The Test
The test consisted of five simple experiments and some before and after questioning. The
five experiments were:
1. Open ETSP and open the project file titled wi29.tsp
2. Run the plugin named QSTSH.
3. Change the project name in the Project Information Viewer Tab.
4. In the ETSP Terminal tab run a script.
5. Save the project and close ETSP.
Each experiment was given in one simple line, as above, with minimal information. This
was a deliberate linguistic choice, designed to force the participant into using the User
Manual if or when they got stuck. This technique paid off and is discussed later.
In order to collect user data all user feedback was anonymised. All participants are
informed in the Preamble of the user testing document that their data will be collected
and used in an anonymised manner and that they should only continue as far as they are
comfortable. Participants are required to agree to the conditions in the Preamble before
their data can be used. The Preamble can be seen in Appendix D.1. As standard, help
was given only when absolutely necessary, and this occurred in only one case.
Sadly, due to a lack of interest, only seven participants took part in the user testing,
but the results are nonetheless telling. For each experiment the users were asked to rate
Ease of Use out of 10 (1 being hard and 10 being easy) and to state whether the system
had any issues. These questions allowed the user to report a problem with understanding
separately from a problem with the system.
Before The Experiment
Before the experiment the users were asked to rate their skills and analyse their capabilities.
These questions were specifically chosen to gauge the users' understanding of both the
concepts behind ETSP and Java. The individual user output can be seen in Appendix
D.2 and is summarised here. The aim was to recruit a wide variety of people with different
skill levels and understandings of both Java and the Travelling Salesman Problem, but due
to a lack of interest only one participant knew about the TSP.
Four questions were asked in this section, starting with one asking the user to rate
their Java capabilities out of 10. The average result was 3.6, with a range of 1 to 9. Next
the users were asked to give their years of experience with Java, which averaged just above
1.4. This may seem rather low, but the range was 0 to 4 and the mode was 0. As stated
above, only one person said they had worked on the TSP prior to this experiment. The
last thing collected in this section was which operating system the user would be using for
the test. The outcome of this question was heavily skewed towards Windows, with only
one user on Mac.
The overall outcome of this section showed that the user base for the testing was
predominantly people who would probably not use ETSP in their normal lives. As they
showed very little knowledge of the relevant fields and concepts, these tests could prove
useful for HCI and usability, as these users would probably never have been faced with
the jargon or field-specific information. This means that the system's usability comes
down mostly to the user's interaction and not to their knowledge of the field of study.
Experiment 1
In experiment one the user is asked to “Open ETSP and open the project titled wi29.tsp”.
This test required several steps and as the user is given as little information as possible it
is expected that most people would have to use the User Manual. The average Ease of Use
of this experiment was 6.9 which shows that starting ETSP and loading a project file was
mostly easy. Only one user experienced problems with the system directly and that was
User 6. After observing this problem it was realised that the delivery method of the user
test was to blame as the user was required to extract the archive file before running ETSP
and this was not made clear. Apart from problems with the system itself four other users
reported they experienced problems. These problems have been attributed to usability
where one user (User 3) noted that the use of a “New Project” menu option was rather
confusing when it actually loaded a Project. Given the evidence especially the average
Ease of Use having a range of 2 to 10, a mode of 10 and a mean of 8 it would be fair to
say that starting ETSP and loading a project is relatively easy and user friendly.
Experiment 2
Experiment 2 Simply asked the user to “Run the plugin named QSTSH”. Again although
this experiment required several steps only one small instruction was given, meaning that
the users were probably going to need to utilise the User Manual. After this experiment
the average Ease of Use was 8.6 with a rage of 5 to 10. No problems were experienced
with the system and this time only one user experienced any other problems. User 6 rated
a Ease of Use of 5 and gave the reason “I’m finding the User Manual difficult to use and
interpret. There’s no contents page and a [lot] of JARgon.” This argument is valid as User
6 was shown to have a low understanding of the background concepts from the initial
questions and as such would not be expected to fully understand all JARgon in the User
Manual.
Overall, this test showed that selecting and running a plugin must be either fairly
intuitive or easy to understand from the User Manual. User 3 even commented “This was
a very simple experiment and once again the User Manual was very clear.” and User 4
commented “The figures showing how to run the plugin were very helpful.”
Experiment 3
For this experiment the user was faced with the line “Change the project name in the
Project Information Viewer Tab.” This again was a single line experiment where the user
would either have to use intuition or the User Manual. This experiment received an average
Ease of Use of 9.3 with a rage of 7 to 10 and a mode of 10. No problems were reported
about the system although User 6 did report the system working unexpectedly but judging
by the additional information the system acted as expected for the system but User 6 did
not expect such an action. User 6 again reported an issue with the User Manual and the
experiment by posing the question “What am I meant to change the project name [to]?”.
Barring User 6’s issues the results from the experiment are conclusive that again either
the system is rather intuitive or the User Manual is easy to use. User 4 commented “Very
easy to find and change the name without the use of the manual” which is a plus for the
usability of the system.
Experiment 4
The user in this experiment is faced with a script to run in ETSP Terminal. The script
is provided as part of the experiment but the command is just “In the ETSP Terminal
tab run the below script”. Results from this test were excellent giving an average Ease
of Use of 10 and absolutely no problems reported for the system or otherwise. Very few
additional comments were made but those who did comment commonly expressed a lack
of understanding of what was shown in the output. Along with this experiment users were
asked “Could you write and run your own scripts?” three out of seven said they could
which is about expected as only three out of the seven users are known to have studied
some form of computing course.
Experiment 5
Experiment 5 was a simple saving procedure where the user was asked to “Save the
Project as a .TSPRJ file and close ETSP” . The results from this section matched that
of Experiment 4 where no users reported any problems and all rated Ease of Use as 10
(Very Easy). User 4 even left the additional comment of “I was able to save the project
without needing to look at the User Manual” which helps show that the saving and closing
features are intuitive.
Other Questions
The Other Questions section of the user testing was designed to grasp how the user felt
about the User Manual as a whole. The first of the questions asked was: “Was the User
Manual easy to follow?” of which six of the seven users answered as Yes. This is a good
sign that the User Manual is of the right level for the end user. Secondly the users were
asked “Was the manual of an expected technical level?” This question was given a worse
rating as four and a half out of seven. The half vote came about as User 3 answered as
Yes & No and gave a reasoning in the More Information section. The reasoning seemed to
be oriented towards a a lack of understanding of the technical information behind ETSP,
this was made apparent with the line “writing a java script”. The next question asked the
user “Was there any section of the technical manual that seemed too technical?” which
four users answered as Yes. Again it seemed to follow that those with an understanding of
Java answered No whereas those with no background in computing answered as Yes.
When the users were asked to “Rate the User Manual as a whole out of 10, (10 Excellent
- 1 Poor)”, the average score was 8.3, which shows that on the whole the User Manual
was of a decent level. Finally, users were asked if they “would feel comfortable writing a
plugin given the instructions in the User Manual?” As expected, all three people with Java
knowledge answered Yes and those without answered No.
Overall Analysis
From the user tests it can be seen that the system is usable by the majority of people and
never demonstrated any programming issues throughout the testing. Although several
issues were pointed out, such as the use of the word New where Load would have been
better and a lack of a table of contents in the User Manual, all in all the system seemed to
live up to expectation. All parts of the system that would require a computing background
were marked as such and sections that should be doable using intuition and the User
Manual were answered as such. An interesting pattern emerged during the experimenting
where throughout testing the average Ease of Use increased from one test to another. This
could be attributed to things such as getting used to the system but further testing would
be needed to prove or disprove this.
9.1.2 Component Testing
In order to test the components of ETSP several tasks were carried out within each tab.
Every time the outcome was as expected this was considered a pass and where the outcome
was not as expected a fail. This test was done strictly black box in style.
Heuristic Tester
To test the Heuristic Tester tab several small tasks were carried out. Firstly a file was
loaded in before a plugin was selected from the menu. Both of these tasks were carried
out several times before moving on to running plugins. Several plugins (QSTSH, GNN
and BubbleThrough) were run with no unforeseen effects. Testing this tab also included
opening the About menu and trying out both the about box and the help box which
worked as expected.
Project Info Viewer
Testing the Project Info Viewer tab was fairly straight forward. One change was made in
each box for the project and saved. A further in depth test was carried out where each
component had input data that should not be accepted such as naming the file using
special characters (such as ”, /, }). These tests were all successful.
ETSP Terminal
For testing the ETSP Terminal tab, and ultimately ETSPLang, some small scripts were
written and run. A known bug with the ETSP Terminal is that an END_S command is
not strictly required because it is not actually checked for, which the testing confirmed.
Excluding this known bug, all scripts ran as expected and gave the expected output. As
well as working scripts, some non-scripts were tested to check that the system would give
the expected error output. All error reporting worked as expected.
9.1.3 Scalability Testing
As the Travelling Salesman Problem can easily be scaled up in multiple ways, two specific
scalability tests were carried out, along with a multi-platform test.
Large Datasets
One way in which the Travelling Salesman Problem can be scaled up is by the size of
the input datasets. Within the standard TSP files hosted by the University of Waterloo,
datasets range from 29 to 71,009 points, and there is no limiting factor on the number
of possible points in a dataset (barring physical limits such as computer memory). While
testing dataset sizes, a test was considered a pass if the data loaded in without the
appearance of inactivity. Using this passing criterion, ETSP managed to load in all datasets
from the list found on the University of Waterloo server. As a final test, the world dataset
containing 1,904,711 points was loaded; it loaded in under a second and was usable almost
instantly. During this testing it was noted that the larger datasets took longer and longer
to load into the Project Info Tab, to the point where long periods of inactivity occurred.
The world dataset took over an hour to load into the Project Info Tab, but this was
foreseen. It was considered adequate because a prompt dialogue is used on large datasets
to make sure users really do want to open this tab, given the delays.
Complex Heuristics
The running of complex heuristics is the best lens through which to examine ETSP's
scalability. Although heuristic complexity cannot strictly be tested for, several things were
tested to rule out possible issues.
Using a deliberately flawed version of QSTSH, designed to stop working at random
points in order to test how ETSP handles bugged heuristics, several tests were carried out.
When the heuristic stopped working, ETSP would simply return to the pending state,
waiting for a plugin to be run.
Where plugins take long periods of time to run, there can be periods of apparent
inactivity from the system, which could be seen as a negative.
All in all, ETSP appears to handle heuristics of any size or type, but exhaustive testing
would be required to completely back this claim.
Multi Platform Testing
During development, ETSP was designed to run on all current major operating systems:
Windows 7 & 8, Mac OSx 10.8 & 10.9 and Ubuntu Linux 12.04 & 13.04. Some
additional features were put in place for operating-system-specific functions such as OSx
Lion full screen. To test ETSP on the different operating systems, separate virtual machines
were created for all the operating systems listed above. Each machine was given 1 GB of
RAM and one 2.9 GHz Intel i7 core. All systems worked as expected except Mac OSx
10.8 & 10.9 which, when on a virtual machine, seemed to have issues loading in test files.
As this problem had been experienced during development, a compatibility mode was
created allowing the user to start the JAR from the terminal with a compatibility GUI.
When run on Mac OSx outwith a virtual machine, ETSP appeared to work fine.
These tests concluded that ETSP worked on all major operating systems with no
obvious issues that were not already accounted for during development.
9.1.4 Efficiency Testing
As a large part of ETSP's efficiency is down to external code, only tests relating to
ETSP itself were carried out. These tests covered the loading and saving speeds of files,
switching between tabs, and the accepting/rejecting of run data. To pass, the system had
to carry out the task with no overly long pauses; an arbitrary threshold of 3 seconds was
chosen. Testing concluded that files load in less than 3 seconds; this was also the case
for switching between tabs, which averaged less than 2 seconds, again dependent on
project size, as large files slow the loading of the Project Info Tab. Rejecting and accepting
run data completed in under a second. This test concluded that the system-specific parts
of ETSP are efficient, but again it should be stressed that the system relies on outside
code, which cannot be verified as efficient at this point.
9.1.5 Requirement Testing
Requirement testing was done in a simple Yes/No fashion, in the order of the requirements
chapter (Chapter 4). Yes means the requirement exists and works as expected; anything
marked No was either not developed or did not pass testing. Failed or unimplemented
features are explained.
Functional Requirements: Must
Load in files of the format (.txt, .tsp, .tsproj), Yes
Look for and load in plugins with format (*impl.class), Yes
Allow the user to save the projects in multiple formats (.txt, .tsproj), Yes
Record the runtime of loaded plugins, Yes
Record the accuracy of a plugins output, Yes
Display information on plugin runs accurately and usefully, Yes
All must-have requirements were developed and passed when tested.
Functional Requirements: Should
Allow the user to program scripts, Yes.
Load in files from other formats such as (.csv), No
Look for and load in plugins with a class file containing required methods, Yes
During development it was decided that CSV files did not have a desirable file structure,
and as such CSV loading was not implemented for the first version of ETSP. Apart from
CSV file loading, all other requirements were implemented in full and worked when tested.
Functional Requirements: Could
Allow the user to program complex scripts for TSP solutions. Such functions that should
be available include: Basic list manipulation, No
Variable comparison (Such as Point b = Point a), No
Operations (Such as *, %, -, +, ) on Point data (Such as X, Y values), No
Load in all plugins in formats of (.JAR and .class) with required methods, No
Command line functionality, No
In the end it was deemed infeasible to add some of the planned functionalities to
ETSPLang, for reasons including complexity and time constraints. Some command line
functionality was added, but only to place the system into a compatibility-mode GUI;
given time constraints, it was decided that it was best to focus on the GUI and its HCI
rather than develop a second user interface. As for plugin types, currently only .class
plugins can be loaded, as time constraints and developmental preferences prevented any
further plugin types being developed.
Functional Requirements: Wont
Calculate the big O value or the big Ω value for given plugins, No
This was placed in wont because the task could be a project unto itself, given the
difficulty of working out code complexity by analysing the code alone.
Non-Functional Requirements Must
Load and be usable within 5 seconds, Yes
Switch between tabs without any lag time, Yes
Save without any lag, Yes
Load files with up to 5000 Points within 5 seconds, Yes
All non-functional must-have requirements were met, although larger datasets make
switching to the Project Info Tab increasingly slow. However, this is to be expected when
dealing with large volumes of data.
Non-Functional Requirements Should
Work on the most current stable and soon to be stable versions of Java (Java 7 & 8), Yes
Have a working/ accurate progress bar for script running, Yes/No
Not Crash when no plugins or no working plugins are given, Yes
Unfortunately, during development it proved difficult to implement a progress bar that worked on
all operating systems without impeding the efficiency of the system. It is acknowledged
that a solution for the progress bar must exist, but one was not found during development.
For scripts within ETSPLang a form of progress output was created instead: feedback
to the user indicating where in the script execution currently is. All other
non-functional should-have requirements were implemented in full and worked when tested.
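The per-step script feedback described above can be sketched as a simple reporting loop; the names below are hypothetical and stand in for the actual ETSPLang interpreter, which is not shown here.

```java
import java.util.ArrayList;
import java.util.List;

public class ScriptProgress {
    // Reports which line of a script is currently being executed,
    // standing in for a conventional progress bar. Each message is
    // collected so it can be shown to the user (or asserted in tests).
    public static List<String> run(String[] scriptLines) {
        List<String> feedback = new ArrayList<>();
        for (int i = 0; i < scriptLines.length; i++) {
            feedback.add("Running line " + (i + 1) + " of "
                         + scriptLines.length + ": " + scriptLines[i]);
            // ... interpret scriptLines[i] here ...
        }
        feedback.add("Script complete.");
        return feedback;
    }
}
```

Because the feedback is text rather than a GUI widget, it behaves identically across operating systems, which is consistent with why this route was taken when a portable progress bar proved elusive.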
Non-Functional Requirements: Could
Prevent itself from crashing while running outside plugins, Yes
If a faulty, damaged or glitchy plugin is used, the system will not crash; instead it
will simply abandon the plugin run and return to a stable state.
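This behaviour amounts to catching anything a plugin throws and restoring a stable state. A minimal sketch of the pattern follows; the class name is hypothetical and the real recovery logic in ETSP is not reproduced here.

```java
public class SafePluginRunner {
    // Runs a plugin action, catching anything it throws (including
    // Errors) so that a faulty plugin cannot crash the host system.
    // Returns a status string describing what happened.
    public static String runSafely(Runnable plugin) {
        try {
            plugin.run();
            return "completed";
        } catch (Throwable t) {
            // Abandon the plugin run and report a stable state.
            return "aborted: " + t.getClass().getSimpleName();
        }
    }
}
```

Catching Throwable rather than Exception is deliberate here: a misbehaving plugin may throw Errors (for example from corrupted bytecode) that an Exception handler would miss.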
Non-Functional Requirements: Won't
Be runnable on older forms of Java (pre-Java 8), Yes
Although this was placed under won't, it was incorporated. ETSP has only been tested
on Java 7 and Java 8, and because Java 8 was released during development, emphasis on
Java 7 was dropped. Not all tests have therefore been carried out on Java 7, so although
it is given a "Yes", this only means the system has worked on Java 7 so far without any
obvious issues.
9.2 Evaluation
9.2.1 Functionality
Using the MoSCoW system for functionality, as in the Requirements chapter
(Chapter 4), the system has had the majority of its functionality developed to a usable level.
All must-have requirements, both non-functional and functional, were developed, and all
important should-have requirements were developed as per the requirements chapter. The
could-have functional requirements were not developed, owing to complexity
issues and time constraints; however, the could-have non-functional requirement was
developed.
9.2.2 Usability
Throughout development a large emphasis was placed on usability, and after testing it was
apparent that this emphasis had paid off. By implementing features that users expect of
systems, such as undo functionality, and operating-system-specific touches, such as Mac OS X Lion
(10.7) full-screen mode, a more standard look and feel was achieved, making the system
feel familiar and easy to use.
It was brought up during user testing that the use of the word "New" in the Project
menu seemed rather unusual, given that the menu item actually loads a project file. This could
easily be changed at a future date.
9.2.3 Durability
It was encouraging to see that during testing the system remained robust at all times and
did not suffer any unexpected crashes. On the evidence from testing, the system can
be said to be durable and robust, without any obvious issues.
9.3 Results & Conclusion
Although user testing was carried out with only seven participants, the results indicate
that the system is intuitive and that the User Manual is easy to follow. To strengthen this
claim a larger user group would be needed, selected to give a more accurate
representation of the expected user base for ETSP. Given the results from the testing and
the explanations given in the evaluation section above, it is fair to say that
ETSP is a working, usable and robust system as of the development of its first major
iteration.
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics
ETSP Platform Tests TSP Heuristics

More Related Content

What's hot

W java81
W java81W java81
W java81rasikow
 
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...Automatic Detection of Performance Design and Deployment Antipatterns in Comp...
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...Trevor Parsons
 
Learn python the right way
Learn python the right wayLearn python the right way
Learn python the right wayDianaLaCruz2
 
Team Omni L2 Requirements Revised
Team Omni L2 Requirements RevisedTeam Omni L2 Requirements Revised
Team Omni L2 Requirements RevisedAndrew Daws
 
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...Artur Filipowicz
 
FATKID - A Finite Automata Toolkit - NF Huysamen
FATKID - A Finite Automata Toolkit - NF HuysamenFATKID - A Finite Automata Toolkit - NF Huysamen
FATKID - A Finite Automata Toolkit - NF HuysamenNico Huysamen
 
Introduction to methods of applied mathematics or Advanced Mathematical Metho...
Introduction to methods of applied mathematics or Advanced Mathematical Metho...Introduction to methods of applied mathematics or Advanced Mathematical Metho...
Introduction to methods of applied mathematics or Advanced Mathematical Metho...Hamed Oloyede
 
Am06 complete 16-sep06
Am06 complete 16-sep06Am06 complete 16-sep06
Am06 complete 16-sep06Nemo Pham
 
0802 python-tutorial
0802 python-tutorial0802 python-tutorial
0802 python-tutorialZahid Hasan
 
An introduction-to-predictive-maintenance
An introduction-to-predictive-maintenanceAn introduction-to-predictive-maintenance
An introduction-to-predictive-maintenancesaad
 

What's hot (17)

W java81
W java81W java81
W java81
 
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...Automatic Detection of Performance Design and Deployment Antipatterns in Comp...
Automatic Detection of Performance Design and Deployment Antipatterns in Comp...
 
Learn python the right way
Learn python the right wayLearn python the right way
Learn python the right way
 
Team Omni L2 Requirements Revised
Team Omni L2 Requirements RevisedTeam Omni L2 Requirements Revised
Team Omni L2 Requirements Revised
 
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...
Virtual Environments as Driving Schools for Deep Learning Vision-Based Sensor...
 
FATKID - A Finite Automata Toolkit - NF Huysamen
FATKID - A Finite Automata Toolkit - NF HuysamenFATKID - A Finite Automata Toolkit - NF Huysamen
FATKID - A Finite Automata Toolkit - NF Huysamen
 
Introduction to methods of applied mathematics or Advanced Mathematical Metho...
Introduction to methods of applied mathematics or Advanced Mathematical Metho...Introduction to methods of applied mathematics or Advanced Mathematical Metho...
Introduction to methods of applied mathematics or Advanced Mathematical Metho...
 
Scikit learn 0.16.0 user guide
Scikit learn 0.16.0 user guideScikit learn 0.16.0 user guide
Scikit learn 0.16.0 user guide
 
Am06 complete 16-sep06
Am06 complete 16-sep06Am06 complete 16-sep06
Am06 complete 16-sep06
 
0802 python-tutorial
0802 python-tutorial0802 python-tutorial
0802 python-tutorial
 
Tutorial edit
Tutorial editTutorial edit
Tutorial edit
 
Sg246776
Sg246776Sg246776
Sg246776
 
Free high-school-science-texts-physics
Free high-school-science-texts-physicsFree high-school-science-texts-physics
Free high-school-science-texts-physics
 
Offshore structures
Offshore structuresOffshore structures
Offshore structures
 
An introduction-to-predictive-maintenance
An introduction-to-predictive-maintenanceAn introduction-to-predictive-maintenance
An introduction-to-predictive-maintenance
 
Gemini Manual
Gemini ManualGemini Manual
Gemini Manual
 
PhD-2013-Arnaud
PhD-2013-ArnaudPhD-2013-Arnaud
PhD-2013-Arnaud
 

Similar to ETSP Platform Tests TSP Heuristics

eclipse.pdf
eclipse.pdfeclipse.pdf
eclipse.pdfPerPerso
 
Specification of the Linked Media Layer
Specification of the Linked Media LayerSpecification of the Linked Media Layer
Specification of the Linked Media LayerLinkedTV
 
C++ For Quantitative Finance
C++ For Quantitative FinanceC++ For Quantitative Finance
C++ For Quantitative FinanceASAD ALI
 
Introduction to Programming Using Java v. 7 - David J Eck - Inglês
Introduction to Programming Using Java v. 7 - David J Eck - InglêsIntroduction to Programming Using Java v. 7 - David J Eck - Inglês
Introduction to Programming Using Java v. 7 - David J Eck - InglêsMarcelo Negreiros
 
Cenet-- capability enabled networking: towards least-privileged networking
Cenet-- capability enabled networking: towards least-privileged networkingCenet-- capability enabled networking: towards least-privileged networking
Cenet-- capability enabled networking: towards least-privileged networkingJithu Joseph
 
optimization and preparation processes.pdf
optimization and preparation processes.pdfoptimization and preparation processes.pdf
optimization and preparation processes.pdfThanhNguyenVan84
 
project Report on LAN Security Manager
project Report on LAN Security Managerproject Report on LAN Security Manager
project Report on LAN Security ManagerShahrikh Khan
 
Advanced-java.pptx
Advanced-java.pptxAdvanced-java.pptx
Advanced-java.pptxMiltonMolla1
 

Similar to ETSP Platform Tests TSP Heuristics (20)

eclipse.pdf
eclipse.pdfeclipse.pdf
eclipse.pdf
 
Jmetal4.5.user manual
Jmetal4.5.user manualJmetal4.5.user manual
Jmetal4.5.user manual
 
dissertation
dissertationdissertation
dissertation
 
Specification of the Linked Media Layer
Specification of the Linked Media LayerSpecification of the Linked Media Layer
Specification of the Linked Media Layer
 
Programming
ProgrammingProgramming
Programming
 
Javanotes5 linked
Javanotes5 linkedJavanotes5 linked
Javanotes5 linked
 
MS_Thesis
MS_ThesisMS_Thesis
MS_Thesis
 
Liebman_Thesis.pdf
Liebman_Thesis.pdfLiebman_Thesis.pdf
Liebman_Thesis.pdf
 
C++ For Quantitative Finance
C++ For Quantitative FinanceC++ For Quantitative Finance
C++ For Quantitative Finance
 
Javanotes6 linked
Javanotes6 linkedJavanotes6 linked
Javanotes6 linked
 
Introduction to Programming Using Java v. 7 - David J Eck - Inglês
Introduction to Programming Using Java v. 7 - David J Eck - InglêsIntroduction to Programming Using Java v. 7 - David J Eck - Inglês
Introduction to Programming Using Java v. 7 - David J Eck - Inglês
 
test6
test6test6
test6
 
Cenet-- capability enabled networking: towards least-privileged networking
Cenet-- capability enabled networking: towards least-privileged networkingCenet-- capability enabled networking: towards least-privileged networking
Cenet-- capability enabled networking: towards least-privileged networking
 
Master_Thesis
Master_ThesisMaster_Thesis
Master_Thesis
 
optimization and preparation processes.pdf
optimization and preparation processes.pdfoptimization and preparation processes.pdf
optimization and preparation processes.pdf
 
Agathos-PHD-uoi-2016
Agathos-PHD-uoi-2016Agathos-PHD-uoi-2016
Agathos-PHD-uoi-2016
 
Agathos-PHD-uoi-2016
Agathos-PHD-uoi-2016Agathos-PHD-uoi-2016
Agathos-PHD-uoi-2016
 
z_remy_spaan
z_remy_spaanz_remy_spaan
z_remy_spaan
 
project Report on LAN Security Manager
project Report on LAN Security Managerproject Report on LAN Security Manager
project Report on LAN Security Manager
 
Advanced-java.pptx
Advanced-java.pptxAdvanced-java.pptx
Advanced-java.pptx
 

ETSP Platform Tests TSP Heuristics

  • 1. The Development and Testing of the Euclidean Travelling Salesman Platform: Single Honours Report MSci William J. M. Fraser w.fraser.10@aberdeen.ac.uk 51010160 Friday 16th May, 2014 Abstract To help in the hunt for an optimised solution to the Travelling Salesman Problem (TSP) this project follows the development and testing of a platform called the Euclidean Travelling Salesman Platform (ETSP) designed for testing TSP heuristics. This project was motivated by previous research into an optimised TSP solution called QSTSH. A Java platform was designed, developed and tested to a version one and used to test QSTSH. From the testing it was concluded that ETSP was easy to use and robust. Testing was also done on QSTSH by comparing it to a Greedy Nearest Neighbour which found QSTSH to be better in accuracy and efficiency. 1
  • 2. Acknowledgements I would like to acknowledge a couple people for their contribution to this work. First and foremost, I would like to acknowledge the support and advice of Dr Martin Kollingbaum throughout this project. Secondly, I would like to acknowledge Dr Nir Oren of the University of Aberdeen for initially sparking my interest in the Travelling Salesman Problem and algorithms in general. “Nothing is impossible. Some things are just less likely than others.” -Jonathan Winters 2
  • 3. Contents Contents 3 List of Figures 8 List of Tables 9 1 Introduction 10 1.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10 1.2 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10 1.3 Motivations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 1.4 Investigations & Goals . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 1.4.1 Primary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 1.4.2 Secondary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13 1.4.3 Tertiary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13 1.5 Document Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13 2 Background 14 2.1 History of the Travelling Salesman Problem . . . . . . . . . . . . . . . . 14 2.2 Different Travelling Salesman Types . . . . . . . . . . . . . . . . . . . . . 15 2.2.1 Symmetric and Asymmetric . . . . . . . . . . . . . . . . . . . . . 15 2.2.2 Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15 2.2.3 Metric . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16 2.2.4 Other Information . . . . . . . . . . . . . . . . . . . . . . . . . . 16 2.3 Solution Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17 2.3.1 Heuristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17 2.3.2 Exact Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . 17 2.4 Existing Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18 2.4.1 Existing Research Solutions . . . . . . . . . . . . . . . . . . . . . 18 3
  • 4. CONTENTS 2.4.2 Existing Real Word Solutions . . . . . . . . . . . . . . . . . . . . 18 2.5 Real World Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 3 The QSTSH 20 3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20 3.2 License . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20 3.3 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21 3.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21 3.5 Developed Optimisation and Accuracy . . . . . . . . . . . . . . . . . . . 23 3.6 Development To Date . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25 4 Requirements 26 4.1 Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 4.1.1 ETSP Must: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 4.1.2 ETSP Should: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 4.1.3 ETSP Could: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27 4.1.4 ETSP Wont: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27 4.2 Non-Functional Requirements . . . . . . . . . . . . . . . . . . . . . . . . 27 4.2.1 ETSP Must: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27 4.2.2 ETSP Should: . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28 4.2.3 ETSP Could: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28 4.2.4 ETSP Wont: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28 5 Methodology & Technologies 29 5.1 Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29 5.1.1 Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29 5.1.2 Requirement Acquisition . . . . . . . . . . . . . . . . . . . . . . . 29 5.1.3 System Development . . . . . . . . . . . . . . . . . . . . . . . . . 30 5.1.4 System Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30 5.1.5 Documentation . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . 30 5.2 Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 5.2.1 Licensing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 5.2.2 Java & JavaDoc . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32 5.2.3 IDE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32 5.2.4 Storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33 5.2.5 Version Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33 5.2.6 Parallels Desktop 8 . . . . . . . . . . . . . . . . . . . . . . . . . . 34 5.2.7 LATEX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34 4
  • 5. CONTENTS 6 System Design & Architecture 35 6.1 System Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 6.1.1 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 6.1.2 Plugin Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39 6.1.3 ETSLib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39 6.1.4 File Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39 6.2 Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 6.2.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 6.2.2 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41 6.2.3 Packages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43 7 Implementation 44 7.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44 7.2 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45 7.2.1 Heuristic tester . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45 7.2.2 Project Info Viewer . . . . . . . . . . . . . . . . . . . . . . . . . . 46 7.2.3 ETSP Terminal . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48 7.3 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49 7.3.1 Syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49 7.3.2 Functions and Keywords . . . . . . . . . . . . . . . . . . . . . . . 49 7.4 Plugin Loader . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50 7.5 ETSPLib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51 7.6 Other Developments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51 8 Experiment 53 8.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 8.2 Aim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 8.3 Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . 53 8.4 Heuristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 8.4.1 Greedy Nearest Neighbour . . . . . . . . . . . . . . . . . . . . . . 54 8.4.2 QSTSH . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 8.5 Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 8.5.1 Test Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 8.6 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55 8.6.1 Runtime . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55 8.6.2 Accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56 8.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56 5
  • 6. CONTENTS 9 Testing and Evaluation 58 9.1 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58 9.1.1 HCI & Usability Testing . . . . . . . . . . . . . . . . . . . . . . . 58 9.1.2 Component Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 63 9.1.3 Scalability Testing . . . . . . . . . . . . . . . . . . . . . . . . . . 64 9.1.4 Efficiency Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . 66 9.1.5 Requirement Testing . . . . . . . . . . . . . . . . . . . . . . . . . 66 9.2 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69 9.2.1 Functionality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69 9.2.2 Usability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69 9.2.3 Durability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 9.3 Results & Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 10 Conclusions 71 10.1 Achievements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71 10.2 Lessons Learned . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71 10.3 Future Work & Known Bugs . . . . . . . . . . . . . . . . . . . . . . . . . 72 10.3.1 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72 10.3.2 Known Bugs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72 10.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 Bibliography 74 Appendices 77 A User Manual 77 A.1 Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77 A.2 Installation, Setup & Initialisation . . . . . . . . . . . . . . . . . . . . . . 77 A.2.1 Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77 A.2.2 Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78 A.2.3 Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79 A.2.4 Closing the Program . . . . . . . . . . . . . . 
. . . . . . . . . . . 81 A.3 Handy Things to Know . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81 A.4 Using the System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81 A.4.1 Making a Project File . . . . . . . . . . . . . . . . . . . . . . . . 81 A.4.2 Loading a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 82 A.4.3 Testing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 6
  • 7. CONTENTS A.4.4 Interpreting Plugin Output . . . . . . . . . . . . . . . . . . . . . 84 A.4.5 Modifying the Project . . . . . . . . . . . . . . . . . . . . . . . . 84 A.4.6 Saving a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 A.4.7 Closing a Project . . . . . . . . . . . . . . . . . . . . . . . . . . . 86 A.5 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86 A.5.1 Writing a Script . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86 A.5.2 Running a Script . . . . . . . . . . . . . . . . . . . . . . . . . . . 90 A.5.3 Interpreting Output . . . . . . . . . . . . . . . . . . . . . . . . . 90 A.6 Plugins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 A.6.1 Writing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 A.6.2 Writing a Plugin Interface . . . . . . . . . . . . . . . . . . . . . . 92 A.6.3 Making a Plugin Runnable . . . . . . . . . . . . . . . . . . . . . . 93 A.6.4 Other Information on Running a Plugin . . . . . . . . . . . . . . 94 B Maintenance Manual 96 B.1 Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96 B.1.1 JAR Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . 96 B.1.2 Project Opening in Eclipse . . . . . . . . . . . . . . . . . . . . . . 96 B.1.3 Installing a Plugin . . . . . . . . . . . . . . . . . . . . . . . . . . 97 B.2 Class Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 B.2.1 gui . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97 B.2.2 etspLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98 B.2.3 packageInterface . . . . . . . . . . . . . . . . . . . . . . . . . . . 98 B.2.4 storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99 B.3 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99 B.4 Future Improvement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
101 B.4.1 User Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101 B.4.2 Plugin Improvements . . . . . . . . . . . . . . . . . . . . . . . . . 102 B.4.3 ETSPLang . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102 B.5 Directory Content . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103 C File Formats 106 C.1 .TSP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106 C.1.1 File Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106 C.1.2 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107 C.2 .TXT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108 C.2.1 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108 7
  • 8. C.3 .TSPRJ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108 C.3.1 File Layout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109 C.3.2 Example File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110 D User Testing Template and Results 112 D.1 User Testing Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112 D.2 User Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117 List of Figures 2.1 Example of a weighted graph (Wikipedia) . . . . . . . . . . . . . . . . . . . 16 3.1 QSTSH’s main function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22 3.2 QSTSH’s Constructor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23 3.3 Plots of depths at which QSTSH exceeds GNN logarithmically scaled and normally scaled respectively. Showing a line of best fit. . . . . . . . . . . . . 25 5.1 Gantt Chart Plan of Documentation. . . . . . . . . . . . . . . . . . . . . . . 31 6.1 Main Screen Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 6.2 Heuristic Tester Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . 36 6.3 Project Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37 6.4 ETSPLang Tab Concept . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38 6.5 Sample Script . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39 6.6 System Architecture Diagram. . . . . . . . . . . . . . . . . . . . . . . . . . . 40 6.7 Class UML . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42 6.8 System Packages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43 7.1 ETSP overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44 7.2 ETSP: Heuristic Testing Tab . . . . . . . . . . . . . . . . . . . . . . . . . . 45 7.3 ETSP: Heuristic Testing Tab in Action . . . . . . . . . . . . . . . . . . . . . 46 7.4 ETSP: Project Info Viewer . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . 47 7.5 ETSP: Project Info Viewer Submit Request . . . . . . . . . . . . . . . . . . 47 8
  • 9. 7.6 ETSP: ETSPLang Terminal . . . . . . . . . . . . . . . . . . . . . . . . . . . 48 7.7 ETSP: Help and About boxes . . . . . . . . . . . . . . . . . . . . . . . . . . 51 8.1 Runtime in ms plotted by number of points . . . . . . . . . . . . . . . . . . 55 8.2 Accuracy in % plotted by number of points . . . . . . . . . . . . . . . . . . . 56 List of Tables 6.1 Preliminary Vocabulary Design . . . . . . . . . . . . . . . . . . . . . . . . . 38 7.1 Table of ETSPLang keywords Part 1 . . . . . . . . . . . . . . . . . . . . . . 49 7.2 Table of ETSPLang keywords Part 2 . . . . . . . . . . . . . . . . . . . . . . 50 9
  • 10. Chapter 1Introduction 1.1 Overview The Travelling Salesman Problem, also referred to as TSP, in its simplest form is the problem to find the shortest route for a network of locations, where each location is only visited once. The Travelling Salesman Problem has been studied in great detail and a large set of solutions have been proposed. This project aims to create a testing platform, the so-called Euclidean Travelling Salesman Platform or ETSP, for the investigation of the two-dimensional euclidean TSP solutions. This platform will be used to investigate a new algorithm for solving a TSP, the QSTSH, and to compare its performance to another TSP heuristic. An important motivation for the development of the ETSP platform was a previous personal research project on an optimised solution to the Travelling Salesman Problem called the Quick Sort Travelling Salesman Heuristic or QSTSH for short. It was the desire to compare this heuristic with other heuristics that sparked the development for this project. 1.2 Objectives The final outcome of the project is to produce a working version of the Euclidean Travelling Salesman Platform for testing travelling salesman heuristics. In particular, it will be used to demonstrate the performance of QSTSH compared to another existing solution, such as the Greedy Nearest Neighbour (GNN). When fully operational the Euclidean Travelling Salesman Platform (ETSP) will offer benefits in many different ways: 10
  • 11. 1.3. MOTIVATIONS – A working version of ETSP with the primary components and a usable GUI – A plugin loader should run without the need for user input and in a user friendly manner with a standardised format – The test data should be loadable in many formats to suit the users needs – Test data should be manually editable in the system to add to or update – The user interface should be split into separate sections for the different functionalities – The output for the user in each should be usable and understandable in a format that is user-oriented – Of these sections there should three main focuses: Heuristic Testing, Project Infor- mation Interface and a Scripting Terminal. – The interface components, such as buttons, should be deactivated and activated as and when the user needs them Since this is a whole new platform design studying current heuristics and incorporating the information learned will be carried out throughout the whole project and focused on greatly during the system design and requirement gathering. After the development of the system, to illustrate the working functionality, two heuristics will be compared: QSTSH and the Greedy Nearest Neighbour Heuristic (GNN). (See Chapter 8). 1.3 Motivations A big motivation for ETSP was a previous personal research project on an optimised solution to the travelling salesman problem called the Quick Sort Travelling Salesman Heuristic or QSTSH for short. It was the desire to pitch this heuristic against others that sparked the development for this project. A further motivation for the development of ETSP is to help forward the research into optimised and accurate heuristics for the Travelling Salesman Problem by allowing for testing several solutions against each other and testing combinations of solutions. The 11
  • 12. 1.4. INVESTIGATIONS & GOALS information gathered from such a system, in the developmental process of heuristics as well as the testing process, could prove indispensable. ETSP will also allow for the removal of the possible bias created by choosing separate systems to run individual heuristics during experiments: since the platform is a constant across runs, this possible bias is cancelled out. Along with the research possibilities, ETSP could be used as an educational tool in the teaching of the Travelling Salesman Problem or, less specifically, the importance of optimisation and accuracy. Another motivation is the lack of such a platform readily available; that is not to say such a platform does not exist, but rather that during the research for this project such a platform could not be found (see Chapter 2). 1.4 Investigations & Goals Given the motivations, several investigative questions were raised: Is QSTSH better than a comparable traditional TSP heuristic? Would a platform prove beneficial to TSP heuristic testing? Is ETSP convenient to use and run experiments on? Based on these questions it was concluded that there was a need for a platform that was easy to use and produced usable output. This led to the following goals. The goals have been broken down into three subgroups in an ordinal fashion from Primary to Tertiary, where the Primary goals take precedence over Secondary and Secondary over Tertiary. This means that if achieving a Primary goal prevents a Secondary goal, this will be classed as an acceptable compromise. 1.4.1 Primary – Create an easy to use system – Make a simple means of loading in personal solutions to the TSP 12
  • 13. 1.5. DOCUMENT STRUCTURE – Demonstrate that the system meets all set requirements – Make the system easily usable in both education and research 1.4.2 Secondary – Build the system with an intuitive GUI – Build the system to cope with current and future versions of Java – Build the system to make use of system resources to help with processing speed – Test QSTSH against a Greedy Nearest Neighbour Heuristic 1.4.3 Tertiary – Make a platform which can cope with all current major operating systems – Design a platform so that it can handle multiple input types in both test solutions and route data 1.5 Document Structure This document follows a standard structure in a semi-chronological manner, starting with the motivations for the project, then the history behind the project idea, before moving through to requirement building and system design. After the design documentation there is the development outline, followed by system testing. The testing is split up into two chapters; the first being an experiment testing two plugins and the second consisting of more requirement-related testing. The document closes with a conclusion chapter before moving on to the appendices, where the User and Maintenance Manuals as well as further miscellaneous sections can be found. 13
  • 14. Chapter 2 Background 2.1 History of the Travelling Salesman Problem The Travelling Salesman Problem is one of the most famous problems in computer science, although it is not solely researched by computer scientists. In computer science the TSP is widely studied for optimisation [16]. The fame of this problem could be down to its simple definition but complex solutions, as postulated by Merz and Freisleben [19]. It is widely claimed that there is no known origin of the travelling salesman problem and, all things considered, there probably is no provable origin. A quick thought experiment can show that finding the shortest route is inherent in all of us and indeed in nature. Imagine that you are on your way to somewhere important; ahead of you are three known paths A, B and C. You know B is the guaranteed quickest route and therefore, removing the desire to sightsee and dawdle around, you would pick B. Now let’s extrapolate this further into basic nature. Picture three animals A, B and C all fighting for survival and having to choose a route to gather food for the winter. Now imagine that A chooses the most optimal route, allowing him to collect more than enough food for the winter, whereas B chooses a route suited to collect enough to survive the winter, but only just, and C does not collect enough to survive the winter. In this chain of events C’s route is not passed on, whereas A’s route is definitely passed on and B’s is passed on if enough food is collected to survive. This shows how the need to find optimal solutions to situations such as the Travelling Salesman Problem is built into nature. Although no known origin of the problem exists, it would appear to have been first mathematically formalised by Irish mathematician Sir William Rowan Hamilton and British mathematician Thomas Penyngton Kirkman in the 1800s [4]. 14
  • 15. 2.2. DIFFERENT TRAVELLING SALESMAN TYPES The TSP is also famous for being a known NP-Complete problem [22], that is to say it is “A problem which is both NP (verifiable in nondeterministic polynomial time) and NP-hard (any NP-problem can be translated into this problem).” [18] This means that computing and verifying shortest routes quickly becomes intractable as the number of locations grows; see Section 2.3.2. 2.2 Different Travelling Salesman Types 2.2.1 Symmetric and Asymmetric The sheer extent of the research into, and scale of, the Travelling Salesman Problem means that the TSP is often split up into different forms. The most common split made is between symmetric and asymmetric TSPs. The symmetric TSP, the most common form, is where the weighting is equal in both directions, that is to say the distance between towns A and B is the same travelled in either direction: d(A, B) = d(B, A). In the asymmetric problem the weighting is unequal in the two directions, that is to say the time taken to travel between towns A and B differs depending on the direction of travel: d(A, B) ≠ d(B, A). As the symmetric problem is in some ways more representative of the real world, much of the research has been concentrated on it [6]. 2.2.2 Graph Weighted graphs can be used to represent TSPs and are especially useful when not all points are connected; for example, in the real world direct travel between two cities may not be possible due to road layouts. Since graphs can be directed or undirected, they prove useful in adding constraints on the direction of travel between vertices. A basic example of a weighted graph can be seen in Figure 2.1. In this graph the vertices A, B, C and D are connected by weighted edges. A weight expresses, for example, the distance between two vertices (locations in the TSP) or the cost, carbon footprint etc. The weighted edges can be used to calculate a weighting of an overall route to visit all four vertices. 15
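The route-weighting idea described in Section 2.2.2 can be sketched in Java. This is an illustrative example only, not part of ETSP: a symmetric adjacency matrix with made-up weights holds the edges, and the cost of a closed tour is the sum of the edge weights along it, including the return edge.

```java
public class RouteCost {
    // Hypothetical symmetric weight matrix for vertices A=0, B=1, C=2, D=3.
    static final double[][] W = {
        {0, 2, 5, 4},
        {2, 0, 3, 6},
        {5, 3, 0, 1},
        {4, 6, 1, 0}
    };

    // Sum the edge weights along a closed tour that returns to its start.
    static double tourCost(int[] route) {
        double cost = 0;
        for (int i = 0; i < route.length; i++) {
            cost += W[route[i]][route[(i + 1) % route.length]];
        }
        return cost;
    }

    public static void main(String[] args) {
        // Tour A -> B -> C -> D -> A: 2 + 3 + 1 + 4 = 10
        System.out.println(tourCost(new int[] {0, 1, 2, 3}));
    }
}
```

Any TSP heuristic over this graph is, in effect, a search for the permutation of vertices that minimises `tourCost`.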
  • 16. 2.2. DIFFERENT TRAVELLING SALESMAN TYPES Figure 2.1: Example of a weighted graph (Wikipedia) 2.2.3 Metric The metric TSP or ∆-TSP (delta-TSP) states that the weightings between points satisfy the triangle inequality, which states that d(A, B) ≤ d(A, C) + d(C, B). In short, the metric TSP states that a straight line is always shorter than a detour; although this sounds obvious, under certain conditions and in certain forms of geometry this may not always be the case. This rule holds strictly when dealing with Euclidean space [15]. An example of the metric TSP is the Euclidean TSP, using Euclidean distances, also known as ordinary distances, such as the distance measured using a measuring tape. 2.2.4 Other Information When talking about the Travelling Salesman Problem it is easy to assume it is exclusively and solely linked to three-dimensional physical space, when this is not actually the case. It is obvious that the most common forms of the TSP are three-dimensional, symmetric and Euclidean, because this maps onto many real-life scenarios such as the namesake travelling salesman, but it can also use more or fewer dimensions. 16
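The triangle inequality from Section 2.2.3 can be checked directly for Euclidean distance. The following sketch (illustrative only, with example points chosen for this demonstration) computes d(A, B), d(A, C) and d(C, B) in the plane:

```java
public class TriangleInequality {
    // Ordinary (Euclidean) distance between two 2D points.
    static double dist(double ax, double ay, double bx, double by) {
        return Math.hypot(ax - bx, ay - by);
    }

    public static void main(String[] args) {
        // Example points: A = (0,0), B = (3,4), detour point C = (3,0).
        double dAB = dist(0, 0, 3, 4); // 5.0
        double dAC = dist(0, 0, 3, 0); // 3.0
        double dCB = dist(3, 0, 3, 4); // 4.0
        // Metric TSP assumption: the direct edge never beats a detour.
        System.out.println(dAB <= dAC + dCB); // true: 5.0 <= 7.0
    }
}
```

Because Euclidean distance always satisfies this inequality, every Euclidean TSP instance is automatically a metric TSP instance.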
  • 17. 2.3. SOLUTION TYPES 2.3 Solution Types 2.3.1 Heuristics A heuristic, in the computer science sense, is a way of producing a solution to a problem that balances efficiency against accuracy. Since the Travelling Salesman Problem is an NP-Complete problem, the best that can be hoped for is an accurate heuristic. There are several ways to categorise heuristics, for example deterministic and nondeterministic. A deterministic heuristic is one that always produces the same output given the same input, and a nondeterministic heuristic is one that, through the use of random elements and other such entities, may not always produce the same output given the same input. There are situations where introducing a random element can create either a more optimised solution or, on average, a more accurate solution. When deciding on the type of heuristic it is important to balance the requirements by asking such questions as: ‘what is more important, efficiency or accuracy?’ and ‘should the heuristic always produce the same output?’ 2.3.2 Exact Algorithm The exact algorithm is the fundamental solution to the TSP, so although it is not a viable option on anything but the smallest lists of points, it is best to outline it to better understand the nature of the problem. The TSP is described as superpolynomial, so to simplify the calculations for the sake of demonstration we assume that the TSP runs in true factorial time and that each calculation takes just one nanosecond. This would mean that for a list of 10 elements the algorithm would take just 3.63ms. Now, with the addition of 10 elements, making a list of 20, the algorithm would take 77.1 years, that is just above 670 billion (6.7 × 10^11) times longer than for a list of 10 elements. Finally, add 10 again to give a list of 30 and it would take about 8.4 quadrillion (8.4 × 10^15) years to compute [25]. As the maths shows, this solution is simply not viable. 
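The growth figures quoted above can be reproduced with a short calculation, assuming (as in the text) a true factorial running time at one nanosecond per permutation:

```java
public class FactorialTime {
    // Estimated run time in seconds for n points at one nanosecond per permutation.
    static double secondsFor(int n) {
        double permutations = 1.0;
        for (int i = 2; i <= n; i++) permutations *= i;
        return permutations * 1e-9;
    }

    public static void main(String[] args) {
        // A year is roughly 3.156e7 seconds.
        System.out.printf("n=10: %.2f ms%n", secondsFor(10) * 1000);       // about 3.63 ms
        System.out.printf("n=20: %.1f years%n", secondsFor(20) / 3.156e7); // about 77 years
        System.out.printf("n=30: %.2e years%n", secondsFor(30) / 3.156e7); // about 8.4e15 years
    }
}
```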
In pseudocode this solution could be described as: function ExactTSP(P) d = ∞ for each permutation Pi of point set P do if dist(Pi) ≤ d then 17
  • 18. 2.4. EXISTING SOLUTIONS d = dist(Pi) and Pmin = Pi end for return Pmin (Skiena [25]) It can be seen from the pseudocode why the exact algorithm takes so long but also how simple it is. The size of the set of P can be calculated as the number of permutations of a list of N points (SetSize = N!). 2.4 Existing Solutions During the research for the development of the Euclidean Travelling Salesman Platform (ETSP) no previous solutions were found that encapsulate the requirements and the design decisions. It is for this reason that, instead of listing existing solutions like ETSP, solutions for the TSP in general are listed. Some of these solutions are only used for research and some are used in the real world for everyday tasks. 2.4.1 Existing Research Solutions In recent years, with the improvement of multi-core systems and the ability to build multi-agent or distributed solutions to problems, a lot of research has been done on the so-called Ant System [7–9, 11, 13]. Ant colony optimisation feeds perfectly into research about genetic algorithms and AI, allowing for some ant colony solutions that use AI ants [12]. Along with distributed solutions and genetic or learning systems, there are larger systems such as Concorde [3] and the Lin-Kernighan Heuristic, also known as LKH [17]. The reason for LKH’s fame is the number of standard library forms of the TSP [24] it has completed to date, along with its optimisation and the scale of the system. LKH has been shown to find the optimal route for the 7397-city problem [14]. 2.4.2 Existing Real World Solutions Interestingly, TSP solutions in many forms exist ubiquitously in everyday life, quite often as some kind of route planner. Although the exact algorithms behind 18
  • 19. 2.5. REAL WORLD APPLICATIONS solutions such as the AA Route Planner [2], the RAC Route Planner [23] and the TomTom Route Planner [26] are not publicly released, given their function it is easy to see that they are most likely some form of TSP solution. Other solutions that may exist could include parcel delivery systems and even air travel planning systems. 2.5 Real World Applications ETSP is designed not to be restricted to research but to fit a wide variety of real world applications. ETSP’s primary application would be for the testing of heuristics, but that could be in any sense from education to research. From the point of view of research, ETSP could be used to verify the efficiency and accuracy of TSP solutions, whereas from the point of view of education, ETSP could be used to help teach the importance of the efficiency-accuracy trade-off in the development of heuristics or, more specifically, could be used in teaching the Travelling Salesman Problem. 19
  • 20. Chapter 3 The QSTSH 3.1 Overview QSTSH was named after its full title, the Quick Sort Travelling Salesman Heuristic, and was more than an afternoon’s work; in fact it has been three years in development. The origin of the Quick Sort Travelling Salesman Heuristic was a single burst of inspiration whilst attending a lecture on Algorithmic Problem Solving in 2011. The lecture discussed optimised solutions and broadly touched on Quick Sort as a proof of good optimisation and on the Travelling Salesman Problem as an example of when optimisation was not always possible. From this lecture the seeds of QSTSH were planted. 3.2 License The reason for a license section is to reiterate that QSTSH was not developed for this project or for any other project and as such remains solely owned by William John MacLeod Fraser. The use of QSTSH in this paper is purely as a demonstration of ETSP, and as such this chapter is just a formalisation of it. QSTSH is not licensed in any way to be used for research without the express written permission of the author. 20
  • 21. 3.3. BACKGROUND 3.3 Background The background of QSTSH is the Quick Sort algorithm. QSTSH is handed a list of points, each of which is assigned an xy value corresponding to the closest point on the x = y line. This list is then copied twice to give three lists of the same points, which are then sorted in three separate ways: by their x, y and xy values. After the three sorts have been completed, the sorted lists are then passed to the main algorithm behind the heuristic, where the route is calculated. From a given starting point, the algorithm searches up and down the three lists to a chosen depth in order to find the nearest neighbouring point to the starting point, which is then removed from the three lists and added to a solution list. The removed point is then set as the starting point and the process is repeated until the three sorted lists are empty. The depth is represented as a decimal percentage of each list. These sublists are checked for the nearest neighbour, where 0 only checks the points above and below the current point and 1 checks all points within the lists. The list output from this heuristic is a good attempt at the shortest route from the start point, visiting all points and returning to the start point. Being deterministic, QSTSH will always give the same output given the same set of inputs, thus making it easy to test for accuracy. 3.4 Implementation QSTSH was initially developed in Java and can be easily split up into two parts: Data Preprocessing and Route Calculation. During the Data Preprocessing all points are read in and stored in three lists, where each list is then sorted on a different perspective. function QSTSHPreProcessing(SetOfPoints) XList ← QuickSortOnX(SetOfPoints); YList ← QuickSortOnY(SetOfPoints); XYList ← QuickSortOnXY(SetOfPoints); After the preprocessing the main body of QSTSH is run on the three lists 21
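This preprocessing step can be sketched in runnable Java. The sketch below is an illustration only, not the author's implementation: it uses the standard library sort in place of hand-written quicksorts, and it assumes that ordering points by their projection onto the x = y line is equivalent to ordering by x + y (the projection of (x, y) onto that line is ((x+y)/2, (x+y)/2)).

```java
import java.awt.Point;
import java.util.Comparator;
import java.util.Vector;

public class QstshPreprocessing {
    // xy key: ordering along the x = y line reduces to ordering by x + y
    // (an assumption based on the description in Section 3.3).
    static int xyKey(Point p) {
        return p.x + p.y;
    }

    // Copy the point set and sort the copy on the given key.
    static Vector<Point> sortedCopy(Vector<Point> in, Comparator<Point> c) {
        Vector<Point> out = new Vector<>(in);
        out.sort(c);
        return out;
    }

    public static void main(String[] args) {
        Vector<Point> points = new Vector<>();
        points.add(new Point(5, 1));
        points.add(new Point(2, 3));
        points.add(new Point(0, 4));

        // Three copies of the same point set, each sorted on a different key.
        Vector<Point> xList = sortedCopy(points, Comparator.comparingInt(p -> p.x));
        Vector<Point> yList = sortedCopy(points, Comparator.comparingInt(p -> p.y));
        Vector<Point> xyList = sortedCopy(points, Comparator.comparingInt(QstshPreprocessing::xyKey));

        System.out.println(xList.firstElement()); // lowest x: (0, 4)
    }
}
```

Vector is used here to mirror the `Vector<Point>` fields in the author's constructor (Figure 3.2).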
  • 22. 3.4. IMPLEMENTATION function QSTSH(Double depth, XList, YList, XYList) SolutionList[0] ← XList[0] remove XList[0], YList[XList[0]], XYList[XList[0]] for XList.size-1 do cPX = nextPoint(XList); cPY = nextPoint(YList); cPXY = nextPoint(XYList); pos = 0; for pos ≤ depth × XList.size do if dist(SolutionList[pop], cPX) ≥ dist(SolutionList[pop], XList[pos]) then cPX = XList[pos] if dist(SolutionList[pop], cPY) ≥ dist(SolutionList[pop], YList[pos]) then cPY = YList[pos] if dist(SolutionList[pop], cPXY) ≥ dist(SolutionList[pop], XYList[pos]) then cPXY = XYList[pos] end for if dist(SolutionList[pop], cPX) ≥ dist(SolutionList[pop], cPY) then cPX = cPY; if dist(SolutionList[pop], cPX) ≥ dist(SolutionList[pop], cPXY) then cPX = cPXY; SolutionList ← cPX remove XList[cPX], YList[cPX], XYList[cPX] end for SolutionList ← XList[0] Figure 3.1: QSTSH’s main function The pseudocode of Figure 3.1 shows the inside of QSTSH in its most basic form. Although the above code looks complex, it can be easily explained in plain English. Initially the three lists go in and the first point in the sorted XList is added to the solution; then that point is removed from all three sorted lists to prevent it being reselected. Then for each point in the lists, barring the last, a process of testing the distance between the preceding point and both the selected point and the previous best is carried out. The point giving the shorter of the two distances is then stored. This is only run through points up and down from the position of the preceding point to a given depth. Depth in this instance is a decimal percentage of the size of the list to check; a depth of 0.5 22
  • 23. 3.5. DEVELOPED OPTIMISATION AND ACCURACY would only check half the list, a quarter up and a quarter down. The point chosen as the best distance is added to the solution and removed from all input lists, then the selection procedure is run again until only one element is left, at which point the last element is just added to the solution. 3.5 Developed Optimisation and Accuracy After development of the original heuristic, several experiments were run to find any possible optimisation or accuracy improvements. Out of these experiments several optimisations and one major accuracy improvement were found. As QSTSH was initially developed in Java some of the optimisation enhancements were purely code based, such as allocating all variables at the top of the class and instantiating all variables only once within the constructor, as shown in Figure 3.2. public class QSTSH { public Vector<Point> listOfPoints, listOfSortedX, listOfSortedY, listOfSortedXY, cloneList, solution; public double DEPTH = 0; public Point beg, last; public QSTSH(double DTEMP, String index, Vector<Point> all, Vector<Point> x, Vector<Point> y, Vector<Point> xy) { setUpLists(); listOfPoints.addAll(all); listOfSortedX.addAll(x); listOfSortedXY.addAll(xy); listOfSortedY.addAll(y); DEPTH = DTEMP; beg = findBeg(index); ... Figure 3.2: QSTSH’s Constructor 23
  • 24. 3.5. DEVELOPED OPTIMISATION AND ACCURACY This improvement removed all unnecessary reinstantiation of variables and replaced it with simple reallocation. Another interesting discovery during the testing was the processing time differences between functions on Java Vectors, ArrayLists, Sets and HashMaps. The outcome of the tests showed that by far the most efficient were Vectors compared to ArrayLists; it is worth noting that Sets and HashMaps were used for their unordered nature to prove that QSTSH required the Quick Sorts. All uses of ordered lists within the Java implementation of QSTSH were changed to java.util.Vector. After the structural Java changes, tests were done to find ways in which QSTSH could be split up and run in multiple threads or processes. This proved difficult for the main body of QSTSH, but the sorts were split between different threads, offering a further level of optimisation. On top of this each sort was made into an in-line recursive form of the quick sort algorithm, adding another level of efficiency and reducing memory consumption further. The final major improvement was to accuracy, although it also proved beneficial for efficiency, and it came directly from the experiments in the form of a static variable. The depth variable allows users to set the checks to run through 100% of the list or just the direct neighbours in the lists. As a general rule for accuracy the equation depth = (1.6563 × (NumberOfPoints)^(−0.388)) − 0.0096 was found to give a rough best depth. The tests that led to the discovery of this equation were carried out by plotting at what depths QSTSH was better than the Greedy Nearest Neighbour (GNN) and then finding the line of best fit (the graphs created can be seen in Figure 3.3). When plotted, it became evident that there must be some form of correlation between depth and the accuracy of QSTSH, so much so that a simple equation should be reachable. 
Although the depth produced by this equation does not always provide the most accurate output from QSTSH for a given dataset, it does on average give the best output. However, further experiments need to be done to determine a better equation. 24
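The fitted curve in Section 3.5, depth = 1.6563 × N^(−0.388) − 0.0096, can be evaluated directly. The sketch below is illustrative (the helper name is hypothetical, and the clamp to [0, 1] is an added assumption, since depth is a decimal percentage of the list):

```java
public class DepthEquation {
    // Rough best depth from the fitted curve, clamped to the valid [0, 1] range.
    static double roughBestDepth(int numberOfPoints) {
        double d = 1.6563 * Math.pow(numberOfPoints, -0.388) - 0.0096;
        return Math.max(0.0, Math.min(1.0, d));
    }

    public static void main(String[] args) {
        // Larger datasets warrant proportionally shallower searches.
        System.out.printf("%d points -> depth %.3f%n", 100, roughBestDepth(100));
        System.out.printf("%d points -> depth %.3f%n", 5000, roughBestDepth(5000));
    }
}
```

For 100 points the curve gives a depth of roughly 0.27, while for 5000 points it falls to roughly 0.05, consistent with the observation that accuracy gains from deep searching shrink as the point count grows.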
  • 25. 3.6. DEVELOPMENT TO DATE Figure 3.3: Plots of the depths at which QSTSH exceeds GNN, logarithmically scaled and normally scaled respectively, showing a line of best fit. 3.6 Development To Date To date, this is the first formalisation of QSTSH, but there is still a list of experiments and improvements to be done in the future. The next major planned development for QSTSH is to research the depth value and find an equation that better represents the optimal depth value. 25
  • 26. Chapter 4 Requirements This section deals with the requirements in a MoSCoW fashion [5]: Must, Should, Could and Won’t have, applied to each of the functional and non-functional requirements. 4.1 Functional Requirements The functional requirements for ETSP are as follows: 4.1.1 ETSP Must: 1. Load in files of the format (.txt, .tsp, .tsproj) 2. Look for and load in plugins with format (*impl.class) 3. Allow the user to save projects in multiple formats (.txt, .tsproj) 4. Record the runtime of loaded plugins 5. Record the accuracy of a plugin’s output 6. Display information on plugin runs accurately and usefully 4.1.2 ETSP Should: 1. Allow the user to program scripts 26
  • 27. 4.2. NON-FUNCTIONAL REQUIREMENTS 2. Load in files from other formats such as (.csv) 3. Look for and load in plugins with a class file containing required methods 4.1.3 ETSP Could: 1. Allow the user to program complex scripts for TSP solutions. Such functions that should be available include: 1.1 Basic list manipulation 1.2 Variable comparison (such as Point b = Point a) 1.3 Operations (such as *, %, -, +) on Point data (such as X, Y values) 2. Load in all plugins in formats of (.JAR and .class) with required methods 3. Command line functionality 4.1.4 ETSP Won’t: 1. Calculate the big O value or the big Ω value for given plugins 4.2 Non-Functional Requirements The non-functional requirements for ETSP are as follows: 4.2.1 ETSP Must: 1. Load and be usable within 5s 2. Switch between tabs without any lag time 3. Save without any lag 4. Load files with up to 5000 points within 5s 27
  • 28. 4.2. NON-FUNCTIONAL REQUIREMENTS 4.2.2 ETSP Should: 1. Work on the most current stable and soon-to-be stable versions of Java (Java 7 & 8) 2. Have a working/accurate progress bar for script running 3. Not crash when no plugins or no working plugins are given 4.2.3 ETSP Could: 1. Prevent itself from crashing while running outside plugins 4.2.4 ETSP Won’t: 1. Be runnable on older versions of Java (pre Java 7) 28
  • 29. Chapter 5 Methodology & Technologies 5.1 Methodology This project was dealt with in an academic manner, as more of a research project than a development task. For this, four primary steps were taken: research into the field and previous systems, requirement acquisition, system development and system testing, with documentation carried out throughout. 5.1.1 Research Primarily, the research focused upon the history of pre-existing solutions, the different forms and the real-life influences in relation to the Travelling Salesman Problem. This was carried out using methods such as searching through papers on Google Scholar and checking for resources in the library. Google Scholar provided the bulk of the research material, linking to online publishing companies. 5.1.2 Requirement Acquisition During the first stages of requirement acquisition, design elements were chosen to answer the question: “How can I fairly and accurately test a Travelling Salesman Solution in a static environment to negate the change in environment from an experiment?” This raised a large number of requirements that suited the basic functions of the system. For some of the more advanced functions, such as the functions behind ETSPLang, 29
  • 30. 5.1. METHODOLOGY a deeper, more general question was asked: “Is there a group of functions that could prove useful to have when developing and testing a Travelling Salesman Solution?” Finally, the non-functional requirements were selected based on general knowledge about computer systems and expected run environments, such as the general rule “Don’t keep the user waiting”. 5.1.3 System Development System Development was carried out in an Agile manner, completed in a set of runs. The requirements chosen for completion within a run were decided based on what could be completed in a week-long period. After each run the code was reviewed and the next iteration of code was decided. On completion of version 1 of ETSP, further rounds of refactoring were planned to allow for robustness of code. During development, version numbering was carried out in an A.B.C pattern, where A represented the completion of a set of requirements, B the completion of one requirement and C the completion of a part of a requirement or a minor update. For this project the target for A was 1, with the hope of suggesting groups of requirements that could be used in a future iteration. 5.1.4 System Testing It was planned that the system would be tested in three ways: functional testing, non-functional testing and HCI testing. Both functional and non-functional testing were carried out in a yes/no, true/false manner, as these can be objectively observed, whereas HCI or Human Computer Interaction testing was carried out using volunteers and the success or failure depended on their input. An added step of testing functionality was taken by running an experiment to compare two plugins. 5.1.5 Documentation Documentation was carried out alongside the development of ETSP and organised by the use of a Gantt Chart, see Figure 5.1. 
In place of breaking the documentation down into “runs”, it was broken down into subsections, each allotted an estimated length of time based on the importance of the section. Instead of learning a new system for creating Gantt 30
  • 31. 5.2. TECHNOLOGY Charts, it was decided to use the graphing function of Excel, as it adequately fills the role and makes it easy to add in completed time in a timesheet style. Figure 5.1: Gantt Chart Plan of Documentation. 5.2 Technology Throughout the early stages of the project, decisions on technologies were taken as and when they became relevant to the progress of the project, keeping in mind the requirements and the estimated user group of the project material. Other useful factors for decision making included time restraints and previously used technologies. 5.2.1 Licensing As with any use of technology, all licences had to be observed to the fullest. It was understood that the development would be done in Java with several libraries, such as Swing [20], which are contained under the Java Licence. Java applications can be released under the GNU General Public Licence, marking them as free, or under the Oracle Binary Code License (BCL) free of charge [21]. Oracle Java Licences can be obtained from the Oracle website for Java use on some embedded computer environments. 31
  • 32. 5.2. TECHNOLOGY 5.2.2 Java & JavaDoc The decision to use Java was heavily weighted by time restraints. It is known that Java is not the most optimised programming language. When dealing with heuristics designed to be computationally heavy it would probably be ideal to have the system run in a procedural, compiled language such as C rather than an Object Oriented language, but this would have required learning another language. In the end the decision was made to use Java based on prior skills acquired in this language. As with most projects it was desired to make ETSP “future proof” and as such it was developed on Java 8 (Project Kenai), then periodically compiled and run on Java 7, thus assuring it was suited to the stable release Java 7 and to the future release Java 8. During the development of ETSP, on the 18th March 2014, Oracle released Java 8 as the current stable version. ETSP was also documented to follow coding standards using the JavaDoc tools to allow for easy understanding during future developments. 5.2.3 IDE Before making a decision on an IDE for the project, several IDEs were reviewed and analysed for their usefulness in such a system’s development. NetBeans, Eclipse and SublimeText were all tested for suitability. Although arguably SublimeText is a text editor and not an IDE, it was considered for its cross-platform, lightweight model. In the end Eclipse1 was chosen on its merits as an IDE that can manage projects and run applications without the need for packaging. It also has a very user-friendly GUI, which had been used and learned for other projects. Alongside Eclipse, SublimeText was used to carry out the writing of LaTeX files. This decision was made based on SublimeText’s Build Systems function, which allowed for the creation of a simple script that built the PDF out of the respective .tex and .bib files. 1 Both Luna and Kepler (versions 5 (Beta) and 4 respectively). 32
  • 33. 5.2. TECHNOLOGY 5.2.4 Storage Due to the light use of memory in ETSP, it was quickly decided to use plain text files as the input. This decision was made on the basis of an already established file type, .tsp, see Appendix C.1. Although the TSP file type was very accommodating in what it allowed, it could not account for some of the session features used in ETSP. It was also deemed overly complex for initial loading of material. Because of these points, two other file formats were chosen: .txt and .tsprj. TXT was chosen to allow for quick and accessible loading of data. TXT is widespread and, in its use for ETSP, is perfect for a very simplistic loading file, see Appendix C.2 for details about the format used. Alongside TSP and TXT a more comprehensive file type and layout was created. TSPRJ was created as a file type that would allow for the storing of certain session information. Given the added complexity of TSPRJ, it was primarily developed to be handled solely by ETSP and not really touched by the user, see Appendix C.3 for details about TSPRJ such as layout and use. 5.2.5 Version Control Throughout the project version control was taken very seriously and was managed in three parts. Each folder and structure was stored in DropBox, the project files were backed up to GitHub whenever a feature or class was completed, and all files were incrementally archived and stored as zip files. This three-way control allowed for a much more dynamic version management system. The use of DropBox negated the need to keep checking if files needed to be pulled, as this was managed by DropBox. The use of GitHub allowed for a constant and well managed version control system that could be accessed in an emergency, whereas, if simple file checks or a simple rollback had to be done, the zip archives negated the need to use messy commands in Git. 
This well-rounded version control also allowed for additional benefits, such as GitHub’s online portal for viewing the project and DropBox’s portability for opening and editing single files on the go. 33
  • 34. 5.2. TECHNOLOGY 5.2.6 Parallels Desktop 8 Parallels Desktop 8 [1] is a virtual machine tool used for running virtual operating systems on a Mac, either in Coherence mode, where the Mac and the chosen operating system run side by side, or in boxed mode, where a virtual box is started up. This is licensed software costing £64.99 per licence and was used for testing the project on several major operating systems. 5.2.7 LaTeX For the documentation of this project it was decided that LaTeX would be used with an MScthesis class. LaTeX allows for better typesetting, figure placement and referencing styles compared to other documenting tools. 34
  • 35. Chapter 6 System Design & Architecture 6.1 System Design 6.1.1 User Interface ETSP was designed to have two primary forms of user interface, terminal input and a GUI. As GUIs are more sought after for their user experience and usability, designing the GUI was focused on most. Particular focus was placed on good Human Computer Interaction (HCI), especially in the design of the GUI. To help with HCI, ETSP was designed to only allow users to run components that were available and to disable any that were not necessary. This not only ensures a better user experience but also ensures a robust system. Figure 6.1: Main Screen Concept The main user interface was broken up into four components in the form of three tabs 35
  • 36. 6.1. SYSTEM DESIGN and the overall window. All tabs had to be housed together in a usable window with a very simple, standard design. In Figure 6.1 the main window layout can be seen with the three tabs shown on the top left of the window under the menu bar. For the menu bar several decisions were made based on other programs, such as having a File menu and a Help menu to give a more conventional feel to the overall program. Heuristic Tester The main functionality of ETSP is a simple tester for TSP heuristics which would give out statistics. These statistics will include the runtime, the route cost (length) after the heuristic has been run and, if a known optimal route has been set, a percentage of how optimal the current route is. For this part of ETSP a simple, uncluttered design was drawn up, as seen in Figure 6.2, where on the left side there is the information about the working data as it stands and on the right there is the output of a run heuristic, with a run button separating the two. A heuristic/plugin can be chosen from the plugins menu as seen at the top left of the main screen concept in Figure 6.1. Figure 6.2: Heuristic Tester Tab Concept This design was considered best for its simple click-and-run functionality. Coupled with this design would be the loading and saving options within the File menu as found in most common programs. 36
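The route cost statistic named above is simply the summed Euclidean length of the tour. As a minimal sketch (assuming, as is usual for the TSP, that the tour closes back to its starting point; the class and method names here are illustrative, not ETSP's actual code):

```java
public class RouteCost {
    /**
     * Total Euclidean length of a tour given as an ordered list of
     * (x, y) coordinates, closing the loop back to the first point.
     */
    static double cost(double[][] route) {
        double total = 0;
        for (int i = 0; i < route.length; i++) {
            double[] a = route[i];
            double[] b = route[(i + 1) % route.length]; // wrap around to close the tour
            total += Math.hypot(a[0] - b[0], a[1] - b[1]);
        }
        return total;
    }

    public static void main(String[] args) {
        // A unit square visited in order has a route cost of 4.0.
        double[][] square = { {0, 0}, {1, 0}, {1, 1}, {0, 1} };
        System.out.println(cost(square)); // 4.0
    }
}
```

The optimality percentage then falls out as the known optimal length divided by this cost.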
  • 37. 6.1. SYSTEM DESIGN Project Viewer Due to the way data is used in ETSP it is important to be able to alter the project information after it has been loaded into the system. For this a specific tab was designed to allow for the displaying of all project data and the editing of specific project information. Figure 6.3: Project Tab Concept To help with usability the Project View tab, as seen in Figure 6.3, was split into two parts by type of field, where the top of the tab was used for editable fields and the bottom of the tab was used for uneditable or locked fields. This prevents the user from changing information they should not be able to change, whilst at the same time making clear which information can be changed. ETSPLang One of the major design points of ETSP is the ability to build and run scripts as part of testing heuristics. Following the simple uncluttered design of the heuristic testing tab, this tab has three components: an input screen, an output terminal and a run button, as seen in Figure 6.4. 37
  • 38. 6.1. SYSTEM DESIGN Figure 6.4: ETSPLang Tab Concept For a scripting system to work a standard language has to be designed and implemented. This language will require syntax and keywords with definitions. The language design is simple, with commands being split by a new line and the use of a library of defined commands, see Table 6.1 for the preliminary vocabulary design. Term Function Run Run a specific heuristic Compare Compare two or more heuristics Log Output a log file // Comment Help Display help data Table 6.1: Preliminary Vocabulary Design This language was designed so scripts such as that in Figure 6.5 could be run. As it was preferred that the development of the language would be dynamic, no strict design choices were made until development. 38
  • 39. 6.1. SYSTEM DESIGN // Give some comment Run <Heuristic Name> Compare <Heuristic 1> <Heuristic 2> Help Figure 6.5: Sample Script 6.1.2 Plugin Loader Because of known complexity issues with classpath loading in Java it was decided early on that all classpath handling would be done without user input, to prevent complications. The plugin loader was designed such that, if the plugin interface follows certain constraints, ETSP will automatically load it into the classpath and make it runnable. 6.1.3 ETSLib It was decided during the design process that, to add to the usability of the system, some standard libraries should be built to help in the development of plugins suited to ETSP, and out of this idea the Euclidean Travelling Salesman Library, or ETSLib, was born. ETSLib is designed to handle any additional libraries that could be removed from the main ETSP system and used in both the plugins and in ETSP. Creating this library would add a level of standardisation to the data structures used in communication between plugins and ETSP. 6.1.4 File Loader Given the way data is handled and the volume in which it can exist, there was a need for a file loader and saver of some description. Given this need a simple flat file loading and saving system was designed to be integrated as part of the overall ETSP system. 39
  • 40. 6.2. ARCHITECTURE The flat file system decision was based on a pre-existing file type created for TSPs given the file extension .tsp (see Appendix C.1), while a decision was also taken to design further file types for use within ETSP, see Appendix C. 6.2 Architecture 6.2.1 Overview Much deliberation went into the chosen architectural design of the system. Based on the plugin testing design it was decided to go for a closed three-tiered architecture. As with any system that runs outside code it is imperative to look out for security issues, which is the reason for the closed architecture: it prevents the user's plugin code from changing items in higher tiers. Separating the system into three tiers produces: a Presentation Tier hosting all the GUIs, a Business Logic Tier hosting the processing of plugin runtimes and statistics as well as the plugin interfaces connecting to the plugins, and a Storage Tier hosting the functions for saving and loading data files, loading plugins at startup and storage entities. These three tiers are illustrated in Figure 6.6. Figure 6.6: System Architecture Diagram. 40
  • 41. 6.2. ARCHITECTURE Following a strict closed three-tier architecture, the system is designed so that each component is only capable of calling components on the same level or the one below. Therefore, the Presentation Tier can only call to the Business Logic Tier, and the Business Logic Tier can only call to the Storage Tier and connected plugins. A closed architecture provides the added security stated above and helps when tracing problems in the system, making it the natural choice, and the system can be cleanly broken up into three primary components such that the project is designed to strictly follow this architectural plan. Presentation Tier Given the chosen architecture this tier should only be coupled to the Business Logic Tier and should not communicate directly with the Storage Tier or plugins. The Presentation Tier will house the Graphical User Interface (GUI) and Command Line Interface (CLI) for user interaction with the system. Business Logic Tier This tier will act as an interface for communication between the GUI/CLI and the Storage Tier and plugins. This tier will carry out all calculations for runtimes and accuracies and interface directly with the plugins connected by the users. Storage Tier As storage in ETSP will primarily be done in the form of flat files this tier will communicate with the file system. This tier will carry out read and write commands from and to files and will only be accessible through the Business Logic Tier to prevent plugins accessing it. 6.2.2 Class UML Figure 6.7 shows the Class UML for ETSP as designed and implemented. 41
  • 43. 6.2. ARCHITECTURE 6.2.3 Packages The system design was broken up into three main sections with several packages within each, organised based on relevance. One of the primary aims of the system design was to follow the Object Oriented Paradigm within the selected three-tier architecture. The three main sections used are outlined with their packages below in Figure 6.8. Figure 6.8: System Packages The ETSP section contains the main system itself from which ETSP can be run. ETSP is broken down into four packages: the GUI package makes up the Presentation Tier, the PluginInterface and ETSPLang packages make up the Business Logic Tier and the Storage package makes up the Storage Tier. The ETSLIB section contains required and useful packages for the creation of plugins so they follow a standard format. Although plugins are created by the users they are a part of the system and as such they are given a section of their own. 43
  • 44. Chapter 7 Implementation 7.1 Overview The implementation of ETSP finalised for this project contained all major functionality discussed in the requirements, Chapter 4. For the demonstration of ETSP several plugins were built to be deployed with the system. The user interface was developed with the original designs in mind and with the desire to find ways to add to the Human Computer Interaction of the system. Figure 7.1: ETSP overview 44
  • 45. 7.2. USER INTERFACE 7.2 User Interface It is worth noting that although all major functionality was produced, a Command Line Interface (CLI) was not. Although a CLI was originally planned it was decided that the time would be better spent adding HCI improvements to the main Graphical User Interface, thus giving the first full version of ETSP as seen in Figure 7.1. Only some minor, innocuous changes exist between the conceptual main view and the final product, such as using the word Project instead of File in the top left corner of the menu bar. Overall, ETSP components have either lived up to or exceeded expectations from their original designs. 7.2.1 Heuristic tester When ETSP is launched the Heuristic Tester is the first tab loaded with all unusable options and buttons deactivated, as can be seen in Figure 7.1 where the run button, the three tabs at the top and the plugins menu are all greyed out, stopping the user from using them. This was to make ETSP robust against user-caused crashes. Once a file is loaded the Heuristic Tester becomes completely unlocked to the users as seen in Figure 7.2. Figure 7.2: ETSP: Heuristic Testing Tab 45
  • 46. 7.2. USER INTERFACE It can also be seen from Figure 7.2 that the original three-piece design of information before the run to the left, information after the run to the right and the run button in the middle was not entirely kept. The basis of the original design was still used, so the before and after information still sit left and right respectively with a run button separating them. However, the addition of constant information above and the list of previous runs below provided a cleaner output. Figure 7.3: ETSP: Heuristic Testing Tab in Action If a user runs a plugin the page allows the user to compare the current information on the dataset and the outcome of the plugin before deciding whether or not to accept the new output, as seen in Figure 7.3. This allows the users to compare the new route length to the previous and the new route accuracy to the previous, while the system disables the ability to change the heuristic and/or to change tab. 7.2.2 Project Info Viewer The project info viewer was kept to the original concept with the editable information sat in a section above the uneditable information, as seen in Figure 7.4. This tab allows the user to change nearly anything about a loaded project, from the name to the preferred file path, even the list of points. 46
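The lock/unlock behaviour described in Section 7.2.1 — controls greyed out until a project file is loaded — can be sketched in Swing roughly as follows; the class and component names are stand-ins, as the report does not show ETSP's source:

```java
import javax.swing.JButton;
import javax.swing.JMenu;

// Illustrative sketch of ETSP's lock/unlock pattern; real class and
// component names in ETSP are not known from the report.
public class TesterTabState {
    private final JButton runButton = new JButton("Run");
    private final JMenu pluginsMenu = new JMenu("Plugins");

    public TesterTabState() {
        setLoaded(false); // everything starts greyed out until a file is loaded
    }

    /** Called when a project file is loaded (or closed): unlock or lock the controls. */
    public final void setLoaded(boolean loaded) {
        runButton.setEnabled(loaded);
        pluginsMenu.setEnabled(loaded);
    }

    boolean runEnabled() {
        return runButton.isEnabled();
    }

    public static void main(String[] args) {
        TesterTabState tab = new TesterTabState();
        System.out.println(tab.runEnabled()); // false until a file is loaded
        tab.setLoaded(true);
        System.out.println(tab.runEnabled()); // true
    }
}
```

Disabled Swing components are automatically rendered greyed out and ignore clicks, which is what makes this pattern robust against user-caused crashes.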
  • 47. 7.2. USER INTERFACE Figure 7.4: ETSP: Project Info Viewer Because the statistical output depends on the point data, if any points were added, removed or edited all the statistical information of previous plugin runs had to be reset. To prevent confusion to the user and to add to the HCI of the project, a simple yes/no dialogue was created to ensure the user understands that submitting changes will remove previous information, as seen in Figure 7.5. Figure 7.5: ETSP: Project Info Viewer Submit Request 47
  • 48. 7.2. USER INTERFACE To help with HCI and robustness, input checks were made preventing damaging input from being submitted. This is done by checking the input of each element when the user tries to click elsewhere. If the element contains damaging data the input element is highlighted red and the user is prevented from continuing with editing. 7.2.3 ETSP Terminal The ETSP Terminal is the user interface to ETSPLang, which has an extremely simple three-component design following the original concept. The simple design can be seen in Figure 7.6. Within the ETSP Terminal tab the two main components are the input window and the output window. In the current version of ETSP the output is given by tying the Java standard out (StdOut) to a TextPane, meaning the output comprises the Java System.out.print statements which would normally be pushed to the terminal. This was decided for optimisation of code, as individual outputs would not have to be created and then pushed to the TextPane. Instead, the Java System class would handle the outputs. To add to the system's HCI the output panel was set to be uneditable by the user and set to auto-scroll to the end when output is added. The input panel of the ETSP Terminal is just a simple editable TextPane that, when the button is clicked, passes its contents to the ETSPLang package to be processed. Figure 7.6: ETSP: ETSPLang Terminal 48
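The tying of standard out to the TextPane can be sketched as below. The class name is illustrative, and a PlainDocument stands in for the TextPane's document (textPane.getDocument()) so the sketch runs without a GUI; ETSP's actual code is not shown in this report.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintStream;
import javax.swing.text.BadLocationException;
import javax.swing.text.Document;
import javax.swing.text.PlainDocument;

/** An OutputStream that appends every byte written to a Swing Document. */
public class DocumentOutputStream extends OutputStream {
    private final Document doc;

    public DocumentOutputStream(Document doc) {
        this.doc = doc;
    }

    @Override
    public void write(int b) throws IOException {
        try {
            doc.insertString(doc.getLength(), String.valueOf((char) b), null);
        } catch (BadLocationException e) {
            throw new IOException(e);
        }
    }

    /** Redirects System.out into a document, prints, restores it, and returns the captured text. */
    public static String demo() {
        try {
            Document doc = new PlainDocument(); // stands in for textPane.getDocument()
            PrintStream original = System.out;
            System.setOut(new PrintStream(new DocumentOutputStream(doc), true));
            System.out.print("heuristic run complete");
            System.setOut(original); // restore the real standard out
            return doc.getText(0, doc.getLength());
        } catch (BadLocationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.err.println(demo());
    }
}
```

While the redirect is active, any ordinary System.out.print call lands in the document, which is why no bespoke output plumbing was needed.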
  • 49. 7.3. ETSPLANG 7.3 ETSPLang As of version 1.0.0 of ETSP, ETSPLang sits at version 0.5.9. This is due to the unforeseen complexity in developing a scripting language. Even with the missing features, enough was developed to run basic scripts and to do the two most important functions: to compare two lists of plugins and to run a single plugin (an example of a script that has been run can be seen in Figure 7.6). 7.3.1 Syntax The current version of ETSPLang was given a basic syntax to make an easy to read scripting language. It was decided for ease of reading that: all keywords would be in upper-case, colons would be used as separators for lists and all new commands would be placed on a new line. 7.3.2 Functions and Keywords As of ETSPLang version 0.5.9 only eight keywords have been developed but they were chosen to be the most important. Keyword Function HELP Displays a complete help dialogue. CLEAR Clears the output terminal. OUTPUT Displays a before and after output of the script up to this point STORE Stores the changes to the live project // Denotes a comment Table 7.1: Table of ETSPLang keywords Part 1 49
  • 50. 7.4. PLUGIN LOADER Keyword Function START_S Denotes the start of a script END_S Denotes the end of a script COMP Compares two lists of heuristics split by a colon RUN Runs an individual list of heuristics split by a colon OUTPUTLEVEL Sets the output level as one of NORMAL, SILENT or VERBOSE Table 7.2: Table of ETSPLang keywords Part 2 Some of the major decisions about the use of language and the functionalities were not made until development was under way. This allowed for a more flexible development procedure, not tied to predetermined ideals. Within the language it was decided that a start and an end command would prove helpful for readability. This decision was made on the basis of sensitive and insensitive commands, where the commands RUN, COMP and OUTPUTLEVEL are classed as sensitive commands and are required to sit inside the START_S and END_S tags whereas all other commands are not. During development it was noted that the two most necessary commands would be one to compare heuristics and one to run heuristics, and as such COMP and RUN were developed. A full list of the current keywords can be seen in Table 7.1 and Table 7.2 with their functions. 7.4 Plugin Loader The plugin loader for ETSP version 1.0.0 was designed with very specific requirements and as such plugins have to be written with very specific interfaces and use very specific data-structures. Currently the plugin loader can only load a plugin that has an interface class file with the naming convention <plugin name>imple.class. It was intended that the plugin loader could be designed to handle JAR files as well as class files but due to time constraints this was not done for this version. Future advancements are discussed in Section 10.3. Within each plugin interface class two functions are required: runAlg(Vector<Point> vp), which runs the heuristic given a vector of ETSPLib.Storage.Point, and getList(), which returns a vector of Points. 
The choice of using a standardised data-structure was taken to make developing plugins easier. 50
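Under those constraints, a plugin's <plugin name>imple class might look like the sketch below. Only the two method signatures come from the report; the EtspPlugin interface name, the minimal Point stand-in (for ETSPLib.Storage.Point) and the do-nothing heuristic are assumptions for illustration.

```java
import java.util.Vector;

// Minimal stand-in for ETSPLib.Storage.Point; the real class is not reproduced here.
class Point {
    final double x, y;
    Point(double x, double y) {
        this.x = x;
        this.y = y;
    }
}

// The two methods every plugin interface class must expose, per Section 7.4.
interface EtspPlugin {
    void runAlg(Vector<Point> vp);  // run the heuristic on the input points
    Vector<Point> getList();        // return the computed tour
}

// A trivial example plugin that returns the points in their input order.
public class IdentityPluginImple implements EtspPlugin {
    private Vector<Point> result = new Vector<>();

    public void runAlg(Vector<Point> vp) {
        result = new Vector<>(vp);
    }

    public Vector<Point> getList() {
        return result;
    }

    public static void main(String[] args) {
        Vector<Point> pts = new Vector<>();
        pts.add(new Point(0, 0));
        pts.add(new Point(3, 4));
        EtspPlugin plugin = new IdentityPluginImple();
        plugin.runAlg(pts);
        System.out.println(plugin.getList().size()); // 2
    }
}
```

ETSP's loader would then call runAlg on the loaded class and read the resulting tour back via getList.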
  • 51. 7.5. ETSPLIB 7.5 ETSPLib As outlined in the design for ETSP, the creation of a library of standard functions and entities was carried out throughout the development. ETSPLib houses the data-type Point which is used by ETSP to handle individual coordinates. At the moment only the Point class is held by ETSPLib. Some beneficial developments were acknowledged but not carried out and are outlined in Section 10.3. 7.6 Other Developments Other notable developments in ETSP include a help box and an about box, see Figure 7.7, to help create a feel of familiarity to the program by adding features that can be commonly found in other programs. Other HCI improvements were added such as an undo functionality (Control + Z) and a redo functionality (Control + Y), as these two functions have almost become expected when dealing with a platform in which you can make a mistake entering information. On top of this some background HCI was added in the form of remembering screen size preferences and file locations. Figure 7.7: ETSP: Help and About boxes During development the decision was made to use some operating system specific packages such as the com.apple.eawt package. This allowed for the use of some of Apple’s Mac OS X specific functions such as the bouncing icon where, when user attention is required, the icon for the application bounces in the Mac OS X Dock along the bottom of the screen. 51
  • 52. 7.6. OTHER DEVELOPMENTS Along with this, full-screen functionality was added to utilise the full-screen ability of Mac OS X Lion (10.7) and up. These design decisions were taken to give the system a more “natural” Look and Feel while used on Mac OS X without hindering the usability within other operating systems. On a strictly development point, several classes were superseded by a major improvement in some way; where this occurred, rather than delete the older classes it was decided to mark them as @deprecated so future developers can look back on previous developmental decisions. 52
  • 53. Chapter 8 Experiment 8.1 Overview As stated in the motivations of ETSP, one of the desired outcomes was to be able to test QSTSH against a comparative heuristic using a fair platform. For this test it was decided to use the Greedy Nearest Neighbour as it shares several similarities with QSTSH. 8.2 Aim To determine the accuracy and efficiency of the Quick Sort Travelling Salesman Heuristic (QSTSH) with reference to the Greedy Nearest Neighbour Heuristic (GNN). 8.3 Hypotheses Given previous experiments it is valid to state that QSTSH will, on average, prove more accurate than GNN and, given the complexity of the code, QSTSH will prove more efficient than GNN on larger datasets but less efficient on smaller datasets. 53
  • 54. 8.4. HEURISTICS 8.4 Heuristics 8.4.1 Greedy Nearest Neighbour The Greedy Nearest Neighbour (GNN) is one of the simplest deterministic travelling salesman heuristics, which can be summed up in one short sentence: choose a starting point, then find its closest neighbouring point and move to it, and continue this until no points are left. The GNN is a deterministic heuristic as it will give the same output if given the same input. 8.4.2 QSTSH In short, QSTSH is a specialised form of Nearest Neighbour heuristic where a depth variable denotes how far through the list is to be checked for the nearest neighbour. The features that specifically denote QSTSH are the three sorted lists, sorted on the X-axis, the Y-axis and the nearest XY coordinate. These sorted lists are then searched for a nearest neighbour in only a fraction of the list around where the previous point was situated. 8.5 Method Each of the sets of test data was prepared from the standard library of TSP data files on the University of Waterloo website found at http://www.math.uwaterloo.ca/tsp/world/ countries.html. As both GNN and QSTSH are deterministic in nature, ETSP was started up and each test file was loaded in. Next, GNN and QSTSH were run separately and the runtime and accuracy were both recorded. Within the results both accuracy and runtime were plotted against the number of points in the test file. 8.5.1 Test Environment For the sake of fairness a test environment was set up consisting of a Parallels Desktop Virtual Machine running OS X 10.9.2 Mavericks using 16 Gigabytes of RAM and 4 × 3.2 54
  • 55. 8.6. RESULTS GHz Intel i5 cores. The version of Java chosen for testing was Java 8. 8.6 Results Although the information for the results was collected from the same set of run data, the results themselves can be broken down into two major sections, for runtime and accuracy. 8.6.1 Runtime The runtime results are best represented by the graph in Figure 8.1, where it can be seen from the red and blue lines that the growth function for each heuristic is startlingly different. The graph itself is enough to prove the hypothesis correct that QSTSH would be more efficient than GNN at higher point counts. What cannot be seen in the graph is the outcome for the smaller datasets, but the results showed that up to a dataset size of 980 points GNN was as efficient as or more efficient than QSTSH. Using R, an F-test was run on the output data and a p-value of < 2 × 10^-16 was obtained, significant at the 0.001 level, showing that this result is highly significant. Figure 8.1: Runtime in ms plotted by number of points 55
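For reference, the baseline Greedy Nearest Neighbour described in Section 8.4.1 can be sketched as follows; this is an illustrative version, not the actual GNN plugin run in the experiment:

```java
// Illustrative Greedy Nearest Neighbour: from the current point, always
// move to the closest unvisited point until none remain (Section 8.4.1).
public class GreedyNearestNeighbour {
    /** Returns the visit order, starting from point 0. */
    static int[] tour(double[][] pts) {
        int n = pts.length;
        boolean[] visited = new boolean[n];
        int[] order = new int[n];
        int current = 0;
        visited[0] = true;
        for (int step = 1; step < n; step++) {
            int best = -1;
            double bestDist = Double.MAX_VALUE;
            for (int j = 0; j < n; j++) {
                if (visited[j]) continue;
                double d = Math.hypot(pts[current][0] - pts[j][0],
                                      pts[current][1] - pts[j][1]);
                if (d < bestDist) {
                    bestDist = d;
                    best = j;
                }
            }
            visited[best] = true;
            order[step] = best;
            current = best;
        }
        return order;
    }

    public static void main(String[] args) {
        double[][] pts = { {0, 0}, {5, 5}, {1, 0}, {2, 0} };
        // From (0,0) the nearest is (1,0), then (2,0), then (5,5).
        System.out.println(java.util.Arrays.toString(tour(pts))); // [0, 2, 3, 1]
    }
}
```

The inner scan over all unvisited points is what gives GNN its quadratic growth, which is consistent with the runtime gap seen against QSTSH's restricted, sorted-list search.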
  • 56. 8.7. CONCLUSION 8.6.2 Accuracy Results for the accuracy of QSTSH and GNN can be seen plotted in Figure 8.2, where the nearest neighbour element of QSTSH is more evident. As stated before, at the core of QSTSH is a simple nearest neighbour style heuristic and, as expected, the two share a very similar wave pattern in the results. GNN had an average accuracy of 78.25% whereas QSTSH had an average accuracy of 78.58%, which supports the hypothesis that the current version of QSTSH was on average more accurate, but not by much. The accuracy results were run through an F-test in R, which showed the results had a p-value of 5.29 × 10^-14, significant at the 0.001 level. Figure 8.2: Accuracy in % plotted by number of points 8.7 Conclusion The conclusion of the above experiment is clear: QSTSH has a smaller growth function for runtime and was, on average, slightly more accurate. Because the average accuracies are so close it should be stated that it is inconclusive whether or not QSTSH is actually more accurate on average, so further tests would need to be run. It is worth pointing out that QSTSH can be tailored to be more accurate by finding a depth that is best for each dataset, but 56
  • 57. 8.7. CONCLUSION this task can be laborious, thus defeating the purpose of QSTSH's efficiency. The use of a standard equation to find a depth value was decided upon to make this test fairer and keep the depth as a fixed variable, albeit a variable worked out by an equation. Future research is planned to find a better equation or function for the depth variable that will increase the average accuracy. 57
  • 58. Chapter 9 Testing and Evaluation 9.1 Testing Unless stated otherwise all program testing was carried out on a Mid 2012 MacBook Pro with 8 GB of RAM and a dual core Intel i7 2.9 GHz processor. Testing on other operating systems was carried out using Parallels Desktop Pro 8® on the same MacBook Pro as above with 1 GB of RAM and 1 core. It is also worth noting that testing was used heavily throughout development to remove bugs, and as such much of this testing could be deemed obsolete as it tests for the very bugs that were preemptively removed. This is not to say the testing was pointless, as it solidifies the claims made about the system. Furthermore, given the emphasis on HCI within development, it was decided User Testing would be a great way to collect HCI and usability results. 9.1.1 HCI & Usability Testing This section makes reference to the data collected from user testing which can be found in Appendix D. A template for the user testing can be seen in Appendix D.1 and the final results can be seen in Appendix D.2, where the User Number at the top of the screen denotes which user the results relate to. 58
  • 59. 9.1. TESTING About The Test The test consisted of five simple experiments and some before and after questioning. The five experiments were: 1. Open ETSP and open the project file titled wi29.tsp 2. Run the plugin named QSTSH. 3. Change the project name in the Project Information Viewer Tab. 4. In the ETSP Terminal tab run a script. 5. Save the project and close ETSP. Each experiment was given in one simple line as above with minimal information. This was a linguistic choice, designed to force the participant into using the User Manual if or when they got stuck. This technique paid off and will be spoken about later. In order to collect user data all user feedback was anonymised. All participants were informed in the Preamble of the user testing document that their data would be collected and used in an anonymised manner and that they should only continue as far as they were comfortable. Participants were required to agree to the conditions in the Preamble before their data could be used. The Preamble can be seen in Appendix D.1. As standard, help was given only when absolutely necessary and this only occurred in one case. Sadly, due to a lack of interest only seven participants were used in the user testing, but the results are very telling. For each experiment the users were asked to rate Ease of Use out of 10 (1 being hard and 10 being easy) and to state if the system had any issues. The choice of these questions allowed the user to report a problem with understanding separately from a problem with the system. Before The Experiment Before the experiment the users were asked to rate their skills and analyse their capabilities. These questions were specifically chosen to grasp the users' understanding of both the concepts behind ETSP and Java. The individual user output can be seen in Appendix 59
  • 60. 9.1. TESTING D.2 and is summarised here. The aim was to get a wide variety of people with different skill levels and understandings of both Java and the Travelling Salesman Problem, but due to lack of interest only one participant knew about the TSP. Four questions were asked in this section, starting with a question asking the user to rate their Java capabilities out of 10. The average result of this was 3.6, but this did include a range of 1 to 9. Next the users were asked to give their years of experience with Java, which when averaged out gives just above 1.4. This may seem rather low, but the range was 0 to 4 and the mode was 0. As stated above, only one person said they had worked on the TSP prior to this experiment. One last thing collected in this section was what operating system the user would be using for the test. The outcome of this question was very skewed towards Windows, with only one user having used a Mac. The overall outcome of this section showed that the userbase for the testing was predominantly people who probably would not use ETSP in their normal lives. As they seemed to show very little knowledge of the fields and concepts involved, these tests could prove useful for HCI and usability, as these users would probably never have been faced with the jargon or field-specific information. This means that the system's usability is mostly down to the user's interaction and not the user's knowledge of the field of study. Experiment 1 In experiment one the user is asked to “Open ETSP and open the project titled wi29.tsp”. This test required several steps and, as the user is given as little information as possible, it is expected that most people would have to use the User Manual. The average Ease of Use for this experiment was 6.9, which shows that starting ETSP and loading a project file was mostly easy. Only one user experienced problems with the system directly and that was User 6. 
After observing this problem it was realised that the delivery method of the user test was to blame, as the user was required to extract the archive file before running ETSP and this was not made clear. Apart from problems with the system itself, four other users reported they experienced problems. These problems have been attributed to usability, where one user (User 3) noted that the use of a “New Project” menu option was rather confusing when it actually loaded a project. Given the evidence, especially the average Ease of Use having a range of 2 to 10, a mode of 10 and a mean of 8, it would be fair to say that starting ETSP and loading a project is relatively easy and user friendly. 60
  • 61. 9.1. TESTING Experiment 2 Experiment 2 simply asked the user to “Run the plugin named QSTSH”. Again, although this experiment required several steps only one small instruction was given, meaning that the users were probably going to need to utilise the User Manual. After this experiment the average Ease of Use was 8.6 with a range of 5 to 10. No problems were experienced with the system and this time only one user experienced any other problems. User 6 rated an Ease of Use of 5 and gave the reason “I’m finding the User Manual difficult to use and interpret. There’s no contents page and a [lot] of JARgon.” This argument is valid as User 6 was shown to have a low understanding of the background concepts from the initial questions and as such would not be expected to fully understand all the jargon in the User Manual. Overall this test showed that selecting and running a plugin must be either fairly intuitive or easy to understand from the User Manual. User 3 even commented “This was a very simple experiment and once again the User Manual was very clear.” and User 4 commented “The figures showing how to run the plugin were very helpful.” Experiment 3 For this experiment the user was faced with the line “Change the project name in the Project Information Viewer Tab.” This again was a single line experiment where the user would either have to use intuition or the User Manual. This experiment received an average Ease of Use of 9.3 with a range of 7 to 10 and a mode of 10. No problems were reported about the system, although User 6 did report the system working unexpectedly; judging by the additional information the system acted as designed, but User 6 did not expect such an action. User 6 again reported an issue with the User Manual and the experiment by posing the question “What am I meant to change the project name [to]?”. 
Barring User 6’s issues the results from the experiment are conclusive that again either the system is rather intuitive or the User Manual is easy to use. User 4 commented “Very easy to find and change the name without the use of the manual” which is a plus for the usability of the system. 61
  • 62. 9.1. TESTING Experiment 4 The user in this experiment is faced with a script to run in the ETSP Terminal. The script is provided as part of the experiment but the command is just “In the ETSP Terminal tab run the below script”. Results from this test were excellent, giving an average Ease of Use of 10 and absolutely no problems reported for the system or otherwise. Very few additional comments were made, but those who did comment commonly expressed a lack of understanding of what was shown in the output. Along with this experiment users were asked “Could you write and run your own scripts?”; three out of seven said they could, which is about as expected as only three of the seven users are known to have studied some form of computing course. Experiment 5 Experiment 5 was a simple saving procedure where the user was asked to “Save the Project as a .TSPRJ file and close ETSP”. The results from this section matched those of Experiment 4, where no users reported any problems and all rated Ease of Use as 10 (Very Easy). User 4 even left the additional comment “I was able to save the project without needing to look at the User Manual”, which helps show that the saving and closing features are intuitive. Other Questions The Other Questions section of the user testing was designed to grasp how the user felt about the User Manual as a whole. The first of the questions asked was: “Was the User Manual easy to follow?”, to which six of the seven users answered Yes. This is a good sign that the User Manual is of the right level for the end user. Secondly the users were asked “Was the manual of an expected technical level?” This question was given a worse rating, with four and a half out of seven. The half vote came about as User 3 answered Yes & No and gave a reasoning in the More Information section. The reasoning seemed to be oriented towards a lack of understanding of the technical information behind ETSP; this was made apparent with the line “writing a java script”. 
The next question asked the user “Was there any section of the technical manual that seemed too technical?”, to which four users answered Yes. Again it followed that those with an understanding of Java answered No, whereas those with no background in computing answered Yes.
When the users were asked to “Rate the User Manual as a whole out of 10 (10 Excellent - 1 Poor)”, the average score was 8.3, which shows that on the whole the User Manual is of a decent standard. Finally, users were asked whether they “would feel comfortable writing a plugin given the instructions in the User Manual?”. As expected, all three users with Java knowledge answered Yes and those without answered No.

Overall Analysis

From the user tests it can be seen that the system is usable by the majority of people, and it never exhibited any programming issues throughout the testing. Although several issues were pointed out, such as the use of the word New where Load would have been clearer and the lack of a table of contents in the User Manual, on the whole the system lived up to expectations. All parts of the system that would require a computing background were marked as such, and sections that should be doable using intuition and the User Manual were answered as such. An interesting pattern emerged during the experiments: the average Ease of Use increased from one test to the next. This could be attributed to users getting used to the system, but further testing would be needed to prove or disprove this.

9.1.2 Component Testing

To test the components of ETSP, several tasks were carried out within each tab. Whenever the outcome was as expected the task was considered a pass; where it was not, a fail. This testing was done strictly black box in style.

Heuristic Tester

To test the Heuristic Tester tab, several small tasks were carried out. First a file was loaded in, then a plugin was selected from the menu. Both of these tasks were repeated several times before moving on to running plugins. Several plugins (QSTSH, GNN and BubbleThrough) were run with no unforeseen effects.
Testing this tab also included opening the About menu and trying both the about box and the help box, which worked as expected.
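The plugins run here are picked up by ETSP’s *impl.class naming convention (per the requirements in Chapter 4). As a rough illustration only, the discovery step feeding the plugin menu might look like the sketch below; the class name PluginScanner and its methods are assumptions for this example, not ETSP’s actual code.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: find candidate plugin files by the
// "*impl.class" naming convention described in the requirements.
public class PluginScanner {

    // True if the file name follows the plugin naming convention.
    public static boolean isPluginFile(String fileName) {
        return fileName.endsWith("impl.class");
    }

    // Returns the names of files in pluginDir matching the convention.
    public static List<String> findPlugins(File pluginDir) {
        List<String> found = new ArrayList<>();
        File[] entries = pluginDir.listFiles();
        if (entries == null) {
            return found; // not a directory, or unreadable
        }
        for (File entry : entries) {
            if (entry.isFile() && isPluginFile(entry.getName())) {
                found.add(entry.getName());
            }
        }
        return found;
    }
}
```

A scanner along these lines would populate the plugin menu; the actual loading step would then use a ClassLoader to instantiate each discovered class.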
Project Info Viewer

Testing the Project Info Viewer tab was fairly straightforward. One change was made in each box for the project and saved. A further, more in-depth test was carried out where each component was given input data that should not be accepted, such as naming the file using special characters (such as ”, /, }). These tests were all successful.

ETSP Terminal

To test the ETSP Terminal tab, and ultimately ETSPLang, some small scripts were written and run. A known bug with the ETSP Terminal is that an END_S command is not strictly required because it is never actually checked for, and the testing confirmed this. Excluding this known bug, all scripts ran as expected and gave the expected output. As well as working scripts, some non-scripts were run to check that the system would give the expected error output. All error reporting behaved as expected.

9.1.3 Scalability Testing

As the Travelling Salesman Problem can easily be scaled up in multiple ways, two specific scalability tests were carried out, along with a multi-platform test.

Large Datasets

One way in which the Travelling Salesman Problem can be scaled up is the size of the input datasets. Among the standard TSP files hosted by the University of Waterloo, datasets range from 29 to 71,009 points, and there is no limiting factor on the number of possible points in a dataset (barring physical limits such as available memory). While testing dataset sizes, a test was considered a pass if the data loaded in without the appearance of inactivity. Using this criterion, ETSP managed to load every dataset in the list found on the University of Waterloo server. As a final test the world dataset, containing 1,904,711 points, was loaded; it loaded in under a second and was usable almost instantly. During this testing it was noted that the larger datasets took longer and longer to load into the Project Info Tab, to the point where long periods of inactivity occurred. The world dataset took over an hour to load into the Project Info Tab, but this was foreseen. It was considered adequate because a prompt dialogue is shown for large datasets to make sure users really do want to open this tab given the delays.

Complex Heuristics

The best test of ETSP’s scalability is the running of complex heuristics. Although heuristic complexity cannot strictly be tested for, several things were tested to rule out possible issues. Several tests were carried out using a deliberately flawed version of QSTSH, designed to stop working at random points, in order to test how ETSP handles bugged heuristics. When the heuristic stopped working, ETSP simply returned to the pending state, waiting for a plugin to be run. Where plugins take long periods of time to run there can be periods of apparent inactivity from the system, which could be seen as a negative. All in all ETSP appears to handle any size or type of heuristic, but exhaustive testing would be required to fully back this claim.

Multi-Platform Testing

ETSP was designed to run on all current major operating systems: Windows 7 & 8, Mac OS X 10.8 & 10.9, and Ubuntu Linux 12.04 & 13.04. Some additional features were put in place for operating-system-specific functions, such as OS X Lion full screen. To test ETSP on the different operating systems, separate virtual machines were created for each of the operating systems listed above. Each machine was given 1 GB of RAM and one 2.9 GHz Intel i7 core. All systems worked as expected except Mac OS X 10.8 & 10.9, which, when run in a virtual machine, seemed to have issues loading test files. As this problem had also been experienced during development, a compatibility mode was created allowing the user to start the JAR from the terminal with a compatibility GUI.
When run on Mac OS X outwith a virtual machine, ETSP appeared to work fine.
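The behaviour observed with the deliberately flawed QSTSH, where ETSP drops back to its pending state rather than crashing, can be achieved with a simple guard around the plugin call. The sketch below is an assumption of how such a guard might look, not ETSP’s real API (the Heuristic interface, PluginRunner class and pending flag are all illustrative); it also times each run, in line with the requirement to record plugin runtimes.

```java
// Hypothetical sketch of running a plugin safely: any exception thrown
// by the heuristic is caught, and the platform returns to its pending
// state instead of crashing. The Heuristic interface is illustrative.
public class PluginRunner {

    public interface Heuristic {
        int[] solve(double[][] points); // returns a tour as point indices
    }

    private boolean pending = true;
    private long lastRuntimeMillis = -1; // -1 means "no successful run yet"

    public boolean isPending() { return pending; }
    public long lastRuntimeMillis() { return lastRuntimeMillis; }

    // Runs the plugin; returns the tour, or null if the plugin failed.
    public int[] run(Heuristic plugin, double[][] points) {
        pending = false;
        long start = System.currentTimeMillis();
        try {
            int[] tour = plugin.solve(points);
            lastRuntimeMillis = System.currentTimeMillis() - start;
            return tour;
        } catch (RuntimeException e) {
            // A faulty plugin must not take the platform down with it.
            return null;
        } finally {
            pending = true; // back to waiting for the next plugin run
        }
    }
}
```

With a guard of this shape, a plugin that throws midway simply yields no result, and the platform is immediately ready for the next run, matching the behaviour seen in the flawed-QSTSH tests.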
These tests concluded that ETSP works on all major operating systems, with no obvious issues that had not already been accounted for during development.

9.1.4 Efficiency Testing

As a large part of ETSP’s efficiency is down to external code, only tests relating to ETSP itself were carried out: specifically, the loading and saving speeds of files, switching between tabs, and the accepting/rejecting of run data. To pass these tests the system had to carry out the task with no overly long pauses; an arbitrary threshold of 3 seconds was chosen. Testing showed that files load in under 3 seconds, as does switching between tabs, which averaged under 2 seconds, again dependent on project size, since large files slow the loading of the Project Info Tab. Rejecting and accepting run data completed in under a second. This test concluded that the system-specific parts of ETSP are efficient, but again it should be stressed that the system relies on outside code, which cannot be verified as efficient at this point.

9.1.5 Requirement Testing

Requirement testing was done in a simple Yes/No fashion, in the order of the requirements chapter (Chapter 4). Yes means the requirement exists and works as expected; any requirement marked No was either not developed or did not pass testing. Failed or unimplemented features are explained below.

Functional Requirements: Must

Load in files of the format (.txt, .tsp, .tsproj), Yes
Look for and load in plugins with format (*impl.class), Yes
Allow the user to save projects in multiple formats (.txt, .tsproj), Yes
Record the runtime of loaded plugins, Yes
Record the accuracy of a plugin’s output, Yes
Display information on plugin runs accurately and usefully, Yes
All must-have requirements were developed and passed testing.

Functional Requirements: Should

Allow the user to program scripts, Yes
Load in files from other formats such as (.csv), No
Look for and load in plugins with a class file containing required methods, Yes

During development it was decided that CSV files did not have a desirable file structure, and as such CSV loading was not implemented for the first version of ETSP. Apart from CSV file loading, all other requirements were implemented in full and worked when tested.

Functional Requirements: Could

Allow the user to program complex scripts for TSP solutions, with functions including:
Basic list manipulation, No
Variable comparison (such as Point b = Point a), No
Operations (such as *, %, -, +) on Point data (such as X, Y values), No
Load in all plugins in formats of (.JAR and .class) with required methods, No
Command line functionality, No

In the end it was deemed infeasible to add some of the planned functionality to ETSPLang, for reasons such as complexity and time constraints. Some command line functionality was added, but only to place the system into a compatibility-mode GUI. Given the time constraints it was decided to focus on the GUI and its HCI rather than develop a second user interface. As for plugin types, currently only .class plugins can be loaded, as time constraints and development preferences prevented further plugin types being developed.

Functional Requirements: Won’t

Calculate the big O value or the big Ω value for given plugins, No
This was placed under won’t because the task could be a project unto itself, given the complexity of determining code complexity by analysing the code alone.

Non-Functional Requirements: Must

Load and be usable within 5 seconds, Yes
Switch between tabs without any lag time, Yes
Save without any lag, Yes
Load files with up to 5000 points within 5 seconds, Yes

All non-functional must requirements were developed, although larger datasets make switching to the Project Info Tab increasingly slower. However, this is to be expected when dealing with large volumes of data.

Non-Functional Requirements: Should

Work on the most current stable and soon-to-be-stable versions of Java (Java 7 & 8), Yes
Have a working/accurate progress bar for script running, Yes/No
Not crash when no plugins, or no working plugins, are given, Yes

During development it proved difficult to implement a progress bar that worked on all operating systems without impeding the efficiency of the system. It is acknowledged that a solution for the progress bar must exist, but one was not found during development. For scripts within ETSPLang a form of progress output was created, in the shape of feedback to the user on where in the script execution currently was. All other non-functional should requirements were implemented in full and worked when tested.

Non-Functional Requirements: Could

Prevent itself from crashing while running outside plugins, Yes
If a faulty, damaged or glitchy plugin is used the system will not crash; instead it will simply not complete the plugin run and will return to a stable state.

Non-Functional Requirements: Won’t

Be runnable on older forms of Java (pre Java 8), Yes

Although this was placed under won’t, it was incorporated. ETSP has only been tested on Java 7 and Java 8, and because Java 8 was released during development, the emphasis on Java 7 was dropped. Not all tests have therefore been carried out on Java 7, so although this requirement is marked “Yes”, this only means ETSP has so far worked on Java 7 without any obvious issues.

9.2 Evaluation

9.2.1 Functionality

Using the MoSCoW classification of functionality from the requirements chapter (Chapter 4), the majority of functionality has been developed to a usable level. All must-have requirements, both functional and non-functional, were developed, and all important should-have requirements were developed as per the requirements chapter. The could-have functional requirements were not developed, due to complexity issues and time constraints; the could-have non-functional requirement, however, was developed.

9.2.2 Usability

Throughout development a large emphasis was placed on usability, and after testing it was apparent that this emphasis had paid off. By implementing features that users expect of systems, such as undo functionality, and operating-system-specific features such as Mac OS X Lion (10.7) full-screen mode, a more standard look and feel was achieved, making the system feel familiar and easy to use.
It was brought up during the user testing that the use of the word New in the Project menu seemed rather unusual, given that the menu item actually loads a project file. This could easily be changed at a future date.

9.2.3 Durability

It was good to see that during testing the system stayed robust at all times and did not suffer any unexpected crashes. On the evidence of the testing, the system can be said to be durable and robust, with no obvious issues.

9.3 Results & Conclusion

Although the user testing was carried out with only seven participants, the results clearly show the system to be intuitive and the User Manual easy to follow. To strengthen this claim a larger user group would be needed, selected to give a more accurate representation of the expected user base for ETSP. Given the results from the testing and the explanations given in the evaluation chapter, it is fair to say that ETSP is a working, usable and robust system as of its first major iteration.