Android Benchmarking And Its Authenticity
SECTION 1: Candidate Details
Name: Osman Sheikh
Student Number: 120428665
Project Title: Android Benchmarking And Its Authenticity
Word Count (excluding Appendices): Approximately 6000 words
Date: April 28th, 2015
SECTION 2: Project Summary
Brief description of the topic (50 words)
Due to the introduction of HSPA and LTE in
mobile telecommunications, low-quality
Android-enabled smartphones have flooded
the market. They claim to have specs
equivalent to high-end brands and score high
on benchmarking applications. This research
explores the concept of Android
benchmarking and examines the methods
these equipment manufacturers adopt to
fake and cheat the benchmarking
applications.
Central Research Question (20 words)
“Which is the most relevant Android
benchmarking tool available in the market,
and is it authentic enough for users to base
their judgement on its scores?”
Original Research Objectives (100 words)
• Find the most relevant Android
benchmarking tool
• Check and validate the authenticity of
the selected tool by developing test cases
• Generate a defect report based on the
test cases
• Highlight the benchmark faking and
cheating methods used by Android phone
manufacturers
• Negate and refute the claims of Chinese
handset manufacturers of providing high-quality
handsets at very low prices
• Enlighten users by revealing truths
about the Android benchmarking concept and
the level of involvement of equipment
manufacturers in this business
Revised research objectives (and why
revised) (100 words)
My objectives remained the same from the
beginning, despite the fact that I had to study
extensive literature on the 3rd and 4th generations
of mobile telecommunication technologies
and do detailed research on low-quality
“white box” Chinese tablets and devices
Four Keywords to describe the project
Benchmarking Applications
Android Benchmarking
Antutu Benchmark
Chinese android smartphones
Android Handset Comparison
White Box tablets and devices
Briefly describe the framework you are using
(50 words)
I have developed 12 test cases, each
with its relevant handset features,
and have also mentioned the device
components used in each test case.
Based on the test cases, a defect
report (with findings) is generated,
and an analysis of Android
benchmarking is given at the end.
SECTION 3: Findings and Analysis
Checklist
Does your report clearly express the findings
from your project and relate them to an
identified information systems theme?
YES
What is the Information Systems theme your
work addresses? (50 words)
Operating Systems; Application Software
Communication Technologies
Input and Output Devices
Input, Output and Interaction Styles
Social and Organizational Perspectives
Multimedia Networks
Wireless and Mobile Networks
What kind of a contribution do you make in
this area? Who would be interested in reading
your work and making use of your findings?
(100 words)
My research explores the methods
that Android-based equipment
manufacturers adopt to cheat and
fake Android benchmarks, selling
their handsets on the basis of the
fake scores these benchmarks
generate. The majority of mobile
phone users run Android as their
operating system and are deceived
by the false claims of these Chinese
handset manufacturers. The trend
of Android benchmarking is
growing, and users base their
smartphone buying decisions on
the scores generated by these
benchmarking applications. This
research can show that although
some benchmarks are useful
indicators and can give an
impression of a device, they cannot
tell you everything about how the
device performs. Android phone
users can find this research helpful
when deciding how to compare
Android-enabled devices and which
factors to look for when using
benchmarking applications to make
decisions.
SECTION 4: PERSONAL REFLECTIONS
When I started planning my work on this course, several different ideas occurred to me; however, I
chose the one in which I found maximum interest. My understanding is that unless you are interested
in the research area you choose, completing the project becomes very boring and cumbersome. It is
a very comprehensive course, and at several points during the research procedure one loses track of
things; unless you are interested and very specific about what you need, you will end up
complicating things for yourself. In information systems, several concepts and topics are interrelated,
and it becomes difficult to handle a large or ill-defined topic. Furthermore, completing this project
requires more time, effort and energy than one expects at the time of starting the research work. I
suggest starting work on the project at the start of the academic year.
Furthermore, I have realized that explorative research helps you understand not just the topic
you are researching but also several other topic areas associated with it. For example, my
research was on exploring Android benchmarking tools, but I also ended up studying extensive
literature on telecommunication technologies and the smartphone industry. In a nutshell, completing the
project gets easier if you choose an area of your interest or one that is in line with your field of work.
Table of Contents
Introduction
Concept of Benchmarking and The Most Relevant Application
Objectives with Introduction to Antutu
Conceptual Framework & Research Method
Test Plan Description
Test Design Specifications
Tests Specifications
Test Cases (1 to 12)
Findings and Error Report
Analysis & Conclusion
Appendices
Appendix 1: Topic Area Proposal
Appendix 2: Project Specifications
Introduction
Before introducing the topic of research, I consider it appropriate to write something about
my job profile and how the topic under discussion is relevant to my field of work and what
instigated me to choose this research area. I work as Enterprise Solutions Specialist with a
telecom company and the scope of my job includes introduction and selling of data and voice
enabled products and services to the SMEs and high profile corporate clients. The number of
mobile phone users has exceeded 145 million and there are five telecom companies operating
in the Pakistani market. The tele density has reached approximately 70% and there exists a
fierce competition between the operating companies. Because of this competition the
companies have dropped their rates on voice minutes and the focus of the companies is
changing towards deriving maximum revenue by selling data services. The introduction of 3rd
Generation and 4th Generation of mobile telecommunication technologies has enabled high
speed data transfers which have changed the complete face of data based services and mobile
phones industry in the country. The telecom companies have started investing in research and
development and many data based services like field force tracking, campaign management
tools, etc., have been introduced to attract more customers. Keeping in view the increase
in data services, the mobile phone manufacturers have shifted their focus from legacy
handsets to 3G and 4G enabled smart phones with high profile operating systems installed in
them. My job is to sell these devices and data based services (through a telecom company) to
these customers.
In the last two years I have noticed that the market has been flooded with low priced Chinese
smart phones. There are presently sixty plus mobile phone brands as compared to six or seven
major brands operating three years back in the country. Interestingly, all of them claim to
provide almost the same specifications and services, but at much lower prices.
Furthermore, all these brands install the same operating system i.e. “Android” in their
devices. The reason is that most third-party applications serving various purposes are
developed for Android. This motivated me to investigate further and find some
benchmarking tools for comparison of specifications and operating system installed on these
handsets. The purpose was not only to understand the difference myself but also to enlighten
and educate my customers based upon my findings. An important point that I would like to
mention here is that there are other operating systems too, such as iOS, Symbian and
Windows, but they are specific to their particular brands, namely Apple, Nokia and
Microsoft respectively. Almost all these low-priced Chinese smartphones use Android as their operating
system and thus the scope of my research is limited to finding the right benchmarking tool
available in the market for android enabled devices and then verifying the authenticity of that
tool by generating a defect report. The research can serve a double purpose by not only
checking the authenticity of the benchmarking software but by also checking the validity of
the claims of providing equivalent specifications to high end brands by these low cost
Chinese handset manufacturers.
In the next chapter I would try to explore the concept of benchmarking and find the most
relevant benchmarking application available in the market. Then I would discuss the
objectives and provide an overview of the benchmarking application selected. After that I
would explain the research method used to test the authenticity of that application and then
the conceptual framework of the research design explaining in detail how the research is
conducted followed by the research findings and the error report. In the end I would give my
personal analysis and conclusion on the research findings.
Concept of Benchmarking and the Most Relevant Application
In the last decade we have seen a proliferation of smart phones in the mobile phone industry
and the technology in these smart phones is continuously evolving with improved hardware
specifications introduced with every new model. As the usage of these phones is increasing
by leaps and bounds, the quality and the validity of specifications have become a concern for
the users and there have been several applications introduced to test the hardware specs and
performance of these sets. These tests performed by the applications in smartphones are
associated with assessing the performance characteristics of the smartphones hardware and
this concept is termed benchmarking. In benchmarking, a series of tests is
performed on the device to find the upper limits of its capacities. The idea is that if you put the phone under
enough stress, you can find its maximum performance, and that performance can be given a
numerical rating. The user can get an idea of how a phone performs in different modes from
scoring different parts of the phone’s performance. There are several benchmarking
applications available, however, every application rates the specifications by its own scale
and it is important to note here that two benchmarking applications cannot be compared with
each other but the two devices running the same benchmark can be compared for the same
specifications.
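The scoring idea described above can be sketched in a few lines of Python. The sub-test names, weights and reference values here are illustrative assumptions, not Antutu's actual formula:

```python
# Hypothetical composite benchmark score: each sub-test produces a raw
# result that is normalised against a reference device and weighted.
# All names and numbers here are invented for illustration.
REFERENCE = {"cpu_integer": 1000, "cpu_float": 800, "ram": 600, "graphics_3d": 1200}
WEIGHTS = {"cpu_integer": 0.3, "cpu_float": 0.2, "ram": 0.2, "graphics_3d": 0.3}

def composite_score(results: dict) -> float:
    """Weighted sum of sub-scores relative to the reference device."""
    total = 0.0
    for test, raw in results.items():
        total += WEIGHTS[test] * (raw / REFERENCE[test]) * 1000
    return round(total, 1)

# Two devices running the SAME benchmark are directly comparable:
device_a = {"cpu_integer": 1500, "cpu_float": 900, "ram": 700, "graphics_3d": 1100}
device_b = {"cpu_integer": 1200, "cpu_float": 850, "ram": 650, "graphics_3d": 1300}
print(composite_score(device_a), composite_score(device_b))
```

The sketch also makes the point above concrete: the number is only meaningful relative to the same benchmark's weights, so scores from two different benchmarking applications cannot be compared.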
Most of these applications focus on speed and hardware performance but there are also some
that are specific to some particular area e.g. gaming, web browsing or storage speed etc.
Some just benchmark the RAM to test how well the phone can hold up with multiple tasks or
with apps that require a lot of processing. Similarly there are some that just work on the
graphics. Examples of some well-known benchmarking applications and their specific areas
of testing are given below:
• Quadrant: benchmarks the CPU, memory and 2D/3D graphics
• NenaMark: benchmarks the strength of the GPU (Graphics Processing Unit)
• SunSpider: benchmarks JavaScript performance on tasks relevant to real-world use of
JavaScript
• Browsermark: benchmarks and compares the browsers of various smartphone devices
• Vellamo: benchmarks and provides a holistic view into browser performance
Similarly there are several other tools and benchmarking applications but the most commonly
used is “ANTUTU”. AnTuTu can run a full test covering the Memory Performance, CPU
Floating Point Performance, CPU Integer Performance, 2D, 3D Graphics Performance,
Database Input/Output performance and SD card reading/writing speed. A total score is also
reported once you run this benchmark. Furthermore, according to Google Play, “Antutu
benchmark is the most popular android smartphone and tablet benchmarking app in the world
with over 100,000,000 users”. I have version 4.0 of Antutu installed on my personal android
based handset so my entire experience and research findings are based on this specific
version of the application.
Objectives with Introduction to Antutu
There has been a recent debate that handset benchmarking tools do not provide authentic
results or the handsets manufacturers design devices in a way that cheat the benchmarking
tests and provide fake scores. Several android smartphone makers have been criticized for
artificially inflating benchmarking scores. Some researchers argue that they have always
known about this and have the ability to manipulate android benchmarks. It shouldn’t be a
surprise to us that in today’s technically advanced times, manufacturers of handsets and other
devices can do everything they can to score as high as possible on these benchmarking tests.
So the objective of this research is to check whether the testing methods developed by Antutu
are sufficient to provide valid scores and whether conflicts exist between
different Antutu test results. Antutu measures the efficiency and reliability of any Android
device and expresses system efficiency as a points score. I will try to find a variety
of ways by which the Android hardware makers are cheating end users, for example, some
researchers say that it is very easy to modify AnTuTu's benchmark tool to report "fake
hardware specs list and test scores”. A good example for this could be reporting a high RAM
and faster components than are even present in the device. Similarly there are several other
methods which I shall discuss after the research findings and defect report.
Before moving on to test design specifications and conceptual framework, I would like to
mention here that the following component evaluations are performed by Antutu
and are also relevant to my research design and objectives:
1. Processor Evaluation
Clock Speed test
Frequency Testing
CPU Float Point Test
CPU Core Temperature Testing
CPU Integer Value Test
CPU Load Testing
CPU Architecture Test
2. Memory Evaluation
RAM Size Detection
RAM Architecture Testing
RAM Usage
RAM Optimization Time Testing
3. SD Evaluation
SD Class Identification
SD Size Identification
SD Internal Class Testing
SD External Class Testing
4. GPU Evaluation
Shader Model Test
DPI Test
Temperature Sensors
GPU Architecture Support
Max Resolution Support Test
In addition to the above, Antutu thoroughly scans the following components when conducting
different tests:
• Camera
• Storage
• Processor
• RAM
• Baseband (Wi-Fi, GSM, CDMA)
• Battery Info
Furthermore, the main interface of Antutu includes the following information:
Device name, Device Score, Test Again Button, Device Details, Handset Model,
Overclocking and Owner Name.
Conceptual Framework and Research Method
Several test cases can be developed to test the complete functionality of the Antutu
benchmarking application; however, the twelve cases in the test plan description below
are enough to evaluate its functionality and the validity of its results.
These test cases are developed based on user experience.
Test Plan Description
The table below contains a list of the test cases, the features to be tested with each test case
and the software components associated with each test case. More specifically, this section
contains a brief description of each test case to be done.
Test Cases:
The following table maps test cases to features and components.
1. Test Screen Evaluation
   Features: Device Name; Device Score; Test Again Button; Device Details; Handset Model; Overclocking; Owner Name
   Components: access device attributes

2. Ranking Interface Evaluation
   Features: Base Score; Rank List; Top 20 Devices
   Components: access Wi-Fi component; access Antutu server

3. Device Interface Evaluation
   Features: Device Brand info; Device Model info; Device CPU Model info; Device GPU Renderer info; Rear Camera info; Front Camera info; IMEI info; Operating System info
   Components: access all hardware and software components

4. Storage Interface Evaluation
   Features: Memory info; RAM info and utilization; Internal SD info
   Components: access RAM and storage media

5. CPU Interface Evaluation
   Features: CPU Model info; CPU Architecture info; CPU Cores info; CPU Frequency info
   Components: access CPU components

6. Display Interface Evaluation
   Features: Renderer info; GPU Vendor; GPU Version info; Screen Density info; Multi-touch info; Touch Type info
   Components: access GPU and LCD

7. Camera Interface Evaluation
   Features: Rear Camera detail; Front Camera detail
   Components: access front and rear camera hardware components; the camera algorithm is also accessed

8. Battery Interface Evaluation
   Features: Battery Level info; Battery Temperature info; Battery Ampere Capacity info
   Components: access battery integrated circuit; access battery sensor

9. Operating System Interface Evaluation
   Features: SDK Version info; Android Version info; Kernel Version info
   Components: access kernel and software version

10. Sensor Components Evaluation
    Features: Direction, G, Light, Gyro, Temperature, Pressure, Ambient, Humidity, Linear Acceleration and Rotation Vector sensor support
    Components: access all device sensors

11. Network Interface Evaluation
    Features: Wi-Fi info; GSM info; 4G LTE info; CDMA info
    Components: access baseband

12. Test Phone (full device evaluation)
    Features:
    • Processor Evaluation: Clock Speed test; Frequency test; CPU Floating Point test; CPU Core Temperature test; CPU Integer Value test; CPU Load test; CPU Architecture test (ARM/Qualcomm Snapdragon, etc.)
    • Memory Evaluation: RAM Size detection; RAM Architecture test; RAM Usage; RAM Optimization Time test
    • SD Evaluation: SD Class identification; SD Size identification; SD Internal Class test; SD External Class test
    • GPU Evaluation: Shader Model test; OpenGL 2.0/3.0 test; 2D test; 3D test; DPI test; Temperature Sensors; GPU Architecture support; Max Resolution Support test
    Components: access all hardware and software components
Test Design Specifications
The testing approach will account for the following:
1. Test Screen
2. Ranking Interface
3. Device interface
4. Storage Interface
5. CPU Interface
6. Display Interface
7. Camera Interface
8. Battery Interface
9. Operating System interface
10. Sensor Components
11. Network Interface
12. Phone test
The tests are performed using different Android devices from different manufacturers. Since
the application targets professional Android users, each interface is checked by its design and
the logic behind that design. The black-box technique is used to verify the working of the
interfaces on different devices by testing the design and logic of each interface rather than the
programming code behind its features. The programming code is beyond the scope of this
research.
Environmental Requirements:
The following will be required in order to support the testing process.
Software
• At least Android 1.5 (Cupcake)
• Up to Android 4.3.1 (Jelly Bean)
• Antutu Benchmark Version 4.0
Hardware
• Storage Space – The application requires ten megabytes of storage.
• Processor – The processor can handle multiple user interactions.
• RAM – The memory will support multiple user interactions.
• Android supported smartphones
• Processor should be of ARM series.
• Uses all input and output components of device.
Communication
• Google Play: Google Play can be downloaded from the internet; most
smartphones contain it by default. Antutu Benchmark Version 4.0 can be
installed from Google Play.
Suspension / Resumption Criteria:
All tests will be run to completion once the test has begun. If the test is interrupted, then the
testing must start over, and the problem will be recorded. If an error occurs that makes
continuing with a test impossible then the cause of the error shall be reported, examined, and
repaired as quickly as possible. Once the error is fixed, the test during which the error
occurred will be restarted, and the test will be run in its entirety. In addition, if the affected
component is used elsewhere, all related cases will need to be rerun.
Risks and Contingencies:
The following list provides some risks and contingency measures.
If a software problem occurs in the device that affects its performance, I
recommend reinstalling the Android OS.
Make sure that no harm is done to the hardware during testing.
If the application does not respond, restart the device by holding the power key for
10-15 seconds.
Always install the same version of the Antutu software, i.e. version 4.0, as changing the
version can alter the test results.
Handset models used in the tests include the following:
Ainol(Flame, Spark).
Sky Vega ( IM-830L,IM850S).
LG Optimus LTE 2 F-160L.
LG Optimus G F-180K (Quad Control Edition).
Ramos.
HTC One X.
QMobile Z4 Noir
Samsung Galaxy S4
Tests Specifications
Test Procedures:
The testing procedures are listed for each test case. The tester will follow every
step in the test cases below. If no errors occur, the tester will need to
validate that the outcome is correct. If errors occur, a software error report will be
written.
Each test case has some specific steps which have to be taken to complete the
test. All test cases start at the home page.
All the features to be tested are contained within the test cases below.
(The line spacing in the test cases has been reduced to single to avoid page wastage.)
Test Cases
1) Test Screen_1
This test case checks and validates the main user interface screen.
Test Items:
The following information is needed to perform this test:
Device name
Device Score
Test Again Button
Device Details
Handset Model
Overclocking
Owner Name
Steps:
1. View the device name.
2. View the device score.
3. Touch the Test Again button to check whether the test starts again.
4. Touch the Device Details button to check whether device details are displayed.
5. Verify that the handset model is correct.
6. Verify whether overclocking is supported.
7. Verify that the owner name is the one you entered and that each word is capitalized.
Pre-Conditions:
Device must run at least Android 1.5 (Cupcake). Antutu Benchmark version 4.0 must be installed.
Post Conditions:
If the test items work correctly, the functionality is performed correctly. The Test Screen test
must succeed so that the user can use this application further.
2) Ranking Interface_2:
This test case will check whether the ranking interface is working correctly.
Test Items:
The items that are needed to perform this test include:
1. Base Score
2. Rank List
3. Top 20 devices
Steps:
1. Check that the base score matches the build.prop file.
2. Check connectivity to the internet.
3. Verify the updated ranking list.
4. Verify and load the top 20 devices.
Pre-Condition:
The base score must be tested before viewing the ranking and top 20 devices.
Post Condition:
The user can move to any other interface or exit the app.
3) Device Interface_3:
This test emulates device specifications and verifies hardware and software components.
Test Items:
The following test items are needed to perform this test:
1. Device Brand info
2. Device Model info
3. Device Cpu Model info
4. Device Gpu Renderer info
5. Rear Camera info
6. Front Camera info
7. Imei Info
8. Operating System info
STEPS:
1. Verify device brand info from prop file.
2. Verify device model info from prop file.
3. Verify device CPU model info from prop file.
4. Verify device GPU model info from prop file.
5. Verify device rear camera info from prop file.
6. Verify device front camera info from prop file.
7. Verify device IMEI info from device baseband.
8. Verify device operating system info from prop file.
Pre-Condition:
1. Build.prop file must be loaded in Antutu Benchmark.
Post Condition:
1. If the device shows no information about Wi-Fi and IMEI, then the error “no
baseband found” will be displayed.
2. If an unknown CPU or GPU is detected, “unknown” will be shown in the CPU or GPU
info.
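Since most of the verification steps above read values from the build.prop file, a minimal sketch of parsing that file may be useful. build.prop is a plain key=value text file; the property keys shown are standard Android names, but the sample values are made up:

```python
# build.prop is a plain key=value file. The keys below are standard
# Android property names; the sample values are invented.
def parse_build_prop(text: str) -> dict:
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, sep, value = line.partition("=")
        if sep:
            props[key.strip()] = value.strip()
    return props

sample = """
# begin build properties
ro.product.brand=SampleBrand
ro.product.model=SB-100
ro.build.version.release=4.2.2
"""
props = parse_build_prop(sample)
print(props["ro.product.brand"], props["ro.build.version.release"])
```

A verification step then amounts to comparing a parsed property against the value the benchmark reports for the same field.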
4) Storage Interface_4:
This will emulate the use of the storage interface.
Test Items:
The following test items are needed to perform this test:
1. Memory info
2. Ram info and utilization
3. Internal SD info
Steps:
1. Verify RAM, storage media speed, size.
2. Get and verify used and free RAM.
3. Get and verify used and free storage speed and size.
Pre-Condition:
1. Internal storage must be mounted.
2. External storage must be mounted.
3. Kill third-party apps to verify how much RAM the system consumes.
Post Condition:
1. Shows the SD class speed (Mbps) of the SD storage.
2. Shows the RAM consumed by the system.
3. Shows free and used internal and external storage.
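As a rough illustration of how used and free RAM could be verified independently of the app, one can parse /proc/meminfo, the Linux interface that Android exposes for memory figures. The sample text below is illustrative:

```python
# /proc/meminfo lines look like "MemTotal:  1020000 kB"; parsing them
# gives independent figures to compare with what the app reports.
# The sample content is made up.
def meminfo_kb(text: str) -> dict:
    values = {}
    for line in text.splitlines():
        key, sep, rest = line.partition(":")
        parts = rest.split()
        if sep and parts and parts[0].isdigit():
            values[key.strip()] = int(parts[0])  # value in kB
    return values

sample = "MemTotal:  1020000 kB\nMemFree:   250000 kB\n"
info = meminfo_kb(sample)
print(f"used: {info['MemTotal'] - info['MemFree']} kB of {info['MemTotal']} kB")
```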
5) CPU Interface_5:
This test emulates the use of the CPU and extracts its information and maximum performance.
Test Items:
The following test items are needed to perform this test:
1. Cpu Model info
2. Cpu Architecture info
3. Cpu Cores info
4. Cpu Frequency info
Steps:
1. Verify CPU model file from build prop file.
2. Verify CPU cores info from build prop file.
3. Verify CPU architecture info from build prop file.
4. Verify CPU clock frequency from build prop file.
Pre-Condition:
Device must read build.prop file for the identification of CPU components.
Post condition:
Device must identify next component info from prop file.
6) Display Interface_6:
This test emulates the use of the display interface and extracts information about the GPU.
Test Items:
The following test items are needed to perform this test:
1. Renderer info
2. Gpu Vendor
3. Gpu version info
4. Screen Density info
5. Multi touch info
6. Touch Type info
Steps:
1. Verify GPU renderer from build prop file.
2. Verify GPU vendor from build prop file.
3. Verify GPU version info from build prop file.
4. Verify screen density from build prop file.
5. Verify multi touch info from build prop file.
6. Verify touch type info from build prop file.
Pre-condition:
Device must read build prop file for the identification of GPU components.
Post condition:
Device must identify next component info from prop file.
7) Camera Interface_7:
This will emulate the use of the camera interface.
Test Items:
The following test items are needed to perform this test:
1. Camera Rear detail
2. Camera Front detail
STEPS:
1. Verify the details of primary camera from build prop file.
2. Verify the details of secondary camera from build prop file.
Pre-condition:
Device must read the build.prop file for the identification of camera components.
Post condition:
Device must identify next component info from build prop file.
8) Battery Interface_8:
This will emulate the use of the battery interface.
Test Items:
The following test items are needed to perform this test:
1. Battery Level Info
2. Battery Temperature info
3. Battery Ampere capacity info
Steps:
1. Verify the current battery percentage from hardware.
2. Verify the current battery temperature from temperature sensor.
3. Verify the battery capacity from the build.prop file.
Pre-condition:
1. Device must read build prop file for the verification of battery type.
2. Device must activate temperature sensor.
Post Condition:
Device must identify next component info from build prop file.
9) Operating System interface_9:
This will emulate the functionality of the operating system interface.
Test Items:
The following test items are needed to perform this test:
1. SDK version info
2. Android version info
3. Kernel version info
STEPS:
1. Verify the Sdk version from build prop file.
2. Verify android version from build prop file.
3. Verify device kernel version from build prop file.
Pre-condition:
1. Access should be granted to verify SDK version.
2. Access should be granted to verify android version.
3. Access should be granted to verify device kernel.
Post condition:
Device must identify next component info from build prop file.
10) Other Interfaces (Components)_10:
This test case will check all the sensors present in the device.
Test Items:
The following test items are needed to perform this test:
1. Direction Sensor Support
2. G Sensor Support
3. Light Sensor Support
4. Gyro Sensor Support
5. Temp Sensor Support
6. Pressure Sensor Support
7. Ambient Sensor Support
8. Humidity Sensor Support
9. Linear Acceleration Sensor Support
10. Rotation vector Sensor Support
Steps:
1. Verify device direction sensor support info from prop file.
2. Verify device G sensor support info from prop file.
3. Verify device light sensor support info from prop file.
4. Verify device gyro sensor support info from prop file.
5. Verify device temperature sensor support info from prop file.
6. Verify device pressure sensor support info from prop file.
7. Verify device ambient sensor support info from prop file.
8. Verify device humidity sensor support info from prop file.
9. Verify device linear acceleration sensor support info from prop file.
10. Verify device rotation vector sensor support info from prop file.
Pre-condition:
Device must request access to read data from sensors.
Post condition:
Device must identify next component info from build prop file.
11) Network Interface_11:
This will emulate the functionality of the network interface using the baseband.
Test Items:
The following test items are needed to perform this test:
1. Wi-Fi Info
2. GSM Info
3. 4G LTE info
4. CDMA info
Steps:
1. Verify device baseband.
2. Verify device MAC address from network controller.
3. Verify GSM support.
4. Verify 3G support
5. Verify 4G support.
6. Verify CDMA support.
Pre-condition:
1. Device must access the baseband information, which is then transferred to the build.prop file.
2. The device baseband gives information about each network component when access is
requested with superuser rights.
Post Condition:
Device must identify next component info from build prop file.
12) Test Device_12:
Test Items:
The following test items are needed to perform this test:
Processor Evaluation:
1. Clock Speed test
2. Frequency Testing
3. CPU Float Point Test
4. CPU Core Temperature Testing
5. CPU Integer Value Test
6. CPU Load Testing
7. CPU Architecture Test (ARM/Qualcomm Snapdragon, etc.)
Memory Evaluation:
1. RAM Size Detection
2. RAM Architecture Testing
3. RAM Usage
4. RAM Optimization Time Testing
SD Evaluation:
1. SD Class Identification
2. SD Size Identification
3. SD Internal Class Testing
4. SD External Class Testing
GPU Evaluation:
1. Shader Model Test
2. OpenGL 2.0/3.0 Test
3. 2D test
4. 3D test
5. DPI Test
6. Temperature Sensors
7. GPU Architecture Support
8. Max Resolution Support Test
Steps:
1. Verify and run processor evaluation.
2. Verify and run memory evaluation.
3. Verify and run GPU evaluation.
4. Verify and run SD storage evaluation.
5. Verify and submit base score.
Pre-Condition:
1. Device must be running Antutu Benchmark.
2. User must tap the Test button to start testing.
Post Condition:
1. Device must complete the benchmark run.
2. Device should show the benchmarking score.
3. Device must move to the ranking interface.
4. Device score must be shown in a bar chart.
Findings and Error Report
Defect severity is measured in the following scale:
1. Fatal
2. Minor
3. Serious
Test Case No.3 (Device Interface)
Test Case Function:
Device interface emulates device specifications and verifies hardware and software
components.
Defect: Built in altered version of Antutu Software
This cheating method is generally found on fake Chinese devices. To trick the customers, the
manufacturers change the client side of AnTuTu Benchmark application that generates a list
of fake hardware specs. There is a common problem with these devices that one cannot
uninstall the built-in AnTuTu application or install a newer version of it.
Cases affected by defects:
1. Test Case No.2 (Ranking Interface_2)
2. Test Case No.3 (Device Interface_3)
3. Test Case No.12 (Test Device_12)
Defect Severity: Serious
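One way a tester could catch this defect is to pull the installed APK off the device (e.g. with adb) and compare its checksum against the official release. The sketch below is my own illustration of that idea; the function names are hypothetical and the digest would have to come from a trusted copy of the genuine application.

```python
import hashlib

# Hypothetical integrity check for a benchmark APK pulled off a device.
# A tampered built-in AnTuTu client would fail the comparison against
# the digest of the official release.

def sha256_of(path):
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_genuine(apk_path, official_digest):
    """Compare a pulled APK against the digest of the official build."""
    return sha256_of(apk_path) == official_digest
```

In practice Android's own APK signature verification serves the same purpose, which is why these altered clients block reinstallation of a signed official version.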
Test Case No.6 (Display Interface_6)
Test Case Function:
This test emulates the uses of display Interface and extracts the information about GPU.
Defect: self-omission of test frames and enhancement of frame rate by the
processor
Some built-in chips in these imitation handsets recognize the AnTuTu application and self-omit test frames. For example, while running a 3D test, the frames-per-second figure is inflated by the processor, resulting in a better overall test score.
Defect: Altering of screen resolution
When the screen resolution is high, the processor takes more load and processing slows down; slow processing results in lower scores. Some fake devices self-generate a low resolution and feed it to AnTuTu to avoid overloading the processor, thus generating fake high scores.
Cases affected by defects:
1. Test No.2 (Ranking Interface_2)
2. Test No.12 (Test Device_12)
Defect Severity: Fatal
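The resolution trick above could be flagged by cross-checking the resolution the benchmark was fed against the resolution the panel actually drives (on a real device, obtainable from the display settings or via adb). The helper below is a hypothetical sketch of that comparison, with names of my own choosing.

```python
# Hypothetical check: a device that renders the benchmark at a lower
# resolution than its physical panel is suspect. Resolutions are
# (width, height) tuples in pixels.

def resolution_mismatch(reported, actual):
    """Return True if the resolution reported to the benchmark covers
    fewer pixels than the panel's actual resolution."""
    rw, rh = reported
    aw, ah = actual
    return rw * rh < aw * ah

# A device rendering the test at 800x480 while the panel is 1280x720
# would be flagged:
print(resolution_mismatch((800, 480), (1280, 720)))  # True
```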
Test Case No.5 (CPU Interface_5)
Test Case Function:
This test emulates CPU usage and extracts information about maximum performance.
Defect: Temporary enhanced performance of CPU
As mentioned before, some devices (including some models of high-end brands like Samsung) have an added feature that recognizes an AnTuTu installation. While the benchmark is running, these devices self-improve the CPU frequency by an additional 20 to 25 MHz for a short time; overclocking for such a short duration does not destabilize the device but boosts the score temporarily. The CPUs do not operate at that higher frequency during general processing, only when they recognize AnTuTu benchmark testing.
Cases affected by defects:
1. Test No.2 (Ranking Interface_2)
2. Test No.12 Test Device_12
Defect Severity: Serious
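This benchmark-time boost could be detected by sampling the CPU frequency while the benchmark runs (on Linux-based Android the current frequency is exposed under /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq) and flagging samples that exceed the advertised maximum by roughly the 20 to 25 MHz margin described above. The function below is a sketch of my own; the names and the threshold are assumptions, not part of any benchmarking tool.

```python
# Hypothetical detector for the temporary CPU boost: flag any sampled
# frequency that exceeds the rated maximum by at least threshold_khz
# (20000 kHz = 20 MHz, matching the boost described in the defect).

def boost_detected(rated_max_khz, samples_khz, threshold_khz=20000):
    """Return True if any frequency sample exceeds the rated maximum
    by at least threshold_khz."""
    return any(s - rated_max_khz >= threshold_khz for s in samples_khz)

# A 1.512 GHz CPU briefly clocking at 1.536 GHz (a 24 MHz boost):
print(boost_detected(1512000, [1512000, 1536000, 1512000]))  # True
```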
Test Case No.12 (Test Device)
Test Case Function:
A complete device test based on use case 12.
Defect: Cheat on hardware components
AnTuTu Benchmark takes real data from devices, but unfortunately some Chinese companies, as well as some international brands, create a fake build.prop file, which contains all the information about the device. When this fake data is fed into the AnTuTu application, it produces wrong results, and the user is satisfied by getting high performance readings on a low cost handset.
Cases affected by defects:
1. Test No.1 Test Screen_ 1
2. Test No.2 Ranking Interface_2
3. Test No.12 Test Device_12
4. Test No.3 Device Interface_3
5. Test No.5 CPU Interface_5
6. Test No.6 Display Interface_6
7. Test No.9 Operating System interface_9
Defect Severity: Fatal
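build.prop is a plain key=value file, which is exactly why it is easy to fake: a tool that trusts its properties (model, manufacturer, chipset) can be fed whatever the vendor plants there. The parser below is a minimal sketch showing the file format; the sample properties are illustrative, and a real cross-check would compare these values against hardware probed directly (e.g. /proc/cpuinfo).

```python
# Minimal parser for Android's build.prop format: one key=value pair
# per line, with blank lines and "#" comments ignored.

def parse_build_prop(text):
    """Parse build.prop text into a dict of properties."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Illustrative fake properties planted by a manufacturer:
sample = """
# planted to impersonate a high-end handset
ro.product.model=GT-I9500
ro.product.manufacturer=samsung
"""
props = parse_build_prop(sample)
print(props["ro.product.model"])  # GT-I9500
```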
Analysis & Conclusion
After conducting the tests in detail, there is no doubt that some smartphone manufacturers perform system optimizations that specifically target benchmarking applications in order to inflate test scores. The defect report proves there are several methods of tricking customers. I have mentioned five different ways of cheating used by these manufacturers; however, there must be other, more technical ways to bypass the benchmarks. When I ran the application on white box brands like Ainol and Ramos, the results were shocking. These devices are not even able to play a 3D game, yet the scores generated for them surpass brands like Motorola and the Google Nexus. While benchmarking, the CPU clock rises, which can also be felt through the heat generated in the device. This temporary overclocking gives artificial scores, suggesting improved performance. In reality, these devices are slow, and their multitasking ability cannot in any way be compared to high end brands.
China's white box iPad substitutes top the list of fake devices. They don't just fake benchmarks but also use low quality components. The chip makers for these devices design their products to detect the presence of the most used benchmarking tools, such as AnTuTu and Quadrant, in the operating system; when these tools are run, the chips behave in a way that produces artificially inflated performance. Sometimes you can get a handset with better specifications, but in the longer run the hardware stops performing because of poorly built firmware.
Another tactic is to disable the lower clock speeds and sleep modes on all CPU cores and lock the cores at their maximum speed when the device detects a benchmarking tool. Some manufacturers apply the same tactics to the Graphics Processing Unit (GPU) too. Interestingly, some high end brands like Samsung have also been found to be involved in this practice, i.e. "temporarily boosting the results when the device senses that a benchmark is running". In late 2013 a member of the beyond3d.com forums discovered benchmark-specific optimizations in the Samsung Galaxy S4; further investigation revealed that the device's power management allowed higher GPU frequencies when certain benchmark applications were run. So, cheating on benchmarks has become more common as more and more benchmarking tools are developed to compare devices.
There have also been cases of tampering with the benchmark applications and permanently installing altered versions of benchmarking tools on the devices. In these cases the pre-installed benchmarking application cannot be removed from the device, and a fresh or upgraded version of it cannot be installed. The installed version always shows high scores on the device it ships with.
Some Chinese handset manufacturers create a fake build.prop file containing all the information about the device. The actual specifications of the device are different; when the specifications generated through these fake build.prop files are fed into the AnTuTu application, it produces wrong results, and the user is satisfied by getting high performance readings on a low cost handset.
In addition, some manufacturers of these imitation handsets build chips that recognize AnTuTu or other benchmark applications and self-omit test frames: while running a 3D rating test, the frames-per-second figure is inflated by the processor, resulting in a better overall test score. Similarly, the reported resolution is meddled with to avoid overloading the processor and thus generate fake high scores.
Faking benchmarks has become a huge business in Asia, and in China particularly amongst the white box iPhone and iPad substitute makers, as their business is legitimized by reassuringly high benchmark scores. AnTuTu, which bills itself as an "impartial and unbiased smart phones benchmark", itself says on its website that "there are some smartphone makers who do system optimization especially aiming at AnTuTu Benchmark to improve the test scores".
I think that benchmark detection is now so common and easy that benchmarking application developers must be taking remedial measures to counter this issue and stay one step ahead of the optimizations. However, what if the equipment manufacturers target the program code next? It would open another Pandora's box, and keeping in mind the speed with which technology is advancing, that seems to be next on the cards.
I believe the effort equipment manufacturers put into faking benchmarks is not worth the small gains in performance; what they lose is their integrity. Instead of investing time, money and energy in developing ways to fake or bypass benchmarks, they should invest their efforts in manufacturing better performing devices.
As a suggestion, I would say that although some benchmarks are useful indicators, avoid using benchmarks that cannot be upgraded. Furthermore, benchmarks can give you an impression of a device, but they cannot tell you everything about how it performs.
[The entire research has been done based on my years of personal experience and interest
in android handsets and benchmarking applications. No quotations have been copied or
references taken from any book or magazine. However, for the purpose of gaining clarity
on the subject, some websites were consulted which I have mentioned in the Topic Area
Proposal]
Appendix 1
Topic Area Proposal
Author: Osman Sheikh Date: April 2015
Working Title: Android Benchmarking and Its Authenticity
Theme:
The introduction of High Speed Packet Access (HSPA) and LTE has enabled high speed data transfers, which have changed the face of the mobile phone industry in Pakistan. The trend has shifted from legacy phones to smartphones, and Chinese Android-based smartphones have flooded the market. Many users have concerns regarding the quality and genuineness of the specifications claimed by these Chinese equipment manufacturers. This prompted me to investigate further and find benchmarking tools for comparing the specifications and operating systems installed on these handsets. This project is based upon my research into the handsets, the operating systems installed on them, and applications for comparing the specifications of these devices.
Research Questions:
1. How has the introduction of 3rd Generation and 4th Generation mobile telecommunication
technologies impacted the smart phone industry?
2. Can the white box tablets and Chinese smartphones compare in specifications with high
end brands like Samsung or the Google Nexus?
3. What is the most authentic way of comparing android enabled devices?
4. Is android benchmarking an error free method to rate the performance of different android
enabled devices?
5. Which is the most authentic benchmarking tool, and why?
6. What are the methods handset manufacturers adopt to cheat on benchmarks?
7. Is it worth investing time, money and energy in developing benchmark faking methods
keeping in view the “morality question” and business ethics in mobile phone industry?
Outline of Research Argument:
In this project the scope of my research is limited to finding the right benchmarking tool available in the market for Android-enabled devices and then verifying the authenticity of that tool by generating a defect report. The research serves a double purpose: it checks the authenticity of the benchmarking software, and it also checks the validity of the low cost Chinese handset manufacturers' claims of providing specifications equal to or better than those of high end brands.
Links to wider Information Systems issues
Research in application development
Android benchmarks development
Unlawful system optimizations for small gains
Security breach in software development
Complexity of ethics in Information and Communication Technologies
Links to syllabus of other courses within the degree:
Introduction to Information System (IS-1060)
Computer Software and Networks ( Operating Systems; Application Software)
Communication Technologies
Input and Output Devices
Information and Communication Technologies (IS-2138)
Human Computer Interaction
Information Processing and perception
Input, Output and Interaction Styles
Social and Organizational Perspectives
Multimedia Networks
Wireless and Mobile Networks
Key words or phrases for use in an online search
HSPA, LTE and WiMAX
Chinese android smartphones
Benchmarking Applications
Android Benchmarking
Antutu Benchmark
Android Handset Comparison
White Box tablets and devices
Research Framework:
For the purpose of finding the right benchmarking tool, I did extensive research online, downloaded several benchmarking applications, and personally tested them on different Android devices. After finding the most relevant application, I developed 12 test cases, with each case testing several different features while accessing different components of the device. Testing procedures have been defined for each test case, including test items, steps to follow while conducting the test, and the pre and post conditions of the test. In the end a defect report is generated, on which my final conclusion and analysis are based.
Required resources and issues of access:
I used eight different Android-enabled devices, including Chinese white box devices and some high end brands. I personally possessed two, while the remaining were acquired from friends and colleagues in the market (the list of devices is mentioned in the report). All the tested applications and benchmarking tools were available online for free, and most of them were acquired from Google Play.
Required skills and techniques for research:
My view is that any kind of research requires some prior information, interest and experience in the related field. I have been dealing in smartphones for years, and my interest in operating systems paid off while developing test cases for testing different features and components of the devices. Technical knowledge of handset specifications and some understanding of the different versions of the operating system are prerequisites for conducting this research.
References to websites and articles:
www.androidbenchmark.net
https://browser.primatelabs.com/android-benchmarks
www.antutu.com
“How to benchmark your device” on
http://www.techhive.com/article/255977/how_to_benchmark_your_android_device.ht
ml
“Ten awesome apps for benchmarking your smart phone” by John Corpus on
http://www.tomsguide.com/us/pictures-story/485-best-benchmarking-apps.html
“What does Antutu Benchmark actually measure” by Chris Hoffman
http://www.makeuseof.com/tag/antutu-benchmark-measure/
Justification of interest to others:
Due to the high speed internet now available on handheld devices, interest in smartphones has increased tremendously. Seeing this as a business opportunity, many companies have started manufacturing low quality devices and cheating customers by claiming fake specifications. Customers seek tools for comparing specifications, and such tools are available. My research and findings on the authenticity of these benchmarking tools can help users choose the right tool and decide how much to rely on its results. Many people in the business of selling smartphones can enlighten and educate their customers based on the findings. Furthermore, it is possible to contact the equipment manufacturers and make them realize their involvement in the dirty business of faking benchmarks. Instead of investing time, money and energy in developing ways to fake or bypass benchmarks, they should invest their efforts in manufacturing better performing devices.
Appendix 2
Project Specifications
Introduction and Research Question
The nature of my job and my interest in the field of enterprise solutions prompted me to do some research on the 3rd and 4th generations of mobile telecommunication technologies, their impact on data transfer rates, and the introduction of new and innovative applications supported by high speed internet on smartphones. While studying the related literature extensively, I narrowed my research down to smartphones supporting higher internet speeds. Inexpensive Chinese smartphones caught my attention, and my interest narrowed further to comparing these low quality devices with the renowned brands in the industry. As all these devices use Android as the operating system, the best approach was to research Android benchmarking applications and check their authenticity. So, after narrowing down my larger research area, I would summarize my research question as:
“Which is the most relevant android benchmarking tool available in the market, and is it
authentic enough for users to base their judgement on its scores”
Objectives
Find the most relevant android benchmarking tool
Check and validate the authenticity of the tool selected by developing test cases
Generate a defect report based on the test cases
Highlight the benchmark faking and cheating methods used by android phone
manufacturers
Negate and refute the claims of Chinese handset manufacturers of providing high
quality sets in very low prices
Enlighten the users by revealing truth about android benchmarking concept and the
level of involvement of equipment manufacturers in this business
Requirements
Software (Operating System and Application)
• At least Android 1.5 (Cupcake)
• Supports up to Android 4.3.1 (Jelly Bean)
• Antutu Benchmark Version 4.0
Hardware (Handsets):
Ainol (Flame, Spark)
Sky Vega (IM-830L, IM-850S)
LG Optimus LTE 2 F-160L
LG Optimus G F-180K (Quad Control Edition)
Ramos
HTC One X.
QMobile Z4 Noir
Samsung Galaxy S4
Test Plan and Design
The tests are performed using different Android devices from different manufacturers. Since the research is aimed at professional Android users, each interface is checked against its design and the logic behind that design. The black box technique is used to verify the working of the interfaces on different devices by testing the design and logic, not the programming code behind the different features and interfaces; the programming code is beyond the scope of this research.
Twelve test cases are developed, with several features associated with each test case and also
the device components used in each test case. The test cases are mentioned below:
1. Test Screen
2. Ranking Interface
3. Device Interface
4. Storage Interface
5. CPU Interface
6. Display Interface
7. Camera Interface
8. Battery Interface
9. Operating System Interface
10. Sensor Components
11. Network Interface
12. Phone Test
Test suspension and resumption criteria with contingencies and risks expected during the test
are also explained in detail.
Test Procedure
There are four levels involved in conducting each test case:
Test Item (the features tested)
Test Steps (the steps involved in the test)
Test Precondition (conditions prior to starting the test)
Test Post Condition (conditions after conducting the test)
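The four levels above give every test case the same shape, which can be sketched as a simple record. This is an illustration of the structure only; the example field values are abbreviated from the report, and the class name is mine.

```python
from dataclasses import dataclass

# A minimal sketch of recording one test case at the four levels
# described above. Field values below are abbreviated examples.

@dataclass
class TestCase:
    name: str
    items: list           # Test Item: the features tested
    steps: list           # Test Steps: steps involved in the test
    preconditions: list   # conditions prior to starting the test
    postconditions: list  # conditions after conducting the test

cpu_case = TestCase(
    name="CPU Interface_5",
    items=["Clock Speed test", "CPU Load Testing"],
    steps=["Verify and run processor evaluation"],
    preconditions=["Device must be running AnTuTu benchmark"],
    postconditions=["Device should show benchmarking score"],
)
print(cpu_case.name)  # CPU Interface_5
```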
Findings and Error Report
The errors found while running test cases are categorized as below:
1. Fatal
2. Minor
3. Serious
The defects are explained in detail, along with the names of the test cases affected by each defect.
Conclusion and Analysis
Based on the findings after conducting the tests, I have given my personal analysis of the use and authenticity of these benchmarking tools. The ways of faking benchmarks adopted by equipment manufacturers are explained in detail, along with what a user should expect while running these benchmarking applications.