NATIONAL FISH AND WILDLIFE
SERVICE SURVEY REDESIGN
Hammelburger, Pease, Sweeper
CONTENTS
INTRODUCTION
POPULATION
PERSONAS
A PROBLEM OF FORM(S)
COMPETITIVE ANALYSIS
A SURVEY FOR HUNTERS
PROTOTYPING & TESTING
BIBLIOGRAPHY
INTRODUCTION
Since its inception in 1871 as the Commission of Fish and Fisheries (USFWS, NCTC 2014), the U.S. Fish and Wildlife Service has served as the government's premier agent for the conservation of wildlife species and habitat. As a division of the Department of the Interior, the Fish and Wildlife Service has slowly expanded to encompass a variety of purposes, mostly related to the monitoring and preservation of biodiversity. The National Fish and Wildlife Service as we know it can be traced to a 1940 reconstitution and reorganization that merged the Bureau of Fisheries and the Bureau of Biological Survey into an overarching organization devoted to "protect and preserve in their natural habitat representatives of all species and genera of their native flora and fauna, including migratory birds" (Convention On Nature Protection And Wild Life Preservation In The Western Hemisphere, Oct. 12, 1940).
In pursuit of this aim, the NFWS employs a variety of data collection methods aimed at generating as complete a view as possible of the overall health of the species and ecosystems under its purview. For the purposes of monitoring migratory birds, the NFWS draws data from two primary sources: ecological surveys of the birds' (primarily arctic) breeding grounds, and surveys of licensed hunters. The latter will be the primary focus of this paper.
The NFWS began collecting what it calls the "National Harvest Survey of Sports Hunters" in 1952. Although the survey has changed marginally over the years, it asks licensed waterfowl hunters to report the number of birds harvested over the course of a hunting season. This data is then used not only to report on the overall health of the game bird population, but also to help set the limits on and standards for successive hunting seasons (Silverman, Wilkins, 2015).
In recent years the USFWS has received information on over 1,000,000 registered waterfowl hunters per hunting season. From this pool it selects around 10% to receive survey forms, which are intended to be completed at the conclusion of the hunting season with a record of the hunter's personal totals for both birds harvested and days spent hunting. Of those selected, around half submit completed surveys at season's end. This information is then gathered by the migratory birds division, analyzed in comparison with the data gathered from the surveys conducted at the birds' nesting sites, and used to make recommendations and impose regulations for the succeeding year's hunting season.
This paradigm has changed little in the last fifty years, and a number of concerns have arisen around it. Firstly, the cost of conducting the survey on paper is not insignificant: Dr. Wilkins and Dr. Silverman estimate that the NFWS spends over half a million dollars on postage alone. Upon receipt of the responses, NFWS personnel must devote considerable staff hours to sifting through and clarifying the often error-filled response forms. While some forms can be salvaged with correction and discrimination, Wilkins and Silverman estimate that around a quarter of the surveys received must be discarded due to errors in the respondents' submissions. Of additional concern to the NFWS is a decline in the response rate, which they suspect is related to an increasing unwillingness to complete and submit a paper survey by mail (Silverman, Wilkins, 2015).
Taken together, these problems present an opportunity to redesign the "National Harvest Survey of Sports Hunters," informed by modern tools and theories in survey design and distribution.
POPULATION
To design the US Fish and Wildlife Service survey, it is first necessary to identify and define the demographics of the population of sports hunters to be surveyed. We attempt to take into account the hunters' sex, age, regional distribution, education, race, and electronic literacy. These factors provide a general understanding of the population who may be selected for the US Fish and Wildlife Service's survey.
As has been noted since surveys of hunter populations were first recorded, most duck hunters are expected to be male. In a 1997 survey by Ringelman, female participation was lower than 10%, and the 2005 National Duck Hunter Survey found that 99% of respondents were male. The trend continues: a 2008 survey found that 86% of active hunters identify as white and 84% identify as male. Additionally, 38% report living in rural areas (Responsive Management, The National Shooting Sports Foundation, 2008). It is also significant that inactive hunters are generally older than active hunters and discontinue the sport due to their age and health. "Only 10% of active hunters are 65 years old or older, 23% of inactive hunters are that age, reinforcing other findings that suggest that many inactive hunters simply dropped out because of age or health rather than for other reasons" (Responsive Management, The National Shooting Sports Foundation, 2008).
Due to the reported homogeneity of most hunters, and in an effort to combat a declining waterfowl hunter population, current recruitment efforts have targeted women, young people, the disabled, and minorities (Responsive Management, The National Shooting Sports Foundation, 2008). A 2011 US Census Bureau survey identified participation rates amongst ethnic groups within the U.S.: 2% of Hispanics, 7% of Whites, 2% of African Americans, 0.5% of Asian Americans, and 2% of those identifying as other races participate in hunting (U.S. Census Bureau, 2011).
Furthermore, recent studies amongst duck hunters show these trends persisting, albeit with a marginal increase amongst the under-represented groups. According to the U.S. Fish & Wildlife Service 2011 National Survey of Fishing, Hunting, and Wildlife-Associated Recreation, "of the 13.7 million participants who hunted, 89 percent (12.2 million) were male and 11 percent (1.5 million) were female." The hunting participation rate also increased by roughly five percent as individuals reached 65 years of age, and declined at a rate of 2 percent for those 75 and older. The largest active age group was those 45 to 54 years old (U.S. Census Bureau, 2011).
Additionally, "twenty percent of duck hunters said they 'frequently' access the internet to look up duck hunting information; 49% access the internet 'once in awhile,' and 31%, 'not at all'" (U.S. Census Bureau, 2011). From this research, it is evident that most duck hunters may not be tech savvy. Subsequently, we must acknowledge the limitations in technological literacy amongst duck hunters, as well as make accommodations for an increasingly older population.
With the predominant demographic amongst waterfowl hunters identified, it becomes easier to define a set of design heuristics to apply. An extensive body of research exists exploring the relationship between older individuals and information technology. In particular, research shows a correlation between a positive perception of technology's usefulness and an individual's likelihood of using it (Adams, Stubbs, & Woods, 2005). It is therefore imperative that the design revisions to the National Harvest Survey of Sports Hunters cultivate a perception of usefulness amongst those surveyed.

As the aging process is associated with a decrease in working memory and spatial visualization skills (Adams, Stubbs, & Woods, 2005), and the majority of surveyed hunters now fall, or soon will fall, within this demographic, affordances should be made to decrease the necessity of these skills. As Adams et al. note, "simple and uniformly designed Internet pages, more user-friendly online help and error message terminology, and increased provision of training for the older user would assist uptake" amongst older users (Adams, Stubbs, & Woods, 2005). In the proceeding sections, we identify a set of personas to help guide design decisions, review survey heuristics as they apply to our target demographic, and analyze and critique a popular existing application which replicates features to be considered for the NFWS hunter survey.
The participation rate in hunting increased as household income increased until it reached incomes of $100,000 or more. Participation was highest among those with incomes of $75,000 to $99,999, at 9 percent. The majority of hunters had household incomes of $50,000 or more. (U.S. Fish & Wildlife Service 2011 National Survey of Fishing, Hunting, and Wildlife-Associated Recreation)
PERSONAS
NAME: J.W. Brouebeck
AGE: 53
HOMETOWN: Seattle, Washington
OCCUPATION: Journalist
EDUCATION: Bachelor's degree
TECH SAVVY: Intermediate
J.W. Brouebeck has traveled the United States researching, hunting, photographing, and writing about duck hunting for the last 25 years. He has been published in Field & Stream, Outdoor Life, National Geographic, and other notable publications, and has written 5 books on duck hunting. He is familiar with the duck hunting rules of several states and has been selected for the US Fish and Wildlife survey 5 times consecutively. Because he is always traveling and has multiple addresses, he has completed the USFW forms only 2 of the 5 times they were sent; some were mailed to an address where he was no longer living. He would like an updated process where he is notified through email. J.W. would also like to be able to access the form on the web because he is often mobile while traveling.
NAME: Sarah Jesemy
AGE: 27
HOMETOWN: San Francisco Bay, California
OCCUPATION: Director of communications
EDUCATION: Master's degree
TECH SAVVY: Advanced
Sarah Jesemy is the director of communications at the software engineering firm Halbot Engineering. She spends her time away from the office hunting waterfowl during duck season. She learned to duck hunt as a young girl from her father, Rusty Horowitz, the founder of Halbot, who now sits as chair of the board of advisers. Sarah began running the media department at the company when her dad retired 2 years ago. She is excited about the opportunity to take the USFW survey online, which is the way she prefers to do most of her personal business. Sarah is an avid user of the Primos Hunting Calls and Ducks Unlimited Waterfowler's Journal apps.
NAME: Robert Frierson
AGE: 61
HOMETOWN: Venice, Louisiana
OCCUPATION: Captain of a shrimp boat
EDUCATION: High School
TECH SAVVY: Beginner
Robert is a Vietnam War veteran from Venice, Louisiana. He runs a family-owned shrimp fishery. He is one of 5 brothers who have hunted gators, deer, and duck together since their teenage years. He is opposed to taking the USFW survey online, preferring to have the forms mailed; he would much rather not have to do them at all. His wife Susan helped him fill out the last survey he was sent 4 years ago, but she has since passed away. Robert has used his time duck hunting to help him grieve. He is not sure who will help him now that Susan is gone.
NAME: John Sweeney
AGE: 45
HOMETOWN: Chesapeake Bay, Maryland
OCCUPATION: US Army
EDUCATION: High School
TECH SAVVY: Beginner-Intermediate
John Sweeney is a Sergeant in the Army who will retire next year. He has served in the Armed Forces since the age of 18 and missed hunting during several tours of duty overseas. He is happy to take the survey if it means he is helping the USFW. John prefers the mail-in survey but is open to becoming a better computer user in order to complete the online version. Once he retires he plans to join a computer literacy group for veterans that his service buddies told him about.
A PROBLEM OF FORM(S)
Since 1952 the Fish and Wildlife Service has relied on a mail-based survey to collect a vast amount of data related to the experiences of waterfowl hunters. In addition to being incredibly expensive, time consuming, and inefficient, this survey has been fraught with mistakes and inaccuracies stemming from poor design and usability issues.
The current form is divided into two sections. The top half has fields relating to when someone hunted, where they hunted, and what they hunted. The bottom half deals with the season's hunting totals, focusing on how many days someone hunted, how many of each species they harvested, and how many birds were downed but not recovered.

Replacing the physical survey with a web form has the potential to remove many of the problems present in the current form. With so much at stake, it is vitally important that we focus on making the design of the form as usable as possible.
As with many other aspects of design, when a survey is done well, it is not something people take notice of; when there are issues, they are all too obvious. When web forms are used for sign-ups, online quotes, and checkout pages, they are often the final step toward a conversion.

When it comes to checkout pages, even a slight improvement in the form can lead to a massive improvement in a business's profit. A simple change in the form's title or layout can lead to a conversion improvement of 5-10%. This effect can be applied elsewhere, particularly to survey design.
Poor form design has its consequences. According to the Baymard Institute, a web research company, 67.45% of online shopping carts are abandoned due to issues with the forms during the checkout process (Baymard Institute). Making the form as easy to use as possible must therefore be a priority in the design process. In order to make this project a success, we must design the form using web form design best practices.
As Smashing Magazine states: "The ISO 9241 standard defines website usability as the 'effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments.' When using a website, users have a particular goal. If designed well, the website will meet that goal and align it with the goals of the organization behind the website. Standing between the user's goal and the organization's goals is very often a form. Despite the advances in human-computer interaction, forms remain the predominant form of interaction for users on the Web. In fact, forms are often considered to be the last and most important stage of the journey to the completion of goals." (Mifsud, 2011)
With the main demographic ranging from 45 to 54 years of age, the design focus must center on usability issues for older individuals, ranging from health issues such as declining vision and motor skills to elder frustration. A Nielsen Norman Group study showed that 45% of older individuals were uncomfortable with or hesitant about exploring new things (Nielsen, 2013). Of particular note was the observation that seniors were more likely to get frustrated and give up altogether instead of searching for more options.
Additionally, people do not like answering surveys. They are intrusive, feel like a waste of time, and are annoying to complete; at least, that is how most surveys are perceived. One respondent to the Fish and Wildlife survey sent a letter saying, "This survey in my opinion is a waste of my time and money that should be used for the wildlife and water." This respondent did not feel that the survey was worth the money or time spent on it. But it is not impossible to create a form that users will not feel is a waste of time. In the book "Forms That Work," Caroline Jarrett and Gerry Gaffney establish three rules to influence response rate: establish trust, reduce social costs, and increase rewards.
People are more likely to respond if these conditions are met: if they perceive that the surveyor is trustworthy, that the costs of participation are not unduly high, and that they will be rewarded for doing so. By tapping into these inclinations, hunters' perception of the survey, and their subsequent completion of it, can be improved. In fact, a few respondents note this themselves: "If you would make an online hunting journal where we hunt and could access it year after year so we could see our own trends, we hunters would respond well. You would get more data."
The goal of any survey is to get information that is already inside the user's mind onto paper. The best surveys are not the long ones; they are the ones that ask the right questions, which help users answer in the best way possible.
When the wrong questions are asked, we get issues such as the case of one respondent who, when filling out the bottom half of the survey, put in 5 killed and retrieved ducks and 8 downed but lost. The survey reviewer noted that "It is not typical for birds killed to be greater than birds lost. Suspect misunderstanding (sic) what we want."
To avoid situations like this, the right questions must be asked. In order to make the questions easily understood, they need to be written in a way that uses concepts familiar to the hunters. As Jarrett writes, "Even something apparently straightforward can give the user a bit of a problem to decode" (Jarrett, Gaffney, 2009).
Writing the questions in an easy-to-understand way will both increase the accuracy of the collected information and avoid feedback like this from frustrated users: "I have no idea what this survey is about. I asume (sic) you are asking about ducks. Send me a survey that makes sence (sic) and I will happily make it out."
People focus best when concentrating on one question at a time, yet many surveys fall into the trap of asking multiple questions in one go. There are two problems with asking more than one question at a time. First, multiple questions can confuse the user about which question to answer, making it difficult for them to focus on one at a time. Second, the user may skip potentially important questions as they attempt to answer either the question they remember or the easiest question, and so may not provide the most important information.
To avoid this unfortunate consequence, ask one question at a time. In addition, multiple questions make it hard for respondents to give precise answers, which in turn makes those answers difficult to interpret or evaluate. It is therefore recommended that, rather than asking a series of compound questions, the survey be formed around a smaller number of complete questions addressing the overall scope of the survey's purpose.
Understanding the question is only the first part; users must then be able to properly answer it, otherwise a well-intentioned user will answer in the wrong way and invalidate the data. In one case, a hunter documenting the number of days he hunted wrote 10 in each column, which left the person recording the information confused as to whether he hunted for just 10 days in total or hunted each type of game on 10 different days.
According to Jarrett and Gaffney, there are four ways that people come up with answers. Slot-in answers consist of everyday information, in this case used to record the date and location. Gathered answers consist of information originating from somewhere the user can get to personally, such as a recorded count of waterfowl hunted. Third-party answers are those the user must seek from an outside source, for example by asking a friend who knows which days were hunted. Created answers are choices that the user could not reasonably have made in advance, i.e. thoughts and feedback on the survey (Jarrett, Gaffney, 2009).
Knowing what type of answers will be given helps ensure that the right answers are being recorded. On the current survey, many hunters have filled empty areas with 0's instead of simply leaving them blank. In one survey a hunter filled in a 2 for ducks and a 0 for sea ducks, and the surveyor commented, "Don't want these 0's, they imply days of unsuccessful hunting."
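One way a web version could remove this ambiguity is to model an unanswered field explicitly rather than coercing blanks to zero. The sketch below is illustrative only; the field names are assumptions rather than the NFWS's actual schema.

```typescript
// Illustrative sketch: keep "left blank" distinct from "explicitly zero" so a reviewer
// never has to guess whether a 0 means an unsuccessful day or simply no answer.
// Field names are hypothetical.

interface SpeciesDays {
  species: string;
  daysHunted: number | null; // null = question not answered, 0 = hunted zero days
}

// Turns the raw text of a count field into a value; blank (or unreadable) input
// stays null instead of silently becoming 0.
function parseDaysInput(raw: string): number | null {
  const trimmed = raw.trim();
  if (trimmed === "") return null;
  const value = Number(trimmed);
  return Number.isInteger(value) && value >= 0 ? value : null;
}

// Only the species the hunter actually answered for are included in the submission.
function answeredSpecies(rows: SpeciesDays[]): SpeciesDays[] {
  return rows.filter((row) => row.daysHunted !== null);
}
```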
One noticeable problem in the surveys was the frequency of contradictions between the top half of the form and the bottom half. Many of these issues stem from simple clerical errors made when transferring data from the top to the bottom. In one case a hunter reported hunting geese for 3 days at the top of the form but wrote only 2 days at the bottom. In another case a hunter wrote that he killed 40 ducks at the bottom, but the ducks recorded at the top only added up to 39.
These issues can possibly be alleviated through the use of a web form that makes it easy to transfer answers across the form. The important thing is to meet users' expectations and use the options that will make them feel most comfortable. By offering drop-down menus and radio buttons, we can make it easier to ensure that the correct information is properly filled out. As we have seen with the physical surveys, users are prone to making mistakes; the trick is to offer help when needed and feedback when there is an error.
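As a rough illustration of this principle, the following sketch (with a hypothetical species list and field names, since this paper does not define the survey's actual schema) constrains answers to fixed choices and whole numbers so that free-text mistakes never reach the data set.

```typescript
// Rough sketch of constrained inputs: species come from a fixed dropdown list and
// counts are validated as whole numbers, so misspellings and impossible values are
// caught at entry time. The species list and field names are illustrative only.

const SPECIES = ["Duck", "Sea Duck", "Goose", "Brant", "Coot"];

interface HarvestRow {
  species: string;          // chosen from SPECIES via a dropdown, never typed
  retrieved: number;
  downedNotRetrieved: number;
}

// Returns human-readable problems instead of silently accepting bad data, so the
// form can give feedback at the moment the mistake is made.
function validateRow(row: HarvestRow): string[] {
  const problems: string[] = [];
  if (!SPECIES.includes(row.species)) {
    problems.push(`"${row.species}" is not one of the listed species.`);
  }
  const counts: Array<[string, number]> = [
    ["Birds retrieved", row.retrieved],
    ["Birds downed but not retrieved", row.downedNotRetrieved],
  ];
  for (const [label, value] of counts) {
    if (!Number.isInteger(value) || value < 0) {
      problems.push(`${label} must be a whole number of 0 or more.`);
    }
  }
  return problems;
}
```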
The ideal form should never have to be explained for users to understand it. If it does not look like a form, or it is too complicated to fill out, then redesigning it is the only option. As form expert Luke Wroblewski writes, "if excessive instructions are required to explain how to complete your form, then chances are the questions you are asking are either phrased poorly, too complex, or just plain unnecessary" (Wroblewski, 2008).
Eye tracking data has also shown that users often skip the help text and instructions and dive right into the form, so designing it simply is a must. One suggestion for getting around this is: "Rather than include help text next to each input field, show it only where required. You could show an icon next to an input field that the user can click on when they need help for that field. Even better, show help dynamically when the user clicks into an input field to enter data" (Mifsud, 2011).
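A minimal sketch of that last suggestion, assuming hypothetical element ids for an input field and its help text, might look like this:

```typescript
// Minimal sketch of "show help dynamically on focus". Element ids ("days-hunted",
// "days-hunted-help") are hypothetical placeholders.

function attachContextualHelp(inputId: string, helpId: string): void {
  const input = document.getElementById(inputId);
  const help = document.getElementById(helpId);
  if (!input || !help) return;

  help.hidden = true; // keep the page uncluttered until the field is active
  input.addEventListener("focus", () => { help.hidden = false; });
  input.addEventListener("blur", () => { help.hidden = true; });
}

// Usage: show "Count only days you actually hunted waterfowl" only while the
// "days hunted" field has focus.
attachContextualHelp("days-hunted", "days-hunted-help");
```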
Aside from help text, a good form needs error validation. With proper error validation, the problems of contradictory information, incorrect information, and mere clerical errors would largely vanish. Smashing Magazine describes the role of the error message: "This notifies the user that an error has occurred, and it usually prevents them from proceeding further in the form. Emphasize error messages through color (typically red), familiar iconography (such as a warning sign), prominence (typically at the top of the form or beside where the error occurred), large font, or a combination of these" (Mifsud, 2011).
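The sketch below illustrates one way such an error message could be surfaced, following the color, iconography, and placement guidance quoted above; the element ids and message text are assumptions for illustration.

```typescript
// Hedged sketch of surfacing a validation error beside the field that caused it,
// with color and a familiar warning icon. Element ids and wording are illustrative.

function showFieldError(fieldId: string, message: string): void {
  const field = document.getElementById(fieldId) as HTMLInputElement | null;
  if (!field) return;

  // Remove any previous error for this field before adding a new one.
  document.getElementById(`${fieldId}-error`)?.remove();

  const error = document.createElement("p");
  error.id = `${fieldId}-error`;
  error.textContent = `\u26A0 ${message}`;      // warning sign for familiar iconography
  error.style.color = "#b00020";                // red, the conventional error color
  error.style.fontWeight = "bold";
  field.insertAdjacentElement("afterend", error);
  field.setAttribute("aria-invalid", "true");   // also announce the error to assistive tech
}

showFieldError("ducks-retrieved", "Please enter a whole number, such as 4.");
```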
Many seniors have vision issues such as glaucoma, macular degeneration, and cataracts. Issues such as these can make it more difficult to read computer screen fonts, and therefore to read the form. The National Institute on Aging recommends that apps and websites "[m]ake type size at least 12 point, 13 point, or 14 point" (NIH, NIA, 2007). By designing the form to meet these requirements, it is hoped that the overall completion rate can be improved, and that the resulting design will be easier, more accurate, and more cost effective.
COMPETITIVE ANALYSIS
The scope and features of the NFWS project share a number of similarities with a recent app created by the organization Ducks Unlimited. Ducks Unlimited was founded in 1937 with a mission to help conserve waterfowl and wetlands. Theirs is a well-trafficked website, with SimilarWeb.com recording 140k users a month, Compete.com recording 112,578 unique visitors, and Alexa.com ranking them #19,458 in the US.

The organization is funded through a combination of grassroots fundraising events (member/sponsor banquets, shooting/fishing tournaments, golf outings) and advertising revenue obtained through their website and app.

Ducks Unlimited created a mobile waterfowl hunting event logging app in 2011, which has received over 50,000 downloads. There are two versions of the app: a free version requiring only signup, and a premium version for dues-paying members. This analysis will focus on the journal functionality of the free app.
The journal function of the app offers the opportunity to record in great detail the myriad factors of any specific hunting day, factors which users may not have paid attention to or sought to record. For example, the first section begins with basic location and date info, then proceeds to ask for the temperature, weather, wind speed, and wind direction. The actual act of logging birds harvested, however, is an exercise in frustration. In particular, it would seem the designers were so focused on allowing users to identify species by selecting a picture that they neglected to include any affordances to guide users on how to do so.
The screen begins by prompting users with a large "skip adding birds & save entry" button that takes up nearly a third of the screen real estate. Below it is a small table directing users to select a species via a button labeled "select," which opens a popup of species to choose from. Despite being arguably the most important element of the page, and of the application in general, this button shares no similarities with any other button within the app. There is no design, no label, and worst of all, it is smaller than every other button. These features fail nearly every design heuristic identified for the hunter population.
If the correct course of action can be identified, users then choose their bird from a drop-down containing small pictures of each species, with the obvious intention of making it easy for users to pick out their species. This drop-down divides species into four types and orders them alphabetically. Once a species is selected, users are given the opportunity to record the number of birds harvested, with the species subdivided into four categories: Drake Banded, Hen Banded, Drake Unbanded, and Hen Unbanded. This presents a number of problems, many of which violate heuristics for survey design amongst the target population. Firstly, these choices are not chosen dynamically based on species, and are instead standardized across all species of waterfowl. Secondly, the form contains barely enough space for users to select the field to add the number of birds harvested. Lastly, there are no labels designating the fields as requiring a numbered entry, nor are they designed to look like tappable buttons. Collectively, these features demonstrate a distinct lack of affordances to guide the application's use.
The Ducks Unlimited application also makes it confusing to add multiple species of birds. The main issue is the button titles: "Save Bird" pushes the user to the next screen, rather than merely saving that particular bird. Once a user "saves" a bird (humor mine), they are moved to an "edit entry" screen and are obligated to select the bird icon in order to go back and add a second bird, a decidedly unintuitive process.

Finally, users can save their bird and officially log their day. Users are able to go back and make edits to their entry, and the app sorts each entry by the most recent day and season.
Despite myriad design problems, the overall response to the app is positive, with an average rating of 4.3 out of 5 on Google Play and 4 stars on the Apple App Store. Most of the reviews center on other features, with few users making any mention of the journal. Conceptually, the journal seems like a good idea, with excellent features such as identifying birds by picture and the ability to save entries at every step. In execution, however, the application is decidedly difficult and unintuitive. With a design that focuses on breadth of features rather than ease of use, the Ducks Unlimited journal application serves as a definitive guide to needlessly complicating a user experience, a guide which will inform our own decisions on what NOT to do in the redesign of the NFWS hunter survey.
A SURVEY FOR HUNTERS
In researching the best methods for redesigning the National Harvest Survey of Sports Hunters, we have identified a number of axes along which improvements can be made. Essentially, we propose a three-pronged approach: identify and define the target population, clean up the language, and streamline the delivery. By accounting for the population that constitutes the main body of sports hunters, and designing around their needs, tendencies, and beliefs, it is hoped that the conversion rate on survey completion can be improved. By altering the language of the survey to increase clarity and reduce confusion, it is hoped that the number of completed but unusable surveys can be decreased. Lastly, by redesigning the survey as an online tool, the considerable postal costs can be eliminated, freeing budget for other endeavors. Taken together, these tactics could greatly improve the cost, accuracy, and ease of the National Harvest Survey of Sports Hunters.
As previously noted, the greater part of the sports hunter population lies between the ages of 45 and 64 and consists largely of white males residing in rural areas. It is therefore imperative that the survey redesign be optimized for users within this demographic. Additionally, as this large group continues to age, and with a relative lack of younger hunters replacing them (Responsive Management, The National Shooting Sports Foundation, 2008), the survey should be designed with the needs of a primarily older audience in mind.
Additionally, as becomes obvious from reading the notes submitted by respondents, a concerted effort should be made to impress upon the hunter population the importance of the survey for the continued health of the waterfowl population, and by extension the continuation of the pastime of sport shooting. While many of the hunters seem confused by, and/or hostile toward, the perceived invasion of privacy represented by the survey, i.e. the social cost (Jarrett, Gaffney, 2009), they also seem generally invested in the preservation of the environments in which they hunt. It could therefore prove fruitful to provide some explanation of the survey's purpose and efficacy in relation to the greater goal of preserving and maintaining the diversity of waterfowl. By showing an alignment between the NFWS's goals and those of the hunters themselves, it is believed that those surveyed would be more likely to successfully complete the questionnaire.
For the survey itself, there are a number of industry best practices that could be applied to improve its clarity and usability. In its current form, the survey confuses many recipients as to the terminology and labeling of its various sections. In particular, the area questioning total days hunted seems to commonly produce errors. The survey could benefit from a clarification and streamlining of language to help reduce usage errors on the part of participants. While it would take a considerable amount of user testing to arrive at a survey whose language is ideal for the target demographic, a general application of consistent and plain language could certainly serve to reduce submission errors.

Additionally, reformatting the questions to center on the data that is most important to the NFWS could help increase the survey's ease of use. By placing emphasis on the most important questions, and having them answered first, it is hoped that the number of completed but unusable surveys can be reduced. Additional consultation would be necessary to determine which elements of the survey are most important, but orienting the survey to focus on the most salient questions is understood to be of great importance to successful surveys (Jarrett, Gaffney, 2009).
Lastly, the move away from paper presents a number of advantages that could help with all aspects of the survey process. By recreating the survey as an e-tool, the NFWS can reduce both the cost and the hassle associated with physically mailing and receiving the surveys. This offers a number of benefits: not only will money be saved for the NFWS, but survey recipients will no longer have to contend with the inconvenience of mailing in their responses. By reducing the required commitment on the recipients' end, it is hoped that overall conversion can be increased. While not on the same level as younger generations, older adults have seen a remarkable uptick in web tool prowess, and will only become savvier as time goes on (Xie, 2003). This paradigm should be embraced, as the benefits of moving away from paper seem to greatly outweigh the costs.
An online tool also allows for certain opportunities that would be impossible with paper. An application's capacity to tailor the survey to the individual, as well as to identify and help correct data entry errors prior to submission, could greatly decrease the number of unusable surveys submitted. By having the application do the sums of days hunted, the arithmetic errors frequently present in the paper forms could be all but eliminated. Additionally, the ability to provide cues and warnings when a user's data lies far outside the norms, e.g. downed and lost ducks greatly outnumbering retrieved ducks, will help increase confidence that the data users submit is indeed what they intended to submit, and not noise produced by a lack of understanding.
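The sketch below illustrates, under assumed field names (this paper does not define a data schema), how an application could both compute the season totals automatically and raise a gentle confirmation cue when an entry looks far out of the ordinary.

```typescript
// Sketch of the two ideas described above: the application does the arithmetic,
// and flags out-of-the-ordinary data before submission. Field names are assumptions.

interface DailyEntry {
  date: string;            // e.g. "2015-11-12"
  retrieved: number;
  downedNotRetrieved: number;
}

// The season totals are derived, so they can never contradict the daily entries.
function seasonTotals(days: DailyEntry[]) {
  return {
    daysHunted: days.length,
    retrieved: days.reduce((sum, d) => sum + d.retrieved, 0),
    downedNotRetrieved: days.reduce((sum, d) => sum + d.downedNotRetrieved, 0),
  };
}

// Returns gentle confirmation prompts rather than hard errors, since unusual
// seasons do legitimately happen.
function softWarnings(days: DailyEntry[]): string[] {
  const totals = seasonTotals(days);
  const warnings: string[] = [];
  if (totals.downedNotRetrieved > totals.retrieved) {
    warnings.push(
      "You reported losing more birds than you retrieved. Is that correct?"
    );
  }
  return warnings;
}
```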
The importance of the National Harvest Survey of Sports Hunters to both the NFWS and the hunters themselves cannot be overstated. It is an invaluable tool for the continued viability of the migratory bird population, as well as for the hobbyists who hunt them. Redesigning the survey to express this symbiosis, to show the relationship between survey and sport, and to facilitate its completion by the target demographic is an end that should serve all parties well. The plans laid out in this paper, taken together, can serve to guide the design of a greatly improved survey. Now it is merely a matter of testing and application, revision and iteration, to arrive at a design that fulfills everyone's needs.
OUR DESIGN
THEORY
Our design is based on resolving the core problems identified by the Department of Fish and Wildlife with their mailed-out surveys. Based on the information provided by the National Fish and Wildlife Service's representatives, and our own observations from looking at previous surveys, we were able to condense these problems into three main issues:
• Low number of survey respondents
• High percentage of mistakes in respondents' returned surveys
• High cost of mailing and sorting through the surveys
Our solutions are intended to address all three of these problems. We have designed a system focused on a singular goal: getting users to complete the survey while avoiding mistakes. While we toyed with the idea of creating an app similar to Ducks Unlimited's, we decided against such a solution for the following reasons:
1) Relying on an app would reduce the already small user pool, forcing the DFW to choose only people who had already downloaded the app.
2) An app would create an extra barrier to entry, further deterring a group of people who may already be reluctant to complete the survey.
3) The app would take data already recorded by the user in the context of a social network. We worried that this social context could increase the chances of kill exaggeration, leading to faulty data.
4) Our persona research showed that a large majority of the target demographic are older users who may not be as adept at using computers and technology.
After weighing these considerations, we decided on a multi-page survey that would guide even the most unskilled of users through to completion, a strategy whose efficacy was confirmed in user testing.
PROTOTYPING & TESTING
Paper Prototype
Our survey would begin with an email sent to selected users, informing them of their selection to participate in the annual Fish and Wildlife survey. After clicking the link in the email, they would be taken to the first page of the survey. This page was designed to immediately welcome users and thank them for agreeing to take part. This approach was chosen to give users an initial burst of encouragement, express to them the importance of the survey, and show how their contribution will help the sport they love.

Users would log in with their hunter ID number, which they would submit to start the survey. They would then choose how to report their harvest:
1)	By day
2)	By season
This was based on the paper survey's two halves, with the top half covering day-by-day reporting and the bottom half covering the season. The idea was that many respondents do not know their day-by-day figures, but they do know their seasonal totals. We also put in question pop-ups to help users complete the report-by-day section, which would take them through the survey and then auto-populate the seasonal survey. As one of the biggest issues with the paper survey was inconsistent data transfer from the top section to the bottom, we hoped this solution would prevent those errors from ruining otherwise perfectly good data.
If users choose to report by day, they would start the multi-page survey. The survey was designed around Steve Krug's principle that if what the user has to do is mindless, it does not matter how long it takes. We decided to give each field of the survey its own page. By doing so we would make the survey mindlessly easy for anyone to complete.
This section of the survey was divided into 5 parts:
1)	 Date
2)	 Location
3)	 What did you hunt?
4)	 How many did you retrieve?
5)	 How many did you hit but not retrieve?
The first page users would encounter was the date page. This page included both a manual box for users to type the date and a calendar popup. Older users have been known to dislike popups, so part of our design decision was to allow the option of avoiding them entirely.
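A minimal sketch of that choice, assuming a plain text field and an MM/DD/YYYY format, validates a typed date in place so the calendar popup is never required.

```typescript
// Minimal sketch: accept a manually typed date (e.g. "11/12/2015") and validate it,
// so users who dislike calendar popups never have to open one.
// The expected MM/DD/YYYY format is an assumption for illustration.

function parseTypedDate(raw: string): Date | null {
  const match = raw.trim().match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})$/);
  if (!match) return null;

  const [, month, day, year] = match.map(Number);
  const date = new Date(year, month - 1, day);

  // Reject impossible dates such as 02/30/2015, which JavaScript would silently roll over.
  const valid =
    date.getFullYear() === year &&
    date.getMonth() === month - 1 &&
    date.getDate() === day;
  return valid ? date : null;
}
```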
The second page asked users what specific state and county they hunted in. We found on the paper surveys that users occasionally misspelled their locations, which could lead to confusion. Our solution was to allow those locations to be auto-filled.
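One rough way to implement that auto-fill is sketched below against a tiny hypothetical county list; a real implementation would presumably draw on the official state and county lists.

```typescript
// Rough sketch of location auto-fill: the hunter types a few letters and picks a
// county from a fixed list, so misspellings never enter the data set.
// The county list here is a tiny hypothetical sample.

const COUNTIES: Record<string, string[]> = {
  Maryland: ["Anne Arundel", "Baltimore", "Dorchester", "Queen Anne's"],
  Louisiana: ["Plaquemines", "St. Bernard", "Terrebonne"],
};

function suggestCounties(state: string, typed: string): string[] {
  const query = typed.trim().toLowerCase();
  return (COUNTIES[state] ?? []).filter((county) =>
    county.toLowerCase().startsWith(query)
  );
}

// Example: suggestCounties("Louisiana", "pl") -> ["Plaquemines"]
```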
After getting the date and location, we needed to find out what exactly they had hunted, and how many of each they had retrieved and hit but not retrieved. We did not want to put it all on one screen, as we felt that would confuse the user, so we instead created separate pages for each question. On this page users could choose ducks, geese, or brant, and for each animal they chose they would be asked "how many did you kill" and "how many did you hit but not retrieve." When the user was done adding days they would be sent to the seasonal total page. That page was auto-filled with all the previous data they had entered and was there for users to confirm their totals.
If a total was incorrect, they could edit it directly on the page, and the response would then be submitted as a seasonal total only, rather than as daily totals. This page also doubled as the entry page for seasonal-only reporting. As such, we made sure to include small explanation options next to days, killed, and downed in case the language was confusing.
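A hedged sketch of that logic, with assumed field names: the season page is pre-filled from the daily entries, and if the hunter overrides a derived total, the submission is recorded as season-level only, so contradictory day/season data is never stored.

```typescript
// Sketch of the seasonal-total page behaviour described above (field names assumed).

interface SeasonTotals {
  daysHunted: number;
  retrieved: number;
  downedNotRetrieved: number;
}

interface SeasonSubmission extends SeasonTotals {
  reportedBy: "day" | "season";
}

function buildSubmission(
  derived: SeasonTotals,        // computed automatically from the daily entries
  edited: SeasonTotals | null   // null if the hunter accepted the pre-filled totals
): SeasonSubmission {
  if (edited === null) {
    // No edits: the daily records and the season totals agree by construction.
    return { reportedBy: "day", ...derived };
  }
  // The hunter corrected the totals: keep their numbers, but mark the response as
  // season-only rather than pretending the daily data still adds up.
  return { reportedBy: "season", ...edited };
}
```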
After completing the seasonal totals the users would be taken to the thank you page. This
page was just to reinforce how important the survey was and how valuable the input of the
user was to furthering the mission of the DFW.
Problems
No prototype is without its issues and ours was no exception. When testing, the
problems we found were:
• There were no indicators of time on the survey. Users had no idea at the outset how long the survey would last, and there were no indicators of progress as they went along.
• The survey was too long. There were too many pages, and with all the doubling back in the case of multiple days, it took too long, even if it was mindlessly easy.
• We forgot a back button, which is critical should someone realize they had made a mistake.
Despite these issues, we still believed in our design principle: if we could design a survey that was easy enough to complete, we could boost the completion rate. With those three issues in mind, we made some adjustments to the design.
In the second paper prototype we made minor changes to the functionality. Instead of one continuous survey, it would function more like multiple mini-surveys, one per day. We accomplished this by adding a "home" page that users would return to after completing each day. The page was divided into two sections: the top for starting the "add a day" feature, and the bottom for displaying the days already completed. In addition, users could edit the completed days.
We also realized that one important issue was people making mistakes, so we added an edit feature that would allow users to edit previously completed pages. The add-a-day process was streamlined into 4 pages:
1)	Date
2)	Location
3)	What did you hunt?
4)	How many did you retrieve / not retrieve?
Figure 1: Prototype v2, home screen
Figure 2: Prototype v2, edit screen
The date and location screens remained mostly unchanged, aside from the addition of a progress meter and back buttons. The "What did you hunt?" page also remained mostly unchanged; the only major change was using checkboxes instead of 'pushdown' buttons. The major change was to the retrieved/not retrieved page: we combined the retrieved and not retrieved pages into one, displaying all the animals checked on the previous page. By doing this we were able to streamline the process while maintaining its simplicity.
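A small sketch of how that combined page could be driven by the previous page's checkboxes (species names and field shapes are assumptions):

```typescript
// Small sketch: the combined retrieved / not-retrieved page shows one row per species
// checked on the previous page, rather than a fixed list of every possible species.
// Species names and field shapes are illustrative assumptions.

interface CountRow {
  species: string;
  retrieved: number | null;           // null until the hunter answers
  downedNotRetrieved: number | null;
}

function buildCountRows(checkedSpecies: string[]): CountRow[] {
  return checkedSpecies.map((species) => ({
    species,
    retrieved: null,
    downedNotRetrieved: null,
  }));
}

// Example: checking only "Ducks" and "Geese" yields two rows, so the hunter is never
// shown count fields for game they did not hunt.
const rows = buildCountRows(["Ducks", "Geese"]);
```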
Paper Prototype Testing
On Thursday, November 12, we, along with another group from our class, traveled to a Bass Pro Shop to test the paper prototypes. We brought v2 of our paper prototype, updated based on takeaways from the in-class peer reviews. Bass Pro was chosen because, as one of the nation's largest retailers of hunting supplies and accessories, there was a high likelihood that people in our target demographic would be present.
After receiving approval from the manager, we were lent a folding table and allowed to set up near the section of the store devoted to waterfowl hunting. From there, we effectively cold-called patrons of the store, asking them whether they a) had hunted waterfowl in the past and b) had any interest in participating in our user research. Perfecting the approach took some figuring out. Hunters seem, generally, to be a reticent bunch, and it was revealing how quickly many of them closed off at the mention of a "survey." It proved much more efficacious to approach from the angle of anonymous prototype testing. A script developed, by which we first explained who we were, who we were working with, the purpose of the survey itself, and finally our purpose in testing it. We found that by expressing our specific desire for negative feedback, participants were far more willing to take part in the testing.
For the testing itself, we had participants seat themselves at the folding table and placed the paper prototype in front of them. We explained that the prototype should be treated as a series of screens, and that one of us would function as the computer, performing the user's intended actions. This led to a number of funny interactions in which a participant, amused at the "make believe," emphatically typed on an invisible keyboard to fill out the imaginary data fields. The other team member acted as facilitator/recorder, though the simplicity of the design seemed to minimize the amount of facilitation necessary.
Our participants ranged in age from their mid 20s to mid 60s, providing a nice cross-section of our anticipated user groups. Interestingly, their insights and recommendations did not fall along the lines of our predictions: the youngest participant was the one most concerned about potential privacy issues and about the need to continue the paper survey for those unwilling or unable to participate electronically. However, there were a number of cohesive takeaways mentioned in some way or another by our participants that greatly informed subsequent design decisions.
Primary amongst these insights was the fact that many hunters do not keep an exact record of their kill count, and even fewer maintain statistics for downed birds; accommodations for this would be implemented in subsequent prototype revisions. Many of the participants expressed concern over how the information would be used, and suggested that many hunters might enter less-than-accurate information out of fear of legal repercussions. Ameliorating this concern, largely through branding and language, was also earmarked for implementation.
Overall reactions to the survey were positive, with all participants noting how straightforward the prototype seemed. Aside from suggesting marginal improvements for clarity and ease of use, and expressing that most of the form elements should be subdivided into individual tasks, the participants found that our prototype did in fact achieve our design goals of simplicity and accuracy, and those familiar with the paper survey said that this version would be much, much easier. We felt that our design was headed in a very good direction, and the information gathered in testing mostly suggested changes in language and labelling to further clarify the interactions, as well as to increase the amount of feedback provided to the hunters. These changes would be implemented in our electronic prototype.
Electronic Prototype
We felt that our testing of the second version of the paper prototype went well, and we decided on a faithful transition from paper to electronic. Our main goal was to give users a better idea of the system and see how they would navigate through it in various scenarios. We used the Axure prototyping software to build our prototype.
On the welcome screen we decided to change things slightly. We felt that saying "thank you" was a bit confusing to users, as it sounds more like something said after completing a task rather than before it. We also added the approximate time the survey would take to complete, and an explanation of what it would accomplish, giving users a greater sense of purpose and drive to complete it.
Part of our feedback from the paper prototype implied that users did not quite know what survey they were completing, so we added a large DFW logo front and center. This would also serve as a trust signal, so users would feel more comfortable submitting their data. Finally, our testing showed that not everyone knew exactly where to find their member ID, so we added a link showing users where to find it.
In the electronic prototype we removed the rules page. We found it to be a bit superfluous; the survey would be much more usable if we showed the rules in a more contextual manner, such as when users were completing the respective step, rather than all at once.
The home screen remained mostly the same. We added the DFW logo to the top for trust, and moved the report-by-day and report-by-season options onto the "home" page. In our testing we found that many people gravitated to the "report by season" option even when they had their daily data. Obviously we would much rather have them complete the daily version, so we shrank the "report by season" button to make it more of an afterthought, for use only when the daily data is not available.
For the "how many were hunted" screen we realized we could also solve a few problems noted in the paper survey testing. One, noted by participants, was that it is unlikely that hunters would lose more birds than they retrieved. We could address this by showing a notification, in cases where users enter more 'lost' than 'retrieved', asking them whether they are sure. We could also address the issue of people entering numbers over the hunting limit, by capping the total reported kill at whatever that area's limit was for that year.
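The over-limit safeguard could look roughly like the sketch below; the lost-versus-retrieved confirmation would follow the same warning pattern sketched earlier in the paper. The area limit is treated as a hypothetical configuration value, since actual limits vary by area, species, and year.

```typescript
// Rough sketch of capping the reported kill at the area's limit for that year.
// The limit is a hypothetical configuration value supplied per area and season.

interface AreaLimitResult {
  value: number;   // the number that will be stored
  capped: boolean; // true if the hunter's entry exceeded the limit
}

function applyAreaLimit(totalKilled: number, areaLimit: number): AreaLimitResult {
  const capped = totalKilled > areaLimit;
  return { value: capped ? areaLimit : totalKilled, capped };
}

// The form can use `capped` to explain the adjustment to the hunter instead of
// silently changing their answer.
const result = applyAreaLimit(9, 6); // e.g. a hypothetical limit of 6 birds
```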
The home screen would then be populated with the day that was hunted, and the user could edit it should they need to. On the edit page users could change anything they had entered: the day, the location, and what they killed, directly from that page.
Electronic Prototype Testing
On Monday, December 7th, we returned to the Bass Pro Shop to undertake a second round of testing using our electronic prototype, built in Axure. The testing format was the same as on our previous visit, with a folding table set up in the store section containing waterfowl hunting supplies and accessories. This time, however, we were armed with two laptops loaded with the electronic prototype. Our method of participant acquisition remained unchanged: asking patrons if they participated in waterfowl hunting, giving a brief background on who we were and the scope of the project, and lastly asking if they would like to assist us. Once again, the participants spanned a wide demographic range, and in this session we even had the privilege of testing with a female duck hunter, who also happened to work for NASA.
The testing procedure was similar to that used for the paper prototypes, albeit without the need for a team member to act as the computer. While the prototype was not fully functional (we declined to build in the mathematical logic), it was a reasonable enough facsimile that participants had no trouble inferring the workflow. We were happy to see that users could intuit the workflow, and most of their suggestions focused on labeling and terminology rather than the mechanics.
Figure 3: Electronic prototype, welcome screen
Figures 4-6: Electronic prototype pages
User Testing at Bass Pro Shop
One interesting point of disagreement amongst the participants was the overall level of technological sophistication of the hunter population. One participant in the electronic prototype testing expressed a desire for the app to be designed mobile first since, in his opinion, everyone can use a phone. This was in direct contrast to an earlier participant from the paper prototype session, who expressed great concern that the survey be designed with those without access to a smartphone in mind. This reinforced for us the need for a device-agnostic approach that could be easily ported across a wide variety of devices.
Another interesting insight provided by a participant, and one that we had overlooked, was the need to label the hunting season. Specifically, the season often spans the new year, so a label to the effect of "2015/2016 season" would be needed for clarity. This particular participant, who had filled out the NFWS survey in a number of previous years, applauded the simplicity of our design. Remarking on the paper survey, he humorously commented, "I was glad I used a pencil." He saw our design decisions as an improvement in both ease and clarity, and most of his suggestions related to our choices in terminology.
Overall, the final testing session confirmed that we had fashioned a design that accomplished our goals of simplicity and accuracy. There was an expression of satisfaction, and near excitement, for a design that eliminated much of the hassle and drudgery of the existing survey. Design suggestions focused almost entirely on word choice and labelling, changes we were happy to implement in our final prototype version and which will be demonstrated in our final design recommendations.
Final Takeaways From Testing
After testing we made some minor changes to the home screen of the design, as well as some changes to the overall language we used. One issue we found with the home screen was that it did not tell people what it was, i.e. it lacked affordances; should a user walk away during the survey and come back later, they might not remember what the survey was.
In the end, while our design underwent some significant changes in aesthetics and function, our goals remained the same. We set out to create a survey that would solve the biggest problems experienced by the DFW. They needed to boost their completion rate, so we created a survey with a single-minded focus on helping users go through the steps while avoiding any possible distractions or difficulties. They needed to reduce errors, so we designed a system that checks for the most common errors, from small things such as spelling mistakes and inconsistent numbers to larger things such as the possibility that the user entered an incorrect number of harvested animals.
They needed a solution that would help them cut costs. Our solution is low cost: it does not require a large team, nor does it require constant maintenance. It is also a simple system that could realistically be in place within the next few months.
All in all, we believe our design accomplishes the goals set forward by the NFWS at the lowest possible cost and with the highest possible efficiency. It is our sincere belief that implementing our design would allow the staff at the DFW to focus on doing what they do best: preserving the ecosystems.
BIBLIOGRAPHY
Adams, N., Stubbs, D., & Woods, V. (2005). Psychological barriers to Internet usage among older adults in the UK. Informatics for Health and Social Care, 30(1), 3–17.
American Sportsfishing Association, Responsive Management, The Oregon Department of Fish and Wildlife, & Southwick Associates. (2013). Exploring Recent Increases in Hunting and Fishing Participation. Retrieved from http://www.responsivemanagement.com/download/reports/Hunt_Fish_Increase_Report.pdf
Bargas-Avila, J. A., Orsini, S., Piosczyk, H., Urwyler, D., & Opwis, K. (2011). Enhancing online forms: Use format specifications for fields with format restrictions to help respondents. Interacting with Computers, 23(1), 33–39.
Baymard Institute. (n.d.). 31 Cart Abandonment Rate Statistics. Retrieved October 12, 2015, from http://baymard.com/lists/cart-abandonment-rate
Bickman, L., & Rog, D. J. (Eds.). (2009). The SAGE Handbook of Applied Social Research Methods (2nd ed.). Thousand Oaks, CA: SAGE Publications.
Brown, Curt. (2014, January 26). Westport duck hunter recalls his will to survive harrowing accident in river. SouthCoast Today. Retrieved October 5, 2015, from http://www.southcoasttoday.com/article/20140126/NEWS/401260320
Calak, P. (2013). Smartphone Evaluation Heuristics for Older Adults. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/5610
Convention On Nature Protection And Wild Life Preservation In The Western Hemisphere. (1940, October 12). Retrieved October 11, 2015, from http://www.oas.org/juridico/english/treaties/c-8.html
Dickinson, K. (1997). Distance learning on the internet: Testing students using web forms and the computer gateway interface. TechTrends, 42(2), 43–46.
Goodman, J., Syme, A., & Eisma, R. (2003). Older Adults' Use of Computers: A Survey. Presented at the Annual HCI International 2003, Crete, Greece. Retrieved from http://www-edc.eng.cam.ac.uk/~jag76/research/2003_bcs_hci/paper.pdf
Guyer, Daniel. (2014). Iron Duck Hunting. Retrieved October 5, 2015, from http://ironduckhunting.com/daniel/
Hedegaard, Erik. (2013, October). Redneck, Inc: The Duck Dynasty Story. Men's Journal. Retrieved from http://www.mensjournal.com/magazine/redneck-inc-the-duck-dynasty-story-20131005
Jarrett, C., & Gaffney, G. (2009). Forms that Work: Designing Web Forms for Usability. Amsterdam: Morgan Kaufmann.
Kitzmann, A. (2003). That Different Place: Documenting the Self Within Online Environments. Biography, 26(1), 48–65.
Lavery, D., Cockton, G., & Atkinson, M. P. (1996). Heuristic evaluation. Usability evaluation materials (Tech. Rep. TR-1996-15). Glasgow, Scotland: University of Glasgow.
Lim, M. S. C., Sacks-Davis, R., Aitken, C. K., Hocking, J. S., & Hellard, M. E. (2010). Randomised controlled trial of paper, online and SMS diaries for collecting sexual behaviour information from young people. Journal of Epidemiology and Community Health, 64(10), 885–889.
Mifsud, Justin. (2011). An Extensive Guide to Web Form Usability. Smashing Magazine. Retrieved from http://www.smashingmagazine.com/2011/11/extensive-guide-web-form-usability/
Morrell, R. W., Mayhorn, C. B., & Bennett, J. (2000). A Survey of World Wide Web Use in Middle-Aged and Older Adults. Human Factors: The Journal of the Human Factors and Ergonomics Society, 42(2), 175–182.
National Institutes of Health, National Institute on Aging. (2007). Making Your Printed Health Materials Senior Friendly. Retrieved from https://www.nia.nih.gov/health/publication/making-your-printed-health-materials-senior-friendly
National Shooting Sports Foundation. (2009). A Portrait of Hunters and Hunting License Trends: National Report. Retrieved from https://www.nssf.org/PDF/HuntingLicTrends-NatlRpt.pdf
Nielsen, Jakob. (2013, May 28). Seniors as Web Users. Nielsen Norman Group. Retrieved from http://www.nngroup.com/articles/usability-for-senior-citizens/
Pauwels, S. L., Hübscher, C., Leuthold, S., Bargas-Avila, J. A., & Opwis, K. (2009). Error prevention in online forms: Use color instead of asterisks to mark required fields. Interacting with Computers, 21(4), 257–262.
Responsive Management & The National Shooting Sports Foundation. (2008). The Future of Hunting and the Shooting Sports. Retrieved from http://www.dnr.state.il.us/nrab/children/future_hunting.pdf
Ruark, R. (1993). The Old Man and the Boy (Reprint ed.). New York: Holt Paperbacks.
Silverman, Emily, & Wilkins, Khristi. (2015). Web portal for survey data entry: Project information [PDF]. Retrieved from https://ubonline.ubalt.edu/portal/site/1154IDIA612WB1/page/280ca13c-9b25-4b19-a65f-ef8e1c815763
Staff, C. P. I. (1997). Duck Hunting: Guide-Tested Techniques for Taking All of the Important North American Duck Species. Quayside.
Underwood, L. (2004). The Duck Hunter's Book: Classic Waterfowl Stories. Globe Pequot.
U.S. Census Bureau. (2011). National Survey of Fishing, Hunting, & Wildlife-Associated Recreation (FHWAR). Retrieved October 5, 2015, from https://www.census.gov/prod/www/fishing.html
U.S. Fish & Wildlife Service. (2012). Your Guide to Hunting on National Wildlife Refuges. Retrieved from http://www.fws.gov/refuges/hunting/pdf/huntingGuide.pdf
U.S. Fish and Wildlife Service, National Conservation Training Center. (2014). USFWS History: A Timeline for Fish and Wildlife Conservation. Retrieved October 11, 2015, from http://training.fws.gov/history/USFWS-history.html
Widener, Nick. (2013, July 14). New Site Targets Duck Hunters. Online Athens. Retrieved from http://onlineathens.com/sports/outdoors/2013-07-14/new-site-targets-duck-hunters
Williams, Kimberly J. (2010). The Call of The Wild: The Serious Business of Duck Hunting in Eastern Arkansas. Union Sportsmen's Alliance. Retrieved from http://unionsportsmen.org/the-call-of-the-wild-the-serious-business-of-duck-hunting-in-eastern-arkansas/
Wright, Kevin. (2006). Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services. Journal of Computer-Mediated Communication, 10(3).
Wroblewski, Luke. (2008). Web Form Design: Filling in the Blanks. New York: Rosenfeld Media.
Xie, B. (2003). Older adults, computers, and the Internet: Future directions. Gerontechnology, 2(4), 289–305.

More Related Content

What's hot

Biases in crowdsourced livestock data and its impact on modelling tick distri...
Biases in crowdsourced livestock data and its impact on modelling tick distri...Biases in crowdsourced livestock data and its impact on modelling tick distri...
Biases in crowdsourced livestock data and its impact on modelling tick distri...
ILRI
 
Survey report on ultra poor
Survey report on ultra poorSurvey report on ultra poor
Survey report on ultra poor
Diajul Islam
 
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
Agriculture Journal IJOEAR
 
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...Sweke Emmanuel Andrew (SEA)
 
Gender analysis of rural dwellers accessibility to free natural resources in ...
Gender analysis of rural dwellers accessibility to free natural resources in ...Gender analysis of rural dwellers accessibility to free natural resources in ...
Gender analysis of rural dwellers accessibility to free natural resources in ...
Alexander Decker
 
Effects of non wood forest products on rural household in
Effects of non wood forest products on rural household inEffects of non wood forest products on rural household in
Effects of non wood forest products on rural household in
Alexander Decker
 
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
Premier Publishers
 
Integration and conflict management among igbo migrants
Integration and conflict management among igbo migrantsIntegration and conflict management among igbo migrants
Integration and conflict management among igbo migrants
Alexander Decker
 
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
inventionjournals
 

What's hot (10)

Biases in crowdsourced livestock data and its impact on modelling tick distri...
Biases in crowdsourced livestock data and its impact on modelling tick distri...Biases in crowdsourced livestock data and its impact on modelling tick distri...
Biases in crowdsourced livestock data and its impact on modelling tick distri...
 
Survey report on ultra poor
Survey report on ultra poorSurvey report on ultra poor
Survey report on ultra poor
 
SNE-Wildlife_Fact_Sheet
SNE-Wildlife_Fact_SheetSNE-Wildlife_Fact_Sheet
SNE-Wildlife_Fact_Sheet
 
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
The Influence Of Socio-Economic Characteristics on Consumers’ Preference on F...
 
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...
Sweke et al. 2016_Comparative job satisfaction of fishers in northeast Hokkai...
 
Gender analysis of rural dwellers accessibility to free natural resources in ...
Gender analysis of rural dwellers accessibility to free natural resources in ...Gender analysis of rural dwellers accessibility to free natural resources in ...
Gender analysis of rural dwellers accessibility to free natural resources in ...
 
Effects of non wood forest products on rural household in
Effects of non wood forest products on rural household inEffects of non wood forest products on rural household in
Effects of non wood forest products on rural household in
 
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
SMALLHOLDER FARMERS’ CREDIT PARTICIPATION: THE CASE OF OMO MICROFINANCE INSTI...
 
Integration and conflict management among igbo migrants
Integration and conflict management among igbo migrantsIntegration and conflict management among igbo migrants
Integration and conflict management among igbo migrants
 
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
The Limit of Chill Method as a Non-Lethal Deterrent in Mitigating Human Eleph...
 

Similar to IDIA 612 - Fish and Wildlife Presentation

Birding in the United States: A Demographic and Economic Analysis
Birding in the United States: A Demographic and Economic AnalysisBirding in the United States: A Demographic and Economic Analysis
Birding in the United States: A Demographic and Economic Analysis
usfws
 
Potential_effects_of_a_major_hurricane_o
Potential_effects_of_a_major_hurricane_oPotential_effects_of_a_major_hurricane_o
Potential_effects_of_a_major_hurricane_oTim Hoffland
 
Travis Jansen - Thesis Conference Paper
Travis Jansen - Thesis Conference PaperTravis Jansen - Thesis Conference Paper
Travis Jansen - Thesis Conference PaperTravis Jansen
 
fish population dynamics, Population structure
fish population dynamics, Population structurefish population dynamics, Population structure
fish population dynamics, Population structure
Degonto Islam
 
Ey Transparency Report 2022 Gmc. Online assignment writing service.
Ey Transparency Report 2022 Gmc. Online assignment writing service.Ey Transparency Report 2022 Gmc. Online assignment writing service.
Ey Transparency Report 2022 Gmc. Online assignment writing service.
Becki Roy
 
Weigel and metz polling wildlife presentation for 6 6-13 final
Weigel and metz polling wildlife presentation for 6 6-13 finalWeigel and metz polling wildlife presentation for 6 6-13 final
Weigel and metz polling wildlife presentation for 6 6-13 finalNational Wildlife Federation
 
Data collection methods for inland fisheries
Data collection methods for inland fisheriesData collection methods for inland fisheries
Data collection methods for inland fisheries
SWAGATIKA SAHOO
 
TABLE 11-1Community Assessment using the Community-as Partne
TABLE 11-1Community Assessment using the Community-as PartneTABLE 11-1Community Assessment using the Community-as Partne
TABLE 11-1Community Assessment using the Community-as Partne
lisandrai1k
 
DRS-111 Data Structure and Data Collection Methods.pdf
DRS-111 Data Structure and Data Collection Methods.pdfDRS-111 Data Structure and Data Collection Methods.pdf
DRS-111 Data Structure and Data Collection Methods.pdf
Nay Aung
 
Essay Report 1Murid 1 Sukan 1 Malaysia
Essay Report 1Murid 1 Sukan 1 MalaysiaEssay Report 1Murid 1 Sukan 1 Malaysia
Essay Report 1Murid 1 Sukan 1 Malaysia
Jennifer Reese
 
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
Luke Garner
 
Global demography
Global demographyGlobal demography
Global demography
Thirdy Malit
 
MadanRiva_Thesis
MadanRiva_ThesisMadanRiva_Thesis
MadanRiva_ThesisRiva Madan
 
Behavioral signature of intraspecific competition anddensity.docx
Behavioral signature of intraspecific competition anddensity.docxBehavioral signature of intraspecific competition anddensity.docx
Behavioral signature of intraspecific competition anddensity.docx
AASTHA76
 
Global Demography-The tools of demography
Global Demography-The tools of demographyGlobal Demography-The tools of demography
Global Demography-The tools of demography
LuisSalenga1
 
Socio-economics of fishermen community around the Junglighat fish landing cen...
Socio-economics of fishermen community around the Junglighat fish landing cen...Socio-economics of fishermen community around the Junglighat fish landing cen...
Socio-economics of fishermen community around the Junglighat fish landing cen...
Journal of Research in Biology
 
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
AnastaciaShadelb
 
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
ChantellPantoja184
 
AFWA_Monarch_Report_Oct_23_2015
AFWA_Monarch_Report_Oct_23_2015AFWA_Monarch_Report_Oct_23_2015
AFWA_Monarch_Report_Oct_23_2015Jonathan Mawdsley
 

Similar to IDIA 612 - Fish and Wildlife Presentation (20)

Birding in the United States: A Demographic and Economic Analysis
Birding in the United States: A Demographic and Economic AnalysisBirding in the United States: A Demographic and Economic Analysis
Birding in the United States: A Demographic and Economic Analysis
 
Barthel-Schuett-73-902
Barthel-Schuett-73-902Barthel-Schuett-73-902
Barthel-Schuett-73-902
 
Potential_effects_of_a_major_hurricane_o
Potential_effects_of_a_major_hurricane_oPotential_effects_of_a_major_hurricane_o
Potential_effects_of_a_major_hurricane_o
 
Travis Jansen - Thesis Conference Paper
Travis Jansen - Thesis Conference PaperTravis Jansen - Thesis Conference Paper
Travis Jansen - Thesis Conference Paper
 
fish population dynamics, Population structure
fish population dynamics, Population structurefish population dynamics, Population structure
fish population dynamics, Population structure
 
Ey Transparency Report 2022 Gmc. Online assignment writing service.
Ey Transparency Report 2022 Gmc. Online assignment writing service.Ey Transparency Report 2022 Gmc. Online assignment writing service.
Ey Transparency Report 2022 Gmc. Online assignment writing service.
 
Weigel and metz polling wildlife presentation for 6 6-13 final
Weigel and metz polling wildlife presentation for 6 6-13 finalWeigel and metz polling wildlife presentation for 6 6-13 final
Weigel and metz polling wildlife presentation for 6 6-13 final
 
Data collection methods for inland fisheries
Data collection methods for inland fisheriesData collection methods for inland fisheries
Data collection methods for inland fisheries
 
TABLE 11-1Community Assessment using the Community-as Partne
TABLE 11-1Community Assessment using the Community-as PartneTABLE 11-1Community Assessment using the Community-as Partne
TABLE 11-1Community Assessment using the Community-as Partne
 
DRS-111 Data Structure and Data Collection Methods.pdf
DRS-111 Data Structure and Data Collection Methods.pdfDRS-111 Data Structure and Data Collection Methods.pdf
DRS-111 Data Structure and Data Collection Methods.pdf
 
Essay Report 1Murid 1 Sukan 1 Malaysia
Essay Report 1Murid 1 Sukan 1 MalaysiaEssay Report 1Murid 1 Sukan 1 Malaysia
Essay Report 1Murid 1 Sukan 1 Malaysia
 
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
An Economic Assessment of the Value of Lakes and Lake Water Quality In Itasca...
 
Global demography
Global demographyGlobal demography
Global demography
 
MadanRiva_Thesis
MadanRiva_ThesisMadanRiva_Thesis
MadanRiva_Thesis
 
Behavioral signature of intraspecific competition anddensity.docx
Behavioral signature of intraspecific competition anddensity.docxBehavioral signature of intraspecific competition anddensity.docx
Behavioral signature of intraspecific competition anddensity.docx
 
Global Demography-The tools of demography
Global Demography-The tools of demographyGlobal Demography-The tools of demography
Global Demography-The tools of demography
 
Socio-economics of fishermen community around the Junglighat fish landing cen...
Socio-economics of fishermen community around the Junglighat fish landing cen...Socio-economics of fishermen community around the Junglighat fish landing cen...
Socio-economics of fishermen community around the Junglighat fish landing cen...
 
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
 
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
14 Health Reports, Vol. 29, no. 4, pp. 14-22, April 2018 • Sta
 
AFWA_Monarch_Report_Oct_23_2015
AFWA_Monarch_Report_Oct_23_2015AFWA_Monarch_Report_Oct_23_2015
AFWA_Monarch_Report_Oct_23_2015
 

Recently uploaded

Design Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinkingDesign Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinking
cy0krjxt
 
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
kecekev
 
Design Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinkingDesign Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinking
cy0krjxt
 
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANEEASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
Febless Hernane
 
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
h7j5io0
 
一比一原版(毕业证)长崎大学毕业证成绩单如何办理
一比一原版(毕业证)长崎大学毕业证成绩单如何办理一比一原版(毕业证)长崎大学毕业证成绩单如何办理
一比一原版(毕业证)长崎大学毕业证成绩单如何办理
taqyed
 
Connect Conference 2022: Passive House - Economic and Environmental Solution...
Connect Conference 2022: Passive House -  Economic and Environmental Solution...Connect Conference 2022: Passive House -  Economic and Environmental Solution...
Connect Conference 2022: Passive House - Economic and Environmental Solution...
TE Studio
 
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
asuzyq
 
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
jyz59f4j
 
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
smpc3nvg
 
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
h7j5io0
 
Impact of Fonts: in Web and Apps Design
Impact of Fonts:  in Web and Apps DesignImpact of Fonts:  in Web and Apps Design
Impact of Fonts: in Web and Apps Design
contactproperweb2014
 
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
n0tivyq
 
Can AI do good? at 'offtheCanvas' India HCI prelude
Can AI do good? at 'offtheCanvas' India HCI preludeCan AI do good? at 'offtheCanvas' India HCI prelude
Can AI do good? at 'offtheCanvas' India HCI prelude
Alan Dix
 
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
7sd8fier
 
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
h7j5io0
 
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptxUNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
GOWSIKRAJA PALANISAMY
 
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
7sd8fier
 
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
pmgdscunsri
 
Research 20 slides Amelia gavryliuks.pdf
Research 20 slides Amelia gavryliuks.pdfResearch 20 slides Amelia gavryliuks.pdf
Research 20 slides Amelia gavryliuks.pdf
ameli25062005
 

Recently uploaded (20)

Design Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinkingDesign Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinking
 
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
一比一原版(UW毕业证)西雅图华盛顿大学毕业证如何办理
 
Design Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinkingDesign Thinking Design thinking Design thinking
Design Thinking Design thinking Design thinking
 
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANEEASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
EASY TUTORIAL OF HOW TO USE CAPCUT BY: FEBLESS HERNANE
 
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
一比一原版(BU毕业证书)伯恩茅斯大学毕业证成绩单如何办理
 
一比一原版(毕业证)长崎大学毕业证成绩单如何办理
一比一原版(毕业证)长崎大学毕业证成绩单如何办理一比一原版(毕业证)长崎大学毕业证成绩单如何办理
一比一原版(毕业证)长崎大学毕业证成绩单如何办理
 
Connect Conference 2022: Passive House - Economic and Environmental Solution...
Connect Conference 2022: Passive House -  Economic and Environmental Solution...Connect Conference 2022: Passive House -  Economic and Environmental Solution...
Connect Conference 2022: Passive House - Economic and Environmental Solution...
 
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
一比一原版(Columbia毕业证)哥伦比亚大学毕业证如何办理
 
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
一比一原版(LSE毕业证书)伦敦政治经济学院毕业证成绩单如何办理
 
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
一比一原版(Brunel毕业证书)布鲁内尔大学毕业证成绩单如何办理
 
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
一比一原版(Bolton毕业证书)博尔顿大学毕业证成绩单如何办理
 
Impact of Fonts: in Web and Apps Design
Impact of Fonts:  in Web and Apps DesignImpact of Fonts:  in Web and Apps Design
Impact of Fonts: in Web and Apps Design
 
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
一比一原版(Glasgow毕业证书)格拉斯哥大学毕业证成绩单如何办理
 
Can AI do good? at 'offtheCanvas' India HCI prelude
Can AI do good? at 'offtheCanvas' India HCI preludeCan AI do good? at 'offtheCanvas' India HCI prelude
Can AI do good? at 'offtheCanvas' India HCI prelude
 
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
一比一原版(UNUK毕业证书)诺丁汉大学毕业证如何办理
 
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
一比一原版(UCB毕业证书)伯明翰大学学院毕业证成绩单如何办理
 
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptxUNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
UNIT IV-VISUAL STYLE AND MOBILE INTERFACES.pptx
 
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
一比一原版(MMU毕业证书)曼彻斯特城市大学毕业证成绩单如何办理
 
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
Maximize Your Content with Beautiful Assets : Content & Asset for Landing Page
 
Research 20 slides Amelia gavryliuks.pdf
Research 20 slides Amelia gavryliuks.pdfResearch 20 slides Amelia gavryliuks.pdf
Research 20 slides Amelia gavryliuks.pdf
 

IDIA 612 - Fish and Wildlife Presentation

  • 1. 1Hammelburger, Pease, Sweeper NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN Hammelburger, Pease, Sweeper
  • 2. 2 CONTENTS 10 COMPETITIVE ANALYSIS 14 PROTOTYPING & TESTING 19 BIBLIOGRAPHY 12 A SURVEY FOR HUNTERS 08 A PROBLEM OF FORM(S) 04 PERSONAS 03 POPULATION 02 INTRODUCTION
  • 3. 3 INTRODUCTION Since it’s inception in 1871 as the Commission of Fish and Fisheries (USFWS, NCTC2014), the U.S. Fish and Wildlife service has served as the government’s premier agent for the conservation of wildlife species and habitat. As a division of the Department of the Interior, the Fish and Wildlife service has slowly expanded to encompass a variety of purposes, mostly related to the monitoring and preservation of biodiversity. The National Fish and Wildlife Service as we know it can be traced to a 1940 reconstitution and reorganization, merging the Bureau of Fisheries and the Bureau of Biological Survey, creating an overarching organization devoted to “protect and preserve in their natural habitat representatives of all species and genera of their native flora and fauna, including migratory birds”( Convention On Nature Protection And Wild Life Preservation In The Western Hemisphere. Oct. 12, 1940). In pursuit of this aim, the NFWS employ a variety of data collection methods aimed at generating as complete as possible a view of the overall health of the species and ecosystems under the purview. For the purposes of the monitoring of migratory birds, data for the NFWS comes from two primary locations; ecological surveys of the birds’ (primarily arctic) breeding grounds, and surveys from licensed hunters. The latter will be the primary focus of this paper. The NFWS began collecting what they call the “National Harvest Surveys of Sports Hunters” in 1952. Although the survey has changed marginally over the years, it licensed waterfowl hunters to report on the number of birds harvested over the course of a hunting season. This data is then used to not only report on the overall health of the game bird population, but also to help set the limits on and standards for successive hunting seasons (Silvernman, Wilkins. 2015). In recent years the USFWS has received the information for over 1,000,000 registered waterfowl hunters per hunting season. From this pool they select around 10% to receive survey forms; forms intended to be completed at the conclusion of the hunting season with a record of the hunter’s personal totals for both birds harvested and number of days spent hunting. Of those selected, around half submit completed surveys at season’s end. This information is then gathered by the migratory birds division, analyzed in comparison with the data gathered from the surveys conducted at the birds’ nesting sites, and used to make recommendations and impose regulations for the succeeding year’s hunting season. This paradigm has changed little in the last fifty years, and a number of concerns have arisen related to it. Firstly, the cost of conducting the survey by paper is not insignificant. Dr. Wilkins and Dr. Silverman estimate that the NFWS spends over half a million dollars in postage costs alone. Upon receipt of the response, NFWS personnel must devote considerable man hours to sifting through and clarifying the often error filled response forms. While some forms can be salvaged with correction and discrimination Wilkens and Silverman estimate that, of those surveys received, around a quarter of responses must be discarded due to errors in the subject’s submissions. Of additional concern to the NFWS is a decline in the user response rate, which they suspect is related to an increasing unwillingness to complete and submit a paper survey by mail (Silvernman, Wilkins. 2015). 
Taken together, these problems present an opportunity to redesign the “National Harvest Survey of Sport’s Hunters”, informed by modern tools and theories in survey design and distribution.
  • 4. 4 NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN POPULATION To design the US Fish and Wildlife Service survey it is first necessary to identify and define the demographics of the population of sports hunters to be surveyed. We attempt to take into account the hunters’ sex, age, regional distribution, education, race and electronic literacy. These factors provide a general understanding of the population who may be selected US Fish and Wildlife Service’s survey. As has been noted since surveys of hunter populations were first recorded, it is expected that most duck hunters will be male. In a survey taken in 1997 by Ringelman, female participation was lower than 10 % and a 2005 National Duck Hunter Survey found that most, 99% in fact, respondents were male. The trend continues, and a 2008 survey found that 86% of active hunters identify as white and 84% identify as male. Additionally, 38% report living in rural areas (Responsive Management, The National Shooting Sports Foundation, 2008). It is also significant to note that inactive hunters are generally older than active hunters and discontinue the sport due to their age and health. “Only 10% of active hunters are 65 years old or older, 23% of inactive hunters are that age, reinforcing other findings that suggest that many inactive hunters simply dropped out because of age or health rather than for other reasons” (Responsive Management, The National Shooting Sports Foundation, 2008). Due to the reported homogeneity of most hunters, and in an effort to combat a declining waterfowl hunter population, current recruitment efforts have targeted females, young people, the disabled, and minorities (Responsive Management, The National Shooting Sports Foundation. 2008). A 2011 US census bureau survey identified percentage participation amongst ethnic groups within the U.S., with 2 percent of Hispanics, 7% of Whites, 2% of African Americans, 0.5% of Asian Americans, and 2% of those identified as other races participating in the hobby of hunting. (U.S. Census Bureau, 2011) Furthermore, in recent studies amongst duck hunters these trends persist, albeit with a marginal increase amongst the under-represented groups. According to the U.S. Fish & Wildlife Service 2011 National Survey of Fishing, Hunting, and Wildlife- Associated Recreation “of the 13.7 million participants who hunted, 89 percent (12.2 million) were male and 11 percent (1.5 million) were female.” The hunting participation rate also increased by roughly five percent as individuals reached 65 years of age, and declined for those 75 and older at a rate of 2 percent. The largest active age group was those 45 to 54 years old(U.S. Census Bureau 2011). Additionally, “twenty percent of duck hunters said they “frequently” access the internet to look up duck hunting information; 49% access the internet “once in awhile,” and 31%, “not at all” (U.S. Census Bureau 2011). From this research, it is evident that most duck hunters may not be tech savvy. Subsequently, we must acknowledge the limitations in technological literacy amongst duck hunters, as well as make accommodations for an increasingly older population. With the predominant demographic amongst waterfowl hunters identified, it becomes easier
  • 5. 5Hammelburger, Pease, Sweeper to define a set of design heuristics to apply. An extensive body of research exists exploring the relationship between older individuals and information technology. In particular, research shows a correlation between a positive perception of technology’s usefulness and individual’s likeliness of using it. (Adams, N., Stubbs, D., & Woods, V. 2005) It is therefore imperative in the design revisions to the National Harvest Survey of Sport’s Hunters to cultivate the perception of usefulness amongst those surveyed. As the aging process is associated with a decrease in working memory and spatial visualization skills(Adams, N., Stubbs, D., & Woods, V. 2005), and the majority of surveyed hunters now fall, or soon will fall, within this demographic, affordances should be made to decrease the necessity of these skills. As Adams et. al. note, “simple and uniformly designed Internet pages, more user-friendly online help and error message terminology, and increased provision of training for the older user would assist uptake” amongst older users(Adams, N., Stubbs, D., & Woods, V. 2005). In the proceeding sections, we will identify a set of personas to help guide design decisions, review survey heuristics as they apply to our target demographic, as well as analyze and critique a popular existing application which replicates features to be considered for the NFWS hunter survey. The participation rate in hunting increased as household income increased until it reached incomes of $100,000 or more. The participation was highest among those with incomes of $75,000 to $99,999 at 9 percent. The majority of hunters had household incomes of $50,000 or more. (U.S. Fish & Wildlife Service 2011 National Survey of Fishing, Hunting, and Wildlife- Associated Recreation) “20% of duck hunters said they “frequently” access the internet to look up duck hunting information; 49% access the internet “once in awhile,” and 31%, “not at all.” (Responsive Management/National Shooting Sports Foundation, 2008) 38% of hunters report living in rural areas. (Responsive Management/National Shooting Sports Foundation, 2008)
  • 6. 6 NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN NAME: J.W. Brouebeck AGE: 53 HOME STATE: Seattle, Washington OCCUPATION: Journalist EDUCATION: Bachelor degree TECH SAVVY: Intermediate J.W. Brouebeck has traveled the United States researching, hunting, photographing and writing about duck hunting for the last 25 years. He has been published in Field & Stream, Outdoor Life, National Geographic and other notable publications as well as written 5 books on duck hunting. He is familiar with the rules of duck hunting for several states and has been selected for the US Fish and Wildlife survey 5 times consecutively. Because he is always traveling and has multiple addresses, he has only completed the USFW forms 2 out of 5 times they were sent. Some were mailed to an address where he was not currently living. He would like to have an updated process where he is notified through email. J.W. would also like to be able to access the form on the web because he is often mobile while traveling. NAME: Sarah Jesemy AGE: 27 HOME STATE: San Francisco Bay, California OCCUPATION: software engineer EDUCATION: Master Degree TECH SAVVY: Advanced Sarah Jesemy is the director of communications at software engineering firm, Halbot Engineering. She spends her time away from the office hunting waterfowl during duck hunting season. She learned to duck hunt as a young girl from her father Rusty Horowitz who is the founder of Halbot, who now sits as the chair of the board of advisers. Sarah began running the media department at the company when her dad retired 2 years ago. She is excited about the opportunity to take the USFW online, which is the way she prefers to do most of her personal business. Sarah is an avid Primos Hunting Calls and Ducks Unlimited Waterfowler’s Journal apps user.
  • 7. 7Hammelburger, Pease, Sweeper PERSONAS NAME: Robert Frierson AGE: 61 HOME STATE: Venice, Louisiana OCCUPATION: Captain of a shrimp boat EDUCATION: High School TECH SAVVY: Beginner Robert is a Vietnam War veteran from Venice, Louisana. He runs a family-owned shrimp fishery. He is one of 5 brothers who have been hunting gators, deer and duck with him since their teenage years. He is opposed to taking the USFW survey online, preferring to have them mailed. He would much rather not to have to do them at all. His wife Susan helped him fill out the last survey that he was sent 4 years ago, but she has passed away since then. Robert has used his time duck hunting to help him grieve. He is not sure who will help him now that Susan is gone. NAME: John Sweeney AGE: 45 HOME STATE: Chesapeake Bay, Maryland OCCUPATION: US Army EDUCATION: High School TECH SAVVY: Beginner-Intermediate John Sweeney is a Sergeant in the Army who will retire next year. He has served in the Armed Forces since the age of 18 and misses hunting after during several tours of duty overseas. He is happy to take the survey if it means that he is helping the USFW. John prefers to take the mail in survey but is open to becoming a better computer user to help him complete the online version. Once he retires he is planning to join a computer literacy group for veterans that his service buddies told him about.
  • 8. 8 NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN Since 1953 the Fish and Wildlife Service has relied on a mail based survey to collect a vast amount of data related to the experiences of hunters of waterfowl. In addition to being incredibly expensive, time consuming, and inefficient, this survey has been fraught with mistakes and inaccuracies stemming from poor design and usability issues. The current form is divided into two sections. The top half has fields relating to with when someone hunted, where they hunted, and what they hunted. The bottom half deals with the season hunting totals, focusing on how many days someone hunted, how many of each species they harvested, and how many were downed but not recovered. The proposition of using a web form instead of a physical survey has the potential to remove many of the problems present in the current form. With so much at stake it’s vitally important that we focus on making sure that the design of the form is as useable as possible. As with many other aspects of design, when a survey is done well, it’s not something that people take notice of. But when there are issues, they are all too obvious. When web forms are used for sign ups, online quotes, and checkout pages they are often the final step towards a conversion. When it comes to checkout pages, even just a slight improvement in the form can lead to a massive improvement in a business’s profit. A simple change in the form’s title or a layout change can lead to conversion improvement of 5-10%. This effects can be applied elsewhere, particularly to survey design. Poor form design has its consequences. According to Baymard Institute, a web research company in the UK, 67.45% of online shopping carts are abandoned due to issues with the forms during the checkout process (Baymard Institute). Making the form as easy to use as possible must be a priority in the design process. In order to make this project a success we must design the form using web form design best practices. As Smashing Magazine states: “The ISO 9241 standard defines website usability as the “effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments.” When using a website, users have a particular goal. If designed well, the website will meet that goal and align it with the goals of the organization behind the website. Standing between the user’s goal and the organization’s goals is very often a form, despite the advances in human- computer interaction, forms remain the predominant form of interaction for users on the Web. In fact, forms are often considered to be the last and most important stage of the journey to the completion of goals.” (Mifsud, 2011) With the main demographic ranging from 45 to 54 years, a the design focus has an obligation to be centered on usability issues for older individuals, ranging from health issues such as vision decline and motor skills decline to elder frustration. A Nielson Norman Study showed that 45% of older individuals were uncomfortable or hesitant exploring new things. (Nielsen, 2013) Of particular note was the observation that seniors were more likely to get frustrated and give up altogether instead of searching for more options. Additionally, people don’t like answering surveys. They are intrusive, feel like a waste of time, and are annoying to complete. At least that’s how most surveys are perceived. 
One respondent to the fish and wildlife survey sent a letter saying “This survey in my opinion is a waste of my time and A PROBLEM OF FORM(S)
  • 9. 9Hammelburger, Pease, Sweeper money that should be used for the wildlife and water.” This respondent didn’t feel that the survey was worth the money or time spent on it. But it is not impossible to create a form that users will nott feel is a waste of time. In the book “Forms That Work” by Caroline Jarret and Gerrey Gaffney they establish 3 rules to influence response rate: establish trust, reduce social costs, and increase rewards. People are more likely to respond if these conditions are met, if they perceive that the surveyor is trustworthy, that the costs of participation are not unduly high, and that they will be rewarded for doing so. By tapping into these inclinations, hunters’ perception of the survey, and their subsequent completion of it, can be increased. In fact, a few respondents note this themselves. “If you would make an online hunting journal where we hunt and could access it year after year so we could see our own trends, we hunters would respond well. You would get more data”. The goal of any survey is to get the information that is already inside the user’s’ mind onto paper. The best surveys aren’t the long ones, they’re the ones that ask the right questions, which help the users answer in the best way possible. When the wrong questions are asked we get issues such as the case of one respondent who, when filling out the bottom half of the survey, put in 5 killed and retrieved ducks and 8 downed but lost. The survey reviewer noted that “It is not typical for birds killed to be greater than birds lost. Suspect misunderstanding (SIC) what we want”. To avoid situations like this it is required to ask the right questions. In order to make the questions easily understood, they need to be written in a way that uses concepts familiar to the hunters. As Jarrett writes “Even something apparently straightforward can give the user a bit of a problem to decode”(Jarrett, Gaffney, 2009). Writing the questions in an easy to understand way will both increase the accuracy of the collected information, and avoid feedback like this from frustrated users “I have no idea what this survey is about. I asume(SIC) you are asking about ducks. Send me a survey that makes sence (SIC) and I will happily make it out”. People focus best when concentrating one question at a time, yet many surveys fall into the trap of asking multiple questions in one go. There are two problems with asking more than one question at a time: First, more than one question can confuse the user about which question to answer, making it difficult for them to focus on one question at a time. Second, the user may skip potentially important questions as they attempt to answer either the question they remember or the easiest question, and they may not input the most important information. To avoid this unfortunate consequence, ask one question at a time. In addition, multiple questions make it hard for respondents to give precise answers, which in turn makes those answers difficult to interpret or evaluate. It is therefore recommended that, rather than asking a series of questions, the survey should be formed around a smaller number of complete questions, addressing the overall scope of the survey’s purpose. Understanding the question is only the first part, users must then be able to properly answer the questions, otherwise a well-intentioned user will answer the question in the wrong way, invalidating the data. 
In one case, a hunter who was documenting the number of days that he hunted wrote 10 in each column, which left the person documenting the information confused as to whether they hunted for just 10 days or if they hunted each type of game on 10 different days. According to Jarrett and Gaffney there are four ways that people come up with answers: Slot-in answers which consist of everyday information, which in this case is used to record the date and location. Another method involves gathered answers, which consist of information originating from somewhere that the user can get to personally, such as the recorded amount of waterfowl hunted. Third party answers consist of questions that the user must seek from an outside source, for example by asking a friend who knows what days were hunted. Created answers are choices that the user could not reasonably have made in advance, i.e. thoughts and feedback on the survey. (Jarrett Gaffney, 2009) Knowing what type of answers will be given helps ensure that the right answers are being written. On the current survey there is an issue where many hunters have filled in areas with 0’s to fill in the space, instead of just leaving them blank. In one survey a hunter filled out a 2 for ducks and a 0 for sea ducks. The surveyor commented “ Don’t want these 0’s, they imply days of unsuccessful hunting”. One problem that was noticeable in the surveys was the many contradictions of information from the top half of the form to the bottom half of the form. Many of these issue stem from simple clerical errors, where the hunter added the information incorrectly when transferring the data from the top to the bottom. In one case the hunter mentioned hunting geese for 3 days on the top of the form, but on the bottom only wrote 2 days. In another case the hunter wrote that killed 40 ducks on the bottom, but the sum total of ducks killed on the top only added up to 39. These issues can possibly be alleviated through the use of a web form by making it easy to transfer the answers on the form. The important thing is to meet the user’s’ expectations and use the options that will make the user feel the most comfortable. By offering drop down menus and radio buttons we can make it easier to ensure that the
  • 10. 10 NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN correct information will be properly filled out. As we have seen through the physical surveys, users are prone to making mistakes, the trick is to offer help when needed and feedback when there is an error. The ideal form should never have to be explained to users to know. If it doesn’t look like a form or it’s too complicated to fill out, then redesigning it is your only option. As form expert Luke Wroblewski writes “if excessive instructions are required to explain how to complete your form, then chances are the questions you are asking are either phrased poorly, too complex, or just plain unnecessary.” (Wroblewski, 2008) Eye tracking data has also shown that users often skip the help text and instructions and dive right into the form, so designing it in a simplistic way is a must. One suggestion of getting around this is ”Rather than include help text next to each input field, show it only where required. You could show an icon next to an input field that the user can click on when they need help for that field. Even better, show help dynamically when the user clicks into an input field to enter data.”(Mifsud, 2011) Aside from the help text, a good form needs error validation. With proper error validation the problem of contradictory information, incorrect info, or even mere clerical errors would vanish. Smashing Magazine suggest using an error message saying “This notifies the user that an error has occurred, and it usually prevents them from proceeding further in the form. Emphasize error messages through color (typically red), familiar iconography (such as a warning sign), prominence (typically at the top of the form or beside where the error occurred), large font, or a combination of these (Mifsud, 2011). Many seniors have vision issues such as Glaucoma, Macular Degeneration and Cataracts. Issues such as these can make it more difficult to read computer screen fonts making it more difficult to read the form. The Nation Institute of Aging recommends that apps and websites should use “Make type size at least 12 point, 13 point, or 14 point”(NIH, NIA 2007). Be designing the form to meet these requirements, it is hoped that overall completion rate can be improved, and the resulting design will be easier, more accurate, and more cost effective. If the users choose to report by day, they would start the multi-page survey. The survey was designed around Steve Krugs principle that if what the user has to do is mindless, it doesn’t matter how long it will take. We decided to take each field of the survey and give it its own page. By doing so we would make the survey mindlessly easy for anyone to complete. The scope and features of the NFWS project contain a number of similarities with a recent app created by the organization Ducks Unlimited. Ducks Unlimited was founded in 1937 with a mission to help conserve waterfowl and wetlands. They are a well trafficked website will SimilarWeb. com recording 140k users a month, Compete.com recording 112,578 unique visitors, and Alexa.com ranking them #19,458 in the US. The organization’s is funded through a combination of grassroots fundraising events (member/sponsor banquets, shooting/fishing tournaments, golf outings) and advertising revenue obtained through their website and app. Ducks unlimited created a mobile waterfowl hunting event logging app in 2011, which has received over 50,000 downloads. 
There are two versions of the app; a free version solely requiring signup, and a premium version for due- paying members. This analysis we will focus on the journal functionality of the free app. The journal function of the app offers the opportunity to record in great detail the myriad factors of any specific hunting day, factors which users may not have paid attention to or sought to record. For example, the first section begins with basic location and date info, then proceeds to ask for the temperature, weather, wind-speed and wind direction. The actual act of logging birds harvested is an experience in frustration. Particularly, it would seem the designers were so focused on allowing users to identify species by selecting a picture, they neglected to include any affordances to guide users on how to do so. A PROBLEM OF FORM(S) CONTINUED
  • 11. 11Hammelburger, Pease, Sweeper COMPETITIVE ANALYSIS The screen begins by prompting users with a large “skip adding birds & save entry” button that takes up nearly a third of screen real-estate. Below is a small table indicating users to select a species, labeled “select”, then choose a species from the popup. Despite being arguably the most important aspect of the page, and the application in general, this button shares no similarities with any other button within the app. There is no design, no label, and worst of all, the size is smaller than every other button. These features fail nearly every design heuristic identified for the hunter population. If the correct course of action can be identified, users then choose their bird from a drop down containing small pictures of each species, with the obvious intention of making it easy for users to pick out their species. This drop down divides species into four types and orders them alphabetically. Once a species is selected, users are provided the opportunity to record the number of birds harvested, while subdividing the species into four categories: Drake Banded, Hen Banded, Drake Unbanded, and Hen Unbanded. This presents a number of problems, many of which violate heuristics for survey design amongst the target population. Firstly, these choices are not dynamically chosen based on species, and are instead standardized across all species of waterfowl. Secondly, the form contains barely enough space for users to select the field to add the number of birds harvested. Lastly, there are no labels designating the field as requiring a numbered entry, nor are they designed to look like tapable buttons. Collectively, these features demonstrate a distinct lack of affordances to guide the application’s use. The Ducks Unlimited application also makes it confusing to add multiple species of birds. The main issue is the button titles. “Save Bird” pushes the user to the next screen, rather than merely saving that particular bird. Once a user “saves” a bird (humor mine), they are moved to an “edit entry” screen and are obligated to select the bird icon in order to go back and add a second bird, a decidedly unintuitive process. Finally, users can save their bird and officially log their day. Users are able to go back and make edits to their entry and the app sorts each entry by the most recent day and season. Despite myriad design problems, the overall response to the app is positive, with an average of 4.3 out of 5 on google play and 4 stars on the Apple store. Most of the reviews center on other features, with few users making any mention of the journal. Conceptually, the journal app seems like a good idea, with excellent features such as identifying birds by picture and the ability to save entries at every step. However, in execution the application seems decidedly difficult and non-intuitive. With a design that focuses more on breadth of features rather than ease of use, the Ducks Unlimited journal application serves as a definitive guide on needlessly complicating a user experience, a guide which will inform our own decisions on what NOT to do in the redesign of NFWS hunter survey.
  • 12. 12 NATIONAL FISH AND WILDLIFE SERVICE SURVEY REDESIGN In researching the best methods for redesigning the National Harvest Survey of Sport’s Hunters we have identified a number of axes on which improvements can be made. Essentially, what we propose is a three pronged approach; identify and define the target population, cleanup the language, and streamline the delivery. By accounting for the population that constitutes the main body of sports hunters, and designing around their needs, tendencies, and beliefs it is hoped that the conversion rate on survey completion can be improved. By altering the language of the survey to increase clarity and reduce confusion, it is hoped that the number of completed but unusable surveys can be decreased. Lastly, by re-designing the survey as an online tool, the considerable postal costs can be eliminated, freeing budget for other endeavors. Taken together, these tactics could serve to greatly improve the cost, accuracy and ease of the National Harvest Survey of Sport’s Hunters. As previously noted, the greater part of the sports hunter population lies between the ages of 45-64, and are white males residing in rural areas. It is therefore imperative that the survey redesign be optimized for users within this demographic. Additionally, as this large group continues to age, and with the relative lack of younger hunters replacing them (Responsive Management, The National Shooting Sports Foundation 2008), the survey should be designed with the needs of a primarily older audience in mind. Additionally, as becomes obvious from reading the notes submitted by the surveyed, a concerted effort should be made to impress upon the hunter population the importance of the survey for the continued health of the waterfowl population, and by extension the continuation of the pastime of sports shooting. While many of the hunters seem confused and/or hostile toward the perceived invasion of privacy represented by the survey, i.e. the social cost (Jarrett, Gaffney, 2009), they also seem to generally be invested in the preservation of the environments in which they hunt. It could therefore prove fruitful to provide some explanation as to the survey’s purpose and efficacy in relation to the greater goal of preserving and maintaining the diversity of waterfowl. By helping to show an alignment between the NFWS’s goals and that of the hunters themselves, it is believed that those surveyed would be more likely to successfully complete the questionnaire. For the survey itself, there are a number of industry best practices that could be applied to improve its clarity and usability. In its current form, many survey recipients seem to show confusion as to the terminology and labeling of the survey’s various sections. In particular, the area questioning total days hunted seems to commonly produce errors. The survey could benefit from a clarification and streamlining of language to help reduce usage errors on the part of the participants. While it would take a considerable amount of user testing to arrive at a survey whose language is ideal for the target demographic, a general application of consistent and plain language could certainly serve to reduce submission errors. Additionally, changing the formatting of the questions to center on the data that is most important to the NFWS could help to increase the survey’s ease of use. 
By placing emphasis on the most important questions, and having them answered first, it is hoped that the number of completed but unusable surveys can be reduced. Additional consultation would be necessary to determine which elements of the survey are most important, but orienting the survey around its most salient questions is understood to be of great importance to successful surveys (Jarrett & Gaffney, 2009).

Lastly, the move away from paper presents a number of advantages that could help with every aspect of the survey process. By recreating the survey as an e-tool, the NFWS can reduce both the cost and the hassle associated with physically mailing and receiving the surveys. This offers a number of benefits. Not only will money be saved for
the NFWS, but survey recipients will no longer have to contend with the inconvenience of mailing in their responses. By reducing the commitment required on the recipient's end, it is hoped that overall conversion can be increased. While not on the same level as younger generations, older adults have still seen a remarkable uptick in web-tool prowess, and will only become savvier as time goes on (Xie, 2003). This paradigm should be embraced, as the benefits of moving away from paper seem to greatly outweigh the costs.

An online tool also allows for certain opportunities that would be impossible with paper. An application's capacity to tailor the survey to the individual, as well as to identify and help correct data-entry errors prior to submission, could greatly decrease the number of unusable surveys submitted. By having the application sum the days hunted, the arithmetic errors frequently present in the paper forms could be all but eliminated. Additionally, the ability to provide cues and warnings when a user's data lies far outside the norms, e.g. downed-and-lost ducks greatly outnumbering retrieved ducks, will help increase confidence that the data users submit is indeed what they intended to submit, and not noise produced by a lack of understanding.

The importance of the National Harvest Survey of Sports Hunters to both the NFWS and the hunters themselves cannot be overstated. It is an invaluable tool for the continued viability of the migratory bird population, as well as for the hobbyists who hunt them. Redesigning the survey to express this symbiosis, to show the relationship between survey and sport, and to facilitate its completion by the target demographic is an end that should serve all parties well. The plans laid out in this paper, taken together, can guide the design of a greatly improved survey. Now it is merely a matter of testing and application, revision and iteration, to arrive at a design that fulfills everyone's needs.

OUR DESIGN THEORY

Our design is based on resolving the core problems identified by the Department of Fish and Wildlife with their mailed-out surveys. Based on the information provided by the National Fish and Wildlife Service representatives, and our own observations from looking at previous surveys, we condensed these problems into three main issues:

• A low number of survey respondents
• A high percentage of mistakes in respondents' returned surveys
• The high cost of mailing and sorting through the surveys

Our solutions are intended to address all three of these problems. We have designed a system focused on a singular goal: getting users to complete the survey while avoiding mistakes. While we toyed with the idea of creating an app similar to Ducks Unlimited's, we decided against such a solution for the following reasons:

1) Relying on an app would reduce the already small user pool, forcing the DFW to choose only people who had already downloaded the app.
2) An app would create an extra barrier to entry, further deterring a group of people who may already be reluctant to complete the survey.
3) The app would draw on data the user had already recorded in the context of a social network. We worried that this social context could increase the chances of kill exaggeration, leading to faulty data.
4) Our persona research showed that a large majority of the target demographic were older users who may not be as adept at using computers and technology.
After weighing these considerations, we decided on a multi-page survey that would guide even the most unskilled of users towards the completion of the survey, a strategy whose efficacy was confirmed in user testing.
PROTOTYPING & TESTING

Paper Prototype

Our survey would begin with an email sent to selected users, informing them of their selection to participate in the annual Fish and Wildlife survey. After clicking the link in the email, they would be taken to the first page of the survey. This page was designed to immediately welcome the user and thank them for agreeing to take part in the survey. This approach was chosen to give users an initial burst of encouragement, and to express to them the importance of the survey and how their contribution would help the sport they love. Users would log in with their hunter ID number, which they would submit to start the survey.

From there, users would choose how to report their harvest:
1) By day
2) By season

This was based on the paper survey's two halves, the top half being the day-by-day record and the bottom half being the seasonal totals. The idea was that many respondents don't know their day-by-day numbers, but they do know the seasonal totals. We also added explanatory pop-ups to help users complete the report-by-day section, which would take them through the survey and then auto-populate the seasonal totals. As one of the biggest issues with the paper survey was inconsistent data transfer from the top section to the bottom, we hoped this solution would prevent those errors from ruining the perfectly good data the survey otherwise contained.

If users chose to report by day, they would start the multi-page survey. The survey was designed around Steve Krug's principle that if what the user has to do is mindless, it doesn't matter how long it takes. We decided to give each field of the survey its own page. By doing so we would make the survey mindlessly easy for anyone to complete. This section of the survey was divided into five parts:
1) Date
2) Location
3) What did you hunt?
4) How many did you retrieve?
5) How many did you hit but not retrieve?

The first page users would encounter was the date page. This page included both a manual box for users to fill out the date as well as a calendar popup. Older users have been known to dislike popups, so part of our design decision was to allow for the option of avoiding them entirely. The second page asked users what specific state and county they hunted in. We found on the paper surveys that users occasionally misspelled their locations, which could lead to some confusion; our solution was to auto-fill those locations. After getting the date and location, we needed to find out what exactly they had hunted, and how many of each they had retrieved and hit but not retrieved. We did not want
to make it all on one screen, as we felt that would just confuse the user. We decided instead to create separate pages for each question. On this page users could choose ducks, geese, or brant, and for each animal they chose they would be asked "how many did you kill" and "how many did you hit but not retrieve." After each day, users would be asked whether they wanted to add another; once they said no and were done adding days, they would be sent to the seasonal total page. That page was auto-filled with all the data they had already entered and was there for users to confirm their totals. If a total was incorrect, they could edit it directly on the page, and it would then be submitted as just a seasonal total rather than a daily total. This page also doubled as the entry page for seasonal-only reporting. As such, we made sure to include small explanations next to "days," "killed," and "downed" in case the language was confusing. After completing the seasonal totals, users would be taken to the thank-you page. This page simply reinforced how important the survey was and how valuable the user's input was to furthering the mission of the DFW.

Problems

No prototype is without its issues, and ours was no exception. When testing, the problems we found were:
• There were no indicators of time. Users had no idea from the outset how long the survey would last, and there were no indicators of progress as they went along.
• The survey was too long. There were too many pages, and with all the doubling back required for multiple days, it dragged on, even if each step was mindlessly easy.
• We forgot a back button, which is critical should someone realize they had made a mistake.

While we did have those issues, we still believed in our design principle: if we could design a survey that was easy enough to complete, we could boost the completion rate. With those three issues in mind, we made some adjustments to the design.

In the second paper prototype we made minor changes to the functionality. Instead of one continuous survey, it would function more like multiple mini-surveys, one per day. We accomplished this by adding a "home" page that users would return to after completing each day. The page was divided into two sections: the top let users start the "add a day" flow, and the bottom displayed the days already completed. In addition, users could edit the completed days. We also realized that one important issue was people making mistakes, so we added an edit feature that would allow users to revise previously completed pages. The add-a-day process was streamlined into four pages:
1) Date
2) Location
3) What did you hunt?
4) How many did you retrieve/not retrieve?

Figure 1: Prototype v2, home screen
Figure 2: Prototype v2, edit screen
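To make the add-a-day flow and the derived seasonal totals concrete, below is a minimal sketch of how the underlying data might be modeled, with the seasonal totals computed from the daily entries rather than re-entered by hand. The type and function names (DailyEntry, seasonTotals, HarvestCount) and the species keys are our own illustrative assumptions; the prototype itself contained no code.

```typescript
// Illustrative sketch only: a daily entry mirrors the four add-a-day pages,
// and the seasonal page is derived from the daily entries.

type Species = "duck" | "goose" | "brant";

interface HarvestCount {
  retrieved: number;        // birds hit and retrieved
  hitNotRetrieved: number;  // birds hit but not retrieved
}

interface DailyEntry {
  date: string;   // e.g. "2015-11-12" (hypothetical format)
  state: string;
  county: string;
  harvest: Partial<Record<Species, HarvestCount>>;
}

interface SeasonTotals {
  daysHunted: number;
  retrieved: Record<Species, number>;
  hitNotRetrieved: Record<Species, number>;
}

// Roll the daily entries up into seasonal totals so the user never has to
// transfer or re-add numbers by hand.
function seasonTotals(entries: DailyEntry[]): SeasonTotals {
  const totals: SeasonTotals = {
    daysHunted: entries.length,
    retrieved: { duck: 0, goose: 0, brant: 0 },
    hitNotRetrieved: { duck: 0, goose: 0, brant: 0 },
  };
  for (const entry of entries) {
    for (const species of Object.keys(entry.harvest) as Species[]) {
      const counts = entry.harvest[species];
      if (!counts) continue;
      totals.retrieved[species] += counts.retrieved;
      totals.hitNotRetrieved[species] += counts.hitNotRetrieved;
    }
  }
  return totals;
}
```

Deriving the seasonal page this way is what removes the daily-to-seasonal transfer errors seen on the paper form: the user only confirms, or overrides, numbers the system has already summed.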
The date and location screens remained mostly unchanged, aside from the addition of a progress meter and back buttons. The "What did you hunt?" page also remained mostly unchanged; the only notable change was using checkboxes instead of "push-down" buttons. The major change was to the retrieved/not-retrieved page: we combined the two pages into one that displayed every animal checked on the previous page. By doing this we were able to streamline the process while maintaining its simplicity.

Paper Prototype Testing

On Thursday, November 12, we, along with another group from our class, traveled to Bass Pro Shops to test the paper prototypes. We brought v2 of our paper prototype, updated based on the takeaways from the in-class peer reviews. Bass Pro was chosen because, as one of the nation's largest retailers of hunting supplies and accessories, there was a high likelihood of people in our target demographic being present. After receiving approval from the manager, we were lent a folding table and allowed to set up near the section of the store devoted to waterfowl hunting. From there, we effectively cold-called patrons, asking them if they a) had hunted waterfowl in the past and b) had any interest in participating in our user research.

Perfecting the approach took some figuring out. Hunters generally seem to be a reticent bunch, and it was revealing how quickly many of them closed off at the mention of a "survey." It proved much more effective to approach from the angle of anonymous prototype testing. A script developed: we first explained who we were, who we were working with, the purpose of the survey itself, and finally our purpose in testing it. We found that by expressing a specific desire for negative feedback, participants were far more willing to take part in the testing.

For the testing itself, we had participants seat themselves at the folding table and placed the paper prototype in front of them. We explained that the prototype should be treated as a series of screens, and that one of us would function as the computer, performing the user's intended actions. This led to a number of funny interactions in which a participant, amused at the "make believe," emphatically typed on an invisible keyboard to fill out the imaginary data fields. The other team member acted as facilitator and recorder, though the simplicity of the design seemed to minimize the amount of facilitation necessary.

Our participants ranged in age from their mid-20s to mid-60s, providing a nice cross-section of our anticipated user groups. Interestingly, their insights and recommendations did not fall along the lines we predicted: the youngest participant was the one most concerned about potential privacy issues and about the need to continue the paper survey for those unwilling or unable to participate electronically. There were, however, a number of consistent takeaways, mentioned in one way or another by our participants, that greatly informed subsequent design decisions. Chief among these insights was the fact that many hunters do not keep an exact record of their kill count, and even fewer maintain statistics for downed birds. Accommodations for this fact would be implemented in subsequent prototype revisions.
Many of the participants expressed concern over how the information would be used, and opined that many hunters might enter less-than-accurate information out of fear of legal repercussions. Ameliorating this concern, largely through branding and language, was also earmarked for implementation.

Overall reactions to the survey were positive, with all participants noting how straightforward the prototype seemed. Aside from suggesting marginal improvements for clarity and ease of use, and expressing that most of the form elements should be subdivided into individual tasks, the participants found that our prototype did in fact achieve our design goals of simplicity and accuracy, and those familiar with the paper survey said this version would be much, much easier. We felt our design was headed in a very good direction; the information gathered in testing mostly suggested changes in language and labelling to further clarify the interactions, and an increase in the amount of feedback provided to the hunters. These changes would be implemented in our electronic prototype.

Electronic Prototype

We felt that our testing of the second paper prototype went well, so we decided on a faithful transition from paper to electronic. Our main goal was to give users a better sense of the system and to see how they would navigate through it under various scenarios. We used the Axure prototyping software to build the prototype.

On the welcome screen we decided to change things slightly. We felt that saying "thank you" was a bit confusing to users, as it sounds more like something said after completing a task rather than before it. We also added the approximate time it would take to complete, and an explanation of what the survey would accomplish, giving users a greater sense of purpose and drive to complete it. Part of our feedback from the paper prototype implied
that users didn't quite know what survey they were completing, so we added a large DFW logo front and center. This would also serve as a trust signal, so users would feel more comfortable submitting their data. Finally, our testing showed that not everyone knew exactly where to find their member ID, so we added a link showing users where it could be found.

In the electronic prototype we removed the rules page. We found the rules page to be a bit superfluous, and felt we could make the survey much more usable by presenting the rules contextually, as users completed the relevant step, rather than all at once.

The home screen remained mostly the same. We added the DFW logo to the top for trust, and moved both "report by day" and "report by season" onto the home page. In our testing we found that many people gravitated to the "report by season" option even when they had their daily data, and we would much rather have them complete the daily version. We therefore shrank the "report by season" button to make it more of an afterthought, to be used only if they lacked the daily data.

For the "how many were hunted" screen, we realized we could also solve a few problems noted during paper prototype testing. One, noted by participants, was that it is unlikely a hunter would lose more birds than they retrieved. We could address this with a notification, triggered when a user entered more "lost" than "retrieved" birds, asking them to confirm the entry. We could also address the issue of people entering numbers over the hunting limit by capping the total reported harvest at whatever the limit in that area was for the year. The home screen would then be populated with the day that was hunted, and the user could go back and edit it should they need to. On the edit page users could change anything they had entered: the day, the location, and what they killed, directly from that page.

Figure 3: Electronic prototype, welcome screen
Figures 4-6: Electronic prototype pages

Electronic Prototype Testing

On Monday, December 7th, we returned to Bass Pro Shops to undertake a second round of testing using our electronic prototype, built in Axure. The testing format was the same as our previous visit, with a folding table set up in the store section containing waterfowl hunting supplies and accessories; this time, however, we were armed with two laptops loaded with the electronic prototype. Our method of participant acquisition remained unchanged: asking patrons if they hunted waterfowl, giving a brief background on who we were and the scope of the project, and lastly asking if they would like to assist us. Once again, the participants spanned a wide demographic range, and in this session we even had the privilege of testing with a female duck hunter who also happened to work for NASA.

The testing procedure was similar to that used for the paper prototypes, albeit without the need for a team member to act in the role of the computer. While the prototype was not fully functional (we declined to build in the mathematical logic), it was a reasonable enough facsimile that the participants had no trouble inferring the workflow. We were happy to see that users could intuit the workflow, and most of their suggestions focused on labeling and terminology rather than the mechanics.
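As a rough illustration of the two checks described above for the "how many were hunted" screen, the sketch below asks the user to confirm an unusual entry (more birds lost than retrieved) and rejects counts above the area's bag limit. The function name, the result shape, and the example limit are our own placeholders, not actual NFWS or DFW business rules, and the prototype itself did not implement this logic.

```typescript
// Hypothetical per-species validation for the "how many were hunted" screen.
// bagLimit would be looked up from the regulations for the state, species,
// and season selected on the earlier screens; the value below is a placeholder.

interface HarvestInput {
  retrieved: number;
  hitNotRetrieved: number;
}

type ValidationResult =
  | { kind: "ok" }
  | { kind: "confirm"; message: string } // unusual but allowed; ask the user to confirm
  | { kind: "error"; message: string };  // not allowed; must be corrected

function validateHarvest(input: HarvestInput, bagLimit: number): ValidationResult {
  if (input.retrieved < 0 || input.hitNotRetrieved < 0) {
    return { kind: "error", message: "Counts cannot be negative." };
  }
  if (input.retrieved > bagLimit) {
    return {
      kind: "error",
      message: `The limit for this area is ${bagLimit} birds; please check your entry.`,
    };
  }
  if (input.hitNotRetrieved > input.retrieved) {
    return {
      kind: "confirm",
      message: "You reported more birds hit-but-not-retrieved than retrieved, which is unusual. Is this correct?",
    };
  }
  return { kind: "ok" };
}

// Example: 3 retrieved, 5 lost against a placeholder limit of 6 asks for confirmation.
console.log(validateHarvest({ retrieved: 3, hitNotRetrieved: 5 }, 6));
```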
User Testing at Bass Pro Shops

One interesting point of disagreement amongst the participants was the overall level of technological sophistication of the hunter population. One participant in the electronic prototype testing expressed a desire for the tool to be designed mobile-first since, in his opinion, everyone can use a phone. This was in direct contrast to an earlier participant from the paper prototype session, who expressed great concern that the survey be designed with those without access to a smartphone in mind. This reinforced for us the need for a device-agnostic approach that could be easily ported across a wide variety of devices.

Another interesting insight provided by a participant, and one we had overlooked, was the need to label the hunting season. Specifically, the season often spans the new year, so a label to the effect of "2015/2016 season" would be needed for clarity. This particular participant, who had filled out the NFWS survey in a number of previous years, applauded the simplicity of our design. Remarking on the paper survey, he humorously commented, "I was glad I used a pencil." He saw our design decisions as an improvement in both ease and clarity, and most of his suggestions related to our choices in terminology.

Overall, the final testing session confirmed that we had fashioned a design that accomplished our goals of simplicity and accuracy. Participants expressed satisfaction, even near excitement, at a design that eliminated much of the hassle and drudgery of the existing survey. Design suggestions focused almost entirely on word choice and labelling, changes we were happy to implement in our final prototype version and which will be demonstrated in our final design recommendations.

Final Takeaways From Testing

After testing we made some minor changes to the home screen of the design, as well as some changes to the overall language. One issue we found with the home screen was that it didn't tell people what it was, i.e. it lacked affordances; should a user walk away during the survey and come back later, they might not remember what the survey was.

In the end, while our design underwent some significant changes in aesthetics and function, our goals remained the same. We set out to create a survey that would solve the biggest problems experienced by the DFW. They needed to boost their completion rate, so we created a survey with a single-minded focus on helping users through the steps while avoiding any possible distractions or difficulties. They needed to reduce errors, so we designed a system that checks for the most common mistakes, from small things such as spelling errors and inconsistent numbers to larger things such as an implausible number of harvested animals. They needed a solution that would help them cut costs; ours is a low-cost system that requires neither a large team nor constant maintenance. It is also a simple system that could realistically be in place within the next few months. All in all, we believe our design accomplishes the goals set forth by the NFWS at the lowest possible cost and with the highest possible efficiency. It is our sincere belief that implementing our design would allow the staff at the DFW to focus on doing what they do best: preserving the ecosystems.
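One of the smaller error checks mentioned above, catching misspelled locations, could be handled by suggesting canonical state and county names as the hunter types rather than accepting free text. The sketch below is only an illustration under assumed data: the tiny county list is made up, and a production version would draw on an official state and county reference list.

```typescript
// Rough sketch of location auto-fill: suggest canonical county names as the
// user types, so misspellings never reach the submitted survey.
// The county list here is an illustrative sample, not real reference data.

const COUNTIES_BY_STATE: Record<string, string[]> = {
  Maryland: ["Baltimore County", "Dorchester County", "Queen Anne's County"],
  Arkansas: ["Arkansas County", "Monroe County", "Prairie County"],
};

function suggestCounties(state: string, typed: string, limit = 5): string[] {
  const counties = COUNTIES_BY_STATE[state] ?? [];
  const needle = typed.trim().toLowerCase();
  if (needle.length === 0) return counties.slice(0, limit);
  return counties
    .filter((name) => name.toLowerCase().startsWith(needle))
    .slice(0, limit);
}

// Example: typing "dor" with Maryland selected suggests "Dorchester County".
console.log(suggestCounties("Maryland", "dor"));
```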
BIBLIOGRAPHY

31 Cart Abandonment Rate Statistics. (n.d.). Retrieved October 12, 2015, from http://baymard.com/lists/cart-abandonment-rate

Hedegaard, Erik. (2013, October). Redneck, Inc: The Duck Dynasty Story. Men's Journal. Retrieved from http://www.mensjournal.com/magazine/redneck-inc-the-duck-dynasty-story-20131005

Adams, N., Stubbs, D., & Woods, V. (2005). Psychological barriers to Internet usage among older adults in the UK. Informatics for Health and Social Care, 30(1), 3–17.

Bargas-Avila, J. A., Orsini, S., Piosczyk, H., Urwyler, D., & Opwis, K. (2011). Enhancing online forms: Use format specifications for fields with format restrictions to help respondents. Interacting with Computers, 23(1), 33–39.

Bickman, L., & Rog, D. J. (Eds.). (2009). The SAGE handbook of applied social research methods (2nd ed.). Thousand Oaks, CA: SAGE Publications.

Calak, P. (2013). Smartphone Evaluation Heuristics for Older Adults. Retrieved from https://atrium.lib.uoguelph.ca/xmlui/handle/10214/5610

Convention on Nature Protection and Wild Life Preservation in the Western Hemisphere, Oct. 12, 1940. Retrieved October 11, 2015, from http://www.oas.org/juridico/english/treaties/c-8.html

Dickinson, K. (1997). Distance learning on the internet: Testing students using web forms and the computer gateway interface. TechTrends, 42(2), 43–46.

U.S. Census Bureau. (2011). National Survey of Fishing, Hunting, & Wildlife-Associated Recreation (FHWAR). Retrieved October 5, 2015, from https://www.census.gov/prod/www/fishing.html

American Sportfishing Association, Responsive Management, The Oregon Department of Fish and Wildlife, & Southwick Associates. (2013). Exploring Recent Increases in Hunting and Fishing Participation. Retrieved from http://www.responsivemanagement.com/download/reports/Hunt_Fish_Increase_Report.pdf

Goodman, J., Syme, A., & Eisma, R. (2003). Older Adults' Use of Computers: A Survey. Presented at the Annual HCI International 2003 conference, Crete, Greece. Retrieved from http://www-edc.eng.cam.ac.uk/~jag76/research/2003_bcs_hci/paper.pdf

National Shooting Sports Foundation. (2009). A Portrait of Hunters and Hunting License Trends: National Report. Retrieved from https://www.nssf.org/PDF/HuntingLicTrends-NatlRpt.pdf

Silverman, Emily, & Wilkins, Khristi. (2015). Web portal for survey data entry: Project information [PDF]. Retrieved from https://ubonline.ubalt.edu/portal/site/1154IDIA612WB1/page/280ca13c-9b25-4b19-a65f-ef8e1c815763

Guyer, Daniel. (2014). Iron Duck Hunting. Retrieved October 5, 2015, from http://ironduckhunting.com/daniel/

Wroblewski, Luke. (2008). Web Form Design: Filling in the Blanks. New York: Rosenfeld Media.

Jarrett, C., & Gaffney, G. (2009). Forms that Work: Designing Web Forms for Usability. Amsterdam: Morgan Kaufmann.

Kitzmann, A. (2003). That Different Place: Documenting the Self Within Online Environments. Biography, 26(1), 48–65.

Lavery, D., Cockton, G., & Atkinson, M. P. (1996). Heuristic evaluation. Usability evaluation materials (Tech. Rep. TR-1996-15). Glasgow, Scotland: University of Glasgow.

Lim, M. S. C., Sacks-Davis, R., Aitken, C. K., Hocking, J. S., & Hellard, M. E. (2010). Randomised controlled trial of paper, online and SMS diaries for collecting sexual behaviour information from young people. Journal of Epidemiology and Community Health, 64(10), 885–889.

U.S. Fish & Wildlife Service. (2012). Your Guide to Hunting on National Wildlife Refuges. Retrieved from http://www.fws.gov/refuges/hunting/pdf/huntingGuide.pdf

Morrell, R. W., Mayhorn, C. B., & Bennett, J. (2000). A Survey of World Wide Web Use in Middle-Aged and Older Adults. Human Factors: The Journal of the Human Factors and Ergonomics Society, 42(2), 175–182.

Widener, Nick. (2013, July 14). New Site Targets Duck Hunters. Online Athens. Retrieved from http://onlineathens.com/sports/outdoors/2013-07-14/new-site-targets-duck-hunters

Pauwels, S. L., Hübscher, C., Leuthold, S., Bargas-Avila, J. A., & Opwis, K. (2009). Error prevention in online forms: Use color instead of asterisks to mark required fields. Interacting with Computers, 21(4), 257–262.

Wright, Kevin. (2006). Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services. Journal of Computer-Mediated Communication, 10(3).

Ruark, R. (1993). The Old Man and the Boy (Reprint edition). New York: Holt Paperbacks.

Staff, C. P. I. (1997). Duck Hunting: Guide-Tested Techniques for Taking All of the Important North American Duck Species. Quayside.

Mifsud, Justin. (2011). An Extensive Guide to Web Form Usability. Smashing Magazine. Retrieved from http://www.smashingmagazine.com/2011/11/extensive-guide-web-form-usability/

Williams, Kimberly J. (2010). The Call of The Wild: The Serious Business of Duck Hunting in Eastern Arkansas. Union Sportsmen's Alliance. Retrieved from http://unionsportsmen.org/the-call-of-the-wild-the-serious-business-of-duck-hunting-in-eastern-arkansas/

Responsive Management & The National Shooting Sports Foundation. (2008). The Future of Hunting and the Shooting Sports. Retrieved from http://www.dnr.state.il.us/nrab/children/future_hunting.pdf

Underwood, L. (2004). The Duck Hunter's Book: Classic Waterfowl Stories. Globe Pequot.

U.S. Fish and Wildlife Service, National Conservation Training Center. (2014). USFWS History: A Timeline for Fish and Wildlife Conservation. Retrieved October 11, 2015, from http://training.fws.gov/history/USFWS-history.html

Brown, Curt. (2014, January 26). Westport duck hunter recalls his will to survive harrowing accident in river. SouthCoast Today. Retrieved October 5, 2015, from http://www.southcoasttoday.com/article/20140126/NEWS/401260320

Xie, B. (2003). Older adults, computers, and the Internet: Future directions. Gerontechnology, 2(4), 289–305.

National Institutes of Health, National Institute on Aging. (2007). Making Your Printed Health Materials Senior Friendly. Retrieved from https://www.nia.nih.gov/health/publication/making-your-printed-health-materials-senior-friendly

Nielsen, Jakob. (2013, May 28). Seniors as Web Users. Nielsen Norman Group. Retrieved from http://www.nngroup.com/articles/usability-for-senior-citizens/