SEO in the Web 2.0 Era: The Evolution of Search Engine Optimization


Whitepaper I wrote while at BKV back in 2007. Surprisingly still a pretty decent read. Notwithstanding the prediction of Second Life's growth, of course :)



A Bennett Kuhn Varner, Inc. White Paper

SEO in the Web 2.0 Era: The Evolution of Search Engine Optimization

By Will Fleiss, Organic Search Specialist
May 1, 2007
Table of Contents

Introduction
Search Engines: A Brief History
Web 2.0: The New Internet
Web 2.0: The Technical Landscape
SEO Linking Strategy in Web 2.0
Social Media Optimization: A Piece of the SEO Puzzle
Usability vs. Searchability: The RIA Search Challenge
Google's Personalized Search: The End of Traditional SEO?
Search Behavior R&D: Customized Engines and Long Tail Keywords
Conclusion
About BKV
References

© 2007 Bennett Kuhn Varner, Inc. Licensed under the Creative Commons License, Attribution 3.0.
Buckhead Centre, 2964 Peachtree Road, Suite 700, Atlanta, GA 30305
PH 404.233.0332  FX 404.233.0302
I. Introduction

To those of us whose passion for the growth of the World Wide Web is exceeded only by the marketing possibilities that emerge from that growth, the Internet has become a playground for the imagination. There are a large number of marketers, however, who are fascinated by the Web but approach its marketing capabilities more out of necessity than lifestyle. The Internet's capacity has advanced in so many areas in the past few years that marketers playing catch-up are at a significant disadvantage. Marketing directors and account managers with traditional media backgrounds need to expand their breadth of knowledge in order to make informed decisions in today's e-commerce. This article clarifies the fairly recent buzzword "Web 2.0" and focuses on the evolution and future of the occupation the search engine gave birth to: Search Engine Optimization (SEO). SEO and its implications are expanding so fast and in so many directions that it has never been more important for C-level professionals and traditionally oriented marketers to fully understand the world of Internet search.

II. Search Engines: A Brief History

When the first search engines began cataloging the World Wide Web in the mid-1990s, obtaining a high rank on search engine results pages (SERPs) was not particularly difficult or secretive. Webmasters submitted URLs to the engines and communicated a page's relevancy to a keyword search through keyword meta tags in the HTML code. Early engines, like AltaVista, struggled to provide relevant search results because webmasters, who were paid on a cost-per-impression basis at the time, wrote inaccurate meta tags using high-search-volume keywords in order to increase visits to their websites.1 It was Google that finally answered the call for a more complex ranking algorithm that would greatly improve the relevancy of SERPs. Sergey Brin and Larry Page, the founders of Google, invented the concept of PageRank, an algorithm that ranks web pages based on the probability that a random person surfing the Internet will land on a given page.2 The PageRank algorithm assigns a numerical value to each web page by analyzing the quantity and quality of the pages that link back to it. Known as a backlink, each link represents a vote for the page it points to, cast by the page on which the link appears. The significance of each vote depends on how relevant the linking page is to the page receiving the link, as well as on the PageRank of the linking page itself.

Along with the search engines' continual efforts to provide more relevant results to the user, the entire Web has been evolving to meet the needs of the massive Internet population. In conjunction with the growth of the Internet and the popularity of search, a unique profession known as Search Engine Optimization (SEO) was born. SEO tactics and skills have evolved alongside the changing Internet, but such changes have never been as significant as the most recent. We have entered a second phase of the Internet, and as a result SEO is taking on a new face.
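The random-surfer idea behind PageRank can be sketched in a few lines of Python. This is an illustrative simplification (a plain power iteration with the commonly cited 0.85 damping factor), not Google's production algorithm, and the tiny link graph is invented for the example.

```python
# Toy PageRank: a page's score is the chance a "random surfer" lands on it,
# following links 85% of the time and jumping to a random page otherwise.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # Each outbound link passes an equal share of the linking
                # page's current rank to its target -- a "vote".
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# An invented three-page site: both other pages vote for "home".
graph = {"home": ["about"], "about": ["home"], "blog": ["home"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # → home
```

Note how "blog", which no page links to, keeps only the random-jump share of the score, while "home" accumulates the votes of both other pages.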
This second generation of the Internet, often referred to as Web 2.0, has moved away from the old model — based on static websites, clicks, and impressions — and burst onto a cyber playing field built around communities, participation and open cooperation towards better products and services.3 An unprecedented level of interaction between consumers, businesses, and interest groups exists in this new Web. Thanks to this new social presence, vehicles for driving organic traffic to one's website have expanded far beyond the major search engines. While obtaining high rankings on the major search engines is still an SEO's main objective, the means by which this positioning is achieved require a much broader capacity for creativity than ever before. Many of these new tactics also provide additional avenues of incoming traffic, which has significantly expanded the big-picture view of the SEO professional. In order to grasp the fundamental principles of the creativity and perspective now required of SEO, it is important to get a better understanding of the new Web 2.0 environment.

III. Web 2.0: The New Internet

Defining or labeling the new Internet is often met with a considerable amount of critique due to the expansive reach of such a description. So many different things have changed about the Internet in the past several years that a concise definition is difficult to come by. In addition, the term Web 2.0, while perhaps the most accurate term available, is typically scoffed at by the skeptical industry veteran who is wary of a vendor or brass employee attempting to sound Internet savvy. The World Wide Web has existed for almost twenty years. What is so significant about the changes of the last few years that distinguishes the current Web as an upgrade from its previous omnipotent self? The simple answer to this question is you. Web 2.0 represents the user's needs, hopes, and desires finally manifesting into a definable force of "voluntary motivation."4 The blogosphere, social networks, wikis, and other new forms of expression on the Internet have captured the Web population by harnessing its goals, skills, and interests onto a platform of collaborative creation and production. Websites now reflect an up-to-the-minute common voice rather than a collection of static informational documents. The Web has never before experienced this level of effective interaction between its users, and that reason alone warrants its 2.0 designation.

Ease of self-expression, now apparent on the Internet through the popularity of websites like MySpace and YouTube, is generating massive amounts of original content. Critics of this tremendous increase in creativity and public opinion complain about the dilution of reliable, quality content on the Internet. Many social networks, however, naturally weed out undesirable content and promote popular, well-referenced content to the top of searches. In Web 2.0, popular content emerges via a user-generated ranking system that determines the positioning of articles by the number of user votes they receive. This model was made most popular by a handful of community-based popularity websites that provide a user-edited resource for finding news stories, blog entries and other websites.
In Web 2.0, up-to-date, reliable content is also produced by the editing abilities of the wiki. Wikipedia, the Internet's user-written and user-edited encyclopedia, boasts an accuracy level not far from that of the widely accepted Encyclopedia Britannica. In a study that compared forty-two science entries in both resources, Wikipedia averaged only four inaccuracies per entry compared to Britannica's three.5

Social network websites in the new Internet also have a way of allowing like-minded people to find each other's favorite content, through a system called social bookmarking. This system of classification, known as folksonomy, involves users assigning labels, or tags, in the form of keywords to content on the web. Through this collaborative form of tagging, web content becomes grouped by recognizable categories. Continuous tagging and creation of categories by users increases the content's ability to be found by a wider range of people. This social phenomenon happens "because stable patterns emerge in tag proportions [allowing] minority opinions [to] coexist alongside extremely popular ones without disrupting the nearly stable consensus choices made by many users."6 Such websites are considered "social" because users' bookmarks are publicly shared for other users to browse and discover what people find interesting.7

IV. Web 2.0: The Technical Landscape

Aside from the collaborative aspect of the new Internet, there is another reason the Web has earned its 2.0 upgrade. The user's interaction, not with other users but with the interface of the Net itself, has changed significantly. Technical advancements in web navigation and design, as well as the increased penetration of high-speed broadband connections, make the new Web a foreign landscape compared to its older version. Web applications have continued to improve, providing a profoundly different user experience. The implementation of rich internet applications (RIAs) is gaining ground. RIA technologies, such as Flash, Ajax and Java, are leading the Internet in the direction of a Web without web pages. Websites are traditionally made up of networks of static pages linked together by text in the form of the computer language HTML. These pages behave in a synchronous manner: after the user clicks on a link, there is a short period while the server processes the input, in turn triggering the browser to download the requested page. RIAs operate in an asynchronous fashion, allowing response times to be much faster. RIAs' increased responsiveness results from the following five factors:8

• Information can be obtained from a server by anticipating certain user input.
• The screen can be refreshed in pieces instead of all at once, eliminating the need for entirely different pages to load when navigating content.
• More than one user's input can be collected and validated before it is sent to the server.
• Some responses to user input can take place without any server communication.
• Certain processing that was once handled on the server end can be stored on the user's desktop.

Google Maps is a prime example of the pageless capabilities intrinsic to RIAs, such as Ajax, that are disrupting the crawlability of the Web. Search engines rely on crawling and indexing pages that have unique URL addresses. Websites built using RIAs not only function without the constant reloading of pages, but in most cases do not require unique URLs. The growing implementation of RIAs has important implications for search engines and optimizers alike.

V. SEO Linking Strategy in Web 2.0

The Blogosphere & RSS

The common SEO adage continues to be valid in the 2.0 world: content is king. It is the content boundaries and the means for dispatching content that have truly taken SEO to another level. Since the inception of the blogosphere — a term that describes all blogs as a social network of public opinion — rumblings of the people's voice via the Internet have quickly risen to a powerful roar. Beginning as an online diary in the mid-90s, the blog has since developed into a simple vehicle of communication for anyone who desires to send content across the Web. The dissemination of information through blogging has become so mainstream that one can find a blog from an authority source on virtually any topic. The blogosphere, centered on the concept of original content, has provided a link-rich venue for the SEO to plan his or her linking strategy around good content.

So what is "good content," and what does it have to do with good linking strategy in Web 2.0? In this new era of the Internet, good content is viral. Whether this content is a written article, a homemade video or a podcast, if it grabs, provokes or tickles the user, it will travel, and it will travel fast. From the content's eye-view, the Internet has become much easier to navigate following the advent of Really Simple Syndication (RSS). RSS allows a program called an aggregator (or feed reader) to notify users of new content added to a website, retrieve that new content, and present it to the user in an easy-to-use interface. RSS and blogging go hand-in-hand because of the constantly updating nature of the blog. As a result of RSS, people are discovering new content on the Internet, passing it along, and linking to it at an unprecedented rate.

Enter the skilled SEO, who sees potential for organic traffic and quality backlinks. By creating, or at least harnessing, the creativity necessary to produce content that people find useful, funny or interesting, optimizers can watch as their creative efforts pay dividends. In Web 2.0, a fully comprehensive linking strategy must spend more time producing quality, viral content and less time submitting to directories, buying links, and pursuing reciprocal links. While these latter tasks are low-cost, they are time-consuming, and efforts can often be wasteful or present the risk of a decrease in search engine rankings if responsible linking tactics are not observed.
Baiting the Link

The SEO practice of producing content in hopes that people will link to it from their own websites is known as "link baiting." Good link bait has the same qualities as good content. From a well-written controversial article to a video clip of a bulldog on a skateboard, website owners will link to any and all content as long as it is interesting and catches people's attention. There are no boundaries surrounding the types of content one can use to bait a link. In fact, the very name of a new kind of link baiting suggests an indefinable quality. This new link baiting tactic is called "widget baiting." Nick Wilson, CEO and senior strategist of the social media marketing agency Clickinfluence, declared that "the holy grail of linkbaiting in 2007 will be the widget."9

In reference to computers, a widget is an element of user interface that displays information or provides a specific way for a user to interact with an application. A widget could be a calendar, a stock ticker, a quote of the day, or an icon that collects the most popular YouTube videos. To get an idea of the limitless widget possibilities, check out Yahoo! Widgets. In its most basic form, a widget is a downloadable interactive virtual tool made up of simple bits of code that can easily be added to a webpage. When a widget is added to a webpage, if coded correctly, it will act as a crawlable link pointing back to its page of origin. These links can help to boost a site in the search engine rankings, but they also represent great potential for organic traffic.

The traffic generated by a popular widget could, in some cases, outweigh traffic from the major search engines. One example of traffic generated by a widget is a blog editor Firefox extension created by the professional blogging company Performancing, which received close to half a million downloads when it was first released.10 The brand awareness that widgets can promote has also made advertisers extremely enthusiastic. One would be hard-pressed to find a better method of exposure than a logo attached to a button that sits in front of a user's eyes daily. Widgets can be downloaded to the desktop, so the user does not even have to have an internet browser open to be exposed to the advertising. While all interactive marketers will recognize the widget as an effective marketing tactic, in most cases, due to the linking and organic traffic potential, it will be the SEO who is best suited to orchestrate the creation and implementation of the widget. In Web 2.0, effective linking strategy must include widget baiting.
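The "crawlable link pointing back to its page of origin" is the SEO detail that makes widget baiting work. A minimal sketch of the idea (the function, site names and URLs are all invented for illustration): the embed snippet a widget provider hands out pairs the script that draws the widget with a plain HTML anchor that spiders can follow even though they ignore the script.

```python
def widget_embed_code(widget_url, home_url, anchor_text):
    """Build an embed snippet for a hypothetical widget.

    The <script> tag loads the widget itself; the plain <a> tag is the
    crawlable backlink that search engine spiders can follow and count
    as a vote for the widget's page of origin.
    """
    return (
        f'<script type="text/javascript" src="{widget_url}"></script>\n'
        f'<a href="{home_url}">{anchor_text}</a>'
    )

snippet = widget_embed_code(
    "http://example.com/widget.js",   # invented URLs for illustration
    "http://example.com/",
    "Free stock ticker by Example.com",
)
print(snippet)
```

Every site that pastes the snippet contributes both the daily-visibility exposure described above and one more indexable backlink.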
VI. Social Media Optimization: A Piece of the SEO Puzzle

In this new age of the Internet, people have been quick to deviate from the title Search Engine Optimization when describing the organic promotion of a website. In August 2006, Rohit Bhargava, VP of Interactive Marketing for Ogilvy Public Relations, coined the phrase Social Media Optimization (SMO) and defined it as the following:

[The act of implementing] changes to optimize a site so that it is more easily linked to, more highly visible in social media searches on custom search engines (such as Technorati), and more frequently included in relevant posts on blogs, podcasts and vlogs.11

On one hand, Bhargava's point is well taken. If the tasks one is performing to drive traffic to a website are intended to do so not by improving search engine rankings but by building a presence in social networks, then perhaps SEO is not the appropriate name for the occupation. There is no doubt that SEO has undergone, and will continue to undergo, a certain level of compartmentalization. As different areas of SEO continue to experience the growth of specialized services, such as blogging, widget baiting and social networking, the future SEO will spend a large part of his or her time moderating and collaborating with outsourcing opportunities that are not, by themselves, SEO related. In the end, however, SEO is a sum of its parts, and from the perspective of a company looking to pay for SEO services, all methods of driving organic traffic will reside under the umbrella of Search Engine Optimization.

Notwithstanding the new coinage, SMO is an important component of SEO in Web 2.0. An SEO's intention in a social network is to create the illusion of the natural links that occur during the interaction that takes place on such networks. It is these links that search engines value the most, because they happen as a result of real interests, not paid or reciprocal contracts. These links often lead to spikes in traffic, which have been criticized for providing only unqualified visitors and using up bandwidth. While these spikes continue to be a topic of debate on SEO forums, traffic after a spike does typically return to a level higher than it was before. The more authentic the illusion of natural interaction created by the SEO, the better the results. SEO in Web 2.0 introduces a skill set of creativity that was not previously present. The space for this creativity, which ties in with the above link baiting topic of quality content, is especially exciting for the SEO of the future. The possibilities for attracting genuine links and organic traffic are limited only by the SEO's imagination.

VII. Usability vs. Searchability: The RIA Search Challenge

To the skilled web designer, Web 2.0 invokes a technical definition referring to fairly new applications that are being implemented on the Internet to improve the usability and/or aesthetics of web sites. These applications, known as RIAs, present an interesting dilemma to anyone involved in creating as well as finding these sites. Search engines that find and index web pages by crawling HTML code are presented with the challenge of expanding their crawling capabilities. Clients want to give their customers the best user experience to increase loyalty, but do not want to sacrifice the organic traffic and awareness received from the search engines' natural listings. The SEO is caught somewhere in the middle, scrambling to find a solution for the client and cursing the search engines for their limited indexing abilities. As a result of this RIA conundrum, SEOs will be forced to become increasingly tech-savvy in order to work closely with RIA architects and web designers while building the web sites.
By fully understanding search engine algorithms and RIA technology, it is possible to embrace the RIA applications of Web 2.0 while increasing, or at least maintaining, SEO results.12

Currently, in addition to PageRank and similar link-based algorithms, search engine spiders rely on metadata within the HTML code to determine the contents of a webpage. The most common definition of metadata is "data about data." An example of metadata in the HTML code of a webpage is as follows:

<META NAME="Description" CONTENT="BKV is a full service direct marketing agency specializing in direct mail, broadcast media production and planning, interactive media and search engine marketing.">

Above is the "meta description" of the BKV homepage. The text is a type of data that describes the data, or content, that fills the webpage. This meta description is crawled by the search engine spiders in order for them to index the contents of the website. Because spiders crawl from page to page, internal linking between the pages of a web site is important. Increased interconnectivity improves the likelihood that the spiders will crawl and index the entire site. For this reason, the inclusion of a sitemap is a common SEO tactic, because it provides a concise hierarchy of the pages and links that make up the website.

With the use of RIA, all of the above crawlability disappears. Pages are written in JavaScript instead of HTML, and the applications no longer require the refreshing of entire pages. For this reason, RIA-based web sites are not made up of pages with unique URLs, and as a result they lose the benefit of interconnected pages. Until search engines implement a new way of indexing the Web, where does this leave those who want the usability of RIA but do not want to sacrifice their high search rankings?

According to Matt Cutts, senior engineer at Google, "The vast majority of sites are still built as static Web pages, so we don't foresee a problem at this time."13 This answer does not indicate any alarm on the part of the leading search engine. Companies that want to have the usability of RIA while maintaining search engine accessibility will have to consider building two separate websites with unique URLs. Designing separate sites is not to be confused with the black hat SEO tactic known as "cloaking," which means serving different versions of a webpage based on whether the search agent is a user or a search engine spider. While the need for talented RIA web designers will no doubt drive up web design costs, if companies want to keep their presence on search engines they will be forced to spend more money on SEO and traditional web design. Building and maintaining entirely search-engine-oriented websites will become a common practice for companies that demand the usability of RIA. Despite the aforementioned, the professional SEO must also be the one to forewarn clients about the overuse of RIA. An entire site should not be made in RIA just because it can be. It will be the SEO's job to team up with the web designers and determine which parts of a site can remain static in order to salvage indexability from a search engine's standpoint. The SEO of Web 2.0 will have enhanced technical skills in order to deal with this RIA search challenge.
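What a spider does with a meta description like the BKV example in this section can be sketched with Python's standard-library HTML parser. This is a simplification of real crawler behavior, shown only to make the "data about data" idea concrete.

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pull the content of <meta name="Description" ...> from a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        # The parser lowercases tag and attribute names, so uppercase
        # markup like <META NAME=...> is matched as well.
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

# The meta description example from the text, embedded in a minimal page.
PAGE = """<html><head>
<META NAME="Description" CONTENT="BKV is a full service direct marketing
agency specializing in direct mail, broadcast media production and planning,
interactive media and search engine marketing.">
</head><body>...</body></html>"""

parser = MetaDescriptionParser()
parser.feed(PAGE)
print(parser.description)  # the text an engine may index for this page
```

An indexer would store this string alongside the page's URL and link data; as the text notes, it is also the kind of snippet an engine can show in a listing.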
VIII. Google's Personalized Search: The End of Traditional SEO?

From the Web 2.0 era has emerged one overriding feature: personalization. These days, those browsing the Net are clicking with far more conviction, as increased exposure to the Web has sharpened their ability to find what they are looking for. Instead of lackadaisically surfing from site to site, many users now enter the Web through a personalized homepage containing RSS feeds that do the searching for them. Search engines that specialize locally, topically, and vertically have increased significantly in recent years, signifying the demand for a more precise search experience. In order to combat the competition of increasingly specialized search engines, Google has wisely focused its efforts on improving the relevancy of SERPs through personalization.

The beginning of Google's quest for more personalized [and therefore, in theory, more relevant] search results can be marked by its 2004 beta launch of Google Personalized Web Search. Users of the service were required to set up profile pages where they indicated personal interests by checking off various categories. The SERPs would refine themselves based on the user's interests and desired level of personalization, the latter depending on how the user manipulated a designated "slider" bar that appeared above the search results.14 In April 2005 Google introduced My Search History, which, when users are signed into their Google account, keeps track of every Google search query and every SERP listing they click. This service was originally marketed to the public as a useful resource for remembering past search interests. The following quotation, announcing My Search History, is from Google's official blog:

How many times have you used Google to find an obscure funny website or fun facts about "The Wizard of Oz," but then got distracted by other web pages and tasks? I know — me too. Wouldn't it be great to find them again, and for that matter review all your Google searches over time? Which is exactly why we built My Search History.15

The true purpose of My Search History, however, became clear in June 2005 with the introduction of Google Personalized Search. This service alters search results based on search queries, the sites one selects, and the number of times sites are visited, as well as any activity taking place on the user's Google Personalized Homepage (created with every Google account), such as bookmarks, RSS feeds, or widgets. The ability to take a user's unique search behavior into account when forming search results means that Google's search results for a given keyword will no longer be the same across the Web. Currently, SEOs typically optimize websites with the Google algorithm in mind. This practice is reasonable because Google receives the most searches in the U.S. (63.9% as of February 2007).16 The fear is that without the Google standard, measuring the effects of SEO efforts by improvement in ranking will become impossible, because the results will vary by individual.
If personalization has been infiltrating search results since 2004, why is the SEO world only now expressing its concern? On February 2, 2007, Sep Kamvar, Google's Engineering Lead for Personalization, announced on Google's official blog that anyone who signs up for a Google Account, whether it be for Gmail, AdWords, AdSense, or Google Analytics, will by default be enrolled in Google Personalized Search. While there exists a way to opt out of the personalized search results, there will be plenty of users who will either breeze through the sign-up process without noticing the option, see the service as an improvement, or simply not care one way or another. Responses to this rather large step towards personalizing the search experience have been mixed. Some say that Google Personalized Search will not have any sort of notable impact on SEO because most people will not be logged in to their Gmail accounts when searching the Internet. Others note the relatively small market share held by Gmail users: as of May 2006, Gmail users were a distant fourth in market share behind Yahoo! Mail, MSN Hotmail, and MySpace Mail users.17 The immediate impact of Google Personalized Search on SEO efforts is unknown; however, there is no doubt that the leader in online search continues to move in the direction of improved search relevancy for the user. This movement towards personalization will alter SEO best practices to revolve increasingly around Google. Tactics such as prompting visitors to add websites to their Google Bookmarks, as well as to add RSS feeds to their Google Personalized Homepage, are ways to drive traffic and to influence the SERPs of a user who is signed in to Personalized Search. Using the Google Gadget API to create widgets, referred to above as widget baiting, will be another necessary SEO tactic.

The question remains: does Google Personalized Search represent the end of traditional SEO? Without a common SERP for each keyword query, what will be the point of optimizing for specific keywords? Personalized Search rightfully sends a shiver down any SEO's spine; however, the truth is there is no immediate danger that websites will cease to require optimization in order to obtain search engine visibility. The frequent use of Content Management Systems (CMS) that are not search engine friendly will require traditional "on-page" optimization for many years to come. In order to achieve initial rankings on the search engines, a website must be indexed, which requires optimization of HTML, creation and submission of sitemaps, internal anchor linking, and all sorts of other little things that increase a website's chances of being crawled. Google Personalized Search will further the importance of quality content; however, traditional SEO will remain necessary until search engines change the way they index websites. Personalized Search does represent a step in the development of a search algorithm that can effectively take click-through rate [CTR] into account when determining rankings. Logically, listings that are clicked on more are more relevant to the keyword search query. Search engines typically use a webpage's meta title and description to display the page's listing. When this CTR development is realized, the traditional SEO tasks of writing good webpage titles, descriptions, and URL directory structures will become even more important for the purpose of communicating the contents of a webpage. Receiving a click will not only mean more traffic, but also a higher CTR, which will further increase a site's ranking.
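The CTR idea described here — listings that earn more clicks rise in the rankings — can be illustrated with a small re-ranking sketch. No engine publishes its actual formula; the blend weight, relevance scores and click figures below are invented purely to show the mechanism.

```python
def rerank(listings, ctr_weight=0.3):
    """Blend a base relevance score with observed click-through rate.

    score = (1 - w) * relevance + w * CTR, both on a 0..1 scale.
    """
    def score(listing):
        imp = listing["impressions"]
        ctr = listing["clicks"] / imp if imp else 0.0
        return (1 - ctr_weight) * listing["relevance"] + ctr_weight * ctr
    return sorted(listings, key=score, reverse=True)

serp = [  # invented listings: b's compelling title/description earns clicks
    {"url": "http://a.example", "relevance": 0.80, "clicks": 20, "impressions": 1000},
    {"url": "http://b.example", "relevance": 0.75, "clicks": 300, "impressions": 1000},
]
print([listing["url"] for listing in rerank(serp)])  # b overtakes a on clicks
```

This is why the text argues that well-written titles and descriptions gain importance under a CTR signal: they are what earn the click that feeds back into the score.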
IX. Search Behavior R&D: Customized Engines and Long Tail Keywords

With the new age of the internet comes an improved ability to track user behavior. Better tracking means increased accountability for all internet marketers, SEOs included. Obtaining a better understanding of search behavior will prove to be imperative as competition for high rankings increases. In order to meet the higher standard for performance, research and development will receive more attention. Advancements will take place in this arena through SEOs utilizing customized search engines and long tail keywords.

A customized search engine is a user-modified search engine that produces web results based on a set of predefined parameters. These parameters include designating a list of URLs for the spiders to crawl, refining a search by topical category such as health care or sports, searching only a website's "neighborhood" (the site's inbound and outbound links), and narrowing results to include only certain languages, regions, file formats or RSS feeds. Each of the three big search engines has its own version of a customized search engine. Google has the Google Co-op Customized Search Engine, launched on May 10, 2006; Yahoo has Rollyo, released in September 2005; and MSN has the Advanced Live Search Box, a feature of Windows Live Search introduced to the public on March 8, 2006. Each of these engines provides the code for the owner of the customized search engine to place the search box on their website and tailor it to the look of the site. Not only can customized search engines improve a user's experience when visiting a website, but by experimenting with different parameters and results, SEOs will gain a better understanding of their customers' search behavior. According to a report by the Aberdeen Group released in February 2007, within the next 24 months 96% of all ecommerce web sites will contain search tools.18 Among companies that track how many visitors convert to sales, 60% found that search tools convert customers at a rate of 5% or higher.19 It is this conversion rate that will lead to the use of more customized search engines; however, the SEO must take this opportunity to expand his or her knowledge of search behavior. One way to do this is to use the customized search engine to expand on the understanding of what keywords customers are using to search for different products and services. The search engine box can be programmed to record the search terms used, and functionality can be added that allows users to click on the most recent search keywords in order to perform that search.

As methods for discovering more qualified search terms improve, keyword research will once again have its day in the sun. SEOs will start dedicating the necessary time towards optimizing for "long tail" keywords. Long tail keywords are the thousands of multi-word search queries that lead customers to find a website, but that are rarely optimized for because each generates such a small number of monthly searches.
long tail keywords has been around since 2005, but they still receive little attention due to the conventional 80-20 rule that has been so thoroughly ingrained in the traditional marketer's mind. This rule refers to the idea that the group of high-volume keywords in a category makes up 80% of a website's traffic and converting sales. Typically, people focus on the top ten or twenty keywords that bring in thousands of visitors, because clients are happy to see the broad keywords ranking high and driving traffic. The truth, however, is that not only does the sum of the low-volume keywords bring in more visitors than the high-volume ones, but they typically convert better as well. Use of long tail keywords represents a searcher who is further along in the buying process than someone who types in only one or two broad keywords. As measurability increases on the Internet, the SEO will be forced to develop extensive long tail keyword strategies that lead to more conversions in order to report a better return on investment (ROI).

X. Conclusion

Search Engine Optimization is, by all accounts, a multifaceted occupation. It has evolved most significantly as a result of the Web 2.0 phase of the Internet. The social presence on the Net will continue to play a large role in determining how SEOs attract traffic to their websites. An up-and-coming technology that SEOs should keep an eye on is the advanced social networking software that creates 3-D virtual worlds, such as Second Life. Second Life hosts the social and business-oriented interaction of over five million users. As Second Life and three-dimensional worlds like it grow in popularity, the opportunity to further understand customer behavior will present itself. SEOs will need to use this platform to continue to develop new ways to drive organic traffic.

The fundamental nature of search will continue to evolve as well. Search algorithms that currently rely on the crawling of HTML text will eventually be forced to adapt to applications that promote usability, such as Ajax. Indexing abilities will expand for other reasons as well. The significant influx of video and image content to the Internet in the past few years has made it apparent that the search engines' current text-based methods of indexing content are far from sufficient. Several companies have already developed software to search for videos and images using speech and pattern recognition technology. It could be several years before such technology fully infiltrates mainstream search, but it will be the SEO who ensures that websites are optimized to be found by such indexing technology.

The skill set of the future SEO includes elements of both art and science. The SEO's level of understanding of both human and computer behavior will distinguish how effective they are at achieving high rankings and driving organic traffic to websites. The best SEOs will drive qualified traffic that yields a higher conversion rate of sales.
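The 80-20 claim discussed in Section IX, that the aggregated "long tail" of low-volume queries can outweigh the handful of head keywords in both visits and conversions, is straightforward to check against a recorded search-term log such as the one a customized search box can capture. The sketch below is a hypothetical illustration only: `analyze_keyword_log` is not part of any search engine API, and the split point (`head_size`) is an arbitrary assumption.

```python
def analyze_keyword_log(visits_by_keyword, conversions_by_keyword, head_size=20):
    """Compare the top `head_size` keywords against the long tail.

    visits_by_keyword: dict mapping search query -> visit count
    conversions_by_keyword: dict mapping search query -> conversion count
    """
    # Rank queries by visit volume, highest first.
    ranked = sorted(visits_by_keyword, key=visits_by_keyword.get, reverse=True)
    head, tail = ranked[:head_size], ranked[head_size:]

    def summarize(keywords):
        visits = sum(visits_by_keyword[k] for k in keywords)
        conversions = sum(conversions_by_keyword.get(k, 0) for k in keywords)
        rate = conversions / visits if visits else 0.0
        return {"keywords": len(keywords), "visits": visits,
                "conversions": conversions, "conversion_rate": rate}

    return {"head": summarize(head), "tail": summarize(tail)}
```

Run against a real log, a report like this would show whether the tail's combined visits and conversion rate really exceed the head's, which is the evidence an SEO would need to justify a long tail keyword strategy to a client focused on the top ten or twenty terms.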
About BKV

Headquartered in Atlanta and founded in 1981, BKV is focused on direct response strategy and implementation. The largest full-service direct response agency in the Southeast, BKV employs a staff of 100+ experts in creative, account service, direct response television, media, interactive, search engine marketing, search engine optimization, production, and database management. The company employs proven techniques for driving response while inventing bold new strategies that deliver record-breaking results. Currently BKV boasts such SEO clients as Equifax, Harrah's Entertainment, Rooms To Go, and AfterHours Formalwear. For more information, please call Jamie Turner at 404.233.0332 or visit the BKV website.

References

1 Doctorow, Cory. "Metacrap: Putting the Torch to Seven Straw-Men of the Meta-Utopia." Version 1.3, 26 August 2001. Doctorow is co-editor of the award-winning group blog Boing Boing, a contributing writer to Wired magazine, a frequent speaker on copyright issues, and a teacher at the University of Southern California.
2 Brin, Sergey, and Page, Larry. "The Anatomy of a Large-Scale Hypertextual Web Search Engine." Proceedings of the Seventh International Conference on World Wide Web, 1998, pp. 107-117. Brin and Page are the co-founders of Google; they co-authored this paper while graduate students at Stanford University.
3 Tapscott, Don, and Williams, Anthony D. Wikinomics: How Mass Collaboration Changes Everything. London: Portfolio, 2006, p. 19.
4 Tapscott, Don, and Williams, Anthony D., p. 68.
5 Tapscott, Don, and Williams, Anthony D., p. 75.
6 Golder, Scott A., and Huberman, Bernardo A. "The Structure of Collaborative Tagging Systems." Information Dynamics Lab, HP Labs, Aug. 18, 2005. Cornell University Library. Golder is a researcher at the Information Dynamics Lab at Hewlett-Packard Laboratories; Huberman is a Senior HP Fellow and Director of the Information Dynamics Lab.
7 Golder, Scott A., and Huberman, Bernardo A., Aug. 18, 2005.
8 Loosley, Chris.
Rich Internet Applications: Design, Measurement, and Management Challenges. Keynote Systems, 2006.
9 Wilson, Nick. "2007 Guide to Linkbaiting: The Year of Widgetbait?" January 18, 2007.
10 Wilson, Nick, January 18, 2007.
11 Bhargava, Rohit. "5 Rules of Social Media Optimization (SMO)." Influential Interactive Marketing blog, August 10, 2006.
12 Datta, Aman (Vice President, Roundarch). "Making SEM Work in an RIA World." September 18, 2006.
13 Garner, Rob. "SEO 2.0 and the Pageless Web: The RIA Search Conundrum." December 20, 2006. Garner is a Senior Strategic Planner at iCrossing.
14 Sullivan, Danny. "Google Loses Tabs in New Look, Gains Web Alerts & Personalized Search Result." March 30, 2004.
15 Shah, Avni (My Search History team). "From Lost to Found." Official Google Blog, April 20, 2005.
16 Hitwise, February 24, 2007.
17 Tancer, Bill (Hitwise General Manager). "Google, Yahoo!, and MSN: Property Size-up." Hitwise Intelligence Analyst Weblog, May 19, 2006.
18 Lovett, John (Research Analyst). "Web Site Search: Revenue in the Results." Aberdeen Group, February 2007. Aberdeen Group is a leading research group focused on fact-based analysis of global technology-driven e-commerce, based in Boston, MA.
19 Lovett, John, February 2007.