3. Embracing the beast
• Get all of your content on a public-facing server.
• Learn about Google Search.
• Learn about search engine optimization.
• Exploit other search engines.
• Create a custom Google Search.
• How this applies to personal visibility
4. Getting content on the public web
• Good to go
• Web sites (HTML, CSS, JavaScript)
• Web-based Help or Lessons (?? frames)
• Hybrid formats (XHTML)
• PDFs (?? some problems)
• Problems
• Microsoft CHMs, HLPs
• Apple Help, Oracle Help, JavaHelp
• Flash, eLearning (text yes, images no)
• Almost any proprietary mark-up
13. Mirror content
• Periodic conversions
• Maybe “almost the same” is good enough
• Single-sourcing to multiple output formats
• Requires regular FTP management
• Content management system is helpful
• DITA framework?
• Migrate to web-based standards for all content
• You can still have both local and server-based content.
14. Security issues and questions
• Content behind a firewall
• Privileged content requiring login
• Realize that your users are using Google
anyway. What are they finding?
• What really needs to be secure?
• Does it all need to be secure?
• Develop your own Google examples to show to decision-makers.
15. Dealing with loss of context
• Use a conditional text authoring tool to generate a separate version.
• Provide navigation elements on all pages.
• Add branding information.
• Include update date and version info.
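The bullets above can be made concrete with a small page fragment. This is a hypothetical sketch, not from the deck: the product name, file names, and version string are invented placeholders for a footer that every generated topic page could carry so that a reader arriving from a search hit still gets branding, navigation, and version context.

```html
<!-- Hypothetical footer for every generated topic page.
     "Acme Writer", the link targets, and the version/date
     are placeholders; substitute your own. -->
<div class="topic-footer">
  <a href="index.html">Acme Writer Help home</a> |
  <a href="toc.html">Contents</a> |
  <a href="search.html">Search</a>
  <p>Acme Writer 4.2 Help, updated 2012-08-01</p>
</div>
```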
16. How does Google Search work?
• Indexes the web with brute force and smart algorithms.
• Secret sauce for ranking content
• Search query is matched against the index.
• Results are returned according to priority.
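The index-then-match model on this slide can be sketched in a few lines. This is an illustration only, with invented page names; Google's real crawling, indexing, and ranking "secret sauce" are vastly more sophisticated, but the core shape — build an inverted index, then intersect the postings for the query words — is the same.

```python
# Minimal sketch of the index-then-match model described above.
# Page ids and text are hypothetical examples.
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page ids that contain it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

def search(index, query):
    """Return page ids containing every query word (AND match)."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

pages = {
    "help/print.html": "how to print a document",
    "help/save.html": "how to save a document",
}
index = build_index(pages)
print(search(index, "print document"))  # {'help/print.html'}
```

A real engine would then order those results by a ranking function — the "priority" the slide mentions — rather than returning an unordered set.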
17. High ranking
• Fresh content, real content
• Nomenclature
• Links to your content
• Webmaster guidelines
• http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769
• http://www.vaughns-1-pagers.com/internet/google-ranking-factors.htm
• Google ads?
18. Search engine optimization
• robots.txt
• sitemap.xml
• Registering your site
• Google.com
• Metadata
• Community building
• Test and rework
• See supplemental handout
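The first two bullets above are plain files at the root of your site. These are hypothetical examples with an invented domain and paths, just to show the shape of each file:

```
# robots.txt -- hypothetical example; adjust paths to your own site.
# Allows crawling everywhere except a protected directory, and points
# spiders at the sitemap.
User-agent: *
Disallow: /internal/
Sitemap: https://www.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml -- hypothetical example; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/help/print.html</loc>
    <lastmod>2012-08-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/help/save.html</loc>
    <lastmod>2012-08-01</lastmod>
  </url>
</urlset>
```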
19. Metadata
• Title
• Meta Description
• Meta Keywords
• Yours and those of others
• http://freekeywords.wordtracker.com/index.html
• Content blocks
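The three metadata elements listed above live in the page's `<head>`. A hypothetical example — the product name and wording are invented, not from the deck:

```html
<!-- Hypothetical <head> fragment showing title, description, and keywords -->
<head>
  <title>Printing a Document - Acme Writer Help</title>
  <meta name="description"
        content="Step-by-step instructions for printing a document from
                 Acme Writer, including printer setup and page options.">
  <meta name="keywords"
        content="print, printing, printer setup, page setup, Acme Writer">
</head>
```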
20. SEO Guidelines
• Proper Title Tags
• A well-constructed title tag contains the main keyword for the page, followed by a brief description of the page content. Keep it under 65 characters and avoid stop words such as: a, if, the, then, and, an, to. Limit the title tag to alphanumeric characters, hyphens, and commas.
• Proper Description Tags
• A good description tag describes the page's content and persuades search engine users to visit your web site. It should be between 25 and 35 words long.
• Proper Keywords Tags
• Your keywords meta tag should contain 5-10 keywords or keyword phrases that also appear in the page content.
• Proper Heading Tags
• Each page of your site should use at least the H1 heading tag for the search engines that examine it when crawling your site.
• Page Content
• Pages should have between 300 and 700 words of descriptive content that contains the keywords specified for the page.
• Proper Navigation
• Each page of your site should contain links to every other page so search engine spiders can find every page. This is critical for proper indexing and page-rank distribution across your site.
• Proper Sitemap
• Use two site maps for your website: an XML version and a static version. The XML version can be created with Search Engine Visibility's site map tool. The static version should sit on a static HTML page and contain links to every other page.
• Controlled Crawling
• Make sure search engine spiders find your robots.txt file, which guides spiders to the pages and directories you want crawled and denies entry to protected areas of your site.
• Duplicate Content/Tags
• Because search engines treat a web site as a group of pages rather than a single entity, each page on your site should be unique, with tags and content that differ from every other page. Doing so increases the number of pages that will rank.
• Word Density
• Pages should contain 300 to 700 words of unique and descriptive content. A page's meta keywords should also be the words that occur most frequently on the page.
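Guidelines like the title-tag rules above lend themselves to automated checking. A minimal sketch: the 65-character limit and the stop-word list come from the slide, while the function name and the exact stop-word set used here are our own illustration.

```python
# Sketch of an automated check for the title-tag guidelines above.
# STOP_WORDS reflects the examples on the slide; extend as needed.
STOP_WORDS = {"a", "if", "the", "then", "and", "an", "to"}

def check_title(title):
    """Return a list of guideline violations for a <title> string."""
    problems = []
    if len(title) >= 65:
        problems.append("title is 65 characters or longer")
    words = title.lower().replace(",", " ").replace("-", " ").split()
    used_stops = STOP_WORDS.intersection(words)
    if used_stops:
        problems.append("contains stop words: " + ", ".join(sorted(used_stops)))
    return problems

print(check_title("Printing a Document, Acme Writer Help"))
# ['contains stop words: a']
```

The same pattern extends to the other rules — word counts for the 300-700-word content guideline, or a length check for the description tag.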
21. Other search engines
• Yahoo
• Bing
• Alexa
• Yahoo Directory
• Open Directory Project
• 700+ niche, regional, manual
• searchenginewatch.com
Each engine has a separate registration procedure.
22. Link testing
• Report generation
• Site analysis
• Submit links to relevant sites.
• Give links to users via email.
• Site analysis also points to hotspots.
25. Create a “Google Custom Search”
• Register your site.
• Add code to your site for the query box.
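The embed code Google gives you after registration looks roughly like the following. This is a sketch of the general shape; the `cx` value shown is a placeholder for the search engine ID you receive when you register, and the exact snippet Google provides may differ.

```html
<!-- Hypothetical Custom Search embed; replace YOUR_SEARCH_ENGINE_ID
     with the cx value from your registration -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_SEARCH_ENGINE_ID"></script>
<div class="gcse-search"></div>
```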
26. Don’t forget to consider mobile
• "The mobile device will be the primary
connection tool to the internet for most people
in the world in 2020." – Pew Research
27. Resources
• Resource Directory: welinske.com/resources/
• Blog: welinske.com/blog/
• Courses & Presentations: WritersUA.com
Editor's Notes
If you don’t read anything else in this section of the article, read this: the key element in making your information findable is getting your content onto a public-facing server. Everything else about search engine optimization is exactly that: optimization. The core value comes simply from making your information visible to Google. To do that, you need to get all of your content into formats that Google understands and onto a server that Google search bots can index.
If you have been developing your content using standard web technologies like HTML and CSS, then you are golden: Google knows how to process your information and add it to its index. Likewise, XHTML, XML, and XSLT are all easily consumed by Google. Other proprietary formats can create problems. Google can index a variety of file formats, but with differing levels of accuracy; for anything other than standard web formats, it is incumbent upon you to see whether Google is making sense of the files you post. The Adobe PDF format used to be a problem for Google, but Google has done a lot of work to minimize those problems, and for the most part it is able to consume PDF and make that content visible through search. Where things sometimes break down is whether the link displayed by Google takes the user directly to the point in the PDF that contains the content they are interested in.
It is easy to find out the googleability of your content. Wait about a week after you have posted your content to a public-facing server, then start googling key phrases unique to your documentation. See whether Google has found your content and where it places on the page of search results. If your content is at the top of the hit list, congratulations: your information will be easily found by your customers. If other pages are ahead of yours, you may have a big problem — those are the pages users are likely to view first, and if their information is incorrect, your customers will be frustrated with your product, not with the info page they found. This points out an important consideration: not getting your content on the web and visible to Google means that others will, by default, become the authority on how to use your products.
With almost any software application you support, it is likely that at least a few of your users are posting their own information about how to work with that application. The larger your user base, the more info you can expect to be posted. Some of this information may be accurate; some of it may not be. It is important to understand that by deciding not to post your Help content to the web, you are deciding to let other people be the authorities on your products. People are going to use Google to search for answers about your software whether you want them to or not. What do you want them to see when they do those searches? It should be the information you write, which you know is correct and presents the right approach to using your software.
Unless your content is already all on public-facing servers, you are going to need to make web-based versions of it. This almost certainly requires some form of single-source authoring strategy. If you have a large collection of context-sensitive Help and manuals, you will need to make what I call mirror versions. The Microsoft example illustrates this: each of the context-sensitive Help topics for Outlook is duplicated as a web page. The wrapper around the content and the formatting are different, but the text is the same. The Adobe Tech Comm Suite is really good at supporting this type of authoring: using conditional text and automated output targets means you can consistently and accurately deliver different content for different viewers.