Part 5: Steps to take after submitting to search engines
All Web sites should be thoroughly tested with a site maintenance tool to catch errors in operation before customers are brought to the site. HTML errors can hinder a search engine spider's ability to index a site; they can also keep an engine from reading a page, or cause a page to be viewed in a manner different from how it was intended. The Webmaster should use NetMechanic's HTML Toolbox or another site maintenance tool to avoid potential visitor disasters due to site errors.
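A full maintenance tool checks far more than markup, but the kind of HTML error that trips up a spider can be sketched in a few lines. The checker below is a minimal, hypothetical example that only flags unclosed or mismatched tags; it is not a substitute for a tool like HTML Toolbox.

```python
# Minimal sketch of an HTML tag-balance check. Real site maintenance
# tools (e.g. NetMechanic's HTML Toolbox) do far more; this only
# catches unclosed or mismatched tags, one common spider-blocking error.
from html.parser import HTMLParser

# Tags that take no closing tag and should not go on the stack
VOID_TAGS = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # human-readable error messages

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def check(self, html):
        self.feed(html)
        # anything still open at the end was never closed
        for tag in self.stack:
            self.errors.append(f"unclosed <{tag}>")
        return self.errors
```

Running `TagBalanceChecker().check(page_source)` over each page before submission gives an empty list for balanced markup and a list of problems otherwise.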
Selecting a search engine submission service requires careful thought and important decisions. Using an auto submission service is a good place to begin. Most search engines, like AltaVista, HotBot and InfoSeek, automatically spider a site, index it and, hopefully, add it to their search database without any human involvement. Other engines, like Yahoo, rely entirely on human review and, for many reasons, are best submitted to individually.
Understanding the waiting periods
A variety of waiting periods must be endured with each search engine before there is even a hope of being listed; a typical wait is four to six weeks.
To improve site rankings and increase understanding of the listing process, there are many tasks that can be done on a regular or semi-regular basis. Optimizing rankings within the search engines also helps ensure that a site attracts the right traffic.
Crunching and examining log files
Data contained in log files is an excellent resource for identifying which engines are sending the majority of traffic to a site. It can also show which keywords or gateway pages are generating the strongest traffic, and what those visitors do once they enter the site.
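The per-engine tally described above can be sketched with a short script. The engine list and the combined log format are assumptions here; both should be adjusted to the server actually in use.

```python
# Hedged sketch: count which search engines send traffic, using the
# referrer field of combined-format access log lines. The engine names
# below are assumptions taken from the engines discussed in this guide.
import re
from collections import Counter

ENGINES = ("altavista", "hotbot", "infoseek", "yahoo")

# Combined log format tail: "request" status bytes "referrer" "user-agent"
REFERRER_RE = re.compile(r'"[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

def engine_counts(log_lines):
    """Tally log lines whose referrer mentions a known search engine."""
    counts = Counter()
    for line in log_lines:
        m = REFERRER_RE.search(line)
        if not m:
            continue
        ref = m.group("ref").lower()
        for engine in ENGINES:
            if engine in ref:
                counts[engine] += 1
    return counts
```

Extending the referrer check to capture the query string would also reveal which keywords are generating the strongest traffic.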
Searching the Search Engines
Search the search engines to identify where the site has achieved its highest rankings and which keywords are generating them.
Different search engines use different rules to rank pages. Individual gateway pages should be created based on the knowledge and interpretation of what each search engine is using to determine top rankings. Several pages can be tested out on one or more engines and the pages that have the most success can be kept, while the unsuccessful pages can be dumped or revised to achieve a higher ranking.
Periodic update on how search engines work
Each search engine uses different rules to determine how well a Web page matches a particular query. As a result, building a single page that gets a good score in all the major engines is just about impossible. Learning how each engine ranks pages is also hard, since the engines often keep this information as a closely guarded secret. However, with a little patience, some experimentation and reverse engineering, the way that many of the search engines work can be discovered.
Resubmitting the site
For engines that reject a site or don’t list it high enough, it is strongly recommended that more information is learned about the engine’s criteria before resubmitting. This information should then be incorporated into gateway pages or key word revisions in order to have greater success with subsequent submissions.
Fine-tune the page (or pages) by making adjustments to TITLE tags and META tags; then, after resubmitting the site, track the results to learn more about the engine's criteria and which adjustments made an impact on the rankings.
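When testing TITLE and META variations across several gateway pages, it helps to generate the head section from one place so each variant differs only in the values being tested. The helper below is a hypothetical sketch; the tag set reflects the tags discussed here, not any engine's actual ranking rules.

```python
# Hypothetical helper: build a consistent <head> section so TITLE and
# META variations across gateway pages can be tested systematically.
def head_section(title, description, keywords):
    """Return a <head> block with the given title and META values."""
    return (
        "<head>\n"
        f"<title>{title}</title>\n"
        f'<meta name="description" content="{description}">\n'
        f'<meta name="keywords" content="{", ".join(keywords)}">\n'
        "</head>"
    )
```

Each resubmitted variant can then be logged alongside its eventual ranking to see which adjustment made the difference.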
Checking log files for traffic being directed to erroneous pages on the site
There may be external links pointing to pages that don't exist on your site or pages that have been removed from your site. Don't dump these pages or remove them from the search engine, as most people do when they redesign a site. Any page with a high ranking is of value. If a page is bringing traffic to a site, leave that page in the search engine's index; don't change it, but rather redirect its traffic to valid pages on the site.
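One simple way to keep that ranked traffic is a table mapping old, still-ranked URLs to their current equivalents, answered with a permanent (301) redirect. The paths below are hypothetical examples.

```python
# Sketch of redirecting traffic from removed pages to live ones.
# The old/new paths here are hypothetical placeholders.
REDIRECTS = {
    "/old-products.html": "/products/index.html",
    "/widgets.htm": "/catalog/widgets.html",
}

def resolve(path):
    """Return (HTTP status, target) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # moved permanently
    return 404, None                 # genuinely missing
```

The same mapping can be expressed as server configuration (for example, Redirect directives); the point is that a visitor arriving from a search engine on a retired URL still lands on a valid page.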