Do not overlook the fundamentals of good page and site structure: build your pages so that they are both user-friendly and search-engine friendly. This not only ensures proper SEO for your site but also helps your visitors navigate it.

The most important things to remember:

Search engines need content; content is the major source of business and revenue for them. But if (a.) the website cannot be traversed by automated, text-reading spiders, and/or (b.) the pages lack distinguishing features such as unique titles and descriptions, then a barrier stands between the spiders and your content, limiting their ability to index the pages and the site.

A few details about page and site structure:

1. Every page must have a unique HTML TITLE: Use a descriptive title that accurately summarizes the content of that specific page. The title provides context for people who first see it on Google or another engine's results page. For instance, on a political commentary site, a generic title tells a searcher nothing about yet another article on President Bush; a title that summarizes the article does. A good title also helps Google index and catalogue the content correctly, producing better matches between your pages and users' searches.
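As a sketch of the idea (the article title and site name here are hypothetical, not from any real site), a specific, descriptive title looks like this:

```html
<head>
  <!-- Specific to this one page: names the topic, not just the site -->
  <title>Bush Veto of Stem-Cell Bill: Analysis and Reaction - Example Political Review</title>
</head>
```

A searcher scanning a Google results page can tell from that line alone what the article covers.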

2. Every page must have a unique summary of its content in the META description tag.

If a page's description is missing, the search engine will try to work out a summary on its own, in one of two ways:

If no META description tag is provided but the content is clearly laid out, the search engine will try to determine the topic of the page and pick a bit of text from the paragraph or sentence it judges most relevant. However, computers rarely identify the topic correctly.

If the engine cannot determine the topic from the page itself, it will check whether the site has a DMOZ listing and, if so, use the summary of the website written by DMOZ's human editors. It is far better to write a few sentences yourself and place them in the META DESCRIPTION tag.

The META description appears as the "teaser" text visible under links to your pages or articles on Google's and other engines' results pages. Make sure the description adequately describes the page: it must give a clear idea of the content to visitors who are not yet on your website and have no other context for the page or article.
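A minimal sketch of the tag (the wording here is a hypothetical example, continuing the political-article scenario above):

```html
<head>
  <!-- One to three handwritten sentences; engines show this as the "teaser"
       under your link on the results page -->
  <meta name="description"
        content="Analysis of the President's veto of the stem-cell research bill,
                 with reaction from both parties and what it means for Congress.">
</head>
```

Each page on the site should get its own description, not a copy of the site-wide one.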

When a link to one of your webpages appears in the results of a major search engine, make sure the title and description are interesting and accurate. They should be compelling enough that your link looks better than the other listings, often twenty or so, on the results page the searcher is scanning.

Note: Do not write titles and descriptions that are unrelated to the actual content of the page. This is one of the mistakes that can get a website banned from the major search engines.

3. Use good old HTML hierarchical conventions. The H1 tag should be the first, main visible title, followed by normal paragraph text. Use H2, H3, and so on for the subheaders.
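The hierarchy described above, sketched with placeholder headings:

```html
<body>
  <h1>Main Page Title</h1>        <!-- one H1: the main visible title -->
  <p>Introductory paragraph text.</p>

  <h2>First Subheader</h2>        <!-- H2 for major sections -->
  <p>More paragraph text.</p>

  <h3>A Sub-subheader</h3>        <!-- H3 for subsections, and so on down -->
  <p>Further detail.</p>
</body>
```

Spiders read this outline much the way a human skims headings, so keeping the levels in order helps them grasp the page's structure.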

4. Remove META KEYWORDS tags. The search engines have ignored them for a long time, so KEYWORDS tags add nothing to your favourable scores. They can, however, be used as a factor in negative ratings. It is best to remove them and be safe.
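For reference, the tag in question looks like this (the keyword list is an arbitrary example); if it appears in your pages' head section, delete it:

```html
<!-- Ignored by modern engines, and potentially a negative signal: remove -->
<meta name="keywords" content="politics, commentary, news, opinion">
```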

5. Dynamic drop-down menus, fancy Flash animations, and JavaScript- or form-based navigation cannot be crawled by spiders, so the links and content behind them will not be read or found by search engines.

Search engines usually follow only text (standard HREF) links and do not read inside JavaScript or DHTML menu scripts. CSS visible/hidden menus can be used instead; these load all links and text into the source code, where spiders can read them easily. It is also advisable to add a "Site Map" link to the webpage header and footer, pointing to a page that contains simple HTML HREF links to every single page on the website.
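A minimal sketch of such a CSS menu (class names and URLs are hypothetical): the links are ordinary HREFs present in the source, and the stylesheet only controls what human visitors see on hover.

```html
<style>
  /* Submenu is hidden visually, but its links remain in the HTML source
     where spiders can follow them */
  .menu ul { display: none; }
  .menu:hover ul { display: block; }
</style>

<div class="menu">
  <a href="/articles/">Articles</a>
  <ul>
    <li><a href="/articles/politics/">Politics</a></li>
    <li><a href="/articles/economy/">Economy</a></li>
  </ul>
</div>

<!-- Plain-text fallback in header/footer, as recommended above -->
<p><a href="/sitemap.html">Site Map</a></p>
```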

6. A "Site Map" must be provided (either in Google's XML format or in your own format), so Google and the others can conveniently find all pages on the website. This is not a guarantee of indexing, however: if Google and the other engines cannot determine the topics or descriptions of the webpages, they still will not catalogue or index them.
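Google's XML format (the sitemaps.org protocol) is essentially a list of URLs; a minimal file, with hypothetical example.com addresses, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-06-01</lastmod>   <!-- optional: last modification date -->
  </url>
  <url>
    <loc>http://www.example.com/articles/stem-cell-veto.html</loc>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml at the site root and submitted to the engines' webmaster tools.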

7. Dynamic URLs will not cause any problem for any search engine -- **UNLESS** (a.) the URLs and site are designed in a way that leads crawlers into "spider traps" (search engine spiders are often caught in infinitely looping links within a website, such as calendar links that lead to an endless series of future and past months; in that case the spider will simply abandon the site), or

(b.) the data appended to dynamic URLs (such as session IDs or datestamps) causes duplicate-content problems, where the same page is reachable through several different URLs.

These measures will make your site search-engine friendly, and thus make it easier for you to do SEO on your site.
