WHAT IS SEO
SEO stands for "Search Engine Optimization". SEO refers to bringing your website onto the first page of Google for a searched term.
Search Engine Optimization is the process of increasing the number of visitors to a website by ranking high in the search results of a search engine. The higher a website ranks in the results of a search, the greater the chance that site will be visited by a user. It is common practice for internet users not to click through pages and pages of search results, so where a site ranks in a search is essential for directing more traffic towards the site. SEO helps to ensure that a site is accessible to a search engine and improves the chances that the site will be found by the search engine.
Let me explain properly:
Suppose you and I both have our own websites on "Photography", and when the word "Photography" is searched in Google, my website appears on the first page while yours appears on the 10th page.
In this case I will get a huge amount of traffic on my website and you will not.
So to bring our websites onto the first page of Google, we perform SEO, i.e. Search Engine Optimization, on them.
STEPS OF SEO
STEP 1:- INITIAL ANALYSIS OF SEO
The first step of SEO should be an initial analysis, which most companies tend to overlook. The questions listed below will help you analyze the market before making any decisions.
Make Sure There Is a Market
Emphasize the Unique Selling Points of your products to see whether people will respond and take action on them.
Keep an Eye on Your Competitors
Find out what products and services your competitors offer. Are their products better distributed and easier for consumers to obtain? Does their product do the same job at less cost?
Look into Your Target Audience
Try to identify who your target audience is as early as you can. Think about whom your site will be aimed at, because different target groups will require different approaches.
Find Out How to Target Them
Social media is a great way of driving traffic to your site but the levels of success may vary depending on who your site is aimed at.
Integrate Marketing Strategy
Once you have gathered all this information, start writing. An integrated marketing strategy will help you stay focused on where you are going.
STEP 2:-Keyword analysis and keyword density
Keywords are a very important factor in deciding your website's popularity. When search engine spiders crawl through your website, they look to match your keywords with users' search strings to decide the site's rankings. For this reason, it is very important to optimize your website with the most appropriate keywords to achieve the desired rankings in the search engines. Finding the correct keywords is not an easy task; you should research which keywords to use so that they closely match your website's content.
As for the ideal density of keywords on a webpage, the recommended keyword density is between 3-7% for your major keywords and 1-2% for minor ones.
You should use your main keyword in your website's title tag, and it should also be placed first in the keyword tag. Using your chosen word within <h1>, <h2> and <h3> tags is likely to give you a helping hand in ranking highly.
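As a rough illustration, keyword density is simply occurrences of the keyword divided by the total word count, times 100. This is a minimal sketch; the sample page text is invented:

```python
import re

def keyword_density(text, keyword):
    """Return the density of `keyword` in `text` as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_parts = keyword.lower().split()
    n = len(kw_parts)
    # Count occurrences of the (possibly multi-word) keyword.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_parts)
    # Density = keyword occurrences / total words * 100
    return hits * 100.0 / len(words)

page = ("Photography tips for beginners. Good photography needs light, "
        "and photography practice makes perfect.")
print(round(keyword_density(page, "photography"), 1))  # → 23.1
```

This toy page is heavily over-optimized; per the guideline above you would aim for 3-7% on a major keyword.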
STEP 3:-Competitor analysis
What is a Competitor Analysis?
A competitor analysis is a very effective method of deconstructing your competitors' online marketing strategy to discover how they are doing so well.
Observe your competitors. Find your competitors' websites, then analyze and learn from their tactics, the keywords they use and the techniques they apply. This analysis will show you what is working and what is not, which will help your SEO process.
The following steps are used in competitor analysis.
Step 1. Find the Ideal Sites to Compare
Type the ideal keyphrase you want to rank for into both Google and Yahoo and take note of the top 10 websites on each; a spreadsheet makes this easy. Next, choose up to three sites that have a top ranking in both search engines. Select websites that provide similar services to yours; avoid Goliath sites, such as Amazon, Wikipedia, and eBay that tend to achieve rankings by reason of sheer girth.
Step 2. Review Sites and Search for Patterns
With the three sites in hand, review the page that is ranking from each site and pay special attention to locations where the keyphrase is being used. Record your findings and compare the results to see any patterns of keyphrase use across all three competitors. Ultimately your goal will be to find a pattern of successful tactics that you can emulate on your own website. Pay close attention to the following areas:
Text Links. Observe how many times the keyphrase is found in a text link within the page.
Description Tag and Title Tag. View the source code to determine the structure of both the Description and Title tags. Pay special attention to how often the keyphrase is utilized.
Heading Tags (H1, H2, H3). Search through the ranked page for uses of heading tags. Successful websites often utilize heading tags to emphasize the keyphrases on the page.
Navigation. Does the page navigation use the keyphrase? You can often increase page relevance by adding the keyphrase in the wording of your links.
CSS Layout. If the site was designed using CSS (Cascading Style Sheets), you may find the competitor placed keyphrase rich content at the top of the source code even though that content actually appears elsewhere on the page when it is viewed in a browser -- possibly through the use of "CSS positioning" (sometimes called "absolute positioning"). I have often noticed that optimized CSS sites can have a significant edge over non-CSS websites.
Keep looking. There are too many potential tactics to mention, but if you take note where the keyphrase is utilized on the page, you will catch many of the tactics that can increase your return on investment. However, if you find tactics on competitor websites that you believe may be spam, don't blindly copy them. Be careful!
Step 3. Apply the Successful Strategies to Your Website
Once you have determined which tactics are working across each of your competitors' ranking pages, it is time to apply these tactics to your website wherever sensible and possible within your site's technology, design, and content parameters.
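The pattern-spotting in Step 2 can be sketched in code: count where a keyphrase appears in a competitor page's title, headings and link text. This is a minimal illustration using Python's standard html.parser; the sample page and the keyphrase "blue widgets" are hypothetical:

```python
from html.parser import HTMLParser

class KeyphraseAudit(HTMLParser):
    """Count how often a keyphrase appears inside selected tags of a page."""
    TRACKED = ("title", "h1", "h2", "h3", "a")

    def __init__(self, keyphrase):
        super().__init__()
        self.keyphrase = keyphrase.lower()
        self.stack = []                       # currently open tracked tags
        self.counts = {tag: 0 for tag in self.TRACKED}

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        # Attribute text inside a tracked tag to that tag's count.
        if self.stack:
            self.counts[self.stack[-1]] += data.lower().count(self.keyphrase)

page = """<html><head><title>Blue Widgets | Widget Shop</title></head>
<body><h1>Blue Widgets</h1><p>We sell widgets.</p>
<a href="/blue-widgets">blue widgets on sale</a></body></html>"""

audit = KeyphraseAudit("blue widgets")
audit.feed(page)
print(audit.counts)
```

Running this over each competitor's ranking page and comparing the counts makes the keyphrase patterns easy to see side by side.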
STEP 4:-Site map
Site maps are at the heart of Search Engine Friendly Design. They make it very quick and easy for the search engines to find and index every page on your web site by putting links to every page in one spot.
A site map is a separate HTML page that acts as a directory of the pages on your website. Each page listed in the site map is hyperlinked.
Sitemaps can be viewed as a luxury, but for new sites they are critical. It can take the search engines months to crawl a new site, and they may never crawl the entire site; adding a sitemap can speed this process up. It also helps when adding new pages or sections, since the search engines have easy access to all pages through your sitemap.
It allows easier indexing of your site by the search engines.
It provides PageRank or link popularity to all pages it links to.
Sitemaps help with usability and site navigation.
You need to understand that the term sitemap usually refers to two very different things:
XML Sitemap
HTML Sitemap
In essence, they are the same thing: they contain the same data, structured for different purposes. The XML sitemap is the one you submit to Google through Google Webmaster Tools, while the HTML sitemap is meant to be used by actual users, not just bots.
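A minimal XML sitemap of the kind submitted through Google Webmaster Tools can be generated with a short script. This is a sketch using Python's standard library; the example.com URLs are placeholders:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (one <url><loc> entry per page) as a string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/",
                     "http://www.example.com/photography"])
print(xml)
```

The real sitemap protocol also allows optional fields such as lastmod and changefreq per URL; this sketch keeps only the required loc element.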
STEP 4/1:-RSS Feed Inclusion
RSS is short for "Really Simple Syndication". RSS is a syndication format for web content: simply put, a format for receiving regularly changing web content. For SEO-friendly website development it is necessary to update your site's content regularly.
RSS feeds are regularly updated content that you can add to your web pages for free! RSS feeds deliver fresh articles, breaking news and popular headlines on any topic directly to your webpage without any effort on your part. RSS feeds are published by anyone who wants to share their content with others, and feeds from various popular publishers, including big names like Yahoo News, are available free on the internet.
Various websites show a small icon with the acronym RSS to let you know that an RSS feed is available. It's very simple: adding just one line of code to your website will keep it updating itself forever.
STEP 5:- Search engine
A web search engine is designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list of results and are often called hits. The information may consist of web pages, images, information and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines operate algorithmically or are a mixture of algorithmic and human input. When a person searches for something online, it requires the search engines to scour their corpus of billions of documents and do two things – first, return only those results that are relevant or useful to the searcher’s query, and second, rank those results in order of perceived value (or importance). It is both “relevance” and “importance” that the process of search engine optimization is meant to influence.
STEP 5/1:-Directory submission
One simple way to obtain back links is to submit your link to directories. Online directories exist for the sole purpose of providing links to web users to various sites categorized under relevant topics. “Directory submission” is a website optimization strategy that no website owner should ignore. Here’s a quick summary of the benefits of submitting your links to web directories.
Directory submissions give you one-way links
Search engines give importance to the number of inbound links to your site and from web directories you get one-way inbound links. Web directories, while including your links, don’t expect you to put their link on your website. Search engines discourage link-exchange tactics but they consider one-way inbound links with high regard.
Directory submissions allow the anchor text of your choice
When you submit your links to web directories they allow you to enter a site title (site title is different from the URL) that can contain your keywords. This generates SEO anchor text for you and this helps you improve your search engine rankings.
Directory submissions are mostly free
Yes, you don't have to pay for most web directory submissions. Some highly professional web directories charge a premium for adding your links, but their number is minuscule compared to free directories.
Directory submissions send you targeted traffic
The organization of web directories helps people find the right links under the right categories. Even if you inadvertently submit your link under a wrong category, there is a good chance that a human editor will place it under the correct one.
STEP 6:-Social bookmarking
Social bookmarking is a method for internet users to share, organize, search, and manage bookmarks of web resources. In a social bookmarking system, users save links to web pages that they want to remember and/or share. These bookmarks are usually public, but they can also be saved privately, shared only with specified people or groups, shared only inside certain networks, or some other combination of public and private. Social bookmarking involves saving bookmarks (web addresses) to a public website such as Digg or Del.icio.us so you can access them from any computer connected to the web. Your favorite bookmarks are also available for others to view and follow, hence the social aspect. Wikipedia defines social bookmarking as a method for Internet users to store, organize, search, and manage bookmarks of web pages on the Internet. If you have ever posted a link to one of your blog articles on Digg, you have social bookmarked it.
Social Bookmarking has a number of SEO advantages:
You may get a backlink when you bookmark a page on your site (I say “may” because some bookmarking sites use “no follow” in their backlinks).
Bookmarking a new page or a new blog post on your site can help get that page indexed in the search engines quickly.
Most social bookmarking sites allow you to use tags. You can list your target keywords in these tags and this can help your ranking for these keywords.
STEP 7:- Blogs
A blog originally was a personal website meant to be like a diary or journal. If you are familiar with Facebook or MySpace, these sites and their user pages are a derivative of blogs. The word blog is the shortened version of the word weblog. A person would usually create a blog as a hobby to share their information and experience on a particular subject. The blogs are designed to be very easy to add new entries to, so the information on blogs is updated much more frequently than a traditional site. As the blogger adds entries to the blog, the viewers can add comments to the entries, so the blog becomes an interactive site.
What is a blog? A weblog is a hierarchy of text, images, media objects and data, arranged chronologically, that can be viewed in an HTML browser.
The blogging concept emerged for people who do not have any web design or web development knowledge but want to say something to the world.
In short, blogging was made for newcomers without programming skills,
and because blogging is easy it became quite popular.
The main free blogging services are blogspot.com, wordpress.com and myallservices.com, while a paid blogging service is typepad.com, among others.
STEP 7/1:- Press release creation
Press release creation is designed to offer the well-planned and effective communication that is so essential to the success of your online business venture. We understand the importance of content writing, editing and formatting to your business's web presence. Anuva creates and designs press releases and other important communication material, enabling you to save precious time and money and concentrate on increasing your business prospects.
STEP 8:-Articles submission
An article marketing assignment involves gathering information on a wide variety of topics, compiling that information and submitting it to these online libraries. Article submission directories typically check for foul play using what are called article checkers: computer programs that scan the net for exact copies of similar articles. If a copy or copies are found, the submitted article is deemed plagiarized. Since plagiarized content is not acceptable, it is rejected; where plagiarized content does slip through, credit is given to the original article.
The main purpose behind article submission is to attract a large number of visitors (and links) to your website without incurring a great cost. Obviously, it is important to make sure that the articles you intend to submit are directly related or relevant to your business.
There are various benefits of article submission including advertising, marketing and publicity of your business on the World Wide Web. Two of the main benefits are:
Article submissions can enhance and improve the ranking of your online business in search engines by increasing the quantity of backlinks and PR.
Another benefit that article submission offers is that of establishing the particular website owner as an expert in their industry. By providing valuable detail and information within articles, it builds the trust of potential customers thereby giving them a reason to visit your website.
The search engine Powers That Be have decided that if other sites are linking to your site, it must be a winner; therefore, it deserves a boost in rankings (when all else is equal). If you think about it, this makes a lot of sense. People link to good sites, not bad ones.
STEP 9:-Link popularity
Link popularity is a familiar phrase to many Web masters & SEO specialists. Essentially, link popularity is a "determination" of how popular a website is based on the number of "incoming links" it has from other websites.
Link popularity plays an important role in the potential "visibility" of a website within search engine results pages (SERPs). Several major search engines require at least a few incoming links (back links) to a website; otherwise they will drop it from their servers.
STEP 10:- Search engine results page
A search engine results page (SERP), is the listing of web pages returned by a search engine in response to a keyword query. The results normally include a list of web pages with titles, a link to the page, and a short description showing where the Keywords have matched content within the page. A SERP may refer to a single page of links returned, or to the set of all links returned for a search query.
STRATEGIES OF SEO
What is Off Page SEO?
Off page SEO, or search engine optimisation, is doing things off site to improve your site's search engine rankings. The main thing you can do off site to increase your rankings is build up more links. More links will generally lead to better Google PageRank and better search engine rankings.
When you are trying to get more links you need to think about quality, not all links are created equal and links from low quality sites will have little or no impact on your rankings. The best types of links that you can get are from trusted sources such as universities, newspapers and even some of the top notch directories such as dmoz and Yahoo.
It is sometimes difficult to spot the good links; here are a few questions you should ask yourself when you are looking at sites or pages to get links from:
- Is this site or page relevant to what I do?
- Is this site linking out to any low quality off topic sites?
- Is this site or page going to send the right sort of traffic?
- Is this site or page ranking well in the search engines?
- Does this site have a lot of links from other websites?
Another important factor is the way that sites link to you. Sites that use the rel=nofollow attribute or a redirect to link to you will not help you. The search engines also look at the text that links to you: if you are trying to rank for the phrase blue widgets and you can get a site to link to you with that phrase in the text or alt attribute of the link, that will help you rank higher for that particular phrase.
Take these things into consideration, then check out our link building articles and you are on your way towards good off page SEO and good rankings in the search engines.
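The nofollow caveat above can be checked programmatically when evaluating a prospective linking page. This is a minimal sketch using Python's standard html.parser; the example.com links are invented:

```python
from html.parser import HTMLParser

class LinkChecker(HTMLParser):
    """Collect outgoing links and flag which ones pass value (no rel=nofollow)."""
    def __init__(self):
        super().__init__()
        self.links = []  # (href, passes_value) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            nofollow = "nofollow" in (a.get("rel") or "").lower()
            self.links.append((a.get("href"), not nofollow))

html = ('<a href="http://example.com/a">good link</a>'
        '<a rel="nofollow" href="http://example.com/b">no value</a>')
checker = LinkChecker()
checker.feed(html)
print(checker.links)
```

Links flagged False here would give you no ranking benefit, so a page full of nofollowed links is a poor target for link building.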
Off page search engine optimization
SEO experts have suggested that a greater percentage of search engine optimization has to be done off the page because several search engines such as Google determine the importance of website based on the number of links that come from external websites.
On-page optimization is now considered easier and hence many webmasters use it exhaustively. To improve search results, search engines now emphasize off-page factors.
Off-page optimization of a website is achieved using sources outside the website. To improve search engine indexing for certain keywords, webmasters should try to get links to their website from external pages.
Increasing page rank by improving search engine indexing can be done using search engine optimization. Off-page optimization technique involves link building strategy. Any webmaster interested in showing his website in the top pages of Google search engine should look for off-page optimization to increase search engine ranking.
To secure a long-term search engine profile for your website, you need to perform off-page optimization. Every website owner is looking for a place in search engine result pages. Search engines determine your website's ranking for specific keywords depending on authority. Authority is determined not only by the content present on your website: inbound and outbound links to and from your website are also considered, and the authority of the websites your website is affiliated with is factored in as well.
Off-page search engine optimization today requires you to spread the word about your website as much as possible. This means your website content has to be socialized, which can be done by providing your site's contents through RSS feeds. Establishing your website through news and media also brings more links. As your website content becomes available in several places such as news aggregators, blogs and media companies, it will attract more links from outside, and the website's authority grows because of its availability in many places.
Social bookmarking websites are considered very important for off-page SEO. Many social bookmarking websites such as Digg, del.icio.us and StumbleUpon are very popular, and these websites have high authority in the eyes of search engines. As you can easily post your website content on a social bookmarking website using blog posts, you can get quality back links to your website effectively. To get deep inbound links, encourage your website's visitors to use bookmarking links throughout your website. In turn, you get your content added to their bookmarks, which is beneficial for off-page SEO.
You must try to get your links included in leading directories in your niche. Search engine algorithms consider the source of links to your website; just by increasing the number of links, you cannot expect great results. The links should come from places that are relevant to your own website. Search engines can identify the difference between relevant and irrelevant links: when links come from websites with keywords similar to your niche, they are valued highly, whereas links from many unrelated websites are not well acknowledged.
Using all off-page SEO techniques together, you will develop a circle of profiles that improves authority, which in turn increases profile ranking, generating more authority, and the process continues. Various off-page SEO tactics such as peer-to-peer promotion, link building, advertising and internet marketing have to be followed together to achieve the desired results.
When you are making efforts at off-page SEO, you have to concentrate on relevance. You should match SEO techniques to the requirements of your website's visitors. With optimization techniques your search engine ranking will improve, and this will bring more visitors to your website. If you don't exploit the increased traffic, you end up wasting your efforts: internet users will not be satisfied if they don't get what they want from your website.
Your optimization techniques should aim at exhibiting the relevance of your website with keywords. The website must contain high quality content and information that fascinates people interested in your niche. A professionally designed website will only benefit from professionally planned and executed off page SEO.
Your website must contain internal links and the anchor text used is very important. Don't waste this space by providing irrelevant text like click here or go here. Use keywords and key phrases as anchor text. The text you use at different places must be different. Alternatively, you can also use contextual qualifiers that are related to your website. While trying to include keyword phrases, take care that you avoid explicit repetitions that disrupt the flow of the contents.
Following is a short and useful list of sources that you can exploit for increasing number of inbound links:
Themed portals – Article directories such as ezine and discussion forums related to your website niche.
Press releases – Many news websites like BBC allow external websites to post content and links. You can exploit this feature to include new and fresh information related to your website. As these media websites have many readers, your website will be visited by more number of people. Further, search engines will crawl popular websites very often.
Organization websites – Several nonprofit organization websites encourage many people to become a member and promote their website. You can join as many websites as possible as long as they are related to your website content.
Discussion forums – The discussion forums are places that get more traffic everyday. In the signature, you can include your website link and participate in interesting discussions.
Affiliate program – If you have a profitable business, start an affiliate program for your website, and invite many affiliates to join hands with you. This way, you can quickly increase website authority, as the affiliates will happily promote your website if you provide them rewards.
RSS feeds – Internet users will be interested in subscribing to your RSS feeds if you present information in the format they like.
Advertising – Many popular internet directories allow you to publish your links at a fixed cost.
PPC campaigns – To establish the relevance of your website for your keywords, participate in PPC campaigns to publish your website links in blogs and websites that are unreachable otherwise.
On page search engine optimisation
Search engine spiders, also referred to as robots, crawl your web pages and follow all the links they find. This process is called website crawling. By changing your website architecture, you can take advantage of the spiders' crawling to improve your website's indexing.
In order to exploit search engine robots, you have to make your links available for the robots to find. If your website contains more flash and graphic contents, the number of links available for search engine robots will be reduced. Moreover, with dynamic pages such as PHP and ASP, spiders cannot see the associated links. Frames and javascript menus also hide links from search engine robots. There are tools that help you see your website just as it is seen by search engine robots. Using these tools, you can identify what changes can be made in your website.
Newcomers become overly enthusiastic imagining their website appearing in the first few pages of search engine results. As a result, they lead themselves into SEO tunnels and forget about human visitors. Whenever you optimize your website for search engines, don't ignore the quality of the website available to your users. Your conversion rate will be badly affected if you optimize just for search engines without considering usability for the end user, because traffic is worthless unless it is turned into sales.
Title tag <title> your site title </title>
The majority of search engines, including Google, place strong emphasis on the title tag. This tag gives your webpage its title, and in SEO terms it carries much more meaning. Your title tag should contain at least two of your main key phrases, and the keywords should be placed close to the start of the title, even before the business name.
Search engine optimization should be applied to all the webpages in your website. Search engine will index all webpages in a website and hence, it is advisable to add many webpages to a website. The keywords targeted by any webpage should not be more than three. This ensures that your webpage is recognized for certain keywords. In case you add more keywords to a single webpage, you will find it harder to get optimum results.
Meta description tag <META NAME="Description" CONTENT="Your website description goes here.">
The meta description tag lets you provide a summary of your website to your readers. These tags are ignored by some search engine robots; however, you can include keywords in the meta tag to increase the occurrence of key phrases on your webpage.
Meta keyword tag <META NAME="KEYWORDS" CONTENT="your keywords, go here ">
The meta keyword tag was initially used by search engine robots to identify the relevancy of a webpage. As this tag was abused by webmasters, search engines started ignoring it. You don't need to include anything in the meta keyword tag other than the key phrases you have already used in the title tag.
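The tag advice above can be sketched as a small helper that puts the key phrases first in the title, the business name last, and reuses the same phrases in the meta keyword tag. All names and phrases here are invented:

```python
def build_head(title_phrases, description, business_name):
    """Assemble head tags: key phrases first in the title, business name last,
    and the same phrases reused in the meta keywords tag."""
    title = " | ".join(title_phrases + [business_name])
    return "\n".join([
        f"<title>{title}</title>",
        f'<META NAME="Description" CONTENT="{description}">',
        f'<META NAME="KEYWORDS" CONTENT="{", ".join(title_phrases)}">',
    ])

head = build_head(["photography tips", "camera guides"],
                  "Tips and guides for beginner photographers.",
                  "Acme Photo")
print(head)
```

A real page builder would also escape quotes and angle brackets in the inputs; this sketch assumes plain text.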
Copying text
Repetition of keyword phrases in your webpage text is crucial for on-page search engine optimisation. Keyword phrases that you have mentioned in the title and meta tags must be included in the webpage content. You can repeat the key phrases many times, but that should not affect the flow of webpage contents. If you forcibly include keyword phrases, your text will not sound natural and this affects user experience on your website.
There are other places on your webpage where you have to include keywords. You have to include key phrases at appropriate places and inform the search engines indirectly about the relevance of contents in your website.
- Heading tags from h1 to h6 are normally used in every webpage to emphasize key points. Include your key phrases in these heading tags to get search engine attention. Formatting tags such as strong, bold and italics must also be used around key phrases.
- Main page should contain keywords at appropriate places.
- If you use images in your website, make sure that you use the img alt attribute to include keyword phrases. The alt attribute, used along with the image tag, describes the image to visitors when it cannot be displayed in their browser. Using keyword-rich alt text is an essential tip for on-page SEO.
- File names and folders used in your website are also crawled by search engine robots. Using keyword rich folder names and file names will greatly help search engines to index your website for those specific keywords.
While you can exploit HTML tags to include your keywords, make sure that you don't stuff the tags with keywords. Your site may be blacklisted if the search engine robots find undesirable keyword stuffing.
If you would like any information on how we can help your website with on-page SEO, please contact us with your query; remember to add your website address and the keywords you are aiming to rank for.
On page SEO, or search engine optimisation, is making sure that your website is as search-engine friendly as possible. If your website is not optimised you have less chance of getting good results in the search engines. Here is a quick guide to good on page SEO:
- Make sure that all of your web pages can be indexed by search engines - make sure that they all have at least one link from somewhere on your site.
- Make sure that you have unique content on every single page.
- Make sure that your meta-tags are arranged correctly - your page title tags and description tags should describe the content of your different web pages. The page title tags should be less than 68 characters and the description tags more detailed but less than 148 characters.
- Make sure you label the different headers on your web pages using H tags.
- Make sure that your web page URLs are SEO friendly: use mod_rewrite for Linux and Apache hosting, or use IIS redirects for Windows. Ideally make the URLs describe your content, i.e. use domain.com/blue-widgets.php as opposed to something like domain.com/product.php?cat=146. Use hyphens or underscores to separate words in the URLs.
- Make sure that the links within your site are complete, i.e. if you are linking to the blue widgets page, link to domain.com/blue-widgets.php as opposed to just blue-widgets.php.
- Make sure that you use descriptive file names for your images, i.e. use blue-widget.jpg as opposed to a string of numbers and/or letters.jpg.
- Make sure that you label all of your images with descriptive alt attributes.
- Make sure that you make good use of anchor text links within your content - if you have a page about blue widgets, use the phrase blue widgets in the text that links to it.
- Make sure that there is only one version of your site - 301 redirect all non www. URLs to the www. ones or vice versa.
- Make sure that there is only one version of your homepage - 301 redirect the index or default page back to domain.com.
- Use the rel="nofollow" attribute on links to websites that you do not trust, that you suspect may be using spamming techniques, or that you do not want to help in the search engines.
- Make sure that your code is valid; in some instances bad code can lead to search engines not being able to properly read a page. Use the W3C validator to check your markup.
If you follow these guidelines you are on your way towards good on page SEO and to good rankings in the search engines.
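Several of the checklist items above (title under 68 characters, description under 148, alt attributes on every image) can be machine-checked. Here is a minimal sketch using Python's standard html.parser, run against an inline sample page; the limits are the ones from this guide, not universal constants.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collect the title, meta description and alt-less images of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def report(self):
        # Limits taken from the checklist: 68-char titles, 148-char descriptions.
        return {
            "title_ok": 0 < len(self.title) <= 68,
            "description_ok": 0 < len(self.description) <= 148,
            "images_missing_alt": self.images_missing_alt,
        }

# Hypothetical sample page: one image has an alt attribute, one does not.
page = """<html><head><title>Blue Widgets | Example Store</title>
<meta name="description" content="Hand-made blue widgets, shipped worldwide.">
</head><body><img src="blue-widget.jpg" alt="Blue widget">
<img src="12345.jpg"></body></html>"""

checker = OnPageChecker()
checker.feed(page)
print(checker.report())
```

Running this on the sample reports the title and description as within limits and flags the one image without an alt attribute.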
What is a Backlink?
An incoming link to a website is called a backlink. In other words, a link coming from an external page to a page on your website is a backlink. A backlink is also called an inbound or incoming link.
Let us understand what backlinks are with an example:
Suppose you own a website www.abc.com that deals with SEO. Another website, www.xyz.com, also deals with SEO. The owner of xyz.com browsed your website abc.com and found the content interesting and useful for the readers of his own site. Assume that xyz.com has a higher page rank and quality content. The owner of xyz.com places a link to your website on his site so that readers who browse xyz.com also get to read the content on abc.com. In this way you got a backlink from xyz.com.
This helps abc.com gain popularity. Hence, the number of backlinks is one indicator of a website's popularity.
Creating Backlinks
Backlinks are important in Search Engine Optimization and are considered an important factor in calculating page rank. The number of backlinks to your website indicates its popularity. Google uses the number of backlinks as an important criterion to rank a page. Hence, the more backlinks a website has, the greater its chances of ranking high.
Webmasters use many techniques to get more backlinks. One technique is to include your website link as a signature in blogs and forums. To increase the number of backlinks, you may even pay a website owner to create links to your page. But getting quality backlinks is what matters, and this can be achieved if your website has quality content.
Writing quality content on your webpage is the best way to get quality backlinks, because if readers like your content they will create links to your web page on their own. Such backlinks are known as natural backlinks, and they are the most valuable for getting your webpage ranked higher.
Checking Backlinks
Backlinks to a website can be checked using various backlink checking tools that are available for free on the internet. You type in the domain name, for example abc.com, and press submit. Knowing the number of backlinks helps you optimize your website and acquire more quality backlinks.
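The abc.com/xyz.com example above can be sketched as a toy checker: given the HTML of one external page, list the links that point at your domain. Real backlink tools do this across an index of the whole web; the page content and domain names here are the hypothetical ones from the example.

```python
import re
from urllib.parse import urlparse

def backlinks_to(html, your_domain):
    """Return the hrefs in `html` whose host ends with `your_domain`."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, flags=re.I)
    return [h for h in hrefs if urlparse(h).netloc.endswith(your_domain)]

# Hypothetical snippet of a page hosted on xyz.com.
page_on_xyz = '''<p>For more SEO tips see
<a href="http://www.abc.com/seo-tips.html">this guide</a> and
<a href="http://www.abc.com/">abc.com</a>, or our own
<a href="http://www.xyz.com/about.html">about page</a>.</p>'''

print(backlinks_to(page_on_xyz, "abc.com"))
```

The xyz.com link is ignored because it is internal to the linking site; only the two links to abc.com count as backlinks.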
Why Backlink is So Important?
A backlink is a link from another website that leads to our site. A backlink is considered natural when it is inserted in the content or article, in the form of a sentence, so it looks natural. Most newbies think that to get a backlink we only have to put the link in the sidebar, but that is not appropriate and less than optimal; in addition, such links are prone to being counted as paid links. Always keep in mind that the links counted as backlinks by search engines are text links, not links in the form of a banner.
So, what is the purpose of gaining so many backlinks? Is it to increase PageRank, or to get a good position on the search engines? Both answers are true, but in practice there is a small difference in implementation. To increase PageRank, the backlinks we have must be:
- Derived from sites with a higher PageRank.
But there is no harm in gathering backlinks from pages of various PR, because they will still be counted in the assessment of PageRank. Well, I guess that's it for today. I'll share more about it tomorrow. Don't forget to come back here again!
The Benefits of Backlinks
The Benefits of Backlinks. After talking about what a backlink is yesterday, now I want to share the benefits of backlinks. Here they are: - Increasing your blog traffic
A good position on the search engines makes it easier for search engine users to find your site. This brings in visitors from the search engines and automatically increases your traffic - according to the keywords of your site, of course.
If your blog's SEO is very good, with a lot of backlinks, the search engines will automatically put your blog/site on the front page; moreover, if your blog's keywords are quite strong, the search engines will place your blog/site in the top positions.
- Increasing your blog popularity
The more backlinks point to your site, the more the search engines will prioritize your blog. And guess what you will get: your site becomes better known by being in the top positions on the search engines.
- Increasing your blog Page Rank
And this is the most important thing: your PageRank will increase if many backlinks point to your site.
What do you think? Hopefully this article can stimulate your enthusiasm to work even harder at gaining more backlinks.
Outbound link Definition
A link to a site outside of your site.
Information
Outbound links send visitors away from your web site. Attitudes towards outbound links vary considerably among site owners. Some site owners still link freely. Some refuse to link at all, and some provide links that open in a new browser window.
Opponents of outbound linking argue that it risks sending away visitors the site spent time and money acquiring. This can be a large risk if a site is facing high customer acquisition costs.
Proponents argue that providing high quality references actually enhances the value of a site and increases the chance of return visitors.
A description tag
A description tag is a metadata item at the beginning of a web page that holds a short description of the contents of the page.
Description tags may be used by search engines as the segment of descriptive text that is displayed in the search results pages.
What Is a Title Tag?
The title tag has been – and probably will always be – one of the most important factors in achieving high search engine rankings.
In fact, fixing just the title tags of your pages can often generate quick and appreciable differences to your rankings. And because the words in the title tag are what appear in the clickable link on the search engine results page (SERP), changing them may result in more clickthroughs.
Search Engines and Title Tags
Title tags are definitely one of the "big three" as far as the algorithmic weight given to them by search engines; they are just as important as your visible text copy and the links pointing to your pages – perhaps even more so. Yet, even though this has been common knowledge among SEO professionals for at least 10 years, it is often overlooked by webmasters and others attempting to optimize their websites for targeted search engine traffic.
Article Submission Service
Article submission is very common among search engine optimization experts because it gives your site one-way links from relevant sites and increases traffic to your site. We will create an article and include 2 links pointing to your website in the author's resource box. We will only submit to article directories that are do-follow; this ensures that your site will benefit from the backlinks.
We will manually add your articles to high-ranking article websites. Every submission will include 1-2 different anchor text links that point to your website, which is great for helping a website rank well within the search engines. We only submit to quality article directories. We will provide you with a report that lists the sites we have submitted to; once submission is complete we will send you a follow-up report that lists your approved submissions.
Key Features:
* The links that you acquire are permanent one way links.
* We supply first-class customer support with all enquiries answered within twenty-four hours.
* We will supply full reports that list all the directories we have submitted your article to.
* We will only submit your Article to quality Directories that have good approval rates.
* You can choose two keywords that we will use in the author's resource box.
* We only use articles that are unique.
* Customers supply their own article for submission. (Must be unique)
KEYWORDS
Search Engine Optimization (SEO) is essential for the success of almost every website. The majority of traffic to websites is brought in by search engines. Keywords are an important part of SEO. Unfortunately, many website owners overdo it. Keywords are one of the most important tools in ranking high on search engines. Abuse them and you will suffer the consequences.
Many novices read about places to add keywords, such as ALT tags, anchors and other areas. While these are meant to be used if you want to help boost your keywords, they should be relevant to the image or the place where you are linking. Many people try to stuff keywords into every possible place. There is no benefit whatsoever in doing this.
As a general rule, I would say you can optimize one page for 3 keywords or phrases. I would recommend starting with one or two. Depending on who you talk to, your keywords should make up between three and five percent of the total content on the site. I have seen sites at the top with percentages as high as ten. There are some free online tools that can tell you the percentage of each word on your site; one is http://www.googlerankings.com/ultimate_seo_tool.php
This tool will also show you which of those words your site comes up under.
One huge mistake I see all the time is repeated keywords in the keyword tag. Here is an example: shoes, brown shoes, blue shoes, and red shoes. There are two things wrong here. Do you know what they are? First of all, the word "and" is not needed. This is called a stop word. Search engines ignore these types of words. Some common stop words are and, the, because, this, that and there. The second thing wrong with the keywords is the word "shoes". It is repeated four times. You do not need to repeat shoes. The corrected keywords should read: shoes, brown, blue, red. The search engine will know you are offering shoes and will match the color with the word shoes. Repeating "shoes" will just penalize your site.
One final tip on using keywords: make sure you use your top keywords in the TITLE of your site as well as in the description tag, but do not load the title with keywords. These are the first places the engines will scan. Your description tag is usually what is displayed on search engines when relevant keywords are typed in, so be sure your keywords are there. Do not just type in keyword phrases, though; no one will know what your site is about. What is the point of coming up high in the rankings if no one understands the description?
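The clean-up described above can be sketched in a few lines; the stop-word list is the one from the paragraph, not an exhaustive one.

```python
# Stop words named in the text above; real lists are much longer.
STOP_WORDS = {"and", "the", "because", "this", "that", "there"}

def clean_keywords(raw):
    """Strip stop words and duplicates from a comma-separated keyword tag."""
    words, seen = [], set()
    for phrase in raw.split(","):
        for word in phrase.split():
            w = word.strip().lower()
            if w and w not in STOP_WORDS and w not in seen:
                seen.add(w)
                words.append(w)
    return ", ".join(words)

print(clean_keywords("shoes, brown shoes, blue shoes, and red shoes"))
# shoes, brown, blue, red
```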
Keyword Analysis and keyword density
Keywords are a very important factor in deciding your website's popularity. When the search engine spiders crawl through your website, they are merely looking to match your keywords with users' search strings to decide the website's rankings. For this reason, it is very important that you optimise your website with the most appropriate keywords to achieve the desired rankings in the search engines. Finding the correct keywords is not an easy task; you should research which keywords to use so that they closely match your website's content.
Put yourself in the shoes of the searcher: if you were trying to find your website on a search engine, what keyword or key phrase would you enter to find the products or services that your site offers? You can find tools on the internet that will assist you in choosing the best keywords for your website. Google has its own, with results based on Google searches, and this is as good a place as any to start. If your website is a web directory, your selected keyword could be "Web Directory", but merely aiming for this particular keyword probably won't be very effective, as it is a very competitive keyword and the competition will be massive. You would be better off targeting related keywords that a web directory user might type into the search engine, for instance "internet web directory", "free web directory", "paid web directory", "web directory submit site"; these keywords are easier to target. Keyword tools will detect a large number of related keywords; be sensible and select the keywords that describe the content of your website.
When you've selected the keywords you are trying to rank highly for, the next step is to use them within the content of your website. Keyword-rich content increases the chances of high search engine placement. Many webmasters make the mistake of overloading the page content with the target phrases, believing that it will help with rankings; this is a mistake, and some search engines will penalise your website as a result. The recommended keyword density on a webpage is between 3-7% for your major keywords and between 1-2% for minor ones. If you are uncertain of the density you can always use a checker; there are many available on the internet.
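The density figure above can be computed mechanically. Here is a sketch counting a phrase's share of the total words, which is one common way of measuring it; the sample text is invented and, at roughly 29%, deliberately keyword-stuffed.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

# Hypothetical over-optimised copy: "web directory" appears 3 times in 21 words.
text = ("Our web directory lists free sites. Submit your site to the "
        "web directory today and browse the web directory by topic.")
print(round(keyword_density(text, "web directory"), 1))
```

Anything this far above the 3-7% guideline is the kind of stuffing the paragraph above warns against.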
Keyword placement also plays a big part. If you have done your research before purchasing a domain name, you should try to include the keyword within the chosen domain name; if you are trying to rank for "Seo Directory", then get a domain name that contains it. Nowadays you can find domain tools that will help you find a domain by entering your keywords. You should use your main keyword in your website title tags, and it should also be placed first in the keyword tag. Using your chosen word within <h1>, <h2> and <h3> is more than likely to give you a helping hand at ranking highly.
Primary keywords (major keywords): These are used on a related product or merchant page -- preferably near the top, in the headline or sub-headline. These are your major targeted keywords.
Secondary keywords (minor keywords): These are your minor keywords, used in the first or second paragraph or somewhere else on the page. Secondary keywords throughout the text have no particular density target; add them only as they would naturally appear editorially.
For Example here I have done Keyword Research on ****** Related Keywords.
1. ****** (Search Frequency 260)
2. Buy ****** (Search Frequency 200)
3. Buy ****** Online (Search Frequency 150)
In this case "******" becomes the primary keyword for your website, and "buy ******" and "buy ****** online" become your secondary keywords.
Major keywords are the targeted keywords you want your site to rank for when they are queried in the SERPs. As you go on updating and SEO-ing your site, the minor keywords come along naturally. Be sure not to ignore them, for they will help keep your site organic.
Definition of the term, "indexing"
Search engine indexing
Also known as: "search engine indexing", "website indexing"
The term "indexing" is used to describe the process that search engines carry out to firstly find your site, and then secondly "digest" the content of your site. This is the way that search engines gather information about your site in order to "score" each page using their ranking algorithm.
Each search engine has its own spiders (automated programs used to crawl the web, finding new websites and recording details about them) which carry out the indexing process. The ultimate result is the search engine index - the large set of data that search engines base their final search results on.
Search engine optimization campaigns generally make reference to this area in terms of describing your website as being "spider friendly" (or not as the case may be). A "spider friendly" website poses no barriers for search engine spiders to index the site. A site that isn't "spider friendly" is one that uses certain web design techniques that (unintentionally) prevent search engines indexing some or all of the content on the site.
For example, if a large amount of the text on your site is displayed using graphic files (ie, you have embedded the text in graphics to make use of a unique font), then search engines can't read that text and as a consequence, they can't determine what your site is about or rank it accordingly.
Any SEO campaign will start off by reviewing the technical infrastructure of your site to ensure it is "spider friendly". A site with "indexing issues" will not be able to rank on search engines as well as it otherwise could, although most "indexing issues" are usually very easy to correct.
Indexing-
– In search engine optimization, the term "index" is both a noun and a verb. A search engine indexes your site, meaning it adds your site to its index.
Basically, an index is a huge database of websites that a search engine has found, evaluated and indexed. This will make more sense if we talk about the basics of how search engines work:
A search engine grows its index / database through the use of robotic web crawlers (often called spiders). When the search engine spiders find a new website (or one that they're revisiting), they crawl through the site to index as many pages as they can reach.
If Google does not have a particular web page or site in its index, nobody will be able to find that page by searching in Google ... because they're actually searching Google's database as opposed to searching the entire web. Make sense?
This is why you hear people say things like: "I need help getting my site indexed" or "Hey, only half of my pages have been indexed!"
You can increase the number of web pages that get indexed by having a well-organized site with a good navigation structure, and also by having plenty of inbound links to your internal pages (i.e. deep links).
Suppose you have a website and it is not shown in Google; that means your website is not indexed in Google.
Indexing means that Google and the other search engines have your website in their databases.
Whenever you make a website, search engines find it easily; the problem comes with the backlinks you make.
Suppose you make 50 backlinks in a day. Google will not index all those backlinks, or it will take a long time to index them all, and if a backlink URL is not indexed in Google then that backlink is really of NO USE…
Therefore, to get URLs indexed, we do pinging and bookmarking.
Pinging –
Pinging is a technique through which you notify blog directories and search engines that your blog/website has been updated.
Whenever your blog or website is pinged, Google and other directories get to know that you have made a new entry on your website. So now you will ask: why is pinging necessary?
If you ping your blog or website regularly, search engines will crawl your website faster, and it also helps search engines index your URLs more quickly.
To ping your website you can go to
pingoat.com
pingomatic.com
On these websites, enter your URL and click "ping"; your website/blog will be pinged to the major directories, which will help search engines crawl your website faster.
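Under the hood, ping services such as the ones above typically accept XML-RPC calls to a weblogUpdates.ping method. As a sketch, this builds that request body offline with Python's standard xmlrpc.client; actually sending it would be a network POST to the service's RPC endpoint, and the blog name and URL here are made up.

```python
import xmlrpc.client

# Build the XML-RPC request body for weblogUpdates.ping(blog_name, blog_url).
# No network call is made; ServerProxy(...).weblogUpdates.ping(...) would send it.
body = xmlrpc.client.dumps(
    ("My SEO Blog", "http://www.example.com/"),
    methodname="weblogUpdates.ping",
)
print(body)
```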
CACHING
Website caching is a technique to reduce server load by sending visitors static pages instead of dynamic ones. A normal request would query the website's database and run all the commands referenced in the PHP files. That's usually not a huge problem unless your website gets hit by a storm of visitors, say from a link on a very popular website.
Many websites go down if visitors reach a critical level. It has also happened in the past that the webhosting company decided to disable the website because of the massive traffic. Yes, that has happened to Ghacks as well in the past.
Caching can help to reduce the load, which means that the server or host can serve additional visitors. It can also decrease page load times, which is the obvious benefit for visitors. I decided to give the WordPress plugin Super Cache a try. It is basically the most advanced caching plugin for WordPress. Installation was difficult because server permissions had to be changed to run the script properly.
The script is now working properly and I could not discover any errors or problems so far. It would be nice if you would let me know if you encounter any errors on my website so that I can fix them asap.
What’s a Web Cache? Why do people use them?
A Web cache sits between one or more Web servers (also known as origin servers) and a client or many clients, and watches requests come by, saving copies of the responses — like HTML pages, images and files (collectively known as representations) — for itself. Then, if there is another request for the same URL, it can use the response that it has, instead of asking the origin server for it again.
There are two main reasons that Web caches are used:
- To reduce latency — Because the request is satisfied from the cache (which is closer to the client) instead of the origin server, it takes less time for it to get the representation and display it. This makes the Web seem more responsive.
- To reduce network traffic — Because representations are reused, it reduces the amount of bandwidth used by a client. This saves money if the client is paying for traffic, and keeps their bandwidth requirements lower and more manageable.
Kinds of Web Caches
Browser Caches
If you examine the preferences dialog of any modern Web browser (like Internet Explorer, Safari or Mozilla), you'll probably notice a "cache" setting. This lets you set aside a section of your computer's hard disk to store representations that you've seen, just for you. The browser cache works according to fairly simple rules. It will check to make sure that the representations are fresh, usually once a session (that is, once in the current invocation of the browser).
This cache is especially useful when users hit the “back” button or click a link to see a page they’ve just looked at. Also, if you use the same navigation images throughout your site, they’ll be served from browsers’ caches almost instantaneously.
Proxy Caches
Web proxy caches work on the same principle, but on a much larger scale. Proxies serve hundreds or thousands of users in the same way; large corporations and ISPs often set them up on their firewalls, or as standalone devices (also known as intermediaries).
Because proxy caches aren’t part of the client or the origin server, but instead are out on the network, requests have to be routed to them somehow. One way to do this is to use your browser’s proxy setting to manually tell it what proxy to use; another is using interception. Interception proxies have Web requests redirected to them by the underlying network itself, so that clients don’t need to be configured for them, or even know about them.
Proxy caches are a type of shared cache; rather than just having one person using them, they usually have a large number of users, and because of this they are very good at reducing latency and network traffic. That’s because popular representations are reused a number of times.
Gateway Caches
Also known as “reverse proxy caches” or “surrogate caches,” gateway caches are also intermediaries, but instead of being deployed by network administrators to save bandwidth, they’re typically deployed by Webmasters themselves, to make their sites more scalable, reliable and better performing.
Requests can be routed to gateway caches by a number of methods, but typically some form of load balancer is used to make one or more of them look like the origin server to clients.
Content delivery networks (CDNs) distribute gateway caches throughout the Internet (or a part of it) and sell caching to interested Web sites. Speedera and Akamai are examples of CDNs. This tutorial focuses mostly on browser and proxy caches, although some of the information is suitable for those interested in gateway caches as well.
Spider Description
SEO Spider crawls any website and returns the number of pages on the server, the number of pages indexed by Google, link popularity, and Alexa rank, and a summary report with search engine ranking stats.
Other SEO Spider features include:
· Spider any website and build a list of URLs, along with the titles, for each page on that server
· Finds out how many pages are indexed and what the Google rank is for each of those listed pages
· Phrase matching search options
· Check individual pages or the entire list
· Discover how well your website performs over time
· Searches Whois records to uncover contact and domain registration information
· View, save or email the summary report with page rank statistics
· Option to save all results, only the selected results, printing, and/or exporting to CSV
· Google Dance Tool shows you when Google is updating their index.
How the SEO Spider Works
An SEO spider is a program which browses the World Wide Web in a methodical, automatic manner.
This process is called spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. The spider creates a copy of every visited web page in a huge database, which is then searched when a user enters a keyword into a search engine.
A spider will start with a list of URLs to visit and index, and as it visits these websites it comes across other links, which it will in time return to visit as well. This process can sometimes take weeks, even months.
Spiders like links
This is why it is important to have some incoming links from other websites.
Spiders can follow these links and find your website; the spider will then index it and enter you into its database for future searches on the search engines.
Some websites cannot be indexed by spiders due to the way they were designed. This is where an SEO sitemap has to be used to ensure your whole website is indexed. The spider in most cases will index every incoming link, outgoing link, internal link, page title, header, meta tag, content and so on.
In the seo keyword section you will learn where to place keywords. This will tell the spiders what your website is about and at the same time attract human visitors.
Helping the SEO Spider
You need to make it easy for the spider, otherwise it cannot do its job.
Using frames to design your website can make it unable to read any content. JavaScript navigation can stop it in its path. Building your website entirely in Flash is a big no-no.
Sitemaps can overcome most of the above problems, but how else can you help the spider do its job.
Imagine that your home page is a giant field to this spider: it has to leap to your 2nd-tier pages and index another load of huge fields, then leap again to your 3rd-tier pages and index yet another huge field. It starts to become tired and decides to go home. If you don't have links it cannot leap, and if your website is more than 3 tiers deep it gets too tired.
OK, OK, it's not a real spider. It will never get tired, but the idea behind it is that the search engines decide that your home page must be important. It is like the cover of a book. The further and deeper you go into a website, the less important the search engines deem the pages to be.
Lots of HTML links, content, content, content and a sitemap, and you will keep those spiders happy.
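The "tiers" idea above can be modelled with a toy breadth-first spider over a hypothetical link graph: pages more than max_depth links away from the home page simply never get visited.

```python
from collections import deque

# Hypothetical site structure: each page maps to the pages it links to.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/blue-widgets"],
    "/products/blue-widgets": ["/products/blue-widgets/specs"],
    "/about": [],
    "/products/blue-widgets/specs": [],
}

def crawl(start, max_depth=3):
    """Breadth-first crawl that stops following links beyond max_depth tiers."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if depth == max_depth:
            continue  # the "tired spider": don't follow links any deeper
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return seen

print(sorted(crawl("/")))
```

With max_depth=3 every page is reached, but lower the limit to 2 and the deepest page (/products/blue-widgets/specs) is never indexed, which is exactly why flat structures and sitemaps help.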
What are Crawlers? How do they work?
When you hear people talk about Crawlers in the context of SEO, they are referring to the programs that search engines use to scan and analyze websites in order to determine their importance, and thus their ranking in the results of internet searches for certain keywords. Crawlers are also often referred to as spiders or robots.
Crawlers are very active, and often account for a great deal of the visitors to websites all over the internet. The Google crawler, known as Googlebot, is particularly active, and will often visit a website several times a day, checking for updates or new information. Studies have shown that it is much more active than other crawlers, the closest being the Yahoo crawler, which is about half as active as Googlebot.
Search engine optimization is aimed at understanding how crawlers work and what they look for to determine the importance and ranking of certain sites. The idea is then to implement SEO marketing strategies that fill websites with the kind of information the crawlers will determine to be of high value. Crawlers are on the lookout for sites that are rich in the kinds of keywords people search for, and sites that contain those keywords in high density are seen as more relevant and will thus be awarded high rankings. However, crawlers also gather other information important in determining rankings, including link popularity information and filename structure. Some forms of search engine marketing are deliberately aimed at deceiving the crawlers into thinking a site is more important than it is. These are known as black hat techniques, and they are frowned upon by most web optimizers, as they can produce penalties from search engines. There are all kinds of SEO tools out there to help you better understand crawlers and how they work; the Google keyword tool is a good place to start.
Basics of Cloaking
Cloaking generally means presenting different versions of web page content to search robots and human visitors, based on the browser's user agent or IP address. It is a deceptive method used to cheat search engines in order to rank well for desired keywords.
Not always, but in most cases, cloaking is also used to trick users into visiting certain websites based on their description in the search engines. For example, a user searching for some product may click on a website in a search engine based on its description and title, but the final website will not be the one described in the SERPs; it will be a totally different one. Practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines, should therefore be avoided.
Different Cloaking Methods
IP address Cloaking - a method of presenting different contents based on determining IP addresses. e.g. Search engines with certain IP addresses will be shown a one version of a web page and all other IP addresses will be shown another version.
User-Agent Cloaking - a method of delivering different versions of a website based on User-Agent. e.g. Search engines and/or users using different versions of web browsers are served with different contents of a web page.
HTTP_REFERER Header Cloaking - if a user is coming from a certain website (e.g. clicking a link from search results or a website) they will be presented a different version of a website based on the HTTP_REFERER header value.
HTTP Accept-Language Header Cloaking - may be used to show different versions of a website based on the user's web browser language, without giving them the option to select a language.
JavaScript Cloaking - users with JavaScript enabled browsers are shown one version while users with JavaScript turned off (like search engines) are shown another version of a website.
Finally, cloaking is a risky practice: search engines penalize or ban sites they catch using it, so it is best avoided altogether.
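To make the User-Agent method above concrete, here is a minimal Python sketch of the server-side decision a cloaking script makes. The bot names and file names are my own illustrative placeholders, and the example is shown only so the technique is recognizable, not as something to deploy.

```python
# Hypothetical illustration of User-Agent cloaking: serve one page to
# known crawler user agents and a different page to everyone else.
# Search engines penalize sites that do this.

CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")  # example bot names

def select_page(user_agent):
    """Return which page version a cloaking script would serve."""
    ua = (user_agent or "").lower()
    if any(bot in ua for bot in CRAWLER_SIGNATURES):
        return "keyword-stuffed-page.html"   # decoy shown only to crawlers
    return "normal-page.html"                # page shown to human visitors

print(select_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(select_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))
```

Note how trivial the check is; this is exactly why search engines also verify crawler identity by other means, such as the requesting IP address.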
In other words
What is Cloaking?
Cloaking is commonly referred to as a black hat, or unethical, SEO practice. Cloaking consists of deceiving search engine crawlers by directing them to web pages that are not actually visible on the domain that the crawlers are trying to analyze. Usually, these false pages are loaded with keyword-rich content and search engine optimization designed to boost the search results ranking of the web page.
However, since cloaking is an artificial method of boosting rankings, most search engines frown on the practice of cloaking, and will penalize web sites that they find to be using cloaking as a method of search engine marketing.
To properly understand cloaking, you have to understand how the search engines scan sites to determine their relevance, popularity, and ultimately what their position should be in the search results rankings. The main method by which they accomplish this is through crawlers, which scour the internet analyzing web pages for keywords, inbound links and other factors.
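Since crawlers weigh how often a page's words match a query, a rough sketch of keyword density helps make the idea concrete. This is my own simplistic definition (keyword occurrences divided by total words), not any search engine's actual formula:

```python
import re

def keyword_density(text, keyword):
    """Naive keyword density: occurrences of the keyword / total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "Photography tips: improve your photography with these photography basics."
print(round(keyword_density(sample, "photography"), 2))  # 3 of 9 words -> 0.33
```

Real ranking algorithms are far more sophisticated, and pushing a density like this artificially high (keyword stuffing) is itself a penalized black hat technique.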
Cloaking works when a hidden web page is armed with a CGI script that scans the IP address of incoming visitors. If the IP address is determined to be that of a regular human user, the visitor is directed to the home page of the site. However, if the cloaking script determines the visitor to be a crawler, it directs the crawler to a page loaded with keywords and optimized content designed to trick the crawler into believing the site is of more value than it is, and thus award it a higher ranking.
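The IP-based routing just described can be sketched in a few lines of Python. The network range below is an illustrative example of the kind of crawler range such a script would check against, and the file names are placeholders; again, this is shown to explain the mechanism, not to recommend it.

```python
import ipaddress

# Example network range a cloaking script might treat as "crawler" traffic.
# Illustrative only; a real cloaker would maintain its own lists.
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

def is_crawler_ip(addr):
    """Check whether an address falls inside any known crawler range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in CRAWLER_NETWORKS)

def route_visitor(addr):
    # Crawler -> keyword-loaded decoy page; human -> the real home page.
    return "decoy.html" if is_crawler_ip(addr) else "index.html"

print(route_visitor("66.249.66.1"))  # decoy.html
print(route_visitor("8.8.8.8"))      # index.html
```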
Cloaking is similar to doorway pages, in that it is a practice used solely for the purpose of SEO marketing. The pages represented by cloaking serve no actual purpose, and are of no use or benefit to the average user. Therefore cloaking is seen as spamdexing, and will be punished by search engines. While some web optimizers still employ this practice, in this day and age there are far better SEO tools that you can use than cloaking to boost your ranking in a lasting manner.
Google Bombing in SEO Definition
What is Google Bombing in SEO?
Google Bombing can be defined as:
The practice of denigrating a person or an organization by creating many pages that link to a site using a common anchor-text theme (high link context).
The best-known example is the 'miserable failure' Google bomb: the target site did not contain this phrase, but many external pages linked to it using this phrase as the link text.
White Hat vs. Black Hat
SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[40] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.[41]
An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[26][27][28][42] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.
White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility,[43] although the two are not identical.
White hat SEO is merely effective marketing: making efforts to deliver quality content to an audience that has requested that content. Traditional marketing means have allowed this through transparency and exposure. A search engine's algorithm takes this into account, as with Google's PageRank.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similarly to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
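The hidden-text tricks mentioned above leave recognizable fingerprints in a page's styles. As a rough illustration (my own naive sketch, not how any real search engine detects spam), a checker could flag inline styles commonly used to hide text:

```python
import re

# Inline-style patterns commonly associated with hidden-text spam.
# Illustrative only; real detection is far more involved.
SUSPICIOUS_PATTERNS = [
    r"display\s*:\s*none",           # element not rendered at all
    r"visibility\s*:\s*hidden",      # element invisible but takes space
    r"text-indent\s*:\s*-\d{3,}px",  # text pushed far off screen
]

def looks_hidden(style):
    """Return True if an inline style matches a known hiding trick."""
    return any(re.search(p, style, re.IGNORECASE) for p in SUSPICIOUS_PATTERNS)

print(looks_hidden("text-indent: -9999px"))        # True
print(looks_hidden("color: #333; font-size: 14px"))  # False
```

Note that these CSS properties all have legitimate uses too; it is hiding keyword-stuffed text from users while showing it to crawlers that makes the practice black hat.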
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[44] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[45]
HEADING TAG <h1> to <h6>
The <h1> to <h6> tags define headings. <h1> defines the largest heading and <h6> the smallest. Do you know what these tags are for? Many pages use these tags merely to style content. That is NOT the purpose of the heading tag. The heading tag provides structure for your document, much like an outline displays the structure of a term paper or technical document. The <h1> tag defines your page title, what your page is actually about. There should be only ONE <h1> tag on each page. The default size of the <h1> tag is BIG.
Additional Resources:
HEADING TAG STRUCTURE by James Huggins
Some additional thoughts on headers.
Remember that H1, H2, H3, etc. are NOT ABOUT FORMATTING.
Rather, it IS ABOUT STRUCTURE.
H1 is the primary heading on a page.
H2 tags are the secondary headings on a page.
H3 tags are the tertiary headings on a page.
Headings are typically short, stand-alone paragraphs which introduce a section of the document.
Yes, you can specify how to format the headings for consistency and that is cool. But the primary purpose of H1, H2, H3, etc. is to specify the STRUCTURE of the document.
Headings, levels 1-6, were constructed for the original web. The original web was built to share information and HTML was originally constructed to reflect CONTENT STRUCTURE.
The original web was viewable on dumb IBM 3270 terminals, TI Silent 700 Printers and a host of other devices we would never consider using today. The purpose of headings is to identify the information content structure.
h1 is the master heading of the document
h2's are the second level
h3's are the third level
It is like an outline.
Consider, for example, this outline
The Effect of Women on Men <== h1 (document title)
1. Why This Whole Women and Men Thing Matters <== h2
   A. Economic Impacts <== h3
   B. Sociological Impacts <== h3
   C. Physiological Impacts <== h3
2. The History of Women and Men <== h2
   A. Pre-Columbian <== h3
   B. Medieval <== h3
   C. Dark Ages <== h3
   D. The Reagan Years <== h3
3. How It All Works <== h2
   A. The Electrical Factors <== h3
   B. The Psychological Factors <== h3
   C. The Chemical Factors <== h3
   D. The Physical Factors <== h3
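The outline above maps directly onto heading tags, and the mapping can be recovered mechanically. Here is a short sketch, using Python's standard html.parser, that pulls the h1-h6 headings out of a page and prints them as an indented outline (the sample HTML is my own illustrative fragment):

```python
from html.parser import HTMLParser

class OutlineBuilder(HTMLParser):
    """Collect h1-h6 headings and expose them as (level, text) pairs."""
    def __init__(self):
        super().__init__()
        self.level = 0      # current heading level; 0 = outside any heading
        self.outline = []   # list of (level, text) pairs in document order

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.level = int(tag[1])

    def handle_endtag(self, tag):
        if self.level and tag == "h%d" % self.level:
            self.level = 0

    def handle_data(self, data):
        if self.level and data.strip():
            self.outline.append((self.level, data.strip()))

html_doc = """<h1>The Effect of Women on Men</h1>
<h2>Why This Whole Women and Men Thing Matters</h2>
<h3>Economic Impacts</h3><h3>Sociological Impacts</h3>"""

parser = OutlineBuilder()
parser.feed(html_doc)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```

If a page's headings produce a sensible outline when extracted like this, the tags are being used for structure; if they produce noise, the tags are probably being abused for styling.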
One of the reasons that some web writers have trouble with h1, h2, etc., is that they don't write "structured". In fact, some writers think of their web pages more as "posters" than as "documents". The web was designed for "documents".
Let's keep exploring this idea.
Do you have a non-fiction book you can look at?
Non-fiction books have HEADINGS to separate the body copy.
H1, H2, H3, H4, etc., were designed to designate HEADINGS used to separate the body copy.
Look at most non-fiction books which use headings and you will see various levels of headings, each one formatted slightly differently.
Here are other examples:
Look at THIS page
The main heading on the page (the page title) is
How to Write a White Paper - A White Paper on White Papers
That would be tagged as H1 because it is the highest "level" of heading.
The next "level" of headings are H2. On this page these are
-- What is a White Paper?
-- Know Your Audience
-- Decide on an Approach
This particular page does not have any third level headings.
Here is a document with a lot of levels
(Note that this document does NOT actually use the H1-H6 tags. I'll just explain where they SHOULD go based on heading "levels".)
H1: Management of Internet Names and Addresses
H2: Agency
H2: Action
H2: Summary
H2: Effective Date
H2: For Further Information
H2: Authority
H2: Supplementary Information
H3: Background
H4: U.S. Role in DNS Development
H4: DNS Management Today
H4: The Need for Change
H3: Comments and Response
H2: Administrative Law Requirements
H2: Revised Policy Statement
H3: The Coordinated Functions
H2: The Transition
And here is one which actually uses the H1, H2 and H3 tags.
h1: What's New in SQL Server Agent for Microsoft SQL Server 2005 Beta 2
h2: Introduction
h3: Security Improvements
h4: New Roles in the msdb Database
h4: Multiple Proxy Accounts
h3: Performance Improvements
h4: Thread Life
h4: Performance Counters
h3: New Features
h4: New SQL Server Agent Subsystems
h5: The DTS Subsystem
h5: The Analysis Query Subsystem
h4: Shared Schedules
h4: WMI Event Alerts
h5: Example of WMI Event Alert
h4: Tokens
h5: Example WMI Event Alert
h4: SQL Server Agent Sessions
h4: SQLIMail Support
h3: Stored Procedure Changes
h2: Conclusion