A common expectation I come across (from technical and non-technical people alike) is that web developers should be able to build sites in such a way that they will rank higher.
I have seen IT tenders that require a site that “must” appear on the first page of a search query. From a business perspective, this requirement is understandable; for an IT/web development/search company to somehow guarantee it, however, is misleading and irresponsible. It is, though, an opportunity to explain “natural search.”
Explaining natural search to business managers
I usually try to explain that natural search engine optimisation (SEO) involves at least two areas:
- Search engine indexing: getting search engines inside your site to understand the content
- Search engine ranking: determining how highly that content ranks
I then try to explain that the issues and techniques for each of these areas can be quite different (though some overlap), and I often sum it up as:
- Improving indexing is mostly a technical task
- Improving ranking is mostly a business/marketing strategy
- What might work now may not work in the future
- It takes time
Improving indexing; mostly a technical task
The technical things web developers do typically help increase the chance that a site is indexed well. For example (in no particular order, and not a complete list!):
- Sufficiently clean URLs, to avoid page weight dilution. This usually means ensuring that everyone uses the same link to a given page, with no variation in the way query strings or paths are written (or else search engines will assume these are different pages).
- Good use of the <title> element, as well as meta keywords and descriptions, so that if a page does show up in the results, its summary information will be useful (these elements have little impact on ranking these days, but are useful for indexing); see the sketches after this list.
- Proper use of redirects and other HTTP status codes to help search engines follow, or not follow, certain pages. For example, an HTTP 302 signals a temporary move, so search engines keep the old URL and do not transfer its weighting; a 301 permanent redirect tells them to index the new URL and pass the weighting on.
- Sitemaps, especially if your content is vast (an example follows this list).
- etc
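As a minimal sketch of the title/meta point above (the titles and text here are hypothetical; adapt them to each page):

<head>
  <!-- Shown as the clickable headline in search results -->
  <title>Explaining natural search to business managers</title>
  <!-- Often used for the snippet displayed under the result -->
  <meta name="description" content="Why natural search splits into indexing (a technical task) and ranking (a business/marketing strategy)." />
  <!-- Little ranking impact these days, but harmless for indexing -->
  <meta name="keywords" content="natural search, SEO, indexing, ranking" />
</head>

And for the sitemaps point, the sitemaps.org protocol accepts a simple XML file listing your URLs (the URL and date below are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/articles/natural-search</loc>
    <lastmod>2007-09-09</lastmod>
  </url>
</urlset>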
Side note about HTTP status codes
In some web technologies, such as ASP/ASP.NET, Response.Redirect issues an HTTP 302 and is the most common way to perform redirects. Although the headers can be set manually, there is no convenient method for a permanent redirect, so developers often overlook this crucial difference.
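In ASP.NET, for example, the 301 can be crafted by setting the status and location headers yourself. A minimal sketch in a Web Forms code-behind (the target URL is a hypothetical placeholder):

protected void Page_Load(object sender, EventArgs e)
{
    // Response.Redirect would send a 302; set the permanent redirect by hand instead
    Response.Clear();
    Response.Status = "301 Moved Permanently";
    Response.AddHeader("Location", "http://www.example.com/new-page"); // hypothetical target
    Response.End();
}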
It is especially important during site redesigns to use 301s to redirect old URLs to new ones. Otherwise, all those links people made to your old pages will be lost, and search engines won’t pass the weighting/recognition of those links on to the new pages.
Returning a proper 404 Not Found status from pages that query databases can also be useful when you do not want the search engine to index a page (perhaps your site no longer sells those products).
Similarly, a proper 500 Internal Server Error status is very important. If a temporary glitch results in a page showing error information without the right status code, the search engine will index that error content, even replacing your previously good indexed content!
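A sketch of the 404 case (LookUpProduct, the Product type, and the query string parameter are hypothetical stand-ins for your own data access):

protected void Page_Load(object sender, EventArgs e)
{
    Product product = LookUpProduct(Request.QueryString["id"]); // hypothetical lookup
    if (product == null)
    {
        // Return a real 404 so search engines drop the page, rather than
        // indexing a "not found" message served with a 200 status
        Response.StatusCode = 404;
        return; // then render a friendly "no longer available" message
    }
    // ...render the product page as normal
}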
What about using web standards?
Some web standards advocates will be surprised to read that, from a search engine perspective, I have not included the use of standards-compliant HTML markup and appropriate use of headings.
While these techniques are undoubtedly crucial for accessibility and form the basis of any modern web development strategy, their value for search engine indexing or ranking is questionable (unfortunately), because spammers can so easily abuse elements such as <h1>.
That being said, avoiding table-based layout and following web standards can help because:
- Standards help to minimize code bloat (some search engines limit how much of a page they will index, though this seems to matter less and less).
- A valid page does not hurt your chances of a search engine understanding your content. A page might be so invalid that, even if it somehow renders okay in browsers, a program such as a search engine robot may struggle to make sense of it.
- Proper markup, such as headings, does help users, which can help with ranking indirectly (explained below).
There was a time when having content first and navigation last in the source helped too (something done reasonably easily with CSS, and to some extent with HTML layout tables). But even this technique is less important now. If something like HTML 5 becomes more prominent, elements such as <nav> will make source code order less relevant still (although search engines will need to deal with abuse of those elements too!).
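As a rough illustration (the links and headings are hypothetical), a page using the draft HTML 5 <nav> element labels its navigation explicitly, wherever it sits in the source:

<body>
  <article>
    <h1>Explaining natural search</h1>
    <p>The main content can come first in the source…</p>
  </article>
  <nav>
    <!-- Explicitly marked as navigation, so source order matters less -->
    <a href="/">Home</a>
    <a href="/articles/">Articles</a>
  </nav>
</body>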
Improving ranking; mostly a business/marketing strategy
To get good ranking, however, there is typically very little a web developer can do. Ranking, these days, boils down to search engines trying to determine how popular your site is, which they do by looking at how many links your pages attract and the nature of those inbound links. (You can also provide internal links from some pages to others, and that can sometimes help, but most SEO experts seem to find that external inbound links are key.)
This means the task of getting good ranking is ultimately a business/marketing strategy: sites need to have compelling enough content for others to want to link to them.
There are of course always caveats or exceptions. For example, a new site on a very niche area may rank highly on those niche keywords (but the number of people searching for such niche words may be small too).
One of the few technical things that may help (though it is not really “technical” as such) is training business/marketing people to encourage those linking to your pages to use relevant text in those links, such as the title of the page, instead of “click here” or “more info.”
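For example (hypothetical page and link text):

<!-- Says nothing useful about the target page -->
<a href="/articles/natural-search">Click here</a>

<!-- Descriptive anchor text helps both users and ranking -->
<a href="/articles/natural-search">Explaining natural search to business managers</a>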
Providing content management systems that let content creators supply proper titles, keywords, descriptions, and so on is also important.
What about link farms, keyword stuffing, etc?
You may have received lots of requests to join various link farms so sites can promote each other. Search engines try to watch out for these schemes and only count relevant inbound links, analysing the topics and keywords of the linking site. Also, if the site linking to you is itself determined to be popular, your page’s weighting increases accordingly. Search engines are, in effect, on the lookout for who “votes” for your page.
Some people have tried to stuff keywords everywhere in the content (I am surprised some developers even advocate using the HTML title attribute all over the place in the belief it will aid search engine ranking!). Again, this misses the point: the trick is unlikely to work, and it actually makes the site noisier, especially for those using assistive technologies.
Trying to trick search engines isn’t worth it; you will likely get found out and delisted, and rebuilding your ranking and reputation afterwards will be difficult. For example, BMW was delisted from Google’s listings for serving different content to search engines and users. After changing its practices it was relisted, but few online businesses can afford such a delisting.
Search engine companies such as Google provide guidelines that ultimately advise webmasters not to trick search engines but instead to concentrate on creating sites for humans to consume; search engines will pick up on that and visit accordingly.
Here are some examples from Google:
Don’t fill your page with lists of keywords, attempt to “cloak” pages, or put up “crawler only” pages. If your site contains pages, links, or text that you don’t intend visitors to see, Google considers those links and pages deceptive and may ignore your site.
Don’t feel obligated to purchase a search engine optimization service. Some companies claim to “guarantee” high ranking for your site in Google’s search results. While legitimate consulting firms can improve your site’s flow and content, others employ deceptive tactics in an attempt to fool search engines. Be careful; if your domain is affiliated with one of these deceptive services, it could be banned from our index.
Don’t use images to display important names, content, or links. Our crawler doesn’t recognize text contained in graphics. Use ALT attributes if the main content and keywords on your page can’t be formatted in regular HTML.
— How can I create a Google-friendly site?, Google.com, accessed September 9, 2007
And:
Quality guidelines – basic principles
- Make pages for users, not for search engines. Don’t deceive your users or present different content to search engines…
- Avoid tricks intended to improve search engine rankings. … ask, “Does this help my users? Would I do this if search engines didn’t exist?”
- Don’t participate in link schemes designed to increase your site’s ranking or PageRank. … avoid links to web spammers … as your own ranking may be affected adversely by those links.
— Webmaster Guidelines, Google.com, accessed September 9, 2007
What works now may not work in the future
Search engine companies are always looking to improve their algorithms, so what seems true today may not be the case tomorrow. They also guard their algorithms as closely as possible, so a lot of the above comes from trial and error.
I often come across people who encourage techniques that are no longer as relevant as they may once have been. It is a rapidly changing area. For example, a useful search engine resource, SEOmoz, provides an excellent summary of ranking factors for 2007 and compares them with those from just two years earlier, showing that even in that short time the factors changed considerably.
Here is a part of their summary:
Top 10 Ranking Factors in 2005:
- Title Tag
- Anchor Text of Links
- Keyword Use in Document Text
- Accessibility of Document
- Links to Document from Site-Internal Pages
- Primary Subject Matter of Site
- External Links to Linking Pages
- Link Popularity of Site in Topical Community
- Global Link Popularity of Site
- Keyword Spamming
Top 10 Ranking Factors in 2007:
- Keyword Use in Title Tag
- Global Link Popularity of Site
- Anchor Text of Inbound Link
- Link Popularity within the Site’s Internal Link Structure
- Age of Site
- Topical Relevance of Inbound Links to Site
- Link Popularity of Site in Topical Community
- Keyword Use in Body Text
- Global Link Popularity of Linking Site
- Topical Relationship of Linking Page
For me, this is one of the most valuable documents on the web for determining how to approach an overall SEO strategy. While the factors may not be perfect, they give a remarkably concise and trustworthy view of what makes a site rank well at Google. I hope you all enjoy it as much as I have – please add your thoughts in the comments!
— Ranking Factors Version 2 Released, April 3, 2007, SEOmoz
It takes time
Businesses understandably expect a new site to start ranking highly quickly, especially if it is a prominent brand. The reality, however, is that it takes time (and effort) to build up a critical mass of quality inbound links. Some search engine companies, such as Google, even seem to factor the age of a site into some of their ranking algorithms.
At other times, new content, even from a new site (e.g. a news-based one), can rank highly quickly (temporarily or for a long time), especially on niche topics.
The bottom line, though, is that nothing is really guaranteed!
Terminology confuses matters?
“SEO” is probably a misleading phrase; you don’t optimise Google (unless you work there!).
Maybe two terms should be used to help with communication. Search Engine Marketing (SEM) is already often used when talking about making a site compelling enough for others to link to it, improving its ranking. So how about something like Search Engine Visibility (SEV) when talking about indexing?
Or maybe I am contributing to the confusion by trying to introduce another TLA!
Content is king. If you look at all of the net’s top sites, they concentrate not on keywords and SEO but on the user. That is what got them to their current standing.
Great article. Thanks!
Thanks Oleg. Yes, “Content is king.” Well put! Maybe I should have replaced the entire post with that phrase 🙂
Hi Anup,
Good to see you write about SEO. I am writing a book on the subject, so it will be good to talk to you about this face to face at some point.
Quick note here. It does not have to take a long time for a site to rank high (even first). It all depends on context really. Everything in SEO is relative, just like in semantics.
I have managed to rank a few sites on the first page of Google, only because they were the first proper information resources for given niche keywords.
On the other hand, it can take forever to rank a site even on the first page of Google, let alone have it be the first result. For some highly competitive keywords, some companies manage thousands of blogs to ensure a steady flow of ‘organic’ referral links towards their main site to rank it higher.
There is so much that could be written about this topic; hence my book effort on the subject.
Anyway – well done on putting this together, especially the bit about explaining it to the business people.
Jason: Thanks for the comments. Yes, I mentioned that new sites can rank quickly for niche topics. I have also worked on sites that already rank highly in general and new pages on such a site can sometimes rank highly quite quickly, too.
This site is an interesting example: I have not tried (or had time!) to really push it out anywhere or gather any interest so generally of course it doesn’t currently rank anywhere. That being said a few articles are already ranking reasonably well due to the niche topic (I think some of my posts on HTML 5, for example).
Will be interested to talk to you about your book. Sounds intriguing!
Link Popularity
It is defined as the number of incoming links to your website. In other words, it can be explained as the number of votes (links) other websites cast for your website. The more popular your site is, the more other websites will want to link to it.
Link popularity is so important because it drives a huge amount of traffic to a site. When people search for something on the net and find something useful, they will click through to visit the site. If your site gets linked from a highly popular site, there can be a huge surge in traffic.
Link popularity is also crucial to the search engine ranking of your website. Search engines, especially Google, claim that link popularity is one of the key factors they consider in their algorithms when indexing a website. A website’s rank in search results depends on its link popularity in a very big way.
We should try to link to websites with high-quality, useful content.
Backlinks are so important that building link popularity by exchanging links with other websites should be a priority in your online marketing effort.
Link popularity can be built through:
1) Link exchange
2) Directory submission
3) Forum posting
4) Blog posting
5) Article writing
6) Guest books and comments
7) Participating in affiliate programs
Thanks for your comment, Palcom. I would suggest that some activities, such as link exchange, may not be valuable; search engines are catching on to, and penalizing, link exchanges that exist purely for search engine promotion rather than to point readers at valuable content. Your general points are good, though.
I am stuck maintaining two sites. My http://www.verylightjetmagazine.com enjoys a hard-won page one on Google, but it needed to be redesigned. I chose the open-source Joomla for the new site, http://www.vljmag.com
The new site has no ranking; the old site has an incredibly valuable ranking, which I depend on for my 8,000–10,000 visitors per month.
I need a redirect strategy (perhaps 301s) that ensures I don’t lose that ranking.
You guys are the experts- any suggestions?
Sincerely,
Bill Strait
Bill,
Unfortunately this is not really a forum for SEO advice, but I’ll try to give some quick hints/thoughts:
If your old domain has a lot of weighting and value already, it may be worth trying to keep using that same domain rather than changing it.
You should also 301 all your old pages to the new corresponding pages, not just to the home page of the new site. If you are using Apache, you can set up 301 permanent redirects in your Apache configuration (e.g. httpd.conf if you control the server, or .htaccess if you are on a shared host).
In your config file you can then do things like this for permanent redirects:
Redirect permanent /old http://www.newdomain.com/new
Where “old” and “new” can be paths to files or directories.
The URL structures of your two sites look different, so you may end up having to write quite a lot of these, but Apache’s URL rewriting capabilities are quite sophisticated. It would be worth looking at the Apache documentation or googling for examples to help reduce the URL translation effort.
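For instance, if the old URLs follow a pattern, a mod_rewrite rule can redirect whole groups of them at once. A sketch for a .htaccess file (the pattern and target here are hypothetical illustrations; adjust them to your actual structures):

# Permanently redirect any old article URL to its new home
RewriteEngine On
RewriteRule ^articles/([0-9]+)\.html$ http://www.vljmag.com/content/view/$1 [R=301,L]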
Here are a couple of links which might be useful:
http://httpd.apache.org/docs/2.0/mod/mod_alias.html
http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
http://www.mcanerin.com/EN/articles/301-redirect-apache.asp
Good luck!
A good business needs a site that appears on the first page of a search query. Natural SEO helps promote a website towards the first position in search results.
@martin: Natural search helps, but I don’t think you can ever guarantee a first position in search results.
Perfect web design and unique content are your first steps towards success in the search engines. Finish those tasks and you will rank better 🙂
Thanks Anup – most helpful.
Hello Anup.
Much is written about external linking and its value. Does internal linking have any value? How does it measure up to external linking?
Thanks,
Carol
@Carol: From what I have read and experienced, internal cross-linking helps: it helps search engines recognise relationships between pages. From what I have seen, though, (good quality) external links are more important. But the two go hand in hand; internal cross-linking can highlight articles deeper within the site, which in turn can help build inbound links from external sites.
Good job of research. Thanks for the informative post.
That’s a very comprehensive post about SEO! Thanks for sharing this!
WOW! Great info. The difference between search engine indexing and search engine ranking is very useful!
I am a newbie in search engine optimisation, but I think that submitting articles to article directories is one of the best ways to gain backlinks.
@Tomas: you could submit them, but if those directories are themselves low quality and not relevant to your content, I doubt it will help as much as getting even fewer backlinks from higher-quality, relevant sites.
Hey,
First of all, you did a very good job of defining these factors.
I agree with what Luqman said: it is a fact that perfect web design and unique content are your first steps towards success in the search engines.