ASP.NET and Search Engines: friends or foes?

Are ASP.NET and search engines friends or foes? It is a common doubt among web application developers. In this post of mine, I will share with you the facts and myths behind the puzzle.

First of all, let me recall some terms from search engine optimization.

Indexing is the technique search engines use to find the content on your website that is relevant to particular keywords. To elaborate: whenever a new website is created, or the content of an existing site is updated, the search engine crawls through the site to collect the possible combinations of search keywords. This process is known as indexing.

Ranking is another popular term in the SEO circle. The technique by which search engines rate pages for a particular search query is known as ranking. The search engine displays the results for the keywords in order of the rank generated during ranking. The factors influencing the rank include how accurately the page matches the search key, the number of hits on your page, and activity (how often the site is updated and visited).

Your site is well optimized for search engines when it is easy to index and ranks well. In my previous post on SEO, I covered the basic things to be done for better visibility of your sites.

Let us go into our topic now: is ASP.NET search engine friendly?

No, it is not, at least not by default. There has been a claim that ASP.NET is not as friendly as other web technologies for creating web applications. The major arguments supporting that claim are its complex, unfriendly URLs and its huge page sizes, caused by overusing state management techniques such as hidden fields. So it is recommended to give meaningful URLs and to keep the use of hidden fields optimal. Search engines do not like duplicate content (especially Google), server errors left unhandled by the code, clumsy meaningless URLs, or huge page sizes due to hidden state and other server-based state management techniques that make indexing a headache. Unreachable content caused by inefficient use of AJAX, poor-quality content, and poor usage of web standards, such as a missing page title or poor tagging, also make indexing tough. Shouldn't a search engine give us millions of related results in seconds instead of struggling to index a poorly managed page?
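For the URL problem specifically, ASP.NET (from .NET 4 onwards) ships with System.Web.Routing, which lets a Web Forms page answer on a friendly path instead of a query string. Here is a minimal sketch; the route name, the URL pattern, and the Articles.aspx page are hypothetical names used for illustration:

```csharp
// Sketch: friendly-URL routing for ASP.NET Web Forms (.NET 4+).
// Registered once at application startup in Global.asax.
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    void Application_Start(object sender, System.EventArgs e)
    {
        // Crawlers see /articles/seo-basics instead of something
        // like Articles.aspx?id=17, which indexes far more cleanly.
        RouteTable.Routes.MapPageRoute(
            "ArticleRoute",       // hypothetical route name
            "articles/{slug}",    // hypothetical URL pattern
            "~/Articles.aspx");   // hypothetical physical page
    }
}
```

The page itself can then read the `slug` route value from `Page.RouteData` to decide what to render.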

So, how can we make our ASP.NET sites search engine friendly? Obviously, by eliminating those mistakes while developing the site. Make your site more visible:

  • by making appropriate use of tags
  • by giving a proper title to each page
  • by creating meaningful URLs that make indexing simple
  • by updating the site often with appropriate information
  • by keeping hidden fields to a minimum and using alternative state management techniques in their place
  • by placing AJAX calls effectively (it is recommended to fire them only once the page has completely loaded)
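The last point above can be sketched in plain JavaScript. This is only an illustration under assumed names: the `/related-posts` endpoint and the `related` element id are hypothetical, and the loading logic is kept in a separate function so it does not depend on the browser:

```javascript
// Sketch: defer non-essential AJAX until the page has fully loaded,
// so the primary content is available to crawlers without waiting
// on extra requests.

function renderRelated(items) {
  // Turn a list of post titles into simple list-item markup.
  return items.map(function (title) {
    return '<li>' + title + '</li>';
  }).join('');
}

function loadRelated(fetchJson, render) {
  // fetchJson is injected so this logic stays testable outside the
  // browser; in a real page it would wrap fetch().
  return fetchJson('/related-posts').then(function (items) {
    return render(items);
  });
}

// In the page, wire the call to the window load event so it fires
// only after the main content has been delivered:
//
// window.addEventListener('load', function () {
//   loadRelated(function (url) { return fetch(url).then(function (r) { return r.json(); }); },
//               renderRelated)
//     .then(function (html) {
//       document.getElementById('related').innerHTML = html;
//     });
// });
```

Because the extra widget loads after the main content, a slow or failing secondary request never blocks what the crawler came to index.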

Apart from all that, there is no substitute for quality content when it comes to increasing your hits. Unique keywords are the key to better visibility.
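On the implementation side, the page-title and hidden-field items from the checklist above reduce to a few lines of Web Forms code-behind. A sketch, assuming a standard ASP.NET Web Forms page; `ArticlePage`, `ArticleGrid`, and the title text are hypothetical names for illustration:

```csharp
using System;
using System.Web.UI;

public partial class ArticlePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // A descriptive, unique title helps both indexing and ranking.
        Page.Title = "ASP.NET and Search Engines: friends or foes?";

        // Disabling view state on a heavy data-bound control keeps the
        // hidden __VIEWSTATE field small, trimming the page size that
        // search engines have to download and parse.
        ArticleGrid.EnableViewState = false;
    }
}
```

View state can also be switched off for a whole page with `EnableViewState="false"` in the `@ Page` directive when no control on the page needs it.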

Look forward to more information about SEO in my future blog post on MVC and SEO. Happy coding!

