Why ASP.NET Web API (Web API)?

Today, a web-based application alone is not enough to reach your customers. People are savvy: they use iPhones, other mobile phones, tablets, and similar devices in their daily lives, and these devices carry a wealth of apps that make life easier. We are, in effect, moving from the web toward an app-centric world.

So, if you want to expose your service and data to browsers as well as to all of these modern device apps in a fast and simple way, you need an API that is compatible with browsers and with all of these devices.

For example, Twitter, Facebook, and Google all expose APIs for web applications and phone apps.

Web API is a great framework for exposing your data and services to many different devices. Moreover, Web API is open source and an ideal platform for building RESTful services on the .NET Framework. Unlike a WCF REST service, it uses the full feature set of HTTP (URIs, request/response headers, caching, versioning, various content formats), and you don’t need to define any extra configuration settings for different devices.


Web API Features

  1. It supports convention-based CRUD actions, since it works with the HTTP verbs GET, POST, PUT, and DELETE.
  2. Requests specify the desired format in an Accept header, and responses carry an HTTP status code.
  3. Responses are formatted by Web API’s MediaTypeFormatters into JSON, XML, or whatever format you add as a MediaTypeFormatter.
  4. It can accept and generate content that is not object oriented, such as images or PDF files.
  5. It has automatic support for OData: by placing the [Queryable] attribute on a controller method that returns IQueryable, clients can use the method for OData query composition.
  6. It can be hosted within the application (self-hosted) or on IIS.
  7. It also supports MVC features such as routing, controllers, action results, filters, model binders, and IoC containers/dependency injection, which make it simpler and more robust.
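To make point 3 concrete, here is a minimal Python sketch of the content-negotiation idea: the framework picks a formatter keyed by the request’s Accept media type. The names (`negotiate`, `format_json`, `format_xml`) are hypothetical illustrations, not the actual ASP.NET MediaTypeFormatter API:

```python
import json

def format_json(obj):
    # Serialize the resource as JSON (sorted keys for stable output).
    return json.dumps(obj, sort_keys=True)

def format_xml(obj):
    # Naive XML rendering for flat dictionaries, for illustration only.
    fields = "".join("<{0}>{1}</{0}>".format(k, v) for k, v in sorted(obj.items()))
    return "<item>{0}</item>".format(fields)

# Registry of media type -> formatter, analogous to a formatter collection.
FORMATTERS = {
    "application/json": format_json,
    "application/xml": format_xml,
}

def negotiate(accept_header, obj):
    # Fall back to JSON when the requested media type is unsupported.
    formatter = FORMATTERS.get(accept_header, format_json)
    return formatter(obj)
```

A client sending `Accept: application/xml` would get the XML rendering of the same resource object, with no change to the service code.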

Why choose Web API?

  1. If we need a web service and don’t need SOAP, then ASP.NET Web API is the best choice.
  2. It is used to build simple, non-SOAP-based HTTP services on top of the existing WCF message pipeline.
  3. It doesn’t have the tedious and extensive configuration of a WCF REST service.
  4. Service creation is simple with Web API; with WCF REST services, it is difficult.
  5. It is based purely on HTTP, and is easy to define, expose, and consume in a RESTful way.
  6. Its lightweight architecture is good for devices with limited bandwidth, such as smartphones.
  7. It is open source.

What do you think?

I hope you have enjoyed the article and now have a clear picture of Web API. I would like to hear from my blog readers; your valuable feedback, questions, and comments about this article are always welcome.

.Net Developer
Vaibhav Kothia


When should indexes be avoided?

Although indexes are intended to enhance a database’s performance, there are times when they should be avoided. The following guidelines indicate when the use of an index should be reconsidered:

  • Indexes should not be used on small tables.
  • Indexes should not be used on tables that have frequent, large batch update or insert operations.
  • Indexes should not be used on columns that contain a high number of NULL values.
  • Indexes should not be used on columns that are frequently manipulated.
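A rough way to quantify the first and third guidelines is index selectivity: the fraction of rows an index key can actually distinguish. A low value (e.g. a column that is mostly NULL) means the index narrows a search very little while you still pay its maintenance cost on every write. The helper below is a hypothetical Python sketch of that heuristic, not a feature of any particular database:

```python
def index_selectivity(values):
    # Selectivity = distinct non-NULL values / total rows.
    # Values near 1.0 suggest a useful index; values near 0.0
    # (few distinct keys, or mostly NULLs) suggest the optimizer
    # would gain little from it.
    if not values:
        return 0.0
    non_null = [v for v in values if v is not None]
    return len(set(non_null)) / len(values)
```

For instance, a status column holding only “paid”/“free” across millions of rows scores near zero, matching the advice above.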

300,000 Records via LINQ Data Sources

A unique LinqServerModeDataSource control was specifically designed by DevExpress for the ASPxGridView, allowing it to work efficiently with large amounts of data by using LINQ Server Mode. In this mode, only small portions of the required data are loaded into the ASPxGridView on demand, and all the necessary data processing (such as grouping or sorting) is performed on the database server. This technique significantly reduces the application’s response time.

Test the application’s performance by switching between two data sources and performing the same data operations (grouping or sorting) within the ASPxGridView control.




<%@ Page Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true" CodeFile="LinqDataSourceServerMode.aspx.cs"
    Inherits="DataBinding_LinqDataSourceServerMode" EnableSessionState="ReadOnly" %>

<asp:Content ID="Content2" ContentPlaceHolderID="ContentHolder" runat="Server">
    <dx:CreateDatabaseControl runat="server" ID="CreateDatabaseControl" TableKey="GeneratedEmailTable" />
    <div runat="server" id="Demo">
        <script type="text/javascript">
            // <![CDATA[
            var start;

            function grid_Init(s, e) {
            }
            function grid_BeginCallback(s, e) {
                start = new Date();
            }
            function grid_EndCallback(s, e) {
                ClientTimeLabel.SetText(new Date() - start);
            }
            // ]]>
        </script>
        <div style="float: left">
            <b>Data Source</b>
            <div class="TopPadding">
                <dx:ASPxRadioButton runat="server" Text="LinqServerModeDataSource (DevExpress data source)"
                    AutoPostBack="True" ID="rbLinqDevExpress" GroupName="DataSource" Checked="True" />
                <dx:ASPxRadioButton runat="server" Text="LinqDataSource (standard .NET data source)"
                    AutoPostBack="True" ID="rbLinqMS2008" GroupName="DataSource" />
            </div>
        </div>
        <div style="float: left; margin-left: 4%">
            <b>Test Results</b>
            <div class="TopPadding LeftPadding">
                <dx:ASPxLabel runat="server" ID="ASPxLabel1" Text="Command:" />
                <dx:ASPxLabel runat="server" ID="ASPxLabel2" ClientInstanceName="ClientCommandLabel"
                    Text="..." />
            </div>
            <div class="TopPadding">
                <dx:ASPxLabel runat="server" ID="ASPxLabel3" Text="Time taken (ms):" />
                <dx:ASPxLabel runat="server" ID="ASPxLabel4" ClientInstanceName="ClientTimeLabel"
                    Text="..." />
            </div>
        </div>
        <b class="Clear"></b>
        <br /><br />
        <dx:ASPxGridView ID="grid" ClientInstanceName="grid" runat="server" KeyFieldName="ID"
            Width="100%" AutoGenerateColumns="False" OnCustomColumnDisplayText="grid_CustomColumnDisplayText"
            OnSummaryDisplayText="grid_SummaryDisplayText">
            <Columns>
                <dx:GridViewDataTextColumn FieldName="From" VisibleIndex="0" />
                <dx:GridViewDataTextColumn FieldName="Subject" VisibleIndex="1" />
                <dx:GridViewDataDateColumn FieldName="Sent" VisibleIndex="2" />
                <dx:GridViewDataCheckColumn Caption="Attachment?" FieldName="HasAttachment" VisibleIndex="3" />
                <dx:GridViewDataTextColumn FieldName="Size" VisibleIndex="4" Width="80">
                    <Settings AllowAutoFilter="false" />
                </dx:GridViewDataTextColumn>
            </Columns>
            <Settings ShowFilterRow="True" ShowFilterRowMenu="true" ShowGroupPanel="True" ShowFooter="True" />
            <ClientSideEvents Init="grid_Init" BeginCallback="grid_BeginCallback" EndCallback="grid_EndCallback" />
            <TotalSummary>
                <dx:ASPxSummaryItem FieldName="Size" SummaryType="Sum" />
                <dx:ASPxSummaryItem SummaryType="Count" />
            </TotalSummary>
        </dx:ASPxGridView>
        <asp:LinqDataSource ID="LinqDataSource" runat="server" ContextTypeName="LargeDatabaseDataContext"
            TableName="Emails" />
        <dx:LinqServerModeDataSource ID="LinqServerModeDataSource" runat="server" ContextTypeName="LargeDatabaseDataContext"
            TableName="Emails" />
    </div>
</asp:Content>





using System;
using System.Web.UI;
using DevExpress.Web.Demos;
using DevExpress.Web.ASPxGridView;

public partial class DataBinding_LinqDataSourceServerMode : Page {

    static DataBinding_LinqDataSourceServerMode() {
    }

    protected void Page_Init(object sender, EventArgs e) {
        var ready = DatabaseGenerator.IsReady(CreateDatabaseControl.TableKey);
        CreateDatabaseControl.Visible = !ready;
        Demo.Visible = ready;
    }

    protected void Page_Load(object sender, EventArgs e) {
        // Bind the grid to whichever data source the radio buttons select.
        grid.DataSourceID = rbLinqDevExpress.Checked ? "LinqServerModeDataSource" : "LinqDataSource";
    }

    protected void grid_CustomColumnDisplayText(object sender, ASPxGridViewColumnDisplayTextEventArgs e) {
        if(e.Column.FieldName == "Size")
            e.DisplayText = FormatSize(e.Value);
    }

    protected void grid_SummaryDisplayText(object sender, ASPxGridViewSummaryDisplayTextEventArgs e) {
        if(e.Value == null)
            return;
        if(e.Item.FieldName == "Size")
            e.Text = FormatSize(e.Value);
    }

    string FormatSize(object value) {
        string unit = "b";
        double amount = Convert.ToDouble(value);
        if(amount > 1000) {
            amount /= 1024;
            unit = "Kb";
            if(amount > 1000) {
                amount /= 1024;
                unit = "Mb";
                if(amount > 1000) {
                    amount /= 1024;
                    unit = "Gb";
                }
            }
        }
        return String.Format("{0:#,0.##} {1}", amount, unit);
    }
}
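For readers who want to sanity-check the size-formatting logic outside of .NET, here is a rough Python port of the FormatSize helper (an illustrative sketch mirroring the divide-by-1024-while-over-1000 rule, not the DevExpress code itself):

```python
def format_size(value):
    # Walk up the unit ladder while the amount is still over 1000,
    # dividing by 1024 each step, as the C# helper does.
    units = ["b", "Kb", "Mb", "Gb"]
    amount = float(value)
    unit = units[0]
    for next_unit in units[1:]:
        if amount > 1000:
            amount /= 1024
            unit = next_unit
        else:
            break
    # Mimic "#,0.##": up to two decimals, trailing zeros trimmed.
    text = "{0:,.2f}".format(amount).rstrip("0").rstrip(".")
    return text + " " + unit
```

So 2048 bytes renders as “2 Kb” and 1536 bytes as “1.5 Kb”, matching what the grid column displays.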


Server Mode: Binding to Data Source Using ‘LINQ to SQL Classes’

When a data-aware control operates in server mode, it delegates all data processing to the server and loads only the records to be displayed on screen. This allows you to dramatically increase performance against large datasets.

To enable a server mode (regular or Instant Feedback) you need to create an appropriate data source. You can use data sources provided by the eXpress Persistent Objects (XPO) library, or use dedicated data sources tailored to work with ‘LINQ to SQL Classes’. This topic shows how to use LINQ query providers.

If you want to enable a regular server mode, use the LinqServerModeSource data source. If you want to enable an Instant Feedback (async) server mode, use the LinqInstantFeedbackSource data source.

Setting up PHP with IIS 6 on Windows XP

This is a small guide on setting up PHP under IIS 6 on a Windows XP/2003 server. If you have any questions or queries about it please don’t hesitate to leave me a comment.

Installing PHP

  • Get the latest PHP (5.2.6 at present) from PHP.net’s download page. Download the zip package rather than the installer.
  • Extract the zip file to c:\php
  • Copy php.ini-dist file to php.ini
  • Edit the c:\php\php.ini file and
    • Set session.save_path to a temporary directory on your server; you could create a dedicated one for this (e.g. c:\php\session)
    • Set upload_tmp_dir to a temporary directory on your server; you could create a dedicated one for this (e.g. c:\php\upload)
    • Set the SMTP setting to the IP of your SMTP server. It won’t work with authentication, so make sure you can send from your web server through that SMTP server without a user/pass
    • Set sendmail_from to the email address you want all emails from the server to come from
    • Set upload_max_filesize to something a little larger (if necessary). The default is 2M; 8M is normally OK.
    • Set your extension_dir=C:\php\ext
    • Set post_max_size to at least the size of your upload_max_filesize (plus a little more). The default is 8M; 10M should be OK if you increased upload_max_filesize to 8M.
    • Uncomment cgi.force_redirect (by removing the ; at the start of the line) and change its value from 1 to 0
    • Add these lines to the end of the file
  • Save and exit the file
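The original post does not show which lines it wants appended to php.ini. For PHP 5.2 on Windows these were typically `extension=` entries; the exact set depends on your application, so treat the following as a hypothetical example rather than the author’s list:

```ini
; Hypothetical example only - enable the extensions your application needs
extension=php_mysql.dll
extension=php_gd2.dll
extension=php_mbstring.dll
```

Each named .dll must exist in the extension_dir you configured above (C:\php\ext).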

Setting up IIS and file/directory permissions

  • Now open the IIS management console by going to Start->Run and typing %SystemRoot%\system32\inetsrv\iis.msc
  • Find out which user we need to give write access to:
    • Edit the properties of the web site you are going to install the PHP application into
    • On the Directory Security tab, click the Edit button under “Anonymous access and authentication control”
    • Copy down the user name under the anonymous access section (it is allowed, right?). It’s usually COMPUTER\IUSR_COMPUTER, where COMPUTER is the name of the computer running IIS
  • Now right-click on the c:\php\session directory, go to Properties, click the Security tab, click Add, and in the box type the name you copied down before.
  • Click check names and it should underline the name, then click ok.
  • Click modify and it should check all the other necessary boxes for you.
  • Click ok
  • If there is no security tab, then in windows explorer
    • Go to the Tools menu, choose ‘Folder Options’.
    • Choose the ‘View’ tab and find the option called ‘Use simple file sharing’ (usually the last one) and untick it.
    • Click apply, then ok.
    • Go back to the previous step
  • Repeat this for the c:\php\upload directory and any other directories which need to be writeable by your php application.
  • Now go back to the IIS management console, right-click on the website you want to set up PHP for, and go to Properties.
  • Click the Documents tab, click Add, type in index.php, and click OK. This tells the web server to use index.php as a default document.
  • Now, still in the IIS management console, click the Home Directory tab, click the Configuration button, and then click Add.
  • Browse to c:\php\php-cgi.exe for the executable and enter .php for the extension. We want it for All Verbs (checked by default); check Script engine, uncheck “Check that file exists”, then click OK.
  • Then click OK until all the dialogs are closed, and restart your website in the IIS management console by clicking the restart button in the toolbar or by right-clicking and choosing Restart from the tasks menu.
  • Now create a file in your web root called info.php, put <?php phpinfo(); ?> in it, and browse to that file in your web browser via its URL. It should bring up a phpinfo page, which will let you check that MySQL is enabled, the session.save_path is right, etc.

List of best and worst practices for designing and developing a high-traffic website



Keywords

Keywords in <title> tag

This is one of the most important places to have a keyword,
because what is written inside the <title> tag shows up in
search results as your page title. The title tag must be short (6
or 7 words at most) and the keyword must be near the beginning.



Keywords in URL

Keywords in URLs help a lot – e.g. a URL that contains
“SEO services”, where “SEO services” is the keyword phrase you
attempt to rank well for. But if you don’t have the keywords in
other parts of the document, don’t rely on having them in the
URL alone.



Keyword density in document text

Another very important factor you need to check.
3-7% for major keywords is best, 1-2% for minor ones. A keyword
density of over 10% is suspicious and looks more like keyword
stuffing than naturally written text.
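The density figure above is easy to compute yourself. Here is a hypothetical Python helper (illustrative, counting whole-word matches of the phrase against the total word count):

```python
import re

def keyword_density(text, phrase):
    # Density = (matched phrase occurrences * words in phrase) / total words, as a percent.
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)
```

A page scoring well above 10% by a measure like this is likely to read as keyword stuffing to both users and engines.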



Keywords in anchor text

Also very important, especially for the
anchor text of inbound links, because if you have the keyword
in the anchor text in a link from another site, this is regarded
as getting a vote from this site not only about your site in
general, but about the keyword in particular.



Keywords in headings (<H1>, <H2>, etc.)

One more place where keywords count a lot. But make
sure that your page has actual text about the particular
keyword.



Keywords in the beginning of a document

Also counts, though not as much as anchor text,
the title tag, or headings. However, keep in mind that the beginning
of a document does not necessarily mean the first paragraph –
for instance, if you use tables, the first paragraph of text might
be in the second half of the table.



Keywords in <alt> tags

Spiders don’t read images but they do read their
textual descriptions in the <alt> tag, so if you have
images on your page, fill in the <alt> tag with some
keywords about them.



Keywords in metatags

Less and less important, especially for Google.
Yahoo! and Bing still rely on them, so if you are optimizing for
Yahoo! or Bing, fill these tags properly. In any case, filling
these tags properly will not hurt, so do it.



Keyword proximity

Keyword proximity measures how close in the text
the keywords are. It is best if they are immediately one after
the other (e.g. “dog food”), with no other words
between them. For instance, if you have “dog” in the
first paragraph and “food” in the third paragraph,
this also counts but not as much as having the phrase “dog
food” without any other words in between. Keyword proximity
is applicable for keyword phrases that consist of 2 or more words.
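Keyword proximity can be measured as the number of words separating the two keywords. A hypothetical Python sketch:

```python
def min_word_gap(text, w1, w2):
    # Smallest number of words between any occurrence of w1 and any of w2;
    # 0 means they are adjacent ("dog food"). Returns None if either is absent.
    words = text.lower().split()
    pos1 = [i for i, w in enumerate(words) if w == w1]
    pos2 = [i for i, w in enumerate(words) if w == w2]
    if not pos1 or not pos2:
        return None
    return min(abs(i - j) - 1 for i in pos1 for j in pos2)
```

A gap of 0 corresponds to the exact phrase; larger gaps still count toward proximity, just less.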



Keyword phrases

In addition to keywords, you can optimize for
keyword phrases that consist of several words – e.g. “SEO
services”. It is best when the keyword phrases you optimize
for are popular ones, so you can get a lot of exact matches of
the search string, but sometimes it makes sense to optimize for 2
or 3 separate keywords (“SEO” and “services”)
rather than for one phrase that might only occasionally get an exact match.



Secondary keywords

Optimizing for secondary keywords can be a gold
mine, because when everybody else is optimizing for the most
popular keywords, there will be less competition (and probably
more hits) for pages that are optimized for the minor words. For
instance, “real estate new jersey” might get a
thousand times fewer hits than “real estate” alone, but
if you are operating in New Jersey, you will get less but
considerably better-targeted traffic.



Keyword stemming

For English this is not so much of a factor, because
words that stem from the same root (e.g. dog, dogs, doggy, etc.)
are considered related, and if you have “dog” on your
page you will get hits for “dogs” and “doggy”
as well. But for other languages keyword stemming can be an
issue, because different words that stem from the same root are
considered unrelated, and you might need to optimize for all
of them.




Synonyms

Optimizing for synonyms of the target keywords, in
addition to the main keywords, is also good practice. This works for sites in
English, for which search engines are smart enough to use synonyms as
well when ranking sites, but for many other languages synonyms
are not taken into account when calculating rankings and
relevancy.



Keyword Mistypes

Spelling errors are very frequent, and if you know
that your target keywords have popular misspellings or
alternative spellings (e.g. Christmas and Xmas), you might be
tempted to optimize for them. Yes, this might get you some more
traffic, but having spelling mistakes on your site does not make a
good impression, so you’d better not do it, or do it only in
the metatags.



Keyword dilution

When you are optimizing for an excessive amount of
keywords, especially unrelated ones, this will affect the
performance of all your keywords and even the major ones will be
lost (diluted) in the text.



Keyword stuffing

Any artificially inflated keyword density (10% and
over) is keyword stuffing, and you risk getting banned from search
engines.


Links – internal, inbound, outbound


Anchor text of inbound links

As discussed in the Keywords section, this is one
of the most important factors for good rankings. It is best if
you have a keyword in the anchor text but even if you don’t, it
is still OK.



Origin of inbound links

Besides the anchor text, it is important if the
site that links to you is a reputable one or not. Generally sites
with greater Google PR are considered reputable.



Links from similar sites

Having links from similar sites is very, very
useful. It indicates that the competition is voting for you and
you are popular within your topical community.



Links from .edu and .gov sites

These links are precious because .edu and .gov
sites are more reputable than .com. .biz, .info, etc. domains.
Additionally, such links are hard to obtain.



Number of backlinks

Generally, the more, the better. But the reputation
of the sites that link to you is more important than their
number. Their anchor text also matters: is there a keyword
in it, how old are the links, and so on.



Anchor text of internal links

This also matters, though not as much as the anchor
text of inbound links.



Around-the-anchor text

The text that is immediately before and after the
anchor text also matters because it further indicates the
relevance of the link – i.e. if the link is artificial or
it naturally flows in the text.



Age of inbound links

The older, the better. Getting many new links in a
short time suggests buying them.



Links from directories

Great, though it strongly depends on which
directories. Being listed in DMOZ, the Yahoo! Directory, and similar
directories is a great boost for your ranking, but having tons of
links from PR0 directories is useless, and it can even be regarded
as link spamming if you have hundreds or thousands of such
links.



Number of outgoing links on the page that links to you

The fewer, the better for you because this way your
link looks more important.



Named anchors

Named anchors (the target place of internal links)
are useful for internal navigation but are also useful for SEO
because you stress additionally that a particular page, paragraph
or text is important. In the code, named anchors look like this:
<A href= “#dogs”>Read about dogs</A> and
“#dogs” is the named anchor.



IP address of inbound link

Google denies that it discriminates against links that come from
the same IP address or C class of addresses, so for Google the IP
address can be considered neutral to the weight of inbound links.
However, Bing and Yahoo! may discard links from the same IPs or IP
classes, so it is always better to get links from different IPs.



Inbound links from link farms and other suspicious sites

This does not affect you in any way, provided that
the links are not reciprocal. The idea is that it is beyond your
control to define what a link farm links to, so you don’t get
penalized when such sites link to you because this is not your
fault but in any case you’d better stay away from link farms and
similar suspicious sites.



Many outgoing links

Google does not like pages that consist mainly of
links, so you’d better keep them under 100 per page. Having many
outgoing links does not get you any benefits in terms of ranking
and could even make your situation worse.



Excessive linking, link spamming

It is bad for your rankings, when you have many
links to/from the same sites (even if it is not a cross-linking
scheme or links to bad neighbors) because it suggests link buying
or at least spamming. In the best case only some of the links are
taken into account for SEO rankings.



Outbound links to link farms and other suspicious sites

Unlike inbound links from link farms and other
suspicious sites, outbound links to bad
neighbors can drown you. You need periodically to check the
status of the sites you link to because sometimes good sites
become bad neighbors and vice versa.




Cross-linking

Cross-linking occurs when site A links to site B,
site B links to site C, and site C links back to site A. This is
the simplest example, but more complex schemes are possible.
Cross-linking looks like disguised reciprocal link trading and is
likely to be penalized.


Single pixel links

When you have a link that is a pixel or so wide, it
is invisible to humans, so nobody will click on it, and it is
obvious that this link is an attempt to manipulate search
engines.




Metatags

<Description> metatag

Metatags are becoming less and less important, but
if there are metatags that still matter, these are the
<description> and <keywords> ones. Use the
<Description> metatag to write the description of your
site. Besides the fact that metatags still rock on Bing and
Yahoo!, the <Description> metatag has one more advantage:
it sometimes pops up in the description of your site in search
results.



<Keywords> metatag

The <Keywords> metatag also matters, though
like all metatags it gets almost no attention from Google and some
attention from Bing and Yahoo!. Keep the metatag reasonably long –
10 to 20 keywords at most. Don’t stuff the <Keywords> tag
with keywords that you don’t have on the page; this is bad for
your rankings.



<Language> metatag

If your site is language-specific, don’t leave this
tag empty. Search engines have more sophisticated ways of
determining the language of a page than relying on the
<language> metatag, but they still consider it.



<Refresh> metatag

The <Refresh> metatag is one way to redirect
visitors from your site to another. Only do it if you have
recently migrated your site to a new domain and you need to
temporarily redirect visitors. When used for a long time, the
<refresh> metatag is regarded as an unethical practice and
can hurt your rankings. In any case, redirecting through a 301
is much better.




Content

Unique content

Having more content (relevant content, which is
different from the content on other sites both in wording and
topics) is a real boost for your site’s rankings.



Frequency of content change

Frequent changes are favored. It is great when you
constantly add new content but it is not so great when you only
make small updates to existing content.



Keywords font size

When a keyword in the document text is in a larger
font size in comparison with the other on-page text, this makes it
more noticeable and therefore more important than the rest of
the text. The same applies to headings (<h1>, <h2>,
etc.), which are generally in a larger font size than the rest of
the text.



Keywords formatting

Bold and italic are another way to emphasize
important words and phrases. However, use bold, italic and larger
font sizes within reason because otherwise you might achieve just
the opposite effect.



Age of document

Recent documents (or at least regularly updated
ones) are favored.



File size

Generally, long pages are not favored; at the least,
you can achieve better rankings with three short pages rather than
one long page on a given topic, so split long pages into multiple
smaller ones.



Content separation

From a marketing point of view, content separation
(based on IP, browser type, etc.) might be great, but for SEO it
is bad, because when you have one URL and differing content,
search engines get confused about what the actual content of the
page is.


Poor coding and design

Search engines say that they do not want poorly
designed and coded sites. There are hardly any sites that are
banned because of messy code or ugly images, but when the design
and/or coding of a site is poor, the site might not be indexable
at all, so in this sense poor code and design can harm you a lot.



Illegal Content

Using other people’s copyrighted content without
their permission, or using content that promotes violations of the
law, can get you kicked out of search engines.



Invisible text

This is a black hat SEO practice and when spiders
discover that you have text specially for them but not for
humans, don’t be surprised by the penalty.




Cloaking

Cloaking is another illegal technique, which
partially involves content separation, because spiders see one
page (highly optimized, of course), and everybody else is
presented with another version of the same page.



Doorway pages

Creating pages that aim to trick spiders into believing that your
site is highly relevant, when it is not, is another way to
get the kick from search engines.



Duplicate content

When you have the same content on several pages of
the site, this will not make your site look larger, because the
duplicate content penalty kicks in. To a lesser degree, duplicate
content applies to pages that reside on other sites, but obviously
these cases are not always banned – i.e. article
directories or mirror sites do exist and prosper.


Visual Extras and SEO



JavaScript

If used wisely, it will not hurt. But if your main
content is displayed through JavaScript, this makes it more
difficult for spiders to follow, and if the JavaScript code is a
mess and spiders can’t follow it, this will definitely hurt your
rankings.


Images in text

Having a text-only site is boring, but having
many images and no text is an SEO sin. Always provide in the <alt>
tag a meaningful description of an image, but don’t stuff it with
keywords or irrelevant information.



Podcasts and videos

Podcasts and videos are becoming more and more
popular, but as with all non-textual goodies, search engines can’t
read them, so if you don’t have a transcript of the podcast or
the video, it is as if the podcast or movie were not there, because
it will not be indexed by search engines.



Images instead of text links

Using images instead of text links is bad,
especially when you don’t fill in the <alt> tag. But even
if you fill in the <alt> tag, it is not the same as having
a bold, underlined, 16-pt. link, so use images for navigation
only if this is really vital for the graphic layout of your site.




Frames

Frames are very, very bad for SEO. Avoid using them
unless really necessary.




Flash

Spiders don’t index the content of Flash movies, so
if you use Flash on your site, don’t forget to give it an
alternative textual description.



A Flash home page

Fortunately, this epidemic seems to have
come to an end. Having a Flash home page (and sometimes whole
sections of your site) with no HTML version is SEO suicide.


Domains, URLs, Web Mastery


URLs and filenames

A very important factor, especially for Yahoo! and
Bing.


Site Accessibility

Another fundamental issue that is often
neglected. If the site (or separate pages) is inaccessible
because of broken links, 404 errors, password-protected areas, or
other similar reasons, then the site simply can’t be indexed.




Sitemap

It is great to have a complete and up-to-date
sitemap; spiders love it, no matter if it is a plain old HTML
sitemap or the special Google sitemap format.



Site size

Spiders love large sites, so generally it is the
bigger, the better. However, big sites become user-unfriendly and
difficult to navigate, so sometimes it makes sense to separate a
big site into a couple of smaller ones. On the other hand, there
are hardly any sites that are penalized for having 10,000+
pages, so don’t split your site into pieces only because it is
getting larger and larger.



Site age

Similarly to wine, older
sites are respected more. The idea is that an old,
established site is more trustworthy (it has been around and
is here to stay) than a new site that has just popped up and
might soon disappear.



Site theme

It is not only keywords in URLs and on page that
matter. The site theme is even more important for good ranking
because when the site fits into one theme, this boosts the
rankings of all its pages that are related to this theme.



File Location on Site

File location is important and files that are
located in the root directory or near it tend to rank better than
files that are buried 5 or more levels below.



Domains versus subdomains, separate domains

Having a separate domain is better – i.e.
instead of having blablabla.blogspot.com, register a separate
blablabla.com domain.



Top-level domains (TLDs)

Not all TLDs are equal. There are TLDs that are
better than others. For instance, the most popular TLD –
.com – is much better than .ws, .biz, or .info domains, but
(all else being equal) nothing beats an old .edu or .org domain.



Hyphens in URLs

Hyphens between the words in a URL increase
readability and help with SEO rankings. This applies both to
hyphens in domain names and in the rest of the URL.



URL length

Generally it doesn’t matter, but a very long
URL starts to look spammy, so avoid having more than 10
words in the URL (3 or 4 for the domain name itself and 6 or 7
for the rest of the address is acceptable).



IP address

Could matter only for shared hosting or when a site
is hosted with a free hosting provider, when the IP or the whole
C-class of IP addresses is blacklisted due to spamming or other
illegal practices.



Adsense will boost your ranking

Adsense is not related in any way to SEO ranking.
Google will definitely not give you a ranking bonus because of
hosting Adsense ads. Adsense might boost your income but this has
nothing to do with your search rankings.



Adwords will boost your ranking

Similarly to Adsense, Adwords has nothing to do
with your search rankings. Adwords will bring more traffic to
your site, but this will not affect your rankings in any way
whatsoever.


Hosting downtime

Hosting downtime is directly related to accessibility because if a
site is frequently down, it can’t be indexed. But in practice
this is a factor only if your hosting provider is really
unreliable and has less than 97-98% uptime.



Dynamic URLs

Spiders prefer static URLs, though you will see
many dynamic pages in top positions. Long dynamic URLs (over 100
characters) are really bad, and in any case you’d better use a
tool to rewrite dynamic URLs into something more human- and
SEO-friendly.



Session IDs

This is even worse than dynamic URLs. Don’t use
session IDs for information that you’d like to be indexed by
search engines.



Bans in robots.txt

If indexing of a considerable portion of the site
is banned, this is likely to affect the non-banned part as well,
because spiders will come less frequently to a “noindex”
site.



Redirects (301 and 302)

When not applied properly, redirects
can hurt a lot – the target page might not open, or worse –
a redirect can be regarded as a black hat technique, when the
visitor is immediately taken to a different page.


MVC-Based Web Application vs. Web Forms-Based Web Application

Advantages of an MVC-Based Web Application

The ASP.NET MVC framework offers the following advantages:

  • It makes it easier to manage complexity by dividing an application into the model, the view, and the controller.
  • It does not use view state or server-based forms. This makes the MVC framework ideal for developers who want full control over the behavior of an application.
  • It uses a Front Controller pattern that processes Web application requests through a single controller. This enables you to design an application that supports a rich routing infrastructure. For more information, see Front Controller on the MSDN Web site.
  • It provides better support for test-driven development (TDD).
  • It works well for Web applications that are supported by large teams of developers and Web designers who need a high degree of control over the application behavior.
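The Front Controller idea in the third bullet can be sketched in a few lines. This is an illustrative Python analogy, not ASP.NET MVC’s actual routing API: a single dispatcher owns a route table and forwards every request, instead of each page handling its own (the Page Controller style):

```python
# Route table mapping URL paths to handler functions.
ROUTES = {}

def route(path):
    # Decorator that registers a handler for a path.
    def register(handler):
        ROUTES[path] = handler
        return handler
    return register

@route("/products")
def list_products(request):
    return "all products"

@route("/products/detail")
def product_detail(request):
    return "one product"

def front_controller(request):
    # Single entry point for the whole application: every request
    # passes through here and is dispatched to the matching handler.
    handler = ROUTES.get(request["path"])
    if handler is None:
        return "404"
    return handler(request)
```

Because every request funnels through one function, cross-cutting concerns (authentication, logging, rich routing rules) live in one place rather than on every page.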

Advantages of a Web Forms-Based Web Application

The Web Forms-based framework offers the following advantages:

  • It supports an event model that preserves state over HTTP, which benefits line-of-business Web application development. The Web Forms-based application provides dozens of events that are supported in hundreds of server controls.
  • It uses a Page Controller pattern that adds functionality to individual pages. For more information, see Page Controller on the MSDN Web site.
  • It uses view state or server-based forms, which can make managing state information easier.
  • It works well for small teams of Web developers and designers who want to take advantage of the large number of components available for rapid application development.
  • In general, it is less complex for application development, because the components (the Page class, controls, and so on) are tightly integrated and usually require less code than the MVC model.

Guid Data Type in MS Access

Many of us have faced this problem: SQL Server has a Guid data type (uniqueidentifier), but when we use Access as our database there is no data type with that name, and we are left wondering.
There is a trick for this; it is available, but unless we have tried it we would not know how to get a Guid into Access.
To get a Guid data type, what you need to do is as follows:
1) Add a field in the table’s design view
2) Set the field’s data type to Number
3) In the field properties shown below, set Field Size to “Replication ID”
Now you have a Guid in your Access database.

Import Data from Excel Sheet to SQL Server

To get data from an Excel sheet into SQL Server, we can use several SQL Server features.
Before we import data from an Excel sheet, we have to enable ad hoc distributed queries with the queries below:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

The query below will return the data from your Excel file’s Sheet1.
Some of the attributes we have used: “Microsoft.Jet.OLEDB.4.0” is the OLE DB provider name, and “Excel 8.0” specifies the Excel file format.

In Database we have specified our file path.

HDR specifies whether the first row of the Excel sheet is treated as a header row. If we set it to NO, the OLE DB provider assigns the column names F1 to Fn. The same thing happens if we do not include HDR in the connection string at all.

In an Excel sheet it is possible that a column contains mixed data types. By default, the OLE DB provider samples the column’s values and infers a data type for the column from them.

If a column is judged numeric, the provider will omit any other kind of value in that column and return NULL for those values. To overcome this problem we can use the IMEX attribute, which brings in all data regardless of its type.

If we set IMEX, we get all the data exactly as we have it in the Excel sheet.

SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\testImport.xls;HDR=No;IMEX=1',
    'SELECT * FROM [Sheet1$]')
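The HDR/IMEX behavior described above can be simulated to make it concrete. The following is a hypothetical Python sketch of how the provider treats a mixed-type column (a simplified model of the described behavior, not the Jet provider itself):

```python
def import_column(values, imex=False):
    # Sketch of the provider's handling of a mixed-type column:
    # without IMEX, the sampled majority type wins and values of any
    # other type come back as NULL (None here); with IMEX=1,
    # everything is imported as text.
    if imex:
        return [str(v) for v in values]
    numeric = [v for v in values if isinstance(v, (int, float))]
    majority_numeric = len(numeric) >= len(values) - len(numeric)
    if majority_numeric:
        return [v if isinstance(v, (int, float)) else None for v in values]
    return [v if isinstance(v, str) else None for v in values]
```

So a mostly numeric column containing one stray text cell loses that cell unless IMEX=1 is set, which is exactly why the attribute appears in the connection string above.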