Showing posts with label Tags. Show all posts

Friday, March 21, 2008

How to Improve Search Engine Ranking in India?

Article on how to get top search engine ranking

Search engine optimization is about building your site so that it ranks higher in the search engines. A website that's optimized for search engines can bring major benefits to your site and your business. A Forrester Research report found that 80% of web surfers discover the new sites they visit through search engines. When you submit your site to the ODP, include your most important keywords in both the title and the description.

There are several areas of on-page optimization for the Google search engine:

Title: Title tags are a very important guide for all search engines in determining the content of a specific web page. You will find the title tag in the head section of the source code, along with your meta description tag and meta keywords tag. We feel that title tags should span no fewer than six words and no more than twelve, which works out to roughly fifty to eighty characters, including spaces, hyphens, commas, etc.
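Such a title tag sits in the head section alongside the meta tags; a minimal sketch (the page name and wording here are illustrative, not a real site's markup):

```html
<head>
  <!-- Title: six to twelve words, roughly fifty to eighty characters -->
  <title>Search Engine Optimization Services - SEO Company in India</title>
  <!-- Meta description and keywords tags live in the same head section -->
  <meta name="description" content="Professional SEO services to improve your search engine ranking.">
  <meta name="keywords" content="seo, search engine optimization, search engine ranking">
</head>
```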

Keyword Density: Keyword density is the ratio of the word being searched for, i.e. the keyword, to the total number of words on a given web page. For example, a keyword that appears ten times on a 500-word page has a density of 2%. There are a few techniques you can choose from that have been carefully researched by experts in the SEO industry.

Link Popularity: Link popularity refers to the number of hyperlinks pointing to a web page. The higher a web page's link popularity rating, the better, as most major search engines consider link popularity an important factor in determining a web page's relevancy in their search rankings.

Blogging: Weblogs can be used to provide current, up-to-date content on an existing website, or can serve as websites in their own right. Most importantly, blogs allow for easy visitor interaction, whereas ordinary websites generally don't.

Wednesday, February 27, 2008

Google’s Tag To Remove Content Spamming

Content spamming, in its simplest form, is the taking of content from other sites that rank well on the search engines, and then either using it as-is or running it through utility software like Articlebot to scramble the content to the point that it can't be detected with plagiarism software. In either case, your good, search-engine-friendly content is stolen and used, often as part of a doorway page, to draw the attention of the search engines away from you.

Everyone has seen examples of this: the page that looks promising but contains lists of terms (like term term paper term papers term limits) that link to other similar lists, each carrying Google advertising. Or the site that contains nothing but content licensed from Wikipedia. Or the site that plays well in a search but contains nothing more than SEO gibberish, often ripped off from the site of an expert and minced into word slaw.

These sites are created en masse to provide fertile ground for drawing eyeballs. It seems a waste of time when you receive a penny a view for even the best-paying ads, but when you put up five hundred sites at a time, and you've figured out how to get all of them to show up on the first page or two of a lucrative Google search term, it can be surprisingly profitable.

The losers are the people who click on these pages expecting content of worth, and you: your places in the top ten are stolen by these spammers. Google is working hard to lock them out, but there is more you can do to help.

Using The Antispam Tag

But there is another loser. One of the strengths of the Internet is that it allows for two-way public communication on a scale never seen before. You post a blog, or set up a wiki; your audience comments on your blog, or adds and changes your wiki.

The problem? While you have complete control over a website and its contents in the normal way of things, sites that allow for user communication remove this complete control from you and give it to your readers. There is no way to prevent readers of an open blog from posting unwanted links, except for manually removing them. Even then, links can be hidden in commas or periods, making it nearly impossible to catch everything.

This leaves you open to the accusation of link spam for links you never put there to begin with. And while you may police the most recent posts on your blog, no one polices the ones from several years ago. Yet Google still looks at them and indexes them. By 2002, bloggers everywhere were begging Google for an ignore tag of some sort to prevent its spiders from indexing comment areas.

Not only, they said, would bloggers be grateful; everyone with two-way, uncontrolled communication (wikis, forums, guest books) needed this service from Google. Each of these types of sites has been inundated with spam at some point, forcing some to shut down completely. And Google itself needed it, to help curb the rampant spam in the industry.

In 2005, Google finally responded to these concerns. Though their solution is not everything the online community wanted (for instance, it leads to potentially good content being ignored along with the spam), it does at least allow you to section out the parts of your blog that are public. It is the “nofollow” attribute.

“Nofollow” allows you to mark a portion of your web page, whether you're running a blog or sectioning out paid advertising, as an area that Google spiders should ignore. The great thing about it is that it not only keeps your rankings from suffering from spam, it also discourages spammers from cluttering your valuable comments section with their junk text.

The most basic use of this attribute involves embedding it in a hyperlink. This allows you to manually flag links, such as those embedded in paid advertising, as links Google spiders should ignore. But what if the content is user-generated? That's still a problem, because you certainly don't have time to go through and mark up all those links yourself.
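In markup, the attribute is added to a link's rel value; a minimal sketch (the URLs and anchor text are placeholders):

```html
<!-- A paid or user-submitted link flagged so search engine spiders ignore it -->
<a href="http://example.com/advertiser" rel="nofollow">Advertiser link</a>

<!-- An ordinary, editorially given link, which spiders do follow -->
<a href="http://example.com/partner">Partner link</a>
```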

Fortunately, blogging systems have been responsive to this development. Whether you use WordPress or another blogging system, most have either implemented automated “nofollow” links in their comment sections or issued plugins you can install yourself to prevent this sort of spamming.

This does not solve every problem. But it’s a great start. Be certain you know how your user-generated content system provides this service to you. In most cases, a software update will implement this change for you.

Is This Spamming And Will Google Block Me?

There's another problem with the spamming crowd. When you're fighting search engine spam and start seeing the different forms it can take (and, disturbingly, realizing that some of the techniques used on your own legitimate site are similar), you have to wonder: will Google block me for my search engine optimization techniques?

This happened recently to BMW's corporate site. Their webmaster, dissatisfied with the site's position when web users searched for several terms (such as “new car”), created and posted a gateway page: a page optimized with text that then redirects searchers to an often graphics-heavy page.

Google found it and, rightly or wrongly, promptly dropped their PageRank manually to zero. For weeks, searches for their site turned up plenty of spam and dozens of news stories, but to find their actual site, it was necessary to drop to the bottom of the search results, not easy to do in Googleworld.

This is why you really need to understand what Google counts as search engine spam, and adhere to their restrictions even if everyone else doesn't. Never create a gateway page, particularly one stuffed with spammy data. Instead, use legitimate techniques like image alternate text and actual text in your page. Look for ways to get other pages to point to your site: article submission, for instance, or directory submission. And keep your content fresh, always.
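Image alternate text, for instance, keeps a page's keywords readable to spiders without resorting to a gateway page; a minimal sketch (the file name and wording are illustrative):

```html
<!-- Descriptive alt text gives spiders real, indexable text for the image -->
<img src="new-car-showroom.jpg" alt="New car showroom at our dealership">
```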

While duplicated text is often a sign of serious spammage, the Google engineers realize two things: first, the original text is probably still out there somewhere, and it’s unfair to drop that person’s rankings along with those who stole it from them; and second, certain types of duplicated text, like articles or blog entries, are to be expected.

Their answer to the first issue is to credit the site first catalogued with a particular text as the creator, and to drop sites obviously spammed from it down in rank. The other issue is addressed by looking at the other data around the questionable content; if the entire site appears to be spammed, it, too, is dropped. Provided you are not duplicating text across many websites to fraudulently increase your ranking, you're safe. Ask yourself: are you using the same content on several sites registered to you in order to maximize your chances of being read? If the answer is yes, this is a bad idea and will be classified as spamdexing. If your content would not be useful to the average Internet surfer, it is also likely to be classed as spamdexing.

There is a very thin line between search engine optimization and spamdexing. You should become very familiar with it. Start by understanding hidden/invisible text, keyword stuffing, meta tag stuffing, gateway pages, and scraper sites.