Monday, April 5, 2010

AdWords Search Funnels


In order to help you make more informed decisions about your AdWords keywords, ad groups, and campaigns, Google has released a new set of reports for your AdWords account: AdWords Search Funnels (beta). Currently, conversions in AdWords are attributed to the last ad someone clicks before converting, masking the fact that many customers perform multiple searches before finally converting. AdWords Search Funnels help you see the full picture by giving you insight into the ads your customers interact with during their shopping process.


What are AdWords Search Funnels?

AdWords Search Funnels are a set of reports describing the ad click and impression behavior on Google.com that leads up to a conversion.

Wednesday, March 31, 2010

Google penalty for Over Optimization

The SEO industry has been plagued for years by a lack of consistency with SEO terms and definitions. One of the most prevalent inaccurate terms we hear is “duplicate content penalty.” While duplicate content is not something you should strive for on your website, there’s no search engine penalty for having it.

Duplicate content has been and always will be a natural part of the Web. It’s nothing to be afraid of. If your site has some dupe content for whatever reason, you don’t have to lose sleep every night worrying about the wrath of the Google gods. They’re not going to shoot lightning bolts at your site from the sky, nor are they going to banish your entire website from ever showing up for relevant searches.

They are simply going to filter out the dupes.

The search engines want to index and show to their users (the searchers) as much unique content as algorithmically possible. That’s their job, and they do it quite well considering what they have to work with: spammers using invisible or irrelevant content, technically challenged websites that crawlers can’t easily find, copycat scraper sites that exist only to obtain AdSense clicks, and a whole host of other such nonsense.

There’s no doubt that duplicate content is a problem for search engines. If a searcher is looking for a particular type of product or service and is presented with pages and pages of results that provide the same basic information, then the engine has failed to do its job properly. In order to supply users with a variety of information on their search query, search engines have created duplicate content “filters” (not penalties) that attempt to weed out the information they already know about. Certainly, if your page is one of those that is filtered, it may very well feel like a penalty to you, but it’s not – it’s a filter.

Penalties Are for Spammers

Search engine penalties are reserved for pages and sites that are purposely trying to trick the search engines in one form or another. Penalties can be meted out algorithmically when obvious deceptions exist on a page, or they can be personally handed out by a search engineer who discovers the hanky-panky through spam reports and other means. To many people’s surprise, penalties rarely happen to the average website. Sites that receive a true penalty typically know exactly what they did to deserve it. If they don’t, they haven’t been paying attention.

Honestly, the search engines are not out to get you. If you have a page on your site that sells red hats and another very similar page selling blue hats, you aren’t going to find your site banished off the face of Google. The worst thing that will happen is that only the red hat page may show up in the search results instead of both pages showing up. If you need both to show up in the search engines, then you’ll need to make them substantially unique.

Suffice it to say that just about any content that is easily created without much human intervention (i.e., automated) is not a great candidate for organic SEO purposes.

Article Reprints

Another duplicate-content issue that many are concerned about is the republishing of online articles. Reprinting someone’s article on your site is not going to cause a penalty. While you probably don’t want every article on your site to be a reprint of someone else’s, if the reprints are helpful to your site visitors and your overall mission, then it’s not a problem for the search engines.

If your own bylined articles are getting published elsewhere, that’s a good thing. You don’t need to provide a different version to other sites or not allow them to be republished at all. The more sites that host your article, the more chances you have to build your credibility as well as to gain links back to your site through a short bio at the end of the article. In many cases, Google doesn’t even filter out duplicate articles in searches, but even if they eventually show only one version, it’s still okay.

Inadvertent Multiple URLs for the Same Content

Where duplicate content CAN be a problem is when a website shows essentially the same page, but on numerous URLs. WordPress blogs often fall victim to this when multiple tags or categories are chosen to label any one blog post. The blog software then creates numerous URLs for the same article, depending on which category or tag a user clicked to view it. While this type of duplicate content won’t cause a search engine penalty, it will often split the overall link popularity of the article, which is not recommended.

Any backend system or CMS that creates numerous URLs for any one piece of content can indeed be a problem for search engines, because it makes their spiders do more work. It's silly to have the spider finding the same information over and over again, when you'd rather have it finding other, unique information to index. This type of unintended duplicate content should definitely be cleaned up either through 301 redirects or by using the canonical link element (rel=canonical).
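As a rough illustration of verifying the fix, here is a minimal Python sketch (using the third-party requests and BeautifulSoup libraries; the URLs are hypothetical examples) that checks whether several URL variants of the same article all declare one canonical URL:

# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

# Hypothetical example: several WordPress URLs that serve the same post.
urls = [
    "http://example.com/2010/03/my-post/",
    "http://example.com/category/seo/my-post/",
    "http://example.com/tag/hats/my-post/",
]

def get_canonical(url):
    # Fetch the page and return the href of its rel=canonical link, if any.
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link else None

canonicals = {url: get_canonical(url) for url in urls}
for url, canonical in canonicals.items():
    print(url, "->", canonical)

# If every variant points at one canonical URL, link popularity is consolidated.
if None in canonicals.values() or len(set(canonicals.values())) != 1:
    print("Variants disagree or lack rel=canonical; consider cleaning this up.")

If the variants disagree, a 301 redirect from each extra URL to the preferred one accomplishes the same consolidation at the server level.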

When it comes to duplicate content, the search engines are not penalizing you or thinking that you’re a spammer; they’re simply trying to show some variety in their search results pages and don’t want to waste time indexing content they already have in their databases.

Wednesday, March 17, 2010

Quick Tips on SEO

1. Meta title and description tags are very important for search relevancy and accurate site indexing. Clear, concise, and informative meta tags produce higher rankings in the SERPs.

2. A page with an accurate and effective meta title can often rank higher than your primary landing page.

3. Your title tag must contain the keywords or search phrases you want to rank for. Be sure it is the proper length or it will be truncated in the results (a quick length check is included in the sketch after this list).

4. Your description tag offers you a chance to clearly and concisely explain what you offer and why people should visit your site. Including a quantifiable benefit or selling point will help increase click-through. Higher click-through will result in more traffic. More traffic will result in more leads or sales.

5. Optimize and update website content on a regular basis- Content should be relevant to a site's theme, products, or services. It's also important that this content is clearly written, concise, and uses proper grammar. After all, your website is a direct reflection of your online image. Keep in mind that content contained in graphic images cannot be read by search engine spiders.

6. Analyze and improve your site/page download times- While it's not yet clear how much emphasis Google or other search engines are placing on site download times, what is clear is the following: slow-loading websites frustrate visitors. Frustrated visitors look elsewhere for their needs. They will click back off your website and visit your competition, resulting in poor conversion rates and lost leads. Keep in mind that many searches are now being done on iPhones, and many iPhone connections tend to be similar to 56K dial-up services. Testing your site's load time over a slow connection is highly recommended (a simple timing check is included in the sketch after this list).

7. Provide fresh and updated website content- Fresh and updated content reflects a site that's not stagnant. Your visitors and search engine crawlers take notice of this activity. Your audience's search trends change, so it's important to monitor them and change your content to reflect this.

8. Include only quality outbound links in your website- There are high-quality and low-quality outbound links. High-quality outbound links are links that are relevant to your website's core subject or service. Avoid link-exchange offers (cross-linking) that don't relate at all to your business or service. Also avoid paying for multiple inbound links that are submitted to obscure sources around the globe. Indications are that these practices can result in your site being penalized by search engines. Your website should contain high-quality outbound links that educate or persuade your visitors. This will increase their time on site and lead to more conversions.

9. Create quality inbound links- Avoid 'link farming' at all costs. This practice can result in your website being penalized. The most effective and highest-rated links are those that are created organically through articles, social media platforms, press releases, and human-edited, industry-related news or resource sites.
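To make tips 3, 4, and 6 concrete, here is a minimal Python sketch (using the third-party requests and BeautifulSoup libraries; the URL and the length limits are illustrative assumptions, since the engines don't publish exact cut-offs) that checks a page's title length, description length, and load time:

# Requires: pip install requests beautifulsoup4
import time
import requests
from bs4 import BeautifulSoup

URL = "http://example.com/"   # hypothetical page to check
TITLE_LIMIT = 65              # rough rule of thumb before SERP truncation
DESCRIPTION_LIMIT = 155       # rough rule of thumb before SERP truncation

start = time.time()
response = requests.get(URL, timeout=10)
load_seconds = time.time() - start

soup = BeautifulSoup(response.text, "html.parser")
title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = meta["content"].strip() if meta and meta.get("content") else ""

print(f"Load time: {load_seconds:.2f}s")
print(f"Title ({len(title)} chars): {title}")
if len(title) > TITLE_LIMIT:
    print("  Warning: title may be truncated in search results.")
print(f"Description ({len(description)} chars): {description}")
if not description:
    print("  Warning: no meta description found.")
elif len(description) > DESCRIPTION_LIMIT:
    print("  Warning: description may be truncated in search results.")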

Saturday, January 23, 2010

Happy news on Sitemap for Webmasters

Everyone is familiar with the fact that Sitemaps are a way to tell Google about pages on your site that it might not otherwise discover. They're a valuable addition to your site, but they do not control how many pages, or which pages, will be indexed.

When submitting sitemaps to Google via Webmaster Tools, only 1,000 URLs used to be accepted, but now it's possible to submit around 50,000. Great news, isn't it?

I'm not sure when this change actually happened; I got it from the Google Webmaster Central forum.
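For reference, a sitemap is just an XML file that follows the sitemaps.org protocol, with a limit of 50,000 URLs per file. Here is a minimal Python sketch (the URLs are hypothetical examples) that generates one using only the standard library:

# Minimal sitemap generator following the sitemaps.org protocol.
import xml.etree.ElementTree as ET

# Hypothetical URLs; a real site would pull these from its CMS or database.
urls = [
    "http://example.com/",
    "http://example.com/products/",
    "http://example.com/contact/",
]

MAX_URLS_PER_SITEMAP = 50000  # protocol limit per sitemap file

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls[:MAX_URLS_PER_SITEMAP]:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote sitemap.xml with {min(len(urls), MAX_URLS_PER_SITEMAP)} URLs")

Sites with more URLs than the limit split them across several sitemap files and list those files in a sitemap index.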

Tuesday, January 12, 2010

Google Caffeine Algorithm

I got some clues about the "Google Caffeine" algorithm update. It will probably be a major overhaul of the calculations that Google uses to rank web pages. Google hasn't revealed the details of Google Caffeine yet. Though Caffeine has been implemented on some test servers, this could be a bigger change than any other Google algorithm update. The following factors might be included in the Caffeine algorithm:

Website speed: if you have a slow-loading website, it might not get high rankings on Google.
Broken links: if your website contains many broken links, this might have a negative impact on the position of your web pages in Google's search results (a simple checker is sketched after this list).
Bad neighborhoods: linking to known spammers, and getting a lot of links from known spammers, isn't good for your rankings in Google's current algorithm. The negative impact of a bad neighborhood will probably be even worse with Google Caffeine.
The overall quality of your website: Google's new algorithm will probably take a closer look at the overall quality of your website. It's not enough to have one or two ranking factors in place.
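As a rough illustration of auditing the broken-links factor, here is a minimal Python sketch (using the third-party requests and BeautifulSoup libraries; the page URL is a hypothetical example) that reports dead links on a single page:

# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE = "http://example.com/"  # hypothetical page to audit

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect absolute URLs from every anchor tag on the page.
links = {urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if not link.startswith("http"):
        continue  # skip mailto:, javascript:, and similar schemes
    try:
        # HEAD keeps the check lightweight; some servers only answer GET.
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"BROKEN ({status}): {link}")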

You'll probably need well-optimized content, a good website design with clear navigation, good inbound links, a low bounce rate, etc. The number of social bookmarks might also play an increased role. Factors like the age of a website, its history, its authority, etc. will still play a role in Google's new index. However, the effect of the different factors on your rankings will shift.

Saturday, January 2, 2010

Directories list

These links contain large lists of directories that SEOs can make use of:

http://www.strongestlinks.com/directories.php?sortcolumn=PR
http://info.vilesilencer.com/top/order_by/google_pr/dn/

I will update this list with more as I find them.


Tuesday, December 22, 2009

SEO and File Extensions

One of the most common questions among SEOs is: from an SEO point of view, is it important what kind of extension a page has (html, asp, cfm, php, ...)? Can a specific extension make your website's pages more visible on search engines or not?

Many of us are familiar with the advice that you shouldn't build websites entirely in Flash or JavaScript, because Google can't read them. Such a site may look really cool, but Google will never find it.
I got an explanation from an expert, which follows.

The extension is irrelevant to Google, but it is very relevant to the workings of your site. HTML is what Google actually sees and reads, and PHP, ASP, and CFM are all made to output HTML. The real difference lies in the technology. ASP is a scripting language that is also unstable and not secure. CFM, or ColdFusion, is a Windoze application, so it runs as a web server application on top of Windoze. It can also be a bit expensive to license, depending on the demands of your site. PHP is free. It can run on Windoze, but why would you? It actually runs as an extension of Apache. In that environment, it is very stable and very fast. A LAMP (Linux/Apache/MySQL/PHP) installation is a common platform to use, but I prefer to run Apache, MySQL, and PHP on an OpenBSD server. OpenBSD is by far the most stable and secure platform around. Linux will be fine for many applications.

Saturday, August 29, 2009

Google Caffeine

For a couple of months, a team at Google was working on a project named "Google Caffeine." The main aim of Google Caffeine is to make Google's search results more accurate and to improve indexing speed, comprehensiveness, and many other things.

A few days ago, Google launched Google Caffeine, a new approach to search. It is a new search technology that Google is discussing with developers, selecting their contributions to implement in the new Google. Though there are no major changes in the rankings, we may see minor fluctuations. Google gave the name "Caffeine" to this new technology. Let's wait and watch what it means for the SEO folks!




Thursday, August 27, 2009

Site visibility depends on HTML validation

HTML validation is the process of checking a document for compliance with web standards and searching for bugs in the HTML code; a document is considered to pass validation when it has been checked for errors and the validator has no remarks about the code.

At the end of the HTML validation process you will have a page without visible errors, and your website will appear nice in any browser. This should be done at the SEO stage due to several reasons (a programmatic check is sketched after this list):

  • it will add to your site's visibility for search engines; a search engine will rarely place your site in the top 10 if there are many errors in it
  • it will allow your portal to be properly laid out in almost any browser
  • validation optimizes the code, so the pages load faster
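If you want to run validation programmatically rather than pasting URLs into the W3C validator by hand, here is a minimal Python sketch using the requests library; it assumes the W3C Nu HTML Checker's public endpoint and its JSON output format, so treat those details as assumptions to verify rather than a stable API:

# Requires: pip install requests
# Assumes the public W3C Nu HTML Checker endpoint and its JSON response format.
import requests

html = "<!DOCTYPE html><html><head><title>Test</title></head><body><p>Hi</p></body></html>"

response = requests.post(
    "https://validator.w3.org/nu/?out=json",
    data=html.encode("utf-8"),
    headers={"Content-Type": "text/html; charset=utf-8"},
    timeout=30,
)
for message in response.json().get("messages", []):
    print(message.get("type"), ":", message.get("message"))

An empty messages list means the validator had no remarks about the document.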

There are many advantages to doing page validation: pages get proper structure, becoming web-standardized documents, and for large sites this will simplify editing and creating new pages. But please remember that users are mostly interested in the content rather than the code changes.

So pay attention to the content, making it popular and interesting, but do not forget about the code validation and optimization, and about the marketing features that bring your website more popularity.

Visit http://blog.cogniance.com/?p=138 for more info.