Wednesday 28 November 2012

Tips for Setting Up Effective PPC Ad Campaigns

Use The Keyword in the Headline as Much as You Can

This is pretty self-explanatory. Every time you use a keyword in your ad, be it in the headline or anywhere else, it will be bolded by the system (see the example below). That bolding makes the ad much easier for people to notice, and…
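For instance, if someone searches for "blue widgets", a hypothetical ad might render like this, with the matched keyword shown in bold wherever it appears (asterisks stand in for bolding here, purely for illustration):

*Blue Widgets* On Sale
Huge range of cheap *blue widgets* in stock.
Free delivery - order online today!
www.example.com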


Increase Multi-Word Keyword Portfolio
The success of your PPC campaign is all about keywords. It only takes a small amount of time each day to brainstorm keywords that are relevant to your product or service and expand your keyword portfolio. The more relevant keywords you have, the better your chances of converting visitors into sales.

Multi-word keyword phrases attract highly targeted traffic to your website and tend to rank higher than single- and double-word phrases. Coming up with as many multi-word keyword phrases as you can will increase your sales conversion ratio, since visitors searching for them are looking for that specific information. For example, "blue shoe laces Nike sneakers" is a multi-word keyword phrase, commonly referred to as a long-tail keyword, and it tells you exactly what the customer is looking for.

Monitor Your Competitors
It is smart business to keep up with what your competitors are doing with their PPC advertising campaigns. Changes in your competitors' strategies can affect keyword prices as well as your ad positioning in the keyword results. Regular competitive analysis will help you stay ahead of the game.

Include an offer in the copy. "Free," "Save $XX," or "XX% Savings" will usually lift response. You can also mention a gift or bonus.

Relevancy! Relevancy! Relevancy! In case you missed the common theme throughout: Relevancy is the most important element of PPC search engine marketing. If you ignore relevancy, you will likely be frustrated with low click-through rates and an unprofitable campaign.

Sunday 25 November 2012

Negative On-Page SEO Techniques to Avoid

·         Avoid using "hidden" or invisible text on your page for the purpose of higher search engine placement. For example, the text for the search phrase "Widget" is in the HTML, but its font color has been set to white and the background of the page is also white. The textual content is actually there, but the words are "hidden" from the surfer (see the sketch after this list). This is frowned upon by search engines and frequently results in your site being penalized.
·         Avoid negative <div> positioning. Div tags are division tags. Unscrupulous SEO services may insert them into your page with negative x/y coordinates so the content sits outside the visible page, while the text itself remains in the HTML (see the sketch after this list). The search engine finds the keywords in the text, yet the surfer never sees them. Again, a technique to be avoided and not recommended under any circumstances.
·         Avoid cloaking or sneaky redirects. Cloaking refers to serving up two different versions of content depending on who is visiting: a regular web surfer is served one page, while a search engine spider is served another page built specifically for it. The page served to the spider is typically garbled text with no meaning to a human, stuffed with various keywords and search phrases. This technique is not recommended and will likely get your site penalized or banned from search engines.
·         Avoid duplicate content. Duplicate content means you create one website with content on topic A and then repeat that content over and over again across multiple websites. In theory you could create one website, achieve a high ranking with it, and then clog up the search engines with the same content duplicated on multiple domains. Again, this is not recommended and should be avoided.
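To make the first two points concrete, here is a sketch of the kind of markup to avoid (the styles and keyword text are hypothetical, purely for illustration):

<!-- Hidden text: white text on a white page background -->
<body style="background-color: #ffffff;">
  <p style="color: #ffffff;">widgets cheap widgets buy widgets</p>

  <!-- Negative div positioning: negative x/y coordinates push the content off-screen -->
  <div style="position: absolute; left: -5000px; top: -5000px;">
    keyword-stuffed text here for the spider, never seen by the surfer
  </div>
</body>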

Friday 23 November 2012

SEO for Mobile Websites

Configure mobile sites so that they can be indexed accurately:

It seems the world is going mobile, with many people using mobile phones on a daily basis, and a large user base searching on Google’s mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn't easy. Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. While many mobile sites were designed with mobile viewing in mind, they weren’t designed to be search friendly.


Here are troubleshooting tips to help ensure that your site is properly crawled and indexed:

If your web site doesn't show up in the results of a Google mobile search even using the site: operator, it may be that your site has one or both of the following issues:

1. Googlebot may not be able to find your site
Googlebot must crawl your site before it can be included in our search index. If you just created the site, we may not yet be aware of it. If that's the case, create a Mobile Sitemap and submit it to Google to inform us of the site’s existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.
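For reference, a minimal Mobile Sitemap might look like the sketch below (example.com is a placeholder; the mobile namespace and the empty <mobile:mobile/> tag marking each URL as mobile content come from Google's mobile Sitemap schema):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <!-- one <url> entry per mobile page -->
    <loc>http://mobile.example.com/article.html</loc>
    <mobile:mobile/>
  </url>
</urlset>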


2. Googlebot may not be able to access your site
Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Our crawler for mobile sites is "Googlebot-Mobile". If you'd like your site crawled, please allow any User-agent including "Googlebot-Mobile" to access your site. You should also be aware that Google may change its User-agent information at any time without notice, so we don't recommend checking whether the User-agent exactly matches "Googlebot-Mobile" (the current User-agent). Instead, check whether the User-agent header contains the string "Googlebot-Mobile". You can also use DNS lookups to verify Googlebot.
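In robots.txt terms, the simplest way to grant the mobile crawler access is an empty Disallow rule; a minimal sketch:

User-agent: Googlebot-Mobile
Disallow: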


Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, we then check whether each URL is viewable on a mobile device. Pages we determine aren't viewable on a mobile phone won't be included in our mobile site index (although they may be included in the regular web index). This determination is based on a variety of factors, one of which is the "DTD (Doc Type Definition)" declaration. Check that your mobile-friendly URLs' DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML. If it's in a compatible format, the page is eligible for the mobile search index.
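For example, a page written in XHTML Mobile Profile would carry a DTD declaration like this at the top of the document:

<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
    "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">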

Wednesday 21 November 2012

Calculate the Return on Investment (ROI) for PPC Campaigns

Calculating ROI is one of the basic tenets of PPC, and yet many advertisers don’t consider it or even understand it.

A lot of advertisers perform campaign optimizations based solely on conversion rate or cost per conversion, choosing the ads and keywords with the best metric and calling it a day.

For ecommerce goal analysis, the method of calculation is similar. The biggest difference with ecommerce data is that it is real and accurate, compared to the estimated goal value of a lead form.
In AdWords, we can look at the money we spent to reach our goals in the same time frame. Compare that with the purchases made through CPC and we have a precise number we can confidently call ROI.

The key metrics to look at for each goal type:

Lead Generation:

Conversion rates (found in Google Analytics)
Conversions (found in Google Analytics)

Ecommerce:

ROI (found in Google Analytics and AdWords)
Revenue (found in Google Analytics)

I’ve always found that ROI is one of those terms that has been overused and abused by so many people that there is confusion over how best to calculate it. Personally, I like to use the following formula when discussing the ROI of any PPC campaign:

ROI = [Contribution] / [Cost]

So to calculate Contribution for a PPC campaign:
([Your average profit per sale] x [Estimated number of Conversions]) – [PPC Spend]

To demonstrate more fully, let’s take the following example:

Monthly PPC Spend: £1,500
Average Profit per Sale: £50
Number of Conversions (Sales) per Month: 75

and so the Contribution to Margin of the PPC campaign is:
(£50 x 75) – £1,500 = £2,250
and your ROI would be:
£2,250 / £1,500 = 150%
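Putting the two formulas together, the whole calculation in one line:

ROI = (([Average Profit per Sale] x [Conversions]) – [PPC Spend]) / [PPC Spend]
    = ((£50 x 75) – £1,500) / £1,500
    = £2,250 / £1,500
    = 150%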

Phew! But there is an easier way. We have just created an ROI estimator / calculator spreadsheet that you can now download for free. We hope that it will be a useful tool for you when reviewing your PPC campaigns.

Facebook Launches Conversion Measurement Tool

Facebook began rolling out a conversion measurement tool on Friday to help marketers bridge the data gap between social ads and online sales.

Facebook Inc (NASDAQ:FB) is rolling out a new, advanced tool that will enable online retailers and marketers to track online purchases by Facebook users who have viewed their ads, Reuters reports. This development is expected to help e-marketers.

Third parties such as social shopping app maker Glimpse have been offering solutions to specific aspects of the social commerce “problem” for some time, particularly the disparate data sets available to online retailers.

David Baser used the example of an online shoe retailer to demonstrate the function and outcomes of this tool. David believes that marketers can see the number of people who bought shoes, but the personal identification of purchasers will remain private.

Tuesday 20 November 2012

What is Off Page Optimization?

Off-page optimization is a method of Search Engine Optimization (SEO) in which you build links to your website so that it reaches the millions of people using the internet. This helps you achieve rankings in search results.

Some of the techniques used are:

Directory Submissions: Submitting your website to web directories, just as your phone number is listed in the telephone directory.

Social Bookmarking: Saving your links on social bookmarking sites like Diigo, Digg, Folkd, Delicious, Jumptags, Slashdot and many more. This is just like bookmarking a website in your browser's Bookmarks menu.

Article Posting/Submission: A way to market your content, that is, to tell the audience about your product or service.

Blog Creation/Posting/Submission: Same as articles.

Forum Posting: Forums are discussion boards where you can write about your website and attract an audience.

Search Engine Submission: The process of notifying search engines of the existence of your website content so that they include the site in their indices and search results.

Business Listing: A business listing allows you to write whatever is important about your domain name, website, blog or business that viewers won't find in the Whois. For instance, you could let people know that a domain is for sale, post business hours or list multiple domain names that your company uses.

RSS Feed Submission: RSS (Really Simple Syndication) feeds are used to publish updates to your web content, website, blogs and articles. An RSS feed is the best way to keep your web content updated and make it available to readers (see the feed sketch after this list).

Review Submission: Submitting reviews of your product or service to review sites.
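As an illustration of the feed format mentioned above, a minimal RSS 2.0 feed looks like this (example.com and all titles are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example SEO Blog</title>
    <link>http://www.example.com/</link>
    <description>Latest posts from the blog</description>
    <item>
      <!-- one <item> per post; readers and search engines pick up new items automatically -->
      <title>What is Off Page Optimization?</title>
      <link>http://www.example.com/off-page-optimization.html</link>
      <description>An introduction to off-page SEO techniques.</description>
    </item>
  </channel>
</rss>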

The main reason for doing off-page optimization is to attract traffic and visitors to your website.

Monday 19 November 2012

What is On Page Optimization?

On-page optimization refers to optimizing a website's pages through their content. The aim is to rank high in search engine results so that more traffic comes in. You guide people through your site by providing high-quality, relevant content. On-page SEO is one of the most effective ways to promote your business and its website online. It is quite an involved process, but a very important part of SEO.

Important Components Of On-Page SEO

1.    Proper post title (H1 tags)
2.    Alt tags (use proper alt tags with the targeted keyword)
3.    URL structure (one of the most important parts of SEO; should contain your main keyword)
4.    Meta description tag
5.    Keyword density
6.    XML sitemap
7.    Proper content (should be unique)
8.    Internal linking strategy (related articles should be properly interlinked within the website)
9.    Proper HTML and CSS validation
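Several of these components are visible directly in a page's HTML. Here is a minimal sketch (example.com, the keyword "blue widgets" and all file names are hypothetical):

<!DOCTYPE html>
<html>
<head>
  <!-- 1. proper post title carrying the main keyword -->
  <title>Blue Widgets | Example Store</title>
  <!-- 4. meta description tag -->
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
</head>
<body>
  <!-- 1. H1 tag matching the post title -->
  <h1>Blue Widgets</h1>
  <!-- 3. a keyword-bearing URL structure would be e.g. http://www.example.com/blue-widgets -->
  <!-- 2. descriptive alt tag with the targeted keyword -->
  <img src="blue-widget.jpg" alt="blue widget">
  <!-- 8. internal link to a related article -->
  <a href="/blue-widget-care-guide">How to care for blue widgets</a>
</body>
</html>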

What Is SEO?

SEO
        SEO (Search Engine Optimization) is the art of optimizing your website to be search engine friendly and to achieve a good search ranking for various keywords.

Types Of SEO
                       SEO is of two types:
  1. On Page SEO
  2. Off Page SEO

Friday 16 November 2012

Google's Latest Algorithm Update

This latest algo update being rolled out is predicted to impact around 3% of search queries; to put that into perspective, the original Panda algorithm was said to affect around 12% of all queries. However, we SEOs have learned to take Google's percentile predictions with a pinch of salt after Matt Cutts stated that the "(not provided)" keyword would account for less than 10% of all website traffic.




Before releasing any details on the algorithm update itself, Google kindly gave us some background information on how they feel about search engine optimisation. This is likely intended to counter the speculation that arises in some SEO circles whenever Google makes an announcement; the most recent example is the speculation that followed misreporting of the "over-optimization" penalty, which Matt Cutts discussed at SXSW. There was a suggestion that his speech was perhaps 'anti-SEO'; however, those who have read the transcript or listened to the talk in full will know that this couldn't be further from the truth.

In this latest blog, Google have left no room for debate as they emphatically state that "SEO can be positive and constructive", "search engine optimization can make a site more crawlable and make individual pages more accessible" and "'White hat' search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines." These are only a few examples of the positive endorsement that ethical, organic white-hat SEO received from Google in this blog. The problem Google have is with those who manipulate and game the system and rob search users of the quality experience they expect; I refer, of course, to the propagators of black-hat SEO. As mentioned, it is sites that use black-hat SEO tactics and violate the Webmaster Guidelines that will be hit by this algo update, in an attempt by Google to return higher-quality search results.

Google aren't able to reveal specifics about the changes to how they handle certain signals, as this would leave the door open to those wanting to game the system. However, from the examples given in the blog, it seems there is a real focus on on-page aspects of webspam such as keyword stuffing and excessive exact-match outbound links. SEO Consult will also be conducting a review in an attempt to identify other metrics that this algorithm update targets.

The second screenshot in the blog seems to indicate that this is another step by Google to clamp down on the blog networks favoured by spammers to acquire anchor-text rich links. It identifies a piece of blatantly spun content with three links semantically connected to the search query 'loans', which are completely irrelevant to the content in which they are placed. This is the kind of spam that would be found on low-quality blog networks such as BuildMyRank, which was recently de-indexed by Google.



As I alluded to in the second paragraph, Matt Cutts recently spoke about an "over-optimization" penalty that is expected to be rolled out imminently. We've cleared up that this wasn't a criticism of SEO in general, but again of those who abuse the system and lower the quality of the results returned to users. We don't think this announcement is directly linked to the over-optimisation penalty, but we expect to see that penalty released soon, most likely with a US launch followed by a global launch, similar to how Panda was launched.

While we haven't seen any dramatic changes in the SERPs just yet (and we're not expecting to see any change for clients), we will be closely monitoring the social networks and SEO blogs for a better understanding of the initial impact of this algorithm update. We have already seen numerous people complaining in the Google Webmaster Forums and in other blog comments about their site incurring a penalty. This seems to indicate that the update has already begun rolling out, but the full impact won't be known until later this week when the update is fully rolled out.

Wednesday 7 November 2012

How to Use Google Tag Manager


Google Tag Manager, also known as GTM, is a free container tag system from Google. A container tag helps you manage the different kinds of tags you may have on your site. These include web analytics tags, advertising conversion tags, general JavaScript, etc.

Users can add and update their own tags anytime. It’s not limited to Google-specific tags. It includes asynchronous tag loading, so “tags can fire faster without getting in each other’s way,” as Google puts it. It comes with tag templates for marketers to quickly add tags with Google’s interface, and supports custom tags. It also has error prevention tools like Preview Mode, a Debug Console, and Version History “to ensure new tags won’t break your site.”
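For reference, the container snippet that GTM asks you to paste just after the opening <body> tag looks roughly like this (GTM-XXXX is a placeholder container ID, and the exact snippet generated for your account may differ):

<!-- Google Tag Manager -->
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'//www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXX');</script>
<!-- End Google Tag Manager -->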

How to Create a robots.txt File for Your Website

Robots.txt file:

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do.

The robots.txt file tells search engines which directories to crawl and which not to. You can use it to block crawlers from looking at your image directory if you don't want your images showing up in Google search. Be careful not to use it to try to block people from directories you want to keep secret: anyone can view your robots.txt file. Make sure you password-protect directories that need to be secured.
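For instance, to keep all crawlers out of an image directory (the directory name /images/ is a placeholder):

User-agent: *
Disallow: /images/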


The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.


Creating the robots.txt file

Robots.txt should be put in the top-level directory of your web server.

Consider the following robots.txt examples:


1) Here's a basic "robots.txt":
User-agent: *
Disallow: /
With the above declared, all robots (indicated by "*") are instructed to not index any of your pages (indicated by "/"). Most likely not what you want, but you get the idea.

2) Let's get a little more selective now. While every webmaster loves Google, you may not want Google's Image bot crawling your site's images and making them searchable online, if only to save bandwidth. The declaration below will do the trick:
User-agent: Googlebot-Image
Disallow: /

3) The following disallows all search engines and robots from crawling select directories and pages:
User-agent: *
Disallow: /cgi-bin/
Disallow: /privatedir/
Disallow: /tutorials/blank.htm

4) You can conditionally target multiple robots in robots.txt. Take a look at the example below:
User-agent: *
Disallow: /
User-agent: Googlebot
Disallow: /cgi-bin/
Disallow: /privatedir/
This is interesting: here we declare that crawlers in general should not crawl any part of our site, EXCEPT for Google, which is allowed to crawl the entire site apart from /cgi-bin/ and /privatedir/. So the rules of specificity apply, not inheritance.

5) There is a way to use Disallow: to essentially turn it into "Allow all", and that is by not entering a value after the colon (:):
User-agent: *
Disallow: /
User-agent: ia_archiver
Disallow:
Here I'm saying all crawlers should be prohibited from crawling our site, except for Alexa, which is allowed.

6) Finally, some crawlers now support an additional field called "Allow:", most notably Google. As its name implies, "Allow:" lets you explicitly dictate what files/folders can be crawled. However, this field is currently not part of the "robots.txt" protocol, so my recommendation is to use it only if absolutely needed, as it might confuse some less intelligent crawlers.

Per Google's FAQs for webmasters, the below is the preferred way to disallow all crawlers from your site EXCEPT Google:
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /