If you’ve been doing SEO for a while, you’ve probably heard of Google’s Penguin algorithm, which combats spammy and low-quality backlinks. 17th October 2014 was a highly significant date in the SEO world, as Google officially released the Penguin 3.0 update with the core aim of showing higher-quality results. Sites that somehow managed to slip past Penguin 2.0 should now be detected by the latest update. Penguin 3.0 is simply an updated version of the Penguin algorithm, specially designed to target web pages that amplified their rankings by tricking Google through unethical SEO practices.
Actually, Google was never 100% satisfied with the outcome of the Penguin 2.0 update, which is why it introduced manual penalties to support the automated ones. So Penguin 3.0 was a much-expected update from Google’s spam control team. It has been reported that approximately 1% of English search results were affected by this automated penalty.
While working to recover sites from Google’s Penguin 3.0 penalty, we have primarily found the below three issues –
- Unnatural link structure
- Too many outbound links that pass link juice
- Poor quality content
Generating a mass of inbound links has been a significant part of any SEO strategy since the stone age of SEO, as the number of inbound links determined a website’s importance and ranking. The most widely discussed feature of all Penguin updates is the way they treat backlinks. All Penguin updates are designed to penalize sites with artificial link structures, irrespective of the fact that those links were acceptable before and helped to improve ranking.
You can be buried deep within the search results if you have been desperately using wrong SEO techniques to build backlinks in pursuit of a page-one ranking. However, SEO is not dead – ethical SEO practices will definitely reward your site, slowly but steadily, thanks to the rapidly advancing science behind search engines and the new Penguin!
If your site contains too many “dofollow” external links, that may look suspicious to Google, and your site might be affected by the latest Penguin update. One simple explanation is that Google may think you are selling links. We have seen sites with a link directory or link-exchange page made in the past, or with external links from several blog posts to other commercial sites, penalized by the Penguin 3.0 update.
Poor Quality Content
If even a single page of your site contains duplicate or spun content, it can affect the performance of your entire site. Be it an innocent blog post or any other page, such pages may be the culprit.
Hope you now have a clear insight into protecting your site from Penguin 3.0 and all such future updates. Google algorithms are being updated on a much shorter cycle than in the past, making Google better at identifying unethical ways of boosting ranking and traffic. Always maintain a quality website that follows the Google Webmaster Guidelines and you will probably never be affected by any algorithm change.
Have you ever wondered how visitors interact with your website or with specific landing pages? Are the most important portions of your pages properly visible to visitors? Are they clicking on your various offers or call-to-action buttons? Well, Google’s In-Page Analytics tool is the answer to all of the above questions. Here you go –
- Login to your Google Analytics account.
- Select your website whose page click metrics you wish to analyze.
- Change the date range of the analyzed period as required.
- From the left side panel click on the Content option and then click In-Page Analytics.
- Your default page’s (i.e. homepage’s) In-Page Analytics stats will be displayed within the integrated browser in the right panel. You can adjust the view by clicking Expand (which hides the left panel) and then clicking the up arrow just above the integrated browser to hide the Site Usage metrics. You can also analyze the page click metrics of other pages by browsing to them from that browser.
Google’s In-Page Analytics tool also gives you a clear insight into the content that appears above the fold without any scrolling, and the percentage of visitors who scroll down to see more. So to judge how things are working on your website, or how popular a particular link or button is, do an in-page analysis using Google Analytics.
Please make sure you have sufficient click data before drawing any conclusions about onsite SEO changes.
If you are short on time or manpower and have bulk internet marketing and web development work, our dedicated hiring model would be best for you to save time and money. Here is how you can benefit –
- Your hired person will work from our office, Monday to Friday (except national holidays), on an 8-hours-per-day basis. You are guaranteed at least 160 hours of work per month (8 hrs × 20 working days). We also understand that you are paying for hours – all your resources will work 8 hours per day, excluding lunch and other breaks, Monday to Friday.
- You will be able to see what tasks he or she is performing on a day-to-day basis, with hourly usage details, anytime you want.
- You will be given a free Account Manager who will act as a single point of contact and team leader for your hired resource(s). No need to manage individual resources – your account manager will be at your service to answer all your queries promptly during business hours.
- We can sign an NDA (Non-Disclosure Agreement) to protect your privacy, client details, promotional activities, etc.
Key Advantages of Offshore Hiring –
- Quick Turnaround – You can assign priorities to your projects and we will work accordingly. It is much like instructing an in-house staff member and getting the job done quickly.
- No hiring burden, setup fees, or office expenses – you’re guaranteed to save more than 50% compared with an in-house employee doing the same tasks.
- Quick Solution to Any Problem – we specialize in all internet marketing and web development activities. If your hired person gets stuck or needs assistance, we are here to solve his/her problem promptly, so your work will not be hampered in any way.
- Quality Control – we are responsible for meeting all your expectations. We monitor all activities of your hired person closely and ensure error-free work at all times.
List of tasks that can be performed by your hired person –
- Complete SEO activities, including link building, content marketing, and making your websites fully search engine friendly, for both English and non-English websites
- All promotional content writing.
- Complete Paid Advertising Campaign management including Google AdWords, Microsoft adCenter etc. for both English & Multilingual websites.
- Social Media Profile Management including Twitter, Facebook account setup and promotion and online reputation management.
- All kinds of website development work, including but not limited to web design, web programming, website maintenance, etc.
Type of resources that we can provide based on your requirements –
Web Guru – Can execute all SEO activities, paid advertising activities, web design and web development activities.
In this model of hiring, 2 or 3 experts will work simultaneously to produce 8 hours of total output per day. For example, if you have design requirements we will assign a designer; if you have SEO requirements we will assign an SEO expert.
Your account manager will manage these resources for you based on the given assignments. So, you will get all kinds of capabilities within a single virtual resource.
SEO Expert – Can execute all internet marketing related activities, including complete SEO and paid marketing support. An SEO expert can’t do any web design or development related work and is limited to internet marketing jobs only.
Link Builder – Can execute all SEO link building activities and promotional content writing. Not capable of executing any paid marketing, onpage SEO, or web development related activities.
Meta tags have been used since the stone age of website development and search engine optimization. But over the years, these tags have lost their importance as an onsite optimisation tool due to excessive keyword stuffing and spam. Most of the major search engines no longer use them to understand a site, and rather prioritise other on-page aspects to understand the relevancy of a website for any specific set of keywords.
Syntax: <meta name="attribute" content="content type" />
Based on their characteristics, Meta tags are mainly classified into the following two categories –
- Meta http-equiv – this tag is used to send information to browsers as per its content attribute. Historically it was also used to specify the character set of a webpage, though it is no longer the only way to do so. For example, <meta http-equiv="refresh" content="20"> instructs the browser to refresh a page automatically 20 seconds after loading.
- Meta Name – these are optional, author-defined Meta tags that can be used to provide specific information to search engines and/or web developers.
Below are the standardized Meta Name tags which are recognised by most search engines.
Meta Description Tag – It is used to provide a short description of a page within an approx. 160-character limit. Most search engines, including Google, Yahoo and Bing, show the content of this tag as part of the snippet in the search results. You can actually write more, but most search engines display only about the first 160 characters in the SERP. Writing a compelling Meta description within that character limit can help improve conversion and influence searchers to click through to your website from the SERP.
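Since text beyond roughly 160 characters is usually cut off in the snippet, it can help to check your descriptions programmatically before publishing. Below is a minimal sketch in Python; the 160-character cutoff is the approximate figure discussed above, not an official constant.

```python
# Flag meta descriptions that are likely to be truncated in the SERP.
# The 160-character cutoff is an approximation, not an official limit.
MAX_DESCRIPTION_LENGTH = 160

def check_description(description: str) -> str:
    length = len(description.strip())
    if length == 0:
        return "missing"
    if length > MAX_DESCRIPTION_LENGTH:
        return f"too long ({length} chars, may be truncated)"
    return f"ok ({length} chars)"

print(check_description("A short, compelling summary of the page."))
```

Run it over every page of a site and you quickly get a list of descriptions worth rewriting.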
Meta Keywords Tag – Google confirmed long ago that they do not use the Meta Keywords tag or give it any weight as a ranking factor. I have not found any official reference about whether Bing uses it, but Yahoo surprisingly still encourages its use in their guidelines. Personally, I think it’s worthless to use that tag, and it would be better to concentrate on other parameters that have more influence as ranking factors.
Meta Robots Tag – It is recognised by all major search engines. However, it does not make sense to use it to specify something explicitly like content="index, follow" or content="all", as by default all search engines index a page and follow the links found on it. So we should use this tag only when we don’t want a page to be indexed or its links to be followed. It can also be useful for other negative scenarios, including "noodp" (when we don’t want DMOZ’s description shown in the SERP), "noarchive" (to prevent showing the cached link of a page), and "nosnippet" (to prevent a snippet from appearing in the SERP).
Watch the video below to hear what Matt Cutts has said about Meta tags –
There are also some search-engine-specific Meta Name tags which can be used to instruct a particular spider, such as Google’s, Yahoo’s or Bing’s, as follows –
- <meta name="googlebot" content="noindex, nofollow">
- Google / Bing / Yahoo site verification Meta tags, specified with the following meta name values – "google-site-verification", "msvalidate.01", "y_key"
There are some arbitrary or obsolete Meta tags found as well. However, most search engines now do not recognise them at all, or they are not related in any way to ranking for SEO. Here are some of those arbitrary tags –
<meta name="title" content="some text goes here" />
<meta name="generator" content="Frontpage">
<meta name="revisit-after" content="period">
<meta name="expires" content="tue, 01 Feb 2007">
<meta name="distribution" content="option">
<meta name="rating" content="general">
<meta name="subject" content="your website's subject">
<meta name="copyright" content="company name">
<meta name="language" content="en">
<meta name="author" content="company name or person name" />
Hope you now have a clear insight into Meta tags and their usage in 2013. Do leave your feedback as a comment.
If you have been in the SEO field for the last couple of years, you should remember a well-known term in the industry called “Google Dance”. The effect of the Google Dance was encountered most in 2007 and 2008, and from late 2009 onwards it became less frequent, as far as I can remember. From 2011 until September this year it had almost vanished, or at least was not noticeable like before.
What is Google Dance?
Google Dance is an experimental algorithm update intended to improve search results or drop web spam from them. It generally happens without any official announcement from Google. As an effect, the organic rank of some sites jumps (up or down) by as much as 3 to 5 pages. The live search results are impacted for a period of approximately 1 or 2 weeks and then come back to the normal steady state. However, for some victims the effect stays longer and appears permanent.
Google Dance is just an indication of a major permanent algorithm change, which generally does not happen in one go. For example, the last “Panda” and “Penguin” updates were just the final-stage performance of many Google Dance rehearsals. Google generally announces those dance factors officially much later – if it is satisfied with the results and decides to make those algorithmic changes permanent.
What happened on the dance floor during the last rehearsal in September?
Here are the top 3 impacts according to our analysis –
- URL Devaluation – Domains and URLs that were getting an advantage solely from having keywords in the URL have been identified and devalued for lacking other offpage and onpage SEO-friendly factors. Here is an official confirmation from Google web spam team head Matt Cutts – https://twitter.com/mattcutts/status/251784203597910016.
- Structural Devaluation – websites hosted on the same server (shared hosting) with very similar design/coding, navigation structure, pages and content have been identified and kicked off the floor.
- Authority Reward – websites whose links were found as ethical sponsor advertising, such as banner and text ads across a relevant network, were rewarded in organic ranks.
The results have now steadied, and some sites negatively affected by that dance got back to their previous positions automatically. But I believe the websites that are true victims are still not on the dance floor, as Google may retain a minor part of that algorithm change permanently, or will soon apply it, and they will simply be out.
“Prevention is better than cure” – I encourage you to follow only ethical SEO and to strongly avoid challenging Google with tricky techniques. Consider going for a periodical SEO audit to ensure everything is Google friendly.
Do let me know your views by commenting here. Oh, wait – one more thing: I hope you are not still using those old-school comment spam techniques?
Google officially announced a ranking algorithm update on 24th April 2012, which is popularly known as the Google Penguin Update. Though I have not found any official reference about the name “Penguin”, the name doesn’t sound too bad. The algorithm change has affected approximately 3% of English search results, as declared on their blog. However, this is just a smarter version of their previous update, called Panda, and the effect of both updates together has changed approximately 15% of the search results. Now that’s a dramatic change.
Based on my research and analysis over time, I have seen that the update mainly affected the following 5 kinds of SEO practices –
- Continuous practice of using exact targeted keywords to generate incoming links.
- Too many links from the same websites or from similarly categorized sites.
- The practice of building poor quality or off-topic links, mainly through stereotyped blog comments (including username spam), forum post signatures etc., specifically just to generate a link for your website.
- Building irrelevant contextual links or content spam, as demonstrated in their official post (I was really not aware of that kind of spam technique).
- Links from websites with minimal to no unique and useful content, created specifically for link building – or, more specifically, 3-way link building – purposes.
We have heard a couple of questions from our clients recently, like –
Should we stop using article submission or press release distribution, as Google appears to penalize mainly these content aggregators?
Should we do only contextual link building?
Should we remove all the previous links we generated from article and press release sites, and so on?
Well, let me first clarify that I have no intention to promote any of our services through this post. I would just be happy to offer guidance about safe link building strategies after that update. The mantra is “Diversify, Diversify and Diversify”.
Lots of links from similarly categorized sites always appear suspicious to search engines. That was true even before this update – the difference is that now Google is smart enough to detect them easily. For example, if there are, say, 3000 links pointing to your website and more than 80% of them come from article directories, that’s unnatural and your website is likely to be penalized. Your website should have links from all possible resources: social media websites, blogs, business listing sites, directories, press release sites and article sites, to name a few. You must not use or prioritise a single method and forget all the other techniques.
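To make that 80% warning concrete, here is a rough sketch of how you might audit a backlink list for over-concentration. The sample links, the category labels and the 50% threshold are all illustrative assumptions, not figures from Google.

```python
from collections import Counter

# Hypothetical backlink sample: (linking domain, source category).
backlinks = [
    ("ezine-example.com", "article directory"),
    ("article-hub-example.net", "article directory"),
    ("article-hub-example.net", "article directory"),
    ("facebook.com", "social media"),
    ("example-blog.com", "blog"),
]

# Flag any single category supplying more than half of all links.
THRESHOLD = 0.5  # illustrative; pick whatever level you consider unnatural
counts = Counter(category for _, category in backlinks)
total = len(backlinks)

for category, count in counts.most_common():
    share = count / total
    flag = "  <-- over-concentrated" if share > THRESHOLD else ""
    print(f"{category}: {count}/{total} ({share:.0%}){flag}")
```

Feed it an export from your backlink tool of choice and the skewed categories stand out immediately.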
The second thing to keep in mind is diversification among sources within the same category. For example – are you submitting your articles to the same list of article directories over and over? Consider not doing that anymore; instead, pick a list of top quality aggregators and vary the site list you use on a weekly or monthly basis for a particular project.
Exact-match keywords as anchor text were useful for getting to the top until some months ago, but now they can be the main reason for a penalty. That always sounds unnatural, and you know Google loves natural links. So tweak your keywords by using synonyms, singular/plural forms, by including a prefix or suffix with your keyword, prepositions/articles if that makes sense, or by changing the keyword order – you can even use common misspellings as anchor text. Also consider having some links that use the URL itself, i.e. without any anchor text.
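The variation ideas above can be sketched in a few lines. This is only an illustration – the rules (plural, prefix, suffix, reordering, bare URL) mirror the suggestions in the paragraph, and the keyword and URL are made up.

```python
# A naive way to diversify anchor texts for one target keyword.
# The variation rules below are illustrative; tune them per project.
def anchor_variations(keyword: str, url: str) -> list[str]:
    words = keyword.split()
    return [
        keyword,                    # exact match (use sparingly)
        keyword + "s",              # simple plural form
        "best " + keyword,          # prefix
        keyword + " online",        # suffix
        " ".join(reversed(words)),  # changed word order
        url,                        # bare URL, no anchor text
    ]

for anchor in anchor_variations("blue widgets", "http://example.com/widgets"):
    print(anchor)
```

In practice you would also weight how often each variation is used, so no single form dominates the profile.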
Write content for users; create value – that’s what Google says. Well, what you really need to do is use quality unique articles to build contextual links. Though it’s natural for the same article to travel among several content aggregators, it is safer to use each article version only once. An article spinner can be useful to save time; however, I must suggest checking each spun version manually to ensure it is up to the mark and acceptable to human editors.
Link Type Diversification:
Now you also need to build non-SEO-friendly links for the sake of SEO! Yes, you can’t have all “dofollow” links – that’s unnatural. A natural-looking link structure should include incoming links from images, nofollow text links and so on, from relevant sources.
You must not always build links only for the homepage or only for a few priority inner pages. Your site should have several other pages which may not be useful for SEO but are useful for visitors. There must be some balance so that it looks natural.
I hope this post will help you conceptualize what you need to do next. Let me know your thoughts here.
Statutory note – “no comment” is better than posting a stereotyped comment to get a link.
To give your blog the look and feel of your website, or simply to change the look of your WordPress site or blog, you may need to change the theme. There are several WordPress theme directories which offer both free and premium themes. So if you don’t have a ready-to-use theme, just Google the term “free wordpress themes” to find a list of those directories. Browse and download the theme of your choice (it comes as a .zip file).
Now that you have the WP theme at hand, here is how you can install it quickly –
Step 1: Login to your WP Admin panel using the administrative username and password.
Step 2: From the left navigation menu under the Appearance, click on Themes.
Step 3: Now select the Install Themes tab (located just beside Manage Themes) and click on the Upload option.
Step 4: You should find an option to choose the theme related ZIP file from your hard drive. Select that file and click on the Install Now button.
Step 5: A page should be displayed with the message “Theme installed successfully”. Click the Activate option if you wish to apply the theme now, or just return to the themes page to use it later.
Please note that the above options may slightly vary depending on the version of your WordPress.
Hope this quick manual will help you to change or install a new WP theme easily.
Google tried to compete with Yahoo Answers by building a similar kind of service. But unlike Yahoo Answers, Google Answers was not free, and that seems to be the main reason for its failure: while askers can get a quick solution through the Yahoo community for free, why would they pay for one?
3. Google Base:
It’s my favorite Google flop! The concept is quite similar to online classified sites. As of September 2010, the product was promoted to Google Merchant Center. But as there are a huge number of free local and international sites which support similar features, there is nothing exciting about it!
4. Power Meter:
I’m quite sure many people have not even heard about it. Google discontinued the program on 24th June 2011. The main aim of that software was to show electricity usage in real time and create awareness among users of how much energy they use.
5. Google Sets:
It appears quite complicated, and I’m not very sure why I would need to make a set for any practical usage. Thank God Google shut it down on 5th September 2011.
6. Google Squared:
Another complicated search product, which was discontinued along with Sets on 5th September. I wish to thank Google again for their wise decision to discontinue it.
Buzz is a social networking service integrated with Gmail, allowing users to share updates, photos, videos and more at once. It lets users have conversations about things they find interesting. It was released on February 9, 2010 and is scheduled for discontinuation by the end of 2011.
Jaiku is a social networking, micro-blogging and lifestreaming service comparable to Twitter. People are already well engaged on Twitter, so it is expected that most of us are not looking for another similar program. Thanks again that Google decided to shut it down on January 15, 2012.
A website for demonstrating and testing new Google projects. Google Labs closed on October 17, 2011.
There was a huge buzz about that service, but it was another super-flop program that was called back so quickly. I still have no idea what I was supposed to do after accepting a Wave invitation.
What’s next? I guess Google Plus would be on the next list. I forget when I last opened it after signing up and browsing for merely 15 minutes. Oh yes, I logged in another time to add a few friends to my circles and to accept a few invitations. Though I have not yet had enough chance to review it thoroughly, I have not seen anything more interesting or much different from the Facebook community.
I would love to hear from you all – comment your review on that topic.
First of all, in short, the answer is “Yes, Google recognises nofollow links and they can pass a trust vote”! Now let me give a short introduction to nofollow links for those who are new to them. Nofollow is a link attribute used to instruct search engine crawlers not to follow that link. By default, search engines follow all links found on a webpage. But if you want to prevent search engines from following or crawling some links, so that they do not pass link juice, you can do so by using the rel=”nofollow” attribute.
Now, we all know that nofollow links don’t pass any link juice or anchor text across the link. OK, let’s come to the point – does it make any sense to build “nofollow” links as far as SEO is concerned? Here is why it definitely makes sense to build nofollow links as part of your overall link building effort –
Recognisable to all search engines including Google -
If you take a closer look at the link section after logging in to your Google Webmaster Tools account, you should find that Google is counting links from web pages with the nofollow attribute! I guess you already have Twitter and Facebook accounts and promote your site through these platforms, and you know links from Twitter and Facebook are all marked with the nofollow attribute automatically. BUT you will see that Google Webmaster Tools displays incoming links from web pages with the nofollow attribute, including Facebook, Twitter, nofollow blogs, article aggregators etc. So, should there be any further doubt on this topic?
Trust vote -
By now, you should agree that Google and other leading search engines can detect all your incoming links irrespective of the link attribute. So if the search engines detect that you have a link from a highly relevant page, that passes a hidden trust vote to your site. What matters here is that the search engines know about your site’s link source, and a link from an authority domain or relevant page passes a trust vote regardless of the link attribute.
Natural link structure -
Nofollow links are an absolute must to make your overall link structure look natural in the eyes of search engines. If you focus on building only “dofollow” links (i.e. links without the “nofollow” attribute), that would look suspicious to the search engines. So you just need to maintain a balance between followed and nofollowed links for the best SEO benefit.
So, some nofollow links are valuable as well for SEO – PROVED
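As a quick illustration of the followed/nofollowed balance discussed above, here is a minimal Python sketch that classifies the links in an HTML snippet by their rel attribute. The sample markup is made up for demonstration.

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Counts followed vs. nofollowed <a href> links in an HTML document."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed += 1
        else:
            self.followed += 1

# Made-up sample page with a mix of link attributes.
html = '''
<a href="http://example.com/a">followed link</a>
<a href="http://example.com/b" rel="nofollow">nofollowed link</a>
<a href="http://example.com/c" rel="nofollow">another nofollowed link</a>
'''
parser = LinkClassifier()
parser.feed(html)
print("followed:", parser.followed, "nofollowed:", parser.nofollowed)
```

Running something like this over your own pages shows at a glance whether your outgoing (or incoming, if you fetch referring pages) link profile is all-dofollow.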
You can target visitors in mainly 2 ways, as discussed below –
- Area Names – You can just select the countries, regions or the city names you wish to target.
- Custom Targeting – You can target the required area around a particular location. For example, if you have a restaurant, you may want to target a 50-mile radius from your restaurant’s location. Covering an area other than a circle is also possible, and you can even exclude areas within your target.
Sounds great, isn’t it? Let’s now discuss how Google determines to show the ads based on the target area specified.
- By Domain extension – Searchers using a country-specific Google domain will see the ads targeted to that country.
- By Language – the language of the search keyword matches the language preference setting given by the advertiser.
- By IP address – Whenever you connect to the internet, your ISP assigns you a unique IP address. Google can track your location through your IP and show you ads targeted to that area.
- By Query Parsing – the query matches your ad and the targeted keywords in the campaign.
Now you understand how Google geo targeting works and how Google determines which ads to show in response to a query. But is that all you need to know to run a successful campaign? Not at all – there are some pitfalls you may be unaware of or may not have thought about before. To get the best ROI from your AdWords campaigns, you must ensure that you are not getting unnecessary clicks and impressions. So what are those geo targeting problems that can drain your money? Let’s see –
Geo Targeting Problems
Have you ever thought that your ads can be visible in countries other than the one you specified in your campaign? Yes, this can happen if searchers use the Google search domain dedicated to the particular country you have targeted. For example, if you target the UK, your ads will be displayed whenever someone searches using google.co.uk – regardless of the searcher’s actual physical location or country.
Your ads can also be visible to users located in countries or areas other than those you actually target, depending on your target language and the search domain used. For example, if you target France and specify French as your target language, your ad will be displayed whenever someone searches using any French term from your target keyword list – regardless of the searcher’s actual physical location or country.
IP addresses play a major role in city or custom area targeting. For example, if you target New York City and 50 km around it, you would probably expect visitors only from that particular area, right? But this is not 100% true, and there are big chances that you will lose potential visitors and/or get unnecessary traffic from beyond that area! Why? Because it depends on the IP addresses assigned by ISPs, which can cover a very large area. So even visitors located around 200 miles away from New York can be treated as visits from that area, while visitors actually located inside the area can be treated as outsiders.
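Under the hood, radius targeting boils down to a great-circle distance check against the chosen centre point. The sketch below shows the idea with the haversine formula; the coordinates and the Manhattan restaurant are made-up examples, not how Google actually resolves IP locations.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3959  # mean Earth radius, approximate

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def in_target_radius(center, visitor, radius_miles):
    return haversine_miles(*center, *visitor) <= radius_miles

# Made-up example: a restaurant in Manhattan targeting a 50-mile radius.
restaurant = (40.7580, -73.9855)
nearby_visitor = (40.6892, -74.0445)   # a few miles away
faraway_visitor = (42.3601, -71.0589)  # roughly Boston, far beyond 50 miles

print(in_target_radius(restaurant, nearby_visitor, 50))   # expected True
print(in_target_radius(restaurant, faraway_visitor, 50))  # expected False
```

The catch described above is that the visitor coordinates come from IP geolocation, so the point being tested can itself be off by many miles.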
Now you can see how improper AdWords campaign management can drain your money and how you can lose potential customers. I hope this post helps you set up your campaigns cleverly.