Panda 2.0 is hardly anything new, just Google's way of extending and sharpening its tools for fighting cheaters and spammers on the web. Despite several years of warning webmasters of the consequences of not complying with its guidelines, it appears most sites, including some of the world's largest, have not heeded Google's authoritarian rule. Its demands, while seeming extensive, have actually been pretty simple. Google has made it clear what is acceptable and what is not, but the world of web publishing continues to turn a blind eye anyway. Here we're going to look at some of the things that can get you penalized by Google, manually and algorithmically, effectively causing your sites to score poorly in organic search results.
Penalties Caused by a Poor User Experience:
Loading down your webpages with tons of advertisements isn't exactly what Google considers a good user experience. Google has stated plenty of times to stop overstuffing your sites with advertising, yet it appears no one is listening. In fact, most sites today, both small and large, appear to be dumping more ads than ever into their pages in the hope of earning more revenue. Too bad this method doesn't work; all it will do is cost you revenue from the traffic you will lose once Google realizes what you are doing. Google today has plenty of algorithms in place to identify and drop the ranking of sites that are loaded to the core with ads.
Placing too many ads in pages degrades the user experience by causing pages to load slowly, and if it's a lower rank in organic search you seek, this is the way to do it! Google is after not just sites with too many ads, but those that are loaded with popups. Popup ads are clearly an annoyance; users don't like them, so don't use them! If you are running them now, it's best to ditch them altogether. Sites with high-quality content and fewer ads will easily outrank yours in organic search results, since those sites provide a better user experience. No one wants to be in the middle of reading an article and get interrupted by a popup. Another thing to keep in mind: when Google penalizes sites and lowers their rankings, it raises the rank of other sites that do comply with its rules. This site has benefitted tremendously from Google's new algorithms, and is likely to continue to do so for the reasons just stated.
Penalties from Scraped Content:
This is another rule in Google's handbook that has clearly gone unheeded. After close to half a year on Google+, I now see more than ever people posting links to articles on their sites with scraped or stolen content (no names), and some of them are doing this on a large scale. If you think you're going to pull a fast one on Google by scraping articles and changing a few keywords, they will catch you, and they will penalize you, so don't bother. Google insists that all content on your site must be original, and any duplicate content will count against you.
Duplicate content doesn't necessarily mean scraped or stolen content; this rule also applies to the royalty-free and license-free images that many use on their websites. Web publishers will purchase the rights to use images from sites like Fotolia or PhotoXpress, but what they don't realize is that while they may have the legal right to use those photos, the photos are still considered duplicate content in the eyes of search engines, and they will earn you a lower organic rank for that reason. To avoid duplicate-content penalties from images, you can create your own, edit graphics to build new ones from parts of others, or change features of an image such as its brightness, contrast, color ratios, sizing, and/or encoding scheme. If you're using a JPEG you could convert it to a PNG, then create a new graphic by mixing pieces from several images into one. This is an effective strategy for avoiding duplicate-image penalties. We made the graphic banners on this site using several programs, and those banners are original only to this site, a big plus.
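To see why re-encoding or editing an image defeats the most basic kind of duplicate check, consider a byte-level fingerprint. This is a minimal Python sketch (the image bytes are hypothetical stand-ins): any edit at all, even a one-character change, produces a completely different hash. Real search engines may use more tolerant perceptual hashes, which is an argument for combining several of the edits described above rather than relying on one.

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Return a SHA-256 fingerprint of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-ins: a stock photo as downloaded, and the same
# photo after a tiny edit (re-encoding, a brightness tweak, etc.).
stock = b"...original stock-photo bytes..."
edited = b"...0riginal stock-photo bytes..."  # one character differs

print(image_fingerprint(stock) == image_fingerprint(edited))  # False
```

Because the hashes no longer match, a byte-for-byte duplicate detector sees two unrelated files.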
When an article is published to the internet, search engines are only going to give credit to one site for that article. They tend to do this in the form of time stamps recorded when crawling your content. It's best to submit newly minted articles directly through your webmaster tools accounts on both Google and Bing immediately after publishing any new content. This helps search engines identify the rightful owner of content and weed out the bad guys. One myth that seems to be swirling around the internet is that implementing Google Authorship in your content publishing strategy does away with the risk of anyone stealing your content. This is complete nonsense, since anyone could scrape an article and pen their name to it using Authorship themselves! While Google has a number of measures in place to protect the rightful owners of content online, the best method is still the good old-fashioned time stamp that search engines themselves create when they index your content.
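One concrete way to hand search engines that time stamp yourself is the sitemap protocol's lastmod field, which sitemap submissions through webmaster tools carry along with each URL. Here is a minimal Python sketch (the URL is hypothetical, and a full sitemap would wrap entries like this in a urlset element with the sitemaps.org namespace):

```python
import xml.etree.ElementTree as ET
from datetime import date

def sitemap_entry(loc: str, lastmod: date) -> str:
    """Build a single sitemap <url> entry with an explicit last-modified
    date, so crawlers see exactly when the content was published."""
    url = ET.Element("url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(url, encoding="unicode")

entry = sitemap_entry("https://example.com/new-article", date(2013, 6, 1))
print(entry)
# <url><loc>https://example.com/new-article</loc><lastmod>2013-06-01</lastmod></url>
```

Submitting an updated sitemap right after publishing puts your time stamp on record before any scraper's copy gets crawled.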
Penalties for Unresponsive Sites:
If your site takes too long to load, Google's algorithms are sure to spot this, and so are your site's users. If your site takes more than a few seconds to load on average, you have a serious problem on your hands, as search engines are likely to count slow page load times against you. Sites that load slowly also tend to have a high bounce rate, where users spend only a few seconds on the page they visited before leaving because they got tired of waiting. In extreme cases users may just assume that your site isn't going to load at all and leave altogether.
While content management systems seem like a quick and easy way to publish a new site, they can be more of a hindrance to the web publishing process than a savior. CMS-based sites tend to be heavy on code, especially PHP, tend to rely on plugins in order to function, and add all kinds of unnecessary code to pages that is clearly not required to build a website. In fact, content management systems like WordPress are not used to build websites; they are used to house and manage them, nothing more. PHP adds functionality to sites, but it has nothing to do with the creation of them in any way, shape, or form. All CMS-based sites still rely on HTML and CSS as the core foundation of their makeup.
Content management systems do have their place, and many WordPress templates do load fast, but the point here is that you need to understand how they work from the ground up; otherwise they will do nothing but cause you headaches. Either way, in the end static sites win hands down. If you're not willing to learn web design and expect WordPress to do it all for you as everyone else does, then don't complain later when you're outranked by a million other sites because your site loads like crap! If you want to publish content to the web, learn how it works or don't quit your day job.
Penalties from Using Link Exchanges:
The internet today is loaded to the core with sites offering to sell you high-quality backlinks from high-PageRank sites for pennies on the dollar, even guaranteeing their results. Don't bother. Whenever Google identifies a link network, it will penalize not only the network but also anyone who used it, including you!
Penalties from Paid Links:
This one appears to be seriously misunderstood by many, so I will do the best I can to explain it thoroughly. What Google means by paid links is links that pass PageRank through advertisements. Placing ads on your site won't get you penalized, but placing ads that pass PageRank will! Also, many sites are implementing advertorials that appear to be genuine articles when in fact they are nothing more than ads, and do not disclose to users that they are not articles based on genuine interest. Google's recommendation for advertorials is to specify to users that they are in fact ads, not just articles. You should also make an effort to "nofollow" links from ads on your sites. If you aren't sure whether your ads pass PageRank, contact your advertising agent to determine whether they have put measures in place to keep this from happening.
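In HTML terms, "nofollowing" an ad link just means adding rel="nofollow" to the anchor tag so it passes no PageRank. As a rough illustration, here is a Python sketch that patches the attribute onto links in bulk; the ad URL is hypothetical, and the regex is deliberately simplistic — a real site should do this in its templating layer or with a proper HTML parser.

```python
import re

def nofollow_ads(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that has no rel attribute.
    Simplistic regex sketch, not production HTML handling."""
    def patch(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave tags with an existing rel attribute alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", patch, html)

ad = '<a href="https://sponsor.example.com">Buy now</a>'
print(nofollow_ads(ad))
# <a href="https://sponsor.example.com" rel="nofollow">Buy now</a>
```

After the patch, the link still works for users, but it no longer endorses the destination in search engines' eyes.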
Implementing Sites Purely For Affiliate Marketing Purposes:
Affiliate marketing is clearly a grey area with Google. The way this works is, someone creates a site, writes articles, and then recommends products within the articles as if those products are worth purchasing based on the writer's experience. The problem is, the articles are generally produced on the premise of making a sale and collecting a commission, and most of the praise in these types of articles is not genuine and is usually a flat-out lie! The vast majority of affiliate-marketing articles are deceptive in that they market products the author actually knows nothing about and has never used. Search engines don't see any value in them and neither do I; don't do this!
Poor Ranking from Unoriginal Content:
Unoriginal content doesn't necessarily mean duplicate content. What we're referring to here is writing about the same subjects as everyone else, and writing short, dry, boring articles. Instead, make longer posts, get more in depth, and put a new spin on an old topic. We all have subjects that interest us that we'd like to write about, but it's best to add real insight and personal experience into the mix whenever possible. Trying to figure out what to write about nags me more than anything in the world. I try my hardest to consolidate some ideas, add a new touch to them, and write about them as thoroughly as possible. This method seems to work. Doing research is also a great way to come up with original content, since you can write based on your observations, which I do often; more than nine out of ten of my articles are based on research I performed on a given topic. Research-based articles can provide valuable input to users to solve real-world problems, regardless of the subject.
Today, more than half of the articles found on the web are likely no more than a few paragraphs long, and the majority of them are nothing more than an overview. People don't search online for overviews; they search for answers to real problems, and they want real information in return. If it's a lower bounce rate you seek, you get it by posting longer, more informative articles for your users to read. This keeps them on the site longer, in turn keeping bounce rates low. Implementing a search engine on your site also helps users quickly find what they are looking for; rather than having them leave your site for Google search, why not keep them right where they are? Besides helping your users find what they need, an integrated search engine can provide an avenue for additional revenue through advertisements. The search engine on this site provides a full index of all content previously published here, and it shows users ads at the top of the screen relevant to their search; either way, you and your users win.
Poor Website Functionality:
This one will seriously doom your site from day one! You could have the best content ever, but search engines like Google don't care; if your site's overall functionality is of poor quality, you'll never rank at all! Whether you're an SEO, have web design skills, or neither, I strongly suggest making use of Google's PageSpeed Insights, which can outline serious problems with your site's functionality that you might not otherwise have recognized. For instance, we used PageSpeed Insights to make huge changes to our site that have proved to have a dramatic influence on our ranking.
PageSpeed Insights shows webmasters detailed information about the technical functionality of their sites, pointing out issues that can seriously hurt ranking, including render-blocking scripts (third-party scripts that can be placed at the bottom of the page instead of in the head section, or replaced with asynchronous versions that load along with the page), uncompressed graphics, missed opportunities to leverage browser and server caching, and overall desktop and mobile performance data. Another place to look for errors in your webpage's code is simply to right-click in your browser and select View Source, which shows the source code for a given page. Firefox, for instance, will usually show code that has errors in red, helping outline potential problems with both HTML markup and scripts. You can also make use of the W3C validator, which instantly shows you what code on your page isn't up to par with current standards.
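The render-blocking scripts mentioned above are easy to spot yourself: any external script tag in the head without an async or defer attribute halts rendering until it downloads. Here is a small Python sketch of such a check, using only the standard library (the file names are hypothetical):

```python
from html.parser import HTMLParser

class BlockingScriptFinder(HTMLParser):
    """Collect external <script> tags inside <head> that lack async/defer,
    since those block page rendering until they finish loading."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "script" and self.in_head:
            if "src" in attrs and "async" not in attrs and "defer" not in attrs:
                self.blocking.append(attrs["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

finder = BlockingScriptFinder()
finder.feed('<head><script src="ads.js"></script>'
            '<script src="app.js" async></script></head>')
print(finder.blocking)  # ['ads.js'] -- only the tag without async/defer is flagged
```

Anything this turns up is a candidate for an async/defer attribute or a move to the bottom of the page.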
Penalties from Not Having Enough Content Above the Fold:
Here's another Google rule that continues to go ignored. Making users scroll to find the beginning of the content on a page is clearly a bad idea, and forcing users to do this will get you penalized, quick, fast, and in a hurry by Google! The key here is to place your content as high up on the page as possible, keep the focus on the theme of a given page, make the article or graphical content the most prominent element, and place ads off to the side somewhere. While scrolling is a natural process for long articles, it's unnecessary to force users to scroll past a ton of ads to find so much as the first heading of your page. Besides hampering the user experience, the lower your content sits on the page, the lower the quality scores you can expect from search engines, effectively allowing other sites to take your spot in organic search.
For whatever reason, it appears much of what I covered here has been completely ignored by the larger majority of sites on the web today. People tend to think they can continue to get away with loading down their sites with advertisements, writing short and boring articles, or stealing others' content and pretending it's their own. One individual on Google+ has been stealing content and plastering his name across it on a daily basis for who knows how long now. I'm sure Google has figured out what he's up to, and he will definitely get his in the end. Why piss off the entity that delivers your traffic to you? Then you have to work ten times harder to get traffic, when it could flow naturally and freely. In the end, the whole idea is providing the quality user experience that people expect and, more often than not, almost never get when surfing the internet.