Simple Techniques to Reduce Duplicate Content on Your Website


There is no shortage of SEO advice out there: optimize your pages, build links, submit your site to the search engines, and so on. All of that makes sense, but one tip that is often neglected is removing duplicate content from your website. Duplicates are bad for a site. They are the chunks or pages of your website that are identical, or nearly identical, to one another. To keep the quality of their search results high, Google and the other search engines penalize pages that are duplicates, which means lower rankings and less traffic. For a business or ecommerce website, that is bad news. Here is how you can reduce duplicate content on a website.

Showing Google that all your pages are distinct

Google works with algorithms, and sometimes content on your website can be flagged as duplicate unintentionally. To prevent this, you need to tell Google, or any other search engine, exactly what is what. In practice, that means optimizing every page on the website independently: each page should be dedicated to one particular topic and target keyword. The clearer and more direct your message to the search engine is, the better your chances of avoiding duplicates that happen this way. A good place to start is your website's meta descriptions and titles. Making every one of them unique helps keep your pages from being lumped together as duplicates by the search engines.
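To make this concrete, here is a sketch of what unique head markup could look like on two different pages of the same site; the store name, titles and descriptions are made-up placeholders:

    <!-- Product page: one topic, one target keyword -->
    <title>Handmade Leather Wallets | Example Store</title>
    <meta name="description" content="Browse our range of handmade leather wallets, crafted from full-grain leather and shipped worldwide.">

    <!-- Blog post: a clearly different title and description -->
    <title>How to Care for a Leather Wallet | Example Store Blog</title>
    <meta name="description" content="Five simple steps to clean and condition a leather wallet so it lasts for years.">

If those tags were identical across the two pages, the search engine would have one less signal that the pages are distinct.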

Using the robots.txt file

The robots.txt file, part of the robots exclusion standard, lets you tell a search engine what it should not crawl or add to its index. The file sits in the root of your website. With it, you can advise Google about the sections it shouldn't index. That not only keeps those pages out of the index, it helps you stem many of the scenarios that create duplicate content, such as printer-friendly versions or internal search results pages. Implementing a robots.txt file is quite simple and straightforward.
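As an illustration, a robots.txt file placed at the root of the site could look like the snippet below; the directory names are only examples of duplicate-prone sections:

    User-agent: *
    Disallow: /print/
    Disallow: /search/
    Disallow: /tag/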

301 permanent redirects

A 301 is a permanent redirect. It tells website visitors which page is the right one, and it is especially useful for pages that no longer exist: it says that the page that used to be there has been sent to a different location. The 301 sends the same signal to the search engine algorithms. Some people create a redirect after a campaign ends; others redirect to the home page to lower their bounce rate. If you create 301 redirects on the pages that hold the duplicates and point them at the right pages, the duplicates are stemmed.
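On an Apache server, for example, a 301 can be declared in the site's .htaccess file; the path and domain below are invented for the illustration:

    # Permanently send the old duplicate URL to the preferred page
    Redirect 301 /old-campaign-page/ https://www.example.com/products/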

Excluding parameters in Google Webmaster Tools and using rel=“canonical” tags

A canonical tag indicates which page is the right one, which is especially useful if you have different versions of the same page on the website. You can also use the parameter exclusion settings in Google Webmaster Tools to manage these pages. The canonical tag is placed in the page's head section and tells the search engines which specific URL the page content belongs to.
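For instance, a parameterized or filtered copy of a page could point back at the main version with a tag like this in its head section; the URL is a placeholder:

    <link rel="canonical" href="https://www.example.com/products/leather-wallets/">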

Using a duplicate content checker

Using a free duplicate content checker such as Plagspotter to find all the sites carrying the same content as your own is critical. You will be able to tell which parts of your site have been plagiarized, so you can take decisive action to stem the loss.

Matt

After a career as a professional musician and band leader in the Miami / South Florida area, I decided to see if I could make some money with this new internet thing. After years of trial and error, I started to get the hang of it, and now I am completely financially independent because of my various online businesses. The goal of this blog is to chronicle my continued marketing experiences. I focus on real examples of what works and what does not. Google does not give us a recipe for getting our sites ranked, so we have to rely on our own experience to see what actually works rather than on theory. I hope you enjoy the blog. Please let us know what you think in the comments area. We appreciate your feedback.