Well, I will go over some of the basics of SEO and throw in a few more advanced tips here and there, though I am sure a lot of people here are already somewhat familiar with search engine optimization. Today, though, I want to talk about how Google can filter out parts of your site, and I want to raise a question that I am still not sure about myself.
If you don’t already know, when a search engine spider or robot comes to crawl your site, it’s looking for the information or content on the page. It bypasses things like code and boilerplate content so it can focus on the important information. Boilerplate content is what SEOs call the content that repeats on every page: things like the copyright notice at the bottom, or the same message in your sidebar and header. In a way it can be considered duplicate content, and we know that duplicate content gets filtered out by the search engines; in other words, it’s not indexed.
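To make the idea concrete, here is a rough sketch in Python of how repeated blocks could be separated from unique content by comparing pages across a site. This is purely illustrative and is not how Google actually does it; the function names and the paragraph-based splitting are my own assumptions.

```python
from collections import Counter

def split_blocks(page_text):
    """Split a page into rough text blocks (here: paragraphs)."""
    return [b.strip() for b in page_text.split("\n\n") if b.strip()]

def find_boilerplate(pages, threshold=0.8):
    """Return blocks that appear on at least `threshold` of the pages."""
    counts = Counter()
    for text in pages:
        # Count each block once per page, even if it repeats on that page.
        counts.update(set(split_blocks(text)))
    cutoff = threshold * len(pages)
    return {block for block, n in counts.items() if n >= cutoff}

# The copyright line repeats on every page, so it gets flagged as
# boilerplate; the unique article bodies do not.
pages = [
    "Unique article one.\n\nCopyright Example.com",
    "Unique article two.\n\nCopyright Example.com",
    "Unique article three.\n\nCopyright Example.com",
]
print(find_boilerplate(pages))  # {'Copyright Example.com'}
```

The point is just that anything appearing on nearly every page of a site is trivially easy to spot and set aside, which is why boilerplate carries so little weight.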
So here is what raises my eyebrow: are the same links at the bottom of each page filtered out too, or are those links still credited with link juice? A lot of people know that pointing anchor text links at a page from lots of other pages helps that page rank for the keyword, so they just drop those links in the footer area or somewhere similar. Something you might want to look into is whether Google actually discounts those links. It might be better to place them in the body, mixed in with unique content.
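Building on the sketch above, here is a hypothetical illustration of how a crawler could label each link as “boilerplate” or “body” depending on which block it lives in. Again, this is just my own speculation about the mechanism, not anything Google has published; the simple `href` regex and block matching are assumptions for the example.

```python
import re

def classify_links(page_text, boilerplate_blocks):
    """Label each link 'boilerplate' or 'body' by the block it sits in."""
    results = []
    for block in page_text.split("\n\n"):
        block = block.strip()
        label = "boilerplate" if block in boilerplate_blocks else "body"
        for href in re.findall(r'href="([^"]+)"', block):
            results.append((href, label))
    return results

# A link inside the unique body copy vs. a link in the sitewide footer.
page = (
    'Read our <a href="/seo-guide">SEO guide</a> inside the article body.\n\n'
    'Copyright Example.com <a href="/sitemap">Sitemap</a>'
)
footer_blocks = {'Copyright Example.com <a href="/sitemap">Sitemap</a>'}
print(classify_links(page, footer_blocks))
# [('/seo-guide', 'body'), ('/sitemap', 'boilerplate')]
```

Whether Google does anything like this internally is exactly the open question, but it shows how easy it would be to treat sitewide footer links differently from links embedded in unique body content.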
Got a question, or want to state your opinion? Leave it below.