Using Robots to Solve the Problem of Duplicate Indexing

Friends who have used ECSHOP will know that ECSHOP suffers badly from duplicate indexing. Many friends' sites carry only about 1,000 actual products and articles, yet a site query turns up 3,000 to 4,000 indexed links, sometimes even tens of thousands. Because these duplicate links all share the same Title, they often lead to search engine penalties and poor indexing. This is a problem every SEOer running ECSHOP needs to solve, so below I share my own solution from personal experience; comments are welcome.

Next, let's break the duplicates down one by one.

1. Product and article pages ending in "?from=rss"

First, product pages and article pages generate a large number of duplicate links ending in "?from=rss", such as:

www.xxxx.com/goods-1.html?from=rss

There are two ways to deal with these duplicate links: one is to delete the RSS feed subscription; the other is to use Robots. My personal choice is the second, screening them out in robots.txt, so I added a statement banning search engines from indexing them:

Disallow: /*rss*

The meaning of this statement is that any link containing "rss" will not be indexed. Unless your site has page addresses that legitimately contain those three consecutive letters, this statement solves the problem.

2. Dynamic links on classification pages

A classification page produces a large number of dynamic links, including attribute-selection links, sorting links, and display links, such as:

www.xxxx.com/category-1-min80-max90-attr0.html

Looking at the attribute-selection links, we find a pattern: each contains one of the three words "min", "max", or "attr", which stand for the minimum and maximum values of an attribute. As above, we add statements to Robots banning the crawling and indexing of links containing these three words:

Disallow: /*min*
Disallow: /*max*
Disallow: /*attr*
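The Disallow rules above rely on the "*" wildcard, which is not part of the original robots.txt prefix-matching rule but is a de-facto extension honored by the major search engines. As a sanity check, here is a minimal Python sketch (my own illustration, not ECSHOP code; the function names are made up) of that wildcard convention, used to confirm which of the example paths the rules would block:

```python
import re

# The Disallow patterns discussed in this article.
RULES = ["/*rss*", "/*min*", "/*max*", "/*attr*"]

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check a robots.txt Disallow pattern against a URL path,
    treating '*' as "any sequence of characters" and a trailing
    '$' as an end-of-path anchor (the common wildcard convention)."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"  # anchor at end of path
    # re.match anchors at the start, giving prefix-match semantics.
    return re.match(regex, path) is not None

def is_blocked(path: str) -> bool:
    return any(robots_pattern_matches(rule, path) for rule in RULES)

print(is_blocked("/goods-1.html?from=rss"))              # True
print(is_blocked("/category-1-min80-max90-attr0.html"))  # True
print(is_blocked("/goods-1.html"))                       # False
```

Note the caveat from the article applies here too: a path like "/maximum-sale.html" would also match "/*max*", so check your own URL scheme before adding broad wildcard rules.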