Google penalty for SEO Settings?


    #16
    Re: Google penalty for SEO Settings?

    I've been following this thread and I'm concerned about what this means for the content on our site. With tractor parts, we have many product pages that are almost identical, with only the model number of the item changing. The URLs of these products look like this:

    http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7205&Category_Code=
    http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7035&Category_Code=
    http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7110&Category_Code=
    http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7217&Category_Code=

    Only the product code changes in the URL, so would this be considered duplicate content in the "eyes" of the search engines? If so, what can I do to change this and improve both overall and page-specific rankings?

    Thanks for the input.
    Last edited by newguyintown; 09-08-08, 08:42 AM.
    Rex Shafer
    PWP Designs

    "Start by doing what is necessary, then what's possible, and suddenly you're doing the impossible." St. Francis of Assisi



      #17
      Re: Google penalty for SEO Settings?

      Yes, because the content on each page is identical, except for the model number.

      As to your second question, I doubt if you will ever get any of those pages to rank well - there is insufficient text content - only 85 characters including spaces and punctuation marks.

      Perhaps your products should be reorganized - i.e. in this case, just have one product "Ford/New Holland 12in Disc - New" with 4 attributes: Model #s: 6610, 6600, 5600 and 5000?



        #18
        Re: Google penalty for SEO Settings?

        We talked about designing the site like that, but the boss wants the selection process to go Brand - Model# - Item Group - Specific Item. Breaking it down with attributes would mean a less specific selection process for the customer, i.e. Brand - Item Group - then a long list of items to sift through to find the one(s) that fit their model number. For the big brands like John Deere and Ford, that could be pretty inconvenient. I don't know of another way to keep the selection specific while reducing the duplicate pages. All our competitors are set up the same way.

        Also, you mentioned insufficient text content on the pages. What's the target amount of good content for an e-commerce product page to rank well? I would like the pages to show up in searches without being unreadable or uninformative for the customer.

        Thank you for your help.
        Rex Shafer
        PWP Designs

        "Start by doing what is necessary, then what's possible, and suddenly you're doing the impossible." St. Francis of Assisi



          #19
          Re: Google penalty for SEO Settings?

          Anyone who thinks that duplicate content within the same domain will NOT get you penalized is clearly misinformed. I've been penalized on my own sites in the past and I fully believe @aarcmedia's account is accurate. Navigation and site structure are not considered duplicate; it's the actual content that matters. Read up... Matt Cutts, Aaron Wall, Danny Sullivan and the other great SEOs don't preach about this for no reason. Use 301s and robots.txt where applicable. Handling non-canonical URLs is key; you can read about it from the man himself (for those who don't know, Cutts is Google's official "spam cop").

          @newguyintown: Those pages would most likely be considered duplicate content, but unfortunately there is little to no content on those pages either way. If you're having trouble keeping things in line, you can always use a non-Miva page for SEO purposes, then link from there to the purchase pages. It might be smart to block the dupe pages in robots.txt and nofollow the links to them. As far as content goes, don't write for search engines. Write for a normal person to read. Use a paragraph or two with a clear call to action.
          Last edited by silvalex; 09-09-08, 10:39 AM.



            #20
            Re: Google penalty for SEO Settings?

            Say I have duplicate paths to the same content:
            1) mydomain/product/part123.html
            2) mydomain/widgets/part123.html

            Can the robots.txt file be used to block bots from going to one of the short URLs created by Miva's SEO Settings?
            i.e. Disallow: /product/
            Bronson Design Studio, LLC
            Website: bronsondesign.com
            Facebook: facebook.com/bronsondesign



              #21
              Re: Google penalty for SEO Settings?

              @papi34: Yes, you can disallow directories like that. Just keep in mind that everything in the directory will be skipped.
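
              For example, a minimal robots.txt along these lines (just a sketch, using the /product/ and /widgets/ paths from the post above) would keep compliant bots out of one path while leaving the other crawlable:

              Code:
              User-agent: *
              Disallow: /product/

              Any URL whose path starts with /product/ is then skipped, so the /widgets/ version remains the only crawlable copy.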



                #22
                Re: Google penalty for SEO Settings?

                Originally posted by silvalex View Post
                Anyone who thinks that duplicate content within the same domain will NOT get you penalized is clearly misinformed. I've been penalized on my own sites in the past and I fully believe @aarcmedia's account is accurate. Navigation and site structure are not considered duplicate; it's the actual content that matters. Read up... Matt Cutts, Aaron Wall, Danny Sullivan and the other great SEOs don't preach about this for no reason. Use 301s and robots.txt where applicable. Handling non-canonical URLs is key; you can read about it from the man himself (for those who don't know, Cutts is Google's official "spam cop").
                Relying on those "great SEOs" is problematic for many reasons, not the least of which is that a lot of those "sermons" are OLD. I usually like to refer to the source:

                (From this article that someone else posted here.)
                To conclude, I'd like to point out that in the majority of cases, having duplicate content does not have negative effects on your site's presence in the Google index. It simply gets filtered out. If you check out some of the tips mentioned in the resources above, you'll basically learn how to have greater control about what exactly we're crawling and indexing and which versions are more likely to appear in the index. Only when there are signals pointing to deliberate and malicious intent, occurrences of duplicate content might be considered a violation of the webmaster guidelines.

                While you should deal with this correctly by using appropriate measures such as robots.txt, site map tools such as our Merchant Moogle, and the other issues addressed in the article, it's not something to lose sleep over... and it is, as I have maintained for a LONG time... NOT TRUE. The statement that Google will "penalize you for duplicate content" is false. It's not going to help, and may even detract somewhat (i.e., it's not optimal), but it's not going to result in a penalty. (OK, maybe I'm slicing semantics a bit, but the difference between saying "Google penalizes" versus "it's not optimal" is probably worth a few hours of sleep for the poor store owner already working 12 hours a day.)
                Bruce Golub
                Phosphor Media - "Your Success is our Business"

                Improve Your Customer Service | Get MORE Customers | Edit CSS/Javascript/HTML Easily | Make Your Site Faster | Get Indexed by Google | Free Modules | Follow Us on Facebook
                phosphormedia.com



                  #23
                  Re: Google penalty for SEO Settings?

                  Bruce,

                  First, ask yourself why he specifically covered the point that "you can take matters into your own hands to avoid Google indexing duplicate content on your site" and provided links to articles on how to do it. If there were absolutely nothing to worry about, he wouldn't have mentioned it.

                  Second, you need to read between the lines in everything he says:

                  1. "majority of cases" - Majority could mean 51% so 49% could experience negative effects. Even if it were as high as 75%, which I doubt, that would mean 1 in 4 experience negative effects. Personally, I don't fancy the idea of doing something that has little or no positive effect where there is a 1 in 4 chance of a negative effect (and it won't be small).

                  2. "effects on your site's presence in the Google index" - Note he uses the word "site", not "pages". So your 1,000 page site could be in the index but 900 pages are in the Supplementary (or whatever they call it now) index and only 100 in the main index.

                  3. "effects on your site's presence in the Google index" - Note that he uses the word "presence" and does not mention the word "ranking". So, your site can be "present" in the index and all your pages rank greater than 1,000 (i.e. they don't appear in the normal search results).

                  4. "simply gets filtered out" - So all the PageRank that your home page and other pages have passed to the filtered pages is lost. He does not address PageRank at all in the article. Within a site, PageRank is passed from the home page and top-level pages to all the lower pages. Through links from those lower-level pages back to the home page and other top-level pages, most of that PageRank is passed back up to them, and then another iteration occurs. However, pages that are filtered do not appear to pass their PR back up the chain, thus damaging the PR of the pages that actually are included in the main index, and with it the ranking of those pages.

                  The Google engineers may not aim to penalize a site for duplicate content within that site, but that does not mean there is no collateral damage.



                    #24
                    Re: Google penalty for SEO Settings?

                    Also important to note: I'm on board with the idea that "background" duplicate content - duplicate content that was previously indexed but is no longer linked to ANYWHERE in the site - will eventually be filtered out.

                    HOWEVER, many site owners, when implementing SEO-friendly links, leave old links to non-SEO-friendly URLs, creating a situation where the site owner is spreading link juice and PR to duplicate pages, diluting the site's overall rankability.

                    So it's ALWAYS important to battle duplicate content. Considering how easy it is to do, it should always be addressed.

                    I agree that a novice in ecommerce can easily get sidetracked by reading old material. In the SEO world, if you're reading an article about optimization from 6 months ago, chances are the data is actually 1-2 years old. We still have clients requesting that we stuff their global footers with keyword-rich links, and we have to inform them that this is not just old hat but ancient, and nowadays would actually hurt their cause.

                    So Bruce, you are correct in one respect: duplicate content is not a death sentence, and it's not all that bad all the time. HOWEVER, there is no argument for letting it stand - it should ALWAYS be remedied if you know you have it.

                    If an old product page with a dynamic URL is indexed and carries some "juice", 301 redirecting it to the new page will pass that juice on. Without taking care of those kinds of problems, implementing SEO-friendly links on an old site is futile: you're going to murder your rankings before they pick back up again. It takes the search engines a long time to re-index everything on your site and re-figure out what is supposed to be included and what is not.
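
                    As a sketch of that kind of 301 - assuming an Apache server with mod_rewrite, and using the product code and SEO-friendly page from earlier in this thread purely as illustrations:

                    Code:
                    RewriteEngine On
                    RewriteCond %{QUERY_STRING} Product_Code=7035
                    RewriteRule ^mm5/merchant\.mvc$ http://www.farmlandtractor.com/mm5/new-holland-12in-disc.html? [R=301,L]

                    The trailing ? drops the old query string, and R=301 makes the redirect permanent so the old URL's "juice" is passed on to the new page.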
                    Ted Hust
                    AarcMediaGroup.com

                    Celebrating 13 Years of Outstanding Service & Support
                    Miva Merchant Design



                      #25
                      Re: Google penalty for SEO Settings?

                      I agree with aarcmedia, Pete McNamara and Bruce - PhosphorMedia on this topic because really everyone is saying the same thing differently to a certain extent.

                      First, a quick clarification: there is a duplicate content filter, not a penalty. A penalty means being removed from the index, while a filter allows you to still be in the index, but Google treats other pages as the original content. It is much easier to recover from being filtered than from being penalized. If you copy someone's website (or your own) and put it on a second site, the pages on the second site could be penalized because they are "new" but contain content already found on another domain. You can rank short term, but it will not last if you duplicate content from another website. Note there are some sneaky ways to trick the search engines about which content is older and who originated it, but you shouldn't do this, so I won't go into details.

                      The reason duplicate content on the same domain "hurts" your rankings is that you are essentially watering down the strength of your page authority. If you have a dynamic URL and a short link URL that both exist with the exact same content, the authority is split between the pages (not necessarily 50/50). That said, having the default Miva Merchant dynamic URLs will not hurt your ranking chances; I've seen plenty of dynamic URLs ranked above "static" or SEO-friendly URLs. The reason I recommend SEO-friendly links is the customer standpoint. If I search for "New Holland 12in Disc" and I see two results:

                      Code:
                      http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7035&Category_Code
                      
                      http://www.farmlandtractor.com/mm5/new-holland-12in-disc.html
                      Which one would you click on? I would be clicking on the second URL.

                      @newguyintown - Given your situation, I would recommend creating unique product descriptions for each product, even though the model number is all that differs. This serves two purposes: a) you will not have to worry about the content being seen as duplicate, and b) you may find that one product's description results in more sales, and you can then test that style of description on other products. I would also recommend linking to the similar products (the ones that are identical except for model number) on each product page.

                      At the end of the day duplicate content does not help you get ranked so in some way it does hurt you. To what degree varies site to site.
                      Mark Simon | SEO Specialist | Miva Merchant

                      Connect and join the conversation with Miva Merchant on Twitter, Facebook, YouTube, Vimeo & our blog.



                        #26
                        robots.txt question

                        If I have a category tree hierarchy such as:
                        lighting/indoor/lamps/tablelamps
                        where acme_table_lamp.html is in each of the 4 categories above and therefore has at least 4 URLs pointing to the same content...
                        How do I instruct robots to
                        Disallow: /lighting/indoor/lamps/
                        Disallow: /lighting/indoor/
                        Disallow: /lighting/
                        yet ensure the bottom category (tablelamps) gets crawled?
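
                        One possible sketch for that, assuming Googlebot (Allow: is a Google extension rather than part of the original robots.txt standard, and Google applies the most specific matching rule):

                        Code:
                        User-agent: Googlebot
                        Allow: /lighting/indoor/lamps/tablelamps/
                        Disallow: /lighting/

                        Googlebot would then crawl the tablelamps category but skip the three higher-level duplicates; other bots may not honor Allow: at all.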
                        Bronson Design Studio, LLC
                        Website: bronsondesign.com
                        Facebook: facebook.com/bronsondesign



                          #27
                          Re: Google penalty for SEO Settings?

                          The best way is to have identical links to a product page no matter from what category page or any other page - i.e. do not pass the category code in the link to a product page. Then, you can allow the robots to crawl and index all your category pages (which generally is better for your site's rankings).
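
                          In other words, using the URLs from earlier in this thread, every category page would link to a product with the same single URL, leaving the category code out entirely:

                          Code:
                          http://www.farmlandtractor.com/mm5/merchant.mvc?Screen=PROD&Store_Code=FTS&Product_Code=7205

                          One URL per product means one indexed page per product, no matter which category the shopper navigated through.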



                            #28
                            Re: Google penalty for SEO Settings?

                            I haven't read the Googlebot stuff lately... but I do know that at one time, when they applied for a patent, part of their published ranking scheme was that pages closer to the 'top' of a website carry more weight. So if your product appears to be several folders deep, or several clicks into your site, it won't be weighted as favorably as if it were linked directly off the web root of your store.

                            So, given that, I would clean up my linking style so that all those folders aren't in the URL. You can still display breadcrumbs that LOOK like the category path, but for the actual href on the product link, leave out those categories.



                              #29
                              Re: Google penalty for SEO Settings?

                              The biggest issue here isn't a penalty per se, it is a waste of link juice.

                              Take 2 identical pages with distinct URLs

                              http://domain.com/widgets/widget.html
                              http://domain.com/products/widget.htm

                              If you have 50 inbound links to EACH of them for the keyword "Best Widgets" (or whatever you are trying to rank for) then you are splitting the power of those inbound links in half.

                              EDIT: Let me clarify - I am not saying there is or is not a penalty, I am just giving an additional reason to avoid having duplicate content on-site.
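
                              One way to consolidate that kind of split - not mentioned above, but announced by Google in early 2009 - is a rel=canonical tag in the <head> of both pages, pointing at the version you want credited (the URLs here are just the examples from this post):

                              Code:
                              <link rel="canonical" href="http://domain.com/widgets/widget.html" />

                              Engines that support the tag then consolidate inbound links to either URL onto the canonical one.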
                              Last edited by gmanning; 04-02-09, 04:10 AM.
                              Geoff Manning
                              -------------------------
                              Miva Sites: Oriental Furniture | Room Dividers



                                #30
                                Re: Google penalty for SEO Settings?

                                The problem lies in using the category code in links to products. This creates the duplicate content.

                                Our answer is to eliminate the category code from the product links. This breaks the breadcrumbs and cattree. So we add the category codes back in with Toolkit to get breadcrumbs and cattree working again.

                                Duplicate content eliminated automatically without using manual robots directives.

                                One last trick - use nofollow on non-indexed links to prevent PR leaks.
                                Last edited by Biffy; 04-02-09, 04:34 AM.
                                Steve Strickland
                                972-227-2065
