Here’s a search engine optimization (SEO) pitfall that doesn’t get much press, so it’s sort of a “stealth” hazard: duplicate content.
As the term suggests, duplicate content occurs when two or more Web pages have the same content. This may sound implausible at first blush, but in fact it's a big problem for B2Cs, which frequently sell the same product on multiple sites. For B2Bs, it can happen when…
- multiple sites publish your e-distributed press release
- partners use your product/service descriptions verbatim
- your blog content is posted to multiple websites
- multiple URLs on your domain point to the same content
- you pay industry or news sites to host your content (syndication)
Some of this clearly represents things you set out to make happen, and for good reason. But there can be an associated cost if you're not a bit careful. It's not that Google will actually punish your site, as commonly rumored (here is their statement on the matter); it's that your page might simply not show up in a given search results page, with Google noting the omission in a little message like this:
In order to show you the most relevant results, we have omitted some entries similar to the ones already displayed. If you like, you can repeat the search with the omitted results included.
Of course, Google does this to avoid giving searchers 30 or so entries that are in fact the same page with different URLs. You may think this would be a rare event, but in fact Yahoo has stated that roughly one-third of the Web's pages are duplicates.
If the duplications are on your own website, then a second SEO hit may come into play: if Google can't determine which page is the primary one, your page-rank strength may be diluted, because the value of inbound links gets spread across all of the duplicate pages. Now, you may think that this never happens either, but: I have a shall-remain-nameless B2B client whose site contained a couple dozen pages of content, each one targeting a specific industry, and all built using the same content. When called on it, they changed three sentences in one paragraph! Even disregarding the issue with search engines for a moment, what about the poor site visitors (aka prospects) who'll be subjected to this?
In her excellent post for Marketo’s Modern B2B Marketing blog, Maria Pergolino provides several tips for avoiding duplicate-content problems…
- Submit a site map; this will help Google understand which pages you think are most important on your site
- Use canonical tags to avoid duplication concerns within your domain
- Provide unique content descriptions when doing content syndication programs
- Deploy press releases on your site before submitting them for distribution, and include additional unique information in your version
- Build your product pages using unique information not provided to partners
- Require partners to add unique information about how the partnership is relevant to your product or service on their pages
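The first two tips are easy to act on. Here's a minimal sketch of what each looks like in practice; the example.com domain and URLs are placeholders, not anything from a real site. The canonical tag goes in the `<head>` of every variant page and points at the one version you want indexed:

```html
<!-- Placed on each duplicate/variant URL (e.g. a tracking-parameter
     version of the page), telling search engines which URL is primary: -->
<link rel="canonical" href="https://www.example.com/widgets" />
```

And the simplest way to point Google at your site map is a one-line entry in your robots.txt file:

```
Sitemap: https://www.example.com/sitemap.xml
```

You can also submit the sitemap directly through Google's webmaster tools, which additionally reports back any crawl problems it finds.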
All this is goodness, but it still pales beside the biggest content problem of all: too many B2B marketers still skimp, cut corners, or otherwise treat content development as an unworthy pursuit. There once was a time when B2B marketers spent countless hours trying to get the copy in their off-line collateral "perfect" before disseminating it to prospects. But now, with that chore fading into the background and all the focus on their website, strangely they treat the content as somehow secondary. Instead, they spend all their effort on getting the blue line that spans the width of the page below the header banner "just so".
Evidently, we can’t say it enough, so we just gotta keep on saying it: quality, relevant content is the biggest single thing you can do for both prospect engagement and search engine ranking.