The Haunting Tales of Duplicate Content
If you are a brokerage or agent in today’s real estate web landscape, there is something terrifying looming over your head. It’s creeping under your bed, lurking in the darkest corners of your room, hiding in the depths of your closets, and that thing is Duplicate Content. We may not even know it exists, but it haunts our every move on the web. For real estate professionals, duplicate content is almost a guarantee: unless you can confine each of your listings to a single web page, duplicate content could be dragging down your performance in the search results.
Until recently it wasn’t much of a problem, only spooking us once in a while when we thought we caught a glimpse of it. Then Google had to go and rile us up with haunting tales of websites being hurt by duplicate content. With the recent release of Google’s Penguin algorithm update, everyone’s attention has shifted to making sure their site does not violate Google’s “quality guidelines,” and our Boogey Man, duplicate content, is a big part of that. Relying on search engines to judge the intent behind your duplicate pages may not be the best way to go about it, especially when third-party sites, multiple agent sites, and IDX systems are all generating content very similar to your own. Where do the lines start to blur?
Duplicate content, as loosely defined by Google, occurs when different URLs on your site display overwhelmingly similar content (the same keywords, text, photos, video, and so on). Search engines like Google are getting smarter about how they index your web pages, and when they are presented with multiple pages of very similar content, they make an algorithmic determination about which page to prefer. That may not always put your best foot forward on the results pages: the page you want displayed may not be the page they choose. The Penguin update also set a precedent that sites may start to get penalized for carrying copious amounts of the same content. Google suggests that Penguin only applies to sites using duplicate content to manipulate the search results; even so, I would rather err on the side of caution.
Common Reasons Behind Duplicate Content
Duplicate content gets generated in a variety of ways. For example, many brokerages still dynamically generate URLs using scripts, which produces long, character-ridden query strings in the URL.
See samples below:
www.example.com/products.php?id=546416&title=Page_Name&action=blank
www.example.com/products/item.php?id=546417&title=Page_Name2&action=blank
Both URLs show the exact same information, but they are different addresses. A single listing page could have four or five different URLs, depending on how the consumer navigated to it through the brokerage website.
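To make that concrete, here is a minimal Python sketch (my own illustration, not something from the article or any vendor’s code) of how those query-string variations can be boiled down to a single canonical address. The rule that only the “id” parameter identifies a listing is an assumption for the example; a real site would need its own rules, especially when the path itself differs.

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

# Hypothetical rule: only the "id" parameter identifies a listing;
# "title", "action", tracking codes, etc. are presentation noise.
KEEP_PARAMS = {"id"}

def canonicalize(url: str) -> str:
    """Reduce a listing URL to a single canonical form by stripping extra parameters."""
    parts = urlparse(url)
    query = parse_qs(parts.query)
    kept = {k: v for k, v in query.items() if k in KEEP_PARAMS}
    clean_query = urlencode(kept, doseq=True)
    # Collapsing path variations ("/products.php" vs "/products/item.php")
    # would need site-specific rules; here we only strip extra parameters.
    return urlunparse((parts.scheme, parts.netloc, parts.path, "", clean_query, ""))

print(canonicalize("http://www.example.com/products.php?id=546416&title=Page_Name&action=blank"))
# http://www.example.com/products.php?id=546416
```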
Even if you don't use scripts to generate your URLs, you can still run into duplicate content concerns. Whether it is poor organization of your site, or multiple pages being created for the same property through your agent pages and your main framework, there are plenty of ways content gets duplicated without you even being privy to it. For example, the "index.html" page of a website is usually the same page displayed when a visitor accesses the site without specifying a filename, so "http://www.example.com/index.html" and "http://www.example.com/" usually show the same content.
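If you want to check whether two addresses on your own site really are serving the same page, a quick comparison like the hypothetical sketch below will tell you; it simply fetches both URLs and compares a hash of what comes back. The example.com addresses are the placeholders used above, not a real brokerage site.

```python
import hashlib
from urllib.request import urlopen

def page_fingerprint(url: str) -> str:
    """Fetch a page and return a hash of its body."""
    with urlopen(url) as response:
        return hashlib.sha256(response.read()).hexdigest()

# Example: does the homepage resolve to the same content as index.html?
a = page_fingerprint("http://www.example.com/")
b = page_fingerprint("http://www.example.com/index.html")
print("Duplicate content" if a == b else "Different content")
```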
So, How Do We Remedy Duplicate Content?
There are a few steps you can take to address duplicate content issues and ensure that consumers visiting your real estate site see the content you want them to see.

- One way to solve duplicate content issues is with 301 redirects, which permanently point every extra address for a page to the one preferred URL (see the sketch after this list).
- Another quick fix is the “rel=canonical” link, a tag placed in the head of each duplicate page that tells search engines which URL is the preferred version (also sketched below).
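By way of illustration only, here is a minimal sketch of a 301 redirect using Python and Flask; the framework, the routes, and the listing ID are all assumptions for the example, since the exact setup depends entirely on how your site is served.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Assumed preferred address for the listing.
@app.route("/listings/546416")
def listing():
    return "Listing detail page"

# Permanently (301) redirect the old script-generated address to it,
# so search engines consolidate everything onto the preferred URL.
@app.route("/products.php")
def legacy_listing():
    return redirect("/listings/546416", code=301)

if __name__ == "__main__":
    app.run()
```

The “rel=canonical” approach leaves the duplicate pages in place but adds a link tag to the head of each one pointing at the preferred URL. A tiny, hypothetical helper shows what that tag looks like:

```python
def canonical_link_tag(preferred_url: str) -> str:
    """Return the <link> element to place in the <head> of every duplicate page."""
    return f'<link rel="canonical" href="{preferred_url}" />'

print(canonical_link_tag("http://www.example.com/listings/546416"))
# <link rel="canonical" href="http://www.example.com/listings/546416" />
```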