Why does Semrush say I have duplicate content?
If the Site Audit bot identifies multiple pages whose content is at least 80% similar, it flags them as duplicate content.
HTTP & HTTPS / WWW & non-WWW
In most cases, duplicate content stems from http/https issues. According to W3C standards, two versions of a URL (one on http and the other on https) are considered two separate documents.
The same goes for when a site has a www version of a page as well as a non-www version of the same page - search bots see these as two separate documents.
So when SemrushBot sees these two separate documents, it flags them as duplicates, because that is how Googlebot would see them.
To avoid this issue, add a canonical tag on each duplicate page pointing to the page you want indexed as the canonical version.
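As a sketch, the head of each duplicate URL variant could carry a canonical tag pointing to the preferred version (example.com and the path are placeholders):

```html
<!-- Placed in the <head> of both the http and the www variants,
     pointing search engines to the version you want indexed -->
<link rel="canonical" href="https://example.com/page/" />
```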
You should also set up a 301 redirect from the http version to the https version so that users and search engine bots only ever see your https pages.
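On an Apache server, for instance, this redirect could be configured in an .htaccess file like the sketch below (example.com is a placeholder; other servers such as Nginx use their own syntax):

```apacheconf
# Redirect all http requests to the https version with a 301 (permanent) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```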
Pages with little content
Another cause is thin content: if Site Audit sees two pages that share the same header and footer but have very little body content (one or two sentences), the bot may judge them at least 80% similar and therefore duplicates. In this case, expand the content on those pages so bots can identify them as unique.
- What Issues Can Site Audit Identify?
- How many pages can I crawl in a Site Audit?
- How long does it take to crawl a website? It looks like my audit is stuck.
- How do I audit a subdomain?
- Can I change automatic Audit re-run schedule or stop it completely?
- Can I set up a custom re-crawl schedule?
- How is Site Health Score calculated in the Site Audit tool?
- How Does Site Audit Select Pages to Analyze for Core Web Vitals?
- How do you collect data to measure Core Web Vitals in Site Audit?
- Why is there a difference between GSC and Semrush Core Web Vitals data?
- Why are only a few of my website’s pages being crawled?
- Why are working pages on my website reported as unavailable?
- Why can’t I find URLs from the Audit report on my website?
- Why does Semrush say I have duplicate content?
- Why does Semrush say I have an incorrect certificate?
- What are unoptimized anchors and how does Site Audit identify them?
- What do the Structured Data Markup Items in Site Audit Mean?
- Configuring Site Audit
- Troubleshooting Site Audit
- Site Audit Overview Report
- Site Audit Thematic Reports
- Reviewing Your Site Audit Issues
- Site Audit Crawled Pages Report
- Site Audit Statistics
- Compare Crawls and Progress
- Exporting Site Audit Results
- How to optimize your Site Audit crawl speed
- How to integrate Site Audit with Zapier
- Using Semrush to Find Areas for Improvement on Your Website
- Optimizing Page Title and Meta Description via Semrush