
How to Remove Old URLs from Search Results


John Locke is an SEO consultant from Sacramento, CA. He helps manufacturing businesses rank higher through his web agency, Lockedown SEO.

Occasionally, you may change some pages on your website, eliminate pages entirely, or migrate to a new website platform altogether.

You may notice that Google still has old URLs that no longer exist in the search results.

How do you get the new web addresses indexed, and make the old outdated web addresses disappear from search results?

The URLs in Google’s index don’t instantly disappear once you change or delete a URL. The index is the collection of all web pages and documents that Google has previously encountered that are eligible to appear in search results.

There are a few things you can do to get outdated URLs removed from search results, and changed URLs indexed, but in any case, it may take a while for Google to drop the old URLs entirely.

Here is a video describing how to remove outdated URLs from Google search results.

A Few Things to Know About the Google Index

Googlebot, the web crawler that finds, crawls, and captures web pages and documents, will periodically try to crawl any URL it has encountered in the past. Googlebot seems to have an infinite recollection of previously discovered URLs.

Because websites sometimes experience technical difficulties and downtime, Googlebot will not immediately drop a URL from the index when it discovers that URL is missing. It will continue to try to crawl that web address until it is satisfied that it no longer resolves to a web page.

Popular websites are crawled more often than less popular sites. If you have a basic small business website that doesn’t receive an abundance of organic search traffic, an outdated URL may still appear in search results for months, because Googlebot must be reasonably sure an old web address is gone before dropping it.

Google will also try to crawl any new URLs it discovers through:

  • XML sitemaps submitted in Google Search Console
  • HTML sitemaps and other internal links on your website during a regular crawl
  • External links (hyperlinks outside of your website) that link to URLs on your website
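
For reference, an XML sitemap is just a plain list of the URLs you want crawled, in the format defined by the sitemaps.org protocol. A minimal sketch (the example.com addresses are placeholders); note that it should only contain your current URLs, never the outdated ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-08-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/new-page/</loc>
  </url>
</urlset>
```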

Discovering new URLs doesn’t resolve old URLs that no longer exist. The best way to speed up this process is to use 301 redirects.

Use 301 Redirects When You Change or Eliminate a URL

A 301 redirect diverts a browser (or crawler) from one URL to another. Generally, redirects are either a 301 (Moved Permanently), a 302 (Found, used as a temporary redirect), or a 307 (Temporary Redirect). When given the choice, you should use a 301 redirect instead of a 302 or 307 redirect, if SEO is your main concern.

Remember how we said that Googlebot will try to crawl URLs it found in the past? Unless you use a 301 redirect, it won’t automatically know that you changed one URL on your website to another. 301 redirects are also useful for keeping your link profile intact.

Hyperlinks from external websites to your website help establish your authority. Losing this “linking power” by changing URLs and failing to 301 redirect the old pages to the new pages can result in broken links. This diminishes your overall website authority, and can impact your SEO.
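
To make the mechanics concrete, here is a minimal sketch of a server issuing a 301, using only Python’s standard library. The path mapping is hypothetical; in practice your web server or CMS plugin handles this for you:

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Hypothetical mapping of outdated paths to their replacements.
REDIRECTS = {"/old-page": "/new-page"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            # 301 (Moved Permanently) tells browsers and Googlebot
            # the old address is gone for good.
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"current page")

    def log_message(self, fmt, *args):
        pass  # keep the example quiet
```

Serving this with `ThreadingHTTPServer(("127.0.0.1", 8000), RedirectHandler).serve_forever()` and fetching `/old-page` returns the 301 plus the new Location header; at the protocol level, swapping `301` for `302` is the only difference between a permanent and a temporary redirect.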

Implementing 301 Redirects on Various Platforms

If you have a WordPress website, you can use a plugin like Redirection or Simple 301 Redirects to implement 301 redirects. Some of the more robust premium SEO plugins like SEOPress Pro also offer built-in 301 redirects.

On Squarespace, 301 redirects are handled under Advanced > URL Mappings. Using the formatting that Squarespace recommends, you can set up redirects in that panel.

Using Google Search Console

You can also speed up the process of removing outdated web pages from search by using the Inspect URL tool in Google Search Console.

1. Log in to Google Search Console and find your website property.
2. Click the input field in the top navigation that says Inspect any URL in your website property. Put in the old URL that is now being redirected. Hit Enter to inspect.
3. If the URL is indexed, a message will appear that says URL is on Google. Click the button next to Page Changed? that says Request Indexing.
4. When the URL has changed, and the redirect is working, a message will appear that says the URL is being redirected. Click Request Indexing to schedule a crawl of the new URL.
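
Before requesting indexing, it’s worth confirming that the old URL really answers with a 301 and the correct Location. A small sketch using only Python’s standard library (the URL you pass in is whatever old address you are checking):

```python
import http.client
from urllib.parse import urlsplit

def first_hop(url):
    """Fetch `url` and return (status, location) of the first response,
    without following any redirects."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.hostname, parts.port)
    else:
        conn = http.client.HTTPConnection(parts.hostname, parts.port)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("GET", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")
```

If `first_hop("https://example.com/old-page/")` comes back as a 302 or 307 instead of a 301, fix the redirect before asking Google to recrawl.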

What Happens Next

Googlebot will schedule a re-crawl of the changed or new URL. The new version of the page, or the page that the redirect points to, will be indexed soon. You can test whether a new URL is indexed by searching the URL directly in Google.

In time, the old URL will drop from search results, and the new URL will be shown in search results.

If the new version of the page is different from the old page, there may be some fluctuation in the search results, as the page will be re-evaluated for different search queries.

Final Thoughts

Change is part of the ephemeral nature of the World Wide Web.

Pages change; websites change; links disappear; entire sites change hands or disappear entirely.

A cohesive plan for a site migration or redesign should include a spreadsheet of the URLs that are going to change, and the URLs that will replace them. Make sure your web development team has a plan for redirecting old URLs once the site migration is complete.
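
That spreadsheet can also generate the redirects themselves. Assuming a simple two-column CSV (the `old_url,new_url` column names are my own), a short script can turn the migration mapping into redirect rules; the nginx `rewrite` syntax shown is just one possible target, so adapt the output to whatever server or plugin you use:

```python
import csv
import io

def redirect_rules(mapping_csv):
    """Turn rows of old_url,new_url into nginx-style permanent-redirect rules."""
    reader = csv.DictReader(io.StringIO(mapping_csv))
    rules = []
    for row in reader:
        old = row["old_url"].rstrip("/")
        # The `permanent` flag makes nginx answer with a 301.
        rules.append(f"rewrite ^{old}/?$ {row['new_url']} permanent;")
    return rules

sheet = """old_url,new_url
/services/widgets,/products/widgets/
/about-us,/about/
"""
for rule in redirect_rules(sheet):
    print(rule)
```

Generating the rules from the spreadsheet keeps the mapping in one place, so the document your team reviews is the same one that drives the redirects.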

If you are only changing a handful of URLs, you should still have a plan for 301 redirects and determining current indexing in Google Search Console.


4 comments on “How to Remove Old URLs from Search Results”

  1. I have read and understood all that. But I have a little bit more weird of a situation. Let’s say there is a page that has had 3 different URLs in the past.

    1. a very old one
    2. an old one
    3. the current one

    What happened is that we changed from the old to the current one, and it was indexed and it ranked after a while. But the other two were still indexed, too. And after a while even the very old one started to rank for certain keywords. So Google dropped the current one and preferred the old URL over the current URL.
    Whereas the old one and the very old one are both redirected to the current one, and should not exist in any sitemap or anywhere else anymore.
    Is that normal? Can this naturally happen, or is this something to worry about?

    1. After a while, the old URLs should no longer be coming up in search results, if you have 301 redirects in place, sending the two older URLs to the current URL. What I can tell you is that for about a year and a half now (as of August 2022), Google seems to not crawl many sites as often as it used to. Maybe they are trying to conserve resources, who knows. What you will want to do: 1) Make sure both redirects are 301s and are working correctly. 2) Go into Google Search Console and recrawl those old URLs (Under Crawl any URL in this Property – should be the top search bar.) It should tell you “URL is not on Google”, and you should Request Indexing. This should clear the old URLs.

  2. Thanks for this post John. I understand creating new pages and redirecting old URLs to new URLs. My issue goes beyond that. My website is built on WordPress. Google Search Console is crawling a pile of old URLs I personally did not create and have no use for. I believe that they are mostly URLs that were created when I installed a theme while building my website 1.5 years ago. There were sections of the theme that I did not need, e.g. product catalogues with all their images and recipes etc.—and these URLs are coming back in my Google Search Console report. Just as an example, it is telling me that there are 268 pages that are 404, “These pages aren’t indexed or served on Google”. My site is pretty small – only 35 posts and 5 pages. I believe this affects my site negatively. How can I get Google to stop crawling these?

    1. Hi Jane:

      Googlebot never forgets a link that it encounters, so these 268 links that you see in Google Search Console will be crawled periodically, even if it is only once every few years. My suggestion would be to create 301 redirects using a plugin, that way they will resolve to a valid page. I’m not sure how much these 404s are affecting your SEO, as this is a fairly common issue. But it’s still best to be safe, and make sure they are permanently redirected.

      – John
