
How to Remove URLs from Google Search Engine

Webmasters Asked by Web Trainings on January 9, 2021

I have tried many things, such as URL removal requests and deleting the pages, but the errors are still shown in my Webmaster Tools.

How can I remove the URLs completely from Google?

6 Answers

First off, Google's got a great memory, so even if you are successful in getting Google to drop your URLs from their index, they will request those URLs again in the future. Your approach needs to account for both getting the URLs dropped now and keeping them out later.

I'd advise you to go through these steps

  1. Use Google Search Console's Removals Tool to stop Google from surfacing your URLs. This is a manual request that you file with Google under TEMPORARILY REMOVE URL, and it usually gets the URLs hidden within 24 hours (often sooner). Please note that this only means Google hides your URLs for 180 days; they're not really gone yet (we'll get to that later).
  2. Be sure to clear any caches Google has for your URLs by using the second tab, CLEAR CACHED URL, on Google Search Console's Removals Tool. After handling steps 1 and 2, Google won't be showing your URLs anymore, including cached versions.
  3. Now, from your post I gather that you already removed the pages. In that case it's best to make sure you return the 410 Gone status code instead of a 404 Not Found. The 410 Gone sends a much stronger signal to Google that they should remove the URL from their index, whereas a 404 Not Found is often accidental (see the 410 sketch after this list).
  4. Remove any internal links, incoming redirects and incoming canonicals to the pages you want removed. You need to stop sending Google signals that indicate a URL should be crawled (and potentially indexed).
  5. To easily monitor whether Google is picking up on all of your hints, create an XML sitemap listing all of the URLs you want gone and submit it in Google Search Console (see the sitemap sketch after this list). Remove the XML sitemap after Google has successfully dropped all URLs from its index.
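
For step 3, here is a minimal sketch of returning 410 Gone for removed pages, assuming a Python/Flask application; the paths are placeholders, not taken from the question.

    # Minimal sketch for step 3, assuming a Python/Flask app.
    # The paths below are placeholders.
    from flask import Flask, abort

    app = Flask(__name__)

    # URLs that were deliberately removed and should drop out of Google's index.
    REMOVED_PATHS = {"/old-page", "/discontinued-product"}

    @app.route("/<path:page>")
    def serve(page):
        if f"/{page}" in REMOVED_PATHS:
            # 410 Gone tells crawlers the page was removed on purpose,
            # a stronger removal signal than an accidental-looking 404.
            abort(410)
        return f"Content for /{page}"

    if __name__ == "__main__":
        app.run()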
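
And for step 5, a sketch of generating the removal-only XML sitemap with Python's standard library; the URLs and output filename are made up for illustration.

    # Minimal sketch for step 5, using only Python's standard library.
    # The URLs and output filename are placeholders.
    import xml.etree.ElementTree as ET

    removed_urls = [
        "https://example.com/old-page",
        "https://example.com/discontinued-product",
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in removed_urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url

    # Submit the resulting file in Google Search Console, then delete it
    # once all listed URLs have been dropped from the index.
    ET.ElementTree(urlset).write("sitemap-removed.xml", encoding="utf-8", xml_declaration=True)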

What to avoid

Avoid applying robots.txt directives to try to get Google to remove URLs from their index: if they've already indexed the content, it's going to take a really long time for them to drop the URLs. They also won't be able to pick up on your 410 Gone, because you've prevented them from accessing it.

Also, for new pages that should remain accessible to visitors but stay out of the index, it's not recommended to keep Google from indexing them through robots.txt. It's best to use the noindex robots directive (either in the HTML source or through the X-Robots-Tag HTTP header).
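
As a rough illustration of the header variant, again assuming a Python/Flask app (the route is hypothetical):

    # Minimal sketch: sending the noindex directive via the X-Robots-Tag header.
    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/visitor-only-page")
    def visitor_only_page():
        resp = make_response("This page stays reachable for visitors.")
        # Equivalent to <meta name="robots" content="noindex"> in the HTML source,
        # but as an HTTP header it also works for non-HTML responses such as PDFs.
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp

    if __name__ == "__main__":
        app.run()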

Interestingly, Google does advise using robots.txt to get them to drop images from their index.

Learning more about URL removal

If you want to dig further into removing URLs from Google, I've documented the removal process for this situation and several others in great detail at: https://www.contentkingapp.com/academy/google-remove-urls/.

Answered by Steven Van Vessum on January 9, 2021

If your site has pages indexed in Google that you want to remove, the first step is to change your site. You can either remove the pages, block Googlebot from viewing them, or include a tag in them that tells Google not to index them.

  • 410 Gone -- Removing the pages with a 410 status allows Googlebot to remove them from the index as soon as it next crawls them.
  • 404 Not Found -- Googlebot will remove 404 URLs when it next crawls them, after a 24-hour grace period.
  • noindex tag -- If the page is still available on the site, but should not be indexed, include <meta name="robots" content="noindex"> in the <head> section. Google will remove it from the index the next time it crawls it.
  • Password protection -- Protect the information by requiring a login or password to access it. Google may index the URL in this case, but it won't be able to index the content (see the login sketch after this list).
  • Disallow in robots.txt -- You can block Googlebot from crawling pages by adding Disallow: /mypage in robots.txt. Google doesn't usually index pages it can't crawl, and it never indexes their content. However, Google may sometimes show the URL in the search results without any snippet (see the robots.txt sketch after this list).


    If you have removed the page or added a noindex tag, it is important that you allow Googlebot to crawl the page. If you also disallow crawling in robots.txt, Googlebot won't be able to see that your page should be removed. Don't disallow pages in robots.txt if you have also implemented another method from this list.
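
To illustrate the password protection option, here is a minimal sketch using HTTP Basic authentication in a Python/Flask app; the route and credentials are placeholders, not part of the original answer.

    # Minimal sketch: keeping content out of the index by requiring a login.
    # The route and credentials are placeholders.
    from flask import Flask, Response, request

    app = Flask(__name__)

    USERNAME, PASSWORD = "editor", "change-me"

    @app.route("/private-report")
    def private_report():
        auth = request.authorization
        if not auth or auth.username != USERNAME or auth.password != PASSWORD:
            # Unauthenticated requests (including Googlebot's) only see a 401,
            # so Google may index the bare URL but never the content.
            return Response("Login required", 401,
                            {"WWW-Authenticate": 'Basic realm="Private"'})
        return "The protected report"

    if __name__ == "__main__":
        app.run()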
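
And to see how a Disallow rule is interpreted by a compliant crawler (and why it also hides a 410 or noindex from Googlebot), here is a small sketch using Python's standard urllib.robotparser; the rule and URLs are examples only.

    # Minimal sketch: how a compliant crawler reads a Disallow rule.
    # The robots.txt content and URLs are examples.
    from urllib.robotparser import RobotFileParser

    robots_txt = [
        "User-agent: *",
        "Disallow: /mypage",
    ]

    rp = RobotFileParser()
    rp.parse(robots_txt)

    # The disallowed URL won't be fetched at all, so Googlebot can never
    # see a 410 status or a noindex tag on it.
    print(rp.can_fetch("Googlebot", "https://example.com/mypage"))  # False
    print(rp.can_fetch("Googlebot", "https://example.com/other"))   # True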

With all these methods, Googlebot has to return and crawl the pages before they get removed. This process may take a couple of months, especially when there are many pages that are not very popular. If you want to speed the process up, log into Google Search Console and use the Fetch as Google feature for each URL. Note that this is a manual process with a quota of ten fetches per day, so it won't work for a large number of URLs.

Google also has a Remove URLs tool in Search Console that can temporarily remove URLs or directories from Google for 90 days. You can use this tool to remove URLs from search results quickly before Googlebot has a chance to crawl them.

Even after URLs are removed from the Google search results, Google may still show the URLs as errors in Google Search Console. As long as Google finds links to URLs, they may show up as crawl errors after they have 404 or 410 status. This is completely normal and won't hurt the other pages on your site at all. See what Google's John Mueller has to say about crawl errors for more information.

Answered by Stephen Ostermiller on January 9, 2021

There are two ways to remove URLs from the Google search engine.

The first is:

If you don't want Google to show your page, you can use Google Webmaster Tools' URL removal tool to request its removal. Here is the way:

  • Log in to your Webmaster Tools account
  • Go to Site configuration
  • Go to the Crawler access tab
  • Paste your URL in a new removal request

After a few days your URL will be deleted from the search engine and will no longer be crawled.

The second is:

If your site has content you don't want Google or other search engines to access, use a robots.txt file and disallow the folder of that web page.

You can use either of the above steps to remove URLs from the Google search engine.

Answered by Nishi on January 9, 2021

You can remove a page or site from Google's search results, but it will take a while before it takes effect.

Answered by John Conde on January 9, 2021

In general, you can't. But there are some options.

One option is to remove the pages from your server entirely, and make sure they respond with 404 errors.

Another is to use the robots.txt file to prevent Google from crawling the content. This will certainly prevent new pages from being indexed, but it may take a while for existing pages to no longer show up.

A final option is to use the noindex meta tag.

For details, see Google's page on this topic: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=164734

Answered by payne on January 9, 2021

The short, practical answer: you can't.

What you can do is remove links that point to them and make sure that the pages respond with a 404 status code, and when Google sees that the pages are no longer in existence they'll eventually purge them from results.

Answered by coreyward on January 9, 2021
