DUBUPLUS
CUSTOMER SUPPORT

SEO & Statistics


Title: How to handle Search Robots
Author: Administrator (관리자)
Date Created: 02/02/2018
Views: 149


STEP01. Effective Use of robots.txt

 

1. What is the effective use of robots.txt?


A robots.txt file controls search engine crawlers by telling them which parts of your website they may access and crawl. If you don’t want certain pages to appear in search engine results, you can block them with robots.txt.

If your website does not appear in search results, check your robots.txt settings first.
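As an illustration, a minimal robots.txt might look like the sketch below. The `/admin/` path here is a placeholder for any area you want to keep out of search, not a DubuPlus default:

```
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers not to visit the admin area
Disallow: /admin/
# Everything else remains crawlable
Allow: /
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it does not prevent direct access to a page.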

 


2. Recommendations for effective use of robots.txt


Avoid excessive robots.txt rules.

Use more secure methods (such as password protection) for sensitive content; robots.txt only asks crawlers to stay away and does not restrict access.

Use free webmaster tools to verify your settings.
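One way to verify your rules before relying on a webmaster tool is Python's built-in `urllib.robotparser`. The rules and URLs below are hypothetical examples, not DubuPlus defaults:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks crawlers from an /admin/ area
# while leaving the rest of the site open.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = RobotFileParser()
parser.parse(rules)

# Public pages are crawlable; the blocked area is not.
print(parser.can_fetch("*", "https://example.com/about"))   # True
print(parser.can_fetch("*", "https://example.com/admin/"))  # False
```

This checks the same logic a search engine applies when it reads your robots.txt, so you can confirm a page is (or is not) blocked before waiting for a recrawl.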

 

3. Setting robots.txt


Set robots.txt in [Manage Mode] – [Preferences] – [Site Management] – [Settings for SEO/SNS Sharing].






