
The importance of the robots.txt file for website optimization!

2020/07/02 Source: http://www.jnexb.com
It is often said that you can use a robots.txt file to block pages you don't want search engines to crawl, but what do these "pages you don't want crawled" usually look like? Here are a few simple examples.
(1) When a URL exists in multiple versions, the versions other than the canonical one. For example, after a site's links have been made pseudo-static, you probably don't want search engines to keep crawling the dynamic versions; in that case you can use robots.txt to block all of the site's dynamic links.
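As a minimal sketch of such a rule, assuming the dynamic versions are exactly the URLs that carry a "?" query string (a common situation after pseudo-static rewriting):

    User-agent: *
    # Block every URL containing a query string, i.e. the dynamic versions.
    # The * wildcard is honored by Baiduspider and Googlebot, although it
    # is not part of the original robots exclusion standard.
    Disallow: /*?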
(2) If the site generates a large number of pages from cross-combined queries, many of them are bound to be empty. You can give these contentless pages a distinctive URL feature of their own and then block that pattern with robots.txt, so that search engines don't conclude the site is churning out junk pages.
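For illustration, suppose the empty combination pages all share a hypothetical filter parameter in their URLs (the /list/ path and the parameter name here are assumptions, not taken from the article):

    User-agent: *
    # Hypothetical URL feature marking cross-combination filter pages;
    # any listing URL that combines filters this way stays out of the index.
    Disallow: /list/*filter=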
(3) If the site is redesigned, or a large number of pages are suddenly deleted for some reason: it is well known that a sudden flood of dead links hurts a site's performance in search engines. Although you can now submit dead links to Baidu directly, it is arguably better to simply block Baidu from crawling them, so that in theory Baidu never suddenly discovers that the site has far too many dead links; or do both at once. Either way, webmasters should clean the dead links out of the site themselves.
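A sketch of the blocking half of that approach, assuming the deleted pages all lived under a hypothetical /old-section/ directory:

    User-agent: *
    # The whole removed section returns 404s after the redesign;
    # blocking it keeps crawlers from piling up dead-link errors.
    Disallow: /old-section/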
(4) If the site has UGC-style features and, to encourage users to contribute content, does not forbid them from embedding links in it, then, to keep those links from wasting the site's link weight or implicating the site, you can turn them into on-site jump (redirect) links and block those with robots.txt. Quite a few sites already work this way.
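One common way to set this up, sketched with a hypothetical /go/ redirect path (the path name is an assumption):

    User-agent: *
    # User-submitted external links are rewritten to an internal redirect
    # such as /go/?url=..., and that path is blocked so crawlers never
    # follow them and no link weight flows out of the site.
    Disallow: /go/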
(5) The usual pages that you simply don't want search engines to index.
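For completeness, a typical set of such rules; the exact paths (admin backend, login page, user pages) are generic examples rather than anything from the article:

    User-agent: *
    # Everyday candidates that rarely belong in a search index
    Disallow: /admin/
    Disallow: /login
    Disallow: /user/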
This article is provided by E销宝. Our website is www.jnexb.com. We will serve you with wholehearted enthusiasm and ever better service; you are welcome to visit!
