update robots.txt to prevent spidering · adityai/activeadmin-demo@07996ab · GitHub
Commit 07996ab

update robots.txt to prevent spidering

this is a precaution, as I'm not sure if the spam links are being scraped by spiders

1 parent 833b9c4 commit 07996ab

File tree

1 file changed: +2 additions, -2 deletions


public/robots.txt

Lines changed: 2 additions & 2 deletions
@@ -1,5 +1,5 @@
 # See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
 #
 # To ban all spiders from the entire site uncomment the next two lines:
-# User-Agent: *
-# Disallow: /
+User-Agent: *
+Disallow: /
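A quick way to confirm what the new rules do is Python's standard-library `urllib.robotparser`, which implements robots.txt matching. This is just an illustrative sketch (the crawler name "Googlebot" is an example, not something from the commit):

```python
from urllib.robotparser import RobotFileParser

# The rules as committed above; parse() accepts a list of lines.
rules = """User-Agent: *
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# With a wildcard user agent and "Disallow: /", every path is off-limits
# to every compliant crawler.
print(rp.can_fetch("Googlebot", "/admin"))  # False
print(rp.can_fetch("*", "/"))               # False
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it cannot stop clients that ignore the protocol, which matches the "precaution" caveat in the commit message.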

0 commit comments