Block search engines via robots.txt (#631)
Prevents instances from being rate limited due to being senselessly crawled by search engines. Since there is no reason to index Nitter instances, simply block all robots. Notably, this does *not* affect link previews (e.g. in various chat software).
This commit is contained in:

parent 778c6c64cb
commit c543a1df8c
robots.txt:

@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
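The effect of the two rules above can be sketched with Python's standard-library `urllib.robotparser`, which implements the same robots.txt semantics crawlers are expected to follow (the paths used here are hypothetical examples):

```python
from urllib import robotparser

# Parse the robots.txt rules added in this commit.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# "Disallow: /" under "User-agent: *" forbids every path for every
# well-behaved crawler. Fetching a page directly (e.g. for a chat
# client's link preview) does not consult robots.txt, so previews
# keep working.
print(rp.can_fetch("Googlebot", "/some/status/page"))  # False
print(rp.can_fetch("*", "/"))                          # False
```

Note that robots.txt is advisory: it stops compliant crawlers such as search-engine bots, which is the rate-limiting concern here, but it is not an access-control mechanism.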