Robots.txt
Lumis allows you to register settings for the parameters and user-agent criteria that make up the robots.txt file.
Each service instance corresponds to one robots.txt file. Therefore, you must instantiate one Robots.txt service for each website (solution).
Robots.txt files are read by the user agents (crawlers) of external search engines. Through robots.txt, an engine learns which paths it may crawl and index, among other information.
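As an illustration, a robots.txt file produced by such a service might look like the sketch below. The directives (User-agent, Disallow, Sitemap) are standard robots.txt syntax; the specific paths, user agents, and sitemap URL are hypothetical and would depend on the rules registered in the service instance:

    User-agent: *
    Disallow: /admin/
    Disallow: /private/

    User-agent: Googlebot
    Disallow: /no-google/

    Sitemap: https://www.example.com/sitemap.xml

Here the first rule block applies to all crawlers, the second overrides it for a specific user agent, and the Sitemap line points crawlers to the site's sitemap.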
The next sections cover how the service administrator views the service, its administrative environment, and finally how the portal administrator makes the service available.