I implement Lijit Search on all my sites, as I really like the control it gives me over the search box and the detailed analytics.
I recently received a message from Lijit support (who are great people, btw) stating this:
Quote:
It looks like you currently have a setting prohibiting crawlers from indexing your content. In order for us to get you properly crawled, you will need to edit the robots.txt file on your site and remove the following line: Disallow: / Until then, we won’t be able to index anything.
I googled to learn more about the Disallow directive, and I want to know: should I disallow certain folders, or can I just delete this line entirely?
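For reference, here's roughly how I understand the difference in robots.txt (the folder names below are just placeholders, not my actual site structure):

Code:
# What I have now -- blocks all crawlers from the entire site
User-agent: *
Disallow: /

# What I think the alternative looks like -- block only specific
# folders and let crawlers index everything else
User-agent: *
Disallow: /admin/
Disallow: /private/

From what I've read, removing the Disallow: / line (or leaving Disallow: empty) means crawlers can index everything, while Disallow: / blocks the whole site. Is that right?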
I am still relatively new to this and want to be sure I know what I'm doing, so I don't negatively impact how Google's search crawlers see the site, etc.
Anyone else using Lijit?