The only thing I'm having a problem with is my robots.txt file. I have pages blocked by the robots file, but Jcrawler is still including them in the sitemap. It's not a real problem at the moment, as they are easy to find and I can delete them manually.
The file is readable, and while the line:
works in Google, it doesn't work in the component.
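As a quick sanity check on the robots.txt itself, you can run the file through a standards-compliant parser and confirm the rule actually blocks the URLs in question. Here is a minimal sketch using Python's standard-library parser; the `Disallow` path and example URLs are assumptions for illustration, not the actual rule from my file:

```python
# Hypothetical check: does a given robots.txt rule block a URL?
# The Disallow path below is an example, not the real rule.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A blocked path should come back False, an allowed one True.
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/page.html"))   # True
```

If the parser agrees with Google that the rule blocks the page, the file is fine and the problem is on the component's side.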