It's far, far quicker to use the web indexes to browse for releases, and to use sites for the more specific files that aren't already batch processed. For individual files, using NewsLeecher in Parallels works OK, but for binaries the woeful download speed, and the UI slowdown as the HDD and RAM choke, will really suck. You can find answers to some common questions on NewsLeecher's support forum. It's the same advice as getting games to work in CrossOver, really. (When data is received on the TCP Mapping, it just relays it to the remote server you set up in the mapping.)
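That relay behaviour is simple enough to sketch in a few lines of Python. This is a minimal illustration of what a "TCP Mapping" does under my own made-up names (`relay`, `pipe`, the localhost port) — it is not the mapping tool's actual code:

```python
import socket
import threading

def pipe(src, dst):
    """Copy bytes from src to dst until src closes, then close dst."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    dst.close()

def relay(listen_port, remote_host, remote_port):
    """Listen locally; relay every connection to the fixed remote server."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen(5)
    while True:
        client, _ = server.accept()
        upstream = socket.create_connection((remote_host, remote_port))
        # One thread per direction: client -> remote, and remote -> client.
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()
```

The point is just that the mapping is a dumb byte pipe: it doesn't speak NNTP itself, so whatever you point at the local port ends up talking to the remote server.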
If you use a wrapper like CrossOver and configure the damnable thing (not fun), then it will work OK, just not fast or stable. You should hunt down an older version, or the hacked version, as it doesn't use the licensing software from the latest betas, which will either break or simply not function in CrossOver or Wine. To point NewsLeecher at the mapping: NewsLeecher -> Usenet menu -> Manager -> Server -> Add New Server, then enter the details for connecting to the TCP Mapping. In Windows alone it's fine, since you're not quadruple-dipping into the memory pool to run a single app; otherwise it's not that great, and other Usenet Mac apps would be better (though I can't think of any). On a Mac, NewsLeecher uses up a lot of memory and HDD bandwidth, as it's writing to the virtual pagefile and the virtual HDD in Parallels.
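Before adding the server in NewsLeecher, it's worth a quick sanity check that the local mapping really is relaying NNTP. A one-function sketch (the host and port here are placeholders for wherever your mapping listens — use whatever you configured):

```python
import socket

def nntp_greeting(host="127.0.0.1", port=1190, timeout=10):
    """Connect and return the server's one-line NNTP greeting (e.g. '200 ...')."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        f = sock.makefile("rb")
        return f.readline().decode("ascii", "replace").strip()
```

If you get back a line starting with `200` (or `201` for read-only), the mapping is working and the same host/port details should work in the Add New Server dialog.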
The only reason I can think of to prefer NL to SABnzbd is that NewsLeecher offers a reseller deal for Giganews. If you don't already have a Usenet provider, they offer a Giganews + NewsLeecher deal: you get NewsLeecher + SuperSearch for free as long as you sign up for Giganews via their link and keep using Giganews. Could be wrong on that.

As for NZB sources, there are plenty of NZB indexing sites available; I can think of 5 or 6 free ones. Among the newsgroups themselves you'll find things like tv.babylon5.moderated (discussion of Babylon 5 and other projects of J.), groups for discussion of abuse of email by spammers and other parties, and groups for matters related to the functioning of Usenet itself.

As for NewsLeecher, the indexes it uses are self-gathered with a dedicated batch process. It's not an indexing site per se, as your queries need to use the specific NL protocols, and queries are sent back as XPAT requests, I believe. Not sure on that one, but I believe XPAT is used to gather parts.
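For anyone curious what an XPAT request actually looks like: it's a standard NNTP extension (RFC 2980) that pattern-matches a header across an article range. A rough raw-socket sketch follows — the group and pattern are placeholder examples, and whether NewsLeecher's own protocol maps onto this exactly is the same guess as above:

```python
import socket

def xpat(host, group, pattern, header="Subject", port=119):
    """Search `header` in `group` for wildmat `pattern`; return matching lines."""
    with socket.create_connection((host, port)) as sock:
        f = sock.makefile("rwb")
        f.readline()                                        # greeting, e.g. 200
        f.write(b"GROUP %s\r\n" % group.encode()); f.flush()
        f.readline()                                        # 211 count low high group
        f.write(("XPAT %s 1- %s\r\n" % (header, pattern)).encode()); f.flush()
        status = f.readline()                               # 221 on success
        matches = []
        if status.startswith(b"221"):
            for line in f:                                  # multi-line response
                if line.strip() == b".":                    # lone dot terminates it
                    break
                matches.append(line.decode("latin-1").strip())
        f.write(b"QUIT\r\n"); f.flush()
        return matches
```

Each returned line is an article number plus the matched header value, which is exactly the kind of data you'd need to gather the parts of a multi-part binary post.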