Newsleecher question

hearttech22

New member
After trying GrabIt and being disappointed by the constant crashes, I tried NewsLeecher, and it seems OK, if a little messy. Anyway, how do you make it download several articles at once? Right now it just downloads one at a time, and I can't find any option to increase that. Anyone? Their homepage is down, so I can't ask over there.
 
Just click and drag your mouse over the files you want to download, then right-click and select 'Leech'.

Is that what you mean? :ermm:

Or do you mean that while it's already downloading one file, you want it to start on another one at the same time?

If so, you'll have to connect to the server with multiple connections.
But I think it also depends on how big the files are.
If they're bigger than x bytes it will only dl them one at a time.
Not sure though.
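For what it's worth, the idea behind multiple connections is just downloading several articles in parallel, one per connection. Here's a rough Python sketch of that idea (the `fetch_article` stub and `leech` helper are made up for illustration, not NewsLeecher's actual code):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for downloading one article over one server connection.
def fetch_article(article_id):
    return f"body of {article_id}"

def leech(article_ids, max_connections=4):
    # One worker per server connection: with max_connections > 1,
    # several articles download at the same time instead of one by one.
    with ThreadPoolExecutor(max_workers=max_connections) as pool:
        return list(pool.map(fetch_article, article_ids))
```

With `max_connections=1` you'd get exactly the one-at-a-time behavior described above.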
 
Yes, that's exactly what I mean. :)
Filliz said:
If so, you'll have to connect to the server with multiple connections. But I think it also depends on how big the files are.
If they're bigger than x bytes it will only dl them one at a time.
Not sure though.
Very weird. With GrabIt, this was extremely easy: I just checked as many checkboxes as I wanted connections. That NewsLeecher doesn't have this functionality seems very strange. :(
 
But you have got your server details set to use multiple connections right?

[Screenshot: server connection settings]


If so, I would assume that it would dl more than one file at a time if possible. :mellow:
 
Ok, now I've got another question that I can hopefully get an answer to. :) The groups I subscribe to keep re-downloading all the headers from time to time when I select "Update group", even though I recently updated them. What could cause the program to "forget" which headers I have already downloaded and discarded?
 
I think you'll have to change some settings in your options.

View > Options > Group caching and Headers
Have a look around there :)
 
So I have to keep all the headers? So let's say my Usenet provider has 20 days' retention; then I have to set "Days to keep group headers" to 20? And what about "Download IDs"? I'm sorry if I'm asking stupid questions, but NewsLeecher's site has been down the last few days, so I can't get any help there.
 
Yes :)
And what about "Download IDs"?
Same as the headers.

I'm sorry if I'm asking stupid questions, but Newsleecher's site has been down the last few days so I can't get any help there.
No problem. :01:
It's not the first time their site has gone offline without them sending out any notice to their subscribers :pinch:
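If it helps to picture it, "Days to keep group headers" is basically an age-based prune of the local header cache, matched to your provider's retention. A rough sketch of that idea (the `prune_headers` function is hypothetical, not how NewsLeecher actually stores things):

```python
from datetime import datetime, timedelta

def prune_headers(headers, days_to_keep):
    """Drop cached headers older than the retention window.

    `headers` is a list of (article_number, posted_at) pairs.
    Anything older than `days_to_keep` days has expired on the
    server anyway, so there's no point keeping it locally.
    """
    cutoff = datetime.now() - timedelta(days=days_to_keep)
    return [(num, posted) for num, posted in headers if posted >= cutoff]
```

So with a 20-day provider, `days_to_keep=20` keeps the cache in step with what the server still has.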
 
Unfortunately, setting "Days to keep group headers" to 20 doesn't help. I just double-clicked a group to update it, and it started to download 4,000,000 headers, the oldest of which are 23 days old. This is a group that I updated and cleared earlier today. This program seems quite buggy. :(
 
Strange.
I've got it set to 50 for both headers and IDs, and when I update a group I only receive the new headers since my last visit. :mellow:

Well, here are screenshots of how I have it set up; maybe something in them is different from your settings and might fix the issue:

[Screenshot: Group Caching settings]


[Screenshot: Headers settings]
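By the way, the "only the new headers since my last visit" behavior basically comes down to remembering the highest article number already fetched, and asking the server only for what's above it. A rough sketch of that logic (the `new_header_range` function is made up for illustration; if the saved number is lost, you get exactly the download-everything-again symptom):

```python
def new_header_range(server_first, server_last, last_seen):
    """Decide which header range to request on 'Update group'.

    server_first/server_last: article-number range the server reports.
    last_seen: highest article number already in the local cache
    (0 if the cache is empty or was wiped).

    Returns (start, end) to fetch, or None if nothing is new.
    """
    start = max(server_first, last_seen + 1)
    if start > server_last:
        return None  # cache is already up to date
    return (start, server_last)
```

With a wiped cache (`last_seen=0`) this falls back to the full range, which would explain millions of headers coming down again.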
 
Yep, same settings as mine, except that I set "Days to keep..." to 20. It's strange, because sometimes it works like it should, but the next time, chances are it starts from the beginning again. Can anyone recommend another good program?
 
What version do you have? This was a bug for some users in older versions.
You could also try unchecking "Get all headers" and putting a count limit on it...
 
I'm using 2.1 final (not registered). Anyway... setting a count limit just means it downloads as many headers as the limit allows, and that's not what I want either. I want it to download only the new headers since the last update. Bah, this app sucks. :(
 
Well, like I mentioned, it's a known bug from a while back that the cache gets dumped and starts all over after closing the app, if that's what your problem is. Anyway, a count limit may work for you, although you'll have to tweak it to roughly match a time window, if that's what you want, I guess...
 