How To: Download Newsgroup Binaries Using IPv6

Torrents are getting so closely monitored these days that it can be risky to download things through them for fear of getting a copyright infringement notice (even when using things like PeerGuardian alongside your torrent client) – get two of these CINs, and the least that’ll happen is that your ISP will drop you like a hot potato. If you’re lucky. You can always look for things on MediaFire or RapidShare-type sites, but quite often the files are pulled soon after they’re uploaded and you end up looking at dead air. So how about using newsgroups?

Newsgroups were originally designed – and are still largely used – as message boards, but postings can also have files of limited size attached to them, and through the nature of newsgroups themselves and the magic of yEnc, you can post as many messages as you like to upload anything from a jpeg picture to a BluRay rip…

Usually, your access to a news server will fall into one of these three categories:

  • Your ISP has a news server, but does not subscribe to binary channels (i.e. they have the newsgroup channels with the text, but not the ones with the files)
  • Your ISP has a news server, and does subscribe to binary channels, but their selection of binary channels is meagre at best and/or the retention is low (that is, only posts over the last X days will be kept, where X is a low number, 15-25 days or so. This limits what you can download to only the recent stuff, which sometimes is fine, but mostly isn’t ideal).
  • Your ISP doesn’t have a news server, so you have to pay for one – and usually this means not only paying per amount of time, but also paying per amount you download (i.e. $25 US for 1 month’s access, 25GB limit). Pay servers often come with benefits like high retention (400/500/600+ days) and SSL options to encrypt your traffic.

As the available IPv4 address space dwindles, companies are looking to set up and test the next generation of TCP/IP addressing: IPv6 – and to do this, they sometimes let people have free accounts (and hence free access) to things. Things such as newsgroups… See where I’m heading with this? So let’s go! :D

1.) Set Up an IPv6 Address

To get yourself all set up with an IPv6 address, you don’t need an IPv6 router, or a native IPv6 address from your ISP – we can do it all encapsulated through our standard, run-of-the-mill IPv4. What we do need though is a tunnel – by which I mean some address we can connect to using IPv4, and from that address comes native IPv6 with which we’re going to get all our newsgroup stuff.

There are plenty of free IPv6 tunnel brokers out there; I decided to go with a company in LA called Hurricane Electric. So head on over and sign up for a free account. You’ll need to provide your external IP to set up the account (so they know which end of the tunnel points back to you), and you’ll need to have your router configured to accept ICMP pings from external sources (for the set-up stage only), otherwise the tunnel can’t be established because it thinks you have an intermittent connection or something. The way you change your ICMP ping settings will vary from router to router, but it should live under the firewall section of your router’s web interface.

Enable ICMP Ping Responses
Enabling external ping responses on a router running Tomato firmware

Once you’ve signed up and given them your external IP, you can set up the tunnel on their end through the web interface, where you’ll be provided with an IPv6 address which you can tunnel to using whatever OS you’re using.

IPv6 Tunnel Setup
I've scrubbed my details out - you'll get your own numbers when you get your free account

2.) Create the Local Tunnel Link

On the tunnelbroker website you can select which OS you’re using, and it’ll provide you with the commands to link your external IP to the IPv6 address they’ve given you.

IPv6 Tunnel Configuration Options
Choose your poison...

Since I’m using Linux, I’m going to go with the Linux-net-tools script, which gives me this (I’ve added comments to clarify what’s going on and dumped it all into a bash file – I hope my interpretation is right; if not, feel free to correct me):
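For what it’s worth, the net-tools version boils down to just a handful of commands. Here’s a rough sketch – run as root, and note the addresses are placeholders; your tunnel details page has the real ones:

```shell
#!/bin/bash
# Rough sketch of Hurricane Electric's net-tools tunnel setup.
# The two addresses below are PLACEHOLDERS - substitute the values
# from your own tunnel details page.

HE_SERVER_V4="216.66.x.x"          # the tunnel server's IPv4 endpoint (theirs)
CLIENT_V6="2001:470:xxxx:xxxx::2"  # the client IPv6 address allocated to you

ifconfig sit0 up                        # bring up the 6in4 (IPv6-in-IPv4) device
ifconfig sit0 tunnel ::$HE_SERVER_V4    # point the tunnel at HE's server
ifconfig sit1 up                        # bring up the resulting tunnel interface
ifconfig sit1 inet6 add $CLIENT_V6/64   # assign your allocated IPv6 address
route -A inet6 add ::/0 dev sit1        # route all IPv6 traffic down the tunnel
```

These commands need root and an actual tunnel account to do anything useful, and they don’t persist across reboots – re-run the script (or wire it into your distro’s network config) after each restart.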

3.) Connect to a News Server

For this we need two things: A binary newsgroup file downloader, and a news server to connect to.

There are a number of free IPv6 news servers available; I’ve gone with one called XSNews, based in the Netherlands, which you need to sign up for, but which appears to give you unlimited downloads at approx 100-200Kb/sec and with a retention of around 70 days – not bad for free! So, sign up for a free newsgroup access account here and keep the username/password handy. Note: the page is in Dutch, but it’s only got one field and one button, so I’m sure you can figure it out ;)

Next we need the actual application to connect to the news server. I’ve gone with one called sabnzbdplus (“sab newsboard plus”), which is free, largely platform agnostic, and has a stack of options. Sabnzbdplus works through a web interface, so grab and install it (you can get it through the repos if you’re on Linux), and it’ll set up on http://localhost:8080/sabnzbd/ by default (you can change the port easily enough in the config file, or through the web interface itself). Also by default, sabnzbdplus will fire up your default web browser and open a tab to the web interface on launch, so you shouldn’t have a hard time finding it. You should be greeted by something like this:
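If you’re on a Debian-ish distro, the install-and-run dance looks roughly like this – the package name and the port-override flag are from memory, so double-check against your distro’s docs:

```shell
# Install from the distro repos (Debian/Ubuntu package name):
sudo apt-get install sabnzbdplus

# Launch it - the web interface should come up on http://localhost:8080/sabnzbd/
sabnzbdplus

# Or override the interface address/port at launch instead of editing the config:
sabnzbdplus --server localhost:9090
```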

sabnzbdplus front page

Now click on Config | Servers and enter your XSNews server and username/password credentials:

sabnzbdplus server settings

When you enter your server details you’ll notice there’s a field called Connections – this sets how many concurrent connections to the server sabnzbdplus should try to make. Some news servers support a lot (8 or so), some 3, some only 1. If you get 50KB/sec per connection, and you’re allowed to (and do) make 8 connections, you’re looking at 400KB/sec – it works just like an HTTP download manager that can be set up to use multiple connections to get the same file. I’ve left mine on a single connection for two reasons: I don’t know how many connections XSNews accepts, and I’m not in a hurry to get things. Feel free to research and/or try different settings and see how it works out.

If you want further information about configuring sabnzbd, the official quick-setup guide lives here.

4.) Find and Download Files

Once you’ve got all that out of the way, the final piece of the puzzle is to find things to download – but from where? Well, newsgroups use .nzb files to download large files; think of them as the .torrent files of the newsgroup world. These .nzb files contain a list of all the messages required to download your file(s), amongst other things. A good place to start looking is Binsearch. Once you head on over, I’d click on the advanced tab and change your search settings to something like the below:

Binsearch advanced settings
The important bit here is that you group the messages into collections...

You can add multiple files to the .nzb you generate, but it’s probably best to keep it to a single download per NZB so you don’t have to wait for a bunch of downloads to complete before sabnzbdplus automatically extracts (and repairs, as necessary) the file(s) you’ve selected to download. Once you’ve checked the box of whatever you want to download and clicked the binsearch Create NZB button, you can download the file, then go to your sabnzbdplus web interface and add it to the queue via the Home tab.
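For the curious, an NZB is just a small XML index of Usenet messages. A stripped-down example looks roughly like this – the group, subject, poster and message-IDs here are all made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative NZB; every value below is a placeholder -->
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="someone@example.com" date="1285000000" subject="example.rar (1/2)">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="512000" number="1">part1of2.abc123@news.example.com</segment>
      <segment bytes="512000" number="2">part2of2.def456@news.example.com</segment>
    </segments>
  </file>
</nzb>
```

Your newsreader just walks the segment list, fetches each message-ID from one of the listed groups, and stitches the yEnc-decoded parts back together.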

A Quick Note about PAR2s

Let’s say that you’re downloading some file and it’s a couple of hundred MB; it’s going to come as a collection of maybe 1000 individual messages, each containing a small part of the file. But what if you can only download 950 out of the 1000 messages? Well, quite often this isn’t an issue, as people will post .PAR2 files along with the main file archive (usually split up into a large number of .RAR archive files). These parity files can be used to regenerate missing pieces of the archive and fix any errors in the parts you’ve already downloaded. If you only have a few PAR2 files and you’re missing a large section of the download, you’re not going to be able to rebuild a working archive from the pieces you have – but if you’re only missing a small percentage, and have sufficient parity files, you’ll be able to complete the missing pieces using the parity data. Neat!

As mentioned, although sabnzbdplus will try to repair your archives on extraction if it has parity files to work with, it may be the case that you didn’t have a high enough percentage of the main file to begin with, in which case the newsreader will fail to repair the archive and rename the folder to _FAILED_Whatever-You’re-Downloading. In this case, it’s sometimes possible to just go and re-download any corrupted sections to try to get a pristine copy, and then run a third-party PAR2 repair tool on the downloaded archive (such as PyPar2 on Linux) to hopefully get a corrected archive set to extract.
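If you’d rather do the repair from a terminal than through a GUI like PyPar2, the par2cmdline tool covers the same ground. The filenames below are placeholders for whatever you actually downloaded:

```shell
# Check whether the downloaded archive set is complete and intact:
par2 verify Whatever-Youre-Downloading.par2

# Attempt a repair using whatever parity blocks you have on hand:
par2 repair Whatever-Youre-Downloading.par2

# Once the repair succeeds, extract as normal:
unrar x Whatever-Youre-Downloading.part01.rar
```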

Wrap Up and Miscellaneous

I’m not a newsgroup expert by any stretch, so there might well be better/alternate/simpler methods to do all this, but the above method’s worked well for me. As I’m living in Australia, where capped ISP quotas are the norm (currently running off 25GB/month on-peak [7am through 1am] and 25GB off-peak [1am through 7am]), it’s good to be able to set your newsgroup stuff to download overnight and use up your off-peak quota before your main quota. To do this, go to Config | Scheduling in sabnzbdplus and add two rules; in my case it’s resume at 1:05am and pause at 6:55am, like so:

sabnzbdplus scheduling
Remember not to put your machine to sleep if you're downloading overnight!

I should also mention that when configuring sabnzbdplus, it’ll sometimes say you need to manually restart it for changes to come into effect. To do this, simply click on Home | Shutdown, then once it’s stopped just manually restart the app.

I’m not going to be all pious about what you download, but do keep in mind that with these free servers we aren’t using SSL to encrypt the connection, so it’s possible to eavesdrop on who’s getting what, but I don’t think it’s anywhere near as risky as torrenting currently is. And also, if you download something and like it & use it often, it’s only fair to buy a legit copy at some point to show your support and appreciation. But you already knew that – so what are you sticking around for? Get out there and have some fun! :D

Bonus IPv6 Tomfoolery: Now that you’ve got IPv6 up and running – what else can you do with it? Well, how about trying some of these!

How To: Compress a Directory of Files into Individual Archives

I’ve got a stack of files all thrown together in the same directory, and I wanted them compressed – simple enough, eh? Only thing is I wanted each file compressed to its own archive, so I can see at a glance what’s there, and if for some reason an archive gets corrupted, it’s just one file lost and I can replace it instead of having to dick around repairing corrupted “blob” archives that contain the entire bunch of files. And I want to be able to specify all files with a given file extension to compress.

Although I wouldn’t be surprised if you could do this in 4 lines of Perl, I don’t know flippin’ Perl (yet), so I wrote a bash script to do the job.

Bash, as it turns out, is a fiddly, finicky beast in that you really have to think about what the command-line will see under different circumstances and enclose variables in inverted commas or not in very precise ways (see this article to understand what I mean). All that if / fi stuff too… very odd.

To use the script, copy and paste it into a text file (in my case I’ve called it, save it, make the file executable using chmod +x and move it to /usr/bin or something so it’s in your path using sudo mv ./ /usr/bin/ – then run it inside any directory you want to zip files to individual archives by calling it with nds (for example) to compress all the .nds (Nintendo DS roms) in a folder into individual archives. Or use this link ;)
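In case you’d rather roll your own, here’s a minimal sketch of the idea – it takes the extension as its first argument, and I’ve used tar/gzip here rather than zip so it works with nothing but the base tools (the function name and the demo at the bottom are mine, not the original script’s):

```shell
#!/usr/bin/env bash
set -euo pipefail

# compress_each: compress every file with the given extension in the
# current directory into its own individual archive (one archive per file).
compress_each() {
    local ext="$1" f
    shopt -s nullglob                    # no matches -> the loop body never runs
    for f in *."$ext"; do
        tar -czf "${f%.*}.tar.gz" "$f"   # archive named after the file, minus extension
        echo "compressed: $f -> ${f%.*}.tar.gz"
    done
}

# Demo: make a scratch directory with a couple of fake ROMs and compress them.
cd "$(mktemp -d)"
touch game1.nds game2.nds notes.txt
compress_each nds
ls *.tar.gz
```

Swap the tar line for `zip "${f%.*}.zip" "$f"` if you want actual .zip archives; the per-file loop is the part that matters.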

Anyway, job done – suggestions? improvements? props? death-threats? Let me know below!

Oh, and cheers to James McDonald for his WP-Syntax hack to stop all the embedded code appearing on a single (incredibly long) line!