Upgrade Tor to TRUE latest version (in Ubuntu)

October 8th, 2006
Very out of date by now. Just use these instructions.

The Tor packages in the Debian (and Ubuntu) repositories are not up to date. This is dangerous, since people do rely on them for strong anonymity, even though the package warns them not to (it's still the best option out there). To get the latest stable version, you'll have to add another repository to your sources.list file and set your system to trust it.


Adding Dublin Core metadata to WordPress posts

October 1st, 2006

WordPress does not have a built-in mechanism for automatically adding standardized metadata to pages. Today I wrote a plugin to add Dublin Core metadata to all posts and pages. It currently supports the following metadata:

  • Site name as DC.publisher
  • Site URL as DC.publisher.url
  • Post title as DC.title
  • Permalink as DC.identifier
  • Date created as DC.date.created

Install it by downloading the latest version from the DC 4 WP page (version 0.2 at this time), unpacking the .zip archive, and dropping the .php file into your WordPress installation's wp-content/plugins folder.
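To illustrate the mapping in the list above, here is a rough Python sketch of the <meta> tags the plugin would emit into a page's <head>. The function name and sample values are my own invention for illustration; the actual plugin is a PHP file.

```python
# Hypothetical sketch of the Dublin Core mapping above; the real plugin
# is PHP, and these field values are invented for illustration.

def dublin_core_meta(site_name, site_url, title, permalink, date_created):
    """Render the five supported fields as <meta> tags for a page's <head>."""
    fields = [
        ("DC.publisher", site_name),
        ("DC.publisher.url", site_url),
        ("DC.title", title),
        ("DC.identifier", permalink),
        ("DC.date.created", date_created),
    ]
    return "\n".join(
        '<meta name="%s" content="%s" />' % (name, value)
        for name, value in fields
    )

print(dublin_core_meta(
    "Example Blog",
    "http://example.com",
    "Hello World",
    "http://example.com/hello-world",
    "2006-10-01",
))
```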

This is my first plugin, so any and all constructive feedback would be greatly appreciated!

Wanted: Spam trap extension for Mozilla Thunderbird

June 16th, 2006

I'd like to see someone write a spam-trap extension for Mozilla Thunderbird that would simply delete any messages that match messages from a spam-only account. I'd be willing to pay for such an extension.

Concept

I first saw this idea in use on unstable.nl. At the bottom of the page was this puzzling declaration:

spam-trap@unstable.nl - Please send spam.
Humans may write to andreas@unstable.nl

I presume that Andreas has programmed his mail client or retriever to delete from andreas@ any messages that are identical or similar to messages that arrive at spam-trap@. I later contacted him on Jabber, and he confirmed my suspicions, adding that he sees only one piece of spam per week. I was impressed.

Specification

A Mozilla Thunderbird extension could easily implement this concept. Have the user specify an address they own but don't use, such as an outdated Hotmail account. Then delete from the user's real accounts any messages that are identical or similar to those arriving at the trap address. Defining "similar" is the hard part, of course, but I have some ideas:

  • Compute a quick hash of each embedded attachment (otherwise large attachments could have a disproportionate effect on the comparison)
  • Run a diff over the textual parts of each message
  • Strip query strings from URLs and embedded forms (query strings often embed a hashed copy of the recipient's email address)
  • Compare selected email headers
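To make the bullet points concrete, here is a minimal Python sketch of the comparison logic, assuming messages arrive as simple dicts with a text body and a list of attachment bytes. All function names and the 0.9 similarity threshold are my own placeholders, not part of any existing extension.

```python
# Sketch of the similarity ideas above: hash attachments, strip query
# strings from URLs, and compare the remaining text with a diff ratio.
import difflib
import hashlib
import re
from urllib.parse import urlsplit, urlunsplit

def attachment_fingerprints(attachments):
    """Quick hash of each attachment so bulky binaries don't dominate."""
    return {hashlib.sha256(data).hexdigest() for data in attachments}

def normalize_text(body):
    """Strip query strings from URLs, since they often encode the recipient."""
    def strip_query(match):
        parts = urlsplit(match.group(0))
        return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return re.sub(r"https?://\S+", strip_query, body)

def looks_like_trap_spam(candidate, trap_message, threshold=0.9):
    """True if a message closely matches one caught at the trap address."""
    # Shared attachment: almost certainly the same mailing.
    if attachment_fingerprints(candidate.get("attachments", [])) & \
       attachment_fingerprints(trap_message.get("attachments", [])):
        return True
    # Otherwise, diff the normalized bodies.
    ratio = difflib.SequenceMatcher(
        None,
        normalize_text(candidate["body"]),
        normalize_text(trap_message["body"]),
    ).ratio()
    return ratio >= threshold
```

A real extension would also need to fold in the header comparison idea; the threshold would have to be tuned against how aggressively spammers randomize their text.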

Research

I don't know much about email headers or routing, so I don't know in what ways spam messages sent in the same session resemble or differ from one another. Some research into this would be necessary. Perhaps public data on it already exists.

Problems

This filtering technique could be circumvented if spammers started sending messages with more randomization and scrambling. Additionally, if it became popular, unforeseen loopholes would undoubtedly arise. In either case, however, I am certain that spammers would be required to use more processing power, and therefore to incur more cost.

Bounty

This is a cool enough idea to warrant a bounty, especially if research is required. I would be willing to pay $50 out of my own pocket for the first successful solution, and I'm sure others would be willing to contribute. Alternatively, if someone can find a fatal flaw in the idea before any serious work is done, I am willing to pay that person $5-10. (I might pay more if they devise a new specification that is not vulnerable to the same flaw.)

A "successful solution" is defined as open-source/free software that is cross-platform, reasonably non-buggy, and able to implement at least the core feature of the request (here, deletion of mail on one account upon receipt of a similar message on another). A "fatal flaw" is defined as a reasonably simple concept or proof of concept which, if implemented, would defeat any reasonable solution.

Please, if you plan to implement this idea, leave a note here so that people are not duplicating efforts. If there is a change in status, I will notify every person who leaves a comment, unless they request otherwise. (Yeah, I know, opt-out emailing...)

Are you willing to pledge bounty money for an implementation? Leave a note here to motivate potential developers. (Your pledge isn't binding, even though mine is.)

Download a torrent from behind a firewall

April 17th, 2006
Problem
Is your ISP practicing traffic shaping so restrictively that you can't use torrents?
Solution
Use an encryption-enabled torrent client, such as Azureus.
  1. Get Azureus from azureus.sourceforge.net. (You need to have Java installed on your machine first.)
  2. Run and configure Azureus:
    • Skip any updates, because they rely on torrents.
    • Set your proficiency level to Intermediate, so you can change the encryption settings later.
  3. Go to Tools -> Options -> Connections -> Transport Encryption, and require encryption.
  4. Under Mode, set your proficiency to Beginner. (Beginner has all the options you'll ever need as a downloader.)
  5. Restart Azureus, and let it do any updates it deems necessary.
Explanation
Encrypting the transport layer prevents your ISP from determining what kind of traffic is passing through. If it can't tell it's a torrent, it can't slow or restrict it.
Notes
If you are only trying to avoid traffic shapers, set the minimum encryption level to Plain, which only obfuscates the packet headers. If plain encryption fails, Azureus will automatically escalate to RC4, which encrypts the entire packet. If you are also trying to avoid being caught downloading commercial music or software, you should probably set RC4 as the minimum. Actually, the best way to avoid being caught pirating is to use an extension like SafePeer; encryption won't help there, because your IP address is still visible to other peers.

Edit the web

April 16th, 2006

What if web surfers could edit any page on a website? What if webmasters could get webcorrections from users? What if readers could fix typos in blog posts, without leaving nitpicky comments? I've got a plan...
