This page contains a few programs that I have made. All programs are free in the sense that I retain the copyright to all of them, but you and anybody else may download and use them free of charge. Note that this also gives you permission to redistribute the programs, as long as it is done without charge of any kind and without modifying the files.
I originally made these programs for my own use, meaning they are developed in my spare time and are as such not "professional" programs. I use them extensively myself, but I can give you no guarantee that they will work perfectly in any given environment for any given task. I provide these programs as is, without any responsibility. Should you experience any errors in the programs, please send me a mail with sufficient information for me to reconstruct the problem, and I will see if I can do anything about it.
The list of programs may grow as time goes by. I have several more programs in store, and should I decide that they could be useful to anybody else, and that they are in a "worthy" state to be published, I may put them here. Until then, I hope you enjoy the ones already here! Click a title to start the download (all files are zipped).
Note that all programs are 32-bit console programs.
WComp is a program that compresses wave files! We are not talking about anything close to what compression programs usually achieve on ordinary files: those of you who have tried to zip a wave file will know that if you get a compression of 5-10%, you are lucky! Wave files are generally very difficult to compress without loss of quality.
WComp performs lossless compression, generally on the order of 30%, but for "simpler" files, more can be expected.
In total, five different compression methods are implemented (all 16-bit):
Huffman (adaptive) is the default mode. Fast and good!
Arithmetic coding is theoretically better than Huffman coding and will usually yield a marginally better result, and after the latest optimization it is almost as fast as Huffman too! In extreme cases, for instance a wave file where the majority of the samples are equal, it proves significantly better than Huffman. In the most extreme case, it compresses a file of several megabytes of silence to just a few bytes plus a header!
Fibonacci coding will always give a compression of 62.5%! However, it is NOT lossless! Give it a try and see if you can find a use for it somewhere where a minor distortion of the signal does not matter.
Run-length is a very simple coding that only works wonders if the file contains areas where the samples are equal!
Static-tree Huffman is the same as Huffman coding but not adaptive, meaning it is very fast but compresses less well.
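To give a feel for the simplest of the five methods, here is a minimal run-length coding sketch in Python; the (value, count) pairing is illustrative only and is not WComp's actual on-disk format:

```python
def rle_encode(samples):
    """Run-length encode a sequence as (value, count) pairs.
    Illustrative only -- not WComp's actual format."""
    out = []
    for s in samples:
        if out and out[-1][0] == s:
            out[-1][1] += 1          # extend the current run
        else:
            out.append([s, 1])       # start a new run
    return [(v, c) for v, c in out]

def rle_decode(pairs):
    """Expand (value, count) pairs back to the original sequence."""
    result = []
    for value, count in pairs:
        result.extend([value] * count)
    return result
```

A long stretch of silence (thousands of equal samples) collapses into a single pair, which is exactly the case where this method shines.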
The program is quite intelligent with respect to guessing filenames and extensions (the default extension for a compressed file is 'whc'), and it also stores information about the compression method used in the header, so you do not have to remember it. In fact, if you look at the file in a hex viewer, you will find the wave file's uncompressed header starting at the 12th byte, should you need to inspect it without decompressing the file. The program also has some limitations: it only works on 16-bit stereo files, and it only compresses the "body" of the wave file; other parts of the file, such as a copyright notice, are ignored. If anybody needs it, I may consider enhancing the program to include other formats as well. Note also that the program is NOT tested (and not expected to work) with files larger than 2 GB in size. However, for now, I consider that problem academic.
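Based on the layout described above, a few lines of Python could peek at the embedded wave header without decompressing the file. The 12-byte offset comes from the text; the 44-byte length is the size of a canonical RIFF/WAVE header and is my assumption, not something the program documents:

```python
def peek_wave_header(whc_bytes, length=44):
    """Return the embedded, uncompressed wave header of a .whc file.
    Per the description above it starts at byte 12; the default
    44-byte length (canonical RIFF/WAVE header) is an assumption."""
    return whc_bytes[12:12 + length]
```

A standard wave header begins with the ASCII tag "RIFF", so that is what you would expect to see at offset 12 of a .whc file.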
GREP probably needs no further introduction, since it is a well-known program that originally appeared in the UNIX environment. Borland created a fantastic version of it for DOS back in the early nineties, I think it was. I have made my version of grep very close to Borland's version, with a few new options inspired by Unix's egrep, such as () and | to separate alternatives, plus a lot of other options that I have had a use for over the years. Check out the possibilities by typing "grep" or "grep ?".
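The () and | alternation syntax mentioned above behaves like standard regular-expression alternation. As a rough sketch (not the actual grep source), the core matching loop amounts to this in Python:

```python
import re

def grep_lines(pattern, lines):
    """Return the lines matching the pattern, egrep-style.
    Demonstrates alternation with () and | only."""
    rx = re.compile(pattern)
    return [line for line in lines if rx.search(line)]
```

For example, the pattern `(bar|qux)` matches any line containing either alternative.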
DU is inspired by the good old UNIX program of the same name. DU is an abbreviation of "Disk Usage": it sums up the total size of the files in all directories from the specified path (the current directory if none is specified) and downwards. It can also produce a total report (using -r) with a sum for each directory in the tree, but in general I do not find that feature all that useful.
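The core of what DU computes, summing file sizes down a directory tree, can be sketched in a few lines of Python (a simplification of the real tool, of course):

```python
import os

def du(path):
    """Sum the sizes of all files under path, recursing into
    subdirectories -- the basic computation DU performs."""
    total = 0
    for dirpath, dirnames, filenames in os.walk(path):
        for name in filenames:
            fp = os.path.join(dirpath, name)
            if os.path.isfile(fp):   # skip broken links etc.
                total += os.path.getsize(fp)
    return total
```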
HD is actually just another hex dumper; however, it has some nice features that I use a lot myself. It can generate output in various formats, start the dump at a given position in the file, pause after every specified number of lines, and so on. Run it with /? as an option, and it will explain it all to you itself.
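The classic hex-dump layout that a tool like HD produces, offset, hex bytes, and printable ASCII, looks like this as a Python sketch (the exact column format is illustrative, not HD's own):

```python
def hex_dump(data, width=16):
    """Format bytes as offset, hex bytes and printable ASCII --
    the classic hex-dump layout (illustrative format only)."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hexpart = " ".join(f"{b:02x}" for b in chunk)
        text = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{off:08x}  {hexpart:<{width * 3}} {text}")
    return "\n".join(lines)
```

Non-printable bytes are shown as a dot in the ASCII column, just as most hex dumpers do.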
SDel is just a smart "delete" command that works on subdirectories as well. It can delete files of any given name, including wildcards; it can delete files in directories of a given name; it can delete empty directories; and, very importantly, it has a "test" mode where nothing is deleted, so you can verify that you have specified the directory and file names correctly before actually running it.
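The idea of a recursive wildcard delete with a "test" mode can be sketched as follows; this is my illustration of the concept, not SDel's actual code or option names:

```python
import fnmatch
import os

def sdel(root, pattern, test_mode=True):
    """Report (and, unless in test mode, delete) files matching a
    wildcard pattern in root and all its subdirectories."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in fnmatch.filter(filenames, pattern):
            path = os.path.join(dirpath, name)
            hits.append(path)
            if not test_mode:
                os.remove(path)
    return hits
```

Defaulting to test mode is the safe design choice: you see exactly which files would go before anything is actually removed.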
Last revised: 2015-10-30