Photo Compression in an Overcompressed Nutshell
Reducing the file size of an image works in one of two ways: 1. encode repeated patterns and blocks of the same color, like a zip file or digital shorthand, which produces a smaller file that then requires processor time to re-expand; or 2. throw away the data that will not be missed at the intended display resolution, as 'lossy' formats such as jpeg do. This is why, when you try to zoom in on most web images you have stored on your disk, you get only useless empty magnification.
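The first, "zip-style" approach can be sketched with Python's standard zlib module: a block of one flat color shrinks to almost nothing, while noisy photo-like data barely compresses at all (the exact byte counts here are illustrative and will vary).

```python
# Lossless ("zip-style") compression: runs of identical pixels
# compress extremely well, while random noise barely shrinks.
import random
import zlib

# A 100x100 "image" of one flat color, 3 bytes per pixel.
flat = bytes([200, 150, 100]) * (100 * 100)

# The same amount of random noise -- the worst case for a lossless coder.
rng = random.Random(0)
noise = bytes(rng.randrange(256) for _ in range(len(flat)))

flat_packed = zlib.compress(flat)
noise_packed = zlib.compress(noise)

print(f"flat:  {len(flat)} -> {len(flat_packed)} bytes")
print(f"noise: {len(noise)} -> {len(noise_packed)} bytes")
```

This is why a lossy format wins on real photographs: photographic detail looks much closer to the noise case than the flat case, so a lossless coder gains little, and the only way to shrink the file further is to discard detail.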
I am not familiar with the programs mentioned above, and I am pretty much spoiled by PhotoShop. When I am putting an image on the web, the first thing I do is crop out extraneous periphery; next I fix contrast and color balance, then I resize the image to 640 pixels wide. This blurs the image somewhat, since the new pixel grid is unlikely to line up perfectly with the original, so I then apply the 'unsharp mask' to artificially restore an appearance of clarity. PhotoShop then presents me with a side-by-side view of the original image and a proposed jpeg compression, and I visually adjust the quality (compression) setting to find the smallest file size that still carries adequate clarity. I have the entire process scripted, so it takes only 15 or so seconds if the original image is of good quality.
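For anyone without PhotoShop, the same crop / resize / unsharp-mask / save-as-jpeg sequence can be approximated with the free Pillow library. This is only a rough sketch of the workflow described above, not PhotoShop's actual pipeline, and the filter parameters and quality value here are my own assumed starting points, not anyone's tuned settings.

```python
# A rough Pillow approximation of the crop -> resize -> unsharp mask
# -> jpeg workflow. Parameter values are illustrative assumptions.
from PIL import Image, ImageFilter

def prep_for_web(img, crop_box=None, target_width=640, quality=75):
    """Crop, downscale to target_width, re-sharpen, and return a copy."""
    if crop_box:
        img = img.crop(crop_box)  # trim extraneous periphery
    # Resize to the target width, keeping the aspect ratio; the
    # resampling step is what softens the image slightly.
    height = round(img.height * target_width / img.width)
    img = img.resize((target_width, height), Image.LANCZOS)
    # Unsharp mask restores an appearance of clarity after resizing.
    img = img.filter(
        ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
    return img

# Usage (quality would be picked by eye, as described above):
#   out = prep_for_web(Image.open("photo.jpg"))
#   out.save("photo_web.jpg", quality=75, optimize=True)
```

Pillow has no side-by-side quality preview, so the interactive part of the workflow (eyeballing the smallest acceptable quality setting) would still be done by saving at a few quality values and comparing.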
I believe many of the photo upload and hosting sites automate the compression process, and I understand there is even a free add-on in the vbulletin user community that will accept oversize pictures and then reduce and compress them to fit the limits. Perhaps I should investigate that, though I hate to do anything that would further bog down the speed of the server.