Could somebody enlighten me? For as long as I can remember, data compression techniques have existed whereby a data file can be shrunk to a fraction of its original size and then, when required, restored exactly to its original form. For DOS/Windows the ZIP programs are the most popular exponents of this. Compressing a colour bitmap using winzip gives upwards of 65% compression without any eventual loss of quality (once restored).
In addition to zip technology, operating systems such as Windows have, for years, offered on the fly file compression, for those getting desperate for disk space. Although this slows down operation it does work well, files being compressed as they are written to disk and de-compressed as they are opened for use.
Why, then, is the arguable de facto standard for image compression, JPEG, one which loses data each time the file is re-saved? I can't see any technical reason for this.
Any ideas anybody?
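For anyone wanting to see the lossless round-trip Ian describes in action: a minimal Python sketch using the standard `zlib` module, which implements the same DEFLATE algorithm that ZIP/WinZip use. The "pixel" data here is made up purely for illustration.

```python
import zlib

# A bitmap-like byte string with plenty of repetition, as flat colour areas give.
original = b"\x00\x10\x20" * 100_000  # ~300 KB of repeating "pixel" data

compressed = zlib.compress(original, 9)  # level 9 = maximum compression
restored = zlib.decompress(compressed)

# Lossless: the restored data is byte-for-byte identical to the original.
assert restored == original
print(f"original: {len(original)} bytes, compressed: {len(compressed)} bytes")
```

Because the input repeats, DEFLATE shrinks it dramatically, yet decompression reproduces every byte exactly, which is the property JPEG deliberately gives up.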
My understanding of this comes down to usability and market share. Take VHS v Betamax: Betamax gave the better quality and is used by broadcasters, while VHS is a lower but still acceptable quality used by us home users. JPEG for the average user is far more convenient in that you don't have to uncompress the file to view it (well, at least the software will do that for you). It's an instant image at good enough quality to display at a sensible file size. However, JPEG is not a lossless compression, and image data is lost (as with VHS).
Ian - in the old days, when memory costs, lack of technology etc. were considerable, data compression systems were introduced (ZIP is a spin-off of this technology) to help keep costs down. However, there were several competing systems and no official standard. In the digital imaging era an international committee was formed (the Joint Photographic Experts Group) to write such an "imaging standard" - hence the JPEG algorithms for still high-resolution images. The sister system is MPEG for moving pictures.
If you wish to go into the technical details of JPEGs, then try this link.
I understand the technology of ZIP and JPEG; my point is that, with all the technology available, there is no reason why the standards bodies could not have made JPEG a lossless compression method. Software which reads JPEG images has to know how to compress the files on writing, so it should also be able to decompress them when reading, to revert to the original image.
The JPEG compression algorithm was not designed to be lossless, as it was never intended to transmit images for high-quality print etc. At the highest quality settings you should see no loss of quality (with the human eye), as the algorithm was designed to exploit limitations of the human eye.
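The irreversible step in JPEG is quantization: frequency coefficients are divided by a quantization step and rounded, so the decoder can never recover the exact originals. A toy illustration in plain Python, with made-up coefficient values (this is not a real codec, just the rounding idea):

```python
# Toy illustration of JPEG-style quantization (not a real codec):
# coefficients are divided by a quantization step and rounded, so the
# exact original values cannot be recovered on decode.
coefficients = [151, -29, 7, 3, -2, 1, 0, 0]  # made-up DCT-like values
Q = 16  # quantization step; a larger step means smaller files but more loss

quantized   = [round(c / Q) for c in coefficients]   # what gets stored
dequantized = [q * Q for q in quantized]             # what the decoder sees

print(quantized)    # [9, -2, 0, 0, 0, 0, 0, 0]
print(dequantized)  # [144, -32, 0, 0, 0, 0, 0, 0] -- close, but not equal
```

Note how quantization turns many small coefficients into zeros: those long runs of zeros are what then compress so well, which is why JPEG files are so much smaller than a ZIP of the raw pixels.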
Having said that, the newer JPEG 2000 standard is lossless at the highest quality setting and will hopefully become a widely used standard soon. See JPEG2000 info
Out of interest I took an image that in RAW format is 11.2 MB and saved in a number of formats to compare file size:
ZIP (raw image zipped) - 5.46MB
JPEG (highest quality) - 1.72MB
JPEG 2000 (lossless) - 3.51MB
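For a quick sense of what those sizes mean, the compression ratios implied by the figures above can be worked out directly (sizes in MB, taken from the list):

```python
# Compression ratios implied by the file sizes above (raw image = 11.2 MB).
raw_mb = 11.2
sizes_mb = {
    "ZIP": 5.46,
    "JPEG (highest quality)": 1.72,
    "JPEG 2000 (lossless)": 3.51,
}

ratios = {name: round(raw_mb / size, 2) for name, size in sizes_mb.items()}
print(ratios)  # {'ZIP': 2.05, 'JPEG (highest quality)': 6.51, 'JPEG 2000 (lossless)': 3.19}
```

So even lossless JPEG 2000 roughly triples the saving of a plain ZIP on this image, while lossy JPEG manages about 6.5:1.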
Ben, not being a computer geek, I am more than a little confused as to why TIFF is not used more for sending picture files. I was given to understand that TIFF would result in the best image transmission.
Stan, you are correct: TIFF is traditionally the best way to store image files with no loss of information. However, the file sizes are large. This was not a problem when you were not using the internet etc. to transfer data (as when the TIFF format was developed). It's really only since the internet has taken hold that people have been taking serious looks at compressing image data even further to speed up transfer. Until recently the developers took the view that compressed images only needed to be as good as the human eye could see, and so spent time and effort developing standards like JPEG that produce that result. Now that more and more people are using the internet to communicate, quality is starting to become a more important factor, and so standards like JPEG 2000 are being developed to cope with the new demand.
I guess, like everything, it's a case of the trade-offs needed to fulfil your needs.
In my last experiment I didn't mention that the time taken to open the image also varies. The raw image on my computer takes just under a second to open, while, using the same software, the JPEG 2000 image takes almost 10 seconds - again, a trade-off between time and file size.
No doubt some clever person will develop a whole new algorithm that will compress images even further with no loss of quality - until then...
Ian, techniques for compressing software and text data produce very good results because there is a large degree of redundancy in those types of files. A photograph does not typically contain large areas of exactly the same data, so traditional compression techniques are not terribly effective. There is a "lossless" option when writing a JPEG, and there are also a number of other compressed lossless formats such as TIFF LZW and PNG.
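Big Bri's point about redundancy is easy to demonstrate: the same general-purpose compressor shrinks repetitive data enormously but gets almost nowhere on noise-like data, which is a rough stand-in for photographic detail. A small sketch using Python's standard `zlib` and `os.urandom`:

```python
import os
import zlib

# Highly redundant data (like text or flat graphics) vs incompressible
# noise (a stand-in for photographic detail, which has little exact repetition).
redundant = b"the quick brown fox " * 5_000   # 100,000 bytes
noisy     = os.urandom(100_000)               # 100,000 random bytes

comp_redundant = zlib.compress(redundant)
comp_noisy     = zlib.compress(noisy)

print(len(comp_redundant))  # a few hundred bytes
print(len(comp_noisy))      # barely smaller than 100,000 -- no redundancy to exploit
```

This is exactly why lossless ZIP does so well on bitmaps with flat colour areas but comparatively poorly on real photographs, and why JPEG creates its savings by discarding detail instead.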
The JPEG (Joint Photographic Experts Group) format was designed to reduce transport times over the internet, while losing data that it was felt the human eye would ignore. For instance, blues are typically compressed more than other colours, as the human eye does not distinguish between shades of blue as well as between other colours. As has been said above, it is a trade-off between size and quality.
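One concrete way JPEG exploits the eye's weaker colour perception is chroma subsampling: the image is converted from RGB to a brightness channel plus two colour channels, and the colour channels are stored at reduced resolution. A toy sketch of 4:2:0-style subsampling on one made-up 4x4 chroma channel (values are invented for illustration):

```python
# Toy sketch of 4:2:0-style chroma subsampling: each 2x2 block of a
# colour (chroma) channel is replaced by its average, so only a quarter
# of the colour values are stored. Brightness is kept at full resolution.
chroma = [
    [100, 102, 200, 198],
    [ 98, 100, 202, 200],
    [ 50,  52, 150, 152],
    [ 48,  50, 148, 150],
]

subsampled = [
    [(chroma[r][c] + chroma[r][c + 1]
      + chroma[r + 1][c] + chroma[r + 1][c + 1]) // 4
     for c in range(0, 4, 2)]
    for r in range(0, 4, 2)
]
print(subsampled)  # [[100, 200], [50, 150]] -- 4 values instead of 16
```

The small colour variations within each 2x2 block are lost for good, but at normal viewing sizes the eye rarely notices.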
The PNG (Portable Network Graphics) format is a newer format that was intended to replace the JPEG for internet use, as it was compressed and supports 16 million colours (like JPEG) but also supports an alpha layer, so you can have proper transparency, thus replacing the GIF format as well.
Unfortunately PNG doesn't seem to have taken off, although we use it at work as it is immensely useful.
JPEG 2000 has now appeared and uses a completely different compression algorithm. We have tested it against JPEG and found we can get a 40MB TIFF file down to under 400KB and still have good enough quality for a consumer print at 10x15 inches. That's a 100:1 compression ratio!
If you are in a pro environment, I would recommend always using a lossless format such as TIFF.
Thanks Ben & Big Bri.
You have clearly answered my question/query.