It's also important to note that every compression algorithm makes a specific trade-off between compression ratio and speed. Take the general-purpose 7-Zip as an example: its LZMA offers one of the best compression ratios available, but it is quite computation-intensive, which can be a downside when you are looking for something to use on a mobile device. Sometimes the extra compression ratio is not a good enough reason to pick one compressor over another that compresses somewhat worse but needs far less computing power. We recently dropped LZMA from a mobile project because the load times were unsatisfying on the Android/ARM platform (20 seconds with LZMA, 5 without), even though the difference wasn't noticeable on the Win32 dev platform. You should try several solutions before choosing the one that suits your needs.

Also note that the time needed for compression sometimes doesn't matter at all: if the final product ships only the decompressor to load its data, a library that pairs a fast general-purpose compression method with a slow ultra-compression mode you run once before release can be an excellent fit.
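To make "try several solutions" concrete, here is a minimal benchmarking sketch using Python's standard-library codecs (zlib, bz2, lzma). The input file name is a placeholder; the point is to compare ratio against decompression time on your own data, ideally on the target device rather than the dev machine:

```python
import bz2
import lzma
import time
import zlib

# Hypothetical input file -- substitute your real asset data.
data = open("game_assets.bin", "rb").read()

codecs = {
    "zlib": (zlib.compress, zlib.decompress),
    "bz2":  (bz2.compress,  bz2.decompress),
    "lzma": (lzma.compress, lzma.decompress),
}

for name, (compress, decompress) in codecs.items():
    packed = compress(data)
    start = time.perf_counter()
    decompress(packed)  # load-time cost is usually what matters on device
    elapsed = time.perf_counter() - start
    ratio = len(packed) / len(data)
    print(f"{name}: ratio={ratio:.3f}, decompress={elapsed * 1000:.1f} ms")
```

Run on a representative asset, this quickly shows whether the extra ratio from LZMA is worth the decompression cost on your hardware.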

Solution 2

Requiring the compression to be lossless makes a big difference compared with lossy algorithms. Lossy algorithms differ dramatically depending on the nature of the data; with lossless algorithms this is much less the case. It all depends on how much you want to tailor your algorithm to the specific nature of the data. Even with lossless compression, exploiting the nature of the data can improve the compression ratio considerably, but doing so is much more difficult. At the same time, general-purpose lossless algorithms are well known and open-source implementations are available, so I'm not sure you even need to develop anything. (Can you find such an implementation yourself, or do you need some help?)
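As a minimal illustration of using a readily available implementation instead of developing one, here is a round trip through Python's zlib; the sample payload is made up, and the assertion demonstrates the lossless guarantee of getting the exact bytes back:

```python
import zlib

original = b"payload " * 1000                 # repetitive data compresses well
packed = zlib.compress(original, level=9)     # level 9 = best ratio, slowest
assert zlib.decompress(packed) == original    # lossless: identical bytes back
print(f"{len(original)} -> {len(packed)} bytes")
```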