There are many different ways to compress data; here are a few common algorithms:
Index Compression
Values are saved in a flat store, with an index number referencing each value. The more duplicate values your data contains, the better this performs.
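The idea can be sketched in a few lines of JavaScript (`indexPack` and `indexUnpack` are hypothetical names, not a specific library):

```javascript
// Index compression sketch: each unique value is stored once in a flat
// dictionary, and the data itself becomes a list of index numbers.
function indexPack(values) {
  var dict = [];
  var indexes = values.map(function (v) {
    var i = dict.indexOf(v);
    if (i === -1) {
      dict.push(v);
      i = dict.length - 1;
    }
    return i;
  });
  return { dict: dict, indexes: indexes };
}

function indexUnpack(packed) {
  return packed.indexes.map(function (i) {
    return packed.dict[i];
  });
}
```

For example, `indexPack(['red', 'blue', 'red', 'red'])` stores `'red'` once and references it three times by index `0`.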
Shared Dictionary Compression
Similar to index compression, but the dictionary of values is stored separately from the compressed string, which greatly reduces its size. However, the dictionary file must be downloaded and stored before compression and decompression can happen.
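A minimal sketch of the shared-dictionary idea (the helper names and the example dictionary are my own, for illustration): because both sides already hold the dictionary, the payload contains only index numbers, never the values themselves.

```javascript
// Shared-dictionary sketch: the dictionary is agreed on ahead of time
// (e.g. downloaded once and cached), so only indexes are transmitted.
var sharedDict = ['London', 'Paris', 'New York']; // stored/served separately

function dictPack(values, dict) {
  return values.map(function (v) {
    return dict.indexOf(v);
  });
}

function dictUnpack(indexes, dict) {
  return indexes.map(function (i) {
    return dict[i];
  });
}
```

So `dictPack(['Paris', 'Paris', 'London'], sharedDict)` yields just `[1, 1, 0]`, and the receiver rebuilds the values with `dictUnpack` and its own copy of the dictionary.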
Huffman Compression
Loops through every character, calculating the probability of how often each one is used, then builds a tree of values that is navigated by moving left or right, so that frequent characters end up with shorter codes.
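A simplified sketch of those steps (function names are my own; real Huffman coders also pack the bits and serialise the tree, which is omitted here): count frequencies, repeatedly merge the two least-frequent nodes into a tree, then read each character's code off the tree, with left as "0" and right as "1".

```javascript
// Huffman sketch: frequent characters get shorter bit codes.
function huffmanCodes(text) {
  var freq = {};
  for (var i = 0; i < text.length; i++) {
    freq[text[i]] = (freq[text[i]] || 0) + 1;
  }
  // one leaf node per character
  var nodes = Object.keys(freq).map(function (ch) {
    return { ch: ch, weight: freq[ch] };
  });
  // merge the two lightest nodes until a single tree remains
  while (nodes.length > 1) {
    nodes.sort(function (a, b) { return a.weight - b.weight; });
    var left = nodes.shift();
    var right = nodes.shift();
    nodes.push({ weight: left.weight + right.weight, left: left, right: right });
  }
  // walk the tree: left appends "0", right appends "1"
  var codes = {};
  (function walk(node, code) {
    if (node.ch !== undefined) {
      codes[node.ch] = code || '0'; // single-character edge case
      return;
    }
    walk(node.left, code + '0');
    walk(node.right, code + '1');
  })(nodes[0], '');
  return codes;
}

function huffmanEncode(text, codes) {
  return text.split('').map(function (ch) {
    return codes[ch];
  }).join('');
}
```

For `'aaaabbc'` the frequent `'a'` gets a 1-bit code while `'b'` and `'c'` get 2 bits, so the whole string encodes to 10 bits instead of 56.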
LZ Compression
Uses the family of algorithms behind zip, which match repeating patterns in your binary data. This gives even better results than index compression but requires more CPU and time.
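One simple member of the LZ family is LZW, sketched below (this is not the exact DEFLATE algorithm zip uses, just an illustration of the pattern-matching idea): repeated sequences are replaced with dictionary codes that are learned as the input is scanned.

```javascript
// LZW sketch: the dictionary grows as patterns repeat, so longer and
// longer matches get replaced by single codes.
function lzwCompress(text) {
  var dict = Object.create(null); // null prototype avoids key collisions
  var next = 256;
  for (var i = 0; i < 256; i++) {
    dict[String.fromCharCode(i)] = i;
  }
  var out = [];
  var phrase = '';
  for (var j = 0; j < text.length; j++) {
    var ch = text[j];
    if (dict[phrase + ch] !== undefined) {
      phrase += ch;                // keep extending the current match
    } else {
      out.push(dict[phrase]);      // emit code for the longest match
      dict[phrase + ch] = next++;  // learn the new, longer pattern
      phrase = ch;
    }
  }
  if (phrase !== '') {
    out.push(dict[phrase]);
  }
  return out;
}
```

For example, `lzwCompress('abababab')` emits only 5 codes for 8 characters, because `'ab'` and `'aba'` are learned and reused; the decompressor rebuilds the same dictionary symmetrically while decoding.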
Each compression technique has different advantages and disadvantages depending on your data type and the processing power available. However, the options narrow when you need to run these algorithms in the browser.
I've created my own version of index compression which supports strings, arrays and JSON:
http://jsfiddle.net/kmturley/MCzN7/13/
JavaScript performance tests of different methods and libraries:
http://jsperf.com/json-compression/5
From my tests, JSONH is not only the fastest library to pack and unpack, it also produces a fairly small string, and the library itself is not very large. So I would choose it over any other method for client-side compression.
Here's a real-world example of compressing data using JSONH, then storing it in localStorage for retrieval later on:
function set(key, value) {
    var item = JSONH.pack(value);
    localStorage.setItem(key, JSON.stringify(item));
    return item;
}

function get(key) {
    var item = JSON.parse(localStorage.getItem(key));
    return JSONH.unpack(item);
}

var packed = set('people', data);
var unpacked = get('people');
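To see why this saves space, here is a sketch of the idea behind homogeneous-JSON packing (illustrative only, with my own function names, and not necessarily JSONH's exact wire format): when every object in an array has the same keys, the keys only need to be stored once.

```javascript
// Pack an array of same-shaped objects into one flat array:
// [keyCount, ...keys, ...values row by row]
function packHomogeneous(rows) {
  var keys = Object.keys(rows[0]);
  var flat = [keys.length].concat(keys);
  rows.forEach(function (row) {
    keys.forEach(function (k) {
      flat.push(row[k]);
    });
  });
  return flat;
}

function unpackHomogeneous(flat) {
  var n = flat[0];
  var keys = flat.slice(1, 1 + n);
  var rows = [];
  for (var i = 1 + n; i < flat.length; i += n) {
    var row = {};
    keys.forEach(function (k, j) {
      row[k] = flat[i + j];
    });
    rows.push(row);
  }
  return rows;
}
```

So `[{name: 'Ann', age: 30}, {name: 'Bob', age: 25}]` flattens to `[2, 'name', 'age', 'Ann', 30, 'Bob', 25]`, repeating the key names zero extra times no matter how many rows there are.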
And a real-world example where the compressed data is 58% of the original size, with compression taking around 0.26 seconds:
http://jsfiddle.net/kmturley/8cMY5/5/