OpenCores
open lossless data compression core
by Unknown on Apr 15, 2005

Hi,

I have been trying to see if I can implement a lossless data
compression core (like gzip, etc.) in HDL. I searched a lot for simple
compression cores already written in HDL that I could use. The only one
I found was the static Huffman encoder/decoder in the JPEG core from
OpenCores. That is not really usable for general data, since it is a
static encoder; I am interested in a general-purpose data compression core.

Has anyone here ever written any kind of data compression core in
HDL? From my study of the "deflate" algorithm, I think it will be
extremely tough to code in Verilog. Is there any other, simpler
compressor implementation that I could learn from?

I am surprised that no such simple cores exist. Is it very tough to
develop such a core?

Any opinion on this would be welcome. If by some luck I get a large
number of replies, I also volunteer to edit them and submit the result
to OC as a FAQ/HOWTO (depending on the results).

Thanks,

Abgoyal.

open lossless data compression core
by Unknown on Apr 15, 2005

Here's the problem, and the reason why I designed a static Huffman
encoder/decoder: for a compressor/decompressor to be generic, it needs to be
loaded with tables. The core then has to traverse these tables and (de)code
the data depending on what it finds.

So the first issue is that you need to traverse large tables. This could be
simplified by using multiple small tables plus some guidance for selecting
one of them (e.g. hash tables).
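Richard's small-tables idea can be sketched in software (Python here, since the thread contains no HDL source; the bucket count, the toy hash, and the table contents below are all made up for illustration, not taken from any real core):

```python
# Split one big 256-entry code table into a few small per-bucket tables,
# selected by a cheap hash of the previous byte. A hash like this is a
# handful of XOR gates in hardware, so the selection costs almost nothing.
NUM_TABLES = 4  # hypothetical bucket count

def bucket(prev_byte):
    # Toy hash: fold an 8-bit value down to 2 bits with XOR.
    return (prev_byte ^ (prev_byte >> 2) ^ (prev_byte >> 4)) & (NUM_TABLES - 1)

# Four small placeholder tables standing in for per-context code tables.
TABLES = [{i: f"{t}:{i}" for i in range(8)} for t in range(NUM_TABLES)]

def lookup(prev_byte, symbol):
    # Each lookup only ever touches one small table, not the full table.
    return TABLES[bucket(prev_byte)][symbol]
```

The point is only that context-dependent selection keeps each individual table small enough for fast, fixed-latency access.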
The second problem is that the data isn't neatly aligned. In the case of the
Huffman (de)coder, for example, the most frequent symbols are assigned short
codes (2 bits), while rarer symbols are assigned long codes (many bits). This
is in fact the way a Huffman coder/decoder works. Unfortunately that doesn't
lend itself neatly to an HDL implementation, where fixed data sizes are
preferred.
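The alignment problem is easy to see in a software sketch (Python, since the thread has no HDL source to extend; the four-symbol table below is a made-up example, not the JPEG core's actual table): codeword boundaries depend on the data itself, so they never line up with fixed word widths.

```python
# Tiny static prefix-free Huffman table (hypothetical):
# frequent symbols get short codes, rare symbols get long ones.
CODES = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(symbols):
    """Concatenate variable-length codewords into one bit string."""
    return "".join(CODES[s] for s in symbols)

def decode(bits):
    """Walk the bit string; codeword boundaries are data-dependent."""
    inverse = {v: k for k, v in CODES.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inverse:  # prefix-free, so the first match is correct
            out.append(inverse[cur])
            cur = ""
    return "".join(out)

packed = encode("aabdc")
print(packed)          # '0010111110' -- 10 bits for 5 symbols
print(decode(packed))  # 'aabdc'
```

A fixed-width HDL datapath has to barrel-shift the bitstream by a data-dependent amount every cycle, which is exactly the awkwardness Richard describes.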

All in all it is simply easier (and sometimes faster!) to implement these
algorithms with a CPU or a simple statemachine.

Cheers,
Richard

open lossless data compression core
by Unknown on Apr 28, 2005
Hello,

There are thousands of lossless compression algorithms. If you post to nntp://comp.compression, the answers will be questions like these:
- What type of data do you want to compress?
- Is it a stream or blocks of data? What size?
- Do you know anything about the data (is it text, image, voice, numbers; does it have structure; is there a mathematical function behind it)?
- What speed do you need? What compression ratio do you need?

If I were you, I would try a static statistical coder such as Golomb/Rice codes plus a model (MTF, etc.), or an arithmetic coder (dynamic statistical) with a finite-context model of order 0, 1 or even 2 (256 bytes, 64 KB or 16 MB of memory) plus a model. If you already have a sort algorithm implemented in HDL, you can try BWT as the model. Check http://www.arturocampos.com and the comp.compression newsgroup for more info and tutorials.

Deflate and other dictionary algorithms (also called LZ) need data structures for fast substring search. Also, an FCM with order > 3 needs a lot of memory (multiply by 256 for each next order) or linked lists. A widely used FCM is PPM; check it out.

HTH
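The Golomb/Rice suggestion is attractive for hardware precisely because encoding needs only a shift and a mask. A minimal Rice coder sketch (Python for illustration; the parameter k and the bit-string representation are free choices, not anything prescribed in the thread):

```python
def rice_encode(n, k):
    """Rice code with divisor 2**k: quotient in unary, remainder in k bits."""
    q, r = n >> k, n & ((1 << k) - 1)       # one shift, one mask
    return "1" * q + "0" + (format(r, "b").zfill(k) if k else "")

def rice_decode(bits, k):
    """Return (value, bits consumed). Counts the unary run, reads k bits."""
    q = 0
    while bits[q] == "1":
        q += 1
    i = q + 1                               # skip the terminating 0
    r = int(bits[i:i + k], 2) if k else 0
    return (q << k) | r, i + k
```

For example, `rice_encode(9, 2)` gives `"11001"`: quotient 2 in unary (`110`), remainder 1 in two bits (`01`). The code is only efficient when k roughly matches the data's magnitude, which is why the reply pairs it with a model such as MTF.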
© copyright 1999-2025 OpenCores.org, equivalent to Oliscience, all rights reserved. OpenCores®, registered trademark.