Compress Random Data

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 19 03:47 [raw]

How much would it be worth if there were an algorithm that could compress deterministic or random data by 90+%? For example, let us assume we can compress 1 KB down to 72 bytes. Who would pay for that technology?

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 19 03:59 [raw]

Let me guess. Can you also DEcompress the original data in more than 10% of the cases, or are you still working on that part? :)

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 19 04:21 [raw]

What have you achieved?

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 19 05:08 [raw]

I didn't say one way or the other. I said "let us assume." Then I asked who would pay for such a powerful technology.

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 19 06:34 [raw]

ANY IT business would pay for an algorithm that can compress truly random data to less than X% of its original size AND decompress it back correctly more than X% of the time, using real-world computing hardware and practical timeframes (you could even market it as "unlimited compression", which would be technically correct). However, most novel compression proposals I've seen fail one or more of these conditions. There are some information-theory limitations that keep getting in the way.
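
For reference, here is what a standard compressor does with truly random input. This is just one illustrative experiment using Python's zlib; random bytes carry no exploitable structure, so deflate falls back to stored blocks and the container overhead makes the output slightly larger:

```python
import os
import zlib

# 1 KiB of cryptographically random bytes -- no structure to exploit.
data = os.urandom(1024)

# Deflate at maximum effort still cannot shrink random input; the
# output is the data plus a few bytes of header/checksum overhead.
compressed = zlib.compress(data, level=9)

print(len(data), len(compressed))  # compressed size exceeds the original
```

Any general-purpose compressor behaves the same way on such input, which is what the "limitations" above refer to.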

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 26 15:23 [raw]

This is in fact a contradiction in itself, probably due to a lack of proper understanding of mathematics. If the data is perfectly random, any kind of compression is impossible. So, to answer the second question: no one would, at least no one of sane mind.
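
The impossibility is a simple counting fact. A small exhaustive sketch over 8-bit strings makes it concrete: there are more inputs of length N than there are strictly shorter bit strings, so no lossless mapping can shrink every input:

```python
from itertools import product

N = 8  # small enough to enumerate every input exhaustively

# All 2^N distinct inputs of length N bits.
inputs = [''.join(bits) for bits in product('01', repeat=N)]

# All strictly shorter bit strings, lengths 0 .. N-1: 2^N - 1 of them.
shorter = sum(2**k for k in range(N))

# Fewer short codes than inputs: any compressor that shortens every
# N-bit input must map two inputs to the same output (pigeonhole).
print(len(inputs), shorter)  # 256 255
```

The same count holds for every N, which is why "compresses every input" and "decompresses correctly" cannot both be true.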

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 26 16:14 [raw]

Random data compression is possible. Compressor 1 can compress 50% of all inputs, including some data that is random - but not all such data. Compressor 2 can likewise compress 50% of all inputs, including some data that is random - but not all such data. The compressible sets of these two compressors do not overlap. Farewell, "pigeonhole principle".

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 27 23:48 [raw]

> So to answer the second question: No one would, at least no one with a sane mind.

You're a naysayer, not a doer, and your job is to discourage the doers.

> If the data is perfectly random, any kind of compression will be impossible.

Common wisdom once dictated that a human moving faster than 50 mph would disintegrate. In my study of history I learned that while visionary men were inventing the railroad, naysayers were heckling the doers about how high-speed carriages would cause people to disintegrate from the rush of speed.

> This is in fact a contradiction within itself, probably due to the lack of properly understanding mathematics.

You do what all naysayers do: argue as if you have a proper understanding. Where did you get your degree in mathematics? I placed in the top tenth percentile of my class. I did all the high-school math coursework in six weeks and tested out at 97%, without ever setting foot in a classroom. I think what you mean is that I don't properly understand Wikipedia's version of mathematics, or perhaps Stack Exchange's version.

While you call the thing insane, I am slowly devising a language and symbol set for a new branch of mathematics to deal with resonance and periodicity in all noise. I have devised several compression algorithms that successfully compress random data. I am able to compress random integers and bit streams of hundreds of digits by a minimum of 12.5% per pass. There is a cost: CPU. The algorithm searches for field patterns that can be reduced to a polymorphic algorithm, which takes a long time. A specially devised ASIC could probably do the compression 1000x faster.

I'm looking in my terminal just now at the statistics of the last compression pass with my newest test algorithm. It just compressed totally random bits by 6.875%. I have repeatedly broken the pigeonhole principle with adaptive polymorphism that can map a smaller map to a larger structure.

My best algorithm yet achieves a minimum of 12.5% per pass, and it compresses at least that much on every pass, no matter what. It is also the slowest. My work continues.

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Feb 28 20:06 [raw]

You probably could not be more wrong with your assumptions. Have a nice day.

BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
Mar 2 05:19 [raw]

I believe I have made a breakthrough that allows arbitrary compression of any quantity to any target quantity, with at least 2.5% marker bits to feed the expansion function. I successfully compressed a random bit stream from 32 kb down to a 280-byte algorithm that maps to the entire original data set.

I have discovered properties in byte patterns that allow them to be graphed. Just as you can take a graphing polynomial equation and populate a huge field of data, you can take the huge field of data and wind it back into a graphing function. In other words, randomness doesn't really exist. Everything is structured. Even noise can be mapped to functions.

It is fast enough that Python can do it without too much lag (a bit). C should do it at least 10 times faster. What this means is the possibility of sending an entire Linux distro DVD (approx. 1 GB) in a file 1/20th the size or smaller. With optimized machine code and buffering, one should be able to turn a terabyte drive into 80+ terabytes of compressed data.

Who do I sell to first? This is worth billions.
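
Whatever the internals, a claim like this is straightforward to verify with a round-trip harness: compress fresh random blocks, decompress, and require exact reconstruction every time before looking at the ratio. A minimal sketch, with hypothetical `compress`/`decompress` stand-ins (identity functions here, to be replaced by the real implementation):

```python
import os

# Hypothetical compressor pair -- identity stand-ins for illustration.
def compress(data: bytes) -> bytes:
    return data

def decompress(blob: bytes) -> bytes:
    return blob

def round_trip_check(trials: int = 100, size: int = 1024) -> float:
    """Compress random blocks, verify exact recovery, report mean ratio."""
    ratios = []
    for _ in range(trials):
        original = os.urandom(size)
        blob = compress(original)
        # Correctness first: the round trip must be exact, every time.
        assert decompress(blob) == original
        ratios.append(len(blob) / len(original))
    return sum(ratios) / len(ratios)

print(round_trip_check())  # identity stand-ins give a ratio of 1.0
```

A mean ratio below 1.0 on fresh `os.urandom` input, with every assertion passing, is the bar any buyer would set before paying anything.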

[chan] Crypto-Anarchist Federation
BM-2cWdaAUTrGZ21RzCpsReCk8n86ghu2oY3v
