/*

  Entropy in Picat.

  http://rosettacode.org/wiki/Entropy

  This Picat model was created by Hakan Kjellerstrand, hakank@gmail.com
  See also my Picat page: http://www.hakank.org/picat/

*/

% import util.
import cp.

main => go.

go =>
  ["1223334444",
   "Rosetta Code is the best site in the world!",
   "1234567890abcdefghijklmnopqrstuvwxyz",
   "Picat is fun"].map(entropy).println(),
  nl.

% generate a string that matches an entropy...
go2 ?=>
  Alpha = "abcdefghijklmnopqrstuvwxyz1234567890",
  Wanted = math.e,
  A = 1..4,
  member(N,1..10),
  println(n=N),
  Cs = new_list(N),
  foreach(I in 1..N)
    member(Cs[I],A)
  end,
  increasing(Cs),
  L = [ [Alpha[I] : _ in 1..Len] : {Len,I} in zip(Cs,1..Cs.len)].flatten,
  entropy(L) = Entropy,
  % println(L=Entropy),
  abs(Entropy-Wanted) < 0.01,
  println(cs=Cs=sum(Cs)),
  println(l=L=Entropy),
  nl,
  fail,
  nl.

go2 => true.

% probabilities of each element/character in L
entropy(L) = Entropy =>
  Len = L.length,
  Occ = new_map(), % # of occurrences
  foreach(E in L)
    Occ.put(E, Occ.get(E,0) + 1)
  end,
  Entropy = -sum([P2*log2(P2) : _C=P in Occ, P2 = P/Len]).
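
% --- Added sketch, not part of the original model ---
% A minimal spot-check of entropy/1, assuming the usual Rosetta Code
% reference case: for "1223334444" the Shannon entropy
%   H = -sum(P * log2(P)) over the symbol frequencies P
% works out to roughly 1.8464393 bits. The predicate name go3 is
% hypothetical; call it from the Picat top level like go or go2.
go3 =>
  H = entropy("1223334444"),
  println(h=H),
  if abs(H - 1.8464393) < 0.0001 then
    println(entropy_check=ok)
  else
    println(entropy_check=failed)
  end,
  nl.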