Entropy

Question 1 [12 pts]
What is entropy?
Calculate the entropy of a fair die toss.
Calculate the entropy of a biased die, where the probability of landing “6” is 1/2, the probabilities of the others are equal.

Compare the entropies of the fair die and the biased die. What does higher entropy mean?

You can use the Log Base 2 Calculator hosted here:
https://miniwebtool.com/log-base-2-calculator/

Use the exponent calculator at the address below to verify that raising 2 to the entropy value recovers the number of possible outcomes of a fair die, which should be 6.
https://www.rapidtables.com/calc/math/Exponent_Calculator.html
Question 2 [5 pts]
What is the difference between an RNG and a PRNG?
Question 3 [10 pts]
Which one is used for assessing the security of a cryptographic algorithm: Informational security or computational security? Why?
Which of the following statements are correct? You may choose more than one.
a. One-time pad is informationally secure.
b. AES algorithm is computationally secure.
c. Organizations should prefer using informationally secure algorithms over computationally secure algorithms as much as they can.
d. A new cipher that uses a 5096-bit key and takes 10^10000 years to break, even with the help of a parallel computing infrastructure composed of all the microprocessors in the world, is informationally secure.
Question 4 [8 pts]
Consider two encryption keys with lengths of 80 bits and 128 bits, respectively. Assume that the ciphers using these keys are NIST-approved ciphers. What can be said about these keys?
Do both keys provide a similar level of security in practical, real-world applications? Please justify your answer.


Sample Answer


Entropy

Entropy is a measure of the disorder or randomness of a system. A system with high entropy is disordered and unpredictable, while a system with low entropy is ordered and predictable.

Entropy is often measured in bits, where one bit of entropy represents the choice between two equally likely outcomes. For example, the toss of a fair coin has two equally likely outcomes, heads and tails, so it has one bit of entropy.

Full Answer Section


Entropy of a Fair Die Toss

The entropy of a fair die toss can be calculated by the following formula:

H = log2(n)

where:

  • H is the entropy
  • n is the number of possible outcomes

In the case of a fair die toss, there are six possible outcomes: 1, 2, 3, 4, 5, and 6. So the entropy of a fair die toss is:

H = log2(6) ≈ 2.58 bits

This means that there are 2.58 bits of uncertainty about the outcome of a fair die toss. In other words, the toss is as unpredictable as a choice among 2^2.58 ≈ 6 equally likely outcomes.

The entropy of a system can be used to measure how much information is required to describe the system. In the case of a fair die toss, the entropy tells us that we need 2.58 bits of information to describe the outcome of the toss.
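The calculation above can be checked numerically with the general Shannon entropy formula, H = -Σ p·log2(p), which reduces to log2(n) when all n outcomes are equally likely. A minimal sketch in Python, applied to both the fair die and the biased die from Question 1 (where P(6) = 1/2 and the other five faces share the remaining probability equally):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair die: six equally likely outcomes.
fair = [1 / 6] * 6

# Biased die: P(6) = 1/2, the other five faces get 1/10 each.
biased = [1 / 10] * 5 + [1 / 2]

print(round(shannon_entropy(fair), 3))    # ≈ 2.585 bits, equal to log2(6)
print(round(shannon_entropy(biased), 3))  # ≈ 2.161 bits
```

The biased die's lower entropy (about 2.16 bits versus 2.58) reflects that its outcome is more predictable, which is the comparison Question 1 asks for.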

Entropy is a fundamental concept in thermodynamics and information theory. It is used to measure the disorder or randomness of a system, and it can be used to quantify the amount of information required to describe a system.
