Before we start measuring information, let's introduce a definition and understand what we are dealing with.

## Definition

Under Russian federal law, information is any messages or data, in any form of presentation, regardless of their content. By this definition, even complete rubbish written on a piece of paper counts as information.

International standards distinguish the following meanings:

- knowledge about things, facts, ideas, meanings and opinions that people exchange in a particular context;
- knowledge about facts, events, things, concepts and meanings that has a certain significance in a particular context.

Data is a materialized form of representing information, although some texts use the two terms as synonyms.

## Measurement methods

The concept of information is defined in different ways. It is also measured differently. The following main approaches to measuring information can be distinguished:

- Alphabet approach.
- Probabilistic approach.
- Meaningful approach.

They correspond to different definitions and come from different authors with different views of the data. The probabilistic approach was created by C. Shannon; it does not take the subject of information transfer into account, that is, it measures the quantity of information regardless of how important the message is to the party transmitting or receiving it. The meaningful approach, by contrast, is based on R. Hartley's formula and evaluates information from the receiver's side: how much new knowledge the message brings. But let's look at everything in order.

## Probabilistic approach

As already mentioned, approaches to measuring the amount of information differ widely. This approach was developed by Shannon in 1948. Its idea is that the amount of information depends on the number of possible events and on their probabilities. The amount of information received can be calculated using the following formula:

I = −Σ p_i·log_2(p_i), summing i from 1 to N,

where I is the desired amount of information, N is the number of events and p_i is the probability of the i-th event.
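As a small sketch (the function name is my own), Shannon's formula I = −Σ p_i·log_2(p_i) can be computed directly:

```python
import math

def shannon_information(probabilities):
    """Average amount of information in bits for events with the
    given probabilities: I = -sum(p_i * log2(p_i))."""
    # Events with zero probability contribute nothing and are skipped
    # to avoid log2(0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equiprobable events carry 2 bits of information on average.
print(shannon_information([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

Note that for N equiprobable events (all p_i = 1/N) the formula collapses to log_2 N, which is exactly the formula used by the meaningful approach below.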

## Alphabet approach

A completely self-sufficient method of calculating the amount of information. It does not look at what the message says and does not relate the amount of text to its content. To calculate the amount of information, we need to know the power (size) of the alphabet and the volume of the text. In principle, the power of an alphabet is unlimited; in computers, however, an alphabet of 256 characters is sufficient. From this we can calculate how much information one character of printed text carries in a computer: since 256 = 2^8, one character is 8 bits of data.
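A minimal sketch of the alphabet approach (the function name is my own): the information volume of a text is the number of characters multiplied by the bits per character, where bits per character = log_2(alphabet power).

```python
import math

def text_volume_bits(text, alphabet_power=256):
    """Information volume of a text under the alphabet approach:
    every character carries log2(alphabet_power) bits, regardless
    of what the text actually says."""
    bits_per_char = math.log2(alphabet_power)
    return len(text) * bits_per_char

# With a 256-character alphabet each character is 8 bits,
# so a 10-character message carries 80 bits (10 bytes).
print(text_volume_bits("hello ther"))  # → 80.0
```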

1 bit is the minimum, indivisible amount of information. According to Shannon, this is the amount of data that reduces the uncertainty of knowledge by half.

8 bits = 1 byte.

1024 bytes = 1 kilobyte.

1024 kilobytes = 1 megabyte.
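These unit relations are easy to chain together; a trivial sketch:

```python
# Unit relations from the text: 8 bits per byte, 1024-based multiples.
BITS_PER_BYTE = 8
BYTES_PER_KILOBYTE = 1024
KILOBYTES_PER_MEGABYTE = 1024

# 4 megabytes expressed in bits:
bits = 4 * KILOBYTES_PER_MEGABYTE * BYTES_PER_KILOBYTE * BITS_PER_BYTE
print(bits)  # → 33554432
```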

## Meaningful approach

As you can see, approaches to measuring information are very different. There is one more way to measure it, and it allows you to evaluate not only quantity but also quality. The meaningful approach to measuring information takes the usefulness of the data into account: the amount of information contained in a message is determined by the amount of new knowledge that a person receives from it.

Expressed mathematically, 1 bit is the amount of information that reduces the uncertainty of a person's knowledge by a factor of 2. We therefore use the following formula to determine the amount of information:

X = log_2 H, where X is the amount of information received and H is the number of equiprobable outcomes. For example, let's solve a problem.

Suppose we have a triangular pyramid (a tetrahedron), which has four faces. When we toss it up, it can land on any one of the four faces, so H = 4 (the number of equiprobable outcomes). As you can see, the chance that the object lands on any particular face (1 in 4) is smaller than the 1-in-2 chance of a coin landing on a particular side.

Solution: X = log_2 H = log_2 4 = 2.

As you can see, the result is 2. But what does this number mean? As already mentioned, the minimum indivisible unit of measurement is the bit. So after the toss we have received 2 bits of information.
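The pyramid calculation above can be reproduced in a few lines of Python (a sketch of the same arithmetic):

```python
import math

# H equiprobable outcomes: a tetrahedron can land on any of its 4 faces.
H = 4
X = math.log2(H)  # amount of information received, in bits
print(X)  # → 2.0
```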

All of these approaches to measuring information rely on logarithms. To simplify the calculations, you can use a calculator or a table of logarithms.

## Practice

Where can you use the knowledge from this article, especially the meaningful approach to measuring information? Certainly in a computer science exam. The topic also helps you navigate computer technology, in particular the sizes of internal and external memory. Beyond that, this knowledge has little practical value outside of science: no employer will ask you to calculate the amount of information in a printed document or a written program. The exception is programming, where you may need to set the size of the memory allocated for a variable.