1
The discipline
Information theory

2
Course of study, specialty
Year 3, specialty «Mechanics»

3
Term of study
5

4
Number of credits
3

5
Lecturer
D. Sc., Prof. A.V. Bogdanovich

6
Objectives of the discipline
Formation of specialized knowledge on the quantitative assessment of the information characteristics of message sources and communication channels and on noise-immune coding of information; mastery of the basic definitions and concepts; study of the operating principles of information-measuring systems and of approaches to their analysis and synthesis; mastery of methods of signal transformation; study of the quantitative evaluation of information for various types of message sources and communication channels.
As a result of studying the discipline, students should be able to:
– apply the acquired knowledge to evaluate the effectiveness of information-processing systems;
– solve problems on all sections of the course.

7
Prerequisites
Probability theory

8
Discipline contents
Philosophical aspect of the concept of “information”. Information and communication. The subject of information theory. Classification of signals. Coding of signals. Definition and classification of information and information-measuring systems (IMS). Statistical approach to the analysis and synthesis of IMS. Specific systems, using radar as an example. Basic energy relationships. The role of information in improving the effectiveness of IMS.
Decomposition of an arbitrary signal over a given system of functions. Orthogonal and orthonormal systems. The generalized Fourier series. Bessel’s inequality for an arbitrary orthogonal system. Harmonic analysis of periodic oscillations. Complex and trigonometric forms of the Fourier series. Harmonic analysis of non-periodic oscillations. The spectral characteristic of a function. The forward and inverse Fourier transforms. Sampling. Kotelnikov’s theorem and features of its practical use. Quantization by level. Quantization errors. Pulse modulation and delta modulation.
An experiment as a characterization of the act of obtaining information. A priori and a posteriori uncertainty. Requirements for a measure of the amount of information. Hartley’s measure of information. The unit of the amount of information. Average a priori uncertainty. The concept of information entropy. Shannon’s formula. Basic concepts of entropy. The amount of information obtained from an experiment in the general case. Basic properties of the amount of information. Relative (differential) entropy. Independence of relative entropy from the specific values of a random variable. Determination of the probability density that maximizes entropy. Ergodic sources of arbitrary order. Ergodic random functions and sequences. Entropy of ergodic sources. The fundamental property of the entropy of discrete ergodic sources. Redundancy and the information flow of a message source. Block diagram of a communication system. Classification of communication channels.
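Hartley’s measure and Shannon’s formula, named among the topics above, can be illustrated with a short sketch. This example is not part of the course materials; the function names are illustrative. For n equally likely outcomes Hartley’s measure is log2(n), and Shannon’s entropy H = −Σ pᵢ log2 pᵢ reduces to it in the equiprobable case.

```python
import math

def hartley(n: int) -> float:
    """Hartley's measure: information in a choice among n equally likely outcomes, in bits."""
    return math.log2(n)

def shannon_entropy(probs: list[float]) -> float:
    """Shannon's formula: H = -sum(p_i * log2(p_i)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For equiprobable outcomes Shannon's entropy reduces to Hartley's measure:
print(hartley(8))                          # → 3.0 bits
print(shannon_entropy([0.25] * 4))         # → 2.0 bits
print(shannon_entropy([0.5, 0.25, 0.25]))  # → 1.5 bits (non-uniform source is less uncertain)
```

The third call shows why redundancy arises: a non-uniform source carries less information per symbol than the Hartley bound log2(3) ≈ 1.585 would suggest for its alphabet size.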
Model of an information system for the transmission of discrete messages in the absence of noise. The rate of information transfer. Capacity of a discrete communication channel. Efficient coding. The Kraft inequality. Shannon’s theorem on message encoding. Construction of an efficient code for independent source symbols: the Shannon–Fano method and the Huffman method. The model of a communication channel with noise. Shannon’s first, second and third theorems on coding in the presence of noise. Capacity of continuous communication channels. Capacity of a stationary Gaussian channel. Dependence of the information transfer rate on the distribution of signal and noise over the frequency spectrum.
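The efficient-coding and channel-capacity topics above can be sketched briefly. The following is an illustrative example, not course material: a standard heap-based Huffman construction, a check of the Kraft inequality, and the Shannon capacity formula C = B·log2(1 + S/N) for a band-limited Gaussian channel.

```python
import heapq
import math

def huffman_code(probs: dict[str, float]) -> dict[str, str]:
    """Build a binary Huffman code for symbols with the given probabilities."""
    # Heap items: (weight, tiebreaker, partial code table); the tiebreaker
    # keeps comparisons away from the dicts when weights are equal.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

def gaussian_capacity(bandwidth_hz: float, snr: float) -> float:
    """Capacity of a stationary Gaussian channel, C = B * log2(1 + S/N), bits/s."""
    return bandwidth_hz * math.log2(1 + snr)

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * math.log2(p) for p in probs.values())
kraft_sum = sum(2.0 ** -len(c) for c in code.values())

# For dyadic probabilities the Huffman code meets the entropy bound exactly,
# and the codeword lengths satisfy the Kraft inequality (sum <= 1):
print(avg_len, entropy)              # → 1.75 1.75
print(kraft_sum)                     # → 1.0
print(gaussian_capacity(3000, 1023)) # → 30000.0 bits/s
```

The dyadic example is deliberately chosen so that average code length equals the source entropy; for general probabilities Huffman coding only guarantees H ≤ L < H + 1 bits per symbol.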

9
Recommended literature
Lidovsky, V.V. Theory of Information: Textbook / V.V. Lidovsky. – Moscow: Fizmatlit, 2004. – 111 p.
Fursov, V.A. Lectures on Information Theory: Textbook / V.A. Fursov; ed. by N.A. Kuznetsov. – Samara, 2006. – 148 p.
Dmitriev, V.I. Applied Information Theory: Textbook / V.I. Dmitriev. – Moscow, 1989. – 332 p.
Stratonovich, R.L. Theory of Information / R.L. Stratonovich. – Moscow, 1975. – 424 p.
Goldman, S. Theory of Information / S. Goldman. – Moscow, 1957. – 446 p.

10
Teaching methods
Lectures with presentations; practical classes with problem solving

11
Language
Russian, English

12
Conditions (requirements), current control
– written tests;
– completion of assignments in practical classes.
The exam mark takes into account: 45% – written tests; 5% – completion of assignments in practical classes; 50% – oral answer at the exam.

13
Certification form
Exam