In 1948 Claude Shannon published a landmark paper, A Mathematical Theory of Communication, in which he approached the concepts of information and communication from a mathematical perspective. With this a new research field was born: information theory. Based on probability theory he defined two functions, entropy and mutual information, that are at the core of understanding what information is and what is required for it to be communicated. Another groundbreaking result is that all information can be represented in digital, or binary, form. Today we use a variety of digital communication systems, spanning from digital TV and mobile phones to the Internet as a whole. We use these systems on a regular basis, and many vital parts of our society rely on them.
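As a small illustration of the first of these functions (not part of the official course material), the entropy of a discrete source can be computed directly from its probability distribution. The function name and examples below are our own choices:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # ~0.469
```

Intuitively, entropy measures the average uncertainty of the source, which is why the predictable biased coin scores lower than the fair one.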
And the evolution is not slowing down; on the contrary, it is expanding. Today we can, for example, see the beginning of the use of IoT devices and applications in different forms, and 20 years from now we will likely see new ways of using information and communication. It is remarkable that all of the above systems, which together have changed our way of living over the last 70 years, in essence fall back on Shannon's theory. This course will give you the fundamentals of information theory. It explains and establishes the theoretical limitations and possibilities of modern information and communication systems.
All lectures and exercise classes will be held online using Zoom at the scheduled times.
The Time Edit schedule can be found here.
Credits: 7.5 hp / ECTS
Study Period: VT2 / LP4
Course responsible: Michael Lentmaier
Teaching assistant: Mgeni Makambi Mashauri
Course administrator: Erik Göthe