TTC – The Science of Information

The science of information is considered one of the most influential and most important fields in science today.

Never before have we humans had access to such a vast amount of information. Over years of scientific progress and of our growing understanding of our surroundings and the world we live in, various branches of science such as linguistics, cryptography, neuroscience, genetics, economics, physics, chemistry, quantum mechanics, and more gradually developed.

The science of information is very broad, covering everything from linguistics to black holes.

In this course, taught in 24 lectures by Professor Benjamin Schumacher, a prominent American physicist, you will become acquainted with very interesting concepts and topics in the science of information.

This valuable course is about 13 gigabytes in size.

Summary of the course topics:

  • The transformability of information
  • Computation and logic gates
  • Measuring information
  • Entropy and the average surprise
  • Data compression and coding
  • Encoding sounds and images
  • Noise and channel capacity
  • Error correction
  • Signals and bandwidth
  • Cryptography and key entropy
  • Cryptanalysis and unraveling the Enigma
  • Unbreakable codes and public keys

Course title: TTC – The Science of Information

Duration: 12 hours

Description:

The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.
Little wonder that an entirely new science has arisen that is devoted to deepening our understanding of information and our ability to use it. Called information theory, this field has been responsible for path-breaking insights such as the following:

What is information? In 1948, mathematician Claude Shannon boldly captured the essence of information with a definition that doesn’t invoke abstract concepts such as meaning or knowledge. In Shannon’s revolutionary view, information is simply the ability to distinguish reliably among possible alternatives.
The bit: Atomic theory has the atom. Information theory has the bit: the basic unit of information. Proposed by Shannon’s colleague at Bell Labs, John Tukey, bit stands for “binary digit”—0 or 1 in binary notation, which can be implemented with a simple on/off switch. Everything from books to black holes can be measured in bits.
Redundancy: Redundancy in information may seem like mere inefficiency, but it is a crucial feature of information of all types, including languages and DNA, since it provides built-in error correction for mistakes and noise. Redundancy is also the key to breaking secret codes.
Building on these and other fundamental principles, information theory spawned the digital revolution of today, just as the discoveries of Galileo and Newton laid the foundation for the scientific revolution four centuries ago. Technologies for computing, telecommunication, and encryption are now common, and it’s easy to forget that these powerful technologies and techniques had their own Galileos and Newtons.

The Science of Information: From Language to Black Holes covers the exciting concepts, history, and applications of information theory in 24 challenging and eye-opening half-hour lectures taught by Professor Benjamin Schumacher of Kenyon College. A prominent physicist and award-winning educator at one of the nation’s top liberal arts colleges, Professor Schumacher is also a pioneer in the field of quantum information, which is the latest exciting development in this dynamic scientific field.

Professor Schumacher introduces the essential mathematical ideas that govern the subject—concepts that can be understood by anyone with a background in high school math. But it is not necessary to follow the equations to appreciate the remarkable story that Dr. Schumacher tells.

A New View of Reality

Clearly, information has been around a long time. In human terms, language, writing, art, music, and mathematics are perfect examples; so are Morse code, Mendelian genetics, and radio signals—all originating before 1900. But a series of conceptual breakthroughs in the 20th century united what seemed like unrelated phenomena and led to a dramatic new way of looking at reality. The Science of Information takes you on this stimulating intellectual journey, in which some of the key figures include:

Claude Shannon: Shannon plays a key role throughout the course as the dominant figure in the early decades of information theory, making major contributions in computer science, cryptography, genetics, and other areas. His crucial 1948 paper was the “shot heard round the world” for the information revolution.
Alan Turing: The genius behind the decryption of the Nazi Enigma code during World War II, Turing invented the principle of the modern digital computer, and he revealed the inherent limitations of all computers by proving that the notorious “halting problem” is fundamentally unsolvable.
John A. Wheeler: One of the greatest physicists of the 20th century, Wheeler had a passion for the most fundamental questions of science, which led him to conceive the famous slogan, “It from bit,” meaning that all of physical reality emerges from information. He was also Professor Schumacher’s mentor.
In addition, you study the contributions of other pioneers, such as John Kelly, who used information theory to devise an influential strategy for betting and investing; David Huffman, who blazed the trail in data compression, now used in formats such as JPEG and MP3; and Gregory Chaitin, who pursued computer algorithms for information theory, hypothesizing a celebrated yet uncomputable number called Omega. You also explore the pivotal contributions of pre-20th-century thinkers including Charles Babbage, Ada Lovelace, Samuel F. B. Morse, and Joseph Fourier.

The Laws of Information at Work

With lucid explanations and imaginative graphics, Professor Schumacher shows you the world through an extraordinary set of lenses. “If we wear our information-colored glasses,” he says, “we will see the laws of information at work all around us, in a hundred different ways.” The course illustrates this with examples such as:

Money: Today most money exists as electronic account data. But even in ancient times, money was a record-keeping device—in other words, information. Precious metal coins had a cryptographic function: to make it hard to counterfeit messages of economic agreement and obligation.
Privacy: The search for guaranteed privacy has only one refuge—the quantum realm. Professor Schumacher explains how the only perfectly secure communications take place between pairs of entangled quantum particles called qubits (a term he coined). Such systems are now in use.
Games: The parlor game 20 Questions obviously involves the exchange of information. But why is the number of questions 20? Why not 10 or 30? The answer has to do with the connection between entropy and information—in this case, the total number of possible solutions to the game.
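To see the 20 Questions arithmetic concretely, here is a small Python sketch (ours, not the course's); the one-million figure below is just an assumed size for the game's answer space:

```python
import math

# Each well-chosen yes/no question can cut the remaining possibilities in half,
# so k questions can distinguish among up to 2**k answers.
print(2 ** 20)                             # 1,048,576 distinguishable answers

# Conversely, the number of questions needed for N equally likely answers is
# about log2(N) -- the entropy of the unknown answer, measured in bits.
n_answers = 1_000_000                      # assumed size of the game's answer space
print(math.ceil(math.log2(n_answers)))     # 20 questions
```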
Dr. Schumacher also shows you how information theory can provide answers to profound scientific questions. What is the information content of the genome? The human brain? A black hole? The universe? Time and again, the concepts and laws of information reveal breathtaking insights into the workings of nature, even as they lay the foundation of astounding new technologies.

One final example: 12 billion miles from Earth, a spacecraft built with 1970s technology is racing through interstellar space, never to return. From that distance, the sun is a very bright star and Earth is a pale blue dot. Voyager 1’s radio transmitter is about as strong as a cell phone tower on Earth, which typically can’t reach phones more than a few miles away. Yet we continue, to this day, to receive data from Voyager. How is that possible? The Science of Information explains this amazing feat, along with so much more.
The Transformability of Information
 What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit - the basic unit of information. 
Computation and Logic Gates
 Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations. 
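As a rough illustration of that idea (a sketch of ours, not the circuit built in the lecture), Boolean gates really can do arithmetic; the Python below wires XOR, AND, and OR gates into a 4-bit adder:

```python
# Building arithmetic from Boolean logic gates, in the spirit of Shannon's 1937 thesis.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two 1-bit numbers: returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Add two 1-bit numbers plus an incoming carry bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

# Add the 4-bit numbers 0b0110 (6) and 0b0011 (3), least significant bit first.
a_bits, b_bits = [0, 1, 1, 0], [1, 1, 0, 0]
carry, result = 0, []
for a, b in zip(a_bits, b_bits):
    s, carry = full_adder(a, b, carry)
    result.append(s)
print(result, carry)   # [1, 0, 0, 1], carry 0  ->  0b1001 = 9
```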
Measuring Information
 How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game. 
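For readers who want to try the measurement themselves, here is a minimal Python sketch of Shannon entropy; the 62-character alphabet and 10-character length are just an assumed password example:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per flip; a heavily biased coin carries much less.
print(entropy_bits([0.5, 0.5]))      # 1.0
print(entropy_bits([0.9, 0.1]))      # ~0.469

# Password security: a password drawn uniformly from an alphabet of size A
# with length L has entropy L * log2(A) bits.
alphabet_size, length = 62, 10       # letters and digits, 10 characters (assumed example)
print(length * math.log2(alphabet_size))   # ~59.5 bits
```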
Entropy and the Average Surprise
 Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise. 
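A small sketch of that "average surprise" view; the weather probabilities below are made up for illustration:

```python
import math

# The surprise (surprisal) of one outcome with probability p is -log2(p):
# rarer messages are more surprising and carry more information.
def surprise(p):
    return -math.log2(p)

# Entropy is the probability-weighted average of these surprises.
weather = {"sunny": 0.70, "cloudy": 0.20, "rain": 0.09, "snow": 0.01}  # toy distribution
for outcome, p in weather.items():
    print(f"{outcome:>6}: surprise = {surprise(p):.2f} bits")

average_surprise = sum(p * surprise(p) for p in weather.values())
print(f"entropy (average surprise) = {average_surprise:.2f} bits")
```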
Data Compression and Prefix-Free Codes
 Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes. 
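Huffman's procedure, mentioned earlier in the course overview, is the classic way to build such a prefix-free code; the sketch below is a toy version, run on a stand-in phrase rather than the actual book text:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a prefix-free (Huffman) code for the symbols in `text` by
    repeatedly merging the two least frequent subtrees."""
    counts = Counter(text)
    # Heap entries: (frequency, unique tiebreaker, {symbol: codeword so far})
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

sample = "the return of sherlock holmes"   # stand-in snippet, not the book itself
code = huffman_code(sample)
encoded = "".join(code[ch] for ch in sample)
print(f"{len(sample) * 8} bits as 8-bit characters, {len(encoded)} bits after Huffman coding")
```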
Encoding Images and Sounds
 Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats. 
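Real MP3, JPEG, and MPEG pipelines are far more elaborate, but the core lossy step, discarding precision the ear or eye will not miss, can be sketched as plain quantization; the sample values and step size below are arbitrary:

```python
# Toy illustration of the lossy step shared by JPEG/MP3-style codecs:
# quantize samples coarsely, store small integers, accept a little error.
samples = [0.12, 0.47, 0.51, 0.49, -0.33, -0.31, 0.02, 0.01]   # made-up signal values
step = 0.1                        # coarser step -> smaller file, lower fidelity

quantized = [round(s / step) for s in samples]      # small integers, cheap to store
reconstructed = [q * step for q in quantized]       # what the decoder recovers

max_error = max(abs(s - r) for s, r in zip(samples, reconstructed))
print(quantized)                                    # [1, 5, 5, 5, -3, -3, 0, 0]
print(f"max reconstruction error: {max_error:.3f}") # bounded by step / 2
```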
Noise and Channel Capacity
 One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate. 
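Shannon's result is often illustrated with the binary symmetric channel, a channel that flips each transmitted bit with some fixed probability; a small sketch of its capacity formula, C = 1 − H(p):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_prob):
    """Capacity of a binary symmetric channel that flips each bit with
    probability `error_prob`: C = 1 - H(p) bits per transmitted bit."""
    return 1 - binary_entropy(error_prob)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {p}: capacity = {bsc_capacity(p):.3f} bits per bit")
# Shannon's theorem says reliable communication is possible at any rate
# below this capacity, given suitable error-correcting codes.
```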
Error-Correcting Codes
 Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft. 
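The codes actually used on CDs and on Voyager are more sophisticated, but the Hamming(7,4) code below (a standard textbook example, not taken from the course) shows the essential trick of locating and flipping back a single corrupted bit:

```python
# Hamming(7,4) sketch: 4 data bits are protected by 3 parity bits,
# and any single flipped bit can be located and corrected.

def encode(d1, d2, d3, d4):
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]    # codeword positions 1..7

def decode(word):
    p1, p2, d1, p3, d2, d3, d4 = word
    s1 = p1 ^ d1 ^ d2 ^ d4     # each syndrome bit rechecks one parity group
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_pos = s1 * 1 + s2 * 2 + s3 * 4   # 0 means no error detected
    if error_pos:
        word = word[:]
        word[error_pos - 1] ^= 1           # flip the offending bit back
    return word[2], word[4], word[5], word[6]

codeword = encode(1, 0, 1, 1)
corrupted = codeword[:]
corrupted[5] ^= 1                          # noise flips one bit in transit
print(decode(corrupted))                   # (1, 0, 1, 1) -- original data recovered
```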
Signals and Bandwidth
 Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth - concepts that apply to many types of communication. 
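Two of those formulas are easy to play with directly; the bandwidth and signal-to-noise figures below are illustrative, not Voyager's real link budget:

```python
import math

def nyquist_rate(bandwidth_hz):
    """Nyquist-Shannon: a signal of bandwidth B must be sampled at least 2B times per second."""
    return 2 * bandwidth_hz

def shannon_hartley_capacity(bandwidth_hz, snr):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

# Telephone-quality audio: roughly 4 kHz of bandwidth.
print(nyquist_rate(4_000))                     # 8,000 samples per second

# A weak, distant signal: even with a signal-to-noise ratio well below 1,
# some capacity remains (numbers chosen only for illustration).
print(shannon_hartley_capacity(1_000, 0.1))    # ~137 bits per second
```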
Cryptography and Key Entropy
 The science of information is also the science of secrets. Investigate the history of cryptography starting with the simple cipher used by Julius Caesar. See how entropy is a useful measure of the security of an encryption key, and follow the deciphering strategies that cracked early codes. 
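The Caesar cipher itself fits in a few lines, and its tiny key entropy is exactly why it is so easy to break; a quick Python sketch:

```python
import math
import string

ALPHABET = string.ascii_uppercase

def caesar(text, shift):
    """Shift each letter by `shift` places -- the cipher attributed to Julius Caesar."""
    return "".join(
        ALPHABET[(ALPHABET.index(ch) + shift) % 26] if ch in ALPHABET else ch
        for ch in text.upper()
    )

ciphertext = caesar("ATTACK AT DAWN", 3)
print(ciphertext)                      # "DWWDFN DW GDZQ"
print(caesar(ciphertext, -3))          # decrypts back to "ATTACK AT DAWN"

# Key entropy: the Caesar key is one of only 25 useful shifts, so the key
# carries log2(25) < 5 bits -- trivially searchable by brute force.
print(math.log2(25))                   # ~4.64 bits
```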
Cryptanalysis and Unraveling the Enigma
 Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.  
Unbreakable Codes and Public Keys
 The one-time pad may be in principle unbreakable, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project that deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key cryptography, which makes modern transactions secure - for now.
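A minimal sketch of both points, assuming nothing beyond Python's standard library: XOR with a fresh random key is a one-time pad, and reusing that key (the kind of mistake Venona exploited) cancels the key and leaks the XOR of the two plaintexts:

```python
import secrets

# One-time pad: XOR the message with a truly random key that is as long
# as the message and never reused.
message = b"MEET AT NOON"
key = secrets.token_bytes(len(message))            # must be random, secret, used once

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered  = bytes(c ^ k for c, k in zip(ciphertext, key))
print(recovered)                                   # b'MEET AT NOON'

# The classic mistake: reuse the key. XOR of the two ciphertexts cancels the
# key entirely, leaving the XOR of the two plaintexts for the analyst to pick apart.
message2 = b"MOVE AT DUSK"
ciphertext2 = bytes(m ^ k for m, k in zip(message2, key))
leak = bytes(c1 ^ c2 for c1, c2 in zip(ciphertext, ciphertext2))
print(leak == bytes(a ^ b for a, b in zip(message, message2)))   # True
```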