Information

Information wants to be free.

Information is knowledge that can be used for making decisions. Information is interpreted data, i.e. while data itself may not give us any information, e.g. if it's encrypted and we don't know the key or if we simply don't know what the data signifies, information emerges once we make sense of the data. Information is contained in books, on the Internet, in nature, and we access it through our senses. Computers can be seen as machines for processing information, and since the computer revolution information has become the focus of our society; we often encounter terms such as information technology, informatics, information war etc. Information theory is the scientific field studying information.

Information wants to be free, i.e. it is free naturally unless we decide to limit its spread with shit like intellectual property laws. What does "free" mean here? It refers to the miraculous property of information that it can be duplicated basically without any cost. Once we have certain information, we may share it with others without having to give up our own knowledge of it. A file on a computer can be copied to another computer without deleting the file on the original computer. This is unlike physical products: if we give one to someone, we lose it ourselves. Imagine if you could make a piece of bread and then duplicate it infinitely for the whole world -- information works like this! We see it as a crime to want to restrict such a miracle. We may also very nicely store information in our heads. For all this, information is beautiful. It is sometimes discussed whether information is created or discovered -- if a mathematician invents an equation, is it his creation or simply his discovery of something that belongs to nature? This question isn't so important because whatever terms we use, we at LRS decide to create, spread and freely share information without limiting it in any way.

In computer science the basic unit of information amount is 1 bit (for binary digit), also known as shannon. It represents a choice between two possible options, for example an answer to a yes/no question, or one of two binary digits: 0 or 1. From this we derive higher units such as bytes (8 bits), kilobytes (1000 bytes) etc. Other units of information include the nat and the hart. With enough bits we can encode any information including text, sound and images. For this we invent various formats and encodings with different properties: some encodings may for example contain redundant data to ensure the encoded information is preserved even if the data is partially lost. Some encodings may try to hide the contained information (see encryption, obfuscation, steganography). For processing information we create algorithms. We store information in computer memory or on storage media such as CDs, or on traditional, potentially analog media such as photographs or books. The opposite measure of information is entropy; it is measured in the same units but says how much information is missing rather than how much is present.
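To make the bit more tangible, here is a minimal sketch in C (written for this article, not any standard library function) that estimates the average information content of a message in bits per byte using Shannon's formula H = -sum(p_i * log2(p_i)), where p_i is the relative frequency of byte value i. Note this only looks at byte frequencies, ignoring any further structure in the data.

```c
#include <stdio.h>
#include <string.h>
#include <math.h>

/* Shannon entropy of a byte string, in bits per byte:
   H = -sum(p_i * log2(p_i)) over all byte values i that occur. */
double entropy(const unsigned char *data, size_t len)
{
  size_t counts[256] = {0};
  double h = 0;

  for (size_t i = 0; i < len; ++i)
    counts[data[i]]++; /* count occurrences of each byte value */

  for (int i = 0; i < 256; ++i)
    if (counts[i])
    {
      double p = (double) counts[i] / len; /* relative frequency */
      h -= p * log2(p); /* one occurrence carries -log2(p) bits */
    }

  return h;
}

int main(void)
{
  const char *s = "information wants to be free";

  printf("%f bits/byte\n", entropy((const unsigned char *) s, strlen(s)));

  /* a constant message answers no yes/no questions, so it carries 0 bits: */
  printf("%f bits/byte\n", entropy((const unsigned char *) "aaaaaaaa", 8));

  return 0;
}
```

Compile with -lm. This also shows why 1 bit is the natural unit: a fair coin flip has two options, each with probability 1/2, giving -2 * (1/2 * log2(1/2)) = 1 bit, i.e. exactly one yes/no answer's worth of information.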