\section{Bits standard operations and logic}\index{LectNo1}
A \emph{bit} (binary digit) is a unit of measure for information, introduced by C.~Shannon in 1948.\\
It might be seen as the minimal amount of information needed in order to distinguish between two events occurring with the same probability.\\
Bits are used in Information Theory and, more generally, in most Computer Science applications.\\
They are denoted by the constants $0$ and $1$, representing the two equiprobable events. Their set is often denoted by $\{0,1\}$, but we need another notation, that is,
\[
\Fb = \{0,1\}.
\]
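For instance, the outcome of a fair coin toss carries exactly one bit of information: Shannon's entropy (here denoted by $H$) of two equiprobable events evaluates to
\[
H \;=\; -\tfrac{1}{2}\log_2\tfrac{1}{2} \;-\; \tfrac{1}{2}\log_2\tfrac{1}{2} \;=\; 1 \ \text{bit}.
\]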