**3. Information and information theory**

According to Dretske (1999), Information Theory is a branch of probability theory and mathematical statistics that identifies the amount of Information associated with, or generated by, the occurrence of an event or the realization of a state of things, through the reduction of uncertainty and the elimination of the possibilities presented by the event or state of things in question. With respect to Information, "choice" and "amount" are always mentioned, as well as how Information is to be measured.

Sir Ronald Fisher (1890-1962), a renowned scientist of the 20th century with major contributions to Statistics, Evolutionary Biology and Genetics, introduced in 1925 the concept of Fisher Information, long before the *entropy notion*<sup>1</sup> of Claude E. Shannon (1916-2001), together with the maximum likelihood technique and the analysis of variance.
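As a brief aside (our own illustration, not part of the original text), Fisher Information measures how much a single observation tells us about an unknown parameter θ; for a Bernoulli trial with success probability p it works out to a simple closed form:

```latex
% Fisher Information of a parameter \theta of a density f(x;\theta):
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta}\,
            \log f(X;\theta)\right)^{2}\right]

% For one Bernoulli observation, f(x;p) = p^{x}(1-p)^{1-x}, this gives:
I(p) = \frac{1}{p(1-p)}
```

The information is largest when p is near 0 or 1, i.e., when each observation is most decisive about the parameter.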

However, the concept and notion of information are vague and intuitive. When we ask a question, we are requesting data from someone. When we watch a television show or a movie, we are absorbing data. While reading a magazine or listening to music, we know we are dealing with some kind of data. In a general manner, we use, absorb, assimilate, manipulate, transform, produce and transmit data throughout our existence.

In 1948, Shannon gave us a more recent definition of the amount of Information, as the measure of the freedom of choice each of us has when we select a message (data) from among all the messages of a set.

For Shannon, the Information content of each message consists only in the quantity of binary digits (bits), of zeros and ones, needed to convey the message. Information can thus be processed as a physical, measurable amount.
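To make this measurable amount concrete, here is a minimal sketch (our own illustration, not from the original text; the name `shannon_entropy` is ours). It computes the entropy H = −Σ pᵢ log₂ pᵢ, the average number of bits needed per message:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Choosing freely among 8 equally likely messages takes log2(8) = 3 bits.
uniform = [1 / 8] * 8
print(shannon_entropy(uniform))  # 3.0

# A biased source offers less freedom of choice, hence less Information.
biased = [0.9, 0.05, 0.03, 0.02]
print(round(shannon_entropy(biased), 3))  # 0.618
```

A uniform source maximizes the freedom of choice Shannon describes; any bias in the message set reduces it.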

<sup>1</sup>*Shannon's entropy involves summing over all the data, which, according to F. Carvalho Rodrigues, is incorrect: we must work with a data set as small and as simple as possible, in order to maximize the amount of Information to be acquired.*

When we want to transform data into Information, we must bear in mind three basic rules (Snodgrass, 2007):

According to Maeda (2006), one can describe Simplicity, as it has been sustained by Fisher (1925), as the first definition of information amount.

In a predictable future, Simplicity is thus bound to be the motto of Design and Engineering, and of a whole Industry, as the discovery of Pythagoras's Theorem was in its own time. In short, we aim to make the reader understand that simplicity is not something intuitive. Today we possess, and have access to, greatly complicated data; since such data are complicated to process, we obtain very little Information from them. By analogy, this leads us to conclude that, when facing enormous simplicity, a small data set will give a great amount of Information about the instrument, the system, or whatever is observed.

Simplicity, like easiness, hardness, Information or complication, is a matter of the observer's measuring: it depends on his own System of Beliefs.

120 Advances in Industrial Design Engineering


Information and knowledge are the result of human action while aggregating data (symbols or facts) in a social or physical scope; out of context, data are not directly or immediately significant (signs). Data set in a determined context acquire meaning and value, and are thus designated as Information. Knowledge comes from the successive accumulation of relevant, structured Information capable of producing action, partially based on experience. The transformation of data into Information, and then into knowledge, requires a cognitive effort in the perception of structure and in the attribution of a meaning and a value.

Information can thus be grouped into several "levels", from the most basic form to the most complex one. Regarding these different levels we refer (Bernoulli, 1713; Quinn, 1980; Waibel and Stiefelhagen, 2009) to:


A possible structure for Information levels is shown above.

**• Bits and Bytes/Signs** are (Singh, 2007) the raw material, representing binary Information in the format understood by computers, i.e., zeros and ones that correspond to waves of electric impulses: zeros represent no current flowing and ones represent current flowing, like an electric switch. It is the lowest existing level in computers and, for that reason, the one that exists in the highest quantity. It is impossible for the observer to distinguish what is represented in it; it only finds usefulness when it is found in relevant form. Raw material is no longer considered a level of Information.

etc. Explicit codified knowledge is valuable, raising the capacity of observed and negotiable knowledge, easing communication and learned codification transmitted in rules, and thus enduring.

Measuring Design Simplicity http://dx.doi.org/10.5772/54753 123

**• Perception** (Frazer and Norman, 1977; Dember et al., 2011) is the meaning filtered by the System of Beliefs. It is the process by which organisms interpret and organize sensation to build a meaningful experience of the world, generally enhanced by the processing of additional sensorial inputs; the process by which sensorial stimulation is converted into organized experience. Sensation usually refers to unprocessed sense results and, in practice, sensation and perception are hard to separate.

**• Meaning** (Frazer and Mackay, 1975) is the meaning understood in terms of objectives. In statistical terms, for very simple models it is possible to find meaning in a numerical equivalence. The test of meaning has the broadest range of applications and requires a model to make the description of an answer as a whole, or just a reduced answer.

It is thus deducible that the Theory of Information explores the possibility of quantitatively measuring a message's information apart from any analysis of its meaning (Carvalho Rodrigues, 1989).

From this we can conclude that, given a set of data, it is possible to "know" whether it carries more or less Information about a variable we do not know but want to know.

**4. System of beliefs**

In the past, guessing was regarded as a possible science, with the objective of trying to determine the meaning and causes of events. With the evolution of science, guessing was abandoned, and only prediction and certainty were taken into account.

All that is known beyond doubt (Bernoulli, 1713), we claim to know or understand. About all that remains, we only conjecture and opine. To conjecture about something is the same as to measure its probability as well as possible, in order to choose the best option for our judgments and actions.

Randomness is not part of our knowledge but a property of the object, rendering it impossible to make predictions about it. Probability is a measure of how certain we are, and it is obtained through a combination of arguments. An argument may be defined as a thought intended to prove or refute a given question.

When an argument fits with mathematics, one can make predictions. When an argument is an image, one can forecast, as in "an image is worth a thousand words", as being objective. When one has both kinds of argument (image plus mathematics), one can acquire certainty. Each argument must have a weight, and the set of arguments with their relative weights is a System of Beliefs.

Probabilities are estimated by the number and weight of the arguments that prove or indicate that a certain thing is, was or will be. Arguments, by themselves, are intrinsic, or artificial in daily speech; they are expressed or come out according to considerations of cause, the
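The text does not formalize how the weights of a System of Beliefs combine into a probability. As one possible sketch (entirely our own: the helper `belief_probability` and the treatment of weights as log-odds are assumptions, not the author's method), weighted arguments for and against a claim can be summed and mapped to a probability:

```python
import math

def belief_probability(arguments):
    """Combine weighted arguments into a probability (our illustrative model).

    Each argument is a (weight, supports) pair with weight > 0; supports is
    True if the argument proves the claim, False if it refutes it. Weights
    are summed as log-odds, so equal weight for and against yields 0.5.
    """
    log_odds = sum(w if supports else -w for w, supports in arguments)
    return 1 / (1 + math.exp(-log_odds))

# Two supporting arguments and one weaker refuting one.
beliefs = [(1.0, True), (0.5, True), (0.8, False)]
print(round(belief_probability(beliefs), 3))  # 0.668
```

Under this sketch, "acquiring certainty" corresponds to the total weight of one side growing large, pushing the probability toward 0 or 1.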


<sup>2</sup> According to Porto Editora Portuguese Language Dictionary, the word "sememas" means the signification unit in a lexeme (lexema), which pertains in a set of semes (semas), the minimal significant component of a word.

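The claim that one can "know" whether a data set carries more or less Information about an unknown variable is commonly formalized as mutual information. The sketch below is our own illustration under that assumption, not a method given in the text:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, from a joint distribution.

    joint[x][y] holds P(X = x, Y = y); I(X;Y) measures how much Information
    the observed data X carry about the unknown variable Y.
    """
    px = [sum(row) for row in joint]               # marginal P(X = x)
    py = [sum(col) for col in zip(*joint)]         # marginal P(Y = y)
    return sum(
        p * math.log2(p / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, p in enumerate(row)
        if p > 0
    )

# X perfectly determines Y: the data give one full bit of Information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0

# X is independent of Y: the data give no Information at all.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

In these terms, a data set holds "more Information" about the variable exactly when observing it removes more of our uncertainty.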
