Hans Christian von Baeyer
Chancellor Professor of Physics at the College of William and Mary.
We don't know what energy is, any more than we know what information is, but as a now robust scientific concept we can describe it in precise mathematical terms, and as a commodity we can measure, market, regulate and tax it.
The smell of subjectivity clings to the mechanical definition of complexity as stubbornly as it sticks to the definition of information.
As every bookie knows instinctively, a number such as reliability - a quantitative rather than a qualitative measure - is needed to make the valuation of information practically useful.
Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, nor even addressing the question of the meaning of the message.
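Note (illustrative, not a quotation): Shannon's measure can be made concrete. For a source whose symbols occur with probabilities p_i, the average information is H = -Σ p_i log2(p_i) bits per symbol, a number that depends only on the frequencies of the symbols, never on their meaning. A minimal Python sketch of that idea; the function name and example text are arbitrary choices for illustration:

    from collections import Counter
    from math import log2

    def shannon_entropy(message: str) -> float:
        """Average information per symbol, in bits, computed from symbol frequencies alone."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    # The measure is blind to meaning: any text with the same symbol
    # frequencies yields exactly the same number of bits per symbol.
    print(shannon_entropy("the quick brown fox jumps over the lazy dog"))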
The problem of defining exactly what is meant by the signal velocity, which cropped up as long ago as 1907, has not been solved.
Science has taught us that what we see and touch is not what is really there.
The switch from 'steam engines' to 'heat engines' signals the transition from engineering practice to theoretical science.
If the intensity of the material world is plotted along the horizontal axis, and the response of the human mind is on the vertical, the relation between the two is represented by the logarithmic curve. Could this rule provide a clue to the relationship between the objective measure of information, and our subjective perception of it?
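Note (illustrative, not a quotation): the logarithmic rule referred to here is conventionally stated as the Weber-Fechner law of psychophysics. In its usual form, the perceived magnitude S of a stimulus of physical intensity I above a threshold intensity I_0 is

    S = k * ln(I / I_0)

with k a constant, so that equal multiplicative steps in the stimulus produce only equal additive steps in the sensation; that is the shape of the curve the question invokes.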
Paradox is the sharpest scalpel in the satchel of science. Nothing concentrates the mind as effectively, regardless of whether it pits two competing theories against each other, or theory against observation, or a compelling mathematical deduction against ordinary common sense.
As with all quantum devices, a qubit is a delicate flower. If you so much as look at it, you destroy it.
If quantum communication and quantum computation are to flourish, a new information theory will have to be developed.
Nowhere is the difference between either/or and both/and more clearly apparent than in the context of information.
If you don't understand something, break it apart; reduce it to its components. Since they are simpler than the whole, you have a much better chance of understanding them; and when you have succeeded in doing that, put the whole thing back together again.
In order to understand information, we must define it; but in order to define it, we must first understand it. Where to start?
Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information.
Information gently but relentlessly drizzles down on us in an invisible, impalpable electric rain.
To put it one way, a collection of Shakespeare's plays is richer than a phone book that uses the same number of letters; to put it another, the essence of information lies in the relationships among bits, not their sheer number.
Time has been called God's way of making sure that everything doesn't happen at once. In the same spirit, noise is Nature's way of making sure that we don't find out everything that happens. Noise, in short, is the protector of information.
The solution of the Monty Hall problem hinges on the concept of information, and more specifically, on the relationship between added information and probability.
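Note (illustrative, not a quotation): the effect of the added information can be checked numerically. In the short Python simulation below, a contestant who ignores the host's revelation and stays with the first door wins about one time in three, while one who switches after the host opens a losing door wins about two times in three.

    import random

    def play(switch: bool) -> bool:
        doors = [0, 0, 0]
        doors[random.randrange(3)] = 1      # one door hides the prize
        choice = random.randrange(3)        # contestant's initial pick
        # The host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != choice and doors[d] == 0)
        if switch:
            # Switch to the one remaining closed door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        return doors[choice] == 1

    trials = 100_000
    print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # roughly 1/3
    print("switch:", sum(play(True) for _ in range(trials)) / trials)   # roughly 2/3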
This is not what I thought physics was about when I started out: I learned that the idea is to explain nature in terms of clearly understood mathematical laws; but perhaps comparisons are the best we can hope for.