Information

Ross indexed the following pages under the keyword: "Information".


1948
Ergodism Ergodic systems
Information Wiener on
2517 2518

1949
Summary: Stability of a chain-circuit of variables. (See also 2604)
The Conditioned Reflex [5]: Conditioned reflex machine built. 2614. Seems to be original, 2623.
Summary: Inaugural meeting of the Ratio Club at the National Hospital for Nervous Diseases.
Information in time series
2623 2624

1950
Bit (= binary digit) quantities in speaking etc
Homeostat new possibility
Information quantitative examples
Delay (in substitution) and oscillation
Society [30]: Oscillations in fly population and its cause, 2784.
2783 2784
Code types of
Genes as information
Information in genes
2785 2786
Baudot code
Information in genes
Summary: Information theory.
Pattern (in general) recognition of
2819 2820
Summary: Canonical equations of systems composed of units each of which tries to make itself (its dial value) some function of the others. (3200)
Society [35]: Equations of a society in which each unit tries to maintain itself at some function of its surroundings 2991.
Summary: The system that does not generate information is identical with an absolute system. 3032
Information the noiseless system
Oddments [28]: Is the absolute system 'noiseless'? 2992, 3013, 3031, 3060.
2991 2992
Summary: Necessary and sufficient conditions that a first adaptation should be still present after a second has taken place.
Oddments [28]: Is the absolute system 'noiseless'? 2992, 3013, 3031, 3060.
Information the noiseless system
3031 3032

1951
Summary: Systems that are partly stochastic.
Density in phase in absolute system
Information and density in phase
Statistical mechanics density in phase
3099 3100
Summary: The basic equations of statistical mechanics (Continued 3134)
Black box, problem of the conjuring as
Information effect of threshold
Threshold and information
3105 3106
Summary: Darwinian mechanisms are to be developed by Darwinian process.
Information gate admitting
Switching as a gate admitting information
3151 3152
Summary: Switches that see a Markoff process only through themselves: consequent bias in their settings. (Theory in metric-less states, 4527)
Markov process / chain seen through a gate or switch
Stochastic processes seen through a gate or switch
Information in machines
3163 3164
Information in machines
Information in machines
3165 3166
Information in machines
Information in machines
3167 3168
Information in machines
Summary: In an absolute system formed by the junction of independent parts, if a particular part can take one of ρ initial states and can show σ lines of behaviour from each initial state, then the quantity of information log2 ρ + log2 σ cannot be exceeded whatever part has been chosen.
Information in machines
3169 3170
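A minimal numerical sketch of the bound stated in the summary above, assuming ρ equiprobable initial states and σ equiprobable lines of behaviour from each; the values of rho and sigma below are invented for illustration:

    import math

    rho, sigma = 8, 4                         # invented: 8 initial states, 4 lines of behaviour from each
    bound = math.log2(rho) + math.log2(sigma)
    print(bound)                              # 5.0 bits -- the most information the chosen part can supply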
Density in phase in absolute system
Information in machines
Information in machines
3171 3172
Summary: Information in an absolute system always falls to log2 η* (3176) where η is the number of the system's stable states and cycles. *Allowance should be made for the fact that the resting states are not equally probable.
Information in machines
Information in machines
3173 3174
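A hedged illustration of the starred qualification above: when the η resting states (and cycles) are not equally probable, the terminal information is the entropy of their catchment-area probabilities, which is at most log2 η. The probabilities below are invented:

    import math

    p = [0.5, 0.25, 0.125, 0.125]             # invented catchment-area probabilities for eta = 4 resting states
    terminal = -sum(pi * math.log2(pi) for pi in p)
    print(terminal, math.log2(len(p)))        # 1.75 bits, against the log2(eta) = 2.0 bit ceiling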
Information in machines
Summary: Information in a machine. The catchment area of a resting state.
Information in machines
3175 3176
Summary: Information in a conjoined system. 3274
Information in machines
Information organisms aim to destroy it
Information in machines
Markov process / chain passing through transducer
3177 3178
Information in machines
Information in machines
3179 3180
Summary: Example and proof of Shannon's Theorem 7
Information in machines
Information in machines
DAMS (Dispersive and Multistable System) [13]: Possible patterns for joining output and inputs, 3182, 3237.
3181 3182
Information in machines
Information in machines
3183 3184
Information in machines
Information in machines
3185 3186
Information in machines
Information in machines
3187 3188
Summary: Networks for DAMS. (Cf. 3237) (Further example 3306)
Information in machines
Information in machines
3189 3190
Information in machines
Information in machines
3191 3192
Summary: Information in machines.
Information in machines
Information in machines
3193 3194
Information in machines
Information in machines
Markov process / chain affecting a machine
3195 3196
Information in machines
Markov process / chain affecting a machine
Information in machines
Markov process / chain affecting a machine
3197 3198
Information in machines
Markov process / chain affecting a machine
Summary: Shannon and I.
Chasing equation of
Coding in machine
Constant intrinsic stability definition
Equilibrium 'constant intrinsic'
Information in machines
Markov process / chain affecting a machine
Output as function of input
Transformation function-forming
3199 3200
Summary: A variable of constant intrinsic stability and one that always moves towards some function of its neighbours' states are identical. (Cf. 3110) (Behaviour 3134, 3239)
Information in machines
Markov process / chain affecting a machine
Information in a disturbed system
Information in machines
Markov process / chain affecting a machine
Parameter as source of information
3201 3202
Summary: Passing information from parameter into machine. The previous theorem can be improved. Here is a better statement...
Information in machines
Markov process / chain affecting a machine
Information in machines
Markov process / chain affecting a machine
3203 3204
Summary: Accurate statement of the amount of information that can be put into a machine by arbitrary interference. (3275)
Information in machines
Markov process / chain affecting a machine
Summary: A physical example of habituation.
Habituation physical example
Information in machines
Markov process / chain affecting a machine
3205 3206
Summary: In the field of an absolute system, every convergent junction acts as a sink for information.
Information in machines
Information lost by convergence in field
Markov process / chain affecting a machine
Information in machines
Markov process / chain affecting a machine
3207 3208
Summary: Maximal loss at a convergent point in a field. Table of log2[(a^a b^b)/(a+b)^(a+b)] (a short tabulation follows this spread).
Information in machines
Markov process / chain affecting a machine
Summary: We cannot measure information by finding contributions from sub-ensembles and adding. (Another example 3249)
Information belongs to the whole system
Information in machines
Markov process / chain affecting a machine
3209 3210
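A minimal sketch, assuming the quantity tabulated in the first summary above is log2 of a^a·b^b divided by (a+b)^(a+b), evaluated for small integers a and b (the range chosen here is arbitrary):

    import math

    def max_loss(a, b):
        # log2[(a^a * b^b) / (a+b)^(a+b)]; negative values measure the loss at the convergent point
        return a * math.log2(a) + b * math.log2(b) - (a + b) * math.log2(a + b)

    for a in range(1, 4):
        for b in range(1, 4):
            print(a, b, round(max_loss(a, b), 3))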
Information in machines
Markov process / chain affecting a machine
Summary: An absolute machine can never gain more information than is put into it.
Information in machines
Markov process / chain affecting a machine
3211 3212
Information in machines
Markov process / chain affecting a machine
Summary: When a parameter affects a machine, the gain in information is stationary (and a maximum) if the parameter's values are distributed independently of the machine's.
Information in machines
Markov process / chain affecting a machine
3213 3214
Information in machines
Markov process / chain affecting a machine
Summary: Passage of information as machine dominates machine. (See 3298, 3218, 3275)
Information in machines
Markov process / chain affecting a machine
3215 3216
Capacity information
Channel capacity of absolute systems
Information in machines
Markov process / chain affecting a machine
Transmission capacity of absolute systems
Information in machines
Markov process / chain affecting a machine
3217 3218
Information in machines
Markov process / chain affecting a machine
Summary: (Stated at the front - on 3218): If a machine is driven by an absolute system, the duration of coupling makes no difference to the amount of information received.
Information in machines
Markov process / chain affecting a machine
3219 3220
Information in machines
Markov process / chain affecting a machine
Summary: An information source controlling an otherwise absolute system raises it to a definite information content at which it is in stable equilibrium. (3086) (Canonical equations next page)
Information in machines
Markov process / chain affecting a machine
3221 3222
Information in machines
Markov process / chain affecting a machine
Markov process / chain equilibrium in ensemble
Parameter as source of information
Resting state of system with Markoff parameter
Summary: Canonical equations of the densities in state of a system disturbed by an information source. (See 3227)
Information in machines
Markov process / chain affecting a machine
3223 3224
Information in machines
Information of transition
Markov process / chain affecting a machine
Summary: Another measure of information applicable to a machine.
Information in machines
Markov process / chain affecting a machine
3225 3226
Information in machines
Markov process / chain affecting a machine
Summary: When driven by a steady statistical source, the information in a machine does not tend to a minimum.
Information in machines
Markov process / chain affecting a machine
3227 3228
Information in machines
Markov process / chain affecting a machine
Statistical mechanics states that cannot be escaped from
Summary: States that lock accumulate all the members of the ensemble. 3233, 3291, 4524
Information in machines
Markov process / chain affecting a machine
3229 3230
Information in machines
Markov process / chain affecting a machine
Information in machines
Markov process / chain affecting a machine
Transition probability between resting states
3231 3232
Information in machines
Markov process / chain affecting a machine
Summary: Information when a stochastic parameter changes infrequently.
Information in machines
Information ways of losing
Markov process / chain affecting a machine
3233 3234
Summary: Ways of losing information. 3274
Information in machines
Information in machines
3235 3236
Summary: Wiring pattern of DAMS.
Information in machines
DAMS (Dispersive and Multistable System) [13]: Possible patterns for joining output and inputs, 3182, 3237.
Information in machines
Resting state maximal number
3237 3238
Information in machines
Summary: Conditions that a machine shall have the maximal number of resting states. This can be specified further...
Information in machines
3239 3240
Summary: Maximal number of resting states. (3308)
Information in machines
Summary: Information when A drives B.
Information in machines
3241 3242
Absolute system why not x = f^-1(x')?
Canonical equations why not [x = f^-1(x')]?
Information and canonical equations
Information in machines
Summary: The inverse of the canonical equations.
Experiment when it stops
Information and experiment
Information in machines
3243 3244
Summary: An experiment stops when the exchange of information has reached equilibrium. (3248, 3254, 3691)
Information and experiment
Information in machines
Epistemology [5]: When does an experiment stop? 3245.
Independence and information
Information and independence
Information in machines
3245 3246
Summary: Independence does not in general cause loss of information. (3274)
Information in machines
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3247 3248
Summary: Entropies in the parts do not sum to that of the whole. Entropy of a part may equal that of the whole.
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Summary: Information and experiment.
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3249 3250
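A small numerical illustration of the first summary above (entropies of parts versus the whole); the two-part distribution is invented. When the parts always agree, each part's entropy equals that of the whole, while the parts' entropies sum to more than the whole's:

    import math

    def H(dist):
        # Shannon entropy, in bits, of a probability table
        return -sum(p * math.log2(p) for p in dist.values())

    p_xy = {(0, 0): 0.5, (1, 1): 0.5}         # invented: parts X and Y always take the same value
    p_x = {0: 0.5, 1: 0.5}                    # marginal of either part
    print(H(p_x), H(p_xy), 2 * H(p_x))        # 1.0, 1.0, 2.0 -- part equals whole; parts' sum exceeds it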
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3251 3252
Summary: Information in an absolute machine. [deleted]
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3253 3254
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3255 3256
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3257 3258
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3259 3260
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3261 3262
Summary: Information and the experimenting on dynamic systems.
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3263 3264
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Summary: This then is the maximal information obtainable in an absolute system of σ states by starting it at a state selected arbitrarily and then observing how its behaviour goes from state to state.
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3265 3266
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3267 3268
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Summary: Information always decreases, step by step, as an unknown line of behaviour unfolds.
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
3269 3270
Summary: Uncertainty about the details within a line of behaviour is independent of whether that line, or some other, will occur. 3274
Experiment when it stops
Information and experiment
Information in machines
Epistemology [6]: Passage of information from machine to observer, { 3248 - 3271 }
Latent roots distribution of
3271 3272
Information and machines, collected
3273 3274
Summary: The longer the line of behaviour, the higher the chance of step-function change.
Information when only part is observed
3297 3298
Summary: Information in machines.
Information in a continuous system
3301 3302
Summary: The continuous system can gain information though absolute.
Summary: As soon as a sub-system is isolated it starts losing information.
Information in a sub-system
Personal notes [13]: Discussion with Wiener, May '51. 3304.
3303 3304
Entropy calculation of
Information calculation of
Markov process / chain information from
Transducer theory of
3368 3369
Summary: Information going through a transducer.
Information through a transducer
Transducer destroying information
3376 3377
Information necessary for resting states
Resting state and information
Transducer destroying information
3403 3404
Summary: A machine's tendency to destroy or conserve information (as uncertainty of state) depends slightly on certain necessary factors in the parts but depends more on the holistic factor of assembly.
Information destruction in machine
3405 3406
Information destroyed by part function
Latent roots distribution
DAMS (Dispersive and Multistable System) [22]: DAMS must contain its information redundantly 3419.
Cortex, sensory layering in
DAMS (Dispersive and Multistable System) [23]: DAMS' variables must be arranged in layers at its sensory input, in order not to lose information, 3420.
3419 3420
Summary: History of DAMS.
Summary: Sex and the Multistable System
Natural Selection [5]: There are two sexes so as to bring together all mutations. Each valve in DAMS has two inputs for the same reason 3509.
DAMS (Dispersive and Multistable System) [34]: History of DAMS: taken down for uniformisation, 3509.
DAMS (Dispersive and Multistable System) [35]: The two inputs to each valve cause mixing just as do two sexes, 3509.
Information nature of
Redundancy organisms prefer highly redundant information
3509 3510
Campbell's theorem
Degenerance, in system definition
Epistemology Campbell's theorem
Observable Campbell's theorem
Summary: Statistical mechanics.
Chess 'super' moves
Homeostat variety on switches
Information on uniselectors
3607 3608
Cleverness may be purely negative
Selection of patterns
Summary: 'Positive' cleverness may be really only what is left after the elimination of nonsense. 4578 5307.3 (See 3629)
Information amplifier for
3609 3610

1952
Summary: The mechanism underlying paranoia.
Aggression relation to paranoia
Information and resting states
Resting state and information
DAMS (Dispersive and Multistable System) [52]: The resting states of DAMS are not accessible unless a rich source of information is available and a broad channel into it, 3656.
3655 3656
Summary: The number of resting states that a machine can display to an observer depends on the information that the observer can get into the machine. (Next page)
Stimulus information in
Information in primary operation
Primary operation information admitted
3657 3658
Information in canonical equations
3659 3660
Experiment when it stops
Information from machine to machine
3690 3691
Information in partition
Partition information in
3694 3695
Summary: Searching, random and systematic.
Information first amplifier
Summary: Qualification to 3746.
3752 3753
Summary: Rank of the differential matrix, and null-functions.
Rank and reducibility
Reducibility rank of
Information retention by null-function
Null-function and retention of information
3772 3773
Control via constants
Jacobian (determinant) rank and control
Parameter independence among
Information loss in function
3778 3779
Determinate meaning of
Set or Ensemble relation to 'determinate'
Summary: Field + field, and determinate changes of parameter.
Summary: The observer-system relation is symmetrical; so we can calculate 'information' over an ensemble of observers
Experimenter information to
Information to experimenter
3786 3787
Information in vector
Rank and resting state
Resting state rank around
Summary: Information, rank, equations. 3799
Experimenter testing independence
Experimenter using primary operations
Independence testing for
3788 3789
Summary: On the operation that brings the representative point to a particular initial state. 3846, 4628
Convergence (of lines of behaviour)
Independence and convergence
Independence types of
Information convergence and
Invariant three types
3792 3793
Summary: Convergence of lines as invariance.
Experimenter receiving information
Information transfer of
3794 3795
Isomorphism of science
Information chain of systems
3802 3803
Summary: Habituation. 3837, 3842, 3856, 4526
Dispersion defined by information
Independence definition
Information and independence and dispersion
Essential variables possible mode of action
3824 3825
Summary: Other things being equal, the system with more step-functions will have more resting states.
Information decay of
Natural Selection [51]: Importance of the fact that the world is old, 3857.
3856 3857
Summary: Return of parameter to a previous value can cause further loss of information. 3863, 3954, 4057, 4074, 4373
Information fall under change of parameter
3862 3863
Summary: Topology of absolute system.
Summary: 'Stability' must be re-stated in terms of information. 3963, 3980, 3975
Genes should receive no information
Information genes should get none
Unsolved problems [13]: 'Stability' should be re-stated in terms of information, so as not to raise the unwanted 'unstable' equilibria. 3929.
3928 3929
Summary: Self-reproducing arcs are of very great importance, for good or evil. Review 4154
The Multistable System [40]: Self-reproducing arcs are of high importance, for good or evil 3932.
Information in partition
Partition information in
3932 3933
Basin and habituation
Summary: Administering a determinate impulse to an ensemble can only cause its information to fall. 3954
Basin defined
Information and d-impulse
3936 3937
Step function topology of
Summary: Unsolved problem. 3959, 3962
Information must be destroyed
3938 3939
Summary: Partition and lattice.
Summary: Partitions in absolute ensembles.
Information and lattice theory
3950 3951
Experience
Information fall under change of parameter
Parameter allows information to fall
3954 3955
Summary: A quite different way of defining and handling the absolute system. 3962, 4019. Review, 4338
Information levels of
Information what is necessary
3960 3961
Summary: Why do our brains take so much notice of a step-function? 4233
Unsolved problems [14]: Why do we take so much notice of a step-function? 3994. (because of its 1-bit simplicity).
Summary: 'Absoluteness' is relative to an observer. 4031, 4043.
Information levels of
3994 3995
Class I.e. not to individual
Congruence
Ideal (mathematical)
Information 'complete' within limits
Quotient machine and residue class
Residues (of congruences)
Summary: Residue classes in behaviour.
Basin as partition
Partition lattice theory
3996 3997
Summary: The representative point may be divided into parts, as the whole system is thought of in parts.
Information destruction of
Summary: More on information. 4133
Design for a Brain first advert
Personal notes [18]: First advertisement for "Design", 4123.
4122 4123
Summary: Runaway and information. 4303
Information at runaway
4156 4157
Summary: Effectiveness of parameters with finite substitution. Applied 4414
Dimension of parametric control
History examples
Memory in system
Parameter control exerted by
Summary: Examples of states that include history.
Summary: Knowledge must be tested by control. 4438
Information at runaway
Epistemology [17]: Knowledge must mean 'control' if it is to be objective, 4303.
4302 4303

1953
Summary: A system can "know" by having the right values of variables or parameters. 4438
Knowing as selection of vector
Summary: When a designer makes a machine of n states and a parameter-combinations, he puts in variety of a·n·log n. This is its intrinsic content. 4463, 4704
Information in machine
4428 4429
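A hedged reading of the count in the summary above: for each of the a parameter-combinations the designer selects one of the n^n mappings of n states into n states, i.e. n·log2 n bits per combination, hence a·n·log2 n bits in all (taking the logarithm to base 2; the figures below are invented):

    import math

    n, a = 10, 4                              # invented: 10 states, 4 parameter-combinations
    bits_per_combination = n * math.log2(n)   # choosing one of the n**n next-state mappings
    print(a * bits_per_combination)           # about 132.9 bits of intrinsic content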
Summary: Trial and error in reaction to shock.
Summary: Information in printed English.
Information per word
Memory information in
4432 4433
Entropy computing
Information computing
Summary: Computing entropy.
4444 4445
Summary: One formulation covers "design" "input" and "noise". (Next section) 4449
Design nature of
Information in machine
Epistemology [24]: Distinction between 'design' and 'input' 4449.
4448 4449
Summary: A single machine contains many "informations", most of which do not interact with the others. Example 4577
Information two need not interact
Summary: Joining absolute systems without metric. Example 4473, 4498, 4733
Joining general theory
4470 4471
Distinguishability in brain
Information conservation in chain
4502 4503
Genes reproduction of
Information on chromosomes
Summary: How chromosomes reproduce themselves. High survival power.
Natural Selection [78]: Reproduction of genes; extra survival power given by combination in pairs 4523.
4522 4523
Topology necessity for
Summary: Poincaré on the study of functions.
Information two need not interact
Linkage
Meccano linkage
4576 4577
Summary: 'Equilibrium' refers to a state, 'stability' to the region around the state.
Basin
Traffic principle nature
Summary: Properties of metric-less system. 4623
Continuity as constraint
Information in stimulus
Stimulus information in
4596 4597
Summary: Entropies in regulation. 4666, 4722, 4971
Capacity for control
Information and entropy
4664 4665
Summary: Error-controlled regulation is possible only by blocking conduction.
Information and observer
Epistemology [34]: When variety arrives, the observer who sees all sees unity changing to diversity; the element in the set sees diversity change to unity. 4691.
4690 4691
Information flows in homeostat
Selection and variety
Switching switching off
4706 4707

1954
Control of system that learns
Learning during regulation
Summary: Observations on controlling a system that can learn.
Design and variety
Information variety is sufficient
Variety and information
4780 4781
Summary: The limit of progressive regulation.
Amplifier limit of
Regulation limit of
Control as reduction
Equilibrium and information
Information as reduction
Regulation and information
Selection and information
Survival in set theory
4794 4795
Summary: Adaptation in evolution.
Constraint and natural law
Information Goldman on
Quotations [53]: Goldman on Science as a use of Constraints 4866.
Summary: Quotations.
Quotations [54]: "A study of the real world thus becomes a study of transducers." 4867.
4866 4867
Differential equation stability theory of
Equilibrium Bellman on
Equilibrium complexity of
Information partial
Knowledge partial
Stability stability theory of differential equations
Summary: Bellman on 'stability'.
Trajectory stability of
4914 4915
Summary: Trials are good, for they bring information. 4945, 4948, 4965, 4963
Information by trials
Trial and error gives information
Behaviour all reducible to behaviour
Thing as way of behaving
4942 4943
Summary: Discriminative feedback. 4963
Summary: Information repair.
Information "repair" (MacKay)
Information and discriminative feedback
Summary: Letters as Markov chain.
Adaptation strategy of
Markov process / chain English as
Maximal likelihood and adaptation
4946 4947
Convergence (of lines of behaviour) in random transformation
Information in random machine
Transformation random
Variety and random transformation
4974 4975
Summary: With a random transformation the variety tends to fall to two thirds at each step. Qualified 5158
Information blocking
4978 4979
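A quick simulation consistent with the summary above (set size and trial count are arbitrary): the image of a random mapping of a set into itself covers on average a fraction 1 - (1 - 1/n)^n, about 0.63 for large n, which is roughly the "two thirds" of the text:

    import random

    n, trials = 1000, 200
    total = 0
    for _ in range(trials):
        f = [random.randrange(n) for _ in range(n)]   # one random transformation of n states
        total += len(set(f))                          # variety (distinct states) remaining after one step
    print(total / (trials * n))                       # about 0.632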
Organisation emergent properties
Summary: Whether properties "emerge" or not depends on our knowledge of the parts.
Summary: Why information?
Energy an old question
Information and energy
4986 4987
Summary: Effects of channel capacity in joining two systems, only one of which is observed.
Epistemology [49]: As the coupling between B and A is made richer, either by increasing channel capacity or by adding immediate effects, so will what is in B affect x the sooner. 5026. DIAGRAM
Entropy during search
Hunt and stick information flow
Information during hunt and stick
Searching information flow
Selection information flow
Trial and error gives information
5026 5027

1955
Summary: There is no "law" to be found in systems much removed from the atomic level. 5142
Laws of nature only near atomic level
Summary: On causality. (Continued over)
Black box, problem of the "cause" and "why"
Cause Rapoport on
Information statistics as reducer
Large system and statistics
Operational principle "why?" and
Statistic as diminisher of information
Why ...? meaning of
5116 5117
Summary: Simplification by running together, or deleting, the elements of time. 5165, 5245
Summary: "Loss of control" in set theory. 5155
Control in set theory
Convergence (of lines of behaviour) loss of control
Derivative and set theory
Information non-transmission
5152 5153
Summary: Kershner and Wilcox's book.
Canonical equations information in
Information in canonical equation
Input of mathematical form
Mathematics variety in
Variety in canonical representation
5160 5161

1956
Summary: Selection cannot proceed quickly, by dichotomising, in a class that is undefined.
Summary: Example of a case in which information about the initial state gets lost.
Absolute system decay of information
Information example of decay
Initial state decay of information regarding
5280 5281
Discrimination in feedback, no general rule
Information depends on set
5334 5335
Channel capacity crowding within
Information loss by crowding
5372 5373

1957
Summary: Constraints are found by applying information-losing transformations and seeing whether they are still acceptable to the essential variables. 5756
Constraint finding
Information advantages of losing
5728 5729

1958
Information more than 1 man-lifetime
Regulation and reducibility
5792 5793
Summary: Some systems are not to be understood, or controlled, by the amount of information that can be accepted in 1 man-lifetime. 5810
Information direction is irrelevant
Summary: The symmetrical relation between transmitter and receiver, of McGill and Garner and Woodward, is: between two variables' variations a constraint has been perceived. 5820
Constraint information as
Protocol constraint in
5794 5795
Summary: In the cortex, study only the unspecialised case.
Information loss through net
Network transmission through
5830 5831
Information amount in theory
Model has two informational measures
Summary: Every theory has two informational aspects: its passive, when it is learnt or otherwise acquired; and its active, when it is used as a transducer. The two quantities of information are not necessarily linked.
6010 6011
Summary: Eigen-theory generalised. 6109
Summary: A physical system that is not completely analysable. 6065
Information transducer that conserves
Machine information machine conserving
Reducibility irreducible systems
6032 6033
Information in a set of functions
Laplace transformation in set theory
6036 6037
Summary: Finding how much information there is in a brain.
Information how much in brain?
6048 6049

1960
Equivalence relation information in
Homomorphism variety of
Information in equivalence relation
6170 6171

1964
Information Reza's book
Topology Bushaw's book
Evolution ensures extinction!
Genes no unique pattern
Survival evolution prevents!
Natural Selection [89]: Selection intensifies selection. Simpson, 6469.
6468 6469

1965
Summary: Any selection of 1 from more than 101000...(47 zeros)...0 is physically impossible.
Bremermann's limit another form
Information in a mapping
6546 6547
Summary: How information-quality can explode when complicated at the sensory side. 6549
Non-linear systems information in
Summary: Length of sequence increases the uncertainty exponentially.
Information when input is a sequence
6548 6549
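A small worked count for the sequence summary above (alphabet size and lengths are invented): the number of possible input sequences grows exponentially with length, so selecting one costs L·log2 k bits:

    import math

    k = 10                                    # invented alphabet of 10 distinguishable symbols
    for L in (1, 5, 10):
        print(L, k ** L, L * math.log2(k))    # count of sequences of length L, and the bits to select one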

1966
Summary: To be able to compute a function by a few variables at a time is a (non-trivial) restriction.
Information in continuous waveform
Summary: Information in a continuous waveform.
6612 6613

1967
Information loss, in "mesa" phenomenon
Mapping destroying information
6642 6643
Information loss, in conditioned reflex
Reflex, conditioned information loss
Summary: Pavlov, p.197
Summary: Events that show mutual exclusion in their behaviours may be called a "variable".
Variable nature of
6726 6727
Summary: Group therapy in kinematic graphs. 6811
Information axioms of
Summary: Axioms of information theory.
Cylindrance and reducibility
Explanation cylindrance of
Relation Five-fold
6770 6771

1968
Summary: Exchanging variables for cylindrance.
Information in regulation, Conant
Summary: Collection of some ways of transforming a space and affecting cylindrance.
6830 6831
Summary: Studying mechanisms that are "dynamic" demands far more information than studying those of the more "static" type.
Dynamic system information in, v. static
Information dynamic versus static systems
6906 6907
Summary: Conant's measures HL,TL,QL have the freedom from absolute value called for in the "Bio Science" paper.
Information Conant's HL,TL,QL
Library retrieval equals theorem-proving
Theorem theorem proving equals information retrieval
Transmission Conant's TL
Homomorphism of machines with input
Machine homomorphic
6926 6927
Coding of machine to machine
Coordination machinery for coordination
Information in regulation, Conant
Transducer for coordination
Summary: Machinery for coordination.
6940 6941
Summary: (1) The "transmissions" required for coordination can always be achieved - method given. (2) The whole flow of information can be analysed by the methods of book-keeping. Another analysis: 6997
Analysis of information flow
Coordination minimal information for
Information partitioned
Partition of information flows
Transmission minimal, in coordination
Summary: High order interactions can readily occur in physical systems. 7006
Coordination minimal information for
Interaction in physical systems
Transmission minimal, in coordination
6950 6951
Analysis of information flow
Coordination minimal information for
Information partitioned
Partition of information flows
Transmission minimal, in coordination
Summary: Information flow for control and coordination. Cf. 7165
Coordination minimal information for
Transmission minimal, in coordination
6962 6963
ASS (Automatic Self-Strategizer) information flow in
Information flow in ASS
Summary: The flow of information, control, and decision when ASS (automatic self-strategizer) produces a legal and winning play. 6989, 6996
6974 6975

1969
Information the identities of
Multinomial measure of information
7018 7019
Information Cause analysis
7027 7028

1970
Summary: Transmission enters when an unordered set is distributed over a product space. Example 7092
Self-awareness in chimpanzee
Summary: Self-recognition.
Information in design
7088 7089
Information in pulse and gap
Summary: No-pulse may carry more information than pulse.
7094 7095
Summary: Why allowing interaction boosts up informational demands. 7122, 7127
Information and interaction
7096 7097
Summary: Example of how much selection is required to make a system stable.
Information required for stability
Stability information needed
7106 7107

1971
Summary: Interaction between parts having inputs raises the complexity by ×k^i (i = number of inputs to each part; k = number of values (states) of each part). A worked count follows this spread.
Information and interaction
Interaction increases information by ×k^i
Summary: Me and my brain in dreams.
Dream and "self"
Self-awareness impossible
7124 7125
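A worked count behind the ×k^i factor in the summary above, using invented small numbers: a part with i inputs, each taking k values, must specify a next-state mapping for every one of its k^i input combinations, so the bits needed to specify it are multiplied by k^i:

    import math

    k, i, n = 2, 3, 4                          # invented: 2-valued inputs, 3 inputs per part, 4 internal states
    isolated = n ** n                          # next-state mappings available to a part with no inputs
    with_inputs = n ** (n * k ** i)            # one mapping for each of the k**i input combinations
    print(math.log2(with_inputs) / math.log2(isolated))   # = k**i = 8.0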
Summary: Transmission cannot be created. 7146
Conservation of Complexity Law of
Transmission cannot be created
Information and interaction
Interaction squares it (doubled size)
Size, effect of on information
7126 7127
