Transcomputability, (Glanville's corollary of) Ashby's law of requisite variety and epistemic processes

Purpose – Ranulph Glanville has argued that ambitions of strict control are misplaced in epistemic processes such as learning and designing. Among other reasons, he has presented quantitative arguments for this ethical position. As a part of these arguments, Glanville claimed that strict control even of modest systems transcends the computational limits of our planet. The purpose of this paper is to review the related discourse and to examine the soundness of this claim.
Design/methodology/approach – Related literature is reviewed, and pertinent lines of reasoning are illustrated and critically examined using examples and straightforward language.
Findings – The claim that even modest epistemic processes transcend the computational means of our planet is challenged. The recommendation to assume out-of-control postures in epistemic processes, however, is maintained on ethical rather than on quantitative grounds.
Research limitations/implications – The presented reasoning is limited in as far as it is ultimately based on an ethical standpoint.
Originality/value – This paper summarizes an important cybernetic discourse and dispels the notion therein that epistemic processes necessarily involve computational demands of astronomical proportions. Furthermore, this paper presents a rare discussion of Glanville's Corollary of Ashby's Law of Requisite Variety.

1. Introduction

Bremermann (1962) describes an in-principle limitation of computability resulting from the finite availability of potentially computation-carrying physical matter. Beyond this so-called Bremermann's Limit, computational problems are considered transcomputable, i.e. unsolvable within the material and temporal means of our planet. Since Ashby proposed a connection between transcomputability and systemic control in 1964, references to Bremermann's Limit have frequently featured in cybernetic discourse (Ashby, 1991; Glanville, 1998a, 1998b, 2009; von Foerster, 2003, p. 309; Pickering, 2010, p. 150; van Stralen, 2015). Ashby (1956, pp. 121ff.; 1991) based this connection on his notion of variety[1], a quantitative measure of systemic states. Glanville (2009, pp. 139-147) later argues against strict control in epistemic processes such as learning and designing. Glanville proposes a quantitative, technical argument for this essentially ethical position, suggesting that strict control easily gives rise to levels of variety with transcomputable data processing demands. Some aspects of this technical argument lack accuracy, however, and may therefore inadvertently undermine Glanville's ethical position. Upholding said position, Sections 8 and 9 examine and clarify some of the technical assertions Glanville offers in its support. To set the stage, the following Sections 2 to 7 present a brief review of the related discourse.

2. Variety
In his book An Introduction to Cybernetics, Ashby (1956, pp. 121ff.) introduces variety as a measure of systemic states. When he first mentioned the term in his personal journal, Ashby (1952, p. 4312; his underscore) explained its purpose as follows: "I want to get away from the Shannon method of entropies [and] averaging over infinitely long messages; I want something I can count." Ashby (1956, p. 126), nonetheless, defined this measure somewhat ambiguously to mean either the number of distinct elements or the logarithm to the base 2 of that number, "the context indicating the sense used". Glanville (2009, pp. 116-117) furthermore observes that Ashby made distinct uses of the term: One use, referred to by Glanville as "s-variety", enumerates the states a system actually takes. The other use, referred to by Glanville as "e-variety", takes into account the number of states a system may assume in principle. Adhering metaphorically to Shannon entropy (Shannon, 1948), e-variety measures the (logarithm to the base 2 of the) number of states a system assumes in relation to the (logarithm to the base 2 of the) number of states it may assume in principle.
Both Ashby (1956, p. 127) and Glanville (2009, pp. 115-123) use the example of (British) traffic light signals to illustrate the notion of variety. Together, the three signal lamps (red, amber and green), each with two states (on and off), can assume 2^3 = 8 different states. In their traffic control application, however, traffic lights make use of only four states: red, red-and-amber, green, amber. The state red-and-amber-and-green, for example, is not designated for use in this application. The s-variety of traffic lights, accordingly, is log₂ 8 = 3, while their e-variety is log₂ 4 / log₂ 8 ≈ 0.667. The difference between potential variety (the number of states a system may assume in principle) and actual variety (the number of states assumed by a system) is referred to as constraint (Ashby, 1956, p. 127; Heylighen and Joslyn, 2001, p. 6). In their traffic control application, in this sense, traffic lights' potential variety of eight states (or log₂ 8) is constrained to an actual variety of four states (or log₂ 4).
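This arithmetic can be sketched in a few lines of Python (a minimal illustration of the two measures as described above; the variable names are mine):

```python
import math

lamps = 3                                # red, amber, green
potential_states = 2 ** lamps            # each lamp on/off -> 2^3 = 8
used_states = 4                          # red, red-and-amber, green, amber

# s-variety: (logarithm of) the number of combined lamp states
s_variety = math.log2(potential_states)

# e-variety: states actually used, relative to states available in principle
e_variety = math.log2(used_states) / math.log2(potential_states)

print(s_variety)              # 3.0
print(round(e_variety, 3))    # 0.667
```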
Although it is an important notion in the cybernetic discourse, variety is neither understood unanimously nor applied consistently in the ways other units and measures tend to be. The term entails conceptual challenges, some of which deserve mentioning here. The journal note quoted above suggests that Ashby, quite in the spirit of scientific units, regarded variety as countable in absolute terms (Glanville, 2009, p. 115). This is called into question when variety is used to measure and compare not only numbers of discrete states but also the bandwidths of continuous gradations. Variety also appears largely observer-dependent. States may not be available for observation and measurement, or they may have qualitative implications, so that different states of a system "count" differently or different observers appraise variety differently. Variety may, as will be discussed below, also involve extremely large quantities that prohibit numerical accuracy and even computational processing. Actual variety is subject to given scopes and units of observation, whereas potential variety may not be fixed, but subject to growth and "amplification".
Furthermore, variety is used to measure not only sets of states in systems but also sets of states of channels between them. Here, observer dependency becomes apparent when different "protocol layers" are considered. Morse code, for example, with its longer and shorter on/off states, has a relatively low variety. At the layer of alphanumeric symbols encoded by these on/off states, variety is higher, while at the layer of written prose that can be constructed from these alphanumerics, variety is enormous and hardly quantifiable.
Moreover, conceptual as well as ethical challenges can arise when variety is used metaphorically rather than literally to describe and compare "sets of states" of living systems. Human language acquisition, for example, might be considered analogous to the provision of symbol sets in the engineering of technical systems, to the effect that adult language is said to have a higher variety than child language. In such usage, variety may be (mis)understood as a measure of, or as a synonym for, development, education or sophistication.
The reasoning examined below involves some of these conceptual challenges, chiefly those resulting from extremely large numbers, and those resulting from the metaphorical rather than formal application of variety to human epistemic processes.

4. Ashby's law of requisite variety

Ashby (1965, pp. 206-207) states that a control system, to exhibit effective (i.e. strict and therefore reliable) control, must be void of ambiguity, so that for each state displayed by the controlled, there exists exactly one corresponding state on the side of the controller. In keeping with the example of traffic light signals, this can be illustrated with the constraining of motorist behavior vis-à-vis traffic lights in driving school. Thereby constrained to only four principal responses to the four traffic light signals, motorists "match" the variety of traffic lights and consequently are under effective control. Known as Ashby's Law of Requisite Variety (or, in short, Ashby's Law), this precondition of effective control is commonly phrased as follows: For a system to be in control, the number of states of the controller must be greater than or equal to the number of states of the controlled. In other words: If the controlled has a number of states which the controller cannot match, then there is ambiguity, and the system is not in effective control.

3. Bremermann's limit and transcomputability
In a separate context, Bremermann (1962) argues that, as any computation occurs in time and in some physical substrate, the limits of the physical universe impose limits on computational possibilities. Within a finite volume and in a finite period, matter has, according to Bremermann, a limited capacity to process data. He conjectures that one gram of matter can, as a matter of principle, process no more than 10^47 bits per second. Multiplying this value by the mass of planet Earth and the duration of its existence shows that our planet, were it used entirely as an extremely efficient computer, could in principle not have processed more than 10^93 bits[2]. Computational tasks beyond this so-called Bremermann's Limit are therefore considered transcomputable, i.e. unsolvable by Earthbound computational means. Ashby (1991, p. 167) extrapolates from Bremermann's Limit the limit of data processing at the universal scale and states that "Everything material stops at 10^100". Given that quantities of such magnitudes can arise through permutation even in relatively small systems, and given his Law of Requisite Variety, Ashby (1991, p. 168) recognizes that the in-principle limitations of computability identified by Bremermann affect our grasp of control systems:

The systems theorist may [...] be defined as a man, with resources not possibly exceeding 10^100, who faces problems and processes that go vastly beyond this size. What is he to do?

Klir (1991, pp. 121-123) explains how Bremermann arrived at the figures 10^47 and 10^93, and demonstrates how problems involving such vast numbers can arise using, as an example, the exhaustive testing of systems with multiple variables (Klir, 1991, p. 123). The exhaustive testing of a system with n variables, each of which may hold k different values, becomes a transcomputable challenge even with apparently moderate values for n and k. Table I lists nine such combinations of n variables with a variety of k different states.
Systems with any of these combinations of values for n and k approach Bremermann's Limit. The exhaustive testing of a system with 194 variables, each with a variety of just three different states, for example, approaches transcomputability. Such a system can therefore, as a matter of principle, not be exhaustively tested using all the matter and time of our planet. As computational processes generally have far fewer resources available than the mass of our planet throughout the time of its existence, significantly smaller systems can exceed practical limits of exhaustive testing. Systems with great variety are therefore in practice not tested exhaustively but by alternative, "variety-reducing" test methods, such as exploratory testing, random ("monkey") testing, or testing within the bounds of typical, representative or critical scenarios only.
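Both the planetary bound and the k^n growth of exhaustive testing can be checked with back-of-the-envelope arithmetic. The mass and age figures below are my own round assumptions (roughly 6 × 10^27 g and 10^10 years), chosen only to reproduce the order of magnitude:

```python
import math

# Bremermann's planetary bound: ~1e47 bits per gram-second, times an assumed
# Earth mass (~6e27 g) over an assumed Earth lifetime (~1e10 years in seconds).
log10_limit = math.log10(1e47 * 6e27 * (1e10 * 3.15e7))

# Exhaustive testing of n variables with k values each requires k^n test cases.
def log10_cases(n, k):
    return n * math.log10(k)

print(round(log10_limit, 1))          # ~92.3, i.e. on the order of 10^93
print(round(log10_cases(194, 3), 1))  # ~92.6: 3^194 already approaches 10^93
```

One additional three-valued variable (n = 195) pushes the count of test cases past the 10^93 mark.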

5. Glanville's corollary of Ashby's law of requisite variety
Since its early formation as an academic discipline, cybernetics has addressed feedback, and thus circularity. Taking the concept of feedback to its logical conclusion, Glanville points out that feedback systems are, as far as control is concerned, symmetrical: A affects B and B affects A (Glanville, 2007a, pp. 1181-1182). This contradicts prevalent narratives of linear determinism, i.e. the notion that A affects B only (Fischer, 2015, pp. 1233-1236). Glanville explains this with the example of the household thermostat, consisting of a heat-sensitive switch and a heater. With a variety of two states in the heat-sensitive switch ("too cold" and "too warm") and two states in the heater ("on" and "off"), the system is characterized by requisite variety. While conventional wisdom states that the switch causes state changes in the heater, the heater, with the heat it emits, also causes state changes in the temperature switch. The system is circularly causal, as far as the control relationship is concerned, and in this sense symmetrical. The terms "controller" and "controlled" are therefore observer-dependent and essentially interchangeable (Glanville, 2007a, p. 1182; 2012, p. 524; 2014b, p. 5)[3]. In light of this symmetry and interchangeability, Ashby's requirement for the controller to have at least as many states as the controlled requires refinement. This refinement is offered by Glanville's (2009, pp. 70, 121) Corollary of Ashby's Law of Requisite Variety, which states that "The variety of the controller and the controlled must, for effective control to take place, be the same". If either side in a control system has states that the other side cannot match, then there is ambiguity, and the system is out of control (Glanville, 2012, p. 
526). To bring out-of-control feedback systems under effective control from an external perspective, two interrelated matchmaking steps are required (Fischer, 2011, p. 1009):

(1) The establishment of equal variety in both elements by way of variety reduction and/or variety amplification (Fischer, 2014, pp. 1332-1333). Variety reduction involves the elimination or grouping of states. Having a coarsening effect, this strategy may be seen as having an impoverishing impact. Variety amplification involves the (likely demanding or expensive) addition of new states. In technical systems, this is typically achieved by way of upgrades. In living systems, it is a consequence of growth, development, learning and innovation. If the impoverishing impact of variety reduction is to be avoided, variety in the respective other element may be amplified. This is what is meant by the cybernetic expression "Only variety can absorb variety" (Beer, 1975, p. 34)[4].
(2) The establishment of a suitable protocol between both elements in the form of a full and unambiguous mapping of the states in either element to the states in the respective other element.
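Step (2) can be sketched as follows. This is a hypothetical illustration, not taken from the source: a "protocol" modeled as a state mapping, with a check that fails whenever either side has unmatched states or two states share a response (the state names are invented for illustration):

```python
# Hypothetical protocol between traffic-light states (controlled) and
# motorist responses (controller); names are illustrative only.
protocol = {
    "red": "stop",
    "red-and-amber": "prepare to go",
    "green": "go",
    "amber": "prepare to stop",
}

def effective_control(mapping, controlled, controller):
    """Full and unambiguous: a bijection between the two state sets."""
    covers_controlled = set(mapping) == set(controlled)
    covers_controller = set(mapping.values()) == set(controller)
    unambiguous = len(set(mapping.values())) == len(mapping)
    return covers_controlled and covers_controller and unambiguous

lights = ["red", "red-and-amber", "green", "amber"]
responses = ["stop", "prepare to go", "go", "prepare to stop"]

print(effective_control(protocol, lights, responses))                 # True
print(effective_control(protocol, lights + ["flashing"], responses))  # False
```

The second call fails because one state of the controlled ("flashing") has no matching response, i.e. the ambiguity Glanville's Corollary rules out.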
6. Types of control and feedback, and the value of being out-of-control

Developing and simplifying Pask's (1976; 1992, p. 11) Conversation Theory, Glanville offers key contributions to the expansion of the more technical and deterministic first-order cybernetics to the gentler and more general second-order cybernetics. This broader perspective takes the subjective observer into account and extends the application of cybernetic principles and ideas to various non-technical subject matters, including epistemic processes. In this context, Glanville (2007b, pp. 378-384) describes conversations as circularly-causal, out-of-control feedback loops of acting and understanding between a self and an other (Glanville, 2007b, 2014a).
In his various writings, addressing different issues, Glanville distinguishes different kinds of control[5]. He differentiates, for example, between "Hitler control" and, based on a skiing metaphor used by Humberto Maturana, "skiing control" (Glanville, 2009, p. 307). In an earlier version of the same text[6], Glanville qualifies "Hitler control" as "control [...] used to enforce restriction, the reducing of choices" and "skiing control" as "control that allows us to stay [...] stable in the face of perturbations". Elsewhere, Glanville (2014c, pp. 54-56) distinguishes between "restrictive" and "facilitative" control. Glanville also distinguishes between control in the non-cybernetic sense and control in the cybernetic sense, and relates the latter to the cybernetic concept of conversation (Glanville, 1997, pp. 43, 45). He furthermore distinguishes three alternative strategies for engaging others that exhibit more variety than the self has to offer. These strategies are the "military/fascist" reduction of variety in the other, mutual control, and variety amplification on the part of the self (Glanville, 1997, p. 46). Table II offers an overview of these distinctions, as interpreted by the current author.
What sets the approaches and attitudes distinguished in Table II apart is that difference, noise and error are regarded as nuisances to be avoided or corrected on the left-hand side of the table, while being accepted as endemic, and welcome for being productive, on the right-hand side (Glanville, 2000; 2007a, p. 1181; 2014b, pp. 4-10). Moreover, relationships that fall into the column on the left-hand side of Table II are limited by the variety available on the part of the controller, while on the right-hand side varieties are themselves variable and subject to negotiations unfolding in the respective relationships.

While we may be culturally inclined to seek and to exercise the restricting (effective) control listed on the left-hand side of Table II, Glanville invites those who wish to learn and to design (i.e. those who wish to amplify variety) to join him in seeking the conversational feedback relationships that fall on the right-hand side of Table II. Out-of-control and non-determinable, these relationships tend to offer greater stability in the face of unexpected perturbations (Glanville, 2009, p. 307). They also have the potential to lead to new insight, and thereby to increase the repertoire of "states" (i.e. to amplify variety) in one or both parties to the encounter (Glanville, 2007a, pp. 1188-1189). Glanville accordingly describes design as a circularly-causal conversation with a negotiable destination, conducted by a self with an at least somewhat non-determinable other, such as a person, an imagined person, a piece of technology, physical material, or pen and sketching paper (Glanville, 1997, p. 47; 1999, p. 88; 2009, p. 146).
In this view, epistemic processes such as learning and designing are defined by, or better, synonymous with, variety amplification. Ambiguity, "lucky accidents" and misunderstandings are, accordingly, not seen as "noise", but as sources of inspiration and novelty. They are not only valued but, at least in the case of many experienced designers and artists, actively sought. As the value of the unexpected only shows itself when it is encountered with openness towards the previously unknown, Glanville stresses the importance of "listening", a term he applies metaphorically to all modes of open-minded perception (Glanville, 2007b, p. 1189; Fantini and Glanville, 2013). If these processes remained in efficient control with enforced restrictions on variety, and if the unexpected and surprising were thus rejected, these processes would hardly lead to new insights and to amplified variety, and therefore would not qualify as learning or designing.
Aiming to enable rather than to restrict (Glanville, 2007a, p. 1181; Fischer, 2007), the invitation to get out of control is, in essence, both an epistemic recommendation and an ethical position. Glanville reinforces his invitation by arguing that attempts to achieve requisite variety with all but the simplest of systems are futile on technical, quantitative grounds. He illustrates this argument on several occasions, using an educational as well as a designerly example. These are briefly outlined in the following section.

7. Robinson's classroom scenarios and Alexander's material combinations in design
Arguing in favor of "control that allows us to stay [...] stable in the face of perturbations", against the pursuit of requisite variety in classroom settings, and hence against "control [...] used to enforce restriction, the reducing of choices" in human interactions and epistemic processes, Glanville recurrently refers to Bremermann's Limit and to Robinson's (1979) discussion of transcomputability in classroom settings (Glanville, 1997, p. 43; 1998a; 1998b, p. 126; 2007a, p. 1187). Robinson (1979, pp. 377-378) conservatively assumes a variety of about 100 brain states per pupil. A class of 30 such "pupils" then generates a variety of 100^30 = 10^60 states, such an unimaginably large number that we would need an "earth-mass computer" before we could even think about processing it. Glanville (1998a; 1998b, p. 58) adapts this example, assuming, less conservatively, that the human brain has a variety of about 1,000,000,000 different states. A classroom with 30 learners thus has a combined variety of 1,000,000,000^30 states, i.e. about 10^270 states. It must be concluded that achieving parity with such an enormous, transcomputable variety is far out of reach for any one individual. Any attempt by a teacher, with her or his own limited variety of available brain states, to achieve requisite variety with, and effective control of, a group of learners is therefore futile. With regard to design, Glanville refers to an illustration which he attributes to Christopher Alexander (Glanville, 1997, p. 43; 1998a; 1998b, pp. 417-418): As there are 10^20 possible combinations of chemical elements, the design challenge of computing which combination of five elements provides the ideal collection of materials for a purpose, such as building a room using only five materials, is vastly transcomputable. Such an ideal combination of five materials can therefore not be identified by exhaustive consideration or testing.
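The classroom combinatorics can be verified in a line or two (taking, per the 10^60 figure, 100 states per pupil, and, per Glanville, 10^9 states per person):

```python
import math

# Robinson: ~100 brain states per pupil; Glanville: ~10^9 per person.
robinson_log10 = 30 * math.log10(100)   # 100^30 -> 10^60 combined states
glanville_log10 = 30 * math.log10(1e9)  # (10^9)^30 -> 10^270 combined states

print(robinson_log10, glanville_log10)  # 60.0 270.0
```

The growth is multiplicative across pupils, which is why modest per-pupil figures yield astronomical class-level figures.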

8. Analysis and discussion
Notwithstanding this author's desire to champion Glanville's ethical position, some of Glanville's supporting arguments involving transcomputability deserve critical examination. As an entry point for this examination, we may consider some of the above-mentioned terminological choices. Robinson (1979, pp. 377-378) notes that variety in a classroom is too vast for us to "even think about processing it", and Glanville (2000, p. 28) states that:

At a certain size, i.e. 10^100, numbers become what is called transcomputable and therefore impracticably big. That is, it is inconceivable that there could be any (physical) computing device powerful enough to compute their values.
These statements hinge on the terms "processing" and "to compute the values of numbers", neither of which is rigorously defined within its respective context. In addition to both authors' more obvious references to computing, Glanville's choice of terms appears to allude particularly to Turing's (1936, p. 230) definition of computable numbers "as the real numbers whose expressions as a decimal are calculable by finite means". Already at this point, objections may be raised, as references to the "computing of the values of numbers" in the context of cybernetic relationships unduly conflate computable numbers, as atemporal outcomes of algorithmic processes, on the one hand, with control, regulation and conversation, as processes extending in time, on the other hand.
Furthermore, epistemic and computational approaches, even to identical challenges, tend to vary considerably. It has been pointed out that in chess play, for example, computers may evaluate "game trees" of possible combinations of future moves which, even at relatively shallow depths, include millions of paths, whereas human players consider rather small numbers of possible moves (Simon and Chase, 1973, p. 394). In this way, human epistemic capability appears to rest, to a considerable degree, on a seemingly effortless ability to ignore vast ranges of options. Parallels between epistemic and computational processes are therefore better taken metaphorically than literally. Attempts to exercise strict control in epistemic processes would, preposterously, require technical accuracy and thus be literal. Opposed to such undertakings, Glanville provisionally follows this line of reasoning in order to reject it on technical grounds, with reference to transcomputability. In doing so, Glanville engages the mindset he wishes to rebut on its own terms, which is a potentially powerful rhetorical strategy. Yet, at the same time, he implicitly accepts literal parallels between epistemic and computational processes. Rather than a fault in Glanville's particular line of reasoning, this issue arises from the conceptual and ethical challenges between technical and metaphorical uses of the term variety (Section 2). More urgent objections may be raised, however, based on discrepancies between the arithmetic underlying the concept of transcomputability on the one hand and the relational properties of control systems on the other hand. Robinson and Glanville refer to processing and computing to point to the prohibitively vast numbers that result from exhaustive permutation. Glanville in particular argues that these numbers preclude the possibility of requisite variety and therefore of effective control. To test this argument, let us consider some control scenarios: A lecturer sets
up the lighting in her lecture theatre. There are eight groups of lights mounted under the ceiling, controlled by eight on/off wall switches respectively. The lecturer is unlikely to perform an exhaustive permutation of all 2^8 = 256 switch settings in order to find the optimal configuration. Instead, she will probably settle for a good-enough combination of switch settings after exploring, say, a dozen of them. Regardless of whether she performs an exhaustive search or not, however, a complete permutation is not necessary to exercise efficient control of the lights. In manipulating the eight switches, and in observing the illumination provided in turn by the eight groups of lights, the lecturer reduces her variety to that of the lighting setup, thus achieving requisite variety and controlling it effectively and without ambiguity. The system is therefore in effective control with regard to requisite variety between lecturer and lighting setup.
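A hypothetical sketch of this satisficing search (the lecturer's preference function is invented purely for illustration, as is the sample size of a dozen):

```python
import random

random.seed(1)  # reproducible illustration

def satisfaction(setting):
    # Invented stand-in for the lecturer's judgment of a switch setting;
    # here simply the number of light groups switched on.
    return sum(setting)

best, explored = None, 0
for _ in range(12):                      # "a dozen" settings, not all 2^8 = 256
    setting = tuple(random.randint(0, 1) for _ in range(8))
    explored += 1
    if best is None or satisfaction(setting) > satisfaction(best):
        best = setting

print(explored)     # 12 settings explored, out of 256 possible
```

The point is not the search strategy itself but that control over the eight switches is exercised throughout, without any exhaustive permutation.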
Expert "cubers" solve the 3-by-3-by-3 Rubik's Cube in well below 100 moves. They are clearly in control of the cube and its movable parts. Performing, at any time, one out of the 20 different possible moves, and observing the resulting color patterns, cubers achieve requisite variety with, and control of, the cube. Yet, their arrivals at correct solutions are not the result of exhaustive permutation. Such a permutation is not humanly possible. Producing all possible configurations of the 3-by-3-by-3 Rubik's Cube, at a rate of one turn per second, would, according to its producers (Seven Towns, 2010, p. 10), take 100 times the age of the Universe to complete.
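The producers' estimate can be checked against the cube's known state count (the age-of-universe figure of roughly 13.8 billion years is my assumption):

```python
# 3x3x3 Rubik's Cube: known number of reachable configurations.
CUBE_STATES = 43_252_003_274_489_856_000
UNIVERSE_AGE_S = 13.8e9 * 3.15e7     # ~13.8 billion years, in seconds

ratio = CUBE_STATES / UNIVERSE_AGE_S
print(round(ratio))   # on the order of 100 universe-ages at one turn per second
```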
The artist Benjamin Heidersberger has composed (and produces digital devices that perform) a piece titled Pentatonic Permutations (Nückel, 2017). The piece is an exhaustive permutation of 11 tones of the hemitonic pentatonic scale C, D#, F, G, B and five pauses within a scheme of 16 positions based on the 16 prime numbers between two and 53, at a rate of one position (tone or pause) per second. The duration of the piece is 16 trillion years, beginning with the Big Bang. Any performance of the piece starts not at the beginning of the permutation but, synchronized via radio clock signal[7], at its current position since the beginning of cosmic time. So far, about 0.1 per cent of the composition has elapsed. There is no element of chance in either the composition or the performance of this piece. Pattern permutation and sound wave generation (using samples of a Steinway grand piano) are performed by a modest Raspberry Pi 2 computer.
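The "about 0.1 per cent" figure is consistent with the stated 16-trillion-year duration (taking, as an assumption, roughly 13.8 billion years since the Big Bang):

```python
elapsed_years = 13.8e9        # assumed time since the Big Bang
duration_years = 16e12        # stated duration of Pentatonic Permutations

share = 100 * elapsed_years / duration_years
print(round(share, 2))        # ~0.09 per cent elapsed so far
```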
These control scenarios show that requisite variety and efficient systemic control on the one hand, and the permutation of all combinations of possible states of a system on the other hand, are not the same. Glanville conflates not only these two issues but also, on occasion, the far greater numbers of potential criteria for exhaustive testing. This is evident in an example used by Ashby and subsequently taken on by Glanville. Ashby (1991, p. 167) illustrates transcomputability with an array of 20 by 20 lamps, each with the two states "on" and "off", thus having 2^400 (i.e. roughly 10^120) possible combined states.
This example allows, prior to a discussion of Glanville's take on it, for a brief reiteration of the point just made. The exhaustive permutation of an array of 20 by 20 two-state lamps does indeed pose a challenge of astronomical proportions[8]. From the perspective of control, however, consider that the computer screen used to type this article has 2,880 by 1,800 pixels at 30-bit color depth, i.e. 5,184,000 pixels with 1,073,741,824 possible colors each, and can thus assume 1,073,741,824^5,184,000 possible states. Despite having a far greater variety than the array of 20 by 20 two-state lamps, the computer screen is controlled effectively 60 times per second within a portable-sized material substrate. Obviously, the control of a set of states can remain well below Bremermann's Limit, even when the exhaustive permutation of these states would be a task of vastly transcomputable proportions! Ashby explains that if all possible combinations of states of the array of 20 by 20 two-state lamps were to be separated into two sets based on some criterion (depending on whether some characteristic is given or not), then that criterion would be selected out of 2^(10^120) possible criteria. This number equals 10^(10^119.5), which is approximately, and more conveniently, written as 10^(10^120). Ashby contrasts this number with the much smaller figure 10^(10^80), which, in our common notation, is a one followed by 10^80 zeros. Ashby (1991, p. 167) argues that, since there are only about 10^73 potentially bit-carrying atoms in the universe, this string of digits "cannot be written [...] in our universe". Glanville (2009, p. 
141) takes up this example and, abbreviating Ashby's above reasoning, concludes:

[T]here are not enough atoms in the universe to mark each of [the] potential states [of the array of 20 by 20 two-state lamps]. This condition, where the numbers required exceed those of the atoms in the (physical) universe, is referred to as transcomputability. There is nowhere for the individual states of the system (the variety) to be marked. And thus Ashby's Law of Requisite Variety cannot in practice be satisfied [in attempts to control the array of lamps].
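Ashby's double-exponent figures can be checked by taking logarithms twice (my reading of the example: the array has 2^400 ≈ 10^120 states, and each criterion selects a subset of these states, giving about 2^(10^120) possible criteria):

```python
import math

log10_states = 400 * math.log10(2)      # 2^400 -> ~10^120.4 states

# Criteria ~ 2^(10^120); taming the double exponent with two logarithms:
# log10(2^(10^120)) = 10^120 * log10(2), hence
log10_log10_criteria = 120 + math.log10(math.log10(2))

print(round(log10_states, 1))           # 120.4
print(round(log10_log10_criteria, 1))   # 119.5 -> criteria ~ 10^(10^119.5)
```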
This statement conflates requisite variety and efficient control with the size of the set of possible criteria for exhaustive evaluations of a given variety. The practical task of efficient control requires merely requisite variety between the controller and the controlled. Many (including relatively large) control systems would transcend Bremermann's Limit only if their states were to be permuted and evaluated exhaustively. The likely astronomical numbers of possible criteria for such evaluations are, as Ashby (1991, p. 168) states, a problem of "the systems theorist". They are not a problem of practical controllers. Glanville's above statement furthermore implies that requisite variety and efficient control require the dedicated marking of all possible combinations of the controlled system's states. However, even the above-mentioned computer screen, which has a far greater variety than the array of 20 by 20 two-state lamps, is controlled efficiently in the absence of explicit and individually dedicated representations of all possible combinations of states[9].
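The screen comparison can be made concrete (assuming 2,880 by 1,800 pixels at 30-bit color, refreshed 60 times per second): controlling the screen means supplying one state per frame, a channel demand nowhere near Bremermann's Limit, even though merely writing out the number of possible screen states would take tens of millions of digits:

```python
import math

pixels = 2880 * 1800                    # 5,184,000 pixels
bits_per_frame = pixels * 30            # 30-bit color depth
bits_per_second = bits_per_frame * 60   # 60 frames per second

# Channel demand of actual control: ~1e10 bits/s, far below 1e47 bits/(g*s).
print(f"{bits_per_second:.2e}")

# The count of possible screen states, by contrast, in decimal digits:
digits = bits_per_frame * math.log10(2)
print(round(digits))                    # ~46.8 million digits
```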

9. Conclusion
Efficient control depends on requisite variety between the controller and the controlled (the assignment of these two roles being in effect interchangeable). A surplus of variety on one side of a potential control relationship may be constrained relatively easily to achieve requisite variety, albeit at the expense of some loss of granularity or fidelity the respective other side might be capable of supporting. A lack of variety, however, can be harder to remedy, especially in living systems, which do not afford upgrading interventions in the way many technical systems do. Various living systems are capable of variety amplification, for example by way of growing, developing, learning and innovating. Such variety amplification capabilities, however, are relatively modest. This can prohibit individual humans from absorbing the variety even of seemingly humble combinations of elements, such as the combined variety of small groups of humans. This is primarily because variety combination is not a matter of summation, but one of exponential aggregation.
Glanville recommends the abdication of efficient (i.e. strict) control in epistemic processes. The reasoning underlying this recommendation is multilayered. As discussed, one may not be able to match the variety of what one is engaged with, even with some capability for variety amplification. More critically, where living and social systems are engaged, the restriction of variety in those systems is likely undesirable from an ethical perspective. Variety restriction in the other may also be inexpedient from an epistemological perspective, as variety restriction is, at least superficially, antithetical to the purpose of epistemological processes, which is variety amplification [10].
On these grounds, Glanville's recommendation to assume out-of-control postures in epistemic processes is cogent and consistent with second-order cybernetic values and principles. Glanville's substantiation of his recommendation on quantitative grounds with reference to transcomputability, however, is partially flawed. Glanville suggests that requisite variety for efficient control is quantitatively akin to exhaustive permutation. This is agreeable to the extent that both requisite variety and exhaustive permutation demand, by definition, uncompromising fullness. It is not agreeable in as far as requisite variety per se does not imply the data processing and storage demands that push exhaustive permutations of even small systems beyond Bremermann's Limit. Glanville not only conflates quantities of variety (a concern of practical controllers) with quantities of the exhaustive permutation of variety (a concern of system testers), but also with the sizes of sets from which criteria for exhaustive testing may be chosen (a concern of systems theorists). This, too, is inaccurate. Nonetheless, any rejection of Glanville's recommendation to assume out-of-control postures in epistemic processes, on grounds of his partially inaccurate quantitative reasoning, would be most unfortunate.
Notes
1. The notion of variety will be introduced in more detail in the following section of this article.
2. The large figures cited here are not to be taken as literal or precise but as careful estimates.
3. This is not the case with simple traffic lights that are not feedback-based but driven by timed sequencers and unaffected by traffic movement. More advanced traffic control systems, however, monitor traffic movement and adapt their control signals accordingly. In this case, there is circularly-causal control: traffic light signals affect traffic flows, and traffic flows affect traffic light signals. Both the thermostat and feedback-based traffic control combine more focused and directed signal paths (electric current, light signals) and less directed, aggregate signal paths (heat transfer, traffic movement). This apparent asymmetry, among other factors, can obscure circularly-causal relationships (Fischer, 2015, pp. 1236-1238).
4. When Ashby (1956, p. 207) originally presented this insight, he used the verbs "force down" and "destroy" instead of "absorb", which Glanville, in our private exchanges, repeatedly described as an unfortunate choice of terms. Glanville also expressed his dissatisfaction with the use of the word "only" in this context.
5. Glanville notes that the ambiguity of the term control is "almost always unfortunate" (Glanville, 2009, p. 309).
9. Pixel locations can be addressed, and pixel states can be determined, not by activating individually marked triggers, but via a few permutable numerals communicated as bit patterns on shared data buses.
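The addressing scheme described in this note can be sketched briefly. The screen dimensions below are assumed for illustration only:

```python
# A sketch of coordinate-based pixel addressing, assuming a screen
# of 1920 by 1080 pixels. Any single pixel is reached via two
# coordinates packed into a short bit pattern; no trigger dedicated
# to any whole-screen state combination is needed.
import math

width, height = 1920, 1080
pixel_count = width * height
address_bits = math.ceil(math.log2(pixel_count))

print(address_bits)   # 21 bits suffice to address any one pixel
```

A handful of address bits thus stands in for what would otherwise require individually dedicated representations of astronomically many joint states.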
10. The effects of variety restriction for the purpose of epistemological variety amplification require a more differentiated discussion, which is touched upon in Fischer and Richards (2017, p. 38). This discussion goes beyond the scope of this article.

Table I.