I am delighted that Stephen Durnford has agreed to provide us with this fascinating exposition of how instinct might work. We at www.animalnav.org are always seeking to go beyond phrases that do not describe in detail how things work, so Durnford's suggestion that instinctive behaviour is passed from generation to generation encoded in DNA (the portion presently referred to as "junk") is a powerful argument that points to research that needs to be done.
This work is in two parts: the first is a general explanation; the second is a more detailed discussion, together with a mathematical modelling exercise showing how this may all work.
You should also look at the work done by Miriam Liedvogel here:
DNA, learned behaviour and instinct
by Stephen Durnford
Who has not wondered how this or that species has managed to learn a particular behaviour pattern and, furthermore, to produce offspring who exhibit the same behaviour? Sometimes the young do indeed observe and imitate a parent, but this cannot occur in the vast number of species where no such intergenerational contact takes place. The question of how such instinctive behaviour is first created was posed recently by Linda Geddes in New Scientist (7th Dec. 2013, p. 10) in relation to the heritability of a certain olfactory alarm response in mice. In a subsequent letter (18th Jan. 2014, p. 29) NS reader John O’Hara wrote from Australia to remind us that the NS of 24/31 Dec. 2011, p. 37, reported the outcome of research by Joe Hutto with wild turkeys in the US. Fourteen eggs were incubated and the hatchlings raised to adulthood without contact with other wild turkeys, yet the birds had an innate ability to recognise various calls, among them not only those signalling danger but also those relating to socialisation. So we have mammals and birds with this ability in our sample, and no obvious reason to deny it to other forms of life too.
In “Memories passed down generations” Linda Geddes asked: “How could a fearful memory of a smell wheedle its way into eggs and sperm and change the behaviour of future generations?” This unanswered question stands as a challenge that invites both thought from philosophers and research from scientists. The genome encodes the blueprint of an entire physical organism and is the only known mechanism for the transmission of attributes. On top of this, DNA specifies not only how the organism will grow, mature and reproduce, but also those aspects of its behaviour which we traditionally label as instinctive. Such instinctive behaviour must have been acquired during life at some time in the past, and there is no reason to suppose that the mechanism enabling this is no longer active. Thus we are back with Linda Geddes’s question.
Only about 5% of our DNA produces the proteins of which we are composed, and this portion also includes the regulation of those proteins, i.e., the “software” that specifies some of their behaviour. That leaves some 95% awaiting elucidation and, if it is without function, rather a lot for a cell to carry around. It seems only logical to look here, in what is often called “junk” DNA, for the “software” that deals with everything else, including the “rules” for growth, maturation and reproduction. These encoded “rules” also get transmitted to later generations and clearly differ in function, though not in structure, from the mechanical-chemical protein-building sequences that have been the main focus of research so far. The brain, built by those mechanical-chemical sequences, also contains the “rules” for controlling involuntary bodily functions and behaviour, including what we call instinct. Why should memory in all its forms not also be encoded in a similar way? It is easier to look within the genome than outside it.
“Epigenetic” is the conventional term for inherited attributes that cannot (yet) be pegged to DNA, and creating this concept is, in effect, simply the standard way we label something unknown in order to research it systematically. Other instances are, from 1667 to 1778, “phlogiston” for the combustible part of matter and, on-going, “dark matter” for hidden mass in the universe. Given that we are faced with a large amount of “junk” DNA on the one hand and a quantity of epigenetically transmitted phenomena on the other, it seems only rational to bring the two together. If DNA is the medium in which all conscious and unconscious memory is encoded, then it is surely little surprise that some part of it, for whatever reason, also ends up in the heritable genome itself. How else could a migratory species, for example, keep its members and their descendants in tune with our planet’s shifting continents? If DNA can achieve all these roles, then it is only a matter of on-going research to discover which sequences encode what information and how, and which may then also get carried into the genome of subsequent generations.
As a start, there are 64 possible varieties of DNA codon, offering a rather richer set of combinations than the base 2 that, with the digital computer as model, is often used in calculating the information capacity of a brain cell. If you take the estimated number of cells in a brain, human or of another species, or any chosen sub-area thereof, multiply by the average proportion of the genome in each cell that is currently deemed “junk”, then again by the number of codons in a typical cell’s genome, you get an idea of the possible quantity of codons, the information-bearing components, held in that brain tissue. Since each codon comprises three of the four letter-coded chemical bases, it equates to one digit of mathematical base 64 (the equivalent of six binary bits), as against a conventional bit of base 2. Digital computing settled on the 8-bit byte as its unit for doing something useful with, and someone may perhaps already have ventured an opinion about a DNA equivalent. It also occurs to me that a digital representation of a DNA sequence could be little more than the mathematical translation of values between encodings that both use powers of 2.
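The arithmetic above can be sketched as a short back-of-envelope calculation. Every figure used here (brain cell count, "junk" fraction, genome length) is an illustrative assumption chosen only to make the multiplication concrete, not a measured value:

```python
import math

# Back-of-envelope sketch of the capacity estimate described above.
# All numeric figures are illustrative assumptions, not measurements.
BASES_PER_CODON = 3
CODON_VARIETIES = 4 ** BASES_PER_CODON        # 64 possible codons
bits_per_codon = math.log2(CODON_VARIETIES)   # one base-64 digit = 6 binary bits

brain_cells = 86e9       # assumed cell count for a human brain
junk_fraction = 0.95     # assumed proportion of the genome deemed "junk"
genome_bases = 3.2e9     # assumed bases in one copy of the genome
codons_per_cell = genome_bases / BASES_PER_CODON

junk_codons = brain_cells * junk_fraction * codons_per_cell
capacity_bits = junk_codons * bits_per_codon
print(f"~{junk_codons:.2e} 'junk' codons, ~{capacity_bits:.2e} bits")
```

The point the sketch makes is simply that one codon carries six binary bits of raw combinatorial capacity, so any codon count converts to a conventional bit count by multiplying by six.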
Be that as it may, consider a motion sensor that switches on a CCTV camera equipped with face-recognition software. Faces thus recognised are stored, and the timestamp, frequency and duration of each in the field of view are assessed algorithmically, so that the camera is automatically moved to track the most frequent. As the population of people in the field of view changes over the weeks, so the camera’s apparent behaviour changes to match. That is present-day technology, and one could set up a good model of the phenomenon we are concerned with. Rather than having a database shared between cameras, one would allow each to be painted with coded markings that are visible to the other cameras in the local cluster, and the movements of each would be monitored algorithmically by all the others. You would then have a collection of quasi-organisms behaving in unison, as though sharing an esoteric information link. Further, each local cluster would evolve a separate group behaviour pattern and diverge from other clusters over time. If the cameras are also mobile, i.e., each unit is a robot under autonomous control, and the algorithm has some sort of overall strategy, such as “stick together” or “avoid other clusters”, then the behaviour becomes complex and “life-like”. I am sure that plenty of research of this kind is going on, but perhaps to produce military and commercial products, and is not thought of much as modelling animal learning. The shoaling of fish and the murmurations of starlings would equate to the dynamic collective behaviour just described. Managing a migration would then depend on an algorithm somehow tuned by earlier experiences.
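The camera-cluster model can be caricatured in a few lines of simulation. In this minimal sketch (everything here is an illustrative assumption: the one-dimensional positions, the "coded markings" reduced to integer identities, and the "stick together" rule reduced to moving halfway toward the most frequently seen neighbour) a handful of mutually observing agents converge into a cohesive cluster:

```python
from collections import Counter
import random

# A minimal sketch of the mutually observing camera cluster described
# above. Identities, positions and the movement rule are illustrative
# assumptions, not any real CCTV or robotics product.

class Agent:
    def __init__(self, ident, x):
        self.ident = ident        # "coded marking" visible to others
        self.x = float(x)         # position along a one-dimensional track
        self.seen = Counter()     # how often each other agent was observed

    def observe(self, others):
        for other in others:
            if other is not self:
                self.seen[other.ident] += 1

    def step(self, others):
        # "Stick together": move halfway toward the most frequently seen agent.
        if not self.seen:
            return
        target_id = self.seen.most_common(1)[0][0]
        target = next(o for o in others if o.ident == target_id)
        self.x += 0.5 * (target.x - self.x)

random.seed(1)
cluster = [Agent(i, random.uniform(0, 100)) for i in range(5)]
for _ in range(20):
    for a in cluster:
        a.observe(cluster)
    for a in cluster:
        a.step(cluster)

spread = max(a.x for a in cluster) - min(a.x for a in cluster)
print(f"cluster spread after 20 rounds: {spread:.6f}")
```

Starting from scattered positions, the spread shrinks toward zero: unified behaviour emerges purely from each agent's private record of what it has observed, with no shared database, which is the essence of the quasi-organism idea above.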
In computing terms data is data, but some of it may be a representation of an executable sequence, i.e., a stored program or algorithm. It is commonplace to use one program to compose or edit another that can be set running in due course. There is an almost endless number of programming environments, e.g., Windows, Fortran, Excel. Each such environment is in effect a pre-existing interactive algorithm that encodes and decodes both passive data and executable statements, the latter stored as passive data (“code”). It does not matter what the data is, and how it is organised when encoded is arbitrary, as long as it is internally complete and consistent, which is why the latitude for variation among different programming conventions is effectively limitless, as is, for the same underlying structural reason, the variation among human languages and, further, among the belief systems expressed in those languages. Observing different brain cells lighting up under various stimuli forms part of current research. Once one can allow that DNA may store memories of all kinds, and also that its data may be a mix of executable sequences and their reference values, then the key to understanding it could lie initially in how it encodes externally gained information of the simplest kind. How one can examine the DNA of a single cell both before it lights up and then again afterwards is the challenge for the designers of experiments. Which sequences then become heritable, and how, would make up the follow-on stages of investigation.
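The idea that an executable sequence can itself be stored and transmitted as passive data is easy to demonstrate. In this sketch, one program composes another as an ordinary string, then decodes and activates it; the "alarm" rule and its threshold are purely illustrative assumptions standing in for an acquired behaviour:

```python
# A minimal sketch of "data that is also an executable sequence": one
# program composes another as passive data (a string of text), then
# decodes and activates it. The rule and threshold are illustrative.

threshold = 10  # a value the composing program has "learned"

# Compose an executable statement as ordinary passive data.
source = f"def alarm(stimulus):\n    return stimulus > {threshold}\n"

namespace = {}
exec(compile(source, "<generated>", "exec"), namespace)  # decode and activate
alarm = namespace["alarm"]

print(alarm(5), alarm(15))  # prints: False True
```

Until the `compile`/`exec` step, the stored rule is indistinguishable from any other piece of data; only the surrounding environment, a "pre-existing interactive algorithm" in the essay's terms, turns it into behaviour.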
© Stephen Durnford 2014