Behavioural genetics-inspired framework for learning
Tuesday, 29 January 2013

 

 
Student
Maitrei Kohli

Supervisors
Prof. George Magoulas
(Dept. of Computer Science & Information Systems)

Prof. Michael Thomas
(Dept. of Psychological Sciences)


Project Details
PhD Research
Started in October 2010, expected to finish by October 2014


Keywords
Behavioural Genetics, Inductive transfer, Artificial neural networks, Genetic Algorithms, Heritability, Multivariate analysis, Ensembles



Project Introduction

The main idea behind the research is to apply the principles of behavioural genetics to bio-inspired computing, with the aim of developing novel methods that support intelligence with generalisation. By drawing on research in psychology that investigates how intelligence develops in humans, the aim is to address current challenges in transfer of learning. The concepts of heritability, covariance and multivariate analysis provide insight into task relatedness and the usefulness of bias.


The Behavioural Genetics-based Approach

Behavioural genetics is the field of study that examines the role of genetics in human behaviour. The key concepts used in our methodology are:

(i) Generalist genes: most behavioural traits are influenced by many genes, each with a small effect size.

(ii) Heritability: the proportion of population variability explained by genetic similarity; it increases throughout development.

(iii) Genes and environment interact throughout development to shape differences in behaviour.

(iv) Twin studies are used to disentangle genetic and environmental effects on behaviour.

(v) Environmental effects are of two types, shared and non-shared; non-shared environmental effects are specialists.

(vi) Multivariate analysis estimates the extent of genetic and environmental contributions to the covariance between multiple traits and to the variance of each trait.

These concepts are embedded in our hybrid framework, which exploits the synergy between Genetic Algorithms and Artificial Neural Networks (ANNs) (Kohli et al., 2012).
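To illustrate how a genetic layer of this kind can be organised, the following is a minimal sketch in Python, assuming a real-valued genome in which groups of small-effect "genes" jointly determine ANN hyperparameters. The genome layout, the decoding thresholds and all names are illustrative assumptions, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed genome: many genes, each with a small effect, jointly
# determine ANN hyperparameters ("generalist genes").
N_GENES = 10
POP_SIZE = 100

def random_genome():
    """A real-valued genome in [0, 1] for one member of the population."""
    return rng.uniform(0.0, 1.0, size=N_GENES)

def decode(genome):
    """Map groups of genes to ANN hyperparameters (illustrative mapping)."""
    return {
        "hidden_units": int(5 + 45 * genome[:4].mean()),    # 5..50 units
        "learning_rate": 0.01 + 0.49 * genome[4:7].mean(),  # 0.01..0.5
        "slope": 0.5 + 1.5 * genome[7:].mean(),             # activation slope
    }

def crossover(parent_a, parent_b):
    """Uniform crossover: each gene is inherited from either parent."""
    mask = rng.random(N_GENES) < 0.5
    return np.where(mask, parent_a, parent_b)

def mutate(genome, rate=0.05):
    """Small random perturbations, mimicking many small-effect variants."""
    noise = rng.normal(0.0, 0.1, size=N_GENES)
    hit = rng.random(N_GENES) < rate
    return np.clip(genome + hit * noise, 0.0, 1.0)

population = [random_genome() for _ in range(POP_SIZE)]
print(decode(population[0]))
```

In such a scheme, genetic relatedness between members can be expressed as similarity between their genomes, which is what allows heritability-style questions (how much of the performance variance is explained by genetic similarity) to be asked of the simulated population.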


Preliminary Results & Findings

A population of 100 ANNs, whose parameters are generated by the genetic algorithm (GA), was trained in two different setups. In the first setup, the population of ANNs was trained using the full training set, which contains all the objects/concepts along with their accepted categorisations/associations (henceforth referred to as the Non Family setup). In the second setup, filtered training sets are used: samples are drawn from the full training set to create a subset for each member of the population (henceforth referred to as the Family setup). This arrangement ensures that each member of the population has a different environment, or training set, and thus simulates the effect of socio-economic status (SES). Although the ANNs are trained on their filtered, or Family, training sets, performance is always assessed against the full training set. A comparison of the Non Family and Family setups demonstrates the impact of variability in the environment, independent of the learning properties of the ANNs. A sketch of how the two setups can be constructed appears after the list below. So far the framework has been tested in two cases:

• English Past Tense Acquisition: the created model achieved 84.4% and 80.0% accuracy on the training dataset, and 82.8% and 75.2% accuracy on the generalisation dataset, when tested in the Non Family and Family modes, respectively. These results indicate that, for the ranges of genetic and environmental variation considered, genetic variation has more influence in determining performance while acquiring the past tense.

• Auto-association: the model achieved an average classification accuracy of 97.3% on the training set, while the average generalisation accuracy was 95.03%.
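The sketch below shows one way the Non Family and Family environments could be constructed, under the assumptions that the full training set is a simple array of input-target pairs and that each Family member receives a fixed-size random subset; the dataset, the subset fraction and all names are hypothetical, not the data used in the experiments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical full training set: inputs X and target categorisations y.
X = rng.random((500, 20))
y = (X.sum(axis=1) > 10).astype(int)

POP_SIZE = 100
FAMILY_FRACTION = 0.6  # assumed size of each member's filtered subset

def non_family_data():
    """Non Family setup: every member is trained on the full training set."""
    return X, y

def family_data():
    """Family setup: each member is trained on its own sampled subset,
    simulating differences in environment (e.g. socio-economic status).
    Performance is still assessed against the full set (X, y)."""
    idx = rng.choice(len(X), size=int(FAMILY_FRACTION * len(X)), replace=False)
    return X[idx], y[idx]

# One distinct training environment per population member in the Family setup.
environments = [family_data() for _ in range(POP_SIZE)]
print(len(environments), "distinct training environments created")
```

Sampling without replacement per member keeps the environments distinct while leaving the evaluation criterion, the full training set, common to the whole population, which is what makes the Family/Non Family comparison interpretable as an environmental manipulation.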


Relevant publication

M. Kohli, G.D. Magoulas, and M. Thomas, Hybrid Computational Model for Producing English Past Tense Verbs. In C. Jayne, S. Yue, and L. Iliadis (Eds.): Proceedings of the 13th Engineering Applications of Neural Networks Conference (EANN 2012), Springer, CCIS 311, pp. 315-324, 2012.