SCIENTIFIC DISCIPLINARY SECTOR
This course aims at providing Master students with basic and advanced concepts on the design of methods and techniques for data-driven self-awareness in autonomous artificial agents. Signal Processing, Data Fusion and Machine Learning under a Bayesian perspective will be the key dimensions along which the introduced concepts are described. Laboratory applications and agent design will complement the theoretical activities of the course.
AIMS AND CONTENT
The course provides theory and techniques for the architectural and functional design of interactive cognitive dynamic systems. Topics cover data fusion, multilevel Bayesian state estimation and their application to the cognitive video and radio domains. Project-based learning allows students to acquire design capabilities in the field.
AIMS AND LEARNING OUTCOMES
- Basic and advanced knowledge of the design of telecommunication system frameworks for context-aware multisensorial processing of signals and data in cognitive agents
- Knowledge of methods and techniques for the acquisition, joint representation and processing of proprioceptive and exteroceptive multisensorial signals in cognitive dynamic agents (e.g. semi-autonomous and autonomous vehicles such as drones, cars and robots, cognitive radios, etc.)
- Knowledge of methods and techniques for Multisensor Data Fusion: coupled hierarchical processing of multisensorial signals; Machine Learning for data-driven, experience-based learning of Dynamic Generative Fusion models from sequences of multidimensional sensorial data
- Knowledge of Machine Learning methods and techniques based on Cognitive Dynamic Systems theory for Situation Awareness and Self-Awareness in artificial cognitive agents
- Knowledge and capabilities on case studies: design of Self-Awareness frameworks for autonomous systems (datasets on cars, robots and drones, cognitive radios)
- Knowledge and capability to use and apply: multisensorial signal processing tools and algorithms for acquisition; experience-driven machine learning techniques for the estimation of generative multisensorial Bayesian hierarchical models; Bayesian inference on learned generative models for dynamic state estimation, prediction and anomaly detection of the interaction between the agent and its contextual environment.
Probability theory, Random Processes, Signal theory
The course is divided into two parts. Lectures in frontal teaching modality, presented together with slides, will describe the theoretical concepts and techniques. These lectures will cover 40 hours and may be recorded and made available on the channels recommended by the University of Genova. The second part takes place in a laboratory run by an expert in the field and involves the application of Matlab programs corresponding to the theories and techniques shown at lectures. Students will be required to present a report at the end of each lab experience. Ten lab experiences are planned; they will prepare students for the final report to be discussed during the exam, which will show the application and discussion of techniques and results over an assigned dataset.
- The module can be associated with the UN 2030 Agenda Sustainable Development Goals no. 4, 9 and 11
- Introduction to cognitive dynamic systems
- Proprioceptive and exteroceptive sensors
- Representation and inference for self awareness
- First person and Third Person representations
- Self awareness as result of incremental learning of probabilistic representations
- Bayesian Networks
- General concepts: conditional independence, examples, Markovianity. Inference: naive method, belief propagation
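As an illustrative sketch of the naive inference method (this example, its variable names and its numbers are not course material), exact inference by enumeration on a two-node Bayesian network Rain → WetGrass can be written as:

```python
# Minimal illustration: exact inference by enumeration on a
# two-node Bayesian network Rain -> WetGrass (made-up probabilities).
p_rain = 0.2                                  # prior P(Rain = true)
p_wet_given_rain = {True: 0.9, False: 0.1}    # CPT P(Wet = true | Rain)

# Enumerate the joint P(Rain, Wet = true) over the hidden variable Rain,
# then normalize to obtain the posterior P(Rain = true | Wet = true).
joint = {r: (p_rain if r else 1 - p_rain) * p_wet_given_rain[r]
         for r in (True, False)}
posterior_rain = joint[True] / (joint[True] + joint[False])
print(round(posterior_rain, 4))  # 0.6923
```

Belief propagation reaches the same posterior by passing messages along the edges instead of enumerating the full joint, which pays off on larger graphs.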
- Dynamic Bayesian Networks. Slices, model unfolding, Kalman filters and HMMs as DBNs. Filters as two-level Bayesian networks
- Bayesian inference: filtering, smoothing, prediction and update
- Kalman Filter: prediction and update. Discussion of model filter and example
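A minimal scalar sketch of the Kalman filter prediction and update steps (illustrative only; the model and noise values below are made up, not course code):

```python
def kalman_step(x, P, z, A=1.0, Q=0.01, H=1.0, R=0.1):
    """One predict/update cycle of a scalar Kalman filter.
    x, P: previous state mean and variance; z: new measurement;
    A: state transition, Q: process noise, H: observation, R: obs. noise."""
    # Prediction: propagate mean and variance through the dynamic model
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: weigh the innovation (z - H*x_pred) by the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

# Track a roughly constant state observed with noise
x, P = 0.0, 1.0
for z in [1.1, 0.9, 1.05]:
    x, P = kalman_step(x, P, z)
```

After three measurements the estimate converges near 1 and the variance shrinks, showing how the update step trades prior confidence against measurement noise.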
- Hidden Markov Models, Extended Kalman Filter, Unscented Kalman Filter and Cubature Kalman Filter
- Other Filters: Information Filter, Square root Filter, H infinity filter
- Particle Filter: Non parametric probability representation; SIR and SIS
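The SIR scheme above can be sketched as a bootstrap particle filter for a scalar random-walk state (an illustrative toy model with made-up noise levels, not course code):

```python
import math
import random

def sir_step(particles, z, sigma_proc=0.1, sigma_obs=0.2):
    """One Sampling-Importance-Resampling step of a bootstrap particle
    filter for x_t = x_{t-1} + process noise, z_t = x_t + obs. noise."""
    # Sampling: propagate each particle through the motion model
    moved = [x + random.gauss(0, sigma_proc) for x in particles]
    # Importance: weight each particle by the observation likelihood
    w = [math.exp(-0.5 * ((z - x) / sigma_obs) ** 2) for x in moved]
    total = sum(w)
    w = [wi / total for wi in w]
    # Resampling: draw with replacement proportionally to the weights
    return random.choices(moved, weights=w, k=len(moved))

random.seed(0)
particles = [random.uniform(-1, 1) for _ in range(500)]  # prior samples
for z in [0.5, 0.55, 0.52]:
    particles = sir_step(particles, z)
estimate = sum(particles) / len(particles)  # posterior mean estimate
```

The particle cloud is a non-parametric representation of the posterior: no Gaussian assumption is needed, at the cost of Monte Carlo error and possible sample impoverishment after repeated resampling.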
- Hierarchical representations and Switching models
- Hierarchical DBNs: including discrete and continuous variables in DBN slice hierarchy
- Switching models: hierarchical feature representation Data Fusion inference and data equivalence with HDBNs
- Markov Jump Particle Filter; Rao-Blackwellized Particle Filter
- Generative models: representation and inference
- HDBN with continuous switching variables.
- Generalized coordinates and generalized filtering
- Generalized filtering: free energy minimization as learning paradigm
- Machine learning for HDBNs
- Learning generative models from multisensory sequences
- Unsupervised Clustering and HDBN learning
- Self-Organizing Maps
- Growing Neural Gas and ITM
- Learning conditional probabilities:
- parameter learning,
- HMM and KF learning methods (EM, Baum Welch, etc.)
- Gaussian processes.
- Dirichlet process
- ELBO variational approach
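As a sketch of the machinery behind HMM learning methods such as Baum-Welch, the forward algorithm below computes the likelihood of an observation sequence; its forward pass is the core of the E-step (the model, its states and all probability values are made up for illustration):

```python
def forward(obs, pi, A, B):
    """Forward algorithm: P(observation sequence | model) for a
    discrete HMM. pi[i]: initial state probabilities,
    A[i][j]: transition probabilities, B[i][o]: emission probabilities."""
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * B_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Recursion: alpha_t(j) = (sum_i alpha_{t-1}(i) * A_ij) * B_j(o_t)
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: sum over final states
    return sum(alpha)

# Toy HMM: two hidden states, two observation symbols
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
likelihood = forward([0, 1, 0], pi, A, B)
print(likelihood)  # 0.10893
```

Baum-Welch (EM) iterates this forward pass together with a backward pass to re-estimate pi, A and B until the likelihood stops improving.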
- Incremental learning of switching generative models.
- Generalized DBNs as a result of the agent's self-poietic dynamic stability control
- Abnormality detection methods for detecting loss of stability: probabilistic distances
- Kullback-Leibler divergence
- Null force filter, abnormality detection and generalized errors
- Learning new generative models using generalized errors.
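The probabilistic-distance idea above can be sketched with the closed-form Kullback-Leibler divergence between two 1-D Gaussians used as an abnormality score (the models and the threshold below are hypothetical, for illustration only):

```python
import math

def kl_gauss(mu_p, sig_p, mu_q, sig_q):
    """Closed-form KL divergence KL(p || q) between two 1-D Gaussians
    p = N(mu_p, sig_p^2) and q = N(mu_q, sig_q^2)."""
    return (math.log(sig_q / sig_p)
            + (sig_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sig_q ** 2) - 0.5)

# Compare the current observation model against the learned "normal"
# model; a large divergence flags a potential abnormality.
normal_model = (0.0, 1.0)                         # learned N(0, 1)
kl_same = kl_gauss(0.0, 1.0, *normal_model)       # identical models -> 0
kl_shifted = kl_gauss(3.0, 1.0, *normal_model)    # shifted mean -> large
is_abnormal = kl_shifted > 1.0                    # hypothetical threshold
```

When such a score exceeds the threshold, the agent can treat the current situation as outside its learned generative model, which is the trigger for learning a new model from the generalized errors.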
- Application and integration of course concepts by multisensory dataset processing. Assisted Laboratory experiences and report preparation
- Introduction to Matlab
- Bayesian Networks
- Kalman Filter
- Particle Filter
- Hidden Markov Models
- Markov Jump Particle Filter
- Generalized Filtering
- Unsupervised Clustering
- Markov Jump Particle Filter and anomaly detection
- Integration of methods for incremental learning
- Report presentation and discussion
Slides of all lectures, written by the lecturer, will be made available.
Books and research papers that can help the student consolidate the concepts described during frontal activity are provided here and may be updated during the year.
- A. R. Damasio, Looking for Spinoza: Joy, Sorrow, and the Feeling Brain, 1st ed. Orlando: Harcourt, 2003. [Online]. Available: http://lccn.loc.gov/2002011347
- S. Haykin, Cognitive Dynamic Systems: Perception-action Cycle, Radar and Radio, ser. Cognitive Dynamic Systems: Perception–action Cycle, Radar, and Radio. Cambridge University Press, 2012.
- P. R. Lewis, M. Platzner, B. Rinner, J. Torresen, and X. Yao, Eds., Self-aware Computing Systems: An Engineering Approach. Springer, 2016.
- K. J. Friston, B. Sengupta, and G. Auletta, “Cognitive dynamics: From attractors to active inference,” Proceedings of the IEEE, vol. 102, no. 4, pp. 427–445, 2014.
- S. Haykin and J. M. Fuster, “On cognitive dynamic systems: Cognitive neuroscience and engineering learning from each other,” Proceedings of the IEEE, vol. 102, no. 3, pp. 608–628, 2014.
TEACHERS AND EXAM BOARD
Office hours: Students can make appointments online or remotely by writing an e-mail to Carlo.Regazzoni@unige.it
CARLO REGAZZONI (President)
PAMELA ZONTONE (President Substitute)
The timetable for all courses is available at EasyAcademy.
The exam is structured in a written part plus an oral part.
The written part consists of the presentation of either a report or a poster describing a set of activities and results produced by the student, aiming at demonstrating the knowledge and capabilities acquired through lectures and lab attendance. If the student chooses to process an assigned dataset of the same type as those analyzed during the lab experiences, a report has to be presented. Otherwise, the student can autonomously select and propose a case study of interest, agreed with the professor. The case study should be oriented to the design of an integrated processing system based on course techniques, capable of analyzing a generic dataset whose specifications derive from the choice of the case study itself. In this case either a poster or a report can be presented as the written exam.
The dataset or case study has to be assigned/agreed at least three weeks before the oral exam date, on the basis of the student's request. In both cases, the written text proposed by the student will have to show that the student has acquired the knowledge and capabilities presented in the course. The written exam will have to be delivered to the indicated online repository at least 4 days before the oral exam. The outcome of the evaluation of the written exam will be communicated the day before the oral exam.
The oral exam will consist of a discussion of the written exam. The student will have to prepare at most 20 slides to describe the results and approaches presented in the written text. The oral discussion will be oriented to demonstrating the knowledge and capability to explain the choices made when developing the report or poster, as well as to comment on the results and performance obtained/expected for the chosen problem.
The exam will be passed if the student is admitted to the oral part with a written mark of at least 12 out of 20 AND if the outcome of the oral part is at least 6 out of 10. Laude (honors) can be assigned during the oral part.
The exam aims at assessing the following aspects of the student's acquired knowledge and capabilities:
Level of Knowledge acquired with respect to theories and methods presented in course lectures
Level of practical and integration capabilities with respect to either the assigned data analytics problem (if dataset processing is chosen) or the design and specifications of the data analysis system (if a case study is chosen) for the written part of the exam
Level of capability and knowledge in motivating the choices made and the results obtained during the oral discussion
Students with specific learning disorders ("disturbi specifici di apprendimento", DSA) will be allowed to use specific modalities and supports that will be determined on a case-by-case basis in agreement with the delegate of the Engineering courses in the Committee for the Inclusion of Students with Disabilities.
The dataset will be assigned at least two weeks before the exam, and the report will have to be submitted on the Monday before the exam date (usually on Thursday). Admission to the oral part will be communicated the day before the oral exam.