The Task Force on Action and Perception

The Task Force on Action and Perception is primarily concerned with the developmental processes underlying the emergence of representations of action and perception in humans and in continually learning artificial agents. These processes include the action-perception cycle, active perception, continual sensorimotor learning, environment-driven scaffolding, and intrinsic motivation. As algorithms for learning single tasks in controlled environments improve, new challenges have gained relevance, including multi-task learning, multimodal sensorimotor learning, and lifelong adaptation to injury, growth, and ageing. Members of this task force are strongly motivated by behavioural and neural data; they develop mathematical and computational models to improve robot performance and/or to unveil the mechanisms underlying continual adaptation to changing environments or embodiments and continual learning in open-ended environments. Members also make extensive use of raw sensor data in multi-task robotic experiments.

Goals

The goals of this task force are:

  • to establish a network within the Cognitive and Developmental Systems Technical Committee (CDS TC) to develop algorithms capable of learning representations of action and perception in an open-ended, unsupervised manner.
  • to promote the use of computational models to better understand the underlying mechanisms of human and animal developmental cognition.
  • to promote collaboration on developmental continual learning systems through dissemination events, including conferences, workshops, and special issues of relevant journals.

Scope

  • Emergence of representations via continual interaction
  • Continual sensorimotor learning
  • Active perception
  • Environment-driven scaffolding
  • Intrinsic motivation

Membership

The Task Force (TF) consists of a Chair, a Vice-Chair, and several Members. The Chair must be approved by the CDS TC. Task Force Chairs are appointed for two years, with a possible extension for a further two years. If you are interested in joining this task force, please email the current TF Chair with a link to your CV. Memberships expire at the end of each calendar year and must be renewed.

Members

Position | Name | Affiliation | Country | Website
Chair | Junpei Zhong | The Hong Kong Polytechnic University | Hong Kong | http://junpei.eu/
Vice-Chair | Sao Mai Nguyen | Flowers Team (ENSTA Paris, IP Paris & INRIA) & IMT Atlantique | France | http://nguyensmai.free.fr/
Member | Alessandra Sciutti | Italian Institute of Technology | Italy | https://www.iit.it/people/alessandra-sciutti
Member | Céline Teulière | Institut Pascal - CNRS UMR 6602, Université Clermont-Auvergne | France | https://comsee.ispr-ip.fr/members/celine-teuliere/
Member | Cristiano Alessandro | University of Milano-Bicocca | Italy | cristiano.alessandro@unimib.it
Member | Erhan Öztop | Ozyegin University | Türkiye | http://robotics.ozyegin.edu.tr/members/erhan-oztop/
Member | Jochen Triesch | Frankfurt Institute for Advanced Studies | Germany | https://fias.uni-frankfurt.de/neuro/triesch/
Member | Luis Octavio Arriaga Camargo (Octavio Arriaga) | University of Bremen, AG Robotik | Germany | https://github.com/oarriaga
Member | María-José Escobar | Department of Electronic Engineering, Universidad Técnica Federico Santa María | Chile | http://profesores.elo.utfsm.cl/~mjescobar/
Member | Nicolás Navarro-Guerrero | Deutsches Forschungszentrum für Künstliche Intelligenz GmbH (DFKI), Bremen | Germany | https://nicolas-navarro-guerrero.github.io/
Member | Pablo Lanillos | Technical University of Munich | Germany | http://www.selfception.eu/
Member | Vieri Giuliano Santucci | Istituto di Scienze e Tecnologie della Cognizione (ISTC) | Italy | https://www.istc.cnr.it/en/people/vieri-giuliano-santucci
Member | Xavier Hinaut | Mnemosyne team, Inria & LaBRI, Université de Bordeaux, Institut des Maladies Neurodégénératives | France | www.xavierhinaut.com
Member | Ran Dong | Tokyo University of Technology | Japan | https://www.ikulab.org/dong/
Member | Ding Ding | Southeast University (China) & Delft University of Technology | The Netherlands | https://dingdingseu.mystrikingly.com/