BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//scienceofintelligence.de - ECPv6.15.12.2//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://www.scienceofintelligence.de
X-WR-CALDESC:Events for scienceofintelligence.de
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Berlin
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20240331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20241027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250925T160000
DTEND;TZID=Europe/Berlin:20250925T173000
DTSTAMP:20260420T170918Z
CREATED:20250429T091548Z
LAST-MODIFIED:20250924T073557Z
UID:24506-1758816000-1758821400@www.scienceofintelligence.de
SUMMARY:Ariana Strandburg-Peshkin (MPI-AB & the University of Konstanz)\, "Communication and coordination in animal societies"
DESCRIPTION:Abstract: \nMany social species use signals such as vocalizations to coordinate a range of group behaviors\, from coming to consensus on where to move to banding together against threats. Despite their widespread importance\, these behaviors remain challenging to study in the wild because doing so requires monitoring many individuals simultaneously. In this talk\, I will give an overview of our work tracking the movements and vocalizations of entire social groups in the wild to tackle questions at the interface of communication and collective behavior. What roles does vocal signaling play in the coordination of collective movement? What drives groups to split up? And how do vocalizations mediate collective action against external threats? I will explore how we and our collaborators are addressing these questions in three species of social carnivore that coordinate across varying spatial scales – highly cohesive meerkat groups\, moderately cohesive coati groups\, and fission-fusion hyena clans. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research\, combined with multiple interactive elements. \nPhoto by Gertrūda Valasevičiūtė on Unsplash.
URL:https://www.scienceofintelligence.de/event/ariana-strandburg-peshkin-mpi-ab-the-university-of-konstanz/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/gertruda-valaseviciute-xMObPS6V_gY-unsplash-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250729T140000
DTEND;TZID=Europe/Berlin:20250729T160000
DTSTAMP:20260420T170918Z
CREATED:20250627T090602Z
LAST-MODIFIED:20250724T124821Z
UID:25799-1753797600-1753804800@www.scienceofintelligence.de
SUMMARY:Alan Winfield (UWE Bristol) & Dafna Burema (Science of Intelligence)
DESCRIPTION:How should we think about ethics when machines become part of our social worlds? Alan Winfield and Dafna Burema will explore the ethical and societal dimensions of robotics and AI in an interactive fishbowl and in conversation with Master’s students of the course “Introduction to Modeling Collective Behavior”. Alan Winfield\, a pioneer in the field of robot ethics\, will share insights from his work on cognitive robotics\, science communication\, and the development of ethical standards for intelligent systems.\nDafna Burema brings a sociological lens\, focusing on how AI and robots shape—and are shaped by—social values\, particularly in sensitive areas like eldercare. Together\, they’ll reflect on how society can critically engage with intelligent technologies and what ethical frameworks might guide their integration into collective life. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research\, combined with multiple interactive elements. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/alan-winfield-uwe-bristol-dafna-burema-science-of-intelligence-2/
LOCATION:Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp8.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250718T140000
DTEND;TZID=Europe/Berlin:20250718T153000
DTSTAMP:20260420T170918Z
CREATED:20250429T090411Z
LAST-MODIFIED:20250716T122502Z
UID:24495-1752847200-1752852600@www.scienceofintelligence.de
SUMMARY:Jacob Yates (UC Berkeley)\, "The Role of Motor Signals in Visual Cortex"
DESCRIPTION:Embodiment is fundamental to biological intelligence. Brains do not passively receive the world: they actively shape what they sense through self-motion. For nearly a century\, we have known that perception and action are deeply entangled\, and that organisms must constantly infer whether a sensory change comes from the environment or from themselves. A longstanding idea holds that sensory signals are either suppressed during movement or that movement effects are subtracted out. However\, recent discoveries in neuroscience\, especially in rodents\, suggest that spontaneous movements strongly influence sensory cortex. In this talk\, I will share our work re-examining this question in primates. We found that movements do not broadly modulate visual cortex unless they move the retina\, creating an inherent ambiguity between motor effects and changes in sensory input. I will describe our new approach to disentangling sensorimotor interactions during natural behavior\, combining high-resolution eye tracking with high-density neural recordings and modern machine learning. By precisely measuring the retinal input during natural vision\, we find that much of what appears to be a motor signal is actually visual reafference\, the lawful\, structured sensory consequences of an animal’s own actions. I will discuss how measuring and modeling this loop can deepen our understanding of active inference in the brain and what it means for designing truly embodied agents that adapt to the world as brains do. \nBio \nJacob Yates (he/him) is an Assistant Professor of Optometry & Vision Science at UC Berkeley and leads the Active Vision and Neural Computation Lab. His research explores how populations of neurons in the cortex and early visual pathways encode the visual world\, with a particular focus on how eye movements generate and utilize information for perception. 
By combining statistical and machine learning approaches\, his lab builds computational models to better understand neural activity and human perception\, ultimately aiming to bridge the gap between neural coding and real-world visual behavior. \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research that highlight this shift in perspective through invited lectures from experts in the field and interactive sessions. \nPhoto by Soliman Cifuentes on Unsplash.
URL:https://www.scienceofintelligence.de/event/jacob-yates-uc-berkeley/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/soliman-cifuentes-RXGLTHZ6Mo8-unsplash-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250711T140000
DTEND;TZID=Europe/Berlin:20250711T160000
DTSTAMP:20260420T170918Z
CREATED:20250402T102151Z
LAST-MODIFIED:20250711T082214Z
UID:24013-1752242400-1752249600@www.scienceofintelligence.de
SUMMARY:William Warren (Brown University)\, "The Dynamics of Perception and Action: From Pedestrian Interactions to Collective Behavior"
DESCRIPTION:It’s a perplexing time in the study of visual perception. On the one hand\, there is a resurgence of models that freely posit a priori structure in the visual system\, including priors\, generative world models\, and physics engines. On the other hand\, there is the astonishing a posteriori success of deep neural networks trained only on natural images and image sequences. Although their performance offers an existence proof of the sufficiency of image information for certain visual tasks\, the black box of deep learning does not easily offer up that information or how it’s extracted by the visual system. \nA science of perception depends on understanding the visual information that is available in natural environments and is used to guide natural behavior. I propose that we take seriously James Gibson’s information hypothesis: For every perceivable property of the environment\, however subtle\, there must be a variable of information\, however complex\, that uniquely specifies it. The project is to identify the information that the visual system uses to perceive and act within the constraints of a species’ ecological niche. \nTwo decades ago I decided to work out a test case to see whether an information-based account of a natural behavior could be sustained. In this talk I will offer a status report on our effort to build a model of visually controlled human locomotion – a pedestrian model – that scales up from individual behaviors like steering and obstacle avoidance\, to pedestrian interactions like following and collision avoidance\, to the collective behavior of human crowds. Surprisingly\, linear combinations of these nonlinear components can account for the emergence of more complex behavior\, such as self-organized ‘flocking’\, crowd bifurcations\, and stripe formation in crossing flows. \nBio \nBill (he/him) earned his undergraduate degree at Hampshire College (1976)\, his Ph.D. 
in Experimental Psychology from the University of Connecticut (1982)\, did post-doctoral work at the University of Edinburgh\, and has been a professor at Brown ever since. He served as Chair of the Department of Cognitive and Linguistic Sciences from 2002-10. Warren is the recipient of a Fulbright Research Fellowship\, an NIH Research Career Development Award\, and Brown’s Elizabeth Leduc Teaching Award for Excellence in the Life Sciences. Warren’s research focuses on the visual control of action – in particular\, human locomotion and navigation. He seeks to explain how this behavior is adaptively regulated by multi-sensory information\, within a dynamical systems framework. Using virtual reality techniques\, his research team investigates problems such as the visual control of steering\, obstacle avoidance\, wayfinding\, pedestrian interactions\, and the collective behavior of crowds. Experiments in the Virtual Environment Navigation Lab (VENLab) enable his group to manipulate what participants see as they walk through a virtual landscape\, and to measure and model their behavior. The aim of this research is to understand how adaptive behavior emerges from the dynamic interaction between an organism and its environment. He believes the answers will not be found only in the brain\, but will strongly depend on the physical and informational regularities that the brain exploits. This work contributes to basic knowledge that is needed to understand visual-motor disorders in humans\, and to develop mobile robots that can operate in novel environments. For more information\, visit his faculty profile or the VENLab website. 
\n  \nFor those who are not in Berlin but would like to join virtually:\nhttps://tu-berlin.zoom-x.de/j/69207754612?pwd=IKxoTdY3dQWccHpce2nA0IsNkNxPHu.1 \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research that highlight this shift in perspective through invited lectures from experts in the field and interactive sessions. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/william-warren-brown-university/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/chatgtp3.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250701T140000
DTEND;TZID=Europe/Berlin:20250701T153000
DTSTAMP:20260420T170918Z
CREATED:20250407T095720Z
LAST-MODIFIED:20250627T085132Z
UID:24183-1751378400-1751383800@www.scienceofintelligence.de
SUMMARY:POSTPONED: Alan Winfield (UWE Bristol) & Dafna Burema (Science of Intelligence)
DESCRIPTION:This event has been postponed to 29 July 2025. \nHow should we think about ethics when machines become part of our social worlds? Alan Winfield and Dafna Burema will explore the ethical and societal dimensions of robotics and AI in an interactive fishbowl and in conversation with Master’s students of the course “Introduction to Modeling Collective Behavior”. Alan Winfield\, a pioneer in the field of robot ethics\, will share insights from his work on cognitive robotics\, science communication\, and the development of ethical standards for intelligent systems.\nDafna Burema brings a sociological lens\, focusing on how AI and robots shape—and are shaped by—social values\, particularly in sensitive areas like eldercare. Together\, they’ll reflect on how society can critically engage with intelligent technologies and what ethical frameworks might guide their integration into collective life. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research\, combined with multiple interactive elements. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/alan-winfield-uwe-bristol-dafna-burema-science-of-intelligence/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp8.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250626T140000
DTEND;TZID=Europe/Berlin:20250626T180000
DTSTAMP:20260420T170918Z
CREATED:20250402T101646Z
LAST-MODIFIED:20250618T134257Z
UID:24009-1750946400-1750960800@www.scienceofintelligence.de
SUMMARY:Michael Brecht\, "Active touch and Large-Brain Neuroscience in Elephants" and Yasemin Vardar\, "Active Synthetic Touch: Generating Naturalistic Multisensory Tactile Stimuli for Active Exploration"
DESCRIPTION:Michael Brecht (BCCN Berlin) will present data on a systematic investigation of brains and of grasping behavior in elephants. The analysis of sensory nerves suggests that elephants are extremely tactile animals. In elephants\, trunk whisker length is lateralized as a result of heavily lateralized trunk behaviors. The elephant trunk tip appears to be represented by a large cortical three-dimensional trunk-tip model\; this observation is reminiscent of the somatosensory cortical snout representation in pigs. The trunk musculature of elephants is breathtakingly complex and filigreed. Trunk morphology\, motor neuron organization\, and grasping differ between African elephants (which pinch objects with their two trunk fingers) and Asian elephants (which have only one finger and wrap objects with their trunk).\nHe will discuss the potential of novel X-ray technologies for large brain analysis. Both behavioral analysis and elephant neuroanatomy reveal striking differences between individual elephants. Thus\, it appears that elephants are less equal than other animals. \nImagine you could feel your pet’s fur on a Zoom call\, the fabric of the clothes you are considering purchasing online\, or tissues in medical images. We are all familiar with the impact of digitization of audio and visual information in our daily lives – every time we take videos or pictures on our phones. Yet\, there is no such equivalent for our sense of touch. This talk will encompass Yasemin Vardar’s (Delft University of Technology) scientific efforts in digitizing naturalistic tactile information over the last decade. She will explain the methodologies and interfaces she has been developing with her team and collaborators for capturing\, encoding\, and recreating the perceptually salient features of tactile textures for active bare-finger interactions. She will also discuss current challenges\, future research paths\, and potential applications in tactile digitization. 
\nThis talk is part of Olga Shurygina’s course “Active Sensing\,” a seminar on cutting-edge research on active sensory perception in humans and other mammals and related advances in artificial agents’ abilities such as seeing\, grasping\, and navigating in space. \n  \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/michael-brecht-bccn-berlin-and-yasemin-vardar-delft-university-of-technology-active-touch/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/chatgtp12.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250617T140000
DTEND;TZID=Europe/Berlin:20250617T153000
DTSTAMP:20260420T170918Z
CREATED:20250226T124956Z
LAST-MODIFIED:20250617T121156Z
UID:23627-1750168800-1750174200@www.scienceofintelligence.de
SUMMARY:Heiko Hamann (Science of Intelligence)\, "From Models to Machines: A Roboticist’s View on Collective Behavior"
DESCRIPTION:Swarm robotics investigates how large numbers of relatively simple\, autonomous robots can coordinate to complete complex collective tasks. In this lecture\, we explore how models of collective behavior can guide the design of such systems. We highlight how modeling collective behavior is not only a tool for understanding natural systems\, but a powerful method to synthesize coordinated behaviors in robot swarms. We contrast bio-mimicry to more abstract bio-inspired paradigms. Through examples like task allocation and flocking\, we demonstrate how biological insights can shape engineering choices. An impressive insight from biology is that ‘less is more\,’ that is\, less communication or less knowledge can sometimes increase the swarm’s performance. We conclude by briefly discussing swarm robotics applications that diverge from biological analogies and reflect on future directions. \n--\nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research\, combined with multiple interactive elements. \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/heiko-hamann-science-of-intelligence/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp19.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250613T140000
DTEND;TZID=Europe/Berlin:20250613T160000
DTSTAMP:20260420T170918Z
CREATED:20250407T094415Z
LAST-MODIFIED:20250610T100714Z
UID:24172-1749823200-1749830400@www.scienceofintelligence.de
SUMMARY:Fumiya Iida (University of Cambridge)\, "Info-Bodiment: Informatization of Robot Embodiment for the Next Generation AI Robots"
DESCRIPTION:There is growing interest in applying AI technologies to the control of intelligent robotic systems. While this research has led to promising developments\, it still faces major challenges due to its heavy reliance on learning from limited datasets—often dominated by visual information. In this talk\, I will introduce “Info-Bodiment” as a new research framework for realizing embodied intelligence\, along with its underlying technological foundations. As advances in soft robotics and functional materials enable deeper integration between the informational and physical realms\, we are beginning to see the emergence of novel forms of embodied intelligence. Within this evolving landscape\, I will explore how rapidly advancing fields such as machine learning can help accelerate progress. Going beyond conventional models of body control and AI as abstract computational systems\, this approach positions the body itself as an active site of information processing and generation\, opening new possibilities for intelligent behavior. \nBio\nFumiya Iida is Professor of Robotics at the Department of Engineering\, University of Cambridge. Previously\, he was an assistant professor for bio-inspired robotics at ETH Zurich (2009-2014) and a lecturer at Cambridge (2014-2018). He received his bachelor’s and master’s degrees in mechanical engineering at Tokyo University of Science (Japan\, 1999) and his Dr. sc. nat. in Informatics at the University of Zurich (2006). In 2004 and 2005 he was also engaged in biomechanics research on human locomotion at the Locomotion Laboratory\, University of Jena (Germany). From 2006 to 2009 he worked as a postdoctoral associate at the Computer Science and Artificial Intelligence Laboratory\, Massachusetts Institute of Technology in the USA. In 2006 he was awarded the Fellowship for Prospective Researchers from the Swiss National Science Foundation and\, in 2009\, the Swiss National Science Foundation Professorship. 
He was a recipient of the IROS2016 Fukuda Young Professional Award\, the Royal Society Translation Award in 2017\, and the Tokyo University of Science Award in 2021. His research interests include biologically inspired robotics\, embodied artificial intelligence\, and the biomechanics of human locomotion and manipulation\, and he has been involved in a number of research projects related to dynamic legged locomotion\, navigation of autonomous robots\, and human-machine interactions. For more information\, visit the Bio-Inspired Robotics Laboratory website. \n  \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research that highlight this shift in perspective through invited lectures from experts in the field and interactive sessions.
URL:https://www.scienceofintelligence.de/event/fumiya-iida-university-of-cambridge/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/zp-TU-HU-ExcelenzForschung-20240122-073-scaled-e1749550030237.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250612T140000
DTEND;TZID=Europe/Berlin:20250612T180000
DTSTAMP:20260420T170918Z
CREATED:20250407T094009Z
LAST-MODIFIED:20250611T105232Z
UID:24168-1749736800-1749751200@www.scienceofintelligence.de
SUMMARY:Jennifer Groh (Duke University) and Kristen Grauman (University of Texas)\, "What Eye Movements Have to Do with Hearing"
DESCRIPTION:Jennifer Groh (Duke University) \nHearing works in concert with vision\, such as when we watch someone’s lips move to help us understand what they are saying. But bridging between these two senses poses computational challenges for the brain. One such challenge involves movements of the eyes – every time the eyes move with respect to the head\, the relationship between visual spatial input (the retina) and auditory spatial input (sound localization cues anchored to the head) changes. I will describe this problem\, drawing on early computational and experimental work showing how and where signals regarding eye movements are incorporated into auditory processing\, closing with a recent discovery from our group that a signal regarding eye movements is sent by the brain to the ears themselves. This signal causes the eardrum to oscillate in conjunction with eye movements (Gruters et al PNAS 2018) and carries detailed spatial information about the direction and amplitude of the eye movement (Lovich et al PNAS 2023). I will also present new findings concerning the underlying mechanism of this effect\, involving the contributions of the middle ear muscles and outer hair cells\, and the potential impact on sound transduction. \n  \nKristen Grauman (University of Texas)\, “Audio-visual learning in 3D environments” \nPerception systems that can both see and hear have great potential to unlock problems in video understanding\, augmented reality\, and embodied AI. I will present our recent work in egocentric audio-visual (AV) perception. First\, we explore how audio’s spatial signals can augment visual understanding of 3D environments. This includes ideas for self-supervised feature learning from echoes\, AV floorplan reconstruction\, and active source separation\, where an agent intelligently moves to hear things better in a busy environment. 
Throughout this line of work\, we leverage our open-source SoundSpaces platform\, which allows state-of-the-art rendering of highly realistic audio in real-world scanned environments. Next\, building on these spatial AV and scene acoustics ideas\, we introduce new ways to enhance the audio stream – making it possible to transport a sound to a new physical environment observed in a photo\, or to dereverberate speech so it is intelligible for machine and human ears alike. \n  \nThis talk is part of Olga Shurygina’s course “Active Sensing\,” a seminar on cutting-edge research on active sensory perception in humans and other mammals and related advances in artificial agents’ abilities such as seeing\, grasping\, and navigating in space. \n  \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/jennifer-groh-duke-university-and-kristen-grauman-university-of-texas-active-hearing/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/chatgtp11.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250610T140000
DTEND;TZID=Europe/Berlin:20250610T153000
DTSTAMP:20260420T170918Z
CREATED:20250226T122854Z
LAST-MODIFIED:20250606T131115Z
UID:23624-1749564000-1749569400@www.scienceofintelligence.de
SUMMARY:Andrew J. King (Swansea University)\, "Understanding Animal Collective Behaviour Across Systems"
DESCRIPTION:Andrew King is a scientist driven by curiosity\, exploring questions across species\, contexts\, and methods. His research group investigates how and why individuals engage in collective behaviour\, using a wide range of systems\, perspectives\, and tools. In this seminar\, he will present the group’s fundamental work in behavioural biology\, as well as its applied themes\, including animal management and bio-inspired engineering. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research\, combined with multiple interactive elements. \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/andrew-j-king-shoal-group-swansea-university/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp13.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250606T140000
DTEND;TZID=Europe/Berlin:20250606T160000
DTSTAMP:20260420T170918Z
CREATED:20250407T093540Z
LAST-MODIFIED:20250603T094631Z
UID:24164-1749218400-1749225600@www.scienceofintelligence.de
SUMMARY:Tony Prescott (University of Sheffield)\, "The Psychology of Artificial Intelligence"
DESCRIPTION:Artificial intelligence and robotics have been making great progress in recent years\, but how close are we to emulating human intelligence? This talk will explore the similarities and differences between humans and AIs and discuss the development of biomimetic cognitive systems that more directly think and behave like us. A key focus will be on layered control architectures for robots inspired by the mammalian brain. The talk will be illustrated with work from my lab on active sensing\, memory\, and sense of self for animal-like and humanoid robots. \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research that highlight this shift in perspective through invited lectures from experts in the field and interactive sessions. \nFor those who are not in Berlin but would like to join virtually:\nhttps://tu-berlin.zoom-x.de/j/69207754612?pwd=IKxoTdY3dQWccHpce2nA0IsNkNxPHu.1 \nPhoto generated with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/tony-prescott-university-of-sheffield/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/abstract_ai_vs_human_thought-e1748620484784.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250605T140000
DTEND;TZID=Europe/Berlin:20250605T180000
DTSTAMP:20260420T170918Z
CREATED:20250407T093220Z
LAST-MODIFIED:20250530T112036Z
UID:24159-1749132000-1749146400@www.scienceofintelligence.de
SUMMARY:Martina Poletti (University of Rochester)\, "Active Foveal Vision" and Michele Rucci (University of Rochester)\, "Active Space-Time Encoding: The Inseparable Link Between Vision and Action"
DESCRIPTION:Martina Poletti’s talk will focus on active foveal vision. Vision is an active process even at its finest scale: in the 1-deg foveola\, the visual system is primarily sensitive to changes in the visual input\, and it has been shown that fixational eye movements reformat the spatiotemporal flow to the retina in a way that is optimal for fine spatial vision. Using high-precision eye-tracking coupled with a system for gaze-contingent display capable of localizing the line of sight with arcminute precision\, and an Adaptive Optics Scanning Light Ophthalmoscope (AOSLO) for high-resolution retinal imaging enabling retinal-contingent manipulations of the visual input\, their results show that the need for active foveolar vision also stems from the non-uniformity of fine spatial vision across this region. Further\, they show that the visual system is highly sensitive even to a small sub-foveolar loss of vision\, and that fixation behavior is readjusted to compensate for this loss. Overall\, the emerging picture is that of a highly non-homogeneous foveolar vision characterized by a refined level of control of attention and fixational eye movements at this scale. \nMichele Rucci’s talk explores how the human visual system constructs spatial representations. Unlike other sensory modalities\, where spatial information must be inferred from incoming signals\, vision begins with a sophisticated imaging system—the eye—that explicitly preserves spatial structure on the retina. This might suggest that human vision is primarily a passive spatial process\, in which the eye simply transmits the retinal image to the cortex—much like uploading a digital photograph—to form a map of the scene. However\, this analogy is misleading\, as it overlooks the strong temporal sensitivity of visual neurons and contradicts theoretical models and experimental findings that examine vision in the context of natural motor behavior. 
Here\, Michele Rucci will review recent evidence supporting active space-time encoding—the idea that\, as with other senses\, vision relies on motor strategies to encode spatial information in the temporal domain. This concept has important implications for understanding the normal functioning of the visual system\, the effects of abnormal oculomotor behavior\, and the development of visual prostheses. \nThis talk is part of Olga Shurygina’s course “Active Sensing\,” a seminar on cutting-edge research on active sensory perception in humans and other mammals and related advances in artificial agents’ abilities such as seeing\, grasping\, and navigating in space. \n  \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/active-seeing-with-martina-poletti-university-of-rochester-and-michele-rucci-university-of-rochester/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/png:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/ChatGPT-Image-May-30-2025-01_17_03-PM.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250603T140000
DTEND;TZID=Europe/Berlin:20250603T153000
DTSTAMP:20260420T170918
CREATED:20250226T122648Z
LAST-MODIFIED:20250606T131027Z
UID:23618-1748959200-1748964600@www.scienceofintelligence.de
SUMMARY:Jens Krause (Science of Intelligence)\, "The Adaptive Value of Collective Behavior"
DESCRIPTION:In this talk Jens Krause will discuss the adaptive value of collective behaviour from different perspectives. One perspective is the potential ability of groups or collectives to make better and even faster decisions. In this context Jens will show some of the modelling approaches to explain collective intelligence and the empirical support for them in the laboratory and in the field. Furthermore\, he will show some empirical findings regarding collective intelligence which challenge our current understanding of the underlying mechanisms. Another perspective is that of collective behaviour as a defense against predators. It has been found in a number of different species that various forms of collective spirals and waves can fend off predators. This implies that at a global\, group-wide level\, collective patterns are not just beautiful to look at but can provide anti-predator functions which we are just beginning to understand. \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research combined with multiple interactive elements. \n  \n  \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/jens-krause-science-of-intelligence/
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/KK_2-scaled-e1748593902816.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250530T140000
DTEND;TZID=Europe/Berlin:20250530T160000
DTSTAMP:20260420T170918
CREATED:20250402T094459Z
LAST-MODIFIED:20250603T123920Z
UID:23998-1748613600-1748620800@www.scienceofintelligence.de
SUMMARY:Cornelia Fermüller (University of Maryland)\, “Computational Principles of Embodied Intelligence for Robust Motion Perception and Action”
DESCRIPTION:Abstract\nUnderstanding the computational principles of embodied intelligence is central to advancing robotic systems that perceive and act in complex environments. This talk explores key principles—low power consumption\, robustness\, and generalizability—as they emerge in the context of motion perception and action. For visual navigation\, evidence is presented that challenges the conventional SLAM paradigm\, which relies on correspondence estimation and 3D scene reconstruction. Instead\, 3D motion estimation and scene segmentation can be achieved using 1D normal flow measurements derived from image gradients\, offering a simpler and more robust alternative. The effectiveness of this approach is demonstrated through implementations on drones equipped with both standard and neuromorphic dynamic vision sensors. Further\, it is shown that physical interaction tasks do not necessarily require explicit depth estimation; rather\, distance can be inferred in action-dependent units grounded in control dynamics. Finally\, the role of visual motion in action understanding is examined\, focusing on how motion-derived primitives support robust and generalizable representations of action\, opening new avenues for embodied intelligence in robotic systems. \nThis talk is part of Aravind Battaje‘s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research highlighting this shift in perspective through invited lectures from experts in the field and interactive sessions. \nBio \nCornelia Fermüller is a Research Scientist at the University of Maryland’s Institute for Advanced Computer Studies (UMIACS)\, where she co-founded the Autonomy Cognition and Robotics (ARC) Lab and co-leads the Perception and Robotics Group. Her research lies at the intersection of computer vision\, robotics\, and human vision\, with a focus on biologically inspired solutions for active vision systems. 
She has made significant contributions to the understanding of visual perception by developing computational models for visual motion analysis\, 3D motion and shape estimation\, texture analysis\, and action recognition\, as well as integrating perception\, action\, and reasoning to enable cognitive robots to learn and interpret human manipulation actions. \nDr. Fermüller holds an M.S. from the University of Technology\, Graz\, and a Ph.D. in Applied Mathematics from the Technical University of Vienna. Her recent work emphasizes the use of event-based\, bio-inspired sensors for robust motion perception in challenging environments\, with applications ranging from fast motion perception for drones to autonomous driving in diverse lighting conditions. She is the principal investigator of an NSF-sponsored Science of Learning Center Network for Neuromorphic Engineering\, co-organizes the Neuromorphic Engineering and Cognition Workshop\, and has been recognized for her leadership in interdisciplinary research bridging computational modeling and psychophysical studies of human vision. \n  \nFor those who are not in Berlin but would like to join virtually:\nhttps://tu-berlin.zoom-x.de/j/69207754612?pwd=IKxoTdY3dQWccHpce2nA0IsNkNxPHu.1 \n  \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/cornelia-fermuller/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp16.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250527T140000
DTEND;TZID=Europe/Berlin:20250527T153000
DTSTAMP:20260420T170918
CREATED:20250226T122357Z
LAST-MODIFIED:20250409T100410Z
UID:23614-1748354400-1748359800@www.scienceofintelligence.de
SUMMARY:Jacob Davidson (Max Planck Institute for Animal Behavior\, Konstanz)
DESCRIPTION:More details to follow. \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\, ” a seminar on collective behavior research\, combined with multiple interactive elements. \n  \n  \n  \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/jacob-davidson-max-planck-institute-for-animal-behavior-konstanz/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp4.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250513T140000
DTEND;TZID=Europe/Berlin:20250513T153000
DTSTAMP:20260420T170918
CREATED:20250226T122030Z
LAST-MODIFIED:20250409T100403Z
UID:23609-1747144800-1747150200@www.scienceofintelligence.de
SUMMARY:Mate Nagy (MTA-ELTE Lendület Collective Behaviour Research Group\, Budapest)
DESCRIPTION:More details to follow. \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\, ” a seminar on collective behavior research\, combined with multiple interactive elements. \n  \n  \n  \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/mate-nagy-mta-elte-lendulet-collective-behaviour-research-group-budapest/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp3.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250506T140000
DTEND;TZID=Europe/Berlin:20250506T153000
DTSTAMP:20260420T170918
CREATED:20250226T121637Z
LAST-MODIFIED:20250409T100355Z
UID:23605-1746540000-1746545400@www.scienceofintelligence.de
SUMMARY:Pawel Romanczuk (Science of Intelligence)
DESCRIPTION:More details to follow. \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\, ” a seminar on collective behavior research\, combined with multiple interactive elements. \n  \n  \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/pawel-romanczuk-science-of-intelligence/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp2.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250502T141500
DTEND;TZID=Europe/Berlin:20250502T154500
DTSTAMP:20260420T170918
CREATED:20250402T100615Z
LAST-MODIFIED:20250603T124009Z
UID:24007-1746195300-1746200700@www.scienceofintelligence.de
SUMMARY:Michael Levin (Tufts University)\, “Communication With Intelligence in Unconventional Embodiments: Bioelectricity as a Case Study”
DESCRIPTION:Embodiment is critical for intelligence; however\, the traditional concept of embodiment as movement in 3-dimensional space covers only a small slice of the way biology exploits embodiment. In this talk\, I will present a framework for understanding intelligence more broadly\, and show how the multiscale competency of bodies navigates many different kinds of spaces. I will use our findings in developmental bioelectricity as a case study for how an unconventional intelligence can be recognized and communicated with\, for exciting applications in regenerative medicine and cancer. I will also show novel multicellular life forms\, which highlight the remarkable plasticity of the agential material of life by self-constructing new embodied intelligences from un-modified cells. The emerging field of diverse intelligence merges biophysics\, computer science\, and cognitive science in a way that helps us relate to current and forthcoming beings\, with applications in science\, engineering\, and ethics. \n  \nBio\nMichael Levin is the Vannevar Bush Chair and Distinguished Professor of Biology at Tufts University\, where he directs both the Allen Discovery Center and the Tufts Center for Regenerative and Developmental Biology. Renowned for his pioneering work at the intersection of developmental biology\, synthetic biology\, and cognitive science\, Prof. Levin investigates how cells and tissues process information to control growth\, regeneration\, and form. His research explores the collective intelligence of cells\, bioelectric signaling\, and the emergence of cognition in both natural and synthetic organisms\, with applications ranging from regenerative medicine to synthetic bioengineering. Prof. Levin is widely recognized for co-discovering xenobots—programmable living machines made from frog cells—and has published over 350 scientific papers. His work has been featured in major scientific and popular media. For more information\, visit his lab website. 
\n  \nThis event will take place on site with the speaker joining on Zoom: https://tu-berlin.zoom-x.de/j/69207754612?pwd=IKxoTdY3dQWccHpce2nA0IsNkNxPHu.1 \nThis talk is part of Aravind Battaje‘s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research highlighting this shift in perspective through invited lectures from experts in the field and interactive sessions.
URL:https://www.scienceofintelligence.de/event/michael-levin-tufts-university/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/levin.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250429T140000
DTEND;TZID=Europe/Berlin:20250429T153000
DTSTAMP:20260420T170918
CREATED:20250319T104304Z
LAST-MODIFIED:20250603T124029Z
UID:23816-1745935200-1745940600@www.scienceofintelligence.de
SUMMARY:Marina Papadopoulou (Tuscia University)\, “Across the Swarm-Verse: The Self-Organization of Animal Collectives on the Move”
DESCRIPTION:From the daily movement of primate troops to the mesmerizing murmurations of starling flocks in the sky\, the dynamics of animal groups on the move fascinate us with the mystery of their underlying social interactions. In this talk\, I will first showcase how we combine empirical data and computational models based on self-organization to understand the individual rules that underlie collective behaviour\, using bird flocks under attack by a robotic predator as a case study. Given that identifying unique and common traits across systems is necessary to understand the ecological and evolutionary processes that shape the diversity of collective behaviour we see in nature\, I will further present the Swarm-Verse\, a new framework to quantify variation in collective motion across species\, using studies on fish\, goats\, pigeons and baboons. \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\, ” a seminar on collective behavior research\, combined with multiple interactive elements. \n  \n  \n  \nImage created with DALL-E by Maria Ott
URL:https://www.scienceofintelligence.de/event/hannah-j-williams-centre-for-the-advanced-study-of-collective-behaviour-overview-of-sensory-basis-in-collective-behavior/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2025/01/Fiore.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250422T140000
DTEND;TZID=Europe/Berlin:20250422T153000
DTSTAMP:20260420T170918
CREATED:20250319T103943Z
LAST-MODIFIED:20250603T124055Z
UID:23812-1745330400-1745335800@www.scienceofintelligence.de
SUMMARY:Ralf Kurvers (MPI for Human Development)\, “Individual\, Social and Ecological Drivers of Human Collective Foraging”
DESCRIPTION:Foraging complexity and competitive social challenges are considered key drivers of human cognition. Yet\, we still have a poor understanding of the decision-making mechanisms underlying foraging behavior\, especially in social contexts. In this talk\, I will combine immersive lab experiments\, field work using high-resolution tracking\, and computational and agent-based models to uncover the mechanisms guiding human foraging decisions. I hope to convince you that foraging provides a rich test bed to study a broad range of cognitive processes\, such as memory\, learning\, and evidence accumulation\, and that the current technological advancements allow us to do this even in the challenging conditions of the natural world. \n  \nThis talk is part of David Mezey‘s course “Introduction to Modeling Collective Behavior\, ” a seminar on collective behavior research\, combined with multiple interactive elements. \n  \n  \n 
URL:https://www.scienceofintelligence.de/event/ralf-kurvers-overview-of-human-collective-behavior/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/03/a2345417-a007-468a-8182-6b5320513e32-1.jpeg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250212T170000
DTEND;TZID=Europe/Berlin:20250212T183000
DTSTAMP:20260420T170918
CREATED:20250113T103151Z
LAST-MODIFIED:20250603T124241Z
UID:23105-1739379600-1739385000@www.scienceofintelligence.de
SUMMARY:Agnieszka Wykowska (the Italian Institute of Technology\, Genoa)\, “Using Humanoid Robots To Study Human Cognition”
DESCRIPTION:Humanoid robots have recently received a lot of attention and enthusiasm in the robotics community and beyond. Indeed\, with new technological advancements\, they hold the promise to become our assistants in daily lives\, as general-purpose machines. In this talk\, however\, Agnieszka Wykowska will focus on a different\, less explored\, way of using humanoids – as tools to understand human cognition. Humanoids can play a substantial role in the scientific understanding of human cognition\, both through the construction of embodied models of cognitive mechanisms\, and in the role of sophisticated apparatus in experimental paradigms. Agnieszka Wykowska will present the work of her lab where they have examined how fundamental mechanisms of human cognition\, such as attention\, decision making or sense of agency\, are modulated by the interaction with a humanoid. She will then demonstrate how results from such studies can be used in robot-assisted cognitive training for children with disabilities\, highlighting the role of fundamental science in applied research. \nThis talk will take place as part of SCIoI member Jonas Frenkel’s seminar “Artificial Social Intelligence.” It aims to provide a comprehensive exploration of ASI\, which involves the observation\, analysis\, and synthesis of social phenomena. It integrates synthetic sciences such as machine learning\, computer vision\, and robotics with cognitive science\, psychology\, neuroscience\, and the humanities to focus on the perception\, cognitive components\, and behaviors linked to social intelligence. \nPhoto by Zak on Unsplash.
URL:https://www.scienceofintelligence.de/event/agnieszka-wykowska-the-italian-institute-of-technology-genoa/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2025/01/Agnieszka2-e1736785356683.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250207T140000
DTEND;TZID=Europe/Berlin:20250207T153000
DTSTAMP:20260420T170918
CREATED:20241024T160716Z
LAST-MODIFIED:20250603T124252Z
UID:22501-1738936800-1738942200@www.scienceofintelligence.de
SUMMARY:Tucker Hermans (University of Utah\, NVIDIA)\, “Learning and Planning With Relational Dynamics Models for Robot Manipulation”
DESCRIPTION:More info will follow soon. \nThis talk will take place as part of SCIoI member Svetlana Levit’s seminar “Selected Topics in Robot Learning\,” which explores how advances in machine learning are helping robots operate in new environments\, learn new behaviors\, and adapt to changing conditions.
URL:https://www.scienceofintelligence.de/event/tucker-hermans-university-of-utah-nvidia-learning-and-planning-with-relational-dynamics-models-for-robot-manipulation/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2024/10/Handmanipulations-scaled-e1729866591231.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250205T160000
DTEND;TZID=Europe/Berlin:20250205T171500
DTSTAMP:20260420T170918
CREATED:20250113T102236Z
LAST-MODIFIED:20250603T124306Z
UID:23100-1738771200-1738775700@www.scienceofintelligence.de
SUMMARY:Goldie Nejat (University of Toronto)\, “Paging the Socially Assistive Robots: Intelligent and Persuasive Social Robots for Healthcare and Beyond”
DESCRIPTION:The world is experiencing a silver tsunami: rapid population aging. As the world’s older population significantly increases\, dementia is becoming one of the fastest growing diseases\, with no cure in sight. Socially assistive robots are a unique disruptive innovation that are becoming a crucial part of everyday society\, especially in a post-pandemic world\, aiding people in everyday life to meet urgent and immediate assistive needs. This talk will present some of my group’s recent research efforts in developing intelligent and persuasive socially assistive robots to improve quality of life and promote independence (aging-in-place) of older adults\, including those living with dementia and their care providers. In particular\, I will discuss some of my team’s many robots including Brian\, Casper\, Tangy\, Blueberry\, Salt\, Pepper\, Chili\, Hans Solo\, and Luke and Leia that have been deployed in human-centered environments from long-term care homes and hospitals to grocery stores to autonomously provide cognitive and social interventions\, help with activities of daily living\, and lead individual-based and group-based recreational activities\, games and therapies. Our novel multimodal interactive robots are serving as assistants to individuals as well as groups of users\, while learning to personalize these interactions to the needs and wants of these users by using perceptual\, behavioral and persuasive intelligence. Numerous user studies conducted in care settings will also be discussed to highlight how these robots can effectively be integrated into people’s everyday lives to support person-centered care. \nDr. Goldie Nejat\, Ph.D.\, P.Eng.\, is a Professor in the Department of Mechanical & Industrial Engineering at the University of Toronto. 
She is also the Founder and Director of the Autonomous Systems and Biomechatronics Laboratory. Professor Nejat is an Adjunct Scientist at both KITE in the Toronto Rehabilitation Institute (University Health Network) and the Rotman Research Institute at Baycrest Health Sciences\, and a Fellow of both ASME and CIFAR. She was the Canada Research Chair in Robots for Society (2014-2024).\nDr. Nejat’s research focuses on developing intelligent service robots and robot cooperative teams for applications in health\, eldercare\, emergency response\, search and rescue\, security and surveillance\, retail and manufacturing. Her ground-breaking robotics research is leading the development of intelligent socially assistive robots aimed at meeting the challenges posed by a rapidly aging population. She has been invited to speak about her research to scientists\, healthcare professionals\, policymakers\, governments and the general public at many events\, conferences and institutions around the world. She has served on the organizing\, program and editorial committees of numerous international conferences and journals on robotics\, automation\, human-robot interaction and medical devices. She is an Associate Editor for the International Journal of Social Robotics\, a program co-chair for the 2025 IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) and is also a past Associate Editor for IEEE Robotics and Automation Letters (RA-L)\, and IEEE Transactions on Automation Science and Engineering (T-ASE). Her team’s work has been presented in over 100 media stories including in Popular Science\, National Geographic Magazine\, Time Magazine\, Bloomberg\, NBC News\, the Telegraph\, Reader’s Digest\, and the Discovery Channel. In 2022\, she received the Robotics Society of Japan (RSJ) Pioneering Research Award in Robot & Human Interactive Communication along with her students and collaborator. 
In 2022\, she was also internationally recognized as 1 of 50 women in robotics you need to know by Women in Robotics for her inspiring contributions to robotics. In 2020\, she received the Engineering Excellence Medal from the Ontario Society of Professional Engineers (PEO) and the Professional Engineers Ontario. \nThis talk will take place as part of SCIoI member Jonas Frenkel’s seminar “Artificial Social Intelligence.” It aims to provide a comprehensive exploration of ASI\, which involves the observation\, analysis\, and synthesis of social phenomena. It integrates synthetic sciences such as machine learning\, computer vision\, and robotics with cognitive science\, psychology\, neuroscience\, and the humanities to focus on the perception\, cognitive components\, and behaviors linked to social intelligence. \nPhoto by Michael Dziedzic on Unsplash.
URL:https://www.scienceofintelligence.de/event/goldie-nejat-university-of-toronto/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2025/01/Goldie.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250129T161500
DTEND;TZID=Europe/Berlin:20250129T173000
DTSTAMP:20260420T170918
CREATED:20250106T111016Z
LAST-MODIFIED:20250603T124316Z
UID:23002-1738167300-1738171800@www.scienceofintelligence.de
SUMMARY:Maarten Sap (Carnegie Mellon University)\, “Artificial Social Intelligence? On the Challenges of Socially Aware and Ethically Informed LLMs”
DESCRIPTION:Modern AI systems such as LLMs are pervasive and helpful\, but do they really have the social intelligence to seamlessly and safely engage in interactions with humans? In this talk\, Maarten Sap will delve into the limits of social intelligence of LLMs and how we can measure and anticipate their risks. He will introduce Sotopia\, a new social simulation environment to evaluate the interaction abilities of LLMs as social AI agents. He will show how today’s most powerful models struggle to socially interact due to an inability to deal with information asymmetry. He will then shift to how LLMs pose new ethical challenges in their interactions with users. Specifically\, through their language modality and possible expressions of uncertainty\, his work shows that LLMs tend to express overconfidence in their answers even when incorrect\, and that users tend to over-rely on these answers. Finally\, Maarten Sap will introduce ParticipAI\, a new framework to anticipate future AI use cases and dilemmas. Through this framework\, his work shows that lay users can help us anticipate the benefits and harms of allowing or not allowing an AI use case\, paving the way for more democratic approaches to AI design\, development\, and governance. He will conclude with some thoughts on future directions towards socially aware and ethically informed AI. \nThis talk will take place as part of SCIoI member Jonas Frenkel’s seminar “Artificial Social Intelligence.” It aims to provide a comprehensive exploration of ASI\, which involves the observation\, analysis\, and synthesis of social phenomena. It integrates synthetic sciences such as machine learning\, computer vision\, and robotics with cognitive science\, psychology\, neuroscience\, and the humanities to focus on the perception\, cognitive components\, and behaviors linked to social intelligence. \nThis talk will take place in person at SCIoI. \nImage created with DALL-E by Maria Ott
URL:https://www.scienceofintelligence.de/event/artificial-social-intelligence-maarten-sap-carnegie-mellon-university/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2025/01/Saap.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250122T160000
DTEND;TZID=Europe/Berlin:20250122T173000
DTSTAMP:20260420T170918
CREATED:20250106T110543Z
LAST-MODIFIED:20250603T124340Z
UID:23000-1737561600-1737567000@www.scienceofintelligence.de
SUMMARY:Stephen M. Fiore (University of Central Florida)\, “Studying Artificial Social Intelligence: Understanding and Examining Social Cognitive Processes in Human-Machine Collaborations”
DESCRIPTION:In this presentation Stephen M. Fiore will provide an overview of a body of research in social cognition and its relation to developing artificial social intelligence. In the rapidly evolving landscape of artificial intelligence (AI)\, an important research direction is the development of systems that can work alongside and collaborate with humans as actual teammates. Effective teamwork is crucial in fields that are high-stakes and can require complex collaborative problem solving (e.g.\, disaster response). In these environments\, the ability of team members to collaborate requires social-cognitive processes over and above an understanding of the tasks to be accomplished. We address this through the study of socially intelligent AI and how these systems influence interactions with human counterparts acting as a team. In this talk\, Stephen Fiore will first provide an overview of our approach to social cognition and the theoretical concepts being studied. He will describe theory and data from his various research projects studying human-human and human-machine teaming and conclude with recommendations and guidance for future research on artificial social intelligence. \nThis talk will take place as part of SCIoI member Jonas Frenkel’s seminar “Artificial Social Intelligence.” It aims to provide a comprehensive exploration of ASI\, which involves the observation\, analysis\, and synthesis of social phenomena. It integrates synthetic sciences such as machine learning\, computer vision\, and robotics with cognitive science\, psychology\, neuroscience\, and the humanities to focus on the perception\, cognitive components\, and behaviors linked to social intelligence. \nImage created with DALL-E by Maria Ott
URL:https://www.scienceofintelligence.de/event/artificial-social-intelligence-stephen-fiore/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2025/01/Fiore.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250121T160000
DTEND;TZID=Europe/Berlin:20250121T180000
DTSTAMP:20260420T170918
CREATED:20241217T172315Z
LAST-MODIFIED:20250603T124352Z
UID:22911-1737475200-1737482400@www.scienceofintelligence.de
SUMMARY:Giovanni Beltrame (Polytechnique Montreal)\, “Field Collective Robotics: Challenges and Applications”
DESCRIPTION:Giovanni Beltrame is a Professor in the Department of Computer and Software Engineering at Polytechnique Montréal\, where he leads the Making Innovative Space Technology (MIST) Laboratory. At MIST Lab\, Giovanni is conducting projects in collaboration with industry and government agencies in areas such as robotics\, disaster response\, and space exploration. His research interests include the modeling and design of embedded systems\, artificial intelligence\, and robotics\, with a particular emphasis on swarm robotics. He has participated in several field missions with ESA\, the Canadian Space Agency (CSA)\, and NASA\, including BRAILLE\, PANGAEA-X\, and IGLUNA. He has made significant contributions to the field of swarm robotics\, notably through the development of Buzz\, a programming language designed for heterogeneous robot swarms\, which facilitates the coordination and control of large groups of robots. \nThis talk will take place as part of SCIoI member Mohsen Raoufi’s seminar “Introduction to Collective Robotics: Where Complexity Meets Robotics\,” which provides an overview on the topic of collective robotics while exploring key areas like complexity science\, network science\, and engineering. \nImage created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/giovanni-beltrame-polytechnique-montreal-field-collective-robotics-challenges-and-applications/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2024/12/Giovanni_Mohsen2.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250120T140000
DTEND;TZID=Europe/Berlin:20250120T153000
DTSTAMP:20260420T170918
CREATED:20241024T160424Z
LAST-MODIFIED:20250102T131427Z
UID:22499-1737381600-1737387000@www.scienceofintelligence.de
SUMMARY:Rudolf Lioutikov (Karlsruher Institut für Technologie)\, "Versatile\, Language Conditioned Robots"
DESCRIPTION:More info will follow soon. \nThis talk will take place as part of SCIoI member Svetlana Levit’s seminar “Selected Topics in Robot Learning\,” which explores how advances in machine learning are helping robots operate in new environments\, learn new behaviors\, and adapt to changing conditions.
URL:https://www.scienceofintelligence.de/event/rudolf-lioutikov-karlsruher-institut-fur-technologie-versatile-language-conditioned-robots/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2024/10/Copy-of-zp-TU-HU-ExcelenzForschung-20240122-077___-scaled-e1729865955744.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250115T160000
DTEND;TZID=Europe/Berlin:20250115T180000
DTSTAMP:20260420T170918
CREATED:20241217T153357Z
LAST-MODIFIED:20250603T124411Z
UID:22907-1736956800-1736964000@www.scienceofintelligence.de
SUMMARY:Mary Ellen Foster (University of Glasgow)\, “Face-to-Face Conversation With Socially Intelligent Robots”
DESCRIPTION:When humans talk to each other face-to-face\, they use their voices\, faces\, and bodies together in a rich\, multimodal\, continuous\, interactive process. For a robot to participate fully in this sort of natural\, face-to-face conversation in the real world\, it must be able not only to understand the social signals of its human partners\, but also to produce appropriate signals in response. In this talk\, I will present recent research from my group in this area\, and will also discuss the issues involved in moving social robots from the lab to real-world contexts\, which involves consultation with a large number of stakeholders. \nDr Mary Ellen Foster is a Senior Lecturer in the School of Computing Science at the University of Glasgow. Her primary research interests are human-robot interaction\, social robotics\, and embodied conversational agents. She recently coordinated the MuMMER project\, a European Horizon 2020 project in the area of socially aware human-robot interaction\, and is currently coordinating a UK/Canada collaborative project investigating the use of socially intelligent robots in paediatric emergency rooms. She obtained her PhD from the University of Edinburgh in 2007 and has previously worked at the Technical University of Munich and Heriot-Watt University. \nThis talk will take place as part of SCIoI member Jonas Frenkel’s seminar “Artificial Social Intelligence.” It aims to provide a comprehensive exploration of ASI\, which involves the observation\, analysis\, and synthesis of social phenomena. It integrates synthetic sciences such as machine learning\, computer vision\, and robotics with cognitive science\, psychology\, neuroscience\, and the humanities to focus on the perception\, cognitive components\, and behaviors linked to social intelligence. \nThis talk will take place in person at SCIoI.
URL:https://www.scienceofintelligence.de/event/mary-ellen-foster-university-of-glasgow/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2024/12/Z__2526-e1734449630441.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250110T140000
DTEND;TZID=Europe/Berlin:20250110T153000
DTSTAMP:20260420T170918
CREATED:20241024T160017Z
LAST-MODIFIED:20250603T124424Z
UID:22496-1736517600-1736523000@www.scienceofintelligence.de
SUMMARY:Oliver Kroemer (Carnegie Mellon University)\, “Modularity and Learning To Structure Robot Manipulation Skills”
DESCRIPTION:Oliver Kroemer’s research focuses on developing algorithms and representations that enable robots to learn versatile manipulation skills over time. By equipping robots with the ability to acquire new skills and adapt manipulations to novel situations\, his work opens up a wide range of potential applications—from assisting the elderly and maintaining parks and public spaces to operating in hazardous environments. \nOliver has developed methods that allow robots to learn about objects through physical interactions and autonomously refine their skills using reinforcement learning. Additionally\, he has proposed innovative representations for capturing key aspects of manipulations\, such as contact states and motor primitives\, to enhance generalization across different tasks and scenarios. \nThe ultimate aim of his research is to create a life-long learning framework that enables robots to continuously acquire and improve manipulation skills\, paving the way for more adaptable and capable robotic systems. \nThis talk will take place as part of SCIoI member Svetlana Levit’s seminar “Selected Topics in Robot Learning\,” which explores how advances in machine learning are helping robots operate in new environments\, learn new behaviors\, and adapt to changing conditions. \nImage generated with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/oliver-kroemer-carnegie-mellon-university-modularity-and-learning-to-structure-robot-manipulation-skills/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/webp:https://www.scienceofintelligence.de/wp-content/uploads/2024/10/Kroemer-e1734442005415.webp
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250107T160000
DTEND;TZID=Europe/Berlin:20250107T180000
DTSTAMP:20260420T170918
CREATED:20241217T123131Z
LAST-MODIFIED:20250603T124437Z
UID:22899-1736265600-1736272800@www.scienceofintelligence.de
SUMMARY:Carlo Pinciroli (Worcester Polytechnic Institute)\, “Simulation Platforms and sim2real Gap”
DESCRIPTION:Carlo Pinciroli is an Associate Professor and Graduate Coordinator of the Robotics Engineering department at Worcester Polytechnic Institute (WPI)\, where he leads the NEST (Novel Engineering for Swarm Technologies) Lab. With additional appointments in WPI’s Artificial Intelligence Program\, Computer Science\, and Fire Protection Engineering\, his research centers on swarm robotics. \nHe is the creator of ARGoS\, a widely used\, high-performance robot swarm simulator\, and the driving force behind Buzz\, a programming language designed for real-world robot swarms\, recognized by MIT Technology Review and Communications of the ACM. His work has received funding from NSF\, NASA\, Amazon Science\, Raytheon Technologies\, and other major institutions\, advancing both simulation accuracy and real-world swarm applications. \nThis talk will take place as part of SCIoI member Mohsen Raoufi’s seminar “Introduction to Collective Robotics: Where Complexity Meets Robotics\,” which provides an overview of collective robotics while exploring key areas like complexity science\, network science\, and engineering. \nThis talk will take place in person at SCIoI and will be available as a live stream. \nImage generated with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/carlo-pinciroli-worcester-polytechnic-institute-simulation-platforms-and-sim2real-gap/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/png:https://www.scienceofintelligence.de/wp-content/uploads/2024/12/Pinciroli-e1734438654860.png
END:VEVENT
END:VCALENDAR