BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//scienceofintelligence.de - ECPv6.15.12.2//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:scienceofintelligence.de
X-ORIGINAL-URL:https://www.scienceofintelligence.de
X-WR-CALDESC:Events for scienceofintelligence.de
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Berlin
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20240331T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20241027T030000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T030000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T030000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250626T140000
DTEND;TZID=Europe/Berlin:20250626T180000
DTSTAMP:20260423T121754Z
CREATED:20250402T101646Z
LAST-MODIFIED:20250618T134257Z
UID:24009-1750946400-1750960800@www.scienceofintelligence.de
SUMMARY:Michael Brecht\, "Active touch and Large-Brain Neuroscience in Elephants" and Yasemin Vardar\, "Active Synthetic Touch: Generating Naturalistic Multisensory Tactile Stimuli for Active Exploration"
DESCRIPTION:Michael Brecht (BCCN Berlin) will present data on a systematic investigation of brains and grasping behavior in elephants. The analysis of sensory nerves suggests that elephants are extremely tactile animals. In elephants\, trunk whisker length is lateralized as a result of heavily lateralized trunk behaviors. The elephant trunk tip appears to be represented by a large cortical three-dimensional trunk-tip model\; this observation is reminiscent of the somatosensory cortical snout representation in pigs. The trunk musculature of elephants is breathtakingly complex and filigreed. Trunk morphology\, motor neuron organization\, and grasping differ between African elephants (which pinch objects with their two trunk fingers) and Asian elephants (which have only one finger and wrap objects with their trunk).\nHe will discuss the potential of novel X-ray technologies for large-brain analysis. Both behavioral analysis and elephant neuroanatomy reveal striking differences between individual elephants. Thus\, it appears that elephants are less equal than other animals. \nImagine you could feel your pet’s fur on a Zoom call\, the fabric of the clothes you are considering purchasing online\, or tissues in medical images. We are all familiar with the impact of the digitization of audio and visual information in our daily lives – every time we take videos or pictures on our phones. Yet\, there is no such equivalent for our sense of touch. This talk will cover Yasemin Vardar’s (Delft University of Technology) scientific efforts to digitize naturalistic tactile information over the last decade. She will explain the methodologies and interfaces she has been developing with her team and collaborators for capturing\, encoding\, and recreating the perceptually salient features of tactile textures for active bare-finger interactions. She will also discuss current challenges\, future research paths\, and potential applications in tactile digitization.
\nThis talk is part of Olga Shurygina’s course “Active Sensing\,” a seminar on cutting-edge research on active sensory perception in humans and other mammals\, and related advances in artificial agents’ abilities such as seeing\, grasping\, and navigating in space. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/michael-brecht-bccn-berlin-and-yasemin-vardar-delft-university-of-technology-active-touch/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/chatgtp12.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250701T140000
DTEND;TZID=Europe/Berlin:20250701T153000
DTSTAMP:20260423T121754Z
CREATED:20250407T095720Z
LAST-MODIFIED:20250627T085132Z
UID:24183-1751378400-1751383800@www.scienceofintelligence.de
SUMMARY:POSTPONED: Alan Winfield (UWE Bristol) & Dafna Burema (Science of Intelligence)
DESCRIPTION:This event has been postponed to 29 July 2025. \nHow should we think about ethics when machines become part of our social worlds? Alan Winfield and Dafna Burema will explore the ethical and societal dimensions of robotics and AI in an interactive fishbowl and in conversation with Master’s students of the course “Introduction to Modeling Collective Behavior”. Alan Winfield\, a pioneer in the field of robot ethics\, will share insights from his work on cognitive robotics\, science communication\, and the development of ethical standards for intelligent systems.\nDafna Burema brings a sociological lens\, focusing on how AI and robots shape—and are shaped by—social values\, particularly in sensitive areas like eldercare. Together\, they’ll reflect on how society can critically engage with intelligent technologies and what ethical frameworks might guide their integration into collective life. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research combined with multiple interactive elements. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/alan-winfield-uwe-bristol-dafna-burema-science-of-intelligence/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp8.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250711T140000
DTEND;TZID=Europe/Berlin:20250711T160000
DTSTAMP:20260423T121754Z
CREATED:20250402T102151Z
LAST-MODIFIED:20250711T082214Z
UID:24013-1752242400-1752249600@www.scienceofintelligence.de
SUMMARY:William Warren (Brown University)\, "The Dynamics of Perception and Action: From Pedestrian Interactions to Collective Behavior"
DESCRIPTION:It’s a perplexing time in the study of visual perception. On the one hand\, there is a resurgence of models that freely posit a priori structure in the visual system\, including priors\, generative world models\, and physics engines. On the other hand\, there is the astonishing a posteriori success of deep neural networks trained only on natural images and image sequences. Although their performance offers an existence proof of the sufficiency of image information for certain visual tasks\, the black box of deep learning does not easily offer up that information or how it’s extracted by the visual system. \nA science of perception depends on understanding the visual information that is available in natural environments and is used to guide natural behavior. I propose that we take seriously James Gibson’s information hypothesis: For every perceivable property of the environment\, however subtle\, there must be a variable of information\, however complex\, that uniquely specifies it. The project is to identify the information that the visual system uses to perceive and act within the constraints of a species’ ecological niche. \nTwo decades ago I decided to work out a test case to see whether an information-based account of a natural behavior could be sustained. In this talk I will offer a status report on our effort to build a model of visually controlled human locomotion – a pedestrian model – that scales up from individual behaviors like steering and obstacle avoidance\, to pedestrian interactions like following and collision avoidance\, to the collective behavior of human crowds. Surprisingly\, linear combinations of these nonlinear components can account for the emergence of more complex behavior\, such as self-organized ‘flocking’\, crowd bifurcations\, and stripe formation in crossing flows. \nBio \nBill (he/him) earned his undergraduate degree at Hampshire College (1976)\, his Ph.D. 
in Experimental Psychology from the University of Connecticut (1982)\, did post-doctoral work at the University of Edinburgh\, and has been a professor at Brown ever since. He served as Chair of the Department of Cognitive and Linguistic Sciences from 2002-10. Warren is the recipient of a Fulbright Research Fellowship\, an NIH Research Career Development Award\, and Brown’s Elizabeth Leduc Teaching Award for Excellence in the Life Sciences. Warren’s research focuses on the visual control of action – in particular\, human locomotion and navigation. He seeks to explain how this behavior is adaptively regulated by multi-sensory information\, within a dynamical systems framework. Using virtual reality techniques\, his research team investigates problems such as the visual control of steering\, obstacle avoidance\, wayfinding\, pedestrian interactions\, and the collective behavior of crowds. Experiments in the Virtual Environment Navigation Lab (VENLab) enable his group to manipulate what participants see as they walk through a virtual landscape\, and to measure and model their behavior. The aim of this research is to understand how adaptive behavior emerges from the dynamic interaction between an organism and its environment. He believes the answers will not be found only in the brain\, but will strongly depend on the physical and informational regularities that the brain exploits. This work contributes to basic knowledge that is needed to understand visual-motor disorders in humans\, and to develop mobile robots that can operate in novel environments. For more information\, visit his faculty profile or the VENLab website. 
\nFor those who are not in Berlin but would like to join virtually:\nhttps://tu-berlin.zoom-x.de/j/69207754612?pwd=IKxoTdY3dQWccHpce2nA0IsNkNxPHu.1 \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research highlighting this shift in perspective through invited lectures from experts in the field and interactive sessions. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/william-warren-brown-university/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/chatgtp3.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250718T140000
DTEND;TZID=Europe/Berlin:20250718T153000
DTSTAMP:20260423T121754Z
CREATED:20250429T090411Z
LAST-MODIFIED:20250716T122502Z
UID:24495-1752847200-1752852600@www.scienceofintelligence.de
SUMMARY:Jacob Yates (UC Berkeley)\, "The Role of Motor Signals in Visual Cortex"
DESCRIPTION:Embodiment is fundamental to biological intelligence. Brains do not passively receive the world but actively shape what they sense through self-motion. For nearly a century\, we have known that perception and action are deeply entangled\, and that organisms must constantly infer whether a sensory change comes from the environment or from themselves. A longstanding idea holds that sensory signals are either suppressed during movement or that movement effects are subtracted out. However\, recent discoveries in neuroscience\, especially in rodents\, suggest that spontaneous movements strongly influence sensory cortex. In this talk\, I will share our work re-examining this question in primates. We found that movements do not broadly modulate visual cortex unless they move the retina\, creating an inherent ambiguity between motor effects and changes in sensory input. I will describe our new approach to disentangling sensorimotor interactions during natural behavior\, combining high-resolution eye tracking with high-density neural recordings and modern machine learning. By precisely measuring the retinal input during natural vision\, we find that much of what appears to be a motor signal is actually visual reafference\, the lawful\, structured sensory consequences of an animal’s own actions. I will discuss how measuring and modeling this loop can deepen our understanding of active inference in the brain and what it means for designing truly embodied agents that adapt to the world as brains do. \nBio \nJacob Yates (he/him) is an Assistant Professor of Optometry & Vision Science at UC Berkeley and leads the Active Vision and Neural Computation Lab. His research explores how populations of neurons in the cortex and early visual pathways encode the visual world\, with a particular focus on how eye movements generate and utilize information for perception. 
By combining statistical and machine learning approaches\, his lab builds computational models to better understand neural activity and human perception\, ultimately aiming to bridge the gap between neural coding and real-world visual behavior. \nThis talk is part of Aravind Battaje’s course “Mind\, Body\, Environment: An Interactive Seminar on Embodied Intelligence\,” a seminar introducing key theories and research highlighting this shift in perspective through invited lectures from experts in the field and interactive sessions. \nPhoto by Soliman Cifuentes on Unsplash.
URL:https://www.scienceofintelligence.de/event/jacob-yates-uc-berkeley/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/soliman-cifuentes-RXGLTHZ6Mo8-unsplash-scaled.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250729T140000
DTEND;TZID=Europe/Berlin:20250729T160000
DTSTAMP:20260423T121754Z
CREATED:20250627T090602Z
LAST-MODIFIED:20250724T124821Z
UID:25799-1753797600-1753804800@www.scienceofintelligence.de
SUMMARY:Alan Winfield (UWE Bristol) & Dafna Burema (Science of Intelligence)
DESCRIPTION:How should we think about ethics when machines become part of our social worlds? Alan Winfield and Dafna Burema will explore the ethical and societal dimensions of robotics and AI in an interactive fishbowl and in conversation with Master’s students of the course “Introduction to Modeling Collective Behavior”. Alan Winfield\, a pioneer in the field of robot ethics\, will share insights from his work on cognitive robotics\, science communication\, and the development of ethical standards for intelligent systems.\nDafna Burema brings a sociological lens\, focusing on how AI and robots shape—and are shaped by—social values\, particularly in sensitive areas like eldercare. Together\, they’ll reflect on how society can critically engage with intelligent technologies and what ethical frameworks might guide their integration into collective life. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research combined with multiple interactive elements. \nPhoto created with DALL-E by Maria Ott.
URL:https://www.scienceofintelligence.de/event/alan-winfield-uwe-bristol-dafna-burema-science-of-intelligence-2/
LOCATION:Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/02/chatgtp8.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/Berlin:20250925T160000
DTEND;TZID=Europe/Berlin:20250925T173000
DTSTAMP:20260423T121754Z
CREATED:20250429T091548Z
LAST-MODIFIED:20250924T073557Z
UID:24506-1758816000-1758821400@www.scienceofintelligence.de
SUMMARY:Ariana Strandburg-Peshkin (MPI-AB & the University of Konstanz)\, "Communication and coordination in animal societies"
DESCRIPTION:Abstract: \nMany social species use signals such as vocalizations to coordinate a range of group behaviors\, from coming to consensus on where to move to banding together against threats. Despite their widespread importance\, these behaviors remain challenging to study in the wild because doing so requires monitoring many individuals simultaneously. In this talk\, I will give an overview of our work tracking the movements and vocalizations of entire social groups in the wild to tackle questions at the interface of communication and collective behavior. What role does vocal signaling play in the coordination of collective movement? What drives groups to split up? And how do vocalizations mediate collective action against external threats? I will explore how we and our collaborators are addressing these questions in three species of social carnivore that coordinate across varying spatial scales – highly cohesive meerkat groups\, moderately cohesive coati groups\, and fission-fusion hyena clans. \nThis talk is part of David Mezey’s course “Introduction to Modeling Collective Behavior\,” a seminar on collective behavior research combined with multiple interactive elements. \nPhoto by Gertrūda Valasevičiūtė on Unsplash.
URL:https://www.scienceofintelligence.de/event/ariana-strandburg-peshkin-mpi-ab-the-university-of-konstanz/
LOCATION:SCIoI\, Marchstraße 23\, 10587 Berlin\, Room 2.057
CATEGORIES:Hot Topics in Intelligence Research
ATTACH;FMTTYPE=image/jpeg:https://www.scienceofintelligence.de/wp-content/uploads/2025/04/gertruda-valaseviciute-xMObPS6V_gY-unsplash-scaled.jpg
END:VEVENT
END:VCALENDAR