https://vimeo.com/248253044
The groundwork for decision making (a form of intelligence) is already present in unicellular organisms.

'Single-Celled Organism Appears to Make Decisions' - https://www.the-scientist.com/news-opinion/single-celled-organism-appears-to-make-decisions-66818
Lisa Marleen Mantel and Laura Wagner: Generating human-insect hybrids with GANs
https://www.youtube.com/watch?v=E8oIitQN2M4
https://www.kyb.tuebingen.mpg.de/426726/sensory-and-sensorimotor-systems
Other possible contact: Brandon Ballengée; https://brandonballengee.com
"Brandon Ballengée (American, born 1974) is a visual artist, biologist and environmental educator based in Louisiana."
Notes and further References from our conversation
FIND A BIOLOGY / SCIENTIFIC PARTNER TO HAVE A SCIENTIFICALLY BASED CONVERSATION
AND LEGITIMISE YOUR WORK.
Possible partner:
Max Planck Institut für Biologische Kybernetik
> Make an appointment > see how they work
FIND An AI SPECIALIST TO HAVE A SCIENTIFICALLY BASED CONVERSATION
AND LEGITIMISE YOUR WORK.

Mattis Kuhn ?
Alexander König ? 
Ars Electronica Centre, Linz ?
Futurium Berlin ?
NATURAL INTELLIGENCE LAB !!

Recording sensory stimuli for behaviour and decision-making. (Processes of learning and intelligence in animals.)
THE STRONG ANALOGY BETWEEN HOW ANIMAL/BIOLOGICAL VISION WORKS (DECODING ETC.) AND HOW MACHINES PROCESS IMAGES.

AND ALSO THE REDUCTION OF INFORMATION THROUGH THE LEVELS IT HAS TO PASS THROUGH, WHICH REDUCE THE WEALTH OF POSSIBLE INFORMATION - JUST LIKE MY PROCESS OF TRANSLATING FROM ANIMAL VIA HUMAN TO MACHINE
I COULD TAKE IT ONE STEP FURTHER IN MY AUDIOVISUAL REPRESENTATION OF DATA AND NOT SHOW HOW IT IS NOW BUT SHOW HOW IT COULD BE IF MODELLED MORE CLOSELY ON ANIMAL/BIOLOGICAL PROCESSING ...
I think questions of ATTENTION and BEHAVIOUR are also really important and something we need to consider fully when filming.
Which features in the environment are relevant for decisions?
How is behaviour adapted, i.e. LEARNING?
My animals are also 'model organisms'.
Looking at animal research that is used to study intelligent behaviour, learning and sensory capacity
reveals an actually obvious connection: my own view of the animals coincides with this research. We look at animals as model organisms in order to draw comparisons and conclusions about our own human abilities and biological systems.
I think that learning more about how scientific work with animal experiments is done can show me in which context I frame my considerations, and how these can then be further contextualised in the data visualisation. (That, perhaps, rather than it drastically influencing the filming itself.)
i.e. insights about humans are gained through animal experiments and inference, and yet we also understand that animal perception can actually work in a completely different way.
1) The Department for Sensory and Sensorimotor Systems

Research groups
> Psychophysics lab
> fMRI lab
> Zebrafish lab < my point of contact
It would also be interesting to get an insight into how research with biological neural networks is conducted.
https://www.kyb.tuebingen.mpg.de/de/systems-neuroscience-neuroengineering
https://www.mpg.de/kuenstliche-intelligenz
Max-Planck-Institut für Intelligente Systeme

Our goal is to understand the principles of perception, learning and action in autonomous systems that interact with complex environments, and to use this understanding to develop artificial intelligent systems. The scientists at the Max-Planck-Institut für Intelligente Systeme investigate these principles in biological, hybrid and computer systems as well as in materials, at scales ranging from nano to macro. With our strongly interdisciplinary approach we combine mathematical models, computer science, materials science and biology.
https://www.is.mpg.de/de
It could also be something along the lines of this headline: "Zebra finches sing in different dialects - artificial intelligence analyses the songs of zebra finches"
https://www.mpg.de/18478116/0328-orni-zebrafinken-singen-in-unterschiedlichen-dialekten-154562-x?c=17657895
Max Planck Institute for Biological Intelligence, 'in founding'...
Max-Planck-Gesellschaft
Max Planck Institute for Intelligent Systems
"The scientists at the Max Planck Institute for Intelligent Systems investigate the fundamental questions of perception, action and learning that underlie intelligent systems."
"...pursues the goal of gaining a fundamental understanding of perception, learning and adaptation in complex systems."
It's interesting how they do research on animals but then publish research results on humans.
But I am less interested in the translation to humans and more in the animals themselves.


The encoding, decoding and selection of visual information relates to how machines process information. She states that vision in V1 first focuses on lines and edges - this is similar to the filters in convolutional artificial neural networks.
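The V1/CNN analogy can be made concrete: a single convolutional filter slid across an image responds strongly to edges of one orientation, much like an orientation-selective V1 cell. A minimal sketch with numpy; the Sobel-style kernel is illustrative, not taken from any particular network.

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution (no padding), like a single CNN filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A Sobel-style kernel responds to vertical edges, roughly like an
# orientation-selective cell in V1.
vertical_edge = np.array([[-1, 0, 1],
                          [-2, 0, 2],
                          [-1, 0, 1]], dtype=float)

# Toy image: dark left half, bright right half -> one vertical edge.
img = np.zeros((5, 5))
img[:, 3:] = 1.0

response = convolve2d(img, vertical_edge)
# The filter fires only where the intensity changes (at the edge)
# and stays silent in the flat regions.
```

In a trained CNN the kernels are learned rather than hand-set, but the first layers typically converge to exactly this kind of edge and line detector.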
"The ability to develop systems for autonomous robotics and intelligent software is a key future technology for industry, transport and logistics, as well as for our society as a whole. ... Intelligent systems in nature - including humans - have, through interaction, evolution and learning, developed sophisticated abilities to succeed in the world. However, our understanding of these phenomena is still very limited, and the synthesis of intelligent, autonomous and learning systems remains a great scientific challenge."
This brings out the future-oriented aspect of my work in yet another way: it is 'important' to understand intelligent biological systems and to design/program them for future application.
Max-Planck-Institut für Intelligente Systeme
https://www.is.mpg.de/de/overview
“The scientists of the institute investigate the organisational principles of intelligent systems and want to understand these as well as the underlying loop of perceiving – acting – learning.”

Departments in Tübingen: machine learning, computer vision, robotics, control and regulation, and the theory of intelligent systems.

(1) Empirical Inference - Tübingen site
“The researchers' primary goal is to understand how living beings and artificial systems recognise structures in order to act in the world.” “Our department is conducting theoretical, algorithmic, and experimental studies to try and understand the problem of empirical inference.”

(2) Perceiving Systems
“The 'Perceiving Systems' department combines computer vision, machine learning and computer graphics with the goal of teaching computers to understand people and their behaviour in images and videos. The department's approach is unique in that mathematical models of human 3D shape and motion are built via machine learning and described with comparatively few parameters. These models are used to extract and analyse human movement behaviour from 3D scenes. The department employs around 45 staff and students as well as further affiliated researchers. It operates dedicated 4D scanners that capture, at 60 frames per second, highly precise and detailed 3D meshes of bodies, faces, hands and feet. In addition, wearable motion-capture systems, flying robots and highly specialised camera systems are used for recording.”

We combine research on computer vision, computer graphics, and machine learning to teach computers to see and understand humans and their behavior. A key goal is to learn digital humans.

https://ps.is.mpg.de/
https://uni-tuebingen.de/en/fakultaeten/mathematisch-naturwissenschaftliche-fakultaet/fachbereiche/informatik/lehrstuehle/autonomous-vision/home/
https://al.is.mpg.de/
(3) Autonomous Motion
The Autonomous Motion Department has its focus on research in intelligent systems that can move, perceive, and learn from experiences.

“We are interested in investigating such perception-action-learning loops in biological systems and robotic systems, which can range in scale from nano systems (cells, nano-robots) to macro systems (humans, and humanoid robots).”



(4) Autonomous Vision Group - young people / students!
We are interested in computer vision and machine learning with a focus on 3D scene understanding, parsing, reconstruction, material and motion estimation for autonomous intelligent systems such as self-driving cars or household robots. In particular, we investigate how complex prior knowledge can be incorporated into computer vision algorithms for making them robust to variations in our complex 3D world.

(5) Autonomous Learning Group
We are interested in autonomous learning, that is how an embodied agent can determine what to learn, how to learn, and how to judge the learning success. In particular, we focus on learning to control a robotic body in a developmental fashion. Artificial intrinsic motivations are a central component that we develop using information theory and dynamical systems theory. We work on reinforcement learning, representation learning, and internal model learning.
https://ps.is.mpg.de/projects/optical-flow-for-mostly-rigid-scenes
https://ps.is.mpg.de/projects/high-level-priors
"Our geometric reasoning is valid for all types of natural scenes"
https://ps.is.mpg.de/publications/wulff-2018-thesis
Scenes, Structure and Motion Focus
The Perceiving Systems department also develops datasets: https://ps.is.mpg.de/research_fields/datasets-and-code
One of which is 'Slow Flow':
https://ps.is.mpg.de/research_fields/datasets-and-code
This sounds like exactly the type of use case my dataset could be useful for.
This brings back the idea of using a depth-sensor camera while filming - not to composite it into the animal vision, but rather to create the transition to the machine vision.
https://ps.is.mpg.de/research_fields/data-team
In our conversation the idea came up to work with the 'visualiser' and pull out images. What they did here was to install StyleGAN 3 locally and train it on a set of 2000 images they shot in the studio. Apparently StyleGAN works well with slight variations, and 'overfitting' isn't a problem. By feeding in these slightly different images I can get neural-network-adapted versions and extract the filters/layers through this 'visualiser'.
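The 'visualiser' workflow isn't spelled out here, but one common way to pull images out of a trained StyleGAN is to walk through its latent space: pick two latent codes and render the intermediate points. A minimal sketch with numpy, assuming the 512-dimensional latents StyleGAN uses; the random vectors stand in for codes obtained by projecting studio images, and the actual generator is not included.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
LATENT_DIM = 512  # StyleGAN latent vectors are 512-dimensional

def interpolate_latents(z_a, z_b, steps):
    """Linear walk from latent z_a to z_b; each intermediate code
    would be fed to the trained generator to render one frame."""
    ts = np.linspace(0.0, 1.0, steps)
    return [(1.0 - t) * z_a + t * z_b for t in ts]

# Two latent codes standing in for two of the 2000 studio images
# projected into the trained network's latent space.
z_start = rng.standard_normal(LATENT_DIM)
z_end = rng.standard_normal(LATENT_DIM)

path = interpolate_latents(z_start, z_end, steps=10)
# path[0] is z_start, path[-1] is z_end; rendering the codes in
# between yields the morphing 'pulled out' images.
```

Linear interpolation is the simplest choice; StyleGAN tooling often uses spherical interpolation instead, but the principle is the same.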
https://robertseidel.com/hysteresis/
https://www.drones.org/news/remote-controlled-cockroaches-at-the-touch-of-a-button-156/
https://maxplanckneuroscience.org/everything-is-relative-how-flies-see-the-world/
Perceiving Systems department
https://www.ab.mpg.de/events/30880/345436
https://www.ab.mpg.de/events/30769/345436
Scientific contacts
Institut für Zoologie und Evolutionsforschung in Jena

Prof. Dr. Martin S. Fischer - Professorship of Special Zoology and Evolutionary Biology
martin.fischer@uni-jena.de
+49-3641-9-49141
- ran a project in which he live-X-rayed dogs while they were running

PD Dr. Manuela Schmidt - Professorship of Special Zoology and Evolutionary Biology
schmidt.manuela@uni-jena.de
+49-3641-9-49188
- specialises in morphology and physiology

Prof. Dr. Manuela Nowotny - Professorship of Animal Physiology
manuela.nowotny@uni-jena.de
+49-3641-9-49101
- lecture material available from her; she researches crickets



Dr. Berthold Hedwig - researches crickets & grasshoppers in Cambridge
bh202@cam.ac.uk
- https://www.neuroscience.cam.ac.uk/directory/profile.php?bh202
- http://www.cab.zju.edu.cn/cabzbenglish/2018/0702/c20829a819333/page.htm
- https://www.amazon.com/Insect-Hearing-Acoustic-Communication-Signals-ebook/dp/B00GICG2RO


Alison Barker - vocal communication in the naked mole-rat
- email communication about the random forest classifier

Biology
AI
Alexander König
Mattis Kuhn
Laura Juliane Wagner & Lisa-Marleen Mantel

Ars Electronica Futurelab Researchers and Artists


"...creating an ultra-realistic immersive world has given rise to a thriving offshoot that’s far removed from the glare of game arcades. For the last two decades, biologists have been using VR as a tool to reveal fundamental principles about the neuronal circuitry underpinning behaviour in animals. And in Konstanz, behavioural biologists are joining forces with computer scientists to push the limits of this technology to gain insights into decision-making in animal collectives that were previously inaccessible.

What we call “VR” is technically defined as an immersive environment where the sensory organs (such as visual or auditory) of the user are artificially stimulated to alter the perception of reality. While we usually think of people using VR, animals can also be placed in virtual environments. But in place of a headset, the animal’s whole body is within the space."



...an animal immersed in a realistic and dynamic, yet synthetic world...

So why would you put animals into virtual worlds? To answer this, it’s important to understand the powerful technique of artificial stimulation for studying behaviour. Artificial stimuli can reliably elicit behaviours in animals during experiments, thereby providing a deeper understanding of the decision-making of animals. ..experimenters can tweak properties of the artificial stimuli and plan the timing of delivery to systematically test behaviours.

Video playback has a major limitation: it does not react to the animal viewing it. Yet in the real world, action and reaction are intricately linked. If a spider acts aggressively, the spider observing it should also respond. So for biologists to turn digital stimuli into something closer to reality, they had to link action and reaction.

The moment when animal experiments broke into true VR territory, in the early 2000s, was when technology was capable of simulating the real world in two important ways: the animal could see the world from an egocentric perspective and, crucially, that world reacted in real time.

Navigation in virtual mazes allowed scientists to pinpoint circuits that underlie cognition, learning, and memory.

In the FreemoVR system, individual animals are embedded in a photorealistic synthetic world in which they can interact with virtual organisms, or inspect and move around virtual obstacles, just as they do in the real world. Graphics are projected into the volume to create a virtual world in full 3D with depth cues. To ensure the illusion is preserved, the animal’s movement is tracked and the graphics are updated accordingly.
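FreemoVR's actual pipeline isn't detailed here, but the closed loop it describes (track the animal, re-render the world from its egocentric viewpoint every frame) can be sketched. All names are hypothetical; `render_egocentric` stands in for the real projection system.

```python
import numpy as np

def render_egocentric(animal_pos, obstacle_pos):
    """Stand-in for the renderer: returns the virtual obstacle's
    position relative to the animal (its egocentric view)."""
    return obstacle_pos - animal_pos

def closed_loop(tracked_positions, obstacle_pos):
    """One 'frame' per tracked position: the scene is re-rendered
    each time the animal moves, so the world reacts in real time."""
    frames = []
    for pos in tracked_positions:  # tracker output, frame by frame
        frames.append(render_egocentric(np.asarray(pos, float),
                                        np.asarray(obstacle_pos, float)))
    return frames

# A fish swims toward a virtual obstacle fixed at (5, 0, 0):
track = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
frames = closed_loop(track, (5, 0, 0))
# From the fish's point of view the obstacle approaches:
# its relative x-coordinate shrinks frame by frame.
```

The crucial property is the coupling: unlike video playback, the stimulus in each frame depends on what the animal just did.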

In his research programme in Konstanz, Couzin has applied this platform to study fish, locusts, and flies, and in doing so has begun to decipher the pathways of communication in animal collectives.

“Virtual reality offers a means of controlling causality,” says Couzin.

...“Combined with some cool projection systems you can really start fooling humans about reality.”

But of course, it’s not enough to design a realistic virtual world for us. The animals must perceive it to be real. Considering solely vision, many animal species show a range of properties that differ from our own. For instance, the human visual system merges a stream of images into a continuous percept when presented with a refresh rate of at least 30 images per second, whereas this happens at 200 images per second in insects.
https://www.campus.uni-konstanz.de/wissenschaft/how-vr-became-a-key-to-unlocking-animal-behaviour
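The consequence of those flicker-fusion rates for an animal VR rig can be put in numbers: the frame budget is simply the reciprocal of the required refresh rate. A back-of-the-envelope check using the figures quoted above (30 images/s for humans, 200 images/s for insects).

```python
def frame_budget_ms(fusion_rate_hz):
    """Maximum time per frame (in ms) before the display visibly flickers."""
    return 1000.0 / fusion_rate_hz

human_budget = frame_budget_ms(30)    # ~33.3 ms per frame suffices for us
insect_budget = frame_budget_ms(200)  # 5 ms: an insect display must refresh
                                      # more than six times faster
```

So a projection system that looks perfectly smooth to the experimenter can still appear as a slideshow to the insect inside it.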
I keep being irritated by their focus on the collectives - but in the end any collective behaviour is also individual behaviour and says a lot about the actions and decision-making of these model organisms.
To develop animal VR they need to build it for the animal's sight, so they know about going into the animal and trying to see through its eyes. So it's the cross-section of using vision to gain insight about behaviour. (Not researching vision itself. But as it is a somewhat more general and more media-based approach, I think it fits better with what I am doing than going to one researcher to learn how they establish insights on, e.g., mouse vision.)
!!!
Virtual Environments have become reliable tools for studying vision, cognition, and sensory-motor control in animals.
Biologists and computer scientists can benefit greatly from deepening interdisciplinary research in this emerging field and together we can develop new methods for conducting fundamental research in behavioral sciences and engineering.
Important article that lead me to the 'right' people: https://www.campus.uni-konstanz.de/wissenschaft/how-vr-became-a-key-to-unlocking-animal-behaviour
Publication on 'Animals in Virtual Environments' https://ieeexplore.ieee.org/document/8998379
https://collectivebehaviour.com/positions/
https://www.exc.uni-konstanz.de/collective-behaviour/news-and-events/events-detail/2022/6/20/event/45416-CASCB-talk-Hawkmoth-neur/tx_cal_phpicalendar/
https://www.pnas.org/doi/10.1073/pnas.2102157118