Automated analysis of animal behavior and its relation to key aspects of the environment reveals new cognitive specializations of neurons

Abstract

A thorough analysis of animal behavior is essential for relating the activations of specific neurons to elements of the external environment, to behavior, or to the animal’s internal state. Machine learning techniques have made some progress toward automatic segmentation of animal behavior from data on the location of animal body parts [1–3]. At present, however, these methods neither achieve the desired segmentation accuracy nor relate an animal’s behavioral acts to key environmental factors. To address this, we have created a software package that extracts a variety of behavioral variables from video recordings of animals in experimental settings, enabling mathematical analysis of the continuum of behavioral acts.

Identifying specific body parts of the animal is crucial for extracting a wide range of behavioral variables. To accomplish this, we employed DeepLabCut, an open-source toolkit for tracking experimental animals that operates on the principle of transfer learning with deep neural networks [3]. We devised a procedure for determining the positions of body parts across diverse behavioral situations, resulting in a set of body parts that meets two criteria: high sensitivity to small motor movements of the animal and a high percentage of correctly located points. For recordings with an overhead camera, this set comprises the nose, ears, tail base, body center, forelimbs, hind limbs, and both flanks of the animal’s body.
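The sketch below illustrates a minimal DeepLabCut workflow of the kind described here. The project name, video paths, and exact body-part labels are illustrative assumptions, not the authors’ actual configuration; in DeepLabCut, the body-part set is defined in the project’s config.yaml.

```python
# A minimal DeepLabCut workflow (a sketch, not the authors' exact pipeline).
import deeplabcut

videos = ["videos/arena_session1.mp4"]  # hypothetical recording

# Create a project; this returns the path to its config.yaml
config_path = deeplabcut.create_new_project(
    "OpenFieldTracking", "lab", videos, copy_videos=True
)

# In config.yaml, list the body parts used here, e.g.:
# bodyparts: [nose, ear_left, ear_right, tail_base, body_center,
#             forelimb_left, forelimb_right, hindlimb_left, hindlimb_right,
#             flank_left, flank_right]

deeplabcut.extract_frames(config_path)           # sample frames for labeling
deeplabcut.label_frames(config_path)             # manual labeling GUI
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)            # transfer learning
deeplabcut.evaluate_network(config_path)         # check labeling accuracy
deeplabcut.analyze_videos(config_path, videos, save_as_csv=True)
```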

Next, we created software tools to extract and annotate behavioral variables from animal kinematics data in various cognitive tasks. Our automated system comprises two scripting modules: CreatePreset and BehaviorAnalyzer. CreatePreset interactively prompts the user to specify the arena geometry, object locations, and the temporal and spatial parameters required for analysis. Its output is saved as a MAT-file and reused to analyze behavior in all videos of an experiment, under the assumption that the relative position of the arena and the video camera, as well as the experimental design, remains constant. BehaviorAnalyzer performs initial processing of the time series of body-part coordinates, producing a kinematogram that details the kinematics of the body parts; the module then isolates individual behavioral acts and annotates the animal’s behavior with respect to motivational and environmental factors.
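For illustration only, here is a hypothetical preset of the kind a CreatePreset-style module might save as a MAT-file, written from Python with SciPy. Every field name and value is an assumption about plausible contents, not the authors’ actual format.

```python
# Hypothetical preset contents; all field names are assumptions,
# not the authors' actual format.
import numpy as np
from scipy.io import savemat, loadmat

preset = {
    "arena_shape": "circle",                   # arena geometry type
    "arena_center_px": np.array([512, 512]),   # arena center, pixels
    "arena_radius_px": 480.0,
    "px_per_cm": 9.6,                          # spatial calibration
    "fps": 30.0,                               # temporal calibration
    "object_centers_px": np.array([[300, 300], [700, 650]]),
    "object_zone_cm": 4.0,                     # interaction zone around objects
    "speed_rest_cm_s": 1.0,                    # below this speed: rest
    "min_act_duration_s": 0.5,                 # shortest annotated act
}
savemat("preset_open_field.mat", preset)

# A BehaviorAnalyzer-style module would then reload the same preset for
# every video of the experiment:
preset = loadmat("preset_open_field.mat")
```

Saving the preset once and reusing it for every video is what makes the assumption of a fixed arena-to-camera geometry necessary.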

Using mutual information-based methods, we analyzed the specialization of hippocampal CA1 neurons in animals exploring arenas with varying degrees of novelty. The analysis identified neurons selective for specific continuous kinematic parameters of the animal’s posture and trajectory: its location in the arena (X and Y coordinates), its speed, and the rotation angle of its head (i.e., absolute orientation in the arena). We also identified neurons specialized for discrete behavioral acts, including rest, locomotion, freezing, rearing, and interaction with objects. Furthermore, some neurons were selectively activated by composite parameters combining the animal’s location in the arena with its speed.
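A minimal sketch of a mutual-information screen of this kind, assuming binarized neuronal activity (e.g., calcium events) and a single behavioral variable; the binning, the circular-shift shuffle, and the significance threshold are our assumptions rather than the authors’ exact procedure.

```python
# Sketch of a mutual-information screen for neuronal specialization.
import numpy as np
from sklearn.metrics import mutual_info_score

def mi_specialization(events, behavior, n_bins=8, n_shuffles=1000, alpha=0.01):
    """events: binary activity train (n_frames,), e.g. calcium events;
    behavior: continuous variable (n_frames,), e.g. speed or X coordinate."""
    # Discretize the behavioral variable into equally populated bins
    edges = np.quantile(behavior, np.linspace(0, 1, n_bins + 1)[1:-1])
    beh_binned = np.digitize(behavior, edges)
    mi = mutual_info_score(events, beh_binned)
    # Null distribution: circular shifts preserve event autocorrelation
    rng = np.random.default_rng(0)
    null = np.array([
        mutual_info_score(np.roll(events, rng.integers(1, len(events))),
                          beh_binned)
        for _ in range(n_shuffles)
    ])
    p = (np.sum(null >= mi) + 1) / (n_shuffles + 1)
    return mi, p < alpha  # specialized if MI beats the shuffled null

# Synthetic usage example: one speed-modulated neuron
rng = np.random.default_rng(1)
speed = np.abs(rng.normal(5.0, 3.0, 6000))
events = (rng.random(6000) < 0.02 + 0.01 * (speed > 8)).astype(int)
print(mi_specialization(events, speed))
```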

ADDITIONAL INFORMATION

Authors’ contribution. All authors made a substantial contribution to the conception of the work, acquisition, analysis, interpretation of data for the work, drafting and revising the work, final approval of the version to be published and agree to be accountable for all aspects of the work.

Funding sources. This work was supported by the Interdisciplinary Scientific and Educational School of Moscow University “Brain, Cognitive Systems, Artificial Intelligence” and by the Nonprofit Foundation for the Development of Science and Education “Intellect”.

Competing interests. The authors declare that they have no competing interests.

About the authors

V. V. Plusnin

Lomonosov Moscow State University

Author for correspondence.
Email: witkax@mail.ru
Russian Federation, Moscow

N. A. Pospelov

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

V. P. Sotskov

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

N. V. Dokukin

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

O. S. Rogozhnikova

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

K. A. Toropova

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

O. I. Ivashkina

Lomonosov Moscow State University

Email: witkax@mail.ru
Russian Federation, Moscow

K. V. Anokhin

Lomonosov Moscow State University; Research Institute of Normal Physiology named after P.K. Anokhin

Email: witkax@mail.ru
Russian Federation, Moscow; Moscow

References

  1. Weinreb C, Osman MAM, Zhang L, et al. Keypoint-MoSeq: parsing behavior by linking point tracking to pose dynamics. bioRxiv. 2023;2023.03.16.532307. doi: 10.1101/2023.03.16.532307
  2. Hsu AI, Yttri EA. B-SOiD, an open-source unsupervised algorithm for identification and fast prediction of behaviors. Nat Commun. 2021;12(1):5188. doi: 10.1038/s41467-021-25420-x
  3. Mathis A, Mamidanna P, Cury KM, et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat Neurosci. 2018;21(9):1281–1289. doi: 10.1038/s41593-018-0209-y


Copyright (c) 2023 Eco-Vector

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
