
Virtual reality for freely moving animals

Abstract

Standard animal behavior paradigms incompletely mimic nature and thus limit our understanding of behavior and brain function. Virtual reality (VR) can help, but implementing it poses challenges. Typical VR systems require movement restrictions that disrupt sensorimotor experience, causing neuronal and behavioral alterations. We report the development of FreemoVR, a VR system for freely moving animals. We validate immersive VR for mice, flies, and zebrafish. FreemoVR allows instant, disruption-free environmental reconfigurations and interactions between real organisms and computer-controlled agents. Using the FreemoVR platform, we established a height-aversion assay in mice and studied visuomotor effects in Drosophila and zebrafish. Furthermore, by photorealistically mimicking zebrafish we discovered that effective social influence depends on a prospective leader balancing its internally preferred directional choice with social interaction. FreemoVR technology facilitates detailed investigations into neural function and behavior through the precise manipulation of sensorimotor feedback loops in unrestrained animals.


Figure 1: FreemoVR virtual reality system for visual simulation.
Figure 2: Innate anxiety behavior to real and virtual elevated heights in mice.
Figure 3: Effect of head movement on Drosophila flight.
Figure 4: Specific visuomotor deficit in mitf-a−/− zebrafish.
Figure 5: Teleportation, swarms, and social feedback in virtual reality.


References

  1. Aghajan, Z.M. et al. Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat. Neurosci. 18, 121–128 (2015).

  2. Chiappe, M.E., Seelig, J.D., Reiser, M.B. & Jayaraman, V. Walking modulates speed sensitivity in Drosophila motion vision. Curr. Biol. 20, 1470–1475 (2010).

  3. von Holst, E. & Mittelstaedt, H. Das Reafferenzprinzip - Wechselwirkungen zwischen Zentralnervensystem und Peripherie. Naturwissenschaften 37, 464–476 (1950).

  4. Jung, S.N., Borst, A. & Haag, J. Flight activity alters velocity tuning of fly motion-sensitive neurons. J. Neurosci. 31, 9231–9237 (2011).

  5. Kim, A.J., Fitzgerald, J.K. & Maimon, G. Cellular evidence for efference copy in Drosophila visuomotor processing. Nat. Neurosci. 18, 1247–1255 (2015).

  6. Leinweber, M. et al. Two-photon calcium imaging in mice navigating a virtual reality environment. J. Vis. Exp. 20, e50885 (2014).

  7. Sperry, R.W. Neural basis of the spontaneous optokinetic response produced by visual inversion. J. Comp. Physiol. Psychol. 43, 482–489 (1950).

  8. Ravassard, P. et al. Multisensory control of hippocampal spatiotemporal selectivity. Science 340, 1342–1346 (2013).

  9. Acharya, L., Aghajan, Z.M., Vuong, C., Moore, J.J. & Mehta, M.R. Causal influence of visual cues on hippocampal directional selectivity. Cell 164, 197–207 (2016).

  10. Harvey, C.D., Collman, F., Dombeck, D.A. & Tank, D.W. Intracellular dynamics of hippocampal place cells during virtual navigation. Nature 461, 941–946 (2009).

  11. Schmidt-Hieber, C. & Häusser, M. Cellular mechanisms of spatial navigation in the medial entorhinal cortex. Nat. Neurosci. 16, 325–331 (2013).

  12. Aronov, D. & Tank, D.W. Engagement of neural circuits underlying 2D spatial navigation in a rodent virtual reality system. Neuron 84, 442–456 (2014).

  13. Sofroniew, N.J., Cohen, J.D., Lee, A.K. & Svoboda, K. Natural whisker-guided behavior by head-fixed mice in tactile virtual reality. J. Neurosci. 34, 9537–9550 (2014).

  14. Hölscher, C., Schnee, A., Dahmen, H., Setia, L. & Mallot, H.A. Rats are able to navigate in virtual environments. J. Exp. Biol. 208, 561–569 (2005).

  15. Dombeck, D.A., Harvey, C.D., Tian, L., Looger, L.L. & Tank, D.W. Functional imaging of hippocampal place cells at cellular resolution during virtual navigation. Nat. Neurosci. 13, 1433–1440 (2010).

  16. Maimon, G., Straw, A.D. & Dickinson, M.H. Active flight increases the gain of visual motion processing in Drosophila. Nat. Neurosci. 13, 393–399 (2010).

  17. Cushman, J.D. et al. Multisensory control of multimodal behavior: do the legs know what the tongue is doing? PLoS One 8, e80465 (2013).

  18. Straw, A.D., Branson, K., Neumann, T.R. & Dickinson, M.H. Multi-camera real-time three-dimensional tracking of multiple flying animals. J. R. Soc. Interface 8, 395–409 (2011).

  19. Fry, S.N., Rohrseitz, N., Straw, A.D. & Dickinson, M.H. Visual control of flight speed in Drosophila melanogaster. J. Exp. Biol. 212, 1120–1130 (2009).

  20. Schuster, S., Strauss, R. & Götz, K.G. Virtual-reality techniques resolve the visual cues used by fruit flies to evaluate object distances. Curr. Biol. 12, 1591–1594 (2002).

  21. Straw, A.D., Lee, S. & Dickinson, M.H. Visual control of altitude in flying Drosophila. Curr. Biol. 20, 1550–1556 (2010).

  22. Stowers, J.R. et al. Reverse engineering animal vision with virtual reality and genetics. Computer 47, 38–45 (2014).

  23. Del Grosso, N., Graboski, J., Chen, W., Blanco-Hernández, E. & Sirota, A. Virtual reality system for freely-moving rodents. Preprint at http://www.biorxiv.org/content/early/2017/07/10/161232 (2017).

  24. Ellard, C.G., Goodale, M.A. & Timney, B. Distance estimation in the Mongolian gerbil: the role of dynamic depth cues. Behav. Brain Res. 14, 29–39 (1984).

  25. Poggio, T. & Reichardt, W. A theory of the pattern induced flight orientation of the fly Musca domestica. Kybernetik 12, 185–203 (1973).

  26. Duistermars, B.J., Care, R.A. & Frye, M.A. Binocular interactions underlying the classic optomotor responses of flying flies. Front. Behav. Neurosci. 6, 6 (2012).

  27. Reiser, M.B. & Dickinson, M.H. Visual motion speed determines a behavioral switch from forward flight to expansion avoidance in Drosophila. J. Exp. Biol. 216, 719–732 (2013).

  28. Kress, D. & Egelhaaf, M. Head and body stabilization in blowflies walking on differently structured substrates. J. Exp. Biol. 215, 1523–1532 (2012).

  29. Schilstra, C. & Hateren, J.H. Blowfly flight and optic flow. I. Thorax kinematics and flight dynamics. J. Exp. Biol. 202, 1481–1490 (1999).

  30. Lister, J.A., Robertson, C.P., Lepage, T., Johnson, S.L. & Raible, D.W. nacre encodes a zebrafish microphthalmia-related protein that regulates neural-crest-derived pigment cell fate. Development 126, 3757–3767 (1999).

  31. Ahrens, M.B. et al. Brain-wide neuronal dynamics during motor adaptation in zebrafish. Nature 485, 471–477 (2012).

  32. O'Malley, D.M. et al. Optical physiology and locomotor behaviors of wild-type and nacre zebrafish. Methods Cell Biol. 76, 261–284 (2004).

  33. Antinucci, P. & Hindges, R. A crystal-clear zebrafish for in vivo imaging. Sci. Rep. 6, 29490 (2016).

  34. Lange, M. et al. Inter-individual and inter-strain variations in zebrafish locomotor ontogeny. PLoS One 8, e70172 (2013).

  35. Liu, Y. et al. Statistical analysis of zebrafish locomotor response. PLoS One 10, e0139521 (2015).

  36. Barker, A.J. & Baier, H. Sensorimotor decision making in the zebrafish tectum. Curr. Biol. 25, 2804–2814 (2015).

  37. Thibos, L.N., Still, D.L. & Bradley, A. Characterization of spatial aliasing and contrast sensitivity in peripheral vision. Vision Res. 36, 249–258 (1996).

  38. Reynolds, C.W. Flocks, herds and schools: a distributed behavioral model. Computer Graphics 21, 25–34 (1987).

  39. Couzin, I.D., Krause, J., Franks, N.R. & Levin, S.A. Effective leadership and decision-making in animal groups on the move. Nature 433, 513–516 (2005).

  40. Couzin, I.D. et al. Uninformed individuals promote democratic consensus in animal groups. Science 334, 1578–1580 (2011).

  41. Ioannou, C.C., Guttal, V. & Couzin, I.D. Predatory fish select for coordinated collective motion in virtual prey. Science 337, 1212–1215 (2012).

  42. Naumann, E.A., Kampff, A.R., Prober, D.A., Schier, A.F. & Engert, F. Monitoring neural activity with bioluminescence during natural behavior. Nat. Neurosci. 13, 513–520 (2010).

  43. Randlett, O. et al. Whole-brain activity mapping onto a zebrafish brain atlas. Nat. Methods 12, 1039–1046 (2015).

  44. Szuts, T.A. et al. A wireless multi-channel neural amplifier for freely moving animals. Nat. Neurosci. 14, 263–269 (2011).

  45. Ziv, Y. et al. Long-term dynamics of CA1 hippocampal place codes. Nat. Neurosci. 16, 264–266 (2013).

  46. Bastien, R. et al. KymoRod: a method for automated kinematic analysis of rod-shaped plant organs. Plant J. 88, 468–475 (2016).

  47. D'Azzo, J.J. Linear Control System Analysis and Design: Conventional and Modern (McGraw-Hill, 1995).

  48. Fenk, L.M., Poehlmann, A. & Straw, A.D. Asymmetric processing of visual motion for simultaneous object and background responses. Curr. Biol. 24, 2913–2919 (2014).

  49. Svoboda, T., Martinec, D. & Pajdla, T. A convenient multicamera self-calibration for virtual environments. PRESENCE Teleoperators Virtual Environ. 14, 407–422 (2005).

Acknowledgements

We thank M. Colombini, A. Fuhrmann, L. Fenk, E. Campione, S. Villalba, and the IMP/IMBA Workshop for help constructing FreemoVR hardware and software. We thank M. Dickinson and T. Klausberger for helpful discussions, V. Böhm for help with experiments, and the MFPL fish facility for fish care. The manual mouse behavior annotation was performed by the Preclinical Phenotyping Facility at Vienna Biocenter Core Facilities. This work was supported by European Research Council (ERC) starting grants 281884 to A.D.S., 311701 to W.H., and 337011 to K.T.-R.; Wiener Wissenschafts-, Forschungs- und Technologiefonds (WWTF) grant CS2011-029 to A.D.S.; FWF (http://www.fwf.ac.at/) research project grants P28970 to K.T.-R. and P29077 to K.N.; NSF grants PHY-0848755, IOS-1355061, and EAGER-IOS-1251585 to I.D.C.; ONR grants N00014-09-1-1074 and N00014-14-1-0635 to I.D.C.; and ARO grants W911NG-11-1-0385 and W911NF-14-1-0431 to I.D.C. A.D.S. and W.H. were further supported by the IMP, Boehringer Ingelheim, and the Austrian Research Promotion Agency (FFG). K.T.-R. is supported by grants from the University of Vienna (research platform “Rhythms of Life”). I.D.C. acknowledges further support from the “Struktur- und Innovationsfonds für die Forschung (SI-BW)” of the State of Baden-Württemberg and from the Max Planck Society. I.D.C. and R.B. gratefully acknowledge fish care and technical support from C. Bauer, J. Weglarski, A. Bruttel, and G. Mazué.

Author information

Contributions

A.D.S., K.T.-R., W.H., and I.D.C. conceived the projects. J.R.S., M.H., R.M.F., R.B., and A.D.S. developed the hardware and software and built the apparatus. J.R.S., M.H., R.B., J.G., P.H., S.F., and A.D.S. performed experiments. J.R.S., M.H., R.B., J.G., S.F., W.H., I.D.C., K.T.-R. and A.D.S. performed data analyses. A.D.S., K.T.-R., I.D.C., J.R.S., M.H., and J.G. wrote the manuscript. A.D.S., K.T.-R., W.H., I.D.C., and K.N. funded the work. J.R.S. and M.H. contributed equally to this work. J.G., P.H., and S.F. contributed equally to this work.

Corresponding authors

Correspondence to Kristin Tessmar-Raible or Andrew D Straw.

Ethics declarations

Competing interests

J.R.S. and M.H. are executives with loopbio GmbH, a company offering virtual reality services. The other authors declare no competing financial interests.

Integrated supplementary information

Supplementary Figure 1 VR Assays for Flies, Fish, and Mice

(a) The ‘Flycave’ assay, a 1 m diameter, 1 m high cylindrical VR arena. Three projectors create a panoramic VR, each projecting directly onto the surface of the cylinder, and simultaneously via 6 mirrors placed in 3 pairs around the setup. Flies are tracked in 3D from above using custom software and multiple cameras. (b) The FishVR fishbowl assay, in which fish swim in 9 cm deep water in a 32 cm diameter hemispherical bowl. A panoramic VR is created by projecting onto the surface of the bowl, via a mirror, from below. Fish position is tracked in 3D in real time using custom software and multiple cameras. (c) The MouseVR mouse floor assay. A 0.5 m diameter elevated circular platform is placed on top of a 1.9 m consumer television. Mouse head position is tracked in real time using custom software and a single camera placed above.

Supplementary Figure 2 FreemoVR software and system architecture

(a) The FreemoVR rendering pipeline. From left to right: a virtual world is first rendered as a cube map based on the observer position. Using a model of the display geometry, a geometry texture is rendered. Based on a calibration of the display elements, which maps their pixels to geometry coordinates, a VR is created. (b) The FreemoVR software allows distributed operation across multiple computers – such as in situations where multiple PCs are required for real-time tracking, or for generating or displaying the stimulus on sufficient displays to suit the experiment. (c) Schematic describing the information flow between the system components in a FreemoVR assay for a freely moving observer. Dark grey indicates computation that occurs in real time; light grey represents a calibration step specific to the particular display geometry.
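
The essence of the final warping step — rendering a perspective-correct image for an observer whose position changes every frame — can be sketched in a few lines. This is a minimal NumPy illustration, not the FreemoVR implementation: it assumes a per-pixel display calibration (here `pixel_xyz`) is already available and replaces the cube-map lookup with a toy `world_color` function; all names are hypothetical.

```python
import numpy as np

def render_for_observer(pixel_xyz, observer_xyz, world_color):
    """Perspective-correct image for one projector, given its display calibration.

    pixel_xyz    -- (H, W, 3) array: the 3D point on the physical display
                    surface that each projector pixel illuminates (calibration)
    observer_xyz -- (3,) tracked position of the animal
    world_color  -- function mapping (N, 3) unit view directions to (N, 3) RGB;
                    stands in for the cube-map lookup used in the pipeline
    """
    h, w, _ = pixel_xyz.shape
    # View direction from the observer through each physical pixel.
    d = pixel_xyz.reshape(-1, 3) - np.asarray(observer_xyz)[None, :]
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    return world_color(d).reshape(h, w, 3)

def toy_world(directions):
    """Toy virtual world: a dark 'post' in the +x direction on a grey background."""
    rgb = np.full((directions.shape[0], 3), 0.8)
    rgb[directions[:, 0] > 0.98] = 0.1   # narrow cone around +x
    return rgb

# Each frame: update the tracked position and re-render for every projector.
calib = np.random.rand(768, 1024, 3)                       # placeholder calibration
frame = render_for_observer(calib, np.array([0.0, 0.0, 0.1]), toy_world)
```

In the real system the world is rendered as a cube map on the GPU and the warping is done per display element; the sketch only shows why a new observer position requires only a cheap re-lookup rather than a new calibration.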

Supplementary Figure 3 Post avoidance behavior as a function of latency


Supplementary Figure 4 Drosophila post avoidance behavior trajectories

(a) All trials for the post-avoidance and latency-measurement analysis. In trials with a real post (RW), N = 40 flies, n = 240 trials. In all VR trials, N = 80 flies, n = 1890 trials; the total added latency is indicated in the text.

Supplementary Figure 5 MouseVR arena and analysis

(a) The elements of the mouse VR experiments were located in the laboratory as indicated. The television lies in the corner of the room surrounded by black walls on two sides, and a black curtain on the other. Mice are introduced from the front side and are shielded from the rest of the lab by a black cardboard divider. (b) Mouse preference (cumulative) for the shallow side. (c) Cumulative preference of mice for the wall side of the laboratory. (d) Cumulative distance walked by the mice for all trials. Mean ± s.d. for all plots. N=15 mice for RW trials, N=16 for VR, N=16 for ST.

Supplementary Figure 6 MouseVR head dip analysis

(a) Occurrence of head dips during the RW, ST and VR trials. Black bars indicate the time the mouse head was scored to be below the platform; the start of a bar is head-down, the end of a bar is head-up. (b) Distribution of the duration of all head dips for all scored trials. (c) Summary plots per mouse quantifying the number of head dips. Circles indicate each scored mouse. Mann-Whitney test: RW vs. VR, p = 0.15; RW vs. ST, p = 0.59; ST vs. VR, p = 0.43. Nine mice in each group. (d) Summary plot per mouse quantifying the total head dip duration. Circles indicate each scored mouse. Mann-Whitney test: RW vs. VR, p = 0.53; RW vs. ST, p = 0.29; ST vs. VR, p = 0.89. Nine mice in each group. (b,c,d) Box plots indicate median, upper- and lower-quartile. Whiskers extend to 1.5 IQRs of the lower and upper quartile; observations outside this range are indicated with diamonds. (e) Spatial location of each head dip. Arrows indicate dip location; downward arrows indicate the head-down phase, upward arrows indicate the head-up phase.
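
For illustration, a pairwise comparison of this kind could be run as follows; the per-mouse head-dip counts below are invented placeholder numbers, not data from the study.

```python
from itertools import combinations
from scipy.stats import mannwhitneyu

# Hypothetical per-mouse head-dip counts (9 scored mice per group, as above).
dips = {
    'RW': [3, 5, 2, 4, 6, 1, 3, 4, 2],
    'ST': [2, 4, 3, 5, 1, 2, 4, 3, 3],
    'VR': [4, 2, 5, 3, 2, 6, 1, 4, 3],
}

for a, b in combinations(dips, 2):
    u, p = mannwhitneyu(dips[a], dips[b], alternative='two-sided')
    print(f'{a} vs {b}: U = {u:.1f}, p = {p:.3f}')
```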

Supplementary Figure 7 Eliciting path following of freely flying Drosophila

(a) Fly trajectories for the path-following task with different gain values. Path following is elicited in a gain-dependent manner. The gain (text in the top-left of each panel) describes how strongly the virtual world is biased to bring the fly location closer to a target location on the path. As the fly position approaches the target, the target is advanced around the path. (b) Distance from the current fly position to the current target position on the path. The accuracy of the path-following task can be quantified by the distance from the fly to the target point on the infinity path. When the distance between fly and target is less than 0.1 m, the target is advanced to the next point on the path, yielding a theoretical lower bound of 0.1 m between the fly and the target in the case of perfect path following. N=50 flies.
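
The closed-loop logic just described can be sketched roughly as below. This is an illustrative proportional controller with hypothetical names (`PATH`, `control_step`), not the published implementation; in the fly experiments the bias was delivered by rotating the virtual world to exploit the optomotor response (see the corresponding video description), which this generic form only approximates.

```python
import numpy as np

# Infinity-symbol (lemniscate) waypoints in the horizontal plane, in metres.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
PATH = 0.3 * np.column_stack([np.sin(t), np.sin(t) * np.cos(t)])

SWITCH_DIST = 0.1   # advance the target when the animal gets this close (m)

def control_step(animal_xy, target_idx, gain):
    """One closed-loop update; returns (stimulus_velocity, next_target_idx)."""
    error = PATH[target_idx] - animal_xy
    if np.linalg.norm(error) < SWITCH_DIST:
        target_idx = (target_idx + 1) % len(PATH)   # advance around the path
        error = PATH[target_idx] - animal_xy
    # Proportional 'bias': move the virtual world such that following its
    # motion carries the animal towards the target; gain sets the strength.
    stimulus_velocity = gain * error
    return stimulus_velocity, target_idx
```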

Supplementary Figure 8 Flight performance for glued fly experiments

(a) Pearson correlation between the angular velocity of the stimulus and the angular velocity of the fly for free-flight path-following experiments under different glue preparations: no glue (N=36 flies), head-free glue (N=39), and head-fixed glue (N=78). Mean ± 68% c.i. plotted. (b) Distribution of vertical speed for free-flight trials. (c) Distribution of altitude for free-flight trials. (d) Pearson correlation between the angular velocity of the stimulus and the simulated angular velocity of the fly under head-free (N=6) and head-fixed (N=6) tethered preparations. (e) Distribution of simulated horizontal speed in tethered trials. (f,g) Distribution of simulated angular velocity for tethered head-free and head-fixed preparations for different gain couplings between wingbeat amplitude and simulated turning torque.

Supplementary Figure 9 Fish trajectories for path-following experiments

(a-c) All experimental data for AB and mitf-a-/- path-following experiments in the (a) large-dots, (b) small-dots, and (c) grey conditions. All conditions were tested for all fish. N=56 AB fish, N=62 mitf-a-/-.

Supplementary Figure 10 Swimming behavior of AB and mitf-a-/- during path experiments

(a-c) Large-dot condition statistics. (d-f) Small-dot condition statistics. (g-i) Grey condition statistics. Distribution of fish forward velocity in all trials (a,d,g). Distribution of control action – the magnitude of the dot velocity shown to the fish in order to elicit path following – in all trials (b,e,h). Distribution of distance to target – the distance between the target point on the path and the fish position – in all trials (c,f,i). All conditions were tested for all fish. N=56 AB fish, N=62 mitf-a-/-.

Supplementary Figure 11 Teleportation and changing environments in virtual reality

(a) Zebrafish could visit a checkerboard scene or (b) a plant scene by entering a virtual teleportation portal. See also Supplementary Video 12. (c) Each of two portals was coupled to a constant destination per fish, but different fish had different couplings. Upon entering a portal, the portals were rearranged to equalize the distance required for subsequent portal entry. Additionally, for the portal coupled to the other scene, the fish was virtually teleported to that new scene. (d) Decisions, operationally defined as portal entry (vertical marks), and current scene (horizontal line) over time for each fish. (e) Top view of occupancy. (f) Fraction of all decisions per fish that teleported it to the plant scene (one-sample t-test difference from 0.5, p=0.81). (g) Fraction of all portal entries per fish that were magenta (one-sample t-test difference from 0.5, p=2.7e-7). (h) Fraction of 30 minutes per fish in which the fish was in the plant scene (one-sample t-test difference from 0.5, p=0.047). (i) Mean horizontal speed per fish in each coupling condition (two-related-sample t-test, p=0.0411). (d-i) N=12 fish, AB strain. Box plots indicate median, upper- and lower-quartile. Whiskers extend to 1.5 IQRs of the lower and upper quartile.
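
The portal/scene coupling logic described in (c) might be sketched as follows; the class, portal names, entry radius, and rearrangement placeholder are hypothetical, and only the coupling behavior follows the legend.

```python
import numpy as np

PORTAL_RADIUS = 0.03   # hypothetical entry radius (m)

class TeleportAssay:
    """Minimal sketch of the two-portal, two-scene coupling logic."""

    def __init__(self, coupling, portal_xy):
        # coupling: portal name -> destination scene, fixed for a given fish,
        # e.g. {'bw_portal': 'checkerboard', 'magenta_portal': 'plants'}
        self.coupling = coupling
        self.portal_xy = dict(portal_xy)   # portal name -> 2D position (m)
        self.scene = 'checkerboard'

    def update(self, fish_xy):
        """Call once per frame with the tracked fish position."""
        for name, xy in self.portal_xy.items():
            if np.linalg.norm(fish_xy - xy) < PORTAL_RADIUS:
                destination = self.coupling[name]
                if destination != self.scene:
                    self.scene = destination        # virtual teleport
                self._rearrange_portals()
                return name                          # a 'decision' occurred
        return None

    def _rearrange_portals(self):
        # Placeholder for the rearrangement that equalizes the distance the
        # fish must travel before its next portal entry.
        pass
```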

Supplementary Figure 12 Animation of the tail beat of the zebrafish larvae

To obtain kinematic information during swimming, 3 zebrafish larvae of the control group were recorded in a square arena (20 cm width, 1 cm water depth) and filmed with a Basler 2040-um camera at 180 frames per second for a duration of 5 minutes. The curvature, C(s), of the animal is computed along the curvilinear abscissa of the median line, s, from head to tail of the fish. (a) Curvature along the median line of an individual as a function of time. Three different tail beats are shown for a single animal. The movement is highly stereotyped and can be described by an inflexible part of the body (~0.0002 m from the head) and an oscillation of the body that propagates from the end of the inflexible part to the tail (with temporal frequency f ≈ 20 Hz and a velocity of propagation c ≈ 0.2 m s-1). 0.1 s after the beginning of the tail beat, no movement is observed. (b) Movement of the recorded fish (18 frames at 180 fps). (c) Animation of the median line of the fish. (d) Animation of the virtual fish as generated. The rendering of FreemoVR is done at 120 fps, so only 12 frames are represented.
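
Assuming the travelling-wave description above (stiff front section, f ≈ 20 Hz, c ≈ 0.2 m s-1), a tail-beat animation could be generated roughly as follows; the body length and curvature amplitude are illustrative values, not measurements from the study.

```python
import numpy as np

BODY_LEN = 0.004    # larval body length (m); illustrative value
S0       = 0.0002   # inflexible front portion of the body (m), from the legend
FREQ     = 20.0     # tail-beat frequency f (Hz)
C_WAVE   = 0.2      # propagation speed of the bend c (m/s)
AMP      = 800.0    # curvature amplitude (1/m); illustrative value

def midline(t, n=50):
    """Body midline (x, y) at time t, built from a travelling curvature wave C(s, t)."""
    s = np.linspace(0.0, BODY_LEN, n)
    ds = s[1] - s[0]
    # Zero curvature over the stiff head section, a travelling wave behind it,
    # with an envelope that grows linearly towards the tail.
    envelope = np.clip((s - S0) / (BODY_LEN - S0), 0.0, None)
    curvature = AMP * envelope * np.sin(2 * np.pi * FREQ * (t - (s - S0) / C_WAVE))
    curvature[s < S0] = 0.0
    # Integrate curvature -> heading angle -> midline positions.
    theta = np.cumsum(curvature) * ds
    return np.cumsum(np.cos(theta)) * ds, np.cumsum(np.sin(theta)) * ds

# A full 20 Hz beat rendered at 120 fps spans 6 frames; two beats give 12 frames.
frames = [midline(i / 120.0) for i in range(12)]
```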

Supplementary Figure 13 Burst and glide movement of the zebrafish larvae in absence of VR stimuli

(a) The velocity, Vr, in the direction of movement is displayed for a single animal as a function of time. Burst-and-glide events are clearly visible, with a phase of high acceleration followed by a decay in speed resulting from drag. We first identified the positions of the local maxima of the velocity (in red). (b) For each individual of the control group we plotted the distribution of the time between two successive burst-and-glide events, tbeat. (c) The median value for all individuals is tbeat ≈ 0.42 ± 0.16 s. We selected tbeat = 0.5 s for our animation to allow sufficient frames for effective animation. (d) For each individual of the control group we plotted the distribution of the characteristic decay time of the velocity, tc. This time was computed by fitting an exponential function during the 0.3 s after the detection of each velocity peak. (e) The median value for all individuals is tc ≈ 0.3 ± 0.2 s. (f) For each individual of the control group we plotted the distribution of the maximal value of the velocity for each burst-and-glide event, V0 (in red in a). (g) The median value for all individuals is V0 ≈ 0.9 ± 0.3 m s-1. For simplicity, and because it is extremely brief, the initial acceleration phase of the burst-and-glide movement is not modeled by our animation. Furthermore, since the time between two successive beats is on the higher end of the range, the distance travelled for each event should be longer. The value used for the animation was therefore taken slightly higher to account for these discrepancies, V0 ≈ 1.4 m s-1.
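
Using the animation parameters stated above (tbeat = 0.5 s, tc ≈ 0.3 s, V0 ≈ 1.4 m s-1), the burst-and-glide speed profile of the virtual fish can be sketched as below; the function name and 5 s duration are illustrative.

```python
import numpy as np

T_BEAT = 0.5    # time between bursts used for the animation (s)
T_C    = 0.3    # exponential decay constant of the glide (s)
V0     = 1.4    # peak speed used for the animation (m/s)
FPS    = 120    # FreemoVR rendering rate (frames per second)

def burst_and_glide_speed(duration):
    """Speed trace of the virtual fish: instantaneous burst, exponential glide."""
    t = np.arange(0.0, duration, 1.0 / FPS)
    time_since_burst = t % T_BEAT           # a burst starts every T_BEAT seconds
    return V0 * np.exp(-time_since_burst / T_C)

speed = burst_and_glide_speed(5.0)          # 5 s of animation
step_lengths = speed / FPS                  # displacement per rendered frame (m)
```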

Supplementary Figure 14 Individual fish data from social feedback experiment

Histogram of each individual real fish’s distance from the periphery of the arena, r, as a function of the strength of the goal-oriented tendency, ω, of the virtual fish. The virtual fish’s internal preferred trajectory was fixed at r = 0.07m (dotted white). These individual fish data were used (together with control experiments on 15 real fish with no virtual fish) to create panel Fig. 5k.
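
One way to picture how ω could trade off the two tendencies is the sketch below; the blending rule, arena radius, and gains are assumptions for illustration only and should not be read as the model actually used in the experiments.

```python
import numpy as np

ARENA_R  = 0.16             # hypothetical arena radius (m)
PREF_R   = ARENA_R - 0.07   # preferred path: 0.07 m from the arena periphery
K_RADIAL = 10.0             # hypothetical gain pulling back onto the path

def _unit(v):
    return v / (np.linalg.norm(v) + 1e-9)

def virtual_fish_direction(virtual_xy, real_xy, omega):
    """Desired heading of the virtual fish (positions relative to the arena centre).

    Blends attraction towards the real fish (social term) with swimming along
    the internally preferred circular path (goal-oriented term), weighted by
    omega; omega = 1 weights the two equally (Supplementary Video 15).
    """
    social = _unit(real_xy - virtual_xy)
    radial = _unit(virtual_xy)
    tangent = np.array([-radial[1], radial[0]])      # circle around the arena
    goal = _unit(tangent + K_RADIAL * (PREF_R - np.linalg.norm(virtual_xy)) * radial)
    return _unit(social + omega * goal)
```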

Supplementary Figure 15 Empirically derived measurements of group splitting and shared travel direction in social feedback experiment

(a) Empirically measured probability of the group (real and virtual zebrafish) splitting as defined by the distance between the two fish exceeding a threshold value of 0.05 m. (b) Empirically measured probability that the real fish is traveling on the virtual fish’s internal preferred trajectory, r = 0.07 ±0.01m. N=16 fish, same experiment as shown in Fig. 5i-l.
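
Both empirical probabilities can be computed directly from the tracked trajectories as fractions of frames meeting each criterion; the function and argument names below are hypothetical.

```python
import numpy as np

SPLIT_DIST = 0.05           # splitting threshold between the two fish (m)
R_PREF, R_TOL = 0.07, 0.01  # preferred distance from the periphery ± tolerance (m)

def split_and_follow_probabilities(real_r, real_xy, virtual_xy):
    """Empirical probabilities over a trajectory of T frames.

    real_r     -- (T,) real fish distance from the arena periphery
    real_xy    -- (T, 2) real fish positions
    virtual_xy -- (T, 2) virtual fish positions
    """
    pair_distance = np.linalg.norm(real_xy - virtual_xy, axis=1)
    p_split = float(np.mean(pair_distance > SPLIT_DIST))
    p_on_path = float(np.mean(np.abs(real_r - R_PREF) <= R_TOL))
    return p_split, p_on_path
```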

Supplementary information

Supplementary Text and Figures

Supplementary Figures 1–15 and Supplementary Tables 1–4.

Life Sciences Reporting Summary

Supplementary Software

Zip file containing software source code and documentation.

Supplementary Data 1

Metadata associated with figures and videos

Demonstration of VR from the perspective of a freely moving observer

(left) Video taken from a camera (GoPro) showing the view from the perspective of a freely moving observer. (right) The colored L-shaped-box virtual world that FreemoVR is simulating, and the position of the camera in the virtual world (red dot). Once the camera enters the ‘FlyCave’ VR arena, its position is estimated by the tracking software (right, red dot) and the perspective-correct VR is projected onto the walls of the arena. As the camera moves, the projection is updated in real time to maintain a perspective-correct display. Reproduced with permission from Stowers et al. 2014.

Demonstration of multiple-display perspective correct VR

Related to Video 1. (left) Video taken from above, looking into the ‘FlyCave’ arena, showing the 3D position of the camera (red dot) and the projection onto the arena walls as it moves in space. (right) The virtual world being simulated, and the estimated position of the camera in the virtual world (red dot). Reproduced with permission from Stowers et al. 2014.

Photo realistic and naturalistic VR for freely swimming fish

(left) Swimming behavior of a zebrafish, its position highlighted in red, as it navigates a virtual world. The fish swims in a hemispherical bowl filled with water. (right) The virtual world being simulated. The world consists of a cyan sphere and a magenta pyramid in a naturalistic environment. As the fish approaches the pyramid the rendering is updated to display a perspective correct view of the world.

Photo realistic and naturalistic VR for freely flying Drosophila

(left) A Drosophila flies inside the cylindrical ‘FlyCave’ VR arena. Its position is tracked and highlighted in red. (right) The simulated virtual world consists of a cyan sphere and a magenta pyramid in a naturalistic environment. As the fly explores the arena, the virtual world is updated in real time to maintain a perspective-correct display for the subject.

Simulation of a virtual post for freely flying Drosophila

(left) A flying Drosophila (position highlighted in red) interacts with a virtual vertical gray post. (right) The virtual world being simulated. On the arena walls a checkerboard texture is moved vertically to control the fly's altitude and to prevent it flying into the walls.

Interaction of a Drosophila with a real post

A flying Drosophila (position highlighted in red) interacts with a real post. On the arena walls a checkerboard texture is moved vertically to control the fly's altitude and to prevent it from flying into the walls.

Simulation of a virtual post for freely swimming zebrafish

(left) A juvenile zebrafish (position highlighted in red) interacts with a virtual post. (right) The virtual world being simulated contains a black upright post placed at the center of a sphere covered in a checkerboard pattern.

A virtual elevated maze paradigm for freely moving mice

An unrestrained mouse explores an elevated platform placed above a 75'' consumer television. FreemoVR simulates a virtual world consisting of two platforms placed virtually 20 cm and 40 cm below the physical platform. By tracking the mouse head position, a perspective-correct virtual reality can be displayed to the mouse, retaining naturalistic parallax cues and thus the percept of height.

Mouse head tracking

Illuminated and filmed from above, the software detects the position of the mouse head in real-time (indicated in green) and uses this to create a perspective correct virtual reality. The detected mouse contour and center are shown in magenta.
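
A generic version of such overhead tracking (background subtraction, largest contour, centroid) might look like the OpenCV sketch below; this is not the FreemoVR tracking software, and the threshold and kernel values are arbitrary.

```python
import cv2
import numpy as np

def track_mouse(frame_gray, background_gray):
    """Return (contour, centroid) of the mouse in one overhead camera frame.

    A background-subtraction sketch (OpenCV >= 4); background_gray is an image
    of the empty platform taken under the same illumination.
    """
    diff = cv2.absdiff(frame_gray, background_gray)
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    mouse = max(contours, key=cv2.contourArea)      # largest blob = the mouse
    m = cv2.moments(mouse)
    centroid = (m['m10'] / m['m00'], m['m01'] / m['m00'])
    return mouse, centroid
```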

Remote control flies – controlling the behavior of freely flying Drosophila by exploiting the optomotor response

(left) A Drosophila flies in the ‘FlyCave’ VR arena (position highlighted in red). As the fly flies, the virtual world is rotated about its center, eliciting the optomotor response in the subject and causing it to turn. Doing this continuously causes the fly to follow a path of our design, an infinity-symbol (∞) shaped path (right).

Zebrafish swims among a cloud of 3D dots

(left) A zebrafish swims (position highlighted in red) among a cloud of dots. (right) The simulated virtual world containing the 3D cloud of dots. The dots all move with the same velocity. The velocity of the dots is controlled to cause the fish to swim along an infinity-symbol (∞) shaped path. Dot size is 6.2°, double the size of the “large dot” stimulus in Fig. 4 to increase visibility in the video recording.

Zebrafish in 2AFC teleportation experiment

(left) Wide-angle camera footage of zebrafish swimming in 2AFC teleportation experiment. (right) The simulated virtual world containing either a checkerboard floor or virtual plants with a gravel floor. When a fish makes a decision, operationally defined as entering a teleportation portal (black and white or magenta shape), the fish is virtually teleported to the environment coupled to the portal. Depending on the particular experiment, the specific coupling between portal and destination varies, but remains fixed for each individual fish.

Zebrafish in 2AFC swarm teleportation experiment

(left) Wide-angle camera footage of zebrafish swimming in the 2AFC swarm experiment. (right) The simulated virtual world containing either a swarm of space invaders or a scene without a swarm. When a fish makes a decision, operationally defined as entering a teleportation portal (black-and-white or magenta shape), the fish is virtually teleported to the environment coupled to the portal. Depending on the particular experiment, the specific coupling between portal and destination varies, but it remains fixed for each individual fish.

Social feedback experiment with real and virtual fish

Camera footage of zebrafish swimming with a virtual fish. The virtual fish reacts to the position of the real fish. Here ω = 1, so the virtual fish equally balances social and goal-oriented behavior.

Cite this article

Stowers, J., Hofbauer, M., Bastien, R. et al. Virtual reality for freely moving animals. Nat Methods 14, 995–1002 (2017). https://doi.org/10.1038/nmeth.4399
