boidsxr.github.io

By maxf and danbri

Virtual reality (VR) is becoming more widespread, and it offers an opportunity to present earlier work, in fields such as data visualisation, gaming or education, in a way that shows it off at its best. Here we’ve added immersion and interactivity to boids, an 80s artificial life program.

Boids

Boids were introduced by Craig Reynolds in Flocks, Herds, and Schools: A Distributed Behavioral Model, a paper describing computer-generated simulations of flocks of birds or schools of fish. By programming a simplified model of the individual behaviour of an artificial bird (called a boid), Reynolds was able to produce realistic renditions of whole flocks composed of dozens of boids.
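For readers new to the model, here is a minimal sketch of the steering computation, using our own illustrative vector helpers and a plain {pos, vel} object per boid (this is not Reynolds’s original code):

```js
// Minimal sketch of the three boids rules: separation, alignment, cohesion.
// Illustrative only; names and weights are ours, not Reynolds's code.
const add   = (a, b) => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const sub   = (a, b) => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const scale = (v, s) => ({ x: v.x * s, y: v.y * s, z: v.z * s });

function steeringForce(boid, neighbours) {
  let separation = { x: 0, y: 0, z: 0 }; // steer away from close neighbours
  let avgVel     = { x: 0, y: 0, z: 0 }; // match neighbours' heading
  let centre     = { x: 0, y: 0, z: 0 }; // move toward the local centre

  for (const other of neighbours) {
    separation = add(separation, sub(boid.pos, other.pos));
    avgVel     = add(avgVel, other.vel);
    centre     = add(centre, other.pos);
  }

  const n = neighbours.length || 1;
  const alignment = sub(scale(avgVel, 1 / n), boid.vel);
  const cohesion  = sub(scale(centre, 1 / n), boid.pos);

  // The weights are arbitrary tuning constants.
  return add(add(scale(separation, 0.05), scale(alignment, 0.02)),
             scale(cohesion, 0.01));
}
```

In a full implementation, neighbours would be only the boids within each boid’s perception radius, which is what keeps the behaviour local and the flocking emergent.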

Screenshot of original boids demo

Because boids produced realistic animations of flocks from a very simple behaviour model, Reynolds’s paper is considered a milestone in the field of artificial life. It has also been influential in the development of other CGI techniques such as particle systems and crowd animation. Dozens of implementations and variations of the original code can be found online and attest to how innovative and seminal the original idea was.

VR

VR is an even older concept, which is slowly becoming accessible thanks to the recent release of dedicated devices, from Google’s Cardboard to self-contained headsets such as the Oculus Quest (with more expected). With this new hardware comes a new market of games and applications, similar to phone apps or console games.

As is often the case, open, web-based standards follow proprietary formats and APIs for developing applications. Here, the W3C’s WebXR is becoming the standard for developing open immersive web applications such as this one. WebXR is the new name for the more obvious WebVR, as it now also integrates the capability to create augmented reality applications.
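At its core, entering VR with WebXR comes down to a single API call. A minimal sketch using the standard API (frameworks such as A-Frame wrap all of this, including the render loop):

```js
// Minimal WebXR bootstrap. Must be triggered by a user gesture,
// such as clicking an "Enter VR" button.
async function enterVR() {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    console.warn('Immersive VR is not supported on this device');
    return;
  }
  const session = await navigator.xr.requestSession('immersive-vr');
  session.requestAnimationFrame(function onFrame(time, frame) {
    // Per-frame rendering goes here; a framework normally owns this loop.
    session.requestAnimationFrame(onFrame);
  });
}
```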

Combining boids and VR

It’s not difficult to imagine the improvements that an immersive experience can bring to the original boids program. While Reynolds’s original animations are still very impressive to watch, VR lets the viewer stand inside the flock and interact with it rather than just observe it.

New Boids demo

Next

Reynolds’s paper lists possible improvements to the original algorithm, for instance including the effect of gravity or creating a better model of each boid’s senses. Computing the movement of boids on the GPU (see below) gives us the possibility to extend the original algorithm into something more realistic without degrading performance. The paper also mentions adding external factors that control the movement of birds: for instance obstacles, or the presence of predators disrupting the flock.
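As a taste of how such extensions slot in, here is a hypothetical sketch that bolts gravity and predator avoidance onto the steeringForce function from the earlier sketch (all constants are arbitrary):

```js
// Hypothetical extension of the basic rules: gravity plus predator avoidance.
const GRAVITY = { x: 0, y: -9.8, z: 0 };

function extendedForce(boid, neighbours, predators) {
  let force = steeringForce(boid, neighbours); // the three classic rules
  force = add(force, scale(GRAVITY, 0.001));   // gentle downward pull

  for (const p of predators) {
    const away = sub(boid.pos, p.pos);
    const d2 = away.x ** 2 + away.y ** 2 + away.z ** 2;
    // Repulsion grows as the predator gets closer; 25 and 0.5 are arbitrary.
    if (d2 < 25) force = add(force, scale(away, 0.5 / (d2 + 1)));
  }
  return force;
}
```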

Many of those have been included in subsequent implementations, and we could add several of them to this demo, along with further improvements of our own.

Technical details

We started by taking the p5.js implementation of the original algorithm and integrated it into a VR environment using A-Frame:

A-Frame Boids demo
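The integration boils down to an A-Frame component whose tick handler advances the simulation every frame. A simplified sketch (the component and entity choices here are illustrative, not our exact source; steeringForce, add and scale are from the sketch above):

```js
// Illustrative A-Frame component: one entity per boid, updated each frame.
AFRAME.registerComponent('boid-flock', {
  schema: { count: { type: 'int', default: 50 } },

  init() {
    this.boids = [];
    for (let i = 0; i < this.data.count; i++) {
      const el = document.createElement('a-cone'); // a cone stands in for a bird
      el.setAttribute('radius-bottom', 0.05);
      el.setAttribute('height', 0.15);
      this.el.appendChild(el);
      this.boids.push({
        el,
        pos: { x: Math.random(), y: 1 + Math.random(), z: -2 + Math.random() },
        vel: { x: 0, y: 0, z: 0 },
      });
    }
  },

  // A-Frame calls tick() once per rendered frame with the frame delta in ms.
  tick(time, delta) {
    const dt = delta / 1000;
    for (const boid of this.boids) {
      // A real implementation would restrict neighbours to a radius.
      const force = steeringForce(boid, this.boids);
      boid.vel = add(boid.vel, scale(force, dt));
      boid.pos = add(boid.pos, scale(boid.vel, dt));
      boid.el.object3D.position.set(boid.pos.x, boid.pos.y, boid.pos.z);
    }
  },
});
```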

We then moved on to the WebGL implementation found in the three.js examples. This version computes the movement of boids on the GPU using GPGPU (general-purpose computing on GPUs) techniques. This allowed us to multiply the number of boids by five with no performance impact.
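The example is built on three.js’s GPUComputationRenderer helper, which stores one boid per texel in position and velocity textures and updates them in fragment shaders. Roughly (the two shader strings, the renderer and the bird material’s uniforms come from the surrounding example and are omitted here):

```js
import { GPUComputationRenderer } from 'three/examples/jsm/misc/GPUComputationRenderer.js';

// A SIZE x SIZE texture holds the whole flock: one texel per boid,
// so the entire simulation advances in one fragment-shader pass.
const SIZE = 32; // 32 * 32 = 1024 boids
const gpuCompute = new GPUComputationRenderer(SIZE, SIZE, renderer);

const posTexture = gpuCompute.createTexture(); // seed with initial positions
const velTexture = gpuCompute.createTexture(); // seed with initial velocities

const posVar = gpuCompute.addVariable('texturePosition', positionShader, posTexture);
const velVar = gpuCompute.addVariable('textureVelocity', velocityShader, velTexture);

// Each shader pass may read both textures from the previous frame.
gpuCompute.setVariableDependencies(posVar, [posVar, velVar]);
gpuCompute.setVariableDependencies(velVar, [posVar, velVar]);
gpuCompute.init();

// Per frame: run the simulation, then hand the result to the bird material.
function step() {
  gpuCompute.compute();
  birdUniforms.texturePosition.value =
    gpuCompute.getCurrentRenderTarget(posVar).texture;
}
```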

The user controls the flock through the VR controllers: each controller defines a point in VR space that the boids are attracted to, adding an extra influencing force on top of the three simulated by the original algorithm. If multiple controllers are used, each boid moves toward the nearest one.
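On the JS side this only requires copying the controllers’ world positions into shader uniforms each frame; the nearest-attractor test then runs per boid in the velocity shader. A sketch continuing from the GPGPU snippet above (the uAttractor uniform names are ours):

```js
import * as THREE from 'three';

// Expose the two controller positions to the velocity shader.
const velocityUniforms = velVar.material.uniforms;
velocityUniforms.uAttractor1 = { value: new THREE.Vector3() };
velocityUniforms.uAttractor2 = { value: new THREE.Vector3() };

function updateAttractors(leftController, rightController) {
  // getWorldPosition() writes the world-space position into its argument.
  leftController.object3D.getWorldPosition(velocityUniforms.uAttractor1.value);
  rightController.object3D.getWorldPosition(velocityUniforms.uAttractor2.value);
  // In the velocity shader, each boid compares its distance to both
  // attractors and accelerates toward the nearer one.
}
```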

A downside of using a GPGPU to compute the flocking algorithm is that the position and velocity of each boid are no longer available to the JS code. This makes it very difficult to create audio, since the volume and position of each sound source in VR space would have to be computed from the boid positions, which only exist on the GPU.
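One possible workaround, which we sketch here rather than implement, is to read a few texels back from the position render target with three.js’s readRenderTargetPixels and use just those boids to drive the sound sources (gpuCompute, posVar and renderer are from the GPGPU sketch above):

```js
// Sample the positions of the first few boids back from the GPU.
// Readback stalls the pipeline, so keep the sampled region tiny.
const SAMPLES = 4;
const buffer = new Float32Array(SAMPLES * 4); // RGBA float texels

function sampleBoidPositions() {
  const target = gpuCompute.getCurrentRenderTarget(posVar);
  // Read a one-row strip of SAMPLES texels from the position texture.
  renderer.readRenderTargetPixels(target, 0, 0, SAMPLES, 1, buffer);
  const positions = [];
  for (let i = 0; i < SAMPLES; i++) {
    positions.push({ x: buffer[i * 4], y: buffer[i * 4 + 1], z: buffer[i * 4 + 2] });
  }
  return positions;
}
```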