In the previous part of this series, Inside Facebook 3D Photos, we explained how modern mobile phones are able to infer depth from pictures. That information is stored in a depth map, which is used for a variety of effects. From blurring the background to three-dimensional reconstruction, this type of technology will become more and more present in our daily lives.
This is a two-part series.
In the past few months, Facebook has been flooded with 3D photos. If you have not had the chance to see one, 3D photos are images inside a post that gently change perspective as you scroll the page, or as you move your mouse over them.
A few months prior to their introduction, Facebook had been testing a similar feature with 3D models. While it is easy to understand how Facebook can render 3D models and rotate them according to the mouse position, the same might not be as intuitive for 3D photos.
The technique that Facebook uses to create the illusion of three-dimensionality on two-dimensional pictures is sometimes known as height map displacement, and it relies on an optical phenomenon called parallax.
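The core idea behind height map displacement can be sketched in a few lines: each pixel is shifted in the direction of the virtual camera movement, proportionally to its depth, so that nearby points slide more than distant ones. The following Python sketch illustrates the principle only; the function name and the `strength` parameter are illustrative, not Facebook's actual implementation.

```python
def parallax_uv(u, v, depth, offset_x, offset_y, strength=0.05):
    """Shift a texture coordinate according to its depth.

    Pixels closer to the viewer (here: higher depth value) move more
    than distant ones, which is what creates the parallax illusion.
    """
    return (u + offset_x * depth * strength,
            v + offset_y * depth * strength)

# A foreground pixel (depth 1.0) slides further than a background one (depth 0.1)
foreground = parallax_uv(0.5, 0.5, 1.0, 1.0, 0.0)
background = parallax_uv(0.5, 0.5, 0.1, 1.0, 0.0)
```

In an actual shader this offset is applied per fragment while sampling the photo, with the offset driven by the scroll position or the mouse.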
Most indie developers might know Lucas Pope as the developer of the critically acclaimed Papers, Please. Thanks to its simple, yet thoughtful mechanics, Papers, Please helped shape an entirely new genre of video games. It even inspired a short film of the same name.
Despite its success, one of the most recurring criticisms the game has faced is related to the apparent simplicity of its execution. With Return of the Obra Dinn, Lucas Pope clears any doubt with a game that, by itself, is nothing less than an achievement in technical excellence.
It is no secret that Fortnite has now become one of the most successful computer games of all time. While many see it as a case study for excellence in marketing and game design, the game itself features some very interesting shader effects.
From a Technical Artist's perspective, the most striking effect featured in Fortnite is the self-building effect. When an object is constructed, its individual pieces appear one by one out of thin air and fly into position. The same effect plays in reverse when an object is damaged, with those very pieces flying away and disappearing (above).
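One way to recreate a similar self-building effect is to drive every piece with a single global build progress value, staggering each piece's own animation window so they fly in one after another. The sketch below shows the idea in Python; the function names and parameters are illustrative, not Epic's actual implementation.

```python
def piece_progress(build_t, index, count, window=0.5):
    """Local animation progress in [0, 1] for one piece.

    Pieces start one after another: piece `index` begins when the
    global build progress `build_t` reaches its staggered start time,
    and takes `window` of the global timeline to fly into place.
    """
    start = (index / count) * (1.0 - window)  # staggered start time
    t = (build_t - start) / window            # remap to this piece's window
    return max(0.0, min(1.0, t))

def piece_position(build_t, index, count, rest_pos, fly_offset):
    """Interpolate one piece from an offset 'spawn' point to its rest position."""
    t = piece_progress(build_t, index, count)
    return tuple(r + (1.0 - t) * o for r, o in zip(rest_pos, fly_offset))
```

Playing `build_t` from 1 back to 0 gives the reverse, fly-apart animation seen when an object is damaged.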
If you have been on Twitter this past week, you might have seen videos of the new Spider-Man, developed by Insomniac Games. The game has been praised for its stunning visuals and exceptional attention to detail. One effect, in particular, has captured the players’ attention. It appears that you can see inside every single window of every building. But at a closer look, something does not look right. What’s going on?
If you have ever placed a strong light source behind your hand, you might have noticed how light is able to filter through the skin. Even more interesting is the fact that light “travels” inside the skin, and can sometimes make the entire hand glow. This optical phenomenon is called subsurface scattering (or SSS) and is caused by individual photons penetrating the skin, bouncing (scattering) inside it, and finally exiting from a different point. For this reason, subsurface scattering is also called subsurface light transport.
Most semi-transparent materials exhibit a certain degree of SSS, which gives them a “smoother” look. Milk, for instance, owes its uniform colouration to fat molecules, which diffuse and scatter visible light very well. Even solid materials can exhibit SSS. Marble is a typical example, which is why most subsurface scattering demos feature marble statues. Skin, milk, marble and wax are the materials that most commonly owe their look to SSS, although the phenomenon is present in virtually all non-metallic materials.
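Simulating subsurface light transport faithfully is expensive, so real-time renderers usually approximate it. A well-known trick (from Barré-Brisebois and Bouchard's "Approximating Translucency for a Fast, Cheap and Convincing Subsurface-Scattering Look") distorts the light direction by the surface normal to fake light bending through the material, then measures how much of that distorted light reaches the eye. A sketch of the formula in Python, with vectors as plain tuples and illustrative parameter values:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def translucency(light_dir, normal, view_dir, distortion=0.2, power=4.0, scale=1.0):
    """Fast translucency approximation (after Barré-Brisebois & Bouchard).

    light_dir: surface-to-light direction; view_dir: surface-to-eye direction.
    The light direction is bent by the normal to mimic scattering; the
    result is strongest when looking straight towards the distorted light.
    """
    h = normalize(tuple(l + n * distortion for l, n in zip(light_dir, normal)))
    v_dot_h = max(0.0, dot(view_dir, tuple(-c for c in h)))
    return (v_dot_h ** power) * scale
```

With the light directly behind a surface facing the viewer, the term peaks; with the light off to the side, it falls to zero, which matches the glow you see through a backlit hand.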
Some readers might have heard of a game called Duke Nukem 3D. Released in 1996, it was one of the first 3D games I had the chance to play. An interesting feature of that game is that most of its interactive elements (including the enemies) were not actually 3D: they were 2D sprites rendered on quads that always face the camera (below).
This technique is called billboarding, and early 3D games used it extensively. Even today it is still used for background details, such as the trees of a distant forest. Some modern assets rely on it too: Massive Vegetation, for instance, uses billboarding to render grass blades in a very realistic way.
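The maths behind billboarding is simple: the quad is given an orientation whose forward axis points at the camera, so the sprite is always seen face-on. A minimal sketch of spherical billboarding (ignoring roll) in Python; in a game this basis would be built per frame, often directly in the vertex shader:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def billboard_axes(sprite_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Build an orthonormal basis for a quad that faces the camera.

    The quad's forward axis points from the sprite to the camera;
    right and up are derived from it and the world up vector, so the
    sprite stays upright while turning to face the viewer.
    """
    forward = normalize(tuple(c - s for c, s in zip(camera_pos, sprite_pos)))
    right = normalize(cross(world_up, forward))
    up = cross(forward, right)
    return right, up, forward
```

The quad's corner vertices are then offset from `sprite_pos` along `right` and `up`, which is exactly why the sprite never shows its edge to the player.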
After two previous instalments of Shader Showcase Saturday focused on wind and rain, talking about snow was simply unavoidable.
Creating realistic snow is a serious challenge, which will be explored further in the following months. This week, we focus on how shaders can be used to add snow to an existing scene. Most of the references shown in this post will not be photorealistic; we will show how to simulate photorealistic snow and frost in a few weeks. If you cannot wait, I would strongly advise having a look at Winter Suite, which contains some of the most realistic shaders for snowy and frosty surfaces.
As you can see from the image above, it supports translucency, subsurface scattering and the shimmering effect that is typically seen in snow.
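A common way for a shader to add snow to an existing scene (a sketch of the general idea, not the technique Winter Suite actually uses) is to compare each surface normal with the world up direction: upward-facing surfaces accumulate snow, while walls and overhangs stay clean. In Python, with illustrative parameter names:

```python
def snow_amount(normal_y, coverage=0.5, sharpness=0.25):
    """How snowy a fragment is, from its world-space normal's y component.

    normal_y is 1 on a horizontal, upward-facing surface and -1 on a
    downward-facing one. `coverage` raises or lowers the snow line;
    `sharpness` softens the transition between snowy and clean areas.
    """
    t = (normal_y - (1.0 - coverage * 2.0)) / sharpness
    return max(0.0, min(1.0, t))
```

The fragment colour is then blended between the surface's own albedo and a snow colour by this amount, which is enough to "dust" a whole scene without touching its textures.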
The first time I played Diablo 2, I remember how impressed I was to see rain causing ripples on the river just behind the Rogue Encampment. Only when I looked closer did I realise that those ripples were not actually caused by any raindrop: the ripples and the raindrops were simply unrelated. As often happens, improving graphics in modern computer games is not a quest for realism: it is all about believability.
When it comes to 2D games, creating ripples in water is relatively easy, as it is often done with particles. For 3D objects, things are a little more complicated. Since meshes can be curved, it is hard to have particles follow their shapes correctly. Technically speaking, such an effect could be simulated perfectly with physics, but as we have already seen, fluid simulations are expensive and hard to control.
This is why most games in which you see water slowly dripping on a 3D surface often rely on shaders.
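The heart of such a shader is usually the shape of a single ripple: a ring that travels outwards from the impact point while fading away, evaluated per fragment from the distance to the drop. A sketch of one such function (the names and constants are illustrative):

```python
import math

def ripple_height(dist, time, speed=2.0, wavelength=0.5, decay=3.0):
    """Height offset of an expanding, fading ripple ring.

    A sine wave travels outwards from the impact point at `speed`,
    while the exponential term fades the ripple with both distance
    from the impact and elapsed time.
    """
    phase = 2.0 * math.pi * (dist - speed * time) / wavelength
    return math.sin(phase) * math.exp(-decay * (dist + time))
```

In a shader, this height would perturb the surface normal (or sample a normal map of concentric rings) so each drop appears to disturb the wet surface it actually lands on.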
In the two previous instalments of Shader Showcase Saturday, we talked about waterfalls and interactive grass. Those two subjects sound very different, yet they have something in common: the original phenomenon can be modelled as a fluid simulation. This week's Shader Showcase Saturday continues the trend with another effect that involves fluids: fire.