Text & Photo by TRY-Z, Zenji Nishikawa

Half-Life 2 Developer Interview

=Page 1 of 2=
Valve Software (Valve), the developer of "Half-Life 2", gave the presentation "Making of Half-Life 2" at the Softimage booth of SIGGRAPH 2003, held in San Diego, Calif. on July 27, 2003.
Since we had the opportunity to interview the presenter, Valve designer Bill Van Buren, this report combines the interview with the contents of the presentation.

"Face" is a "Face" of "Half-Life2"


forGamer, Zenji Nishikawa (ZEN below):
First question: which part of "Half-Life 2" were you in charge of?

Bill Van Buren (BVB below):
I mainly oversee facial modeling and other related technology, and I also participate in producing the game's contents and in sound-related direction.

ZEN:
Please tell me about facial expression in "Half-Life 2".
There was an explanation of the "eyeball shader" in today's presentation. You said the eyeball shader reproduces the reflection of the scene in the eyeballs; how does it specifically work? I suppose either an environment map is prepared, or it is a multipass rendering technique.

BVB:
In fact, it is a kind of fake. A highlight is projected onto the eyeball as if the character were actually seeing the light source in the scene. Although no technique like an environment map is used, it is just as effective when you glance at the eyes.
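The fake BVB describes can be sketched roughly as follows: instead of sampling an environment map, pick the point on the eyeball sphere that faces the light source and draw the highlight there. The function name and the sphere-based geometry are my own illustration, not Valve's actual code.

```python
import math

def eye_highlight_offset(eye_center, light_pos, eye_radius):
    """Place a highlight on the eyeball sphere, on the side facing the
    light source. This is a fake: no environment map is sampled, yet the
    highlight still moves when the character or the light source moves."""
    dx = light_pos[0] - eye_center[0]
    dy = light_pos[1] - eye_center[1]
    dz = light_pos[2] - eye_center[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    # The point on the eyeball surface nearest to the light source.
    return (eye_center[0] + eye_radius * dx / length,
            eye_center[1] + eye_radius * dy / length,
            eye_center[2] + eye_radius * dz / length)
```

Because the highlight position depends only on the light direction, it updates for free as either the head or the light moves, which matches the "lively eyes" observation below.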

ZEN:
I see. That is probably the cost-effective approach. Since the light in the eyes also moved when the character or the light source moved, the eyes looked lively. How about the eyelids?

BVB:
That trick is realized by deforming the outer skin of the eyelids.

ZEN:
I heard that 40 control points are used for the characters' facial expressions in "Half-Life 2", and that expressions are made by displacing those points. Could you tell me about the system?

BVB:
There are 34 control vertices in a character's face, and expressions are made by displacing them. We do not use bones inside the face, since that would hurt performance greatly.

ZEN:
How is the facial expression animation made?

BVB:
It is so-called keyframe animation. Given the vertex displacements at the beginning and the end of an expression, the changing expression is reproduced by interpolating the vertex coordinates arithmetically.
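The interpolation BVB describes is, in its simplest form, a linear blend of each control vertex between two keyframes. This minimal sketch assumes plain linear interpolation; the actual engine may use easing curves or blend several expressions at once.

```python
def lerp_expression(start_verts, end_verts, t):
    """Interpolate each control vertex between the start and end keyframes
    of an expression. t runs from 0.0 (start pose) to 1.0 (end pose)."""
    return [tuple(a + (b - a) * t for a, b in zip(v0, v1))
            for v0, v1 in zip(start_verts, end_verts)]
```

Only the two key poses need to be stored per expression; every in-between frame is computed on the fly, which is why vertex displacement is cheap compared with a bone rig inside the face.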

ZEN:
The skin of the face models looked very real; is that because a special shader is used? So-called "skin shaders"(*) have begun to appear in ATI and NVIDIA GPU demonstrations.

BVB:
We do not use a so-called skin shader to express the texture of the skin. We take care that it does not become the vinyl-looking skin of old games.
A specular mask is applied to parts such as the cheeks and nose to express glistening or sweating skin.
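A specular mask of the kind BVB mentions can be sketched as a per-texel scale factor on an ordinary specular highlight. The Phong-style formula and the function name here are my own illustrative choices, not a description of Valve's shader code.

```python
def masked_specular(n_dot_h, shininess, mask):
    """Specular highlight scaled by a per-texel mask value read from a
    grayscale texture: near 1.0 on the cheeks and nose, near 0.0
    elsewhere, so only those areas glisten as if sweaty."""
    return mask * (max(n_dot_h, 0.0) ** shininess)
```

With the mask painted into a texture, artists decide exactly where the face shines, which avoids the uniform "vinyl" sheen of older games.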

ZEN:
I heard that the nostrils, ears, and the hollows of the eyes are shaded using a parameter like a self-occlusion factor....

BVB:
We work on that part carefully, and it is now a texture-based expression.

ZEN:
That is to say, the dark parts of a face are darkened by the texture itself.

BVB:
That's right. We took considerable time designing the faces, so please look at them closely. Some 2500 polygons are used just for modeling one face.

ZEN:
How about the hair? It looks like "feathered hair" using anisotropic lighting.

BVB:
Exactly. When hair needs to be expressed, we use that technique.



(*) skin shader: Flesh and blood are covered by skin, and the translucent skin reflects light intricately. A shader that reproduces this skin texture is called a skin shader.

The brilliance in the eyes is expressed by reflecting the light source position in the scene; that is the secret of lively eyes. The expression control engine was developed with the cooperation of Dr. Paul Ekman, professor of psychiatry at the California medical school and a prominent figure in psychology.


Shadow generation in "Half-Life 2" is made possible by projective texture mapping

ZEN:
Please tell me about shadow expression in "Half-Life 2". There are techniques like stencil shadow volumes and shadow mapping (shadow buffers) for generating shadows. Which technology is used in "Half-Life 2"?

BVB:
We use so-called projective texture shadows. The method is to treat the light source as the viewpoint, render a low-detail model into a texture, and apply that texture by projective texture mapping.
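The projection step of the method BVB describes can be sketched by mapping a world-space point into the shadow texture as seen from the light. For simplicity this sketch assumes an orthographic light looking straight down; a real spotlight would use a full perspective matrix, and the function name is my own.

```python
def shadow_uv(point, light_pos, half_width):
    """Map a world-space point (x, y, z) into [0,1] x [0,1] texture
    coordinates of a shadow texture rendered from a light directly above,
    covering a square footprint of side 2 * half_width. Points receiving
    a dark texel from the low-detail silhouette are in shadow."""
    u = (point[0] - light_pos[0]) / (2.0 * half_width) + 0.5
    v = (point[1] - light_pos[1]) / (2.0 * half_width) + 0.5
    return (u, v)
```

Because the technique only needs a render-to-texture pass and texture projection, it runs on essentially any hardware, which is the compatibility advantage raised in the next question.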

ZEN:
I see. That technique is highly compatible.
It can be used regardless of whether the card is a GeForce or a RADEON, and it also works on console game machines, which is an advantage. However, with this system it is difficult to express self-shadowing, where an object casts a shadow onto itself. Also, how are the shadows of background objects processed?

BVB:
The game does not have self-shadowing.
The shadow-casting light source in "Half-Life 2" is limited to one per area. The shadow projected onto the scene is based on that light source, and a common light map is used for the static shadows of the background.



Static object shadows are generated with traditional light maps, and the moving characters' shadows with projective textures; this combination is often seen these days. When examining a game's shadows, we often check whether the graphics of a PC game cast self-shadows. Unexpectedly, "Half-Life 2" does not employ that technique.


The High Dynamic Range (HDR) rendering technique is employed

ZEN:
Looking at the "Half-Life 2" graphics, expressions like HDR rendering(*1) appear in places. May I take it that the "Half-Life 2" engine employs an HDR rendering technique?

BVB:
Yes. The HDR rendering was implemented with the cooperation of a certain GPU vendor, although I cannot give the vendor's name. That is why HDR expressions appear here and there when you play "Half-Life 2".


ZEN:
By what technique is the rendering specifically carried out? On an ATI RADEON 9500 or above, HDR rendering using a floating-point buffer is also possible.

BVB:
The HDR expression in "Half-Life 2" is a system that stores the luminous intensity of light in the alpha channel and finally applies a glare effect with a programmable pixel shader.
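The alpha-channel approach BVB outlines can be sketched as a bright-pass over an RGBA image: wherever the stored alpha exceeds a threshold, extra glow is added back onto the color. This is a minimal illustration of the idea; the real pixel shader would also blur the glow across neighboring pixels, and the names here are my own.

```python
def glare_pass(image, threshold):
    """image: rows of (r, g, b, a) tuples, where a stores extra luminosity
    beyond the displayable range. Pixels whose alpha exceeds the threshold
    contribute a glow term added back onto the color, clamped to 1.0."""
    out = []
    for row in image:
        new_row = []
        for r, g, b, a in row:
            glow = max(a - threshold, 0.0)
            new_row.append((min(r + glow, 1.0),
                            min(g + glow, 1.0),
                            min(b + glow, 1.0)))
        out.append(new_row)
    return out
```

Packing luminosity into alpha keeps everything in an ordinary 8-bit-per-channel buffer, which is why this style of HDR runs on DirectX 8 class hardware, as the next question notes.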

ZEN:
That technique is often used in DirectX 8 based 3D games like "Splinter Cell" and has begun to appear recently. Furthermore, in "Half-Life 2" there are scenes that look like volume rendering. Could you tell us the name of that technique?

BVB:
We use a particle system for most volumetric expressions. In scenes such as underwater, the particle system is used together with the usual so-called fog function.

The glare effect, a popular HDR expression, is used for very bright places such as skylights. Through the glare effect and the light bleeding effect, the light of an electric bulb shines past a pillar. When the traditional fog function with near/far parameters is used, distant scenery fades out; this is called aerial perspective.