
15 May, 2011

Skinning normals notes


Ok, really this is a Penultimate test :) but the defect is real. I think I first read about it in an article about facial animation in Capcom's MT Framework engine (which I can't find right now; I think it was related to Resident Evil 5). This is an example in FNC, see the weird lighting under the armpit, a complex area where many bones meet:



Normals in realtime rendering (or rendering in general) are like colors: we use them all the time, but we seldom if ever really reason about them, until an article comes out and teaches the basics of gamma correction or normalmap blending and so on, and everyone jumps mindlessly on the new "cool" technique without really having any deeper understanding of the problem. I wonder how many renderers really went from the rendering equation all the way to the sRGB colour space... We should do better.

Let's say we derive face normals from the vertices of a face, and that we compute vertex normals by some form of averaging of the normals of the faces around a vertex (usually weighted by the face areas). Let's also assume that the face areas do not change under skinning. In that case, for each vertex we can compute a set of bone weights and indices for its normal that is the average of the bone weights and indices that act upon the faces of that vertex.

Errata: the weights in the note are wrong. For the second vertex they should be 0.75/0.25: one of its segments is influenced by bone 1 with weight 1 (as both of that segment's vertices are skinned by that bone), while the other segment is influenced by bones 1 and 2 with weights 0.5/0.5, so the vertex should be skinned with bones 1 and 2 with weights 0.75/0.25 (and a similar reasoning applies to the third vertex).
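To make the averaging concrete, here is a minimal sketch of the idea (the class, the names and the use of plain uniform averages are mine; a real implementation would weight each face by its area, as noted above): each face, here a segment, gets the average of its vertices' bone weights, and each vertex then gets, for its normal, the average of the weights of the faces around it. Run on the three-vertex example of the errata, it reproduces the 0.75/0.25 split for the middle vertex.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NormalSkinWeights {

    // Average a list of bone->weight maps (the vertices of one face, or the
    // faces around one vertex). Uniform averaging, for brevity.
    static Map<Integer, Float> average(List<Map<Integer, Float>> weights) {
        Map<Integer, Float> out = new HashMap<>();
        for (Map<Integer, Float> w : weights)
            w.forEach((bone, value) -> out.merge(bone, value / weights.size(), Float::sum));
        return out;
    }

    public static void main(String[] args) {
        // The 1D example from the errata: v0 and v1 are skinned by bone 1,
        // v2 by bone 2; the middle vertex v1 is shared by both segments.
        Map<Integer, Float> v0 = Map.of(1, 1.0f);
        Map<Integer, Float> v1 = Map.of(1, 1.0f);
        Map<Integer, Float> v2 = Map.of(2, 1.0f);

        Map<Integer, Float> seg01 = average(List.of(v0, v1)); // bone 1 -> 1.0
        Map<Integer, Float> seg12 = average(List.of(v1, v2)); // bone 1 -> 0.5, bone 2 -> 0.5

        // Bone weights to use for v1's normal: the average of its segments.
        Map<Integer, Float> normalWeightsV1 = average(List.of(seg01, seg12));
        System.out.println(normalWeightsV1); // expect bone 1 -> 0.75, bone 2 -> 0.25
    }
}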

P.S. I still prefer my pens and my notebooks over writing on an iPad with a capacitive pen :(

Update (2023): Sergey Makeev smartly noted that there is an even bigger issue with normals if you allow for bone translation. Translation would not change the normals at all, which is clearly wrong. In Fight Night the skeleton only did rotations, so this was not a problem, but in modern facial animation systems you might end up with bones that do a lot of translation instead!
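A toy illustration of why that happens (all names and numbers here are made up, this is not code from the game): linear blend skinning applies only the 3x3 linear part of each bone matrix to the normal, so a bone that purely translates contributes the identity and leaves the normal untouched, no matter how far it drags the vertices.

public class TranslationNormals {

    // Blend the 3x3 linear parts of two bone matrices and apply the result to a normal.
    static float[] skinNormal(float[][] a, float wa, float[][] b, float wb, float[] n) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                out[i] += (wa * a[i][j] + wb * b[i][j]) * n[j];
        return out;
    }

    public static void main(String[] args) {
        // A bone that only translates has an identity linear part, whatever its translation is.
        float[][] identity = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
        float[] n = { 0, 1, 0 };

        // Two translation-only bones blended 50/50: the skinned normal comes back
        // unchanged, even though the skinned positions may have moved a lot.
        float[] skinned = skinNormal(identity, 0.5f, identity, 0.5f, n);
        System.out.println(java.util.Arrays.toString(skinned)); // [0.0, 1.0, 0.0]
    }
}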


08 May, 2011

2011 Future programming languages for games - POLL

Prerequisite: this.


It took some time, but here they are: the results of the 2011 Future Languages for Games poll. I have to say the poll itself could have been better; I'm sorry, but it was my first experiment with SurveyMonkey.
Nonetheless, I got 174 responses, which is fairly good considering the subject…

Let's begin reviewing this data… from the end, the questions about expertise.

The largest group of people that answered the survey rated themselves as "hobbyists" (25.6%), followed by 23.3% who just started working in the field, 21.1% who have worked for at least 3 years and shipped at least a title, 18.9% of very experienced professionals (technical directors or equivalent), and the remaining 11.1% who claimed to have been working for at least 5 years, shipping multiple titles.

Most people claimed either to know a few languages and be interested in knowing more, or to use multiple languages and paradigms in substantial projects (37.8% each). A few know only one or two languages, the ones they use at work (10%), and some claimed to be language gurus, having used many languages and designed some of their own (14.4%).
Now, this sounds a bit atypical to me. It might be related to the topic of the survey itself (most probably), and also a bit to the fact that less experienced programmers tend to overestimate themselves; so I expect the answers to the question about experience, which was stated in stricter temporal terms, to be more accurate than the ones about knowledge.

I don't have a professional account, so I can't do correlation analysis, which would have been handy. Next year I'll also set up multiple gathering links, so I can see which audience I get from which sources (reddit, my blog, twitter…).

The "meat" of the survey was of course the question about which programming languages are in use today for games, and which ones would be preferred if the subject were to start a new game engine from scratch today. Let's see:

Pure C. Today: 30. Tomorrow: 30
Pure C++. Today: 66. Tomorrow: 53
C++ extended via tools. Today: 22. Tomorrow: 20
C extended via tools. Today: 9. Tomorrow: 7
Java. Today: 8. Tomorrow: 7
C#. Today: 32. Tomorrow: 29
D. Today: 2. Tomorrow: 12
Objective-C. Today: 11. Tomorrow: 6
OpenCL. Today: 7. Tomorrow: 10
Erlang. Today: 3. Tomorrow: 2
ML-family. Today: 3. Tomorrow: 6
DSLs code-generated to C/C++. Today: 2. Tomorrow: 1
DSLs, compiled. Today: 2. Tomorrow: 2
DSLs, interpreted. Today: 3. Tomorrow: 1
Lua. Today: 35. Tomorrow: 34
Proprietary scripting. Today: 13. Tomorrow: 4
Other scripting languages. Today: 12. Tomorrow: 15
Python. Today: 8. Tomorrow: 6
Haskell. Today: 2. Tomorrow: 2
JavaScript. Today: 3. Tomorrow: 2

No big surprises here. Today's languages are C++/C, Lua and C#. DSLs are not so popular, and proprietary scripting systems are still the first alternative to Lua, with all the other scripting languages trailing behind.

The question also asked for language usage in the "high-level" and "low-level" components of the engine. It's interesting to note there that C is used today mostly for the "low-level", while C++ scores almost equally for "low" and "high". C++ extended via tools (i.e. code generation) is, on the other hand, used mostly for the "high level", as are all the other languages, with D being the only exception.

For the future, C remains stable, while C++ loses some points. Also, proprietary scripting and DSLs are not seen as a good alternative for the future, probably recognizing the difficulties that engineering a language entails.
Surprisingly (to me at least) C# does not rise either, while D seems to be a language quite a few people are hoping to use. No other language registers such a sharp rise; it seems that people still want the next C++ to look very similar to C, rather than being a higher-level alternative.

Speaking of programming paradigms, it seems that none is really neglected; everything has its use. Imperative and stream/dataflow programming are the champions of rendering; functional, declarative and actor-based programming are seen as suited for AI, while OO and events (reactive programming) are still strong for gameplay. Asset loading is quite obviously dominated by data-driven strategies, together with generic/template-based programming, which is also strong for the build-system part of the problem, even if there declarative programming seems to be the preferred choice.

The next question was about which languages are most known, most liked and most likely to be used in a game. C++ is the most well-known language (68.2% say "a lot") and the most likely to be used in a game (52.3% chose "a lot"), but it's comparatively less liked (36%).

C is of course very well known as well, and most people say they know and like C# well enough, even if there are far fewer "C# experts" than C++ ones (and that might be the reason why many C# tools are so slow, and why so many people have wrong ideas about GC and related concepts. C# is deceptively similar to C++, but we are all C++ experts and often reason "in C++" even when coding in other languages, thus making naive mistakes).

Lua is a less known and less liked language, but comparatively many people are willing to use it, a sign that it's probably seen as a language for designers, to be added to an engine but not used by the engine team itself.

All the other languages are not nearly as well known. After Lua we have Python, Javascript, D, Scheme/Lisp, Objective-C and ML-family.

Among the tools to create languages, code generation and code parsing still score more points than the relatively "recent" LLVM framework. Compiler-compilers are also not so well known, but as with LLVM, the few experts who know these tools tend to like them a lot. LLVM also scores well on suitability for games.

Last but not least, I asked what features of C++ make it so suitable for games. Of course the first and foremost is "platform availability" (selected by almost everyone, 86.8%), followed by the ability to directly manipulate raw memory via pointers, encapsulation in classes, and the compile-time knowledge of type sizes (67%).

Function pointers and templates are still seen as very important features, followed by const-correctness and inheritance from interfaces (inheritance from concrete classes and multiple inheritance are way less popular). Surprisingly (for some), explicit new and delete and destructors (RAII) are not so high in the list, and half of the respondents don't deem them so important. All the other features have smaller numbers, the least liked being the STL, RTTI, multiple inheritance, exception handling and reinterpret_cast.

For C++ I also asked which features were perceived as being well implemented. Now, among the ones that scored well for their practical usefulness, the worst implemented are related to the object system, in particular inheritance, overloading and templates. Memory management is also not really loved, and in general no C++ feature seems to score really high; the most liked are the ones of C heritage, in particular platform support and pointers.

So in conclusion, what did I learn?

Well, quite a few things, I have to admit. C++, even if it's showing its problems, is less hated overall than I expected, and its "better designed" cousin D seems to have a few people hoping it will be a protagonist in the future.

People are looking everywhere, considering many different paradigms, but still, even if there is quite a lot of interest in many different things, only a few languages are really well known: the usual suspects (C/C++, C#, Lua…).

So what are you doing still reading this? Go, download a language and start using it!

06 May, 2011

Processing vs IntelliJ IDEA

Update: This old article used to describe how to get Processing up and running with Eclipse. That's not so great, so now I'm using IntelliJ's IDEA Community Edition instead.

Even if I use Processing for small prototypes and stuff like that, I've always restricted myself to its own sketch IDE. Even if it's fairly bad, I never made big Processing projects, I like its minimal interface, and I'm not really the kind of guy that cares too much about his editor.

Also, the popular alternative is Eclipse, an IDE that is the de-facto standard in the Java world and is known to be huge, slow and generally a mess. The few times I had to use Java I went with the simple and beautiful DrJava "educational" IDE instead.

Oh, just how wrong I was about that. It turns out that Processing in IDEA is just amazing, as IDEA supports edit-and-continue ("hotswapping", if we don't want to stick to the MS lingo; it's a fairly old feature of the JVM).

Now, you won't get anywhere near the awesomeness of something like Fields, which was designed from the ground up as an interactive coding tool, but well, it's 100 times better than the Sketch IDE and it's really enjoyable.

So assuming that you know nothing about IDEA, just like me, here is a step by step guide to coding fun:

Step.1
Download IDEA. At launch, you can disable all plugins; none are relevant to Processing development.
Unfortunately it doesn't seem possible to download a version of IDEA without all that crap, nor to uninstall the default plugins, but at least everything can be disabled. Note that more stuff can be removed if you use Configure/Plugins...

Configure/Settings/Keymap also allows you to use Visual Studio's default keybindings, if you (like me) prefer those.

Step.2
IntelliJ IDEA doesn't come with a JDK, and for some reason I can't just point the IDE to the JDK included with Processing. You can either install a JDK, or do what I did: PortableApps.com has a portable version of the JDK, and I used that.
Note that not all directories seem to work for this: IDEA still complained when I had the portable JDK in my downloads folder for some reason; moving it to C:\ fixed the issue.

Step.3
Download Processing and locate the core.jar library. There might be other jar files alongside it; most probably, you'll need them all... Note the directory where these live.

Run IDEA and create a new Java project.

Now you have to add the Processing jar files. Select the project (view/tool windows/project) and right-click to "Open module settings". Here, under "Libraries" add the path where the Processing core.jar lives.

Step.4
IDEA should have created a Main.java file. Edit it with the following sample code:

import processing.core.PApplet;
import java.util.*;

import processing.data.*;

public class Main extends PApplet {

    public static void main(String args[]) {
        // Tell Processing which PApplet subclass to run.
        PApplet.main("Main");
    }

    @Override
    public void settings() {
        // In Processing 3, size() must be called here, not in setup().
        size(512, 512);
    }

    @Override
    public void setup() {
    }

    @Override
    public void draw() {
        background(15);
    }
}

In general, converting Processing PDE code to plain Java mostly means sprinkling some "public" keywords in classes, as Processing PDE class members are all public by default (while Java's aren't).
Also, you might need to change double literals (e.g. 0.0) to float literals (0.f), another difference between PDE and plain Java; this is actually quite annoying. Lastly, Processing's "color" type should be changed to a plain "int".
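As an illustration, here is a small, made-up before/after sketch of that conversion (the Ball class and its members are just an example, not something shipped with Processing):

// Hypothetical Processing PDE tab:
//
//   class Ball {
//     color c;
//     float speed = 0.5;
//     float pos;
//     void move() { pos += speed; }
//   }
//
// The same class in plain Java, placed as an inner class of the Main class
// from Step 4 (so it can still call PApplet methods if it needs to):

class Ball {
    public int c;                         // Processing's "color" type is really an int
    public float speed = 0.5f;            // 0.5 alone is a double literal in Java
    public float pos;
    public void move() { pos += speed; }  // members must be made public explicitly
}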

Step.5
Launch the debugger. Notice that every time you build the class/project, IDEA will prompt you, asking if you want to hot-reload the code, which should just work.

Have fun!

Here are the ooooold instructions, for Eclipse/Processing 2.x, if you prefer those...

Step.1
Download the tiniest, most stripped-down version of Eclipse possible. This can be quite a challenge, as Eclipse comes in many bloated flavors by default.

The tiniest of the prepackaged ones seems to be Eclipse Classic, which "just" includes all the Eclipse sources plus the CVS versioning support plus the plugin development environment...

Luckily, there is a customization service run by Yoxos here: https://yoxos.eclipsesource.com/discover.html. Just select "Eclipse Java Development Tools" from the Components tab and that should do the trick. It's "only" 85mb!

Note: if you're not lazy like me you can read this and learn the Eclipse shortcuts. Otherwise, also include the C++ development platform in the custom download. Now the package will be around 120mb, but you'll get the Visual Studio keybindings (preferences/general/keys).

Step.2
Download Processing and locate the core.jar library. There might be other jar files alongside it; most probably, you'll need them all (in the current Processing v2.1 you'll need core, jogl-all and jogl-natives-, gluegen-rt and gluegen-rt-natives-). In my case, being on OSX, Processing comes packaged in a nice .app structure; I used muCommander to extract the files I needed.

Step.3
Run Eclipse and create a new Java project. Select the package you just created, right-click and select "Build Path/Configure Build Path/Libraries/Add External Jar" and select the Processing jar libraries. Then add a new class, named like the project, and type something like this:

import processing.core.PApplet;
public class Test extends PApplet {

    public void setup()
    {
        // P3D selects the 3D renderer, needed for the sphere() call below.
        size(512,512,P3D);
    }
    public void draw()
    {
        background(0);
        sphere(100);
    }
}

Step.4
Launch the debugger. Notice that every time you save the class, the Processing window automatically refreshes. Have fun!