
10 October, 2013

Mathematica Manipulate[] example

Addendum to the Mathematica 101 post. I was working today and noticed that this little device is fun to use, and thus wanted to post it. Variable names have been changed to protect the innocent. I might even add more to this post in the future...

This is something I do often with Mathematica: I have some data, computed or acquired somehow, and I want to look at it from a few angles and try to find an analytic expression, without knowing all the underlying math, either because a closed-form solution is too hard to find or because I'm just not good enough.
In any case, finding approximations even on paper would require assumptions that you have to check anyway, and a good way to check them is to compare against the computed/acquired data. Other times you know a perfectly fine analytic expression but want to fit a second one that's cheaper to compute.

With Manipulate it's easy to explore a parametric function, given one. But most often I'm not entirely sure of the parametric form to use to begin with! Fear not: since in Mathematica everything is an expression, of course you can Manipulate expressions as well...

The code and graphs are captured in the two screenshots below, click on them to see the original size.
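Since screenshots don't carry over well, here is a minimal sketch of the trick, with made-up data and a made-up model expression (all names are illustrative, not the originals): the point is that the model is kept as a plain expression in a symbol, so you can swap parametric forms without touching the Manipulate.

```mathematica
(* Fake "acquired" data: a noisy sine, standing in for real measurements *)
data = Table[{x, Sin[x] + RandomReal[{-0.1, 0.1}]}, {x, 0., 2. Pi, 0.1}];
(* The candidate model is just an expression, not a function, so it's
   trivial to edit and re-evaluate while exploring parametric forms *)
model = a*Sin[b*x + c];
Manipulate[
 Show[ListPlot[data],
  Plot[model /. {a -> aa, b -> bb, c -> cc}, {x, 0, 2 Pi}],
  PlotRange -> All],
 {{aa, 1}, 0, 2}, {{bb, 1}, 0.5, 2}, {{cc, 0}, -Pi, Pi}]
```

Changing `model` to a different expression and re-evaluating the cell is all it takes to explore another parametric form.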

06 October, 2013

Wolfram's Mathematica 101

After a lengthy preamble, I'll try to explain the language in the http://learnxinyminutes.com/ style, so you might prefer to skip down to the last section.

This will appear in a shorter form also on AltDevBlogADay

- Introduction

I've been using Wolfram's Mathematica since my days in university. I wasn't immediately sold, as initially I saw it as a computer algebra system and preferred Maple's more math-friendly syntax for that, but with time it became a great tool in my arsenal of languages.
The way I see Mathematica fitting into the work of today's rendering engineer (or game developer in general) is mostly as a data analysis tool. We increasingly have to deal with data, either acquired (e.g. measured BRDFs) or simulated (e.g. integrals of the rendering equation), get "a sense" of it, compare it with our realtime rendering models, and try to derive the right approximations for the sea of things we still can't directly solve.

What makes Mathematica good for this job, a better tool than say C++, are a few key features: it's an interactive, exploratory environment, it has strong visualization and manipulation abilities, it has a rich library providing almost everything you could think of, it's a concise language, and it has a great community (see http://mathematica.stackexchange.com/ and http://www.wolfram.com/broadcast/video.php?channel=311) and great documentation.

Two notes, before looking at the language. First, you might notice that on the technical level there are alternatives that can compete. We want a prototyping language with lots of libraries, an interactive shell, solid visualization abilities… Python fits the bill as well, as most probably do Matlab and its clones (Scilab, Octave), Maple and a number of others.
So why should you be interested in even learning Mathematica, if you can do most of the same things in Python, which is free? In my view, the money you pay for Wolfram's system is well spent because of the packaging. Many of the functions might be exactly the same you get in other systems (e.g. Lapack for linear algebra), but Mathematica packages them in a consistent syntax, with astonishingly good documentation, great support, testing and so on.

The second remark, as you might have noticed, is that I didn't mention the CAS aspects. Perhaps surprisingly, computer algebra is not the most important part for my job, as more often than not you're dealing with integrals that can't be analytically solved, or directly with raw data. Nonetheless, Mathematica being a CAS is a great perk, as being able to easily manipulate your expressions makes the numerical experiments more flexible too, and Wolfram's is undoubtedly the best CAS out there (Sage, Maxima and so on can help, but aren't close).
Also, don't think that a CAS can magically solve maths if you don't know it. It's true that it can greatly help, as you might have forgotten all the myriad formulas used to solve limits, derivatives and integrals, or to transform trigonometric expressions and so on. But you still have to know what you're doing, sometimes even "better" than when working by hand: we often solve equations under mental assumptions that don't hold true in general (i.e. ranges of the variables, domains, periodicity), and if you don't realize that, and tell the system, Mathematica sometimes won't be able to solve even "obvious" equations.

I've always encouraged my companies to get a few distributed seats of Mathematica, but remember, if you just need the occasional solution of an analytic expression, Sage, Maxima (both can be tried online) or even Wolfram Alpha can work well. On iOS I use MathStudio, but PocketCAS and iCAS (based on Reduce) look promising as well.

- Mathematica's language

It stands to reason that a CAS is built on top of a symbolic language that supports programmatic manipulation of its own programs (code as data, a.k.a. homoiconicity), and indeed this is the case here. The most famous homoiconic language is Lisp, and if you're familiar with the Lisp family of languages, Mathematica won't be too far off, but there are a few notable differences.
While in Lisp everything is a list, in Mathematica everything is an expression tree. Also, expressions in Mathematica can have different forms, that is, input (or display) versions of the same internal expression node. This allows you, for example, to have equations entered in standard mathematical notation (TraditionalForm) via equation editors, or in a textual form that can be typed without auxiliary graphical symbols (InputForm), and so on. Mathematica's environment, the notebook, is not purely textual but supports graphics, so even images and graphs can be displayed as output or input, inside equations, while still maintaining the same internal representation.

Mathematica is an interactive environment, but it's not a standard REPL (read-eval-print loop); instead it relies on the concept of "notebooks", which are collections of "cells". Each cell can be evaluated (shift-enter) and it will yield an output cell underneath it, thus allowing for changes and re-evaluation of cells in any order. Cells can also be marked as containing plain text rather than Mathematica code; thus the notebook is a mix of code and documentation, which enables a sort of "literate programming" style.
For completeness it's worth noting that Mathematica also has a traditional text-only interface, invoked by running the Kernel outside the notebook environment, which has only textual input and output and is the standard REPL you would expect, but there's little reason to use it. There is also a more "programming"-oriented environment called the Workbench, an optional product that can make your life easier if you write lots of Mathematica code and need to profile, debug and so on.

- Crash course by example. In a notebook, put each group in a separate cell and evaluate.

Note: Mathics is an open-source implementation based on SciPy and Sage. It also has an online interface, so you can try, I expect, most of the code below!

(* This is a comment, if you're entering this in a notebook remember that to evaluate the content of a cell you need to use shift-enter or the numeric pad enter *)

(* Basic math is as expected, but it's kept at arbitrary precision unless you use machine numbers *)
(1+2)*3/4
(1.+2.)*3./4.
(* % refers to the last computed value *)
%+2
(* Functions are invoked passing parameters in square brackets, all built-in functions start with capitals *)
Sin[Pi/3]
(* N[] forces evaluation to machine numbers, using machine numbers makes evaluation faster, but will defeat many CAS functions *)
N[Sin[Pi/3]]
(* Infix and postfix operators all have a functional form; use FullForm to see it *)
FullForm[Hold[(1+2)*3/4]]
(* Each expression in a cell will yield an output in a separate output cell. Expressions can be terminated with ; if we don't want them to emit output, which is useful when doing intermediate assignments that would yield large outputs otherwise *)
1+2;

(* Assigning a symbol to a value *)
x = 10
(* If a symbol is not yet defined, it will be kept in its symbolic version as the evaluation can't proceed further. y will be assigned to the expression 10*w *)
y = x*w
(* We can see that even more clearly by peeking at the internal representation *)
y // TreeForm
(* In Mathematica, symbols are immediate values, or terminals, just like numbers are in most languages. a,b,c are the same as 1,2,3... The difference is that we can, optionally, assign values to symbols, and if that's done the evaluator will replace the symbol with its value when it encounters it... *)
(* This will recursively expand z until it reaches expansion limit and errors out: *)
z = z+1
(* Clears the previous assignments. It's not wise to assign as globals such common symbols, we use these here for brevity and will clear as needed *)
Clear[x,y,z] 

(* Whether evaluation happens immediately or not is controlled by symbol attributes. =, for example, immediately evaluates *)
x = 5*2
(* y will be equal to "x*2", not 20, as := is the infix version of the function SetDelayed, which doesn't evaluate the right-hand side... *)
y := x*2 
(* …that's because SetDelayed has attribute HoldAll, which tells the evaluator to not evaluate any of its arguments. HoldAll and HoldFirst attributes are one of the "tricky" parts, and a big difference from Lisp where you should explicitly quote to stop evaluation *)
Attributes[SetDelayed] 
(* As many functions in Mathematica are supposed to deal with symbolic expressions and not their evaluated version, you'll find that many of them have HoldAll or HoldFirst, for example Plot has HoldFirst to not evaluate its first argument, that is the expression that we want to graph *)
Plot[Sin[x], {x, 0, 6*Pi}]
(* The Hold function can be used to stop evaluation, and the Evaluate function can be used to counter-act HoldFirst or HoldAll *)
Hold[x*2]
y:=Evaluate[x*2]
y

(* A neat usage of SetDelayed is for memoization of computations: the first time the following pi2 is evaluated, it will set itself to the numerical value of Pi*Pi to 50 decimal points *)
pi2:=pi2=N[Pi*Pi,50]
pi2

(* Defining functions can be done with the Function function, which has attribute HoldAll *)
fn=Function[{x,y}, x*y];
fn[10,20]
(* Like many Mathematica built-ins, Function has multiple input forms; the following is a shorthand with unnamed parameters #1 and #2, ended with the & postfix *)
fn2=#1*#2&
fn2[10,20]
(* Third version, infix notation. Note that \[Function] is a textual representation of a graphical symbol that can be more easily entered in Mathematica with the key presses: esc f n esc, many symbols can be similarly entered, try for example esc theta esc *)
fn3={x,y}\[Function]x*y
fn3[10,20]

(* A second, very common way of defining functions is to use pattern matching and delayed evaluation; the following defines the fn4 symbol to evaluate the expression x*y when it's encountered with two arguments, which will be matched to the symbols x and y *)
fn4[x_,y_]:=x*y
fn4[10,20]
(* _ or Blank[] can match any Mathematica expression, _h matches only expressions with the Head[] h *)
fn5[x_Integer,y_Integer]:=x+y
fn5[10,20]
fn5[10,20.]
(* A symbol can have multiple matching rules *)
fn6[0] = 1;
fn6[x_Integer] := x*fn6[x - 1]

fn6[3]

(* In general pattern matching is more powerful than Function as it's really an evaluation rule, but it's slower to evaluate, thus not the best if a function has to be applied over large datasets *)
(* Note that pattern matching can also be used with =, not only :=, but beware that = evaluates the RHS; in the following, fnWrong will multiply y by 3, not by the value matching test at "call" site, as test*y gets fully evaluated and test doesn't "stay" a symbol, it evaluates to its global value *)
test = 3;

fnWrong[test_, y_] = test*y

(* Lists are defined with {} *)
a={1,2,3,{4,5},{aa,bb}}
(* Elements are accessed with [[index]], indices are one-based, negative ones wrap around *)
a[[1]]
a[[-1]]
(* Note! Mathematica thinks of symbols and pattern-matching as more fundamental than arrays: [] is about patterns, [[]] is about arrays. So x[1]=0 defines the symbol x to be 0 when the first variable slot has value 1. This can be used to define recurrences as well, see RSolve. *)
(* x[[1]]=0 instead evaluates symbol x first; if it is a list then [[]] makes sense, as it's a postfix operator on lists... If you need ten different symbols, Array[x,10] will do the trick... *)
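A small sketch to make the distinction concrete (f and lst are arbitrary names chosen here for illustration):

```mathematica
(* [] attaches rewrite rules to a symbol, it doesn't index storage *)
f[1] = 10;          (* "when f is applied to 1, rewrite to 10" *)
f[1]                (* evaluates to 10 *)
f[2]                (* stays unevaluated, as f[2]: no rule matches *)
(* [[]] instead indexes an actual list held by a symbol *)
lst = {10, 20, 30};
lst[[1]] = 0;       (* really overwrites the first element *)
lst                 (* now {0, 20, 30} *)
(* Array[x, 10] generates {x[1], ..., x[10]}: ten distinct indexed symbols *)
```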
(* Ranges are expressed with ;; or Span *)
a[[2;;4]]
(* From the beginning to the second last *)
a[[;;-2]]
(* Vectors and matrices are just appropriately sized lists and lists of lists *)
b={1,2,3}
m={{1,0,0},{0,1,0},{0,0,1}}
(* . is the product for vectors, matrices, and tensors *)
m.b

(* Expression manipulation and CAS. ReplaceAll or /. applies rules to an expression *)
(x+y)/.{x->2,y->Sin[Pi]}
(* Rules can contain patterns; the following will match only the x symbols that appear raised to a power, match the exponent, and replace the whole expression *)
Clear[x];
1+x+x^2 +x^(t+n)/.{x^p_->f[p]}
(* In a way, replacing a symbol with a value in an expression is similar to defining functions using := or = and pattern-matching, but we have to manually replace the right symbol... *)
expr = x*10
expr/.x->5
(* Mathematica has lots of functions that deal with expressions: Integrate, Limit, D, Series, Minimize, Reduce, Refine, Factor, Expand and so on. We'll show only some basic examples. Solve finds solutions to systems of equations or inequalities *)
Clear[a];
Solve[x^2+a*x+1==0, x]
(* It returns results as a list of replacement rules that we can substitute back into the original equation *)
eq=x^2+a*x+1
sol=Solve[eq==0, x]
neweq=eq/.sol[[1]]
(* Simplifying neweq yields true as the equation is satisfied *)
Simplify[neweq]
(* Assumptions on the variables can be made *)
Simplify[Sqrt[x^2], Assumptions -> x < 0]
(* fn7 will compute the integral and derivative every time it's evaluated, as := holds its right-hand side; fn8, using Evaluate, forces the definition to be equal to the simplified version, which correctly yields back the original expression *)
fn7[x_]:=D[Integrate[x^3,x],x]
fn8[x_]:=Evaluate[Simplify[D[Integrate[x^3,x],x]]]

(* Many procedural programming primitives are supported *)
If[3>2,10,20]
For[i = 0,i < 4,i++,Print[i]]
n=1; While[n < 4,Print[n];n++]
Do[Print[n^2],{n,4}]
(* Boolean operators are C-like for the most part, except Xor, which is not ^ as that means Power instead *)
!((1>2)||(4>3))&&((1==1)&&(5<=6))
(* Equality tests can be chained *)
(5>4>3)&&(1!=2!=3)
(* == compares the result of the evaluation on both sides, === is true only if the expressions are identical *)
v1=1;v2=1;
v1==v2
v1===v2
(* Boolean values are False and True. No output is Null *)

(* With, Block and Module can be used to set symbols to temporary values in an expression *)
With[{x = Sin[y]}, x*y]
Block[{x = Sin[y]}, x*y]

Module[{x = Sin[y]}, x*y]

(* The difference is subtle. With acts as a replacement rule. Block temporarily assigns the value to a symbol and then restores the previous definition. Module creates a unique, temporary symbol, which affects only the occurrences in the inner scope. *)
m=i^2
Block[{i = a}, i + m]
Module[{i = a}, i + m]
(* In general prefer Block or With, which are faster than Module. Module implements lexical scoping, Block does dynamic scoping *)
(* Block and Module don't require you to specify values for the declared locals, With does. The following is fine with Block, not with With *)
Block[{i},i=10;i+m]

(* Data operations. Table generates data from expressions *)
Table[i^2,{i,1,10}]
(* Table can generate multi-dimensional arrays, i.e. matrices *)
Table[10*i+j,{i,1,4},{j,1,3}]
MatrixForm[%]
(* List elements can be manipulated using functional programming primitives, like Map which applies a function over a list *)
squareListElements[list_]:=Map[#^2&,list]
(* Short-hand, infix notation of Map[] is /@ *)
squareListElements2[list_]:=(#^2&)/@list
(* You can use MapIndexed to operate in parallel across two lists; it passes to the mapped function each element and its index *)
addLists[list1_,list2_]:=MapIndexed[Function[{element,indexList},element + list2[[indexList[[1]]]] ], list1]
addLists[{1,2,3},{3,4,5}]
(* A more complete version of the above that is defined only on lists and asserts if the two lists are not of equal size. Note the usage of ; to compound two expressions, and the need for parentheses *)
On[Assert]
addListsAssert[list1_List,list2_List]:=(Assert[Length[list1]==Length[list2]]; MapIndexed[Function[{element,indexList},element + list2[[indexList[[1]]]] ], list1])
(* Or MapThread can be used, which "zips" two or more lists together *)
addLists2[list1_,list2_]:=MapThread[#1+#2&,{list1,list2}]
(* There are many functional list manipulation primitives; in general, using these is faster than procedural-style programming. Extract, from a list of the first 100 integers, the ones divisible by five *)
Select[Range[100],Mod[#,5]==0&]
(* Group together all integers from 1...100 in the same equivalence class modulo 5 *)
Gather[Range[100],Mod[#1,5]==Mod[#2,5]&]
(* Fold repeatedly applies a function to each element of a list and the result of the previous fold *)
myTotal[list_]:=Fold[#1+#2&,0,list]
(* Another way of redefining Total is to use Apply, which calls a function with the elements of a list as its arguments. The infix shorthand of Apply is @@ *)
myTotal2[list_]:=Apply[Plus,list]

(* Mathematica's CAS abilities also help with numerical algorithms, as Mathematica is able to infer some information from the equations passed in order to select or optimize the numerical methods *)
(* NMinimize does constrained and unconstrained minimization, linear and nonlinear, selecting among different algorithms as needed *)
Clear[x,y]
NMinimize[{x^2-(y-1)^2, x^2+y^2<=4}, {x,y}]
(* NIntegrate does numerical definite integrals. Uses Monte Carlo methods for many-dimensional integrands *)
NIntegrate[Sin[Sin[x]], {x,0,2}]
(* NSum approximates discrete summations, even to infinity *)
NSum[(-5)^i/i!,{i,0,Infinity}]
(* Many other analytic operators have numerical counterparts, like NLimit, ND and so on... *)
NLimit[Sin[x]/x,x->0]
ND[Exp[x],x,1]

(* Mathematica's plots produce Graphics and Graphics3D outputs, which the notebook shows in a graphical interface *)
Plot[Sin[x],{x,0,2*Pi}]
(* Graphics are objects that can be further manipulated, Show combines different graphics together into a single one *)
g1=Plot[Sin[x],{x,0,2*Pi}];
g2=Plot[Cos[x],{x,0,2*Pi}];
Show[g1,g2]
(* GraphicsGrid on the other hand takes a 2d matrix of Graphics objects and displays them on a grid *)
GraphicsGrid[{{g1,g2}}]
(* Graphics and Graphics3D can also be used directly to create primitives *)
Graphics[{Thick,Green,Rectangle[{0,-1},{2,1}],Red,Disk[],Blue,Circle[{2,0}]}]
(* Most Mathematica functions accept a list of options as the last argument. For plots, a useful one is overriding the automatic range. Show by default uses the range of the first Graphics, so it will cut the second plot here: *)
Show[Plot[x^2,{x,0,1}],Plot[x^3,{x,1,2}]]
(* Forcing to show all the plotted data *)
Show[Plot[x^2,{x,0,1}],Plot[x^3,{x,1,2}], PlotRange->All]

(* Very handy for exploration is the ability to have parametric graphs that can be manipulated. Manipulate allows a range of widgets to be displayed next to the output of an expression *)
Manipulate[Plot[x^p,{x,0,1}],{{p,1},1,10}]
Manipulate[Plot3D[x^p[[1]]+y^p[[2]],{x,0,1},{y,0,1}],{{p,{1,1}},{1,1},{5,5}}]
(* Manipulate output is a Dynamic cell, which is special in that it gets automatically re-evaluated if any of the symbols it captures changes. That's why you can see Manipulate output behaving "weirdly" if you change symbols that are used to compute its output. This allows for all kinds of "spreadsheet-like" computations and interactive applications. *)
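Dynamic can also be used directly, outside Manipulate; a minimal sketch of the spreadsheet-like behavior (val is an arbitrary symbol name):

```mathematica
(* Dynamic re-evaluates its body whenever a symbol it captures changes *)
val = 1;
Dynamic[val^2]       (* displays 1; evaluating val = 5 elsewhere updates it to 25 *)
(* Controls can write back into the symbol, closing the loop *)
Slider[Dynamic[val], {0, 10}]
```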

(* Debugging functional programs can be daunting. Mathematica offers a number of primitives that help to a degree. Monitor generates a temporary output that shows the computation in progress. Here the temporary output is a ProgressIndicator graphical object. Evaluations can be aborted with Alt+. *)
Monitor[Table[FactorInteger[2^(2*n)+1],{n,1,100}], ProgressIndicator[n, {1,100}]]
(* Another example, we assign the value of the function to be minimized to a local symbol, so we can display how it changes as the algorithm progresses *)
complexFn=Function[{x,y},(Mod[Mod[x,1],Mod[y,1]+0.1])*Abs[x+y]]
Plot3D[complexFn[x,y],{x,-2,2},{y,-2,2}]
Block[{temp},Monitor[NMinimize[{temp=complexFn[x,y],x+y==1},{x,y}],N[temp]]]
(* Print forces an output from intermediate computations *)
Do[Print[Prime[n]],{n,5}]
(* Mathematica also supports reflection, via Names, Definition, Information and more *)

(* Performance tuning. A first common step is to reduce the number of results Mathematica will keep around for % *)
$HistoryLength=2
(* Evaluate current memory usage *)
MemoryInUse[]
(* Share[] can sometimes shrink memory usage by making Mathematica realize that certain subexpressions can be shared; it returns the number of bytes saved *)
Share[]
(* Reflection can be used to know which symbols are taking the most memory *)
Reverse@Sort[{ByteCount[Symbol[#]],#}&/@Names["`*"]]
(* Timing operations is simple with AbsoluteTiming *)
AbsoluteTiming[Pause[3]]
(* Mathematica's symbolic evaluation is relatively slow. Machine-number operations are faster, but still slow compared to other languages. In general Mathematica is not made for high performance; if that's needed, it's best to go directly to one of the ways it supports external compilation: LibraryLink, CudaLink, and OpenCLLink *)
(* On the upside, many list-based operations are trivially parallelizable via Parallelize *)
Parallelize[Table[Length[FactorInteger[10^50+n]],{n,20}]]
(* The downside is that only a few functions seem to be natively parallelized, mostly image-related ones, and many others require manual parallelization via domain-splitting, e.g. integrals *)
sixDimensionalFunction=Function[{a,b,c,d,e,f},Re[(a*b+c)^d/e+f]];
Total[ParallelTable[NIntegrate[sixDimensionalFunction[a,b,c,d,e,f],{a,-1,1},{b,-1,1},{c,-1,1},{d,-1,1},{e,-1,1},{f,-1+i/4,-1+(i+1)/4}],{i,0,7}]]
(* Even plotting can be parallelized, see http://mathematica.stackexchange.com/questions/30391/parallelize-plotting. Inter-kernel communication is expensive, beware of the amount of data you move! *)
(* There is a Compile functionality that can translate -some- Mathematica expressions into bytecode or C code, even parallelizing, but it's quite erratic and requires planning your code from the get-go. See http://mathematica.stackexchange.com/questions/1803/how-to-compile-effectively/ http://mathematica.stackexchange.com/questions/1096/list-of-compilable-functions *)
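As an illustration of the basic Compile usage (a sketch; the timing difference will vary with version and hardware):

```mathematica
(* Compile specializes an expression for machine-number arguments, producing
   bytecode (or C, with CompilationTarget -> "C" and a compiler installed) *)
fc = Compile[{{x, _Real}}, Sin[x] + x*x];
fc[0.5]
(* Compare against uncompiled evaluation over many points *)
AbsoluteTiming[Table[Sin[x] + x*x, {x, 0., 10., 0.0001}];]
AbsoluteTiming[fc /@ Range[0., 10., 0.0001];]
```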

- Parting thoughts
Clearly, it's impossible to cover all the library functionality that Mathematica offers. But the documentation is great, and usually a bit of searching there, and failing that on the StackExchange forums, will yield a very elegant solution to most issues.
Performance can be tricky, and can require more effort than directly using native CPU and GPU languages. On the other hand, support for external CPU and GPU functions is great, and Mathematica is capable of invoking external compilers from strings of source code, so you can use Mathematica as a template metaprogramming language, even, with a bit of effort, converting its expressions into other-language equivalents (a good starting point is CForm[]). Being a very strong pattern-matching engine, quite some magic is possible.
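For simple expressions, CForm alone already gets you most of the way; a sketch (real code generation needs at least a rewrite of Power and friends into your target language's idioms):

```mathematica
(* Derive an expression symbolically, then print it in C syntax *)
expr = Simplify[D[Sin[x]^2/x, x]];
CForm[expr]
(* HornerForm can restructure polynomials for cheaper evaluation before export *)
CForm[HornerForm[1 + 2 x + 3 x^2 + 4 x^3]]
```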

Next time I might write something that shows in practice how Mathematica, via its numerical and visualization abilities, enables exploration of possible approximations of expensive rendering formulas... Stay tuned.

01 October, 2013

Goodbye Zootool, welcome Twitter

I used to have the habit of sharing interesting articles, originally via Google Reader share, then I moved to ZooTool when Google killed the share functionality. The good thing about this was that you could get an RSS feed and everybody was happy...

Nowadays RSS doesn't seem to be that hot anymore, and I'm tired of ZooTool (I originally chose it because it was the only one integrated with my iPad RSS reader... Reeder. I have since moved to Mr. Reader, if you care), so I'm killing that (effectively I did a long time ago; if you were following it, you would have noticed I haven't shared anything in a while).

Also, once upon a time I used to keep this blog "anonymous" because I thought it would give me more freedom to comment on games, graphics and so on without having people associate me with a given company. Nowadays I guess most people know whose blog this is, and certainly Google does... Also I end up being more "controversial" on outlets that are completely non-anonymous, so what's the point...

Bottom line, if you don't yet and you want to, follow me on Twitter where I will share interesting stuff I find during the day... My handle is "kenpex".

And to make this post a bit less useless... let's add a couple of blurred instagram pictures of what I've been working on in the past few weeks...




07 September, 2013

It's all about heart (making great games; how to interview)

This also appears on AltDevBlog, albeit in a shorter version (which I see few reasons not to prefer...); I decided it was best not to ramble too much on other websites.

Introduction

The original post title was "it's all about people" but I couldn't resist quoting Fight Night… Bear with me, as I don't think this will be short :) As always I talk from a perspective of a rendering engineer making videogames, but this really applies to anything. Here, take a comic and let's go...

I've had in my career the pleasure (mostly) of working with a few different studios (and teams), seeing a few others at work and talking with people from many more. And of course, as we are scientists/engineers/geeks after a while of this exposure you start wondering what's "best", or at least, what's good and bad, the science of making great games (great, not successful, that would add marketing to the equation).

Truth is though, I still don't really know. (*shocking* huh...)

I've seen great games made by twenty creatives having fun and good games made by two thousand slaves burning their life away. Smart people coding in pretty oldschool C and equally smart people coding in "modern" C++, and both parties with their reasonable reasons to do so. Wonderfully "engineered" practices shared by juniors and seniors alike, and codebases relying on hackers and their ability to work without any engineering. You get the picture...

Now, of course part of this is due to the fact that "great games" come in all sizes and genres, and there's probably not much in common to how "great" (and successful) Little Big Planet is with how "great" (and successful) Call Of Duty is.
And of course in practice a given process works best only at a given scale (similarly on how a given algorithm is best for a given problem size), something good for two people is not for twenty or two hundred.
The similarity to good programming is uncanny: over- and under-engineering are both problems, and we should apply fancier methods only when they actually save us time (and profile, always!). That said, even with all things being comparable, the ways to greatness seem to be many.

As an example of how hard success seems to be to transfer, I always use Fifa and Nba. Maybe it's not a good example; I've worked only for a very short while on Fifa and never on Nba, so don't take this as an "insider" opinion, just as something puzzling when seen from the outside...
These are two teams that seemingly make two fairly similar games, sport simulations, on huge licenses. They do that under the same company, label and studio, and a company that is notorious for its organizational practices. They were even physically close, same building, same floor... They had access to the same technology (what EA now brands as the "Ignite" engine) and shared many people too. Yet, somehow, one thrived, squashing its competition in quality, innovation and sales alike, while the other lost year after year, went through different "rewrites" and eventually led that team at EA Canada to be shut down and the franchise moved to Tiburon...
Now, surely there are reasons, and I might even have my idea of what level is most to blame, but that matters little; the reality is, success evidently is very hard to transfer...

That's not to say that there is no good and bad. Certainly there is bad, I've seen it: horrible projects with horrible processes, with horrible code, that horribly failed. There certainly is a quality to the process of making games. And there are even things that are "universal", that always or never are a good idea.
But most of what makes a big difference, I'm persuaded, is relative. And it's relative, I think, to the people you have. Not even the product you're making, that comes second, it's informed by the people you have but even if it's not (we have to make the game X), the ways you'll make it depend on the people.

Initially, the people you have are the ones you happen to have, companies are made, somehow, not much to say about that... But then you keep (or steer towards) a given bias via hiring. 
And that's why I'll spend the rest of this post on interviewing, and my half-wrong opinions on how you should interview "good" people...

Interviewing is a bidirectional communication process

When I started drafting this article (unfortunately, I wrote less than I thought I did, and ended up using nothing), I was looking for a new job, so it was a period when I was interviewing more than usual, talking with many studios (maybe even too many; to some interviews I came in shamefully unprepared), and I was reminded to finish it now as I went through a few calls as an interviewer. So, I won't claim to be good at this, but I'm not lacking experience :)

From my empirical survey of the state of the industry, I'd say the number one "offender" in the process is not realizing, or caring enough, about the fact that interviewing is a bidirectional communication.
What you ask and say is not only used by you to assess the candidate, but by the candidate to assess you and the job you're offering. You're not a university professor giving an exam; the goals of an interviewer and an interviewee are fundamentally the same: find a work relationship that makes both parties happy.

I remember when I first landed at EA, six years ago now. It was my first international job (that's to say, my first outside Italy); I flew for an entire day and was terribly jet-lagged. After interviewing with six (IIRC) teams I left with more doubts than I had before: I didn't know what to expect from the job, and I thought that if I accepted, I would be tasked with the most trivial things, as most interviews were surprisingly basic.
In the end EA made me a good offer and I accepted the job (it also helped that I thought I had no more room to grow in my company at the time). It was the right decision and a very lucky one: I landed in an incredible team, making an amazing game, and I was free from the get-go to do much more than I was doing at the "peak" of my career in Italy. But the interview process almost killed all this for me.
It wasn't an isolated incident. Since then, only a very few times was I "sold" on a job by the interview, really excited by the process. On most of the other occasions I considered it lucky if I had some fun and talked to some nice people, but I still left without knowing at all what a given job and company really look like. Unlucky if I went through everything as a chore, procrastinated on tests and kept going, motivated only by the name of the company and the role of the opening rather than true interest...

Before diving in, as a disclaimer I have to say (because I predict that would also be a comment) that I might nowadays be biased towards "seniors", as I happen to interview more experienced engineers than juniors, on average. But I don't think it fundamentally matters what position you're hiring for: the same principles apply, even if they should be implemented differently.

Eight common, bad mistakes in the technical interview
Coming up next: eight magic foods for a flat belly your doctor doesn't want you to know about

1) Wasting time.
It's good to assume that smart people are busy, and that they are in demand. So, having to spend weeks on an interview test is often not a great idea.
Now, to be fair, longer tests are not necessarily bad, and not necessarily a waste of time. They can even be fun and truly informative; e.g. interesting mini-projects done with good communication can be great, and even trying out a candidate as a contractor might not be a bad idea. But, pragmatically, chances are that good candidates are busy, and won't subject themselves to all this while interviewing with many other companies, so you might lose some great people to your lengthy test. It's a compromise; be aware of that, and make your process as long as it needs to be, but no longer.
Like good code, avoid waste.

Great companies with (in my experience) longer tests? Media Molecule, Sucker Punch (their rendering engineer code test was publicly available on their website!), Crytek. The best, short, elegant one? Sony Santa Monica.

2) Overused questions.
Simple to google, simple to memorize, boring to answer over and over again. Overused, simple questions are bad in many ways. To the experienced, they signal that you're not doing a great job interviewing people, that quality is not a priority, that you did not put much effort into crafting your interview. For "juniors", they encourage memorization (or looking things up on the net) over reasoning.
It's not hard to come up with original questions, and even slight variants of common problems are great. Why ask for the distance of a point to a line when you can ask about a sphere versus a line, or a capsule? It tests exactly the same knowledge, but applied, instead of just recited.
A common side-effect of using such "dumb" questions is the necessity of strict timing: because your questions are too simple and easily googled, you counteract not by "fixing your bug" but by asking a load of them in quick succession to "defeat" Google.
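To make the concrete example concrete: here is roughly what such a "variant" question amounts to, sketched in C++ (the names and exact formulation are mine, one of many possible ways to pose it). It's still the textbook point-vs-line distance underneath, just clamped to a segment and reused for a sphere-vs-capsule overlap test.

```cpp
#include <cmath>

// Minimal 3D vector, just enough for the example.
struct Vec3 { double x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance from point p to the segment [a, b]: the classic point-vs-line
// question, with the small twist of clamping to a segment (assumes a != b).
double distPointSegment(Vec3 p, Vec3 a, Vec3 b) {
    Vec3 ab = b - a;
    double t = dot(p - a, ab) / dot(ab, ab);
    t = t < 0.0 ? 0.0 : (t > 1.0 ? 1.0 : t); // clamp to the segment
    Vec3 closest = {a.x + ab.x * t, a.y + ab.y * t, a.z + ab.z * t};
    Vec3 d = p - closest;
    return std::sqrt(dot(d, d));
}

// Sphere vs capsule: the same math, applied instead of recited. A capsule
// is a segment with a radius; the two shapes overlap if the distance from
// the sphere center to the core segment is within the sum of the radii.
bool sphereVsCapsule(Vec3 center, double radius,
                     Vec3 a, Vec3 b, double capsuleRadius) {
    return distPointSegment(center, a, b) <= radius + capsuleRadius;
}
```

A candidate who has only memorized the point-to-line formula will stumble on the clamping and on turning a distance into an overlap test; one who understands it will find both steps obvious.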

3) Useless questions.
This is an extension and aggravation of the former: often, overused questions are also worthless, not needed. For example, questions made redundant by a follow-up that would already demonstrate the same knowledge: I ask you what a class in C++ is, then I ask you to apply one in a practical context.
Now, it's true that there is merit in making your candidate comfortable, but useless questions often come in written questionnaires where there is less stress, and often don't form a "difficulty ramp" but are just random.
Good questions sidestep the issue, though: you can, and should, design questions that are easy enough to answer trivially, but deep enough for smart candidates to dive into and provide smarter solutions.
An example is N-dimensional AABB vs AABB intersection, but there are so many deep and fascinating simple problems in computer science that there is no excuse.
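As an illustration of that AABB question (a C++ sketch; the formulation is mine), the trivial answer fits in a few lines, and the depth comes from everything around it: why per-axis overlap suffices, the connection to separating axes, early-out ordering, how you'd vectorize it, and so on.

```cpp
#include <array>

// N-dimensional axis-aligned bounding box, stored as min/max corners.
template <int N>
struct AABB {
    std::array<double, N> min, max;
};

// The "trivial" answer: two AABBs overlap iff they overlap on every axis.
// Finding a single axis where the intervals are disjoint proves separation.
template <int N>
bool intersects(const AABB<N>& a, const AABB<N>& b) {
    for (int i = 0; i < N; ++i)
        if (a.max[i] < b.min[i] || b.max[i] < a.min[i])
            return false; // separated along axis i
    return true;
}
```

A candidate can answer this trivially in 2D, then show depth by generalizing to N dimensions, arguing correctness, or discussing performance — all from one short question.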

4) Worthless questions.
Delving further into bad questions, the worst are ones that are not just too simple, overused, boring, or made useless by the structure of the test, but that are not good at all per se.
An example would be "IQ" questions, not even slightly related to the field, which some large organizations are infamously said to use (though I doubt they really do). Another is questions that test knowledge one really shouldn't need for the job.
Try your best to keep your questions relevant. Large organizations can analyze the performance of the questions they ask, but most studios won't interview enough people for the data to be really significant. A good indicator could be to "eat your own dog food" and "submit" your questions to your employees to see and rate. Ask how relevant they think they are, and how interesting, and fun...
Unfortunately, one often asks worthless questions genuinely, not realizing that's the case. At least that does communicate something, though: e.g. if you ask me about design patterns, I'll know I won't like working for you.

5) Not tailoring to the skill level.
Like good code, good questions are specialized. Not tailoring to the skill level is often associated with the previous mistakes: for the sake of generality, you tend to ask more questions and waste more time.
True, certain things have to be asked. And true, certain companies might hire people all of a similar skill level (e.g. smaller studios with mostly senior generalists), but that only means you won't need many different questionnaires; still, the one you have will be tailored to the people you're looking for.

6) Pretending to interview.
Asking what you're supposed to ask. Saying what you're supposed to say. Going through a checklist someone made somewhere, without understanding what you're really doing. This happens often with standardized tests, when interviewers don't really understand the purpose or depth of the questions.
This really hampers communication: it makes the interview not a real interaction but a standardized bureaucratic chore that leaves both parties with very little information.
Yes, you made sure that the candidate knows certain things. What does that mean? It's necessary, but by no means sufficient! A counterexample, and one of the worst things to work with, is people who have sufficient knowledge but not enough experience to know their limits, nor the openness to learn and experiment more.
Knowledge is easy to come by nowadays; much harder is finding people who understand a given domain, or who are able to produce smart ideas about it.
Ideally, interview questions are co-authored by the interviewers, and every new interviewer has a chance to discuss the process, understand it, and even add to or change it.

7) Not paying enough attention.
Pretending to interview, redux. Pretending not only makes little sense as a test of the candidate, it also usually leaves the candidate with no impression of what working at your company will be like, what its real strengths are, how it works.
Remember: just as you won't accept a candidate who merely -says- he's good at certain things, the same applies to you.
You have to show that your company is smart, you have to prove that you are smart, and you have to prove that you work in the ways you say you work! Asking the right questions is one strong signal that candidates (especially the smarter ones, the ones you want) will look for…

8) Over-engineered platforms.
Lastly, there is the sin of over-engineered testing platforms. That probably doesn't depend on you, but it can be so bad, and such a waste of time, that it will make people not want to apply to your company, or prioritize it after others that use friendlier processes.
If you need a person to jump through a hundred hoops, register on faulty online services, scan, copy and paste his entire résumé, and fill page after page of forms on his education and so on, I really hope you're getting a great return on all that neatly catalogued information, because you're surely pissing off your applicants a LOT to gather it.
You're also giving the impression of an overly bureaucratic company with a ton of management levels and an HR group that doesn't know what technical people find fun.

Going further

Your interview is as short as possible. Your questions are simple but deep. They are relevant to the job and tailored to the skill level. You understand them, you know what kind of people your company needs, you've discussed interviewing with your peers, and you're doing all this in a fun, informal process.

Great! You're doing everything right, and even with just interviews you should be able to make an impression on your candidates, paint a picture of what's valued at the company and how it works, and make a good effort at finding not just people you can use, but actual good fits. People who work the way you work, or who can learn to.

Can it go even further? I'm partial to openness, and collaboration. What can you show? Are you looking for someone that will hack through your code? Then maybe you could use some of said code in the process. Are you looking for someone that will need to work with the artists? Then you might want them in a room to brainstorm techniques to get a given visual result. The more you can incorporate of the job and of the company in the interview process, the better.
Many companies won't do that because it's complicated to disclose anything, even about past projects, even under NDA. It's a shame, and a problem that bogs those companies down; but even if you face such restrictions, can you imagine ways around them? Put in some effort, and remember that getting the right people is most probably all it takes to make great games.

Good luck.