
20 May, 2017

The technical interview.

I like interviewing people. I like going to interviews. This is how I do it.

1 - Have your objectives clear.

There is no consensus on what an interview is for. Companies work in different ways and teams have different cultures; we have to accept that there is no "one true way" of making software, of managing creative teams, or of building them.

Some companies hire very slowly, very senior staff. Others hire more junior positions more often. Some companies are highly technical, others care more about creativity and the ability for everyone to contribute to the product design. Some prefer hyper-specialized experts, others want to have only generalists and so on.

You could craft an interview process aimed at having absolutely zero false positives, probing your candidates until you know them better than they know themselves, and accept a high rate of false negatives (and thus piss off some candidates) because the company is in any case very slow in creating new openings.
Or you could argue that a strength of the company is giving a chance to candidates who would otherwise be overlooked, and thus "scout" gems.

Regardless, this means that you have to understand what your objectives are, what the company is looking for, and what specifically your role is in the overall interview process.
And that has to be tailored to the specific opening: the goals in an interview for a junior role are obviously not the same as the goals you have when interviewing for a senior one.

Personally, I tend to focus on the technical part of the interview process: that's both my strongest suit, and I often interview on behalf of other teams to provide an extra opinion. So I don't focus on cultural fit and other aspects of team dynamics.

Having your objectives clear won't just inform what to ask and how, but also how to evaluate the resulting conversations. Ideally one should always have an interview rubric: a chart that clearly defines what it is we are looking for and which traits correspond to which levels of demonstrated competence.

It's hard to make good standardized interviews, as the easiest way to standardize a test is to dumb it down to something with very rigid, predefined answers. But that doesn't mean we should not strive to engineer and structure the interview in a way that makes it as standard, repeatable, and unbiased as possible.
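
To make this concrete, here is a purely hypothetical sketch of what such a rubric could look like; the competencies, level descriptions, and scoring are invented for illustration and not taken from any real hiring process.

    # Hypothetical rubric sketch: competencies mapped to levels of demonstrated
    # competence, plus a trivial way to aggregate the notes taken during the interview.
    RUBRIC = {
        "problem solving": {
            1: "needs the solution spelled out step by step",
            2: "reaches a working solution with some hints",
            3: "solves the problem independently and can discuss trade-offs",
            4: "explores alternatives and anticipates follow-up questions",
        },
        "low-level / performance awareness": {
            1: "no mental model of how code maps to hardware",
            2: "knows the big-O story but little about the constants",
            3: "reasons about memory access patterns and real costs",
            4: "can profile, estimate, and optimize from first principles",
        },
    }

    def score(notes):
        """Average the per-competency levels recorded during the interview."""
        return sum(notes.values()) / len(notes)

    # e.g. score({"problem solving": 3, "low-level / performance awareness": 2}) -> 2.5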

2 - What (usually) I care about.

Almost all my technical interviews have three objectives:
  • Making sure that the candidate fulfills the basic requirements of the position.
  • Understanding the qualities of the candidate: strengths and weaknesses. What kind of engineer I'm dealing with.
  • Not being boring.
This is true for basically any opening; what changes are the specific things I look for.

For a junior role, my objective is usually to hire someone who, once we account for the time senior staff spend on mentoring, still manages to create positive value and doesn't end up being a net loss for the team.

In this case, I tend not to care much about previous experience. If you have something on your resume that looks interesting it might be a good starting point for discussion, but for the most part, the "requirement" part is aimed at making sure the candidate knows enough of the basics to be able to learn the specifics of the job effectively.

What university you went to (if any), what year, what grades, what projects you did: none of these things matters (much), because I want to assess the candidate's abilities directly.

When it comes to the qualities a junior should demonstrate, I look for two characteristics. First: their ability to unlearn, to be flexible.
Some junior engineers tend to misjudge their expertise (I certainly did back in the day): coming out of higher education, you get the impression that you actually know things, rather than that you were merely given the starting points from which to really learn. That disconnect is dangerous, because inordinate amounts of time can be spent trying to change someone's mindset.

The second characteristic is, of course, being eager and passionate about the field: I try to find someone who really wants to be there, wants to learn, experiment, grow.

For a senior candidate, I will still (almost) always have a "due diligence" part, unless it would be truly ridiculous because I already -know- that the candidate fulfills the requirements.
I find this to be important because, in practice, I've seen how just looking at the resume and discussing past experience does not paint the whole picture. I think I can say that with good certainty. In particular, I've found that:
  • Some people are better than others at communicating their achievements and selling their work. Some people might be brilliant but "shy", while others might be almost useless but have managed to stay around the right people and absorb enough to present a compelling case.
  • With senior candidates, it's hard to understand how "hands on" one still is. Some people can talk in great detail about stuff they didn't directly build.
  • Some people are more afraid of delving into details and breaking confidentiality than others.
  • Companies have wildly different expectations for a given role. In some, even seniors do not concern themselves daily with "lower level" programming; in others, even juniors are able to, e.g., write GPU assembly.
When it comes to the qualities of a senior engineer, things are much more varied than with juniors. My aim is not necessarily to fit a given stereotype, but to gain a clear understanding, and then see where the candidate could fit.
What are the main interests? Research? Management? Workflows? Low-level coding? The tools and infrastructure, or the end-product itself?

Lastly, there is the third point, "not being boring", which for me is one of the most interesting aspects. To me, it means two things:
  • Respect the candidate's time.
  • Be aware that an interview is always a bi-directional communication. The interviewer is under exam too!
This is entirely an outcome that is guided by the interviewer's behavior. Nonetheless, it's a fundamental objective of the interview process: you have to demonstrate that you, and your company, are smart and thoughtful.

You have to assume that any good candidate will have many opportunities for employment. Why would they choose you, if you wasted their time, made the process annoying, and didn't demonstrate that you put effort into its design? What would that telegraph about the quality of the other engineers who were hired through it?

You have to be constantly aware of what you're doing, and what you're projecting.

3 - The technical questions.

I won't delve into the details of what I ask, both because I don't want to, and because I don't think it would be of much value, since the questions are tailored to the specific roles I interview for. But I do want to talk about the process of finding good technical questions, and what I consider the main qualities of a good question.

First of all, always, ALWAYS use your own questions and make them relevant to the job and position. This is the only principle I dare say is universal: regardless of your company and objectives, it should be adhered to.

There is nothing worse than being asked the same lazy, idiotic question taken from a list of the "ten most common questions in programming interviews" off the internet.

It's an absolute sin. 
It both signals that you care so little about the process that you just recycled some textbook bullshit, and it's ineffective, because some people will simply know the answer and be able to recite it with zero thought.
That is a total waste of time; it's not even good for "due diligence", because people who prepared and learned the answer to that specific question are not necessarily knowledgeable in the field you're trying to assess.

The "trick" I use to find such questions is to just be aware during my job. You will face time to time interesting, general, small problems that are good candidates to become interview questions. Just be aware not to bias things too much to reflect your specific area of expertise.

Second, always ask questions that are simple to answer but hard to master. The best question is a conversation starter, something that allows many follow-ups in many different directions.

This serves two purposes. First, it puts people more "at ease", relaxed because they don't immediately face an impossibly convoluted problem. Second, it avoids wasting time: once you have established a problem setting, asking a couple of follow-ups is faster than asking two separate questions that each need to be set up from scratch.

Absolutely terrible are questions that are either too easy or too hard and left there. At one extreme you get no information and communicate that your interview is not aimed at selecting good candidates; at the other you frustrate the candidate, who might start closing up and missing answers even when you scale back the difficulty.

Also, absolutely avoid trick questions: problems that seem like they should have some very clever solution but really do not. Some candidates will get stuck thinking there ought to be more to the question, unable to dare try the "obvious" answer.

Third, avoid tailoring too much to your knowledge and expectations. There are undoubtedly things that must be known for a given role, that are necessary to be productive, and that are to be expected from someone who actually did certain things in the past. But there are also lots of areas where we have holes and others are experts.

Unfortunately, we can't really ask questions about things we don't know, so we have to choose among our areas of expertise. But we should avoid specializing too much, thinking that what we know is universal.

A good trick to sidestep this problem (other than knowing everything) is to have enough options that it is possible to find an area where both you and your candidate have experience. Instead of having a single, fixed checklist, have a good pool of options so you can steer things around.
This also allows you to make the interview more interactive, more like a discussion where questions emerge naturally from a chat about the candidate's experience, instead of a rigid quiz or checklist.

Lastly, make sure that each question has a purpose. Optimize for signal to noise. There are lots of questions that could be asked and little time to ask them. A good question leaves you reasonably informed that a given area has been covered. You might want to ask one or two more, but you should not need tens of questions just to probe a single area.

This can also be achieved, again, by making sure that your questions are steerable, so if you're satisfied e.g. with the insight you gained about the candidate's mastery of low-level programming and optimization, you could steer the next question towards more algorithmic concerns instead.

Conclusions.

I do not think there is a single good way of performing interviews. Some people will say that technical questions are altogether bad and to be avoided; some have strong feelings about the presence of whiteboards or computers in the room, about leaving candidates alone to solve problems or always being there chatting.

I have no particular feelings about any of these subjects. I do ask technical questions, because of the issues discussed above, and for the record, I will draw things on a whiteboard and occasionally have people collaborate on it, if it's the easiest way to explain a concept or a solution.

I do not use computers because I do not ask anything that requires real coding. The only exception is for internships: due to the large number of candidates, I pre-screen with an at-home test. That's always real coding; candidates are tasked with making some changes to a small, and hopefully fun and interesting, codebase, which I either make ad hoc from scratch or extract from suitable real-world programs.

I do not think any of this matters that much. What really matters is to take time, study, and design a process deliberately. In the real world, the worst interviews I've seen were never bad because of choosing one tool or another, but because no real thinking was behind them.

11 May, 2017

Where do GPUs come from.

A slide deck for an introduction to CG class.



PPTX - PDF (smaller)

Note: in this form this is not really a tutorial, as there are no presenter's notes. But if you want to use this scheme to teach something similar, feel free. The CPU->GPU trajectory is heavily inspired by the brilliant work Kayvon Fatahalian did.

07 May, 2017

Privacy, bubbles, and being an expert.

Privacy is not the issue.

Much has been said about the risk of losing our privacy in this era of microwaves that can turn into cameras and other awful "internet of things" things. It seems that today there is nothing you can buy that does not both intentionally spy on your behavior and prove insecure enough to let third parties do the same.

It's the wrong problem.

Especially among big, reputable companies, privacy is taken quite seriously; there is virtually zero chance of anyone spying on you as an individual. Even when it comes to anonymized data, care is taken to avoid singling out individuals. It might be unflattering, but big companies do not really care about you.

What they care about is targeting: being able to statistically know what various groups of people prefer, in order to serve them better and to sell them stuff.


CV Dazzle
The dangers of targeting.

Algorithmic targeting has two faces. There is certainly a positive side to it. Why would a company not want to make its customers happier? If I know what you like, I can help you find more things that you will like, and yes, that will drive sales, but it's driving sales by effectively providing a better service, a sort of digital concierge. Isn't that wonderful? Why would anyone not opt in to such amazing technology...

But there is a dark side to this mechanism too: the ease with which algorithms can tune into the easiest ways to keep us engaged, to provide happiness, rewards. We're running giant optimizers attached to human minds, and these optimizers have access to tons of data and can run a lot of experiments on a huge (and quite real) population of samples; no wonder they can quickly converge towards a maximum of the gratification landscape.

Is it right to do so? Is it ethical? Are we really improving the quality of life, or are we just giving out quick jolts of pleasure and engagement? Who can say?
Where is the line, for example, between keeping a player in a game because it's great, for some definition of great, and doing so because it provides compulsion loops that tap into basic brain chemistry the same way slot machines do?

Will we all end up living senseless lives attached to machines that provide for our needs, farmed like the humans in the Matrix?


Ellen Porteus

Pragmatically.

I don't know, and to be honest there are good reasons not to be a pessimist. Even just looking at the history behind us, we have had similar fears about many different technologies, and so far we have always come out on top.

We're smarter, more literate, more creative, more productive, happier, healthier, more peaceful, and richer than we have ever been, globally. It is true that technology is quickly making leaps and opening options that were unthinkable even just a decade ago, but it's also true that there is not much reason to think we cannot adapt.

And I think if we look at newer generations, we can already see this adaptation taking place. Even observing product trends, it seems to be getting harder and harder to engage people with the most basic compulsion loops and cheap content; acquiring users is increasingly hard, and the products that in practice make it onto the market do so by truly offering some positive innovation.

Struggling with bubbles.

Even if I'm not a pessimist though, there is something I still struggle with: the apparent emergence of radicalization, echo chambers, bubbles. I have to admit, this is something hard to quantify on a global scale, especially when it comes to placing the phenomenon in a historical perspective, but it just bothers me personally, and I think it's something we have to be aware of.

I think we are at a peculiar intersection today. 

On one hand, we have increasingly risen out of ignorance and started to be more concerned with the matters of the world. This might not seem to be the case looking at Trump and so on, but it's certainly true if we look at the trajectory of humanity with a bit more long-term historical perspective.

On the other hand, the kinds of problems and concerns we are presented with have increased exponentially in complexity. We are exposed to the matters of the world, and the world we live in is this enormous, interconnected beast where cause and effect get lost in the chaotic nature of interactions.

Even experts don't have easy answers, and I think we know that because most of us are experts in a field or two, and most big questions, I believe, would be answered with "it depends".
There are a myriad of local optima in the kind of problems we deal with today, and which way to go is more about what can work in a given environment, with given people, than what can be demonstrably proven to be the best direction.


Echo Chambers

The issue.

And this is where a big monster rears its head. In a world with lots of content and information, with systems that allow us to quickly connect to huge groups of like-minded people and algorithms that feed us content agreeing with our views, seeking instant satisfaction over exploration, true knowledge, and serendipity, how attractive does the dark side of confirmation bias become when we are faced with increasingly complex issues?

We all have mechanisms built into us, regardless of how smart we are, that were designed not to seek the truth but to be effective at navigating the world and its social interactions. Cognitive biases are there because they serve us; they are tools that stem from evolution. But is our world changing faster than our brains' ability to evolve?

Pragmatically again though, I don't intend to look too much at the far future (which I believe is generally futile, as you're trying to peek into a chaotic horizon). What annoys me is that even when you are aware of all this, and all these risks today, it's becoming hard to fight the system.
There is simply so much content out there, and so many algorithms around you (even if you isolate yourself from them) tuned to spread it to different groups, that finding good information is becoming hard.

Then again, I am not sure of the scale of this issue, because if we again look at things historically, we are probably still, on average, better informed today and less likely to be deceived than even just a few decades ago, when most people were not informed at all and it was much easier to control the few means of mass communication available.

Yet it unavoidably irks me to look around and be surrounded by untrustworthy content and, even worse, content made to myopically serve a narrow world view instead of trying to capture a phenomenon in all its complexity (either out of malice, or just because it's simpler and gets clicks).
Getting accurate, global data is incredibly hard, as it's increasingly valuable and thus kept hidden for competitive advantage.

John W. Tomac

Being an expert, or just decent.

I find that similar mechanisms and balances affect our professional lives, unsurprisingly. I often say that experience is a variance reduction technique: we become less likely to make mistakes, more effective, more knowledgeable, and able to quickly dismiss dangerous directions, but we also risk becoming less flexible, rooted in beliefs and principles that might not be relevant anymore.

I find no better example of these risks than in the trajectory of certain big corporations and how they managed to become irrelevant, not due to a lack of smart, talented people, but because at a certain size a company risks having a gravity of its own, and truly believing in a snapshot of a world that has meanwhile moved on. It's remarkable how so many smart people can manage to be blinded.

Experience is a trade-off. We can be more effective even if we might be more wrong. Maybe, more importantly, we risk losing the ability to discover more revolutionary ideas.
How much should we be open to exploration, and how much should we be focused on what we do best? How much should we seek diversity in a team, and how much should we value cohesion and unity of vision? I find these to be quite hard questions.

I don't have answers, but I do have some principles I believe might be useful. The first has to do with ego, and here it helps not to have a well-developed one to begin with, because my suggestion is to go out and seek critique, to "kill your darlings".
This was taught to me at an early age by an artist friend who was always critical of my work and who, when I protested, pointed out that the only way to get better is to find people willing to trash what you do.

In practice, I think that we should be more severe, critical, and doubtful of what we love and believe than of anything else. We should hold our own ideas and social groups to a higher standard of scrutiny than we do things that are alien to us.

The second principle that I believe can help is to encourage exploration, discovery, experimentation, and failure. Going outside our comfort zones is always hard, but facing failure is even harder; we don't like to fail, obviously and for good reason.
So one cannot achieve these goals without setting up some small, safe spaces where exploration is easier and not burdened by too much early judgment (I would say unconstrained, but certain other constraints actually do help).

Lastly, be aware that however much you know about all this, and however willing you are to act, many times you will not. I don't always follow my own principles, and I think that's normal. I try to be aware of these mechanisms, though. And even there, the keyword is "try".

Epilog: effecting change.

I believe that polarizing, blinding, myopic forces are at work everywhere, in our personal and professional lives and in society at large, and being aware of them is important even just to try to navigate our world.

But if instead of just navigating the world one wants to actually effect change, then it's imperative to understand the fight that lies ahead.

The worst thing one can ever do is to feed the forces of polarization, catering to our own side and scaring away people who might have been willing to consider our ideals. It does not help; it damages.

Catering to our own enclaves, rallying our people, is easy and tempting and fulfilling. It's not useless, certainly; there is value in reaffirming people who are already inclined to be on our side. But there is much more to be gained in reaching out to someone on the opposite side, or undecided in the middle, and even just instilling a doubt, than in solidifying the beliefs of people who already share ours.

You can even look at current events, elections and the way they are won.

Understanding people who think differently than us, applying empathy, extending reach, is so much harder. But it's also the only smart choice.



06 May, 2017

Shadow mystery

Can shadows cause the texture UVs to shift?

This was a bug assigned to one of our engineers. Puzzling. Instead of being really useful, I started to investigate with some offline rendering.


Look at the shadows of the disappearing pillar

Wow! Mental Ray is broken :)

So apparently yes, you can easily create this optical illusion. It's pretty easy to understand why: with a constant albedo, especially, the apparent texture of the surface comes entirely from shading. If the light moves, the highlights traverse the surface, and that creates an illusion similar to a texture shift of just a few pixels.

The effect is much stronger when the ambient light creates a highlight opposite the main light: when the main light is shadowed, the ambient highlight dominates and the shift becomes apparent.

On a render where the ambient and main highlights are on the same side, the effect is much less pronounced.



The nail in the coffin, though, was when I managed to reproduce the same effect in real life. So it's definitely an optical illusion that can happen, but it's probably made worse by realtime rendering applying unshadowed ambient/GI to normal maps, and by the fact that shadows abruptly cancel the sun/sky light instead of gradually blocking only some rays and rolling the highlight direction off across the penumbra region.
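
To see the mechanism in isolation, here is a minimal toy sketch (my own illustration under these assumptions, not the original scene or any renderer's code): a small 1D bump is shaded by a main light plus an unshadowed fill light from the opposite side, and we track where the brightest point lands with and without the shadow. When the shadow term abruptly kills the main light, the brightest point jumps across the bump, which the eye reads as the surface texture shifting.

    import math

    def shade(x, shadowed):
        # Normal of a small sinusoidal bump h(x) = 0.4*sin(x): n = normalize(-h'(x), 1).
        slope = 0.4 * math.cos(x)
        nx, ny = -slope, 1.0
        inv = 1.0 / math.hypot(nx, ny)
        nx, ny = nx * inv, ny * inv

        main = (0.6, 0.8)   # main light, coming from the right
        fill = (-0.6, 0.8)  # unshadowed ambient/fill light, coming from the left

        n_dot_main = max(0.0, nx * main[0] + ny * main[1])
        n_dot_fill = max(0.0, nx * fill[0] + ny * fill[1])

        # The shadow abruptly cancels only the main light; the fill keeps shading the
        # normals, so its opposite-side highlight takes over.
        return (0.0 if shadowed else 1.0) * n_dot_main + 0.3 * n_dot_fill

    def brightest(shadowed):
        xs = [i * 0.01 for i in range(int(2.0 * math.pi / 0.01))]
        return max(xs, key=lambda x: shade(x, shadowed))

    print("brightest point, lit:      x = %.2f" % brightest(False))
    print("brightest point, shadowed: x = %.2f" % brightest(True))
    # The maximum moves across the bump when the shadow kicks in, which reads as the
    # texture shifting by a few pixels even though the UVs never change.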