Super Senso Announced!

The game I’ve been working on at Turbo Studios for the past 2 years has finally been announced! Super Senso (to be released early 2016) is a turn-based strategy game for iOS and Android. The easiest way to describe Super Senso is as a free-to-play mobile remix of Advance Wars, but there’s tons more to it than that, including but not limited to giant-ass robots.

Here are some screenshots and GIFs highlighting my work on this game. Check out my portfolio post for more details.

Shortcut Mapper: A Visual Shortcuts Explorer for Desktop Applications

A year ago I created an experimental web application to make it easier to learn Blender shortcuts. Since then I’ve been working on a new version, and today I’m officially releasing Application Shortcut Mapper: a visual shortcuts explorer for desktop applications.

(Screenshot: Shortcut Mapper overview)

It has support for multiple applications, versions, operating systems and keyboard layouts. I’ve added Photoshop, Lightroom and Blender to start with, and I hope the internet helps out by adding many more applications.

For the Adobe applications, the data is scraped from the online documentation. For Blender, all shortcuts are accessible via its Python API, which is really awesome because everything can be exported cleanly.
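To give a concrete sense of what that enables, here’s a minimal sketch that dumps every keymap entry from inside Blender’s scripting console. The property names come from bpy’s keymap API; the output format is just for illustration and isn’t ShortcutMapper’s actual export format.

```python
# Minimal sketch: list Blender's shortcuts from inside Blender.
# Iterates the default keyconfig exposed by bpy and prints one line
# per keymap item. Output format is illustrative only.
import bpy

wm = bpy.context.window_manager
for km in wm.keyconfigs.default.keymaps:
    for kmi in km.keymap_items:
        if kmi.type == 'NONE':
            continue
        mods = "".join(label for label, on in (("Ctrl+", kmi.ctrl),
                                               ("Alt+", kmi.alt),
                                               ("Shift+", kmi.shift)) if on)
        print(f"{km.name}: {mods}{kmi.type} -> {kmi.name}")
```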

It is completely open source and hosted on GitHub Pages, which has the benefit that every commit to the gh-pages branch is instantly live and viewable by everyone.

Check it out yo! http://waldobronchart.github.io/ShortcutMapper/

Multimania 2011 Talk: Working At Black Rock Studio

Back in 2011, after my internship at Black Rock Studio, I was asked to give a talk at MultiMania about my experience as a Technical Artist in the industry. Now, 2 years later, I’ve finally found the courage to upload it and fully expose myself to the judgment of the internet. Scary!

The audience consisted mainly of students from the Digital Arts and Entertainment course (the game dev course I did). I talked about how I landed the job, what it was like to work there, what I worked on, what a Technical Artist meant to Black Rock, and finally a short bit at the end about Black Rock’s closure.

Ambient Light TV with a Raspberry Pi and a Webcam (Part 2)

In this post I’ll go into more detail about how I built the ambient light rig. Here’s the previous part in case you missed it.

I’m not the first one to make one of these, so this is not going to be a perfect instructable. Instead, I’ll go deeper into the things I did differently (like the camera), and link to the resources I used for all the other stuff. Here we go!

Most rigs out there use an Arduino linked to a PC, which has the benefit of being relatively easy to implement, but the downside is that it only works with a video stream coming directly from the PC’s graphics card (for most people, this is acceptable). You can also buy ready-made systems that work in a similar way, like LiveLight.

I decided to go all out. I wanted every feed to work: PC, cable TV, Apple TV and my gaming consoles.

For this to work, I needed access to all the feeds going into my TV. Intercepting an HDMI feed before it reaches the TV is technically possible, but the gear I’d need was way too expensive. So the only cheap alternative was a webcam, knowing that frame rate, lag and color inaccuracies were going to be an issue.

The camera is best placed pointing directly at the screen to avoid glare and viewing-angle color inaccuracies. In my case, I only had two viable options:

As you can see in the second picture, I had to close the curtains because at that angle the TV reflects everything from the window adjacent to it. No biggie; I prefer to close the curtains when watching a movie or playing games anyway.

Pure blacks tend to come out a bit bluish too, but I can live with that. It actually has a nice effect when the screen goes dark: the whole rig glows blue, providing some light in the room.

I used a Logitech C250 (because I already owned one). It can capture at different resolutions and has a manual focus ring. The manual focus ring is great, because you can intentionally set it out of focus to get color averaging for free!


The Adalight tutorial was a good guide for building the frame. This is what I used:
– 1x Raspberry Pi (any version works)
– 2x 12mm Diffused Flat Digital RGB LED Pixels (Strand of 25)
– 1x Stontronics 5V DC 4A Power Supply Unit T2502ST (Powers both the LEDs and the Pi)
– 1x 100x75x40mm Housing for the Pi
– 1x 2.1mm Power Socket (Rated 5A)
– 1x Rocker Switch
– 1x Micro USB
– Some grommets
– 4x Angled aluminium profiles
– Small nuts and bolts to attach the profiles
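For what it’s worth, those 12mm pixel strands are WS2801-based: 3 bytes per pixel, clocked in over SPI, with the frame latching once the clock sits idle for about 500µs. The server in this post drives them from C++ via wiringPi, but here’s a hypothetical Python sketch of the same idea, assuming the strand’s clock and data lines are wired to the Pi’s hardware SPI pins (byte order can vary between strands):

```python
# Hypothetical sketch: push colors to a WS2801 strand over the Pi's
# hardware SPI. The real server in this post uses wiringPi from C++;
# the wiring to /dev/spidev0.0 is an assumption.
import time
import spidev  # pip install spidev

NUM_LEDS = 50

spi = spidev.SpiDev()
spi.open(0, 0)              # bus 0, device 0 -> /dev/spidev0.0
spi.max_speed_hz = 1000000  # 1 MHz is plenty for 50 pixels

def show(colors):
    """colors: list of (r, g, b) tuples, one per LED."""
    spi.writebytes([channel for rgb in colors for channel in rgb])
    time.sleep(0.001)  # >500us of idle clock latches the frame

# Example: light the whole strand dim blue.
show([(0, 0, 40)] * NUM_LEDS)
```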

At this point, I started working on the server code for the Raspberry Pi. The first implementation was a Python program running in Debian’s LXDE GUI mode. This approach failed for two reasons:
– The GUI mode used too many resources on the Pi, so the experience was quite sluggish
– The Python GPIO library (wiringPi) I was using wasn’t fast enough to update 50 LEDs

So I decided to go for a full-on C++ implementation using the wiringPi library. I wrote all the code on my PC and pushed it to the Pi using git and an SSH client. The source code for the server is hosted on GitHub. I’m working on a readme, but the code should be pretty easy to follow if you’re familiar with C++.

I used OpenCV to capture frames from the webcam, jansson as a JSON library for saving settings, Log4CPlus as a logging library, and Boost for network communication with a client program running on my PC.

The client program is used to tweak the camera’s color settings and to set the capture bounds of the frame. It communicates directly with the server on the Raspberry Pi. I’ll go into more detail on the client app in Part 3. Here’s how it looks:

At this point I had everything pretty much working, and the main task now was to clean everything up and attach the Pi to the frame.

Capturing at the full 640×480 resolution was really slow; I suspect this is due to the limited USB bandwidth on the Pi. So I had to go all the way down to the lowest capture size, 160×120, which gives me a relatively stable capture framerate of 20 FPS.
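Requesting the lower resolution looks something like this; a Python sketch of what the server does through OpenCV’s C++ API (whether the driver honors the request varies by camera):

```python
# Sketch: ask the webcam for a low capture resolution via OpenCV.
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 160)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 120)

ok, frame = cap.read()
print(frame.shape if ok else "capture failed")  # expect (120, 160, 3)
```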

The LED frame lags around 2 frames behind the TV. It’s noticeable, but I’ve found that you eventually get used to it. Blending previous frames together with the current one also helps a lot, giving smoother color transitions on the LEDs.
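The blending is just exponential smoothing; here’s a sketch of the idea, with a made-up blend factor rather than the rig’s tuned value:

```python
# Sketch: smooth LED colors by blending each new frame's samples with
# the previous output. BLEND is a made-up starting point, not the
# rig's tuned value.
import numpy as np

BLEND = 0.6  # weight of the new frame; lower = smoother but laggier

led_state = np.zeros((50, 3), dtype=np.float32)

def update(new_colors):
    """new_colors: (50, 3) float array sampled from the current frame."""
    global led_state
    led_state = BLEND * new_colors + (1.0 - BLEND) * led_state
    return led_state.astype(np.uint8)  # ready to send to the strand
```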

The other reason for the slow framerate is the webcam’s Auto Exposure setting: when the image on screen is dark, the webcam takes longer to capture the frame (longer exposure). The driver I’m currently using on the Pi (under Debian Linux) doesn’t let me set a fixed exposure. I’m still trying alternatives and will post once I have a solution.

All in all, I’m pretty pleased with the results. If I manage to turn off Auto Exposure, I think I might be able to hit 30 FPS.
Ninja Edit: I finally managed to set a fixed exposure, and I’m getting a smooth 30 FPS now!
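I won’t claim this is exactly what did the trick here, but for anyone reproducing the build on Linux, one common route is the V4L2 controls that OpenCV exposes. The magic values are driver-dependent, so treat this as a starting point rather than a recipe:

```python
# Assumption: a V4L2-backed webcam on Linux. On many of these drivers,
# 0.25 switches auto exposure to manual mode; the exposure scale itself
# varies per driver, so the values below are only a starting point.
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 0.25)  # manual mode on many V4L2 drivers
cap.set(cv2.CAP_PROP_EXPOSURE, 0.05)       # fixed exposure, driver-specific scale
```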

I’m eventually going to write Part 3, which is more about the client. All the code is on GitHub, so don’t wait up.

If you’re interested in building one of these, here are some resources that got me started:
http://siliconrepublic.blogspot.de/2011/02/arduino-based-pc-ambient-lighting.html
http://learn.adafruit.com/adalight-diy-ambient-tv-lighting/overview

Ambient Light TV with a Raspberry Pi and a Webcam (Part 1)

After finishing my Arduino 8×8 LED screen, I got a Raspberry Pi. This is what I made with it: a 50-pixel RGB ambient light rig for my TV!

The colors are sampled from the edges of the TV screen using a webcam. After sampling the captured frame, each RGB LED is updated with the matching color on screen. Watching movies and playing games with it turned on is awesome!
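A rough sketch of the sampling idea, assuming the frame has already been cropped to just the TV screen (the LED count and border thickness here are illustrative, not the rig’s real layout):

```python
# Sketch: average the border pixels of a captured frame into one color
# per LED. Assumes `frame` is already cropped to the TV screen; shown
# for the top edge only, the other edges work the same way.
import numpy as np

def sample_top_edge(frame, leds_per_edge=12, border=10):
    strip = frame[:border, :, :]  # top `border` rows of pixels
    cells = np.array_split(strip, leds_per_edge, axis=1)
    return [cell.reshape(-1, 3).mean(axis=0) for cell in cells]
```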

The initial prototype used an Arduino: the color sampling was done on the PC and sent to the Arduino over serial. However, with that setup I could only use it when my computer was the input feed (monitor cloned to the TV). I also managed to break my Arduino, which forced me to switch to the Pi.

With the webcam approach, I can use it with any feed: cable TV, Xbox, Apple TV, Steam Big Picture and the Raspberry Pi itself. I briefly looked into HDMI splitters and frame grabbers, but those were all way too expensive.

I took a load of work-in-progress pictures while making it; I’ll post those in Part 2. Overall it took about 6 weekends to complete, spread over a period of 7 months.

Read part 2 here!