Quick Mario Storyboard

In one of Tony’s lessons we had to pick a game and draw a storyboard for it. I did a very simple one for a level of Mario, where you get powers, defeat enemies, grab the flag and run into the castle.

This was an experiment in game storyboarding as opposed to film. I focussed on the actions and general obstacles and goals of the level, rather than specific details and timings.

I drew this in Photoshop and enjoyed drawing an existing character without trying to copy the character’s style exactly, as it is only a storyboard.

I liked the bright outcome of this drawing.



The Future of Motion Control?

Motion controllers have been around since very early in the history of gaming. The first gaming joystick was invented in 1967 for the Odyssey by Ralph H. Baer. These joysticks worked as motion control because you held them in your hand, and where you moved your hand (though not far) was what controlled the character on screen.

The joystick developed over time. The Atari standard joystick had a button at the top to press with your thumb to fire in games. Eventually joysticks could calculate the angle and force of a movement more accurately, and were used mainly for flight simulators and arcade games, slowly going out of fashion.


When the Nintendo Wii came out in 2006 the Wii Remote (or Wiimote) debuted. You held it like a joystick, except it was wireless and ran on motion sensor controls. A lot of games were built purely around the controller’s unique functionality, though it was also equipped with buttons for a wider range of games. It worked with a sensor bar placed above the TV, rather than being mounted to a base with a wire.


Similar controllers were made for PlayStation Move, which had motion sensors and were tracked with a camera. Since then Nintendo has developed smaller motion controllers for the Switch, and a lot of developers have been working on a new wave of VR gaming. New headsets and better technology allow players to immerse themselves in increasingly realistic games and use handheld controls with their headsets.


When I look at the future I see people attempting to make more developments on motion controls. Wii Fit and Dance Dance Revolution have used your feet. Wiimotes and PlayStation Move used your hands. EyeToy and Kinect use cameras. VR uses your head. What if someone made a combination of all of these to make the ultimate motion control technology, or stripped it back to only the best parts of previous control inventions?

I will use this information and the questions I have to write an essay about Emerging Technologies.

Job Roles

I’ve researched my ideal future many times, but it usually leads to places very far away. I would love to do concept art and animation for a game or movie company, and I used to only look at Pixar, Disney, DreamWorks etc., but since then I have opened my mind to working at many other companies in Britain. Rare is a good example of a company I would be interested in, as they made some of the games I grew up on- such as old Donkey Kong games.

I have always wanted to be a concept artist, and aimed to do a degree in concept art, but over the last year I have begun learning something I never thought I would get to do- 3D animation. Art is still important, but animation is new to me and is something I tried not to aim for when I was younger, as I didn’t think I would ever be able to learn it- unless I did it at university- but I took the opportunity to do it now, as soon as I could. If no apprenticeships are available to go for, I am considering the best degree I can find that covers those two subjects. It is still early days deciding where I want to be in a year and a bit, but deciding my specialisation for next year is a start.

I want to one day design for and/or animate for a game or movie production. There are different ways of getting there; some definitely require degrees in areas I am looking into, so I will likely do one, but that isn’t the only way. The universities I have heard offer courses I might look into are Sunderland, Teesside and Sheffield Hallam. I will look into more over summer.

Marvel-lous Guests

In a turn of events last week, Marvel came to my town to film scenes of the Avengers in Durham Cathedral.

I love Marvel movies, and a lot of us think of them whenever we’re asked to talk about VFX and what our goals would be with it. It was so close to where I live that I had to see what was going on, and it was fascinating. Lots of vans were parked outside, each for different things such as lighting, props and more. It showed how much is needed just to film a scene in one location.

The stained glass windows were totally covered too; I assume on the inside there were either green screens or black screens to stop the real sky being shown, so they can make it look like a different location. The fact that they did that shows a huge amount of physical set-up is used during filming to help the post-production VFX. I heard the scene is set in Asgard, and the inside of the Cathedral had some harnesses, green screens and props in place, so the windows will probably show that.

It was interesting to go to, and the atmosphere was different to regular Durham. In summer there are usually people around the cathedral, but this time (though you can’t totally tell) lots of people were huddled on the grass and talking to security guards, trying to see inside. Some people signed up to be extras and some met Chris Hemsworth nearby. I was just happy to finally see a real movie set, which is WAY more complex than I would expect for a non-studio scene. I realised more prep goes into the equipment- even in outside locations- than I guessed.

Hardware Limitations and Breaking Expectations

Today we are thinking about the way hardware, particularly the controllers, for games can affect the design of the game. A lot of the time it is a decision of whether the developer will fit their games into the limitations by cutting or changing controls, breaking the limitations by using the quirks of the controllers to their advantage, or designing the game based totally on the unique functionality of the controller/console.

The first game that came to mind for using the controller’s uniqueness to the advantage of the game was Let’s Tap for Wii. The Wii Remote was a revolutionary step for Nintendo, which influenced other consoles to make motion sensor controllers. Games like WarioWare, Wii Sports, Wii Party, and Mario Kart, to name a few, showed users how to use every function of the controller to do so many different things, including adding attachments (steering wheels, rackets etc.) to the controller to make the experience more specific to the game. Let’s Tap was a simple idea- place the Wii Remote face down on a cardboard box, and tap the box to make the figure on the screen run and jump through different levels and races. This idea is sometimes used in touchscreen games, where tapping the screen makes the player move, but Let’s Tap just used the remote’s sensitivity to control the game. The game is targeted at all age groups, and can easily be picked up. One simple instruction to follow- “tap”- is universal and easy to understand.
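Detecting a tap through a box comes down to spotting a sudden spike in the remote’s vibration readings. Here is a rough Python sketch of that idea- the function, threshold and sample values are entirely my own invention, not how Let’s Tap actually works:

```python
# Hypothetical sketch: treat a sudden jump in accelerometer magnitude
# as a "tap", with a short cooldown so one tap isn't counted twice.

def detect_taps(samples, threshold=1.5, cooldown=3):
    """Return the indices where a tap (sudden spike) occurs."""
    taps = []
    skip_until = 0
    for i in range(1, len(samples)):
        if i < skip_until:
            continue  # still inside the cooldown window
        if samples[i] - samples[i - 1] > threshold:
            taps.append(i)
            skip_until = i + cooldown
    return taps

# A flat signal with two sharp spikes -> two taps detected.
readings = [1.0, 1.0, 1.1, 4.0, 1.0, 1.0, 1.0, 3.8, 1.0]
print(detect_taps(readings))  # [3, 7]
```

The appeal of the design is that this one signal- “a spike happened”- is the entire control scheme, which is why the game is so easy to pick up.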


Nintendogs for the original Nintendo DS was limited to a few buttons, but was one of the earliest games to grace the Dual Screen handheld. The touchscreen was probably the most used control in the entire game. When using the touchscreen for menus, you could watch over your dogs on the top screen (just like the Game Boy Advance screen), and it was the same for almost all activities in the game. Most of the time you would use your stylus to care for your dogs, from scrubbing them down in the bath, to holding their leash whilst walking. You could even train your dogs to do tricks and throw them frisbees with the touchscreen, then enter them into competitions as their skills progressed. The DS also has a mic, which for the first time you could use to train your dog to recognise its name in your voice. Nintendogs took total advantage of the new screen, rather than relying on the Game Boy-esque buttons. In fact it almost ignores the buttons altogether. The game is directed at young children, so the lack of buttons helps teach them to just touch what they want, which gets younger audiences to play DS.

The Sims series was something that only used one main control on PC, the mouse. It is aimed at 12+ year olds because of some of its grown-up content. For a game that is based on human lives- the daily routines of a household of people in a town of characters with individual traits- it seems odd that everything is based on a selection, but multiple choice just works so well for the franchise. Yes, you can make a person, in recent games more unique than ever, and raise them, letting them form deep connections and complex careers (and even have babies), but you can’t live an entire life through them. You can’t type in something to say to another Sim; you need to pick a topic. You can’t write a book which other Sims will understand, so you just let them write their own in ‘Simlish’. The multiple choice gameplay eliminates the idea of such complicated personal touches, and the fact that they speak a made-up language means the player only wants the characters to bring emotions to other Sims, rather than know exactly what they are saying. Movements are even multiple choice: you select where you want to go and how you want to get there, rather than using arrow keys. This places the movement in a queue behind the actions the player is already doing. This makes the game manageable, and takes away 100% control, so it almost feels like you’re controlling the events of a reality TV show.


If The Sims let you move your Sim in real time with your arrow keys, they wouldn’t be able to complete the actions you’d set (talk to a Sim, go to the toilet, go to bed) because they need to interact with the object. There is no use for the keys if they have no tasks either, because you still have the option to click and select a destination. Without the use of cheat codes and mods there’s not much use for any of the keyboard keys. If there are shortcuts, they can still be completed through selection- which explains why they have mobile apps and games on consoles too (which are more limited).
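The queued, multiple-choice control scheme could be sketched like this- a minimal Python illustration with invented names, not anything from the actual game:

```python
# Hypothetical sketch of a Sims-style action queue: the player picks
# actions from a menu, and the Sim works through them in order,
# rather than being steered directly with arrow keys.

from collections import deque

class Sim:
    def __init__(self, name):
        self.name = name
        self.queue = deque()  # actions waiting to be carried out

    def choose(self, action):
        """Queue an action picked from a menu- the only way to direct a Sim."""
        self.queue.append(action)

    def update(self):
        """Complete the next queued action, if any."""
        if self.queue:
            return f"{self.name}: {self.queue.popleft()}"
        return f"{self.name}: idle"

sim = Sim("Bob")
sim.choose("walk to bathroom")  # movement is queued too,
sim.choose("use toilet")        # behind the actions before it
print(sim.update())  # Bob: walk to bathroom
print(sim.update())  # Bob: use toilet
print(sim.update())  # Bob: idle
```

Because everything- including movement- goes through the queue, the player directs events rather than puppeteering the character, which is exactly the reality-TV feeling described above.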

VFX Studio Research and New Ideas

Our first lesson with Gary introduced us to another side of VFX. We were told we could start coming up with ideas for our own project and were shown how movies use so much VFX- even in scenes you wouldn’t expect.

I know now the extent of what layers of VFX can do to create an entire realistic scene for a film, pictures on top of footage overlapping one another until you have something that looks totally believable.

Firstly, though, we were asked to look at some VFX studios and pick one to write about and, though I could choose one I recognised more, I decided to look into ‘Milk’ because I’d not heard of them before now and they’ve worked on a few things I’ve seen.

Milk Studios is based in London and Cardiff, and is an independent VFX company. The company is owned by Will Cohen, Sara Bennett, Nicolas Hernandez, Jean-Claude Deguara, and Murray Barber.

In the past few years they’ve won awards, such as an Oscar for the feature film Ex Machina, an Emmy for Sherlock (The Abominable Bride special) and 3 BAFTAs.

They’ve worked on other things, such as Doctor Who, Divergent, Snow White and the Huntsman, Thunderbirds, and more.

The studios are currently looking for Lighting TDs, Senior animators, Modellers, Texturers, and CG and VFX supervisors. I’d personally be interested in the animation.

I would want to animate for a big production company because I’ve had a passion to learn the craft for as long as I can remember, and I finally am. A VFX studio like Milk would allow me to use those effects to enhance my work and incorporate it into scenes.

To submit applications they ask for CVs and showreels to show off your talent.

I really like the look of this company; they definitely pulled off a brilliant twist in Sherlock, their work is really impressive, and it is inspiring me to take more interest in what I make in VFX.

I’m really looking forward to doing my first original VFX project with my own footage, ideas and direction. I have some way to go first but I’m excited to learn!

Game Jam Final Day

Sunday was for finishing the game off. We had a lot of code left to do, including the random generator, which Luke kindly helped us with. It made the waves spawn in on the bass strings and fly left along the level. Then we programmed the waves to be a trigger that kills the character, Rick, on collision. Over time the waves also get faster, and will only spawn on 2 strings at once, never more.
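The spawner rules could be sketched roughly like this- a Python illustration only (the names and numbers are made up; our actual game was built in Unity):

```python
# Hypothetical sketch of the wave spawner rules: waves appear on
# randomly chosen strings, at most two strings are occupied at once,
# and wave speed ramps up as time passes.

import random

NUM_STRINGS = 4

def spawn_waves(active_strings, elapsed_time, base_speed=2.0, ramp=0.5):
    """Pick strings for new waves and the speed they travel left at.

    `active_strings` is the set of strings that already have a wave;
    new waves only spawn while fewer than two strings are occupied.
    """
    speed = base_speed + ramp * elapsed_time  # waves get faster over time
    new_waves = []
    free = [s for s in range(NUM_STRINGS) if s not in active_strings]
    while len(active_strings) < 2 and free:
        string = random.choice(free)
        free.remove(string)
        active_strings.add(string)
        new_waves.append((string, speed))
    return new_waves

# With no waves active, exactly two spawn; speed grows with elapsed time.
waves = spawn_waves(set(), elapsed_time=10.0)
print(len(waves))     # 2
print(waves[0][1])    # 7.0
```

In the real game the same idea would sit in a Unity script, with the collision trigger killing Rick when a wave touches him.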
I had to animate Rick on Sunday to make him bounce as he stood, so he wasn’t static. I selected each limb on his body separately and imported each one into Flash as a separate file on the stage. I animated each piece frame-by-frame and exported it. Ant then turned this into a sprite sheet and imported it into Unity. This replaced our static character with a moving one.
Jack created a main menu page and I wrote instructions on it; we named the game Bass Booster, and I did a game over screen. I took inspiration from the Donkey Kong game over screen, editing Rick to be injured and bandaged up, accompanied by the game over text. Kyle then put a button on the page that linked back to the menu.
On the menu there were two buttons, one leading to the game, and one to a credits scene we made.

Jack’s music was imported after that. He edited his tracks into a compilation and we put them on loop in the game, so it had variety. He then chose tracks to put on the menu, credits, and game over screens.

Kyle and Ant fixed bugs in the game to stop Rick jumping off screen. Then, finally, we were pretty much done! Afterwards, we uploaded the game to the Global Game Jam and sat down to watch Ant live stream our games, one team at a time, on Twitch.

Overall I really enjoyed the weekend and I’m proud of what we accomplished. I wish it could be refined a little more, but in two days we made a whole lot, and it is a completed game- which seemed a bit impossible before now.

Game Jam Day One and Two



Though I’m only 17, I joined in with the #NGenGGJ this weekend. For the Waves theme, my group and I decided to make a game based on sound waves.

Our game is themed around a bass guitar with sound waves travelling down the strings. I am the artist, so I designed “Rick the Pick”, our character, and the game background of the bass guitar.

On Friday night I drew a lot of ideas for Rick’s features around an empty plectrum, then checked with my group whether they liked the same ones as me, and created the final design based on that. I drew it out in Promarker pens and then scanned the drawings in.


At college on Saturday I used my Rick design to paint him digitally. I painted over the entire scan in Photoshop to make him look how I envisioned him when I first drew him. I am new to Photoshop this year, but I used the graphics tablet to draw some of it with the pen, which made it feel far more familiar to me. I am very pleased with how he turned out, and the college even tweeted him out- which was cool.

I then designed the game scene, a horizontal front facing bass guitar- cropped to the strings and pickups.


In game there are 4 strings over the top of this, but they are platforms, so are separate files from Flash that Kyle made. Kyle, Jack and Owen did the code, Jack did music and is working on a main menu, and I did the art and may do animation on my character. I am happy with how it is going so far, and everyone’s work looks great.

Analysing a Pixar Short

I am analysing this Pixar short, Mike’s New Car, for its use of the 12 Principles of Animation:

At the very beginning we have anticipation, where Mike and Sully are behind the door, then in front of it, and then after Mike walks away Sully opens his eyes and sees something we cannot yet. This is suspenseful and makes the viewer wonder what it is he can see.

Mike goes through a few emotional states while Sully keeps adjusting his seat; at first we see squash and stretch as Sully has to squeeze into the vehicle, then slowly relaxes as his seat moves down, only to squash into the side afterwards. Then we see appeal as Mike reacts to the situations and grows frustrated until they both are a bit shocked by the beeping.

When Mike is standing at the front of the car and the bonnet is closed on his fingers, there is a pause before he jumps and screams in pain. This is an example of squash and stretch, and also of exaggeration, because he lifts off the ground. Immediately afterwards he is launched into the air by the same bonnet, and lands on the machinery inside, causing him to spin, jump, and scream in pain; all of this is secondary action, and his arms flinging around is overlapping action.

The scene’s setup is an example of staging, because you are outside the car looking in through the window like a pedestrian on the street, so it feels realistic, and you can see the action through the big front window. When the car totally malfunctions near the end, the camera shows everything at once and then pans out so you see the car from a distance, with all of the action happening in just one part of the street.

My Portfolio Presentation

I’m back blogging after Christmas (merry Christmas) and I’m going to do a post about my portfolio I presented earlier this month to Matt, and how it went.

For my portfolio I included work from all of the subjects we cover, and in the presentation they are categorised and titled in the top right of each slide. This is my portfolio. It was quite difficult to decide which work was best to put in and how to present it, as I’ve never done this before, but I was happy that this was my best attempt at doing it alone. Now I have a lot of feedback on areas I need to work on, as well as areas I did well in, to improve it for the end of year portfolio.

I’m aiming as high as I can get and I am working hard. I’m pleased that at the minute I have been told I could get a pass showing this at the end of the year, arguably a merit, and that if I follow Matt’s advice it could help me reach a distinction grade in the future. If I focus on perfecting work I have more interest in, and keep building up skills in areas I find difficult, I should end up with a better grade. For example, if I focus on my animations, instead of doing lots of average ones I could do fewer of them but at a higher quality, to show I can incorporate as many skills as I can into just one or a few pieces of work.

Also, in the presentation itself I don’t need to worry as much as I did about the time limit, as what I had to say went by quickly even though I thought I’d go over time. This means I can space my work out onto more slides to show it better and pace myself. I need to present more work I do outside of college, and show I can use the skills we learn to make my own projects too.

Another pointer (and something I was lacking in my presentation) was to include more of my thought processes behind my work: concept art with notes and the developments up to a final piece, notes of ideas for a game, storyboards for animations, etc. I like showing my thought process, so I will be including far more of this, in a way that makes sense, next time.

Finally, another bit of advice was to make sure all work on paper is scanned in to improve the quality, even for early concepts, and to work on my lighting and rendering to get better images from Maya, to show my models more clearly.

The feedback I got made me feel confident that I’m on the right path, and I’m very happy I know where I need to improve and how to do it.