Creative Strategies: UV mapping and texturing credits

With the creation of our characters coming to an end, all that was left to do was lay out the UVs for each character and texture them. Unfortunately, no one in the team knew how to UV unwrap or was particularly confident in texturing, so we sought the outside help of our lecturer Alec Parkin and fellow student Kerry McCormick.

UV mapping by Alec Parkin

Mother Bird UVs

[Images: mother bird UV layouts]

Baby Bird UVs

[Images: baby bird UV layouts]

Textures by Kerry McCormick

Mother Bird Texture

[Image: mother bird texture]

Baby Bird Texture

[Image: baby bird texture]

Can’t thank them enough!

In addition, this has made me aware of a huge gap in my abilities (UV mapping and texturing characters). As a result, I realise that I need to practise these skills a lot more, to the point where I’m comfortable completing tasks such as UV unwrapping and texturing myself in the future.


Creative Strategies: Christian’s Environmental Work

Christian took it upon himself to create the environment for our animation by taking the assets we modelled (like the tree and nest) and placing them all together within the scene, and needless to say I’m really happy with the outcomes that have been produced. I love the use of fur to create the bushes of the tree and the blades of grass. I also thought the use of a displacement map to create the detail in the ground was really smart. Awesome work!

[Images: renders of Christian’s environment scene]

Creative Strategies: Rigging Our Mother Bird: Alec’s Notes

I decided to take on the responsibility of rigging our characters for our animated short, which I really enjoyed, but I initially found it incredibly difficult to break into. So my lecturer, Alec Parkin, put together this amazingly helpful tutorial.

‘I feel the reason why it helped me so much is because everything is well explained and conveyed within the video. Everything said made sense to me and I was able to follow along with it perfectly. This helped make the rigging process less daunting to me and eased me into it, and also left me with enough knowledge and confidence to continue on with updating our mother bird rig and completing our other two character rigs, Baby Bird and Fat Baby Bird.’

Key Notes Taken

-Always remember to zero out the values of your rig controls

Modify – Freeze Transformations (with the controller selected)

Edit – Delete by Type – History
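
As a scripted sketch of the same two steps (the controller name here is just a hypothetical example):

```python
import maya.cmds as cmds

ctrl = 'ctrl_mother_root'  # hypothetical controller name

# Modify > Freeze Transformations (with the controller selected)
cmds.makeIdentity(ctrl, apply=True, translate=True, rotate=True, scale=True)

# Edit > Delete by Type > History
cmds.delete(ctrl, constructionHistory=True)
```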

-Remember to name the nodes in your Outliner appropriately; make it easy and straightforward to understand, not just for yourself but for the others working with the rig.

-Keep all connections, such as aim constraints, intact

-Applying a new group to a pre-existing group will zero out its transform values

-Applying blend shapes to our character rig: to apply the modified meshes as blend shapes to our original mesh, select the meshes in the following order:

‘DUPLICATED MESH’ -> ‘DUPLICATED MESH’ -> ‘ORIGINAL MESH’

Now, with the meshes selected in that order:

Go to ‘Deform’ – ‘Blend Shape’ (options) – ‘Edit – Reset Settings’ – ‘Apply’

Now the blend deformers have been applied to the original mesh.
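
The same thing in script form, with made-up mesh names standing in for our duplicated and original bird meshes:

```python
import maya.cmds as cmds

# Duplicated (modified) meshes first, original mesh last,
# the same order as selecting them before Deform > Blend Shape.
cmds.blendShape('bird_beakOpen', 'bird_eyesClosed', 'bird_original',
                name='bird_blendShapes')
```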

“APPLYING BLEND SHAPES KILLED YOUR RIG?”

No worries

This can be fixed by editing the input order of our character:

With our mesh selected, go to ‘Inputs’ – ‘All Inputs’ and, using the middle mouse button, move our blend deformers under our skin cluster (smooth bind)

[Image: deformer order in the All Inputs list]

So, as seen above, we want our blend deformers to affect our mesh before our smooth bind is applied.
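
The same reorder can also be done with the reorderDeformers command; this is a sketch assuming Maya’s default node names (‘skinCluster1’, ‘blendShape1’) and a hypothetical mesh name:

```python
import maya.cmds as cmds

# Script equivalent of the middle-mouse drag in the All Inputs window.
# If the resulting order comes out backwards in your scene, swap the two
# deformer arguments; the goal is the blend shape sitting below (evaluated
# before) the smooth bind in the All Inputs list.
cmds.reorderDeformers('skinCluster1', 'blendShape1', 'bird_original')
```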

When Rigging we want:

  • The CONTROL OBJECTS to drive the JOINTS, and the JOINTS to drive the MESH

We don’t want:

  • ROTATED JOINTS

Rigging for translating and scaling our characters:

With our root joint connected to our base control, as seen below:

[Image: root joint and base control]

Apply a scale constraint and a parent constraint to the base control.

(under Rigging – Constrain – Parent and Scale)
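
A short maya.cmds sketch of those two constraints, with hypothetical names for the base control and root joint:

```python
import maya.cmds as cmds

base_ctrl = 'ctrl_base'    # hypothetical base control curve
root_joint = 'joint_root'  # hypothetical root joint of the skeleton

# Rigging > Constrain > Parent and Scale: the control drives the joint
cmds.parentConstraint(base_ctrl, root_joint, maintainOffset=True)
cmds.scaleConstraint(base_ctrl, root_joint, maintainOffset=True)
```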

Final Rig Outcome:

Creative Strategies: Alec’s Rigging Notes

Popular Rigging Tools

  • Grouping/parenting
  • Constraints
  • Joints
  • Blend shapes
  • Face deformers.

Creating an FK Joint chain

FK – Forward Kinematics

Refers to the movement of the children of the hierarchy being driven from the parent of the hierarchy.

The Create Joint tool automatically creates an FK hierarchy, which you can also see in the Outliner.

Maya’s joint tool lets us create skeleton style set-ups to deform and drive the polygon geometry.

The Create Joint tool is found under the rigging sub-menu

Skeleton – Create Joints

It’s recommended to lay out your joints in the orthographic viewports (front, side or top)

A video tutorial going through the joint creation process
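
A quick scripted version of laying down a small FK chain (the names and positions are just illustrative; normally you’d click the joints out in an orthographic view):

```python
import maya.cmds as cmds

# Each joint() call parents the new joint under the previous one,
# giving the same FK hierarchy you would see in the Outliner.
cmds.select(clear=True)
cmds.joint(position=(0, 0, 0), name='fk_shoulder')
cmds.joint(position=(2, 0, 0), name='fk_elbow')
cmds.joint(position=(4, 0, 0), name='fk_wrist')
```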

Creating an IK chain

IK-Inverse Kinematics

An Inverse Kinematics joint system is one where the movement is driven from the end of a joint chain.

This is the inverse of an FK system, where the movement is driven from the start of the joint chain.

To create an Inverse Kinematics chain do the following:

Create a joint chain with the Create Joint tool

Make sure to add a slight translation in the chain in the direction you want the chain to bend.

Select the Create IK Handle tool: Skeleton – Create IK Handle

Click on the start of the joint chain and then on the end or last joint you want to affect in the chain.

A video tutorial going through the creation of a simple IK rig
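
And a scripted sketch of a simple IK chain, with the slight offset in the middle joint so the solver knows which way to bend (names and positions are illustrative):

```python
import maya.cmds as cmds

cmds.select(clear=True)
cmds.joint(position=(0, 0, 0), name='ik_hip')
cmds.joint(position=(2, 0, 0.5), name='ik_knee')  # slight offset sets the bend direction
cmds.joint(position=(4, 0, 0), name='ik_ankle')

# Skeleton > Create IK Handle: click the start joint, then the end joint
cmds.ikHandle(startJoint='ik_hip', endEffector='ik_ankle', name='leg_ikHandle')
```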

Creating Parent Constraints

It’s not recommended to add keyframes to joints directly; instead, you should use controller objects that drive your joints or IK handles.

This can be solved by the following (a scripted sketch follows this list):
• Create a NURBS circle and snap it to the relevant joint or IK handle
• Freeze the transforms for that circle – Modify – Freeze Transformations
• Select the NURBS circle, then shift-select the corresponding joint/IK handle
• Go to the Constrain menu – Parent Constraint
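
Here’s roughly what that looks like in maya.cmds (the controller and joint names are placeholders):

```python
import maya.cmds as cmds

# Hypothetical names; 'fk_wrist' stands in for the joint the control should drive.
ctrl = cmds.circle(name='ctrl_wrist', normal=(1, 0, 0), radius=1.5)[0]

# Snap the circle to the joint, then freeze its transforms
joint_pos = cmds.xform('fk_wrist', query=True, worldSpace=True, translation=True)
cmds.xform(ctrl, worldSpace=True, translation=joint_pos)
cmds.makeIdentity(ctrl, apply=True, translate=True, rotate=True, scale=True)

# Constrain > Parent Constraint: the circle drives the joint
cmds.parentConstraint(ctrl, 'fk_wrist', maintainOffset=True)
```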

Parenting and unparenting joints

Press ‘P’ to parent and ‘Shift + P’ to unparent joints or other objects.

Alternatively, you can middle-mouse-click and drag the joints around in the Outliner.

Blend shapes

Animation menu set – Anim Deform – Blend Shape

Creative Strategies: Adding Eyelids to Our Characters’ Eyes, and Rigging the Eye Using Key Constraints

I used this tutorial by Tony Johnson to create the eyelids

  • I started off by creating a NURBS sphere, rotating it 90 degrees on the X axis and -90 degrees on the Y axis.
  • Under the sphere’s inputs in the Attribute Editor, I changed its Start Sweep value to 100 and its End Sweep value to 260.

This will open up the sphere as seen below:

[Image: NURBS sphere opened up by the sweep values]

The sweep values set here will act as the max values for the upper eyelid and the max values for the lower eyelid.
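
Here’s how those first steps might look as a script; the names are placeholders and the sweep numbers are the ones from the notes above:

```python
import maya.cmds as cmds

# Outer eyelid: a NURBS sphere rotated into place, with the sweep opened up
outer = cmds.sphere(name='eyelid_outer', startSweep=100, endSweep=260)[0]
cmds.rotate(90, -90, 0, outer)  # 90 degrees on X, -90 degrees on Y
```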

To make the inner eyelids, I repeated the same process as before, except this time I changed the scale value of the inner eyelids to 0.9. This makes the inner eyelid slightly smaller than the outer eyelid, which will allow me to create a loft between the two, as seen below:

[Image: inner eyelid scaled inside the outer eyelid]

In order to create the loft between the inner and outer eyelids, I selected both eyelid meshes and, with ‘Isoparm’ selection active, selected the upper edges of both the inner and outer eyelids, went into Surfaces – Loft (options) and changed the Section Spans value to match the number of spans across my eyelid mesh, in this case 3.

[Image: loft between the inner and outer eyelid edges]

I repeated the same process for the lower eyelids.
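
A hedged script version of the loft step (which isoparm you pick depends on how your spheres are parameterised; choose the ones sitting on the open upper edge):

```python
import maya.cmds as cmds

# Loft between matching isoparms on the inner and outer eyelid surfaces.
# 'eyelid_inner'/'eyelid_outer' and u[0] are placeholders for this sketch.
cmds.loft('eyelid_inner.u[0]', 'eyelid_outer.u[0]',
          sectionSpans=3, constructionHistory=True, name='eyelid_upper_loft')
```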

Then, using the CV Curve tool, I created the set of controllers that would eventually be used to control the opening and closing of the eyelids, and applied a freeze transformation to zero out their values.

[Image: CV curve controllers for the eyelids]

I then used set driven keys to connect the eyelid input values, such as Start Sweep and End Sweep, to my controllers. The controllers act as the driver and the eyelid mesh acts as the driven, creating a blinking motion.
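
One of those driven keys as a hedged sketch, assuming a custom ‘blink’ attribute on the controller and the sphere’s default history node ‘makeNurbSphere1’; the closed value of 180 is just illustrative:

```python
import maya.cmds as cmds

# Custom 0-1 'blink' attribute on the eyelid controller (hypothetical names)
cmds.addAttr('ctrl_eyelid', longName='blink', attributeType='double',
             minValue=0, maxValue=1, keyable=True)

# blink = 0: eyelid open (start sweep at 100, as in the notes)
cmds.setDrivenKeyframe('makeNurbSphere1.startSweep',
                       currentDriver='ctrl_eyelid.blink',
                       driverValue=0, value=100)

# blink = 1: eyelid closed (sweep value here is illustrative)
cmds.setDrivenKeyframe('makeNurbSphere1.startSweep',
                       currentDriver='ctrl_eyelid.blink',
                       driverValue=1, value=180)
```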

I also used key constraints to rig the eyes:

  • An aim constraint was used to move the eyeball geometry
  • The blend deformers of the dilating iris were driven-key connected to a slider, in which the slider is the driver and the iris is the driven (sketched below)
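
A rough maya.cmds sketch of both points; every name here (eye control, eyeball mesh, blend shape node and target) is a placeholder, and the aim/up vectors depend on how the eye was modelled:

```python
import maya.cmds as cmds

# Aim constraint: the eyeball geometry follows a look-at control
cmds.aimConstraint('ctrl_eye_aim', 'eyeball_geo', maintainOffset=True,
                   aimVector=(0, 0, 1), upVector=(0, 1, 0))

# Iris dilation: a slider attribute drives the blend shape weight with driven keys
cmds.addAttr('ctrl_eye_aim', longName='dilate', attributeType='double',
             minValue=0, maxValue=1, keyable=True)
cmds.setDrivenKeyframe('irisBlendShape.dilate',
                       currentDriver='ctrl_eye_aim.dilate',
                       driverValue=0, value=0)
cmds.setDrivenKeyframe('irisBlendShape.dilate',
                       currentDriver='ctrl_eye_aim.dilate',
                       driverValue=1, value=1)
```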

Final outcome shown below:

Creative Strategies: Making the Vomit using nParticles

Previously we stated that we’d be using Bifrost to create or simulate the vomit; however, having undergone a bit more research, I personally think nParticles would be our best bet for simulating the vomit.

The reason I’m moving away from Bifrost is heavily influenced by how gravity behaves within Maya, which can make the simulation quite tricky to control. As it stands, Bifrost will mainly just interact with a scene’s geometric assets as colliders and doesn’t really allow the user a great amount of control.

Whereas I found nParticles allow more control over the emission of the particles in general, ranging from how they move and the number of particles used, to setting specific directions for the liquid to travel.

As a result, I came across this really helpful tutorial on Digital Tutors that covers the use of nParticles within Maya and applying them in Houdini. However, for now we’re just interested in how nParticles can be used within Maya.

Using Houdini Engine to Simulate Finite Elements in Maya

[Image: Digital Tutors logo, linked to the tutorial]

Click the image above to be taken to the project overview of the set of tutorials used to achieve the vomit effect.

Key Notes

Create an object (in my case a cylinder)

With the object selected, hold Space, select nParticles and select Emit from Object

In our Outliner, we will now see that two new nodes have been added:

nParticle1 & Nucleus1

[Image: Outliner showing the nParticle1 and nucleus1 nodes]

With ‘nParticle1’ selected, go to the Basic Emitter Attributes:

Here, we are able to edit values such as:

  • Direction
  • Particle Rate per second
  • Emission Speeds
  • Speed Random

Once the desired values have been set, select the ‘nucleus1’ node

Here we are able to edit values such as:

  • Time Scale
  • Space Scale
  • Particle Spread

(Personal note: Deselect ‘Ignore Solver Gravity’; this will enable our particles to act under a slight influence of gravity but still allow the user to edit and control the particles with ease.)
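
A rough script equivalent of the setup so far, using a surface emitter plus connectDynamic rather than the Emit from Object menu item; all of the names (‘vomitSource’, ‘vomitEmitter’, ‘vomitParticles’, ‘nucleus1’) and values are placeholders to tweak:

```python
import maya.cmds as cmds

# Emitter source object (stand-in for the cylinder used in the notes)
source = cmds.polyCylinder(name='vomitSource')[0]

# Surface emitter on the object, with rate / speed / speed random values
cmds.emitter(source, type='surface', rate=500, speed=2.0,
             speedRandom=0.5, name='vomitEmitter')

# Empty nParticle object (this also creates/uses a nucleus solver)
particles = cmds.nParticle(name='vomitParticles')
cmds.connectDynamic(particles[0], emitters='vomitEmitter')

# Nucleus settings (assuming the solver is the default 'nucleus1')
cmds.setAttr('nucleus1.timeScale', 1.0)
cmds.setAttr('nucleus1.spaceScale', 0.1)

# Ignore Solver Gravity off, so the particles feel the solver's gravity
cmds.setAttr(particles[1] + '.ignoreSolverGravity', 0)
```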

I then decided to add a collider into the scene to test how our particles would interact with collider objects.

To create the collider, I made a default cube in Maya. With the cube selected, hold ‘Space’, select ‘nCloth’ and select ‘Create Passive Collider’.

In the Attribute Editor of our new collider object (the cube), we can edit its attribute values, such as the following (a scripted sketch follows this list):

  • Thickness
  • Bounce
  • Stickiness
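
A sketch of the collider setup in script form; ‘makeCollideNCloth’ is the MEL procedure behind Create Passive Collider, and the attribute values here are just examples:

```python
import maya.cmds as cmds
import maya.mel as mel

# Default cube turned into a passive (nRigid) collider
cube = cmds.polyCube(name='collider_cube')[0]
cmds.select(cube)
rigid_shapes = mel.eval('makeCollideNCloth')  # returns the new nRigid shape(s)
rigid = rigid_shapes[0] if rigid_shapes else 'nRigidShape1'

# The collision attributes from the list above live on the nRigid shape
cmds.setAttr(rigid + '.thickness', 0.1)
cmds.setAttr(rigid + '.bounce', 0.2)
cmds.setAttr(rigid + '.stickiness', 0.5)
```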

To finish off the creation and simulation of the vomit, we select the nParticle1 node, enable ‘Liquid Simulation’ (which causes our particles to behave more like a liquid within our scene) and then convert our nParticles into polygons by:

  • Selecting our nParticles
  • Going to ‘Modify’ – ‘Convert’ – ‘nParticle to Polygons’

This will convert our nParticles into an output mesh.

From here we can edit the mesh’s:

  • blobby radius scale
  • material attributes such as colour and transparency
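
A couple of the liquid/mesh settings as script, assuming the nParticle shape from the earlier sketch and assuming the Liquid Simulation checkbox maps to the ‘enableLiquidSimulation’ attribute (the conversion itself is done through the Modify menu):

```python
import maya.cmds as cmds

shape = 'vomitParticlesShape'  # placeholder nParticle shape name

# Make the particles behave more like a liquid (assumed attribute name)
cmds.setAttr(shape + '.enableLiquidSimulation', 1)

# After converting to an output mesh, tweak the blobby radius on the shape
cmds.setAttr(shape + '.blobbyRadiusScale', 1.8)
```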

Personal Note: Sometimes editing the poly mesh can be difficult, so, if you’d like to restore the original nParticles, do the following:

  1. Delete the generated poly mesh in your Outliner
  2. Then, under the nParticle node, select nParticleShape1
  3. Under Object Display, untick ‘Intermediate Object’
  4. Your nParticles have now been restored.
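
Step 3 as a one-liner, assuming the default shape name:

```python
import maya.cmds as cmds

# Un-hide the original particle shape (Object Display > Intermediate Object)
cmds.setAttr('nParticleShape1.intermediateObject', 0)
```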

Below are a series of tests I carried out throughout the vomit creation process:

In this test, I was simply checking that the nParticle presets I used worked correctly, such as the emission speed, the directional attributes and the collider attributes.

In this test, I was playing about with the ‘Space Scale’ values with the nParticles converted into a poly mesh. I didn’t like the random gaps that existed within the vomit in this test.

Following on from the last test, I fixed the ‘Space Scale’ values so it’s now just one continuous stream of liquid.

This test was carried out to see how well the mesh and the flow of the liquid worked whenever the poly count of the mesh was increased and smoothed.

An animation test carried out to see how the liquid reacted whenever the emitter mesh was being moved. This was a very important test because, in our final outcome, the vomit is going to have to move with the character it’s being emitted from, so gaining a brief understanding of how the liquid moves under an animated influence was crucial.

Final render test.