Daz3D clothing for Unity

Pro Tip: When you are exporting models from Daz3D to FBX, make sure to uncheck “Convert clothing to static geometry”. It is checked by default, and it will cause any clothing to come into Unity as a plain Mesh Renderer. If it was exported correctly, you will see all clothing nodes using a Skinned Mesh Renderer.
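One quick way to verify the export in Unity is to log which child meshes came in skinned versus static. This is just a rough sketch; drop it on the imported figure's root in a test scene and check the console.

using UnityEngine;

// Logs which child meshes are static (plain Mesh Renderer) vs. skinned.
public class ClothingImportCheck : MonoBehaviour
{
    void Start()
    {
        foreach (MeshRenderer mr in GetComponentsInChildren<MeshRenderer>())
            Debug.Log("Static mesh (will not deform with the figure): " + mr.name);

        foreach (SkinnedMeshRenderer smr in GetComponentsInChildren<SkinnedMeshRenderer>())
            Debug.Log("Skinned mesh: " + smr.name);
    }
}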

Daz3D animations to Unity 3D clips

After you export an FBX model from Daz, the one big Timeline animation will show up as a single animation in the FBX file.

For easier Animator state machines, you need to break it up into different clips.

Select that animation in the Project view to see its details in the Inspector. There should be an Edit button in the main details section; click it to edit the clips. Select the Animations tab at the top and add clips. For each clip, you can set a start and stop time.

After you Apply (at the bottom), all of these clips will appear as separate animations under the model in the Project view.
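If you find yourself redoing this split every time you re-export from Daz, the same clip definitions can also be set from an editor script. This is only a rough sketch: the asset path check, clip names, and frame ranges are made up, and the file has to live in an Editor folder.

using UnityEditor;

// Hypothetical example: define the import clips in code instead of the Inspector.
public class DazClipSplitter : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        // Only touch the FBX we care about (path is a placeholder).
        if (!assetPath.Contains("MyDazFigure")) return;

        ModelImporter importer = (ModelImporter)assetImporter;
        importer.clipAnimations = new ModelImporterClipAnimation[]
        {
            new ModelImporterClipAnimation { name = "Sit",     firstFrame = 0,   lastFrame = 90 },
            new ModelImporterClipAnimation { name = "StandUp", firstFrame = 91,  lastFrame = 150 },
            new ModelImporterClipAnimation { name = "Stand",   firstFrame = 151, lastFrame = 210, loopTime = true },
        };
    }
}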

Animations from Daz3D to Unity

Daz3D has a pretty decent animation editor.

Timeline:

You can position your character and set keyframes in the Timeline. The problem is that I have not figured out a good way to export just the animation by itself, so the goal is to export the minimal amount of data:

  1. Export to FBX
  2. In the export options just choose animation and figure.
  3. DO NOT choose morphs.

Without the figure, the FBX does not include the animation for some reason after it is imported into Unity.

With the morphs included, the animation will morph your character into the proportions of the figure you used when creating the animation.

 

AniMate2:

If you have the AniMate2 plugin for Daz, you can add a few steps for customization.

First, you can create Timeline animations and then pull them into the Animate tab. After your Timeline is set up, switch tabs, right-click in a blank area of the tab, and choose “Create aniBlock from Studio Timeline”. This will create an unnamed block on the Animate timeline. Make sure to Save As New, and make sure to save it under /Documents/Daz3D/Studio/My Library/aniBlocks/animate.

Once saved there, they will appear as blocks in the Animate2 tab at the bottom, and you can drag them into the order you want.

When you are ready to export, right-click again in an empty area of the tab and choose “Bake to Studio Keyframes”. This updates the Timeline and lets you export as before.

 

Unity3D:

Once your character is exported, you can copy it into your project and view it in the Project tab. If you expand the FBX (don’t forget to adjust the zoom level of the view so you have the icons), you will see the animation in it.

At this point, you can drag that animation into your own Animator state machine.

Selecting an object you are looking at. Raycast from camera.

In a Unity 3D project, it is nice if you can select what you are looking at through the Oculus Rift, instead of having to move a cursor or reticle around.

To do this with the Oculus Unity player controller, create a Ray from the position and forward vector of the right eye, then test it against your scene. If the ray hits anything, it returns the hit object’s data.

// Create a Ray from your rightEye.
Ray ray = new Ray(this.rightEye.transform.position, this.rightEye.transform.forward);

RaycastHit hitInfo;
// Specify 100, or some other number, if you don't want an infinite ray.
bool hit = Physics.Raycast(ray, out hitInfo, 100);
if (hit)
{
    // Do your stuff here.
    Debug.Log("Hit " + hitInfo.transform.gameObject.name);
}

If you run this on every Update, you can set a ‘highlight’ flag on objects that you are hovering over, to signify that actions can be used on them.
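As a rough sketch of that hover idea (the Highlightable component and its field are made up for illustration, not part of any SDK):

using UnityEngine;

// Hypothetical marker component: attach to anything that can be highlighted.
public class Highlightable : MonoBehaviour
{
    public bool isHighlighted;
}

public class GazeSelector : MonoBehaviour
{
    public Transform rightEye;          // assign the right-eye transform in the Inspector
    private Highlightable current;

    void Update()
    {
        Highlightable hitTarget = null;

        Ray ray = new Ray(rightEye.position, rightEye.forward);
        RaycastHit hitInfo;
        if (Physics.Raycast(ray, out hitInfo, 100f))
        {
            hitTarget = hitInfo.transform.GetComponent<Highlightable>();
        }

        // Only flip flags when the hovered object changes.
        if (hitTarget != current)
        {
            if (current != null) current.isHighlighted = false;
            if (hitTarget != null) hitTarget.isHighlighted = true;
            current = hitTarget;
        }
    }
}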

References:

  1. Unity.Ray: http://docs.unity3d.com/ScriptReference/Ray-ctor.html
  2. Raycast: http://docs.unity3d.com/ScriptReference/Physics.Raycast.html?from=RaycastHit
  3. Discussion about using the right eye: https://forums.oculus.com/viewtopic.php?f=37&t=2375
  4. Selecting object with mouse click: http://answers.unity3d.com/questions/411793/selecting-a-game-object-with-a-mouse-click-on-it.html

Giving a body to your Oculus Rift.

Even though I can’t get the Leap Motion working right yet, I still want to get rid of the disembodied feeling you get when walking around a level. And I realized that even if the Leap Motion were drawing hands, I would still need to show a body.

Looking over some demos, and even the new Elite game, most of them have bodies, and the ones that don’t usually get complaints.

So I started off by making a headless body and adding it as a child object of my OVRPlayerController. The problems I see so far are:

  1. The body is Y rotated based on the position of the headset upon initialization.
  2. The body does not follow head tracking.

I tried attaching it at several places in the hierarchy, to no avail. I guess what will be needed is to:

  1. Update the position of the body in real time based upon head tracking.
  2. Animate the body based on head tracking.

Item 2 is the hardest, but it could make things really believable. If you are looking up and tracking backwards, the body should use Inverse Kinematics to connect everything correctly.
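For item 1, here is a rough sketch of the kind of thing I am picturing. It assumes you can grab the tracked head transform from the OVR rig (the exact anchor name depends on the SDK version), and the head-height offset is just a guess.

using UnityEngine;

// Keeps a headless body under the tracked head, rotating around Y only.
public class BodyFollower : MonoBehaviour
{
    public Transform head;              // the tracked HMD / eye anchor transform
    public Transform body;              // the headless body model
    public float headHeight = 1.7f;     // rough distance from the body's feet to the head

    void LateUpdate()
    {
        // Place the body's feet directly under the head.
        Vector3 pos = head.position;
        pos.y -= headHeight;
        body.position = pos;

        // Face the body where the head faces, but ignore pitch and roll
        // so it doesn't tilt with head tracking.
        Vector3 forward = head.forward;
        forward.y = 0f;
        if (forward.sqrMagnitude > 0.001f)
            body.rotation = Quaternion.LookRotation(forward.normalized, Vector3.up);
    }
}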

At the least, it should understand whether you are sitting down or standing up in real life, so your virtual body isn’t always standing with its legs poking through the ground.

This is all now on my list for further investigation.

Leap Motion and Oculus Rift in Unity

Getting the Oculus Rift working in Unity 3D was pretty easy.  Just download the Unity SDK, import the package, and add the OVR player controller.

Getting the Leap Motion to work in Unity took a few more steps. There are several different ‘official’ assets in the Asset Store to choose from. The ‘Boilerplate’ package has DLLs from March 2014, while the Skeleton package has a lot more and was just created in November 2014.

The directions require Unity free users to copy files from the imported package to the root project directory. After that, simply placing the HandController prefab in your scene should make your hands appear.

The issue comes when you want to use both the Oculus and the Leap at the same time, and I haven’t figured it out yet.

Having both the OVRPlayerController and the HandController in the scene resulted in the screen being gray or otherwise weird, along with missing-DLL errors. I found a post that said all DLLs need to be in the plugins/x86_xxx/ directory. Since there are a couple there for the Oculus Rift, I moved the Leap ones in there as well. After that, nothing worked. So I copied them back, keeping the copies in the plugins directory. The scene could be played, but now the OVR stereoscopic view wasn’t working, and there were a TON of missing-DLL errors. The console errors aren’t much help, and the paths they do give seem correct.

I need to hunt down a few working examples of Unity projects that use both of these devices.

Looking at you

I am trying to make the ‘robots’ in my game look more alive, and I think the biggest impact comes from having them seem aware of you. The first step was having them at least stand up when you walk near them. In a previous post I talked about how to get clips working in Unity, and I combined that with a trigger so that they go from a ‘sitting’ clip, transition through a ‘standing up’ clip, and finally stay in a ‘standing’ clip. It worked pretty well.

The next step is to see if I can get the model to look toward the camera. There seem to be a few publicly available scripts to do that.
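The basic idea, using Mecanim’s built-in look-at IK, would be something like this sketch. It assumes the model is imported as a Humanoid rig with “IK Pass” enabled on the Animator layer, and the target and distance values are placeholders.

using UnityEngine;

// Turns the head toward a target (e.g., the player camera) when it is close.
public class LookAtPlayer : MonoBehaviour
{
    public Transform target;            // e.g., the player's camera
    public float maxDistance = 5f;      // only react when the player is nearby
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Called by Mecanim when "IK Pass" is enabled on the layer.
    void OnAnimatorIK(int layerIndex)
    {
        float distance = Vector3.Distance(transform.position, target.position);
        float weight = distance < maxDistance ? 1f : 0f;

        animator.SetLookAtWeight(weight);
        animator.SetLookAtPosition(target.position);
    }
}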

 

Blender animations in Unity

In Blender, you will set up all your animations as one long sequence, taking note of the start and end frames.

After importing the FBX into Unity and looking at the FBX import settings, there will be an Animations tab. Add clips, setting the start and end frames to those same frame numbers.

Add the FBX to the scene. Create a new Animator Controller asset, and assign it to the Animator component that should be on the object you just added to your scene.

Edit that Animator Controller and you will see the Animator view, which resembles a UML state diagram. Drag all your clips onto this view; they can be found by expanding your FBX file in the Project list.

Now add transitions between them, setting the conditions that drive each transition (Animator parameters such as bools, triggers, and so on).

In your script, you can now access that Animator by getting the component from your game object. Then you can drive those state transitions by setting the parameter values.

// Grab the Animator on the object that has your controller.
Animator anim = GameObject.Find("yourObjectName").GetComponent<Animator>();
// Set the parameter that your transition conditions check.
anim.SetBool("myFlag", false);

There is also a great video covering this workflow.

Oculus on the Mac Air

Developing for the Oculus in Unity on the Mac Air has brought up some ‘old school’ concerns, mainly around developing for an older system. Primarily, this can be dealt with by reducing polygon counts and by not using any fancy shaders that require more pipelines than a stock low-end card can provide.

The first way I have dealt with this is by making multiple scenes and transitioning between them through doors. This requires setting up invisible triggers that change your scene, though. It has worked, but with Unity’s default behavior the transition feels jarring so far.
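The door trigger itself is not much code. Here is a sketch, where the scene name and the “Player” tag are placeholders (newer Unity versions would use SceneManager.LoadScene instead of Application.LoadLevel):

using UnityEngine;

// Invisible trigger volume placed in a doorway; walking into it changes the scene.
public class SceneChangeTrigger : MonoBehaviour
{
    public string sceneToLoad = "Room2";     // placeholder scene name

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        // On newer Unity versions:
        // UnityEngine.SceneManagement.SceneManager.LoadScene(sceneToLoad);
        Application.LoadLevel(sceneToLoad);
    }
}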

Next up: Playing with what Unity has to offer in poly management

1) Level controller LOD: Have room triggers control the visibility of items in other rooms. This is essentially what saved Quake, except you would have to set it up manually: a trigger that shows/hides different objects in the level. While the main level would still have to be loaded, you could get a big reduction in rendered polys (see the sketch after this list).

2) The Biohazard door: Have a closed door separating levels. Activating the door triggers an animation that hides the scene loading.
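Option 1 could look something like the sketch below: group each room’s contents under a single parent object, then let a trigger in the doorway toggle whole rooms on and off. The “Player” tag and the room grouping are my own assumptions, not something Unity sets up for you.

using UnityEngine;

// Doorway trigger that shows the rooms you are entering and hides the ones
// you can no longer see. Each array entry is a room's parent GameObject.
public class RoomVisibilityTrigger : MonoBehaviour
{
    public GameObject[] roomsToShow;
    public GameObject[] roomsToHide;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        foreach (GameObject room in roomsToShow)
            room.SetActive(true);

        foreach (GameObject room in roomsToHide)
            room.SetActive(false);
    }
}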

Rigging a Daz3D model in Blender

I wanted to learn more about rigging in Blender, and since I liked the models that came from Daz3D, I decided to start with one of those. Getting the model from Daz3D into Blender is pretty easy, and there are different ways, which I discuss in other posts (TODO LINKS).

Once the model is in Blender, I started trying to rig it for animation. I found this pretty good tutorial.

However, I kept coming across errors when applying the bones to the model. Here was another good video that might help.

It still didn’t help me, so I kept trying smaller and smaller models.  Eventually, I just had a model of two legs, and put two bones in one leg, still with no luck.  So, I created two more bones for the other leg, and it worked.

I am guessing you need to have adequate bone coverage for your model.  I am going to try more and more complex models, and add bones to see how far I can get.

As I have seen in other cases, having more bones does indeed work. I would just like to understand further why having fewer bones is an issue. I would think that, at a minimum, Blender would associate all nearby vertices with the closest bones instead of just failing.