Collab Group Week Five

We had our weekly collaboration group catch-up on Friday. Since we’d all been sharing our progress with one another via OneDrive and the school network, there wasn’t much to go over, but a quick check-in is essential for keeping everyone on the same page.

The VFX team shared their thoughts on particle simulation with us. They’d decided that the best way forward would be to create the work in Maya rather than Houdini, and probably to rely on simulations more often than zoetropes. Below is a prototype of the sandstorm effect.

Gherardo has also been working on setting up Act 1 Shot 1, and I’m really impressed with how well it’s come out so far.

It looks great and only requires animation on the spaceship. We talked a little bit about different ways to hide lights on solid objects, specifically in this case, the planets.

As for the animation team, my plan over the weekend is to learn how to apply motion capture to Jordan in Act 1 Shot 4. Once I’ve got the hang of motion capture, things will run a lot faster.

Kamil and I spent some time discussing how our spaceship was rigged. Because the doors weren’t moving the way I needed them to, I’d animated them separately in my shots with my own parent constraints, and rather loosely at that, only for the purposes of what each scene required. What I didn’t realize was that when Kamil then went in to update the rig, it broke the work I’d done in my shots, because the rig was referenced in. We spent some time working out how to resolve the issue, and our spaceship is now good to go.
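For anyone curious about why the rig update broke things, here is a minimal sketch of roughly what my setup looked like (the node and namespace names are purely illustrative, not our actual rig). The constraint lives in my animation file but points at nodes inside the referenced spaceship rig, so changes to those nodes in the rig file can break it the next time the shot loads.

```python
import maya.cmds as cmds

# The spaceship rig is referenced into the shot file under a namespace,
# so the door I was animating is a referenced node (names are made up).
door = "SHIP:door_L_geo"
driver = "SHIP:hull_ctrl"

# This constraint is created and stored in the animation file, but both of
# its ends live inside the reference; if the rig file later renames or
# re-parents those nodes, the constraint loses its targets when the
# reference reloads, which is what happened in my shots.
cmds.parentConstraint(driver, door, maintainOffset=True)
```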

Render Test from Act 1 Shot 3

It will be challenging to make progress this week as we balance the project with our performance animations. My biggest goal is to teach myself motion capture, and we’ll take it from there.

Mehdi Tutorials: Houdini Week Five

I’ve had a very hard time with the Houdini tutorials lately, so I was proud just to get through the first forty minutes of following along. My smoke looks okay, but it’s doing something weird towards the end of the simulation, when a large box of light develops.

I was able to follow along until rendering, but then this happened. I’m going to book a session with Mehdi on Monday and ask him what’s going on; I suspect there are multiple issues with my file.

UPDATE

I spoke to Mehdi and he informed me that the only issues were either my graphics card or the fact that I hadn’t hidden the geometry. I rendered again with the geometry hidden and was able to produce this:

The density adjustment slider was really interesting to me, and it’s cool to be able to choose between pure flames and a more smoky fire.

When it came time to render our simulation, I was at first met with the same issue as before:

But happily, I was able to solve the problem on my own this time! I realized it was because the material shader name hadn’t been updated, so I reassigned it and made sure all the geo was hidden. As soon as I did, I got this image:

I feel that this week helped me understand rendering better and continued to get me accustomed to the Houdini workflow. Being able to find my own material shader issue is a big step in the right direction toward understanding how Houdini nodes interact.

Virtual Production: The Latest Tech from VFX Studios and Indie Creators at Vertex 2021

The first session I attended of the Vertex 2021 conference was the Virtual Production talk with Ian Failes, Theo Jones, Paul Franklin, and Hasraf Dulull.

The panelists began by giving an overview of what virtual production actually is, in case any audience members were coming to it fresh. Virtual production is a system of tools that allows live-action footage to be combined with CGI in real time, although, as Paul pointed out, this can encompass a vast number of different areas, so it’s hard to point to one specific output and say “this is virtual production”, because it can be used for so many different things.

What most people are excited about right now, according to the panelists, are LED walls, which can run a CGI environment on a live-action set, effectively replacing the need for green screen. This can save a lot of time and money when shooting, and from a director’s standpoint it’s useful to be able to change lighting, props, or whole scenes without having to entirely rework the physical set. However, the panelists also stressed that it’s important for clients to manage their expectations: although virtual production allows a very fast turnaround and almost anything can be changed at any time, it still requires testing time and trial and error.

The panelists finished off by telling us that there is a big need in the industry for artists experienced with Unreal Engine, and with the resources available for free online on blogs, YouTube, and Unreal’s own website, it’s easy for absolutely anyone to get started from their own bedroom. This is one unexpected advantage of COVID: easy virtual access.

Performance Animation: Acting Feedback

I was told that the character I was playing on the left was okay, but that I could restrict the motion a little more and remember to think pose-to-pose. I should make sure to brush the hair back first and then say the line, keeping pose staging in mind. I may extend the audio a bit at the beginning to allow for this.

As for the character on the right, I got a few critiques. I was told that he should take shorter strides into the frame and turn around with less of a jolt, and that the arms don’t need to wave so much. When he says “The University of the United States”, perhaps that can just be a brandished pointer finger rather than one large sweeping movement. He can also stay in place more between poses. Another good idea is to have him look at the camera when coming up with his lies, breaking the fourth wall a little rather than making eye contact with her; this also contributes to the sense that he is lying.

One more thing is to make sure that the characters don’t physically pass in front of one another, as it confuses the silhouette.

Collab Group Week Four

Over the past week, we’ve all gotten a lot done.

Using the animated storyboard as a reference, Callum and Ben created a suspenseful theme to be used in Act 2, which I’m very excited about. It’s exactly what I was envisioning for the story.

In the meantime, I put together a set, and began working on rough blocking animation for Act 1 Shots 2 & 3.

Kamil then reached out to us with some suggestions for changes to the file structure, models, and set. He was able to reduce file size tremendously by referencing in each element of our scene:

In this setup, we use the animation file anim_act01_shot00 to work on the blocking, incrementing it for each shot so that eventually we have a new anim file for each one. The character, background, and spaceship are all referenced in. This way, we can work on updates to the set or character and have them update in all of our animation files. This is not only more efficient, but it also gets the animations to the VFX and sound design teams more quickly, so they can begin working without waiting for us to finalize every aspect of the scene.
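As a rough illustration of how this works in Maya’s Python API (the file names and namespaces below are placeholders, not our real pipeline paths), each shot file simply references the shared asset files and is then saved out under an incremented name:

```python
import maya.cmds as cmds

# Reference the shared assets into the shot's animation file, each under
# its own namespace, so edits to the source files flow into every shot.
for asset_file, ns in [("character.ma", "CHAR"),
                       ("background_set.ma", "SET"),
                       ("spaceship.ma", "SHIP")]:
    cmds.file(asset_file, reference=True, namespace=ns)

# Save an incremented copy per shot: anim_act01_shot02, anim_act01_shot03, ...
cmds.file(rename="anim_act01_shot02.ma")
cmds.file(save=True, type="mayaAscii")
```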

He also updated the ship so that the shaders were applied correctly:

And he began working on the colony. We still have some way to go on this, but with our new referencing format, we can start animating without worrying about it yet.

The group expressed an interest, however, in keeping the large cliffs and canyons visible in the original set, as they had been planned into the storyboard. Kamil sent us an updated version with a possible way to lay this out:

While Kamil and I worked on set design and early animation, Antoni and Gherardo worked on particle simulation ideas for the space dust and sandstorm scenes. This morning, we learned that a sandstorm zoetrope is available in the class resource share folder, which is incredibly lucky for us.

During the meeting today, we decided on this plan for next week:

Kamil and Emma (Animation Team): We will split Act 1 so that I do shots 2-5 and Kamil does shots 6-9. We have all been discussing using an entirely different set for shot 1, so that doesn’t concern us yet. Kamil shared his screen with us during the call to demo how to use the pipeline for our work. I will transfer my previous animation to the new scene using Studio Library and get shot 2 to the other teams by Thursday or Friday so that they can each begin working in it.

Ben and Callum (Sound Design Team): Once Ben and Callum have shot 2, they will start working on sound effects for it (spaceship whirring, wind, music, etc).

Antoni and Gherardo (VFX): This week, they will also take a look at shot 2 and begin working on their simulations in it, as well as settle on the most time- and space-efficient way to store these simulations (zoetropes vs. simulations created in Maya or Houdini).

Once the rest of the group has shot 2, we can really begin getting a sense of transferring our work between teams and feeling out what our finished product will look like for each shot.

We are also meeting with Luke on Thursday to discuss all of these plans and take any feedback or suggestions necessary to better our project.

3DE Independent Work Challenge

After learning about our external collaboration unit coming up in the summer, I got in touch with Dom to talk about practicing my skills in 3DE in order to prepare to possibly take a tracking role in the project. I’ve been told that 3DE is one of the best ways to get started working in the industry, and I like it a lot, so I think it would be a very smart move to try to angle my showreel around it.

When we first started learning 3DE, I had a really tough time even finishing the tutorials, as the software was all so new and the process requires memorizing a lot of very small but very important steps. However, as is often the case, struggling to get over that hump ended up helping me in the long run, because I remember the solution to each problem I ran into that much more solidly. I actually find tracking shots pretty fun now, and I understand the significance of most of the steps.

I talked to Dom about starting a few different shots on my own to challenge myself. I feel that working independently forces me to learn more because I have to solve my own problems, so I will be working on a couple different shots over the coming weeks and meeting with Dom each week to check in and critique.

This is the shot that Dom recommended I work with as an easy starter, just to see if I can remember how to get through the process by myself:

I felt pretty proud of myself when I was able to get everything set up without a reminder: converting the video to an EXR sequence, importing it into the camera tab, and exporting buffer compression files. I created a Maya project and set up the project window, saved the file path, and found a free OBJ file online, which I textured.
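As a small sketch of the project setup side (the path below is made up), setting the Maya project before importing anything means the EXR sequence, the 3DE exports, and the textures all resolve relative to a single root:

```python
import maya.cmds as cmds
import maya.mel as mel

# Point Maya at the tracking project first, so all file paths resolve
# relative to the project root (the path here is illustrative).
mel.eval('setProject "D:/tracking/3de_starter_shot"')

# Check the project root and where source images (the EXR sequence)
# will be read from.
print(cmds.workspace(q=True, rootDirectory=True))
print(cmds.workspace(fileRuleEntry="sourceImages"))
```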

I found the tracking relatively easy, as we had planned.

Still, I was nervous that having to choose all of my own points would be difficult, but when I first pressed “calc all from scratch”, I was relieved to find that the line was very straight.

And, even better, my deviation was only 0.8!

Unfortunately, though, I suspect that this may not be what the parameter adjustment should ideally look like. From what I can recall, it should look more like a cube. My theory is that I did not include enough of the sides of the buildings on the street, and only the street and the horizon line were detected. But when I pulled it into lineup view, it looked perfect, so I continued.

I did have to rewatch the superhero training video to remember the order of calculating the lens distortion in the parameter adjustment window, though. I wasn’t able to remember that on my own. I remembered how to do everything and why, though, just not what happens when. I’ve written it down to study.

I put in the 3D model and was at first confused that I was unable to move it, but realized quickly that I needed to turn off “contains survey data.”

I projected my points onto the 3D model and had no issues.

So I exported the MEL script (it crashed a couple of times and I was forced to export without 3D models) and ran Warp4. This was the first time Warp4 didn’t crash on me on the first try. I saved the dewarped footage and brought it all into Maya.

Then came some purely aesthetic work. I tried to match my skydome’s exposure to the fading light of the shot and I rotated R2-D2 so that he tilts as he swerves, in order to make it look less like he’s floating along and more like he’s powering his own journey.

This R2-D2 was not rigged and the geo was in numerous small pieces, so I didn’t have the option to move the body and ambulatory limbs separately.

Overall I’d say I did a fairly decent job. I’m not sure if there was anything I forgot that I just happened to get lucky with not ruining my shot. I do suspect that I’m lacking some information on the sides of the shot. However, considering I did this without help and only had to look up one step, I feel pretty good about it.

KK Tutorials: Lighting Week 1

I made it all the way through the first lighting tutorial and only ran into a couple of issues. When I first started, I was confused because my file opened to look like this:

-as opposed to the full scene that KK had on his screen. No matter how many times I re-pathed the images, I kept getting this result.

I reached out to the class and found that Crystal was having the same issue. After some troubleshooting, she informed me that the answer was simple: pressing 1 on a node displays it in the viewport, and KK’s scene happened to be on a different node than ours when we loaded ours in. I was able to follow along with him after that.

The only other issue I ran into was when we were rendering the shot in Maya. For some reason, when I pulled up the geo, I saw this fully constructed scene rather than the HDRI in the background. I repeated this step a few times and got the same result. I hesitate to tamper with it, as I’m not sure whether there’s an actual issue in the file structure or if this is simply what the final outcome should look like.

Houdini Tutorials: Week Four

This week I was able to follow along all the way up to the point where we got back into the particle animation. For some reason, my computer crashes every single time I try to render even one frame of it.

But I learned a lot about rendering just from following along with the rubber toy example, and I will try to render one of my destruction sequences using this knowledge.

The Researcher: Collab 2/8/21 Meetup

These are the decisions reached by the group during today’s call.

This Week’s Work:

Monday to Tuesday: Emma creates an animatic from the storyboard with consideration of timing, so that our sound designers can begin working with a clearer idea of what is needed and when.

Tuesday: Gherardo and Antoni attend the 3D lesson, and possibly meet with the professor one-on-one to ask about the rocket simulation aspect.

Wednesday: Kamil and Emma attend the mocap lesson and learn more about the projected length of time it will take us to animate each shot.

Thursday: The entire group meets with Luke to discuss the project.

Wednesday to next Monday: Kamil and Emma begin building the set for Act 1. The VFX team works on the space simulation for the first shot and finds 3D assets to incorporate into the set. On Monday we meet back up to discuss whether our set is done and plan to begin animation.

During our meeting we went through the storyboard and discussed any possible changes or updates on our shots, found some 3D interior models to use for our sets, and discussed a realistic output for the project. Our goal is to have at least one act fully finished for our showreels by the end of the project.