Liu Cixin’s successful Chinese novel series Remembrance of Earth’s Past features various timelines across multiple worlds, creating challenges in capturing both the scope of its landscapes and the vastness of space. However, Emmy Award-winning cinematographer Jonathan Freeman embraced that ingenuity, including the construction of new camera lenses, to help bring the series to life.
3 Body Problem was created by Game of Thrones alums David Benioff and D.B. Weiss, along with Alexander Woo. After a Chinese astrophysicist makes first contact with aliens during the Cold War, her decision to help bring extraterrestrial life to Earth reverberates in the present day, as a group of scientists is forced to confront the unimaginable. The series has already been picked up for a second and third season.
The series has been lauded as an ambitious and impressive tale of curiosity and exploration, with visuals spanning VR landscapes that devolve from gorgeous sunlight into harrowing snowstorms. A multiple American Society of Cinematographers (ASC) Award winner, Freeman is quick to highlight how the construction of the visual storytelling was a collaborative effort that required patience and resilience to get the balance right.
“In a series, what’s interesting is that you all start out with an idea of what it will look like, and you give things a go that sometimes work out and other times you learn from,” shares Freeman. “You develop, you tweak, and that’s the case with us, where we found these lenses and were learning to utilize their beauty in the right balance and when to be more conservative.”
Freeman spoke with Awards Focus about working with a team of cinematographers, creating volume stages with intricate lighting systems to control the direction of light, and how he achieved the incredible rehydration sequence in episode two.
Awards Focus: How much did you know about the novel series on which the show is based?
Jonathan Freeman: I had heard about the novels and that D.B. and David were developing it as their new project a couple of years before they went into production. I was very interested in the material and wondered how much they would be taking from the books and the direction of the adaptation.
I dove into the books, and they’re an incredibly fascinating read and, at the same time, somewhat intimidating. We would have to try to make this story unfold as a TV series when some aspects of the books are so intellectually based, and translating that to film could be challenging. D.B., David, and Alex [Woo] did a beautiful job of finding that right balance, of getting the intellectual content woven into the storyline without it feeling forced.
AF: When considering the choice of lens and aspect ratio, what were those conversations like, and what involvement did the cinematographers from later episodes have in those decisions?
Freeman: I was the first photographer out of the gate, so I had the responsibility of beginning discussions with the showrunners about key elements of our storytelling, which we started with aspect ratio. At that time, PJ Dillon, who was one of our other cinematographers, and I were touching base on the process because I wanted to get his take.
With the aspect ratio, we were considering future seasons and what ratio made sense for the whole series. In the first season we have dramatic landscapes in the VR world, so it felt natural that we would want to shoot widescreen. There was also the episode “Judgment Day,” which Martin Ahlgren was shooting, with this set piece around a tanker where we would want a wonderful, wide horizontal frame, which called for widescreen. Beyond that, there was also the cinematic notion that space stories seem to want a widescreen format, because you can utilize the assets of space effectively in your storytelling by literally using negative space, like having a planet on one end of the frame, and better convey the scale of a starship.
The lenses were inspired by the beautiful bokeh that an anamorphic lens can convey. We found beautiful, vintage glass, but when I approached ARRI, Simon Surtees suggested I consider looking at prototypes that Greig Fraser, who shot The Batman, was developing. We were so blown away by them. They had this classic, vintage quality that you want from anamorphic, in terms of bokeh, and this unique aspect that crossed into a modern field where the center of the lens tended to be optically pure. With vintage anamorphic it’s hard to frame a good closeup on an actor without causing distortion on their face, and these were perfectly pure in the middle, with chromatic aberrations, or sort of distortions, within the edges of the frame that made it feel otherworldly.
AF: You’re credited on episodes one and two alongside Richard Donnelly. Can you talk a bit about working together on those episodes and the division of tasks?
Freeman: Richard came into the project when I had to leave a bit early. He was available to shoot some additional scenes and contributed greatly to the first two episodes. I really appreciate that he took what I had established and added his own touches that really raised the bar, which is true of all the cinematographers on the show. By the time I was shooting and PJ and Martin joined the team, we were always communicating and watching each other’s dailies, which helped us give advice to each other.
AF: In episode two, there’s a scene where shriveled bodies are thrown into a large body of water, and the bodies rehydrate and reanimate in stunning VFX. How did your work behind the camera pair with the VFX material?
Freeman: We eventually had pre-vis material for all of those VR sequences so we could plan shots. We storyboarded initially to get the ball rolling, but then we didn’t spend too much time revising, partially because of time. Everything had to be accurate; every frame needed to be perfect. If the camera was going to be two inches off the ground, shooting up toward the background of a period setting, you needed to frame it correctly.
Ultimately, the shots of the dehydration elements needed practical props where you’d see a transition, but a lot of work would have to be done virtually. There were multiple tests that the brilliant VFX team went through, and I love working with those teams to collectively find a solution.
AF: How did you overcome lighting challenges, like after the bodies are rehydrated and the climate shifts from a glorious sunny sky into a snowstorm?
Freeman: What’s unique about our story is that light is not just a character; it drives the plot, and characters are emotionally affected by the light. So getting the light right, as best we could, was critical, and it was almost as challenging to do that inside. At one point we considered shooting some scenes outside, but ultimately, for efficiency and consistency, and knowing some of the light would look naturalistic, we decided against it.
Some of it had to look alien so you can have a sun that rises over the horizon. We can simulate it naturally as best we can inside a studio, but the next time it sinks behind the horizon and comes back up again, it’s 40 times the size, and there’s no reference for that. The light had to be interactive with characters in that environment where the rest of the virtual world would be recreated.
We did tests with Unreal Engine through the pre-vis, and we ultimately knew that the interaction with the light was critical and would work on the volume stages. We had to have a working environment that had the light interaction but also the scale, so we could have a hundred extras running from one point to the other, butt naked, in the snowstorm. There was no volume big enough to accommodate all those elements, so I had to come up with what I call a low-rise volume, which was effective and very expensive [laughs]. It comprised a light wall of SkyPanels, which are a common light source that we use. Each was almost like a pixel, so we created an array of 110 feet by 45 feet, wrapped around 180 degrees.
Then there was the lighting on top, similar to a skylight, where we programmed animated lights to recreate sunrise, sunset, dusk, and dawn and create those transitions. It was quite complicated, but we were still able to use the same space and transition in camera. It took a lot of minds to put it together, but I thought it was really successful.