◡◶▿ SOFT10 | Hybrid crew
🐦 Reprogramming the migratory patterns of your dolly grip. Surgically transforming your boom op into a Tetsuo-assed NPC. Plus: €6k XR residency. | Imaginary Software of the Filmmaking Future Week 10
Hello. Welcome back! Or simply welcome, if you’re new. We’re learning about imaginary software of the filmmaking future. And this week’s lesson contains four standalone - but inter-lockable - micro-essays on software-powered crew members.
Missed a week? Joined late? Don’t worry about reading these lessons out of order. Each functions independently. They are sent in a sensible sequence but hardly reliant on it.
First, let’s glance back at last week, when we learned how:
A paradox occurs when making films about the human condition with robot tools: the human routine has become so robotic, how’s the robot to sense the difference?
The solution might be a Le Corbusian “soft machine for robot-filmmaking in.”
The film set has long been an awkward playground for creative misfits - your AI assistant is just the latest oddball.
But it may need closer access to the human body to be of real artistic use.
Great! Excellent.
In today’s lesson, we’ll see how some of that might work out. Let’s find out how:
🐦 Sensors and buzzers may turn the dolly operator into a human bird - migrating naturally to the most effective angles of a scene.
🐙 The boom operator might consider surgically extending her arms… and her mind.
🤺 You can raise the stakes for the sedentary editor by sending them on a virtual reality quest to cut your movie in close combat with sword-wielding AI sprites.
📼 Your producer may eventually grow nostalgic for the ‘legacy format’ of the flesh and blood ‘best boy.’
Oh god!
Plus, there’s a vaguely calming recommended read and a tip-off for a €6,000 collaborative XR residency in Rotterdam.
Study note: I’ve only just discovered that when you read UPV lessons on the website rather than the email, you can hover over footnote numbers1 to make their text bubble up alongside! That saves a lot of scooting up and down the page of a footnote-heavy lesson. Which today’s isn’t.
To read (and listen, ‘Like,’ comment etc.) on the website, just click on the title at the top of the email.
Software, eh? The things it can do. Let’s get started.
Bird sense
You can hear me deliver this lesson by scrolling up to the header and clicking Listen and/or the play ▸ button.
Tools and techniques - utensils and habits - software - restructure a human’s underlying patterns.
At the start of the 2020s, a number of AI-powered image, text, and video generators started to appear on the market. Many human filmmakers were outraged or frightened. They felt that their work and their talent were being devalued, demeaned. But did they also recognise that this software would change their own human hardware and software?
What the cinematographers were going through, taxi drivers had gone through two decades before with Global Positioning System (GPS) software. The brain structures of veteran taxi drivers are demonstrably mappier than those of ordinary citizens. London taxi drivers have been obliged to complete an orienteering test called The Knowledge since 1865. Their brains wrap and fold around 320 urban routes and all the roads and landmarks around the starts and ends of those routes. A lineage of “brain cousins” across the centuries!
But the app-based rideshare drivers - what do their brains look like? We do not yet know.
It is highly probable their brains have reorganised in sympathy with the software that guides their daily routes. The synthesised ker-ching of a post-ride tip becomes hardwired to the muscles that reach for that second cereal bar in the petrol station shop.
The London cab driver looks on as the rideshare driver swoops in on fare after fare, guided by satellites and the networked needs of plugged-in passengers.
The filmmaker or creative technician who specialises in the spatial realms may wonder where this leaves her. The:
architectural or landscape filmmaker who builds worlds and narratives using space and directionality, or
set designer
may wonder if she should upgrade her inner mapping software.
How might that work?
We know a man has “permanently anchored” an electromagnetic compass to his chest. Part of the idea was to train or re-awaken part of his brain to instinctively recognise ‘northness’ even should the device be untethered. How long until the dolly operator is guided by subcutaneous sensors and processors, buzzing her along the most metaphysically appropriate route for a particular take of a complex shot? How long until she can dig out those implants, confident she can still sense the emotional northness of a scene with the accuracy of a migrating bird?
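What might that subcutaneous guidance feel like in practice? Here is a purely hypothetical toy sketch, in the spirit of the chest-anchored compass: it assumes an invented ring of eight vibrating actuators around the operator’s torso and picks which one should buzz to nudge her toward a target bearing. The function name and layout are illustration only, not any real device’s API.

```python
def actuator_to_buzz(heading_deg: float, target_bearing_deg: float,
                     num_actuators: int = 8) -> int:
    """Pick which of `num_actuators` (spaced evenly around the torso,
    actuator 0 at the operator's front) should vibrate to steer her
    toward the target bearing."""
    # Bearing of the target relative to the way the operator is facing.
    relative = (target_bearing_deg - heading_deg) % 360
    # Width of the arc each actuator is responsible for (45 degrees for 8).
    sector = 360 / num_actuators
    return round(relative / sector) % num_actuators

# Operator faces east (90); the 'emotionally north' camera mark lies west (270).
print(actuator_to_buzz(90, 270))  # → 4, the actuator at her back: turn around
```

A real implant would presumably modulate intensity and pulse pattern too, but the core loop is just this: compare heading to goal, stimulate the matching patch of skin, repeat until the route is second nature.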
The compass man was not alone. His colourblind colleague became an “eyeborg.” Sensors enable him to “hear” the full light spectrum (including infrared and ultraviolet). Perfect for the Steadicam operator on the prowl.
Another’s elbow was wired to vibrate when an earthquake strikes anywhere in the world. Absurd and yet super-intuitive. A butterfly flaps its wings in China and a costume supervisor spills her coffee on a film set in Bosnia and Herzegovina. You’re in tune with the planet!
As the film set evolves, software may infect and alter the filmmaker or dolly operator without them noticing. Just as London’s roads remoulded the taxi drivers’ brains.
Or the filmmaker may invite the software onboard, integrate it, assimilate it.
Or she may believe that she’s inviting the software in of her own free will - when, in truth, the mutations caused by prior software infections are (quite literally) calling the shots. Of course, this has been an issue for filmmakers since long before the invention of computers.
Inner boom operator
Imagine a boom pole with onboard software that guides the operator. Guides the operator to the clearest waves and signals. The most apt reverberations. All while communicating with the ‘cam-bot’ about the limits of the frame.
“Get back,” hisses the AI camera, seeing the boom pole graze the frame edge. “Alright!” whispers the boom pole. “The cam-bot says we need to move back, if that’s okay,” the pole suggests to its human operator. “Thanks, buddy,” thinks the boom operator, retreating to the buffer zone.
Thankfully, this whole dialogue is silent, relayed through code, LCD screen, vibration (the pole shivers as it approaches the danger zone), or electromagnetic resistance. And thought.
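The pole’s “shiver” could be as simple as a distance-to-intensity ramp. This is a speculative sketch of imaginary hardware; the function name, the buffer-zone width, and the metres-from-frame-edge input are all invented for illustration.

```python
def vibration_level(distance_to_frame_edge_m: float,
                    buffer_zone_m: float = 0.5) -> float:
    """Return a haptic intensity from 0.0 (silent) to 1.0 (full shiver).

    Silent while the pole tip is safely outside the buffer zone,
    ramping up linearly as it approaches the frame edge.
    """
    if distance_to_frame_edge_m >= buffer_zone_m:
        return 0.0  # comfortably out of shot: no warning
    if distance_to_frame_edge_m <= 0:
        return 1.0  # already in frame: maximum "get back!"
    return 1.0 - distance_to_frame_edge_m / buffer_zone_m
```

The operator never hears a word: the cam-bot reports the distance, the pole maps it to a buzz, and the hand holding the pole does the rest.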
Okay - so why do we need a human boom operator at all? Because the room would feel empty without her? Yes. But also for her flexibility. She may be more flexible:
Intellectually
Emotionally
Physically
than a smart, articulated robo/drone boom.
As such, there are at least four clear advantages to re-locating elements of the smart boom pole into the human operator.
Firstly, the boom operator has her own energy source. She’s a veritable furnace of legitimate and contraband snacks, trailer stew, and fizzy pop.
Secondly, she can utilise her human proprioceptive senses, refined over millions of years, to respond to guidance from her colleagues and their virtual assistants. Electrodes or embedded sensors and software could record or replicate precisely what the boom operator hears:
Electrodes could ‘harvest’ recorded sound from her own ears.
Circuitry could relay sound information from anywhere along the signal path: the outer ear, cochlea, or brain of the ‘boom operator’ - if she can even be called a “boom operator” at this point.
AI software could model the boom operator’s organic listening apparatus and ‘predict’ what she’s hearing, how she’s hearing it, and what’s distracting her in real-time.
In this scenario, the boom operator’s listening skills gain greater currency; you can imagine a world in which:
high-profile boom operators are literally ‘head’-hunted from rival productions,
therapists and investigators quit their jobs for lucrative boom gigs, and
the big studios genetically engineer larger-eared, more attentive listeners in whom to plant their proprietary smart Portastudio software.
Thirdly, re-locating elements of the smart boom pole into the human operator’s body could save precious space. The ultimate boom pole retraction.
And fourthly, this “human sound desk” could go “undercover” in a busy scene. Dressed as an extra or even draped in green screen cloth, the boom operator may mingle inconspicuously among the characters, any one of whom could be an undercover make-up assistant, poised to make quick, aesthetically-optimised touch-ups using her onboard smart cosmetics.
Future editing (fencing)
In the future, editing a film in augmented or virtual reality may be a bit like tennis and a bit like yoga.
But maybe editing should be less like tennis than fencing:
slicing not the film stock (or the digital brick) but the protective membranes that separate dimensions;
blood-letting for possibilities;
opening little pockets of causality in the fabric;
leaving little scars on the story.
What would your movie look like if you were forced to fight your rushes in a duel?
A crew made of vinyl
Only on the most enlightened2 film set is the social process of creativity the official priority.
But, as our minds, techniques, rivals, and narratives are reduced to computation, the value of our bodies - and the bodily experience of making a film in space and time and togetherness - inflates.
“Bring back vinyl!” comes the call from those who long to be touched. “Bring back magazines! Bring back coffee breath and bad knees straining for the director to yell Cut!”
What kind of funky bio-software makes people feel like that?
Please share your thoughts, queries, and exercises from this week’s lesson in the comments.
“Essence of reality” up for grabs in €6k Dutch XR residency
Have you been affected by the issues raised in today’s lesson?
Consider applying to Realities in Transition, a two-month collaborative residency at Rotterdam’s V2_ Lab for the Unstable Media!
“We invite artists, designers, 3D modelers, illustrators, directors, writers, photographers, sound-designers, creative coders and (web) developers, to join the co-creation of a new XR experience,” they say. “Since presentations of XR experiences often focus on the individual rather than incorporating the audience present in the exhibition space, this residency aspires to explore the possibilities of a communal XR experience.”
Themes include: Virtual, Augmented and Mixed Reality, Artificial Intelligence and Machine Consciousness, Data Privacy and Surveillance, Simulation Theory, Existential Risk and Future of Humanity. 🥳🎉
The only eligibility caveats are EUROPEAN RESIDENCY and TALENT.
The deadline is 1st May, 2024. There’s a budget of €6,000 to cover your fee, travel, accommodation, and subsistence. Great! Not bad.
“Every ‘new thing’ quickly becomes too much”
Panicking about all this AI stuff? Try refocussing on the distant mountains!
“AI hype operates at the speed of fashion, while AI itself can only move at the speed of infrastructure,” writes Mr. Salvaggio, who speaks a lot of sense about AI tech and culture. Knows what he’s talking about. In his newsletter last week, Salvaggio analogised the differing experiences of taking the bullet train vs. the regular train to the Future Shock of staring too closely at all the AI hype (whether positive or alarming or both).
“Despite running on parallel tracks,” he wrote, “the scenery changes with the pace of these trains. At high speeds, that which was closest to us was invisible, passing as a blur. I’d take photos with my 2010 digital camera and the images would be distorted, traveling too fast to catch on the sensor. On the slower train, you saw what you were missing: mostly, powerlines and lamp posts, the boring stuff of infrastructure. Focus a bit further out and your eyes don’t race. The mountains loiter while the fence disappears.”
(And that’s very like life, when you think about it…)
Likewise, AI tech progresses at a different pace depending on whether you concentrate on the hype, the infrastructure, the governance… all the way down to the limits of nature itself.
That doesn’t mean it’s all good news (some of those mountains could be volcanoes) nor bad news (still, lovely mountains). But it is a good call to recalibrate your panic levels. (Or enthusiasm levels, if that’s your bag). Do read Salvaggio’s entire post for more insight on every ‘pace layer.’ Or maybe don’t. It’s not that reassuring.
Next week, we’ll look at your software-powered cast: programming your actors to program their characters to program the plot. And what to do with the husk when you’re finished.
Class dismissed!
~Graeme Cole.
(Principal)
📹 Unfound Peoples Videotechnic | Cloud-based filmmaking thought. ☁️
🐦 Twitter | ⏰ TikTok | 📸 Instagram | 😐 Facebook | 🎞️ Letterboxd
1. just like this little fellow!
2. /unprofitable?