PROJECT OVERVIEW
When Tuff Gong International approached us with Soul-Rebel Marley's "Holy Father," they faced a formidable creative challenge: their original vision—a spiritual pilgrimage filmed on location at the ancient rock-hewn churches of Lalibela, Ethiopia—had stalled due to budget constraints and filming restrictions from the Ethiopian government and church elders. With only B-roll footage and a compressed 2-month timeline, the project seemed destined to compromise its ambitious vision.
Instead of scaling back, we proposed a bold solution: combine LED wall technology with AI-generated environments to create broadcast-quality representations of Ethiopia's sacred landscapes—all shot in a Glendale studio. Using Google Veo 2, Kling 3.0, Runway ML, and Midjourney, we generated over 100 styleframes, built immersive volumetric environments, and delivered a cinematic music video that premiered on YouTube (100K+ views), aired on Jamaican broadcast television, and was displayed on OOH billboards across Kingston—proving AI-native production could achieve what traditional methods couldn't afford.
Director/Producer: Martyn Watts (BRONKO)
Artist: Soul-Rebel Marley
Label: Tuff Gong International
Producer: Joseph I
Director of Photography: Jacob Caron
Production Designer: Jasmine Nicholes
Wardrobe: Joseph I
Colorist: Martyn Watts
AI Workflow Design: Martyn Watts
Given the insurmountable challenges of filming in Ethiopia for this project, our approach to "Holy Father" shifted from pursuing complete authenticity to achieving heightened realism—a hybrid methodology that combined cutting-edge AI-generated environments and LED wall technology with conventional filmmaking techniques. By grounding digitally constructed settings with authentic Ethiopian art direction, wardrobe, props, and color palettes, we could capture the spiritual essence of Lalibela's sacred landscapes without the logistical and financial impossibility of location filming.
The song's themes—spiritual pilgrimage, divine guidance, ancestral legacy—demanded visual grandeur that honored Ethiopian Christian tradition. Rather than compromise that vision, we built it from scratch in a Glendale studio.
The Marley family's initial concept focused on performance footage in front of Lalibela's churches and walking sequences through the region—straightforward, but lacking narrative depth and iconic symbolism.
The original plan stalled when three critical barriers emerged:
Budget Reality: Flying a capable film crew to Ethiopia, securing equipment, and coordinating multi-location logistics would exceed $50K—prohibitive for an independent release
Access Restrictions: The Ethiopian government and Lalibela church elders denied filming permits at sacred sites
Timeline Pressure: Release date delays created urgency to deliver without sacrificing the video's spiritual and cinematic scope
Initial compromises included shooting entirely on green screen—a solution that would have reduced the project's grandeur to flat, generic backgrounds. Producer Joseph I approached us with the dilemma: Could we salvage the vision without destroying the budget or timeline?
Beyond solving the technical challenge, I recognized an opportunity to elevate the creative vision. Rather than simply recreating performance footage, I developed a new treatment from scratch—building a hero's journey narrative complete with symbolic trials (the serpent and lion), divine encounters (the bishop's blessing), and spiritual transformation (Soul's vision and coronation as king). This narrative structure transformed a straightforward performance video into a cinematic pilgrimage that honored Ethiopian Christian tradition while delivering the emotional and visual impact the song deserved.
Rather than compromise the creative vision, we proposed an innovative approach that would expand the scope while reducing costs: AI-generated environments displayed on volumetric LED walls, combined with practical cinematography and strategic art direction.
Week 1-2: Treatment development & AI styleframe iteration (1,200+ variations)
Week 2: Production shoot (1.5 days, 40 setups)
Week 3-8: Post-production (editorial, VFX compositing, color grading, finishing)
Rather than adapting the original concept, I built an entirely new treatment from scratch, structuring Soul's journey as a classical hero's pilgrimage:
Act I: Soul walks the Ethiopian highlands, praying for guidance at Biete Medhane Alem
Act II: Trials in the cave (serpent and lion), divine footprints leading him forward
Act III: Bishop's blessing, vision of ancestors, ceremonial coronation with incense and umbrellas
Act IV: Celebration before the Debre Libanos Waterfall, community rejoicing
This narrative framework gave the video iconic symbolism and emotional arc—transforming simple performance footage into a spiritual epic.
Once the narrative was locked, I populated the treatment with over 100 AI-generated styleframes, iterating continuously (over 1,200 iterations in total) through pre-production to refine the look and approach—making on-set execution as guesswork-free as possible.
Tools tested: Google Veo 2, Runway ML, Midjourney, and Kling 3.0 (we experimented with multiple platforms to determine which delivered the best results for each scene type)
We generated styleframes exploring:
Ethiopian highland landscapes at golden hour
Interior architecture of Lalibela's Biete Medhane Alem church
Dramatic cave systems with volumetric lighting
Cosmic "heavenly" environments with god rays and celestial elements
The Debre Libanos Waterfall surrounded by celebration
Thunder and lightning storm backdrops
Demera bonfire ceremonies with sparks and smoke
Reference influences: Star Wars (epic scale), Ben-Hur (spiritual grandeur), 300 (dramatic lighting), Gladiator (throne room majesty)
Color palette strategy: Earth tones, deep blues, reds, whites—colors of nature balanced between warm and cold to evoke spiritual transcendence
Approval process: Virtual storyboard sessions with Soul-Rebel, Joseph I (producer/wardrobe), and department heads, iterating on lighting, color palette, and spiritual authenticity.
Once styleframes were approved, we generated full-motion 4K AI backgrounds tailored to each narrative beat:
Google Veo 2 for:
Physics-accurate water simulations (light refraction, surface tension)
Atmospheric phenomena (fog rolling through highlands, dust particles in god rays)
Natural movement (clouds drifting, fire flickering)
Kling 3.0 for:
Dramatic cinematic lighting (chiaroscuro, rim lighting, volumetric beams)
Dynamic action sequences (lion emerging from darkness, lightning strikes)
Elements feature: Multi-angle object mapping to maintain consistency of the lion, throne, and architectural details across multiple shots
Why This Approach Worked:
Stock libraries lacked specific Ethiopian sacred architecture
AI enabled precise art direction (exact lighting angles, color temperature, composition)
Cost efficiency: API fees for 40+ custom backgrounds = $2K vs. $15K+ for stock licensing and compositing labor
Location: XRStage LA (Glendale) - 30-foot panoramic curved 4K LED wall
Timeline: 1.5 days of principal photography (40 setups)
Camera: Sony Venice (exceptional dynamic range for LED wall integration)
On-set workflow:
Pre-loaded AI-generated backgrounds onto LED wall playback system
80% of setups used AI environments; the remaining 20% displayed a green screen on the LED wall, preserving compositing flexibility in post
No LUTs created in advance—shot in S-Log3 for maximum latitude
Lighting designed with modularity: setups could change "with the flip of a switch" to accommodate rapid scene transitions
Color temperature matching: Practical lights motivated by LED background colors (warm candlelight glow, cool storm atmospherics, etc.)
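The color-temperature matching described above can be sketched numerically. The snippet below is a minimal illustration, not part of the production pipeline: it uses the widely circulated Tanner Helland blackbody curve fit to approximate the RGB bias of a given Kelvin value, which is handy for previsualizing how a 2700K candle glow differs from ~6500K storm ambience. The helper name `kelvin_to_rgb` is our own.

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate 8-bit RGB for a blackbody color temperature.

    Tanner Helland's curve fit: good enough for previz and gel
    matching, not colorimetrically exact.
    """
    t = kelvin / 100.0
    # Red channel saturates below ~6600K
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
    # Green rises logarithmically toward neutral
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    # Blue is absent in very warm sources
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: max(0.0, min(255.0, v))
    return tuple(round(clamp(v)) for v in (r, g, b))

print(kelvin_to_rgb(2700))  # warm candlelight: strong red/orange bias
print(kelvin_to_rgb(6500))  # cool daylight: near-neutral white
```

On set, the same idea plays out in reverse: the LED background's dominant temperature tells you which direction to bias the practicals.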
Critical advantage of LED vs. green screen: The LED wall cast interactive light onto Soul-Rebel and the set—creating authentic rim lighting, reflections in eyes, and environmental spill that's nearly impossible to replicate convincingly with green screen compositing.
DP Selection: I specifically hired Director of Photography Jacob Caron for his LED wall experience and enthusiasm about the AI integration, ensuring technical fluency on set.
Timeline: 6 weeks
Editorial: Assembled narrative arc emphasizing Soul's transformation from pilgrim to crowned king
VFX/Compositing (40% of shots required keying work). Of those shots:
60% used Delta Keyer (standard chroma key refinement)
30% used DaVinci Resolve Magic Keyer (AI-enhanced edge detection)
10% used After Effects KeyLight 1.2 (complex hair/motion blur scenarios)
Upscaling/Enhancement:
All AI-generated footage required upscaling to broadcast-quality 4K
Custom "upscaling cocktail": Astra + Topaz Video AI (iteratively tested blend ratios for optimal sharpness without artifacts)
Color Grading Workflow:
Color Space Transform (S-Log3 → DaVinci Wide Gamut → Rec.709)
Exposure/contrast balancing
Selective color (emphasizing earth tones, deep blues, spiritual whites)
Midtone detail enhancement
Final sharpening and focus refinement
Audio: Traditional vocal recording with strategic use of ElevenLabs for scratch vocal tracks during pre-visualization
Narrative Context: Soul is trapped in a cave between a serpent behind him and darkness ahead—when suddenly, a lion charges forward from the shadows.
From Treatment: Slide 12
Traditional Approach:
Green screen shoot with Soul reacting to nothing
License stock lion footage or hire animal wrangler ($5K-$10K)
Composite lion with manual rotoscoping (8-12 hours)
Hope the lighting/perspective matches
AI Approach:
Generated close-up of lion walking forward through volumetric cave lighting using Kling 3.0 (dramatic rim lighting, realistic fur movement, intimidating eye contact)
Used Kling Elements: Mapped 4-5 reference angles of lion head/body to maintain consistency across 3 separate shots (wide, medium, close-up)
Displayed on LED wall during shoot—Soul's genuine reaction to seeing the massive lion image created authentic performance
Minimal compositing required (just edge refinement)
Why It Worked:
$200 in API costs vs. $8K+ traditional
Lion's lighting perfectly matched cave environment (single unified render)
Soul's performance authenticity elevated by seeing real imagery instead of green void
Narrative Context: Soul performs against a stormy sky backdrop symbolizing spiritual struggle—"when the thunder rolls, dem a falter."
Traditional Approach:
Green screen + stock storm footage (generic, overused)
Separate lightning VFX elements composited in post
Multiple layers = complex edge work around Soul's regal costume details
AI Approach:
Generated custom "throne room in the clouds" environment using Veo 2 (physics-accurate cloud movement, atmospheric perspective)
Added dynamic lightning strikes using Kling 3.0 (precise timing, dramatic volumetric illumination)
LED wall displayed a unified environment—lightning flashes naturally illuminated Soul's gold/black robe with interactive light
Why It Worked:
Unified lighting (no "pasted-on" green screen look)
Custom-designed storm (not generic stock)
Interactive practical lighting from LED enhanced cinematic realism
Narrative Context: Soul performs in front of Lalibela's iconic St. Giorgi church, surrounded by hundreds of worshippers holding candles, bishops with ceremonial umbrellas, and sacred incense.
Challenge: Couldn't film at the actual location; needed to recreate architectural accuracy and spiritual atmosphere.
From Treatment: Slide 21
AI Approach:
Generated St. Giorgi's distinctive cross-shaped architecture using Midjourney styleframes → refined with Kling 3.0 for cinematic lighting
Created a "sea of candlelight" background using volumetric particle simulation
LED wall provided warm, flickering ambient light across Soul's costume and face
Why It Worked:
An architecturally accurate representation honored Ethiopian heritage
Warm LED glow created an authentic candlelit atmosphere, impossible with a green screen
Soul's team and Ethiopian cultural consultants approved the scene’s authenticity
Production Constraint: Studio fire safety regulations prohibited lighting beeswax candles during the St. Giorgi procession scene—but extinguished candles would destroy the spiritual atmosphere and visual authenticity essential to the ceremony.
The Challenge:
Traditional VFX flame compositing is notoriously difficult and time-intensive:
Stock flame elements look generic and "pasted on," lacking natural flicker variation
Manual rotoscoping of organic flame movement = 20+ hours per shot for clean edges
Matching flame color temperature and interactive flicker to LED wall ambient lighting requires expert compositing skills and extensive trial-and-error
AI-Native Solution:
Rather than settle for stock elements or labor-intensive traditional VFX, I developed a streamlined hybrid AI workflow:
Step 1: AI Flame Generation
Generated photorealistic candle flames with natural flicker, varied intensity, and authentic warm color temperature (2700K-3000K range matching beeswax candles)
Step 2: Magic Keyer Rotoscoping
Used DaVinci Resolve's AI-enhanced Magic Keyer to cleanly isolate flames with edge detection in minutes (vs. hours of manual rotoscoping). The AI recognized the organic flame shapes and automatically refined the edge work frame-by-frame.
Step 3: Precision Tracking & Compositing
Composited flames onto each beeswax candle held by the talent
Tracked hand movement, perspective shifts, and talent blocking across 15 shots
Matched flame scale and orientation to candle angle in each frame
Step 4: Interactive Lighting Integration
Added matching flicker illumination on talent faces using selective color grading and luminance keys, simulating how candlelight would naturally cast warm, dancing light across skin tones. This "interactive lighting" pass sold the realism, making it appear the flames were genuinely lighting the scene.
Step 5: Detail Enhancement
Composited glowing embers at the candle wick tips beneath flames, adding a subtle layer of realism that reinforced the practical effect illusion.
Results:
15 shots completed in 3 hours (vs. 40+ hours traditional VFX workflow)
Indistinguishable from practically lit candles in the final output
Passed Ethiopian cultural consultant review for spiritual authenticity
Solved critical production constraint without creative compromise
This demonstrates the power of hybrid workflows: AI accelerates the tedious parts (generation, rotoscoping) while traditional compositing skills handle the nuanced finishing (tracking, interactive lighting, detail work). The result? Professional-grade VFX delivered at a fraction of traditional timelines and costs.
YouTube: 100,000+ views
Jamaican Broadcast Television: Aired nationwide
Out-of-Home: Billboard placements across Kingston, Jamaica
Timeline: 8 weeks (concept to delivery) vs. 16-24 weeks traditional
Shoot Days: 1.5 days (40 setups) vs. an estimated 5-7 days traditional
Budget: $17K final budget (66% reduction vs. $50K+ location shoot + VFX composite approach)
Technical Discovery: Working with AI-generated backgrounds on LED walls proved easier than anticipated. The way LED environments cast interactive light onto talent added an immersive, authentic quality nearly impossible to achieve with green screen—even with expert compositing. The technology elevated realism rather than creating a "digital" look.
Client Response: The Marley family and Tuff Gong executives were thrilled not just with the final product, but with the on-set experience. Seeing their vision materialize in real-time on the LED wall—rather than imagining it months later in post—created confidence and creative momentum.
Extended shoot schedule: Add half a day for additional coverage and experimentation
LED brightness testing: Conduct more detailed pre-production tests of background brightness levels to optimize sensor latitude
Backup environment variations: Generate 2-3 alternate versions of each background for spontaneous creative pivots on set
Storyboard everything down to the smallest detail.
AI production demands more pre-visualization than traditional shoots, not less. We prompt-engineered lighting direction, wardrobe interactions, and art department specifications into every styleframe (1,200+ iterations), then shared them with department heads for iterative feedback. By the time we stepped on set, every shot was pre-visualized, every LED background was rendered and tested, and our modular lighting grid could adapt instantly.
The discipline of detailed pre-production transforms AI from "experimental tool" into "production-grade workflow."