Disco Elevator – Oopsie Daisy feat. Peder Pan

Oopsie Daisy is one of my new AI-assisted musical projects, exploring the intersection of classic disco fundamentals with contemporary electronic production. “Disco Elevator” marks my latest experiment, featuring my alter ego Peder Pan as creative collaborator: the pink bunny-suited figure who has appeared across multiple Stimulus projects as both visual anchor and creative catalyst.

The musical experiment channels the euphoric energy of classic disco while incorporating modern electronic production techniques that give the sound a distinctly contemporary edge. “Disco Elevator” embodies a philosophy of taking listeners on an unexpected journey — much like stepping into an elevator that doesn’t quite follow the laws of physics, or even life itself. The track draws inspiration from the pioneering work of disco legends like Chic, Donna Summer and Earth, Wind & Fire, while incorporating the playful electronic sensibilities found in artists like Justice and Daft Punk.
The collaboration with Peder Pan brings an additional layer of creative unpredictability to the mix. Known for his appearances across various Stimulus projects, Peder Pan represents the intersection of established character IP and experimental music creation. His influence can be heard in the track’s more adventurous sonic territories — the moments where conventional disco structure gives way to unexpected electronic flourishes and rhythmic surprises.

Why I finally decided to test the full AI pipeline

Sometimes the stars align in unexpected ways. When MidJourney dropped their significantly improved video generation capabilities around the same time Suno finally cracked the code on audio quality, I knew I had to test these tools against a real creative challenge. For the first time, a complete AI-assisted music video seemed genuinely achievable rather than merely aspirational.

Suno’s Quality Revolution proved to be the primary enabler for this project. Previous iterations of AI music generation suffered from two critical limitations that rendered them unsuitable for serious creative work: inconsistent composition and song structure, and most notably, persistent audio artifacts. Earlier versions were plagued by a characteristic high-pitched ambient white noise that permeated every track, creating an unmistakable “AI signature” that broke immersion immediately.

The latest Suno update represented a quantum leap in both areas. Not only did the platform demonstrate a more sophisticated understanding of song structure — properly developing verses, choruses, and bridges with musical logic—but the audio quality reached a threshold where the technology became invisible. The elimination of those telltale artifacts meant that for the first time, an AI-generated track could stand alongside human-produced music without immediately revealing its origins.

MidJourney’s video capabilities completely surprised me—and probably no one expected this level of quality this quickly. The new video module is genuinely fast, and using image references actually enhances otherwise lesser images. What really impressed me was the movement and character consistency: no morphing into something else, facial expressions that feel natural and real, and compositions that don’t end up with subjects cut off or quirky clip endings.
The best part? Almost every rendered video clip is actually workable. You’re not spending several evenings trying to get a few decent clips like with previous AI video tools. Your real work becomes crafting prompt descriptions for what should actually happen in each scene. This was my first experience with a GenAI video tool that actually delivers on its promises. Combined with the established visual reference library of Peder Pan from previous projects, this created an unprecedented opportunity to test the full AI creative pipeline.

This wasn’t just about using AI tools—it was about testing whether they had matured enough to support a complete creative vision from concept to finished product.

It’s still a lot of manual work behind the magic

While “Disco Elevator” leverages multiple AI tools throughout its production pipeline, the reality of working with these technologies reveals a crucial truth: every part of the production requires manual work, real skills, and strategic thinking. AI tools function as powerful accelerators and creative partners, but they don’t eliminate the need for human expertise — they amplify it.

And the planning still matters a lot

The project began with traditional creative planning that no AI tool could handle: developing a basic storyline, mapping out where verses and choruses would fall, marking beat tempos, and identifying key musical moments that needed visual emphasis to create a coherent audio-visual experience. This foundational work required understanding both musical structure and visual storytelling — skills that remain fundamentally human.

The storyline turned out to be crucial. Even if you start with just a vague idea of what should happen, sooner or later everything comes down to how strong that storyline is. That’s why it’s important to nail this down as early as possible in a project like this. If you’re planning to spend weeks of evening work, you don’t want to shoot from the hip—unless you’re prepared to spend double the time catching up later.

Tool Integration Strategy:

    • Claude: Lyrical brainstorming and creative direction
    • MidJourney: Character-consistent visual generation in 2:3 portrait format (optimized for mobile viewing, since 50% of video consumption happens on mobile devices)
    • Magnific AI: Image enhancement and quality refinement
    • Suno: Complete song composition and production
    • Cubase 14: Audio post-production, intro trimming, LFO/EQ sweep + reverb effects to create the “closed room” sound, plus mastering with compressor and limiter to meet streaming loudness standards
    • CapCut: Video editing and post-production
    • Affinity Photo/Design: Graphics and final visual polish
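The mastering target in the Cubase step is worth unpacking: streaming platforms commonly normalize playback to around -14 LUFS, so the final compressor/limiter pass aims the track at that level. As a rough, stdlib-only Python sketch of the underlying gain math (the LUFS measurement itself is assumed to come from a meter, and this is not the actual Cubase workflow):

```python
# Rough illustration of loudness-normalization math, not the real
# Cubase mastering chain. The integrated loudness (LUFS) is assumed
# to come from a meter; here it is hard-coded for the example.

def gain_to_target(measured_lufs, target_lufs=-14.0):
    """Linear gain factor that moves measured loudness to the target."""
    gain_db = target_lufs - measured_lufs
    return 10 ** (gain_db / 20)

def apply_gain(samples, gain):
    """Apply linear gain, hard-clipping to the valid float range [-1, 1]."""
    return [max(-1.0, min(1.0, s * gain)) for s in samples]

# A track measured at -18 LUFS needs +4 dB to reach the common -14 LUFS
# streaming target; in practice a limiter catches the resulting peaks.
gain = gain_to_target(-18.0)
louder = apply_gain([0.1, -0.5, 0.8], gain)
```

In a real mastering chain the limiter does this job transparently; the sketch only shows why a quieter master gets turned up (and a hot one turned down) by the streaming platforms.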

The video editing — Where I started making a mess

The most significant challenge — and learning experience — came during the video editing phase. The abundance of AI-generated content created a deceptive sense of progress that masked the complexity of proper post-production workflow.

The initial mistake: I began adding transitions and effects far too early in the process, before establishing the foundational structure. While this approach occasionally sparked creative ideas, it primarily created chaos. With dozens of video clips requiring integration, starting with effects rather than structure led to an increasingly messy timeline that became difficult to navigate and modify.

How I finally got organized: The project transformed when I stepped back and approached it systematically:

    1. Storyline first: Having a solid narrative foundation made all other decisions clearer and prevented endless revisions
    2. Trim first, enhance later: All video clips were carefully trimmed and organized before any effects were applied
    3. Align with musical structure: Clips were precisely placed within the established timeframes of verses, choruses, and instrumental breaks
    4. Establish stable foundation: Only after achieving a coherent basic cut did I begin applying transitions and effects

This restructuring created what I call “stable experience flow”—a solid foundation that made it much easier to apply the right amount of enhancement without overdoing it. While I may have still overreached slightly in the final effects application, the mess factor was dramatically reduced compared to the chaotic earlier approach.

 

What actually worked and what didn’t

Working through this complete AI-assisted pipeline revealed several important insights:

 

MidJourney video breakthrough:
    • Almost every generated clip was actually usable (a first for AI video tools)
    • Character consistency stayed solid without morphing issues
    • Image references significantly enhanced even lesser source images
    • Natural facial expressions and proper composition framing
    • Focus shifted to crafting better prompts rather than generating dozens of clips hoping for one good one

Suno’s audio quality revolution:
    • Complete elimination of the high-pitched white noise that plagued earlier versions
    • Sound artifacts that made tracks instantly recognizable as AI-generated are gone
    • Song structure and composition finally follow musical logic
    • Audio quality reached the threshold where the technology becomes invisible

Workflow discoveries:
    • Storyline preparation prevents exponential time waste later in the project
    • Starting with effects before structural organization creates chaos quickly
    • Having established character references (like Peder Pan) dramatically accelerates AI visual workflows
    • The “stable experience flow” principle: solid foundation first, enhancement second

Integration insights:
    • Different AI tools complement each other when approached strategically
    • Format decisions matter: choosing 2:3 portrait over traditional 16:9 because mobile viewing accounts for 50% of consumption
    • Professional audio post-production remains essential even with high-quality AI generation
    • Human workflow knowledge becomes more critical, not less, when managing AI-generated abundance
    • Creative vision and aesthetic judgment remain the binding elements that make everything coherent

Final thoughts: Where this leaves us

“Disco Elevator” represents a successful test of the current AI creative pipeline, but more importantly, it reveals where we stand in the ongoing evolution of human-AI collaboration. The technology has reached a threshold where serious creative work is possible, but it demands more sophisticated human skills, not fewer.

The project succeeds not because AI tools did everything automatically, but because they enabled a level of creative experimentation and production value that would have been impossible with traditional resources. Peder Pan’s pink bunny suit may be digitally generated, but the creative vision that brings him to life in this disco universe remains distinctly human.

 

Listen to “Disco Elevator” and decide for yourself: How close are we to seamless human-AI creative collaboration?

How this project was made…

This music video was created through a collaboration between myself (Michael Käppi) and multiple AI tools: Claude for creative brainstorming, Suno for music generation, MidJourney for visual content, Magnific AI for enhancement, CapCut for editing, and Affinity suite for final polish. The Peder Pan character builds on established visual references from previous Stimulus projects, demonstrating how character IP can accelerate AI-assisted creative workflows.

The Northern Link – An Elof tale

After our first playful experiment with GenAI in “The Fisheries Agency – An Elof Tale,” Peter Öberg and I couldn’t resist diving into a new Elof story. This time, with “The Northern Link,” we set out to create a short film with a more complex storyline, voice narration, and historical references. Inspired by the ongoing infrastructure project in Gothenburg, our fictional character Elof takes on his own ambitious tunnel-building attempt from the late 1930s. Here’s how we brought this nostalgic, humorous tale to life using GenAI tools.

A few weeks after wrapping up our first GenAI film project, “The Fisheries Agency – An Elof Tale,” my friend Peter Öberg and I found ourselves brainstorming ideas for the “next” Elof adventure. Partly driven by curiosity and partly by a desire to push GenAI tools further, we wanted to see how well AI could handle a more complex short film, incorporating dialogue, voice narration, and a more developed storyline.

This time, we aimed for a film just under two minutes to keep it manageable. Our main goal was to experiment with new techniques, but we also wanted to have some fun with it. We didn’t invest too heavily in scriptwriting or serious plot development—this was about testing the tools, learning, and having a laugh along the way.

The Storyline and Setting

Once again, Elof took center stage in our tale. For this story, we drew inspiration from current events in our hometown of Gothenburg, where an ongoing infrastructure project called Västlänken (The West Link) is reshaping the city landscape with a tunnel construction that’s stirred up public opinion due to delays and budget overruns. We decided to create a fictional backstory for Elof, imagining that he had attempted his own version of this project back in the 1930s, dubbing it “The Northern Link.” Of course, his endeavor was destined to fail—but in a humorous, nostalgic way that locals might recognize.

To make the story feel authentic, we focused on capturing the details of late-1930s Gothenburg, sourcing old photographs and adding “filler” clips of the city and its surroundings. For the narration, we adopted an old journalistic style, using a voice we trained in ElevenLabs to resemble the tone of classic Swedish newsreels. We also chose a brass orchestra soundtrack to capture the feel of the era, along with historically accurate fonts for the title graphics. These details helped transport viewers to another time, making the short film feel true to its setting.

Developing Cinematic Techniques

This time, we went beyond the static, silent film approach of the first Elof movie. We wanted to mimic the evolving cinematic techniques of the era, so we analyzed old journal films and noticed a shift in how scenes were presented—more filler shots, smoother transitions, and a balance between narration and action. We tried to capture this in our own scenes, arranging them to keep the narrative flow without overcrowding the main footage with voiceovers.

The entire project took us about 16-20 hours, as we worked with new elements like voice narration and advanced editing. The process was an eye-opener, not only for understanding AI’s capabilities but also for appreciating the art of vintage cinematic storytelling. The end result was a charming little short film that blended nostalgic elements with our own playful take on historical events.

The Story Behind “Elof”

Who is Elof? Elof was Peter’s grandfather—a figure Peter met only a few times in his childhood but heard many stories about over the years. Known for his entrepreneurial spirit and unique way of doing things, Elof had the perfect personality for a quirky, fictionalized portrayal. As Peter’s alter ego “Peder Pan” shares some of these traits, Elof became an ideal character to bring to life in our nostalgic cinematic experiments. “The Northern Link – An Elof Tale” became a playful exploration of Elof’s character, merging my love for graphic design with the curious possibilities of GenAI. It’s been a journey in combining past and present, humor and nostalgia, all through the lens of AI-assisted creativity. Say hello to my creative friend & ideator Peder Pan at: YouTube

The Fisheries Agency – An Elof tale

“The Fisheries Agency – An Elof Tale” began as a late-night exploration of GenAI video tools with my friend Peter Öberg. Inspired by the nostalgic charm of old Swedish silent films, we decided to create a one-minute silent film, featuring Peter’s larger-than-life, fictionalized grandfather, Elof. With GenAI as our tool and a vintage aesthetic as our guide, here’s how we turned a simple idea into a quirky short film, complete with classic silent-movie vibes.

The Fisheries Agency – An Elof Tale started as a spontaneous project born from a late-night creative jam session with my friend, Peter Öberg, in autumn 2023 (September 29, to be exact). Our conversation that night drifted toward the possibilities of GenAI video tools, and soon enough, we were brainstorming wild ideas about the kinds of short films we could create with these tools. One tool we used, Pika, was still in beta, with several limitations, including an embedded watermark on all clips, but that didn’t stop us.

We couldn’t resist the urge to create something simple yet unique. Inspired by old Swedish newsreel clips that played before feature films in theaters from the 1920s to the 1950s, we envisioned our project as a short, silent-style film. We aimed to capture that vintage aesthetic, using intertitles (text slides) to drive the storyline and a crackling old record for background music. This approach gave us both a theme and a set of creative boundaries, allowing us to turn the limitations of AI into a stylistic choice.

Crafting the Storyline and Style

Peter suggested his grandfather “Elof” as the protagonist. Elof, as Peter remembers him from family stories, was a bit of a character—a spirited entrepreneur with a reputation for doing things his own way. We set ourselves a one-minute time limit to keep the story focused and fun. With this loose structure, we brainstormed scenes and plot points to showcase Elof’s personality.

With the basic storyline sketched out, we split up the tasks. I created the initial images and generated clips, while Peter assembled the movie’s foundation by organizing frames, adding text slides, and working on sound. Once my clips were ready, I sent them to Peter, who arranged them into scenes, added sound effects, and applied vintage-style filters for that classic silent film look. In just 6-8 hours—from ideation to final product—we had a one-minute “trailer” for The Fisheries Agency, a quirky homage to Elof’s fictional exploits.

The Story Behind “Elof”

Who exactly is Elof? Elof was Peter’s grandfather, whom Peter met a few times as a child before he passed away. Over the years, Peter heard stories of Elof’s colorful personality and unconventional ways, a skilled entrepreneur who wasn’t afraid to take risks. For this project, Elof became an ideal muse—a character with enough charisma to carry a quirky story. Interestingly, Elof shares some personality traits with “Peder Pan,” Peter’s alter ego: a fantasy persona with Peter’s characteristics amplified a thousandfold.
The Fisheries Agency – An Elof Tale turned into a playful exploration of Elof’s character in a vintage cinematic style, blending my love for graphic design with the experimental potential of GenAI. Together, Peter and I brought Elof back to life, capturing a moment of Swedish nostalgia and timeless humor. Say hello to my creative friend & ideator Peder Pan at: YouTube
