Experimenting with an AI-Powered Portfolio Video in Remotion
Static portfolios are useful, but they are a limited medium for conveying context, voice, and motion. I wanted to explore whether a short generated video could communicate work and personality faster without turning into gimmick-heavy content.
What I built
- A Remotion-based video project wired into a Next.js application structure.
- A composition workflow that can evolve into reusable scenes for project highlights and narration.
- A rendering setup designed for local work first, with a path toward Lambda-based rendering later.
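The scene workflow above is frame-driven: in Remotion, every animated value is computed as a pure function of the current frame number. As a minimal sketch of that model, the clamped linear mapping behind a simple fade-in looks like this (the helper name and ranges are illustrative, not Remotion's API surface):

```typescript
// Sketch of frame-driven animation: a value derived purely from the frame index.
// Mirrors the shape of a clamped linear interpolation a scene might use.
function interpolateFrame(
  frame: number,
  inputRange: [number, number],
  outputRange: [number, number]
): number {
  const [inMin, inMax] = inputRange;
  const [outMin, outMax] = outputRange;
  // Normalize the frame into [0, 1], clamping outside the input range.
  const t = Math.min(Math.max((frame - inMin) / (inMax - inMin), 0), 1);
  return outMin + t * (outMax - outMin);
}

// A one-second fade-in at 30 fps runs over frames 0..30.
console.log(interpolateFrame(0, [0, 30], [0, 1]));  // 0
console.log(interpolateFrame(15, [0, 30], [0, 1])); // 0.5
console.log(interpolateFrame(45, [0, 30], [0, 1])); // clamped to 1
```

Because every value is deterministic per frame, scenes stay previewable, testable, and safe to render in parallel.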
Technical decisions
- Started from a proven Remotion application structure so I could focus on storytelling and scene composition instead of infrastructure trivia.
- Kept the project intentionally experimental, since the right output format is still an open question.
- Used the repo as a sandbox for deciding where AI assistance adds value and where manual editing still wins.
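Concretely, the local-first rendering path comes down to the Remotion CLI; a sketch of the typical commands might look like this (the composition id and file paths are placeholders, not this repo's actual names):

```shell
# Iterate on scenes interactively in the browser.
npx remotion studio

# Render one composition locally to an MP4.
# "PortfolioVideo" and both paths are illustrative placeholders.
npx remotion render src/index.ts PortfolioVideo out/video.mp4

# The later Lambda path would use @remotion/lambda against a deployed
# function and serve URL (sketched here, not set up in this repo yet).
npx remotion lambda render <serve-url> PortfolioVideo
```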
Why this project matters
- It shows that I am willing to prototype new presentation formats instead of only repeating standard portfolio patterns.
- It demonstrates curiosity around video tooling, AI-assisted workflows, and content systems.
- Even as a work in progress, it is honest proof of experimentation rather than inflated claims about a finished product.
Scope note
This is a working prototype and exploration repo, not a fully launched product.
Links
- GitHub: https://github.com/neutral-Stage/ai-powered-portfolio-video
- Live URL: Not published
- Primary language: TypeScript
Takeaway
This project reflects the kind of work I enjoy most: shipping practical software, tightening developer and user workflows, and documenting the technical decisions clearly enough that another engineer can pick it up and keep moving.