Show HN: Text-to-3D Motion Generator (Hunyuan 1.0 wrapper)

Show HN (score: 5)
Found: January 02, 2026
ID: 2878

Description

Hi everyone,

I built a UI for the new open-source Hunyuan Motion model to generate 3D animations from text: https://hy-motion.ai

It generates BVH files instantly. I'm trying to bridge the gap between "cool AI demo" and "useful game dev tool".

Question for 3D devs/animators:

If you were to use this in production, what is the single biggest missing feature?

1. Export Pipeline: Auto-conversion to FBX for Unity/Unreal?
2. Motion Fusion: Blending multiple prompts into one long sequence?
3. Rig Variety: Support for non-humanoid skeletons?
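
To make option 1 concrete, here's a minimal sketch of what an auto-conversion step could look like, assuming Blender is installed and run headless on the user's (or a server's) side; the script name, file paths, and invocation are illustrative, not something the site ships today:

    # bvh_to_fbx.py -- rough BVH-to-FBX conversion sketch using Blender's bundled Python API
    # Run headless:  blender --background --python bvh_to_fbx.py -- input.bvh output.fbx
    import sys
    import bpy

    # Blender passes everything after "--" through to the script.
    bvh_path, fbx_path = sys.argv[sys.argv.index("--") + 1:][:2]

    # Start from an empty scene so only the imported armature gets exported.
    bpy.ops.wm.read_factory_settings(use_empty=True)

    # Import the generated BVH as an armature with its keyframed motion.
    bpy.ops.import_anim.bvh(filepath=bvh_path)

    # Export FBX with baked animation so Unity/Unreal can retarget the clip.
    bpy.ops.export_scene.fbx(filepath=fbx_path, bake_anim=True)

Something along those lines, wrapped in a server-side job, would let the site offer an FBX download next to the BVH, which is probably what most Unity/Unreal users expect.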

Feedback is much appreciated.

More from Show

Show HN: Cachekit – High performance caching policies library in Rust

Show HN: AI video generator that outputs React instead of video files

Hey HN! This is Mayank from Outscal with a new update. Our website is now live. Quick context: we built a tool that generates animated videos from text scripts. The twist: instead of rendering pixels, it outputs React/TSX components that render as the video.

Try it: https://ai.outscal.com/

Sample video: https://outscal.com/v2/video/ai-constraints-m7p3_v1/12-01-26-18-47-41

You pick a style (pencil sketch or neon), enter a script (up to 2000 chars), and it runs: scene direction → ElevenLabs audio → SVG assets → Scene Design → React components → deployed video.

What we learned building this:

We built the first version on Claude Code. Even with a human triggering commands, agents kept going off-script — they had file tools and would wander off reading random files, exploring tangents, producing inconsistent output.

The fix was counterintuitive: fewer tools, not more guardrails. We stripped each agent to only what it needed and pre-fed context instead of letting agents fetch it themselves.

Quality improved immediately.

We wouldn't launch the web version until this was solid. Moved to Claude Agent SDK, kept the same constraints, now fully automated.

Happy to discuss the agent architecture, why React-as-video, or anything else.

Show HN: SubTrack – A SaaS tracker for devs that finds unused tools

Hi HN,

I built SubTrack to help teams find unused SaaS tools and cloud resources before they silently eat into budgets.

The motivation came from seeing how hard it is to answer simple questions:
– Which SaaS tools are actually used?
– Which cloud resources are idle?
– What will our end-of-month spend look like?

SubTrack connects to tools like AWS, GitHub, Vercel, and others to surface unused resources and cost signals from one place. Recently I added multi-account support, currency localization, and optional AI-based insights to help interpret usage patterns.

This is an early-stage project and I’m actively iterating. I’d really appreciate feedback—especially from people managing cloud or SaaS sprawl.

Show HN: A MCP for controlling terminal UI apps built with bubbletea and ratatui

Show HN: A MCP for controlling terminal UI apps built with bubbletea and ratatui so you can start vibe-coding your ad-hoc terminal dashboard. With session replay and mouse click support built-in.
