Got the AI working, for now anyway
Got it moving with Talking Head Anime and iFacialMocap
Anyone who knows their way around a PC a bit can do this
It's wild that tech like this has become so easy to just pick up and use
I thought WebCamMotionCapture plus Vroom would be the winning combo, but I couldn't get WCMC to read iFacialMocap, so I'm going to sulk in bed forever...
Seriously, this might be at the point where I just have to go back to "preparing" status for a while... 👼
I have to call it a night while there are still lots of little issues, but at least I can use the iFacialMocap software for PC to test in Unity without having to set the blendshapes up again after every rework
VBridger/iFacialMocap VTubers, I'm having lag problems, if anyone knows any solutions! After about 30 min to 1 hr, my phone heats up and seems to start lagging on the transfer of data from IFM to VBridger. This seems to be the phone's fault. Has anyone experienced this? Thanks 💕
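A rough way to check whether the phone side really is the bottleneck is to watch how steadily the tracking packets arrive at the PC. The sketch below is only illustrative, not from the posts above: it assumes iFacialMocap is streaming over UDP to this machine and that 49983 is the destination port (verify in the app's settings), and it has to run instead of VBridger or the official PC receiver, since only one program can bind the port at a time.

# Diagnostic sketch (assumptions noted above): log how evenly iFacialMocap
# tracking datagrams arrive, so a heat-throttled phone shows up as growing gaps.
import socket
import time

PORT = 49983      # assumed default iFacialMocap UDP port -- verify in the app
WINDOW = 120      # report roughly every 120 packets (~2 s at 60 fps)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
print(f"listening on UDP {PORT} ... point iFacialMocap at this PC's IP")

last = None
gaps = []
while True:
    data, addr = sock.recvfrom(65535)   # one tracking frame per datagram
    now = time.monotonic()
    if last is not None:
        gaps.append(now - last)
    last = now
    if len(gaps) >= WINDOW:
        avg = sum(gaps) / len(gaps)
        print(f"from {addr[0]}: avg gap {avg * 1000:.1f} ms "
              f"(~{1 / avg:.0f} fps), worst gap {max(gaps) * 1000:.1f} ms")
        gaps.clear()

If the reported gaps stay steady while VBridger still stutters, the problem is more likely on the PC side; if the gaps themselves grow once the phone heats up, that points at thermal throttling on the phone.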
When I use VMagicMirror linked with the iPhone app iFacialMocap, CPU usage drops to around 5% and the GPU runs at about 12-30%. But the eye on the right side of the screen ends up mysteriously a bit more closed than the left one.
It's easy to see when the model looks down a little (image 2).
I fudge it by setting the eye-narrowing expression switch to 40% 😠
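For what it's worth, the workaround above boils down to remapping the blink value for the eye that reads as too closed. Neither VMagicMirror nor iFacialMocap exposes a script hook for this, so the snippet below is only a sketch of the arithmetic; the function name and the 0.15 resting bias are assumptions to tune against your own model.

def compensate_right_blink(raw: float, rest_bias: float = 0.15) -> float:
    # Remap so the value the tracker reports for a relaxed, open right eye
    # (rest_bias, an assumed figure) maps to 0.0 while a full blink (1.0)
    # still maps to 1.0, then clamp back into the valid 0..1 range.
    scaled = (raw - rest_bias) / (1.0 - rest_bias)
    return max(0.0, min(1.0, scaled))

# Example: a resting reading of 0.15 becomes 0.0; a full blink of 1.0 stays 1.0.
print(compensate_right_blink(0.15), compensate_right_blink(1.0))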
Here is an image showing iPhone blendshapes mixed with the ":3" face, and there are no more weird clipping issues. There is so much I have to keep noted for the tutorial, even the iFacialMocap settings, but I lost the phone I need to record the real-world iPhone settings with.
Here are some examples that I have done so far for myself personally. If I were to do commissions, I would also include iFacialMocap tracking!
Raw test of Android face tracking (MeowFace) versus iOS face tracking (iFacialMocap). Not on the same level, but way better than old camera tracking, and much cheaper than an iPhone. As a bonus, the hybrid lip sync is actually very good. #vtuberuprising
@poesidious Live2D for rigging 2D models. It has a learning curve.
VRoid Studio is a free anime-style 3D model maker. It's easy to start with. 👍🏻
I use a 3D VRoid model I customized, and I use VSeeFace with the iPhone app iFacialMocap
I use the vear app to make TikToks
⬇️ example
@KSG_twt It's free, so definitely give it a try!
If I remember right, you could also get iPhone tracking working if you put in some effort...?
Also, this might be superfluous, but there's also one called VMagicMirror. It works with a webcam and even reflects your keyboard typing!
For smiles, I had to install a paid app called iFacialMocap.