A Blender add-on that exports bone placement as COCO-compliant 2D keypoints, as seen from the camera.
This add-on is designed to generate images in specified poses using ControlNet.
ControlNet: https://github.com/lllyasviel/ControlNet
Export all frames of an animation at once. Multiple armatures can also be exported.
Rigs generated by Rigify can be automatically configured for bone correspondence.
The following extension for the AUTOMATIC1111 WebUI (among others) is useful for generating images from keypoint images.
https://github.com/Mikubill/sd-webui-controlnet
The following Colab and repository provide implementations that generate images and video from the exported JSON.
Local generation requires a GPU with at least 8 GB of VRAM.
https://github.com/mili-inch/ControlNet
Open Edit -> Preferences -> Add-ons at the top of Blender, click Install in the upper right corner, and select the zip file without unzipping it.
Check the Animation: B2ControlNet checkbox to complete the installation.
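If you prefer to script the installation instead, a rough sketch using Blender's standard add-on operators is shown below; the zip path and module name are assumptions, so check Preferences -> Add-ons for the actual name before enabling.

```python
# A minimal sketch, assuming a downloaded zip at the given path and that the
# add-on's module name matches the zip (both hypothetical).
import bpy

bpy.ops.preferences.addon_install(filepath="/path/to/B2ControlNet.zip")  # hypothetical path
bpy.ops.preferences.addon_enable(module="B2ControlNet")                   # hypothetical module name
bpy.ops.wm.save_userpref()
```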
In the 3D Viewport, an operator panel is added to the sidebar, which can be opened with the N key.
At least one camera is required in the scene.
- Armature: Sets the armatures to be exported. Each checkbox toggles whether that armature is output. The armature name is written to the JSON.
- Each keypoints: The coordinates of the specified bones are written to the JSON as keypoints (see the projection sketch after this list).
- head/tail: Choose whether the root (head) or the tip (tail) of the bone is used as the keypoint coordinate.
- Export as Image: Saves the current frame's keypoints, as seen by the camera, as an image.
- Export All Frames as Images: Saves the keypoints of all frames, as seen by the camera, as sequentially numbered images.
- Export as JSON: Saves the current frame's keypoints, as seen by the camera, as JSON.
- Export All Frames as JSON: Saves the keypoints of all frames, as seen by the camera, as JSON.
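The exported keypoints are the selected bone heads/tails projected into the camera view. A minimal sketch of that projection, written against Blender's Python API and assuming an armature object named "Armature" and an active scene camera (the add-on's actual implementation may differ):

```python
# A minimal sketch (not the add-on's code) of projecting bone heads into
# camera-space pixel coordinates.
import bpy
from bpy_extras.object_utils import world_to_camera_view

scene = bpy.context.scene
cam = scene.camera                      # at least one camera must exist
arm = bpy.data.objects["Armature"]      # hypothetical armature name

render = scene.render
width = int(render.resolution_x * render.resolution_percentage / 100)
height = int(render.resolution_y * render.resolution_percentage / 100)

for pbone in arm.pose.bones:
    # Bone head in world space (use pbone.tail for the tip instead)
    world_co = arm.matrix_world @ pbone.head
    # Normalized camera-view coordinates: x, y in [0, 1], z = depth
    ndc = world_to_camera_view(scene, cam, world_co)
    # Convert to pixel coordinates (flip y so the origin is at the top left)
    px = ndc.x * width
    py = (1.0 - ndc.y) * height
    print(pbone.name, round(px), round(py))
```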
- Q. I want to change the number of frames generated.
- A. Set the start frame and end frame in the Properties panel.
- Q. What is the correspondence between bones and keypoints?
- A. See https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/md_doc_02_output.html (the sketch after the JSON schema below also lists the assumed 18-keypoint order).
- If the rig was generated by Rigify, the correspondence is set automatically when the armature is assigned.
- Q. The pose in the image output from Stable Diffusion is broken.
- A. Try adjusting the bone placement. In particular, the neck bone should lie on the line connecting the two shoulders.
- Q. What is the format of the JSON?
- A. It looks like this:
{
  resolution: int[2],
  fps: int,
  frames: [
    {
      frame_current: int,
      armatures: [
        {
          name: str,
          keypoint_indices: int[18]
        }
      ],
      keypoints: int[len(armatures)*18, 2]
    }
  ]
}
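Below is a minimal parsing sketch, assuming the schema above, that each armature's keypoint_indices are row indices into the frame-level keypoints array, and that the 18 points follow the OpenPose COCO order linked in the FAQ; the file name pose.json is hypothetical.

```python
# A minimal sketch (not part of the add-on) for reading the exported JSON.
import json

# Assumed OpenPose COCO 18-keypoint order (see the OpenPose output docs).
COCO_KEYPOINT_NAMES = [
    "nose", "neck",
    "right_shoulder", "right_elbow", "right_wrist",
    "left_shoulder", "left_elbow", "left_wrist",
    "right_hip", "right_knee", "right_ankle",
    "left_hip", "left_knee", "left_ankle",
    "right_eye", "left_eye", "right_ear", "left_ear",
]

with open("pose.json") as f:          # hypothetical file name
    data = json.load(f)

width, height = data["resolution"]
print(f"{width}x{height} @ {data['fps']} fps, {len(data['frames'])} frames")

for frame in data["frames"]:
    points = frame["keypoints"]       # [len(armatures) * 18][2]
    for armature in frame["armatures"]:
        # Assumption: keypoint_indices maps each of the 18 slots to a row
        # in the frame-level keypoints array.
        for name, idx in zip(COCO_KEYPOINT_NAMES, armature["keypoint_indices"]):
            x, y = points[idx]
            print(frame["frame_current"], armature["name"], name, x, y)
```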