I hereby claim:
- I am catid on github.
- I am catid (https://keybase.io/catid) on keybase.
- I have a public key whose fingerprint is ED87 9F30 5595 C87E D250 2854 ED7D 4351 6971 E7C0
To claim this, I am signing this object:
[include mainsail.cfg]
[include timelapse.cfg]

# This file contains common pin mappings for the BigTreeTech Octopus V1.
# To use this config, the firmware should be compiled for the STM32F446 with a
# "32KiB bootloader". Enable "extra low-level configuration options" and select
# the "12MHz crystal" as the clock reference.
# After running "make", copy the generated "klipper/out/klipper.bin" file to a
# file named "firmware.bin" on an SD card, then restart the Octopus with that SD card.
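After flashing, Klipper is typically pointed at the board with an `[mcu]` section. A minimal sketch (the serial path below is a placeholder; find the real one with `ls /dev/serial/by-id/*`):

```
[mcu]
serial: /dev/serial/by-id/usb-Klipper_stm32f446xx_XXXXXXXXXXXX-if00
```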
// Configure x264 for low-latency ABR encoding:
x264_param_t param;
x264_param_default_preset(&param, "veryfast", "zerolatency");
param.rc.i_rc_method = X264_RC_ABR;
param.rc.i_bitrate = kbps_bitrate;
param.i_width = width;
param.i_height = height;
param.i_fps_num = fps;
param.i_fps_den = 1;
param.i_csp = X264_CSP_I420;
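What usually follows this setup (a sketch, not from the original snippet; the YUV fill and `width`/`height` come from the caller) is applying a profile, opening the encoder, and encoding frames:

```
x264_param_apply_profile(&param, "baseline");
x264_t* encoder = x264_encoder_open(&param);

x264_picture_t pic_in, pic_out;
x264_picture_alloc(&pic_in, X264_CSP_I420, width, height);
// Fill pic_in.img.plane[0..2] with Y/U/V data and set pic_in.i_pts per frame.

x264_nal_t* nals = nullptr;
int nal_count = 0;
int bytes = x264_encoder_encode(encoder, &nals, &nal_count, &pic_in, &pic_out);
if (bytes > 0) {
    // NAL payloads are laid out contiguously starting at nals[0].p_payload,
    // so the whole Annex B frame is the first `bytes` bytes from there.
}
```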
GPD Win Max 2021 CPU-Z Benchmark Results

Seems worth setting the TDP lower on the Win Max 2021:
Single-threaded: up to 50% faster.
Multi-threaded: up to 22% faster.

BIOS: TDP Down (15 W)
Windows: Best Power Efficiency
Single-thread: 405.8
```
Unhandled exception: Error: Output "output" requires type float32 but was defined as type float32x12793.
```

```
const int expected_width = 640;
const int expected_height = 512;

static Func blur_x("blur_x"), blur_y("blur_y"), blur2_x("blur2_x");
static Func input_float("input_float");
```
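For context, a minimal sketch (assumed, not from the original snippet) of how Funcs like these are typically defined for a separable 3-tap blur; `input` is a hypothetical 8-bit input image:

```
Halide::ImageParam input(Halide::UInt(8), 2, "input");
Halide::Var x("x"), y("y");

input_float(x, y) = Halide::cast<float>(input(x, y));
blur_x(x, y) = (input_float(x - 1, y) + input_float(x, y) + input_float(x + 1, y)) / 3.f;
blur_y(x, y) = (blur_x(x, y - 1) + blur_x(x, y) + blur_x(x, y + 1)) / 3.f;
```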
Snake with cubes!
Discrete curvy snakes!
Energy level shoot shots.
Chopped off at energy level.
Head of snake is a free shot (no energy).
2021-03-30 19:01:04.068 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] StartCameras
2021-03-30 19:01:04.070 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] Camera count: 2
2021-03-30 19:01:04.070 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] * Camera: 0
2021-03-30 19:01:04.071 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] ** Camera metadata contains 133 tags:
2021-03-30 19:01:04.071 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] *** "Color Correction Available Aberration Modes" x3 = [ Off, Fast, High Quality ] -- from ACAMERA_COLOR_CORRECTION : ACAMERA_COLOR_CORRECTION_AVAILABLE_ABERRATION_MODES#4 t=Byte
2021-03-30 19:01:04.071 23980-10184/com.example.cameratest I/CameraTest: [19:01:04 -07:00] [I] *** "Ae Available Antibanding Modes" x4 = [ Off, 50 Hz, 60 Hz, Auto ] -- from ACAMERA_CONTROL : ACAMERA_CONTROL_AE_AVAILABLE_ANTIBANDING_MODES#65554 t=Byte
2021-03-30 19:01:04.071 23980-10184/co
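The enumeration above matches what the camera2 NDK API reports. A minimal sketch (assumed; the app's actual code is not shown) that produces similar output:

```
#include <camera/NdkCameraManager.h>
#include <camera/NdkCameraMetadata.h>
#include <android/log.h>

static void StartCameras()
{
    ACameraManager* manager = ACameraManager_create();

    ACameraIdList* id_list = nullptr;
    if (ACameraManager_getCameraIdList(manager, &id_list) == ACAMERA_OK) {
        __android_log_print(ANDROID_LOG_INFO, "CameraTest",
                            "Camera count: %d", id_list->numCameras);

        for (int i = 0; i < id_list->numCameras; ++i) {
            ACameraMetadata* metadata = nullptr;
            if (ACameraManager_getCameraCharacteristics(
                    manager, id_list->cameraIds[i], &metadata) != ACAMERA_OK) {
                continue;
            }

            int32_t tag_count = 0;
            const uint32_t* tags = nullptr;
            ACameraMetadata_getAllTags(metadata, &tag_count, &tags);
            __android_log_print(ANDROID_LOG_INFO, "CameraTest",
                                "* Camera: %d -- metadata contains %d tags",
                                i, tag_count);

            ACameraMetadata_free(metadata);
        }
        ACameraManager_deleteCameraIdList(id_list);
    }

    ACameraManager_delete(manager);
}
```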
Holoportation: Virtual 3D Teleportation in Real-time
Microsoft Research
https://www.microsoft.com/en-us/research/publication/holoportation-virtual-3d-teleportation-in-real-time/

KinectFusion: Real-Time Dense Surface Mapping and Tracking
Microsoft Research
https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/ismar2011.pdf

FusionMLS: Highly dynamic 3D reconstruction with consumer-grade RGB-D cameras
Siim Meerits
/*
    RotationFromEulerAngles()

    This solves the issue where rotating about one axis and then the other
    causes the second rotation to become distorted. Instead, the rotations
    about both axes are performed at once.

    +yaw (radians) rotates the camera clockwise.
    +pitch (radians) rotates the camera so it is looking slant-downward.
*/
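A hypothetical reconstruction (the original function body is not shown): combining yaw and pitch in a single quaternion product avoids distorting the second rotation. Axis choices and signs assume a right-handed, Y-up camera frame and may need flipping to match the conventions above:

```
#include <Eigen/Geometry>

Eigen::Matrix3f RotationFromEulerAngles(float yaw, float pitch)
{
    // One combined rotation: yaw about the up axis, pitch about the right axis.
    Eigen::Quaternionf q =
        Eigen::AngleAxisf(yaw,   Eigen::Vector3f::UnitY()) *
        Eigen::AngleAxisf(pitch, Eigen::Vector3f::UnitX());
    return q.toRotationMatrix();
}
```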
// Accelerometer frame: (x, y, z) = (+forward, +right, +up)
// Pointcloud frame:    (x, y, z) = (+right, +up, +forward)

// Find the rotation that maps the measured accelerometer vector
// (reordered into the pointcloud frame) onto the -Y axis.
Eigen::Quaternionf q;
q.setFromTwoVectors(
    Eigen::Vector3f(accel[1], accel[2], accel[0]),
    Eigen::Vector3f(0.f, -1.f, 0.f));
Eigen::Matrix3f TiltR = q.toRotationMatrix();

// Then multiply the pointcloud by this rotation matrix to get the corrected positions.
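For instance (a sketch; `cloud` is a hypothetical container of point positions):

```
#include <vector>
#include <Eigen/Core>

// Rotate every point into the gravity-aligned frame.
void CorrectTilt(const Eigen::Matrix3f& TiltR, std::vector<Eigen::Vector3f>& cloud)
{
    for (Eigen::Vector3f& point : cloud) {
        point = TiltR * point;
    }
}
```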