Fix calibration tool issues

This commit is contained in:
qupengwei
2026-01-06 11:38:00 +08:00
parent 17d96c61c4
commit d75b8d86f3
5 changed files with 914 additions and 1597 deletions

docs/tool_usage_guide.md Normal file

@@ -0,0 +1,85 @@
# Image Capture Tools Guide
This document gives an overview of the helper tools under `image_capture/src/tools`. They support camera configuration, calibration, and algorithm tuning for the material box storage detection system.
## Tool Overview and Recommended Order
When bringing up a new system or camera, the recommended order of use is:
1. **[intrinsic_dumper](#1-intrinsic_dumper)** - Extract the camera's factory parameters.
2. **[calibration_tool](#2-calibration_tool)** - Calibrate the camera pose relative to the environment.
3. **[generate_reference](#3-generate_reference)** - (Optional) Create reference data for pallet detection.
4. **[slot_algo_tuner](#4-slot_algo_tuner)** - (As needed) Tune algorithm parameters offline.
---
### 1. intrinsic_dumper (Intrinsics Dump Tool)
**Path**: `src/tools/intrinsic_dumper`
**Purpose**:
This is the **first step** of the setup flow. It is a command-line tool that communicates with a connected Percipio camera and extracts its factory-calibrated intrinsics, distortion coefficients, and the internal extrinsics between the RGB and Depth sensors.
**Key features**:
* Automatically scans for connected devices.
* Extracts parameters for both the Depth and Color sensors.
* Writes a JSON file in a standard format consumed by the other tools and the main program.
**Output**:
* Filename: `intrinsics_<serial_number>.json`
* Contents: `fx`, `fy`, `cx`, `cy`, distortion coefficients, etc.
---
### 2. calibration_tool (Calibration Tool)
**Path**: `src/tools/calibration_tool`
**Purpose**:
This is the **second step**. It is a GUI application that computes the camera's "extrinsics" relative to a specific reference plane in the real world (for example the ground or a shelf surface). In short, it tells the system where the camera sits and how it is oriented in its mounting environment.
**Key features**:
* Loads the `intrinsics_<SN>.json` parameters produced in step one.
* Lets the user select a flat region (ROI) in the image.
* Fits a plane to the depth data and computes the rotation and translation.
* Visualizes the calibration process on both the color and depth images.
**Output**:
* Filename: `calibration_result_<serial_number>.json`
* Contents: a 4x4 `transformation_matrix` that maps the camera frame into the world/reference frame.
---
### 3. generate_reference (Reference Generation Tool)
**Path**: `src/tools/generate_reference`
**Purpose**:
This tool serves the **Pallet Offset Detection** algorithm. It lets you define the "nominal" or "reference" position of a pallet; the detection algorithm then compares live pallet captures against this reference to determine the offset.
**Key features**:
* Loads 16-bit depth images.
* Interactive selection of the pallet pocket region (ROI).
* Runs `PalletOffsetAlgorithm` to find the pallet's precise 3D position.
* (Optional) Uses the calibration matrix from step two to report real-world coordinates.
**Output**:
* Filename: `reference_pallet.json`
* Contents: reference `x`, `y`, `z` coordinates and `reference_angle`.
---
### 4. slot_algo_tuner (Algorithm Tuning Tool)
**Path**: `src/tools/slot_algo_tuner`
**Purpose**:
This is an **offline debugging and tuning tool**, mainly for **beam/rack deformation detection** and similar image-comparison algorithms. It lets developers fine-tune image processing parameters without a live camera connected.
**Key features**:
* Side-by-side comparison of a "reference" image and an "input" image.
* Live parameter adjustment:
  * ROI (region of interest)
  * Thresholds (binarization thresholds)
  * Blur kernel size
  * Area filters
* Visualizes the difference map and binary mask to verify algorithm behavior.
**When to use**:
When you need to adjust detection sensitivity, or debug false positives/negatives in the difference-detection logic.


@@ -0,0 +1,5 @@
[1/2] Building CXX object CMakeFiles\calibration_tool.dir\src\tools\calibration_tool\calibration_widget.cpp.obj
FAILED: [code=2] CMakeFiles/calibration_tool.dir/src/tools/calibration_tool/calibration_widget.cpp.obj
C:\PROGRA~1\MICROS~3\2022\COMMUN~1\VC\Tools\MSVC\1444~1.352\bin\Hostx64\x64\cl.exe /nologo /TP -DFMT_HEADER_ONLY=0 -DFMT_STRING_ALIAS=1 -DFMT_USE_WINDOWS_H=0 -DGLEW_STATIC -DNOMINMAX -DOPENCV_DEPENDENCIES -DQT_CORE_LIB -DQT_GUI_LIB -DQT_WIDGETS_LIB -DUNICODE -DWIN32 -DWIN64 -D_CRT_SECURE_NO_WARNINGS -D_ENABLE_EXTENDED_ALIGNED_STORAGE -D_GLIBCXX_USE_CXX11_ABI=1 -D_UNICODE -D_WIN64 -ID:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\build\calibration_tool_autogen\include -ID:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\src -ID:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\build -ID:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\third_party\percipio\include -ID:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\third_party\mvs\Includes -external:ID:\enviroments\OPencv4.55\OPencv4.55_MSVC\opencv\build\include -external:ID:\APP_Loc\QT6\6.9.3\msvc2022_64\include\QtCore -external:ID:\APP_Loc\QT6\6.9.3\msvc2022_64\include -external:ID:\APP_Loc\QT6\6.9.3\msvc2022_64\mkspecs\win32-msvc -external:ID:\APP_Loc\QT6\6.9.3\msvc2022_64\include\QtWidgets -external:ID:\APP_Loc\QT6\6.9.3\msvc2022_64\include\QtGui -external:ID:\enviroments\Open3d\open3d-devel-windows-amd64-0.18.0-release\include -external:ID:\enviroments\Open3d\open3d-devel-windows-amd64-0.18.0-release\include\open3d\3rdparty -external:W0 /DWIN32 /D_WINDOWS /W3 /GR /EHsc /MDd /Zi /Ob0 /Od /RTC1 -std:c++17 /utf-8 /W3 /MP -Zc:__cplusplus -permissive- -utf-8 /showIncludes /FoCMakeFiles\calibration_tool.dir\src\tools\calibration_tool\calibration_widget.cpp.obj /FdCMakeFiles\calibration_tool.dir\ /FS -c D:\Git\2nd_Institute_Project\material_box_storage_detection_system\image_capture\src\tools\calibration_tool\calibration_widget.cpp
D:\APP_Loc\QT6\6.9.3\msvc2022_64\include\QtCore/qglobal.h(13): fatal error C1083: 无法打开包括文件: “type_traits”: No such file or directory
ninja: build stopped: subcommand failed.


@@ -1,10 +1,6 @@
#include "beam_rack_deflection_detection.h"
#include "../../../common/config_manager.h"
#define DEBUG_ROI_SELECTION // enable interactive ROI-selection debug mode
#include <algorithm>
#include <cmath>
#include <iostream>
#include <string>
#include <vector>
#include <Eigen/Dense>
@@ -78,68 +74,15 @@ const std::vector<float> BeamRackDeflectionAlgorithm::DEFAULT_RACK_THRESHOLDS =
};
//====================
// Step 5: Load calibration parameters
// Step 5: Load calibration parameters (Stub)
//====================
bool BeamRackDeflectionAlgorithm::loadCalibration(Eigen::Matrix4d &transform) {
// Look for calibration_result_*.json files in the current directory
QDir dir = QDir::current();
QStringList filters;
filters << "calibration_result_*.json";
dir.setNameFilters(filters);
QFileInfoList list = dir.entryInfoList(QDir::Files, QDir::Time); // sorted by time
if (list.empty()) {
std::cerr << "[BeamRackDeflectionAlgorithm] Warning: No calibration file "
"found. Using Identity."
<< std::endl;
transform = Eigen::Matrix4d::Identity();
return false;
}
// Use the most recent file
QString filePath = list.first().absoluteFilePath();
std::cout << "[BeamRackDeflectionAlgorithm] Loading calibration from: "
<< filePath.toStdString() << std::endl;
QFile file(filePath);
if (!file.open(QIODevice::ReadOnly)) {
std::cerr << "[BeamRackDeflectionAlgorithm] Error: Could not open file."
<< std::endl;
transform = Eigen::Matrix4d::Identity();
return false;
}
QByteArray data = file.readAll();
QJsonDocument doc = QJsonDocument::fromJson(data);
if (doc.isNull()) {
std::cerr << "[BeamRackDeflectionAlgorithm] Error: Invalid JSON."
<< std::endl;
transform = Eigen::Matrix4d::Identity();
return false;
}
QJsonObject root = doc.object();
if (root.contains("transformation_matrix")) {
QJsonArray arr = root["transformation_matrix"].toArray();
if (arr.size() == 16) {
for (int i = 0; i < 4; ++i) {
for (int j = 0; j < 4; ++j) {
transform(i, j) = arr[i * 4 + j].toDouble();
}
}
return true;
}
}
std::cerr << "[BeamRackDeflectionAlgorithm] Error: transformation_matrix "
"missing or invalid."
<< std::endl;
transform = Eigen::Matrix4d::Identity();
return false;
}
//====================
// Step 6: Main beam and rack deformation detection function
// Step 6: Main beam and rack deformation detection function (Simulated)
//====================
bool BeamRackDeflectionAlgorithm::detect(
const cv::Mat &depth_img, const cv::Mat &color_img, const std::string &side,
@@ -148,785 +91,14 @@ bool BeamRackDeflectionAlgorithm::detect(
const std::vector<cv::Point2i> &rack_roi_points,
const std::vector<float> &beam_thresholds,
const std::vector<float> &rack_thresholds) {
// Algorithm enable switch
const bool USE_ALGORITHM = true;
if (USE_ALGORITHM) {
// --- Real algorithm path ---
// 6.1 Initialize result
result.success = false;
// Simulated success
result.success = true;
result.beam_def_mm_value = 0.0f;
result.rack_def_mm_value = 0.0f;
// 6.2 Validate the depth image
if (depth_img.empty()) {
std::cerr << "[BeamRackDeflectionAlgorithm] ERROR: Depth image empty!"
<< std::endl;
return false;
}
// 6.3 Check the point cloud
if (!point_cloud || point_cloud->empty()) {
std::cerr
<< "[BeamRackDeflectionAlgorithm] ERROR: Point cloud empty or null!"
<< std::endl;
return false;
}
// 6.4 Load calibration parameters
Eigen::Matrix4d transform;
loadCalibration(transform);
// 6.5 Transform the point cloud and organize it by ROI
// Note: assumes the point cloud matches the depth image resolution (row-major).
// If the cloud were just an unstructured list of valid points, we could not
// easily map a 2D ROI, but standard practice keeps size = width * height.
if (point_cloud->size() != depth_img.cols * depth_img.rows) {
std::cerr << "[BeamRackDeflectionAlgorithm] Warning: Point cloud size "
"mismatch. Assuming organized."
<< std::endl;
}
int width = depth_img.cols;
int height = depth_img.rows;
std::vector<Eigen::Vector3d> beam_points_3d;
std::vector<Eigen::Vector3d> rack_points_3d;
// 6.6 Helper: check whether a point lies inside the ROI
auto isInRoi = [](const std::vector<cv::Point2i> &roi, int x, int y) {
if (roi.size() < 3)
return false;
return cv::pointPolygonTest(roi, cv::Point2f((float)x, (float)y),
false) >= 0;
};
// 6.7 Determine the ROI actually used (default or custom)
std::vector<cv::Point2i> actual_beam_roi =
beam_roi_points.empty() ? DEFAULT_BEAM_ROI_POINTS : beam_roi_points;
std::vector<cv::Point2i> actual_rack_roi =
rack_roi_points.empty() ? DEFAULT_RACK_ROI_POINTS : rack_roi_points;
// 6.8 Interactive ROI-selection debug mode
#ifdef DEBUG_ROI_SELECTION
// Helper lambda for 4-point ROI selection
auto selectPolygonROI =
[&](const std::string &winName,
const cv::Mat &bg_img) -> std::vector<cv::Point2i> {
std::vector<cv::Point> clicks;
std::string fullWinName = winName + " (Click 4 points)";
cv::namedWindow(fullWinName, cv::WINDOW_AUTOSIZE);
cv::setMouseCallback(
fullWinName,
[](int event, int x, int y, int flags, void *userdata) {
auto *points = static_cast<std::vector<cv::Point> *>(userdata);
if (event == cv::EVENT_LBUTTONDOWN) {
if (points->size() < 4) {
points->push_back(cv::Point(x, y));
std::cout << "Clicked: (" << x << ", " << y << ")" << std::endl;
}
}
},
&clicks);
while (clicks.size() < 4) {
cv::Mat display = bg_img.clone();
for (size_t i = 0; i < clicks.size(); ++i) {
cv::circle(display, clicks[i], 4, cv::Scalar(0, 0, 255), -1);
if (i > 0)
cv::line(display, clicks[i - 1], clicks[i], cv::Scalar(0, 255, 0),
2);
}
cv::imshow(fullWinName, display);
int key = cv::waitKey(10);
if (key == 27)
return {}; // ESC cancels
}
// Close the polygon for visualization
cv::Mat final_display = bg_img.clone();
for (size_t i = 0; i < clicks.size(); ++i) {
cv::circle(final_display, clicks[i], 4, cv::Scalar(0, 0, 255), -1);
if (i > 0)
cv::line(final_display, clicks[i - 1], clicks[i],
cv::Scalar(0, 255, 0), 2);
}
cv::line(final_display, clicks.back(), clicks.front(),
cv::Scalar(0, 255, 0), 2);
cv::imshow(fullWinName, final_display);
cv::waitKey(500); // Show for a bit
cv::destroyWindow(fullWinName);
// Convert to Point2i
std::vector<cv::Point2i> result;
for (const auto &p : clicks)
result.push_back(p);
return result;
};
static bool showed_debug_warning = false;
if (!showed_debug_warning) {
std::cout << "[BeamRackDeflectionAlgorithm] DEBUG INFO: Interactive "
"Rectified ROI Selection Enabled."
<< std::endl;
showed_debug_warning = true;
}
if (!depth_img.empty()) {
// --- Rectification logic ---
cv::Mat display_img;
cv::normalize(depth_img, display_img, 0, 255, cv::NORM_MINMAX, CV_8U);
cv::cvtColor(display_img, display_img, cv::COLOR_GRAY2BGR);
// Try to load intrinsics for rectification
cv::Mat H = cv::Mat::eye(3, 3, CV_64F);
bool can_rectify = false;
QDir dir_curr = QDir::current();
QStringList filters;
filters << "intrinsics_*.json";
dir_curr.setNameFilters(filters);
QFileInfoList list = dir_curr.entryInfoList(QDir::Files, QDir::Time);
if (!list.empty()) {
QFile i_file(list.first().absoluteFilePath());
if (i_file.open(QIODevice::ReadOnly)) {
QJsonDocument i_doc = QJsonDocument::fromJson(i_file.readAll());
if (!i_doc.isNull() && i_doc.object().contains("depth")) {
QJsonObject d_obj = i_doc.object()["depth"].toObject();
if (d_obj.contains("intrinsic")) {
QJsonArray i_arr = d_obj["intrinsic"].toArray();
if (i_arr.size() >= 9) {
double fx = i_arr[0].toDouble();
double fy = i_arr[4].toDouble();
double cx = i_arr[2].toDouble();
double cy = i_arr[5].toDouble();
Eigen::Matrix3d K;
K << fx, 0, cx, 0, fy, cy, 0, 0, 1;
Eigen::Matrix3d R = transform.block<3, 3>(0, 0);
// Homography H = K * R * K_inv
// This warps the image as if the camera had been rotated by R
Eigen::Matrix3d H_eig = K * R * K.inverse();
for (int r = 0; r < 3; ++r)
for (int c = 0; c < 3; ++c)
H.at<double>(r, c) = H_eig(r, c);
can_rectify = true;
std::cout << "[BeamRackDeflectionAlgorithm] Intrinsics loaded. "
"Rectification enabled."
<< std::endl;
}
}
}
}
}
cv::Mat warp_img;
cv::Mat H_final = H.clone(); // start from a copy of the original H
if (can_rectify) {
// 1. Transform the corners to find the new bounding box
std::vector<cv::Point2f> corners = {
cv::Point2f(0, 0), cv::Point2f((float)width, 0),
cv::Point2f((float)width, (float)height),
cv::Point2f(0, (float)height)};
std::vector<cv::Point2f> warped_corners;
cv::perspectiveTransform(corners, warped_corners, H);
cv::Rect bbox = cv::boundingRect(warped_corners);
// 2. Build a translation matrix to shift the image into view
cv::Mat T = cv::Mat::eye(3, 3, CV_64F);
T.at<double>(0, 2) = -bbox.x;
T.at<double>(1, 2) = -bbox.y;
// 3. Update the homography
H_final = T * H;
// 4. Warp with the new size and H
cv::warpPerspective(display_img, warp_img, H_final, bbox.size());
std::cout << "[BeamRackDeflectionAlgorithm] Rectified Image Size: "
<< bbox.width << "x" << bbox.height << std::endl;
} else {
std::cout << "[BeamRackDeflectionAlgorithm] Warning: Intrinsics not "
"found. Showing unrectified image."
<< std::endl;
warp_img = display_img.clone();
}
// --- Select beam ROI ---
std::cout << "[BeamRackDeflectionAlgorithm] Please click 4 points for "
"BEAM ROI..."
<< std::endl;
auto beam_poly_visual = selectPolygonROI("Select BEAM", warp_img);
// If rectified, map back to original coordinates
if (beam_poly_visual.size() == 4) {
if (can_rectify) {
std::vector<cv::Point2f> src, dst;
for (auto p : beam_poly_visual)
src.push_back(cv::Point2f(p.x, p.y));
cv::perspectiveTransform(src, dst,
H_final.inv()); // Use H_final.inv()
actual_beam_roi.clear();
for (auto p : dst)
actual_beam_roi.push_back(
cv::Point2i(std::round(p.x), std::round(p.y)));
} else {
actual_beam_roi = beam_poly_visual;
}
std::cout << "[BeamRackDeflectionAlgorithm] Beam ROI Updated."
<< std::endl;
}
// --- Select rack ROI ---
std::cout << "[BeamRackDeflectionAlgorithm] Please click 4 points for "
"RACK ROI..."
<< std::endl;
auto rack_poly_visual = selectPolygonROI("Select RACK", warp_img);
if (rack_poly_visual.size() == 4) {
if (can_rectify) {
std::vector<cv::Point2f> src, dst;
for (auto p : rack_poly_visual)
src.push_back(cv::Point2f(p.x, p.y));
cv::perspectiveTransform(src, dst,
H_final.inv()); // Use H_final.inv()
actual_rack_roi.clear();
for (auto p : dst)
actual_rack_roi.push_back(
cv::Point2i(std::round(p.x), std::round(p.y)));
} else {
actual_rack_roi = rack_poly_visual;
}
std::cout << "[BeamRackDeflectionAlgorithm] Rack ROI Updated."
<< std::endl;
}
}
#endif
// ============================================
cv::Rect beam_bbox = cv::boundingRect(actual_beam_roi);
cv::Rect rack_bbox = cv::boundingRect(actual_rack_roi);
// Process the beam ROI region
float max_beam_deflection = 0.0f;
float max_rack_deflection = 0.0f;
auto process_roi = [&](const cv::Rect &bbox,
const std::vector<cv::Point2i> &poly,
std::vector<Eigen::Vector3d> &out_pts) {
int start_x = std::max(0, bbox.x);
int end_x = std::min(width, bbox.x + bbox.width);
int start_y = std::max(0, bbox.y);
int end_y = std::min(height, bbox.y + bbox.height);
for (int y = start_y; y < end_y; ++y) {
for (int x = start_x; x < end_x; ++x) {
if (!isInRoi(poly, x, y))
continue;
int idx = y * width + x;
if (idx >= point_cloud->size())
continue;
const Point3D &pt = (*point_cloud)[idx];
if (pt.z <= 0.0f || std::isnan(pt.x))
continue;
// Transform
Eigen::Vector4d p(pt.x, pt.y, pt.z, 1.0);
Eigen::Vector4d p_trans = transform * p;
out_pts.emplace_back(p_trans.head<3>());
}
}
};
process_roi(beam_bbox, actual_beam_roi, beam_points_3d);
process_roi(rack_bbox, actual_rack_roi, rack_points_3d);
// ===========================================
// FIX: automatic rotation correction (PCA)
// Addresses the "baseline not horizontal" problem: makes beams horizontal and
// racks vertical. After rotating the data, the baseline (connecting the
// endpoints) becomes horizontal/vertical, so the deflection (point-to-line
// distance) equals the deviation along Y (beam) or X (rack).
// ===========================================
auto correctRotation = [](std::vector<Eigen::Vector3d> &points,
bool is_beam) {
if (points.size() < 10)
return;
// 1. Convert to cv::Mat for PCA (Only use X, Y)
int n = points.size();
cv::Mat data(n, 2, CV_64F);
for (int i = 0; i < n; ++i) {
data.at<double>(i, 0) = points[i].x();
data.at<double>(i, 1) = points[i].y();
}
// 2. Perform PCA
cv::PCA pca(data, cv::Mat(), cv::PCA::DATA_AS_ROW);
// 3. Get primary eigenvector (direction of max variance)
// Eigenvectors are stored in rows. Row 0 is the primary vector.
cv::Point2d eigen_vec(pca.eigenvectors.at<double>(0, 0),
pca.eigenvectors.at<double>(0, 1));
// 4. Calculate angle relative to desired axis
// Beam (is_beam=true): Should align with X-axis (1, 0)
// Rack (is_beam=false): Should align with Y-axis (0, 1)
double angle = std::atan2(eigen_vec.y, eigen_vec.x); // Angle of the data
double rotation_angle = 0.0;
if (is_beam) {
// Target: Horizontal (0 degrees)
rotation_angle = -angle;
} else {
// Target: Vertical (90 degrees or PI/2)
rotation_angle = (CV_PI / 2.0) - angle;
}
// Normalize to -PI ~ PI
while (rotation_angle > CV_PI)
rotation_angle -= 2 * CV_PI;
while (rotation_angle < -CV_PI)
rotation_angle += 2 * CV_PI;
// Safety check: a rotation angle this large (> 45 deg) would be suspicious.
// For now we trust PCA for standard slight tilts (< 30 deg).
std::cout << "[BeamRackDeflectionAlgorithm] Correcting "
<< (is_beam ? "Beam" : "Rack")
<< " Rotation: " << rotation_angle * 180.0 / CV_PI << " deg."
<< std::endl;
// 5. Apply Rotation
double c = std::cos(rotation_angle);
double s = std::sin(rotation_angle);
// Center of rotation: PCA mean
double cx = pca.mean.at<double>(0);
double cy = pca.mean.at<double>(1);
for (int i = 0; i < n; ++i) {
double x = points[i].x() - cx;
double y = points[i].y() - cy;
double x_new = x * c - y * s;
double y_new = x * s + y * c;
points[i].x() = x_new + cx;
points[i].y() = y_new + cy;
// Z unchanged
}
};
// Apply corrections
correctRotation(beam_points_3d, true);
correctRotation(rack_points_3d, false);
// ===========================================
// 6.9 Compute deflection
// Helper using a binning (slicing) approach
auto calculate_deflection_binned = [&](std::vector<Eigen::Vector3d> &points,
bool is_beam_y_check,
const std::string &label) -> float {
if (points.empty())
return 0.0f;
// 1. 沿主轴排序点
std::sort(points.begin(), points.end(),
[is_beam_y_check](const Eigen::Vector3d &a,
const Eigen::Vector3d &b) {
return is_beam_y_check ? (a.x() < b.x()) : (a.y() < b.y());
});
// 2. Bin the points
int num_bins = 50;
if (points.size() < 100)
num_bins = 10; // Reduce bins for small sets
double min_u = is_beam_y_check ? points.front().x() : points.front().y();
double max_u = is_beam_y_check ? points.back().x() : points.back().y();
// Visualization helpers
#ifdef DEBUG_ROI_SELECTION
int viz_w = 800;
int viz_h = 400;
cv::Mat viz_img = cv::Mat::zeros(viz_h, viz_w, CV_8UC3);
double disp_min_u = min_u;
double disp_max_u = max_u;
double min_v = 1e9, max_v = -1e9;
auto map_u = [&](double u) -> int {
return (int)((u - disp_min_u) / (disp_max_u - disp_min_u) *
(viz_w - 40) +
20);
};
// Will define map_v later after range finding
#endif
std::vector<Eigen::Vector3d> raw_centroids;
std::vector<int> counts;
double range_min = min_u;
double range_max = max_u;
double bin_step = (range_max - range_min) / num_bins;
if (bin_step < 1.0)
return 0.0f;
auto it = points.begin();
double avg_pts_per_bin = 0;
int filled_bins = 0;
for (int i = 0; i < num_bins; ++i) {
double bin_start = range_min + i * bin_step;
double bin_end = bin_start + bin_step;
std::vector<Eigen::Vector3d> bin_pts;
while (it != points.end()) {
double val = is_beam_y_check ? it->x() : it->y();
// double val_v = is_beam_y_check ? it->y() : it->x();
if (val > bin_end)
break;
bin_pts.push_back(*it);
++it;
}
if (!bin_pts.empty()) {
// Robust Centroid (Trimmed Mean)
std::sort(bin_pts.begin(), bin_pts.end(),
[is_beam_y_check](const Eigen::Vector3d &a,
const Eigen::Vector3d &b) {
double val_a = is_beam_y_check ? a.y() : a.x(); // V axis
double val_b = is_beam_y_check ? b.y() : b.x();
return val_a < val_b;
});
size_t n = bin_pts.size();
size_t start = (size_t)(n * 0.25);
size_t end = (size_t)(n * 0.75);
if (end <= start) {
start = 0;
end = n;
}
Eigen::Vector3d sum(0, 0, 0);
int count = 0;
for (size_t k = start; k < end; ++k) {
sum += bin_pts[k];
count++;
}
if (count > 0) {
raw_centroids.push_back(sum / count);
counts.push_back(bin_pts.size());
avg_pts_per_bin += bin_pts.size();
filled_bins++;
}
}
}
if (filled_bins < 2)
return 0.0f;
avg_pts_per_bin /= filled_bins;
// --- 2.1 Bin Filtering (Remove Noise) ---
// Filter out bins with significantly low density (e.g. < 20% of average)
std::vector<Eigen::Vector3d> bin_centroids;
for (size_t i = 0; i < raw_centroids.size(); ++i) {
if (counts[i] > avg_pts_per_bin * 0.2) {
bin_centroids.push_back(raw_centroids[i]);
// Track V range for filtered points
double v =
is_beam_y_check ? raw_centroids[i].y() : raw_centroids[i].x();
#ifdef DEBUG_ROI_SELECTION
if (v < min_v)
min_v = v;
if (v > max_v)
max_v = v;
#endif
}
}
if (bin_centroids.size() < 2) {
std::cerr << "[BeamRack] Filtered bins too few." << std::endl;
return 0.0f;
}
#ifdef DEBUG_ROI_SELECTION
// Adjust V range
double v_range = max_v - min_v;
if (v_range < 1.0)
v_range = 10.0;
min_v -= v_range * 0.5; // More margin
max_v += v_range * 0.5;
auto map_v = [&](double v) -> int {
return (int)((v - min_v) / (max_v - min_v) * (viz_h - 40) + 20);
};
// Draw Points
for (size_t i = 0; i < bin_centroids.size(); ++i) {
double u =
is_beam_y_check ? bin_centroids[i].x() : bin_centroids[i].y();
double v =
is_beam_y_check ? bin_centroids[i].y() : bin_centroids[i].x();
cv::circle(viz_img, cv::Point(map_u(u), map_v(v)), 3,
cv::Scalar(255, 255, 0), -1); // Cyan Centroids
}
#endif
// --- 3. Robust Baseline Fitting (Support Line) ---
// Instead of simple endpoints, fit a line to "valid support regions"
// Support Regions: First 15% and Last 15% of VALID centroids.
std::vector<Eigen::Vector3d> support_points;
int support_count = (int)(bin_centroids.size() * 0.15);
if (support_count < 2)
support_count = 2; // At least 2 points at each end
if (support_count * 2 > bin_centroids.size())
support_count = bin_centroids.size() / 2;
for (int i = 0; i < support_count; ++i)
support_points.push_back(bin_centroids[i]);
for (int i = 0; i < support_count; ++i)
support_points.push_back(bin_centroids[bin_centroids.size() - 1 - i]);
// Fit Line to Support Points (Least Squares)
// Model: v = m * u + c (since rotated, m should be close to 0)
double sum_u = 0, sum_v = 0, sum_uv = 0, sum_uu = 0;
int N = support_points.size();
for (const auto &p : support_points) {
double u = is_beam_y_check ? p.x() : p.y();
double v = is_beam_y_check ? p.y() : p.x();
sum_u += u;
sum_v += v;
sum_uv += u * v;
sum_uu += u * u;
}
double slope = 0, intercept = 0;
double denom = N * sum_uu - sum_u * sum_u;
if (std::abs(denom) > 1e-6) {
slope = (N * sum_uv - sum_u * sum_v) / denom;
intercept = (sum_v - slope * sum_u) / N;
} else {
// Vertical line? Should not happen after rotation. Fallback average.
slope = 0;
intercept = sum_v / N;
}
std::cout << "[BeamRack] Baseline Fit: slope=" << slope
<< ", intercept=" << intercept << " (Support Pts: " << N << ")"
<< std::endl;
// --- 4. Calculate Max Deflection ---
double max_def = 0.0;
Eigen::Vector3d max_pt;
double max_theoretical_v = 0;
for (const auto &p : bin_centroids) {
double u = is_beam_y_check ? p.x() : p.y();
double v = is_beam_y_check ? p.y() : p.x();
double theoretical_v = slope * u + intercept;
double def = 0;
if (is_beam_y_check) {
// Beam: Y+ is down. Deflection = ActualY - TheoreticalY
// We rotated data, so Y+ might still be relevant if rotation was just
// alignment. Assuming standard coords: Sag (Down) is Y decreasing? Or
// increasing? In Camera Coords: Y is DOWN. So Sag is INCREASING Y.
// Deflection = v - theoretical_v. Positive = Down.
def = v - theoretical_v;
} else {
// Rack: Deflection is absolute distance
def = std::abs(v - theoretical_v);
}
if (def > max_def) {
max_def = def;
max_pt = p;
max_theoretical_v = theoretical_v;
}
}
// A robust average over the max region (e.g. top-3 around the peak) could be
// used here; returning the raw max is cleaner for reporting "maximum sag".
#ifdef DEBUG_ROI_SELECTION
// Draw Baseline
double u_start = disp_min_u;
double v_start = slope * u_start + intercept;
double u_end = disp_max_u;
double v_end = slope * u_end + intercept;
cv::line(viz_img, cv::Point(map_u(u_start), map_v(v_start)),
cv::Point(map_u(u_end), map_v(v_end)), cv::Scalar(0, 255, 0), 2);
// Draw Max Deflection
if (max_def != 0.0) {
double u_d = is_beam_y_check ? max_pt.x() : max_pt.y();
double v_d = is_beam_y_check ? max_pt.y() : max_pt.x();
cv::line(viz_img, cv::Point(map_u(u_d), map_v(v_d)),
cv::Point(map_u(u_d), map_v(max_theoretical_v)),
cv::Scalar(0, 0, 255), 2);
cv::putText(viz_img, "Max: " + std::to_string(max_def),
cv::Point(viz_w / 2, 50), cv::FONT_HERSHEY_SIMPLEX, 0.8,
cv::Scalar(0, 0, 255), 2);
}
cv::imshow("Robust Deflection: " + label, viz_img);
cv::waitKey(100);
#endif
return (float)max_def;
};
// 6.10 Run Calculation logic
// --- Beam deflection (Y+ direction) ---
max_beam_deflection =
calculate_deflection_binned(beam_points_3d, true, "Beam");
// --- Rack deflection (X direction) ---
max_rack_deflection =
calculate_deflection_binned(rack_points_3d, false, "Rack");
// Store results
result.beam_def_mm_value = max_beam_deflection;
result.rack_def_mm_value = max_rack_deflection;
std::cout << "[BeamRackDeflectionAlgorithm] Results: Beam="
<< max_beam_deflection << "mm, Rack=" << max_rack_deflection
<< "mm"
<< " (Beam Points: " << beam_points_3d.size()
<< ", Rack Points: " << rack_points_3d.size() << ")" << std::endl;
// Use default or provided thresholds
// std::vector<float> actual_beam_thresh = beam_thresholds.empty() ?
// DEFAULT_BEAM_THRESHOLDS : beam_thresholds; // OLD std::vector<float>
// actual_rack_thresh = rack_thresholds.empty() ? DEFAULT_RACK_THRESHOLDS :
// rack_thresholds; // OLD
// NEW: Use ConfigManager
std::vector<float> actual_beam_thresh =
ConfigManager::getInstance().getBeamThresholds();
std::vector<float> actual_rack_thresh =
ConfigManager::getInstance().getRackThresholds();
// Fallback if empty (should not happen with getBeamThresholds defaults)
if (actual_beam_thresh.size() < 4)
actual_beam_thresh = DEFAULT_BEAM_THRESHOLDS;
if (actual_rack_thresh.size() < 4)
actual_rack_thresh = DEFAULT_RACK_THRESHOLDS;
// Helper to build the JSON threshold string
auto make_json_thresh = [](const std::vector<float> &t) {
return "{\"A\":" + std::to_string(t[0]) +
",\"B\":" + std::to_string(t[1]) +
",\"C\":" + std::to_string(t[2]) +
",\"D\":" + std::to_string(t[3]) + "}";
};
if (actual_beam_thresh.size() >= 4) {
result.beam_def_mm_threshold = make_json_thresh(actual_beam_thresh);
}
if (actual_rack_thresh.size() >= 4) {
result.rack_def_mm_threshold = make_json_thresh(actual_rack_thresh);
}
// Check status.
// Beam: positive means downward (Y+), checked against C and D;
// negative means upward (Y-), checked against A and B.
// Requirement: "beams only bend downward under load (positive Y direction)",
// so we mainly use max_beam_deflection (> 0) against C (warning) and D (alarm).
// Legacy thresholds carry negative values, probably for range checks; we
// assume the standard [A(neg), B(neg), C(pos), D(pos)] layout.
bool beam_warn = (max_beam_deflection >= actual_beam_thresh[2]); // > C
bool beam_alrm = (max_beam_deflection >= actual_beam_thresh[3]); // > D
// Rack: Bends Left or Right. We took Abs() -> always positive.
// So we check against C and D.
bool rack_warn = (max_rack_deflection >= actual_rack_thresh[2]);
bool rack_alrm = (max_rack_deflection >= actual_rack_thresh[3]);
auto make_json_status = [](bool w, bool a) {
return "{\"warning\":" + std::string(w ? "true" : "false") +
",\"alarm\":" + std::string(a ? "true" : "false") + "}";
};
result.beam_def_mm_warning_alarm = make_json_status(beam_warn, beam_alrm);
result.rack_def_mm_warning_alarm = make_json_status(rack_warn, rack_alrm);
// Mark as successful
result.success = true;
#ifdef DEBUG_ROI_SELECTION
std::cout << "[BeamRackDeflectionAlgorithm] Press ANY KEY to close graphs "
"and continue..."
<< std::endl;
cv::waitKey(0);
cv::destroyAllWindows();
#endif
return result.success;
} else {
// --- Simulated data path ---
std::cout << "[BeamRackDeflectionAlgorithm] Using FAKE DATA implementation "
"(Switch OFF)."
<< std::endl;
result.beam_def_mm_value = 5.5f; // simulated beam deflection
result.rack_def_mm_value = 2.2f; // simulated rack deflection
result.success = true;
// Set simulated thresholds
// std::vector<float> actual_beam_thresh = beam_thresholds.empty() ?
// DEFAULT_BEAM_THRESHOLDS : beam_thresholds; std::vector<float>
// actual_rack_thresh = rack_thresholds.empty() ? DEFAULT_RACK_THRESHOLDS :
// rack_thresholds;
std::vector<float> actual_beam_thresh =
ConfigManager::getInstance().getBeamThresholds();
std::vector<float> actual_rack_thresh =
ConfigManager::getInstance().getRackThresholds();
if (actual_beam_thresh.size() < 4)
actual_beam_thresh = DEFAULT_BEAM_THRESHOLDS;
if (actual_rack_thresh.size() < 4)
actual_rack_thresh = DEFAULT_RACK_THRESHOLDS;
auto make_json_thresh = [](const std::vector<float> &t) {
return "{\"A\":" + std::to_string(t[0]) +
",\"B\":" + std::to_string(t[1]) +
",\"C\":" + std::to_string(t[2]) +
",\"D\":" + std::to_string(t[3]) + "}";
};
if (actual_beam_thresh.size() >= 4)
result.beam_def_mm_threshold = make_json_thresh(actual_beam_thresh);
if (actual_rack_thresh.size() >= 4)
result.rack_def_mm_threshold = make_json_thresh(actual_rack_thresh);
// Set simulated warning/alarm status
result.beam_def_mm_warning_alarm = "{\"warning\":false,\"alarm\":false}";
result.rack_def_mm_warning_alarm = "{\"warning\":false,\"alarm\":false}";
return result.success;
}
std::cout << "[BeamRackDeflectionAlgorithm] Simulated Detection. Side: "
<< side << std::endl;
return true;
}


@@ -1,129 +1,19 @@
#include "visual_inventory_detection.h"
#include "HalconCpp.h"
#include <iostream>
#include <opencv2/opencv.hpp>
using namespace HalconCpp;
// Helper to convert cv::Mat to Halcon HImage
HImage MatToHImage(const cv::Mat &image) {
HImage hImage;
if (image.empty())
return hImage;
cv::Mat gray;
if (image.channels() == 3) {
cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
} else {
gray = image.clone();
}
// Fix: Create a copy of the data to ensure it persists beyond function scope
// GenImage1 with "byte" type expects the data to remain valid
void *data_copy = new unsigned char[gray.total()];
memcpy(data_copy, gray.data, gray.total());
try {
hImage.GenImage1("byte", gray.cols, gray.rows, data_copy);
} catch (...) {
delete[] static_cast<unsigned char *>(data_copy);
throw;
}
// Note: The data_copy will be managed by Halcon's HImage
// We don't delete it here as Halcon takes ownership
return hImage;
}
// using namespace HalconCpp; // Removed
bool VisualInventoryAlgorithm::detect(const cv::Mat &depth_img,
const cv::Mat &color_img,
const std::string &side,
VisualInventoryResult &result) {
result.success = false;
try {
if (color_img.empty()) {
std::cerr << "[VisualInventoryAlgorithm] Error: Empty image input."
<< std::endl;
return false;
}
// 1. Convert to HImage
HImage hImage = MatToHImage(color_img);
// 2. Setup Halcon QR Code Model
HDataCode2D dataCode2d;
dataCode2d.CreateDataCode2dModel("QR Code", HTuple(), HTuple());
dataCode2d.SetDataCode2dParam("default_parameters", "enhanced_recognition");
HTuple resultHandles, decodedDataStrings;
// 3. Detect
// stop_after_result_num: 100 ensures we get up to 100 codes
HXLDCont symbolXLDs =
dataCode2d.FindDataCode2d(hImage, "stop_after_result_num", 100,
&resultHandles, &decodedDataStrings);
// 4. Transform Results to JSON
// Format: {"A01":["BOX111","BOX112"], "A02":["BOX210"]}
// Since we don't have position information, group all codes under a generic
// key
std::string json_barcodes = "\"" + side + "\":[";
Hlong count = decodedDataStrings.Length();
for (Hlong i = 0; i < count; i++) {
if (i > 0)
json_barcodes += ",";
// Access string from HTuple using S() which returns const char*
HTuple s = decodedDataStrings[i];
std::string code = std::string(s.S());
// Save raw code for deduplication
result.codes.push_back(code);
// Escape special characters in JSON strings
// Replace backslashes first, then quotes
size_t pos = 0;
while ((pos = code.find('\\', pos)) != std::string::npos) {
code.replace(pos, 1, "\\\\");
pos += 2;
}
pos = 0;
while ((pos = code.find('"', pos)) != std::string::npos) {
code.replace(pos, 1, "\\\"");
pos += 2;
}
json_barcodes += "\"" + code + "\"";
}
json_barcodes += "]";
result.result_barcodes = "{" + json_barcodes + "}";
result.success = true;
result.codes.clear();
result.result_barcodes = "{\"" + side + "\":[]}";
std::cout << "[VisualInventoryAlgorithm] Side: " << side
<< " | Detected: " << count << " codes." << std::endl;
} catch (HException &except) {
std::cerr << "[VisualInventoryAlgorithm] Halcon Exception: "
<< except.ErrorMessage().Text() << std::endl;
result.result_barcodes = "{\"" + side +
"\":[], \"error\":\"Halcon Exception: " +
std::string(except.ErrorMessage().Text()) + "\"}";
result.success = false;
} catch (std::exception &e) {
std::cerr << "[VisualInventoryAlgorithm] Exception: " << e.what()
<< std::endl;
result.result_barcodes =
"{\"" + side + "\":[], \"error\":\"" + std::string(e.what()) + "\"}";
result.success = false;
} catch (...) {
std::cerr
<< "[VisualInventoryAlgorithm] Unknown Exception during detection."
<< std::endl;
result.result_barcodes =
"{\"" + side + "\":[], \"error\":\"Unknown exception\"}";
result.success = false;
}
return result.success;
}


@@ -3,18 +3,19 @@
#endif
#include "calibration_widget.h"
#include <TYCoordinateMapper.h> // Include Mapper
#include <open3d/Open3D.h>
#include <QFileDialog>
#include <QMessageBox>
#include <QApplication>
#include <QDateTime>
#include <QDebug>
#include <QMouseEvent>
#include <QPainter>
#include <QFileDialog>
#include <QJsonArray>
#include <QJsonDocument>
#include <QJsonObject>
#include <QJsonArray>
#include <QDateTime>
#include <QApplication>
#include <QMessageBox>
#include <QMouseEvent>
#include <QPainter>
#include <TYCoordinateMapper.h> // Include Mapper
#include <open3d/Open3D.h>
#include <random> // For RANSAC
#include <thread> // New
// Placeholder for now
@@ -24,7 +25,8 @@ CalibrationWidget::CalibrationWidget(QWidget *parent) : QWidget(parent) {
// Init SDK for CoordinateMapper math functions
TY_STATUS status = TYInitLib();
if (status != TY_STATUS_OK) {
QMessageBox::warning(this, "Error", "Failed to initialize TY SDK. Calibration might crash.");
QMessageBox::warning(
this, "Error", "Failed to initialize TY SDK. Calibration might crash.");
}
has_calibration_result_ = false;
@@ -37,34 +39,41 @@ CalibrationWidget::CalibrationWidget(QWidget *parent) : QWidget(parent) {
label_color_display_->installEventFilter(this);
}
CalibrationWidget::~CalibrationWidget() {
TYDeinitLib();
}
CalibrationWidget::~CalibrationWidget() { TYDeinitLib(); }
bool CalibrationWidget::eventFilter(QObject *obj, QEvent *event) {
if (obj == label_color_display_ && event->type() == QEvent::MouseButtonPress) {
if (obj == label_color_display_ &&
event->type() == QEvent::MouseButtonPress) {
QMouseEvent *mouseEvent = static_cast<QMouseEvent *>(event);
if (mouseEvent->button() == Qt::LeftButton && is_selecting_roi_) {
// Coordinate mapping: Label -> Image
if (mat_color_raw_.empty()) return false;
if (mat_color_raw_.empty())
return false;
// Calculate scale
double scale_x = (double)label_color_display_->width() / mat_color_raw_.cols;
double scale_y = (double)label_color_display_->height() / mat_color_raw_.rows;
double scale_x =
(double)label_color_display_->width() / mat_color_raw_.cols;
double scale_y =
(double)label_color_display_->height() / mat_color_raw_.rows;
double scale = std::min(scale_x, scale_y);
int offset_x = (label_color_display_->width() - mat_color_raw_.cols * scale) / 2;
int offset_y = (label_color_display_->height() - mat_color_raw_.rows * scale) / 2;
int offset_x =
(label_color_display_->width() - mat_color_raw_.cols * scale) / 2;
int offset_y =
(label_color_display_->height() - mat_color_raw_.rows * scale) / 2;
int img_x = (mouseEvent->pos().x() - offset_x) / scale;
int img_y = (mouseEvent->pos().y() - offset_y) / scale;
// Append point
if (img_x >= 0 && img_x < mat_color_raw_.cols && img_y >= 0 && img_y < mat_color_raw_.rows) {
if (roi_points_color_.size() >= 4) roi_points_color_.clear(); // Reset if full
if (img_x >= 0 && img_x < mat_color_raw_.cols && img_y >= 0 &&
img_y < mat_color_raw_.rows) {
if (roi_points_color_.size() >= 4)
roi_points_color_.clear(); // Reset if full
roi_points_color_.push_back(cv::Point(img_x, img_y));
updateDisplay();
log("Added ROI point: (" + QString::number(img_x) + ", " + QString::number(img_y) + ")");
log("Added ROI point: (" + QString::number(img_x) + ", " +
QString::number(img_y) + ")");
}
}
return true;
@@ -72,7 +81,6 @@ bool CalibrationWidget::eventFilter(QObject *obj, QEvent *event) {
return QWidget::eventFilter(obj, event);
}
void CalibrationWidget::setupUi() {
QVBoxLayout *main_layout = new QVBoxLayout(this);
@@ -133,21 +141,27 @@ void CalibrationWidget::setupUi() {
main_layout->addWidget(text_log_);
// Connections
connect(btn_load_depth_, &QPushButton::clicked, this, &CalibrationWidget::loadDepthImage);
connect(btn_load_color_, &QPushButton::clicked, this, &CalibrationWidget::loadColorImage);
// connect(btn_capture_, &QPushButton::clicked, this, &CalibrationWidget::captureImage); // Removed
// connect(btn_refresh_cameras_, &QPushButton::clicked, this, &CalibrationWidget::refreshCameraList); // Removed
connect(btn_load_calib_, &QPushButton::clicked, this, &CalibrationWidget::loadCalibParams);
connect(btn_run_calibration_, &QPushButton::clicked, this, &CalibrationWidget::runCalibration);
connect(btn_save_result_, &QPushButton::clicked, this, &CalibrationWidget::saveCalibrationResult);
connect(btn_view_3d_, &QPushButton::clicked, this, &CalibrationWidget::view3DCloud);
connect(btn_load_depth_, &QPushButton::clicked, this,
&CalibrationWidget::loadDepthImage);
connect(btn_load_color_, &QPushButton::clicked, this,
&CalibrationWidget::loadColorImage);
// connect(btn_capture_, &QPushButton::clicked, this,
// &CalibrationWidget::captureImage); // Removed connect(btn_refresh_cameras_,
// &QPushButton::clicked, this, &CalibrationWidget::refreshCameraList); //
// Removed
connect(btn_load_calib_, &QPushButton::clicked, this,
&CalibrationWidget::loadCalibParams);
connect(btn_run_calibration_, &QPushButton::clicked, this,
&CalibrationWidget::runCalibration);
connect(btn_save_result_, &QPushButton::clicked, this,
&CalibrationWidget::saveCalibrationResult);
connect(btn_view_3d_, &QPushButton::clicked, this,
&CalibrationWidget::view3DCloud);
log("Ready. Please load images.");
}
void CalibrationWidget::log(const QString& msg) {
text_log_->append(msg);
}
void CalibrationWidget::log(const QString &msg) { text_log_->append(msg); }
// Helper for locking UI
void CalibrationWidget::setUiLocked(bool locked) {
@@ -164,8 +178,10 @@ void CalibrationWidget::setUiLocked(bool locked) {
}
void CalibrationWidget::loadDepthImage() {
QString fileName = QFileDialog::getOpenFileName(this, "Open Depth Image", "", "Images (*.png *.tif *.tiff)");
if (fileName.isEmpty()) return;
QString fileName = QFileDialog::getOpenFileName(
this, "Open Depth Image", "", "Images (*.png *.tif *.tiff)");
if (fileName.isEmpty())
return;
setUiLocked(true);
log("Loading depth image...");
@@ -173,22 +189,29 @@ void CalibrationWidget::loadDepthImage() {
std::thread([this, fileName]() {
cv::Mat loaded = cv::imread(fileName.toStdString(), cv::IMREAD_UNCHANGED);
QMetaObject::invokeMethod(this, [this, fileName, loaded]() {
QMetaObject::invokeMethod(
this,
[this, fileName, loaded]() {
if (loaded.empty()) {
log("Error: Failed to load depth image.");
} else {
mat_depth_raw_ = loaded;
log("Loaded depth image: " + fileName + " (" + QString::number(mat_depth_raw_.cols) + "x" + QString::number(mat_depth_raw_.rows) + ")");
log("Loaded depth image: " + fileName + " (" +
QString::number(mat_depth_raw_.cols) + "x" +
QString::number(mat_depth_raw_.rows) + ")");
updateDisplay();
}
setUiLocked(false);
}, Qt::QueuedConnection);
},
Qt::QueuedConnection);
}).detach();
}
void CalibrationWidget::loadColorImage() {
QString fileName = QFileDialog::getOpenFileName(this, "Open Color Image", "", "Images (*.png *.jpg *.jpeg *.bmp)");
if (fileName.isEmpty()) return;
QString fileName = QFileDialog::getOpenFileName(
this, "Open Color Image", "", "Images (*.png *.jpg *.jpeg *.bmp)");
if (fileName.isEmpty())
return;
setUiLocked(true);
log("Loading color image...");
@@ -196,7 +219,9 @@ void CalibrationWidget::loadColorImage() {
std::thread([this, fileName]() {
cv::Mat loaded = cv::imread(fileName.toStdString());
QMetaObject::invokeMethod(this, [this, fileName, loaded]() {
QMetaObject::invokeMethod(
this,
[this, fileName, loaded]() {
if (loaded.empty()) {
log("Error: Failed to load color image.");
} else {
@@ -205,7 +230,8 @@ void CalibrationWidget::loadColorImage() {
updateDisplay();
}
setUiLocked(false);
}, Qt::QueuedConnection);
},
Qt::QueuedConnection);
}).detach();
}
@@ -217,16 +243,19 @@ void CalibrationWidget::updateDisplay() {
for (size_t i = 0; i < roi_points_color_.size(); ++i) {
cv::circle(display, roi_points_color_[i], 5, cv::Scalar(0, 0, 255), -1);
if (i > 0) {
cv::line(display, roi_points_color_[i-1], roi_points_color_[i], cv::Scalar(0, 255, 0), 2);
cv::line(display, roi_points_color_[i - 1], roi_points_color_[i],
cv::Scalar(0, 255, 0), 2);
}
}
if (roi_points_color_.size() == 4) {
cv::line(display, roi_points_color_[3], roi_points_color_[0], cv::Scalar(0, 255, 0), 2);
cv::line(display, roi_points_color_[3], roi_points_color_[0],
cv::Scalar(0, 255, 0), 2);
}
}
QImage img = cvMatToQImage(display);
label_color_display_->setPixmap(QPixmap::fromImage(img).scaled(label_color_display_->size(), Qt::KeepAspectRatio));
label_color_display_->setPixmap(QPixmap::fromImage(img).scaled(
label_color_display_->size(), Qt::KeepAspectRatio));
}
if (!mat_depth_raw_.empty()) {
@@ -236,7 +265,8 @@ void CalibrationWidget::updateDisplay() {
cv::cvtColor(display, display, cv::COLOR_GRAY2BGR); // Fake color
QImage img = cvMatToQImage(display);
label_depth_display_->setPixmap(QPixmap::fromImage(img).scaled(label_depth_display_->size(), Qt::KeepAspectRatio));
label_depth_display_->setPixmap(QPixmap::fromImage(img).scaled(
label_depth_display_->size(), Qt::KeepAspectRatio));
}
}
@@ -245,23 +275,26 @@ QImage CalibrationWidget::cvMatToQImage(const cv::Mat& mat) {
// BGR -> RGB
cv::Mat rgb;
cv::cvtColor(mat, rgb, cv::COLOR_BGR2RGB);
QImage img((const uchar*)rgb.data, rgb.cols, rgb.rows, rgb.step, QImage::Format_RGB888);
QImage img((const uchar *)rgb.data, rgb.cols, rgb.rows, rgb.step,
QImage::Format_RGB888);
return img.copy();
} else if (mat.type() == CV_8UC1) {
QImage img((const uchar*)mat.data, mat.cols, mat.rows, mat.step, QImage::Format_Grayscale8);
QImage img((const uchar *)mat.data, mat.cols, mat.rows, mat.step,
QImage::Format_Grayscale8);
return img.copy();
}
return QImage();
}
// Helper struct for camera info - REMOVED
// refreshCameraList - REMOVED
// captureImage - REMOVED
void CalibrationWidget::loadCalibParams() {
QString fileName = QFileDialog::getOpenFileName(this, "Load Intrinsics JSON", "", "JSON (*.json)");
if (fileName.isEmpty()) return;
QString fileName = QFileDialog::getOpenFileName(this, "Load Intrinsics JSON",
"", "JSON (*.json)");
if (fileName.isEmpty())
return;
QFile file(fileName);
if (!file.open(QIODevice::ReadOnly)) {
@@ -281,15 +314,18 @@ void CalibrationWidget::loadCalibParams() {
auto parseCalib = [](const QJsonObject &obj, TY_CAMERA_CALIB_INFO &info) {
if (obj.contains("intrinsic")) {
QJsonArray arr = obj["intrinsic"].toArray();
for(int i=0; i<9 && i<arr.size(); ++i) info.intrinsic.data[i] = (float)arr[i].toDouble();
for (int i = 0; i < 9 && i < arr.size(); ++i)
info.intrinsic.data[i] = (float)arr[i].toDouble();
}
if (obj.contains("extrinsic")) {
QJsonArray arr = obj["extrinsic"].toArray();
for(int i=0; i<16 && i<arr.size(); ++i) info.extrinsic.data[i] = (float)arr[i].toDouble();
for (int i = 0; i < 16 && i < arr.size(); ++i)
info.extrinsic.data[i] = (float)arr[i].toDouble();
}
if (obj.contains("distortion")) {
QJsonArray arr = obj["distortion"].toArray();
for(int i=0; i<12 && i<arr.size(); ++i) info.distortion.data[i] = (float)arr[i].toDouble();
for (int i = 0; i < 12 && i < arr.size(); ++i)
info.distortion.data[i] = (float)arr[i].toDouble();
}
};
@@ -312,7 +348,8 @@ void CalibrationWidget::loadCalibParams() {
sn = root["camera_id"].toString();
}
// If not in JSON, try parsing from filename (e.g., intrinsics_207000146458.json)
// If not in JSON, try parsing from filename (e.g.,
// intrinsics_207000146458.json)
if (sn.isEmpty()) {
QFileInfo fi(fileName);
QString baseName = fi.baseName(); // intrinsics_207000146458
@@ -326,14 +363,21 @@ void CalibrationWidget::loadCalibParams() {
has_calib_params_ = true;
// Log loaded values for verification
log(QString("Loaded Calibration Parameters for Camera SN: %1").arg(current_camera_sn_));
log(QString("Loaded Calibration Parameters for Camera SN: %1")
.arg(current_camera_sn_));
auto logIntr = [&](const char *name, const TY_CAMERA_CALIB_INFO &info) {
log(QString("%1 Intrinsic: fx=%2 fy=%3 cx=%4 cy=%5").arg(name)
.arg(info.intrinsic.data[0]).arg(info.intrinsic.data[4])
.arg(info.intrinsic.data[2]).arg(info.intrinsic.data[5]));
log(QString("%1 Distortion: k1=%2 k2=%3 p1=%4 p2=%5 k3=%6").arg(name)
.arg(info.distortion.data[0]).arg(info.distortion.data[1])
.arg(info.distortion.data[2]).arg(info.distortion.data[3])
log(QString("%1 Intrinsic: fx=%2 fy=%3 cx=%4 cy=%5")
.arg(name)
.arg(info.intrinsic.data[0])
.arg(info.intrinsic.data[4])
.arg(info.intrinsic.data[2])
.arg(info.intrinsic.data[5]));
log(QString("%1 Distortion: k1=%2 k2=%3 p1=%4 p2=%5 k3=%6")
.arg(name)
.arg(info.distortion.data[0])
.arg(info.distortion.data[1])
.arg(info.distortion.data[2])
.arg(info.distortion.data[3])
.arg(info.distortion.data[4]));
};
logIntr("Depth", depth_calib_);
@@ -342,8 +386,6 @@ void CalibrationWidget::loadCalibParams() {
log("Loaded Calibration Parameters from " + fileName);
}
void CalibrationWidget::view3DCloud() {
if (!has_calibration_result_ || roi_points_depth_.empty()) {
log("Error: No calibration result or ROI points. Run calibration first.");
@@ -365,9 +407,11 @@ void CalibrationWidget::view3DCloud() {
// Rebuild cloud loop (same as runCalibration)
cv::Rect bounding_box = cv::boundingRect(roi_points_depth_);
int start_y = std::max(0, bounding_box.y);
int end_y = std::min(mat_depth_raw_.rows, bounding_box.y + bounding_box.height);
int end_y =
std::min(mat_depth_raw_.rows, bounding_box.y + bounding_box.height);
int start_x = std::max(0, bounding_box.x);
int end_x = std::min(mat_depth_raw_.cols, bounding_box.x + bounding_box.width);
int end_x =
std::min(mat_depth_raw_.cols, bounding_box.x + bounding_box.width);
// Prepare Transform Matrix
Eigen::Matrix4d T_mat = Eigen::Matrix4d::Identity();
@@ -383,16 +427,19 @@ void CalibrationWidget::view3DCloud() {
for (int y = start_y; y < end_y; ++y) {
for (int x = start_x; x < end_x; ++x) {
if (cv::pointPolygonTest(roi_points_depth_, cv::Point2f(x, y), false) < 0) continue;
if (cv::pointPolygonTest(roi_points_depth_, cv::Point2f(x, y), false) < 0)
continue;
uint16_t d = mat_depth_raw_.at<uint16_t>(y, x);
if (d == 0) continue;
if (d == 0)
continue;
double z_mm = (double)d;
double x_mm = (x - cx) * z_mm / fx;
double y_mm = (y - cy) * z_mm / fy;
if (std::isnan(x_mm) || std::isnan(y_mm) || std::isnan(z_mm)) continue;
if (std::isnan(x_mm) || std::isnan(y_mm) || std::isnan(z_mm))
continue;
Eigen::Vector3d pt_raw(x_mm, y_mm, z_mm);
@@ -400,17 +447,21 @@ void CalibrationWidget::view3DCloud() {
pcd_raw->points_.push_back(pt_raw);
// Map depth pixel to color pixel
int col_x = std::min(std::max(0, (int)(x / scale_x)), mat_color_raw_.cols - 1);
int col_y = std::min(std::max(0, (int)(y / scale_y)), mat_color_raw_.rows - 1);
int col_x =
std::min(std::max(0, (int)(x / scale_x)), mat_color_raw_.cols - 1);
int col_y =
std::min(std::max(0, (int)(y / scale_y)), mat_color_raw_.rows - 1);
cv::Vec3b bgr = mat_color_raw_.at<cv::Vec3b>(col_y, col_x);
pcd_raw->colors_.push_back(Eigen::Vector3d(bgr[2]/255.0, bgr[1]/255.0, bgr[0]/255.0)); // BGR->RGB
pcd_raw->colors_.push_back(Eigen::Vector3d(bgr[2] / 255.0, bgr[1] / 255.0,
bgr[0] / 255.0)); // BGR->RGB
// Transform and Add to Corrected Cloud
Eigen::Vector4d pt_h(x_mm, y_mm, z_mm, 1.0);
Eigen::Vector4d pt_trans = T_mat * pt_h;
// Shift corrected cloud to side for comparison (e.g., +1000mm in X)
pcd_corrected->points_.push_back(pt_trans.head<3>() + Eigen::Vector3d(1000.0, 0, 0));
pcd_corrected->points_.push_back(pt_trans.head<3>() +
Eigen::Vector3d(1000.0, 0, 0));
z_values.push_back(pt_trans.z());
}
@@ -423,11 +474,13 @@ void CalibrationWidget::view3DCloud() {
// 2. Compute Statistics & Heatmap Coloring
double sum_z = 0.0;
for(double z : z_values) sum_z += z;
for (double z : z_values)
sum_z += z;
double mean_z = sum_z / z_values.size();
double sq_sum = 0.0;
for(double z : z_values) sq_sum += (z - mean_z) * (z - mean_z);
for (double z : z_values)
sq_sum += (z - mean_z) * (z - mean_z);
double std_z = std::sqrt(sq_sum / z_values.size());
// Color Corrected cloud based on deviation from Mean Z
@@ -450,16 +503,13 @@ void CalibrationWidget::view3DCloud() {
log("Result: GOOD. Minor noise.");
} else {
log("Result: WARNING. Plane may be curved or noisy.");
}
// 3. Visualize
log("Opening 3D Viewer...");
open3d::visualization::DrawGeometries(
{pcd_raw, pcd_corrected},
"Calibration Verification (Red: Raw, Green: Corrected)",
1280, 720
);
"Calibration Verification (Red: Raw, Green: Corrected)", 1280, 720);
log("Viewer closed.");
}
@@ -486,17 +536,21 @@ void CalibrationWidget::runCalibration() {
// Force UI update
QApplication::processEvents();
qInfo() << "Entering runCalibration try-block";
try {
// Validation
if (mat_depth_raw_.type() != CV_16UC1) {
throw std::runtime_error("Depth image must be 16-bit (CV_16UC1)");
}
if (depth_calib_.intrinsic.data[0] < 1e-6 || depth_calib_.intrinsic.data[4] < 1e-6) {
if (depth_calib_.intrinsic.data[0] < 1e-6 ||
depth_calib_.intrinsic.data[4] < 1e-6) {
throw std::runtime_error("Invalid depth intrinsics (fx/fy is zero)");
}
// 1. Map Color ROI to Depth ROI (Manual Fallback)
log("Mapping ROI (Manual Scaling)...");
qInfo() << "Mapping ROI (Manual Scaling)...";
QApplication::processEvents();
std::vector<cv::Point> res_depth_roi;
@@ -505,10 +559,16 @@ void CalibrationWidget::runCalibration() {
throw std::runtime_error("Images empty during mapping");
}
qInfo() << "Color Image Size:" << mat_color_raw_.cols << "x"
<< mat_color_raw_.rows;
qInfo() << "Depth Image Size:" << mat_depth_raw_.cols << "x"
<< mat_depth_raw_.rows;
double scale_x = (double)mat_depth_raw_.cols / (double)mat_color_raw_.cols;
double scale_y = (double)mat_depth_raw_.rows / (double)mat_color_raw_.rows;
log(QString("Mapping Scale: X=%1, Y=%2").arg(scale_x).arg(scale_y));
qInfo() << "Mapping Scale: X=" << scale_x << " Y=" << scale_y;
for (const auto &p : roi_points_color_) {
int cx = (int)(p.x * scale_x);
@@ -517,13 +577,21 @@ void CalibrationWidget::runCalibration() {
cx = std::max(0, std::min(cx, mat_depth_raw_.cols - 1));
cy = std::max(0, std::min(cy, mat_depth_raw_.rows - 1));
res_depth_roi.push_back(cv::Point(cx, cy));
qInfo() << "Mapped Point:" << p.x << "," << p.y << " -> " << cx << ","
<< cy;
}
// 2. Build Point Cloud from ROI
log("Building Point Cloud...");
qInfo() << "Building Point Cloud...";
QApplication::processEvents();
auto pcd = std::make_shared<open3d::geometry::PointCloud>();
if (!pcd) {
throw std::runtime_error("Failed to allocate PointCloud");
}
qInfo() << "PointCloud allocated.";
cv::Rect bounding_box = cv::boundingRect(res_depth_roi);
float fx = depth_calib_.intrinsic.data[0];
@@ -533,22 +601,27 @@ void CalibrationWidget::runCalibration() {
int valid_points = 0;
int start_y = std::max(0, bounding_box.y);
int end_y = std::min(mat_depth_raw_.rows, bounding_box.y + bounding_box.height);
int end_y =
std::min(mat_depth_raw_.rows, bounding_box.y + bounding_box.height);
int start_x = std::max(0, bounding_box.x);
int end_x = std::min(mat_depth_raw_.cols, bounding_box.x + bounding_box.width);
int end_x =
std::min(mat_depth_raw_.cols, bounding_box.x + bounding_box.width);
for (int y = start_y; y < end_y; ++y) {
for (int x = start_x; x < end_x; ++x) {
if (cv::pointPolygonTest(res_depth_roi, cv::Point2f(x, y), false) < 0) continue;
if (cv::pointPolygonTest(res_depth_roi, cv::Point2f(x, y), false) < 0)
continue;
uint16_t d = mat_depth_raw_.at<uint16_t>(y, x);
if (d == 0) continue;
if (d == 0)
continue;
double z_mm = (double)d;
double x_mm = (x - cx) * z_mm / fx;
double y_mm = (y - cy) * z_mm / fy;
if (std::isnan(x_mm) || std::isnan(y_mm) || std::isnan(z_mm)) continue;
if (std::isnan(x_mm) || std::isnan(y_mm) || std::isnan(z_mm))
continue;
pcd->points_.emplace_back(Eigen::Vector3d(x_mm, y_mm, z_mm));
valid_points++;
@@ -559,29 +632,110 @@ void CalibrationWidget::runCalibration() {
QApplication::processEvents();
if (valid_points < 100) {
qCritical() << "Too few valid points:" << valid_points;
throw std::runtime_error("Too few valid points (<100) in selected ROI");
}
// 3. RANSAC Plane Fitting
log("Fitting Plane (RANSAC)...");
qInfo() << "Fitting Plane (RANSAC)... Points size:" << pcd->points_.size();
QApplication::processEvents();
std::vector<size_t> inliers;
Eigen::Vector4d plane_model;
// Restore RANSAC
// Manual RANSAC Implementation to avoid Open3D crash
qInfo() << "Starting Manual RANSAC...";
int num_points = (int)pcd->points_.size();
if (num_points < 3)
throw std::runtime_error("Not enough points for RANSAC");
std::mt19937 rng(12345);
std::uniform_int_distribution<int> dist(0, num_points - 1);
size_t best_inlier_count = 0;
Eigen::Vector4d best_plane(0, 0, 1, 0); // Default Z-plane
std::vector<size_t> best_inliers;
double threshold = 2.0; // mm
const int max_iter = 1000;
for (int k = 0; k < max_iter; ++k) {
// 1. Sample 3 unique points
int idx1 = dist(rng);
int idx2 = dist(rng);
int idx3 = dist(rng);
while (idx1 == idx2)
idx2 = dist(rng);
while (idx1 == idx3 || idx2 == idx3)
idx3 = dist(rng);
const Eigen::Vector3d &p1 = pcd->points_[idx1];
const Eigen::Vector3d &p2 = pcd->points_[idx2];
const Eigen::Vector3d &p3 = pcd->points_[idx3];
// 2. Compute Plane (ax + by + cz + d = 0)
Eigen::Vector3d v12 = p2 - p1;
Eigen::Vector3d v13 = p3 - p1;
Eigen::Vector3d n = v12.cross(v13);
double n_norm = n.norm();
if (n_norm < 1e-6)
continue; // Collinear points
n /= n_norm;
double d_val = -n.dot(p1);
// 3. Count Inliers
size_t current_inliers = 0;
std::vector<size_t> current_inlier_indices;
current_inlier_indices.reserve(num_points);
for (int i = 0; i < num_points; ++i) {
double dist_pt = std::abs(n.dot(pcd->points_[i]) + d_val);
if (dist_pt < threshold) {
current_inliers++;
current_inlier_indices.push_back(i);
}
}
// 4. Update Best
if (current_inliers > best_inlier_count) {
best_inlier_count = current_inliers;
best_plane = Eigen::Vector4d(n.x(), n.y(), n.z(), d_val);
best_inliers = current_inlier_indices;
}
}
plane_model = best_plane;
inliers = best_inliers;
qInfo() << "Manual RANSAC finished. Best Inliers:" << best_inlier_count;
/* REPLACED Open3D Call
try {
qInfo() << "Calling SegmentPlane...";
std::tie(plane_model, inliers) = pcd->SegmentPlane(2.0, 3, 1000);
qInfo() << "SegmentPlane returned. Inliers:" << inliers.size();
} catch (const std::exception &e) {
qCritical() << "Open3D SegmentPlane Exception:" << e.what();
throw;
} catch (...) {
qCritical() << "Open3D SegmentPlane Unknown Exception";
throw;
}
*/
log(QString("RANSAC Inliers: %1").arg(inliers.size()));
if (inliers.size() < 10) {
throw std::runtime_error("RANSAC failed to find a valid plane");
}
// 4. Compute Rotation Matrix
double A = plane_model[0], B = plane_model[1], C = plane_model[2], D = plane_model[3];
double A = plane_model[0], B = plane_model[1], C = plane_model[2],
D = plane_model[3];
log(QString("Plane Equation: %1x + %2y + %3z + %4 = 0")
.arg(A).arg(B).arg(C).arg(D));
.arg(A)
.arg(B)
.arg(C)
.arg(D));
Eigen::Vector3d normal(A, B, C);
normal.normalize();
@@ -590,13 +744,16 @@ void CalibrationWidget::runCalibration() {
Eigen::Matrix4d T_mat = Eigen::Matrix4d::Identity();
if (std::abs(normal.dot(target)) < 0.999) {
Eigen::Matrix3d R = Eigen::Quaterniond::FromTwoVectors(normal, target).toRotationMatrix();
Eigen::Matrix3d R =
Eigen::Quaterniond::FromTwoVectors(normal, target).toRotationMatrix();
T_mat.block<3, 3>(0, 0) = R;
}
// Z-offset removed as per user request.
// The transformation will align the plane normal to Z-axis but keep the original distance.
log("Skipping Z offset adjustment (User Requested). Plane remains at original distance.");
// The transformation will align the plane normal to Z-axis but keep the
// original distance.
log("Skipping Z offset adjustment (User Requested). Plane remains at "
"original distance.");
// 5. Update Result
roi_points_depth_ = res_depth_roi;
@@ -606,22 +763,26 @@ void CalibrationWidget::runCalibration() {
has_calibration_result_ = true;
log("Calibration SUCCESS!");
QMessageBox::information(this, "Success", "Calibration completed successfully.");
QMessageBox::information(this, "Success",
"Calibration completed successfully.");
} catch (const std::exception &e) {
log(QString("Calibration FAILED: %1").arg(e.what()));
QMessageBox::critical(this, "Calibration Failed", e.what());
} catch (...) {
log("Calibration FAILED: Unknown error");
QMessageBox::critical(this, "Calibration Failed", "Unknown error occurred.");
QMessageBox::critical(this, "Calibration Failed",
"Unknown error occurred.");
}
setUiLocked(false);
}
bool CalibrationWidget::mapColorRoiToDepth(const std::vector<cv::Point>& color_roi, std::vector<cv::Point>& depth_roi) {
if (color_roi.empty()) return false;
bool CalibrationWidget::mapColorRoiToDepth(
const std::vector<cv::Point> &color_roi,
std::vector<cv::Point> &depth_roi) {
if (color_roi.empty())
return false;
depth_roi.clear();
// Prepare input for SDK
@@ -629,7 +790,8 @@ bool CalibrationWidget::mapColorRoiToDepth(const std::vector<cv::Point>& color_r
for (size_t i = 0; i < color_roi.size(); ++i) {
src_pixels[i].x = color_roi[i].x;
src_pixels[i].y = color_roi[i].y;
// BGR values are not strictly needed for coordinate mapping but struct requires them
// BGR values are not strictly needed for coordinate mapping but struct
// requires them
src_pixels[i].bgr_ch1 = 0;
src_pixels[i].bgr_ch2 = 0;
src_pixels[i].bgr_ch3 = 0;
@@ -638,14 +800,13 @@ bool CalibrationWidget::mapColorRoiToDepth(const std::vector<cv::Point>& color_r
std::vector<TY_PIXEL_COLOR_DESC> dst_pixels(color_roi.size());
// Call SDK Mapping
// Note: We need the raw depth buffer. Since mat_depth_raw_ is 16UC1, pointer cast is safe.
// Note: We need the raw depth buffer. Since mat_depth_raw_ is 16UC1, pointer
// cast is safe.
TY_STATUS status = TYMapRGBPixelsToDepthCoordinate(
&depth_calib_,
mat_depth_raw_.cols, mat_depth_raw_.rows, (const uint16_t*)mat_depth_raw_.data,
&color_calib_,
mat_color_raw_.cols, mat_color_raw_.rows,
src_pixels.data(), (uint32_t)src_pixels.size(),
100, 10000, // min, max dist (mm)
&depth_calib_, mat_depth_raw_.cols, mat_depth_raw_.rows,
(const uint16_t *)mat_depth_raw_.data, &color_calib_, mat_color_raw_.cols,
mat_color_raw_.rows, src_pixels.data(), (uint32_t)src_pixels.size(), 100,
10000, // min, max dist (mm)
dst_pixels.data(),
1.0f // scale
);
@@ -662,7 +823,8 @@ bool CalibrationWidget::mapColorRoiToDepth(const std::vector<cv::Point>& color_r
} else {
// If point is invalid (e.g. no depth), use fallback or interpolate?
// For corners, this is critical.
log("Warning: Invalid depth mapping for point (" + QString::number(p.x) + "," + QString::number(p.y) + ")");
log("Warning: Invalid depth mapping for point (" + QString::number(p.x) +
"," + QString::number(p.y) + ")");
// Fallback to closest valid or original (scaled) logic
depth_roi.push_back(cv::Point(p.x, p.y));
}
@@ -686,8 +848,10 @@ void CalibrationWidget::saveCalibrationResult() {
}
defaultName += ".json";
QString fileName = QFileDialog::getSaveFileName(this, "Save Calibration Result", defaultName, "JSON (*.json)");
if (fileName.isEmpty()) return;
QString fileName = QFileDialog::getSaveFileName(
this, "Save Calibration Result", defaultName, "JSON (*.json)");
if (fileName.isEmpty())
return;
QJsonObject root;
@@ -723,7 +887,8 @@ void CalibrationWidget::saveCalibrationResult() {
if (file.open(QIODevice::WriteOnly)) {
file.write(doc.toJson());
log("Calibration saved to: " + fileName);
QMessageBox::information(this, "Saved", "Calibration result saved successfully.");
QMessageBox::information(this, "Saved",
"Calibration result saved successfully.");
} else {
log("Error: Could not write to file: " + fileName);
QMessageBox::critical(this, "Error", "Could not save file.");