Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Where to find a 3D video camera for outdoor use?

Is anyone aware of an affordable 3D video camera (preferably USB) that's good for outdoor use? I'm designing a small outdoor rover, and I need a sensor that provides high resolution depth readings. Sensors like the Kinect don't work outdoors, and laser range finders are enormously expensive, so I'm looking at stereoscopy. I've cobbled together my own setup using cheap USB webcams, and although it works, the quality is quite poor, as expected. I've seen some "3D cameras" advertised for drones and quadcopters, [like this one](http://www.hobbyking.com/hobbyking/store/__90687__Skyzone_2D_3D_5_8GHz_FPV_Goggles_W_40CH_Raceband_Receiver_H_Tracking_V2_600mW_VTX_and_3D_Camera.html), but I've only seen them sold with a VR headset, never with a USB interface. Would something like this work outdoors for stereo range finding, and if so, where could I buy just the camera? I've also seen this [ROS survey](http://rosindustrial.org/news/2016/1/13/3d-camera-survey) of 3D cameras, but it doesn't mention whether the cameras work outside.
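For reference, here is a minimal sketch of the kind of stereo-depth computation a home-made webcam rig like the one described above would feed. It assumes a pair of already rectified grayscale frames and a prior stereo calibration; the file names, matcher settings, focal length and baseline are placeholders, not values from this question.

```python
# Minimal stereo-disparity sketch (assumes rectified left/right images).
# File names, matcher parameters, focal length and baseline are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matcher: numDisparities must be a multiple of 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth = f * B / disparity, with focal length f (pixels) and baseline B (meters)
# taken from a stereo calibration of the rig.
f, B = 700.0, 0.06
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]
print("median depth (m):", np.median(depth[valid]))
```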

3D Camera Selection

Hi, I am working on a project and we plan to integrate a camera into our robot cell. We use the UR5 and would like to use the camera as a feedback sensor. The accuracy requirement is +/- 0.5 mm. I would like to know what camera options are out there and which ones are supported in ROS. I have gathered from [here](http://wiki.ros.org/openni_kinect/kinect_accuracy) that the accuracy of the Kinect is +/- 1 mm, which can be improved by intrinsic and/or extrinsic calibration. Would that be suitable for my case? I am new to image processing in ROS, so kindly excuse me if I am missing anything.

how to modify and publish camera_info?

Hello. I would like to make a cropped image from a raw image in Python. In my program, I can subscribe to `/usb_cam/image_raw` (`sensor_msgs/Image`, height=480, width=640) and `/usb_cam/camera_info` (`sensor_msgs/CameraInfo`). Then I modified the ROI in `/usb_cam/camera_info` like below,

    edited_camera_info.roi.x_offset = 320
    edited_camera_info.roi.y_offset = 240

and published the edited camera info together with the unmodified image:

    pub_edited_image.publish(no_edited_image)
    pub_edited_camera_info.publish(edited_camera_info)

However, there is no difference between `/usb_cam/image_raw` and `/editted_image_raw`. Is it possible to crop part of an image by modifying camera_info? Or is there an alternative way to crop an image?
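For comparison, here is a minimal sketch of cropping the pixels themselves with cv_bridge and republishing them, rather than only editing the `CameraInfo` ROI. The topic names follow the question; the node name and the crop window are placeholders.

```python
#!/usr/bin/env python
# Sketch: subscribe to the raw image, crop it via numpy slicing, republish it.
# Crop window and node name are placeholders, not values from the question.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()
pub = None

def callback(msg):
    cv_img = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
    cropped = cv_img[240:480, 320:640]      # rows (y) first, then columns (x)
    out = bridge.cv2_to_imgmsg(cropped, encoding=msg.encoding)
    out.header = msg.header                 # keep the original stamp and frame_id
    pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("image_cropper")
    pub = rospy.Publisher("/editted_image_raw", Image, queue_size=1)
    rospy.Subscriber("/usb_cam/image_raw", Image, callback, queue_size=1)
    rospy.spin()
```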

Error installing zed-ros-wrapper

Hi! I can't compile the zed-ros-wrapper with catkin_make on the Jetson TK1. I get this linking error: "undefined reference to XInitThreads". I have installed CUDA 6.5, OpenCV 2.4 and ZED SDK 1.0. Can you help me with this error? Thanks

camera drivers experimental (swissranger_camera)

Hi, I am following this tutorial (Download and install swissranger_camera) to install and work with the SwissRanger camera (SR4500). When I used this command

    svn co https://code.ros.org/svn/ros-pkg/branches/trunk_cturtle/stacks/camera_drivers_experimental

I got an error because code.ros.org no longer exists, so I could not check out and download the drivers. I searched on the Internet, found this page (https://github.com/OpenPTrack/open_ptrack.git), and checked it out with svn without any error. But when I used this command (`rosmake swissranger_camera`), I got this error:

    [rosrun] Couldn't find executable named swissranger_camera below /home/abbas/catkin_ws/src/swissranger_camera
    [rosrun] Found the following, but they're either not files,
    [rosrun] or not executable:
    [rosrun] /home/abbas/catkin_ws/src/swissranger_camera
    [rosrun] /home/abbas/catkin_ws/src/swissranger_camera/src/swissranger_camera
    [rosrun] /home/abbas/catkin_ws/src/swissranger_camera/include/swissranger_camera

This is also my CMakeLists.txt in the swissranger_camera folder in my catkin workspace's src:

    cmake_minimum_required(VERSION 2.8.3)
    project(swissranger_camera)
    set(CMAKE_BUILD_TYPE RelWithDebInfo)
    IF(EXISTS "/usr/lib/libmesasr.so")
      set(SWISSRANGER_ENABLED 1)
    ELSE()
      set(SWISSRANGER_ENABLED 0)
    ENDIF()
    LIST(APPEND CMAKE_CXXFLAGS "-DSWISSRANGER_ENABLED")
    find_package(catkin REQUIRED COMPONENTS cmake_modules roscpp tf camera_info_manager image_transport dynamic_reconfigure driver_base sensor_msgs cv_bridge)
    find_package(OpenCV REQUIRED)
    include_directories(${OpenCV_INCLUDE_DIRS})
    link_directories(${OpenCV_LIB_DIR})
    add_definitions(${OpenCV_DEFINITIONS})
    find_package(Eigen REQUIRED)
    include_directories(${Eigen_INCLUDE_DIRS} include ${catkin_INCLUDE_DIRS})
    find_package(PCL 1.7 REQUIRED)
    include_directories(BEFORE ${PCL_INCLUDE_DIRS})
    link_directories(${PCL_LIBRARY_DIRS})
    add_definitions(${PCL_DEFINITIONS})
    MESSAGE("PCL_INCLUDE_DIRS\n${PCL_INCLUDE_DIRS}\n")
    MESSAGE("PCL_LIBRARY_DIRS\n${PCL_LIBRARY_DIRS}\n")
    MESSAGE("PCL_DEFINITIONS\n${PCL_DEFINITIONS}\n")
    MESSAGE("PCL_COMMON_LIBRARIES\n${PCL_COMMON_LIBRARIES}\n")
    if (NOT PCL_FOUND)
      MESSAGE(FATAL_ERROR "PCL not found.\n")
    endif (NOT PCL_FOUND)
    generate_dynamic_reconfigure_options(cfg/SwissRanger.cfg)
    include_directories(include ${catkin_INCLUDE_DIRS} cfg/cpp)
    catkin_package(
      INCLUDE_DIRS include
      LIBRARIES swissranger_utility
      CATKIN_DEPENDS roscpp tf camera_info_manager image_transport dynamic_reconfigure driver_base
    )
    add_library(swissranger_utility src/swissranger_camera/utility.cpp include/swissranger_camera/utility.h)
    target_link_libraries(swissranger_utility ${catkin_LIBRARIES})
    IF(SWISSRANGER_ENABLED)
      add_executable(${PROJECT_NAME} src/sr.cpp src/dev_sr.cpp include/sr.h)
      add_dependencies(${PROJECT_NAME} ${PROJECT_NAME}_gencfg)
      target_link_libraries(${PROJECT_NAME} ${catkin_LIBRARIES} mesasr)
      add_library(swissranger_utility src/swissranger_camera/utility.cpp include/swissranger_camera/utility.h)
      target_link_libraries(swissranger_utility ${catkin_LIBRARIES})
      add_executable(image_publisher_sr apps/main_image_publisher_sr.cpp)
      target_link_libraries(image_publisher_sr swissranger_utility ${catkin_LIBRARIES})
    ENDIF(SWISSRANGER_ENABLED)

How can I solve the problem? Thanks, Abbas

Problem with the camera for SLAM when using ROS

I am totally new to ROS and SLAM; I started about two weeks ago. I am using Ubuntu 14.04 + ROS Indigo. My problem is this: when I use ROS to launch the SLAM software, I type

    rosrun lsd_slam_core live_slam /image:=image_raw _calib:=/LSD_room/cameraCalibration.cfg

and the laptop always wants to use the internal camera (video0), but I actually want to use the external camera (a Logitech C920, video1). What I want to do is deactivate the internal camera, so I typed `sudo rm /dev/video0`, and then I had only the one external camera. After that I tried the same `rosrun lsd_slam_core live_slam ...` command again, and now something weird happened:

    yukan@yukan-TP300LA:~/rosbuild_ws/package_dir$ ls /dev/video*
    /dev/video1
    yukan@yukan-TP300LA:~/rosbuild_ws/package_dir$ rosrun uvc_camera uvc_camera_node device:=/dev/video1
    [ INFO] [1476452216.489423243]: using default calibration URL
    [ INFO] [1476452216.489512363]: camera calibration URL: file:///home/yukan/.ros/camera_info/camera.yaml
    [ INFO] [1476452216.489588606]: Unable to open camera calibration file [/home/yukan/.ros/camera_info/camera.yaml]
    [ WARN] [1476452216.489615448]: Camera calibration file /home/yukan/.ros/camera_info/camera.yaml not found.
    opening /dev/video0
    terminate called after throwing an instance of 'std::runtime_error'
      what():  couldn't open /dev/video0
    Aborted (core dumped)

Looking closely at the output, it seems to me that ROS is always trying to communicate with video0, but I need it to be video1. Am I the only one who has this feeling? I am a total newbie and any help will be really appreciated.

calibration

Now I am using the [camera calibration package](http://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration) to calibrate my RGB camera. I found that I get a different yaml file every time I calibrate the same camera. How does this happen? Is there a method to check which yaml file is valid? Thank you!

ptz_control on p2os_driver

I am working with the p2os_driver on a Seekur Jr robot (ROS Groovy, Ubuntu 12.04). I have successfully moved the robot with the teleop_keyboard launch (using serial port ttyS0), but I haven't found a way to pan, tilt and zoom the PTZ camera using p2os_driver. I have a node, written by myself, that publishes a PTZState msg on the /ptz_control topic. When I publish a msg on /ptz_control, the /ptz_state topic changes its values as well, but the camera does not move. Maybe the commands I am sending to /ptz_control are not taking effect because the PTZ port configuration is wrong. The PTZ camera is an RVision connected to the COM4 (ttyS3) serial port. **I would like to know which port the p2os_driver expects the PTZ camera to be connected to, and also how I could change this parameter.** I've checked the p2os_ptz.cc and p2os.cc code to see if I could change this parameter; it appears the PTZ msgs go through the aux serial port. If it is possible, how could I configure it to communicate with the ttyS3 port? Thank you, Jhony Ferreira

Impact of the square size on the calibration of a camera

Hi all, I am in the 3rd year of my PhD and I finally really need cameras for some tasks. First of all I need to calibrate my camera: I need to find the intrinsic parameters. I don't like to have an approach like "OK, it is working, let's move on", and therefore I have a question. I am using the following tutorial to calibrate my camera: http://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration And I was wondering what happens when I calibrate a camera and give the wrong size of the square after `--square` in the command:

    $ rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 image:=/camera/image_raw camera:=/camera

I really would like to have some details about it, ideally an explanation with formulas, because I am not able to work it out myself. I did 3 calibrations and generated 3 calibration files. To me they look almost the same, but only 2 of them were done with the right square size. For this reason a question came to my mind: why do I need to specify that number? Thanks in advance for all your explanations!
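For what it's worth, here is a short sketch of the usual pinhole-model argument (my own summary, not taken from the linked tutorial): a wrong `--square` value rescales the checkerboard coordinates, and that rescaling can be absorbed entirely by the per-view translations.

```latex
% Pinhole projection of a board corner X with intrinsics K and pose (R, t):
x \;\simeq\; K\,(R X + t)
% If the stated square size is wrong by a factor s, every board point becomes sX,
% and the optimiser can compensate exactly by scaling the translation, t -> s t:
K\bigl(R\,(sX) + s\,t\bigr) \;=\; s\,K(R X + t) \;\simeq\; K(R X + t)
```

The reprojection error is therefore unchanged, so the recovered intrinsics (fx, fy, cx, cy) and distortion coefficients come out essentially the same; only the estimated camera-to-board translations are scaled by s. In other words, the square size only sets the metric scale of the extrinsics, which would explain why calibration files made with slightly different square sizes look almost identical.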

install apriltags and make it work with USB camera?

I'm trying to set up localization with AprilSLAM and an RPi camera. But before I get into that, I want to test the AprilTag detection on my laptop, using the normal apriltags detection and the laptop's built-in USB camera. Once I'm done with that, I'd move on to AprilSLAM. So I started by installing Ubuntu 14.04 and ROS Indigo. I'm faced with two apriltag options: http://wiki.ros.org/apriltags_ros http://wiki.ros.org/apriltags And I'm trying to connect the usb_cam driver to them: http://wiki.ros.org/usb_cam How do I even do that?

Is there a way to remap the mvBlueFox 2 topic?

Hello, I am trying to remap the topic name of the BlueFox 2 camera from `/mv_$(arg device)` to `camera/front`. I am using the driver found in the [KumarRobotics](https://github.com/KumarRobotics/bluefox2) repository with ROS Jade. I have tried using the `` syntax, but I get the error

    [ERROR] [1477999041.574038036]: /camera/fromt: not found. 2 availabe device(s):

followed by the serial numbers of my cameras. Thank you.

Is there a V4L camera driver for Kinetic?

Hi, I see several usb camera drivers (usb_cam, uvc_camera) that I assume will work w/ a V4L driver, but nothing that supports Kinetic. Am I missing something? Any suggestions (e.g., wipe and install Indigo)?

camera_pose_calibration migrate to indigo

Hello guys, I'm trying to use this [calibration package](http://wiki.ros.org/camera_pose_calibration) for 2 monocular cameras that are fixed on the bottom of a quadcopter. According to the [documentation](http://wiki.ros.org/camera_pose_calibration?distro=groovy), this package is capable of doing this.

### my setup

My setup is shown in this figure ![drawing](https://cloud.githubusercontent.com/assets/15015938/11775751/c3d099b6-a241-11e5-8bf0-48350c6ff591.png)

- Ubuntu 14.04 (Trusty)
- ROS Distro: Indigo
- Catkin workspace with catkin tools

### my goal

My goal is to get the pose between the two bottom cameras and also the tf transformation of the bigger one (I already got the tf from the small bottom camera).

### installation problems

The problem in this whole story is more or less the installation. I checked with

```bash
rosdep db
```

for the package ros-indigo-camera-pose so I could use

```bash
sudo apt-get install ros-indigo-camera-pose
```

but had no success. My second attempt was to install from source through GitHub, but somehow catkin tools does not recognise the camera_pose package when I try to build it with the command:

```bash
catkin build [package]
```

Checking the dependencies with `rosdep` fails too. So my first thought was to update this package so that catkin tools can build it. Any suggestion on how to do this without studying informatics for 5 years? :D

Trying to get position data from Aruco Package

Hi all, I am trying to get position data using the [aruco ros package](http://wiki.ros.org/aruco). I can get the marker to appear on the screen, and it will be highlighted, but no position data is being published to the markers topic. If I enable output to the screen in the launch file, there is output that identifies the marker and contains 4 2D points, as well as a 3D translation matrix and a rotation transform (from what I read on the ArUco site). The last two are all zeros, but the 2D points seem to be what I want. Which topic should these be broadcast on? I am also assuming I'll need the transformation matrix and rotation transform; could they be defaulting to zero because I don't have a camera config file? Because that is the case.
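A minimal sketch of reading pose output from the tracker, assuming the single-marker node publishes a `geometry_msgs/PoseStamped` on a topic such as `/aruco_single/pose` (the topic name is an assumption; check `rostopic list` for the actual name on your setup):

```python
#!/usr/bin/env python
# Sketch: print marker pose messages as they arrive.
# The topic name below is an assumption; verify it with `rostopic list`.
import rospy
from geometry_msgs.msg import PoseStamped

def on_pose(msg):
    p = msg.pose.position
    rospy.loginfo("marker at x=%.3f y=%.3f z=%.3f (frame %s)",
                  p.x, p.y, p.z, msg.header.frame_id)

if __name__ == "__main__":
    rospy.init_node("aruco_pose_listener")
    rospy.Subscriber("/aruco_single/pose", PoseStamped, on_pose, queue_size=10)
    rospy.spin()
```

If the pose and rotation really are all zeros, a missing or invalid camera calibration (CameraInfo) is a plausible cause, since the 3D pose cannot be recovered from the 2D corners without it.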

How can I access the raw data from the sr4000 camera (swissranger_camera)?

Hi, I am new to ROS. I can run the swissranger_camera driver (SR4000) on ROS, and I can also see the distance and intensity images from the camera in rviz. This camera has an interface for Windows with which I can save the data (distance or intensity) to a file (that file is just one image at a specific time). My question is: how can I get and save the data from the camera in ROS, like the Windows interface does? Thanks, Abbas
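A minimal sketch of one way to do this in ROS, assuming the driver publishes the distance/intensity data as `sensor_msgs/Image` topics (the topic name below is a placeholder; check `rostopic list` for the real one). Recording with `rosbag record` on the same topics is another common option.

```python
#!/usr/bin/env python
# Sketch: save each incoming image message to a timestamped .npy file.
# The topic name is a placeholder; use `rostopic list` to find the real one.
import rospy
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def save_frame(msg):
    img = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
    fname = "frame_%d.%09d.npy" % (msg.header.stamp.secs, msg.header.stamp.nsecs)
    np.save(fname, img)                     # raw values, no scaling applied
    rospy.loginfo("saved %s (%s)", fname, msg.encoding)

if __name__ == "__main__":
    rospy.init_node("sr4000_saver")
    rospy.Subscriber("/swissranger/distance/image_raw", Image, save_frame, queue_size=5)
    rospy.spin()
```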

Best ROS Kinetic webcam driver?

Hi all, What is the best webcam driver for ROS Kinetic? I've looked around and it appears libuvc_ros might be OK, but will this work with an integrated laptop camera too? Thanks in advance! Regards, Martin

Nodelets And Desynchronized Images

Folks, Can too many desynchronized images cause a camera nodelet to crash? For one of my programs I have a camera nodelet publishing all my images, but whenever it crashes I always see a few "Image desynchronised" messages just beforehand. I get a few of those messages, then, bam!... it crashes.

using gscam with janus-gateway

When running my pipeline with gst-launch-1.0 in a terminal, I get exactly what I want to see. However, when transferring that pipeline to my gscam launch file, no video stream is recognized on the port. Am I doing something wrong? My launch file is just a copy of their sample with gscam_config changed:

UVC_camera fails

I have a problem with uvc_camera. When I run

    rosrun uvc_camera uvc_camera_node

it just says:

    [ INFO] [1481414376.278537346]: using default calibration URL
    [ INFO] [1481414376.278608501]: camera calibration URL: file:///home/solmaz/.ros/camera_info/camera.yaml
    Segmentation fault

When I run my launch file:

    roslaunch work_space camera.launch

I get the following result:

    [camera_1/uvc_camera_node_1-1] process has died [pid 4942, exit code -11, cmd /opt/ros/indigo/lib/uvc_camera/uvc_camera_node __name:=uvc_camera_node_1 __log:=/home/solmaz/.ros/log/5910aa52-bf33-11e6-bdb3-e03f494ac521/camera_1-uvc_camera_node_1-1.log].
    log file: /home/solmaz/.ros/log/5910aa52-bf33-11e6-bdb3-e03f494ac521/camera_1-uvc_camera_node_1-1*.log

The log file is empty, which makes it harder to find the problem. My launch file is this: This web camera works fine on another machine, and `guvcview` also works.

how to run a pc-webcam with ROS

Hi, I am new to ROS and I want to know how to publish camera images with ROS in order to use them in ORBSLAM. I've tried `rosrun image_view image_view image:=/camera/image` but nothing happens.
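The usual route is a packaged camera driver such as usb_cam or uvc_camera, but as a minimal sketch of what such a publisher does (device index 0 and the `/camera/image` topic name from the command above are assumptions for illustration), something like the following publishes frames that `image_view` or a SLAM node could subscribe to:

```python
#!/usr/bin/env python
# Sketch: grab frames from a webcam with OpenCV and publish them as a ROS image topic.
# Device index 0 and the topic name are assumptions for illustration.
import rospy
import cv2
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

if __name__ == "__main__":
    rospy.init_node("webcam_publisher")
    pub = rospy.Publisher("/camera/image", Image, queue_size=1)
    bridge = CvBridge()
    cap = cv2.VideoCapture(0)               # 0 = first video device
    rate = rospy.Rate(30)                   # roughly 30 fps
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if ok:
            msg = bridge.cv2_to_imgmsg(frame, encoding="bgr8")
            msg.header.stamp = rospy.Time.now()
            pub.publish(msg)
        rate.sleep()
    cap.release()
```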

