General Information

  • About Guides

First Steps

  • Software Installation
  • Hardware Installation
  • Opening a Device
    • Troubleshooting
      • My Device Is Not Listed
      • My Device Shows a Warning
      • The 3D Data is Bad
      • The MTU is Low
  • Using Sample Data from the Ensenso Website
    • 1. Download Data
    • 2. Create a File Camera in NxView
    • 3. Open the Camera in NxView

Hardware Guides

  • Camera Setup Considerations
    • Limitations of the 3D Reconstruction
      • Effective resolution for surface details
      • Effective resolution for S-series cameras
      • Reconstruction of curved and slanted surfaces
      • Slanted surfaces in outer field-of-view
    • Camera Position and Orientation
      • Direct specular reflection from the projector into the cameras
  • Network Configuration
    • Network Wizard
      • Launching the Network Wizard
      • Privileged Changes Made to Your System
        • Configuration of Network Adapters
        • uEye Services
        • Reverse Path Filtering (RPF)
        • Maximum Transmission Unit (MTU)
    • Network Performance
      • Background
      • Solutions
        • Update Your Network Card Drivers
        • Setting Network Card Properties
          • Receive Buffer Size
          • Maximum Transmission Unit (MTU)
        • Disable Energy Saving Settings
        • Lowering Packet Rate From the Camera
    • Firewall Configuration
      • GigE Vision
      • X-Series Projector
      • XR-Series Devices

API Usage

  • API and Command Error Handling
    • API and Command Errors
    • Error Handling with Exceptions
    • Error Handling with Return Codes
      • Reading/Writing Tree Nodes
      • Command Execution
  • Debugging and Simulation
    • Exporting Debug Information
      • Exporting Debug Information Using NxTreeEdit
      • Exporting Debug Information From Your Application
        • Debug Levels
        • Enable Logging from the Start
        • Retrieving Debug Information
        • Automatic Export with File Rotation
      • Generate Custom Debug Information
    • Using File Cameras
      • Using File Cameras in NxView
        • Saving Image Data
        • Creating a File Camera
      • Using File Cameras With the API
        • Saving Image Data
        • Creating a File Camera
        • Loading Original Camera Parameters
        • Loading the Original Position of File Cameras
    • Using Virtual Cameras
      • Creating a Virtual Camera
      • Using Virtual Cameras in NxView
        • Modifying Objects Manually
        • Throwing Models into the Scene
        • Create a Randomly Filled Bin using the Scene Wizard
      • Using Virtual Cameras in User Applications
      • Accuracy and Limitations of camera simulation
  • CUDA
    • Multiple GPUs
    • Hints and Limitations
  • OpenGL
    • Headless Rendering on Linux
      • Manual Specification of the EGL Platform
      • Headless Rendering with X Server and Nvidia Driver

Camera Operation

  • Binary Pipeline
  • Basic Camera Operations
    • Opening a Camera and Setting Parameters
      • Code Examples
    • Using Parameter Presets
      • Preset Schema
      • Preset Example
        • Structuring a complex condition
      • Code Example
    • Parameter Adjustment in NxView
      • 1. Adjust Capture Settings
      • 2. Select Quality Preset
      • 3. Enable CUDA
      • 4. Post Processing
    • Reading/Writing Camera Parameter Files
      • Format of JSON parameter files
      • Parameter Import/Export from NxView
      • Code Examples
        • Open camera and read parameter file
        • Write parameter file
    • Capturing Images with Hardware Trigger
      • Code Examples
    • Using the Digital Input and Output
      • Code Examples
        • Flash Output
        • Statically setting the output state
        • Reading the input state
    • Running Cameras Hardware Synchronized
      • Code Examples
    • Analyzing Frame Times
      • Case 1: Frame Time Is Compute-Bound
      • Case 2: Frame Time Is Network-Bound
      • Case 3: Frame Time Is Limited by Flash Time
  • Operation of Specific Camera Series
    • B-Series
      • Limiting Projector Power for PoE Power Supplies
    • C-Series
      • Interaction Between Stereo and Color Devices
        • IP Configuration
        • Firmware Updates
        • Device-Internal Trigger
      • Limiting Projector Power for PoE Power Supplies
      • Factory Calibration
    • S-Series
      • Laser Projector Heating
      • Camera Node
      • Computing 3D Data
      • Stereo Matching Parameters and Filtering 3D Data
      • Functional Limitations
      • Unmatched Regions in S-Series Images
    • XR-Series
      • Software Concept
      • Commands
      • Image Acquisition
      • Image Transfer Times
      • Debug Logging
      • Functional Limitations
        • Limitations of Patch Match
      • Known Issues
      • Using the Wifi Function
  • Getting 3D Data
    • Grabbing 3D Data
      • Code Examples
    • Texturing 3D Data
      • Capturing Texture Images
      • Computing Texture from Projector Images
    • Aligning Images With 3D Data
    • Set Z-Range
      • Set distance of measurement volume far/near plane
      • Set MinimumDisparity and NumberOfDisparities
    • Optimize Settings for Performance/Quality
      • General Settings
      • Capture Parameters
      • Stereo Matching Parameters
    • Deferred 3D Processing
      • Image Acquisition
      • Stereo Processing
      • Code Examples
  • Multithreading
    • Parallel Capturing and Processing
      • Code Examples
        • Single Threaded
        • Multi Threaded
    • Parallel Usage of Multiple Cameras
      • Code Examples
        • Multi Threaded
  • Calibration
    • Calibration Patterns
      • Halcon Patterns
      • Ensenso Patterns
        • Single and Custom Single Patterns
        • Flexible Patterns
        • Assembly Patterns
      • Coordinate Systems on Calibration Patterns
      • Reference Points
      • Measurement of the Grid Spacing
      • Printing Calibration Patterns
    • Collecting Calibration Patterns
      • The Global Pattern Buffer
      • Collecting Patterns
      • Code Example
    • Calibrating a Camera
    • Checking Camera Calibrations
      • Measuring Calibration Accuracy
        • Measuring Calibration Accuracy in NxView
        • Measuring Calibration Accuracy with the NxLib
        • Error Metrics
        • Evaluating Measurement Results
      • Common Calibration Errors
      • Dynamic Recalibration
        • Code Example
    • Restore Factory Calibration
      • Overwriting EEPROM content with factory raw calibration data
      • Loading factory calibration patterns to recompute calibration data
      • Backwards Compatibility
  • Multi Camera Setups and Calibrations
    • Link Tree
      • Link Concept
      • Coordinate Systems
      • Link Tree
      • World Coordinate System
    • Multi Camera Setups
      • Example Setup
    • Texturing Point Clouds
      • Calibrating the Monocular Camera with the Calibration Wizard
      • Calibrating the Monocular Camera Manually
      • Combining Data from Stereo and Mono Camera
    • Workspace Calibration
      • Code Examples
    • Hand-Eye Calibration
      • Fixed Camera
      • Moving Camera
      • Steps to Perform a Hand-Eye Calibration
        • Example Code
      • After the Calibration
        • Fixed Camera
        • Moving Camera
        • Example Code
      • Recalibrating Robot Geometry
      • Calibrating Robots with Less Than 6 Degrees of Freedom
      • Calibration Result Improvement
  • PartFinder
    • Activating a PartFinder License
      • Licenses
      • Installation
      • License Activation
        • Evaluation License
        • Runtime License
    • First Steps with PartFinder
      • 1. Select a Camera
      • 2. Open PartFinder
      • 3. Create a Model
      • 4. Search for Parts
    • Generate a PartFinder model and search for it
      • 1. Generating a model
      • 2. Search for Parts
    • Load a PartFinder model file saved from NxView
      • 1. Loading the model
      • 2. Search for Parts

Getting 3D Data

  • Grabbing 3D Data
  • Texturing 3D Data
  • Aligning Images With 3D Data
  • Set Z-Range
  • Optimize Settings for Performance/Quality
  • Deferred 3D Processing
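
The subpages above cover each of these topics in detail. For orientation, the following is a minimal sketch of the typical capture sequence (Open, Capture, ComputeDisparityMap, ComputePointMap) using the NxLib C++ interface; the serial number "12345" is a placeholder for your camera, and error handling relies on the NxLib's default exceptions. See the "Grabbing 3D Data" subpage for the full explanation and code examples.

    #include "nxLib.h"

    #include <iostream>
    #include <string>
    #include <vector>

    int main()
    {
        nxLibInitialize(true);                 // start the NxLib and wait until it is ready

        NxLibItem root;                        // reference to the root of the NxLib tree
        std::string serial = "12345";          // placeholder: serial number of your camera

        NxLibCommand open(cmdOpen);            // open the stereo camera
        open.parameters()[itmCameras] = serial;
        open.execute();

        NxLibCommand(cmdCapture).execute();             // capture a raw stereo image pair
        NxLibCommand(cmdComputeDisparityMap).execute(); // stereo matching
        NxLibCommand(cmdComputePointMap).execute();     // reproject disparities to 3D points

        // Read the point map (one XYZ triple per pixel, in millimeters) into a flat buffer.
        std::vector<float> pointMap;
        int width = 0, height = 0;
        NxLibItem pointMapItem = root[itmCameras][serial][itmImages][itmPointMap];
        pointMapItem.getBinaryDataInfo(&width, &height, nullptr, nullptr, nullptr, nullptr);
        pointMapItem.getBinaryData(pointMap, nullptr);
        std::cout << "Point map: " << width << " x " << height << " pixels" << std::endl;

        NxLibCommand(cmdClose).execute();      // close the camera again
        nxLibFinalize();
        return 0;
    }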
