Scale Point Cloud
SUMMARY
Scale Point Cloud applies uniform scaling to a point cloud about a specified center point using the formula: p' = scale * (p - center) + center.
This Skill is useful in industrial, mobile, and humanoid robotics pipelines for adjusting the size of objects or scenes. For example, it can scale CAD-derived point clouds to match real-world measurements in manufacturing, normalize LIDAR point clouds for mobile robot mapping, or adjust object models for humanoid robot simulation and manipulation. Scaling is often applied during preprocessing or synthetic data generation.
Use this Skill when you want to resize a point cloud while preserving its relative geometry.
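As a quick sanity check of the formula, here is a minimal NumPy sketch of the math itself (plain arrays only; this is not the Skill call, which is shown below):
import numpy as np

# p' = scale * (p - center) + center, applied row-wise to an N x 3 array of points.
points = np.array([[1.0, 2.0, 0.0],
                   [3.0, -1.0, 2.0]])
center = np.array([0.0, 0.0, 0.0])
scale = 0.5

scaled = scale * (points - center) + center
# With the center at the origin this reduces to plain multiplication:
# [[0.5, 1.0, 0.0], [1.5, -0.5, 1.0]]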
The Skill
from telekinesis import vitreous
import numpy as np
scaled_point_cloud = vitreous.scale_point_cloud(
    point_cloud=point_cloud,
    scale_factor=0.5,
    center_point=np.array([0.0, 0.0, 0.0]),
    modify_inplace=False,
)
Performance Note
Current Data Limits: The system currently supports up to 1 million points per request (approximately 16 MB of data). We are actively optimizing data transfer performance as part of our beta program, with improvements rolling out regularly.
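If you are unsure whether a cloud fits within this limit, a quick pre-check can avoid a failed request. This is only a sketch: the constant mirrors the 1 million point limit stated above, and point_cloud is assumed to be a loaded cloud with a positions array.
from loguru import logger

MAX_POINTS = 1_000_000  # current per-request limit noted above

num_points = len(point_cloud.positions)
if num_points > MAX_POINTS:
    logger.warning(f"{num_points} points exceeds the {MAX_POINTS}-point limit; consider downsampling first")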
Example
Raw point cloud
Unprocessed input point cloud.
Scaled point cloud
Point cloud after scaling.
Parameters: scale_factor = 0.3
The Code
from telekinesis import vitreous
from datatypes import datatypes, io
import pathlib
import numpy as np
# Optional for logging
from loguru import logger
DATA_DIR = pathlib.Path("path/to/telekinesis-data")
# Load point cloud
filepath = str(DATA_DIR / "point_clouds" / "relay_2_raw.ply")
point_cloud = io.load_point_cloud(filepath=filepath)
logger.success(f"Loaded point cloud with {len(point_cloud.positions)} points")
# Execute operation
scaled_point_cloud = vitreous.scale_point_cloud(
    point_cloud=point_cloud,
    scale_factor=0.3,
    center_point=np.array([0.0, 0.0, 0.0], dtype=np.float32),
    modify_inplace=False,
)
logger.success(f"Scaled point cloud with {len(scaled_point_cloud.positions)} points")
Running the Example
Runnable examples are available in the Telekinesis examples repository. Follow the README in that repository to set up the environment. Once set up, you can run this specific example with:
cd telekinesis-examples
python examples/vitreous_examples.py --example scale_point_cloud
The Explanation of the Code
The script begins by importing the required modules for point cloud processing (vitreous), data handling (datatypes, io), file management (pathlib), numerical operations (numpy), and optional logging (loguru).
from telekinesis import vitreous
from datatypes import datatypes, io
import pathlib
import numpy as np
# Optional for logging
from loguru import logger
Next, the point cloud is loaded from disk. Logging confirms that the cloud has been loaded successfully and reports the number of points, which is useful for understanding the dataset size before performing any transformations.
DATA_DIR = pathlib.Path("path/to/telekinesis-data")
# Load point cloud
filepath = str(DATA_DIR / "point_clouds" / "relay_2_raw.ply")
point_cloud = io.load_point_cloud(filepath=filepath)
logger.success(f"Loaded point cloud with {len(point_cloud.positions)} points")The main operation applies the scale_point_cloud Skill. This Skill performs a uniform scaling of the point cloud relative to a specified center point. Key parameters include:
- center_point: the 3D coordinate around which the scaling is applied. Points are moved closer to or farther from this reference point depending on the scale factor.
- scale_factor: the multiplier that defines how much to scale the point cloud. Values greater than 1 enlarge the cloud, while values less than 1 shrink it.
- modify_inplace: whether to update the original point cloud or return a new scaled copy.
# Execute operation
scaled_point_cloud = vitreous.scale_point_cloud(
    point_cloud=point_cloud,
    scale_factor=0.3,
    center_point=np.array([0.0, 0.0, 0.0], dtype=np.float32),
    modify_inplace=False,
)
logger.success(f"Scaled point cloud with {len(scaled_point_cloud.positions)} points")
This operation is useful in industrial perception pipelines for normalizing object sizes, adapting point clouds to a simulation or visualization scale, or preparing data for algorithms that assume a specific geometric scale.
How to Tune the Parameters
The scale_point_cloud Skill has three parameters that control the scaling operation:
scale_factor (required):
- The uniform scale factor to apply
- Values > 1.0 enlarge the point cloud
- Values < 1.0 shrink the point cloud
- A value of exactly 1.0 leaves the point cloud unchanged
- Increase the factor to make the cloud larger; decrease it to make the cloud smaller
- Typical range: 0.01-100.0
- Use 0.1-0.5 to shrink, 0.5-2.0 for moderate scaling, and 2.0-10.0 to enlarge significantly
center_point (required):
- The 3D point about which scaling is performed
- Units: same as your point cloud (e.g., meters if the cloud is in meters, millimeters if it is in millimeters)
- All points are scaled relative to this center
- Points at this location remain unchanged
- Typically set to the centroid or a known reference point
- Use calculate_point_cloud_centroid to get the centroid if needed
modify_inplace (default: False):
- Whether to modify the input point cloud in place or create a new one
- True: modifies the original point cloud (saves memory but destructive)
- False: returns a new point cloud (preserves the original, recommended)
- Default: False
TIP
Best practice: Keep modify_inplace=False (default) to preserve the original point cloud. Use calculate_point_cloud_centroid to get the centroid as the center point for scaling around the object's center.
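The snippet below sketches that practice. It assumes point_cloud.positions is an N x 3 NumPy array and that calculate_point_cloud_centroid returns a 3D coordinate; the bounding-box-based scale factor is just one way to normalize size, not a requirement of the Skill.
import numpy as np
from telekinesis import vitreous

# Scale about the object's own centroid.
centroid = vitreous.calculate_point_cloud_centroid(point_cloud=point_cloud)

# Derive a factor that fits the cloud into a 1.0-unit bounding box (illustrative assumption).
extents = point_cloud.positions.max(axis=0) - point_cloud.positions.min(axis=0)
scale_factor = 1.0 / float(extents.max())

normalized_cloud = vitreous.scale_point_cloud(
    point_cloud=point_cloud,
    scale_factor=scale_factor,
    center_point=np.asarray(centroid, dtype=np.float32),
    modify_inplace=False,  # keep the original cloud intact
)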
Where to Use the Skill in a Pipeline
Point cloud scaling is commonly used in the following pipelines:
- Size normalization and standardization
- CAD to real-world conversion
- Simulation and visualization preparation
- Unit conversion and calibration (see the sketch below)
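For example, unit conversion is just a uniform scale about the origin. A sketch converting a cloud assumed to be in millimeters into meters:
import numpy as np
from telekinesis import vitreous

# 1 mm = 0.001 m, so scaling by 0.001 converts millimeter coordinates to meters.
cloud_in_meters = vitreous.scale_point_cloud(
    point_cloud=point_cloud,  # assumed to be expressed in millimeters
    scale_factor=0.001,
    center_point=np.array([0.0, 0.0, 0.0], dtype=np.float32),  # origin keeps the frame aligned
    modify_inplace=False,
)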
A typical pipeline for size normalization looks as follows:
# Example pipeline using point cloud scaling (parameters omitted).
from telekinesis import vitreous
from datatypes import io
import numpy as np
# 1. Load point cloud
point_cloud = io.load_point_cloud(...)
# 2. Preprocess point cloud
filtered_cloud = vitreous.filter_point_cloud_using_statistical_outlier_removal(...)
# 3. Calculate centroid for scaling center
centroid = vitreous.calculate_point_cloud_centroid(point_cloud=filtered_cloud)
# 4. Scale point cloud (e.g., to normalize size)
scaled_cloud = vitreous.scale_point_cloud(
    point_cloud=filtered_cloud,
    scale_factor=0.5,  # Shrink to half size
    center_point=centroid,
    modify_inplace=False,
)
# 5. Optional: Apply an additional transform if needed (transform_matrix defined elsewhere)
transformed_cloud = vitreous.apply_transform_to_point_cloud(
    point_cloud=scaled_cloud,
    transformation_matrix=transform_matrix,
)
# 6. Process the scaled point cloud
clusters = vitreous.cluster_point_cloud_using_dbscan(...)
Related skills to build such a pipeline:
- calculate_point_cloud_centroid: get the center point for scaling
- apply_transform_to_point_cloud: apply additional transformations after scaling
- filter_point_cloud_using_statistical_outlier_removal: clean the input before scaling
- cluster_point_cloud_using_dbscan: process the scaled point cloud
Alternative Skills
There are no direct alternative skills for scaling point clouds. This Skill is the primary method for uniform scaling about a center point.
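For non-uniform scaling, the section below points to apply_transform_to_point_cloud. As a sketch of what that could look like, assuming transformation_matrix accepts a 4x4 homogeneous transform (an assumption, not documented here):
import numpy as np
from telekinesis import vitreous

center = np.array([0.0, 0.0, 0.0])
scales = np.array([0.5, 0.5, 2.0])  # shrink X and Y, stretch Z

# Build p' = S * (p - center) + center as a single homogeneous transform.
transform = np.eye(4)
transform[:3, :3] = np.diag(scales)
transform[:3, 3] = center - scales * center

stretched_cloud = vitreous.apply_transform_to_point_cloud(
    point_cloud=point_cloud,
    transformation_matrix=transform,
)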
When Not to Use the Skill
Do not use Scale Point Cloud when:
- You need non-uniform scaling (use apply_transform_to_point_cloud with a transformation matrix instead)
- You need to translate or rotate (use apply_transform_to_point_cloud instead)
- The point cloud is empty (the operation will succeed but return an empty point cloud)
- You want to preserve the original point cloud but have set modify_inplace=True (keep the default modify_inplace=False instead)
- The scale factor is 0 or negative (this will collapse or invert the point cloud)
WARNING
Important: Ensure scale_factor is positive. Negative values will invert the point cloud, and a value of 0 will collapse all points to the center point.
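A small guard before calling the Skill makes this failure mode explicit. A sketch (the helper name is ours, not part of the library):
def checked_scale_factor(scale_factor: float) -> float:
    """Reject factors that would collapse (0) or invert (< 0) the point cloud."""
    if scale_factor <= 0.0:
        raise ValueError(f"scale_factor must be positive, got {scale_factor}")
    return scale_factor

# Usage: pass scale_factor=checked_scale_factor(0.3) into vitreous.scale_point_cloud.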

