Let’s start by creating a new function, which will contain the whole process (the exact same principles apply to Events or any other setup). We’ll call it MyPointCloudFunction.
Once the function is ready, we’ll go straight into loading the data and storing it as an asset variable.
Properties First Line and Last Line determine the range of lines to process while reading the given file. In this example we will load points 1000 - 50000 (or fewer, if the file doesn't actually hold that many), discarding the rest.
Setting Last Line to 0 will read data until the end of the file, while setting both to 0 will read the whole file.
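The line-range behavior described above can be sketched as follows. This is an illustrative Python snippet, not the plugin's actual implementation or API; the function name and parameters are hypothetical:

```python
def select_lines(lines, first_line, last_line):
    """Sketch of the assumed First Line / Last Line semantics.

    first_line and last_line are 1-based line numbers;
    a value of 0 means "unbounded" on that side.
    """
    if first_line == 0:
        first_line = 1            # 0 -> start at the beginning of the file
    if last_line == 0:
        last_line = len(lines)    # 0 -> read until the end of the file
    # Clamp to what the file actually holds
    last_line = min(last_line, len(lines))
    return lines[first_line - 1:last_line]
```

For example, with a 10,000-line file and a range of (1000, 50000), only lines 1000 through 10,000 would be read, matching the "or fewer, if the file doesn't hold that many" behavior above.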
This property determines the clipping and normalization range for color values.
Setting this to (0, 255) will result in all values being clipped to the 0 - 255 range, then normalized to the 0 - 1 range.
-100 would be clipped to 0, then normalized to 0.0f
0 does not need to be clipped, and will be normalized to 0.0f
51 does not need to be clipped, and will be normalized to 0.2f
255 does not need to be clipped, and will be normalized to 1.0f
512 would be clipped to 255, then normalized to 1.0f
If set to (0, 0), the best match will be automatically determined based on the data contained within the file.
WARNING: Applying an incorrect range will most likely result in incorrect color representation.
This object determines which columns to use as which data sources.
Use 0-based indexing when specifying the column ID
In this example, we will:
Use the first column in the file as the X location
Use the second column in the file as the Y location
Use the third column in the file as the Z location
Use the seventh column in the file as the Red channel
Skip the Green and Blue channels - this will result in the cloud importing as intensity data only
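Conceptually, the column mapping above behaves like this. The snippet is an illustrative Python sketch assuming space-delimited data; the function and its parameters are hypothetical, not the plugin's API:

```python
def parse_point(line, x_col=0, y_col=1, z_col=2,
                red_col=6, green_col=None, blue_col=None):
    """Map 0-based columns of a text line to point attributes.

    Defaults match the example above: columns 0-2 are XYZ, column 6
    is Red, and Green/Blue are unused, so the point carries only
    intensity data.
    """
    cols = line.split()
    point = {
        "x": float(cols[x_col]),
        "y": float(cols[y_col]),
        "z": float(cols[z_col]),
    }
    if green_col is None and blue_col is None:
        # With only Red mapped, the value is treated as intensity
        point["intensity"] = float(cols[red_col])
    else:
        point["r"] = float(cols[red_col])
        point["g"] = float(cols[green_col])
        point["b"] = float(cols[blue_col])
    return point

parse_point("1.0 2.0 3.0 0 0 0 128")
# -> {'x': 1.0, 'y': 2.0, 'z': 3.0, 'intensity': 128.0}
```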
At this point you can treat the asset just like one imported through the Content Browser. In this example, we'll assign it to a hypothetical Point Cloud Actor.
Here is the full code snippet, along with a full-sized Blueprint screenshot of the complete process.