Processing from Storage
Imports asset data from user storage into UP42 for processing.
Processing from Storage runs processing jobs on specific OneAtlas data assets held in your UP42 Storage.
Note: This block is a BETA feature - please read the block description to understand its capabilities and limitations prior to use.
Supported data assets
This block connects Pléiades, SPOT, and Pléiades Neo data assets from Storage with processing blocks. Both archive and tasking asset types are supported.
- Pléiades / SPOT Reflectance (Download): Archive and Tasking
- Pléiades / SPOT Display (Download): Archive and Tasking
- Pléiades Neo: Archive and Tasking
For Pléiades Neo Archive and Tasking, the following data products are supported:
- Pansharpened 8-bit DISPLAY DIMAP-GeoTIFF ORTHO
- Pansharpened 16-bit REFLECTANCE DIMAP-GeoTIFF ORTHO
Matching processing blocks
This block can connect to any processing block that matches one of the data products listed above.
If unsure, check the corresponding block input and output capabilities to see whether they match.
Note: It may be technically possible to connect data assets to processing blocks that are not supported.
How to use this block
1.) Go to Storage.
2.) Click the "copy" button in the right column of the asset to be processed.
3.) In a new tab, create a workflow (via API, SDK, or console) using "Processing from Storage" as the data block.
4.) During workflow configuration, insert the asset ID as an array into the job parameters: "asset_ids": ["copied_asset_id"].
5.) No additional job parameters, including geometry, need to be defined.
6.) Run the job.
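The job parameters described in the steps above can be sketched as follows. This is a minimal illustration, assuming a workflow whose first task is the Processing from Storage block; the task name key ("processing-from-storage:1") and the asset ID are placeholders - check your workflow's actual task names in the console or via the API before running a job.

```python
import json

# Hypothetical job parameters for a workflow using "Processing from
# Storage" as the data block. Only "asset_ids" is required (step 4);
# no geometry or other parameters need to be set (step 5).
job_parameters = {
    "processing-from-storage:1": {  # placeholder task name
        "asset_ids": ["copied_asset_id"]  # asset ID copied from Storage
    }
}

# Serialize for use in an API request body or console job configuration.
print(json.dumps(job_parameters, indent=2))
```

Replace "copied_asset_id" with the asset ID copied from Storage in step 2.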
For the best user experience, it is recommended to use this block via the API or SDK. However, it can also be run via the console.
When asset IDs are processed from Storage, no data cost is incurred. Infrastructure and algorithm costs may still apply.
As stated above, this feature is in BETA with the following limitations:
- Data assets from other data providers are currently not supported by this block.
- Please run a test job and evaluate the output before processing large volumes of data.
- It may be possible to connect and run unsupported workflows. Use them at your own risk.
The technical documentation page can be accessed on the documentation hub.
Any questions? Please reach out to [email protected].
Terms & Conditions
View the End User License Agreement conditions.