Code Samples
Curated integration code examples covering common patterns — SAP C4C, Salesforce, Snowflake, CSV import/export, Data Feed, and more.
Table of Contents
Core Examples

- Export Data to CSV
- Load Data from CSV to Sellers
- Load Data from CSV to Sellers Extensions
- Fetch Data from Datamart
- Fetch Data from General Data Source
- Fetch Data from Pricefx (Batched)
- Fetch Price Parameter Values
- Fetch Products from SAP C4C
- Parse CSV and Load Data to Data Feed
- Parse CSV and Load Data to General Data Source
- How to Set Quote Owner
- Send New Price Lists to SAP C4C
- Salesforce Integration (legacy)
- OpenData v2 Integration (legacy)
- Unload Data from Snowflake and Load Them into Greenplum

Proposed New Examples — High Priority

- SFTP File Retrieval and Import
- Event-Driven Route with pfx-event
- Fetch Data Directly to CSV File
- Multi-Connection Route (Cross-Partition)

Proposed New Examples — Medium Priority

- Refresh Datamart After Load
- Delete Records with Filter
- Mass Edit Records
- AWS S3 File Import
- CSV Validation Before Import
- Groovy Converter in Mapper
- Conditional Mapping with Predicate
- XML File Import
Export Data to CSV
This example shows how to fetch data from Pricefx and export it to a CSV file.
```xml
<filter id="pxCarsFilter">
    <and>
        <criterion fieldName="name" operator="equals" value="Cars"/>
    </and>
</filter>
```
.....
```xml
<route>
    <from uri="timer://fetchData?repeatCount=1"/>
    <to uri="pfx-api:fetch?filter=pxCarsFilter&amp;objectType=PX"/>
    <to uri="pfx-csv:marshal"/>
    <to uri="file:test?fileName=data1.csv&amp;fileExist=Append"/>
</route>
```
Note: For large datasets, consider using batchedMode=true with split to avoid memory issues.
Load Data from CSV to Sellers
```xml
<loadMapper id="import-sellers.mapper" includeUnmappedProperties="true"/>
```
.....
```xml
<route id="importSellers">
    <from uri="file:{{data.directory}}/import/sl?{{archive.file}}&amp;{{read.lock}}"/>
    <split strategyRef="recordsCountAggregation" streaming="true">
        <tokenize group="20000" token="\n"/>
        <to uri="pfx-csv:unmarshal?skipHeaderRecord=true"/>
        <to uri="pfx-api:loaddata?objectType=SL&amp;mapper=import-sellers.mapper"/>
    </split>
</route>
```
Note: recordsCountAggregation is a built-in aggregation strategy that tracks PfxInputRecordsCount, PfxTotalInputRecordsCount, PfxFailedBatchesCount, and PfxTotalFailedInputRecordsCount headers.
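For instance, those totals can be surfaced in a log step placed after the split completes. This is a minimal sketch appended to the route above; it assumes the aggregation strategy propagates the headers to the post-split exchange as described:

```xml
<!-- placed after </split> in the importSellers route (illustrative sketch) -->
<log message="Loaded ${header.PfxTotalInputRecordsCount} input records; failed batches: ${header.PfxFailedBatchesCount}"/>
```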
Fetch Data from Datamart
This example shows how to fetch data from a Datamart with the name 1669.DMDS.
```xml
<route>
    <from uri="timer://fetchDataFromDatamartByQuery?repeatCount=1"/>
    <to uri="pfx-api:fetch?objectType=DM&amp;dsUniqueName=1669.DMDS&amp;batchedMode=true&amp;batchSize=5000"/>
    <split>
        <simple>${body}</simple>
        <to uri="pfx-api:fetchIterator"/>
        <log message="${body}"/>
    </split>
</route>
```
dsUniqueName can be used to get data from Datamart, Datamart Data Source, or Data Feed:
| Type of Data Source | Pattern |
|---|---|
| Datamart | |
| Data Source | id.DMDS |
| Data Feed | id.DMF |
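As a sketch, fetching from a Data Feed follows the same route shape as the Datamart example above; only the dsUniqueName suffix changes. The id 1849 here is a hypothetical example value:

```xml
<route>
    <from uri="timer://fetchDataFromDataFeed?repeatCount=1"/>
    <!-- same batched fetch pattern, with the .DMF suffix for a Data Feed -->
    <to uri="pfx-api:fetch?objectType=DM&amp;dsUniqueName=1849.DMF&amp;batchedMode=true&amp;batchSize=5000"/>
    <split>
        <simple>${body}</simple>
        <to uri="pfx-api:fetchIterator"/>
        <log message="${body}"/>
    </split>
</route>
```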
Fetch Data from Pricefx (Batched)
Batched mode fetches large datasets from Pricefx in chunks, so the full result set never has to fit into Java heap memory at once. The chunk size is controlled by batchSize.
```xml
<filter id="pxCarsFilter">
    <and>
        <criterion fieldName="name" operator="equals" value="Cars"/>
    </and>
</filter>
```
.....
```xml
<route>
    <from uri="timer://fetchData?repeatCount=1"/>
    <to uri="pfx-api:fetch?filter=pxCarsFilter&amp;objectType=PX&amp;batchedMode=true&amp;batchSize=5000"/>
    <split>
        <simple>${body}</simple>
        <to uri="pfx-api:fetchIterator"/>
        <log message="${body}"/>
    </split>
</route>
```
Parse CSV and Load Data to Data Feed
This example shows how to parse a CSV file and load it into a Data Source (DMDS).
```xml
<pfx:loadMapper id="load-data-feed.mapper" convertEmptyStringToNull="true">
    <pfx:body in="sku"/>
    <pfx:body in="price"/>
</pfx:loadMapper>
```
.....
```xml
<route>
    <from uri="file:src/data-4?{{archive.file}}&amp;{{read.lock}}"/>
    <split streaming="true">
        <tokenize token="\n" group="5000"/>
        <to uri="pfx-csv:unmarshal?header=sku,label,price&amp;skipHeaderRecord=true"/>
        <to uri="pfx-api:loaddata?mapper=load-data-feed.mapper&amp;objectType=DM&amp;dsUniqueName=1849.DMDS"/>
    </split>
    <!-- Flush is required to make data visible in analytics -->
    <to uri="pfx-api:flush?objectType=DM&amp;dsUniqueName=1849.DMDS"/>
</route>
```
Parse CSV and Load Data to General Data Source
This example shows how to parse a CSV file and send it to a Product Extension with the name Cars.
```xml
<pfx:loadMapper id="load-general-ds.mapper" convertEmptyStringToNull="true">
    <pfx:simple expression="Cars" out="name"/>
    <pfx:body in="sku"/>
    <pfx:body in="label" out="attribute1"/>
    <pfx:body in="price" out="attribute2"/>
</pfx:loadMapper>
```
.....
```xml
<route>
    <from uri="file:src/data-4?{{archive.file}}&amp;{{read.lock}}"/>
    <split streaming="true">
        <tokenize token="\n" group="5000"/>
        <to uri="pfx-csv:unmarshal?header=sku,label,price&amp;skipHeaderRecord=true"/>
        <to uri="pfx-api:loaddata?mapper=load-general-ds.mapper&amp;objectType=PX"/>
    </split>
</route>
```
Fetch Products from SAP C4C
```xml
<pfx:loadMapper id="sap-product.mapper">
    <pfx:body in="ID" out="sku"/>
    <pfx:body in="Description" out="label"/>
</pfx:loadMapper>

<route>
    <from uri="timer://foo?repeatCount=1"/>
    <to uri="pfx-c4c:fetchProducts?username={{sap.c4c.username}}&amp;password={{sap.c4c.password}}&amp;url={{sap.c4c.url}}"/>
    <to uri="pfx-api:loaddata?objectType=P&amp;mapper=sap-product.mapper"/>
</route>
```
Note: Always use property placeholders ({{...}}) for credentials. Define them in your application.properties.
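For example, the placeholders above might be backed by entries like these in application.properties. All values shown are illustrative placeholders, not real endpoints or credentials:

```properties
# illustrative placeholder values only
sap.c4c.url=https://example.crm.ondemand.com
sap.c4c.username=integration-user
sap.c4c.password=changeit
```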
Fetch Price Parameter Values
```xml
<route>
    <from uri="timer://fetchData?repeatCount=1"/>
    <to uri="pfx-api:fetch?objectType=LTV&amp;pricingParameterName=ExchangeRate&amp;batchedMode=true&amp;batchSize=5000"/>
    <split>
        <simple>${body}</simple>
        <to uri="pfx-api:fetchIterator"/>
        <log message="${body}"/>
    </split>
</route>
```
You can use pricingParameterName or pricingParameterId. Note that pricingParameterId values differ between partitions.
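A variant of the fetch step using pricingParameterId might look like this. The id 4242 is hypothetical, and as noted above the real value would differ between partitions:

```xml
<!-- hypothetical id; look up the actual value on the target partition -->
<to uri="pfx-api:fetch?objectType=LTV&amp;pricingParameterId=4242&amp;batchedMode=true&amp;batchSize=5000"/>
```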
Additional Examples
For the complete set of code samples including SAP C4C price list sync, Salesforce integration, Snowflake-to-Greenplum unload, and many more proposed examples (Load Data File, Integrate Data, MLTV2, SFTP, database integration, etc.), refer to the full documentation in the git repository at docs/code-samples.md.
Common Pitfalls
- Missing flush after DMDS load — Data will not appear in analytics without `pfx-api:flush` after loading to a Data Source.
- Large exports without batchedMode — Fetching large datasets without `batchedMode=true` can cause OutOfMemoryError.
- Hardcoded credentials — Always use `{{property}}` placeholders for URLs, usernames, and passwords.
- tokenize token attribute — The `token` attribute uses `\n` to represent the newline character for splitting CSV lines.
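As a minimal sketch of that last point, a correct tokenize element inside a streaming split looks like this (the group size is an arbitrary example value):

```xml
<split streaming="true">
    <!-- "\n" is the literal two-character sequence Camel interprets as a newline delimiter -->
    <tokenize token="\n" group="5000"/>
    <to uri="pfx-csv:unmarshal?skipHeaderRecord=true"/>
</split>
```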