# Routes

Camel route definitions — the main integration logic. Each file defines one or more routes that orchestrate data flow.
| Attribute | Details |
|---|---|
| Purpose | Define data flow: what triggers the route, how data is processed, where it goes |
| Format | XML — `<routes>` wrapper with Camel Spring DSL `<route>` elements inside |
| Naming | Descriptive, by function: `import-products.xml`, `export-prices.xml`, `event-handlers.xml` |
| Loaded by | Camel context at startup — all `*.xml` files are auto-discovered |
## When to Use Each Pattern

| Pattern | Use when | Key components |
|---|---|---|
| CSV → Pricefx (`loaddataFile`) | Default for P, PX, CX, C imports | |
| CSV → Pricefx (`loaddata`) | Need Groovy row-level logic | |
| CSV → Data Source (DMDS) | PA/Datamart imports | |
| Pricefx → CSV | Export data | |
| Pricefx → CSV (large) | Export 50k+ rows | |
| Event handler | React to Pricefx events | |
| Scheduled | Periodic execution | |
## Route XML Structure

Routes are defined in XML files under the `routes/` directory using Apache Camel Spring DSL with Pricefx extensions. Each route file uses a `<routes>` wrapper element with the Camel Spring namespace — no `beans` wrapper or XML declaration is needed.

### Minimal Route File

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="myRoute">
        <from uri="..."/>
        <to uri="..."/>
    </route>
</routes>
```
## Common Route Patterns

### Import: CSV File to Pricefx (Data Source / DMDS)

The standard DMDS import pattern: read CSV, split into batches, unmarshal, load, flush.

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="csvImportToDatasource">
        <from uri="file:{{import.fromUri}}"/>
        <split>
            <tokenize group="5000" token="\n"/>
            <to uri="pfx-csv:unmarshal?header=sku,label,price&amp;skipHeaderRecord=true&amp;delimiter=,"/>
            <to uri="pfx-api:loaddata?mapper=myMapper&amp;objectType=DM&amp;dsUniqueName=Product"/>
        </split>
        <onCompletion onCompleteOnly="true">
            <to uri="pfx-api:flush?dataFeedName=DMF.Product&amp;dataSourceName=DMDS.Product"/>
        </onCompletion>
    </route>
</routes>
```
Key points:

- `split` + `tokenize group="5000"` batches CSV rows for efficient loading
- `pfx-csv:unmarshal` converts CSV text to a List of Maps
- `pfx-api:loaddata` sends data to Pricefx using a mapper
- `onCompletion` with `pfx-api:flush` is mandatory for DMDS imports
- Both `objectType=DM` and `objectType=DMDS` work for `loaddata` with `dsUniqueName`, but `DM` is the standard convention used across the project
### Export: Pricefx to CSV File

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="exportToCsv">
        <from uri="timer://fetchData?repeatCount=1"/>
        <to uri="pfx-api:fetch?filter=myFilter&amp;objectType=PX"/>
        <to uri="pfx-csv:marshal"/>
        <to uri="file:export?fileName=data.csv&amp;fileExist=Append"/>
    </route>
</routes>
```
### Batched Export (Large Datasets)

For large datasets, use `batchedMode` to avoid memory issues:

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="batchedExport">
        <from uri="timer://fetchData?repeatCount=1"/>
        <to uri="pfx-api:fetch?filter=myFilter&amp;objectType=PX&amp;batchedMode=true&amp;batchSize=5000"/>
        <split>
            <simple>${body}</simple>
            <to uri="pfx-api:fetchIterator"/>
            <to uri="pfx-csv:marshal"/>
            <to uri="file:export?fileName=data.csv&amp;fileExist=Append"/>
        </split>
    </route>
</routes>
```

The first `pfx-api:fetch` (with `batchedMode=true`) returns batch descriptors. The `split` iterates them, and `pfx-api:fetchIterator` fetches each batch using state from the exchange.
### Import: Product Master Data (P)

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="productImport">
        <from uri="file:{{products.fromUri}}"/>
        <to uri="pfx-csv:unmarshal"/>
        <to uri="pfx-api:loaddata?objectType=P&amp;mapper=productMapper"/>
        <onCompletion onCompleteOnly="true">
            <to uri="pfx-api:internalCopy?label=Product"/>
        </onCompletion>
    </route>
</routes>
```
### Delta / Incremental Export

Export only records modified since the last run. The filter is defined in its own file under `filters/`:

`filters/delta-filter.xml`:

```xml
<filters>
    <filter id="deltaFilter">
        <and>
            <criterion fieldName="name" operator="equals" value="Cars"/>
            <criterion fieldName="lastUpdateDate" operator="greaterThan" value="simple:${header.lastExportTimestamp}"/>
        </and>
    </filter>
</filters>
```

`routes/delta-export.xml`:

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="delta-export">
        <from uri="timer://deltaExport?repeatCount=1"/>
        <to uri="pfx-api:fetch?filter=deltaFilter&amp;objectType=PX&amp;batchedMode=true&amp;batchSize=5000"/>
        <split>
            <simple>${body}</simple>
            <to uri="pfx-api:fetchIterator"/>
            <to uri="pfx-csv:marshal"/>
            <to uri="file:export?fileName=delta-${date:now:yyyyMMdd-HHmmss}.csv"/>
        </split>
    </route>
</routes>
```

Note the route ID `delta-export` matches the file name `delta-export.xml` without the extension, as required.
### Export to SFTP

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="exportToSftp">
        <from uri="timer://fetchData?repeatCount=1"/>
        <to uri="pfx-api:fetch?filter=myFilter&amp;objectType=PX"/>
        <to uri="pfx-csv:marshal"/>
        <to uri="sftp:{{ftp.address}}:{{ftp.port}}?username={{ftp.username}}&amp;password={{ftp.password}}&amp;fileName={{ftp.path}}/export_${date:now:yyyyMMdd}.csv&amp;useUserKnownHostsFile=false"/>
    </route>
</routes>
```
## Events

Pricefx generates events that IM can process. Configure pull-based event handling in `application.properties`:

### Pull Events (Recommended)

```properties
integration.events.enabled=true
integration.events.scheduler-delay=60000
integration.events.delay=10000
integration.events.event-to-route-mapping.PADATALOAD_COMPLETED=direct:eventPADataLoadCompleted
integration.events.event-to-route-mapping.CALCULATION_COMPLETED_CFS=direct:eventCFSCompleted
```
Event handler route:

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="eventPADataLoadCompleted">
        <from uri="direct:eventPADataLoadCompleted"/>
        <setHeader name="targetName">
            <simple>${body[data][0][targetName]}</simple>
        </setHeader>
        <choice>
            <when>
                <simple>${headers.targetName} == "DMDS.Product"</simple>
                <to uri="pfx-api:truncate?targetName=DMF.Product"/>
            </when>
        </choice>
    </route>
</routes>
```
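The second mapping in the properties above (`CALCULATION_COMPLETED_CFS`) needs a matching route consuming from `direct:eventCFSCompleted`. A minimal hedged sketch — the route body here is illustrative, and `direct:export-prices` is a hypothetical follow-up route name:

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <!-- Target of the CALCULATION_COMPLETED_CFS event-to-route mapping -->
    <route id="eventCFSCompleted">
        <from uri="direct:eventCFSCompleted"/>
        <log message="CFS calculation completed: ${body}"/>
        <!-- e.g. kick off a dependent export (hypothetical route name) -->
        <to uri="direct:export-prices"/>
    </route>
</routes>
```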
### Event Properties

| Property | Default | Description |
|---|---|---|
| `integration.events.enabled` | `false` | Enable/disable event processing |
| `integration.events.scheduler-delay` | `60000` | Pull interval from Pricefx (ms) |
| `integration.events.delay` | `10000` | File read interval (ms) |
| | `1000` | Max events per pull |
| | `pricefx` | Pricefx connection name |
| | `1000` | Initial delay before first pull (ms) |
### Stuck Events Monitoring

| Property | Default | Description |
|---|---|---|
| | `true` | Check for stuck events |
| | `7200000` | Check interval (ms; 2 hours) |
| | `24` | Age at which an event is considered stuck |
| | `h` | Unit for the stuck-age value: `d`, `h`, `m`, `s` |
## File Consumer Patterns

When reading files from the local file system, always configure archiving and file locking via standard properties.

### Standard Properties

These properties are defined once in `config/application.properties` and referenced in route URIs via `{{...}}` placeholders:

```properties
# Move processed files to timestamped archive (default — always add this)
archive.file=move=.archive/${date:now:yyyy}/${date:now:MM}/${file:name.noext}__${date:now:yyyyMMdd_HHmmss}.${file:ext}

# Wait until file size stabilizes before processing (use when no .done marker)
read.lock=readLock=changed

# Wait for a .done marker file before processing (use when upstream writes .done)
done.file=doneFileName=${file:name}.done

# Move files that fail processing to an error folder (optional, add on request)
error.file=moveFailed=.error/${file:name.noext}__${date:now:yyyyMMdd-HHmmss}.${file:ext}
```
### File URI Template

```xml
<from uri="file://{{integration.sftp.root}}/my-path?delay=10000&amp;{{archive.file}}&amp;{{read.lock}}"/>
```
Rules:

- Always include `{{archive.file}}` — moves processed files to `.archive/YYYY/MM/filename__timestamp.ext`
- Use either `{{read.lock}}` or `{{done.file}}` — never both, never neither. Default: `{{read.lock}}`
  - `{{read.lock}}` — waits until the file size stabilizes (no external signal required)
  - `{{done.file}}` — waits for a `.done` marker file (e.g., `data.csv.done`) before reading `data.csv`
- `{{error.file}}` is optional — moves failed files to `.error/filename__timestamp.ext` on processing failure
- Never use `noop=true` on `file:` consumers — files must be moved/archived after processing (exception: `pollEnrich` for one-shot reads)
- Never use the `include` parameter by default
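When the upstream system writes `.done` markers, the same template swaps `{{read.lock}}` for `{{done.file}}` — a sketch, assuming the same `my-path` directory as in the template above and the optional error folder enabled:

```xml
<!-- Waits for data.csv.done before consuming data.csv; archives on success, moves to .error/ on failure -->
<from uri="file://{{integration.sftp.root}}/my-path?delay=10000&amp;{{archive.file}}&amp;{{done.file}}&amp;{{error.file}}"/>
```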
## Scheduled Exports with Quartz Cron

For time-based scheduling, use the Quartz scheduler. In Camel URIs, spaces in cron expressions are replaced with `+`:

```xml
<!-- Every day at 6:00 AM -->
<from uri="quartz://export/my-export?cron=0+0+6+*+*+?"/>

<!-- Every day at midnight -->
<from uri="quartz://export/my-export?cron=0+0+0+*+*+?"/>

<!-- Every hour -->
<from uri="quartz://export/my-export?cron=0+0+*+*+*+?"/>

<!-- Monday–Friday at 8:00 AM -->
<from uri="quartz://export/my-export?cron=0+0+8+?+*+MON-FRI"/>

<!-- First day of each month at 1:00 AM -->
<from uri="quartz://export/my-export?cron=0+0+1+1+*+?"/>
```

Cron format: `seconds minutes hours day-of-month month day-of-week`
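The cron expression can also be externalized, following the same `{{property}}` placeholder convention used elsewhere in this document. The property name `export.cron` is an assumption, not a project standard:

```properties
# config/application.properties — keep the '+' form, as in the URIs above
export.cron=0+0+6+*+*+?
```

```xml
<from uri="quartz://export/my-export?cron={{export.cron}}"/>
```

This lets each environment override the schedule without touching the route XML.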
## Delta Sync Pattern

Delta sync exports only records modified since the last run, using `pfx-config:get`/`set` to persist the timestamp between runs.

```xml
<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="delta-export-products">
        <from uri="quartz://export/delta-export-products?cron=0+0+6+*+*+?"/>

        <!-- Read last export timestamp from partition -->
        <toD uri="pfx-config:get?name={{integration.name}}.${routeId}.export.timestamp&amp;toHeader=lastExportTimestamp"/>

        <!-- Fallback: first run exports everything -->
        <choice>
            <when>
                <simple>${headers.lastExportTimestamp} == null || ${headers.lastExportTimestamp} == ''</simple>
                <setHeader name="lastExportTimestamp">
                    <constant>1970-01-01T00:00:00</constant>
                </setHeader>
            </when>
        </choice>

        <!-- Capture current time as upper bound -->
        <setHeader name="currentExportTimestamp">
            <simple>${date-with-timezone:now:UTC:yyyy-MM-dd'T'HH:mm:ss}</simple>
        </setHeader>

        <toD uri="pfx-api:fetch?objectType=P&amp;filter=delta-export-products.filter&amp;batchedMode=true&amp;batchSize=5000"/>
        <split>
            <simple>${body}</simple>
            <to uri="pfx-api:fetchIterator"/>
            <to uri="pfx-csv:marshal"/>
            <to uri="file://{{integration.sftp.root}}/export?fileName=products-delta-${date:now:yyyyMMdd}.csv&amp;fileExist=Append"/>
        </split>

        <!-- Save upper bound as the new stored timestamp -->
        <toD uri="pfx-config:set?name={{integration.name}}.${routeId}.export.timestamp&amp;value=${headers.currentExportTimestamp}"/>
    </route>
</routes>
```
Delta filter (`filters/delta-export-products.filter.xml`):

```xml
<filter id="delta-export-products.filter" sortBy="lastUpdateDate">
    <and>
        <criterion fieldName="lastUpdateDate" operator="greaterThan" value="simple:${headers.lastExportTimestamp}"/>
        <criterion fieldName="lastUpdateDate" operator="lessOrEqual" value="simple:${headers.currentExportTimestamp}"/>
    </and>
</filter>
```

The time window (`greaterThan` lower bound + `lessOrEqual` upper bound) ensures that records changing during the export are picked up in the next run, not lost.
## Common Pitfalls

- Route ID must match the file name (without `.xml`). File `import-products.xml` → `id="import-products"`. A mismatched ID causes a deployment failure — IM cannot start the route.
- Every route must have a unique `id` across the entire project.
- Always use `&amp;` for `&` in XML attribute values.
- DMDS imports must include `onCompletion` with `pfx-api:flush`.
- Use `{{property}}` placeholders for paths and credentials — never hardcode.
- Use `split` + `tokenize` for CSV file imports (batch processing).
- Use `batchedMode=true` for large data exports.
- Route files use a `<routes xmlns="http://camel.apache.org/schema/spring">` wrapper — no `beans` wrapper or XML declaration needed.
- PX and CX have no `extensionName` URI parameter — the table name is set as `<constant expression="{TableName}" out="name"/>` in the mapper.
- Do not use `connection=pricefx` — the default connection is `pricefx` and is used automatically.
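For example, for a hypothetical PX table named `ProductCost`, the table name lives in the mapper (not in the route URI). A sketch using only the `<constant>` form shown above; the surrounding mapper definition and its field mappings are omitted:

```xml
<!-- Inside the PX mapper definition: sets the target table name -->
<constant expression="ProductCost" out="name"/>
```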
## Rules

- Route ID MUST match the file name without `.xml` — a mismatched ID causes deployment failure
- Use the `<routes xmlns="http://camel.apache.org/schema/spring">` wrapper — no `<beans>`, no `<?xml>` declaration
- All `&` in URI attributes MUST be `&amp;`
- DMDS imports MUST include `onCompletion` with `pfx-api:flush` (outside the `<split>` block)
- NEVER use `noop=true` on `file:` consumers — use `{{archive.file}}` + `{{read.lock}}`
- NEVER hardcode file paths or credentials — use `{{property}}` placeholders
- NEVER include `connection=pricefx` — it's the default
- PX/CX: the table name goes in the mapper as `<constant out="name"/>`, NOT as a URI parameter
- Inside a batched export split: use `pfx-api:fetchIterator`, NOT `pfx-api:fetch`
- Quartz cron: spaces replaced with `+` in the URI
## Quick Reference: Common From URIs

| URI | Description |
|---|---|
| `file:...` | Read files from directory |
| `timer://name?repeatCount=1` | One-time trigger |
| `quartz://group/name?cron=...` | Periodic trigger |
| `direct:name` | Internal route call (sync) |
| `seda:name` | Internal route call (async) |