The action Export to CSV

Modified on Fri, 15 Aug at 5:55 PM

Purpose

Keeps a trace of processed files by appending one entry per processed item into a delimited text file (CSV-style). Use it to record filename, processing date/time, or any metadata you capture in the scenario so you have a simple searchable log.


Where to add

Place Export to File at the point in your scenario where you want to record the file (before or after Convert to PDF, Save File, Summarize with AI, Send File to FTP, etc.). The action writes one line for the current pipeline item each time it runs.


Settings to trace processing in a CSV file



Main settings

  • Choose Separator
    Tab, Comma or Semicolon: Selects the field delimiter used in the output file. Pick the delimiter that matches your downstream tools (Excel, import scripts, databases).
  • Choose Fields
    Select which fields to include and the order in which they will appear in the file. Typical fields are filename, full path, and date/time.
    Use the Fields button to open the token selector and insert values. The textbox shows each chosen field in order; at runtime tokens are expanded and written as delimited columns.
  • Save to
    Full path to the trace file where lines will be appended. Example in the UI: C:\temp\%DATE_YEAR%-%DATE_MONTH%-%DATE_DAY%.txt
    Use the Search button to browse and pick a path or the Fields button to insert tokens into the filename or path so you can create daily logs, per-client logs, etc.
  • Behavior:
    If the file does not exist the action will create it and then append the first line.
    Use UNC paths (\\server\share\path) for network locations when the scenario runs as a service (avoid mapped drive letters).
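The expand-and-append behavior described above can be sketched in Python. This is an illustrative model only: the `expand_tokens` helper and the way tokens are substituted are assumptions, not the product's actual implementation; the token names mirror those shown in the UI.

```python
import datetime

def expand_tokens(template, values):
    """Replace %TOKEN% placeholders with runtime values.
    Illustrative stand-in for the action's token expansion."""
    out = template
    for token, value in values.items():
        out = out.replace(f"%{token}%", value)
    return out

def append_trace_line(path, fields, values, sep=","):
    """Expand each field token, join with the chosen separator, and
    append one line. Mode 'a' creates the file if it does not exist,
    matching the action's documented behavior."""
    line = sep.join(expand_tokens(f, values) for f in fields)
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(line + "\n")

values = {
    "DATE_FULLSHORT": datetime.datetime.now().strftime("%Y-%m-%d %H:%M"),
    "FILENAME_FULL": "invoice-123.pdf",
}
append_trace_line("trace.csv", ["%DATE_FULLSHORT%", "%FILENAME_FULL%"], values)
```

Each run adds exactly one line; re-running with a different filename (for example, one built from date tokens) starts a fresh file instead.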


Quick configuration steps

1. Add Export to File to your scenario after the step that produces the item you want logged.  

2. Choose the separator (Tab/Comma/Semicolon).  

3. Click Fields and add the field you want to record (date, filename, status, etc.) in the desired order.  

4. Enter the Save to path (use Search to pick a folder or Fields to build dynamic names).  

5. Click OK and run a test with a representative file to verify the log line format, encoding and file placement.
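The verification in step 5 can also be scripted: read the trace file back, confirm it decodes as UTF-8, and check that every line splits into the expected number of columns. This is a hedged sketch; `verify_log` is a hypothetical helper, not part of the product.

```python
import csv

def verify_log(path, expected_columns, sep=","):
    """Read the trace file back and check that every line decodes as
    UTF-8 and splits into the expected number of columns."""
    with open(path, newline="", encoding="utf-8") as fh:  # raises on invalid UTF-8
        for lineno, row in enumerate(csv.reader(fh, delimiter=sep), start=1):
            if len(row) != expected_columns:
                raise ValueError(
                    f"line {lineno}: got {len(row)} columns, expected {expected_columns}"
                )
    return True
```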


Examples

  • Daily CSV of processed files:
    Separator: Comma
    Fields: %DATE_FULLSHORT%, %FILENAME_FULL%, %FILE_SIZE%, %PROCESS_STATUS%
    Save to: \\fileserver\logs\processed-%DATE_YEAR%-%DATE_MONTH%-%DATE_DAY%.csv
    Result (one line): "2025-08-15 14:09","invoice-123.pdf","23456","Success"
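To check that a downstream consumer reads the resulting line correctly, you can parse it with Python's standard `csv` module, which handles the double-quoted fields shown above:

```python
import csv
import io

# The example output line from the daily CSV above
line = '"2025-08-15 14:09","invoice-123.pdf","23456","Success"\n'

# csv.reader strips the surrounding quotes and yields the four fields
row = next(csv.reader(io.StringIO(line)))
```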


Behavior notes and tips

  • Append mode: the action appends one line per processed item. It never rewrites the whole file; changing the filename (for example, with date tokens) simply starts a new file.  
  • Field quoting and escaping: fields containing the chosen separator, newlines or quotes are enclosed in double quotes, and embedded quotes are escaped by doubling them (standard CSV behavior).  
  • Encoding: the action uses UTF‑8 encoding by default (safest for international text). If your downstream tool requires a different encoding, convert or configure accordingly.  
  • Concurrency and file locking: multiple simultaneous runs writing to the same file can cause conflicts or partial writes. If you expect concurrent processing:
      - Prefer per-day or per-run files (use tokens in the filename), or
      - Ensure your environment serializes access to the log file, or
      - Use a centralized logging system (database, message queue) for high-concurrency scenarios.
  • File size and rotation: log files grow over time. Rotate or archive logs periodically (by using dated filenames or a scheduled archival job) to keep files manageable.
  • Permissions: ensure the account that runs the scenario has write and modify permissions to the target folder and file (both NTFS and share permissions for network paths).
  • Windows/Network considerations: when the scenario runs as a Windows service use UNC paths, not mapped drive letters, and verify the service account can access the share.
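The quoting and escaping rule described above (fields containing the separator, newlines, or quotes are wrapped in double quotes, and embedded quotes are doubled) is standard CSV behavior, which Python's `csv.writer` demonstrates with its default minimal-quoting mode:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)  # default QUOTE_MINIMAL: quote only when needed
writer.writerow(["note with, comma", 'a "quoted" word', "plain"])

# First field is quoted because it contains the separator;
# second is quoted and its embedded quotes are doubled;
# third needs no quoting at all.
```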


Best practices

  • Include a timestamp and unique identifier in the fields so each line is traceable (for example %DATE_FULLSHORT% and %RANDOM_NUMBER%).  
  • Use dated filenames for automatic log rotation: processed-%DATE_YEAR%-%DATE_MONTH%-%DATE_DAY%.csv.  
  • Keep the field order consistent so downstream imports/analytics can rely on fixed column positions.  
  • Test with a small dataset first and open the file in your target consumer (Excel, import script) to ensure delimiters and encoding are correct.
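The dated-filename pattern recommended above (processed-%DATE_YEAR%-%DATE_MONTH%-%DATE_DAY%.csv) can be mimicked in a script for downstream tooling. The `dated_log_path` helper below is a hypothetical example, not a product API:

```python
import datetime

def dated_log_path(base_dir, prefix="processed"):
    """Build a per-day log filename, mirroring the
    processed-%DATE_YEAR%-%DATE_MONTH%-%DATE_DAY%.csv token pattern,
    so each day's entries land in their own file."""
    today = datetime.date.today()
    return f"{base_dir}/{prefix}-{today:%Y-%m-%d}.csv"
```

Because the filename changes each day, old files stop growing on their own, which makes scheduled archival straightforward.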


In short: build the list of tokens you want to record, choose a separator, set the Save to path (using Fields to create dynamic filenames), and verify permissions. The Export to File action will then append one delimited line to the chosen file for every processed item.
