The Table File Reader processes Excel and CSV files, extracting structured data from spreadsheets with support for thousands of rows and multiple sheets. It handles both bulk data import and single-row extraction via cell references, across multiple sheets, for use in automation workflows.

Key Features

  • Multi-format support — Process Excel (.xlsx, .xls) and CSV files with automatic format detection
  • High-volume processing — Handle thousands of rows with batch processing
  • Multi-sheet support — Extract data from multiple Excel sheets with cross-sheet cell references
  • Flexible data mapping — Use column names or cell references for field mapping and extraction

Supported File Types

The Table File Reader can process various spreadsheet formats:

Excel files
  • XLSX — Modern Excel format (Excel 2007+)
  • XLS — Legacy Excel format (Excel 97–2003)
  • Multi-sheet workbooks — Extract from specific sheets or all sheets
  • Complex formulas — Processes calculated values

CSV files
  • CSV — Comma-separated values
  • TSV — Tab-separated values
  • Custom delimiters — Configure custom separators
  • UTF-8 encoding — Full Unicode support

Data layouts
  • Tabular data — Structured rows and columns
  • Report formats — Header rows and summary data
  • Template-based — Data in specific cell locations
  • Mixed formats — Combination of structured and template data
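To illustrate how automatic detection of custom delimiters can work for CSV/TSV files, here is a minimal sketch using Python's standard `csv.Sniffer`. The sample data and candidate delimiter set are assumptions for illustration, not the reader's actual detection logic.

```python
import csv
import io

# Hypothetical semicolon-delimited sample; the delimiter is not known up front.
sample = "name;amount;date\nAlice;120;2024-01-15\nBob;75;2024-01-16\n"

# Sniff the delimiter from a candidate set, then parse with the detected dialect.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
rows = list(csv.reader(io.StringIO(sample), dialect))

print(dialect.delimiter)  # ;
print(rows[0])            # ['name', 'amount', 'date']
```

The same approach extends to TSV and pipe-delimited files by widening the candidate delimiter set.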

Processing Modes

The Table File Reader supports two modes:
  • Bulk data processing — Process many rows using column names. Set the header row, data row range, and map column names to field names with appropriate types (Text, Number, Date). Use batch size and error-handling options for large files. Best for data imports, bulk updates, and ETL.
  • Single-row processing — Extract specific values using cell references (e.g., A1, B2, Sheet1!C3). Use sheet selection and cross-sheet references (Sheet1!A1, Sheet2!B2) for multi-sheet workbooks. Best for forms, template-based extraction, and multi-sheet reports.
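The bulk mode described above amounts to reading rows under a header and casting each mapped column to a target field and type. A minimal sketch with Python's standard `csv` module follows; the column names, field names, and date format are illustrative assumptions.

```python
import csv
import io
from datetime import datetime

# Hypothetical CSV content standing in for an uploaded file.
data = (
    "Customer_Name,Purchase_Date,Amount\n"
    "Alice,2024-01-15,120.50\n"
    "Bob,2024-01-16,75.00\n"
)

# Map each source column to a (field name, type cast) pair, mirroring the
# column-name-to-field mapping with Text/Date/Number types described above.
column_map = {
    "Customer_Name": ("customer_name", str),
    "Purchase_Date": ("purchase_date",
                      lambda s: datetime.strptime(s, "%Y-%m-%d").date()),
    "Amount": ("amount", float),
}

records = []
for row in csv.DictReader(io.StringIO(data)):
    record = {field: cast(row[col]) for col, (field, cast) in column_map.items()}
    records.append(record)

print(records[0]["customer_name"], records[0]["amount"])  # Alice 120.5
```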

Field Mapping Examples

  • Column-based (bulk): Map source columns to fields (e.g., “Customer_Name” → customer_name, “Purchase_Date” → purchase_date with Date type). You can set the header row, data start/end rows, skipped columns, and default values for complex sheets.
  • Cell references (single-row): Map cells to fields (e.g., Customer Name → A2, Order Date → B2). For multi-sheet workbooks, use sheet prefixes (e.g., Customer!A2, Finance!C2).
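The `Sheet!Cell` reference syntax above can be sketched as a small resolver: split an optional sheet prefix from the cell address, then look the cell up. The in-memory "workbook" dictionary below is a stand-in for illustration, not the reader's actual data model.

```python
import re

# Hypothetical workbook: sheet name -> {cell address: value}.
workbook = {
    "Customer": {"A2": "Alice", "B2": "2024-01-15"},
    "Finance": {"C2": 120.50},
}

def resolve(ref, default_sheet="Customer"):
    """Resolve a reference like 'A2' or 'Finance!C2' to a cell value."""
    m = re.fullmatch(r"(?:([^!]+)!)?([A-Z]+[0-9]+)", ref)
    sheet, cell = m.group(1) or default_sheet, m.group(2)
    return workbook[sheet].get(cell)

# Field mapping in the style shown above (field name -> cell reference).
field_map = {"customer_name": "Customer!A2", "order_date": "B2", "amount": "Finance!C2"}
extracted = {field: resolve(ref) for field, ref in field_map.items()}
print(extracted)  # {'customer_name': 'Alice', 'order_date': '2024-01-15', 'amount': 120.5}
```

Unprefixed references fall back to a default sheet, matching how single-sheet files need no prefix.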

Creating a Table File Reader

1. Navigate to File Readers — In your application, go to the File Readers section.
2. Create New Reader — Click + File Reader and select Table Data from the document type options.
3. Configure Basic Settings:
   • Name: Enter a descriptive name (e.g., “Customer Data Import”)
   • Description: Optional description for your team
   • Processing Mode: Choose between bulk processing and single-row processing
4. Set Up Field Mapping:
   • For bulk processing: Map column names to field names
   • For single-row processing: Define cell references for each field
   • Configure field types and validation rules
5. Test with Sample File — Upload a sample Excel or CSV file to validate data extraction.

Using in Automations

The Table File Reader works with automation workflows. Typical flow: File Upload → Table File Reader → Transform Data (or Repeat For Each for bulk) → Update Records or Create Record → optional Generate Report. Common patterns:
  • Bulk import — Trigger on attachment; process rows with column mapping; Repeat For Each row → Transform Data → Search Records → Create or Update Record.
  • Report processing — Trigger on email with attachment; extract metrics with cell references; Run Calculation, Update Record, optional AI Classification or notification.
  • Large files — Use batch processing (e.g., 100 rows per batch), then Repeat For Each batch → Transform Data → Update Records.
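The large-file pattern above boils down to slicing extracted rows into fixed-size batches before downstream steps. A minimal sketch, with the batch size of 100 taken from the example above:

```python
def batches(rows, size=100):
    """Yield consecutive slices of at most `size` rows each."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

rows = list(range(250))  # stand-in for 250 extracted rows
sizes = [len(batch) for batch in batches(rows, size=100)]
print(sizes)  # [100, 100, 50]
```

Each batch would then feed a Repeat For Each → Transform Data → Update Records step, keeping any single run well under timeout limits.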

Best Practices

  • File structure: Use consistent column names and data formats; avoid merged cells in data areas; use standard date formats (e.g., YYYY-MM-DD).
  • Performance: Process large files in batches to avoid timeouts; use specific cell ranges instead of entire sheets; prefer CSV for very large datasets; minimize complex formulas in source files.
  • Validation: Add validation rules (required fields, data types, ranges) and plan for missing data and format errors.
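The validation practice above (required fields, data types, ranges) can be sketched as per-field rules applied to each extracted record. The rule format below is illustrative, not the product's configuration schema.

```python
# Hypothetical per-field rules: required flag, expected type, optional minimum.
rules = {
    "customer_name": {"required": True, "type": str},
    "amount": {"required": True, "type": float, "min": 0},
}

def validate(record):
    """Return a list of human-readable validation errors for one record."""
    errors = []
    for field, rule in rules.items():
        value = record.get(field)
        if value is None:
            if rule.get("required"):
                errors.append(f"{field}: missing required value")
            continue
        if not isinstance(value, rule["type"]):
            errors.append(f"{field}: expected {rule['type'].__name__}")
        elif rule.get("min") is not None and value < rule["min"]:
            errors.append(f"{field}: below minimum {rule['min']}")
    return errors

print(validate({"customer_name": "Alice", "amount": 120.5}))  # []
print(validate({"amount": -5.0}))
```

Records that fail validation can be routed to an error branch in the automation rather than aborting the whole import.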

Advanced Features

  • Multi-sheet workbooks: Configure which sheets to process and use cross-sheet references (e.g., Sheet1!A1, Sheet2!B2). Handle missing sheets or invalid references in your workflow.
  • Dynamic column detection: Use with AI Classification to detect column types and map fields when file structure varies.
  • Downstream processing: Use Transform Data, Run Calculation, Search Records, and IF conditions in your automation to clean, validate, and deduplicate extracted data.
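In the spirit of the dynamic column detection described above, column types can be inferred from sample values before mapping fields. The heuristics below (try date, then number, else text) are assumptions for illustration, not the product's detection algorithm.

```python
from datetime import datetime

def infer_type(values):
    """Guess a column type (Date, Number, or Text) from string samples."""
    def all_parse(parser):
        try:
            for v in values:
                parser(v)
            return True
        except ValueError:
            return False

    if all_parse(lambda v: datetime.strptime(v, "%Y-%m-%d")):
        return "Date"
    if all_parse(float):
        return "Number"
    return "Text"

print(infer_type(["2024-01-15", "2024-02-01"]))  # Date
print(infer_type(["120.5", "75"]))               # Number
print(infer_type(["Alice", "Bob"]))              # Text
```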

Error Handling and Troubleshooting

Common Issues

Symptoms: Fields return empty values or incorrect data
Causes:
  • Column names don’t match configuration
  • Header row in wrong location
  • Data types incompatible
Solutions:
  • Verify column names in source file
  • Check header row configuration
  • Adjust field types to match data
  • Use Transform Data to clean values
Symptoms: Cell references return errors or empty values
Causes:
  • Sheet names changed
  • Cell locations moved
  • Referenced cells are empty
Solutions:
  • Verify sheet names and structure
  • Update cell references
  • Add IF conditions to handle empty cells
  • Use named ranges for stability
Symptoms: Processing timeouts or memory errors
Causes:
  • File too large for single processing
  • Complex formulas slow processing
  • Memory limitations
Solutions:
  • Enable batch processing
  • Split large files into smaller chunks
  • Use CSV format for very large datasets
  • Process during off-peak hours
Validate that required fields are populated, data types match expected formats, and dates/numbers are in valid ranges before processing.

Integration Examples

  • Customer data import: Attachment Added (CSV) → Table File Reader (column mapping) → Transform Data → Search Records → Create or Update Record.
  • Report processing: Email Received (Excel attachment) → Table File Reader (cell references for metrics) → Run Calculation → Update Record → optional Generate Report.
  • Multi-sheet workbook: File Upload → Table File Reader (per-sheet configuration) → combine and process data → Create Record or analysis.

File Reader Comparison

Choose Table File Reader when:
  • Processing Excel or CSV files
  • Working with structured tabular data
  • Handling thousands of rows
  • Requiring multi-sheet processing

Next Steps

Automation System

Learn how to integrate Table File Readers with data processing workflows

Data Mining

Explore advanced data processing and analysis capabilities

Calculations

Perform calculations on extracted spreadsheet data

Analytics

Create analytics and reports from processed table data

The Table File Reader supports processing Excel and CSV files at scale for data imports, report processing, and workflows that need structured tabular data extraction.