
How to Use the .NET Core CSV Reader with IronXL: Practical Examples

Reading and writing CSV files is a common requirement in .NET Core applications, yet developers often run into challenges with differing delimiters, data type conversions, and performance when parsing CSV content efficiently. While libraries like CsvHelper and TextFieldParser exist for CSV parsing, not all of them provide Excel interoperability with strong exception handling. IronXL is a battle-tested CSV parser that handles both CSV and Excel formats, offering strong performance for large-scale batch processing scenarios. This tutorial demonstrates how to use IronXL as your .NET Core CSV reader through practical, easy-to-follow examples, including async operations for improved application responsiveness. Developers contributing to open-source CSV utilities will also find IronXL's clear API a useful reference, especially when managing datasets that contain duplicate entries or require validation during the data import process.

In the .NET ecosystem, several packages handle CSV file operations, including alternatives like EPPlus, NPOI, and OpenXML, but IronXL's versatility makes it a top choice for developers who want to go beyond simple CSV reading and enjoy Excel interoperability within a single CSV library -- particularly for ETL operations and report generation tasks in enterprise applications. Choosing the right parser depends on your specific requirements, so this guide walks you through real-world patterns to help you make an informed decision.

How to Use the .NET Core CSV Reader, IronXL with Practical Examples: Image 1 - IronXL

Why Choose IronXL as Your .NET Core CSV Reader?

When selecting a .NET Core CSV reader, IronXL offers several compelling advantages over traditional CSV parsing libraries. IronXL integrates with .NET Core's modern architecture while maintaining backward compatibility with .NET Framework projects. This solution eliminates common pain points developers face when working with CSV file operations, including:

  • Automatic encoding detection for international character sets
  • Intelligent delimiter recognition without manual configuration
  • Memory-efficient processing for files ranging from kilobytes to gigabytes
  • Built-in data type inference and conversion
  • Carriage return and line feed handling across platforms
  • Excel formula support even when working with CSV data
  • Cross-platform reliability on Windows, Linux, and macOS

Unlike basic CSV readers that require extensive configuration and manual parsing logic, IronXL handles edge cases automatically -- such as quoted fields containing delimiters, multi-line cell values, and special characters. The library's architecture ensures optimal performance through lazy loading and streaming capabilities, making it suitable for both small configuration files and large-scale data processing tasks. IronXL can skip header rows when needed and split complex data structures efficiently. Learn more in the IronXL features overview.
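To see what "handles edge cases automatically" means in practice, here is a minimal RFC 4180-style field splitter in plain C#. It is a simplified sketch for illustration only (not IronXL's implementation), showing the quoting rules that a naive string.Split(',') misses:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

public static class NaiveCsv
{
    // Minimal RFC 4180-style field splitter: handles quoted fields that
    // contain the delimiter and doubled quotes (""). A simplified sketch
    // of the edge cases IronXL resolves for you automatically.
    public static List<string> SplitLine(string line, char delimiter = ',')
    {
        var fields = new List<string>();
        var current = new StringBuilder();
        bool inQuotes = false;
        for (int i = 0; i < line.Length; i++)
        {
            char c = line[i];
            if (inQuotes)
            {
                if (c == '"' && i + 1 < line.Length && line[i + 1] == '"')
                { current.Append('"'); i++; }            // escaped quote ""
                else if (c == '"') inQuotes = false;      // closing quote
                else current.Append(c);
            }
            else if (c == '"') inQuotes = true;           // opening quote
            else if (c == delimiter)
            { fields.Add(current.ToString()); current.Clear(); }
            else current.Append(c);
        }
        fields.Add(current.ToString());
        return fields;
    }
}

class Demo
{
    static void Main()
    {
        // string.Split(',') would wrongly produce three fields here
        var fields = NaiveCsv.SplitLine("\"Smith, John\",42");
        Console.WriteLine(string.Join(" | ", fields)); // → Smith, John | 42
    }
}
```

Multi-line cell values add yet another wrinkle (a quoted field may span physical lines), which is why delegating this to a tested parser is usually the right call.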

For developers transitioning from legacy systems, IronXL provides a familiar API that reduces the learning curve while offering modern async/await patterns for responsive applications. This makes it an ideal choice for teams modernizing their data processing infrastructure. See the full IronXL documentation for API references and configuration options.

How Do You Install IronXL for CSV File Reading?

Installing IronXL in your .NET Core project takes just seconds, whether you are building a console application, ASP.NET Core web app, or Windows Forms application. To begin reading CSV data, open the Package Manager Console in Visual Studio and run:

Install-Package IronXL.Excel

Or use the .NET CLI:

dotnet add package IronXL.Excel

How to Use the .NET Core CSV Reader, IronXL with Practical Examples: Image 2 - Installation

Alternatively, use the NuGet Package Manager UI by searching for "IronXL.Excel" and clicking Install. The library integrates with existing .NET Framework projects during migration to .NET Core, and you can reference it directly from the IronXL NuGet page.

Once installed, add the namespace to your code:

using IronXL;

This setup gives you access to powerful CSV reading capabilities without requiring Microsoft Office or Interop dependencies, making it ideal for cloud deployment and Docker containers. For detailed installation instructions and configuration settings, check the IronXL installation guide documentation.

How Do You Read CSV Files Using IronXL's LoadCSV Method?

IronXL makes CSV file processing straightforward with its LoadCSV method, which handles CSV headers, columns, and rows efficiently, as shown in the example below:

// Load CSV file into a WorkBook object for .NET Core CSV reading
var workbook = WorkBook.LoadCSV("Budget.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ",");
// Access the default worksheet containing parsed CSV data
WorkSheet worksheet = workbook.DefaultWorkSheet;
// Read specific cell values with type-safe methods
string cellValue = worksheet["A1"].StringValue;
// Iterate through a range for bulk CSV data processing
foreach (var cell in worksheet["A1:C10"])
{
    Console.WriteLine($"Cell {cell.AddressString}: {cell.Text}");
}

The LoadCSV method creates a WorkBook object that represents your CSV data structure in memory using optimized memory stream handling. The fileFormat parameter specifies the internal format for processing, while listDelimiter defines the CSV separator character used in your CSV file -- supporting tab-delimited files and pipe-delimited formats.

Input and Output

How to Use the .NET Core CSV Reader, IronXL with Practical Examples: Image 5 - Sample CSV Input

When dealing with CSV files exported from systems that include a sep= line as the first row, IronXL reads this metadata to determine the correct delimiter automatically. This saves time when processing regional CSV formats that use semicolons, tabs, or pipes instead of commas, and ensures proper encoding handling across character sets.
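The sep= convention itself is easy to illustrate in plain C#. The sketch below is a hypothetical helper (not IronXL code) showing what the preamble means; IronXL performs this detection for you:

```csharp
using System;

public static class SepPreamble
{
    // The "sep=" convention: some exporters (notably Excel) write a first
    // line such as "sep=;" naming the delimiter. That line is metadata,
    // not data, and must be skipped before parsing rows.
    public static (char Delimiter, string[] DataLines) Detect(string[] lines)
    {
        if (lines.Length > 0 && lines[0].Length == 5 &&
            lines[0].StartsWith("sep=", StringComparison.OrdinalIgnoreCase))
        {
            return (lines[0][4], lines[1..]);   // preamble names the delimiter
        }
        return (',', lines);                    // no preamble: default comma
    }
}

class Demo
{
    static void Main()
    {
        var (delimiter, data) = SepPreamble.Detect(
            new[] { "sep=;", "Name;Age", "Ada;36" });
        Console.WriteLine($"Delimiter '{delimiter}', {data.Length} data lines");
        // → Delimiter ';', 2 data lines
    }
}
```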

The default WorkSheet property provides immediate access to your parsed CSV data as a worksheet, enabling cell-by-cell or range-based data extraction. You can retrieve values using properties like StringValue, IntValue, or DecimalValue for type-safe operations with built-in type conversion. For more complex data manipulation and transformation, explore IronXL's cell formatting options and range selection features.
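One reason typed accessors matter is number formatting across locales. The plain-.NET sketch below is illustrative only (IronXL's DecimalValue performs the conversion for you); it shows the underlying concern:

```csharp
using System;
using System.Globalization;

public static class CsvNumbers
{
    // CSV cells arrive as text; parsing with the invariant culture keeps
    // "1234.56" meaning the same value on every machine, regardless of the
    // host's regional settings.
    public static decimal ParseInvariant(string raw) =>
        decimal.Parse(raw, CultureInfo.InvariantCulture);
}

class Demo
{
    static void Main()
    {
        Console.WriteLine(CsvNumbers.ParseInvariant("1234.56") == 1234.56m); // → True
        // On a machine whose culture uses ',' as the decimal separator, a
        // culture-sensitive decimal.Parse("1234.56") could instead treat '.'
        // as a group separator and return a very different value.
    }
}
```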

How Do You Map CSV Data to C# Classes with Data Validation?

You can map CSV data directly to C# objects with field mapping and data validation. For instance, imagine a CSV file with columns for Name, Age, and City. Define a model with property mapping like this:

public record Customer(string Name, int Age, string City)
{
    public bool IsValid() => !string.IsNullOrEmpty(Name) && Age > 0;
}

// Parse CSV rows into typed objects
var customers = new List<Customer>();
for (int row = 2; row <= worksheet.RowCount; row++)
{
    var customer = new Customer(
        Name: worksheet[$"A{row}"].StringValue,
        Age:  worksheet[$"B{row}"].IntValue,
        City: worksheet[$"C{row}"].StringValue
    );
    if (customer.IsValid())
        customers.Add(customer);
}
// Output the records
foreach (var record in customers)
{
    Console.WriteLine($"Customer: {record.Name}, Age: {record.Age}, City: {record.City}");
}

Using IronXL, each row in the worksheet maps to a typed object, ready for data processing, serialization to JSON, or export back to another format with proper exception handling. This approach lets you create strongly-typed records from CSV data with minimal boilerplate. For a deeper walkthrough, see the guide on reading CSV files in C#.
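Serializing the mapped records to JSON needs nothing beyond System.Text.Json, which ships with .NET. A short sketch (the Customer record mirrors the one defined above):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

public record Customer(string Name, int Age, string City);

public static class CustomerExport
{
    // Serialize mapped records to JSON for an API response or file export;
    // no extra package is needed once rows are in typed objects.
    public static string ToJson(IEnumerable<Customer> customers) =>
        JsonSerializer.Serialize(customers);
}

class Demo
{
    static void Main()
    {
        var customers = new List<Customer> { new("Ada", 36, "London") };
        Console.WriteLine(CustomerExport.ToJson(customers));
        // → [{"Name":"Ada","Age":36,"City":"London"}]
    }
}
```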

How Do You Handle Different Delimiters and Convert to a DataTable?

Real-world CSV files often use delimiters beyond commas, requiring flexible handling. IronXL manages this elegantly: pass the separator through the listDelimiter parameter:

// Load CSV with semicolon delimiter
WorkBook workbook = WorkBook.LoadCSV("products.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ";");
WorkSheet worksheet = workbook.DefaultWorkSheet;
// Convert to DataTable for database operations
DataTable dataTable = worksheet.ToDataTable(true);
// Process the DataTable
foreach (DataRow row in dataTable.Rows)
{
    Console.WriteLine($"Product: {row["ProductName"]}, Price: {row["Price"]}");
}

The ToDataTable method converts worksheet data into a .NET DataTable, with the boolean parameter indicating whether to use the first row as column headers. This conversion is particularly useful for database operations, data binding in ASP.NET Core applications, or when you need to apply existing DataTable processing logic for SQL Server integration. The resulting DataTable maintains data types and schema information, and can be directly used with SqlBulkCopy for efficient bulk insert operations.
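A small plain-.NET sketch of that workflow, using a hand-built DataTable shaped like the one ToDataTable(true) returns (the SqlBulkCopy step is left as a comment because it needs a live SQL Server connection):

```csharp
using System;
using System.Data;

public static class ProductTable
{
    // Build a DataTable shaped like worksheet.ToDataTable(true) would return:
    // the first CSV row becomes column names, and values keep their types.
    public static DataTable Build()
    {
        var table = new DataTable("Products");
        table.Columns.Add("ProductName", typeof(string));
        table.Columns.Add("Price", typeof(decimal));
        table.Rows.Add("Widget", 9.99m);
        table.Rows.Add("Gadget", 24.50m);
        return table;
    }

    // Typed access works because schema information is preserved
    public static decimal TotalPrice(DataTable table)
    {
        decimal total = 0;
        foreach (DataRow row in table.Rows)
            total += (decimal)row["Price"];
        return total;
    }
}

class Demo
{
    static void Main()
    {
        DataTable table = ProductTable.Build();
        Console.WriteLine(ProductTable.TotalPrice(table));

        // The same table feeds SqlBulkCopy directly (requires
        // Microsoft.Data.SqlClient and an open SqlConnection):
        // using var bulk = new SqlBulkCopy(connection) { DestinationTableName = "Products" };
        // bulk.WriteToServer(table);
    }
}
```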

Learn more about importing CSV to DataTable and database integration in the detailed guides.

How Do You Export a DataTable Back to CSV?

After processing data in a DataTable, you often need to write results back to a CSV file. IronXL handles this direction too:

// Load a DataTable from your data source
DataTable exportTable = GetProcessedData(); // your data source method

// Create a new workbook and populate it from the DataTable
WorkBook outputWorkbook = WorkBook.Create(ExcelFileFormat.XLSX);
WorkSheet outputSheet = outputWorkbook.DefaultWorkSheet;

// Write headers from DataTable columns
for (int col = 0; col < exportTable.Columns.Count; col++)
{
    outputSheet[0, col].Value = exportTable.Columns[col].ColumnName;
}

// Write rows
for (int row = 0; row < exportTable.Rows.Count; row++)
{
    for (int col = 0; col < exportTable.Columns.Count; col++)
    {
        outputSheet[row + 1, col].Value = exportTable.Rows[row][col]?.ToString();
    }
}

// Save as CSV
outputWorkbook.SaveAsCsv("output.csv", delimiter: ",");
Console.WriteLine("Export complete.");

This pattern works well for ETL pipelines where data is loaded, transformed, then written to a new file. For additional export formats, see the C# export to CSV tutorial and the DataTable to Excel guide.

How Do You Convert Between CSV and Excel Formats?

One of IronXL's standout features is CSV to Excel conversion and Excel to CSV transformation, essential for data migration projects. The following example demonstrates this capability:

// Load CSV and save as Excel
WorkBook csvWorkbook = WorkBook.LoadCSV("report.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ",");
// Save as Excel file
csvWorkbook.SaveAs("report.xlsx");
// Or load Excel and export to CSV
WorkBook excelWorkbook = WorkBook.Load("data.xlsx");
excelWorkbook.SaveAsCsv("exported_data.csv", delimiter: ",");

This bidirectional conversion preserves data integrity while allowing format flexibility for various file conversion scenarios. The SaveAs method automatically detects the desired format from the file extension, supporting XLSX, XLS, and other Excel formats with worksheet management. When saving to CSV using SaveAsCsv, you can specify custom delimiters and text encoding to match your requirements.

This feature is invaluable when integrating with systems that require specific file formats for data exchange. For developers migrating from other libraries or evaluating manual parsing alternatives, see how IronXL compares to popular alternatives discussed on Stack Overflow and performance considerations in the .NET community. For more file format conversion patterns, visit the convert Excel spreadsheet guide.

How Do You Read Large CSV Files Without Memory Issues?

Processing large CSV files -- those with millions of rows -- requires a careful approach to memory. IronXL uses lazy loading internally, meaning worksheet rows are read on demand rather than all at once. To keep memory usage low when iterating large datasets, process rows in batches:

WorkBook workbook = WorkBook.LoadCSV("large-dataset.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ",");
WorkSheet worksheet = workbook.DefaultWorkSheet;

int batchSize = 1000;
int totalRows = worksheet.RowCount;

for (int start = 1; start <= totalRows; start += batchSize)
{
    int end = Math.Min(start + batchSize - 1, totalRows);
    for (int row = start; row <= end; row++)
    {
        string id   = worksheet[$"A{row}"].StringValue;
        string name = worksheet[$"B{row}"].StringValue;
        // Process each record here
        Console.WriteLine($"Row {row}: {id} - {name}");
    }
    Console.WriteLine($"Processed batch {start}-{end}");
}

This technique keeps heap allocations predictable and avoids out-of-memory errors on large datasets. The same pattern applies when exporting -- write rows incrementally and save once at the end. For further tips, see the file size limits guide.
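On .NET 6 and later, the same batching idea can be expressed with LINQ's Chunk operator once rows are in an enumerable. A generic sketch (an illustrative helper, not part of IronXL):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Batching
{
    // Split any row sequence into fixed-size batches using LINQ's Chunk
    // (.NET 6+). Each batch is handled and released before the next one
    // is materialized, keeping allocations predictable.
    public static int ProcessInBatches<T>(
        IEnumerable<T> rows, int batchSize, Action<T[]> handle)
    {
        int batches = 0;
        foreach (T[] batch in rows.Chunk(batchSize))
        {
            handle(batch);
            batches++;
        }
        return batches;
    }
}

class Demo
{
    static void Main()
    {
        // Simulated row IDs; in practice these would be parsed CSV records
        var rows = Enumerable.Range(1, 2500);
        int count = Batching.ProcessInBatches(rows, 1000,
            batch => Console.WriteLine($"Batch of {batch.Length} rows"));
        Console.WriteLine($"{count} batches"); // → 3 batches
    }
}
```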

What Advanced Features Does IronXL Offer for Enterprise CSV Processing?

IronXL provides enterprise-grade features that set it apart from basic CSV parsers, including unit testing support and debugging tools. The library offers cross-platform compatibility, running on Windows, Linux, macOS, and in Docker containers -- essential for modern .NET Core deployments and microservices architecture. According to Microsoft's documentation, cross-platform support is crucial for cloud-native applications and Azure deployment.

Beyond technical capabilities, IronXL includes professional support and regular updates with all licenses, ensuring compatibility with the latest .NET versions and security patches. This commercial backing ensures reliability for mission-critical applications where open-source libraries might fall short in production environments. The library handles large datasets efficiently through optimized memory management and supports advanced scenarios like:

IronXL Advanced Feature Summary

  • Formula calculations: Evaluate Excel formulas on CSV data after import. Use case: financial reports, aggregations.
  • Cell formatting preservation: Retain number formats and date styles during conversion. Use case: accounting exports, date-sensitive data.
  • Multi-sheet workbook operations: Merge multiple CSV files into one workbook with named sheets. Use case: monthly report consolidation.
  • Data aggregation: SUM, AVERAGE, and COUNT across ranges. Use case: dashboard generation, KPI calculation.
  • Range sorting: Sort rows by one or more columns. Use case: ranked output, alphabetical exports.

For production deployments that require scalability and load balancing, IronXL's licensing model offers flexibility with options for single projects, teams, and enterprise-wide usage -- all of which include source code access and royalty-free redistribution rights. Purchase a license to unlock full functionality without watermarks.

How to Use the .NET Core CSV Reader, IronXL with Practical Examples: Image 10 - Licensing

How Do You Apply a License Key?

After purchasing, apply your license key before calling any IronXL method:

// Apply license key at application startup
IronXL.License.LicenseKey = "YOUR-LICENSE-KEY-HERE";

// Then proceed with CSV reading as normal
WorkBook workbook = WorkBook.LoadCSV("data.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ",");
WorkSheet worksheet = workbook.DefaultWorkSheet;
Console.WriteLine($"Loaded {worksheet.RowCount} rows from CSV.");

Place the license key assignment at the earliest entry point of your application -- typically Program.cs in top-level statement projects -- to ensure all subsequent calls are fully licensed. For step-by-step instructions, see the IronXL license key guide.

How Do You Get Started with a Free Trial?

IronXL simplifies CSV reading and writing operations in .NET Core applications while providing the flexibility to handle complex scenarios, including data analysis, reporting, and automation tasks. Its intuitive API, combined with Excel format support and enterprise features, makes it a reliable choice for developers who need proven CSV processing capabilities with thread safety and concurrent access support. The library's ability to convert between formats, handle various delimiters, perform data cleansing, and integrate with existing .NET data structures reduces development time significantly while keeping code maintainable.

Ready to start processing CSV files? Download a free trial of IronXL today and experience how the library transforms your data handling workflows with professional-grade CSV parsing capabilities. For production use, explore licensing options that include professional support, documentation, and ongoing updates for your .NET projects.


Frequently Asked Questions

What makes IronXL an ideal choice for CSV processing in .NET Core?

IronXL offers seamless handling of both CSV and Excel formats, with robust exception handling and superior performance optimization, especially for large-scale batch processing scenarios.

How does IronXL improve performance when reading CSV files?

IronXL is optimized for large-scale batch processing, ensuring efficient CSV file operations by handling different delimiters and data type conversions smoothly.

Can IronXL handle different delimiters in CSV files?

Yes, IronXL can efficiently process CSV files with various delimiters, making it versatile for different data import and export scenarios.

Does IronXL support asynchronous operations for CSV processing?

IronXL supports async operations, which enhances application responsiveness during CSV parsing by allowing non-blocking operations.

How does IronXL assist developers contributing to open-source CSV utilities?

IronXL provides a clear API that serves as a valuable reference for developers, especially when dealing with datasets that contain repeated values, duplicate entries, or require data validation.

What are the benefits of using IronXL over other CSV parsing libraries?

IronXL stands out with its Excel interoperability, robust exception handling, and enhanced performance, making it a comprehensive solution compared to libraries like CsvHelper and TextFieldParser.

Can IronXL be used for data validation during CSV import?

Yes, IronXL's capabilities include data validation, which is particularly useful when managing datasets with duplicate entries or repeated values during the data import process.

Is IronXL compatible with .NET Core applications?

Absolutely, IronXL is designed to integrate seamlessly with .NET Core applications, providing a reliable solution for CSV reading and processing.

Jordi Bardia
Software Engineer
Jordi is most proficient in Python, C#, and C++; when he isn't leveraging his skills at Iron Software, he's game programming. Sharing responsibilities for product testing, product development, and research, Jordi adds immense value to continual product improvement. The varied experience keeps him challenged and engaged, and he ...