
How to Read a CSV File into a DataTable in C#

Converting CSV files to DataTable in C# is straightforward with IronXL: use WorkBook.LoadCSV() to read any CSV file, then call ToDataTable(true) to create a structured DataTable with proper column headers, ready for database import or data manipulation.

Working with CSV files is a common task for .NET developers, whether you're importing sales reports, processing inventory, or syncing customer records. Converting a CSV file into a DataTable makes it easy to manipulate, analyze, or insert into a database table. The IronXL library provides a full-featured solution for handling Excel and CSV operations in C# applications -- no Excel installation required.

Reading CSV files in C# can be tricky. Large files, different delimiters, or embedded commas often require complex parsing logic. IronXL simplifies this: with just a few lines of code, you can read CSV files from any file path, convert them into a DataTable with proper column headers, and prepare them for bulk database operations. This approach is particularly useful when working with Excel data in C#.

In this guide, you will learn how to:

  • Load a CSV file into a DataTable in C# using IronXL's CSV reading capabilities
  • Handle different delimiters like commas, tabs, or semicolons
  • Import your DataTable into SQL Server efficiently using SqlBulkCopy
  • Manage large datasets reliably without running into memory issues

By the end, you will have a complete, practical workflow for turning CSV data into actionable information using IronXL in your .NET 10 applications.

Why Does CSV to DataTable Conversion Matter?

CSV to DataTable conversion lets developers transform comma-separated values into structured objects for further processing. Whether you're handling inventory data, customer records, or transaction logs, converting CSV to a DataTable efficiently is crucial. Using the first row as column headers, you can ensure that DataTable columns align with your database table schema, making it perfect for Excel to SQL operations.

Traditional approaches often struggle with large files, different delimiters, or memory management. IronXL eliminates these pain points: it handles varying delimiters, quoted fields, and embedded commas without extra parsing code, and adds conveniences such as data validation and control over file size.

When Should You Use DataTable Over Other Data Structures?

DataTables work well when you need database-like operations in memory. They're ideal for scenarios involving SQL Server imports, data binding to UI controls, or when you need to perform LINQ queries on structured data. Unlike simple arrays or lists, DataTables provide schema validation, relationships between tables, and direct integration with ADO.NET. For more complex scenarios, you can also convert between DataSet and DataTable as needed.
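As a quick illustration of those database-like operations, the following self-contained sketch groups and sums rows with LINQ via AsEnumerable(). The table and column names are invented for the example; in practice the DataTable would come from a parsed CSV:

```csharp
using System;
using System.Data;
using System.Linq;

class Program
{
    static void Main()
    {
        // Build a small DataTable as a stand-in for parsed CSV data
        var table = new DataTable("Orders");
        table.Columns.Add("Product", typeof(string));
        table.Columns.Add("Quantity", typeof(int));
        table.Rows.Add("Widget", 5);
        table.Rows.Add("Gadget", 12);
        table.Rows.Add("Widget", 3);

        // LINQ over rows: total quantity per product
        var totals = table.AsEnumerable()
            .GroupBy(r => r.Field<string>("Product"))
            .Select(g => new { Product = g.Key, Total = g.Sum(r => r.Field<int>("Quantity")) });

        foreach (var t in totals)
            Console.WriteLine($"{t.Product}: {t.Total}");
    }
}
```

The Field&lt;T&gt; and AsEnumerable extension methods come from System.Data.DataSetExtensions, which ships with modern .NET.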

What Are Common Issues With Traditional CSV Parsing?

Manual CSV parsing often fails on edge cases: embedded commas within quoted fields, line breaks in cell values, or inconsistent delimiter usage. Memory consumption becomes problematic when large files are loaded into memory all at once. Character encoding issues can corrupt international data, and naive type inference can misread numeric-looking strings (stripping leading zeros from identifiers, for example). These challenges are why a structured parsing library like IronXL has become essential for production applications.

How Do You Avoid Memory Problems With Large CSV Files?

IronXL implements buffered reading techniques to handle large CSV files efficiently. Instead of loading entire files into memory at once, it processes data in chunks, maintaining a small memory footprint even with large files. This makes it suitable for server environments with limited resources -- including cloud deployments where memory allocation is constrained.

How Do You Install IronXL?

Getting started with IronXL requires a simple NuGet package installation. Open the NuGet Package Manager Console in Visual Studio and execute one of the following commands:

Install-Package IronXL

Or if you prefer the .NET CLI:

dotnet add package IronXL

Once installed, add the IronXL namespace to your project:

using IronXL;
using System.Data;

This provides access to all CSV processing capabilities without any Excel dependencies. For more details, refer to the IronXL documentation and NuGet installation guide.

What Are the System Requirements for IronXL?

IronXL supports .NET Framework 4.6.2+ and .NET Core/5/6/7/8/9/10, making it compatible with modern and legacy applications. It runs on Windows, Linux, and macOS platforms. For environments with special requirements, consult the IronXL features page to verify platform support. A trial license is available to evaluate the full feature set before purchasing.

How Do You Verify the Installation Was Successful?

Create a test program that loads a CSV file to verify installation. If you encounter licensing messages, you will need to obtain a license key. Check the package references in your project file to ensure IronXL appears with the correct version. Run a basic CSV read operation -- if it completes without errors, your installation is working correctly.


How Do You Convert a CSV File to DataTable?

The core workflow in IronXL involves just a few lines of code. Here is how to read a CSV file and convert it to a DataTable using top-level statements in C# 10+:

using IronXL;
using System.Data;

// Load CSV file into a WorkBook object
WorkBook workbook = WorkBook.LoadCSV("sales_data.csv",
    fileFormat: ExcelFileFormat.XLSX);

// Access the default worksheet
WorkSheet worksheet = workbook.DefaultWorkSheet;

// Convert to DataTable with headers
DataTable dataTable = worksheet.ToDataTable(true);

// Display the data
foreach (DataRow row in dataTable.Rows)
{
    foreach (var item in row.ItemArray)
    {
        Console.Write($"{item}\t");
    }
    Console.WriteLine();
}

The LoadCSV method parses your CSV file automatically, handling quoted fields and embedded commas. The fileFormat parameter tells IronXL how to interpret the CSV data internally, ensuring it's treated as an Excel-compatible structure. Once loaded, the ToDataTable(true) method converts the worksheet into a DataTable, with the parameter true indicating that the first row should be used as column headers. This results in a clean, structured DataTable with named columns that match your CSV headers. For more advanced manipulation options, explore the complete IronXL how-to guide.

The conversion process also preserves data types -- numbers remain numeric, dates are parsed correctly, and text fields maintain their formatting. This automatic type inference saves considerable development time compared to manual parsing approaches. For scenarios requiring custom data formatting, you can apply cell formatting before conversion.

What Does the ToDataTable Parameter Control?

The boolean parameter in ToDataTable() determines header row handling. When set to true, the first row becomes column names in your DataTable, creating meaningful field references like CustomerName instead of Column1. When false, generic column names are assigned, which is useful for headerless CSV files. This flexibility supports various CSV formats encountered in real-world applications. Advanced scenarios might require opening workbooks with custom options to handle multiple data ranges.

How Do You Handle CSV Files Without Headers?

For headerless CSV files, use ToDataTable(false) and manually assign column names afterward. You can iterate through the Columns collection and set meaningful names based on your data schema. Alternatively, prepend a header row to your CSV programmatically before loading. IronXL also supports creating Excel files from scratch if you need to add headers to existing data.
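A sketch of the first approach, using the IronXL API shown earlier. The file name and column names here are placeholders; substitute your own schema:

```csharp
using System.Data;
using IronXL;

// Load a CSV that has no header row (file name is illustrative)
WorkBook workbook = WorkBook.LoadCSV("no_headers.csv");

// false: do not treat the first row as headers
DataTable table = workbook.DefaultWorkSheet.ToDataTable(false);

// Replace the auto-generated column names with meaningful ones;
// these names are examples, not a required schema
string[] names = { "ProductCode", "Quantity", "LastUpdated" };
for (int i = 0; i < table.Columns.Count && i < names.Length; i++)
{
    table.Columns[i].ColumnName = names[i];
}
```

After renaming, downstream code can reference columns by name instead of by index.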

What Are the Performance Implications of Large CSV Files?

IronXL processes large CSV files efficiently using buffered techniques. Files under 100MB typically load in seconds. Memory usage remains controlled regardless of file size due to intelligent buffering. For optimal performance with massive datasets, consider processing in batches using range selection techniques. The IronXL features page documents performance characteristics in detail.

How Do You Import a DataTable into SQL Server?

Once you have your DataTable, importing it into SQL Server becomes efficient using SqlBulkCopy. The following example uses C# top-level statements:

using System.Data;
using Microsoft.Data.SqlClient;
using IronXL;

// Create connection string
string connectionString = "Data Source=localhost;Initial Catalog=SalesDB;" +
    "Integrated Security=True;TrustServerCertificate=True;";

// Read CSV into DataTable
WorkBook workbook = WorkBook.LoadCSV("inventory_report.csv");
DataTable dataTable = workbook.DefaultWorkSheet.ToDataTable(true);

// Bulk insert into SQL Server
using SqlConnection connection = new SqlConnection(connectionString);
connection.Open();

using SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);

// Set destination table name
bulkCopy.DestinationTableName = "dbo.Inventory";

// Map DataTable columns to SQL table columns
bulkCopy.ColumnMappings.Add("ProductCode", "product_code");
bulkCopy.ColumnMappings.Add("Quantity", "quantity");
bulkCopy.ColumnMappings.Add("LastUpdated", "last_updated");

// Set batch size for better performance
bulkCopy.BatchSize = 1000;

// Write data to SQL Server
bulkCopy.WriteToServer(dataTable);

Console.WriteLine($"Successfully imported {dataTable.Rows.Count} records");

The SqlBulkCopy class provides strong performance for large-scale data imports. The ColumnMappings collection maps DataTable columns to database columns with different names, ensuring flexibility in your data schema. The BatchSize property optimizes memory usage by processing records in chunks rather than loading everything at once. Learn more about IronXL's import and export capabilities.

For additional data integrity, consider implementing transaction support around your bulk copy operations. This ensures that partial imports can be rolled back if errors occur. You can also use IronXL's write Excel file features to pre-validate data before import.
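One way to add that integrity is the SqlBulkCopy constructor overload that accepts an external transaction. The sketch below reuses the dbo.Inventory table from the example above; the helper method name is made up for illustration:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;

static class InventoryImporter
{
    // Wrap the bulk copy in an explicit transaction so a failed
    // import rolls back cleanly instead of leaving partial rows.
    public static void ImportWithTransaction(string connectionString, DataTable dataTable)
    {
        using SqlConnection connection = new SqlConnection(connectionString);
        connection.Open();

        using SqlTransaction transaction = connection.BeginTransaction();
        try
        {
            using SqlBulkCopy bulkCopy = new SqlBulkCopy(
                connection, SqlBulkCopyOptions.Default, transaction);
            bulkCopy.DestinationTableName = "dbo.Inventory";
            bulkCopy.BatchSize = 1000;
            bulkCopy.WriteToServer(dataTable);

            // Commit only after every batch has been written
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
    }
}
```

Note that when you pass an external transaction, every batch participates in it, so a failure in the last batch rolls back all earlier batches too.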

Here, you can run the query SELECT * FROM dbo.Inventory; in SSMS to check that your CSV data was imported into SQL Server. First, take a look at the data used in the example CSV file:

[Figure: Notepad window showing a sample CSV inventory report with product codes, quantities, and dates in comma-separated format]

Now, if you look at the output for the query, you can see it successfully copied every column, header, and row into the database table.

[Figure: SQL query results showing 3 inventory records imported from CSV with product codes A100, B200, and C300, their quantities, and timestamps in a database management tool]

Why Is BatchSize Important for Performance?

BatchSize controls how many rows are sent to SQL Server in each network round trip. Smaller batches (100 to 1,000 rows) reduce memory usage and allow for progress monitoring but increase network overhead. Larger batches (5,000 to 10,000 rows) maximize throughput on fast networks but consume more memory. The optimal size depends on your row width, network latency, and available memory. For complex scenarios involving multiple data ranges, batch processing becomes even more critical.

How Do You Handle Column Mapping Mismatches?

Column mapping mismatches cause runtime exceptions during bulk copy operations. Always verify that source DataTable column names match your mapping definitions exactly, including case sensitivity. Use the GetOrdinal method to validate column existence before mapping. For dynamic schemas, query the destination table's structure and build mappings programmatically. IronXL's Excel to DataSet guide can help standardize column names before import.
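A small validation helper can catch missing columns up front rather than partway through a bulk copy. This sketch uses only standard ADO.NET; the class and method names are made up for illustration:

```csharp
using System;
using System.Data;

static class ImportValidation
{
    // Throws with a clear message if any expected column is missing,
    // instead of failing mid-bulk-copy with a mapping exception.
    // Note: DataColumnCollection.Contains is case-insensitive by default,
    // so exact-case checks need a stricter comparison.
    public static void ValidateColumns(DataTable table, params string[] required)
    {
        foreach (string name in required)
        {
            if (!table.Columns.Contains(name))
                throw new ArgumentException(
                    $"CSV is missing expected column '{name}'");
        }
    }
}
```

Call it once after ToDataTable(true) and before building your ColumnMappings.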

What Security Considerations Apply to Bulk Imports?

Bulk imports require elevated database permissions, typically the db_datawriter or bulkadmin roles. Use integrated authentication where possible to avoid embedding credentials in connection strings. Adding audit columns during import helps track data provenance. Always validate data types and ranges to guard against problems introduced by malformed CSV data. Review the IronXL licensing page if you need to understand enterprise deployment requirements.

How Do You Handle Different CSV Delimiters?

Not all CSV files use commas. IronXL handles various delimiters, making it practical for international data formats or legacy systems. The following example demonstrates loading tab-delimited and semicolon-delimited files:

using IronXL;
using System;
using System.Data;
using System.IO;

// --- Tab-delimited file ---
string tsvPath = "export_data.tsv";
WorkBook tsvWorkbook = WorkBook.LoadCSV(tsvPath, ExcelFileFormat.XLSX, "\t");

// --- Semicolon-delimited file ---
string semiPath = "european_data.csv";
string tempCsv = Path.Combine(Path.GetTempPath(), "european_data_comma.csv");

// Replace semicolons with commas for proper parsing
string[] lines = File.ReadAllLines(semiPath);
for (int i = 0; i < lines.Length; i++)
{
    lines[i] = lines[i].Replace(';', ',');
}
File.WriteAllLines(tempCsv, lines);

WorkBook semiWorkbook = WorkBook.LoadCSV(tempCsv, ExcelFileFormat.XLSX);

// Print tab-delimited results
DataTable tsvTable = tsvWorkbook.DefaultWorkSheet.ToDataTable(true);
Console.WriteLine("--- Tab-delimited File ---");
foreach (DataColumn col in tsvTable.Columns)
    Console.Write($"{col.ColumnName}\t");
Console.WriteLine();
foreach (DataRow row in tsvTable.Rows)
{
    foreach (var item in row.ItemArray)
        Console.Write($"{item}\t");
    Console.WriteLine();
}

// Print semicolon-delimited results
DataTable semiTable = semiWorkbook.DefaultWorkSheet.ToDataTable(true);
Console.WriteLine("\n--- Semicolon-delimited File ---");
foreach (DataColumn col in semiTable.Columns)
    Console.Write($"{col.ColumnName}\t");
Console.WriteLine();
foreach (DataRow row in semiTable.Rows)
{
    foreach (var item in row.ItemArray)
        Console.Write($"{item}\t");
    Console.WriteLine();
}

This code demonstrates loading CSV files with different delimiters into IronXL WorkBook objects. Tab-delimited files are read using "\t" as the separator, while semicolon-delimited files are converted to standard CSV format before loading. Each workbook's default worksheet is converted into a DataTable with ToDataTable(true), which uses the first row as column headers. For more complex file format scenarios, you can also explore exporting Excel data to various formats.

The flexibility to handle different delimiters addresses many real-world CSV parsing challenges. When working with European data formats, IronXL's delimiter handling accommodates regional differences in data formatting standards. For files with inconsistent structures, pre-process the raw text before passing it to LoadCSV.

The RFC 4180 standard defines the CSV format specification that IronXL follows for quoted fields and embedded commas. This flexibility is also discussed in detail in the Stack Overflow community thread on reading CSV into DataTable in .NET.

Which Delimiters Does IronXL Support Natively?

IronXL supports common delimiters including commas, tabs, pipes (|), and custom single-character delimiters through the LoadCSV overload. For best performance, use the built-in delimiter parameter rather than pre-processing files. Multi-character delimiters require preprocessing as shown in the semicolon example above. You can also merge cells in the resulting workbook if you need to reorganize the parsed data.

How Do You Detect the Delimiter Automatically?

Automatic delimiter detection involves analyzing the first few lines of your CSV file. Count occurrences of common delimiters (comma, tab, semicolon, pipe) in each line. The character appearing consistently with the highest frequency is likely the delimiter. IronXL doesn't provide automatic detection, but you can implement this logic before calling LoadCSV. For files with inconsistent delimiters, validate the file structure first, then normalize it.
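The counting heuristic described above can be sketched in a few lines of plain C#. This is not part of the IronXL API; the class and method names are invented, and the result should be passed as the delimiter argument to LoadCSV:

```csharp
using System.Collections.Generic;
using System.Linq;

static class DelimiterSniffer
{
    // Candidate delimiters to test
    static readonly char[] Candidates = { ',', '\t', ';', '|' };

    // Pick the candidate that occurs most often across the sample lines.
    // A simple heuristic, not a guarantee: quoted fields containing the
    // delimiter can skew the counts.
    public static char DetectDelimiter(IReadOnlyList<string> sampleLines)
    {
        return Candidates
            .OrderByDescending(c =>
                sampleLines.Sum(line => line.Count(ch => ch == c)))
            .First();
    }
}
```

A stricter version could also require the count to be identical on every sample line before accepting a candidate, which guards against false positives from free-text columns.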

As shown in the console output, both files are correctly formatted into DataTables, despite starting with different delimiters.

[Figure: Microsoft Visual Studio Debug Console showing output of two different file formats -- a tab-delimited file with fruit data (ID, Name, Quantity) and a semicolon-delimited file with city population data]

What Are the Supported IronXL Features Beyond CSV?

IronXL covers much more than CSV reading. The library handles the full spectrum of Excel and spreadsheet operations in C#, making it a single dependency for data-centric applications.

IronXL Key Features for .NET Developers

  • Read Excel files -- load .xlsx, .xls, and .csv files and access cell data programmatically (Read Excel)
  • Write Excel files -- create and modify spreadsheets, set values, formulas, and styles (Write Excel)
  • Create Excel files -- generate new workbooks and worksheets from code (Create Excel)
  • Export to DataSet -- convert entire workbooks into DataSet objects for multi-table scenarios (Excel to DataSet)
  • Cell formatting -- apply number formats, fonts, colors, and borders to cells (Cell Formatting)
  • Import data -- populate worksheets from DataTable, List, or database sources (Import Data)

How Do You Export an Entire Workbook as a DataSet?

When your CSV file contains multiple sections or you need to work with multiple sheets, use the ToDataSet() method on the WorkBook object. This returns a DataSet containing one DataTable per worksheet. The Excel to DataSet how-to provides step-by-step code examples for this scenario.
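A minimal sketch of that pattern, assuming the IronXL API described above (the multi_sheet_report.xlsx file name is invented for the example):

```csharp
using System;
using System.Data;
using IronXL;

// Load a workbook with several worksheets (file name is illustrative)
WorkBook workbook = WorkBook.Load("multi_sheet_report.xlsx");

// One DataTable per worksheet
DataSet dataSet = workbook.ToDataSet();

foreach (DataTable table in dataSet.Tables)
{
    Console.WriteLine($"{table.TableName}: {table.Rows.Count} rows");
}
```

Each resulting DataTable carries its worksheet's name, so you can pull a specific sheet with dataSet.Tables["SheetName"].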

What Licensing Options Are Available?

IronXL is available for commercial use with a variety of licensing tiers to match your team size and deployment needs. A free trial license provides access to the full feature set for evaluation. The licensing page documents available plans including royalty-free redistribution and SaaS options.

What Are the Next Steps?

IronXL transforms CSV database imports into a straightforward, reliable process. With just a few lines of code, you can:

  • Read CSV files from any file path using LoadCSV functionality
  • Convert CSV data into a DataTable with proper data formatting
  • Preserve column headers and DataTable columns through automatic parsing
  • Import millions of rows into a database table efficiently using bulk operations

The library handles the intricacies of CSV parsing, memory management, and data type conversions, letting you focus on your business logic rather than file processing details. Whether you're building ASP.NET applications, working with Blazor, or developing .NET 10 desktop apps, IronXL provides consistent, reliable CSV processing across all platforms.

Ready to get started? The IronXL documentation covers every feature in depth, and a trial license lets you evaluate the full library. With extensive code examples and clear API references, you will be processing CSV files efficiently in your .NET 10 applications in no time.

Frequently Asked Questions

What is the best way to read CSV files into a DataTable in C#?

Using IronXL, you can efficiently read CSV files into a DataTable in C# with simple code examples provided in our developer guide.

Why should I use IronXL for CSV to DataTable conversion?

IronXL offers a straightforward API for parsing CSV files and converting them into DataTables, allowing you to easily manipulate and analyze your data in C#.

Can IronXL handle large CSV files when converting to DataTables?

Yes, IronXL is designed to efficiently process large CSV files and convert them into DataTables without performance issues.

Does IronXL support data manipulation after converting CSV to DataTable?

Absolutely, once you convert a CSV file into a DataTable using IronXL, you can easily manipulate and analyze the data as needed.

How can I import CSV data into a database using IronXL?

After converting your CSV file into a DataTable with IronXL, you can insert the data into a database using standard database connectivity in C#.

Is IronXL suitable for processing CSV files in enterprise applications?

Yes, IronXL is designed to handle CSV processing tasks in enterprise applications, offering robust performance and reliability.

What are the advantages of converting CSV files to DataTables in C#?

Converting CSV files to DataTables allows for easier data manipulation, analysis, and integration with databases, enhancing your application's data handling capabilities.

Can IronXL be used for other spreadsheet file types besides CSV?

Yes, IronXL supports various spreadsheet file types including Excel formats, enabling versatile data processing capabilities in C#.

Jordi Bardia
Software Engineer
Jordi is most proficient in Python, C# and C++, when he isn’t leveraging his skills at Iron Software; he’s game programming. Sharing responsibilities for product testing, product development and research, Jordi adds immense value to continual product improvement. The varied experience keeps him challenged and engaged, and he ...
