How to Read a CSV File into a DataTable in C#
Working with CSV files is a common task for .NET developers, whether you're importing sales reports, processing inventory, or syncing customer records. Converting a CSV file into a DataTable makes it easy to manipulate, analyze, or insert the data into a database table.
Reading CSV files in C# can be tricky: large files, different delimiters, and embedded commas often require complex parsing code or a dependency on Excel. IronXL simplifies this: with just a few lines of code, you can read CSV files from any file path, convert them into a DataTable with proper column headers, and prepare them for bulk database operations - all without Excel installed.
In this guide, you’ll learn how to:
- Load a CSV file into a DataTable in C#
- Handle different delimiters like commas, tabs, or semicolons
- Import your DataTable directly into SQL Server efficiently
- Manage large datasets reliably without running into memory issues
By the end, you’ll have a complete, practical workflow for turning CSV data into actionable information using IronXL in your .NET applications.
Why Is CSV to DataTable Conversion Essential?
CSV to DataTable conversion allows developers to transform comma-separated values (CSV data) into structured objects like a DataTable for further processing. Whether you're handling inventory data, customer records, or transaction logs, being able to convert CSV to a DataTable efficiently is crucial. By using the first row as column headers, you can ensure that the DataTable's columns align with your database table schema.
Traditional approaches often struggle with large files, different delimiters, or memory management. IronXL eliminates these common pain points, handling different delimiters, quoted fields, and embedded commas seamlessly.
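For instance, consider a row like the one below (a hypothetical example), where a quoted field contains an embedded comma. Naively splitting the second line on commas would produce four fields, while a CSV-aware parser correctly returns three, keeping the quoted description intact:

```csv
ProductCode,Description,Quantity
A-100,"Bracket, steel",25
```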
How to Install IronXL?
Getting started with IronXL requires just a simple NuGet package installation. Open the NuGet Package Manager Console in Visual Studio and execute:
Install-Package IronXL.Excel
Once installed, add the IronXL namespace to your project:
using IronXL;
using System.Data;
This provides access to all the powerful CSV processing capabilities without any Excel dependencies. For more details on IronXL's extensive API, refer to the API Reference Documentation.
How to Convert CSV to DataTable?
The beauty of IronXL lies in its simplicity. Here's how to read a CSV file and convert it to a DataTable:
// Load CSV file into a WorkBook object
WorkBook workbook = WorkBook.LoadCSV("sales_data.csv", fileFormat: ExcelFileFormat.XLSX);

// Access the default worksheet
WorkSheet worksheet = workbook.DefaultWorkSheet;

// Convert to DataTable with headers
DataTable dataTable = worksheet.ToDataTable(true);

// Display the data
foreach (DataRow row in dataTable.Rows)
{
    foreach (var item in row.ItemArray)
    {
        Console.Write($"{item}\t");
    }
    Console.WriteLine();
}
The LoadCSV method intelligently parses your CSV file, automatically handling common cases such as quoted fields and embedded commas. The fileFormat parameter tells IronXL how to interpret the CSV data internally, ensuring it is treated as an Excel-compatible structure. Once loaded, the ToDataTable(true) method converts the worksheet into a DataTable, with the parameter `true` indicating that the first row should be used as column headers. This results in a clean, structured DataTable with named columns that match your CSV headers. For more advanced manipulation options, explore the complete Excel to DataTable tutorial.
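Because the result is a plain System.Data.DataTable, all of the usual ADO.NET operations are available afterwards. As a quick illustration - using a hand-built table with hypothetical column names to stand in for the output of ToDataTable(true) - you can filter rows with Select and read cells by column name:

```csharp
using System;
using System.Data;

// Stand-in for the DataTable produced by worksheet.ToDataTable(true);
// the column names here are hypothetical examples.
var table = new DataTable("Sales");
table.Columns.Add("Product", typeof(string));
table.Columns.Add("Quantity", typeof(int));
table.Rows.Add("Widget", 12);
table.Rows.Add("Gadget", 3);
table.Rows.Add("Widget", 7);

// Filter rows with the DataTable expression syntax
DataRow[] widgets = table.Select("Product = 'Widget'");

// Read cells by column name instead of positional index
foreach (DataRow row in widgets)
{
    Console.WriteLine($"{row["Product"]}: {row.Field<int>("Quantity")}");
}
// prints "Widget: 12" then "Widget: 7"
```

Named-column access like `row["Product"]` only works when the first row was promoted to headers, which is exactly what passing `true` to ToDataTable gives you.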
How to Import DataTable into SQL Server?
Once you have your DataTable, importing it into SQL Server becomes remarkably efficient using SqlBulkCopy:
using System;
using System.Data;
using Microsoft.Data.SqlClient;
using IronXL;

class Program
{
    static void Main(string[] args)
    {
        // Create connection string
        string connectionString = "Data Source=localhost;Initial Catalog=SalesDB;Integrated Security=True;TrustServerCertificate=True;";

        // Read CSV into DataTable
        WorkBook workbook = WorkBook.LoadCSV("inventory_report.csv");
        DataTable dataTable = workbook.DefaultWorkSheet.ToDataTable(true);

        // Bulk insert into SQL Server
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                // Set destination table name
                bulkCopy.DestinationTableName = "dbo.Inventory";

                // Map DataTable columns to SQL table columns
                bulkCopy.ColumnMappings.Add("ProductCode", "product_code");
                bulkCopy.ColumnMappings.Add("Quantity", "quantity");
                bulkCopy.ColumnMappings.Add("LastUpdated", "last_updated");

                // Set batch size for better performance
                bulkCopy.BatchSize = 1000;

                // Write data to SQL Server
                bulkCopy.WriteToServer(dataTable);
            }
        }

        Console.WriteLine($"Successfully imported {dataTable.Rows.Count} records");
    }
}
The SqlBulkCopy class provides exceptional performance for large-scale data imports. The ColumnMappings collection allows you to map DataTable columns to database columns with different names, ensuring flexibility in your data schema. The BatchSize property optimizes memory usage by processing records in chunks rather than loading everything at once. This approach can handle millions of records efficiently, making it ideal for enterprise-scale data operations. Learn more about IronXL's DataSet and DataTable capabilities.
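For very long-running imports, SqlBulkCopy exposes a few more knobs beyond BatchSize. The fragment below is an illustrative sketch, not tied to the article's exact schema; it assumes `connection` is an already-open SqlConnection and `dataTable` holds the CSV data. It disables the copy timeout and reports progress as rows are written:

```csharp
using System;
using System.Data;
using Microsoft.Data.SqlClient;

// Assumes `connection` is an open SqlConnection and `dataTable` holds the CSV data.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.Inventory";
    bulkCopy.BatchSize = 1000;     // commit rows to the server in chunks
    bulkCopy.BulkCopyTimeout = 0;  // 0 = no timeout, for multi-million-row imports

    // Fire the SqlRowsCopied event every 10,000 rows for progress reporting
    bulkCopy.NotifyAfter = 10000;
    bulkCopy.SqlRowsCopied += (sender, e) =>
        Console.WriteLine($"{e.RowsCopied} rows copied so far...");

    bulkCopy.WriteToServer(dataTable);
}
```

NotifyAfter and BulkCopyTimeout are standard members of SqlBulkCopy in Microsoft.Data.SqlClient; progress reporting is especially useful when importing files large enough that a silent WriteToServer call would appear hung.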
Output
Here, we can run the query `SELECT * FROM dbo.Inventory;` in SSMS to check that our CSV data was actually imported into SQL Server. First, let's take a look at the data we used in our example CSV file:
Now, if you look at the output for our query, we can see it has successfully copied every column, header, and row into the database table.
How to Handle Different Delimiters?
Not all CSV files use commas. IronXL seamlessly handles various delimiters:
using IronXL;
using System;
using System.Data;
using System.IO;

class Program
{
    static void Main()
    {
        // --- Tab-delimited file ---
        string tsvPath = @"export_data.tsv";
        WorkBook tsvWorkbook = WorkBook.LoadCSV(tsvPath, ExcelFileFormat.XLSX, "\t");

        // --- Semicolon-delimited file ---
        string semiPath = @"european_data.csv";
        string tempCsv = Path.Combine(Path.GetTempPath(), "european_data_comma.csv");

        // Replace semicolons with commas for proper parsing
        var lines = File.ReadAllLines(semiPath);
        for (int i = 0; i < lines.Length; i++)
        {
            lines[i] = lines[i].Replace(';', ',');
        }
        File.WriteAllLines(tempCsv, lines);
        WorkBook semiWorkbook = WorkBook.LoadCSV(tempCsv, ExcelFileFormat.XLSX);

        // Print both workbooks
        PrintWorkbook(tsvWorkbook, "Tab-delimited File");
        PrintWorkbook(semiWorkbook, "Semicolon-delimited File");
    }

    static void PrintWorkbook(WorkBook workbook, string name)
    {
        Console.WriteLine($"\n--- {name} ---\n");
        DataTable dt = workbook.DefaultWorkSheet.ToDataTable(true);

        // Print column headers
        foreach (DataColumn col in dt.Columns)
        {
            Console.Write($"{col.ColumnName}\t");
        }
        Console.WriteLine();

        // Print rows
        foreach (DataRow row in dt.Rows)
        {
            foreach (var item in row.ItemArray)
            {
                Console.Write($"{item}\t");
            }
            Console.WriteLine();
        }
    }
}
This code loads CSV files with different delimiters into IronXL WorkBook objects. Tab-delimited files are read using "\t" as the separator, while semicolon-delimited files are converted to standard CSV format before loading. Each workbook’s default worksheet is converted into a DataTable with ToDataTable(true), which uses the first row as column headers.
The PrintWorkbook method then iterates through the DataTable and displays the data in the console, including the headers and all rows. This flexibility addresses many of the common CSV parsing challenges discussed by developers.
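Note that, just as the tab-delimited example passes "\t" as the delimiter argument, LoadCSV will generally accept ";" directly. That avoids rewriting the file first, and it is safer when quoted fields legitimately contain semicolons or commas, which a blanket character replacement would corrupt. A minimal sketch (verify the parameter order against your installed IronXL version):

```csharp
using System.Data;
using IronXL;

// Pass the semicolon delimiter directly, mirroring the "\t" usage above
WorkBook semiWorkbook = WorkBook.LoadCSV("european_data.csv", ExcelFileFormat.XLSX, ";");
DataTable dt = semiWorkbook.DefaultWorkSheet.ToDataTable(true);
```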
Output
As you can see in our console output, both files are correctly formatted into data tables, despite starting out with different delimiters.
Conclusion
IronXL transforms the complex task of CSV database imports into a straightforward, reliable process. With just a few lines of code, you can:
- Read CSV files from any file path
- Convert CSV data into a new DataTable
- Preserve column headers and column structure
- Import millions of rows into a database table efficiently
The library handles the intricacies of CSV parsing, memory management, and data type conversions automatically, letting you focus on your business logic rather than file processing details.
Ready to streamline your data import workflow? Start your free trial to find the perfect fit for your team's needs.
Frequently Asked Questions
What is the best way to read CSV files into a DataTable in C#?
Using IronXL, you can efficiently read CSV files into a DataTable in C# with simple code examples provided in our developer guide.
Why should I use IronXL for CSV to DataTable conversion?
IronXL offers a straightforward API for parsing CSV files and converting them into DataTables, allowing you to easily manipulate and analyze your data in C#.
Can IronXL handle large CSV files when converting to DataTables?
Yes, IronXL is designed to efficiently process large CSV files and convert them into DataTables without performance issues.
Does IronXL support data manipulation after converting CSV to DataTable?
Absolutely, once you convert a CSV file into a DataTable using IronXL, you can easily manipulate and analyze the data as needed.
How can I import CSV data into a database using IronXL?
After converting your CSV file into a DataTable with IronXL, you can insert the data into a database using standard database connectivity in C#.
Is IronXL suitable for processing CSV files in enterprise applications?
Yes, IronXL is designed to handle CSV processing tasks in enterprise applications, offering robust performance and reliability.
What are the advantages of converting CSV files to DataTables in C#?
Converting CSV files to DataTables allows for easier data manipulation, analysis, and integration with databases, enhancing your application's data handling capabilities.
Can IronXL be used for other spreadsheet file types besides CSV?
Yes, IronXL supports various spreadsheet file types including Excel formats, enabling versatile data processing capabilities in C#.