USING IRONXL

Comment lire un fichier CSV dans une DataTable en C#

Working with CSV files is a common task for .NET developers, whether you're importing sales reports, processing inventory, or syncing customer records. Converting a CSV file into a DataTable makes it easy to manipulate, analyze, or insert the data into a database table.

Reading CSV files in C# can be tricky: large files, different delimiters, and embedded commas often require complex parsing code or a dependency on Excel. IronXL simplifies this: with just a few lines of code, you can read CSV files from any file path, convert them into a DataTable with proper column headers, and prepare them for bulk database operations - all without Excel.

In this guide, you’ll learn how to:

  • Load a CSV file into a DataTable in C#
  • Handle different delimiters like commas, tabs, or semicolons
  • Import your DataTable directly into SQL Server efficiently
  • Manage large datasets reliably without running into memory issues

By the end, you’ll have a complete, practical workflow for turning CSV data into actionable information using IronXL in your .NET applications.

Why Is CSV to DataTable Conversion Essential?

CSV to DataTable conversion allows developers to transform comma-separated values (CSV data) into structured objects like a DataTable for further processing. Whether you're handling inventory data, customer records, or transaction logs, being able to convert CSV to a new DataTable efficiently is crucial. Using the first row as column headers, you can ensure that DataTable columns align with your database table schema.

Traditional approaches often struggle with large files, different delimiters, or memory management. IronXL's CSV reading capabilities eliminate these common pain points, handling different delimiters, quoted fields, and embedded commas seamlessly.

How to Install IronXL?

Getting started with IronXL requires just a simple NuGet package installation. Open the NuGet Package Manager Console in Visual Studio and execute:

Install-Package IronXL.Excel
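
If you prefer the .NET CLI, the same package can be added from a terminal with:

dotnet add package IronXL.Excel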

Once installed, add the IronXL namespace to your project:

using IronXL;
using System.Data;

This provides access to all the powerful CSV processing capabilities without any Excel dependencies. For more details on IronXL's extensive API, refer to the API Reference Documentation.


How to Convert CSV to DataTable?

The beauty of IronXL lies in its simplicity. Here's how to read a CSV file and convert it to a DataTable:

// Load CSV file into a WorkBook object
WorkBook workbook = WorkBook.LoadCSV("sales_data.csv", 
    fileFormat: ExcelFileFormat.XLSX);
// Access the default worksheet
WorkSheet worksheet = workbook.DefaultWorkSheet;
// Convert to DataTable with headers
DataTable dataTable = worksheet.ToDataTable(true);
// Display the data
foreach (DataRow row in dataTable.Rows)
{
    foreach (var item in row.ItemArray)
    {
        Console.Write($"{item}\t");
    }
    Console.WriteLine();
}

The LoadCSV method intelligently parses your CSV file, automatically handling common cases such as quoted fields and embedded commas. The fileFormat parameter tells IronXL how to interpret the CSV data internally, ensuring it is treated as an Excel-compatible structure. Once loaded, the ToDataTable(true) method converts the worksheet into a DataTable, with the parameter `true` indicating that the first row should be used as column headers. This results in a clean, structured DataTable with named columns that match your CSV headers. For more advanced manipulation options, explore the complete Excel to DataTable tutorial.
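
Once converted, the DataTable behaves like any other ADO.NET DataTable. As a minimal sketch, assuming sales_data.csv happens to contain hypothetical "Product" and "Amount" columns, you could read cells by header name and total a numeric column like this:

// "Product" and "Amount" are hypothetical header names - substitute your own CSV's columns
decimal total = 0;
foreach (DataRow row in dataTable.Rows)
{
    string product = row["Product"].ToString();
    // Cell values are returned as objects, so convert before doing arithmetic
    total += Convert.ToDecimal(row["Amount"]);
    Console.WriteLine($"{product}: {row["Amount"]}");
}
Console.WriteLine($"Total amount across {dataTable.Rows.Count} rows: {total}");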

How to Import DataTable into SQL Server?

Once you have your DataTable, importing it into SQL Server becomes remarkably efficient using SqlBulkCopy:

using System.Data;
using Microsoft.Data.SqlClient;
using IronXL;
class Program
{
    static void Main(string[] args)
    {
        // Create connection string
        string connectionString = @"Data Source=localhost;Initial Catalog=SalesDB;
                           Integrated Security=True;TrustServerCertificate=True;";
        // Read CSV into DataTable
        WorkBook workbook = WorkBook.LoadCSV("inventory_report.csv");
        DataTable dataTable = workbook.DefaultWorkSheet.ToDataTable(true);
        // Bulk insert into SQL Server
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                // Set destination table name
                bulkCopy.DestinationTableName = "dbo.Inventory";
                // Map DataTable columns to SQL table columns
                bulkCopy.ColumnMappings.Add("ProductCode", "product_code");
                bulkCopy.ColumnMappings.Add("Quantity", "quantity");
                bulkCopy.ColumnMappings.Add("LastUpdated", "last_updated");
                // Set batch size for better performance
                bulkCopy.BatchSize = 1000;
                // Write data to SQL Server
                bulkCopy.WriteToServer(dataTable);
            }
        }
        Console.WriteLine($"Successfully imported {dataTable.Rows.Count} records");
    }
}

The SqlBulkCopy class provides exceptional performance for large-scale data imports. The ColumnMappings collection allows you to map DataTable columns to database columns with different names, ensuring flexibility in your data schema. The BatchSize property optimizes memory usage by processing records in chunks rather than loading everything at once. This approach can handle millions of records efficiently, making it ideal for enterprise-scale data operations. Learn more about IronXL's DataSet and DataTable capabilities.
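
For very large imports, SqlBulkCopy can also report progress as it writes and can be given an unlimited timeout so long-running jobs aren't cut off. This is a minimal sketch of those optional settings, reusing the connection and dataTable from the example above (the 10,000-row notification threshold is just an illustrative value):

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.Inventory";
    bulkCopy.BatchSize = 1000;
    // 0 disables the timeout so very large imports are not interrupted
    bulkCopy.BulkCopyTimeout = 0;
    // Raise SqlRowsCopied every 10,000 rows for progress logging
    bulkCopy.NotifyAfter = 10000;
    bulkCopy.SqlRowsCopied += (sender, e) =>
        Console.WriteLine($"{e.RowsCopied} rows copied so far...");
    bulkCopy.WriteToServer(dataTable);
}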

Output

Here, we can run the query SELECT * FROM dbo.Inventory; in SSMS to check that our CSV data actually got imported into SQL Server. First, let's take a look at the data we used in our example CSV file:

How to Read a CSV File into a DataTable in C#: Figure 2 - Sample CSV data

Now, if you look at the output for our query, we can see it has successfully copied every column, header, and row into the database table.

How to Read a CSV File into a DataTable in C#: Figure 3 - Database view showing the imported records
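
If you'd rather verify the import from code instead of SSMS, a quick row count against the same connection does the job - a minimal sketch reusing the connectionString from the import example:

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Count the rows that landed in the destination table
    using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM dbo.Inventory", connection))
    {
        int rowCount = (int)command.ExecuteScalar();
        Console.WriteLine($"dbo.Inventory now contains {rowCount} rows");
    }
}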

How to Handle Different Delimiters?

Not all CSV files use commas. IronXL seamlessly handles various delimiters:

using IronXL;
using System;
using System.Data;
using System.IO;
class Program
{
    static void Main()
    {
        // --- Tab-delimited file ---
        string tsvPath = @"export_data.tsv";
        WorkBook tsvWorkbook = WorkBook.LoadCSV(tsvPath, ExcelFileFormat.XLSX, "\t");
        // --- Semicolon-delimited file ---
        string semiPath = @"european_data.csv";
        string tempCsv = Path.Combine(Path.GetTempPath(), "european_data_comma.csv");
        // Replace semicolons with commas for proper parsing
        var lines = File.ReadAllLines(semiPath);
        for (int i = 0; i < lines.Length; i++)
        {
            lines[i] = lines[i].Replace(';', ',');
        }
        File.WriteAllLines(tempCsv, lines);
        WorkBook semiWorkbook = WorkBook.LoadCSV(tempCsv, ExcelFileFormat.XLSX);
        // Print both workbooks
        PrintWorkbook(tsvWorkbook, "Tab-delimited File");
        PrintWorkbook(semiWorkbook, "Semicolon-delimited File");
    }
    static void PrintWorkbook(WorkBook workbook, string name)
    {
        Console.WriteLine($"\n--- {name} ---\n");
        DataTable dt = workbook.DefaultWorkSheet.ToDataTable(true);
        // Print column headers
        foreach (DataColumn col in dt.Columns)
        {
            Console.Write($"{col.ColumnName}\t");
        }
        Console.WriteLine();
        // Print rows
        foreach (DataRow row in dt.Rows)
        {
            foreach (var item in row.ItemArray)
            {
                Console.Write($"{item}\t");
            }
            Console.WriteLine();
        }
    }
}

This code loads CSV files with different delimiters into IronXL WorkBook objects. Tab-delimited files are read using "\t" as the separator, while semicolon-delimited files are converted to standard CSV format before loading. Each workbook’s default worksheet is converted into a DataTable with ToDataTable(true), which uses the first row as column headers.

The PrintWorkbook method then iterates through the DataTable and displays the data in the console, including the headers and all rows. This flexibility addresses many of the common CSV parsing challenges discussed by developers.
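
Note that the delimiter parameter shown with "\t" above isn't limited to tabs. If your semicolon-delimited file doesn't use commas inside its values, you may be able to skip the preprocessing step entirely and pass the delimiter straight to LoadCSV - a short sketch under that assumption:

// Pass ";" as the delimiter instead of rewriting the file first
WorkBook semiWorkbook = WorkBook.LoadCSV("european_data.csv", ExcelFileFormat.XLSX, ";");
DataTable dt = semiWorkbook.DefaultWorkSheet.ToDataTable(true);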

Output

As you can see in the console output, both files are correctly parsed into DataTables, despite starting out with different delimiters.

How to Read a CSV File into a DataTable in C#: Figure 4 - Different Delimiters output

Conclusion

IronXL transforms the complex task of CSV database imports into a straightforward, reliable process. With just a few lines of code, you can:

  • Read CSV files from any file path
  • Convert CSV data into a new DataTable
  • Preserve column headers and DataTable columns
  • Import millions of rows into a database table efficiently

The library handles the intricacies of CSV parsing, memory management, and data type conversions automatically, letting you focus on your business logic rather than file processing details.

Ready to streamline your data import workflow? Start your free trial to find the perfect fit for your team's needs.

Frequently Asked Questions

What is the best way to read CSV files into a DataTable in C#?

Using IronXL, you can efficiently read CSV files into a DataTable in C# with the simple code examples provided in our developer guide.

Why should I use IronXL for CSV to DataTable conversion?

IronXL offers a straightforward API for parsing CSV files and converting them into DataTables, letting you easily manipulate and analyze your data in C#.

Can IronXL handle large CSV files when converting to DataTables?

Yes, IronXL is designed to process large CSV files efficiently and convert them into DataTables without performance issues.

Does IronXL support data manipulation after converting a CSV to a DataTable?

Absolutely. Once you have converted a CSV file into a DataTable with IronXL, you can easily manipulate and analyze the data as needed.

How can I import CSV data into a database using IronXL?

After converting your CSV file into a DataTable with IronXL, you can insert the data into a database using standard C# database connectivity.

Is IronXL suitable for processing CSV files in enterprise applications?

Yes, IronXL is built to handle CSV processing tasks in enterprise applications, offering robust performance and reliability.

What are the benefits of converting CSV files to DataTables in C#?

Converting CSV files to DataTables enables easier manipulation, analysis, and integration with databases, improving your application's data management capabilities.

Can IronXL be used for spreadsheet file types other than CSV?

Yes, IronXL supports various spreadsheet file types, including Excel formats, enabling versatile data processing in C#.

Jordi Bardia
Software Engineer
Jordi is most proficient in Python, C# and C++, and when he isn't putting his skills to use at Iron Software, he programs games. Sharing responsibilities across product testing, product development, and research, Jordi brings immense value to the...
Read more