
Import CSV in C# | ASP.NET Core Tutorial | IronXL

Importing a CSV file in C# with ASP.NET Core means reading a file stream, parsing delimited rows, and mapping each record to a typed object -- all before you can do anything useful with the data. IronXL handles every step through a single API that works equally well for CSV, XLSX, and TSV files, so you spend time on your application logic rather than string-splitting edge cases.

Start your free trial to follow along and test these code examples in your own environment.

How Do You Install IronXL in a .NET Project?

Before writing any parsing code, add IronXL to your project. Open the Package Manager Console in Visual Studio or a terminal in your project directory and run one of the following commands:

Install-Package IronXL.Excel
dotnet add package IronXL.Excel

The package targets .NET 10, .NET 6+, .NET Framework 4.6.2+, and .NET Standard 2.0, so it fits any modern project type. No Microsoft Office installation is required on the server. After the package restores, add using IronXL; at the top of any file that calls the library.

For detailed setup options, including global tool installation and CI/CD pipeline configuration, visit the IronXL installation guide.

How Do You Import a CSV File in ASP.NET Core?

Importing a CSV file in ASP.NET Core requires reading the file stream from the server, parsing each row, and mapping values to a model class. While some developers reach for manual StreamReader implementations or third-party packages, IronXL provides a single method that handles CSV files alongside Excel formats without additional configuration.

The following code shows how to load a CSV file using IronXL's WorkBook.LoadCSV method:

using IronXL;

// Load the CSV file directly using the full file path
var csv = WorkBook.LoadCSV("products.csv");
WorkSheet worksheet = csv.DefaultWorkSheet;

// Access CSV data by iterating through rows
foreach (var row in worksheet.Rows)
{
    string productName = row.Columns[1].StringValue;
    decimal price = row.Columns[2].DecimalValue;
    Console.WriteLine($"Product: {productName}, Price: {price}");
}

Understanding the WorkBook.LoadCSV Method

The WorkBook.LoadCSV method reads the CSV file and creates a worksheet where each line becomes a row and each delimited value becomes a cell. IronXL automatically detects the delimiter (comma, semicolon, or tab) and handles quoted fields that contain commas within the value. You do not need to configure a delimiter manually for standard CSV files.
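For files that use a non-standard delimiter, LoadCSV can be given the separator explicitly. The sketch below assumes the optional format and delimiter parameters documented for recent IronXL versions; check the API reference for the exact signature in your version:

```csharp
using IronXL;

// Assumption: LoadCSV takes an optional file-format hint and list
// delimiter after the path. Verify against your IronXL version.
WorkBook pipeDelimited = WorkBook.LoadCSV("products.txt",
    ExcelFileFormat.XLSX, "|");
WorkSheet sheet = pipeDelimited.DefaultWorkSheet;
Console.WriteLine($"Loaded {sheet.Rows.Count()} rows.");
```

For standard comma-, semicolon-, or tab-delimited files, the single-argument overload shown above is all you need.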

Cell value accessors like StringValue, DecimalValue, IntValue, and DateTimeValue handle type conversion so you avoid manual parsing with int.TryParse or decimal.Parse. When a cell is empty or contains an unrecognized value, these accessors return the type's default rather than throwing an exception, which prevents unhandled errors during bulk import operations.

This approach eliminates the error-prone manual string parsing that custom implementations require, particularly for edge cases like escaped quotes, Windows-style line endings, and multi-line cell values. For more details on supported formats and delimiter options, see the IronXL CSV documentation.

Image 1: Importing a simple CSV file and reading the output in the console.

How Can You Create a Model Class for CSV Data?

Mapping CSV data to strongly-typed objects requires a model class that mirrors the file structure. You convert raw string data into specific types like integers, decimals, and DateTime values. For product inventory data, create a class with properties matching each CSV column:

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
    public decimal Price { get; set; }
    public int Quantity { get; set; }
}

Parsing CSV Rows into Typed Collections

With the model class defined, you can parse CSV records into a typed collection. The following example skips the header row using a loop index and maps each subsequent row to a Product object:

using IronXL;

WorkBook workbook = WorkBook.LoadCSV("inventory.csv");
WorkSheet ws = workbook.DefaultWorkSheet;
var records = new List<Product>();

// Skip header row (index 0), iterate through data rows
for (int i = 1; i < ws.Rows.Count(); i++)
{
    var row = ws.Rows[i];
    var product = new Product
    {
        Id = row.Columns[0].IntValue,
        Name = row.Columns[1].StringValue,
        Price = row.Columns[2].DecimalValue,
        Quantity = row.Columns[3].IntValue
    };
    records.Add(product);
}

Console.WriteLine($"Loaded {records.Count} products.");

The records collection now contains typed Product objects ready for database operations, JSON serialization, or further business logic. IronXL's cell value accessors handle type conversion automatically, including null handling for optional fields.
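For example, serializing the parsed collection to JSON takes one call with System.Text.Json:

```csharp
using System.Text.Json;

// 'records' is the List<Product> built in the loop above.
var options = new JsonSerializerOptions { WriteIndented = true };
string json = JsonSerializer.Serialize(records, options);

// Write to disk, or return the string directly from an API action.
File.WriteAllText("products.json", json);
```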

Handling Optional and Nullable Fields

Real-world CSV files often contain empty cells or optional columns. IronXL's value accessors return default values rather than exceptions when a cell is blank. For nullable types, you can use a conditional check:

// Reading an optional DateTime field
DateTime? lastUpdated = string.IsNullOrEmpty(row.Columns[4].StringValue)
    ? null
    : row.Columns[4].DateTimeValue;

This pattern keeps your import code defensive without wrapping every cell access in a try-catch block. For guidance on handling complex data types and large files, visit the IronXL WorkSheet documentation.

How Do You Handle CSV File Uploads in a Web API?

Building an API endpoint that accepts CSV file uploads from a browser requires combining ASP.NET Core's IFormFile with IronXL's parsing capabilities. The following code demonstrates a complete controller implementation that parses the uploaded file and returns a JSON response:

using IronXL;
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
[ApiController]
public class CsvController : ControllerBase
{
    [HttpPost("upload")]
    public async Task<IActionResult> UploadCsv(IFormFile file)
    {
        if (file == null || file.Length == 0)
            return BadRequest("Please upload a valid CSV file.");

        try
        {
            using var stream = new MemoryStream();
            await file.CopyToAsync(stream);
            stream.Position = 0;

            WorkBook workbook = WorkBook.Load(stream, "csv");
            WorkSheet ws = workbook.DefaultWorkSheet;
            var records = new List<Product>();

            // Skip header row, iterate through data rows
            for (int i = 1; i < ws.Rows.Count(); i++)
            {
                var row = ws.Rows[i];
                records.Add(new Product
                {
                    Id = row.Columns[0].IntValue,
                    Name = row.Columns[1].StringValue,
                    Price = row.Columns[2].DecimalValue,
                    Quantity = row.Columns[3].IntValue
                });
            }

            return Ok(new
            {
                message = "Import successful",
                count = records.Count,
                data = records
            });
        }
        catch (Exception ex)
        {
            return BadRequest($"Error processing file: {ex.Message}");
        }
    }
}

Configuring the Endpoint and Multipart Form

For the [HttpPost("upload")] action to accept file uploads, the project must support multipart form data. In Program.cs for a minimal API setup, ensure you have called builder.Services.AddControllers() and app.MapControllers(). The endpoint becomes accessible at /api/csv/upload.
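A minimal Program.cs that wires this up looks like the following:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register controller support so attribute-routed controllers
// like CsvController are discovered.
builder.Services.AddControllers();

var app = builder.Build();

// Map attribute routes; the upload action is served at POST /api/csv/upload.
app.MapControllers();

app.Run();
```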

When testing from a browser form, set the form's enctype attribute to multipart/form-data and use a file input element. For API clients like Postman, select "form-data" in the request body, add a key named file, and attach the CSV file. The controller returns a JSON object containing the record count and the parsed data array, which client-side JavaScript can consume immediately.

Image 2: Successful output with the data read from the imported CSV file.

Validating File Type Before Parsing

Before passing the stream to IronXL, validate the file extension to reject non-CSV uploads:

var extension = Path.GetExtension(file.FileName).ToLowerInvariant();
if (extension != ".csv" && extension != ".txt")
    return BadRequest("Only CSV files are accepted.");

This check prevents malformed binary data from reaching the parser and provides a clear error message to API consumers. You can extend this validation to check MIME type using file.ContentType for stricter enforcement.
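A content-type check can be layered on top of the extension check. Note that browsers commonly send text/csv for CSV uploads, but some clients report text/plain or application/vnd.ms-excel, so treat the MIME type as a secondary signal rather than a guarantee:

```csharp
// Stricter validation: check the declared MIME type as well.
// The allowed list here is an illustrative assumption; adjust it
// to the clients your application actually serves.
var allowedTypes = new[] { "text/csv", "text/plain", "application/vnd.ms-excel" };
if (!allowedTypes.Contains(file.ContentType))
    return BadRequest($"Unexpected content type: {file.ContentType}");
```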

How Do You Save CSV Data to a Database?

After parsing CSV files into typed objects, you typically persist the records to a database. The following example extends the service layer pattern by using Entity Framework Core's AddRangeAsync for bulk inserts:

using IronXL;

public class CsvImportService
{
    private readonly AppDbContext _context;

    public CsvImportService(AppDbContext context)
    {
        _context = context;
    }

    public async Task<int> ImportProductsAsync(Stream csvStream)
    {
        WorkBook workbook = WorkBook.LoadCSV(csvStream);
        WorkSheet ws = workbook.DefaultWorkSheet;
        var products = new List<Product>();

        foreach (var row in ws.Rows.Skip(1))
        {
            products.Add(new Product
            {
                Id = row.Columns[0].IntValue,
                Name = row.Columns[1].StringValue,
                Price = row.Columns[2].DecimalValue,
                Quantity = row.Columns[3].IntValue
            });
        }

        await _context.Products.AddRangeAsync(products);
        return await _context.SaveChangesAsync();
    }
}

Wiring the Service into Dependency Injection

Register CsvImportService in Program.cs so the controller can request it through constructor injection:

builder.Services.AddScoped<CsvImportService>();

Then update the controller constructor to accept the service and call ImportProductsAsync instead of building the list inline. This separation keeps controller actions thin and moves data-access logic into a testable service class. When SaveChangesAsync runs, Entity Framework Core batches the pending inserts into multi-row INSERT statements, which performs well for CSV files containing thousands of rows.
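The slimmed-down controller might look like this (a sketch; error handling and the file-type validation from earlier are omitted for brevity):

```csharp
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
[ApiController]
public class CsvController : ControllerBase
{
    private readonly CsvImportService _importService;

    // CsvImportService is resolved from DI via the AddScoped registration.
    public CsvController(CsvImportService importService)
    {
        _importService = importService;
    }

    [HttpPost("upload")]
    public async Task<IActionResult> UploadCsv(IFormFile file)
    {
        if (file == null || file.Length == 0)
            return BadRequest("Please upload a valid CSV file.");

        // OpenReadStream avoids an intermediate buffer; if your IronXL
        // version requires a seekable stream, copy to a MemoryStream first.
        using var stream = file.OpenReadStream();
        int saved = await _importService.ImportProductsAsync(stream);

        return Ok(new { message = "Import successful", count = saved });
    }
}
```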

For very large imports (tens of thousands of rows), consider using EF Core's Bulk Extensions or a raw SQL BULK INSERT statement to reduce round-trips to the database.

How Do You Export Data Back to CSV Using IronXL?

IronXL is not limited to reading CSV files -- it also writes them. The SaveAsCsv method exports any worksheet to a CSV file, which is useful for generating reports or sending data to downstream systems:

using IronXL;

WorkBook workbook = WorkBook.Create(ExcelFileFormat.XLS);
WorkSheet ws = workbook.DefaultWorkSheet;

// Write headers
ws["A1"].Value = "Id";
ws["B1"].Value = "Name";
ws["C1"].Value = "Price";

// Write data rows
ws["A2"].Value = 1;
ws["B2"].Value = "Widget A";
ws["C2"].Value = 9.99;

ws["A3"].Value = 2;
ws["B3"].Value = "Widget B";
ws["C3"].Value = 14.49;

// Save as CSV
workbook.SaveAsCsv("export.csv");
Console.WriteLine("CSV export complete.");

The exported file uses comma delimiters by default. For semicolon-delimited files common in European locales, call SaveAsCsv("export.csv", ";") with an explicit delimiter argument. You can also save to a MemoryStream and return the result as a file download from an API endpoint using File(stream, "text/csv", "export.csv").
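A download endpoint can be sketched as follows. SaveAsCsv writes to disk, so this version uses a temporary file to stay stateless; depending on your IronXL version there may be a direct-to-stream export worth preferring:

```csharp
[HttpGet("export")]
public IActionResult ExportCsv()
{
    WorkBook workbook = WorkBook.Create(ExcelFileFormat.XLS);
    WorkSheet ws = workbook.DefaultWorkSheet;
    ws["A1"].Value = "Id";
    ws["B1"].Value = "Name";
    ws["A2"].Value = 1;
    ws["B2"].Value = "Widget A";

    // Write to a uniquely named temp file, read it back, then clean up.
    string tempPath = Path.Combine(Path.GetTempPath(), $"{Guid.NewGuid()}.csv");
    workbook.SaveAsCsv(tempPath);
    byte[] bytes = System.IO.File.ReadAllBytes(tempPath);
    System.IO.File.Delete(tempPath);

    // File() sets the Content-Disposition header so browsers download it.
    return File(bytes, "text/csv", "export.csv");
}
```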

For a full reference on export options, visit the IronXL save and export documentation.

Why Is IronXL a Better Choice Than Manual CSV Parsing?

Manual CSV parsing looks straightforward until you encounter edge cases: values that contain commas wrapped in quotes, newlines embedded in a cell, escaped quotation marks, and byte-order marks at the start of UTF-8 files. Handling all of these correctly in a custom StreamReader loop requires significant testing and ongoing maintenance.

IronXL addresses all of these cases internally. Comparing the two approaches shows clear advantages:

IronXL vs. Manual StreamReader for CSV Parsing in C#
Capability                                | IronXL                | Manual StreamReader
Automatic delimiter detection             | Yes                   | No -- must configure manually
Quoted field handling                     | Built-in              | Requires custom logic
Type conversion (int, decimal, DateTime)  | Built-in accessors    | Requires Parse/TryParse calls
Multi-line cell values                    | Handled automatically | Difficult to implement correctly
BOM handling                              | Automatic             | Requires StreamReader configuration
Excel format support (XLSX, XLS)          | Same API              | Requires separate library
Export to CSV                             | SaveAsCsv method      | Requires separate write logic

Cross-Format Consistency

One of IronXL's practical advantages is that the same WorkBook.Load and worksheet iteration pattern works for XLSX, XLS, ODS, and CSV files. If your application needs to accept multiple spreadsheet formats from users, you can switch between formats without changing the parsing logic. Pass the file stream to WorkBook.Load and IronXL detects the format automatically based on the file signature.
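In practice that means a single helper can accept either format. The sketch below leans on the automatic format detection described above; the file names are illustrative:

```csharp
using IronXL;

// WorkBook.Load inspects the file and picks the right parser, so the
// iteration logic never changes between CSV and XLSX inputs.
// (For raw streams without a file name, pass the "csv" hint as shown
// in the upload controller earlier.)
foreach (var name in new[] { "inventory.csv", "inventory.xlsx" })
{
    WorkBook wb = WorkBook.Load(name);
    Console.WriteLine($"{name}: {wb.DefaultWorkSheet.Rows.Count()} rows");
}
```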

This cross-format API means you write and test one code path instead of maintaining separate implementations for CSV and Excel. For a full list of supported formats, see the IronXL supported file formats page.

Performance Considerations for Large Files

For CSV files under 100 MB, IronXL performs well without any tuning. For larger files, consider these strategies:

  • Load the file from a path rather than copying it to a MemoryStream to reduce memory allocation.
  • Process rows in batches when inserting into a database rather than collecting all records before the first insert.
  • Use ws.Rows.Skip(1) with LINQ to avoid materializing the header row as a Product object.
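The batching strategy can be sketched as a variant of the import service method. The batch size of 1000 is an illustrative starting point, not a tuned value; ChangeTracker.Clear() requires EF Core 5 or later:

```csharp
public async Task<int> ImportProductsBatchedAsync(Stream csvStream, int batchSize = 1000)
{
    WorkBook workbook = WorkBook.LoadCSV(csvStream);
    WorkSheet ws = workbook.DefaultWorkSheet;
    var batch = new List<Product>(batchSize);
    int total = 0;

    foreach (var row in ws.Rows.Skip(1))
    {
        batch.Add(new Product
        {
            Id = row.Columns[0].IntValue,
            Name = row.Columns[1].StringValue,
            Price = row.Columns[2].DecimalValue,
            Quantity = row.Columns[3].IntValue
        });

        if (batch.Count == batchSize)
        {
            _context.Products.AddRange(batch);
            total += await _context.SaveChangesAsync();
            _context.ChangeTracker.Clear(); // release tracked entities between batches
            batch.Clear();
        }
    }

    if (batch.Count > 0) // flush the final partial batch
    {
        _context.Products.AddRange(batch);
        total += await _context.SaveChangesAsync();
    }

    return total;
}
```

Clearing the change tracker between batches keeps memory flat, since EF Core otherwise tracks every inserted entity for the lifetime of the context.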

The IronXL performance guide covers additional optimizations for high-volume import scenarios including parallel processing and streaming modes.

What Are Your Next Steps?

You now have a working pattern for every stage of CSV import in ASP.NET Core: installing the library, loading files from disk or uploaded streams, mapping rows to typed model objects, persisting records to a database with Entity Framework Core, and exporting data back to CSV when needed.


With IronXL in your project, adding support for XLSX uploads alongside CSV requires no additional code changes -- the same WorkBook.Load call handles both. That consistency means you can expand the feature set of your import endpoint incrementally as your application requirements grow.

Frequently Asked Questions

How can I import CSV files in an ASP.NET Core application?

You can import CSV files in an ASP.NET Core application using IronXL by handling file uploads, parsing CSV data, and mapping it to model class objects. IronXL provides an API to facilitate these tasks efficiently.

What are the benefits of using IronXL for CSV import in C#?

IronXL offers an API that simplifies the process of importing CSV files. It allows you to easily parse data, map it to models, and convert records into JSON, making it an ideal choice for data-driven applications.

Can IronXL parse CSV data into model class objects?

Yes, IronXL can parse CSV data into model class objects, allowing you to work with structured data within your .NET application efficiently.

How does IronXL help with database integration when importing CSV files?

IronXL helps with database integration by allowing you to parse CSV data and map it to model objects that can be easily inserted or updated in your database.

Is it possible to return CSV records as JSON using IronXL?

Yes, with IronXL, you can convert CSV records to JSON format, which is useful for creating web APIs or integrating with front-end applications.

What file formats can IronXL handle besides CSV?

In addition to CSV, IronXL can handle various Excel file formats such as XLSX, XLS, and others, providing versatility for different spreadsheet needs.

Does IronXL support large CSV file imports?

IronXL is designed to handle large CSV file imports efficiently, ensuring performance and reliability in data-driven applications.

Jordi Bardia
Software Engineer