ASP Import CSV: A Complete C# Developer's Guide
Working with CSV files is a daily reality for developers building data-driven applications. Whether users upload product inventory, customer records, or financial data, the ability to import CSV files efficiently can make or break your ASP.NET Core app. This article demonstrates how to handle CSV file uploads, parse CSV data into model class objects, and return records as JSON, all using IronXL's streamlined API.
Start your free trial to follow along and test these code examples in your own environment.
How Do You Import a CSV File in ASP.NET Core?
Importing a CSV file in ASP.NET Core requires reading the file stream from the server, parsing each line, and mapping values to a model class. While some developers reach for the CsvHelper NuGet package or manually wire up a StreamReader from the System.IO namespace, IronXL provides an alternative that handles CSV files alongside Excel formats without additional dependencies.
The following code shows how to load a CSV file using IronXL:
using IronXL;

// Load the CSV file directly using the full file path
var csv = WorkBook.LoadCSV("products.csv");
WorkSheet worksheet = csv.DefaultWorkSheet;

// Access CSV data by iterating through rows
foreach (var row in worksheet.Rows)
{
    string productName = row.Columns[1].StringValue;
    decimal price = row.Columns[2].DecimalValue;
    Console.WriteLine($"Product: {productName}, Price: {price}");
}

The WorkBook.LoadCSV method reads the CSV file and creates a worksheet where each line becomes a row. IronXL automatically detects the delimiter and handles special characters, CSV file headers, and quoted fields. This approach eliminates the need to create and manage a StreamReader instance by hand, avoiding the error-prone manual string parsing that custom implementations require.
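If auto-detection ever guesses wrong, LoadCSV also accepts an explicit delimiter. The listDelimiter parameter name below comes from IronXL's published API and should be verified against your installed version; the file name is illustrative:

```csharp
using IronXL;

// Hypothetical semicolon-delimited export, e.g. from a European locale.
// listDelimiter is an optional parameter on WorkBook.LoadCSV -- confirm
// the exact signature in the IronXL API reference for your version.
var book = WorkBook.LoadCSV("products-eu.csv", listDelimiter: ";");
WorkSheet sheet = book.DefaultWorkSheet;

foreach (var row in sheet.Rows)
{
    Console.WriteLine(row.Columns[0].StringValue);
}
```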
How Can You Create a Model Class for CSV Data?
Mapping CSV data to strongly-typed objects requires a model class that mirrors the file structure. You often need to convert raw string data into specific types like integers or decimals. For product inventory data, create a class with properties matching each column:
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
    public int Quantity { get; set; }
}

With this model class defined, you can parse CSV records into a typed collection:
using IronXL;
using System.Collections.Generic;
using System.Linq;

WorkBook workbook = WorkBook.LoadCSV("inventory.csv");
WorkSheet ws = workbook.DefaultWorkSheet;
var records = new List<Product>();

// Skip the header row, then map each data row to a Product
for (int i = 1; i < ws.Rows.Count(); i++)
{
    var row = ws.Rows[i];
    var product = new Product
    {
        Id = row.Columns[0].IntValue,
        Name = row.Columns[1].StringValue,
        Price = row.Columns[2].DecimalValue,
        Quantity = row.Columns[3].IntValue
    };
    records.Add(product);
}
The records collection now contains typed Product objects ready for database operations, JSON serialization, or further processing. IronXL's cell value accessors such as IntValue and DecimalValue handle type conversion automatically.
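For reference, here is a minimal inventory.csv whose columns line up with the Product model above; the file contents are purely illustrative:

```csv
Id,Name,Price,Quantity
1,Widget,9.99,120
2,Gadget,24.50,35
3,Sprocket,3.75,480
```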
How Do You Handle CSV File Uploads in a Web API?
Building an API endpoint that accepts CSV file uploads from a browser requires combining ASP.NET Core's IFormFile with IronXL's parsing capabilities. The following code demonstrates a complete controller implementation that sends a JSON response:
using IronXL;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

namespace CsvTestProject.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class CsvController : ControllerBase
    {
        [HttpPost("upload")]
        public async Task<IActionResult> UploadCsv(IFormFile file)
        {
            if (file == null || file.Length == 0)
                return BadRequest("Please upload a valid CSV file.");

            try
            {
                // Copy the upload into a seekable stream for IronXL
                using var stream = new MemoryStream();
                await file.CopyToAsync(stream);
                stream.Position = 0;

                WorkBook workbook = WorkBook.Load(stream, "csv");
                WorkSheet ws = workbook.DefaultWorkSheet;
                var records = new List<Product>();

                // Skip the header row, then map each data row to a Product
                for (int i = 1; i < ws.Rows.Count(); i++)
                {
                    var row = ws.Rows[i];
                    var product = new Product
                    {
                        Id = row.Columns[0].IntValue,
                        Name = row.Columns[1].StringValue,
                        Price = row.Columns[2].DecimalValue,
                        Quantity = row.Columns[3].IntValue
                    };
                    records.Add(product);
                }

                return Ok(new
                {
                    message = "Success!",
                    count = records.Count,
                    data = records
                });
            }
            catch (Exception ex)
            {
                return BadRequest($"Error: {ex.Message}");
            }
        }
    }
}

This async action method accepts a POST request with the CSV file in the request body. The code creates a MemoryStream, copies the uploaded file's contents into it, and passes the stream directly to IronXL. Returning the records collection serializes it as JSON, which client applications can consume immediately.
For a .NET Core Web API project, create the endpoint using dotnet new webapi or your preferred project template. The route configuration makes the API endpoint accessible at /api/csv/upload. If you are working with Razor Pages, you might trigger this upload from your Index page: simply add an HTML form with a file input and a submit button styled with the Bootstrap classes btn btn-primary.
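As a sketch, that form could look like the following; the action matches the controller route above, and the input's name attribute must match the IFormFile parameter name (file):

```html
<form action="/api/csv/upload" method="post" enctype="multipart/form-data">
    <input type="file" name="file" accept=".csv" />
    <button type="submit" class="btn btn-primary">Upload CSV</button>
</form>
```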
How Do You Save CSV Data to a Database?
After parsing CSV files, you'll typically persist the records to a database. Here's how a CsvService class can write the data using Entity Framework:
using IronXL;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

public class CsvService
{
    private readonly AppDbContext _context;

    public CsvService(AppDbContext context)
    {
        _context = context;
    }

    public async Task<int> ImportProducts(Stream csvStream)
    {
        WorkBook workbook = WorkBook.LoadCSV(csvStream);
        WorkSheet ws = workbook.DefaultWorkSheet;
        var products = new List<Product>();

        // Skip(1) bypasses the header row
        foreach (var row in ws.Rows.Skip(1))
        {
            products.Add(new Product
            {
                Id = row.Columns[0].IntValue,
                Name = row.Columns[1].StringValue,
                Price = row.Columns[2].DecimalValue,
                Quantity = row.Columns[3].IntValue
            });
        }

        await _context.Products.AddRangeAsync(products);
        return await _context.SaveChangesAsync();
    }
}
The foreach loop processes each data row, while Skip(1) bypasses the CSV file's header row. This pattern scales well when reading CSV files with thousands of records.
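To call CsvService from the upload endpoint, register it with the dependency injection container. This is a sketch assuming the minimal hosting model; the AddDbContext call, SQL Server provider, and the "Default" connection string name are illustrative, so substitute whatever EF Core setup your project already uses:

```csharp
// Program.cs -- registration sketch (minimal hosting model).
// The SQL Server provider and connection string name are assumptions.
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("Default")));
builder.Services.AddScoped<CsvService>();
```

A controller can then take CsvService as a constructor parameter and call await _csvService.ImportProducts(stream) with the uploaded MemoryStream.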
Why Choose IronXL for CSV Operations?
IronXL simplifies CSV import workflows by providing a consistent API that works across CSV, Excel, and TSV formats. Unlike manual StreamReader implementations or the CsvHelper package, IronXL offers:
- Automatic delimiter detection and handling
- Built-in type conversion for numeric and date values
- Cross-platform support for .NET Core, .NET Framework, and .NET Standard
- Zero dependency on Microsoft Office or Excel installation
The library also supports creating and writing CSV files using the SaveAsCsv method, making it a complete solution for bidirectional CSV operations.
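For the write direction, a simple export might look like this. SaveAsCsv is the method named above; per IronXL's documentation it writes one CSV file per worksheet, so verify the exact output file names for multi-sheet workbooks:

```csharp
using IronXL;

// Load an existing Excel workbook and export it as CSV
WorkBook workBook = WorkBook.Load("report.xlsx");
workBook.SaveAsCsv("report.csv");
// Multi-sheet workbooks produce one file per sheet
// (e.g. report.Sheet1.csv), per the IronXL docs.
```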
Ready to streamline your CSV file handling? Download IronXL and start building robust data import features today. For production applications, explore IronXL licensing options to unlock the full power of the library.
Conclusion
At the end of the day, importing data shouldn't be the most frustrating part of your week. By using a specialized tool, you can skip the headache of manually initializing a new StreamReader for every single file or fighting with broken delimiters.
Whether you're building a massive enterprise .NET Core app or just a small project, this approach keeps your code clean and your focus on the data itself. Grab a trial key, give these examples a thorough test, and see how much more enjoyable CSV import tasks can actually be.