
Streamline Data Processing with a C# CSV Parser (Guide)

CSV (Comma-Separated Values) files remain one of the most widely used formats for data exchange between applications, databases, and systems. Despite their apparent simplicity, parsing CSV files in C# correctly can quickly become a challenging problem. From handling quoted fields containing commas to managing line breaks within data cells, the nuances of CSV processing demand more than basic string manipulation.

Many developers start with a simple string.Split(',') approach, only to discover that real-world CSV files break these basic implementations in countless ways. Performance issues emerge when processing large datasets with multiple columns, memory consumption grows, and edge cases create data corruption that is difficult to debug. These challenges lead to countless hours spent writing and maintaining custom CSV parsing code that still does not handle every scenario correctly.

IronXL offers a solution that transforms CSV processing from a source of frustration into a reliable operation. As a complete Excel library for .NET, IronXL handles the complexities of CSV parsing while providing integration with Excel formats, making it ideal for applications that work with multiple data formats. Whether importing customer data, processing financial records, or managing inventory files, IronXL's C# CSV parser library eliminates common pitfalls that plague custom implementations.

IronXL homepage showcasing C# code example for reading Excel files without Microsoft Office interop dependencies

What Makes CSV Parsing Complex in C#?

The deceptive simplicity of CSV files masks numerous challenges that emerge when processing real-world data. While the format appears straightforward -- values separated by commas -- the reality involves handling multiple edge cases and performance considerations that can derail basic parsing approaches. According to discussions on Stack Overflow, even experienced developers struggle with proper CSV handling. The Microsoft .NET documentation on file input/output provides background on the underlying primitives, illustrating why building a production-ready CSV parser from scratch is a significant undertaking.

Why Does Basic String Splitting Fail?

Consider the most common beginner approach to parsing a CSV file:

string line = "John,Doe,30,Engineer";
string[] values = line.Split(','); // splits on every comma, quoted or not

This works perfectly for simple cases but immediately fails when encountering real-world data. Quoted fields with embedded commas are one major problem: a CSV line such as "Smith, John",Developer,"New York, NY",50000 gets split into six fields instead of four, corrupting the data structure and causing misalignment in subsequent processing.
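A short plain-.NET snippet (no IronXL involved) makes the failure concrete:

```csharp
using System;

// A record with four logical fields; two quoted fields contain commas
string line = "\"Smith, John\",Developer,\"New York, NY\",50000";

// Naive split treats every comma as a separator, including the quoted ones
string[] fields = line.Split(',');

Console.WriteLine(fields.Length); // prints 6 instead of the expected 4
Console.WriteLine(fields[0]);     // prints "Smith -- a broken half-field
```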

Line breaks within fields also cause issues. According to RFC 4180, fields can contain line breaks when properly quoted. A multi-line address field breaks any line-by-line reading approach, requiring sophisticated state management to track whether a line break occurs within a quoted field or represents a new record.

Escape characters and quote handling create further complications. CSV files use various conventions for escaping quotes within quoted fields. Some use doubled quotes (""), while others use backslashes or other escape characters. Without proper handling, data like "She said, ""Hello!""",greeting becomes corrupted or causes parsing errors.
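For illustration, unescaping a single RFC 4180 quoted field can be sketched in plain .NET; note this handles only an isolated, well-formed field, not a full record:

```csharp
using System;

// An RFC 4180 quoted field: literal quotes are doubled inside the field
string quotedField = "\"She said, \"\"Hello!\"\"\"";

// Strip the surrounding quotes, then collapse doubled quotes to single ones
string decoded = quotedField.Substring(1, quotedField.Length - 2)
                            .Replace("\"\"", "\"");

Console.WriteLine(decoded); // prints: She said, "Hello!"
```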

Different delimiters and encodings add yet more complexity. Not all "CSV" files use commas. Tab-separated values (TSV), pipe-delimited files, and semicolon-separated values are common variations. The RFC 4180 standard defines CSV format specifications, but many implementations deviate from it.

How Does Memory Management Affect Large File Processing?

Loading a 500MB CSV file entirely into memory using File.ReadAllLines() can cause significant performance degradation or out-of-memory exceptions. Processing millions of rows requires streaming approaches and efficient memory management to maintain application responsiveness.
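The difference is easy to demonstrate with plain .NET file APIs: File.ReadAllLines materializes every line up front as a string array, while File.ReadLines streams them lazily. The sketch below (sample file name invented for the demo) streams a file line by line; it inherits all the quoting limitations described above:

```csharp
using System;
using System.IO;

// Create a small sample file; imagine a multi-gigabyte export instead
File.WriteAllText("stream_demo.csv", "Id,Amount\n1,10.5\n2,20.0\n3,7.25\n");

// File.ReadLines yields one line at a time, so memory stays flat
// regardless of file size (unlike File.ReadAllLines, which returns
// the entire file at once)
int dataRows = 0;
bool isHeader = true;
foreach (string line in File.ReadLines("stream_demo.csv"))
{
    if (isHeader) { isHeader = false; continue; } // skip the header row
    string[] fields = line.Split(',');            // naive split; see caveats above
    dataRows++;
}
Console.WriteLine($"Streamed {dataRows} data rows."); // prints: Streamed 3 data rows.
```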

These complexities compound when dealing with CSV files from different sources, each potentially using different conventions. Building a parser that handles all scenarios reliably requires substantial development effort and ongoing maintenance as new edge cases emerge.

How Does IronXL Handle CSV Processing?

IronXL provides a parser that handles real-world CSV complexities while maintaining ease of use. Rather than forcing developers to reinvent the wheel, IronXL offers a solution addressing every common CSV challenge through an intuitive API.

The parser's intelligent engine automatically detects and handles quoted fields, embedded delimiters, and line breaks within data. The engine adapts to different CSV dialects without requiring manual configuration, correctly interpreting files whether they follow strict RFC 4180 standards or use common variations.

Flexible delimiter support is built in. While commas remain the default, IronXL handles any delimiter character through simple configuration options. Whether working with tab-separated files, pipe-delimited exports, or semicolon-separated European formats, the same API handles all variations consistently.

Excel integration is another key advantage. Unlike standalone CSV parsers, IronXL provides bidirectional conversion between CSV and Excel formats. This capability enables workflows where CSV data imports into Excel workbooks for advanced formatting, formula application, and chart generation -- all programmatically through C# code.

Cross Platform Support diagram showing compatibility with C#, F#, and VB.NET across .NET versions 9, 8, 7, 6, Core, Standard, and Framework, with icons representing various platforms and deployment environments.

How Do You Install IronXL for CSV Parsing?

Installing IronXL requires just a few simple steps. The library integrates into any .NET project through NuGet, Microsoft's package management system. You can visit the IronXL NuGet installation guide for detailed setup instructions.

What Are the Installation Steps?

Install IronXL through the NuGet Package Manager Console or the .NET CLI:

# NuGet Package Manager Console
Install-Package IronXL.Excel

# .NET CLI
dotnet add package IronXL.Excel

For licensing setup, you can obtain a trial license to evaluate IronXL fully before purchasing.

How Do You Load Your First CSV File?

Once installed, add the IronXL namespace to your C# files and load a CSV with just a few lines:

using IronXL;

// Load a CSV file into an IronXL workbook
WorkBook workbook = WorkBook.LoadCSV("customers.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

// Read a specific cell value
string customerName = sheet["B2"].StringValue;
Console.WriteLine($"Customer: {customerName}");

The WorkBook.LoadCSV() method intelligently parses the CSV file, automatically detecting delimiters and handling quoted fields. Since a CSV file contains a single sheet, the data is accessed through DefaultWorkSheet. The typed accessor StringValue provides type-safe value retrieval.

Input

Excel spreadsheet showing a customer database with columns for CustomerID, FirstName, LastName, Email, City, and Country, containing 10 rows of sample customer data.

Output

Visual Studio debug console showing output with 'Customer: Emily' text displayed

How Do You Read CSV Files with IronXL?

Reading CSV files with IronXL provides multiple approaches for different scenarios, from simple data extraction to complex processing workflows. The IronXL features page provides a complete overview of all capabilities, while the open workbook guide covers workbook handling in depth.

Feature overview of a C# spreadsheet manipulation library showing six main categories: Create, Save and Export, Edit Workbooks, Working With Data, Secure Your Workbooks, and Working With Cells.

How Do You Iterate Through CSV Rows?

The most direct approach uses LoadCSV with default settings and iterates through all rows:

using IronXL;

WorkBook workbook = WorkBook.LoadCSV("sales_data.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

// Iterate through the data rows (row 1 holds the header in A1 notation)
for (var row = 2; row <= sheet.RowCount; row++)
{
    string productName = sheet[$"A{row}"].StringValue;
    decimal price = sheet[$"B{row}"].DecimalValue;
    int quantity = sheet[$"C{row}"].IntValue;
    Console.WriteLine($"Product: {productName}, Price: ${price}, Qty: {quantity}");
}

IronXL's typed accessors automatically convert text to the appropriate .NET types, eliminating manual parsing. Because A1-style cell addresses are 1-based, row 1 holds the header and the data occupies rows 2 through RowCount, the total number of rows in the file.

How Do You Handle Non-Standard Delimiters?

For CSV files with non-standard delimiters, IronXL provides configuration options through the listDelimiter parameter:

using IronXL;
using System.Collections.Generic;

// Load a tab-separated file
WorkBook workbook = WorkBook.LoadCSV("inventory.tsv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: "\t");
WorkSheet sheet = workbook.DefaultWorkSheet;

// Build header list from column 0
var headers = new List<string>();
for (int col = 0; col < sheet.ColumnCount; col++)
{
    headers.Add(sheet.GetCellAt(0, col).StringValue);
}
Console.WriteLine("Columns: " + string.Join(" | ", headers));

The listDelimiter parameter specifies field separators -- here, tabs for TSV files. The fileFormat parameter determines the internal representation after parsing. You can learn more about reading Excel files for additional file format options.

Screenshot of a tab-separated values (TSV) file named 'inventory.tsv' displayed in a text editor, showing a product inventory table with columns for ItemID, ItemName, Category, Quantity, UnitPrice, and Supplier.

Visual Studio Debug Console showing CSV column headers: ItemID, ItemName, Category, Quantity, UnitPrice, and Supplier

How Do You Handle Complex CSV Scenarios?

Real-world CSV files often contain complexities that break simple parsing approaches. IronXL handles these challenging scenarios gracefully, providing solutions for quoted fields, special characters, and encoding issues. The IronXL documentation covers all advanced scenarios in detail.

How Does IronXL Handle Quoted Fields and Special Characters?

IronXL automatically handles CSV files with quoted fields containing delimiters. The parser follows CSV standards, treating doubled quotes as escape sequences:

using IronXL;
using System.IO;

// Create sample CSV with complex quoted fields
string csvContent = @"Name,Description,Price,Category
""Johnson, Mike"",""Premium keyboard with mechanical switches"",149.99,Electronics
""O'Brien, Sarah"",""Children's toy - ages 3+"",29.99,Toys";

File.WriteAllText("complex_data.csv", csvContent);

WorkBook workbook = WorkBook.LoadCSV("complex_data.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

for (int row = 2; row <= sheet.RowCount; row++) // data rows; row 1 is the header
{
    string name = sheet[$"A{row}"].StringValue;
    string description = sheet[$"B{row}"].StringValue;
    Console.WriteLine($"Name: {name}");
    Console.WriteLine($"Description: {description}");
}

IronXL correctly interprets "Johnson, Mike" as a single field despite containing a comma, and properly processes nested quotes in descriptions. This automatic handling eliminates complex regular expressions or state machines that custom parsers require.

What About Character Encoding Issues?

Working with different character encodings requires careful consideration. IronXL handles various encodings automatically, ensuring international characters display correctly:

using IronXL;

WorkBook workbook = WorkBook.Load("international_data.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

for (int row = 2; row <= sheet.RowCount; row++) // data rows; row 1 is the header
{
    string city = sheet[$"A{row}"].StringValue;
    string country = sheet[$"B{row}"].StringValue;
    // Characters like ñ, ü, é display correctly
    Console.WriteLine($"Location: {city}, {country}");
}

// Save with UTF-8 encoding to preserve characters
workbook.SaveAsCsv("output_utf8.csv");

Whether working with UTF-8, UTF-16, or legacy ANSI encodings, IronXL preserves character integrity throughout read-write cycles. When saving CSV files, UTF-8 is used by default for maximum compatibility. Check the export guide for all output format options.

Input

Excel spreadsheet displaying international data with columns for Country, Region, Population, GDP USD, and Currency, showing 15 different countries with their respective economic information.

Output

Visual Studio Debug Console displaying location data with country names and regions in multiple languages, showing countries from Europe, North America, South America, Africa, and Asia.

How Do You Work with Custom Delimiters and Regional Formats?

Custom delimiters and regional formats require flexible configuration. European CSV files frequently use semicolons as delimiters and commas as decimal separators:

using IronXL;
using System.IO;

// European CSV format (semicolon delimiter, comma decimal)
string europeanCsv = @"Product;Price;Quantity
Widget A;12,50;100
Gadget B;24,99;50";

File.WriteAllText("european.csv", europeanCsv);

WorkBook workbook = WorkBook.LoadCSV("european.csv",
    fileFormat: ExcelFileFormat.XLSX,
    listDelimiter: ";");
WorkSheet sheet = workbook.DefaultWorkSheet;

for (int row = 2; row <= sheet.RowCount; row++) // data rows; row 1 is the header
{
    string product = sheet[$"A{row}"].StringValue;
    string priceText = sheet[$"B{row}"].StringValue;
    decimal price = decimal.Parse(priceText.Replace(',', '.'), System.Globalization.CultureInfo.InvariantCulture);
    Console.WriteLine($"{product}: €{price}");
}

The listDelimiter parameter configures field splitting, while number parsing converts European decimal notation to .NET's expected format. This flexibility allows processing CSV files from any region without modifying source data. The import data guide covers additional data import scenarios.
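As an alternative to rewriting the string, .NET's culture-aware parsing handles regional decimal notation directly; a minimal standard-library sketch:

```csharp
using System;
using System.Globalization;

// European notation: comma as the decimal separator
string priceText = "12,50";

// Parse with an explicit culture ("de-DE" uses ',' for decimals)
// instead of replacing characters by hand
decimal price = decimal.Parse(priceText, CultureInfo.GetCultureInfo("de-DE"));

Console.WriteLine(price.ToString(CultureInfo.InvariantCulture)); // prints: 12.50
```

This avoids subtle bugs on machines whose current culture already treats '.' as a thousands separator.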

How Do You Process Large CSV Files Efficiently?

Processing large CSV files presents unique challenges requiring thoughtful approaches to memory management and performance. IronXL provides strategies for handling files with millions of rows without overwhelming system resources.

How Do You Use Batch Processing for Large Datasets?

Batch processing divides large datasets into manageable chunks, preventing memory overload and enabling progress tracking:

using IronXL;
using System.Collections.Generic;

WorkBook workbook = WorkBook.LoadCSV("large_dataset.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

int batchSize = 1000;
int totalRows = sheet.RowCount;

for (int startRow = 2; startRow <= totalRows; startRow += batchSize) // data starts at row 2; row 1 is the header
{
    int endRow = Math.Min(startRow + batchSize - 1, totalRows);
    var batchResults = new List<(string Id, decimal Amount)>();

    for (int row = startRow; row <= endRow; row++)
    {
        string id = sheet[$"A{row}"].StringValue;
        decimal amount = sheet[$"B{row}"].DecimalValue;
        batchResults.Add((id, amount));
    }

    // Save batch results to database or file
    Console.WriteLine($"Processed rows {startRow} to {endRow}: {batchResults.Count} records");
}

Processing 1000 rows at a time allows garbage collection between batches, maintaining steady memory usage. The pattern also facilitates error recovery -- you can resume from the last successful batch rather than restarting from scratch. The Excel to DataSet guide shows how to work with bulk data in memory efficiently.

Microsoft Excel spreadsheet displaying a large dataset with columns for Country, Region, City, Population, GDP, Currency, Latitude, and Longitude, showing various international data entries.

Console output showing batch processing of CSV records in groups of 1000, with progress messages for rows 1 to 10001

How Do You Convert Between CSV and Excel Formats?

One of IronXL's standout features is conversion between CSV and Excel formats, enabling workflows that use both formats' strengths. This capability proves invaluable when importing CSV data for advanced Excel processing or exporting Excel reports as CSV for system integration.

How Do You Convert a CSV File to a Formatted Excel Workbook?

Converting CSV to Excel with formatting enhances data presentation and enables advanced features like formulas, charts, and styling:

using IronXL;

WorkBook workbook = WorkBook.LoadCSV("sales_report.csv");
WorkSheet sheet = workbook.DefaultWorkSheet;

// Format header row
for (int col = 0; col < sheet.ColumnCount; col++)
{
    var headerCell = sheet.GetCellAt(0, col);
    headerCell.Style.Font.Bold = true;
    headerCell.Style.BackgroundColor = "#4472C4";
    headerCell.Style.Font.Color = "#FFFFFF";
}

// Format currency columns (row 1 is the header)
for (int row = 2; row <= sheet.RowCount; row++)
{
    var priceCell = sheet[$"C{row}"];
    priceCell.FormatString = "$#,##0.00";
}

// Auto-fit columns for better display
for (int col = 0; col < sheet.ColumnCount; col++)
{
    sheet.AutoSizeColumn(col);
}

workbook.SaveAs("formatted_report.xlsx");
Console.WriteLine("CSV converted to formatted Excel file");

The code applies bold formatting and colors to headers, creating visual hierarchy. Currency formatting with thousand separators improves numeric readability. AutoSizeColumn adjusts column widths to fit content. The cell formatting guide and write Excel file tutorial provide additional formatting techniques.

Input

Excel spreadsheet showing sales data with columns for Sale ID, Date, Region, Product, Sales Representative, Quantity, Unit Price, Total Sale, and Currency, containing 26 rows of international sales data.

Output

Visual Studio Debug Console showing the message 'CSV converted to formatted Excel file' after successful conversion.

The formatted Excel output displays processed sales data from the CSV parser, with properly formatted columns including mixed currencies.

How Do You Create New Excel Files from CSV Data?

Beyond simple conversion, IronXL enables you to create Excel files with multiple worksheets, formulas, and structured data from CSV sources. The merge cells guide shows how to create professional-looking reports with merged headers.

For containerized deployments, IronXL's conversion capabilities work in Docker environments without external dependencies or Office installations. This makes it ideal for cloud-native architectures where lightweight, self-contained processing is essential.
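As a sketch of that workflow (method names such as CreateWorkSheet are taken from IronXL's public API as we understand it; verify against the current API reference before relying on them), CSV data can be loaded and a summary worksheet added alongside it:

```csharp
using IronXL;

// Sketch only: assumes IronXL's CreateWorkSheet API for adding sheets
WorkBook workbook = WorkBook.LoadCSV("sales_report.csv");
WorkSheet data = workbook.DefaultWorkSheet;

// Add a summary sheet next to the imported CSV data
WorkSheet summary = workbook.CreateWorkSheet("Summary");
summary["A1"].Value = "Total data rows";
summary["B1"].Value = data.RowCount - 1; // exclude the header row

workbook.SaveAs("sales_with_summary.xlsx");
```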

Why Should You Use IronXL for CSV Processing?

IronXL transforms CSV processing from a complex challenge into a reliable operation, eliminating countless edge cases and performance issues that plague custom implementations. The library's intelligent parser handles quoted fields, special characters, and various delimiters automatically while providing conversion between CSV and Excel formats.

Whether importing customer data, processing financial records, or converting between formats, IronXL's C# CSV parser handles complexities while you focus on business logic rather than parsing infrastructure.

The library's commitment to continuous improvement is evident through regular updates. With documentation covering everything from basic installation to advanced scenarios, IronXL provides the resources developers need to succeed with CSV and spreadsheet processing in .NET 10 applications.

Ready to simplify your CSV processing workflow? Start with a free trial license to evaluate the full feature set. When you're ready to deploy, review the available licensing options designed for projects of all sizes.

IronXL CSV Parser -- Key Capabilities at a Glance
- Automatic delimiter detection: detects commas, tabs, semicolons, and pipes without configuration. Common use case: importing files from third-party systems.
- Quoted field handling: correctly parses fields containing delimiters or line breaks. Common use case: address and description fields in data exports.
- Encoding support: reads UTF-8, UTF-16, and ANSI encoded files. Common use case: processing international data files.
- CSV to Excel conversion: converts and applies formatting, formulas, and styles in one step. Common use case: generating formatted reports from raw data.
- Large file processing: batch-processing patterns for multi-million row files. Common use case: ETL pipelines and data migration tasks.

IronXL licensing page showing four pricing tiers (Lite $749, Plus $999, Professional $1,999, and Unlimited $3,999) with a toggle between IronXL and Iron Suite options

Frequently Asked Questions

What is a CSV file and why is it widely used?

A CSV (Comma-Separated Values) file is a simple text format for data exchange that is widely used due to its simplicity and ease of integration with various applications, databases, and systems.

What challenges might arise when parsing CSV files in C#?

Parsing CSV files in C# can be complex due to issues such as handling quoted fields containing commas, managing line breaks within data cells, and other nuances that go beyond basic string manipulation.

How can IronXL assist in parsing CSV files in C#?

IronXL offers a robust solution for parsing CSV files in C#, simplifying complex tasks and ensuring accurate data handling with its efficient parsing capabilities.

What features make IronXL suitable for CSV parsing?

IronXL provides features such as handling quoted fields, managing line breaks, and offering efficient data processing capabilities, making it suitable for parsing complex CSV files.

Is IronXL compatible with different CSV formats?

Yes, IronXL is designed to be compatible with various CSV formats, allowing developers to streamline data processing tasks across different systems and applications.

Can IronXL handle large CSV files efficiently?

IronXL is optimized to handle large CSV files efficiently, ensuring quick and accurate data processing without compromising performance.

Does IronXL support data manipulation after CSV parsing?

Yes, IronXL not only parses CSV files but also supports data manipulation and transformation, enabling developers to work seamlessly with the data.

How does IronXL ensure data accuracy during CSV parsing?

IronXL employs advanced parsing techniques to handle complex CSV structures, ensuring data accuracy and integrity during the parsing process.

What makes IronXL different from other CSV parsing libraries?

IronXL stands out due to its comprehensive feature set, efficiency, and ease of use, offering developers a powerful tool for handling CSV parsing challenges.

Where can I find more resources on using IronXL for CSV parsing?

You can find more resources and guides on using IronXL for CSV parsing on the Iron Software website and its documentation pages.

Jordi Bardia
Software Engineer
Jordi is most proficient in Python, C# and C++, when he isn’t leveraging his skills at Iron Software; he’s game programming. Sharing responsibilities for product testing, product development and research, Jordi adds immense value to continual product improvement. The varied experience keeps him challenged and engaged, and he ...