Free JSON to CSV Converter Online

Convert JSON arrays to CSV. Handles nested objects and custom delimiters, and lets you download the result as a .csv file.

JSON to CSV Conversion: A Practical Guide

JSON and CSV are two of the most common data interchange formats in software development. JSON (JavaScript Object Notation) is the lingua franca of modern APIs — flexible, self-describing, and human-readable. CSV (Comma-Separated Values) is the lingua franca of spreadsheets and data science tools — tabular, compact, and supported by every analytics platform on the planet. Converting between them is a routine task, but one with several important subtleties that this guide covers in depth.

Why Convert JSON to CSV?

The most common reasons developers need JSON-to-CSV conversion:

  • Spreadsheet analysis: Export API data to Excel or Google Sheets for pivot tables, charts, or sharing with non-technical stakeholders.
  • Database import: Many databases and ETL tools accept CSV files for bulk insert operations. PostgreSQL's COPY command, MySQL's LOAD DATA INFILE, and Snowflake's COPY INTO all support CSV natively.
  • Machine learning: Pandas, NumPy, and most ML frameworks load CSVs trivially. Converting API JSON responses to CSV is a standard step in data preparation pipelines.
  • Reporting tools: Tableau, Power BI, Looker, and Metabase all have first-class CSV import. JSON requires custom connectors or ETL.
  • Email and sharing: A CSV file opens in Excel on any machine without configuration. A JSON file requires a developer or a specialised viewer.

The RFC 4180 Standard

CSV lacked any formal standard for most of its history; the format predates much of the internet. RFC 4180 (published in 2005) is the closest thing to a definitive specification. Key rules that every correct CSV implementation must follow:

  • Fields containing the delimiter, double quotes, or newlines must be wrapped in double quotes. A field like Smith, John using a comma delimiter must be written as "Smith, John".
  • Double quotes inside a quoted field must be escaped by doubling them. The value He said "hello" must be written as "He said ""hello""".
  • Each record ends with CRLF (carriage return + line feed): \r\n. Some tools accept LF alone; Excel historically requires CRLF on Windows.
  • The first record may be a header row. Headers are optional but strongly recommended for any human-readable dataset.
  • All records must have the same number of fields. If a row is missing a value, the cell is empty but the delimiter is still present.

This converter produces RFC 4180-compliant output, including CRLF line endings and proper quoting, so the output is safe to open in any spreadsheet application including Excel, LibreOffice Calc, and Numbers.
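The quoting rules above can be sketched in a few lines of Python (an illustrative sketch, not this tool's actual implementation):

```python
def quote_field(value: str, delimiter: str = ",") -> str:
    """Wrap a field in double quotes per RFC 4180 if it contains
    the delimiter, a double quote, or a newline; double any
    embedded quotes."""
    if any(ch in value for ch in (delimiter, '"', '\n', '\r')):
        return '"' + value.replace('"', '""') + '"'
    return value

def write_record(fields, delimiter=","):
    # RFC 4180 records end with CRLF
    return delimiter.join(quote_field(f, delimiter) for f in fields) + "\r\n"

print(write_record(["Smith, John", 'He said "hello"', "plain"]))
```

Note that quoting is decided per field, so unaffected values like plain stay unquoted, which RFC 4180 permits.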

Delimiters: Comma, Semicolon, or Tab?

The choice of delimiter depends on your region and target application:

  • Comma (,): The most common delimiter in English-speaking countries. Works in all spreadsheet applications by default. Problematic if your data contains commas (addresses, descriptions) — the tool handles this automatically by quoting affected fields.
  • Semicolon (;): Standard in European locales (Germany, France, Netherlands, Poland, etc.) where the comma serves as the decimal separator. Excel in these locales both writes and expects semicolon-delimited CSV files. Match the delimiter to your target environment.
  • Tab (\t): Tab-separated values (TSV) avoid quoting issues entirely since tabs almost never appear in data. TSV files open cleanly in Excel and Google Sheets without needing an import wizard, and are preferred by bioinformatics and genomics tools. The file extension is typically .tsv, though .txt is also common.
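A minimal Python sketch of how the delimiter choice plays out with the standard csv module (the sample rows are hypothetical):

```python
import csv, io

rows = [["id", "note"], ["1", "Alice; ltd."]]

for delim, label in [(",", "comma"), (";", "semicolon"), ("\t", "tab")]:
    buf = io.StringIO()
    # csv.writer quotes any field that contains the chosen delimiter,
    # so "Alice; ltd." is quoted only in the semicolon variant
    csv.writer(buf, delimiter=delim, lineterminator="\r\n").writerows(rows)
    print(label, repr(buf.getvalue()))
```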

Handling Nested JSON Objects

CSV is inherently flat — every cell contains a scalar value. JSON can nest objects and arrays to arbitrary depth. This tool handles nesting through dot-notation flattening:

Given this input:

[{
  "id": 1,
  "user": {
    "name": "Alice",
    "address": {
      "city": "London",
      "country": "UK"
    }
  }
}]

The tool produces these columns: id, user.name, user.address.city, user.address.country. Each level of nesting is represented by a dot-separated key. This is the same convention used by Elasticsearch, MongoDB projections, and many logging systems.

Arrays nested inside objects (e.g. a tags array) are serialised as a JSON string within the cell. There is no universally agreed way to represent arrays in CSV — some tools repeat rows, some use pipe-separated values, some use JSON. This converter uses JSON serialisation, which is lossless and parseable.
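A dot-notation flattener along these lines can be sketched in Python (an illustration of the convention, not this tool's exact code):

```python
import json

def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into dot-notation keys; serialise
    lists to JSON strings, since CSV has no array type."""
    out = {}
    for key, value in obj.items():
        full_key = prefix + key
        if isinstance(value, dict):
            out.update(flatten(value, full_key + "."))
        elif isinstance(value, list):
            out[full_key] = json.dumps(value)
        else:
            out[full_key] = value
    return out

record = {"id": 1,
          "user": {"name": "Alice",
                   "address": {"city": "London", "country": "UK"}},
          "tags": ["admin", "beta"]}
print(flatten(record))
# {'id': 1, 'user.name': 'Alice', 'user.address.city': 'London',
#  'user.address.country': 'UK', 'tags': '["admin", "beta"]'}
```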

Handling Missing and Null Values

Real-world JSON from APIs is rarely uniform. One object might have a phone field; the next might not. This converter handles heterogeneous data by:

  1. Collecting all unique keys across all objects in the array (the union of all keys).
  2. Using these as the column headers.
  3. For each row, filling in the value if it exists, or leaving the cell empty if the key is absent.
  4. Converting JSON null to an empty string.

This approach preserves all data without silent loss. The resulting CSV always has the same number of columns in every row, which is required by RFC 4180 and expected by all spreadsheet and database import tools.
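The four steps above can be sketched with Python's standard csv module (the function name and sample data are hypothetical):

```python
import csv, io

def json_array_to_csv(records, out):
    # Steps 1-2: collect the union of all keys, in first-seen
    # order, and use them as the header row
    headers = []
    for rec in records:
        for key in rec:
            if key not in headers:
                headers.append(key)
    writer = csv.writer(out, lineterminator="\r\n")
    writer.writerow(headers)
    # Steps 3-4: missing keys and JSON nulls both become empty cells
    for rec in records:
        writer.writerow("" if rec.get(k) is None else rec[k] for k in headers)

data = [{"id": 1, "name": "Alice", "phone": None},
        {"id": 2, "name": "Bob", "email": "b@x.com"}]
buf = io.StringIO()
json_array_to_csv(data, buf)
print(buf.getvalue())
```

Every row ends up with exactly len(headers) fields, as RFC 4180 requires.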

Importing the CSV Into Popular Tools

Microsoft Excel: On Windows, double-clicking a CSV file typically opens it directly. If columns are not parsed correctly (a common issue with semicolons or Unicode data), use Data → From Text/CSV and configure the delimiter explicitly. Always verify that the first column is not garbled — this is a sign of encoding mismatch. This converter uses UTF-8, which Excel 2016+ handles correctly when using the CSV import wizard.

Google Sheets: File → Import → Upload, then choose Custom separator if you used semicolons or tabs. For comma-delimited files, Sheets usually auto-detects the format. UTF-8 encoding is fully supported.

PostgreSQL: Use the COPY command:

COPY users (id, name, email)
FROM '/path/to/data.csv'
WITH (FORMAT csv, HEADER true, DELIMITER ',', ENCODING 'UTF8');

Pandas (Python):

import pandas as pd
df = pd.read_csv('data.csv', encoding='utf-8')
print(df.head())

MySQL:

LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE users
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;

Command-Line Alternatives

For automation and large files, the command line is often faster than a browser tool:

  • jq: jq -r '(.[0] | keys_unsorted) as $keys | ($keys | @csv), (.[] | [.[$keys[]]] | @csv)' data.json — produces quoted CSV using jq's built-in @csv formatter (the header must also be piped through @csv, or it prints as a raw JSON array; note that @csv emits LF line endings, not CRLF).
  • Python one-liner: python3 -c "import json,csv,sys; data=json.load(open('data.json')); w=csv.DictWriter(sys.stdout, fieldnames=data[0].keys()); w.writeheader(); w.writerows(data)" — assumes no object has keys beyond those of the first, otherwise DictWriter raises ValueError.
  • csvkit: in2csv data.json > output.csv — a Python library with excellent type inference and schema detection.
  • miller (mlr): mlr --ijson --ocsv cat data.json — a powerful stream-processing tool that handles nested JSON natively.

Performance Considerations

For datasets under 50 000 rows, the browser is a perfectly capable converter. This tool processes the full JSON array in memory and renders the output in a textarea. For larger datasets:

  • Use the Download button rather than copying from the textarea: downloading writes the file straight to disk and avoids both rendering a huge string in the textarea and clipboard size limits.
  • For files over 100 MB, prefer a command-line tool (jq, Python, or miller) which can stream the input without loading it all into memory.
  • If you need to process line-delimited JSON (NDJSON / JSON Lines, .jsonl), where each line is a separate JSON object, preprocess the input by wrapping the lines in an array first, or use miller, which handles NDJSON natively.
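Wrapping NDJSON into an array is a one-liner in Python (a minimal sketch):

```python
import json

def ndjson_to_array(text: str) -> list:
    """Parse NDJSON (one JSON object per line) into a plain list."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

sample = '{"id": 1}\n{"id": 2}\n'
print(ndjson_to_array(sample))
# [{'id': 1}, {'id': 2}]
```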

Common Pitfalls

Date formatting: JSON has no native date type — dates are usually strings (ISO 8601: 2024-01-15T10:30:00Z) or Unix timestamps (integers). In CSV, these are still strings. Excel may auto-format ISO dates as locale-specific date strings when opening the file — verify the format didn't change if dates are critical.

Large integers: JavaScript's JSON.parse loses precision for integers larger than Number.MAX_SAFE_INTEGER, which is 2^53 − 1 (9 007 199 254 740 991). If your JSON contains very large IDs (common in distributed systems that use 64-bit snowflake IDs), use a streaming JSON parser that supports BigInt, or emit the large numbers as strings in your source system.
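In Python the parse itself is safe, since ints are arbitrary-precision; the risk is in what consumes the output. One hedge is the parse_int hook of json.loads, shown here keeping integers as strings (note it applies to every integer in the document, not only IDs):

```python
import json

raw = '[{"id": 9223372036854775807, "name": "big"}]'

# Python ints are arbitrary-precision, so the default parse is exact...
assert json.loads(raw)[0]["id"] == 9223372036854775807

# ...but to keep IDs intact for JavaScript or spreadsheet consumers,
# parse every integer as a string instead:
data = json.loads(raw, parse_int=str)
print(data[0]["id"])
# 9223372036854775807 (now a string)
```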

Boolean values: JSON booleans (true/false) are converted to the strings true/false in CSV. If your target database expects 1/0 or Y/N, transform the column after import.
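A sketch of such a post-import transform in Python (the 1/0 target convention is just one example):

```python
def bool_to_flag(cell: str) -> str:
    """Map the CSV strings 'true'/'false' to '1'/'0';
    leave every other value untouched."""
    return {"true": "1", "false": "0"}.get(cell, cell)

print([bool_to_flag(c) for c in ["true", "false", "maybe"]])
# ['1', '0', 'maybe']
```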

Encoding: This converter uses UTF-8. If you open the CSV in Excel and see garbled characters (particularly for non-Latin scripts like Chinese, Arabic, or Cyrillic), try adding a UTF-8 BOM. Excel interprets files that begin with a BOM as UTF-8. In code: prefix the file content with the BOM character \uFEFF (bytes EF BB BF) before creating the Blob.
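In Python, the utf-8-sig codec writes the BOM automatically; in browser JavaScript the equivalent is prefixing the output string with "\uFEFF" before constructing the Blob. A sketch:

```python
import os, tempfile

csv_text = "名前,都市\r\n花子,東京\r\n"

# The 'utf-8-sig' codec prepends the BOM bytes EF BB BF, which
# Excel uses to detect UTF-8 encoding
path = os.path.join(tempfile.mkdtemp(), "data.csv")
with open(path, "w", encoding="utf-8-sig", newline="") as f:
    f.write(csv_text)

with open(path, "rb") as f:
    print(f.read(3))
# b'\xef\xbb\xbf'
```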

JSON vs CSV: When to Use Each

Criterion             | JSON                                          | CSV
Nested data           | Native support                                | Requires flattening
Typed values          | string, number, boolean, null, array, object  | Everything is a string
Spreadsheet support   | Requires conversion                           | Direct open
File size (same data) | Larger (keys repeated)                        | Smaller (keys in header once)
Streaming large files | NDJSON works well                             | Works well
API data exchange     | Universal                                     | Uncommon (but used in some REST APIs)
Database import       | Varies by DB                                  | Universal support

Security Considerations

A subtle security issue called CSV injection (also called formula injection) occurs when CSV data is imported into a spreadsheet and a cell value begins with =, +, -, or @. Spreadsheet applications interpret these as formulas. A malicious value like =HYPERLINK("http://attacker.com","Click me") becomes a clickable link in Excel.

To prevent this, sanitise cell values that start with formula characters before exporting. If you control the import side (e.g. building an admin panel that exports user-submitted data), prefix potentially dangerous values with a tab or single quote to prevent formula evaluation. This tool does not sanitise for CSV injection — if your data may contain user-submitted strings, add your own sanitisation layer before exporting to CSV that will be opened in a spreadsheet.
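A minimal sanitiser along the lines described (note that blindly prefixing - would also mangle negative numbers, so real code should exempt cells known to be numeric):

```python
FORMULA_CHARS = "=+-@"

def sanitise_cell(value: str) -> str:
    """Prefix a single quote so spreadsheets treat the cell as
    literal text instead of evaluating it as a formula."""
    if value and value[0] in FORMULA_CHARS:
        return "'" + value
    return value

print(sanitise_cell('=HYPERLINK("http://attacker.com","Click me")'))
```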

FAQ

Common questions

What JSON structure can be converted to CSV?

This tool accepts a JSON array of objects, where each object represents one row. Keys from the union of all objects become column headers. If objects have different keys, missing values appear as empty cells. A top-level object with an array property is also accepted — the tool will unwrap it automatically.

What delimiter options are available?

You can choose comma (,), semicolon (;), or tab (\t) as the delimiter. Semicolons are standard in European locales where commas serve as decimal separators. Tab-separated values (TSV) open cleanly in Excel and Google Sheets without any import wizard.

How are special characters handled?

Values that contain the delimiter, double quotes, or newlines are wrapped in double quotes per RFC 4180. Double quotes inside a value are escaped by doubling them (a single " becomes ""). This ensures the output is safe to open in any spreadsheet application.

What happens with null or missing values?

Null and undefined values are converted to empty strings. If a row is missing a key that exists in other rows, the cell is left empty. The column structure is always determined by the full union of keys across all rows, so no data is silently lost.

Can I convert nested JSON objects?

Yes. Nested objects are flattened using dot notation. For example, {"user":{"name":"Alice","age":30}} becomes two columns: user.name and user.age. Arrays inside objects are serialised to a JSON string within the cell, since CSV has no native array type.

How do I open the CSV in Excel or Google Sheets?

For Excel on Windows: double-click a comma-delimited CSV, or use Data → From Text/CSV for other delimiters. For Google Sheets: File → Import → Upload, then choose "Custom separator" if you used semicolons or tabs. UTF-8 encoding is used so international characters display correctly.

Is there a row limit?

There is no enforced limit — the converter processes your full JSON array. Very large datasets (50 000+ rows) may slow the browser preview. In that case, use the Download button to get the full CSV file without rendering a large textarea.

Is my data sent anywhere?

No. All conversion runs entirely in your browser using JavaScript. Your JSON data never leaves your machine, is never sent to a server, and is never logged. You can verify this by going offline — the tool works identically without an internet connection.
